The final ISO ballot on C++0x closed on Wednesday, and we just received the results: Unanimous approval. The next revision of C++ that we’ve been calling “C++0x” is now an International Standard! Geneva will take several months to publish it, but we hope it will be published well within the year, and then we’ll be able to call it “C++11.”
Nice to see one of the greatest workhorses of programming languages evolve
When looking at http://en.wikipedia.org/wiki/C%2B%2B0x it is obvious there are many small and big improvements.
However, they were a bit optimistic when they gave it the working title C++0x.
This will be interesting, given that Apple, through the LLVM project, has committed itself to C++0x compliance once fully approved. What I think will be interesting is how these new features will be used by programmers, and the impact for the end user: easier-to-manage code? Quicker debugging? The ability to implement complex ideas faster?
What we need now is an Arstechnica article talking about the ‘old’ and ‘new’ way of doing things to compare how things are done now versus how one could do it in future.
I’ve been playing with some of the features already implemented in g++, and mostly I’ve discovered they help reduce the amount of code I have to write. What I’m most looking forward to are the additions to std:: like smart pointers, threads, mutexes, hash containers. std::move support (new constructors and assignment operators) is something that I think will take a bit of getting used to, but it provides a much needed distinction between copying and moving. std::unique_ptr is a good example of where this matters.
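For example, here is a minimal sketch of the copy/move distinction (assuming a g++ recent enough to accept -std=c++0x):

#include <memory>
#include <utility>
#include <vector>

int main()
{
    std::unique_ptr<int> p(new int(42));
    std::vector<std::unique_ptr<int>> v;
    // v.push_back(p);          // won't compile: unique_ptr cannot be copied
    v.push_back(std::move(p));  // fine: ownership is transferred, p is now empty
    return 0;
}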
I am not exactly a hard-core programmer, but just looking through the changes, C++ has gotten a lot more complex.
I wonder why people haven’t looked more into C and Objective C.
Because C++ is far more powerful than C or Objective-C and people who program for a living don’t find C++ too complicated.
C is completely inappropriate for large projects. And for small ones, C++ can do anything that C can do, and better.
Objective-C (and Objective-C++) are really just layers on top of C (and C++). Objective-C is cool, but it is not a low-level language like C++.
C++ is a lower-level language than C and Objective-C, but it has all of the high-level tools for building any sort of project. It is a truly general-purpose language, where the other two are not.
What in Obj-C prevents you from using the low-level features of C?
Look. C++ is not “lower-level” than C. It IS C with some higher-level constructs (OO, STL, etc).
I also prefer C++, but one thing C++ can’t do, which C can, is produce nice small binaries. That’s a major problem on embedded (e.g. satellite) platforms.
No. C++ has many more _features_. You can implement, and I have implemented, a complete OO system using raw C; C++ does that for you. C++, including the STL and even Boost, has a very large number of CompSci features.
Nonsense. To this day most OS kernels, and large amounts of other code such as audio processing, JPEG and MPEG, are still implemented in C. C++ can do a lot, and is a superset of C (mostly – there are a very small number of exceptions to this rule).
While it is true that some programming languages suit certain problem areas better than others, it is not accurate to say that “C is inappropriate for big projects”.
What’s required for a big project, more than anything else, are clear goals, good communication, good project leads and an easily accessible project structure.
Also – good code is good code whether LISP, C or Prolog… ok, maybe not Prolog!
I’ll half agree with you if you can explain why Objective-C isn’t “low-level” (hint – dynamic binding and method invocation). It is difficult to make “rommable” code with Objective-C (it is quite hard in C++ too – you may need RAM for virtual function tables).
Objective-C is ALSO a superset of C and therefore contains all the “low-levelness” that C offers.
C++ as a superset of C is NOT “lower level”. You see at this point I realised you are just talking balderdash. If you are a CompSci student I hope I do not put you off but you clearly have some learning to do.
C is about as low-level as you’d ever want to get (there’s asm but it is naturally unportable – even that’s only 1/2 true as I wrote a Z80 assembler for a 68000 machine that assembled Z80 code into 68000 ops)
C++ is a strict superset of C. Everything that is in C is in C++ (except for a handful of idiomatic, slightly esoteric constructs).
All are “general purpose programming languages” and technically only differ by syntax, although extrapolating that argument to its illogical extreme would have us all attempting to code everything using SKI notation.
+1.
I do not understand why this comment was modded down when it is actually very accurate.
Excuse me? How exactly is C++ lower-level than C? The only thing more low-level than C, is pure assembly language. C++ is just C, extended with a lot of OO features.
Please tell me one “low-level” task that can be accomplished in C++, but not in regular C?
Yeah, this is so blatantly inaccurate I can’t figure out how it escaped getting voted down to oblivion.
EDIT: Off with this never-closed bold tag! I think the OSAlert comment system should get some automatic tag-closing mechanism. Notifying Adam of this.
Because C does not offer the high-level abstractions and better type safety that C++ does.
And Objective-C is an Apple proprietary language that is only used to write MacOS X and iOS applications.
FYI, there are two open source Obj-C compilers (GCC and LLVM/Clang), and open source software can be linked against either the OSX library stack or GNUstep.
So, by your standard, one could call C++ proprietary, since most C++ software is written for a proprietary platform.
C++ is not proprietary. What the language looks like and which libraries are available by default is defined by ANSI and ISO standards.
Objective-C, on the other hand, is defined by Apple’s documentation. The compilers you mention only offer partial Objective-C 2.0 support, for example.
Plus, none of them offer base libraries for the language. Outside Apple systems you are left with GNUstep, which still tries to mimic the NeXTSTEP environment.
Very few languages are ISO standardized. That does not mean the rest are proprietary.
Languages defined by documentation, and having partial compiler support, are also not uncommon.
Why does that matter? glibc is separate from gcc, for example.
While you can assemble a desktop that is some facsimile of NeXTStep using GNUstep, the GNUstep classes now track Cocoa. This is so that OSX apps can be ported to GNUstep platforms (such as Windows), which I think is useful.
***
Perhaps to have a more useful discussion, we should agree on what constitutes a “non-proprietary language standard.” How about the Open Source Initiative’s definition? Then, which criteria does Obj-C fail to meet?
The languages without ISO standards usually have open source implementations and are not driven by companies.
A language implementation requires runtime library support; without it, it is meaningless.
OK, so where does ((GCC or LLVM) and GNUstep) fall short? You have an open source compiler+runtime and standard library. Obj-C 2.0 support is not complete but getting there; GNUstep is tracking Cocoa, though the Foundation classes on both platforms are little changed from the NeXT days.
Perhaps your objection is that Apple has so much clout that it can extend Obj-C in any way that it wants, without regard for other users? In that case, you really are restricting yourself to an ISO language standard, as any large corporation can swoop in and embrace+extend a non-ISO language …
Move the goalposts, why not?
Anyway, in Objective-C you do not have the power and versatility of the high-level abstractions (templates) that C++ offers.
Man this thread is disintegrating badly.
Objective-C is _NOT_ proprietary. You are _not_ restricted in your use of it in ANY way whatsoever. It also pre-dates Apple, going back to the NeXT days (but of course Apple bought that lock, stock and barrel).
Proprietary would mean that the owners restricted, via license, your use of Objective-C in some way. This is not the case – you can implement your own compiler if you so choose, and even extend the language.
Also – I’ve seen Objective-C used WAY outside of Apple kit.
In addition to being inflammatory, this is misleading.
Objective-C is an adaptation of Smalltalk for people that like the Algol language family syntax (C like syntax).
Objective-C was not invented by Apple (and is not proprietary to them). It has been around longer than the Macintosh (since the early ’80s). There has been a GNU version of Objective-C since 1993.
Objective-C can be compiled with either GCC or Clang (the LLVM C/C++/Objective-C compiler). This means that you can code in Objective-C on Mac, Windows, or Linux. No proprietary tools are required (all open source).
Clang is about to become the primary compiler used by FreeBSD and other BSD flavours as well.
All that said, I agree that Objective-C is only really popular on Apple platforms (OS X and iOS). Objective-C was the language choice on NeXT machines. It became the main language on Mac when Apple bought NeXT in 1996.
You’re right. Actually, it was developed at NeXT, which only made the source code available to the FSF because they had modified GCC source code. Originally there were no plans to make it open; the FSF had to send their lawyer.
http://www.gnu.org/philosophy/pragmatic.html
Apple acquired the rights to the language when they bought NeXT.
Except you forget that a language also requires a runtime and libraries.
GCC has only provided support for Objective-C 2.0 since version 4.6.0, and it still has two runtimes. There is no support for the changes made in Lion, and Apple is no longer contributing to GCC since they have Clang.
And as you can see from this thread, it is still not easy to convince them to give the changes back.
http://gcc.gnu.org/ml/gcc/2009-07/msg00403.html
Still, a language without libraries won’t fly, and there are no libraries outside Apple platforms. GNUstep does not count, because it is tracking a mix of NeXTSTEP and Cocoa, plus it still makes use of a NeXTSTEP-based build framework which does not fit well on other OSes.
And work on Objective-C 2.0 in GNUstep is still ongoing.
http://wiki.gnustep.org/index.php/ObjC2_FAQ
No one truly masters C++, but everyone likes their own subset of it.
There are still lots of people using C, huge projects even. Things like GLib, embedded software (the fraction that has stopped using asm directly, I mean) or the kernels of most Unices come to mind.
As for Objective-C, well… Once again, Apple has been too much of a control freak and has pissed off the geek audience because of it, the problem being that in this case, that was their target audience. Besides, there’s not much benefit in using Objective-C over C++, and you lose a lot of language flexibility by doing so (C++ can be used at a lower level, be lighter on resources, and it offers metaprogramming and operator overloading…).
To sum it up, the reason why Objective C has never taken off outside of the Apple world is that C++ is what its users want it to be, whereas Objective C is what Apple want it to be.
C provides relatively little opportunity for abstraction, and less safety. To give one example: the only way to write a generic algorithm is by using macros (bad) or void pointers. In C++ you can just write a templated definition, which provides genericity and type safety.
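A minimal sketch of the difference (the function names are made up):

// C-style genericity: void pointers and a comparator, no type checking.
void *max_c(void *a, void *b, int (*cmp)(const void *, const void *))
{
    return cmp(a, b) >= 0 ? a : b;
}

// C++: a template is type-checked at compile time and needs no casts.
template <typename T>
const T &max_cxx(const T &a, const T &b)
{
    return (a < b) ? b : a;
}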
People use C++ because they want a fast low-level object-oriented language. Objective-C is the opposite, since it uses late binding and heterogeneous containers, it is far too slow for things that people typically use C++ for.
If you are proposing alternatives, D is probably the thing that comes closest. However, that language is plagued by having two widely used standard libraries, and by having a mature compiler only for D2 that uses a non-FLOSS backend (dmd2).
However, if you want to compile to machine code, Haskell and OCaml may also be possibilities. If used wisely, you can write fast programs in both languages.
I’m looking forward to Go from Google a lot more.
C++ is a bit bloated, and 0x introduces even more keywords. Because of the already bloated base, it has to go down that even more bloated road in order to incorporate some modern programming concepts.
I’d like to see some simple, elegant, yet powerful languages surface.
Go is too simple.
It might replace C but not C++. At least not in its present form.
You can think of it as C with interfaces, GC and modules. But it lacks many concepts that you can find in any modern language, like enumerations, generic programming, operator overloading, traits, and dynamic loading, just to cite a few things.
It actually looks like a port of Limbo.
… That will be bluntly ignored by Microsoft, forcing both native and cross platform developers to jump through hoops to get their code compiled under <Enter VS version here>.
Thank God for MinGW64!
– Gilboa
You should do some research before bashing Visual C++ blindly. While they are not the best, they are actually in the top 3 in C++0x standards compliance:
http://wiki.apache.org/stdcxx/C%2B%2B0xCompilerSupport
And do not forget that Herb Sutter (lead software architect in the Visual C++ group) has been the head of the ISO C++ standards committee for many, many years. He has done significant work to change the culture in the company.
Wow… so many assumptions in such a short post.
I’m not bashing VS because I enjoy bashing MS (even though it’s fun) – I am bashing VS because 20% of my cross-platform development time is spent on getting C and CPP code ported through the abomination that you call VS.
E.g.
A. Microsoft decided, for no good reason, not to support in-line assembly on x86_64 (even though it has been supported in 16- and 32-bit since MSDEV 1.5 in the mid ’90s!), forcing anyone that relies on in-line assembly to jump through hoops to get it compiled and linked. (I’m mixing MinGW and VS, thank you very much.)
B. A lackluster pre-processor that makes writing complex macros more-or-less impossible. (Varargs? Return values? Pffff!)
C. Try this for size (pun intended):
short s_number1 = 1;
short s_number2 = 2;
s_number1 += s_number2;
D. nmake. nuff said.
E. Safe C runtime “extensions” (Do I hear EEE)?
Now, they may have decided to support the latest C++ standard without EEE’ing it, but in the end, using VS for anything outside C# will mostly generate grossly non-portable code.
– Gilboa
The place of assembly code is in .s/.asm files.
Regarding the other complaints, you will find similar issues with other compiler vendors.
You’ve got to be kidding. Have you actually read your own comment before pressing “Submit”?
If the place for in-line assembly is .asm/.s files (says who, exactly?), why do ***ALL*** major compilers, including Visual C itself, support in-line assembly – outside, that is, of 64-bit Visual Studio (2K5/8/10)?
As for the rest of my comments, nice going! Instead of answering them point by point you simply waved, mumbled something about “everybody does that” and continued. You *really* showed me!
– Gilboa
It is very simple, really. Microsoft decided that porting the in-line assembly support to 64-bit was too much work, so they decided not to bother, since in-line assembly really complicates the optimization stages of the compiler. And since the days when in-line assembly was “invented”, compiler vendors have realized that creating intrinsic functions is a much better solution.
The intrinsic functions have the key advantage that they can be optimized further by the compiler, which means things like register management and memory locations for variables can still be controlled by the usual optimization stages.
This means that some SSE2 code that is written with assembly will have to be re-written for x64 if you want to take advantage of the new set of registers, while the version using intrinsics will simply have to be recompiled. The intrinsic version also just works on GCC, Intel’s compiler, MINGW, LLVM. This is also not the case for assembly.
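To make that concrete, a minimal sketch (the intrinsics below are the standard Intel ones; this should compile unchanged on MSVC, GCC and Clang, for both x86 and x64):

#include <emmintrin.h> // Intel SSE/SSE2 intrinsics; same header on MSVC, GCC, Clang, ICC

// Add four packed floats. No inline asm: the compiler picks the registers,
// so the same source simply recompiles for x86 and x64.
void add4(const float *a, const float *b, float *out)
{
    __m128 va = _mm_loadu_ps(a);             // unaligned load of 4 floats
    __m128 vb = _mm_loadu_ps(b);
    _mm_storeu_ps(out, _mm_add_ps(va, vb));  // vectorized add, then store
}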
Now if you still believe you’re so pro that you can outsmart the compiler’s optimization stages, then surely moving your code to an .asm file and doing the x64 function prolog should be a trivial one-time problem, right?
*Sigh*
10 words, small letters:
memory barriers, lock variants, special ops, per-cpu/platform variant.
– Gilboa
Yes, I did read it. Inline assembly in code had its place in the early ’80s, when I got started in programming.
In the projects where I have lead responsibilities, assembly code, if it is required at all, lives in .s/.asm files.
I have seen too many bad examples of C compilers used as assemblers.
Well, who said I thought it was important to discuss the remaining points?
If you are spending 20% of your cross-platform development time on VS, you’re clearly doing something wrong. As the main author of a cross-platform game SDK (clanlib.org) I have personal experience with cross-platform coding between Windows, Linux and Mac, and I do not spend 20% of my time getting code to work across the platforms.
A. Microsoft removed in-line assembly because they want you to use compiler intrinsics instead. They have several advantages over custom-written assembly, such as allowing the compiler to actually optimize your assembly. And the best part is that even though these intrinsics aren’t part of the C++ standard, most of them are part of an Intel standard that all the different compilers support. Truly write once, use everywhere (for the x86 platform). So why are you using hand-written assembly again?
B. I don’t use varargs or very complex macros, so I really wouldn’t know. However, if you are trying to write something cross-platform it’s rather stupid to insist on doing things you can’t do cross-platform, isn’t it? Maybe you should reconsider your coding style to be more compatible with multiple compilers.
C. Not sure what you are trying to say here.
D. Why are you using nmake? Self torture? Nobody uses nmake. Maybe you should try something like CMake if you want to only use one build system for all your platforms? Or if you insist on using a MS technology try maybe a normal vcxproj or perhaps even msbuild!
E. I assume you are referring to the strcpy extensions and so on. First of all, WHY would a C++ developer even use these functions? Self-torture again? Secondly, you don’t HAVE to use the extensions if you don’t want to (hint: a simple define will disable the warnings it issues about using the unsafe versions). Thirdly, there’s a very good reason why Microsoft wants you to stop using them: they were the primary source of buffer overruns in their own software.
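(For reference, the define in question is literally one line; _CRT_SECURE_NO_WARNINGS is the documented MSVC switch:)

// Define before any CRT header (or pass /D_CRT_SECURE_NO_WARNINGS)
// and the C4996 deprecation warnings about strcpy & co. go away.
#define _CRT_SECURE_NO_WARNINGS
#include <cstring>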
Just because you don’t know how to write portable C++ code doesn’t mean it is the fault of Visual C++. Most of the things you complained about aren’t even part of any C++ standard (i.e. assembly and nmake).
I agree a nice cross-platform build system would be nice, but so far I like the MS solution system a lot better than any Unix alternative I’ve seen (make, automake, qmake, cmake). But 20% of your time? To add the .cpp files to a makefile? That’s just trolling.
My original intent was 20% of the *porting* time and not development time. Mis-wording on my end.
Now before you start spewing the normal “why not use autobuild/automake/cmake” bullshit, keep in mind that you *don’t* know what I do and why.
I hate it when people assume that they know best about problems other people are facing… *Sigh*.
I do *not* *want* the compiler to optimize my *assembly* code. I rely on specific code ordering to ensure memory barriers, and the last thing I could possibly want is the compiler mucking around with my code.
Re-read my previous post. I do write cross-compiler code (read: code that’s compatible with the brain-dead VS pre-processor)… I simply do not like it (e.g. using inline functions instead of macros due to VS’s lack of macro-return-value support).
Try it with /W4.
No, I’m not using nmake. I’m using GNU make.
… But to me (feel free to disagree), the lack of a good make alternative is a telling sign of MS’s “commitment” to C/C++.
Because all the “normal” functions are marked as soon-to-be-deprecated?
E.g. “http://msdn.microsoft.com/en-us/library/2029ea5f(v=VS.100).aspx”
See above.
Sure, they only want what’s best for us.
Why use standard size-secure functions that are supported by world+dog when you can invent a completely incompatible C runtime?
I assume that this is the first time you hear about EEE, right?
Hey boy, drop the I-know-best attitude.
Making such bold claims without knowing who I am and what I do for a living (and for how many years) may make you look like a complete condescending asshole.
– Gilboa
You seem to be quite bitter in general – this kind of attitude doesn’t help to get your point across, even though you might be right at points.
Then you should *really* consider moving that code outside the compiler’s control, shouldn’t you? Also, how cross-platform is code with inline assembly, really? Sure, you can use preprocessor macros to shoehorn in inline assembly for a dozen CPUs, but why?
Then this is a matter of taste, not really superiority/inferiority of this or that compiler, isn’t it? And de gustibus non disputandum est…
It’s considered good manners not to comment on things you don’t use or know – doing that will gain you more ears than bitching.
And there we go again – why offend somebody who writes without bad intent? What is your point here? That *YOU* know what EEE is? Everybody’s clapping hands, granted.
Personally, I see it quite the opposite. We know what he does (clanlib is out there, you can look at it) and we don’t know at all your code so, for what we can see, he’s the person with some credentials and title to write about cross-platform C++ while you, kind sir, are not. Show us the code and let’s keep discussing.
Oh, I’m not bitter, at least as far as I know.
But OK, point taken nevertheless.
Simple: it is far easier for 3rd-party users to interact with C/H files than with asm objects.
I’m not sure if you answered my post or the post before it. (Plus, I don’t speak Latin.)
Oh, wait a minute:
Where did I say that I never **used** nmake in the past (or don’t retry using it from time to time)?
I do not use nmake *now* because I’d rather simply force my downstream users to use GNU make, as it’s simply far too hard to get the brain-dead nmake (and I’m being -very- polite) to do anything useful.
Again, assumptions, assumptions, assumptions.
Call it lacking reading comprehension skills on my end… but… Ughhh… what?
(The point is that Microsoft could have simply used standard methods to write “safe” functions but instead, like 10^6 times before, chose to invent a new and completely incompatible API. [Don’t get me started about the so-called POSIX layer.])
I must have missed that “show us the code before posting” warning sign, sorry.
You do understand that, well, whether I’m a competent C/C++ developer has nothing to do with the point I’ve made, and that the only personal comments were by the previous poster and you – and in both cases, those tend to be a clear sign of having a weak argument to begin with, right?
– Gilboa
Good, always a nice thing to come to a mutual understanding.
Fair point, but considering that 99% of 3rd-party users have no idea about assembly these days… kind of moot. I still think putting assembler code in separate files is a much better approach (and discussing it is quite beyond the scope of this thread).
Your post. And the Latin sentence, in free translation, means “one does not dispute tastes”.
*You* know *you* do not use nmake *now* – we don’t know that about you. All you said was “I don’t use nmake”, which for all the readers means just that… How on earth should we know whether or not you used it before?
You throw in a TLA and expect everybody to know it (or not), and if somebody doesn’t you go “Oh, so I guess you haven’t heard about it… doh”. Like many things in your post(s), this doesn’t buy you friends or understanding; it makes you look like a show-off.
They could have, they didn’t – so what? Being all up in rage about it won’t change a thing. I’m also not in love with the MS platform as far as native coding goes (until 3 days ago I had successfully stayed out of its way for 10+ years), but bitching never gets you farther than just coding your stuff. In my experience there’s always another way to do something – and if your way of doing something is becoming more and more tedious and burdensome, then there’s something wrong with it.
There’s no such policy in place, of course, but you attacked a person who showed his “credentials” while you haven’t shown yours. Makes you a bit less credible, at least IMO.
You commented on something which requires at least some working knowledge of C++. Also, your comment was written as if coming from somebody who knows C++ in and out – that puts it in a different perspective, doesn’t it? As for the personal comments – the way you presented your point was offensive and confrontational for no real reason. And it did look like the usual anti-MS trolling (so 1990s IMO). And before you comment on the last sentence – I’ve been using Linux since literally its beginning, I used to “hate” MS in the GNU zealot style but I grew up. Let me offer you a piece of advice here – just get your stuff done, don’t waste time on bitching about things you can’t change. I suppose you consider yourself to be a hacker. If yes, behave like one – work around limitations, change stuff you don’t like, invent new ways of doing things and “code boldly when no-one has coded before”
OK… But nothing you say negates my original/second post:
My beef with VS is rather simple: it requires me to do a lot of fancy legwork to get (fairly simple) things working – far more than any other platform that I use/support – and as a result, I’m slowly dropping it in favor of MinGW64.
– Gilboa
Oh, I see. And you haven’t heard about the memory barrier intrinsics either, which guarantee just this. And why is it so dangerous for the compiler to improve your code? Do you want it to run as slowly as possible?
So you write cross-compiler code, except that you use one specific feature not supported by all compilers. Which makes it not cross-compiler code after all.
The build system is not part of C++. So it says absolutely nothing about their commitment. But if you like GNU make so much (not that I understand why), why don’t you just compile your program with GNU make?
So the message scares you. And the scary message takes 20% of your time. I’m shaking in my boots!
I do not use any of these C functions. I code in C++ with C++ features such as string classes, which ensures I don’t get buffer overruns of this type. So yes, I hadn’t heard about EEE, which I assume is some solution to a C developer’s problem which does not apply to me. The only reason I know about this warning at all is because I compile C dependency libraries once in a while.
Not that it’s particularly relevant, since it can’t be this warning message that you spend 20% of your development time on…
Takes one to know one?
But here’s some more I know best attitude for you: If I can target 4 platforms (Windows, Linux, Mac and iOS) using 3 different compilers (MSVC, GCC, LLVM) doing high performance multi-core software rendering and only spend a fraction of my time on differences between the compilers, then what am I doing right that you are doing wrong?
I think EEE means Embrace, Extend, Extinguish.
Yeah, sure.
I’m being hired to write slow code.
#ifdef _MSC_VER?
Let’s agree you’ll actually try to *read* what I wrote before posting, K?
OK. Good luck to you, then.
OK. Good for you.
Have you considered using Google before going on a posting rampage?
Failure to read comments, again, etc, etc, etc.
So, you are doing multi-core rendering code on 4 different platforms using 3 different compilers.
As far as you know, I’m doing X on platforms X, Y, Z, possibly PC (or possibly not); I may be writing the code in C, C++ or assembly (or possibly all), and the code is running in either user mode or kernel mode (or possibly both).
… And yet you assume that you know what I’m doing right and wrong.
Please continue. I’m fascinated.
– Gilboa
The only reason I commented on your original post was because I’m tired of hearing how it’s always the fault of Visual C++ if someone’s code does not compile there, and because you were talking as if you had decades of experience on the matter. So I voiced my opinion as a different perspective for those who might not know better.
Yes, you could be doing X, Y and Z on W, but to be honest I find it highly unlikely you’re doing anything professionally with the attitude you are showing.
Seriously, grow up.
You don’t believe me. Oh! The pain. (Hint, I’m being -very- sarcastic *)
– Gilboa
* I could point out, again, that your attempts at badly constructed personal insults do little to prove your point; but we’ve already been there once or twice. So, please, do continue – you amuse me.
It is so incompatible that it is part of the C standard process, ISO/IEC WDTR 24731, TR 24731-1.
http://www.open-std.org/jtc1/sc22/wg14/www/projects#24731
And was actually contributed to ISO by Microsoft.
I was unaware that the _s “secure” extensions had been ISO’ed.
I concede the point. (I do continue to claim that using the MS CRT and CRT flags generates broken cross-platform code, but that’s another argument.)
– Gilboa
I personally find inline assembly to be bothersome. Why use inline assembly when you can write a module in external assembly with a cross-platform assembler (yasm,fasm,nasm,whatever) that can then be used with your compiler of choice? If you’re doing little enough assembly that an external module seems like too much work, then you should probably be using intrinsics instead.
I personally prefer using the preprocessor as little as possible – there’s often a better solution. Inline template functions have fewer nasty surprises and are easier to debug, and if you’re doing large amounts of complex preprocessing you might be better off using a code generator instead? Hard to tell without knowing your use cases, though.
Try it and… what? Buggy code generation, warnings, sloppy code, what? And which compiler version(s)? For VS2010, there’s nothing weird.
Why, when there’s msbuild? Alternatively, cmake or scons. I wouldn’t write makefiles in GNU make either. Autoconf/automake is a big barrel of yuck – unless you need to support really broken platforms, just write portable code.
It’s been submitted as a standard, but you almost have a point – I haven’t found a non-MS implementation of it. Haven’t looked hard, though, since I stay away from the yucky C strings as much as possible.
How so? Nobody forces you to use strsafe. And if you look at *u*x source, there’s a fair amount of it using proprietary GCC extensions
As I said in another post in this (far-too-long) thread, using inline assembly functions and/or macros is far easier both for me and my downstream users, as they are only forced to use an H header file and they are done – no matter what build environment and/or compiler they are using.
Plus, at least to me (and this is purely a personal preference), inline-asm-to-C interaction is far easier to debug compared to using a full-blown ASM compiler such as nasm. (No need to muck around with db / sections / parameter passing, compiler constraints are glued to the code, etc.)
Macros are harder to debug, no doubt about it, but they offer far better flexibility when it comes to accessing the calling function’s local parameters and symbols. Plus, and this is purely personal experience, I have found that in some cases the compiler’s optimizer manages to chew out better code when dealing with macros – mostly when it comes to loop unrolling and dead code removal. (Come to think of it, the best example is inline assembly within inline functions, compared to macros.)
I’ve yet to test under VS 2K10, but under 2K3/2K5/2K8 this generated *mounds* of warnings due to a compiler error.
Having to replace 500 lines of A<+-/*>=Z with A=A<+-/*>Z just because MS refuse(d) to fix a bug is very annoying.
Though, as you suggest, I might be nit-picking. (I actually faced this problem a couple of weeks ago so the pain is fairly recent…)
Assume that you have a good build system that works (written around GNU make and, to some extent, nmake) and that uses (very) thin client-side Makefiles.
Using msbuild requires a complete rewrite of the Windows side, and the configuration files are anything but portable. Option deleted.
Switching to cmake would require a complete rewrite of both the configuration files and the build system, and cmake integration with kbuild is anything but clean (call it personal preference), especially if the same project is designed to run in both kernel mode and user mode.
… In the end, I forced a decision to use GNU make on Windows, but my life would have been far easier if MS had simply decided to release a working make.
Plus, when you are forced to use native tools only (something I have faced in the past), having to choose between msbuild and nmake is like choosing between death by stabbing and death by hanging. You may suffer less (or more) but the result will be the same…
It was pointed out in another post; I wasn’t aware of the accepted ISO TR.
Microsoft has a very annoying tendency to make CRT code non-portable.
E.g. WSAStartup, closesocket, begin/endthread, _<function>, printf types (Unicode, 64-bit, etc.), 64-bit int instead of long, etc.
Sure, one should use the native CreateThread/WSAFunction/etc., but unless you know in advance that MS CRT code is **not** portable, and use an OS abstraction layer (my_thread_create -> pthread_create/CreateThread, my_socket_close, etc.) and abstract types (__i32, __q64, etc.), you are royally f***ked.
YMMV, but over the past couple of years I have ported a number of Windows-only projects that fell right into this trap. (My favorite being the 64-bit int and the socket functions, of course.)
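For illustration, the kind of (very thin) abstraction layer I mean – the my_thread_t name is made up, of course:

#ifdef _WIN32
#include <windows.h>
typedef HANDLE my_thread_t;    /* CreateThread() hides behind this */
#else
#include <pthread.h>
typedef pthread_t my_thread_t; /* pthread_create() hides behind this */
#endif

/* Callers only ever see my_thread_t and my_thread_create(); they
   never touch the platform API or the MS CRT directly. */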
– Gilboa
Valid point, although it does mean you might need a fairly large amount of vendor-specific ifdeffing… with external assembly modules, you (should) only need one per architecture. What about intrinsics, though? It’s been a while since I dealt with them, but haven’t they been sort of de-facto standardized between the major compilers?
For me, assembly tends to be only worthwhile if I need sizable chunks of it – so the glue is minimal.
Can you offer a sample where macros make your life a lot easier? I’m pretty curious, since I haven’t bumped into many cases myself.
…and that’s one of the reasons inline assembly was dropped in VC++, because it makes the optimizer’s job harder. Intrinsics?
It’s not nit-picking if you deal with compiler bugs – I didn’t have older versions of VC around, so I was confused with what the purpose of your code snippet was.
If you’re explicitly dealing with shorts instead of ints, I suspect it’s for very well-considered performance reasons limited to a few source modules. While a bit ugly, couldn’t you just disable that particular warning for those particular modules?
Yes, -Wall and -Werror are nice, but IMHO you’re permitted leniency when dealing with compiler bugs
Legacy concerns, fair enough.
A “working make”, or “their own version that exactly mimics GNU make”? There’s a difference.
“Microsoft has a very annoying tendency to make CRT code non-portable. E.g. WSAStartup, closesocket”
Very little extra code is needed – and sockets aren’t part of the C++ standard anyway. If you want portability, you use an abstraction.
Not optimal I agree, but threading wasn’t part of the standard until C++11… so you had to use an abstraction if you wanted portability.
What’s the alternative, though, if you want to use printf? At least MSDN lists which are standard and which are extensions. If you do idiomatic C++, you wouldn’t be using printf anyway
If you write code that depends on integer size, you should be defining those types explicitly, since C++ doesn’t offer you any explicit guarantees. In fact, the logical choice is to have ‘int’ be the native machine word size for performance reasons. I’ve always wondered why GCC chose a 32-bit int on x64.
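E.g. (C99’s <stdint.h>, also picked up as <cstdint> in C++11):

#include <stdint.h>

int32_t counter;   /* exactly 32 bits on every platform */
int64_t filesize;  /* exactly 64 bits, however wide 'long' happens to be */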
Not everybody writes portable code – sometimes because of ignorance, sometimes out of lazyness, sometimes because people don’t want the code to be portable… which sometimes seems to be the case with *u*x centered stuff. “Portable? Oh yeah, that means any POSIX system with linux-specific extensions”
Partially; 32-bit vs 64-bit will most likely require duplicate code, and so will different arches (ARM vs. x86).
Not standard as far as I know.
Beyond that, intrinsics are only useful as long as you are doing something simple – say, locked bit-test-and-reset. Once you start doing something more complex, using multiple sequential intrinsic calls tends to be sub-optimal.
Last but not least, it’s -very- hard to write dynamic-arch code with intrinsics: e.g. dynamically selecting the ideal popcnt implementation at runtime is not possible when using _popcntXX, as you have little control (if any) over what the compiler is doing.
Far too much mucking about. Plus, it’s fairly hard to “optimize” the glue by using loads and loads of macros.
Off the top of my head, a short gcc-only debug sample (not sure if it even compiles, but you’ll get the idea):

#include <stdio.h>
#include <stdlib.h>
#include <errno.h>

#define FAIL_ON_ERROR(__func) \
({ if ((error = __func) < 0) { \
       printf("Failure! File: %s, Line: %d, Function: %s, Error: %d.\n", \
              __FILE__, __LINE__, __func__, error); \
       error = -error; \
   } else { \
       error = 0; \
   } \
   error; \
})

int test_function(int value)
{
    return ((value > 3) ? 0 : -EINVAL);
}

int main(int argc, char *argv[])
{
    int error;

    if (argc < 2)
        return 99;

    if (FAIL_ON_ERROR(test_function(atoi(argv[1]))))
        goto err;

    printf("OK.\n");
    return 0;

err:
    return error;
}
Now, if I use an inline function, I lose access to local symbols (__FILE__, __LINE__, __func__ and error). If I use VS, I cannot return values. (I know I could rewrite the macro as a combination of inline functions and macros, or as a series of (X) ? A : B, but this only works if the code itself is simple, and in general it will make the code highly unreadable and almost impossible to maintain.)
See comment about intrinsic(s).
I could report a bug… but bugzilla.microsoft.com doesn’t work
Especially when dealing with someone else’s code, this warning makes a -lot- of sense. (E.g. 64bit int problem; using the wrong types in numerical calculations, etc)
I find that -Wall saves me a lot of trouble, even if I’m forced to #pragma out some warnings in explicit blocks after some thoughtful consideration.
In general yes. (Minus kbuild)
I could live with a different syntax, as long as ifXX and include syntax is the same.
In my case, as long as the per-sub-project thin Makefile is the same and I’m not forced to shove some 3rd-party tool down my downstream users’ throats, I couldn’t care less if the build system code is partially duplicated.
The problem is that nmake is really, really, really brain-dead.
In my experience, the problem is that developers, even experienced ones, tend to think that CRT == standard and, as a result, throw huge amounts of CRT code everywhere instead of using abstract types and functions.
The problem is that MS, which came late to the 64-bit game, could simply have used %lld/%llu instead of %I64d…
As before, a lot of developers __assume__ that if they use int, it is the same size on all platforms.
Microsoft came late to the 64bit game, and could have easily made life easier by choosing 64bit long.
Can’t say that I’m too surprised they selected int for 64bit. (Even though I do agree that choosing long for 64bit is… ugh, weird).
The sad thing about the project I helped port is that the developers *thought* that as long as they stuck to the CRT, they were platform-independent.
I wasn’t the most popular man on the planet when I told them that even the types they used had to be replaced…
– Gilboa
Actually, Herb Sutter is promoting a C++ renaissance inside MS:
http://herbsutter.com/2011/07/28/c-renaissance-the-going-native-cha…
Now that I’ve started learning Ada 2005, C++ feels just kind of crusty…
I also like the language, but I fear that it will never take off outside niche markets due to the way Ada compilers were sold in the past.
They were too expensive, plus Ada was confined mostly to military development, which kind of drove people away to other languages.
Which is a pity, since it would save us from a few buffer exploits.
Hi!
Could anyone provide pointers on how a programmer who knows C quite well, and just some C++, can learn C++0x the proper way – i.e. use C++0x constructs as they should be used, without going needlessly deep into older C++ constructs? E.g. iterators got a revamp in C++0x, haven’t they?
Any ideas?
Thanks in advance!
Actually Microsoft has a few sessions on it on their developer’s channel.
http://channel9.msdn.com/Shows/Going+Deep/C9-Lectures-Introduction-…
http://channel9.msdn.com/Shows/Going+Deep/C9-Lectures-Stephan-T-Lav…
Thanks!
Does it cover the new stuff in C++0x? Looking at it makes me think it doesn’t.
Maciej
Partially. He does use a few things from the new standard like auto, closures and new type initializers.
C++ is a fugly language. I don’t understand why anyone would want to program in it.
Several stock exchange systems, among the largest and fastest in the world, are developed in Java. Java is a much simpler language than C++. Sure, Java has its flaws, but anyone has to admit that Java is much simpler than C++.
Perfection is achieved – not when there is nothing left to add – but when there is nothing left to remove.
Stroustrup allows any feature to be added to C++, and never says no. This makes C++ a horrendously bloated language. Last I heard, 10 years ago, the C++ specification was 800 pages. Scheme is completely described in 24 pages. Scheme is beautiful. C++ is ugly.
Less is more. Not, “more is more”. Gosh.
Regardless of its flaws, there is no other native language that offers the performance and set of abstractions that C++ does.
Until that language comes, most native code programming will be pretty much done in C++, with some C as well.
Ada. But nobody seems to want to use it.
I met a developer working at a fixed income stock exchange, and he said that recent benchmarks showed that Java was faster than C++ (it was algorithmic trading or something similar) in that particular application. Algo trading needs to be done really fast.
So, I beg to differ. Java rivals C++. Some of the largest stock exchange systems are done in Java.
Java has always excelled on the back end, as server applications. But as a front end Java is not that good. Sun always focused on Enterprise servers, and they tailored Java to the back end.
Microsoft has always been focusing on desktops so C# is very good as a front end client language. But on the server, Java / C / C++ is the norm. I have never seen a big system in C#. For instance, London Stock Exchange threw out their stock system running Windows, using C# – because of all the crashes and bad performance. On the back end, I never see C#. It is exclusively Java / C / C++. Though Erlang is slowly gaining ground.
Java is much simpler than C++, and Java is roughly as fast as C++. So I don’t see why people want to use C++ instead.
Funny that you mention this. Actually, the guys from the London Stock Exchange rewrote their system in C++:
http://www.computerworlduk.com/in-depth/open-source/3260982/london-…
And Microsoft is now saying to everyone that they are sorry to have placed C++ in second place on their tools and will be back supporting it 100%.
http://www.computerworlduk.com/in-depth/open-source/3260982/london-…
Java is quite fast, and in many cases it can perform as well as C++. But in games and HPC it still loses out to C++. And let’s not forget what a disaster generics are.
They are not as powerful as C++ templates.
C++ compilers are the only ones that are able to beat Fortran in numerical analysis, and this is thanks to heavy use of template metaprogramming.
Regarding the London Stock Exchange: no, the guys did not rewrite their system. LSE bought a new system that runs on Linux + Solaris. Solaris is used for the Oracle database back end.
I agree that some aspects of Java are ugly, but no language is perfect.
Sun said that Java will change slowly. This is the aim in the Enterprise: slow changes, backwards compatibility. Servers are run for many years in the Enterprise; slow changes. But the libraries will change fast, Sun has said. Java will not change fast; Java will not get every cool feature – maybe that feature is not cool in a few years? C# changes fast; that is not for the Enterprise. The Enterprise does not like fast changes. Just check for yourself.
Still the new system is C++ based.
Yes. So?
No one disputes that C++ is fast enough to handle big stock exchanges. The question is: is C++ the only language that is fast enough?
Answer: No, Java is fast enough today. Thus, C++ is not the only language when you need Enterprise performance. There are other fast enough languages as well.
I am playing a bit the devil’s advocate here, to be honest.
I do know JVM/.NET-based systems quite well, since they are my daily tools, and they are quite fast if coded properly.
But C++ also was my daily tool for several years and I am pretty versed in the standard.
It is true that most VM systems are quite fast nowadays. Still, those systems are themselves usually coded in C++, even though some attempts have been made in VM languages.
If you are targeting native code, there is no language that can easily replace C++. At least one that offers both the low level access and the high level abstractions.
These things are important, when you are writing code that has to be cache friendly, for example.
Actually, I would like to see an FP language take off as a possible replacement, but most “standard” programmers cannot grasp FP. As for Go or D as possible replacements, they are still far away. So that leaves us with C++ when you need to go down to the metal.
Java written to be super-fast is even more fugly than C++ code.
I really do wonder when I see statements like that.
My guess is the C++ implementation was a port of Java code (or naïvely written C++) rather than idiomatic high-performance C++. Hint: just because the syntax is similar, best practices aren’t equal in the two languages. You don’t want to do a lot of new/delete (or smart-ptr equivalents) in C++ code, at least not with the default allocators. But C++ gives you the freedom to use your own allocators… if you know enough about your allocation patterns, you can beat GC pretty severely.
If the bottleneck was std::string, well, you’re free to cook up a fast alternative – it’s not rocket science.
If the bottleneck was elsewhere, it would be interesting to know. Theoretically, a JITer could generate better code by knowing the hotspots; I just haven’t seen it myself – especially not compared to a decent C++ compiler with profile-guided optimization.
That’s more of a business decision than a language/performance trait, though.
I think you know that the largest stock exchanges need to be really fast, with extremely low latency and massive throughput. The High Frequency Traders and algo traders choose another stock exchange if the exchange is not fast enough.
So, how can stock exchanges made in Java rival C++ stock exchanges? Have you thought about that? Is the GC such a big problem? No – if you know what you are doing, then Java’s GC is not a problem. I know how they do it. :o)
Have you read the official benchmark study from Google a while ago? Java vs C++ vs Go, etc.? Google’s conclusion was that C++ is the fastest – if you are a C++ expert and write non-trivial code. So, you can gain a few percent in speed – if you are a C++ expert and use heavy optimization. That sounds exactly like assembler: you gain a few percent in speed if you are a good asm programmer. Needless to say, asm programming is diminishing.
“That’s more of a business decision than a language/performance trait, though.”
I don’t agree. Java’s VM is extremely fast on the server side. C# is not that optimized. Sun knew Enterprise servers; MS doesn’t. Just look at the London Stock Exchange, which ran MS Windows + C#. A total failure.
Either you don’t deal with a lot of allocations/deallocations to begin with, or you did but changed to an object-reuse scheme.
A lot of what’s needed to make a stock exchange fast has to do with networking, anyway – location and hardware is important, and then comes the OS network stack. None of those have anything to do with C++ vs Java
I skimmed it when it came out, and I’ve just skimmed it again… most of the C++ performance tweaks (especially the ones accounting for the largest speed boosts) don’t really look “expert” or “non-trivial” to me – they’re more about knowing your data structures and moving allocations out of loops. “Oh, is a container with O(1) lookup faster than one with O(n) and bad cache traits? ZOMG YOU’RE AN EXPERT!”
The fastest Java version ran 3.7x slower than the C++ version. “A few percent”? The normal Java versions were 12.6x slower (32bit) and 5.8x slower (64bit). The Java versions used 3.7x as much memory as the C++ version.
Furthermore, this is one specific benchmark, and it’s very heavy on use of data structures. That’s relevant and important for real-life code, but if you really want to see C++ shine you should compare a task that’s very compute-heavy and focuses less on data structure traversal.
The paper is slightly flawed, btw – it doesn’t mention which C++ compiler was used (I’d wager some version of GCC), nor which optimizations were involved, nor which libc implementation+version (which is pretty important when dealing so much with data structures). You can add this as a benefit of Java – fewer target platforms.
“I don’t agree. Java’s VM is extremely fast on the server side.”
Dunno about that; it has always seemed relatively sluggish to me. The main arguments in favor have been garbage collection (which is somewhat of a moot point because it doesn’t solve non-memory resource leaks) and that Java programmers are a dime a dozen… and that managers tend to love enterprisey things.
Dunno what went wrong there, to be honest, so I can’t really comment on it. I also haven’t pitted Java against C# performance-wise (I do know I prefer the C# language to the Java language… framework-wise both have their pros and cons). C# has language features that at least give it theoretical advantages over Java, and I’d be surprised if the JIT is much worse than what Sun/Oracle has in the JVM… but again, I haven’t pitted the languages against each other.
Specification might be simpler, but is working in it simpler? Nobody says you have to use every feature of the language to get the job done. I do Java development at work (and would prefer to do C#), but C++ is the language that has a special place in my heart.
Let’s see…
1) no deterministic destructors. GC only solves memory (de)allocation, and leaves you to do 100% manual other-resource management.
2) generics suck. They don’t encode type information (bad for reflection), and are really just syntactic sugar for casts. Can’t work on primitive types so you suffer the horrible overhead of (un)boxing if you try to do generic code over primitives. Yay C++ templates!
3) in the same vein, only heap allocated objects. Yeah sure, the (de)allocator is pretty fast, but still not as fast as stack pointer add/sub.
4) no operator overloading (yes, you can shoot your leg off with it if you’re irresponsible, but it can be pretty nice when you know what you’re doing – godsend for DSLs).
5) It’s the Java way or the highway. In C++, if there’s something you don’t like, you can always write your own version. There’s nothing magical about, say, the string type.
6) Multiple Inheritance. If used responsibly (ie. mixins) it’s a great feature.
7) Even more portable than Java – though of course you don’t have nearly as rich a stdlib.
8) No type inference, even C++ has that now.
9) No lambdas, even C++ has that now (see the sketch below for both).
10) PER FOR MANCE.
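To illustrate points 8 and 9, a quick sketch of what C++ has now (C++0x/C++11):

#include <algorithm>
#include <vector>

int count_above(const std::vector<int>& v)
{
    auto threshold = 2; // 8) type inference: 'auto' deduces int
    // 9) a lambda passed straight to a standard algorithm
    return static_cast<int>(std::count_if(v.begin(), v.end(),
        [threshold](int x) { return x > threshold; }));
}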
I assume that’s why it has taken so many years for C++11? Because features are just accepted, without proper discussion? Yeah, that must be it.
Beauty is in the eye of the beholder.
Yep, less is more – the power and flexibility of C++ (especially with the new standard) lets you get your code done with fewer lines than some other languages… while still remaining maintainable.
I share a similar experience.
If you do it right, then GC is not a problem. The world’s fastest stock exchanges reach extreme performance. How do they solve the GC problem? I know how they do it.
“5) It’s the Java way or the highway. In C++, if there’s something you don’t like, you can always write your own version. There’s nothing magical about, say, the string type.”
C++ allows many ways. Java allows few ways. Few ways are preferable in my opinion. If you have written your own special C++ solution using some parts of C++ that no one else use, then new programmers need to learn your special framework. This sounds like PERL – where you can write code in your own unique style that no one understands. C++ is so huge that most programmers only use a subset, and the subsets can be different. This is a bad thing in my book.
The places where you really need multiple inheritance are few and far between.
Several of the world’s largest stock exchanges are written in Java – for instance, the NASDAQ stock exchange.
“Yep, less is more – the power and flexibility of C++ (especially with the new standard) lets you get your code done with fewer lines than some other languages… while still remaining maintainable.”
I don’t agree. If you use 10% of C++ and I use another 10% of C++, and I have used some unique special C++ feature, then that code is not maintainable.
Contrast with Scheme which is small and compact. There is not much to learn, so any Scheme programmer can start to work away.
Regarding writing less code with C++: well, I heard that to connect two telephones in C++, the telecom companies need 0.5 MB of C++ source code. In Erlang, you can do that in a few hundred lines of code. Erlang cuts the amount of code down to 33% or so.
Other than that, I agree with you. The Java framework is huge. Java generics may suck. Etc. But I never claimed Java is perfect – it is good enough. Whereas C++ is like assembler: fast and difficult. I want to program, not try to figure out syntax with all those *&>< characters.
If there are a lot of small objects involved in the stock exchange (and if the really performance-sensitive parts are running Java – I somehow doubt it), then they’ve probably come up with a workaround of not allocating/deallocating objects, but re-using them. That can lead to some pretty nasty-looking code. It’s a lot of ifs, though, since I don’t know the systems. And you totally ignored that deterministic destruction is an issue with non-memory resources.
The point is that well-used language features can result in code that is very easy to read – and very easy to write, without having to do up-front reading of huge API specifications. I take it you’ve never worked with good fluent interfaces in an environment with proper code insight/intellisense?
And if anything is framework-heavy, it’s Java. The language might be simple, but for enterprise work you’re forced to deal with big, over-engineered frameworks with insane class hierarchies and lots of up-front documentation reading.
The right tool for the job – you use what you need, and what makes the job easy and the code readable. Java has an extremely huge class library – would you use every part of it for every project you do?
If you “need” multiple inheritance, you’re pretty much “doing things wrong”. But try reading what I wrote: mixins. In Java (and C# for that matter), you need to “implements MixinInterface”, and then implement the code for MixinInterface in every class you want to use the mixin in… sure, you can (often) use object composition for the implementation part, but you’re still faced with writing delegating stubs in each and every class that wants the mixin. In C++, you simply add the mixin class as an MI superclass, and It Just Works(TM).
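A minimal sketch of what I mean (class names made up):

#include <iostream>

// The mixin: inherit from it and print_id() comes along for free.
class Printable {
public:
    void print_id() const { std::cout << "object @ " << this << "\n"; }
};

class Entity { /* some domain base class */ };

// MI: just add Printable as a second superclass - no delegating
// stubs, no per-class boilerplate.
class Order : public Entity, public Printable {};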
Perhaps it’s time you back up this claim with some references? Even if true, it’s only one isolated use case, and would very likely be due to politics rather than language performance.
“I don’t agree. If you use 10% of C++ and I use another 10% of C++, and I have used some unique special C++ feature, then that code is not maintainable.”
Why does that make the code unmaintainable? If you used “unique special C++ features”, then you hopefully had a reason for doing so – performance, code eloquence, or whatever. If you used “special features” just to use them, you would be a jerk who would write crap code no matter what language you were jockeying around with.
C++ is not an end-all-be-all (though it's probably the single language that comes closest) – good developers use the right tool for the job. Erlang is heavily geared towards parallel/concurrent programming, something C++ doesn't offer a lot of features for.
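To be fair to the new standard, C++11 does add basic concurrency building blocks (std::thread, std::mutex, std::async), though nothing like Erlang's message-passing and supervision model. A minimal sketch:

    #include <iostream>
    #include <thread>
    #include <vector>

    // C++11 gives you raw threads and joins; message passing,
    // supervision and distribution (Erlang's strengths) are up to you.
    int main() {
        std::vector<std::thread> workers;
        for (int i = 0; i < 4; ++i)
            workers.emplace_back([i] {
                std::cout << "worker " << i << " running\n";  // output may interleave
            });
        for (auto& t : workers)
            t.join();
    }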
C++ isn’t difficult unless you try to make it difficult, really. Sure, legacy codebases might be ugly, but that’s usually true of any legacy codebase. “All those characters”, heh. Yes, you need ‘.’ vs ‘->’ depending on pointer vs. value/reference data types, and for your function declarations you need to decide on pass-by-ref/pointer vs. pass-by-value. I personally can’t see why that should be a big hurdle.
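For anyone following along, a small sketch of the distinctions being discussed (type and function names invented for illustration):

    #include <string>

    struct Account { std::string owner; };

    // Pass-by-value: the callee gets a copy; the caller's object is untouched.
    void rename_copy(Account a)  { a.owner = "copy"; }

    // Pass-by-reference: the callee works on the caller's object directly.
    void rename_ref(Account& a)  { a.owner = "ref"; }

    // Pass-by-pointer: like a reference, but may be null, and uses '->'.
    void rename_ptr(Account* a)  { if (a) a->owner = "ptr"; }

    int main() {
        Account acc{"original"};
        rename_copy(acc);   // acc.owner is still "original"
        rename_ref(acc);    // acc.owner is now "ref"
        rename_ptr(&acc);   // acc.owner is now "ptr"
    }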
And with C++, because of RAII I spend a lot less time worrying about resource cleanups and leaks than I tend to do with GC languages. Sure, GC solves cyclic references – but how often do you have data structures with cyclic references, compared to how often you need to manage objects that deal with non-memory resources that need cleanups?
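The RAII point in a nutshell, as a sketch (the logging scenario is made up): destructors release resources at scope exit on every path, including when an exception is thrown, so there is no cleanup code to forget:

    #include <fstream>
    #include <mutex>

    std::mutex log_mutex;

    // Both the lock and the file are released automatically when the
    // scope ends, on the normal path and on exceptions alike.
    void write_entry(const char* path, const char* line) {
        std::lock_guard<std::mutex> lock(log_mutex);  // unlocked by destructor
        std::ofstream out(path, std::ios::app);       // closed by destructor
        out << line << '\n';
    }                                                 // ~ofstream, ~lock_guard run here

A finalizer in a GC language gives you no such guarantee about when (or whether) the cleanup runs.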
But I know the systems. I work in finance. And the matching engine, the subsystem that pairs buy orders and sell orders, is the part that needs the high performance and extremely low latency demanded by HFT and algo traders. NASDAQ's matching engine is written in Java and runs on Red Hat. There is an article which says they run on Gentoo, but that is wrong; there are other errors in that article, btw. It talks about the New York Stock Exchange having a latency of 2 ms. That is really bad. A fast stock exchange should have a latency of 0.1 ms, that is, 100 microseconds. Which is what you get with Java at NASDAQ.
And yes, the trick is to reuse objects and never destroy any, so the GC is never triggered. It would be a disaster if a fast stock exchange system triggered the GC and paused for a few ms. A disaster.
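The reuse trick itself is language-agnostic; here is a sketch in C++ for consistency with the rest of the thread (names invented). Everything is allocated once at startup, and the hot path only recycles slots, so nothing is ever freed (and in Java, nothing would ever become garbage):

    #include <cstddef>
    #include <vector>

    struct Order { long id; double price; int qty; };

    // Fixed-size pool: acquire() pops a recycled slot off the free list,
    // release() pushes it back. The hot path never calls new/delete.
    class OrderPool {
        std::vector<Order>  slots_;
        std::vector<Order*> free_;
    public:
        explicit OrderPool(std::size_t n) : slots_(n) {
            free_.reserve(n);
            for (auto& o : slots_) free_.push_back(&o);
        }
        Order* acquire() {
            if (free_.empty()) return nullptr;   // pool exhausted
            Order* o = free_.back();
            free_.pop_back();
            return o;
        }
        void release(Order* o) { free_.push_back(o); }
    };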
Regarding C++'s many features: there is a common complaint that C++ developers only use a subset of the language, and the subsets might not overlap, which makes C++ code hard to maintain. One programmer used feature A, the other used feature B, and they do almost the same thing.
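A concrete instance of that overlap: the classic iterator loop and C++11's new range-based for do exactly the same job, and one codebase can easily end up with both:

    #include <iostream>
    #include <vector>

    int main() {
        std::vector<int> prices{10, 20, 30};

        // Programmer A: the pre-C++11 iterator idiom.
        for (std::vector<int>::const_iterator it = prices.begin();
             it != prices.end(); ++it)
            std::cout << *it << '\n';

        // Programmer B: the C++11 range-based for. Same behavior,
        // different feature: exactly the subset problem described above.
        for (int p : prices)
            std::cout << p << '\n';
    }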
Java is not perfect, but it is a simpler C++, and simple is good; complexity is to be avoided. I agree the Java Enterprise frameworks are bloated, but the Java language itself is simple and quite small. Just as Sun wanted.
And a big part of how you get that low latency is hardware, physical location and the OS network stack. You don't get it because of Java; you get it in spite of Java…
In other words, you need to work around a core language feature in order to get acceptable speed. And while you put less pressure on the GC by reusing objects, you’re not stopping it – hopefully the system has other tweaks (like a non-default GC) to ensure more-or-less deterministic behavior.
This sounds like management speak, or poppycock from people who haven't learned enough and certainly never took more than a brief look at C++ (I guess that would also cover managers). It's much easier to get lost in the over-engineered libraries for Java than it is to get confused by a few language features in C++.
Sometimes too simple, causing over-verbose code that is harder to read (and more tedious to write) than the more concise code C++ and C# allow.
Perhaps, but pretty much every Java server application I've seen (ServiceDesk, Zimbra, MSP Center, OpenNMS, OpenFire etc.) sucks the devil's balls when you have to actually work with and configure it. Yeah, 50 pages of Java backtraces are soooo helpful when chasing down an error. Or not. Not to mention configuration files spread all over the damn place, usually in XML.
C++ may be a bloated language, but Java has bloated (and over-engineered) applications.
True words.
And combine that with a language simplicity that often makes code overly verbose… Go Java!
I wish OSAlert would do a kind of intermediate programming series. I am by no means a good programmer, and some of the above comments on asm and the reasons for the different additions to C++ are above my head. I can't find any good intermediate comp-sci-type blogs.
I wish the local colleges weren't astronomically expensive (no good public college within 100 miles of home, and three private jebus colleges, one of them 100 yards from home).
Oh well. I guess I will continue lurking on stackoverflow, here, and a few other places that have been able to fill the void the US higher education system has left.
“C++ is a horrible language. It’s made more horrible by the fact that a lot of substandard programmers use it, to the point where it’s much much easier to generate total and utter crap with it. Quite frankly, even if the choice of C were to do *nothing* but keep the C++ programmers out, that in itself would be a huge reason to use C.”
I can't help loving this quote :p
A lot of huge software has been written using C++: Photoshop, the JVM, Firefox, MySQL, Maya, WinAmp, etc. All KDE stuff is written in C++.
I like C a lot and have a lot of respect for Linux and its creator, but when it comes to C++, Linus Torvalds is just a troll.
There, fixed that for you
He obviously has skills, but he really is a big, arrogant, patronizing troll when dealing with things he doesn’t like/know/understand.
I think what one can learn from that is that one should blame the programmer, not the language the programmer writes his software in. A horribly undisciplined programmer who writes utter crap in C++ will do the very same in C, Java, C# or whatever language one wishes to throw at the problem. Take Photoshop as an example: its shittiness in terms of bugs and Mac OS X compatibility has more to do with code rot and Adobe refusing to do regular maintenance than with Mac OS X, C++, or a lack of utilising Cocoa. Moving from C++ to some other language won't solve something that is baked into the very corporate culture.
That is why some people say that “You can write FORTRAN in any language”.
The meaning here is not that FORTRAN code is ugly (well, it is if we go back to the first versions), but that no matter how hard a language tries to force you to write good code, if you want to write bad code it is always possible.
Unfortunately, the slippery slope to code fuglification is the word 'flexibility': rules about doing things in a particular order are relaxed for the sake of 'flexibility', which lets the programmer produce a mess of unmanageable code. In many cases the solution should be to re-examine decisions made in the C++ and C standards and ask the hard question of whether a particular move enhanced the language or merely opened the door to ugly, undisciplined programming.
I wonder if Bjarne will publish a new edition of The C++ Programming Language, and if so, how much the page count will have to increase to cover the new standard?
Congratulations are in order!