“Those in search of eternal life need look no further than the computer industry. Here, last gasps are rarely taken, as aging systems crank away in back rooms across the U.S., not unlike 1970s reruns on Nickelodeon’s TV Land. So while it may not be exactly easy for Novell NetWare engineers and OS/2 administrators to find employers who require their services, it’s very difficult to declare these skills — or any computer skill, really — dead.” My Take: “C” dying should have been “x86 Assembly”.
My comment to their article:
This is a perfect example of an article written by someone who appears to have more experience reading press releases and print magazines than actually working in the field.
COBOL is still one of the most important languages on the face of the planet – virtually every financial transaction touches a COBOL system of some sort. Y2K was the best opportunity the world had to get rid of COBOL but it has lived through it, and now with Micro Focus buying AcuCorp there will be more unification of the varying platforms which should help its longevity.
As for C being dead, when was the last time you saw an operating system or device driver written in PHP, or even Java?
ColdFusion did quieten down during the v5 and v6 days, but it has seen a resurgence since v7 was released – Macromedia were actually surprised at how many sales they made, and they are busy polishing up v8 for release fairly soon (probably this summer). With what has been announced for v8, it will be one of the most powerful and easiest-to-use web development platforms anywhere and should see continued growth. To say that ASP, .NET, Java, et al. have superseded CF is just naive.
And what is up with the “PC administrator” details? What definition is being used? Every systems administrator I’ve seen has had to build machines from scratch, including component installation, OS & software installation & configuration, etc, etc. Just because you have an MCSE doesn’t mean you know how to install a device driver or copy a file from one drive to another, or know what FTP is.
So, please, if you are going to write an article lambasting technologies you don’t read about in industry magazines anymore, at least do some of your own research so you don’t come off looking like an ignorant beginner who just passed their MCSE.
And what is up with the “PC administrator” details? What definition is being used? Every systems administrator I’ve seen has had to build machines from scratch, including component installation, OS & software installation & configuration, etc, etc. Just because you have an MCSE doesn’t mean you know how to install a device driver or copy a file from one drive to another, or know what FTP is.
While that is correct, and I generally agree with you, having gone through the MCSE cert (years after already being in the biz, just to get the “piece of paper”), I assure you MCSEs know what FTP is if they passed the test, and would certainly be competent installing and configuring software, as well as copying files.
I understand your opinion, but don’t heavily discount the MCSE cert. It does greatly assist in making sense of the Microshaft Madness!
So, please, if you are going to write an article lambasting technologies you don’t read about in industry magazines anymore, at least do some of your own research so you don’t come off looking like an ignorant beginner who just passed their MCSE.
Agreed; I don’t see systems administrators the way I know them (and am one) going away anytime soon, especially given the state of both Windows server and client environments, and certainly not Linux, which is still feared in the 3 industries I’ve worked in so far (banking auxiliary services, legal, environmental).
When you’ve got Exchange patches breaking how Blackberry Server works, and a rogue Windows Update patch whacking svchost.exe all the way to 100% effectively taking down a workstation, who are you going to call? Joe User?
Windows (both client and server) work great, but are both volatile enough for the foreseeable future that nothing’s going to be a plug and play networking environment. Surely the author meant something different, though he wasn’t able to get that across in this poorly written article.
COBOL is still one of the most important languages on the face of the planet – virtually every financial transaction touches a COBOL system of some sort.
Agree. In fact, COBOL programmers can actually get paid quite a bit of money for maintaining legacy systems.
As for C being dead, when was the last time you saw an operating system or device driver written in PHP, or even Java?
I believe that the article was referring to C application developers. Which is probably true. C is in declining demand as an application language. System programming? Sure. Still in wide use.
Yup: the huge legacy COBOL applications are generally employed in banks.
And banks will NOT migrate them at gunpoint.
Good COBOL programmers are few and far in between, and are paid premium.
Besides, among the high-level job listings in newspapers there are always some “COBOL and RPG programmers wanted” ads asking for X years of experience.
If COBOL were not so horrible, it would be a fair career choice.
I think those C application developers will still have a lot of work from Novell and Red Hat for a few years… working on GNOME and related applications. This is just my guess anyway.
I’d have to agree completely with your statement. I don’t code in COBOL, but I completed a study about a week ago on COBOL use in the DoD, which heavily tied into private business use as well. COBOL is still prevalent on the backend, a large percentage of business apps (58% according to a Computerworld survey) are still written in COBOL. I was surprised at this last, but it makes sense with the installed base and it isn’t going anywhere anytime soon.
I’m really surprised Computerworld released this considering they also have 2 other recent articles stating how deep COBOL use is in most organizations.
http://www.computerworld.com/action/article.do?command=viewArticleB…
http://www.computerworld.com/action/article.do?command=viewArticleB…
COBOL was MADE for business. It incorporates accumulated business rules and processes, something which a lot of programmers just don’t get and why migrations to a more modern language usually fail. When it comes to transactional processing, there is still no match for COBOL. A very small percentage of organizations have successfully migrated mission-critical apps from COBOL to a more modern language. Most major upgrades have failed and organizations have resorted to modernizing the codebase using COBOL 2002 and web-based frontends.
As the first article says, “we will run out of COBOL programmers before we run out of COBOL programs.” I have been seriously considering adding COBOL to my skillset – specifically specializing in training myself to migrate apps away from COBOL. A lot of orgs want to do it, but they lack the know-how.
“””
As the Web takes over, C languages are also becoming less relevant.
“””
<sarcasm>
Of course they are, now, where’s that web-server written in PHP gone…
</sarcasm>
Edit: Quotes added for clarity.
Edited 2007-05-24 21:26
“””
As the Web takes over, C languages are also becoming less relevant.
“””
“””
Of course they are, now, where’s that web-server written in PHP gone…
“””
You are correct, but since the article talks about *dead* technologies, to say C is dead is a bit of a stretch, isn’t it?
My 2e, chris
‘My Take: “C” dying should have been “x86 Assembly”.’
I *entirely* agree (at least about ‘C’). ‘C’ is FAR from dead or even dying. In fact, for its main purposes, it’s probably more popular than ever. It’s by far the best way to do any kind of firmware/microprocessor programming, and it is the de facto language for low level operating system design and driver development.
‘basic C-only programmer today, and you’ll likely find a guy that’s unemployed and/or training for a new skill’
Or, you’ll find a guy who is making well into 6 figures at any major low-level software house: Microsoft, Intel, Apple, IBM, Atmel, Via, Nvidia, etc. Heck, I think Gnome is programmed in ‘C’.
Can I “Third” this?
x86 Assembly (or any kind of assembly) is on the way out as programs become more complex.
C is actually seeing a resurgence now in the embedded space. As low-level programming gets more complex, people are phasing out Assembly and moving on to C.
Granted that C is losing ground on high level programming, it’s actually gaining ground in the embedded space.
According to Google, C is as popular as Java, and both are more popular than C++ and .NET:
http://www.google.com/trends?q=Java%2C.NET%2CC%2B%2…
Well, C was designed as a higher-level abstract assembly language in the first place, so it’s no surprise that wherever non-portable assembly can be replaced with highly portable code, it’s done…
Or, you’ll find a guy who is making well into 6 figures at any major low-level software house: Microsoft, Intel, Apple, IBM, Atmel, Via, Nvidia, etc. Heck, I think Gnome is programmed in ‘C’.
You know what I’m going to say…
C/C++ is almost dead in the business world. I started out as a Visual C++ MFC developer. My skills were in great demand. Now, everyone (I am a consultant) wants VB.NET, C# or Java. There are very few jobs that want anything else. I realize there are markets still for C/C++, just as there are for Fortran, etc. Obviously, certain areas such as System Programming, Drivers, Embedded, etc. will be using C/C++ for quite a while. However, if you are viewing this from a business perspective, which does employ a huge percentage of all programmers, the shift has already happened.
So, it depends on your view. If you are talking numbers, the shift happened a few years ago. However, in the “trenches” of OS development, driver development, etc., the shift hasn’t happened, and may not happen for many years (if ever). However, the number of programmers who are going to be working on kernel development or embedded systems programming is always going to be a small percentage.
If I were counseling students, I would steer most of them towards .NET, Java, or one of the scripting languages (Ruby/Perl/Python/PHP). This would make them marketable for the most jobs. I believe only the most gifted developers are going to be able to make a living in the future C/C++ world.
C/C++ are by no means dead in the business world.
What on earth do you think those heavyweight database servers, message queue engines, and so on are all written in? Java?
Higher-level languages are fine for the middle tier, and even higher-level languages for front-end stuff, but C/C++ make it all tick.
In any case, a good C/C++ programmer can learn new languages pretty swiftly (really, a good programmer can learn new languages pretty easily, whether they know C and C++ or not)
Edited 2007-05-24 21:43
What on earth do you think those heavyweight database servers, message queue engines, and so on are all written in? Java?
Yes.
I’m not talking about the ISVs. Most large applications are written in C/C++, business or otherwise. I’m talking about business programming. In any given large city, a huge number of programmers are employed making medium to large businesses go. They are creating desktop, web, and MS Office apps that allow business, government, and the military to perform their daily functions. I believe they are a majority of all programmers employed. They used to code primarily in VB6 and Visual C++. They now code in VB.NET, C# and Java. In addition, a huge number of web developers use one of the scripting languages – PHP, Perl, Ruby, Python. In fact, web programming in general (not web servers like Apache) is the fastest-growing genre of programming out there, and it has all but “forgotten” C/C++. This is the gigantic group that no longer considers C/C++ relevant (for the most part, of course).
I’m talking numbers, not importance. I would argue that almost all of the _important_ applications, such as Operating Systems, Database Servers, Drivers, etc are written in C/C++. However, in terms of quantity, the number of C/C++ jobs is shrinking.
Most large applications are written in C/C++, business or otherwise. I’m talking about business programming.
That’s simply not true. Most large business applications are written in higher-level languages.
It’ll be a while before games are written in managed code.
Does the Xbox 360 run unmanaged code?
Most games for the 360 are written using unmanaged C++.
There are actually a lot of games written in managed code. It just comes down to using the right tool for the job. Sometimes it makes sense. Sometimes it doesn’t. Many times the developer makes the wrong choice and performance suffers.
I would agree that this is one very popular and growing area that might make a student want to consider majoring in C/C++ development. Game development is growing. However, the same principles that have led to the growth of RAD languages in the business world will probably lead to the rise of RAD languages in the gaming world. Game engines will continue to be coded in lower-level languages for performance, but I believe the average game developer will probably be coding in a higher-level language such as .NET or Java. If I were developing a DirectX 10 game for Windows and Xbox 360, I would certainly consider using Microsoft’s managed code for DirectX 10, the XNA Framework. Even Java has improved by orders of magnitude in this area, and it is now possible to create performant 3D multiplayer games in Java.
However, the same principles that have led to the growth of RAD languages in the business world will probably lead to the rise of RAD languages in the gaming world.
It’s already happening. http://www.st.cs.uni-sb.de/edu/seminare/2005/advanced-fp/docs/sween…
According to Sweeney, half of Unreal is already scripting code. A lot of the performance-critical code is shader code, run on the GPU. And what he wants for the engine code basically looks a lot like ML.
Not to mention that, with .NET at least, a sizable portion of the code backing the framework is written in unmanaged code (C++).
Same for Civilization IV, which exposes a python API to make it easy for modders to customize the look & rules of the game.
http://civilization4.net/files/modding/PythonAPI/
Whoa, interesting doc. This guy is reinventing Ada.
The units bit makes you think of Ada, but the keyword you missed was “functional”.
A developer at Red Storm Rising told me they originally coded one of their popular Tom Clancy games in Java so as to make porting it to Mac easier, and it only got recompiled for x86 because management felt OS X was a poor market to enter (NOT because of performance issues).
Bad reasoning.
If there are no performance issues, then why migrate? Even if they eventually wouldn’t enter the OS X market, wouldn’t it be good to always have that option?
In game development, C/C++ is still very popular.
In game development, C/C++ is still very popular.
I agree… I have a friend working at EA who is actually doing Assembly hacks – just to squeeze the last bit of performance for the last few extras frames per second. They already had pretty good, clean C code..
If games were written in Java… we would probably be getting < 30 FPS even on the most current hardware.
If games were written in Java… we would probably be getting < 30 FPS even on the most current hardware.
Not “Java is slow” once again, is it? Have a look at Jake or something, for heaven’s sake: http://bytonic.de/html/benchmarks.html.
But really, modern games could be written in managed code without any problems. In java, for instance, the speed is not a real problem; the memory is (the VM). But looking at the memory requirements of late games, it wouldn’t make a difference, really.
Edited 2007-05-25 07:27
Now, everyone (I am a consultant) wants VB.NET, C# or Java.
That’s not a sign that C/C++ is dying. That’s a sign that most of the software industry is off its rails and headed for disaster. The PHBs don’t know what they want, and the analysts don’t know what they’re talking about. Sadly, many programmers don’t know what they’re doing, and many that do are more than willing to keep their mouths shut and do what they’re told.
Most of what hiring managers know about software development comes from what they read in publications like BusinessWeek. The analysts, the scum that clings to the underside of the software industry, are largely responsible for driving trends. The idiots are in charge, the ignorant deliver the orders, and the spineless do the work.
What do we see when we actually put the developers in charge of setting the culture and managing the projects? We see a lot less use of these kinds of technologies, and a lot more C/C++. We see managed code relegated to some GUI development and system administration applications. We see advanced libraries grow out of these lower-level languages to greatly ease development and portability without sacrificing performance. We see that it’s no harder to develop large codebases in loosely-coupled teams with C/C++ than it is in modern managed languages. We also see no shortage of developers with considerable skill in C/C++, some of whom will work in their spare time for their own enjoyment.
Why the mainstream software industry is passing up the most talented programmers in favor of the buzzword-compliant masses is beyond me. Many of these PHP geniuses get paid better than well-qualified OS developers. If you’re willing to be an Oracle DBA, that’s worth six figures. If you can explain why the volatile attribute doesn’t address cache coherency, you start in the 70s (I apologize for the US-centric currency references).
I believe only the most gifted developers are going to be able to make a living in the future C/C++ world.
This is a very good point. Why doesn’t it also apply to Java programmers? If businesses are willing to hire damn near everybody with the right words on their resume, then they deserve the crap they hire. The thing about skilled labor is that only the best and most skilled should be able to get a job in that area. I can play baseball, but I can’t make a living playing baseball, because I’m not good enough. I know Java, but the guy who interviewed me said that since I had experience in systems programming, I was overqualified… for a higher-paying job.
Screw all those PHBs and analysts. I’d rather work with interesting people doing challenging work than make a little more money to work on an unmitigated disaster with overpaid, useless tools.
Edited 2007-05-24 23:17
I think C/C++ is not that great for making large loosely-coupled applications because there are often too many questions that arise when you pass pointers across module boundaries. You can’t allocate in one place and free in another. And you have to be really careful when making interfaces. You end up with messes like COM (which I like, but I admit that it’s pretty hostile to normal programmers).
Managed languages really DO bring vast productivity enhancements because there is only one allocator, one calling convention, and one runtime model.
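For what it's worth, the allocate-in-one-place, free-in-another problem is conventionally handled in C by hiding the object behind an opaque handle and keeping creation and destruction inside the same module, so both sides of the boundary agree on which allocator owns the memory. A minimal sketch (all names here are illustrative):

```c
#include <stdlib.h>
#include <string.h>

/* Opaque-handle convention: the module that malloc()s the object is the
 * only one that free()s it.  Callers never need to know which allocator
 * (or, on Windows, which CRT) lives behind the module boundary. */
typedef struct buffer {
    char  *data;
    size_t len;
} buffer;

buffer *buffer_create(const char *s) {
    buffer *b = malloc(sizeof *b);
    if (!b) return NULL;
    b->len  = strlen(s);
    b->data = malloc(b->len + 1);
    if (!b->data) { free(b); return NULL; }
    memcpy(b->data, s, b->len + 1);
    return b;
}

size_t buffer_len(const buffer *b) { return b->len; }

/* Destruction stays with the creator: callers hand the handle back. */
void buffer_destroy(buffer *b) {
    if (b) { free(b->data); free(b); }
}
```

This discipline works, but as the post says, it is exactly the kind of convention a managed runtime gives you for free.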
I agree with everything else you said about the high pay and low skill of new “business” programmers. I know a guy who’s making six figures at a financial firm right out of college even though he couldn’t program his way out of a paper bag.
But if you have a company that’s run by engineers for engineers, they’ll probably go the way of DEC (i.e., the way of the dodo). They produced really great stuff (VMS clusters that stay up through rolling hardware replacements, Alpha processors that held the performance crown for 6-7 years), but they couldn’t market and sell well enough. And perhaps they weren’t focused on quite the right things, so they didn’t meet their customers’ perceived needs.
It seems difficult to find a CS undergrad these days with a really sharp mind between their shoulders. And all of the best ones seem to be swallowed by the giant maw that is Google to work on god knows what. Maybe you’re at the wrong company if you feel you’re not being rewarded properly for your low-level experience. Considered applying to Google or (oh the horrors!) Microsoft?
<Prophecy>
“Heaven and Earth will pass away,
but C will continue for ever and ever!”
</Prophecy>
Maybe there will be one million VB programmers for each C programmer; the former will work for $X per hour, but the C programmer will work for $10X per hour…
It is not about quantity, it is about quality!
It is not about how to build walls inside an office in a building… it is about creating the foundations for that building.
Edited 2007-05-24 22:13
C is the most popular language; it’s just not used for the desktop because it is too complex to write a fully featured desktop in pure C. But in embedded and low-level programming, you almost entirely use C, and where you don’t, you use assembly.
C++ is the most widely used language for general purpose apps. So you have to work in a software house for that.
But in non-software companies, where the software is just for internal use, who cares about fast or even well-written software? They have to make it quickly, and VB, C#, and Java come with a nice and easy-to-use toolkit.
Also for web pages, it’s just way easier to write them in PHP or ASP.
Are you kidding? Now guess what almost all Desktop environments, applications and even small extensions in the Linux world are written in? Guess what? C and C++.
On GNOME Mono seems to become more and more popular, but nonetheless the core is written in C.
My personal decision to learn and write C was that, for me, most scripting languages lead to sloppily written programs. Since I prefer rock-solid programs, this was nothing for me. Of course programming in C takes a lot more time, but this time is often spent thinking about what you’re doing – something I missed when writing in scripting languages.
In my opinion the discussion whether or not C/C++ has to be considered dead is senseless.
My personal decision to learn and write C was that, for me, most scripting languages lead to sloppily written programs. Since I prefer rock-solid programs, this was nothing for me. Of course programming in C takes a lot more time, but this time is often spent thinking about what you’re doing – something I missed when writing in scripting languages.
What a phenomenal load of crap.
Here are few things where C is heavily used:
1. Device Drivers
2. OS Kernels
3. Network protocol stack
4. Telecommunication protocol stack
5. Database servers
6. Webservers
7. Office Applications
8. Games
9. Embedded software, firmwares
10. Real-time software
11. Emulators, Virtualization software
12. New languages like C#
13. IDEs like Visual Studio
14. Efficient text editors like VIM
15. Compilers, Assemblers, Debuggers
Hmm… and the list goes on and on. C is here to stay. C++ may or may not, but C will.
Yes, if you want to write a user interface or web application, you won’t use C, but hey, just remember the browser running your script is written in C.
And when you run your .NET program, don’t forget the C# interpreter running underneath is written in C.
Long Live C – The best language.
Edit: Added Compilers etc to the list.
Edited 2007-05-25 09:24
How does that long list of applications for which C is a poor choice of language answer my post? My contention is that the claim that “scripting languages” cannot be used to write robust software is “a load of crap”. The fact that lots of people use C for things they shouldn’t be using C for isn’t a response to this contention.
I should also point out, and this may be a surprise to you, that every single one of those applications has successfully been implemented in a non-C language. There are applications for which I’d still use C, but the operative phrase is “few and far between”.
And when you run your .NET program, don’t forget the C# interpreter running underneath is written in C.
In this respect, at least, C# is a joke of a language. Real men write their language implementations in the language itself. Allegro CL is written in Lisp. OpenDylan is written in Dylan. SML/NJ is written in ML. There is a quote commonly used regarding programming languages: “a language that cannot be used to write its own compiler is beneath contempt”.
Show me one OS, one word processor, and one compiler written in a scripting language that gives the same number of features and performance as one written in C.
Most functional languages have an implementation in themselves which can be quite performant. For example, a large part of the OCaml interpreter/compiler is written in OCaml, with some parts in C of course; there is also FFTW, a library for high-performance FFTs, which produces a C library whose sources are generated by an OCaml program.
Emacs is by far the most feature-rich text editor you can find, and not a lot of it is written in C. And above all, you are citing software which may account for 0.1% of the total lines of code written in the world today.
Of course some things need to be written in C, or even in asm, because, well, that’s its purpose: C is a portable assembler. That’s why it is really valuable. But many people, because they do not know anything other than C/Java, fail to see the purpose of high-level languages. For many projects, the software is slower in C, because it takes so much time to make trivial things work correctly that nobody gets it right.
It also makes the entry barrier really high for newcomers to a project, especially for open source software. For example, I was disappointed by the applications under Linux for learning kanji. I wanted to improve one of them, but the code was really difficult to read: there was a lot of hand-coded list processing, etc… Instead, I started rewriting it in Python. Because the language is much more high-level, I could easily replace the custom search in binary files with a SQL database, and get on-par, even better, performance. Instead of spending time debugging dangling pointers in lists, I can see which structures are the most performant and implement database access caching really easily – things which would be a nightmare in C.
So of course, you could do something much faster in C if you wanted, but nobody will ever spend the resources on such a niche application. People spend time writing OSes and drivers in C because it is really valuable to spend a lot of time on them, and because it is the appropriate language.
Edited 2007-05-26 04:17
I agree!
GNOME, Xfce and a lot of UNIX desktops and applications are written using GTK+, a C widget library.
KDE and all its stuff is written using Qt, a C++ widget library.
OK, people are happy using System.Windows.Forms, saying that C is dead, but that library is a .NET wrapper built on top of the Win32 API, which is written in C.
“Of course programming in C takes a lot more time, but this time is often spent thinking about what you’re doing – something I missed when writing in scripting languages.”
Pitch that to your project stakeholders and see how quickly you’ll be dismissed. You simply cannot honestly believe what you’re saying… or you’ve had too much Kool-Aid.
Are you replying to my post or what?!? That’s exactly what I said!
Actually, even in the embedded world there is a lot of software written in higher-level languages (namely Java). The OS and lower layers by all means still make use of C and assembly, but on embedded devices as complex as phones or larger, there are lots of places for Java. As an embedded developer I still highly value my C skills, though: programmers that know C (as well as other higher-level languages) are definitely better off than those who’ve been using garbage-collected languages in all their programming experience.
Very often, it’s more due to a poor/old API/toolkit design than anything else, whatever the API language bindings. Look at the Windows C API: Win32 is 15-20 years old now, I guess, and was designed on ’80s API paradigms, when “more is better” was the hype in the API landscape.
Writing in C for Windows is hard, no doubt. Stdio-like features were not even available under Win32 for years. In comparison, programming in C for any POSIX target is quite a bit easier. And programming in C/C++ for Qt, KDE, Syllable, or BeOS/Haiku, or even in Objective-C for Mac OS X, is far easier, for example.
Unfortunately, Microsoft’s MFC C++ API was not that good, just wrapping the lower-level C API. It’s no surprise people developing on Windows walked away from the C API as soon as they could: VB, then C#/.NET or Java.
But you didn’t see the same with Unix programmers. Maybe the API’s lack of quality, and the fact that only an old-designed C one was available until the recent .NET, forced people to use higher-level programming languages. The CPU cycles and memory footprints are not happy about that, but the Wintel cartel is.
This being said, nobody should think their skills in one single language are enough these days. Learn new languages as needed, as your current language skills will die one day. And today the average lifetime of a programming language seems shorter than before.
That’s the reason I walked away from Windows programming and eventually Windows itself. I figured what good is it programming using a “high-level” framework when you still have to do a lot of low level stuff that seems irrelevant to what you want to accomplish.
I still have “Windows 95 Programming” and “Advanced Windows Programming” that cover pure C programming for Windows but they’re shelved and I never look into them anymore.
Maybe complete ports of open source desktop environments and applications to Windows could rekindle some enthusiasm from C programmers for Windows otherwise it seems pretty dead to me.
UNIX system and application programming seems to be a breath of fresh air compared to this, because of its clean, modern C/C++ toolkits that perform very well and are continuously improving (at least until user space doesn’t suck anymore, according to Dave Jones from Red Hat).
“…but try to find a basic C-only programmer today, and you’ll likely find a guy that’s unemployed and/or training for a new skill…”
How many programmers does this guy think are one-trick ponies? I’d like to see a programmer that has ever been employed only knowing one language in the last ten years at least.
C is not dead, programmers are just expected to be more diverse in their skills set. That is not a bad thing, and does not imply that any particular language is ‘dead’.
2. Nonrelational DBMS!? And what exactly is XML then? This is CS101, dude.
Edited 2007-05-24 23:08
XML isn’t exactly a DBMS.
“And what exactly is XML then?”
It sure isn’t a DBMS, relational or otherwise.
“This is CS101 dude.”
You failed.
“And what exactly is XML then?”
XML can be used to express relational data just fine actually. There are numerous open source web apps that use XML as the sole backend, and they perform just fine.
Sure. But XML provides the datastore here, not the relational “engine” – like raw disk partitions or raw files did/do for classical DBMSes like Oracle, Sybase, Informix, and so on.
The rotten thing is the stagnation.
I’m of the generation that was brought up (misled?) to believe that 4GLs would be replaced by 5GLs and only the drivers and low-level OS would be written in C or assembler.
We’re still stuck in antiquity. C, C++, Java (essentially C), LISP, BASIC, SQL. All old languages.
The closest thing to where I was promised we should be is Macromedia Authorware.
Anyone else feel cheated?
Where’s my jet-pack and flying car!?
We’re still stuck in antiquity. C, C++, Java (essentially C), LISP, BASIC, SQL. All old languages.
One minor nit: Java is *not* “essentially C”. It’s closer to a stripped-down C++.
We’re still stuck in antiquity. C, C++, Java (essentially C), LISP, BASIC, SQL. All old languages.
Oh please. First of all, I’m not sure what Java is doing in the same “antiquity” pot as LISP. The difference between IPL (1959) and Java (1995) is, uhm, huge.
Second, based on the amount of programs written in Python, C#, or even Ruby/JavaScript/PHP we are hardly stuck with old languages.
You probably meant to respond to the parent post, right?
Why should I feel cheated? Actually I’m quite happy things are the way they are. In the last few decades we have come a long way. Having just done an initial port of a FORTRAN 77 program (this program is STILL actively being developed in FORTRAN 77!!!) to C++, I can say that C/C++ with modern programming techniques (e.g. use of the STL in C++) is really not so bad…
And by the way, many people working in the field of computational science are still active FORTRAN users. Often professors RECOMMEND that students just start learning and using FORTRAN!
I think C is actually prospering, and becoming more of a breakthrough language! I started programming in C++, did some Java, and have now dropped both completely…
Business? Servers? AIX, Solaris? HP-UX? ALL WRITTEN IN C.
Linux Kernel? Prospering more and more. WRITTEN IN C.
TCP/IP and other protocol Stacks? C
Drivers? C
Embedded Software? C
Sorry, you can’t live without these things.
What people are overlooking is that with C you can do 99% of portable machine operations in an *exactly expected* manner. Yet it’s not ugly like Perl or bash scripting crap.
Except for very basic things, like expecting the sizes of various types to stay the same… Evidently you’ve not tried porting C code to other architectures very often; there’s a lot that the spec leaves to be defined by the implementation.
“AIX, Solaris? HP-UX? ALL WRITTEN IN C.
Linux Kernel? Prospering more and more. WRITTEN IN C.
TCP/IP and other protocol Stacks? C
Drivers? C
Embedded Software? C ”
That’s all fine and good; however, all of that software makes up < 1% of the total number of lines of code out in the wild.
I personally think being able to read and debug C/C++ is essential to any programmer’s toolbelt, but knowing how to write it is no longer a prerequisite to being a professional programmer. I can’t write either, and have been a pro dev for 7 years now.
“AIX, Solaris? HP-UX? ALL WRITTEN IN C.
Linux Kernel? Prospering more and more. WRITTEN IN C.
TCP/IP and other protocol Stacks? C
Drivers? C
Embedded Software? C ”
That’s all fine and good; however, all of that software makes up < 1% of the total number of lines of code out in the wild.
No, not really, the balance is actually the other way round, and that’s because virtually *everything* that has a CPU in it runs something written in C (or ASM for that matter).
I cannot say what the proportion really is [edit: let’s maybe go for 70%-30%?], but saying it’s less than 1% of the code in the wild is plainly wrong. With architectures getting more and more complex even in embedded systems, it’s safe to assume that anything high-level runs on something low-level and coded in C. [Edit:] It’s simply that there are far fewer systems that run application code than systems that run low-level/embedded/system code.
“No, not really, the balance is actually the other way round,”
No it isn’t; he was pretty much right (ok, maybe it’s bigger than 1%, but still). There is much more code that *runs* on Windows than there is Windows code for programs to run on. The same goes for all OSes. Heck, OO.o probably has more code than most operating systems do.
That’s not to say all those languages are dead, far from it, but they’re not as important or widely used as they once were.
No it isn’t; he was pretty much right (ok, maybe it’s bigger than 1%, but still). There is much more code that *runs* on Windows than there is Windows code for programs to run on. The same goes for all OSes. Heck, OO.o probably has more code than most operating systems do.
If we are referring strictly to desktop platforms, then yes, C is obviously somewhere at the low end of the balance.
But in the broad spectrum of programming, desktops are really only part of the whole story, and both Linux and Windows share comparable numbers of users with ITRON or FreeRTOS. There are simply many, many more embedded systems than desktop computers, and almost all of them run C code.
Indeed. What should always be a prerequisite to being a professional programmer is being skilled in *programming* – algorithms, design patterns, etc. – not being skilled in particular programming *languages*.
Unfortunately, too often the recruiters are not skilled enough themselves to evaluate candidates’ skills in programming. They then resort to evaluating candidates’ skills in programming languages. Way easier. About as misleading as evaluating an author’s skill with a spell-checking test, but hey, that’s business…
“Indeed. What should always be a prerequisite to being a professional programmer is being skilled in *programming* – algorithms, design patterns, etc. – not being skilled in particular programming *languages*.”
I completely agree with this. Some diploma students leaving the university’s CS department even think programming == implementing, but as you surely know, that’s nonsense. GUI on, brain off. Clicking around in “FrontPage” is programming for them…
Skills that a programmer should offer:
– ability to educate himself
– ability to learn principles and concepts
– ability to differentiate between an algorithm and a program
– ability to turn algorithms into program code
– knowledge about the right tools for the respective task
– … you may continue …
And finally, he shall not follow the NULL pointer, for we all know that chaos and madness await thee at its end.
“Unfortunately, too often the recruiters are not skilled enough themselves to evaluate candidates’ skills in programming. They then resort to evaluating candidates’ skills in programming languages. Way easier.”
Of course, just count the number of languages listed in the application letter, and if something “cool” is on the list, great!
“About as misleading as evaluating an author’s skill with a spell-checking test, but hey, that’s business…”
Maybe counting the number of words would make the article’s quality measurable…
As it has been mentioned before:
1. It’s about quality, not quantity.
2. Good programmers write good code in any language, bad programmers do the opposite.
3. C is not dead, it’s alive. ALIVE!!!
Being able to click around and “draw” dialog boxes is no clue to qualification. But I think it says a lot about someone who can understand and explain (!) something like void (*signal(int sig, void (*func)(int)))(int); okay, admittedly this is not very hard…
And: No, kids, HTML is not a programming language.
ColdFusion runs MySpace, and I know a couple of organisations that are currently developing new applications in ColdFusion.
How is that dead?
Although, I would definitely like ColdFusion to be dead.
ColdFusion only runs parts of MySpace now; .Net does most of it, I thought…
ColdFusion is not dead or dying. The product is growing with more highly advanced features than ever. Just because Joe Blow FOSS doesn’t use it AND it’s NOT a MS product does not mean it’s dying.
Trust me… I’ve seen just as much CRAP code written in PHP as I have in CFML. Even more so in free scripts.
“ColdFusion only runs parts of MySpace now; .Net does most of it, I thought…”
.Net does all of it actually. The MySpace folks went more in depth during the Mix ’06 keynote: http://sessions.mix06.com/ (extremely interesting stuff from the MySpace devs…hate the site all you want, but the sheer amount of traffic they generate is amazing).
They ditched CF years ago and moved to .Net, but had to keep the existing URL structure.
Claiming C is a “dead” language is simply insane. First of all, all technical interviews I cared about asked me my technical questions in C. These were for jobs that I’d be coding in Java or C# or Python, but it’s just that C is such a large part of programmer mentality/culture.
Almost every computer scientist, trained now or in the past two decades, has had to learn C. It is our lingua franca.
Frankly, I agree with the interviewers. If I meet a programmer who doesn’t know C, I’ll personally judge him and think he isn’t a serious programmer. Without knowing C, how will you understand how your Java VM or Python VM works?
(I’d even argue serious programmers should know a bit about assembly, but my standards are high — _at least_ some C!)
Add to that the following facts:
1. most UNIX utilities are written in C.
2. almost all operating systems are written in C.
3. compilers tend to be written in C.
4. low-level libraries that interact with an OS are usually written in C.
5. drivers are almost always written in C.
6. many performance-sensitive applications are written in C/C++.
How about embedded development? Come on!
Finally, considering C++ is pretty much a superset of C, you need to know C to know C++. From the largest financial companies to pure software companies like Microsoft and Adobe, C++ still plays a huge role.
Complete rubbish, that choice.
Actually, the whole article is rubbish.
I pretty much get everything done with a combination of C, Python and/or Ruby and tie it all together with POSIX sh. I should probably throw Make in there too.
…PC network administrator?
Are those different from network administrators? Do PC networks all of a sudden not need administration and maintenance?
“[WTF is a…] …PC network administrator?”
Maybe it is a reference to Joe Q. Sixpack who had just bought a new PC and is trying to connect it to his DSL box? He has to administrate his little network, meaning, he will install file sharing apps and some spyware.
“Are those different from network administrators? Do PC networks all of a sudden not need administration and maintenance?”
The terminology used is indeed confusing… Administration usually refers to tasks not related to solving actual problems but to keeping the system running (e.g. OS updates, application installs, configuration), while maintenance refers to tasks done to hardware (e.g. installation of new disks, replacement of a GPU).
Come on, Eugenia. All I can say is “Long live ASM”…
Granted, it’s a dead-end NOS, but most of the CNE courses went beyond just straight NetWare. The courses went heavy into eDirectory design and the various applications which ran on NetWare. Many of those apps, and eDirectory itself, all run on Linux, so the knowledge is still relevant.
I mean, how much do NetWare admins actually dig into the OS itself? Maybe if you’re nutty about oplocks or memory settings, but even those are out of the CNE course scope.
” My Take: “C” dying should have been “x86 Assembly”.
Hope the buffer overflows, and ditto the shellcode, go with it
/*6. C programming
C languages are also becoming less relevant */
I thought all of the operating system kernels, like Linux, were programmed in C.
I don’t understand why the hell C is on the list. Is he talking about lame business programs only?
Just to add to the author’s knowledge, C is still the most widely coded programming language:
1. Linux kernel: pure C code plus some assembly.
2. GTK: popular LGPL’d GUI development toolkit, still natively coded in C.
3. Most device drivers are programmed in C.
4. C is still the only language for systems development.
5. Recent news: Symbian, the world’s most widely used cell phone OS, is now offering the Open C platform.
The author needs a knowledge brush-up.
A good programmer is a good programmer. If you are good, you write good code in C, Java and PHP. If you are not a good programmer, you write bad code in C, Java and PHP.
As far as C, COBOL and other such languages going away, how many of you work at a major bank?! I did, and I assure you C, COBOL and BASIC (yeah, believe it or not, BASIC) were the languages that we developed in.
When I was in systems development, it was Assembly language, C, Ada and Bliss, but admittedly this was aeons ago.
Today I am in web development (who isn’t I guess these days) and we use Java (too much). At home I use Ruby and/or Python for the most part, simply because they have a gazillion easy-to-use libraries. But I suppose these days C does as well…
Ah well, I would love to find a nice Ada programming job that would pay my bills. That would be fun!
I had a good laugh when I read the article.
I work with C at a company with 5k+ employees where C is the main language. C is not dead, and it won’t be for a long time.
The article was probably written by one of those .Net coders who think .Net is gonna change the world. Maybe he forgot VS is written in C/C++ =)
“Maybe he forgot VS is written in C/C++”
VS is written in a mixture of C++ and .Net actually, and has full managed bindings to the automation API. Regardless, what does VS being written in *insert language here* have to do with writing .Net code?
Furthermore, if you’re a Windows developer, then .Net has indeed changed the world. For the better. I haven’t written a single line of VB6 or unmanaged C++ in years, and hope to never have to again.
> Furthermore, if you’re a Windows developer, then
> .Net has indeed changed the world. For the better.
> I haven’t written a single line of VB6 or unmanaged
> C++ in years, and hope to never have to again.
Which is fine, as long as you do not have to write applications to run on older hardware, and do not care about cross platform capability at all.
When it comes to older hardware, .NET has all the problems that Java has. It uses too much memory, it loads classes dynamically at runtime when they are needed (which is slow in general), etc.
When it comes to cross platform capability, Java clearly kicks C#’s ass.
So basically, for me, I use Java when absolute speed / performance doesn’t matter, or when I don’t have to worry about supporting older hardware. Otherwise I use C. I have no use for C# (Windows only for all practical purposes, since mono will never be fully compatible). I also have no use for C++, since it doesn’t offer me anything that C or Java doesn’t. C++ is basically an example of how NOT to write an OOP language.
That’s a good response save for one very important thing: None of the stipulations you mentioned matter in the realm of IT, mainly because it’s a controlled environment. The choice in IT usually comes down to “what platform do we want this application to run on.” If it’s Windows, .Net is the obvious choice, with an occasional need for unmanaged C++. All other platforms usually get the Java nod.
“C++ is basically an example of how NOT to write an OOP language.”
Modding you UP for this statement… I wish more folks thought like you do. Most of the C++ devs I know who moved on to Java/C# said the hardest part was unlearning the pseudo-OOP principles they “learned” in C++, and learning not to fight the compiler ;-)
“None of the stipulations you mentioned matter in the realm of IT, mainly because it’s a controlled environment.”
I think you mean corporate IT and not IT in general. When it comes to corporate/internal systems you do have that control, but you do not necessarily have it outside that problem domain.
In a business you write software for the platforms you have, not for everything under the sun. Therefore, if you are a Windows shop, or a mixed Windows/Linux shop, C# or Java will do just fine. That’s what systems analysts do: design apps for YOUR systems.
For the “better” because the previous Windows API sucked big time. Just because the C or C++ API was bad on one platform doesn’t mean it’s the same on all other platforms, or that the language was/is the only one guilty.
People do build great operating systems and APIs [embedded or not] in C/C++, making developers’ lives easier. Microsoft just didn’t with Win32. Maybe they didn’t/don’t care that much, because it’s better for them to control everything, API *and* language *and* development tool suite, and the hardware manufacturers are always happy to have a selling point for their more powerful CPUs and bigger memory chips…
Listen, you don’t have to care too much about this type of article. No, really. Do you seriously think that any type of “skill” will ever become “obsolete”? You don’t need to “forget” anything; instead you “add” something to what you already know. Any employer who thinks that having some “added” knowledge is BAD is simply not worth it. No matter the salary.
And Assembly is far from dead. x86 assembly has faded a bit, yes. But next time you want to configure your router, think of what makes it possible. It’s really a matter of which sector of the industry you choose: I myself work as an Oracle developer, but I can assure you that having had some experience with VAX and IM sure came in handy. As did my C/C++ knowledge.
…my ideas when reading this “article”: stupid, stupid, stupid.
Yeesh.
Of course they are. Now, where’s that web server written in PHP gone…
I would also have to agree with the many who said C should NOT be on this list.
Even when it comes to desktop applications, C has been given a new lease on life thanks to GNOME and GTK+, which, in case the authors of this article forgot, is now the default desktop on virtually every commercial UNIX distribution, as well as on many major Linux distributions.
.. (or dying).
Why are people compelled to declare stuff dead? It is like a freaking disease for people who “can’t do.”
It is going up. Just last year it overtook C++ as the most used language on, I think, SourceForge.
Are games written in C?
C & C++?
3D books still use C code.
85 out of 100 programmers write software in the “business” league. Of those, 40 are using Java; the rest are using some other language with strong RAD support, something like .NET or Delphi. By the time VB6 was introduced, Visual Basic outsold Visual C++ by a factor of 4.1. C/C++ was pushed out of this scene in the mid-90s. This doesn’t make C++ bad or obsolete, but it does make it a rarer, less-needed skill than it once was.
Software is still written in languages like COBOL, PL/1 or FORTRAN. Those technologies are alive and kicking, and they will be used for many decades to come. But in comparison with something like Java or .Net, COBOL skill is a rare thing.
This article was about skills in demand. Not about clash of technologies. Currently Java skill is “more hot” than C++ skill.
Word. People keep mentioning device drivers and operating systems and stuff. What percentage of programmers out there write that sort of stuff? A very low number I would think. Most professional paid programmers out there are writing apps that help businesses do what they need to do to keep running.
So sure, the database might be written in C, but how many programmers are there out there working on crappy GUI apps to run from that database versus those working on the database itself?