It’s hard to explain to regular people how much technology barely works, how much the infrastructure of our lives is held together by the IT equivalent of baling wire.
Software sucks. It really, really sucks. I have yet to meet a piece of software that didn’t make me go “…eh.” several times per hour – whether it be a videogame, a browser, or an operating system.
…now that is a lot of bitching.
What a waste of time reading… I just stopped halfway through.
Been doing this stuff for about 20+ years now, and I agree.
Most of the computing infrastructure is broken.
The only reason I haven't quit the field long ago is that even with all of the broken software, open source software, mainly LINUX, allows me to look the broken stuff right in the face and actually have a chance in hell of correcting it.
If I had had to spend the last 20 years being an OK-or-CANCEL monkey in a Microsoft-centric industry, I would have quit long ago.
OK or CANCEL? I could be wrong, but I think the options you are looking for are: “Abort, Retry, Ignore, Fail?”
So in your 20+ years, what have you actually fixed? I’m curious because most people I observe who make similar comments don’t have any track record of fixes & merged patches to the linux kernel, subsystems, drivers, or anything else.
Btw, implying that people who use Microsoft software are just OK/Cancel clicking monkeys is pretty pathetic.
Plenty of stuff.
Most recently, on my P570W when I first purchased it, I had to turn off the Intel sound support channel-selector code because it didn't work and blew the machine up whenever an attempt was made to play sound.
I didn't fix it, but because I had the source code I was able to turn the code off (a simple /* */ comment around the code block causing the problem did the trick, and I used a different USB sound source until it could be fixed later).
I then posted the code block I had commented out to bugzilla.redhat.com so the developers could look at it.
Eventually it was fixed.
But many things happen when you have source code. The customer has a chance to participate in the engineering process to improve the product.
Now, many of you probably do not, and the typical LINUX admin I see has no clue about source code. However, it doesn't take a lot of engineers to make a difference… in this case, just one who has access to the source code and can identify the code block that is causing problems and comment it out.
Notice, though, that I helped the guys at Red Hat isolate the problem and saved them time debugging it themselves, since I had already done that work in my bug report, where I pointed out which source file, code block, and lines were involved and what values certain variables had when causing the problem.
The whole point, though, is that not everyone has the skill set to look at operating system driver code and debug it. Many LINUX admins don't care. But when things don't work right and you do have the source code, it doesn't take an army of people to fix it, just the possibility that the problem can be identified and pointed out.
By little ole me.
The way proprietary operating systems work, this sort of process can’t happen.
It is a BIG reason why I even bother with computers, because the day I can't have an open source/hardware combination to build solutions with for my customers is the day I throw in the towel and open an Indian restaurant chain.
I also agree with the article.
Here, I disagree with you: a chance to correct it???
Really, to actually fix things we would have to change the languages they have been written in!
For example, take C: I love C, but its default behaviour (performance above everything else) only made sense when CPU performance was a big deal (computers were much, much less powerful than a current low-end smartphone) and cracking wasn't an issue. Now it is the complete opposite: everything is connected, so cracking is a big issue, but CPU performance isn't such a big deal.
So how do you plan to correct this?
Fix every bug there is in the huge amount of C software? Good luck!
I think what is needed is a change of compilers to “harden” C (a sketch of what I mean follows below) and, eventually, a change of language to replace what we have. But Ada has had a free compiler for a long time and we're still using C/C++, not Ada, so I'm not especially optimistic that the situation will improve…
Rust *sounds good* until you actually try to use it:
http://www.reddit.com/r/rust/comments/269t6i/cxx2rust_the_pains_of_…
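To make the “harden C” point a bit more concrete: in practice it mostly means turning on checks the toolchain already offers. A minimal sketch (the flags are the usual GCC/Clang ones; the file and code are just made up for the example):

/* hardening_demo.c - a sketch of what "hardening" existing C can buy you.
 * Plain build:     gcc -O2 hardening_demo.c
 * Hardened build:  gcc -O2 -D_FORTIFY_SOURCE=2 -fstack-protector-strong \
 *                      -fPIE -pie -Wl,-z,relro,-z,now hardening_demo.c
 * The hardened build turns the silent overflow below into an immediate abort
 * instead of potentially exploitable memory corruption.
 */
#include <stdio.h>
#include <string.h>

int main(int argc, char **argv)
{
    char buf[16] = "";

    /* Classic bug: attacker-controlled length copied into a fixed-size buffer.
     * With _FORTIFY_SOURCE, glibc routes this through a checked strcpy that
     * aborts at run time when the source is too long. */
    if (argc > 1)
        strcpy(buf, argv[1]);

    printf("%s\n", buf);
    return 0;
}

None of that fixes the bugs, of course; it just makes them fail loudly instead of silently, which is about the best a compiler can retrofit onto existing C.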
To be fair to C: it is not only about performance.
The C base dialect is very minimalistic, which makes crafting a bootstrap compiler for a new platform very easy. This is a really important part of OS development, because it enables portability.
Also, using C in the early stages of OS development frees the developer from using another, even more harmful language: assembly.
So we have a positive feedback loop in place: the OS is in C, the base API is in C, so it is all too natural that the base system software and userland be in C too.
Crafting an OS from nothing with a safe language is hard. All the efforts I know of failed because compiler complexity hindered portability or made hardware abstraction cumbersome. And sometimes the hardware itself is buggy or badly designed.
We have the option of restricting all userland software development to a managed language; this is the approach Android takes. But crafting an OS from scratch this way is a huge effort that only big companies can afford: it requires crafting the kernel, the bootstrap compiler, the drivers, the new language and the whole high-level API for it, and finally the userland in this new language.
Best of both worlds (generates efficient C) and then some: http://www.ats-lang.org/
Yeah, not that he is wrong, but he needs to take his medicinal alcohol.
And more seriously, somebody should point out to him that the same thing applies to everything not software related. Laws are passed without even being tested, and often after being disproven logically. Chemicals are added to our food as soon as we see one benefit, even though they may interact poorly with other chemicals. Cars break down as soon as one mechanical failure happens (which most software wouldn't). It would be funny to see his world shatter even more. Everything is broken, but somehow things still work; living in an imperfect world is living.
So, if someone tells you about all the terrible things in this world (say, famine, or war, or torture, or terrorist attacks), you also call that bitching?
Fair warning: I haven’t read “The Medium” piece, I am just responding to Thom’s statement
Really? Because I, and most of the people I know, get up, start working, and just get stuff done. There is a mildly annoying Chrome issue that means I have to close and restart it every few days, but that doesn't mean it sucks.
The vast majority of the hardware and software I use does its job, does it well, and doesn’t really cause any problems.
I genuinely don’t mean this as an attack, but your comment about everything sucking sounds more like a personal outlook issue than a real world problem.
I like what the “software that sucks less” people are trying to do, but surf, for example, depends on something as huge and complex as WebKitGTK.
Mottos in the computer industry:
– “If it compiles, it works…”
– “It’s good enough, let’s release it, we’ll correct the bugs in the updates”
…
Kochise
I've developed commercial software for almost two years and I can say for sure it sucks. Not that we have much to worry about, but I always wonder: if car/airplane mechanics worked the way we do, we'd be doomed.
Okay, I’ll tell you THE SECRET.
Safety-critical electronics and software are made of two things:
– Redundancy
– Dissemblance
– Redundancy means that when things fail, some parts are still functioning (there's a toy sketch of this below).
– Dissemblance ensures that not everything fails at the same time as the consequence of a single cause.
Probabilistic reliability analysis is what makes it mostly work.
But it sucks almost as much as the rest…
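To make the redundancy half of that concrete, here is a toy sketch (obviously nothing like real avionics code; all names are invented) of the classic 2-out-of-3 voter that sits behind a lot of these designs:

/* Toy 2-out-of-3 majority voter: three redundant (ideally dissimilar)
 * channels compute the same value, and the system acts on the majority,
 * so a single failed channel cannot steer the output on its own. */
#include <stdio.h>

static int vote(int a, int b, int c, int *disagreement)
{
    *disagreement = !(a == b && b == c);   /* flag any mismatch for maintenance */
    if (a == b || a == c)
        return a;                          /* a agrees with at least one other channel */
    return b;                              /* else b and c agree -- or all three differ,
                                              which a real system would treat as a fault */
}

/* Dummy channels standing in for three dissimilar implementations. */
static int channel_a(void) { return 42; }
static int channel_b(void) { return 42; }
static int channel_c(void) { return 41; }  /* pretend this one has failed */

int main(void)
{
    int disagree = 0;
    int result = vote(channel_a(), channel_b(), channel_c(), &disagree);

    printf("voted result = %d%s\n", result,
           disagree ? " (channel disagreement logged)" : "");
    return 0;
}

The dissemblance part is the harder one: the channels have to be different enough (different hardware, different code, sometimes different teams) that one cause can't take them all out at once.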
That is why we have Pascal and ADA to solve some of our problems with programming languages. Software problems start with which language we are using.
And it was ADA that crashed the Ariane 5 because of an integer overflow. OK, maybe it was incorrect use of ADA, but ADA did not prevent the programmers from using it in that wrong way.
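For the curious, the class of bug was an out-of-range value being converted into a 16-bit integer with nothing forcing a range check. In C the same trap looks roughly like this (purely illustrative, nothing like the actual Ariane code; the variable name is just borrowed loosely from the accident report):

/* Illustration of the overflow class only, not the real flight software:
 * a value that always stayed small on the old platform no longer fits the
 * narrow type it is converted into on the new one. */
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    double horizontal_bias = 52000.0;  /* harmless on the old trajectory, not on the new one */
    long    raw    = (long)horizontal_bias;
    int16_t packed = (int16_t)raw;     /* 52000 does not fit in 16 bits: the result is
                                          implementation-defined (typically it just wraps),
                                          and nothing in the language forces a range check */

    printf("original = %.1f, after conversion = %d\n", horizontal_bias, packed);
    return 0;
}

Whether the language, the reuse decision, or the missing check deserves the blame is exactly what the rest of this sub-thread argues about.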
It was most definitely a failure to understand ADA. ADA has one of the most sophisticated type systems around. It was the failure of the programmers to make use of it that resulted in the problem not being detected.
No tool prevents problems if the user doesn’t know how to use it. There are no magic bullets.
Everything can be screwed up, but that doesn’t mean that everything sucks (to link back to the premise)
It was not the language; it was the reuse of some of Ariane 4's weight calibration tables, as-is, that sent Ariane 5's embedded computers completely berserk, causing successive computation overflows in the redundant flight systems, one after the other. Until the final back flip.
ESA, back flipping space shuttles before it was cool…
BTW, I’ve told you to avoid using my account.
Kochise
The Ariane 5 failure is interesting from a software developer's perspective, and it happened during a test flight. Since then the rocket has shown itself to be very reliable.
The code was NOT at fault; it was hand-crafted with love by professional developers and carefully peer reviewed to pass all checks.
This gem of code was just fed bad values and acted accordingly. Test flight or not.
Kochise
I disagree with you here: no type system can tell you what the maximum acceleration you'll encounter will be…
Yeah, even this site is a testament to software suckiness: just check the captcha when you submit a story; it has been “LARU” for how many years now?
At the risk of sounding jingoistic, the VERY first thing I would do if I ever set up a website or public-facing server is to blacklist EVERY IP address from China and Russia (and probably Nigeria as well) that I could. I'm sure that wouldn't solve all of my security problems, but I bet it would put a big dent in them, and I might get a lot less spam in the process.
Yes, blocking 20%+ of the world's population will definitely reduce the number of potential attackers.
The problem with your statement is that if you lose 20% of the readers out there but block 80% of the internet-borne problems, you still come out ahead.
Yup, and I would encourage others to do the same. If the citizens of these countries get tired of being blocked everywhere, perhaps they will start pressuring their governments to do something about the vermin that are making life miserable for the rest of us.
WorknMan,
Haha, I doubt their governments would listen to them any more than ours do to us.
You are nonchalant about punishing the innocent here. Of course you can do whatever you want when you own a site, but do you have any evidence that China is the source of more hackers than other countries per capita? I'd genuinely like to know, because mathematically speaking China should have 4.3 times the hackers of the US due to population ratios alone (all else being equal).
> You are nonchalant about punishing the innocent here. Of course you can do whatever you want when you own a site, but do you have any evidence that China is the source of more hackers than other countries per capita?
Before I set up fail2ban on my home router, my /var/log/auth.log would routinely fill up to a few megs full of failed access attempts from Chinese and Russian IPs.
SSH? Using a non-standard port always solves this problem for me. It's the easiest way to decrease attacks, and thereby decrease the chances of someone breaking in. Port 443 works great for me, and it's always open no matter where you are. (I run an HTTP server, but not HTTPS.)
And then these people will just start spoofing IP addresses to defeat the blacklisting.
Yeah, good luck with that. The people you actually have to worry about, unless your site/server is a royal clusterfsck of ineptitude, aren’t going to be stupid enough to use an IP address from those countries.
Louis CK really hit the nail on the head on this one, I think.
https://www.youtube.com/watch?v=uEY58fiSK8E&feature=kp
Came here to post that exact link.
Technology is a reflection of people. Software sucks because people suck. People can't decide how they want it to work. They want everything – intuitiveness, efficiency, aesthetics – but they don't demand anything of themselves, like reading instructions, spending effort learning something, or learning to ignore things that don't matter.
Well, mostly true. I’ve 20+ years of coding behind me, and as a lazy developer, I try to get the code as concise and readable as possible using good practices, naming conventions, etc…
While I can say my code is neat, well structured, and tested before release, I have to deal on a daily basis with junior (or not-so-junior) engineers who just code for the income, and obviously their only metric is LOC!
When copy/paste goes bad and you have to clean up the mess and maintain the code afterward, you cannot imagine the pain involved. Watching Hellraiser is painless by comparison. And those dudes get 20% higher wages than me because, you know, they are EN-GI-NEERS, and I'm not.
Look at what those engineers are capable of:
function()
{
    switch (select)
    {
        case 1:
            do_stuff();
            return (error_code);
        case 2:
            do_something_else();
            return (error_code);
        default:
            return (error_code);
    }
    return (error_code);
}
What about the ‘break’ statement? This is legacy C, not even C++! How did they manage to get their degrees in the first place?
Kochise
Oh, another one:
…
    result = is_ok();
    if (result == false)
    {
        return false;
    }
    return result;
}
What – is – this? My version:
…
    return is_ok();
}
That's probably why I earn 20% less; I code 20% less :/
Kochise
Ha ha, WTF.
Who pays by lines of code these days? That's got to be the most stupid metric to value work by. Less code == better! Fewer bugs, less to maintain, less for new devs to read and understand.
Yeah… I’ve seen stuff like this too. Craziness.
In this specific case, I wouldn't use ‘break’ either, because the return takes care of it. I personally wouldn't even have the last return, because it's taken care of by the default case.
Nope, you’re wrong, dead wrong.
You'd be better off feeding the ‘error_code’ variable with the expected error code and letting the final ‘return’ do its dedicated job (roughly the shape sketched below) than keeping the unmaintainable mess currently in place.
And when you are coding embedded and have only a few breakpoints available, it's easier to just put one on the final ‘return’ than on each one.
No, seriously, this is bad practice.
Kochise
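(For reference, the single-exit shape he is arguing for would look roughly like this, keeping the same invented names as the earlier example:)

/* Single-exit rewrite of the earlier switch example (names are still made up).
 * Every case feeds error_code and breaks, so one breakpoint on the final
 * return catches every path -- the point when the target only offers a
 * couple of hardware breakpoints. */
enum { ERROR_NONE = 0, ERROR_UNSUPPORTED = -1 };

static int do_stuff(void)          { return ERROR_NONE; }  /* stubs for the sketch */
static int do_something_else(void) { return ERROR_NONE; }

static int handle_select(int select)
{
    int error_code;

    switch (select)
    {
        case 1:  error_code = do_stuff();          break;
        case 2:  error_code = do_something_else(); break;
        default: error_code = ERROR_UNSUPPORTED;   break;
    }

    return error_code;   /* the single, breakpoint-friendly exit */
}

Whether that is clearer than returning early is exactly what the rest of this exchange argues about.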
I’m only working off your example, not the one you had in mind but simplified for confidential reasons. For a very small function like that, in general programming terms, not just embedded, it doesn’t make a difference in readability.
Especially in such a small function, I prefer functions that exit as early as possible, provided all resources are easily freed, rather than having to read to the end of the function.
If the function was more complex, I’d do it differently and do what you said.
What’s wrong with putting a breakpoint after where the function is called and the value is returned? Same result.
This was an example out of a function that features 80+ cases…
Nope, because there could be several functions calling this one, and you would still have to put several breakpoints to catch all the parent calls. With just one ‘return’ at the end, you can catch the faulty ‘error_code’, step out, and see which calling function broke everything.
Geeez, I’m teaching you your job, I should quit that…
Kochise
As you can tell, my debugger-fu is not strong. I personally don’t like using debuggers. I prefer to design my code so that debuggers aren’t necessary.
Actually I would return as soon as I could.
I understand your point, and in C it is totally valid (though the “goto” keyword does its job there too; the usual cleanup-label idiom, sketched below), but in C++, where RAII is in charge of releasing all the resources used by any instance in a block, or in languages where an exception can be thrown anywhere, the idea of having just one return point per se does not make any sense to me anymore.
What happens to your breakpoint on the final return if an exception is thrown, rendering that breakpoint unreachable?
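(The C “goto” pattern mentioned above is the usual cleanup-label idiom; a minimal sketch, with made-up names:)

/* Cleanup-label idiom: error paths jump forward to a single exit that releases
 * whatever was acquired, so there is one place to break on and no duplicated
 * cleanup code. */
#include <stdio.h>
#include <stdlib.h>

static int process_file(const char *path)
{
    int   rc  = -1;       /* assume failure until proven otherwise */
    FILE *fp  = NULL;
    char *buf = NULL;

    fp = fopen(path, "rb");
    if (fp == NULL)
        goto out;

    buf = malloc(4096);
    if (buf == NULL)
        goto out;

    if (fread(buf, 1, 4096, fp) == 0)
        goto out;

    rc = 0;               /* success */

out:                      /* single exit: cleanup plus one breakpoint spot */
    free(buf);            /* free(NULL) is a harmless no-op */
    if (fp != NULL)
        fclose(fp);
    return rc;
}

It buys the same “one place to stop” property in plain C, without relying on C++ RAII or exceptions.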
This is embedded C++ with no RTTI and no exceptions due to poor compiler performance (Keil), and only 2 hardware breakpoints (ARM9) or 6 (Cortex-M3) combined with ULINK probes (unlike J-Link probes, which have “unlimited” breakpoints).
This is no NetBeans with Java, dudes, this is the real stuff.
Kochise
You mean “this is no desktop development”. Nothing to do with Java mentality. I hate Java, and I can understand why embedded would be different to desktop C++. Your example code is just too simple to get your point across.
I won't copy/paste a full source code extract to give you an idea, for licensing reasons of course.
Kochise
If that is C, you don't need (or even want) breaks if you are returning from within a case…
If you did include them after the returns, they would never be reached, and in fact could cause some compilers to spout dead code warnings.
His objection was that the returns inside the cases should be changed to breaks, so that the end-of-function return is the only return. It makes sense for what he's doing, which is on an ARM system with only two hardware breakpoints.
Like I said, the real-world code he's basing his example on is probably more complex than the example. I would say to him that different contexts for a switch statement make the whole “only return at one point” rule not a hard and fast one.
Otherwise I might have to lift something heavier than a keyboard one day.
Long ago I ran the program IEFBR14. Ran it many times. This program did absolutely NOTHING – its sole purpose was to return control to the operating system.
The program was 2 bytes long. Return control to the OS. One simple machine instruction. Absolutely nothing could go wrong. Right?
It had a bug.
http://catless.ncl.ac.uk/Risks/6.14.html#subj1
The reason it existed was to satisfy the requirements of valid JCL: every job deck had to have at least one EXEC statement to be considered valid (JOB card, EXEC card, optional // card). IEFBR14 was there to provide a program that could be EXECuted. A nice side effect was being able to use this program to issue file (correctly: “dataset”) dispositions without actually needing any “processing” program that did things with them, to manipulate the file catalog, or to build libraries (later obsoleted by dedicated programs). It could also be used to alter the job flow, because the return code could be evaluated in later job steps with the COND= parameter. It could also serve to “tidy up” after the last job step, eliminating unused or “accidentally” kept files. Finally, it could be used, when placed in the first job step, to make sure that following steps would be able to allocate sufficient storage capacity (and if not, let the job fail early).
This is just another illustration of how broken things were – when the null program actually had several useful purposes. Having written this, I feel old now. I should run IEFBR15.
Software engineering needs to be like physical engineering:
– you need an internationally recognised software engineering qualification to release software to the public (except for testing purposes).
– all code needs to be fully documented
– all software needs a very stringent and formal testing process before being RTM
– all public domain code needs to undergo a comprehensive design process before the first line is written
– legal liability for any damage caused by defective software
– professional indemnity insurance
The result would be:
-99% of software authors would disappear overnight.
-salaries of competent software designers would rise dramatically.
-the rate of new software releases would drop dramatically.
-The quality of software would improve exponentially within a few years.
No, the answer isn’t more regulation; the answer is more cooperation. Restricting software development would only slow things down and give entrenched organizations total control over our technological future. If we work together, we can make some damn good software.
I, for one, prefer freedom and accept the dangers it entails.
I almost find this mentality interesting.
Can you name me another profession where those in it profess this level of freedom?
Doctors? No.
Lawyers? No.
Accountants? No.
Trades people? No.
Teachers? No.
Nurses? No.
Civil Engineers? No.
Electricians? No.
…
Do you really think people in software are so special that they should ignore the lessons learned by every other profession in history? It is rather amazingly arrogant of software engineers to think so, all the while complaining about everything (being treated like crap by business people, poor quality, poor working conditions…).
I hate to break it to you, but most people aren’t that special. Power and organization matter a heck of a lot more in influencing the world.
Not to mention, we don’t live in some free world.
There are massive issues here; you can't simply say that freedom will result in more innovation. Good people can and will choose other career streams which offer a better life with higher professional standards. Without long-term careers in the field, we may face a lack of deep innovation, as few would want to invest their time. Social web app innovation is fine, but it doesn't need years of deep study.
One point: as far as I recall all of those professions you mentioned were around a long time before they learned the right way to formalize. For centuries (or in some cases millennia) people learned by getting apprenticed, or by reading the classic works and then putting up their shingle.
Software engineering is still in its infancy; we have barely learned to stand compared to those other fields. It isn't clear we know the right way to formalize yet. I don't know that we could come up with an equivalent of the bar exam that was worth anything. Lord knows the MCSEs are no predictor.
We call ourselves engineers, but that was just a term to distinguish ourselves from the older, more clerical definition of programmer. We are really more like craftsmen, sitting at the intersection of art and engineering.
I am not saying we shouldn’t formalize, but I am saying that it is going to be a lot harder than most people here seem to be asserting.
On that I agree.
How to professionalize and what to standardize is very complex.
My post was more responding to the parent who was against it in favor of a perceived value in freedom and cooperation.
“I hate to break it to you, but most people aren’t that special. Power and organization matter a heck of a lot more in influencing the world.”
You don’t like to read history books do you?
I stood and watched in disbelief at the height of the SCO trial as McBride stood in front of a jury and said it was impossible for LINUX to have come about without a PhD and the resources of an institution, and that therefore LINUX must have been copied.
Digging through more ancient history, we have Exhibit B: one George Boole.
He was told by the professional organizations of the time at key universities to go back to digging ditches, because his idea of dual-state logic was HOGWASH.
Not to mention Mr. Boole didn't have the cash to buy his way into the ELITE CLUB called a university education.
I hope I do not have to point out, how important Boole’s work is.
Exhibit C: one Albert Einstein, sent home by the organized institutions of professional “learning” of the time as unteachable, who spent his “unorganized” time as a patent clerk working on theories that have changed our world forever.
Exhibit D: one Nikola Tesla, who after spending a year at an “organized, professional university” couldn't stand it anymore, left forever declaring he would never return, and built the entire foundations of energy science by his little ole self.
AC current, which ironically allows you to type…
SUCH HALF-ASSED COMMENTS as
“I hate to break it to you, but most people aren’t that special. Power and organization matter a heck of a lot more in influencing the world.”
All of the advances that make up what I would call the modern age came from people who rejected professionalism, or “organization” as you call it.
This fundamental rejection by these and many more people is what drives the huge advances in technology and REAL science, not the crap they do now at universities for grant-money roller coaster rides.
The individual built our modern age. Professional organizations and universities did everything they could against these individuals to kill it.
-Innovation dries up, as some of the most interesting individuals in tech – Steve Jobs, Bill Gates, Woz, Zuckerberg, and a host of others – would never have met your criteria.
-IE 6 shows that software that doesn’t release, stagnates. If the pace of development slows, innovation follows.
-A lot of OSS and Free software got its start from some guy scratching a particular itch, not a comprehensive design process. Unix is a good example: it was written by Ken Thompson and Dennis Ritchie on a scavenged minicomputer because Multics failed.
-Businesses couldn’t afford to have inhouse software written, if all their developers had to follow strict engineering guidelines.
-Here's a list of bridges that have fallen internationally: https://en.wikipedia.org/wiki/List_of_bridge_failures. Here is a list of automobile recalls in 2013 that says there were 632 recalls, for a total of 22 million vehicles: http://www.autonews.com/article/20140203/OEM11/140209973/toyota-lea…. If engineers are so infallible, how come these things occur? I'd say engineers, like software developers, are only human.
BluenoseJake,
I agree. Although it's an interesting example, because it's caused just as much by a lack of funds for repairs as by human error.
http://www.paintsquare.com/news/?fuseaction=view&id=5346
There was big “news” in my home town where the bridges were reclassified to lower weight limits because there were no funds committed to fixing them.
Henry Ford, Thomas Edison and Isambard Kingdom Brunel wouldn’t be hired now either. Times have changed.
You don’t need constant major changes to improve software. You can continually polish and improve existing software. Many highly competent physical engineers spend their entire careers making tiny improvements to existing designs and processes.
Ritchie was a Harvard PhD. Thompson was a UC Berkeley MS. Both were extremely competent professional programmers. They weren’t self taught amateurs.
In every other profession you have to follow strict guidelines even for in house work. Airlines can’t train pilots to some arbitrary in house standard to save money. Dentists can’t use unapproved methods to sterilise their instruments. Accountants can’t use non standard audit processes.
Physical engineers are fallible. That is why they are so conservative. They thoroughly plan everything (even very minor ‘hobby’ projects) before starting. They use proven technologies, allow very large safety margins, and multiple redundancies.
Large road bridges collapsing due to design faults are so rare that they are used as engineering case studies. This has occurred a handful of times in the last 100 years.
The vast majority of car recalls are to repair very minor problems that have a low probability of causing a serious hazard. If the same consumer protection laws applied to consumer software Apple, MS and every other major consumer vendor would have been bankrupted years ago.
I'm going to play a bit of devil's advocate here:
-Innovation dries up:
-IE 6 shows that software that doesn’t release, stagnates. If the pace of development slows, innovation follows.
Since when is innovation the trump-all value? How about stable work, good wages, deep knowledge, time with their kids…
-A lot of OSS and Free software got it’s start from some guy scratching a particular itch, not a comprehensive design process. Unix is a good example, it was written by Ken Thompson and Dennis Ritchie because Multics failed, on a scavanged minicomputer.
A lot of OSS got its start from big, powerful entities: monopolies like the old AT&T, the Bells, universities…
-Businesses couldn’t afford to have inhouse software written, if all their developers had to follow strict engineering guidelines.
So? What’s wrong with that? There’s nothing wrong with being a service provider. Most companies don’t provide their own legal, power utility… services…
-Heres a list of bridges that have fallen
Sure, things go bad. But in a completely open system things go worse than in places with high professionalism and engineering standards.
unclefester,
That would just make the field more bureaucratic IMHO. If you really want software developers to have to buy insurance policies like doctors, then be prepared for a shock when it comes time to pay for professional insurance.
http://work.chron.com/much-doctors-pay-insurance-7304.html
Of course doctors often have more at stake, but still. Just as with ambulance chasers, “Software engineer chasers” would become a new thing. No thanks.
That’s one theory, but many developers like myself would somehow have to absorb these costs while competing with offshored firms that don’t have them.
You are going to enforce standards on public domain software? Who’s going to enforce that?
I get that many developers are just no good and the result is a shoddy product. Yet the companies themselves are often just as responsible. There’s usually a rush to get products done and little to nothing in terms of a budget for proactive security/bug fixes. We as software engineers end up doing the bare minimum because that’s all that the company gave us time to do. I think every experienced software engineer will already know that it’s not a one-sided problem.
I would argue that there is a use in distinguishing software engineering as a profession from software development. Software engineering (like other engineering professions) would be held accountable for failures.
I wouldn't propose that all software construction needs to be regulated that way, only the cases where it makes sense, e.g., safety- and life-critical systems, shutdown systems for nuclear reactors, etc.
In that context it makes perfect sense to have some kind of standard and protocol to deal with failures and some way to ensure quality.
In Canada a “professional software engineer” requires graduating from an accredited university and taking a second exam (after some years of experience) to qualify for the designation of “professional engineer”. It's the same body that regulates the other types of engineering.
The point is that you wouldn't be competing with shoddy offshore operators. You would be competing with people who would be legally required to be competent.
Virtually every car, bridge or skyscraper is designed by highly paid Western/Korean/Japanese engineers, not outsourced to lowly paid locals. Even in countries like India salaries would rise dramatically, because there would be a far smaller pool of qualified software engineers.
No American/European/Australian/NZ IT Project Manager is going to hire a cheap shoddy worker if the Project Manager can be sued for negligence.
unclefester,
I’m saying that when a company looks at their options, they’ll see domestic costs increasing and foreign development will look that much more attractive. Adding new overhead for domestic software engineering will make the imbalance even worse.
A lot of offshoring has been motivated by cost reductions, which often goes hand in hand with quality reductions (ie laying off the already qualified domestic engineers and replacing them with far cheaper offshored ones).
I realize you have good intentions here, but companies seeking to maximize profits just won't care.
For me to agree with any of this I would 1) have to use drugs, and 2) be insanely high on them to think any of that is a good idea.
That's because you don't have a professional mindset. True professionals always want higher standards in their field. I've never heard surgeons say “surgical training takes way too long… how about we cut out the residency and medical school requirements.”
Oh what a load of bull.
I care about developing the best software for our users with the resources I have at my disposal. It might not fit your list, although there is some overlap, but what it does do is help our users perform their jobs better and more easily, saving them money they can spend on better things. I am sure I would do things differently if I were designing the software for the next lunar lander, but I am not, and neither is the great majority of the IT industry. Instead we do our best to actually solve real problems, today, not in 10 years.
Sure we focus on code quality and try to fix bad stuff, but only when it makes sense and is worth the effort.
I might not be a true professional by any of your metrics; in fact I am probably the opposite, as I am also the one who has to find the cost-saving shortcuts that prevent others on my team from being your true professional. But I will happily continue doing what makes everyone (important) happy, since we can't print our own money so that I could hire 20 extra developers to accomplish basically the same as what we manage to do with 5 right now.
Well… that is pretty much how you would describe a professional in any field.
If you were doing a house renovation and had several contractors to choose from, the one people would describe as professional would:
1. Hold appropriate licenses
2. Provide a firm estimate and hire the correct number and quality of people.
3. Turn down a job if they cannot meet the price. This happens all the time.
There are boatloads of contractors who are not very professional, but who do the work at prices and timelines people are willing to pay. But you’d hardly call these people professional.
Yamin
Well, that’s true of any field. Some of us are professionally behaved while others are not. That will never change whether we’re certified or not, agreed?
Software in many ways is special. There are plenty of analogies used to compare software to our physical world, but these always fall short because software really is different from everything else in the physical world.
This isn’t to say we should not have standards, but any argument that software engineers should do X because other physical engineers do X is going to fall flat IMHO. You would need to make a convincing case that the software field would improve without overly relying on comparisons to other kinds of professionals.
It would be like engineers having to deal with different laws of physics depending on whether they were building a bridge vs a car vs a skyscraper vs a 787, and each at different engineering firms.
No, that's not it. Your recommendations completely ignore the root problems, which have been pointed out by several other users already. It's a simple theory that looks good on paper only when you put blinders on. Regardless of any over-simplified analogy you think makes sense, your theory applied in the real world wouldn't have anywhere near the effect you're convinced it would. I would be willing to bet it would do the opposite.
I do however give you credit for acknowledging there is a lot of room for improvement. On that point we can agree at least.
Pray tell, where is that formal testing process defined? I know we have unit testing and QA scripts and stuff like that, and I am one of those developers who tests his code before committing… but all that is just one-eyed people in the land of the blind. No one has any idea how to fully test correctness for software.
In engineering, however, you have materials science and can predict what kind of forces will be involved and how to build your stuff to withstand them. And even then they will cut corners to save costs, liability be damned.
Also, you will pay 20x what you're paying now for commercial software. Maybe 100x. Including crap like office suites that aren't really life-support systems. Free software (as in beer) will simply disappear, because no one will pay for insurance out of their own pocket. Only the rich will be able to afford software.
You do exactly what 19th century engineers did. They went from guesswork to testing every single material and component, down to individual rivets, until they failed. They used this knowledge to develop rigorous procedures and standards. Each new project was a minor iteration of an existing design using proven materials and techniques.
Software engineers often try and build a Bugatti Veyron when they still have barely worked out how to build a reliable Model T Ford.
A car maker will spend about $2 billion developing a new model. It will invariably use proven existing technology and be a very minor variant of an existing car. They will make dozens of prototypes and literally test them to destruction under far harsher conditions than they are ever likely to encounter in the real world. Then, and only then, will the car go into production.
Software would be no more expensive. It would simply evolve more slowly with a far greater emphasis on quality rather than implementing unnecessary new features or style changes.
At the local train station, there has never been a week where an escalator isn’t out of order. Sometimes two out of the four available.
Today, bridges and buildings are of increasingly poor quality at the end of construction. Even in the developed world, newer bridges collapse and pipes burst.
People don’t even want to pay enough for hardware quality. They’re not going to pay for software quality unless it’s really important like space technology software, aircraft/traffic controls and nuclear power stations.
Hell, not even the banks want to mess with their COBOL base even if the code ABENDS all the time.
In most cases the problem is cost cutting or lack of time for maintenance rather than genuine design flaws. My local shopping centre has worn-out travelators because the trading hours were increased a decade after the centre was built. Originally they had a 36-hour window to perform deep maintenance. That was cut back to about 12 hours when Sunday trading commenced, meaning that only ad hoc repairs can be done.
AFAIK all Australian banks were legally obliged to completely rewrite their backend software to much higher standards many years ago. [In the 1980s the smaller retail banks were all consolidated to form four huge national banks.]
Yes, and no amount of accreditation can override the forces of cost cutting, which is what plagues software development, rather than a lack of engineering standards.
As for your travelators, my escalator example is a bit different because they were supposed to be designed for more use than a shopping centre and they still fail. They were newly built for the newly opened Mandurah Line, and they didn’t even reach the decade of use of your local shopping center.
No doubt if software development had taken place in the 19th century, we might have had the luxury of “over”-engineering everything.
It's not the luxury of “over”-engineering things, it's just that… well… consider hundreds of thousands of coders across the globe working for the last 60 years reimplementing the ‘printf’ function.
I mean, if we weren't redoing everything from scratch every time to avoid paying for/licensing libraries, and instead put everything in a giant code pool, we wouldn't have wasted so many man-months worth of coding.
Perhaps we would have colonized alien planets by now, or at least the orbital cities promised in the '50s would be a reality.
But seeing how a single ARM MCU coded in C can be such a mammoth task to tame/debug for some coding teams, I guess there is no hope.
I digress…
Kochise
You could also argue “if people stopped bitching about the programming language and stopped creating all these clones”, we’d be more productive.
Yup, exactly, look at rosettacode.org !
As I've said before, 4 programming languages are enough:
1- ASM & C for low level
2- Erlang for middleware, networking and interprocess communication
3- LISP for high level abstraction with its unmatched code autogeneration macro system
If people just dared to master a handful of languages instead of believing they have to learn a new one every now and then (every year, according to some coding book publisher), perhaps they would do more with less.
Kochise
It isn’t the 19th century anymore. Every other manufactured product on the market is regulated for safety, quality and reliability. If you want to sell a car or a choc chip cookie you have to comply with the extremely onerous regulations.
If you want to release software you need to do what professional physical engineers do – design, develop and test to rigorous formal standards. If you fuck up you should pay the consequences by losing your job or getting sued. If you can’t comply stick to being a hobbyist or get a new career.
Regulation actually increases innovation because it forces engineers to think laterally. Extremely strict international emissions-control, fuel-economy and safety standards for new cars have produced more innovation in the last decade than in the previous 40 years. Americans were (are) still building cars with crude pushrod engines and a separate chassis 50 years after they were obsolete. The US car makers spent decades and billions of dollars fighting new regulations while other countries went about designing and building better vehicles. [We know how that turned out.]
If you want to sell software you should have to pass an internationally accredited course to become a Licenced Software Engineer. Every physical engineer has had to do so for well over a century.
unclefester,
Well, you're talking about how regulating the end product helps create better end products, but then you completely switch gears and say software engineers need to become accredited with some kind of license. The latter does not follow from the former. In your example, the reason cars got better emissions control and fuel economy has nothing whatsoever to do with engineers being licensed or not.
Same thing goes with energy efficiency requirements of refrigerators. The standards have made an enormous difference, but the fact of the matter is HVAC engineers were certified both before and after the energy standards. The problem in these cases obviously isn’t certification, it’s that companies in a free market tend to build things as cheaply as possible to maximize profits. Some industries actually have *negative* incentives to increase efficiency (ie who killed the electric car).
All physical engineers have been formally trained and licensed since the late 1800s. [The St. Francis Dam near Los Angeles failed, killing over 400 people, partly because it was designed by William Mulholland, an entirely self-taught 'engineer' with no formal training.]
A formal qualification sets a minimum standard. Many physical engineering graduates never become any more than a technician, because they fail to develop more advanced skills.
A large percentage of software designers, particularly in India, are ‘failed’ civil, chemical or mechanical engineering graduates.
unclefester,
This is a very passionate topic, so it’s good that you brought it up. I just feel that you are ignoring some of the counterpoints many of us have expressed based on our real world experience in the field.
An important fact is that the engineers usually aren’t the ones running the show. I’ve seen poorly tested products go to production, and even though I voiced a strong objection, in the end it doesn’t matter when management is determined to impose insufficient budget/time constraints. It seems entirely unfair to blame software engineers for decisions that (I believe in the majority of cases) they didn’t make.
Frankly it doesn’t take a software engineering license to see what’s gone wrong most of the time. We already know it, but don’t have the budget to do anything about it. I really wonder why you aren’t taking a stronger position on regulating the end product instead of licensing the software engineers? Otherwise we will end up increasing overhead expenses for software engineering licenses while ending up in the same underfunded position as before with regards to product quality.
This would lead to the creation of more parasite businesses happy to sell you a “well-recognized certificate” if you pay, and the result would be trading money for the appearance of knowledge, experience and qualification. This is already the situation in Germany, where you can easily get certified without actually needing to understand stuff – it's sufficient that you (or your current employer) are willing to pay for the “certification” so you become “more valuable” to the “free market”. I've seen that system in action so many times that I can assure you: it doesn't work. It only results in stupid but wealthy people occupying important and responsible jobs which they aren't qualified for. The results can be seen in reality. It's more scary than it sounds.
No it wouldn't. It would require a standardised software engineering degree from a reputable university, supervised practical work experience, written exams (or submission of code) and continuing education modules. This is how all physical engineers are certified.
First problem here: not everyone capable of actually delivering all it takes to become a good engineer will be able to afford to attend university. It will probably end up as “education for the rich ones”.
You can already see this today (even though it's not 100% related to “engineering”): the most capable, most skilled, most advanced and most experienced IT professionals are usually those who do not bother “sitting there” and getting a degree (and paying money for it), but instead educate themselves, learn, train, try, do stuff and become better every day. They are often called “master hackers” or “rockstar programmers”. They can outperform almost every university graduate in a comparable field.
I would actually really like to see this become reality, as it would eliminate the “he has a degree, but does he really know stuff?” question which we currently have here in Germany with university graduates and “professionals” with shiny certificates: the reality is that only a small subset of them are actually qualified in the way we would expect. Today, you can count yourself lucky if a “Dipl.-Inf.” (Diplominformatiker, an IT degree) is actually able to use “everyday software” or has a basic clue about programming. Sadly, industry assumes that those who show up with shiny papers, curly signatures and coloured stamps are experienced, qualified, motivated and good programmers (because the job description says “programmer, university degree required”), but what they get is usually “hit or miss”, and in many cases it's just “miss”.
Strangely, unqualified people still get the jobs here as long as they are certified, and as a result you can see the creation of non-working software, hostile work environments, unusable tools and buggy processes. Still nobody cares, because they are certified! And of course they are much better paid than those who can actually do the work that is required, usually skilled engineers without that many degrees – people who simply can do “all the stuff”. Of course there's additional confusion created in HR, where “engineer” and “programmer” are not understood. The hiring process starts with simple pattern matching, and qualified candidates who miss a degree or previous job title get excluded quickly.
I know that a reliable certification would solve lots of problems, but as you can see, it introduces many others. This is inherent to the system. The system allows abuse, and therefore abuse will happen. This is where the “parasite services” enter the stage: they offer what industry requires, in a market manner. Want a degree? Pay money, get degree. Problem solved.
All the things you've mentioned – supervised practical work experience, written exams (or submission of code) and continuing education modules – are positive and useful things. But they cost time and money. Industry is not willing to invest that, and the individuals subject to that education probably won't either. A common argument is “I've been to university, I don't have to learn anything new!” Having a degree is more important than being able to do a good job, or so it seems.
But as I said, that is mostly the situation here in Germany as I have experienced it. It’s a very sad description, I know. It may be totally different in other countries where a university degree or certification actually means something, and is worth more than the paper it’s printed on.
Maybe this article is more interesting. Short quote:
Fake degrees
A quick study
Bogus degrees from non-existent colleges cause headaches for employers
You can buy everything else in China, so why not academic qualifications?
“Chinese people pay more attention to having a diploma than they do to having a real education,” says Mr Xiong. “A diploma is worth actual money, whereas an education is not.”
Source: http://www.economist.com/node/21558318
Now, hold on a second. I’m a “physical” engineer (mechatronic medical devices engineer to be specific), and I have to say much of what you’re saying is rubbish, mainly because most engineers don’t work alone. It’s not the engineer that is responsible, it’s the company the engineer works for, and therefore it’s up to the company to ensure that a) the engineer is competent, and b) the product released to the public is fit for purpose.
If it isn’t, it’s the company’s fault for releasing it, not the engineer’s fault. I have developed many things like aortic stents, optical blood analysis systems and other medical-level products, and have developed software and firmware for some blood analysis equipment too. This is serious stuff, yet:
– I don’t have a software development accreditation other than my Mechatronics degree
– My commenting and documentation are decent but I’ve no idea if they meet *your* standard
– I’ve no public indemnity insurance
– I’ve no say on whether my products (software or hardware) get released or not
Are you saying that I shouldn’t be allowed to develop these products? What absolute rubbish.
If you want software quality to improve, simply tighten up the QC and QA procedures applied. Software in the medical devices field is buggy, as it is in every single field, but it has gone through years of validation and verification until the company is happy that it's not going to kill someone. Only then is it released to market. Of course most applications could be put through the same development cycle, but then the majority of people would be complaining that there hasn't been a new version of Word in 8 years, and the existing version cost them £5,000 per licence. In reality the market dictates that software needs to be cheap and frequently updated, and since that's what companies want, they enforce it. It's nothing to do with bad programmers, it's to do with enforcing tight deadlines and minimal QA and QC processes. After all, a silly bug in Word (and yes, there are many) might piss a few people off, but it's hardly likely to kill them, is it?
It is hardly rubbish.
Are you a graduate of an accredited college with an engineering degree? Is it a Masters or just a BS? If it is a Masters, do you hold a State License as an engineer? Did you pass the required exams? If not, do you therefore report at work to those men and women who do?
Your field is an interdisciplinary one that includes aspects of software engineering. And yes, you as a member of your company, would be liable for any issues especially with a medical product, which by the way must meet certain professional level qualifications in order to be approved by the FDA, right?
I work in two fields – psychology and economics. In both, I have had to follow the same professional requirements as UncleFester rightly suggests software engineering and development should now follow. I, too, find myself programming quite a bit given the nature of my work and research. I gladly would welcome greater requirements of professionalism in that field. I would take the time to achieve those added requirements. It would benefit me, my research, my university, and those whom my work helps.
The resistance to this is laughable. Everyone who disagrees dreams of becoming Gates or Jobs, while no one wants to aspire to being Ritchie or Knuth. With recent severe issues like Heartbleed and eBay's security breach, with Apple wanting home automation as its next big thing, and with Microsoft wanting inside of automobiles, it is far past time for the computing field to grow up past the adolescent 'freedom' phase and into the adult professionalism phase of its evolution. Only teenagers living at home with mommy and daddy footing the bill believe what they have is 'real' freedom.
Well, first off I'm not in the US, so terminology is different in some cases. We do adhere to FDA requirements of course, but our main concern is with ISO regulation.
I don’t have a Master’s, it’s “just” a Bachelor’s degree – BEng actually. I don’t have a state licence or any equivalent here, nor does any other member of my team.
How the medical field works is that someone ultimately approves the work that I do, and since I’m a senior engineer, that somebody is a combination of management, QA and Regulatory Affairs, none of whom have software engineering qualifications. If I do a half-assed job and produce buggy software which causes someone to be overprescribed Warfarin or something like that, the responsibility is with management for allowing such shoddy products to reach the market without adequate testing, and it would have to be proven that I deliberately and maliciously circumvented the quality systems for it to come down to me personally. They don’t need to have software degrees to approve my work or the work of my team, they need to have the correct controls in place to ensure that any issues get caught. That’s their profession and they’re good at it, be it hardware, software or implant.
I don't know where you got the idea that I want to be the next Jobs or Gates. I most certainly don't. What I do believe, however, is that professional products, regardless of the field they originate from, need all steps of the development and manufacturing process to achieve a standard appropriate for their use. That means planning, design, development, verification, manufacture, validation, QA and post-market surveillance. Manufacture alone does not make a quality product; the quality control steps (verification, validation, QA and post-market surveillance) are every bit as important, and together have the single biggest impact on end product quality.
Mission-critical stuff always needs far more rigorous testing, and such testing should have caught things like Heartbleed etc., but I think a lot of people are expecting mission-critical quality from products which are obviously not mission-critical, such as MS Office or games. I wouldn't expect my washing machine to be built to the same quality standards as an aircraft engine, for example, and therefore would expect that lower standards are applied across the board at every level of the washing machine's manufacture. And I'm OK with that – I only paid €400 for the washing machine, and if it breaks down, chances are it won't kill me or steal my bank details.
Perhaps some more experience of the manufacturing industry would make the concept of quality products a little clearer to people who live exclusively in the field of software or academia.
I don’t think anyone is arguing against QA steps. In fact, I would agree that those need beefing up as well as the professionalism of the earlier stages of development.
I am quite aware of the concepts you are putting forth despite my professional background. Trust me, there really aren’t as many differences as you imagine.
Planning, conducting and publishing research for review is not that different. It follows similar QA and specific cross-profession standards.
It isn’t just mission critical needs, it is reliability. If that wash machine in your example has poor QA and buggy software, it will be recalled or manufacture fixed. If it isn’t, then there are liability issues that arise. If those aren’t addressed, the company in question runs the risk of lawsuits and even going out of business. The same kind of standards and professionalism just don’t exist currently in OS, application, and game design.
Hello, I am Warner Bros, and I could give a shit about fixing the bugs in Batman; all I care about is taking your money through DLC purchases. That's the kind of bullshit that needs to stop, and it is not allowed in other regulated professions.
http://www.rockpapershotgun.com/2014/02/10/warner-prioritizing-dlc-…
And here is another example:
http://www.gamesindustry.biz/articles/2012-07-19-fez-developer-wont…
I think many more examples, including Heartbleed, could be found. There is a problem. There are solutions. Many software developers unfortunately do live in a fantasy world of specialness and freedom at all costs.
TM99,
I agree that these are bad outcomes. Yet aren’t you forced to concede that it’s the company’s decision not to fix the bugs for financial/resource reasons? It’s not like the software developers themselves are unwilling/unable to fix them. The question of funding keeps on getting overlooked.
Heartbleed was serious, and there are plenty more examples of exploitable vulnerabilities in both the proprietary and open source domains. Yet you realize that OpenSSL was/is free software? You did not pay its developers for it, right? I'm willing to bet that you didn't have any kind of support contract from any of its developers, right? Did you donate anything to help fund QA, code reviews, or even developer training/certification? If not, then it seems completely hypocritical to tell software developers how to do our work and then not come through with ways to fund these things (even commercial projects can face internal resource problems, as highlighted by your links).
With all due respect, maybe the fantasy is that having state license requirements would somehow give software developers the resources we need to improve software from its present state. This is the actual problem that we could use a solution for!
Edited 2014-05-28 18:33 UTC
All software sucks… Yet people panic when they don’t have access to it.
All computers suck… Yet they’re hopelessly lost without them.
Technology reflects people and that’s why it sucks… Only technology doesn’t suck, and neither do people.
The sky is falling… Except that it isn’t.
My advice to anyone who thinks computers and technology are so horrible is to stop using them. There are plenty of places in the world where computers and technology aren't a part of daily life. If you think the grass is greener, by all means go find out. Just prepare yourself for the anxiety you'll likely have to live with.
The guy basically says that because things can break or be used in a wrong way it means they suck and, well, that’s just fucking moronic. Does a hammer suck just because you can use it to bash yourself on the fingers? Does a calculator suck just because you enter wrong operands in it and end up with an answer you didn’t intend to get? Do other humans suck just because they value things differently than you and therefore do things differently?
Well, no. If everything were designed in such a way that it could never be used in the wrong way, or could never, ever break, it would be useless; it would be so locked down that it wouldn't be able to do anything and wouldn't allow humans to use it in the first place. And no, something being usable in a wrong way, or by the wrong person, or whatever, doesn't automatically mean it sucks.
Having room for improvement doesn't mean something is broken. Not being perfect doesn't mean something sucks. The author isn't the first to have a melodramatic outburst, and surely won't be the last. One thing is certain: the world, and technology, will keep moving right along through all of it. Maybe some day technology will allow for time travel and he can then go back to live in the Middle Ages, where he wouldn't have to deal with the horror that is computers & technology.
Bad examples. Hammers do their designated task for decades without a glitch. Calculators don’t randomly spit out incorrect results. A lot of software is complete and utter shit.
No, he says that no matter how you use your software, and no matter what software you are using, it’s flawed and can be attacked and abused by others without you knowing it. He’s right.
Now, if I used the hammer to hit nails and occasionally hit my finger, but then at night the hammer were remote-controlled by someone else to bash in someone else's skull (or mine, before stealing my wallet), *that* would be a better analogy…
Well, software DOES suck. A lot. Software today is still rather primitive, especially things in the web (html/css/javascript) space.
The number of times per day I get frustrated with some bug or design deficit is pretty damn high.
But then I relax and realize that it’s also pretty awesome at the same time and I *don’t* write a whiny blog about it.
Edited 2014-05-27 10:47 UTC
It’s people who are broken.
Moving on.
No software is perfect but I use tons of software every day that is perfectly serviceable. Either you need to stop whining and use different software or you need to just stop whining.
The main reason software is broken is that there is no professional association of any kind.
It would enforce standards, a certain skill level, and probably some kind of liability.
Yes, it is a trade off. Things would probably move slower if we had some kind of organization.
Consider Facebook. Zuckerberg was some kid in college. Wrote up a webpage. Threw it out there. People used it.
Now imagine a world with a professional association controlling software. First, Zuckerberg would have to finish school and join the association. Then, due to the standards of the profession, he would have to write proper documentation and code to an acceptable standard. He would have to have his code audited by a security professional, as it would be dealing with personal information. The staff he hires would be expensive, as they too are part of the association, and certain staffing/skills/training levels would have to be maintained.
And we’d probably get Facebook in like 10 years time.
I actually don't say that as a negative. The reality is that we can dismiss Facebook as just some silly app, but if you have been in software, you know that is pretty much how the majority of software is written. From banking to networking.
Perhaps we could separate these domains or something. I am not sure.
In the end though, it seems the world has not come to an end. Software gets built. The world moves on very quickly. So maybe it has all been for the best.
Perhaps any such system is simply too complex for anyone, and the only way to get through it is to just grind our way through it.
Of course, maybe if we had a professional association, we would have had better standards so it would be less complex. But who knows.
Facebook started as nothing more than an online yearbook. The degree of real innovation was close to zero.
The real 'value' of Facebook is that it allows intelligence agencies open access to the activities and thoughts of potential 'subversives'. [In fact, Facebook's earliest angel investors were closely associated with the intelligence community.]
The software problem is more like some kid making a jetpack in his garage and selling it to all and sundry without testing or certification.
Let me see:
– Perfection is an invention of human minds; there is no such thing unless you make some presumptions about what it should look like. The Greeks did it with circumference / (2 · radius) and the ratio between natural numbers, and it did not end well. Similar mistakes abound in science (see the development of modern physics, or the attempt to conceive a tautological model for all of math, for example);
– As already said, there is a relation between time to develop / time to deploy and the money expended. Blow it and you end up with no software delivered at all;
– Engineering disciplines (civil, mechanical, etc.) are far from what people think. They work because we test over and over again, and we keep improving our models/analyses, our manufacturing and its controls. Oh, and there are safety factors to account for the big “unknown / unpredictable” (and believe me, in civil engineering it may be a factor of 10 when dealing with soil).
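For what it's worth, a back-of-the-envelope illustration of how such a safety factor enters a design check (the numbers are made up, not taken from any code or standard): the allowable load is the estimated failure load divided by the factor of safety, R_allowable = R_ultimate / FS. So with an estimated ultimate bearing capacity of 500 kPa and FS = 10, the design relies on only 500 / 10 = 50 kPa, and the remaining 450 kPa is margin for the “unknown / unpredictable”.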
Even though we see the same process described in the last item in software as well, we also see two serious drawbacks that are rare occurrences in engineering (though there are examples of them):
– We keep changing the foundations (i.e., APIs and languages) and deploy the new ones at a very fast pace, even when they are clearly not ready yet;
– We frequently brush aside or ignore failure reports, or fail to follow a strict procedure to mitigate unavoidable mistakes, instead of using the incremental process engineers learn to follow.
So, instead of improving our tools with passion, we prefer to throw them away and (almost) start over again, and for some reason we do not investigate properly when something goes wrong. I guess this is what gives us the troubles we have.
Luckily, it does not happen this way across the whole software stack (or we would have an unsustainable situation). Compilers and the basic functionality of OSes follow a stricter model. The problem is that we interact with the top of the stack, where things are not that great.
Yes, no non-trivial system can be proved to be correct, and as a result it might contain issues. But these systems are commonly usable and consequently, in my definition, not broken.
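To make that concrete, here is a small, made-up example (nothing to do with any particular project): the well-known binary-search midpoint bug. The expression (lo + hi) / 2 silently overflows, but only for arrays of more than about a billion elements, so the routine is perfectly serviceable for years before anyone hits the flaw: usable, yet subtly wrong in the general case.

#include <stdio.h>

/* Works fine for decades of "normal" inputs; breaks once lo + hi
 * exceeds INT_MAX (arrays of roughly a billion ints or more). */
static int search_buggy(const int *a, int n, int key)
{
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = (lo + hi) / 2;        /* latent signed overflow */
        if (a[mid] < key)      lo = mid + 1;
        else if (a[mid] > key) hi = mid - 1;
        else                   return mid;
    }
    return -1;
}

/* Same algorithm, overflow-free midpoint. */
static int search_fixed(const int *a, int n, int key)
{
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;   /* cannot overflow for valid indices */
        if (a[mid] < key)      lo = mid + 1;
        else if (a[mid] > key) hi = mid - 1;
        else                   return mid;
    }
    return -1;
}

int main(void)
{
    int a[] = { 1, 3, 5, 7, 9 };
    int n = (int)(sizeof a / sizeof a[0]);
    /* Both agree on every input anyone is likely to test by hand,
     * which is exactly why the flaw goes unnoticed for so long. */
    printf("buggy: %d, fixed: %d\n", search_buggy(a, n, 7), search_fixed(a, n, 7));
    return 0;
}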
Greetings,
pica
The claim about the state of human computing machines is one level of abstraction above the actuality on the ground. It is therefore on a level with other, similar claims about concrete realities. So I would like to quote Noam Chomsky concerning a similar claim about not only language, but about reality in general.
Here's a passage from the “Current problems in the study of language and mind” video (https://www.youtube.com/watch?v=UKeUwRgftDY), starting around minute 1:10:10:
“The spine is extremely badly designed. From an engineering point of view it's a wreck… This is true of just about every system of the body. We'd have been able to save ourselves from a lot of saber-tooth tigers if we had an eye in the back of the head… if we'd had wheels. Organisms just do the things they can do, and it's usually a mess.
“The inorganic world turns out to be completely different. The driving intuition of physicists and chemists is that somehow it's all gonna come out extremely simple… and it's been remarkably successful. There's probably something to it, but why that is nobody knows…
“Language is very much like that kind of thing we find in the inorganic world… That's kind of curious, because language is the most complicated thing the organic world could do. So why does language look like its analysis should be discovered to be simple? In fact, right at its core you already have this property; language is a system which is technically called a 'system of discrete infinity'…
“Every aspect of language is discrete. It's a strange fact, because there is essentially nothing in the organic world that meets the condition of discrete infinity… Systems which are both discrete and infinite are very rare.
“I think the only place you find anything like a discrete and infinite system in the biological world is when you get down to the level of big molecules; and then you're back in the inorganic world. Furthermore, language has other properties that are rare (but not non-existent) in the biological world: if the economy conditions are real, and they seem to be, those of you who know anything about computability will recognize right off that if you introduce economy considerations, if you say that one computation is blocked if there's another more economical one, you introduce huge computability problems, unsolvable computability problems. So the question is: how does anybody know it, if such problems are unsolvable? Like, how do you know if some derivation is going to block another one [economically]? It turns out very quickly to be a problem of a very high level of computation and complexity, way beyond what's solvable. If that's true, what it means is that large parts of language must simply be unusable, because they will involve computability problems that can't be solved.
“In itself that is not a surprising conclusion, because we know it's true: we know that large parts of language are just unusable. In fact, the good part of psycholinguistics (the study of how language is processed) picks as its data things that are unusable, things like a “garden path sentence”. Those are all selections from parts of language that are unusable, and they're interesting to test for that reason; they give you interesting things about processing that are learned from that part; these can be quite simple sentences, they just can't be understood. So we know that large and scattered parts of language are completely unusable. That shouldn't bother anyone who thinks about language from the point of view of real biology, not pop-biology. In pop-biology everything is supposed to be usable and well adapted, remember? But in the real world everything is a mess and nothing works at all.
“It just works well enough so you don't die. It's got to work well enough so that you can reproduce, so you can go on; but nothing has to be usable, and most things aren't. It's true of the study of vision. It's true of the study of motor mechanisms. It's true of every aspect of psychology and physiology.
“There is that mysterious core that are just the questions that we are humanly interested in (for the most part) in which you can only stare and puzzle. These may reflect other aspects of human cognitive capacity. It may mean they are the kinds of questions that we are not designed to answer. It could be. Or it could be some other reason. But that’s where things stand at the moment.”
We are broken… ;(