“You often hear about how important it is to write ‘readable code’. Developers have pretty strong opinions about what makes code more readable. The more senior the developer, the stronger the opinion. But, have you ever stopped to think about what really makes code readable?”
When a programming language becomes very, very complex, like C++, what happens is that each person will use it in a different way, and no one will manage to fit the whole feature set in his mind. To follow his comparison to human languages, we end up with a dialect-like situation: people speak multiple variants of a single language, yet have a hard time understanding each other.
My conclusion is that languages should have a feature set that is large enough to provide good expressivity, but not so large as to cause the emergence of such code dialects. It’s arguably a fragile balance, and may be a subjective one: if you ask me, the Go programming language is quite balanced on this front, but I’ve recently had quite a long argument with someone around here, moondevil I think, who argued that this language went way too far in the name of simplicity.
By the way, I'm not surprised to see the opinion that many features are best coming from someone who likes C++ and C#. Both languages have gone pretty far down the path of adding every feature under the sun for the sake of expressiveness and programmer freedom. In C#, there is even a language keyword, yield, which to the best of my knowledge only serves the purpose of implementing iterators in a slightly more elegant way.
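To illustrate (a minimal sketch; the Evens method and its caller are made-up examples, not anything from the article):

using System;
using System.Collections.Generic;

class IteratorDemo
{
    // yield makes the compiler generate the IEnumerator state machine
    // that you would otherwise have to write by hand.
    static IEnumerable<int> Evens(int limit)
    {
        for (int i = 0; i < limit; i += 2)
            yield return i;
    }

    static void Main()
    {
        foreach (int n in Evens(10))
            Console.WriteLine(n); // 0 2 4 6 8
    }
}

Elegant, sure, but it is still sugar over something you could write yourself, which is what I meant by "slightly more elegant".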
Neolander,
I was going to say something similar. Code in either simple languages or complex languages can be difficult to read and understand for different reasons.
Consider the utterly simple computer-language grammar used by a Turing machine. Every step is so trivial that almost anyone should understand it, but so many steps are required that expressing any algorithm of significant complexity would make anybody's head explode. Too many steps are needed to solve the target problem. It's inefficient.
A balance is needed, but I don’t think a universal one is possible as it depends on both the programmer as well as the task at hand.
Let's all return to the basics: Lisp, or even simpler, Scheme!
Powerful, abstract, expressive. Why so many languages popping up, why not just… Zoidberg?
Kochise
Read this article; it makes the same point about expressiveness:
http://archive.arstechnica.com/cpu/4q99/risc-cisc/rvc-1.html
Kochise
I read everything on two levels — literal and spiritual.
A funny thing happens if you can read like an outsider. It’s hilarious, sometimes.
Angels make people say things they are unaware of. Wisdom from the mouths of babes is an example.
This is me before my spiritual awakening, not quite this bad:
http://www.youtube.com/watch?v=eB5VXJXxnNU
Out of all the details you had to pick, you picked that one as pointless syntactic fluff?
Don’t say that around Python programmers or users of various functional languages unless you want to get into a big argument.
Thanks, I'll keep that in mind. I've been meaning to try functional programming for some time now, but never truly got around to it.
From afar, it always seems to be a mathematician-oriented way of coding, just like LabVIEW's graphical dataflow programming feels like something designed by and for electronics engineers.
Try Erlang and you'll never go back: functional like Lisp, close to the bare metal like C (thanks to the <<binary>> syntax).
http://www.youtube.com/watch?v=xrIjfIjssLE
http://learnyousomeerlang.com/
http://www.tryerlang.org/
Kochise
So, I have tried the "Try Erlang" interactive shell tutorials, then got into Learn You Some Erlang for Great Good, but at this point the language just seems too arbitrary to me.
While I like some core concepts like atoms, list comprehensions, or the message-passing concurrency model, what strikes me as annoying is the design of the language itself, such as the silent conversion of lists to strings, the brainf**k-like operator syntax, or the way comparisons of atoms with integers yield arbitrary results rather than raising exceptions.
Perhaps I'll try Erlang again if I need to at some point, but for now I'd rather look for a "cleaner" language to try out functional programming. Perhaps a Lisp dialect or Haskell can do the trick, since many people around here seem to like those.
Then go for Scheme, which is a stripped-down version of the original Lisp. Lisp has evolved into Common Lisp (Lisp with OOP features), yet there isn't a single implementation that is fully CL compliant. Try these two:
http://ufasoft.com/lisp/ (the Windows logo, exe file), but you'll need several Visual Express 2010 dependencies since the Lisp IDE is Visual Studio based. About 400 MB.
http://www.sbcl.org/ is another pretty compliant and powerful one, yet with no IDE; the gurus use SLIME + Emacs. Or an Eclipse-based plugin. Or WinSBCL from http://code.google.com/p/winsbcl/ or LispIDE from http://www.daansystems.com/lispide/
BTW, a comparison between CL and Scheme: http://symbo1ics.com/blog/?p=729
Kochise
I tend to prefer languages like C++, C#, Ada and so on, because I am of the opinion that a language should be expressive enough to explore multiple abstractions and powerful enough to allow user types to behave like built-in ones.
Maybe that works for me, given the years I have been coding (since ’86) and my background in compiler design.
When I was young, I got excited about making new operators. Now, I see that as foolish, except where operators in math are already defined. Matrix and complex math is good with operators, but kids will foolishly use them where there is no precedent (no pun intended).
I think you would be better off just making built-in matrix and complex types… but there are many ways to define matrices (sparse, etc.).
cout << 1 << 3 << "What?" << endl;
I’m not fond of that. What if I want shift operators on numbers? It’s too damn verbose. I have something less flexible, but more practical.
printf("Name:%s Age:%d\n", name, age);
Instead, you can use this, since there is only one stream:
"Name:%s Age:%d\n", name, age;
Operators are an over-hyped example of using abstract names for function/method calls.
All modern languages except for C, Java and Go allow abstract names as function names/operators.
C++ is, a lot of the time, the wrong choice, if you ask me.
I’m more likely to use C.
I like what Pike had to say about the differences between C and Go:
http://commandcenter.blogspot.com/2012/06/less-is-exponentially-mor…
opinions are like assholes… come back when you’ve got something more substantial than one asshole’s opinion
Guess what else is like an asshole?
Really interesting article, and worth bearing in mind for anyone involved in teaching programming.
However, in terms of the wider argument, I think an important point the article misses is about consistency. One of the problems with natural languages like English is that the rules are not consistent (in the non-mathematical sense), which makes them very hard to learn. You can make a language easier to learn without reducing its expressiveness by making it more consistent.
It doesn't follow that consistency breeds simplicity (as Alfman pointed out, the simple rules of a Turing machine don't make programming in it easier). On the other hand, lack of consistency definitely results in complexity (like, for example, INTERCAL!).
I agree with the part about consistency. Even if another programmer has a very different style from my own it’s usually easy to figure out what they are doing if their approach is consistent. I don’t really care if another coder uses “iCoins” or “NumOfCoins” or “numberOfCoins” as long as their approach is the same in each block/function.
The code I find hardest to work on is when multiple programmers have hacked on it and each one uses a different style. The code tends to become a mess if consistency isn’t maintained.
I have to slightly disagree.
1. The code I find hardest to work on is code I’ve personally written and hacked on over the years, each time using my then-current favorite style.
2. I can write consistently-styled spaghetti code.
And some people consistently write spaghetti code
So, his point is that a big vocabulary lets a writer feel good about what he is writing, he feels he is saying exactly what he wants, but his subtlety is lost on beginners.
There is this thing that happens: an author can get mesmerized by his craft. He gets great satisfaction when he picks the right word, like he's playing sudoku. This is a vanity.
Good programming is good, but there are a lot of gay people who are fond of 50-line functional programs. Come on! It's a special case of elegance that doesn't apply very often!
Brevity is best for programming. That guy is not brief. LOL
Yeah, reminds me of reading some Perl code. It seems that a lot of Perl programmers will go out of their way to write code in the least number of lines possible, and to hell with it if anyone but Perl gurus can actually read it. And the end result looks like modem line noise on a terminal. It's like prog rock bands who will play solos for 40 minutes, if for no other reason than to show off, thereby alienating everyone in the audience except for the 1% who are also prog rock snobs. Note that I'm not dissing either Perl or prog rock, because I like both. Just bitching about the excesses of both.
In his article, he says that whether you should put code on one line or stretch it out over a few depends on who your audience is. I say that's horseshit. If there are two ways to write the code, you should always put it in the form that's easiest to read, especially since you never know who's going to be debugging your code 10 years from now. At least that way, even if experts are a bit irritated by the simplicity of it, you know everyone is going to be able to read it.
One day, when Larry Wall was writing Perl code, his daughter had a look at the screen and asked him: "Is this swearing, daddy?"
Sorry but this is where I stopped paying attention to what you said and evicted your previous statements from memory.
God’s not very anti-homo. He uses “homo” as an adjective, though. Sports are homo. Smelling farts is Sodom. Pets are homo.
I’m reasonably certain that you’re not God.
How the fuck are sexuality and attraction towards this or that gender related to coding skills?
Perhaps he meant "gay" with its original meaning, and happy people like 50-line functional programs.
I’ve seen stranger things written by him. It does liven up things here.
My sister tried to ban her boys from having toy guns. One picked-up a stick and said, “Guess what this is, Mommy?”
I asked God if He was racist. He said, “Sports”
Men that are fashion designers or in drama or in interior design, are frequently gay.
Most NBA players are black.
Functional languages are gay. That’s why they called it “LISP”. They can be good, perhaps.
Designers are "gay" (so to speak) because they let their emotions flow more and are more creative than the average redneck.
NBA players are black and, do not forget, at least 2 meters tall. Not the average white man out there.
Fighters and warriors are stubborn people with only two brain cells, so they don't oppose the commands they're given or think before going to the front.
God worshipers are people with a lack of realism and pragmatism, who find in their cult some comfort and power provided by a crusty and dusty book, because in real life they have none.
Just the most obvious examples of Darwinism, though: adaptation and evolution…
Kochise
Dude, don’t do acid. It makes you hallucinate and imagine things.
I was an atheist. Then, one day at age 26, reality didn’t seem quite right. Looking back, I was kinda retarded in a nerdy way. Anyway, spiritual awakening. I read the Bible. I would open randomly. It started talking when I opened randomly. Over the years, I tried various random techniques — it’s all the same. Any random technique works, but the key is you have to hold-up your end of the conversation.
God talks! He’s great. I love Him.
I enjoy what you write, because I like strange things and unexpected twists. But I’m not sure everybody here agrees with that.
It seems you are quite clever, but your technical knowledge and opinions get a bit distorted and even lost when you add these non-technical remarks.
In a discussion about code readability, somehow NBA players and gay brothers get involved. I know NBA player Chris Bosh has a major in computer science, but apart from that it seems totally irrelevant.
If you leave talking gods, LISP-coding gays, the CIA, tall black men and all their friends out of your comments, I'm sure it would make life easier for you and for the ones who think "WTF!?" with every comment you write.
Anti-psychotics are your friend. #notajoke
You're a kid who knows nothing of reality. Do you know why they say "Born Again"? It's because you enter a new reality, like a baby. It all makes sense, all that goog in the Bible. It's not goog.
“Filled with the Holy Spirit”
Most people have no idea. I’ve never used LSD, but those people know things.
Crack open a book, randomly. God will talk.
I took a couple of philosophy courses. I was an atheist at the time. I got furious because they made a big deal about the distinction between the world as it is and as it appears. WTF are you wasting my time with this for? The fundamental categories for philosophers are "materialists", like I was, who think there is just the world as it is; "idealists", who think there is just the world as it appears; and "duality" people, who believe in both.
Some day maybe you get a spiritual awakening, maybe not.
“Spiritual realm” “Material realm” “Delusion” if you like.
—-
I hope the FBI doesn't prosecute me if you go crazy and kill people after opening the Bible and breaking with reality!! That's like yelling "jump!" It'd be funny if the radio talked to you or something and you killed one of their agents. ROFLMAO Deserved it.
No, you’re definitely a bit of a looney-toon, and at times a little unpleasant about it too. (Takes one to know one, as they say.)
Unless you write your code specifically for educational purposes there’s no way you’ll know who will have to understand your code in the future. Obviously then, you should always write your code as easily understandable as possible.
Uh yeah, that’s why we have Basic and other introductory languages.
Why exactly should the larger base of the population understand how to code? It’s like saying the larger base of the population should know how to read and draft architectural blueprints. I don’t really see how it would benefit most individuals or the society at large.
I know this may come as a surprise to some geeks but not everyone is interested in learning how to code.
If the developer writing the code has a clear vision, then he/she is halfway to code readability. Likewise, even if you can afford a beauty like Python, without clear logic your code will be unreadable, even for you, after a while.
The human brain has some limits on its capacity for processing information, so it deals with complexity by dividing information into chunks. Short-term memory can retain only about 5-9 things at one time, according to George A. Miller's studies.
Respecting this limitation makes code readable.
I have a gay brother. We were walking in the Seattle Fisherman's Wharf. Someone was handing out free samples. It was some god-awful combination of foods. I was not about to try it.
TempleOS, you are a fascinating creature. What you write absolutely doesn’t make any sense to me, but the thought processes behind it must be of a rare variety.
Are you some kind of Burroughsian cut-up machine? You certainly read like one. Wouldn’t that be a just irony if so.
PS. Where is the necessary mods warning here? It is one thing to liven things up, as someone else has said, but consistently posting to get a rise or being knowingly offensive is something else entirely.
Orf.
I was always a bit of a Basic guy. I used several Basic dialects in the past and I liked them because of their readability (the syntax and keywords themselves).
Have you guys ever tried Ruby? You can actually write code that talks for itself. Bad Ruby code is identified when it's not self-explanatory.
I really hate Perl and Python for the cryptic spaghetti mess you see in every project written in those languages.
Sorry, but if someone has a problem understanding what a whole number is, they shouldn't be programming.
I didn’t get this either. What is so baffling about NumberOfCoins?
It’s a Number. It represents Coins.
X or Number are far more confusing. What is X supposed to be? You’d have to look at the assignment itself (if it wasn’t declared up front). Never mind what it represents.
Number is a bit better, but you’d still have to look elsewhere to find out what it represents.
And before anyone assumes I'm a well-versed programmer: I'm fairly good at Excel automation with VBA scripting, but that's it.
Seriously, who *doesn’t* know what a whole number is? Other than Tarzan?
And how would NumberOfCoins hide the fact that it’s an integer value? The word Number is even there.
Oh I see.
Hey, I have an idea; why not use int1, int2, int3 … intXXXXX? That way there’s no “confusion” regarding the type of the variable and obviously your code would be much more readable…
For example, with C#, Microsoft has pretty standard naming conventions for a reason.
Bools begin with Has, Can, Is. E.g. dataReader.HasRows, String.IsNullOrEmpty, etc.
I am sure similar conventions are followed in languages I am less familiar with.
The rules for 50,000,000 lines of code are different from 140,000 or 10,000 or 1,000.
I aspire to be more libertarian than hall-monitor. I stop myself when I get an impulse to make up rules to keep things extra tidy.
Clearly, there is no end to the number of policies you could invent to keep code tidy. Moderation, I guess. Don't pat yourself on the back because you can make up lots of rules. It becomes the most important thing, if you're not careful. Even beginner programmers might want to develop an attitude like a manager and spend a lot of thought on what the code accomplishes. Code which has no purpose is not good, no matter how tidy.
Err no they aren’t.
Indeed. The only actual difference is the degree to which you can get away with ignorance of or indifference towards such rules. When complexity increases quadratically with line count, cowboy hackery that works great at a couple dozen lines and is just about still tolerable at a few hundred will eventually exceed even the most brilliantly photographic and pathologically OCD mind if it continues to scale that way. Really, if you go over a hundred lines and don't already own a copy of Code Complete and a link to thedailywtf.com, then you really should stop there awhile and take stock of your position, if you've any sense at all.
In my own comical case, I got to 2500 lines of Grande Spaghetti before I crashed and burned most horribly. Call me a total lightweight (2500? Pfft!), but at least it taught me my most valuable lesson early rather than later in my programming life (i.e. well before anything remotely business-critical), and resolved me to learn how to construct software properly in future. Whereas if The Daily WTF is anything to go by, there’s a frighteningly significant chunk of professional practitioners who go magnitudes bigger than that and still never learn anything no matter how disastrously it all ultimately collapses around and on top of them.
That is a property, not necessarily what the field will be called. I had occasion to write some code a while back to remove event handlers programmatically (clean-up code for a bit of legacy code), and to do that I needed to remove the "TextChanged" event handler on a generic Control instance. Guess what Microsoft called the private member variable? "EventText". Oh, joy. Every other event is called pretty much "Event<EventName>", so "ValueChanged" would be "EventValueChanged" and "SelectedValueChanged" would be "EventSelectedValueChanged"… so this is extremely inconsistent!!
As for your example, that is SqlDataReader; the IDataReader interface has no HasRows property, which is a complete PITA when it comes to sharing code between different database back-ends. What you end up doing is calling Read(), and if it returns true, you know you're cooking with gas. This works well for loops, because you end up with: while(reader.Read()) {…}, but if you have occasion to be using an IDataReader with a single row**, it's a bit of a bind. (I'm not a great fan of ADO.Net; I think it's pretty poorly designed, and I pretty much use EF Code First now for everything.)
** Using an IDataReader for a single row is an edge case, but as I generally cache all data in class instances regardless, DataTables are mostly complete overkill.
My point was that bools tend to have names that when plugged into an if statement make it almost read like English.
if (!String.IsNullOrEmpty(myString)) {
//some code.
}
I tend to use similar names in other programming languages. For example, in JavaScript:
if(!isAnimating){
//some code.
}
I know Microsoft's own variable naming schemes can be a bit iffy in places. But it's better with the newer versions of .NET, IMHO.
Actually, the fact that NumberOfCoins is declared as an Integer is itself outright misleading: it should actually be a whole number (i.e. an integer with a minimum value of zero and no upper bound) since you can’t hold negative coinage in your pocket but in theory could have any number above that (tho’ the US Mint and laws of physics might put a practical cap on that eventually).
(Also, I strongly believe any C*/Java developer who bangs on about the vast importance of Integer and Float needs to be locked in a room full of Type theorists and Haskell and Eiffel programmers until they learn what a real type system is.>:)
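In mainstream terms you'd have to build that constraint yourself. A rough sketch in C# (the WholeNumber type and its guard are invented purely for illustration, not a standard library type):

using System;
using System.Numerics;

// Purely illustrative: a non-negative, arbitrarily large count,
// roughly what "whole number" means above.
public readonly struct WholeNumber
{
    public BigInteger Value { get; }

    public WholeNumber(BigInteger value)
    {
        if (value < 0)
            throw new ArgumentOutOfRangeException(nameof(value), "A coin count can't be negative.");
        Value = value;
    }
}

Which is exactly the point: the "obvious" built-in Integer says nothing about the lower bound that the domain actually demands.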
As for should or shouldn’t be ‘programming’, there are lots of folks in the world who’d recognize the term ‘number’ or ‘whole number’ a lot easier than ‘integer’. Consider any spreadsheet user, for example: they might be a domain expert in finance and accounting, but they’re not what you’d call a ‘conventional’ programmer by any measure. For them, they don’t care what a cell’s ‘type’ is called, only that its contents adhere to their required rules and holds its exact value 100% reliably at all times.
While you're at it, consider the fundamental difficulty, not to mention overt danger, of handing Floating Point types to such users. Especially those that deal with precision-critical tasks like money management, where Very Important Folks like shareholders and the IRS might not be so happy to take "just some FP rounding errors" as an excuse.
Frankly, the only thing nastier than dealing with numbers in a computer program is dealing with text, as (contrary to traditional programmer beliefs) most of the planet does not speak ASCII, many of those middle- and far-eastern scripts are crazy hard to handle correctly, and even quite unexotic ones can have a few nasty tricks up their sleeves.
So, y’know, while it’s easy to say “Integer or GTFO”, being able to recall what ‘integer’ means is only a small first step down the actual rabbit hole – and I think even experienced developers may easily overlook just how deep the devil really goes.
No, but you could be in debt.
Yes, those are things to watch out for with numbers in programming, but none of it has any relevance to the dubious claim that NumberOfCoins "hides" anything about the nature of the data it stores.
Do not mistake Coins, which are shiny pieces of metal of various shapes and sizes, for Money, which is an abstract concept, frequently overdrawn.
Oh sure, if one really insists one can always write stuff like:
int NumberOfCoins = 5;
int DaylightRobbery = -5;
NumberOfCoins = NumberOfCoins + DaylightRobbery;
But then I wouldn’t blame either the language nor the reader, but the smarmy smartass that wrote it – because that’s exactly the sort of rotten communication skills that rightly give practitioners a bad name.
As for dealing with money, I still wouldn’t want to type it as Integer or Decimal (and definitely never Float). I’d declare my own Money type or Dollar type or whatever was the most descriptive name appropriate to the task and then use that. That clearly describes my intent in creating that variable and properly decouples my own program logic from underlying hardware/language implementation.
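Something along those lines, roughly; just a sketch, with a hypothetical Dollars type standing in for whatever name best fits the domain:

using System;

// Hypothetical domain type: program logic talks about Dollars,
// while the decimal representation stays an internal detail.
public readonly struct Dollars
{
    public decimal Amount { get; }

    public Dollars(decimal amount) { Amount = amount; }

    public static Dollars operator +(Dollars a, Dollars b) { return new Dollars(a.Amount + b.Amount); }
    public static Dollars operator -(Dollars a, Dollars b) { return new Dollars(a.Amount - b.Amount); }

    public override string ToString() { return Amount.ToString("C"); }
}

Even a thin wrapper like that stops you from silently mixing a dollar amount with some raw, unlabelled number, which is the decoupling I'm talking about.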
BTW, this is why I've grown a tad contemptuous of C*/Java type systems and the folk who hold them up as the One True Way Typing Should Be Done. Compared to the sorts of type systems you can find in functional and other 'exotic' languages, C*/Java typing is downright crude in concept and execution, limited in its expressiveness and mindlessly bureaucratic to use.
A good type system ought to be a positive productivity tool for expressing your program’s intent as effectively and efficiently as possible. But too often C*/Java typing is used as a personal crutch for lazy thinking, sloppy coding, premature ‘optimization’ and OCD tendencies: negative programming habits blessed with official seal of approval. Such a mismatch/shortfall is not only a technical deficiency but a cultural one as well. My earlier (long) post has more discussion of why cultural problems are every bit as important as – and fundamentally inseparable from – the technical ones, so I’ll not repeat it here.
hhas,
“As for dealing with money, I still wouldn’t want to type it as Integer or Decimal (and definitely never Float). I’d declare my own Money type or Dollar type or whatever was the most descriptive name appropriate to the task and then use that.”
If a language has a decimal type (like databases do), that would be ideal, why wouldn’t you want to use that?
In practice though many web languages have such weak typing you don’t have much choice (javascript/php). Floats aren’t great for cash, but it’s not usually a noticeable problem outside of banking because typical cash arithmetic doesn’t produce irrational numbers, so rounding should be sufficient. However to be honest I wasn’t even sure which way fractional taxes etc *should* be rounded.
http://boston.cbslocal.com/2010/10/01/curious-if-6-25-sales-tax-wil…
If you buy two items for $1 each, and tax is charged at the end, it could result in discrepancies if you return them individually.
Purchase:
subtotal = $1 * 2 = $2.00
tax = $2.00 * 6.25% = $0.125 # round up or down?
total = $2.125 # round up or down?
Obviously if the purchase price is not evenly divisible by the number of products, then there is no way for all the returns to be valued the same if they’re to equal 100%. Does anyone know whether the tax on returns typically gets recalculated in the context of the original transaction? If they are done as new transactions, one may be able to exploit the rounding differences to earn a penny
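To make the rounding question concrete, here is a small sketch using a language that does have a decimal type (C# here, since PHP/JS lack one); the amounts are the ones from the example above, and the choice between the two rounding rules is exactly the policy question, the arithmetic itself is exact:

using System;

class TaxRounding
{
    static void Main()
    {
        decimal subtotal = 2.00m;               // two $1 items
        decimal taxRate  = 0.0625m;             // 6.25%
        decimal tax      = subtotal * taxRate;  // exactly 0.125, no binary float error

        // Same exact value, two different answers depending on the midpoint rule:
        Console.WriteLine(Math.Round(tax, 2, MidpointRounding.AwayFromZero)); // 0.13
        Console.WriteLine(Math.Round(tax, 2, MidpointRounding.ToEven));       // 0.12 (banker's rounding)
    }
}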
“BTW, this is why I’ve grown a tad contemptuous of C*/Java type systems and the folk who hold them up as the One True Way Typing Should Be Done. Compared to the sorts of types systems you can find in functional and other ‘exotic’ languages, C*/Java typing is downright crude in concept and execution, limited in its expressiveness and mindlessly bureaucratic to use.”
It’s not clear to me what you are criticizing. Both of these languages let you create your own Dollar class if you wanted to.
I do criticize languages for not having unsigned types (Java, VB); this occasionally causes problems when the data type is truly not supposed to be signed.
http://javamex.com/java_equivalents/unsigned_arithmetic.shtml
I'd use Decimals to represent the cash values, sure, but to explain what the variable actually represents I would want to type that as, say, USD or UKP. Or, if the program allowed users to mix-n-match, I'd type the variable as Money, then define a Money subclass for Decimals that also indicates the monetary system and provides methods for combining and converting disparate values (e.g. $5.50 + £2.20 as Yen).
You can easily get rounding errors on just a couple decimal places, e.g.:
>>> 324.21 * 100 // 1 / 100
324.2
The pennies become pounds, and the pounds become big honking liabilities.
You can mitigate that a bit by representing monetary values in their smallest units, e.g. cents/pennies rather than dollars/pounds. But the mere fact that you cannot trust floats to be accurate should be a red flag, and I wonder how many systems implemented that way even bother to mention all these caveats to their users. To give another example:
>>> 0.7 * 0.7 == 0.49
False
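(For the record, the surprise isn't Python-specific; a quick sketch of the same check in C#, where the binary double fails it and the base-10 decimal type doesn't:)

using System;

class FloatVsDecimal
{
    static void Main()
    {
        Console.WriteLine(0.7 * 0.7 == 0.49);    // False: binary floating point
        Console.WriteLine(0.7m * 0.7m == 0.49m); // True: exact base-10 decimal
    }
}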
The problem with the likes of C*/Java languages is primarily a cultural one, which is then reinforced by the design of the language itself. Such user-language combinations have a highly mechanistic, reductionist view of domain problems, where everything is dragged down to the lowest, generic level; thus £2.20 is stripped of most of its core meaning until it can be viewed and treated as just another decimal number, 2.20.
Users of such languages are encouraged and conditioned to throw away vital meaning in this way because that is what the language makes easiest for them to do. Languages are not simply a one-way tool: they also have a strong feedback element that influences the ways in which we do (and don't) think. And once such languages become entrenched in mainstream business and education, their whole philosophy and behavior becomes cyclical and self-reinforcing, until their users cannot conceive of any other way to think or behave.
Compare and contrast to the Lisper philosophy of taking the base language and then building it up with domain-specific vocabulary until it is able to efficiently solve problems within that domain.
Of course, to perform this bottom-up, domain-centric building strategy effectively, the coder needs to have a good understanding of that specialist domain themselves. Which, alas, is something a lot of C*/Java devs have absolutely no interest in or desire to do. They just want one group of people (Architects, Business Analysts, User Liaisons) to hand them a complete and comprehensive finished Specification which they can then unthinkingly convert to code form and hand off to another group of people (Testers, Users) to check that it does indeed fulfill requirements.
Many such programmers actively and jealously guard their deliberate ignorance of – and complete disconnect from – the actual problem domain in question, often aided and abetted by equally ignorant management which believes that if they aren’t hammering out code for the full 50 hours per week then they aren’t actually working. The notion that a programmer might spend a day/week/month sitting with users and learning about what they do and how they think and work is anathema to both: to the code lover it means stepping outside their comfort zone and having to perform hard mental (‘menial!’) exercise; to the PHB because it turns their collection of ten-a-penny disposable code monkeys into valuable and hard-to-replace company assets.
The only winners in this are OCD jobsworths and toxic martinets. And the ultimate losers are the end-users who have to do all the actual (i.e. revenue-generating) work, because when all this shit flows downhill, guess who's left standing at the bottom? This is not programming, it is self-serving code masturbation and total willful abdication of responsibility; the intellectual equivalent of crackhead tweaking.
Yes, it’s a rant, but not everything that’s wrong in programming and programs can be boiled down to a trivial purely technical problem with an equally pat technical solution: mainstream programming culture is a serious source of major modern ills too. I cannot abide people who, when faced with a problem they do not understand, drag it down to their own level of ignorance, laziness and effed-up understanding, rather than build up their own level of understanding until they can tackle it competently and effectively. I’ve long since fixed such toxic tendencies in myself, so I’ve no tolerance now for others who continue to make my and others’ lives needlessly painful by failing/refusing to do the same.
That’s just a degenerate instance of the much larger general requirement to specify appropriate bounds on any type.
It also illustrates my previous point perfectly: you are missing the woods for the trees, because you have been trained to think in terms of low-level literal machine implementation (bits and bytes in memory) rather than expressing your high-level abstract requirements completely, concisely and with total precision.
IOW, you’re thinking about numbers as a CPU thinks about them, not as a mathematician or scientist or accountant or economist thinks about them. But the whole point of writing a program isn’t to make the machine happy, it’s to satisfy the exact requirements of the domain expert who uses that program. Step into their shoes and walk around in them awhile, and perhaps you can shrug off such ingrained preconceptions and prejudices and learn to see the bigger picture.
I like Native American English, e.g. "Me get Fire". I think it's a no-brainer to say "NumCoins", not "NumOfCoins". "Coins" is best. I'd use "i".
I'm always brief. I'd have to be in a strange mood to use "NumSomething". It's redundant. Actually, I use it when I make an enumeration: the count of enumerated entries, I call "NUM".
//Partition Types
#define PT_NULL 0
#define PT_FAT32 1
#define PT_NTFS 2
#define PT_NATIVE 3
#define PT_NUM_PARTITION_TYPES 3
Yawn
+1 for common sense. If a person needs “NumOfCoins” over “NumCoins” or “Coins”, labels are probably the least of his/her coding hurdles to overcome.
I'm sure some of you will wet your undies when I say this, but instead of expecting code to be written so a 5-year-old can understand it, perhaps more should be expected of "coders'" abilities to comprehend. I've never known a good coder to be tripped up (although maybe annoyed) by `poor` labeling.
I don't know if they teach this, but there was a language called "FORTRAN" which had this kinda cool feature that variables beginning with the letters I through N were automatically integers and the others were floats.
This comes from the traditions in mathematics where you use “i,j,k,l” for counting indexes on summations and stuff.
"Hungarian notation" is a voluntary return to the FORTRAN idea: they put prefixes on variables.
I don't really make any rules for my code. Gay people get carried away with coding standards. They're okay, but kind of a big distraction. My first company, Ticketmaster, was very libertarian. As a matter of fact, I look back and cringe at some of the awful messes I made in code, and nobody said anything.
FTA: “A beginner is more focused on the actual vocabulary of the language than what the expression of that language is trying to convey.”
This isn’t because they’re beginners, it’s because they’re badly taught. And much of the blame for this falls squarely on mainstream languages and their existing users.
As Guy Steele put it: “The most important concept in all of computer science is abstraction.” [1]
Or, as Seymour Papert explained it to 7 and 8 year-olds:
1. This is a Word.
2. This is how you execute a Word.
3. This is how you add your own Words.
Compare with how popular ‘introductory’ languages like JavaScript and the current plague of “learn to code” sites do it.
Khan Academy’s Programming Basics course structure [2]:
1. How to Read Documentation
2. Using Math Expressions
3. Intro to Variables
4. More on Variables
5. Incrementing Shortcuts
6. If Statements
7. Booleans
8. If/Else – Part 1
9. If/Else – Part 2
Codecademy’s PHP course [3]:
1. Welcome to PHP
2. Control Flow: If/Else
3. Control Flow: Switch
4. Arrays in PHP
5. Loops: For and ForEach
6. Loops: While and Do-While
7. Functions in PHP, Part I
8. Functions in PHP, Part II
9. Object-Oriented Programming, Part I
10. Object-Oriented Programming, Part II
11. Advanced Arrays and Maps
Codecademy’s JavaScript course [4]:
1. Introduction to JavaScript
2. Functions
3. ‘For’ Loops in JavaScript
4. ‘While’ Loops in JavaScript
5. Control Flow
6. Data Structures
7. Objects I
8. Objects II
I could find many more examples, but that’s enough to illustrate some key points. First, look at the topics covered.
– In all courses there are numerous chapters dedicated to the subject of flow control.
– The KA JS and CA PHP courses call out an apparently random selection of data types (Boolean, Array, Map), though what happened to all the other standard types (Int, Float, String, Date, etc.) it’s hard to say.
– CA’s JS course at least has a complete section dedicated to Data Structures, but it appears after flow control.
– KA gives special prominence to Incrementing Shortcuts, yet doesn’t even touch on abstraction. The closest users get to dealing with functions is to be shown how to poke some black-boxed mystery meat that causes shapes to appear on screen.
– The CA PHP course leaves Abstraction to last. Only the CA JS course tries to introduce it early on, while minds are still relatively clear. And neither actually gets to the core of what Abstraction actually is and why anyone should care.
There’s a pattern emerging from the above: one that strongly suggests its authors see programming in purely mechanistic, reductionist terms, are happy to tolerate or even embrace arbitrary complexity, and will blindly follow and replicate long-established tradition without question. Programming is primarily seen as a linear task involving the sequential issuing of instructions – Do Step A, then Step B… then Step Z – with an educational approach that progresses much the same: This is Feature A, this is Feature B… this is Feature Z.
Even when the CA JS course does try to introduce functions early on, it explains the motivation thus: “DRY: Don’t Repeat Yourself.” A trendy insider acronym, to be sure, but utterly worthless for pedagogical purposes. Why should learners care if they repeat themselves? They already have excellent and highly efficient tools for managing repetition – Copy, Paste, and Find & Replace – along with plenty of experience in using them.
So regardless of whether such a Functions chapter comes first or last, the most likely response will be “Pfft, functions, who needs ’em?” followed by several more years of spaghetti code. If they’re lucky, such an approach will collapse sooner rather than later, while they’ve still some chance to undo their rotten ways (as in my case) before that mindset becomes irremovably ingrained. But it’s a horribly expensive and counter-productive way to go about learning or doing anything.
So let’s go back to LOGO, which combines Lisp philosophy with solid pedagogical principles, and look at how it does it. First up: Papert’s objective was not to teach programming principles and practices, it was to teach structured thinking and problem-solving skills; something of great value in a good many fields outside programming as well. That students also learned some programming skills was simply a nice side-benefit of using LOGO as the platform for this.
In LOGO, students are not taught to write procedures to avoid repetition; they are shown how to expand a language's existing vocabulary to better tailor it to their own needs. This is why LOGO's limited built-in vocabulary is not a flaw (as real programmers'd naturally see it) but actually a key feature: when the only drawing operators you have are FORWARD and LEFT, you very quickly reach the limit of your patience when trying to perform complex operations. At that point, a tutor can step in and say "let me show you an easier way to do what you're doing". Thus the student very quickly learns to construct more advanced words such as SQUARE, TRIANGLE, CIRCLE, HOUSE, TREE, STREET,… through composition of existing words. And because students are doing all this exploration at a really basic level, it is quick and easy for them to learn by making and fixing mistakes, and by experimenting with different ways of assembling their abstractions (e.g. they might initially build HOUSE using only FORWARD and LEFT, and then independently or with a little prodding realize it could also be composed more efficiently from more powerful RECTANGLE and TRIANGLE words).
Whereas if you go back to something like the KA JS course, the closest you ever get to functions is being handed a bunch of mysterious black-box incantations like RECTANGLE(0,100,200,400) and taught how to chant it in order to obtain a desired result. Superficially fulfilling, but leaves students with absolutely no understanding of what goes on beneath, never mind any idea of how to create incantations of their own.
Lastly, it’s not just the course authors who are at fault: the fundamental design of mainstream languages also greatly encourages and compounds all these problems. To illustrate, which of the following words is of most importance to a user: break, export, return, case, for, switch, function, this, continue, if, typeof, default, import, var, delete, in, void, do, label, while, else, new, with, checkMyBankBalance?
Hopefully you get ‘checkMyBankBalance’. But that being so, why does the language assign such special significance to all of the other words, going so far as to give them special colors, non-standard behaviors, and whole chapters of documentation explaining their use? By devoting huge attention to these smallest and lowliest building blocks, while treating the user’s own words as wholly generic, the language itself is teaching its users to believe that these primitive features are the most important words in an entire program, and therefore what users need to devote all their attention to in order to understand what the program actually does.
IOW the original article completely mis-identifies the root reason why novices can’t read code effectively. The fact that it stumbles across a vaguely appropriate response (use languages with smaller vocabularies) is incidental, and still doesn’t fix the real problem. Until the faulty thinking of mainstream language creators and users towards language-related HCI is rectified, ‘production languages’ will continue to be baroque backfiring monsters, while more parsimonious vocabularies remain relegated solely to ‘teaching’ or ‘toy’ status – for kids and thickies only, and definitely not for Real Work.
[1] Foreword to Scheme and the Art of Programming (Springer & Friedman), p. xv
[2] https://www.khanacademy.org/cs/tutorials/programming-basics
[3] http://www.codecademy.com/tracks/php
[4] http://www.codecademy.com/tracks/javascript
If a person asked you for a 10-ton bridge and you built a 20-ton bridge and charged him for it, would that be good?
You guys act as though overengineering doesn’t exist.
If a person wants a wing for a 737 and you give him a 747 and brag about how good you are, am I supposed to be impressed?
Take a look at HD Audio. Take a look at USB.
A good engineer cuts it as close as possible to the minimum needed by the spec. A bad engineer doesn’t do enough or does too much. A really bad engineer way way overshoots the scope of the project.
I talk with God. I am not worried about anything. If I got ripped off, I could buy a lottery ticket. I win 20 lotteries a day. The world is 100% just. I could be compensated with good health or a brilliant idea. My enemies could get a Deepwater Horizon or the doldrums. God claimed Deepwater Horizon. God claims everything. A Greek Muse is fun to think about.
The game company Zynga ripped people off. They moved on. Zynga now struggles to meet payroll. That's not fun. They deserved money for keeping all those people on payroll so long. Why should a guy get money for a day's patent idea? God makes him do nasty paperwork for the patent. Work.
I don’t have a license. I walk.
So we all agree 7-bit signed ASCII is pretty stupid, now that parity and 7-bit transmission are not around. You have to clobber people over the head to make them see the obvious, don't you. They'll zombie-walk into doing a new system with ASCII. UTF would be the correct choice for all source code. I don't like that, though: variable-length symbols in my StrCmp().
I made my own compiler. It works best with 64-bit values. I forget the details, but I frequently just use 64-bit args and return values, even when they're the "wrong" size, because I worry about avoiding needless zero-extensions or sign-extensions.
I wasn't careful about signed and unsigned, but one day I made everything signed.
My logic is like the BASIC language, where beginners just have one type of integer. I want just one type of integer in my language: 64-bit signed.
It is actually a brilliant idea, like the stupid ideas they do with C# to make it easier for beginners that never make it simpler. LOL
The exact same logic for the C# language applies to one signed 64-bit integer type in the language: beginner friendly… misguided, perhaps.
What kind of retard thinks memory management can be made simpler than MAlloc/Free? Retards.