The Commodore Datasette recording format is heavily optimized for data safety and can compensate for many typical issues of cassette tape, like incorrect speed, inconsistent speed (wow/flutter), and small as well as longer dropouts. This makes the format more complex and way less efficient than, for example, “Turbo Tape” or all other custom formats used by commercial games. Let’s explore the format by writing a minimal tape loader for the C64, optimized for size, which can decode correct tapes, but does not support error correction.
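The pulse-pair scheme the KERNAL format uses (each bit is a pair of pulses: short+medium is a 0, medium+short is a 1, long+medium marks the start of a new byte) can be sketched in a few lines. This is an illustrative sketch, not the article's actual loader: the pulse lengths and thresholds below are approximate, and a real decoder would also handle checksums, parity, and resynchronization after bad pulses.

```python
# Minimal sketch of C64 Datasette pulse-pair decoding (no error
# correction). Pulse lengths are in microseconds; nominal values are
# roughly short ~352us, medium ~512us, long ~672us. The classification
# thresholds below are chosen for illustration only.

SHORT, MEDIUM, LONG = "S", "M", "L"

def classify(pulse_us):
    """Bucket a pulse by its length (thresholds are illustrative)."""
    if pulse_us < 420:
        return SHORT
    elif pulse_us < 580:
        return MEDIUM
    return LONG

def decode_pairs(pulses_us):
    """Decode a stream of pulses, two at a time, into bits and markers.

    Encoding (per the KERNAL tape format):
      short+medium  -> bit 0
      medium+short  -> bit 1
      long+medium   -> byte marker (new data byte follows)
    """
    out = []
    for first, second in zip(pulses_us[0::2], pulses_us[1::2]):
        pair = (classify(first), classify(second))
        if pair == (SHORT, MEDIUM):
            out.append(0)
        elif pair == (MEDIUM, SHORT):
            out.append(1)
        elif pair == (LONG, MEDIUM):
            out.append("BYTE-MARKER")
        else:
            out.append("?")  # a real loader would resynchronize here
    return out

# A byte marker followed by bits 1 and 0:
print(decode_pairs([672, 512, 512, 352, 352, 512]))
```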
I’m no expert, but sometimes I wonder if modern computer classes and schools in general are on the right track by focusing solely on modern systems like Chromebooks and iPads. Wouldn’t it be better to teach kids programming in BASIC, with limited resources, on, say, C64 emulators?
I agree with your sentiment. Learning on a more basic hardware level can really help you understand a computer. I personally tried some assembly, which I was horrible at, but it did open my eyes to how a computer actually works. I still dabble in QBasic from time to time, albeit QB64.
At the school (grades 10-12) where I am currently doing my vocational training (as a teacher), the students use Arduinos in courses that are more technically oriented and interface with hardware. Later on they will build a small autonomous vehicle that can follow marked paths.
In pure programming courses they program in Java. At younger ages they might use Scratch and Python is pretty popular in schools too.
No need to dig up 30+ year old hardware that has poor support nowadays. Sure, you could get the kids to do it, but they would constantly question why you would force them to use something you’d find in a museum. If you want them to learn something, you don’t need that distraction; you want to focus on the learning goals at hand.
Thom Holwerda,
I am on board with your intent; however, I think microcontrollers would be a better target personally. Nothing wrong with covering the history of tech in class, but I don’t see either a practical or an educational advantage in learning to program on legacy systems.
Too often developers learn high-level languages like JavaScript or PHP, and it’s hard for us to collectively unlearn the bad coding practices coming out of the web industry.
Also, it would be cool if the new generation of coders could learn to do low-level programming in safer languages like Rust rather than C, where every generation ends up repeating the same bugs.
I very much agree, but then wouldn’t it be better, for educational purposes, to use C, if only to learn to be aware of all those pitfalls?
Or pre-set environments on the Raspberry Pi and hardware control tasks with an Arduino, for a less masochistic approach. A C64 might be a bit, well… A bit much?
You can’t teach students to program on old systems. You’ll produce competent programmers that spend all their time telling people to get off their lawn.
I don’t think it’s engaging to teach the younger generation with retrocomputers.
I must admit I tried it, and failed… BASIC seems to be very tedious for them. The thing is, I tried again. This time starting with the ubiquitous Scratch, and then picking up Lua on codecombat with great results. Now he is a competent (very junior) programmer.
[Note that lately lua gets very little love in codecombat, so it’s not a good place to learn lua anymore; too many bugs in the language parser]
The thing is, there are better things for kids now than a BASIC interpreter in an emulator. There are better languages (you can think of Lua as this generation’s BASIC — the official book is fantastic, and the language has few surprises) and better IDEs.
And there are lots of options: after graduating from visual programming (Scratch), they can start with little scripts in RPG Maker, or making simple games with Codea, which is a lot more engaging. OTOH, there is a void of OSS software for that skill level, I’m afraid; picking Godot is too much at that stage, and pygame et al. are not very “modern”.
Going low level is also possible (with Arduino), but then you have the problem of the attention span… hardware projects tend to require too much time just to see something working.
I believe every generation has to use the learning tools best suited to the hardware it has. We cannot impose our BASIC/assembler/C path on the younger generation, as it was not meant to produce the interactive/dynamic/touchy applications and games they expect. There are better tools for that.
I am very happy with my uni’s curriculum from a software development point of view.
1st semester: computing theory + digital systems (with breadboards) + high level programming with a wonderful family of languages people aren’t familiar with (LISP), in a karate-kid-empty-the-cup-so-we-can-fill-the-cup way;
2nd semester: computer architecture with assembly, and algorithms with C. You knew what C pointers were (a somewhat difficult concept) because you were dealing with a small-scale but fully functional educational processor at a very low level;
3rd semester: introduction to object programming with C++ or Java + operating systems with C;
4th semester: logic programming with Prolog + software engineering with Java + low-level networking with C + artificial intelligence with LISP;
From here on you still had three more years of a lot of different stuff.
But in just 4 semesters you had to deal with many different types of programming, showing that there is no one way of doing things, shaping the mind to think about the problems, not just spewing out recipes and frameworks.
But please, no BASIC. That’s a disservice to the kids.
Imagine going back to the 1980s as a kid and being told that home computers are folly and you should learn programming on a 30-year-old batch-processing mainframe to understand how to deal with limited resources?
I think most kids would take a Raspberry Pi with Scratch and Python over a Commodore 64 emulator and its horribly cryptic poke commands.
AMOS Professional forever!
Agreed. I learned to program on the C64, but it was the Amiga and AMOS BASIC that really lit my fire (and later Blitz). The ability to create graphics and sounds so easily gave me instant gratification and kept me coming back for more for years. C64 BASIC has a higher barrier to entry for learning, I feel, as it has no real graphics abilities without POKE and PEEK.
Hi,
I’d want people to start by learning how the hardware works (how the CPU fetches stuff from memory and does what it says, and how a formula might be broken down into instructions), including things like the relative performance costs of various types of instructions (e.g. shift is cheaper than multiplication) and the huge performance difference between RAM and caches. Once they’ve got this understanding, I’d want to encourage them to imagine what the hardware actually does when they write code in higher-level languages, possibly by using a series of small functions, walking through “compiling by hand”, and explaining the various optimisations a compiler might do.
Teaching higher level languages first doesn’t help people understand what their code asks the hardware to do and leads to bad/inefficient software. It doesn’t really matter much if it’s modern Java or Python or an ancient dialect of BASIC – it’s all too high level to provide the low level knowledge that’s needed to go beyond merely meeting the absolute minimum requirement (correct behaviour).
– Brendan
That’s not a problem with teaching higher level languages first. That’s a problem with teaching higher level languages ONLY. You need to teach higher level languages first to ease people into thinking programmatically. When they’re comfortable with that kind of thinking process, then you start stripping away the layers.
The problem is when that doesn’t happen.
01010100 01101000 01100101 00100000 01010010 01100001 01110011 01110000 01100010 01100101 01110010 01110010 01111001 00100000 01010000 01101001 01100101 00100001
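For the curious, the binary above is plain 8-bit ASCII; a quick way to decode it:

```python
# Decode space-separated 8-bit binary values into an ASCII string.
bits = ("01010100 01101000 01100101 00100000 01010010 01100001 "
        "01110011 01110000 01100010 01100101 01110010 01110010 "
        "01111001 00100000 01010000 01101001 01100101 00100001")

message = "".join(chr(int(b, 2)) for b in bits.split())
print(message)  # -> The Raspberry Pie!
```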
++++++++++[>+>+++>+++++++>++++++++++<<<<-]>>>++++++++++++.>---.++++++++++++++++++.---.--------------.+++.+++++++++++++..+++++++.<<++.>--.>----------------.<<+.
Brainfuck
…someone probably said the same about punched cards and Cobol.
Look at all the chaos that is Win32, or the whole web: layers of HTML, CSS, JavaScript versions, bugs, zillions of frameworks on top. This is not how we should do software. You do not build bridges or skyscrapers like this. We need to go back to simple, understandable, well-designed languages and frameworks. IMHO modern software stacks are way too messy and buggy.
Edsger W. Dijkstra, 18 June 1975
http://www.cs.virginia.edu/~evans/cs655/readings/ewd498.html
If you want to teach kids outdated tech, teach them Pascal. Pascal is a pretty good outdated programming language.
Or, just teach them something useful today, like Python.
el pescado,
Good point. The modern incarnations we see today are similar in expressiveness to other languages (i.e. VB.NET and C# have different syntax yet similar abstract syntax trees), but younger people may not realize just how bad BASIC was originally.
BASIC was so limited that the coder had to manually track the line numbers used at every branching instruction with GOTO and GOSUB. This is quite awful compared to the way we program today. It is reminiscent of assembly language, but even assembly languages allow programmers to jump to labels. In BASIC we had to jump to numbered lines!
http://www.infinite-loop.at/Power64/Documentation/Power64-ReadMe/AA…
This meant that all BASIC programs would become an unreadable mess of spaghetti code over time. It was so awful that I feel it needs some historical context to explain why it was done this way. Back then, BASIC would be embedded in the ROM BIOS, yet the BIOS did not contain a text editor. So in order for the language to be workable on such a minimal platform, it needed to be standalone and incorporate some way to edit code by itself. In BASIC this took the form of numbered-line-based text editing.
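For readers who never saw it, here is a tiny hypothetical Commodore-style BASIC fragment showing that numbered-line branching. Every jump target is a bare line number, so inserting or renumbering code means hunting down every GOTO and GOSUB by hand:

```basic
10 REM COUNT TO THREE, THEN CALL A SUBROUTINE
20 I=1
30 PRINT I
40 I=I+1
50 IF I<=3 THEN GOTO 30
60 GOSUB 100
70 END
100 PRINT "DONE"
110 RETURN
```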
I was primarily exposed to this in MS-DOS GW-BASIC, which even at the time was extremely archaic compared to Pascal/C. However, those were dependent on things like text editors. MS-DOS, for its part, had the “edlin” text editor, which, similar to BASIC, would edit documents using line numbers. It was just as bad as it sounds…
https://www.youtube.com/watch?v=isZis-k1dhs
So because of the lack of decent editors shipping with early PCs (MS-DOS, Apple, C64), BASIC had an advantage when selecting a language to bundle with them. In practice, commercial developers would rely on third-party editors like WordStar or WordPerfect, or Borland’s awesome IDEs, to edit code, while home users who didn’t purchase add-on software would have BASIC.
And conversely, modern IDEs do so much more (sukru put it best recently: http://www.osnews.com/permalink?662864 ); it’s good to learn to use their features.
Modern CS programs should focus on writing software for several platforms, and small ARM hardware is one of them. It shouldn’t just be about low-memory situations but also parallel programming. Students should learn threads and lockless parallel algorithms, as well as higher-level constructs for parallelism like RxJava, libdispatch/GCD, etc.
Most CPUs have more than one core and few know how to use them properly. Old hardware wouldn’t help with that problem.
I completely disagree that teaching kids to program on old, antiquated hardware is a good idea. I think before kids touch programming at all, they should learn the history of computers. They should learn about hardware design and move into programming once they’ve obtained a basic understanding of what a computer is, what it consists of, and how its parts work together. That provides the framework to build on. And of course they should be primed in logical thinking.
Ultimately experience is the best teacher and for most people learning simply takes time. It doesn’t make sense investing in something you aren’t going to be using when you can be investing in something you will.
[quote]Wouldn’t it be better to teach kids programming in BASIC, with limited resources, on, say, C64 emulators?[/quote]
I agree, kids are not learning like they used to.
That’s not due to a lack of proper tools to learn with; it’s due to a lack of funding, good curriculum, and education as a priority. Throwing C64s or old-hardware emulators at those problems solves none of them.
I agree with your sentiment about learning programming with a retro computer, but I disagree about learning BASIC. Learning BASIC generally leaves you unprepared for learning more useful languages, because it allows unstructured programming and zero documentation. Learning BASIC first (and the lousy computer programming options I had at college) pretty much killed any potential aspirations I might have had as a programmer.
I’m no expert, but I see the best way to learn programming now is to first teach how to document, then teach how a computer operates. A retro computer, with its simple architecture, would be helpful. I’ll leave it to those more experienced than I am to pick the first language to learn (as long as that language requires documentation).
I’m not sure what you mean by first learning how to “document”, but it’s essential to have a basic understanding of how computers work if you want any chance of learning how to program in anything beyond the most basic and mostly useless languages.
Retro computers aren’t much different than computers today in that something handles sound, something handles graphics, something handles i/o, something is the traffic cop telling what data goes where, etc. Retro computers are best used to satisfy hobbyist & nostalgic inclinations and in teaching the history of computers, not in preparing you how to program modern systems.