Now, of course, I tell computers what to do for a living. All the same, I can’t help feeling that I missed out on some fundamental insight afforded only to those that grew up programming simpler computers. What would it have been like to encounter computers for the first time in the early 1980s? How would that have been different from the experience of using a computer today?
This post is going to be a little different from the usual Two-Bit History post because I’m going to try to imagine an answer to these questions.
This is a great idea.
The experience would vary quite widely depending on the computer and the version of BASIC. The Commodore BASIC the author uses was really quite primitive for 1983.
Of course it was: Microsoft was delivering the BASIC for the C64
(Commodore was just rebranding it and MS learned that they should never again sell anything for a flat fee to other companies)
I think the joke is that Microsoft sold a BASIC for the PET, and Tramiel was too cheap to license a new BASIC for the VIC20 and for the C64. That’s why the Commodore BASIC was simpler than Applesoft BASIC from 1978.
But on the other hand, Commodore did supply a very good and useful manual with the computer.
Memory maps of the ROMs, circuit diagrams, complete instructions on the BASIC.
I completely agree, Earl.
My first computer was an already obsolete TRS-80 III in the early 90’s and it came with a couple of manuals, including one for BASIC. I had no idea what to do with a computer at the time, and very few people were interested in them around here, so I had nobody to ask.
After reading one of the manuals a bit I typed in a couple of lines of strange text and the computer responded! Without that manual I probably never would have tried programming a computer.
Hm, maybe with early models / only in some markets? With my C64C in 1992 I just got some introductory booklet on BASIC in Polish, and probably its original in… French.
This was in Canada in the 1980’s in English.
But I think the full manual was shipped in all of North America.
Check out: http://www.classiccmp.org/cini/pdf/Commodore/C64%20Programmer's%20Reference%20Guide.pdf
https://www.lemon64.com/manual/
and the user’s manual at:
https://www.commodore.ca/manuals/c64_users_guide/c64-users_guide.htm
Did you just get the user’s manual?
Yes, mine was the user’s manual, I believe – it definitely wasn’t as exhaustive and long as your first link (while I think I know where I have the book stashed, I can’t get to it now; though I’m sure it doesn’t have “Memory maps of the ROMs, circuit diagrams” …heck, it doesn’t even describe how to load programs from tape in the format of the included fast-loading-routines cartridge, I had to figure that out myself); and I remember, in a section on sprites, a screenshot of a moving balloon sprite with the C= logo (like in https://www.commodore.ca/manuals/c64_users_guide/c64-users_guide-06-… ).
The only thing they left out was losing several hours of work on your latest epic program because you kicked the plug out of the wall. And saving involved cueing up an audio cassette to the next blank spot, typing CSAVE and waiting several minutes.
But at one point I pretty much had the memory maps of the TRS-80 Model III and Apple //e memorized.
You think that is bad: I built a graphics card for my PET 2001, and it took all night (over 12 hours) of running to draw the fractal picture.
I called my brother to come look at how well it worked, and in his rush to come see, the door he opened pulled out the plug!
PS. I did not save the program, so I had to redesign it from scratch. And to this day I still don’t always remember to do backups.
Had the same type of experience, only it was a program I had been working on, and it was a power outage.
Guess it says something about my intelligence that I still don’t do backups as one should…
No, no-one thinks it’s APIs all the way down. It’s pretty common knowledge, even for non-programmers, who don’t even know what APIs are, that computers operate with “bits” that are sent as electric pulses. It was great living through a technological revolution, certainly, but I don’t think it really gave me any profound insights.
Something, something, pearls, swine.
OK, maybe. For what it’s worth, I did write a few Z80 machine code routines back in the day, e.g. to draw explosions, and make terrain collapse, for a Tank Wars / Scorched Earth clone otherwise largely written in BASIC (hope that makes sense). Not uber-geek cred or anything, but I did do a little programming close to the metal.
But I’m not sure what profound insights I’m supposed to have got from it. Everyone knows the metal’s there. I know some specifics of Z80 assembly, but they differ from the specifics of the metal in the machines I use nowadays. And besides, I’m so far from the metal now that I don’t think it really makes any practical difference anyway. Everything I write now is either optimised by a compiler in ways I don’t understand, or interpreted.
If my knowledge of Z80 assembly is supposed to have given me some profound insight, what is it?
I totally agree with you. It was for sure an interesting period to live through, and the C64 was my starting place for programming, so it will forever be special to me. But unlike some others I don’t mistake my nostalgic feelings for useful knowledge.
I made the mistake 10 years ago of revisiting some of the C64 games on an emulator, and I learned that some things are better kept as fond memories; they don’t age very well. For the same reason you don’t see me following things like Haiku development.
If someone wants to revisit the technology of the 80s out of interest, then of course I think they should go have fun. But if anyone asks me whether they should do it to help them become better programmers, the answer is an easy no; there are a million better ways to invest your time.
What I learnt doing machine-language routines is the clear advantage of not trying to write *ALL* of the program in the fastest method possible.
Writing a version of Donkey Kong in BASIC for a C64, with the graphic movements in ML, worked fine.
Trying to write a compiler for my Amiga 1000 in assembly, because I did not like the ABC Basic, was a total mess. Later, when the Amiga got the replacement BASIC, I wrote the compiler mostly in BASIC and just the key parts in ML. The output was ML and worked fine.
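For anyone who never saw that pattern, here is a minimal C64-style sketch of calling a tiny ML routine from a BASIC loop (my assumptions: the routine just increments the border colour register at $D020, and 49152/$C000 is free RAM as on a stock C64 – treat it as illustrative rather than a verified listing):

10 REM read the machine-code bytes into free RAM at 49152 ($C000)
20 FOR I = 0 TO 3 : READ B : POKE 49152 + I, B : NEXT I
30 REM the bytes are INC $D020 (cycle the border colour) followed by RTS
40 DATA 238,32,208,96
50 REM the slow outer loop stays in BASIC; only the hot part is ML
60 FOR J = 1 TO 200
70 SYS 49152
80 NEXT J

The point is exactly the one above: the bookkeeping stays in comfortable BASIC, and SYS hands control to the hand-written routine only where speed actually matters.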
But STUPID me, I did not make backups again, and someone stole my only copy while I was demoing at a computer store in downtown Toronto.
Make Backups, Make Backups, Make Backups.
We managed and actually thrived. This may come as a shock to those who have only ever known the PC world.
My introduction to Computers came about in 1969. I watched in awe as a Pilot flew a Flight Simulator. This was the real thing. Powered by a Honeywell DDP-124.
In the next shop was a giant of a machine. The South African Airways B-747 6 Axis Simulator. The Hydraulics just made me go ‘wow’. 3 seconds for a full 10ft of extension with a precision of 2mm.
As a 16 year old fresh out of school, I saw the future. So what if it was punched cards and ASR33s and really early VDUs… It was obvious to me that was the way things were going, so I climbed on board.
Now, almost 50 years later, my career in IT is just about over. It almost came full circle, as my last job was on airport terminals. Been there, done that, got the T-shirt on pretty well everything in between.
Some of the technology fads stuck around, many fell by the wayside, and a lot more are so bad they should be taken out the back, shot, hung, drawn and quartered, and then burnt at the stake.
Oh, and the original Dartmouth BASIC wasn’t bad. I still have the source code to my program from 1974 to calculate spring compressions and rebound rates. It is on paper tape, naturally.
A shame that the Commodore 64 was used as an example of an 80’s machine running BASIC, because the BASIC was pretty poor on the C64.
Best 8-bit BASIC in the 80’s, by a long, long way, was on the BBC Micro. Having a 2-pass 6502 assembler built into the BASIC as well was simply genius. You could either create a 100% machine code binary with it and run that, or mix the excellent BASIC and machine code together should you so wish.
The Acorn Archimedes that followed in the late 80’s (the first ARM-based home computer) extended on BBC BASIC further (including, yes, a built-in ARM assembler), but for the sheer excellence of cramming so much into a 16K ROM, the original BBC BASIC on the BBC Micro remains an all-time classic in my books.
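If you never used it, mixing BBC BASIC and the built-in assembler looked roughly like this minimal sketch (written from memory, so details such as OSWRCH living at &FFEE and the usual two-pass OPT idiom are my recollection rather than a checked listing):

10 DIM code% 100 : REM reserve room for the assembled code
20 FOR pass% = 0 TO 2 STEP 2
30 P% = code% : REM P% is the assembly location counter
40 [ OPT pass%
50 LDA #65    \ ASCII "A"
60 JSR &FFEE  \ OSWRCH: print the character in A
70 RTS        \ back to BASIC
80 ]
90 NEXT pass%
100 CALL code% : REM run the machine code from BASIC

From there you could CALL the routine from BASIC as above, or save the assembled bytes out as a standalone binary – all from the language shipped in ROM.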
All true, but I think it’s exactly in the spirit of the article. The limitations of Commodore BASIC V2 were one of the things that forced you to work around it, take your first steps in 6502 ASM, learn more about the machine hardware, become intoxicated with the raw power, and discover a brave new world of computing at your hands.
Also, the sheer popularity of the C64 meant that one was much more likely to start with its BASIC than with that of the BBC Micro; the C64 kinda defined the computing landscape of its time.
The Acorn still has some things to teach. Transputers also.
I hated the fact that the Atari ST-Transputer only supported one chip.
I first learnt of the Transputer at the Amiga Developer Group, where the speaker was showing the results of connecting 16 of them in a 4 by 4 array. I started dreaming of what you could do with a 16 by 16 array.
I think you forget that the Acorn Archimedes was the first machine to ship with an ARM processor, full stop. After all, ARM originally stood for Acorn RISC Machine.
There’s a bunch of us around who actually did that. We’re not even old. We can just, you know, tell you what it was like…
I can even remember back before Steve Jobs invented marketing!
Heck, I started with a Logix computer in the 1970s.
http://www.samstoybox.com/toys/LogixComputer.html
You can’t beat learning BBC BASIC V. Probably the best and easiest version to use.
Millions of schoolchildren learned this language in the Eighties in Britain.
I like COMAL-80
Learned BASIC on “programmed books”: an options question took you to another page. Learned more about the logic of algorithms by following the “wrong” options.
Had access to real HW through my engineer brother two years later. A Burroughs mainframe, if I remember well. He typed my wanderings onto regular cardboard. You were allowed only one debugging run.
In 1983 I already had a pocket CASIO BASIC computer with 8K of RAM. The electronics technology was such that keeping battery drain low made it painfully slow.
8K permanent storage.
1983: Sophomore in high school & the school’s first computer class.
I was programming random-access databases w/ graphics in Apple BASIC.
I got ahold of a “menu” from the Mustang Ranch in Nevada.
Created a series of questions that guided you to your temporary BFF as my final project.
Parents had an interesting Parent-Teacher Conference over that choice of content.
It was fun – apart from BASIC, you could use assembler and actually talk directly to the hardware.
PEEKing and POKEing
Or you could subvert BASIC and change addresses on the GOTO or GOSUB stack to make BASIC do things that were supposedly impossible.
You could write self-modifying code that enabled tiny programs to do magical things in a few lines.
You could drop down to assembler to speed up a slow routine or address the screen directly.
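As a minimal flavour of that, a few lines of C64-style BASIC poking the hardware directly (addresses from memory, so treat them as assumptions: 53280/53281 should be the border/background colour registers, 1024 the start of screen memory and 55296 the colour RAM):

10 POKE 53280,0 : POKE 53281,0 : REM black border and background
20 FOR I = 0 TO 39
30 POKE 1024 + I, 1 : REM screen code 1 is "A", written straight to screen memory
40 POKE 55296 + I, 1 : REM matching colour RAM entry: 1 is white
50 NEXT I
60 PRINT PEEK(1024) : REM read a byte straight back out of screen memory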
It was such fun…I learned proper computer logic from it and how to write clean economical code.
I still like programming, but you are so many levels removed from the output (now why is the compiler doing THAT?) that it is no longer that exciting, just a chore.
Ho hum…
Mac
Great reading the way things were from the comments here. 50 years in the computing industry! I tip my hat. Living legends.
Back in the 80’s my district had classroom computer instruction in grade school for BASIC programming on the Apple II. I was one of the few people lucky enough to have an Apple II at home, because my father needed it for his job.
Between the classroom instruction and having the same computer at home that we used in the schools, learning to code was easy for me.
I just discovered this site:
https://www.commodore.ca/commodore-manuals/
Is there one for Atari Computers?