“In this tiny ebook I’m going to show you how to get started writing 6502 assembly language. […] I think it’s valuable to have an understanding of assembly language. Assembly language is the lowest level of abstraction in computers – the point at which the code is still readable. Assembly language translates directly to the bytes that are executed by your computer’s processor. If you understand how it works, you’ve basically become a computer magician.” More of this, please.
I grew up with 68000 Asm, so I think all coders should have at least a basic understanding of assembly languages.
Agreed! If you don’t know assembler you have no clue how computer memory works.
A properly good developer also has to understand how an adder circuit is constructed and how it works.
And Boolean algebra / logic gates too. Often forgotten by computer majors.
ASM was part of the first-semester classes when I was in undergraduate technical college, along with binary. Then a few years later, in university, they didn’t even seem to know it still exists. They think it’s something behind us, and that to look forward they must teach more important things, like design patterns and how to write layer upon layer of abstraction… What a shame.
Assembler? Pah! I’ve handwritten mnemonics on a sheet of paper and translated them into switch positions to write my own assembler (get off my lawn)
… but now all I get to do is an occasional bit of Perl **sob**
You lucky bastard! I had to solder my processor from hundreds of transistors! Now get your carriage off my property!
Luxury…
And you try and tell the young people of today that ….. they won’t believe you.
Sad part is, that I actually had to do that…
You youngsters and your transistors. :p
In my day we had to weld valves into a computer the size of a small building. And when we had bugs in the system, they were literally house flies.
You were lucky to have valves! Back in my day, computing was done using our heads. We had to work 28 hours a day, 8 days a week, supply our own pens and paper and hope that the foreman didn’t hit us for writing with it!
By your use of numbers I’m guessing you really sucked at your job as a computer…?
He wasted too much time watching Monty Python.
Time well spent!
You had self-inking pens? In my day we had to whittle our own quills from the neighbour’s goose feathers.
Please stop it. I can’t stop laughing.
Luckily no one told him he was programming an abacus to play Tetris
This is an insanely cool thing!
We had an Apple ][+ at school back in the early ’80s, and I wanted to get as much as I could out of it; the only way was machine code.
I remember using CALL -151 (to drop into the machine-language monitor) so many times, and 3D0G to get back to Applesoft.
I got an Apple //c in ’84 that had the 65C02 chip, which gave you such things as BRA, which I thought was the coolest thing ever (my friend’s C64’s 6510 was even nicer, but that’s another story).
I wrote code to put the //c into double hi-res, then, using the video interrupt, wrote a small sprite routine that put up a mouse cursor (copied pixel-perfect from a Mac 128k image I had from a Byte magazine at the time). I even wrote a faux menu system that copied the Mac look and feel, right down to the Chicago font.
Times have changed.
And that niceness was in a much less expensive package, but that’s anot… hm, no wait, something like this still goes on in the present ;p
Didn’t change totally; similar coding style still goes on with many microcontrollers (tons of them around), including some DIY stuff – this AVR “console” for example: http://en.wikipedia.org/wiki/Uzebox
Me, one day… I will make an Atari 2600 game – 128 bytes of RAM and racing the beam, here I come!
(I’m not really in a hurry though – in fact, I think that doing it on a half-century mark would be even more curious, so still 15 years to go; notably, also a 6502 variant)
I actually enjoyed this article but I do have a bit of “beef” with it.
fta: “I don’t think you’ll ever have to write assembly language in your day job”
{ // begin rant
As a software engineer by trade, I have on a *few* occasions needed this. It’s not a purely academic exercise. Sure, even as developers, you will rarely (and I mean virtually never) be employed as an assembler coder these days. That being said, it’s still a valuable skill set to have. It’s definitely crucial when you get stuck in those deep debugging sessions with gdb and need to understand “WTF is at this memory address? I didn’t put that there!” Being able to understand at the low level what is really happening is valuable. Or, if you’re doing low-level performance optimizations, being comfortable with assembler is necessary for using the SSE intrinsic functions on x86_64 architectures (see the sketch after this rant).
While it may almost never be your full-time job to write assembler code, having a moderate ability to do so is necessary for that “every once in a while” when you do need it.
} // end rant
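To make that concrete, here’s a minimal sketch (mine, not from the article) of the kind of SSE-intrinsics code I mean – the function and array names are made up, and it assumes the array length is a multiple of 4:

/* Sketch: summing two float arrays four lanes at a time with SSE. */
#include <xmmintrin.h> /* SSE intrinsics */

void add_arrays(float *dst, const float *a, const float *b, int n)
{
    for (int i = 0; i < n; i += 4) {
        __m128 va = _mm_loadu_ps(&a[i]); /* load 4 floats (unaligned OK) */
        __m128 vb = _mm_loadu_ps(&b[i]);
        _mm_storeu_ps(&dst[i], _mm_add_ps(va, vb)); /* 4 adds in one instruction */
    }
}

Knowing the assembler underneath is what tells you why this beats the scalar loop – and when it doesn’t.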
Compilers are good at what they do, even great, if not astounding. But they are not magic. Every piece of software, even gcc, is written by a bunch of people just like us (speaking to fellow programmers here), ugly warts and all.
Never say never indeed. My daily job IS writing assembly code. Maybe I’m one of the few left in the world, but as old-fashioned or pathetic as it sounds, there still is a need for assembly language.
And as someone has already mentioned, it is good to know it in order to have a better understanding of how stuff works.
Why not x86 asm?
I have several old Atari & 6502 books about to be dumped. Should I save them?
D’oh! RTFA, Sorry all.
Have you *ever* tried 6502 assembly vs. 80×86 assembly? Do real mode, segmented memory, and int 10h mean something to you? Once they do, you’ll freak out and love 6502 assembly.
Kochise
To be fair, the PC architecture is much more complex, and the main reason the 8086 was initially like that was to reduce production costs.
One thing I hated about 6502/6510 assembly was the lack of multiply and divide instructions. On my C64 I could call the BASIC routines to do it or implement it myself, but that still sucks. :p
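For the curious, the usual workaround is a shift-add loop. Here’s a rough sketch in C (names mine, illustrative only) of the algorithm you’d hand-roll on the 6502 with LSR/ROR/ADC, one bit of the multiplier per iteration:

#include <stdint.h>

/* Sketch: 8-bit x 8-bit -> 16-bit multiply, since the 6502 has no MUL. */
uint16_t mul8x8(uint8_t a, uint8_t b)
{
    uint16_t result = 0;
    for (int bit = 0; bit < 8; bit++) {
        if (b & 1)                        /* low multiplier bit set? */
            result += (uint16_t)a << bit; /* add the shifted multiplicand */
        b >>= 1;                          /* move to the next bit */
    }
    return result;
}

Division is the same idea in reverse: a shift-and-subtract loop, again one bit per iteration.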
Perhaps ARM assembly is more appropriate. It is a contemporary architecture and it is used in 100-epsilon percent of all mobile devices.
A cheap board like the Raspberry Pi has IO pins that let you control stuff where timing is important and assembly could be appropriate.
Another possibility is the Arduino board which uses a completely different architecture, http://en.wikipedia.org/wiki/Atmel_AVR . There is no OS on top, only a small bootloader. And assembly is very appropriate to use here.
Agreed!
Better yet, trying to make something like a Lisp Machine, Smalltalk environment, Native Oberon or Lilith.
Rediscovering OS development with new (old) approaches, beyond the usual C one of nowadays.
ARM assembly is much more complex than 6502.
Also, learning assembler on a Linux machine is a pain. How much code do you have to write to get a single pixel on the screen?
Best is to have a small eval board with lots of LEDs and a Cortex-M3 on it, plus a debugger. E.g. the Stellaris boards, which also often come with an LCD.
Also, the Raspberry Pi uses an ARM11, which is not widely used these days.
That’s a joke, right?
No, I meant it seriously. ARM11 is quite outdated.
Of course, it’s my opinion based on what I see and hear in my jobs, not on statistics.
…then it’s quite a jump to say “not widely used” – it still ships a ton (just no longer in top smartphones, for example). “Quite outdated” doesn’t mean much if it’s perfectly good enough (and much less expensive, given how ARM licenses older cores).
Well, I am a bit biased. When talking about the 6502 and the Raspberry Pi, I am thinking of hobby or embedded projects.
So far, besides the mobile phone market, I have never seen or heard of projects using ARM11.
But of course, my view of the embedded market is limited.
Mobile tech is typically counted as embedded… and overall, it’s not very visible – who really knows what ends up where at large scale.
Sure, ARM11 is no longer dominant in ~top smartphones like it was for some time, but it didn’t disappear; it continues to be used in many mass-market devices – many lower-end Android phones are still built around ARM11-based Qualcomm or MediaTek SoCs… quite possibly still most Android phones (it’s just that the tons of cheap Chinese devices aren’t very noticeable in the few most visible “premium” markets, yet they largely power worldwide Android adoption). Then add the so-called “feature phones” and Nokia Symbian devices, even the most recent ones. It’s not impossible that most smartphones, and/or mobile phones in general, still ship with ARM11.
Which reminds me, there’s more than the main CPU – radio modules typically have some older ARM core, and will likely continue that trend.
Also, the Nintendo 3DS seems to have an ARM11.
And that’s only the visible stuff. My quite recent wifi router has some old ARM core – not sure which, but the point is: older cores continue to be attractive for most scenarios.
ARM is in fact a kind of conceptual evolution of the 6502. The creators of the architecture were heavily influenced by the 6502 used in the BBC Micro and designed ARM to be a logical successor.
*sigh* This is only half true. The design of the 6502 was very sweet, much more efficient, etc., and can be seen to have generally influenced the design goals behind the ARM family. However, the ARM is a RISC processor and the 6502 is pretty much CISC (though some argue this point). Nothing in the actual ARM architecture shows any real influence from the 6502. In fact, there’s zero compatibility, and knowing 6502 assembler gives you no great head start on ARM.
Maybe the confusion comes from this: the ARM-based Archimedes range of computers (which is where ARM originates from) was designed to be the direct drop-in replacement for the 6502-based BBC range used in schools in the UK. The main selling point initially was that they came with a very similar BASIC (same capabilities, but with a lot bolted on top) and that they could run *some* BBC software using the included software emulator. Schools in the UK had bought into Acorn in a big way, and the BBC Micro is very much the British Apple 2 (most kids from the ’80s started their computing in school on a BBC). Many would argue that the Acorn range of computers ended up crippling the UK school system, as they’d bought into a dud while the dominance of the PC in the rest of the world was already in place. But that’s another day’s battle.
I guess the ARM add-on for the BBC Micro also played a role in the confusion?
The BBC Micro came out before the IBM PC, right? And while one can easily argue that the PC’s victory was already clear by the mid-’80s ( http://arstechnica.com/features/2005/12/total-share/4/ and the next page, 5 – though it’s not made clear, those stats are probably mostly for North America; the article doesn’t even mention the Spectrum or the Micral), it hadn’t actually happened yet.
In the meantime, the UK had one of the more vigorous ~computer (also education) landscapes – the rest of the world didn’t really have a dominance of the PC yet; it didn’t have much of anything.
One can accuse the Amigas of pretty much the same misplaced hope against the onslaught of the PC (just look at the graphs), but in the meantime they served well. And you still have one of the most vigorous ~computer landscapes.
6502 was fun in its day. So was 68000. But as someone else already pointed out, compilers are damn good these days. Plenty of people get along just fine without any asm knowledge. Also, I would hope people understand how a computer works if they’re going to program. I don’t know any great (at least imo) programmers who don’t.
… but can be outperformed on many machines by a decent assembler programmer.
This might be true for database or banking software. But when it comes to embedded programming, it is always painful to have discussions with programmers who do not know the machine they are programming.
Also, assembly programming teaches good Boolean algebra.
I have seen lots of code like this from guys who never coded a single line of assembly:
uart_format = _8BITS_PER_BYTE || ENABLE_PARITY;
I do not mean that every programmer needs to be a perfect assembly crack, but it’s the same as with a car. Knowing how to drive it is just not enough. You need to know where to fill in the fuel (or today: plug in the cable) and _what_ kind of fuel you need.
I’m assuming that was supposed to be a bitwise OR instead of a logical Boolean OR? Or (no pun intended) was that the point – that programmers who don’t know assembly don’t know the difference between the | and || operators?
I myself used to make that same mistake, and I learned C doing a tutorial that was heavily geared towards graphics and mixed in a lot of assembly (the Asphyxia set of tutorials, if anyone remembers that demo group).
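In case it helps anyone reading along, a quick illustration (mine, with made-up flag values) of why the mix-up matters:

#include <stdio.h>

#define _8BITS_PER_BYTE 0x03 /* hypothetical register values */
#define ENABLE_PARITY   0x08

int main(void)
{
    printf("%#x\n", _8BITS_PER_BYTE || ENABLE_PARITY); /* 0x1 – logical OR */
    printf("%#x\n", _8BITS_PER_BYTE |  ENABLE_PARITY); /* 0xb – bitwise OR */
    return 0;
}

The first collapses both operands to true/false and yields 1; only the second actually merges the bits you meant to set.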
That said, 6502 assembler is kinda cool. I learned it by trying to implement an NES emulator (turns out this is way harder than I thought it would be because of various quirks in the NES hardware), albeit that was a modified 6502. If I recall correctly, the D (binary coded decimal) flag didn’t actually do anything. But I digress.
As CPUs became more orthogonal, I bet the advantage eroded pretty quickly. Compilers can just keep track of more information to make better decisions on optimizations, such as efficient register scheduling.
And it’s poor software engineering to dive straight into assembly without finding the bottlenecks first. Even small micro-controllers like PIC are better programmed in C first with a directed migration to assembly based on performance requirements.
Except for the smallest of embedded systems, C is perfectly viable using something like Contiki ( http://www.contiki-os.org/ ).
I did not want to start the old ASM vs. C vs. C++ war again. But there are good reasons to write assembly code. And yes, a bad algorithm in assembly stays a bad algorithm.
And sometimes, you just can’t speed things up any more.
DeepThought,
Yea, GCC often produces subpar code in my experience. Depending on how tight a loop needs to be, hand-crafted assembly can bring decent gains. Sometimes we can get away with swapping in intrinsics; other times GCC just refuses to output good code.
ICC is supposed to be an excellent code optimiser though.
It’s all relative though, computers have gotten so fast we’re usually waiting on I/O anyway.
It needs to be said that coding in asm does not have automatic benefits of any kind. The quality of asm depends on the knowledge and capability of the person programming it. I’ve seen plenty of terrible asm. There _can be_ benefits to asm, but it isn’t a given and shouldn’t be overstated.
Yup.
Fixed that for ya!
RAM is the new disk, bla bla…
If you ain’t in cache, you ain’t nothin.
whartung,
“Fixed that for ya!”
“If you ain’t in cache, you ain’t nothin.”
Actually, I may be using poor semantics, but I consider memory limited processes (whether from CPU or GPU) to be “IO” bound. After all, those requests have to be serialised over the system bus almost like any other IO. Obviously most programmers treat “memory” as though it’s different from IO. While memory IO clearly plays a specific role for the CPU, from a databus-oriented point of view it’s not all that special.
Semantics aside, it can be difficult to make SMP systems scale well if memory & IO are the real bottlenecks. I’d consider a CPU-bound process one that makes minimal use of the system bus, including system memory. I find many multithreaded advocates proposing to subdivide problems among tiny light weight threads, but very often they shift a problem from being CPU-bound (which is good for SMP) into one that is memory-bound which offsets the benefits of parallelism.
Even caching is problematic in that x86 processors must implement very strict cache coherency models, which severely limits SMP scalability.
Watch some recent C64 demos and you will surely change your opinion.
What C64 demos output is more the work of the VIC-II and SID.
Anyway, they don’t change anything in what ilovebeer said – nobody seriously targets 3-decade-old computers any more.
Plus, they are just that, demos – not actually useful applications or interactive games, which never looked anywhere close.
It’s easy to download a (free) Commodore 64 emulator (VICE), an Assembler program and start coding away.
BTW, why is it so free? …what’s with including the apparently original ROMs in VICE sf downloads?
When it’s free I ask no questions!
Even if it’s quite possibly… illegal? (and also very against basic sf rules, I believe)
IIRC VICE is legal. I would be surprised if it weren’t, considering how out in the open they are.
VICE itself, certainly. But how come it’s ready to go when I download it from sf, with all the firmware inside? How can they distribute it all? (and from a site about OSS, no less)
The Internet says:
The ROM images for the 8-bit components required to make VICE are copyrighted, and you install them at your own legal risk. Tulip Computers, in the Netherlands, appears to hold the copyright to them, and is being somewhat persnickety about who uses them.
Ah, so they do something, well, illegal (honestly, “and you install them at your own legal risk” is silly – it would likely never result in the prosecution of individual users; OTOH they, VICE, choose to distribute those ROMs, and via SourceForge of all places), hoping it will go under the radar… I wonder how long that might last for by far the most popular C64 emulator – maybe Tulip isn’t that persnickety after all.
PS. It seems Tulip doesn’t even exist any more… http://en.wikipedia.org/wiki/Tulip_Computers (historically ironic how Tulip just copied the BIOS of the IBM PC) – and they got rid of the Commodore stuff even earlier. Oh well, I guess the whole legal mess around (slices of) the C= corpse might be not so bad.
I guess it would be okay if you own the original computers. Whether you extract the ROM or download it, it’s the same result.
Tulip is Dutch, and I am Dutch. It must have been somewhere in the previous century that I last saw a Tulip computer. I never see any advertisements or hear people mention them.
I think they used to be big when they sold their Dutch products to Dutch companies, but most seem to be using Dell and HP now.
Had you not mentioned them, I would have totally missed them if forced to write down all the PC makers I know.
How many people own (checking my VICE dir) a PET, VIC-20, Plus/4, CBM2, C128, C64, C64DTV, C64SC? (not even sure what the last one is, without checking)
You can’t really download VICE from sf without downloading the copyrighted firmware of all the above… and even if you did own them all, I don’t think that would make distribution legally OK.
That’s even before getting into possibly slightly different revisions – maybe my C64C doesn’t have quite the same ROM?
“More of this, please.” – I second that!
I’d be remiss not to mention 6502.org.
It’s an active forum with some 6502 grognards. One thread at the moment tracks the progress of a guy tracing the circuits of the 6502 from a micrograph of the chip.
Other threads involve folks working on the “65Org16” CPU, which is essentially a 6502 with everything stretched to 16 bits. They have running examples in FPGAs.
Others are working on 65816 boards.
Over Christmas break, I decided to write my own 6502 simulator and an assembler, and I’ve got Fig-Forth running on it, so the trials and tribulations of that are posted over there. I’m in the process of adding ANSI support to my simulator’s terminal so I can make a screen editor for the Forth.
Simply, if you’re interested in the 6502 (not necessarily the Apple/C64/Atari), it’s a great place to hang out.
However, I must recommend the Altirra Hardware Reference Manual, available from http://www.virtualdub.org/altirra.html, which is just chock-full of really crazy low-level stuff that the developers of this Atari 800+ emulator discovered. You learn a lot about the nuances of hardware design when you try to reverse-engineer it.
I’d also be remiss not to mention visual6502.org, which is a transistor-level simulation of the chip. If you want to know what happens when the NMI pin fires during a BRK instruction, this is the place to go.
The 6502 still has a pretty active following; the chips are still in production, and it’s a good chip with a cool legacy to it.