Last week I wrote a Mandelbrot set program for the Xerox Alto, which took an hour to generate the fractal. The point of this project was to learn how to use the Alto's bitmapped display, not to make the fastest Mandelbrot set, so I wasn't concerned that this 1970s computer took so long. Even so, readers had detailed suggestions for performance improvements, so I figured I should test out these ideas. The results were much better than I expected, dropping the execution time from 1 hour to 9 minutes.
Articles like this are very satisfying to post, because we can all agree this is just plain awesome, no ifs or buts.
Especially the bit about turning off the screen...
Turning off the screen also improved speed on the ZX81: that was its "FAST" mode.
Turning off the “screen” is still a valid approach for many tasks.
dir /a /s on c:\ takes forever because it is displaying all the output, but dir /a /s > %temp%\c.txt finishes a whole lot quicker
Basically all automation tasks should be run either with their window minimized or with screen updating turned off: https://excel.tips.net/T002498_Turning_Off_Screen_Updating.html
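A quick way to see this effect yourself is to time console output against file output. This is a minimal Python sketch (the line count and temp-file name are arbitrary, chosen to echo the dir example above):

```python
import os
import sys
import tempfile
import time

N = 200_000
lines = [f"{i}: some directory listing output\n" for i in range(N)]

# Writing to the console forces the terminal to render every line.
start = time.perf_counter()
for line in lines:
    sys.stdout.write(line)
console_time = time.perf_counter() - start

# Redirecting to a file skips rendering entirely.
path = os.path.join(tempfile.gettempdir(), "c.txt")
start = time.perf_counter()
with open(path, "w") as f:
    f.writelines(lines)
file_time = time.perf_counter() - start

print(f"console: {console_time:.2f}s  file: {file_time:.2f}s", file=sys.stderr)
```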
We are living in a computer world where literally tens of billions of computations are done per second and processing times are often measured in milliseconds, while the screen refreshes only about once every 17 milliseconds.
I once improved the performance of a program by a factor of 100 by updating the progress bar only once per second instead of every time something changed.
I guess every programmer has a story like this.
Mine is from embedded: I used to measure timing with a GPIO pin and was shocked; task switching on a 1.2 GHz CPU looked horrible. But then I learned that the GPIO was connected to a 33 MHz bus, so the slow GPIO writes dominated the measurement. "Turning off the screen" decreased the times by a factor of 10.
I made a progress bar and got a great speedup just by limiting the redraw frequency to something coherent and humanly bearable, like 4 times a second, which is already too fast to read the text precisely.
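A minimal sketch of that kind of throttling in Python (the 0.25 s interval matches the roughly 4 Hz mentioned above; the class name and totals are just illustrative):

```python
import sys
import time

class ThrottledProgress:
    """Redraws at most once per `interval` seconds, however often update() is called."""

    def __init__(self, total, interval=0.25):
        self.total = total
        self.interval = interval
        self.last_draw = 0.0

    def update(self, done):
        now = time.monotonic()
        # Skip the redraw unless enough time has passed (but always draw the final state).
        if now - self.last_draw < self.interval and done < self.total:
            return
        self.last_draw = now
        pct = 100 * done // self.total
        sys.stdout.write(f"\r[{pct:3d}%] {done}/{self.total}")
        sys.stdout.flush()

progress = ThrottledProgress(total=1_000_000)
for i in range(1, 1_000_001):
    # ... do a tiny unit of work here ...
    progress.update(i)
print()
```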
I downloaded the manual. I was surprised that there was no "OR" instruction in the CPU. The workaround works, but it is slow compared to a proper instruction set.
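For the curious: on a machine that has AND and complement but no OR, the standard workaround is De Morgan's law, a OR b = NOT(NOT a AND NOT b), which costs several instructions instead of one. A Python illustration of the identity (the fixed register width is an assumption for the example):

```python
def or_via_demorgan(a: int, b: int, width: int = 16) -> int:
    """Synthesize OR from AND and NOT, as on a CPU lacking an OR instruction."""
    mask = (1 << width) - 1          # emulate fixed-width registers
    not_a = ~a & mask                # complement a
    not_b = ~b & mask                # complement b
    return ~(not_a & not_b) & mask   # complement of the AND: three ops, not one

assert or_via_demorgan(0b1100, 0b1010) == 0b1110
```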
Well one could be added to the microcode store, and there were alternate microcoded instruction sets for the Alto and D-Machines that did include OR; it was just that the default DG Nova compatible microcode didn’t include it.
This comment should be sent to every developer on every locked-down platform, so they realize just how different their current solutions to problems are from those of the past.
While they chew on that, they can be served a bit of dessert in the shape of "how Compaq's reverse engineering of the IBM PC BIOS started the general computing era for the masses".
Well most computers at that time didn’t have a writable microcode store. But it is also important to remember that that flexibility came at a price.
Intel and AMD implement a CISC instruction set on top of a hidden RISC-like microarchitecture; the internal micro-ops it executes can be seen as the modern equivalent of microcode.
Some RISC CPUs have microcode too.
Some of this stuff reminds me of programming my HP48GX calculator back in the day. Turning off the screen to improve performance was a common tactic on that platform too.
After switching from what basically constituted a high-ish-level assembly (SystemRPL) to modern languages such as Fortran 2003+, C++, etc., it's sometimes hard to recapture the charm of those limited languages and systems. That being said, I like all the processing power and capability of modern tools.
Reminds me of when we had a serial graphical DEC VT340 terminal connected to a VAX workstation.
We had it do the spinning 3D cube on the terminal and also did some Mandelbrot drawings.
The first implementation of the cube was horribly slow since we drew the lines pixel by pixel, but at a later point we optimized it to send "lines" and it became decent.
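A rough sketch of why that helps, in Python. The command strings are modeled loosely on ReGIS-style P[x,y] position and V[x,y] vector commands (an assumption, not the exact VT340 protocol); the point is the byte count sent over the serial link, not the syntax:

```python
def bresenham(x0, y0, x1, y1):
    """Yield the pixels of a line, as the slow per-pixel version would plot them."""
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy
    while True:
        yield x0, y0
        if (x0, y0) == (x1, y1):
            return
        e2 = 2 * err
        if e2 >= dy:
            err += dy
            x0 += sx
        if e2 <= dx:
            err += dx
            y0 += sy

edge = (10, 10, 300, 200)

# Per-pixel: one position/plot command for every pixel of the edge.
pixel_cmds = "".join(f"P[{x},{y}]V[]" for x, y in bresenham(*edge))
# Line primitive: one position plus one vector command for the whole edge.
line_cmd = f"P[{edge[0]},{edge[1]}]V[{edge[2]},{edge[3]}]"

# At 9600 baud (~960 bytes/s) the difference per edge is dramatic.
print(f"per-pixel: {len(pixel_cmds)} bytes, line: {len(line_cmd)} bytes")
```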