TempleOS is something of a legend in the operating system community. Its sole author, Terry A. Davis, has spent the past 12 years attempting to create a new operating system from scratch. Terry explains that God has instructed him to construct a temple, a 640×480 covenant of perfection. Unfortunately Terry also suffers from schizophrenia, and has a tendency to appear on various programming forums with bursts of strange, paranoid, and often racist comments. He is frequently banned from most forums.
This combination of TempleOS’s amateurish approach and Terry’s unfortunate outbursts has resulted in TempleOS often being regarded as something to be mocked, ignored, or forgotten. Many people have done some or all of those things, and it’s understandable why.
You really have no excuse to not read this article.
This.
This kind of article is why I’ve got space for OSAlert in my rss feeds.
Yes, exactly! I remember when I used to enjoy coming to OSAlert to get… OS news. It was a great site to come and see news (such as http://www.osnews.com/story/1385/Comparing-MenuetOS-SkyOS-and-AtheO…) about all the alternative/hobby operating systems out there and the new ideas (or rehashed ideas) people were experimenting with or find out about new projects.
These days, though, it’s mostly news I’d find on any other tech blog, with an OS post once in a while.
Please, Thom, bring back the old OSAlert!
Perhaps because there is little news to be told about these alternative OSes, mostly developed as a hobby.
Main OSes : Windows, OS X, Linux (and distros) + mobile
Medium OSes : OS/2, NetBSD, FreeBSD, Genode, WebOS, QNX, …
Alt OSes : MonaOS, SkyOS, Syllable, MorphOS, …
Old OSes : Workbench, TOS, Geos, DOS, …
Jesus f**king christ, that’s one ugly OS.
Well done, though.
Here : http://www.osnews.com/story/28091/God_s_lonely_programmer
Which includes this:
“Today I find the people most similar to me are atheist-scientist people,” he says. “The difference is God has talked to me, so I’m basically like an atheist who God has talked to.”
OK…
Yes, he’s schizophrenic, so that would make complete sense.
Also, a different article from when it was called LoseThos http://www.osnews.com/story/23796/Recreational_Programming_With_Los…
https://www.youtube.com/watch?v=vakWMNA1oWc
…
The author of the article didn’t mention: are there any known bugs in TempleOS? I can imagine that with software this “tight” and an author this focused, there could be almost none.
I think about this all the time too. Programming used to bring a fun, almost magical sensation of closeness and being one with the computer. Does “Mode 13h” mean anything to you? If yes, then I think you understand what I mean.
Nowadays it’s just layers upon layers of languages, compilers, and libraries, all needing their own massive amounts of documentation and examples before I can write the simplest of programs, and I still don’t know how any of them work.
I think that’s why I just stick with tinkering in ANSI C with a simple graphics library like SDL. It makes me happy.
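For a sense of scale, here’s roughly what “open a window and plot some pixels” looks like with that setup (a minimal sketch assuming SDL2; the window title, size and colours are just placeholders):

#include <SDL2/SDL.h>

int main(int argc, char *argv[])
{
    SDL_Window *win;
    SDL_Renderer *ren;
    int x;

    if (SDL_Init(SDL_INIT_VIDEO) != 0)
        return 1;

    /* 320x200, a small nod to mode 13h */
    win = SDL_CreateWindow("pixels", SDL_WINDOWPOS_CENTERED,
                           SDL_WINDOWPOS_CENTERED, 320, 200, 0);
    ren = SDL_CreateRenderer(win, -1, 0);

    SDL_SetRenderDrawColor(ren, 0, 0, 0, 255);        /* clear to black */
    SDL_RenderClear(ren);
    SDL_SetRenderDrawColor(ren, 255, 255, 255, 255);  /* draw in white */
    for (x = 0; x < 320; x++)
        SDL_RenderDrawPoint(ren, x, 100);             /* one horizontal line */
    SDL_RenderPresent(ren);

    SDL_Delay(2000);                                  /* admire it for two seconds */
    SDL_DestroyRenderer(ren);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}

Not quite “mov ax, 13h”, but still close enough to the metal to be fun.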
I certainly know what “mode 13h” means, but, well, I just feel that once you learn to really understand what you’re doing, what’s happening behind the scenes, and why, the magic simply wears off naturally.
I do not agree with you on the claim that it’s all the “layers” in software-development that take the magic away, though. As long as I understand what I’m doing there is no magic, with or without any layers, and yet when I manage to do something that I don’t yet fully grasp it still evokes a sort of a feeling of magic — even with the layers there.
Maybe the magic you’re feeling is more that of nostalgia instead?
WereCatf,
I guess it depends on your personality type because these things never wore off for me. I absolutely loved being under the hood, not using libraries but building them. I suppose some mechanics might feel the same way about their engines; they just love working with them. Having the ability to write things from scratch gives me a “buzz” feeling that I don’t get by using existing solutions. I can honestly say that I never lost this feeling. But my “honeymoon” ended when I joined the workforce and discovered that most of my low level skills were in low demand. I refocused on popular high level frameworks for which there are many times more clients. I still jump at the opportunity whenever clients come to me with low level work, but realistically those projects are so few and far between that it never really financially justified my having those skills. Maybe I should have focused entirely on high level web frameworks from the beginning; I’d probably be better off than having mucked around with the low level stuff as I did.
Is it just me, or is it true that even most developers working on embedded systems don’t do any real low level programming anymore?
It is you, because new device drivers still have to be written from scratch (when documentation is available) and debugged over I2C/SPI using probes.
Get a Raspberry Pi and do some bare metal stuff on it, you’ll understand what I mean.
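For instance, the classic first bare metal exercise is blinking the board’s ACT LED by poking GPIO registers directly, with no OS underneath. A rough sketch, assuming an original Pi 1 (BCM2835, GPIO block at 0x20200000, ACT LED on GPIO 16, active low) and the usual tiny assembly boot stub that sets up a stack and jumps to kernel_main; addresses and pin numbers differ on later models:

#include <stdint.h>

#define GPIO_BASE 0x20200000u        /* BCM2835 GPIO block (Pi 1) */
#define GPFSEL1   (GPIO_BASE + 0x04) /* function select, GPIO 10-19 */
#define GPSET0    (GPIO_BASE + 0x1C) /* drive pins 0-31 high */
#define GPCLR0    (GPIO_BASE + 0x28) /* drive pins 0-31 low */

static void mmio_write(uint32_t addr, uint32_t val)
{
    *(volatile uint32_t *)addr = val;
}

void kernel_main(void)
{
    volatile uint32_t i;

    /* GPIO 16 is bits 18-20 of GPFSEL1; 001 = output.
       (This clobbers the other pins' settings, which is fine for a toy.) */
    mmio_write(GPFSEL1, 1u << 18);

    for (;;) {
        mmio_write(GPCLR0, 1u << 16);       /* active low: LED on */
        for (i = 0; i < 500000; i++) { }    /* crude busy-wait */
        mmio_write(GPSET0, 1u << 16);       /* LED off */
        for (i = 0; i < 500000; i++) { }
    }
}

No driver, no HAL, no libc; just you and a memory map, which is exactly the point.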
OK, well, it still seems to me there is a whole lot less low level code being written by embedded systems developers than there used to be.
For example, I found out that the box that I mostly use as an amplifier under my TV runs Java.
I do not consider Java a low level language, do you?
Lennie,
While I don’t have numbers to back this, I think you may be right. The number of embedded systems obviously has skyrocketed, but at the same time I think globalization and industry consolidation has resulted in fewer multinational corporations developing most of our products. I interned at a startup company doing embedded products. I learned a tremendous amount at this place (all about PLCs and my first real experience developing commercial apps) but they were bought out & closed by another company who wanted to move our customers to their platform. And like that, the products I worked on instantly became redundant.
Define “real low level programming”? Assuming you mean something close to the hardware interface side of things rather than the language used to do so. If that’s the case there’s plenty of people that write hardware drivers for all those dedicated devices out there that control everything from automated valves to aircraft black boxes to the firmware stuffed into cameras and phones. Any Turing complete language can theoretically be used for writing device drivers.
If you mean programmers that use languages such as the various dialects of assembly, or Forth… Probably so. But even then, compared to what the machine itself sees, anything in human readable form is “high level”. It just boils down to how easily readable a language is and how much you’re willing to trade off to gain that readability. There’s no guarantee that something written in assembly is going to be any faster than a C program compiled with a solidly efficient compiler/linker. Usually it’s not, because it takes someone extremely experienced with how the particular architecture does things inside to make assembly programs that execute more efficiently than efficiently compiled languages. Due to readability issues with assembly, especially when it’s poorly commented, it’s simply easier to trade off a more arcane language’s potential benefits for a more easily readable one’s consistency (as in the case of UNIX based OSes & C), clarity (Python vs. assembly, for example), additional features (like garbage collection or built in functions), and other factors where the differences in performance may be negligible or not a factor at all.
I still do, and I love it! But I agree, it is very rare. I write firmware for medical diagnostic equipment consisting of a number of modules communicating with each other and controlled by a PC. For me it’s infinitely more enjoyable dealing with a few KB of RAM, I/O ports and that sort of thing than it is dealing with Visual Studio projects. Feels like a much more personal creation I think…
mov ax, 13h   ; AH=00h (set video mode), AL=13h (320x200, 256 colors)
int 10h       ; call the BIOS video services interrupt
And I agree that the level of layering decreases the fun to some degree, on the other hand those layers make it much easier to do complicated stuff like programming a video game.
If you remember mode 13h, you most likely remember how coding sprite routines wasn’t easy if one wanted maximum performance, and doing it in “mode-x” (unchained 256 color modes) was even trickier.
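The naive version was the easy part, since mode 13h is just a flat 64000-byte framebuffer at segment A000; making it fast (and dealing with mode-x planes) was where the pain lived. A rough sketch of the naive approach, assuming a 16-bit real-mode DOS compiler such as Borland Turbo C (MK_FP, int86 and getch come from its dos.h/conio.h), with colour 0 treated as transparent:

#include <dos.h>
#include <conio.h>

#define SCREEN_W 320

static unsigned char far *vga;

static void set_video_mode(unsigned char mode)
{
    union REGS r;
    r.h.ah = 0x00;               /* BIOS: set video mode */
    r.h.al = mode;               /* 0x13 = 320x200x256   */
    int86(0x10, &r, &r);
}

/* naive transparent blit: copy sprite pixels, skipping colour 0 */
static void blit(const unsigned char *spr, int w, int h, int x, int y)
{
    int sx, sy;
    for (sy = 0; sy < h; sy++)
        for (sx = 0; sx < w; sx++) {
            unsigned char c = spr[sy * w + sx];
            if (c != 0)
                vga[(y + sy) * SCREEN_W + (x + sx)] = c;
        }
}

int main(void)
{
    static const unsigned char dot[4] = { 15, 15, 15, 15 }; /* 2x2 white sprite */

    set_video_mode(0x13);
    vga = (unsigned char far *)MK_FP(0xA000, 0);
    blit(dot, 2, 2, 160, 100);
    getch();                     /* wait for a key */
    set_video_mode(0x03);        /* back to text mode */
    return 0;
}

The fast versions replaced that inner loop with unrolled copies, compiled sprites and so on, which is where the real trickery started.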
Having used asm to write a (very, very) simple OpenGL-like library that used mode 13h back in the ’80s, I sure can relate to your post.
And I fully share the feeling that today, for the most part, programming has become a game of connecting huge stacks of libraries and tools together.
Though, IMHO, there is only one way to keep the “fun” in programming: remain close to the bare metal. Driver development, kernel logic development, etc.
Nothing feels more “DOS” than playing around with IRQ states in kernel mode
– Gilboa
gilboa,
I second this. DOS was my learning lab, great times! Sound Blaster apps, graphics demos, TSRs. Also worked on file systems back when the Windows kernel was unlocked. In my college years I started a kernel of my own. I had a blast doing it and there are still so many things I’d love to implement… this stuff is so much more fun than client work. Occasionally these days I write Linux patches to fix things, but it’s mostly a relic of my past.
Oh boy, I miss those days. Actually, you know what? I don’t.
Fidgeting with endless config.sys and autoexec.bat files to free up a few more KB of RAM (something like the snippet below).
Fiddling with EMM/QEMM to make some game run.
Cleaning up disk space because your “massive” 40 MB hard disk was almost entirely taken up by DOOM.
Swapping soundcards in and out and manually changing the switches for IRQs and DMAs because some game was hardcoded to use things differently (OK, so I had a Gravis Ultrasound and didn’t have to do it that much).
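For the younger crowd, a typical round of that fiddling looked something like this (from memory; the paths and driver names are placeholders that varied per machine):

In config.sys:

DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DOS\EMM386.EXE RAM
DOS=HIGH,UMB
DEVICEHIGH=C:\DOS\SETVER.EXE
DEVICEHIGH=C:\CDROM\CDROM.SYS /D:MSCD001

And in autoexec.bat:

LH C:\DOS\MSCDEX.EXE /D:MSCD001
LH C:\MOUSE\MOUSE.COM

Then reboot, run MEM /C, and shuffle the DEVICEHIGH/LH lines around until the game’s favourite 600 KB of conventional memory was finally free.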
All that? It sucked. Yeah, maybe it was a good learning experience but thank God we have moved past it.
Yes, I’m glad we moved on. But I still miss the simplicity of installing a program: just copy a directory.
Atari or Amiga.
Flat memory model, orthogonal 32-bit ISA (hail to the 68000) and fixed hardware made no compromises for games and demos, which always ran out of the box.
But one day, MCGA (320x200x256 colors) put them to shame. Then SVGA (640x480x16M) arrived and the world was definitively turned upside down, despite the Amiga’s better audio (the Paula chip).
Adlib, Soundblaster, Audigy, …
If cracked?…
Linux games still tend to be like that if the developer hasn’t gotten it into their damn fool head to complicate things. That’s why I like GOG.com insisting on doing their own packaging.
1. Download tarball
2. Unpack tarball
3. Run start.sh
Soulbender,
Hey, I readily admit it was a poor platform that lacked functionality and foresight. It probably even set back the industry compared to what the competition had… but it’s hard to deny that it was an excellent platform for learning PC internals. It was a very short bridge from developing dos applications to writing an operating system since a lot of code could be developed and debugged under DOS prior to having a working OS. Courtesy of Borland, DOS had excellent development tools that rivaled anything else I saw at the time. These things made DOS very friendly to low level developers.
Yeah, Borland had pretty awesome development tools (for the day).
.com and .exe files, using overlays, TSRs, inline assembly. Those were the days!
To quote Linus: (Back) “when men were men and wrote their own device drivers”
– Gilboa
Maybe it’s just me, but I derive fun from elegance.
Instead of creating simplicity by going low level I strive to keep my programs small and single purpose like the Unix philosophy. And use the best tool for the job.
It is the only way to fight complexity and thus prevent bugs.
I’m a developer and Dutch, so I’ll use Dijkstra to explain it:
“[mathematical] elegance is not a dispensable luxury but a factor that decided between success and failure.”
“ingeniously simple and effective”
https://www.youtube.com/watch?v=RCCigccBzIU#t=17m28s
I’m actually happy a lot more developers now feel the same and when they are building distributed systems (like complex websites) they now build ‘micro services’.
People should code in Erlang using lightweight processes working in the actor model in massively distributed, fault tolerant mode.
Easy as pie. Just that the IDE is so poor and antique. So you also have to master emacs or eclipse to get the job done :/
Yes, Erlang has many good qualities. And programs developers are building are starting to look more like what you’d find in Erlang.
A long time ago I wanted to have a look at Erlang, but I couldn’t even find good information on how to get started. There were a bunch of existing programs I could use to learn from, but that isn’t a good way to get started; they were much too large for that. I needed some good guidance to really understand the Erlang syntax, but there wasn’t any.
So I ended up never using Erlang for anything.
Well, the Erlang syntax is pretty easy to start with; it’s just that the IDE should have had a REPL to help with learning it. Basically you write a ‘module’ using Notepad and compile it with the Erlang compiler ‘erlc’ into a .beam object file you can then run in the VM. Attaching a debugger is horrid.
Basically it mirrors C’s .c + .h split, with .erl + .hrl files.
A ‘module’ (.erl source file) should be declared using the same name (my_module.erl -> “-module(my_module).”), then you add some other preprocessor-like attributes at the beginning of the file (author, file version and such).
You ‘import’ other modules’ function prototypes (like includes) and export your module’s “public” functions. Functions are not distinguished by their parameters’ types but JUST by the number of parameters (the ‘arity’). Hence ‘pow(int)’ doesn’t exist as such, but ‘pow/1’ does. So you cannot have the same function with different parameter types.
Why? Because Erlang is not strongly typed, because it is a functional programming language. That implies polymorphism, as you can pass an integer, a boolean, or even a string. But how do you tell the types apart so the arithmetic doesn’t blow up? Using ‘guards’ (roughly, with do_this/do_that as placeholders):
pow(N) when is_integer(N) -> do_this(N);
pow(N) when is_boolean(N) -> do_that(N).
That ensures your algorithm is nicely packaged in one place, with just the slight per-type variation.
Erlang also features pattern matching everywhere. Just read the Wikipedia page on factorial. And it works the same way on binary strings, so parsing TCP or JPEG packets is damn easy.
In fact this language is full of thousands of little things that make it so awesome to discover, but really, before digging in you have to invest yourself in it.
Racket (DrRacket) is a pure dream in comparison. With the absolute power of Lisp upgraded to graphical experimentation.
To be honest I’ve never used an IDE to do any programming, never had the need for it I guess.
But when I looked at Erlang, I don’t think Eclipse even existed yet. So it’s been a while.
I am not saying I don’t partly agree with what you say, but I think you are missing an important part of the picture. You felt computing was “fun” because you wanted to know all the little details of the hardware you used (and sadly, with today’s all-too-complicated HW stack, that is not really possible anymore…). But it does not mean that that mindset is gone: there are always people who want to discover everything about some of those layers. Say, those who try to find the “best” or “idiomatic” way to do everything in a certain computing paradigm, or who want to optimize their code for a certain parallel architecture. How are they different from the hackers of yore?
I think the difference lies more in whether you work with something 9-5 (because then you will not have enough time to fully learn anything) or you do something as a hobby. The magic is still there; we just grew up, and do not have the time anymore…
Last year: http://www.osnews.com/story/28091/God_s_lonely_programmer
What’s new?
It’s not a repost from last year. Read the articles; one is a more general look at TempleOS, the other is a somewhat more technical look.
Look at my post above.
I have to admit, it’s a fairly impressive piece of software actually. It won’t hurt to give it a try.
It *WILL* hurt.
Your eyes.
One of the best reads ever. It’s so sad that it takes a mentally sick person to actually think outside the box. Not just trying out new ideas, but also defending design choices that go against what we take for granted as the way things should be implemented.
Today, 9 out of 10 alternative OSes just seem to want to be UNIX.
That’s probably because the UNIX design has stood the test of time.
Or just because people already have a bunch of docs/sources to develop against, instead of starting completely from scratch, which requires an insane amount of work.
More like UNIX is what people get used to when they are at university, so instead of expanding their minds they are molded into the mindsets of ’60s UNIX developers.
So when they see a problem, or a solution to a problem, they go ahead and implement it their own way while keeping the familiar rest.
The reason UNIX is still around is because it’s expandable, not because it’s a “good design”. The computer systems it was developed for have long become obsolete. A wholly new design would make a lot of sense. But no, because “this is what we are used to.”
Maybe it has always taken what we now describe as the mentally sick to think outside the box. I’m not disputing that this man is ill, but I agree with the article’s author that we should try and approach these people’s eccentricities with good faith and tolerance, as it clearly allows the contribution of unique ideas to society. While in most ways modern society is more tolerant, I’m not sure we do very well with mental illness.
Caravaggio, Churchill, Beethoven — which of them would be allowed to achieve what they did had they lived today? Newton wouldn’t even be allowed to hold a pencil without serious chemical sedation.
The reason is that you get all the open-source POSIX-compatible software for free. If you have a novel OS without a POSIX or Windows emulation layer, you must write all your software from scratch.
kjn9,
This.
The truth is POSIX/GNU/etc gets us tons of software for free – software which a novel non-POSIX OS will never have.
Innovative and novel implementations are constantly forced to contend with their lack of critical mass. Say we build a great desktop OS; with no apps it won’t be very useful. We can have a novel graphics stack, but then we’ll lose out on all the familiar apps like Gimp/Firefox/Audacity/games/word processors/etc and even niche apps like bitcoin wallets.
I’m not sure if Linus Torvalds considered any of this when he started his UNIX clone project at school. But just think: if it had been a novel OS instead of a clone, it’s very likely that another UNIX (such as one of the BSDs) would have taken the limelight, and none of us would be aware of Linus’s school project.