Can’t get enough of porting old software? How about getting Doom ported to and running on an old version of AIX for PowerPC?
You know what every computer needs? DOOM. Do you know what I couldn’t find? DOOM for the IBM RS/6000, but that’s not surprising. These machines were never meant for gaming, but that doesn’t mean you can’t do it. If you like pain, anyway.
[…] In this extra-long NCommander special, we’re going to explore AIX and discuss the RS/6000 Model 150 43p I’m running it on. Along the way, I’ll explore the trouble in getting bash to build, getting neofetch to work, then the battle for high colors, SDL, and more.
This video is over an hour long, but incredibly detailed and lovingly obscure.
Obviously it was a nightmare. It’s DOOM we are talking about.
Seriously though, this is why people and businesses kind of hated Unixes and jumped to Windows NT the moment the first capable x86 CPUs came along (that was with the Pentium III), and then completely abandoned Unixes when AMD released the first x86-64 CPUs. Anything more than plain CLI C or C++ code and unaccelerated X11 code was incompatible between systems at the source-code level. Also, you couldn’t assume more than 256 colors on X11. So each Unix had its own application ecosystem, because trying as a developer to cover all Unixes (even if you were willing to ship multiple binaries, which you had to due to the different RISC ISAs) meant devoting significant time to resolving subtle differences between the different Unixes and supporting a half-dozen sound systems. Even your own source code was locked to a certain Unix variant, because running it on another Unix required lots of debugging and fiddling. Instead, all Windows NT boxes were (are) the same.
Even your own source code = Even your own internal source code
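To make the 256-colors point concrete, here is a minimal Xlib sketch (my own illustration, not anything from the video or the comment above) of the depth probing an X11 port of that era had to do before picking a rendering path:

/* Probe the X display: on many mid-90s workstations the default visual was
 * an 8-bit PseudoColor one, i.e. a 256-entry palette the program had to manage. */
#include <X11/Xlib.h>
#include <X11/Xutil.h>
#include <stdio.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);            /* honours $DISPLAY */
    if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

    int screen = DefaultScreen(dpy);
    printf("default depth: %d bits\n", DefaultDepth(dpy, screen));

    XVisualInfo vinfo;
    if (XMatchVisualInfo(dpy, screen, 8, PseudoColor, &vinfo)) {
        /* 256-colour path: allocate a private colormap and own the palette */
        Colormap cmap = XCreateColormap(dpy, RootWindow(dpy, screen),
                                        vinfo.visual, AllocAll);
        printf("8-bit PseudoColor available: private colormap it is\n");
        XFreeColormap(dpy, cmap);
    } else {
        printf("no 8-bit PseudoColor visual; TrueColor path\n");
    }

    XCloseDisplay(dpy);
    return 0;
}

The same source compiled on any Unix with an X11 SDK; the trouble described above started as soon as you needed sound or anything faster than XPutImage.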
When Windows NT 4 came out, it supported 3 CPU ISAs (x86, Alpha, MIPS), with PowerPC added soon after.
Add to that incompatibilities with Win95 and the still-popular Win3.x, and it was difficult to write even source-portable Windows code that would target all versions of Windows at the time.
People and businesses “hated” UNIX because it cost a pretty penny. Workstation class machines were expensive, but significantly faster than contemporary x86 machines, at least until the early 2000s.
FWIW, around this time, the software house I worked for supported:
– AIX (3.x, 4.x)
– SunOS (4, 5)
– HP-UX (9, 10, 11)
– Linux
– Windows NT 4
with an in-house port to Irix (never released).
And I can tell you which of those was the most troublesome to support.
Easily, Windows.
Sure, we had some wrinkles across the various UNIX, but in the grand scheme of things, we had less trouble maintaining the UNIX side than we did reliably developing and maintaining the Windows side.
If I recall correctly, Windows NT 4 was a contemporary of Windows 95, while NT 3.51 was running alongside Windows 3.1. This was the era of 3 different 32-bit Windows APIs (Win95, Win32, and Win32s).
Anyway, yes, UNIX was much better. I remember developing on SunOS and IRIX at the same time, and being able to remote OpenGL windows through X11. (Solaris machines had no local OpenGL acceleration.) GCC, libGL and X11 were much easier than Visual C++ plus whatever they had back in the day (before DirectX).
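For what it’s worth, the “remote OpenGL through X11” trick was GLX indirect rendering, where GL commands ride the X protocol to the display machine. A rough sketch (illustrative only, not the actual code from back then) of how a program could tell which kind of context it got:

/* Create a GLX context and report whether rendering is direct (local
 * hardware) or indirect (commands sent over the X connection, e.g. when
 * DISPLAY points at another workstation). */
#include <GL/glx.h>
#include <stdio.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);            /* e.g. DISPLAY=sgi-box:0 */
    if (!dpy) return 1;

    int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, None };
    XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
    if (!vi) { fprintf(stderr, "no suitable GLX visual\n"); return 1; }

    GLXContext ctx = glXCreateContext(dpy, vi, NULL, True);
    if (!ctx) { fprintf(stderr, "could not create GLX context\n"); return 1; }

    printf("rendering is %s\n",
           glXIsDirect(dpy, ctx) ? "direct (local)" : "indirect (over X11)");

    glXDestroyContext(dpy, ctx);
    XCloseDisplay(dpy);
    return 0;
}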
But eventually the cheaper system won. The “good enough” almost always wins. A compatibility layer can be written once; even at high effort, it’s a fixed cost.
sukru,
I’ve always felt that MS C and assembler were subpar; I hated using them. They didn’t compete with the quality of watcom, borland and even the free GNU based tools. Still, given their monopoly, MS was able to crush commercial competitors into oblivion. It’s really not fair that the odds are slanted so heavily towards dominant companies. I still hate MSVC and MASM even today, it’s a hack job behind a shiny IDE.
Alfman,
I agree. VC++ was atrocious in language compatibility. And they kept being subpar, maybe until 2010 or so.
That being said, they were “slow and steady”, and eventually won not because they were a monopoly (that had an effect), but, I would argue, because the competition lost their focus.
Borland had a great C++ compiler for DOS. But when they moved to Windows, they wanted to build their own class library (some sort of “Turbo Vision”). And when that failed, they tried to build a Delphi clone in C++. Neither of them worked really well.
I don’t remember Watcom ever having a Windows IDE. And GCC did not much care about its Windows setup. (For a very long time, the best way to use it was getting Cygwin, then MinGW.) And again, before Eclipse came along, there was no good open-source C++ IDE. (I don’t count KDevelop or Qt Designer.)
This was a common theme back then. Novell had a great NetWare 4.0 product for DOS. Then they decided to go with a terrible Java implementation, and leave the market to Windows NT, or even Windows 95 “Workgroups” for SMBs.
Wordperfect was DOS only in the Windows 3.1 era, while Microsoft combined their separate Word and Excel programs under the Office umbrella.
I could probably list many similar stories for failed competitors, while the “good enough” Microsoft solution “just worked”. And they were (usually) much cheaper, too.
And Visual Studio is still cheaper than Borland C++ Builder:
https://www.embarcadero.com/app-development-tools-store/cbuilder
https://www.microsoft.com/en-us/d/visual-studio-professional-2022/dg7gmgf0d3sj
(Though both of them are really pushing you for a yearly subscription instead).
Anyway, sorry for disagreeing on the finer point. However I still agree that Microsoft did not bring the best dev tools back in the day.
sukru,
If MS hadn’t had a monopoly, though, the odds of their inferior products like MSVC outlasting the competition would have been very low. Being a monopoly was everything for microsoft, and microsoft themselves knew it.
Turbo vision was a DOS library, a fairly popular one too. I think you are referring to OWL, which wasn’t popular.
https://en.wikipedia.org/wiki/Object_Windows_Library
All competitors were working from a disadvantaged position on windows next to microsoft, thanks to its “insider” connections. Microsoft gave its own product developers preferential access to technical information. Microsoft’s MFC became the de facto standard for windows. Related technologies like activex components would use MFC instead of OWL.
I think it was also during this period they used their low-level access to windows to make last-minute changes before windows releases so that competing products would break on launch day. Also microsoft did everything they could to interfere with competitor business operations.
https://techrights.org/2009/09/14/ms-admits-draining-to-destroy-borland/
While many of their actions would catch the eye of antitrust investigators, it would be very late, after competitors (like lotus, wordperfect, borland, netscape, novell, stacker, sun, etc) were already severed from the market. The more you study the antitrust cases the more you see just how much MS deliberately used monopoly power to crush competitors behind the scenes. I think monopoly power was a deciding factor in microsoft’s success over competition time and time again.
Yeah, we may have to disagree. I had a good link about this topic written by a microsoft evangelist working at microsoft, but I cannot find it now. It was really interesting because it was written from the perspective of a microsoft insider.
NT 3.51 was released in May 1995, Windows 95 in August 1995, and NT 4 in August 1996. Note that 3.51 included the first version of the Windows 95 common controls.
I think, though, Christian’s point is that the install base of Windows 3.x meant developers having to support 16-bit and 32-bit outputs from a single codebase, which was always going to be challenging. That was exacerbated by having new/different UI controls on 32-bit systems, so there was significantly different logic in addition to different code generation. It was probably the worst time for Windows source compatibility, although it was also short-lived, since 3.1x died fairly quickly (for developers).
I think you mean Win16, not Win95
Win16 had its origins in the 16-bit era, hence the name. It was the API used by Windows 1.0-3.1.
Win32s was a subset (hence the s) of Win32, designed primarily as a compatibility layer for Windows 3.1 and 3.11, to allow more modern 32-bit programs to run on the otherwise 16-bit Windows 3.1 and 3.11. This is what things like Internet Explorer used on Win3.1.
Win32 was the “modern” API used by Windows NT, and later, Windows 9x/ME.
Win64 is really just a stretched Win32. This is the modern Windows API, but is largely identical to the earlier Win32, except the bit length is larger. This is the API found on all 64-bit versions of Windows, including the Itanium releases.
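A small illustration of “stretched Win32” (my own example, not from the comment above): Win64 uses the LLP64 data model, so the only sizes that change from Win32 are pointers and pointer-sized integers, which is why most Win32 code recompiled cleanly unless it stuffed pointers into longs.

#include <stdio.h>

int main(void)
{
    printf("sizeof(int)    = %u\n", (unsigned)sizeof(int));    /* 4 on Win32 and Win64 */
    printf("sizeof(long)   = %u\n", (unsigned)sizeof(long));   /* 4 on both: LLP64     */
    printf("sizeof(void *) = %u\n", (unsigned)sizeof(void *)); /* 4 on Win32, 8 on Win64 */
    return 0;
}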
I might not have remembered the naming right. Basically what I remember was:
Win32 for Windows 3.1 (Win32s)
Win32 for Windows 95
Win32 for Windows NT (NTAPI?)
All of which were very similar, but slightly incompatible in different ways.
* and then there was Windows CE
@sukru
Win16 was for Windows 3
Win32s was for Windows 3.1 (s=subset)
Win32c was for Windows95 (c=compatibility)
Win32 was for Windows NT.
WinCE was obviously for Windows CE
I think by NT4 and Windows98, win32 was pretty stable and compatible across windows
javiercero1,
I remember being unable to run random applications on Windows NT 4. It has been a while, so I can’t list them, but vaguely remember most games had issues.
Windows 95 lacked some Unicode APIs; again, I can’t recall specifics. But most Win32 apps ran on Windows 95/98. Since it was the consumer-facing version there was less of a concern.
And there was deliberate blocking of some software; they wanted to charge a premium for licenses on NT. (Norton Antivirus? Again, it has been a while.)
All these were fixed by Windows XP.
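On the Unicode point above: on Windows 9x most of the wide-character (“W”) Win32 entry points were present but were non-functional stubs, so portable code built against the “A” functions or fell back at runtime. A rough sketch of that pattern (the API names are real Win32; the fallback strategy here is just illustrative):

#include <windows.h>

int main(void)
{
    /* Wide-character attempt: works on the NT line. */
    HANDLE h = CreateFileW(L"readme.txt", GENERIC_READ, FILE_SHARE_READ,
                           NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);

    if (h == INVALID_HANDLE_VALUE &&
        GetLastError() == ERROR_CALL_NOT_IMPLEMENTED) {
        /* Windows 9x path: the W stub failed, retry the ANSI entry point. */
        h = CreateFileA("readme.txt", GENERIC_READ, FILE_SHARE_READ,
                        NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
    }

    if (h != INVALID_HANDLE_VALUE)
        CloseHandle(h);
    return 0;
}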
@sukru
I think games were sketchy because of other dependencies on DOS stuff; I think Windows 95 was still built on top of DOS, and a lot of the gaming catalog was still running on weird DOS-based stuff underneath.
But for straight Win32 applications seemed to work just fine.
I think the main issue was that Windows NT had heftier HW requirements and was more expensive. So almost nobody used NT as a gaming platform.
javiercero1,
While Win95 could run DOS games too, sukru is right that Windows NT had trouble even with some native Windows games. If I had to guess, it may have been the state of DirectX and drivers on NT. NT’s graphics drivers may not have received as much attention simply because it was a business OS.
You bet NT4 had trouble with most games.
I ran NT4 as soon as it arrived and through its lifecycle; from (possibly bad) memory it stayed at DX3. There were DX5 libraries available which we utilised (likely unofficially, I think). We were most definitely stuck with simple Win32 windowed games and some OpenGL 3D games. We had many GLQuake frag fests utilising university NT4 hardware in the day.
christian,
I agree, the Windows NT and Windows 9x/ME versions had compatibility issues. Obviously they had different drivers and permissions, and a lot of games could not run on the NT branch at the time.
I sort of recall the university lab sun workstations feeling slower than contemporary x86 machines, but I have no idea what speed CPUs they used. Running X windows probably didn’t help. Also all file activity was on a shared drive and my guess is it could have been a big bottleneck.
This is what they were using, and they hadn’t upgraded them during my stay at university:
https://en.wikipedia.org/wiki/Ultra_5/10
Here’s an interesting comparison between ultrasparc III and P4 with benchmarks at the bottom:
https://www.silkstream.net/blog/2006/08/comparison-ultrasparc-III-cu-pentium-4-processors.html
SPEC’s website still has a fairly comprehensive database of old benchmarks for comparing older CPUs, and it does seem like intel was able to grow its lead in the 2000s.
https://www.spec.org/cgi-bin/osgresults
Yes but why did it cost a pretty penny? The answer is hardware lock-in, same reason Macs cost a pretty penny (though less than the Unix workstations of old). It was simply impossible to know how much you were (are) paying for hardware and how much for software. Microsoft had to be upfront about how much their software costs, and it had to be kept at a reasonable price otherwise piracy in SMBs and other parts of the world would skyrocket.
Partly. Probably doesn’t apply to vendors such as Sun, though, as there were many Solaris/SPARC vendors. People bought the UNIX machines in spite of the high price for a reason. They could just do stuff contemporary Windows based PCs couldn’t.
Volumes were low, and margins were high. PCs running Windows benefited from the economies of scale of the entire PC ecosystem, and x86 benefited from the investment such high volumes afforded.
In the end, Linux cannibalized the UNIX market as much as Windows did, on commodity x86 hardware, for the best of both worlds.
SPARC was an IEEE standard and Sun allowed third parties to make their own SPARC implementations. Unfortunately they were never able to tap into large enough economies of scale, and there were a few original architectural decisions that came to bite SPARC in the butt later on.
I think at some point there were at least 9 different competing RISC architectures in the market at the same time: SPARC, MIPS, POWER, PA-RISC, AXP, ARM, Clipper, i860, M88K
(I think there were even more concurrent types of Unix in the market place at the same time. )
Who would have guessed that x86 would have outlived and outperformed most of them…
@christian and @javiercero1
SPARC was an IEEE standard, but the Sun workstations weren’t. That’s why people kept paying for Sun workstations. And for most of its life Solaris as an OS was proprietary, so Sun dictated what it would support, at least without too much pain on the user side. There was a good article about this on sparcproductdirectory, but the site is down.
Desktop workloads are not mission critical and don’t need reliable hardware or software, so x86 and Windows could undercut these Unix workstation offerings. You can’t possibly compare Unix workstation quality to x86 hardware quality and think they should be the same price. The software quality of Windows was inferior too. Are you forgetting all the blue screens we had to live through for so many years?
The last thing Unix vendors wanted was application portability. They focused instead on the whole “open” buzzword of the 90s, which was more about protocols and interfaces making it easier for application A running on system X to talk to application B running on system Y.
I think the only unix vendor who managed to make portability work was NeXT which had an elegant solution in terms of APIs and “fat binaries” (MS should have copied a lot of those concepts when doing NT). Ironically, NeXTStep was the least “unix” of the unices.
There were so many competing APIs, and even within the same API there were incompatible proprietary extensions. In the end, other than for applications with hefty licensing costs, there was just no value proposition in developing for commercial Unix targets by the mid 90s. NT was good enough, and x86 was good enough. So for less development effort you could get better returns; it was a no-brainer.
I think the issue is that Microsoft wasn’t as bad, and commercial Unix wasn’t as good, as people remember.
POSIX exists for a reason, but it wasn’t perfect. I do remember writing a pthread application (POSIX threads) in the late 90s that compiled on Solaris and Linux without much (if any) define nonsense. Shell scripting was harder, since the standard commands often took different options.
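As a concrete (invented, but period-plausible) example of the kind of pthreads code that built unchanged on both Solaris and Linux:

#include <pthread.h>
#include <stdio.h>

static void *worker(void *arg)
{
    printf("hello from thread %d\n", *(int *)arg);
    return NULL;
}

int main(void)
{
    pthread_t tid[4];
    int ids[4];
    int i;

    /* Spawn four workers, then wait for all of them. */
    for (i = 0; i < 4; i++) {
        ids[i] = i;
        pthread_create(&tid[i], NULL, worker, &ids[i]);
    }
    for (i = 0; i < 4; i++)
        pthread_join(tid[i], NULL);

    return 0;
}

The per-platform difference was usually just the compiler invocation (something like cc -mt with -lpthread on Solaris, gcc -pthread on Linux); the source itself needed no #ifdefs.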
I spent a bunch of time maintaining FOSS installs on IRIX, which *generally* worked pretty well once you got the hang of the right configure/make options. I do remember it getting harder towards the end as Linux took off, since people didn’t bother with anything else.
We used to play Doom on the Sun workstations at school. Porting it to AIX is just nuts, AIX was never “with it” when it came to fun. I want to say I had Quake running on Solaris 8 x86 (back when you could get a media kit for $75), but that might have been Slackware.