GNU Make 4.4 is here, and it has some interesting – and sad – news for some of the old operating systems we still cover on OSAlert. Sadly, support for OS/2 (EMX), AmigaOS, Xenix, and Cray will be dropped from the next release of Make. Now, I’m not entirely sure just how many users of these operating systems even use Make, but for those of you that do – tough cookie right here.
It was nearly a decade ago now, but an enthusiast once ported my open source app to OS/2 without my prior knowledge, so there definitely used to be a community yearning for modern apps on OS/2, at least.
Cool. What was it?
It is possible that I am remembering this wrong, but I do not think this means they are dropping support for OS/2.
EMX is a compatibility layer that allows UNIX software to be compiled for OS/2; it lets you use things like fork() on OS/2. It has long been unmaintained, I think, with the last supported GCC version being something under 5 (not researched). My read on the announcement is that it is EMX that they are dropping support for.
If you are writing OS/2 native programs, the GNU toolchain is available with, I think, GCC 9 or even 11. I expect that GNU Make still supports this.
I could be wrong. Perhaps somebody with more knowledge can comment.
Apologies for all the typos. It was a quick submission from a phone keyboard. I should have proofread better.
The audience of people writing OS/2 native programs nowadays must be in the dozens at best….
I am curious actually.
Warpstock seems to still be ticking somehow:
http://www.warpstock.org/
Also, there does still seem to be software trickling out. It is kind of amazing actually.
https://ecsoft2.org/
http://os2news.warpstock.org/
It wasn’t until I did a deep dive into GNU Autotools that I came to appreciate the whole “./configure && make && make install” toolchain.
…but the m4 language still scares me.
drcouzelis,
I’ve got really mixed feelings about it. I like the idea of simple tools that do one job and do it well, but I often find that Autotools is complex and does its job poorly. And I hate how slow it is due to extremely mundane tests for things that honestly ought to be handled portably at the source code level. Configuration should be done in the blink of an eye, but it can take minutes. I wonder how many billions of times autotools has run tests worldwide for esoteric platform behavior that only ever happened on an obsolete platform that hardly anyone ever used. The worst part is how many times you’ve got to run it, because it outputs one error at a time.
./configure
Can’t build because A is missing, manually go download/build/install A.
./configure
Can’t build because B is missing, manually go download/build/install B.
./configure
Can’t build because C is missing, manually go download/build/install C.
./configure
Wrong version of C, …
Sure, it technically works, but it could be so much better if it were a single pass that clearly reported what it was looking for and what it found…
Furthermore, you should be able to pipe this into a package installer:
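Something along these lines, where the --missing-packages flag and the one-package-per-line output are purely hypothetical, just to illustrate the idea:
[code]
# Hypothetical: configure lists every unmet dependency as a distro package name,
# one per line, instead of failing on the first missing one...
./configure --missing-packages | xargs sudo apt-get install -y
# ...so a single pass gets you everything you need to build.
[/code]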
This is just off the cuff, and I know it gets complex because package names aren’t consistent across distros, but realistically solving autoconf issues like these would eliminate so much wasted time and frustration over a developer’s lifetime.
https://queue.acm.org/detail.cfm?id=2349257
At least there are some alternatives like CMake, which are better, although still not that common, and a lot of the software I depend on still uses Automake. In my Linux distro I end up scripting the builds. While configure is still slow and writing scripts takes time, at least I rarely have to touch them again. They’ll often work as is even after updating the software (unless dependencies break upstream, but hopefully that won’t happen often).
I honestly think you’re doing it wrong if you’re trying to figure out dependencies at configure time. Ideally, dependencies should be listed in a README or other documentation file. The problem I’ve run into is that some projects do not provide appropriate documentation. This is especially confusing for software that has “optional” dependencies.
I guess everyone’s got ideas on how things can be better, but they really offer no workable solutions. Your idea of piping the output of configure to a package manager is unimplementable for various reasons, including the one you mentioned. Additionally, who is going to write the pipe support into the package managers? As far as I know, this is not a feature that any of them have.
Oh, please… CMake is a mess. It solves none of the problems you outlined above and introduces a bunch of regressions. For one thing, you need to learn a new obscure scripting language that is very similar to M4. Additionally, you cannot produce freestanding build scripts with CMake, so for systems it has not been ported to, tough luck. You can’t even configure a project on a supported platform and compile it on the unsupported platform, because the Makefiles generated by CMake call the cmake tool. Autotools generates freestanding Bourne shell scripts and POSIX makefiles.
There’s also no easy way to find out what the configure options are. At least Autotools provides ./configure --help. While Autotools is not perfect, it’s definitely better than CMake.
Personally, I like the following solution:
(1) Small projects should “roll their own” configure script (like musl libc does) along with a portable Makefile; a rough sketch of what I mean follows after this list. Alternatively, no configure script needs to be provided if there’s nothing to configure.
(2) Medium-sized projects should use Autotools.
(3) Large projects should use “make menuconfig” (aka, kconfig).
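To be clear about (1), the hand-rolled script does not need to be anything elaborate. A stripped-down sketch of the idea (the variables probed and the file names are just examples, not anything prescriptive):
[code]
#!/bin/sh
# Minimal hand-rolled configure: probe the toolchain directly and record the
# results in config.mak for a portable Makefile to include.
CC=${CC:-cc}
PREFIX=${PREFIX:-/usr/local}

printf 'checking whether %s works... ' "$CC"
echo 'int main(void){return 0;}' > conftest.c
if $CC conftest.c -o conftest 2>/dev/null; then
    echo yes
else
    echo no
    rm -f conftest conftest.c
    exit 1
fi
rm -f conftest conftest.c

cat > config.mak <<EOF
CC = $CC
PREFIX = $PREFIX
EOF
echo 'wrote config.mak'
[/code]
The whole thing stays readable, runs instantly, and only checks what the project actually needs.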
teco.sb,
So much is not documented and/or the documentation isn’t up to date. I agree with you that, as is, the configuration scripts are terribly inefficient and uninformative about bad dependencies. But it needn’t be this way; with better tooling this ought to work extremely consistently and well. The ability to automatically query a project for dependencies in a standard way seems absolutely awesome to me. Due to the functional overlap, it logically makes sense to have it be part of the configuration step, such that one command can generate a software package, dependencies and all.
I agree when it comes to the tools of today, but I’m talking about how it should work. This would make installing from source a cinch even for non-developers. I think we rely too much on binaries as a crutch because building from source is too laborious with traditional autoconf tools.
IMHO compiling all your software from source may be computationally taxing for your CPU, but it should not be mentally taxing for your brain.
I’ve had cmake dependency issues as well, but I wasn’t necessarily prescribing cmake so much as pointing out that many project authors are equally frustrated by autotools. There are legitimate problems that need to be addressed, but I think standing still and not fixing things at all is worse. Let’s be objective in recognizing the deficiencies of autotools so that future generations don’t have to continue suffering from it as much as we have.
But that doesn’t solve any of the problems we’re facing now. Continuing to use autotools comes at great productivity costs for millions of developers every year. Even personally I’ve lost so much time to the tool’s inefficiencies, time I will never get back. I think it’s a problem worth fixing so that future generations don’t have to go through the same frustrations, but changing de facto standards is extremely hard. It’s the same reason we continue to use C after all these years. Legacy tools are incorporated into everything even though they can and do hold back progress.
[q]So much is not documented and/or the documentation isn’t up to date. I agree with you that, as is, the configuration scripts are terribly inefficient and uninformative about bad dependencies. But it needn’t be this way; with better tooling this ought to work extremely consistently and well. The ability to automatically query a project for dependencies in a standard way seems absolutely awesome to me. Due to the functional overlap, it logically makes sense to have it be part of the configuration step, such that one command can generate a software package, dependencies and all.[/q]
What you’re describing might be possible with some languages, but certainly not with C (or C++). With languages where you have things like [code]from packageA import *[/code], such as Python, you could theoretically scrape every file and build a dependency list, I guess. I still think this does not replace good documentation.
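And by “scrape” I mean nothing more sophisticated than something like this (a rough shell sketch; it only catches imports written at the start of a line):
[code]
# Crude dependency survey: list the top-level modules a Python project imports.
grep -rhoE '^(from|import)[[:space:]]+[A-Za-z_][A-Za-z0-9_.]*' --include='*.py' . \
    | awk '{print $2}' | cut -d. -f1 | sort -u
[/code]
You still have to map module names to actual packages by hand, which is exactly where the documentation comes in.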
Like you, I maintain my own personal GNU/Linux distribution, so I think I understand where you are coming from, but I still disagree with the overall sentiment. Every piece of code has to run on a processor, and processors have different instruction sets. There’s no way around that. Unless you’re advocating for the JVM or a similar virtual machine, which comes with its own drawbacks, including a monoculture.
Let’s not forget why Autotools came about in the first place and the problems it was trying to solve: “UNIX” was a loosely interpreted de facto industry standard, and every vendor’s implementation differed in the details.
Since then, POSIX has tightened the definition, but a lot of it is still left to interpretation. Even now, things aren’t as rosy as they seem. I’m attempting to move my distro to musl-libc, but it’s close to impossible without massive patching. At first glance, software written for any POSIX system should compile against musl, but that’s not the case. A lot of software, including GCC, specifically probes for things like __FreeBSD__ or __GLIBC__, instead of using the Autotools facilities to check for compatibility.
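The portable way is to probe for the feature itself rather than for a particular libc. Roughly what an Autoconf function check boils down to, shown here as a simplified hand-written sketch (not the code Autoconf actually generates), using strlcpy() as the example:
[code]
# Probe for strlcpy() by compiling and linking a tiny test program,
# instead of inferring its presence from __GLIBC__ or __FreeBSD__.
cat > conftest.c <<'EOF'
#include <string.h>
int main(void) { char buf[8]; strlcpy(buf, "test", sizeof buf); return 0; }
EOF
if ${CC:-cc} conftest.c -o conftest 2>/dev/null; then
    echo '#define HAVE_STRLCPY 1' >> config.h
fi
rm -f conftest conftest.c
[/code]
Code that keys off HAVE_STRLCPY behaves the same on glibc, musl, or the BSDs, which is the whole point of the exercise.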
Maybe Autotools could provide a macro that allows you to define which platforms you intend to support and explicitly check for their behavior, instead of attempting to check every behavior under the sun, including behavior of OSes that have been dead for over a decade. I will agree that this is a drawback.
I think developers overestimate the value of their time. One of the things that I still remember vividly was some developers’ insistence that Clang/LLVM was a better compiler because it compiled code faster than GCC. The tradeoff, of course, was that the code wasn’t as optimized and was, on average, 10% larger and slower than GCC’s output. So everyone else had to pay the price of larger and slower code because some developers wanted to save a few minutes of their time.
By the way, the same thing is now happening with compression. A lot of distributions seem to be moving to Zstandard because it can compress and decompress at much higher speeds. The trade-off, as expected, is a lower compression ratio compared to LZMA. So we all have to individually pay the price of longer and larger downloads because developers don’t want to spend the extra few seconds properly compressing their software.
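The difference is easy to measure on any release tarball you have lying around (the file name here is just an example; exact sizes and times depend on the input):
[code]
# Compress the same input both ways and compare; xz (LZMA) usually produces
# the smaller file, while zstd trades some ratio for speed, especially on decompression.
xz   -9  -k some-release.tar
zstd -19 -k some-release.tar
ls -l some-release.tar.xz some-release.tar.zst
[/code]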
teco.sb,
While that could be another gripe against C, it’s not actually what I meant. Since we’re already creating this metadata by hand anyway for configure to work, the data would be more accessible, interoperable, and powerful with better tooling.
I’m having trouble understanding where you are coming from. While I have to agree with you about there being different processors/instruction sets/etc., to me this is another good reason to build from source rather than distribute binaries hard-coded for some arbitrary x86 or ARM generation. If you can build it yourself targeting your exact hardware, that would be optimal. Ideally building from source should be trivial to do, but as it stands the limitations and manual involvement needed are extremely problematic for further automation, especially around dependencies.
I agree, so much software is based on legacy cruft. And while I think many of us would benefit from fixing it, the reality is that it’s not going away and neither is autotools. So much of our existing software is already built using legacy languages & tooling that it’s extremely difficult to change & fix now.
I don’t really recall a discussion like this. I actually use GCC myself (except on Windows, where I use VC).
Good thing for those affected that Make is an open source project, meaning someone can simply fork it and add the necessary OS support.
As I understand the announcement, the platforms involved are still supported, just deprecated. It seems it is effectively a final warning for someone to step up and maintain them actively before the next release. If someone really wants to keep them going, and it doesn’t cause too much clash with other planned work, it seems they’ll be able to try to rescue them from deprecation.