Miguel de Icaza: “To sum up: (a) First dimension: things change too quickly, breaking both open source and proprietary software alike; (b) incompatibility across Linux distributions. This killed the ecosystem for third party developers trying to target Linux on the desktop. You would try once, do your best effort to support the ‘top’ distro or if you were feeling generous ‘the top three’ distros. Only to find out that your software no longer worked six months later. Supporting Linux on the desktop became a burden for independent developers.” Mac OS X came along to scoop up the Linux defectors.
I agree with him.
Indeed.
CS in academia has been very UNIX-centric since the late 1970s. Around 2002, I began to notice Macs appearing among hard-core CS majors in college. By 2010, even the faculty had converted over. The people who weren’t using Macs were Windows users who were happy to use Cygwin or to ssh into Linux machines. Linux was practically gone from the academic desktop.
I can empathize with Miguel’s frustrations with Linux audio, because that was precisely what caused me to give up on Linux.
Once, after a system update, audio failed. I’d been using Linux casually until then, never having dug into the source code, so I figured I’d try it at least once and see how painful it’d be. In the sound card driver, I discovered that several of the boolean settings were backwards — in other words, 0 meant true and 1 meant false! Not too bad — an easy fix. Smugly, I thought that all bugs indeed were shallow.
Six months later, it broke again after a system update. The same fix no longer worked.
Screw this, I said, I’m going back to Windows. And since then, I’ve never lost audio after a Windows Update.
Vote parent down please, broken updates are a non-existent problem.
Brothers of Linux, we need to keep modding down these posts about broken updates to show they don't exist. I've modded down over 10 thousand of them in my fight against the Evil Ones. My Linux Forum Defense Forces Golden Badge of Suppression should be in the mail.
Audio on Linux is a bloody mess indeed. The same can be said about their graphics, but at least there’s Wayland to look forward to, which will make huge improvements to Linux in general.
But on the audio side, Jesus! ALSA, OSS, JACK, PulseAudio, Phonon, Gstreamer, KLANG…I mean, what the hell is going on here?!? Developers, let’s get some consensus and put the focus on only one major audio system.
I think he is full of it.
Run your old OSX programs on Mountain Lion?
No, not possible. Lion killed official support for PPC binaries. He should know what he is talking about.
And anyways, he switches OS because he loves his phone so much. That is just pathetic IMO.
Define "old". OSX for Intel was released in 2005, so if your applications are more than seven years old they aren't supported any more. Of course not everything written for an old version of OSX will run on ML, but compared to Linux, backward compatibility is pretty good, which is his point.
“And anyways, he switches OS because he loves his phone so much. That is just pathetic IMO.”
That might be because the dev tools for the iPhone only run on OSX.
Yadda, yadda. He is talking about Photoshop from 2001 and that won’t work on Mountain Lion. Full of shit I say.
He gives Photoshop from 2001 as an example of a Windows application that still works on Windows 8.
“Meanwhile, you can still run the 2001 Photoshop that came when XP was launched on Windows 8. And you can still run your old OSX apps on Mountain Lion.”
64-bit Windows doesn't support DOS or 16-bit Windows 3.x programs either.
Linux is source-compatible with pretty much any commandline program from the 1990’s.
And many original Unix programs.
Modern Linux programs do have some bloat:
http://www.youtube.com/watch?v=Nbv9L-WIu0s
When have you ever been able to run a different arch binary on Linux without recompiling? Never!
Removing PPC/Rosetta support more than 5 years after a complete architecture change is industry standard practice. Almost every company follows a 5 year EOL practice.
Your point is dubious at best and completely inaccurate.
Wrong! QEMU has the "qemu-user-foo" executables on Linux, which let you run Linux binaries compiled for architecture foo, no matter what the host architecture is.
That is a helluva stretch. Best case, you’re being disingenuous, but I think you’re actually trolling. I mean, besides OSX, what other OS runs binaries from a different CPU architecture natively? HP-UX is the only one I can think of. NT4 on Alpha did, also.
Uhhh 7 years for a Mac is oooolllld, and Apple did go through an entire arch switch. I still run programs from the late 90s on my Win 7 x64, and from even earlier with DOSBox, which now comes packaged and preconfigured from places like GOG, no problem.
The problem is Linux guys think everyone should be on the bleeding edge, and there is a reason they call it BLEEDING edge, because it will be a bleeding pain in the butt! I have customers running 7-8-9 year old software on still supported versions of Windows, no problem.
Whether the Linux guys like it or not you NEED the proprietary software because it fills niches that Linux devs are never gonna have enough experience with to support, things like medical billing and electrical supply and salvage yards and all these little niches that small firms write software for, but they can’t support you because everyone from Linus on up is constantly futzing and fiddling so things that work in foo are broke in foo+1 and won’t work again until foo+5 and that’s if the software devs fix it, otherwise you’re just stuck.
You have to give folks what they want or they go somewhere else, period. They will not bend to your will, you have to bend to theirs. Again, like it or not, OSX came along and gave those that prefer the Unix way of doing things a well supported platform where third parties could write software and still sell it a year later, and for those that need incredibly long backwards compatibility there is always Windows.
Or, use FreeBSD. A couple people are running FreeBSD 1.1.5 binaries on FreeBSD 9.0. That’s over 10 years of backward compatibility.
Oh I’m quick to give credit to the BSD guys, top notch they are. They also have a hardware ABI (Like Windows, OSX, Solaris, etc) so the drivers don’t break like they do in Linux.
The problem is we just have never had someone come along and really promote BSD like it deserved. If Shuttleworth had spent those millions on BSD instead of Linux we might see a true third way right now; instead we just got the same broken guts with Canonical pretty on top.
And the sad part? Retailers like me that have been saying these exact same things for years have been attacked and called every filthy name in the book for daring to point out the emperor is bare assed.
I mean do you think we small shops LIKE shelling out so much of our potential profits to MSFT? Think they give us even a teeny tiny discount? Hell they've been royally screwing system builders FOR YEARS, yet every time guys like me would test the 5 or 6 distros that the community deemed "ready for the desktop!" and point out the same stuff he's pointing out, we'd get told we were "dirty M$ shill poo poo heads!" and told we were liars for reporting what we were seeing with our own two eyes: the broken drivers, updates borking systems, the same problems year after year after year.
I had tried for 7 solid years, from 2003 to 2010, to find just one that could take the place of Windows Home, but none did. Now when someone claims "Distro X is ready for the desktop!" I'll simply download whatever version they offered 3 years ago and update it to current. Since Windows has 10 years of support that's not really a fair test, but too many distros have only come out in the past few years to make a fairer one possible. What do I find? Broken drivers, updates breaking stuff, alpha-quality software, googling for fixes. The same problems I ran into in 2003.
I hope it gets better, I really really do. I make less than $50 a sale thanks to how badly MSFT gouges, but I can't give a customer a system that is gonna break the first time they update, and I can't afford to give away lifetime support with such low margins. And before somebody trots out "support contracts" (which I've been told multiple times is my "solution"), I'd like to point out home users HATE Best Buy for trying to shovel extended warranties; they certainly aren't gonna jump on support contracts.
Fix this and I will promote the living hell out of Linux, but until then I have no choice, it's Windows only at my shop.
Dubious moves by Ubuntu with Unity and the Gnome Foundation's Gnome 3. KISS. My father and grandfather, exactly the less technically inclined people they are after, couldn't stand how the experience changed from Gnome 2.
Windows 8 was a great opportunity as Microsoft pursued a tablet experience on desktop computers (eww), but Gnome 3 made the same mistakes, just earlier.
The points brought up by Miguel are valid, but user experience is the key, and there was very little wrong with late Gnome 2 distros.
I’m under heavy influence of alcohol.
I wish I could be. Oh, and I agree with most of what you’ve said.
I think you’re spot on. Ubuntu w/ Gnome2 was starting to become THE Linux experience for the average user. Gnome3’s “transition” basically destroyed that completely.
As for OSX being simple and easy to use, how many of the OSX users here run any of the following:
OpenVPN
Samba
SSHFS
NFSv4
I have a few clients that just moved to OSX on the desktop and it's been a disaster for them. Either the performance makes them unusable, they are missing key pieces of the software stack, or installing them is painful.
My clients have asked around and so have I, and I've yet to get an answer to the basic question: what does OSX have for workable file and print services? It sure isn't SAMBA/CIFS (horrendous performance), it isn't NFSv4, and it's not AppleTalk, so what do you use?
OSX has some advantages but trying to get REAL work done on it has been much harder to do than on Linux Desktops. Apple can’t seem to even decide where /etc/hosts information should live. Are 3 locations really necessary for that info?
I’d love to hear some productive responses from the OSX users here. These are real problems my clients face since migrating to OSX.
Apple dumped Samba for a home-grown replacement. That might be part of your problems.
We tested this on 10.5 and 10.6 and saw very poor performance (10% or less what the windows or linux clients were getting). Googling OSX and CIFS performance I see many others with similar issues. We tested this about 1.5 years ago, asked OSX people, googled, etc. with no luck. It seems there are some proposed solutions here so maybe things are getting better now: http://www.macwindows.com/snowleopard-filesharing.html#030311b They sure were not usable when we looked at it and doco on OSX solutions was VERY sparse (you could find lots of people complaining about it but little to no resolutions to the issue).
NFSv4 is supported and I use it every day.
Smb support:
http://support.apple.com/kb/HT1568
OS X uses CUPS for print services.
If you can be more specific with what your problem is it will be helpful.
I should have just said File services. You are correct in pointing out we had no printing issues. I’m just a bit used to saying “file and printer sharing”.
As I said in my original post SMB is supported just unusably slow in our testing (about 10% of linux or windows clients). We tested this about 1.5-2 years ago and on both 10.5 and 10.6 on several machines. At the time the client and I both asked around and there were very few suggestions offered to improve the abysmal performance. A quick google of the issue shows it has been around and felt by many with only recent suggestions (mid-2011) being offered. Maybe they’ve resolved some of the issues now but not so long ago it was brutal.
As for NFSv4 from what I recall we had the following problems:
1. No user ID mapping support (we had to go to global user IDs if we wanted it to work, a major pain even in a small company)
2. Kerberos authentication integration was problematic
3. DNS updating and host file issues were difficult to make it work with a VPN
4. User side install nightmares. Getting NFSv4 set up was a bear (NFS w/ Kerberos over a VPN). In the end this is the issue that killed it. Scripting it was not possible because many things needed to be done through a GUI (no CLI tools for it), so it made the process unreliable when done by average users.
We rolled this same thing out to Linux desktops and Windows (NFS software was expensive for the windows clients) in testing with a small fraction of the difficulties. When problems were encountered on OSX there was a lack of documentation, knowledge base answers from forums and other sources and a void in finding anyone who was even using OSX in a business environment.
Maybe some of that is being fixed now but to say that OSX is much easier than Linux on the desktop really makes me wonder. My client still is predominantly an OSX shop. They just had to move to SSH as the filesharing method because nothing else worked. It works pretty well but is not a true filesharing technology so they have to be careful (no file locking, etc). All in all it works for them now but it was a rough transition mostly because of difficulties on the OSX side.
Pretty much.
Regular users whom I've tried to introduce to Linux are often confused by the fact that there is more than one version of it. They understand older versions, but not different versions.
The concept of making specialized distros for users with different needs strikes most people I've worked with as overkill, and they don't understand why you can't just have a single operating system that does everything. Like Windows or Mac.
Users like to know that their choice was a good one and that it will remain that way for a long, long time. With Linux, they have no such thing. Every so many months there’s a major update. Every so often 1 distro has a feature before the others do. It bothers the end-user and it’s a hassle they don’t want to deal with.
If there’s too much choice, non-specialized users will always have that nagging doubt: did I make the right choice? You don’t want a user wondering that about your product!
The problem is not the diversity, the problem is the incompatibility between them.
You say “diversity”. That’s a very positive word. People feel good about that word.
I call it specialization. All of a sudden it's not positive anymore, it becomes very business oriented.
It boils down to the same thing: there are different distros for different purposes.
But the choice of words makes all the difference.
Saying “diversity is good” only works as a product slogan. It’s what you say initially to convince the consumer to try out your product.
Sure, it’s good… for the industry. But for the non-professional end user?
Once you’re actually using the product and working with it on a day-to-day basis, that’s when it starts nagging at the back of your head: There are all these other options. Did I make the right choice?
Diversity means you have options to choose from. But time and again, research has shown that too many options is just as bad as no options. Perhaps worse:
1) http://blog.kissmetrics.com/too-many-choices/
2) http://www.nytimes.com/2010/02/27/your-money/27shortcuts.html
3) http://www.prismdecision.com/are-too-many-options-bad-for-you
Google “too many options”. You’ll get a ton more reviews, research papers and blogs about it.
They're not incompatible, they all use the same POSIX API. Also, the kernel's userland API hasn't changed for years. Software like "xv" (latest stable release came out in 1994) still runs on the latest Debian or Ubuntu.
What you are seeing as incompatibility is a result of most binaries being linked to specific versions of a dynamic library, and this is a problem which exists on *EVERY* operating system.
The only difference between Linux and Windows/MacOSX here is that in the latter case, almost every application ships with all libraries it depends on.
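To make that concrete, here is a minimal sketch (my own illustrative example, not taken from xv or any real package) of the kind of 1990s-style command-line program being described: it touches only the POSIX/libc layer, which is why it still builds and runs unchanged on a current Debian or Ubuntu:

```c
/* copycat.c -- illustrative 1990s-style utility using only POSIX calls.
 * Build with: cc -o copycat copycat.c
 * It simply copies stdin to stdout; nothing here is distro-specific. */
#include <unistd.h>

int main(void)
{
    char buf[4096];
    ssize_t n;

    /* read(2)/write(2) have kept the same semantics for decades */
    while ((n = read(STDIN_FILENO, buf, sizeof buf)) > 0) {
        ssize_t off = 0;
        while (off < n) {
            ssize_t w = write(STDOUT_FILENO, buf + off, (size_t)(n - off));
            if (w < 0)
                return 1;
            off += w;
        }
    }
    return n < 0 ? 1 : 0;
}
```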
Just have a look at Inkscape, the Windows or MacOS versions are quite large:
https://sourceforge.net/projects/inkscape/files/inkscape/0.48.3.1/
On Debian, the latest Inkscape package is about a third smaller than the Windows installer:
http://packages.debian.org/sid/inkscape
If we started shipping every application on Linux with all of its dependencies, we wouldn't run into these compatibility problems either.
An example for this is “VueScan”, which is a single binary which runs on a large variety of Linux distributions.
Adrian
You are living in a dream. The kernel may be compatible, but a Linux executable needs more than the kernel interface: it needs video libraries, audio libraries and so on, and that is where the problem is. One distro may use one version of a library, another may use a different version or a different library altogether, etc. You don't get the point.
POSIX. Are you f***ing kidding me?
Compatibility, in a user's eyes, is downloading a program, clicking on it, and running it. Anything that doesn't do that is deemed incompatible.
users have no problem with choice (there are more than a dozen current editions of windows out there)
users have a problem with fragmentation
and users are really pissed off when the first answer to whatever problems they have is “you chose the wrong distribution”
Incorrect, there are only three main editions of Windows 7 that the average user is exposed to in mainstream markets, and they are tailored to their needs:
Home Premium – for home users
Professional – for business users
Ultimate – for those who want it all
If you drop in specialized editions, it is still not a dozen:
Starter – introductory edition for basic needs: web browsing, emailing, basic office productivity. Most users I know upgrade this to Home Premium.
Home Basic – emerging markets, pretty much similar to Starter with more flexible options such as ability to apply themes.
Enterprise – larger businesses who deploy Windows en masse and have multilingual sites worldwide. Comes with unique management tools such as MDOP.
At the end of the day, they are all Windows at the core. AutoCAD 2013 which can run on Windows 7 Ultimate can also run on Windows 7 Starter.
Linux on the other hand has different distributions, desktop environments, package management and support options. As someone noted, with Windows 7 Starter and the other editions you can target just one particular edition and know it will be supported for the next 10 years.
see, you don’t even think of the other editions
there are 6 windows 7
8 windows server 2008
1 windows home server
and 1 windows mobilephone whatevertheycallitnow
there are even more, but these 4 groups are popular enough that everyone should at least have heard of them
What you want could only be done in a corporate setting: one distro in one pair of hands, one decision point, and everyone else falling in line, which would mean the elimination of all other distros.
This was never the goal of Linux, thus it's high time to stop b*tching about having multiple distros, versions, GUIs and package systems. This is a never-ending hysteria and it's totally pointless. Use Windows, use OSX, but then for f*k's sake, be done with complaining about how the Linux world works.
no, all it needs is an official baseline distribution from Torvalds
just look at the BSDs
That is why we just have ONE version of BSD and not multiple versions like FreeBSD, NetBSD, OpenBSD, DragonFlyBSD, MirOS BSD, GhostBSD, PC-BSD, Desktop-BSD, FreeNAS, Freesbie, Debian GNU/kFreeBSD, Gentoo/FreeBSD, etc. Oh wait…
This seems a popular reply to any type of problem (wrong distro / wrong application / wrong service etc).
Of course it’s good to make people aware of alternatives, but the ‘you should use product X instead’ crowd can be quite persistent, and it’s frustrating when you really did make a conscious choice for the product you’re using, or worse, you switched to it the last time you had a problem and someone convinced you to switch….
No, it's just an answer showing that you asked the wrong person. And its "popularity" just shows how many self-proclaimed pros are in the forums and channels where you ask those questions. Nothing more, nothing less.
I started finding that funny after a while when I was trying to get a Linux distribution working on my Thinkpad.
I must have been told at least a dozen times that the problems I was experiencing were due to running the wrong distribution, with a different “correct” distribution recommended each time. In the end the one I managed to get more-or-less working (Scientific Linux) wasn’t even one of the ones I’d been told to use.
The Thinkpad specific GUI utilities (that initially fooled me into thinking that Linux would be as easily installed as Windows) were packaged for different distributions and didn’t work when compiled from source. I ended up having to spend a couple of weekends reading howtos and configuring everything manually, but at least I finally got it working OK.
Of course I’d be a lot less sanguine about the experience if I’d had to use Linux as my main OS, rather than it being a hobby that I could set aside as soon as its problems became too frustrating.
End users do have problems with choices.
http://lmgtfy.com/?q=Too+much+choices
I'm sorry but you are wrong, and here is why: as far as the average user is concerned there are only THREE versions of Windows, XP/Vista/7, and of those 7 is the one they will encounter on new systems and the other two are "old" and thus will be ignored. You see, it doesn't matter to the end user whether it's Home or Pro or Ultimate, because unless you have a specialized task that would actually require a higher SKU they all do the same thing, which, as the article pointed out, can't be said of Linux because of incompatibilities.
Like 'em or hate 'em for it (personally I like it), with Windows nearly everything works across systems, from old to new. It is only recently we've been seeing games that require DX 10, I don't think I've seen a DX 11-only game yet, and gaming is a small niche. The software that everyday users run works fine whether they have XP through 7; it just works.
You’re not paying 1 cent for it so who really cares what losers like you and the people who hang out with you think?
> What killed the Linux desktop
For something that was killed, it's pretty alive on my desktop. Thanks for your interest, Icaza
What we don't know is whether what killed it was Icaza's interest in working at Microsoft after he was rejected, like Moondevil wrote:
Miguel de Icaza was rejected from a job at Microsoft before he turned his attention to Linux.
Most of the projects he touched were clones of Microsoft technology.
Bonobo: COM implemented with CORBA
Evolution: Outlook
Mono: .Net
Moonlight: Silverlight
It is as if he has been trying to compensate for the fact that he was not taken by them.
I think he answered your question from his website:
I have never received a payment from Microsoft or Apple. If anything, I keep giving both money for their products.
No religious/hard-line/extremist group likes introspective criticism or anything that contradicts their long held beliefs. So the only possible explanation that you guys can muster is “how much does Microsoft pay you”.
And btw, he was named MVP by MS, so it's not like MS didn't want him.
> I have never received a payment
I was talking about an interest, I didn't say that he was being paid.
> he was named […] so is not like MS didn’t want him.
After the services that were done, they could name him "valuable"; it doesn't take much money, of course. Being paid every month for working there is a different thing.
Given your ad hominem, I take it you don't have a proper retort to the points that he raised, right?
I've used Kubuntu for years, and it isn't broken like he says. Software is being developed and works, unlike what he says.
And then I find a “What killed the Linux desktop”. What am I writing this with, then?
Notes: I don’t update from one version of Kubuntu to another, I do clean new installs. About programs that don’t work as a “service”: I read what programs are going to be updated, I close them and I update.
That is nice, but what does it have to do with the points he raised?
> That is nice, but what does it have to do with the points he
> raised?
If he had problems with a half-baked distribution, bad practices using it, a bad choice of software, etc. that doesn’t mean that everyone is in the same situation and “What Killed the Linux Desktop”. That’s why I wrote “like he says” or “unlike he says”.
At the end of the day you don’t have a decent argument.
You either debunk the points, or you can accept them if they are valid.
The “WORKS FOR ME” defence is a bit of an old joke now.
It’s no worse than the gazillion opinion pieces out there hitting x or y distro based on the it-didn’t-work-for-me-so-it-doesn’t-work-for-you paradigm.
And your point is?
Sadly he doesn't have one, he is just getting defensive because you pointed out "WORKS FOR ME" is so old it's actually got its own meme page.
http://tmrepository.com/trademarks/worksforme/
The really sad part is you can take the excuses that are always trotted out and match every single one to one of the memes at TM repo that have been there for years. The same excuses over and over for the same problems.
In the end I would just point out a few little facts: Dell has to run their own repos, even though they sell Ubuntu on less than a dozen units, why? Ubuntu keeps breaking the drivers.
http://www.theinquirer.net/inquirer/news/1530558/ubuntu-broken-dell…
You’ve had Walmart, Best Buy, Asus, MSI, all of these companies that tried selling Linux units, all quit, why? same problem, constant breakage meant it was costing more in after sale support than they were making on the units, and between that and the returns they were losing money.
http://blog.laptopmag.com/ubuntu-confirms-linux-netbook-returns-hig…
And finally you have system builders like me, that have been royally boned by MSFT on prices, yet we won't use your product. Why? Same reason the big guys don't: after-sale support was killing me, and if I'd kept selling Linux units I would have had to close my doors.
> It works for me
If someone is curious, apart from my computers and my virtual machines, I have also directly seen Kubuntu working this way for some of my friends and an acquaintance of mine.
People can also go to one of the meetings of their nearest LUG (Linux User Group) and see their laptops.
If you want to try it by yourself, there’s a “VirtualBox disk image of Kubuntu 11.04 i386 Desktop, stable version” on
https://docs.google.com/folder/d/0B2UJmdRmDlL1VmRWZ3dfaGdJTUU/edit
to conduct experiments on it (for example, to try to update it and see no breakage, etc.) Just don't update it to a newer version of Kubuntu and, before updating a GUI program, close that program.
I usually employ a copy of it as a virtual machine and it works stably, too.
About it, there is a README.txt on the same web address.
If you have any doubt, you can ask questions on this thread. Greetings!
Aye Miguel sucks at the tit of Microsoft, if anything killed Linux on the desktop it was him.
You forgot he was a founder of Gnome and Gnumeric and the main dev of Midnight Commander; none of them related to MS.
Gnumeric had a lot of resemblance to Excel :-). Gnome 1 resembled Windows 95: http://www.linuxinsight.com/files/images/gnome_old.png . Anyway, Moondevil wrote "Most of", not "all".
Something similar happened to me. I used to be a big time Amiga user, then switched to Linux in 95. I spent the last few years with Gentoo, which I really loved tinkering with. Around 2005 I moved to Windows, but never felt quite at home. Cygwin helped a lot, though. Throughout the years I installed almost every new major distro in VMware and tried it, but none worked for me. Last year I switched to OS X, and now I feel like I am having a Super-Next-Gen Amiga with BSD under the hood. Plus I am having all my opensource tools available, thanks to MacPorts. AND proprietary software like MS office, Adobe Photoshop! Wow! All in 1 OS, no dual boot or fiddling around with VMware. I can’t believe I waited this long.
You don’t dig a well by poking 10 small holes in the ground. Or ten thousand. So you’re on the apple pipeline now? Do you pray it doesn’t end the same way as with Amiga?
Despite completely disagreeing with the Gnome3 direction, I find myself actually agreeing with Mr. de Icaza.
Someone on another article mentioned that we need a stable kernel API for drivers, so that they can write them once, and know that they will run against future kernel versions for the lifespan of their product.
I would argue that the same needs to be done for applications.
Unfortunately, there is never going to be a consensus on things like this, as we can see from the results of the attempt to make all distros use the same package format.
It is a shame that there cannot be some sort of consensus reached between the Linux Foundation, Red Hat, Novell, the Debian project (which would bring Canonical into line, presumably) and a couple of other chosen partners like Arch and their ilk.
For the most part, that’s already the case – for example, an app written to Gtk+ 2.0 back in 2002 will still probably compile and run on Gtk+ 2.24 today. It probably uses parts of that API which have long been deprecated, but it will still work. That’s over ten years of stability.
For an actual API break, you have to go to Gtk+ 3.0, where they finally cleaned out all of those old deprecated functions. But even then, it's not a problem, because the 2.0 and 3.0 libraries have been designed to co-exist.
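For what it's worth, here is a minimal sketch of the kind of 2002-era Gtk+ 2.0 program being described (my own illustrative example): every call in it already existed in Gtk+ 2.0 and still compiles and runs against Gtk+ 2.24 today:

```c
/* hello_gtk2.c -- illustrative Gtk+ 2.0-era program.
 * Build (assuming the gtk+-2.0 development packages and pkg-config):
 *   cc hello_gtk2.c -o hello_gtk2 `pkg-config --cflags --libs gtk+-2.0` */
#include <gtk/gtk.h>

static void on_destroy(GtkWidget *widget, gpointer data)
{
    gtk_main_quit();
}

int main(int argc, char *argv[])
{
    GtkWidget *window, *label;

    gtk_init(&argc, &argv);

    window = gtk_window_new(GTK_WINDOW_TOPLEVEL);
    gtk_window_set_title(GTK_WINDOW(window), "Hello");
    g_signal_connect(window, "destroy", G_CALLBACK(on_destroy), NULL);

    label = gtk_label_new("Built against the Gtk+ 2.0 API of 2002");
    gtk_container_add(GTK_CONTAINER(window), label);

    gtk_widget_show_all(window);
    gtk_main();
    return 0;
}
```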
The work probably doesn't fill me with enthusiasm.
Yes but GTK is one GUI library out of many that work on the platform.
The difference is with MacOSX/iOS, Windows and Android you have a standard set of APIs that you can expect to be there and expect to work in a certain way.
This doesn’t exist in Linux except for the basic userland.
I agree with you on that. Maybe I should add that most users really don’t care whether it is open source or not, they just want it to work.
A Linux desktop works fine and doesn't need much maintenance, but you need someone that can help you out when an update screws something up. Most readers here will be able to do that, but the average user doesn't know what to do.
For the 99,999’th time, bullshit, bullshit, bullshit.
I maintain a fairly large (in the 100,000s of LOC) in-kernel project that covers everything from files to networking. This code works in Linux, BSD (when required) and used to work in Windows.
The amount of work required to maintain the project between different Linux kernel releases is negligible at best (usually less than 1h per kernel release).
*However*, I dropped the Windows support as undocumented changes between SP releases tended to break both the user-space part of the code and the kernel-space part of the code.
Care to prove me wrong?
– Gilboa
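For readers wondering what that "negligible work per kernel release" typically looks like, it is usually a small compatibility shim along these lines. This is an illustrative sketch with made-up function names, not code from the project described above; the version check refers to the real sk_buff accessor change in kernel 2.6.22:

```c
/* compat.h -- illustrative compatibility shim for out-of-tree kernel code.
 * Small #ifdefs keyed on the kernel version absorb most in-kernel API drift,
 * which keeps per-release maintenance down to minutes or hours. */
#include <linux/version.h>
#include <linux/skbuff.h>

static inline unsigned char *compat_skb_mac_header(const struct sk_buff *skb)
{
#if LINUX_VERSION_CODE >= KERNEL_VERSION(2, 6, 22)
	return skb_mac_header(skb);	/* accessor introduced in 2.6.22 */
#else
	return skb->mac.raw;		/* older field-based layout */
#endif
}
```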
We are talking about drivers, not your pet projects… unless you are talking about drivers, your comments are not relevant.
*Sigh*.
Option 1:
– I manage a large project (mostly in-kernel) that supports multiple operating system due to client demands and that includes multiple drivers.
– I’m really, really, bored.
Care to take a guess?
– Gilboa
BTW, given your apparent experience in Linux kernel development, I can only assume that you know quite well that in-kernel API changes (e.g. networking, files) are just as frequent as driver API changes.
As before, feel free to prove otherwise.
– Gilboa
I didn’t say that or claim it.
Your comments still aren’t relevant to the topic, no matter our experience level with the Linux Kernel API.
“We are talking about Drivers not your pet projects”.
Oh well.
Unlike *your* comment, my comment was right on the spot.
Somehow, there's a stupid notion that the lack of a stable *driver* API makes Linux far harder to support on *the desktop*, even though this claim has been called BS by many out-of-tree kernel developers (e.g. the nVidia kernel engineers' interview @Phoronix).
Now, unless you have *personal* experience in supporting out-of-tree drivers and/or have any evidence (personal or otherwise) to counter my, and, say, nVidia kernel engineers', personal experience,* I doubt that you have anything meaningful to contribute to this sub-thread.
– Gilboa
* Keep in mind that my job is 100 times easier than that of nVidia kernel engineers. GPUs are far more complex than say, network drivers.
Beyond that, I wasn’t required to take Windows drivers and turn them into multi-platform capable drivers…
(So if they say the Linux kernel API is a non-issue…)
Moving the goalposts with a constantly changing interface is a bad idea; no other operating system does this, and they have far more consistent driver support.
Oh well, Linux users are well known for making lots of excuses for Linux’s problems.
nVidia drivers, btw, won't have KMS support, and they replace a lot of the libraries on the system to do with DRI and whatnot, so it isn't a problem for them because they install their own damn libraries.
So in short, you couldn’t really put up a factual rebuttal so you switched to “You Linux users always making excuses”.
Good luck with that.
– Gilboa
I suggest you read your and his (gilboa's) posts again; you can still save face.
It’s not stupid at all. Fixing an in-tree driver in Linux involves a submission/approval process and then you have to wait for the distros to pick up the change. A simple bit flip can take months to get to users.
In Windows you can fix a driver and then immediately publish to the web server. You can in fact automate the entire process with a single click.
Try downloading driver code from the Internet, compiling it for, say, Windows 7 x86_64 or Windows 2K8, installing it, and let me know how it goes. *
– Gilboa
* Hint: nothing. (As in driver-not-load-nothing)
Why bother? The .exe works fine so no compiling. I know it's shocking, not needing to compile everything, but it does have its advantages.
For example, my 73 year old dad got impatient because I couldn't come out and install Win 7 on that new machine I built him until the weekend. He decided to DIY, and when I got there I thought I'd have a mess; turned out all I needed to do was show him how to install his browser. All the drivers, patches, updates, and even the AV (it popped up a little box on first boot and gave him the choice of several free or pay AVs, he chose MSE which works fine) were all taken care of, he just didn't know where to download his browser from.
Like it or not, admit it or not, Linux is a geek OS that is a PITA for your average user. Sure, if you know compiler flags, can google for fixes, and know your way around the forums, then you'll have no problems. Let's see, that covers… about 0.97% of the public according to netstats. The rest? Sorry, but your OS just doesn't cut it.
Congratulations.
*Nothing* you just said had anything to do with the subject at hand – read: maintaining out-of-tree drivers in Linux vs. doing the same under Windows, given the lack of a stable driver API in Linux.
Please, re-read the previous comments before pressing the submit button!
If you look closely at a comment I made 30% down the page, you’ll see that I more-or-less said the same.
Linux is for geeks / power users / take your pick, and I personally would rather keep it this way.
– Gilboa
I don't use Windows, but in my experience, all you have to do is download the driver (for recent hardware, there will be 4 versions of the driver – XP, Vista, Windows 7 and soon Windows 8).
The drivers will come in a single executable package. Bingo.
With Linux, unless you are using NVidia, it might involve tarballs (you have lost 99% of potential users right there), or hunting down a distribution specific packaged binary.
Again, your comment has nothing to do with the subject of this sub-thread:
Maintaining out-of-tree Linux drivers vs. doing the same under Windows, given the apparent lack of a stable driver API under Linux.
As before, please read the comments before pressing the submit button!
– Gilboa
Yet another hint: I was *not* referring to the download -> click -> click -> click -> reboot drivers installation process from the *user* perspective. I was referring to the *developer* perspective (those of you who missed the subject of this sub-thread, please feel free to re-read it).
In Linux, you simply post a tarball with a Makefile.
In Windows, especially on 64-bit and especially once SecureBoot kicks in, developing and, God forbid, testing drivers is a *huge* hassle.
50% of the reason I'd rather stop porting my kernel code to Windows is the damn signature enforcement!
– Gilboa
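To illustrate what "a tarball with a Makefile" amounts to on the Linux side, here is a minimal out-of-tree module sketch (hypothetical names, not any real driver); the tarball would contain this file plus the standard kbuild Makefile, and the user builds it against the headers of their running kernel:

```c
/* hello_module.c -- illustrative minimal out-of-tree kernel module.
 * Paired with a kbuild Makefile containing: obj-m += hello_module.o */
#include <linux/module.h>
#include <linux/kernel.h>
#include <linux/init.h>

static int __init hello_init(void)
{
	pr_info("hello_module: loaded\n");
	return 0;
}

static void __exit hello_exit(void)
{
	pr_info("hello_module: unloaded\n");
}

module_init(hello_init);
module_exit(hello_exit);

MODULE_LICENSE("GPL");
MODULE_DESCRIPTION("Illustrative out-of-tree module sketch");
```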
You have to use the signature override
http://www.ngohq.com/home.php?page=dseo
A fair point but not a big deal
1. Signature override (of any [1] type) will not work w/ SecureBoot.
2. My personal experience w/ signature override is *far* less than stellar. Let alone the fact that it forces my users to jump through hoops to use my non-certified drivers.
I believe I proved my point: the lack of a stable driver API in Linux has a very, very limited effect on both in-tree and out-of-tree developers. Moreover, I believe I also proved that Windows kernel development can be just as annoying, if not worse, due to specific Windows limitations (signatures, API changes between different SP releases, etc).
– Gilboa
You are flat out lying just to prove a point…
The above is so NOT true, I seriously doubt that you have ever done any Linux kernel development… ever …
Maybe you are role-playing or something? Anyways Linux drivers break all the time…
This has to be the dumbest comment I've seen in *years*.
Congrats! You’ve just got my vote for winning the next year’s Darwin award.
– Gilboa
The Linux desktop – supposing it has ever existed, since there have always been several Linux desktops – never had a real chance.
While I agree with de Icaza on some points, I think he misses the general picture. The average user does not use an operating system; in fact, many of them ignore what an operating system is ("Linux? Operating system? What on earth is an OS?"). What this kind of user actually uses is a set of applications. And the relevant fact is that there have never existed Linux versions of very popular applications like AutoCAD or Photoshop, to mention just two best sellers. Applications like these are the ones selling Windows machines.
That is, as somebody wrote, when a user buys a Windows-powered machine, what he is buying is a fuzzy feeling that the applications he is used to working with will remain available. That's why Microsoft has managed to survive a fiasco as big as Windows Vista and myriad little daily fiascos like blue screens of death, painful updates and security vulnerabilities.
Well yeah, of course. But a big reason why this is so is because of the stuff that he points out, which will never be fixed, because too many Linux evangelists are convinced that having 900 different distros competing with each other is a good thing. And, well… maybe it IS a good thing, but not if you want people to actually use it. And those of us who don't use it have been pointing out the same issues as Icaza for over a decade, but nobody listens to us. Hence the reason why desktop Linux has been such a spectacular failure, and will continue to be so. Hell, most Android variants can utilize the same app store, and even still people are pissed at the fragmentation.
The problem is not that 900 distributions compete with each other. As Miguel pointed out, the problem is incompatibility even between versions of only one (or the top 3) distributions. He goes on to name (indirectly) Red Hat, PulseAudio and systemd as examples.
Those backwards-compatibility breaks are indeed serious and omnipresent. It seems like every 6 months a shiny new thing replaces working stuff in the stack and breaks compatibility all along. The gain is minimal compared to the loss.
I must be different then, because I never had the need for a bitmap editor as extreme as Photoshop, and I never had a need for *any* CAD software outside of school. Hell, no one I know probably even knows what AutoCAD even is, and those that have Photoshop probably pirated it.
Me? Well, I bought a copy of Jasc Paint Shop Pro years ago, and after that I started to really like Paint.net during my last few years with Windows. I have toyed around with "trial" versions of Photoshop to see what all the fuss was about, but it has to be the most overkill, confusing, slow and bloated bitmap editor I have ever used. I was not impressed, and you would truly *need* to have some damn special requirements to actually want to pay 700-1000 bucks for a god damn bitmap editor. I honestly don't get why so many people pirate it and then brag about it… I guess just to say that they got an ~$800 piece of software free?
Having been on Linux exclusively for over half a decade now, I do still miss Paint.net (it really was a pleasure to use), but I can get over it. I've been doing just fine on the GIMP, and I got used to its multi-window approach years ago (which is actually very nice on a widescreen monitor). I'm actually anticipating trying out the new single-window mode, that should be interesting. But I can say that Windows has something good going for it with Paint.net.
Photoshop and AutoCAD are the de facto standard in some areas, and some people make a living using them. I know, using the computer for professional purposes, what a weird concept…
No, my point is exactly what you said. These programs are for highly specialized, professional purposes. The claim that a lot of people "choose" to run Windows for certain programs especially, and then listing these two specifically, is just false for the vast majority of Windows users. That is the claim I was arguing.
They choose a computer; chances are it comes with Windows. Most of the time, Photoshop and AutoCAD don't even come into play. Many people probably don't really even know what the two programs are, and if they did, they would be lost if they tried to use them. Most of the people I know or have heard of that brag about having a program like one of these do it purely to brag that they got a ridiculously expensive program for free; a piece of software that they would have "legally" had to spend a metric shit-ton on just to get, that is way above their heads.
Perhaps you should consider extending the size of your sample in order to provide a better view of why it is that people use Windows or OSX over Linux on the desktop?
Also extend the scope of application, try Office and iTunes for example. See how things change…
The original claim was about Photoshop and AutoCAD, so that is what I was arguing about. Nothing more, nothing less–and nothing else. Obviously if you consider Office and iTunes things change, but you made no mention of them in your original post. But that was not the argument; the argument was that such highly specialized, professional pieces of software are NOT why the vast majority of Windows users run Windows. Simple as that. I wouldn't consider an office suite to be "highly specialized" or "professional" either really; I learned to use MS Office and other office suites in freaking middle school.
I am not claiming that Windows has no software exclusive to it that helps to propel it above all else. I am just saying that two highly-specialized professional applications don’t make much of a dent in the overall mass use of Windows, unless you’re a business and need them. Hell, I’ve even heard claims that even image professionals that use Photoshop would prefer to do it on a Mac. True or not, I don’t really give a damn–but the point is, even Photoshop is not exclusive. Not sure about AutoCAD (again, don’t care).
Your other two examples are, IMO, better ones–they’re something a lot of people use, with Office having a heavy presence on businesses but certainly not exclusive to use by them. iTunes is something more “personal” and unlikely to be on a business machine, but I wouldn’t doubt a lot of people–whether they have a business job or not–have it on their personal home machines. On the subject of Windows exclusivity, you sure as hell don’t need Windows to run iTunes either–and being an Apple program, it probably purposely runs better on a Mac anyway.
Doesn’t matter what the original claim was, the overall point stands.
There is a lot of business software that only works on Windows or MacOSX properly.
Let's not forget about bespoke software.
“What this kind of user actually uses is a set of applications. And the relevant fact is that there has never existed Linux versions of very popular applications like AutoCAD or Photoshop, to mention just two best sellers. Applications like these are the ones selling Windows machines.”
That was the original point which I was arguing, and I don’t think it stands the way it was stated. “Applications like AutoCAD and Photoshop” are what sells Windows to *businesses that need them*. Businesses that have very specific requirements for bitmap images, or companies that are designing a product. Programs like these do NOT sell Windows to the masses, which is my whole argument. So no, I don’t see how “the point still stands.” At least, not as originally said. Maybe the general point stands in certain cases, but those IMO were two of the absolute worst examples that could be given, due to their highly specialized, professional, business-oriented nature. The high prices their respective companies ask for reek of these three qualities.
No the point still stands.
What you are assuming is that there is no middle ground between
* Hardcore Power user (developers, photoshop, Autocad, Matlab etc)
* Person that posts on Facebook and can't spell that well.
There are office users that are in the middle, and most Linux software doesn't cater to them.
So while your argument works if you consider only two very extreme cases, it doesn’t work if you consider anybody inbetween.
No–I think you still misunderstand what I've been saying. The argument doesn't work with the two examples originally given. That was my point. It may work if other programs are given–but I was only arguing about the two specific programs mentioned. Didn't that point get across yet? So yes–with other examples it may be true to varying extents (depending on the applications in question), but for Photoshop and AutoCAD–hell no. You can probably get them both on a Mac anyway. And Office… well, that's on a Mac too. And the reality is, "normal" people don't even need to pay out their ass for these very specific programs, because there are plenty of office suites, standalone word processors, image editors, etc. that are either free or much cheaper to buy and are geared more toward the masses. These are just some of the more commercially popular examples in their class.
Last time: I was arguing very specifically about PHOTOSHOP and AUTOCAD, the two examples originally given, and then MS OFFICE and ITUNES, the later two examples given. The end.
And the same can be said of Linux. Or did you forget all the internet stuff? Not to mention anywhere there is a cluster involved, or all those Android apps. The street goes both ways.
People choose Windows under the assumption that any program they might need will work with it.
It’s just a given that it will work with everything. Take away Office, iTunes or Netflix and the streets would run with blood.
Fine, I'll give ya one… QuickBooks. I've seen everyone from families to pretty decently sized businesses using QB, and the express version is free so it doesn't even cost a dime if all you are wanting to do is manage a household. And there is nothing, absolutely nothing, that even comes close to QB in FOSSland. A single QB girl (and it's always a girl for some reason, you'd think they had a union or something) can run an entire supply house with nothing but QB and a printer, everything from inventory to payroll, all nice and neat.
But even if you are talking about "granny users" that use NOTHING but a browser, well then they'd still be in trouble on Linux. Why? Well, in just the past 3 years we've seen the DEs gutted and replaced, ALSA swapped for Pulse (still buggy and more likely to fail than work BTW), and the entire wireless subsystem is a lousy mess. So grandma gets that new Ubuntu machine, sees the "you have updates!", and like any sane normal user pushes the button; it asks to reboot and… whoops! Sound is gone, wireless toast, and depending on the GPU she may be looking at a single-user-mode black screen o' death.
Sorry but that ain't user friendly, and it's certainly not ready for anyone who isn't a geek, which is a problem because most geeks? Have no problems running OSX or Windows either, so no point in switching. The ones that need Linux the most, grandma that clicks on anything and gets bugs, can't run it because it's too breakage prone and complex… now see the problem?
VirtualBox runs these, QuickBooks and several others, much better than Windows 7 does in my experience. In particular, I moved a client to Linux with XP in a VM to save them from buying another $4-5k/seat AutoCAD version that would be needed to run it on Win 7. Win 7 could not run AutoCAD even in XP Mode. Now explaining virtualization, and how they really are like 2 PCs in one, and having it sink in is a bit of a task, but after the first couple of weeks they got used to it and now they love it. The extra bonus of no viruses has made them an extremely happy client.
There is a project on Google Code to port Paint.NET to Mono, but it hasn’t had much activity in the last few years.
http://code.google.com/p/paint-mono/
Oddly, the last commit is by Miguel De Icaza. It was a fix for Mac OS.
It's been there for a while; not much interest.
There is no need for updating paint-mono. It is already ported and kicking! It's named Pinta.
http://pinta-project.com/
If you like Paint.net you should check out Pinta:
http://pinta-project.com/
You might want to take a look at Pinta, which is a project inspired by Paint.NET. http://pinta-project.com
Edit: Never mind, I need to refresh my pages more often.
Lol wtf? Autocad? Are you insane? Autocad is a highly specialist application which not even 0.01% of Windows users have installed on their machine.
Photoshop on the other hand is the best photo editing software bar none which is not something just professionals have use for, however the reason it’s on most people’s machines is that you can easily pirate it. If people had to pay for it then it certainly wouldn’t be near ubiquitous outside the professional realm. Also it’s available for OSX so you don’t have to get Windows for it, and AFAIK it runs under Wine on Linux.
For those who say that AutoCAD and Photoshop are applications aimed at a tiny set of professional users, I would like to stress that what I said is that what average users buy is a fuzzy feeling that the applications they are used to working with will be available if the machine is powered by a Windows operating system. AutoCAD and Photoshop were just two examples.
The average user might never have worked with AutoCAD, Photoshop or the like, and does not expect to ever do so. However, he is aware that most applications will run on a Windows-powered machine, one of the reasons being that he knows people who pay their bills by working with tools like AutoCAD or Photoshop (to mention two examples), which are only available for Windows.
As somebody wrote, it is software that actually sells hardware. I add that applications are what actually sell operating systems. And I insist that this fact is what allowed Microsoft to survive a fiasco as huge as the crappy Windows Vista: for most users there was no alternative.
Windows Phone is now in exactly the same situation, still it makes strides.
If it indeed becomes the 3rd ecosystem, it proves that the no-apps chicken-and-egg problem can be overcome.
I generally agree. The slow uptake of 64-bit Windows, which doesn't guarantee such compatibility (evidenced by the fact that PC memory configurations have generally stalled at 4GB for several years), is proof of that.
Uhhh… it's stalled because most OEMs aren't paying for 4 RAM slots (the race to the bottom means every slot counts), and it is only VERY recently that 4GB DDR3 RAM sticks have become reasonably priced, and that is on the desktop. With laptops you often have one slot, two if you are lucky, and again the RAM prices have kept the OEMs from going higher. Hell, my netbook holds 8GB easily, but only came with two even though it had Win 7 x64 installed.
If it was for compatibility reasons they wouldn't install x64 on those systems, they'd install the 32-bit version.
Most of the popular end user systems I see around in shops are still 32-bit. My corp is also standardized on it.
Funny, all the laptops I'm seeing now come with x64 and 4GB of RAM, and that is the same with all but the lowest end desktop units.
Sure, if you buy the "$299 Best Buy Special" you'll be lucky if you get 2GB and 32-bit, but that's the nature of the beast, same as Intel purposely cripples their Atom chipsets so you can't run more than 2GB of RAM in an Atom system, which is why I sell the E350 AMDs instead of Atom; most come with 4GB and can hold 8GB, so no crippling trying to force you to buy a higher unit.
But in the end it's just math: you can still get 2GB chips cheaper than 4GB, and when Dell is making $8 on average for their low end units every dime counts, so they are gonna nickel and dime you to death on the BOM. If you take a closer look at those 32-bit units you'll find other cost cutting measures, like a lack of solid caps, crappier fans, PSUs that are barely above what the system pulls, etc. It's just how the game is played, friend; no different than how, if you buy a $600 unit from me, you're gonna get a heck of a lot nicer unit than if you buy the $300 one.
He’s had a huge ego for ages, and most of his projects have gone down the rat hole.
I remember very well in the late 90s when Gnome was formed to fight against the "evil" KDE licensing. Intelligent people screamed that it was going to fragment Linux and delay its adoption, but purists like Icaza said it must be done. 15 years down the line and Linux on the desktop has ended up nowhere.
Havoc Pennington said something truly stupid in that early period while working on gnome: we don’t have to worry about backward compatibility like Microsoft does with Windows. Thus we can innovate much more quickly (paraphrased from memory).
I knew right then that this was a mess.
And Icaza has never taken a dime from Microsoft? What garbage. Where were Novell and Icaza getting the funding for all that Silverlight for Linux work? MS was promoting it heavily, and no doubt was sending plenty of money their way. But once Silverlight was ditched, oops, no more money.
Now he is promoting OSX – I wonder if that's because MonoTouch is selling itself as a C# compiler for iPod apps? Hmmmmm.
He's been a hack for a long time, nibbling at the crumbs these big companies leave him. The Linux desktop has been in ruins for years, but he's gone beyond that, and it's easy to lay the blame on Linus and the culture he created.
You do know that Xamarin is making real profits, which invalidates your ad-hominem attack entirely.
And nothing you said actually invalidates any of his observations.
I think you have misunderstood what HP really said. What he meant at the time (the Win98 era) is that Linux didn't have to care about compatibility with *existing* apps because there weren't any, while MS had to put up with a mountain of hacks coming from the DOS era.
For the record GTK which was one of HP main projects has one of the best binary compat track records in the OSS world.
Compatibility is a big problem. I work on cross-platform commercial software and supporting multiple distros is very painful and requires a disproportionate amount of testing time. Microsoft should really be commended for their backwards compatibility work (but not their past poor decisions they now have to live with).
I don’t understand the love for OS X. I have to use it at work but I think it’s ugly and it’s very sluggish on a Mac Mini. I would rather use Windows.
I think Haiku is the only hope for a free desktop OS. Unlike Linux they have vision, good taste and don’t suffer from fragmentation. I’m worried it is too small to ever get much support, but I hold out hope.
…have anything to do with the holier-than-thou attitude “power” users have with the n00bs. Nothing like getting shouted down/belittled in a forum to entice people to join your platform.
So, not only are Miguel’s points valid, but the culture attracts a high portion of alpha jerks.
RTFM! (RTM works just as well).
While his criticism of frequent ABI changes is a valid concern, your point about the treatment of n00bs is completely invalid. Many Linux communities are very helpful to new users.
To be honest I haven’t really noticed this. Some of the Linux forums I’ve been on have actually made it a policy to prohibit insulting other people’s OS choices. Could be what communities I’m in though.
BTW, while I think everyone but the fans will acknowledge that Linux has a fan problem… Is anyone else disturbed by the way some people insult Linux users? I’ve seen the word “freetard” bandied about a few times for instance, and occasionally worse than that; and I’m a bit put off that people are willing to insult each other over choice of operating system, which IMO is really pretty silly.
Not at all, like Yankees fans they’re annoying assholes who are asking for it.
The most over-the-top fanboy behavior I have seen has been from Linux youth. This is the group that has sent death threats to LINUX BLOGGERS over unacceptable posts.
LOL. For what it’s worth, the Red Sox fans I’ve met have always been bigger jerks.
But it’s clear at this point that you’re either trolling or incapable of spotting the holes in your logic, so I’ll leave well enough alone.
1) People don’t like the idea of change
2) People don’t like things you can’t pay a company for
3) People don’t see advertisements for Linux on television
4) People are happy with the state of affairs with Windows or OS X.
Most people are not geeks, therefore whoever says “well I use Linux on the desktop!” is living in a cocoon world.
All these “reasons” about why Linux isn’t the runaway success it should be are no more than cargo cult reasoning. “They were the state of affairs at the time of the non-success, therefore they were the cause”.
This is very true, I was just saying pretty much this the other day on /.
Wrt #2, this is especially applicable to corporate setups. They want someone to sell them support contracts. AFAIK, Microsoft’s support contracts don’t really do anything useful for the companies that buy them. But it allows a CTO to say “Blame MS. We have contracted support with them. I’ve been on the phone day and night giving them an earful.”
And this pleases executives. Doesn’t matter that nothing useful ever gets fixed. There is a whipping boy to blame.
Wrt #3, yes, when something is on TV and perceived as being backed by a big company, it has more perceived value in peoples’ minds.
I don’t really care anymore what OS people use. But I’ve come to think that if you really want to evangelize linux, then don’t send people to ubuntu.com. Send them to system76.com. They seem to have a need to pay money for something.
I’ve actually seen Microsoft/Microsoft Gold Partners turn up and fix things, so what you are saying is bullshit.
With its cheap-looking imitation of the Apple desktop, surely Apple should be going thermonuclear on GNOME for the total ripoff of the OSX menu bar at the top, etc.
The amateur look and feel of any Linux distro in general. The whole Linux experience feels like it’s done by high school kids still living in the 1990s.
Has any Linux GUI developer ever sat down and asked Windows users how they would want their desktop environment to look and feel?
A missed opportunity, lost time and time again.
Yes Sun paid for and organised a usability study on Gnome 2 in the early 2000’s in Ireland with Windows users. The improvements resulting from this study in part made the later releases of Gnome 2 as good as they are.
Still using GNOME 2.30 in Ubuntu 10.04 today.
Is Gnome 3 the One Desktop to Rule Them All? A desktop on which you can’t make the file manager open things with a custom command, and can’t even change the freaking font size without installing a giant third-party utility… Right.
Good gods, I actually thought Linux had a pretty good shot at the desktop back in the Gnome 2/KDE 3 days. Those desktops actually worked, and had everything most users would need. Now the X server is finally stable and can configure itself automatically, and the kernel has much better driver support… And where the hell are the desktops? Gone, in favor of bloated KDE 4 and braindead Gnome 3.
Things were coming together. And then they all fell apart because a few people couldn’t get enough eyecandy.
Hurray.
For a “dead OS” it works great on my desktop, and has worked great for the last 10 years also.
You’re the exception to the rule?
Mind, I don’t think Linux is dead in the water either (not by a long shot). But for most people it’s not a reasonable choice IMO.
I was being sarcastic.
My linux desktops are alive and well thanks
WORKS FOR ME!!
This wasn’t the point of the article and you know it.
No the point of the ‘article’ was that it didn’t work for Miguel and so he declared it ‘killed’.
Why should ‘doesn’t work for me’ hold more weight than ‘works for me’ ?
Because his is constructive criticism. Yours adds nothing helpful.
The reason I punted on FreeBSD/Linux as a desktop was that the apps were almost usable, but not quite.
I can’t tell you how many applications I’d download and try to use, but could not, because the developer did not finish them! They would make a great start, get to the 90% point and leave the last polishing out. A lot of time was spent on the original app, but they would get bored (or something) and not put in the polish to actually make it usable. These aren’t incompatibility issues but polish issues.
The most famous of these is an office replacement; I tried many, but none were usable as a full replacement for MS Office. So I punted and went to the Mac.
How long ago was that?
The last 20% in software takes 80% of the time. It is hard work and not a lot of fun.
2011 was the first year that Gartner actually measured more Linux servers being sold than Windows servers. While it had been speculated in the past due to the blank-server issue, in 2011 it was obvious. On top of that we have Android coming on just about every type of device you can imagine. Sure, Linux isn’t really growing on the traditional desktop that Windows 7 or XP ran on. But then again, everyone, Apple, Microsoft, Google, is betting on a dramatic shift in the desktop paradigm. Enjoy OSX while you can. I have no doubt Apple will turn it into a walled garden just like iOS and Windows 8 RT. The desktop doesn’t really matter.
Add to that the vast numbers of machines that don’t get into the statistics as being sold as Linux servers, e.g. probably thousands of machines in academia. I’ve had my fair share of presence at universities and research institutes and I witnessed only a very, very tiny fraction of pre-built Linux servers ever being bought. 99.9% of them were bought blank (without an OS) or built from parts on-site, and I’ve seen and done a large number of desktop Linux installs on bought-with-Windows machines as well.
Moving the goalposts. The article says clearly “what killed the Linux DESKTOP” not what killed the Linux server which has never been doing better.
Just to keep everything clear and on topic, we’re not talking servers, cell phones, HPCs, or anything other than what is considered a “desktop”, aka your average bog-standard desktop box or laptop, okay?
6 months is a stretch in some cases. I have seen things stop working much sooner after a patch is applied to the system.
Only way this will be fixed is for someone to create a new distro with the goal of maintaining backward compatibility, and that will mean a business model somewhere between open and closed as far as management goes.
You write some software, then it’s your problem to maintain it. Nobody does it for you. Not even on windows where it’s not that noticeable because it’s years before a new version pops up, or almost a decade between the really usable ones.
All those defectors that got scooped up by MacOSX are not a loss, in fact, I even doubt they were ever a real part of the community at all.
The Linux desktop is very much alive, but unlike the rest of the world, Linux users don’t give a damn what people say about it.
Way to miss the point.
When you write software you expect certain things to be there; if it is constantly changing, maintenance becomes a nightmare.
It increases costs, and that is why commercial software for desktop Linux is thin on the ground.
The GNU desktop has not been killed; it is outsold by Microsoft Windows and, to a lesser extent (understatement), MacOS X because of inertia. MacOS X is also outsold by a factor of 100 by Microsoft Windows because of inertia, and MacOS X doesn’t have the supposed flaws he is talking about.
So Miguel de Icaza may be right and he may have valid points about the flaws of the GNU desktop. His criticism may be valid but saying that it “killed the GNU desktop” is a bit over the top. The GNU desktop has not been killed anyway.
Miguel de Icaza knows something and thinks he knows everything. He has a point but he doesn’t see the big picture.
The Linux desktop is dead as a mainstream contender. The world-domination plan to launch an adequate and self-sustainable alternative to Windows is dead, and that was one of the major motivators for free software developers in the area. The other claim is that the platform is not growing and developers are fleeing. That claim of course needs hard data to be proven, but gut feeling confirms it.
Just a few comments..
First, stop going on about how the Linux desktop isn’t dead because “you” are using it to post how it isn’t dead. It was neither said nor implied that the Linux desktop vanished or doesn’t exist. The piece is about how it’s never become a real success. Stop being drama queens about it.
Next, too much choice often IS bad for users because they feel overwhelmed and lack the knowledge to make a good decision. Couple that with the fact that many users don’t actually know what they need or want and all that choice is even worse. It’s no surprise that some of the most popular Linux distros are the ones that try really hard to be like Windows.
Also, arguing that Linux doesn’t have compatibility and breakage problems is like trying to argue the Earth is flat. Everybody knows it’s bullshit and they’ve known it for a long time. That being said, yes, in some use cases it can be very stable. Don’t get too excited however, the same can be said for the competition.
Linux can be a great choice depending on what you need out of your OS, software, and hardware. BUT, it can also be a horrible choice as well. Anyone who doesn’t understand that or is unwilling to admit it has their head in the clouds. Linux suffers from a whole myriad of problems that can easily make it a poor choice for Joe Average.
Lastly, OS worshiping/bible-thumping is moronic. The more “you” do it, the more of an idiot you make yourself out to be. There is no “best”, only what works best _for you_.
The choice would be great if the pieces were actually interchangeable. But they aren’t, so the user is either presented with a box of bricks that don’t quite fit together (they weren’t created to), plus the non-trivial task of assembling them into something functional (lots of hand-made glue needed), or can take someone else’s puzzle with its obvious all-or-nothing approach. While there are a lot of such puzzles, none of those sets aligns with 100% of a given user’s needs. In other OSes (of the pre-iPad era), while the set of bricks one can select from is definitely smaller (“less choice”), they can actually be easily mixed and matched to generate a far greater number of combinations than the number of Linux distros.
Thank you, that pretty much sums up my feelings.
Re too much choice, I do think it’s worth mentioning that Windows offers a lot of choice for more experienced users – not just in terms of vast numbers of Windows applications, but also modifying the desktop’s behavior (e.g. with TweakUI) or even replacing it wholesale (e.g. Litestep).
The difference with Windows isn’t lack of choice, it’s that Windows ships with a standardized (and quite sane) default setup. Choices do exist, but diving into them is not necessary for end users.
No Windows 8 post will be able to top this.
Miguel is going to crash Slashdot servers.
Linux has never had any real momentum on the desktop, and Miguel’s flamebait use of ‘killed’ just shows how bitter he is that it didn’t actually go up in flames once he left. Looking at some of his points, codecs for watching movies, really? Is there any Linux user out there who has problems with codecs for watching movies? PDF Viewers?
Miguel left Linux once his attempt to push Mono onto the desktop (both user and enterprise) failed utterly and now he has ‘seen the light’ in Iphone and OSX where he is selling MonoTouch but of course due to his bitterness he can’t let go and thus tries to paint the Linux desktop as having been ‘killed’. (pause for dramatic effect)
Apart from Ubuntu there’s never been any serious push to put Linux on the mainstream desktop, and while it has become the most popular distro, I don’t think it attracts a lot of the people who were already using Linux, because they are drawn to Linux for its flexibility, not a one-size-fits-all solution.
For mainstream users to move to the Linux desktop in any serious numbers there has to be something drawing them there, and there never really has been: your Facebook page will look the same in Windows, you can listen to your music, watch your videos and, unlike on Linux, you can play just about every game for the PC platform. Add to this that the OS comes preinstalled with the machine you buy, and there’s really no incentive for mainstream users to switch.
Linux on the desktop has always been the haven for developers and tinkerers, people who want a great level of control over their system and the ability to customize it to their needs; this is not what the overall user wants.
In fact the overall user generally doesn’t give a crap about the OS as long as it allows them to do the aforementioned (surf, watch movies, play music and play games comfortably), and Windows does this perfectly well.
Linux on the desktop is chugging along just as it always was, and like the kernel the surrounding components are pretty much always in heavy development, sometimes breaking backwards compatibility and requiring recompilation of affected binaries.
That’s the ‘price’ you pay for this development model, which is not controlled by a central authority; if you can’t live with this then yes, you are better off with something other than Linux, or should at the very least stick to a distro like Ubuntu, which moves very slowly and works on ensuring compatibility between its versions.
So no, I don’t think Linux on the desktop will ever reach any ‘critical mass’, but it sure as hell is never going to be ‘killed’, and Miguel’s departure certainly hasn’t made a lick of difference one way or the other, even though he really would want it to be so.
Getting codecs etc working is a ball-ache. This shit is a piece of piss in Windows.
You might be able to get them working, I might be able to get them working … but it is still more effort than it needs to be.
Yeah. If you use VLC…
K-lite codec pack. VLC and any number of third party programs.
So it’s basically the same as on Linux, isn’t it?
Nope as I already explained here.
http://www.osnews.com/thread?533216
… don’t even need to go down to the command line to install VLC.
1.- Go to the download section on http://www.videolan.org
2.- Download (on debian based distros you can even skip this step, just click install on the web).
3.- Click to install
Or you can use the package manager.
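To make the package-manager route concrete, here is a minimal sketch assuming a Debian/Ubuntu-style apt setup (the ‘vlc’ package name is the standard one there; other distros will differ):
sudo apt-get update        # refresh the package lists
sudo apt-get install vlc   # installs VLC plus the codec libraries it depends on, resolved automatically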
I used to swear by K-Lite too, but if you are on Win 7 try the Windows 7 Codec Pack by Cole Williams. Takes care of 32-bit and 64-bit, takes care of everything, no muss no fuss, and even lets me burn videos with AC3 audio in Win 7 DVD Maker, which was a PITB before. Really great and hassle-free.
Ehh, sudo pacman -S mplayer or sudo pacman -S vlc automatically installs any ‘codecs’ I could possibly need as dependencies together with either of the aforementioned programs on my distro (Arch).
I seriously doubt it’s much harder on other distros (perhaps you need to enable a particular repo, which is no harder than googling for and installing cccp-codec-pack or whatever the rage is these days on Windows).
Yes because dropping down to the command line and editing repositories is something a normal person wants to do … NOT!
And as you say it probably works the same way in another distro … BUT NOT THE SAME WAY.
Miguel’s criticism is the fragmentation, and your comment just proved it: you can do it in a myriad of different ways depending on your distro. Even if you take all of the different ways you would normally install these on Windows, it is still walking through an installer wizard.
Let’s not forget that depending on the media player and your audio backend you might have to install different packages.
I can’t be arsed with this shit half the time, and I am an OpenBSD user. If anyone else had to do this they would say “this is a bit shit isn’t it”.
Let’s compare it to Windows.
In Windows:
* Download Klite Codec Pack, VLC, iTunes or any other popular media player
* Install.
I don’t think you need to use the commandline to add repositories on say Ubuntu as this image suggests:
https://help.ubuntu.com/community/Repositories/Ubuntu?action=AttachF…
And on my distro the repo (extra) is enabled by default, no editing necessary.
Well, naturally there can be differences between distros, they are essentially different operating systems sharing components, Linux is just the kernel. But I don’t have to give a crap about how it works on ‘other distros’ anymore than I have to care about how it works on Windows or OSX, I only have to learn how it works on my distro and it sure isn’t hard.
And furthermore, once you’ve learned to use a package manager it gets infinitely easier to manage your installed software than through separate uninstallers like on Windows, each with a tendency to leave crap behind, leading to the well-known ‘ever growing’ Windows partition problem.
Learning to click through an installer, using Ubuntu’s app installation GUI, using pacman -S or apt-get, or using OSX’s application bundles and installers: none of these is really any harder than the others in practice.
You learn it in 5 minutes or less. I don’t get why you try to paint this as some major hurdle, I’m guessing you have fallen off that bike of yours one time too many.
I’m not following this at all.
Who is ‘anyone else’, certainly not me and certainly not the Linux users out there. I get it, it’s too hard for you, but no one is forcing you to use it.
Again, sudo pacman -S mplayer
then play movie, bluray, 10bit x264 encoded anime, tv series, dvd’s etc
It’s fragmentation, which is what Miguel was on about; installation is just one issue that is difficult.
Ease of use and things happening without effort are a big deal. Saying “once you learn how a package manager works” … are you serious? You need to have a mental picture of what packages and dependencies are; I certainly couldn’t explain it easily to 90% of the office I work with, and most of them are fairly decent at using a Windows PC.
Of course you can say “well, they don’t need a full understanding”, but then you are encouraging people to mindlessly type things into a terminal as an admin user, which is no different from people mindlessly clicking through installs (while being potentially more damaging to the system).
It’s not just installation either. It is a whole host of things that are wrong with Linux; you can say this or that is easy (IF YOU KNOW HOW), but there is so much fragmentation and so many niggly little problems that all add up to a lot of frustration if you don’t know how to fix them. Fragmentation (even between versions of the same distro things can be different) only makes this worse.
And using contrived examples to prove a cheap point doesn’t make his comments any less relevant or meaningful.
But whatever, Linux users have a long history of ignoring what other people are saying and just classifying those with decent concerns as “haters”.
Mental picture? Packages are applications, dependencies are applications which another application needs, and really you don’t need to understand dependencies as they are automatically resolved; you just need to either click on the program (as in Ubuntu’s GUI package manager) or type a simple command-line phrase containing the name of the application. Anyone can learn that, it’s not hard.
It’s not some deeply fundamental thing you need to grok, and since you used the concept of a program and corresponding codecs as some prime example of how easy Windows is to use then how can you pretend that the concept of packages are hard to grasp?
You do know that Windows will come with its own App Store, which is essentially nothing but a package manager with some DRM bells and whistles.
Would it help you if we from here on call a ‘package manager’ an ‘app manager’ instead, will that make it easier for you to grasp?
Lol, how is it ‘mindless’ to type a specified command followed by the name of an application, and how is it at all more potentially damaging to the system than clicking on third-party installers downloaded from the web?
You are the one who has tried to present contrived examples and claiming the equivalent would be ‘oh so hard on Linux’, I’ve shown that was bullshit, it’s not even hard on Arch Linux, which is anything but a ‘hold-your-hand’ distro.
Heh, based upon your comments here you don’t even seem to have used Linux at all, yet you are certain you have ‘decent concerns’ and anyone disagreeing with you does so because they choose to ignore you as a ‘hater’.
Sorry, you do need to know what packages and dependencies are, or at least have some concept of them, because when you want to install something outside of the package manager or outside of the repositories you have to either add repos or add the packages using a tool like dpkg or rpm. If you’re on a Debian-based distro you may even have to use alien to convert the package.
Spotify, Dropbox and Skype come to mind (Spotify is a nightmare to set up on anything other than Ubuntu).
No, it is not hard if it all works properly, but that is rarely the case, and it works differently on pretty much every major distro… that is fragmentation, which is what Miguel’s comments were about.
As I said, you keep missing the larger overall point.
As for me not using Linux, I’ve tried pretty much every major distro since 2004. I run a Fedora machine and am an OpenBSD user. I know my way around a *nix box.
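For what it’s worth, the “outside the repos” case on a Debian-based distro usually boils down to something like this (a rough sketch; “some-app.deb” is just a placeholder for whatever package you downloaded):
sudo dpkg -i some-app.deb   # install the downloaded package directly
sudo apt-get -f install     # pull in any dependencies dpkg reported as missing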
That’s hardly something many users need to do, as even if the application is not available in the official repos it most likely is available in the user repo, where someone has packaged it for you, just like on Windows where someone has created an installer with dependencies or a self-contained .zip file.
If you are forced to build it yourself from scratch then I don’t see how it’s any different from doing the same on Windows: you have to make sure you get/build the dependencies manually.
Of those I’m only using dropbox, and that was no hassle at all.
Well I disagree with your notion that it ‘rarely works properly’, if it rarely worked properly then I certainly would not use Linux as my day-to-day desktop system.
I have not denied any fragmentation; I’m saying that it has nothing to do with Linux’s lack of desktop penetration (the reasons for which I’ve listed above). You don’t have to learn how to use 10 distros, you only need to learn how to use one distro, just like you don’t need to learn Windows to use OSX.
Fragmentation, or rather diversity is a natural part of any ecosystem where there is no authority calling the shots.
Looking at the Linux desktop, Gnome developers want people to use Gnome! KDE developers want to use KDE, XFCE developers want people to use… ah, you get the idea.
None of them can force people to use a their offering so not surprisingly it will be diversity as there are many different preferences out there.
The only way to get everyone to use the same thing is either by removing the choice (hello Metro!) or by creating something which gains critical mass because it simply appeals to just about everyone, which is pretty goddamn hard.
This is where distros come in: some of them will say “we will use this desktop environment and that’s what we will support, we make this choice for you”, at the cost of flexibility, but you don’t have to maintain anything yourself. Ubuntu would be a prime example of such a distro.
On the other side of the spectrum we have distros like Gentoo, Arch, Slack etc which basically say we provide you with the tools and a large repository filled with everything you need to setup your system the way you want it, get at it.
You choose the distro based upon your preferences; in practice choosing the distro is the equivalent of choosing an operating system, like choosing between Windows or OSX, only the distros naturally share a lot more between them than Windows and OSX do at the system level.
The only way a standard ‘Linux desktop’ could emerge would be if a distro became so popular that all others simply died out due to lack of interest, but I don’t see that ever happening, nor would I want it to, except of course if the emerging ‘Linux desktop’ were exactly according to my preferences, which is unlikely to say the least.
I prefer the diversity and being able to pick and choose between a wide range of offerings and get a system that is perfectly tailored for my needs, YMMV.
I’ve seen plenty of people just type in whatever Linux instructions they saw on a forum to solve a problem.
Most third-party installers won’t destroy your MBR, but it’s easily doable with “dd” in *nix.
With an installer, you just need to know the application isn’t (in all likelihood) malicious and how to follow it through.
Installing stuff through a command line interface is harder, and novice users will enter whatever you tell them.
We could argue about this all day.
Still doesn’t change that Linux is fragmented and it makes it difficult for developers.
Miguel’s attempts with Mono are perfectly in line with his claims.
It was basically the last sensible effort to equip Linux with a set of binary-stable system APIs for applications, as even GCC apparently sucks hard at this (even glibc breaks binary compatibility from time to time).
Assuming he perceives third-party commercial software as the customary measure of success for an OS (a view most of the world agrees with), I see this as an honest effort to spare the Linux desktop from irrelevance.
If you look at this from a distance, it is exactly what Google did with Android, and given its success it’s not hard to imagine Linux could have had a shot in, e.g., a form factor that wasn’t well served by Windows (like netbooks).
I don’t agree that Linux never had momentum. There were times when mainstream tech magazines pitted it against Windows in front-page articles.
I still remember when about the only Windows advantage was its binary compatibility. It was more resource-hungry, much less stable and completely unprepared for the looming internet era. The Linux desktop solved real problems for people at that time. It seems unimaginable how MS managed to capture desktop dominance with such a piece of crap, but they did. The irony is that now that Windows is decent, MS is on the verge of losing the OS game.
That’s because the layers on top of Linux were crap and still aren’t as good as the competition.
X isn’t as bad as it used to be but it is still the weakest link in the chain.
This is actually not true. The ABI/API of the kernel towards the userland has been pretty much stable for years. People keep on confusing *INTERNAL* kernel APIs with the ones towards the userland.
While the former are subject to frequent changes (which is why kernel developers encourage anyone to get their kernel-related stuff *INTO* the kernel), the latter are very stable.
I have never had any single binary which wouldn’t run on a modern Linux kernel because there was an API mismatch.
The last change in this regard was the jump from libc5 to libc6 and that happened AGES ago.
On Windows, on the other hand, userland APIs change very often and without prior notice. Microsoft keeps on introducing and dropping stuff.
Have you ever wondered why Steam reinstalls DirectX for every game? Well, it’s because DirectX is constantly changing:
http://forums.steampowered.com/forums/showpost.php?p=23759166&postc…
Adrian
Dear Adrian.
I….
You…
Ah.. Forget it…
And whilst everybody continues to argue about Linux and application support, millions and millions of iPads (and soon Google tablets) are being sold and users are getting their stuff done on all new apps that only exist on mobile platforms.
By the time Linux catches up to the desktop of now, the desktop won’t exist any more, and the next gen of apps that consumers require won’t be on Linux either.
Making Linux for the average user should be dropped immediately and a single, unified, concerted effort should be put into a single, unified mobile platform.
Linux is a kernel, making Android as much Linux as is Ubuntu.
So, yeah, Linux is pretty much alive on both Desktop (never been better numbers-wise), Servers, Tablets, Phones etc.
Don’t know why this article is linked on OSAlert. It is painfully obvious that the writer fell in love (his words) with the iPhone and OSX and is now making the well-known argument “I’m switching to [whatever], therefore everyone (‘many hackers’) is doing it”, and this article is just him rationalizing it to himself.
Getting emotional about software and his iPhone aside, I disagree with him on his main point that “compatibility across Linux distributions” is a problem. It is the package maintainer’s task to make sure that the software is well integrated into the distro’s environment, not the developer’s.
What he probably meant to say was that non-FOSS software is a hassle to maintain. Well, I will agree with that, but this opens an entirely different, more than a decade old, can of worms called “The Cathedral and the Bazaar”.
Oh, and I would like him to explain to me, what exactly is a “third party developer” in an OSS ecosystem?
(edit) That didn’t contradict you; maintaining third-party software products is not commercially sustainable in OSS systems.
Besides, the whole approach of being lenient about compatibility breakage in individual projects creates a complex set of hard dependencies that is not manageable even by dedicated system integrators (distro makers), resulting in all kinds of update headaches. While every component might be technically of good quality, the composition of them fails constantly.
Re the desktop ceasing to exist, I think it would be better to say that the desktop may cease to exist as a separate device.
Some things are much more easily done on a desktop. Coding, writing, image editing… Smart phones can do these things now, but don’t have the right display and input mechanisms for it.
My suspicion is that, in 10 years or so, a “desktop” will be a combined monitor, keyboard, and I/O hub that you plug your phone into. IOW then-current desktops will not be separate devices for most people, because smart phones will be powerful enough to do everything a modern desktop can; but the role of the desktop will still be there, because there simply isn’t a better way to do a lot of things.
Not gonna happen, and here is why: mobile chips are naturally designed for battery life above all and even the latest and greatest has MUCH worse IPC than even a 6 year old C2D much less the new AMD and Intel multicore monsters.
Can we put that “Tablets are killing the desktop!” meme to bed already? I’ve been selling systems since 1993, right down in the trenches, and what I’m seeing is the exact opposite: there are more x86 systems than ever, and THAT is why sales are down, not because anything is replacing anything. What jobs does your average user have that a Phenom I X4 or C2Q can’t do with cycles left over? None, that’s what. Heck, I’m gaming on a 3 year old Phenom II X6 and my youngest took my Phenom II X4 as a “hand-me-down”, and both sit there twiddling their thumbs while playing the latest games!
The simple fact is in their “war of the cores” both AMD and Intel leapfrogged right past good enough and went even past insanely overpowered and right to ludicrous speed! People are buying tablets and smartphones because in the case of the tablets they are filling a niche, the “check my email while plopped on the couch” niche while the smartphones are having their own MHz war and as such the phone you bought last year won’t run the latest this year.
Desktops aren’t going anywhere, they are simply a mature market. Since ARM is already talking about “dark silicon” where the chip will have more transistors than the battery can feed give it 3 years and you’ll see the same thing happen with mobile, then like desktops folks won’t replace them until they die.
To be honest, I didn’t bother to read Mr. de Icaza’s article. Given his history, I doubt that I owe him a minute of my time.
However, there’s no doubt that Linux on the desktop was never a success and to be honest, I can’t say that I really care.
For me and my colleagues, Linux (be that KDE 4.x or XFCE) is a dependable environment on which we build our business.
For me and parts of my family and friends, Linux is a dependable environment on which we conduct our daily lives.
Do I care that Joe-six-pack doesn’t and will never use Linux? Not really – that is, if Joe-six-pack-2020 will even remember how to use a desktop computer….
If anything, the attempt to attract Joe-six-pack to Linux (Gnome Shell, I’m looking at you) only damaged Linux in the eyes of those who really care – power users.
Linux desktop is a 1-2% platform aimed at power users and will most likely remain as such. Time to concentrate on that.
– Gilboa
As I said, the prospect of the Linux desktop actually making a difference to the outside world motivated a lot of devs. Without that motivation they will find better stuff to do (like mobile), resulting in slow degradation of Linux desktop projects. This equals either radical feature trimming (Gnome 3) or a slow collapse in the quality department.
This won’t happen overnight but is evident already.
I’m not sure about that. I hope the real target demographic of most Linux distros will always have enough people to keep one or more usable distros around. And while the real target audience might not include average joe or his grandma, it hopefully includes enough devs for it to be OK for a while.
His constant endeavours to weaken the community with impossible missions that only flatter Microsoft can not be explained by stupidity. There is no other way, Miguel de Icaza is either a shill or a psycho.
The “your code is my right” crowd underestimated how many Unix developers didn’t care about politics and just wanted to get work done.
Most people including most developers don’t like dicking around for an afternoon just to get sound working or to fix a broken update.
But Apple didn’t kill the Linux desktop, Linux shot itself in both feet and then a team of FOSS doctors argued over which foot to start on while the patient bled to death.
I don’t think the problem lies with Linux (or even Windows, because Win8 on the desktop is a worse disaster than Gnome 3!) but rather with current design trends interfering with well established desktop paradigms.
Seriously, designers, fuck off with your touch screen desktop bullshit. If you want a touch interface you can go ahead and have a whole separate shell and your own sections of frameworks for your touch/tablet purposes instead of trashing the desktop stuff!
Although I feel the pain of the unstable ABIs (a real issue for proprietary software needed for specialty areas), I think that the rest of the issues are just a storm in a teacup really.
Windows 7 is a great success. Windows 8 isn’t even out yet so there’s no way to gauge whether it’s a “disaster” or not. Even if it flops, it will be little more than a speedbump for Windows.
In terms of desktop, OSX is chugging along as usual. Windows is doing great as usual. And Linux continues to be the red-headed stepchild. The `whole desktop space` is not suffering, it’s business as usual.
I can gauge a disaster coming.
The likelihood of Windows 8 being a success despite all the negative feedback is about as likely as Ron Paul winning the presidency.
You can’t release a product that pisses off your most important customers. Feedback from CIOs has been overwhelmingly negative, this is going to be worse than the Vista release.
Yeah, people said the same thing about Y2K. It’s funny how that turned out.
Interesting theory. What have you based this on exactly?
Although Vista received a lot of criticism, it was a successful product. That being the case, you’re not very convincing in claiming Windows 8 is a certain disaster.
Personally I have decided to stick with LTS releases.
Let’s hope Google can build ChromeOS into a more traditional desktop for both users and developers… and has the staying power to keep the base maintained and evolving.
Linux is for technical people who don’t give a shit about bells and whistles and fancy things and so on…
Linux is about superior technical correctness and stability. Everything else is secondary. Take a look at embedded, servers and HPC. In these areas Linux reigns and rules, especially HPC.
There are several businesses that I have convinced to run Linux KDE desktops, and they like KDE quite a bit.
For web browsing, mail, or just running web apps, it is great for corporate applications.
The only thing that has any staying power seems to be MS Office.
When that is required, they rdesktop into a KVM instance of a Windows desktop to write or open documents.
That is pretty rare now, as most of the applications this year have been built with OpenOffice in mind.
-Hack
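For the curious, that rdesktop step is roughly a one-liner; the username, resolution and hostname below are made up for illustration:
rdesktop -u officeuser -g 1280x1024 winvm.example.local   # opens the remote Windows session in a local window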
As a server/enterprise Linux is unbeatable. On desktop it suffers but mostly by some rules of the community, so in a certain way Icaza is right. It’s a pain to cope with a new release every 6 months. Now it’s a pain to cope with Unity and GNOME Shell. I think Ubuntu was doing the right thing up to 10.10. After Unity was introduced, the one major distro just started to lose its popularity.
Just because it doesn’t have the market share of Windows (or OSX) doesn’t mean it’s dead. It just means it’s not mainstream.
Kinda makes me appreciate the “One Vision” paradigm of Haiku Desktop. No different audio / video stacks to support or things like that. Just one for everyone.
Linux was just the victim of bad timing. Mac and Windows also went through growing pains. Theirs just came before the web made most of it irrelevant. Microsoft saw this handwriting on the wall over a decade ago, and tried to anticipate it by integrating web functionality into XP, only to get beat up by various governments for doing so.
Google’s Cr48, et al, was the shot heard round the world: The PC is dead. Long live the thin-client/zero-client.
The new ‘desktop’ is whatever the web-site programmer has created for you. This is good and bad. On the one hand, there are far too few well-designed sites. But, on the other hand, we’re back at square one all over again, and inventive programmers, no longer locked into Windows’ paradigm(s), or Apple’s, or Gnome’s, or KDE’s, or … are free to create. Let’s sit back and see what they come up with. BUT …. let’s keep patents and lawyers out of it!
In the mean-time, let’s position Linux to, pardon the expression, capitalize on this turn of events. Let’s have a Linux which installs, out of the box, as a terminal server and a terminal client. Let’s see a Linux which publishes its applications over the web (because that’s still a viable alternative to roll-your-own web-2.0-based UIs), using X-forwarding and/or whatever else. I’d love to have a server at home doing all the heavy up/downloading of my movies, music, mail, etc., as well as mass storage, while making it all accessible to all my devices in the field, or just in other rooms (networked via power-lines, of course). The new Raspberry PI comes to mind as an ideal client to bolt to the backs of all my monitors and TVs. The humblest laptops (which often have the greatest battery life) could serve as my terminals, maybe even replacing my smartphones.
Linux hasn’t missed the boat, my friends. It built the boat. Everything is ready. X, Apache, LTSP, OpenSSI, even the goodwill and well-wishing of the general public (unlike with M$) … It’s all there. Stop grieving over the demise of the old desktop paradigm, and let’s launch the REAL (new) desktop.
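As a rough illustration of the X-forwarding idea (the hostname and app here are hypothetical, and the server’s sshd needs X11 forwarding enabled):
ssh -X me@homeserver      # log in with X11 forwarding turned on
vlc ~/movies/film.mkv &   # the player runs on the server, but its window appears on the local display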
Ah, of course. How stupid of me, not realizing that you can do everything on the internet. I guess I should find that web page that allows me to do professional quality recording and audio editing along with MIDI sequencing. What an idiot I’ve been, not to realize that computers are no longer used for serious work and are just an interface to social networks.
This trend worries me as well, however see this: http://www.audiotool.com/.
Android wouldn’t be the way it is without all the hectic effort put into the Linux desktop over the years. Almost all of the practical result has been a waste because almost nobody gives a damn about any Linux desktop, but Android is taking over the world and the Linux desktop gets some credit.
How do you figure that? I’m honestly asking, as I just don’t see the connection. Android is a custom Linux kernel and a basic core userland, and that’s as much resemblance to traditional desktop Linux as I see in it. The rest of it (the Dalvik VM, the core APIs and frameworks, the graphics stack, etc) bears about as much resemblance to your typical Linux desktop as iOS does to the Apple II, as far as I can see. Am I missing a piece of the puzzle?
This isn’t a puzzle. Do you think everyone has been working on black terminals for the last 15 years? Of course the Linux desktop has influenced Android.
Samsung gets banned for curved corners influenced by Apple, and you can’t figure out how at least a decade of explosive Linux desktop usage growth in schools and homes has influenced Android. Derr, let me type this question into my BSD phone.
The Linux LAPTOP killed the Linux desktop. Once laptops became competitive price-wise with desktops, it made more sense to buy a laptop. No bulky case, no jumble of wires for peripherals, no need to buy expensive flat screen monitors anymore, and oh yeah you can take the laptop anywhere you go.
I bought an Acer Aspire 7730 in 2009. While it came with Windows 7 installed, I installed Debian Sid (via LMDE) on it and have never gone back to Windows. The one nod I’ll give to the author is about binaries. Debian is especially guilty of this. In this day and age of wireless networking, it’s just boneheaded NOT to include the OPTION TO SELECT non-free drivers prior to install. But that’s why Linux Mint rose in popularity, because they listen to and provide what users want. Even though LMDE isn’t their flagship version, they worked hard to make Debian user-friendly (yes it CAN be done, note to Debian management).
That said, LMDE installed just fine, and everything worked right “out of the box”: 3D support for my i915 video, audio, microphone, webcam, touchpad, etc. No config files needed to be edited in a text editor (ahh, memories of 2002 and XFree86), no drivers had to be hunted for; I could get on the net, surf the net, codecs were installed, send e-mail, watch DVDs, listen to music (streaming and locally housed), IM, Skype, write documents, edit photos, edit sound files, etc.
And I updated the repos to SID with no issues (I like getting new software first and SID is a heck of a lot more stable than Ubuntu).
100% free-as-in-beer. Easy to install. And if you choose KDE over Gnome/Unity/Cinnamon I’d say it’s easy to use. And 95% free-as-in-speech. I applaud the Mint team for NOT being of the mindset that it should be extraordinarily difficult to get wireless networking up and running DURING the install because its driver is “morally wrong, dirty, and unfit for Debian’s pristine environment”.
Linux works and works well on my laptop, even with upgrading SID packages everyday, no major or minor breakages in recent memory.
The bottom line: Linux has made huge strides and is easy to install and use now. While there are some Devs in the Linux World with extreme PRIMA DONNA attitudes (*cough*Stallman*cough*Debian*cough*) there are projects that have sprung up to plug these very obvious holes. Sure, Linux on the desktop is dead, but so is Windows and OSX on the desktop. This is the age of portable devices!
A repost from 2007-07-20 …
I’ve been arguing the same point about fragmentation for nearly ten years now all the while being told how I’m wrong, how Linux will rally around vendor supported distributions. Linux zealots refuse to recognize that open source is a double edged sword.
Vendor X merely needs follow the GPL and then they can blame vendor Y for not implementing X’s solution to problem ABC. Vendor Y says “our solution is better” as vendors tend to do. In the end users are left with software that isn’t interoperable. Of course they are free to pay us to implement it which wins support for Linux but doesn’t solve the root problem.
Zealots insist that people want choice – they just can’t understand that people don’t want to be forced to make choices. People want a good default and the option to replace it if they don’t like it. The constant KDE vs. Gnome, Distro X vs. Distro Y has, in part, crippled Linux adoption on the desktop.
Of course the fact that it relies on X11 only makes it worse. Which distribution? Which desktop environment? What window manager?
The proof is in the pudding.
I mean why not include five WMs, desktop environments, browsers, file systems, etc. and make the user pick during installation?
That is a hideous thing to do to all but the most advanced users. Linux is contrary to its own philosophy here: almost every other vendor provides a default and options thereafter, yet the Linux market just can’t seem to offer that same kind of standard to those on the outside looking in.
Argue till you are blue in the face about it – it won’t change the facts that people only want the option of choice and that nobody likes to be forced to make any decision. It is true that nobody is holding a gun to anyone’s head; however, we are asking the world to adopt new technology, and it wouldn’t be the first time the technical and scientific communities were in error in doing so.