InformationWeek is speculating on how Linux will change in the next four years. “By 2012 the OS will have matured into three basic usage models. Web-based apps rule, virtualization is a breeze, and command-line hacking for basic system configuration is a thing of the past.”
In 2012, barring a miracle, Linux distros will still be bit-part players on the desktop – that’s something I don’t see changing, sadly.
Edited 2008-08-15 05:43 UTC
I expect Linux to gain a bit more market share, at its own pace. Unfortunately Linux doesn’t have the financial backing that Windows has, even if it has a few companies sponsoring it, such as Red Hat, Novell, etc…
The rationale is that as Linux gets more contributors and financial backing from donations or support sales, it gets better. For Apple and MS, it’s the opposite: they gather the money needed to get the product they want, they market their product, sell it, pay their bills, reinvest some of the money into further product improvement and keep the remaining profit.
I’m sure many people who use an open-source OS don’t see the point of paying $49 a year for tech support if they can get fairly good free tech support online. So they don’t pay and don’t contribute to the development of Linux. I’m sure if major players of the Linux community gathered money to develop a really good Linux OS and charged $49 for it, people would buy it. Heck, $49 is nothing compared to your daily expenditures. The product would be developed more professionally; it would be more attractive and easier to use. Forget Gnome and KDE. Forget intricate terms and conventions. Ease of use. Eye candy. A turnkey operating system. People would actually use it. It would be to operating systems what Firefox is to browsers. There would be ONE main desktop-oriented Linux, not hundreds of similar distros. The profits could be reinvested in their entirety into the development of the OS itself.
At some point, there would be so many users that the investment would have been recouped, and the product could be freed and open-sourced. The small percentage of users who would purchase add-ons, DVDs and support would be enough to keep sustainable development going. People have no problem paying when they see value in their investment. But this rationale is incompatible with the open-source philosophy…
There is no room in the OS market for another Windows-like OS. OS X has its hardware niche, but that’s it.
It’s called a monopoly, you know?
The only chance GNU/Linux has is actually not to rely on a typical business model. Linux is only able to grow (very slowly, but steadily) on the desktop because it does not have to compete with Windows at all.
And that’s the only way for it to survive. Your model would just fail on the market, as others did before. Think about OS/2 or BeOS. Both were sophisticated approaches, and the first was even out before Windows 95.
i didn’t know gnu/linux was a business…all it has to do is satisfy its own users, even if nobody else wants it, and it will be just fine.
“i didn’t know gnu/linux was a business”
that was exactly my point, btw. did you follow the discussion?
There is one thing though: acceptance by a broader audience has a key value, for example when it comes to hardware support. Sure, most drivers are free, but most of them would not exist without support/efforts from manufacturers. Imagine no one apart from me and you using GNU/Linux. I could not buy a brand new laptop and use it w/o hassle.
[quote]Linux is only able to grow (very slowly, but steadily) on the desktop because it does not have to compete with Windows at all.[/quote]
On both desktop and servers, Linux has been competing with Windows. Think about all those Windows replacement discussions on forums and blogs where people say they ditched Windows for Linux distros like Ubuntu, PCLOS, openSUSE, etc… Also, distro vendors have been competing with Microsoft for a few years now. Think about Novell, RH and Mandriva, who convinced car manufacturers, town halls, ministries, etc. to drop Windows for their own distro. There’s something going on that we have to recognize. Linux is competing with Windows (and the winner is the consumer).
We are the Borg, join us. What kind of an argument is that? Surely you can do better than that pathetic effort.
Well, 2012 is the year that many cultures, completely isolated from each other, predicted would bring the apocalypse…
I doubt the command line and hand editing configuration files will ever go away, especially since everyone still gives out help in the form of console commands. As long as that keeps going on, it won’t go away.
Edited 2008-08-15 13:10 UTC
Agreed. Prettier and ever so slightly easier to use.
-Kernel will be realtime and have a ZFS-like FS
-X will be a user process, and kernel mode-setting will enable smooth, flicker- and tearing-free graphics
-KDE4, GNOME3, OpenOffice4, Firefox4 will be even more modular and more tightly integrated
-3D will work with everything ( at least on Intel, Via and especially AMD/ATI ) thanks to Gallium3D
-Wine will play games faster on Linux than on bloatware windows7 ( thanks to the kernel and gallium3d )
-It will be so beautiful that you would be OK with dying after seeing it ( thanks to Mark Shuttleworth .. hmm .. maybe I am slightly wrong on this one .. )
-Many more great things I cannot think of right now
I truly wish I could be as optimistic as you guys. I’ll keep on wishing you guys luck with your projections for 2012 Linux.
I just wish we had a common set of base APIs for everyday software tasks that could be used (although slightly differently, to fit each one’s possibilities) with most common languages and toolkits.
Things like a configuration files API using a single config framework (no, it doesn’t need to be registry-like, or even to have a unified config format, just unified abstract config concepts that we could map to different file formats as each project sees fit);
Common package naming conventions with versioning information kept sane between all distributions, plus flags for packages stating which features or pieces are included in a given binary package, so that package writers could make packages portable across systems and only redo the work on those that are incompatible due to distribution choices about configurations / versions of libraries;
Hmmm, I can’t think of anything else right now, but I’m sure I could come up with lots of other infrastructure stuff I’d love to have on Linux and that I just can’t remember now, it being almost 4AM.
Getting back to work… I hate freaky deadlines.
For anything to become unified, the majority of the FOSS community would have to do something it’s so far shown itself incapable of doing. It would have to sit down, collaborate and, here’s the big one, actually agree on standards. By collaborating I do not mean they sit down and begin to discuss standards, only to have some of the developers leave the discussion because they think they could do it better. I mean put aside their pride and preconceptions, sit down together (figuratively or literally), and actually come up with something that pleases the majority of both the developer and user communities. Don’t take away the user’s choices, but at least set down a common base.
Is Gallium3d a serious project or just a project built on hopes and dreams?
No, it’s an actual project and it’s about the only thing interesting happening in the open source graphics world.
I think you went totally wrong there…
It will be KDE7, Gnome6 and OOo 5
I don’t know about KDE… but I’d dare say that the Gnome folks would be on Gnome 3 FOR SURE… if they really got Gnome 3 ready by then. Not that I’m criticizing it, and things may have changed in the last six months (I’ve been away from the OSS community for a while since my new job is about Microsoft stuff), but Gnome wouldn’t get to 5 that fast… They wouldn’t NEED to get to 5 that fast. At least not by their standards.
You forgot that pigs will fly with Linux-powered wings…
forget pigs, let’s just try to get a penguin off the ground first, then we can go from there.
Unless a distributor somewhere actually gets a clue, looks into the open source software world at all the functionality available, works out how to use it and what to use, uses the right software and, as far as the desktop is concerned, the right technology, starts seriously building infrastructure to attract developers and gives developers a sane way to distribute and configure their software… we’ll still be where we are now. Linux machines currently live in too much isolation. At the moment, there’s rather too much idealism and not enough problem solving.
We’ll still have lots of Linux servers, but nobody will be trying to compete with Windows Server, so server usage will stagnate. As for the desktop, we’ll have a pretty desktop that has moved ahead of everyone else in certainly the open source and possibly the proprietary worlds, but all the above problems will remain.
2012 will be Linux’s year
Seriously though, between now and then I’d love to see some significant and bold changes within Linux. Changes that will make the mainstream press and Average Joes sit up and say “Wow, Linux? Why have I never installed that before?”
Sadly such changes are either:
* dangerous (look at the way UAC in Vista backfired – though obviously Vista has problems beyond UAC),
* unnecessary (as aspects of Linux already include such changes or work well already),
* or just impossible with Linux’s user / developer base (too many geeks stubbornly clinging to the old ways instead of banding together to push Linux ahead of the game).
Either way, I will continue to use, develop for and promote GNU/Linux where I can, while having the smug self-satisfaction that I’m running the system I want rather than the system Redmond wants.
A previous poster here raised the issue of a standardised desktop, configuration, naming conventions, etc., which is all very nice to discuss.
The make or break of a platform is the ISV and IHV support it receives. Windows Vista is the best example of what happens when you change things too much and the ISVs and IHVs don’t pull finger and support the new platform.
Linux (and other alternative operating systems) is in that situation right now. The hardware support isn’t too bad, but it is pretty shocking that, for example, media player support for MTP devices is basic and problematic at best. The ability to handle multiple audio devices is only just coming into beta now with PulseAudio.
What Linux needs is all of the above, but it also needs big software names and big hardware names behind it. Linus needs to swallow his pride and come up with a stable API/ABI for the driver framework – at least stability between the 2.x releases, rather than breaking it every 2.6.x release.
There is a laundry list of issues that Linux faces, and sorry to be a wet blanket, but I think there are better open-source alternatives that should be marketed at the desktop besides Linux – ones that at least have the foundations required to garner support from big-name vendors. OpenSolaris being one example, FreeBSD desktop variants being another.
The Linux Hater blog did a good article on the problems with the bazaar model versus the cathedral model – too bad there is too much noise versus signal when it comes to discussing these pressing issues the Linux community faces.
Edited 2008-08-15 10:26 UTC
MTP devices were specifically designed to work with Windows, and ONLY Windows. The entire protocol was designed, by Microsoft, to replace the standard, well-known, and well-supported USB mass storage protocol that pretty much every MP3 player used. The rationale was to support their DRM system – there are absolutely no other advantages to MTP over USB mass storage.
It’s a small miracle that they’re usable on Linux (or any other OS) at all. That it’s not perfect is hardly surprising.
Edit: All these kinds of drivers are user-space anyway. I really don’t see a need for a stable kernel driver ABI, though.
Unlike on Windows, USB devices can be driven entirely from user-space (with a stable ABI) – that covers most weirdo one-off bits of hardware. The same goes for printers.
Video card manufacturers have had more trouble tracking ABI changes in Xorg than kernel ABI changes. Wired networking, audio, or storage devices have no real reason to use closed-source drivers – the drivers are trivial compared to things like video cards, and are either open specs, or already reverse-engineered.
That really only leaves wireless network hardware, which is a pain in the ass under Linux.
Edited 2008-08-15 11:55 UTC
May I suggest you get a player that uses USB mass storage, transfer 6,000 songs to it – and watch the speed at which the player loads. In the case of my Cowon, when it did eventually boot, I couldn’t navigate my music. May I suggest you look at the issues of using USB mass storage before making ill-informed comments.
MTP is also fully documented – it is downloadable from the Microsoft website, along with the specification for PTP (from which MTP is derived) – and it was used to replace the variety of proprietary protocols previously used by players. If Apple can support PTP in iPhoto, I’m sure the open source world can support MTP too.
Given that the specification is available, I jolly well expect it to work.
You need to take a lesson in operating systems. There is libusb, but beyond the very basics you need to write drivers against the kernel, loaded in kernel space.
Again, that is a load of bullrot. Windows already has user space drivers. Again, stop spreading lies.
Because Xorg is a decrepit piece of garbage which lacks the manpower required for something of that complexity – look at the number of show-stopper bugs still listed when 1.4.x was released.
Which is a key part of computers these days, along with the fact that the drivers on *NIX are iffy at best, printer support is cruddy, media CODEC support is of a low quality or simply abandonware (in the case of FAAC).
I’ll get marked down for this, unfortunately, because there are too many precious Linux fanboys who spend their whole lives censoring critics like me rather than listening to the issues and actually bloody well doing something about them – or is it the fact that most of these ‘Linux users’ are nothing more than armchair programmers, akin to the ‘armchair’ sportsman who knows everything but never does anything?
Because Xorg is a decrepit piece of garbage which lacks the manpower required for something of that complexity – look at the number of show-stopper bugs still listed when 1.4.x was released.
I haven’t been following X.org development, but at least I am not aware of any serious bugs. I actually haven’t hit a single X.org-related bug in a few years! So, care to tell me what those “show-stopper bugs” you mention are?
Which is a key part of computers these days, along with the fact that the drivers on *NIX are iffy at best, printer support is cruddy, media CODEC support is of a low quality or simply abandonware (in the case of FAAC).
Printer support is cruddy? How come? You DO realize that OSX uses the same back-end? Or go browse the CUPS website and look at how many printers actually are supported.. Media CODEC support is of a low quality? Haven’t seen such myself.
I don’t quite understand what it is with you anyway, I’ve seen you trying to bash *NIX (especially Linux) for a LONG time already. You’re always acting like you are some sort of expert in all areas of life and computing, you act like you are so much better than everyone else around, you’ve even been boasting about having loads of money to spend around.
http://www.osnews.com/story/19846/X_Server_1_4_1_Is_Released_No_Jok…
” is coming more than 200 days late and it doesn’t even clear the BugZilla release blocker bug”
Backend != drivers
Look at the FAAC vs. Nero AAC vs. Apple AAC
What on earth are you going on about – I’m confused even more that there are morons who have added 4 points onto your post when you haven’t contributed a damn thing to the discourse!
You are aware that Nero provides a Linux version of their encoder/decoder? Just use that if you have problems with FAAC…
Guess what codec package I use on Windows? ffdshow, which is based on ffmpeg. On OSX the first codec package that I installed was Perian… which is, again (you guessed it!), based on ffmpeg.
tell me again which codecs suck under linux?
Why would you need to use ffdshow? You *CHOOSE* to use ffdshow when there are already native CODECs available. Don’t use *THAT* as an excuse. Dear god, do some research. Prove me wrong, but make sure the facts are actually concrete rather than “well, ffmpeg is available on Windows, therefore I have to use it to watch media” when that simply isn’t the case.
I never said that (plus you’re completely missing the point of my argument)… I install ffdshow because under Windows I don’t get any codec support barring wma/wmv, mpeg2 and mp3.
The other codecs I use regularly: aac,ogg,flac,xvid,h.264.
containers that I use: mkv,mov,avi,ogg,mp4
None of those work out of the box. Sure, I could install other codecs one by one to get all of those working, or I could just install one package, ffdshow (barring flac, which is a bit of an adventure under Windows).
The reason ffdshow is so popular is that it is easy and quick. Every time someone complains of codec woes in Windows, the first thing people suggest is to install ffdshow.
So what do you use on osx? I know out of the box it is just as lame as windows (swap WM codecs for Qt codecs).
That aside, I was merely pointing out that those nice codec packages are based on ffmpeg, which is available on Linux (or any other major platform).
p.s. ffdshow is a DirectShow wrapper around ffmpeg (which is more or less platform independent), so it is more or less native. YOU do some RESEARCH before assuming I am an ignorant bumpkin.
Edited 2008-08-15 21:34 UTC
http://www.osnews.com/story/19846/X_Server_1_4_1_Is_Released_No_Jok…..
” is coming more than 200 days late and it doesn’t even clear the BugZilla release blocker bug”
Well, I did check the bugzilla entry, and I suggest you check it too. The two bugs remaining in order to clear that bugzilla entry (it’s not even a bug entry, it’s just a tracker for several bugs that will hopefully get addressed soon) are https://bugs.freedesktop.org/show_bug.cgi?id=14639 and https://bugs.freedesktop.org/show_bug.cgi?id=13539. The first one is a memory leak with X.org compiled against D-Bus on a Debian system. The latter is that in some cases you have to apply XKB settings 3 times before they take effect. Neither of those is _severe_. But you were talking about _serious_ bugs and issues. What are those?
I am not saying bugs are good at all, but every piece of software has them. There is no software without a single bug. And the two I found aren’t that severe; most likely they’ll get fixed soon.
Backend != drivers
In the case of CUPS, yes, they are. CUPS isn’t distributed without any drivers – there wouldn’t even be any point in that. Go and check for yourself: OSX uses CUPS and the drivers provided with it. There might be some proprietary additions there too, but most likely those are available for Linux as well if you are willing to use proprietary ones.
Look at the FAAC vs. Nero AAC vs. Apple AAC
But you were talking about general media CODEC support. One codec doesn’t speak for every other codec. And on this specific thing I don’t have much to say; I have always been able to play AAC files without issues, but that’s where my experience with them ends.
What on earth are you going on about
Well… like when you promised to buy a Mac for the guy who couldn’t afford it himself? It’s sad you have such a bad memory!
http://www.phoronix.com/scan.php?page=news_item&px=NjA3Mg
and there have been numerous other articles written on the sad state of Xorg, along with the sad state of GTK+.
Bullshit. The drivers I use with my printer are NOT bundled with CUPS, therefore:
Backend != drivers
The backend PROVIDES the infrastructure for the DRIVERS to hook into; it does NOT provide the drivers. That is *WHY* there is the Gutenprint project, along with Foomatic and so on.
Do some bloody reading on the matter for once.
I used the example of AAC for the CODEC issue; the fact that you can’t even be bothered to extrapolate that one example over the whole range of formats that need supporting tells me you don’t read – you browse and fire off half-baked crap that doesn’t make any sense whatsoever.
It was a flippant remark you dickhead.
Edited 2008-08-15 16:07 UTC
http://www.phoronix.com/scan.php?page=news_item&px=NjA3Mg
and there have been numerous other articles written on the sad state of Xorg, along with the sad state of GTK+.
Again, I didn’t see any mention of those serious bugs you mentioned. And that article is almost a year old now, too. And GTK+ is a GUI toolkit; it doesn’t have anything to do with X.
Bullshit. The drivers I use with my printer are NOT bundled with CUPS, therefore:
I didn’t say ALL drivers are part of CUPS.
No, you stated that drivers were bundled with it; there are no drivers bundled with it – that is why you have Ghostscript, Gutenprint, etc, etc.
No, you stated that drivers were bundled with it; there are no drivers bundled with it – that is why you have Ghostscript, Gutenprint, etc, etc.
Ghostscript isn’t a printer driver. It’s an application suite used to convert and format various things. It is USED by drivers/printing layer. And yes, drivers ARE indeed bundled with CUPS. Gutenprint is an additional package of printer drivers and PPDs. I have no idea if Gutenprint drivers are even included in OSX.
CUPS provides a portable printing layer for UNIX®-based operating systems. It was developed by Easy Software Products and is now owned and maintained by Apple Inc. to promote a standard printing solution. It is the standard printing system in Mac OS® X and most Linux® distributions.
Taken from CUPS website for your convenience.
Where Can I Get Drivers for My Printer?
By mike on Apr 23, 2008
CUPS includes drivers for many printers. Open source printer drivers are available from other sites.
Taken from the CUPS FAQ. It clearly states CUPS does come with drivers included. So will you finally shut up and stop spreading misinformation?!
Ghostscript includes several printer driver backends. They are not used so much these days, but they are there. It can take PostScript and rasterise it into the raw raster format suitable for the printer to use.
As I’ve already told you, the drivers bundled with CUPS are simple example-level drivers. They claim to support “many printers” because they provide very basic support for ESC/P2 and PCL, which covers most Epson and HP printers. You wouldn’t want to actually use these drivers for any real printing, though. OS X does not use them.
Looking at http://www.osnews.com/comments/19643 we see that Kaiwai’s full quote is “You’re complaining about a piddle $1200? geeze, if I was over there, I’d buy you the damn thing; $1200 is chump change. If you can’t afford it, maybe it speaks highly of stupid decisions you made in your life which results in the lack of cash flow today.”
That’s not a “flippant remark”. That is a despicable statement that could only have come from a despicable person. In fact, it reminds me of the Social Darwinists as portrayed by, for example, Dickens, who considered that charity was evil because it attempted to counteract the workings of Nature, and that a person of “inferior wealth and social position” was that way as the result of actually being inferior. And of course, how can one not recall what the notorious robber baron John D. Rockefeller said of his wealth: that it was God’s way of saying “This is my beloved son, in whom I am well pleased”.
You might think that you are smarter and better than anyone whose “cash flow” is not up to your standards, but the only person who could possibly believe that is you.
Edited 2008-08-16 13:12 UTC
No it isn’t. The basic CUPS installation includes only three or four very basic example drivers. I can assure you OS X does not use those.
On Linux you have a bunch of drivers from different places, such as Gutenprint, Ghostscript, HPLIP, Splix, foo2xyz and others. These are all glued together by Foomatic, which is a rather ugly Perl script that tries to hide the differences between all of these drivers.
On OS X I believe they have their own drivers, which are not derived from any of the Linux drivers.
So no, the backend does not equal drivers, even if it’s CUPS.
You do realise that the blocker bug is not actually a show-stopper. If it were a show-stopper, it would be affecting a large number of people.
To quote Phoronix:
Neither of those is actually serious if you read the bug reports.
While I agree with nearly everything you say regarding MTP – after all, the MTP protocol [at least 1.0] was submitted to the USB forum, with extensions being published by MS.
The issue I’ve run into supporting MTP devices, in a project I work on in my spare time, is that many MTP devices don’t actually conform to the MTP specs; they do a half-baked implementation and you end up doing all kinds of device-specific work just to support them. With PTP there hasn’t been an insane amount of deviation from the protocol, but extensions, AFAIK, haven’t been added to PTP to allow/require DRM, which is where MTP gets messy on some devices.
User-mode/ring 3 drivers are indeed common on every major platform these days. People who argue that point need to actually write drivers for a living for a bit, and they will quickly see how often things are pushed up to userland. The less access a driver has to the rest of the system, the better, IMHO.
Fanboys..
Sadly, every platform these days seems to have them [vocal idiots], and many are so darn blind they can’t see the good in something that is not the product-of-the-month they worship.
*doh: didn’t realise you can’t add mod points after commenting. oh well.
Edited 2008-08-15 14:09 UTC
You’re right there – a prime example would be the Sansa from SanDisk, whose firmware is horrendously buggy and unreliable – however, the Creative players seem to be more reliable. Too bad the implementations of MTP/PTP aren’t cleanly portable to non-Linux platforms.
Unfortunately so. There are issues with MacOS X, there are issues with Windows Vista – but lord help someone who *DARES* show that Linux isn’t 100% perfect and without fault. I keep pushing and pushing, harassing and harassing over Linux because I hope that one day Linux users will pull their heads out of their collective behinds and remember that Linux is an operating system – not a way of life; it is written by humans and thus has faults, and, perish the bloody thought, not everyone is going to find it suitable for what they want to achieve.
It is all about being a grown-up, something Linux advocates on this website seem to have a big problem dealing with – you certainly don’t see the same sort of zealotry coming out of the *BSD or OpenSolaris camps to the same degree.
Mandriva’s been fantastic with wireless. I haven’t had to do anything via the CLI or muck with wpa_sup. Boot the machine, add the wifi passphrase and press connect.
I know it’s not so clean on other platforms, and it does depend on the wifi card. I have an outstanding task to test the latest Mandriva against my old Linksys 54GS and its consumer-hostile Broadcom chip. That was the one that made me learn ndiswrapper and wpa_sup config so long ago.
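For the curious, the wpa_sup side of that boils down to just a few lines nowadays. A minimal sketch, assuming a plain WPA-PSK network; the interface name, SSID and passphrase here are placeholders:

[code]
# Minimal WPA-PSK setup (SSID, passphrase and interface are examples).
# Write a tiny network block into wpa_supplicant.conf:
sudo tee /etc/wpa_supplicant.conf >/dev/null <<'EOF'
network={
    ssid="HomeNet"
    psk="my-passphrase"
}
EOF
# Associate in the background, then grab a DHCP lease:
sudo wpa_supplicant -B -i wlan0 -c /etc/wpa_supplicant.conf
sudo dhclient wlan0
[/code]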
I still can’t see a reason for wifi NIC drivers to be so proprietary in the first place. It’s a freaking network interface attached to a radio; where’s the threat to national security?
Don’t get me wrong, choice is good, but for the average Joe there are just too many distros out there to know which is the best. Ubuntu is changing this slowly, as more and more people seem to know what it is.
The battle over KDE or Gnome. Again, choice is good, but it’s not easy to set a standard (learning curve) when you have multiple GUI choices. Look at Apple OSX and Microsoft Windows: you get ONE GUI. Love it or not, you only have one thing to learn, and you can expect to see the same GUI on any other Win PC or Mac.
Package installation, Add/Remove Software, is really not there yet on Linux. Look with the eyes of a novice for a moment. On Windows, you download your apps (or use the supplied CD/DVD), start the setup and voilà, you have all your icons created, start menu and all. Not so with Linux. A lot of the time you’re required to look around because it’s unknown where the apps got installed.
As with the software installation, Linux could use a much better way to install/remove drivers. It should be automated.
Anyway, it’s a long rant from a Windows/OSX user. I tried Linux many times and always came back to Win/OSX.
Don’t get me wrong, but for the average Joe there are just too many different cereal boxes to choose from. How anyone picks a breakfast cereal when buying their groceries is beyond me; they must all be genius decision-makers.
The choice has never been the issue. Confusing the uninitiated by calling hundreds of distinctly different OS/userspace combinations “Linux” while ignoring the branding and differences does a whole lot to keep alive the myth that choice is a bad thing.
Ubuntu, PCLinuxOS, Mandriva.. those OSes are on the right track for general desktop use. Other OSes built with the same commodity parts are on the right track for their own goals.
Eesh.. having a choice of one (maybe one and a half) window managers for Windows was a big reason I started looking at other platforms back in my too-much-free-time high-school days. (RH 5 or so)
A note on software installation: most novice users I’ve introduced to Ubuntu actually found the package management far superior to the find/download/install cycle on Windows… and I must say I agree; just searching in Add/Remove and pressing install to see all the icons etc. appear is soooo easy.
The two developments that stand out on Linux over the last four years are the rise of virtualization and the rise of Ubuntu. If Linux managed two comparable new developments over the next four years, then it wouldn’t be doing too badly, imho.
On the desktop front, at least, I’d be watching for any indication that the media companies are starting to value Ubuntu as a brand rather than as merely a flava of Linux. Once something acquires brand value, it tends to mean it’s well-established enough to have acquired some credibility with consumers. If Ubuntu makes inroads with Linux’s perennial Achilles heel – low or no OEM preloads on new machines – then this might just happen.
Quite honestly, watching just two outfits over the next four years – Ubuntu and Red Hat – would probably cover most of how well or not Linux is doing. Meantime, Ubuntu may well need to decide whether to continue with Gnome and the eccentric (at best) Ubuntu colour scheme or move to a matured KDE4 and accept that Joe Sixpack places a much higher value on eyecandy than does your average geek.
Otherwise, this article seems to suggest “more of the same” with incremental improvements. Pretty modest ones, frankly. The market for mobile phones alone runs to hundreds of millions a year, so talking of Linux in 40 million smart phones and netbooks is hardly ambitious.
How well Linux fares over the next four years also depends very much on how well other major tech companies fare, notably Apple, Google, Microsoft and Dell. To judge from the stock market, the money at present is on Apple and Google as engines of growth, at least in the consumer sphere.
I really don’t know whether this may offer Linux more or fewer opportunities. I’d guess, though, that Apple on a roll will take away at least as many potential Linux sales in favour of an iPhone or PowerBook as anything Microsoft comes up with. If you then add on Microsoft’s latest and undoubtedly formidable server OS on the other side, it’s not hard to conclude that the next four years will be extremely challenging for Linux. Possibly tough enough to show this article as rather optimistic.
By 2012:
– Huge presence in the NetBook & UMPC areas,
- Increasing presence in PCs & notebooks as Dell, Gateway, Toshiba & others openly offer Linux on their products,
– Big presence in Smart Phone & Mobile Phone markets,
- Earning consumer market share & business presence from Microsoft, Apple and Sun,
– Microsoft, Apple and others being forced, by the success of Linux, to evaluate/change business models in dealing with consumer & business markets,
- Software companies recognizing Linux’s importance & vitality with product releases. Imagine a Linux version of iTunes. How about Quicken for Linux, or Peachtree Accounting? And many more popular titles running natively on Linux with no virtualization required.
The last point is critically important, in my opinion, and I can see it happening. Remember the 1980s? There were many companies that would develop games across multiple platforms. You could purchase Origin’s Ultima games for the Apple II+/e/c, Apple IIgs, Commodore 64, Atari XL, Atari ST, Amiga, PC (and clones), etc. EA Games also did the same thing with many titles, as did several other companies. Bank Street Writer was another multi-platform program.
Back then consumer wants & needs drove the bus, not corporate needs. However, corporate wants & needs seem to be what we have to deal with these days.
Kernel 2.0.10.28 will be available, with old bugs out, new bugs in, really old bugs still in.
Wine will be version 1.000123124, with support for new games, but still won’t run Dreamweaver CS2
ATI/AMD graphics will still suck with poor drivers
Booting will (still) be slow and the grub/lilo/… interface will still look horrid
Politics will still drive it further from improvements and back into a DOS-like CLI + VI attitude
Users and developers will still be responding to complaints with “Use another OS if you don’t like it” instead of moving the whole infrastructure forward…
Oh, i guess things haven’t changed have they?
But seriously, if there’s one thing I’d like to see and am looking forward to, it’s an installation 1,000 times better than even what we have now with *buntu etc. (including graphics acceleration, software, all peripherals, etc.). Kind of like a very good out-of-the-box experience.
Wine will be version 1.000123124, with support for new games, but still won’t run Dreamweaver CS2
Actually, apparently even Dreamweaver CS3 is now running and usable with Wine. Failed flame bait
ATI/AMD graphics will still suck with poor drivers
Closed ones perhaps, but they did release the specs for a lot of their hw a while ago. Sooner or later those open-source drivers will catch up, and most likely surpass the closed ones.
Booting will (still) be slow and the grub/lilo/… interface will still look horrid
Looks horrid? How come? Whenever I boot into Mandriva I get a nice, colorful graphical boot menu with animations, and I can even customize it to my own taste.
Politics will still drive it further from improvements and back into a DOS like CLI + VI attitude
I don’t personally know anyone who uses vi. And well, it is rather clear you’re just trying to troll, but doing a pretty poor job at it ^^ Sorry, but Linux distros are moving more and more towards all-things-graphical, while keeping CLI backends for those who LIKE the CLI. More powerful that way than having one or the other.
But seriously, if there’s one thing I’d like to see and am looking forward to, it’s an installation 1,000 times better than even what we have now with *buntu etc. (including graphics acceleration, software, all peripherals, etc.). Kind of like a very good out-of-the-box experience.
If you want my opinion, then Ubuntu does suck. But that’s only my opinion. Try Mandriva; they even install proprietary drivers out of the box and it just works.
Actually, apparently even Dreamweaver CS3 is now running and usable with Wine. Failed flame bait
Really? That’s good to hear.. The last time I tried to install it was back in November ’07, so it might have developed a bit.. I could barely get DW8 to work on it back then.
ATI/AMD graphics will still suck with poor drivers
Closed ones perhaps, but they did release the specs for a lot of their hw a while ago. Sooner or later those open-source drivers will catch up, and most likely surpass the closed ones.
I just hope it won’t take too much time.. I’ve read how hard graphics development is, and they need to start catching up.
Looks horrid? How come? Whenever I boot into Mandriva I get a nice, colorful graphical boot menu with animations, and I can even customize it to my own taste.
Now that I didn’t know.. I’m too used to the ugly black/white CLI-based menu, and was looking for something more friendly to use. This should be pushed forward and made standard rather than kept backwards like it is now (IMHO).
I don’t personally know anyone who uses vi. And well, it is rather clear you’re just trying to troll, but doing a pretty poor job at it ^^ Sorry, but Linux distros are moving more and more towards all-things-graphical, while keeping CLI backends for those who LIKE the CLI. More powerful that way than having one or the other.
Guilty as charged, but it’s more of a rant than a troll. I guess it would be trollish unless I wrote at least a page about this, but I’ll just leave it as is for now..
If you want my opinion, then Ubuntu does suck. But that’s only my opinion. Try Mandriva; they even install proprietary drivers out of the box and it just works.
Sorry, I haven’t tried *buntu; I was just going with whatever is in hype lately. My point there actually was not only about the *good* side of the OS, where it installs perfectly or flawlessly on a machine provided all the hardware is supported, but also what it does to install flawlessly on other machines where the drivers are either not ready at hand, or misconfigured, etc.. Take ALSA and framebuffers for a few examples. It’s becoming a nightmare for me to get them running quickly on machines nowadays.
My predictions for Linux in 2012.
The kernel will have grown so mature that multiple janitor groups will have formed, for security, portability and so on, and will now make up about half of the code changes. The OpenSolaris, FreeBSD, OpenBSD, NetBSD and DragonflyBSD kernels won’t be far behind.
Linux will gain market share, and so will the other free *NIXes. As new users come to Linux thanks to Ubuntu and SUSE, old users will migrate to less well-known distributions or free *NIXes (because they are still cool). All of them together might finally scratch the 4% mark (I’m rather optimistic on this).
Linux and the other free *NIXes will all have their own specific VM/hypervisor infrastructure in the kernel. Xen will evolve into a bare-metal VM/hypervisor.
Linux and the other free *NIXes will start talking about a shared userspace driver model.
We will finally have a competitive sound architecture (ALSA, JACK and PulseAudio integration).
The desktop environment market will look a bit different. We will have KDE 5, a direct continuation of the KDE 4 work, just ported to Qt 5. We will have GNOME 3, and it will have made good on the promise of allowing different GNOME distributions to be built on its core. Because of this, XFCE and ROX will choose to continue as GNOME distributions rather than as totally independent DEs. GNUstep and Étoilé will finally have reached beta and started to attract and build a sustainable community. Enlightenment will either have grown into a DE with distributions, similar to what GNOME is planning, or will have died.
We will see dozens of new command line shells pop up using new approaches (we are already seeing some of this) that try to make the command line a modern experience.
People will still say Linux and the other Free *NIXes will never be an important desktop even though they have already started using Linux and other Free *NIXes on embedded devices all over the place. Linux, Free *NIXes and UNIXes will still hold a very strong position in the server market. Maybe people will finally get it that UNIX isn’t going to die.
Please tell me that your comment about a competitive audio architecture being made up of ALSA, JACK, and PulseAudio integration was sarcasm or a joke? Because, to put it bluntly, the Linux audio architecture is a complete mess of competing APIs and half-baked drivers. Pulse is the closest thing we’ve got in recent years to something decent on top of ALSA, but the fact that we needed something like Pulse should give anyone a clue about how bad the situation has gotten. Here are the main Linux audio APIs I know about, and I’m not counting intermediate libraries like libaudiofile:
* ALSA
* OSS
* Jack
* PulseAudio
That’s four different APIs, count them. So, what happens when we put a few of them together? Here’s an example situation, one that I’ve encountered several times. I’m using VMware Workstation as the example product, but feel free to substitute.
* ALSA loads the basic audio subsystems, including your audio driver, and its OSS emulation layer
* If the audio card in question doesn’t have hardware multi-channel support in ALSA, Dmix is brought into play for software mixing (this is yet another audio layer, though not really an API)
* If your distro has decided to use it, PulseAudio is then loaded, though in most distros ALSA isn’t configured to use it for all sound output
So what happens now? We’ve got three ways for a program to output to the sound card: ALSA’s native API, OSS, or PulseAudio. If they all played nice we’d be fine. Unfortunately, they don’t:
* Programs that use PulseAudio play the nicest, since all PulseAudio settings are honored, including the audio device you’ve selected. PulseAudio handles software mixing itself and does a much better job of it than Dmix. No problems so far.
* Now, a program that uses ALSA’s native API comes in. It honors none of your PulseAudio settings and, therefore, does not honor your volume nor your chosen Audio Device. A bit annoying, not a show-stopper yet.
* Now, throw an OSS application into the mix. Here’s where the fun begins. OSS is handled by ALSA’s OSS emulation. Problem is, the OSS emulation does not utilize Dmix at all. This means that if you’re running an OSS application and don’t have a full multi-channel card, the OSS application will control the sound card completely until it terminates. But, wait, we’ve got another problem: PulseAudio already has the sound device via Dmix. This means that the OSS application cannot utilize the audio device at all. This *is* a show-stopper to the average user, who just wants some freaking sound in all their apps.
Now, there are workarounds, usually utilizing a wrapper script and the LD_PRELOAD environment variable to substitute an emulation library into the application. This doesn’t work in all cases, however; VMware is one such application that doesn’t like either the ALSA or PulseAudio emulation libraries, and RealPlayer can be downright iffy with the Pulse emulation libs, though the ALSA ones seem to work fine. I can work around these audio issues.
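For the record, such a wrapper is tiny. A sketch, assuming PulseAudio’s OSS shim lives at /usr/lib/libpulsedsp.so (the path varies by distro, and the application name is a placeholder – the padsp script that ships with Pulse does essentially this):

[code]
#!/bin/sh
# Preload PulseAudio's OSS emulation library so an OSS-only app
# talks to Pulse instead of grabbing the sound device directly.
LD_PRELOAD=/usr/lib/libpulsedsp.so exec some-oss-app "$@"
[/code]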
But that’s not what’s going to matter in the long run. The average desktop user doesn’t even know what a shared library is, let alone how to write a wrapper script utilizing a linker preload. For them, what they see (or rather hear) is that applications aren’t consistent, and some may not work at all. They don’t know what ALSA and OSS are, or why OSS applications don’t play nice. And they don’t care.
Personally, I’d like to see the audio stack slimmed down considerably. Have ALSA be the bare audio drivers. They initialize the audio cards, let the system know what their capabilities are, and handle the final generation and/or capture of sound. Have PulseAudio handle the rest of the audio work, including mixing sounds, recording, and playing them, meaning the applications go through Pulse to do these tasks, and ALSA only handles the lower-level, card-specific bits as Pulse requests them. Let the native ALSA API go for general application development, and have an OSS emulation layer for PulseAudio that integrates seamlessly with other Pulse applications (since OSS works across various UNIX-like OS’s it’s not going to go away soon). As PulseAudio is relatively portable, all things considered, perhaps in time it would be adopted by other UNIX platforms. Even if not, though, this would considerably reduce the issues encountered all too often with audio in Linux. It’ll never happen, of course, since it would require significant change, and it would certainly not be a one-person project or done quickly.
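Incidentally, one piece of that layering can be approximated today: the pulse plugin from alsa-plugins can point ALSA’s default device at PulseAudio, so native-ALSA clients get mixed by Pulse instead of Dmix. A minimal sketch, assuming the plugin is installed:

[code]
# /etc/asound.conf -- route ALSA's default PCM and mixer through Pulse
pcm.!default {
    type pulse
}
ctl.!default {
    type pulse
}
[/code]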
GPLv4 will rule that OEM machines with Linux on them must be given out free, with blueprints for every chip included, and installing closed-source software on Linux machines will be ruled unlawful… Linus will be forced to license the kernel as GPLv4 or the FSF will pull the GNU tools out of Linux… The chances for the Linux desktop die completely.
no, in 2012 Linus decides to stick his middle finger at GNU/FSF and switches from gcc to LLVM.
Not quite: Linus will flip the “bird” to GNU/FSF but will switch from GCC to OpenWatcom. (Remember, a C compiler is necessary to build the Linux kernel.)
Actually, I believe it’s now possible to build the Linux kernel with Intel’s C compiler.
If Linus flips the bird and uses Watcom then it is likely that there will be a major Linux fork between the Free Software folk and the Open Source folk. Eventually the Open Source stream will be subsumed by business (one way or another), and the Free Software stream will continue.
The reason people contribute to the Linux kernel is that the GPL means no one can take their contribution away from them (as can happen under licenses such as MIT and BSD). Unfortunately Linus, as smart as he may be, doesn’t grok that properly. He is nothing without the community around him. To lose that would doom a non-free Linux.
so basically.. in 4-30 years linux will begin to approach the usability of osx 10.0.1… is anyone sick of linux on the desktop yet?
it’s really getting annoying… linux will this.. linux will that.. the truth of the matter is that linux will continue doing what linux has been doing for the past decade: copying what other people are doing in a hacky, nonsensical way which is only fit for use by programmers…
it’s really getting annoying… linux will this.. linux will that.. the truth of the matter is that linux will continue doing what linux has been doing for the past decade: copying what other people are doing in a hacky, nonsensical way which is only fit for use by programmers…
And you’re saying OSX hasn’t copied ANYTHING from any other OS? Wow, that’s quite a load of.. hmm, better leave that part out. Oh, and just to mention something.. my mom is 56 years old now and can barely handle a mouse and going to the online bank, yet she’s been praising Linux all over since I replaced the Windows installation she had. She’s as far away from the definition of a programmer as is possible.
this is usually the case. linux is great if you are a die-hard developer or if you are someone who only uses the few apps included with ubuntu, but once you get a bit beyond that and you want to download firefox 3, you begin to get frustrated.
Download Firefox 3?
Most distributions include that now.
No, the problem isn’t that. The problem is that there are hordes of FOSS applications that many people aren’t aware of. So they simply state FOSS sucks because the well-known apps are missing functionality, without having checked whether other FOSS solutions exist to fill in the missing functionality.
For example: some say GIMP sucks but haven’t bothered to try Krita, CinePaint or Pixel (which fill in most of the missing gaps). Gimpshop is another option, but that project appears to be dead. It’s also possible to use Photoshop and Paint.NET via Wine and other methods.
Either that or it’s sheer lack of interest in learning an alternative application. In that case, simply run the Windows applications natively or access them seamlessly.
The latter option works especially well if there is a spare Windows machine. That way you won’t ever need a virtual machine and an additional Windows license for VM usage.
Bottom line: Wine, Mono, and my STS software ensures that the other Linux users in my house are successful even in the unlikely case that a suitable FOSS alternative doesn’t exist.
Edited 2008-08-15 19:01 UTC
“Download Firefox 3?
Most distributions include that now. ”
Not mine, because I got Ubuntu before FF3 came out. I had to go to the ^&$%$ Synaptic in order to get FF3, because the *obvious* way of just going to firefox.com is too complicated to get to work, unlike on Mac or Windows. And the Firefox icon at the top of the screen still gives me FF2. It sure would be nice to get a desktop icon when I install something.
GIMP is what is included in Ubuntu. Now, you can say there are hordes of other apps, but as long as Synaptic is the primary means of getting those apps, those apps will never be found.
So, it all boils down to you never having bothered to try anything other than Ubuntu? Nor did you even bother to download the FF3 RPM file of the kind used to install apps in most distros… Good job, you failed to make a point. Whenever I am teaching someone to use Linux, I let them play around with it for a moment, let them ask any questions they have, and remember to mention that installing apps is different than on Windows. On Windows they are .exe files, whereas in the distros I use they are RPM files. On Mac they are DMG files. Those are things you need to know if you want to successfully install apps; you can’t use a .DMG on Windows or an .EXE on OSX either..
I don’t expect to click on an .exe for Linux. But I do expect to click on an appname.something and have it install on any Linux distro, including the popular Ubuntu.
Because, like it or not, Ubuntu = the public face of Linux for all practical purposes, and that is what will be compared to Windows and Mac OS X.
Like a .deb? You can, for example, download a .deb package for Ubuntu (and a whole lot of other format/distro combinations) from Opera’s download page, and on double-click Gdebi will pop up, allowing you to (amazing! ZOMG!) install Opera that way (prompting you for your password prior to doing so, of course). The functionality is there; what is missing are ready-made packages.
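The same thing works from a terminal, for what it’s worth; a rough sketch, with a made-up URL and file name:

[code]
# Fetch a .deb and let gdebi resolve its dependencies from the repos:
wget http://example.com/opera_9.52_i386.deb
sudo gdebi opera_9.52_i386.deb

# Plain dpkg works too, but leaves dependency cleanup to you:
sudo dpkg -i opera_9.52_i386.deb
sudo apt-get -f install
[/code]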
Also, Synaptic isn’t the only interface available for installing and removing packages in Ubuntu; there is also gnome-app-install, filed under Add/Remove in the Applications menu.
But, anyway, if you discount package managers you are kind of missing the point of Linux distributions today, IMO.
Although I have to admit that I’m not a Linux / OS X / “Windows” user, your point seems really valid to me. One of the advantages of Linux is that you don’t need to visit a web site to install software (what a strange concept, by the way); instead, you can take advantage of your distribution’s package manager to do all the stuff for you: downloading, installing, updating. This minimizes the necessary interaction, another important thing in my opinion.
There are many good GUI solutions that make the distribution’s native package manager more appealing to a CLI-hostile user.
If (1) Linux had one standardized package format (or added a kind of abstraction layer, a worse solution) and (2) software manufacturers were so kind as to release installable packages in this format for their applications, those who insist on the habit of downloading software by themselves would be happy.
My summary here: using a package manager is the more modern way to handle installable software. I hope Linux will get more standardization here within the next few years.
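To make that concrete, a typical session on a Debian-style system looks roughly like this (the package name is purely illustrative):

[code]
sudo apt-get update          # refresh the package index
sudo apt-get install vlc     # download, install, menu entries -- one step
sudo apt-get upgrade         # update everything already installed
[/code]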
Opera? I was talking about Firefox. I don’t see any firefox.deb.
Why wouldn’t Firefox come in a .deb format?
Why are there so many for Opera listed? I would have expected 2 links, Opera.deb and Opera.rpm, and not the 5 zillion Operas for every imaginable distro.
The reason Mozilla doesn’t list RPMs is because it’s the default browser for RPM-based distros. It’s already in there. Same for most distros, I would think. Actually, now that I check, they offer a tar.bz2 file that should install on any distro.
That “something” is called a “.deb”. You can’t have used Ubuntu at all.
or rpm, yum, ebuild, tgz, pisi, mo, or one of the zillion other package types out there.
or rpm, yum, ebuild, tgz, pisi, mo, or one of the zillion other package types out there.
You don’t use packages meant for Windows Mobile on Windows Vista, and in the same way you don’t use packages meant for Red Hat on Ubuntu. Trying to twist the thing doesn’t change it for better or worse. Use .deb on Ubuntu; that’s all there is to know. Every distro has its own specific package format; if you download files with the correct extension, then you can just double-click and have them installed.
Yes, debs only work on Debian-derived distros. You can’t have used Ubuntu at all.
What is so hard to understand about what the parent is writing? He said he should be able to install on ALL distros.
Why won’t a Phillips-head screwdriver screw in flat-head screws? Sure, you could solve this problem by forcing everyone to use Phillips-head screws – but really it is up to the engineer. Even dumb-as-a-stick mechanics can deal with different screw types – yet many (supposedly smarter) IT folk completely freak out if they see variation in file names (even if the files do the same job).
In theory it would be nice if all distros used the same package format – so I understand your point. In my experience the tools for .deb give far less trouble than those for .rpm (mostly due to the way they manage dependency versions). If everyone standardised on .deb I wouldn’t complain – but I bet Red Hat (and derived distros) won’t do this, as a matter of pride.
but I bet Red Hat (and derived distros) won’t do this, as a matter of pride.
I don’t think pride has anything to do with it. People have differing needs and views on things; the people backing the RPM format over DEB just think DEB lacks something RPM has, and vice versa for the DEB people. A unified package format would be nice, but it won’t happen for some time yet. I still predict that sooner or later most distros will move to some common package format.
By the way, installing DEB files on an RPM-based system (or vice versa) isn’t impossible either. Most distros already handle converting one format to the other without user intervention, thus allowing you to seamlessly install different packages. Alien is one of those apps that can convert DEB to RPM and vice versa. Rpm2deb is also commonly installed by default on distros, and should be available in repositories otherwise.
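Done by hand, the conversion is a one-liner each way; a sketch with made-up package names (note that alien typically bumps the release number, so the output file name will differ slightly):

[code]
# On a Debian-based system, convert an RPM and install the result:
sudo alien --to-deb some-app-1.0-1.i386.rpm
sudo dpkg -i some-app_1.0-2_i386.deb

# Going the other way on an RPM-based system:
sudo alien --to-rpm some-app_1.0-1_i386.deb
[/code]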
no… actually they are not RPM files.. there are at least, what, like 20 different types of Linux packages… STANDARDS, PEOPLE…. Pick a freaking major here, or you’re just going to end up in school for the rest of your life trying to decide what to do..
When was the last time you ran the update manager?
Actually, last time I checked, the Linux version of Firefox as downloaded from their site doesn’t need installation at all. You download the compressed archive, uncompress it and look for a “firefox” script inside the folder. Then you create an app shortcut to this firefox script, and you can manually add the menu entry (with kmenuedit in KDE).
You have to uninstall the previous version of Firefox before running it, or it will run the old version. I guess there are ways to have both versions at the same time, but I don’t know them right now.
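In shell terms it’s roughly this (the version number and target directory are just examples):

[code]
# Unpack the official tarball somewhere under your home directory:
mkdir -p ~/apps
tar xjf firefox-3.0.tar.bz2 -C ~/apps
# Launch via the bundled script (point your shortcut at the same path):
~/apps/firefox/firefox &
[/code]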
If that looks like too much trouble, you can always wait until they have it in the repos (in Synaptic, that is). That’s what I tend to do.
Well, getting acquainted with Synaptic or equivalent tools is one of the first things a new Linux user should do. It’s not like having to compile a kernel or something. In fact, I think it’s one of those places where the Linux experience is more pleasant and newbie-friendly than that of Windows – as long as the repos are big, up-to-date and conflict-free as they should be, that is.
no actually.. i’m saying that basically all it does is copy…. hey at least they are venturing from the start menu to the dock.
Actually, IBM is working on moving its desktops to Linux. They’re smart fellas (I dare say, smarter than us).
Yawn. You must be relatively new to computing, since your comments demonstrate a fair degree of ignorance. Windows mostly copies others (examples: the Win32 TCP stack was a port of BSD’s TCP stack, Vista trying to emulate Mac OS X, etc).
Edited 2008-08-16 11:18 UTC
Kernel 2.0.10.28 will be available, with old bugs out, new bugs in, really old bugs still in.
???
Probably kernel 2.6.50, with old bugs fixed and new bugs added. All software has bugs, and new bugs appear when features are added.
Wine will be version 1.000123124, with support for new games, but still won’t run Dreamweaver CS2
Who cares, when CS3 works with Wine 1.1.2? It’s rated Bronze, but judging by the rater’s detailed evaluation – everything works except saving to networked folders – he should have rated it more like Silver or Gold.
ATI/AMD graphics will still suck with poor drivers
Actually, they aren’t so bad now. The OpenGL rendering performance of their Linux drivers has increased 33% recently. There are still bugs, but the Windows drivers have more errata listings in the release notes.
Booting will (still) be slow and the grub/lilo/… interface will still look horrid
Slow is subjective. In Fedora 9, optimizations have been made to the X Server initialization process so it starts in under a second on any relatively modern machine. The Upstart init system is also faster than the SysV-style init system and provides many other benefits as well.
I’ll agree that Windows and even Vista still boot faster, but the difference is small enough not to matter.
A horrible interface? What exactly is horrible about it?
You do know editing menu.lst directly isn’t the only option, right? You can also modify it via a GUI program, such as KGRUBEditor or the Gnome System Tools.
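And even the hand-edited route is only a few lines per entry; a typical GRUB-legacy menu.lst stanza looks like this (the device and paths are illustrative):

[code]
title   Linux (example entry)
root    (hd0,0)
kernel  /boot/vmlinuz root=/dev/sda1 ro quiet
initrd  /boot/initrd.img
[/code]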
Edited 2008-08-15 16:40 UTC
Xorg 7.4 is likely being delayed to ensure that it’s a quality release. Distributions like Fedora are adopting stable pre-releases to obtain the best in-development features early.
GTK+ 2 is stagnating but stable. GTK+ 3 may rejuvenate the toolkit and possibly break backwards compatibility as well.
I have a screenshot of an early Linux 2012 Alpha Release:
————————————-
$ ls | grep txt
readme.txt
test.txt
$ echo 'Drag this!'
Drag this!
$ itunes
bash: itunes: command not found
$
————————————-
Pretty cool, eh?
Following the emergence of Enlightenment DR17, KDE and Gnome merge, and Knome 5 is released by the joint effort. Its main feature is the ability to use applications from within Duke Nukem Forever, which was only released on Linux and Wine after Microsoft EOLed DX9 in a botched effort to make 3D Realms rewrite DNF for Windows 7. Sport as we know it has ended as people take to chewing bubble gum in DNF – which is the first game to become an Olympic sport and replaces the 110m hurdles for London 2012. Whilst playing DNF, players can take advantage of the advanced Knome 5 integration to airbrush onrushing aliens using KNIMP 5, giving each user a personalized DNF experience, whilst simultaneously chatting on KnomeChat. (As part of the Gnome merger, it was agreed all application names would start with ‘Kn’.)
2012, finally the year of the Linux Desktop.
(FWIW I am a Fedora/Ubuntu user)
Linux will be wherever we, the development community, decide it should be. Unlike OSX and Windows, which will be wherever someone else decides they should be. It all really comes down to freedom. Use Linux and go wherever you want, or use something else and go wherever they want you to go.
And don’t be hostile to Kaiwai. Every morning a penguin sneaks into his kitchen and pisses on his wheaties.
Why does this “freedom” always end up as something that’s been done a million times?
You know that penguins cannot piss, right?
…pretty much everything anyone here has predicted will have turned out to be wrong.
Absolutely hysterical response to this article:
http://www.beranger.org/index.php?page=diary&2008/08/15/06/52/08-wh…
Do I agree with him? Nope. It is a good read though.
Haha, this is a funny article. As a full-time user for the last 5 years, I say the only thing that will be different in 2012 will be the version numbers and distribution “codenames”. Ubuntu “pink rabbit” and other wacky names.