After announcing the move to Unity, and the eventual move to Wayland further down the line (someday one day perhaps eventually maybe once when unicorns roam the earth), Ubuntu is announcing yet another major change, this time in its release policy. While they’re not moving to a rolling release as some websites are claiming, they will update components and applications more often.
Currently, Ubuntu is on a six-month release cycle, a cycle it has adhered to very strictly. However, this has one major downside – when major applications like Firefox or OpenOffice/LibreOffice are updated, users have to wait until the next major release before getting their hands on these updates. This can be quite aggravating when the newer versions include important bug fixes or desirable new features.
Mark Shuttleworth wants to change this stringent cycle to allow for more regular updates of common Ubuntu components and accompanying applications. “Today we have a six-month release cycle. In an internet-oriented world, we need to be able to release something every day,” Shuttleworth said. “That’s an area we will put a lot of work into in the next five years. The small steps we are putting into the Software Center today, they will go further and faster than people might have envisioned in the past.”
No word on when these changes will be implemented, but I’m hoping sooner rather than later. With Debian’s package manager as flexible as it is, it seems counter-productive to just hoard all these updates until a major release hits.
Not to mention annoying.
This is just treating the symptoms of a wider problem in that Linux distributions don’t have a sane software installation method for non-core distribution applications.
The next question is how long you will keep providing ‘backports’ (because that’s all this is), for how many releases, and who is going to quality-check them. It just isn’t scalable.
This just sounds like rearranging some deckchairs to be honest.
So you’re telling me the Windows way of maintaining third-party programs is BETTER than the repository method used by SUSE, Red Hat, Debian, Ubuntu, etc.?
That sir is Moo – as in Udder Nonsense.
What’s your definition of maintaining? I think he’s referring to straight-up installation. You want the new Firefox? Go to firefox.org and download it. You don’t have to wait for your flavor of distro to roll it out via the repo. And you don’t often have to worry that any prereqs for the app are going to break your other apps, just as it’s rare for a Windows Update to break things. Updates and service packs might break custom or in-house enterprise apps, but I don’t think it’s an issue most general users face.
If you want Chrome on Ubuntu, for example, you can get the deb from Google, double-click on it, and it will install and automatically hook up with apt so that you get automatic updates (unlike in Windows, where a lot of apps have their own update application – I have like 5 running now: “Adobe Updater”, “Apple Software Update”, etc.). Is that really much more difficult than running an installer in Windows?
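For the curious, this is roughly what that looks like from the command line (the file name and repo line are from memory, so double-check them):

  # download and install the deb
  wget https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb
  sudo dpkg -i google-chrome-stable_current_amd64.deb
  # the package drops an apt source, so future updates arrive via apt:
  cat /etc/apt/sources.list.d/google-chrome.list
  deb http://dl.google.com/linux/chrome/deb/ stable main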
Mozilla prefers to have the distros package their own builds, but they could have it working in a similar way if they wanted to – you can’t blame it on Linux that they don’t.
1. Developers have to provide backports for every single release. When they can’t be bothered you’re out of luck.
2. It’s Ubuntu specific.
3. There’s a reason why the update menu item in Firefox was disabled on Linux when it wasn’t on Windows and Mac OS. It just looks stupid.
1. Developers have to provide “backports” for Windows XP, 2003, Vista and the various editions of them too. It’s the same thing for Linux: just choose the oldest release you want to support and make packages for it, then fix any problems when installing them on newer releases. If you require any newer libraries on Linux they can be bundled.
2. Ubuntu was just an example; Fedora and openSUSE work in a similar manner. Covering them is probably enough.
3. As I said, they should hook it up with the built-in package management; there is no need to provide an included update application when the OS already provides the required functionality. The technology is there, if they choose to use it.
Well, let’s look at the number of applications that you can install on Windows, and the fact that you can install OpenOffice on a nine-year-old OS in Windows XP but not on a nine-year-old Linux distribution.
The only thing that Windows lacks is a means for applications to have their own update repositories and systems.
Well you can call me sir all you like, but no it isn’t, unless of course your goal is to make it as difficult as possible for users to get updated versions of applications and for developers to get those applications to users.
The reason why Ubuntu is looking at this approach of providing continually updated applications is because the ‘better’ way you describe really isn’t.
“Well, let’s look at the number of applications that you can install on Windows, and the fact that you can install OpenOffice on a nine-year-old OS in Windows XP but not on a nine-year-old Linux distribution.
The only thing that Windows lacks is a means for applications to have their own update repositories and systems.”
You forgot that Windows development pretty much stagnated for some years before MS released Vista in late 2006. And Vista sucked. Hence a lot of people still use XP. How many people do you think are still using a 9-year-old Linux distro on a desktop?
And besides, XP has received three service packs and hundreds of updates over the years. Do you think a pre-SP1 XP from 9 years ago would actually run modern software?
“Well you can call me sir all you like, but no it isn’t, unless of course your goal is to make it as difficult as possible for users to get updated versions of applications and for developers to get those applications to users.
The reason why Ubuntu is looking at this approach of providing continually updated applications is because the ‘better’ way you describe really isn’t.”
I disagree. Having a central package management system is far superior to having 100 different installer and updater applications. How is that “as difficult as possible”?
The problem is that a lot of developers leave it to the distros to make packages rather than making their own.
People still use XP because there is a large, stagnant installed base, especially in businesses, who can’t and won’t upgrade as fast as some think they should. A nine year old Linux distribution should be so lucky, and the reason why people don’t use one that old now is simply because few want to run Linux.
The rest of your comment is unrelated criticism of XP, and I don’t see it as relevant.
Because people want to upgrade their applications, get new versions and actually keep their system relevant to the work that they want to do. Upgrading it every six months to do that is just plain stupid, hence why Ubuntu is looking at doing this.
Unfortunately, trying to do it through a central repository system is a gross duplication of manpower and resources where they will have to backport to each and every single release and there will inevitably be a delay until new applications appear. They’ll also have to work out how long they will provide backports for. Most upstream developers refuse to support many distribution packages as well.
It’s swings and roundabouts, pros and cons, and simply making a sweeping statement that a central repository system is the best way is just nonsense. It isn’t. There are just glaring disadvantages that people paint over.
Chicken and egg. There is no sane installation and configuration system for third-party software in any Linux distribution that doesn’t interfere with the distribution itself. A package management system isn’t enough. When someone has come up with one they have been consistently told that they’re stupid.
Look at how easy it is to configure MySQL through a configuration wizard on Windows versus the hoops you jump through when you install on Linux. That’s just the tip of the iceberg.
If I met someone deliberately running a 9 year old Linux distro without a really damn good reason I do believe I’d slap them. Twice. There’s really no rational excuse not to stay with a supported version of the OS.
You might think that but I’m afraid the sign of success is that you have a lot of people using a wide variety of older systems. The reason why no one runs a nine year old Linux system, on their desktop anyway, is because no one uses it.
I’m sure there are nine year old Linux server systems that haven’t been upgraded simply because they’ll be running applications quite happily, and there isn’t the time or the urge to mess with them because someone on OSAlert thinks they should be upgrading continually.
It doesn’t work like that.
How is mentioning the fact that MS has maintained and updated XP over the years “unrelated criticism”?
That’s the actual problem.
I see more advantages than problems.
Yes there is. Packages can trigger post-installation processes, including starting a wizard or whatever. If you install the Dropbox client on Ubuntu, for example, it will ask you to restart Nautilus and to start the client. When you do, it detects that it is running for the first time and shows a wizard that lets the user configure it, just as in Windows. MySQL could do the same thing. The technology is there; it’s just up to developers to use it.
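To make that concrete: Debian-style packages can ship a maintainer script that runs after unpacking, and debconf lets it ask the user questions at install time. A minimal sketch (the template name is only illustrative):

  #!/bin/sh
  # postinst -- runs after the package's files are unpacked
  set -e
  . /usr/share/debconf/confmodule  # debconf helpers for asking questions
  case "$1" in
    configure)
      # prompt for a root password, roughly what mysql-server does
      db_input high mysql-server/root_password || true
      db_go
      ;;
  esac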
Well, actually, OpenOffice 3 runs on Win2k SP2 or greater. But yes, just about everything runs on XP SP2, and SP2 came out in 2004.
You can have a single installation and updating service without having a bunch of OS/application interdependencies. See the iPhone as an example.
Oh, so it is the fault of developers now. Is it the fault of developers when a distro breaks a program with a system update? Leaving it to distro package managers is how (open source) developers deal with the mess. Some don’t have the time and others simply don’t want to waste it testing and packaging.
2000 was until recently still being maintained and updated. XP still is, and it’s still used a lot. 9-year-old Linux distributions aren’t in most cases. That’s the difference. Technically you could probably make OO 3 run on an old Linux system, but there is no point in doing so.
You could have that in Linux too. Compile static binaries or bundle the required libraries. Or use something like Java. The .tar.gz distributed by Mozilla runs on pretty much any Linux system with no need to install dependencies.
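The Mozilla-style approach is essentially a tarball plus a launcher script along these lines (a sketch – “myapp” is a made-up name):

  #!/bin/sh
  # launcher shipped in the tarball -- prefers the bundled libs over
  # whatever (possibly older) versions the distro provides
  HERE=$(dirname "$(readlink -f "$0")")
  export LD_LIBRARY_PATH="$HERE/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
  exec "$HERE/myapp-bin" "$@"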
Did I say that? Where?
Anyway, if you’re targeting a stable release series chances are slim that an update will break the application.
It has nothing to do with being maintained and updated, it’s called a stable interface. Of course you can get OO 3 to run on an older system but it requires far more steps than just going click-click-click. Why does everyone have a hard time admitting this is a problem? Forget even comparing Linux to Windows. Both FreeBSD and OSX are much better at maintaining binary compatibility.
It’s a PITA compared to Windows and OSX and there are unstable components of the system that cannot be added to the package. Ubuntu broke some statically compiled games by screwing with the sound API. What were developers supposed to do in that case? Distributing software outside the package management system is a major annoyance and has held Linux back.
Someone in this thread already pointed out how you can’t install PostgreSQL 9 on 8.04 through the packaging system. 8.04 LTS came out in 2008. Is this acceptable to you? Requiring a major system update to install a freaking command-line database program?
Note that this was filed on 2008-10-14:
Please backport OpenOffice.org 3 to Hardy
https://bugs.launchpad.net/hardy-backports/+bug/283137
I didn’t say the situation today is great. Only that a central package management system is better than 100 different installer and updater apps. And of course this would work better if Linux distros would provide stable interfaces but that’s beside the point.
Developers could provide packages that work across different versions and fix them when things change, but in most cases they leave it up to the distros to make their own packages. Hence the current situation is not because of the way Linux distros handle software installation.
nxsty, you’re dealing with paid Microsoft advertisement companies pretending to be individuals.
Who in their right mind would ever run a 9-year-old Linux distribution? The whole beauty of Linux is that it’s constantly evolving! Does it really matter if I can run new versions of programs on obsolete distributions? No, because the Linux model is vastly different to Windows’: it’s continuously updated and evolved. This is just such an asinine point and so far removed from reality – no one, and I mean no one, runs 9-year-old versions of Linux; there is just no need for it. Only hired Microsoft shills could make such a ridiculous point and try to make it sound like valid criticism; it’s just not reality. Who uses old 386/486 computers as their main desktop computer now? It’s the same stupid principle.
Furthermore, back on topic with this news article: Ubuntu was started in 2004 – that’s 6 years ago; it didn’t even exist 9 years ago, so who cares if it could run OpenOffice?
When it comes to package maintenance and distribution, Linux is obviously far superior. Digitally signed packages in a centralised, maintained repository, easily searchable to find any program you want, automatically updated from the same package management system, and practically no malware since it’s all digitally signed (except for one very badly maintained IRC server). All automatically updated, all automatically maintained. It’s about as good as it gets; no platform can match Linux in terms of package management.
Even Windows’ system update is taken directly from the Linux world; Linux had auto-update well before Windows. Take a look at Windows now – the more I look at its touted features:
UAC – yeah, that’s sudo.
c:\users – oh right, because Mac uses /Users/, which I think is a hardlink to /home/.
Aero – yes, again, thank you Mac OS X; also, Linux does it far better.
The more I have to laugh – is this the best Microsoft has to offer now? No thanks, I will be sticking to my Linux distro that automatically updates everything for me and provides me with a kick-ass shell and a kick-ass desktop (I love KDE and I also absolutely love Enlightenment E17). As far as desktop design goes, Rasterman and the E17 team are absolute geniuses (genii, whatever). Anyone here remember the early E17 demos with the moving backgrounds? The effects he had running well before Compiz were just amazing. That, I guess, is the difference: Linux, because of its constantly evolving model, is always at the forefront of technology. The desktop may at times become unstable, but never the underlying system. It can be made to look absolutely beautiful and can be adapted to do exactly what the user needs it to do, rather than the user having to adapt themselves around the OS. This is the beauty and charm of Linux – take Windows and its “stable ABIs” and shove it where the sun don’t shine.
I even switched my parents over to Kubuntu, and my mum regularly uses it; my dad, who has Windows 7 on his desktop, prefers working on the Kubuntu laptop – it’s just less hassle for them. I have made the icons super huge on the desktop and have set it up so that my mum can just click the icons to access her email, have her Google Voice conversations in Gmail, and tune into their favourite radio stations – that’s the beauty: I can configure the desktop to suit them and their bad eyesight. Long gone are the old limitations of software; there is nothing they use on Windows that isn’t available on Linux, and this is in huge part thanks to companies like Google, who are themselves huge Linux supporters, or even Skype, and of course OpenOffice and Mozilla Thunderbird.
If this eventually rolls into rolling updates, rather than having to rely on constantly updated distribution numbers, I think Ubuntu would greatly benefit from it. This is one of the reasons why I always loved Debian Sid: once installed, it was always updated to the latest cutting-edge stuff. Maybe Ubuntu should rethink their packaging strategy and follow Debian’s more closely – or maybe I should just switch back to Debian.
You know what I really hate? This Windows drivel and hate that’s spewed on practically every Linux article on the web. Take your Windows and stick to it; us Linux users really don’t care. It’s never constructive criticism – it’s always nitpicking and useless points that do nothing other than annoy other users. It removes healthy, valid conversations; it removes the essence of learning and understanding and instead fills it with nonsense and garbage.
It’s like a bunch of dumb-ass monkeys coming into the room and throwing shit all over the good conversations – shit-flinging, if you will.
And you respond with a load of nonsense...?
Yeah, who would want to run an OS that has proven itself stable enough to run in a production environment, where it might be part of a CNC machine and whatnot... You do realise that desktops/servers aren’t the only place you’ll see an OS, right?
Who cares if it runs OpenOffice now?
I do agree with you here. The idea of a central point for software installs and updates is better than what we see on Windows.
Are you seriously trying to make it appear as if it’s a Linux-only thing? Or just trying to make it look like Linux was the first with a way to do central updating?
And what is sudo? A variation of su?
What are you trying to state here?
Linux? Don’t you think you should give credit where it’s due?
Always at the forefront of technology? Oh yeah, right.
Good for you. But what part of what you did for your parents is impossible to do on a Windows box?
They won’t make the switch anyway.
http://www.omgubuntu.co.uk/2010/11/ubuntu-is-not-changing-to-a-roll…
If you want to trashtalk something you really should do your research first.
As long as they did their research first, I don’t find it more annoying than your rant.
Stop right there – a perfect example of spreading Fear, Uncertainty and Doubt.
http://www.youtube.com/watch?v=vREspQG6ayQ — German air traffic control running completely on SUSE Linux. Because it’s constantly updated, that means Linux is unstable according to you – so unstable, in fact, that the Germans are willing to move all of their air traffic control systems over to SUSE Linux.
http://www.theinquirer.net/inquirer/news/1008257/faa-switches-air-t… — FAA switches air traffic control to Linux , even the Americans are doing it.
If Linux was in any way unstable do you think it would even be considered for air traffic control ?
Being able to run OpenOffice 3 on an obsolete version of Linux is absolutely pointless and proves nothing about stability or anything else for that matter. In fact, being able to sacrifice backwards compatibility enables better, faster, more stable and more secure libraries, because you’re not left having to support dodgy APIs and dodgy undocumented quirks. This is what I mean by constantly evolving.
So yeah thanks for proving my point ..
Read what I wrote... You just continue your load of b*
So you’re saying that because they are switching, they will update every 6 months or whatever the release schedule is for those distros?
Did I mention OO 3? Or any other new version of a piece of software that I would want to run on an older version of an OS?
I didn’t. You seem to think I wrote stuff I didn’t, to make it seem so. But I’m sure that you will eventually get your head out of your behind and see that you’re just making a fool out of yourself.
What did you write? What was your point, exactly? I re-read your post, and other than a complete FUD statement the rest is b*ll.
So you’re trying to imply they will never upgrade their systems? Are you trying to imply they will not in any way update their systems over a 9-year period?
I don’t even think you understand the whole thread of conversation that went on before. It’s your comments that make no sense – I suggest you re-read the earlier comments.
??? This makes absolutely no sense.
This only serves to prove nxsty and me right.
No, I am saying Linux was first: Microsoft copied the auto-update features from Unix land, more specifically from Linux.
No, sudo is much more than su – surely you must know this. The point is, it’s nothing new to anyone that’s used Linux; it’s just Microsoft playing catch-up once again.
Did you understand my comment?
It was the first to support x86_64, the first to support hypervisor technology, the first for SAS/SATA... Name me areas where Linux is using outdated tech.
Do this on a Windows box:
http://s1178.photobucket.com/albums/x367/delt0-delt0/Desktop-Screen…
See what I mean? Everything has easy access. It’s moulded around their needs, because they both have bad eyesight. This is not their actual desktop, but a quick mock-up of how I set it up for them – and it’s easy to do.
Did you read my comment?
And what would be the better alternative? Like Windows, where there are 5 billion installer systems, few companies provide automatic updates, and those who do all use their own update system?
Read what Ubuntu is proposing here and you’ll realise that the status quo is not acceptable. They know there is a massive problem where applications are tied to a particular distribution version, meaning that you need to upgrade every six months if you want to get a new application version. That is just plain stupid.
The Windows system at least provides the means to install a wide variety of applications, and you can install OpenOffice on a nine-year-old system in XP – something you can’t do with any Linux distribution. What it lacks is a sane update and remote installation mechanism, but that’s because software installation on Windows has existed for a very long time.
Scale this up for Ubuntu and they’re going to have to maintain an extremely long list of backports, the quality of which will inevitably be compromised. Where applications are concerned that just shows you that it’s the developers and users who should be responsible for maintaining and installing the software that they want to use.
Eventually they will end up realising this, after another ten years maybe, but until they do they’ll have to jump through hoops such as deciding just where they will draw the line between what they will update in a distribution and what should be kept static.
Who cares whether different applications use different installers when:
– they vary in flexibility and functionality (e.g. some operate via scripts, some can install the application on a per-component basis) but all of them basically do the same thing: extract the application files to the installation folder and set some registry keys
– the underlying system (as far as applications and installers are concerned) is for all intents and purposes the very same (thus, a unified platform compatible with itself across releases) for over a decade – thus allowing nearly any combination of <<arbitrary application for “Windows”>> and <<arbitrary Windows version>> to work
Scenario: You are presented with two systems with equivalent functionality: Office suite, multimedia, graphics applications (such as photo management, raster & vector graphics editors), CD/DVD burner, Internet suite (email, IM, browser), PIM, etc
If you have one which is a Windows system that is “stale” … hasn’t been touched in a couple of years … and the other a Linux system which is also stale (also hasn’t been touched in a couple of years) … and you are asked to bring them both right up to date without losing any user data … it is absolutely a given that the Linux system will be done in less than a quarter of the time of the Windows system.
That is a reason to care.
The app bundles of Mac OS X are, in my opinion, the best compromise to date for installing new software easily and removing it without leaving junk in system folders.
Updating is an issue, though. Apple should provide a standard update procedure, allowing third-party software to be updated along with system packages through the Apple Software Updater tool.
But looking at how much they care about their users keeping their systems up to date, I wonder if this is going to happen before the only way to install and update software on a Mac is via a “Mac Store” using Apple’s repositories – effectively making the replacement for ASU able to update all software.
Superficially the OS X system looks great….until people actually start developing real applications on OS X that have shared, logical dependencies on one another.
It is based on the assumption that most non-system applications, developed independently from each other, don’t have such dependencies on each other: that Photoshop does not depend on Office to work, that Pro Tools does not depend on Corel Painter, etc.
This assumption works very well on the desktop, as long as we don’t have a lot of apps depending on several hundred MB of common non-system libs, in which case sharing is best.
So…
-The set of system libraries must address all common needs
-Developers must use it.
Sounds reasonable to me.
This is just one of the many sets of emperor’s clothes in the Linux wardrobe. Nobody seems to understand that installing an OS every six months is a plainly stupid proposal. Nobody will explain why a Windows or Mac user can upgrade his apps easily and a Linux user can’t.
Just silly.
Hold on a second. Nobody has to reinstall “Linux” every six months to have their applications upgraded. They have to reinstall Ubuntu and that’s quite a difference.
My old desktop has Debian Sid running from the same install since 2004; it has survived several hardware replacements and keeps going without missing a beat. My laptop has been running Sid since 2007 without any problems either.
I have now been running Sid on my newest quad-core desktop for two months and it flies. Something tells me that it is going to be there forever, too!
And I am increasingly looking towards Arch to satisfy my urge to stay on top of the latest KDE improvements before everybody else knowing that I won’t have to reinstall anything at all and the fact that Arch does not stray too far away from upstream – if at all – just sweetens the deal.
That people have to put up with Ubuntu’s weaknesses because they don’t know better is one thing. But do not lump all the Linuxes together with Ubuntu just because one does not want to look elsewhere.
I am really hopeful that Canonical can pull this off, though. They already have the blueprints (hint, hint, nudge, nudge) so it is just a matter of following them…
Bogus; the Linux software management system is *amazing* and far better than any of the comparable alternatives. yum/apt/zypper are all good tools. More projects should use the OBS; then they can offer one-click installs that automatically add the repo and install the packages. This works VERY well. Check out installing MonoDevelop and Banshee as good examples.
Of course Ubuntu is the worst for this because it is more of a fork than a distro – but it still works reasonably well. Switch to openSUSE for a distribution “for humans who need to get work done”.
Enterprise wants STABLE software releases.
Software you can rely on and that is easy to provide support for.
Because if it’s old it means it’s stable. Like Windows 98.
yeah. clap! clap! clap! very funny…
Being stable doesn’t mean being old.
It’s clear to me that you’ve never worked in an enterprise with hundreds (or thousands) machines, and don’t know what you’re talking about.
So, your post doesn’t even deserve the time it would take to answer it in a serious way.
It does, somewhat. Stability means finding a combination of packages that works, and sticking with it. And typically, that means that those concerned with stability tend to be one or two releases out of date – upgrading only once the version they’re on is end-of-lifed by the vendor.
I think what fepede was getting at was that it’s not a one-to-one relationship. IOW, most (all?) stable software is “old”, but not all “old” software is stable.
It was a joke. There are two different ways in which the term “stable” could apply: a set of software that doesn’t change, or a set of software that does not crash. Ideally, you’d like both with your “thousands of machines”.
That is why Redhat’s releases for RHEL are so ‘slow’, that is also why many providers use Debian (many maintainers also work at providers I think). Although some have switched to Ubuntu because some get requests for more recent software.
I see this as an option in software sources > updates > automatic updates.
You will be able to choose whether to get upgrades or just patches.
Enterprises will choose patches.
Oh, just like LTS then.
If they would like to go down the path of rolling releases, or just more frequent updates to packages such as Firefox and LibreOffice, then they should implement delta updates with a fallback to normal packages (in the event that the delta cannot be applied for some reason).
Currently the smallest change to the kernel or OpenOffice can mean that there is over 50MB to download.
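Debian already has a tool that does more or less this: debdelta. Roughly (from memory – how much it saves depends on whether the delta server carries your packages):

  sudo apt-get install debdelta
  sudo debdelta-upgrade     # fetch deltas and rebuild the .debs locally
  sudo apt-get upgrade      # anything without a delta is downloaded in full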
I guess I’d better upgrade from Woody then.
But seriously, folks, this is a good move. One thing my son really disliked about Linux was how long he had to wait for the latest Firefox or OpenOffice. When Firefox releases a version with a 10x improvement in JavaScript performance, you don’t want to wait 6 months for it. I know you can manually do some of this, but that defeats the purpose of the repository concept.
I assume there will still be an LTS release (that does not do this) for the millions of enterprise customers…
It doesn’t. That’s the central point behind the central repository model – that you can control what applications get installed.
Unfortunately, that means there is still a relatively long lead time between the release and when it appears in a new distribution or in umpteen backport repositories – as opposed to developers creating one sane installation package for users, which they can support, as soon as a new release happens.
This is a sensible approach AFAICT – and very close to my idea of a multi-tier distribution.
1) Stay rock-solid in regard to the toolchain (GCC, glibc, binutils and such) – updates every 18 months, except for critical bugfixes and security issues
2) 6 or 12 months update cycle for lower level libraries above the toolchain (the kernel (not part of the toolchain), GTK+/QT, WebKit and such) – of course with the usual quick releases when fixing critical bugs and/or security vulnerabilities
3) End user applications and libraries specific to these applications (bygfoot, firefox, libreoffice, chromium, epiphany blahblahblah) can be updated at will.
Another option is to make sure that API and ABI are not broken when releasing new versions. In this regard the *BSD’s tend to be much ahead of GNU/Linux.
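On apt-based systems you can already approximate the application tier with pinning. A rough sketch of /etc/apt/preferences (the priorities and package choice are only examples):

  # keep everything on the stable release by default
  Package: *
  Pin: release a=stable
  Pin-Priority: 900

  # but let firefox follow the faster-moving branch
  Package: firefox
  Pin: release a=unstable
  Pin-Priority: 990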
Give this guy a medal I really like the idea.
Thank you
Exactly. You need to keep the core OS stable for several years. But allow user-level apps to be installed/upgraded at will, *without* upgrading the core OS.
Windows users can do this.
MacOS X users can do this.
*BSD users can do this.
But Linux users can’t.
What would also help is if more distributions selected the same base each round.
Agreed. A step in the right direction would be a central “repository” for patches used by the different distributions. Today it’s a true PITA to search for all the patches used by different distributions, comb them for “useless” (read: distribution specific) patches and apply those which one can use.
LSB is kind of a step in the right direction, but then again – it’s really not.
True. And it’s quite annoying that the API and ABI are constantly broken. ABI breakage can be fixed through recompiling and/or relinking; API breakage is much worse.
I’ve played a lot with LFS, BLFS and recently CLFS (and CBLFS) in an attempt to create a multi-tier distribution, but it’s painful to maintain. Of course, with me deviating a lot from the Linux FHS, it’s kind of my own fault.
I’ve tried applying different package management systems, but neither .deb nor .rpm seems adequate, and attempting to use slapt-get creates a dependency on a rather old tar package. I like the latter approach, though, since the extended Slackware format is less obnoxious than .deb or .rpm.
Exactly.
This is the way it should be done, for a desktop distro, and Red Hat and the RHEL clones do this already, except in a very extreme way.
The kernel is locked to a version, and RH backports improvements to keep the API/ABI stable. I haven’t paid as much attention to the toolchain, but I believe they follow a similar philosophy. Applications get updated with patches and revisions, but maybe not version numbers. Once again, I haven’t looked at the applications that closely.
What you could do is use the RHEL kernel and toolchain as a base, and create a repo which provides source-built software. Maybe have something like pacman and yaourt, as Arch Linux has.
As long as all the proper testing is done before releasing the software for update.
I like incremental updates. Both Mac OS X and Windows 7 do that – and you have the option to NOT update the apps or system components if you would rather wait and see all the posts in forums the next day from the people who installed the updates then ran into trouble.
[edit]
Also, when I ran Ubuntu I tended to go out and get the latest versions of software (non-Ubuntu repo) and install them manually anyway. This way people would be less tempted to do that and risk messing something else up with shared libraries or something. Tho’ in the case of Firefox I’d install into my $HOME directory and just set my path and start icon[s] accordingly.
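Roughly like this, for anyone wondering (the paths are just my habit, not gospel):

  # unpack Mozilla's generic tarball somewhere under $HOME
  mkdir -p ~/apps
  tar -xjf firefox-*.tar.bz2 -C ~/apps
  # put it ahead of the distro build in PATH (e.g. in ~/.profile)
  export PATH="$HOME/apps/firefox:$PATH"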
Thom, I know this is off topic, but I’d really like to read something on this site without a reference to unicorns. The whole thing is tired, and wasn’t really ever funny to begin with. Are you like 12?
That’s my jerk post of the day.
Actually, I agree. Maybe if everyone stops pretending to find comments like that funny he’ll stop. Wait, What?? (another pet hate of mine) – LOL
Go to OSAlert.com, hit ctrl+f/cmd+f, type in “unicorn”, and tell me how many hits you get. Alternatively, you could embrace unicorns for the awesomeness they represent.
I’m 25, by the way – at least, until December 1, which is my birthday.
I wonder what Sigmund Freud would have had to say about you and Unicorns!
Which means you never saw the following in the theatre:
http://www.imdb.com/video/screenplay/vi2750742809/
It was the saddest and most boring movie I ever saw. I haven’t seen it in 28 years, but I’m pretty sure I don’t want to again, without the ability to fast forward or stop. But I don’t know maybe your unicorn-itis will be cured, or sated, or embiggened.
I’ve always been a fan of Arch Linux and its sane rolling system. Ubuntu should do this, IMHO. They could release a snapshot every 6 months to supply clean installations.
(And, yes, that could mean the end of the ridiculous out-of-nowhere animal names and weirdly inflated release versions.)
You do know it’s year/month, right?
It seems to me that Canonical is trying to flee from the quality issues in their latest releases. Using PCLinuxOS atm, I can only imagine how demanding it must be to maintain the rolling-release method without breaking anything. Canonical, looking for a similar release scheme, has to improve their QA drastically, IMO.
I’ve said it before, and I’ll keep on saying it until some Linux distro clues into it: you need to create a clear separation between “base OS” and “user apps”. And they need to be developed separately, but in tandem.
Windows does this.
All of the BSDs do this.
MacOS X does this.
It’s only the Linux distros that don’t.
You can install Windows XP today, and run the latest version of Firefox on it. Or the latest version of OpenOffice.org. Or the latest version (with a few exceptions) of AppX.
Same with the BSDs. You can install version Y from 3 years ago, and still install the latest (with a few exceptions) version of AppX.
Same with MacOS X.
But it’s almost impossible to do that with a Linux distro. Want the latest Adobe Flash 10.1? You need to upgrade GTK+, which means you have to upgrade glib, which means you have to upgrade half your installed packages.
Want to install the latest Firefox? You have to wait for your distro to include it, then upgrade a bunch of inter-related packages. Or download it from Mozilla, and have it poorly integrated.
We’re fighting with this right now with Debian. Even on our 5.0 (Lenny) boxes, we’re stuck with Flash 10.0 due to the GTK requirement being higher than what’s available in the Lenny repos (no, we’re not going to install GTK from the backports repo, as that requires upgrading some 100+ packages). And we’re stuck with 9.0 on our Etch boxes for the same reason.
But, I can install Adobe Flash 10.1 on FreeBSD 7.0, released how many years ago? And on Windows XP, over a decade old. Without having to upgrade half of the installed OS.
Repos are good for package management. But the same repo shouldn’t be used for the core OS and the user apps.
Indeed.
I like the Mac OS X / BeOS way of dragging a file to a location and it’s installed; removing the file removes the software (well, most of it – there can still be configuration files left behind).
Windows is worse in some respects, because records in the registry get left behind when removing software. Also, a lot of software providers think it is an excellent idea to have their separate update notification programs load automatically upon login and stay loaded. However, your point remains true: it’s easy to install programs in Windows without issue, compared to Linux distributions, if the software you want is not in the repositories.
There would not be so much of a problem in Linux distributions if the community actually agreed on and stuck to a set of standards. The distributions could use their own internal format, like DEB for Debian and Ubuntu or RPM for SuSE and Fedora, but they should still be able to install, for example, a PBI file if that was the agreed standard.
OpenSUSE has been doing this for some time now. The build service allows for the creation of separate repos containing software that is automatically built against specific versions of the underlying OS.
I’ve been easily able to stay synched with the latest versions of KDE, firefox and OOo etc. by adding the appropriate repos, without having to worry about the underlying OS breaking.
Although the build service was designed to support additional distros such as Fedora, Debian and Ubuntu, I thought that the other distros were picking up on this as well... Isn’t that what the Ubuntu PPAs are all about? Honestly don’t know, I don’t really follow Ubuntu.
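For the uninitiated, hooking into one of those build-service repos is only a couple of commands – roughly this, with the URL pattern from memory (check software.opensuse.org for the real one):

  sudo zypper ar -f http://download.opensuse.org/repositories/mozilla/openSUSE_11.3/ mozilla
  sudo zypper refresh
  sudo zypper install MozillaFirefox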
This is the elephant in the room.
Spot on.
If you pin core parts of the OS to testing or unstable, wouldn’t that alleviate the problem?
The biggest problem I’ve always had with the Linux world is the enormous number of distributions using custom patches that unleash hell onto the end user and the upstream project being inundated with bug submissions caused by those stupid patches.
Could we please have distributions stick to vanilla builds that don’t use insane optimisations? I’m not really asking for much – stick to the pure source provided by the upstream project and when it comes to compiling I’m quite happy with the distribution sticking with -Os instead of the alphabet soup of tweaks and optimisations afterwards.
We’re in this situation because with each patch and each optimisation the distributions drift further apart in terms of compatibility. A bug that might appear in one distribution doesn’t appear in another; a problem with one distribution might not appear in another, making upstream projects like Chrome, Firefox and so on pull their hair out dealing with distribution-specific bugs.
I’ve always been tempted to, one day, just create such a distribution that doesn’t attempt to be exciting but merely a stable platform that gives end users what they need but without all the patching and fanfare that sometimes occurs by distributions.
I can’t mod you up, so I’ll just say: Amen, Reverend.
Particularly the parts about stupid patches and sticking to vanilla builds.
Oh yeah, and the part about merely being a stable platform instead of exciting.
Well, really – all of it.
EDIT: Missed the part about your thoughts on creating your own distribution (part of merely being stable). Been there, tried it. Might try again, if anybody’s with me *nudge nudge wink wink*
Aren’t you a Gentoo user?
Obviously
I can’t remember the last time gentoo was “exciting”. All the odd stuff happens in other places mostly – except for assogiate :p
Of course one can make Gentoo very “exciting” – but you can also make it very stable and bland. I’m doing the latter, though I admit to running Compiz.
Oh I probably shouldn’t forget paludis, but it’s not in “stable” so I haven’t switched to it. I almost never go past “stable” when it comes to system-stuff.
Why do you always need to mention unicorns?
He’s just preparing us for the day that he comes out of the closet.
It must be a large closet if he can have unicorns in there…
I’m sure my past girlfriends would have a collective “explains-a-lot”-moment, but no, sorry to disappoint you.
Finally, something Ubuntu has done recently that I agree with. It seems that with every release the last few years, Ubuntu has made some annoying, stupid, or outright braindead decision with seemingly no real/good reason. They have lost me as a result of pushing crazy ideas that come off sounding like they just want to dumb their distro down in the ways Microsoft and Apple have done in pretty much every version of their OSes since... well, I lost track, it’s been so many versions ago.
But I can’t deny, this is a step in the right direction for those who would like to keep up to date without… eh, updating the whole OS every six months, fetching third-party packages constantly, dealing with many third-party repositories, or even worse–having to go the trial-and-error route of building from source and hoping it works (hint: more than half the time, in my experience, it doesn’t).
This was always one of the benefits of Windows, which I somewhat miss. Sure, its “software management” if you can even call it that is a joke, but it works, and it allows you to download an installer of a program the second a developer releases a new version on their site and install it. If you don’t like it–no problem, keep the old version’s installer and revert. Linux simplifies package management by centralizing it, but it’s horribly limiting when you’re pretty much forced to stick to the same old, often-outdated packages that came out with the version of the distro that is running.
In reality though, this is not surprising given that Ubuntu already made this change for Firefox not too long ago. I was glad when they did that, and this really seems like the next logical step to continue what they started with the Firefox exception. Hopefully this is done well enough that an Ubuntu version can be chosen in the future, and if it works well enough, then only basic applications can be upgraded to newer versions, not compromising the stability of the underlying OS. And if a version doesn’t work very well… a slightly older version could be used but with common programs updated.
As a wholehearted user and lover of Debian Sid, I see this latest move from Ubuntu as something mostly positive, except that I don’t really see the point as far as their current user base is concerned.
I mean, you see these people bitching and moaning everywhere on the Internet that Ubuntu is a version or two behind for some popular software and that it is a bitch to reload everything every six months, at the same time as they sing praises to the fact that each iteration is more or less a non-moving target and therefore they don’t have to update it as often as, say, Sid or Arch users.
I recently set up Linux Mint (the regular Ubuntu-based one, not the Debian-ish flavor) on my brother’s machine – he was somewhat wary initially but fell in love with it once I installed MediaTomb for his PS3 and a few other goodies – and after he had tried it for a while, the one thing that stood out was his remark asking why I put up with so many updates on my machine whereas his only gets them every now and then.
I did point out the advantages of having patches and new features available on a nearly daily basis, but it was clear that I was speaking gibberish to his ears. These people simply don’t see the point in updating their systems – hence the large number of users who disable automatic updates on Windows despite every warning not to do it – and strangely appear to be averse to the idea.
Furthermore, it is clear, at least to me, that at this point in time the QA work done by Canonical is nowhere near the minimum acceptable for such a thing to work properly, and they tend to patch things unnecessarily. A LOT! To such an extent that their distro usually does not survive an upgrade – and upgrading is not even recommended – despite its Debian roots. Even a distro like Debian, which has an overwhelming number of developers, will let a few annoying things slip through the cracks every now and then, so I have a heck of a hard time seeing how Canonical expects to pull this off.
Experienced users who want a rolling-release lifestyle would probably look elsewhere anyway.
I am seriously looking forward to seeing how this will work out.
Near enough this one.
http://www.tmrepository.com/trademarks/linuxforgrandmas/
PostgreSQL 9.0 is better in every way than PostgreSQL 8.4, but you can only get it on 10.04 and 10.10, and that’s through a PPA. Want to have the latest, greatest, and stable PostgreSQL on your 8.04 LTS server? No such luck.
Same goes for Python. Python 2.6 is better in every way than Python 2.5, and can run all Python 2.5 code without any problem. But you can’t have it on 8.04 LTS unless you compile it yourself.
I hope Mark is not removing the ridiculous policy for desktop applications only.
You do have the option of creating your own deb packages and possibly your own PPA. Or just install from source. That is, if it builds at all. The stability and usefulness of the build system in Ubuntu is not exactly stellar.
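For Python, at least, “from source” into /opt is fairly painless – a sketch (the version and prefix are just examples):

  wget http://www.python.org/ftp/python/2.6.6/Python-2.6.6.tgz
  tar -xzf Python-2.6.6.tgz && cd Python-2.6.6
  ./configure --prefix=/opt/python2.6   # keep it out of the system's way
  make && sudo make altinstall          # altinstall won't shadow /usr/bin/python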
Just use FreeBSD and save yourself a lot of headaches.
Oh, if only more people would do that.
I see I was modded down for that comment.
I may have to do a blog post that shows how LTS is a joke compared to FreeBSD.
Maybe I’ll also point out how Linux had some serious kernel exploits this year that left a lot of websites hacked. FreeBSD has a history of being more secure and stable, that is a fact that can be buried but not denied.
I don’t speak English very well, but I can understand what you said, and will only say three words: “Ubuntu Software Center”.
Nice idea. I really hate the fact that I have to stick with the not-at-all-fast Firefox 3.6 for another 5 months. Of course there is Mozilla’s PPA, but it is buggy as hell.
If only someone could solve the other problem of the deb-style package management: the lack of incremental updates. Why does at least 100M have to be downloaded every time I type ‘sudo aptitude safe-upgrade’? That generates lots of excess traffic, which can be quite a nuisance on a wireless connection.
They can’t get upgrades to work. They break systems on each upgrade, yet they want us to believe they’ll be able to backport apps (for how long?) and keep everyone happy.
If only they could guarantee non-breaking upgrades, this wouldn’t be much of an issue. I would gladly wait for the next release if I knew I wouldn’t have to re-install because the upgrade won’t work.
I upgraded from 9.10 to 10.04 with no problems.
In recent releases (9.10 onwards) I have found upgrading painless.