Ubuntu Gutsy Tribe Alpha 3 has been released. “The Ubuntu developers are moving very quickly to bring you the absolute latest and greatest software the Open Source Community has to offer. Gutsy Gibbon Tribe 3 is the third alpha release of Ubuntu 7.10, and with this new alpha release comes a whole host of excellent new features.”
I’ve liked Ubuntu for a long time now. What drew me to it was the fact that it only took one CD to get it installed. What’s kept me with it is the fact that I find it very easy to use, update, and manage.
I know lots of open source people look down on Ubuntu for various reasons, but I simply like it better than all the other distributions I’ve tried (openSUSE/SLED, Fedora, Mandriva).
I look forward to Gutsy. I hope KDE 4 is done by then and I hope Gutsy has it included.
Same thing here. Also, Ubuntu is beautiful. This is an important aspect to bring people to Linux. When a desktop is ugly, there is no way to sell it to your (non-geek) friends.
One of the bad sides of Ubuntu (Debian) is the difficulty of creating packages. Also, new features only appear in new versions (Edgy -> Feisty, etc.), so if something doesn’t work in one version (in Edgy, for example), you need to wait for the next one. This is the advantage of rolling releases.
I really like how ubuntu is evolving. Great work!
“Ubuntu is beautiful. This is an important aspect to bring people to Linux. When a desktop is ugly, there is no way to sell it to your (non-geek) friends.”
Beauty is in the eye of the beholder, and with Ubuntu, it’s a double-edged sword. To me, Ubuntu looks like someone vomited all over the desktop. Not everyone likes pumpkin orange mixed with brown as a default color scheme. It’s truly hideous (in MY opinion.) When I used Ubuntu, I would immediately change everything over to shades of blue following installation. Fortunately, the ‘Blubuntu’ meta package makes this easier.
I don’t understand why every alpha release of the next Ubuntu makes headlines like the second coming of Christ. It’s unfinished software. While having loads of bug hunters makes for a better final product, I highly doubt most people are dutifully filing bug reports (at least if one goes by the numbers.)
I am running the latest Gutsy, and apart from the normal package version updates, I saw no new, visible features. I hope they come up with some before the release, because every release must include at least 2-3 new big features not found on other distros.
One of the greatest features is the integration of Ebox for server deployments.
Ebox is a very cool tool that makes server management much easier and less error prone.
This alone makes Ubuntu Gutsy a must-have for me.
I said “visible”, not server or services or driver stuff.
And porcel was talking about server aspects.
Not being able to mod down ‘moderators’ is OSAlert’s biggest failing.
Yes, this comment includes personal attacks/offensive language
Yes, this comment is off-topic
Yes, this comment is spam or includes advertisements
which of those reasons were you going to mod her down for?
or was it just because you disagree with her?
I was hoping the init scripts could eventually be ported to upstart for parallel startup performance.
Does anyone know what the status of that is?
because every release must include at least 2-3 new big features not found on other distros.
http://perkypants.org/blog/2007/07/12/qotd-paul-nowak/
*points to the release notes for each tribe release*
http://www.ubuntu.com/testing/tribe1
http://www.ubuntu.com/testing/tribe2
http://www.ubuntu.com/testing/tribe3
I think they tend to incorporate more of the Ubuntu specific changes toward the end of the cycle though, once they’ve successfully integrated most of the upstream stuff.
Well, Compiz by default is a new, visible feature. Maybe some of the other major desktop distributions will also have this in the fall timeframe, but it’s still an impressive and compelling feature. Windows and Mac users will take Linux more seriously due to Compiz.
I don’t think that Ubuntu, or any other Linux distro for that matter, has problems getting their installed base to move to new releases. They don’t charge for updates, so there’s no reason for users to stick on a back-level release. Steady, minimally-disruptive change is critical for retaining users.
I don’t think that Ubuntu’s target audience is committed distro junkies. They focus on being a final destination for recovering junkies and attracting Windows and Mac users to the Linux community. Ubuntu’s selling points are its strong community, effective packaging, and relatively cutting-edge features.
The Linux desktop is relatively mature. KDE4 is probably the biggest change we will see for a long time afterward. We’ll see more change in the way Linux is perceived and supported than change in the software itself. It’s going to become more seamless, more comprehensive, and more socially acceptable.
Desktop Linux is becoming all grown up, and the free software community has instilled in it the values and competencies that will bring it great success.
“Well, Compiz by default is a new, visible feature. Maybe some of the other major desktop distributions will also have this in the fall timeframe”
well, you could try “already had it two releases ago”.
Ubuntu didn’t have it configured and enabled by default two releases ago (or last release, for that matter).
A quick scan of the up and coming “visible new features” reveals several interesting (to me, as a “normal” user) changes:
– Forms support in Evince
– Ekiga 3.0
– Google Calendar backend for EDS
– Gedit moved to the new GtkSourceView – much improved highlighting of combined files (HTML containing JavaScript, etc.)
– Proper Gnome Keyring PAM integration (at last – no double prompts from NetworkManager!)
– Improved battery statistics in Gnome Power Manager
– Combined Appearance Preferences windows
– Rhythmbox gapless playback, improved support for online stores
Agreed, these aren’t distro-specific, but name a distro that does have 2-3 big features not found in other distros that are user visible.
The point is, desktop linux is maturing to a state where there’s some stability. This has led to feature cross-pollination across all distributions. It almost doesn’t matter what distro you run these days; all of the big ones are on the ball and provide out of the box support for the best desktop technologies.
Eugenia,
Power users like yourself might not care, but the restricted-manager was updated to fetch and install Broadcom wireless firmware. This alone convinced my friend to try it out.
Besides that, the only really notable features are probably compiz-fusion enabled by default and the Appearance applet. Also, Colin Walters mentioned on the ubuntu-devel mailing list that some of the features of wubi (http://wubi-installer.org/) are being integrated into debian-installer (for Ubuntu).
That would lower the barrier to entry even further as a point and click installer would set up a working ubuntu system as a file on a windows system. Then the user could reboot, and the windows boot loader would boot them into Ubuntu. It is more fragile, but infinitely easier to install.
The focus is on ease of use and polish. For this, they win.
because every release must include at least 2-3 new big features not found on other distros.
Because otherwise Eugenia is gonna unload a whole lot of crap unto you?
I like where Ubuntu is going, their philosophy, which seems to be a very good combination of idealism sprinkled with pragmatism as needed, and the fact that they offer a free-for-all supported enterprise release.
Unfortunately, I have found it a bit unstable compared to both Debian and Suse, in both server and desktop roles.
When I say unstable, I don’t mean that it crashes all the time, but rather that it is the only distribution that I have managed to crash inexplicably (get a kernel panic). The testing was conducted on the same hardware.
I think Ubuntu needs two things:
1) A real control center, ideally integrated into the overall Gnome desktop to avoid the duplication of effort that often takes place between desktop environments and distributions.
2) Stability, stability, stability, which may be a hard thing to have with a six-month release cycle.
I really don’t know why people are so keen on frequent releases. I think a slower moving release with periodic and well-tested updates to specific pieces of the desktop/server is much better for everyone except hardcore technologists.
Well, if you use the LTS version (Dapper) the release cycle shouldn’t be an issue. On the other hand, who wants to be running old stuff? I need to be on the cutting edge, baby!
Seriously though, Ubuntu is my favorite distro. It’s the only one I use at the moment, as that whole distro hopping thing is getting old. My only gripe is that stability degrades over time. At the moment the vlc plugin crashes Firefox after it plays a movie. It was fine just two months ago, then I enabled backports and now it’s useless. Too bad, as it was the only thing that played almost everything without any issues. I’m not a stickler for stability though (since I’m a tinkerer and most things are usually my fault anyway). I upgrade every release, which is every six months, so every six months it’s a fresh start.
I can see your point, but a lot of people want to simply have longer upgrade cycles. As long as security and bug fix updates are issued, then I don’t see a problem with this.
The point of the LTS is not to be more stable, it’s for people who want a longer update cycle (servers, businesses).
The other releases are meant to be just as stable. TBH Dapper had problems on release too. I think Ubuntu aren’t willing enough to have a short delay for non-fatal bugs. I still stick with them though… I think it’s probably worth waiting for a month after the main release before installing though.
Regarding your stance on six-month releases, I think Ubuntu’s target audience demands them. I am guessing that people who use Ubuntu want to keep up with the latest developments in the Linux ecosystem. People who need longer release cycles and more stability use RHEL or SUSE, not Ubuntu. If they don’t want to pay, they will be using Debian or something.
Though I agree that Ubuntu seems to lack stability. My experience playing around with kubuntu wasn’t impressive.
I really don’t know why people are so keen on frequent releases. I think a slower moving release with periodic and well-tested updates to specific pieces of the desktop/server is much better for everyone except hardcore technologists.
Ubuntu does have the LTS versions, which I believe are supported for 3 years on the desktop and 5 years on the server. Somebody correct me if I’m wrong.
I am aware of the LTS release as I mention in my comment, but I have found that release too a tad unstable, although it seems to have improved recently, possibly one of the kernel upgrades took care of some weird race condition.
I really don’t know why people are so keen on frequent releases. I think a slower moving release with periodic and well-tested updates to specific pieces of the desktop/server is much better for everyone except hardcore technologists.
The thing that bugs me about all Linux distros is that the only way to get newer versions of AppX is to update the *entire* distro. If I’m running Distro X.Y, why do I need to upgrade to Distro X.Z in order to upgrade from AppX 2.0 to 2.2?
I’ve never been able to understand the whole “upgrade the OS in order to upgrade the apps” philosophy of the Linux distros.
Ubuntu 6.06 LTS is nice, but it comes with KDE 3.5.2 (or something like that). In order to get KDE 3.5.7 I need to either use a non-standard repos, unsupported packages, or upgrade to Ubuntu 7.10. Why? It doesn’t make sense to me.
The FreeBSD (and Windows, and MacOS X) way just seems more logical. You can run/install newer versions of apps without upgrading your OS. I had a laptop with FreeBSD 6.1 on it running Xorg 7.0 and KDE 3.5.7, even though neither of those were available when 6.1 was released.
It’d be nice if the distros would split out their developments into OS and app groups, such that one could update them asynchronously.
Sometimes, people just want new apps, they don’t want a completely new OS.
Arch Linux with its rolling nature might be for you.
Debian does exactly that. I run a mixed environment on my laptop for that purpose.
Debian does exactly that. I run a mixed environment on my laptop for that purpose.
How does Debian do that? Using the standard Debian repos, one still has to upgrade to testing (or stable once it’s released) in order to get newer apps. And that still upgrades the entire OS.
The FreeBSD way just seems more logical. You can run/install newer versions of apps without upgrading your OS.
With ports, you compile the software yourself. You can do this with Linux as well. Gentoo (or Arch as someone mentioned) might be more your cup of tea as it seems to be built around that idea, but you can compile from source with any Distro.
Some distros (like Debian/Ubuntu) provide binary packages. Newer versions may be compiled against newer libraries that then also need to be installed (and installing some libraries isn’t the same thing as updating the whole OS). That’s the nature of binary packages. You can avoid installing the newer libraries (even in Debian) by compiling the app yourself, as you do with FreeBSD. Complaining because Debian also provides a different (binary) way to install apps, which some people find more convenient, is a bit nonsensical. Just do everything like in FreeBSD and use apt-get source if it’s a big deal to you.
And yeah, FreeBSD has packages (binary) as well, but you can be sure they pull in dependencies the same way Debian packages do.
Of course, Windows and OS X deal with binary packages too, and you say the Windows and Mac OS X way just seems more logical as well.
They do this by generally including every dependency in the package. Linux (and FreeBSD) have their reasons for not doing this (redundancy, size, security updates for all apps using the dependencies).
Your problem seems to be with OSs with dependencies separated out that distribute everything in Binary form. Thing is, you aren’t locked into binary distribution. ANY distro will let you install from source, and even binary-centric distros like Ubuntu let you install from source and remain within the package management framework (apt-get source). And since some distros don’t do binary at all (LFS), I just don’t see how you can say you have a problem with “all Linux distros.”
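As a sketch of what I mean by staying within the package management framework, the apt-get source route looks roughly like this (the package name is just an example, and you need deb-src lines in your sources.list):

```shell
# One-time setup: the basic Debian build toolchain
sudo apt-get install build-essential fakeroot dpkg-dev

# Fetch and unpack the package's source (example package: hello)
apt-get source hello
# Install whatever that package needs to build
sudo apt-get build-dep hello

# Build a .deb from the unpacked tree and install it
cd hello-*/
dpkg-buildpackage -rfakeroot -us -uc
sudo dpkg -i ../hello_*.deb
```

That way you get a source build like FreeBSD ports, but the result is still a tracked package you can remove with apt later.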
You miss my entire point: Linux distros have no separation between the base OS and the installed apps.
On FreeBSD, you can stick with binary packages and still get upgraded apps without upgrading your OS. I can run KDE 3.5.7 on FreeBSD 5.5, even using binary packages. KDE 3.5.1 was the current version when FreeBSD 5.5 was released.
Same with Windows or MacOS X: you can install new apps or upgrade existing apps without upgrading the entire OS. While it may not be recommended, you can run MS Office 2003 on Windows 98.
You can’t install KDE 3.5.7 on Kubuntu 6.06 without jumping through a lot of hoops. The recommended method is to upgrade to Kubuntu 7.10.
The standard way for upgrading apps on (at least the ones I’ve tried so far) Linux distros is to upgrade the entire distro. Or to add custom repos which may or may not break the security updates of your system.
What is it with Linux distros that they can’t separate apps from the OS?
KDE is hardly an ‘app’, it’s an entire desktop environment. Notwithstanding, there are Feisty packages for KDE 3.5.7, built by an official Ubuntu packager:
http://kubuntu.org/announcements/kde-357.php
there are also packages for the latest stable releases of Arch, Mandriva, Pardus and OpenSUSE:
http://www.kde.org/download/
and you still seem to be ignoring my post about backports repositories.
While KDE may be a desktop environment, in the Unix mentality (for lack of a better word) it is just an app. It’s not an integral part of the OS, it can be upgraded separately from the OS, etc.
Yes, there are unofficial packages for Kubuntu 7.04, but they aren’t supported by the security team, haven’t been fully tested, and aren’t recommended for general use.
In this situation, there may be an “easy” method to get a specific app updated.
However, it doesn’t address the point that there’s no standard method for updating apps on Linux distros without upgrading the entire distro.
Maybe it’s just me, but what is the aversion to separating the OS from the apps in the standard repos for Linux distros? And having separate release schedules for the two?
See my above post.
The real issue with Linux distros is that there is no operating system. Everything is third party: applications, kernel, desktop, etc. It’s all various bits and pieces from different places forced together, and then they call it an operating system. This is far different from FreeBSD, where they develop an operating system and then use the ports tree to install third-party applications.
tgrondin is correct. It’s not a separation you can really do successfully; if you could, *some* distro would have done it by now. It’s just not possible to have a clean separation and work on the two parts separately; apps are too heavily tied into ‘the OS’, and bits of ‘the OS’ change so frequently that apps must be constantly recompiled against them. Backports repositories will work fine for most distros for many apps, and more distros should use them (and, as you said, consolidate them) – IMHO implementing the Mandriva /backports repositories was the most positive thing we’ve done for a while. But they do have to be branches for specific releases, because of the aforementioned interdependencies.
Support revenue, perhaps? I mean, Canonical works on support for its revenue; if you were to get updated apps in Ubuntu Dapper, you wouldn’t need to upgrade to the new bugs in Gutsy. I dunno, just a thought.
It’s something that annoys me also: having to upgrade my whole OS to get Thunderbird 2.0.0.4, which was released 1 minute after Feisty was complete, is BS. Same with Dapper users; they’re still on the OpenOffice 1.9 beta. And that’s another thing: what’s the point of a 3-year support cycle if your apps are 3 years old and full of bugs?
Security fixes are welcome, but old bugs are left alone. Why bother with it at all?
The problem stems from the fact that Ubuntu Linux, and indeed all the others that I know about, do not actually adhere to the Unix way of doing things. They instead attempt to integrate everything into one single system. There is no separation between ‘operating system’ and ‘applications’. It is the great downfall of Linux distros.
This is why I use FreeBSD on most of my desktops and all of my servers. I see no need to upgrade my glib, gcc, kernel, etc. just to use the latest version of Apache as supported by the operating system I’m using. (P.S. Backports are not official, and in my job unofficial means NO.)
I would really like to use Linu* due to its hardware support, but until the distros fix the ‘OS’ vs ‘application’ separation issue, it is an absolute no on any of my servers and most of my desktops.
Gentoo, or use a distro with good backports.
Gentoo suffers from massive instability and outdated ports.
For instance, the Gnome desktop meta-port is still at 2.16 instead of 2.18.
I looked into Gentoo a while back, and sadly, while it seems good on paper, it does not work well enough.
What we really need is a simple way to run a binary distro with a solid base operating system and an easy (apt-get-like) way to install third-party software that would update when new versions come out. Even allowing us to download from getdeb.org with all dependency issues resolved would be good.
Gentoo has been perfectly stable for me, as long as you stay with the stable apps and conservative flags. I also remain unconvinced that the backports system in several distros doesn’t do exactly what you want. Especially since you can manually install any app you want if it doesn’t get made an official backport. I guess I just fail to see the problem your solution is supposed to be solving.
But ultimately, you’re the one using the system and if you like FreeBSD then that’s fine.
Gentoo being stable for you is great. Sadly it is not stable on my hardware or the 800+ machines I have to maintain at work. Most of our machines are running Windows because gnu/linux/bsd/etc do not cover basic functions that are required.
Here’s the list:
1: Stable base OS that does not need to be upgraded (security updates are an exception)
2: Ability to install third-party software without upgrading the base system, with easy installation and control of such network-wide.
Those 2 things Windows does very well. While I would say that Windows is totally crap, those 2 features are a baseline by which I judge which operating system I install for my clients. As of right now, only ‘Windows’ and ‘Mac OS X’ count. The Linux/BSD/GNU distros just can’t handle it.
“Here’s the list:
1: Stable base OS that does not need to be upgraded (security updates are an exception)
2: Ability to install third-party software without upgrading the base system, with easy installation and control of such network-wide.
Those 2 things Windows does very well. While I would say that Windows is totally crap, those 2 features are a baseline by which I judge which operating system I install for my clients. As of right now, only ‘Windows’ and ‘Mac OS X’ count. The Linux/BSD/GNU distros just can’t handle it.”
I’m not sure what this has to do with anything. Since NT4 I’ve seen large rollouts of the Windows base system re-deployed routinely, even nightly, as an effective way of keeping everything updated, virus-free, and stable. I know nothing of Macs, but I know this is also trivial for GNU, so why should this be a problem?
Ubuntu actually supports older incarnations of itself, especially for the reason you describe, if that is *really* what you want. Although personally I suspect more users would want the latest and greatest available to them. It’s one of the reasons for Ubuntu’s popularity.
I think there are *lots* of potential problems with Linux adoption and rollouts, but these aren’t them.
Hmm, can you install XP’s Explorer on Windows 2000?
Can you install Aero on Windows Me? Will Media Player 11 install on Windows 95? Can you install DirectX 10 on Windows XP? No, you can’t.
What is it with Windows that they can’t separate apps from the OS?
Internet Explorer 5+ will install on Windows 2000, which effectively gives you the same Windows/Internet Explorer as Windows XP. A few things will be different, but for the most part, it’ll work.
Aero is a theming engine, not an app.
Media Player 11 should work on Windows 98+, although I haven’t tried as Media Player is a piece of crap that should never have been released.
DirectX is a graphics API not an app.
I fail to see the relevance of your post.
Internet Explorer 5+ will install on Windows 2000, which effectively gives you the same Windows/Internet Explorer as Windows XP. A few things will be different, but for the most part, it’ll work.
As a developer I have to disagree. IE6 was disappointing in terms of improved support for CSS, but it had a lot of changes under the hood when it came to support for other web technologies that are no longer backwards compatible to IE5. IE5 is a forgotten browser as far as DOM scripting, AJAX, and many other things are concerned.
Ubuntu has a backports repository:
https://help.ubuntu.com/community/UbuntuBackports
So does Mandriva:
http://wiki.mandriva.com/en/Docs/Basic_tasks/Installing_and_removin…
(Mandriva’s is rather larger).
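For anyone who hasn’t used them, enabling Ubuntu’s backports repo is just a one-line sources.list change plus an update (the release name feisty is an example; use the one matching your install):

```shell
# Add the backports pocket for the installed release
echo "deb http://archive.ubuntu.com/ubuntu feisty-backports main restricted universe multiverse" \
    | sudo tee -a /etc/apt/sources.list
sudo apt-get update

# Newer versions can then be pulled in explicitly, e.g.:
# sudo apt-get install -t feisty-backports <package>
```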
They really need to come up with a better name than backports. And, at least for *Ubuntu, they need to integrate all their repos. Packages in feisty-backports conflict with packages in Medibuntu repos.
That’s medibuntu’s fault. It’s unofficial.
Medibuntu isn’t a Canonical repository.
OpenOffice updates do not consistently require new Ubuntu systems. However, you have already had your question answered (Linux favors shared libraries, not bundling them into every binary application redundantly). While you’re telling us that Win/Mac don’t have this limitation, try getting apps that require Unicode functionality to run on OS 9, or Core Video apps to run on OS X Jaguar. There are similar requirements with some Windows apps.
What I really can’t understand is your reasoning behind refusing to update a free OS. I’m running Feisty competently on seven-year-old equipment with 2001’s idea of RAM/HD/video, something marginally possible with OS X Tiger and XP SP 2 but not advisable.
What I really can’t understand is your reasoning behind refusing to update a free OS. I’m running Feisty competently on seven-year-old equipment with 2001’s idea of RAM/HD/video, something marginally possible with OS X Tiger and XP SP 2 but not advisable.
It’s not that I refuse to update the OS (I’m running Debian Etch with some things like KDE from Lenny on my desktop, and Kubuntu 7.04 on my laptop, and FreeBSD 6.2 on my home server, with XP SP2 on everything else).
It’s that I don’t understand the reasoning behind, and the seeming delight of Linux users, *needing* to update the entire OS in order to get the latest version of a simple application like Xorg, KDE, OpenOffice.org, Amarok, Firefox, KOffice, and so on.
New applications, even those that require a bunch of shared libs, do not *need* a new OS to run on. If that were true, then we wouldn’t be able to run KDE 3.5.7 on FreeBSD 5.5, we wouldn’t be able to run Firefox 2 on Windows 98, and so on.
My problem is that none of the Linux distros (that I’ve tried so far) is willing to devote themselves to a core set of packages as the “base OS” and use those to build packages of newer apps as they are released. None of the Linux distros (that I’ve tried so far) is willing to make the distinction between “this is our base” and “these are the apps we provide” with asynchronous release schedules for the two.
That’s my problem, and that’s why I tend to use FreeBSD and Windows XP wherever I can, and only use Linux when I need to.
There’s nothing more frustrating than seeing a new release of KDE and not being able to use it for another 5+ months because it shipped a month after Kubuntu or Debian or DistroX did.
Well, since you said Linux, and Linux is the kernel.
You don’t need to update the kernel to update KDE.
I said Linux distros. And you need to upgrade your distro in order to get newer apps.
I said Linux distros. And you need to upgrade your distro in order to get newer apps.
I don’t know how many times this has to be said: NO YOU DON’T!
It’s true that things are a little bit more complicated, but it really isn’t much different than the situation on Windows.
Let’s put this into perspective. Debian’s release schedule was so abysmal that Canonical felt it necessary not to fork Debian but to fork Debian’s release schedule. As a longtime Debian user, it really did suck. I think the reason Canonical went with six months instead of twelve is because theoretically Debian itself was aiming for twelve.
The version-locked apps you mention are OpenOffice and Firefox. Yes, it was irritating when FF2.0 came out and the current version of Ubuntu didn’t update to it. Likewise with OpenOffice, but most of OO.o’s changes were under the hood and not in the core features. Note that this time Ubuntu’s been considerably faster about getting bugfix updates of FF and OO.o into the repositories for Edgy and Feisty.
Part of the impetus behind not backporting is that there is a disincentive for Canonical to support multiple versions of their OS. Essentially they support two OSes: the current stable, and the most recent LTS. Everything else is your choice and your risk: even official PPC support went out the window with Feisty.
You call them new OSes, I call them service packs.
“It’s that I don’t understand the reasoning behind, and the seeming delight of Linux users, *needing* to update the entire OS in order to get the latest version of a simple application like Xorg, KDE, OpenOffice.org, Amarok, Firefox, KOffice, and so on.”
Something like X.org, or the whole of KDE, can hardly be called a “simple application”… Every single application that has a graphical user interface depends on X.org in one way or another, and KDE is quite a complicated whole, like an OS in itself. For example, it has taken a long time for the FreeBSD team to port the new modular X.org to FreeBSD, and it is still not officially done if I remember right. (Ubuntu and even Debian have had modular X.org for a long time already.)
Anyway, Linux and package management are not related. There are many sorts of Linux distributions, some allowing easy coexistence of many versions of the same package (for example Gobo), and some that don’t do any kind of dependency checking (Slackware) and so allow you to install and mix whatever software versions you like (if it breaks or doesn’t work, it is up to you to repair the damage).
Having strict dependencies between packages can be useful in order to guarantee stability and ease of use (no need to know all the necessary libraries and gory details), and also when you want to upgrade, say, Ubuntu to a new version every six months and hope to be able to do it as smoothly and easily as possible (not that I haven’t had many problems dist-upgrading Ubuntu, but I’ve usually done plenty of tweaking…).
Distributions like Debian and Ubuntu usually have lots of new software gradually backported and easily available in backports repositories (also see, for example, apt-get.org for Debian and getdeb.org for Ubuntu). If you can find no ready-made package, then compiling bleeding-edge new software from source is always possible for those who really want to try such software before it is officially available for their distro. Checkinstall makes it relatively easy to build RPM or DEB packages.
As to Ubuntu’s six month release schedule, I’ve thought that it is simply related to the six month GNOME release schedule. New Ubuntu releases have a new GNOME. Maybe the 6 month release schedule tied to Gnome releases doesn’t make that much sense in the case of Ubuntu derivatives like Kubuntu (KDE), Xubuntu (XFCE), Fluxbuntu (Fluxbox) etc. though.
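A rough sketch of the checkinstall route for a typical source tarball (the package name and version here are made up):

```shell
# Classic source build, but let checkinstall create and install a .deb
# so the package manager can track (and later cleanly remove) it.
tar xzf someapp-1.2.3.tar.gz     # hypothetical example tarball
cd someapp-1.2.3
./configure --prefix=/usr
make
sudo checkinstall                # replaces the usual 'sudo make install'
```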
Sometimes, people just want new apps, they don’t want a completely new OS.
Couldn’t have said it better.
A large part of the problem is that each version of each distro ships whatever versions of whatever libraries it likes, and compiles all its software with those versions, leading to a version lock-stepping effect where all applications and libraries are crosslinked with each other and pulling anything to a different distro version is impossible. So backported apps have to be recompiled on each older distro version from scratch.
What distros should do is provide binary compatibility between the library versions they ship, and compile the app on the oldest version of their distro they wish to backport to, then use that binary on every version after that.
Unfortunately, since the pervasive culture is “we don’t have to be binary compatible because we provide the source code,” applications commonly require the latest version of library, and libraries break binary compatibility between versions.
Ubuntu is an appliance OS, not a software platform.
All the software included on the CD is conceptually part of the OS, while other software is more easily upgradeable but not supported.
Compare that to Notepad or WordPad on Windows: you don’t get to upgrade them very frequently.
It shouldn’t be difficult – as long as you keep your cycles to evolution and resist the idea of doing what Eugenia expects – revolutions, your stability should not change.
The features should be at least 12 months behind the release so that there are 12 months of solid testing and bug fixing before officially being merged into the distribution.
If you’re a user who wants eye candy and revolutionary changes, then whinges because there is a lack of stability, the person who needs to be examined is the individual demanding the above, given the complete lack of understanding of the software development and testing process.
Software development is a long and slow process. Windows Vista is what happens when you over-promise, under-deliver, and still ship a product with parts that are not adequately tested. I’m sure there are Linux users here who would rather see their distribution ‘behind the eight ball’ than on the bleeding edge simply for the sake of ‘sticking it to Microsoft’.
I’ve read somewhere that the new kernel that will be part of Gutsy will have better support for the Broadcom 4311.
The question I’d like answered is: how well is this wireless chipset supported in Gutsy?
God knows I love Ubuntu. The thing is, I haven’t seen any real innovation in GNOME in a long time. I wish they tried to innovate like the KDE team is doing. Not that I like to use KDE (I don’t), but they seem much more focused on introducing real improvements to their environment. GNOME is feeling a little old these days…
And please don’t use this post to start a flame war…
I totally agree. I like Ubuntu, but GNOME really doesn’t seem to be going anywhere for me. Perhaps later in the year Ubuntu will use KDE 4.0 as its default WM?
How difficult is it to replace the default WM?
As simple as installing kubuntu-desktop or xubuntu-desktop (if you want the whole OS to be based around KDE/Xfce), or as simple as installing the WM of your choice and editing the settings for your login manager to start that WM when you login.
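Concretely, both routes are one apt command away (package names as in Feisty/Gutsy; Fluxbox is just an example of a standalone WM, and these commands need root, so treat this as a transcript rather than a script):

```shell
# Route 1: pull in a whole alternative desktop via a metapackage:
sudo apt-get install kubuntu-desktop    # or xubuntu-desktop

# Route 2: install just a window manager, then pick it from the
# "Sessions" menu on the GDM login screen at your next login:
sudo apt-get install fluxbox
```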
The thing I like is that Canonical wants to stick with the open source philosophy, and I applaud them all day and most of the night for sticking to their guns. It’s commendable in this industry, where sides shift and points of view change and change back constantly.

The problem with Ubuntu is that it’s heading toward mainstream-distribution status. It’s becoming the Mac OS X of the Linux world, yet there is no Canonical support for multimedia codecs, they stopped shipping ndiswrapper on the installation disk, and mobile computing support, while greatly improved in Ubuntu 7.04, is seriously lacking.

On my Dell laptop the graphics card flashes during boot (and I’m being serious), which is so annoying I have to close the lid to keep from going into epileptic shock. The battery meter doesn’t work, so I’m constantly questioning how much power I have left. This is after all the workarounds I have tried. I constantly joke with people that the hacked x86 version of OS X works better on that laptop than Ubuntu does. My PCMCIA slots work perfectly now, though; before, if I unplugged a PCMCIA card and didn’t lock up with a kernel panic, unloading and reloading the module would fail.

Windows works great for me. Until Canonical fixes these issues with Ubuntu, I’m stuck on Windows and, for my Linux distro, PCLinuxOS.
I have been looking at the forums and Launchpad, and I could not find any trace of people screaming that NVIDIA cards refuse to boot properly on the live CD. Am I the only one?
It seems like they want to activate Compiz with X.org’s nv driver (which obviously will not work).
I must have missed some big bug report somewhere, am I alone, and, if not, could someone be kind enough to point me to some bug report I could cling to?
On the other hand, maybe they are planning on defaulting to Nouveau (but I am highly doubtful of that).
I’m really looking forward to Intel releasing a standalone graphics card for desktop computers. I’m sick of proprietary bits that do not work well.
Compiz explicitly checks for ‘nv’ and will refuse to load if you’re using it.
No, it’s a known bug in X.org with the newer cards (8800 and so on). Ubuntu knew about it but failed to fix it. You need to append vga=791 to the end of the GRUB boot string.
791 is for 1024×768; I don’t know the other values, but a quick search on the net should give you the info you need. I was faced with the same problem (still am), but putting that bit on my GRUB boot string fixes it for the time being.
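The common values follow the standard VESA framebuffer table: the kernel’s vga= parameter is the VESA mode number plus 0x200 (512), expressed in decimal, so mode 0x117 (1024×768 at 16-bit) becomes 0x317 = 791. A small lookup sketch (the helper function name is my own, not anything shipped with GRUB):

```shell
# Decimal vga= values for common VESA framebuffer modes.
# Usage: vga_value <resolution> <bit-depth>
vga_value() {
    case "$1-$2" in
        640x480-8)   echo 769 ;;   640x480-16)   echo 785 ;;
        800x600-8)   echo 771 ;;   800x600-16)   echo 788 ;;
        1024x768-8)  echo 773 ;;   1024x768-16)  echo 791 ;;
        1280x1024-8) echo 775 ;;   1280x1024-16) echo 794 ;;
        *) echo "unknown mode" >&2; return 1 ;;
    esac
}

vga_value 1024x768 16    # prints 791, the value used above
```

Whichever value you pick gets appended to the `kernel` line in /boot/grub/menu.lst, just like the `vga=791` example above.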
Actually no, I have an “old” 5200. But after much messing around (https://bugs.launchpad.net/ubuntu/+source/xorg/+bug/127096) I tried removing xorg.conf and… it works! Perfectly!
I am certainly looking forward to this file disappearing forever into the archives of computer history.
It’s sad to see AppArmor integrated instead of proper SELinux. AA looks very inferior in all the sensible comments I’ve seen. I trust the Linux kernel developers.
“AA looks very inferior in all the sensible comments I’ve seen. I trust the Linux kernel developers.”
Are you a kernel developer?
Neither are the majority of Ubuntu users. And why should they be? AppArmor is arguably a little less robust, but a lot easier to operate. You can’t really expect everyone to write SELinux policies all day.
Then again, if you want SELinux, what’s stopping you? After all, it’s Linux we are talking about, and thus highly configurable.
Ubuntu’s developers may have simply decided that, at this stage, AppArmor is much easier to use in Ubuntu (isn’t SELinux incompatible with Ubuntu’s new init system, Upstart, for example?).
Also, despite some people’s beliefs, it is not yet written in stone that SELinux is the final and ultimate security solution that solves all Linux security problems better than any other solution ever will, and it likely never will be. Although SELinux may have a somewhat stronger security model than AppArmor, SELinux also uses LSM (Linux Security Modules), and LSM itself may have many potential security and other problems:
http://www.grsecurity.net/lsm.php
http://www.rsbac.org/documentation/why_rsbac_does_not_use_lsm
Always heartwarming to see a distro doing more than just packaging updates.