Ubuntu 7.10 has been released. “Ubuntu 7.10 Desktop Edition adds an enhanced user interface, improved hardware support, multiple monitor support and integrated desktop search. Ubuntu 7.10 Server Edition features improved functionality, manageability, pro-active security and hardware compatibility and delivers a rapid deployment platform for developers and businesses. New versions of Kubuntu and Edubuntu, derivatives of Ubuntu aimed at KDE enthusiasts and the education community respectively, are also being released at the same time.” And a review. Update: One more review.
It’s finally here people…
Remember to download with BitTorrent. Let’s give everybody the opportunity to play with it.
http://releases.ubuntu.com/7.10/
The advantage of being at work during the day is that by the time I get home this should be well seeded, with plenty of people already having the full download, making my BitTorrent download faster.
This version is really stable; I tried it from Tribe 1 and problems have been minimal.
“Stable”? Depends on what you mean by “stable”. Just open glxgears while Compiz is enabled and move the window around the desktop, and you will see how the Ubuntu team weighed stability goals in this release against the desire to get Shiny New Features.
Having done what you suggest many times on 7.04 without issue, can you please tell me what issue you are experiencing?
“Stable”? Depends on what you mean by “stable”. Just open glxgears while Compiz is enabled and move the window around the desktop, and you will see how the Ubuntu team weighed stability goals in this release against the desire to get Shiny New Features.
Why don’t you complain to xorg instead? I don’t think the ubuntu developers have the time or knowledge to fix window redirection for the composite extension.
Personally, I’ve never reported so many bugs for a release with so few new features, and most of them were major bugs (preinstalled apps crashing on startup in the beta, and things like that).
Probably. But I don’t use Compiz, not because I can’t: I could certainly turn on all the animations, but I think it’s just a waste of resources.
I think it’s just a waste of resources.
I agree.
What resources? Your GPU is just sitting there doing nothing while your CPU works away moving your windows around the desktop.
If you don’t play games much, what is your GPU even doing? Sitting idle 90% of the time, a bit like a new toy you don’t play with anymore.
My computer has a GPU? Where? Not everyone has a video card with its own chip, much less with megabytes of dedicated RAM. There needs to be some consideration for those with less-than-powerful video configurations.
Right, so having a GPU would free up resources. Why have your CPU do 20-40% of the work just to move windows?
Graphics cards are so cheap now that there’s no real reason not to have one unless you’re running servers. Or have you just woken up from the ’90s, when computer parts were expensive?
Can you please tell us Neanderthals who’ve just woken up from the ’90s how to add one of these cheap GPUs to a laptop?
Looking forward to your help. Thx.
If you don’t have a GPU in your laptop then it doesn’t apply to you, does it? My point is that the GPU takes the load off the CPU; therefore Compiz is not the useless, resource-hogging eye candy people think it is.
Did people say the same about square wheels when round ones came along?
“Did people say the same about square wheels when round ones came along?”
Those round wheels were copied from OS X!!!! Damn Open Source wheel stealers
So no other OS can have a composited desktop without being called a thief?
It’s not like Apple didn’t steal ideas from SGI right?
I was just making fun of some of the other posts about feature copying. The square wheel was just too easy. I realize everybody copies everybody else, and there is only so much you can do with a WIMP-style GUI.
Um, have you used Ubuntu lately? I ask because you seem to be under the assumption that Ubuntu enables Compiz regardless of your setup. Ubuntu detects your hardware, and if your system passes the requisite test then Compiz is enabled; otherwise you get plain old Metacity. I really don’t see your point here. If Ubuntu is not detecting your hardware correctly and is enabling Compiz on a machine that shouldn’t be running it, then by all means please file a bug report; otherwise please go to ubuntu.com, download the ISO and use the thing before you spread misinformation.
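For what it’s worth, if you want to check what your card reports or flip compositing on and off by hand, something along these lines usually does the trick (glxinfo comes from the mesa-utils package; treat this as a rough sketch, not the official Ubuntu procedure):

glxinfo | grep "direct rendering"      # should say "Yes" if a real 3D driver is active
glxinfo | grep texture_from_pixmap     # the GLX extension Compiz needs
compiz --replace &                     # switch to Compiz by hand
metacity --replace &                   # switch back to plain Metacity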
I hope you know that the composite extension provides double buffering for your desktop. So a composited desktop will actually use fewer resources (GPU/CPU), since the windows aren't redrawn every time you move them over each other. If the effects are slow, just disable them! Compiz is still an improvement without any eye candy.
I’m using it right now without GPU acceleration and it is working just fine. Exactly what is your complaint here?
I’m happy that 7.10 is released… now I’m crossing my fingers ATI will release the 8.42 driver with AIGLX today.
I’m happy that 7.10 is released… now I’m crossing my fingers ATI will release the 8.42 driver with AIGLX today.
Well, crossing your fingers shouldn’t be a problem, except that it can become a very annoying habit after a few years.
But I strongly advise against holding your breath.
ATI won’t release drivers with AIGLX… ever. AMD? Maybe. But we can probably guess that the burden lies on X.Org to provide those drivers based on the specs they released, and even that will probably take a few more months.
ATI is part of AMD — and they do plan on releasing AIGLX enabled drivers in the near future.
Yeah, that was several months ago; how near in the future do you think people can wait? Ignore binarycrusader’s post: either use something other than linux (which is fine, so long as it’s not Windows(c)) or buy an NVIDIA(c) card. ATI(c) abandoned linux(c) even before AMD acquired it, and it has hardly been any better since. Think about it: GLX_EXT_texture_from_pixmap, needed for compositing or AIGLX, is still nowhere to be seen, and has only been ‘in some senses’ validated by the guys at Phoronix. As much as I believe in alternative OSes and as much as I want AMD(c) to whip Intel(c), I’m afraid AMD(c) (in all its bastardizing glory) is already way too far behind to ever compete on the *nix platform again. You’re deluded to think otherwise.
I’m afraid you’re out of touch. ATi/AMD did originally mention this a few months or more ago but now they’ve started delivering on their claims. They have released the full 2D spec for their more recent cards and are about to release the 3D spec too, as well as building a new OSS driver framework, including a reference 2D implementation for the OSS community to build on.
http://www.phoronix.com/scan.php?page=news_item&px=NjA1Mw
Incidentally, I only use nVidia cards currently, because of the poor state of the current ATi drivers, but I expect to switch fairly quickly if nVidia cards don’t get a fully OSS driver before ATi ones do.
Why are you putting copyright symbols after everything?
I guess you meant to put trademark symbols, but you only need to do that if you’re writing something commercial.
You’re free to believe what you like. It doesn’t change the truth. I own an nVidia card, and run Solaris, so your implications that I’m an ATi fanboy are ludicrous at best.
I have just purchased an upgrade for one of my systems; the parts only arrived a few days ago. I am still in the process of setting it up.
It is an AMD Athlon 64 X2 system with a low-end ATI 3D graphics card, specifically an HD2400 Pro. This card is not currently supported by any Linux driver apart from the very latest ATI proprietary driver, fglrx version 8.41.7, which does not support AIGLX as yet.
However, the specs for this card for 2D have been released by ATI, and there is soon to be a further release of the 3D specs. The proprietary driver from ATI, fglrx version 8.42, which is almost ready for release, will support AIGLX, and it will support this card.
The open source radeonhd driver, which is in pre-pre-beta stages, recognises this chipset but does not yet support the specific card … I will try to make sure the radeonhd developers have the information to support this specific card by running the connector test utility and sending them the information it reports.
The system will eventually support Ubuntu 7.10 in full 3D with open source drivers … just not quite there yet. I have already tested kvm virtualisation on this system using the Kubuntu 7.10 release candidate, and it works … I can run Kubuntu and a Windows XP guest OS on the same system at the same time.
Anyway, the main point of my post is to tell you that your main assumption here is wrong. I was in the market for a new *nix platform, and I chose AMD/ATI specifically because they were opening up the graphics for open source development, and I therefore did not have to choose Nvidia. I could have chosen Intel as well, because the drivers for Intel graphics are also open source, but the problem with that is that Intel graphics tend to be available only as integrated graphics (not standalone), and the Intel equivalent system would have been more expensive but less capable on the 3D graphics front.
So, in the final analysis, not only are ATI/AMD competing once again on the *nix platform, they are already winning sales to open source users over Nvidia systems, as my own very recent purchase demonstrates.
Well, from what I know (and I’ve been following these drivers very closely), the ATI/AMD 8.42 drivers should be out this month, and they are scheduled to have AIGLX support.
But then they’re also supposed to support every card ever, be 200% faster, perfectly stable, make tea and promote world peace. I think ATI may be just a *tad* guilty of overpromising on the 8.42 front.
When I upgraded to 2.6.23 kernel I couldn’t get the ATI drivers to work, they installed but I didn’t get 3D support and it defaulted to using the Mesa drivers. Worked fine in 2.6.22.x.
The two biggest problems for me whenever I try a distro tend to be graphics and wireless. I haven’t had success in the past with Ubuntu because my X always got messed up really badly unless I was satisfied running at 640×480. I guess that my graphics chips are a bit less common. I’m looking forward to trying to get past my graphical issues with the new “bulletproof X” implementation.
If I can get past that, I’ll have to see how the wireless goes. I have high hopes from things that I have read, but seeing will be believing.
The two biggest problems for me whenever I try a distro tend to be graphics and wireless. I haven’t had success in the past with Ubuntu because my X always got messed up really badly unless I was satisfied running at 640×480. I guess that my graphics chips are a bit less common. I’m looking forward to trying to get past my graphical issues with the new “bulletproof X” implementation.
I fail to see how “bulletproof X” is going to help you here since, as far as I know, “bulletproof X” is basically a fallback option for the X server to start whenever it detects that it cannot use the proper drivers for some reason, and it is intended mainly for troubleshooting broken systems. I believe that it will use VESA at 640×480, 800×600 or something along those lines. Its main purpose is to never let people see the dreaded command prompt again in case Nvidia or AMD/ATI driver updates break something (as they invariably do from time to time), I guess.
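In xorg.conf terms, that failsafe mode amounts to forcing something roughly like this (a sketch of the idea, not the exact configuration Ubuntu’s failsafe generates):

Section "Device"
    Identifier  "Failsafe Device"
    Driver      "vesa"              # generic unaccelerated driver
EndSection
Section "Screen"
    Identifier  "Failsafe Screen"
    Device      "Failsafe Device"
    DefaultDepth 16
    SubSection "Display"
        Modes   "800x600" "640x480"
    EndSubSection
EndSection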
Let me explain briefly then. I don’t have anything against Ubuntu, and I would like to be able to use it since there is so much community support. Unfortunately, as I said before, I haven’t gotten past the graphical issues. I don’t need everything to work perfectly out of the box either, but Ubuntu seems to have more graphical issues than I have seen in other distros (for my hardware).
Probably the biggest reason that I haven’t gotten past the graphical issues is due to the fact that once X fails, I’m limited to a single CLI until I manually force X back into some sort of safe mode. From what I’ve read, the new system would allow me to make a change and test it out, and the worst result should be that I end up back in the same spot. That would make troubleshooting both faster and easier. Surely, it’s not that hard to understand.
Try Puppy Linux then sir.
That’s ok; Fedora Core 6 borked on my wifi card too, but it did more or less display video correctly on a Radeon X1300, as long as the highest resolution was 1024×768. Any higher and that scrolling-the-desktop-with-your-mouse thing would kick in, which drives me completely insane and has me looking to boot back into XP.
Always a tradeoff with this stuff. Ubuntu is really the only distro that (mostly) gets it all right the first time. Except for Feisty running on a Thinkpad T61… forget it. Errors abound, even off the Live CD. I’ll try Gutsy tomorrow at work and see how she does.
My new Dell laptop (Inspiron 6400) arrived just today. The perfect chance to install the latest Ubuntu.
I’m very impressed. Wireless networking worked out-of-the-box. Battery support works out-of-the-box (if I remove the power cable, Ubuntu will switch to power-saving mode just like Vista would; a battery meter is shown by default). I can plug and unplug USB mice at will. Partitioning the system is painless because it supports non-destructive NTFS resizing out-of-the-box. I have absolutely no idea why so many people are complaining about Linux laptop support.
I have tried many, many laptops, desktops, etc., and I have yet to find a computer that fully supports sleep and hibernation. Most of the time they never come back awake. If they do, usually something doesn’t work, usually X. I have had zero success on this issue with every major version of Linux. I realize some people do have power management working fully, but I believe they are still a minority. At least frequency scaling and battery management usually work, so hopefully the suspend/hibernate situation will also resolve itself some day.
I bought a Thinkpad X31 recently. I installed Debian Lenny (testing) and sleep/hibernation fully worked out of the box.
It works near flawless on a T42
So because your laptop works you have no idea why people are complaining? You’d make great tech support!
I don’t want to be tech support. So there.
Every last one of the things you mentioned except wireless worked “out of the box” on my Latitude C610 with Edgy. While I’m happy for you, your bar is set a bit low.
What required tweaking was wireless (because the Broadcom chipset in my cardbus adapter wasn’t supported in the kernel) and sleep/hibernate (because the default configs for detecting lid closure were incorrect).
New hardware, Ubuntu supports very well. Older hardware is another story:
Feisty is full of Launchpad reports on IDE drives being unsupported on both PPC and x86 architectures. The official Feisty release liveCD would not boot on a standard Dell Optiplex GX260 without inserting options at the opening screen, dropping to busybox and then modprobe’ing hardware.
Installing Xubuntu has borked recognition of PS/2 mice.
Feisty let in versions of key math libraries compiled solely against AltiVec-enabled PPCs, making G3s unable to use them without crashing. Gnash is one of the applications which requires these libraries. (replacing them with G3-enabled versions allows this to work, so it wasn’t a judgment call)
Myth users are still manually compiling LIRC for IR support and IVTV for Hauppauge tuner cards rather than using the repository’s versions, four release versions after people started using Ubuntu as a platform for Myth. Kernel maintainers broke audio support for BTTV (one of the most common tuner card chipsets) in 2.6.15.
The liveCD’s partitioning tool has been incapable of correctly setting up an XFS partition since at least Dapper, requiring the use of an alternate install CD.
And that’s just hardware.
The default network applets have been incapable of supporting WPA encryption out of the box for all wireless chipsets.
Static IPs are not possible with wireless unless you manually configure chipset-specific scripts.
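To illustrate what that manual configuration can look like: assuming the wpasupplicant ifupdown hooks are installed, an /etc/network/interfaces stanza along these lines (the addresses and names here are just examples) usually works:

auto wlan0
iface wlan0 inet static
    address 192.168.1.50
    netmask 255.255.255.0
    gateway 192.168.1.1
    wpa-ssid myaccesspoint
    wpa-psk mysecretpassphrase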
OpenOffice.org has been incapable of digitally signing documents without crashing since at least Dapper, because the Debian maintainer insists on compiling it against the wrong encryption library (manually compiling against the correct encryption library results in a version which does not have this problem). Bugs have been filed with launchpad and both Debian/Ubuntu maintainers have been informed at least three years ago, with no changes.
I’m an Ubuntu fanboy. I’ve got it running a Mythbox MBE/SBE combination, Gutsy is running on a vintage 1998 iMac in my office, and I’ve used it to run LAMP based servers with satisfaction.
But as a desktop platform, there are places where either a new release forgets to maintain backward compatibility compiler flags or old, important bugs are swept under the rug.
Myth users are still manually compiling LIRC for IR support and IVTV for Hauppauge tuner cards rather than using the repository’s versions, four release versions after people started using Ubuntu as a platform for Myth. Kernel maintainers broke audio support for BTTV (one of the most common tuner card chipsets) in 2.6.15.
I have Mythbuntu installed right now on my HTPC and it’s very nice, has LIRC support and everything compiled in (though I don’t use an IVTV card, so I don’t know about that).
You don’t have to ‘manually’ compile them; use module-assistant. Anyone who manually compiles them just doesn’t know how to do it the ‘Debian’ way.
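For LIRC, for example, the usual module-assistant incantation is something like this (package names from memory, so double-check them against the repository):

sudo apt-get install module-assistant lirc-modules-source
sudo module-assistant prepare            # pulls in kernel headers and build tools
sudo module-assistant auto-install lirc  # builds and installs the lirc modules for the running kernel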
The default network applets have been incapable of supporting WPA encryption out of the box for all wireless chipsets.
Odd, I’ve not had problems getting WPA encryption working since Edgy (when I first tried it). Then again, I bought smart and have an Intel wireless card, which is very well supported.
OpenOffice.org has been incapable of digitally signing documents without crashing since at least Dapper, because the Debian maintainer insists on compiling it against the wrong encryption library (manually compiling against the correct encryption library results in a version which does not have this problem). Bugs have been filed with launchpad and both Debian/Ubuntu maintainers have been informed at least three years ago, with no changes.
Didn’t crash here under Gutsy Gibbon updated to the latest today.
I think it is called DisplayConfigGTK, but in the betas it would just crash on my machine (ATI X1300 with dual monitors) – reported on Launchpad. The last version wouldn’t crash, it just couldn’t work. Googling, I see that many, many people have had to resort to manually editing the xorg.conf file. Any success stories here with this utility and a dual-monitor configuration? Perhaps it’s just a problem with the proprietary drivers (ATI and Nvidia).
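For the record, the manual route isn’t too horrible. A generic dual-head layout in xorg.conf looks roughly like the sketch below, with a second Device/Monitor/Screen trio defined for the other head; the proprietary ATI and Nvidia drivers each add their own options on top of this, so treat it only as a starting point:

Section "ServerLayout"
    Identifier  "Dual Head"
    Screen  0   "Screen0" 0 0
    Screen  1   "Screen1" RightOf "Screen0"
    Option      "Xinerama" "on"    # one large desktop spanning both screens
EndSection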
What is it about Ubuntu that makes people so enthusiastic about it? Don’t get me wrong, I use it at home, and I can be counted amongst those enthusiasts. But rationally, it doesn’t appear to be much better than the likes of Fedora or Suse (and both of which have prettier default themes IMO). Yet, Ubuntu seems to have a certain je ne sais quoi, something that generates buzz, that makes people infectious in their enthusiasm, and that eludes many other distros. I can’t even say why I choose to use it, yet I do.
Well, for starters Ubuntu doesn’t feel like a stripped down version of Canonical’s main offering because it IS Canonical’s main offering (remember, we’re talking distros here, not support contracts).
NB: I’m not saying that Fedora and OpenSUSE aren’t feature complete, it’s just that _I_ get the feeling that Red Hat and Novell aren’t too interested in polishing Fedora and OpenSUSE too much as it would threaten their commercial offerings. YMMV though.
Secondly, Ubuntu has IMHO a superior package management system as it is Debian-based.
Thirdly, it’s Debian-based and everybody loves Debian except for the fact that Debian releases “always” are “outdated” and “far between”. The unsupported Debian-unstable is the way out, but unstable sounds “nasty”. Not to mention how nasty “unsupported” sounds…
Which brings us to the fourth reason: Ubuntu is essentially an up-to-date Debian with support.
I could go on mentioning Mark Shuttleworth’s likeable personality, Canonical sending free CDs to anyone that asks, and so on. Suffice to say: yes, there is indeed something about Ubuntu.
Thirdly, it’s Debian-based and everybody loves Debian except for the fact that Debian releases “always” are “outdated” and “far between”. The unsupported Debian-unstable is the way out, but unstable sounds “nasty”. Not to mention how nasty “unsupported” sounds…
Debian releases are meant to be as stable and reliable as possible; they’re not meant to be as up-to-date as possible. Debian releases are good for servers as well as for those desktops where stability and reliability are more important than up-to-dateness.
For those who want Debian with more up-to-date software, there’s also Debian “testing” that is slightly behind Debian “unstable” in up-to-dateness. There’s also the option to stay in “testing” while getting some select packages from “unstable”. Also, you seem to forget that Ubuntu is built on Debian “unstable”.
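That mix of “testing” with select packages from “unstable” is normally done with APT pinning. A minimal sketch of /etc/apt/preferences (the priorities are only illustrative, and “somepackage” is a placeholder):

Package: *
Pin: release a=testing
Pin-Priority: 900

Package: *
Pin: release a=unstable
Pin-Priority: 300

With both branches in sources.list, a plain “apt-get install” then stays on “testing”, while “apt-get -t unstable install somepackage” pulls one specific package from “unstable”.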
Debian “unstable” is *not* unsupported. On the contrary, Debian “unstable” is fully supported — it gets bug fixes and security updates as they come from upstream developers. However, the largest component of Ubuntu, called “universe”, *is* unsupported. The Ubuntu web site says that it “comes with no guarantee of security fixes and support”.
http://www.ubuntu.com/community/ubuntustory/components
Which brings us to the fourth reason: Ubuntu is essentially an up-to-date Debian with support.
This is a myth that needs to be busted.
Like I said, Debian is fully supported (security & bug-fixes) but Ubuntu’s “universe” and “multiverse” components are unsupported. Ubuntu releases get outdated soon after the release but Debian “unstable” and “testing” receive version updates every day, which always keeps them more or less up-to-date.
I agree with you 100%. In fact, if Debian had a slightly more polished setup out of the box, it would cream Ubuntu as far as popularity goes. Well, that and a steadier release schedule.
Personally I usually play with Ubuntu, but there is always something that makes me go back to Debian Sid.
I agree with you 100%. In fact, if Debian had a slightly more polished setup out of the box, it would cream Ubuntu as far as popularity goes. Well, that and a steadier release schedule.
There’s a bit of friendly competition, which is always healthy between distros, but I don’t think Debian really aims to “cream” Ubuntu or any other distro “as far as popularity etc.”
Different distros have different strengths and different weaknesses. Canonical has hired a bunch of top Debian developers and these skilled full-time developers can make some big changes happen quickly. Debian, with a thousand volunteer developers, has more man-power and strict packaging policies that ensure the high quality but the big changes tend to be slower. Still, big changes do happen also in Debian.
Consider, for instance, polish on the desktop. Debian’s switch first from XFree86 to Xorg, and then to modular Xorg, was greatly helped by Ubuntu doing these switches before Debian, but since then Debian has developed their Xorg packages quite independently from Ubuntu. There have also been great improvements in Debian’s installer, which has been developed independently from Ubuntu, while Ubuntu has concentrated on developing their live-cd installer.
Also, a lot of work has gone into improving the “out of the box” desktop experience in the default Debian installation and to making it more polished. Also in this area, Debian has made its own choices instead of copying what Ubuntu has done. Many reviews of Debian Etch testify that users have been quite happy with these changes. Some of the reviewers wonder why Debian has a reputation of being a difficult distro when, in fact, the default Debian installation is easy even for beginners and it gives “out of the box” a polished desktop.
Some people attribute these improvements on the desktop to Ubuntu’s influence, but I see Debian going its own independent way also in creating a polished desktop experience. Ubuntu and Debian both look and feel different. Ubuntu customizes its desktop(s) heavily while Debian makes smaller, more sophisticated changes. If a lot of people like Ubuntu’s desktop polish better, fine. Some of us prefer the Debian way of adding desktop polish. And Debian keeps improving, so watch out in the future…
I would also like to point out that, although both Ubuntu and Debian have stable releases plus a development branch, Debian also has in addition a “testing” branch that changes every day but is more tested (and lags somewhat behind in up-to-dateness) than Debian’s main development branch, “unstable”. Ubuntu has more frequent stable releases but it lacks such “testing” branch. (Ubuntu can make frequent releases because they don’t officially support the large number of packages in their “universe” component.)
You could consider Debian “testing” as a “rolling release” that is constantly updated but still aims to be release-ready at all times. Historically, Debian added this “testing” branch because “unstable” tended to get utterly broken after stable releases and, yet, it’s important to have a development branch that can be temporarily broken when big changes are merged in. But for users it’s not ideal to track a development branch that can occasionally get broken. In contrast, Debian “testing” aims to be usable at all times.
One of the slow, big changes in Debian for the past couple of years has been the effort to improve the security support for “testing”, and it’s currently working quite smoothly. I don’t really know if Debian plans to release snapshots of “testing” as special desktop releases (like, say, once every six months) but if they have such plans, I’d say that Debian “testing” begins to be ready for that.
Two words: Mark Shuttleworth. A rich and famous geek icon poured his heart, soul, and lots of cash into Ubuntu, and people noticed. Also, Ubuntu’s user community is huge, knowledgeable, and friendly — an unbeatable combo in my book.
What is it about Ubuntu that makes people so enthusiastic about it? Don’t get me wrong, I use it at home, and I can be counted amongst those enthusiasts. But rationally, it doesn’t appear to be much better than the likes of Fedora or Suse
I’m not too familiar with (recent releases of) Suse, but I have tested practically every release of Fedora. For me, Ubuntu wins because of apt-get (much faster than YUM) and the huge repositories. And Automatix solves those pesky “patent issues.”
I’m not trying to start a distro war – for those who prefer RPM-based distros, more power to you. I’ve played around with all the big distros from Slackware to Debian, and each has some endearing feature. But all things considered, Ubuntu just seems easier than the others. If I wasn’t using Ubuntu, I’d probably be using Debian.
But rationally, it doesn’t appear to be much better than the likes of Fedora or Suse
Rationally, it doesn’t appear to be much better than Fedora or Suse now. But it wasn’t long ago that Ubuntu was the simplest to download and install, had the most up to date packages that I cared about (mostly Gnome) and worked the best on my hardware (and others’ too presumably).
If Fedora had been as good as it is now when Ubuntu came out, it could be the one with so many fans.
I can give you one example: package manager.
Fedora’s Pirut and Pup are quite frankly an embarrassment. They’re sluggish, but even worse is the complete lack of feedback about what’s going on. Often you launch Pirut, get prompted for the root password and then have to wait forever while it does something in the background without ANY indication of what’s going on. Heck, it doesn’t even show a window. Thankfully there’s Yumex.
I still use Fedora (when not in my primary OS, OpenBSD) but pirut and pup are a huge pain in the ass.
Ubuntu is also admittedly more polished than most other distros.
I don’t know if it’s just me,
but this release feels like the most exciting linux release since the original Ubuntu came out and attracted everyone’s attention.
Feels polished, simple, useful. =]
P.S.: If you have friends you would like to convert to linux, let them try Wubi:
http://wubi-installer.org/
Even if something goes wrong, it’ll be much safer, and probably no therapy will be needed before they try Linux again in the future… =]
It’s probably just you.
I never thought I would be asking this, but I can’t get more than 50 KB/s from any of the mirrors.
http://www.linuxtracker.org/
Very nice. I’ve been running it for about three weeks and it’s good.
But there are these things that never change, release after release. Mouse support. Yes. Every time I install a Linux distro I have to Google how to modify xorg.conf and some other program to use more than three mouse buttons.
I really like my back button in Firefox…
Other than that, remove the additional slowness Firefox has compared to the win32 version and I’m quite happy.
I really like my back button in Firefox…
That’s a Firefox “option”.
Put:
about:config
in the address bar of Firefox.
Then change:
browser.backspace_action
to 0 (0 makes Backspace act as the Back button; 1 makes it scroll up a page instead).
Tried the new TV-out stuff on my oldish laptop with Intel graphics.
Just killed X to the point of oblivion. Oh well, another Linux release passes me by…
“But as a desktop platform, there are places where either a new release forgets to maintain backward compatibility compiler flags or old, important bugs are swept under the rug.”
And this is the problem with Ubuntu in a nutshell. You have to upgrade to get newer versions of applications and when you do you’re left crossing your fingers hoping that there’s no regression.
But regression is a mainstay for Ubuntu. I used to run it and features that worked like a champ under Dapper, let’s say, were horribly supported or not at all supported under Edgy. The common response is that Dapper was a LTS release and was meant to be stable. But if Ubuntu wants to claim the desktop space *every* release must be stable and have little to no regression. Referring back to a release that is almost 18 months old as the answer for stability is really no answer at all.
Ubuntu and its derivatives need to seriously consider coming out with releases every 9 months or so instead of 6 months. Oh wait, then the users would have to go 9 months before their apps are upgraded.
So we are left with a conundrum: either extend the release times and use older versions of apps for longer, or release every six months, adding tons of eye candy and hoping there is little to no regression.
A bit of a trade-off, is it not? And we wonder what all the hype is about?
(Typing from a Slackware 12 installation running the latest office apps.)
Well, an 18-month release cycle is not only perfectly acceptable, but expected for a major operating system.
In fact, if you compare it to Mac OS X or the Windows collection of operating systems, only Mac OS X manages to have had releases within the 18 month time-frame.
People want no regressions, total stability, totally new features every six months with perfect documentation to go with it all for free. I honestly don’t believe that that is possible, although I wish it were.
I think you need to establish priorities. My servers are all on stable releases that change every few years. On my company’s desktops, I also keep stable releases that change every few years, but never sooner than every two.
On my personal workstation, I mess around because I know how to fix things when they break.
In summary, long-lasting stability: CentOS, Novell Desktop Linux, Mandriva Corporate Desktop, RHEL, Debian Stable or Ubuntu LTS.
If you want to have fun and can deal with the occasional glitch, the regular releases of the above distros are fine.
Never forget, Ubuntu was a young upstart using Debian as a base to start from.
I recall the Debian community getting really up in arms when Ubuntu started gaining popularity, mostly because Canonical could “poach” core developers onto its team with the offer of full-time employment.
Ubuntu has a solid financial backing with a solid Debian developer core behind it.
It’s a most “powerful” choice when it comes to turning it into the Windows replacement, yes. Most patented and semi-legal stuff can be set up with Automatix, so codec support is there. ntfs-3g puts an end to what was once a huge problem. Ndiswrapper, the Restricted Drivers Manager, all sorts of user-friendly features and HUGE repositories with cutting-edge versions. That’s the power of Ubuntu.
However, Red Hat’s paying tons of developers to really improve core stuff (Ubunteros more often pick up something easy). Ever wondered who’s behind the new redirected rendering, the TTM work (partially), the new firewire stack, PulseAudio? Those are being developed by RH employees. As a result, lots of this stuff ends up first in Fedora, a testbed distro: always cutting edge, but not as powerful as Ubuntu with all those features I described. It’s the only mainstream distro where you’ll find the latest mac80211 driver snapshots or 2.6.23 with CFS.
That’s the thing: Ubuntu’s strength is in its distribution and packaging. Canonical isn’t anywhere near as big as Red Hat or Novell, both of which have the capital to sponsor R&D. Ubuntu is funded by donations from its founder, Mark. He has paid about 20 million out of his own pocket to found and sustain the project. Red Hat and Novell have investors, they have deals with MS, they have stockholders, they have exclusive contracts with big IT firms; Canonical at this point has yet to turn a profit from Ubuntu, though they are slowly getting there. They have to maximize their resources and focus on what they do best, which is distribution, packaging, and apparently marketing. They don’t have the luxury of sponsoring pet projects or spending money on R&D. Ubuntu is not sold anywhere; it’s free for anyone to download and install with no strings attached. Red Hat, Mandrake, and Novell have had years selling their products, and however little profit that brought them it was still revenue. Where is Ubuntu getting its money from? Mark is not a bank; he cannot keep throwing millions into Ubuntu. Ubuntu has to stand on its own, and when Canonical finally does become profitable you will most likely start to see more funding go into sponsoring OSS projects and research into new avenues of software development. But you can’t expect a project with no money to do this, just like you wouldn’t expect the same from Debian, Slackware, or Gentoo.
“…just like you wouldn’t expect the same from Debian, Slackware, or Gentoo.”
Though you make a point, the three projects you mentioned do not have becoming the #1 desktop as their goal; they have one goal in mind: to be the best Linux possible for discriminating users of Linux.
I run Slackware and while it does not have a huge developer base, the developers it does have–in addition to Patrick Volkerding–all focus on making the best Linux possible for the end user who knows what they want.
Different distros have different constituencies, so you can’t paint them with a broad brush.
That’s not my point though. My point was that you wouldn’t expect Slackware to have a dedicated research and development team when Slackware is a community-based project, and for all intents and purposes that is what Ubuntu is. It may be popular, but at the end of the day Ubuntu is a community-based distro just like Slackware. The post I responded to tried to paint Ubuntu in the same picture as Red Hat and Novell, but these companies have a constant stream of revenue and are enterprise-oriented. Ubuntu may have a large community, but Canonical as a company is relatively small. You have to consider where Ubuntu originated from (Debian) and what model they follow as a template. Most community-based distros don’t have specialized paid R&D teams, but they have dedicated developers who work to better their distribution of choice, and Ubuntu follows the same model. I happen to think that the reason Ubuntu is so popular has to do with the fact that Ubuntu’s focus is still on the community, and not the enterprise exclusively. Red Hat lost sight of that and now wants to regain the market, and Novell wants to force-feed you their vision of what Linux should be. It’s great that these companies exist to enrich Linux, but to put Ubuntu and these companies in the same league is naive. Popularity, especially with a free distro, doesn’t equal riches in the OSS world.
I am very open to any Windows alternative, but Linux copycatism is starting to be ridiculous. The new document preview is taken right from Leopard’s Cover Flow, the KDE file manager has breadcrumb navigation like Vista’s Explorer, and hell, even the wallpaper is the Vista Strands wallpaper just colorized to brown. Both KDE and Gnome have start-menu search just like Vista. I wonder when OpenOffice will “borrow” the Ribbon interface. With such a vast base of devs and almost no need to preserve compatibility and user experience, Linux should lead the UI evolution, not piggyback on Windows/OSX.
Well, we’re not really talking about “Linux” here, but about the DE’s and 3rd-party apps. But am I missing a workspace-like feature (that nowadays I can’t live without) on XP? Didn’t Microsoft originally say that nobody wanted tabbed browsing, only to copy that from an open source application?
Everybody’s going to copy everyone. You’re making a huge mistake if you don’t add in features that the public has already seen and demands. But don’t make it sound like Linux (RE: Free Software) doesn’t innovate as well.
If Windows had workspaces, then Microsoft would be a co-defendant in the IP Innovations patent suit. Or would they…?
1. Linux DE’s have had document previewing for years even before Vista.
2. Did Vista invent abstract design wallpaper?
3. Where is the search supposed to go in a menu for searching?
4. Office apps need to look like each other really, if not people will complain about the cost to train people to use OO.o.
5. Linux DE’s have lots of UI evolution; you’re just not looking in the right places.
Leopard isn’t released yet, so it’s absurd to argue that its features have been copied. XFCE’s Thunar had breadcrumb navigation before Vista. It doesn’t make sense not to have a search bar in the main menu.
Before you lead, you have to reach parity. Users have expectations that must be met. There are bold new ideas in UI design, but for the most part, Microsoft, Apple, KDE, and GNOME are all converging toward a similar vision. It’s not so much that they’re all copying off one another as they’re coming to the same conclusions about what users prefer.
You ever notice how all of the mid-size cars on the market look eerily similar? Are they copying each other or merely giving customers what they want and expect?
Breadcrumb navigation was in Gnome way before Vista got it. XFCE also had it before Vista; even E17 (when the hell are they going to release the stupid thing?) had breadcrumb navigation before Vista. I mean, you can’t really be serious when you mention a Microsoft product and copying in the same sentence. All of Vista smacks of OS X déjà vu. The abstract wallpaper has been around for a while on deviantART, and as far as I know the first OS to use one as the default was OS X. The search in the start menu is NOT the default for either Gnome or KDE; it is a replacement made by Novell. The default Gnome menu looks very much like Classic Mac OS, and I happen to think it’s better than the Novell one by a long shot; there have been reviews stating as much. Kickoff is pretty okay though. Virtual desktops have been part of the Linux DE experience for years; Apple is implementing the same concept in Leopard, and I don’t see you accusing them of being copycats.
Some kind of breadcrumb navigation was present even in CDE (before 1995) and Open Look (late 1980s).
http://xwinman.org/screenshots/dtwm.gif
http://xwinman.org/screenshots/olwm.gif
There are probably other older implementations
Ooops!
I think you may find that Vista is actually the copycat.
Start menu search was in openSUSE long before Vista. And btw, ribbon interface was in Lotus SmartSuite 96 in 1996, it’s called InfoBox.
“And btw, ribbon interface was in Lotus SmartSuite 96 in 1996, it’s called InfoBox.”
InfoBox was a sort of wizard/dialog hybrid; the Ribbon is a menu replacement. I don’t see how you can even make this claim.
Because I’m still using it and know how it works. And mostly it is in “how it works” not “what it is”. Office 2007 is only a different implementation of the SmartSuite ideas.
I disagree entirely, I did use SmartSuite in the past, but this is the present, and it worked differently, as it is essentially just a tabbed dialog box, which has existed for decades.
The Ribbon is an extension of that idea, but limited to just the commands, and not the work area. It is not a new concept, but a new use for an existing concept. I doubt SmartSuite was the first to use a tabbed dialog box.
That is how most things evolve – features that work appear across the board, features that don’t slip away into obscurity.
Operating Systems will converge on similar ideas not because they are deliberately copying each other, but because there are only so many ways of performing a task optimally. Form and function often have a very tight relationship. Wheels can be designed to take on a myriad of appearances, but all wheels will converge on the defining feature of being generally circular in cross section. OSes will converge on similar features because of user preferences and expectations.
I don’t care if OO.o copies the ribbon, because it is a good feature and a better way of accessing the functions in MS Office than in previous versions.
I don’t care if Gnome and KDE have a start-menu search like Vista – it is a sensible and usable place to put it.
Good features should be copied.
OpenSUSE and Mandriva have to admit the Ubuntu supremacy…
My goodness! This release is almost perfect…
whatever…
Ubuntu and Kubuntu always remind me of a default Windows install – in that there’s just NOTHING there. In Windows you get a media player, internet browser, email program, notepad, and file manager… wow. Ubuntu does give a lot more than this, but it’s all standard Linux software. Ubuntu is like a “base install” with all the extras pulled out.
To say it another way, they’re mostly aiming to get their desktop working well, without bringing anything particularly new to Linux. I believe they are playing with features like AppArmor, but in terms of polish and overall feature completeness, I believe openSUSE offers more. openSUSE is a more commercially oriented product which serves as a test bed for SLES. And it should be said that Ubuntu cannot compete with RHEL and SLES at present. I see openSUSE as bringing solid security and a full-featured desktop that’s backed by Novell. Anyway, it’s hardly an argument since we all have our favourites. Unfortunately, though, I don’t believe Kubuntu comes close to openSUSE for being the best KDE distro.
The biggest difference for me, I think, is YaST. Say what you like about it, but it’s an awesome piece of software…
All my personal opinion, so take it how you will.
You can only put so much on a single CD. More Stuff is what Add/Remove is for; the whole point of repositories is that all the other software is trivially available. A DVD release could include more by default, but installing extra software is so easy it probably wouldn’t even be worth the effort of putting together the .iso (and would also most likely put a greater strain on the servers at release time).
Hmm, I tried it for some days here… (apart from the ACPI bugs from my BIOS, the mouse wheel not working, and it being slooow on 192 MB of RAM; hey, come on, Zeta flies on this)…
I’m still wondering what makes it so secure compared to Debian or any other distro.
No root account?
Fine, that’s even simpler now:
sudo sh
It is? I have actually not heard this claim before.
Well, there’s a root account. You just can’t log in with it.
I prefer “sudo su -” for those occasions where “sudo <command>” doesn’t cut it.
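For reference, a couple of other standard sudo tricks do much the same thing (nothing Ubuntu-specific here):

sudo -i            # root login shell, same effect as "sudo su -"
sudo passwd root   # set a root password, if you really want to be able to log in as root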
Just downloaded Kubuntu 7.10 and burned it to CD to give it a shot.
Looks nice, except that it doesn’t recognize any of my fixed disks. First GNU/Linux distro I’ve run into that can’t access either of my fixed disks.
I really want to like *buntu, but each release I’ve tried so far (6.10, 7.04 and now 7.10) has some showstopper bug that prevents me from using it, on the 4 different machines I’ve tried it on.
I could probably get help from the forum, but I don’t see the point since so many other distros work flawlessly out of the box. I’ll just stick to PCLinuxOS for now.
Not sure why *buntu hates me so much, but congrats to the team for their hard work. Maybe the next version will work on my system.
Ubuntu finally detects my monitor correctly! Very nice, as messing around in xorg.conf is never pleasant.
Compiz (or whatever it’s called) seems relatively stable. The effects enabled by default appear to be pretty sensible. It’s taken too long, but I’m pleased that it’s finally here.
Fonts, default art etc. are all still not to my tastes, to put it diplomatically. The sloppiness of having two different information icons upon right clicking the network applet persists.
Tracker doesn’t seem to work for me.
usplash still looks hideous on widescreen monitors. I can live with it, but I’d rather not.
Overall though, great release. Canonical just needs to hire some, y’know, artists. If Ubuntu had the same level of visual polish as Fedora, I’d recommend it without question. As it stands, it’s easily the best desktop Linux distribution in every other respect.
Tracker doesn’t seem to work for me.
Search results may not be available until Tracker’s initial indexing has finished (this is a feature, not a bug). Also, indexing is suspended on battery, so you must be on AC power for the initial indexing to occur. Tracker also pauses if it detects other apps writing to disk, so if you are downloading stuff continuously, Tracker indexing will be stalled.
Also, if you previously had old versions of Tracker installed, please do:
killall trackerd
trackerd --reindex
(to make sure the update blues haven’t spoilt things)
It seems that support for not-so-common graphics chipsets is much worse compared with Ubuntu 7.04. I’m still trying to get X running on a VIA Unichrome chip, but, alas, to no avail. In the previous version it worked out of the box.
And that’s just what I pointed out before: Ubuntu focuses on six-month release cycles and new whizbang features. The downside is that there seem to be more bugs with every release, as well as regressions. What worked great in release X is poorly supported or does not work at all in release Y.
If Ubuntu and Canonical want to be a major player even more so than they are now they must minimize these regressions and provide more consistency with their releases.
Yes, in every version it needs different kinds of adjustment to work on hardware that was supported flawlessly in the previous version.
My rather exotic Laptop works now with 7.10.
I got the alternate ISO, installed in text mode, booted into single-user mode, disabled GDM and configured X from scratch with a new xorg.conf that I took in part from a Debian system.
And I am glad to have made this small effort. But for the common user this might be a bummer.
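For anyone hitting the same thing, roughly the same result can usually be had with the stock Debian/Ubuntu tools instead of borrowing an xorg.conf, though your mileage may vary:

sudo /etc/init.d/gdm stop              # drop out of the graphical login
sudo dpkg-reconfigure xserver-xorg     # regenerate xorg.conf by answering the configuration questions
sudo /etc/init.d/gdm start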
Apparently Wired is extremely impressed with this Ubuntu release. The review isn’t deep or anything and it doesn’t go into much detail, but it’s pretty positive.
http://www.wired.com/software/softwarereviews/news/2007/10/ubuntu_g…
Just nervously performed a dist upgrade on my Feisty install (via the Update Manager – very slick) and everything worked perfectly. I’m really impressed.
Really terrific work folks. Thanks!
I tested both Ubuntu 7.10 and Mandriva 2008.0, and the better of the two is Mandriva.
1) Installers: both have live-CD installation, but the traditional installer of Mandriva can be graphical or text-based (you can select it), while Ubuntu’s alternate installer is text-based only, less intuitive, and has fewer options/features.
2) Ubuntu’s KDE (and Kubuntu) is worse than Mandriva’s KDE, and Mandriva’s Gnome is almost as good as Ubuntu’s Gnome.
3) Mandriva has the powerful MCC “control panel”
4) Mandriva seems faster than Ubuntu for desktop usage
5) You can use urpmi, smart, or apt4rpm/synaptic/aptitude for package management in Mandriva. Therefore the use of apt/synaptic in Ubuntu is not an advantage over Mandriva, which offers more package-installation options with automatic dependency resolution.
6) The myth that Mandriva is not “free” (as in freeware) is false, because Mandriva Free and Mandriva One (which also includes proprietary drivers and codecs/plugins) can be downloaded freely and without cost. And you can pay for the convenience of Mandriva Powerpack, which comes with all these proprietary things and is totally ready for normal desktop usage after installation.
I suspect that both Ubuntu and PCLinuxOS have more popularity only because they are fundamentally US-based distributions, while Mandriva is a French/Brazilian distro. I cannot agree with some reviews and people saying that PCLinuxOS is perfect or better than *buntu when at the same time PCLinuxOS is deeply Mandriva-based, sharing MCC, the Mandriva installer, many packages and even the Mandriva/Conectiva apt4rpm/synaptic.
Having just gone through an “install-a-thon”, a few reflections…
I find 7.10 to be the most accomplished Ubuntu distro yet. Doing a clean install from the Desktop CD on my desktop system (using an Intel D865-GLC main board and a BFG nVidia GeForce 6600-based display adapter) has resulted in a very pleasing environment. I particularly appreciated the “Restricted Drivers Manager” and the ease with which commercial DVD support is added, without having to resort to Automatix.
On my laptop (Dell Inspiron E1505n), the path was even smoother. I’ve been running alpha, beta and RC versions right along – now I’m at the release version with no more than a “sudo aptitude full-upgrade”.
Canonical is to be congratulated on this release.