Last September, some of the Debian Linux distribution’s leadership wanted to make sure that Etch, the next version of Debian, arrived on its December 4th due date. More than two months past that date, though, according to the February 17th Release Critical Bug Report memo to the Debian Developers Announcement list, there are still 541 release-critical bugs.
Debian is suffering continuing delays partly because of a slowdown by key developers. Many developers are upset that Debian’s two release managers are being paid to work full-time to finish Etch. Umm, last time I checked, people have bills to pay and they have to eat! The intentional slowdown being orchestrated by these key Debian developers serves no purpose other than turning people off Debian. While many distros are flocking to Ubuntu’s packages for their core, the ranks of distributions using pure Debian are beginning to shrink. Meanwhile, the Debian developers continue to play their childish games. I think Debian must have taken a page out of Microsoft’s playbook. Unfortunately, sometimes it’s NOT better to be late than never.
Debian is suffering continuing delays partly because of a slowdown by key developers.
So, who exactly are those key developers that you claim are slowing down their work, and how exactly has this alleged slowdown delayed the Etch release? You’re parroting unproven allegations. Please provide the facts and details to prove your claims.
I find it strange that Steven J. Vaughan-Nichols is so worried about the Etch release when he apparently doesn’t even use Debian himself. Why is he so concerned? People should know by now that Debian is only released when the developers think it’s ready — not a minute earlier.
A recent post in the debian-devel-announce mailing list suggests that the Etch release is progressing just fine.
http://lists.debian.org/debian-devel-announce/2007/02/msg00019.html
It’s no secret that many Debian developers are upset about Dunc-Tank. A Google search will make that clear. Further, I doubt anyone will admit to being one of the developers purposely delaying their work. Would you admit it? I doubt it. As for hard facts, they’re scarce, but I did read this recently:
“A group of 17 developers, led by well-known Debian maintainer Joerg Jaspert, issued a position statement in October citing its disenchantment with Dunc-Tank. It read: ‘This whole affair already hurts Debian more than it can ever achieve. It already made a lot of people who have contributed a huge amount of time and work to Debian reduce their work. People left the project, others are orphaning packages… system administration and security work is reduced, and a lot of otherwise silent maintainers simply put off Debian work (to) work on something else.’”
Please stop acting like a litigator. I’m not wasting time providing names, addresses, dates, and times so that my comments seem more credible to people like you. If you want more information, look it up yourself.
You’re missing my point — which is that there’s no rational way to prove that the disagreements concerning the Dunc-Tank project would have contributed to the delay of the Etch release. There are other technical reasons that explain the delay well enough. Don’t ask me to look up information that either confirms or refutes this because such information just doesn’t exist. This is the reason why I think that people should be careful not to spread these unproven allegations like they were actual facts.
As for the post written by those 17 developers who oppose Dunc-Tank: yes, I’ve read it, but that post also lacks cold hard facts concerning the actual effects of their protest. AFAIK, there are more than a thousand Debian developers. Why should the opinions of 17 of them weigh more than the opinion of the majority of the DDs? With such a large group of volunteers, there are inevitably disagreements every now and then. It’s not the end of the world, and it’s certainly not the end of Debian.
Personally, I suspect it’s something much simpler. Linus had a really hard time delivering any kernel on time, no matter how hard he tried to enforce it. There were many excuses for the delays, including the “a release won’t be finalized until it’s stable” reason. But given that these days Linus *is* able to make regular stable releases that rarely slip, it’s clear that those excuses were false. Things didn’t start working smoothly with the Linux kernel until the project changed the tools involved and changed the development and stabilization process. Something similar could be put in place for Debian.
Imagine if Debian unstable were “stabilized” once every month (or 2 weeks, or 2 months, whatever works). If a package didn’t make it in one month, no problem: wait ’til the next release window (e.g. 1 month). This process would ensure that unstable never gets too broken, and people who want to rely on the bleeding edge wouldn’t have to worry about the spaces between releases where packages are being stabilized (and thus might break). Once every 6 months (or 9 months, again whatever works), an unstable release would be picked for further stabilization, and a beta stabilization release (aka testing) could be made every two months.
If some generally useful packages (e.g. GNOME or KDE or Apache, or a Linux kernel with Xen/KVM) missed the first “testing release” cut but got stabilized before the next unstable release, the Debian maintainers *may* decide to include them in the second testing release, but that’s not guaranteed; and if a package doesn’t make it into the second testing release, it will have to wait until after the next stable release for inclusion. *No exceptions*.
After about a year of stabilization, *with only bug fixes and no package updates*, nearly all the bugs should have been worked out, so a stable release can be made.
At the end of that 1.5 to 2 year process, you would have a predictable, stable release. It all comes down to project management.
Imagine if Debian unstable were “stabilized” once every month (or 2 weeks, or 2 months, whatever works). If a package didn’t make it in one month, no problem: wait ’til the next release window (e.g. 1 month).
That would kill Debian for anything but casual use. The beauty of Debian stable is that you can do an apt-get update && apt-get upgrade and have the latest version of stable and secure packages. It comes down to one thing: configurations are NOT broken. Look at how the current stable would handle an upgrade to the latest clamav: it wouldn’t. You have to fiddle the config files manually, simply because overwriting and/or merging the new conf files is a really shitty idea if you run more than a few servers.
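For what it’s worth, a minimal sketch of how the conffile prompts can be sidestepped on a larger fleet, using standard dpkg force options passed through apt (adapt to taste):

    # --force-confdef: where a conffile was modified locally, take dpkg's
    #   default action instead of stopping to prompt
    # --force-confold: where there is no default action, keep our old file
    apt-get update
    apt-get -y \
        -o Dpkg::Options::="--force-confdef" \
        -o Dpkg::Options::="--force-confold" \
        upgrade

This only suppresses the prompting, of course; a change like the clamav one still has to be merged by hand eventually.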
Stable is stable for a reason; if you want the latest and greatest, you run unstable or testing (etch).
Ubuntu is very nice but stuff breaks in far too many ways in between releases and that takes away some of the fun.
And yes, we run 100+ Debian servers at work, and my workstation is Ubuntu (which btw has piss-poor interactivity compared to my older and slower FreeBSD-Gnome laptop).
Hey chicken,
http://blogs.turmzimmer.net/2006/12/18#20061211
You will not hear the truth, because some things aren’t for everyone’s ears.
Uh, the link you provided says this: “There are media rumours floating around that ‘[Etch has] been delayed because some developers have deliberately slowed down their work’. This doesn’t reflect what I said.”
Many developers are upset that Debian’s two release managers are being paid to work full-time to finish Etch. Umm, last time I checked, people have bills to pay and they have to eat!
Except for developers, who apparently live in boxes, survive on garbage, and sleep after they’re dead.
Seriously though, I think that this issue brings up on a small scale some of the questions about a world of only Free software that Stallman and FSF supporters have yet to answer adequately. Mainly: Will developers work for free, and if not, who will pay them?
Just to make it clear, I don’t have a problem with Free software. I do, on the other hand, have a problem with the implications of the Free software only world that Stallman envisions.
I think you’re putting words into RMS’s mouth that he didn’t utter. There’s nothing wrong with charging for GPL’ed software, and nothing wrong with paid developers. Nothing wrong with corporations either, as long as they respect the GPL.
Poor Linus, poor Red Hat.
Well, the above post may not be the longest, but it accurately answers the question. Who was the idiot that modded it down?
Linus and RH are both very good examples of how free software developers can earn money. A company may need a feature that is not available, and it may sponsor a developer to implement it (some features of FreeBSD were developed this way). Developers can also offer professional support and training, or work for a commercial Linux company.
Anyhow, if all software were free software, that would mean a rearrangement of the way developers are paid, but I don’t believe that overall fewer (or more) developers would earn a living writing software.
Have you ever seen what the FSF charges for one of their “deluxe” GNU software distributions? The FSF has no opposition to charging for software, just to restricting the rights of users.
In any case, it should be borne in mind that most software is not sold as a product. Most software is created in support of other products. This balance is only going to shift further towards custom software as the commercial software market becomes saturated (honestly, who really NEEDS yet another version of Photoshop?) and more and more products have embedded software (what open-source project is going to write a free-software implementation of an avionics package?). Also, while free software means there is less money for producing specific types of high-volume software, it also means that users of these types of software have more resources to spend on more productive things. We use Linux and GCC at work because it means we can avoid putting out the $$$ for a copy of VxWorks. That in turn means more of the money from our contract can be spent on hiring people to work on our main project.
Your point about paying developers is actually valid.
There are companies making millions off of supporting Linux but that money does not necessarily make it to any of the people actually writing the code, building packages, or fixing bugs.
According to the Inquirer, HP is making $25 million a year supporting Debian: http://www.theinquirer.net/default.aspx?article=37799
I can’t speak authoritatively on whether any of this money is getting channeled back to support Debian, though; I’m simply pointing out that there is nothing preventing HP from making $25 million a year on Debian and giving Debian squat.
I believe a large portion of the money made on open-source software ends up in the hands of middlemen, due largely to the support model.
The obvious solution for companies building Linux is to also sell commercial support, but there must be a thousand companies other than RH offering paid support for Red Hat products. Sure, they do bug fixes, but since they don’t have to pay the full-time developers who contribute the bulk of the code, it’s easy for them to undercut Red Hat’s support prices.
A more direct model to support developers and support staff alike could offer huge gains, but most of these attempts have not really taken hold.
Some ideas on how better to do this could make for an interesting article/discussion.
Interesting indeed.
I’m not sure either that the free software model with support paying for the development is the way to go.
I don’t remember where I read it, but Stallman did suggest a Software Tax at one time.
But we will never need to solve this problem. Open Source thrives in certain areas, but in others it is always behind. Some software you pay for, and it just works; it needs no support. Search for your own reasons, but the simple fact is that the GIMP will never be as good as Photoshop, not to mention the rest of the Adobe Suite. So a world of all Free software will only exist in Stallman’s head.
There is still the problem of paying developers in the areas in which OSS thrives. The best way I see is for companies to hire devs just as they hire admins. They would cooperate with a central managing body, like Apache or Mozilla, and all contribute to the software they use. The benefit is that internal support could be distributed among all contributors.
This is where I see Ubuntu going as a common base distro.
I think you’re missing the big picture here. Free software is a manifestation of the fact that users share many of the same requirements for what their computer should be able to do, and that nobody understands their needs better than the users themselves. I intend to show that users have yet to fully understand the role of software in their everyday lives, and that as they begin to realize its importance, they will in turn realize the importance of free software.
Computing has become less about providing an exclusive advantage–a leg up on the competition–than about bringing people and businesses together in a way that makes solving complicated problems easier. Computers began by solving problems we already knew how to solve, but faster. Today, computers solve problems that we had no way of solving before, providing insight into the nature of our world and the way we interact with one another.
Software is about people–what makes us the same, what makes us different, and how we can interact in a mutually beneficial way. It’s time that the way we develop software more closely reflects the problems it is meant to solve. The first step is to conquer the problems solved by the preexisting paradigm. Honestly, these are not the kinds of problems that free software solves remarkably well. But without this as a starting point, we cannot leverage the power of our collective insight to solve problems that are beyond the capabilities or ambitions of conventional proprietary software development.
Make no mistake: the free software community doesn’t relish the opportunity to create replacements for existing software. People rely on these products to create, manipulate, and consume information, yet their creators don’t allow us to fully understand, improve, and extend these capabilities. They pretend to know everything we want to be able to do, and they pretend to offer to bring our vision to reality. But they only care about what we want, and they only offer to make it happen, so long as there’s something in it for them.
Proprietary software only succeeds in an environment where people don’t understand what they want out of it. When the desktop market began to blossom, people knew they wanted to be able to create documents and play games–not much else. Proprietary software vendors made it happen, and people were happy for a while. Then came the Internet and email. The proprietary software vendors made it happen, and people were happy. The same with multimedia a bit later.
But once people’s most basic needs were addressed, we were faced with the question of where we wanted to go from here. We began to realize that we didn’t just want to create, manipulate, and consume information. We wanted to discover, explore, and share information. We wanted to work together on information, and we wanted to experience life as the pursuit of relationships amongst information.
Proprietary vendors are in the business of dominating the connection between the software that creates information and the software that consumes it. So our natural desire to understand information, to become enlightened, is not of any particular business interest to them. The proprietary model of funding software has broken down because software is no longer about a producer/consumer relationship; it’s about participation. People ultimately want to participate in social and information networks that cultivate relationships amongst people and ideas. They want to feel like part of a community, and they want to realize personal empowerment through participation.
These are heady concepts that I believe will come to people in time as they work through their frustrations at the ways in which software limits their ability to satisfy their insatiable desire for personal growth. People are already starting to become furious at the rigid connections between the creation and consumption of media. The process of extrapolating this sentiment to all aspects of their ability to participate in our information-based society is only natural. Our initial attempts to right wrongs might not always address the root of the problem (e.g. piracy), but a true understanding of what it takes to protect our right to participate in information is inevitable and will lead the masses to free software.
Free software is not just a development model. It is a reflection of the way social networks have spontaneously assembled throughout the history of mankind. Freedom is what happens when we stop making excuses for our own ignorance and strive for the fulfillment we so desperately crave. Freedom is what happens when people are allowed to connect and participate. It is a path to progress, understanding, empowerment, and ultimately happiness.
Freedom is a notion that doesn’t just exist in Richard Stallman’s mind, it exists in all of our minds. Proprietary software will continue to provide a model for production and consumption, but only free software can fulfill our fundamental desire to participate in information.
Seriously though, I think that this issue brings up on a small scale some of the questions about a world of only Free software that Stallman and FSF supporters have yet to answer adequately. Mainly: Will developers work for free, and if not, who will pay them?
To fine-tune that argument further: will developers work on fixing bugs in software that doesn’t interest them? In other words, will they fix a bug that doesn’t affect them directly or indirectly? This isn’t an attack on the programmers, just bringing up the old notion of self-interest.
It’s like communism: are you willing to work, give what you can, and take only what you need? The same goes for programming. For many, it’s an interest outside their full-time occupation, so is there any incentive for them to work on things that don’t affect them directly?
One idea I like is bounties for key bugs or required features: provide an incentive for people to work on the rather unglamorous parts of the software and receive a reward for their hard work.
Money shouldn’t be the only reward. For example, Sun could offer a free workstation and a 3-year Solaris and developer subscription to someone who comes up with a replacement for the current sound API. The cost to Sun? Sweet bugger all. The benefit to their customers? Massive, to the point that it can’t be measured in dollars alone.
As for Stallman’s world: it assumes, firstly, that languages will get to the point of being so high-level that almost any man and his dog can contribute; it also assumes that the end user will advance in skills. Twenty years in IT so far, and I can assure you that the average end user is just as dense, if not more so, than they were 20 years ago.
The other assumption is that those who work in large companies maintaining large amounts of infrastructure will have employers willing to let their employees spend part of the day working on an open-source project that helps the business. Considering that the in-vogue thing for managers is to scream “outsource” when in doubt, as if it were the panacea to all of life’s problems, I doubt that scenario would happen. It would rely on a number of companies working together on one project in cooperation, rather than simply re-inventing the wheel at each company; that would require common sense, which many businesses lack.
Twenty years in IT so far, and I can assure you that the average end user is just as dense, if not more so, than they were 20 years ago.
That’s because the average user is receiving increasingly less education. There are, however, some changes on the way, and I can say for sure that younger computer users are a lot more tech-savvy than the former generations. It’ll happen, but it takes time.
The only problem I can see is the lack of women in the IT sector. Not that they don’t exist (they do), but there are quite few of them.
I think you’re missing the big picture here. They paid the release managers, yet when the developers slow down there’s no release. Now, exactly who should they have paid to speed up the release? Common sense would say to pay the guys actually working to squash bugs and get the damn thing out the door, not glorified mailing-list managers.
Etch will be released when it’s done. Where’s the problem with that?
Also, there are 103 open bugs, NOT 541.
http://bugs.debian.org/release-critical/ (look at “number concerning the next release”)
Looking at that graph one wonders what happened about a year ago. Did ATI sneak release a driver or what?
IIRC, the increased number of bugs one year ago was caused by the modular X implementation. But that was a problem that all distros had to fight at one point or another.
>Debian is suffering continuing delays partly because of a slowdown by key developers. Many developers are upset that Debian’s two release managers are being paid to work full-time to finish Etch.
This is part truth and part nonsense, but for Debian itself it’s a problem because of its philosophy.
>Unfortunately, sometimes it’s NOT better to be late than never.
True. And Debian is about quality. Most of the big distros out there advertise short release cycles and cannot deliver on them; Fedora, for example, goes delay by delay and afterwards ships with lots of bugs again. This I call “quality”. So in the end, Etch should take all the time it needs to fulfill its goals for quality and stability. There are enough distros full of hype but with a massive lack of stability.
Heh, if it weren’t for trademarks they could name it Debian Vista!
*ducks*
(Just a joke, people. Relax.)
/*Heh, if it weren’t for trademarks they could name it Debian Vista!*/
Good one, but Vista is already out, and many sheeple are already enjoying their DRM OS Vista, while I’m still waiting for Debian Etch. If these delays keep up, they might force a number of Debian users who don’t have the patience to wait for Etch to be released over to other distros, to a more current, up-to-date Debian-based OS like Ubuntu.
If these delays keep up, they might force a number of Debian users who don’t have the patience to wait for Etch to be released over to other distros…
Or they could use Etch. Debian Testing is pretty stable. There’s a lot of package churn at times, but that can be offset by not running apt-get upgrade every day. I’ve been using Testing for several years (Sarge, then Etch) with no problems.
Granted, no one would want to do that on a server, and it is more work than tracking Stable. For me though it’s always provided a nice compromise of stable/cutting edge.
Why are you waiting for Etch? Go to the Debian website and grab one of the nightly builds right now. Then run apt-get upgrade at least once a week, and by the time Etch final gets released, you will have been running it for quite some time. No need to wait for anything!
This isn’t Ubuntu or Fedora, you know?
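In case anyone wonders what that looks like in practice, a rough sketch (the mirror below is just an example; pick one near you):

    # /etc/apt/sources.list -- track etch
    deb http://ftp.debian.org/debian etch main

    # then, once a week or so:
    apt-get update && apt-get dist-upgrade

dist-upgrade rather than plain upgrade, so that packages whose dependencies change during the churn still get pulled in.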
Time to get the money out of politics and free software.
Wheezy the Penguin is still available.
Seriously, “herding cats” is a tough job. I’m sure the Debian folks will get it done, and it will be worth it. Don’t forget the time from Woody to Sarge. Things ARE improving.
We can make this the year of Linux bugs now.
It’s difficult to attract new users to an operating system that is considered outdated when it’s released. Even on the server side there are some rather old packages that cause difficulties after too much time has passed.
And it’s too easy to say Debian is for the server. Debian itself sees itself as a general purpose distribution. As such it has to compete not only with other solid server distributions such as RHEL or SLES but also with shiny desktop distributions such as Ubuntu, SUSE, and Fedora.
Another way to look at Debian is as the reference implementation of an abstract system: the Debian System. As a reference implementation, it doesn’t focus on all the shiny niceties we’ve all gotten used to, but it works. Sun does this with their J2EE product; they release reference implementations.
Distros like Ubuntu, Linspire, and Mepis produce friendlier implementations of the abstract Debian System. This does not in any way detract from Debian; it is rather a testament to its usefulness.
Having said that, I find Debian to be a very usable system if you know what you are doing. If you don’t, it is best to use one of the friendlier implementations.
What exactly is happening at the moment? There are “nice” distributions that used to be based on Debian; well, they are now based on Ubuntu. So this “reference implementation” role is somewhat fading away…
Debian _is_ a very usable system. What I mean is: does it still attract new users? Isn’t it true that Ubuntu takes away a huge chunk, even now with Dapper for servers? And isn’t Ubuntu on its way to substituting for Debian (though that’s somewhat impossible, since it’s based on Debian itself)?
Well, if Ubuntu ever stops being based on Debian, then I would agree that Debian has a problem. However, I don’t see that happening in the near future. The fact that Linspire and Mepis are _indirectly_ based on Debian is to me a moot point. They are still implementations of the Debian System.
It’s hard to say anything definite of the usage stats of non-commercial distros. Nevertheless, it looks like most of Ubuntu’s users come from somewhere else, not from Debian. It’s also possible that some Ubuntu users want to check out the “mother-distro” once they’ve played around with Ubuntu for some time. And Debian has a solid (and well-earned) reputation as a very reliable server distro.
There are DistroWatch popularity stats that don’t show a decline in Debian’s popularity:
http://distrowatch.com/stats.php?section=popularity
And there’s the two-year-old Netcraft report that shows growth for Debian:
http://news.netcraft.com/archives/2005/12/05/strong_growth_for_debi…
And then there’s the recent news that Hewlett-Packard makes good money by selling support for Debian. This probably wouldn’t be possible if there weren’t a lot of people using Debian.
http://www.internetnews.com/dev-news/article.php/3661481
>Distros like Ubuntu, Linspire, and Mepis produce friendlier implementations of the abstract Debian System. This does not in any way detract from Debian; it is rather a testament to its usefulness.
True, but none of them makes use of the enormous work of stabilizing the codebase, as none of them is based on stable. Most of that work goes to waste, as serious users have to seek out updated versions of their target software, voiding any testing and integration effort the Debian developers did.
Maybe Debian should just ditch stable altogether.
The number would be lower if people didn’t file fake bugs like this one:
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=280987
That’s not a fake bug. Learn to read the whole thread before linking.
Build-Depends are tracked; they’re not enforced.
This means that without the bug report, the problem would be known but not fixed by the scripts.
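For anyone who hasn’t poked inside a source package: Build-Depends is just a declared field in debian/control. A made-up example (package, maintainer, and versions are hypothetical):

    Source: foo
    Section: utils
    Priority: optional
    Maintainer: Jane Doe <jane@example.org>
    Build-Depends: debhelper (>= 5), libbar-dev (>= 1.2)

The scripts can see when such a field is unsatisfiable, but, as said above, they won’t fix the problem on their own; that still takes a bug report.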
The article tries to make it look like Debian has a problem. Of course it has, but so do Ubuntu, Fedora, etc.
Other distros just release at a certain date, no matter the quality of the packages. Debian does not.
It matters to have both Debian and distros like Ubuntu for people who care less about the individual package quality and more about usability / ease of use.
> Other distros just release at a certain date, no matter the quality of the packages. Debian does not.
Fedora, RHEL, OpenSuSE, Ubuntu, Mandrake: all have delayed a release. One week, two weeks, sometimes two months.
Debian simply does not know how to manage a release.
Debian simply does not know how to manage a release.
I think you’re overlooking a couple of rather obvious problems in releasing Debian.
One problem Debian must face is that it’s one of the biggest distros out there, maybe bigger than any of the other distros you mention. Debian supports 11 architectures and possibly carries more packages than any other distro.
Another big problem is that Debian developers are volunteers who don’t get paid for their Debian work. This means that they cannot concentrate full-time, seven days a week to get the release out on time.
Now, put these two problems together: you’ve got this very big distro that needs a lot of complicated and synchronized work to get a release out, but the developers can only work on Debian when they’ve got some spare time in the evenings.
Sounds like Mission Impossible? Actually, they’ve done rather well so far, but there are still a couple of issues to be sorted out before they can release Etch. If people could just accept the idea that a good product is more important to Debian developers than releasing at a specific point in time, they might have a little more patience.
People like Steven J. Vaughan-Nichols can make fun of Debian, comparing it to Duke Nukem Forever and suggesting that it will never be released, but these snide comments will lose their entertainment value as soon as Etch is out. And if it’s a good release (I have no doubt that it will be), then that’s what people will remember.
Etch is in a usable state now. It will be even better once RC2 of the installer is released (pretty soon).
But even the daily builds work quite well.
This is bad news for everyone (like me) who believes in community Linux work. If you are a system administrator and your work depends on a Debian release to implement new features, like PHP5, then this is your worst nightmare.
Debian will soon be a hobbyist distro.
Long release times aren’t an issue for people who have chosen Debian (or RHEL, or SUSE Enterprise). If you want a stable distribution with a few more current packages, you can use a backport (best approach), pin based on Debian unstable/testing (not my favourite), or compile your own version (the universal fallback).
The only problem for corporate users is that without regular (even if long) release intervals, it’s not really possible to do any deployment planning. I seriously doubt that Debian will soon become a hobbyist distro, since it’s always been this way and has grown despite the irregular releases. However, the irregular releases do limit Debian’s market potential, especially if the Ubuntu Long Term Support releases happen regularly and with Debian-like reliability (only time will tell).
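For completeness, the pinning route is mechanically simple even if it’s not my favourite. A rough sketch (the mirror URL is just an example; stable = Sarge and testing = Etch at the moment):

    # /etc/apt/sources.list -- stable plus testing
    deb http://ftp.debian.org/debian sarge main
    deb http://ftp.debian.org/debian etch main

And in /etc/apt/preferences:

    Explanation: prefer stable by default
    Package: *
    Pin: release a=stable
    Pin-Priority: 700

    Explanation: testing only on explicit request
    Package: *
    Pin: release a=testing
    Pin-Priority: 200

After that, apt-get -t testing install somepackage cherry-picks a single package from testing while everything else stays on stable.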
If you want a stable distribution with a few more current packages, you can use a backport (best approach), pin based on Debian unstable/testing (not my favourite), or compile your own version (the universal fallback).
Yes, it makes life harder on both the desktop and the server when installing on new hardware that’s supported only in the latest kernels.
Even Etch wouldn’t recognize SATA hard disks and optical drives on newer motherboards late last year, when I was trying to install it.
The *buntus had no problems since they used a newer kernel.
It’s very hard to justify the extra time spent on kicking Debian into usable shape on new hardware when I can just pop in a *buntu disk and have a fully functional install half an hour later.
Etch currently uses the 2.6.18 kernel (I believe it still had 2.6.17 late last year). Now, the product in the *buntu family that’s comparable with Debian Etch (a stable Debian release) would be Ubuntu Dapper Drake (with “Long Term Support”), which uses the 2.6.15 kernel. So Ubuntu’s LTS release doesn’t really support your latest hardware any better than Debian Etch.
Etch currently uses the 2.6.18 kernel (I believe it still had 2.6.17 late last year). Now, the product in the *buntu family that’s comparable with Debian Etch (a stable Debian release) would be Ubuntu Dapper Drake (with “Long Term Support”), which uses the 2.6.15 kernel. So Ubuntu’s LTS release doesn’t really support your latest hardware any better than Debian Etch.
Well, I was mainly talking about 6.10 (Edgy).
But interestingly, when trying Ubuntu Dapper on this same Asus P5B board that Etch wouldn’t even touch, it correctly recognized the SATA drives. So they must have used a patched or somehow different SATA driver in Ubuntu.
I’m aware that there is a lot of info on how to get Debian installed on this same hardware.
But my point is that it takes a lot of effort and Linux skill (a major stumbling block for new Linux users), as opposed to just running the install disk and being done in about half an hour.
That’s why I think it’s important that Debian release more frequently, or at least update the old releases with new kernels (probably not trivial, but I can’t see why not).
If one has to go through a near nervous breakdown just to install Debian on a new computer, then sooner or later one will look for an easier alternative.
And very few new computer users will be willing to spend a few weekends reading through the Debian docs just to get the thing installed.
Well, Debian Etch should work all right with most of the hardware that is available today. But, as Mark Shuttleworth has pointed out, even Debian can’t be everything to everyone — and neither can Ubuntu.
I find it a good thing that there are plenty of Debian-derived distros, which offer alternatives when Debian doesn’t meet your needs. I’d suggest you try Sidux, a new distro based on Debian Sid. The latest Sidux release comes with the 2.6.20 kernel and has a live CD with a hard-disk installer, so you can test it before installing.
Here’s a short but informative review of Sidux:
http://www.kdedevelopers.org/node/2691
Just my 2 cents, but I suspect SJVN is trolling in this article. I think he is exaggerating Etch’s release-unreadiness by quite a bit. SJVN has already declared fulsome Ubuntu lurv and beat up Debian quite severely during the Firefox/Iceweasel affair. He is hardly an impartial witness. The Dunc-Tank business also doesn’t amount to much. The “go slow” faction, if such it is, seems to have been isolated to a small group of mostly French babies.
For all that, SJVN has put his finger on an important point, namely that Debian’s organization makes it very hard for the project to adapt and change. This is a serious challenge, because the Linux world is becoming hungrier and more competitive with every new year. Folks are not nearly so forgiving these days, because they don’t have to be, with so many other top-notch distros around for both server and desktop.
I’ve been using Etch for months and am typing this on it. Fantastic, so what is all this fuss about?
It is always funny to read the comments of desktop users in comparison to comments from system builders or even distribution builders.
Users complain about slow release cycles, huge delays, etc. in releasing a stable Debian, a distribution that only secondarily targets them. System builders are happy with stable releases every 18 to 24 months and cope well with delays. Finally, distribution builders are moving away from Debian because Debian’s development branches (the ones they would like to use to make a fresher impression on the community than Debian does) are moving too fast!
Linspire, Mepis & Co. base off Ubuntu because Ubuntu stabilizes a more interesting subset of Debian once in a while. Just recall the statements by Mepis’ chief developer, or check out what toolchain Edgy Eft is using; you will be surprised to note that on average it is older than Etch’s. While I think Mr. Shuttleworth had something different in mind, he was right when he said Debian can’t be everything for everybody. And to bring in another obscure reference, Debian is the Compiz of the distributions: well thought out, deliberate, and rock solid. I’ll prefer it in high-availability scenarios over Ubuntu anytime, just because it is so serious about its quality.
“Users complain about slow release cycles, huge delays, etc. in releasing a stable Debian, a distribution that only secondarily targets them.”
Huh? I somehow seem to have lost track of Debian’s target group. I always thought it was for users…
Sometimes it’s good to go back to Ian Murdock and read about his vision for Debian back then:
http://www.debian.org/doc/manuals/project-history/ap-manifesto.en.h…
” (…) It will eventually be distributed by The Free Software Foundation on CD-ROM, and The Debian Linux Association will offer the distribution on floppy disk and tape along with printed manuals, technical support and other end-user essentials. All of the above will be available at little more than cost, and the excess will be put toward further development of free software for all users. (…) The Debian design process is open to ensure that the system is of the highest quality and that it reflects the needs of the user community. (…) Involving others also ensures that valuable suggestions for improvement can be incorporated into the distribution during its development; thus, a distribution is created based on the needs and wants of the users rather than the needs and wants of the constructor. (…)”
It makes a great OS for servers, a great desktop by proxy, and to boot it’s an adventure in ad hoc democracy. What’s not to love?
Etch is perfectly usable now.
However, what is the situation re: security patches? I know that stable (Sarge) gets patches, and I thought Etch had some sort of security patch system in place.
But using Etch on backend Samba and Database servers has been working well for a while.
Apparently HP is making money from Debian.
http://www.internetnews.com/dev-news/article.php/3661481
“Who said you can’t make money by supporting free community-based Linux distributions?
HP is making $25 million by supporting the free Debian GNU/Linux distribution in what may ultimately turn out to be a challenge to commercial distributions from Novell and Red Hat.”
So it would be easy for them to pay some developers’ wages, and pay well.
These developers should be part of the main Debian setup and would not have to go into HP offices, etc. Think of them as “no-shows”, as per the Sopranos!
On the Debian site we could then see a listing of which staff are being sponsored by which company. We could also see which companies are supplying other support such as bandwidth or hardware.
Now, I buy servers regularly, and I can freely choose between Dell, IBM, HP, Fujitsu-Siemens, etc.
(Already I choose IBM over Dell because of their support for Linux).
I would buy servers from the company which is providing the most number of ‘no-shows’ to Debian.
After all, I am looking after my clients’ *long term* interests. They may pay slightly more for a server today, but if the server supplier is actively supporting Debian, then my clients will gain in the long term.
As a side note: if a developer feels they are being wrongly pressured by their “sponsor” to do anything against Debian ethics, they could soon let the rest of us know. I don’t think any company would want a bad rep with the Debian community!
Or couldn’t Debian hold the copyright on a “Certified for Debian Etch” badge?
Companies would then pay a small fee to be able to badge their servers/PCs.
This would mean that there would be a steady income stream coming back to Debian.
Also, for server suppliers like me it would make it much easier to decide on which servers to buy.
For instance, I’m not sure whether SATA support is built into Debian Sarge for Dell servers, but a badge on the server would mean I could buy the machine knowing that Debian would install easily.