Pretty much for my entire career in Linux USB (eight years now?), we’ve been complaining about how USB device power management just sucks. We enable auto-suspend for a USB device driver, and find dozens of different USB devices that simply disconnect from the bus when auto-suspend is enabled.
For years, we’ve blamed those devices for being cheap, crappy, and broken. We talked about blacklists in the kernel, and ripped those out when they got too big. We’ve talked about whitelists in userspace, but not many distros have time to cultivate such lists.
It turns out it’s not always the device’s fault.
Fascinating bug.
The temporary fix is apparently for Linux to wait 20ms by default instead of the previous 10ms, but xHCI is supposed to emit an interrupt when the device is actually ready, rather than the host having to guess at a delay. What struck me as odd is that this interrupt apparently goes completely ignored in Linux’s USB stack, even though it’s mentioned in the specs. Why? What’s the rationale for implementing resume wrong in the first place? Will it actually be fixed properly now that it’s out, or will the devs just stick to the 20ms workaround?
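For illustration, here’s a minimal user-space sketch in C of the two approaches: sleeping a hard-coded delay and hoping the device is ready, versus blocking until the controller signals readiness. This is only an analogy, not actual xHCI driver code; fake_controller, port_ready_event and the 14ms recovery time are all invented for the sketch.

    /* Sketch: fixed-delay resume vs. event-driven resume.
     * Build with: cc resume_sketch.c -o resume_sketch -lpthread */
    #include <pthread.h>
    #include <stdbool.h>
    #include <stdio.h>
    #include <unistd.h>

    static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
    static pthread_cond_t  port_ready_event = PTHREAD_COND_INITIALIZER;
    static bool port_ready = false;

    /* Stands in for the xHCI port-status-change interrupt: the
     * (simulated) device takes 14ms to recover, not 10ms. */
    static void *fake_controller(void *arg) {
        (void)arg;
        usleep(14 * 1000);
        pthread_mutex_lock(&lock);
        port_ready = true;
        pthread_cond_signal(&port_ready_event);
        pthread_mutex_unlock(&lock);
        return NULL;
    }

    int main(void) {
        pthread_t t;
        pthread_create(&t, NULL, fake_controller, NULL);

        /* Old approach: wait a fixed 10ms, then touch the device. */
        usleep(10 * 1000);
        pthread_mutex_lock(&lock);
        printf("after fixed 10ms wait: port_ready = %s\n",
               port_ready ? "true" : "false (looks disconnected!)");

        /* Event-driven approach: block until the controller says so. */
        while (!port_ready)
            pthread_cond_wait(&port_ready_event, &lock);
        pthread_mutex_unlock(&lock);
        printf("after readiness event: safe to talk to the device\n");

        pthread_join(t, NULL);
        return 0;
    }

With the fixed delay, any device that needs even a few milliseconds more than the hard-coded wait looks disconnected; with the event, the host waits exactly as long as the device actually needs.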
What is kind of funny is that devices that somehow worked on other systems were dismissed as faulty. Usually, if something works for others but not on your system, the first suspicion should fall on your system. It could be that the device is faulty and the problems only arise in your context, but over my career I’ve found that in many cases it’s the other way around.
It is like bugs in compilers. I’ve come across some, for sure. But most of the time when I thought it was a compiler error, it was actually bad code on my side.
In general, I agree with you, but in practice, when it comes to USB, there are so few devices actually _truly_ following the specs that I can imagine things being dismissed, mostly for being USB.
Yeah, that may explain the attitude: when you get burned twelve times by bad USB spec implementations, you tend to assume the thirtieth time that it’s the same thing again. Isn’t that a well-known cognitive bias?
… and you just outlined a behavior that has been RAMPANT among *nix developers for DECADES now. You are quite correct — if the same hardware using software built off the same spec doesn’t work in one, but does in all others, the implementation should be the FIRST thing you look at.
I’d not be surprised if ALL the devices that have problems in Linsux, even when ALLEGEDLY supported, are victims of similar problems. Lord knows two thirds of the hardware I own is either crippled far below its full capabilities (video, for example) or doesn’t work at all (networking, sound, APM/ACPI controlled cooling) when trying to use any *nix as a desktop OS — which is only PART of why I consider ‘mainstream’ Linux and its BSD kin to be pathetically useless crippleware as a desktop OS.
You see the same thing in Web development, where people writing websites will blame the browsers for their own ineptitude… be it jumping the gun on specifications not even out of draft, using outdated methodologies proven time and time again not to work, slapping endless “gee ain’t it neat” crap onto a site before they even have the core functionality working, or worst of all refusing to pull their heads out of 1997’s arse, using things we’ve been told for fifteen blasted years to STOP USING!
Really, across the board in programming this is a rampant problem I’ve seen time and time again — where the people writing software don’t actually seem to take the time to understand the language they are writing in, the hardware specifications they are writing against, or even the point of the task they are trying to implement.
… and mostly the cause seems to be an apathy that finds its roots in laziness, comfort, and wishful thinking. Of course, when you dare to point these things out you’re an ‘alarmist’ or ‘unfairly harsh’ — reality is harsh, deal with it! Have a problem with that? Do the world a favor and go flip burgers for a living!
Helen: Daria, do you have to cast everything in such a negative light?
Daria: You mean the harsh light of reality?
deathshadow,
“I’d not be surprised if ALL the devices that have problems in Linsux, even when ALLEGEDLY supported, are victims of similar problems.”
Put this into perspective. The reality is that most linux users are NOT running on linux-certified hardware, yet many still have the expectation that linux should run as though the hardware were officially supported, never mind that the drivers are 3rd party code with no official support whatsoever. IMHO this is one of the bigger problems WRT linux drivers: most manufacturers aren’t cooperative, leaving 3rd party developers to reverse engineer the product.
“You see the same thing in Web development, where people writing websites will blame the browsers for their own ineptitude…”
Haha, sure, many exist who are incredibly inept. However, every experienced web developer will have faced the agony caused by quirks and inconsistent rendering between browsers, requiring asinine workarounds. It may be getting better, but things can still break due to browser problems.
Your experience does not match mine at all.
My hardware has been perfectly stable for the last 10-13 years.
Of course, I research what I buy thoroughly. Never had a problem. In fact, the driver situation for me is a lot better in linux than in any other OS.
I don’t have to hope that the manufacturer of an MFP won’t decide to stop supporting it, as HP did with the HP LaserJet 2480, where in Windows 7 64-bit you can no longer scan from the network.
I can list dozens of devices that have met the same fate.
It works beautifully and perfectly in linux. So I have had abandonware “drivers” in Windows time and time again, and ever-improving drivers in Linux. To me, the choice is simple.
So with an open spec and an open source OS, it took somebody who has been working on this code for the last 8 years to find this bug.
I will reference this article whenever somebody writes a comment about how “all problems would be fixed if MS opened their Office specs, NVidia opened their hardware designs, and browsers just followed standards”.
Software development is REALLY hard, and tiny things can go unnoticed for a very long time until somebody finally sees the light. I hope USB devices will soon work better on Linux than they do today.
No. It was just the attitude of snarky Linux developers.
Even a snarky Windows developer might want to read through the post before spouting ignorance and schadenfreude.
This snarky Windows Developer DID read through the entire post, and the comments. That is where he read about all the problems this is causing for users, how someone solved the problem by using a Mac where USB “just works”, and where someone posted that a 10ms timeout works just fine in the Windows driver.
This snarky Windows Developer also read about all the workarounds that have been discussed and deployed to fix this problem previously (blacklists that grew out of proportion, whitelists that wouldn’t be maintained).
I did enjoy the schadenfreude, although I don’t know why. Mostly I am happy this will (soon, hopefully) solve the problems of many end users.
P.S. I am just a developer that uses .NET; I am not “developing Windows”. I do have to defend Windows as a closed platform because I keep hearing “x is open source so everyone can look at the code and fix things, but they cannot fix this because it is closed”, which irritates me a lot.
But that’s true. Look, no one is saying “All bugs in any Open Source software will always be found and fixed.” The quote is “Given enough eyeballs, all bugs are shallow.” Notice that it requires “enough eyeballs”; i.e., in order to find bugs, people have to be looking at the code. Evidently, not enough people were looking at this particular code, so the bug wasn’t found. But it probably still has more eyes looking at it than a closed source application.
You can use this article. But only if you can then explain why keeping the source closed helps to avoid a situation such as the Linux USB bug.
Go on. I’m very interested.
I can use this article, with or without your permission and with or without explaining your false premise.
Code/Product quality has nothing to do with open or closed source. Problems don’t go away when things are open or closed. Problems like this go away when an expert tries very hard to solve a problem and eventually finds it and fixes it.
In this case the closed source vendors (OS X and Windows) read the same spec years ago but wrote better-working code.
The open source code and spec were available to everyone all along, but it took this long for the problem to be found and (soon, hopefully) fixed.
OS X’s USB stack is open source, so people are free to see what Apple did to get USB functioning the way that it does.
Apple Source Browser: IOUSBFamily
http://www.opensource.apple.com/source/IOUSBFamily/IOUSBFamily-560….
Alternatively, they didn’t write better code, but their quirks are better supported by hardware manufacturers.
This wouldn’t be the first time hardware manufacturers have designed hardware to work with a particular OS rather than follow a spec, with other operating systems having had to hack around it.
Mostly agree, although the interpretation of the relevant timeout was a 50/50 chance. The fact that Windows and/or OS X work better in these situations is not enough evidence to say they have better QA/testing (Windows probably does, Apple probably doesn’t). They could have just been lucky.
You are right about open source code not being any more stable or robust when initially written. The stability advantage comes from not having to reinvent the wheel, so that when a proper solution has been found, derivative works can benefit from it more effectively.
Now of course, this also means that bugs from sloppy code can spread more easily as well. To this extent, open source relies on emergence via selective processes more than on central authorities (which I personally think is a better approach overall). However, when dealing with specialized areas that are not duplicated, but are core to the underlying framework (such as kernel development), the notion of duplication and emergence as a means of feedback selection falls apart. I recommend “The Cathedral and the Bazaar”.
avgalen,
“Code/Product quality has nothing to do with open or closed source.”
This is correct.
“Problems don’t go away when things are open or closed.”
This is not necessarily true. You can end up “up the creek without a paddle” when you don’t have the source code.
As you know, open source doesn’t automatically solve incompatibilities and problems; someone qualified still needs to be motivated enough to actually do the work, and most users are not. However, when there is no source code, all users become 100% dependent upon the manufacturer, as opposed to being dependent upon the much larger pool of qualified developers that an individual or company could hire.
I’ve had at least three instances where I really needed a fix for proprietary drivers but the manufacturers just were not interested in doing it, and without the source code I could not do it either.
If you come from the windows world, then you could be forgiven for trivializing this dependency because all manufacturers explicitly support windows out of the box. You don’t really feel the pain of closed source drivers as much as those of us who use an open source OS.
Completely agree on this.
Also completely agree on this.
The reason I personally (as a developer) prefer open source is that it gives you the option to fix the code. With closed source, you don’t have that option at all.
Also, even if you can’t fix the code, it lets you see how the code reacts in certain cases, which could help you find a workaround.
Durrrr… except for the minor detail that he never actually made that claim in the first place. Do you have trouble with reading comprehension in general, or was that a deliberate attempt to set up a strawman?
Go on. I’m very interested.
No, IMHO, the problem was in the spec. It defined a timeout value that was interpreted as a minimum by hardware developers and a maximum by software developers. Reading the spec and the code side by side, you probably wouldn’t ever notice the problem.
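A toy sketch of that ambiguity (the 10ms figure is the one from the article; the names and slack values are invented for illustration):

    /* Toy illustration: the host reads the spec's 10ms as "the device
     * is ready after at most 10ms"; the device reads it as "I get at
     * least 10ms to recover". */
    #include <stdio.h>

    #define SPEC_RECOVERY_MS 10

    int main(void) {
        for (int slack_ms = 0; slack_ms <= 3; slack_ms++) {
            int host_polls_at   = SPEC_RECOVERY_MS;            /* "maximum" reading */
            int device_ready_at = SPEC_RECOVERY_MS + slack_ms; /* "minimum" reading */
            printf("device slack %dms: host polls at %2dms, device ready at %2dms -> %s\n",
                   slack_ms, host_polls_at, device_ready_at,
                   host_polls_at >= device_ready_at ? "OK" : "host sees a dead device");
        }
        return 0;
    }

Under the “maximum” reading, polling at exactly 10ms is always safe; under the “minimum” reading, a perfectly conforming device may still be recovering when the host gives up on it.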
This is something that could only have been caught by testing for this issue on a wide variety of hardware (which is how it was caught). This is one of the weaknesses of linux, especially desktop linux, IMHO. It still happens with Windows, but more rarely. My brother’s desktop runs Ubuntu because Windows isn’t stable on the hardware.
If there’s something so wrong with the hardware that current Windows releases are unstable, using Linux might still be risky… (for data integrity, for example).
zima,
“If there’s something so wrong with the hardware that current Windows releases are unstable, using Linux might still be risky… (for data integrity, for example).”
Maybe, but I wouldn’t say that’s a foregone conclusion. Microsoft sometimes breaks a lot of perfectly good hardware and drivers with each new operating system. XP->Vista, for example, was notorious for this. I skipped Vista personally, but with Win7 my webcam stopped working, as did a USB disk enclosure and my parents’ document scanner. Sometimes it really is a Windows problem with nothing wrong with the hardware.
For this particular case it’s a clear advantage for open source drivers. With FOSS we are not dependent upon manufacturers to update hardware drivers. Arguably they even have a *disincentive* to do so, EOLing perfectly good hardware to push new hardware sales.
It’s not so simple though. Many OSS drivers are half-baked and/or unmaintained. One example I have here is the Radeon R200 driver – people like lemur2 would tell you that it is the shining example of the FOSS ecosystem… but the truth is, it has been horribly broken in multi-monitor setups for a very long time. The same hardware works flawlessly under Windows, even though it’s supposedly abandoned.
Webcams are similar – I’m a sort of webcam collector (I have way more old ones than I need). And sure, they often work under Linux, but a) the support is typically, again, half-baked, and b) Microsoft made the situation much better through the requirements of the Vista-and-up logo programme: to get the logo, a webcam must support the generic USB video device class.
PS. Anyway, my post above was about something else – probably broken hardware which somehow manages to be stable under Linux… but it’s better not to trust it.
Replying to your comment in the BB10 topic, which has been tombstoned by the OSAlert CMS…
I don’t know of a networked SMS backup solution, but to back them up on an SD card, you can use SMS Backup & Restore: https://play.google.com/store/apps/details?id=com.riteshsahu.SMSBack…
For calendar and contacts, I use ownCloud on my home server, combined with the CalDAV-Sync and CardDAV-Sync apps from Marten Gajda. They are available on Google Play, and also on the AndroidPIT app store if you prefer to pay through PayPal like me.
Clearly you’re just a troll. Or a shill for Micro$$$$oft. Or you don’t understand the issue. Or you didn’t read the article. Or you’re using the wrong distro. Or you should submit a bug report. Or you should fix the problem yourself and submit a patch.
Hmmm, odd – none of the standard excuses really seem to apply here… not to worry, though, the gist is that open source is a magical cure-all and if you have an issue with it, then it must be your fault (somehow…).
Now that you have seen the error of your ways, you will no doubt become a life-long open source advocate. Because, after all, a barrage of talking-point personal attacks is obviously the most effective way to convince someone to change their position.
In response to the posts seemingly praising open source… It’s not exactly rocket science to figure out that some open source stuff works better and some closed source stuff works better, and it’s not the openness or closedness that makes that statement true.
If you work on anything that goes into the linux kernel, you will know that its open nature can also be its biggest drawback. There’s constant bickering about how something should be done, so much so that it drives off great coders and kills much more elegant approaches. Linux is riddled with compromise, poor approaches, poor implementations, backwards thinking, etc. The one good thing, however, is that you’re able to fix all the dumb crap yourself thanks to its openness. Ultimately both openness and closedness are blessings & curses.
How good/trash software is really depends on the decisions that have been made starting from the ground up.
If you think the same does not apply to closed software like Windows then you are mistaken.
It should be viewed as a property, not an ideal, and not something inherently superior for software progress.
If it were actually inherently superior, then there would not be thousands of abandoned open source projects all waiting for “many eyes” and “many brains” to work on them. Furthermore, there are tons of open software projects where bugs stay open for years even though the “many eyes” already found them, since there is only a limited number of volunteer developers available.
Anyways, paid eyes and paid brains beat volunteer eyes and volunteer brains. Open source can be useful, but then so can a paid tester whose job it is to spend 40 hours a week looking for bugs in proprietary code that the open source religion would find too boring to even glance at for 5 minutes.
It’s really the money that matters outside of software that hobbyists find interesting. I’ll take 5 paid developers over 100 volunteers any day of the week. I don’t care if the source is open or proprietary.
I agree with most of what you’re saying here about community development, but at the same time I think that you’re also making a mistake by unilaterally associating it with open source. Where does the open source definition (as provided here: http://opensource.org/osd ) state that developers of open source software should never be paid and/or well-organized?
The GNAT toolchain for Ada development ( http://www.adacore.com/ ) seems to be a good example of commercial open-source software development, IMO. You are encouraged to buy the commercial version through broader licensing options, beta access and much better support, while at the same time, the open source community gets regular GPL-licensed source code releases for uses such as independent code reviews or inclusion in Linux distribution repositories.
Of course, you may argue that this kind of business model only works for corporate software. Then again, even in the closed-source world, that increasingly becomes the only way to make significant money with software, as users become cheaper and cheaper in their software purchases in the emerging “app store” distribution model ($0.99, really?).
Neolander,
That’s exactly it. Source availability and commercial status aren’t mutually exclusive and we do find overlap.
ze_jerkface is right that there are many abandoned open source projects, however we should not jump to conclusions from that by itself without also knowing how many abandoned closed source projects there are. My hypothesis is that the abandonment rate is high for all small devs in general, since they have much lower odds of success than larger devs; however, I don’t even know if this is statistically true.
It would be really interesting to have concrete statistics about long-term success rates given funding, head count, technical aptitude, license, etc. as inputs. It would be sad, but unsurprising, if it turned out that all small developers had low potential across the board simply due to their small size.