“This is the first real release of Wayland and Weston. Wayland is the protocol and IPC mechanism while Weston is the reference compositor implementation. The 0.85 branch in both repositories is going to be protocol and interface stable. We have a series of protocol changes on the table before 1.0 but this branch marks a stable point before we jump into that.” Change is coming to the Linux world. And yes, I get the irony of using this particular icon, but it’s the closest I could find.
What does Wayland have that X.org doesn’t?
Edited 2012-02-11 00:16 UTC
Umm… It can render windows without any tearing! No tearing whatsoever! Ever!
…
*yawn*
Wake me up when they make a graphical server that doesn’t crash, please.
Brilliant… I don’t know whether to laugh or cry.
Are you serious?
Okay, so it’s a more efficient design if you want to use desktop compositing. So what? What’s the point of desktop compositing anyway, other than silly graphical effects and no tearing? Windows XP got by fine without compositing, and despite being “obsolete” and “outdated” it can do pretty much every actually useful thing that Windows 7 can do.
(XP *is* obsolete in terms of security, but that’s not my point.)
Anyway – the amount of rapid break-everything change going on in the Linux world is ridiculous right now, and it looks to me like Wayland is going to be more of the same.
P.S. That last statement is why FreeBSD is now my main OS.
Why can’t you update your code, or at least hire someone else to update whatever you need? That is open source. No need to worry about breaking anything.
I feel exactly the same way. The Linux desktop is in shambles at the moment. I’m staying with Ubuntu 10.04 LTS for as long as I can – hopefully this whole mess blows over by then.
But I am looking more and more at FreeBSD as well – if it works, don’t “fix” it.
On a side note: I am a GUI toolkit developer, and would love to see some improvements over X11. Things that Windows and Mac graphics systems have done for years. Performance in X11 is terrible, and the tearing is very bad in today’s world. If Wayland can improve on that – good for them.
With that kind of conformist mentality of “Let’s never make progress because we are too afraid of change”, it makes me think that you should go back to live in the Stone Age and use Windows 3.1, where there is no change or progress as you like it.
Sorry but you are a moron.
Edited 2012-02-11 07:44 UTC
[q]With that kind of conformist mentality of “Let’s never make progress because we are too afraid of change”[/q]
That’s not what I said. There is a difference between making changes to improve things and “let’s make changes that break everything just to be different – f*ck the end-users”. Unity, Gnome 3 and KDE4 fall in the latter category.
End-users don’t like change, so if changes need to be made (for the sake of improving things), those changes need to happen in small, digestible bits. Software Development 101.
I’d agree, if in fact we were talking about changes to an existing codebase. Unity, Gnome 3, KDE 4 and the rest were all *new* starting points. In those cases it’s best to just make a clean break and let the pieces fall where they may instead of compromising the design for the sake of legacy support.
What I do wish they had done, though, is maintain the old codebase in parallel to the new one for a while – preferably until early adopters like myself and the project coders have had the time to refine things with the quality-of-life features that people take for granted till they aren’t there. Unfortunately that’s not a practical expectation when every high-profile distro on the planet jumps on board with the new stuff as a lead feature.
That is exactly the main reason for Wayland: there are architectural problems in X11 that make it slow, mostly because it has to support network transparency. There have been multiple attempts at making X11 faster (XAA, EXA…), but none of them has given good enough results, because they could not address the fundamental flaws in X11.
You mention Windows XP as an example of a good GUI, when in fact the XP GUI is slow as hell and has leaks all over the place. What the hell are you smoking?
Edited 2012-02-11 06:47 UTC
Oh and Linux is not “breaking everything”, it’s growing, progressing, enhancing things, maturing, moving things forward. The community and development is alive, active and healthier than ever.
Well, unlike your BSD that doesn’t even have support for KMS.
Edited 2012-02-11 07:23 UTC
There’s experimental support for KMS with the Intel and Radeon drivers. Not sure if it’s been committed to the main subversion repo or not, though.
That’s good to know, sorry about my tone, but it pisses me off when somebody says that Linux is breaking things, it’s not, it’s making progress.
Edited 2012-02-11 09:18 UTC
Well, I’d say that in the software world, breaking stuff and making progress are far from mutually exclusive, and that achieving the latter without the former is an art.
Edited 2012-02-11 11:20 UTC
Indeed.
Get your head out of the sand, dude! Unity broke so many applications. Unity in general is so much more limiting than Gnome 2.x – I can’t even set my font anti-aliasing settings in Unity like I could in Gnome 2.x. KDE4 took about 2 years to be “usable”, but it is still lacking so many features that existed and were stable in KDE3. Gnome 3 – I don’t even want to go there.
Don’t believe me? Go to DistroWatch. Since Ubuntu changed to Unity, it has been dropping month-on-month. It used to be the number one distro for years, but since Unity, it is now number three and falling. This is one example of many where Linux for the desktop is in shambles at the moment.
This is pure nonsense, I’m afraid.
KDE4 is going from strength to strength, and it is a fully functional, powerful and polished desktop. It is miles in advance of where KDE3 left off.
How do people come up with this rubbish?
Edited 2012-02-12 00:50 UTC
No, there’s only experimental GEM support for intel GPUs. Nothing for Radeons.
The question was about KMS (Kernel Modesetting) support.
Open source Radeon drivers absolutely do support KMS.
http://www.x.org/wiki/RadeonFeature
Wow… Try to keep up… The discussion was about KMS on FreeBSD, specifically, which does not exist at all for radeon hardware.
You know, neither Windows nor OSX has support for KMS and they do just great.
OpenBSD is an operating system in its own right and it doesn’t have to be a Linux clone or even a Linux copycat.
What the hell, Android doesn’t have KMS and it’s Linux.
You don’t have to support KMS, udev, upstart or PulseAudio to have a fully functional OS.
KMS is coming to the *BSDs (including OpenBSD), as it makes a hell of a lot of sense. Besides the usual features of KMS mentioned, it is also a means to an end to deal with the security concerns OpenBSD has had with X.
Ahem, being able to run X as a normal user instead of root is a big improvement in security for sure, but ‘an end to deal with the security concerns OpenBSD has had with X’ is a very bad way to describe this: there is no ‘end’ in security
Windows has support for KMS, look at this:
http://en.wikipedia.org/wiki/Mode-setting#Microsoft_Windows
See also:
http://en.wikipedia.org/wiki/Mode-setting#Implementations
[quote]Oh and Linux is not “breaking everything”, it’s growing, progressing, enhancing things, maturing, moving things forward. The community and development is alive, active and healthier than ever.[/quote]
All fine and dandy, but it doesn’t change the fact that people who just want to use their desktops productively are greeted with a mixed bag at the moment.
Progress is fine and nobody is advocating maintaining the status quo indefinitely.
What is irking many people is that pre-beta software is thrown out as an actual production release. We’d like a little stability and 70% of the features to be finished before we are subjected to project Y as the first guinea pigs.
Nobody uses desktops to gawk at pretty pictures and go Oooh and Aaah.
I believe “even” chimpanzees tend to be taken by experiments (quite possibly using computer video output, nowadays) showing them… pretty pictures (or moving images) & “Oooh and Aaah” – also, I hear such are a major part of human internet activities and, generally, audiovisual storage and playback (often, nowadays, using computer desktops of various kinds).
Smart ass comment. Nicely done.
But seriously, pretty looking desktops are only acceptable when they work as intended, are well rounded and finished products. Not what we get these days as “release ready” desktop distros.
Back in the early 2000’s it was halfway acceptable to get desktops that were a bit flaky and under featured. In 2012 it simply isn’t acceptable to release alpha software and expect people to accept experimental bananaware as the final product.
Release a preview if testing is necessary. Most people will try it out and hopefully report back.
The opportunity you presented was irresistible (I’m sort of surprised it wasn’t on purpose, some variant of sort of ~”trolling…” – but “…for funnies”)
And yeah – here I am, basically using a desktop as plain, or even more so than win95 one (also because I tend to just turn off desktop icons).
I guess it might be also a disconnect between working on the software (as devs do) vs. with the software (users); and/or tediousness of polishing, vs. fun of introducing “exciting” new things. And the main issue with the latter: the overall advancement (feature-wise, etc.), for many categories of desktop software, tends to be in or near “good enough” for some time now; chasing “progress” is not a major issue (lets see: I have right now opened the browser obviously, file manager, IM, ~notepad thing, coincidentally also audio, video and picture openers ;> – and IIRC around half does have newer versions but I don’t really care about updating those …over the last few years they tended to bring, well, nothing; browser is the main notable exception)
Edited 2012-02-11 21:33 UTC
I’m not offended. Your comment was fun to read and it was an eye opener of sorts.
Yes, people do use desktops to gawk at pretty pictures. Heck, I use the desktop to gawk at Youtube.
It’s just that I was so intently focused on the desktop itself, that I forgot that gawking at pretty pictures is broader than gawking at a pretty desktop.
But, but, but, but … that goes completely against the whole “release early, release often” mantra.
That’s the way it goes in the OSS desktop world. You get stuff for free but the price you pay is that occasionally you have to play tester.
As for Ubuntu, that’s really Shuttleworth’s pet project, where he wants to execute his dream UI paradigm disruptions. Fact is, it’s tablets that everybody’s crazy about now. So is Ubuntu. Wait for the first crop of decent x86 tablet hardware just waiting to get some Linux freshness. You’ll be praising the new UI.
If that’s still not your thing make Linux distro diversity finally work to your advantage and go somewhere else.
Wayland 0.85.0, KDE 4.0, you name it – perhaps it’s not so obvious to everyone that a dot-oh or an oh-dot means not so stable. This is open source software, developed in the open, where you can download it at any time. That doesn’t mean it’s a finished product.
If you’re downloading bleeding edge software while really you just want what works, don’t complain, change the part of the above that doesn’t do it for you!
Say what you like about XP and its GUI. Fact is that it’s still perfectly usable 11 years after release. And what’s more, the majority of current Windows software still works on it.
Compare and contrast to RedHat 7.1 which was released at broadly the same time. Impossible to get current software installed on it, and essentially unusable with the outdated Gnome 1.2 desktop.
True, but for all the wrong reasons. Vista was several years late, and was generally reviled by the market – so developers had to support XP for what seemed like eternity. And Microsoft kept extending its life, even bringing back marketing during the netbook surge.
Now that 7 has arrived and been met with a general sigh of relief by the Microsoft faithful, support for XP in new apps is fading away.
Finally.
Don’t expect Win 7 to run most new apps in 2023 – unless Win 8 is greeted with the same lack of enthusiasm as Vista was, of course.
I can’t speak for memory leaks as I am not a Win32 programmer by any stretch, but what in the world is slow about it? Especially on modern machines it is quite fast at drawing windows and widgets, and even on hardware of its first generation it is quite fluid and usable given enough RAM.
While it lacks full compositing support, in my experience (given identical hardware) it can keep up with unaccelerated Metacity, Kwin and Xfwm and in some cases renders smoother than all of them. Turn on compositing and the tables turn of course.
Don’t get me wrong, I’m happy to see XP give way to Windows 7 but to say it is “slow as hell” tells me you’re either running it on crap hardware with bare minimum RAM or you’ve got some serious issues with your installation.
OK, I am convinced I am not understanding properly, because to me you don’t make sense.
If you don’t want tearing in your graphical effects, just enable vsync. Done! Xorg with no tearing.
The same happens on Windows! It happens everywhere: tearing is a video card <-> monitor sync issue with the vertical refresh!
I am all for Wayland too, but let’s keep it fair!
As far as the “rapid development” of Linux goes, I am glad that it is so. But if there is one area where development hasn’t been so rapid, it is the graphics server: do I have to wait another 10 years for another graphical server to appear?
It’s not that black-and-white, sorry. X11 has a huge latency issue, which causes the “slow” feeling you get. The fact that the window frame and the actual application are two separate processes doesn’t help things either. This latency issue is very noticeable in X11. Simply take any complex dialog and drag the bottom-right corner of the window, resizing in both directions. The mouse cursor is always ahead of the window frame, and only then does the application try to catch up with the window frame. This causes lovely lag in screen updates. It is hardly noticeable under Windows, and not noticeable at all under Mac OS X.
The latency mostly has to do with interaction between the window manager and the application and, more importantly, that Linux apps tend to have dynamic layouts, which must be dynamically updated with a resize. Windows apps often don’t, or the layout is dynamic, but simple, requiring much less complicated algorithms for re-laying out the controls.
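A common toolkit-side mitigation for that resize lag – sketched here generically, not taken from any of the toolkits mentioned – is to coalesce pending resize events so the expensive layout pass runs only once, for the newest geometry:

```python
# Hypothetical event-queue sketch: coalesce pending resizes so an expensive
# layout pass runs once, at the newest geometry only. This is the general
# technique, not real Xlib or toolkit API.

def coalesce_resizes(events):
    """Keep every non-resize event, but only the newest resize."""
    kept, last_resize = [], None
    for ev in events:
        if ev[0] == "resize":
            last_resize = ev          # a newer resize supersedes older ones
        else:
            kept.append(ev)
    if last_resize is not None:
        kept.append(last_resize)      # lay out once, at the final size
    return kept

# Ten drag events arrive before the app gets a chance to re-lay-out:
queue = [("resize", 100 + i, 80 + i) for i in range(10)] + [("expose",)]
print(coalesce_resizes(queue))        # [('expose',), ('resize', 109, 89)]
```

With coalescing, the dialog re-lays out its controls once instead of ten times, which is exactly the difference between the frame chasing the cursor and keeping up with it.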
Can’t speak for OSX, but that kind of lag is absolutely noticeable under Windows Vista and 7, especially in Aero mode. In 7 it was bad enough that I turned on outline move/resize.
(Not noticeable at all in 2k/XP though, unless you’re using a Pentium II.)
IMO what’s more obnoxious on Linux is that outline move/resize is halfway broken. On any version of Windows, you can resize a window in outline mode and the screen will continue to update. On Linux, the entire screen will freeze, and so will audio for some applications – just to prevent artifacts from rearing their head (and they will, if you don’t freeze the screen!).
If that alone were resolved I’d be pretty happy. As I understand it though it’s a non-trivial thing to fix.
(And in that way I can see the argument for Wayland. It’s just that a) FDO developers have a history of horribly breaking desktop stuff every six months, and b) Wayland is Linux-only.)
Edit: that said, it works well enough and it’s not like I paid anything for it, so what the hell?
Edited 2012-02-11 22:00 UTC
The 10 years come exactly from the decision to keep investing in X11’s crusty codebase and, instead of letting this venerable old man pass away with decency, turning him into a flashy Frankenstein.
Not sure if Wayland is the answer, but the only problem with the decision to switch is the timing: 5 years too late!
Wayland is a protocol that allows direct communication between applications and hardware. No window server is necessary, the window manager takes care of everything (but you can run an X server on top of Wayland if you need it, just like OSX does).
The answer to your question is that Wayland has actually less features than X.org, but the things Wayland can do, Wayland does better. Also, Wayland is lightweight enough to run on anything (smartphones, embedded systems…).
This sentence is misleading: I think that it would better to say that the window server and the window manager are merged in one process: the compositor.
As for the “lightness”: there are light implementations of X too, and I don’t think we have numbers to measure the difference – do you have some?
The real question should be, “what does X have that Wayland doesn’t?”
In that case, I would start with network transparency.
I’m not so sure I understand the point in Wayland myself. I originally viewed its inclusion with Ubuntu as just another way to distance themselves from all the rest and break compatibility with other distributions, but I think Fedora has also announced switching to Wayland eventually. Then again, this is Fedora I’m talking about… like Ubuntu, they’ve made a lot of decisions I didn’t agree with (or at least at the time–ie., KDE4, GNOME 3).
I think I read somewhere that Wayland is lighter and cleaner code than X (stripped of features, it better be, but give it time–software bloat will eventually kick in and shit on any advantages…). Aside from that, I recall that it’s supposed to feel snappier than X, which is not a bad thing. But I also read that it will support graphical effects with more responsiveness, which I’m not so sure is a good thing–I’m not too crazy about all the graphical crap they add to desktops these days, and in this way Wayland has the potential to allow even more with less of a performance penalty.
Edited 2012-02-11 00:57 UTC
At first, I was not so sure, but I came to accept it.
I have one of those crazy, movie-like screen configurations, with my own WM to manage it in an efficient way, so I have had a taste of the X bloat mess. With both protocols, Xlib and XCB, it feels bloated and outdated. The API has been abstracted over and over, so at this point in time nobody interfaces directly with X anymore, because it is almost impossible. Qt has 3 backends (GL, Raster and X11) because pure X11 performance was terrible. GTK is similar too (Cairo backend, Gallium state tracker and X11).
Then there is the whole design/extension mess. X was never supposed to be composited and accelerated. It was designed for network transparency. It was designed so the server and client need not be on the same computer. So anything that fails to fit that model – that is, 99.95% of X installations – is suboptimal.
Then there is the fact that the whole network transparency and core X protocol is outdated. A lot of Ph.D. theses and research have been produced in the last 4 decades. Even Microsoft rewrote the GUI part 5 times in that timeframe. Just look how efficient SPICE, NX, Citrix and VNC are at network transparency compared to pure compressed X11. Being network transparent but locked into a terrible protocol is now a drawback for Xorg. They can’t innovate anymore. They have been trying to add support for multitouch for 6 years now and it is still unmerged. Any major refactoring of the protocol is a no-go, so they are stuck in unnecessary design loops to achieve anything. If they could simply go back and update the protocol without breaking compatibility with other versions of X (because the versions of the server and client can differ), it would be easier, but they can’t.
As for compositing: many of you think of it as desktop effects, but you are wrong. Compositing is not about desktop effects. It is about having every client in its own framebuffer. All those framebuffers are then piled up, layer upon layer, on top of each other. Modifiers can be applied to individual framebuffers for effects, but that is not the point. The point is having the most optimal rendering pattern and having the compositing done on the GPU, as that is more efficient. Apple did it first, and it made Mac OS X feel snappier than Windows. Windows then did it for Longhorn, not to copy OSX, but so as not to fall behind. They knew they could not stay on the old double-framebuffer path, as it was ignoring innovation. Novell and RedHat both knew that too and tried to “fix” X. Novell used XGL, a whole server on TOP of X11, to do it in a clean way. It was snappy and implemented compositing with the GPU in mind. But it broke so many things that making it work at all was very hard; the drivers were not ready at all. AIGLX, the RedHat way, is an X extension and is less groundbreaking, more iterative. However, it inherits X11’s limitations. So 5 years later, I can say both solutions failed.
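A toy illustration of that “every client in its own framebuffer” idea (names and sizes invented for the example; no real compositor API is used): each window draws into a private buffer, and the compositor merely stacks the buffers back-to-front, so one client repainting can never corrupt another’s pixels.

```python
# Toy model of compositing. Each client owns a private framebuffer; the
# compositor stacks them onto the screen bottom-first (painter's algorithm).
# Purely illustrative - not any real compositor's code.

def blank(w, h, fill="."):
    return [[fill] * w for _ in range(h)]

class Window:
    def __init__(self, x, y, w, h, glyph):
        self.x, self.y = x, y                        # position on screen
        self.buf = [[glyph] * w for _ in range(h)]   # private framebuffer

def composite(screen_w, screen_h, windows):
    """Stack per-window buffers onto the screen, bottom window first."""
    screen = blank(screen_w, screen_h)
    for win in windows:                    # back-to-front order
        for row, line in enumerate(win.buf):
            for col, px in enumerate(line):
                sy, sx = win.y + row, win.x + col
                if 0 <= sy < screen_h and 0 <= sx < screen_w:
                    screen[sy][sx] = px    # topmost window wins
    return ["".join(r) for r in screen]

wins = [Window(0, 0, 4, 2, "a"), Window(2, 1, 4, 2, "b")]
for line in composite(8, 4, wins):
    print(line)
# aaaa....
# aabbbb..
# ..bbbb..
# ........
```

Note that window “a” never has to know that “b” overlaps it; the overlap is resolved entirely at composite time, which is exactly what lets the GPU do the stacking.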
Wayland is -designed-, and not just patched over and over to the point where there is no design left, like X. It implements backward compatibility in a safe way, doing what Apple has done for years. It does not have to keep compatibility with old versions of itself, so the developers are free to fix design issues instead of working around them. It is mostly modern, and both Qt and GTK now have the necessary backend abstraction to implement it in a clean way, not in a hardcoded way like X11 in GTK2. It removes many unnecessary layers between the application and the screen. It moves most of the job into the kernel, whereas X11 was designed as a userspace application, so it is more efficient than the constant mode switching required by a driver-based userland application (see the LWN driver handbook). And finally, it is more secure and less leaky for the years to come.
As for the point made by some, “give it time and it will bloat”: you are right. But once that is the case, we can write a newer one based on newer technology, fitting the use cases of whatever year it is written in. We won’t be stuck with one protocol for another 30 years. Microsoft proved you can rewrite a graphics layer without too many issues. Some Win9x apps did not work in XP and some WinXP apps did not work in Vista (and not all Vista apps will run “native” on Win8, but that is another story). But overall it went well, prevented a design mess, improved performance and allowed new technologies to take advantage of the progress made. Now that we have the abstraction to do it, I think it is the way forward.
Technically (compositing as ~”clients in their own framebuffer”, handled efficiently by hardware), Commodore essentially did it (first?) with the Amiga – most immediately experienced in how you could smoothly pull up & down* another virtual desktop, or fullscreen application, even one in a different resolution or colour palette.
And yeah, it probably contributed to Amiga GUI feeling quite snappy – though evidently it didn’t really inspire anybody, then, to catch up.
*I seem to remember some specific name for this effect, but couldn’t quickly find it when scanning over Wiki Amiga-related pages – oh well, still, some relevant links out of it:
http://en.wikipedia.org/wiki/Virtual_desktop#Amiga
http://en.wikipedia.org/wiki/Original_Chip_Set#Blitter (and “Uses of the Copper” just below)
Edited 2012-02-11 04:32 UTC
God damn, is it even possible to be modded above 31? But really, that was an interesting post with tons of information. In fact, if I was able to (I already posted in this topic so I can’t), I would mod you up myself… that is, if you haven’t already been modded up like hell to begin with.
Edited 2012-02-12 06:16 UTC
[quote]Just look how efficient SPICE, Nx, Citrix and VNC are at network transparency compared to pure compressed X11.[/quote]
VNC does not have the features of X11. You can not export a DISPLAY and have different windows running on different machines. VNC is only for remote desktop.
NX runs on top of X11.
I believe X11 makes sense for servers and HPC, which is where it shines when compared to Windows and MacOS X.
There is a per-window VNC plugin for Mutter (Gnome 3) called Xzibit, created by Collabora (gstreamer, telepathy) for telepathy. But that is not the point. I was talking about networked efficiency: the number of bits per second it takes to make the client/pixmap usable over a network. Sorry if that was not clear in the first post.
Frankly your post seems to be a “fanboy” post!
For example, you state that at one point the drivers weren’t good enough for OpenGL HW acceleration under X. This is correct, drivers were bad (and some still are), but Wayland with the same drivers would have had issues too!
So how is this relevant to the X vs Wayland discussion?
Also about X’s (lack of) evolution: what you imply is that it is a technical issue for X, but IMHO this is a *social* issue (lack of interest, lack of manpower), and I have proof: toolkits are taking a loonng time to switch from Xlib to XCB.
So there is a risk that Wayland will have exactly the same issues as X once the novelty factor wears off.
(Sorry for another long post like this)
>For example, you state that at one point the driver
>weren’t good to have OpenGL HW acceleration for X, this
>is correct drivers were bad (and some still are), but
>Wayland with the same drivers would have had issues too!
>So how is-this relevant to the X vs Wayland discussion?
The point was that X was not developed with OpenGL, multitouch and compositing in mind. That is simply because they did not exist at the time. If you have ever taken an advanced software design/architecture class in college, you can probably figure out the kind of bending that was necessary to make the old design support the new technologies. Wayland does use the same drivers, with the same bugs – I give you that. But it is not 2006 anymore; the bugs that prevented compositing/XGL from working have mostly been solved.
>Also about X (lack of) evolution, what you imply is
>that it is a technical issue for X, but IMHO this is a
>*social* issue (lack of interest, lack of manpower),
>and I have a proof: toolkits are taking a loonng time
>to use XCB instead of XLib.
First, let me say that XCB is not the solution. (We at) AwesomeWM were the first WM to migrate to it. And honestly, it is not that great. We have our own set of problems and trigger a lot of memory leaks in X. The main Awesome developers are mostly fed up with the X protocol. See http://julien.danjou.info/blog/2010/thoughts-and-rambling-on-the-X-… (disclaimer: it is -not- me, and he is one of the XCB devs). So I don’t think it is a social issue anymore. The X protocol is frozen for compatibility. It can be extended, but not fixed. When, at work, I have to connect to some of our clients’ servers, some running something as old as RedHat 9 or CentOS 3, I can still open an X11 application over SSH. While that is great, it just shows how unchanged the protocol is. In theory, I could still open a 1990 Motif application running on a mainframe.
See how other protocols/technologies handle that.
-Udev supports only a small range of kernels for each release. You cannot use older or newer ones.
-Same for kernel modules. The API is not frozen. If it can be improved, it will be, no matter if it takes minor incremental changes to “drivers”.
-Microsoft has backward compatibility for Win32, but they are not happy with it. Newer toolkits, such as .NET, have had minor revisions of the API over the years. Even C# still has some breaking changes from version to version.
-Android uses “API levels” and market-share tracking. As of now 90.05% of Android devices use 2.2 or higher, so everything that was deprecated before 2.2 (API level 6?) can now be safely removed from the main API, even if the OS still supports it for a bit longer (to allow old apps to keep running).
-GNU/Linux has had its own way of doing things for years. When a tech is superior to the old one, it is replaced. Sometimes it is rough, like UDEV->HAL->DeviceKit->UDEV+UPower+UDisk or OSS->ALSA->Pulse (talking about the frontend API, I know Pulse uses ALSA as backend) or KDE1(Qt)->KDE2(Qt2)->KDE3(Qt3)->KDE4(Qt4)->KDE5(Qt5) and Gnome 1->2->3.
So instead of bloating an aging technology even more, the X/Wayland devs are abstracting the old codebase to run on the new one. Everything that used to work will still work, but on a newer codebase designed for the 21st century.
As a consultant, I often see COBOL applications written years before I was born. These days, those organizations (both public and private) use many layers of connectors to access the data from other systems. They would want to add features, but they know they can’t. The cost of moving to newer technology is now too high. They will have to, and they are aware of it, but as profit is the real boss, they avoid it at all cost, adding tons of “one time” fixes and support costs to keep the old stack running. This includes the connectors, new mainframes, emulated tape devices, aging COBOL programmers, technical limitations inherited from avoiding new features, and a heterogeneous technological park (I am not saying homogeneous technology is good, it is not, but again, that is another story).
This analogy can be applied to X. Sometimes, “fighting on” is not a long-term solution.
There are several points to consider: performance, tearing, “snappiness”, network transparency.
*Performance: if you have a system where your application can use hardware acceleration to render itself, then there should be a gain in performance; otherwise the performance will be the same.
*Tearing: there are different ways Wayland can be used:
-client-only rendering (which is what the Wayland devs encourage): this setup should eliminate tearing, but note that in some cases a client can take some time to react to an iconify/kill button.
-server-side rendering for the decoration (which is how KDE plans to use Wayland at the beginning): there can still be tearing, but even if the application is busy you can iconify/kill it (I don’t think resizing is possible).
*Snappiness: I think the Wayland design creates fewer IPCs, so the result should be a little snappier, but you won’t have true “snappiness” until all the applications use a multi-threaded design a la BeOS/Haiku.
*Network transparency: here Wayland will probably be a bad thing, as it doesn’t bring anything interesting in this area, except new bugs (new code) and reduced performance (because it adds another layer).
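The tearing point above comes down to client-side double buffering, which can be sketched in a few lines (a generic model, not Wayland’s actual buffer protocol): the client draws into a back buffer, and handing the finished frame over is one atomic swap, so the display side never observes a half-drawn frame.

```python
# Minimal double-buffering sketch. Drawing only ever touches the back
# buffer; a completed frame is handed over in one atomic swap, so a reader
# of the front buffer can never see a half-drawn frame. Generic technique,
# not the real Wayland API.

class DoubleBuffer:
    def __init__(self, size):
        self.front = [0] * size   # what the "display" scans out
        self.back = [0] * size    # what the client draws into

    def draw(self, values):
        for i, v in enumerate(values):   # may take arbitrarily long...
            self.back[i] = v             # ...but never touches the front

    def swap(self):
        # Atomic from the display's point of view: one reference exchange.
        self.front, self.back = self.back, self.front

fb = DoubleBuffer(4)
fb.draw([1, 2, 3, 4])
print(fb.front)   # still the old frame: [0, 0, 0, 0]
fb.swap()
print(fb.front)   # the complete new frame: [1, 2, 3, 4]
```

With a single shared buffer, a slow `draw` racing the scanout is exactly what produces a torn frame; the swap removes that race by construction.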
If this is true then I’m all for it. I’m not saying we need graphical effects whizzing around all the time, but for portable devices like tablets it makes sense in order to compete with stuff like the iPad. I wonder if Nokia is still considering a MeeGo tablet… Qt5 will support Wayland.
The whole point of Wayland is to do less than X.org
http://ompldr.org/vY3IyaQ/wayland-icon-with-white-bg.gif
Not a good looking icon.
However, I am one of the people who want to see the downfall of X.
Great NEWS.
Edited 2012-02-11 01:14 UTC
Do you work at Canonical by chance? Or are you an Ubuntu developer or something?
I think the Wayland logo is great, you don’t like the icon or the Wayland logo? Sure the quality is degraded when converted to gif, but that’s not my decision, OSAlert seems to use gif as their icons.
I used this as the original image source:
http://upload.wikimedia.org/wikipedia/commons/9/99/Wayland_Logo.svg
OK, the original has some details that make it better.
For the person who asked if I worked for Canonical: I obviously don’t. Does Canonical want to see the downfall of X? Well, you know pretty well that when X crashes, you lose just about everything you were doing – period. That’s what I am talking about.
People keep coming up with this “network transparency” feature, but I don’t think it’s applicable or relevant for desktops.
Actually, I was being sort of sarcastic.
Well, true, but I honestly don’t remember the last time I’ve seen X crash… and whenever it did, I was doing something incredibly stupid, and yes I knew it, pushing the disgustingly under-powered machine (1.7GHz P4, 256MB RAM) far past its limits (working with extremely large astronomy TIFF images of insane resolutions). Honestly, in those circumstances, a crash is destined to happen.
However, it’s also true that when the OS crashes you lose all of your work. In this respect, all operating systems are vulnerable… and I’ve seen far more of these happen on Windows than I’ve seen X11 crash on Linux. Even Windows Vista, with SP2. So much for stability, eh? Windows XP before the first few service packs also crashed like crazy when trying to do something as simple as downloading torrents (apparently a shitty, buggy driver was to blame, because one of the service packs fixed it and the number of BSODs dropped). Windows 95 and 98… same, frequent crashes. And don’t get me started on Windows ME.
Why not? I have used it to breathe a bit more life into the above-mentioned computer with 256MB of memory. It eased the transition from the old machine, which had all of my files, to the new one, which had a dual-core 64-bit processor and 1GB of RAM. I ran OpenBox on the old machine with all of my files on it, but when I needed to run a memory hog of a program (I’m looking at you, Firefox…) I ran it from the machine with a gig of memory. They were both connected to my monitor, so I could use either one if I wanted. Eventually, once I got a new external hard drive and all of my files were migrated to it, I was able to retire the old 1.7GHz P4.
Edited 2012-02-13 08:16 UTC
I look forward to breaking one of my Arch or Fedora installs with it when it gets to the point of being a semi-usable day-to-day environment. I’m not sure if it’ll ever entirely displace X11, but it should be a fun ride to follow for those of us who like playing with shiny new stuff.
Edited 2012-02-11 01:45 UTC
Or you could play it safe…
http://sourceforge.net/projects/rebeccablackos/
Hannah Montana Linux? Justin Bieber Linux? What… but… wait… why… the hell???
Well, I wanted to make a Cannibal Corpse Linux distro but I’m too lazy.
Hey, I’d run it.
Edited 2012-02-13 08:49 UTC
Internet sarcasm is clearly out of control.
Well, it seems it originated on 4chan so the joke being lame beyond comprehension was inevitable.
Safe’s no fun, you learn more interesting things when the front falls off. That said, that’s likely the most terrifying thing I’ve seen since Rosie O’Donnell in Exit to Eden.
It is funny how previous articles about Wayland’s progress kept a relatively toned-down comment section, while on this article about a “serious” release all hell seems to have broken loose.
More on topic: I’m happy that the Linux world is trying to clean up its graphics stack by removing unused features rather than adding more and more. Simpler and snappier code wouldn’t hurt.
However, I also question Wayland’s current reliance on OpenGL and KMS. Knowing the sad state of GPU drivers on any minority OS, the fact that GPU drivers are in charge of implementing lots of stuff (including OpenGL) on Linux, and the wondrous performance of the software rendering fallback provided by Mesa, what exactly are they thinking?
Edited 2012-02-11 07:01 UTC
It is not 1999 anymore. All major video cards have and support OpenGL. Mesa removed all the older drivers not using KMS+Gallium as of Mesa 8, so those with strange old hardware will fall back to softpipe software rendering. The only real problem caused by dropping the old stuff is i810. That driver still had a lot of users.
For “minority OSes”, it is probably the end of the line. They will have to fork Xorg and Mesa 7.12 to keep the old free Gallium drivers, or they will have to port DRM, GEM, TTM and Gallium3D to their kernel.
It is progress. As I stated in my previous post, the current situation is a no-go: Linux is cornered by the X protocol’s backward/legacy compatibility. There is a trend in the dev community to do whatever it takes to solve the problem. Evolve or die, as Darwin said.
A few notes first:
1/ I was talking about drivers, not video cards.
2/ On x86, I consider that Linux counts as a “minority OS”, unlike on ARM platforms, where it is a first-class citizen.
That being said, here is my problem in a nutshell: on Linux, open-source video drivers work okay on hardware that’s a few years old, but on newer GPUs you are lucky if they are even able to set a screen resolution and draw 2D graphics to the framebuffer properly. Meanwhile, proprietary video drivers are not included in most Linux distributions, they are tricky to get working, and they risk breaking at every kernel update due to Linux’s notorious unwillingness to maintain a stable ABI.
The situation is so bad that when Firefox decided to blacklist Linux drivers with what they considered an utterly broken OpenGL implementation, not much working hardware was left. For another example, see a KDE advocate’s traditional answer to KWin performance problems, which basically boils down to “Linux graphics drivers suck”. More anecdotally, I am always amazed by how badly very simple OpenGL games run on the open-source Intel video driver, and by how often KMS/Plymouth breaks and leaves me at a console during OS startup, with the same driver.
So, what does Wayland hope to achieve by relying on such broken software? That drivers will magically improve due to user complaints? I think that after 4 years of KDE 4 and even more of Compiz, it is fair to say that this won’t happen. Is it a push to make Mesa improve their software renderer, then?
Edited 2012-02-11 08:43 UTC
As someone doing graphics-related development, I won’t touch Intel graphics cards unless I get paid to care about them.
After all these years, they still don’t manage to produce graphics cards with proper 3D support.
If we’re talking Intel hardware, weren’t the Sandy Bridge IGPs (HD Graphics 3000) supposed to be much better than previous generations, close to mid-range GeForce M parts in performance?
Edited 2012-02-11 08:43 UTC
On the marketing materials yeah.
If you check online game development forums, you will see that reality says otherwise, especially on the OpenGL side.
If you are using DirectX, then at least Intel provides the GPA (Graphics Performance Analyzer), but even there the graphics performance is just a bit better than what ATI and NVidia are able to achieve.
The main reason is the way Intel keeps trying to impose their graphics architecture, without admitting that they should actually change the way they design their GPUs.
Well, and they’re quite successful with that, no? (Don’t they ship a strong majority of GPUs? And I don’t really see them losing this position, or the opportunity to become more or less “the baseline to target” for at least most games within a few short years.)
At least their GPUs are definitely much more decent now (user-wise; it seems we can’t say that about the poor devs ;p ) than was the case just half a decade ago.
Edited 2012-02-11 15:03 UTC
If we restrict ourselves to the Windows world where their DirectX drivers are quite ok, then you are right.
Now, OpenGL support is a joke, from what I see on the OpenGL.org forums.
I guess many OEMs go for the integrated graphics solution because it is cheaper, especially given the tight margins they usually have.
So in the end it always depends on which type of consumers you as developer want to target.
The graphics situation is actually a lot worse on ARM than on x86. No X11 drivers, and the drivers that do exist (for Android) are proprietary blobs.
Very good point, I was only thinking in terms of market share, and that was short-sighted. Wasn’t someone working on making Linux compatible with Android drivers, though?
Well, one more reason to stop depending on X. I don’t mind the binary blobs as long as they work and don’t prevent me from upgrading my OS.
The way I see it they are targeting this squarely at the embedded market. Maybe including the next generation of tablets running ARM and OpenGL ES with completely different graphics chips. In those cases the drivers come with the device so it’s not an issue. Whether they are open-source or not, that I’m not sure of…
It seems there’s some quota to fill, and since this topic was and still is at the top of news list…
C’mon guys… OSAlert has got to have a new icon for Wayland.
remove this shit: http://www.osnews.com/images/icons/56.gif
Agreed. I’d also like to see Wayland have its own logo here on OSAlert, so I emailed the Wayland icon to Thom Holwerda, but he said they are working on a new version of OSAlert.
This was his reply:
Here is the icon I made:
http://ompldr.org/vY3IyaQ/wayland-icon-with-white-bg.gif
Original image:
http://upload.wikimedia.org/wikipedia/commons/9/99/Wayland_Logo.svg
I hope that Thom doesn’t mind that we are quoting him here.
Edited 2012-02-11 08:41 UTC
I just assumed the icon had something to do with running rings around X.
Wayland Toolkit Q&A
http://www.youtube.com/watch?v=WNXWT3ine7E
How-To Write A Wayland Compositor
http://www.youtube.com/watch?v=u6Jvdo55RUU
Edited 2012-02-11 08:52 UTC
The official ones aren’t out yet, so I’m glad Phoronix was at the Graphics track so I can watch some of them.
xforwarding: that’s one bit of tech people choose to ignore… and it’s a damn shame.
Once you’ve used it, it’s like a revelation.
I showed the little ones how to use it. This is not obscure tech just for the super geeks in the server farm.
But I guess it’s a small price to pay if you are ignorant of this tech. I’m sure the boys and girls will come up with something just as spectacular. OSS always does.
Wayland won’t kill network transparency, don’t worry about it.
I love network transparency too, but I’m not afraid; I know network transparency will still be there with Wayland. Also, the Wayland developers ARE former X.org developers, so everyone should know that they know what they are doing. It’s not as if they will take network transparency away from people.
Personally, I hope that network transparency will be available as a layer on top of Wayland, think about the possibilities that will bring: modularity, KISS, UNIX design, better/easier maintainability, competition and elegance.
Instead of having “one true way” of doing network transparency in the core, enable the same functionality as layers on top of the minimal Wayland API and enable competition that way, while at the same time retaining KISS/UNIX philosophy.
I think the Wayland future looks bright, let’s embrace and love it.
Edited 2012-02-11 09:44 UTC
My point is that modularity is good. Some people like to do the same things differently. Thus allowing flexibility, choice and modularity. I think Wayland will bring that.
Wayland will make network transparency better, it will take it to a new level.
Think about programming languages, for example: some of them have a small standard library and rely on third-party libraries for other tasks.
I hope Wayland will be modular like that too, I think it’s very exciting.
Thanks Wayland team for having the courage to try new things and take things to a new level.
Edited 2012-02-11 10:24 UTC
One thing that bothers me about Wayland, which incidentally was “done right” in X, is that Wayland assumes that 1 window is 1 process. While that’s a nice and easy model (just link to your graphics backend and you’re done), it doesn’t have to be the most efficient design.
Mind you, with the advent of multi-core CPUs, people are now going in the opposite direction. They are breaking software up into parts, similarly to how it was done in X: in parallel (separate client and window manager processes), and in a pipeline (client and X server processes). This was a very good design, and it had nothing to do with network transparency; that was just a nice side effect.
X has of course many problems, mainly the cruft it has accumulated over the years and its poor integration with the OS. Personally, I’d love to see an X12 protocol with features people actually need (a Cairo-like rendering API, audio, video codecs, camera support, perhaps uploadable “applets”, e.g. implementing a WM or toolkit widgets), with X11 compatibility provided by a root-less server.
How exactly does Wayland assume this? Where did you read that?
Good move. Issues like X.org’s video tearing are what make third-party developers run away from Linux. Nvidia’s drivers actually replace a significant part of X to make things work, and that’s because they are paid big $$$ by companies who use Linux for rendering. Other companies like ATI just don’t care and rely on what X provides, and you get nasty tearing effects and other nastiness. Then they throw the source and hardware specs to the community, just to show the world that the community can’t write good graphics card drivers without replacing at least a part of X either.
I would actually want to see a Wayland for audio too, and not PulseAudio which is yet another attempt to put plaster on top of something broken (ALSA).
If you don’t fix the graphics and audio stacks, real developers like Pinnacle and Adobe won’t come no matter how hard you whine in their forums. The only success story of Linux in the consumer space is Android, and that is because it made the brave move of providing a new audio and video stack.
Too bad all they put out is useless binary blob garbage. No thanks.
+1
Nouveau FTW.
“Too bad all they put out is useless binary blob garbage. No thanks.”
And the graphics card in your Linux box is of what brand, exactly? Nvidia’s are the only cards that work without issues, and the reason Nvidia won’t publish the source is so that purists can’t complain about how Nvidia’s drivers mangle their sacred X (I once came across a book by Graham Glass and King Ables that actually praises X, honestly).
Someday the Linux crowd will have to decide: do they want Linux to be a viable alternative to Windows, complete with a DRM system for Netflix, binary blob drivers, etc., or should Linux stay “pure” to the free-software/neckbeard ideals of the holy GNU, sacred X.org and divine Bash, where developers have to deal with bad APIs and a lack of SDKs? So far, we have distros like gNewSense on one side, Android on the other, and Ubuntu somewhere in the middle. A break-up between neckbeard Linux and consumer-friendly Linux should happen soon, for the better of both sides, because as Ubuntu proves, the two can’t coexist.
I have no problem with binary blobs, but I rather hope that support for DRM and legal corporate rootkits never shows up. As silly as it sounds, you have to fight back against some things, if only in a small way.
P.S. I do not pirate stuff.
Who cares about Netflix and DRM, all you need is: http://thepiratebay.org/
And quite frankly, when it comes to the blobs, the nvidia blob is a f–king piece of absolute shit. It doesn’t even have KMS, seriously.
The nvidia blob is a straight half-assed port of the Windows drivers to Linux, we don’t want that kind of bullshit.
Nouveau FTW.
Edited 2012-02-12 00:48 UTC
This is a stunningly bad appraisal of the situation.
Here is a graphics card hierarchy chart from Nov 2011. The cards listed at each level perform roughly the same; one wouldn’t normally be able to tell the difference in use.
http://www.tomshardware.com/reviews/fastest-graphics-card-radeon-ge…
At every level there are choices between nVidia and AMD/ATI; only at the lower levels are there also Intel choices. At the bottom of the page are listed some of the best-value buys at various levels, and Radeon choices dominate the listing. So from a purely hardware perspective, the best performance-for-money hardware at many tiers is an ATI card.
Because it supports Wayland, the best driver for AMD/ATI cards is the open source Gallium3D driver from Xorg. This driver has recently achieved OpenGL 3 and GLSL 1.3 compliance.
http://www.phoronix.com/scan.php?page=news_item&px=MTA1MDM
http://www.phoronix.com/scan.php?page=news_item&px=MTA1NTg
Performance tuning has very recently begun to land (first up is R600 Gallium3D tiling support)
http://www.phoronix.com/scan.php?page=news_item&px=MTA1MjY
The next major performance boost should be coming from HiZ.
http://www.phoronix.com/scan.php?page=news_item&px=MTA1NTE
I wouldn’t be too sure about this.
The first release of Wayland is due in the first half this year.
http://www.phoronix.com/scan.php?page=news_item&px=MTA1Mjk
Planning is in place to get Wayland and X to both work seamlessly on the same machine:
http://www.phoronix.com/scan.php?page=news_item&px=MTA1NTM
So a system with open source drivers will be all set.
Just use a system with an AMD/ATI graphics card (for performance and value-for-money) or an Intel graphics card (if you must), and you will be good to go. It will work very nicely out of the box with zero hassles.
Run this system with a personal-desktop focus, up-to-date KDE distribution:
http://distrowatch.com/?newsid=07089
http://distrowatch.com/?newsid=07092
http://distrowatch.com/?newsid=07028
Where is the problem?
Edited 2012-02-12 01:25 UTC
Well, the main problem points are performance and support for the latest standards, I’d say (OpenGL 3.0 is about 3 years old). If you want full performance from your card you’re basically forced to use the binary blobs; in my experience there’s no comparison.
The situation on Linux is really unfortunate: you either get to take the open source drivers, which have the best integration with the rest of the system, but suffer performance, or you take the binary drivers and suffer integration.
Between ATI and NVidia binaries, the NVidia ones are generally better drivers (I have Nvidia at work and ATI at home, I’ve used all combinations of the drivers at one point or another).
I hope there’s a path to use the binaries with Wayland, just in case the opensource drivers aren’t up to snuff by the time it is the de facto standard.
OpenGL 3 is indeed a few years old, but the open source drivers are catching up fast.
What Mesa Has Left With OpenGL 3, OpenGL 4
Posted by Michael Larabel on August 11, 2011
http://www.phoronix.com/scan.php?page=news_item&px=OTc3OA
All of the stuff under GL 3.0 has since been done.
GL 3.1 will require:
GLSL 1.40
Texture buffer objs (GL_ARB_texture_buffer_object)
Uniform buffer objs (GL_ARB_uniform_buffer_object)
GL 3.2 will require:
Core/compatibility profiles
GLSL 1.50
Geometry shaders (GL_ARB_geometry_shader4)
Multisample textures (GL_ARB_texture_multisample)
GLX_ARB_create_context_profile
GL 3.3 will require:
GLSL 3.30
GL_ARB_blend_func_extended
GL_ARB_texture_rgb10_a2ui
GL_ARB_timer_query
GL_ARB_vertex_type_2_10_10_10_rev
Performance tuning has begun, and tiling support has landed in master but is not yet in the production drivers.
HiZ is next in line for implementation. Between these two upgrades the performance of open source drivers will improve significantly.
http://www.phoronix.com/scan.php?page=news_item&px=MTA1MjY
http://www.phoronix.com/scan.php?page=news_item&px=MTA1NTE
They should make up a great deal of the gap to the performance of the closed binary drivers.
By the end of the year this will resolve itself. The path ahead for Linux graphics is clearly NOT via the binary drivers, but rather via the open source drivers.
Edited 2012-02-12 10:57 UTC
Catching up fast?
Maybe. OpenGL rendering is improving, but note that video cards do other things too: video acceleration, GPGPU.
Neither is ready in the open-source drivers…
OpenCL (GPGPU) support in the open source drivers is a work in progress. Some of the required components are done, but most have yet to be started.
Video acceleration works fine for the Intel open source drivers.
The groundwork for video acceleration using UVD in the Radeon open source drivers has been started, but the critical programming information has not yet been released. The work is held up by the lack of this information.
http://www.phoronix.com/scan.php?page=news_item&px=MTAwNzg
Is there support for suspending clients if the Wayland server goes away (as in the case of a crashed compositing manager, which now lives in the server process as far as I understand)? E.g. the client-side library could handle this and re-establish the IPC once the server is restarted, either in normal or fallback mode.
The X client side has a bad habit of killing the app when the X server goes away, and I always thought that a protocol rewrite is the right moment to implement the ability to reconnect, instead of retrofitting it into Xlib (XCB), which is next to impossible.
Another issue: window decorations. Are they client-side now? Does the server take over window controls if an application becomes unresponsive? (Not a problem in X, because decorations are handled by the WM process.)
This was discussed at FOSDEM 2012.
Yes, the decorations are now client-side, as is the case on Mac OS X and Windows.
They are still thinking about what to do with unresponsive applications; there is no final decision yet for those cases.
About the reconnection issue: is it an X protocol issue, an implementation issue, or an application issue?
Probably a problem at many layers. The client-side problem is, I guess, in Xlib (XCB), which asserts when the server closes its sockets (not sure, but I suppose it works that way). These omnipresent libraries have some expected behavior, and changing that to introduce a suspend in the X communication would be problematic or complex.
The protocol? Extensions have worked well in the past for fixing problems. Basically, only the subset commonly used these days would need to be supported (a client-side drawing “mode”), so any ABIs not compatible with restarting could be avoided or extended as needed.
Then there is also state on the server side that would have to be renegotiated if the server is restarted, like where the shared buffers are.
I suppose that Wayland is now a better place to land this kind of thing than the very invasive changes it would require in X (plus the headache of app compatibility, which would be selective). I’m not sure whether it has been planned or discussed for Wayland, however.