“Chapter 5 of the FreeBSD Handbook provides an excellent overview for understanding and configuring the X Window system. Today’s article goes beyond the Handbook to demonstrate some of the cool things you can do with your FreeBSD system and other systems running X.”
AFAIK, r200 stands for R200-based cards, meaning Radeon 8500 to Radeon 9250, not Rage200, and similarly r300 stands for R300 and R400-based cards (Radeon 9500 to X800), not Rage300.
In case this helps:
http://dri.freedesktop.org/wiki/ATIRadeon
I’m no BSD expert, that may be how it’s designated internally?
Though… I’m sure there’s more that OSNews’ers can talk about than a small typo.
According to this guide my laptop’s Radeon 9000 Mobility card is a “rv250/M9” card. It doesn’t work with Compiz/Beryl/composite. I tried 3 different distros, they can’t handle the card for a 3D accelerated desktop, and it kinda bugs me.
I have the same problem on my laptop. Radeon mobility cards do not play nice with X.
“According to this guide my laptop’s Radeon 9000 Mobility card is a “rv250/M9” card. It doesn’t work with Compiz/Beryl/composite. I tried 3 different distros, they can’t handle the card for a 3D accelerated desktop, and it kinda bugs me.”
Same here. Radeon 9000 128MB (normal desktop agp variant). Does not work with any of the advanced graphics stuff in Linux.
The funny thing is that it doesn’t work with Compiz/Beryl whether you use the X.Org driver or ATI’s binary driver. A great mystery, because even less powerful cards work. It seems to be a bug somewhere in the implementation of Compiz/Beryl, because 3D games work fine, accelerated and all.
Are you using display :1 for Xgl instead of :0? This is a known bug with ATI cards…
I am not using anything, just the default stuff. I tried with Novell’s Linux too, which supposedly comes pre-configured for that stuff. Novell’s Linux says that my gfx card is not supported, while it naturally is, as it’s newer than some older supported cards (and not too new either). 3D apps and games work fine; only composite chokes.
The proprietary ATI drivers don’t support the composite extension, so you have to disable it in xorg.conf (yeah, I know, it doesn’t make sense). Also, when you start Xgl (ATI doesn’t support AIGLX), you have to make it use display :1 instead of the default :0.
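Roughly, the xorg.conf part is just this (a sketch from memory, so double-check the option spelling against your driver’s README):

Section "Extensions"
    Option "Composite" "Disable"
EndSection

The display :1 part comes from the fact that Xgl runs as a second X server on top of the fglrx-driven :0, so the compositor has to be pointed at :1.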
AFAIK you *have* to use the proprietary drivers, but I can’t say for sure. I can only say that I had to on my laptop (Xpress 200 chipset).
Conclusion: Nvidia has done a much better job with their Linux drivers than ATI. I have both an ATI laptop and an Nvidia desktop, and setting up Beryl on the latter was a real breeze compared to the former.
I’m not sure if these guides are valid and working anymore or not, but they at least suggest to me that it is possible to get it working.
http://www.ubuntuforums.org/showthread.php?t=219336
http://www.ubuntuforums.org/showthread.php?t=291464
http://lhansen.blogspot.com/2006/10/3d-desktop-beryl-and-xgl-on-ubu…
Pick a recent distro that comes with an xorg 7.x server with the necessary AIGLX/Composite support. Latest Ubuntu/openSUSE, for example.
I got Beryl running in no time on Ubuntu with the open source r200 driver (ATI 9000), and on openSUSE with the NVIDIA proprietary drivers.
Stay away from Xgl and the ATI proprietary drivers if you can; they’re too much hassle to bother with, but AFAIK they’re the only option with ATI Mobility.
Just add the additional repositories (yes, NVIDIA hosts a YaST repository for the latest 9xxx drivers), install the packages, and enable the relevant bits in xorg.conf. It’s all in the Ubuntu and openSUSE forums/wikis.
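For the AIGLX route, the “relevant bits” in xorg.conf are usually along these lines (a sketch only; the distro wikis are the authoritative source, and some guides add radeon-specific Device options on top of this):

Section "ServerFlags"
    Option "AIGLX" "on"
EndSection

Section "Extensions"
    Option "Composite" "Enable"
EndSection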
To my knowledge, the OSS/DRI ATI drivers haven’t incorporated support for XGL yet.
I haven’t tried it yet on my own ATI card (FireGL 8800), primarily because of that.
But let me be clear, that’s just what I’ve heard, what I thought I read somewhere. I could be wrong.
“According to this guide my laptop’s Radeon 9000 Mobility card is a “rv250/M9” card. It doesn’t work with Compiz/Beryl/composite. I tried 3 different distros, they can’t handle the card for a 3D accelerated desktop, and it kinda bugs me.”
Try the .debs by Shame (assuming you have Debian or are willing to switch). These work!
He is setting up repositories with the most up-to-date and tweaked Beryl debs. I don’t believe anyone over at Sidux.com has had a complaint… it just works.
I had compiz working on AIGLX on an ATI 9100IGP chipset on my old desktop, using the open source driver (in Mandriva, of course). Haven’t tried a mobility or desktop 9000 card.
With the proprietary driver you can only use Xgl. In theory both AIGLX and Xgl should work with the open source one, but AIGLX is probably a better bet.
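A quick way to check what you actually got (assuming glxinfo is installed and the log lives in the usual place):

% glxinfo | grep "direct rendering"
% grep -i aiglx /var/log/Xorg.0.log

You want “direct rendering: Yes” from the first and an “(II) AIGLX: Loaded and initialized …” line from the second; if the log shows AIGLX errors instead, that usually explains why compiz/beryl won’t start.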
I got it to work with AIGLX tonight. Xgl still doesn’t.
That’s rather ironic considering those who develop proprietary software like Xgl are always singing the praises of proprietary software.
Use the latest fglrx drivers if you can. I know you don’t have Ubuntu, but you can get the idea from here:
http://wiki.cchtml.com/index.php/Ubuntu_Edgy_Installation_Guide#Met…
fglrx users HAVE to use Xgl instead of AIGLX right now. If you were using Ubuntu, you could follow this:
Use the latest repo
http://justinconover.com/2006/12/15/ubuntu-beryl-update/
Then follow this, ignoring the repo.
http://justinconover.com/2006/12/06/ubuntu-beryl-ati-xgl/
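For what it’s worth, the Xgl part of those guides usually boils down to a tiny startup script along these lines (a sketch only; the -accel flags vary between guides, and beryl-manager is whatever launcher your packages ship):

#!/bin/sh
# start a second X server (Xgl) on :1, on top of the fglrx-driven :0
Xgl :1 -fullscreen -ac -accel glx:pbuffer -accel xv:pbuffer &
sleep 3
# point the compositor at the Xgl server
DISPLAY=:1
export DISPLAY
exec beryl-manager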
Something cool… hmm.
How about getting X.Org 7.x ported to FreeBSD?
That would be cool.
http://wikitest.freebsd.org/ModularXorg
Didn’t we have that in the list lately?
Patience, my friend.
Don’t know why, but it’s already in the works.
http://blog.xbsd.org/
I love her books and articles. I didn’t know about xwatchwin before.
Monitor Other Systems
Once you’ve configured systems to allow X connections from each other, it is trivial to monitor activity on other systems. I installed a watcher program on 192.168.1.1:
# pkg_add -r xwatchwin
Every time a window (or window manager) opens in X, it receives a window ID. In order to watch another system, you need to know the ID of the window you wish to view. As an example, if I want to watch 192.168.1.2’s KDE session, from 192.168.1.2’s GUI, I can determine the window ID by typing:
% xwininfo
xwininfo: Please select the window about which you
would like information by clicking the
mouse in that window.
If I click the desktop, this is the first line of output:
xwininfo: Window id: 0x1200008 “KDE Desktop”
In this case, 0x1200008 is the window ID and “KDE Desktop” is the window name; xwatchwin will accept the name. It is worth noting that default window names are easily guessable. This is the reason why we use firewalls and why X shouldn’t listen on the network by default.
I can watch everything that happens during that remote KDE session by typing one command on 192.168.1.1:
% xwatchwin -u 1 192.168.1.2 "KDE Desktop"
The update switch (-u 1) will refresh the display every second. Figure 2 shows the result.
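As an aside (this part isn’t in the quoted text): the “configured systems to allow X connections” step usually amounts to something like this on the machine being watched, 192.168.1.2, and it’s only sane on a trusted, firewalled LAN for exactly the reasons she gives:

% xhost +192.168.1.1

plus making sure the X server there wasn’t started with -nolisten tcp.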
But you can also use “xwininfo -all -root” to get a list of all IDs, so there would be no visual evidence of a sneaky sysadmin.
Mr Burns: “Exxccellent..” (rubs hands together, hehe)
Fun for me is when I don’t have to deal with garbage like xorg.conf at all. Well, I guess I’m different.
Xgl/AIGLX+Beryl is still experimental in many ways (though very usable).
That said, if you have a Nvidia card, you don’t need to edit xorg.conf (which isn’t “garbage” as you so eloquently put it).
But I doubt you’re really interested in contributing to this discussion in a constructive manner…
Don’t be a snob. Guy has a point.
Knowing xorg.conf shouldn’t be a requirement. Users shouldn’t have to deal with it unless things truly break. Users shouldn’t have to change “nv” to “nvidia” when they change drivers; the packagers/installer/whatever should take care of that.
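For anyone who hasn’t had the pleasure, the edit in question is literally one word in the Device section (the identifier name here is just an example):

Section "Device"
    Identifier "Videocard0"
    # the NVIDIA installer/package wants "nvidia" on the next line instead of "nv"
    Driver     "nv"
EndSection

which really is the kind of thing an installer could flip for you.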
There should be a PROPER XML version of the file, and an automatic detection mechanism for it. Manual editing should be possible, but desirable only in very special scenarios (and NO, TwinView isn’t very special) or for fixing breakage.
So… drop the snobbish tone, you’re not helping anyone here yourself. I just put a few thoughts up; if you disagree, or have reasons why it shouldn’t be as I said, feel free to respond, but keep the tone peer-level.
“Fun for me is when I don’t have to deal with garbage like xorg.conf at all. Well, I guess I’m different.”
You do… and you don’t. X.Org 7.3, which should be available around June next year, will use D-Bus to autoconfigure X.
There’s a fallback, hopefully? X.Org has never detected my SGI 1600SW flat panel correctly. I have to add a modeline to my X config every time I install.
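In case it helps anyone in the same boat, the edit is just a Monitor section entry; gtf(1) can generate candidate timings, though for an odd panel like the 1600SW the modeline documented for your adapter is the safer source (the identifier below is only a placeholder):

% gtf 1600 1024 60

Section "Monitor"
    Identifier "SGI1600SW"
    # paste the Modeline printed by gtf, or the one from the
    # panel/adapter documentation, on the line below this comment
EndSection

and then reference the mode by name in the Display subsection’s Modes line.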
“Fun for me is when I don’t have to deal with garbage like xorg.conf at all. “
I think it’s okay to have a user-serviceable configuration file for X. It serves (i) people who know what they’re doing and (ii) people who want to experiment and test non-standard things. I don’t want to rely on a fancy GUI frontend with blinking and squeaking buttons to change things in X, especially when they don’t work.
Just imagine you own a Sun fixed-frequency monitor. The system starts and you see nothing. You can only plug in the serial console, edit xorg.conf to enter the correct frequencies, restart X and, voilà, there’s your X. That was fun for me.
Or you have a couple of old PCs with 12″ monochrome monitors to set up in a psychological testing lab (where this hardware is still more than acceptable). Starting X at a default of 1400x1050 @ 85 Hz would blow up the monitors. That would be fun, too.
Autodetection already works (something like “X -configure”, if I remember correctly) and will create a working xorg.conf file. Then you don’t need to touch the file anymore, but if you want to, you can. With an “autodetect only” system you wouldn’t be able to, so that wouldn’t be any fun.
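For reference, the whole dance (run as root with no X server running; the first command drops xorg.conf.new into root’s home directory):

# Xorg -configure
# Xorg -config /root/xorg.conf.new
# cp /root/xorg.conf.new /etc/X11/xorg.conf

The second command just test-drives the generated file; if you get a working screen, kill it with Ctrl+Alt+Backspace and copy the file into place.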
Control should be where it belongs: in the hands of the people who know what they’re doing. If you don’t want to use xorg.conf, you don’t have to. The choice is yours. Having both possibilities (autodetect plus a serviceable file) is the best solution in my opinion.
“Well, I guess I’m different.”
I won’t say anything…
AIGLX+Beryl, which IMO is the way to go for the 3D desktop, has been a pain for me, especially on these chipsets.
I’ve had problems like the ones you will find on all of the forums.
1. White cube
2. Black borders
3. Delays in restoring objects
These are *only* fixed in the latest versions of:
X-org 7.2 rc2
Mesa 6.5.2
xf86-video-ati 6.6.3
…and others
which is pretty bleeding edge stuff, and they need to be compiled against each other.
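If you’re not sure which versions you actually ended up with, a couple of quick checks (standard log location assumed):

% Xorg -version
% glxinfo | grep -i "opengl version"
% grep -i "module version" /var/log/Xorg.0.log

The second shows the Mesa version baked into the GL stack when you’re on the open source drivers, and the last one pulls the version lines for the modules the server loaded.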
It still does not play nicely with MPlayer using Xv or GL output. It seems to affect gaming a little too; I can’t really tell, but gaming, which was always a little flaky, is worse. These problems have been enough for me to turn off the 3D effects for now. I already miss the ease of navigating the desktop that they offered.
I suspect that until xorg 7.2 is formally released (either this week or next week) and the latest combination of *stuff* is packaged properly and compiled against itself, the 3D desktop will not be stable on ATI/AMD cards, and I strongly suspect it won’t work seamlessly with other graphics-intensive applications until 7.3 is out in June.
I’m actually using that combination on Gentoo (plus a patch that fixes an AIGLX problem, see https://bugs.freedesktop.org/show_bug.cgi?id=8991), with a Radeon X550 (RV370); compiz/beryl+AIGLX works well, the transitions and the effects are smooth and pretty much every effect can be enabled.
Stability, however, is not so good: X randomly hangs, starts using 100% CPU, and the keyboard and mouse cease to work. The other processes continue to run and I can log in with ssh from another machine to reboot (X can’t be killed). I can’t find an obvious trigger, and sometimes I can work for hours without problems. I hope that in the next few weeks I’ll be able to run X in gdb or with strace to obtain some debug data and open a bug report against X or compiz.
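If it helps, the usual recipe for that from the ssh session is roughly this (1234 stands for whatever PID you find; the attach needs root):

% pgrep Xorg
# gdb -p 1234
(gdb) bt full
(gdb) detach
(gdb) quit

or, for a syscall trace instead of a backtrace:

# strace -p 1234 -o /tmp/xorg.strace

Without debug symbols in the server and drivers the backtrace won’t say much, so a debug-enabled build helps a lot.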
For the moment, I think we’ll need to wait at least until xorg 7.3. On my personal wishlist, I’d like to see EXA support for R300/R400 cards and the new DRI memory manager implemented.
I’m not impressed. Xnest, xwatchwin? Not to be a negative twat, but the stuff described in the article is old news and already worked nicely back in 2000 with XFree86 4.0.
OTOH, there might be something I’m missing. Please give me a friendly hint if I am.
I don’t remember seeing the author claim these functions were new to the product.
They may be old to you and me, but a new user isn’t going to find them unless (s)he goes looking, and if (s)he does (s)he’s probably not a new user anyway.
twenex: I don’t remember seeing the author claim these functions were new to the product.
They may be old to you and me, but a new user isn’t going to find them unless (s)he goes looking, and if (s)he does (s)he’s probably not a new user anyway.
Yeah, you’re right. The fault lies with my expectations. When I read the headline I expected something “whoa” and got something “meh”, and was disappointed. No reason to knock the article, so my apologies.
I have two machines here: a pretty fast AMD XP 2800+ with 768MB RAM and a Radeon 9800 PRO 128MB AGP 8X, and an Intel Celeron 1.8 GHz (128K cache) with 256 MB RAM and a Radeon 8500-DV All-in-Wonder.
I installed Fedora Core 6 on both machines and enabled Desktop Effects on both, and, very surprisingly, the supposedly slower Intel-based machine performs as if the interface code were as perfectly written as possible, without a hint of delay or a single glitch.
The AMD machine’s effects face no issues, except a noticeable delay before moving a window or rotating the 3D desktop, and noticeable sluggishness during that process, though not severe.
Thing is… the AMD machine should, by all technical means, spank the Intel machine hands down. Benchmark scores are about 60% higher ALL-AROUND on the AMD system, but, alas, that doesn’t seem to matter.
Of course, I would venture to say there are many optimizations for Intel that do not apply to AMD. But what little feature is making the Intel system perform as if it were twice the machine, against a machine that actually is twice it?
Both machines have the same CPU ‘extras’ ( MMX, SSE ), with an extra ‘extra’ going for AMD ( 3dNow! ), which is likely the cause of my confusion.
Anyone else hear of anything like this? Or maybe it’s just due to some difference in the installs?
–The (confused, again) loon
Are you using the proprietary fglrx driver for the Radeon?
If you’re using the DRI open source drivers, the Intel machine with the Radeon 8500 uses the r200 driver, and the AMD machine with the radeon 9800 uses the r300 driver.
The r200 driver was developed with full access to the GPU specs and errata, and with some engineering support by ATI developers, so it is known to perform well and AFAIK there are no stability issues or missing features.
Then ATI management decided that they didn’t want to release specs to open-source developers anymore, not even under NDA, so the r300 driver has been written exclusively by reverse engineering the Windows driver. Most cards work well and it is fairly stable, but some GL extensions are not fully optimized or are unimplemented. This could explain the poor performance.
IIRC, the Radeon 9800 is an especially unlucky card, because it seems to be initialized in a strange way, and until not long ago the developers didn’t really grasp how to make it work.