“The world is changing and online applications are becoming more and more popular, whether for e-mail or word processing. The developers behind Bigboard and Gnome’s ‘online desktop’ initiative think it’s time our desktops started catching up. Read on to find an interview with Colin Walters, more information about Bigboard, the online desktop and the obligatory screencast showing it off!”
It seems to me like putting all your eggs in one basket. What if you write something and the net breaks? What if someone steals information travelling over the net? Yes, I know about encryption, but I also know about the unmatched naivines and plain stupidity of people.
The whole “put your eggs in one basket” argument is not really relevant when it comes to computing.
Current usage of computers could also be viewed as unreliable, considering how many new potential sources of problems are created by using computers. E.g. when moving to a digital format, it’s a lot easier to steal info. It’s a trade-off between complexity and productivity/flexibility.
It’s just a matter of time before web-enabled applications become the norm. We have to wait till the infrastructure and ecosystem catch up to these new web-based application systems.
It’s a nice concept, but in a world where identity theft and fraud are increasing, online desktops just don’t make sense to me.
There’s something about having it physically secure in my home that appeals very strongly over having all my personal information stored places only God knows.
On the one hand I can agree with that argument. I’m sure this man http://news.bbc.co.uk/2/hi/entertainment/7019644.stm did as well, up until recently.
On the other hand I’ve seen the (physical and data) security at some of these off site data storage facilities and it sure as hell beats the crap out of anything I have in my home, or any office I worked in for that matter. Let me encrypt the data end to end with my own key and I’ll be content.
On the third hand, being dependent on a (fast enough) net connection to access your data isn’t good. So it’s not that simple either.
Yeah I work in one of them and I know just how difficult it can be to get in, but it goes beyond that. It’s not so much the threat of an unknown intruder getting in, but the so-called ‘trusted’ people allowed in on a regular basis, such as support staff.
Yes you can encrypt the data and yes there are logical controls over the data, but someone has to have the knowledge of how it all hangs together internally, and that someone may be able to access your private information.
Encryption is one thing, but with the massively parallel computing systems around, it stands to reason that encryption can only go so far.
It wouldn’t surprise me if there were SETI like networks out there designed specifically to brute force crack passwords with those enormous rainbow tables.
Then again, maybe I’m just being paranoid.
It’s pretty much up to the distros who implement this sort of thing to make sure that encryption and whatnot is done by default, without user intervention. If past security practices are any indication, Fedora at least will do just that.
naivines – sorry, is this some sort of Cockney reference to what people do in battleships to keep themselves occupied, perhaps? In other words, people in glass houses…
Are you stupid if, in using your phone, you subsequently find out the government has been tapping your calls?
The behaviour is called risk taking…some act without being aware of the risks…
Enough of the ‘stupid ‘ argument – it belongs in the same place as the ‘lazy’ argument, i.e., in a very deep marine trench somewhere with enough radioactive material atop it to keep the idle curious at bay…
grrrr.
I see this as an attempt to develop a new way of using ‘your system’ in an increasingly ‘always connected’ world.
I’ll probably give it a try in F8.
It has obvious advantages when used with a ‘Live Distro’. You don’t have to remember your USB key (I’m always forgetting where I put mine…).
I’ll need to understand more about how your own personal environment is held online. Configuring the place where your own data is held would be essential for using this in a business scenario where you are developing single sign-on using a PC in a company type of environment.
I think we will need to see how this develops as time goes by before any real judgement can be made.
As the article says there is a long way to go.
Just tried it via the latest live CD and it seems to work very well, with lots of interesting features. I think they may be on to something – a new way of looking at how you work on and offline.
And exactly how does the offline mode of operation look in an online desktop? I have wondered about that for some time now.
Pretty much the same way in this case except you wouldn’t have access to your online stuff (at least, right now, unless they implement something like Google Gears).
“Online desktop” is what they are calling it but I wouldn’t say (in the 30 minutes I played around with it) that it works the way you might think. Rather it pulls together a lot of the online services into one integrated package. You can still do stuff locally as normal.
If you haven’t looked at the screencast in the linked article it wouldn’t make sense at all – but I think you still have to try it to get it.
so im guessing that its more like a universal sync service: as long as you have access to a reasonably quick net connection (and with 3G and later, who does not?), the latest version of what you’re working on will have a copy online as well as offline.
i can see why google made a statement in the direction of making https the norm for web access rather than the exception that it is now.
> who does not
travel much? many of the places i go to don’t have reasonable internet access. we have decent internet access in most first world cities (rural areas still often suck, however; two weeks at my sister’s house in rural Washington state was enough to make me pull my hair out), but outside of those areas internet access can still be pretty “primitive”.
we could, of course, say “who cares?” but then one of the things i’ve always loved about Free software is that it opens technology to people all over the world.
as long as online interaction is an optional, if integrated, component in the big mix of things then i believe we’ll be ok. if online services become central to the experience of modern computing we will inadvertently push back many technology wins in various places of the world.
i keep forgetting that of all nations, the usa has worse mobile phone coverage than some third world nations.
i guess being a european makes me a bit spoiled in this area
Hmm, me too. I haven’t found a single spot in years now here in Finland where I haven’t had connectivity. And now they’re planning to start building a country-wide wlan network next year, too.
i guess being a european makes me a bit spoiled in this area
You can keep your fancy mobile coverage that actually works. I prefer to keep the majority of my paycheck instead of having it taxed away. Go U.S.A!
OK so having viewed the demo and seen the screenshots, I realize now that it seems to be, at least so far, not much more than links to various online sites that you may have accounts on already. Not unlike the Flock Social Browser (based on Firefox) I just downloaded, which has tons of links to social sites.
It also seems to push this idea of a gnome site to hold all of your personal settings and preferences. Now that I see the direction being taken, I would much rather have the Plan9 approach as mentioned in the other comments. Instead of a “portal” tool/sidebar, why not focus on improving the FUSE/VFS network-mounted-folder capabilities and offer more transparent, yet optional, integration with the “gnome desktop”? That way there is a wider potential for acceptance through less extra effort.
I just think this may be more misdirected development attention that would be better spent building upon existing infrastructure. But then again, this being Open Source, developers are free to work on what they want / are being paid to. Also, how does this relate to the stateless desktop that Red Hat was working on previously?
In Plan 9, all this discussion would be uninteresting, because the only difference between reading the photos on your hard disk and reading the photos from an online service would be to mount the remote filesystem in your namespace, or implement a small userspace daemon that takes your photos from flickr and provides a filesystem that you mount in your namespace.
The photo application itself does not need to be “online”, it just reads a directory.
And the interesting thing is that Linux could do that: it has FUSE, it even has Plan9-protocol support, it will have a complete implementation of per-process namespaces some day. So: why are we wasting our time modifying apps to support online services instead of doing “the right thing” and provide an abstraction that takes care of all that without needing to rewrite the apps?
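The app-side half of that argument can be sketched in a few lines: a hypothetical photo viewer that does nothing but plain directory I/O, so whether the directory is a local disk or a FUSE mountpoint backed by some flickr daemon is invisible to it (the daemon itself is assumed, not shown).

```python
import os
import tempfile

def list_photos(directory):
    """List JPEG files in a directory. The app neither knows nor cares
    whether `directory` is a local disk or a FUSE mountpoint backed by
    an online service -- that indirection is the whole point."""
    return sorted(f for f in os.listdir(directory)
                  if f.lower().endswith((".jpg", ".jpeg")))

# Simulate with a plain local directory; a FUSE mount would work identically.
with tempfile.TemporaryDirectory() as d:
    for name in ("cat.jpg", "dog.jpeg", "notes.txt"):
        open(os.path.join(d, name), "w").close()
    print(list_photos(d))  # ['cat.jpg', 'dog.jpeg']
```

Only the daemon behind the mountpoint would ever contain flickr-specific code; the app stays dumb.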
What you said is sooooo true. Plus the fact that doing backups is so easy with rsync and NFS.
I don’t understand why there is a need for APIs to upload to websites like Picasa Web Album. You should just have to copy a local folder to a mounted networked folder and the software of the web album should just read the standard metadata embedded inside the jpg.
What we need is a more flexible kind of web services, integrating with things like FUSE or NFS, not more local software integrating web APIs. Web APIs are evil and moving targets.
Seems almost anything IT related is a moving target these days. I suppose with API if it’s abstract then the underlying comms method can be whatever.
So: why are we wasting our time modifying apps to support online services instead of doing “the right thing” and provide an abstraction that takes care of all that without needing to rewrite the apps?
Because online services are not only about remote storage?
Because online services are not only about remote storage?
The idea behind Plan 9 is to represent pretty much everything under a filesystem structure, including things that are not “remote storage” but that could very well be represented as storage. Because in Plan 9, a filesystem is not just a “storage device” (that’s an idea inherited from Unix and Windows); in Plan 9 a filesystem can also be, and many times is, a userspace daemon that exposes a filesystem interface.
For example, the Plan 9 window system is implemented as a userspace server that exposes a filesystem – apps just write to a file in order to get their windows drawn. And because the VFS is fully network-transparent, the Plan 9 window system is also fully network-transparent without having a single line of network-related code. That is certainly not “remote storage”.
In Plan 9 the filesystems are a sort of “common I/O subsystem”: all the apps do their I/O through filesystems that are added to the per-process namespaces, be it I/O done to a storage device or to an online service. An online service can be “hidden” under a file that the app writes to, and the “online service” receives the data. So under Plan 9 the apps do not need to be designed to be “online”; the same code is used to read local and remote files. The apps just read and write files using open()/write(); it’s the filesystem, the VFS and the daemons that take care of providing the data channel.
So IMO, doing the “right thing” in the online gnome desktop should mean implementing those userspace daemons to provide filesystem structures (it’d be easier if the online services would hide their functionality under a 9P filesystem, but that’s not going to happen); the apps only need to be modified to offer a UI whose actions start those daemons. It’d make the “online desktop” easily available for KDE and any other desktops as well.
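A toy illustration of that “service hidden behind a file” pattern (POSIX-only, and only a pale shadow of a real 9P daemon): a background thread plays the service, a named pipe (FIFO) plays the filesystem, and the client side does nothing but open() and write() a path. The command string is made up.

```python
import os
import tempfile
import threading

# The "service": a thread reading commands from a named pipe. The client
# is plain file I/O -- no network code, no service-specific API.
fifo = os.path.join(tempfile.mkdtemp(), "ctl")
os.mkfifo(fifo)          # POSIX-only
received = []

def service():
    with open(fifo) as f:        # blocks until a writer connects
        received.extend(f.read().splitlines())

t = threading.Thread(target=service)
t.start()
with open(fifo, "w") as f:       # the "app": just open() and write()
    f.write("resize 800x600\n")
t.join()
print(received)  # ['resize 800x600']
```

The app-side code would be identical whether the thing behind the file is a window system, a storage device, or an online service.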
Ahh, nice. Thanks for the info and sorry for the lack of vision and insight into this. Got me thinking about more familiar but similar concepts like fifos for inter-process communication and KDE’s kioslaves that present many different aspects of the system in a file-way, both of which I certainly like.
if there is one thing (now that FUSE is going full steam on linux) i would love to see linux adopt from plan9, it’s the window manager.
being able to kill a troublesome program by doing a (forced) delete in the file system is highly appealing.
hell, it would make accelerated 3d graphics much less of a mess i think.
yes plan9 was/is one insane concept. too bad it got torpedoed by being under a restrictive license all those years (iirc).
and we will probably not see anything like what you’re talking about until unix-like systems become much more the norm on the desktop. as long as we have the windows mindshare of different partitions/drives being different letters under “(my) computer”, this is just a geek’s wet dream.
hmm, it may even work for small devices like phones and similar. as long as they can show a file system, and can handle CCP (cut, copy, paste) it can interact with these systems. no need for big web interfaces or similar. want to search for some images, mount images.google.com with a search term into your phone and the photos found will be displayed as files.
facebook: profile is a folder, with photo and similar subfolders, and a optional index.html or similar at the root of the tree.
damn, i feel like setting this kind of stuff up…
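That “mount a search” fantasy can be sketched with the backend stubbed out. To be clear, images.google.com offers no such interface; the backend, the `SearchMount` class and its method names are all invented here:

```python
# A read-only virtual directory whose listing is generated on demand from
# a search backend, so the file tree *is* the query result. A real version
# would be a FUSE/9P daemon talking to an actual search service.
class SearchMount:
    def __init__(self, backend):
        self.backend = backend

    def listdir(self, term):
        """Directory listing for /<term>: one virtual .jpg per hit."""
        return [f"{title}.jpg" for title in self.backend(term)]

fake_search = lambda term: [f"{term}-{i}" for i in range(3)]  # stub backend
mnt = SearchMount(fake_search)
print(mnt.listdir("kittens"))  # ['kittens-0.jpg', 'kittens-1.jpg', 'kittens-2.jpg']
```

A phone file browser that can list directories and cut/copy/paste would then interact with the service for free, as the comment imagines.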
Linux has FUSE, and GNOME will soon have GVFS, which is a userspace VFS that works on top of FUSE to provide per-user namespaces where unprivileged users can mount all sorts of volumes, like their GMail. The KDE equivalent would of course be KIOSlaves, although I’m not sure if FUSE integration is planned. FUSE allows applications that don’t use the GVFS and GIO libraries to access these volumes using plain Linux file I/O.
I’m not sure that thinking about web services as Plan9-style network volumes is necessarily a good approach. It simply moves the application part of the service from the web server to the desktop. Therefore, you need a client application that understands the service to access the data, send/receive commands, and whatnot. The proliferation of clients might cause interoperability problems or even break the service.
I agree that something has to be done to decouple user data from web services. However, I think that this problem is harder than it may seem. The more sophisticated the service, the more challenging it is to separate the application from the data. For some services, simply exposing a volume and using the default MIME handlers would be sufficient. But I imagine that MySpace would be difficult to translate through a network volume.
I think that if Bell Labs were still around today (well, I guess Google is the new Bell) and wanted to reinvent UNIX, they’d probably explore in the opposite direction from where they went with Plan9. Files are nice, but the object is a much more powerful abstraction. The class definition is all you need to interact with an object, so if a web service published its class library and exported the data as objects, developing client applications would be relatively simple.
Oh, well. The computer industry is a place where the penalty for being ahead of your time is that your ideas won’t be realized until many years after their time. Network computing is certainly one of those tragedies. We’ll have to trudge through this Web-2.0 era before we realize that the Web is a distribution medium, not an application environment, and we’ll eventually end up thinking about networks in much the same way that Bell Labs did in the mid-nineties.
> We’ll have to trudge through this Web-2.0 era
> before we realize that the Web is a distribution
> medium, not an application environment, and we’ll
> eventually end up thinking about networks in much
> the same way that Bell Labs did in the
> mid-nineties.
i couldn’t agree more. =)
the value is not the web browser (seriously, uck!) but the dislocation of where you and your machine are (locality) and data access/delivery (information). particularly so when you can “mash” it up with local data at the same time.
oh well .. we’ll get there eventually.
“But I imagine that MySpace would be difficult to translate through a network volume.”
Not if the approach taken is one of the plumber mechanism, whereby universally similar sources of information are subscribed to. For example my “Pictures folder” can derive images from the Photos of MySpace, messages show up as special emails, etc..
“Files are nice, but the object is a much more powerful abstraction.”
I’m not sure if you have seen Microsoft’s PowerShell and what it attempts to achieve. It sounds quite similar to that concept: instead of piping iostreams, they pipe objects.
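The difference can be sketched in Python with generators as stand-ins for pipeline stages. The process list, the numbers, and the stage names `where`/`select` (loosely echoing PowerShell’s Where-Object/Select-Object) are all invented for illustration:

```python
# PowerShell-style pipeline: stages pass structured records (dicts), not
# flat text, so downstream stages never have to re-parse strings.
def processes():
    # Stand-in for something like Get-Process; the numbers are made up.
    yield {"name": "nautilus", "mem_mb": 120}
    yield {"name": "firefox",  "mem_mb": 850}
    yield {"name": "gvfsd",    "mem_mb": 15}

def where(pred, items):
    return (i for i in items if pred(i))

def select(key, items):
    return (i[key] for i in items)

hogs = list(select("name", where(lambda p: p["mem_mb"] > 100, processes())))
print(hogs)  # ['nautilus', 'firefox']
```

With text pipes, the `where` stage would have had to regex the memory column back out of a formatted table; here it just reads a field.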
KIO-FUSE has existed for years already, mounting any KIOslave as a fuse mountpoint. But it was/is never used outside KDE, as far as I can tell.
Edit: and have a look at some less known KIOslaves to see what Plan9-like stuff could do. I mean, the audiocd:// KIOslave is seriously cool, showing the content of an audio CD as mp3, flac and ogg files, ready to be dragged and dropped anywhere, encoded on the fly. Or the settings:/ KIOslave, or applications:/, or man:/ (yes, unix manuals through KIO, from within any app). Some apps like strigi are even configurable through KIOslaves, though KIOslaves also support html-like interfaces.
In other words, KDE is way ahead of you
FOSS is so cool.
Nerdy sounding, I know…but think about it.
Most argue that Vista is lacking innovation.
With Linux growing, AMD/ATI releasing specs, and Plan 9 – who can say they’re NOT excited?
the funny thing is that plan9 was dreamed up in the 80’s.
http://plan9.bell-labs.com/plan9/about.html
http://en.wikipedia.org/wiki/Plan_9_from_Bell_Labs
but as it was under tighter creative control than the original unix, was aimed at office/thin client environments, and apple (and later microsoft) was hammering out their own “worlds” on the user desktop, it never got the mindshare it could have gotten.
one could also say that the lack of ever-present network connections (the internet was a collection of mail servers and early web servers: http://en.wikipedia.org/wiki/History_of_the_Internet ) made it a silly thought for the home user to remotely mount his work environment. at best you used telnet on some small home computer to access the command line on some unix box where you worked, and that was it.
it makes one wonder how far ahead of their time some of the alpha-geeks working out of silicon valley and similar places were, and how long it will take the rest of the world to catch up to it all.
hmm, it makes me think that what we think of as geeks were at one time thought of as wizards and alchemists. and that they probably had a better understanding of how things work than history gives them credit for.
Why are OSS-fans excited when they read about ways to rip their freedom apart by putting their data on a server of some company?
That’s not freedom, right?
It might sound Stallmanistic, but I really do wonder why.
“It might sound Stallmanistic, but I really do wonder why.”
Actually it does not sound Stallman-like at all, since if you are using other computers over the Internet, the concept of freedom in the sense of Free software doesn’t apply at all. That is RMS’s opinion on this, which is why GPL v3 doesn’t extend the notion of distribution to online services.
Think about it. If you were able to get the code under a Free license for the software running on the server, does that by itself make the service Free? IMO, not really.
Now when I use other online services, I care about the freedom to import/export data which many do offer via open API’s. There are some related thoughts on
http://log.ometer.com/2007-07.html
This point has not been as major a subject of discussion as I would have expected. To me, the inclusion of the clause to extend GPLv3 to cover online services, in the first draft, and its later removal, was one of the more interesting things that occurred during the process.
I fully expected RMS to advocate closing the online services loophole, and to stand firm on the point, dismissing arguments against doing so. And I was wrong. It’s one of the few points on which I can say that RMS advocated *less* in the way of restrictions than did I.
I like it, and from what I saw of the screencast it looks easy to use. My only real gripe is that it’s kind of messy and all over the place. They’ve pretty much thrown the Gnome HIG out of the window. Right now it looks a bit cluttered, and I think with a bit of streamlining they could actually be on to something. Clean up the menu a bit, definitely revisit the application browser (which is a huge mess), and work on the aesthetic a bit, and it could be a home run for Fedora.
bookmarks-for-web2.0-sites-on-steroids in a sidebar next to launchers paired with an rss reader in a systray icon menu: that’s essentially what this is, and while i’m sure some find it very useful, when i tried to use it a few months ago on a regular basis i just couldn’t get into it. i tried =) and i’m a fairly avid twitterer, facebooker, etc… *shrug* i also don’t really care what music my friends are listening to in real time or other such things
i do think it’s great to have quicker access to online accounts, but really….
speaking of twitter, the way i got into it was once i had a desktop widget that let me update my twitter blog right there and see my friends’ entries; the difference in this approach is that it is blended right into the desktop interface itself: no web browser popping up windows or other user interaction, just a quick glance at the desktop, with the update widget right there to click on and start typing into. i find that without that desktop widget, i don’t really use twitter that much. i wonder how different my behaviour there is from most people’s?
perhaps the neatest part in this is the online application installer. the value add there is that others can rate the apps and what not and you can see the results, which brings some of the nice social networking bits into more traditional desktop functionality. this isn’t something that really requires a web page in a browser, of course, but certainly integration with “web” services.
i suppose at the end of the day it’s going to come down to whether or not people will get enough out of this “bookmarks on steroids” system to justify changing their habits and configuring it all.
will be interesting to see how it pans out in any case.
Doesn’t this have google apps integration and stuff too?
I took a look at that screencast and I honestly couldn’t see anything I’d have any use for..The only mail account I use is the hotmail one, and even then it exists just for the sole purpose of using my messenger. I do write some tasks on my calendar in Evolution but I don’t want anyone else seeing what I’ve written there, nor do I really have any interest whatsoever in other people’s calendar entries. I’ve never even heard of that mugshot thing, I don’t use myspace and I have all my pictures either stored locally on my hard drive or in the finnish picture gallery website (www.irc-galleria.net). So, the only thing this thing would do for me is bring some more clutter on my desktop! Besides, to me it looks like they’ve just thrown a few apps together in a rather messy bunch..
Don’t forget 1997. This was the year Microsoft released Active Desktop. Information would be pushed to users via widgets embedded in the desktop. To this day I have never seen anyone actually use it.
http://en.wikipedia.org/wiki/Active_Desktop
I am an XFCE user and I see the desktop analogy as being outdated. I use XFCE with CDE style icons for the desktop. I find it more efficient to run the file manager separately on another workspace. Same goes for the browser. Using Compiz-Fusion I can toggle between screens easily. Much easier than having to hunt for items on the desktop.
I had envisioned this whole online thing in a totally different way. For example, you’d have a folder called ‘My Pictures’ and under there a folder called ‘Online Pictures’. If you saved a picture in that ‘Online Pictures’ folder it would automatically be converted to the proper format and saved on a remote server, with the filename as its caption/title/description. If you opened the folder in Nautilus it would just show the pictures as files like it does now. Also, Nautilus should be extended so that the sidepane actually shows the file details, with a small box for writing your own comments about the file easily and quickly (right now you’d have to go through properties -> the proper tab and then write the comment, and it’s not actually used anywhere..).
That’s the kind of an online desktop that’d actually be useful!
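A minimal sketch of that folder idea, with the “remote server” faked as a second local directory; the function name is invented, and the format conversion and inotify-style watching a real version would need are left out:

```python
import os
import shutil
import tempfile

def sync_online_pictures(local_dir, remote_dir):
    """Copy everything saved under 'Online Pictures' to the remote store,
    keeping the filename as the caption. A real version would watch the
    folder (e.g. via inotify) and convert/upload instead of copying."""
    for name in sorted(os.listdir(local_dir)):
        shutil.copy(os.path.join(local_dir, name),
                    os.path.join(remote_dir, name))
    return sorted(os.listdir(remote_dir))

local = tempfile.mkdtemp()    # the user's 'Online Pictures' folder
remote = tempfile.mkdtemp()   # stands in for the remote server
open(os.path.join(local, "sunset.jpg"), "w").close()
print(sync_online_pictures(local, remote))  # ['sunset.jpg']
```

From Nautilus’s point of view nothing changes: the folder still just shows files, and the syncing happens behind it.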
EDIT: Right now that online desktop thing makes me feel less excited than my ex-boyfriend…even though he was a really lousy lay!
Hope all of this can be turned off. Bigboard is the most hideous thing I have ever seen and I have absolutely no use for an “online” desktop. Can’t even begin to name the usability issues with Bigboard. This entire online desktop thing seems to me to be a solution in search of a problem.
This is going back to the dumb terminal days, and it is a bandwidth hog.
The whole idea is really ridiculous: instead of focusing on a solid desktop strategy, time and money are being wasted on the same failed concept. How many of these ‘online desktops’ have come and gone in the past 5 years? I can name at least 5 that have disappeared, only for the same single-point-of-failure concept to rear its ugly head once again.
Being a Linux admin, I can hardly do my duties from an online desktop, much less create a test environment and several other activities that require real machines. Considering that PC prices are at all-time lows and hardware is constantly getting cheaper, and that OO.o is available for free, I do not see a need. I have seen this concept come and go in the past 10 years; even in the large enterprise environment the ‘win terms’ were a HUGE waste of money and found themselves being thrown in a dumpster one day.
Plus the security layer is full of holes: who is keeping the data, who can see it pass, who will ensure its availability 24/7, what are the backup measures? The list goes on and on. It will be gone in a year, only to reinvent itself again claiming it is better than sliced bread…
I don’t think this is the dumb terminal hooked up to a mainframe idea done yet again. It seems to me to be more focused on integrating all the data people have spread all over the place into their PCs.