This article tries to explain why workstations are no longer the appropriate tool for today's working environment, what the alternatives are, and what consequences this has for the development of operating systems.
First I would like to explain why I feel competent to write this article. I am a hardware engineer working as an EDA (Electronic Design Automation) consultant, which means I change projects and customers often and use UNIX-based environments to get my job done. All the developments I describe below affect me directly, so they are probably similar for other engineers, not necessarily only in the EDA industry, or they will affect them in the near future.
The tool is the same, but the task is changing
Let's step back for a moment and remember how things were just a few years ago. Every engineer had his own UNIX workstation in his office. All project data was stored on a file server, so to work on the data, the first step was to fetch it over the LAN. If the data came directly from a customer, it arrived on tape and had to be loaded onto the local disk before the engineer could start working on it. The workstation was powerful enough to handle the amount of data. If data had to be shared, the engineer stored it back on the file server so that a colleague could access it. Communication was handled over email or phone. All electronic correspondence inside the company used the same data formats. The team working on the data sat in the same office.
There are at least two developments that changed this peaceful picture: globalization and flexibility.
Globalization
Nowadays several engineering teams from all over the world must have access to the project data. That means the file server can be located anywhere and must be accessible over a comparatively slow WAN connection. Since several people may work on the same data simultaneously, versioning systems must be used. The amount of data is growing rapidly; it takes too long to fetch it to the local disk and write it back after processing. An additional problem is that providing every single engineer with enough computing power to process this data is simply too expensive, so the resources must be shared. These factors lead to the conclusion that it is easier to leave the data on the server, or to copy it over a fast connection from the file server to a grid of computing servers, which costs less than the corresponding number of workstations and can be used more efficiently. The only connection still required is a remote display that lets the engineer start jobs and see the results. X11 has network transparency built in, but the protocol is not very efficient over WAN connections; a better-optimized solution is a Citrix ICA connection, and a free alternative is, for example, VNC. Another important point is that Citrix clients are available for Windows, Mac OS X, Solaris and Linux, so the OS on the engineer's desktop is completely independent of the OS used on the server. It is also possible to share a connection, i.e. to see what another Citrix user is doing, which is very handy for troubleshooting or for online training. One solution is to provide an inexpensive terminal with a slim Linux distribution that can run the Citrix client, and possibly the RDP protocol to connect to a Windows server, so the user can use software from both worlds: all the production data stays on the UNIX server, everything else on Windows.
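As a rough illustration of the "only the display crosses the WAN" idea, here is a minimal sketch that forwards a VNC session through SSH from a notebook or terminal. The gateway, server and display numbers are made-up placeholders, and it assumes an OpenSSH client and a VNC viewer are already installed; it is a sketch of the principle, not a description of any particular site's setup.

```python
# Minimal sketch: tunnel a VNC session through SSH so that only the
# compressed display traffic crosses the WAN; the project data itself
# never leaves the compute server. Host names, the display number and
# the viewer binary are placeholders for illustration only.
import subprocess

GATEWAY = "login.eda-farm.example.com"  # hypothetical SSH gateway of the compute grid
VNC_HOST = "computeserver01"            # machine inside the grid running the desktop session
VNC_DISPLAY = 1                         # VNC display :1 listens on port 5900 + 1
LOCAL_PORT = 5900 + VNC_DISPLAY

# Forward a local port to the VNC server behind the gateway
# (-N: no remote command, -f: go to background once the tunnel is up).
subprocess.run(
    ["ssh", "-f", "-N",
     "-L", f"{LOCAL_PORT}:{VNC_HOST}:{5900 + VNC_DISPLAY}",
     GATEWAY],
    check=True,
)

# Point the local viewer at the forwarded display.
subprocess.run(["vncviewer", f"localhost:{VNC_DISPLAY}"], check=True)
```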
Flexibility
For the engineer, flexibility means two things. Being flexible means not only working on the technical side of the project, but also contributing more than just processing data. Today the engineer must write documentation, meet international customers and give presentations about project status, provide training, fill out various web-based forms such as timecards or expense reports, attend webinars, telephone and video conferences, and communicate with other project teams over various channels. He receives several dozen mails a day from colleagues, mailing lists and customers, works on different projects at the same time, and must constantly learn new things. He is responsible not only for the project itself, but also for pre- and post-sales support. The other aspect of flexibility is that the engineer is no longer bound to his office. Many companies offer the possibility to work from home, either because they want to be seen as family-friendly or simply to avoid expensive office space. Some companies do not have enough space for all their employees, so people come to the office only twice a week. During critical project stages the engineer must be able to look at the data without making the long trip to the office. During customer visits he must have a lot of data at hand, to be prepared for every question the customer might ask. To meet all these demands, the engineer needs a notebook with an OS that helps him organize all the project-related data that are not production data.
So the combination of these two trends shows that the ideal platform for today's engineer is a notebook with a modern desktop OS, a VPN, and a Citrix or VNC client installed. He can plug it into a broadband connection and either access the server to work on project data or use the notebook's own applications for all the communication and office-related work.
What kind of consequences does this have for the development and usage of operating systems?
We can draw a very sharp line between the server OS and the desktop OS; the two have completely different demands. From the user's point of view, the server OS is visible as an application inside his Citrix client window. In that respect it is comparable to a WebOS running in a browser window. A server OS must be stable, reliable and scalable. It must run on big servers, handle heavy load and many users, support virtualization, and be fault-tolerant and self-healing. The window manager must be simple, yet effective enough to help handle several open windows and terminals in a session, with a resolution and color depth as small as possible to minimize network traffic but still large enough to display all relevant data.
There are already server-only OSes such as z/OS, VMS or OS/400, but our definition would also class AIX, HP-UX, Solaris and all the BSDs as server OSes (note: I don't mention Linux here; it is a special case). There are plenty of minimalistic window managers and desktop environments available (CDE, FVWM, Window Maker) that comply with the requirements described above. Which leads us to the question: what software is required for a server OS? Obviously the programs needed to work on the project data. Then a development environment with a tool chain, to be able to write programs for the OS. A web browser with a PDF plugin and an IMAP-based email program for simple communication. And what software is not required? No advanced communication software, no multimedia programs, no office software, no bloated desktop environments like KDE or GNOME, no 3D acceleration, no search software, nothing that might distract the user or the system from work. The ideal case would be for the user's home directory to stay empty, with all project data stored in project directories under version control, accessible to the other project users.
On the other hand, the desktop or notebook OS must have every feature that helps the engineer organize his work, communicate with every possible client, and manage all his data. He should be able to read and write every document format and access every website. International customers may send him documents in every conceivable format, and he cannot reject them with the excuse that his desktop OS has no application able to read them. Stability and reliability do not play a very important role: if the system crashes, it is still possible to reconnect to the Citrix session and continue working. Currently there are only three OSes that support these demands to some degree: Windows, Mac OS X and Linux.
With Windows, the chance of having all the communication programs for VoIP, IM and video conferencing is higher than on the other platforms. Windows-based applications like Microsoft Office are used by most customers and by the non-technical departments of the company. OpenOffice is available for Windows as well, in case somebody sends ODF files around. It is sad, but there are still a lot of web forms used on intranets that work only with Internet Explorer. For groupware functionality, Exchange plus Outlook is still the most popular combination. Multimedia plugins and codecs for all relevant formats are available. Windows Vista has integrated search, which helps to find documents and emails by different criteria; for earlier Windows versions, applications like Lookout or Google Toolbar can be used. Windows supports Unicode and a lot of different character sets, which is also important, since customers from Eastern Europe or Asia may use a different character set on their web pages or in their emails.
Mac OS X is also able to read and write most of the popular formats. It has its problems with multi-platform groupware functionality, and while VoIP and text messaging with different IMs are possible, video conferencing with a Windows user can become a bigger problem. Not every web page can be viewed with Safari, and if Microsoft removes VBA functionality from the next version of Office for Mac, all the Excel sheets with macros will cease to work. Mac OS X has very advanced search capabilities and is well suited to writing documentation, especially because of the built-in PDF creator, so the documents can be viewed on all platforms, even on server OSes.
Linux can be used both as a server and as a desktop OS. While optimized distributions cut a good figure on the server, the Linux desktop still has a long way to go before it is as helpful for the engineer as Windows. All the arguments that apply to Mac OS X apply to Linux even more. Even if the company is purely open source and uses only standardized document formats and communication channels, the customers might not, and there must always be a way to read everything a customer might send. Groupware solutions for Linux exist, but Exchange support is flaky; MS Office macros sometimes work with OpenOffice, but most of the time they do not; and I am not aware of any cross-platform video conferencing software available for Linux. Codecs and plugins are often unavailable as well. Recently Linux also got desktop search engines. But the advantage of Linux is that it is possible to use the notebook as a development machine and run the code on a Linux server without recompilation, and to demonstrate software or provide training on the notebook without a connection to the server.
Conclusion
Due to the change in the working environment, workstations are no longer the right tool for the job. They are too expensive, can be used by only a single user, and the amount of data is too large to be downloaded and processed locally. The better solution is to leave the data on the server and send it over a fast network to a computing grid. As the control station, either a terminal or a notebook can be used; notebooks offer better flexibility, since they can be used for working from home or while traveling. The server OS should not be optimized for desktop usage but concentrate on tasks such as reliability, stability and scalability. Only lightweight window managers should be used, to save bandwidth and processing power. The OS on the notebook must help the engineer communicate, manage his data and organize his work. Windows is currently the most advanced OS for these tasks, but Linux's advantage is that it is flexible enough to be used both as a server OS and on the desktop.
It’s the same with Mainframes. Like it or not, they’re a dying breed. They cost too much to maintain.
Mainframes are cost effective in some cases, which is why we still use them (for example) and plan to for the next couple of decades (unless something better comes along).
Like most things related to IT, the viability of a given solution often depends very heavily on the specific business and technical context in which it is to be used.
"It's the same with Mainframes. Like it or not, they're a dying breed. They cost too much to maintain."
Dude, mainframes have experienced an uptick in recent years. If you're as ignorant on the Windows/Linux issue as you are on this, I'm glad I've never listened. Nor am I surprised.
I don't know why this is modded up – it simply is not true. Had the author _any_ knowledge about this matter, he'd not have written this crap. Mainframes are more wanted than ever. They are not dying – on the contrary, companies rely on them more than ever. Well, guess why…
Mainframes are experiencing a resurgence, and it is precisely because they cost less to maintain. Less administration, less power, less heat, less space, less cabling, and much greater reliability.
The downside is a higher initial investment, but after giving distributed commodity infrastructure more than its fair consideration, the medium-to-large business is willing to pay boatloads to ease their maintenance headaches.
They’re sick of fitting the pieces together with mixed results. They just want it to work. They want to shrink their IT departments and start concentrating on their business model instead of their datacenter. They discovered you can’t do IT on the cheap. You either pay for it now, or really pay for it later.
We’re seeing it in the high-end UNIX space. Customers are requesting mainframe technologies like I/O virtualization and live recovery. They also want the mainframe brand of customer service, where if a customer’s system goes down, the senior engineers work around the clock in shifts until a fix is delivered and verified.
For me at work, a traditional “workstation” is just fine, but I spend most of my time doing three things:
(1) Using terminal windows (VT+SSH or UTS) to access a server box and use software development tools on that box via command line.
(2) Using X clients running on a remote server box and displaying their graphical windows on my local workstation screen. This includes editors, debuggers, etc.
(3) Using normal “office” applications (word processing, web browsing, bitmap manipulation, vector drawing).
I don’t typically move data from the servers on which it is stored. Instead, I manipulate that data locally on the appropriate server(s), and redirect the display as required. It’s much easier. But that’s what UNIX “workstations” (or in my case, Windows + Cygwin) have been able to do for well over a decade.
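For what it's worth, the "redirect the display as required" part of this workflow usually amounts to nothing more exotic than SSH X11 forwarding. A throwaway sketch, with the host name and application as placeholders; it assumes an OpenSSH client and a running local X server (e.g. Cygwin/X on Windows), not any particular site's tooling:

```python
# Sketch of the "run it on the server, draw it here" workflow via SSH
# X11 forwarding. The host and program are placeholders; assumes an
# OpenSSH client and a local X server are available.
import subprocess

SERVER = "devserver.example.com"   # hypothetical server holding the project data

# -X requests X11 forwarding: the editor runs on the server, next to
# the data, while its window appears on the local workstation screen.
subprocess.run(["ssh", "-X", SERVER, "emacs", "/projects/demo/notes.txt"], check=True)
```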
I’m not quite sure that the author has explored all of the practical options at his disposal. All this talk of moving data around is silly — we have *LANs* these days for God’s sake, not just FTP connections…
People have been saying workstations are dead since X-terminals came out. People have also been saying Unix is dead since about V4. Neither has happened. Although some find such things as the Linux Terminal Server Project useful (and more power to them if they do), it would be more accurate to say that X-terminals are dead.
When will people stop flogging this stillborn horse? At this rate there’ll be another round of media-excitement over dot-coms (the ones that crashed and burned in the late 90’s, even).
SGI is pretty much dead. Anything else wasn't really a "workstation" but merely a Unix PC.
"SGI is pretty much dead. Anything else wasn't really a 'workstation' but merely a Unix PC."
Well, SCO boxes were “Unix PC’s”. Compare the state of Sun, DEC, DG, HP and IBM hardware in the 90s, and the software that was run on them, with the equivalent Amiga/Mac/PC offerings and I think you will agree HP, DEC, Sun, DG and IBM Unix boxen were “workstations” at the time.
And of course IBM and Sun Unix workstations are still around, as are HP-UX boxen if anyone wants them.
Yes, Sun and IBM are the last holdouts. Everything else is using x86 plus some NVIDIA card. But SGI in particular was something special.
IBM doesn't really have a viable UNIX workstation. They sacked their entire 3D graphics team a while ago. If you enjoy the vintage feel of CDE circa 1996 I guess you'll be quite happy. If you don't, you'll probably prefer telnet, or even better, use the networking support in the hypervisor to let you directly access the system console over virtual serial.
True enough for vanilla Aches, but KDE is also available for it (and maybe GNOME).
“workstation”? There are still “workstations”?
I’ve enjoyed the centralized/decentralized argument since, oh, 1973. It really has little to do with technology and costs. The decentralized systems have almost always had a higher total cost of ownership than the centralized ones. It’s always been about control over working environment.
"The decentralized systems have almost always had a higher total cost of ownership than the centralized ones. It's always been about control over working environment."
Exactly – it’s not all about TCO. I’m willing to bet that a Ford Mondeo has a higher TCO than a horse, and a Lexus a TCO higher than a Ford Mondeo – but can you see the owner of a Lexus or even a Mondeo swapping their cars for a horse?
The Mainframe environment is growing, there has been an increase in sales, because Big Iron is dependable and will be around 20 years from now.
As far as workstations go, if you plan to deploy some dumb-terminal computing (which already failed in the past), it is a waste of time, money and productivity. It is more cost-effective and more productive to use PCs in a business than a single-point-of-failure dumb-terminal service for end users. I myself have to have a workstation at work; otherwise I could not use any of my tools or support multiple environments.
Workstations/PCs have finally become a cost-effective solution, and other operating systems are now available to run, like Linux, which is what I use at work.
While I do agree with the Sun vision that the network is the computer, I think it may be just as likely that you'll see a mainframe under every desk instead.
Good grief.
I know that some people will find this odd, but often the type of work you do will define the hardware configuration you use.
If you are on the road as sales person, you’ll probably have a laptop with local applications that try and copy as much as possible to the local hard drive.
If you do weather prediction or simulate nuclear explosions, you probably have a fixed workstation that connects to a massive super computer or cluster, and if you work in animation, I’d guess that you’ll have a heavy duty workstation that you use to do pre-rendering before uploading to a cluster.
I would guess that students would most likely find laptops useful, whereas a hard-core gamer is probably hanging on to his or her desktop system.
Different strokes for different folks.
Quote: I am not aware of any cross-platform video conferencing software available for Linux.
Ekiga uses the same protocol as Netmeeting and there are multiple clients that support this protocol. Ekiga can also double as your VOIP phone for world-wide communications (we use it a bit here).
And GnomeMeeting, also compatible with NetMeeting. There are also proprietary Linux solutions (I don't know the brand offhand, but my company's videoconferencing hardware runs embedded Linux, though I don't know if they make a client available for desktop machines).
Ekiga is the (not so) new name of GnomeMeeting.
There was an interesting idea about network computers some 10 years ago. The trend was, unfortunately, killed by Microsoft. They saw it as a threat. Later they released Windows Terminal Server, which is a similar concept.
There is a psychological problem with that. Employees often see their machine as "their own PC". And often they do stupid things with it, so someone has to reinstall, clean out the viruses, reconfigure, etc.
DG
What is a stupid thing to you may well be a great productivity enhancer to the person who does it. The whole reason the PC has trumped centralized computing services is the degree of ownership the user has over the environment: they are free to construct their own spreadsheets, to download and install a new tool, or to use a new website.
Yes, this does raise the total cost of ownership in terms of the IT support needed, but it has a great benefit, and the clear lesson from practical experience is that the benefit outweighs the cost; if it did not, we would see a lot more companies out there with dumb terminal systems.
Users and businesses are generally not dumb (even though they may not know the details and theories of technology systems); they understand what is needed to get their job done more efficiently, much better than anyone else.
In many ways it is a parallel of centrally planned economies (where "experts" decide the approach and "citizens" aren't free to make mistakes) versus a free-market economy where people are given ownership and are free to succeed or fail. On average, an ownership society makes faster progress than one ruled over by so-called "experts".
There will never be a pure client/server or workstation environment. It all has to do with finding the right solution to a problem.
For instance, our main offices use laptops/PCs, while the branch offices use thin clients connected to terminal servers at the main office so we don't have to support local servers.
I use a laptop because I have to be mobile sometimes; it runs VMware Server for testing, has all the applications to independently create documentation, and of course occasionally I watch video on it.
This is something I can’t do well on a terminal server.
On the other hand I use a thin client as well to connect to a special terminal server with all the administrative software and tools to manage the network. This way I can disconnect and everything will keep running until I reconnect from home.
Nowadays I can’t imagine having to solely use one or the other, to me both the local workstation and the thin client/terminal server are important tools to do my job.
…the ideal solution for me is to have my workstation. I like a large screen in front of me with lots of windows open for coding, documentation, email, etc. I also have to run several containers at the same time, locally, so I can test changes to our sites, BEFORE checking them into CVS (a copy of which is also kept locally) for our nightly builds. Some of us even run copies of our Oracle database on workstations to be able to screw around with the structure of major tables without impacting our fellow developers as they too test.
I suppose if someone builds a laptop with a 20+ inch screen I might bite… but I also don't particularly want to take my work home with me if I don't have to. (This is a bit of a lie, I actually work from home – so let's say I don't want to take my work all around with me.)
Another issue with laptops is that if I have to have a local copy of the code on my laptop, and my laptop is stolen, now someone else has access to that code.
I am not saying Citrix is not nice, I tunnel into work now, but a laptop will never replace a workstation for me.
Nice site by the way…
I agree a laptop is required in the office now, but I also have a workstation. Most large corporations lease them from Dell, like ours does.
My next position will be home based, I would be more productive at home with more time to work on stuff.
Large screen, multiple CPUs, fast local disk: all these help productivity.
I'm running a quad-core Opteron workstation with 8GB RAM, a heavy-duty NV workstation card, a 24-inch monitor (on which I actually run 10 virtual screens, each 2400×2000), and dual 15K-RPM SCSI discs. Let me tell you that direct display of scientific visualization is much faster than display on my office-mate's similar machine, connected to mine by gigabit Ethernet.
Another thing is missed here. What once constituted a workstation (in terms of memory, disk, processor, etc.) is now no longer the high end of machines. They are now the discount chain special.
There is a place for centralized processing and storage, but there is also a place for having a local copy for just one user to work with. The tools need to fit the job. With the emergence of Linux on the scene, the heavy lifting that was once done only by Unix workstations can now be done with an inexpensive machine and a Linux disk.
The world is changing, but there will always be a need, by some, for a high end machine on the desktop. It is just that now the delta from cheap to expensive in terms of performance has shrunk.
I think this is one of the main points here, the gap between the top and bottom of the computing platforms in use has narrowed a lot, such that it isn’t at all clear what people mean when they say workstation.
E.g. I have a Windows machine at my desk that is, I believe, called a workstation by the manufacturer (maybe because it has two CPUs or more than a couple of monitors attached), but it's a few years old and probably no more powerful than a modern bottom-end dual-CPU box. Either would be more powerful than the ageing Sun "workstation" I also use from time to time to work with Solaris code.
In addition, the other major change is the way we scale computing power: it is no longer normally just ever bigger and faster boxes as you move from workstation to server.
Now our most powerful computers are no more than clusters of boxes similar to those we have on our desks, which has led to design changes in software. Where once our final code might have run on seriously heavy iron, we may have needed a "workstation" just to test and develop it.
Instead we now have software that is more componentised and can be run, tested and developed in smaller isolated blocks.
Some time ago there was a PC for day-to-day tasks, documents etc., but it couldn't really run the more complex systems we would build, even in development; those simply needed more memory, more CPU and more throughput even to test. Back then our production servers were generally large, expensive multi-CPU boxes.
Now the deployment systems tend to be "grids" of smaller systems rather than big hulking machines, so the software can also now run in smaller units, which means we no longer need a heavyweight "workstation" just to develop and test it.
So yes, I would predict a continued demise of the workstation.
"…it isn't at all clear what people mean when they say workstation"
Very true. And that’s always been true.
I remember doing schematic capture and programmable logic design in the early ’90s on a PC with an 80286 processor. Some people called that a workstation, but others insisted that it was not.
It’s all a matter of personal preference. I don’t like laptops, and I would not say they are very flexible. Yes, you can take them with you. That’s about the only advantage they have over a PC. Otherwise they are very, very inflexible. I want to be able to exchange parts of the PC – which is still close to impossible with laptops. With PCs it’s a snap. And to be honest: I don’t know a single person that is sharing his laptop. A PC, yes, but not a laptop. That’s total rubbish.
A terminal? No way. Too little, too late. Seriously, this has been around for some time now, and it has _never_ worked. SUN has claimed it, but not even they managed to push it through. Now MS is proclaiming the server in every home – maybe that leads to terminals, although I doubt it very much. A PC can be had for as little as $200. Why should I buy a terminal that is more expensive? I don't see any reason.
Mainframes are dead? You must be joking! Mainframes are used more often than before, and their market share has considerably increased. Increased, not decreased. So I really don’t know what the author is talking about.
This reminds me of predictions made by analysts. When you read them you often have the impression these analysts live on the moon. Well, same with this article …
"A terminal? No way. Too little, too late. Seriously, this has been around for some time now, and it has _never_ worked. SUN has claimed it, but not even they managed to push it through. Now MS is proclaiming the server in every home – maybe that leads to terminals, although I doubt it very much. A PC can be had for as little as $200. Why should I buy a terminal that is more expensive? I don't see any reason."
What is the real cost of that $200 computer? First, at that price it can't come with Windows XP or Vista with full networking capabilities, so add $150. Next we need antivirus, anti-malware and a copy of MS Office, so that is another $500 or so in software, plus about 4 hours of a tech's time at $25 an hour for another $100. Now, if anything breaks, a tech will have to respond to the desktop; say that happens once every 3 months and takes an hour each time, that is another $100 a year. Then add in power costs, and that $200 computer now costs the company about $1,000. All these are low figures; in real life they are much higher, and they don't include keeping a pile of spare computers around to replace defective ones or to use for parts.
Now let's look at a Sun Ray: they are about $300 and require about 15 minutes to set up, so that works out to about $325. All administration is done at the server, which can be configured to automatically log in to a terminal server running a shared license of Office, antivirus and malware removal; this is set up once and costs an average of $300 per user for licenses shared among a dozen users (if you are running an open-source solution you can subtract the $300). Since everything is maintained on the servers, the tech who used to run around from cubicle to cubicle, taking extended breaks between calls, is no longer needed, saving the company his $50,000-a-year salary and benefits package.
How long does the computer last? A cheap $200 computer will probably last 2 years, 3 at the most, so budget another $1,000 in 3 years for a "new $200" computer. The Sun Ray lasts 5+ years because it has no hard drives or fans to fail, and it doesn't need to be replaced when MS decides to release a new OS, saving another $1,000 because it didn't need replacing after 2-3 years.
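Whatever one thinks of the individual numbers, the arithmetic itself is easy to check; here is a quick back-of-the-envelope version using the rough per-seat figures from this comment (they are the commenter's estimates, not measured data):

```python
# Back-of-the-envelope TCO comparison using the rough per-seat figures
# quoted in this thread (estimates, not measured data).
pc = {
    "hardware":        200,
    "windows_license": 150,
    "other_software":  500,   # antivirus, anti-malware, MS Office
    "initial_setup":   100,   # ~4 tech hours at $25/hour
    "yearly_support":  100,   # roughly one desk visit per quarter
    "lifetime_years":    3,
}
sun_ray = {
    "hardware":        300,
    "other_software":  300,   # per-user share of server-side licences
    "initial_setup":    25,   # ~15 minutes of setup
    "yearly_support":    0,   # administered centrally on the server
    "lifetime_years":    5,
}

def total_cost(seat):
    """One-time costs plus per-year support over the device's lifetime."""
    recurring = seat["yearly_support"] * seat["lifetime_years"]
    one_time = sum(v for k, v in seat.items()
                   if k not in ("yearly_support", "lifetime_years"))
    return one_time + recurring

for name, seat in (("cheap PC", pc), ("Sun Ray", sun_ray)):
    cost = total_cost(seat)
    years = seat["lifetime_years"]
    print(f"{name}: ${cost} over {years} years (about ${cost / years:.0f} per year)")
```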
> What is the real cost of that $200 computer? First, at that price it can't come with Windows XP or Vista with full networking capabilities, so add $150. Next we need antivirus, anti-malware and a copy of MS Office, so that is another $500 or so in software, plus about 4 hours of a tech's time at $25 an hour for another $100. Now, if anything breaks, a tech will have to respond to the desktop; say that happens once every 3 months and takes an hour each time, that is another $100 a year. Then add in power costs, and that $200 computer now costs the company about $1,000. All these are low figures; in real life they are much higher, and they don't include keeping a pile of spare computers around to replace defective ones or to use for parts.
Have you heard of this thing called Linux?
"Have you heard of this thing called Linux?"
Sure have, but most offices are still running Windows. Feel free to subtract the $500 software fee, and double the install time unless you have a tech who is really on the ball (and more expensive). Also double the labor price, because Windows techs are a dime a dozen while Linux/Unix guys tend to make more money. And the tech will still need to make at least 2 trips to the desk a year for hardware and networking problems.
Even with the reduction in costs the $200 computer still costs closer to $500.
With the Sun Ray, techs only need to visit the user's desk for setup or total hardware failure; since there are no user-configurable parts on the Sun Ray, it's just set and forget as far as the tech goes, with no hard drive or fans to die.
Apparently companies haven’t quite understood SUN then. I don’t see many companies that use SUN’s SUN Ray hardware. Maybe SUN’s marketing department isn’t doing its job then
No, seriously. SUN has been saying this since when? Nothing really has happened. So apparently this is theoretically a great idea, but it fails in practice.
No – it doesn’t convince me – and it hasn’t convinced others who are decision makers either.
"Apparently companies haven't quite understood SUN then. I don't see many companies that use SUN's SUN Ray hardware. Maybe SUN's marketing department isn't doing its job then"
You're correct in saying Sun has not done a good job selling the Sun Ray. Having used a Sun Ray for years, I can say they are a fantastic device. The older it got, the faster it went – faster, cheaper servers…
One thing I found was if somebody had not used a Sun Ray before they just did not understand where they can be used and why they can be cheaper to run.
Notebook sales have been gaining on workstations in terms of market share for a while now but I’m still not interested in owning one.
I had one for work for a while but all I did was stick it in a docking station.
When I needed to get work done from home I would rather sync the files to my workstation and use that than fumble around with a laptop.
Besides, you can get a high-end workstation for the price of a mid-range laptop, and when you need to upgrade, you can save money by not needing to upgrade the monitor as often as the tower.
I also disagree with the author's implication that because data needs to be shared, people need to use VNC or some terminal client to share it.
Thanks…
“My next position will be home based, I would be more productive at home with more time to work on stuff.”
There are some things to be wary of… I’ve been working from home full-time for 1.5 years now, and before that for 2 years I worked from home 2 days per week.
I have to admit that at times I feel like I am falling out of touch with my co-workers. Also we had an IT re-org and I am working with people I have never met in my life, now.
I also have two small children, one is in school now, but the other does not understand the difference between daddy at work with his door closed and daddy at home . . . she tends to barge in whenever she pleases despite my wife trying to keep her out.
On the plus side… I can work at any time of the day/night that I want. As long as I get my assigned tasks done, my boss isn’t too caring about when I do it, as long as I am either available during their work hours (I am on the other coast) or at least let them know when I won’t be available.
Where I work we just got FreeNX/NX/2X instead of Citrix; its X11 protocol compression is superb, and the latency issues we had with Citrix are gone.
NX is one of the most underrated technologies out there for Linux/Unix at the moment.
It makes X11 over a DSL connection not just usable, but actually convenient. Anyone who does any form of remote X11 (Or would if it didn’t suck so hard) really needs to check it out.
That, and an rsync-type system for file-sharing (iFolder springs to mind, but development of the non-NetWare version seems permanently stuck in “slow”) would address 90% of the issues the author brought up.
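As a hedged sketch of what the "rsync-type file sharing" part might look like in practice (the host, paths and options below are illustrative assumptions, not a description of iFolder):

```python
# Sketch: mirror a project tree from the file server to the notebook
# rsync-style, so a local copy exists for offline work. The host, paths
# and options are illustrative; assumes rsync and SSH access are available.
import subprocess

REMOTE = "fileserver.example.com:/projects/widget/"   # hypothetical project share
LOCAL = "/home/engineer/projects/widget/"

subprocess.run(
    ["rsync",
     "-a",          # preserve permissions, times, symlinks
     "-v",          # verbose
     "-z",          # compress over the slow WAN link
     "--delete",    # mirror deletions so both copies stay identical
     REMOTE, LOCAL],
    check=True,
)
```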
It won’t change the fact that a laptop makes a lousy workstation, or that I’m far more productive with a desktop resolution of 2560×1024, however.
The author forgot to define what he thinks a workstation is. Definitions vary depending on who you ask.
Under either definition this subject has been rehashed over and over again. X terminals, the PC, thin clients, web applications… every "solution" has sparked this kind of debate. Let's just agree that there's room in the market for all, and that each has its purpose, eh? Declaring a particular type of technology obsolete is at best shortsighted… see the mainframe; it's doing fine even though its death has been predicted since the '70s.
“The author forgot to define what he thinks a workstation is. Definitions vary depending on who you ask.”
That's a good point. Among better-educated German computer scientists, you'll usually find this definition (or something similar):
Workstations are a subset of computer devices of smaller size, so they fit on top of or under a desk. Furthermore, they do not run "Windows"; to be more precise (an additional claim), they are not able to run it.
Small Computers = { PCs | Workstations | Mac }
Typical workstations: Sun Sparc, Sun Ultra, SGI Octane … Fuel; just to name a few (of my favourites).
You could count devices like Sony's PS3 in the set "Small Computers", although it's neither a PC nor a workstation; it's more a computer system designed for gaming.
Terminals and thin clients (such as Sun Ray) can be considered a workstation too, because their power is based upon a server (usually a UNIX machine). But I’d think they’re a different set because they’re not very usable on their own.
Equipment = { Small Computers | Terminal Devices | Accessories }
Typical for workstations: except for the hard disk drive, extensions are not built into the machine itself; instead they're attached via cable (e.g. a SCSI DVD-RAM recorder).
Just a suggestion.
"Let's just agree that there's room in the market for all, and that each has its purpose, eh? Declaring a particular type of technology obsolete is at best shortsighted… see the mainframe; it's doing fine even though its death has been predicted since the '70s."
When it comes to reliability and stability, I would choose a mainframe (if you can call the IBM AS/400 or RS/6000 systems that). Most people have a room full of equipment (a "dinosaur") in mind when talking about mainframes, and something similar when they talk about "workstations".
On SGI workstations (for example) you could do things 10+ years ago that require a high-end PC today. So, in most cases, when the PC world (especially MICROS~1) claims to have invented something new, you'd say: "Uh… I used this feature ten years ago…" This goes for hardware and software alike: multicore, multiprocessor, big monitors, SCSI, RAID, LVM, VMS, networking, interoperability, multiuser, multiprocessing, …
The question, as usual, is: what task do you need a computer for? That determines which kind of computer you will use: one for gaming, one for HPC, or just a better typewriter. If you just want a typewriter replacement, a PC will do just fine, but if you have to do scientific evaluation, image processing (CT, MRT, PET) or run the monitoring of an intensive care unit (ICU), you would rely on a real workstation.
I would like to point out that operating systems like OS/360, MVS, and VMS/OpenVMS are not “server only” operating systems. They are general purpose operating systems.
The first two are mainframe O/Ss and VMS/OpenVMS ran on workstations, servers, and minicomputers. I don’t know if it ran on DEC-10/TOPS-10 so I can’t say if it ran on a mainframe.
One important difference between the current PC/Workstation/Server O/Ss and the old mainframe/minicomputer O/Ss is the expected uptime. Some MVS systems have been up for 30+ years with no unscheduled down time. VMS/OpenVMS systems have recorded uptimes on the order of 20 years.
Modern Unix and Windows systems are, at best, 1 to 3 orders of magnitude less reliable than those systems. And you pay for that reliability.
I really did. I tried. I just couldn’t get through it.
It’s obvious that English is not the author’s first language. But if you are going to write something in English with the intent of publishing it to a large audience, you might want to have your copy proofread by a native English speaker.
I did manage to decode the following:
"From the user's point of view, the server OS is visible as an application inside his Citrix client window. In that respect it is comparable to a WebOS running in a browser window. A server OS must be stable, reliable and scalable."
Which basically shot any credibility the author might have had with me. Citrix is nothing at all like a WebOS. With a WebOS most of the computational resources are local: code and the initial graphical layout are loaded from the server, but then execute in the browser – very much 'client-server'. I'll grant that a WebOS is 'thinner' than a fat binary client, but it's nothing at all like a Citrix thin client.