Speaking of the Xerox Alto – let’s move on a few years and talk about the Xerox Star, its successor and, like the Alto, one of the most influential computers ever made. There’s a great demo up on YouTube where some of its creators walk you through the basics of using the Xerox Star, from basic filing down to the included virtual keyboard, which could display any keyboard layout you wanted – including things like Japanese or a math panel.
I love watching videos of the Xerox Star in action, because it shows you just how little the basic concepts of the graphical user interfaces we use every day – OS X/Windows or iOS/Android or whatever – have changed since the ’70s, when Xerox invented all the basic parts of it. Of course, it has been refined over the decades, but the basic structure and most important elements have changed little.
Just as we still rely on shoehorning a timesharing punchcard mainframe operating system onto a phone, we still rely on the same old Xerox concepts of icons and windows and dialogs on our phones as well. Hardware has progressed at an incredible pace – we have watches more powerful than 100 Xerox Stars combined – but software, including UI, has not kept up.
We should have better by now.
> we still rely on the same old Xerox concepts of icons and windows and dialogs on our phones as well.
It may be old, but it also works extremely well.
“If it ain’t broke, don’t fix it!” We have witnessed plenty of disastrous results when change is made for change’s sake.
Such as every change to personal computing since roughly 2007.
> we still rely on shoehorning a timesharing punchcard mainframe operating system onto a phone
There is a phone that runs MVS?!?
Yes, and wheels… Humans invented all the basic parts of the wheel even before they could write. Of course, it has been refined over the millennia, but the basic structure and most important elements have changed little.
We should have better by now.
Are you suggesting UNIX and our current GUIs are as elementally perfect as the wheel?
…no wonder we’re stuck with this crap.
We are not really stuck: we are actively descending into crap. Which should be no surprise, given that the computer user base has evolved since the ’70s, mostly towards including people who do not understand, and do not want to understand, what they are doing with computers. You can’t expect software to become better if it is designed for less prepared users.
This statement seems backwards to me. I would expect software to improve (especially if we are specifically talking about interfaces and user experience) precisely BECAUSE it is designed for less prepared users.
If I am creating software for a user that has prepared himself through long hours of study and practice to understand and program the system, I can provide a very minimal set of services and still be useful. If I am creating software for somebody that is only willing to accept a few hours or a few minutes of training (or even none at all) then I am going to have to do a lot more work building software that takes on more responsibility for being useful to such a user. The more power I want to expose to unsophisticated users, the better my software needs to be.
I think it is precisely because of this attitude that I can’t find a mail client for my phone that sucks less than the default mail client on my old Palm device.
See, different kinds of users require different sets of features. You can’t possibly develop end-user software that would be good enough for everybody without making it so complex that it would end up unmaintainable. Now, the “App Store” model makes it more viable to develop for a particular group of users: people who know nothing about tech and uninstall the app if they are presented with a choice they have no firm understanding of.
I disagree.
We should not want to expose more power to unsophisticated users, because that is like giving a knife to a monkey. We should want to expose more power to users who want to be sophisticated, who want to learn how the software works, and who operate it based on standard guidelines, not on whatever seemed better to the software writer.
That is how current society works: you do not want to give a truck to a kid, you do not want to give an airplane to a lawyer or the law books to an engineer; and at the same time, everybody needs to know how a semaphore works, what the red light means, which lane you should use when driving, which pedal you must push to activate the brakes, etc.
Current software (especially on phones and tablets) does not follow any rules, any guidelines, any quality criteria (in several cases), with functionality clipped or removed. To me, it is now far easier to use a Windows application than a phone app (perhaps they are called “apps” instead of “applications” because they are not true applications, are they?).
Why does everyone need to know how a semaphore works? That’s just tech elitism. You need to know how to steer the car, not the details of how the engine uses the gasoline.
With so many resources available nowadays, it is just ignorance not to try to learn, even superficially, about the things around us. Including semaphores.
Depends. There are a lot of things I find easier on the iPad than I do in Windows. Not the phone though. That screen’s too small.
“…everybody needs to know how a semaphore traffic system works, maybe?”
While traffic protocols have to be learned by everybody, not everybody needs to know how to drive a fire truck.
But ddc_ is also right: that wheel is a very good wheel, indeed. And no, I’m NOT stuck in traffic. Nor is my wheel made of sticks. But it’s still a wheel, you know.
Concepts are tremendously powerful on the USABILITY side.
If we wanna go there… what is the software equivalent of a roundabout?
“The more power I want to expose to unsophisticated users, the better my software needs to be.”
Load helicopter pilot program… Done.
Unsophisticated users shouldn’t be given such powers.
[Because that power is not really on the user’s side, but on the Matrix’s.]
“We are not really stuck: we actively descend into crap.” Ouch! ouch!
[But if referring to disempowering, procrastinating HW, SW & Law-Frames, you’re right].
Well, I would say that it’s hard to tell what parts of the UI metaphor really need to change.
You could question other fundamental systems from the 1970s.
Is the idea of a file outdated? A network address? Could we reinvent networking without TCP?
The UI metaphor tends to be good enough. And familiarity and predictability (which go hand in hand) are worth much more than just being innovative in UI.
There were quite a few different network stacks that did not use TCP/IP – SNA (IBM) and DECnet (DEC) are two that come to mind.
IMHO, TCP/IP was state of the art at the time it was developed. Today? I think it needs a complete rethink.
Starting to read data without knowing how long it is before you read it (hence buffer overflows) is a method that has had its day. Time to move on. Yes, I know you can work around it, but it should not be needed in this day and age.
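To sketch what I mean (a minimal example in C, with made-up function names – not any particular protocol): instead of reading blindly into a fixed buffer, the sender states the payload length up front, so the receiver can validate it before reading a single payload byte.

    #include <stdint.h>
    #include <unistd.h>      /* read */
    #include <arpa/inet.h>   /* ntohl */

    /* Read exactly len bytes from fd, looping over short reads. */
    static int read_exact(int fd, void *buf, size_t len) {
        size_t done = 0;
        while (done < len) {
            ssize_t n = read(fd, (char *)buf + done, len - done);
            if (n <= 0) return -1;            /* error or EOF */
            done += (size_t)n;
        }
        return 0;
    }

    /* Length-prefixed framing: the peer sends a 4-byte length before the
       payload, so we can reject oversized frames *before* reading them. */
    static int read_frame(int fd, char *buf, size_t bufsize, size_t *out_len) {
        uint32_t netlen;
        if (read_exact(fd, &netlen, sizeof netlen) != 0) return -1;
        uint32_t len = ntohl(netlen);
        if (len > bufsize) return -1;         /* no overflow possible */
        if (read_exact(fd, buf, len) != 0) return -1;
        *out_len = len;
        return 0;
    }

Plenty of protocols layered on top of TCP do exactly this; my gripe is that the transport itself doesn’t.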
We also need to switch to encrypted data as a default but the likes of the NSA won’t like that.
“… a method that has had its day. Time to move on.”
Better to build side-by-side, FlyingJester.
‘Better to build side-by-side.’
We needed a 9 km rock to replace the reptiles at the top of the food chain. [And only on land.]
“We also need to switch to encrypted data as a default but the likes of the NSA won’t like that.”
[Maybe because that would make the work of Harvesters an easy ‘breeze’?]
Granting you the point about primitiveness. But that kind of overflow is easily trapped with hardware filters.
What does that have to do with TCP/IP?
I know. I mean, can you believe that we’re still using levers, wedges, pulleys, inclined planes, screws, and wheels and axles to amplify our effort to accomplish certain tasks our bodies cannot complete on their own?
I mean, old technology, right?
Funny how you are comparing simple devices, perfectly adapted to their usage, to a jumbo jet intended to fly hundreds of people at a time, which was retrofitted to hold one person, and stay on the ground.
Sure, it works, but wouldn’t it be timely to invent the motorcycle or the car? Or even the horse?
Or the Horse?
Ah, you mean the Camel. A horse designed by an ISO committee.
Wrong, the Award goes to Larry Wall.
Well, I’ll take the criticism seriously when somebody at least makes an attempt to explain why the underpinnings that made Unix such an effective multi-user operating system make it ill-suited for a desktop, or even a smartphone, with all the software complexity those require – when the requirements align so well, especially process isolation and privilege separation.
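(To be concrete about the privilege-separation point, here’s a minimal sketch of the classic Unix pattern: a privileged parent forks a worker that drops to an unprivileged UID, after which the kernel enforces the isolation. The UID 65534 for “nobody” is just a conventional assumption. Android’s per-app UIDs are essentially this idea, systematized.)

    #include <stdio.h>
    #include <sys/types.h>
    #include <sys/wait.h>
    #include <unistd.h>

    int main(void) {
        pid_t pid = fork();
        if (pid < 0) { perror("fork"); return 1; }

        if (pid == 0) {                    /* child: the isolated worker */
            /* Drop privileges (requires starting as root; 65534 is the
               conventional "nobody" UID/GID on many systems). */
            if (setgid(65534) != 0 || setuid(65534) != 0) {
                perror("drop privileges");
                _exit(1);
            }
            /* From here on, the kernel enforces isolation: this process
               can no longer open files owned by other users. */
            printf("worker running as uid %d\n", (int)getuid());
            _exit(0);
        }

        waitpid(pid, NULL, 0);             /* parent retains its privileges */
        return 0;
    }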
I guess a better analogy would be: we’ve been building buildings for a long time, and while the details have changed, many of the overarching architectural designs have persisted for hundreds, even thousands of years.
All I hear in this criticism is: “It’s old, so it must be bad. It would be better if it were new, because new is better.”
Unix is surely a timesharing OS, true. But it had a totally different intent than most end-user-oriented OSes: asynchronous and parallel command-line jobs, also known as batch jobs. Batch jobs are the exact opposite of a responsive, interactive UI.
The best illustration of these issues is the difference in UI responsiveness between an OS dedicated to user interfaces, BeOS, and a hack like MacOS, GNU/Linux, Windows or even Android.
You can add any number of hackish layers to improve responsiveness, such as special I/O and CPU schedulers, or bolt on pseudo-real-time extensions; you cannot work miracles if the core layers of the OS were never intended for that.
I still marvel at the fact that in 2016, I/O waits can totally freeze most modern OSes, for no valid technical reason.
There is also POSIX, that useful but huge and outdated standard, with many insecure APIs which have to be kept for compatibility. Compatibility with buffer overflows?
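The canonical example is gets(), which was finally removed in C11 but survives in practice for compatibility – it cannot know the size of the buffer it writes into:

    #include <stdio.h>

    int main(void) {
        char name[32];

        /* gets(name);  <- the classic insecure API: it has no idea how
           big 'name' is, so any line longer than 31 chars overflows it */

        /* The bounded replacement is told the buffer size explicitly. */
        if (fgets(name, sizeof name, stdin) != NULL)
            printf("Hello, %s", name);
        return 0;
    }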
As a software engineer, I can totally understand why all these OSes converged to that ugly direction, the path of least resistance, but as an end-user, it is just baffling.
As a software engineer claiming knowledge of history, you might try actually using Unix. It was designed for interactivity (via a terminal), not batch jobs (via a batch dispatch system).
Exactly… a timesharing batch system is practically an oxymoron.
A batch system implies non-interactivity… whereas Unix has always been interactive (firstly the shell, and secondly editors etc., which are almost all interactive, though there are exceptions).
Also, Thom, you are wrong about automobile engines. Electric motors + batteries are only potentially better if they are supplied with clean power… which is a problem all of its own.
Incredible.
You are comparing the latency of a terminal display to the latency of a modern UI. You seem to have missed that they do not have the same response times, nor the same intentions.
A Unix terminal was never intended for realtime display. Reasonably fast display, sure, but not instantaneous. And think about a very frequent use case: the remote terminal. A modern UI, by contrast, is totally dedicated to realtime display and fast interaction responses.
Take vi, a good example of an interactive terminal application. It was largely designed under the assumption of a slow network connection to the Unix server. Now take a modern IDE – IntelliJ IDEA, Eclipse, Visual Studio, whatever. It is designed under the assumption that text must appear instantaneously, because it is a poor user experience not to see characters appear the moment you type them.
These are not the same use case; the expectations for UI and display response speed were not the same.
There was no constraint of realtime display on a terminal. On a modern UI, it is the primary requirement.
Terminals are teletype derivatives. Nope, interactivity was an afterthought on teletypes. [But a lot of effort went into making ash interactive.] Editors are applications.
Amazing how the software has been adapted and re-architected to support these things, no?
Untrue.
Even *if* all the electricity that powers your EV has been generated by a coal-fired plant, the EV is still cleaner than your average ICE, because the EV is much more efficient. Don’t forget that diesel only gets about 40% efficiency, and petrol is even lower than that. The higher efficiency of big power plants and of the EV outweighs the losses of energy distribution.
Also, don’t forget crude oil has to be refined before you can use it in your car, which costs energy, as does transporting it. Only if you include all of this in your equation will it ever be a fair comparison.
This gap only widens the greener the electric power is.
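A rough back-of-the-envelope of that chain (every figure below is an illustrative assumption, not a measurement):

    #include <stdio.h>

    int main(void) {
        /* Illustrative assumptions only - real figures vary widely. */
        double ev  = 0.40 * 0.93 * 0.90 * 0.85; /* coal plant x grid x charging x motor */
        double ice = 0.85 * 0.25;               /* refining+transport x petrol engine */

        printf("EV well-to-wheel:  ~%.0f%%\n", ev  * 100);  /* ~28% */
        printf("ICE well-to-wheel: ~%.0f%%\n", ice * 100);  /* ~21% */
        return 0;
    }

Plug in your own numbers; the shape of the comparison is the point.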
How many RTOSes are there in the mobile arena?
Probably a lot… since the baseband processor in just about all phones undoubtedly uses one.
Thanks cb88
Sorry, I worded the doubt in my mind badly: are there consumer-grade real-time full stacks in the mobile arena? [Optimized all the way up to the applications?]
Indeed. Thom seems to have an obsession with badly describing Unix (punch cards were never really a thing for Unix) every time a POSIX-compliant OS comes up, despite POSIX compliance not being equivalent to Unix.
Surprise! Even the jumbo jet has a lot of levers, screws, wheels and axles. You will also see a lot of wedges, pulleys or inclined planes used for its operation (repair, loading and such).
While some guys were busy creating C at AT&T, Xerox was already doing memory safe systems programming with Mesa for the Xerox Star.
https://en.wikipedia.org/wiki/Pilot_%28operating_system%29
Building on the inheritance of systems like the Burroughs B5000 from 1961, or Algol-68RS, used in UK Royal Navy computing systems.
This was further improved with the evolution to Cedar, which made use of automatic memory management and was used to implement things like the Alpine distributed file system at Xerox.
So don’t believe those UNIX revisionists who claim C was the very first systems programming language because computers weren’t powerful enough for anything else.
Where programming tools are concerned, the spiritual inheritance of Xerox lives on in technology stacks like Mac OS X, iOS, .NET, ChromeOS and Android, which eschew the PDP-11 CLI in favour of more Xerox-like developer workflows.
Then go invent something better. Why expect that someone else will do it for you? It’s nothing but a spoiled attitude and sense of entitlement.
I’ve found that most people who make claims like “we should have better” really have no idea what “better” is supposed to mean at all, let alone how to get there. It’s also ignoring the fact that “better” is a subjective term anyway. What is better for me may be far worse for you, and so on.
With this in mind, then, I believe a challenge is in order. What is better? Define it, in terms of a GUI system. What would your ideal system be? I’m speaking not just to Thom, but to everyone. Define and describe your ideal GUI. I suspect you’ll be surprised how many things described fall back on windows, menus, buttons, and other building blocks that have persisted over the course of more than three decades.
I think the folks that came the closest to defining an actual usable alternative were the guys from Etoile:
http://etoileos.com/etoile/
http://etoileos.com/etoile/mockups/jesse-2007/
Why do I need to know that Adobe Acrobat or Evince opens a PDF document? I don’t care, I just want to open it. And why is it that I need to consciously exit from it, open an email client, attach the document to an email and hit “send” in order to mail that document out? Why can’t I just be viewing the document and, out of the blue, email it to a buddy without stopping my workflow? That’s the general concept behind the Etoile approach, as I understand it.
Unfortunately, it never materialized.
You mean like the share button in iOS, OS X, and Android? Not a full implementation of the concept true, but at least quickly e-mailing something you’re reading can be done the way you want.
Somewhat, yes. I think the Share feature is a great step in the right direction. Another, more complex example: suppose you’re emailing a picture to your friend and decide that you want to apply a filter to it (whatever it may be) mid-email. Right-click on it, choose the filter, apply, and voilà. No need to “exit” the email client. Technically, an application providing that service would have been opened, the image passed to it, the filter applied, and the result passed back to the email client. But as a user, I don’t need to care that all that happened.
Yeah, I know what you’re getting at. I suppose the closest thing we have to that at present is Android’s activities, whereby part of an app can open inside another one. I haven’t seen it used properly very often, though.
You may not have checked it out, but Etoile is basically what those guys think an OS X-like interface should be… It’s even based on GNUstep and Objective-C.
You have to do this? I don’t with Foxit. Maybe you need a different piece of software. Or maybe this should be a universal option? Not all file formats are amenable to emailing, though.
You are right. Some have tried to introduce a different user paradigm and failed.
IMHO, Windows Metro/TIFKAM/Modern is one of those, and personally I think it failed miserably, but there are some who like it, so who is right? I don’t know, but at least they tried.
As a software developer for the last 40+ years, I wouldn’t know where to start making ‘something different’. Those two words give me a mental image of John Cleese sitting at a desk and saying ‘And now for something completely different’.
Thom is really hung up on legacy. If, whenever you develop a new platform, you don’t scrap what was built before, if you don’t build it entirely anew from the ground up – Thom won’t like it.
So the fact that our phones run variants of an unreformed, unrepentant file-based OS designed in the ’70s really, really bugs him. I mean, a lot. It doesn’t matter that those nasty old bits have been relegated to the edges and smoothly covered over with modern new APIs. Nope, Thom can apparently still see the flaws beneath the spackle.
Similarly with UI – I guess we have to reinvent the wheel to please Thom. Multi-touch and gestures – nope, not enough. Voice control – nope, not enough. Comprehensive package management, permissions systems and sandboxing – nope, not enough. The equivalent of a ’70s supercomputer, packed with tons of sensors, a camera 10x better than your dad’s, with a screen that makes your ’90s computer monitor look like a postage stamp, all in the palm of your hand – nope, not enough, because we are still using a 2D grid of pixels with icons and buttons. We should have better.
Everybody can.
Every layer is yet another source of performance degradation, bugs, security issues, maintenance costs, and so on. The more layers we keep throwing atop this outdated operating system, the more fragile it becomes, leading to even more layers and APIs to keep things stable, and so on, and so forth.
It’s not very different from automobile engines, really. We’ve made the internal combustion engine remarkably better over the years, but there’s going to be a point where continuing its development is not going to be worthwhile vs. moving on to electric engines.
The same is going to happen to operating systems eventually. There’s going to be a point where the delicate balancing acts that are our operating systems simply won’t be worthwhile anymore – and that point is approaching faster than most UNIX-as-a-Religion people want to believe.
I can’t wait for the house of cards to come tumbling down.
Yeah, nice. It’s not the way I want it, so I can’t wait until the whole thing burns. Nothing spoiled about that at all, no sir. Pretty fscked up, I must say, especially because you’d get absolutely nothing more out of everything crashing down than your own vindication. Pathetic!
How is it fucked up to hope for something better to replace what came before? Is it “fucked up” of me to want electric cars to become affordable enough for everyone, so that cars with internal combustion engines slowly fade away?
What is it with you UNIX people that inspires such religious fervour? It’s just a bunch of ones and zeroes, nothing more. Wouldn’t you rejoice if something better came along?
No, because they rejected the improvements Plan9 and Inferno brought over UNIX.
For them, filling the screen with XTerms and vi/Emacs in 2016 is progress.
If that is one’s understanding of what a computer should do in 2016, they will never get the point of systems like the Xerox Star, developed in the same decade in which UNIX development started.
Every layer is a battle tested, hardened piece of code that we don’t need to re-invent and debug.
Android used Linux because it provided decades worth of development and expertise – *for free*, all with modern APIs, a high level of security and hardware isolation, and extremely good performance.
I mean, what you are saying makes literally no sense. How would a from-scratch OS not have to deal with security issues, performance issues, and maintenance? Why wouldn’t you pick an OS that already does what you want, has already had most of those kinks worked out, and – oh – is being actively developed, for free, by thousands of developers worldwide?
I mean, I don’t know about linux-as-a-religion, but that sounds like a pretty good deal to me from a purely secular, pragmatic point of view.
Dinosaur-sized OSes are inherently fragile without a lot of political support. In that sense, an OS can suddenly crash down like a house of cards.
But politics, again, is what prevents mutiny and the burning of the caravels.
And yes, Thom is right in that layers are conceptualization artifacts used to divide REAL problems, which so often are integrative by nature. The limits of systems theory.
“There’s going to be a point where the delicate balancing acts that are our operating systems simply won’t be worthwhile anymore…”
For an extinction to occur, a replacement should be lurking around. Can’t see it…
…but yes, we should be looking among the small, rare, irruptive beasts. It took a humble Acorn with a 32K-transistor-count CPU to stand alongside [population-wise] the x86 architecture.
It has been almost 10K years of the wheel as transport tech. The time will come when irrelevance hits it. Hopes are for the better.
” – and that point is approaching faster than most UNIX-as-a-Religion people want to believe. ”
Research is quickly distancing itself from algorithmic programming. Is that what you mean?
Some of us, back in the day, actually got to work on these machines. First the Alto, with its 5 MB removable 16″ disk platter. . . . Portrait monitors made for GREAT e-mail clients. . . Heavy, though, made out of cast metal. . . OK keyboard. . Fully networked to servers and high-end laser printers; the only goofy part was all the cables hanging down from the ceiling. Yea, good ole 10Mbit coax Ethernet. . . .
The Stars were a GREAT upgrade. We ran them in “Dandelion” mode, which was more like a KDE-ish environment. Still GUI-based, but without the fancy folders, etc. from the Star stuff. The Tools group wrote all of our compilers for our copier products. Yep, I did programming on low-end copiers between ’83 and ’85. 8051s mostly, since Xerox had “special” ICEs (In-Circuit Emulators) custom built for debugging. . . Way cool at the time.
And that’s why I got me a Mac when they came out. The first one was a 512K Fat Mac – yea, it was a “step down” from what I had at work. . But I didn’t have to purchase the Alto/Star myself, either.
Fondly remember those days. . . .
Did you program on them?
From reading the Mesa and Cedar documentation, I get the idea that our modern IDEs still have quite a bit to catch up with them.
Also the way the OSes were extensible.
“Until we perfect the “machine responds to the human mind” interface ”
Then take care of ‘sins of thought’; those would be known to the whole world. Your iButler could actually perform them before you uncommit. Keep guns out of reach of your iButler.
An interface’s main function is to enact a FRONTIER.
This is the same line of reasoning as for keeping non-products private. The law can’t go after thought, no matter how convoluted. A whole life of education is dedicated to producing civilized output from it – the effort of fathers, educators and hundreds of well-intentioned citizens, beyond the self. The tools an individual uses in this educative process are their own, and nobody else’s.
Robots are unthinkable without a ‘small-ethics’ chip. Not even the best AI farm of today has those.
Every time I try to delete the last typo in this interface and miss the target by a single key, this whole session and my browser instance are gone.
My interface is failing at its main function, as a frontier.
This product|non-product distinction is going to be defining.
My Internet navigation history is a non-product. It expresses selectivity in consumption, but I’m producing nothing with it. It’s a memory-helping tool – its original intent.
If I make a list from it and send it to somebody else, then it becomes a product.
Having been in this business since 1968, I have seen it all! This discussion about user interfaces has been going on for centuries… The “simple machines” mentioned are easily understood and, while refined by every generation, are essentially as they were originally invented. Ah ha – the same goes for computer user interfaces (read: machine user interfaces). One can only do so much. Until we perfect the “machine responds to the human mind” interface (hello, Forbidden Planet), we are doomed to merely “improve” the icon, point, click interface, and not much else. The limitation is not the interface but the inability of humans to interact with it through anything other than the five human senses.
I’m not sure I’d want brain interfaces anyway. Some of the things I think should never be let out of my twisted mind.
Are there nice apps with distance/force sensitivity for the iOS phones that now have them? I’d like to tune in whatever that is…
Surely multiscreen setups have multilayer deco window view management, or some other wonderfully sensible WM.
Displays have had nice IR multitouch tracking at good distance for a while… why not this one at tiny cost, and why not let me point at an article or byline, tap delete just so, and never see it again except as a greeked inset? It shouldn’t matter whose forum it’s on if the user isn’t trying hard to play anon, or as another functionary; render as I ask, of course.
The memory for sensibility is there; the phone or desktop or laundry console should have a good idea why it would beg off opening a PDF with forms, and say as much, ready to pivot to FoxIt or LibreOffice, choose VMs from a long ways away, etc.
Quite possibly the OSAlert newsletter could have a tooltip frame, for example; someone could share the scriptlet for threaded comment reading; and there could perhaps be forward and autoreply buttons like the delete one, without so much cobbling. Similarly, this form shouldn’t pop in blank just because I allowed an ad script (in vain, it seems), when Firefox ESR supposedly has nudges in it to keep the typed text…