The only place in San Francisco still pricing real estate like it’s the 1980s is the city assessor’s office. Its property tax system dates back to the dawn of the floppy disk. City employees appraising the market work with software that runs on a dead programming language and can’t be used with a mouse. Assessors are prone to making mistakes with the vintage software because it can’t display all the basic information for a given property on one screen. Staffers have to open and exit several menus to enter data as simple as an address. To put it mildly, the setup “doesn’t reflect business needs now,” says the city’s assessor, Carmen Chu.
San Francisco rarely conjures images of creaky, decades-old technology, but that’s what’s running a key swath of its government, as well as those of cities across the U.S. Politicians can often score relatively easy wins with constituents by borrowing money to pay for new roads and bridges, but the digital equivalents of such infrastructure projects generally don’t draw the same enthusiasm. “Modernizing technology is not a top issue that typically comes to mind when you talk to taxpayers and constituents on the street,” Chu says. It took her office almost four years to secure $36 million for updated assessors’ hardware and software that can, among other things, give priority to cases in which delays may prove costly. The design requirements are due to be finalized this summer.
This is a problem all over the world, and it’s more difficult than one might think to replace such outdated systems. Existing data has to be transferred, a new system has to be designed, staff has to be retrained – and, of course, since it’s not a sexy subject politicians can flaunt, it has to be done with impossible budgets that inevitably balloon, often leading to doomed projects.
It’s easy to laugh at these outdated systems still in use today, but often, replacing them simply isn’t an option.
No one wants to put their head on the block. On these kinds of projects, if everything goes OK, no one notices, so there’s no personal advantage in doing it; on the other hand, if anything goes wrong, then everyone notices and you’re screwed. The only winning move is not to play and just hope that nothing blows up while you’re in office.
The only winning move is to hire a consultant, get a 3rd party to do it, then complain about the mistakes they make* and how much better everything was in the past.
* While conveniently leaving out all the old issues, and the fact that nobody could work with that system anymore, while the new system works mighty fine for everyone.
This reminds me of the LRT-1 and 2 lines here in Metro Manila being so old the computers controlling them use 5.25″ floppies.
Old software is not a problem if:
1. It’s well written for the purpose to begin with
2. It still does the job just fine
3. It’s not accessible from the outside world
If these 3 conditions are true, then it’s perfectly, absolutely fine to run 20-, 30-, or even 40-year-old software.
spambot,
Great. None of them applies to this situation.
The software is no longer well written for the purpose. The purpose has grown in scope, complexity, and size.
It doesn’t do the job fine. It’s error prone and slow.
It’s not even accessible to the people who use it to do their job for the city.
kwan_e,
I’ve had some work maintaining legacy systems…it’s really some of the worst code I’ve ever seen. Despite my opinion that a lot of legacy code should be put to rest and replaced with a clean slate (particularly when it comes to databases), it’s still around and ticking because the companies are attached to it. I do have one gripe with your list: if you can run the old software on modern hardware, it typically runs circles around modern frameworks that are bloated and require beefy servers & workstations just to get mediocre performance.
I suppose you may have been talking about the performance of human users rather than the software itself, in which case, yeah, modern software usually benefits from having more information density on every screen than those old 80×25 displays, which required a lot more navigation. I kind of feel this has regressed a little with the emphasis on mobile (and Windows Metro-style apps), where information density has ticked back down.
Alfman,
Unless we’re talking about mobile Web interfaces, in which case information density is much lower and the interfaces are much slower and even more cumbersome.
I’ve worked with mainframe programmers who are far more productive on a 3270 (albeit with a 132×27 screen) coding in C than programmers forced to use things like Eclipse-based environments.
It comes down to matching the interface to how people actually do their job, and it sounds like this particular city department’s software was designed by architects with big ideas about how things should work in their neat little pristine world.
kwan_e,
Yep. It’s funny that back in the day I thought that software could be designed to run well on both the desktop and mobiles. But now that I’ve seen enough counterexamples in practice, I think the compromises we make for one end up making the experience miserable for the other, and we’re better off keeping them separate. You’ve got an information density problem and a blunt input problem. A keyboard and stylus help with poor density on mobiles, but those are the exception rather than the rule. Mobile interfaces are becoming the new least common denominator. Just today Thom posted an article about “Bringing iOS apps to macOS using Marzipanify”. It may prove to be a sucky future for those of us who prefer desktop computing.
It’s probably been over a decade for me, but Eclipse was unbelievably slow on my work machine at the time. Even a tiny project could bring the machine to a crawl. At the time, Visual Studio worked so much better (though I feel it’s gotten worse in subsequent releases). It’s overgeneralizing a bit, but every new generation of hardware seems to bring less efficient software development tools and frameworks. Of course, there are still some of us who do things the old way; I guess we’re the dinosaurs now.
The problem with replacing old software is that the magnitude of the project, the expected gains AND the opportunity cost of not doing it cannot be easily quantified.
I consider old software a bigger long-term challenge for humanity than nuclear waste and orbital waste, because we don’t know the magnitude of the problem. That is, how much knowledge has been accumulated in old software (the same software that might have all kinds of limits and date bugs), and thus how hard it would be to rewrite that software.
Related reading:
https://www.joelonsoftware.com/2000/04/06/things-you-should-never-do-part-i/
The article is misleading – the hardware and OS are still running reliably 35 years later. Maybe they should spend the money to pay a company to develop new software on the AS/400 platform (which can run modern web apps too). The line about not being able to use a mouse is a red herring.
More likely, SF needs new software that meets its needs. Most any company would write that for you for 1-2 million dollars. There are AS/400 dev shops out there.
So, similar pitfalls to those faced by OSAlert recently (and unfortunately it seems that not all went well with its data migration: long URLs in posts from the old OSAlert didn’t migrate properly; they’re cut to what the old site displayed rather than what they actually linked to).