I know all this because I remain a hopeless computer tinkerer who happened to come across a Quadra 700 around the start of 2020. Unlike my road test of the IIsi for Ars back in 2018, the Quadra 700 presented a tantalizing opportunity to really push the limits of early '90s desktop computing. Could this decades-old workhorse hold a candle to the multi-core behemoths of the 2020s? The IIsi turned out to be surprisingly capable; what about the Quadra 700 with its top-shelf early '90s specs?
The Quadra 700 is such an enticing machine. Clean, elegant, and capable for its time, I’d love to play around with one today.
I suppose if you leave security to some other benign network interface you might get some use from this sort of system, but otherwise the apps and OS are grossly out of date and wide open to many attacks.
We have several old PowerMac G4s we keep for legacy/compatibility purposes, but they are air-gapped because otherwise they would almost certainly be super-spreaders!
Is there actually anything floating around out there that would spread to a G4 running Tiger just by plugging in a network cable? I mean, you would have to install a browser and either download something or browse nefarious websites, right? There isn't any software just calling out to compromised sites (one would hope).
From my understanding, the only usable browser is TenFourFox anyhow.
Just wondering, because so many people think just plugging a machine into the internet makes it an instant cesspool of viruses, and it isn't like licking ATMs at all…
What can you do without a browser? You can't run any up-to-date desktop apps!
Perhaps you can drag out those Zip disks and go over some old, old emails from a 1999 backup!
We considered TenFourFox but it's wide open.
Ha, to me you don't need up-to-date apps. I would rather use a much older version of Office if I had a system not connected to the net, but I also know that on, say, a WinXP system, even just connecting to the internet will get you infected. My question was: is this really true for a PPC Mac?
Of course it is. Only Apple tells you they are immune, and they've spent decades denying any MacOS viruses exist!
But infections are only part of the story; weak encryption leaves passwords and other transmitted data wide open.
There are a few issues with the security of older apps and systems, but nothing that needs people to start hyperventilating. A few features and maybe a new IP stack should sort most people out. There's no reason why web browsing or even video can't work on an older or more modest system if bloat weren't an issue and there were a hardware H.264 decoder.
The point of articles like this is that they show how capable older systems are. Apart from a couple of things like art and games and video, they are still competitive. The actual bits of the system needed to bring the experience up to a modern platform aren't that many. Not everyone needs a multicore monster with the latest GPU and 32GB of memory costing god knows how much and which is obsolete within two years.
Why is two years the cut-off date? For business the metric was five years, and it was a pretty good one financially and psychologically. The two-year cycle is driven by financiers and marketing, and it's expensive and bad for your mental health.
The lesson from an article like this is to step out of the "manufactured consent" bubble and look at things differently. Ask yourself why we need ABC and what the real value of XYZ is, and so on.
Back in the day VCRs had gone as far as they could and IHVs began stripping features from lower models and introducing features on now more expensive higher models. They did this to keep selling "new" and to "nudge" people into paying more, when really a five-year-old VCR had all the top features and worked well enough if you looked after it. Cheaper components which don't last, and the collapse of repair shops and even repairability, meant people kept buying. The gee-whizz and novelty of IT has got old, and they are all up to the same old bad tricks and monopoly practices of "old" business. Articles like this help people see things differently from what expensive and persistent marketing wants to shape.
@HollyB.
“Back in the day VCRs had gone as far as they could and IHVs began stripping features from lower models and introducing features on now more expensive higher models.”
Those 'cheap' VCRs were being sold for a fraction of the cost of production by Chinese OEMs. IIRC Walmart was selling a model for about 1/3rd the teardown cost. [The Chinese OEM that made them went bankrupt with around $700m in losses.]
Only the high-end Japanese VCR models were sold for anywhere near break-even price.
@Brisvegas
I'm discussing the 1990s market in the UK and have no idea what was happening in the US, nor do I have any way of checking whether your claims are true rather than just taking your word for it. Citations, please.
The VCR I had was from a previous marketing generation. It was on par with the ones selling for more, and it had cost less. For the last generation of VCRs I doubt anyone was selling at a loss; more likely they were playing at price manipulation in what was then a shrinking market. People who had VCRs didn't need a new one, and new technologies were beginning to come in, such as DAT and other gimmicks, before DVDs took off. The whole market was in flux at the time, with a few companies making bad decisions and others getting lucky.
I agree the Japanese have a thing about quality, but their internal market is a little different, so they could sustain this. You will note they priced themselves out of the market and also lost innovation momentum.
In the UK around this time, and certainly in the subsequent decade, there has been a lot of manipulation of bank lending and investment rates and the price of staple goods. This is in tandem with structural changes from retail to investment banking, and from high street to out-of-town shopping. It's all designed to inflate margins while ducking regulators, outsourcing costs, and using highly leveraged debt. You can also throw in intense lobbying on tax and a massive redistribution of income.
Discussions today about big business and anti-monopoly investigations and so on are just the end point of a lot of historical ducking and diving. It can be difficult to see without a global view and they rely on this.
You could try to breathe new life into the G4s with MorphOS.
https://wayfarer.icu MorphOS browser based on the latest WebKit source
https://iris-morphos.com MorphOS IMAP Client
Yes been doing that since YellowDog was a pup!
Classic MacOS has just about zero concept of security. Going online with it nowadays will probably not be much of a risk, just because there's zero interest in it.
However, you could always encounter the usual bored port scanner that will just be creaming their pants about all that room for activities with all those open ports.
Not that you can do much modern web stuff with that system anyways.
The superspreaders of this world are the vast majority with their fancy smart appliances, popular operating systems and cheap factory set routers. Retro enthusiasts worry about floppy mold.
This article was a nice glimpse into what was, and still is, possible with what is now very modest processing and graphics power compared to today's machines. This leaves me wondering what the design would be for a minimally effective system today covering ordinary office tasks, watching video, and web use (sans bloat). I think this kind of article helps reveal how overblown most of our hardware and software is today. The point about choosing components built to last is another interesting one, as is, from a usability point of view, having a clutter-free environment.
"This leaves me wondering what the design would be for a minimally effective system today covering ordinary office tasks,"
A cheap Raspberry Pi 4 or 400 would work fine.
I was speculating about what a minimally effective system would be from a clean-sheet point of view. No expense spared; compromise later. By this I mean examining the global problem and stepping through each element. I already know about the Pi. That's the boring answer for people who don't think.
I am afraid this whole "bloat" obsession is more like a psychological thing. This odd need to humble-brag or whatever.
Most people use computers to get shit done, and it turns out that the "bloat" some of y'all are obsessed with is what makes a lot of other people far more productive.
For example, with some of the frameworks available in the programming tools I use, I can now do in a few seconds what used to literally take days a few years back. The same goes for the EDA tools and even the Office suite I use at work.
If your needs are very basic, you really didn’t even need to upgrade in the past 10 years. That’s how fast things have gotten.
I'm presenting a very well defined scope while noting issues like graphics and games, so obviously I have an idea about use cases and workloads. Yes, I already know about OS functionality and frameworks and specialist uses. I know the difference between lean-and-mean versus bloat, and the difference between capable general-purpose systems and narrow one-function embedded systems. Yes, I'm aware of the 80:20 rule, etcetera. Your comment is basically from your point of view. In the legal field this is what is called "irrational" or "political". Don't start hyperventilating over these terms. They are technical.
If I had a magic wand, personally, I’d like a proper academic study done of your work. You might be surprised how much bloat is needed to cope with bloat and how little you actually really need in practice.
You get bloat in the medical profession and the civil service. Bloat enhances illusions of status for egomaniacs. Bloat creates work for non-jobs. I direct your attention to the terms "minimally effective dose" and "adequate".
Apple's M1 system is actually a minimally effective system using a few extremely old tricks such as prioritisation, throughput, and unified memory. It's a "least effort" system with a lot of bang per buck, or "punching above its weight". Apart from artists (who work with 20MB raw files which end up as minuscule images on 1080p screens) and gamers (who play games with insanely polished pixels but less gameplay than 20 years ago) and coders (who compile for bloated systems to make more bloated systems), I suggest 90% of people don't need a system like Apple's M1 in the real world, if truth be told.
I’m not against progress. I’m just against the cult of bloat and the idea that just because YOU need 16 cores at 4GHz with 64GB of memory and a GPU heading in the same direction and an SSD raid array that everyone else has to subsidise your hobby.
HollyB
“If I had a magic wand, personally, I’d like a proper academic study done of your work. You might be surprised how much bloat is needed to cope with bloat and how little you actually really need in practice.”
And you might actually be surprised how it'd probably take you longer to even begin to understand the domain of the problem than it would take my "bloated" solution to finish.
Anyhow
Computers, at least for me, are mainly the tools I use to solve problems in my domain of application/expertise. And "bloat" is a completely subjective qualitative term. But one thing is for certain: a "bloated" solution that exists is infinitely more useful, to me, than an ideally lean one that doesn't.
FWIW, technically I am the one subsidizing your hobby; usually the high margins in the premium tiers are what help pay for development of the value and low-end tiers.
@javiercero1
For a PhD who claims to be an expert, you sure have a habit of getting your opinions from marketers via newspapers. In actual fact it's the rest of us subsidising the rich, if you take a look at supply chains and income distribution.
Discussions with zealots are boring.
“I am afraid this whole “bloat” obsession is more like a psychological thing.”
Windows 10 has nearly 1000x the minimum hardware requirements of NT 3.51 and ~100x those of Windows 2000. The basic functionality has barely changed. So the bloat is 100% real and nothing to do with psychology.
"This leaves me wondering what the design would be for a minimally effective system today covering ordinary office tasks, watching video, and web use (sans bloat)."
If it only used applications written in a low-level language, no legacy support, and hardware acceleration for everything (e.g. an ASIC-based console), it could use incredibly low-end hardware. Even the humble Commodore 64 could run a GUI.
Yeah… This would be a pretty extreme implementation.
I'm kind of wondering if the idea could be extended via a nice fat slot for plugging in a go-faster card (either a CPU or GPU) and something like Thunderbolt to an external box like a console. You could say the modern PC, with its internal slots for GPUs, and gaming via the cloud, are the same idea, just louder and noisier and hidden behind a paywall; conceptually it's basically the same.
Sadly, privacy-invading big businesses that want to make an easy billion have trampled all over the home user and small business.
@HollyB
My idea is for a 'Lego' PC. Every component would be a standard-sized self-contained block. Each block would connect to a baseboard/backplane using ultra-fast slots (something like M.2). It would use a replaceable open UEFI that allowed you to run virtually any combination of hardware depending on your needs. You could add extra CPUs, GPUs, RAM, and storage as modules as needed. It would be largely future-proof (10+ years) if implemented properly.
@Brisvegas
Thanks for getting it. A friend of mine had the same idea years ago. Taking your basic idea, I'm sure the design issues, including aesthetics and scalability, could be refined. There's also the multiuser and networking angle. For a larger home or the office there are things like local or remote thin/thick client-server/minicomputer/cloud types of ideas, so it could have expandability. Beyond a certain point there's no reason why 20+ years of longevity or more isn't achievable.
I don't have the schematics for the bus on the bottom of my laptop which plugs into the dock, and would never get around to it anyway, but Lenovo used to sell a dock for another laptop model with a bay for a full-length GPU card. There's also GRiD, who had a modular system, so examples exist.
The way I see it, going back to basics with a clean-sheet view of the problem can help people look at things differently. None of us owe $100 billion companies an income either.
“In contrast, taking the IIsi through its paces was a joy. ”
Yes, it is fun to work with classic hardware. However, it is also a waste of resources to use it in daily life (except for a few valid reasons).
I, for example, switched from a desktop to a NUC and actually reduced my computing power by 50%. But the electricity consumption was reduced by much more. Most of the time I don't need the fastest system to do my job (I remote into the office and build code on a very beefy system when necessary). Having a silent, stable, and not-power-hungry system makes a big difference.
Long story short: get an Apple M1 Mac instead. The Mac Mini is finally a good choice again.
My old desktop, before it went up in a puff of smoke, was an AMD Athlon 64 X2 4200+ with 8GB and an ATI HD 2600. This was a huge step up over what I had before and excellent for office stuff and development work. I used to play games back then too, and it played everything new at 100% for two years. It was so ridiculously powerful I couldn't believe it, and nobody else online or offline noticed. Then things changed overnight as new games came in, and over six months my machine dropped down to 50%. I'm no longer interested in games. There is nothing around with the same magic as the original Tomb Raider or Deus Ex. Everything is too serious and too expensive and too toxic.
As a replacement I bought a used, excellent-condition Thinkpad T520 and dock. It's small and quiet, and with a new display uses hardly any electricity. I also swapped from halogen lights to LED lights. My electricity use is one third of what it was before. I would have bought a NUC and bolted it to the back of my display (too expensive) or a Raspberry Pi (not powerful enough). The laptop is the best compromise I could afford.
I would still be running Windows 2000 if I could get away with it. I stuck with Windows 7 as long as I could and only installed Windows 10 last year.
When I coded I used an abstraction layer which catered for different systems. It handled x32 and x64, different OS targets, different compilers (Watcom through to Visual Studio and all its versions), and different platforms from PCs to a range of game consoles. At the time Microsoft began to play funny by artificially restricting DirectX from installing on older platforms and dropping support in Visual Studio for older platforms by "mistake", by dropping key system calls and dropping older OSes from support. Then there was the havoc of changing their useful help system to an HTML system which didn't work and, from what I gather, still doesn't work. More recently they sacked their entire technical writing team, dumped their testing team, shifted priority from Windows to cloud services, and have tried every trick in the book to take control of the entire PC platform, from secure boot to locking out alternative OSes to pushing Microsoft accounts to forced updates. Almost all of Microsoft's competitors for compilers and office suites have been driven away or barely exist. Adobe has pretty much done the same thing for art software. Apple are no better, and to some degree even worse.
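To go back to that abstraction layer for a second: I mean nothing exotic, just compile-time switches hidden behind a handful of plat_* calls, with one implementation per target. A minimal sketch in C of the shape of it (hypothetical names, far cruder than the real thing was):

/* platform.c: a toy cross-platform layer (illustrative names only) */
#include <stdio.h>

#if defined(_WIN32)
  #include <windows.h>
  #define PLAT_NAME "win32"
#else
  #include <unistd.h>
  #define PLAT_NAME "posix"
#endif

/* the only calls application code is allowed to make */
static void plat_log(const char *msg) {
#if defined(_WIN32)
    OutputDebugStringA(msg);      /* debugger-friendly on Windows */
#endif
    fprintf(stderr, "%s\n", msg); /* everywhere else (and also on Windows) */
}

static void plat_sleep_ms(unsigned ms) {
#if defined(_WIN32)
    Sleep(ms);
#else
    usleep(ms * 1000u);           /* good enough for a sketch */
#endif
}

int main(void) {
    plat_log("running on " PLAT_NAME);
    plat_sleep_ms(100);
    return 0;
}

The application code only ever calls plat_*, so compiler, OS, and x32/x64 differences stay confined to one small file per target; that's roughly how one codebase can cover everything from Watcom to Visual Studio to consoles.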
Over time with IT I have gone from "OMG, exciting" to "Ugh, how is this thing going to gouge me and hurt me?". Maybe it's age, but I don't find IT or software development fun at all. I'm now just an end user: office apps, email, browsing, and some art and stuff. How people are sucked in by lock-in or the cloud, and high prices and subscriptions, I don't know. It's a loss of control and rights, and expensive.
Laptop + dock definitely works, especially for newer ones with Type-C or Thunderbolt that take only a single cable for everything. One thing I noticed, though, is that you need to keep the lid open. Even if you are not using the screen, cooling does not work 100% with the lid closed.
Microsoft supporting old OSes and software is actually an admirable thing. Raymond Chen's blog has really interesting tales from backwards-compatibility work: https://devblogs.microsoft.com/oldnewthing/author/oldnewthing (look for the older Win95 articles). Hint: it was not easy.
Compare that to Mac OS or even Linux, where there is almost no binary compatibility after two major releases. You cannot even apt update a non-LTS branch after two years.
My laptop is a bit old, so it was on the edge of USB 3 coming in and ExpressCard going out. I only have USB 2.0 and the old, useless low-bandwidth version of Thunderbolt. It doesn't have USB 3 (boo, hiss), but it has ExpressCard, which I can use with a GPU adapter when I need it. I have two docks (the SATA version I'm using and the USB passthrough version in the cupboard). I have a duplicate laptop as a backup/travel laptop. USB 3.0 and Thunderbolt for an external GPU would have been nice, but what I have works.
My laptop is fine with the lid down, docked, with an external 28-inch screen and external keyboard and mouse. (It's the dual-core CPU, not quad-core, and bog-standard Intel graphics, not Nvidia, so it runs a little cooler.)
Microsoft did get a lot right and has some very, very good code in there. Company business practices and userland are a mess, though. I'd be fine with Mac OS or Linux, but there's too much walled garden or friction. I've read some of Chen's stuff on Windows 95 and the NT-branch shimming. Lots of compromises and bad decisions in there. Microsoft made a few dud decisions too, with kernel compatibility and compilers and VMs and 16-bit stuff. On the plus side, the very stable driver model and Win32 backwards compatibility were good. The amount of voodoo I needed on Linux to get the battery driver working (or TV adapter dongles) isn't something anyone should have to go through. Whether Linux would or wouldn't do UEFI and sit happily with VeraCrypt and secure boot and Windows 10 was the last straw. Why isn't BitLocker an open standard? Why do Windows and Linux throw a fit over hibernation? Do Linux developers have anything like the quality of documentation of Microsoft or the Windows Driver Development Kit? Why did Intel strip OpenCL support from its display driver? Aaaaaaagh.
I'd say it's both age and a severe misunderstanding of what the goal of the IT field is.
Most people use computers to get work done; they really couldn't care less about the underlying things. Subscription models work because computing is now another commodity/utility.
Just like you don't think twice when you pay your electric/water bill. I am sure there's some old-timer who is all about homesteading lamenting all those fools who don't generate their own electricity and don't spend their weekends prospecting for water wells and emptying their septic tanks by hand pump.
I have a PhD in the field, but you know what, it makes perfect sense for me to have an Office 365 subscription so I don't have to think about my software being up to date, and I can have all my documents accessible anywhere. The cloud makes so much sense when you remember spending all that time organizing your data and making your own backups like we had to in the old days.
Age and a severe misunderstanding of what the IT field is about? Erm. I'd say experience, and I'd point to every other comment I've made since joining as evidence that I know what I'm talking about. A PhD and a job title on their own mean squat without the expertise to use them.
HollyB
Yes, age and lack of understanding.
The common theme I have noticed is that a lot of people seem to get stuck with the implementation, of the general concepts of computing, from the time and place when they learned about the field.
It happens to everybody. That's where the "back in the day" from every generation comes from.
I honestly think there are 2 laws of computing regarding this:
– Whatever is considered "bloat" and "inefficient" becomes, after 10 years, the standard of "lean" and "efficient."
– The death of x86 is predicted every 5 years.
@javiercero1
You're arguing perception, not technical issues. What I said about PhDs and lack of expertise still applies.
javiercero1,
You've used ad hominem arguments against me too, but let's be honest: these are cheap shots that add nothing whatsoever to the merit of an argument. Criticize the arguments themselves and not the people making them! Otherwise it's just a cat fight; let's please move on…
Oddly enough we probably agree here. Industry standards evolve such that what was once considered slow and bloated becomes the new norm.
Hmm, is this just anecdotal or do you have evidence for this "law of computing"? In my experience pretty much the opposite has been predicted since the Wintel monopoly started. Is it possible these predictions were different in Apple circles?
HollyB
I don’t think you comprehended my point.
@javiercero1
You’d be surprised, sweetie.
Alfman
I am an adult like you; please refrain from telling me what my arguments are or aren't, what it is that I should say, or how it is that I get to say it. Your ad misericordiams are not any better.
FWIW, the whole "x86 is dead" thing has been a meme since RISC chips came to market in the late '90s.
@javiercero1
You're the only one bringing up x86 versus RISC. I have no idea what this point has to do with older machines, or the PC/workstation/mainframe meta discussion, or business practice and end users.
javiercero1,
Ok, but in the future, just stop using ad hominem attacks on me or anyone else, fair enough?
javiercero1,
That's not the whole story. In many instances it was a long uphill battle for cloud vendors, and a lot of consumers in the industry weren't really thrilled about it. Given the extreme persistence of vendors pushing subscription models and allowing the traditional software to stagnate with no investment or upkeep, consumers eventually cave. Take something like the Adobe suite: customers are still resentful over that, but they just don't have a choice anymore; Adobe won in spite of "the cloud" and not because of it. Or look at QuickBooks, whose developer put all development resources into the cloud versions and left the old local version in major neglect. One of my clients is deciding right now whether to switch to a cloud subscription that they don't want. They may end up going with the cloud subscription version, but it won't be because they want to buy into the cloud subscription model; it will be because the traditional software version is in such abysmal shape!
Cloud applications did spread like wildfire in the freemium space, though. When you've got a lot of cheap people who don't want to pay anything, they're not very picky about owning or controlling the software, and they fit into the advertising model like a glove. For better or worse, these freemium services have cut deep into traditional software vendors, accelerating their demise. So, in a way, the increasing role of advertising killed the traditional software model, not so much consumer demand.
Alfman
This happens all the time when new disruptive tech/business models come along.
The new model may not fit the needs of all customers, but it tends to fit the needs of more customers.
Cloud won because it is the superior product. It turns out that most people find the value proposition of having your data available anywhere on any device, without having to worry about backups or keeping the software up to date, far more enticing than managing applications and data locally with all the attendant maintenance.
Cloud is also the superior business model, because you can offer ad-subsidized "free" products for people who would not otherwise buy your product, and you can also target paid subscriptions at those who will buy the whole product and your support.
I understand that may suck if your livelihood is still tied to the old models.
But this happens all the time. I remember, as a kid, old farts who had memorized all those arcane WordPerfect for DOS commands freaking out because they couldn't compute that far more people liked being able to print a letter in Word for Windows, out of the box, without even having to read a manual.
javiercero1,
You didn't address my point anywhere in your post. Sure, given the choice between obsolete, poorly supported software and a well-supported "cloud" version, there's not much question which is the superior product. But we should not mistake correlation for causation.
I think it's fair to say there are proponents of each, and that is fine. But it's also fair to say that many vendors are abandoning desktop software because they prefer the subscription model, not because their customers demanded it. They ran the numbers and determined that it was better for them to sell the same software over and over again as an online service paid through yearly subscriptions than to have to work so hard to convince users to buy the same software over and over again.
Nope, it’s not about me.
Alfman
I addressed your opinion by giving mine.
Correlation obviously does not imply causation, but context explains a hell of a lot. Successful companies try to maximize their profit while guaranteeing there's customer demand for their products. They are not charities, so if the cloud model works best for them, that's what they are going to chase. The cloud model also exists because it meets more customer demand than the previous paradigm. It works both ways; that's why there has been explosive growth there.
I totally understand that there may be cases in which the traditional system works better. But if there's not enough demand, that's unfortunately the way she goes, at least for things that are not natural monopolies.
More people and organizations see a value proposition in the cloud model than don't.
From your post I get the impression you think that's the result of some conspiracy to force the cloud on unwilling customers; I am just seeing traditional supply-and-demand dynamics at play.
@javiercero1
You are no longer discussing the technical merits of older machines but persisting in reframing the discussion for irrational and political reasons. Business decisions are liable to regulators and the law and customers. Hiding behind "we're a business and not a charity" has been used more than once as cover for inadequacy and discrimination and shady behaviour. Like "BUSINESS" is some kind of tough-guy magic word you learned off the internet or from a right-wing friend which is supposed to end all discussion? It's not the 1800s now, sweetie.
Ultimately, if you follow through all the governance issues and regulations and law, and perhaps even the individual policies of the businesses themselves or of the people who work for or supply them, business is a matter of public interest.
javiercero1,
You can have opinions; that didn't rebut my point though!
So you're actually making my point here: vendors embraced subscription models because it was in their financial interest to do so, not strictly because consumers wanted it.
No conspiracy needed; in fact I'm glad you bring up supply and demand, because my point is that they're both important here! Here's an example: Best-Yet was the only brand we ever found that offered dairy-free ravioli, but the chain was bought out by another chain and now there's no more supply that we've been able to find. Just looking at the sales numbers you'd see an immediate drop in sales for this particular product, and you can take that fact to the bank… but to conclude that this is entirely or even mostly due to changes in demand would be misleading, because you have to consider both supply and demand.
You can look at the sales of Adobe's cloud products like I mentioned before, but if you don't consider changes in supply then you may be missing parts of the picture. In all likelihood it's not entirely one or the other, but the point is that without a way to measure these more directly, changing sales do not prove in and of themselves that consumer preferences were the causal factor.
Before Adobe's cloud move, the "Photoshop alternatives" were limited: the open source GIMP, or something along the lines of Paint Shop Pro.
Now, with customers ready to switch, there are real alternatives to both Photoshop (Affinity Photo) and Lightroom (Luminar). After Effects still has a significant lead, but, for example, when you search for "professional video editor", Premiere is not always the first result. Actually, I could argue DaVinci Resolve is a better alternative (especially with the capable free version).
Basically cloud was not a good fit for Adobe.
Cloud was a good fit for Office, since many users actually collaborate on the documents.
Cloud was also a good fit for software development (GitHub, BitBucket, etc).
Many others fall somewhere in between on this spectrum. Cloud is not a solution to all problems.
sukru,
I pretty much agree with everything you said.
I would add that a lot of the benefits of document collaboration could have been achieved in a decentralized way too, without depending on these centralized service providers. I don't think we've explored the full potential of P2P in productivity software. More federation and peer-to-peer collaboration tools could have been an interesting option for consumers, and they could have been refined and simplified as much as the online services are. Alas, the major industry players are obviously more interested in their own bottom lines, and they can get more profit selling the same software as subscription "cloud" services. Making users dependent on your services is good for business even if it's not technically necessary. :-/
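To make the idea concrete: the core of peer-to-peer document sync needs no server at all. Here's a toy last-writer-wins merge in C, purely my own illustration (hypothetical names, glossing over the conflict-resolution subtleties that real CRDTs handle):

#include <stdio.h>
#include <string.h>

/* toy replica of a "document": one text field plus a logical timestamp */
typedef struct {
    char body[256];
    long stamp;   /* Lamport-style counter, bumped on every local edit */
} Doc;

static void edit(Doc *d, const char *text, long now) {
    strncpy(d->body, text, sizeof d->body - 1);
    d->body[sizeof d->body - 1] = '\0';
    d->stamp = now;
}

/* merging two replicas: the newest edit wins, no coordinator needed */
static void merge(Doc *mine, const Doc *theirs) {
    if (theirs->stamp > mine->stamp) *mine = *theirs;
}

int main(void) {
    Doc laptop = {"", 0}, desktop = {"", 0};
    edit(&laptop, "draft v1", 1);
    edit(&desktop, "draft v2", 2);
    merge(&laptop, &desktop);    /* peers exchange replicas directly */
    printf("%s\n", laptop.body); /* prints: draft v2 */
    return 0;
}

Every peer runs the same merge, replicas converge regardless of the order they gossip in, and no central service ever sees the document.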
@Alfman
You can add a me too; I totally agree. Setting aside the Hollywoodisation of computers and software, the fact that implementations could be provided locally as well as remotely shows it's all about their bottom line. One screaming example is Microsoft neutering its own OS designs and product lines locally while providing the exact same things in the cloud… for a subscription fee. Abuse of the market? I think so.
Alfman,
Somebody needs to write that P2P software. I think we had this discussion before, so I will not repeat it.
Companies will do their thing; it won't change. Microsoft now finally supports Office on Linux, via the web subscription. They had the manpower to port it ten times over, but they only kept it on Windows and Mac.
Similarly, if they can sell a subscription, it is better for the bottom line than an upfront sale. It is also better for corporate customers (which is why vehicles and even laptops are leased, buildings are rented, money is borrowed, etc.).
Accepting this could enable working around it to find solutions. Blackmagic, once again, sells video editing hardware, and hence the software is free (at least the free version is really capable). This is a very good alternative model, for example.
sukru,
Yes, I think we’re all in agreement. The technology that gets the most investment is naturally the technology that corporations find most profitable. For better or for worse, the subscription and advertising business models have had a huge influence on technology companies turning to centralized solutions rather than decentralized ones.
There's really no simple blanket answer; it depends… Long-term buyers may be better off buying than renting for the exact same economic reason that sellers are better off with continuing subscriptions than one-off sales. You can rent a chair or buy a chair, but unless you only need it for the short term, buying is probably going to be the better choice for long-term users! Obviously software has differences, but a lot of the time this generalization still holds.
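To put rough (purely illustrative) numbers on it: a $300 one-off license against a $10/month subscription breaks even at 300 / 10 = 30 months. Keep the software for five years and the subscription costs $600, twice the purchase price, and that same asymmetry is exactly why sellers prefer the subscription.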
Old systems are surprisingly fast in everything other than raw, continuous number crunching. Modern CPUs are _insanely_ fast when it comes to instruction throughput, but system _responsiveness_ has definitely gone backwards.
I'm currently writing some software on a PIII 800MHz subnotebook, and the machine responds faster, and compiles faster, than a VM doing the same on a 5th-generation i5!
Yeah, but is that P3 still responding faster and compiling faster than the i5 natively? I highly doubt it.
Not possible! I'm using a turn-of-the-century C++ SDK that requires Win NT (4, 5, 5.1) or 9x to run.
Yes, the i5 would be blazing fast at the same task, but it is nonetheless interesting that even with 2.3GHz and an SSD available, the VM is slower than a PIII with a crunchy, original 2002 IDE PATA HDD!
Back when I coded I had a love for Borland products, so I stuck with the Borland compiler as long as I could. I eventually was forced to give up and use Microsoft. Coding Win32 in a text editor, routine compiles would take 5-10 seconds max. Bigger stuff like libraries would take longer, of course. Back then, Microsoft's full Windows build took 4-5 days.
If I were given a blank sheet and told to go away and write a web browser which supported modern needs, I'd probably rely on a few libraries for JPEG and suchlike, but I have no idea how in God's name I'd write anything near the number of lines of code in new browsers. What do they put in there? By supporting modern needs I mean modern needs, not necessarily modern ways of doing it, which is bloat built on bloat. How much do you need to handle formatting and text and a few pictures, with maybe a video and a bit of logic in there?
There may be something very wrong with how you configured the VM? Or whatever; there are so many variables to clear out that I don't know what that anecdote really implies.
Viewed the other way: the i5 is emulating the old hardware AND compiling almost as fast. You're literally running your old environment as just another application, alongside a bunch of other stuff, in a way that lets you live in the modern world. I am sure you're getting significantly longer battery life out of the whole thing.
Kroc,
Yes, I understand what you are saying. Many of the hardware performance gains are followed by software performance regressions caused by bloat and less attention to coding efficiency. The net result isn't always as impressive as the hardware advances would suggest it should be. This can be frustrating to people like us, but it's kind of one of those things you've got to accept.
Hrm, I understand, and I've seen evidence of that. But the particular Mac in the article was terribly unresponsive compared to Win NT 3.51 systems, and the Win NT systems were no speed demons either. Maybe they all had bad disks or something; I'm not sure. I was already in enough trouble in college for upgrading Netscape on the machines I wasn't supposed to have access to, so I didn't want to risk messing with the Macs.