Recall that Doom is a first-person shooter that ships with an advanced 3D rendering engine and multiple levels, each comprising maps, sprites, and sound effects. By comparison, the web of 2016 struggles to deliver a single page of content within the same size budget. If that doesn’t give you pause, you’re missing something. So where does this leave us?
It leaves us with a web that is horrible to use.
I understand a lot of people on mobile devices like to set their browsers up as desktop, so they get the full desktop experience. I’d actually like to do the opposite and set desktop browsers up as mobile. It seems to me that mobile sites generally have the same content without nearly as much of the extra crap …
Knock yourself out
https://www.google.be/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=…
I would not recommend it. Mobile sites are generally total crap, and often have the same amount of bloat or more.
It is quite mixed in my experience.
Some publications, like Slashdot, do indeed have horrible mobile sites, which are much worse than their non-mobile counterparts even on a phone.
But there are quite a few where the mobile site browsing experience is better even on a desktop.
And they often don’t even give you the content you ask for. Sometimes they just redirect to a mobile front page.
Indeed, without an ad blocker the typical mobile website consists of about 30 percent advertising at the top and another 30 percent at the bottom, with the remaining 40 percent of content in the middle, which more often than not is also covered by a floating ad. When you try to tap the tiny “X” in the corner to dismiss that one, you inevitably tap the banner ad behind it instead, launching three or four pop-up tabs with videos playing.
And that’s the websites with content worth reading, once you can get to it.
On your never-realized dreams of global enlightenment.
Money will always work this way.
…..
Clickbait for procrastinating, tabloid- and gossip-addicted consumers.
Bandwidth has also vastly increased.
I’m less concerned with bandwidth usage (I have plenty), and more concerned with utterly shitty, inefficient scripts on pages that make my CPU fan spin up for no visible reason.
If I put my PC in low-power mode (4c/8t at 700 MHz is what the CPU runs at in that mode), the web is hardly usable. Maybe I should stop using Firefox, but, dammit, I like tab groups way too much. The next browser that has tab management like Tab Groups has my vote.
This. Javascript has ruined the web. Before Javascript, sites were beautiful trees that even a 100MHz processor with a tiny amount of RAM could process. But no, web designers feel compelled to drop in useless script after useless script: images that “pop out”, stupid animated galleries, and the biggest annoyance of them all, scroll-activated scripts. Ars Technica, for example, feels compelled to include a script on its mobile site that “compresses” the logo when you scroll through the comments of an article, because reasons.
And none of those “designers” has the courtesy to provide a noscript tag with an image that doesn’t pop out, a gallery that isn’t animated, or a logo that is just a frickin’ PNG. I frequently disable Javascript on old phones (to prevent the browser from crashing), and the more “hip” a website is, the more it breaks.
I kind of wish the original vision of the web had happened: a beautiful HTML tree with boxes of Java SE apps that activate only if the user wants them to, and that leave the rest of the document alone instead of messing with it. And it would have happened if Sun had allowed Java to be integrated into browsers instead of shipping it as a third-party plug-in.
PS: What’s the purpose of Google’s AMP and Facebook’s Instant Articles? Can’t they just make lightweight web pages?
Much of the problem with Javascript is in ads, though. Often, 70% of what a page loads comes from a couple of ads pulling in giant chunks of Javascript.
But, fark, eternally scrolling pages that just load more content when you get to the bottom of the page are annoying. Worse, some sites hide half of their content and make you click a button before it appears – for no fucking reason. AAARRRGGGHHH!!!!
That makes me mad.
JS that makes it so pages cannot be static pisses me off, too. For example, if you click a link on Facebook and then hit back, your page is different. Too many pages do this.
No, because some product manager usually wants a few hundred tracking scripts. Most web devs build something nice, and this shit gets tacked on over time.
I had our desktop website rendering in less than a second on a 10 meg connection without anything being cached (and that’s with 15,000 lines of JS and 10,000 lines of CSS); over time, management kept packing scripts onto the page until it was back up to a 4-second average.
Have you tried IceDragon or Waterfox? They are both based on the Gecko engine, so your Tab Groups should work, but they feel less sluggish and bloated IMHO.
I don’t have your PC, so I can’t comment on how they will behave in your particular setup, but I can say that running Comodo IceDragon on an underclocked Phenom I 9600 (1600/400 MHz) feels quite snappy.
In other news, a potato is not as sweet and tangy as an orange. For years we’ve been complaining about this significant shortcoming of potatoes and yet nothing, NOTHING has been done!
I blame the government, major corporations, and that guy down the street whose house is bigger than mine.
That’s why I am voting for Donald Trump or that other guy with crazy hair, Sandy or something.
Two simple stats about Doom:
* The typical texture size in Doom is 64×64 pixels using an 8-bit palette. By comparison, the small OSAlert logo at the top of this page is 176×69 pixels on its own. That one logo covers 3 Doom textures, and if it were a bit fancier and required true color it would consume more than 9 full Doom textures. For a single web image. (See the quick arithmetic after these two points.)
* Doom was made for 320×200. Web pages in 2016 are made for 5120×2440 or more. A single screenshot on my computer holds more information than 9000 Doom textures.
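A quick back-of-the-envelope check of those figures, using just the sizes quoted in the two points above (nothing measured here):

  // Back-of-the-envelope check of the numbers above (sizes taken from the comment, not measured)
  const textureBytes = 64 * 64;                  // 4,096 bytes: one 64x64 texture at 1 byte/pixel (8-bit palette)
  const logoPixels = 176 * 69;                   // 12,144 pixels in the OSAlert logo
  console.log(logoPixels / textureBytes);        // ~2.97 -> about 3 textures' worth at 8 bits per pixel
  console.log(logoPixels * 3 / textureBytes);    // ~8.9  -> about 9 textures' worth at 24-bit true color
  console.log(5120 * 2440 * 3 / textureBytes);   // ~9,150 -> a true-color 5120x2440 screenshot vs. Doom textures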
If a web page was meant to be as blocky and low res as Doom (both graphics and fonts), then yeah, maybe it would be fair to compare the two.
dpJudas,
Clearly you are right that it’s not apples to apples, but I think this misses the point. Doom was a vast, interactive, multiplayer, multi-level world, which not only included all the resources for the game, but also a self-contained engine to handle its own archive formats, dialogs, 3D rendering, fonts, networking stack, etc.
That a simple, mostly text web page today with no high-res graphics has such a large footprint highlights how grossly inefficient things have gotten.
Take the osnews page (using the developer tools’ network tab):
js * 42 = 1,934KB
html * 14 = 133KB
css * 4 = 48KB
images * 39 = 48KB
xhr * 16 = 31KB
Total Size = 2196KB
Total Time = 9.5s
I turned off all blockers for this test, and…OMG is this bad or what?
Note that “hi-res” multimedia elements are not the culprit; most of the overhead is the 2MB of virtually pointless JS.
Using the ad blocker (with some custom rules to block additional 3rd-party tracking), a full page load goes down to 139KB total in 4s. That could still be better, but the low-hanging fruit is clearly the ~94% of overhead coming from 3rd-party ads & trackers.
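If anyone wants to reproduce a tally like this without adding up the network tab by hand, something along these lines in the devtools console gives a rough per-type breakdown. It uses the standard Resource Timing API; note that transferSize reports 0 for cached responses and for most cross-origin resources, so it will undercount compared to the network tab:

  // Rough per-type byte tally for the current page (paste into the devtools console).
  // transferSize is 0 for cached responses and for cross-origin resources without
  // Timing-Allow-Origin, so treat the totals as a lower bound.
  const totals = {};
  for (const r of performance.getEntriesByType('resource')) {
    const type = r.initiatorType || 'other';   // 'script', 'img', 'css', 'xmlhttprequest', ...
    totals[type] = (totals[type] || 0) + r.transferSize;
  }
  for (const [type, bytes] of Object.entries(totals)) {
    console.log(type, (bytes / 1024).toFixed(1) + ' KB');
  }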
Caching helps eliminate some network traffic, but the memory & CPU resources still take a toll locally, which is why opening a single web page needs more memory than an entire computer had in the ’90s.
It seems that the more modern hardware & networks paper over inefficiency with raw horsepower, the less efficient we become. I used to strive to be efficient just because I took pride in it, but I’ve mostly given up because no one else cares.
Nobody cares because, in the big picture, the user doesn’t care whether it is efficient. The only thing that matters is that it doesn’t get too slow. Improved hardware resulted in cheaper software development, not faster programs (they’ve been roughly fast enough for close to 20 years now).
dpJudas,
Well, some people do complain, at least indirectly. My wife has mentioned that FF on her tablet struggles with bloated websites. But alas, as things keep getting less efficient it becomes the user’s responsibility to upgrade when there is a software performance problem.
Another anecdotal example: I bought a laptop at the beginning of 2015 and it’s already too slow. In a recent project my job is to upgrade a codebase from VS2003 to VS2015. Man, VS2015 is so frustratingly slow that I seriously want to go back to an old and unsupported yet much faster version, and this complaint is echoed by many in the community. Maybe the new version “does more”, but it’s not readily apparent that any of it is useful for my work. I regularly have to wait for the IDE to catch up as I’m typing, and sometimes when I click on something I’m not sure if I misclicked or if the IDE is just running dog slow. These kinds of UI sluggishness problems should be extinct by now on modern hardware. I assume the reason MS doesn’t care about Visual Studio performance is that everyone there is running the latest and greatest high-end computers available; MS probably gets a great discount, and just think, they don’t have to pay the “MS tax” either.
Still using VC6 with WndTabs and everything is perfect in this world.
Did you install Update 2, and did you enable hardware acceleration in the Options dialog? When VS2015 came out I immediately went back to VS2013 because the product was clearly broken. About a month ago I gave it another try (I really want the C++11 features) and, at least on my C++ projects, it is now by far the fastest VS I’ve seen in a while.
Of course, if you compare it to VS98 everything is slow. But that just once again illustrates that this isn’t an issue specific to the web: developers only optimize until things are somewhat OK on their own hardware. I’m sure VS98 was slow compared to, say, VS 2.0 if you tried both on a 1996 machine.
dpJudas,
VS5/6 ran well on all my computers and I loved that about them. My circa 2003 computer ran VS2003 easily and VS2005 without issue. 2008 was slower but usable on the same system from 5 years earlier. Now VS2015 is just intolerable on my 2015 rig. Evidently I need to buy a newer beefier system to run VS2015 well, but having bought one just last year it’s not in this year’s budget.
Ironically this project’s code base is from the 90’s, so it doesn’t benefit much from anything newer. Oh well, I don’t really have a point to make with this, it’s just a complaint.
I don’t really know why there’s such a big difference between your experience and mine. My computer is from late 2014 and only has 8 gigs of memory, though it is an i7 with a 980 GTX card and an SSD. Too bad Update 2 didn’t fix it for you as it did for me.
VS2015 seems snappy enough for me on my 3-year old i7-based laptop with a GB of ram and a slow spinning disk.
Now, it’s much faster, since I have an SSD, but otherwise, the same laptop.
Drumhellar,
I probably need to replace the laptop with an i7 then.
Is this the same computer you have trouble using firefox on?
http://www.osnews.com/thread?627770
You only have 1GB of RAM, or is that a typo? My desktop shows 1.5GB usage right now with not much to show for it other than a few websites, email, and some terminal sessions, so it’s possible that 1GB of total system RAM is too cramped for your apps, or the OS could be ejecting pages from cache prematurely. Then again, I’ve never tried capping a CPU into a 700MHz low-power mode, so that’s uncharted territory for me.
dpJudas,
While the wastefulness irks me, as long as hardware gains keep pace with the overhead, you could always come along and say “I really don’t see the problem” regardless of how bad it gets.
Currently there doesn’t seem to be any momentum at all for making things efficient, and I’ve accepted that already. However, just because we can over-provision technology doesn’t mean there aren’t costs for doing so, costs which are multiplied across millions of users. Because of this, just a little bit of effort from producers to optimize content and software could easily save society the billions of dollars that directly and indirectly pay for all that overhead.
Yes, but you work in software development like me, so you know how it goes. If your immediate boss doesn’t complain, and you’ve been assigned a really boring task (which all web development is), then that’s where the optimization ends. That’s exactly why performance ends up being a function of the threshold at which the boss starts complaining when testing on his own computer.
dpJudas,
Yeah, yeah… I know. Unlike you, however, I’m going to use a frown.
Why use the IDE? In brutal honesty, new IDEs have always been excessively chunky – I remember Visual C++ 1 on a 386 with 2MB of RAM, or buying Visual C++ 4 because it said it needed 8MB of RAM, which turned out to be comically optimistic, or first seeing Visual Studio, which wanted over a gigabyte of disk space in 1997 (I left it on the shelf). Underneath all the gunk, the compilers are getting better.
In my current work life, I’m using VS 2015 with msbuild projects that can be used from the IDE if others want to… but I choose not to most of the time.
malxau,
If you have a good-paying job that involves this good old “real” CS work, then I’m all ears. Graphics rendering and os-dev used to be my passion. I’ve tried to land jobs doing that, but it’s been in vain. Now I mostly do websites and maintain legacy business software because that’s where the majority of the local opportunities are here on Long Island. If I were to move elsewhere it would have to pay well enough to justify uprooting my family.
When I started using the WWW back in 1995, the average web page was ~5KB. It was literally instantaneous on the university’s 100 Mbit/s LAN and extremely fast on dialup.
It served text and pictures, and that shouldn’t take any more resources today.
Nowadays websites still serve text and pictures – at least that is all I want from a website – and it takes a whole power plant to make that happen.
What kind of people are producing that crap?
Get off my lawn!
Doom isn’t a 3D engine.
It’s a program that draws lists of trapezoids that it pulls from a BSP tree representing a 2-dimensional map.
Most people call it 2.5D, but in reality it is indeed a 3D engine. The particular rendering approach is often described as “lines of constant Z”: every span it draws lies at a constant distance from the viewer – vertical columns for the walls, horizontal spans for the flat floors and ceilings. Because Z is constant along a span, the perspective calculation can be done once for the entire span, allowing a simple linear fill. It’s still 3D, just a very restricted version that makes rendering fast on low-end systems.
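A toy sketch of why that’s cheap (not Doom’s actual code, just the idea): for a vertical wall column every pixel sits at the same distance from the viewer, so the perspective scale only has to be computed once per column rather than once per pixel.

  // Toy "constant Z" wall column: one perspective divide per column, then a plain loop.
  // (Illustrative only; Doom's real renderer also handles textures, clipping, eye height, etc.)
  function drawWallColumn(screenX, wallDist, wallHeight, screenH, putPixel) {
    const scale = screenH / wallDist;                 // constant for the whole column
    const colHeight = Math.floor(wallHeight * scale); // projected height on screen
    const top = Math.floor((screenH - colHeight) / 2);
    for (let y = Math.max(top, 0); y < Math.min(top + colHeight, screenH); y++) {
      putPixel(screenX, y);                           // no per-pixel divides needed
    }
  }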
I believe the 2.5D reference is just as much to the limitations of the levels. The game looks 3D, but the actual map is 2D with different height values for each sector.
Yes, games didn’t move to true 3D maps until the mid-’90s; Quake and Tomb Raider were prominent examples of 3D levels as opposed to 2D with height info. 2D with height info is STILL popular for certain parts of games, even today: for example, the outdoor landscapes in “open” games. You then mix in 3D meshes for objects located on that 2D map, like rocks or trees or buildings or whatnot.
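To make “2D with height info” concrete, here is a minimal sketch of the usual heightmap lookup (hypothetical names; real engines add LOD, normals, streaming, etc.): the terrain is just a flat grid of height values, and any point in between is bilinearly interpolated.

  // Minimal heightmap lookup: heights is a width x depth grid of elevation values,
  // and (x, z) is a point on the 2D map; the y coordinate is derived, never stored per-vertex.
  function sampleHeight(heights, width, x, z) {
    const x0 = Math.floor(x), z0 = Math.floor(z);     // caller keeps x, z inside the grid
    const fx = x - x0, fz = z - z0;
    const h = (ix, iz) => heights[iz * width + ix];
    const north = h(x0, z0) * (1 - fx) + h(x0 + 1, z0) * fx;
    const south = h(x0, z0 + 1) * (1 - fx) + h(x0 + 1, z0 + 1) * fx;
    return north * (1 - fz) + south * fz;             // bilinear blend of the four corners
  }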
JLF65,
Yeah, extremely steep slopes are rare in nature, so height maps are easy to edit and generate, and they work well. I remember Magic Carpet used two height maps so you could fly around in caverns. It was awesome that the game incorporated effects that manipulated the terrain on the fly.
Aside from flight sims, it’s rare to see 3D games using all 6 degrees of freedom.
Hmmm, let me think…
Elite (David Braben and Ian C.G. Bell)
Zarch (same duo)
Starglider
The Sentinel
Hunter
Midwinter
Powermonger
Stunt Car Racer
Hard Drivin’
…
I don’t really think games like World of Warcraft qualify as 2.5D, because their heightmap terrains have holes in them and contain world-map objects. What earned Doom the 2.5D moniker was the fascinating fact that the game looks 3D while in reality you’re walking around in a 2D world. The collision detection in Doom didn’t even take height into consideration, if I remember correctly.
There were plenty of games before the ’90s that did 3D maps – not necessarily textured 3D, but 3D nonetheless. Other than the many flight simulators, a shining example is Mercenary.
As others have noted, the problem seems to be compounded by modern browsers becoming just as bloated as the pages they render.
This has always bugged me: there are a few programs I will accept dragging my system’s performance down, and a web browser is certainly not one of them.
A not-so-great workaround for me was to start using elinks for the quick one-off searches I find myself doing when coding etc., but unfortunately the modern web just isn’t built for text-only browsers anymore. I have to say, though, OSAlert is excellent in this regard; I’ve always been impressed by how neatly it renders in elinks and co.
Luakit would be perfect but is just a bit too unstable at the moment for me to use day to day. Links, when compiled with graphics support, looks okay(ish), is rock solid and blazingly quick, but it suffers from the same problems as elinks.
Recently I’ve started using xombrero and have to say it ticks all of my boxes: lightweight, quick, and it supports a vi-like command interface. The only caveat is that it’s affected by a bug in libjavascriptgtk that seems to be causing random segfaults in all of the alternative WebKit browsers at the moment (a minor annoyance for me compared to dealing with Firefox and Chrome).
Try browsing the web these days using an Amiga A1200: nightmare. It doesn’t seem that long ago that you could do this easily; no chance now.
Too much bloat!
I was going to bring this up but you beat me to it! I have both an Atari Falcon 030 and an Amiga 4000D (nicely upgraded with an 060 and tons of RAM), and when people throw around megabytes for websites and you’re dealing with 14MB (Falcon) or 128+16 Fast, 2MB Chip (Amiga), that is massive. IBrowse at least handles Aminet fairly well. I haven’t browsed the net on the Falcon yet – I’m still setting some stuff up – but it really is hard to do on such limited resources.
I’ve yet to get around to setting up the Internet on my A1200; I’m hoping that the ACA 1230 will give it a hand with the worst-offending websites.
Any recommendations for browsers? I remember playing around with AWeb a few years ago and was left pretty underwhelmed.
You’re pretty limited on browser choice, unfortunately: there is AWeb, which you tried already, and there is iBrowse, which is a little dated now. iBrowse is the better choice, although it will struggle with the latest websites using CSS etc.
There is also Voyager but that is really outdated and mostly unusable except on basic sites.
There is NetSurf, but you need a 68060 for that, and there is OWB, which is used in AROS and MorphOS; that requires emulation, a PPC processor, or AROS on x86.
The good news is that iBrowse is getting a new release in the near future which should alleviate some of the problems with modern web sites.
It still won’t solve the bloat problems though.
Flash didn’t have this problem.