One name was conspicuously absent from the list of companies backing Google’s WebM project and the VP8 codec. While other chip makers and designers, like AMD, NVIDIA, ARM, and Qualcomm, were on the list, Intel didn’t make an appearance. Yesterday, the company made its first careful commitment to the WebM project.
The commitment is indeed a careful one, since it only covers the “Smart TV” space. Intel is currently working with Google to bring internet video to television sets, and part of that push is a specialised Atom processor, the Atom CE4100. This is a highly optimised SoC running at 1.2GHz which can, among other things, decode two 1080p streams at once. This is the chip included in the Google TV-enabled devices from Sony that will appear later this year.
Intel has stated it will add hardware decode support for WebM/VP8 to this chip if the new codec manages to establish itself in this space. “Just like we did with other codecs like MPEG2, H.264 & VC1, if VP8 establishes itself in the Smart TV space, we will add it to our [hardware] decoders,” said Wilfred Martis, general manager for retail consumer electronics at Intel’s Digital Home Group.
For now, WebM content will be decoded in software on these devices, but you can be pretty sure that Google will push Intel to enable hardware acceleration for WebM video. This pledge of support shows that while Intel was absent from the list of companies backing WebM, the company certainly doesn’t seem to be against it.
Good news.
Intel always seems to be late to adopt anything new.
How are they late to the party, given that VP8 was only released a few weeks ago and the chip in question has been around since September 2009? I don’t blame Intel’s cautiousness given how the vultures of MPEG-LA seem to be circling.
Rubbish. How are Intel any less of a target than AMD, ARM, Nvidia and Qualcomm, who have all backed it and would be just as aware of the legal ramifications (esp. considering some of them use H.264 hardware)?
Intel want the good press, without the commitment. They are effectively saying that if WebM becomes big without their help then they will join out of necessity so as not to be left behind, but by not backing it now they are actually inhibiting its ability to become popular since Intel is such a big player. They want it to fail without them but succeed with them so they can ride in and say how much they support the idea and how great it is.
Intel are full of crap. Put your silicon where your mouth is or go away. More and more they seem to be losing the plot.
Eh, you do realise the other guys haven’t done a single thing yet either, right? No chip currently supports this out of the box.
Right, my mistake; but Intel’s words are still just PR. Of all the companies Google has managed to get backing from, at least some hardware must come out of it. They know YouTube/WebM is coming, and that’s a big money-spinner if you can claim to have the best performance / battery life when it comes to YouTube.
True, there is not yet any hardware that does this out of the box, but for whatever it is worth, that sizeable list of backers has claimed it will add this support now rather than waiting to see if the technology takes off.
That, to me at least, seems like a big difference between Intel and its competitors. Intel follows either when it seems safe to do so, or when it feels it’s going to be left behind if it doesn’t get on board.
Committing publicly to having support in named chipsets on a stated timescale meets and exceeds my definition of having “done a single thing”. It is certainly better than saying you’ll wait and see what happens, by any measure.
Intel aren’t as strong in these markets, since their chips are often paired with hardware decoders or GPUs from other manufacturers, but it would be nice to have them on board.
As I noted, the CPU was announced in September 2009, so the question would have been how easy it would be to add support for VP8 to the processor, considering that AMD, ARM and Nvidia don’t actually have a product already designed – they’ve only promised. Quite frankly, I don’t give a crap about what ARM promises, because they’ve been ranting and raving about ARM netbooks for the last year and I haven’t seen a single one being sold in New Zealand.
Talk is cheap; anyone can promise to add support for VP8, just as anyone can rant on about ARM netbooks and mobile Linux. It’s time for less talk, fewer promises, and more delivery of the products that ARM, AMD, Nvidia and co. have been promising for the last year.
http://distrowatch.com/6095
Mobile Linux without the empty talk.
I am talking about it being on actual devices. My point still stands: many promises, but no products on the shelf that I can purchase. Until I see it preloaded on a netbook with the integration/compatibility issues sorted out (reliable suspend and resume, reliable wireless, etc.), it is another pie-in-the-sky project. The same goes for ARM devices, which are promised each year and never delivered to the marketplace.
One can promise and post things up on the internet but if they don’t appear in the store on a device then quite frankly they might as well not exist at all.
This will probably come out later this year, or at least next. Hopefully they’ll have worked out all the problems the current prototype seems to have.
http://arstechnica.com/gadgets/news/2010/05/android-tablet-prototyp…
[q]I am talking about it being on actual devices. My point still stands: many promises, but no products on the shelf that I can purchase. Until I see it preloaded on a netbook with the integration/compatibility issues sorted out (reliable suspend and resume, reliable wireless, etc.), it is another pie-in-the-sky project. The same goes for ARM devices, which are promised each year and never delivered to the marketplace.
[/q]
Who actually cares about these devices or in fact the people who are dumb enough to use them?
So what if Apple won’t support WebM? It’s not like the people who use WebM care that much about Apple or its users to begin with.
Don’t like WebM? Don’t use it and just go away. It’s not like as if anyone actually cares about you.
If it’s a specific distribution or device you’re after, then rumours abound about stuff in development.
In terms of MeeGo, though, the Maemo side of its parent forks has had products available for years, the latest being the N900. Granted, they’re not the letter-sized tablets currently trendy in industry news. MeeGo is far from vapourware, though, with a build coming to the N900 and an image file available for download. If I had an IA-32 based device, I’d give it a go for a look, even with some of the attributes they chose not to inherit from Maemo (RPM instead of DEB based; WTF?).
http://maemo-freak.com/index.php/miscellaneous/1412-meego-v10-for-n…
This popped up in my reading today. You can already choose between the N900 and MSI hardware.
I say ‘late’ in this particular case as quite a few other companies have already committed to support it as opposed to waiting to see if it gets widely adopted as Intel is doing.
Based on information available online, some of these other companies were committed to it before it was publicly announced, and I would find it rather odd if Intel was left out of the loop during this time.
I certainly don’t blame Intel for being cautious due to patent worries, but that does not change the fact that they have made a conditional and half-assed commitment to VP8 *AFTER* so many others.
So yeah. Late.
… Says the person who -absolutely- doesn’t know how long it takes to verify whether a piece of hardware is ready for a given algorithm.
I am not a fan of Intel but:
May I remind you that we are talking about a silicon chip?
Says the person who knows nothing about me. But thank you for your stunning insight that we’re talking about hardware support. I would _never_ have known had you not pointed out that little nugget of information.
I am forever in your debt, kind sir.
…and what makes you think they’ll need to change hardware, instead of a firmware update for an on-chip DSP of some kind?
… And yes, I was talking about the algorithm (so firmware), not about the hardware.
It is a question of available gates, available memory, and so on.
Intel is talking about an existing SoC platform, a released design, not a future µP/graphics card product. So, yes, we are talking about firmware.
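To put the memory side of that in rough numbers, here is a back-of-the-envelope sketch. It is only an illustration: it assumes 8-bit YUV 4:2:0 buffers and counts nothing but VP8’s frame buffers, ignoring everything else a real decoder (let alone the CE4100) needs.

    // Rough frame-buffer estimate for decoding one 1080p VP8 stream.
    // Assumption: 8-bit YUV 4:2:0, i.e. one full-size luma plane plus two
    // quarter-size chroma planes per frame.
    const width = 1920;
    const height = 1080;
    const bytesPerFrame = (width * height * 3) / 2;
    // VP8 keeps three reference frames (last, golden, altref) plus the frame
    // currently being reconstructed.
    const frameBuffers = 4;
    const totalMiB = (frameBuffers * bytesPerFrame) / (1024 * 1024);
    console.log(`${(bytesPerFrame / (1024 * 1024)).toFixed(1)} MiB per frame, ~${totalMiB.toFixed(1)} MiB of frame buffers`);

That works out to roughly 3 MiB per frame and about 12 MiB of buffers; decode two 1080p streams at once, as the CE4100 advertises, and the budget doubles. Whether that fits alongside everything else on the chip is exactly the kind of constraint that decides if a firmware-only update is feasible.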
And I am surprised to see people saying Intel is late with its VP8 support… on an almost-released product!
If we were debating future and (maybe) ongoing developments, I could admit that Intel is late. Okay, it is a question of “commercial & strategic” announcements.
But NOT about this released product.
Call me Mister Slowcoach if you wish, but please, at least make a good case for it. ;-))
Three months ago, the majority of OSAlert readers wanted Theora mandated as the required codec for the video tag in HTML5. With the announcement of WebM, the ‘virtues of Thusnelda’ have been forgotten.
I’m surprised Theora supporters aren’t saying that Thusnelda is better than VP8.
No, everyone knew that on the technical front Theora was inferior than H.264. The main “virtue” of Theora is its openness and the fact that it doesn’t put your rear end up for sale to the MPEG-LA. WebM and VP8 bring the same “virtue” but are also technically stronger, making them a better alternative at the moment.
Or maybe this is just flamebait and I shouldn’t have answered.
Inferior “to”, dammit.
It wasn’t flamebait, but an observation. I had written an article a couple of months ago about why I think Theora isn’t on par with H.264. Those who disagreed always pointed to unreleased builds (Thusnelda) as the saviour, and some even claimed it was better than H.264.
Two months later… not one mention of Thusnelda. It is as if Theora didn’t exist. Seeing that Theora does not have the corporate backing that VP8 does, it is no surprise that Theora has gone down the memory hole.
Now we know why companies use H.264 and why it remains popular. You can’t go about standardizing your systems around a format that loses mindshare at the drop of a hat.
Maybe you ran into some crazy commenters (it’s the internet, it happens), but no one seriously thought that Theora was as good as h264. The argument was always that it was “good enough” for web video.
If there were a critical mass of Theora video out there it would be a lot tougher to get rid of, but right now there’s so little that it makes sense to drop it and move on to a better alternative. I don’t think that corporate support is really the core issue, it’s just the # of videos using it (which corporate support can help, of course).
For me, it was more about having an open codec for use across an open network. As an open standard, Theora would improve exponentially with the growth of the developer community. Consider why, and who, contributes to the kernel or Apache as near-standards in their categories. Consider those same motivations applied to an open video codec becoming ubiquitous.
I’m surprised that even with the Theora drama going on for months and months, you completely fail to understand the point of anything being discussed.
The point is that we need an open codec for the web that anybody can use for anything.
You don’t pay royalties for JPEG, do you? Do you pay royalties for PNG or HTML text? Were you around when people tried to get royalties for GIF?!
No, of course, nobody pays for image formats or text formats anymore. The system could not work if, all of a sudden, once you used more than 300 images larger than 1024×768 you had to pay 100,000 dollars to the ‘JPEG-LA’ group or risk getting sued for patent infringement.
Theora/Vorbis/Ogg was the best that was available; now VP8 is around and is better, so people want VP8/Vorbis/WebM to be acceptable on the web.
If you tried to understand the issues you would not be surprised at all by the turn of events. In fact, you could have predicted everything that has happened with great accuracy.
People will keep supporting Theora because it’s been around, and that gives us something to fall back on if VP8 hits the skids.
Because in terms of company support, VP8 has much, much better chances of succeeding than Theora. It’s just a fact.
Theora was the best option for the web yesterday, because it was the sole royalty-free codec. Now it’s VP8, because Google’s backing may kick some butts at the MPEG-LA when it resorts to patent trolling upon seeing the superior format start to win. Xiph and the open-source crowd can’t do that.
To those who criticize the honesty of Theora backers, I have just one question: if the situation were exactly reversed, i.e. if On2 had invented H.264 and MPEG-LA backed a patent-encumbered VP8 as the “official” standard, how many current H.264 backers would be explaining that VP8 is the technically superior format, using exactly the same encoding tests that people now use to argue that H.264 is superior?
I haven’t seen anyone here pushing H.264 for political reasons.
If anything the debate here has been Theora advocates vs people who want whichever codec has the best quality. No one here has a vested interest in H.264. Of course the ideal would be a completely unrestricted codec that provides the best quality but such a codec doesn’t exist.
My opinion is that the W3C should specify two codecs, H.264 and VP8. There should be a codec built into browsers that can be used for commercial purposes without the permission of MPEG-LA.
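For what it’s worth, the “ship both codecs” idea is easy to handle on the page side with the standard HTML5 canPlayType() check, which returns “”, “maybe” or “probably”. A minimal sketch (the file names are made up for the example; a real site would serve both encodes):

    // Ask the browser which codec it can play, then pick a source accordingly.
    const video = document.createElement("video");
    const canWebM = video.canPlayType('video/webm; codecs="vp8, vorbis"');
    const canH264 = video.canPlayType('video/mp4; codecs="avc1.42E01E, mp4a.40.2"');
    // Hypothetical file names, just for illustration.
    const src = canWebM !== "" ? "clip.webm" : canH264 !== "" ? "clip.mp4" : null;
    if (src !== null) {
        video.src = src;
        document.body.appendChild(video);
    }

The same fallback can be written declaratively with multiple source elements inside a video tag; the browser simply plays the first type it supports.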
The question I asked is whether the unconscious assumption that “MPEG-LA and the industry back it, hence it has to be great” has a strong influence.
If MPEG-LA backed VP8, and if H.264 lovers took exactly the same Park Joy frame and said “look, the grass is blurry”, I think that VP8 backers would say “No, you’re just wrong, videos can’t be compared based on still frames; it’s about images in motion. You won’t ever see the blurry grass in the moving picture. Video algorithms are made to optimize the look of the video, not of the still frames which compose it”. And then H.264 backers would argue “Dammit, you’re bought by the MPEG-LA; come on, still frames are the absolute reference in terms of video quality”.
Don’t you think so?
The whole video quality discussion is a joke launched by H.264 backers like the x264 devs in order to hide the real issue. If quality really mattered on the web, YouTube (which uses H.264, by the way) wouldn’t be the #1 video site on the web.
It’s all about the politics of media industry, really. It’s royalty-free vs widely distributed. Getting things done right vs getting things done fast.
They cannot. It’s written in the W3C’s own definition of a web standard that it has to be royalty-free. This was done in order to avoid enduring another Unisys incident in the future. So either the MPEG-LA makes the definitive statement that H.264 is royalty-free, which is not going to happen because those rats want to make money on what looks to them like a juicy medium, or H.264 will remain a non-standard way to display video on the web, like the Flash player which introduced it in the first place.
The quality issue came about when YouTube refused to use Theora because of its quality/bandwidth trade-off. Adamant Theora supporters believed that the average user would not have noticed that YouTube’s video quality had tanked, or that the videos took longer to buffer. If Theora were a credible alternative, more sites would be using it. However, it is not, which explains why over 60% of the web’s video is in H.264 while Theora has 4%. If Theora were supported by Flash (as H.264 is, and VP8 will be?), I believe it would have had a higher penetration.
It is good to see that VP8 appears to be a credible alternative to H.264. Companies can freeload on H.264 for the next 5 years, while deciding when/if to migrate to VP8. It might even mirror instances where companies would pretend to be eyeing Linux in an attempt to get a better deal from Microsoft.
Who knows… maybe there will be an H.265 by then, or devices will get powerful enough to start using H.264’s Main/High profiles, and we will all be having this spat all over again :-s.
I’d gladly mod you up as informative if I could. I had forgotten about that episode; the x264dev post about how high-tech and incredible H.264 is, and how crappy Theora is, had become the most distant thing in my memory on the subject.
This does not take into account things like video conversion costs. MPEG-4 has been around since the DVD days, I think, whereas Theora is much younger. Most of the established video-playing infrastructure on the web is based on H.264, and changing it would prove quite difficult unless a big company like Google pushes the change forward.
Then there’s the visibility problem: honestly, who knew about Theora before trying multimedia on Linux?
These two problems are addressed by VP8, not because of its quality but because of the big company backing it. If Google provides money, security through proper support, mature encoding software, FUD against the MPEG-LA, and cheaper licensing, chances are that VP8 will make it as the codec for the web.
Again, it’s got nothing to do with quality, except when it’s very bad (which Google argued about Theora; I don’t know if it’s true). It’s about which company or organisation is stronger. If Apple ruled the W3C, the whole video codec issue would never have occurred, because the royalty-free requirement would have been silently removed in some way ^^
(Actually, it’s quite worrying to rely on companies owned and directed by a single man for everything, when you think of it)
Theora was built on VP3 which was competing with MPEG4. Just a few months ago, the notion of Google Chrome having native Theora support was the rallying cry for Theora supporters to coerce Microsoft and Apple to include native support in their browsers.
True … but I have yet to watch a VP8-encoded video file, so it is less visible than Theora in my (and many others) point of view.
It has always been about quality. Nokia proposed MPEG-1 for HTML5 video, as its patents will expire in the near future, but no one wanted it due to its poor quality. It is possible to get high-quality video encoded using Theora: the problem is that the bandwidth usage wasn’t economical.
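Purely to illustrate the “not economical” point, here is some toy arithmetic. Every figure below is an assumption picked for the example, not a measurement of Theora or H.264:

    // Hypothetical baseline of 800 kbps, and a codec that needs 30% more
    // bitrate for comparable quality. Both numbers are made up.
    const baselineKbps = 800;
    const overhead = 0.30;
    const altKbps = baselineKbps * (1 + overhead);
    // Bandwidth per hour of video served, in GiB (approximate conversion).
    const gibPerHour = (kbps: number) => (kbps * 3600) / 8 / 1024 / 1024;
    // Hypothetical traffic volume: one million hours of video served per day.
    const hoursPerDay = 1_000_000;
    const extraGiB = (gibPerHour(altKbps) - gibPerHour(baselineKbps)) * hoursPerDay;
    console.log(`~${Math.round(extraGiB)} extra GiB per day at this made-up scale`);

Even a modest per-stream overhead turns into a very large bandwidth bill at video-site scale, which is the economics argument in a nutshell.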
It has always been like that. Jobs, Gates, Shuttleworth; all rich and influential men who made their OSes popular (to varying degrees). Google was capable of getting more hardware support behind VP8 in the past few weeks/months (due to its similarity to H.264?) than Theora has gotten in its entire existence.
How do you know that ?