A comment on the recent article about the Bali release of Google’s WebM tools (libvpx) claimed that one of the biggest problems facing the adoption of WebM video was the slow speed of the encoder as compared to x264. This article sets out to benchmark the encoder against x264 to see if this is indeed true and, if so, how significant the speed difference really is.
Not a bad comparison, although there are some damn obvious flaws:
1. PSNR? Really?
2. The conclusions don’t match the graphs – x264 clearly dominates.
It would be nice if vpxenc were faster. Then again, the encode time is apparently good enough for youtube, and they’re probably the people most in need of fast encodes.
Except for noting that x264 baseline and vpxenc in the higher quality ranges are competitive I think that is what my conclusion says. I certainly wasn’t trying to give the impression that vpxenc is as fast as x264 in general, it’s clearly not.
Ok, I must have misread. Sorry!
Since I’m the one that made the comment prompting this comparison… I’m very glad to see Bali shows some significant improvement on a real world encode. It’s still slower than x264, but it’s no longer “embarrassingly” slow it seems. I’ve also never seen those settings that were used for 2 pass, that is good info. Kudos to the author!
However, I’m still going to do my own tests with more realistic source material when I have the time. Using a 320×240 resolution for the source is not representative of what most users would actually encode (not even close really).
Edited 2011-03-16 00:07 UTC
Yeah, getting good documentation on the encoder settings for vpxenc really is a bit of a problem.
240p video is very low resolution but it’s not unusual to see it on the web, youtube still offers it. It seemed like a reasonable starting point. I’d be interested to see how your higher resolution tests compare.
Remember that most users are not really home users in this case. WebM does well at low resolutions – the kind of resolutions you need for your mobile devices or youtube. Of course 320×240 is too small for that, 360p or 480p would be the most common use case for WebM.
The MSU test on the original release of VP8 found that it did much better on HD content than it did on SD/DVD content. I think they’d just spent more time tuning for that size.
I don’t know if that has changed since, you’d think Google would have some interest in the lower resolutions used on Youtube, but on the other hand maybe every extra bit of quality on HD saves them more bandwidth than an extra bit of quality for a tiny video.
I still need to do some more tests to be completely fair, but just to give a taste of what I’m finding so far:
I used this video as source: http://media.xiph.org/video/derf/y4m/sintel_trailer_2k_480p24.y4m
This is the Sintel trailer at 854×480. It’s about 52 seconds long. I used the exact same encoder binaries as the linked article, although my machine is different (Phenom II X3 720 with 4GB memory, Windows 7 64-bit).
I compared x264 “baseline/slow” to webm “good/cpu-used=0” using a variable bitrate of 1000Kbps. Since multi-threading affects webm quality adversely, and x264 is multi-threaded by default, I am choosing to force x264 to run single-threaded by using --threads 1 for this comparison. This is as apples-to-apples as I think you can get for a single-threaded comparison (I’ll do a multi-threaded comparison later). I’m reporting PSNR, but frankly I don’t think it is at all relevant at this bitrate.
x264
Commandline: x264.exe -o sintel_trailer_2k_480p24.mp4 sintel_trailer_2k_480p24.y4m --profile baseline -B 1000 --preset slow --threads 1 --psnr
Avg PSNR: 52.375
FPS: 15.04
Filesize: 5436KB
Actual Bitrate: 851.94Kbps
webm
Commandline: vpxenc.exe -o sintel_trailer_2k_480p24.webm sintel_trailer_2k_480p24.y4m --good --cpu-used=0 --target-bitrate=1000 --end-usage=0 -p 1 --psnr
Avg PSNR: 50.862
FPS: 6.43
Filesize: 5017KB
Actual Bitrate: 766.710Kbps
Conclusion
Quality is very comparable subjectively, x264 is a tad better imo (but not much). When I get all of this together I will post the actual output videos so people can look at them.
Filesize is close enough since both stayed well under the target. Webm is a bit smaller, but in this case that isn’t actually a good thing. Both had plenty of bitrate to work with, and both ended up not needing anywhere near all of it (I used a high target because frankly that is what people normally do at these low resolutions).
note: I have noticed that webm, when run with “--good --cpu-used=0” with a high bitrate cap, tends to undershoot much more than when run using “--best” (but it is also twice as slow when run that way). I prefer using best because it tends to get much better quality (i.e. higher bitrate) when given a lot of headroom, but I didn’t use it here because I did not want to be accused of an unfair comparison.
The big difference, as I have stated before, is in speed. x264 is over twice as fast (and frankly it is being hobbled here, since it is designed to be run multi-threaded)… I’m still in the same ballpark speedwise as the original article, but the gap has widened a wee bit (i.e. roughly 60% slower). We’ll see how that pans out at HD resolutions.
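As a rough cross-check of the numbers above, the average bitrate follows directly from file size and clip duration. This is just a sketch; the 1 KB = 1000 bytes convention and the ~52 s clip length are my assumptions and may not match exactly what the tools report:

```python
# Sanity checks on the figures above. Assumes 1 KB = 1000 bytes and a
# clip length of about 52 seconds (both assumptions, not from the tools).

def bitrate_kbps(size_kb: float, seconds: float) -> float:
    """Average bitrate implied by a file size and clip duration."""
    return size_kb * 8 / seconds

# The 5017 KB webm file over a ~52 s clip lands close to the
# reported 766.7 Kbps figure.
webm_rate = bitrate_kbps(5017, 52.0)   # roughly 770 Kbps

# Relative speed: x264 at 15.04 fps vs vpxenc at 6.43 fps.
speed_ratio = 15.04 / 6.43             # about 2.3x
```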
Same setup, only change is the source is 1280×720 and the bitrate is set to 2000.
x264
Commandline: x264.exe -o sintel_trailer_2k_720p24.mp4 sintel_trailer_2k_720p24.y4m --profile baseline -B 2000 --preset slow --threads 1 --psnr
Avg PSNR: 53.647
FPS: 7.23
Filesize: 10702KB
Actual Bitrate: 1678.26Kbps
webm
Commandline: vpxenc.exe -o sintel_trailer_2k_720p24.webm sintel_trailer_2k_720p24.y4m --good --cpu-used=0 --target-bitrate=2000 --end-usage=0 -p 1 --psnr
Avg PSNR: 52.067
FPS: 3.35
Filesize: 10118KB
Actual Bitrate: 1548.36Kbps
Conclusion
Encoding speed relative to x264 stays about the same (this is much better than previous versions of webm at this resolution). Quality is a mixed bag. Webm looks much better during low motion scenes (detail is higher), but suffers during high motion (smearing). Overall quality is about the same imo.
I’ll do a full battery of tests offline (multi-threaded too) and put a comparison together with graphs and links to the output files. Will probably take me a few days.
When people see high motion in real life, they actually perceive it as a blur. It is perhaps a mistake to demerit WebM for having this characteristic.
When h264 videos have to make compromises on quality per bit, which happens in high motion scenes, the compressed video exhibits artefacts … little extraneous bits that aren’t there in the original scene. The human eye doesn’t do anything similar when people are looking at scenes in real life.
Just saying.
Edited 2011-03-17 01:28 UTC
To my eye the smearing/blurring looked worse in this particular example, and the artifacts you speak of that x264 often generates were barely visible. I have seen examples that fall in line with what you speak of and I also find webm often looks better, but in this case it didn’t. I’m just trying to be as fair as possible.
Fair enough. I merely point it out because, as has been said by other posters before in this very thread, sometimes it is easy to get carried away with trying to measure one result versus another without actually considering the actual implications of the metrics … in this case, that would be the actual visible-to-viewers impact on the observed video in real time play, versus what can be seen in still frames.
Edited 2011-03-17 02:52 UTC
From his conclusion:
>comparisons in quality are always somewhat subjective
No, they’re not. Just use a pixel difference utility and do it objectively. For my tests I use this: http://pdiff.sourceforge.net/
>encoding speed to quality ratio with x264 ‘baseline’ at the higher quality settings
So basically he’s telling us that you have to push the encoder (and your CPU) extra, to just get “h.264 baseline” quality out of it. This is not competitive.
>the gap never exceeds around 50% encoding time
Which is huge for a service like Vimeo or Youtube, that get thousands of encoding requests per minute. Time is money, and this would be a huge waste in resources.
>expect vpxenc to take around twice as long to encode a video of comparable quality to a x264 ‘high’
Yup. Not good enough. Not for video services, not for mobile applications that might or might not have a DSP for webm, and definitely not good for professionals. Remember, it’s the professionals who upload the video that the majority of people watch. Webm is only good for people who don’t mind the extra time, as long as they can do it for free. For example, a screencast of someone in his bedroom showing the new Gnome3, and uploading on his own blog bypassing youtube. These are people who do niche video that doesn’t matter in the grand scheme of things.
WebM won’t make it, in my personal, and professional opinion as a videographer. I wish it did, because I hate MPEG-LA and all they stand for, but there is no h.264 killer out there. Except h.265 that is, that comes out in 2013.
Of course, just like in the past, for my Ogv vs h.264 article, people will say that quality/speed don’t matter as long as it’s Free, but that’s not true at all. Professionals in my field don’t like h.264 for a variety of reasons, but they like webm even less.
Edited 2011-03-16 02:02 UTC
“Not good enough.”… yet
With a group like the x264 team working on it, it should improve significantly, given the massive gap between reference h.264 encoders and x264.
New replaces old, as services evolve.
Hopefully, with time, it will replace h.264 as it becomes better optimised by the FOSS community.
That, or software and medicine copyright is over-ruled :p
Except that certain pixels are often more important to scene quality than others. It is subjective, even if you can quantify parts of it.
Note that Youtube is already encoding everything in VP8, so apparently it is good enough for them.
Clearly, it has a long way to go before it catches up with x264. I don’t think anyone has ever claimed otherwise. Luckily, they are continuing to release updates every 3 months and the gap will continue to close. The improvements from the original release are already significant. Also remember that all these tests tend to be against x264, which is the best in class h.264 encoder. It actually blows away a lot of the commercial encoders as well, and plenty of companies are making lots of money off them without being told their codecs are useless.
Edited 2011-03-16 03:40 UTC
>Note that Youtube is already encoding everything in VP8, so apparently it is good enough for them.
Youtube’s quality sucks. It’s visibly soft on my HDTV (watching its HD videos via my Roku). Vimeo is way better, about 25% better. And youtube doesn’t use webm for normal usage, it’s only used as a last resort, if h.264/flash is not found.
That’s because of people like you who refuse to move to something better and instead cling to proprietary, dead-end software. It doesn’t help the fact that you are so close-minded as well.
>people like you who refuse to move to something better
The benchmark article proved that it’s not “something better”. As for closed minded, I suggest you read more of my anti-MPEGLA articles, general copyright, media articles, etc etc, here on osnews and on my blog, before you blatantly troll here by declaring me “closed minded”.
If anything, by being a developer, an ex-tech reviewer/journalist, and a current filmmaker (with quite a following for my tutorials), I have way more visibility than you have on the matter. Just because you’d like a proprietary technology to fail for philosophical reasons doesn’t mean that it will. I WANT THE SAME THING YOU DO. But it won’t freaking happen.
Just butting in and putting a fly in the ointment.
I WANT THE SAME THING YOU DO. But it won’t freaking happen.
And why is that? Could it be that all the ranting and raving against the wrongs of MPEG-LA is utterly meaningless if you still support and tacitly endorse these wrongs by using H.264? Feel free to read YOU as broadly as possible, i.o.w. all videographers.
Here is exactly where philosophical reasons enter the picture. If we/you/I don’t want a world where a licensing association is locking up our intellectual works/culture in patent-boobytrapped containers, we need to put our money where our mouth is. Don’t support H.264. Cut your losses and encode in a free(dom) but lesser and slower format.
It is quite simple. You can’t be against a dictatorship if you can’t or won’t live without the privileges granted to you by the regime.
1. The fact that Youtube is by far the biggest player of video on the web despite sucky quality shows that quality really doesn’t matter on the web.
2. It is obvious that you don’t know how WebM on youtube works, so you’ve not even tried it. Interesting…
Sure, but you said no video sites could use it. Obviously the biggest and most important one can.
That’s because it’s in beta. Or don’t you think that in 12 months or so they’ll flip the switch and have FF/Chrome/IE9 with WMF codec default to the WebM page and have the Flash/h.264 there as a backup for mobile and older browsers? I don’t think they spent all that time re-encoding their entire back library just on a whim, or as a backup. They’re going to do something with that.
The quality of Vimeo does not matter, as when you visit their site in a Flash-free Firefox 4, you see a download link for Safari… seriously.
Some video is better than no video.
How much difference is there between a PSNR of 39 and 45? Numerically that’s only a 12% difference, at most.
There is also the issue that there is only a 4% difference between the best x264 settings and the baseline. Saying that WebM can only keep up with the x264 baseline really isn’t saying anything bad.
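One thing worth keeping in mind when eyeballing PSNR numbers: PSNR is a logarithmic (decibel) scale, so a “12%” numeric gap understates the underlying difference. A small sketch of what a 6 dB gap means in mean-squared-error terms:

```python
def mse_ratio(psnr_low_db: float, psnr_high_db: float) -> float:
    """How many times more mean squared error the lower-PSNR encode has.

    PSNR = 10*log10(MAX^2 / MSE), so a gap of d dB corresponds to a
    10^(d/10) ratio in MSE.
    """
    return 10 ** ((psnr_high_db - psnr_low_db) / 10)

mse_ratio(39, 45)  # a 6 dB gap is roughly 4x the mean squared error
```

Whether a 4x MSE difference is *visible* is, of course, exactly the subjective question being argued about in this thread.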
The only thing this data proves is that WebM NEEDS a multi threaded encoding machine. Now do you really think anyone doing anything serious with video is going to try to encode it on a single core computer? If we’re really gonna talk about Google and other video sites, understand that they will have a cluster on the back end. The cost of the license to use H.264 probably far outweighs the cost of the extra hardware it *might* take to produce video of the same quality in the same time frame. Hardware is cheap these days.
There is also NO data here on decoding, so you really can’t speculate from this article about how they will perform on a mobile platform.
I am sick and tired of hearing people declaring that professionals are never going to support WebM. Google owns the largest video site and they choose WebM. They obviously can make it work.
It doesn’t matter to me, I am firmly in the “PSNR is meaningless” camp and just ignore PSNR scores. It is somewhat useful for selecting an optimal bitrate constraint for particular source material, but for comparing quality across codecs it just doesn’t mean anything.
I am a webm supporter, but a spade is a spade and that statement isn’t fair to x264. A 4% difference when you are using a low resolution source with a highly constrained bitrate is meaningless… Try the same video at 720p with a reasonable bitrate (say 1500kbps) and that 4% will turn into upwards of 10-20% real quick. x264 main and high profile produce much better quality than baseline at high resolutions with comparable bitrates. That is frankly what those profiles are for (i.e. trade higher decode complexity for increased quality at high resolutions).
You can’t take the 4% swing on a 320×240 source file at such a low bitrate and form a conclusion like that from it.
You are reading too much into that. It needs multi-threading to get within the “twice as slow” range compared to x264, but x264 is multi-threaded by design and by default – webm’s multi-threading support is rather poor and hurts quality rather significantly (unfortunately). x264 quality is virtually unaffected by multi-threading. In reality webm needs “better” multi-threading support, using threads to make it faster as things are now is something of a crutch that doesn’t work all that great (but granted it is better than nothing).
Google is a content distributor. They are not video professionals by any stretch of the imagination. When people like me say “professionals will never support webm” we are talking about content producers, not distributors. And I still don’t think professional content producers will ever embrace it, but I also don’t think it actually matters much either. It is a fine distribution format, at least as good as x264 baseline and getting progressively better with each release. Who cares if content producers use it? It doesn’t much matter – any video is just a re-encode away from being free from the grips of MPEG-LA…
Granted, Eugenia lumps content producers in with distributors (even online distributors), but that is her bias since she works in the industry. I frankly don’t see it that way at all.
Edited 2011-03-17 00:17 UTC
Most of the optimal methods for video are patented by other parties, so WebM must use sub-optimal methods. There has to be a compromise made somewhere, and WebM has made this compromise in the encoding speed.
WebM has probably surpassed H264 now in quality/bit, because it was only just behind h264 by that metric at launch, and WebM has improved quality/bit by 12.8% over its two releases since then.
However, only in the last release was any attempt made at improving the quality/encoding time.
http://blog.webmproject.org/2011/03/vp8-codec-sdk-bali-released.htm…
The recent Bali release improved encoder speed significantly, but it is still not close to x264 on this metric (and this metric alone).
Perhaps the next (Cayuga) release will make up more ground by Q2, 2011:
http://blog.webmproject.org/2011/03/next-up-libvpx-cayuga.html
However, it must be said that encoder speed is the area where webM does make a (necessary) compromise, and it may never catch up with x264 (for this metric alone, since webM has already caught up in terms of quality/bit).
It is perfectly reasonable for this to be the area in which a compromise is made, because for every time a digital video is encoded it is played on average many thousands of times, and likewise for every person who encodes digital videos there are many thousands who merely view them.
For professional and high-volume use, perhaps the solution to the encoding speed problem will be found in a hardware encoder:
http://blog.webmproject.org/2011/03/introducing-anthill-first-vp8-h…
AFAIK there is also a project underway to implement WebM encoding via contemporary GPUs (shaders) using the OpenCL language, which if completed would also become a means (usable by anyone with a 3D-capable GPU) to considerably speed up the WebM encoding time.
Edited 2011-03-16 04:38 UTC
> WebM has probably surpassed H264 now in quality/bit
Rubbish. Show me an example.
Eugenia,
“>comparisons in quality are always somewhat subjective”
“No, they’re not. Just use a pixel difference utility and do it objectively.”
This test is highly appealing due to its simplicity; however, it is a little deceptive if we dig deeper.
Lossy compression algorithms deliberately throw away information we cannot perceive. This is by design, and should not be considered a fault. Creating an objective quality test turns out to be rather difficult.
Using audio as an example, consider two cases:
Audio codec A throws away very low and high frequencies which we cannot hear.
Audio codec B replicates the waveform best on paper, but introduces other audible artifacts.
The diff test would prefer codec B, but the human ear would prefer codec A.
The nuances are even more complicated with visual media.
Consider a black screen on a white surface. A motion estimation algorithm might place the “lines” a couple pixels out of place, but nevertheless produce a great visual image overall. The trivial diff test would fail to recognize the quality a human observer would perceive.
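A toy version of that example (pure numpy, synthetic image rather than real video) shows how harshly a naive per-pixel diff punishes a tiny spatial shift that a human would barely notice:

```python
import numpy as np

# A black vertical bar on a white background...
img = np.full((64, 64), 255, dtype=np.uint8)
img[:, 30:34] = 0

# ...and the same bar shifted two pixels right: visually near-identical.
shifted = np.full((64, 64), 255, dtype=np.uint8)
shifted[:, 32:36] = 0

# A naive per-pixel diff treats the shift as massive error: the
# mismatched columns differ by the full 0-255 range.
mse = np.mean((img.astype(float) - shifted.astype(float)) ** 2)
psnr_db = 10 * np.log10(255.0 ** 2 / mse)  # about 12 dB, i.e. "terrible"
```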
“>the gap never exceeds around 50% encoding time”
“Which is huge for a service like Vimeo or Youtube, that get thousands of encoding requests per minute. Time is money, and this would be a huge waste in resources.”
I’m not particularly impressed by those numbers either, but we need to recognize that encoding performance may be less crucial than decoding performance and size ratios. The importance of either depends on the use case.
You mentioned mobile applications. Today’s devices have more computing power than bandwidth, so it could make sense to sacrifice one to help the other.
The other example you mentioned was youtube. If one is distributing millions of copies, then encoding speed isn’t nearly as important as the decoding speed and compression ratio.
Just to emphasize the point… What if we had an algorithm capable of compressing a typical DVD to 50MB with no perceivable quality loss, which took 48 hours to encode but played back on modest hardware? This would be of great interest to youtube, netflix, etc. The savings on bandwidth * millions would offset any inconvenience to the service providers.
I’m not familiar enough with these codecs to make any judgements, but I do believe it’s likely the battle will be fought on political rather than technical grounds, assuming both codecs are “good enough”.
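To put rough numbers on that hypothetical (the ~4.7 GB DVD size is my assumption; the 50 MB target and million-download scale come from the comment above):

```python
# Back-of-the-envelope bandwidth savings for the hypothetical codec:
# a ~4.7 GB DVD compressed to 50 MB, served one million times.
dvd_mb = 4700          # assumed single-layer DVD size in MB
compressed_mb = 50     # the hypothetical output size
downloads = 1_000_000

saved_tb = (dvd_mb - compressed_mb) * downloads / 1_000_000  # MB -> TB
# 4650 TB of bandwidth saved, against a one-off 48-hour encode
```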
Confusing encoding and decoding requirements? Pretty bad for a professional videographer.
People don’t encode videos on mobile devices, and decoding WebM has less overhead than decoding H.264. Try again.
Also, I think you badly overrate the impact of professional video and quality on the web. Video on the web is mostly Youtube and Dailymotion, not carefully polished professional video.
Edited 2011-03-16 06:23 UTC
But yes, they do. 99% of SoC chipsets on mobile devices these days are capable of encode and decode. Any mobile device recording video is doing encoding. The abundance of 1st and 3rd party on-device (iMovie for iPhone, etc.) software speaks for itself as to consumer adoption.
As for WebM, repeat after me: WebM SoC/DSP acceleration is not yet widely available. Right now it is completely irrelevant to even attempt to assert that CPU usage is less than H264 when H264 devices are using SoCs/DSPs with no battery drain.
Last, we can fight about this all day but the only real thing that counts is that consumers don’t care as long as their content can be recorded and viewed without problems. WebM has a long way to go.
Which they do through hardware, so the speed of libvpx isn’t important to this discussion. The solution to this is simply to update the hardware being pushed out in these devices, something which Google has apparently convinced all the hardware manufacturers to do. It will take probably 2 years before that support is truly widespread, but it’s coming.
Quite correct. h.264 will be far superior on mobile devices until this is rectified. This will happen, the only question is how long it will take.
Agree again. Consumers don’t care, they just want their video to work. People like Eugenia are not typical consumers (at least not typical web video consumers) – she tends to lean more towards the professional side which makes her think that certain h.264 attributes are important when most people don’t care one way or the other. Like the VHS vs Betamax wars, where one was obviously superior to the professionals, but was completely abandoned by the market anyway.
What he said, except I don’t believe in H264-encoding DSPs which have no battery drain. There’s nothing magical about DSPs, it’s just a processor on the inside, eating battery power to provide computational power. The miserable battery life of current smartphones shows well enough that hardware acceleration is no panacea as far as battery life is concerned. For that, we need computational efficiency, instead of more computational power.
The miserable battery life of smartphones is mostly due to the power consumption of antennas in the device and your distance from the source.
Turn off 3g and you’ll see the difference.
Then why do Symbian phones typically last much longer than Android/iOS ones on battery, even at similar connectivity settings?
The screen does have a role to play, but twice the screen area does not explain a 1.5 day vs 4.5 days difference alone. Software also has a role to play there.
Faster, more modern hardware consumes more power when it runs at full speed, but theoretically less power when running at low speed…
“No battery drain” is a myth.
Power drain was measured for the Anthill WebM hardware encoder and the results shown here:
http://blog.webmproject.org/2011/03/introducing-anthill-first-vp8-h…
It was measured at 12 milliwatts compared with between 1025 and 3459 milliwatts for different software releases.
Perhaps about 1% of the power drain would be a better estimate than “no power drain”.
Edited 2011-03-16 11:10 UTC
I wonder about their measurement methodology. From the title of the plot, it looks like they are comparing the power consumption of the CPU (and nothing else) between software encoding and hardware encoding.
I hope I’m wrong. Otherwise, the measurement is obviously totally flawed, as they don’t take into account the power consumption of the hardware encoder itself. This plot only serves as a way to say “look, the CPU is idle and can do something else meanwhile !” (which is indeed a benefit of hardware encoders).
Edited 2011-03-16 11:31 UTC
You must be new to this and/or forgotten the whole H264 debacle over mobile SoC hardware decode/encode from ~ 2001, which is now repeating itself, once again.
Here, note where it says ‘battery savings’:
http://news.cnet.com/8301-30685_3-20043249-264.html
Every Android device from this point onwards (Android 2.3.3 or higher) will support WebM.
The following chips will support WebM decoding hardware acceleration:
Rockchip RK29xx
TI OMAP 4 and OMAP 5
Chips&Media BODA9/ CODA9
VeriSilicon ZSP Digital Signal Processor Cores and SoC Platform
Qualcomm Adreno 3xx
ARM with Neon extensions
Nvidia Tegra 2 and 3
These are just the devices found within a few minutes web search. The latter two (at least) also have support for WebM encoding in hardware.
Just about the only new ARM SoC design announced lately that does not support WebM in hardware is the one used in the iPad 2.
Most of the Android 3 (Honeycomb) tablets announced recently use the Nvidia Tegra 2.
http://mashable.com/2010/01/27/9-upcoming-tablet-alternatives-to-th…
http://www.andro-tablets.org/
If you are wondering if that is significant:
http://www.andro-tablets.org/android-tablets-should-dominate-the-ma…
Edited 2011-03-16 09:58 UTC
Note where I said ‘not widely available’. Try reading my response next time.
The points are orthogonal, neither is wrong.
Yes, WebM hardware acceleration is not in most mobile devices currently out there in use in the field.
But no, you will find that a lot of the Android mobile devices on the market right now for sale do in fact include WebM hardware acceleration.
It comes down to how one would interpret “not yet widely available”. It is indeed available, it is on the store shelves right now. It is actually a featured selling point for the latest offerings in Android mobile devices.
You don’t know what you’re talking about in the mobile space. As usual.
You think the cameras present on nearly EVERY smartphone are storing raw video?
The Nokia N8, for example, which is probably still the best quality “cameraphone”, encodes HD via H.264 with AAC for the stereo audio. Video encodes at around 9 Mbit/s, with stereo audio encoded at 128 kbit/s at 48 kHz sampling.
I don’t really understand you. You’re saying that because the currently existing WebM encoders are slow that they’ll never be able to be fast?
Gee, remember when H.264 came out and it was still new? All the tools were abysmal and it took years to encode something. Well, lookie here, we have the exact same situation: a young, new codec and abysmal tools. So why is WebM treated differently?
I’m just saying that YES, the encoders are still slow, but for f*ck’s sake look at their age! Drawing the conclusion that because it’s slow NOW it can’t ever be fast is just plain short-sighted ignorance, nothing else.
Especially considering the very first software upgrade, which was aimed at improving decoder speed, achieved a 4.5x improvement.
http://blog.webmproject.org/2011/03/vp8-codec-sdk-bali-released.htm…
Also consider that the next release, which aims for another similar improvement factor again, is due in about three months time.
Edited 2011-03-16 11:19 UTC
I know this article is about encoding, but ultimately, it doesn’t matter.
The professionals here are getting far too caught up in what’s important to them, as a professional, as compared to what’s really truly important, which is decoding.
You see, encoding only matters to the small number of people who create video and, really, the even smaller number of people who do the transcoding (Vimeo, YouTube, etc.). For encoding, it’s the latter that even matters. And really, once encoding performance becomes good enough, anything else is gravy. So really the question is, has WebM encoding performance become good enough? Everyone that matters seems to be saying absolutely.
Beyond encoding, decoding is ultimately what matters. Streams can be decoded millions of times. Here, it appears WebM has a lead in decoding performance. Furthermore, hardware is now becoming available with hardware assist. For one of the largest emerging markets, this means WebM is most definitely a market winner.
Given that WebM has achieved visual parity with x264 and is beating x264 on decoding (which translates into improved battery life), WebM most definitely is providing serious competition.
As a professional, it may hurt your feelings that your take on it doesn’t really matter, but the reality is, your technical analysis of why x264 is king is nothing but noise in the market. Where it matters, WebM is already adjusting the entire market to accommodate it; and it’s just barely out of the gate.
That is a great utility, but it has exactly the same problem as using PSNR for this kind of stuff… The fact that it uses perceptual biases instead of strict signal-to-noise doesn’t matter much. The problem is that both a utility like the one you use and PSNR are meant for comparing an input to an output. One measures SNR, the other perceptual differences in pixels. In either case you get a relative difference to the input, i.e. a “how much has changed” value.
Both are a fair facsimile of human vision, and arguably pdiff is a better one. Either metric tends to work decently at low compression ratios, but neither does a good job of comparing “subjective” video quality when you start dealing with high compression ratios, where the output is often significantly different from the input.
The point is that being as close as possible to the input is not the goal at high compression ratios, it is more important to have the output look clean and natural, contain minimal artifacts, not lose too much “subjective” detail, etc.
Higher compression ratios (ratios that you as a video professional would outright reject because they adversely affect quality too much) are the norm for web video… PSNR and pdiff metrics for such video are mostly useless, as you can easily find examples where the relative scores do not at all match most people’s subjective opinions.
Edited 2011-03-16 20:09 UTC
I personally am much more interested in the quality aspect than in encoder speed. Unfortunately, there seems to be no quality benchmark anywhere that is actually fair to both parties; either it tries to pose H.264 as superior and thus uses inferior settings for WebM, or vice versa.
How I would like a quality benchmark to be done:
1) Have a 30-minute video at really high resolution and no lossy compression as the source, with a few steady scenes where it’s important to preserve the colours and clarity as well as possible, a few scenes with lots of movement where it’s important not to produce many compression artifacts, and then lastly low-light versions of all the aforementioned scenes so we can quantify how well the encoders manage low-contrast situations.
2) Find an H.264/x264 enthusiast who knows his stuff and tools to do the H.264 encoding, and a WebM/VP8 enthusiast to do the WebM encoding. It’s not quite possible to avoid bias anyway, so why not just choose biased parties from the start but only let them work on what they prefer? This way both of them will do their best to get the best output they can.
3) Establish the conditions for the encoding tests, like for example bitrates as 150kb/s, 600kb/s and 1800kb/s and resolutions as 320×240, 640×480 and 1024×768, and then encode all the combinations above.
4) Upload all the resulting output files somewhere so people can make their own comparisons, but also clearly state file sizes, encoding times and any settings used and provide a few sample images from all the different conditions.
Then we could actually have some meaningful discussion without having to resort to “he is biased, he doesn’t use the best possible settings!” arguments.
5) Send the original file through a blur filter (no compression) to show how effing stupid tuning for PSNR is
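The blur-filter jab in point 5 can be demonstrated in a few lines. This is a hypothetical illustration, not part of any proposed benchmark: a 3x3 box blur throws away fine detail, yet on smooth content it barely moves the PSNR needle. The frame here is a synthetic gradient, an assumption made purely for the sketch.

```python
import numpy as np

def psnr(ref, out, max_value=255.0):
    """Peak signal-to-noise ratio in dB."""
    mse = np.mean((ref.astype(np.float64) - out.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(max_value ** 2 / mse)

def box_blur3(img):
    """3x3 box blur with edge replication, pure numpy."""
    padded = np.pad(img.astype(np.float64), 1, mode="edge")
    acc = np.zeros(img.shape, dtype=np.float64)
    for dy in (0, 1, 2):
        for dx in (0, 1, 2):
            acc += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return acc / 9.0

# A smooth horizontal gradient: blurring changes almost nothing that
# MSE can see, so PSNR stays sky-high, even though the same blur
# applied to a detailed frame would look obviously soft to a viewer.
frame = np.tile(np.linspace(0, 255, 320), (240, 1))
print(psnr(frame, box_blur3(frame)) > 40)  # True
```

This is the core of the “tuning for PSNR” complaint: an encoder can buy measurable PSNR by smoothing away detail that viewers would actually miss.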
Here is a decent attempt at quality comparison done in May 2010:
http://compression.ru/video/codec_comparison/h264_2010/index.html
In June 2010 they produced this update which included VP8:
http://compression.ru/video/codec_comparison/h264_2010/vp8_vs_h264….
Considering only quality per bit, and ignoring encoder speed, this SSIM metric comparison puts WebM quality behind that of x264 but ahead of XviD. The VP8 “best” preset just overtakes the x264 high-speed preset; all other x264 presets are progressively slightly better.
In June 2010 the only version of WebM was the launch version of libvpx, the reference codec.
Since that time there have been two subsequent releases of libvpx.
http://www.osnews.com/story/24502/WebM_Project_Releases_New_Version…
These WebM releases were announced here (Aylesbury):
http://blog.webmproject.org/2010/10/vp8-codec-sdk-aylesbury-release…
and here (Bali):
http://blog.webmproject.org/2011/03/vp8-codec-sdk-bali-released.htm…
http://www.osnews.com/permalink?465497
The most recent release of the libvpx reference code for WebM is 12.8% better quality than the version shown in the graphs plotted in June 2010 at compression.ru.
Hope this helps.
No, it doesn’t. Percentages do not say anything about how it actually looks to the eye, and thus referring to them as some sort of holy bible doesn’t really tell me much.
This observation is perfectly correct. In “blind testing”, a group of people are shown different versions of a video, without being told anything else about them, and they are asked which version they prefer.
Blind testing of WebM launch release showed that people were simply unable to pick the difference between h264 and WebM. Some preferred one, other preferred the other, and still others said they couldn’t pick between them.
“The eye” effectively can’t pick these small differences in measured video quality.
PS: It was you who asked for video quality benchmark results, not I.
But as I said, comparisons of how many percent one version has gained over the previous one in a single metric simply don’t suffice for a benchmark. And I asked for the actual output files so people could make their comparisons and form opinions themselves, not just one metric or another.
I personally have no tools to shoot such a source video as I described, nor do I have the necessary knowledge of what encoder parameters I should use, and thus I’m not qualified to make the benchmarks myself.
The YouTube HTML5 trial lets you join and un-join the trial.
http://www.youtube.com/html5
If you have Firefox 4 RC or Chrome, I think you can use this trial to view the same video at the same resolution in both WebM and h264 versions. Some video clips are available at 720p resolution.
Recently I haven’t been able to spot any difference. Hope this helps.
Of course you can’t. The only way people can spot the difference is if you make stillframes and zoom in 56 times.
I can guarantee you: even hoity-toity “indy” videographers won’t be able to tell them apart in a proper double-blind test. It’s all a bunch of bullshit, like audiophiles claiming they can hear a difference because their digital (!) cable is gold-plated.
This obviously shows your lack of knowledge of the audio field. Yet another example of awful journalism on your personal blog. You totally should be fired.
Real audiophiles can hear the sound of one dropped bit in several hours of 24bit/128kHz sound. And get a refund on their whole Hi-Fi setup when doing so. All true audiophiles also have a special contract with their electricity network, to ensure that their power supply never goes more than -256 dB away from the 230V/50Hz sinusoid.
What’s more, you showcase your blatant lack of knowledge and research by suggesting that audiophiles may listen to digital audio from time to time. True audiophiles only listen to records and magnetic tapes, as they can’t stand the cold sound of digital hardware.
We call this voodoo-audio in Poland. People believe in these things, and might kill you if you tell them they are wrong, haha. Then we take them and show them blind tests, where they usually fail, crying.
That was hilarious. I’m going back to my highly optimized bi-wired listening experience.
http://xkcd.com/841/
We do not know what encoder settings were used, what version of the encoders were used, nor do we actually know if the videos have been encoded with the same bitrate settings.
None of these things are going to be important to people watching web video.
They aren’t even particularly important to people hosting video as long as the filesizes (and hence bandwidth requirements) are about the same. There are only three important parameters: (1) can viewers see it, (2) does it look as good to viewers, and (3) how much does it cost me to host?
For WebM, compared to H264, the answers to these questions are: (1) yes if they install a recent browser and (in some cases) the OS codec, (2) yes, and (3) a lot less.
…that after seeing Eugenia post this link on Twitter and following it, what I found was predictable: the same old sad arguments between the usual suspects in the comments. *sigh* Even down to the inevitable 15-link lists Lemur2 always uses as a crutch to support any argument.
I prepare to be modded into oblivion, safe in the knowledge that this entire comment section is bullshit between whiney little bitches.
I provide lists of links to back up what I say. If I don’t provide links whiney little people like you will endlessly challenge whatever I say. The links are a crutch for you, not for me.
If you are prepared to believe me, then you don’t need to read the linked pages.
His point — which I happen to agree with — is that you overuse links. A whole gathering of links just makes it feel like you yourself have no insight to provide whatsoever and thus makes your comment seem irrelevant.
As I said … you don’t have to read the links. As I also said, if I don’t provide links, people just moan that whatever I say is unsupported.
I’m between a rock and a hard place here, damned if I do and damned if I don’t provide links.
I have adopted the policy of providing the links as backup for what I say. Normally this quells the number of people who challenge what I say, but not always. On balance, links are better than no links.
I believe it is possible to provide links to support nearly any position if you are anal enough to be bothered.
The point is more – no, you are not always “right” so why the hell should I have to agree with you? Get over yourself, with all due respect, obviously. (hint: zero.)
You typically provide a bunch of redundant links that no one cares about.
When it comes to providing links, less is more. It seems that half the time you are filling your posts with links to add gravitas, but people would probably take them more seriously if you shortened them and got to the point.
Thanks for doing this. If someone needs more information, he can click on the links. Also, this way there’s no need of “copying and pasting” all the information. That is what links are for.
I’m glad you have that much time to waste. It’s more time than I have to dedicate to being a whiney little bitch -> you win on that front. Whiney little know-it-all! Yes. I like how you hide behind the Myers-Briggs bullshit personality assassination too. Always good to have an ultimate excuse for being “you”.
No, I don’t think you understand -> they are your crutch, because you want them to prop up your opinion. Bearing in mind your opinion is worthless**, that pretty much makes it a crutch.
** as is everyone else’s opinion on here. Myself included. Everyone on here is a whiney bitch or self congratulating tunnel visioned muppet these days.
Why would I believe you? You try too hard to be right, so you must be hiding some kind of inadequacy. I’m not interested in being lectured to. You are not always right. What you are is a whiney bitch that will not let go till they have the last word.
It’s worth noting that since you joined “this entire comment section”, you have, by your own definition, made yourself a “whiney little bitch”.
Exactly.. I hate myself as much as I hate you. Or similar.
Indeed. Obsessed with one-upping, instead of listening to people who just may have been doing this slightly longer than they have. But you don’t have to have time in the actual field measured in decades to be an internet expert these days, just the ability to search Google.
Agreed. I tend to listen to Eugenia – she is passionate about the subject matter, and, you know, actually works as a semi-pro in the field (maybe “semi-pro” is an exaggeration, but having completed clips for local bands to use as promos, I see her more as a pro than a consumer-level user). What she is saying also matches my experience as someone who really doesn’t give a toss about using proprietary codecs. WebM, Ogg/Theora, whatever. If I have to re-encode files using those codecs/containers to use them on my PSP (which is a device I do watch video on and is not Apple, so not immediate flame bait), really, what is the point? Wasting 30+ minutes per 700MB of 720p video by having to re-encode the file is a PITA, end of story.