It’s really more than an HDMI competitor; it’s a cable specification that “converges full uncompressed HD video, audio, 100BaseT Ethernet, high power over cable and various control signals through a single 100m/328ft CAT5e/6 LAN cable.” That’s an idea I can really get behind: no new proprietary connectors, no expensive cables needed, and consolidation of all the necessary signals into one cable. The founding companies include LG Electronics, Samsung Electronics, and Sony Pictures Entertainment.

Of course, having a few electronics giants behind your proposed new standard doesn’t necessarily assure its success. Sometimes it just assures that those companies’ closest competitors get behind a competing technology and you get another pointless platform war. But this one doesn’t seem like a good candidate for that scenario. Who would be the loser if hdBaseT were to prevail? The perennial villain of the AV world, Monster Cable. Of course, they’d soon be selling $100 gold-plated, 15mm-thick ethernet cables for idiots to buy, so maybe they wouldn’t care so much.
The thing I find most interesting, and most unbelievable, is that hdBaseT would allow “a single-connector TV to receive power, video/audio, Internet and control signals from the same cable.” It’s the power that has me surprised. I suppose the AC from the house would be converted to DC in some appliance before being transmitted as DC over the cat5 cable. But what this enables is, for example, a single media server or DVR centralized in the home, with inexpensive cable run from it to any monitor or television in the house. And for personal computers, it would allow a single slender cable to connect the box to the monitor. The hdBaseT documentation doesn’t say anything about USB, but I’d imagine it wouldn’t be hard for USB to be one of those “various control signals.” It would be very cool if a single PC could support multiple “terminals” in a home or office using hdBaseT. The software wouldn’t be hard.
We’ll be keeping an eye on hdBaseT, because it could have a big impact on both AV technology and computing.
You mean a generic cable that does it all? I’m buying.
A generic cable that does it all… that’s already widely used and cheap to manufacture.
Which will ultimately either drive up the price of existing Cat6 cable or spawn a “modified” version, and in the end you’ll be paying two or three times as much as you already do for plain Cat6.
Count on it.
Considering the absolute mess of cables behind my TV, DVR, DVD, etc., it would be absolutely fantastic.
Too bad the manufacturers’ goal is primarily to make money, and the proprietary-connector trap has worked pretty well for them so far. Customer satisfaction has only ever been a distant second.
Name one connector in AV that is proprietary.
HDMI
Sounds like a monster cable. I’ll have to shop around for the best buy when it comes out.
http://en.wikipedia.org/wiki/DLNA
http://en.wikipedia.org/wiki/Power_over_Ethernet
Between these two I think you have all you need to make this work. It seems to me this is likely just an industry agreement to actually use existing standards.
Don’t quote me on this, but as far as I know DLNA has severe limitations, like not being able to support SRT subtitles. Some implementations also have issues with the MKV container format, and a lot of implementations don’t support things like Vorbis audio.
While it might be suitable for some people, I’d imagine DLNA wouldn’t be good enough for a significant portion of osnews’ readership.
Considering how many cables are needed to make a home theater work (and wireless is nowhere close to being as high quality), getting everything down to one or maybe two ethernet cables would rock. Now we just have to hope Sony doesn’t screw it up by demanding ridiculously high royalties to implement it. Hopefully LG and Samsung can rein Sony in.
I love this new standard; however, there’s a single problem with it. Thin mobile devices, and even consumer cameras, won’t be able to carry this “thick” plug. I think we’d still need a “mini” to “normal” adapter, like we do with the new mini-HDMI. And I personally dislike that… but hey.
Simple: Bring back the dreaded PCMCIA “dongles”
http://www.google.com/images?hl=en&q=ethernet%20dongle
God I hate those things.
From making those cables myself, I know the actual wires are pretty damn small, around 1mm; making a flat connector about 2mm high and some 15mm wide should be a piece of cake.
Yeah. But you still need a different adapter on the thin device’s side. It won’t be a *standard* plug on both sides. This is what I don’t want: adapters, and “mini” versions. I want the same plug on all the devices, be it a TV/BD player, or a cellphone.
That’s silly. “One connector does all” doesn’t work as you yourself pointed out. Putting a mini connector on a full-size device is just as silly as putting a full-size connector on a small device. You NEED different standards for different uses. A regular connector for regular needs, and a mini connector for mini needs.
No, I don’t “need” that. Instead, it’s imposed on me. The more standard the cable on both sides, the easier and cheaper it is for me (so I could have a single cable for all my uses, instead of several different ones).
I feel the same about USB, mini-USB and micro-USB, btw. I want a small connector that fits on all kinds of devices. This is 2010; we should be able to design a connector with enough pins to do all the jobs we want it to do in a small/thin plug.
I agree the multiple USB connectors are annoying, but I appreciate that the larger “standard” USB connector is sturdier and can survive a little abuse, whereas the smaller versions are a royal PITA, always getting bent or falling out with even slightly rough use. If every device had small, fragile connectors like that, it would drive me bonkers.
Well, I don’t want that. So if we each have a vote, I vote against your objection.
Why didn’t they just do this in the first place instead of reinventing the connector with HDMI?
I’m not sure I’m terribly excited about getting the ethernet connection all the way to the TV. Television would optimally be just a dumb display device, with all the “smarts” in the set-top box. A TV shouldn’t be something you update to get new intelligent features; it makes more sense to update the cheap set-top box instead (by buying a new one, or upgrading the software).
Moving more intelligence into the TV seems to be a trick to create artificial obsolescence for a class of products that used to be “good enough” for five years or more. I even think the DVB receiver in the TV is redundant ;-).
Personally, I dislike TVs completely, which is why, if I wanted a large screen, I’d get a passively cooled PC and an LED projector; it would give me more than any TV at that price could.
Unless you want a bright room (and who likes brightness anyway?), you can have a decent setup this way that does just about anything.
“…hdBaseT would allow ‘a single-connector TV to receive power, video/audio, Internet and control signals from the same cable.’”
While I really like the hdBaseT idea and hope it succeeds, there is one issue I have.
Is there going to be something that prevents it from burning out other devices? I mean, if it’s going to be a standard ethernet cable and you plug it into some random ethernet device, sending that much electricity into it could fry the hell out of it.
I’m sure it will work similarly to PoE: the connected device initially receives a very small amount of power, and then negotiates with the “host” (the end supplying the power) for the power level it needs. Edit: actually, the initial detection seems to be done with a passive resistor (I was thinking of how USB works).
“High power” is defined as 12.95W to 25.50W at ~50V, per Wikipedia.
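For the curious, here’s a rough sketch of that detect-then-classify logic in Python. The 25kOhm signature resistance and the class wattages come from the 802.3af/at specs; everything else (the names, the structure, the tolerance) is invented for illustration:

    # Sketch of a PoE power-sourcing decision. The spec values are from
    # IEEE 802.3af/at; the code itself is illustrative, not from any
    # real PoE implementation.

    SIGNATURE_OHMS = 25000       # valid powered-device signature (~25 kOhm)
    TOLERANCE = 0.10             # assumed acceptance window

    # class -> max watts available at the powered device
    POWER_CLASSES = {
        0: 12.95,   # default / unclassified (802.3af)
        1: 3.84,
        2: 6.49,
        3: 12.95,
        4: 25.50,   # "high power" (802.3at)
    }

    def looks_like_pd(measured_ohms):
        """Detection phase: probe at low voltage and look for the
        signature resistor before applying any real power."""
        return abs(measured_ohms - SIGNATURE_OHMS) / SIGNATURE_OHMS < TOLERANCE

    def grant_power(measured_ohms, requested_class):
        """Return the wattage to supply, or 0.0 if the far end
        doesn't present a PoE signature (e.g. an ordinary NIC)."""
        if not looks_like_pd(measured_ohms):
            return 0.0           # plain ethernet gear: leave the line unpowered
        return POWER_CLASSES.get(requested_class, POWER_CLASSES[0])

    print(grant_power(25000, 4))   # "high power" device -> 25.5
    print(grant_power(150, 4))     # random NIC          -> 0.0

The point is that an ordinary device never presents the signature, so the line simply stays unpowered.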
There’s nothing new about this technology – I work in a building that provides PoE to all network jacks in order to support Cisco IP phones, and any ethernet device can be plugged into these jacks just the same.
Made that mistake before. I was working with an embedded system that had two 6-wire phone cables that plugged in right next to each other. One carried a large amount of current to power part of the system; the other was a real phone line. I plugged the powered line into the modem and melted the board. I was pretty proud of it. I’m sure I wasn’t the first one to do that with our product.
There’s lots of people who’ve fried their analog modems by plugging them into the digital PBX jacks in hotels. I remember some hotels had large “DO NOT USE WITH LAPTOP MODEMS” signs over all the phone jacks back in the 90s.
http://www.amazon.com/Denon-AKDL1-Dedicated-Link-Cable/dp/B000I1X6P…
Read some of the reviews.
“Transmission of music data at rates faster than the speed of light seemed convenient, until I realized I was hearing the music before I actually wanted to play it. Apparently Denon forgot how accustomed most of us are to unidirectional time and the general laws of physics. I tried to get used to this effect but hearing songs play before I even realized I was in the mood for them just really screwed up my preconceptions of choice and free will. I’m still having a major existential hangover.
Would not purchase again.”
Holy pope on a pogo stick!
I am in shock, in awe… I don’t even know what to say.
Thanks for the link (and death to audiophilia, the world’s dumbest religion).
http://www.hdbaset.org/
http://en.wikipedia.org/wiki/HDBaseT
Doh! Thanks for noticing I messed up the link.
Great. So now we’re going to have people plugging their TVs into their home routers and complaining that the routers no longer work because 100W (or volts, or whatever) of power is pumped down the cable.
Here’s hoping they’re smart and configure the power side of things as “start with PoE checks, then negotiate power envelope, then start power,” and not just “connection made, dump 100W on the line.”
IOW, making it optional and starting it only if both ends can handle it.
If they don’t, then they really should not be overloading cables and connectors for completely different uses.
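Something like the following sketch, where the Port class, its fields, and the 100W cap are all invented for illustration; only the ordering matters:

    # Sketch of a safe power-up order: detect, negotiate, then energize.
    # Nothing here is from an actual hdBaseT implementation.

    from enum import Enum

    class State(Enum):
        IDLE = 0        # data only; the line is never energized
        POWERED = 1     # both ends agreed on a power envelope

    class Port:
        # Stand-in for one port; entirely hypothetical.
        def __init__(self, signature_ok, requested_watts):
            self.signature_ok = signature_ok
            self.requested_watts = requested_watts

    MAX_ENVELOPE = 100.0   # watts; the figure floated in this thread

    def bring_up(port):
        # Step 1: low-voltage PoE-style check for a valid power signature.
        if not port.signature_ok:
            return State.IDLE, 0.0      # ordinary router/NIC: stay dark
        # Step 2: negotiate an envelope, capped at what the spec allows.
        watts = min(port.requested_watts, MAX_ENVELOPE)
        # Step 3: only now start supplying power.
        return State.POWERED, watts

    print(bring_up(Port(False, 0.0)))    # home router -> (State.IDLE, 0.0)
    print(bring_up(Port(True, 100.0)))   # wall TV -> (State.POWERED, 100.0)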
The original post didn’t include any links, so here’s one I found.
http://www.youtube.com/watch?v=mfbiILDmw-I
Interesting video, but it looks like Valens really needs to hire a native English speaking salesman!
The idea is great, but the information is a bit vague. They don’t say how many simultaneous video streams could go through the wire, for example. Plus, it looks like those of us with a fairly complex home network would have to replace our network switches for the signal to get through, and watching video on a PC would require a new NIC.
Why not go with the obvious and established solution instead? Just stick everything over IP and be done with it. This would have the following benefits:
– any PC on the network could consume audio/video.
– any PC on the network could produce audio/video.
– the signal could also go wireless with a WiFi access point.
– anybody with network knowledge could do awesome things with that.
– A/V devices would become part of the managed network.
The only reason I can think of to send uncompressed video is that it keeps the receiver from having to decode a compressed format. But considering most video entering the system will be compressed in the first place, it doesn’t seem to make much sense in terms of picture quality. So we could stick with compressed video and fit a heck of a lot of channels on a gigabit network.
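A quick back-of-the-envelope check, assuming a typical ~10 Mbit/s H.264 stream (that bitrate is my assumption; pick your own and the conclusion barely changes):

    # Rough bandwidth arithmetic for the "lots of channels" claim.

    GIGABIT = 1_000_000_000                 # bits/s, ignoring protocol overhead

    uncompressed = 1920 * 1080 * 24 * 60    # raw 1080p60 at 24 bits/pixel
    compressed = 10_000_000                 # assumed ~10 Mbit/s H.264 stream

    print(uncompressed / GIGABIT)           # ~2.99 -> one raw feed won't fit
    print(GIGABIT // compressed)            # 100   -> ~100 compressed streams fit

So a single raw 1080p60 feed already needs about three times what gigabit can carry, while roughly a hundred compressed streams would fit.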
I suppose this would only appeal to the type of people that frequent OSAlert though…
Video over IP is not a new idea and will be used more and more over time, but it has limitations. Sure, you can send the video over wireless too, but what if it’s not fast enough? Stuttering or slow playback will get you product returns even when it’s the end user’s fault. Uncompressed video is virtuous because it is not controversial: what compression algorithm would you choose? If you say H.264 you’re going to get yelled at; if you say Theora or VP8 you’re going to get yelled at. I’ll yell either way!
Also, with the spare-pair flavor of PoE you don’t get gigabit, since the extra wires carry power instead; phantom power over the data pairs can coexist with gigabit, though, which uses all four pairs.
Still, I never liked HDMI, and I absolutely detest the constant connector-of-the-month churn. DVI? HDMI? One of the two or three newer ones? I prefer plain old HD-15 and composite video! Simple, consistent, and they didn’t change every other year. We need a high-capacity digital equivalent of those plugs, and we need to stick with whatever we choose for at least ten years.
I can’t wait for users to blow up their ethernet equipment with power over ethernet.
I’ve plugged countless machines into PoE jacks without ever blowing them.
A PoE switch delivers no power until it’s known that the device on the other end needs it.
Looks like they’re picking up where HANA left off….
http://www.1394ta.org/about/HANA/index.html
The problem is there is no motherboard maker in the pool.
And, in my experience, the RJ-45 connector is rather fragile.
Agreed: the plastic tab that’s supposed to hold it in place has a tendency to get messed up over time (either it fails to spring back into place or it breaks off; either way, the cable can come loose very easily). On the other hand, this is probably just due to cheap cables. A prime opportunity for Monster Cable to step in.
Anyway, ethernet ports are already standard on most Blu-ray players, game consoles and other set-top boxes these days, so it’s not as if consumers are being asked to use anything unfamiliar…
Print “Monster” all over them and you can easily get 170 USD for a 3m cable.
Anyone willing to spend that much on cables deserves to have their money taken and redistributed. Best Buy is performing a public service by selling Monster cables.
I just read an article about Intel’s new Light Peak system… that standard is supposed to do a similar thing over fiber. Sony was reportedly excited about Light Peak also… what gives? The only problem with Cat5 is that installers believe it is a “super cable” that can be used for everything. That concern aside, using Cat5 connectors and cables in this manner sounds like a neat application. I wonder how much more data and power can be shoved through Cat5/6 cable?
So one cable won’t be enough for the devices that could use it the most, namely big wall-mounted HDTVs.
Interesting tech, but I doubt it will be worth the cost of having a non-standard PSU.
I wouldn’t even care if they used multimode LC fiber for the spec. At least then we could be done changing it.
You can buy 10-foot multimode jumpers for about $20 each, which isn’t much more costly than an HDMI cable.
Some of the new 3D formats will already max out HDMI. Now there’s just the problem of building cheap enough lasers.
So… I had good AV gear with composite connectors, then I upgraded to S-Video, then I bought an HDTV as an early adopter. That HDTV used analog component video inputs for 1080. But the industry blocked HD from most sources on those connectors in favor of DRM-enabled HDMI connectors. I now have an HDMI-based system.
Now I may get to change it all again…? That sucks. I don’t care if it uses standard cables and connectors; I am getting tired of changing my connectors/cables/systems.
I can get HDMI cables for about six bucks, it’s one cable, and it works fine for me. Leave the RJ-45 connector to computer networking and the occasional phone system.
End rant