Friday, January 31, 2014

Why no product reviews? On Shootouts and Demos, with an apology to Extron

About what do AV designers talk? Design, certainly, in all of its forms. Past projects and wish lists. Perhaps most of all, we talk about technology. For all of our talk about these things, there are relatively few actual product reviews or comparisons. I'll talk about products here, but stop short of formal endorsements or non-endorsements. Why is that, and what are the perils of going further? I can illustrate with two examples and an apology.

First, Infocomm 2013. AV_Phenom Mark Coxon saw the dizzying array of HDMI-over-structured-cable extension systems and decided that an old-fashioned "shoot-out" was in order: he'd take a selection of them and compare them for the benefit of the rest of the industry. Right away he ran into problems.

1. Acquiring Appropriate Gear
Manufacturers pretty much universally agreed, outwardly at least, that this was a great idea. When it came time to actually get sample products, roadblocks appeared. Their trade-show demos were strapped down to permanent displays. They were missing power supplies. They didn't know if they had the latest firmware updates. Some of these might have been legitimate issues. Others might have been a lack of comfort with the risk of taking part in a showdown under conditions they couldn't control. Whatever the reason, it's not something manufacturers are quite comfortable with.
 

2. Creating a Fair Test
If you read the original post from last year, you'll see that on the first try none of the extenders worked. At all. Blank screens all around. Removing the extension system and running sources directly to the display, of course, resulted in a perfectly clear picture. Every element of the test had been proven good except the extension system. Which means that the extenders were bad. Right?

Not right. Replacing an active HDMI cable with a 99-cent special made everything work. So the problem is the active HDMI cable. Right?

Not right. Months later I saw the same problem: an active cable not working on the back end of an extender. Replacing it with a seemingly identical cable made the problem go away. The issue appears to have been a defective cable; not quite so defective as to give no picture, but marginal enough to fail with some equipment.

Notes on Digital Video

Side note on digital video: as I'm sure you're aware, a digital signal is just a string of ones and zeros. One way to measure the integrity of such a signal is with an oscilloscope. Ideally, ones should be very high, zeroes very low, with a clear, sharp transition between them; the overlaid trace of many bits on the scope is called an "eye pattern". As the signal attenuates and picks up noise, the "eye" will flatten and become less sharply defined. Different receivers have different "eye masks": their tolerance for imperfect signals. The problem with this kind of test is that, absent some rather costly and complex test equipment, it is impossible to determine where the signal is degrading, how, and to what extent. We're left with the binary "it works/it doesn't work". Which leads to the final issue:
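To make that a bit more concrete, here's a minimal Python sketch of the idea (not from the original shoot-out; the waveform, noise level, and oversampling factor are all invented for illustration). It folds an oversampled serial signal into one unit interval, which is essentially what a scope's eye-pattern display does, and reports the vertical eye opening at the sampling point:

```python
import numpy as np

# Illustrative only: real TMDS eye testing is done with a calibrated scope
# against the HDMI eye mask, not with a script. All constants are made up.
SAMPLES_PER_UI = 32                      # assumed oversampling per bit period
rng = np.random.default_rng(0)

# Synthesize a noisy, band-limited NRZ bit stream as a stand-in for a capture.
bits = rng.integers(0, 2, 500)
ideal = np.repeat(bits * 2.0 - 1.0, SAMPLES_PER_UI)     # +1 / -1 levels
noisy = ideal + 0.15 * rng.standard_normal(ideal.size)  # additive noise

# Crude low-pass filter to round off the edges, mimicking cable attenuation.
kernel = np.ones(8) / 8.0
waveform = np.convolve(noisy, kernel, mode="same")

# Fold every unit interval on top of the others: this is the eye diagram.
eye = waveform.reshape(-1, SAMPLES_PER_UI)

# Sample at the center of the UI and measure the vertical opening:
# the gap between the lowest "one" and the highest "zero".
center = eye[:, SAMPLES_PER_UI // 2]
eye_height = center[bits == 1].min() - center[bits == 0].max()
print(f"estimated eye height at UI center: {eye_height:.2f} (full swing is 2.0)")
```

Increase the noise term or lengthen the low-pass kernel and the reported eye height shrinks; let it collapse entirely and the receiver can no longer tell ones from zeroes, which is the blank-screen failure mode described above.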

3. They all work
Coxon's result was something I could have told him before he started: all of the extenders were able to pass video. After all, for a manufacturer to sell a product which simply doesn't do what it is advertised as doing would be rather shocking, and would result in a short life for that manufacturer. Yes, there are secondary tests he could have performed but didn't: unplug and reconnect the video source to measure sync time; unplug and reconnect the power to compare startup time. The larger point is that these kinds of devices have become somewhat commoditized; not only do many have the same function, but they also share similar form factors and, under the hood, use the same chipsets. It's ultimately a comparison of apples to very slightly different apples.
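For what it's worth, those secondary tests don't require anything more exotic than a stopwatch. Here's a hypothetical little Python helper (the device labels and trial count are placeholders, not anything Coxon actually used) that just timestamps "source reconnected" and "picture stable" and averages the results per device:

```python
import time
from statistics import mean

def time_resync(label, trials=3):
    """Manually timed re-sync test: press Enter as you reconnect the source,
    then press Enter again the moment a stable picture appears."""
    results = []
    for n in range(1, trials + 1):
        input(f"[{label}] trial {n}: press Enter as you reconnect the source... ")
        start = time.monotonic()
        input(f"[{label}] trial {n}: press Enter when the picture is stable... ")
        elapsed = time.monotonic() - start
        results.append(elapsed)
        print(f"  sync time: {elapsed:.1f} s")
    print(f"[{label}] average over {trials} trials: {mean(results):.1f} s")
    return results

# Run the same procedure against each unit on the bench, for example:
# time_resync("Extender A")
# time_resync("Extender B")
```

It's crude, and human reaction time adds a few tenths of a second of slop, but run the same way against every unit on the bench it's enough to separate a two-second re-sync from a ten-second one.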

Are there manufacturers with product lines and ecosystems better suited for one application or another? Yes. Are there some which offer better reliability and better customer service? Also yes. I'm not quite ready to say that digital switching and transport is a pure commodity in which any device is equivalent to any other. What I AM saying is that they're close enough that, absent a great deal of time and equipment, it's quite challenging to make meaningful performance comparisons. 

Which brings me back to the beginning, in which I owe somebody an apology.

Some demo gear. I love demo gear!
A few months ago, in my post about my visit to Extron, I commented that their XTP switcher changed sources slowly, especially when switching between unprotected and HDCP-protected content. This is true; it was unacceptably slow, and much more so than other, similar products. So when one of my colleagues (SMW senior consultant Joe Gaffney) received some Extron demo gear and saw that it switched very slowly, I found myself unsurprised. Fortunately, this was a test in the comfort of our office, and Mr. Gaffney is quite diligent about getting things right. After watching the indicator lights on the front of the unit and making several calls to Extron, he determined that there's a setting to drop the HDCP handshake whenever a non-HDCP source is selected. With that setting turned on, the unit has to initiate a fresh HDCP handshake every time a protected source is selected again; hence the long wait time. Turning it off made the unit behave much more reasonably.
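To see why that one setting matters so much, here's a toy model in Python; the timing constants are invented, and this is emphatically not a description of Extron's actual firmware, just an illustration of the cost of re-authenticating HDCP on every switch back to protected content:

```python
# Toy model of source-switching delay. The numbers are assumptions, not
# measurements of any real product.
HDCP_AUTH_SECONDS = 2.5      # assumed cost of a full HDCP authentication
PLAIN_SWITCH_SECONDS = 0.3   # assumed cost of a switch with no re-auth

def switch_delay(next_is_protected, output_kept_encrypted):
    """Modeled delay for one source switch."""
    if next_is_protected and not output_kept_encrypted:
        # The output dropped HDCP while showing the unprotected source, so a
        # fresh handshake is needed before protected content can pass again.
        return PLAIN_SWITCH_SECONDS + HDCP_AUTH_SECONDS
    return PLAIN_SWITCH_SECONDS

# Alternate between a laptop (unprotected) and a Blu-ray player (protected).
sequence = [False, True, False, True, False, True]
for keep_encrypted in (False, True):
    total = sum(switch_delay(p, keep_encrypted) for p in sequence)
    mode = "keep output encrypted" if keep_encrypted else "drop HDCP when unprotected"
    print(f"{mode}: {total:.1f} s of switching delay over {len(sequence)} switches")
```

Even with made-up numbers the difference is stark: a few extra seconds on every other switch is exactly the kind of sluggishness that sinks a demo.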

Is this what happened with the XTP demo at Extron's demo facility? Without an actual XTP matrix, I can't say for certain, but I must admit that it's a possibility. While a manufacturer should always be sure their demo is configured to show the product in its best light, we need to remember not to take first impressions at face value.

The moral of the story? Sometimes we all get things wrong. Evaluating products is hard. It's OK to make judgments, but make them carefully and be open to the possibility of revisiting them.

Those morals are a bit more universal than the world of AV, aren't they? Perhaps therein lies another lesson.

2 comments:

  1. Leonard,

    I wanted to issue a response here with slightly less snark than the formal rAVe post :)

    My point about the secondary tests for power sequencing, sync time, etc., is that I really tried to just do a pass/fail test, to see whether people were really passing 1080p at 300 ft on Cat5 UTP outside the walls of their controlled labs.

    I knew everybody's extenders work in certain scenarios, but the problem is that the results aren't consistent. It's not the manufacturers' fault, as they would never have the time or money to cross-check every source and sink device with every HDMI cable out there, but the point is that they shouldn't have to. HDMI should have that locked down.

    Integrators should never have to go through what I saw in my test, nor should they have to worry about testing power sequences and sync times, which is why I ignored those as well.

    All that is a time and labor pit. It will net you a whole bunch of data that is, quite frankly, "Measurable but Meaningless" when the fact is that video just doesn't pass through that combination of gear. The client just wants it to work; they really don't care why it doesn't, nor does it make them comfortable when we scratch our collective heads at $75 per hour.

    Head-to-head comparisons are meaningless right now when considering switch times, delays, etc. If everything worked as it should, those might be more important, but it's like worrying about the door dings on a car that has a blown engine.

    Function should precede elegance, even though I agree that both would be desirable.

    I have had a video extender manufacturer in my shop testing their own device that failed in the field. They tested it for a full day, along with a second identical unit, finally concluding that "we've never seen it do this before." The net result was ditching their HDMI-over-single-Cat5 version and substituting their HDMI-over-two-Cat5 version, which worked fine. We had a day of bench testing in the office and another couple of hours replacing the field unit (luckily we always pulled extra Cat5 just in case, so we didn't lose money there too).

    HDMI should require its manufacturers to print the "eye mask" signature on the box. If a unit fails in the field, the integrator should replace it and send the failed one to HDMI for testing. If their test shows a different result from the advertised eye mask, the manufacturer should be fined or sanctioned somehow.

    At least this way an integrator/consultant could choose products that balance price and performance signatures, and could go into their jobs with their "eyes open". If they were burned repeatedly by products with marginal eye masks, then they would eventually change their buying habits to get something else based on actual test results as opposed to marketing bravado and excessive feature lists.

    Just my take!

    Mark C

  2. I wholly understand your point. Mine (at least part of mine) is that, absent fairly well-controlled lab conditions, even "it works"/"it doesn't work" doesn't always tell the whole story. If you want to argue that publicizing this kind of result speaks to larger industry-wide concerns about the overall TMDS-encoded video ecosystem, I have a hard time disagreeing. If you're looking to answer the typical "shootout" question of how products compare, then you don't seem to be quite hitting the mark.

    That's where I find some secondary tests - sync time, hot plug, etc. - to be valuable. They represent some of the small ways in which a very high-quality product differentiates itself from a more pedestrian or even poor one. Some of these don't show up on the typical spec sheet (along with more subjective factors such as the look and feel of configuration interfaces), and they are, at least to me, well worth reading about in product evaluations.

    As always, thanks for sharing your thoughts.

    --LCzS
