This is a very quick follow-up to my previous post documenting my experience with HD DVD and Blu-ray playback on Windows Vista. (If you haven't been following this story, you'll want to read that post and its predecessor, Blu-ray, HD DVD, and Vista, to get the proper background.) DRM doom-and-gloomers have tried their best to scare you into thinking that you'll need to scrap your older monitors, video cards, and even HDTVs to play back HD content. They're wrong, as I was able to demonstrate with a two-buck VGA cable.
In the Talkback section, several commenters expressed skepticism over my contention that Windows Vista's DRM didn't come into play at all. Here's one typical comment:
You said that Vista's DRM was not used. If PowerDVD supplied the complete end to end protected pipe, then why did MS add it into the OS? I think you mistake the PowerDVD app displaying the HDCP non compliant warning that Vista supplied to the application as not Vista DRM? (i.e. Vista's monitoring reported to PowerDVD the problem, and PowerDVD displayed the information)
Another commenter thinks my measurements of CPU usage (Blu-ray discs required only 9% CPU on average) were out of line:
The statement regarding CPU usage is complete fancy. HD playback beats the crap out of your CPU. On an AMD Opteron 180/8600GT on an Abit mobo it pegs both cores at 90% on Vista using the XBox 360 drive and on a AMD 6600+/8800GT on an Asus board it hit's 50% across both cores.
Well, there's a very easy way to put both assertions to the test. I pulled the HD DVD/Blu-ray drive out of the system I had been using and plugged it into an older, slower system running Windows XP Media Center Edition 2005. I installed the same copy of PowerDVD Ultra. Neither the monitor nor the video card was HDCP-compliant.
When I tried to play either of the HD discs using a digital (DVI) connection, I was greeted with the exact same HDCP error message I showed in the previous post. The player software detected the HDCP status of the display chain and refused playback, exactly as it did on Windows Vista, even though no Vista DRM code was present on this system. That proves to my satisfaction that this restriction is enforced by PowerDVD, not by anything unique to Windows Vista.
Ah, but that error message says I should try plugging in an analog connection. So I powered down the system and connected the same monitor using a VGA (D-Sub) connector instead. When I started the system back up and tried to play the same HD disc, everything worked just fine. As promised, PowerDVD Ultra pays no attention to HDCP over analog connections.
Now, the monitor I used for these tests is an old 18-inch LCD with a native resolution of 1280 x 1024. As a result, it displayed the HD content in letterbox format, at 1280 x 720 (720p) resolution. Obviously, the results couldn't compare with the output of a 50-inch living room display, but the picture was rich and detailed and it looked great from a reasonable viewing distance. If I had connected it to a larger LCD monitor with a 1920 x 1200 resolution, there's no reason why I shouldn't have gotten full 1080p output.
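The letterbox arithmetic is straightforward: a 16:9 source scaled to fill a 1280-pixel-wide panel occupies 1280 x 720 pixels, leaving black bars above and below on a 1280 x 1024 screen. Here's a quick sketch of that calculation (the function name is mine, just for illustration):

```python
def letterbox(native_w, native_h, aspect_w=16, aspect_h=9):
    """Compute the displayed height of a widescreen source on a panel,
    plus the height of each black bar above and below the image."""
    scaled_h = native_w * aspect_h // aspect_w  # height the source occupies
    bar = (native_h - scaled_h) // 2            # each black letterbox bar
    return scaled_h, bar

# The 1280 x 1024 monitor from these tests:
print(letterbox(1280, 1024))  # -> (720, 152): a 720p image with 152-pixel bars
```

Run the same math on a 1920 x 1200 panel and you get a 1080-pixel-tall image with slim 60-pixel bars, which is why a monitor like that can show the full 1080p frame.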
To measure CPU usage, I ran Performance Monitor as a background task while I played a Blu-ray and HD DVD disc in the foreground. For a video adapter, I used a spare Nvidia 7600GS board I had lying around (similar adapters sell for $80 or so new). That's nowhere near as capable as the 8600 GT I used earlier. The CPU in this system is an AMD Athlon 64 X2 3800+ (2.0 GHz). It's considerably less powerful (and less expensive) than the Intel Core 2 Duo E6600 (2.4 GHz) on the XPS 410 I used for the earlier tests. These benchmarks at Anandtech peg the difference at 30-40%, and that feels about right to me. So how did this lesser system do?
- On the Blu-ray disc, CPU usage was consistently in the 35-36% range. That's considerably more than the 9% I measured using the other, more powerful PC, but it still leaves plenty of room to do other tasks in the background without disturbing playback.
- On the HD DVD disc, CPU usage was in the 50-52% range, compared with approximately 24% for the same disc on the more muscular Core 2 Duo-based system. That still isn't even close to overtaxing the system, though. (And I certainly wouldn't recommend this older system as the centerpiece of a high-definition Media Center.)
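The figures above come from watching Performance Monitor's % Processor Time counter over the playback session. Reducing a counter log to an average and a range is simple enough to sketch; the sample values below are illustrative only, not the actual Performance Monitor log:

```python
def summarize(samples):
    """Reduce a series of % Processor Time samples to the figures
    quoted above: the average, plus the low and high of the range."""
    avg = sum(samples) / len(samples)
    return avg, min(samples), max(samples)

# Illustrative samples only -- not real measurements:
blu_ray = [35, 36, 35, 36, 35]
avg, lo, hi = summarize(blu_ray)
print(f"avg {avg:.1f}%, range {lo}-{hi}%")  # -> avg 35.4%, range 35-36%
```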
I wouldn't dream of trying to do HD playback with an underpowered video card. The latest generation of GPUs from ATI and Nvidia (even those found in relatively inexpensive cards) do an excellent job of offloading decompression from the CPU.
Analog playback has its own set of complications. If you use composite or S-video connectors, you get only SD output, regardless of the source media. A component connection works just fine up to 1080i (sorry, no 1080p), but very few video cards offer component connections, and adapters cost as much as a new video card. A VGA connection like the one I used here is your best bet. Just about every LCD monitor has this type of connection, although they're not as common on HDTV equipment. And, of course, the entertainment industry has the option to disable or constrain analog output at any time, although that's unlikely to happen for at least another three years, and maybe considerably longer. In hardware terms, that's a long, long time.
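The connection-by-connection limits discussed in this post can be summed up as a simple lookup. This is only a sketch of the situation as described here (and it assumes the studios haven't yet constrained analog output):

```python
# Maximum HD output per connection type, per the discussion above.
# An HDCP-compliant digital link is the only path to protected 1080p.
MAX_OUTPUT = {
    "composite": "SD only",
    "s-video":   "SD only",
    "component": "1080i (no 1080p)",
    "vga":       "up to the monitor's native resolution (no HDCP check)",
    "dvi/hdmi":  "1080p, but only with HDCP at both ends",
}

def max_output(connection):
    """Look up the playback ceiling for a given connection type."""
    return MAX_OUTPUT.get(connection.lower(), "unknown connection type")
```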
Coming up next: Is it worth it?