Friday, March 09, 2007

Tech stuff

This is interesting. Bob Cringely (who ought to know):
I have heard that Apple plans to add hardware video decoding to ALL of its new computers beginning fairly soon, certainly this year.

Why Apple would do this is fairly clear to me, but first let's clarify what I mean by hardware video decoding, because it isn't necessarily the MPEG-2 format used in present-day DVDs. I'm not saying Apple's video-decoder chip won't also decode MPEG-2 (it may or may not -- I simply don't know), but the chip's primary codec is H.264, which is at the heart of both Apple's QuickTime software and its iTunes video downloading service.

WHY Apple would add H.264 video-decoding hardware to its entire line of PCs comes down to supporting iTunes and any similar video distribution efforts Apple may spring on us. By going with a chip, Apple ensures the same base performance level from every machine it sells, from the lowliest Mac Mini right up to the mightiest four-core Mac Pro. Up until now it took a multi-core machine with a lot of memory to support real 1080p (HDTV) decoding, but soon you'll be able to do that easily on a Mac Mini while leaving the main CPU to handle other chores like networking, running the graphical user interface, or perhaps integrating in real time a variety of video ad streams.

Apple's new policy, if true, will turn on its head the whole notion of forcing users upmarket if they want better video support. THE POLICY WILL COST APPLE MONEY, not just for the video chip, but also for the lost sales of higher performance machines.
It is just a rumour at the moment, but it's one I believe. It would fit into Apple's long-term behaviour: give individuals the creative tools to manipulate, edit, and create whatever media available technology makes feasible. This goes back to Apple's long-term advantage in desktop publishing, and continued with the iPod and the "Rip, Mix, and Burn" campaign. I think it's clear that Apple is doing it again with video, and so does Cringely:
So what's in it for Apple? Potentially a lot, because the chip Apple has chosen doesn't cost $7, it costs more like $50, and it doesn't just do hardware H.264 decoding, it does hardware H.264 ENCODING, too.

This will change everything. Soon even the lowliest Mac will be able to effortlessly record in background one or more video signals while the user runs TurboTax on the screen. Macs will become superb DVR machines with TiVo-like functionality yet smaller file sizes than any TiVo box could ever produce. In a YouTube world, the new Macs will be a boon to user-produced video, which will, in turn, promote the H.264 standard. By being able to encode in real time, the new Macs will have that American Idol clip up and running faster than could be done on almost any other machine. Add in Slingbox-like capability to throw your home cable signal around the world and it gets even better. Add faster video performance to the already best-of-league iChat audio/video chat client, and every new Mac becomes a webcam or a video phone.
Apple has earned a reputation as a disruptive company. And they do love that reputation, they love it sooooo much. But this really is an interesting jump -- and it will be fascinating to see how the PC companies react. What will Dell and HP start packing on their systems? Will they follow Apple's lead, or go with another standard?

Fascinating time to be paying attention to the computer industry.


Gar said...

I wonder - with all that encoding and decoding ability built in, might this make DRM management a bit easier? One reason DRM fails in Windows Vista is that it uses general-purpose hardware for the purpose. If a machine already has a chip dedicated to encoding and decoding, might a clever programmer add the ability to use it for DRM? Security is not my speciality, but I've always been skeptical that technical obstacles will protect us from DRM forever.

john said...

It's possible, but if my understanding of these matters is right (no guarantees on that count), the encoding/decoding chip wouldn't have the DRM built in; the DRM would have to be software.

The problem with any hardware DRM is evident in the recent failures of AACS.

I am, however, most certainly not an expert on these matters.