Tag Archives: apple

Google WebM: Who will think of the users?

A quote:

bq. That is all well and good for Google, but what does that mean for me, the guy who just wants to lay on his sofa and watch cute kittens? At this point, pretty much nothing.

This is a short excerpt from an otherwise “well balanced article”:http://blog.andrewhubbs.com/?p=87 explaining the players, roles, and technologies involved in Google’s decision to remove H.264 support from their HTML5 video tag implementation in Chrome. The sentiment expressed is that it doesn’t matter much to us mere mortals. That couldn’t be further from the truth. If Google is successful in pushing WebM as the standard means of encoding video on the web, it will render millions of devices obsolete, impacting the millions of consumers who own those devices. How?

In many articles on this topic, you’ll find passing mention of something called “hardware decoders”. Since my goal is to explain what Google’s actions mean to the Average Joe, I’m going to go through the trouble of backing up a bit and explaining a few things about video, and how it is played back on various devices.

All this talk about video codecs, what does it mean? A codec (short for coder/decoder) can be thought of as a process definition. Say you had a letter that you wanted to send to a friend, but the post office charged based on the length of the letter. You have two choices: you either shorten your message, or you find a method to reduce the number of characters required to communicate the same information. Expressing the same information using fewer characters is what computer scientists call compression. In addition to compressing the message to save on costs, you’d want to make sure the letter was written in a language the recipient understands. And what about his ability to open it? You’d want the envelope to open easily so the recipient can get at what’s inside. That sounds like a silly requirement, but it’s relevant when you look at the details. You could think of all these details together as a “codec” for writing and delivering a letter.

I’ve lumped codec together with file format here, which is technically incorrect, but trivial for understanding this issue from an end-user perspective.
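To make the compression half of the analogy concrete, here’s a toy run-length-encoding sketch in Python. It’s purely a hypothetical illustration (real video codecs are vastly more sophisticated), but it shows the core idea: the sender and recipient agree on a process for shrinking and restoring the same message.

```python
def rle_encode(text):
    """Collapse runs of repeated characters into (char, count) pairs."""
    encoded = []
    i = 0
    while i < len(text):
        j = i
        while j < len(text) and text[j] == text[i]:
            j += 1
        encoded.append((text[i], j - i))
        i = j
    return encoded

def rle_decode(pairs):
    """Expand each (char, count) pair back into the original text."""
    return "".join(char * count for char, count in pairs)

message = "aaaaabbbbbbcccc"           # 15 characters
packed = rle_encode(message)          # only 3 (char, count) pairs
assert rle_decode(packed) == message  # both sides agree on the "codec"
```

The encoded form carries the same information in less space, and the decoder recovers the original exactly, just like the letter-writing scheme above.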

So how does this relate to web video? All of the seemingly inane details above are the type of things computer scientists think about when they design a video file format. Interestingly, the codec is just one aspect of a video format. I won’t go into the others, but it’s worth understanding that the problem is very complex and covers many different areas of knowledge. For the moment, let’s look at the compression part.

Inside your computer is a very, very powerful microprocessor called a CPU. Your CPU is capable of computing solutions to a very wide variety of problems. Because of this, we call it a general purpose microprocessor. It is possible, however, to build a kind of CPU that is optimized to perform a very specific task. In the various articles written about Google’s WebM decision, you’ll find mention of an “H.264 hardware decoder”. What does that mean?

H.264 hardware decoder: a specialized microprocessor that is purpose-built to decode the target codec.

Examples of H.264 hardware decoders:

* The video card in your computer probably has one
* The iPhone has one
* The iPod has one
* Most Android phones have one
* If your TV can play video from an SD card or computer, it has one
* If your digital camera shoots video, it probably has one
* Your digital camcorder probably records in AVCHD (incorporates H.264)
* Virtually every video production suite on the market can utilize an H.264 hardware encoder-decoder

So what does an H.264 hardware decoder do for you? In short, it lets you watch high-resolution video using far less battery than decoding on your device’s CPU would. Sitting at your desk, you’d think this wouldn’t matter, but playing back a 1080p video encoded with H.264 can push even modern processors to 80%-90% utilization. That means the loud fan in your computer is going to spin up and make noise while you’re trying to watch your movie. On laptops, the consequence is more severe: you can lose hours of battery life by not using the H.264 hardware decoder. On mobile devices, it’s game over. Your phone doesn’t have a powerful dual-core CPU. It has a tiny mobile CPU that simply doesn’t have the horsepower to decode high-resolution video on the fly. You’ll be stuck with lower-resolution, less-compressed (and therefore larger) video that requires less computing power to decode.
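A back-of-the-envelope calculation makes the battery cost concrete. Every wattage figure below is an assumption chosen for illustration, not a measurement from any real device:

```python
# Illustrative assumptions only; real figures vary widely by device.
battery_wh = 50.0        # assumed laptop battery capacity, in watt-hours

hw_decode_watts = 8.0    # assumed system draw with the dedicated H.264 decoder
sw_decode_watts = 25.0   # assumed system draw with the CPU pegged at 80-90%

# Hours of playback = capacity (watt-hours) / average draw (watts)
hours_hw = battery_wh / hw_decode_watts
hours_sw = battery_wh / sw_decode_watts

print(f"Hardware decode: {hours_hw:.1f} hours of playback")
print(f"Software decode: {hours_sw:.1f} hours of playback")
```

Under these assumed numbers, the dedicated decoder roughly triples playback time on the same battery, which is the "hours of battery life" gap described above.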

Feel that pit developing in your stomach? Yeah, I’m right there with you.

Let’s look at some numbers:

* 50 million iPhones [1]
* 450,000 iPads [1]
* 220 million iPods (as of Sept 2009) [2]
* 8.5 million Android phones (as of Feb 2010) [3]
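The tally behind those figures is simple arithmetic, using exactly the numbers listed above:

```python
# Device counts as cited above, in millions of units
devices_millions = {
    "iPhones": 50.0,
    "iPads": 0.45,
    "iPods": 220.0,
    "Android phones": 8.5,
}

total = sum(devices_millions.values())
print(f"{total:.2f} million devices")  # close to 280 million
```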

That’s close to 280 million devices with H.264 hardware support, and I haven’t scratched the surface. There are no televisions on that list. Remember CES and all the hype over Android tablets? None of them have WebM hardware decoders. On every one of these devices, the cost of WebM video playback will be:

* Greatly reduced battery life
* Larger file-sizes (less compression will be required for smooth playback)
* Lower resolution

We’re talking about falling back from every major milestone met by mobile device manufacturers in the last three years, and millions of devices rendered obsolete for video encoded in WebM. What happens if Google goes WebM-only for YouTube? Right now, Apple supports H.264 exclusively on their mobile devices. Why is that? Because Apple considers user experience its first priority. Even if Apple were to implement WebM on their mobile devices, the consequence would be jittery video playback that sucked your battery dry in no time. That’s not a good user experience.

So, what does this mean for the Average Joe? If Google is successful, it means that your user experience will be significantly degraded on any device you own that contains H.264 hardware but no WebM hardware. Have a look at the specs for your phone, portable media player, television, and home theater media devices. Any of them that rely on H.264 hardware are at risk of becoming obsolete.

1 – “TechCrunch”:http://techcrunch.com/2010/04/08/apple-has-sold-450000-ipads-50-million-iphones-to-date/
2 – “World of Apple”:http://news.worldofapple.com/category/world-of-apple-events/
3 – “Numberof.net”:http://www.numberof.net/number-of-android-phones-sold/

Larry Dignan is dead wrong: Apple and AMD

Is it really fair to even pick on ZDNet these days? Adrian Kingsley-Hughes is about the last writer they have on staff that I can read without wanting to fall out of my seat. Take this little gem from Larry Dignan’s Apple speculation piece:

bq. Add it up and AMD could provide the graphics capability Apple is looking for. As AppleInsider noted, AMD traditionally trails Intel on raw performance. However, Ghz is a secondary issue for Apple buyers. An Apple purchase is about design, quality, OS X and ease of use. AMD can get by on the Ghz equation with a mere close enough to Intel if the graphics stars line up. Sean Portnoy asks whether folks would buy an Apple with AMD inside. I’d argue that the processor is a secondary consideration (at best) for buying an Apple.

Gee thanks, Larry. Give me a second to grab my box of crayons so I can scribble down a reply to your sweeping generalization about Apple users. I mean, the desire to own a computer that is easy to use is obviously mutually exclusive with the desire for a computer that is fast and powerful, right?

Oh, wait…

When run against PC laptops, the MacBook Pro line (running Windows under Boot Camp) has, on several occasions, been “the fastest Windows laptop in its class”:http://www.google.com/search?q=macbook+fastest+windows+pc. There goes that argument.

Apple doesn’t refresh their lineup as frequently as many PC manufacturers refresh their consumer lines, so between refreshes, consumer-oriented PCs pull ahead with faster processors. However, when you move up to a business-class machine like Dell’s Latitude or Lenovo’s ThinkPad, there are a lot of similarities. Those configurations are tested more thoroughly, so they don’t change as often. The result is a more stable machine, but one that also costs more. Sounds familiar, doesn’t it?

As an Apple buyer, I’m all about performance. I’d be unhappy if Apple moved to AMD while AMD offered an inferior product, and today, that’s the case.

I’d be willing to bet that the reasons Apple was talking to AMD were twofold:

1) AMD has graphics-switching technology similar to what Apple just implemented on their own, so it may have been that AMD’s switchable-graphics technology (their answer to NVIDIA’s Optimus) was up for consideration, but was ultimately ruled out.

2) It is in Apple’s interest to keep Intel on their toes. You never sit down at the table with one vendor and one vendor only. That’s a great way to hand margins over to your supplier.

Return to form

Time Magazine has an excellent “gallery of vintage computers”:http://www.time.com/time/photogallery/0,29307,1670168_1461055,00.html from the book titled “Core Memory”. What’s striking about the photos is just how much “design” is there. I’ve always admired Apple for their dedication to making a computer that not only works well, but is pleasing to look at. A lot of hardcore geeks scoff at this notion, treating design as a superfluous luxury not worth paying for. I feel pity for the person who does not see any value in beauty. I absorb everything I see and hear, so I choose to surround myself with positivity and beauty. The benefits are worth a few extra dollars to me.

Apple invites Vimeo to party, snubs YouTube?

Apple has posted a web page containing a list of “iPad ready (really just HTML5 ready) websites”:http://www.apple.com/ipad/ready-for-ipad/ to their website. What’s interesting is that YouTube — a Google property — has been omitted from the list, despite the fact that they have an “HTML5 version”:http://www.youtube.com/html5 of the website available for public consumption. Oversight, or passive-aggressive behavior?