Networked Vs. Integrated DVR: The Debate
by Leslie Ellis // July 23 2001
Serendipitous surprises, like the unanticipated run on HBO’s subscription video on demand test in the Carolinas a few weeks back, sometimes spawn all sorts of interesting developments.
As the tale goes, 26,000 HBO customers in that southern Time Warner Cable system were given the opportunity to watch episodes of “The Sopranos” and “Sex and the City” on demand. In under two hours, around 3,000 people landed on HBO, saw the offer, and clicked. If my calculator is right, that’s an 11.5% response rate. That kind of response is beyond interesting.
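That calculator check is easy to replicate:

```python
# Back-of-the-envelope check on the response rate described above.
subscribers = 26_000  # HBO customers offered the trial
responders = 3_000    # viewers who saw the offer and clicked, in under two hours

rate = responders / subscribers
print(f"{rate:.1%}")  # → 11.5%
```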
What happened next is the very reason for technical trials: The equipment necessary to set up sessions and to stream the content had an anxiety attack. The headend controller, suddenly busier than it had ever been in its life, got overwhelmed, and started speaking in tongues: Way more people than anticipated had shown up for the party. The server, hearing mostly babble from the controller, threw up its hands and went off for a nap. Time Warner downshifted, knowing this was a “good problem to have,” and announced plans to re-stage the tests.
Whether you believe in subscription VOD, or in full, on-demand TV viewing, like what TiVo does, the effect on consumers is essentially the same: On-demand TV, instead of linearly delivered TV. The two approaches’ technical underpinnings differ, but they share one common characteristic: The need for sessions, and storage.
Consider this: Research data shows that within two weeks of buying a PVR, people shift about 60% of their TV viewing to on-demand. (It’s certainly true here in the geekosphere: Weather is the only live stuff we watch. Episodic TV is stored up for empty evenings.)
Planning for 60% peak, simultaneous usage is totally different from planning for 10% peak, simultaneous usage. The latter is the model technologists have been using to calculate the storage and session capacity they’ll need to deliver movies on demand.
Going to 60% raises an old, chewy technological debate that translates roughly to “where do we put the bucket?” Should the servers be in the headend, or in the set-top? To store or to stream becomes the question.
Cable, unlike other broadcast delivery platforms, could theoretically go either way. It has brute-force bandwidth: A 750 MHz cable system translates into some 4.3 Gbps of raw, downstream (headend to home) capacity, if all analog channels were digitized and compressed before delivery.
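One way to land near that 4.3 Gbps figure is a quick back-of-the-envelope calculation. The channel count and per-channel payload below are my assumptions, not the column’s: roughly 112 usable 6 MHz channels after guard bands, each carrying 256-QAM at about 38.8 Mbps of payload.

```python
# Rough downstream capacity of an all-digital 750 MHz cable plant.
# Assumptions (mine): ~112 usable 6 MHz channels after guard bands,
# each carrying 256-QAM at ~38.8 Mbps payload (ITU-T J.83 Annex B).
usable_channels = 112
mbps_per_channel = 38.8

total_gbps = usable_channels * mbps_per_channel / 1000
print(f"{total_gbps:.1f} Gbps")  # → 4.3 Gbps
```

The exact total moves with the channel count and modulation assumed, which is why published figures for a 750 MHz plant hover in the 4-to-5 Gbps neighborhood.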
Cable also uses digital set-top boxes: 12 million of them, as of June. The installed base doesn’t have the muscle to do PVR just yet – TiVo wasn’t yet a twinkle in digital’s fertile eye, back when cable’s digital boxes were invented. Still, the installed base of digital cable set-tops does generally include the right connector on the back to attach a sidecar PVR.
What to do? The industry suddenly finds itself smack-dab in the middle of a torturously complex, decades-old, and ongoing economic analysis of both methods.
On one side is the hard disk drive group: The PVR people, like TiVo, and the set-top suppliers themselves. Build a drive into future digital set-tops, they say. Or, strap it on as a sidecar. A set-top and a TiVo, integrated or strapped together. (Motivation: Sell more boxes and hard drives.)
On the other side are the networked PVR people – the nCUBEs and Concurrents of the VOD server world. Store everything remotely, they say, and deliver it as needed: You’ve got the bandwidth to do it, and in a much more attractive way than DBS. (Motivation: Sell more servers.) DirecTV’s “Ultimate TV” offer, with integrated PVR, they continue, boasts two tuners. Networked PVR translates into unlimited tuners, they say. Americans like more. Give them more than DBS can.
But what’s often more interesting is what the two camps say about each other. Networked PVR proponents assert that local storage is cost overkill: Spending $200 more to plunk a hard drive into a box that already costs around $230 is illogical. Moving the storage back into the network means more people can share more storage, thus spreading cost.
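The networked camp’s cost argument can be made concrete with a toy comparison. Only the $200 drive figure comes from the column; the node size echoes the example later in the piece, and the server cost-per-stream is a hypothetical placeholder, not a real price:

```python
# Illustrative per-subscriber storage cost comparison (assumed numbers).
local_drive_cost = 200.0           # per set-top; every subscriber pays it ($, from the column)

# Networked: headend storage is sized for peak concurrent viewers,
# not for every subscriber, so its cost is shared.
peak_fraction = 0.60               # PVR-style peak simultaneous usage
cost_per_stream = 250.0            # hypothetical headend cost per stream ($)

networked_per_sub = peak_fraction * cost_per_stream
print(f"local: ${local_drive_cost:.0f}, networked: ${networked_per_sub:.0f} per subscriber")
```

Under these invented numbers the shared model wins; with a cheaper drive or a pricier server, the comparison flips, which is exactly why the two camps keep arguing.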
Plus, say the nCUBEs and Concurrents, putting a device as inherently failure-prone as the hard disk drive inside the home introduces ugly service problems. (I wager a dime that you’ve either experienced a PC hard disk crash, or know someone who has.) Who shoulders the venomous blame when the PVR’s hard drive fails, obliterating an entire family’s desired TV programs: The box manufacturer, or the cable operator who installed it? By striping the TV content such that it splays across several remote servers, like how VOD works, nothing is ever lost, say the networked PVR folks.
The PVR camp counters: One, installing that many servers, remotely, is a logistical nightmare. It involves setting up real-time, bi-directional sessions between a digital box and a remote server – a new thing for cable, and not always easy. Two, latency matters: Without a real-time upstream path from the set-top to the servers, features like rewind or pause could be drearily slow. Three, the PVR camp says, this isn’t like modeling for time-shifted movies: Peak simultaneous usage runs far higher, so storage and bandwidth needs will be alarmingly higher, too.
This conundrum puts the poor souls who do bandwidth and traffic modeling at center stage. If the addition of PVR services means that operators should anticipate 60% peak, simultaneous loading – six in 10 customers needing manipulable TV streams, as opposed to one in 10 for movies – the traffic model suddenly indicates a need for way more storage, sessions and bandwidth.
Take, as a practical example, the 500-home node, where 80% of the attached homes take basic cable services (400 homes) and half of those are anticipated to take digital service (200 homes). Traditional VOD server modeling, for movies, would indicate a need for 20 simultaneous streams (10% peak loading x 200 homes) to that node. About 10 digital channels fit into a 6 MHz cable channel, so that means clearing off two 6 MHz cable channels for VOD.
Applying the same logic to networked PVR, but with 60% instead of 10% simultaneous usage, the stream count swells to 120 simultaneous streams, or a dozen 6 MHz cable channels. Tie in the fact that it’s much more processor-intensive for a VOD server to ingest video than to stream it. So at the get-go, you need servers dedicated to ingestion – which will become a primary competitive distinction among VOD server vendors.
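The node arithmetic in the two paragraphs above can be parameterized, which makes it easy to see how sensitive the channel count is to the peak-usage assumption:

```python
# Node traffic model from the example above: streams and 6 MHz
# channels needed, given digital homes and peak simultaneous usage.
def channels_needed(digital_homes, peak_fraction, streams_per_6mhz=10):
    streams = int(digital_homes * peak_fraction)
    channels = -(-streams // streams_per_6mhz)  # round up to whole channels
    return streams, channels

print(channels_needed(200, 0.10))  # → (20, 2)   movie-style VOD
print(channels_needed(200, 0.60))  # → (120, 12) networked PVR
```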
There is no tidy answer to the question of storage v. bandwidth. Both camps make plausible points, which makes a plan difficult to discern.
Ultimately, which PVR method is best for cable? Whichever one beats DBS (with its considerable head start) to the finish line. That means avoiding analysis paralysis, no matter how tempting. Store v. stream is not a new debate, and could be successfully argued until the cows come home.
For cable to keep the cows coming home to its own barn, though, it might be useful to just pick one or the other, or both, and try it.
This column originally appeared in the Broadband Week section of Multichannel News.
Inside Time Warner’s Interactive Services Architecture
by Leslie Ellis // July 09 2001
So far, we’ve tipped cable’s installed base of 10 million digital boxes upside-down and sideways, to better understand what they can and can’t do. We’ve peered into signal paths, tallied the trade-offs of under-the-hood resources, and mused about how to cram more, preferably revenue-bearing, interactive TV services into today’s digital boxes.
And still there’s more: Like collecting the revenues that spurt from interactive services. Perhaps trickiest of all are the automated processes to fold new, interactive, revenue-bearing services into the monthly cable bill. Cable people call this “the back office.” The rest of the world calls it “enterprise computing.” It’s the same thing: What has to happen behind the scenes for customers to turn on, tune in, or drop out.
Right now, there are three revenue spigots for cable video: Basic cable, pay-per-view, and digital cable. Upgrading existing billing systems to add the collection bucket for pay-per-view is generally remembered as agonizing. Ditto for adding the collection plate under digital video subscriptions.
With interactive applications, it gets even worse. There are theoretically as many new vendor “partners” as there are new interactive applications. Each participant in the chain needs to mesh with the billing system, and with any other related plumbing, like video servers. This is always easier said than done.
In today’s world, adding a new interactive service to a cable bill is about as disruptive as adding a new line of baked goods to a grocery store – but only if the store retools the cash register, re-builds all the shelves, and moves all the display cases.
Nobody knows this better than Time Warner Cable, which first collided with the snarly issue of multiple ITV apps and how to bill for them more than a half-decade ago, in its Full Service Network.
As a direct result, Time Warner’s video engineers are well down the trail of a plan aimed squarely at resolving ITV’s back office issues. They call it “Interactive Services Architecture,” or ISA. Almost everyone within Time Warner’s video supplier mix, from headend to server to billing vendors, knows ISA well: It’s the elephant in the negotiating room of most major discussions. Although not formally expressed by Time Warner, the perceived sentiment is one of “Be ISA-compliant, or no go.”
Here’s what ISA is not: It is not a specific product, nor is it back office middleware. It is not a new billing system. It is not the commoditization of video servers, nor an attempt to create an industry standard. ISA is a way by which Time Warner Cable, its existing video and billing partners, and its future ITV and apps suppliers can modify their business software to make it faster and easier for everyone involved to launch new services.
If you put ISA in your hand, it would look like a 92-page specification – a lot like the “Pegasus specification” Time Warner’s engineers wrote in the mid-’90s (which became Scientific-Atlanta’s Explorer 2000 system). It details necessary business interfaces between servers. (Set-tops are not included in the spec.)
Translation: ISA defines the language all servers should speak, when they speak to one another. Functionally, ISA decouples new service provisioning from existing billing systems, so that sign-up and fulfillment of new ITV services can be done by the cable provider, not its billing vendor.
ISA components are explicitly lumped into two categories: Services and infrastructure. The intent is to distinguish ITV products from the stuff used to provision, market and deliver those products – because it shouldn’t be necessary, each time a new service enters the cable video mix, to change software used by video servers, data carousels, marketing campaign management, customer service and user interfaces.
In fact, the ultimate goal of ISA matches a deeply held cable marketing fantasy: To make a way for marketers to let customers try new cable products on impulse, just as they impulsively toss a tin of Altoids into the basket at the grocery store checkout. With something like ISA, and the commonality it creates between heterogeneous types of ITV servers, cable providers can offer whim-oriented wares: Sample HBO for the weekend. Try out VOD. Order an upgraded digital tier. Push a button, try the product.
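The services/infrastructure split described above can be sketched in miniature. This is a toy illustration of the decoupling idea, not the ISA spec itself (ISA defines its interfaces in CORBA terms, between servers): the infrastructure side codes against one common interface, so adding a new impulse-buy service means adding a service, not rewriting the billing plumbing. All class and function names here are hypothetical.

```python
# Toy sketch of an ISA-style services/infrastructure split.
from abc import ABC, abstractmethod

class ProvisionableService(ABC):
    """The common interface infrastructure expects every service to expose."""
    @abstractmethod
    def name(self) -> str: ...
    @abstractmethod
    def provision(self, account_id: str) -> str: ...

class WeekendHBO(ProvisionableService):  # hypothetical impulse-buy service
    def name(self) -> str:
        return "HBO weekend sample"
    def provision(self, account_id: str) -> str:
        return f"{self.name()} enabled for {account_id}"

class VODTrial(ProvisionableService):    # another hypothetical service
    def name(self) -> str:
        return "VOD trial"
    def provision(self, account_id: str) -> str:
        return f"{self.name()} enabled for {account_id}"

# Infrastructure side: unchanged no matter how many services are added.
def impulse_order(service: ProvisionableService, account_id: str) -> str:
    return service.provision(account_id)

print(impulse_order(WeekendHBO(), "acct-001"))
```

The point of the sketch is the shape, not the code: new services slot in behind a stable interface, which is what lets the push-a-button, try-the-product fantasy work without re-plumbing the back office each time.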
At the core of ISA is an invention of computer science called “CORBA,” which stands for “Common Object Request Broker Architecture.” (A web search also produced a “Concerned Off-Road Bicyclists Association,” aimed at protecting mountain bikers in Santa Monica, but the two don’t appear to be related.)
CORBA is a cooperative effort, led by a computer industry consortium known as Object Management Group, to standardize the many juncture points in a computer network. It is widely used by a swath of familiar names: The Weather Channel’s meteorology department, Boeing, Nike, CNN Interactive, American Airlines, Netscape, Microsoft, Sun, and on and on.
CORBA, for cable (through ISA or otherwise), represents a common way for cable operators and their suppliers to link up to one another. ISA is one way to do it. Other cable providers may find their own CORBA-like answers. The important thing is that it starts to happen – because it’d be a shame to launch a lucrative new interactive service without a way to realize its revenues.
This column originally appeared in the Broadband Week section of Multichannel News.