DASE & OCAP: How Different Are They?
by Leslie Ellis // October 28, 2002
A freshly adopted digital TV standard, devised for terrestrial broadcasts of interactive material, is advancing on the periphery of cable’s digital TV action.
It’s called “DASE,” for “DTV Application Software Environment.” Spoken, “DASE” sounds like “days” — or “daze,” as the quip goes. It hails from the Advanced Television Systems Committee, which sets broadcast standards.
DASE matters because it is similar in intent to, but different in detail from, cable’s specification for the software needed to run interactive applications. Cable’s version is called “OCAP,” for “OpenCable Applications Platform.”
The people who worry about things like DASE express it this way: What if a broadcaster were to pass a DASE-based interactive application through to an OCAP-based set-top – or to a future TV with OCAP built-in – and OCAP couldn’t recognize it?
Such an incompatibility is particularly worrisome when imagined as a mandated part of any must-carry rules that may be in the works. (This hasn’t happened, nor has anything occurred that portends that outcome. But worriers worry for a reason: to anticipate the worst, so they’re prepared if it happens.)
How different are DASE and OCAP? Considerably, but not insurmountably, experts on the subject say. For starters, broadcast ITV is inherently one-way. A clickable thing slips into a show and is sent out over the air – or funneled into the cable distribution network. At the house, DASE software inside the TV figures out what to do with it. End of story, at least in the first version of the standard.
OCAP, by comparison, is inherently two-way, because cable’s distribution network is (in most cases) two-way. Manipulating an on-demand program means passing the clicks intended to pause, fast forward or rewind that show, up the network, to the server that holds it.
Because of that, OCAP was designed to support a two-way environment, which is managed by a “monitor application” (translated in the Sept. 2, 2002 edition).
There are also structural differences between the two endeavors, which requires some background. OCAP is heavily reliant on the work of the Europeans, who organized themselves into a group called “DVB,” for “Digital Video Broadcasting,” and who successfully promulgated a software standard called “MHP,” for “Multimedia Home Platform” (translated in the Nov. 19, 2001 edition of Multichannel News). MHP, in turn, is largely based on Sun Microsystems’ “Java” software.
DASE also happens to rely on Sun’s software, but not always in the same way as OCAP or MHP. Summary: OCAP and MHP (read: the U.S. and Europe) are largely aligned, technologically; DASE, as yet, is not.
Many of the structural differences between DASE and OCAP are in the application programming interfaces, or “APIs” – the software toolkits used by computer programmers to write the code that becomes the clickable, interactive thing. There are roughly 10 APIs that differ between DASE and MHP, and thus between DASE and OCAP.
The details of the differing APIs are, predictably, intricate. In some cases, MHP includes APIs that DASE doesn’t, but perhaps should. The API named “org.davic.mpeg,” for example, exists within the MHP standard, but not yet within DASE. It describes how to pull packets from an MPEG-2 video stream for use within teletext-style applications.
Some say that standards don’t have to be identical from one industry to the next. Others say the cry for unity is often more like a cloak, hiding a desire for control.
Either way, alignment around important areas of functionality is usually useful to everyone involved. Everyone, in this case, is cable, broadcasters, and consumer electronics manufacturers, at a minimum.
The good news: A harmonization effort is in the works. It’s called “GEM,” for “Globally Executable MHP.” GEM is the brainchild of the “MHP Umbrella Group,” which, as you can guess, goes by “MUG.”
In an almost unforgivable simplification, GEM, in intent, is sort of a software equivalent to duct tape, or WD-40 – which variously can stick or un-stick just about anything.
GEM began as a way to handle the inherent differences between U.S. and European digital video applications. Specifically, it started as a way to connect the dots that needed connecting between MHP and OCAP. Involved technologists suggest that GEM could also be put to work on DASE.
Hopefully, this will happen while DASE is still on cable’s periphery, and not turn into a big ugly mess. Cable and broadcast already went with differing digital modulation techniques – VSB (vestigial sideband) for the broadcasters, QAM (quadrature amplitude modulation) for cable. It wasn’t the end of the world, but it created kinks in signal flow.
For those of us on the sidelines, then, it’s probably useful to hope that MUG’s GEM takes the daze out of DASE.
This column originally appeared in the Broadband Week section of Multichannel News.
Abstracting the Abstraction Layer
by Leslie Ellis // October 14, 2002
An expression common to computer scientists is on the rise among cable technologists, and it’s a doozy. Usually it crops up in conversations among software engineers about digital video hardware and software.
The expression: “Abstraction layer.”
Abstraction layers are everywhere in software. Industrially, an abstraction layer is something software architects build. Its intent is to take something complicated, with many possible outcomes, and put something on top of it that yields a simple way of doing it, one that works in lots of different places.
Clear as mud, right?
Let’s break it down further, starting with the word “abstraction.”
Outside of techno-interpretations, “abstraction” carries at least seven meanings (and that’s without consulting the Oxford English Dictionary). Many of its definitions seem only vaguely related to one another: There’s abstraction as in “lost in abstraction.” Or, abstraction as in a removal of something. And, abstraction as the inventive isolation of an object’s characteristics, like when sorting something into its genus or species.
The whole abstraction thing, then, is fairly cerebral, and it doesn’t get much better when hitched to “layer” and whisked into the lexicon of software engineering.
The general invisibility of software is why software people are experts at drawing piles of rectangles when explaining the “hows” of their world. Most of this stuff is hard to envision without the rectangles, layered to make a stack.
A fairly typical depiction will show a rectangle marked “set-top hardware” at the bottom, “operating system” above it, “middleware” above that, and “applications” at the top.
The north-south intersections of those four rectangles are where the abstraction layers do their work. Abstraction layers essentially say “do it,” instead of listing long instruction sets about how to do it.
Abstraction layers bring with them their own set of prefixes: Hardware abstraction layer, software abstraction layer, database abstraction layer, network abstraction layer. In every sense, the intent is to simplify, for the next software module in the stack, how to proceed with an activity.
So a hardware abstraction layer, in the case of the rectangle at the bottom, marked “set-top hardware,” will summarize for the box above it, marked “operating system,” how to proceed with a desired activity. And so on, up the stack.
If there weren’t abstraction layers, software programs would exist as big blobs, not all that reusable, and not at all happy or speedy about handling the inevitable changes that happen during the course of business. Without an abstraction layer, then, deployments of whatever the advanced digital video product may be – VOD, SVOD, you name it – would wind up as massive, custom integration projects.
Example: The navigational part of a VOD system, which in the stack of rectangles would sit at the top, as an “application,” just wants to tune a movie. So its abstraction layer says to the one below it, “fetch me the stream for ‘Waking Ned Devine,’ please.”
It doesn’t say, “tune the following 6 MHz carrier, find me the program identifier for the MPEG-2 stream related to ‘Waking Ned Devine,’ isolate the index frames, begin filling the MPEG buffer, decompress the video, and get it on the screen, please.”
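For readers who like to see it rather than read it, the contrast between those two paragraphs can be sketched in a few lines of Python. Every name below (the classes, the carrier frequency, the program identifier) is invented for illustration; no real VOD middleware is being quoted.

```python
class VideoPipeline:
    """Stands in for the low-level steps the application never wants to see."""
    def __init__(self):
        self.steps = []

    def tune_carrier(self, mhz):
        self.steps.append(f"tune {mhz} MHz")

    def find_program_id(self, title):
        self.steps.append(f"find PID for {title}")
        return 0x1FF0  # invented program identifier

    def decode(self, pid):
        self.steps.append(f"decode PID {pid:#x}")
        return "video on screen"


class StreamLayer:
    """The abstraction layer: one simple request hides the long instruction set."""
    def __init__(self, pipeline):
        self.pipeline = pipeline

    def fetch_stream(self, title):
        # The "tune the following 6 MHz carrier..." recitation lives here,
        # written once, so no application ever has to say it.
        self.pipeline.tune_carrier(573)
        pid = self.pipeline.find_program_id(title)
        return self.pipeline.decode(pid)


# The navigation application, at the top of the stack, just asks:
layer = StreamLayer(VideoPipeline())
result = layer.fetch_stream("Waking Ned Devine")
print(result)
```

The application says one thing; three low-level things happen beneath it, and the application never sees them. That, in miniature, is the whole trick.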
The kissing cousin to the abstraction layer is the “API,” or “application programming interface.” APIs are the software tools used by the layers to talk to one another. People who work at the hardware level — engineering chips onto boards, and writing machine language code to make the chips do their work — need to know how to make all of that sensible to the next thing that needs it. In the case of the set-top box, the next thing that needs it is usually the operating system.
APIs, then, help the operating system know what’s below it. APIs at the middleware level, above the operating system, help applications developers know how to get to what’s below it, and so on.
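One way to picture an API’s role between layers, again sketched in Python with invented names: the layer above writes to a published contract, and any implementation below — any vendor’s silicon, in this sketch — can satisfy it.

```python
from abc import ABC, abstractmethod


class TunerAPI(ABC):
    """The published contract between layers: all the layer above ever sees."""
    @abstractmethod
    def tune(self, channel: int) -> str: ...


class VendorATuner(TunerAPI):
    """One vendor's hardware-level implementation, hidden behind the API."""
    def tune(self, channel):
        return f"vendor-A silicon tuned channel {channel}"


class VendorBTuner(TunerAPI):
    """A different vendor's implementation; the layer above doesn't care."""
    def tune(self, channel):
        return f"vendor-B silicon tuned channel {channel}"


def guide_application(tuner: TunerAPI):
    # The application is written once, against the API,
    # not against either vendor's chip.
    return tuner.tune(57)


print(guide_application(VendorATuner()))
print(guide_application(VendorBTuner()))
```

The guide application runs unchanged on both boxes, which is precisely the point: the API is the agreement that lets each layer evolve without breaking its neighbors.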
Building abstraction layers, computer scientists assure, is an art. There are people whose entire lives are spent building abstractions. It’s not always perfect: Sometimes, getting to a high level of abstraction necessitates throwing away some good stuff, too.
Abstractions, as cerebral as they are, weren’t meant to be a confusion device developed by computer scientists to make the rest of us feel stupid. They were meant to simplify. They theoretically afford bigger portions of an R&D budget to go to the actual product, not to custom integration.
And, as cable executives up to the CEO level know all too well, anything that relieves the stacks of dollars going to custom integration is worth its weight in abstractions.
This column originally appeared in the Broadband Week section of Multichannel News.