It’s Super Bowl time, and for network technologists, the big action has little to do with Broncos or Seahawks, and everything to do with how many people will watch it as a video stream over the Internet, vs. a traditional television broadcast.
In network terms, the Super Bowl is to technologists what Mother’s Day is to the people who built and maintain the original telephone network: The day the network gets stress-tested for max usage. Conversationally, network engineers tend to append the word “problem” to it — “the Super Bowl problem.”
Telephony engineers even came up with a unit of measure for it — the Erlang (for Agner Erlang, the guy who came up with it). The Erlang measures the average number of concurrent phone calls carried by a circuit, over a period of time.
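The math, for the curious, is simple multiplication: traffic in erlangs is the call arrival rate times the average hold time. (The numbers below are mine, purely for illustration.)

```latex
% Offered traffic in erlangs: arrival rate times average hold time.
% Illustrative numbers: 30 calls per hour, each lasting 10 minutes on average.
E = \lambda \cdot h = 30\,\tfrac{\text{calls}}{\text{hr}} \times \tfrac{10}{60}\,\text{hr} = 5\ \text{erlangs}
```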
So far as we know, there’s still no official “video erlang,” but the concept is the same. The numbers: In 2012, the Big Game hit the Internet as a live video stream for the first time. A little over a million people (out of more than 100 million) tuned in.
Last year, about three million people (out of 108 million) watched the game as a live Internet stream.
At issue is what happens when half or more of Super Bowl viewers tune in over the Internet, and/or using Internet Protocol. What happens, for instance, when 50 million people are all watching the same thing, as a live video stream? What happens when they pause, or rewind?
“If it were something you could hear, what you’d hear is a giant flushing sound,” one technologist quipped about it last week.
Refresher: “Multicast” is the Internet-y way of saying “broadcast,” meaning one to many. Right now, when you stream anything over the Internet, you’re watching it “unicast.” A special session is set up between you and the server holding what you want to watch. If your neighbor chooses to watch the same thing, she gets a different unicast stream. One to one.
What if all 108 million football watchers needed a unicast stream, of the same thing, all at the same time? This is what people are talking about when they say the Internet would buckle.
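Back-of-the-envelope, assuming a middling 5 Mbps per HD stream (my number, purely for illustration), the aggregate load would look like this:

```latex
% Aggregate unicast load, assuming ~5 Mbps per stream (illustrative):
108{,}000{,}000\ \text{streams} \times 5\ \text{Mbps} = 5.4 \times 10^{14}\ \text{bps} = 540\ \text{Tbps}
```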
So where are we with multicast? Cable technologists say they’re making steady progress, but are divided over where the trouble starts. Some say the efficiencies enabled by multicast only kick in when 30% or more of viewers are watching the big game over the Internet; others say the operational impacts, especially around ad insertion, are going to be significant.
So here’s my wager: Either the Broncos or the Seahawks will win, and six million people will watch the game as a live Internet stream.
This column originally appeared in the Platforms section of Multichannel News.
For bandwidth people, hardly anything is a surprise anymore. Not even the 50%-and-higher rise in broadband usage, year after year after year, since about 2009.
Not even when they realized that nothing has ever grown that fast, and for that long, in the history of consumable goods.
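For perspective, here’s what 50% annual growth compounds to over, say, the four years from 2009 to 2013 (my arithmetic, on the column’s round numbers):

```latex
% 50% year-over-year growth, compounded over four years:
1.5^{4} \approx 5.1 \quad \Rightarrow \quad \text{roughly a fivefold rise}
```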
It’s because of all of our Internet-connected stuff, of course, and how much we’re using those screens to create, ship, download and stream video.
Video is the undisputed fat man of the Internet.
Cisco reconfirmed the trend last week in the latest installment of its Visual Networking Index (VNI), a recurring study that tracks what’s going on with Internet usage. The report covers tons of ground, with numbers so big they’re hard to conjure.
Like this: “Internet video-to-TV traffic will increase nearly 5-fold between 2012 (1.3 exabytes per month) and 2017 (6.5 exabytes per month.)”
(In the hierarchy of increasing numerical size, it goes “kilo-,” “mega-,” “giga-,” “tera-,” “peta-,” then “exa-.” “Exa” is a quintillion. As in million, billion, trillion, quadrillion, quintillion. Sextillion, septillion, octillion, nonillion, decillion. Ok, I’ll stop.)
But let’s get back to bandwidth people. Part of their work is to ensure that demand doesn’t outstrip supply. Which brings us to “the Super Bowl problem.”
The Super Bowl problem, from a bandwidth perspective, has two parts: One, what if everyone tunes in all at once? Two: What if everyone pauses all at once?
“Everyone,” when it comes to the Super Bowl, was 108 million people this year. They all saw the game, on their small, medium, and super-large screens. So what’s the problem?
It’s the “Internet-connected” part. See, of the 108 million, only 3 million saw the game as an online video stream. Which brings us to the lingo of video distribution, tweaked for online usage: Multicast and unicast.
Refresher: In the good old days (meaning today), to broadcast is to send one to many. Whether one person watches the game, or 108 million, doesn’t matter. In bandwidth terms, it’s all the same.
The Internet doesn’t work that way. It’s intrinsically many to many. When you stream House of Cards on Netflix, other people might be streaming it at the same time, sure. But how it works is called “unicast.” One stream unicast to me, another to you, another to Harry, another to Jane.
Nailing up enough bandwidth to unicast 108 million unique streams is both a horrific waste of bandwidth, and a great way to buckle the system.
So, there’s multicast. It’s the streaming equivalent of raising the red flag on your (physical) mailbox – not to say “there’s mail in here,” but to say “gimme.” With multicast, if someone in your serving area is already watching what you want to watch, you “join that stream.”
Multicasting to 108 million screens is non-trivial, in engineering-speak. Computationally intensive. Getting there involves work on everything from the cloud to protocols to servers to routers to devices. It’s about making sure the video cloud has enough intelligence to recognize a URL with multicast headers, then making sure enough “multicast join” mechanisms are in place. And lots more.
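For the protocol-curious, here’s a minimal sketch of what a “multicast join” looks like from the receiving end, using standard IPv4 multicast sockets in Python. The group address and port are hypothetical; a real video service would negotiate those out of band, and layer RTP, encryption, and ad logic on top.

```python
# A minimal sketch of the "multicast join" idea, using IPv4 multicast
# sockets. GROUP and PORT are made-up values, purely for illustration.
import socket
import struct

GROUP = "239.1.1.1"   # hypothetical multicast group carrying the game
PORT = 5004           # hypothetical UDP port for the video stream

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# This is the "raise the mailbox flag" moment: an IGMP membership report
# telling the network "send me whatever that group is already carrying."
mreq = struct.pack("=4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

while True:
    packet, sender = sock.recvfrom(2048)
    # In real life these would be video packets headed for a decoder.
    print(f"got {len(packet)} bytes of the shared stream from {sender}")
```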
Right now, the thinking is to multicast to the home — which is why you keep hearing about “gateways,” by the way. They’re the interim step to “all-IP.”
From the gateway, that multicast stream can be unicast to the other devices in the house that want a look. That way, the buffers needed to pause are in the house, not in the network. Imagine building a network-side buffer big enough for 108 million pauses!
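To see why, put rough numbers on it, assuming (hypothetically) a 5 Mbps stream and a 10-minute pause buffer per viewer:

```latex
% Hypothetical per-viewer pause buffer: 5 Mbps stream, 10-minute pause.
5\ \text{Mbps} \times 600\ \text{s} = 3{,}000\ \text{Mb} \approx 375\ \text{MB per viewer}
% Network-wide, if all 108 million viewers hit pause at once:
375\ \text{MB} \times 108{,}000{,}000 \approx 40\ \text{petabytes}
```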
That’s the short version of broadcast, multicast, unicast, and the Super Bowl problem. Or you could just skip the game, and go to Costco. No lines. So I’ve heard…
This column originally appeared in the Platforms section of Multichannel News.
Language gets weird. Even what seems obvious can quickly get muddled. I’m reminded of a panel about over-the-top video, where a representative of that community (okay, it was Roku) made this polite, explanatory remark about the types of video the device can stream: “We call them ‘channels.’ ”
The audience was nearly 100% cable people. I couldn’t resist this retort: “We do too!”
Language gets especially weird around seemingly obvious stuff in tech. Here’s a recent example: “linear IP video,” to describe what we nowadays call live, broadcast television. What you get through your set-top (broadcast TV), except it comes through your cable modem (that’s the IP part).
Channel by channel, show by show, linear TV – but delivered in IP. So, in that sense, the term “linear” was teased out a bit, to mean “broadcast.”
So far so good. Except for one important technical distinction: By definition, video delivered via a cable modem – in IP — isn’t broadcast. It’s switched.
Broadcast is one to many. One channel, one “stream,” sent over the air or over a wire, to millions of receiving TVs.
Stuff that goes through the headend counterpart of the cable modem – the CMTS, for Cable Modem Termination System – is inherently switched. Session-based. Clicking a link to watch a YouTube video instantiates a stream between its server and your screen. One to one.
The tech name for this (perhaps predictably) is “unicast.”
Unicast is fine, but it’s a bear on bandwidth. Imagine if we all streamed the Super Bowl as unicast sessions. Hundreds of millions of people, all asking for the same thing — but it ships stream by stream, not in bulk.
Tough on the pipes.
Happily, people are already tackling this conundrum. They call it “multicast,” which is essentially the IP version of what we now call “broadcast.”
Instead of asking for your own stream, you do the streaming equivalent of flipping up the flag on your (physical) mailbox. Then you “join” that stream, along with anyone else in your node who may be streaming it.
Switching is nothing new to cable. VOD was the first example – you request a movie, and the VOD server sets up a session with you and only you. Next, switched digital video (SDV) technologies emerged as a way to recycle available digital bandwidth.
Does this linguistic gymnastics matter? Not really. Television will continue to steamroll onto different screens, served by different networks, in different ways. But the next time you hear “linear IP video,” you can at the least make a snappy remark about how it’s technically not linear, because it’s switched. So there.
This column originally appeared in the Platforms section of Multichannel News.