The Secret of the Red Box

Much of what we stream, from Netflix to Google to Facebook, comes out of server boxes sitting in local data centres. Hadyn Green describes how streaming works – and where it’s going

In a data centre in Auckland, sitting in a server rack, is a red box. Inside the box are thousands of movies and television shows ready to stream to your home. This is Netflix. And it’s not the only box. Google is here too. And so is Facebook.

The boxes themselves are actually servers, part of a wider network known as a CDN (Content Delivery Network). CDNs are how most large streaming services distribute their content around the country.

Let’s back up though. Why does a company like Netflix need a distributed network of servers all carrying the same content? The answer is traffic and distance.

An estimated 60 to 80 percent of internet traffic is video. While Netflix is the largest contributor, with roughly 40 percent of the video traffic, Google’s YouTube is close behind. Facebook is a big player in video as well and has just launched Facebook Watch (currently US-only), an on-demand video service with original content and live shows.

The amount of content, and the rising demand that accompanies it, is increasing data usage on the network. According to Chorus, internet traffic is growing at roughly 56 percent every year. Adding to this growth is the rising quality, and hence larger file size, of that content (4K, HDR, high frame rates and so on).
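To get a sense of how quickly that compounds, here is a rough back-of-the-envelope calculation; the 56 percent growth rate is Chorus’s figure, while the starting volume is arbitrary.

```python
# Rough illustration only: compound 56% annual growth in network traffic.
# The growth rate comes from Chorus; the starting volume is arbitrary.
GROWTH_RATE = 0.56

traffic = 1.0  # relative traffic volume today
for year in range(1, 6):
    traffic *= 1 + GROWTH_RATE
    print(f"Year {year}: {traffic:.1f}x today's traffic")

# At 56% a year, traffic passes 2x in under two years
# and is roughly 9x after five years.
```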

Right now, there’s still a lot of content that isn’t available online, especially local material.

“When Netflix launched in New Zealand, in 2015, we saw a big surge of traffic,” says Kurt Rodgers, network strategy manager for Chorus. “Now, imagine how big a surge it’ll be with all the current [terrestrial] channels going online.”

It is Rodgers’ job to imagine that traffic, and to predict the changes needed to keep the network congestion-free.

“We benchmark to the busiest five-minute period of the day, which is roughly 9pm, when everyone is watching TV,” he says.

Of course, by ‘TV’, Rodgers means streaming video from various sources. “The network needs to handle a household running five different streams, with at least one being 4K, without a hitch.”
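As a rough illustration of what that benchmark household adds up to, here is a quick calculation. The per-stream bitrates are assumptions, broadly in line with what streaming services commonly recommend; they are not Chorus figures.

```python
# Back-of-the-envelope sketch of the benchmark household above:
# five simultaneous streams, at least one of them 4K.
# The per-stream bitrates are assumptions, not figures from Chorus.
BITRATES_MBPS = {
    "4K": 20.0,  # assumed 4K/UHD stream
    "HD": 5.0,   # assumed 1080p stream
    "SD": 2.5,   # assumed standard-definition stream
}

household = ["4K", "HD", "HD", "HD", "SD"]  # one 4K plus four other streams
total = sum(BITRATES_MBPS[quality] for quality in household)

print(f"Peak household demand: about {total:.0f} Mbps")
# Roughly 38 Mbps sustained, before any other internet use in the house.
```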

Of course, Chorus can only do so much. The high-speed, ‘fat’ pipe from Christchurch to Auckland helps, but no matter the speed, the further you are physically from a piece of content, the longer it will take to load. There is no getting around that. The term for this delay is latency, and it is the bane of streaming media, especially live video (more on this below).

A few years ago, before Netflix launched in New Zealand, those with the means to access it found the content was good, but the streams would often drop to a lower-quality bitrate. There was also a long lag between pressing ‘play’ and a programme or movie starting. That is what it’s like pulling your content all the way across the Pacific.
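To see why distance hurts, here is a rough propagation-delay calculation. The cable distances and the speed of light in fibre (around 200,000 km/s) are ballpark assumptions, not measured routes.

```python
# Rough illustration of why physical distance matters for latency.
# Light in optical fibre covers roughly 200,000 km per second; the cable
# distances below are ballpark figures, not measured routes.
SPEED_IN_FIBRE_KM_PER_S = 200_000

routes_km = {
    "Auckland -> US West Coast":  11_000,  # trans-Pacific, approximate
    "Auckland -> Sydney":          2_200,  # Tasman crossing, approximate
    "Auckland -> local CDN cache":   100,  # in-country edge server
}

for route, distance in routes_km.items():
    round_trip_ms = (2 * distance / SPEED_IN_FIBRE_KM_PER_S) * 1000
    print(f"{route}: ~{round_trip_ms:.0f} ms round trip (propagation only)")

# Every request pays this cost, and real-world latency is higher still once
# routing, queuing and the streaming protocol's handshakes are added.
```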

But, in 2018, those problems are long gone, because of Netflix’s Open Connect program.

Netflix gives ISPs free CDN caches (edge servers) to slot into their networks. Google and Facebook run similar systems. The caches connect to their nearest hub (origin server) – which for most services in New Zealand is in Sydney – then gather all the content they need and serve it up to their customers.

So, when a new season of Netflix’s Marvel series Jessica Jones is released, it is sent from the US to the origin server in Sydney. Then, when the first person presses ‘play’, the video is sent across the Tasman to the multiple CDN boxes here. From then on, any user in New Zealand only has to access the cached copy held in this country.
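In software terms the boxes behave like a pull-through cache. The sketch below illustrates that pattern only; the class and method names are invented for the example and are not Netflix’s actual software.

```python
# Minimal sketch of a pull-through edge cache, the pattern described above:
# the first viewer's request pulls the title across the Tasman from the
# origin; everyone after that is served from the in-country copy.

class OriginServer:
    """Stands in for the Sydney origin holding the master copies."""
    def __init__(self, catalogue):
        self.catalogue = catalogue

    def fetch(self, title):
        print(f"  (fetching '{title}' from the Sydney origin)")
        return self.catalogue[title]


class EdgeCache:
    """Stands in for a CDN box sitting inside a New Zealand ISP."""
    def __init__(self, origin):
        self.origin = origin
        self.store = {}

    def serve(self, title):
        if title not in self.store:   # cache miss: the first local viewer
            self.store[title] = self.origin.fetch(title)
        return self.store[title]      # cache hit for everyone after that


origin = OriginServer({"Jessica Jones S02E01": b"<video bytes>"})
cache = EdgeCache(origin)

cache.serve("Jessica Jones S02E01")  # crosses the Tasman once
cache.serve("Jessica Jones S02E01")  # served from the local box
```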

The bigger the ISP, the more boxes it gets. It’s a quid pro quo arrangement: the ISPs get to say that Netflix works on their network, and Netflix gets its content to users that much faster. It’s also a quick way for ISPs to scale up. Everybody wins.

Another company, Akamai, has its own CDN system that other companies use (TVNZ On-Demand, for example). Akamai rents out virtual real estate on its CDN network, giving smaller players in the streaming market the same technological advantages as the giants.

Akamai is usually behind international streaming sites too. Why build your own network when you can rent distribution and capacity?

Amazon Web Services (AWS) is the choice of New Zealand’s newest movie-streaming service, Stuff Pix. Unlike Netflix, AWS doesn’t yet put CDN caches inside local ISPs’ networks, so Stuff Pix users’ traffic will cross the Tasman between Australia and New Zealand. However, Stuff Pix general manager Paddy Buckley doesn’t see this as an issue. “Other services in New Zealand use the same server and have had very few issues.”

Stuff Pix also has a slightly different model to Netflix, Lightbox, and Neon. It’s TVOD, or Transactional Video On Demand, similar to iTunes or Google Play. “Stuff Pix will launch soon with more than 700 movies available to rent. It’ll also have more of a Kiwi focus than the global services and will be device-agnostic, meaning that it won’t be limited to any specific devices and will be widely accessible,” says Buckley.

Buckley’s comments highlight a clear consumer trend: a move away from linear TV to on-demand. Users want their content fast and crystal clear. And they want to be able to watch it on any screen in their house with no fuss.

So, how is the former big dog of New Zealand media, Sky, dealing with this trend? If its subscription numbers are to be believed, the answer is poorly. However, its latest collaboration with Vodafone may turn this around.

Despite not being allowed to merge, the two companies are working closely together to offer a unique streaming service in New Zealand: Vodafone TV.

Vodafone TV runs on a set-top box or mobile app and is only available to those on fibre or cable (‘Fibre-X’) connections. The box has Netflix and all of the free-to-air on-demand services built in. Subscribers get a live version of Sky, plus access to Sky’s catalogue on demand.

It’s this part that is new, at least for Sky. Tony Baird, technology director for Vodafone, described it:

“We have a scheduler and an ‘origin’ [server] in Auckland, and this sends content out to our CDNs in Auckland, Wellington and Christchurch. We’ve also got a big fibre-optic link direct to Sky for its Live TV and Video-On-Demand services. The scheduler is responsible for triggering the play out, and the ‘origin’ stores a single unique copy of the content for all customers to access. This is the ‘user plane’. For the control plane, we use AWS.”

If you’re confused by these terms, don’t worry. The user plane is how the service gets the content to you, and is similar to that of other services. The control plane refers to how you interact with that content, and it controls every part of the experience, including what happens when you press ‘play’.

“When you watch content we don’t send you the whole file; instead you are assigned a unique pointer,” explains Baird. “The pointer knows exactly where you are in that piece of content. This means sending fewer files, which means it’s more efficient.”
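The sketch below illustrates the pointer idea, assuming the content is split into short segments in the way common streaming protocols work; the names and numbers are invented for the example and are not Vodafone’s implementation.

```python
# Minimal sketch of the 'pointer' idea Baird describes, assuming the content
# is delivered as short segments rather than one big file.

class PlaybackSession:
    def __init__(self, segments):
        self.segments = segments  # ordered list of short video segments
        self.pointer = 0          # the viewer's current position

    def next_segment(self):
        """Send only the next few seconds of video, not the whole file."""
        segment = self.segments[self.pointer]
        self.pointer += 1
        return segment

    def seek(self, segment_index):
        """Skipping or rewinding just moves the pointer."""
        self.pointer = segment_index


episode = [f"segment_{i:04d}.ts" for i in range(1_200)]  # ~2h in 6s chunks
session = PlaybackSession(episode)

print(session.next_segment())  # segment_0000.ts
session.seek(600)              # the viewer jumps to the halfway mark
print(session.next_segment())  # segment_0600.ts
```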

Working with Sky means working with a wide range of content owners. Sometimes those owners impose odd rules on their content as they come to grips with this relatively new form of distribution.

“For example, some content owners don’t want you to be able to rewind, others may not want you to skip. With the control plane we can add and remove these controls as necessary for each piece of content. And with over-the-air updates we can quickly implement new features when they become available.”
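A control plane like the one Baird describes could express those per-title rules as simple flags. The sketch below is a hypothetical illustration; the flag names and titles are invented for the example.

```python
# Hypothetical per-title playback rules enforced by a control plane.
# Flag names and titles are invented for the illustration.

PLAYBACK_RULES = {
    "Live Sport":    {"allow_rewind": False, "allow_skip": False},
    "Drama Box Set": {"allow_rewind": True,  "allow_skip": True},
}
DEFAULT_RULES = {"allow_rewind": True, "allow_skip": True}


def handle_command(title, command):
    """Check the content's rules before honouring a remote-control press."""
    rules = PLAYBACK_RULES.get(title, DEFAULT_RULES)
    if command == "rewind" and not rules["allow_rewind"]:
        return "rewind disabled for this content"
    if command == "skip" and not rules["allow_skip"]:
        return "skip disabled for this content"
    return f"{command} ok"


print(handle_command("Live Sport", "rewind"))   # rewind disabled
print(handle_command("Drama Box Set", "skip"))  # skip ok
```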

David Malpas, general manager of product at TVNZ, has an even bigger job: he has to deal with live television as well, and with advertising. All TVNZ’s content is sourced from master files held at TVNZ. These go through another company, Brightcove, which transcodes the files (converts them from one video format to another), creates the slots for advertisements and adds DRM (Digital Rights Management), before they head to the origin server. With live television, TVNZ bypasses Brightcove and streams directly to the origin server.
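The sketch below models those stages in miniature: master file in, transcoded renditions with ad slots and DRM out, then off to the origin. The function names and rendition ladder are invented for the illustration and do not reflect Brightcove’s actual tooling.

```python
# Toy model of the on-demand pipeline described above. Function names and
# rendition choices are invented; this is not Brightcove's API.

RENDITIONS = ["1080p", "720p", "480p"]  # assumed bitrate ladder

def transcode(master, rendition):
    return f"{master}@{rendition}"

def insert_ad_slots(stream, ad_breaks):
    return {"stream": stream, "ad_breaks": ad_breaks}

def apply_drm(package):
    package["drm"] = "encrypted"
    return package

def publish_to_origin(package):
    print(f"published {package['stream']} "
          f"(ad breaks at {package['ad_breaks']}s, {package['drm']})")

master_file = "sunday_episode_master.mxf"
for rendition in RENDITIONS:
    package = apply_drm(insert_ad_slots(transcode(master_file, rendition),
                                        ad_breaks=[600, 1500, 2400]))
    publish_to_origin(package)
```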

“On a mobile device, the ads are put in server-side,” says Malpas, referring to a process whereby the adverts are inserted before the programme reaches you, as opposed to being called in while you’re already watching it.

“We want to bring this into non-mobile as well, as it means there’s less chance of something going wrong.”
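The sketch below shows the difference in miniature: with server-side insertion the ad segments are spliced into the playlist before it ever reaches the player, so the viewer’s device just sees one continuous stream. Segment names are invented for the example.

```python
# Toy server-side ad insertion: the ads are stitched into the segment list
# on the server, so the player sees one continuous playlist.

programme = ["prog_000.ts", "prog_001.ts", "prog_002.ts", "prog_003.ts"]
ad_break = ["ad_000.ts", "ad_001.ts"]

def stitch_server_side(programme_segments, ad_segments, insert_after):
    """Splice the ad break into the playlist before it leaves the server."""
    return (programme_segments[:insert_after]
            + ad_segments
            + programme_segments[insert_after:])

playlist = stitch_server_side(programme, ad_break, insert_after=2)
print(playlist)
# ['prog_000.ts', 'prog_001.ts', 'ad_000.ts', 'ad_001.ts',
#  'prog_002.ts', 'prog_003.ts']
# With client-side insertion, by contrast, the player has to pause the
# programme and call out for the ads while you're already watching.
```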

As terrestrial channels and more local material move online, internet traffic will keep climbing and put more strain on content delivery networks, so expect the number of CDN caches to increase alongside demand.

As Chorus’ Rodgers puts it: “The wave of [online] TV is coming, the question is when?”