Technology

Our TiledVR solution, based on our industry-leading Tiled Streaming technology, enables very high-quality VR360 streaming at realistic bitrates, solving the bandwidth and quality problems that plague VR streaming today.

Snappy Response

Virtual Reality holds great promise, but streaming VR360 video over today’s networks is a considerable challenge. It simply requires too much bandwidth to deliver a video stream that is both of acceptable quality and responsive to head movements. The result is either a low-quality image, or an experience that can make you feel sick – or both. Tiled Streaming, a technology that we originally developed at TNO, solves these problems. It enables streaming of high-quality VR360 video over existing networks, with a very snappy response to head motion. We have been working on our Tiled Streaming technology since 2011, and it provides a perfect solution for the challenges posed by VR streaming.

Only 12% of the Image …

Imagine viewing 360 video content through a Head-Mounted Device (HMD). You can look around, and at any moment in time you see only a part of the full panorama. In fact, you only see about an eighth of it (roughly 12%). Streaming the entire panorama is hugely inefficient, and doing so in high quality is downright impossible unless you have an extremely fast internet connection. And note that today’s HMDs have a resolution that is too low for a truly immersive VR experience, and that their resolution will increase significantly in the coming years, making the need for an efficient solution only more urgent.
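
As a rough illustration of this number, the Python snippet below estimates the fraction of an equirectangular panorama covered by a viewport. The 90 x 90 degree field of view is an assumption for the example (real headsets differ), but it lands right at the 12% figure mentioned above.

    def viewport_fraction(h_fov_deg: float, v_fov_deg: float) -> float:
        """Rough estimate of the fraction of the full 360 x 180 degree
        panorama covered by a viewport; projection distortion is ignored."""
        return (h_fov_deg * v_fov_deg) / (360.0 * 180.0)

    # Assumed field of view of roughly 90 x 90 degrees; actual values
    # differ per device.
    print(f"{viewport_fraction(90, 90):.1%}")  # -> 12.5%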

You only view about 12% of the total panorama at any given time

Reducing Bandwidth

There are two major solutions to get the bandwidth down to realistic levels:

  1. Creating many different versions of the panorama, and streaming the one that best fits the viewpoint;
  2. Dividing the image into tiles, and only sending the tiles that are in view.

We use the second method, because it is much more scalable and requires far fewer encoding and server resources.
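
To make the idea concrete, here is a minimal Python sketch of the tile-selection step (not our production code): the panorama is treated as a simple grid of tiles, and only the tiles that overlap the current viewport are selected for fetching. The 16 x 8 grid and the field-of-view numbers are assumptions chosen for the example.

    import math

    def visible_tiles(yaw_deg, pitch_deg, h_fov_deg, v_fov_deg, cols=16, rows=8):
        """Return the (col, row) indices of the tiles that overlap the viewport.

        The panorama is treated as a 360 x 180 degree equirectangular image
        split into a cols x rows grid; the grid size is just an example."""
        tile_w = 360.0 / cols
        tile_h = 180.0 / rows

        # Viewport edges in panorama coordinates (degrees).
        left   = yaw_deg - h_fov_deg / 2.0
        right  = yaw_deg + h_fov_deg / 2.0
        top    = max(0.0,   (pitch_deg + 90.0) - v_fov_deg / 2.0)
        bottom = min(180.0, (pitch_deg + 90.0) + v_fov_deg / 2.0)

        tiles = set()
        for row in range(int(top // tile_h), math.ceil(bottom / tile_h)):
            col = math.floor(left / tile_w)
            while col * tile_w < right:
                tiles.add((col % cols, row))  # wrap around the 360-degree seam
                col += 1
        return tiles

    # Looking straight ahead with a roughly 100 x 90 degree viewport:
    # only a couple of dozen of the 128 tiles need to be requested.
    print(sorted(visible_tiles(yaw_deg=0, pitch_deg=0, h_fov_deg=100, v_fov_deg=90)))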

Cutting the panorama up into tiles

Tiled Streaming enables distribution of VR content:

  • At extremely high quality
  • With virtually zero motion-to-photon latency
  • On any display device (dedicated head-mounted devices, phones, tablets)
  • Using standard encoding / decoding systems
  • For on-demand and live content
  • In a way that is massively scalable to millions of users simultaneously over any CDN, using standard HTTP streaming technology
  • At bitrates comparable to normal video.

Cutting It Up In Tiles

Tiled Streaming works with all relevant devices, including the popular Oculus Rift and mobile devices like Samsung’s Gear VR. The Tiled Streaming software in these so-called “clients” retrieves only the tiles that are actually visible in the HMD. The panoramic video needs to be encoded in a special way, but this can be done with industry-standard encoders. Typically, there will be over a hundred such tiles. These tiles are independently coded and stored on a Content Distribution Network (CDN), where the client can find them. The client requests the tiles it needs, decodes them, and then rearranges them for rendering on the device.
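
To illustrate the fetch step, the sketch below shows how a client could request individual tile segments over plain HTTP. The CDN location, URL layout and segment naming are made up for the example; in practice the tiles are addressed through a standard HTTP streaming manifest on the CDN.

    from urllib.request import urlopen

    CDN_BASE = "https://cdn.example.com/vr360"  # hypothetical CDN location

    def fetch_tile_segment(col: int, row: int, segment: int) -> bytes:
        """Download one independently coded tile segment over plain HTTP."""
        url = f"{CDN_BASE}/tile_{col}_{row}/seg_{segment}.mp4"  # made-up naming scheme
        with urlopen(url) as response:
            return response.read()

    def fetch_viewport(tiles, segment):
        """Fetch every tile currently in view; the client then decodes them
        with a standard decoder and rearranges them into the viewport."""
        return {tile: fetch_tile_segment(*tile, segment) for tile in tiles}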

The tiles bounded by the yellow rectangle are fetched from the network

There is also a lower-resolution version of the panorama that is always transmitted. When you turn your head towards a different part of the panorama, the device needs to fetch new tiles from the network. While this happens extremely fast (within 20-40 msec.), it still takes a bit of time. The fall-back layer ensures there are no black holes while the new tiles are fetched, and it also keeps the ‘motion-to-photon delay’ (the delay that will make you sick if it is too long) incredibly short. That delay is as low as it can possibly be, because it only depends on local processing.
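
The sketch below illustrates the principle: the fall-back layer is always drawn, and high-resolution tiles are composited on top as soon as they have been fetched and decoded. It is a simplification (the real renderer projects actual video frames, of course), but it shows why the viewport is never empty and why the motion-to-photon delay depends only on local processing.

    def compose_frame(visible, tile_cache, fallback_layer):
        """Return the layers to render for this frame, bottom to top."""
        layers = [fallback_layer]           # always available locally: never a black hole
        for tile in visible:
            if tile in tile_cache:          # high-res tile already fetched (20-40 msec.)
                layers.append(tile_cache[tile])
        return layers

    # Example: two of the three visible tiles have already arrived.
    print(compose_frame(
        visible=[(0, 3), (1, 3), (2, 3)],
        tile_cache={(0, 3): "tile_0_3_hires", (1, 3): "tile_1_3_hires"},
        fallback_layer="low_res_panorama",
    ))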

By choosing the tile size in a clever way, we can reduce the amount of data by a factor of approximately five to seven. Put another way, we can send 5-7 times as many pixels at the same bitrate, which translates into much higher resolution and quality.
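
The numbers below are illustrative rather than measured, but they show how such a factor adds up: fetching only the visible high-resolution tiles plus the always-on low-resolution fall-back layer costs only a fraction of streaming the full panorama in high quality.

    full_panorama_mbps = 60.0   # assumed bitrate for the whole panorama in high quality
    fraction_fetched   = 0.15   # high-resolution tiles actually in view (plus a margin)
    fallback_mbps      = 1.5    # always-on low-resolution fall-back layer

    tiled_mbps = full_panorama_mbps * fraction_fetched + fallback_mbps
    print(f"{tiled_mbps:.1f} Mbit/s instead of {full_panorama_mbps:.0f} Mbit/s "
          f"(a factor {full_panorama_mbps / tiled_mbps:.1f} reduction)")
    # -> 10.5 Mbit/s instead of 60 Mbit/s (a factor 5.7 reduction)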