Wednesday, November 16, 2011

Optimization Deployments[103]: Mediaserv[Reunion] Uses DiViNetworks on Top of File Caching - Why?

DiViNetworks announced that "Mediaserv, a leading Internet service provider in the French overseas territories, selected DiViNetworks to maximize data capacity of its link to Réunion ... After deploying file caching solutions, Mediaserv sought a solution that can further optimize its link".

Jérémie Pappo (pictured), head of engineering at Mediaserv, said: "We were surprised by DiViNetworks' capability to expand an already-optimized network ... Apparently a lot of inefficiency is revealed once you look into the bit-stream rather than the application or file, and DiViNetworks eliminates these inefficiencies".

See "Mediaserv optimizes data delivery to la Réunion with DiViNetworks" - here.

I asked Yair Shapira, DiViNetworks' VP Marketing, Sales & BD, about the last point - how does the solution further optimize a link that already enjoys a file caching solution saving significant bandwidth?

Yair explained that their Bytestream caching solution (background - here) promises a "30-50% bandwidth expansion ratio with or without caching". The following examples show how it achieves this on top of file caching:
  1. Monetized video: Many popular web sources (e.g. Rapidshare, Megaupload) take every possible measure to avoid being cached, as file-based caching jeopardizes their business model - they lose the ability to monetize through differentiated service, limited view time, inline ad insertion etc. DiViNetworks' solution leaves full control in the content provider's hands, and never sends a byte unless it has arrived from the source.
  2. Non-HTTP: As Bytestream caching is agnostic to protocols, it also supports protocols other than plain HTTP, such as P2P variants, RTMP, Office applications etc.
  3. Live video: Traditional file caching cannot cache live content - until the "file" is saved, no one can be served from it. With Bytestream caching, one person's stream is another person's history, even if they watch it only 50 ms later.
  4. Adaptive bit-rate (ABR): Much of today's video is streamed at ABR, meaning the video is composed of 2-10 second fragments encoded at different bit-rates, selected according to bandwidth availability. Caching these as files is of course very challenging and results in a low hit-rate, while for Bytestream caching it is transparent.
  5. Re-purposed content: The social Internet involves a lot of content re-purposing, so the same image can be found in various Facebook accounts, Picasa albums, and Google+ pages. File caching will treat each manifestation of the image as a different object and reach a low hit-rate; Bytestream caching, again, doesn't mind.
  6. Content lost due to inefficient storage: The typical user doesn't consume "a file" anymore. The average length of a YouTube video is 4'12", whereas the average view time is about 40 seconds (see "Bytemobile: Video Optimization Increases Video Clip Viewing Time by 50%" - here). Caching the whole file means that most of the storage volume is spent on bytes no one revisits. Bytestream caching stores only the most frequent bit-level fragments.
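The common thread in the examples above is deduplicating at the byte level rather than the file level. The sketch below is not DiViNetworks' actual algorithm - it is a simplified illustration using fixed-size chunks and SHA-256 fingerprints (both assumptions) to show why a re-purposed object that a file cache would treat as brand new is still mostly "old bytes" to a byte-level cache:

```python
import hashlib

CHUNK = 64  # bytes per fragment; a simplifying assumption for this sketch


def fingerprints(data: bytes) -> list[str]:
    """Fingerprint each fixed-size chunk of a byte stream."""
    return [hashlib.sha256(data[i:i + CHUNK]).hexdigest()
            for i in range(0, len(data), CHUNK)]


def hit_ratio(stream: bytes, cache: set[str]) -> float:
    """Fraction of the stream's chunks already in the byte-level cache;
    the cache then learns the chunks it has just seen."""
    fps = fingerprints(stream)
    hits = sum(fp in cache for fp in fps)
    cache.update(fps)
    return hits / len(fps)


cache: set[str] = set()
payload = bytes(range(256)) * 32       # an 8 KB stand-in for a video fragment
first = hit_ratio(payload, cache)      # first transfer primes the cache (0%)

# The same payload behind a different 64-byte header - the way one image
# reappears under different URLs on different social sites. A file-level
# cache sees a brand-new object; the byte-level cache sees mostly old bytes.
variant = b"X" * CHUNK + payload
second = hit_ratio(variant, cache)
print(f"first pass: {first:.0%}, re-purposed copy: {second:.0%}")
# first pass: 0%, re-purposed copy: 99%
```

Note that fixed-size chunks only deduplicate data that happens to stay aligned (here the header is exactly one chunk long); production byte-stream deduplication typically uses rolling-hash, content-defined chunking so that savings survive arbitrary insertions and offsets.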
