Top 5 Things I've Messed Up in Live Streaming

June 24, 2014 - SF Video Technology Meetup. Live streaming can be difficult and challenging, yet extremely rewarding. In the video gaming world, it has become the most powerful way of reaching your audience. It combines traditional web technologies with even more traditional broadcast video stacks in an "it's going live now" environment. In this presentation Lee Chen, head of product at Fastly, goes over the top 5 things that have cost him sleep, raised his eyebrows, and sometimes even made him fail, and what he did to correct them.

Fastly

June 24, 2014

Transcript

  1. About me
     •  GotFrag.com
     •  Major League Gaming
     •  Product Manager for Streaming Media at Fastly
     •  Head of Product at Fastly
  2. What’s this all about?
     •  Live streaming (not live linear)
     •  The (comedic) errors:
        5. Forest, Trees. The big picture.
        4. Latency sucks.
        3. Failover. Not Fall Over.
        2. Redundancy is not redundant.
        1. Death by a thousand cuts. Measure it!
  3. Some assumptions
     •  What:
        –  Live streaming is hard
        –  It shouldn’t be.
        –  But it is.
        –  It’s real time.
        –  So it’s hard.
     •  Why:
        –  “Analog” broadcast to “Digital” medium
        –  Every step of the signal path can break it
        –  And then there’s the Internet.
  4. #5: The whole picture is important
     •  Analog to Digital
     •  Wires to Code
     •  Complexity
     •  Troubleshooting and QA is therefore hard
     •  Studio != Touring Show
  5. #5: What got F’d
     •  Audio out of sync
     •  Root cause was on the switcher <> audio board
     •  Keyframe interval >1s on the encoder compounded the problem
  6. #4: Spoilers suck
     •  Time Behind Live (TBL) is critical
     •  Broadcast standard is <15s TBL
     •  In other words, latency sucks.
        –  Latency = Buffering, jitters
        –  Latency = Spoilers
        –  Latency = WTF just happened.
  7. Latency sources (digital side)

     Source                                         Ideal / Avg Latency
     ------                                         -------------------
     Encoding                                       <100ms
     Uplink                                         IP = <100ms, Sat = <1500ms
     Ingest                                         <10ms
     Transcoding                                    50–500ms
     CDN                                            150–1500ms
     Last mile: Player / Browser / User / Internet  ?????

     Total: 3610ms (3.6sec) + last mile
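The 3610ms total on this slide can be sanity-checked by summing the worst-case value for each stage (a sketch; the numbers come from the table above, with the satellite uplink figure taken as the worst case, and the unbounded last mile left out):

```python
# Worst-case per-stage latencies (ms) from the slide, excluding the
# last mile (player / browser / user / Internet), which is unbounded.
stages = {
    "encoding": 100,
    "uplink": 1500,      # satellite worst case; IP uplink is <100ms
    "ingest": 10,
    "transcoding": 500,
    "cdn": 1500,
}

total_ms = sum(stages.values())
print(f"Total: {total_ms}ms ({total_ms / 1000}sec) + last mile")
# 100 + 1500 + 10 + 500 + 1500 = 3610ms, matching the slide.
```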
  8. #4: What got F’d
     •  RTMP delivery sucks on a CDN
        –  5min into broadcast = 5sec TBL
        –  30min into broadcast = 30sec TBL
        –  8hrs into broadcast = 2min TBL
     •  Trolls hate spoilers. So do legit fans.
     •  Separate stack from normal delivery, older hardware
     •  HTTP streaming is the answer (HLS!!)
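Why HTTP streaming caps Time Behind Live where RTMP drifted: with HLS, how far behind the live edge you sit depends on segment length and player buffer depth, not on how long the broadcast has been running. A minimal sketch (the segment duration and buffer depth here are illustrative assumptions; the HLS spec recommends players start roughly three target durations back from the live edge):

```python
# With HLS, steady-state TBL is bounded by the playlist, not by
# broadcast duration: roughly segment duration times the number of
# segments the player holds back from the live edge.
segment_duration_s = 6   # assumed segment length
segments_buffered = 3    # HLS clients typically start ~3 segments back

tbl_s = segment_duration_s * segments_buffered
print(f"Steady-state TBL ~= {tbl_s}s, whether 5 minutes or 8 hours in")
```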
  9. #3: What got F’d
     •  Just because the vendor says so doesn’t mean it _IS_ so.
     •  Test it.
     •  No really, TEST IT.
     •  Can you fail back and forth?
     •  Does it cause disconnects at the player?
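"Test it" can start as small as a health probe you run against both sides of the failover pair, in both directions. A sketch, with entirely hypothetical endpoint URLs; a real test would additionally kill the primary, confirm the backup takes traffic, restore the primary, and confirm it takes traffic back — all while a test player stays connected:

```python
import urllib.request

# Hypothetical primary/backup ingest health endpoints. The slide's
# point: "the vendor says it fails over" must be verified, repeatedly,
# and in both directions.
ENDPOINTS = {
    "primary": "http://primary.example.com/health",
    "backup": "http://backup.example.com/health",
}

def is_healthy(url, timeout=2):
    """Return True only if the endpoint answers 200 within the timeout."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:  # DNS failure, refused connection, timeout, ...
        return False

for name, url in ENDPOINTS.items():
    print(name, "healthy" if is_healthy(url) else "DOWN")
```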
  10. #2: Redundancy is not redundant
      •  Analog
         –  Matrixed encoders
         –  Multiple uplinks
         –  Backup routers
      •  Digital
         –  Cloud transcoding
         –  Multi-CDN
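On the digital side, multi-CDN is only redundant if the client can actually switch edges when one fails. A minimal client-side sketch (the CDN hostnames are hypothetical):

```python
import random

# Hypothetical multi-CDN edge hostnames. Redundancy is only redundant
# if a player can move between them mid-stream.
CDNS = ["cdn-a.example.com", "cdn-b.example.com"]

def pick_edge(unhealthy=frozenset()):
    """Choose a healthy CDN edge at random; fail loudly if none remain."""
    healthy = [c for c in CDNS if c not in unhealthy]
    if not healthy:
        raise RuntimeError("no healthy CDN -- redundancy was not redundant")
    return random.choice(healthy)

print(pick_edge())                                  # either CDN
print(pick_edge(unhealthy={"cdn-a.example.com"}))   # forced onto cdn-b
```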
  11. #2: What got F’d
      •  Production budget: $10k/hr for sat truck
      •  Upstream ISP link from venue died
      •  Fix: microwave point-to-point to the Luxor.
  12. #1: Real-time feedback. Flexible solutions
      Digital feedback:
      •  Analytics
         –  Disconnects
         –  Buffering
         –  Edge usage
         –  Load on systems
         –  Geo distro
         –  etc.
      Analog feedback:
      •  Leverage the audience
         –  Twitter
      •  Watch the stream yourself
      •  Filter the noise
      Flexible partners and solutions
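The digital-feedback bullets boil down to aggregating player events fast enough to spot a regional problem through the noise. A sketch with made-up sample events (the event names and regions are illustrative, not a real analytics schema):

```python
from collections import Counter

# Hypothetical player heartbeat events: (event_type, region).
events = [
    ("buffering", "us-west"), ("disconnect", "eu"),
    ("buffering", "us-west"), ("buffering", "eu"),
    ("disconnect", "us-west"),
]

# Count problem events overall and buffering by geo, so a spike in one
# region stands out from the background.
by_type = Counter(kind for kind, _ in events)
by_geo = Counter(region for kind, region in events if kind == "buffering")

print(by_type)  # buffering: 3, disconnect: 2
print(by_geo)   # us-west: 2, eu: 1
```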
  13. #1: What got F’d
      •  The tendency is to keep the digital side isolated from the analog side.
      •  Don’t. Analog thinks about the room. Digital thinks about the “Internets.” Put the two together.