HTTP Dynamic Streaming Content Download and Playback

Recently I’ve been working on a system to play back HTTP Dynamic Streaming (HDS) content locally, from a single file. If you have seen my previous post on HTTP Dynamic Streaming (HDS) or are already familiar with it, you know that a media item is packaged as multiple segments and fragments that together make up the whole item. Similar to the image below:

A sample segment and its fragments

The system involves a client (an AIR application) requesting some remote HDS content (basically an F4M file). The client downloads the fragment data for the media item and writes it to disk. Instead of writing each fragment to a separate file, the fragment data is written to a single file. This part alone is pretty straightforward. The tricky part is playing the content back.
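The bookkeeping behind that single file is the important piece: as each fragment is downloaded, its bytes are appended and the resulting offset and length are recorded, so the fragment can be located again at playback time. Here is a minimal sketch of that idea in TypeScript (the actual implementation is ActionScript/AIR; the class and an in-memory array standing in for the file are hypothetical):

```typescript
// Hypothetical sketch: append each downloaded fragment to one growing
// byte sequence (standing in for the single local file) and record
// where each fragment landed so it can be read back later.
interface FragmentEntry {
  segment: number;   // segment the fragment belongs to
  fragment: number;  // fragment number within the media item
  offset: number;    // byte offset inside the single local file
  length: number;    // byte length of the fragment
}

class LocalFragmentStore {
  private index: FragmentEntry[] = [];
  private file: number[] = []; // stands in for the file on disk

  // Append one downloaded fragment and remember its location.
  write(segment: number, fragment: number, bytes: number[]): void {
    this.index.push({
      segment,
      fragment,
      offset: this.file.length,
      length: bytes.length,
    });
    this.file.push(...bytes);
  }

  // Look up a fragment and slice its bytes back out of the single file.
  read(segment: number, fragment: number): number[] | undefined {
    const entry = this.index.find(
      (e) => e.segment === segment && e.fragment === fragment
    );
    if (!entry) return undefined;
    return this.file.slice(entry.offset, entry.offset + entry.length);
  }
}
```

In the real AIR library the index would be persisted alongside the file, but the offset/length mapping is the core of it.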

A few problems needed to be overcome to get playback working. First, to get the local fragments to play back, I needed to fix an issue in the OSMF framework, which only accounts for requests for remote HDS fragments. This was accomplished by overriding the HTTPStreamingIndexHandler class and removing some code that assumed “http” would be part of the request. Second, and more importantly, I needed to intercept the request for an HDS fragment that OSMF generates during playback, use that request to determine where the fragment’s byte data lives in the local file created when the client downloaded the content, and then return that byte data to the rest of the OSMF code, which parses it into FLV data to pass to the appendBytes() method on the NetStream.
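The interception step hinges on recovering the segment and fragment numbers from the request OSMF generates. HDS fragment URLs conventionally end in a `Seg<n>-Frag<m>` suffix, so a small parser can turn the request into an index lookup against the local file. A hedged TypeScript sketch (the function name is hypothetical; the real code lives inside the overridden OSMF handler):

```typescript
// Hypothetical sketch: extract the segment/fragment numbers from an
// HDS-style fragment request (fragment URLs end in "Seg<n>-Frag<m>"),
// so the request can be redirected to a byte range in the local file
// instead of going out over HTTP.
function parseFragmentRequest(
  url: string
): { segment: number; fragment: number } | null {
  const match = /Seg(\d+)-Frag(\d+)$/.exec(url);
  if (!match) return null;
  return {
    segment: parseInt(match[1], 10),
    fragment: parseInt(match[2], 10),
  };
}
```

With the segment and fragment numbers in hand, the byte data is sliced out of the single local file and handed back to OSMF’s FLV parsing path as if it had arrived over the network.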

On top of that, we wanted to allow playback while the fragments were still downloading. On OS X this wasn’t a huge deal, because AIR on OS X can have multiple FileStreams open on the same file. On Windows, however, the file is locked by the first FileStream that opens it. This is a problem because I want to write downloaded fragment data to the file and read fragment data for playback at the same time. The issue was solved with a small utility library that uses a single FileStream instance and manages read and write requests by queuing them up and allowing only one to run at a time.
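The queuing approach can be sketched independently of the file APIs: every read or write is wrapped in an operation and chained onto the previous one, so only one operation ever touches the single FileStream at a time. A minimal TypeScript sketch of that pattern (the class name is hypothetical; the AIR version chains FileStream callbacks rather than Promises):

```typescript
// Hypothetical sketch: serialize file operations through one queue so
// only a single read or write is in flight at any moment -- the same
// idea as sharing one FileStream between the downloader and the player.
type FileOp<T> = () => Promise<T>;

class SerialFileQueue {
  private tail: Promise<unknown> = Promise.resolve();

  // Enqueue an operation; it runs only after every earlier one finishes.
  enqueue<T>(op: FileOp<T>): Promise<T> {
    const next = this.tail.then(op, op); // run even if a prior op failed
    this.tail = next.catch(() => undefined); // keep the chain alive on errors
    return next;
  }
}
```

A failed read or write rejects its own Promise but does not stall the queue, which matters when playback reads race ahead of the download’s writes.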

It was a huge headache, and a lot of time was spent in the OSMF rabbit hole, but I now have a great file I/O library for AIR and I’m able to download and play back HDS content locally.

12 Replies to “HTTP Dynamic Streaming Content Download and Playback”

  1. John, I'm really curious and would like more details (examples) on how you're achieving this. I'm currently using FMS 4.5 with the livepkgr application and have tried downloading the .f4f files, using the f4fpackager utility, flattening these files, and using something like Adobe Premiere to append each segment together. The issue I'm faced with is that there is a 4-second gap between each segment, probably due to either the keyframe interval (set in FMLE when I'm encoding) being 4 seconds or the FragmentDuration set in the events.xml file being 4000 ms (4 seconds). I'm a little apprehensive about modifying either of these values. I have to keep the keyframe interval at 4 seconds in FMLE in order to support HLS (.m3u8) for iPad and iPhone devices. I would greatly appreciate any tips you can provide.

    Thanks,

    Gio

    1. Giovanni – I'm using the OSMF library to play back the HDS content and have used the same playback logic for the situation described in the post. Basically I download the fragments and write them to a single file on the user's computer. Then, for playback, I pull the fragment data from that file and pass the fragment byte data to OSMF for playback. This is all in an Adobe AIR library, so it isn't really for playback in anything other than OSMF. Also, this is specific to HDS content; HLS content won't work in this situation.

  2. Hi,

    while it all sounds very easy, I still have problems reproducing it.

    Could you please post a few code snippets on how to modify HTTPStreamingIndexHandler and what else is needed to load a file locally for playback?

    Thanks!
