HDS & Bootstrap Data


Working with HDS Bootstrap Data

I’ve always been curious about the bootstrap data for HDS content. Recently, I had the chance to find out more about it and get in some fun development with Node.js. We’ve been kicking around the idea of building a tool set for Adobe Media Server using Node.js and possibly socket.io. Last weekend we got some of those ideas going, and one of them was parsing the HDS bootstrap data created when content is packaged for HDS delivery.

The bootstrap data can live in a couple of places:

  1. In the <metadata> node of an F4M file
  2. In an external .bootstrap file

The .bootstrap file contains binary data, and the F4M file contains the same binary data, Base64 encoded. So, getting to the data is pretty trivial – either read in the .bootstrap file or decode the Base64 string that is in the F4M. Getting to the data contained in the bootstrap binary data is the fun part.

Understanding the bootstrap data

To do so, check out the F4V file format specification. This PDF gives you the details for the entire F4V file format. If you read through the PDF, you’ll see that it is built using what are called “boxes”. These boxes are given identifiers such as “abst”, “adaf”, “adkm”, “aeib”, “afra”, and “afrt”, to name a few. Each box contains a header; that header identifies the box by its identifier and lets you know how much data is contained in the box. The boxes are also arranged into a hierarchy, so each box holds data that is specific to some part of the data contained in the file.
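To make that concrete, here is a hedged Node.js sketch of reading one box header. The layout (a 32-bit size, a four-character type, and a 64-bit extended size when the 32-bit size field is 1) follows the spec’s box definition:

```javascript
// Read one box header from a Buffer of F4V/bootstrap data.
function readBoxHeader(buf, offset) {
  let size = buf.readUInt32BE(offset);                        // total box size
  const type = buf.toString('ascii', offset + 4, offset + 8); // e.g. 'abst'
  let headerSize = 8;
  if (size === 1) {
    // A size of 1 means the real size follows as a 64-bit integer; read it
    // as two 32-bit halves to stay inside JS number precision.
    const hi = buf.readUInt32BE(offset + 8);
    const lo = buf.readUInt32BE(offset + 12);
    size = hi * 0x100000000 + lo;
    headerSize = 16;
  }
  return { type, size, headerSize };
}
```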

It is all in the boxes

The boxes that we are concerned with are “abst” or the bootstrap information box, “asrt” or the segment run table box, and “afrt” or the fragment run table box.

The abst box

The bootstrap information box contains information needed to bootstrap playing of HDS content – specifically to construct the URLs necessary to retrieve the fragments for playback. This includes information about the server, media, & segment information.

The asrt box

The segment run table box contains data about the segments for the media item. There can be multiple asrt boxes, each representing a different quality level. There are some rules that you’ll want to pay attention to for the data in the asrt box:

  • An asrt box can represent segment runs for several quality levels.
  • Each entry gives the first segment number for a run of segments with the same count of fragments.
    • The count of segments having this same count of fragments can be calculated by subtracting the first segment number in this entry from the first segment number in the next entry.
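That subtraction rule is easy to sketch in code. The entry object shape below is my own convenience, not the spec’s binary layout:

```javascript
// Given asrt entries sorted by firstSegment, compute how many segments
// each run covers by subtracting adjacent firstSegment values.
function segmentRunCounts(entries, totalSegments) {
  return entries.map((entry, i) => {
    const next = entries[i + 1];
    const segmentCount = (next ? next.firstSegment : totalSegments + 1) - entry.firstSegment;
    return { ...entry, segmentCount };
  });
}
```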

The afrt box

The fragment run table box is used to find the fragment corresponding to a given time. Similar to the asrt box, there are some rules that you’ll want to pay attention to:

  • Fragments are individually identifiable by the URL scheme based on segment number and fragment number.
  • Fragments may vary both in duration and in number of samples.
  • Durations of the fragments are stored in this box.
  • A Fragment Run Table may represent fragments for more than one quality level.
  • Each fragment run table entry gives the first fragment number for a run of fragments with the same duration.
    • The count of fragments having this same duration can be calculated by subtracting the first fragment number in this entry from the first fragment number in the next entry.
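Those duration runs are what let you map a playback time to a fragment number. A rough sketch, again with an assumed entry shape (timestamps and durations in the same timescale units):

```javascript
// Find the fragment whose run covers the requested time.
// entries must be sorted by firstFragmentTimestamp ascending.
function fragmentForTime(entries, time) {
  for (let i = entries.length - 1; i >= 0; i--) {
    const e = entries[i];
    if (time >= e.firstFragmentTimestamp) {
      return e.firstFragment +
        Math.floor((time - e.firstFragmentTimestamp) / e.fragmentDuration);
    }
  }
  return null; // time is before the first run
}
```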

Parsing the bootstrap data using Node.js

Parsing binary data in Node.js can be done using “Buffer”. For the most part, parsing the bootstrap data was pretty straightforward. There is one issue I ran into with 64-bit integers, which was solved easily enough (there are Node modules for just about anything) using the node-int64 module to represent the 64-bit integers. Once that was solved, it was just a matter of parsing through the box headers to figure out where you are in the dataset, and then creating the appropriate data structures to represent what you want and need from the bootstrap data.
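As an example of those 64-bit fields: the abst box body begins with a version byte, flags, a bootstrap version, a profile/live/update byte, and a timescale, followed by two 64-bit values (currentMediaTime and smpteTimeCodeOffset). The post used node-int64; on Node 12+ the built-in BigInt readers work too. The offsets below follow my reading of the spec, so double-check them:

```javascript
// Parse the timing fields from an abst box body (bodyOffset points just
// past the 8-byte box header).
function parseAbstTimes(buf, bodyOffset) {
  // version(1) + flags(3) + bootstrapInfoVersion(4) + profile/live/update(1)
  const timeScale = buf.readUInt32BE(bodyOffset + 9);              // ticks per second
  const currentMediaTime = buf.readBigUInt64BE(bodyOffset + 13);   // UI64
  const smpteTimeCodeOffset = buf.readBigUInt64BE(bodyOffset + 21); // UI64
  return { timeScale, currentMediaTime, smpteTimeCodeOffset };
}
```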

In our case we want to be able to monitor live events across multiple servers to make sure that they are all on the same segment and fragment. We’re building a service that, in case a server goes haywire, will notify another service that can then restart or shut down that particular server, or let caching servers know that they need to flush or refresh their cache. We’re still dreaming up things we can use this type of data for.

Just want to get to that data?

If you have a .bootstrap file you can use the f4fpackager.exe that is part of the Adobe Media Server toolset to inspect the bootstrap data. All you need to do is run the tool with the argument “--inspect-bootstrap”. So the command looks something like the following if you have a bootstrap file named mydata.bootstrap:

[shell]f4fpackager.exe --input-file=mydata.bootstrap --inspect-bootstrap[/shell]

Anyways, if you have any questions or input let me know in the comments.

Configuring DVR HTTP Live Streaming

Previously I showed you how to configure the Adobe Media Server to deliver Live HTTP Dynamic Streaming (HDS). Let’s add on a feature that will allow your users to pause and rewind the live video – DVR.

DVR HTTP Live Streaming

DVR can be (and actually already is) configured for both HTTP Dynamic Streaming (HDS) as well as HTTP Live Streaming (HLS). The configuration for HLS is a bit more involved, so we’ll tackle the configuration for live HDS streams first. I’ll be walking through the configuration of Adobe Media Server (AMS) 5+.

Configuring DVR for HDS

HDS DVR is controlled using a set-level manifest file. A set-level manifest file is an XML file that provides information about where to play media back from, as well as other configuration information for the stream. The set-level manifest file contains some or all of the following:

  1. A base URL.
  2. One or more <media> nodes that point to the media to be played.
  3. Information about DVR.

Below is an example of a set-level manifest file with multiple bit rates:

<manifest xmlns="http://ns.adobe.com/f4m/2.0">
    <media href="livestream1.f4m" bitrate="150"/>
    <media href="livestream2.f4m" bitrate="500"/>
    <media href="livestream3.f4m" bitrate="700"/>
</manifest>

To specify DVR information for this live HDS stream, we add a <dvrInfo/> node that contains a windowDuration attribute.

<manifest xmlns="http://ns.adobe.com/f4m/2.0">
    <dvrInfo windowDuration="30"/>
    <media href="livestream1.f4m" bitrate="150"/>
    <media href="livestream2.f4m" bitrate="500"/>
    <media href="livestream3.f4m" bitrate="700"/>
</manifest>

Setting the amount of recorded content

The value for windowDuration can be set to -1, meaning all of the recorded content is available, or it can be set to a number greater than zero (don’t set windowDuration to 0; it can cause problems). This number is the amount of recorded content, in seconds, that is available for the client to seek through. By default, Adobe Media Server keeps 3 hours of content. You can configure the amount of content the server keeps in the Application.xml or the Event.xml files.

To configure the disk management options in the Application.xml file, specify a value in hours for the <DiskManagementDuration> node. You can also use decimal values to specify minutes, as in the example below:
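A sketch of what that might look like; the placement under the HDS recording settings is my reading of Adobe’s disk management docs, so verify the nesting against your AMS version. A value of 4.5 keeps 4 hours and 30 minutes:

```xml
<Application>
    <HDS>
        <Recording>
            <DiskManagementDuration>4.5</DiskManagementDuration>
        </Recording>
    </HDS>
</Application>
```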


To specify the disk management options in the Event.xml, add the <DiskManagementDuration> node with the same parameters, similar to the example below:
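A sketch for the Event.xml; the placement under <Recording> is my assumption, so check it against Adobe’s disk management docs:

```xml
<Event>
    <EventID>myliveevent</EventID>
    <Recording>
        <DiskManagementDuration>4.5</DiskManagementDuration>
    </Recording>
</Event>
```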


Creating the set-level manifest file

A tool to generate set-level manifest files is installed with AMS. Known as the F4M Configurator, you can find it in the {AMS_INSTALL}/tools/f4mconfig/configurator directory. I’ve written another article on using the F4M Configurator, so I won’t go into that here. But, feel free to review that article – Using Adobe’s F4M Configurator Tool. Using the set-level manifest above we now have a manifest file that we can work with.

<manifest xmlns="http://ns.adobe.com/f4m/2.0">
    <dvrInfo windowDuration="30"/>
    <media href="mylivestream.f4m" bitrate="300"/>
</manifest>

Now that we have configured the amount of recorded media and created a set-level manifest file, we can test DVR for our live stream in a player.

  1. Upload the set-level manifest to a web server. I’ve uploaded mine to http://thekuroko.com/samples/hds/myliveevent.f4m
  2. Go ahead and start up your live stream.
  3. Then open the following URL: http://www.osmf.org/dev/2.0gm/setup.html
  4. Enter the path to your set-level manifest as the value for “src” in the “FlashVars” section.
  5. Set the “Stream Type” to “dvr”.
  6. Click the “Preview” button at the bottom of the form.

Your stream should play, but because we’ve added the <dvrInfo> node, the progress bar should reflect that you can now seek back into the content.

Without the dvrInfo:

DVR HTTP Live Streaming Control Bar Without DVR Info

With the dvrInfo:

DVR HTTP Live Streaming Control Bar With DVR Info

Configuring DVR for HLS

DVR for HLS is configured in a similar way to HDS, except you don’t need the set-level manifest.

The DVR or sliding window can be configured at an event level in the Event.xml file, at an application level in the Application.xml file, or at a server level in the Apache httpd.conf file.

Configuring at the event level

  1. Open the Event.xml file in {AMS_INSTALL}/applications/livepkgr/events/_definst_/myliveevent to configure the sliding window for the event named myliveevent in the livepkgr application.
  2. Update the Event.xml with the following XML to create a 1 hour sliding window:
  3. The MediaFileDuration sets the length of each .ts segment in milliseconds and the SlidingWindowLength value configures the number of .ts segments to keep around.
  4. The resulting Event.xml file should look like:
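Putting steps 2 through 4 together, the Event.xml might resemble the sketch below. The element names mirror the HLSMediaFileDuration / HLSSlidingWindowLength directives used in httpd.conf, and the placement under an <HLS> node is my assumption, so verify against your AMS version. 8000 ms segments times 450 segments gives the 1 hour window:

```xml
<Event>
    <EventID>myliveevent</EventID>
    <HLS>
        <MediaFileDuration>8000</MediaFileDuration>
        <SlidingWindowLength>450</SlidingWindowLength>
    </HLS>
</Event>
```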

Configuring HLS at the application level

  1. Open the Application.xml file in {AMS_INSTALL}/applications/livepkgr
  2. Add the same XML node set as the event level configuration to the Application.xml file.
  3. The resulting file should resemble the following:
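A sketch of the application-level version; as with the event-level example, the exact nesting is my assumption and should be checked against your AMS version:

```xml
<Application>
    <HLS>
        <MediaFileDuration>8000</MediaFileDuration>
        <SlidingWindowLength>450</SlidingWindowLength>
    </HLS>
</Application>
```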

Configuring HLS at the server level

  1. Open the httpd.conf file in {AMS_INSTALL}/Apache2.2/conf (If you are using a non-default Apache install, your httpd.conf file will be in a different location)
  2. Find the Location directive for “hls-live” & add/update the value for HLSMediaFileDuration to be 8000 and HLSSlidingWindowLength to be 450. This will set a sliding window duration of 1 hour for all live HLS streams. By default the sliding window is set to 48 seconds (6 * 8 second .ts files).
    <Location /hls-live>
        HLSHttpStreamingEnabled true
        HttpStreamingLiveEventPath "../applications"
        HttpStreamingContentPath "../applications"
        HLSMediaFileDuration 8000
        HLSSlidingWindowLength 450
        HLSFmsDirPath ".."
        HttpStreamingUnavailableResponseCode 503
    </Location>

Playing back the HLS content

There are quite a few ways to test the HLS content:

  1. Safari on a Mac with Flash disabled.
  2. Quicktime Player
    1. From the main menu choose “File” -> “Open Location”
    2. Then type in the URL to the .M3U8 file.
  3. VLC
    1. From the main menu choose “File” -> “Open Network”
    2. Then type in the URL to the .M3U8 file.
  4. Use an iOS device – either set up an HTML player like VideoJS or MediaElementJS, or open the .M3U8 URL directly. Currently this is the only way I was able to use the sliding window. The stream will play back in the other players, but the control bar will not reflect the available content to seek through.


Resources

  • Disk management: http://help.adobe.com/en_US/flashmediaserver/devguide/WSeb6b7485f9649bf23d103e5512e08f3a338-8000.html#WSec225f632fa00875-23954b6f1300b641158-8000

Creating Set-level Manifest Files Using the F4M Configurator Tool

Here is a quickie on how to use Adobe’s F4M configurator tool to create set level manifest files.

The configurator is installed with AMS 5.0 and can be found in the following directory: {AMS_INSTALL}/tools/f4mconfig/configurator/

  1. Open the f4mconfig.html file in a browser.
    Adobe Media Server - F4M Configurator Tool
  2. Enter the path to your server, application and event. For example for an event named “myliveevent” using the “livepkgr” application the Base URL would look like:
  3. If you are going to use DVR, enter a value for “DVR Window Duration”. A value of -1 configures the DVR window for all of the available content. A value greater than zero configures the amount of time in seconds available before the live point. We’ll set a 30 minute DVR window, so 1800 seconds.
  4. Enter the stream name and bit rate for each bit rate you are encoding. For this example let’s say we have a single bit rate of 300 for a stream named “mylivestream”
    Adobe's F4M Configurator - Stream Name and DVR Window
  5. Click the “Save Manifest” button. A file will be created and you will be prompted to save it. Save the file and open it.
  6. The file should look similar to the following:
    <manifest xmlns="http://ns.adobe.com/f4m/2.0">
      <dvrInfo windowDuration="1800"/>
      <media href="mylivestream" bitrate="300"/>
    </manifest>
  7. This file can now be used to specify live DVR content. If you add an additional bit rate, you now have a set-level F4M file for multi-bitrate streaming.

Hope this helps and saves you a bit of time.

Configure Adobe Flash Media Server for Live HTTP Dynamic Streaming

How to set up Live HTTP Dynamic Streaming

So you want to stream a live event using HTTP Dynamic Streaming (HDS) and HTTP Live Streaming (HLS)? No problem. Adobe Media Server (AMS) provides an out-of-the-box solution for you. To do so, you’ll need to:

  1. Download and install Flash Media Live Encoder (FMLE)
  2. Make a small configuration change to the encoder
  3. Setup your live event
  4. Begin streaming
  5. Set up a player

Installing and configuring Flash Media Live Encoder

  1. Download FMLE from http://www.adobe.com/products/flash-media-encoder.html
  2. Once it is installed open the config.xml file from
    1. Windows: C:\Program Files\Adobe\Flash Media Live Encoder 3.2\conf
    2. Mac: /Applications/Adobe/Flash Media Live Encoder 3.2/conf/
  3. Locate the “streamsynchronization” tag under flashmedialiveencoder_config -> mbrconfig -> streamsynchronization and set the value for “enable” to “true”. The streamsynchronization node should look similar to the following:
  4. Save and close the file.
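After step 3, the relevant fragment of config.xml should resemble the sketch below (surrounding nodes omitted; the nesting follows the path given in step 3):

```xml
<flashmedialiveencoder_config>
    <mbrconfig>
        <streamsynchronization>
            <!-- enable stream synchronization for multi-bitrate publishing -->
            <enable>true</enable>
        </streamsynchronization>
    </mbrconfig>
</flashmedialiveencoder_config>
```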

Setting up the live event

Streaming a live event involves using the “livepkgr” application that comes installed with AMS. The livepkgr application comes with a preconfigured event named liveevent. We’ll use this as a template for our live event.

  1. On your server navigate to the {AMS_INSTALL}/applications/livepkgr/events/_definst_ directory.
  2. We’re going to call our event “myliveevent”. Create a new directory and name it “myliveevent”.
  3. Open the newly created myliveevent directory and create a new XML file named “Event.xml”. This file is used to configure the just-in-time (JIT) packaging settings for your HDS content. Add the following XML to the file. Note: You can also copy the Event.xml file from the liveevent directory that is set up by default. Just update the EventID to match the folder name.

    For more information about the values in the Event.xml  file you can review Adobe’s documentation – link in the resources section below.

  4. Save and close the file.
  5. Your event is now set up. You can reuse this event all you want, or create another one for a different event name.
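For reference, the Event.xml described in step 3 might look like the sketch below. The FragmentDuration and SegmentDuration values (in milliseconds) are the ones from the default liveevent Event.xml as I recall it; treat them as assumptions and adjust as needed:

```xml
<Event>
    <EventID>myliveevent</EventID>
    <Recording>
        <FragmentDuration>4000</FragmentDuration>
        <SegmentDuration>400000</SegmentDuration>
    </Recording>
</Event>
```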

Begin streaming

Now we can start up FMLE and set it up to connect to our livepkgr application and begin streaming.

  1. In the left panel of FMLE make sure the “Video” and “Audio” sections are both checked.
  2. Video
    1. In the video section, set the format to be “H.264” and then click the button with the wrench icon.
    2. In the resulting pop-up window, make sure the settings match the following:
      1. Profile: Main
      2. Level: 3.1
      3. Keyframe Frequency: 4 seconds
        Live HTTP Dynamic Streaming H.264 Settings
    3. Click “OK” to close the pop-up window.
    4. In the “Bit Rate” section make sure you only have one of the bit rates selected. We’re only creating a single stream for now.
      Live HTTP Dynamic Streaming Video Encoder Settings
  3. Audio
    1. In the Audio section, set the format to “AAC”
      Live HTTP Dynamic Streaming Audio Encoder Settings
  4. In the right panel set “FMS URL” to point to your server and the livepkgr application:
    1. Example: rtmp://<your-server>/livepkgr
  5. Set the “Stream” value to be mylivestream?adbe-live-event=myliveevent
    1. “mylivestream” is the name of the stream and can be anything you’d like. The actual files that AMS creates will be stored in the livepkgr/streams/_definst_/mylivestream directory.
    2. “?adbe-live-event=myliveevent” tells the livepkgr application to use the Event.xml in the livepkgr/events/_definst_/myliveevent directory that we created.
      Live HTTP Dynamic Streaming RTMP Server Settings
  6. Click the “Connect” button. If all goes well, you’ll connect to your server. If not, check to make sure there aren’t any typos in the values for “FMS URL” and “Stream” and that you can connect to your server and it is running.
  7. Click the big green “Start” button to begin streaming.
    Live HTTP Dynamic Streaming Big Green Start Button
  8. You now have a stream. Let’s see if we can get a player to play it back.

Setting up the player

Getting to the HDS content for your new stream involves requesting a URL that lets Apache (installed with AMS) know what we are looking for. The path will consist of the following parts:

  1. The protocol: http://
  2. The server location (yours will be different from mine)
  3. The Location that is configured to deliver live streams. By default these are:
    1. HDS: hds-live/
    2. HLS: hls-live/
  4. The application name: livepkgr/
  5. The instance name (we’ll use the default): _definst_
  6. The event name: myliveevent
  7. The stream name: mylivestream
  8. The file extension: .f4m for HDS or .m3u8 for HLS.

So if we put all of that together we’ll get a URL that looks like:

  • HDS: http://<your-server>/hds-live/livepkgr/_definst_/myliveevent/mylivestream.f4m
  • HLS: http://<your-server>/hls-live/livepkgr/_definst_/myliveevent/mylivestream.m3u8

Note: You may need to add port 8134 to the URL if you didn’t install AMS on port 80.

  1. Open a browser window and navigate to that URL; you should see the F4M’s XML content.
    Live HTTP Streaming F4M XML
  2. Open the following URL: http://www.osmf.org/configurator/fmp/#
  3. Set your F4M url as the value for “Video Source”
  4. Select the “Yes” radio button for “Are you using HTTP Streaming or Flash Access 2.0?”
  5. Set “Autoplay Content” to “Yes”
    Live HTTP Dynamic Streaming Player Settings
  6. Click the Preview button at the bottom of the page.
  7. Congratulations. You are now streaming live media over HTTP.

To verify the HTTP streaming, open a tool that will let you inspect the HTTP traffic (something like Developer Tools or Firebug). You should see requests for resources like “mylivestreamSeg1-Frag52” and “mylivestream.bootstrap”. This is the player requesting HDS fragments, and Apache and AMS working together to package them just-in-time for the player.
Live HTTP Dynamic Streaming HTTP Traffic

Hopefully this provides you with some good information about Live HTTP Dynamic Streaming and clarifies some of the setup and configuration details. Please, if you have any questions, let me know in the comments or contact me.