• How to use the FUSE-based file system backed by Amazon S3

    Wowza® media server software Amazon Machine Images (AMIs) include a preinstalled s3fs package. s3fs is a Filesystem in Userspace (FUSE) virtual file system that enables you to mount an Amazon Simple Storage Service (Amazon S3) bucket as a local file system on a Wowza Media Server for Amazon EC2 instance. You can copy videos recorded by Wowza Media Server to the mounted directory so that they can be streamed immediately by video on demand players using the vods3 application. This article describes how to use s3fs.

    Note: An s3fs mount shouldn't be recorded to directly (see this Wowza forum post). It also shouldn't be streamed from directly. Instead, use the vods3 application to stream from Amazon S3 storage. For more information, see Streaming video on demand from Amazon S3 (vods3).

    Note: For more information about how to create a FUSE-based filesystem backed by Amazon S3, see FuseOverAmazon.

    To mount an Amazon S3 bucket using s3fs

    1. Open a Secure Shell (SSH) session to your Wowza Media Server for Amazon EC2 instance, and then create a new directory to serve as the mount point:
      Code:
      mkdir /mnt/s3
      For more information about how to connect over SSH, see the Wowza Media Server for Amazon EC2 User's Guide.

    2. Specify your security credentials (access key ID and secret access key) in the /etc/passwd-s3fs file:

      1. Create the s3fs password file:
        Code:
        vi /etc/passwd-s3fs
      2. In the s3fs password file, insert the security credentials (access key ID and secret access key) using the following format:
        Code:
        [accessKeyId]:[secretAccessKey]
      3. Save the file and then run the following command:
        Code:
        chmod 640 /etc/passwd-s3fs
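      The three sub-steps above can be sketched as one non-interactive script. The credentials shown are AWS's documented example key pair; substitute your own. The real file belongs at /etc/passwd-s3fs (created as root); a variable is used here so the commands can be tried anywhere:

```shell
# Sketch of step 2: create the s3fs password file and restrict its
# permissions. The article's location is /etc/passwd-s3fs; override
# PASSWD_FILE to try this without root.
PASSWD_FILE="${PASSWD_FILE:-./passwd-s3fs}"

# AWS's documented example access key ID and secret access key --
# placeholders only, replace with your own credentials.
printf '%s:%s\n' 'AKIAIOSFODNN7EXAMPLE' \
    'wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY' > "$PASSWD_FILE"

# Restrict the file so only the owner can write and the group can read.
chmod 640 "$PASSWD_FILE"
```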
    3. Mount the Amazon S3 bucket:
      Code:
      /usr/local/bin/s3fs your.S3.Bucket -o default_acl=public-read /mnt/s3
      Note: Earlier AMIs had the s3fs command located at /usr/bin/s3fs.
      Note: To ensure that the bucket is remounted when the instance is rebooted, add the above command to your /etc/rc.local file.
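      Putting the two notes together, the /etc/rc.local addition might look like the following sketch (the s3fs path, bucket name, and mount point are the ones used in this article; adjust them for your AMI and account):

```shell
# /etc/rc.local -- remount the Amazon S3 bucket after a reboot.
# Older AMIs install s3fs at /usr/bin/s3fs; adjust the path as needed.
/usr/local/bin/s3fs your.S3.Bucket -o default_acl=public-read /mnt/s3
```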

    4. Add the ModuleMediaWriterFileMover module definition to the configuration file for your application that records video ([install-dir]/conf/[application]/Application.xml). This module enables files that have been written/recorded to disk to be copied or moved.
      Code:
      <Module>
      	<Name>ModuleMediaWriterFileMover</Name>
      	<Description>ModuleMediaWriterFileMover</Description>
      	<Class>com.wowza.wms.module.ModuleMediaWriterFileMover</Class>
      </Module>
    5. Add the following application level properties to the <Properties> container at the bottom of [install-dir]/conf/[application]/Application.xml to control where to copy the file after recording is complete:
      Code:
      <Property>
      	<Name>fileMoverDestinationPath</Name>
      	<Value>/mnt/s3</Value>
      </Property>
      <Property>
      	<Name>fileMoverDeleteOriginal</Name>
      	<Value>false</Value>
      	<Type>Boolean</Type>
      </Property>
      <Property>
      	<Name>fileMoverVersionFile</Name>
      	<Value>true</Value>
      	<Type>Boolean</Type>
      </Property>
      For property details, see How to move recordings from live streams (ModuleMediaWriterFileMover).


    Note: If you need finer control of the move, you can use the IMediaWriterActionNotify interface.


    Comments
    1. wrexhamthe11th -
      Perhaps it should be mentioned that the server needs to be restarted after these changes.
    1. rrlanham -
      It shouldn't be necessary.

      Richard
    1. jpstrikesback -
      Could attaching the S3FS bucket be automated in a startup package?
    1. rrlanham -
      You can try adding it to the init.sh file (or use another script file and reference to it in the startup.xml file.)

      I'm not sure if it will work. (I think I tried it and it didn't a while ago, but I'm not certain).

      Richard
    1. digibones -
      Hi JP

      You can tell the system to mount it at startup with all the other volumes. As root, edit /etc/fstab and add this line:
      s3fs#mybucket /mnt/s3 fuse allow_other,use_cache=/tmp,default_acl="private" 0 0

      then create a file called /etc/passwd-s3fs. Make the permissions on this file as restrictive as possible: (root:root 400)
      edit the file and enter your Access Key ID and your Secret Key separated by a colon with no spaces between:
      AccessKeyID:SecretKey

      The bucket will now be mounted automatically at sysinit as owner root.

      to mount it right now manually enter
      mount /mnt/s3
      or
      mount -a

      to see all mounted volumes enter
      df -h
      you should see your bucket
      s3fs 256T 0 256T 0% /mnt/s3
      you have 256 Terabytes free!


      Please note: the newer version of s3fs does not accept the command-line key options accessKeyId and secretAccessKey. They must be in the password file or set as environment variables.
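      A minimal sketch of the environment-variable alternative mentioned above (the variable names are the ones documented by s3fs; the keys are AWS's published example pair, not real credentials):

```shell
# Supply the credentials through the environment instead of the
# password file. Keys shown are AWS's documented examples -- substitute
# your own access key ID and secret access key.
export AWSACCESSKEYID='AKIAIOSFODNN7EXAMPLE'
export AWSSECRETACCESSKEY='wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY'

# s3fs mybucket /mnt/s3   # s3fs reads the keys from the environment
```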
    1. karuna.pai -
      Can the same S3 bucket be mounted on multiple wowza EC2 instances?
    1. charlie -
      Yes, I think that is possible. The S3 interface is HTTP so there is no reason it cannot be mounted on multiple EC2 instances.

      Charlie
    1. jnicholas -
      The copy step worked great but when I try to download then play the videos (h264 mp4) in quicktime or on a phone the image is blank and I only hear audio. VLC can play the video. Is there anything unusual about the recording format that makes them non-standard?
    1. rrlanham -
      Wowza writes the moov atom at the end of the file, and that affects progressive download playback.

      Richard
    1. jnicholas -
      Quote Originally Posted by rrlanham View Post
      Quote Originally Posted by jnicholas View Post
      The copy step worked great but when I try to download then play the videos (h264 mp4) in quicktime or on a phone the image is blank and I only hear audio. VLC can play the video. Is there anything unusual about the recording format that makes them non-standard?
      Wowza writes the moov atom at the end of the file, and that affects progressive download playback.

      Richard
      This isn't about moov atom, I've fixed that and also this happens locally.

      The videos play back to flash or iphone through Wowza or through flash with FMS. I can also play it over http to VLC. But it only plays with sound on QT and iphone and roku can't play it at all.
    1. rrlanham -
      I'm not sure. It depends in part on the encoding of the live stream that you recorded. I have played back content recorded by Wowza in many players, but it might not work in some players some of the time, due to either moov atom placement (albeit fixed in your case) or encoding parameters.

      Richard
    1. jnicholas -
      Quote Originally Posted by rrlanham View Post
      I'm not sure. It depends in part on the encoding of the live stream that you recorded. I have played back content recorded by Wowza in many players, but it might not work in some players some of the time, due to either moov atom placement (albeit fixed in your case) or encoding parameters.

      Richard
      The iphone can play back the stream live and it can play it back when the wowza serves it back as a VOD. It's only when I take the recording that the wowza saved and move it to another server. I think that rules out encoding parameters.
    1. rrlanham -
      You mean a web server, right? Then playing in QuickTime, iPhone, and Roku over HTTP from the web server. That's progressive download. You say you fixed the moov atom, but it's not really supported beyond this point. It does stream.

      Richard
    1. cnfcnf -
      "You can then use that directory to copy videos recorded by Wowza when the recording process is complete so they can be streamed immediately by video on demand player using the vodS3 application." I am using an nDVR and one of the main functions is to allow the playback of the user's currently recorded stream. My goal was to use one machine on Amazon to record streams and another to play back streams (including the one currently being recorded). How can I enable this scenario of decoupling the two machines if the system cannot write the live recording to S3?
    1. rrlanham -
      I don't think you will be able to use S3 and the vods3 application with nDVR files.

      Richard
    1. ScottKell -
      Quote Originally Posted by cnfcnf View Post
      "You can then use that directory to copy videos recorded by Wowza when the recording process is complete so they can be streamed immediately by video on demand player using the vodS3 application." I am using an nDVR and one of the main functions is to allow the playback of the user's currently recorded stream. My goal was to use one machine on Amazon to record streams and another to play back streams (including the one currently being recorded). How can I enable this scenario of decoupling the two machines if the system cannot write the live recording to S3?
      That sentence isn't completely clear with respect to DVR, but it refers to recording VOD assets, not DVR. Decoupling the DVR server from the DVR storage is not a good idea.
    1. corarene -
      Is there a Windows version of these instructions?
    1. rrlanham -
      No, but this comes up at the top of a search for "s3fs windows":

      http://wins3fs.codeplex.com/

      Richard
    1. jakehilton -
      I've had success streaming from an s3 mount with no problems.. I just make sure to use the local cache setting that s3fs exposes: use_cache.

      I've also been using this on a good number of machines to distribute playback load with much success.. these have all been on linux and so I can't comment on success rates for win.

      Jake