I have an mp4/h.264 video that is around 300 MB. I would like users to be able to watch it on both desktop browsers and mobile devices. I have the video file stored on Amazon S3.
I have two CloudFront distributions enabled – one for download and one for streaming.
I know I can use something like VideoJS to play the file, with the 'video/mp4' source pointing at the "download" CloudFront distribution. That, I believe, plays the file while it is being downloaded – I think this is called "progressive download" – but it is not true streaming.
I also realize that I can use something like JWPlayer with the "streaming" CloudFront distribution to give users on a desktop, or a device that supports Flash, the RTMP streaming experience.
The problem I’m facing is poor performance on iOS devices – especially ones with limited bandwidth.
From what I've been reading, HTTP Live Streaming (HLS) seems to be the only streaming protocol that iOS devices really support. And as far as I can tell, Amazon CloudFront only natively offers RTMP streaming of uploaded mpeg/h.264 files (not HLS).
Amazon has a Live Smooth Streaming tutorial for use with Amazon CloudFront, which appears to be, in a nutshell, a Windows IIS server running Adobe Flash Media Server and pushing out the HLS stream via Amazon CloudFront.
So, my questions were:
- If I want to push H.264/mpeg4 files "streaming" to iOS devices or HTML5 players, is the "right" way to do this to use HLS?
- Can I push out stored H.264/mpeg4 files via HLS, or does it actually need to be a "live" feed of video?
- Do I need to use a Live Smooth Streaming server to do the previous two points?
- Do I need Wowza? I have seen that Wowza sells licenses for their HLS streaming software at $5/day or something like that, but I would really like the solution to be tied into AWS / self-hosted if possible.
What I determined about HLS and Amazon CloudFront
- Yes, HLS is currently the only streaming protocol that iOS devices and HTML5 players support
- HLS uses a link to a .m3u8 file, which is a playlist of .ts (transport stream) files. Basically, the device can download "chunks" of the larger file in .ts format depending on which section is requested. After a chunk is downloaded, a user can seek to the position they want and will get close to it, depending on which "I" frame is closest to the seek point. You can add more "I" frames, but that increases file size, and someone said it could potentially decrease quality. In short: use .ts files, and it does not need to be an actual live stream
- The Live Smooth Streaming server is actually for “live” broadcasts and so it is not needed for HLS of stored files
- Wowza is not needed
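To make the playlist/chunk relationship concrete, here is a minimal sketch of what a .m3u8 media playlist looks like (the segment names and 4-second durations are made up for illustration, not taken from a real encode):

```text
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:4
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:4.0,
stream00000.ts
#EXTINF:4.0,
stream00001.ts
#EXTINF:4.0,
stream00002.ts
#EXT-X-ENDLIST
```

The device fetches this playlist first, then requests the .ts segments one at a time, which is what makes seeking (and downloading only the section the user wants) possible.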
How to use HLS with Amazon CloudFront (step by step)
- Assuming you have h.264/mpeg4 (.mov) files, use ffmpeg to convert them into several .ts files. You can even convert them into sets of .ts files at different resolutions; with the beauty of HTML5 and your .m3u8 file, you can list the resolutions available and the device will pick the biggest one it can fully utilize (i.e. if the screen only supports 640×480 resolution, it won't pick the 720p version). You will probably need to get the latest version of ffmpeg from the git repository and build it yourself in order for it to be able to handle the .ts type of files. This is also true if you use a Ruby on Rails wrapper or gem like streamio-ffmpeg, which actually just forks an ffmpeg process to do the conversion, get video information, etc.
I got some help on Stack Overflow with the ffmpeg syntax for the .ts conversion, which shows how to do it:
./ffmpeg -v 9 -loglevel 99 -re -i sourcefile.avi -an \
    -c:v libx264 -b:v 128k -vpre ipod320 \
    -flags -global_header -map 0 -f segment -segment_time 4 \
    -segment_list test.m3u8 -segment_format mpegts stream%05d.ts
And also how you need to compile ffmpeg to get the support for doing the conversions:
export PKG_CONFIG_PATH="/usr/lib/pkgconfig/:../rtmpdump-2.3/librtmp"
./configure --enable-librtmp --enable-libx264 \
    --libdir='../x264/:/usr/local/lib:../rtmpdump-2.3' \
    --enable-gpl --enable-pthreads --enable-libvpx \
    --disable-ffplay --disable-ffserver --disable-shared --enable-debug
- Create a .m3u8 playlist of the .ts-encoded chunks of your video; this playlist is what makes it HLS (HTTP Live Streaming)
- Upload them to your server – probably an Amazon S3 bucket, which can be linked to CloudFront for fastest downloads. Set up the CloudFront distribution to use "download" and not "streaming", because "streaming" is only for RTMP, which is only supported by Flash players
- Feed the .m3u8 playlist file to your HTML5 video tag / video player.
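The multiple-resolutions idea above works through a top-level "master" playlist: a .m3u8 that lists the per-resolution playlists, so the device can pick the variant that fits its screen and bandwidth. A sketch of what that looks like (the bandwidth figures and file names are illustrative assumptions, not measured values):

```text
#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=400000,RESOLUTION=640x480
video_480.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1500000,RESOLUTION=1280x720
video_720.m3u8
```

Each EXT-X-STREAM-INF line describes one variant, and the line after it points at that variant's own media playlist of .ts segments.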
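For the last step, feeding the playlist to an HTML5 video tag can be as simple as the following sketch (the CloudFront URL is a placeholder; note that native .m3u8 playback in the video tag works on iOS/Safari, while other desktop browsers generally need a player like JWPlayer or a Flash fallback):

```html
<!-- Point the first source at the master (or media) .m3u8 playlist -->
<video controls width="640" height="480">
  <source src="https://example.cloudfront.net/video/master.m3u8"
          type="application/x-mpegURL">
  <!-- Progressive-download fallback for browsers without HLS support -->
  <source src="https://example.cloudfront.net/video/video.mp4"
          type="video/mp4">
</video>
```

Browsers pick the first source they can play, so iOS devices get the HLS stream and everything else falls through to the mp4.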
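The -segment_list option in the ffmpeg command above normally writes the media playlist for you, but the format is simple enough to generate (or repair) by hand. A minimal shell sketch, assuming fixed 4-second segments named stream00000.ts, stream00001.ts, and so on (the names match the stream%05d.ts pattern used earlier; the segment count here is arbitrary):

```shell
#!/bin/sh
# Write a minimal HLS media playlist for pre-cut .ts segments.
# Assumes every segment is SEGMENT_TIME seconds long.
SEGMENT_TIME=4
NUM_SEGMENTS=3

{
  printf '#EXTM3U\n'
  printf '#EXT-X-VERSION:3\n'
  printf '#EXT-X-TARGETDURATION:%s\n' "$SEGMENT_TIME"
  printf '#EXT-X-MEDIA-SEQUENCE:0\n'
  i=0
  while [ "$i" -lt "$NUM_SEGMENTS" ]; do
    # One EXTINF duration line per segment, then the segment file name
    printf '#EXTINF:%s.0,\n' "$SEGMENT_TIME"
    printf 'stream%05d.ts\n' "$i"
    i=$((i + 1))
  done
  # ENDLIST marks this as video-on-demand rather than a live stream
  printf '#EXT-X-ENDLIST\n'
} > test.m3u8

cat test.m3u8
```

The #EXT-X-ENDLIST tag at the bottom is what tells the player this is a complete, stored file rather than a live broadcast.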
Both the JWPlayer and VideoJS players appear to support HLS streaming.
I did the conversions on an Ubuntu 12.04 box and upgraded Ruby to 1.9.1 / 1.9.3 to test out streamio-ffmpeg, but all I really needed was the latest version of ffmpeg, which I cloned using git clone git://source.ffmpeg.org/ffmpeg.git ffmpeg – visit the ffmpeg page for more details.