WebM video streams can be played in the browser through HTML5; both Chrome and Firefox already support the format. WebM decoding is implemented by the VP8 engine, with many optimizations aimed at Internet use. Its advantage is that playback happens in HTML5, which makes it truly platform independent: any browser that ships the VP8 engine can play WebM video directly. Of course, many browsers do not use VP8, and no well-known streaming server supports WebM, which is WebM's predicament. This article introduces how to use the famous FFmpeg as a WebM streaming server.
FFserver is a streaming server that helps you turn audio and video content into streams for delivery over the Internet. It can collect multiple input streams and transcode -> remux -> broadcast each of them, as shown in the figure below.
Multiple input sources are fed into the broadcast server, and the media content is then distributed to many clients. The point of the figure above is to show explicitly that your streaming system can be split into blocks deployed across the network, letting you broadcast different live content without changing the structure of the streaming system.
FFserver has four constituent parts: input sources, feeds, streams, and media players, as shown in the figure below.
Input sources are not part of ffserver's internals; normally an external application sends the audio/video stream to ffserver. Since FFmpeg is the input source in most cases, this article uses ffmpeg as the example. The input source first connects to the server and binds itself to a feed. A feed can be bound to only one source, so an input source can attach only to a feed that is still unbound. One input source may be bound to several feeds, but that only makes sense when the source produces several different media streams; sending the same stream to different feeds is pointless, because ffserver can already serve a single feed through different output streams.
A feed is an internal component of ffserver whose purpose is to bind one input stream to one or more output streams. Binding a feed to several output streams is useful, because you may want to publish the same input in several media formats at once. In short, each feed logically represents one input stream.
A stream is an internal component of ffserver that represents an access point: any client that wants to watch the stream can connect to it. For example, if for the same input you want both an HD version and a small mobile-sized version, you bind the feed to two streams. The biggest difference between a feed and a stream is that a stream can hold multiple client connections, whereas a feed normally links to only one stream.
The media player is not part of ffserver; it simply represents a client that connects to the streaming server because it is interested in the media content.
When client machines actually connect, FFserver runs as a daemon. It needs enough bandwidth to deliver the video stream to all connected clients; the encoding is done by FFmpeg, so the host running FFserver does not need much computing power.
Below is an example ffserver.conf in which the server defines one Feed and one Stream. The Feed acts as the input end, passing video to the Stream; the Stream receives the feed's data and transcodes it to WebM with the configured bitrate and codecs. Clients get the live WebM stream simply by requesting the Stream URL. The server also defines a status.html stream, used to inspect the state of every stream.
Port 8090                       # Port to bind the server to
BindAddress 0.0.0.0
MaxHTTPConnections 2000
MaxClients 1000
MaxBandwidth 10000              # Maximum bandwidth per client
                                # set this high enough to exceed stream bitrate
CustomLog -
NoDaemon                        # Remove this if you want FFserver to daemonize after start

<Feed feed1.ffm>                # This is the input feed where FFmpeg will send
File ./feed1.ffm                # video stream.
FileMaxSize 64M                 # Maximum file size for buffering video
ACL allow 127.0.0.1             # Allowed IPs
</Feed>

<Stream test.webm>              # Output stream URL definition
Feed feed1.ffm                  # Feed from which to receive video
Format webm

# Audio settings
AudioCodec vorbis
AudioBitRate 64                 # Audio bitrate

# Video settings
VideoCodec libvpx
VideoSize 720x576               # Video resolution
VideoFrameRate 25               # Video FPS
AVOptionVideo flags +global_header  # Parameters passed to encoder
                                    # (same as ffmpeg command-line parameters)
AVOptionVideo cpu-used 0
AVOptionVideo qmin 10
AVOptionVideo qmax 42
AVOptionVideo quality good
AVOptionAudio flags +global_header
PreRoll 15
StartSendOnKey
VideoBitRate 400                # Video bitrate
</Stream>

<Stream status.html>            # Server status URL
Format status
# Only allow local people to get the status
ACL allow localhost
ACL allow 192.168.0.0 192.168.255.255
</Stream>

<Redirect index.html>           # Just an URL redirect for index
# Redirect index.html to the appropriate site
URL http://www.ffmpeg.org/
</Redirect>
When ffserver starts it reads /etc/ffserver.conf by default; the -f option lets you point it at a different configuration file:
ffserver -f ffserver.conf
The result is shown in the figure below; ffserver has started successfully.
Open http://localhost:8090/status.html to see the current status of every stream on the server.
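From the command line you can do the same check with curl, and open the output stream with ffplay. A minimal sketch, assuming ffserver is running locally with the configuration above, where the output stream is named test.webm:

```shell
# Assumes ffserver is running on localhost:8090 with the config shown above.
curl -s http://localhost:8090/status.html    # dump the status page to stdout

# The live WebM stream is reachable at the <Stream> name from the config;
# any WebM-capable player (or a browser via an HTML5 <video> tag) can open it:
ffplay http://localhost:8090/test.webm
```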
Once FFserver is running, push a video stream to http://localhost:8090/feed1.ffm. Note that you do not need to choose an encoding here: FFserver re-encodes the input itself.
The video stream can come from a file, a camera, or a screen recording.
1. For example, take a video stream from a file and push it into the FFM feed.
ffmpeg -i testvideo.mp4 http://localhost:8090/feed1.ffm
ffmpeg reads the video from testvideo.mp4 and sends it to feed1.ffm, which the Stream then encodes. If you want ffmpeg to feed the data at the video's native frame rate, use the -re option to force it to read the input according to the timestamps:
ffmpeg -re -i testvideo.mp4 http://localhost:8090/feed1.ffm
The output looks like this:
ffmpeg version N-56125-gb4e1630-syslin Copyright (c) 2000-2013 the FFmpeg developers
  built on Sep 9 2013 15:23:52 with gcc 4.4.7 (Ubuntu/Linaro 4.4.7-2ubuntu1)
  configuration: --prefix=/usr/local/ffmpeg --enable-shared --enable-nonfree --enable-gpl --enable-pthreads --disable-yasm --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libfaac --enable-libmp3lame --enable-libtheora --enable-libvorbis --enable-libx264 --enable-libxvid --enable-libvpx --enable-x11grab --extra-cflags=-I/usr/local/ffmpeg/include/ --extra-ldflags=-L/usr/local/ffmpeg/lib --enable-version3 --extra-version=syslin
  libavutil      52. 43.100 / 52. 43.100
  libavcodec     55. 31.101 / 55. 31.101
  libavformat    55. 16.101 / 55. 16.101
  libavdevice    55.  3.100 / 55.  3.100
  libavfilter     3. 84.100 /  3. 84.100
  libswscale      2.  5.100 /  2.  5.100
  libswresample   0. 17.103 /  0. 17.103
  libpostproc    52.  3.100 / 52.  3.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'testvideo.mp4':
  Metadata:
    major_brand     : isom
    minor_version   : 1
    compatible_brands: isomavc1
    creation_time   : 2013-07-14 17:16:27
  Duration: 00:03:14.75, start: 0.000000, bitrate: 392 kb/s
    Stream #0:0(und): Video: h264 (Constrained Baseline) (avc1 / 0x31637661), yuv420p, 320x240 [SAR 1:1 DAR 4:3], 255 kb/s, 20 fps, 20 tbr, 20k tbn, 40 tbc (default)
    Metadata:
      creation_time   : 2013-07-14 17:16:27
      handler_name    : mctemp69368b9542f0253c7.264#video:fps=20:par=1:1 - Imported with GPAC 0.5.0-rev4065
    Stream #0:1(und): Audio: aac (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 135 kb/s (default)
    Metadata:
      creation_time   : 2013-07-14 17:16:27
      handler_name    : GPAC ISO Audio Handler
[libvpx @ 0x9bd940] v1.1.0
Output #0, ffm, to 'http://localhost:8090/feed1.ffm':
  Metadata:
    major_brand     : isom
    minor_version   : 1
    compatible_brands: isomavc1
    creation_time   : now
    encoder         : Lavf55.16.101
    Stream #0:0(und): Audio: vorbis (libvorbis), 22050 Hz, mono, fltp, 64 kb/s (default)
    Metadata:
      creation_time   : 2013-07-14 17:16:27
      handler_name    : GPAC ISO Audio Handler
    Stream #0:1(und): Video: vp8 (libvpx), yuv420p, 720x576 [SAR 16:15 DAR 4:3], q=10-42, 400 kb/s, 1000k tbn, 20 tbc (default)
    Metadata:
      creation_time   : 2013-07-14 17:16:27
      handler_name    : mctemp69368b9542f0253c7.264#video:fps=20:par=1:1 - Imported with GPAC 0.5.0-rev4065
Stream mapping:
  Stream #0:1 -> #0:0 (aac -> libvorbis)
  Stream #0:0 -> #0:1 (h264 -> libvpx)
Press [q] to stop, [?] for help
frame=   11 fps=1.9 q=0.0 size=       4kB time=00:00:00.41 bitrate=  78.9kbits/s
frame=   13 fps=2.0 q=0.0 size=       4kB time=00:00:00.41 bitrate=  78.9kbits/s
frame=   16 fps=2.2 q=0.0 size=       4kB time=00:00:00.41 bitrate=  78.9kbits/s
frame=   18 fps=2.2 q=0.0 size=       4kB time=00:00:00.41 bitrate=  78.9kbits/s
frame=   19 fps=2.1 q=0.0 size=       4kB time=00:00:00.43 bitrate=  74.8kbits/s
frame=   22 fps=2.3 q=0.0 size=       4kB time=00:00:00.90 bitrate=  36.3kbits/s
frame=   25 fps=2.4 q=0.0 size=      16kB time=00:00:00.90 bitrate= 145.2kbits/s
frame=   26 fps=2.2 q=0.0 size=      20kB time=00:00:00.90 bitrate= 181.5kbits/s
frame=   27 fps=2.2 q=0.0 size=      20kB time=00:00:00.90 bitrate= 181.5kbits/s
frame=   35 fps=2.7 q=0.0 size=      24kB time=00:00:01.39 bitrate= 141.4kbits/s
......
2. Record the screen and push it into the FFM feed.
ffmpeg -f x11grab -r 25 -s 640x512 -i :0.0 -f alsa -i pulse http://localhost:8090/feed1.ffm
There are two -f options here: the first applies to the video stream, the second to the audio stream. The video is produced by grabbing the screen; -r sets the frame rate to 25 fps, -s sets the capture size to 640x512, and -i :0.0 selects the X11 display to record from (a pixel offset can be appended to set the capture's start coordinates). The audio input is ALSA (Advanced Linux Sound Architecture), taking sound from the Linux system via the pulse device. With this, ffmpeg records the screen and feeds it into feed1.ffm. The output looks like this:
ffmpeg version N-56125-gb4e1630-syslin Copyright (c) 2000-2013 the FFmpeg developers
  built on Sep 9 2013 15:23:52 with gcc 4.4.7 (Ubuntu/Linaro 4.4.7-2ubuntu1)
  configuration: --prefix=/usr/local/ffmpeg --enable-shared --enable-nonfree --enable-gpl --enable-pthreads --disable-yasm --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libfaac --enable-libmp3lame --enable-libtheora --enable-libvorbis --enable-libx264 --enable-libxvid --enable-libvpx --enable-x11grab --extra-cflags=-I/usr/local/ffmpeg/include/ --extra-ldflags=-L/usr/local/ffmpeg/lib --enable-version3 --extra-version=syslin
  libavutil      52. 43.100 / 52. 43.100
  libavcodec     55. 31.101 / 55. 31.101
  libavformat    55. 16.101 / 55. 16.101
  libavdevice    55.  3.100 / 55.  3.100
  libavfilter     3. 84.100 /  3. 84.100
  libswscale      2.  5.100 /  2.  5.100
  libswresample   0. 17.103 /  0. 17.103
  libpostproc    52.  3.100 / 52.  3.100
[x11grab @ 0x221d280] device: :0.0 -> display: :0.0 x: 0 y: 0 width: 640 height: 512
[x11grab @ 0x221d280] shared memory extension found
Input #0, x11grab, from ':0.0':
  Duration: N/A, start: 1378727353.224054, bitrate: 314258 kb/s
    Stream #0:0: Video: rawvideo (BGR[0] / 0x524742), bgr0, 640x512, 314258 kb/s, 29.97 tbr, 1000k tbn, 29.97 tbc
Guessed Channel Layout for Input Stream #1.0 : stereo
Input #1, alsa, from 'pulse':
  Duration: N/A, start: 1378727353.299919, bitrate: 1536 kb/s
    Stream #1:0: Audio: pcm_s16le, 48000 Hz, stereo, s16, 1536 kb/s
[swscaler @ 0x21ff040] deprecated pixel format used, make sure you did set range correctly
[libvpx @ 0x225e100] v1.1.0
Output #0, ffm, to 'http://localhost:8090/feed1.ffm':
  Metadata:
    creation_time   : now
    encoder         : Lavf55.16.101
    Stream #0:0: Audio: vorbis (libvorbis), 22050 Hz, mono, fltp, 64 kb/s
    Stream #0:1: Video: vp8 (libvpx), yuv420p, 720x576, q=10-42, 400 kb/s, 1000k tbn, 29.97 tbc
Stream mapping:
  Stream #1:0 -> #0:0 (pcm_s16le -> libvorbis)
  Stream #0:0 -> #0:1 (rawvideo -> libvpx)
Press [q] to stop, [?] for help
frame=   22 fps=0.0 q=0.0 size=       4kB time=00:00:00.44 bitrate=  73.0kbits/s
frame=   37 fps= 36 q=0.0 size=     160kB time=00:00:00.92 bitrate=1411.3kbits/s
frame=   51 fps= 33 q=0.0 size=     220kB time=00:00:01.28 bitrate=1405.5kbits/s
frame=   66 fps= 32 q=0.0 size=     284kB time=00:00:01.40 bitrate=1660.1kbits/s
......
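If the region you want to capture does not start at the top-left corner of the screen, x11grab accepts a pixel offset appended to the display name. A hedged variant of the command above (the +100,200 offset is just an illustrative value):

```shell
# Grab a 640x512 region starting at x=100, y=200 on display :0.0,
# with audio from ALSA/pulse as before, and push it to the same feed.
ffmpeg -f x11grab -r 25 -s 640x512 -i :0.0+100,200 -f alsa -i pulse \
       http://localhost:8090/feed1.ffm
```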
3. Capture video from a camera and send it to feed1.ffm.
ffmpeg -f video4linux2 -s 640x480 -r 25 -i /dev/video0 -f alsa -i pulse http://localhost:8090/feed1.ffm
video4linux2 is the input device that captures video from the camera, and /dev/video0 is the device file the camera is mapped to. The output looks like this:
ffmpeg version N-56125-gb4e1630-syslin Copyright (c) 2000-2013 the FFmpeg developers
  built on Sep 9 2013 15:23:52 with gcc 4.4.7 (Ubuntu/Linaro 4.4.7-2ubuntu1)
  configuration: --prefix=/usr/local/ffmpeg --enable-shared --enable-nonfree --enable-gpl --enable-pthreads --disable-yasm --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libfaac --enable-libmp3lame --enable-libtheora --enable-libvorbis --enable-libx264 --enable-libxvid --enable-libvpx --enable-x11grab --extra-cflags=-I/usr/local/ffmpeg/include/ --extra-ldflags=-L/usr/local/ffmpeg/lib --enable-version3 --extra-version=syslin
  libavutil      52. 43.100 / 52. 43.100
  libavcodec     55. 31.101 / 55. 31.101
  libavformat    55. 16.101 / 55. 16.101
  libavdevice    55.  3.100 / 55.  3.100
  libavfilter     3. 84.100 /  3. 84.100
  libswscale      2.  5.100 /  2.  5.100
  libswresample   0. 17.103 /  0. 17.103
  libpostproc    52.  3.100 / 52.  3.100
[video4linux2,v4l2 @ 0xdc03c0] The V4L2 driver changed the video from 640x512 to 640x480
[video4linux2,v4l2 @ 0xdc03c0] The driver changed the time per frame from 1/25 to 1/30
Input #0, video4linux2,v4l2, from '/dev/video0':
  Duration: N/A, start: 415.173405, bitrate: 147456 kb/s
    Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 640x480, 147456 kb/s, 30 fps, 30 tbr, 1000k tbn, 1000k tbc
Guessed Channel Layout for Input Stream #1.0 : stereo
Input #1, alsa, from 'pulse':
  Duration: N/A, start: 1378794986.966378, bitrate: 1536 kb/s
    Stream #1:0: Audio: pcm_s16le, 48000 Hz, stereo, s16, 1536 kb/s
[libvpx @ 0xde7f20] v1.1.0
Output #0, ffm, to 'http://172.26.176.6:8090/video.ffm':
  Metadata:
    creation_time   : now
    encoder         : Lavf55.16.101
    Stream #0:0: Audio: vorbis (libvorbis), 22050 Hz, mono, fltp, 64 kb/s
    Stream #0:1: Video: vp8 (libvpx), yuv420p, 720x576, q=10-42, 400 kb/s, 1000k tbn, 25 tbc
Stream mapping:
  Stream #1:0 -> #0:0 (pcm_s16le -> libvorbis)
  Stream #0:0 -> #0:1 (rawvideo -> libvpx)
Press [q] to stop, [?] for help
frame=   15 fps=0.0 q=0.0 size=       4kB time=00:00:00.42 bitrate=  77.5kbits/s
frame=   27 fps= 27 q=0.0 size=      16kB time=00:00:00.79 bitrate= 165.8kbits/s
frame=   40 fps= 27 q=0.0 size=      32kB time=00:00:01.27 bitrate= 205.4kbits/s
......
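Note in the log above that the V4L2 driver silently changed the requested 640x512@25 to 640x480@30. Before pushing from a camera it can help to check which resolutions and pixel formats the device actually advertises; a sketch, assuming the camera is /dev/video0:

```shell
# List the formats the V4L2 device supports; ffmpeg prints them and exits
# (it deliberately fails with "Immediate exit requested" after listing).
ffmpeg -f video4linux2 -list_formats all -i /dev/video0
```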
References:
https://www.virag.si/2012/11/streaming-live-webm-video-with-ffmpeg/
http://trac.ffmpeg.org/wiki/Streaming%20media%20with%20ffserver