FFmpeg
@JERNEJ VIRAG · NOV 2, 2012 · 4 MIN READ
A lesser-known option is to use FFmpeg's FFserver to stream WebM video. For this to
work you need a reasonably current FFmpeg release (this article was written against the
stable 1.0 release) compiled with libvpx and FFserver support. New FFmpeg builds are
available for Windows and Linux; FFserver, however, is available only on Linux.
1. Configuring FFserver
FFserver will be the daemon doing the actual delivery to client computers. It will need
enough available outbound bandwidth to deliver video to all connected clients. The
stream encoding is done by FFmpeg, so the machine running FFserver won't need a lot
of CPU power.
# Audio settings
AudioCodec vorbis
AudioBitRate 64 # Audio bitrate
# Video settings
VideoCodec libvpx
VideoSize 720x576 # Video resolution
VideoFrameRate 25 # Video FPS
AVOptionVideo flags +global_header # Parameters passed to the encoder
                                   # (same as ffmpeg command-line parameters)
AVOptionVideo cpu-used 0
AVOptionVideo qmin 10
AVOptionVideo qmax 42
AVOptionVideo quality good
AVOptionAudio flags +global_header
PreRoll 15
StartSendOnKey
VideoBitRate 400 # Video bitrate
</Stream>
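This excerpt shows only the `</Stream>` end of the stream section; a complete ffserver.conf also needs port settings, a matching `<Feed>` block, and the opening `<Stream>` tag, which are omitted here. With the configuration file in place, the daemon is started by pointing FFserver at it (the path below is an assumed example):

```shell
# Start the streaming daemon with the configuration above
# (/etc/ffserver.conf is an assumed location; adjust to your setup)
ffserver -f /etc/ffserver.conf
```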
Important: you SHOULD NOT specify any encoding settings on the ffmpeg command
line; those will be retrieved from FFserver!
Here is an example of streaming webcam (or any other V4L2 camera) input with
PulseAudio microphone audio on Linux:
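A minimal sketch of such an invocation, assuming FFserver is listening on localhost:8090 with a feed named feed1.ffm (both are illustrative, not taken from the article):

```shell
# Grab video from the first V4L2 device and audio from the default
# PulseAudio source, then push both to FFserver, which does the
# actual WebM encoding according to its own configuration
ffmpeg -f video4linux2 -i /dev/video0 \
       -f pulse -i default \
       http://localhost:8090/feed1.ffm
```

Note that no encoding options appear here, per the warning above.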
To list the webcam's capabilities you can use the -list_formats all input option:
ffmpeg -f video4linux2 -list_formats all -i /dev/video0
Afterwards you can configure the resolution with -s, the frame rate with -r and the
capture codec with -codec:v. Just make sure to keep those parameters AFTER
-f video4linux2 and BEFORE -i. If you put them after the -i input specifier, they
will be applied as encoding settings instead.
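Putting these pieces together, a capture command with explicit input parameters might look like this (the resolution, frame rate, codec, and feed URL are example values):

```shell
# Input options placed between -f video4linux2 and -i configure
# the capture itself, not the output encoding
ffmpeg -f video4linux2 -s 1280x720 -r 30 -codec:v mjpeg -i /dev/video0 \
       -f pulse -i default \
       http://localhost:8090/feed1.ffm
```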
Windows

On Windows you need to use the DirectShow (dshow) input format to grab video from
a camera (or a capture device such as a Blackmagic box).
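FFmpeg can enumerate the available DirectShow devices, and a device is then selected by name. A sketch, with placeholder device names (your camera and microphone will be named differently):

```shell
# List the DirectShow capture devices available on this machine
ffmpeg -f dshow -list_devices true -i dummy

# Capture from a named camera and microphone and push to FFserver
# ("USB Camera" and "Microphone" are placeholders)
ffmpeg -f dshow -i video="USB Camera":audio="Microphone" ^
       http://localhost:8090/feed1.ffm
```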