WIPVideo » History » Revision 41

Revision 40 (Saúl Ibarra Corretgé, 02/13/2015 10:02 AM) → Revision 41/43 (Saúl Ibarra Corretgé, 02/13/2015 10:08 AM)

h1. Video Support

Notes taken while video support is a work in progress. Repository: http://devel.ag-projects.com/cgi-bin/darcsweb.cgi?r=saul/python-sipsimple-video;a=summary

 h2. Dependencies 

The following dependencies are required to build PJSIP with video support (including H.264):

 * ffmpeg (libavformat, libswscale, libavcodec, libavutil) 
 * libx264 

 Versions I have tried: 

* ffmpeg 2.5.3
* libx264 (snapshot-20141218-2245)

 h2. Installing dependencies (Debian / Ubuntu systems) 

The situation here is a bit sad. Both Debian and Ubuntu ship libav instead of FFmpeg, but the libraries have the same names. PJSIP had to be patched in order to work properly with libav, and the patch has not yet been included upstream.

 On Debian, when the Debian-Multimedia repositories are used (quite common) you get FFmpeg and not libav. Oh the joy! 

 Installing dependencies on Debian: 

 <pre> 
 apt-get install libv4l-dev libavcodec-dev libavformat-dev libavutil-dev libswscale-dev libswresample-dev libx264-dev libavcodec-extra 
 </pre> 

 If using the Debian-Multimedia repositories, do not install libavcodec-extra. 

 Installing dependencies on Ubuntu: 

 <pre> 
 apt-get install libv4l-dev libavcodec-dev libavformat-dev libavutil-dev libswscale-dev libx264-dev libavcodec-extra 
 </pre> 

*Note on H.264 support*: In order to have H.264 support, FFmpeg (or libav) needs to be compiled with support for it. The standard packages are not, hence the need to install the libavcodec-extra package.
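A quick way to verify that the installed libavcodec actually provides an H.264 encoder is to probe it with ctypes. This is a hypothetical helper, not part of the SDK: the library and symbol names (@avcodec_register_all@, @avcodec_find_encoder_by_name@) are the standard FFmpeg/libav ones, and the function returns None when libavcodec cannot be loaded at all.

```python
import ctypes
import ctypes.util


def have_h264_encoder(name=b"libx264"):
    """Return True/False if libavcodec is loadable, None otherwise."""
    libname = ctypes.util.find_library("avcodec")
    if libname is None:
        return None
    try:
        lib = ctypes.CDLL(libname)
    except OSError:
        return None
    try:
        # Older FFmpeg/libav releases require explicit codec registration
        lib.avcodec_register_all()
    except AttributeError:
        pass  # removed in newer FFmpeg; codecs are always registered
    lib.avcodec_find_encoder_by_name.restype = ctypes.c_void_p
    lib.avcodec_find_encoder_by_name.argtypes = [ctypes.c_char_p]
    # NULL pointer (no such encoder) comes back as None from ctypes
    return lib.avcodec_find_encoder_by_name(name) is not None
```

If this returns False, the installed libavcodec was built without libx264, which is exactly the situation the libavcodec-extra package fixes.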

h2. Manually compiling dependencies (for OSX)

 All dependencies will be compiled to a directory in the user's HOME directory: 

 <pre> 
export SIPSIMPLE_FFMPEG_PATH=$HOME/work/ag-projects/video/local
 </pre> 

NOTE: yasm is required in order to enable asm optimizations. It does not come preinstalled on OSX, so it has to be manually installed (brew install yasm will do).

 h3. libx264 


 <pre> 
./configure --enable-shared --disable-avs --disable-lavf --disable-ffms --disable-gpac --prefix=$SIPSIMPLE_FFMPEG_PATH
 make 
 make install 

 # If a 32bit build is wanted on OSX, then run this configure instead: 
./configure --host=i386-apple-darwin --enable-shared --disable-avs --disable-lavf --disable-ffms --disable-gpac --prefix=$SIPSIMPLE_FFMPEG_PATH
 </pre> 

 h3. ffmpeg 

 <pre> 
 # Some exports 
export PKG_CONFIG_PATH=$SIPSIMPLE_FFMPEG_PATH/lib/pkgconfig

./configure --enable-shared --disable-static --enable-memalign-hack --enable-gpl --enable-libx264 --prefix=$SIPSIMPLE_FFMPEG_PATH --extra-cflags="`pkg-config --cflags x264`" --extra-ldflags="`pkg-config --libs x264`"
 make 
 make install 

 # If a 32bit build is wanted on OSX do: 
./configure --enable-shared --disable-static --enable-memalign-hack --enable-gpl --enable-libx264 --prefix=$SIPSIMPLE_FFMPEG_PATH --extra-cflags="`pkg-config --cflags x264`" --extra-ldflags="`pkg-config --libs x264`" --cc="gcc -m32"
 </pre> 

 h2. Proposed API 

 TODO: API changed, update this. 

The API for the video components is based on two types of video-capable entities:

 * VideoProducer: a source for video data, for example a video camera or a remote video stream 
 * VideoConsumer: a sink or destination for video data, for example a video rendering window 

 h3. Data flow 

Data flow works in a _pull_ fashion, that is, a producer doesn't start producing data until a consumer is attached to consume it.

 h3. VideoProducer 

 Produces video data.  

 Internal API: 

* _add_consumer: attach a consumer to the producer; called by the consumer
* _remove_consumer: detach a consumer from the producer; called by the consumer


 Public API: 

 * start: start producing video as soon as a consumer is attached 
 * stop: immediately stop producing data 
 * close: remove all consumers and stop producing video data (also deallocate all C structures) 
 * producer_port: pointer to the pjmedia_port object 

 h3. VideoConsumer 

 Consumes video data. 

 Public API: 

* producer: (r/w property) attach this consumer to a producer in order to render the video data the producer generates; setting it to None detaches the consumer
 * consumer_port: pointer to the pjmedia_port object 
 * close: detach from producer and free all resources (also deallocate all C structures) 
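The pull model and the two APIs above can be sketched in Python. Only the class and method names come from this page; the internals (a set of attached consumers, a @producing@ flag) are illustrative guesses, not the actual implementation.

```python
class VideoProducer:
    """A source of video data; produces nothing until a consumer attaches."""

    def __init__(self):
        self._consumers = set()
        self._started = False
        self.producing = False

    # Internal API, called by consumers
    def _add_consumer(self, consumer):
        self._consumers.add(consumer)
        self._update_state()

    def _remove_consumer(self, consumer):
        self._consumers.discard(consumer)
        self._update_state()

    # Public API
    def start(self):
        self._started = True
        self._update_state()

    def stop(self):
        self._started = False
        self._update_state()

    def close(self):
        # Detach every consumer, then stop producing
        for consumer in list(self._consumers):
            consumer.producer = None
        self.stop()

    def _update_state(self):
        # Pull model: produce only when started *and* someone is consuming
        self.producing = self._started and bool(self._consumers)


class VideoConsumer:
    """A sink for video data; attaches to a producer via the producer property."""

    def __init__(self):
        self._producer = None

    @property
    def producer(self):
        return self._producer

    @producer.setter
    def producer(self, producer):
        if self._producer is not None:
            self._producer._remove_consumer(self)
        self._producer = producer
        if producer is not None:
            producer._add_consumer(self)

    def close(self):
        self.producer = None
```

With this shape, a started camera stays idle until a window attaches to it, and detaching the last consumer stops production again, which is the pull behavior described above.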

 h3. Producer and consumer objects 

 * VideoDevice: Producer, acquires video from a user camera. 
 * VideoWindow: Consumer, renders video in an SDL window. Extra methods: show/hide. Properties: native_handle, size. 

 * LocalVideoStream: Consumer, takes video from a VideoDevice and sends it to the remote party. 
 * RemoteVideoStream: Producer, produces video sent by the remote party. 

The following are theoretical objects which won't be implemented in the first iteration:

 * VideoFileWriter: Consumer, saves incoming video data to a video file. 
 * VideoFilePlayer: Producer, produces video data out of a video file. 

 * VideoMixer: Producer/Consumer, consumes video from multiple sources and produces aggregated video data. 

NOTE: pjsip does have an AVI file player, which also seems to support audio (this could be used to stream a movie, for example).
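The VideoMixer listed above is the one object that is both consumer and producer. A minimal sketch of that dual role, with entirely made-up names and a placeholder frame representation (and sources pushing frames for simplicity, where the real pull model would poll them):

```python
class VideoMixer:
    """Consumes video from multiple sources, produces aggregated frames."""

    def __init__(self):
        self._last_frames = {}  # source -> most recent frame received

    # Consumer side: record the latest frame from each attached source
    def handle_frame(self, source, frame):
        self._last_frames[source] = frame

    # Producer side: pulled by downstream consumers (e.g. a VideoWindow)
    def produce_frame(self):
        # Trivial "aggregation": the latest frame per source; a real mixer
        # would compose them into a single picture.
        return tuple(self._last_frames.values())
```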

 h2. H264 

Information about H.264 profiles:

 * http://en.wikipedia.org/wiki/H.264/MPEG-4_AVC#Levels 
 * https://supportforums.cisco.com/blog/149561/video-telepresence-sip-h264-profile-level-id 

 h2. OpenH264 implementation 

PJSIP has an initial version of a wrapper for Cisco's OpenH264 implementation (http://www.openh264.org/), see http://trac.pjsip.org/repos/changeset?reponame=&old=4815%40%2F&new=4815%40%2F

OpenH264 seems to implement SVC, which is better than AVC. In practice, however, it didn't outperform libx264, and since it only implements the constrained baseline profile it was discarded.