Raspberry Pi – Webcam streaming

Now it’s time to make my old Creative Live! Cam Vista IM (VF0260) webcam work on the RasPi. I also tried a new Logitech C270 HD webcam, and both work flawlessly out of the box.
I have tried three webcam programs, which I had previously tested on my desktop machine running Fedora and wanted to try on the Pi as well. These are:

  • fswebcam
  • motion
  • mjpg-streamer

Let’s now have a look at them.

Note: Since some kernel updates, the system must be booted with an extra kernel parameter, otherwise the webcam only takes black pictures. Thanks to WebS1821 for this precious insight!

Proceed by editing the kernel parameters file:
# nano /boot/cmdline.txt
and append:
dwc_otg.fiq_split_enable=0
Don’t forget to reboot after adding the parameter.
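
For reference, /boot/cmdline.txt is a single line of space-separated parameters, and the new parameter simply goes at the end of that same line. After the edit it should look something like this (everything before the new parameter is just an example taken from a stock Raspbian install and will most likely differ on your system):

dwc_otg.lpm_enable=0 console=ttyAMA0,115200 console=tty1 root=/dev/mmcblk0p2 rootfstype=ext4 rootwait dwc_otg.fiq_split_enable=0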

  • fswebcam:

    fswebcam is the simplest and easiest way to have your webcam capture single frames, optionally at a specified interval. Of course it’s also possible to call fswebcam from a bash script anytime it’s required; a minimal sketch of this is shown at the end of this section.
    To install fswebcam simply run:

    # apt-get install fswebcam

    One of the nice features of fswebcam is that it’s possible to add a footer to the capture, where you can specify your text of choice.
    For instance, this is the command I ran fswebcam with:

    $ fswebcam -r 640x480 -S 15 --flip h --jpeg 95 --shadow --title "SLB Labs" --subtitle "Home" --info "Monitor: Active @ 1 fpm" --save home.jpg -q -l 60

    switches:
    -r: frame resolution, width x height
    -S: number of frames to skip so the webcam can deliver a stable and well illuminated frame
    --flip: frame flipping, in this case horizontal
    --jpeg: JPEG compression quality
    --shadow: adds a drop shadow to the footer text
    --title, --subtitle, --info: different fields of the footer text
    --save: path and file name to save the frame to
    -q: quiet mode, hides all messages except errors
    -l: takes a snapshot every specified number of seconds

    Many more options are described in the fswebcam man page:
    $ man fswebcam

    [Sample output frame produced by the fswebcam command above]
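
    As hinted above, fswebcam can also be driven from a bash script, for instance scheduled with cron instead of the built-in -l loop. Here is a minimal sketch; the script path, the output folder and the schedule are purely hypothetical:

    #!/bin/bash
    # /usr/local/bin/webcam-snap.sh (hypothetical path): capture one timestamped frame
    OUTDIR=/home/pi/captures
    mkdir -p "$OUTDIR"
    fswebcam -r 640x480 -S 15 --jpeg 95 --title "SLB Labs" --save "$OUTDIR/$(date +%Y%m%d-%H%M%S).jpg" -q

    A crontab entry like the following would then capture one frame per minute:

    * * * * * /usr/local/bin/webcam-snap.sh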

    Okay, this wasn’t exactly streaming, so let’s get into more serious stuff.

  • motion:

    motion is a rather complete surveillance system: no fancy stuff, straight to the point, yet very customizable. Among other things, it is capable of motion detection, frame capture, video recording and timelapses.
    Its installation is as simple as usual:

    # apt-get install motion

    It comes with a plain web configuration interface, but first of all we need to find out which ports motion will use. Let’s edit its configuration file:

    # nano /etc/motion/motion.conf

    and look for the settings webcontrol_port (default 8080) and stream_port (default 8081), which refer to the web configuration interface and the streaming port, respectively. Change them as needed, or simply take note of their default values.
    To start motion run:

    # motion
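
    Unless daemon on is enabled in the configuration file, motion stays attached to the terminal. On Raspbian/Debian you can also let the packaged init script start it in the background; a minimal sketch, assuming the stock /etc/default/motion shipped with the package:

    # nano /etc/default/motion     (set start_motion_daemon=yes)
    # service motion start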

    You can now access the web configuration interface by pointing your browser to:

    http://RPI-IP:webcontrol_port

    where RPI-IP is the local IP address of your RasPi and webcontrol_port is the same port set in the config file.
    From there it’s now possible to browse all the available settings.
    Some of them that are worth a mention:
    width and height: frame dimensions, camera dependent
    framerate: maximum number of frames to capture per second
    threshold: number of pixels in the image that must change to trigger the motion detection
    quality: JPEG compression quality for captured frames
    ffmpeg_timelapse: number of seconds between frames for timelapse recording
    ffmpeg_bps: constant bitrate for video encoding (ignored for timelapses)
    ffmpeg_variable_bitrate: variable bitrate for video encoding; using a variable bitrate is the only way to get decent timelapse videos, and this setting is used for both motion-triggered recordings and timelapses
    and of course the various image/video/timelapse paths to save the captures to are among the settings you might want to customize.
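
    As a rough illustration, here is a minimal sketch of how some of these settings might look in /etc/motion/motion.conf; the values are only an example and depend on your camera and needs:

    width 640
    height 480
    framerate 15
    # trigger motion detection when at least 1500 pixels change between frames
    threshold 1500
    quality 90
    # take one timelapse frame every 60 seconds
    ffmpeg_timelapse 60
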
    Another important feature of motion is that it’s possible to execute an arbitrary command every time an event starts and/or a picture is saved; this is achieved with the on_event_start and on_picture_save settings.
    For instance, it’s possible to send an email (with the sendemail utility) and upload the saved picture to an FTP server (with wput) by setting on_event_start and on_picture_save as follows:

    on_event_start sendemail -f YOURFROMEMAIL@gmail.com -t YOURTOEMAIL@gmail.com -u \"SUBJECT\" -m \"Movement has been detected on: %d %m %Y. The time of the movement was: %H:%M (Hour:Minute). The Pictures have been uploaded to your FTP account.\" -s smtp.gmail.com:25 -xu YOURGMAIL@gmail.com -xp YOURGMAILPASSWORD

    on_picture_save wput ftp://USERNAME:PASSWORD@YOURFTP.COM %f

    As said above, the video stream will be available by pointing the browser to:

    http://RPI-IP:stream_port

  • mjpg-streamer:

    Another streaming software I tried is mjpg-streamer, which is not as feature-complete as motion, but it is perfect if you just need a video stream. It also provides a web interface to display the stream. I couldn’t find a binary version of mjpg-streamer for the ARM processor, so I had to compile it myself as follows.

    First off, we need the mjpg-streamer source code (mjpg-streamer-r63.tar.gz): download it and save it in your folder of choice. I usually save and extract source packages under /usr/local/src.
    Move into the folder where the archive has been saved and extract it with:

    # tar xvzf mjpg-streamer-r63.tar.gz

    In order to compile mjpg-streamer we need the libjpeg8-dev development package, so let’s install it first:

    # apt-get install libjpeg8-dev

    I also needed to create a symbolic link for one header file that turned out to be missing on my system:

    # ln -s /usr/include/linux/videodev2.h /usr/include/linux/videodev.h

    Now everything should be set to proceed with the compilation. Switch to the newly created mjpg-streamer folder and compile it with:

    # cd mjpg-streamer-r63
    # CFLAGS+="-O2 -march=armv6 -mfpu=vfp -mfloat-abi=hard" make
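
    If the build completes without errors, the mjpg_streamer binary, the input/output plugins and the www folder should all end up in the same source directory (which is why LD_LIBRARY_PATH is pointed at the current folder below). A quick sanity check:

    # ls mjpg_streamer input_uvc.so output_http.so www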

    And that’s it. To run mjpg-streamer type the following commands:

    $ export LD_LIBRARY_PATH=.
    $ ./mjpg_streamer -i './input_uvc.so -d /dev/video0 -r 640x480 -f 15' -o './output_http.so -w ./www -p 8080'

    switches:
    -i: configures the input plugin
    -d: device selection, in case of multiple webcams
    -r: frame resolution, width x height
    -f: frames per second
    -o: configures the output plugin
    -w: folder containing the web interface pages
    -p: port to listen on

    It’s now possible to access the web interface at:

    http://RPI-IP:8080
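
    Besides the web interface, the output_http.so plugin also serves the raw MJPEG stream and single snapshots directly, which is handy for embedding the stream elsewhere or for scripting; as far as I recall the relevant URLs are:

    http://RPI-IP:8080/?action=stream
    http://RPI-IP:8080/?action=snapshot

    so a single frame can be grabbed, for instance, with:

    $ wget -O frame.jpg "http://RPI-IP:8080/?action=snapshot"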

