
Install OpenCV and Python on your Raspberry Pi 2 and B+

by Adrian Rosebrock on February 23, 2015 in Raspberry Pi, Tutorials

My Raspberry Pi 2 just arrived in the mail yesterday, and man is this berry sweet.
This tiny little PC packs a real punch with a 900 MHz quad-core processor and 1 GB of
RAM. To give some perspective, the Raspberry Pi 2 is faster than the majority of the
desktops in my high school computer lab.
Anyway, since the announcement of the Raspberry Pi 2 I've been getting a lot of
requests to provide detailed installation instructions for OpenCV and Python.

So if you're looking to get OpenCV and Python up and running on your Raspberry Pi,
look no further!
In the rest of this blog post I provide detailed installation instructions for both
the Raspberry Pi 2 and the Raspberry Pi B+.
I've also provided install timings for each step. Some of these steps require a lot of
processing time. For example, compiling the OpenCV library on a Raspberry Pi 2 takes
approximately 2.8 hours versus the 9.5 hours on the Raspberry Pi B+, so please plan
your install accordingly.
Finally, it's worth mentioning that we'll be utilizing the Raspberry Pi inside
the PyImageSearch Gurus computer vision course. Our projects will include home
surveillance applications such as detecting motion and tracking people in a room.
Here's a quick example of detecting motion and tracking myself as I walk around my
apartment on the phone:

Install OpenCV and Python on your Raspberry Pi 2 and B+
I'm going to assume that you have either your Raspberry Pi 2 or Raspberry Pi B+
unboxed and set up. If you don't have a Raspberry Pi yet, I definitely suggest picking
one up. They are super cheap and a lot of fun to play with.
Personally, I prefer to spend a little extra money and purchase from CanaKit: their
shipping is fast and reliable, plus their complete ready-to-go bundles are really nice.
Anyway, let's get into the OpenCV and Python install instructions.

Step 0:
Again, I'm going to assume that you have just unboxed your Raspberry Pi 2/B+. Open
up a terminal and we'll start by updating and upgrading installed packages, followed by
updating the Raspberry Pi firmware:

Shell

$ sudo apt-get update
$ sudo apt-get upgrade
$ sudo rpi-update

Step 1:
Install the required developer tools and packages:

Shell

$ sudo apt-get install build-essential cmake pkg-config

Both build-essential and pkg-config are likely already installed, but just in case
they are not, be sure to include them in your apt-get command.
Timings:
Raspberry Pi B+: < 2 minutes
Raspberry Pi 2: < 40 seconds

Step 2:

Install the necessary image I/O packages. These packages allow you to load various
image file formats such as JPEG, PNG, TIFF, etc.

Shell

$ sudo apt-get install libjpeg8-dev libtiff4-dev libjasper-dev libpng12-dev

Timings:
Raspberry Pi B+: < 5 minutes
Raspberry Pi 2: < 30 seconds

Step 3:
Install the GTK development library. This library is used to build Graphical User
Interfaces (GUIs) and is required for the highgui library of OpenCV which allows you
to view images on your screen:
Shell

$ sudo apt-get install libgtk2.0-dev

Timings:
Raspberry Pi B+: < 10 minutes
Raspberry Pi 2: < 3 minutes

Step 4:
Install the necessary video I/O packages. These packages are used to load video files
using OpenCV:

Shell

$ sudo apt-get install libavcodec-dev libavformat-dev libswscale-dev libv4l-dev

Timings:
Raspberry Pi B+: < 5 minutes
Raspberry Pi 2: < 30 seconds

Step 5:
Install libraries that are used to optimize various operations within OpenCV:

Shell

$ sudo apt-get install libatlas-base-dev gfortran

Timings:
Raspberry Pi B+: < 2 minutes
Raspberry Pi 2: < 30 seconds

Step 6:
Install pip:
Shell

$ wget https://bootstrap.pypa.io/get-pip.py
$ sudo python get-pip.py

Timings:
Raspberry Pi B+: < 2 minutes
Raspberry Pi 2: < 30 seconds

Step 7:
Install virtualenv and virtualenvwrapper:
Shell

$ sudo pip install virtualenv virtualenvwrapper

Then, update your ~/.profile file to include the following lines:


Shell

# virtualenv and virtualenvwrapper
export WORKON_HOME=$HOME/.virtualenvs
source /usr/local/bin/virtualenvwrapper.sh

Reload your .profile file:


Shell

$ source ~/.profile

Create your computer vision virtual environment:

Shell

$ mkvirtualenv cv

Timings:
Raspberry Pi B+: < 2 minutes
Raspberry Pi 2: < 2 minutes

Step 8:
Now we can install the Python 2.7 development tools:

Shell

$ sudo apt-get install python2.7-dev

Note: Yes, we are going to use Python 2.7. OpenCV 2.4.X does not yet support
Python 3, and OpenCV 3.0 is still in beta. It's also unclear when the Python bindings for
OpenCV 3.0 will be complete, so I advise sticking with OpenCV 2.4.X for the time being.
We also need to install NumPy since the OpenCV Python bindings represent images
as multi-dimensional NumPy arrays:

Shell

$ pip install numpy

Timings:
Raspberry Pi B+: < 45 minutes
Raspberry Pi 2: < 15 minutes

Step 9:
Download OpenCV and unpack it:

Shell

$ wget -O opencv-2.4.10.zip http://sourceforge.net/projects/opencvlibrary/files/opencv-unix/2.4.10/opencv-2.4.10.zip/download
$ unzip opencv-2.4.10.zip
$ cd opencv-2.4.10

Set up the build:

Shell

$ mkdir build
$ cd build
$ cmake -D CMAKE_BUILD_TYPE=RELEASE -D CMAKE_INSTALL_PREFIX=/usr/local \
    -D BUILD_NEW_PYTHON_SUPPORT=ON -D INSTALL_C_EXAMPLES=ON \
    -D INSTALL_PYTHON_EXAMPLES=ON -D BUILD_EXAMPLES=ON ..

Timings:
Raspberry Pi B+: < 3 minutes
Raspberry Pi 2: < 1.5 minutes
Compile OpenCV:

Shell

$ make

Important: Make sure you're in the cv virtual environment so OpenCV is compiled
against the virtual environment's Python and NumPy. Otherwise, OpenCV will be
compiled against the system Python and NumPy, which can lead to problems down the
line.
Timings:
Raspberry Pi B+: < 9.5 hours
Raspberry Pi 2: < 2.8 hours
Finally, we can install OpenCV:

Shell

$ sudo make install
$ sudo ldconfig

Timings:
Raspberry Pi B+: < 3 minutes
Raspberry Pi 2: < 1 minute

Step 10:

If you've gotten this far in the guide, OpenCV should now be installed
in /usr/local/lib/python2.7/site-packages.
But in order to utilize OpenCV within our cv virtual environment, we first need to
symlink OpenCV into our site-packages directory:

Shell

$ cd ~/.virtualenvs/cv/lib/python2.7/site-packages/
$ ln -s /usr/local/lib/python2.7/site-packages/cv2.so cv2.so
$ ln -s /usr/local/lib/python2.7/site-packages/cv.py cv.py

Step 11:
Finally, we can give our OpenCV and Python installation a test drive:

Shell

$ workon cv
$ python
>>> import cv2
>>> cv2.__version__
'2.4.10'

OpenCV and Python are now successfully installed on your Raspberry Pi!


Here is an example of me sshing (with X11 forwarding) into my Raspberry Pi, followed
by loading and displaying an image:

Summary
In this blog post I detailed how to install OpenCV and Python on your Raspberry Pi 2
or Raspberry Pi B+. Timings for each installation step were also provided so you can
plan out the install accordingly.
As the Raspberry Pi (along with Raspbian/NOOBS) evolves, the installation instructions
will likely change. If you run across any edge cases or variations in the install
instructions, please feel free to let me know. While I can't promise to reply to
every email, I think it would be good to curate a list of methods to set up OpenCV
and Python on Raspberry Pi systems.
And in future blog posts we'll explore how to utilize the camera add-on for the
Raspberry Pi.
Until then, take a look at the PyImageSearch Gurus computer vision course. We'll be
utilizing the Raspberry Pi inside the course for a few projects, including building a
home surveillance application that can detect motion and people in rooms.

RASPBERRY PI 1 MODEL B+
The Model B+ is the final revision of the original Raspberry Pi. It replaced
the Model B in July 2014 and was superseded by the Raspberry Pi 2 Model B in
February 2015. Compared to the Model B it has:

More GPIO. The GPIO header has grown to 40 pins, while retaining the same
pinout for the first 26 pins as the Model A and B.

More USB. We now have 4 USB 2.0 ports, compared to 2 on the Model B, and
better hotplug and overcurrent behaviour.

Micro SD. The old friction-fit SD card socket has been replaced with a much nicer
push-push micro SD version.

Lower power consumption. By replacing linear regulators with switching ones
we've reduced power consumption by between 0.5W and 1W.

Better audio. The audio circuit incorporates a dedicated low-noise power supply.

Neater form factor. We've aligned the USB connectors with the board edge,
moved composite video onto the 3.5mm jack, and added four squarely-placed
mounting holes.

The Model B+ is perfectly suitable for use in schools: it offers more flexibility for
learners than the leaner Model A or A+, which are more useful for embedded
projects and projects which require very low power, and has more USB ports than
the Model B.
See the documentation for technical details.
The hardware in the Raspberry Pi

Schematics
o Schematics for the Raspberry Pi Model A, B and B+

BCM2835
o The Broadcom processor used in Raspberry Pi Model A, B and B+

Mechanical Drawings
o Mechanical drawings of the Raspberry Pi Model B+

Power
o Powering the Raspberry Pi

USB
o USB on the Raspberry Pi

GPIO
o General Purpose Input/Output pins on the Raspberry Pi

SPI
o SPI on the Raspberry Pi

High Definition 1080p Embedded Multimedia Applications Processor


The BCM2835 is a cost-optimized, full HD, multimedia applications processor for advanced mobile and
embedded applications that require the highest levels of multimedia performance. Designed and optimized for
power efficiency, BCM2835 uses Broadcom's VideoCore IV technology to enable applications in media
playback, imaging, camcorder, streaming media, graphics and 3D gaming.

Features
Low Power ARM1176JZ-F Applications Processor
Dual Core VideoCore IV Multimedia Co-Processor
1080p30 Full HD HP H.264 Video Encode/Decode
Advanced Image Sensor Pipeline (ISP) for up to 20-megapixel cameras operating at up to 220
megapixels per second


Low power, high performance OpenGL-ES 1.1/2.0 VideoCore GPU. 1 Gigapixel per second fill rate.

High performance display outputs. Simultaneous high resolution LCD and HDMI with HDCP at
1080p60

ARM1176 Processor


The ARM1176 applications processor is deployed broadly in devices ranging from smart
phones to digital TVs to eReaders, delivering media and browser performance, a secure
computing environment, and performance up to 1GHz in low-cost designs. The
ARM1176JZ-S processor features ARM TrustZone technology for secure applications and
ARM Jazelle technology for efficient embedded Java execution. Optional tightly coupled
memories simplify ARM9 processor migration and real-time design, while AMBA 3
AXI interfaces improve memory bus performance. DVFS support enables power
optimization below the best-in-class nominal static and dynamic power of
the ARM11 processor architecture.

ARM1176

Architecture          - ARMv6
Dhrystone Performance - 1.25 DMIPS/MHz
Multicore             - No, single core only
ISA Support           - ARM, Thumb, Jazelle DBX, DSP extension, Floating Point Unit (optional)
Memory Management     - Memory management unit
Debug & Trace         - CoreSight Design Kit for ARM11 (available separately)

ARM1176 Key Features

Performance to enable an excellent end-user experience

750MHz (TSMC 65GP) with conservative design; 1GHz+ with design optimizations

Low latency mode for interrupt responsiveness

TCM for ARM9 processor migration and real-time demands

Physically addressed caches for multi-tasking performance

Broad OS support, multiple Linux distributions, amazing ARM ecosystem

Full Internet experience

Product maturity enables rapid time to market and low risk

Well proven technology in a wide range of applications

Available as a soft core or hard macro from ARM (TSMC 90G) or from 3rd parties
(Socle/GLOBALFOUNDRIES - 65nm, TSMC - 65nm)

AMBA AXI supported by a wide range of fabric

CoreSight debug offering unrivalled system visibility

Comprehensive range of development tools from ARM and from third parties

Range of Reference Methodologies supplied


Low Power Leadership

Shutdown modes, Clock Gating, and DVFS capability

93% of flops are clock gated

Separate Main TLB and Micro-TLBs: the main TLB is not clocked unless the micro-TLB
misses

Avoids unnecessary Tag-RAM and Data-RAM activity for sequential accesses

Predictive use of Cache or TCM

Mode     - Power
Run      - 0.208 mW/MHz in 65G
Standby  - Leakage only
Dormant  - Memory retention current
Shutdown - Zero power draw

DVFS - supported by the ARM1176

Physical IP Power Management Kit Clamps allow core to shut down while TCM still
enabled in Dormant mode

Asynchronous interfaces allow core to run at non-integer multiple of bus frequency

L-Shift clamps allow core voltage to differ from rest of SoC

POWER SUPPLY
The device is powered by a 5V micro USB supply. Exactly how much current (mA)
the Raspberry Pi requires is dependent on what you connect to it. We have found
that purchasing a 1.2A (1200mA) power supply from a reputable retailer will
provide you with ample power to run your Raspberry Pi.
Typically, the model B uses between 700mA and 1000mA depending on what peripherals
are connected; the model A can use as little as 500mA with no peripherals
attached. The maximum power the Raspberry Pi can use is 1 Amp. If you need to
connect a USB device that will take the power requirements above 1 Amp, then
you must connect it to an externally-powered USB hub.
The power requirements of the Raspberry Pi increase as you make use of the
various interfaces on the Raspberry Pi. The GPIO pins can draw 50mA safely,
distributed across all the pins; an individual GPIO pin can only safely draw 16mA.
The HDMI port uses 50mA, the camera module requires 250mA, and keyboards
and mice can take as little as 100mA or over 1000mA! Check the power rating of
the devices you plan to connect to the Pi and purchase a power supply
accordingly.
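Those figures make it easy to sanity-check a supply before you buy. The helper below is only an illustrative sketch (the function name and structure are invented for this example); it totals the draws quoted above:

```python
# Rough power-budget check using the figures quoted in the text above.
def power_budget(supply_ma, base_ma, peripherals_ma):
    """Return (fits, total_ma): does the supply cover base draw plus peripherals?"""
    total_ma = base_ma + sum(peripherals_ma)
    return supply_ma >= total_ma, total_ma

# Model B worst case (1000mA) plus HDMI (50mA) on a 1.2A supply:
print(power_budget(1200, 1000, [50]))          # → (True, 1050)

# Adding the camera module (250mA) pushes past the 1.2A supply:
print(power_budget(1200, 1000, [50, 250]))     # → (False, 1300)
```

When the budget does not fit, move the hungrier devices onto an externally-powered USB hub, as the text recommends.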

BACKPOWERING
Backpowering occurs when a USB hub does not include a diode to stop the hub from
powering against the host computer. Other hubs will provide as much power as you
want out of each port. Please also be aware that some hubs will backfeed the
Raspberry Pi. This means that the hub will power the Raspberry Pi through its
USB input cable, without the need for a separate micro-USB power cable,
and will bypass the voltage protection. If you are using a hub that backfeeds to the
Raspberry Pi and the hub experiences a power surge, your Raspberry Pi could
potentially be damaged.

USB

PAGE CONTENTS

Overview
Supported Devices
General Limitations
Port Power Limits
Known Issues
Troubleshooting

OVERVIEW
The Raspberry Pi Model B is equipped with two USB2.0 ports. These are
connected to the LAN9512 combo hub/Ethernet chip IC3, which is itself a USB
device connected to the single upstream USB port on BCM2835.
On the Model A, the single USB2.0 port is directly wired to BCM2835.
The USB ports enable the attachment of peripherals such as keyboards, mice, and
webcams that provide the Pi with additional functionality.
There are some differences between the USB hardware on the Raspberry Pi and
the USB hardware on desktop computers or laptop/tablet devices.
The USB host port inside the Pi is an On-The-Go (OTG) host as the application
processor powering the Pi, BCM2835, was originally intended to be used in the
mobile market: i.e. as the single USB port on a phone for connection to a PC, or to
a single device. In essence, the OTG hardware is simpler than the equivalent
hardware on a PC.
OTG in general supports communication to all types of USB device, but to provide
an adequate level of functionality for most of the USB devices that one might plug
into a Pi, the system software has to do more work.

SUPPORTED DEVICES
In general, every device supported by Linux can be used with the Pi, subject
to a few caveats detailed further down. Linux has probably the most
comprehensive driver database for legacy hardware of any operating system (it
can lag behind for modern device support as it requires open-source drivers for
Linux to recognise the device by default).
If you have a device and wish to use it with a Pi, then plug it in. Chances are that
it'll "just work". If you are running in a graphical interface (such as the LXDE
desktop environment in Raspbian), then it's likely that an icon or similar will pop up
announcing the new device.
If the device doesn't appear to work, then refer to the Troubleshooting section.
GENERAL LIMITATIONS
The OTG hardware on Raspberry Pi has a simpler level of support for certain
devices, which may present a higher software processing overhead. The
Raspberry Pi also has only one root USB port: all traffic from all connected devices
is funnelled down this bus, which operates at a maximum speed of 480Mbps.
The USB specification defines three device speeds - Low, Full and High. Most mice
and keyboards are Low-speed, most USB sound devices are Full-speed and most
video devices (webcams or video capture) are High-speed.
Generally, there are no issues with connecting multiple High-speed USB devices to
a Pi.
The software overhead incurred when talking to Low- and Full-speed devices
means that there are soft limitations on the number of simultaneously active Low-
and Full-speed devices. Small numbers of these types of devices connected to a Pi
will cause no issues.

PORT POWER LIMITS


USB devices have defined power requirements, in units of 100mA from 100mA to
500mA. The device advertises its own power requirements to the USB host when it
is first connected. In theory, the actual power consumed by the device should not
exceed its stated requirement.
The USB ports on a Raspberry Pi have a design loading of 100mA each - sufficient
to drive "low-power" devices such as mice and keyboards. Devices such as WiFi
adapters, USB hard drives, USB pen drives all consume much more current and
should be powered from an external hub with its own power supply. While it is
possible to plug a 500mA device into a Pi and have it work with a sufficiently
powerful supply, reliable operation is not guaranteed.
In addition, hotplugging high-power devices into the Pi's USB ports may cause a
brownout which can cause the Pi to reset.
See Power for more information.

DEVICES WITH KNOWN ISSUES

1. Interoperability between the Raspberry Pi and USB3.0 hubs
There is an issue with USB3.0 hubs in conjunction with the use of Full- or Low-speed
devices (most mice, most keyboards) and the Raspberry Pi. A bug in most
USB3.0 hub hardware means that the Raspberry Pi cannot talk to Full- or Low-speed
devices connected to a USB3.0 hub.
USB2.0 high-speed devices, including USB2.0 hubs, operate correctly when
connected via a USB3.0 hub.
Avoid connecting Low- or Full-speed devices into a USB3.0 hub. As a workaround,
plug a USB2.0 hub into the downstream port of the USB3.0 hub and connect the
low-speed device to it, or use a USB2.0 hub between the Pi and the USB3.0 hub, then
plug low-speed devices into the USB2.0 hub.

2. USB1.1 webcams
Old webcams may be Full-speed devices. Because these devices transfer a lot of
data and incur additional software overhead, reliable operation is not guaranteed.
As a workaround, try to use the camera at a lower resolution.
3. Esoteric USB sound cards
Expensive "audiophile" sound cards typically use far more bandwidth than is
necessary to stream audio playback. Reliable operation with 96kHz/192kHz DACs
is not guaranteed.
As a workaround, forcing the output stream to be CD quality (44.1kHz/48kHz, 16-bit) will reduce the stream bandwidth to reliable levels.
4. Single-TT USB hubs
USB2.0 and 3.0 hubs have a mechanism for talking to Full- or Low-speed devices
connected to their downstream ports called a Transaction Translator (TT). This device
buffers high-speed requests from the host (i.e. the Pi) and transmits them at Full-
or Low-speed to the downstream device. Two configurations of hub are allowed by
the USB specification: Single-TT (one TT for all ports) and Multi-TT (one TT per
port).
Because of the OTG hardware limitations, if too many Full- or Low-speed devices
are plugged into a single-TT hub, unreliable operation of the devices may occur. It
is recommended to use a Multi-TT hub to interface with multiple lower-speed
devices.
As a workaround, spread lower-speed devices out between the Pi's own USB port
and the single-TT hub.

TROUBLESHOOTING
IF YOUR DEVICE DOESN'T WORK AT ALL
The first step is to see if it is detected at all. There are two commands that can be
entered into a terminal for this: lsusb and dmesg . The first will print out all
devices attached to USB, whether they are actually recognised by a device driver
or not, and the second will print out the kernel message buffer (which can be quite
big after booting - try doing sudo dmesg -C, then plug in your device and
retype dmesg to see new messages).
As an example with a USB pendrive:
pi@raspberrypi ~ $ lsusb
Bus 001 Device 002: ID 0424:9512 Standard Microsystems Corp.
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 001 Device 003: ID 0424:ec00 Standard Microsystems Corp.
Bus 001 Device 005: ID 05dc:a781 Lexar Media, Inc.
pi@raspberrypi ~ $ dmesg
... Stuff that happened before ...
[ 8904.228539] usb 1-1.3: new high speed USB device number 5 using dwc_otg
[ 8904.332308] usb 1-1.3: New USB device found, idVendor=05dc, idProduct=a781
[ 8904.332347] usb 1-1.3: New USB device strings: Mfr=1, Product=2, SerialNumber=3
[ 8904.332368] usb 1-1.3: Product: JD Firefly
[ 8904.332386] usb 1-1.3: Manufacturer: Lexar
[ 8904.332403] usb 1-1.3: SerialNumber: AACU6B4JZVH31337
[ 8904.336583] usb-storage 1-1.3:1.0: USB Mass Storage device detected
[ 8904.337483] scsi1 : usb-storage 1-1.3:1.0
[ 8908.114261] scsi 1:0:0:0: Direct-Access Lexar JD Firefly 0100 PQ: 0 ANSI: 0 CCS
[ 8908.185048] sd 1:0:0:0: [sda] 4048896 512-byte logical blocks: (2.07 GB/1.93 GiB)
[ 8908.186152] sd 1:0:0:0: [sda] Write Protect is off
[ 8908.186194] sd 1:0:0:0: [sda] Mode Sense: 43 00 00 00
[ 8908.187274] sd 1:0:0:0: [sda] No Caching mode page present
[ 8908.187312] sd 1:0:0:0: [sda] Assuming drive cache: write through
[ 8908.205534] sd 1:0:0:0: [sda] No Caching mode page present
[ 8908.205577] sd 1:0:0:0: [sda] Assuming drive cache: write through
[ 8908.207226]  sda: sda1
[ 8908.213652] sd 1:0:0:0: [sda] No Caching mode page present
[ 8908.213697] sd 1:0:0:0: [sda] Assuming drive cache: write through
[ 8908.213724] sd 1:0:0:0: [sda] Attached SCSI removable disk

In this case, there are no error messages in dmesg and the pendrive is detected
by the usb-storage driver. If your device did not have a driver available, then
typically only the first 6 new lines will appear in the dmesg printout.
If a device enumerates without any errors, but doesn't appear to do anything, then
it is likely there are no drivers installed for it. Search around, based on the
manufacturer's name for the device or the USB IDs that are displayed in lsusb (e.g.
05dc:a781). The device may not be supported with default Linux drivers - and you
may need to download or compile your own third-party software.
IF YOUR DEVICE HAS INTERMITTENT BEHAVIOUR

Poor quality power is the most common cause of devices not working,
disconnecting or generally being unreliable.

If you are using an externally powered hub, try swapping the power adapter supplied
with the hub for another compatible power supply with the same voltage rating and
polarity.

Check to see if the problem resolves itself if you remove other devices from the
hub's downstream ports.

Temporarily plug the device directly into the Pi and see if the behaviour improves.

Powerful Vector Floating Point Unit

Fully IEEE 754-compliant mode

Rare cases bounced to software support code for full IEEE 754 support

Suited to safety-critical applications like Automotive and Avionics

RunFast mode for fast, near-compliant hardware execution of all instructions

No software support needed; performance increased as no software is involved

Suited to consumer applications like 3D graphics

Compiler automatically targets VFP

Functions supported in hardware:

Multiplication, addition, and multiply-accumulate (various variants)

Division and square root (multi-cycle, not pipelined)

Comparisons and format conversions

Operations can be performed on short vectors (from assembler only)

Separate pipelines allow load/store and MAC operations to occur simultaneously
with divide/square root unit operation

VFP may be powered down when not in use

Clock gated and/or power completely removed

VFP has its own additional register set

32 single-precision registers

16 double-precision registers

Sufficient to double-buffer algorithms

Linpack, Filters, Graphics transforms, etc.

Secure Computing Environment with TrustZone technology

Secures against software-only "hack attacks" and most low budget hardware
"shack attacks"

Delivers two "virtual" cores with deep separation of context & data.
S/W and JTAG attacks cannot enter the secure domain

TrustZone: two virtualized CPUs in one

CPU MHz/resources are dynamically shared between the two domains according to
demands

The two isolated domains are implemented in the same machine with no
duplication of HW

Simpler and more flexible platform designs, lower costs and high
power/performance efficiency

GPIO
General Purpose Input/Output pins on the Raspberry Pi

OVERVIEW
This page expands on the technical features of the GPIO pins available on
BCM2835 in general. For usage examples, see the GPIO Usage section. When
reading this page, reference should be made to the BCM2835 ARM
Peripherals Datasheet, section 6.
GPIO pins can be configured as either general-purpose input, general-purpose
output or as one of up to 6 special alternate settings, the functions of which are
pin-dependent.
There are 3 GPIO banks on BCM2835.
Each of the 3 banks has its own VDD input pin. On Raspberry Pi, all GPIO banks
are supplied from 3.3V. Connection of a GPIO to a voltage higher than 3.3V will
likely destroy the GPIO block within the SoC.
A selection of pins from Bank 0 is available on the P1 header on Raspberry Pi.

GPIO PADS
The GPIO connections on the BCM2835 package are sometimes referred to in the
peripherals datasheet as "pads" - a semiconductor design term meaning "chip
connection to outside world".
The pads are configurable CMOS push-pull output drivers/input buffers. Register-based control settings are available for

Internal pull-up / pull-down enable/disable

Output drive strength

Input Schmitt-trigger filtering

POWER-ON STATES
All GPIOs revert to general-purpose inputs on power-on reset. The default pull
states are also applied, which are detailed in the alternate function table in the
ARM peripherals datasheet. Most GPIOs have a default pull applied.

INTERRUPTS
Each GPIO pin, when configured as a general-purpose input, can be configured as
an interrupt source to the ARM. Several interrupt generation sources are
configurable:

Level-sensitive (high/low)

Rising/falling edge

Asynchronous rising/falling edge

Level interrupts maintain the interrupt status until the level has been cleared by
system software (e.g. by servicing the attached peripheral generating the interrupt).
The normal rising/falling edge detection has a small amount of synchronisation
built into the detection. At the system clock frequency, the pin is sampled with the
criteria for generation of an interrupt being a stable transition within a 3-cycle
window, i.e. a record of "1 0 0" or "0 1 1". Asynchronous detection bypasses this
synchronisation to enable the detection of very narrow events.
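The "1 0 0" / "0 1 1" sampling rule can be modelled in a few lines of Python. This is purely an illustrative model of the synchronised detector, not driver code:

```python
def detect_edges(samples):
    """Scan per-cycle pin samples; report edges seen as a stable 3-cycle window.

    A rising edge needs "0 1 1" and a falling edge needs "1 0 0", mirroring the
    synchronised detection described above.
    """
    edges = []
    for i in range(len(samples) - 2):
        window = tuple(samples[i:i + 3])
        if window == (0, 1, 1):
            edges.append((i + 1, "rising"))
        elif window == (1, 0, 0):
            edges.append((i + 1, "falling"))
    return edges

# A stable low-high-low pulse produces one rising and one falling edge:
print(detect_edges([0, 0, 1, 1, 1, 0, 0]))   # → [(2, 'rising'), (5, 'falling')]

# A one-cycle glitch never satisfies "0 1 1", so its rising edge is filtered out:
print(detect_edges([0, 1, 0, 0, 0]))         # → [(2, 'falling')]
```

Asynchronous detection, by contrast, would report even the one-cycle pulse, which is why it is offered for catching very narrow events.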

ALTERNATIVE FUNCTIONS
Almost all of the GPIO pins have alternative functions. Peripheral blocks internal to
BCM2835 can be selected to appear on one or more of a set of GPIO pins; for
example, the I2C busses can be configured to appear in at least 3 separate locations. Pad
control, such as drive strength or Schmitt filtering, still applies when the pin is
configured as an alternate function.

For more detailed information, see the Low level peripherals page on the elinux wiki.

SPI
PAGE CONTENTS

Overview

Software

Hardware

Linux driver

Troubleshooting

OVERVIEW
The Raspberry Pi is equipped with one SPI bus that has 2 chip selects.
The SPI master driver is disabled by default on Raspbian. To enable it, remove the
blacklisting for spi-bcm2708 in /etc/modprobe.d/raspi-blacklist.conf, or
use raspi-config. Reboot or load the driver manually with:
$ sudo modprobe spi-bcm2708

The SPI bus is available on the P1 Header:


MOSI - P1-19
MISO - P1-21
SCLK - P1-23    CE0 - P1-24
GND  - P1-25    CE1 - P1-26

SOFTWARE

WIRINGPI
WiringPi includes a library which can make it easier to use the Raspberry Pi's on-board SPI interface. It accesses the hardware registers directly.
http://wiringpi.com/
BCM2835 LIBRARY
This is a C library for Raspberry Pi (RPi). It provides access to GPIO and other IO
functions on the Broadcom BCM 2835 chip. Accesses the hardware registers
directly.
http://www.airspayce.com/mikem/bcm2835/
USE SPIDEV FROM C
There's a loopback test program in the Linux documentation that can be used as a
starting point. See the Troubleshooting section. It uses the Linux spidev driver to
access the bus.
SHELL
# Write binary 1, 2 and 3
echo -ne "\x01\x02\x03" > /dev/spidev0.0

HARDWARE
The BCM2835 on the Raspberry Pi has 3 SPI Controllers. Only the SPI0 controller
is available on the header. Chapter 10 in the BCM2835 ARM Peripherals datasheet
describes this controller.
MASTER MODES
Signal name abbreviations

SCLK - Serial CLocK
CE   - Chip Enable (often called Chip Select)
MOSI - Master Out Slave In
MISO - Master In Slave Out
MOMI - Master Out Master In
MIMO - Master In Master Out

STANDARD MODE

In Standard SPI master mode the peripheral implements the standard 3 wire serial
protocol (SCLK, MOSI and MISO).
BIDIRECTIONAL MODE

In bidirectional SPI master mode the same SPI standard is implemented except
that a single wire is used for data (MIMO) instead of two as in standard mode
(MISO and MOSI).
LOSSI MODE (LOW SPEED SERIAL INTERFACE)

The LoSSI standard allows issuing of commands to peripherals (LCD) and
transferring data to and from them. LoSSI commands and parameters are 8 bits long,
but an extra bit is used to indicate whether the byte is a command or
parameter/data. This extra bit is set high for data and low for a command. The
resulting 9-bit value is serialized to the output. LoSSI is commonly used with MIPI
DBI type C compatible LCD controllers.
Note:
Some commands trigger an automatic read by the SPI controller, so this mode
can't be used as a multipurpose 9-bit SPI.
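The 9-bit framing rule can be sketched in Python (an illustrative encoding only; lossi_word is an invented name, not part of any driver API):

```python
def lossi_word(byte, is_data):
    """Frame an 8-bit payload as a 9-bit LoSSI word.

    Bit 8 is set high for data/parameter bytes and low for command bytes;
    the low 8 bits carry the payload.
    """
    if not 0 <= byte <= 0xFF:
        raise ValueError("payload must fit in 8 bits")
    return (1 << 8) | byte if is_data else byte

print(bin(lossi_word(0x2C, is_data=False)))  # command:   0b101100
print(bin(lossi_word(0xFF, is_data=True)))   # data byte: 0b111111111
```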
TRANSFER MODES

Polled

Interrupt

DMA

SPEED
The CDIV (Clock Divider) field of the CLK register sets the SPI clock speed:

SCLK = Core Clock / CDIV

If CDIV is set to 0, the divisor is 65536. The divisor must be a power of 2.
Odd numbers are rounded down. The maximum SPI clock rate is that of the APB clock.

Errata: "must be a power of 2" probably should be "must be a multiple of 2"
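Assuming the usual 250 MHz core clock, the divider rule can be sketched as follows (illustrative only; the constant and function names are this example's own):

```python
CORE_CLOCK_HZ = 250000000   # assumed core clock; actual boards can be configured differently

def spi_clock_hz(cdiv):
    """SCLK = Core Clock / CDIV, with the quirks described above."""
    cdiv &= ~1              # odd values are rounded down to the next even value
    if cdiv == 0:           # a CDIV of 0 selects the maximum divisor, 65536
        cdiv = 65536
    return CORE_CLOCK_HZ / cdiv

print(spi_clock_hz(4))      # → 62500000.0 (62.5 MHz)
print(spi_clock_hz(5))      # CDIV rounds down to 4, so also 62500000.0
print(spi_clock_hz(0))      # → 3814.697265625 (250 MHz / 65536)
```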


See the Linux driver section for more info.
CHIP SELECT
Setup and Hold times related to the automatic assertion and de-assertion of the CS
lines when operating in DMA mode are as follows:

The CS line will be asserted at least 3 core clock cycles before the msb of the first
byte of the transfer.

The CS line will be de-asserted no earlier than 1 core clock cycle after the trailing
edge of the final clock pulse.

LINUX DRIVER
The default Linux driver is spi-bcm2708.
The following information was valid 2014-07-05.
SPEED
The driver supports the following speeds:

cdiv    speed
2       125.0 MHz
4       62.5 MHz
8       31.2 MHz
16      15.6 MHz
32      7.8 MHz
64      3.9 MHz
128     1953 kHz
256     976 kHz
512     488 kHz
1024    244 kHz
2048    122 kHz
4096    61 kHz
8192    30.5 kHz
16384   15.2 kHz
32768   7629 Hz

When asking for say 24 MHz, the actual speed will be 15.6 MHz.
Forum post: SPI has more speeds
SUPPORTED MODE BITS
SPI_CPOL    - Clock polarity
SPI_CPHA    - Clock phase
SPI_CS_HIGH - Chip Select active high
SPI_NO_CS   - 1 device per bus, no Chip Select

Bidirectional mode is currently not supported.


SUPPORTED BITS PER WORD

8 - Normal

9 - This is supported using LoSSI mode.

TRANSFER MODES
Only interrupt mode is supported.
DEPRECATED WARNING
The following appears in the kernel log:
bcm2708_spi bcm2708_spi.0: master is unqueued, this is deprecated

SPI DRIVER LATENCY
This thread discusses latency problems.
DMA CAPABLE DRIVER
This is a fork of spi-bcm2708 which enables DMA support for SPI client drivers that
support DMA.
https://github.com/notro/spi-bcm2708 (wiki)

TROUBLESHOOTING
LOOPBACK TEST
This can be used to test SPI send and receive. Put a wire between MOSI and
MISO. It does not test CE0 and CE1.
wget https://raw.githubusercontent.com/raspberrypi/linux/rpi-3.10.y/Documentation/spi/spidev_test.c
gcc -o spidev_test spidev_test.c
./spidev_test -D /dev/spidev0.0
spi mode: 0
bits per word: 8
max speed: 500000 Hz (500 KHz)

FF FF FF FF FF FF
40 00 00 00 00 95
FF FF FF FF FF FF
FF FF FF FF FF FF
FF FF FF FF FF FF
DE AD BE EF BA AD
F0 0D

If you get compilation errors, try the latest version instead:


wget https://raw.github.com/torvalds/linux/master/Documentation/spi/spidev_test.c

How to Use Background Subtraction Methods

Background subtraction (BS) is a common and widely used technique for
generating a foreground mask (namely, a binary image containing the pixels
belonging to moving objects in the scene) by using static cameras.

As the name suggests, BS calculates the foreground mask by performing a
subtraction between the current frame and a background model, containing
the static part of the scene or, more generally, everything that can be
considered background given the characteristics of the observed scene.

Background modeling consists of two main steps:
1. Background Initialization;
2. Background Update.
In the first step, an initial model of the background is computed, while in the
second step that model is updated in order to adapt to possible changes in
the scene.

In this tutorial we will learn how to perform BS by using OpenCV. As input,
we will use data coming from the publicly available data set Background
Models Challenge (BMC).

Goals
In this tutorial you will learn how to:
1. Read data from videos by using VideoCapture or image sequences by
using imread;
2. Create and update the background model by using
the BackgroundSubtractor class;
3. Get and show the foreground mask by using imshow;
4. Save the output by using imwrite to quantitatively evaluate the results.

Code
In the following you can find the source code. We will let the user choose to process
either a video file or a sequence of images.

Two different methods are used to generate two foreground masks:


1. MOG
2. MOG2

The results as well as the input data are shown on the screen.
//opencv
#include <opencv2/highgui/highgui.hpp>
#include <opencv2/video/background_segm.hpp>
//C
#include <stdio.h>
//C++
#include <iostream>
#include <sstream>
using namespace cv;


using namespace std;
//global variables
Mat frame; //current frame
Mat fgMaskMOG; //fg mask generated by MOG method
Mat fgMaskMOG2; //fg mask generated by MOG2 method
Ptr<BackgroundSubtractor> pMOG; //MOG Background subtractor
Ptr<BackgroundSubtractor> pMOG2; //MOG2 Background subtractor
int keyboard;
//function declarations
void help();
void processVideo(char* videoFilename);
void processImages(char* firstFrameFilename);
void help()
{
cout
<< "--------------------------------------------------------------------------" << endl
<< "This program shows how to use background subtraction methods provided by " << endl
<< " OpenCV. You can process both videos (-vid) and images (-img)." << endl
<< endl
<< "Usage:" << endl
<< "./bs {-vid <video filename>|-img <image filename>}" << endl
<< "for example: ./bs -vid video.avi" << endl
<< "or: ./bs -img /data/images/1.png" << endl
<< "--------------------------------------------------------------------------" << endl
<< endl;
}
int main(int argc, char* argv[])
{

//print help information
help();
//check for the input parameter correctness
if(argc != 3) {
cerr <<"Incorrect input list" << endl;
cerr <<"exiting..." << endl;
return EXIT_FAILURE;
}
//create GUI windows
namedWindow("Frame");
namedWindow("FG Mask MOG");
namedWindow("FG Mask MOG 2");
//create Background Subtractor objects
pMOG = createBackgroundSubtractorMOG(); //MOG approach
pMOG2 = createBackgroundSubtractorMOG2(); //MOG2 approach
if(strcmp(argv[1], "-vid") == 0) {
//input data coming from a video
processVideo(argv[2]);
}
else if(strcmp(argv[1], "-img") == 0) {
//input data coming from a sequence of images
processImages(argv[2]);
}
else {
//error in reading input parameters
cerr <<"Please, check the input parameters." << endl;
cerr <<"Exiting..." << endl;
return EXIT_FAILURE;
}
//destroy GUI windows
destroyAllWindows();
return EXIT_SUCCESS;
}
void processVideo(char* videoFilename) {
//create the capture object
VideoCapture capture(videoFilename);
if(!capture.isOpened()){
//error in opening the video input
cerr << "Unable to open video file: " << videoFilename << endl;
exit(EXIT_FAILURE);
}
//read input data. ESC or 'q' for quitting
while( (char)keyboard != 'q' && (char)keyboard != 27 ){
//read the current frame
if(!capture.read(frame)) {
cerr << "Unable to read next frame." << endl;
cerr << "Exiting..." << endl;
exit(EXIT_FAILURE);
}

//update the background model
pMOG->apply(frame, fgMaskMOG);
pMOG2->apply(frame, fgMaskMOG2);
//get the frame number and write it on the current frame
stringstream ss;
rectangle(frame, cv::Point(10, 2), cv::Point(100,20),
cv::Scalar(255,255,255), -1);
ss << capture.get(CAP_PROP_POS_FRAMES);
string frameNumberString = ss.str();
putText(frame, frameNumberString.c_str(), cv::Point(15, 15),
FONT_HERSHEY_SIMPLEX, 0.5 , cv::Scalar(0,0,0));
//show the current frame and the fg masks
imshow("Frame", frame);
imshow("FG Mask MOG", fgMaskMOG);
imshow("FG Mask MOG 2", fgMaskMOG2);
//get the input from the keyboard
keyboard = waitKey( 30 );
}
//delete capture object
capture.release();
}
void processImages(char* firstFrameFilename) {
//read the first file of the sequence
frame = imread(firstFrameFilename);
if(!frame.data){
//error in opening the first image
cerr << "Unable to open first image frame: " << firstFrameFilename << endl;
exit(EXIT_FAILURE);
}
//current image filename
string fn(firstFrameFilename);
//read input data. ESC or 'q' for quitting
while( (char)keyboard != 'q' && (char)keyboard != 27 ){
//update the background model
pMOG->apply(frame, fgMaskMOG);
pMOG2->apply(frame, fgMaskMOG2);
//get the frame number and write it on the current frame
size_t index = fn.find_last_of("/");
if(index == string::npos) {
index = fn.find_last_of("\\");
}
size_t index2 = fn.find_last_of(".");
string prefix = fn.substr(0,index+1);
string suffix = fn.substr(index2);
string frameNumberString = fn.substr(index+1, index2-index-1);
istringstream iss(frameNumberString);
int frameNumber = 0;
iss >> frameNumber;
rectangle(frame, cv::Point(10, 2), cv::Point(100,20),
cv::Scalar(255,255,255), -1);
putText(frame, frameNumberString.c_str(), cv::Point(15, 15),
FONT_HERSHEY_SIMPLEX, 0.5 , cv::Scalar(0,0,0));
//show the current frame and the fg masks

imshow("Frame", frame);
imshow("FG Mask MOG", fgMaskMOG);
imshow("FG Mask MOG 2", fgMaskMOG2);
//get the input from the keyboard
keyboard = waitKey( 30 );
//search for the next image in the sequence
ostringstream oss;
oss << (frameNumber + 1);
string nextFrameNumberString = oss.str();
string nextFrameFilename = prefix + nextFrameNumberString + suffix;
//read the next frame
frame = imread(nextFrameFilename);
if(!frame.data){
//error in opening the next image in the sequence
cerr << "Unable to open image frame: " << nextFrameFilename << endl;
exit(EXIT_FAILURE);
}
//update the path of the current frame
fn.assign(nextFrameFilename);
}
}

The source file can be downloaded here.

Explanation
We discuss the main parts of the above code:
1. First, three Mat objects are allocated to store the current frame and two
foreground masks, obtained by using two different BS algorithms.
2. Mat frame; //current frame
3. Mat fgMaskMOG; //fg mask generated by MOG method
4. Mat fgMaskMOG2; //fg mask generated by MOG2 method

5. Two BackgroundSubtractor objects will be used to generate the foreground
masks. In this example, default parameters are used, but it is also possible
to declare specific parameters in the create function.
6. Ptr<BackgroundSubtractor> pMOG; //MOG Background subtractor
7. Ptr<BackgroundSubtractor> pMOG2; //MOG2 Background subtractor
8. ...
9. //create Background Subtractor objects
10. pMOG = createBackgroundSubtractorMOG(); //MOG approach
11. pMOG2 = createBackgroundSubtractorMOG2(); //MOG2 approach

12. The command line arguments are analysed. The user can choose between
two options:
o video files (by choosing the option -vid);
o image sequences (by choosing the option -img).

13. if(strcmp(argv[1], "-vid") == 0) {


14. //input data coming from a video
15. processVideo(argv[2]);
16. }
17. else if(strcmp(argv[1], "-img") == 0) {
18. //input data coming from a sequence of images
19. processImages(argv[2]);
20. }

21. Suppose you want to process a video file. The video is read until the end is
reached or the user presses the q key or the ESC key.
22. while( (char)keyboard != 'q' && (char)keyboard != 27 ){
23. //read the current frame
24. if(!capture.read(frame)) {
25. cerr << "Unable to read next frame." << endl;
26. cerr << "Exiting..." << endl;
27. exit(EXIT_FAILURE);
28. }

29. Every frame is used both for calculating the foreground mask and for
updating the background. If you want to change the learning rate used for
updating the background model, it is possible to set a specific learning rate
by passing a third parameter to the apply method.
30. //update the background model
31. pMOG->apply(frame, fgMaskMOG);
32. pMOG2->apply(frame, fgMaskMOG2);

33. The current frame number can be extracted from the VideoCapture object
and stamped in the top left corner of the current frame. A white rectangle is
used to highlight the black colored frame number.
34. //get the frame number and write it on the current frame
35. stringstream ss;
36. rectangle(frame, cv::Point(10, 2), cv::Point(100,20),
37. cv::Scalar(255,255,255), -1);
38. ss << capture.get(CAP_PROP_POS_FRAMES);
39. string frameNumberString = ss.str();
40. putText(frame, frameNumberString.c_str(), cv::Point(15, 15),
41. FONT_HERSHEY_SIMPLEX, 0.5 , cv::Scalar(0,0,0));

42. We are ready to show the current input frame and the results.
43. //show the current frame and the fg masks
44. imshow("Frame", frame);
45. imshow("FG Mask MOG", fgMaskMOG);
46. imshow("FG Mask MOG 2", fgMaskMOG2);

47. The same operations listed above can be performed using a sequence of
images as input. The processImages function is called and, instead of using
a VideoCapture object, the images are read by using imread, after
determining the correct path for the next frame to read.
48. //read the first file of the sequence
49. frame = imread(firstFrameFilename);
50. if(!frame.data){
51. //error in opening the first image
52. cerr << "Unable to open first image frame: " << firstFrameFilename << endl;
53. exit(EXIT_FAILURE);
54. }
55. ...
56. //search for the next image in the sequence
57. ostringstream oss;
58. oss << (frameNumber + 1);
59. string nextFrameNumberString = oss.str();
60. string nextFrameFilename = prefix + nextFrameNumberString + suffix;
61. //read the next frame
62. frame = imread(nextFrameFilename);
63. if(!frame.data){
64. //error in opening the next image in the sequence
65. cerr << "Unable to open image frame: " << nextFrameFilename << endl;
66. exit(EXIT_FAILURE);
67. }
68. //update the path of the current frame
69. fn.assign(nextFrameFilename);

Note that this example works only on image sequences in which the
filename format is <n>.png, where n is the frame number (e.g., 7.png).

Results

Given the following input parameters:

-vid Video_001.avi

The output of the program will look like the following:

The video file Video_001.avi is part of the Background Models Challenge
(BMC) data set and it can be downloaded from the following
link Video_001 (about 32 MB).

If you want to process a sequence of images, then the -img option has to
be chosen:

-img 111_png/input/1.png

The output of the program will look like the following:

The sequence of images used in this example is part of the Background
Models Challenge (BMC) dataset and it can be downloaded from the
following link sequence 111 (about 708 MB). Please note that this example
works only on sequences in which the filename format is <n>.png, where n
is the frame number (e.g., 7.png).

Evaluation
To quantitatively evaluate the results obtained, we need to:

Save the output images;


Have the ground truth images for the chosen sequence.

In order to save the output images, we can use imwrite. Adding the following code
allows for saving the foreground masks.
string imageToSave = "output_MOG_" + frameNumberString + ".png";
bool saved = imwrite(imageToSave, fgMaskMOG);
if(!saved) {
cerr << "Unable to save " << imageToSave << endl;
}

Once we have collected the result images, we can compare them with the ground
truth data. There exist several publicly available sequences for background
subtraction that come with ground truth data. If you decide to use the Background
Models Challenge (BMC), then the result images can be used as input for the BMC
Wizard. The wizard can compute different measures about the accuracy of the
results.

References

Background Models Challenge (BMC) website, http://bmc.univ-bpclermont.fr/


Antoine Vacavant, Thierry Chateau, Alexis Wilhelm and Laurent Lequievre. A
Benchmark Dataset for Foreground/Background Extraction. In ACCV 2012,
Workshop: Background Models Challenge, LNCS 7728, 291-300. November
2012, Daejeon, Korea.
