OpenCV Installation


This page provides a brief introduction to building OpenCV on ARM, Linux, and Windows (with FFmpeg support).

Table of Contents


Prerequisites

  1. OpenCV source
  2. FFmpeg source
  3. Xilinx tools
  4. CMake

Install OpenCV on ARM

Step 1 Build FFmpeg (cross-compile)


In the FFmpeg source directory:
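A cross-compile configuration might look like the sketch below. The toolchain prefix arm-xilinx-linux-gnueabi- and the install prefix are assumptions; substitute the ones from your own environment:

```shell
# Toolchain prefix and install prefix are placeholders -- adjust to your setup.
./configure --enable-shared --disable-static \
    --enable-cross-compile --target-os=linux --arch=arm \
    --cross-prefix=arm-xilinx-linux-gnueabi- \
    --prefix=/home/user/ffmpeg_arm_install
make
make install
```

Shared libraries (--enable-shared) are used here so that OpenCV can link against FFmpeg dynamically; a static build is also possible if your deployment prefers a single binary.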
After installing FFmpeg, add its library and header paths to the environment so the OpenCV build can find it:
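For example (the install location is a placeholder; use the prefix FFmpeg was actually installed to):

```shell
# Hypothetical install location; must match FFmpeg's install prefix.
export FFMPEG_INSTALL=/home/user/ffmpeg_arm_install
export LD_LIBRARY_PATH=$FFMPEG_INSTALL/lib:$LD_LIBRARY_PATH
export PKG_CONFIG_PATH=$FFMPEG_INSTALL/lib/pkgconfig:$PKG_CONFIG_PATH
export C_INCLUDE_PATH=$FFMPEG_INSTALL/include:$C_INCLUDE_PATH
```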

Step 2 Edit a toolchain file


Create a toolchain file (e.g. named toolchain.make) and edit it:
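A minimal toolchain file might look like the following sketch. The compiler names and the find-root path are assumptions (they must match your cross toolchain and FFmpeg install location):

```cmake
# Target system description.
set(CMAKE_SYSTEM_NAME Linux)
set(CMAKE_SYSTEM_PROCESSOR arm)

# Hypothetical cross-compiler names -- substitute your toolchain's.
set(CMAKE_C_COMPILER arm-xilinx-linux-gnueabi-gcc)
set(CMAKE_CXX_COMPILER arm-xilinx-linux-gnueabi-g++)

# Search headers and libraries only in the target environment
# (e.g. where the cross-compiled FFmpeg was installed).
set(CMAKE_FIND_ROOT_PATH /home/user/ffmpeg_arm_install)
set(CMAKE_FIND_ROOT_PATH_MODE_PROGRAM NEVER)
set(CMAKE_FIND_ROOT_PATH_MODE_LIBRARY ONLY)
set(CMAKE_FIND_ROOT_PATH_MODE_INCLUDE ONLY)
```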

Step 3 Generate makefile

Check the configuration information of OpenCV to confirm the FFmpeg library is detected (for example):
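The generation step might look like the sketch below, using the curses interface (ccmake) so that options can be toggled interactively in the next step; all paths are placeholders:

```shell
# Run from an empty build directory; all paths are placeholders.
ccmake -D CMAKE_TOOLCHAIN_FILE=/path/to/toolchain.make \
       -D CMAKE_INSTALL_PREFIX=/path/to/opencv_arm_install \
       /path/to/opencv_source
# In the configuration summary, the Video I/O section should report
# something like "FFMPEG: YES"; a "NO" means the FFmpeg paths were not found.
```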

Step 4 Customize build options

Check whether your application needs support from any other cross-compiled libraries.
For the fewest dependencies, make sure the following items are OFF. If an item is ON, toggle it by scrolling to the option and pressing Enter.
  • WITH_1394
  • WITH_CUDA
  • WITH_CUFFT
  • WITH_EIGEN
  • WITH_GSTREAMER
  • WITH_GTK
  • WITH_JASPER
  • WITH_JPEG
  • WITH_OPENEXR
  • WITH_PNG
  • WITH_PVAPI
  • WITH_QT
  • WITH_TBB
  • WITH_TIFF
  • WITH_UNICAP
  • WITH_V4L
  • WITH_XINE

Press 'c' to configure, then 'g' to generate the new Makefile.
Or, you may edit CMakeCache.txt to modify build options.
Note: make sure WITH_FFMPEG=ON in CMakeCache.txt to enable FFmpeg support. Here is an example of my CMakeCache.txt.

Step 5 Build (cross-compile)
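The build itself is the usual make sequence, run in the build directory:

```shell
# "make install" copies libraries and headers to CMAKE_INSTALL_PREFIX.
make
make install
```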


Step 6 Test



Here is a simple example of an OpenCV application on ARM. It reads the first frame of a video file (H.264 encoded) and saves it to a bitmap file.
Test it on ARM with:
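For example (library paths and the clip name are illustrative, and the program is assumed to take the clip name as its argument):

```shell
# On the target board, after copying the binary, the shared libraries,
# and a test clip over:
export LD_LIBRARY_PATH=/mnt/opencv/lib:/mnt/ffmpeg/lib:$LD_LIBRARY_PATH
./test test.264
```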
source code of test.cpp:
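The original attachment is not reproduced here; a minimal program with the described behavior (read the first frame of an H.264 clip and save it as a bitmap) might look like the following sketch, using the OpenCV C++ API. The default file names are assumptions:

```cpp
#include <cstdio>
#include <opencv2/opencv.hpp>

int main(int argc, char** argv)
{
    // Hypothetical default clip name; pass the real one as argv[1].
    const char* in = (argc > 1) ? argv[1] : "test.264";

    cv::VideoCapture cap(in);          // decoded through the FFmpeg backend
    if (!cap.isOpened()) {
        std::fprintf(stderr, "Cannot open %s\n", in);
        return 1;
    }

    cv::Mat frame;
    if (!cap.read(frame)) {            // grab and decode the first frame
        std::fprintf(stderr, "Cannot read a frame from %s\n", in);
        return 1;
    }

    cv::imwrite("out.bmp", frame);     // .bmp extension selects bitmap output
    return 0;
}
```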


Install OpenCV on Linux


Step 1 Build FFmpeg


In the FFmpeg source directory:
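For a native build this might look like (the install prefix is illustrative):

```shell
# Illustrative install prefix -- pick any writable location.
./configure --enable-shared --disable-static --prefix=$HOME/ffmpeg_install
make
make install
```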
After installing FFmpeg, add its library and header paths to the environment so the OpenCV build can find it:
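For example (the path must match the --prefix FFmpeg was installed with):

```shell
# Illustrative path; must match FFmpeg's install prefix.
export FFMPEG_INSTALL=$HOME/ffmpeg_install
export LD_LIBRARY_PATH=$FFMPEG_INSTALL/lib:$LD_LIBRARY_PATH
export PKG_CONFIG_PATH=$FFMPEG_INSTALL/lib/pkgconfig:$PKG_CONFIG_PATH
```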

Step 2 Generate makefile

Check the configuration information of OpenCV to confirm the FFmpeg library is detected (for example):
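The generation step might look like the sketch below (all paths are placeholders):

```shell
# Run from an empty build directory; paths are placeholders.
cmake -D CMAKE_BUILD_TYPE=RELEASE \
      -D CMAKE_INSTALL_PREFIX=$HOME/opencv_install \
      /path/to/opencv_source
# In the configuration summary, the Video I/O section should report
# something like "FFMPEG: YES".
```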

Step 3 Customize build options


You might want to edit CMakeCache.txt to modify build options.
Note: make sure WITH_FFMPEG=ON in CMakeCache.txt to enable FFmpeg support. Here is an example of my CMakeCache.txt.

Step 4 Build
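The build is the usual make sequence, run in the build directory:

```shell
make -j4         # parallel job count is a matter of taste
make install     # installs to CMAKE_INSTALL_PREFIX
```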


Step 5 Test


Here is a simple example of an OpenCV application on Linux. It reads the first frame of a video file (H.264 encoded) and saves it to a bitmap file.
Test it on Linux by executing "test". (Note: make sure the OpenCV libraries can be found in $LD_LIBRARY_PATH.)
The source code of test.cpp is the same as in the ARM section.
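Compiling and running it might look like the sketch below. The install path is illustrative, and pkg-config is assumed to find the opencv.pc installed by the previous step (the module name "opencv" is the OpenCV 2.x convention):

```shell
# Illustrative paths; PKG_CONFIG_PATH must include <prefix>/lib/pkgconfig.
g++ test.cpp -o test $(pkg-config --cflags --libs opencv)
export LD_LIBRARY_PATH=$HOME/opencv_install/lib:$LD_LIBRARY_PATH
./test test.264
```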


Install OpenCV on Windows



Step 1 Build FFmpeg


Follow the instructions in <OpenCV_source>\3rdparty\ffmpeg\readme.txt to build FFmpeg on Windows.
Note: a 32-bit MSYS environment can be found at C:\Xilinx\Vivado_HLS\201x.x\msys; launching msys.bat there is sufficient for this job.
When the build finishes, the newly built opencv_ffmpeg.dll (32-bit) and/or opencv_ffmpeg_64.dll (64-bit) can be found in <OpenCV_source>\3rdparty\ffmpeg

Step 2 Generate makefile


Start -> Xilinx Design Tools -> Vivado HLS command prompt (this launches an MSYS shell)
Check the configuration information of OpenCV to confirm the FFmpeg library is detected (for example):
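In the MSYS shell the generation step might look like the sketch below, using CMake's "MSYS Makefiles" generator; paths are placeholders:

```shell
# In the Vivado HLS (MSYS) shell, from an empty build directory.
cd /c/opencv_build
cmake -G "MSYS Makefiles" -D WITH_FFMPEG=ON /c/opencv_source
# Confirm the configuration summary reports something like "FFMPEG: YES".
```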

Step 3 Customize build options


You might want to edit CMakeCache.txt to modify build options.
Note: make sure WITH_FFMPEG=ON in CMakeCache.txt to enable FFmpeg support. Here is an example of my CMakeCache.txt.

Step 4 Build
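As on the other platforms, the build runs in the build directory:

```shell
# Still inside the MSYS shell.
make
make install
```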

Step 5 Test


Here is a simple example of an OpenCV application on Windows. It reads the first frame of a video file (H.264 encoded) and saves it to a bitmap file.
Test it on Windows by executing "test.exe" in the Vivado HLS command prompt.
The source code of test.cpp is the same as in the ARM section.
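Compiling it in the MSYS shell might look like the sketch below. The install path is illustrative and the library names follow the OpenCV 2.x convention (imwrite and VideoCapture live in core/highgui there):

```shell
# In the Vivado HLS (MSYS) shell; paths are placeholders.
g++ test.cpp -o test.exe -I/c/opencv_install/include \
    -L/c/opencv_install/lib -lopencv_core -lopencv_highgui
./test.exe test.264
```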