Visual Servoing Platform version 3.7.0
Tutorial: Moving-edges tracking

1. Introduction

The ViSP moving-edges tracker provides real-time tracking of points sampled along the normal to object contours. Such a tracker makes it possible to track a line, an ellipse, a circle, or more complex objects using model-based approaches.

Note that all the source code mentioned in this tutorial is part of the ViSP source code.

Moving edges (ME) are points sampled along a visible contour. They are tracked along the contour normal. The following image shows an example where the contour is a line, but the approach generalizes to any other shape.

In black, the previous contour position with sampled MEs; in blue, the tracked MEs.

The main parameters for moving-edges implemented in vpMe class are as follows:

  • the range, which is the maximum distance in pixels searched on both sides of the contour along its normal, and which can be set with vpMe::setRange()
  • the sample step, which is the distance in pixels between two successive MEs, and which can be set with vpMe::setSampleStep()
  • the likelihood normalized threshold, which is the minimum gray-level contrast used to differentiate the inner part of the object from the outer part. Values are in the range 0-255. To use this normalized threshold, first call vpMe::setLikelihoodThresholdType() with the vpMe::NORMALIZED_THRESHOLD type, and then vpMe::setThreshold().

The following sample code shows how to set these parameters:

#include <visp3/me/vpMe.h>
...
int me_range = 10;
int me_sample_step = 5;
int me_threshold = 20; // Value in [0 ; 255]
...
vpMe me;
me.setRange(me_range);
me.setLikelihoodThresholdType(vpMe::NORMALIZED_THRESHOLD);
me.setThreshold(me_threshold);
me.setSampleStep(me_sample_step);

Moving edges can be used in ViSP:

  • to track a line thanks to vpMeLine class
  • to track an ellipse, a circle, or an arc of ellipse or circle thanks to vpMeEllipse class.

Note that both classes are also extended to track more complex objects with our model-based tracker implemented in vpMbGenericTracker. See the dedicated Tutorial: Markerless generic model-based tracking using a color camera.

To learn more about the other ME parameters, see the 5.2. Moving-edges settings section.

2. Moving-edges tracking

2.1. Line tracking

2.1.1. Source code

With ViSP you can track a line using moving edges. The following example code, available in tutorial-me-line-tracker.cpp, shows how to use the ViSP vpMeLine class to track a line in a live stream acquired by a camera.

Note
There is also a similar example in trackMeLine.cpp that lets you test the line tracker on a recorded video or a sequence of successive images.
#include <iostream>
#include <visp3/core/vpConfig.h>
// If openCV available, priority to OpenCV capture, otherwise the user has to modify the code uncommenting/commenting
// one of the following lines
#if defined(VISP_HAVE_OPENCV) && \
(((VISP_HAVE_OPENCV_VERSION < 0x030000) && defined(HAVE_OPENCV_HIGHGUI)) || \
((VISP_HAVE_OPENCV_VERSION >= 0x030000) && defined(HAVE_OPENCV_VIDEOIO)))
#undef VISP_HAVE_V4L2
#undef VISP_HAVE_DC1394
#undef VISP_HAVE_CMU1394
#undef VISP_HAVE_FLYCAPTURE
#undef VISP_HAVE_REALSENSE2
// #undef HAVE_OPENCV_HIGHGUI
// #undef HAVE_OPENCV_VIDEOIO
#else
// Use the first grabber that is available. Uncomment/comment the following lines to disable usage of a grabber
// #undef VISP_HAVE_V4L2
// #undef VISP_HAVE_DC1394
// #undef VISP_HAVE_CMU1394
// #undef VISP_HAVE_FLYCAPTURE
// #undef VISP_HAVE_REALSENSE2
#undef HAVE_OPENCV_HIGHGUI
#undef HAVE_OPENCV_VIDEOIO
#endif
#if (defined(VISP_HAVE_V4L2) || defined(VISP_HAVE_DC1394) || defined(VISP_HAVE_CMU1394) || \
defined(VISP_HAVE_FLYCAPTURE) || defined(VISP_HAVE_REALSENSE2) || defined(VISP_HAVE_OPENCV) && \
(((VISP_HAVE_OPENCV_VERSION < 0x030000) && defined(HAVE_OPENCV_HIGHGUI)) || \
((VISP_HAVE_OPENCV_VERSION >= 0x030000) && defined(HAVE_OPENCV_VIDEOIO))) && \
((VISP_HAVE_OPENCV_VERSION < 0x050000) && defined(HAVE_OPENCV_CALIB3D) && defined(HAVE_OPENCV_FEATURES2D)) || \
((VISP_HAVE_OPENCV_VERSION >= 0x050000) && defined(HAVE_OPENCV_3D) && defined(HAVE_OPENCV_FEATURES)))
#ifdef VISP_HAVE_MODULE_SENSOR
#include <visp3/sensor/vp1394CMUGrabber.h>
#include <visp3/sensor/vp1394TwoGrabber.h>
#include <visp3/sensor/vpFlyCaptureGrabber.h>
#include <visp3/sensor/vpRealSense2.h>
#include <visp3/sensor/vpV4l2Grabber.h>
#if (VISP_HAVE_OPENCV_VERSION < 0x030000) && defined(HAVE_OPENCV_HIGHGUI)
#include <opencv2/highgui/highgui.hpp> // for cv::VideoCapture
#elif (VISP_HAVE_OPENCV_VERSION >= 0x030000) && defined(HAVE_OPENCV_VIDEOIO)
#include <opencv2/videoio/videoio.hpp> // for cv::VideoCapture
#endif
#endif
#include <visp3/gui/vpDisplayFactory.h>
#include <visp3/me/vpMeLine.h>
int main(int argc, char **argv)
{
#ifdef ENABLE_VISP_NAMESPACE
using namespace VISP_NAMESPACE_NAME;
#endif
#if (VISP_CXX_STANDARD >= VISP_CXX_STANDARD_11)
std::shared_ptr<vpDisplay> display;
#else
vpDisplay *display = nullptr;
#endif
int opt_me_range = 10;
int opt_me_sample_step = 5;
int opt_me_threshold = 20; // Value in [0 ; 255]
for (int i = 1; i < argc; i++) {
if (std::string(argv[i]) == "--me-range" && i + 1 < argc) {
opt_me_range = std::atoi(argv[++i]);
}
else if (std::string(argv[i]) == "--me-sample-step" && i + 1 < argc) {
opt_me_sample_step = std::atoi(argv[++i]);
}
else if (std::string(argv[i]) == "--me-threshold" && i + 1 < argc) {
opt_me_threshold = std::atoi(argv[++i]);
}
else if (std::string(argv[i]) == "--help" || std::string(argv[i]) == "-h") {
std::cout << "\nUsage: " << argv[0]
<< " [--me-range <range>]"
<< " [--me-sample-step <sample step>]"
<< " [--me-threshold <threshold>]"
<< " [--help] [-h]\n"
<< std::endl;
return EXIT_SUCCESS;
}
else {
std::cout << "\nError: wrong parameter " << argv[i] << std::endl;
return EXIT_FAILURE;
}
}
try {
int opt_device = 0; // For OpenCV and V4l2 grabber to set the camera device
#if defined(VISP_HAVE_V4L2)
std::ostringstream device;
device << "/dev/video" << opt_device;
std::cout << "Use Video 4 Linux grabber on device " << device.str() << std::endl;
g.setDevice(device.str());
g.setScale(1);
g.open(I);
#elif defined(VISP_HAVE_DC1394)
(void)opt_device; // To avoid non used warning
std::cout << "Use DC1394 grabber" << std::endl;
g.open(I);
#elif defined(VISP_HAVE_CMU1394)
(void)opt_device; // To avoid non used warning
std::cout << "Use CMU1394 grabber" << std::endl;
g.open(I);
#elif defined(VISP_HAVE_FLYCAPTURE)
(void)opt_device; // To avoid non used warning
std::cout << "Use FlyCapture grabber" << std::endl;
g.open(I);
#elif defined(VISP_HAVE_REALSENSE2)
(void)opt_device; // To avoid non used warning
std::cout << "Use Realsense 2 grabber" << std::endl;
rs2::config config;
config.disable_stream(RS2_STREAM_DEPTH);
config.disable_stream(RS2_STREAM_INFRARED);
config.enable_stream(RS2_STREAM_COLOR, 640, 480, RS2_FORMAT_RGBA8, 30);
g.open(config);
g.acquire(I);
#elif defined(VISP_HAVE_OPENCV) && \
(((VISP_HAVE_OPENCV_VERSION < 0x030000) && defined(HAVE_OPENCV_HIGHGUI)) || \
((VISP_HAVE_OPENCV_VERSION >= 0x030000) && defined(HAVE_OPENCV_VIDEOIO)))
std::cout << "Use OpenCV grabber on device " << opt_device << std::endl;
cv::VideoCapture g(opt_device); // Open the default camera
if (!g.isOpened()) { // Check if we succeeded
std::cout << "Failed to open the camera" << std::endl;
return EXIT_FAILURE;
}
cv::Mat frame;
int i = 0;
while ((i++ < 20) && !g.read(frame)) {
} // warm up camera by skipping unread frames
g >> frame; // get a new frame from camera
#endif
#if defined(VISP_HAVE_V4L2) || defined(VISP_HAVE_DC1394) || defined(VISP_HAVE_CMU1394) || defined(VISP_HAVE_FLYCAPTURE) || defined(VISP_HAVE_REALSENSE2)
g.acquire(I);
#elif defined(VISP_HAVE_OPENCV) && \
(((VISP_HAVE_OPENCV_VERSION < 0x030000) && defined(HAVE_OPENCV_HIGHGUI)) || \
((VISP_HAVE_OPENCV_VERSION >= 0x030000) && defined(HAVE_OPENCV_VIDEOIO)))
g >> frame; // get a new frame from camera
#endif
#if defined(VISP_HAVE_DISPLAY)
#if (VISP_CXX_STANDARD >= VISP_CXX_STANDARD_11)
display = vpDisplayFactory::createDisplay(I, -1, -1, "Camera view");
#else
display = vpDisplayFactory::allocateDisplay(I, -1, -1, "Camera view");
#endif
#else
std::cout << "No image viewer is available..." << std::endl;
#endif
vpMe me;
me.setRange(opt_me_range);
me.setThreshold(opt_me_threshold);
me.setSampleStep(opt_me_sample_step);
std::cout << "Moving-edges settings" << std::endl;
me.print();
vpMeLine line;
line.setMe(&me);
line.initTracking(I);
bool quit = false;
while (!quit) {
#if defined(VISP_HAVE_V4L2) || defined(VISP_HAVE_DC1394) || defined(VISP_HAVE_CMU1394) || defined(VISP_HAVE_FLYCAPTURE) || defined(VISP_HAVE_REALSENSE2)
g.acquire(I);
#elif defined(VISP_HAVE_OPENCV) && \
(((VISP_HAVE_OPENCV_VERSION < 0x030000) && defined(HAVE_OPENCV_HIGHGUI)) || \
((VISP_HAVE_OPENCV_VERSION >= 0x030000) && defined(HAVE_OPENCV_VIDEOIO)))
g >> frame;
#endif
vpDisplay::displayText(I, 20, 20, "Click to quit", vpColor::red);
line.track(I);
if (vpDisplay::getClick(I, false)) {
quit = true;
}
}
}
catch (const vpException &e) {
std::cout << "Catch an exception: " << e << std::endl;
}
#if (VISP_CXX_STANDARD < VISP_CXX_STANDARD_11)
if (display != nullptr) {
delete display;
}
#endif
}
#else
int main()
{
#if defined(VISP_HAVE_OPENCV)
std::cout << "Install a 3rd party dedicated to frame grabbing (dc1394, cmu1394, v4l2, OpenCV, FlyCapture, "
<< "Realsense2), configure and build ViSP again to use this tutorial."
<< std::endl;
#else
std::cout << "Install OpenCV 3rd party, configure and build ViSP again to use this tutorial." << std::endl;
#endif
return EXIT_SUCCESS;
}
#endif

Hereafter we explain the program line by line.

The source code is built only if one of the grabbers is available. To this end we check the preprocessor macros defined in the visp3/core/vpConfig.h header.

#if (defined(VISP_HAVE_V4L2) || defined(VISP_HAVE_DC1394) || defined(VISP_HAVE_CMU1394) || \
defined(VISP_HAVE_FLYCAPTURE) || defined(VISP_HAVE_REALSENSE2) || defined(VISP_HAVE_OPENCV) && \
(((VISP_HAVE_OPENCV_VERSION < 0x030000) && defined(HAVE_OPENCV_HIGHGUI)) || \
((VISP_HAVE_OPENCV_VERSION >= 0x030000) && defined(HAVE_OPENCV_VIDEOIO))) && \
((VISP_HAVE_OPENCV_VERSION < 0x050000) && defined(HAVE_OPENCV_CALIB3D) && defined(HAVE_OPENCV_FEATURES2D)) || \
((VISP_HAVE_OPENCV_VERSION >= 0x050000) && defined(HAVE_OPENCV_3D) && defined(HAVE_OPENCV_FEATURES)))
Note
By default, if ViSP is built with the OpenCV 3rd party enabled, we use OpenCV to grab the live stream from a camera. If you would rather use a Realsense camera and grab images with the vpRealSense2 class, you may edit the code to undef all the grabbers except the one that you want to use.
#undef VISP_HAVE_V4L2
#undef VISP_HAVE_DC1394
#undef VISP_HAVE_CMU1394
#undef VISP_HAVE_FLYCAPTURE
//#undef VISP_HAVE_REALSENSE2 <- If available this is the grabber that will be used
#undef HAVE_OPENCV_HIGHGUI
#undef HAVE_OPENCV_VIDEOIO

The processed images can be acquired from various frame grabbing devices. This is made possible by including the frame grabber headers.

#include <visp3/sensor/vp1394CMUGrabber.h>
#include <visp3/sensor/vp1394TwoGrabber.h>
#include <visp3/sensor/vpFlyCaptureGrabber.h>
#include <visp3/sensor/vpRealSense2.h>
#include <visp3/sensor/vpV4l2Grabber.h>
#if (VISP_HAVE_OPENCV_VERSION < 0x030000) && defined(HAVE_OPENCV_HIGHGUI)
#include <opencv2/highgui/highgui.hpp> // for cv::VideoCapture
#elif (VISP_HAVE_OPENCV_VERSION >= 0x030000) && defined(HAVE_OPENCV_VIDEOIO)
#include <opencv2/videoio/videoio.hpp> // for cv::VideoCapture
#endif

To display these images we then include the header of the factory that permits creating a viewer.

#include <visp3/gui/vpDisplayFactory.h>

A graphical library, such as the Graphics Device Interface (GDI) on Windows or X11 on Unix-like systems, is required to have a functional display.

Finally, to track a line with the moving edges, we include the header of the vpMeLine class.

#include <visp3/me/vpMeLine.h>

Here we create a gray level image container I that will contain the images acquired by our camera.

Then we create a grabber instance: first for a USB camera under Unix if libv4l (Video 4 Linux) is available; second for a firewire camera under Unix if the libdc1394 3rd party is available; then for a firewire camera under Windows if the CMU1394 3rd party is available; next for a camera working with the FlyCapture SDK; then for a Realsense camera; and finally with OpenCV if none of the previous 3rd parties are available. The Tutorial: Image frame grabbing gives more details concerning frame grabbing.

int opt_device = 0; // For OpenCV and V4l2 grabber to set the camera device
#if defined(VISP_HAVE_V4L2)
std::ostringstream device;
device << "/dev/video" << opt_device;
std::cout << "Use Video 4 Linux grabber on device " << device.str() << std::endl;
g.setDevice(device.str());
g.setScale(1);
g.open(I);
#elif defined(VISP_HAVE_DC1394)
(void)opt_device; // To avoid non used warning
std::cout << "Use DC1394 grabber" << std::endl;
g.open(I);
#elif defined(VISP_HAVE_CMU1394)
(void)opt_device; // To avoid non used warning
std::cout << "Use CMU1394 grabber" << std::endl;
g.open(I);
#elif defined(VISP_HAVE_FLYCAPTURE)
(void)opt_device; // To avoid non used warning
std::cout << "Use FlyCapture grabber" << std::endl;
g.open(I);
#elif defined(VISP_HAVE_REALSENSE2)
(void)opt_device; // To avoid non used warning
std::cout << "Use Realsense 2 grabber" << std::endl;
rs2::config config;
config.disable_stream(RS2_STREAM_DEPTH);
config.disable_stream(RS2_STREAM_INFRARED);
config.enable_stream(RS2_STREAM_COLOR, 640, 480, RS2_FORMAT_RGBA8, 30);
g.open(config);
g.acquire(I);
#elif defined(VISP_HAVE_OPENCV) && \
(((VISP_HAVE_OPENCV_VERSION < 0x030000) && defined(HAVE_OPENCV_HIGHGUI)) || \
((VISP_HAVE_OPENCV_VERSION >= 0x030000) && defined(HAVE_OPENCV_VIDEOIO)))
std::cout << "Use OpenCV grabber on device " << opt_device << std::endl;
cv::VideoCapture g(opt_device); // Open the default camera
if (!g.isOpened()) { // Check if we succeeded
std::cout << "Failed to open the camera" << std::endl;
return EXIT_FAILURE;
}
cv::Mat frame;
int i = 0;
while ((i++ < 20) && !g.read(frame)) {
} // warm up camera by skipping unread frames
g >> frame; // get a new frame from camera
#endif

We then open the connection with the grabber and acquire an image in I.

#if defined(VISP_HAVE_V4L2) || defined(VISP_HAVE_DC1394) || defined(VISP_HAVE_CMU1394) || defined(VISP_HAVE_FLYCAPTURE) || defined(VISP_HAVE_REALSENSE2)
g.acquire(I);
#elif defined(VISP_HAVE_OPENCV) && \
(((VISP_HAVE_OPENCV_VERSION < 0x030000) && defined(HAVE_OPENCV_HIGHGUI)) || \
((VISP_HAVE_OPENCV_VERSION >= 0x030000) && defined(HAVE_OPENCV_VIDEOIO)))
g >> frame; // get a new frame from camera
#endif

To be able to display image I and the tracking results in overlay in a window, we create a display instance.

#if defined(VISP_HAVE_DISPLAY)
#if (VISP_CXX_STANDARD >= VISP_CXX_STANDARD_11)
display = vpDisplayFactory::createDisplay(I, -1, -1, "Camera view");
#else
display = vpDisplayFactory::allocateDisplay(I, -1, -1, "Camera view");
#endif
#else
std::cout << "No image viewer is available..." << std::endl;
#endif

Then we display the image in the window created previously.

We then initialize the moving-edges parameters used later by the tracker:

int opt_me_range = 10;
int opt_me_sample_step = 5;
int opt_me_threshold = 20; // Value in [0 ; 255]

From the previous position of a moving edge, we track its new position along the contour normal, within a range of 10 pixels on each side of the contour. For each pixel along the normal we compute an oriented convolution. The moving-edges algorithm selects the pixel whose contrast is higher than 20 gray levels. Between two consecutive moving edges on the contour we keep a spacing of 5 pixels.

vpMe me;
me.setRange(opt_me_range);
me.setThreshold(opt_me_threshold);
me.setSampleStep(opt_me_sample_step);

We then create an instance of the vpMeLine class that will track our line and initialize it with the previous moving-edges parameters. We could also allow the tracker to display additional information in the viewer overlay with setDisplay(). The user then has to initialize the tracker on image I by clicking on two points located on the line to track.

vpMeLine line;
line.setMe(&me);
line.initTracking(I);

Once the tracker is initialized, we enter in a while loop where we successively acquire a new image, display it, track the line, display the tracking results and finally flush the overlay drawings in the viewer.

bool quit = false;
while (!quit) {
#if defined(VISP_HAVE_V4L2) || defined(VISP_HAVE_DC1394) || defined(VISP_HAVE_CMU1394) || defined(VISP_HAVE_FLYCAPTURE) || defined(VISP_HAVE_REALSENSE2)
g.acquire(I);
#elif defined(VISP_HAVE_OPENCV) && \
(((VISP_HAVE_OPENCV_VERSION < 0x030000) && defined(HAVE_OPENCV_HIGHGUI)) || \
((VISP_HAVE_OPENCV_VERSION >= 0x030000) && defined(HAVE_OPENCV_VIDEOIO)))
g >> frame;
#endif
vpDisplay::displayText(I, 20, 20, "Click to quit", vpColor::red);
line.track(I);
if (vpDisplay::getClick(I, false)) {
quit = true;
}
}

2.1.2. Use case

  • Once built, enter the tutorial/tracking/moving-edges/ folder and run the tutorial-me-line-tracker binary with the -h option to see the available command line options.
    $ cd ${VISP_WS}/visp-build/tutorial/tracking/moving-edges
    $ ./tutorial-me-line-tracker -h
  • You can run the tutorial with default options
    $ ./tutorial-me-line-tracker
  • Or tune the parameters of the tracker
    $ ./tutorial-me-line-tracker --me-range 12 --me-sample-step 8 --me-threshold 40
  • The following video shows an example of results obtained when tracking a line:
  • This other example shows the tracking of 2 lines:

2.2. Ellipse or arc of ellipse tracking

2.2.1. Source code

With ViSP you can also track an ellipse using moving edges. The following example code, available in tutorial-me-ellipse-tracker.cpp, shows how to use the ViSP vpMeEllipse class to this end.

Note
There is also a similar example in trackMeEllipse.cpp that lets you test the ellipse tracker on a recorded video or a sequence of successive images.
#include <iostream>
#include <visp3/core/vpConfig.h>
// If openCV available, priority to OpenCV capture, otherwise the user has to modify the code uncommenting/commenting
// one of the following lines
#if ((VISP_HAVE_OPENCV_VERSION < 0x030000) && defined(HAVE_OPENCV_HIGHGUI)) || \
((VISP_HAVE_OPENCV_VERSION >= 0x030000) && defined(HAVE_OPENCV_VIDEOIO))
#undef VISP_HAVE_V4L2
#undef VISP_HAVE_DC1394
#undef VISP_HAVE_CMU1394
#undef VISP_HAVE_FLYCAPTURE
#undef VISP_HAVE_REALSENSE2
// #undef HAVE_OPENCV_HIGHGUI
// #undef HAVE_OPENCV_VIDEOIO
#else
// Use the first grabber that is available. Uncomment/comment the following lines to disable usage of a grabber
// #undef VISP_HAVE_V4L2
// #undef VISP_HAVE_DC1394
// #undef VISP_HAVE_CMU1394
// #undef VISP_HAVE_FLYCAPTURE
// #undef VISP_HAVE_REALSENSE2
#undef HAVE_OPENCV_HIGHGUI
#undef HAVE_OPENCV_VIDEOIO
#endif
#if (defined(VISP_HAVE_V4L2) || defined(VISP_HAVE_DC1394) || defined(VISP_HAVE_CMU1394) || \
defined(VISP_HAVE_FLYCAPTURE) || defined(VISP_HAVE_REALSENSE2) || defined(VISP_HAVE_OPENCV) && \
(((VISP_HAVE_OPENCV_VERSION < 0x030000) && defined(HAVE_OPENCV_HIGHGUI)) || \
((VISP_HAVE_OPENCV_VERSION >= 0x030000) && defined(HAVE_OPENCV_VIDEOIO))))
#ifdef VISP_HAVE_MODULE_SENSOR
#include <visp3/sensor/vp1394CMUGrabber.h>
#include <visp3/sensor/vp1394TwoGrabber.h>
#include <visp3/sensor/vpFlyCaptureGrabber.h>
#include <visp3/sensor/vpRealSense2.h>
#include <visp3/sensor/vpV4l2Grabber.h>
#endif
#include <visp3/core/vpIoTools.h>
#include <visp3/gui/vpDisplayFactory.h>
#include <visp3/io/vpVideoWriter.h>
#include <visp3/me/vpMeEllipse.h>
#if defined(VISP_HAVE_OPENCV) && (VISP_HAVE_OPENCV_VERSION < 0x030000) && defined(HAVE_OPENCV_HIGHGUI)
#include <opencv2/highgui/highgui.hpp> // for cv::VideoCapture
#elif defined(VISP_HAVE_OPENCV) && (VISP_HAVE_OPENCV_VERSION >= 0x030000) && defined(HAVE_OPENCV_VIDEOIO)
#include <opencv2/videoio/videoio.hpp> // for cv::VideoCapture
#endif
int main(int argc, char **argv)
{
#ifdef ENABLE_VISP_NAMESPACE
using namespace VISP_NAMESPACE_NAME;
#endif
#if (VISP_CXX_STANDARD >= VISP_CXX_STANDARD_11)
std::shared_ptr<vpDisplay> display;
#else
vpDisplay *display = nullptr;
#endif
int opt_me_range = 10;
int opt_me_sample_step = 5;
int opt_me_threshold = 20; // Value in [0 ; 255]
bool opt_track_circle = false; // By default we will track an ellipse
bool opt_track_arc = false; // By default we will track a full circle or ellipse
std::string opt_save_filename; // Saved images filename
for (int i = 1; i < argc; i++) {
if (std::string(argv[i]) == "--me-range" && i + 1 < argc) {
opt_me_range = std::atoi(argv[++i]);
}
else if (std::string(argv[i]) == "--me-sample-step" && i + 1 < argc) {
opt_me_sample_step = std::atoi(argv[++i]);
}
else if (std::string(argv[i]) == "--me-threshold" && i + 1 < argc) {
opt_me_threshold = std::atoi(argv[++i]);
}
else if (std::string(argv[i]) == "--track-circle") {
opt_track_circle = true;
}
else if (std::string(argv[i]) == "--track-arc") {
opt_track_arc = true;
}
else if (std::string(argv[i]) == "--save" && i + 1 < argc) {
opt_save_filename = std::string(argv[++i]);
}
else if (std::string(argv[i]) == "--help" || std::string(argv[i]) == "-h") {
std::cout << "\nUsage: " << argv[0]
<< " [--me-range <range>]"
<< " [--me-sample-step <sample step>]"
<< " [--me-threshold <threshold>]"
<< " [--track-arc]"
<< " [--track-circle]"
<< " [--save <filename>]"
<< " [--help] [-h]\n"
<< std::endl;
return EXIT_SUCCESS;
}
else {
std::cout << "\nError: wrong parameter " << argv[i] << std::endl;
return EXIT_FAILURE;
}
}
try {
int opt_device = 0; // For OpenCV and V4l2 grabber to set the camera device
#if defined(VISP_HAVE_V4L2)
std::ostringstream device;
device << "/dev/video" << opt_device;
std::cout << "Use Video 4 Linux grabber on device " << device.str() << std::endl;
g.setDevice(device.str());
g.setScale(1);
g.open(I);
#elif defined(VISP_HAVE_DC1394)
(void)opt_device; // To avoid non used warning
std::cout << "Use DC1394 grabber" << std::endl;
g.open(I);
#elif defined(VISP_HAVE_CMU1394)
(void)opt_device; // To avoid non used warning
std::cout << "Use CMU1394 grabber" << std::endl;
g.open(I);
#elif defined(VISP_HAVE_FLYCAPTURE)
(void)opt_device; // To avoid non used warning
std::cout << "Use FlyCapture grabber" << std::endl;
g.open(I);
#elif defined(VISP_HAVE_REALSENSE2)
(void)opt_device; // To avoid non used warning
std::cout << "Use Realsense 2 grabber" << std::endl;
rs2::config config;
config.disable_stream(RS2_STREAM_DEPTH);
config.disable_stream(RS2_STREAM_INFRARED);
config.enable_stream(RS2_STREAM_COLOR, 640, 480, RS2_FORMAT_RGBA8, 30);
g.open(config);
g.acquire(I);
#elif ((VISP_HAVE_OPENCV_VERSION < 0x030000) && defined(HAVE_OPENCV_HIGHGUI))|| ((VISP_HAVE_OPENCV_VERSION >= 0x030000) && defined(HAVE_OPENCV_VIDEOIO))
std::cout << "Use OpenCV grabber on device " << opt_device << std::endl;
cv::VideoCapture g(opt_device); // Open the default camera
if (!g.isOpened()) { // Check if we succeeded
std::cout << "Failed to open the camera" << std::endl;
return EXIT_FAILURE;
}
int i = 0;
cv::Mat frame;
while ((i++ < 20) && !g.read(frame)) {
} // warm up camera by skipping unread frames
g >> frame; // get a new frame from camera
#endif
#if defined(VISP_HAVE_V4L2) || defined(VISP_HAVE_DC1394) || defined(VISP_HAVE_CMU1394) || defined(VISP_HAVE_FLYCAPTURE) || defined(VISP_HAVE_REALSENSE2)
g.acquire(I);
#elif ((VISP_HAVE_OPENCV_VERSION < 0x030000) && defined(HAVE_OPENCV_HIGHGUI))|| ((VISP_HAVE_OPENCV_VERSION >= 0x030000) && defined(HAVE_OPENCV_VIDEOIO))
g >> frame; // get a new frame from camera
#endif
#if defined(VISP_HAVE_DISPLAY)
#if (VISP_CXX_STANDARD >= VISP_CXX_STANDARD_11)
display = vpDisplayFactory::createDisplay(I, 0, 0, "Camera view");
#else
display = vpDisplayFactory::allocateDisplay(I, 0, 0, "Camera view");
#endif
#else
std::cout << "No image viewer is available..." << std::endl;
#endif
vpVideoWriter *writer = nullptr;
if (!opt_save_filename.empty()) {
std::string parent = vpIoTools::getParent(opt_save_filename);
if (!parent.empty()) {
std::cout << "Create output directory: " << parent << std::endl;
vpIoTools::makeDirectory(parent);
}
writer = new vpVideoWriter();
writer->setFileName(opt_save_filename);
writer->open(O);
}
vpMe me;
me.setRange(opt_me_range);
me.setThreshold(opt_me_threshold);
me.setSampleStep(opt_me_sample_step);
std::cout << "Moving-edges settings" << std::endl;
me.print();
std::cout << "\nTracker settings" << std::endl;
std::cout << " Tracker type....................."
<< (opt_track_arc ? std::string("arc of ") : std::string())
<< (opt_track_circle ? std::string("circle") : std::string("ellipse")) << std::endl;
std::cout << " Save results....................." << (opt_save_filename.empty() ? std::string("n/a") : opt_save_filename) << std::endl << std::endl;
vpMeEllipse ellipse;
ellipse.setMe(&me);
ellipse.setDisplay(vpMeSite::RANGE_RESULT);
ellipse.initTracking(I, opt_track_circle, opt_track_arc);
bool quit = false;
while (!quit) {
#if defined(VISP_HAVE_V4L2) || defined(VISP_HAVE_DC1394) || defined(VISP_HAVE_CMU1394) || defined(VISP_HAVE_FLYCAPTURE) || defined(VISP_HAVE_REALSENSE2)
g.acquire(I);
#elif ((VISP_HAVE_OPENCV_VERSION < 0x030000) && defined(HAVE_OPENCV_HIGHGUI))|| ((VISP_HAVE_OPENCV_VERSION >= 0x030000) && defined(HAVE_OPENCV_VIDEOIO))
g >> frame;
#endif
vpDisplay::displayText(I, 20, 20, "Click to quit", vpColor::red);
ellipse.track(I);
ellipse.display(I, vpColor::red);
if (vpDisplay::getClick(I, false)) {
quit = true;
}
if (!opt_save_filename.empty()) {
writer->saveFrame(O);
}
}
}
catch (const vpException &e) {
std::cout << "Catch an exception: " << e << std::endl;
}
#if (VISP_CXX_STANDARD < VISP_CXX_STANDARD_11)
if (display != nullptr) {
delete display;
}
#endif
}
#else
int main()
{
#if defined(VISP_HAVE_OPENCV)
std::cout << "Install a 3rd party dedicated to frame grabbing (dc1394, cmu1394, v4l2, OpenCV, FlyCapture, "
<< "Realsense2), configure and build ViSP again to use this tutorial."
<< std::endl;
#else
std::cout << "Install OpenCV 3rd party, configure and build ViSP again to use this tutorial." << std::endl;
#endif
return EXIT_SUCCESS;
}
#endif

This example is very similar to the one presented in 2.1. Line tracking. It differs only:

  • in the header that needs to be used
    #include <visp3/me/vpMeEllipse.h>
  • in the name of the class that is used to allow ellipse tracking
    vpMeEllipse ellipse;
    ellipse.setMe(&me);
    ellipse.setDisplay(vpMeSite::RANGE_RESULT);
    ellipse.initTracking(I, opt_track_circle, opt_track_arc);
  • and in the parameters given to vpMeEllipse::initTracking(). The parameter opt_track_circle makes it possible to specialize the tracker with the model of a circle. The parameter opt_track_arc makes it possible to consider arcs of ellipses or arcs of circles.

2.2.2. Use case

  • Once built, enter the tutorial/tracking/moving-edges/ folder and run the tutorial-me-ellipse-tracker binary with the -h option to see the available command line options.
    $ cd ${VISP_WS}/visp-build/tutorial/tracking/moving-edges
    $ ./tutorial-me-ellipse-tracker -h
  • You can run the tutorial with default options to track an ellipse
    $ ./tutorial-me-ellipse-tracker
    Note
    Here the user has to initialize the tracker on image I by clicking on five points located on the ellipse to track.
  • Or tune the parameters of the tracker
    $ ./tutorial-me-ellipse-tracker --me-range 12 --me-sample-step 8 --me-threshold 40
  • The following video shows an example of results obtained when tracking an ellipse:
  • If you would rather track an arc of an ellipse, you may use the "--track-arc" command line option like:
    $ ./tutorial-me-ellipse-tracker --track-arc
  • The following video shows an example of results obtained when tracking an ellipse arc:
  • There is also this other video that shows the tracking of an ellipse:

2.3. Circle or arc of circle tracking

2.3.1. Source code

The source code tutorial-me-ellipse-tracker.cpp presented in the previous section also allows tracking a circle or an arc of a circle.

Warning
Even if your object is a circle, its projection in the image is an ellipse. The projection is a circle only when perspective effects vanish, that is when the camera plane is parallel to the object plane. That is why, in the next videos, the object stays roughly in a plane parallel to the camera plane and is mainly moved in translation.
Note
There is also a similar example in trackMeEllipse.cpp that lets you test the circle tracker on a recorded video or a sequence of successive images.

2.3.2. Use case

  • To track a circle, you may use "--track-circle" command line option like:
    $ ./tutorial-me-ellipse-tracker --track-circle
    Note
    Here the user has to initialize the tracker on image I by clicking on only 3 points located on the circle to track.
  • The following video shows an example of results obtained when tracking a circle:
  • If you want to track an arc of a circle you may also use "--track-arc" command line option like:
    $ ./tutorial-me-ellipse-tracker --track-circle --track-arc
  • The following video shows an example of results obtained when tracking a circle arc:

3. Next tutorial

You are now ready to see the next Tutorial: Markerless generic model-based tracking using a color camera.