Matching and motion

The topics studied in this theme are listed below as links leading to sections with their details and illustrations:

Image alignment | Visual gyroscope | Planes tracking | Lines matching | GRoVE

RGB-D image alignment for 3D camera motion estimation


International Journal

Yassine Ahmine, Guillaume Caron, Fatima Chouireb, El Mustapha Mouaddib, Continuous scale-space direct image alignment for visual odometry from RGB-D images, IEEE Robotics and Automation Letters, to appear in 2021. PDF

Dense visual gyroscope: camera orientation estimation

    The orientation of a camera can be estimated from the images acquired during its motion. Depending on the number of degrees of freedom considered, the estimator is a visual compass (one angle) or a visual gyroscope (three angles):
  • Visual compass: given a reference image and a current image, the phase correlation method deduces, under certain hypotheses, the angle between these two images. The evaluation of our proposed algorithm led to the creation of the OVMIS dataset, which is described in, and downloadable from, the Datasets item on the left.
  • Spherical visual gyroscope: minimizing the error between two mixtures of photometric potentials, one computed from the reference image and one from the current image, yields the optimal rotation (three degrees of freedom) transforming one image into the other. The evaluation of our proposed algorithm led to the creation of the SVMIS dataset, which is described in, and downloadable from, the Datasets item on the left.
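As an illustration of the visual compass principle, a pure rotation of the camera about the vertical axis appears in a panoramic image as a horizontal pixel shift, which phase correlation recovers from the peak of the normalized cross-power spectrum. Below is a minimal NumPy sketch of this idea (the function name and wrap-around convention are illustrative assumptions, not the published implementation):

```python
import numpy as np

def compass_angle(ref, cur):
    """Estimate the yaw angle (degrees) between two panoramic images
    via phase correlation: the rotation maps to a horizontal shift."""
    F_ref = np.fft.fft2(ref)
    F_cur = np.fft.fft2(cur)
    cross = F_cur * np.conj(F_ref)
    cross /= np.abs(cross) + 1e-12            # normalized cross-power spectrum
    corr = np.fft.ifft2(cross).real           # correlation surface, delta at the shift
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    W = ref.shape[1]
    if dx > W // 2:                           # wrap shift to [-W/2, W/2)
        dx -= W
    return 360.0 * dx / W                     # pixels -> degrees (dy ignored: pure yaw)
```

On a 360-pixel-wide panorama, a shift of one pixel corresponds to one degree of rotation.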

International journal

Fabio Morbidi, Guillaume Caron, Phase Correlation for Dense Visual Compass from Omnidirectional Camera-Robot Images, IEEE Robotics and Automation Letters, RA-L, Vol. 2, No. 2, pp. 688-695, April 2017. PDF

International conference

Guillaume Caron, Fabio Morbidi, Spherical Visual Gyroscope for Autonomous Robots using the Mixture of Photometric Potentials, IEEE International Conference on Robotics and Automation, ICRA'18, Brisbane, Australia, May 2018. PDF

International communication

Fabio Morbidi, Guillaume Caron, Phase Correlation for Dense Visual Compass from Omnidirectional Camera-Robot Images, IEEE Int. Conf. on Robotics and Automation, ICRA, May 2017, Singapore.

National conference

Guillaume Caron, Fabio Morbidi, Gyroscope visuel sphérique basé mélange de potentiels photométriques, AFRIF French national conference on pattern recognition, image, learning and perception (Reconnaissance des Formes, Image, Apprentissage et Perception, RFIAP), Jun 2018, Marne-la-Vallée, France. PDF

Direct tracking of image templates

  • Scale-space template tracking approaches consider a pyramid of several image scales (usually 3 to 5) and track the template from the coarse to the fine level in order to increase robustness to large inter-frame motion. We extend that idea by no longer limiting scales to a few discrete levels: the scale parameter becomes an additional degree of freedom optimized simultaneously with the motion parameters. This key idea allows the gradient descent-like algorithm to automatically adapt the scale factor (increasing and decreasing it when needed, at the necessary pace), leading to a very large convergence domain and very high precision at convergence. Results for translational and projective motion models clearly outperform the state of the art of non-learning-based approaches and compete with learning-based ones, slightly outperforming them in some cases, while not requiring a long and resource-hungry training process.

    International Journal

    Yassine Ahmine, Guillaume Caron, El Mustapha Mouaddib and Fatima Chouireb, Adaptive Lucas-Kanade tracking, Elsevier Image and Vision Computing, vol. 88, pp. 1 - 8, Aug. 2019. PDF
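The classical Lucas-Kanade baseline that this adaptive, continuous-scale formulation extends minimizes the photometric error between a template and a warped image by Gauss-Newton descent. A minimal translation-only NumPy sketch (function names are illustrative; the continuous scale degree of freedom of the paper is not reproduced here):

```python
import numpy as np

def warp_translate(img, tx, ty):
    """Bilinear sampling of img at (x + tx, y + ty), clamped at the border."""
    H, W = img.shape
    ys, xs = np.mgrid[0:H, 0:W].astype(float)
    xs, ys = xs + tx, ys + ty
    x0 = np.clip(np.floor(xs).astype(int), 0, W - 2)
    y0 = np.clip(np.floor(ys).astype(int), 0, H - 2)
    ax = np.clip(xs - x0, 0.0, 1.0)
    ay = np.clip(ys - y0, 0.0, 1.0)
    return ((1 - ay) * ((1 - ax) * img[y0, x0] + ax * img[y0, x0 + 1])
            + ay * ((1 - ax) * img[y0 + 1, x0] + ax * img[y0 + 1, x0 + 1]))

def lucas_kanade_translation(T, I, iters=50):
    """Forward-additive Lucas-Kanade for a pure-translation warp:
    Gauss-Newton on the photometric error T - I(x + p)."""
    p = np.zeros(2)                               # (tx, ty)
    gy, gx = np.gradient(I)                       # image gradients of I
    for _ in range(iters):
        Iw = warp_translate(I, p[0], p[1])        # current warped image
        Gx = warp_translate(gx, p[0], p[1])       # gradients at warped locations
        Gy = warp_translate(gy, p[0], p[1])
        r = (T - Iw).ravel()                      # photometric residual
        J = np.stack([Gx.ravel(), Gy.ravel()], axis=1)
        dp, *_ = np.linalg.lstsq(J, r, rcond=None)
        p += dp
        if np.linalg.norm(dp) < 1e-6:
            break
    return p
```

A pyramid version runs this loop from coarse to fine resampled images; the paper instead optimizes the scale jointly with p.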
  • Photometric tracking of planes in omnidirectional stereovision: the motion of a camera or a stereo rig embedded on a mobile robot can be estimated by tracking image regions corresponding to 3D planes. Working with a plane constrains the motion estimation through a homography. In this work, a spherical representation of the omnidirectional image is used to express the optimization of the plane parameters (normal vector, distance) and of the sensor motion (rotation, translation), minimizing the photometric error between the reference planar region and its transfer to other views or new images.

    International Conference

    Guillaume Caron, Eric Marchand and El Mustapha Mouaddib, Tracking Planes in Omnidirectional Stereovision, IEEE International Conference on Robotics and Automation, ICRA'11, pp. 6306 - 6311, Shanghai, China, May 2011. PDF
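The homography constraint mentioned above has a standard closed form: for a plane {X : nᵀX = d} observed by two views related by rotation R and translation t, the induced homography in normalized coordinates is H = R + t nᵀ / d. A minimal sketch of this relation (illustrative function name, not the paper's spherical implementation):

```python
import numpy as np

def plane_homography(R, t, n, d):
    """Homography induced by the plane {X : n.X = d} between two views
    related by (R, t), in normalized image coordinates: H = R + t n^T / d."""
    return R + np.outer(t, n) / d
```

Any 3D point on the plane projects to normalized points x1 and x2 in the two views with x2 proportional to H x1, which is what ties the photometric plane tracking to the (R, t, n, d) parameters being optimized.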

Vertical line matching for omnidirectional stereovision images

  • To localize a robot in an indoor environment, geometric landmarks (points, segments, surfaces, etc.) are often used to determine the position of the robot relative to them. Vertical segments, such as wall corners or door frames, are interesting landmarks for localization because they are fixed. We are therefore working on exploiting our sensor, which combines omnidirectional perception with redundancy of information.
    Here are the results: the left image shows line matching on a real image, and the right one shows a reconstruction of the environment from a synthetic image.
    [Figures: line matching (left), reconstruction by triangulation (right)]

    International Conference

    Guillaume Caron and El Mustapha Mouaddib, Vertical Line Matching for Omnidirectional Stereovision Images, IEEE International Conference on Robotics and Automation, ICRA'09, pp. 2787 - 2792, Kobe, Japan, May 2009.

    National Conference

    Guillaume Caron and El Mustapha Mouaddib, Mise en Correspondance de Droites Verticales dans les Images de Stéréovision Omnidirectionnelles, Young researchers in computer vision congress (Congrès des jeunes chercheurs en vision par ordinateur), ORASIS'09, Trégastel, France, June 2009. PDF
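The triangulation step behind the reconstruction above can be sketched in the horizontal plane: a vertical line projects to a bearing angle in each view of the stereo pair, and intersecting the two rays recovers the line's ground-plane position. A simplified 2D illustration (not the paper's exact method; names are illustrative):

```python
import numpy as np

def triangulate_vertical_line(c1, theta1, c2, theta2):
    """Intersect two bearing rays, shot from camera centers c1 and c2 at
    angles theta1 and theta2 in the horizontal plane, to recover the
    (x, y) position of a matched vertical line."""
    d1 = np.array([np.cos(theta1), np.sin(theta1)])
    d2 = np.array([np.cos(theta2), np.sin(theta2)])
    # Solve c1 + s*d1 = c2 + t*d2 for the ray parameters (s, t).
    A = np.stack([d1, -d2], axis=1)
    s, t = np.linalg.solve(A, np.asarray(c2) - np.asarray(c1))
    return np.asarray(c1) + s * d1
```

In practice the bearings come from the matched vertical segments in the two omnidirectional images, and nearly parallel rays (a line far from the baseline) make the system ill-conditioned.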

Robot guidance using embedded vision

This project began with my master's thesis on visual odometry. Its objective is to build a visualization environment and to control the movement of a vision-guided mobile robot. The idea is to use this project as a framework to propose sub-projects for internships or research projects.

  • 2008: visualization of the robot's movement in a 3D environment, included in an interface that displays what is captured by the camera together with the detected and matched interest points.
    Internship - Quentin Wochol (computer science student)

  • 2007: design and development of a visual odometry method to estimate the motion of a mobile robot between successive images from a camera pointed at the ceiling.
    My master's thesis - report.