News

Dr. Geovanni Martinez gave a talk entitled "Extending the Measurement Error Model of a Direct Visual Odometry Algorithm to Improve its Accuracy for Planetary Rover Navigation" at the IEEE International Conference on Applied Science and Advanced Technology (IEEE iSACAT 2019), which was held in Queretaro, Mexico, from November 27 to 29, 2019.

 

ABSTRACT

 

In this paper, the accuracy of a direct monocular visual odometry algorithm is improved. The algorithm determines the position and orientation of a robot directly from intensity differences measured at observation points between consecutive images, which are captured by a monocular camera rigidly attached to one side of the robot's structure and tilted downwards. The improvement was achieved by extending the stochastic model of the intensity-difference measurement error from one that considers only the camera noise to one that also considers the intensity-difference measurement error due to the 3D shape error between the assumed and the true planetary surface shape. The corresponding covariance matrix was incorporated into a Maximum Likelihood estimator. According to the experimental results on irregular surfaces, where the 3D shape error is usually large, the accuracy of the visual odometry algorithm improved by a factor of 2, at the cost of also increasing the processing time by the same factor.
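As a rough illustration of the estimation step described in the abstract: under Gaussian assumptions, a Maximum Likelihood estimator that weights the intensity-difference measurements by a measurement-error covariance matrix reduces to a generalized least-squares solve. The sketch below shows only that algebra; all names (`ml_motion_update`, the toy Jacobian `J`) are hypothetical, and the paper's actual Jacobian and covariance construction are not reproduced here.

```python
import numpy as np

def ml_motion_update(J, r, cov):
    """Maximum-likelihood estimate of a 6-DOF motion increment.

    J   : (N, 6) Jacobian of the predicted intensity differences
          with respect to the motion parameters.
    r   : (N,)   measured frame-to-frame intensity differences.
    cov : (N, N) covariance of the measurement error (camera noise
          plus, in the extended model, the term induced by the 3D
          shape error).
    Solves the generalized least-squares normal equations
        (J^T C^-1 J) x = J^T C^-1 r.
    """
    Cinv_J = np.linalg.solve(cov, J)   # C^-1 J
    Cinv_r = np.linalg.solve(cov, r)   # C^-1 r
    return np.linalg.solve(J.T @ Cinv_J, J.T @ Cinv_r)

# Toy check: recover a known motion from noise-free measurements.
rng = np.random.default_rng(0)
J = rng.standard_normal((50, 6))
x_true = np.array([0.01, -0.02, 0.005, 0.001, -0.003, 0.002])
r = J @ x_true
C = np.eye(50)                         # camera-noise-only model
x_hat = ml_motion_update(J, r, C)
print(np.allclose(x_hat, x_true))      # True
```

Replacing `C` with the extended covariance (camera noise plus the shape-error term) is what down-weights the measurements most affected by the 3D shape error.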

 

LINK TO PRESENTATION

 

The talk received the award for best oral presentation of the conference:


All about the IPCV-LAB's Visual Odometry Algorithm

 

The poster above can be downloaded in high resolution from the following link:

 

IAPR MVA-2017 poster Geovanni_Martinez Visual Odometry_IPCV-LAB.pdf

 

Published papers:

  1. G. Martinez, "Extending the Measurement Error Model of a Direct Visual Odometry Algorithm to Improve its Accuracy for Planetary Rover Navigation", IEEE International Conference on Applied Science and Advanced Technology (IEEE iSACAT 2019), Queretaro, Mexico, November 27-29, 2019.
  2. G. Martinez, "Experimental results of testing a direct monocular visual odometry algorithm outdoors on flat terrain under severe global illumination changes for Planetary Exploration Rovers", Computación y Sistemas, an International Journal of Computing Science and Applications, Vol. 22, No. 4, pp. 1581-1593, 2018.
  3. G. Martinez, "Improving the Robustness of a Direct Visual Odometry Algorithm for Planetary Rovers", IEEE International Conference on Electrical Engineering, Computing Science and Automatic Control (IEEE CCE-2018), Mexico City, Mexico, September 5-7, 2018.
  4. G. Martinez, "Field tests on flat ground of an Intensity-difference Based Monocular Visual Odometry Algorithm for Planetary Rovers", 15th IAPR International Conference on Machine Vision Applications (IAPR MVA-2017), Nagoya, Japan, May 8-12, 2017.
  5. G. Martinez, "Intensity-Difference Based Monocular Visual Odometry for Planetary Rovers", New Development in Robot Vision, Book Series: Cognitive Systems Monographs, Vol. 23, Springer, ISBN: 978-3-662-43858-9, pp. 181-198, 2014.
  6. G. Martinez, "Monocular Visual Odometry from Frame to Frame Intensity Differences for Planetary Exploration Mobile Robots", IEEE Workshop on Robot Vision (IEEE WoRV), Tampa Bay, Florida, USA, January 16-17, 2013.

 

A new algorithm for vision-based teleoperation of Schunk's compact LWA 4P Powerball robot arm on board a Seekur Jr. rover is currently being developed in the Image Processing and Computer Vision Lab (IPCV-LAB). In the photo, undergraduate student German Ureña is dismantling the Seekur Jr. rover to rewire the CAN-bus interface used to control the robot arm.

 

IPCV-LAB's latest publications on direct monocular visual odometry for planetary rovers:

  1. G. Martinez, "Experimental results of testing a direct monocular visual odometry algorithm outdoors on flat terrain under severe global illumination changes for Planetary Exploration Rovers", Computación y Sistemas, an International Journal of Computing Science and Applications, Vol. 22, No. 4, pp. 1581-1593, 2018.
  2. G. Martinez, "Improving the Robustness of a Direct Visual Odometry Algorithm for Planetary Rovers", IEEE International Conference on Electrical Engineering, Computing Science and Automatic Control (IEEE CCE-2018), Mexico City, Mexico, September 5-7, 2018.


Dr. Geovanni Martinez will give a talk entitled "Improving the Robustness of a Direct Visual Odometry Algorithm for Planetary Rovers" at IEEE CCE-2018, which will be held in Mexico City from September 5 to 7, 2018. The abstract of the corresponding paper follows:

Abstract

An algorithm that computes the robot position by evaluating measurements of frame-to-frame intensity differences was extended to detect outliers in the measurements and exclude them from the evaluation, with the aim of improving its robustness in irregular terrain, such as flat surfaces with stones on them. The images are taken by a camera firmly attached to the robot, tilted downwards, looking at the planetary surface. A measurement is detected as an outlier only if its intensity difference and linear intensity gradients cannot be described by motion compensation. According to the experimental results, this modification reduced the positioning error by a factor of 3 in difficult terrain, maintaining the positioning error, which averaged 1.8%, within a range of 0.15% to 2.5% of distance traveled, similar to that achieved by state-of-the-art algorithms successfully used in robots on Earth and on Mars.
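The outlier test described above can be sketched as follows. The sketch flags an observation point when the residual between its measured and its motion-compensated (predicted) intensity difference exceeds a few standard deviations of the camera noise; the threshold rule and all names (`flag_outliers`, `k`) are assumptions for illustration, not the paper's exact test statistic.

```python
import numpy as np

def flag_outliers(measured_diff, predicted_diff, noise_std, k=3.0):
    """Flag observation points whose frame-to-frame intensity
    difference cannot be explained by motion compensation.

    A point is declared an outlier when the residual between the
    measured and the motion-compensated intensity difference exceeds
    k standard deviations of the camera noise.
    """
    residual = np.abs(measured_diff - predicted_diff)
    return residual > k * noise_std

# Toy example: the third point (e.g. hit by a stone edge) is an outlier.
measured = np.array([0.50, 0.40, 8.00, 0.20])
predicted = np.array([0.45, 0.38, 0.10, 0.25])
mask = flag_outliers(measured, predicted, noise_std=0.1)
print(mask)   # [False False  True False]
```

Excluding the flagged measurements from the estimation is what keeps stones and other shape irregularities from biasing the computed position.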

Dr. Geovanni Martinez gave a talk to seniors of the Anglo American School about the University of Costa Rica, the School of Electrical Engineering, and the Image Processing and Computer Vision Research Laboratory (IPCV-LAB), as well as a demonstration of the IPCV-LAB's robots, the Husky A200 and the Seekur Jr., which are used to validate a direct visual odometry algorithm developed in the laboratory.

The paper written by Dr. Geovanni Martínez entitled "Experimental results of testing a direct monocular visual odometry algorithm outdoors on flat terrain under severe global illumination changes for Planetary Exploration Rovers" was accepted for publication in Computación y Sistemas, an International Journal of Computing Science and Applications. Computación y Sistemas is a peer-reviewed open access scientific journal of Computer Science and Engineering. The journal is indexed in the CONACYT Index of Excellence of Mexican Journals, Scopus, Web of Science (core collection, emerging sources), Redalyc, E-Journal, REDIB, Latindex, Biblat, Periodica, DBLP, and SciELO (part of Web of Science).

IPCV-LAB acquired 2 TurtleBots to be used in the IE-0449 Computer Vision course.

 

These two TurtleBots will be used in the IE-0449 Computer Vision course, which will be taught by Dr. Geovanni Martínez during the first semester of 2018, so that the autonomous-robotics applications developed by the students in the course can also be tested on real robotic platforms (not just in simulations!). Likewise, they will help students who collaborate in the Image Processing and Computer Vision Research Laboratory to familiarize themselves with low-cost autonomous robots before venturing into projects that use the more expensive and sophisticated robots that the IPCV-LAB has for research, such as the Husky A200 and the Seekur Jr.

 

A new calibration algorithm is being investigated by Dr. Geovanni Martinez to facilitate the comparison of the experimental results obtained by the monocular visual odometry algorithm developed in the IPCV-LAB with ground-truth positioning data obtained using a robotic theodolite with a laser range sensor.

 

 

The photo shows Dr. Geovanni Martinez testing part of the calibration algorithm in the IPCV-LAB with the help of a three-dimensional calibration pattern, the rover Seekur Jr., and a Trimble S3 robotic total station.

 

The Image Processing and Computer Vision Research Laboratory (IPCV-LAB) was present in the Max Planck Science Tunnel, which is currently being exhibited in Costa Rica. Dr. Geovanni Martinez, coordinator of the IPCV-LAB, explained to visitors the importance of computer vision in autonomous robotics and carried out demonstrations of the Seekur Jr. robot, which he uses to test outdoors the new vision-based navigation systems being investigated in the IPCV-LAB; such systems are particularly useful in planetary exploration.

 

During the event, Dr. Martinez allowed the children to teleoperate the Seekur Jr. robot:
