News

Dr. Geovanni Martinez will give a talk entitled "Improving the Robustness of a Direct Visual Odometry Algorithm for Planetary Rovers" at IEEE CCE-2018, which will be held in Mexico City from September 5 to 7, 2018. The abstract of the corresponding paper follows:

 

Abstract

 

An algorithm that computes the robot's position by evaluating measurements of frame-to-frame intensity differences was extended to detect outliers among the measurements and exclude them from the position computation, with the aim of improving its robustness in irregular terrain, such as flat surfaces with stones on them. The images are taken by a camera firmly attached to the robot, tilted downwards, looking at the planetary surface. A measurement is detected as an outlier only if its intensity difference and linear intensity gradients cannot be described by motion compensation. According to the experimental results, this modification reduced the positioning error by a factor of 3 in difficult terrain, while maintaining an average positioning error of 1.8% of distance traveled, within a range of 0.15% to 2.5%, similar to that achieved by state-of-the-art algorithms successfully used in robots on Earth and on Mars.
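The outlier test described in the abstract can be sketched roughly as follows. This is an illustrative approximation only: the function name, the robust threshold, and the median-absolute-deviation scale estimate are assumptions, and the paper's actual test also evaluates the linear intensity gradients at each observation point rather than the residual alone.

```python
import numpy as np

def detect_outliers(intensity_diffs, predicted_diffs, k=3.0):
    """Flag measurements whose frame-to-frame intensity difference
    cannot be explained by the motion-compensated prediction.
    A measurement is treated as an outlier if its residual exceeds
    k robust standard deviations (MAD-based scale estimate).
    NOTE: illustrative sketch only, not the paper's exact test."""
    residuals = intensity_diffs - predicted_diffs
    mad = np.median(np.abs(residuals - np.median(residuals)))
    sigma = 1.4826 * mad + 1e-12          # robust sigma estimate
    return np.abs(residuals) > k * sigma  # True marks an outlier

# Usage: observed diffs vs. diffs predicted by the estimated motion;
# the fourth measurement is far from its prediction and gets flagged.
obs = np.array([0.10, -0.20, 0.15, 5.0, -0.10])
pred = np.array([0.12, -0.18, 0.13, 0.0, -0.09])
mask = detect_outliers(obs, pred)
```

Measurements flagged in `mask` would simply be left out of the subsequent position computation.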

 

 

Dr. Geovanni Martinez gave a talk to seniors of the Anglo American School about the University of Costa Rica, the School of Electrical Engineering, and the Image Processing and Computer Vision Research Laboratory (IPCV-LAB), as well as a demonstration of the IPCV-LAB's robots, the Husky A200 and Seekur Jr., which are used to validate a direct visual odometry algorithm developed in the IPCV-LAB.

The paper written by Dr. Geovanni Martínez entitled "Experimental results of testing a direct monocular visual odometry algorithm outdoors on flat terrain under severe global illumination changes for Planetary Exploration Rovers" was accepted for publication in Computación y Sistemas, an International Journal of Computing Science and Applications. Computación y Sistemas is a peer-reviewed open access scientific journal of Computer Science and Engineering. The journal is indexed in the CONACYT Index of Excellence of Mexican Journals, Scopus, Web of Science (Core Collection - Emerging Sources), Redalyc, E-Journal, REDIB, Latindex, Biblat, Periodica, DBLP, and SciELO (part of Web of Science).

IPCV-LAB acquired 2 TurtleBots to be used in the IE-0449 Computer Vision course.

 

These two TurtleBots will be used in the IE-0449 Computer Vision course, which will be taught by Dr. Geovanni Martínez during the first semester of 2018, so that the autonomous-robotics applications developed by the students in the course can also be tested on real robotic platforms (not just in simulations!). Likewise, they will help students who collaborate in the Image Processing and Computer Vision Research Laboratory to familiarize themselves with low-cost autonomous robots before venturing to use the more expensive and sophisticated robots that the IPCV-LAB has for research, such as the Husky A200 and the Seekur Jr.

 

A new calibration algorithm is being investigated by Dr. Geovanni Martinez to facilitate the comparison of the experimental results obtained by the monocular visual odometry algorithm developed in the IPCV-LAB with the ground truth positioning data obtained by using a robotic theodolite with a laser range sensor.

 

 

The photo shows Dr. Geovanni Martinez testing part of the calibration algorithm in the IPCV-LAB with the help of a three-dimensional calibration pattern, the rover Seekur Jr., and a Trimble S3 robotic total station.

 

The Image Processing and Computer Vision Research Laboratory (IPCV-LAB) was present in the Max Planck Science Tunnel, which is currently being exhibited in Costa Rica. Dr. Geovanni Martinez, coordinator of the IPCV-LAB, explained to visitors the importance of computer vision in autonomous robotics and carried out demonstrations of the Seekur Jr. robot, which he uses to test outdoors the new vision-based navigation systems being investigated in the IPCV-LAB, systems that are particularly useful in planetary exploration.

 

During the event, Dr. Martinez allowed the children to teleoperate the Seekur Jr. robot:

 

 

 

 

 

Dr. Geovanni Martinez gave a talk at TECCOM-2017. In the talk, Dr. Martinez explained the monocular visual odometry algorithm based on intensity differences, which has been developed at the IPCV-LAB, and compared it with the stereoscopic visual odometry algorithm based on feature correspondences, which is traditionally used in autonomous robotics and also in planetary rovers. He also described the experimental results of testing the monocular visual odometry algorithm on a real rover platform, the Husky A200, over flat terrain for localization in outdoor sunlit conditions.

 

The monocular visual odometry algorithm computes the three-dimensional (3D) position of the rover by integrating its motion over time. The motion is directly estimated by maximizing a likelihood function, namely the natural logarithm of the conditional probability of the intensity differences measured at different observation points between consecutive images. It does not require determining the optical flow or establishing feature correspondences as an intermediate step. The images are captured by a monocular video camera mounted on the rover, looking to one side and tilted downwards toward the planet's surface. Most of the experiments were conducted under severe global illumination changes. Comparisons with ground truth data have shown an average absolute position error of 0.9% of distance traveled, with an average processing time per image of 0.06 seconds.
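As a rough illustration of the direct (correspondence-free) principle, the sketch below estimates a single horizontal subpixel image shift from raw intensity differences and intensity gradients, using the linearized brightness-constancy model I2(x) - I1(x) ≈ -dx · ∂I1/∂x. The actual algorithm estimates full 3D rover motion by maximum likelihood, so everything here (the function name, the 1D motion model, the synthetic images) is an assumption for demonstration purposes only.

```python
import numpy as np

def estimate_shift(img1, img2):
    """One least-squares (Gauss-Newton) step for a single horizontal
    shift dx, fitted directly to frame-to-frame intensity differences:
        img2(x) - img1(x) ~= -dx * d(img1)/dx
    No optical flow or feature correspondences are computed."""
    gx = np.gradient(img1, axis=1)      # horizontal intensity gradient
    diff = img2 - img1                  # measured intensity differences
    return -np.sum(gx * diff) / np.sum(gx * gx)

# Synthetic image pair: img2 is img1 shifted right by 0.3 pixels.
cols = np.arange(200, dtype=float)
img1 = np.tile(np.sin(0.1 * cols), (10, 1))
img2 = np.tile(np.sin(0.1 * (cols - 0.3)), (10, 1))

est = estimate_shift(img1, img2)        # close to 0.3 pixels
```

In the same spirit, the full algorithm accumulates such directly estimated frame-to-frame motions over time to obtain the rover's 3D position.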

 

This is the link to the presentation:

 

"Monocular Visual Odometry for Navigation Systems of Autonomous Planetary Robots", Geovanni Martinez, IPCV-LAB

The IPCV-LAB recently acquired 6 batteries and 3 chargers for its Seekur Jr. mobile robot. These will allow longer operating times and faster recharging during the outdoor experiments being carried out to validate the monocular visual odometry algorithm, which is being developed in the IPCV-LAB. The photo shows Dr. Geovanni Martinez next to the new equipment.

 

 

The experimental results of testing the monocular visual odometry algorithm developed by the IPCV-LAB on a real rover platform, the Husky A200, over flat terrain for localization in outdoor sunlit conditions were recently presented at IAPR MVA-2017. The algorithm computes the three-dimensional (3D) position of the rover by integrating its motion over time. The motion is directly estimated by maximizing a likelihood function, namely the natural logarithm of the conditional probability of the intensity differences measured at different observation points between consecutive images. It does not require determining the optical flow or establishing feature correspondences as an intermediate step. The images are captured by a monocular video camera mounted on the rover, looking to one side and tilted downwards toward the planet's surface. Most of the experiments were conducted under severe global illumination changes. Comparisons with ground truth data have shown an average absolute position error of 0.9% of distance traveled, with an average processing time per image of 0.06 seconds.

See paper here (IEEE Xplore).

See presented poster in IAPR MVA-2017 here.

 

 

 

In 2017, Dr. Geovanni Martinez will begin the field tests on rough terrain of the monocular visual odometry algorithm developed in the IPCV-LAB. To this end, a MobileRobots Seekur Jr. all-terrain rover will be used, which is able to operate outdoors in any weather.
