News

 

Dr. Geovanni Martinez gave a talk at TECCOM-2017. In the talk, Dr. Martinez explained the monocular visual odometry algorithm based on intensity differences, which has been developed at the IPCV-LAB, and compared it with the stereoscopic visual odometry algorithm based on feature correspondences, which is traditionally used in autonomous robotics and in planetary rovers. He also described the experimental results of testing the monocular visual odometry algorithm on a real Husky A200 rover platform over flat terrain for localization in outdoor sunlit conditions.

 

The monocular visual odometry algorithm computes the three-dimensional (3D) position of the rover by integrating its motion over time. The motion is estimated directly by maximizing a likelihood function, the natural logarithm of the conditional probability of the intensity differences measured at different observation points between consecutive images. It does not require determining the optical flow or establishing feature correspondences as an intermediate step. The images are captured by a monocular video camera mounted on the rover, looking to one side and tilted downward toward the planet's surface. Most of the experiments were conducted under severe global illumination changes. Comparisons with ground truth data have shown an average absolute position error of 0.9% of the distance traveled, with an average processing time of 0.06 seconds per image.
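The integration step described above can be sketched as follows. This is a minimal illustration, not the IPCV-LAB implementation: it assumes the maximum-likelihood motion estimator (not shown) returns, for each frame pair, an incremental rotation matrix dR and a translation vector dt in the camera frame, and all names are hypothetical.

```python
import numpy as np

def integrate_motion(frame_motions):
    """Dead-reckoning sketch: accumulate frame-to-frame 3D motion
    estimates (rotation matrix dR, translation vector dt) into a
    trajectory of 3D positions. The per-frame motion estimation
    itself (maximum likelihood over intensity differences) is
    assumed to be done elsewhere."""
    R = np.eye(3)           # accumulated orientation
    p = np.zeros(3)         # accumulated 3D position
    trajectory = [p.copy()]
    for dR, dt in frame_motions:
        p = p + R @ dt      # translate in the current orientation
        R = R @ dR          # compose the incremental rotation
        trajectory.append(p.copy())
    return np.array(trajectory)
```

Integrating the motion this way means per-frame estimation errors accumulate over time, which is why the reported error is expressed as a percentage of the distance traveled.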

 

This is the link to the presentation:

 

"Monocualr Visual Odometry for Navigation Systems of Autonomous Planetary Robots", Geovanni Martinez, IPCV-LAB"

The IPCV-LAB recently acquired 6 batteries and 3 chargers for its Seekur Jr mobile robot. These will allow longer operating times and faster recharging during the outdoor experiments being carried out to validate the monocular visual odometry algorithm, which is being developed at the IPCV-LAB. In the photo, Dr. Geovanni Martinez is shown next to the newly received equipment.

 

 

The experimental results of testing the monocular visual odometry algorithm developed by the IPCV-LAB on a real Husky A200 rover platform over flat terrain for localization in outdoor sunlit conditions were recently presented at IAPR MVA-2017. The algorithm computes the three-dimensional (3D) position of the rover by integrating its motion over time; the motion is estimated directly from the intensity differences between consecutive images, without computing optical flow or establishing feature correspondences. Comparisons with ground truth data have shown an average absolute position error of 0.9% of the distance traveled, with an average processing time of 0.06 seconds per image.

See paper here (IEEE Xplore).

See presented poster in IAPR MVA-2017 here.

 

 

 

In 2017, Dr. Geovanni Martinez will begin field tests on rough terrain of the monocular visual odometry algorithm developed at the IPCV-LAB. To this end, a MobileRobots Seekur Jr. all-terrain rover, which is able to operate outdoors in any weather, will be used.

A new 3D video stabilization algorithm has been developed at the IPCV-LAB to remove camera jitter from image sequences captured by planetary robots. First, the frame-to-frame surface 3D motion with respect to the camera coordinate system is estimated and accumulated over time. The estimation is performed by maximizing a likelihood function of the frame-to-frame intensity differences measured at key observation points. Then, the jitter is determined as the perspective projection of the difference between the accumulated surface 3D translation and a smoothed version of it. Finally, the stabilized video is synthesized by moving the entire content of each image with a displacement vector of the same magnitude but opposite direction to the jitter estimated for that image. Experimental results with synthetic data revealed real-time operation with low latency and a reduction of the jitter by a factor of 20. Experimental results with real image sequences captured by a rover platform in indoor and outdoor conditions show very reliable and encouraging stabilization results.
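The jitter computation described above can be sketched as follows. This is a minimal illustration, not the IPCV-LAB implementation: the box-filter smoother, the focal length, and the constant surface depth used for the perspective projection are all assumptions, and the names are hypothetical.

```python
import numpy as np

def estimate_jitter(accum_t, f=500.0, window=15):
    """Sketch: given the accumulated surface 3D translation per frame
    (accum_t, shape (N, 3)), smooth it, take the difference, and
    perspectively project the high-frequency residual onto the image
    plane to obtain the per-frame 2D jitter in pixels.
    f: assumed focal length in pixels; window: smoothing window."""
    # Smooth each translation component with a centered moving
    # average (edge padding keeps the output the same length).
    kernel = np.ones(window) / window
    pad = window // 2
    padded = np.pad(accum_t, ((pad, pad), (0, 0)), mode='edge')
    smooth = np.stack([np.convolve(padded[:, i], kernel, mode='valid')
                       for i in range(3)], axis=1)
    high_freq = accum_t - smooth   # unwanted (jittery) 3D component
    # Perspective projection of the residual 3D translation, assuming
    # a constant surface depth Z, gives the 2D jitter in pixels.
    Z = 2.0
    jitter = f * high_freq[:, :2] / Z
    return jitter
```

Stabilization then amounts to shifting each frame's content by the negative of its estimated jitter vector, so only the smooth component of the camera trajectory remains visible.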

 

A detailed description of the algorithm can be found here.

On April 19th and 21st, 2016, at 2 pm, Dr. Geovanni Martinez demonstrated the RTK dGPS outdoor guidance system of the Seekur Jr. rover in the gardens of the School of Electrical Engineering for the UCR Technology Fair participants. They also had the opportunity to teleoperate the rover.

Dr. Geovanni Martinez gave a talk entitled "Intensity-Difference Based Monocular Visual Odometry for Planetary Rovers" at the University of Costa Rica as part of the commemoration of the 19th anniversary of the School of Geography.

Dr. Geovanni Martinez gave a talk entitled "Intensity-Difference Based Monocular Visual Odometry for Planetary Rovers: A Case Study" at the “United Nations/Costa Rica Workshop on Human Space Technology”, San Jose, Costa Rica, 7-11 March, 2016.

Prof. Dr. Johanna Wanka, German Federal Minister of Education and Research, visited the University of Costa Rica. During her visit, Dr. Geovanni Martinez, IPCV-LAB coordinator, had the opportunity to explain the latest results obtained through research projects in the area of vision-based navigation systems for autonomous planetary robots, specifically the results obtained by testing the intensity-difference based monocular visual odometry algorithm on a real robot platform in outdoor sunlit conditions. This latter algorithm was recently developed at the IPCV-LAB for robot positioning using a single video camera.

Representatives of the Informatics Center of the University of Costa Rica visited the IPCV-LAB. The purpose of the visit was to learn firsthand about the research projects being carried out by the IPCV-LAB on vision-based navigation of autonomous planetary robots. In addition, the IPCV-LAB received an invitation to participate in the UCR Technology Fair that they are organizing.
