The IPCV-LAB recently acquired 6 batteries and 3 chargers for its Seekur Jr mobile robot. These will allow longer operating times and faster recharges during the outdoor experiments being carried out to validate the monocular visual odometry algorithm under development at the IPCV-LAB. The photo shows Dr. Geovanni Martinez next to the newly received equipment.



The experimental results of testing the monocular visual odometry algorithm developed by the IPCV-LAB on a real Husky A200 rover platform, for localization over flat terrain in outdoor sunlit conditions, were recently presented at IAPR MVA-2017. The algorithm computes the three-dimensional (3D) position of the rover by integrating its motion over time. The motion is estimated directly by maximizing a likelihood function, namely the natural logarithm of the conditional probability of the intensity differences measured at different observation points between consecutive images; it does not require determining the optical flow or establishing correspondences as an intermediate step. The images are captured by a monocular video camera mounted on the rover, looking to one side and tilted downward toward the planet's surface. Most of the experiments were conducted under severe global illumination changes. Comparisons with ground-truth data have shown an average absolute position error of 0.9% of the distance traveled, with an average processing time of 0.06 seconds per image.
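To illustrate the dead-reckoning step described above (integrating frame-to-frame motion estimates into a 3D position over time), here is a minimal sketch. It is not the lab's implementation: the function names, the small-angle rotation model, and the example motion values are all illustrative assumptions.

```python
import numpy as np

def small_angle_rotation(wx, wy, wz):
    """Rotation matrix for small frame-to-frame rotation angles (radians),
    using the first-order (small-angle) approximation."""
    return np.array([[1.0, -wz,  wy],
                     [ wz, 1.0, -wx],
                     [-wy,  wx, 1.0]])

def integrate_motion(pose_R, pose_t, frame_R, frame_t):
    """Compose one estimated frame-to-frame motion (frame_R, frame_t)
    with the accumulated rover pose (pose_R, pose_t)."""
    new_R = pose_R @ frame_R
    new_t = pose_t + pose_R @ frame_t
    return new_R, new_t

# Example: integrate two identical forward steps of 0.1 m (no rotation).
R, t = np.eye(3), np.zeros(3)
for _ in range(2):
    R, t = integrate_motion(R, t,
                            small_angle_rotation(0.0, 0.0, 0.0),
                            np.array([0.1, 0.0, 0.0]))
# t is now the accumulated 3D position estimate.
```

Each per-frame motion here would, in the algorithm described above, come from maximizing the intensity-difference likelihood between consecutive images.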


The poster presented at IAPR MVA-2017 can be seen here.



A paper with a detailed description of the results will be published soon in the IEEE Xplore Digital Library.

In 2017, Dr. Geovanni Martinez will begin field tests on rough terrain of the monocular visual odometry algorithm developed at the IPCV-LAB. To this end, a MobileRobots Seekur Jr. all-terrain rover, which is able to operate outdoors in any weather, will be used.

A new 3D video stabilization algorithm has been developed by the IPCV-LAB to remove camera jitter from image sequences captured by planetary robots. First, the frame-to-frame 3D motion of the surface with respect to the camera coordinate system is estimated and accumulated over time. The estimation is performed by maximizing a likelihood function of the frame-to-frame intensity differences measured at key observation points. Then, the jitter is determined as the perspective projection of the difference between the accumulated 3D surface translation and a smoothed version of it. Finally, the stabilized video is synthesized by shifting the entire content of each image by a displacement vector with the same magnitude as, but opposite direction to, the jitter estimated for that image. Experimental results with synthetic data revealed real-time operation with low latency and a reduction of the jitter by a factor of 20. Experimental results with real image sequences captured by a rover platform in indoor and outdoor conditions show very reliable and encouraging stabilization.
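The jitter-removal idea above can be sketched as follows. This is a simplified illustration under stated assumptions, not the published algorithm: it assumes a moving-average smoother, a pinhole projection with a single scene depth, and an integer-pixel image shift; all names and parameters are hypothetical.

```python
import numpy as np

def moving_average(x, window):
    """Simple moving-average smoother along the time axis (axis 0)."""
    kernel = np.ones(window) / window
    return np.stack([np.convolve(x[:, i], kernel, mode='same')
                     for i in range(x.shape[1])], axis=1)

def estimate_jitter(accum_t, focal_length, depth, window=15):
    """Jitter = perspective projection of (accumulated 3D translation
    minus its smoothed version) onto the image plane, in pixels."""
    diff = accum_t - moving_average(accum_t, window)
    # Pinhole model: lateral/vertical translation components map to
    # pixel displacements scaled by focal_length / depth.
    return focal_length * diff[:, :2] / depth

def stabilize(frame, jitter_xy):
    """Shift the whole frame opposite to the estimated jitter
    (integer-pixel shift; borders wrap for simplicity)."""
    dx, dy = np.round(-jitter_xy).astype(int)
    return np.roll(np.roll(frame, dy, axis=0), dx, axis=1)
```

A smooth (e.g. constant-velocity) accumulated translation yields zero jitter away from the smoothing-window borders, so intentional camera motion is preserved while high-frequency shake is compensated.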


A detailed description of the algorithm can be found here.

On April 19th and 21st, 2016, at 2 pm, Dr. Geovanni Martinez demonstrated the RTK dGPS Outdoor Guidance System of the Seekur Jr. rover in the gardens of the School of Electrical Engineering for the UCR Technology Fair participants. They also had the opportunity to teleoperate the rover.

Dr. Geovanni Martinez gave a talk entitled "Intensity-Difference Based Monocular Visual Odometry for Planetary Rovers" at the University of Costa Rica as part of the commemoration of the 19th anniversary of the School of Geography.

Dr. Geovanni Martinez gave a talk entitled "Intensity-Difference Based Monocular Visual Odometry for Planetary Rovers: A Case Study" at the “United Nations/Costa Rica Workshop on Human Space Technology”, San Jose, Costa Rica, 7-11 March, 2016.

Prof. Dr. Johanna Wanka, German Federal Minister of Education and Research, visited the University of Costa Rica. During her visit, Dr. Geovanni Martinez, IPCV-LAB coordinator, had the opportunity to explain the latest results obtained through research projects in the area of vision-based navigation systems for autonomous planetary robots, specifically the results obtained by testing the intensity-difference based monocular visual odometry algorithm on a real robot platform in outdoor sunlit conditions. This latter algorithm was recently developed at the IPCV-LAB for robot positioning using a single video camera.

Representatives of the Informatics Center of the University of Costa Rica visited the IPCV-LAB to learn first-hand about the research projects being carried out by the IPCV-LAB on vision-based navigation of autonomous planetary robots. In addition, the IPCV-LAB received an invitation to participate in the UCR Technology Fair that they are organizing.

During the week of robotics organized by the ARCOS-LAB, Dr. Geovanni Martinez, IPCV-LAB coordinator, had the opportunity to give the following three talks: 1) Intensity-Difference Based Monocular Visual Odometry for Planetary Rovers (January 25th, 2 pm to 3 pm), 2) Vision-Based Teleoperation of Robotic Arms (January 25th, 3 pm to 4 pm), and 3) Vision-Based Autonomous Navigation for Planetary Rovers (January 26th, 1 pm to 2 pm). The latter talk included a demonstration of the autonomous capabilities of the Seekur Jr. rover, which was recently acquired for testing the vision-based autonomous navigation systems being developed at the IPCV-LAB.