Getting More from What You've Already Got: Improving Stereo Visual Odometry Using Deep Visual Illumination Estimation

Speaker: Jon Kelly, UTIAS

Date: Tuesday, April 04, 2017

Time: 11:00 AM to 12:00 PM Note: all times are in the Eastern Time Zone

Public: Yes

Location: 32-G449 Patil/Kiva


Contact: Nick Roy


Visual navigation is essential to many successful robotics applications. Visual odometry (VO), an incremental dead-reckoning technique, has been widely deployed on platforms including the Mars Exploration Rovers and the Mars Science Laboratory. However, this approach to visual motion estimation exhibits superlinear growth in positioning error over time, due in large part to orientation drift.

In this talk, I will describe our group's recent work on a method that incorporates global orientation information from the sun into a visual odometry (VO) pipeline, using only the existing image stream. This is challenging in part because the sun is typically not visible in the input images. Our work leverages recent advances in Bayesian convolutional neural networks (BCNNs) to train a sun detection model (dubbed Sun-BCNN) that infers a three-dimensional sun direction vector from a single RGB image. Crucially, the technique also computes a principled uncertainty for each prediction using a Monte Carlo dropout scheme. We incorporate this uncertainty into a sliding-window stereo VO pipeline, where accurate uncertainty estimates are critical for optimal data fusion.
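The Monte Carlo dropout idea above can be sketched in a few lines: run the network's stochastic forward pass (dropout left active at test time) many times, then take the sample mean as the sun direction estimate and the sample covariance as its uncertainty. The `predict` callable and its signature below are illustrative assumptions, not the authors' actual API:

```python
import numpy as np

def mc_dropout_sun_direction(predict, image, n_samples=50, rng=None):
    """Estimate a unit sun-direction vector and its covariance via MC dropout.

    `predict(image, rng)` stands in for one stochastic forward pass of a
    dropout-enabled network, returning a 3-vector (hypothetical interface).
    """
    rng = rng if rng is not None else np.random.default_rng()
    # Each call samples a different dropout mask, giving a different output.
    samples = np.stack([predict(image, rng) for _ in range(n_samples)])
    # Normalize each sample to a unit direction vector.
    samples /= np.linalg.norm(samples, axis=1, keepdims=True)
    mean = samples.mean(axis=0)
    mean /= np.linalg.norm(mean)
    # The sample covariance approximates the predictive uncertainty;
    # it can weight the sun measurement when fusing with the VO estimate.
    cov = np.cov(samples, rowvar=False)
    return mean, cov

# Usage with a dummy stochastic predictor standing in for the network:
def noisy_predict(image, rng):
    return np.array([0.0, 0.0, 1.0]) + 0.01 * rng.normal(size=3)

direction, covariance = mc_dropout_sun_direction(
    noisy_predict, image=None, n_samples=200, rng=np.random.default_rng(0))
```

A tighter covariance means the fused estimator trusts the sun cue more; a diffuse one (e.g. under heavy cloud) lets the pipeline fall back toward plain VO.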

I will present the results of our evaluation on the KITTI odometry benchmark, where significant improvements are obtained over ‘vanilla’ VO. I will also describe additional experimental evaluation on 10 km of navigation data from a Mars analogue site on Devon Island in the Canadian High Arctic. Finally, I will give an overview of our analysis of the model's sensitivity to cloud cover, and discuss the possibility of model transfer between urban and planetary analogue environments.


Dr. Kelly is an Assistant Professor at the University of Toronto Institute for Aerospace Studies, where he directs the Space & Terrestrial Autonomous Robotic Systems (STARS) Laboratory. Prior to joining U of T, he was a postdoctoral researcher in the Robust Robotics Group at MIT. Dr. Kelly received his PhD degree in 2011 from the University of Southern California, under the supervision of Prof. Gaurav Sukhatme. He was supported at USC in part by an Annenberg Fellowship. Prior to graduate school, he was a software engineer at the Canadian Space Agency in Montreal, Canada. His research interests lie primarily in the areas of sensor fusion, estimation, and machine learning for navigation and mapping, applied to both robots and human-centred assistive technologies.


See other events that are part of the Robotics@MIT Seminar Series 2017.

Created by Nick Roy on Tuesday, March 28, 2017 at 10:37 AM.