A team of researchers led by Professor Jonathan Kelly (UTIAS) has found a way to enhance the visual perception of robotic systems by coupling two different types of neural networks: one that estimates depth and one that estimates motion. The innovation could help autonomous vehicles navigate busy streets or enable medical robots to work effectively in crowded hospital hallways.
“What tends to happen in our field is that when systems don’t perform as expected, the designers make the networks bigger — they add more parameters,” says Kelly.
“What we’ve done instead is to carefully study how the pieces should fit together. Specifically, we investigated how two pieces of the motion estimation problem — accurate perception of depth and motion — can be joined together in a robust way.”
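The article doesn't describe the architecture itself, but the general idea of coupling a depth network with a motion network can be illustrated with a minimal sketch. Everything below is hypothetical: the `DepthNet` and `MotionNet` classes, their layer sizes, and the choice to feed the depth map into the motion network are illustrative assumptions for this PyTorch example, not the lab's actual design. The sketch simply shows one network's depth estimate conditioning the other network's estimate of camera motion.

```python
import torch
import torch.nn as nn

class DepthNet(nn.Module):
    """Hypothetical depth network: predicts a per-pixel depth map
    from a single RGB image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, kernel_size=3, padding=1), nn.Softplus(),  # keep depth positive
        )

    def forward(self, image):          # image: (B, 3, H, W)
        return self.net(image)         # depth: (B, 1, H, W)

class MotionNet(nn.Module):
    """Hypothetical motion network: predicts a 6-DoF relative pose
    (3 translation + 3 rotation parameters) from an image pair plus
    the depth map predicted for the first frame."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3 + 3 + 1, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 6)

    def forward(self, img_a, img_b, depth_a):
        # Coupling: the depth estimate is an input to the motion estimate.
        x = torch.cat([img_a, img_b, depth_a], dim=1)
        return self.head(self.encoder(x).flatten(1))  # pose: (B, 6)

depth_net, motion_net = DepthNet(), MotionNet()
img_a = torch.rand(1, 3, 64, 64)   # frame at time t
img_b = torch.rand(1, 3, 64, 64)   # frame at time t + 1
depth_a = depth_net(img_a)
pose_ab = motion_net(img_a, img_b, depth_a)
print(pose_ab.shape)               # torch.Size([1, 6])
```

In the research literature, depth-and-motion network pairs like this are often trained jointly with photometric reconstruction losses; whether that matches this team's approach isn't stated in the article.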
Researchers in Kelly’s Space and Terrestrial Autonomous Robotic Systems lab aim to build reliable systems that can help humans accomplish a variety of tasks. For example, they’ve designed an electric wheelchair that automates common manoeuvres, such as navigating through doorways.
More recently, they’ve focused on techniques that will help robots move out of the carefully controlled environments in which they are commonly used today, and into the less predictable world we humans are used to navigating.
Full article: Improved visual perception method could help robots navigate crowded spaces