Robot guide dog for the visually impaired

Algorithmic Human-Robot Interaction class project Spring 2019

We emulated a guide dog in an indoor scenario to guide a human to the nearest exit. The robot provides navigation assistance through physical leash feedback and scene explanation through a natural language interface. The goal was to help visually impaired people reach their destination faster and more safely.
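
The two feedback channels are only described at a high level above; as a rough illustration (not the project's actual code), a minimal Python sketch of the guidance loop might look like the following. All function names, thresholds, and the leash/speech interfaces are hypothetical stand-ins.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float
    heading: float  # radians


def nearest_exit(pose, exits):
    """Pick the exit closest to the current position (straight-line distance)."""
    return min(exits, key=lambda e: math.hypot(e[0] - pose.x, e[1] - pose.y))


def leash_command(pose, target):
    """Translate the bearing to the target into a simple leash cue:
    a pull direction (left/right/forward) and a tension that grows with distance."""
    dx, dy = target[0] - pose.x, target[1] - pose.y
    bearing = math.atan2(dy, dx) - pose.heading
    bearing = math.atan2(math.sin(bearing), math.cos(bearing))  # wrap to [-pi, pi]
    distance = math.hypot(dx, dy)
    if abs(bearing) < math.radians(15):
        direction = "forward"
    elif bearing > 0:
        direction = "left"
    else:
        direction = "right"
    return {"direction": direction, "tension": min(1.0, distance / 5.0)}


def describe_scene(obstacles):
    """Compose a short natural-language description of what lies ahead."""
    if not obstacles:
        return "The path ahead is clear."
    return "Ahead of you: " + ", ".join(obstacles) + "."


if __name__ == "__main__":
    pose = Pose(x=0.0, y=0.0, heading=0.0)
    exits = [(10.0, 2.0), (-4.0, 8.0)]
    goal = nearest_exit(pose, exits)
    print(leash_command(pose, goal))   # e.g. {'direction': 'forward', 'tension': 1.0}
    print(describe_scene(["a chair on the left", "stairs in five meters"]))
```

In the real system the leash cue would drive an actuator and the scene description would be spoken aloud; the sketch just shows how localization, exit selection, and the two feedback channels fit together.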