Human operators of remotely driven systems commonly suffer from loss of situational awareness and fatigue. Despite the availability of topographical maps, operators find it difficult to track the rover's position while staying focused on mission goals. The objective of the VIPER project is to mitigate these problems.
The VIsual Position EstimatoR is designed to estimate a rover's position from the images it acquires; the estimates are presented to the operator smoothly.
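To illustrate what "presented smoothly" might mean in practice, the sketch below applies an exponential moving average to a sequence of raw per-frame position estimates. This is a hypothetical example, not the actual VIPER filter: the function name, the choice of filter, and the smoothing factor are all assumptions.

```python
def smooth_positions(estimates, alpha=0.3):
    """Exponentially smooth a sequence of (x, y) position estimates.

    Hypothetical illustration only: the real VIPER pipeline may use a
    different filter. alpha is the smoothing factor in (0, 1]; smaller
    values give smoother but more lagged output.
    """
    smoothed = []
    prev = None
    for x, y in estimates:
        if prev is None:
            # First estimate passes through unchanged.
            prev = (x, y)
        else:
            # Blend the new estimate with the previous smoothed value.
            prev = (alpha * x + (1 - alpha) * prev[0],
                    alpha * y + (1 - alpha) * prev[1])
        smoothed.append(prev)
    return smoothed

# Example: noisy per-frame estimates along a roughly straight path.
raw = [(0.0, 0.0), (1.2, 0.1), (1.9, -0.2), (3.1, 0.15)]
print(smooth_positions(raw))
```

Smoothing of this kind trades a small amount of lag for a display trajectory that does not jitter, which is what lets the operator follow the rover's motion at a glance.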
The final goal of the VIPER system is to demonstrate how an Augmented Reality interface can reduce the cognitive load on users of teleoperated systems. This is accomplished by providing visual cues that keep operators from becoming lost or disoriented.