
Instrumenting environments for guidance of people with visual impairments, W4A best paper

The NavCog project of the Cognitive Assistance Lab received the best technical paper award at the 14th International Web for All Conference in Perth, Australia, for the paper “Achieving Practical and Accurate Indoor Navigation for People with Visual Impairments”. Congratulations to the authors: Dragan Ahmetovic, Masayuki Murata, Cole Gleason, Erin Brady, Hironobu Takagi, Kris Kitani and Chieko Asakawa!

The NavCog project is a joint effort of Carnegie Mellon’s Robotics Institute and Human-Computer Interaction Institute, together with IBM Research - Tokyo, that explores the use of IoT and computer vision technologies to provide assisted navigation for people with visual impairments.

This latest work introduces a novel localization method that fuses Bluetooth Low Energy (BLE) beacon signals and Pedestrian Dead Reckoning (PDR) in a particle filter to provide accurate localization. In addition, it investigates methods for assessing and reducing the cost of instrumenting environments with IoT navigation infrastructure while preserving a desired level of localization accuracy.
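To make the idea concrete, here is a minimal, hypothetical sketch of that kind of sensor fusion (not the paper's actual implementation): particles are propagated with noisy PDR steps, weighted by how well each particle's position explains the observed beacon RSSI under an assumed log-distance path-loss model, and then resampled. All constants (transmit power, path-loss exponent, noise levels) are illustrative assumptions.

```python
import math
import random

# Assumed radio model parameters (illustrative, not from the paper)
TX_POWER = -60.0   # assumed RSSI (dBm) at 1 m from a beacon
PATH_LOSS_N = 2.0  # assumed path-loss exponent

def expected_rssi(px, py, bx, by):
    """RSSI a particle at (px, py) would expect from a beacon at (bx, by)."""
    d = max(math.hypot(px - bx, py - by), 0.1)
    return TX_POWER - 10.0 * PATH_LOSS_N * math.log10(d)

def pdr_update(particles, stride, heading):
    """Motion model: move each particle by one noisy PDR step."""
    return [
        (x + (stride + random.gauss(0, 0.1)) * math.cos(heading + random.gauss(0, 0.05)),
         y + (stride + random.gauss(0, 0.1)) * math.sin(heading + random.gauss(0, 0.05)))
        for (x, y) in particles
    ]

def rssi_weights(particles, beacons, readings, sigma=4.0):
    """Gaussian likelihood of each particle given observed beacon RSSI values."""
    weights = []
    for (x, y) in particles:
        logw = 0.0
        for (bx, by), rssi in zip(beacons, readings):
            err = rssi - expected_rssi(x, y, bx, by)
            logw -= (err * err) / (2.0 * sigma * sigma)
        weights.append(math.exp(logw))
    total = sum(weights) or 1.0
    return [w / total for w in weights]

def resample(particles, weights):
    """Draw a new particle set proportional to the weights."""
    return random.choices(particles, weights=weights, k=len(particles))

def estimate(particles):
    """Position estimate: mean of the particle cloud."""
    n = len(particles)
    return (sum(x for x, _ in particles) / n, sum(y for _, y in particles) / n)

# Toy demo: a walker heads east in 0.7 m steps past three beacons,
# with noise-free synthetic RSSI readings generated from the same model.
random.seed(42)
beacons = [(5.0, 0.0), (0.0, 5.0), (10.0, 10.0)]
particles = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(400)]
true_pos = [0.0, 0.0]
for _ in range(10):
    true_pos[0] += 0.7
    particles = pdr_update(particles, stride=0.7, heading=0.0)
    readings = [expected_rssi(true_pos[0], true_pos[1], bx, by) for bx, by in beacons]
    particles = resample(particles, rssi_weights(particles, beacons, readings))
est = estimate(particles)
```

The point of the fusion is that each sensor covers the other's weakness: PDR drifts over time but is smooth step-to-step, while BLE RSSI is noisy instant-to-instant but anchored to known beacon positions; the particle filter combines the two into a stable estimate.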

Web4All Conference page

Published: 2017-06-22