The Third Eye: A Shopping Assistant for the Visually Impaired
Visual Cortex on Silicon project, an NSF Expedition in Computing
I have been investigating the feasibility of smart camera prosthetic devices that assist people with visual impairments (PVI) with grocery shopping. In particular, I have been investigating and designing a feedback interface that guides their hand and arm movements during object acquisition. For this project, I conducted two field studies using participatory design and ethnographic approaches, and two experimental lab studies to evaluate the prototype and the feedback interface design. Further details of these studies appear below. I am currently interested in developing guidelines for designing an interactive dialog system that assists people with visual impairments; I am conducting interview studies to gather requirements and am designing and developing an interactive dialog system prototype.
We are investigating a conversation-based prosthetic and the interaction it provides to people with visual impairments (PVI) by studying the experiences of remote sighted agents at Aira (https://aira.io/), who provide assistance to PVI, and of the PVI who receive that assistance. We are particularly interested in how conversational modalities (verbal and nonverbal) are used, and how they are selected to accommodate context, temporal relevancy, and the needs of a particular task. We are also investigating the integration of other kinds of modalities (e.g., information presented haptically) into the conversation, and how such integration improves guidance support and interaction.
By conducting field studies with ethnographic and participatory design approaches, we learned how people with visual impairments experience and understand the world around them, and what prosthetic processes and technologies (including assistive technologies and collaboration with sighted assistants) they use to perform daily tasks and activities. We focused on shopping, an essential task that they nevertheless find challenging.
Through iterative lab studies, we investigated and evaluated the effectiveness and user experience of the smart camera prosthetic and a feedback interface prototype (wearable technology with single or multimodal auditory and haptic output) that deliver information and navigational guidance to people with visual impairments, using grocery shopping as the task context.