In a world where visual cues dominate, navigating daily life can be a significant challenge for people with visual impairments.
Recognizing objects, something many of us take for granted, can demand considerable effort and complex decision-making from those who cannot see.
While advancements in artificial intelligence (AI) have greatly improved visual recognition technologies, applying these breakthroughs in a way that benefits real people in their everyday lives has remained a difficult task.
Enter AiSee: a revolutionary wearable device designed to bridge this gap and provide visually impaired individuals with the ability to “see” the world around them.
Developed initially in 2018 and refined over five years, AiSee represents a leap forward in assistive technology. Unlike traditional devices that may use glasses with an attached camera, AiSee offers a more discreet and user-friendly approach.
It is worn around the ears and extends across the back of the head, avoiding the potential stigma associated with glasses, and uses bone conduction headphones to deliver auditory information.
Suranga Nanayakkara, the lead researcher behind Project AiSee and an associate professor at the National University of Singapore (NUS) School of Computing, emphasizes the importance of human-centered design in the development of AiSee.
By moving away from glasses, the team hopes to offer a more natural interaction for users, respecting their comfort and dignity.
AiSee is built around three fundamental components:
The Eye: At the heart of AiSee is a vision engine: a micro-camera that captures the user’s field of view, paired with software adept at recognizing features in an image, such as text, logos, and labels, which are crucial for identifying objects.
The Brain: Once an image is captured, it’s processed by cloud-based AI algorithms capable of analyzing and identifying the object. This system also supports interactive Q&A, allowing users to ask questions about the object and receive detailed answers.
This functionality is powered by sophisticated text-to-speech and speech-to-text technologies, making AiSee not just a tool for object recognition but a comprehensive assistant for acquiring information about the user’s surroundings.
The Speaker: Sound is transmitted through bone conduction technology, which has a crucial advantage for visually impaired users.
It allows them to hear the device’s output without blocking out important environmental sounds, maintaining awareness of their surroundings for safe navigation.
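The three components described above form a simple capture-recognize-speak loop. The sketch below illustrates that flow; every function and name in it is a hypothetical stub for illustration, not AiSee's actual software or API:

```python
# Illustrative sketch of a capture -> cloud recognition -> audio pipeline,
# modeled loosely on AiSee's three components. All stubs are hypothetical.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Frame:
    """An image captured by the head-worn micro-camera (stubbed as bytes)."""
    pixels: bytes


def capture_frame() -> Frame:
    """'The Eye': grab the user's current field of view (stubbed)."""
    return Frame(pixels=b"\x00\x01\x02")


def recognize(frame: Frame, question: Optional[str] = None) -> str:
    """'The Brain': send the frame, and optionally a follow-up question,
    to a cloud vision / Q&A service. Stubbed with canned answers."""
    if question:
        return f"Answering '{question}': the label says tomato soup."
    return "A can of soup, label facing you."


def speak(text: str) -> str:
    """'The Speaker': deliver the answer via bone conduction, leaving the
    ears uncovered so environmental sounds remain audible. Stubbed."""
    return f"[bone conduction] {text}"


def assist(question: Optional[str] = None) -> str:
    """One end-to-end interaction: capture, recognize, speak."""
    frame = capture_frame()
    return speak(recognize(frame, question))


print(assist())                          # plain object description
print(assist("What kind of soup is it?"))  # interactive Q&A
```

The interactive Q&A path mirrors the article's description: the same captured frame can be queried repeatedly, with speech-to-text turning the user's question into the `question` argument and text-to-speech voicing the reply.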
AiSee’s goal is to empower visually impaired individuals in Singapore and beyond, enabling them to perform tasks independently that would otherwise require assistance.
By functioning as a self-contained system, AiSee eliminates the need for smartphone pairing, a common requirement for most wearable assistive devices.
This independence from additional devices is a significant step towards making cutting-edge AI assistive technology more accessible and practical for everyday use.
The team, led by Nanayakkara, is committed to making AiSee not only technologically advanced but also affordable and accessible.
With ergonomic improvements and a faster processing unit in the works, AiSee is poised to make a significant impact on the lives of those with visual impairments.
Feedback from users like Mark Myres, an NUS student who tested AiSee, highlights its potential to benefit both visually impaired and blind individuals by offering a balance of functionality and usability.
As AiSee moves towards wider testing and refinement, its development signifies a hopeful future where technology can significantly enhance the quality of life for people with visual impairments, offering them a new way to engage with the world around them.
Copyright © 2024 Knowridge Science Report. All rights reserved.