Smart Glasses Help Visually Impaired ‘See’ Again

by Sophie Lin - Technology Editor

AI-Powered Vision: How Wearable Tech Like AiSee is Ushering in a New Era of Effortless Computing

Imagine a world where visual information is instantly accessible, not just for those who can see, but for anyone needing hands-free intelligence. That future is rapidly approaching, thanks to innovations like AiSee, a groundbreaking wearable developed by researchers at the National University of Singapore (NUS). This isn’t just about assistive technology; it’s a glimpse into a future where AI seamlessly integrates into our daily lives, transforming how we interact with the world around us.

From Finger Ring to Smart Headphone: The Evolution of AiSee

The journey of AiSee began in 2018 as a finger-worn ring, but quickly evolved into its current form: an open-ear headphone. This design choice wasn’t arbitrary. Professor Suranga Nanayakkara, who led the NUS research team, explains the shift was driven by a desire to avoid social stigma and, crucially, to preserve users’ spatial awareness. Keeping ears uncovered allows for natural sound perception, vital for navigating environments safely and confidently. Early prototypes faced challenges – hair obstructing the camera, limited battery life – but user feedback has been instrumental in refining the design into a practical, everyday device.

AiSee isn’t intended to be a specialized tool used sporadically. “It doesn’t make sense to have something that’s used once a day or maybe a few times a week,” Nanayakkara emphasizes. “We’ve built it more as a smart headphone,” capable of standard audio functions alongside its AI capabilities.

The Power of Large Language Models: Beyond Object Identification

The real breakthrough for AiSee came with the integration of Large Language Models (LLMs), specifically Meta’s Llama. This transformed the device from a simple object identifier – “That’s a chair” – into a conversational assistant. Now, users can ask follow-up questions: “What kind of chair is it?” or “How much does it cost?” This conversational ability unlocks a new level of interaction and understanding.

AiSee operates on an “agentic AI framework,” combining computer vision, reasoning models, and the Llama LLM to interpret user intent and execute tasks. To run these models efficiently on an Android-based device, the team chose smaller Llama variants, in the one-to-three-billion-parameter range, and applied quantization techniques to compress them further without significant performance loss.
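The article doesn’t specify which quantization scheme the team used, but in this context quantization generally means storing a model’s weights at lower numeric precision so they fit in less memory and run faster on mobile hardware. A minimal sketch of symmetric int8 weight quantization with NumPy, purely illustrative and not AiSee’s actual implementation:

```python
import numpy as np

def quantize_int8(weights):
    """Symmetric int8 quantization: map float weights onto [-127, 127]."""
    scale = np.max(np.abs(weights)) / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 values."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4)).astype(np.float32)  # stand-in for one weight matrix

q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)

# int8 storage is 4x smaller than float32; the rounding error per weight
# is bounded by half the quantization step (scale / 2).
print(q.nbytes, "vs", w.nbytes, "bytes")
```

Applied across every weight matrix of a one-to-three-billion-parameter model, this kind of 4x (or greater, with 4-bit schemes) compression is what makes fully on-device inference plausible on phone-class hardware.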

Offline Processing and Data Privacy: A Critical Advantage

The decision to utilize Llama wasn’t solely based on technical capabilities. A key driver was the need for offline processing, particularly for sensitive information. As Nanayakkara points out, “If you are a blind person getting a new employment contract, you’d want to understand what’s in the contract and ask questions about it. You don’t want it to be scanned and uploaded to the cloud.” Smaller Llama models, capable of running directly on the device, provide this crucial privacy and security.

Beyond Assistive Tech: The Curb-Cut Effect and the Future of Visual Intelligence

AiSee’s founders envision a future far beyond its initial application as assistive technology. Nanayakkara cites the “curb-cut effect” – features designed for people with disabilities often benefit the wider population. He believes AiSee’s true commercial success will lie in its potential as a hands-free, screen-free computing interface for everyone.

This vision aligns with a growing trend towards ambient computing, where technology fades into the background, anticipating and responding to our needs without requiring constant attention. AiSee represents a significant step towards this future, offering a natural and intuitive way to access information and interact with the world.

The Rise of Agentic AI and Wearable Computing

AiSee’s success hinges on two converging trends: the rapid advancement of agentic AI and the increasing sophistication of wearable technology. Agentic AI, where AI systems proactively perform tasks on behalf of users, is moving beyond theoretical concepts and into practical applications. Coupled with the miniaturization of powerful computing hardware, this allows for truly intelligent and portable devices like AiSee.

Expanding Capabilities: Localization and Integration

Currently, AiSee’s language support is limited to what’s available through Llama. However, the team is actively exploring localization options, responding to a request from a foundation in the United Arab Emirates. This highlights the importance of adapting AI solutions to diverse linguistic and cultural contexts.

Furthermore, AiSee is already demonstrating its integration potential, partnering with Southeast Asian super app Grab to develop a voice-based ride-booking system. This showcases the device’s versatility and its ability to enhance existing services.

Implications for Industries: From Tourism to Accessibility

The implications of AiSee extend far beyond individual users. Organizations like museums and airport operators are exploring how the technology can enhance accessibility and inclusivity. Imagine a museum visitor using AiSee to receive detailed descriptions of exhibits, or a traveler navigating an airport with ease, receiving real-time guidance and assistance.

The potential applications are vast, spanning industries such as:

  • Retail: Providing product information and assistance to shoppers.
  • Manufacturing: Assisting workers with complex tasks and quality control.
  • Healthcare: Supporting visually impaired patients with medication management and daily living activities.

Frequently Asked Questions

Q: How does AiSee handle privacy concerns?
A: AiSee prioritizes user privacy by processing data locally on the device, eliminating the need to upload sensitive information to the cloud.

Q: What is the battery life of AiSee?
A: The latest iteration of AiSee has improved battery life based on user feedback, offering sufficient power for all-day use.

Q: Will AiSee support more languages in the future?
A: The team is actively exploring localization options and plans to expand language support based on available resources and demand.

Q: Is AiSee only for people with visual impairments?
A: While initially designed for assistive purposes, AiSee’s developers envision it as a versatile tool for anyone seeking hands-free access to visual information.

As AiSee moves towards a consumer launch, it represents more than just a technological innovation. It embodies a shift towards a more inclusive and accessible future, powered by the convergence of AI, wearable technology, and a commitment to user-centered design. What are your predictions for the role of AI-powered wearables in the next five years? Share your thoughts in the comments below!
