Low Vision

Blappy is a Bluetooth Android app that enables people with visual and auditory disabilities to communicate effectively. The app converts voice to text and text to voice, and displays high-contrast images that can be viewed with the zoom feature. Because Blappy uses Bluetooth, it is intended for people who are within about 30 meters of each other.

Blappy is currently available in four languages:

  • Spanish
  • French
  • English
  • Portuguese

Conversations can be translated into all four languages.

Developers are currently working on an iOS version. The project was carried out with the support of UC3M's Audiovisual Accessibility Laboratory, which is part of the Center for Technologies for Disability and Dependence in UC3M's Science Park.

Read more about Blappy.

Through a relationship with Quantum Reading, Learning, Vision, OrCam's assistive technology (AT) device is now available to people who are blind in Australia. OrCam MyEye is the world's most advanced wearable AT solution. It uses a small camera mounted on the user's glasses to read printed text aloud in real time through a discreet earpiece. It can also recognize people's faces and products in stores. The devices are hand delivered by a trainer who teaches new users how to use the device in their daily life.

UMass Boston's engineering students have collected a year's worth of Wi-Fi signal data to create a map of the campus. Using the IBM Accessible Location-based Service, people with disabilities will be able to download an app on their mobile device and identify their location using the Wi-Fi signals. They can then enter a destination, and the app will provide turn-by-turn, accessible route guidance based on the current physical campus environment.
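Identifying a location from Wi-Fi signals is commonly done with "fingerprinting": each mapped spot is stored with the signal strengths (RSSI) it sees from nearby access points, and a live scan is matched to the closest stored vector. A minimal nearest-neighbor sketch of that idea follows; the location names, access-point IDs, and RSSI values are invented for illustration, and IBM's actual service is certainly more sophisticated.

```python
import math

# Hypothetical fingerprint database: mapped location -> RSSI readings (dBm)
# from a few access points. A real system stores many such samples per spot.
FINGERPRINTS = {
    "Library entrance": {"ap1": -40, "ap2": -70, "ap3": -85},
    "Cafeteria":        {"ap1": -75, "ap2": -45, "ap3": -60},
    "Lecture hall":     {"ap1": -88, "ap2": -62, "ap3": -42},
}

def locate(scan, fingerprints=FINGERPRINTS, missing=-100):
    """Return the mapped location whose stored RSSI vector is closest
    (Euclidean distance) to the live Wi-Fi scan; an unseen access point
    is treated as a very weak signal."""
    def distance(stored):
        aps = set(stored) | set(scan)
        return math.sqrt(sum(
            (stored.get(ap, missing) - scan.get(ap, missing)) ** 2
            for ap in aps))
    return min(fingerprints, key=lambda loc: distance(fingerprints[loc]))

# A live scan near the library entrance matches its stored fingerprint.
print(locate({"ap1": -42, "ap2": -72, "ap3": -83}))
# -> Library entrance
```

Turn-by-turn guidance then becomes a routing problem over the mapped locations, with the fingerprint match supplying the user's current node.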

This technology has great potential for other environments such as airports, hospitals, office buildings and shopping malls. It could benefit many people such as:

  • Firefighters
  • The elderly
  • People with short term memory issues
  • People with vision disabilities

Read Dr. Ping Chen’s article on GAAD. 

Facebook can now automatically create alternative text for images, generating descriptions that enable users who are blind or have low vision to envisage the content of the photo. The iOS app provides an audio breakdown of what's happening in the photo using object recognition technology.

Using its vast supply of user images, Facebook has trained a deep neural network that drives a computer vision system to recognize objects in images. In line with the WCAG 2.0 guidelines, the results are translated into "alt text," which can be read by screen readers.
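The final step — turning recognition results into alt text — can be sketched simply: keep only the labels the model is confident about and join them into a sentence. The labels, confidences, threshold, and exact wording below are illustrative assumptions, not Facebook's actual pipeline.

```python
def build_alt_text(detections, threshold=0.8):
    """Build an alt-text string from (label, confidence) pairs produced
    by an object-recognition model, keeping only confident labels."""
    labels = [label for label, conf in detections if conf >= threshold]
    if not labels:
        return "No description available."
    return "Image may contain: " + ", ".join(labels) + "."

# Hypothetical model output: (label, confidence) pairs for one photo.
detections = [("tree", 0.98), ("sky", 0.95), ("two people", 0.91),
              ("car", 0.40)]
print(build_alt_text(detections))
# -> Image may contain: tree, sky, two people.
```

The confidence threshold is the key design choice: a wrong description read aloud is worse for a screen reader user than an omitted one, so low-confidence labels are dropped.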

The popular screen reader NVDA has released version 2016.1, adding new features and changes.

New features include:

  • Support for Baum VarioUltra and Pronto! when connected via USB
  • New braille translation tables:
    • Polish 8-dot computer braille
    • Mongolian
  • Ability to turn off the braille cursor and change its shape via the Show Cursor and Cursor Shape options in the Braille Settings dialog
  • Bluetooth connection to a HIMS Smart Beetle braille display
  • Ability to lower the volume of other sounds on Windows 8 and higher through the Audio Ducking Mode option in the Synthesizer dialog or by pressing NVDA+shift+d
  • Support for APH Refreshabraille in HID mode
  • Support for HumanWare Brailliant BI/B braille displays when the protocol is set to OpenBraille

Changes:

  • Emphasis reporting is disabled by default
  • The shortcut for Formulas in the Elements List dialog in MS Excel has been changed to alt+r
  • Liblouis braille translator updated to 2.6.5
  • Text objects no longer announce “text” when they have focus.

The WCAG 2.0 guidelines help in coding accessibly and help meet the requirements of the ADA.

A hand-worn device developed at the University of Nevada, Reno by Yantao Shen uses robotic technology to help people with vision disabilities. The robotic device will allow these people to navigate past movable obstacles and assist in pre-locating, pre-sensing and grasping an object.

The new technology combines vision, tactile, force, temperature, and audio sensors with actuators to help the user pre-sense an object, locate it, feel its shape and size, and then grasp it.

Read more about the Robotic Aid

iOS has an accessibility feature that allows users to select their preferred text size. Some applications respect this setting and adjust their text size appropriately; others do not. You can change your preferred text size in the Settings app under Display & Brightness → Text Size.

The Qatar Computing Research Institute (QCRI) has developed a custom keyboard for iOS. The new BrailleEasy Keyboard enables one-handed typing for people with vision disabilities and is based on braille. It is available for both Arabic and English speaking users.

The keyboard is based on a traditional two-handed braille typing layout but has been customized for comfortable one-handed typing. By adapting two-handed brailling into two successive gestures, users quickly learn how to use the BrailleEasy keyboard.
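The idea of splitting a braille cell across two gestures can be sketched as follows. This assumes the two gestures correspond to the left column of the cell (dots 1–3) followed by the right column (dots 4–6); BrailleEasy's actual gesture scheme may differ, and only a few letters of the standard braille alphabet are included here.

```python
# Standard braille dot numbering: dots 1-3 form the left column of the
# cell, dots 4-6 the right column. A few letters for illustration.
BRAILLE = {
    frozenset({1}): "a",        frozenset({1, 2}): "b",
    frozenset({1, 4}): "c",     frozenset({1, 4, 5}): "d",
    frozenset({1, 5}): "e",     frozenset({1, 2, 4}): "f",
}

def decode_cell(first_gesture, second_gesture):
    """Combine two one-handed gestures into a full braille cell.

    first_gesture  -- finger positions 1-3, taken as dots 1-3
    second_gesture -- finger positions 1-3, shifted to dots 4-6
    """
    dots = frozenset(first_gesture) | frozenset(d + 3 for d in second_gesture)
    return BRAILLE.get(dots, "?")

print(decode_cell([1], [1]))   # dots {1, 4}
# -> c
print(decode_cell([1, 2], []))  # dots {1, 2}
# -> b
```

The same finger positions are reused for both halves of the cell, which is what makes one-handed entry possible without shrinking the touch targets.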

Read more about the BrailleEasy keyboard. 

 

Fusion is perfect for individuals who, over time, want a smooth and easy transition from magnification to full-screen reading. ZoomText Fusion is designed to grow with you, ensuring that you will always be able to use your computer.
