Imagine this scenario: a student with severe dyslexia struggling daily with the written word as they try to keep pace in class. Or picture an elderly person with deteriorating vision who wants to stay connected with loved ones online. Challenges like these are common, and this is exactly where assistive technology (AT) steps in. But selecting the right assistive technology raises an important question: should you opt for specialist assistive technology or rely on mainstream (non-specialist) solutions? This post explores these two categories, covering their histories, benefits, roles in empowering users, and how to access funding and support.
The Evolution and History of Assistive Technology
Assistive technology isn’t a new concept; it has existed in various forms for decades. Originally, assistive devices were typically specialist tools designed exclusively to support disabled users. Early innovations included hearing aids in the early 20th century, Braille systems developed in the 1800s, and mobility aids like wheelchairs that date back hundreds of years.
However, in recent years the range of AT available has grown considerably, driven by technological advancements and greater social awareness. Today, devices initially developed for general use, such as smartphones, tablets, and voice assistants, include robust built-in accessibility features, marking the rise of mainstream AT.