Google Announces New Focus on Technology for Users with Cognitive Disabilities



October 3, 2019

Google has introduced many innovative accessibility solutions in the past two years. For people who are deaf or hard of hearing, Google introduced Sound Amplifier and Live Transcribe, and will soon introduce Live Caption and Live Relay. For individuals with vision loss, it introduced its Lookout application. And Voice Access was introduced for individuals with mobility disabilities.

With all of this focus and energy devoted to helping individuals with hearing, vision, and mobility disabilities, the question was: what is next? Google answered this question a few days ago. On October 1, Eve Anderson, Google's Director of Inclusive Machine Learning & Accessibility Engineering, stated that "next on our roadmap is building technology that benefits people with cognitive disabilities."

The first new feature to be developed is called "Action Blocks." In a blog post published on October 2, Google explained that "[w]ith Action Blocks, you add Assistant commands to your home screen with a custom image, which acts as a visual cue." In other words, Action Blocks offers a very simple way to trigger Google Assistant commands that would otherwise require several steps, all with a single tap of an icon.

To take an example, if you want to text your father to say that you need to be picked up, you might tell the Google Assistant, "I want to text Dad." The Google Assistant will then ask, "What's the message?" You would reply, "Please pick me up at school." The Google Assistant will say, "I got 'please pick me up at school.' Do you want to send it or change it?" You would reply, "Send it." Even this fairly simple sequence might be difficult for someone with an intellectual disability. Instead, you could put an Action Block on the home screen that does the exact same thing with a single tap. The Action Block would include a visual cue that would make it easier to recognize.

Action Blocks could also be a great help to seniors who have difficulty using a smartphone. For example, you could create an Action Block that orders an Uber ride to the grocery store. Action Blocks can apparently be built to do anything that the Google Assistant can do.

We look forward to seeing how Google develops Action Blocks and other features for individuals with intellectual disabilities.