Google has announced a slew of updates to its suite of accessibility apps to celebrate Global Accessibility Awareness Day.
One of the updates is the release of Action Blocks, which lets users create customisable home screen buttons for relatively complex actions, such as playing music or calling somebody, that typically require multiple steps -- tasks that may be difficult for people with limited mobility or a cognitive disability.
"For people with cognitive disabilities or age-related cognitive conditions, it can be difficult to learn and remember each of these steps. For others, it can be time consuming and cumbersome -- especially if you have limited mobility," Google said.
Action Blocks allows any action that Google Assistant can perform -- such as making calls, sending texts, or controlling devices in a user's home -- to be triggered with a single tap. The button can be customised by picking an image for the Action Block from the phone's photo gallery; the block is then placed on the user's home screen for one-touch access.
"We developed Action Blocks specifically for people with cognitive decline or age-related cognitive conditions. It's really important to engage with the community. We work with people around the area and across the country to understand which use cases or tasks are important for them to complete," Google AI and accessibility product manager Patrick Clary said.
Live Transcribe uses a phone's microphone to automatically transcribe real-world speech into captions in real-time. With the new updates, Live Transcribe will provide users with the option to have their phone vibrate whenever someone nearby says their name to make it easier to get the attention of those who are deaf or hard of hearing.
The app will also now allow users to add custom names or terms for different places and objects that are not commonly found in the dictionary.
Google has also expanded Live Transcribe's language support to now include Albanian, Burmese, Estonian, Macedonian, Mongolian, Punjabi, and Uzbek.
Meanwhile, Sound Amplifier uses a phone and a set of headphones to filter, augment, and amplify sounds so that users can better hear conversations or announcements in noise-heavy environments.
Previously, users could only use the app properly with wired headphones, but the app has now been updated to support Bluetooth headphones.
All three of these apps are available on the Google Play Store. Action Blocks and Live Transcribe require Android 5.0 or above, while Sound Amplifier requires Android 6.0 or above.
Google has also updated its Maps app across both Android and iOS to make it easier for users to see wheelchair accessibility information.
By turning on the "accessible places" feature, users will now be able to see wheelchair accessibility information right away instead of having to tap into a location's details. When accessible places is switched on, a wheelchair icon will indicate an accessible entrance, and users will be able to see whether a place has accessible seating, restrooms, or parking.
If it is confirmed that a place does not have an accessible entrance, Maps will also show that information, Google said.
Samsung also announced three new accessibility features for its devices -- called Quick Reader, Scene Describer, and Color Detector -- to mark Global Accessibility Awareness Day and make it easier for people with disabilities to use smart devices.
Quick Reader allows users to gain more information about their surroundings by using a smartphone's camera. It does this by reading out written text in real time to help users gain a better understanding of textual information in daily life.
The feature was created as understanding labels and signs is a daily challenge for users with visual impairments, Samsung said.
It can also recognise over 1,000 common objects and items, such as food and vegetables in the kitchen, as well as cleaning products.
Samsung also released Scene Describer, which provides descriptions of images, including captured scenes and downloaded pictures, to help users identify potential obstacles when they are navigating their surroundings.
Lastly, Color Detector uses a camera scan to inform users of the colour of the item in the frame to help people with visual impairments to identify the materials and design of garments.
Elsewhere, the open source Equal Access Toolkit and Checker allow developers and testers to embed accessibility directly into their workflows, so their websites and applications are accessible to people with disabilities.