Nearly 60% of U.S. smartphone users own an iPhone. With May 16 marking Global Accessibility Awareness Day, Apple continues to add accessibility features that improve the user experience on iPhone and iPad.
Here’s a closer look at the key features aimed at making Apple devices easier to use for people with limited mobility.
Eye Tracking: A way for users with physical disabilities to control iPad or iPhone with just their eyes. Using the built-in, front-facing camera and an onscreen pointer, the device follows the user’s eye movement to navigate; when they hold their gaze on an onscreen item, it’s selected, as in tapping a key or button. Eye Tracking sets up and calibrates in seconds, and because it relies on on-device machine learning, all data used to set up and control the feature is kept securely on the device and isn’t shared with Apple.
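Eye Tracking doesn’t require anything special from app developers: it navigates and taps the same onscreen controls a finger would. As a rough sketch (the view and its names are illustrative, not an Apple API for the feature), a standard, clearly labeled SwiftUI control is already compatible:

```swift
import SwiftUI

// Eye Tracking selects the same onscreen elements a finger would tap,
// so standard, labeled controls work with it automatically.
struct PlaybackControls: View {
    @State private var isPlaying = false

    var body: some View {
        Button {
            isPlaying.toggle()
        } label: {
            Image(systemName: isPlaying ? "pause.fill" : "play.fill")
        }
        // An explicit label helps assistive features announce and
        // target the control reliably.
        .accessibilityLabel(isPlaying ? "Pause" : "Play")
    }
}
```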
Head Tracking: Similar to Eye Tracking but with a wider range of motion, this feature lets users assign actions to head movements and facial expressions, such as a smile or raised eyebrows. It’s particularly useful when you need to select a specific point, such as a location in Maps, or an item on a crowded screen.
Sound Actions: Another hands-free option for controlling a mobile device, this Switch Control feature performs actions in response to mouth sounds the user makes, such as a pop or an “s” sound.
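Sound Actions is configured entirely in Settings and has no public API of its own; at most, an app can observe whether Switch Control, the system feature it runs through, is active. A minimal UIKit sketch, assuming all you want is to log status changes:

```swift
import UIKit

// Observes whether Switch Control (which powers Sound Actions) is
// running. Apps can only read this state, never enable it themselves.
final class SwitchControlObserver {
    private var token: NSObjectProtocol?

    init() {
        print("Switch Control running: \(UIAccessibility.isSwitchControlRunning)")
        token = NotificationCenter.default.addObserver(
            forName: UIAccessibility.switchControlStatusDidChangeNotification,
            object: nil,
            queue: .main
        ) { _ in
            print("Switch Control running: \(UIAccessibility.isSwitchControlRunning)")
        }
    }

    deinit {
        if let token { NotificationCenter.default.removeObserver(token) }
    }
}
```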
Vocal Shortcuts: Allows iPhone and iPad users to assign custom utterances that Siri can understand to launch shortcuts and complete complex tasks.
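On the developer side, a Vocal Shortcut can point at any shortcut a user has, including ones apps expose through Apple’s App Intents framework. A minimal sketch of such an intent (the intent name, phrase, and dialog below are hypothetical):

```swift
import AppIntents

// A hypothetical intent an app might expose; once it ships, a user
// can bind any custom utterance to it via a Vocal Shortcut.
struct StartWorkoutLogIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Workout Log"
    static var description = IntentDescription("Opens a new workout log entry.")

    @MainActor
    func perform() async throws -> some IntentResult & ProvidesDialog {
        // App-specific work would go here.
        return .result(dialog: "Workout log started.")
    }
}

// Registering the intent as an App Shortcut also gives Siri a
// default spoken phrase, no user setup required.
struct WorkoutShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: StartWorkoutLogIntent(),
            phrases: ["Start a workout log in \(.applicationName)"],
            shortTitle: "Start Workout Log",
            systemImageName: "figure.run"
        )
    }
}
```

Once an app ships an intent like this, it appears in the Shortcuts app, and a user can attach a custom utterance to it from the Vocal Shortcuts settings.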
Reachability: When using iPhone with one hand in portrait mode, you can select the Reachability feature to lower the top half of the screen so it’s within closer reach of your thumb.
These are just a few of Apple’s mobility-focused accessibility features in iOS 18. More options are covered in Apple’s accessibility documentation.