Apple Just Announced a Bunch of New Accessibility Features for iOS 18 | Lifehacker


All eyes are on Apple’s WWDC event next month. That’s when the company will no doubt show off its latest operating system upgrades, including its late-to-the-party AI plans for iOS and macOS.

But AI doesn’t just mean generative AI. Apple also has more traditional features in the works for users to check out, some of them simply powered by AI. And as it happens, we just got a glimpse at a slew of upcoming iPhone and Mac features that fit that mold, specifically related to accessibility.

Apple unexpectedly announced a batch of these new accessibility features Wednesday. The company says these features are coming “later this year,” which almost assuredly means they’re shipping with iOS 18. Apple switches between using the language “AI” and “machine learning” to describe how these features work, but rest assured, the underlying tech is part of Apple’s AI push this year.

Eye Tracking lets you control your iPhone with your eyes alone

Headlining the batch, Apple announced that both iPhone and iPad users will soon have the ability to control their devices with just their eyes. Apple says the front camera on your phone or tablet will use AI to calibrate and set up the feature, as well as power it. Most impressive of all, you don’t need any additional hardware to use it.

Once Eye Tracking is set up, you can navigate apps, use Dwell Controls to engage with elements, and replicate physical buttons, swipes, and gestures with your eye movements.

Music Haptics lets you feel the beat through your iPhone

Apple added a new music feature for users who are deaf or hard of hearing: Music Haptics uses the Taptic Engine to play taps and complex vibrations along to the beat of the song. While it sounds like a great accessibility feature, it also seems like a great way to enhance the Apple Music experience for everyone. This feature works on “millions” of songs in Apple Music, but Apple has also included it as an API for developers to add to their apps.

Vocal Shortcuts and Listen for Atypical Speech

Vocal Shortcuts is a new feature that lets you assign actions to words or phrases. For example, you could set the word “Rings” to open your Apple Watch Activity Rings in Fitness. In addition, Listen for Atypical Speech uses on-device AI to learn your speech patterns, so your device will recognize the way you speak.

These features are designed for users with conditions that affect speech, such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or the aftereffects of a stroke.

Vehicle Motion Cues tries to prevent motion sickness

Apple wants to tackle motion sickness. Vehicle Motion Cues places dots on your screen when your iPhone or iPad recognizes you’re in a moving car. These dots then move with the motion of the vehicle, which may counter the effects of motion sickness: Apple says research shows it happens when there’s a conflict between what you’re seeing and what you’re feeling.

You can choose to have these Motion Cues appear automatically, or enable them manually from Control Center.

CarPlay gets some new accessibility features

Speaking of cars, CarPlay is getting a series of new accessibility features: Voice Control, which lets you control CarPlay with your voice; Color Filters, which lets you fine-tune the color space of your CarPlay UI; and Sound Recognition, which alerts you when CarPlay detects sounds like car horns and sirens.

visionOS accessibility features

Remember Apple Vision Pro? That’s still around, even if it isn’t getting much attention lately. Still, Apple is working on some visionOS accessibility features, including Live Captions. These captions will work in conversations in-person and on FaceTime, as well as from audio in your apps. Apple is also adding new vision features like Reduce Transparency, Smart Invert, and Dim Flashing Lights, as well as support for Made for iPhone hearing devices and cochlear hearing processors.

New VoiceOver features

VoiceOver is getting new voices. Apple didn’t say how many, or what they sounded like, but they’re coming. In addition, the feature is getting a “flexible Voice Rotor,” which lets you control how VoiceOver works, custom volume control, customizable VoiceOver keyboard shortcuts on macOS, and support for custom vocabularies and complicated words.

Magnifier

Apple’s Magnifier doesn’t get the love it deserves, but it is getting some new features. Coming soon, you’ll get a new Reader Mode, as well as a quick way to launch Detection Mode with the Action button on iPhone 15 Pro.

Braille

There are some new Braille features as well: a new way to start and stay in Braille Screen Input, Japanese language support, multi-line braille for Dot Pad users, and the ability to choose input and output tables.

Hover Typing

Hover Typing is a new feature that increases the size of the text whenever you’re typing in a text field. Plus, you get to control the font and color.

Personal Voice is now available in Mandarin

Apple rolled out Personal Voice last year, an AI-powered feature that can replicate your voice during Live Speech. The feature is now available in Mandarin Chinese. In addition, you can now create a Personal Voice even if you have difficulty reading full sentences aloud.

Speaking of Live Speech, the feature now comes with categories, and is compatible with Live Captions.

Virtual Trackpad

Apple is adding a virtual trackpad feature as part of AssistiveTouch, so one area of your iPhone or iPad can be used to move a cursor around the screen. I could see this being useful for anyone who wants a trackpad experience, especially on the larger iPads, but doesn’t have a physical trackpad to use.

Switch Control

Switch Control lets you control your iPhone or iPad with hardware switches. Later this year, you’ll also be able to use your device’s camera to recognize finger-tap gestures as switches, meaning you can control on-screen elements by gesturing with your fingers in view of the camera.