Built‑in features designed so you can make something wonderful.
Go big or go bold.
Magnifier + Door Detection
Detect every detail.
Magnifier works like a digital magnifying glass, using the camera on your iPhone or iPad to increase the size of anything you point it at — from a prescription bottle to a candlelit menu. Detection Mode in Magnifier combines the camera, the LiDAR Scanner, and on‑device machine learning to offer People Detection, Door Detection, and Image Descriptions.1 Use People Detection to determine how close a person is to you. Or get rich descriptions of your surroundings with Door Detection, which can help you navigate the last few feet to your destination by identifying doors, text, and symbols such as a restroom symbol, a no smoking symbol, or an accessible‑entrance symbol.2 And Image Descriptions lets you hear more about what’s in your camera’s field of view.
A collection of settings supported across Apple platforms helps you customize your onscreen display to your personal preferences. Make text easier to read with Bold Text or Larger Text. You can also invert colors, increase contrast, reduce transparency, or apply color filters to adapt your screen in ways that best support your vision. In iOS and iPadOS, these settings can be applied on an app‑by‑app basis. And in macOS, you can even customize the fill and outline color of your mouse pointer to make it easier to spot onscreen.
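For developers, here is a minimal sketch, not a prescribed approach, of how an app can read a few of these settings through UIKit’s UIAccessibility API and respect them onscreen; the label styling choices below are illustrative.

```swift
import UIKit

func applyDisplayPreferences(to label: UILabel) {
    // Larger Text: use a Dynamic Type text style so the font tracks the user's size setting.
    label.font = UIFont.preferredFont(forTextStyle: .body)
    label.adjustsFontForContentSizeCategory = true

    if UIAccessibility.isBoldTextEnabled {
        // Bold Text is on: prefer a heavier weight.
        label.font = UIFont.systemFont(ofSize: label.font.pointSize, weight: .bold)
    }
    if UIAccessibility.isReduceTransparencyEnabled {
        // Reduce Transparency is on: replace blur or translucency with a solid background.
        label.backgroundColor = .systemBackground
    }
    if UIAccessibility.isDarkerSystemColorsEnabled {
        // Increase Contrast is on: choose higher-contrast colors.
        label.textColor = .label
    }
}
```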
Enlarge an area of your screen on the fly. And in iOS, iPadOS, and macOS, you can get a picture-in-picture view, allowing you to see the zoomed area in a separate window while keeping the rest of the screen at its original size.
Get an alert for incoming Phone and FaceTime calls, new texts, email messages, and calendar events through vibration on iPhone or a quick LED light flash on iPhone and iPad. And your Mac can flash the screen when an app needs your attention.
Sound Recognition uses on‑device intelligence to notify you when it detects one of 15 different types of sounds, including alarms, appliances, door knocks, car horns, or even a crying baby. For electronic sounds, you can train your iPhone or iPad to listen for and notify you of sounds unique to your environment, such as your doorbell. When your device detects one of these sounds or alerts, you’ll receive a visible and vibrating notification.3
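Sound Recognition itself runs at the system level, but a related developer framework, SoundAnalysis, performs similar on‑device sound classification with Apple’s built‑in classifier. A minimal sketch, with a hypothetical recording file:

```swift
import Foundation
import SoundAnalysis

// Prints the top classification for each window of audio in a file.
final class SoundPrinter: NSObject, SNResultsObserving {
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first else { return }
        print("Heard \(top.identifier) with confidence \(top.confidence)")
    }
}

let fileURL = URL(fileURLWithPath: "/tmp/doorbell-recording.wav") // hypothetical recording
let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
let analyzer = try SNAudioFileAnalyzer(url: fileURL)
let observer = SoundPrinter()
try analyzer.add(request, withObserver: observer)
analyzer.analyze() // results arrive via the observer as the file is processed
```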
Captions can benefit everyone — from people who are deaf or hard of hearing, to those wanting to follow along in loud environments, to those who want to enhance their understanding and recollection of a conversation. The new Live Captions feature offers users real-time transcriptions of speech, audio, and video content.4 Turn on Live Captions during Phone or FaceTime calls or with any media content in your apps or browser. Or use them to stay connected during in-person conversations. Live Captions are customizable by font, size, and background color.5 And they are generated on your device, so your conversations remain private and secure. When using Live Captions on Mac, you have the added option to use Type to Speak to type out your responses and have them read out loud in real time for others in the conversation.6
AssistiveTouch for Apple Watch lets people with upper-body limb differences use their Apple Watch without ever having to touch the display or controls. Using built-in motion sensors and on-device machine learning, Apple Watch detects subtle differences in muscle movements and tendon activity, letting you control the display through hand gestures like a pinch or a clench. Answer incoming calls, control an onscreen motion pointer, and access Notification Center, Control Center, and more.7 You can also use AssistiveTouch for Apple Watch to run shortcuts with Siri to complete tasks or modify VoiceOver and Switch Control settings on your iPhone.
AssistiveTouch for iOS and iPadOS helps you adapt standard gestures — like pinch, rotate, or swipe — to make them more comfortable for you. You can make other actions, like changing volume, accessible from the AssistiveTouch menu.
A double or triple tap on the back of your iPhone can be set to trigger all kinds of actions, like opening Control Center, taking a screenshot, or cueing a favorite app. Back Tap can even be used to turn on a wide range of accessibility features and run shortcuts with Siri, making it a great way to replace standard Home Screen gestures when they become tricky.8
Siri is a faster, easier way to do all kinds of useful things, including making calls, sending messages, and more. You can choose from many different Siri voices that sound incredibly natural when reading the news or answering questions.9 Use Type to Siri to ask questions and issue commands without speaking. Easily turn accessibility features on and off, and have notifications announced to you through your AirPods, Beats, or hearing devices. The Shortcuts app lets you quickly perform everyday tasks in your most commonly used apps. Just say “Hey Siri,” then the name of the shortcut to run it. There are several shortcut options within the Gallery tab in the Shortcuts app, including the Accessibility Assistant shortcut, which creates a custom list of recommended accessibility features based on your individual needs.
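Apps can make their own tasks available to Siri and the Shortcuts app through the App Intents framework. A minimal sketch, where the intent name and what it does are hypothetical:

```swift
import AppIntents

struct StartReadingSessionIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Reading Session"

    func perform() async throws -> some IntentResult {
        // The app's own logic goes here, for example opening its reader view.
        return .result()
    }
}
```

Once an app ships an intent like this, the task can be run by name with Siri or combined with other steps in the Shortcuts app.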
Reduce visual clutter by stripping away ads, buttons, and navigation bars so you can focus on just the content you want. And make onscreen text easier to read by customizing the font, font size, and background color. You can choose to have Safari Reader turn on automatically on websites where it’s available.
If you learn or comprehend better when you can hear what you’re reading or writing, features like Speak Screen, Speak Selection, and Typing Feedback can help by adding an auditory component to text.10 As text is read aloud, Highlight Content can accentuate words, sentences, or both in your preferred style and color.
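Apps can add a similar auditory component to their own text with the system speech synthesizer. A minimal sketch, where the sample sentence is hypothetical:

```swift
import AVFoundation

let synthesizer = AVSpeechSynthesizer()
let utterance = AVSpeechUtterance(string: "Here is the paragraph you selected, read aloud.")
utterance.voice = AVSpeechSynthesisVoice(language: "en-US") // pick a voice for the text's language
utterance.rate = AVSpeechUtteranceDefaultSpeechRate
synthesizer.speak(utterance)
```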
Background Sounds can minimize everyday sounds that might be distracting, discomforting, or overwhelming. Balanced, bright, or dark noise, and ocean, rain, or stream sounds, play continuously in the background to help you focus, relax, or rest. Background Sounds can also mix into or duck under other audio and system sounds as you use your device.
VoiceOver is an industry-leading screen reader that describes exactly what’s happening on your device. Navigate your iPhone, iPad, Mac, Apple Watch, Apple TV, or HomePod while hearing audible descriptions of onscreen content, or get braille output when using a compatible braille device.10 You can explore details about the people, text, table data, and other objects within images.11 Hear receipts or labels read like a table — by row and column, complete with headers. VoiceOver can also describe a person’s position in relation to objects within images, so you can relive memories in greater detail.
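VoiceOver reads whatever descriptions an app provides, so developers label their interface elements. A minimal SwiftUI sketch, where the image name and wording are hypothetical:

```swift
import SwiftUI

struct ReceiptPhotoView: View {
    var body: some View {
        Image("receipt-photo")
            // VoiceOver speaks this description when the image is focused.
            .accessibilityLabel("Photo of a grocery receipt on a wooden table")
    }
}
```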
Watch movies with detailed audio descriptions of everything happening in the scene — from a character’s expression to the mood of the shot. Audio Descriptions are available for all Apple TV+ original content.
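In a developer’s media app, a described audio track can be chosen through AVFoundation’s media selection APIs. A minimal sketch, assuming a hypothetical stream that includes an audio description track:

```swift
import AVFoundation

let asset = AVURLAsset(url: URL(string: "https://example.com/movie.m3u8")!) // hypothetical URL
let playerItem = AVPlayerItem(asset: asset)

asset.loadMediaSelectionGroup(for: .audible) { group, error in
    guard let group = group else { return }
    // Find audio options flagged as describing the video for accessibility.
    let described = AVMediaSelectionGroup.mediaSelectionOptions(
        from: group.options,
        withMediaCharacteristics: [.describesVideoForAccessibility]
    )
    if let option = described.first {
        playerItem.select(option, in: group) // switch playback to the described audio track
    }
}
```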
Choose a specific range of text that you want to hear, and have your iPhone, iPad, or Mac read it to you. Speak Selection is available in more than 60 languages and locales.10
Customize your audio experience to your individual hearing needs. With your iPhone or iPad, you can amplify soft sounds or adjust certain frequencies to make media and phone calls sound more crisp and clear through your headphones. Or quickly adjust audio settings with your latest hearing test results imported from a paper or PDF audiogram. Conversation Boost for AirPods Pro helps you stay more connected while talking to people in crowded or noisy environments. Computational audio and beamforming microphones focus AirPods Pro on the voice of the person directly in front of you, helping you distinguish their speech from background noise and follow along better during face-to-face conversations in loud locations.
Apple has worked with top manufacturers to create hearing aids, cochlear implants, and sound processors designed specifically for iPhone and iPad. Apply your audiologist’s presets without having to rely on additional remotes, or adjust your own levels as you move from quiet environments to louder ones. Support for bidirectional hearing aids allows those who are deaf or hard of hearing to have hands-free Phone and FaceTime conversations. And you can keep better track of all your notifications by having Siri announce them through your Made for iPhone hearing device.
Live Listen can help you hear conversations in loud settings, or amplify someone speaking from across the room, by sending audio from iPhone or iPad to supported wireless headphones or Made for iPhone hearing devices, such as hearing aids and cochlear implants.12
With Voice Control, simple vocal commands let you quickly open and interact with apps in iOS, iPadOS, and macOS. You can also navigate using numbered labels shown alongside clickable items, or by superimposing a grid to precisely select, zoom, and drag. With Voice Control spelling mode, you can dictate names, addresses, and custom spellings letter by letter, choosing between individual letters or the phonetic alphabet.13 Voice Control is available in many languages, including Chinese, French, and Japanese.14
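Developers can give their controls extra spoken names that Voice Control listens for, via the accessibilityUserInputLabels property. A minimal sketch with hypothetical names:

```swift
import UIKit

let playButton = UIButton(type: .system)
playButton.setTitle("▶", for: .normal)
playButton.accessibilityLabel = "Play"
playButton.accessibilityUserInputLabels = ["Play", "Start", "Go"] // names Voice Control accepts
```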
Navigate your iPad with only your eyes. iPadOS supports third‑party eye‑tracking devices.15 Compatible devices track where you’re looking onscreen, and the pointer moves to follow your gaze. Extended eye contact performs onscreen actions, like a tap to select.
Example Voice Control commands: “Move up two lines.” “Select sentence.” “Capitalize that.”
Use a variety of adaptive devices, along with item, point, and manual scanning, to navigate sequentially through onscreen elements and perform specific actions. Switch Control works with accessories like a switch, a joystick, a keyboard space bar, or a trackpad. It also works with Sound Actions for Switch Control, which lets you use simple mouth sounds like a click, pop, or “ee” sound. And with Apple Watch Mirroring, you can use Switch Control and other assistive features to fully control your Apple Watch from your iPhone.16