Apple has announced that users will soon be able to control their iPhone and iPad using their eyes. Here's how it will work.
Among the new accessibility features Apple announced is Eye Tracking, which lets users control their iPhone and iPad with only their eyes. The feature is designed for people with physical disabilities and can be set up quickly, using the device's front-facing camera for calibration.
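Apple hasn't detailed how the calibration pipeline works internally, but its public ARKit face-tracking API already exposes gaze estimates from the front-facing camera. The sketch below, which is purely illustrative and not Apple's implementation, shows how an app could read a gaze vector from ARKit:

```swift
import ARKit

// A hypothetical sketch of reading gaze data with ARKit's public
// face-tracking API on the front-facing camera. Apple has not said
// Eye Tracking uses ARKit; this only illustrates that the front
// camera can supply gaze input without extra hardware.
final class GazeReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a device with a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // lookAtPoint is the estimated gaze target in face-anchor space.
            let gaze = face.lookAtPoint
            print("Gaze (face space): \(gaze.x), \(gaze.y), \(gaze.z)")
        }
    }
}
```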
Eye Tracking will be available on both iOS and iPadOS, and it requires no additional hardware or accessories. With it, users can navigate the elements of an app and use Dwell Control to activate each one, accessing functions such as button presses, swipes, and other gestures with their eyes alone.
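Dwell-based activation generally works by treating a steady gaze as a "tap": if the gaze point stays within a small region for long enough, the element under it is triggered. Below is a minimal, hypothetical sketch of that idea; the class name, radius, and timing threshold are all illustrative, not Apple's values:

```swift
import Foundation
import CoreGraphics

/// A minimal sketch of dwell-based activation: if the gaze point stays
/// within a small radius for a set duration, treat it as a "tap".
/// This is not Apple's implementation; all names and thresholds are
/// illustrative assumptions.
final class DwellDetector {
    private let radius: CGFloat              // max gaze drift allowed, in points
    private let dwellDuration: TimeInterval  // how long the gaze must hold
    private var anchor: CGPoint?             // where the current dwell began
    private var anchorTime: TimeInterval?
    var onActivate: ((CGPoint) -> Void)?

    init(radius: CGFloat = 30, dwellDuration: TimeInterval = 1.0) {
        self.radius = radius
        self.dwellDuration = dwellDuration
    }

    /// Feed one gaze sample (e.g., from an eye-tracking pipeline).
    func process(gaze: CGPoint, at time: TimeInterval) {
        guard let a = anchor, let t0 = anchorTime,
              hypot(gaze.x - a.x, gaze.y - a.y) <= radius else {
            // Gaze moved too far (or this is the first sample):
            // start a new dwell window at the current point.
            anchor = gaze
            anchorTime = time
            return
        }
        if time - t0 >= dwellDuration {
            onActivate?(a)   // dwell complete: activate the element
            anchor = nil     // reset; a full new dwell is needed to fire again
            anchorTime = nil
        }
    }
}

// Usage: wire the callback to whatever performs the actual "tap".
let detector = DwellDetector()
detector.onActivate = { point in print("Activate element at \(point)") }
```

Resetting the anchor after each activation means a sustained stare won't fire repeatedly; a fresh dwell has to complete before the next trigger.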
In addition to Eye Tracking, Apple introduced Listen for Atypical Speech, which uses on-device machine learning to help Siri understand a wider range of speech patterns.
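Apple hasn't published how Listen for Atypical Speech works internally, but the general idea of on-device speech processing is already visible in the public Speech framework, which can be forced to keep recognition entirely on the device. The following sketch shows that existing API, not the new feature itself:

```swift
import Speech

// A sketch of on-device recognition with Apple's public Speech framework.
// This illustrates on-device speech processing in general; it is NOT the
// Listen for Atypical Speech implementation. Assumes the user has already
// granted permission via SFSpeechRecognizer.requestAuthorization.
func transcribeOnDevice(fileURL: URL) {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.supportsOnDeviceRecognition else {
        print("On-device recognition unavailable on this device/locale")
        return
    }
    let request = SFSpeechURLRecognitionRequest(url: fileURL)
    request.requiresOnDeviceRecognition = true   // keep audio off the network
    recognizer.recognitionTask(with: request) { result, error in
        if let result = result, result.isFinal {
            print(result.bestTranscription.formattedString)
        } else if let error = error {
            print("Recognition error: \(error.localizedDescription)")
        }
    }
}
```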
According to Apple, these features draw on the combination of its hardware and software, including Apple silicon, artificial intelligence, and machine learning, continuing the company's commitment to designing products for everyone. They are expected to arrive later this year with the iOS 18 and iPadOS 18 fall updates.