Eye Tracking Coming to iPhones
May 16, 2024, 10:42 AM by Rich Brome @richbrome
Apple has announced several new accessibility features coming to iOS and iPadOS later this year. Eye Tracking will let users control their iPhone or iPad using only eye movements, relying on the front-facing camera and on-device machine learning. With Dwell Control, users can activate on-screen elements and access "additional functions such as physical buttons, swipes, and other gestures solely with their eyes."

Another feature, Vocal Shortcuts, will let users "assign custom utterances that Siri can understand to launch shortcuts and complete complex tasks." Finally, Music Haptics lets users who are deaf or hard of hearing experience music as taps, textures, and refined vibrations generated by the iPhone's Taptic Engine. The feature will work with Apple Music, and third-party apps can support it via a new API.
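Apple hasn't published details of the new Music Haptics API in this announcement. For a rough sense of what driving the Taptic Engine from an app looks like today, here is a minimal sketch using the existing Core Haptics framework to play a short tap followed by a softer "texture"; the function name and parameter values are illustrative placeholders, not part of the announced API.

```swift
import CoreHaptics

// Illustrative sketch only: plays a short "tap + texture" pattern on the
// Taptic Engine via Core Haptics. The announced Music Haptics API is not
// detailed here; this just shows the kind of haptic output described.
func playSampleHaptics() throws {
    // Core Haptics requires supporting hardware (not available in the simulator).
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // A sharp transient "tap" at t = 0.
    let tap = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
        ],
        relativeTime: 0
    )

    // A softer continuous "texture" for half a second after the tap.
    let texture = CHHapticEvent(
        eventType: .hapticContinuous,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.4),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.2)
        ],
        relativeTime: 0.1,
        duration: 0.5
    )

    let pattern = try CHHapticPattern(events: [tap, texture], parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```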