On Tuesday, Apple released iOS 13 beta 3, which came with an interesting feature called FaceTime Attention Correction. The feature uses augmented reality to fix a long-standing problem with FaceTime calls: the apparent lack of eye contact between participants.
Mike Rundle, an app designer, was the first to spot the feature while testing the latest iOS 13 beta.
Haven’t tested this yet, but if Apple uses some dark magic to move my gaze to seem like I’m staring at the camera and not at the screen I will be flabbergasted. (New in beta 3!) pic.twitter.com/jzavLl1zts
— Mike Rundle (@flyosity) July 2, 2019
Back in 2017, he predicted that this feature would become a reality in “years to come.”
This is a feature I predicted would be coming in “years to come” back in 2017. Pretty astounded it’s already here. https://t.co/UT4aQgsIMN pic.twitter.com/jNTUbN1sOa
— Mike Rundle (@flyosity) July 2, 2019
While FaceTiming, users naturally tend to look at the person they are talking to on screen instead of looking at the camera. As a result, the person on the other end sees you looking slightly away, as if you are not maintaining eye contact. This feature, when enabled, adjusts your gaze so that it appears to be directed at the camera. This lets you maintain apparent eye contact while still keeping your gaze on the person you are talking to.
Many Twitter users speculated that the FaceTime Attention Correction feature is powered by Apple’s ARKit framework: it builds a 3D face map and depth map of the user through the front-facing TrueDepth camera, determines where the eyes are, and adjusts them accordingly. The TrueDepth camera system is the same one used for Animoji, Face ID unlocking, and the augmented reality effects already available in FaceTime.
How iOS 13 FaceTime Attention Correction works: it simply uses ARKit to grab a depth map/position of your face, and adjusts the eyes accordingly.
Notice the warping of the line across both the eyes and nose. pic.twitter.com/U7PMa4oNGN
— Dave Schukin 🤘 (@schukin) July 3, 2019
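Apple has not documented how the feature is implemented, but the speculation above can be illustrated with ARKit’s public face-tracking API, which already exposes per-eye transforms and an estimated gaze point for a tracked face. The sketch below (class name `GazeReader` is hypothetical) shows how an app could read that gaze data; the actual eye-warping step Apple applies is not part of any public API.

```swift
import ARKit

// A minimal sketch of reading gaze data via ARKit face tracking.
// Requires a device with a TrueDepth camera (iPhone X or later).
final class GazeReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking is only supported on TrueDepth-equipped devices.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // Per-eye transforms and an estimated gaze point,
            // expressed in the face anchor's coordinate space.
            let leftEye = face.leftEyeTransform
            let rightEye = face.rightEyeTransform
            let gazePoint = face.lookAtPoint

            // An attention-correction pass could, in principle, warp the
            // eye regions of the outgoing video frame so the gaze vector
            // points at the camera rather than at the screen.
            _ = (leftEye, rightEye, gazePoint)
        }
    }
}
```

The line-warping visible in Schukin’s video (across both the eyes and the bridge of the nose) is consistent with this kind of localized image warp guided by depth data, though again, Apple has not confirmed the approach.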
To enable this feature, go to Settings > FaceTime after installing the latest iOS 13 developer beta 3. On Twitter, people also speculated that it is only available on iPhone XS, iPhone XS Max, and iPhone XR devices for now. It is unclear whether Apple plans to roll out the feature more broadly in the future. It would also be interesting to see whether the feature works when there are multiple people in the frame.
Side note: according to comments in Reddit, this appears to only be available for X🅂 and maybe X🅁 phones – not the iPhone X.
— Will Sigmon (@WSig) July 2, 2019
Users have mixed feelings about the feature. While some developers who tested it found it a little creepy, others saw it as a remarkable solution to the eye contact problem.
One Hacker News user expressed concern: “I can’t help but think all this image recognition/manipulation tech being silently applied is a tad creepy. IMHO going beyond things like automatic focus/white balance or colour adjustments, and identifying more specific things to modify, crosses the line from useful to creepy.”
Another Hacker News user spoke in support of the feature: “I fail to see how this is creepy (outside of potential uncanny valley issues in edge cases). There is a toggle to disable it, and this is something that most average non-savvy users would either want by default or wouldn’t even notice happening (because the end result will look natural to most).”