We all expected iOS 13 to pack plenty of useful features besides the much-praised Dark Mode, but we didn’t expect Apple to quietly slip an augmented-reality-powered eye-contact correction into FaceTime video calls. Yet that’s exactly what the company did in the latest iOS 13 public beta update.
According to app designer Mike Rundle, the iOS 13 public beta now includes an option called “FaceTime Attention Correction.” Turning it on improves the apparent eye contact during FaceTime video calls: the feature subtly adjusts the apparent direction of your gaze, making it look like you’re staring at the camera when you’re actually staring at the screen.
This fixes a minor but irritating issue that affects virtually all video-calling apps, FaceTime included. When you look at your screen to see the person you’re talking to, you aren’t looking at the camera, so to them you appear to be looking away. The result is a subtle disconnect: neither caller feels they have the other’s direct attention, because neither ever seems to be looking straight at the other.
But Apple’s new feature changes that. By making subtle alterations to your video stream, it makes you appear to be looking directly at the person on the other end of the call, and early testers reported that the effect is very convincing. Under the hood it combines ARKit, Apple’s augmented-reality framework, with the TrueDepth camera built into recent iPhones. The TrueDepth camera — the same hardware that powers Face ID — captures a depth map of your face, and ARKit uses that data to slightly warp the regions around your eyes and nose. Crucially, the processing power of the most recent iPhones lets all of this happen in real time, so the adjustment looks smooth rather than glitchy.
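Apple hasn’t published how Attention Correction itself is implemented, but the building blocks it rests on are public ARKit APIs. As a rough, hypothetical sketch (the class name here is made up; the ARKit types and properties are real), an app can read the TrueDepth camera’s face-tracking data — including per-eye transforms and an estimated gaze point — like this:

```swift
import ARKit
import UIKit

// Hypothetical sketch: reading TrueDepth face-tracking data with ARKit.
// This only shows how an app accesses the gaze data such an effect could
// build on; it does not reproduce Apple's actual correction algorithm.
class FaceTrackingViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Face tracking requires a TrueDepth camera (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // Estimated point the eyes converge on, in face coordinate space.
            let gaze = faceAnchor.lookAtPoint
            // 4x4 transforms describing each eye's position and orientation.
            let leftEye = faceAnchor.leftEyeTransform
            let rightEye = faceAnchor.rightEyeTransform
            // A gaze-correction effect would use data like this to decide
            // how to re-render the eye and nose regions of the video frame.
            _ = (gaze, leftEye, rightEye)
        }
    }
}
```

This code only runs on a physical device with a TrueDepth camera; the `isSupported` check bails out gracefully everywhere else.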
Lastly, there’s a catch: the feature is currently limited to the newest iPhones, meaning only iPhone XS and XS Max owners can try it. That excludes the iPhone X, which, oddly, is equipped with the same TrueDepth hardware — though Apple may extend support to it in the full release of iOS 13, or shortly after. It’s also still unknown whether the feature will come to macOS and iPadOS, but most observers expect it will. If you do end up testing it, look closely and you may notice slight warping along lines that pass across the eyes and nose.