One of the most unavoidable yet irritating things about video conferencing is that you never feel as though you are talking directly to the other person. We look at the screen, where the camera is not located, so our gaze appears to be directed somewhere other than at the person we are speaking with. But this is hardly our fault: there is no point in staring at the camera lens, because you cannot communicate with a lens. Apple has addressed these issues in iOS 13, which aims to make eye contact on FaceTime easier and more realistic.
iOS 13, presently in its developer beta stage, incorporates a new feature called FaceTime Attention Correction. It blends several technologies to adjust the image so that you appear to be making eye contact, making a FaceTime call feel more like talking to someone in person than like a video call. The feature was first spotted in the third beta version.
FaceTime Attention Correction is currently available only on the iPhone XS and iPhone XS Max. These latest handsets have state-of-the-art cameras, and a plausible reason for the restriction is the new image signal processor that Apple included in the A12 chip they carry. The feature may roll out to other iPhones as well, but that is not yet confirmed. Either way, video calling is witnessing a significant change at the hands of Apple.