Rangefinding is one application of stereoscopic systems. This involves using two cameras, much like we use two eyes, to judge distance. I’ve designed stereoscopic imagers, simulated their rangefinding capabilities, and estimated the uncertainty in measuring image point location and distance.
Range is derived from the parallax (disparity) between matched points as seen by the two cameras. Because parallax grows rapidly as the distance to an object shrinks, the relative accuracy of a range measurement generally improves as the cameras approach the object being ranged.
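For an idealized, rectified stereo pair, this relationship can be sketched with the standard pinhole-camera formula Z = f·B/d, where B is the baseline between the cameras, f the focal length in pixels, and d the disparity in pixels. The numbers below (baseline, focal length, matching error) are hypothetical, chosen only to illustrate how relative range accuracy improves at close distances:

```python
def stereo_range(baseline_m, focal_px, disparity_px):
    """Range Z = f * B / d for an idealized rectified stereo pair."""
    return focal_px * baseline_m / disparity_px

def range_uncertainty(baseline_m, focal_px, disparity_px, disparity_err_px):
    """First-order propagation of disparity error into range:
    |dZ/dd| = f * B / d**2, so dZ = Z * dd / d."""
    z = stereo_range(baseline_m, focal_px, disparity_px)
    return z * disparity_err_px / disparity_px

B = 0.10    # baseline between cameras, meters (assumed)
f = 1000.0  # focal length in pixels (assumed)

for d in (100.0, 10.0, 1.0):  # larger disparity = closer object
    z = stereo_range(B, f, d)
    dz = range_uncertainty(B, f, d, 0.5)  # assume 0.5 px matching error
    print(f"disparity {d:6.1f} px -> range {z:7.2f} m "
          f"+/- {dz:.3f} m ({100 * dz / z:.1f}%)")
```

Note that the relative error dZ/Z equals dd/d, so the closest object (largest disparity) is ranged with the smallest relative error, consistent with the behavior described above.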
Other methods for measuring depth include analyzing the deformation of a known illumination pattern projected onto a scene (structured light) and measuring the time it takes light to travel to objects in the scene, reflect, and return to a detector (LiDAR and time-of-flight sensing).