This thesis discusses the physical and information-theoretical limits of optical 3D metrology and, based on these fundamental considerations, introduces a novel single-shot 3D video camera that works close to these limits. There are serious obstacles to a perfect 3D camera: The author explains that it is impossible to achieve a data density higher than one third of the available video pixels. Available single-shot 3D cameras, however, display a much lower data density,
because there is one more obstacle: The object surface must be encoded in an unambiguous way, commonly by projecting sophisticated patterns. However, this encoding consumes space-bandwidth and reduces the output data density. The dissertation explains how this profound dilemma of 3D metrology can be solved by exploiting just two synchronized video cameras and a static projection pattern. The introduced single-shot 3D video camera, designed for macroscopic live scenes,
delivers an unprecedented quality and density of the 3D point cloud. Its lateral resolution and depth precision are limited only by physics. Like a hologram, each movie frame encompasses the full 3D information about the object surface, and the observation perspective can be varied while watching the 3D movie.
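To illustrate the general principle of depth recovery from two synchronized, calibrated cameras, the following minimal Python sketch performs classical linear (DLT) triangulation of a single corresponding point pair. The projection matrices and pixel coordinates are placeholder values, and the sketch does not reproduce the dissertation's specific pattern-based correspondence method; it only shows the textbook triangulation step that follows once a correspondence is known.

```python
import numpy as np

def triangulate_point(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one 3D point from two calibrated views.

    P1, P2 : 3x4 camera projection matrices (intrinsics @ [R | t]).
    uv1, uv2 : corresponding pixel coordinates (u, v) in camera 1 and camera 2.
    Returns the 3D point in world coordinates.
    """
    u1, v1 = uv1
    u2, v2 = uv2
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.array([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    # The solution is the right singular vector with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenize

# Placeholder calibration: two identical pinhole cameras with a 10 cm baseline.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])                   # camera 1 at origin
P2 = K @ np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])   # camera 2 shifted

# A corresponding pixel pair (in practice identified via the projected pattern).
point = triangulate_point(P1, P2, (350.0, 240.0), (310.0, 240.0))
print(point)  # approx. [0.075, 0.0, 2.0] for these placeholder values
```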