GEOMETRY AND TEXTURE MEASURES FOR INTERACTIVE VIRTUALIZED REALITY INDOOR MODELER
Keywords: Virtualized reality indoor models, Texture distortion, Inpainting, Shape from Texture
Abstract. This paper presents an algorithm that detects distorted textures in virtualized reality indoor models and automatically generates the 3D planes needed to hold the undistorted textures. Our previously contributed virtualized reality (VR) interactive indoor modeler enables users to interactively create a desired indoor VR model from a single 2D image. The interactive modeler uses projective texture mapping to map textures onto the manually created 3D planes. If the user has not created the necessary 3D planes, textures belonging to other objects are projected onto the available planes, which produces distorted textures. In this paper, these distorted textures are detected automatically using suitable principles from shape-from-texture research. Texture distortion features such as the slant, tilt, and curvature parameters are calculated from the 2D image by means of affine transformations measured between neighboring texture patches within the single image. Estimating affine transforms from a single image in this way is useful when multiple-view images are unavailable. Using superpixels to cluster the textures corresponding to different objects reduces the modeling labor cost. A standby database also stores the basic textures that recur in the indoor model and provides texture choices for distorted floor, wall, and other regions. Finally, this paper documents the prototype implementation and experiments on automatic 3D plane creation and distortion detection using the above-mentioned principles in the virtualized reality indoor environment.
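As a rough illustration of the kind of computation the abstract describes, the sketch below clusters an image into superpixels, estimates an affine transform between two neighboring texture patches, and derives a simple slant/tilt cue from it. This is a minimal sketch under stated assumptions, not the paper's implementation: the library choices (scikit-image SLIC superpixels, OpenCV ORB matching with RANSAC affine fitting) and the singular-value-based slant/tilt formula are illustrative stand-ins for the shape-from-texture formulation used in the work.

```python
# Illustrative sketch only: single-image texture distortion cues from
# affine transforms between neighboring patches (assumed libraries:
# OpenCV, NumPy, scikit-image).
import cv2
import numpy as np
from skimage.segmentation import slic


def superpixel_labels(image_bgr, n_segments=200):
    """Cluster the image into superpixels so that textures belonging to
    different objects can be grouped before distortion analysis."""
    rgb = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB)
    return slic(rgb, n_segments=n_segments, compactness=10)


def affine_between_patches(patch_a, patch_b):
    """Estimate a 2x3 affine transform mapping patch_a onto patch_b from
    matched ORB keypoints; returns None if matching fails."""
    gray_a = cv2.cvtColor(patch_a, cv2.COLOR_BGR2GRAY) if patch_a.ndim == 3 else patch_a
    gray_b = cv2.cvtColor(patch_b, cv2.COLOR_BGR2GRAY) if patch_b.ndim == 3 else patch_b
    orb = cv2.ORB_create(500)
    kp_a, des_a = orb.detectAndCompute(gray_a, None)
    kp_b, des_b = orb.detectAndCompute(gray_b, None)
    if des_a is None or des_b is None:
        return None
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des_a, des_b)
    if len(matches) < 3:
        return None
    src = np.float32([kp_a[m.queryIdx].pt for m in matches])
    dst = np.float32([kp_b[m.trainIdx].pt for m in matches])
    affine, _ = cv2.estimateAffine2D(src, dst, method=cv2.RANSAC)
    return affine


def slant_tilt_from_affine(affine):
    """Rough slant/tilt cue from the 2x2 linear part of the affine: the
    singular-value ratio measures anisotropic foreshortening (a
    simplification of the shape-from-texture parameters)."""
    A = affine[:, :2]
    U, s, _ = np.linalg.svd(A)
    slant = np.arccos(np.clip(s.min() / s.max(), 0.0, 1.0))
    tilt = np.arctan2(U[1, 0], U[0, 0])  # direction of maximal compression
    return np.degrees(slant), np.degrees(tilt)
```

In a pipeline of this kind, patches whose estimated slant exceeds a threshold would be flagged as distorted, and their superpixel clusters would drive the automatic creation of new 3D planes; the threshold and plane-fitting step are beyond this sketch.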