CeBIT 2018 - Mixed Reality: Walking naturally between worlds

In the future, users of Virtual Reality (VR) will be able to interact with each other more easily, more naturally and in real time across real and virtual worlds. A scene from the real world can be transferred into Virtual Reality, and feedback can then be reflected back into the real situation. At CeBIT, booth E78 in hall 27, the Fraunhofer Heinrich Hertz Institute HHI will present a new X-reality technology that makes this possible.

Wherever collaboration across distances is necessary, mixed reality applications open up new possibilities, for example in remote assistance, where a technician has to repair something on site. In the future, a colleague in the office will be able to get a virtual 3D picture of the situation using VR glasses and even intervene virtually in the scene, showing the technician the correct components or hand movements.

"The solution we have developed can connect a simulated world with the real world in real time and in high quality, opening up new perspectives for collaboration," explains Paul Chojecki, project manager at Fraunhofer HHI. "Physical interaction without cumbersome controllers is more natural and more comfortable. The solution adapts flexibly to the height of the user and increases immersion. At the same time, it can reduce the symptoms of motion sickness that VR scenarios often cause."

High-resolution 3D object and body detection for mixed reality interactions

The process rests essentially on two technologies. In the real world, eight cameras, arranged in four stereo pairs, capture the scene from all sides and generate depth maps at up to 30 Hz; gestures and dynamic movements are also detected. Algorithms then combine these data, encode them and transmit them to the VR station in real time, together with the corresponding 3D textures.
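As a rough sketch of this capture pipeline, the snippet below fuses depth maps from four stereo pairs and packs a frame for transmission. The resolution, the nearest-depth fusion rule and the millimetre/uint16 wire format are illustrative assumptions, not HHI's actual algorithms.

```python
import numpy as np

CAPTURE_HZ = 30            # depth maps generated at up to 30 Hz (per the text)
NUM_PAIRS = 4              # eight cameras arranged as four stereo pairs
H, W = 240, 320            # illustrative resolution (not stated in the source)

def capture_depth_maps(rng):
    """Stand-in for the four stereo pairs: one depth map per pair, in metres."""
    return [rng.uniform(0.5, 5.0, size=(H, W)).astype(np.float32)
            for _ in range(NUM_PAIRS)]

def fuse_depth_maps(depth_maps):
    """Naive fusion: keep the nearest depth per pixel across all pairs.
    (The real algorithms are far more sophisticated; this only shows the
    shape of the problem.)"""
    stack = np.stack(depth_maps)          # (NUM_PAIRS, H, W)
    return stack.min(axis=0)

def encode_frame(depth):
    """Quantise to millimetres as uint16 - a common, compact wire format
    for depth data."""
    return (depth * 1000.0).round().astype(np.uint16).tobytes()

rng = np.random.default_rng(0)
frame = fuse_depth_maps(capture_depth_maps(rng))
payload = encode_frame(frame)
print(len(payload))  # H * W * 2 bytes per frame, sent up to CAPTURE_HZ times/s
```

In a real system each encoded frame would be sent alongside the matching 3D textures; here only the depth channel is shown.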

On the virtual side, a further 3D camera records the VR user. Thanks to Fraunhofer HHI's algorithms for 3D body detection and gesture interpretation, the user can interact naturally in the VR scene without cumbersome controllers or markers. In the scene, the user is represented as a moving full-body avatar and sees his or her own body and gestures in the virtual space. "Only the combination of the two technologies enables a unique solution for new mixed reality interaction and collaboration scenarios," says Chojecki.
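Driving an avatar from body tracking boils down to moving detected joint positions from the camera's coordinate frame into the VR scene's frame each frame. A minimal sketch, in which the joint names, positions and the camera pose are all illustrative assumptions:

```python
import numpy as np

def camera_to_scene(points_cam, R, t):
    """Rigid transform from the 3D camera's frame into the VR scene frame:
    p_scene = R @ p_cam + t. Driving the avatar then means assigning these
    scene-space positions to the avatar's skeleton joints every frame."""
    return {name: R @ p + t for name, p in points_cam.items()}

# Hypothetical tracked joints (metres, camera frame) - names are illustrative.
tracked = {
    "head":   np.array([0.0,  1.70, 2.0]),
    "l_hand": np.array([-0.30, 1.10, 1.8]),
    "r_hand": np.array([0.35, 1.15, 1.8]),
    "pelvis": np.array([0.0,  1.00, 2.0]),
}
R = np.eye(3)                     # camera aligned with the scene (assumption)
t = np.array([0.0, 0.0, -2.0])    # place the user at the scene origin
avatar_joints = camera_to_scene(tracked, R, t)
print(avatar_joints["head"])      # head position in scene coordinates
```

Gesture interpretation would then operate on these scene-space trajectories, e.g. detecting a pointing pose from the hand and head positions.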

Feedback from the virtual world is displayed in the real scene by means of a projection. Special image processing algorithms from Fraunhofer HHI are also used for this projected augmentation. They ensure that visual cues and operating elements remain accurately positioned even when the surfaces onto which they are projected move or tilt.
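Keeping a projected overlay registered to a moving, tilted surface is classically done by re-estimating a planar homography from tracked surface corners each frame. The sketch below uses a plain direct linear transform; the corner coordinates are invented for illustration, and HHI's actual algorithms are not published here.

```python
import numpy as np

def homography(src, dst):
    """Direct linear transform: 3x3 homography mapping the four src points
    onto the four dst points (solved via the SVD null space)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    Hm = Vt[-1].reshape(3, 3)
    return Hm / Hm[2, 2]

def project(Hm, pt):
    """Apply the homography to a 2D point (homogeneous divide)."""
    x, y, w = Hm @ np.array([pt[0], pt[1], 1.0])
    return np.array([x / w, y / w])

# Overlay defined on a unit square; hypothetical tracked corners of the
# (now tilted) projection surface, in projector pixel coordinates.
overlay = [(0, 0), (1, 0), (1, 1), (0, 1)]
surface = [(10, 12), (52, 15), (49, 58), (8, 50)]
Hm = homography(overlay, surface)
print(project(Hm, (0.5, 0.5)))   # centre of the overlay, warped onto the surface
```

Warping the whole overlay image through `Hm` before projecting it keeps cues pinned to the surface as it moves.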

The X-reality solution is versatile: in addition to remote assistance, the process can be used in rapid prototyping, human-robot interaction and telecommunications, or in telepresence and gaming sessions. Two spatially separated people, for instance, could play a board game with each other.

At the Fraunhofer booth at CeBIT, visitors can try out a 3D puzzle and get help from the virtual world. In the demo, the real object is projected live into the VR environment and explained to the VR user, who can then react virtually and have these reactions reflected back into the real situation.