Discover our new options in our demo line-up
Paris – (21/04/2017)
TechViz will demonstrate its solution on various systems, including the TechViz VR package running on a head-mounted display: a lightweight setup that lets you run project reviews or even ergonomic studies and correct mistakes in your industrial models at an early stage.
Collaborative demonstrations will take place at TechViz booth E4, between an HTC Vive and a powerwall driven by a Christie Mirage WU14K-M projector, or between two HTC Vives. The TechViz collaborative option enables real-time collaboration between co-workers in different locations, on any VR device.
Advanced Virtual Assembly
This option makes it possible to simulate complex mounting/dismounting tasks and to explain manufacturing and maintenance issues to a remote team. Users can also interactively move a part of the model or manipulate a virtual tool, and activate and detect collisions. Adding an application connector for Catia, Navisworks, Creo or NX also provides part information as defined in the PLM database.
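Collision detection during a virtual mounting/dismounting task is typically built on bounding-volume tests. A minimal sketch of the idea, using axis-aligned bounding boxes (the `AABB` class, part names and coordinates are illustrative, not TechViz's actual implementation):

```python
from dataclasses import dataclass

@dataclass
class AABB:
    """Axis-aligned bounding box: min/max corners in model coordinates (metres)."""
    min_corner: tuple
    max_corner: tuple

    def intersects(self, other: "AABB") -> bool:
        # Boxes overlap only if their extents overlap on every axis.
        return all(
            self.min_corner[i] <= other.max_corner[i]
            and other.min_corner[i] <= self.max_corner[i]
            for i in range(3)
        )

# A part being dismounted is moved step by step and checked against a fixed housing.
housing = AABB((0.0, 0.0, 0.0), (1.0, 1.0, 1.0))
part = AABB((1.5, 0.2, 0.2), (1.8, 0.5, 0.5))          # clear of the housing
print(housing.intersects(part))                         # False: no collision
part_moved = AABB((0.8, 0.2, 0.2), (1.1, 0.5, 0.5))     # moved into the housing
print(housing.intersects(part_moved))                   # True: collision detected
```

Real systems refine a positive box test with exact mesh-level checks, but the broad phase above is what makes interactive-rate collision feedback feasible.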
Watch a video of TechViz Virtual Assembly option
Interactive Image Integration (I3)
The aim is to emulate touch-sensitive interaction with HMIs (human-machine interfaces) in an immersive environment. These HMIs are independent 2D applications integrated into the mock-up.
For example, in a virtual car, users' fingers can be tracked with a thimble so they can interact with the virtual GPS interface.
A collision between the interaction tool and the user interface emulates a computer mouse click.
This makes it possible to create scenarios such as testing hand accessibility to the main controls from the driver's seat of a car.
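The collision-as-click mechanism described above can be sketched as a geometric test: when the tracked fingertip comes within a touch threshold of the 2D panel's plane and lands inside its rectangle, the 3D position is mapped to a 2D pixel coordinate and dispatched as a click. The function name, panel layout and values below are illustrative assumptions, not TechViz's API:

```python
def fingertip_to_click(finger, origin, u_edge, v_edge,
                       width_px, height_px, touch=0.01):
    """Map a tracked fingertip (3D point, metres) onto a 2D HMI panel.

    The panel is the rectangle spanned by edge vectors u_edge and v_edge
    from origin. Returns (x_px, y_px) pixel coordinates for a click, or
    None if the finger is off the panel or farther than `touch` from it.
    """
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    cross = lambda a, b: (a[1]*b[2] - a[2]*b[1],
                          a[2]*b[0] - a[0]*b[2],
                          a[0]*b[1] - a[1]*b[0])
    rel = tuple(f - o for f, o in zip(finger, origin))
    n = cross(u_edge, v_edge)                       # panel normal
    # Perpendicular distance to the panel plane: the "collision" test.
    if abs(dot(rel, n)) / dot(n, n) ** 0.5 > touch:
        return None
    u = dot(rel, u_edge) / dot(u_edge, u_edge)      # normalised panel coords
    v = dot(rel, v_edge) / dot(v_edge, v_edge)
    if not (0.0 <= u <= 1.0 and 0.0 <= v <= 1.0):
        return None
    return int(u * (width_px - 1)), int(v * (height_px - 1))

# A finger touching the centre of a 0.20 m x 0.10 m GPS panel one metre away:
print(fingertip_to_click((0.1, 0.05, 1.005),
                         origin=(0.0, 0.0, 1.0),
                         u_edge=(0.2, 0.0, 0.0),
                         v_edge=(0.0, 0.1, 0.0),
                         width_px=800, height_px=480))   # (399, 239)
```

The returned pixel coordinate is what would then be injected into the embedded 2D application as a standard mouse event.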
In TechViz, you can insert a virtual human avatar, a manikin, without using body tracking.
Modelled on a real manikin, this avatar can be manipulated by grabbing several limbs and joints: wrists, knees, elbows, feet, head and chest. Once the avatar is created in TechViz, it can be placed in a posture (sitting, standing or crouching). A posture can also be recorded for later use or for a project review.
Thanks to this avatar, we can check whether a user can reach an object and access controls, based on where it is located and on the avatar's posture. We can then see which postures to adopt to perform a task, even when there are obstacles in the way.
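A recorded posture, as described above, can be thought of as a named snapshot of the joint configuration. A minimal sketch of that data structure, with joint names matching the manipulable joints listed above (the angle values and function names are illustrative assumptions):

```python
# Store of named postures: each is a snapshot of joint angles in degrees.
postures = {}

def record_posture(name, joints):
    """Snapshot the current joint configuration for later reuse."""
    postures[name] = dict(joints)

def recall_posture(name):
    """Retrieve a previously recorded posture, e.g. for a project review."""
    return postures[name]

record_posture("sitting", {
    "head": 0, "chest": 10,
    "elbow_l": 45, "elbow_r": 45,
    "wrist_l": 0, "wrist_r": 0,
    "knee_l": 90, "knee_r": 90,
    "foot_l": 0, "foot_r": 0,
})
print(recall_posture("sitting")["knee_l"])   # 90
```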
Point clouds are 3D data representing an existing model or environment, most often captured with 3D scanners. They make it possible to quickly create a '3D model' (made only of points) of a real physical object or environment, without modelling it manually from scratch, which takes a long time and costs a lot of money.
Since the model is 'recreated' with points, it looks flat when displayed on a normal non-stereo screen. Users get better insight into the object or environment by displaying the model in 3D with tracking (a VR environment).
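The reason a point cloud reads as flat on a mono screen but gains depth in VR can be sketched with a simple pinhole stereo projection: each eye sees every point shifted by a disparity that shrinks with distance, and that shift is the depth cue a single image cannot provide. The values below (focal length, interpupillary distance) are illustrative:

```python
def project(point, eye_x, focal=0.5):
    """Pinhole projection of a 3D point (metres) for an eye offset eye_x along x."""
    x, y, z = point
    return (focal * (x - eye_x) / z, focal * y / z)

IPD = 0.064                         # typical interpupillary distance, metres

def disparity(point):
    """Horizontal shift between the left-eye and right-eye images of a point."""
    (xl, _), (xr, _) = project(point, -IPD / 2), project(point, +IPD / 2)
    return xl - xr

near = (0.0, 0.0, 1.0)              # scanned point 1 m away
far = (0.0, 0.0, 5.0)               # scanned point 5 m away
print(disparity(near))              # larger shift: perceived as close
print(disparity(far))               # smaller shift: perceived as far
```

On a non-stereo screen every point collapses to a single projection, so two points at different depths can land on the same pixel; rendering a separate image per eye restores the distinction.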
The TechViz tracking option lets people dive into and immerse themselves in their model. Users can navigate inside it and see what they missed on their normal screen.
Lucie Deniset +33 1 55 03 00 84