" />

Virtual Reality – the New Era in 3D Engineering

Video games are addictive; recently, an engaging game of search and acquire (Pokémon Go) has been driving young and old into a frenzy. It has successfully merged the real world with the virtual world and created an exciting experience for gamers. A virtual reality (or immersive virtual reality) system consists of VR software, a head-tracking sensor, a helmet-mounted visual display and an input device used to interact with the environment. Such a device lets you run through hurdles, shoot arrows, jump off cliffs and, in general, believe you are doing the impossible. Unlike traditional user interfaces, VR places the user inside the experience by creating the illusion of being present in digitally created spaces and environments. Beyond gaming and entertainment, virtual reality today offers huge opportunities in fields like teaching, learning and engineering (CAD/CAM/CAE). It simulates senses such as vision, hearing and touch and transports the participant into an artificial world.

How does VR work?

The single most recognized element of VR is the head-mounted display (HMD). In VR, the user's eyes are placed inside the simulated environment, unlike augmented reality, where the computer uses sensors and algorithms to determine the position and orientation of a camera looking at the real world. When the user turns their head, the graphics react accordingly, creating a convincing, interactive world.
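To make that idea concrete, here is a minimal sketch in THREE.js (not code from a specific headset SDK): each frame, the latest head orientation from the tracking sensor is copied into the virtual camera before rendering, so the rendered view follows every head movement. The getHeadPose() helper is a hypothetical stand-in for whatever quaternion the HMD or phone sensor API actually provides.

    // Minimal sketch: head tracking drives the virtual camera.
    // getHeadPose() is a hypothetical stand-in for the sensor API.
    var scene = new THREE.Scene();
    var camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
    var renderer = new THREE.WebGLRenderer();
    renderer.setSize(window.innerWidth, window.innerHeight);
    document.body.appendChild(renderer.domElement);

    function animate() {
      requestAnimationFrame(animate);
      camera.quaternion.copy(getHeadPose()); // the virtual camera turns with the user's head
      renderer.render(scene, camera);        // so the rendered view follows every head movement
    }
    animate();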

The most popular players in the virtual reality space:

Oculus VR

Oculus is the big daddy of VR that kick-started it all, and it has the largest mindshare, particularly when it comes to gaming.

Google Cardboard

Google Cardboard is essentially an ultra-low-budget DIY virtual reality platform developed by Google. Its head mount is a fold-out cardboard viewer. Users can build their own viewer using the specifications published by Google or buy one ready-made.

Microsoft’s HoloLens

Microsoft’s HoloLens is more augmented reality than virtual reality, but it is likely to offer a number of advantages for business, engineering and communication.

Our experience with Virtual Reality@3D

Team ProtoTech got an opportunity to work on a 3D BIM VR project for one of our esteemed clients. We realized that, while many 3D design applications let you look at a model, the screen you viewed it on was still 2D. Team ProtoTech was therefore asked to develop a VR app with stereoscopic effects that would let the user actually experience the architecture of the model in a virtual 3D environment.

So how do stereoscopic effects and VR add depth to a 2D screen?

VR exploits the illusion of depth created by stereopsis, i.e. binocular depth perception. When a person with normal vision looks at an object, each eye sees it from a slightly different angle. The two images are sent to the brain, where they are combined into a single image, and the differences between them are used to judge what is nearer and what is farther away, creating the 3D effect. A stereogram is simply a pair of images of the same model with a small offset between them.
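As an illustration of that offset (a sketch under the assumption that the scene is modelled in metres, not the code from our app), a stereo pair can be produced by cloning the scene's main camera into a left and a right camera and shifting each by half an interpupillary distance along the camera's right axis:

    // Illustrative sketch: build a left/right camera pair offset by an assumed
    // interpupillary distance of ~64 mm (0.064 units if the scene is in metres).
    var EYE_SEPARATION = 0.064;

    var leftCamera  = camera.clone();
    var rightCamera = camera.clone();

    // The camera's local X axis, taken into world space, points to its right.
    var rightAxis = new THREE.Vector3(1, 0, 0).applyQuaternion(camera.quaternion);

    leftCamera.position.addScaledVector(rightAxis, -EYE_SEPARATION / 2);
    rightCamera.position.addScaledVector(rightAxis,  EYE_SEPARATION / 2);

    // Rendering the same scene once with leftCamera and once with rightCamera
    // produces the two slightly offset images that the brain fuses into depth.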

So how did we create the stereoscopic effect?

Team ProtoTech used THREE.js to create the stereoscopic effect, for two main reasons. First, THREE.js is a WebGL library that lets you render 3D architectural designs in the browser, and almost every device today (laptop, mobile or desktop) ships with a browser, the latest of which have built-in WebGL support; this key feature allowed us to render our 3D models on almost every device. Second, THREE.js normally renders the scene to a single viewport, whereas the stereoscopic effect we created renders it to two viewports, one for each eye, with a small offset between the two viewpoints.
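For reference, the usage looks roughly like this, assuming THREE.StereoEffect from the three.js examples (examples/js/effects/StereoEffect.js) is loaded alongside the core library and a scene and camera already exist; the effect takes care of the per-eye cameras and the side-by-side viewports, so the same scene graph feeds both eyes.

    // Sketch: render through THREE.StereoEffect instead of the renderer directly.
    var renderer = new THREE.WebGLRenderer();
    renderer.setSize(window.innerWidth, window.innerHeight);
    document.body.appendChild(renderer.domElement);

    var effect = new THREE.StereoEffect(renderer);
    effect.setSize(window.innerWidth, window.innerHeight);

    function animate() {
      requestAnimationFrame(animate);
      // The effect splits the canvas into two side-by-side viewports and
      // renders the scene once per eye with a small horizontal offset.
      effect.render(scene, camera);
    }
    animate();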

Our solution for creating the VR experience

  • We wanted the user to be able to look around the model with the stereo effect even on a simple cardboard viewer (Google Cardboard), so we decided to use the mobile's device orientation sensor. Device orientation lets the user look around and inside the model.
  • However, we felt this would not be enough, since the user also needs to walk, stop or change position. For this we introduced menu items in the VR app. The menu items are displayed when the user looks downward: to see the menu, you simply look down while the mobile faces upwards.
  • Finally, we implemented navigation with collision detection in the VR app. The menu icons let the user start and stop navigation and control its speed, as shown in the sketch after this list.
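The sketch below puts these three ideas together. It assumes THREE.DeviceOrientationControls from the three.js examples is available and that a scene and camera already exist; the look-down threshold, the setMenuVisible() helper and the walking/walkSpeed flags are illustrative assumptions, not the exact logic used in the app.

    // Illustrative sketch of orientation-driven look-around, a look-down menu
    // trigger and ray-based collision detection (assumptions noted above).
    var controls = new THREE.DeviceOrientationControls(camera); // from the three.js examples
    var raycaster = new THREE.Raycaster();
    var forward = new THREE.Vector3();
    var walking = false;    // toggled by the start/stop menu icons (assumed)
    var walkSpeed = 0.05;   // adjusted by the speed menu icons (assumed)

    // Call once per frame from the render loop.
    function update() {
      controls.update();                      // phone orientation -> camera rotation

      // Show the menu when the camera's forward vector points well below the horizon.
      camera.getWorldDirection(forward);
      setMenuVisible(forward.y < -0.7);       // setMenuVisible() is a hypothetical helper

      // Cast a short ray ahead of the camera; stop walking if a wall is close.
      raycaster.set(camera.position, forward);
      var hits = raycaster.intersectObjects(scene.children, true);
      var blocked = hits.length > 0 && hits[0].distance < 0.5;
      if (walking && !blocked) {
        camera.position.addScaledVector(forward, walkSpeed);
      }
    }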

Needless to say, our client was delighted with the app, and Team ProtoTech pushed its limits to solve a new problem. Another feather in our cap! We would love to hear about your experiences with VR or AR. Do share your feedback in the space below.

Contact us

Tarika.jain@prototechsolutions.com

Follow me on https://twitter.com/TarikaJn



