This article is part of a DEVELOP3D Special Report into Virtual Reality (VR) for design, immersive engineering and manufacturing
These technologies are not as mature as VR, and many of the more advanced AR/MR headsets are still in pre-production, but they should start to gain real traction in the next few years. The terms AR and MR are often used interchangeably, so it’s worth spending some time explaining the difference between the two.
Augmented Reality (AR)
AR describes the use of a device to overlay digital information on the physical world, whether that’s performance metrics, diagnostic information, geometry or something else.
The device allows you to look at an object (or indeed, a geographic location) and stream information relating to that object, in context, in your line of sight. That line of sight can be direct (overlaid on glasses in front of your eyes) or on a screen, through a camera (think pointing a smartphone or tablet at an object).
Mixed Reality (MR)
Mixed Reality combines the immersive nature of VR with the augmentation of AR. It allows you to visualise a virtual object directly in the real world, whether that’s a product prototype on your desk, a presentation in a conference room or a work cell in situ on the factory floor.
This is done through a headset that overlays the digital artefacts and data directly in your line of sight, allowing you to interact with the digital object, as you would a real object, in the room.
But is that clear cut?
While you could spend all week arguing over the definitions of VR, AR and MR, the reality (pun, absolutely, intended) is nowhere near as clear cut.
AR is all about adding digital content to real-world objects, but looking at a digital factory work cell on a shopfloor is both AR and MR, as it’s an entirely digital set of data, shown in the context of its use.
Ultimately, the definitions are almost academic and should be treated as such. What really matters is how these technologies, whatever you call them, can benefit design, engineering and production.
Potential for immersive engineering tech
Since the inception of the CAD industry, we have been locked into a single interface method: a keyboard and mouse, combined with a 2D screen, which is essentially a digital drawing board. While the rise of 3D design applications in the 1990s made the drawing board analogy less accurate, we have still been stuck with a two-dimensional display device and, with the notable exception of 3D mice, two-dimensional input methods.
AR/MR is set to bring a whole new set of display devices and input methods that move beyond the flat screen, giving us a more natural display and more intuitive ways to interact.
Let’s explore that potential with three little stories.
Story #1: The design process
Rather than sitting in front of your 27-inch monitor, you pop on your headset, fire up your CAD system and see the object in front of you, ready to go.
Using voice commands and hand gestures, you begin creating or editing a new model. If you need to discuss the model with a colleague, they do the same and you can look at and edit your model collaboratively, just as you would if working on a physical model — pointing out areas of concern, making edits individually and inspecting the results.
Unlike VR, where you are fully disconnected from the world around you, in AR/MR you work on a digital prototype that appears in the real world, right in front of you (and in front of your team). Importantly, you can still see your team members.
Now to take things further, imagine you’re working with a specialist in Japan. She can’t fly in for the design review and the time difference means she can’t join in live.
Imagine if you could record the session, tracking not only the model changes, but also the discussion, what each participant is doing and where they are pointing. She can wake up, sit through the review, look at what you’ve looked at, make her own edits and continue the project or her portions of it.
Story #2: Design to shop floor
You’ve just completed the concept models for a new factory work cell and submitted your quote. The customer likes what you’ve done but wants to learn more. You have the digital model built and want to see how it fits in context. You and the client visit the factory floor. The freshly skimmed concrete is there, waiting for the assembly your team has designed.
You fire up your headsets and there it is, in situ, operating as it would. You can both walk through, inspect it, make notes and crunch through those inevitable changes you’d only typically find once it’s installed and commissioned.
Story #3: In the world
As a final example, you’ve connected your products to the Internet of Things (IoT). A customer’s unit has developed a fault, and historical data indicates that this usually means a critical fault will develop within the next three days. Your business has switched to a service-based model, so if the product is down, you don’t get paid. So your service team is deployed to fix it.
They arrive on site, fire up their headsets and the product in the field is overlaid with diagnostic information, showing the metrics and fault indicators.
The technicians are then shown how to replace the problematic part or sub-system, stepping through the process with simple voice commands.
There’s no looking up which custom configuration your customer has, no flipping through greasy manuals or stroking an iPad’s screen. The data that’s needed is there, in front of their eyes, overlaid in high resolution directly on the product in front of them.
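The fault prediction that triggers this whole scenario can be as simple as a rule applied to recent telemetry. Here is a minimal, purely illustrative sketch; the metric names, thresholds and failure pattern are invented for the example and are not drawn from any real product’s data:

```python
# Hypothetical rule-based fault prediction over IoT telemetry.
# All metric names and thresholds below are illustrative assumptions.

def predict_critical_fault(readings, vibration_limit=4.0, temp_limit=85.0):
    """Flag a unit whose recent telemetry matches a known failure pattern.

    readings: list of dicts with 'vibration_mm_s' and 'temp_c' keys,
    ordered oldest to newest.
    """
    recent = readings[-3:]  # consider only the last three samples
    # In this sketch, sustained high vibration plus steadily rising
    # temperature is the pattern that historically precedes a failure.
    sustained_vibration = all(
        r["vibration_mm_s"] > vibration_limit for r in recent
    )
    temps = [r["temp_c"] for r in recent]
    rising_temp = temps == sorted(temps) and temps[-1] > temp_limit
    return sustained_vibration and rising_temp

samples = [
    {"vibration_mm_s": 4.5, "temp_c": 82.0},
    {"vibration_mm_s": 4.8, "temp_c": 87.5},
    {"vibration_mm_s": 5.1, "temp_c": 91.0},
]
print(predict_critical_fault(samples))  # True: dispatch the service team
```

In practice, such rules are usually replaced or supplemented by statistical or machine-learned models, but the principle is the same: telemetry in, dispatch decision out.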
These three scenarios illustrate the potential for a mix of VR/AR/MR/whatever-you-want-to-call-it ‘immersive engineering’. This rich combination of creation, editing, collaboration and distribution of information is all converging and converging soon.
Some parts of these stories are available now. Some are being worked on as we speak. Others are extrapolations of today’s technology and a prediction of what’s to come.
This is the promise of the next generation of computer interaction devices — and the potential for design, engineering and manufacturing is infinite.
The good news is that the hardware and services behind it are being driven by the entertainment sector, a much bigger market than engineering alone.
With our core skill sets of creativity, geometry and data wrangling, we get to take advantage of all of this as soon as it becomes available.
This article is part of a DEVELOP3D Special Report into Virtual Reality (VR) for design, immersive engineering and manufacturing, which takes an in-depth look at the latest developments in software and hardware and what you need to get up and running for immersive engineering.
Everything is for a reason How McLaren Automotive unleashes VR to create faster cars with more attention to detail
Quick guide: VR enabled applications A list of what’s out there now or coming soon
Virtual Reality challenges & future Six industry thought leaders’ views on the future of VR
HTC Vive: Getting up and running Our experience of working with HTC Vive and how to avoid common mistakes
Workstations & GPUs for VR A back to basics guide to buying hardware for Virtual Reality
Game on Amalgam creates game controllers for Holovis
VRED Pro 2017 & VR The latest release adds greater support for the HTC Vive and Oculus Rift
Virtalis VR4CAD Offering expertise in VR at a much more affordable level
ESI Group IC.IDO 11 ESI’s IC.IDO is one of the most advanced Digital Mock-Up tools available. With its addition of Vive support, we take a look at what the system is capable of and how it can benefit engineering
Gravity Sketch Beta An interesting take on modelling in Virtual Reality
Google Tiltbrush A system for VR creativity that’s both cheap and capable
Oculus Medium A good contender for design experimentation
Amari Magnetar V25 This stylish workstation has been specifically designed for VR. But despite its slimline chassis, you can still cram in incredible processing power
Nvidia Quadro P2000/P4000 Nvidia is changing the landscape of professional 3D graphics with a new family of Pascal Quadro GPUs, including a single slot ‘VR Ready’ card.
Back to basics: Augmented/Mixed Reality hardware
Whereas VR headsets are fully enclosed devices, with all visual feedback sent via displays, augmented and mixed reality headsets are different beasts altogether. The concept is that they mix digital information with the real world, overlaying the chosen data on what you see.
Methods to do this vary by vendor, but the overall picture is one of using transparent displays to show information, whether that’s a stream of text-based data or a 3D model.
AR/MR headsets also typically feature context and environment awareness. This means the device is aware not only of your movement, but also of the surrounding area — it’ll track you as you move around a virtual object (rather than the VR set-up, which tracks your movement with external sensors within a specific area).
While it’s very early days, it’s clear that there’s a lot of action in the AR/MR headset industry. Microsoft is perhaps winning the perception and awareness battle with its HoloLens product but don’t think this is the be all and end all.
There are many startups working in this space, from MetaVision with its Meta 2 headset to the Osterhout Design Group (ODG) with its R8 and R9 devices. And, according to sources from CES this year, this is just the tip of a massive iceberg.
What’s interesting about this new breed of devices is that their requirements are very different from those of VR headsets.
Their nature means that they need to be much more portable (particularly if they are to be used in the field, rather than an office environment). This alone is a challenge to be solved.
Wireless operation is key, so there needs to be enough on-board compute power to deliver the high frame rates the display demands. Battery life, of course, is also key.
Build in sensors for motion, sensors for interaction and mobile communications, and it’s a packaging nightmare — all waiting to be taken on, iterated and solved.