A couple of announcements this month led Al Dean to think about how our digital product design and simulation tools focus on the visual but so far miss the other key senses. Is a new era that includes sound and touch coming?
In early May a press release came through from Optis regarding a project that it had undertaken in conjunction with Airbus.
If you’ve not come across Optis before, it’s a French outfit that has built a reputation for providing high-end visualisation technology, typically used in the automotive and aerospace industries. The project (which has since become commercial) focussed on combining the benefits of today’s lower cost, more immersive VR head mounted displays (HMDs) with both haptic feedback and audio simulation.
The end result is a system that not only allows you to immerse yourself in the visual representation of a virtual model and environment but, through smart use of haptic feedback and audio simulation, allows you to reach out and interact with a product too.
This means you can gain feedback as your ‘digital’ fingers touch a hard surface as well as experiencing the sound emitted from that product. And the latter is not just an accompanying sound file, but a truly 3D simulated representation of the acoustic quality of the product and its surroundings.
Considering Optis’ core market, this makes huge sense. Beyond the exterior looks of a car, the later stages of the sales cycle often focus on the experience the customer has of the car’s performance when sat in the driver’s seat.
Now, think about that experience, particularly during those first few moments. It’s how it looks, how it is lit, the materials chosen and so on. Then, as you start to explore further, you become aware of the interaction with the console surrounding you. That’s partly visual, but it’s also heavily influenced by how controls feel and how they sound.
We’ve all heard the stories of how, when Toyota set up its Lexus division, it studied how the doors of a luxury car sounded, with a view to replicating that ‘sound of quality’.
And it’s not just cars. Think of a product you love. Chances are that it’s not simply about how it looks. It’ll be a combination of how it feels in your hand and how it operates. If there’s a mechanical action involved, it will be how that mechanism feels when operated.
Give me a rocker switch and I’ll happily sit there flicking it back and forth (probably till you take it away from me), because the feel and sound remind me of a Vauxhall Viva. The same is true of many things: that switch on a kettle, or the different feel of a light switch when travelling overseas.
Granted, in these days of omnipresent touch screens, mechanical switches are becoming few and far between, but I bet you feel the same (at least, I’m hoping so, otherwise I’m going to look a right ol’ plonker when this comes out). The iPhone 6 still has a mechanical mute switch; how long will that be around, I wonder?
So, considering we use a combination of our senses when interacting with products, it’s curious that we haven’t seen more of the type of work that Optis and Airbus are doing. Our digital design systems now allow us not only to develop the geometry of a part, but also to represent it near photorealistically on screen. With the advent of affordable VR, we’re able to walk around and look at that product in the same visually rich manner. But can we yet truly experience it immersively, with a sense of sound or touch?
Not at present, but we’re quickly advancing to the point where this might become a reality. It’ll start off at the high end, adopted first by the largest companies with the most at stake. Then it’ll filter down, probably driven by adoption in the gaming industry (Call of Duty: Smell-o-vision Edition, anyone?), and we’ll all have a richer experience.
Won’t that be a fancy thing, eh?