Teetering on the edge of space


Teetering on the edge of space, Josh Mings explores the role of multi-touch in the future of 3D interaction in product development
You touch your screen to turn it on. It flashes up instantly. You touch a sphere that opens to reveal 3-dimensional representations of your designs. You select two with your fingertips and flick them toward one edge of an 8-sided screen that immediately displays the content and structure of the 3D model… That was five years ago.

Now we’ve come to a threshold in what is possible with multi-touch technology. It’s not touch anymore; it’s reaching in and pulling out the syntonically calibrated beating heart of the most adaptive design environment known to humans.

Where we’ve come from

I’m sitting in the back row of a small jet teetering on the edge of the Earth’s atmosphere and the void of dark space. On the edge of floating and falling – the perfect place to write about the future of 3D interaction within the field of engineering and design, and the role our ability to manipulate areas of space with our touch is going to play.

Multiple ways to touch, multiple ways to interact with ribbons of our lives in timelines of playful video and digitised images: a technological revolution that linked our minds and our machines in one co-ordinated effort. We learned to create with our hands again, and multi-touch made way for the mainstream influx of 3D as the only criterion for moving a design from engineering to production, making 2D redundant. All ideas of what CAD was to become dissolved with the simple idea of multi-touch. But it wasn’t ‘multi-touch’ we longed for all those years ago. We wanted to reach the limits of controlling multi-dimensional space.

What will the future of 3D interaction look like? Who knows? But it’s going to look pretty damn cool when it does arrive. And shiny. Definitely shiny.


The output previously determined the way in which we interacted with objects on a display. We were strapped to a flattened digital landscape – a single point within that landscape; a single action within that point. Then multiple points of input changed our ideas about the limitations of that landscape. Multi-touch became a symbol of a desire to explore outside of it.
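To make that shift from a single point of input to many concrete, here is a minimal sketch of how multiple simultaneous contact points can be tracked and reduced to a zoom-and-rotate gesture for a 3D view, using the standard browser Pointer Events API. The element id, the onGesture callback and the commented-out camera calls are illustrative assumptions, not part of any particular CAD product.

```typescript
// Minimal sketch: track every finger on the surface with Pointer Events,
// then reduce two contacts to a pinch/rotate gesture.

type Point = { x: number; y: number };

const contacts = new Map<number, Point>(); // one entry per active contact point

function distance(a: Point, b: Point): number {
  return Math.hypot(b.x - a.x, b.y - a.y);
}

function angle(a: Point, b: Point): number {
  return Math.atan2(b.y - a.y, b.x - a.x);
}

function attachMultiTouch(
  el: HTMLElement,
  onGesture: (scale: number, rotation: number) => void
): void {
  let startDistance = 0;
  let startAngle = 0;

  el.addEventListener("pointerdown", (ev: PointerEvent) => {
    contacts.set(ev.pointerId, { x: ev.clientX, y: ev.clientY });
    if (contacts.size === 2) {
      const [a, b] = [...contacts.values()];
      startDistance = distance(a, b);
      startAngle = angle(a, b);
    }
  });

  el.addEventListener("pointermove", (ev: PointerEvent) => {
    if (!contacts.has(ev.pointerId)) return;
    contacts.set(ev.pointerId, { x: ev.clientX, y: ev.clientY });
    if (contacts.size === 2 && startDistance > 0) {
      const [a, b] = [...contacts.values()];
      // Two simultaneous points give two extra degrees of freedom:
      // their separation drives zoom, their orientation drives rotation.
      onGesture(distance(a, b) / startDistance, angle(a, b) - startAngle);
    }
  });

  const release = (ev: PointerEvent) => {
    contacts.delete(ev.pointerId);
    if (contacts.size < 2) startDistance = 0;
  };
  el.addEventListener("pointerup", release);
  el.addEventListener("pointercancel", release);
}

// Illustrative usage: feed the gesture into whatever 3D view is on screen.
// attachMultiTouch(document.getElementById("viewport")!, (scale, rotation) => {
//   camera.zoomBy(scale);       // hypothetical camera API
//   camera.rotateBy(rotation);  // hypothetical camera API
// });
```

Even this tiny sketch shows the break with the single-cursor model: the gesture only exists as a relationship between points, not as any one point’s action.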

Many people in digitally aided design companies thought the mainstream adoption of multi-touch technology would take a while. Developers had sensed it, but nobody realised the hardware and the software would act as one. It shook the world.

The sceptical majority failed to recognise the expectations of generations already familiar with using multi-touch environments to create libraries of media, networks, interactive games and relations between programs and data. In the CAD industry, the shortsighted simply saw a multi-touch screen as a way to move items about and failed to see 3-dimensional space in the context of multi-selectable, dynamic 3D design.


Now (cue the Theremin*) we float on the edge of the unknown limit of multi-dimensional systems, with the possibilities of larger, more adaptive environments residing outside the film of OLED displays and graphics processing power reaching into the teraflops. What started as a theory of physical movement across a screen, designed to work in a 2D world, is rapidly becoming something amazing that will ooze outside the display of every design engineering device.

The possibilities

Multi-touch will be everywhere, but not the flat screen approach we envision. Multi-dimensional design corporations, 3-dimensional digitised media companies, and virtual world communities are leading the push in hardware and software development. The landscape is changing. The 3-dimensional design environments of the future will reside beyond the screen. We will control the movements of commands within a system that adapts to the processes we use to create 3-dimensional geometry. But constructing the geometry will not be the focus. Geometry will be the by-product of criteria entered onto multiple contact points in the surrounding environment of surfaces, lines and envelopes which relay adaptive design changes to the model.
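The phrase “geometry as a by-product of criteria” is easier to see in a toy sketch: the design intent lives in a few named parameters, and the shape is regenerated from them rather than edited directly. Everything here (BracketCriteria, buildBracket, the simple axial-stress sizing rule) is an illustrative assumption, not taken from any real CAD system.

```typescript
// Toy sketch of criteria-driven geometry: change a criterion, regenerate the part.

interface BracketCriteria {
  load_N: number;        // working load the bracket must carry, in newtons
  allowable_MPa: number; // allowable stress for the chosen material
  width_mm: number;      // fixed mounting width
}

interface BracketGeometry {
  width_mm: number;
  thickness_mm: number;
  area_mm2: number;
}

// Geometry as a by-product: thickness is derived so the cross-section keeps
// stress below the allowable value (simple axial-stress sizing, A = F / sigma).
function buildBracket(c: BracketCriteria): BracketGeometry {
  const requiredArea_mm2 = c.load_N / c.allowable_MPa; // N / (N/mm^2) = mm^2
  const thickness_mm = requiredArea_mm2 / c.width_mm;
  return {
    width_mm: c.width_mm,
    thickness_mm,
    area_mm2: c.width_mm * thickness_mm,
  };
}

// Touching a single criterion regenerates the model instead of re-drawing it.
const v1 = buildBracket({ load_N: 4000, allowable_MPa: 80, width_mm: 25 });
const v2 = buildBracket({ load_N: 6000, allowable_MPa: 80, width_mm: 25 });
console.log(`${v1.thickness_mm.toFixed(1)} mm -> ${v2.thickness_mm.toFixed(1)} mm`);
```

The design choice the article is pointing at is simply where the hands land: on the criteria, with the geometry following, rather than on the geometry itself.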

Real-time rendering, tolerance analysis and finite element analysis predated multi-touch technology, but not to the extent that they will be used within a fully adaptive environment. Just as the measures of a 3D model are composed, the sounds and resonance of surroundings and adjoining parts will adjust in real time with materials and conditions. The road to consolidating all aspects of design will be revealed in an adaptive environment that combines the requirements of each.

Beyond this, redefining the interface redefines how ideas and products are manufactured, putting designer, 3D modeller and engineer in the midst of the environment where the object will be used. Even in manufacturing, prototyping will stretch beyond the CNC operator moving multiple objects across the display. As a model is created, machine operators will use multi-trajectory tool-pathing techniques to test and create machine operations. And multi-touch technology will affect manufacturing methods even further.

During the spread of multi-touch technology into manufacturing, rapid laser sintering technology was being developed to bring product design and manufacturing closer together. In the new multi-touch environments, there will be no gap between designer and manufacturer. Flash-atomising of material will give immediate results for each design iteration. Tied to the adaptive revisioning of parts, prototypes will be super-sintered instantly to redefine feel, function, weight and aesthetics. The processes that separated each phase of design and engineering will be brought together by the simple idea of being able to touch each aspect instantly.

The future

Ten fingers used to be the limitation. Now they are the most powerful input devices to deliver precise responses to an environment that is just as responsive. It’s not that ‘multi-touch’ is making all of this possible. It’s the expanse in front of us that hasn’t been explored; the space that hasn’t been moved. Multiple points of interaction within that space will transform design and engineering. The boundary between the inside and the outside of the display will be blurred… and this is just the beginning.

*The Theremin is one of the earliest electronic musical instruments. The player controls pitch and volume in 3D space by moving their hands near two antennas, disturbing the instrument’s electromagnetic fields to create sound. It was, arguably, the first 3-dimensional ‘multi-touch’ device.

Josh Mings is a mechanical engineer in the aircraft interiors industry. He is also the brains behind solidsmack.com.



