Interview: Siemens NX Immersive Designer

The launch of Sony’s new enterprise XR headset captured headlines at this year’s Consumer Electronics Show in Las Vegas. DEVELOP3D spoke to Siemens Digital Industries Software about its partnership with the legendary Japanese hardware company and how the two companies are coming together on a full package of XR capabilities for designers and engineers.


For the first time, a major vendor has created an extended reality (XR) headset specifically with the needs of 3D designers in mind.

Available later in 2024, Sony’s ‘immersive spatial content creation system’ comes with a number of designer-friendly features, from its flip-up visor and precision controllers to a Sony/Siemens partnership to build an immersive engineering system that will make the most of all the headset’s capabilities from the get-go. NX Immersive Designer is billed by Siemens as a solution to “kickstart content creation for the industrial metaverse”.

DEVELOP3D spoke with Siemens’ lead for immersive engineering Ben Widdowson about the launch, how the partnership with Sony developed, and how Siemens hopes to bring functional, direct-to-CAD XR to the desks of designers and engineers.

Sony’s new enterprise XR headset is aimed at 3D designers and engineers
The headset is set to work in harmony with existing desk set-ups – mouse and keyboard included

Q (DEVELOP3D): Hey Ben, this is an exciting partnership between two huge brands. How long was it in the making?

A (Ben Widdowson, Siemens): Sony is a Siemens customer and has been for many years. They approached us about three years ago with the idea of creating this new mixed reality headset specifically for engineering. The idea was born out of their own engineering teams, who wanted to use this kind of technology in their own day-to-day work. So that’s quite nice, in that it was inspired by engineers using NX within Sony. We obviously agreed that this was something exciting to do, and we wanted to announce it first at CES, because it’s such a big opportunity.

Q: There are a lot of players in the XR hardware space right now, but what was it that Sony was bringing to the table from an enterprise standpoint?

A: We agreed that Sony was a great partner for this, because they’ve obviously got a rich heritage of developing their own products, including PlayStation. But the things that they bring to bear in this headset specifically are elements like the very high-end sensors from their cameras, which are inside the headset, and the high-end displays, which are inside the lenses. There’s also the motion tracking technology inside the controllers; and for hand-tracking, they have another product called Mocopi, which is for doing very simple motion capture without needing all the usual markers and sensors. Sony has all these professional tools that they’re bringing to this new HMD to enable creators.

Q: From a software standpoint, what will the initial Siemens’ tools entail, and what parts of the design workflow will they focus on?

A: The software, NX Immersive Designer, is very much targeted at doing meaningful engineering work. We’re going beyond doing just design reviews. The big limitation that we’ve seen with most design review tools, including our own historically, is that the VR session isn’t necessarily connected to the live CAD data. So, when you launch the session, you’re either having to export something out of the CAD system and carry out some heavy pre-processing or data preparation to open it up in a different tool or a game engine, or it locks out your CAD session. So you’re always left analysing static data.

In Siemens’ language, you’re ‘breaking that digital thread’. As soon as you export something, you’ve not got that connectivity back to the CAD system or back into the PLM system and you lose all of the traceability – the good stuff that companies invest in tools for in the first place. This cumbersome workflow limits the applicability and usability of immersive technologies from a day-to-day perspective. So, the big piece underpinning all of NX Immersive Designer is that you’re actually interacting with the live NX session.

This means NX is now an immersive engineering application. You can do anything you would do in NX normally with your mouse and keyboard, and those updates will come through into the immersive view in real time. And what we’re looking to do now is to add in, albeit starting slowly, specific UI interactions in the immersive view that also update in real time back into the NX view. You’re actually interacting with the live NX data.

Initially, you’ll be able to do simple things like some synchronous modelling and playback of Mechatronics Concept Designer animations in the immersive view. And we’re excited to use the rest of this year to speak to customers and understand more immersive-specific workflows that they’re interested in.

Q: What do you think will be the biggest benefit of having more prolific XR technology in design workflows?

A: Collaboration is obviously fundamental to anything like this, enabling multiple people in different locations to view the same virtual model, or multiple people in the same office to view the same model. It’s quite transformative and we really want to make it as natural as possible for people to collaborate. We’re all super-familiar now with logging onto a Teams call and having a conversation. And if we can do that around the common CAD data set in a rich, immersive and live environment, then that’s even more powerful. We are also in the early stages of evaluating how this same approach and technology stack could be implemented across a wider spectrum of the Siemens Xcelerator portfolio, so we can bring the same benefits to a much wider cross section of our user community.

Q: The launch video features members of the Red Bull F1 team using the new product in a variety of scenarios. Is that a good example of how you believe it will be used in the future?

A: The collaboration with Red Bull Racing was two-fold. Firstly, they’re a super-cool organisation and have been very generous to allow us to use their data in some of our demos and videos. But there’s also the relationship with Red Bull Racing itself: we’ve been sponsoring the team now for 20 years and they really identify technology as their biggest competitive advantage in F1. They correlate the number of engineering changes they’ve been able to make each year with the number of points they’ve achieved in races. And obviously, the chart goes up, which is nice to see!

Being able to carry out really rapid design changes, week to week, is obviously critical to that particular industry. They see collaboration [in XR] as having huge potential, being able to do those design reviews and design sessions with the engineers out at the races and also back at home.

Q: The launch also highlighted the control methods for the headset, which has two precision handheld units as its focus, but also includes the standard mouse and keyboard. At the same time, Siemens has invested a lot into voice control technology in recent versions of NX. How will all this come together?

A: It’s a really simple point: you can still use your mouse or keyboard, whether that’s with the visor flipped up, doing work in NX like you normally would, or with your visor flipped down with a virtual monitor and your virtual object. You can still interact with the mouse and keyboard.

The Sony handheld controllers are specifically designed for precise engineering workflows. The ring allows you to effectively ‘pull’ your virtual object out of the screen and interact with it, almost holding it. And then the pointer is very much for precise selection and manipulation of the geometry.

Voice control, along with other input methods, is certainly a direction we’re looking at. We’re not launching with those capabilities specifically in mind yet, but they will still work, because you have access to all of NX.

Q: Aside from headsets and software, XR tends to come with extra costs for the compute hardware needed to run it all. What will NX Immersive Designer require?

A: We don’t have a specific set-up that we’re recommending at this point, although it should be NX-Certified and VR-ready. Those are the two things that we recommend to our customers. Obviously, it will require a GPU to run the immersive view, but we have people using it just on their laptops with reasonable GPUs – nothing too crazy.

This is the start of the journey, so we’re launching with an NX-based solution. We do have plans over time to support the rest of the Siemens Xcelerator portfolio, but nothing specific to reveal about that just yet.


Pricing and availability on both the Sony XR headset and Siemens’ NX Immersive Designer will be announced later this year.


This article first appeared in DEVELOP3D Magazine

DEVELOP3D is a publication dedicated to product design + development, from concept to manufacture and the technologies behind it all.

To receive the physical publication or digital issue free, as well as exclusive news and offers, subscribe to DEVELOP3D Magazine here