More than a year on from its release, the Onshape Vision app from PTC is opening up new modes of visualisation and collaboration for designers wearing the Apple Vision Pro headset. Stephen Holmes sat down with Greg Brown, VP of product management at Onshape, to talk about where extended reality might prove most useful for designers and how best to take advantage of its powers.
Q: How do you view the uptick in designers and engineers using VR and XR, and what’s Onshape’s approach to addressing this trend?
A: Repeat usage is now much easier to encourage. In the old days, VR was cool once, but you didn’t really feel like you ever needed to do it again. Typically, a CAD workflow involves a lot of back and forth between text input, very precise mousing and other things, and sitting with their arms up in the air, doing ‘conductor’ actions, hasn’t really resonated with our customer base.
So we’ve focused more on the collaboration, sharing and mark-up that keeps design reviews moving, trying to shorten that time, especially with distributed design teams. These are common within our customer base and a few have specifically gone out and bought head-mounted displays [HMDs] so that they can bring teams together a little bit more.
Q: The Onshape Vision App solely supports the Apple Vision Pro. What was the thinking behind this decision?
A: I’ve used AR/VR/XR devices all the way back to the last century, and the Apple Vision Pro [AVP] marks an incredible, order-of-magnitude change from other devices: the field of view, the clarity, the way that you can work with gestures rather than pistol-grip devices in your hands.
The other thing is that it uses the environment around you to light a model. Nice reflections, realistic lighting effects and real-time, correct shadows – that’s an incredible thing that your brain will pick up on very, very quickly. And it’s comfortable for long-term use.
Onshape has been working with Apple from the very beginning. The app was launched on the opening day of the Apple Vision Pro and has undergone iterative enhancements since then.
We had been working with Apple for some time beforehand, and it was very good to have that close relationship with the company, a relationship that continues today.

Q: What are some of the benefits for designers that you feel the Apple Vision Pro technology offers?
A: We’re using the super capabilities of Apple’s gaze and environment detection and all these other things. That’s the special sauce for the Apple Vision Pro!
The newest thing is what we call ‘true support for spatial computing’. Instead of running the Onshape Vision app alone, you can actually shrink that down into a small transparent cube space that you can put anywhere alongside you – next to you on your desk, for example – and can continue working in your other Apple Vision Pro apps while it stays up-to-date in real time.
It’s pretty incredible to use, because you can then bring up Safari and have Onshape in a browser window. You can also have Arena in another browser window, and you could be doing CAD and PLM integration, all while you’ve got a full 3D interactable model right next to you on the desk.
I could be looking at some 2D representation, like a table of numbers, which is feeding in to create the 3D data, and I can manipulate the 2D stuff and see the 3D model change at the same time in the CAD browser window and in the 3D visualisation model.
Q: Does being cloud-native give Onshape advantages when it comes to XR?
A: The special, almost magical parts of the Onshape Vision app are only possible because of the 100% cloud-native nature of the Onshape platform. You’re looking at the latest data, it’s dynamic and it’s real-time.
The Onshape platform is unique in that it’s this true single source of data. You don’t need to save out in some glTF format or other kind of exported format and re-upload to another software to visualise it. It’s a very natural way to continue what you were doing in Onshape in a browser.

Q: Where in the product development workflow does it fit best?
A: You can evaluate different designs very easily. Your team can be on an Onshape session, changing configurations, changing dimensions, and you see the instant update in real time. To have a model that’s got multiple configurations, like a robot with different end effectors, you can evaluate those differences in real time.
I’ve spent a lot of time in industrial design, and evaluating the flow of light over surfaces is a really interesting use case. The quality of the environment mapping, reflections and materials you get is so good that an industrial designer can get a real sense of how the light is going to flow.
You’re also able to invite other people to share your session. We support Apple’s SharePlay. You basically place a FaceTime call [while using the AVP] to somebody who’s not using a device, and they can see what you’re seeing. They could be on the other side of the world – and then suddenly, you’re collaborating.
If you’ve got a second user who’s got an AVP as well, then it’s even better, because you can both be manipulating the design together. It’s really quite compelling to be able to pick one part out of the assembly and pass it to your collaborator. It works very smoothly, because the AVP treatment of gestures is so accurate. It tracks your hands incredibly well.
Q: You mention using Onshape Vision for marking up documents. How does that work in a virtual environment?
A: You can be looking and pointing at something, and then simply add a comment. Then that comment will be saved in that document the same way as if you had put that comment in using a browser or mobile device, and that’s in real time. I use voice-to-text all the time to make that comment. It’s a bit easier than typing.

Q: That’s a great tip – have you any others for the AVP?
A: Currently, you can bring up a Safari browser inside the AVP or you can mirror your screen. With the latter, I can be using my MacBook, but the display is virtual in front of me. This way you get a really high resolution – an 8K screen – much better than looking at Safari.
Also, this year, we included the ability to visualise your Onshape models in synthetic environments [VR], as well as your real environment [XR]. So we have six different environments that vary from a showroom to a factory.
There’s also a fun one – the cargo bay of a Star Destroyer – which gives really nice lighting effects. It has strip lights, a space vista outside, and there’s a flat, industrial-looking floor. So, if you’re evaluating products that have lots of curves, it’s great for that.
We don’t allow people to upload their own synthetic environments, but that’s something that we could probably imagine doing in the future.
This article first appeared in DEVELOP3D Magazine
DEVELOP3D is a publication dedicated to product design + development, from concept to manufacture and the technologies behind it all.
To receive the physical publication or digital issue free, as well as exclusive news and offers, subscribe to DEVELOP3D Magazine here