Why Design Tools Need To Change

nTopology’s Bradley Rothenberg on why design tools need to change

Over the last decade, manufacturing has undergone one of the most important shifts of the past 100 years: the shift to digitally controlled materials.

Factories are no longer built on machines that cut away existing materials. Instead, the most advanced factories on the planet now house machines that can build up material in digitally specified locations, in addition to removing it.

This means that you can now directly control material in the 3D world from a computer. You can produce objects with highly tuned material properties. To use an analogy, we have gone from a world where you only get one shade of grey to one where you get the richest full colour.

However, 3D design software is stuck with grey.

Grey software might be good for black and white movies, but what if you want colour? Trying to re-colour each frame would be slow.

If you took a picture of a beautiful autumnal day, featuring lots of leaves of different colours, re-colouring the frame by hand would be far too complex a problem, not to mention that it doesn’t sound like fun work.

Tackling complexity

The traditional approach to solid modelling, which is still used in almost all systems, is based on ‘boundary representations’, or ‘b-reps’ for short.

This means that each object is represented by a collection of faces that describe its outer skin (that is, its boundary).

This approach became dominant about 40 years ago, since it was a natural evolution of earlier ‘wireframe’ systems and was well suited to the creation of drawings, which, back then, was a primary application.

Unfortunately, a complex 3D-printed lattice might consist of billions of faces, so b-rep models can become enormous. Even a grille with thousands of holes has enough faces to cause trouble.
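
To get a feel for the scale, here is a rough back-of-envelope calculation; the cell count and the triangles-per-cell figure are illustrative assumptions, not measurements:

```python
# Rough back-of-envelope (the numbers are illustrative assumptions):
cells_per_axis = 200        # lattice unit cells along each axis of a part
triangles_per_cell = 1_000  # faces needed to tessellate one cell faithfully

total_faces = cells_per_axis ** 3 * triangles_per_cell
print(f"{total_faces:.1e} faces")  # 8.0e+09 -- billions of faces
```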

Another approach is to use an implicit representation, which specifies a recipe or process for creating an object rather than the result of that process. This allows highly complex objects to be stored very concisely.
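
As a minimal sketch (in Python with NumPy, purely illustrative and not how nTop actually stores models), an implicit representation of a gyroid lattice is just a short function that maps any point in space to an inside/outside value:

```python
import numpy as np

def gyroid(x, y, z, cell=1.0, thickness=0.1):
    """Implicit gyroid lattice: negative inside the material, positive outside.

    This recipe *is* the model. A b-rep of the same lattice could
    require millions or billions of faces.
    """
    k = 2.0 * np.pi / cell
    surface = (np.sin(k * x) * np.cos(k * y)
               + np.sin(k * y) * np.cos(k * z)
               + np.sin(k * z) * np.cos(k * x))
    return np.abs(surface) - thickness

# Query any point directly -- no boundary faces are ever constructed.
print(gyroid(0.0, 0.0, 0.0))  # -0.1: inside the lattice material
```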

Storing a recipe is, of course, exactly what the history tree does in contemporary CAD systems.

However, the history tree is used only for editing and to drive creation of b-reps, and nothing works (not even display) until a b-rep has been computed.

We sometimes call this the ‘b-rep bottleneck’. The key difference in the nTop system is that it uses the implicit representation for display and other purposes without first computing a b-rep, so it avoids the b-rep bottleneck.
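
As a simplified illustration of what ‘display without a b-rep’ can mean (reusing the gyroid() function from the sketch above, with a CPU raster standing in for a real GPU renderer), a cross-section can be drawn by sampling the field directly, with no intermediate boundary surface:

```python
import numpy as np

# Rasterise a z = 0 cross-section straight from the implicit field:
# sample, threshold, display. No boundary representation is computed.
xs = np.linspace(0.0, 2.0, 512)
X, Y = np.meshgrid(xs, xs)
inside = gyroid(X, Y, 0.0) < 0.0   # boolean inside/outside mask

# The mask can be shown as an image, e.g. with matplotlib:
# import matplotlib.pyplot as plt; plt.imshow(inside); plt.show()
```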

Meanwhile, another form of complexity is emerging – the complexity of material composition and distribution inside an object.

Since b-rep systems store only the outer skin of an object, they assume that material is homogeneous throughout the object’s interior.

However, given the capabilities of modern 3D printers, this assumption of homogeneity is no longer valid (if it ever was).

An object might consist of several chunks of different material, or its material might even be ‘graded’, with a continuous transition from rubber to plastic, for example.
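
A graded material fits naturally into the same function-of-position picture. Here is a minimal, self-contained sketch; the function name and the rubber-to-plastic blend along x are hypothetical:

```python
import numpy as np

def material_blend(x, y, z, x0=0.0, x1=50.0):
    """Hypothetical graded-material field (dimensions in mm).

    Returns a blend factor: 0.0 = rubber, 1.0 = rigid plastic, with a
    continuous transition between x0 and x1. Material becomes a function
    of position, not a single label on a homogeneous solid.
    """
    return np.clip((x - x0) / (x1 - x0), 0.0, 1.0)

# A toolpath generator could sample this at every voxel it deposits:
print(material_blend(25.0, 0.0, 0.0))  # 0.5 -> a 50/50 blend halfway along
```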

All design systems will need to handle this new form of complexity going forward.

We need speed

Today’s mainstream engineering stack is pretty fast if you just want to make a box – but what if you are designing the casing of a giant gearbox, or the interior of an airplane cabin?

What if I want a bigger radius here, or to vary the density of the conformal cooling pipes in the gearbox?

Anything slower than real-time feedback on a change to a single parameter is unacceptable, because it creates a brain/computer barrier that limits innovation.

The key to achieving superior speed is good utilisation of modern computer architectures.

But most legacy design software was created decades ago, and is difficult to adapt to modern machines.

In b-rep systems, the algorithms typically contain lots of branching and special-case logic, which does not fit well with GPUs, so you’ll probably find that your GPU and many of your CPU cores are idle most of the time.

Software emerging today is different: object representations and computational algorithms are tuned for modern computers, especially GPUs, and this delivers dramatic performance improvements.
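
To make that concrete, here is a small sketch (again reusing the gyroid() function from earlier; NumPy on the CPU stands in for a GPU kernel): evaluating an implicit field is one uniform, branch-free operation applied to millions of points at once, which is exactly the workload GPUs are built for:

```python
import time
import numpy as np

# Evaluate the field at 128^3 (~2.1 million) sample points in a single
# branch-free, data-parallel call -- the style that maps well to GPUs.
xs = np.linspace(0.0, 2.0, 128)
X, Y, Z = np.meshgrid(xs, xs, xs)

t0 = time.perf_counter()
values = gyroid(X, Y, Z)  # one uniform kernel over every point
print(f"{values.size} evaluations in {time.perf_counter() - t0:.3f}s")
```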

In some cases, the improvements are huge, and they come in two flavours.

Some functions run in minutes instead of days, which delivers obvious productivity gains.

Other computations take milliseconds instead of minutes, which enables realtime design exploration and promotes innovation.

Fun and automation

It’s not fun if, whenever I try to make a change to a big model, the update is slow, and there’s a chance that it might fail with an error.

Time spent trying to ‘debug’ a failing model, or to design something so that it works within the limits of the software, is deeply frustrating.

I’m speaking here from personal experience: the limits that the design software stack imposed on my own creativity were one of the reasons I started nTopology.

My thinking was (and still is) that there must be a better way to design a part.

For things that are simple and ‘prismatic’, the current stack works OK, but as soon as you want to design something more complex, things get either really, really hard or outright impossible.

It doesn’t take much to cause trouble.

The modelling roadblocks are numerous: patterns of shapes that vary in space (like varying-sized holes in a grille); shells with walls thicker than the radius of curvature; fillets that overlap each other; and so on.

In interactive work, these are just annoyances that can usually be worked around.

But increasingly, design work is being performed by algorithms that journey around a design space automatically, looking for good solutions.

If the search path of one of these algorithms runs into a roadblock, it typically can’t pause and look for a workaround.

It just grinds to a halt. So reliable modelling functions increase your fun as well as enabling automated design.
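
A toy sketch of the difference (reusing the gyroid() function from earlier; the sweep and the scoring metric are invented for illustration): because evaluating an implicit model is a pure function that always returns a value, an automated search never hits a failed rebuild:

```python
import numpy as np

def solid_fraction(thickness, n=64):
    """Score a candidate design: fraction of a unit cube that is material."""
    xs = np.linspace(0.0, 1.0, n)
    X, Y, Z = np.meshgrid(xs, xs, xs)
    return float(np.mean(gyroid(X, Y, Z, thickness=thickness) < 0.0))

# Sweep the design space; every candidate evaluates -- nothing grinds
# to a halt on a geometric special case.
for t in np.linspace(0.05, 0.5, 10):
    print(f"thickness={t:.2f} -> solid fraction={solid_fraction(t):.1%}")
```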

New tools, new challenges

New tools are emerging that are better suited to modern computers.

They are able to handle enormously complex models, they deliver significant gains in speed and reliability, and they enable a more automated approach to design.

They do this by using new technology that is radically different and will be difficult to incorporate into b-rep systems.

So, will the new tools replace current CAD systems? No, they won’t – just as colour didn’t replace black and white.

Instead, the new tools will go on to help define new markets. For the tools to solve the new problems, fundamental architectural changes will be needed, and this will take time.

On the other hand, the new tools cannot easily replicate all the CAD functionality that has been developed over the past 40 years. And in any case, it doesn’t make sense to replicate functions that are already working well.

So the best solution is co-existence. Over the last few decades, many companies have worked hard to consolidate on a single design tool, but, given the new design problems that have arisen, this will no longer work for them.

We are now in an era where, yet again, having multiple tools is a good thing. In fact, it may well be the only way to keep up with the radical innovation that is happening today in factories.

