Harnessing AI for engineering development

Huge data sets are reshaping automotive testing and design. Chris Pickering reports.


Knowledge is a curious thing. As an individual you can never have too much, but as an organisation it’s possible to have such vast quantities of data that it becomes difficult to extract useful information from it using conventional methods. This is the world of Big Data – a familiar buzzword from computing, but also a practical tool for engineering development if it’s harnessed in the correct way.

It can almost be thought of as machine intuition. Conventional models perform well when presented with clearly defined problems governed by well-understood physics, but they can struggle with complex, multi-dimensional data. Conversely, artificial intelligence thrives on large, complex datasets. Self-learning algorithms can churn through historical data, spotting patterns that can be used to predict future outcomes. Unlike traditional models, this approach doesn’t need to be anchored to a set of pre-existing equations, although it can be used in conjunction with conventional techniques.

“It’s the equivalent of going up to an experienced chief engineer and asking, ‘what’s your gut feeling, will this work?’” said Dr Richard Ahlfeld, CEO of Monolith AI. “You can literally have an AI version of that expertise, which comes back and says, ‘we’ve looked at those parameters, they’re very similar to 10 different tests we’ve done in the past, and eight of those were a disaster’.”

It sounds simple, but as with human intuition, the key lies with spotting subtle and complex relationships.
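Ahlfeld’s “gut feeling” analogy maps naturally onto a nearest-neighbour lookup: compare a proposed design’s parameters against past tests and report how the closest matches turned out. The sketch below is purely illustrative – the parameters, values and outcomes are invented, not drawn from any real test programme or from Monolith’s software.

```python
import numpy as np

# Historic test records: each row is a set of design parameters
# (entirely hypothetical values), plus whether that test passed.
past_params = np.array([
    [1.00, 200.0, 0.30],
    [1.10, 210.0, 0.31],
    [0.90, 195.0, 0.29],
    [2.50, 400.0, 0.90],
    [1.05, 205.0, 0.30],
])
past_passed = np.array([False, False, True, True, False])

def gut_feeling(new_params, k=3):
    """Find the k most similar past tests and count how many failed."""
    # Normalise each parameter so distances aren't dominated by scale
    scale = past_params.max(axis=0) - past_params.min(axis=0)
    d = np.linalg.norm((past_params - new_params) / scale, axis=1)
    nearest = np.argsort(d)[:k]
    failures = int((~past_passed[nearest]).sum())
    return nearest, failures

neighbours, n_fail = gut_feeling(np.array([1.02, 204.0, 0.305]))
print(f"{n_fail} of the {len(neighbours)} most similar past tests failed")
```

In practice the similarity measure and the learning algorithm would be far more sophisticated, but the principle – letting historic results stand in for intuition – is the same.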

“Internally, we talk about problems with intractable physics,” said Ahlfeld. “We’ve covered something like 370 cases now, and at the end of each year we go through and carry out human pattern recognition on them. The aim is to understand what could have been done by existing physics simulation, or just sheer common sense, and where machine learning works better. And the one commonality that we’ve found between all of those is that the design response has a lot of parameters, and it’s very nonlinear.”

He likens it to the classic engineering job interview question, where a candidate is asked what impact a specific change will have on the rest of the system. If there’s a clear correlation that the human brain can perceive then it probably doesn’t require the AI. However, there can still be cost and efficiency benefits to making a prediction based on data that you already possess rather than building a model from scratch.

Designed for engineers

Monolith has its roots in work carried out for Ahlfeld’s PhD at Imperial College. He was working with NASA at the time, and notes that despite decades of additional experience, the agency sometimes seemed to be at a disadvantage to rivals such as SpaceX, whose data had all been collected and processed in the digital age. It wasn’t that they had more knowledge; they were just better able to utilise it.

In 2018, Monolith undertook its first commercial projects, with clients including L’Oréal, McLaren and Airbus. The aim was to provide an AI tool that could be used by engineers as part of their existing workflow, rather than a specialist application for data scientists.

“Expecting engineers to work with tools designed for mathematicians is a bit like telling people that they should be programming their own CFD codes,” Ahlfeld said. “Our experience is that machine learning is undergoing the transformation that computational fluid dynamics and finite element analysis saw in the 1990s. Those slowly began to become commercial tools that could be used by the same person that was working on the design itself.”

Crash tests generate enormous amounts of data from over 1,000 sensors - Monolith

Monolith’s applications include providing data-driven models to plug the gaps where existing simulations can’t provide the answers, predicting the performance of 3D designs, and optimising test plans.

Engineers at BMW now use the software to predict the performance of crash tests and wind tunnel investigations. In one example, Monolith was used to predict the forces on an occupant’s tibia for a range of different crash types without carrying out physical testing.
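The underlying idea is a data-driven surrogate: fit a model to the results of past physical tests, then query it for configurations that haven’t been tested. The sketch below is a minimal, hypothetical version using synthetic data and an ordinary least-squares fit – the input parameters and the “true” relationship are invented for illustration, and bear no relation to BMW’s actual crash models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for historic crash data: impact speed (m/s),
# vehicle mass (kg) and intrusion depth (mm) -> peak tibia force (kN).
# The relationship below is invented purely for this illustration.
X = rng.uniform([10, 1400, 50], [18, 2200, 150], size=(200, 3))
y = 0.3 * X[:, 0] + 0.002 * X[:, 1] + 0.01 * X[:, 2] + rng.normal(0, 0.1, 200)

# Fit a linear surrogate by ordinary least squares (design matrix + bias)
A = np.hstack([X, np.ones((len(X), 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(speed, mass, intrusion):
    """Estimate the response for an untested configuration."""
    return np.array([speed, mass, intrusion, 1.0]) @ coef

print(f"predicted peak tibia force: {predict(15.0, 1800.0, 100.0):.2f} kN")
```

Once trained, a surrogate like this can be evaluated in microseconds, which is what makes it useful for exploring crash configurations that would otherwise each require a physical test.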

The German automotive giant conducts over 1,000 crash tests a year, which means that huge volumes of data are generated. “If you go back five or 10 years, you very quickly end up with thousands of crash tests; each crash test has between 1,000 and 1,300 sensors. You’re literally measuring everything, and they have high sampling frequencies, so the amount of data that you create is huge,” Ahlfeld explained.

The relationship came about through the BMW Startup Garage. Pitched as a gateway to the multi-trillion-dollar global automotive industry, the scheme works with 1,500 startups from 30 different countries to channel innovations into the sector. It’s one of a number of automotive scouting programmes that Monolith has been involved with since its inception.

Like all car manufacturers, BMW is investing heavily in its simulation capabilities. Crash testing is one area where conventional simulation techniques have come on significantly in recent years, but they still have their limitations – and they likely always will.

In reality, there are still things that are very difficult to get right

“The idea of 100 per cent digital engineering has been a continuous goal in the industry for a long time,” said Ahlfeld. “In reality, there are still things that are very difficult to get right. If you take crash testing, there are aspects of it that are really difficult to simulate when you’re dealing with several thousand components hitting a wall at 35 mph.”

Crucially, crash testing has a vast library of data to draw on, with NCAP test results having been held in the same digital format for more than 25 years. Other areas are catching up fast, though: developers are generating huge amounts of data around assisted and autonomous driving as test vehicles and simulations clock up millions of miles per year.

“We’ve seen lots of engineers going away from crash test and into assisted driving, because that’s a huge area of interest currently,” Ahlfeld continued. “And the lesson we’ve learnt in this area is that whenever you work on something that’s radically different, simple physical calculation methods no longer apply, so you need to carry out testing and analyse those results to learn what’s happening.”

Two million points

There are parallels in physical design as well. Another client of Monolith’s is involved in motor design for electric vehicles. In theory, this is all governed by conventional physics, but the complex interactions involved can make the data hard to digest.

“They have about 20 different design parameters, along with different electrical conductivities and different rotational numbers. And every time they do a test, they get a ridiculously large frequency response out for each rotation – it’s something like 50 inputs and two million output points, so it’s incredibly difficult to interpret.”
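One common way to tame outputs of that size is dimensionality reduction: although each test produces millions of points, the responses often vary along only a handful of underlying directions. Below is a minimal sketch of the idea using principal component analysis on synthetic spectra – 2,000 points rather than two million, with invented mode shapes, purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in: 50 motor tests, each producing a long frequency
# response built from three underlying (invented) vibration modes
freqs = np.linspace(0, 1, 2000)
modes = np.vstack([np.sin(2 * np.pi * k * freqs) for k in (3, 7, 11)])
weights = rng.normal(size=(50, 3))
responses = weights @ modes + rng.normal(0, 0.01, size=(50, 2000))

# PCA via SVD: centre the data, then keep the leading components
centred = responses - responses.mean(axis=0)
U, s, Vt = np.linalg.svd(centred, full_matrices=False)
explained = (s ** 2) / (s ** 2).sum()

# Three components capture almost all of the variation, so each
# 2,000-point response can be summarised by just three scores
scores = centred @ Vt[:3].T
print(f"variance explained by 3 components: {explained[:3].sum():.3f}")
print(f"compressed shape: {scores.shape}")
```

A regression or machine-learning model can then be trained on those few scores instead of the raw multi-million-point response, which is far easier to interpret and to fit.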

With the move towards electrification, car manufacturers’ approach to NVH development is also changing. Where previously the sound of the internal combustion engine might have masked a slight rattle in the trim or a bit of extra wind noise from the wing mirrors, the near-silent operation of an electric motor exposes these issues.

“Again, it comes down to lots of repetitive non-linear testing,” said Ahlfeld. “You have lots of potential noise sources, so you need to carry out vibration testing on all of those components, and it’s really difficult to predict how they’re going to behave once you actually put them in the car. Here, you can put one part of the car on to a shaker rig and build a machine learning model that understands why it’s shaking, and then put this into a systems model for the car to understand how it’s going to impact the noise without doing more expensive testing.”

Of course, the one thing that machine learning requires is copious quantities of data – preferably from a series of closely related trials with incremental changes. That’s not available in every case, but in the heavily digitised world of modern engineering it’s an increasingly common commodity. And in this case, knowledge truly is power.