Past efforts to apply computer-aided engineering (CAE) effectively were hindered by its poor integration with design systems, the overly complex packaging of the software and the specialist skills required to perform the analysis. Frequently, components had already moved into the production phase by the time the analyst was able to complete a simulation study.
Rigorous analysis can help eliminate major design problems, but the development investment made by this point in the process allows only small adjustments to the design, unless it proves so poor that a major redesign is required. The recent trend of downsizing (or totally eliminating) analytical departments has only compounded these problems.
It is also the case that, with the trend towards more general-purpose, multi-skilled engineers, skilled analysts are in decline. On the plus side, the integration between CAD and CAE has improved substantially. What has been lacking is the packaging of CAE tools so that design engineers can use the technology immediately.
Before the availability of synthesis or design optimisation techniques, achieving the optimal design of a product, part or process was a hit-and-miss affair. The design was bounced around between the drawing office and engineering and manufacturing departments.
Because all parameters can be fed into the design optimisation system before the process begins, different departments can now build on the work of other departments in a truly concurrent fashion.
The traditional finite element analysis (FEA) approach to optimisation is to develop a series of macro and batch files, which allow the analysis to be parameterised and run for a range of parameters. This enables organisations to capitalise on analytical experience.
Once the user with the greatest proficiency has established the menus, commands and sequence that define each customised application, any other engineer in any department or company can use it. This captures the ‘design intent’ of the original analyst.
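The parameterised macro-and-batch approach can be sketched in a few lines. This is a hypothetical illustration, not any vendor's actual macro format: the deck template, the parameter names (`thickness`, `load`) and the `*RUN_STATIC_ANALYSIS` keyword are all invented for the example.

```python
from string import Template

# Hypothetical input-deck template; a real study would substitute
# parameters into the macro files of whichever FEA package is in use.
DECK_TEMPLATE = Template(
    "*PARAMETER\n"
    "thickness = $thickness\n"
    "load = $load\n"
    "*RUN_STATIC_ANALYSIS\n"
)

def build_decks(parameter_sets):
    """Expand one template into a batch of solver input decks."""
    return [DECK_TEMPLATE.substitute(p) for p in parameter_sets]

# One design study, swept over a range of parameters.
sweep = [
    {"thickness": 2.0, "load": 500.0},
    {"thickness": 2.5, "load": 500.0},
    {"thickness": 3.0, "load": 500.0},
]
decks = build_decks(sweep)

# Each deck would normally be submitted to the solver in batch mode,
# e.g. subprocess.run(["solver", "-i", deck_file]) -- omitted here.
```

Once the template is written by the expert analyst, any engineer can rerun the whole sweep with new parameter sets, which is exactly how the original ‘design intent’ gets captured and reused.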
In the ‘real world’, physical phenomena tend to act together: an aerodynamic wing, for instance, needs to be modelled for stress, vibration and fluid flow. The design of these kinds of systems using single-phenomenon behaviour currently entails many back-and-forth iterations until an optimum is reached.
In closely coupled problems where stability is an issue, it is beneficial to solve the problem as a single system. For example, in Formula One and other racing disciplines it is important to ensure that wings do not deflect or flutter, both to comply with FIA regulations and to avoid potentially catastrophic fatigue failures. As we have already seen this season in F1, there has been significant controversy about whether wings had been deliberately designed to flex so that they form a completely different aerodynamic device; you could not do this kind of design reliably without competent fluid-structure analysis.
CD-adapco is developing its core technology towards ‘Computational Continuum Mechanics’, or CCM, to allow the solution of structural analysis problems using its existing CFD solver technology. Although most people automatically associate FEA with structural analysis, the finite volume technology that underlies most commercial CFD software is also applicable to structural analysis and holds some advantage over traditional techniques.
The iterative solvers that are at the heart of most CFD codes are inherently non-linear and require much smaller storage than finite element solvers for the same problem size. Much effort has also gone into the parallelisation of CFD codes, to the extent that they can solve much bigger problems than FEA codes (CD-adapco’s STAR-CD is routinely used to solve problems with 100 million cells or more).
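The storage advantage comes from the structure of iterative methods: a Krylov solver such as conjugate gradients keeps only a handful of vectors in memory and touches the system matrix solely through matrix-vector products, whereas a direct finite element solver must store (and fill in) a factorisation. A minimal sketch, on a small dense system purely for illustration (production CFD codes run the same loop on sparse operators):

```python
# Conjugate gradients for a symmetric positive-definite system A x = b.
# Storage is just a few vectors (x, r, p, Ap); the matrix is only ever
# used via matvec, so it could equally be a sparse or matrix-free operator.

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def conjugate_gradient(A, b, tol=1e-10, max_iter=100):
    x = [0.0] * len(b)
    r = b[:]                 # residual b - A x (x is zero initially)
    p = r[:]                 # search direction
    rs = dot(r, r)
    for _ in range(max_iter):
        Ap = matvec(A, p)
        alpha = rs / dot(p, Ap)
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = dot(r, r)
        if rs_new < tol:     # squared residual norm small enough
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

A = [[4.0, 1.0, 0.0],
     [1.0, 3.0, 0.0],
     [0.0, 0.0, 2.0]]
b = [1.0, 2.0, 4.0]
x = conjugate_gradient(A, b)
```

Nothing here is specific to fluids; the same low-storage loop is what makes these solvers attractive for very large structural problems too.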
For Fluid-Structure Interaction (FSI) problems a further advantage is that you can build a single mesh inside the CFD environment. The coupling is performed in memory rather than passing iterative results files back and forth on the hard-drive — resulting in much tighter coupling, with information being interchanged at an inner iteration level rather than at each external iteration. Results are instantly available for both the structural and the fluid problem without mapping between packages.
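The inner-iteration coupling described above can be shown schematically. The two solver functions below are toy stand-in models invented for the sketch, not real STAR-CD or FE calls; the point is the shape of the loop, where fluid load and structural displacement are exchanged in memory until they agree within one step, rather than written to files between external iterations.

```python
# Schematic of tightly coupled fluid-structure interaction: inner
# iterations exchange data in memory until the two fields agree.

def fluid_load(displacement):
    # Toy model: pressure load falls off as the structure deflects.
    return 100.0 / (1.0 + displacement)

def structure_displacement(load):
    # Toy model: linear structural response to the fluid load.
    return load / 250.0

def coupled_step(tol=1e-8, max_inner=100):
    disp = 0.0
    for inner in range(1, max_inner + 1):
        load = fluid_load(disp)                   # fluid solve, current shape
        new_disp = structure_displacement(load)   # structural solve, new load
        if abs(new_disp - disp) < tol:            # inner iterations converged
            return new_disp, inner
        disp = new_disp
    return disp, max_inner

disp, iters = coupled_step()
```

In a file-based scheme each of those inner exchanges would be a full external iteration with results files mapped between two packages; doing it in memory inside one mesh and one database is what makes the coupling tight.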
This approach also gives structural analysts access to some of the improvements in meshing technology pioneered by CFD vendors. CD-adapco’s polyhedral meshing technology has the potential to overcome one of the largest difficulties for structural analysts: mesh quality. Structural solvers are more vulnerable to distorted elements than CFD solvers, something that has limited the widespread adoption of automatic (tetrahedral) meshing in stress analysis. CD-adapco’s polyhedral mesh technology absorbs low quality tetrahedral elements into much higher quality polyhedral elements (typically with 12 or 14 faces).
Recent work linking STAR-CD with MSC.Marc — one of the industry’s leading non-linear FE codes — involved coupling via MpCCI, a Mesh-based parallel Code Coupling Interface developed at the Fraunhofer Institute SCAI.
CD-adapco doesn’t expect this approach to displace FEA as the principal means of conducting stress analysis. However, early experience suggests that the method is very attractive for applications that require coupled heat transfer and stress analysis, fluid-structure interaction, and casting and solidification (where fluid cells become solid).
Describing the fluid and solid domains in a single model database reduces the amount of data transfer and enforces timestep compatibility.
The development of integrated mathematical models allows direct design optimisation by eliminating the external iterations needed to resolve the interacting physics, and enables designers to take multiple physical effects into consideration in a single analysis. This software is available now, although the recent announcement by Abaqus and Fluent is being overshadowed by the impending acquisition of Fluent by Ansys. And with MSC.Marc and STAR-CD working together, usable coupled simulation of fluid-structure interaction is also available.