Twins, Pyramids and Environments: Unifying Approaches to Virtual Testing
Presented By Louise Wright (National Physical Laboratory)
There is a long history of the use of engineering simulation for design. This virtual approach is typically followed by physical prototyping, testing and refinement to reach a final design, followed in many cases by a physical testing regime to meet regulatory requirements. For many companies, the use of simulation tools has reduced the time and cost associated with getting new products to market due to the ability to explore multiple designs, and has reduced resource usage and improved product quality by enabling exploration of aspects of manufacturability and long-term in-use performance.
Companies are increasingly seeking to gain similar benefits beyond the design stage. Some companies whose products are subject to extensive regulatory testing requirements are seeking to provide evidence of compliance through a combination of simulation and testing. It is common in such industry sectors to have a “testing pyramid”, where the safety of a complex multi-component product is demonstrated by carrying out tests of materials, components, assemblies and complete systems, with the number of tests carried out decreasing as the complexity of the object under test increases. A “smarter testing” approach would replace some of these physical tests with simulations and would feed information between the various tests to improve the validation of the simulation and the confidence in the evidence of safety.
Some products cannot be fully tested via physical testing alone because they cause a risk to human safety. An example of current relevance is autonomous vehicles. The artificial intelligence (AI) that controls an autonomous vehicle is often trained on data obtained from human-controlled journeys of a vehicle with the sensor suite in operation, so that the AI is shown what safe driving looks like under typical conditions. However, many of the situations most likely to lead to an accident are not encountered in typical driving conditions, and could cause risk to life if recreated deliberately. A simulation can potentially recreate high-risk scenarios safely for both training and testing purposes.
For some products, lifetime prediction and understanding of real-world performance can be significantly improved by linking models and data in a digital twin. This approach can lead to improved future design iterations and more effective maintenance plans as understanding of the product improves. It could also support personalisation of devices such as medical prostheses, where monitoring, adjustment and individualisation could significantly improve people's lives.
In all of these applications it is important to note that the company developing the simulations is not the only party that needs to trust the simulation results. That trust must be shared by regulators, end users, and in some cases the general public.
These three seemingly distinct themes of activity are strongly linked, not least because the same need underpins them: they must combine validated models and measured data to make trustworthy predictions of real-world behaviour. This need can be answered most efficiently by a combination of activities in several technical areas, including data quality assessment, software interoperability, semantic technologies, model validation, and uncertainty quantification. The technology readiness levels of these areas vary, as do the awareness and uptake of good practice in each technical area across sectors.
This paper discusses the common features of, and differences between, the fields of smart testing, virtual test environments, and digital twins. Starting from a consideration of commonality, we will highlight areas where existing methods and expertise could be better exploited, and identify areas where further research and development of tools would accelerate successful application of trustworthy digital assurance approaches in industry.
Digital Twins, Integration of Analysis & Test, Simulation Strategy, Simulation Supporting Certification
Authored By Louise Wright (National Physical Laboratory) Liam Wright (National Physical Laboratory) Kathryn Khatry (National Physical Laboratory) Joao Gregorio (National Physical Laboratory) Michael Chrubasik (National Physical Laboratory) Maurizio Bevilacqua (National Physical Laboratory)
Stress Concepts for Weld Verification and Approaches to Automation
Authored & Presented By Tim Kirchhoff (ihf Ingenieurgesellschaft mbH)
In mechanically stressed components, weld seams are usually the weak points critical to failure, especially under alternating loads. The necessary strength verification is carried out according to rules such as the FKM guideline, the IIW recommendations or the Eurocode.
The evaluation of weld seams is a challenge, even in modern simulation-driven product development. Due to the special properties of the weld seams (e.g. sharp and quite irregular notches), the stresses for the verification must be determined using one of the concepts developed for this purpose, for example as nominal, structural or notch stress.
With the structural stress concept, the seam is represented in a simplified manner in the FE model and the stress for the verification is extrapolated from the stresses on the surface before the weld toe. The Hot-Spot concept from the IIW recommendations is widely used for this purpose and is also referenced in other guidelines.
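As an illustrative aside (not part of the presentation), the fine-mesh surface extrapolation commonly used for the Hot-Spot concept reads stresses at two points ahead of the weld toe; the sketch below assumes the IIW linear extrapolation rule, with hypothetical stress values:

```python
def hot_spot_stress(sigma_04t: float, sigma_10t: float) -> float:
    """Linear surface extrapolation of the structural (hot-spot) stress,
    per the IIW fine-mesh rule: stresses are read on the surface at
    distances 0.4t and 1.0t ahead of the weld toe (t = plate thickness).
    """
    return 1.67 * sigma_04t - 0.67 * sigma_10t

# Hypothetical surface stresses (MPa) read from an FE result path:
sigma_hs = hot_spot_stress(sigma_04t=120.0, sigma_10t=100.0)  # ~133.4 MPa
```

The 1.67/-0.67 weights follow from extrapolating a straight line through the two read-out points back to the weld toe.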
With the notch stress concept, a fictitious notch radius is introduced in the simulation model at the weld toe and in the weld root. The stress for the verification can then be determined directly in the notch radius.
The notch stress concept therefore requires a comparatively high modeling effort but is also suitable for the verification of complex weld seam situations that cannot be evaluated using other methods.
The areas of application of the different stress concepts are presented and advantages or disadvantages of the individual concepts are discussed.
A further practical challenge in the verification of weld seams is the selection of the critical point, especially when the component is subject to multiple alternating loads. This can be alleviated by an automated calculation of all verification points along a weld seam.
For this purpose, approaches to automating the stress determination according to the structural or notch stress concept, implemented by ihf, are presented. It is shown how the stress components depending on the local weld seam direction can be determined according to the requirements of the different guidelines.
Computational Structural Mechanics, Fatigue, Welding
Full Scale Validation Testing for Legacy Aircraft Finite Element Models
Authored & Presented By David Wieland (Southwest Research Institute)
As the US military extends the service life of aging weapon systems, the need for accurate models of aircraft structures has become critical. The original finite element models (FEMs) used during the development of these legacy systems were either not procured by the military or not maintained, necessitating the development of new FEMs from 2D drawings and scanned parts. Given that many weapon systems were tested up to 60 years ago, the absence of original test data presents significant challenges for model validation. This has led the United States Air Force to perform full-scale tests for model validation.
This presentation delves into Southwest Research Institute's recent experiences performing full-scale validation tests for the T-38 and A-10 aircraft finite element models. The full-aircraft NASTRAN structural analysis model of the T-38 is being developed by Northrop Grumman. The A-10 structural analysis model is being developed by the United States Air Force A-10 analysis section with support from Southwest Research Institute. To perform these full-scale validation tests, a diverse array of measurement techniques was employed, including deflection potentiometers, strain gages, fiber optic strain sensors, and digital image correlation.
Depending on the component, the validation tests are often performed on structure that will be returned to active service. This necessitates extra caution to ensure the structure is not damaged, and can limit the options for attaching to and loading the structure. The presentation will discuss the specific test setups, the resulting data, the status of the validation effort, and key lessons learned from the validation process.
In addition to these validation efforts, SwRI will discuss how the validated T-38 finite element analysis model is being used to update the damage tolerance analysis of the -33 wing. This will include stress-to-load equation development, development of stress sequences, and damage tolerance analysis to determine crack growth progression and failure.
Aerospace, Integration of Analysis & Test, V&V (Verification and Validation)
Coupled Electrochemical-Mechanical Modelling of Li-ion Batteries
Authored & Presented By David Carlstedt (Volvo Car Corporation)
In the foreseeable future, lithium-ion (Li-ion) batteries will be the dominant solution for energy storage in electric vehicles (EVs). The performance of Li-ion batteries is known to be highly affected by their design and operating conditions. For example, the electrochemical and mechanical response is affected by geometric features, material constituents, cell chemistry, etc. Moreover, the electrical performance (or battery functionality) is known to be influenced by mechanical loads acting on the battery cell. Hence, multiple physical phenomena need to be considered to accurately predict the combined properties of batteries during service. Given their complex geometric and microstructural configuration, estimating the combined electrochemical and mechanical response of Li-ion batteries is a non-trivial task involving several length and time scales.
To predict the different physical processes in Li-ion batteries, different modelling schemes are typically used. For example, Equivalent Circuit Models (ECMs) or physics-based (Newman-type) models can be used to predict the electrochemical performance (e.g., voltage and current) of the battery cell. On the other hand, to evaluate the mechanical response, three-dimensional Finite Element (FE) based models are often used. To date, different methodologies for coupling the phenomena exist, based on various assumptions linked to the underlying physics. In this presentation, the focus will be on modelling strategies to simulate the coupled electrochemical and mechanical processes in Li-ion batteries for EV applications during operation. Tools and strategies for coupling FE-based mechanical models with battery models will be discussed in the light of EV applications and relevant commercial software platforms. Further, coupling effects (between physical phenomena) and their potential influence on the durability and longevity of the batteries will be discussed for various loading scenarios relevant to the EV application. Finally, an overview of some of the latest mechanical-battery simulations at Volvo Cars, supporting the company to become a leading electric car original equipment manufacturer (OEM), will be showcased.
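As a minimal sketch of the Equivalent Circuit Model family mentioned above, a first-order Thevenin ECM computes terminal voltage from an open-circuit voltage, an ohmic resistance and one RC branch (all parameter values here are illustrative assumptions, not Volvo data):

```python
def ecm_voltage(ocv, current, r0, r1, c1, dt, n_steps):
    """Terminal voltage of a first-order Thevenin equivalent circuit
    model (ECM): V = OCV - I*R0 - V_rc, with the RC branch integrated by
    explicit Euler: dV_rc/dt = -V_rc/(R1*C1) + I/C1. Discharge current
    is taken as positive."""
    v_rc = 0.0
    v = ocv - current * r0
    for _ in range(n_steps):
        v_rc += dt * (-v_rc / (r1 * c1) + current / c1)
        v = ocv - current * r0 - v_rc
    return v

# 10 s of a 2 A discharge with assumed cell parameters:
v_term = ecm_voltage(ocv=3.7, current=2.0, r0=0.01, r1=0.02, c1=2000.0,
                     dt=0.1, n_steps=100)
```

Physics-based (Newman-type) or FE-coupled models replace this lumped description with spatially resolved electrochemistry and mechanics.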
Automotive, Batteries, Electric Vehicles, Multiphysics
From The Paris Law To The ‘Total-Life’ Method: An Extensive Review Of Fatigue Crack Growth Laws And Models
Presented By Andrew Halfpenny (Hottinger Bruel & Kjaer (HBK))
Fatigue is the predominant cause of structural failure under cyclic loading conditions. Fatigue failure typically involves two main stages: an initial phase where one or more cracks form (crack initiation stage), followed by a phase where these cracks, if subject to sufficiently high cyclic stress, grow until failure (crack propagation stage). The relative duration of these stages varies based on factors such as material properties, structural design, and application. Furthermore, in some cases, a crack may extend into a low-stress region, halting its progression and preventing failure. In such scenarios, the crack may be considered acceptable in-service, as it does not compromise the component's durability (damage tolerance approach).
The term ‘fatigue crack growth’ refers to the propagation (or arrest) of cracks under cyclic loading. Since the 1950s, extensive research has focused on understanding and characterizing crack propagation under cyclic loading. This includes defining threshold, propagation, and fast fracture regions from both experimental and numerical perspectives, as well as accounting for mean stress effects and crack retardation. Unfortunately, this research is dispersed across numerous scientific publications. Furthermore, common simulation methods often focus on either the initiation or propagation stage, which can lead to inaccurate fatigue life predictions when both stages are significant. This issue is particularly relevant for welded structures, lightweight jointed structures, and lightweight cast components, which are increasingly important for more environmentally sustainable transportation solutions.
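For reference (a textbook statement, not drawn from the paper), the Paris law named in the title expresses the stable Region II crack growth rate as a power law in the stress intensity factor range:

```latex
\frac{da}{dN} = C \,(\Delta K)^{m}, \qquad \Delta K = K_{\max} - K_{\min}
```

where $C$ and $m$ are empirical material constants; the threshold ($\Delta K_{th}$) and fast-fracture regions discussed above bound the regime in which this relation holds.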
The aim of this work is to enable a more efficient review and comparison of available crack growth analysis tools to support informed decision-making, by collecting the most relevant fatigue crack growth laws and models into a single document. Additionally, this work introduces a unified fatigue life estimation approach, called the “Total-Life” method, that integrates both the initiation and propagation stages, by combining principles from strain-life and fracture mechanics, and a state-of-the-art multiaxial crack-tip plasticity model to account for mean-stress and overload retardation effects.
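For reference, the strain-life ingredient that a combined initiation-plus-propagation method draws on is conventionally the Coffin-Manson-Basquin relation (a standard result, stated here for orientation rather than taken from the paper):

```latex
\frac{\Delta \varepsilon}{2} = \frac{\sigma_f'}{E}\,(2N_f)^{b} + \varepsilon_f'\,(2N_f)^{c}
```

where the elastic (Basquin) term dominates in high-cycle fatigue and the plastic (Coffin-Manson) term in low-cycle fatigue.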
Computational Structural Mechanics, Fatigue, Fracture
Authored By Andrew Halfpenny (Hottinger Bruel & Kjaer (HBK)) Cristian Bagni (Hottinger Bruel & Kjaer (HBK)) Stephan Vervoort (Hottinger Bruel & Kjaer (HBK)) Amaury Chabod (Hottinger Bruel & Kjaer (HBK))
Generative-AI for Preliminary Engineering Design
Presented By Shiva Babu (Rolls-Royce plc)
Many engineering solutions require technologies that rely on specialised know-how and knowledge of the physics mechanisms underpinning their design and operation. As the world moves towards a digital era, current surrogate model approaches are either not fit for processing large databases, or unsuitable for dealing directly with data typically derived from computer-based analyses, such as geometry representations and field quantities (e.g., stress, displacements, temperature). At the same time there is a need for enhanced design space exploration capabilities that overcome the limitations of parametric models, enabling the assessment of innovative design concepts through more free-form geometry modelling approaches. GANs have proven effective at generating hyper-realistic images when trained on many different (but similar) data. The literature suggests that conditional Generative Adversarial Networks (cGANs) can provide a valuable means to support engineering design by accurately predicting the results of computationally expensive simulations through the encoding of design information into 2D images [1][2]. However, further work is needed to identify and address some of the roadblocks hindering wider application of this technology. This paper presents the investigation conducted to understand and address some of the restrictions identified in applying cGAN models to different preliminary engineering design use cases. The models in this study were assessed on engineering and non-engineering data while monitoring their sensitivity to architectural and parametric changes.
This deep dive helped gain a better understanding of the applications where such an approach can and cannot be used. Limitations were also identified in the pre-processing of training data, motivating a more detailed evaluation of data encoding and highlighting the need for further developments in this area. Furthermore, the portability of such an approach allows unlocking the benefits of deployment into a cloud environment in terms of efficiency and cost-effectiveness, whilst complying with data classification constraints.
Engineering Data Science
Authored By Shiva Babu (Rolls-Royce plc) Marco Nunez (Rolls-Royce plc) Yashwant Gurbani (Rolls-Royce plc) Harry Bell (Rolls-Royce plc) Nima Ameri (Rolls-Royce plc) Jon Gregory (Rolls-Royce plc)
Vibro-acoustic Simulation of Impulsive Feedback from Computer Mice Microswitches
Presented By Luca Francesconi (Logitech Europe SA)
Simulating acoustic emissions from impulsive and transient dynamic phenomena in small electro-mechanical components, such as those commonly found in consumer electronics, remains both novel and challenging. Some of these components act as direct Human-Machine Interfaces (HMIs) between users and the devices being operated. One such application is the microswitch embedded in computer mice. Besides the functional operation of the device, microswitches also double as the primary source of both tactile and acoustic feedback to the user upon clicking the mouse keys. The afforded feedback is a complex array of multimodal sensorial cues and includes fast transient events such as impulsive phenomena. From the component level to integration at the system architecture, both the acoustic emissions and the mechanical behavior of such components constitute the main source of User Experience (UX) for mouse clicking. Predicting the vibro-acoustic performance of microswitches and their integration in the product, in this case a computer mouse, through simulation can enable design practices that deliver better experiences, including addressing potential sound quality issues at earlier stages of product development.
Recent advancements in simulation software and improved computational resources open up the possibility of modeling increasingly complex vibro-acoustic phenomena. The goal of this research is to understand the current capabilities of simulation software to accurately predict such phenomena. This work aimed to model the full vibro-acoustic response of a microswitch at the component level, covering the full operational cycle, namely the closure (push) and opening (release) switch events.
This paper reports the simulation methodology adopted, from the structural and transient analysis to the acoustic radiation emerging from the component. Structural simulations involved driving pre-stressed Finite Element (FE) models with adequate, experimentally known input forces. A time-domain explicit FE simulation modeled the rapid displacement and buckling of the internal components over the full operational cycle. This model was experimentally validated with high-speed footage and position tracking of the moving switch elements under real operating conditions. The analysis further explores the model's vibration response across a range of frequencies meaningful to human hearing. Driven by these structural vibrations, sound is generated by the rapid displacement of the fluid (air) surrounding the structure. The acoustic propagation is thus simulated by modeling both the internal cavities of the switch and the air volume surrounding its casing. To make more efficient use of computational resources, a hybrid mesh was adopted, combining FE and Boundary Element Methods (BEM). Experimental audio recordings of switch samples' emissions were also used to compare and validate the model. It was found that fine-tuning simulation parameters such as damping and material properties is essential to accurately reflect the physical behavior. This includes sound quality metrics in both time and frequency domains as well as auralizations. The output of the simulation can match both the spectral and time-domain characteristics of real audio within a standard measurement error.
This study found the adopted simulation methods and their results to be valid, and proposes a methodology for simulating complex vibro-acoustic phenomena in similar and other applications. Overall, this paper also provides a state-of-the-art perspective on the vibro-acoustic simulation capabilities currently available to academia and industry.
Acoustics, Dynamics & Vibration, NVH (Noise Vibration and Harshness)
Authored By Luca Francesconi (Logitech Europe SA) Nuno Valverde (Logitech Europe SA) Sterling McBride (Dassault Systemes)
Contribution of Virtual Validation in the Development of a 48V Electric Powertrain for 2-wheeler Applications
Presented By Riccardo Testi (Piaggio & C. SpA)
A CAE workflow was defined within a new Piaggio development methodology and executed to develop and assess the new Piaggio 48V electric powertrain’s performance. The workflow involved linked CFD, EMAG and structural analyses.
The objective was to anticipate fundamental results and information, and to reduce the economic effort associated with physical prototyping activities.
Diverse CAE suites were coupled, optimized using Piaggio's procedures, which were consolidated over the years for the development of 2-wheelers equipped with internal combustion engines. The modular structure of those suites made it easier to incorporate the new EMAG analyses into the workflow.
The MBS system simulation activities were carried out by integrating the new e-powertrain models into a Piaggio database, which includes libraries of subsystems such as transmissions, test benches, etc. This approach will allow quicker generation of future models, leveraging the carry-over made possible by the modular nature of such a database.
The whole CAE workflow relied on a common source of truth residing in Piaggio's PLM system, allowing smooth cooperation between Piaggio's E-Mobility and Powertrain departments.
EMAG simulations were carried out to assess the electric machine's performance and to provide input data for the subsequent CFD, MBS and structural analyses.
The dynamic behavior, from a mechanical standpoint, was analyzed with multibody models, which produced KPIs’ values and provided input data for stress analyses.
CFD analyses were used to verify that the operating temperatures were compatible with the electric machine's requirements, and provided thermal maps for the FEM stress analyses.
The structural integrity of the whole e-powertrain system was verified with combined stress and durability analyses, based on the working conditions identified during the previous EMAG, MBS and CFD campaigns.
The structural FEM analyses were also used on their own, without coupling to durability tools, to investigate functional aspects of the mechanical system.
Because the CAE campaign was carried out in the project's early stages, it reduced the number of physical tests and assisted the sourcing activities managed by the Purchasing department.
Automotive, CAE in the design process, Computational Electromagnetics, Computational Fluid Dynamics, Multibody Dynamics
Authored By Riccardo Testi (Piaggio & C. SpA) Michele Caggiano (Piaggio & C. SpA) Antonio Fricasse (Piaggio & C. SpA)
Cloud-Enabled Generative AI for Preliminary Engineering Design
Presented By Nima Ameri (Rolls-Royce plc)
Presented in this paper is the approach taken to democratise the adoption of conditional Generative Adversarial Networks (cGANs) on the cloud for preliminary engineering design applications. This work addresses a number of challenges associated with applying cGAN networks to engineering applications and explores them through the illustration of various use cases. In contrast to other applications, the accuracy of synthetic data plays a crucial role in an engineering context; this emphasis on accuracy puts increased attention on the correct execution of each step involved in training such models, such as data preparation and architecture configuration. Furthermore, a range of additional non-technical considerations highlight the cloud as the best-suited solution for accessing greater, scalable computational resources as well as specialised COTS technologies. To address this, Rolls-Royce has partnered with Databricks, leveraging its Data Intelligence Platform from the Rolls-Royce Data Science Environment (DSE) hosted on Microsoft Azure. Rolls-Royce's DSE is a highly integrated platform of world-leading tools and technology which enables users to develop and deploy analytics, data science and machine learning in a secure and scalable manner within the company's strategic digital environment, while allowing access to third parties, including academic partners and suppliers, in a safe and controlled manner. The adoption of cloud technologies was aimed at achieving a significant reduction in runtime, with a target factor of 30 compared to the equivalent on-prem run.
Furthermore, an additional goal was the development of a generalised framework for identifying the optimal network architecture and hyperparameters for a given use case. This work will demonstrate a solution to this goal by leveraging and combining a number of technologies, including the Ray package for hyperparameter and architecture optimisation, and MLflow for managing the GAN model lifecycle and experiment tracking. Particular attention was also given to the management and governance of the engineering data, which comprised a combination of images, tabular data and metadata produced by dedicated physics-based engineering software. To this end, the data was imported and converted into the industry-standard Delta format, which is optimised for cloud and distributed computation. Finally, the data governance framework was provided by Databricks's Unity Catalog, which establishes a crucial framework for compliance-centric industries such as aerospace.
The effectiveness of the approach is demonstrated with engineering use cases of growing complexity.
Aerospace, CAE in the design process, Cloud Computing, Democratisation, Engineering Data Science
Authored By Nima Ameri (Rolls-Royce plc) Shiva Babu (Rolls-Royce PLC) Marco Nunez (Rolls-Royce PLC) Yashwant Gurbani (Rolls-Royce PLC)
Study on CAE Techniques for Deriving Single Component Durability Test Specification of Automotive Suspension Component
Presented By Kyung hoon Jung (HYUNDAI MOTORS COMPANY)
The automotive industry is experiencing rapid transformation, leading to increased demand for virtual development and enhanced component-level durability testing. While traditional full-vehicle Belgian road tests have been effective for overall durability verification, they present significant challenges, including high costs, lengthy preparation times, and difficulties in conducting urgent verification tests for improved components. Additionally, the current collaboration between CAE and testing teams, though valuable, often relies on qualitative judgements and requires extensive labor to identify durability failure modes.
This paper introduces an innovative methodology for efficiently deriving durability test conditions for vulnerable components identified during full-vehicle durability tests. The proposed approach consists of four key stages: (1) identification of main load points through Miner’s rule-based damage analysis, (2) measurement of virtual strain under full-vehicle conditions, (3) component-level test load histories derivation through Load Reconstruction, and (4) load normalization using Potential Damage Intensity (PDI) based on idealized S-N curves and equivalent load analysis.
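Stage (1) rests on Miner's rule of linear damage accumulation; a minimal sketch (illustrative only; function and variable names are assumptions, not from the paper):

```python
def miner_damage(cycle_counts, cycles_to_failure):
    """Palmgren-Miner linear damage sum: D = sum(n_i / N_i), where n_i is
    the number of applied cycles in bin i (e.g., from a rainflow count)
    and N_i is the S-N life at that bin's stress range. Failure is
    conventionally predicted when D >= 1."""
    return sum(n / N for n, N in zip(cycle_counts, cycles_to_failure))

# Hypothetical three-bin spectrum:
D = miner_damage([1e4, 5e3, 1e3], [1e6, 2e5, 2e4])  # 0.01 + 0.025 + 0.05 = 0.085
```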
A significant innovation in this methodology is the application of a linear inverse matrix relating the virtual strains of the full-vehicle model to those of the jig test analysis model. Unlike previous studies that focused on identical boundary conditions, our approach addresses the challenge of differing boundary conditions between full-vehicle and component-level testing.
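The strain-to-load inversion described above can be sketched as a least-squares reconstruction via the pseudoinverse (matrix shapes and values below are illustrative assumptions):

```python
import numpy as np

# Sensitivity matrix A (gauges x load points): virtual strain per unit
# load, e.g. from unit-load FE runs of the jig test model (values assumed).
A = np.array([[2.0e-6, 1.0e-6],
              [0.5e-6, 3.0e-6],
              [1.5e-6, 0.8e-6]])

def reconstruct_loads(A, strains):
    """Least-squares load reconstruction: solve strains ~= A @ loads
    using the Moore-Penrose pseudoinverse."""
    return np.linalg.pinv(A) @ strains

true_loads = np.array([1000.0, 500.0])
strains = A @ true_loads                 # simulated gauge readings
loads = reconstruct_loads(A, strains)    # recovers approximately [1000, 500]
```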
The methodology incorporates Load Reconstruction analysis with Load Transducer functionality to minimize variance in complex geometries. The process concludes with the extraction of main load vector components using PDI analysis, enabling the conversion of multiple-axis random loads into practical uniaxial test conditions. To ensure reliability, we implement a verification process comparing damage hotspot orders between full-vehicle and component-level conditions.
For practical implementation, we utilize the S-N method and relative damage concepts to calculate equivalent sinusoidal loads, with scale factors determined based on material fatigue properties. The target testing regime encompasses approximately 200,000 cycles, optimized for actual test actuator specifications. This comprehensive approach significantly enhances the efficiency and accuracy of component-level durability testing while reducing development costs and time compared to traditional methods.
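The equivalent-sinusoidal-load step can be illustrated with the standard relative-damage relation for a Basquin-type S-N curve, N ∝ S⁻ᵐ (a sketch with invented numbers; the actual scale factors in the paper depend on the measured material fatigue properties):

```python
def equivalent_amplitude(amplitudes, counts, n_target, m):
    """Constant-amplitude load that produces, in n_target cycles, the same
    S-N (slope m) damage as a variable-amplitude history:
        S_eq = (sum_i n_i * S_i**m / n_target) ** (1/m)
    """
    damage_sum = sum(n * s ** m for s, n in zip(amplitudes, counts))
    return (damage_sum / n_target) ** (1.0 / m)

# Collapse a rainflow-counted history onto ~200,000 sinusoidal test cycles
s_eq = equivalent_amplitude(amplitudes=[1.0, 2.0, 4.0],
                            counts=[500_000, 60_000, 1_000],
                            n_target=200_000, m=5.0)
```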
The methodology has been successfully validated through testing on multiple chassis systems, demonstrating its effectiveness in reproducing critical durability characteristics while maintaining the accuracy of the full vehicle testing approach. This innovative process represents a significant advancement in automotive durability testing, offering a more systematic and efficient approach to component-level verification.
Automotive, Failure, Fatigue, Integration of Analysis & Test, Simulation Supporting Certification
Authored By Kyung hoon Jung (Hyundai Motors Company), Hoo Gwang Lee (Hyundai Motors Company), Hong Ju Park (Hyundai Motors Company)
|
The Digital Twin of ESA's Large Space Simulator
Authored & Presented By Remko Moeys (European Space Agency)
This paper presents the digital twin of the Large Space Simulator (LSS) as it undergoes the final stage of its development. Located in The Netherlands, the LSS is Europe’s largest thermal vacuum chamber and is used by the European Space Agency to test spacecraft under representative space conditions: vacuum, cryogenic temperatures and powerful, dynamic solar illumination.
The purposes of this digital twin are to simulate:
1. Future test campaigns (standard or specific ones) to support training in operating the LSS facility
2. The performance of the facility with future hardware or software modifications and carry out software/hardware-in-the-loop pre-tests
3. Abnormal facility operation with failed equipment
The digital twin consists of three layers:
1. a high-fidelity EcosimPro model of the LSS to simulate its physical performance
2. a virtual version of the LSS Programmable Logic Controller to execute the process control
3. a Human-Machine Interface identical to that of the LSS for the user to interact with
A co-simulation manager ensures the exchange of information between the above three layers and enables digital-twin-specific functionalities such as adjusting the simulation speed, uploading/starting/stopping/saving a simulation run, loading pre-defined failure scenarios, and virtually carrying out the key procedure steps that are performed manually in the field.
To maximise the representativeness of the real facility operation, the digital twin is designed to be operable from the same monitors as those in the LSS control room and to display the simulation results using the same data acquisition and presentation software used by the LSS: STAMP (System for Thermal Analysis, Measurement, and Power supply control), developed by Therma. The digital twin is conceived to be operable by a trainee and an instructor simultaneously.
The project was kicked off in November 2023, underwent detailed validation review of the models against test data in October 2024, and is expected to be completed by mid-2025. The prime contractor of this project is Empresarios Agrupados – GHESA, who is also the owner of the modelling software used (EcosimPro).
Digital Twins
|
Modelling Aero-Optical Turbulent Effects On The European Solar Telescope Using CFD Analysis
Presented By Mahy Soler (Instituto de Astrofísica de Canarias, Principia Ingenieros Consultores SA)
The European Solar Telescope (EST) is a next generation 4-m class solar telescope that will be built at the Observatorio del Roque de los Muchachos (ORM). The performance of an optical telescope is evaluated by its seeing, which refers to the image degradation caused by turbulent fluctuations in the air's refractive index as light travels through the optical path. This phenomenon arises from various sources, including atmospheric turbulence and environmental and local effects.
Atmospheric turbulence is largely determined by the site location, and the ORM is renowned for its excellent atmospheric conditions for astronomical observation. The EST design aims to minimize local environmental turbulence effects, caused primarily by the thermal ground layer. This is achieved by placing the optical elements as high above the ground as possible and using an open-air configuration that promotes natural ventilation.
The local fluctuations in the air refractive index in the surroundings of the telescope are produced by a combination of thermal and mechanical turbulence that depends on the size and shape of the design. To evaluate the local effects, detailed Finite Element thermal and Computational Fluid Dynamics (CFD) models were developed. These models accounted for the topography, telescope structure, pier, enclosure and nearby telescopes within the observatory. A transient thermal analysis calculates surface temperatures, which are subsequently used by the CFD model to compute the air temperature distribution and its refractive index.
A series of transient CFD analyses is conducted to analyze the impact of environmental conditions, including wind speed, wind direction and telescope orientation, on different design alternatives. These simulations provide further insight into the spatial distributions of air temperature and refractive index fluctuations inside the optical path. The results are postprocessed to derive aero-optical metrics, allowing the telescope's performance to be estimated.
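The chain from temperature to refractive index that such postprocessing relies on is commonly taken from the Gladstone-Dale relation; a minimal sketch (the constant, pressure and site values are rough textbook figures, not from the EST study):

```python
GLADSTONE_DALE = 2.23e-4  # m^3/kg, air at visible wavelengths (approximate)
R_AIR = 287.05            # J/(kg K), specific gas constant of dry air

def refractive_index(temperature_k, pressure_pa=78_000.0):
    """Air refractive index via Gladstone-Dale, n = 1 + K * rho, with density
    from the ideal-gas law. The default pressure roughly corresponds to the
    ~2,400 m altitude of the ORM site (illustrative value only)."""
    rho = pressure_pa / (R_AIR * temperature_k)
    return 1.0 + GLADSTONE_DALE * rho

# A 1 K local fluctuation shifts n only around the seventh decimal place, yet it
# is exactly these tiny refractive-index fluctuations that degrade the seeing
dn = refractive_index(288.0) - refractive_index(289.0)
```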
The study highlights how design choices influence aero-optical turbulence and provides feedback for optimizing the EST’s design. The results contribute to the telescope’s error budget by quantifying local turbulence effects and ensuring that the aerodynamic design supports its optical performance goals. Hide Full Abstract
Computational Fluid Dynamics, Optical
Authored By Mahy Soler (Instituto de Astrofísica de Canarias, Principia Ingenieros Consultores SA), Konstantinos Vogiatzis (Instituto de Astrofísica de Canarias), Juan Cezar-Castellano (Instituto de Astrofísica de Canarias), Sergio Bonaque-González (Departamento de Física, Universidad de La Laguna (ULL)), Marta Belio-Asen (Instituto de Astrofísica de Canarias), Miguel Núñez (Instituto de Astrofísica de Canarias), Mary Barreto (Instituto de Astrofísica de Canarias)
|
Efficient Joining Failure Assessment of Multi-material Car Bodies in Crash
Presented By Tony Porsch (Volkswagen AG)
Predicting structural failure in automotive engineering remains a significant challenge in the area of virtual vehicle development, gaining further importance in the context of "Virtual Certification." The increasing use of modern lightweight materials, ultra-high-strength steels, and innovative joining techniques contributes to heightened material diversity and complexity in vehicle bodies. Traditional resistance spot welds are now complemented by the growing use of self-piercing rivets, line welds, and flow drill screws, among other techniques. The failure of these connections is of particular concern in crash scenarios, as it significantly impacts vehicle safety. Therefore, robust and industry-applicable computational methods are essential for dealing with the complexity of vehicle structures and delivering reliable predictive results.
This presentation introduces the L2-Tool, a modular failure assessment framework developed at the Virtual Vehicle Research Center in a joint research project with Volkswagen AG and Audi AG. The key element of this framework is the assessment of failure with special surrogate models, which guarantee high prediction quality at low additional computing time. High-strength lightweight materials in particular carry an increased risk of crack initiation under in-plane tensile load, for example due to the heat input in the welding process or due to the notch effect at rivets and flow drill screws. A key element of this method is that these two types of failure can be distinguished and assessed using a non-local approach. For the parameterization of the failure models, a combination of real and virtual testing with detailed, small-scale specimens is used, which will be briefly outlined in the presentation. After the development phase, the failure models are integrated into the product development process in a multi-stage integration process, starting with implementation via user interfaces, followed by a comprehensive test phase and final industrialization by the crash solver provider.
In the conclusion of the presentation, illustrative results of the L2-Tool applied to vehicle substructures are presented. The framework within the standardized calculation process is also described, with emphasis on the pre- and post-processing phases. The predictive accuracy of the method is addressed, and finally, potential applications are shown.
Automotive, Failure, Low Code, Meshing, Simulation Supporting Certification
Authored By Tony Porsch (Volkswagen AG), Karl Heinz Kunter (Virtual Vehicle Research GmbH), Jean-Daniel Martinez (Audi AG)
|
Nonlinear Cohesive Zone Modeling for Adhesives
Presented By Tobias Waffenschmidt (3M Deutschland GmbH)
In many engineering applications, the integrity of adhesive bonds must be ensured over the service life when exposed to mechanical stresses. In order to assess the structural integrity of adhesive bonds numerically, there is an increasing need to efficiently model and simulate the strength, damage and failure behavior of adhesives. This includes i) structural adhesives (e.g. curable epoxy-, acrylate-, or polyurethane-based adhesives, which exhibit thermosetting behavior) but also ii) pressure-sensitive adhesives (adhesive tapes), which behave in a more elastomer-like manner. Pressure-sensitive adhesives, in particular, typically exhibit a highly nonlinear elastic-viscoelastic material behavior, including strains at failure of up to 500% or more. This makes a numerical treatment using conventional continuum finite elements difficult if not completely infeasible. One approach to circumvent these deficiencies is cohesive zone modeling. Cohesive zone models use constitutive traction-separation laws, which make it straightforward to incorporate damage and failure mechanisms for adhesives and do not produce the mesh-dependent results typical of continuum-based techniques. In particular, incorporating the strongly nonlinear and rate-dependent response is challenging, because the conventional bilinear traction-separation laws available in essentially all commercial finite element software packages are not sufficient to model such complex material behavior.
On the other hand, self-implemented user subroutines, which may be used as an alternative, are mostly not feasible in an industrial environment due to the high implementation effort, inferior robustness and higher computational cost, which largely prohibits straightforward use for large-scale simulation problems.
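For reference, the conventional bilinear traction-separation law that the authors argue is insufficient for such strongly nonlinear, rate-dependent adhesives fits in a few lines (parameter values below are illustrative):

```python
def bilinear_traction(delta, delta_0, delta_f, k0):
    """Bilinear traction-separation law: linear loading with stiffness k0 up to
    damage onset at separation delta_0, then linear softening to zero traction
    at complete failure, delta_f."""
    if delta <= delta_0:          # undamaged branch
        return k0 * delta
    if delta >= delta_f:          # fully failed, no load transfer
        return 0.0
    # scalar damage variable: 0 at delta_0, growing to 1 at delta_f
    d = (delta_f / delta) * (delta - delta_0) / (delta_f - delta_0)
    return (1.0 - d) * k0 * delta

# Peak traction sits at damage onset: t_max = k0 * delta_0
t_peak = bilinear_traction(0.01, delta_0=0.01, delta_f=0.1, k0=1000.0)
```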
This presentation gives an overview of accurate yet efficient nonlinear cohesive zone modeling techniques suitable for modeling damage and failure for i) structural adhesives and ii) pressure-sensitive adhesives (adhesive tapes) without the need for user subroutines. Suitable testing and characterization methods for both adhesive categories will be presented and compared. Material model calibration and parameter identification techniques based on these tests will be discussed for rate-independent and rate-dependent use cases. Verification and validation test cases will be discussed to underline the applicability of these models. Finally, a variety of application cases ranging from quasi-static to impact scenarios will be presented.
Joints & Connections, Material Characterisation, Materials, V&V (Verification and Validation)
Authored By Tobias Waffenschmidt (3M Deutschland GmbH), Markus von Hoegen (3M Deutschland GmbH)
|
Next Step Towards the Complete Virtualization of Machine Tools: Flexible Multibody Dynamics Simulation and Control in Synergy
Presented By Miguel Seco Calleja (Fundación Tecnalia Research & Innovation)
A modern machine tool is a highly complex mechatronic system, with increasingly challenging dynamic behaviour and precision requirements, managing trade-offs of multiple conflicting conditions and design criteria. At the same time, the principle of getting it right the first time has become essential, aiming to reduce the number of real prototypes, especially when production involves small series to provide customized solutions to clients.
The systematic integration of different virtual studies and simulations into the design process has already been contributing to progress towards this goal. However, even the more advanced FEM analysis, based on modal extraction and static and dynamic stiffness at some discrete positions of the machine, is insufficient to analyse the real behaviour of a machine in motion throughout the entire workspace. The new approach, by contrast, enables more accurate modelling of the damping associated with friction in moving guides and drives and of the rotating inertia of motors and actuators, and it considers the mutual interaction between the dynamics of the machine (hardware) and the digital control loops (software) that actually govern the mechatronic system.
This new approach to machine tool simulation is based on integrating the capabilities of traditional FEM analysis, multibody simulation, and digital control loops to estimate the machine's dynamic capabilities, thus enabling analysis of how different design changes contribute to the final behaviour, not only in terms of natural frequencies or dynamic stiffness, but in terms of real productivity.
This article presents the results of the development of a virtual machine that includes the consideration of the complete dynamics of the system and comprises multibody simulation with flexible components and the usual position and speed control loops. This simulation tool successfully reproduces the dynamic behaviour of a 3-axis machining machine and can predict the transient behaviour and dynamic response to any setpoint or trajectory.
The different structural and non-structural components are assembled separately in the global model, so any of them can be easily replaced to analyse the effectiveness of design changes. Special emphasis has been placed on reproducing the behaviour of linear guide systems, bearings, and drives of different types: pulley and belt, nut and screw. To reduce computational demands, each component is reduced to modal coordinates up to a limited number of modes, and mode shifting techniques are used above a cutoff frequency, so that deformation characteristics are maintained while the solver integrator can use large time steps.
Validation is carried out based on frequency responses measured on the real machine, resulting in a simulation tool that allows evaluating how design changes in structural components, drives, and control algorithms impact the productivity and dynamic behaviour of the machine.
Dynamics & Vibration, Manufacturing Process Simulation, Multibody Dynamics
Authored By Miguel Seco Calleja (Fundación Tecnalia Research & Innovation), Ibone Oleaga (Fundación Tecnalia Research & Innovation), Juan José Zulaika (Fundación Tecnalia Research & Innovation), Josu Larrañaga (Fundación Tecnalia Research & Innovation)
|
The Path to Virtual Product V&V: Uncertainty Quantification of Test and Simulation Results
Authored & Presented By Frank Günther (Knorr-Bremse SfS)
Traditional Computer Aided Engineering emphasizes the use of simulation as a preparatory activity before verifying and validating a product through hardware testing. The main benefit of simulation is to speed up product development in a “first time right” paradigm where a hardware-driven product V&V phase is expected to confirm what is already known through computer simulation.
In many industries, for example Railway and Automotive, this hardware-driven product V&V phase constitutes a major share of the overall product development effort, as safety requirements are very high and other phases of product development have been streamlined and optimized using simulation.
In other industries, for example Aerospace and Nuclear, cost-prohibitive and, in some cases, impractical hardware testing has already led to a large share of well-established virtual product V&V procedures.
More and more, the Automotive and Railway industries desire to establish strategies for Virtual Product V&V as well. The task is to define virtual V&V processes that provide at least the same level of assurance and certainty as the established hardware driven processes.
For this, it is necessary to quantify the uncertainty of simulation results and compare it to the acceptable, but usually unknown, uncertainty of established hardware based V&V procedures. Perhaps quantifying the certainty or assurance of a V&V procedure would be more to the point, but we use the established term “Uncertainty Quantification (UQ)”.
We will present several application examples that adhere to the following pattern:
1) Quantify the uncertainty of an established, hardware-based V&V process
2) Validate the simulation model
3) Quantify the uncertainty of the validated simulation model
4) Propose a virtual product V&V process with equivalent uncertainty
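As a toy illustration of step 4, the acceptance criterion can be phrased as "combined simulation uncertainty no larger than the hardware-process uncertainty" (names and numbers below are ours, not from the presentation):

```python
def virtual_vv_acceptable(test_std, model_form_std, numerical_std):
    """Accept a virtual V&V process only if the combined simulation uncertainty
    does not exceed that of the established hardware-based process.
    Standard uncertainties are combined in quadrature, assuming independence."""
    sim_std = (model_form_std ** 2 + numerical_std ** 2) ** 0.5
    return sim_std <= test_std

# Hardware scatter of 5% versus a validated model with 3% model-form and
# 2% numerical uncertainty: sqrt(0.03**2 + 0.02**2) ~= 0.036 <= 0.05
ok = virtual_vv_acceptable(test_std=0.05, model_form_std=0.03, numerical_std=0.02)
```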
It is important to note that, due to the need to validate the simulation models, hardware testing still plays an important role in product V&V. However, the use of simulation enables more efficient and flexible use of hardware testing, resulting in faster, more efficient product V&V.
While we acknowledge that there is still a long way to go before simulation is fully leveraged in product V&V, we hope to provide some useful ideas and guidance to those who wish to establish a strategy for Virtual Product V&V in their field.
Stochastics, Uncertainty Quantification, V&V (Verification and Validation)
|
High Voltage Circuit Breaker Design with Multi-Objective Optimization Algorithms
Presented By Wilhelm Thunberg (Hitachi Energy)
This paper addresses the application of modern optimization software in the dielectric design development of high voltage circuit breakers (HVCB) and shows how coupling of different simulation types can create more efficient workflows. Given the push to replace the circuit breaker insulation gas SF6 with more eco-efficient solutions, there is a need for high-pace development and innovation. This necessitates new methods for HVCB development that enable the rapid finding of an optimal design given a large set of parameters and competing objectives. This multi-objective nature can be related to the varying conditions the HVCB must handle, or to different physical properties, such as mechanical and dielectric. A common challenge is to balance the different objectives and to understand all the inherent trade-offs in the design.
The main purpose of this paper is to show a dielectric simulation optimization using the MOGA-II algorithm and compare the workflow to more traditional ones, such as a full factorial search. The comparison criteria include the time required to achieve the optimal design, the dielectric robustness of the “best” found design, and the ability to effectively evaluate the compromise between competing objectives.
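The "compromise between competing objectives" is usually read off the non-dominated (Pareto) set, which both a full factorial sweep and a MOGA-II-style optimizer ultimately approximate; a minimal dominance filter, with illustrative objective pairs:

```python
def pareto_front(designs):
    """Return the non-dominated designs for minimization objectives: a design
    is kept unless some other design is at least as good in every objective
    and strictly better in at least one."""
    def dominates(a, b):
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))
    return [d for d in designs
            if not any(dominates(o, d) for o in designs if o is not d)]

# (dielectric stress, mechanical mass) for four candidate designs
front = pareto_front([(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0)])
# (3.0, 4.0) is dominated by (2.0, 3.0); the other three are genuine trade-offs
```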
In addition to the dielectric optimization, a new approach for coupling this workflow to optimization of mechanical properties of the HVCB is shown. This paper also details a new Python-based approach that reduces runtime by keeping simulation software clients active during large optimization runs.
Initial findings indicate that the application of optimization algorithms like MOGA-II gives a quicker route to an optimized design, while also enabling coupling of different optimization categories. As a result, new insights into the inherent objective trade-offs caused by the multi-objective nature of HVCB design can be found. These advancements have the potential to streamline the design process and can contribute to the development of more sustainable and efficient products.
Computational Electromagnetics, Electronics, Optimisation
Authored By Wilhelm Thunberg (Hitachi Energy), Sami Kotilainen (Hitachi Energy)
|
Simulations as a Design Guiding Tool: Reexamining the Role of the Simulation Engineer
Authored & Presented By Karlo Seleš (Rimac Technology)
Rimac Technology (RT), formerly the Components Engineering division of Rimac Automobili—founded in 2009—has established itself as an important player in advanced performance electrification technologies. Meanwhile, the Rimac brand has evolved from creating the world’s first all-electric, record-breaking production hypercar to continually pushing the boundaries of aesthetics and dynamics through its Bugatti-Rimac enterprise.
Today, RT stands as a leading Tier-1 automotive supplier, specializing in high-performance battery systems, electric drive units, electronic systems, and user interface components, solidifying its reputation in advanced performance electrification technologies.
As the company transitioned beyond its startup phase, the simulation department expanded in parallel, offering a unique opportunity to challenge and rethink conventional industry practices. One such area of focus is the evolving role of simulation engineers in the product development process.
Amid the rapidly evolving trends within the simulation community, this presentation aims to spark a thought-provoking discussion on the transforming role of industrial simulation engineers in modern product development. While simulation engineering has traditionally been viewed as a supporting function, its strategic importance in the design and development process within Rimac Technology is becoming increasingly apparent.
This presentation will explore how Rimac Technology leverages simulation techniques to address challenges inherent in fast-paced, cost-sensitive industries. It will showcase the critical role simulations play throughout the development cycle—starting from initial concept ideation, where incremental improvements often fall short, to the optimization and validation stages that culminate in production-ready solutions.
By delving into Rimac Technology’s approach, the session will highlight how simulations can be used effectively at different product development stages. Moreover, it will consider how the responsibilities of simulation engineers are expanding beyond traditional analysis tasks to encompass broader topics, such as influencing design strategies, driving various levels of verification and validation campaigns, and integrating sub-system requirement considerations into engineering decisions.
Ultimately, this presentation seeks to challenge conventional perceptions, illustrating how simulation engineers are emerging as key contributors to an organization’s success in an increasingly competitive and technology-driven landscape.
Asset Management, Automotive, Business Impact of Simulation, CAE in the design process, Democratisation, Simulation Management, Simulation Strategy, System-Level Simulation
|
Enabling Model-Based Aircraft Certification
Authored & Presented By Stephen Cook (Northrop Grumman Corporation)
The aircraft certification process for both civil and military air systems carries the reputation of being a costly, paper-centric process [1]. Applicants seeking to achieve certification must provide copious amounts of data and test evidence to establish the engineering pedigree of the aircraft. One of the promises of digital engineering is the use of high-fidelity engineering models as a superior source of data for authorities to find compliance with airworthiness regulations. This approach uses engineering simulation models as the authoritative source of truth for making airworthiness determinations and risk assessments. However, there are practical obstacles to real adoption of model-based aircraft certification. This paper will detail these challenges to achieving model-based aircraft certification, and propose ways to overcome them, in four categories:
Culture: Travel by aircraft is one of the safest forms of transportation, in part due to rigorous airworthiness standards and processes. As a result, the aircraft certification culture is reluctant to change. Pathfinder projects have been formulated to show the value of model-based aircraft certification. The paper will propose next steps to develop a positive certification culture around the use of models in the certification process.
Competency: The rapid onset of digital engineering tools has created a specialized skillset around the design, construction, and format of the model and its corresponding data. Recently a European aircraft industry consortium stated that there is a need to increase “awareness, trust, skills, knowledge, training, experience and mindsets” among engineers using models in the certification process [2]. The paper will discuss some of the airworthiness credentialing efforts underway and the potential to develop training tailored for model-based aircraft certification.
Collaboration: The current aircraft certification process involves generating data and sending the results to the airworthiness authority to be reviewed at another time. In contrast, digital models offer the possibility of collaborating in the model in real time and conducting the showing and finding of compliance simultaneously. The full paper will discuss some of the obstacles that must be overcome to enable collaboration in the model, including availability of regulatory personnel, configuration control of the model, and the ability of models to accurately simulate failure conditions. The paper will also explore the possibility of augmenting collaboration with artificial intelligence to assist the showing and finding of compliance.
Credibility: The aircraft certification process moves at the speed of trust. A recent guide to certification by analysis (CbA) stated that “developing methods to ensure credible simulation results is critically important for regulatory acceptance of CbA” [3]. For engineers to trust models as the authoritative source of truth will require ways to show the credibility of the models through appropriate processes and metrics, which will be discussed in the paper.
The paper will provide recommendations for near-term steps that the community can take to promote progress in each of these four areas. Finally, the paper will identify areas where additional research and pathfinder programs would be valuable to enable model-based aircraft certification.
References:
1) Jiacheng Xie, Imon Chakraborty, Simon I. Briceno and Dimitri N. Mavris. "Development of a Certification Module for Early Aircraft Design," AIAA 2019-3576. AIAA Aviation 2019 Forum. June 2019.
2) Fabio Vetrano, et al., “Recommendations on Increased Use of Modelling and Simulation for Certification / Qualification in Aerospace Industry,” AIAA-2024-1625.
3) Timothy Mauery, et al., “A Guide for Aircraft Certification by Analysis,” NASA/CR-20210015404, May 2021.
Aerospace, MBSE (Model Based System Engineering), Simulation Supporting Certification
|
Structural Analysis Of A Dam Wagon Gate
Presented By Hervandil Sant'Anna (Petrobras)
This study addresses the structural analysis of the wagon gate of one Petrobras dam, which is a critical structure for water intake by the upstream refinery. Built in 1967, the dam and its accessory structures have undergone corrective maintenance over the years. Following the accidents in Mariana and Brumadinho, the National Water Agency (ANA) revised inspection procedures, classifying this dam as low immediate risk but with high potential for associated damage in case of failure.
The main objective of this study is to evaluate the current structural conditions of the wagon gate, which ensures the tightness of the pipeline by blocking about 22 meters of water column. The methodology used is based on elastoplastic stress analysis according to the API 579-1/ASME FFS-1 (2016) code, with the construction of an "as-built" model of the structure and thickness measurements.
The structural analysis was performed using the Finite Element Method (FEM), which allows a detailed assessment of stresses and deformations in the structure. The model was built based on drawings provided by the refinery and thickness measurements taken on the plates and beams that make up the gate. The mechanical properties of the material were obtained from the ASME II Part D code, and the stress analysis followed the API 579-1 methodology.
The results indicate that the wagon gate, despite the natural deterioration process, does not present an immediate risk of structural failure. The numerical analysis considered hydrostatic pressure loads and the structure's own weight. Boundary conditions were defined to prevent rigid body movements, and soil stiffness was modeled based on the Vertical Reaction Module.
The API 579-1 methodology allows the extrapolation of design codes, which was essential for the evaluation of the wagon gate, since the ABNT NBR 8883 standard, used as the design reference, was canceled in 2019. Elastoplastic stress analysis requires the multiplication of load combinations by load factors, as described in Table 2D.4 of API 579-1. The safety coefficient was obtained from NBR 8883 and applied in the evaluation of the risk of plastic collapse failure.
In addition to the analysis with the safety coefficient, additional simulations were performed to verify the actual state of stresses and deformations in the structure, considering different thickness conditions in the gate components. The thicknesses were measured by the refinery on 29 May 2020, and additional hypotheses were considered for regions without direct measurements.
This study was essential to ensure the safety of workers during the maintenance of components downstream of the gate and to confirm the structural integrity of the dam. The detailed analysis of the current structural conditions of the wagon gate provides essential information for decision-making on future maintenance and risk mitigation measures.
Civil Engineering; Computational Structural Mechanics; Simulation Supporting Certification
Authored By Hervandil Sant'Anna (Petrobras), Carlos Eduardo Simoes Gomes (Petrobras)
|
Safety of AI Systems in Modeling and Simulation
Authored & Presented By Young Lee (UL Solutions)
The integration of artificial intelligence (AI) into modeling and simulation systems has significantly expanded their capabilities, enabling improved accuracy, adaptability, and efficiency. These systems are increasingly applied in high-stakes domains, including aerospace, healthcare, and industrial processes, where failure can have severe consequences. While AI-powered modeling and simulation systems offer remarkable opportunities, they also introduce unique safety risks, such as model instability, data biases, and unpredictable behaviors. Addressing these challenges is critical to ensuring the reliability and acceptance of these technologies in safety-critical applications.
This paper specifies safety requirements and provides guidelines for AI-based modeling and simulation systems, focusing on key safety principles: robustness, reliability, quality management, transparency, explainability, data privacy, data management, and lifecycle management. These principles form a comprehensive framework for mitigating risks and fostering trust in AI systems.
Robustness and reliability are foundational to AI safety, ensuring that systems function consistently under both expected and unexpected conditions, producing accurate and dependable results over time. Quality management underpins these principles, emphasizing structured development processes and rigorous testing to minimize systematic errors and ensure adherence to functional requirements.
Transparency and explainability address the need to understand how AI systems make decisions and why specific outputs are produced. These attributes are pivotal for building trust among stakeholders, enabling designers, developers, regulators, and end-users to scrutinize and confidently engage with AI systems.
Data privacy ensures the responsible collection, storage, use, and sharing of personal information, aligning with regulatory requirements and safeguarding individual and organizational data. Effective data management ensures the secure handling of input and output data while fostering compliance with ethical and regulatory standards. Lastly, lifecycle management maintains the safety, reliability, and compliance of AI models throughout their operational lifespan, adapting to technological, regulatory, and user needs.
By integrating these principles, this framework provides a pathway for developing AI-based modeling and simulation systems that are not only innovative but also safe, reliable, and trustworthy. This paper seeks to engage the modeling and simulation community in adopting structured approaches to AI safety, bridging the gap between technological advancements and safety-critical applications.
Engineering Data Science; Simulation Governance
|
Simulation of Roll-Over Protective Structures – Physical Testing to Support Certification
Authored & Presented By Ben Ruffell (TSV Consultants Ltd)
The geographical isolation of New Zealand from most industrial equipment manufacturers has often given rise to innovation. Due to the small market but broad scope of applications, the industry could not support the expense of physically testing custom ROPS (roll-over protective structures) or other protective structures on earth-moving and forestry machines. In 1999, the New Zealand Department of Labour published an Approved Code of Practice (ACoP) for Operator Protective Structures on Self-Propelled Mobile Mechanical Plant. The ACoP defined a legal pathway whereby protective structures could prove compliance with the relevant ISO standard by using computational modelling techniques instead of physical testing. Engineering New Zealand, the local governing body for professional engineers, issued a Practice Note in 2008 advising further details on how to approach the analysis of protective structures and the requirement that a Chartered Professional Engineer (CPEng) certify compliance with the applicable standards.
These protective structures often have large energy absorption criteria, so non-linear finite element analysis (FEA) was required to simulate the plastic deformation. Abaqus CAE was used to simulate the physical testing of a ROPS to ISO 8082-2, which requires three test loads: lateral, vertical and longitudinal, with the plastic deformation remaining after unloading each of the first two loads retained before the subsequent load is applied. ISO 8082-2 has Charpy impact requirements that point to low or medium carbon steel as the required material. Plastic material data with a combined hardening model were used to simulate the loading and unloading cycles. While the lateral and vertical loading scenarios both have static force requirements, the lateral case additionally has an energy absorption requirement, and the longitudinal case has only an energy requirement. The resulting forces through any bolted or pinned connections in the structure must also remain within acceptable levels throughout the loadings. Connector elements were used to simulate bolted and pinned connections, along with various contacts throughout the model. Plastic strains were also monitored to ensure they remained within acceptable levels. There was also the requirement that the ROPS shall not enter the DLV (deflection limiting volume), a strictly defined volume around the position of the operator on the machine.
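The energy absorption criteria above are evaluated by integrating the force-deflection curve extracted from the FEA load step. A minimal sketch of that post-processing step, using a synthetic elastic-plastic curve rather than ISO 8082-2 data:

```python
import numpy as np

# Energy absorbed during a load case = area under the force-deflection
# curve. The curve below is synthetic illustration data, not ISO 8082-2
# requirements or results from the certification analyses.

def absorbed_energy(force_N, deflection_m) -> float:
    """Trapezoidal integration of force over deflection -> energy (J)."""
    f = np.asarray(force_N, dtype=float)
    d = np.asarray(deflection_m, dtype=float)
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(d)))

# synthetic curve: stiff elastic rise saturating into a plastic plateau
d = np.linspace(0.0, 0.25, 200)          # deflection, m
f = 4.0e5 * np.tanh(d / 0.05)            # force, N

print(f"absorbed energy: {absorbed_energy(f, d)/1e3:.1f} kJ")
```

In practice the force and deflection histories would be read from the solver's history output at the load application point.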
The ability to simulate destructive physical tests enables custom operator protection solutions to be optimized quickly and inexpensively. FEA enables accurate modelling of the plastic deformation while monitoring the various connections, stresses and strains. FEA also provides the engineer with rigorous justification for certification.
Computational Structural Mechanics; Impact, Shock & Crash; Simulation Supporting Certification
|
Mitigating Flow-Induced Vibration of Discharge Manifold Equipment with Tuned Mass Dampers: A Physics-ML Approach
Presented By Shobeir Pirayeh Gar (Halliburton)
Discharge manifold equipment (DME) is high-pressure surface equipment temporarily installed at the wellhead or production site to control and manage the collection of fluid flow from the well, enhancing production efficiency. To be readily portable and fast to assemble, DME often consists of numerous joint connections, clamps, and flange adaptors, where flow-induced vibration becomes one of the main considerations affecting the design as well as the desired pumping schedule and frequencies. This paper presents a challenging case in which, after several unsuccessful mitigation attempts, the front discharge of a manifold experienced premature fatigue failure within just 200 hours of operation under various pump frequencies. To solve this problem under the constraint of no major structural design changes, an integrated physics-based and machine-learning-based approach was employed to maximize the fatigue life of the DME. The physics-based model was a system-level (global) finite element model used to perform steady-state vibration analysis to understand the dynamics of the problem and the underlying physics. The harmonic response of the system was then combined with the pump pressure data in the frequency domain to conduct spectral fatigue analysis. The fatigue analysis results were found to be in good agreement with the field observations, confirming the calibration of the global model. A tuned mass damper (TMD) was proposed to mitigate the flow-induced vibration. The effects of the TMD on the vibration response of the system were analysed using the calibrated physics-based model.
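The effect of a TMD on a resonant mode can be illustrated with the classical two-degree-of-freedom model: an auxiliary mass attached to the primary structure and tuned near the problem frequency. The sketch below uses illustrative parameters and Den Hartog's classical tuning rules, not the calibrated DME model:

```python
import numpy as np

# Illustrative 2-DOF tuned-mass-damper model. All parameter values are
# assumptions for the sketch, not the calibrated DME finite element model.
m1, k1, c1 = 100.0, 4.0e5, 200.0   # primary mass (kg), stiffness (N/m), damping (N s/m)
mu = 0.05                           # TMD-to-primary mass ratio
m2 = mu * m1
wn = np.sqrt(k1 / m1)               # primary natural frequency, rad/s

# Den Hartog's classical optimal tuning for a lightly damped primary system
f_opt = 1.0 / (1.0 + mu)
zeta_opt = np.sqrt(3.0 * mu / (8.0 * (1.0 + mu) ** 3))
k2 = m2 * (f_opt * wn) ** 2
c2 = 2.0 * zeta_opt * m2 * f_opt * wn

def primary_amplitude(w: float, with_tmd: bool = True) -> float:
    """Steady-state displacement amplitude of the primary mass per unit force."""
    if not with_tmd:
        return abs(1.0 / (k1 - m1 * w**2 + 1j * w * c1))
    coupling = k2 + 1j * w * c2
    A = np.array([[k1 + k2 - m1 * w**2 + 1j * w * (c1 + c2), -coupling],
                  [-coupling, k2 - m2 * w**2 + 1j * w * c2]])
    x1, _ = np.linalg.solve(A, np.array([1.0, 0.0]))
    return abs(x1)

# sweep around resonance and compare peak response with and without the TMD
ws = np.linspace(0.5 * wn, 1.5 * wn, 400)
peak_bare = max(primary_amplitude(w, with_tmd=False) for w in ws)
peak_tmd = max(primary_amplitude(w) for w in ws)
print(f"peak response reduced by a factor of {peak_bare / peak_tmd:.1f}")
```

The same principle applies to the DME, except that the "primary system" there is the calibrated global FE model rather than a single mass-spring.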
A machine-learning-based optimization approach was used to determine the optimal TMD design that maximizes the fatigue life of the DME. The tuning frequency and critical damping ratio of the TMD were chosen as the main design parameters, from which a feasible design space was defined. The Latin Hypercube Sampling (LHS) method was used to create approximately 40 design samples for design-of-experiments (DOE) analysis. Finite element analysis was conducted on all the design samples to generate physics-based data for training a neural network (NN). The design parameters and the fatigue life served as the input (X) and output (Y) vectors for the network. The trained NN provided a surrogate model serving as an alternative response surface, on which the optimal design was found using a Multi-Island Genetic Algorithm (GA). The analysis results showed that with an optimally tuned mass damper, the fatigue life of the DME could be enhanced to the desired level of about 1,000 hours of operation, thus meeting the job site requirements.
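The DOE step can be sketched in a few lines: Latin Hypercube sampling fills the two-parameter design space so that each variable is stratified evenly. The sketch below substitutes a smooth synthetic response for the FEA fatigue-life results and picks the best sample directly, omitting the NN surrogate and GA stages of the paper's workflow; the parameter ranges are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n: int, bounds) -> np.ndarray:
    """Stratified LHS: one sample per equal-probability bin per variable."""
    d = len(bounds)
    u = (np.arange(n)[:, None] + rng.random((n, d))) / n   # one point per bin
    for j in range(d):                                     # decouple the columns
        u[:, j] = rng.permutation(u[:, j])
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

# Design space: TMD tuning frequency (Hz) and critical damping ratio.
# These ranges are illustrative, not the bounds used in the paper.
bounds = [(10.0, 40.0), (0.02, 0.20)]
X = latin_hypercube(40, bounds)   # ~40 DOE points, as in the paper

# Stand-in for the FEA fatigue-life response (the real Y vector comes from
# the finite element runs); a smooth synthetic surface for illustration.
def fake_fatigue_life(x: np.ndarray) -> float:
    f_hz, zeta = x
    return 1000.0 * np.exp(-((f_hz - 25.0) / 10.0) ** 2
                           - ((zeta - 0.10) / 0.08) ** 2)

Y = np.array([fake_fatigue_life(x) for x in X])
best = X[np.argmax(Y)]
print(f"best sampled design: f = {best[0]:.1f} Hz, zeta = {best[1]:.3f}")
```

In the paper's workflow, (X, Y) instead train the neural network surrogate, and the Multi-Island GA then searches that surrogate for the optimum rather than picking the best raw sample.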
CAE in the design process; Dynamics & Vibration; Engineering Data Science; Optimisation
Authored By Shobeir Pirayeh Gar (Halliburton), Allan Zhong (Halliburton), Hadi Arabnejad (Halliburton), Brad Bull (Halliburton)
|
Certification by Analysis: A Selection of Case Studies
Authored & Presented By Fabio Santandrea (RISE Research Institutes of Sweden)
Ensuring compliance with regulatory requirements is a mandatory process for many products to be allowed on the market. The assessment of product performance is largely based on physical testing of a few samples and, possibly, monitoring of the production process. In order to reduce the cost and time-to-market associated with the certification process, manufacturing companies have increased their efforts to establish numerical simulations as a legitimate alternative to physical testing, thus introducing the notion of "Certification by Analysis" (CbA).
In some sectors, certification bodies responded to the industrial drive towards virtual testing by developing guidelines and standardised reporting documents to streamline the credibility assessment of the results of numerical simulations without compromising the safety of the certification decision. However, there are still significant differences in the acceptance of CbA, and in the maturity of its practical implementation, among industrial sectors.
In this contribution, a review of existing examples of CbA is presented, together with the preliminary study of a potential new case. The role of standards in the specification of product requirements and assessment methods (for physical as well as virtual testing) will be considered, drawing on the work done in the research project STEERING funded by the Swedish Innovation Agency (VINNOVA). The review will focus on the identification of similarities and differences in requirements, methodologies, and challenges faced by manufacturers and certification bodies.
The analysis of established cases provides the starting point to investigate the role of CbA in applications where product certification currently relies fully on physical testing. The feasibility of CbA will be studied in the assessment of crashworthiness requirements for a component made of fibre-reinforced polymer composite material. This preliminary study is developed within the COST Action HISTRATE, a European network of academic researchers and industrial stakeholders that aims to establish the scientific foundation of a reliable framework for CbA of composite structures subjected to high-strain loads.
Simulation Supporting Certification; V&V (Verification and Validation)
|
Simulation-aided Development of a Liquid Hydrogen Evaporator for Hydrogen-powered Aircraft Demonstrators
Presented By Razvan Apetrei (Element Digital Engineering)
The rapid decarbonization of industry and transport is a central challenge in the transition to a more competitive and greener economy. Hydrogen is seen by many as an energy vector with the potential to decarbonize industries such as aerospace and heavy goods transport, which cannot be easily electrified. In these sectors, which need a higher energy density than is available from existing battery technology, hydrogen is likely to play a significant role in the decarbonization strategy. Whereas gaseous hydrogen in high-pressure storage tanks is a feasible solution for ground-based and water-based transport, the associated weight penalty of high-pressure tanks makes it less suited to the aerospace industry, where liquid hydrogen is the preferred alternative.
In order to utilize hydrogen as fuel, either by producing electricity in fuel cells or by burning it in gas turbines, it must first be evaporated and then brought up to temperature. This requirement arises for a number of reasons, including the safety, integrity, and efficiency of the propulsion system. One option is to utilize excess heat produced by the powerplant and, through a thermal management system, redirect that heat to evaporate the LH2. This approach has, in the past, been used in traditional hydrocarbon-fueled aerospace propulsion systems.
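The scale of the evaporation duty can be sketched with a first-order energy balance. The property values below are rounded textbook figures, and the specific heat of hydrogen varies strongly at cryogenic temperatures, so this is an order-of-magnitude estimate, not the NEWBORN design calculation:

```python
# Rough energy balance for evaporating and warming liquid hydrogen.
# H_VAP is the approximate latent heat of vaporization of H2 at ~20 K;
# CP_MEAN is an assumed mean gas-phase specific heat over the cryogenic
# range (cp varies strongly with temperature, so both are estimates).
H_VAP = 446e3               # J/kg
CP_MEAN = 12e3              # J/(kg*K), assumed mean value
T_IN, T_OUT = 20.0, 250.0   # K, illustrative inlet/outlet temperatures

def evaporator_duty(mdot_kg_s: float) -> float:
    """Heat duty (W) to vaporize LH2 and warm it from T_IN to T_OUT."""
    return mdot_kg_s * (H_VAP + CP_MEAN * (T_OUT - T_IN))

# A ~1 MW fuel cell at an assumed 50% efficiency, with an H2 lower
# heating value of ~120 MJ/kg, needs roughly 2e6/120e6 ~ 0.017 kg/s.
mdot = 1.0e6 / (0.5 * 120e6)
print(f"approximate evaporator duty: {evaporator_duty(mdot)/1e3:.0f} kW")
```

Even this crude balance shows the duty is tens of kilowatts, which is why excess powerplant heat is an attractive source for thermally conditioning the fuel.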
The Clean Aviation NEWBORN program has been awarded to develop a megawatt-class propulsion system with hydrogen as its energy source and to mature it to TRL 4. As part of this program, the consortium is developing the thermal management system that utilizes excess powerplant heat to thermally condition the hydrogen prior to entering the fuel cell.
This paper outlines the development cycle of a liquid hydrogen evaporator heat exchanger, with a focus on the role of simulation in determining the key design features necessary to meet the stringent requirements over the wide operating envelope of the device. Insights are given into the thought process behind selecting the right simulation approach and stepping through levels of complexity.
Solutions necessary to minimize the risk of icing of the heating fluid are presented, in the form of both operating requirements and the geometrical design of the device. Assessments conducted to verify that the large thermal gradients do not compromise the structural integrity of the device are also summarized. The key performance metrics related to the efficiency and integrity of the evaporator are also outlined.
Finally, the paper summarizes the performance testing conducted and the test results that validated the design, ahead of its integration into the thermal management system developed by the NEWBORN team.
Aerospace; Computational Fluid Dynamics; Sustainability
Authored By Razvan Apetrei (Element Digital Engineering), Tom Elson (Element Digital Engineering), Steve Summerhayes (Element Digital Engineering)
|