Efficient Joining Failure Assessment of Multi-material Car Bodies in Crash
Presented By Tony Porsch (Volkswagen AG)
Predicting structural failure in automotive engineering remains a significant challenge in virtual vehicle development, gaining further importance in the context of "Virtual Certification." The increasing use of modern lightweight materials, ultra-high-strength steels, and innovative joining techniques contributes to heightened material diversity and complexity in vehicle bodies. Traditional resistance spot welds are now complemented by the growing use of self-piercing rivets, line welds, and flow drill screws, among other techniques. The failure of these connections is of particular concern in crash scenarios, as it significantly impacts vehicle safety. Therefore, robust and industry-applicable computational methods are essential for dealing with the complexity of vehicle structures and delivering reliable predictive results.
In this seminar, the L2-Tool, a modular failure assessment framework developed at the Virtual Vehicle Research Center in a joint research project with Volkswagen AG and Audi AG, will be presented. The key element of this framework is the assessment of failure with special surrogate models, which guarantee high prediction quality at low additional computing cost. High-strength lightweight materials in particular carry an increased risk of crack initiation under plane tensile load, for example due to the heat input of the welding process or the notch effect of rivets and flow drill screws. A key element of this method is that these two types of failure can be distinguished and assessed using a non-local approach. The failure models are parameterized using a combination of real and virtual testing with detailed, small-scale specimens, which will be briefly outlined in the presentation. After the development phase, the failure models are integrated into the product development process in a multi-stage integration process, starting with implementation via user interfaces, followed by a comprehensive test phase and final industrialization by the crash solver provider.
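For illustration only (the L2-Tool's surrogate models themselves are not public), a per-connection assessment of this kind can be sketched as a trained regression model mapping section loads to a failure index; every name and feature below is an assumption:

```python
# Illustrative sketch only: the L2-Tool's actual surrogate models are not
# public. A per-connection failure index is predicted from section loads by
# any pre-trained scikit-learn-style regressor; the features are assumptions.
import numpy as np

def failure_index(surrogate, axial_force, shear_force, bending_moment):
    """Index >= 1.0 flags the spot weld / rivet as 'failure predicted'."""
    features = np.array([[axial_force, shear_force, bending_moment]])
    return float(surrogate.predict(features)[0])

# Hypothetical usage inside a crash post-processing loop:
# for joint in model.joints:
#     if failure_index(surrogate, *joint.section_loads) >= 1.0:
#         joint.mark_failed()
```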
To conclude the presentation, illustrative results of the L2-Tool applied to vehicle substructures are presented. The framework's place within the standardized calculation process is also described, with emphasis on the pre- and post-processing phases. The predictive accuracy of the method is addressed, and finally, potential applications are shown.
Automotive, Failure, Low Code, Meshing, Simulation Supporting Certification
Authored By Tony Porsch (Volkswagen AG) Karl Heinz Kunter (Virtual Vehicle Research GmbH) Jean-Daniel Martinez (Audi AG)
|
Normalizing Uncertainty in Computer-Aided Engineering: A Case Study
Authored & Presented By Fabio Santandrea (Volvo Car Corporation)
Computer-Aided Engineering (CAE) relies on physics-based computational models to perform analysis tasks of industrial products at reduced cost and time-to-market. The possibility to simulate the behaviour of different design variants, with limited resort to physical prototyping and testing, facilitates the achievement of quality and sustainability targets while increasing profit margins. However, the positive effects of driving product development and manufacturing via CAE depend essentially on the predictive capability of the computational models used in the simulations.
Potential sources of uncertainty for the results of numerical simulations include all the intrinsic elements of the model building and analysis processes, such as modelling assumptions, variability of physical properties, measurement uncertainty, and numerical errors. Furthermore, human errors in the use of the models as well as in the way the models are managed during the whole lifecycle might make simulation outcomes deviate from reality.
Simulation Governance is the process to ensure that the predictive capability of numerical simulations is adequate for their intended use (essentially, a Quality Assurance function tailored to CAE). Activities such as model verification, validation, and uncertainty quantification fall within the scope of Simulation Governance. The systematic, large-scale implementation of Simulation Governance is often hindered by a lack of dedicated resources, which in turn often results from a lack of guidance on how to translate key concepts and methods from the scientific context where they originate to the context of industrial research and development.
In this contribution, we will present reflections and outcomes from the recently concluded research project TRUSTIT, which targeted the problem of integrating uncertainty quantification and sensitivity analysis into industrial CAE workflows, namely in complete vehicle simulations performed at Volvo Cars with the in-house developed tool VSIM. The key requirements to establish a complete simulation credibility assessment framework are identified, together with the implications for simulation-driven design and virtual testing. The practical challenges of integrating uncertainty quantification into existing simulation platforms will be illustrated with an example based on the virtual representation of the coastdown test, which is designed to determine the aerodynamic and mechanical resistive forces acting on vehicles.
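To make the coastdown example concrete, a minimal sketch of uncertainty propagation through the standard road-load model F(v) = f0 + f1*v + f2*v^2 follows; the coefficient values and uncertainties are invented for illustration and are not Volvo/VSIM data:

```python
# Minimal Monte Carlo sketch (not VSIM/TRUSTIT code): propagate assumed
# coefficient uncertainty through the coastdown road-load model.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
f0 = rng.normal(120.0, 5.0, n)      # N, rolling-resistance term (assumed)
f1 = rng.normal(0.5, 0.05, n)       # N/(m/s) (assumed)
f2 = rng.normal(0.35, 0.02, n)      # N/(m/s)^2, aerodynamic term (assumed)
v = 27.8                            # m/s, roughly 100 km/h

force = f0 + f1 * v + f2 * v**2
print(f"road load at 100 km/h: {force.mean():.1f} N +/- {force.std():.1f} N")
```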
Simulation Governance, Uncertainty Quantification, V&V (Verification and Validation)
|
The Digital Twin of ESA's Large Space Simulator
Presented By Remko Moeys (ESA/ESTEC)
This paper presents the digital twin of the Large Space Simulator (LSS), as it undergoes the final stage of its development. Located in the Netherlands, the LSS is Europe's largest thermal vacuum chamber and is used by the European Space Agency to test spacecraft under representative space conditions: vacuum, cryogenic temperatures and powerful, dynamic solar illumination.
The purposes of this digital twin are to simulate:
1. Future test campaigns (standard or specific ones) to support training for operating the LSS facility
2. The performance of the facility with future hardware or software modifications and carry out software/hardware-in-the-loop pre-tests
3. Abnormal facility operation with failed equipment
The digital twin consists of three layers:
1. a high-fidelity EcosimPro model of the LSS to simulate its physical performance
2. a virtual version of the LSS Programmable Logic Controller to execute the process control
3. a Human-Machine Interface identical to that of the LSS for the user to interact with
A co-simulation manager ensures the exchange of information between the above three layers and enables digital twin-specific functionalities such as adjusting the simulation speed, uploading/starting/stopping/saving a simulation run, loading pre-defined failure scenarios, and virtually carrying out the key procedure steps that are manually performed in the field.
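A conceptual sketch of such a co-simulation manager loop is given below; the actual LSS digital twin interfaces are not described in the abstract, so all object names and methods are assumptions:

```python
# Conceptual sketch of a co-simulation manager stepping three layers in sync;
# plant_model, virtual_plc and hmi are hypothetical stand-ins for the real
# EcosimPro model, virtual PLC and operator interface.
import time

def run(plant_model, virtual_plc, hmi, dt=0.1, speedup=1.0, scenario=None):
    """Exchange data between physics model, virtual PLC and HMI each step."""
    if scenario:
        plant_model.inject_failure(scenario)   # pre-defined failure scenario
    while hmi.running:
        sensors = plant_model.step(dt)         # physics step (EcosimPro-like)
        commands = virtual_plc.scan(sensors)   # PLC program scan cycle
        plant_model.apply(commands)
        hmi.update(sensors, commands)          # refresh operator displays
        time.sleep(dt / speedup)               # adjustable simulation speed
```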
To maximise the representativeness of the real facility operation, the digital twin is designed to be operable from the same monitors as those in the LSS control room and to display the simulation results using the same data acquisition and presentation software used by the LSS: STAMP (System for Thermal Analysis, Measurement, and Power supply control), developed by Therma. The digital twin is conceived to be operable by a trainee and an instructor simultaneously.
The project was kicked off in November 2023, underwent a detailed validation review of the models against test data in October 2024, and is expected to be completed by mid-2025. The prime contractor of this project is Empresarios Agrupados - GHESA, which also owns the modelling software used (EcosimPro).
Digital Twins
Authored By Remko Moeys (ESA/ESTEC) Raul Avezuela (Empresarios Agrupados)
|
Simulation-aided Development of a Liquid Hydrogen Evaporator for Hydrogen-powered Aircraft Demonstrators
Presented By Steve Summerhayes (Element Digital Engineering)
The rapid decarbonization of industry and transport is a central challenge in the transition to a more competitive and greener economy. Hydrogen is seen by many as an energy vector with the potential to decarbonize industries such as aerospace and heavy goods transport which cannot be easily electrified. In these sectors, which need a higher energy density than is available from existing battery technology, hydrogen is likely to play a significant role in the decarbonization strategy. Whereas gaseous hydrogen in high-pressure storage tanks is a feasible solution for ground-based and water-based transport, the associated weight penalty of high-pressure tanks makes it less suited to the aerospace industry, where liquid hydrogen is the preferred alternative.
In order to utilize the hydrogen as fuel, either by producing electricity in fuel cells or by burning it in gas turbines, it must be evaporated and then raised in temperature. This requirement derives from a number of considerations, including the safety, integrity, and efficiency of the propulsion system. One option is to utilize excess heat produced by the powerplant and, through a thermal management system, redirect that heat to evaporate the LH2. This approach has, in the past, been used in traditional hydrocarbon-fueled aerospace propulsion systems.
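A back-of-envelope sketch of the resulting evaporator heat duty, using rough textbook hydrogen properties and an assumed fuel flow (none of these figures are from the NEWBORN program):

```python
# Rough heat-duty estimate for an LH2 evaporator; all values are assumptions
# chosen only to illustrate the energy balance q = m_dot * (h_vap + cp * dT).
m_dot  = 0.010     # kg/s fuel flow (hypothetical)
h_vap  = 446e3     # J/kg, approx. latent heat of vaporization of H2 (~20 K)
cp_gas = 14.3e3    # J/(kg*K), approx. cp of gaseous hydrogen
dT     = 250.0     # K of superheat, e.g. ~20 K up to ~270 K

q = m_dot * (h_vap + cp_gas * dT)   # W of waste heat to be redirected
print(f"required heat duty: {q/1e3:.1f} kW")   # ~40 kW for these inputs
```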
The Clean Aviation NEWBORN program has been awarded to develop a megawatt-class propulsion system with hydrogen as its energy source and to bring it to TRL 4. As part of this program, the consortium is developing the thermal management system which utilizes excess powerplant heat to thermally condition the hydrogen prior to entering the fuel cell.
This paper outlines the development cycle of a liquid hydrogen evaporator heat exchanger, with a focus on the role of simulation in determining the key design features necessary in order to meet the stringent requirements over the wide operating envelope of the device. Insights are given into the thought process behind selecting the right simulation approach and stepping through levels of complexity.
Solutions necessary to minimize the risk of icing of the heating fluid are presented, in the form of both operating requirements and the geometrical design of the device. Assessments conducted to verify that the large thermal gradients do not compromise the structural integrity of the device are also summarized. The key performance metrics, related to the efficiency and integrity of the evaporator, are also outlined.
Finally, the paper summarizes the performance testing conducted and the test results obtained which validated the design, ahead of its integration into the thermal management system developed by the NEWBORN team.
Aerospace, Computational Fluid Dynamics, Sustainability
Authored By Steve Summerhayes (Element Digital Engineering) Tom Elson (Element Digital Engineering)
|
Contribution of Virtual Validation in the Development of a 48V Electric Powertrain for 2-wheeler Applications
Presented By Riccardo Testi (Piaggio)
A CAE workflow was defined within a new Piaggio development methodology and executed to develop and assess the new Piaggio 48V electric powertrain's performance. The workflow involved linked CFD, EMAG and structural analyses.
The objective was to anticipate fundamental results and information and to reduce the economic effort associated with physical prototyping activities.
Diverse CAE suites were coupled and optimized using Piaggio's procedures, consolidated over the years for the development of 2-wheelers equipped with internal combustion engines. The modular structure of those suites made it easier to incorporate the new EMAG analyses into the workflow.
The MBS system simulation activities were carried out by integrating the new E-powertrain models into a Piaggio database, which includes libraries of subsystems such as transmissions, test benches, etc. This approach will allow quicker generation of future models, leveraging the carry-over made possible by the modular nature of such a database.
The whole CAE workflow relied on a common source of truth residing in Piaggio's PLM system, allowing smooth cooperation between Piaggio's E-Mobility and Powertrain departments.
EMAG simulations were carried out to assess the electric machine's performance and to provide input data for the subsequent CFD, MBS and structural analyses.
The dynamic behavior, from a mechanical standpoint, was analyzed with multibody models, which produced KPI values and provided input data for stress analyses.
CFD analyses were used to verify that the operating temperatures were compatible with the electric machine's requirements and provided thermal maps for the FEM stress analyses.
The structural integrity of the whole e-powertrain system was verified with combined stress and durability analyses, based on the working conditions identified during the previous EMAG, MBS and CFD campaigns.
The structural FEM analyses were also used on their own, without coupling to durability tools, to investigate functional aspects of the mechanical system.
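One hand-off in this chain, the CFD thermal map feeding the FEM stress model described above, can be sketched as a simple interpolation step; the synthetic data and names below are assumptions, not Piaggio's actual tool coupling:

```python
# Hedged sketch: interpolate a CFD thermal map onto FEM node locations for a
# subsequent stress run. The point clouds below are synthetic placeholders.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(1)
cfd_xyz  = rng.uniform(0.0, 1.0, (5000, 3))    # CFD cell centres [m] (toy)
cfd_temp = 40.0 + 60.0 * cfd_xyz[:, 2]         # toy temperature field [C]
fem_xyz  = rng.uniform(0.0, 1.0, (2000, 3))    # FEM node coordinates [m] (toy)

fem_temp = griddata(cfd_xyz, cfd_temp, fem_xyz, method="linear")
outside = np.isnan(fem_temp)                   # nodes outside the CFD cloud
fem_temp[outside] = griddata(cfd_xyz, cfd_temp, fem_xyz[outside],
                             method="nearest")
print(f"mapped {fem_temp.size} nodal temperatures, max {fem_temp.max():.1f} C")
```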
Because the CAE campaign was carried out in the project's early stages, it reduced the number of physical tests and assisted the sourcing activities managed by the Purchasing department.
Automotive, CAE in the design process, Computational Electromagnetics, Computational Fluid Dynamics, Multibody Dynamics
Authored By Riccardo Testi (Piaggio) Michele Caggiano (Piaggio & C. SpA) Antonio Fricasse (Piaggio & C. SpA)
|
Simulations as a Design Guiding Tool: Reexamining the Role of the Simulation Engineer
Authored & Presented By Karlo Seleš (Rimac Technology)
Rimac Technology (RT), formerly the Components Engineering division of Rimac Automobili (founded in 2009), has established itself as an important player in advanced performance electrification technologies. Meanwhile, the Rimac brand has evolved from creating the world's first all-electric, record-breaking production hypercar to continually pushing the boundaries of aesthetics and dynamics through its Bugatti-Rimac enterprise.
Today, RT stands as a leading Tier-1 automotive supplier, specializing in high-performance battery systems, electric drive units, electronic systems, and user interface components, solidifying its reputation in advanced performance electrification technologies.
As the company transitioned beyond its startup phase, the simulation department expanded in parallel, offering a unique opportunity to challenge and rethink conventional industry practices. One such area of focus is the evolving role of simulation engineers in the product development process.
Amid the rapidly evolving trends within the simulation community, this presentation aims to spark a thought-provoking discussion on the transforming role of industrial simulation engineers in modern product development. While simulation engineering has traditionally been viewed as a supporting function, its strategic importance in the design and development process within Rimac Technology is becoming increasingly apparent.
This presentation will explore how Rimac Technology leverages simulation techniques to address challenges inherent in fast-paced, cost-sensitive industries. It will showcase the critical role simulations play throughout the development cycle, starting from initial concept ideation, where incremental improvements often fall short, to the optimization and validation stages that culminate in production-ready solutions.
By delving into Rimac Technology's approach, the session will highlight how simulations can be used effectively at different product development stages. Moreover, it will consider how the responsibilities of simulation engineers are expanding beyond traditional analysis tasks to encompass broader topics, such as influencing design strategies, driving various levels of verification and validation campaigns, and integrating sub-system requirement considerations into engineering decisions.
Ultimately, this presentation seeks to challenge conventional perceptions, illustrating how simulation engineers are emerging as key contributors to an organization's success in an increasingly competitive and technology-driven landscape.
Asset Management, Automotive, Business Impact of Simulation, CAE in the design process, Democratisation, Simulation Management, Simulation Strategy, System-Level Simulation
|
Modelling Aero-Optical Turbulent Effects On The European Solar Telescope Using CFD Analysis
Presented By Mahy Soler (Principia Ingenieros Consultores S.A.)
The European Solar Telescope (EST) is a next-generation 4-m class solar telescope that will be built at the Observatorio del Roque de los Muchachos (ORM). The performance of an optical telescope is evaluated by its seeing, which refers to the image degradation caused by turbulent fluctuations in the air's refractive index as light travels through the optical path. This phenomenon arises from various sources, including atmospheric turbulence and environmental and local effects.
Atmospheric turbulence is largely determined by the site location, and the ORM is renowned for its excellent atmospheric conditions for astronomical observation. The EST design aims to minimize environmental local turbulence effects caused primarily by the thermal ground layer. This is achieved by placing the optical elements as high as possible above the ground and using an open-air configuration that promotes natural ventilation.
The local fluctuations in the air refractive index in the surroundings of the telescope are produced by a combination of thermal and mechanical turbulence that depends on the size and shape of the design. To evaluate the local effects, detailed Finite Element Thermal and Computational Fluid Dynamics (CFD) models were developed. These models accounted for the topography, telescope structure, pier, enclosure and nearby telescopes within the observatory. Transient thermal analysis calculates surface temperatures, which are subsequently used by the CFD model to compute the air temperature distribution and its refractive index.
A series of transient CFD analyses is conducted to analyze the impact of environmental conditions, including wind speed, wind direction and telescope orientations, on different design alternatives. These simulations provide further insights into the spatial distributions of air temperature and refractive index fluctuations inside the optical path. The results are postprocessed to derive aero-optical metrics, allowing the telescope's performance to be estimated.
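As a hedged sketch of the post-processing idea, simulated air temperatures can be converted to refractive-index fluctuations with the common visible-light approximation (n - 1) = 77.6e-6 * p / T (p in hPa, T in K); the sample values below are illustrative, not EST results:

```python
# Minimal sketch: refractive-index fluctuations from a CFD temperature field
# using the standard optical-refractivity approximation for air.
import numpy as np

def refractive_index(p_hpa, t_kelvin):
    return 1.0 + 77.6e-6 * p_hpa / t_kelvin

t_field = np.array([288.0, 288.5, 289.2, 287.8])  # K, sample air temperatures
p = 770.0                                         # hPa, assumed site pressure

n = refractive_index(p, t_field)
dn = n - n.mean()   # fluctuations along the optical path drive the seeing
print(dn)
```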
The study highlights how design choices influence aero-optical turbulence and provides feedback for optimizing the EST's design. The results contribute to the telescope's error budget by quantifying local turbulence effects and ensuring that the aerodynamic design supports its optical performance goals.
Computational Fluid Dynamics, Optical
Authored By Mahy Soler (Principia Ingenieros Consultores S.A.) Konstantinos Vogiatzis (Instituto de Astrofísica de Canarias) Juan Cezar-Castellano (Instituto de Astrofísica de Canarias) Sergio Bonaque-Gonzalez (Departamento de Física, Universidad de La Laguna) Marta Belio-Asen (Instituto de Astrofísica de Canarias) Miguel Nunez (Instituto de Astrofísica de Canarias) Mary Barreto (Instituto de Astrofísica de Canarias)
|
Certification by Analysis: A Selection of Case Studies
Authored & Presented By Fabio Santandrea (Volvo Car Corporation)
Ensuring compliance with regulatory requirements is a mandatory process for many products to be allowed on the market. The assessment of product performance is largely based on physical testing of a few samples and, possibly, monitoring of the production process. In order to reduce the cost and time-to-market associated with the certification process, manufacturing companies have increased their efforts to establish numerical simulations as a legitimate alternative to physical testing, thus introducing the notion of "Certification by Analysis" (CbA).
In some sectors, certification bodies responded to the industrial drive towards virtual testing by developing guidelines and standardised reporting documents to streamline the credibility assessment of the results of numerical simulations without compromising the safety of the certification decision. However, there are still significant differences in the acceptance of CbA and in the maturity of its practical implementation among different industrial sectors.
In this contribution, a review of existing examples of CbA is presented, together with the preliminary study of a potential new case. The role of standards in the specification of product requirements and assessment methods (for physical as well as virtual testing) will be considered, drawing on the work done in the research project STEERING funded by the Swedish Innovation Agency (VINNOVA). The review will focus on the identification of similarities and differences in requirements, methodologies, and challenges faced by manufacturers and certification bodies.
The analysis of established cases provides the starting point to investigate the role of CbA in applications where product certification currently relies fully on physical testing. The feasibility of CbA will be studied in the assessment of crashworthiness requirements for a component made of fibre-reinforced polymer composite material. This preliminary study is developed within the COST Action HISTRATE, a European network of academic researchers and industrial stakeholders that aims to establish the scientific foundation of a reliable framework for CbA of composite structures subjected to high strain-rate loads.
Simulation Supporting Certification, V&V (Verification and Validation)
|
Full Scale Validation Testing for Legacy Aircraft Finite Element Models
Authored & Presented By David Wieland (Southwest Research Institute)
As the US military extends the service life of aging weapon systems, the need for accurate models of aircraft structures has become critical. The original finite element models (FEMs) used during the development of these legacy systems were either not procured by the military or not maintained, necessitating the development of new FEMs from 2D drawings and scanned parts. Given that many weapon systems were tested up to 60 years ago, the absence of original test data presents significant challenges for model validation. This has led the United States Air Force to perform full-scale tests for model validation.
This presentation delves into Southwest Research Institute's recent experiences in performing full-scale validation tests for the T-38 and A-10 aircraft finite element models. The full-aircraft NASTRAN structural analysis model of the T-38 is being developed by Northrop Grumman. The A-10 aircraft structural analysis model is being developed by the United States Air Force A-10 analysis section with support from Southwest Research Institute. To perform these full-scale validation tests, a diverse array of measurement techniques was employed, including deflection potentiometers, strain gages, fiber optic strain sensors, and digital image correlation.
Depending on the component, the validation tests are often performed on structure that will be returned to active service. This necessitates extra caution to ensure the structure is not damaged, and it can limit the options for attaching to and loading the structure. The presentation will discuss the specific test setups, the resulting data, the status of the validation effort, and key lessons learned from the validation process.
In addition to these validation efforts, SwRI will discuss how the validated T-38 finite element analysis model is being used to update the damage tolerance analysis of the -33 wing. This will include stress-to-load equation development, development of stress sequences, and damage tolerance analysis to determine crack growth progression and failure.
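A minimal sketch of the stress-to-load idea, assuming a linear relation between applied jack loads and gauge stress (all numbers below are invented, not SwRI data):

```python
# Hedged sketch of "stress-to-load equation" development: fit linear
# coefficients relating applied point loads to a gauge stress, then reuse the
# equation to build stress sequences from load spectra.
import numpy as np

loads  = np.array([[1.0, 0.2], [2.0, 0.5], [3.0, 0.4], [4.0, 1.0]])  # kN/jack
stress = np.array([ 10.3,       21.2,       30.8,       42.1      ]) # MPa

coeff, *_ = np.linalg.lstsq(loads, stress, rcond=None)
print("MPa per kN for each load point:", coeff)

flight_loads = np.array([2.5, 0.7])          # one spectrum entry (kN)
print("predicted gauge stress:", flight_loads @ coeff, "MPa")
```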
Aerospace, Integration of Analysis & Test, V&V (Verification and Validation)
|
Nonlinear Cohesive Zone Modeling for Adhesives
Presented By Tobias Waffenschmidt (3M Deutschland)
In many engineering applications, the integrity of adhesive bonds must be ensured over the service life when exposed to mechanical stresses. In order to assess the structural integrity of adhesive bonds numerically, there is an increasing need to efficiently model and simulate the strength, damage and failure behavior of adhesives. This includes i) structural adhesives (e.g. curable epoxy-, acrylate-, or polyurethane-based adhesives, which exhibit thermosetting behavior) but also ii) pressure-sensitive adhesives (adhesive tapes), which behave more like elastomers. Pressure-sensitive adhesives, in particular, typically exhibit a highly nonlinear elastic-viscoelastic material behavior, including strains at failure of up to 500% or more. This makes a numerical treatment using conventional continuum finite elements difficult if not completely infeasible. One approach to circumvent these deficiencies is cohesive zone modeling. Cohesive zone models make use of constitutive traction-separation laws, which make it straightforward to incorporate damage and failure mechanisms for adhesives and do not render mesh-dependent results, as would be the case for continuum-based techniques. In particular, incorporating the strongly nonlinear and rate-dependent response is challenging, because the conventional bilinear traction-separation laws available in basically all commercial finite element software packages are not sufficient to model such complex material behavior. On the other hand, self-implemented user subroutines, which may be used as an alternative, are mostly not feasible in an industrial environment due to the high implementation effort, inferior robustness and higher computational cost, which mostly prohibits straightforward usage for large-scale simulation problems.
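For reference, the conventional bilinear traction-separation law that the abstract contrasts against can be sketched in a few lines; the parameter values are illustrative, and this is not 3M's nonlinear formulation:

```python
# Sketch of the standard bilinear traction-separation law: linear loading up
# to a peak traction, then linear softening to zero at final separation.
def bilinear_traction(delta, k0=1e4, delta0=0.01, delta_f=0.5):
    """Traction [MPa] vs opening [mm]; k0 in MPa/mm (illustrative values)."""
    if delta <= delta0:          # intact: initial stiffness k0
        return k0 * delta
    if delta >= delta_f:         # fully failed
        return 0.0
    # scalar damage variable, 0 at delta0 and 1 at delta_f
    d = (delta_f / delta) * (delta - delta0) / (delta_f - delta0)
    return (1.0 - d) * k0 * delta

# Peak traction here is k0 * delta0 = 100 MPa at 0.01 mm opening.
```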
This presentation gives an overview of accurate yet efficient nonlinear cohesive zone modeling techniques suitable for modeling damage and failure for i) structural adhesives and ii) pressure-sensitive adhesives (adhesive tapes) without the need for user subroutines. Suitable testing and characterization methods for both adhesive categories will be presented and compared to each other. Material model calibration and parameter identification techniques based on these tests for cohesive zone models will be discussed for rate-independent and rate-dependent use cases. Verification and validation test cases will be discussed to underline the applicability of these models. Finally, a variety of different application cases ranging from quasi-static to impact scenarios will be presented.
Joints & Connections, Material Characterisation, Materials, V&V (Verification and Validation)
Authored By Tobias Waffenschmidt (3M Deutschland) Markus von Hoegen (3M Deutschland GmbH)
|
Twins, Pyramids and Environments: Unifying Approaches to Virtual Testing
Presented By Louise Wright (National Physical Laboratory)
There is a long history of the use of engineering simulation for design. This virtual approach is typically followed by physical prototyping, testing and refinement to reach a final design, followed in many cases by a physical testing regime to meet regulatory requirements. For many companies, the use of simulation tools has reduced the time and cost associated with getting new products to market due to the ability to explore multiple designs, and has reduced resource usage and improved product quality by enabling exploration of aspects of manufacturability and long-term in-use performance.
Companies are increasingly seeking to gain similar benefits beyond the design stage. Some companies whose products are subject to extensive regulatory testing requirements are seeking to provide evidence of compliance through a combination of simulation and testing. It is common in such industry sectors to have a "testing pyramid", where the safety of a complex multi-component product is demonstrated by carrying out tests of materials, components, assemblies and complete systems, with the number of tests carried out decreasing as the complexity of the object under test increases. A "smarter testing" approach would replace some of these physical tests with simulations and would feed information between the various tests to improve the validation of the simulation and the confidence in the evidence of safety.
Some products cannot be fully tested via physical testing alone because they cause a risk to human safety. An example of current relevance is autonomous vehicles. The artificial intelligence (AI) that controls an autonomous vehicle is often trained on data obtained from human-controlled journeys of a vehicle with the sensor suite in operation, so that the AI is shown what safe driving looks like under typical conditions. However, many of the situations most likely to lead to an accident are not encountered in typical driving conditions, and could cause risk to life if recreated deliberately. A simulation can potentially recreate high-risk scenarios safely for both training and testing purposes.
Some products can significantly improve lifetime prediction and understanding of real-world performance by linking models and data in a digital twin. This approach can lead to improved future design iterations and more effective maintenance plans as the understanding of the product improves. Application of this approach could support personalisation of devices such as medical prostheses, where monitoring, adjustment and individualisation could significantly improve people's lives.
In all of these applications it is important to note that the company developing the simulations is not the only party that needs to have trust in the simulation results. That trust needs to be shared by regulators, end users, and in some cases the general public.
These three seemingly distinct themes of activity are strongly linked, not least because they have the same need underpinning them: they need to combine validated models and measured data to make trustworthy predictions of real-world behaviour. This need can be answered most efficiently by a combination of activities in several technical areas, including data quality assessment, software interoperability, semantic technologies, model validation, and uncertainty quantification. The technology readiness level of these areas is varied, and the level of awareness and uptake of good practice of each technical area varies across sectors.
This paper discusses the common features of, and differences between, the fields of smart testing, virtual test environments, and digital twins. Starting from a consideration of commonality, we will highlight areas where existing methods and expertise could be better exploited, and identify areas where further research and development of tools would accelerate successful application of trustworthy digital assurance approaches in industry.
Digital Twins, Integration of Analysis & Test, Simulation Strategy, Simulation Supporting Certification
Authored By Louise Wright (National Physical Laboratory) Liam Wright (National Physical Laboratory) Kathryn Khatry (National Physical Laboratory) Joao Gregorio (National Physical Laboratory) Michael Chrubasik (National Physical Laboratory) Maurizio Bevilacqua (National Physical Laboratory)
|
Generative-AI for Preliminary Engineering Design
Presented By Yashwant Liladhar Gurbani (Rolls-Royce)
Many engineering solutions require technologies that rely on specialised know-how and knowledge of the physics mechanisms underpinning their design and operation. As the world moves towards a digital era, current surrogate model approaches are either not fit for processing large databases, or unsuitable for dealing directly with data typically deriving from computer-based analyses, such as geometry representations and field quantities (e.g., stress, displacements, temperature, etc.). At the same time there is a need for enhanced design space exploration capabilities that overcome the limitations of parametric models, enabling the assessment of innovative design concepts through more free-form geometry modelling approaches. GANs have proven effective at generating hyper-realistic images when trained on many different (but similar) data. From the literature, there is evidence suggesting that conditional Generative Adversarial Networks (cGAN) can provide a valuable means to support engineering design by accurately predicting the results of computationally expensive simulations through the encoding of design information into 2D images [1][2]. However, further work is needed to identify and address some of the roadblocks hindering a wider application of this technology. This paper presents the investigation conducted to understand and address some of the restrictions identified in the use of cGAN models on different preliminary design engineering use cases. The models in this study were assessed on engineering and non-engineering data while monitoring their sensitivity to architectural and parametric changes. This deep dive helped gain a better understanding of the applications where such an approach can and cannot be used. Limitations were also identified in tasks conducted as part of the pre-processing of training data, which motivated a more detailed evaluation of data encoding and highlighted the need for further developments in this area. Furthermore, the portability of such an approach allows unleashing the crucial benefits of its deployment into a cloud environment in terms of efficiency and cost-effectiveness whilst complying with data classification constraints.
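A minimal conditional-GAN skeleton (PyTorch) makes the encoding idea concrete; the paper's actual architectures and hyperparameters are not disclosed, so the sizes, layer choices and 2D-image design encoding below are assumptions:

```python
# Minimal cGAN skeleton: generator and discriminator both conditioned on a
# design-parameter vector; the output is a flattened 32x32 field image.
import torch
import torch.nn as nn

class Generator(nn.Module):
    def __init__(self, z_dim=64, cond_dim=8, out_pixels=32 * 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(z_dim + cond_dim, 256), nn.ReLU(),
            nn.Linear(256, out_pixels), nn.Tanh(),  # e.g. a stress-field image
        )

    def forward(self, z, cond):                     # cond = design parameters
        return self.net(torch.cat([z, cond], dim=1))

class Discriminator(nn.Module):
    def __init__(self, cond_dim=8, in_pixels=32 * 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_pixels + cond_dim, 256), nn.LeakyReLU(0.2),
            nn.Linear(256, 1), nn.Sigmoid(),        # real/fake score
        )

    def forward(self, img, cond):
        return self.net(torch.cat([img, cond], dim=1))
```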
Engineering Data Science
Authored By Yashwant Liladhar Gurbani (Rolls-Royce) Marco Nunez (Rolls-Royce PLC) Harry Bell (Rolls-Royce PLC) Nima Ameri (Rolls-Royce PLC) Jon Gregory (Rolls-Royce PLC) Shiva Babu (Rolls-Royce PLC)
|
Enabling Model-Based Aircraft Certification
Authored & Presented By Stephen Cook (Northrop Grumman)
The aircraft certification process for both civil and military air systems carries the reputation of being a costly, paper-centric process [1]. Applicants seeking to achieve certification must provide copious amounts of data and test evidence to establish the engineering pedigree of the aircraft. One of the promises of digital engineering is the use of high-fidelity engineering models as a superior source of data for authorities to find compliance with airworthiness regulations. This approach uses engineering simulation models as the authoritative source of truth for making airworthiness determinations and risk assessments. However, there are practical obstacles to real adoption of model-based aircraft certification. This paper will detail these challenges to achieving model-based aircraft certification, and propose ways to overcome them, in four categories:
Culture: Travel by aircraft is one of the safest forms of transportation, in part due to rigorous airworthiness standards and processes. As a result, the aircraft certification culture is reluctant to change. Pathfinder projects have been formulated to show the value of model-based aircraft certification. The paper will propose next steps to develop a positive certification culture around the use of models in the certification process.
Competency: The rapid onset of digital engineering tools has created a specialized skillset around the design, construction, and format of the model and its corresponding data. Recently a European aircraft industry consortium stated that there is a need to increase "awareness, trust, skills, knowledge, training, experience and mindsets" among engineers using models in the certification process [2]. The paper will discuss some of the airworthiness credentialing efforts underway and the potential to develop training tailored for model-based aircraft certification.
Collaboration: The current aircraft certification process involves generating data and sending the results to the airworthiness authority to be reviewed at another time. In contrast, digital models offer the possibility of collaborating in the model in real time and conducting the showing and finding of compliance simultaneously. The full paper will discuss some of the obstacles that must be overcome to enable collaboration in the model, including the availability of regulatory personnel, configuration control of the model, and the ability of models to accurately simulate failure conditions. The paper will also explore the possibility of augmenting collaboration with artificial intelligence to assist the showing and finding of compliance.
Credibility: The aircraft certification process moves at the speed of trust. A recent guide to certification by analysis (CbA) stated that "developing methods to ensure credible simulation results is critically important for regulatory acceptance of CbA" [3]. For engineers to trust models as the authoritative source of truth will require ways to show the credibility of the models through appropriate processes and metrics, which will be discussed in the paper.
The paper will provide recommendations for near-term steps that the community can take to promote progress in each of these four areas. Finally, the paper will identify areas where additional research and pathfinder programs would be valuable to enable model-based aircraft certification.
References:
1) Jiacheng Xie, Imon Chakraborty, Simon I. Briceno and Dimitri N. Mavris. "Development of a Certification Module for Early Aircraft Design," AIAA 2019-3576. AIAA Aviation 2019 Forum. June 2019.
2) Fabio Vetrano, et al., "Recommendations on Increased Use of Modelling and Simulation for Certification / Qualification in Aerospace Industry," AIAA-2024-1625.
3) Timothy Mauery, et al., "A Guide for Aircraft Certification by Analysis," NASA/CR-20210015404, May 2021.
Aerospace, MBSE (Model Based System Engineering), Simulation Supporting Certification
|
Structural Analysis Of A Dam Wagon Gate
Presented By Hervandil Sant'Anna (Petrobras)
This study addresses the structural analysis of the wagon gate of one Petrobras dam, which is a critical structure for water intake by the upstream refinery. Built in 1967, the dam and its accessory structures have undergone corrective maintenance over the years. Following the accidents in Mariana and Brumadinho, the National Water Agency (ANA) revised inspection procedures, classifying this dam as low immediate risk but with high potential for associated damage in case of failure.
The main objective of this study is to evaluate the current structural conditions of the wagon gate, which ensures the tightness of the pipeline by blocking about 22 meters of water column. The methodology used is based on elastoplastic stress analysis according to the API 579-1/ASME FFS-1 (2016) code, with the construction of an "as-built" model of the structure and thickness measurements.
The structural analysis was performed using the Finite Element Method (FEM), which allows a detailed assessment of stresses and deformations in the structure. The model was built based on drawings provided by the refinery and thickness measurements taken on the plates and beams that make up the gate. The mechanical properties of the material were obtained from the ASME II Part D code, and the stress analysis followed the API 579-1 methodology.
The results indicate that the wagon gate, despite the natural deterioration process, does not present an immediate risk of structural failure. The numerical analysis considered hydrostatic pressure loads and the structure's own weight. Boundary conditions were defined to prevent rigid body movements, and soil stiffness was modeled based on the Vertical Reaction Module.
The API 579-1 methodology allows the extrapolation of design codes, which was essential for the evaluation of the wagon gate, since the ABNT NBR 8883 standard, used as a design reference, was cancelled in 2019. Elastoplastic stress analysis requires the multiplication of load combinations by load factors, as described in table 2D.4 of API 579-1. The safety coefficient was obtained from NBR 8883 and applied in the evaluation of the risk of plastic collapse failure.
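The structure of such a factored-load check can be sketched as follows; the factor shown is a placeholder in the spirit of the method, not a value quoted from table 2D.4, and the solver object is hypothetical:

```python
# Hedged sketch of an elastic-plastic load-case check: run the model under
# factored loads; convergence of the solution indicates no plastic collapse.
def plastic_collapse_check(solver, dead_load, hydrostatic_load, factor=2.4):
    # 'factor' is a placeholder; the governing load factors must be taken
    # from table 2D.4 of API 579-1 for the actual combination being checked.
    converged = solver.run(loads=factor * (dead_load + hydrostatic_load))
    return converged  # True => no plastic collapse at the factored load
```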
In addition to the analysis with the safety coefficient, additional simulations were performed to verify the actual state of stresses and deformations in the structure, considering different thickness conditions in the gate components. The thicknesses were measured by the refinery on 29/5/2020, and additional hypotheses were considered for regions without direct measurements.
This study was essential to ensure the safety of workers during the maintenance of downstream components of the gate and to ensure the structural integrity of the dam. The detailed analysis of the current structural conditions of the wagon gate provides essential information for decision-making on future maintenance and risk mitigation measures.
Civil Engineering, Computational Structural Mechanics, Simulation Supporting Certification
Authored By Hervandil Sant'Anna (Petrobras) Carlos Eduardo Simoes Gomes (Petrobras)
|
Safety of AI Systems in Modeling and Simulation
Authored & Presented By Young Lee (UL Solutions)
The integration of artificial intelligence (AI) into modeling and simulation systems has significantly expanded their capabilities, enabling improved accuracy, adaptability, and efficiency. These systems are increasingly applied in high-stakes domains, including aerospace, healthcare, and industrial processes, where failure can have severe consequences. While AI-powered modeling and simulation systems offer remarkable opportunities, they also introduce unique safety risks, such as model instability, data biases, and unpredictable behaviors. Addressing these challenges is critical to ensuring the reliability and acceptance of these technologies in safety-critical applications.
This paper specifies safety requirements and provides guidelines for AI-based modeling and simulation systems, focusing on key safety principles: robustness, reliability, quality management, transparency, explainability, data privacy, data management, and lifecycle management. These principles form a comprehensive framework for mitigating risks and fostering trust in AI systems.
Robustness and reliability are foundational to AI safety, ensuring that systems function consistently under both expected and unexpected conditions, producing accurate and dependable results over time. Quality management underpins these principles, emphasizing structured development processes and rigorous testing to minimize systematic errors and ensure adherence to functional requirements.
Transparency and explainability address the need to understand how AI systems make decisions and why specific outputs are produced. These attributes are pivotal for building trust among stakeholders, enabling designers, developers, regulators, and end-users to scrutinize and confidently engage with AI systems.
Data privacy ensures the responsible collection, storage, use, and sharing of personal information, aligning with regulatory requirements and safeguarding individual and organizational data. Effective data management ensures the secure handling of input and output data while fostering compliance with ethical and regulatory standards. Lastly, lifecycle management maintains the safety, reliability, and compliance of AI models throughout their operational lifespan, adapting to technological, regulatory, and user needs.
By integrating these principles, this framework provides a pathway for developing AI-based modeling and simulation systems that are not only innovative but also safe, reliable, and trustworthy. This paper seeks to engage the modeling and simulation community in adopting structured approaches to AI safety, bridging the gap between technological advancements and safety-critical applications.
Engineering Data Science, Simulation Governance
|
Cloud-Enabled Generative AI for Preliminary Engineering Design
Presented By Nima Ameri (Rolls-Royce)
Presented in this paper is the approach taken to democratise the adoption of conditional Generative Adversarial Networks (cGAN) on the cloud for preliminary engineering design applications. This work addresses a number of challenges associated with applying cGAN networks to engineering applications and explores them through the illustration of various use cases. In contrast to other applications, accuracy from synthetic data plays a crucial role within an engineering context; this emphasis on accuracy puts increased attention on the correct execution of each step involved in the process of training such models, such as data preparation and architecture configuration. Furthermore, a range of additional non-technical considerations highlight the cloud as the best-suited solution to access higher and scalable computational resources as well as specialised COTS technologies. To address this, Rolls-Royce has partnered with Databricks, leveraging its Data Intelligence Platform from the Rolls-Royce Data Science Environment (DSE) hosted on Microsoft Azure. Rolls-Royce's DSE is a highly integrated platform of world-leading tools and technology which enables users to develop and deploy analytics, data science and machine learning in a secure and scalable manner within the company's strategic digital environment, while allowing access to third parties, including academic partners and suppliers, in a safe and controlled manner. The adoption of cloud technologies was aimed at achieving a significant reduction in runtime, with a target factor of 30 when compared to the equivalent on-prem run. A further goal was the development of a generalised framework for identifying the optimal network architecture and hyperparameters for a given use case. This work will demonstrate a solution to this goal by leveraging and combining a number of technologies, including the Ray package for hyperparameter and architecture optimisation, and MLflow for managing the GAN model lifecycle and experiment tracking. Particular attention was also given to the management and governance of the engineering data, which comprised a combination of images, tabular data and metadata produced by dedicated physics-based engineering software. To this end, data was imported and converted into the industry-standard "delta format", which is optimal for cloud and distributed computation. Finally, the data governance framework was provided by Databricks' Unity Catalog, which establishes a crucial framework for compliance-centric industries such as aerospace.
The effectiveness of the approach is demonstrated with engineering use cases of growing complexity.
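A condensed sketch of the described tooling, Ray Tune for hyperparameter/architecture search with MLflow tracking, is shown below; train_cgan() is a hypothetical stand-in for the real trainer, and the search space is invented:

```python
# Hedged sketch: Ray Tune drives the search; each trial logs to MLflow.
import mlflow
from ray import tune

def objective(config):
    # train_cgan() is a hypothetical stand-in for the actual training routine
    score = train_cgan(lr=config["lr"], layers=config["layers"])
    mlflow.log_params(config)
    mlflow.log_metric("val_score", score)
    return {"val_score": score}   # reported back to Ray Tune

tuner = tune.Tuner(
    objective,
    param_space={
        "lr": tune.loguniform(1e-5, 1e-2),
        "layers": tune.choice([2, 3, 4]),
    },
    tune_config=tune.TuneConfig(metric="val_score", mode="max", num_samples=20),
)
results = tuner.fit()
print(results.get_best_result().config)
```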
Aerospace, CAE in the design process, Cloud Computing, Democratisation, Engineering Data Science
Authored By Nima Ameri (Rolls-Royce) Shiva Babu (Rolls-Royce PLC) Marco Nunez (Rolls-Royce PLC) Yashwant Gurbani (Rolls-Royce PLC)
|
Study on CAE Techniques for Deriving Single Component Durability Test Specification of Automotive Suspension Component
Presented By Kyung hoon Jung (HYUNDAI MOTORS COMPANY)
The automotive industry is experiencing rapid transformation, leading to increased demand for virtual development and enhanced component-level durability testing. While traditional full-vehicle Belgian road tests have been effective for overall durability verification, they present significant challenges, including high costs, lengthy preparation times, and difficulties in conducting urgent verification tests for improved components. Additionally, the current collaboration between CAE and testing teams, though valuable, often relies on qualitative judgements and requires extensive labor to identify durability failure modes.
This paper introduces an innovative methodology for efficiently deriving durability test conditions for vulnerable components identified during full-vehicle durability tests. The proposed approach consists of four key stages: (1) identification of main load points through Miner's rule-based damage analysis, (2) measurement of virtual strain under full-vehicle conditions, (3) derivation of component-level test load histories through Load Reconstruction, and (4) load normalization using Potential Damage Intensity (PDI) based on idealized S-N curves and equivalent load analysis.
A significant innovation in this methodology is the application of a linear inverse matrix relating the virtual strains of the full-vehicle model to those of the jig test analysis model. Unlike previous studies that focused on identical boundary conditions, our approach addresses the challenges of differing boundary conditions between full-vehicle and component-level testing.
The methodology incorporates Load Reconstruction analysis with Load Transducer functionality to minimize variance in complex geometries. The process concludes with the extraction of main load vector components using PDI analysis, enabling the conversion of multiple-axis random loads into practical uniaxial test conditions. To ensure reliability, we implement a verification process comparing damage hotspot orders between full-vehicle and component-level conditions.
For practical implementation, we utilize the S-N method and relative damage concepts to calculate equivalent sinusoidal loads, with scale factors determined based on material fatigue properties. The target testing regime encompasses approximately 200,000 cycles, optimized for actual test actuator specifications. This comprehensive approach significantly enhances the efficiency and accuracy of component-level durability testing while reducing development costs and time compared to traditional methods.
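The final normalization step can be illustrated with a worked sketch: collapsing a multi-level load history into one equivalent sinusoidal amplitude at the target cycle count via relative Miner damage on an idealized S-N slope (all numbers below are invented, not Hyundai data):

```python
# Relative-damage equivalent load: L_eq = (sum(n_i * L_i^k) / N_target)^(1/k)
import numpy as np

k        = 5.0                                  # idealized S-N slope exponent
cycles   = np.array([5_000, 60_000, 400_000])   # rainflow-counted cycles n_i
loads    = np.array([8.0,   5.0,    2.5])       # load amplitudes L_i [kN]
n_target = 200_000                              # actuator-friendly test length

l_eq = (np.sum(cycles * loads**k) / n_target) ** (1.0 / k)
print(f"equivalent amplitude for {n_target} cycles: {l_eq:.2f} kN")  # ~4.5 kN
```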
The methodology has been successfully validated through testing on multiple chassis systems, demonstrating its effectiveness in reproducing critical durability characteristics while maintaining the accuracy of the full-vehicle testing approach. This innovative process represents a significant advancement in automotive durability testing, offering a more systematic and efficient approach to component-level verification.
Automotive, Failure, Fatigue, Integration of Analysis & Test, Simulation Supporting Certification
Authored By Kyung hoon Jung (HYUNDAI MOTORS COMPANY) Hoo Gwang Lee (Hyundai Motors Company) Hong Ju Park (Hyundai Motors Company)
|
From The Paris Law To The ‘Total-Life’ Method: An Extensive Review Of Fatigue Crack Growth Laws And Models
Presented By Andrew Halfpenny (HBK UK Ltd.)
Fatigue is the predominant cause of structural failure under cyclic loading conditions. Fatigue failure typically involves two main stages: an initial phase where one or more cracks form (crack initiation stage), followed by a phase where these cracks, if subject to sufficiently high cyclic stress, grow until failure (crack propagation stage). The relative duration of these stages varies based on factors such as material properties, structural design, and application. Furthermore, in some cases, a crack may extend into a low-stress region, halting its progression and preventing failure. In such scenarios, the crack may be considered acceptable in-service, as it does not compromise the component's durability (damage tolerance approach).
The term 'fatigue crack growth' refers to the propagation (or not) of cracks under cyclic loading. Since the 1950s, extensive research has focused on understanding and characterizing crack propagation under cyclic loading. This includes defining threshold, propagation, and fast fracture regions from both experimental and numerical perspectives, as well as accounting for mean stress effects and crack retardation. Unfortunately, this research is dispersed across numerous scientific publications. Furthermore, common simulation methods often focus on either the initiation or propagation stage, which can lead to inaccurate fatigue life predictions when both stages are significant. This issue is particularly relevant for welded structures, lightweight jointed structures, and lightweight cast components, which are increasingly important for more environmentally sustainable transportation solutions.
The aim of this work is to enable a more efficient review and comparison of available crack growth analysis tools to support informed decision-making, by collecting the most relevant fatigue crack growth laws and models into a single document. Additionally, this work introduces a unified fatigue life estimation approach, called the 'Total-Life' method, that integrates both the initiation and propagation stages, by combining principles from strain-life and fracture mechanics, and a state-of-the-art multiaxial crack-tip plasticity model to account for mean-stress and overload retardation effects.
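By way of illustration of the propagation stage reviewed here, the following sketch numerically integrates the basic Paris law da/dN = C*(dK)^m with dK = Y*dsigma*sqrt(pi*a); the material constants are generic placeholders, not values from the paper.

```python
import numpy as np

def paris_life(a0, ac, dsigma, C, m, Y=1.0, n_steps=20000):
    """Cycles to grow a crack from a0 to ac under the Paris law.

    da/dN = C * dK**m with dK = Y * dsigma * sqrt(pi * a);
    integrated by trapezoidal quadrature of dN = da / (C * dK**m).
    """
    a = np.linspace(a0, ac, n_steps)              # crack length grid, m
    dK = Y * dsigma * np.sqrt(np.pi * a)          # SIF range, MPa*sqrt(m)
    dN_da = 1.0 / (C * dK ** m)                   # cycles per unit growth
    return float(np.sum(0.5 * (dN_da[1:] + dN_da[:-1]) * np.diff(a)))

# Generic steel-like constants (illustrative only):
# C in (m/cycle)/(MPa*sqrt(m))**m, stress range in MPa, lengths in m.
N = paris_life(a0=1e-3, ac=20e-3, dsigma=100.0, C=1e-11, m=3.0)
print(f"Propagation life: {N:.3e} cycles")
```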
Computational Structural Mechanics, Fatigue, Fracture
Authored By Andrew Halfpenny (HBK UK Ltd.) Cristian Bagni (Hottinger Bruel & Kjaer) Stephan Vervoort (Hottinger Bruel & Kjaer) Amaury Chabod (Hottinger Bruel & Kjaer)
|
Vibro-acoustic Simulation of Impulsive Feedback from Computer Mice Microswitches
Presented By Luca Francesconi (Logitech Europe S.A.)
Simulating acoustic emissions from impulsive and transient dynamic phenomena in small electro-mechanical components, such as those commonly found in consumer electronics, remains both novel and challenging. Some of these components act as direct Human-Machine Interfaces (HMIs) between users and the devices being operated. One such application is the microswitch embedded in computer mice. Besides the functional operation of the device, microswitches also double as the primary source of both tactile and acoustic feedback to the user upon clicking the mouse keys. The afforded feedback is a complex array of multimodal sensorial cues and includes fast transient events such as impulsive phenomena. From the component level to the integration at system architecture, both the acoustic emissions and the mechanical behavior of such components constitute the main source of User Experience (UX) for mouse clicking. Predicting, through simulations, the vibro-acoustic performance of microswitches and their integration in the product, in this case a computer mouse, can enable design practices that deliver better experiences, including addressing potential sound quality issues at earlier stages of product development.
Recent advancements in simulation software and improved computational resources open up the possibility of modeling increasingly complex vibro-acoustic phenomena. The goal of this research is to understand the current capabilities of simulation software to accurately predict such phenomena. This work modeled the full vibro-acoustic response of a microswitch at the component level, covering the complete operational cycle, namely the closure (push) and opening (release) switch events.
This paper reports the simulation methodology adopted, from the structural and transient analysis to the acoustic radiation emerging from the component. Structural simulations involved driving pre-stressed Finite Element (FE) models with adequate, experimentally known input forces. A time-domain explicit FE simulation modeled the rapid displacement and buckling of the internal components over the full operational cycle. This model was experimentally validated with high-speed footage and position tracking of the moving switch elements under real operating conditions. The analysis further explores the model's vibration response across a range of frequencies meaningful to human hearing. Derived from these structural vibrations, sound is generated by the rapid displacement of the fluid (air) surrounding the structure. The acoustic propagation is thus simulated by modeling both the internal cavities of the switch and the air volume surrounding its casing. To make more efficient use of computational resources, a hybrid mesh was adopted, combining FE and Boundary Element Methods (BEM). Experimental audio recordings of switch samples' emissions were also used to compare and validate the model. It was found that fine-tuning simulation model parameters such as damping and material properties is essential to accurately reflect the physical behavior. This includes sound quality metrics in both time and frequency domains as well as auralizations. The output results from the simulation can match both the spectral and time-domain characteristics of real audio within a standard measurement error.
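As one hedged illustration of the audio validation step, the following sketch compares one-sided magnitude spectra of a simulated and a "measured" click transient; both signals here are synthetic stand-ins, not the paper's data.

```python
import numpy as np

def magnitude_spectrum_db(signal, fs):
    """One-sided magnitude spectrum in dB for a transient pressure signal."""
    window = np.hanning(len(signal))              # reduce spectral leakage
    spectrum = np.fft.rfft(signal * window)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mag_db = 20.0 * np.log10(np.abs(spectrum) + 1e-12)
    return freqs, mag_db

fs = 48_000                                       # audio sampling rate, Hz
t = np.arange(0, 0.02, 1.0 / fs)
# Illustrative stand-ins for simulated and recorded click transients.
simulated = np.exp(-t / 2e-3) * np.sin(2 * np.pi * 4000 * t)
measured = simulated + 0.05 * np.random.default_rng(1).normal(size=t.size)

f, sim_db = magnitude_spectrum_db(simulated, fs)
_, meas_db = magnitude_spectrum_db(measured, fs)
print(f"mean spectral deviation: {np.mean(np.abs(sim_db - meas_db)):.2f} dB")
```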
This study validated the simulation methods adopted and their results, and proposes a methodology for simulating complex vibro-acoustic phenomena in similar and other applications. Overall, this paper also provides a state-of-the-art perspective on the vibro-acoustic simulation capabilities currently available to academia and industry.
Acoustics, Dynamics & Vibration, NVH (Noise Vibration and Harshness)
Authored By Luca Francesconi (Logitech Europe S.A.) Nuno Valverde (Logitech Europe SA) Sterling McBride (Dassault Systèmes)
|
High Voltage Circuit Breaker Design with Multi-Objective Optimization Algorithms
Presented By Wilhelm Thunberg (Hitachi Energy)
This paper addresses the application of modern optimization software in the dielectric design development of high voltage circuit breakers (HVCB) and shows how coupling different simulation types can create more efficient workflows. Given the push to replace the circuit breaker insulation gas SF6 with more eco-efficient solutions, there is a need for high-pace development and innovation. This necessitates new methods for HVCB development that enable the rapid finding of an optimal design given a large set of parameters and competing objectives. This multi-objective nature can be related to the varying conditions the HVCB must handle, or to different physical properties, such as mechanical and dielectric. A common challenge is to balance the different objectives and to understand all the inherent trade-offs in the design.
The main purpose of this paper is to show a dielectric simulation optimization using the MOGA-II algorithm and compare the workflow to more traditional ones, such as a full factorial search. The comparison criteria include the time required to achieve the optimal design, the dielectric robustness of the 'best' found design, and the ability to effectively evaluate the compromise between competing objectives.
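MOGA-II itself is an algorithm of the commercial optimization tool; as a minimal illustration of how competing objectives are compared, the following sketch extracts the non-dominated (Pareto) designs from a set of evaluated candidates. The objective names and values are invented for illustration.

```python
import numpy as np

def pareto_front(objectives):
    """Indices of non-dominated points, assuming all objectives are minimized."""
    pts = np.asarray(objectives, dtype=float)
    keep = []
    for i, p in enumerate(pts):
        # p is dominated if some other point is <= in all objectives
        # and strictly < in at least one.
        dominated = np.any(np.all(pts <= p, axis=1) & np.any(pts < p, axis=1))
        if not dominated:
            keep.append(i)
    return keep

# Illustrative designs: (dielectric stress margin deficit, mechanical mass).
designs = [(0.2, 5.1), (0.4, 4.0), (0.1, 6.5), (0.3, 4.8), (0.5, 5.0)]
print("Pareto-optimal designs:", pareto_front(designs))
```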
In addition to the dielectric optimization, a new approach for coupling this workflow to optimization of mechanical properties of the HVCB is shown. This paper also details a new Python-based approach that reduces runtime by keeping simulation software clients active during large optimization runs.
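The paper's Python approach is not published in detail; the following is a generic sketch of the keep-the-client-alive pattern it describes, with a hypothetical `SolverClient` standing in for the real simulation API. The point of the design is that the expensive session start-up is paid once, not once per design evaluation.

```python
# Minimal sketch of the "keep the client alive" pattern: the expensive
# solver session is started once and reused for every design evaluation,
# instead of paying the start-up cost inside the optimization loop.
# `SolverClient` is a hypothetical stand-in for a real simulation API.

class SolverClient:
    def __init__(self):
        print("starting solver session (expensive, done once)")

    def evaluate(self, params):
        # Stand-in for submitting a parameterized model and reading results.
        return sum(v * v for v in params.values())

class PersistentEvaluator:
    """Caches one live solver session across an entire optimization run."""
    def __init__(self):
        self._client = None

    def __call__(self, params):
        if self._client is None:          # lazy start, then reuse
            self._client = SolverClient()
        return self._client.evaluate(params)

evaluate = PersistentEvaluator()
for step in range(3):                     # stands in for the MOGA-II loop
    design = {"gap": 1.0 + 0.1 * step, "radius": 2.0 - 0.05 * step}
    print(step, evaluate(design))
```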
Initial findings indicate that the application of optimization algorithms like the MOGA-II gives a quicker route to an optimized design, while also enabling coupling of different optimization categories. As a result, new insights into the inherent objective trade-offs caused by the multi-objective nature in HVCB design can be found. These advancements have the potential to streamline the design process and can contribute to the development of more sustainable and efficient products.
Computational Electromagnetics, Electronics, Optimisation
Authored By Wilhelm Thunberg (Hitachi Energy) Sami Kotilainen (Hitachi Energy)
|
The Path to Virtual Product V&V: Uncertainty Quantification of Test and Simulation Results
Authored & Presented By Frank Günther (Knorr-Bremse)
Traditional Computer Aided Engineering emphasizes the use of simulation as a preparatory activity before verifying and validating a product through hardware testing. The main benefit of simulation is to speed up product development in a 'first time right' paradigm where a hardware-driven product V&V phase is expected to confirm what is already known through computer simulation.
In many industries, for example Railway and Automotive, this hardware-driven product V&V phase constitutes a major share of the overall product development effort, as safety requirements are very high and other phases of product development have been streamlined and optimized using simulation.
In other industries, for example Aerospace and Nuclear, cost-prohibitive and, in some cases, impractical hardware testing has already led to a large share of well-established virtual product V&V procedures.
More and more, the Automotive and Railway industries desire to establish strategies for Virtual Product V&V as well. The task is to define virtual V&V processes that provide at least the same level of assurance and certainty as the established hardware driven processes.
For this, it is necessary to quantify the uncertainty of simulation results and compare it to the acceptable, but usually unknown, uncertainty of established hardware-based V&V procedures. Perhaps quantifying the certainty or assurance of a V&V procedure would be more to the point, but we use the established term 'Uncertainty Quantification (UQ)'.
We will present several application examples that adhere to the following pattern (a minimal numerical sketch of the underlying comparison follows the list):
1) Quantify the uncertainty of an established, hardware-based V&V process
2) Validate the simulation model
3) Quantify the uncertainty of the validated simulation model
4) Propose a virtual product V&V process with equivalent uncertainty
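As referenced above, a minimal numerical sketch of steps 1 and 3-4, comparing 95% scatter intervals of hardware tests and model predictions under an assumed normal distribution; the data are synthetic, and a real application would propagate input uncertainties through the validated model (for example by Monte Carlo sampling).

```python
import numpy as np

def interval95(samples):
    """Two-sided 95% interval half-width, assuming roughly normal scatter."""
    return 1.96 * np.std(samples, ddof=1)

rng = np.random.default_rng(42)
# Illustrative stand-ins: repeated hardware tests vs. model predictions
# propagated through input uncertainties.
hardware = rng.normal(loc=100.0, scale=4.0, size=30)
simulation = rng.normal(loc=100.5, scale=3.0, size=1000)

u_test, u_sim = interval95(hardware), interval95(simulation)
print(f"test uncertainty: +/-{u_test:.1f}, simulation: +/-{u_sim:.1f}")
# Equivalent-uncertainty criterion: the virtual process must not be
# more uncertain than the established hardware-based one.
print("virtual V&V acceptable:", u_sim <= u_test)
```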
It is important to note that, due to the need to validate the simulation models, hardware testing still plays an important role in product V&V. However, the use of simulation enables more efficient and flexible use of hardware testing, resulting in faster, more efficient product V&V.
While we acknowledge that there is still a long way to go before simulation is fully leveraged in product V&V, we hope to provide some useful ideas and guidance to those who wish to establish a strategy for Virtual Product V&V in their field.
Stochastics, Uncertainty Quantification, V&V (Verification and Validation)
|
Stress Concepts for Weld Verification and Approaches to Automation
Authored & Presented By Tim Kirchhoff (ihf Ingenieurgesellschaft)
In mechanically stressed components, weld seams are usually the weak points critical to failure, especially under alternating loads. The necessary strength verification is carried out according to rules such as the FKM guideline, the IIW recommendations or the Eurocode.
The evaluation of weld seams is a challenge, even in modern simulation-driven product development. Due to the special properties of the weld seams (e.g. sharp and quite irregular notches), the stresses for the verification must be determined using one of the concepts developed for this purpose, for example as nominal, structural or notch stress.
With the structural stress concept, the seam is represented in a simplified manner in the FE model, and the stress for the verification is extrapolated from the surface stresses in front of the weld toe. The Hot-Spot concept from the IIW recommendations is widely used for this purpose and is also referenced in other guidelines.
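For concreteness, the IIW surface extrapolation for a fine mesh is often written as sigma_hs = 1.67*sigma(0.4t) - 0.67*sigma(1.0t), with read-out points at 0.4 and 1.0 times the plate thickness t in front of the weld toe. A minimal sketch follows; note this is only one of several read-out-point variants in the recommendations.

```python
def hot_spot_stress(sigma_04t, sigma_10t):
    """IIW linear surface extrapolation (type 'a' hot spot, fine mesh):
    read-out points at 0.4*t and 1.0*t in front of the weld toe."""
    return 1.67 * sigma_04t - 0.67 * sigma_10t

# Surface stresses (MPa) read from the FE model at the two points.
print(f"hot-spot stress: {hot_spot_stress(210.0, 180.0):.1f} MPa")
```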
With the notch stress concept, a fictitious notch radius is introduced in the simulation model at the weld toe and at the weld root. The stress for the verification can then be determined directly in the notch radius.
The notch stress concept therefore requires a comparatively high modeling effort but is also suitable for the verification of complex weld seam situations that cannot be evaluated using other methods.
The areas of application of the different stress concepts are presented and advantages or disadvantages of the individual concepts are discussed.
A practical challenge in the verification of weld seams is also the selection of the critical point, especially when the component is subject to multiple alternating loads. This can be alleviated by an automated calculation of all verification points along a weld seam.
For this purpose, approaches to automating the stress determination according to the structural or notch stress concept, implemented by ihf, are presented. It is shown how the stress components, which depend on the local weld seam direction, can be determined according to the requirements of the different guidelines.
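A generic sketch of such an automation step, not ihf's implementation: at each verification point along a discretized seam, the in-plane stress tensor is rotated into the local weld coordinate system and the normal component perpendicular to the seam is extracted. The point data are invented for illustration.

```python
import numpy as np

def stress_normal_to_seam(sxx, syy, sxy, seam_angle_rad):
    """Normal stress perpendicular to the local weld seam direction.

    Rotates the in-plane stress tensor so the evaluation axis is normal
    to the seam; guidelines typically assess this component (plus the
    shear parallel to the seam).
    """
    phi = seam_angle_rad + np.pi / 2.0            # direction normal to seam
    c, s = np.cos(phi), np.sin(phi)
    return sxx * c**2 + syy * s**2 + 2.0 * sxy * c * s

# Evaluate every verification point along a discretized seam path:
# (sxx, syy, sxy in MPa, local seam angle in rad).
points = [
    (120.0, 40.0, 15.0, 0.00),
    (110.0, 55.0, 10.0, 0.35),
    ( 95.0, 60.0,  5.0, 0.70),
]
for i, (sxx, syy, sxy, ang) in enumerate(points):
    print(i, f"{stress_normal_to_seam(sxx, syy, sxy, ang):.1f} MPa")
```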
Computational Structural Mechanics, Fatigue, Welding
|
Systems Simulation For Fusion Using Novel Augmented CMS Reduction Techniques
Authored & Presented By Tom Deighan (UK Atomic Energy Authority)
Development of commercially viable fusion power will rely heavily on the deployment and targeted development of appropriate simulation methods for use across the design lifecycle and extended use into operations. Tools and methods of an appropriate fidelity are needed to efficiently explore the vast array of potential concepts at an early stage. Whole-system x-in-the-loop virtual operations are needed for upfront and integrated development of control systems, facility HMI design, operation planning and operator training. This is particularly pertinent for Fusion, where doing so through physical prototype systems is either not possible or has prohibitive costs and timescales. Additionally, successful lifetime monitoring and predictive maintenance of fusion components through diagnostic measurements will be limited due to restricted accessibility and operation in a harsh environment. This provides a use case for a digital twin, which can combine data from the physical instrumentation with simulation to provide enhanced augmented diagnostics in 'real-time'. For all such simulations to be valuable in assessing design or operational risk, they need to be performed probabilistically, to quantify the uncertainty in the predictions in a formal reliability analysis.
Systems simulation and related novel reduced order modelling techniques provide a realistic approach to achieving these aims. Although constantly advancing computational capability opens the door for larger and more complex simulations, the environmental and financial cost must be considered and does not override the principle of 'appropriate fidelity' and the development of efficient techniques, even if only to enable more valuable deployment of computational resources for efficient UQ.
This paper presents developments of a novel full-field reduced order modelling technique using an augmented Component Mode Synthesis (CMS) reduction and modal coupling method, describing the reduction process and its implementation in the Modelica language. The approach enables efficient simulation of coupled fluid-thermo-mechanical models of complex components within a systems environment, capturing aspects of non-linear behaviour. This is demonstrated through application to a coupled fluid-thermal-structural simulation of a Fusion power plant plasma-facing component, discussing the advantages and limitations of the approach. Finally, plans for further development of these methods and their application to the simulation of Fusion systems and in wider industry are discussed in the context of moving towards the realisation of a probabilistic real-time digital twin.
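The augmented CMS formulation is the paper's own contribution; as a hedged baseline, the following sketch shows the classical Craig-Bampton reduction it builds on, applied to a tiny spring-mass chain. Matrix sizes and constants are illustrative.

```python
import numpy as np
from scipy.linalg import eigh

def craig_bampton(K, M, boundary, n_modes):
    """Classical Craig-Bampton reduction: static constraint modes for the
    boundary DOFs plus a truncated set of fixed-interface normal modes."""
    n = K.shape[0]
    interior = [j for j in range(n) if j not in boundary]
    ii = np.ix_(interior, interior)
    ib = np.ix_(interior, boundary)

    psi = -np.linalg.solve(K[ii], K[ib])          # static constraint modes
    _, phi = eigh(K[ii], M[ii])                   # fixed-interface modes
    phi = phi[:, :n_modes]                        # keep the lowest n_modes

    # Transformation T: full DOFs -> (boundary, modal) coordinates.
    nb = len(boundary)
    T = np.zeros((n, nb + n_modes))
    T[boundary, :nb] = np.eye(nb)
    T[interior, :nb] = psi
    T[interior, nb:] = phi

    return T.T @ K @ T, T.T @ M @ T, T            # reduced K, M, and T

# Tiny 4-DOF spring-mass chain, DOF 0 retained as the boundary/interface.
k = 1.0e4
K = k * np.array([[ 2, -1,  0,  0],
                  [-1,  2, -1,  0],
                  [ 0, -1,  2, -1],
                  [ 0,  0, -1,  1.0]])
M = np.eye(4)
Kr, Mr, T = craig_bampton(K, M, boundary=[0], n_modes=2)
print("reduced size:", Kr.shape)
```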
Digital Twins, Reduced Order Modelling, System-Level Simulation
|
Development of Computational Fluid Dynamics Methodology for Estimating Atmospheric Emissions from Flares
Presented By Mauro Dolinsky (Petroleo Brasileiro S/A - PETROBRAS)
Due to the need for monitoring and estimating atmospheric emissions in industrial environments, there has been a growing interest in using tools to measure or predict greenhouse gases that may be generated in industrial units. In this context, the use of computational fluid dynamics (CFD) for estimating atmospheric emissions in flare systems has become an increasingly important tool in the industry. This modeling technique allows for a detailed and accurate analysis of the complex processes that occur during gas combustion in flares and the influence of process conditions such as flow rate and composition, as well as environmental factors like wind direction, temperature, and speed.
Modeling turbulent combustion and thermal radiation can be complex and computationally intensive, and validation with experimental data is necessary to ensure the reliability of the adopted models.
The use of computational fluid dynamics for estimating emissions in flares provides more accurate results compared to traditional empirical methods, as it considers a wide range of variables and complex interactions. It allows three-dimensional visualization of gas flows, combustion patterns, and pollutant dispersion, aiding in the understanding of the phenomena involved. CFD enables testing and optimizing different flare configurations, improving combustion efficiency and reducing emissions. It facilitates the evaluation of various operational and environmental conditions, allowing better preparation for different scenarios, and assists in complying with environmental regulations by providing reliable emission estimates.
For the development of the methodology, two commercial computational fluid dynamics software packages were evaluated, along with different combustion and radiation models and mechanisms. The simulation results were compared to an industrial-scale experimental test from a flare supplier with an approximate capacity of 2,161,000 standard cubic meters per day.
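For scale, a back-of-envelope estimate of the kind of empirical figure the CFD methodology refines, assuming the quoted capacity is pure methane; the density and combustion (destruction) efficiency values are illustrative assumptions, not data from the test.

```python
# Back-of-envelope CO2 estimate for a flared gas stream, of the kind the
# CFD methodology is meant to refine. All constants are illustrative
# assumptions (pure methane, ~15 degC standard conditions, 98% efficiency).
FLOW_SM3_PER_DAY = 2_161_000       # quoted test capacity, Sm3/day
RHO_CH4 = 0.68                     # kg/Sm3, methane at ~15 degC, 1 atm
CO2_PER_CH4 = 44.0 / 16.0          # stoichiometric mass ratio CO2/CH4
EFFICIENCY = 0.98                  # assumed combustion efficiency

ch4_kg_day = FLOW_SM3_PER_DAY * RHO_CH4
co2_t_day = ch4_kg_day * EFFICIENCY * CO2_PER_CH4 / 1000.0
ch4_slip_t_day = ch4_kg_day * (1.0 - EFFICIENCY) / 1000.0
print(f"CO2: ~{co2_t_day:,.0f} t/day, unburned CH4: ~{ch4_slip_t_day:,.0f} t/day")
```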
Computational fluid dynamics offers a powerful and versatile approach for estimating atmospheric emissions in flare systems. Its use can lead to a better understanding and control of emissions, contributing to more sustainable and environmentally responsible operations in the industry.
Furthermore, the application of CFD in flare emission estimation has several additional benefits. It can help in the design of more efficient flare systems, reducing fuel consumption and minimizing environmental impact. The technology also allows for the assessment of potential hazards and safety concerns related to flare operations, such as thermal radiation and noise levels.
By providing detailed insights into the combustion process and emission dispersion, CFD can support decision-making in regulatory compliance and environmental impact assessments. It can also be used to optimize flare placement within industrial facilities, considering factors like nearby structures and prevailing wind patterns.
As computational power continues to increase and CFD models become more sophisticated, the accuracy and applicability of these simulations are expected to improve further. This will likely lead to even more widespread adoption of CFD technology in the oil and gas industry, as well as other sectors dealing with atmospheric emissions from combustion processes.
Computational Fluid Dynamics, Sustainability
Authored By Mauro Dolinsky (Petroleo Brasileiro S/A - PETROBRAS) Gustavo Schiavone (Petroleo Brasileiro S/A - PETROBRAS)
|
Enhancing Collaboration on Multibody Dynamics Simulations with Simulation Data Management
Presented By Marko Thiele (Scale)
Today, virtual product development is essential in vehicle projects in large automotive groups in order to keep costs down. As part of the CAE development process, the vehicle's behavior in operational mode must be investigated. It has become common practice to employ multibody dynamics simulations for this task, where the vehicle is treated as composed of various rigid or elastic bodies that can undergo translational or rotational displacement. Multibody dynamics simulation experts then first build up a virtual vehicle model and subsequently let it drive certain maneuvers on various kinds of standardized roads under different driving conditions that apply for all projects.
Managing, sharing and collaborating on the related CAE data, CAE methods and CAE processes across a larger number of simulation engineers can, however, become a challenge. It is therefore important to establish means of working together in teams. For this it makes sense to organize simulation data in such a way that common CAE data files, as well as certain process-scripts or simulation-methods in general, can be shared with the whole team. To this end, libraries of common CAE files - such as templates, technical components or connectors that define the vehicle, or the roads and driving conditions that are applied to the vehicle - and all types of process-scripts are created and maintained by a small number of experts. They can then be used by all CAE users working on the different car projects. To ensure that this also works regardless of the location of the CAE engineers, special tools are required to exchange data. For certain disciplines of the virtual product development process, e.g. crash simulations, simulation data management was successfully introduced many years ago and has been in use ever since. Other simulation domains, such as multibody dynamics simulations, face different kinds of challenges.
In this presentation, we want to demonstrate how to foster collaboration between multibody dynamics simulation experts using simulation data management tools. We will address the virtual product development process, from vehicle model building to running the simulations and the integration with the preprocessor, and work out concepts that improve effectiveness as well as consistency for those engaged in the workflow.
In particular, we will focus on the challenges we face when introducing a simulation data management system for multibody dynamics simulations: since a multibody dynamics model is naturally made up of many individual parts, each stored in its own file, the data structure to include in the simulation is extensive and rather complex. In addition, the model has to be combined with a certain road, driving condition and maneuver. An in-depth reproduction of that complex structure in the simulation data management system is nevertheless key to making full use of the system's advantages. However, usability and maintainability must not be sacrificed to this complexity, so integrating the preprocessor for easily managing the model is also an important aspect to take care of.
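As a hedged illustration of such a mapping, the following sketch models parts-in-files, road, maneuver and driving condition as versioned data objects; all class and field names are invented for this sketch and do not reflect any particular SDM product's schema.

```python
from dataclasses import dataclass

# Illustrative mapping of an MBD simulation run into data-management
# objects; names are invented, not a real SDM schema.

@dataclass(frozen=True)
class PartFile:
    path: str          # each body/connector lives in its own file
    revision: str      # version tracked by the SDM system

@dataclass(frozen=True)
class VehicleModel:
    name: str
    parts: tuple       # tuple of PartFile

@dataclass(frozen=True)
class LoadCase:
    road: str
    maneuver: str
    driving_condition: str

@dataclass(frozen=True)
class SimulationRun:
    model: VehicleModel
    load_case: LoadCase

model = VehicleModel("sedan_v12", (
    PartFile("bodies/front_axle.mbd", "r4"),
    PartFile("connectors/bushing_fl.mbd", "r2"),
))
run = SimulationRun(model, LoadCase("rough_road_A", "lane_change", "laden"))
print(run.load_case.maneuver, "on", run.load_case.road)
```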
In our presentation, with an example integration of a multibody dynamics simulation workflow, we will show how to achieve a detailed mapping with the simulation data management system. At the same time, we will demonstrate how we manage to foster interaction with the system among the simulation engineers and thus enhance collaboration.
Multibody Dynamics, Simulation Data Management
Authored By Marko Thiele (Scale) Kim Schaebe (SCALE GmbH)
|