OptiY

OptiY
Developer(s): OptiY GmbH
Operating system: Windows
Type: Technical computing
License: Proprietary
Website: www.optiy.eu

OptiY is a design environment that provides modern optimization strategies and state-of-the-art probabilistic algorithms for uncertainty, reliability, robustness and sensitivity analysis, as well as data mining and meta-modeling.

Features

OptiY is a multidisciplinary design environment that provides direct and generic interfaces to many CAD/CAE systems and in-house codes. In addition, a COM interface and a user node with predefined templates are available, so that users can integrate external programs themselves. Any system can be inserted into an arbitrary process chain using the graphical workflow editor. Different simulation model classes can be coupled in this way, such as networks, finite element models, multibody systems and material test benches.

Data mining

Data mining is the process of extracting hidden patterns from data. It identifies trends within data that go beyond simple data analysis. Through the use of sophisticated algorithms, non-statistician users can identify the key attributes of processes and target opportunities. Data mining is becoming an increasingly important tool for transforming raw data into information. It is commonly used in a wide range of applications such as manufacturing, marketing, fraud detection and scientific discovery.

Sensitivity analysis

Local sensitivity measures, such as correlation coefficients and partial derivatives, can only be used if the relationship between input and output is linear. If the relationship is nonlinear, global sensitivity analysis has to be used; it is based on the variance relationship between the input and output distributions, expressed for example by the Sobol index. With sensitivity analysis, the system complexity can be reduced and the cause-and-effect chain can be explained.[1][2]
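
As a concrete illustration of a variance-based measure, the following Python sketch estimates first-order Sobol indices for a toy nonlinear model using the pick-freeze Monte Carlo scheme. The model function, input ranges and sample size are illustrative assumptions and not part of OptiY.

    # Minimal sketch: first-order Sobol indices via the pick-freeze
    # (Saltelli) Monte Carlo scheme for an assumed toy model.
    import numpy as np

    def f(x1, x2):
        # Toy nonlinear model: the output depends more strongly on x1.
        return np.sin(x1) + 0.1 * x2**2

    rng = np.random.default_rng(0)
    N = 100_000

    # Two independent input sample sets, uniform on [-pi, pi].
    a1, a2 = rng.uniform(-np.pi, np.pi, (2, N))
    b1, b2 = rng.uniform(-np.pi, np.pi, (2, N))

    y_a = f(a1, a2)
    var_y = y_a.var()

    # "Freeze" one input at its A-sample value and resample the other:
    # the covariance of the two outputs estimates Var(E[Y | X_i]).
    s1 = np.cov(y_a, f(a1, b2))[0, 1] / var_y  # first-order index of x1
    s2 = np.cov(y_a, f(b1, a2))[0, 1] / var_y  # first-order index of x2
    print(f"S1 = {s1:.2f}, S2 = {s2:.2f}")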

Probabilistic simulation

The variability, uncertainty, tolerances and errors of technical systems play an important part in the product design process. They are caused by manufacturing inaccuracy, process uncertainty, environmental influences, wear, human factors and so on, and are characterized by stochastic distributions. A deterministic simulation cannot predict the real system behavior under input variability and uncertainty, because a single model evaluation represents only one point in the design space. Therefore, a probabilistic simulation has to be performed: the output distributions are calculated from the input distributions, based on the deterministic simulation model of any simulation system. The realistic system behavior can then be derived from these output distributions.[3][4]
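
The following minimal Python sketch illustrates this principle with plain Monte Carlo sampling: input distributions are propagated through a deterministic model to obtain the output distribution. The cantilever-beam model and the assumed tolerances are illustrative only.

    # Minimal sketch: Monte Carlo propagation of input distributions
    # through a deterministic model (assumed cantilever-beam example).
    import numpy as np

    def deflection(F, L, E, I):
        # Deterministic model: tip deflection of a cantilever beam.
        return F * L**3 / (3.0 * E * I)

    rng = np.random.default_rng(1)
    N = 50_000

    # Input distributions (nominal value + assumed scatter).
    F = rng.normal(1000.0, 50.0, N)      # load in N
    L = rng.normal(2.0, 0.01, N)         # length in m
    E = rng.normal(2.1e11, 5e9, N)       # Young's modulus in Pa
    I = rng.normal(8.0e-6, 2.0e-7, N)    # second moment of area in m^4

    w = deflection(F, L, E, I)           # output distribution
    print(f"mean = {w.mean()*1e3:.2f} mm, std = {w.std()*1e3:.2f} mm")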

Reliability analysis

The variability of parameters often causes failure of the system. Reliability analysis (see also failure mode and effects analysis) investigates boundary violations of the outputs due to input variability. The failure mechanisms of the components are known from the specifications for the product development; they are identified from measurements, field data collection, material data, customer specifications and so on. In the simulation, the satisfaction of all product specifications is formulated as constraints on the simulation results. The system is reliable if all constraints scatter inside the defined boundaries. Even if a nominal-parameter simulation shows that all constraint values lie within reliable boundaries, the system reliability cannot be guaranteed, because of the input variability. The portion of the constraint variability that violates the defined boundaries is called the failure probability of the solution. Reliability analysis computes the failure probability of the individual components and also of the total system at a given point in time.[5]
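
The following Python sketch illustrates a simple Monte Carlo estimate of a failure probability as the fraction of samples whose constraint violates its boundary. The model, the boundary value and the input scatter are illustrative assumptions; more efficient schemes such as subset simulation[5] exist but are not implemented here.

    # Minimal sketch: failure probability as the Monte Carlo fraction
    # of samples violating a constraint boundary (assumed values).
    import numpy as np

    def deflection(F, L, E, I):
        return F * L**3 / (3.0 * E * I)

    rng = np.random.default_rng(2)
    N = 200_000
    limit = 1.8e-3            # allowed deflection in m (constraint boundary)

    F = rng.normal(1000.0, 50.0, N)
    L = rng.normal(2.0, 0.01, N)
    E = rng.normal(2.1e11, 5e9, N)
    I = rng.normal(8.0e-6, 2.0e-7, N)

    w = deflection(F, L, E, I)
    p_f = np.mean(w > limit)  # failure probability: P(constraint violated)
    print(f"P_f = {p_f:.4f}")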

Meta-modeling

Meta-modeling combines surrogate models and physics-informed neural networks in a process that captures the mathematical relationship between input and output parameters. In this way of modeling, a mix of imperfect data and imperfect physical model components yields an accurate meta-model for real-time computing.[6][7]
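
As a generic illustration of a data-driven surrogate, the following Python sketch trains a Gaussian process on a few runs of an "expensive" model and then evaluates it cheaply. The model function and sample counts are assumptions; this is not OptiY's specific meta-modeling algorithm.

    # Minimal sketch: Gaussian-process surrogate of an expensive model,
    # usable for cheap real-time predictions with uncertainty estimates.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def expensive_model(x):
        # Stand-in for a slow FEM/CFD run (assumed toy function).
        return np.sin(3 * x) + 0.5 * x

    # A few training samples of the expensive model.
    X_train = np.linspace(0.0, 2.0, 8).reshape(-1, 1)
    y_train = expensive_model(X_train).ravel()

    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-8)
    gp.fit(X_train, y_train)

    # Fast surrogate evaluation with an uncertainty estimate.
    X_new = np.array([[0.7], [1.3]])
    mean, std = gp.predict(X_new, return_std=True)
    print(mean, std)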

Fatigue life prediction

Predicting fatigue (material) has been one of the most important problems in design engineering for reliability and quality. Fatigue predictions have several practical uses: rapid design optimization during the development phase of a product, prediction of field-use limits, and failure analysis of products returned from the field or failed in qualification tests. Fatigue analysis focuses on the thermal and mechanical failure mechanisms. Most fatigue failures can be attributed to thermo-mechanical stresses caused by differences in the coefficients of thermal and mechanical expansion. Fatigue failures occur when the component experiences cyclic stresses and strains that produce permanent damage.
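
As an illustration of a classical fatigue-life estimate, the following Python sketch combines Basquin's S-N law with Palmgren-Miner linear damage accumulation. The material constants and the load spectrum are assumed values and do not represent OptiY's fatigue models.

    # Minimal sketch: fatigue life under a cyclic load spectrum using
    # Basquin's S-N law and Palmgren-Miner damage accumulation.
    # Basquin: N(S) = (S / sigma_f)**(1 / b) cycles to failure at amplitude S.
    sigma_f = 900.0   # fatigue strength coefficient in MPa (assumed)
    b = -0.09         # fatigue strength exponent (assumed)

    def cycles_to_failure(S):
        return (S / sigma_f) ** (1.0 / b)

    # Load spectrum: (stress amplitude in MPa, applied cycles per block).
    spectrum = [(300.0, 1e4), (250.0, 5e4), (180.0, 2e5)]

    # Miner's rule: failure when the damage sum D = sum(n_i / N_i) reaches 1.
    D = sum(n / cycles_to_failure(S) for S, n in spectrum)
    print(f"damage per block D = {D:.3f}, blocks to failure = {1.0 / D:.1f}")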

Multi-objective optimization

In the development process of technical products, there are frequently design problems with many evaluation goals or criteria, such as low cost, high quality and low noise. Design parameters have to be found that minimize all criteria. In contrast to single-objective optimization, multi-objective optimization has a different order structure between the parameter and criterion spaces: the criteria conflict with each other, and trying to minimize one criterion may maximize others. There is not a single best solution, but a frontier of Pareto-optimal solutions. Multi-objective optimization finds all Pareto solutions automatically in a single run. A decision-making support tool is also available to select the most suitable solution among them.[8]
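
The following Python sketch illustrates the underlying concept by filtering a set of candidate designs down to its Pareto-optimal frontier for two criteria to be minimized; the candidate set is randomly generated for illustration.

    # Minimal sketch: extract the Pareto-optimal frontier from candidate
    # designs with two criteria to minimize (e.g. cost and noise).
    import numpy as np

    def pareto_front(costs):
        # A point is Pareto-optimal if no other point is at least as good
        # in every criterion and strictly better in at least one.
        n = costs.shape[0]
        keep = np.ones(n, dtype=bool)
        for i in range(n):
            dominated = (np.all(costs <= costs[i], axis=1)
                         & np.any(costs < costs[i], axis=1))
            if dominated.any():
                keep[i] = False
        return costs[keep]

    rng = np.random.default_rng(3)
    designs = rng.uniform(0.0, 1.0, (200, 2))   # (cost, noise) per design
    front = pareto_front(designs)
    print(f"{len(front)} Pareto-optimal designs out of {len(designs)}")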

Robust design optimization

Variability, uncertainty and tolerances have to be considered in the design process of technical systems to assure the required quality and reliability. They are uncontrollable and unpredictable, and they put the satisfaction of the required product specifications at risk. The design goal is to assure the specified product functionality in spite of unavoidable variability and uncertainty. The approach to this problem is robust parameter design (RPD) of the product parameters in the early design process: optimal product parameters are sought for which the system behavior is robust and insensitive to the unavoidable variability. A given input variability then leads only to the smallest possible variability of the product characteristics, so that the required product specifications are always satisfied.[9]
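
The following Python sketch illustrates the idea of robust parameter design on a toy model: the nominal design parameter is chosen so that a mean-plus-three-sigma measure of the output is minimized under fixed input scatter. The model and the weighting are illustrative assumptions.

    # Minimal sketch: robust parameter design as minimization of a
    # mean + 3*sigma criterion of the output under input scatter.
    import numpy as np
    from scipy.optimize import minimize_scalar

    rng = np.random.default_rng(4)
    noise = rng.normal(0.0, 0.1, 10_000)   # fixed input scatter sample

    def output(x):
        # Toy model whose sensitivity to the scatter depends on x.
        return (x + noise) ** 2 + 0.5 * x

    def robust_objective(x):
        y = output(x)
        return y.mean() + 3.0 * y.std()     # mean + 3*sigma criterion

    res = minimize_scalar(robust_objective, bounds=(-2.0, 2.0),
                          method="bounded")
    print(f"robust nominal design x* = {res.x:.3f}")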

References

  1. Saltelli, A., Chan, K. and Scott, E.M.: Sensitivity Analysis. John Wiley & Sons, Chichester, New York 2000
  2. Oakley, J.E., O'Hagan, A.: Probabilistic Sensitivity Analysis of Computer Models: a Bayesian Approach. Journal of the Royal Statistical Society, Series B, 66:751-769, 2004
  3. Pham, T-Q., Neubert, H. and Kamusella, A.: Design for Reliability and Robustness through Probabilistic Methods in COMSOL Multiphysics with OptiY. Proceedings of the 2nd European COMSOL Conference, 4–6 November 2008, Hannover
  4. Sacks, J., Welch, W.J., Mitchell, T.J., Wynn, H.P.: Design and Analysis of Computer Experiments. Statistical Science 4, pp. 409-435, 1989
  5. Au, S.K., Beck, J.L.: Subset Simulation and its Application to Seismic Risk Based on Dynamic Analysis. Journal of Engineering Mechanics, Vol. 129, No. 8, August 1, 2003
  6. Pham, T-Q., Kamusella, A. and Neubert, H.: Auto-Extraction of Modelica Code from Finite Element Analysis or Measurement Data. Proceedings of the 8th International Modelica Conference, 20–22 March 2011, Dresden
  7. Santner, T.J., Williams, B.J., Notz, W.I.: The Design and Analysis of Computer Experiments. Springer-Verlag, New York 2003
  8. Zitzler, E., Thiele, L.: Multiobjective Evolutionary Algorithms: A Comparative Case Study and the Strength Pareto Approach. IEEE Transactions on Evolutionary Computation, pp. 257-271, November 1999
  9. Park, Sung H.: Robust Design and Analysis for Quality Engineering. Chapman & Hall 1996. ISBN 0-412-55620-0
