Search results 1-4 of 4
Keywords: normal distribution
Journal Articles
Journal: Journal of Mechanical Design
Article Type: Research Papers
J. Mech. Des. January 2011, 133(1): 011003.
Published Online: December 29, 2010
Abstract
This paper presents an automated algorithm for the design of vehicle structures for crashworthiness based on the analyses of the structural crash mode (CM). The CM is the history of the deformation of the different zones of the vehicle structure during a crash event. The algorithm emulates a manual design process called crash mode matching, where crashworthiness is improved by manually modifying the design until its CM matches what the designers deem as optimal. Given an initial design and a desired crash mode, the proposed algorithm iteratively finds new designs that have better crashworthiness performance via stochastic sampling of the design space. In every iteration of the algorithm, a number of sample designs are generated from a normal distribution over regions of the search space neighboring the current design, and the best among the samples is chosen as the new design. The mean and the standard deviation of the normal distributions are adjusted in each iteration by examining the crash mode of the current design and by applying a set of fuzzy logic rules that encapsulate elementary knowledge of the CM matching practice. Two case studies, examining a front-half vehicle model as well as a fully detailed vehicle model, are presented to demonstrate the effectiveness of the proposed algorithm.
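To illustrate the sampling step described in this abstract, the sketch below draws candidate designs from a normal distribution centered on the current design and keeps the best one. The objective function, the adaptation factors, and all parameter names are illustrative assumptions; the paper's algorithm adapts the distribution with fuzzy-logic rules derived from the crash mode, not the simple success/failure rule used here.

```python
import numpy as np

rng = np.random.default_rng(0)

def crash_metric(x):
    # Placeholder objective standing in for a crashworthiness evaluation
    # (assumption: lower is better; the real metric comes from crash simulation).
    return float(np.sum((x - 1.5) ** 2))

def best_of_n_normal_search(x0, sigma0=0.5, n_samples=20, n_iter=50):
    """Draw candidate designs from a normal distribution around the current
    design and keep the best one (illustrative stand-in for the paper's loop)."""
    x = np.asarray(x0, dtype=float)
    sigma, best = sigma0, crash_metric(x)
    for _ in range(n_iter):
        candidates = rng.normal(loc=x, scale=sigma, size=(n_samples, x.size))
        scores = np.array([crash_metric(c) for c in candidates])
        i = int(scores.argmin())
        if scores[i] < best:
            x, best = candidates[i], scores[i]
            sigma *= 1.1   # crude stand-in for the fuzzy-rule adaptation
        else:
            sigma *= 0.7
    return x, best

x_opt, f_opt = best_of_n_normal_search(np.zeros(4))
print(x_opt, f_opt)
```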
Journal Articles
Journal: Journal of Mechanical Design
Article Type: Research Papers
J. Mech. Des. February 2010, 132(2): 021010.
Published Online: February 9, 2010
Abstract
This paper introduces a new approach for the optimal geometric design and tolerancing of multibody systems. The approach optimizes both the nominal system dimensions and the associated tolerances by solving a reliability-based design optimization (RBDO) problem under the assumption of truncated normal distributions of the geometric properties. The solution is obtained by first constructing the explicit boundaries of the failure regions (limit state function) using a support vector machine, combined with adaptive sampling and uniform design of experiments. The use of explicit boundaries enables the treatment of systems with discontinuous or binary behaviors. The explicit boundaries also allow for an efficient calculation of the probability of failure using importance sampling. The probability of failure is subsequently approximated over the whole design space (the nominal system dimensions and the associated tolerances), thus making the solution of the RBDO problem straightforward. The proposed approach is applied to the optimization of a web cutter mechanism.
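As a rough illustration of propagating a truncated normal dimensional tolerance, the sketch below samples a hypothetical link length and estimates a probability of failure by plain Monte Carlo. The dimension, tolerance band, and limit state are invented for the example; the paper instead constructs an explicit SVM boundary for the failure region and evaluates the probability with importance sampling.

```python
import numpy as np
from scipy.stats import truncnorm

def sample_truncated_normal(mean, std, lower, upper, n, seed=1):
    # scipy's truncnorm takes its bounds in standard-deviation units about loc.
    a, b = (lower - mean) / std, (upper - mean) / std
    return truncnorm.rvs(a, b, loc=mean, scale=std, size=n, random_state=seed)

# Hypothetical link length: nominal 100 mm, tolerance modeled as a normal
# distribution truncated at +/- 3 standard deviations.
nominal, std = 100.0, 0.05
length = sample_truncated_normal(nominal, std,
                                 nominal - 3 * std, nominal + 3 * std, 200_000)

# Hypothetical limit state: the mechanism fails when the length deviates
# more than 0.12 mm from nominal (g <= 0 denotes failure).
g = 0.12 - np.abs(length - nominal)
p_failure = np.mean(g <= 0.0)
print(f"Estimated probability of failure: {p_failure:.2e}")
```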
Journal Articles
Journal: Journal of Mechanical Design
Article Type: Research Papers
J. Mech. Des. February 2010, 132(2): 021003.
Published Online: January 14, 2010
Abstract
Decomposition-based strategies, such as analytical target cascading (ATC), are often employed in design optimization of complex systems. Achieving convergence and computational efficiency in the coordination strategy that solves the partitioned problem is a key challenge. A new convergent strategy is proposed for ATC that coordinates interactions among subproblems using sequential linearizations. The linearity of subproblems is maintained using infinity norms to measure deviations between targets and responses. A subproblem suspension strategy is used to temporarily suspend the inclusion of subproblems that do not need significant redesign, based on trust region and target value step size. An individual subproblem trust region method is introduced for faster convergence. The proposed strategy is intended for use in design optimization problems where sequential linearizations are typically effective, such as problems with extensive monotonicities, a large number of constraints relative to variables, and propagation of probabilities with normal distributions. Experiments with test problems show that, relative to standard ATC coordination, the number of subproblem evaluations is reduced considerably, while the solution accuracy depends on the degree of monotonicity and nonlinearity.
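A minimal sketch of the infinity-norm deviation between targets and responses, together with a simplified suspension check, is given below. The tolerance and the suspension rule are assumptions for illustration; the paper's criterion combines a trust region with the target value step size.

```python
import numpy as np

def deviation_inf(targets, responses):
    """Infinity-norm deviation between parent targets and child responses."""
    return float(np.max(np.abs(np.asarray(targets) - np.asarray(responses))))

def should_suspend(previous_targets, new_targets, tol=1e-3):
    """Illustrative suspension rule: skip re-solving a subproblem whose targets
    barely moved since the last coordination step."""
    return deviation_inf(previous_targets, new_targets) < tol

# Toy coordination check for one subproblem.
targets, responses = [1.0, 2.0], [0.9, 2.2]
print(deviation_inf(targets, responses))             # 0.2
print(should_suspend([1.0, 2.0], [1.0005, 2.0002]))  # True
```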
Journal Articles
Journal: Journal of Mechanical Design
Article Type: Research Papers
J. Mech. Des. July 2006, 128(4): 969–979.
Published Online: December 21, 2005
Abstract
Engineering design decisions inherently are made under risk and uncertainty. The characterization of this uncertainty is an essential step in the decision process. In this paper, we consider imprecise probabilities (e.g., intervals of probabilities) to express explicitly the precision with which something is known. Imprecision can arise from fundamental indeterminacy in the available evidence or from incomplete characterizations of the available evidence and designer’s beliefs. The hypothesis is that, in engineering design decisions, it is valuable to explicitly represent this imprecision by using imprecise probabilities. This hypothesis is supported with a computational experiment in which a pressure vessel is designed using two approaches, both variations of utility-based decision making. In the first approach, the designer uses a purely probabilistic, precise best-fit normal distribution to represent uncertainty. In the second approach, the designer explicitly expresses the imprecision in the available information using a probability box, or p-box. When the imprecision is large, this p-box approach on average results in designs with expected utilities that are greater than those for designs created with the purely probabilistic approach, suggesting that there are design problems for which it is valuable to use imprecise probabilities.
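The sketch below contrasts a precise best-fit normal with a simple parametric p-box in which only the mean is imprecise (known to lie within an interval). The load quantity, mean interval, and standard deviation are illustrative assumptions; the paper applies p-boxes within a utility-based pressure vessel design study.

```python
from scipy.stats import norm

# Hypothetical uncertain load (kN): the mean is only known to lie in an
# interval, while the standard deviation is assumed known.
mu_lo, mu_hi, sigma = 9.5, 10.5, 1.0

# A precise "best-fit" normal collapses the mean interval to its midpoint.
mu_fit = (mu_lo + mu_hi) / 2.0

# A parametric p-box instead carries lower/upper CDF envelopes, so any event
# probability becomes an interval rather than a single number.
threshold = 9.0
p_lo = norm.cdf(threshold, loc=mu_hi, scale=sigma)  # lower envelope at 9 kN
p_hi = norm.cdf(threshold, loc=mu_lo, scale=sigma)  # upper envelope at 9 kN

print(f"p-box:    P(load < 9 kN) in [{p_lo:.3f}, {p_hi:.3f}]")
print(f"best fit: P(load < 9 kN) =  {norm.cdf(threshold, mu_fit, sigma):.3f}")
```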