1-10 of 10

Igor Baseski


Journal Articles

Accepted Manuscript

Journal: Journal of Mechanical Design

Article Type: Research Papers

*J. Mech. Des.* Paper No: MD-19-1187

Published Online: October 14, 2019

Abstract

Accelerated life testing (ALT) has been widely used to speed up product reliability assessment by testing products at higher-than-nominal stress conditions. For a system with multiple components, the tests can be performed at the component level or the system level. The data at these two levels require different amounts of resources to collect and carry different values of information for system reliability assessment. Component-level tests are cheap to perform, but they cannot account for the correlations between the failure time distributions of different components. System-level tests naturally account for the complicated dependence between component failure time distributions, but the required testing effort is much higher than that of component-level tests. This research proposes a novel resource allocation framework for ALT-based system reliability assessment. A physics-informed load model is first employed to bridge the gap between component-level tests and system-level tests. An optimization framework is then developed to allocate testing resources effectively among the different types of tests. The fusion of information from component-level and system-level tests allows the system reliability to be estimated accurately with a minimal requirement on testing resources. Results of two numerical examples demonstrate the effectiveness of the proposed framework.
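
The component-versus-system trade-off can be sketched as a toy budget allocation. Every number below (the costs, the per-test variance, and the bias term that only system-level tests can reduce) is an assumed stand-in for illustration, not the paper's optimization framework:

```python
import numpy as np

# Toy sketch (all numbers assumed): split a fixed budget between cheap
# component-level tests and expensive system-level tests. Component tests
# reduce sampling variance but leave a dependence-related bias that only
# system-level tests shrink; a grid search finds the minimum-MSE split.
budget = 100.0
c_comp, c_sys = 1.0, 10.0        # assumed unit costs per test
v_comp = 4.0                     # assumed per-test variance (component data)
bias2 = 0.5                      # assumed squared bias from ignored dependence

best = None
for n_sys in range(int(budget // c_sys) + 1):
    n_comp = int((budget - n_sys * c_sys) // c_comp)
    if n_comp == 0:
        continue
    # assumed error model: sampling variance + bias decaying with system tests
    mse = v_comp / n_comp + bias2 * np.exp(-n_sys / 2.0)
    if best is None or mse < best[0]:
        best = (mse, n_comp, n_sys)

mse, n_comp, n_sys = best
print(n_comp, n_sys)             # under this model, a mixed allocation wins
```

With these assumed numbers neither all-component nor all-system testing is optimal, which is the qualitative point of the resource-allocation framework.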

Journal Articles

Journal: Journal of Mechanical Design

Article Type: Research-Article

*J. Mech. Des.* February 2018, 140(2): 021404. Paper No: MD-17-1368

Published Online: December 14, 2017

Abstract

A general methodology is presented for time-dependent reliability and random vibrations of nonlinear vibratory systems with random parameters excited by non-Gaussian loads. The approach is based on polynomial chaos expansion (PCE), Karhunen–Loeve (KL) expansion, and quasi Monte Carlo (QMC). The latter is used to estimate multidimensional integrals efficiently. The input random processes are first characterized using their first four moments (mean, standard deviation, skewness, and kurtosis coefficients) and a correlation structure in order to generate sample realizations (trajectories). Characterization means the development of a stochastic metamodel. The input random variables and processes are expressed in terms of independent standard normal variables in N dimensions. The N-dimensional input space is space filled with M points. The system differential equations of motion (EOM) are time integrated for each of the M points, and QMC estimates the four moments and correlation structure of the output efficiently. The proposed PCE–KL–QMC approach is then used to characterize the output process. Finally, classical MC simulation estimates the time-dependent probability of failure using the developed stochastic metamodel of the output process. The proposed methodology is demonstrated with a Duffing oscillator example under non-Gaussian load.
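
Only the QMC moment-estimation step is sketched below, not the full PCE–KL pipeline. The cubic response is a hypothetical stand-in for the integrated Duffing output, and `scipy` is assumed available for the Sobol sequence and the normal quantile function:

```python
import numpy as np
from scipy.stats import norm, qmc

# Sketch of the QMC step: estimate the first four moments of a nonlinear
# response y = g(x) with a standard-normal input from a Sobol sequence.
# g is a hypothetical cubic stand-in, not the actual Duffing response.
def g(x):
    return x + 0.1 * x**3

sobol = qmc.Sobol(d=1, scramble=True, seed=0)
u = sobol.random(2**12).ravel()
u = u.clip(1e-12, 1.0 - 1e-12)           # guard the open interval before ppf
x = norm.ppf(u)                          # map to standard normal space
y = g(x)

mean = y.mean()
std = y.std()
skew = ((y - mean)**3).mean() / std**3
kurt = ((y - mean)**4).mean() / std**4
print(round(mean, 3), round(std, 3), round(skew, 3), round(kurt, 3))
```

For this symmetric cubic the exact mean and skewness are zero and the variance is 1.75, so the estimates can be checked against closed-form moments.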

Proceedings Papers

*Proc. ASME*. IDETC-CIE2017, Volume 2B: 43rd Design Automation Conference, V02BT03A050, August 6–9, 2017

Paper No: DETC2017-67313

Abstract

A general methodology is presented for time-dependent reliability and random vibrations of nonlinear vibratory systems with random parameters excited by non-Gaussian loads. The approach is based on Polynomial Chaos Expansion (PCE), Karhunen-Loeve (KL) expansion, and Quasi Monte Carlo (QMC). The latter is used to estimate multi-dimensional integrals efficiently. The input random processes are first characterized using their first four moments (mean, standard deviation, skewness, and kurtosis coefficients) and a correlation structure in order to generate sample realizations (trajectories). Characterization means the development of a stochastic metamodel. The input random variables and processes are expressed in terms of independent standard normal variables in N dimensions. The N-dimensional input space is space filled with M points. The system differential equations of motion are time integrated for each of the M points, and QMC estimates the four moments and correlation structure of the output efficiently. The proposed PCE-KL-QMC approach is then used to characterize the output process. Finally, classical MC simulation estimates the time-dependent probability of failure using the developed stochastic metamodel of the output process. The proposed methodology is demonstrated with a Duffing oscillator example under non-Gaussian load.

Proceedings Papers

*Proc. ASME*. IDETC-CIE2015, Volume 2B: 41st Design Automation Conference, V02BT03A053, August 2–5, 2015

Paper No: DETC2015-46823

Abstract

We have recently proposed a method for time-dependent reliability based on metamodels with random inputs. In that method, we employed multiple sets of inputs sampled from the input distribution to construct a new metamodel as a mixture of classical metamodels. Because the sampled inputs may cluster around a mode of the input distribution, they may result in a metamodel of reduced quality. We address this issue in this paper by using a transformation to de-cluster the sample inputs and then use our previously developed metamodel with random inputs. We first obtain the output of the computer model for a limited number of transformed input draws which do not cluster in high probability regions of the input space. Then, conditioned on these transformed sampled inputs, we construct a classical Kriging surrogate and obtain the distribution of the new surrogate as the marginal of the joint distribution between the classical surrogate and the transformed sampled inputs. The proposed method is illustrated with a corroding beam example. A more accurate time-dependent reliability estimation is obtained compared with our previously developed metamodel method.
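
The de-clustering idea can be sketched in a few lines: random draws from the input distribution pile up near its mode, whereas pushing a space-filling design through the inverse CDF yields the same marginal distribution with better spread. The Sobol design and the distance diagnostic below are illustrative choices, not the paper's specific transformation; `scipy` is assumed:

```python
import numpy as np
from scipy.stats import norm, qmc

# Illustrative sketch of de-clustering: plain draws may cluster near the
# mode, while inverse-CDF-mapped Sobol points keep the same marginal
# distribution but are spread more evenly across the input space.
rng = np.random.default_rng(0)
n, d = 64, 2

x_random = rng.standard_normal((n, d))            # plain draws: may cluster
u = qmc.Sobol(d=d, scramble=True, seed=0).random(n)
u = u.clip(1e-12, 1.0 - 1e-12)                    # guard the open interval
x_decluster = norm.ppf(u)                         # space-filled draws

def min_pairwise_dist(x):
    diff = x[:, None, :] - x[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    np.fill_diagonal(dist, np.inf)
    return dist.min()

# compare the closest pair in each design as a clustering diagnostic
print(min_pairwise_dist(x_random), min_pairwise_dist(x_decluster))
```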

Journal Articles

Journal: Journal of Mechanical Design

Article Type: Research-Article

*J. Mech. Des.* January 2016, 138(1): 011403. Paper No: MD-14-1846

Published Online: November 16, 2015

Abstract

A new metamodeling approach is proposed to characterize the output (response) random process of a dynamic system with random variables, excited by input random processes. The metamodel is then used to efficiently estimate the time-dependent reliability. The input random processes are decomposed using principal components, and a few simulations are used to estimate the distributions of the decomposition coefficients. A similar decomposition is performed on the output random process. A Kriging model is then built between the input and output decomposition coefficients and is used subsequently to quantify the output random process. The innovation of our approach is that the system input is not deterministic but random. We establish, therefore, a surrogate model between the input and output random processes. To achieve this goal, we use an integral expression of the total probability theorem to estimate the marginal distribution of the output decomposition coefficients. The integral is efficiently estimated using a Monte Carlo (MC) approach which simulates from a mixture of sampling distributions with equal mixing probabilities. The quantified output random process is finally used to estimate the time-dependent probability of failure. The proposed method is illustrated with a corroding beam example.
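
The decomposition step can be sketched with plain principal components via SVD. The harmonic stand-in process, the 99% energy threshold, and the omission of the Kriging map between input and output coefficients are all simplifications for illustration:

```python
import numpy as np

# Stand-in random process: a random superposition of 8 harmonics with
# decaying amplitudes (assumed for illustration). SVD recovers a small
# number of principal components carrying 99% of the variance; a Kriging
# model would then link input and output coefficients (omitted here).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
n_traj, K = 500, 4
freqs = np.arange(1, K + 1)[:, None]
basis = np.concatenate([np.sin(2 * np.pi * freqs * t),
                        np.cos(2 * np.pi * freqs * t)], axis=0)
amps = 1.0 / np.arange(1, 2 * K + 1)             # decaying modal amplitudes
xi = rng.standard_normal((n_traj, 2 * K))        # random modal coefficients
X = (xi * amps) @ basis                          # 500 sampled trajectories

Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
energy = np.cumsum(s ** 2) / np.sum(s ** 2)
k = int(np.searchsorted(energy, 0.99)) + 1       # components for 99% energy
coeffs = U[:, :k] * s[:k]                        # per-trajectory coefficients
print(k, coeffs.shape)
```

The few retained coefficients per trajectory are exactly the quantities a surrogate model would map from input to output side.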

Journal Articles

Journal: Journal of Mechanical Design

Article Type: Research-Article

*J. Mech. Des.* March 2015, 137(3): 031405. Paper No: MD-14-1417

Published Online: March 1, 2015

Abstract

A new reliability analysis method is proposed for time-dependent problems with explicit in time limit-state functions of input random variables and input random processes using the total probability theorem and the concept of composite limit state. The input random processes are assumed Gaussian. They are expressed in terms of standard normal variables using a spectral decomposition method. The total probability theorem is employed to calculate the time-dependent probability of failure using time-dependent conditional probabilities which are computed accurately and efficiently in the standard normal space using the first-order reliability method (FORM) and a composite limit state of linear instantaneous limit states. If the dimensionality of the total probability theorem integral is small, we can easily calculate it using Gauss quadrature numerical integration. Otherwise, simple Monte Carlo simulation (MCS) or adaptive importance sampling are used based on a Kriging metamodel of the conditional probabilities. An example from the literature on the design of a hydrokinetic turbine blade under time-dependent river flow load demonstrates all developments.
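
The quadrature evaluation of the total probability integral can be sketched on a toy linear limit state whose conditional failure probability has a closed form. This example is illustrative only, not the turbine-blade problem; `scipy` supplies the normal CDF:

```python
import numpy as np
from scipy.stats import norm

# Toy total-probability-theorem step: condition on one standard-normal
# variable z. For the assumed limit state g = c - (z + x), x ~ N(0,1),
# the conditional probability is P(fail | z) = Phi(z - c); integrating
# over z uses probabilists' Gauss-Hermite quadrature.
c = 3.0
nodes, weights = np.polynomial.hermite_e.hermegauss(40)

cond_pf = norm.cdf(nodes - c)                  # P(fail | z) at the nodes
pf = np.sum(weights * cond_pf) / np.sqrt(2.0 * np.pi)

exact = norm.cdf(-c / np.sqrt(2.0))            # since z + x ~ N(0, sqrt(2))
print(round(pf, 6), round(exact, 6))           # quadrature vs closed form
```

Because the conditional probability is smooth in z, a modest number of Gauss points reproduces the exact answer, which is the efficiency argument for quadrature in low dimensions.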

Proceedings Papers

*Proc. ASME*. IDETC-CIE2014, Volume 2A: 40th Design Automation Conference, V02AT03A037, August 17–20, 2014

Paper No: DETC2014-34313

Abstract

A new metamodeling approach is proposed to characterize the output (response) random process of a dynamic system with random variables, excited by input random processes. The metamodel is then used to efficiently estimate the time-dependent reliability. The input random processes are decomposed using principal components or wavelets, and a few simulations are used to estimate the distributions of the decomposition coefficients. A similar decomposition is performed on the output random process. A Kriging model is then built between the input and output decomposition coefficients and is used subsequently to quantify the output random process corresponding to a realization of the input random variables and random processes. In our approach, the system input is not deterministic but random. We establish, therefore, a surrogate model between the input and output random processes. The quantified output random process is finally used to estimate the time-dependent reliability or probability of failure using the total probability theorem. The proposed method is illustrated with a corroding beam example.

Proceedings Papers

*Proc. ASME*. IDETC-CIE2014, Volume 2B: 40th Design Automation Conference, V02BT03A052, August 17–20, 2014

Paper No: DETC2014-35078

Abstract

A new reliability analysis method is proposed for time-dependent problems whose limit-state functions are explicit in time and depend on input random variables and input random processes, using the total probability theorem and the concept of composite limit state. The input random processes are assumed Gaussian. They are expressed in terms of standard normal variables using a spectral decomposition method. The total probability theorem is employed to calculate the time-dependent probability of failure using a time-dependent conditional probability which is computed accurately and efficiently in the standard normal space using FORM and a composite limit state of linear instantaneous limit states. If the dimensionality of the total probability theorem integral (equal to the number of input random variables) is small, we can easily calculate it using Gauss quadrature numerical integration. Otherwise, simple Monte Carlo simulation or adaptive importance sampling is used based on a pre-built Kriging metamodel of the conditional probability. An example from the literature on the design of a hydrokinetic turbine blade under time-dependent river flow load demonstrates all developments.

Journal Articles

Journal: Journal of Mechanical Design

Article Type: Research-Article

*J. Mech. Des.* June 2014, 136(6): 061008. Paper No: MD-13-1385

Published Online: April 21, 2014

Abstract

Time-dependent reliability is the probability that a system will perform its intended function successfully for a specified time. Unless many and often unrealistic assumptions are made, the accuracy and efficiency of time-dependent reliability estimation are major issues which may limit its practicality. Monte Carlo simulation (MCS) is accurate and easy to use, but it is computationally prohibitive for high dimensional, long duration, time-dependent (dynamic) systems with a low failure probability. This work is relevant to systems with random parameters excited by stochastic processes. Their response is calculated by time integrating a set of differential equations at discrete times. The limit state functions are, therefore, explicit in time and depend on time-invariant random variables and time-dependent stochastic processes. We present an improved subset simulation with splitting approach by partitioning the original high dimensional random process into a series of correlated, short duration, low dimensional random processes. Subset simulation reduces the computational cost by introducing appropriate intermediate failure sub-domains to express the low failure probability as a product of larger conditional failure probabilities. Splitting is an efficient sampling method to estimate the conditional probabilities. The proposed subset simulation with splitting not only estimates the time-dependent probability of failure at a given time but also estimates the cumulative distribution function up to that time with approximately the same cost. A vibration example involving a vehicle on a stochastic road demonstrates the advantages of the proposed approach.
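 
A bare-bones subset simulation conveys the product-of-conditional-probabilities idea. It uses a common Gaussian-preserving MCMC move rather than the splitting scheme of this work, and a toy linear limit state:

```python
import numpy as np

# Minimal subset simulation (not the paper's splitting variant): estimate
# a small P(g(X) > b_fail), X ~ N(0, I), as a product of larger
# conditional probabilities, one factor of ~p0 per intermediate level.
rng = np.random.default_rng(1)
d, N, p0, rho = 10, 2000, 0.1, 0.8
b_fail = 3.0 * np.sqrt(d)                # exact answer is Phi(-3) here

def g(x):
    return x.sum(axis=-1)                # toy linear limit state

x = rng.standard_normal((N, d))
pf = 1.0
for _ in range(20):                      # cap the number of subset levels
    y = g(x)
    b = np.quantile(y, 1.0 - p0)
    if b >= b_fail:                      # final level: close out the product
        pf *= np.mean(y > b_fail)
        break
    pf *= p0                             # one more conditional factor
    seeds = x[y > b]                     # survivors seed the next level
    n_seeds = len(seeds)
    cur = seeds.copy()
    x = np.empty((N, d))
    for k in range(N):
        i = k % n_seeds
        # Gaussian-preserving proposal; reject moves leaving the level set
        cand = rho * cur[i] + np.sqrt(1.0 - rho**2) * rng.standard_normal(d)
        if g(cand) > b:
            cur[i] = cand
        x[k] = cur[i]

print(f"{pf:.2e}")                       # estimate of Phi(-3), about 1.35e-3
```

Direct Monte Carlo would need on the order of a million samples for a stable estimate of this probability; the subset estimator reaches it with a few thousand evaluations per level.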

Proceedings Papers

*Proc. ASME*. IDETC-CIE2013, Volume 3B: 39th Design Automation Conference, V03BT03A048, August 4–7, 2013

Paper No: DETC2013-12257

Abstract

Time-dependent reliability is the probability that a system will perform its intended function successfully for a specified time. Unless many and often unrealistic assumptions are made, the accuracy and efficiency of time-dependent reliability estimation are major issues which may limit its practicality. Monte Carlo simulation (MCS) is accurate and easy to use, but it is computationally prohibitive for high dimensional, long duration, time-dependent (dynamic) systems with a low failure probability. This work addresses systems with random parameters excited by stochastic processes. Their response is calculated by time integrating a set of differential equations at discrete times. The limit state functions are, therefore, explicit in time and depend on time-invariant random variables and time-dependent stochastic processes. We present an improved subset simulation with splitting approach by partitioning the original high dimensional random process into a series of correlated, short duration, low dimensional random processes. Subset simulation reduces the computational cost by introducing appropriate intermediate failure sub-domains to express the low failure probability as a product of larger conditional failure probabilities. Splitting is an efficient sampling method to estimate the conditional probabilities. The proposed subset simulation with splitting not only estimates the time-dependent probability of failure at a given time but also estimates the cumulative distribution function up to that time with approximately the same cost. A vibration example involving a vehicle on a stochastic road demonstrates the advantages of the proposed approach.