Search: Stochastic processes (results 1–20 of 33)
Proceedings Papers
Dimitrios Papadimitriou, Zissimos P. Mourelatos, Santosh Patil, Zhen Hu, Vasiliki Tsianika, Vasileios Geroulas
Proc. ASME. IDETC-CIE2019, Volume 2B: 45th Design Automation Conference, V02BT03A033, August 18–21, 2019
Paper No: DETC2019-97158
Abstract
Abstract This paper proposes a new methodology for time-dependent reliability analysis of vibratory systems using a combination of a First-Order, Four-Moment (FOFM) method and a Non-Gaussian Karhunen-Loeve (NG-KL) expansion. The vibratory system is nonlinear and it is excited by stationary non-Gaussian input random processes which are characterized by their first four marginal moments and autocorrelation function. The NG-KL expansion expresses each input non-Gaussian process as a linear combination of uncorrelated, non-Gaussian random variables and computes their first four moments. The FOFM method then uses the moments of the NG-KL variables to calculate the moments and autocorrelation function of the output processes based on a first-order Taylor expansion (linearization) of the system equations of motion. Using the output moments and autocorrelation function, another NG-KL expansion expresses the output processes in terms of uncorrelated non-Gaussian variables in the time domain, allowing the generation of output trajectories. The latter are used to estimate the time-dependent probability of failure using Monte Carlo Simulation (MCS). The computational cost of the proposed approach is proportional to the number of NG-KL random variables and is significantly lower than that of other recently developed methodologies which are based on sampling. The accuracy and efficiency of the proposed methodology is demonstrated using a two-degree of freedom nonlinear vibratory system with random coefficients excited by a stationary non-Gaussian random process.
Proceedings Papers
Proc. ASME. IDETC-CIE2018, Volume 2B: 44th Design Automation Conference, V02BT03A049, August 26–29, 2018
Paper No: DETC2018-85193
Abstract
The performance of a product varies with respect to time and space if the associated limit-state function is a function of time and space. This study develops an uncertainty analysis method that quantifies the effect of random input variables on the performance (response) over time and space. The first order reliability method (FORM) is used to approximate the extreme value of the response with respect to space at discretized instants of time. Then the response becomes a Gaussian stochastic process that is fully defined by the mean, variance, and autocorrelation functions obtained from FORM, where a sequential single loop procedure is performed for spatial and random variables. The method is successfully applied to the reliability analysis of a crank-slider mechanism, which operates in a specified period of time and space.
Proceedings Papers
Proc. ASME. IDETC-CIE2017, Volume 2B: 43rd Design Automation Conference, V02BT03A050, August 6–9, 2017
Paper No: DETC2017-67313
Abstract
A general methodology is presented for time-dependent reliability and random vibrations of nonlinear vibratory systems with random parameters excited by non-Gaussian loads. The approach is based on Polynomial Chaos Expansion (PCE), Karhunen-Loeve (KL) expansion and Quasi Monte Carlo (QMC). The latter is used to estimate multi-dimensional integrals efficiently. The input random processes are first characterized using their first four moments (mean, standard deviation, skewness and kurtosis coefficients) and a correlation structure in order to generate sample realizations (trajectories). Characterization means the development of a stochastic metamodel. The input random variables and processes are expressed in terms of independent standard normal variables in N dimensions. The N-dimensional input space is space-filled with M points. The system differential equations of motion are time integrated for each of the M points and QMC estimates the four moments and correlation structure of the output efficiently. The proposed PCE-KL-QMC approach is then used to characterize the output process. Finally, classical MC simulation estimates the time-dependent probability of failure using the developed stochastic metamodel of the output process. The proposed methodology is demonstrated with a Duffing oscillator example under non-Gaussian load.
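A minimal sketch of the QMC step outlined above, assuming a scrambled Sobol sequence to space-fill the N-dimensional standard-normal input space and sample statistics for the first four output moments. The quadratic response function stands in for time integrating the equations of motion and is not taken from the paper.

```python
# Space-fill an assumed N-dimensional standard-normal input space with a Sobol
# sequence and estimate the first four moments of a placeholder response.
import numpy as np
from scipy.stats import qmc, norm, skew, kurtosis

N = 4                                         # number of independent standard normal inputs
sobol = qmc.Sobol(d=N, scramble=True, seed=0)
u = sobol.random_base2(m=11)                  # 2**11 = 2048 space-filling points in [0, 1)^N
z = norm.ppf(u)                               # map to independent standard normal variables

def response(z):
    """Placeholder nonlinear response; the paper time-integrates the equations of motion."""
    return z[:, 0] + 0.3 * z[:, 1] ** 2 + 0.1 * z[:, 2] * z[:, 3]

y = response(z)
print("mean    ", y.mean())
print("std     ", y.std(ddof=1))
print("skewness", skew(y))
print("kurtosis", kurtosis(y, fisher=False))
```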
Proceedings Papers
Proc. ASME. IDETC-CIE2017, Volume 5B: 41st Mechanisms and Robotics Conference, V05BT08A083, August 6–9, 2017
Paper No: DETC2017-68341
Abstract
This study presents new results on a method to solve large kinematic synthesis systems termed Finite Root Generation. The method reduces the number of startpoints used in homotopy continuation to find all the roots of a kinematic synthesis system. For a single execution, many start systems are generated with corresponding startpoints using a random process such that startpoints only track to finite roots. Current methods are burdened by computations of roots to infinity. New results include a characterization of scaling for different problem sizes, a technique for scaling down problems using cognate symmetries, and an application for the design of a spined pinch gripper mechanism. We show that the expected number of iterations to perform increases approximately linearly with the quantity of finite roots for a given synthesis problem. An implementation that effectively scales the four-bar path synthesis problem by six using its cognate structure found 100% of roots in an average of 16,546 iterations over ten executions. This marks a roughly six-fold improvement over the basic implementation of the algorithm.
Proceedings Papers
Proc. ASME. IDETC-CIE2017, Volume 2A: 43rd Design Automation Conference, V02AT03A042, August 6–9, 2017
Paper No: DETC2017-67426
Abstract
This paper presents a sequential Kriging optimization approach (SKO) for time-variant reliability-based design optimization (tRBDO) with the consideration of stochastic processes. To handle the extremely high dimensionality associated with time-variant uncertainties, stochastic processes are transformed to random parameters through the equivalent stochastic transformation, leading to equivalent time-independent reliability models that are capable of capturing system failures over time. To alleviate computational burden, Kriging-based surrogate modeling is utilized to predict the response of engineered systems. It is further integrated with Monte Carlo simulation (MCS) to approximate the probability of failure. To reduce the epistemic uncertainty due to the lack of data, a maximum confidence enhancement method (MCE) is employed to iteratively identify important points for updating surrogate models. Sensitivities of reliability with respect to design variables are estimated using the first-order score function in the proposed tRBDO framework. Two case studies are introduced to demonstrate the efficiency and accuracy of the proposed approach.
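The snippet below is a minimal sketch of the Kriging-plus-MCS idea, using scikit-learn's Gaussian-process regressor as the surrogate. The limit-state function, the training design, and the kernel settings are illustrative assumptions, and the equivalent stochastic transformation, MCE updating, and sensitivity steps of the paper are omitted.

```python
# Fit a Kriging (Gaussian-process) surrogate to a handful of limit-state
# evaluations, then estimate the probability of failure by Monte Carlo
# sampling of the cheap surrogate.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def limit_state(x):
    """Failure when g(x) < 0; a cheap analytic stand-in for the true model."""
    return 3.0 - x[:, 0] ** 2 - 0.5 * x[:, 1]

rng = np.random.default_rng(1)
X_train = rng.normal(size=(40, 2))           # small training design in the random-variable space
y_train = limit_state(X_train)

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
gp.fit(X_train, y_train)

X_mc = rng.normal(size=(200_000, 2))         # large Monte Carlo sample, evaluated on the surrogate
g_hat = gp.predict(X_mc)
print("estimated probability of failure:", np.mean(g_hat < 0.0))
```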
Proceedings Papers
Proc. ASME. IDETC-CIE2016, Volume 1A: 36th Computers and Information in Engineering Conference, V01AT02A021, August 21–24, 2016
Paper No: DETC2016-59260
Abstract
When component dependence is ignored, a system reliability model may have large model (epistemic) uncertainty with wide reliability bounds. This makes decision making difficult during the system design. Component dependence exists due to the shared environment and operation conditions. It is difficult for system designers to model component dependence because they may not have access to component design details if the components are designed and manufactured by outside suppliers. This research intends to reduce the system reliability model uncertainty with a new way for system designers to consider the component dependence implicitly and automatically without knowing component design details. The proposed method is applicable for a wide range of applications where the time-dependent system stochastic load is shared by components of the system. Simulation is used to obtain the extreme value of the system load for a given period of time, and optimization is employed to estimate the system reliability interval. As a result, the epistemic uncertainty in system reliability can be reduced.
Proceedings Papers
Proc. ASME. IDETC-CIE2016, Volume 2B: 42nd Design Automation Conference, V02BT03A045, August 21–24, 2016
Paper No: DETC2016-59185
Abstract
A common strategy for the modeling of stochastic loads in time-dependent reliability analysis is to describe the loads as independent Gaussian stochastic processes. This assumption does not hold for many engineering applications. This paper proposes a Vine-autoregressive-moving average (Vine-ARMA) load model for time-dependent reliability analysis, in problems with a vector of correlated non-Gaussian stochastic loads. The marginal stochastic processes are modeled as univariate ARMA models. The correlations between different univariate ARMA models are captured using the Vine-copula. The ARMA model maintains the correlation over time. The Vine-copula represents not only the correlation between different ARMA models, but also the tail dependence of different ARMA models. The developed Vine-ARMA model therefore can flexibly model a vector of high-dimensional correlated non-Gaussian stochastic processes with the consideration of tail dependence. Due to the complicated structure of the Vine-ARMA model, new challenges are introduced in time-dependent reliability analysis. In order to overcome these challenges, the Vine-ARMA model is integrated with a recently developed single-loop Kriging (SILK) surrogate modeling method. A hydrokinetic turbine blade subjected to a vector of correlated river flow loads is used to demonstrate the effectiveness of the proposed method.
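The sketch below illustrates only the marginal ARMA building block. The vine-copula coupling (and its tail dependence) is replaced by simply correlating the Gaussian innovations of two ARMA(1,1) processes, so this is a stand-in for, not an implementation of, the Vine-ARMA model; all coefficients are assumed.

```python
# Simulate two ARMA(1,1) load processes whose innovations are correlated.
import numpy as np

def simulate_arma11(eps, phi=0.7, theta=0.3):
    """x_t = phi * x_{t-1} + eps_t + theta * eps_{t-1}."""
    x = np.zeros_like(eps)
    for t in range(1, len(eps)):
        x[t] = phi * x[t - 1] + eps[t] + theta * eps[t - 1]
    return x

rng = np.random.default_rng(2)
n, rho = 5000, 0.6
eps = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)  # correlated innovations
load1 = simulate_arma11(eps[:, 0])
load2 = simulate_arma11(eps[:, 1])
print("cross-correlation of the two load processes:", np.corrcoef(load1, load2)[0, 1])
```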
Proceedings Papers
Proc. ASME. IDETC-CIE2016, Volume 2B: 42nd Design Automation Conference, V02BT03A058, August 21–24, 2016
Paper No: DETC2016-60312
Abstract
Engineering systems are often modeled as a large dimensional random process with additive noise. The analysis of such a system involves the solution of a simultaneous system of Stochastic Differential Equations (SDEs). The exact solution to the SDE is given by the evolution of the probability density function (pdf) of the state vector through the application of stochastic calculus. The Fokker-Planck-Kolmogorov Equation (FPKE) provides an approximate solution to the SDE by giving the time evolution equation for the non-Gaussian pdf of the state vector. In this paper, we outline a computational framework that combines linearization, a clustering technique, and the Adaptive Gaussian Mixture Model (AGMM) methodology for solving the FPKE for a high dimensional system. The linearization and clustering technique facilitate the decomposition of the overall high dimensional FPKE system into a finite number of much lower dimensional FPKE systems. The decomposition enables the solution method to be faster. Numerical simulations test the efficacy of our developed framework.
Proceedings Papers
Proc. ASME. IDETC-CIE2015, Volume 8: 27th Conference on Mechanical Vibration and Noise, V008T13A076, August 2–5, 2015
Paper No: DETC2015-47586
Abstract
This paper is dedicated to the analysis of uncertainties affecting the load capability of a 4-pad tilting-pad journal bearing, in which the load is applied between two pads (load on pad configuration; LOP). A well-known stochastic method, Monte Carlo simulation, has been extensively used to model uncertain parameters. However, in the present contribution, the inherent uncertainties of the bearings’ parameters (i.e., the pad radius, the oil viscosity, and the radial clearance) are modeled by using a fuzzy logic based analysis. This alternative methodology seems to be more appropriate when the stochastic process that characterizes the uncertainties is unknown. The analysis procedure is confined to the load capability of the bearing, which is generated by the envelopes of the pressure fields developed on each pad. The hydrodynamic supporting forces are determined by considering a nonlinear model, which is obtained from the solution of the Reynolds’ equation. The most significant results are associated with the changes in the dynamic behavior of the bearing because of the reaction forces that are modified according to the uncertainties introduced in the system. Finally, it is worth mentioning that the uncertainty analysis in this case provides relevant information both for design and maintenance of tilting-pad hydrodynamic bearings.
Proceedings Papers
Proc. ASME. IDETC-CIE2015, Volume 2B: 41st Design Automation Conference, V02BT03A050, August 2–5, 2015
Paper No: DETC2015-46168
Abstract
The response of a component in a multidisciplinary system is affected by not only the discipline to which it belongs, but also by other disciplines of the system. If any components are subject to time-dependent uncertainties, responses of all the components and the system are also time dependent. Thus, time-dependent multidisciplinary reliability analysis is required. To extend the current time-dependent reliability analysis for a single component, this work develops a time-dependent multidisciplinary reliability method for components in a multidisciplinary system under stationary stochastic processes. The method modifies the First and Second Order Reliability Methods (FORM and SORM) so that the Multidisciplinary Analysis (MDA) is incorporated while approximating the limit-state function of the component under consideration. Then Monte Carlo simulation is used to calculate the reliability without calling the original limit-state function. Two examples are used to demonstrate and evaluate the proposed method.
Proceedings Papers
Proc. ASME. IDETC-CIE2015, Volume 2B: 41st Design Automation Conference, V02BT03A062, August 2–5, 2015
Paper No: DETC2015-47925
Abstract
One of the essential steps in time-dependent reliability analysis is the characterization of stochastic load processes and system random variables based on experimental or historical data. Limited data results in uncertainty in the modeling of random variables and stochastic loadings. The uncertainty in random variable and stochastic load models later causes uncertainty in the results of reliability analysis. An uncertainty quantification framework is developed in this paper for time-dependent reliability analysis. The effects of two kinds of uncertainty sources, namely data uncertainty and model uncertainty on the results of time-dependent reliability analysis are investigated. The Bayesian approach is employed to model the epistemic uncertainty sources in random variables and stochastic processes. A straightforward formulation of uncertainty quantification in time-dependent reliability analysis results in a double-loop implementation, which is computationally expensive. Therefore, this paper builds a surrogate model for the conditional reliability index in terms of variables with imprecise parameters. Since the conditional reliability index is independent of the epistemic uncertainty, the surrogate model is applicable for any realizations of the epistemic uncertainty. Based on the surrogate model, the uncertainty in time-dependent reliability analysis is quantified without evaluating the original limit-state function, which increases the efficiency of uncertainty quantification. The effectiveness of the proposed method is demonstrated using a mathematical example and an engineering application example.
Proceedings Papers
Proc. ASME. IDETC-CIE2014, Volume 2A: 40th Design Automation Conference, V02AT03A037, August 17–20, 2014
Paper No: DETC2014-34313
Abstract
A new metamodeling approach is proposed to characterize the output (response) random process of a dynamic system with random variables, excited by input random processes. The metamodel is then used to efficiently estimate the time-dependent reliability. The input random processes are decomposed using principal components or wavelets and a few simulations are used to estimate the distributions of the decomposition coefficients. A similar decomposition is performed on the output random process. A Kriging model is then built between the input and output decomposition coefficients and is used subsequently to quantify the output random process corresponding to a realization of the input random variables and random processes. In our approach, the system input is not deterministic but random. We therefore establish a surrogate model between the input and output random processes. The quantified output random process is finally used to estimate the time-dependent reliability or probability of failure using the total probability theorem. The proposed method is illustrated with a corroding beam example.
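A brief sketch of the decomposition step described above: a bundle of synthetic input trajectories is reduced to a few principal-component coefficients via an SVD. The subsequent Kriging map from input coefficients to output-process coefficients, and the reliability estimate itself, are omitted; the toy trajectory ensemble is an assumption.

```python
# Reduce an ensemble of simulated trajectories to a few principal-component
# coefficients and check how well they reconstruct the original paths.
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0.0, 1.0, 256)
trajs = np.sin(2 * np.pi * t) * rng.normal(1.0, 0.2, (500, 1)) \
        + rng.normal(0.0, 0.1, (500, t.size))                 # placeholder excitation ensemble

mean = trajs.mean(axis=0)
U, s, Vt = np.linalg.svd(trajs - mean, full_matrices=False)   # PCA via SVD
k = 5                                                         # retained principal components
coeffs = U[:, :k] * s[:k]                                     # one coefficient vector per trajectory
recon = mean + coeffs @ Vt[:k]                                # low-dimensional reconstruction
print("retained variance fraction:", (s[:k] ** 2).sum() / (s ** 2).sum())
print("max reconstruction error:  ", np.abs(recon - trajs).max())
```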
Proceedings Papers
Proc. ASME. IDETC-CIE2014, Volume 2B: 40th Design Automation Conference, V02BT03A046, August 17–20, 2014
Paper No: DETC2014-34326
Abstract
The failure rate of dynamic systems with random parameters is time-varying even for linear systems excited by a stationary random input. In this paper, we propose a simulation-based method to estimate this time-varying failure rate. The input and output stochastic processes are discretized using a small time step to calculate the trajectories of the output stochastic process accurately through simulation. The planning horizon (time of interest) is then partitioned into a series of longer correlated time intervals and the Saddlepoint approximation (SPA) is employed to estimate the distribution of maximum response and thus obtain the probability of failure in each time interval. Using the same simulated trajectories with SPA, a time-dependent copula is built to provide the correlation between the response in each time interval and the response up to that time interval. The time-varying failure rate is finally estimated at each discrete time, using the probability of failure in each time interval and the correlation information from the estimated copula. The effectiveness of the proposed method is illustrated with a vehicle vibration example.
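The sketch below is a brute-force Monte Carlo stand-in for the time-varying failure rate, not the paper's SPA/copula estimator: trajectories of a toy AR(1) response are simulated, the first failure interval of each is recorded, and the hazard is formed as failures per surviving trajectory per unit time. The response model, threshold, and time step are assumptions.

```python
# Estimate a time-varying failure (hazard) rate by counting, in each time
# interval, the trajectories that fail for the first time among those still
# surviving at the start of the interval.
import numpy as np

rng = np.random.default_rng(4)
n_paths, n_steps, dt, threshold = 20_000, 200, 0.05, 2.8

x = np.zeros((n_paths, n_steps))                 # toy AR(1) response process
for k in range(1, n_steps):
    x[:, k] = 0.95 * x[:, k - 1] + rng.normal(0.0, 0.3, n_paths)

failed = x > threshold
first_fail = np.where(failed.any(axis=1), failed.argmax(axis=1), n_steps)

hazard = np.empty(n_steps)
for k in range(n_steps):
    at_risk = np.sum(first_fail >= k)            # trajectories surviving into interval k
    hazard[k] = np.sum(first_fail == k) / (at_risk * dt) if at_risk else np.nan
print("hazard rate over the last five intervals:", np.round(hazard[-5:], 4))
```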
Proceedings Papers
Proc. ASME. IDETC-CIE2014, Volume 2B: 40th Design Automation Conference, V02BT03A045, August 17–20, 2014
Paper No: DETC2014-34294
Abstract
The authors have recently proposed a ‘decision-based’ framework to design and maintain repairable systems. In their approach, a multiobjective optimization problem is solved to identify the best design using multiple short and long-term statistical performance metrics. The design solution considers the initial design, the system maintenance throughout the planning horizon, and the protocol to operate the system. Analysis and optimization of complex systems such as a microgrid are, however, computationally intensive. The problem is exacerbated if we must incorporate flexibility in terms of allowing the microgrid architecture and its running protocol to change with time. To reduce the computational effort, this paper proposes an approach that “learns” the working characteristics of the microgrid and quantifies the stochastic processes of the total load and total supply using autoregressive time series. This allows us to extrapolate the microgrid operation in time and therefore eliminate the need to perform a full system simulation for the entire long-term planning horizon. The approach can be applied to any repairable system. We show that building in flexibility in the design of repairable systems is computationally feasible and leads to better designs.
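A short sketch of the autoregressive time-series idea: an AR(p) model is fitted to a load history by ordinary least squares and then extrapolated, which is the mechanism that allows the long planning horizon to be sampled without a full system simulation. The synthetic load history and the choice p = 2 are assumptions.

```python
# Fit an AR(p) model to a load history by least squares and extrapolate it.
import numpy as np

rng = np.random.default_rng(5)
true_phi = [0.6, 0.25]
load = np.zeros(2000)
for t in range(2, load.size):                   # synthetic AR(2) "observed" load history
    load[t] = true_phi[0] * load[t - 1] + true_phi[1] * load[t - 2] + rng.normal()

def fit_ar(x, p):
    """Least-squares estimate of AR(p) coefficients and innovation standard deviation."""
    X = np.column_stack([x[p - i - 1:-i - 1] for i in range(p)])   # lagged regressors
    y = x[p:]
    phi, *_ = np.linalg.lstsq(X, y, rcond=None)
    return phi, (y - X @ phi).std(ddof=p)

phi, sigma = fit_ar(load, p=2)
print("estimated AR coefficients:", np.round(phi, 3), " innovation std:", round(sigma, 3))

future = list(load[-2:])                        # extrapolate beyond the observed horizon
for _ in range(500):
    future.append(phi[0] * future[-1] + phi[1] * future[-2] + sigma * rng.normal())
```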
Proceedings Papers
Proc. ASME. IDETC-CIE2014, Volume 2B: 40th Design Automation Conference, V02BT03A052, August 17–20, 2014
Paper No: DETC2014-35078
Abstract
A new reliability analysis method is proposed for time-dependent problems whose limit-state functions depend on input random variables and input random processes and are explicit in time, using the total probability theorem and the concept of a composite limit state. The input random processes are assumed Gaussian. They are expressed in terms of standard normal variables using a spectral decomposition method. The total probability theorem is employed to calculate the time-dependent probability of failure using a time-dependent conditional probability which is computed accurately and efficiently in the standard normal space using FORM and a composite limit state of linear instantaneous limit states. If the dimensionality of the total probability theorem integral (equal to the number of input random variables) is small, we can easily calculate it using Gauss quadrature numerical integration. Otherwise, simple Monte Carlo simulation or adaptive importance sampling is used based on a pre-built Kriging metamodel of the conditional probability. An example from the literature on the design of a hydrokinetic turbine blade under time-dependent river flow load demonstrates all developments.
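A compact sketch of the outer total-probability integral: given a conditional probability of failure as a function of a standard normal input variable (which in the paper comes from FORM and the composite limit state), Gauss-Hermite quadrature evaluates the unconditional probability. The conditional-probability function below is an analytic placeholder.

```python
# Evaluate P_f = E_X[ P(F | X) ] for X ~ N(0, 1) with Gauss-Hermite quadrature
# and cross-check the result with crude Monte Carlo.
import numpy as np
from scipy.stats import norm

def conditional_pf(x):
    """Placeholder for the FORM-based conditional failure probability P(F | X = x)."""
    return norm.cdf(-2.5 + 0.8 * x)

# Physicists' Gauss-Hermite rule (weight exp(-t^2)); rescale for a standard normal.
nodes, weights = np.polynomial.hermite.hermgauss(30)
pf_quad = np.sum(weights * conditional_pf(np.sqrt(2.0) * nodes)) / np.sqrt(np.pi)

x_mc = np.random.default_rng(6).standard_normal(1_000_000)
print("quadrature:", pf_quad, " Monte Carlo check:", conditional_pf(x_mc).mean())
```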
Proceedings Papers
Proc. ASME. IDETC-CIE2014, Volume 8: 26th Conference on Mechanical Vibration and Noise, V008T11A015, August 17–20, 2014
Paper No: DETC2014-34746
Abstract
False nearest neighbors (FNN) is one of the essential methods used in estimating the minimally sufficient embedding dimension in delay coordinate embedding of deterministic time series. Its use for stochastic and noisy deterministic time series is problematic and erroneously indicates a finite embedding dimension. Various modifications to the original method have been proposed to mitigate this problem, but those are still not reliable for noisy time series. Nearest neighbor statistics are studied for uncorrelated random time series and contrasted with the deterministic statistics. A new FNN metric is constructed and its performance is evaluated for deterministic, stochastic, and random time series. The results are also contrasted with surrogate data analysis and show that the new metric is robust to noise. It also clearly identifies random time series as not having a finite embedding dimension and provides information about the deterministic part of stochastic processes. The new metric can also be used for differentiating between chaotic and random time series.
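For context, the snippet below is a minimal implementation of the classic FNN test rather than the modified metric proposed in the paper: a scalar series is delay-embedded, and a nearest neighbor is flagged as false if its separation grows sharply when one more coordinate is added. The delay, tolerance, and white-noise test series are assumptions.

```python
# Classic false-nearest-neighbors test on a delay-embedded scalar series.
import numpy as np

def delay_embed(x, dim, tau=1):
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def fnn_fraction(x, dim, tau=1, ratio_tol=15.0):
    emb = delay_embed(x, dim, tau)
    nxt = delay_embed(x, dim + 1, tau)
    emb = emb[:len(nxt)]                               # align lengths
    false_count = 0
    for i in range(len(nxt)):
        d = np.linalg.norm(emb - emb[i], axis=1)
        d[i] = np.inf                                  # exclude the point itself
        j = np.argmin(d)                               # nearest neighbor in dimension `dim`
        grown = abs(nxt[i, -1] - nxt[j, -1])           # separation added by the extra coordinate
        if d[j] > 0 and grown / d[j] > ratio_tol:
            false_count += 1
    return false_count / len(nxt)

rng = np.random.default_rng(7)
noise = rng.standard_normal(1500)                      # uncorrelated random series
for m in (1, 2, 3, 4):
    print(f"embedding dimension {m}: FNN fraction = {fnn_fraction(noise, m):.3f}")
```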
Proceedings Papers
Proc. ASME. IDETC-CIE2013, Volume 3B: 39th Design Automation Conference, V03BT03A056, August 4–7, 2013
Paper No: DETC2013-13361
Abstract
As a metamodeling method, Kriging has been intensively developed for deterministic design in the past few decades. However, Kriging is not able to deal with the uncertainty of many engineering processes. By incorporating the uncertainty of data, Stochastic Kriging methods have been developed to analyze and predict random simulation results, but the results do not fit problems with uncertainty well. In this paper, deterministic Kriging is extended to stochastic space theoretically, where a novel form of Stochastic Kriging that fully considers the intrinsic uncertainty of the data and the number of replications is proposed on the basis of finite inputs. It formulates a more reasonable optimization problem via a stochastic process, and then derives the spatial correlation models underlying a random simulation. The obtained results are more general than Kriging and fit well with many uncertainty-based problems. Three examples illustrate the method’s application through comparison with existing methods; the results of the novel method are much closer to reality.
Proceedings Papers
Proc. ASME. IDETC-CIE2013, Volume 2A: 33rd Computers and Information in Engineering Conference, V02AT02A029, August 4–7, 2013
Paper No: DETC2013-13221
Abstract
This work examines the effect of one key aspect of General Purpose Graphics Processing Unit (GPGPU) computing on the realism and fidelity of stochastic simulations. In particular it is shown that the asynchronous nature of GPGPU computing can be leveraged to produce increased fidelity and realism, compared to conventional computing methods, when applied to probabilistic or stochastic simulations. This is a multifaceted argument that shows: 1) Asynchronous behaviors are essential to produce high computational throughput on GPGPU devices, and thus allow more rigorous sampling, which in turn enables a deeper understanding of the underlying stochastic processes. 2) Asynchronous GPGPU computing can eliminate the “global clock” present in simulations and potentially produce a better representation of the underlying process. This paper also attempts to give a working introduction to GPGPU computing, and to the applications of this technology in the field of stochastic simulation. A range of literature regarding these simulations is also surveyed, in order to provide context. A demonstration of synchronous versus asynchronous algorithms for robot swarm path planning is used to illustrate this discussion. Several notes on the limitations of GPGPU computing in this field are also made, along with remarks regarding future development of GPGPU-accelerated stochastic simulations.
Proceedings Papers
Proc. ASME. IDETC-CIE2013, Volume 3B: 39th Design Automation Conference, V03BT03A048, August 4–7, 2013
Paper No: DETC2013-12257
Abstract
Time-dependent reliability is the probability that a system will perform its intended function successfully for a specified time. Unless many and often unrealistic assumptions are made, the accuracy and efficiency of time-dependent reliability estimation are major issues which may limit its practicality. Monte Carlo simulation (MCS) is accurate and easy to use but it is computationally prohibitive for high dimensional, long duration, time-dependent (dynamic) systems with a low failure probability. This work addresses systems with random parameters excited by stochastic processes. Their response is calculated by time integrating a set of differential equations at discrete times. The limit-state functions are therefore explicit in time and depend on time-invariant random variables and time-dependent stochastic processes. We present an improved subset simulation with splitting approach by partitioning the original high dimensional random process into a series of correlated, short duration, low dimensional random processes. Subset simulation reduces the computational cost by introducing appropriate intermediate failure sub-domains to express the low failure probability as a product of larger conditional failure probabilities. Splitting is an efficient sampling method to estimate the conditional probabilities. The proposed subset simulation with splitting not only estimates the time-dependent probability of failure at a given time but also estimates the cumulative distribution function up to that time with approximately the same cost. A vibration example involving a vehicle on a stochastic road demonstrates the advantages of the proposed approach.
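The sketch below is a compact subset simulation on a static toy limit state, showing only how intermediate failure sub-domains turn one small probability into a product of larger conditional probabilities; the time-dependent, splitting-based machinery of the paper is not reproduced. The limit state, the level probability p0, and the Metropolis step size are assumptions.

```python
# Subset simulation for a rare event: failure when g(x) < 0 with x a
# d-dimensional standard normal vector.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(8)
d, n, p0 = 10, 2000, 0.1
g = lambda x: 4.5 - x.sum(axis=1) / np.sqrt(d)         # toy limit state; exact P_f = Phi(-4.5)

x = rng.standard_normal((n, d))
y = g(x)
pf = 1.0
for _ in range(20):
    thresh = np.quantile(y, p0)                        # intermediate threshold
    if thresh <= 0.0:                                  # final failure level reached
        pf *= np.mean(y <= 0.0)
        break
    pf *= p0
    seeds = x[y <= thresh]                             # conditional samples used as MCMC seeds
    reps = int(np.ceil(n / len(seeds)))
    x = np.tile(seeds, (reps, 1))[:n]
    y = g(x)
    for _ in range(10):                                # random-walk Metropolis inside the sub-domain
        cand = x + 0.8 * rng.standard_normal(x.shape)
        g_cand = g(cand)
        ratio = np.exp(0.5 * (np.sum(x ** 2, axis=1) - np.sum(cand ** 2, axis=1)))
        accept = (rng.random(n) < ratio) & (g_cand <= thresh)
        x[accept], y[accept] = cand[accept], g_cand[accept]

print("subset-simulation estimate:", pf, " reference Phi(-4.5):", norm.cdf(-4.5))
```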
Proceedings Papers
Proc. ASME. IDETC-CIE2013, Volume 3B: 39th Design Automation Conference, V03BT03A044, August 4–7, 2013
Paper No: DETC2013-12033
Abstract
Fatigue damage analysis is critical for systems under stochastic loadings. To estimate the fatigue reliability at the design level, a hybrid reliability analysis method is proposed in this work. The First Order Reliability Method (FORM), the inverse FORM, and the peak distribution analysis are integrated for the fatigue reliability analysis at the early design stage. Equations for the mean value, the zero upcrossing rate, and the extreme stress distributions are derived for problems where stationary stochastic processes are involved. Then the fatigue damage is analyzed with the peak counting method. The developed methodology is demonstrated by a simple mathematical example and is then applied to the fatigue reliability analysis of a shaft under stochastic loadings. The results indicate the effectiveness of the proposed method in predicting fatigue damage and reliability.
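As a small illustration of the upcrossing-rate ingredient mentioned above, the sketch below synthesizes a stationary Gaussian stress history from an assumed discretized spectrum, counts level upcrossings, and compares the count with Rice's formula; the spectrum, level, and duration are placeholders, and the peak-counting and damage steps of the paper are omitted.

```python
# Compare counted level upcrossings of a synthetic Gaussian stress history
# against Rice's formula nu(a) = (sigma_dot / (2*pi*sigma)) * exp(-a^2 / (2*sigma^2)).
import numpy as np

rng = np.random.default_rng(9)
omega = np.linspace(0.5, 6.0, 200)                    # discretized frequency axis [rad/s]
S = 1.0 / (1.0 + omega ** 4)                          # assumed one-sided stress spectrum
dw = omega[1] - omega[0]
amp = np.sqrt(2.0 * S * dw)
phase = rng.uniform(0.0, 2.0 * np.pi, omega.size)

t = np.arange(0.0, 2000.0, 0.05)
stress = (amp * np.cos(np.outer(t, omega) + phase)).sum(axis=1)   # spectral synthesis

sigma = np.sqrt(np.sum(S * dw))                       # process standard deviation from the spectrum
sigma_dot = np.sqrt(np.sum(omega ** 2 * S * dw))      # standard deviation of the derivative process
level = 2.0 * sigma                                   # assumed stress level of interest

counted = np.sum((stress[:-1] < level) & (stress[1:] >= level)) / t[-1]
rice = sigma_dot / (2.0 * np.pi * sigma) * np.exp(-level ** 2 / (2.0 * sigma ** 2))
print("counted upcrossing rate:", counted, " Rice's formula:", rice)
```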