Abstract

Engineering systems are often composed of many subsystems that interact with each other. These subsystems, referred to as disciplines, contain many types of uncertainty and, in many cases, are feedback-coupled with each other. In designing these complex systems, one needs to assess their stationary behavior for the sake of stability and reliability. This requires system-level uncertainty analysis of the multidisciplinary system, which is often computationally intractable. To overcome this issue, techniques have been developed for capturing the stationary behavior of coupled multidisciplinary systems through the available data of individual disciplines. The accuracy and convergence of the existing techniques depend on a large amount of data from all disciplines, which is not available in many practical problems. Toward this, we have developed an adaptive methodology that adds the minimum possible number of samples from individual disciplines to achieve an accurate and reliable uncertainty propagation in coupled multidisciplinary systems. The proposed method models each discipline function via Gaussian process (GP) regression to derive a closed-form policy. This policy sequentially selects a new sample point that results in the highest uncertainty reduction over the distribution of the coupling design variables. The effectiveness of the proposed method is demonstrated in the uncertainty analysis of an aerostructural system and a coupled numerical example.

1 Background

Multidisciplinary systems consist of several subsystems, known as disciplines, interacting with each other. These systems have many applications, particularly in aerospace engineering, such as modeling aerostructural-thermal coupling in hypersonic flight [1], turbine-engine cycle analysis [2], satellite performance analysis [3], topology optimization [4], and more. A primary goal in designing a multidisciplinary system is to achieve a stable and reliable system with desirable stationary properties. The uncertainty involved in the disciplines (e.g., parametric uncertainty, parametric variability, code uncertainty, model discrepancy [5]) and the feedback and feed-forward interactions make the analysis of these complex multidisciplinary systems a challenging task. This analysis consists of an iterative uncertainty propagation, which becomes much more complicated when the disciplinary models are managed by separate entities or housed in separate locations, when analysis capabilities run on different computational platforms, when models have significantly different analysis run times, or as the number of disciplines required for a given analysis increases. This uncertainty analysis requires knowledge of the discipline functions or a large amount of data from each discipline.

Decoupling the structure of a given multidisciplinary system is one approach to the uncertainty analysis of feedback-coupled multidisciplinary systems. Such methods include collaborative reliability analysis using most probable point estimation [6], a likelihood-based approach [7,8], and a multidisciplinary first-order reliability method [9]. Decoupling approaches neglect the dependence between the inputs and the coupling variables, which can lead to poor performance when the sensitivity of the system outputs to the coupling variables is high. Dimension reduction and measure transformation, which reduce the dimensionality and propagate the coupling variables between coupled components, have been applied to the analysis of coupled feedback problems [10,11]. Coupling disciplinary models by representing coupling variables with truncated Karhunen–Loève expansions has been studied for multiphysics systems [12]. A decomposition-based uncertainty analysis methodology for feedforward systems with parametric input uncertainty has been developed in Refs. [13,14] without considering feedback coupling. Probabilistic sample-based compositional uncertainty analysis methodologies have been proposed for systems with feedback coupling and model discrepancy [15,16] and parametric uncertainty [17].

The Gibbs sampling process and Monte Carlo sampling-based fixed-point iteration are the common strategies for the uncertainty analysis of coupled multidisciplinary systems. Each iteration in these techniques requires evaluation of the disciplines for a set of samples, making the methods computationally intensive as the number of iterations and samples increases. A considerable amount of attention has been devoted to efficient uncertainty analysis of complex multidisciplinary systems, including the use of surrogate models and simplified representations of systems [18], approximate representations of uncertainty [19], implicit uncertainty propagation [20], reliability-based design optimization [21], multifidelity Gibbs sampling processes [22], robust moment matching [23], and the advanced mean value method [24]. The aforementioned methods are typically inefficient due to the inaccuracy introduced by the approximations and the surrogate predictions of the coupling variables and outputs. For cases in which the disciplinary functions are available through simulations of different fidelity, multifidelity coupled uncertainty propagation methods have been proposed in Refs. [25,26]. These methods overcome the prohibitive cost associated with the high-fidelity disciplinary simulations in the fixed-point iteration by using surrogates to approximate the coupling variables and adaptive strategies to sample from the different fidelity models to refine the surrogates.

In this paper, we lift the assumptions regarding access to the discipline functions and the excessive requirements on expensive data available from the disciplines, two unrealistic features of most of the existing techniques for analysis of coupled multidisciplinary systems. We overcome these limitations by modeling the expensive black-box discipline functions using Gaussian process regression [27,28]. Using these surrogate models as disciplinary functions might lead to an unstable or inaccurate uncertainty propagation process, due to the propagation of extra uncertainty associated with the Gaussian process which comes from the lack of data from the disciplinary models. Overcoming this issue requires reduction of the surrogate-based uncertainty in the design space. This can be achieved by evaluating the expensive functions at new sample locations. We address this problem by introducing an adaptive Gibbs sampling scheme based on Gaussian process regression, which is capable of accurate propagation of the uncertainty with the minimum number of added samples for each discipline. Rather than adding samples to reduce the uncertainty over the entire input space of surrogate models, covariance prediction is used to derive a closed-form policy for selection of a new sample that results in the highest uncertainty reduction in the Gaussian process regression over the Gibbs samples.

It should be mentioned that the problem considered in this paper embraces a wide range of coupled multidisciplinary systems in which access to the disciplines is only possible through limited expensive data. In fact, acquiring even a single data point from a discipline could take days or weeks (i.e., computational cost) or could have real expense (i.e., economic cost). Therefore, the main motivation of this paper is to develop an efficient framework for learning the unknown disciplinary functions with the minimum number of acquired data points to achieve an accurate uncertainty analysis. As the disciplinary functions in this paper are modeled with surrogate Gaussian processes, the main concern here is to reduce the uncertainty associated with these surrogate models, which are then used to perform uncertainty propagation. This ensures that the uncertainty in the surrogate models will not affect the accuracy of the subsequent uncertainty analysis for the design of multidisciplinary systems. The key contributions of this paper are as follows:

  • Developing a framework for efficient uncertainty analysis in coupled multidisciplinary systems with unknown and expensive-to-evaluate disciplinary functions.

  • Deriving a closed-form policy for the sequential addition of the minimum number of expensive samples to each disciplinary function.

  • Enabling an efficient and fast design process in coupled multidisciplinary systems by investing only in important regions of the design space to intelligently infer the models of the disciplinary functions.

The rest of the paper is organized as follows. Section 2 gives a brief introduction to the coupled multidisciplinary systems that we aim to address in this work. Section 3 presents our approach in detail. In Sec. 4, the approach is applied to the uncertainty analysis of a coupled aerodynamics-structures system of an airfoil and a coupled numerical example, and the results are discussed. Finally, conclusions are drawn in Sec. 5.

2 Coupled Multidisciplinary Systems

A generic multidisciplinary system consists of a set of disciplines with some coupling variables denoting the interactions between the discipline functions. The interaction between disciplines in these systems can be feedforward coupling, in which the output of an upstream discipline becomes an input to a downstream discipline, or feedback coupling, where the output of one discipline is input to the other discipline and vice versa. A feedback-coupled multidisciplinary system with n disciplines can be represented as follows:
$x_1 = f_1(x_2, x_3, \ldots, x_n, v_1), \qquad (\text{Discipline } 1)$
$x_2 = f_2(x_1, x_3, \ldots, x_n, v_2), \qquad (\text{Discipline } 2)$
$\vdots$
$x_n = f_n(x_1, x_2, \ldots, x_{n-1}, v_n), \qquad (\text{Discipline } n)$
(1)
where x1, …, xn are the coupling variables, f1, …, fn are the disciplinary functions, and variables v1, …, vn model the uncertainties associated with each discipline. It should be mentioned that here, just for the sake of completeness, all the coupling variables are input to all the disciplines, which might not be the case in practice. It should also be noted that the uncertainty in each discipline could appear as additive or any nonlinear form in the discipline functions.

The uncertainties involved in each discipline and the complex feedback-coupled nature of the interactions between the disciplinary models result in unknown stationary behavior of the entire system, and this unknown behavior must be quantified in the design of coupled multidisciplinary systems. Computing the joint distribution of the coupling variables is key to designing these coupled multidisciplinary systems, and it is critical for assessing the stability and reliability of many systems such as aircraft, spacecraft, automobiles, and industrial manufacturing applications. To compute the joint distribution of the coupling variables, one needs to propagate the uncertainty through the entire system. For a system with known disciplinary functions, the Gibbs sampling process [29] is commonly used to approximate the joint distribution of the coupling variables. This method, which has also been referred to as successive substitution sampling [30], is a Markov chain Monte Carlo (MCMC)-based method for generating samples from a joint distribution that cannot be sampled directly. The Gibbs sampling process for approximating the joint distribution of the coupling variables in a coupled multidisciplinary system is presented in Algorithm 1. Examples of a stopping criterion for this method are a threshold on the minimum change between two consecutive joint distributions of the coupling variables, or a fixed number of iterations specified based on the available computational budget for evaluating the disciplinary functions.

Algorithm 1: Gibbs sampling ($f_1, \ldots, f_n$)

1: Set the number of Gibbs samples $N$ and starting values $\{x_i^{0,j}\}_{j=1}^{N} \subset \mathcal{X}_i$, for $i = 1, \ldots, n$.
2: $k = 0$.
3: Do
4:   $k \leftarrow k + 1$.
5:   for $i = 1$ to $n$ do
6:     Sample $x_i^{k,j} = f_i\big(x_1^{k-1,j}, \ldots, x_{i-1}^{k-1,j}, x_{i+1}^{k-1,j}, \ldots, x_n^{k-1,j}, v_i\big)$, for $j = 1, \ldots, N$.
7:   end for
8: Until (termination criterion is met)
9: Output: $\mathcal{X}^{k,N} := \{x_1^{k,l}, x_2^{k,l}, \ldots, x_n^{k,l}\}_{l=1}^{N}$.
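For concreteness, a minimal Python sketch of Algorithm 1 is given below. It assumes each discipline is available as a callable `f[i]` that accepts the other coupling variables and draws its own uncertainty $v_i$ internally, and it uses a fixed number of iterations as the termination criterion; the function and argument names are ours, introduced only for illustration.

```python
import numpy as np

def gibbs_sampling(f, x0, n_iters=50, rng=None):
    """Sketch of Algorithm 1 (successive substitution sampling).

    f       : list of n callables; f[i](x_others, rng) returns one sample of
              coupling variable x_i given the other coupling variables, with the
              discipline uncertainty v_i drawn inside the callable.
    x0      : array of shape (n, N) holding the N starting values of each x_i.
    n_iters : fixed number of iterations, used here as the termination criterion.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x0, dtype=float)            # x[i, j] = j-th Gibbs sample of x_i
    n, N = x.shape
    for _ in range(n_iters):
        x_prev = x.copy()                    # disciplines use the iteration k-1 values
        for i in range(n):
            others = np.delete(x_prev, i, axis=0)   # (x_1, ..., x_{i-1}, x_{i+1}, ..., x_n)
            for j in range(N):
                x[i, j] = f[i](others[:, j], rng)
    return x                                 # columns are joint samples of (x_1, ..., x_n)
```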

In many practical multidisciplinary systems, the discipline functions are not known and cannot be easily evaluated due to the computational expense and scarcity of data. This renders the Gibbs sampling process in Algorithm 1 inapplicable. Alternative techniques have been developed for approximating the joint distribution of the coupling variables using sample data [16]. However, these techniques often perform poorly as they require a large amount of data from all disciplinary models, which is often not available due to the computational cost associated with the data gathering process.

3 Proposed Approach

In this paper, we aim to accurately and efficiently obtain the joint distribution of the coupling variables in coupled multidisciplinary systems using the available data and acquiring the minimum possible new data from individual disciplines. Toward this, we first construct Gaussian processes for all the disciplinary models using their available data, and then we efficiently acquire the most informative, in terms of uncertainty reduction, data by taking advantage of the Gaussian representation of the models. In the following subsections, the detailed process of the proposed method is described.

3.1 Gaussian Process Regression for Modeling Unknown Discipline Functions.

Gaussian process regression [27,31] has gained prominence in the surrogate modeling and machine learning communities due to its ability to model complex, multidimensional surfaces while simultaneously providing an explicit measure of confidence in its predictive outputs. Here, we use Gaussian process regression to model each discipline (i = 1, …, n) with prior distributions given by:
$f_i^{GP}(x) \sim \mathcal{GP}\big(m_i(x), k_i(x, x')\big)$
(2)
where $m_i(x)$ is the mean function and $k_i(x, x')$ is a real-valued kernel function quantifying the correlation between samples in the input space. For the kernel function, we consider the commonly used squared exponential covariance function [27,32], which is specified as
$k(x, x') = \sigma_f^2 \exp\left(-\dfrac{\|x - x'\|_2^2}{2 l^2}\right)$
(3)
where $\|\cdot\|_2^2$ is the squared $L_2$-norm, $\sigma_f^2$ determines the prior variance, and $l$ is the length scale that specifies the correlation between different points.
Let $X_{N_i} = (\tilde{x}_1, \ldots, \tilde{x}_{N_i})$ and $y_{N_i} = (\tilde{y}_1, \ldots, \tilde{y}_{N_i})$ be the sets of available inputs and outputs of the $i$th discipline function, respectively. The posterior distribution of $f_i$ at any point $x$ in the input design space, given the available data, can be written as follows:
$f_i^{GP}(x) \mid X_{N_i}, y_{N_i} \sim \mathcal{N}\big(\mu_i^{GP}(x), (\sigma_i^{GP}(x))^2\big)$
(4)
where
$\mu_i^{GP}(x) = m_i(x) + K_i(X_{N_i}, x)^T \big[K_i(X_{N_i}, X_{N_i}) + \sigma_{n,i}^2 I_{N_i}\big]^{-1}\big(y_{N_i} - m_i(X_{N_i})\big)$
(5)
$(\sigma_i^{GP}(x))^2 = k_i(x, x) - K_i(X_{N_i}, x)^T \big[K_i(X_{N_i}, X_{N_i}) + \sigma_{n,i}^2 I_{N_i}\big]^{-1} K_i(X_{N_i}, x)$
(6)
with $I_{N_i}$ representing the identity matrix of size $N_i$, and
$K_i(X_m, X_n) = \begin{bmatrix} k_i(x_1, x'_1) & \cdots & k_i(x_1, x'_n) \\ \vdots & \ddots & \vdots \\ k_i(x_m, x'_1) & \cdots & k_i(x_m, x'_n) \end{bmatrix}$
for $X_m = \{x_1, \ldots, x_m\}$ and $X_n = \{x'_1, \ldots, x'_n\}$.
The term $\sigma_{n,i}^2$ in Eq. (6) denotes the uncertainty associated with discipline $i$ in modeling the coupling variables, i.e., the model uncertainty. The hyperparameters of the Gaussian process, namely the mean function and kernel function parameters, can be estimated by maximizing the marginal likelihood function [27]:
$y_{N_i} \mid X_{N_i} \sim \mathcal{N}\big(m_i(X_{N_i}), K_i(X_{N_i}, X_{N_i}) + \sigma_{n,i}^2 I_{N_i}\big)$
(7)
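As a concrete companion to Eqs. (3), (5), and (6), the sketch below computes the squared exponential kernel and the posterior mean and variance of a zero-mean Gaussian process in plain numpy. The hyperparameter values are placeholders of our own choosing; in practice they would be estimated by maximizing the marginal likelihood in Eq. (7).

```python
import numpy as np

def sq_exp_kernel(X1, X2, sigma_f=1.0, length=1.0):
    """Squared exponential covariance of Eq. (3); X1: (m, d), X2: (n, d)."""
    sq_dist = np.sum((X1[:, None, :] - X2[None, :, :]) ** 2, axis=-1)
    return sigma_f**2 * np.exp(-sq_dist / (2.0 * length**2))

def gp_posterior(X_train, y_train, X_test, sigma_n=0.1, sigma_f=1.0, length=1.0):
    """Posterior mean and variance of Eqs. (5) and (6) with a zero mean function."""
    K = sq_exp_kernel(X_train, X_train, sigma_f, length)              # K_i(X_Ni, X_Ni)
    K_s = sq_exp_kernel(X_train, X_test, sigma_f, length)             # K_i(X_Ni, x)
    A = K + sigma_n**2 * np.eye(len(X_train))                         # K + sigma_n^2 I
    mean = K_s.T @ np.linalg.solve(A, y_train)                        # Eq. (5) with m_i(x) = 0
    var = sigma_f**2 - np.sum(K_s * np.linalg.solve(A, K_s), axis=0)  # Eq. (6); k_i(x, x) = sigma_f^2
    return mean, np.maximum(var, 0.0)

# Example: fit a 1D surrogate to three (input, output) pairs and predict on a grid.
X_obs = np.array([[0.0], [0.5], [1.0]])
y_obs = np.array([0.1, 0.7, 0.4])
mu, var = gp_posterior(X_obs, y_obs, np.linspace(0.0, 1.0, 5)[:, None])
```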

3.2 Sequential Queries From Discipline Functions.

With Gaussian process approximations of the discipline functions in hand, the Gibbs sampling method in Algorithm 1 can be employed using the surrogate models. However, the uncertainty in the Gaussian process approximations depends on the amount of data available for each discipline. A large data set yields a surrogate model that represents the true discipline function with high fidelity, whereas a small data set may introduce significant uncertainty into the modeling process. In most real-world multidisciplinary applications, the expense of the data gathering process prevents acquiring a large data set for constructing the surrogate models, and propagating the uncertainty of the surrogate models often destabilizes the Gibbs sampling process. Thus, we aim to develop a methodology that adds the minimum possible number of costly samples to the surrogate model of each discipline in order to achieve stability and accuracy in the estimation of the joint distribution of the coupling variables. In order to select a sample that brings the Gaussian process approximation of a discipline closer to the true disciplinary function, we use the Kullback–Leibler divergence. The Kullback–Leibler (KL) divergence [33,34], denoted by $D_{KL}(P\|Q)$, is a measure of the difference between two probability distributions $P$ and $Q$ with densities $p(x)$ and $q(x)$, defined as:
$D_{KL}(P\|Q) = \int_{-\infty}^{+\infty} p(x) \log\dfrac{p(x)}{q(x)}\, dx$
(8)
The Kullback–Leibler divergence is a measure of the information lost when P is approximated by Q, and a smaller value of this divergence indicates a greater similarity between the two distributions. For multivariate normal distributions with the same dimension k, the Kullback–Leibler divergence is defined as
$D_{KL}(P\|Q) = \dfrac{1}{2}\left(\operatorname{tr}\big(\Sigma_q^{-1}\Sigma_p\big) + (\mu_q - \mu_p)^T \Sigma_q^{-1} (\mu_q - \mu_p) - k + \ln\dfrac{\det\Sigma_q}{\det\Sigma_p}\right)$
(9)
where $\mu_p$, $\mu_q$ and $\Sigma_p$, $\Sigma_q$ are the means and covariance matrices of the two distributions. Let $f_i^{GP}$ be the Gaussian process approximation of discipline function $i$ using all information available up to the current time. In the context of the Kullback–Leibler divergence, an input sample for the $i$th discipline can be selected as follows:
$x_i^* = \arg\min_{x \in \mathcal{X}_i} \mathbb{E}_{x' \mid f_1^{GP}, \ldots, f_n^{GP}}\Big[\mathbb{E}\big[D_{KL}\big(f_i(x')\,\|\, f_i^{GP+x}(x')\big)\big]\Big] = \arg\min_{x \in \mathcal{X}_i} \int_{x' \in \mathcal{X}_i} \mathbb{E}\big[D_{KL}\big(f_i(x')\,\|\, f_i^{GP+x}(x')\big)\big]\, p\big(x' \mid f_1^{GP}, \ldots, f_n^{GP}\big)\, dx'$
(10)
where $p(x' \mid f_1^{GP}, \ldots, f_n^{GP})$ denotes the distribution of the coupling variables computed by the surrogate models, and the inner expectation in Eq. (10) is taken over the possible outcomes $f_i^{GP+x}$ of adding the point $x$ to the Gaussian process approximation of discipline function $i$.

Computing the Kullback–Leibler divergence in Eq. (10) is not possible, since it requires knowledge of the true disciplinary functions. To circumvent this, one can use all available information and drive the surrogate models of the disciplinary functions as close as possible to the true functions. This is done by acquiring data from the disciplines to reduce the uncertainty of the Gaussian processes. This uncertainty can be quantified by entropy, which measures the uncertainty that is eliminated, or equivalently the information that is gained, by acquiring a sample. Therefore, a solution to Eq. (10) can be obtained by replacing the expected Kullback–Leibler divergence $\mathbb{E}\big[D_{KL}\big(f_i(x')\,\|\, f_i^{GP+x}(x')\big)\big]$ with the entropy $H\big(f_i^{GP+x}(x')\big)$.

Now, let $\mathcal{X}_i^{GS}$ be the set of samples obtained by executing Algorithm 1 with the current Gaussian process approximations of the disciplines: $\mathcal{X}_i^{GS} \sim p(x \mid f_1^{GP}, \ldots, f_n^{GP})$. The sequential decision in Eq. (10) for selecting the sample to be added to discipline $i$ can then be formulated as:
$x_i^* \approx \arg\min_{x \in \mathcal{X}_i} \int_{x' \in \mathcal{X}_i} H\big(f_i^{GP+x}(x')\big)\, p\big(x' \mid f_1^{GP}, \ldots, f_n^{GP}\big)\, dx' \approx \arg\min_{x \in \mathcal{X}_i^{GS}} \dfrac{1}{N}\sum_{x' \in \mathcal{X}_i^{GS}} H\big(f_i^{GP+x}(x')\big)$
(11)
where the entropy can be calculated as
$H\big(f_i^{GP+x}(x')\big) = \ln\big(\sigma_i^{GP+x}(x')\sqrt{2\pi e}\big)$
(12)
with $\sigma_i^{GP+x}(x')$ denoting the standard deviation at point $x'$ after adding the design point $x$ to the samples $X_{N_i}$ of discipline $i$, which can be computed using the following Gaussian process covariance identity:
$\big(\sigma_i^{GP+x}\big)^2(x') = k_i(x', x') - K_i([X_{N_i}\; x], x')^T \big[K_i([X_{N_i}\; x], [X_{N_i}\; x]) + \sigma_{n,i}^2 I_{N_i+1}\big]^{-1} K_i([X_{N_i}\; x], x')$
(13)
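Because the posterior variance of a Gaussian process depends only on the input locations and not on the observed outputs, Eq. (13) can be evaluated for a candidate $x$ before the discipline is ever run, which is what makes the policy in Eq. (11) closed form. A minimal sketch, reusing the sq_exp_kernel helper from Sec. 3.1 and again assuming a zero-mean GP with placeholder hyperparameters:

```python
import numpy as np

def select_next_sample(X_train, X_gibbs, sigma_n, sigma_f=1.0, length=1.0):
    """Closed-form policy of Eq. (11): among the Gibbs samples, pick the candidate x
    whose (hypothetical) addition minimizes the average posterior entropy over X_i^GS."""
    best_x, best_score = None, np.inf
    for x in X_gibbs:                                          # candidate design points
        X_aug = np.vstack([X_train, x[None, :]])               # [X_Ni x] in Eq. (13)
        K = sq_exp_kernel(X_aug, X_aug, sigma_f, length)
        A = K + sigma_n**2 * np.eye(len(X_aug))
        K_s = sq_exp_kernel(X_aug, X_gibbs, sigma_f, length)
        var = sigma_f**2 - np.sum(K_s * np.linalg.solve(A, K_s), axis=0)   # Eq. (13)
        var = np.maximum(var, 1e-12)                           # guard against round-off
        score = np.mean(np.log(np.sqrt(var * 2.0 * np.pi * np.e)))         # Eqs. (11)-(12)
        if score < best_score:
            best_x, best_score = x, score
    return best_x
```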
This process selects the sample in $\mathcal{X}_i^{GS}$ that maximally reduces the entropy over this set. Upon selection of the best point $x_i^*$ and evaluation of the disciplinary function at the selected sample, i.e., $f_i(x_i^*)$, the Gaussian process approximation of the corresponding discipline is updated, and another Gibbs sampling process is performed with the updated approximations. This continues until all discipline functions are represented properly by their Gaussian process approximations, at which point the results of the Gibbs sampling process can be trusted. This condition can be specified based on the entropy of the current Gaussian processes and the minimum entropy that is achievable. The entropy of the current Gaussian process approximation for discipline $i$ over the samples obtained from the Gibbs sampling process can be computed as follows:
$H_i = \dfrac{1}{N}\sum_{x \in \mathcal{X}_i^{GS}} H\big(f_i^{GP}(x)\big)$
(14)
where
$H\big(f_i^{GP}(x)\big) = \ln\big(\sigma_i^{GP}(x)\sqrt{2\pi e}\big)$
(15)
and $\sigma_i^{GP}$ is computed according to Eq. (6).
The minimum entropy is achieved when the uncertainty associated with the lack of data is eliminated and the Gaussian process approximation of the discipline carries only the variance $\sigma_{n,i}^2$, which is assumed to be known and cannot be eliminated by adding more data. This minimum entropy for discipline $i$ can be computed as
$H_i^{\min} = \ln\big(\sigma_{n,i}\sqrt{2\pi e}\big)$
(16)
Therefore, we define the threshold for proper Gaussian process approximation of the discipline function i as
$T_i = \alpha_i H_i^{\min}$
(17)
where $\alpha_i$ is a user-defined value, set based on the available budget and the computational or economic cost of evaluating each discipline function.
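A corresponding sketch of the stopping test in Eqs. (14)-(17), under the same zero-mean, squared exponential assumptions and reusing sq_exp_kernel from Sec. 3.1 (function names are ours):

```python
import numpy as np

def entropy_over_gibbs(X_train, X_gibbs, sigma_n, sigma_f=1.0, length=1.0):
    """Average entropy H_i of the current GP over the Gibbs samples, Eqs. (14)-(15)."""
    K = sq_exp_kernel(X_train, X_train, sigma_f, length)
    A = K + sigma_n**2 * np.eye(len(X_train))
    K_s = sq_exp_kernel(X_train, X_gibbs, sigma_f, length)
    var = np.maximum(sigma_f**2 - np.sum(K_s * np.linalg.solve(A, K_s), axis=0), 1e-12)
    return np.mean(np.log(np.sqrt(var * 2.0 * np.pi * np.e)))

def surrogate_is_adequate(X_train, X_gibbs, sigma_n, alpha):
    """Check H_i < T_i with T_i = alpha_i * H_i^min from Eqs. (16)-(17)."""
    H_min = np.log(sigma_n * np.sqrt(2.0 * np.pi * np.e))      # Eq. (16)
    return entropy_over_gibbs(X_train, X_gibbs, sigma_n) < alpha * H_min
```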

Hence, the process stops when the average entropy over the Gibbs samples falls below the specified threshold for every discipline, i.e., $H_i < T_i$ for all $i$. This means that the Gaussian process approximations behave like the true functions in the region of the Gibbs sampling solution and, as a result, the solution is close to the Gibbs sampling solution that would be obtained with the true discipline functions. On the other hand, if the entropy is larger than the threshold, the learned Gaussian processes are far from the true functions in these regions, resulting in poor performance. Therefore, to learn the functions, we sequentially add a single sample to the available samples according to Eq. (11), which can shift the joint distribution of the coupling variables. Continuing this process allows us to add the minimum number of samples until $H_i < T_i$ for all disciplines. The proposed approach is summarized in Algorithm 2 and Fig. 1. It should be noted that the computational cost of the framework associated with constructing the surrogate models and performing the Gibbs sampling process is negligible in comparison to the large cost of acquiring data from the disciplinary models. Note also that the Gaussian process regression used in this paper bears some similarity to Gaussian mixture models [36]: a Gaussian process can be viewed as an infinite-dimensional Gaussian, whereas mixture models are probabilistic models that represent fixed sub-distributions within an overall distribution. The continuity of the arguments of the disciplinary functions, together with the scarcity of data, makes GP regression the better option in this paper for efficiently capturing correlation and enabling prediction [37].

Fig. 1 Schematic diagram of the proposed method

Algorithm 2: Uncertainty propagation in coupled multidisciplinary systems with unknown disciplines

1: Set the number of Gibbs samples $N$ and the user-defined values $\alpha_i$.
2: Construct Gaussian processes $f_i^{GP}$ for the disciplines based on the available data $(X_{N_i}, y_{N_i})$, for $i = 1, \ldots, n$.
3: $T_i = \alpha_i \ln\big(\sigma_{n,i}\sqrt{2\pi e}\big)$, for $i = 1, \ldots, n$.
4: Set $flag = 1$.
5: while $flag \neq 0$ do
6:   $(\mathcal{X}_1^{GS}, \ldots, \mathcal{X}_n^{GS}) \leftarrow$ Gibbs_Sampling$(f_1^{GP}, \ldots, f_n^{GP})$
7:   $flag = 0$.
8:   for $i = 1, \ldots, n$ do
9:     $H_i = \frac{1}{N}\sum_{x \in \mathcal{X}_i^{GS}} H\big(f_i^{GP}(x)\big)$.
10:    if $H_i > T_i$ then
11:      $x_i^* = \arg\min_{x \in \mathcal{X}_i^{GS}} \frac{1}{N}\sum_{x' \in \mathcal{X}_i^{GS}} H\big(f_i^{GP+x}(x')\big)$.
12:      Run discipline $i$ to obtain $f_i(x_i^*)$.
13:      Add $(x_i^*, f_i(x_i^*))$ to the GP of discipline $i$.
14:      $flag = flag + 1$.
15:    end if
16:   end for
17: end while
18: Output: $(\mathcal{X}_1^{GS}, \ldots, \mathcal{X}_n^{GS})$
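The outer loop of Algorithm 2 can then be sketched as follows. This is a control-flow illustration only: it assumes the gibbs_sampling, entropy_over_gibbs, and select_next_sample sketches from earlier in this section, together with hypothetical surrogate objects that expose the training inputs (.X), a posterior sampler (.sample), and an update method (.add_sample).

```python
import numpy as np

def adaptive_uncertainty_propagation(disciplines, gps, sigma_n, alpha, x0, n_iters=50):
    """Sketch of Algorithm 2: adaptive Gibbs sampling with GP surrogates.

    disciplines : list of n expensive black-box functions f_i
    gps         : list of n surrogate objects (hypothetical interface) with
                  attributes/methods .X, .sample(x, rng), .add_sample(x, y)
    """
    rng = np.random.default_rng()
    n = len(disciplines)
    T = [a * np.log(s * np.sqrt(2.0 * np.pi * np.e)) for a, s in zip(alpha, sigma_n)]  # Eq. (17)
    while True:
        # Step 6: run Algorithm 1 on the current surrogates.
        X_gs = gibbs_sampling([gp.sample for gp in gps], x0, n_iters, rng)   # shape (n, N)
        updated = 0
        for i in range(n):
            inputs_i = np.delete(X_gs, i, axis=0).T           # Gibbs samples of discipline i's inputs
            if entropy_over_gibbs(gps[i].X, inputs_i, sigma_n[i]) > T[i]:    # Eq. (14) vs. Eq. (17)
                x_star = select_next_sample(gps[i].X, inputs_i, sigma_n[i])  # Eq. (11)
                gps[i].add_sample(x_star, disciplines[i](x_star))            # one expensive evaluation
                updated += 1
        if updated == 0:                                      # all H_i below their thresholds
            return X_gs
```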

4 Numerical Experiments

4.1 Coupled Aerodynamics-Structures System.

In this part, the performance of the proposed method is investigated using a model of a two-dimensional airfoil in airflow [35]. Figure 2 illustrates this system, in which the airfoil is supported by two linear springs attached to a ramp and is permitted to pitch and plunge. A block diagram of this aerodynamics-structures system is shown in Fig. 3. A complete description of the problem can be found in Ref. [35].

Fig. 2 Coupled aerodynamics-structures system adapted from Ref. [35]

Fig. 3 Block diagram of the coupled aerodynamics-structures system adapted from Ref. [35]
The lift L and the elastic pitch angle ϕ are the coupling variables:
$L = f_1(\phi, v_1) = qBC\left(2\pi(\phi + \psi) + r\left[1 - \cos\!\left(\dfrac{\pi}{2}\,\dfrac{\phi + \psi}{\theta}\right)\right]\right) + v_1$
(18)
and
$\phi = f_2(L, v_2) = \left(\dfrac{L}{k_1(1+p)} - \dfrac{Lp}{k_2(1+p)}\right)\dfrac{1}{C(z_2 - z_1)} + v_2$
(19)
where $v_1 \sim \mathcal{N}(0, 250)$, $v_2 \sim \mathcal{N}(0, 10^{-6})$, $q = 1\ \mathrm{N/cm^2}$, $B$ (span) $= 100$ cm, $C$ (chord) $= 10$ cm, $\psi = 0.05$ rad, $r = 0.9425$, $\theta = 0.26$ rad, $p = 0.1111$, $k_1 = 4000$ N/cm, $k_2 = 2000$ N/cm, $z_1 = 0.2$, and $z_2 = 0.7$.
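For reference, the two discipline functions of Eqs. (18) and (19), with the parameter values listed above and the stated noise variances, can be written as the following sketch (variable names are ours):

```python
import numpy as np

q, B, C = 1.0, 100.0, 10.0           # dynamic pressure [N/cm^2], span B [cm], chord C [cm]
psi, r, theta = 0.05, 0.9425, 0.26   # psi [rad], r, theta [rad] as given in the text
p, k1, k2 = 0.1111, 4000.0, 2000.0   # p, spring stiffnesses k1, k2 [N/cm]
z1, z2 = 0.2, 0.7                    # z1, z2 (dimensionless spring locations)

rng = np.random.default_rng()

def f1(phi):
    """Aerodynamics discipline, Eq. (18): lift L as a function of the elastic pitch angle."""
    v1 = rng.normal(0.0, np.sqrt(250.0))          # v1 ~ N(0, 250), variance 250
    return q * B * C * (2.0 * np.pi * (phi + psi)
                        + r * (1.0 - np.cos(0.5 * np.pi * (phi + psi) / theta))) + v1

def f2(L):
    """Structures discipline, Eq. (19): elastic pitch angle phi as a function of the lift."""
    v2 = rng.normal(0.0, np.sqrt(1e-6))           # v2 ~ N(0, 1e-6), variance 1e-6
    return (L / (k1 * (1.0 + p)) - L * p / (k2 * (1.0 + p))) / (C * (z2 - z1)) + v2
```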
Our objective is to correctly estimate the joint distribution of L and ϕ using the proposed adaptive Gibbs sampling approach based on Gaussian process regression. The proposed method assumes that the expressions in Eqs. (18) and (19) are unknown, and they are only observable through the data. Thus, we compare the results of the proposed approach with the results of the Gibbs sampling process when the disciplinary functions are fully known. The joint distribution of the coupling variables obtained with the Gibbs sampling process with known disciplinary functions is estimated as
$\begin{pmatrix} L \\ \phi \end{pmatrix} \sim \mathcal{N}\!\left[\begin{pmatrix} 501.9288 \\ 0.0176 \end{pmatrix}, \begin{pmatrix} 352.3372 & 0.0122 \\ 0.0122 & 1.4314\times 10^{-6} \end{pmatrix}\right]$
We assume that no initial samples from the disciplines are available, i.e., $N_i = 0$, and employ zero-mean Gaussian processes with parameters $\sigma_{n,1}^2 = 250$ and $\sigma_{n,2}^2 = 10^{-6}$ as initial surrogate models for the first and second disciplines, respectively. The sample size in the Gibbs sampling process is set to $N = 1000$, and the parameters $\alpha_1$ and $\alpha_2$ are set to 4 and 0.5, respectively.

In the first part of the numerical experiment, the joint distributions of the coupling variables obtained by the proposed method and by Gibbs sampling with known disciplines are compared. Figure 4 displays the samples obtained by the Gibbs sampling process with known discipline functions in red and the samples obtained by the proposed approach in blue, at iterations 5, 15, and 33 of the proposed method. In the last plot, where the stopping criterion is met at iteration 33, the distribution of the Gibbs samples obtained by the proposed approach is close to the distribution obtained by the Gibbs sampling method with known disciplines. In this case, 26 and 33 samples are added to the surrogate models of disciplines 1 and 2, respectively. The difference between the numbers of added samples is due to the separate stopping criterion used for each discipline: the first discipline requires 26 samples and the second discipline requires 33 samples to be added to their Gaussian processes in order to achieve an accurate joint distribution of the coupling variables.

Fig. 4 The samples of coupling variables obtained by the Gibbs sampling process with known discipline functions (red) and samples obtained by the proposed approach (blue) in iterations 5, 15, and 33, applied to the coupled aerodynamics-structures system

Figure 5 displays the Gaussian process approximations of the two disciplines obtained by the proposed approach, along with the true functions and their model discrepancy. The uncertainties shown by the shaded gray and blue areas are associated with the GP and the model uncertainty, respectively. As can be seen, samples are added to the disciplines in regions close to the joint distribution of the coupling variables shown in Fig. 4, and there the variance of the Gaussian processes approaches the minimum possible value, which is equal to the model discrepancy; in other regions the functions are not learned and the variances of the Gaussian processes remain large. This demonstrates the advantage of the proposed approach, which adaptively adds samples in the important regions of the discipline functions without exploring and investing in unimportant regions.

Fig. 5 Gaussian process approximations of the two disciplines of the coupled aerodynamics-structures system obtained by the proposed approach: (a) Discipline 1: Aerodynamics and (b) Discipline 2: Structures

Figure 6 shows the Kullback–Leibler divergence between the joint distributions obtained by the proposed approach and by the Gibbs sampling process with the true functions, as a function of the iteration number. As can be seen, the Kullback–Leibler divergence converges to zero or, equivalently, the estimated joint distribution of the coupling variables steadily approaches the true joint distribution as the number of iterations increases. Throughout the paper, the term iteration refers to one round of improvement of the surrogate models, in which a single sample is added to each surrogate model that still exceeds its entropy threshold.
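The divergence plotted in Figs. 6 and 10 can be obtained by fitting a Gaussian to each set of Gibbs samples and applying the closed form of Eq. (9); a minimal sketch, with function names of our own choosing:

```python
import numpy as np

def kl_gaussians(mu_p, Sigma_p, mu_q, Sigma_q):
    """D_KL(P || Q) between two k-variate Gaussians, Eq. (9)."""
    k = len(mu_p)
    Sigma_q_inv = np.linalg.inv(Sigma_q)
    diff = np.asarray(mu_q) - np.asarray(mu_p)
    return 0.5 * (np.trace(Sigma_q_inv @ Sigma_p) + diff @ Sigma_q_inv @ diff - k
                  + np.log(np.linalg.det(Sigma_q) / np.linalg.det(Sigma_p)))

def kl_between_sample_sets(samples_p, samples_q):
    """Fit a Gaussian to each (N, k) sample set and evaluate Eq. (9)."""
    return kl_gaussians(samples_p.mean(axis=0), np.cov(samples_p, rowvar=False),
                        samples_q.mean(axis=0), np.cov(samples_q, rowvar=False))
```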

Fig. 6 Kullback–Leibler divergence between the joint distributions obtained by the proposed approach and by Gibbs sampling with true discipline functions of the coupled aerodynamics-structures system as a function of iteration number

4.2 Coupled Analysis of a Numerical Example.

In this part, we demonstrate the proposed framework on the numerical example referred to as the Mathematical MDA problem in Ref. [38], which is an extension of the problem discussed in Ref. [39]. The functional relationships are shown in Fig. 7. In this example, there are two coupling variables in each coupling direction, resulting in four coupling variables x11, x12, x21, and x22. The uncertain variables v = [v1, v2, v3, v4, v5] are assumed to be independent and normally distributed as N(1, 0.1). Our proposed adaptive Gibbs sampling approach, which is based on constructing Gaussian process regressions for the disciplinary models, aims to correctly estimate the joint distribution of the coupling variables. Again, the proposed method assumes that the expressions associated with the disciplinary models in Fig. 7 are unknown and only observable through data. As in the previous example, we validate the results of the proposed approach against the results of the Gibbs sampling process when the disciplinary functions are fully known. The joint distribution of the coupling variables obtained with the Gibbs sampling process with known disciplinary functions is estimated as follows:
$\begin{pmatrix} x_{11} \\ x_{12} \\ x_{21} \\ x_{22} \end{pmatrix} \sim \mathcal{N}\!\left[\begin{pmatrix} 8.0523 \\ 2.6520 \\ 11.3181 \\ 3.4213 \end{pmatrix}, \begin{pmatrix} 1.6857 & 0.7108 & 2.2444 & 0.8698 \\ 0.7108 & 0.7244 & 0.9929 & 0.6006 \\ 2.2444 & 0.9929 & 3.9496 & 1.5962 \\ 0.8698 & 0.6006 & 1.5962 & 0.9608 \end{pmatrix}\right]$
We assume that no initial samples from the disciplines are available, i.e., Ni = 0, and employ zero-mean Gaussian processes as initial surrogate models for the disciplines. The sample size in the Gibbs sampling process is set to be N = 10,000, and the thresholds Ti are set to 0.1 for all the disciplines.
Fig. 7 Mathematical example of two-discipline coupled analysis [38]

By adding 38 and 59 samples to the Gaussian processes of the two disciplinary models in Analysis 1 of Fig. 7, and 43 and 67 samples to the two disciplinary models in Analysis 2, the proposed framework propagates the uncertainty associated with the uncertain variables and estimates the stationary joint distribution of the four coupling variables.

Figures 8 and 9 show the samples obtained by the Gibbs sampling process with known disciplinary functions in red and the samples obtained by the proposed method in blue, at iteration 30 and in the converged state, respectively. These samples represent the stationary joint distribution of the coupling variables, and as can be seen from the final iteration in Fig. 9, the distributions of the Gibbs samples obtained by the proposed approach are close to the distributions obtained by the Gibbs sampling method with known disciplines. The convergence in terms of Kullback–Leibler divergence per iteration is shown in Fig. 10.

Fig. 8 The samples of coupling variables of the numerical example obtained by the Gibbs sampling process with known discipline functions (red) and samples obtained by the proposed approach (blue) after 30 iterations

Fig. 9 The samples of coupling variables of the numerical example obtained by the Gibbs sampling process with known discipline functions (red) and samples obtained by the proposed approach (blue) in the converged state
Fig. 10 Kullback–Leibler divergence between the joint distributions obtained by the proposed approach and by Gibbs sampling with true discipline functions of the coupled numerical example as a function of iteration number

Finally, to demonstrate the advantage of the proposed approach, which adaptively adds samples in the important regions of the disciplinary functions without exploring and investing in unimportant regions, we compare the entropy in different regions of the space of the coupling variables. Figure 11 shows the entropy of the disciplinary Gaussian processes over samples generated randomly in the entire space of the coupling variables, compared with the entropy over samples generated from the joint distribution of the coupling variables, in each iteration of the proposed approach. As can be seen, the entropy over both sets of samples decreases in each iteration; however, the entropy over samples from the true joint distribution is lower than that over samples from the entire space. This reflects the contribution of the proposed method: samples are added to the surrogate models of the discipline functions in regions close to the joint distribution of the coupling variables, where the variance of the Gaussian processes approaches the minimum possible value, while in other regions of the space of the coupling variables the functions are not learned and the variances remain large.

Fig. 11 The entropy of the surrogate models constructed for each coupling variable in each iteration over random samples from the joint distribution of the coupling variables (blue) and random samples in the entire space of the coupling variables (red)

5 Conclusion

This paper introduces an efficient data acquisition method for learning the disciplinary functions of multidisciplinary systems for accurate uncertainty propagation. Given that the disciplinary functions are unknown and accessible only through computationally or economically expensive data, the proposed framework constructs Gaussian processes over the disciplinary functions. By taking advantage of the Bayesian representation of the disciplinary functions, the proposed framework focuses only on important regions of the input space that provide a maximum reduction of the uncertainty in the Gaussian process regressions. This is achieved by deriving a closed-form policy for the efficient sequential addition of samples to the disciplines' surrogate models. The proposed framework is applied to a coupled aero-structural system and a coupled numerical example, and it is shown that by adding the minimum number of samples to the Gaussian processes, the proposed method achieves an accurate estimate of the joint distribution of the coupling variables. The accuracy is validated against the distributions obtained by the Gibbs sampling process when all the disciplinary functions are known.

Acknowledgment

The authors acknowledge the support of the National Science Foundation through NSF award IIS-1946999.

Conflict of Interest

There are no conflicts of interest.

References

1. Culler, A. J., and McNamara, J. J., 2010, "Studies on Fluid-Thermal-Structural Coupling for Aerothermoelasticity in Hypersonic Flow," AIAA J., 48(8), pp. 1721–1738. 10.2514/1.J050193
2. Hearn, T. A., Hendricks, E., Chin, J., and Gray, J. S., 2016, "Optimization of Turbine Engine Cycle Analysis With Analytic Derivatives," 17th AIAA/ISSMO Multidisciplinary Analysis and Optimization Conference, p. 4297.
3. Zaman, K., and Mahadevan, S., 2013, "Robustness-Based Design Optimization of Multidisciplinary System Under Epistemic Uncertainty," AIAA J., 51(5), pp. 1021–1031. 10.2514/1.J051372
4. Dunning, P. D., Kim, H. A., and Mullineux, G., 2011, "Introducing Loading Uncertainty in Topology Optimization," AIAA J., 49(4), pp. 760–768. 10.2514/1.J050670
5. Kennedy, M. C., and O'Hagan, A., 2001, "Bayesian Calibration of Computer Models," J. R. Stat. Soc.: Ser. B (Stat. Methodology), 63(3), pp. 425–464. 10.1111/1467-9868.00294
6. Du, X., and Chen, W., 2005, "Collaborative Reliability Analysis Under the Framework of Multidisciplinary Systems Design," Optim. Eng., 6(1), pp. 63–84. 10.1023/B:OPTE.0000048537.35387.fa
7. Sankararaman, S., and Mahadevan, S., 2012, "Likelihood-Based Approach to Multidisciplinary Analysis Under Uncertainty," ASME J. Mech. Des., 134(3), p. 031008. 10.1115/1.4005619
8. Imani, M., and Ghoreishi, S. F., 2020, "Optimal Finite-Horizon Perturbation Policy for Inference of Gene Regulatory Networks," IEEE Intell. Syst. 10.1109/mis.2020.3017155
9. Mahadevan, S., and Smith, N., 2006, "Efficient First-Order Reliability Analysis of Multidisciplinary Systems," Int. J. Reliab. Safety, 1(1–2), pp. 137–154. 10.1504/IJRS.2006.010694
10. Arnst, M., Ghanem, R., Phipps, E., and Red-Horse, J., 2012, "Dimension Reduction in Stochastic Modeling of Coupled Problems," Int. J. Num. Methods Eng., 92(11), pp. 940–968. 10.1002/nme.4364
11. Arnst, M., Ghanem, R., Phipps, E., and Red-Horse, J., 2014, "Reduced Chaos Expansions With Random Coefficients in Reduced-Dimensional Stochastic Modeling of Coupled Problems," Int. J. Num. Methods Eng., 97(5), pp. 352–376. 10.1002/nme.4595
12. Constantine, P. G., Phipps, E. T., and Wildey, T. M., 2014, "Efficient Uncertainty Propagation for Network Multiphysics Systems," Int. J. Num. Methods Eng., 99(3), pp. 183–202. 10.1002/nme.4667
13. Amaral, S., Allaire, D., and Willcox, K., 2012, "A Decomposition Approach to Uncertainty Analysis of Multidisciplinary Systems," 12th AIAA Aviation Technology, Integration, and Operations (ATIO) Conference and 14th AIAA/ISSMO Multidisciplinary Analysis and Optimization Conference, p. 5563.
14. Amaral, S., Allaire, D., and Willcox, K., 2014, "A Decomposition-Based Approach to Uncertainty Analysis of Feed-Forward Multicomponent Systems," Int. J. Num. Methods Eng., 100(13), pp. 982–1005. 10.1002/nme.4779
15. Ghoreishi, S. F., 2016, "Uncertainty Analysis for Coupled Multidisciplinary Systems Using Sequential Importance Resampling," PhD thesis, Texas A&M University.
16. Ghoreishi, S. F., and Allaire, D. L., 2016, "Compositional Uncertainty Analysis Via Importance Weighted Gibbs Sampling for Coupled Multidisciplinary Systems," 18th AIAA Non-Deterministic Approaches Conference, p. 1443.
17. Ghoreishi, S., and Allaire, D., 2017, "Adaptive Uncertainty Propagation for Coupled Multidisciplinary Systems," AIAA J., 55(11), pp. 3940–3950. 10.2514/1.j055893
18. Santner, T. J., Williams, B. J., and Notz, W., 2003, The Design and Analysis of Computer Experiments, Vol. 1, Springer.
19. Yao, W., Chen, X., Luo, W., van Tooren, M., and Guo, J., 2011, "Review of Uncertainty-Based Multidisciplinary Design Optimization Methods for Aerospace Vehicles," Progress Aeros. Sci., 47(6), pp. 450–479. 10.1016/j.paerosci.2011.05.001
20. Gu, X. S., Renaud, J., and Penninger, C., 2006, "Implicit Uncertainty Propagation for Robust Collaborative Optimization," ASME J. Mech. Des., 128(4), pp. 1001–1013. 10.1115/1.2205869
21. Chiralaksanakul, A., and Mahadevan, S., 2007, "Decoupled Approach to Multidisciplinary Design Optimization Under Uncertainty," Optim. Eng., 8(1), pp. 21–42. 10.1007/s11081-007-9014-2
22. Ghoreishi, S. F., and Imani, M., 2020, "Bayesian Optimization for Efficient Design of Uncertain Coupled Multidisciplinary Systems," Proceedings of the 2020 American Control Conference (ACC 2020), IEEE.
23. McDonald, M., Zaman, K., and Mahadevan, S., 2008, "Uncertainty Quantification and Propagation for Multidisciplinary System Analysis," 12th AIAA/ISSMO Multidisciplinary Analysis and Optimization Conference, Paper 2008-6038.
24. Kokkolaras, M., Mourelatos, Z. P., and Papalambros, P. Y., 2006, "Design Optimization of Hierarchically Decomposed Multilevel Systems Under Uncertainty," ASME J. Mech. Des., 128(2), pp. 503–508. 10.1115/1.2168470
25. Chaudhuri, A., and Willcox, K. E., 2016, "Multifidelity Uncertainty Propagation in Coupled Multidisciplinary Systems," 18th AIAA Non-Deterministic Approaches Conference, p. 1442.
26. Chaudhuri, A., Lam, R., and Willcox, K., 2018, "Multifidelity Uncertainty Propagation Via Adaptive Surrogates in Coupled Multidisciplinary Systems," AIAA J., 56(1), pp. 235–249. 10.2514/1.J055678
27. Rasmussen, C. E., and Williams, C., 2006, Gaussian Processes for Machine Learning, The MIT Press.
28. Imani, M., Ghoreishi, S., and Braga-Neto, U., 2018, "Bayesian Control of Large MDPs with Unknown Dynamics in Data-Poor Environments," NeurIPS, 31, pp. 8157–8167.
29. Geman, S., and Geman, D., 1984, "Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images," IEEE Trans. Pattern Anal. Machine Intell., 6, pp. 721–741. 10.1109/TPAMI.1984.4767596
30. Schervish, M. J., and Carlin, B. P., 1992, "On the Convergence of Successive Substitution Sampling," J. Comput. Graphical Stat., 1(2), pp. 111–127.
31. Ghoreishi, S. F., and Allaire, D. L., 2018, "Gaussian Process Regression for Bayesian Fusion of Multi-Fidelity Information Sources," 2018 Multidisciplinary Analysis and Optimization Conference, p. 4176.
32. Imani, M., Ghoreishi, S. F., Allaire, D., and Braga-Neto, U., 2019, "MFBO-SSM: Multi-Fidelity Bayesian Optimization for Fast Inference in State-Space Models," Proceedings of the AAAI Conference on Artificial Intelligence, Vol. 33, pp. 7858–7865.
33. Kullback, S., and Leibler, R. A., 1951, "On Information and Sufficiency," The Ann. Math. Stat., 22(1), pp. 79–86. 10.1214/aoms/1177729694
34. Imani, M., and Ghoreishi, S. F., 2020, "Bayesian Optimization Objective-Based Experimental Design," American Control Conference (ACC), IEEE.
35. Sobieszczanski-Sobieski, J., 1990, "Sensitivity of Complex, Internally Coupled Systems," AIAA J., 28(1), pp. 153–160. 10.2514/3.10366
36. Grandvalet, Y., and Bengio, Y., 2006, Entropy Regularization.
37. Grandvalet, Y., and Bengio, Y., 2005, "Semi-Supervised Learning by Entropy Minimization," Advances in Neural Information Processing Systems, pp. 529–536.
38. Liang, C., and Mahadevan, S., 2016, "Stochastic Multidisciplinary Analysis With High-Dimensional Coupling," AIAA J., 54(2), pp. 1209–1219. 10.2514/1.J054343
39. Du, X., and Chen, W., 2002, "Efficient Uncertainty Analysis Methods for Multidisciplinary Robust Design," AIAA J., 40(3), pp. 545–552. 10.2514/2.1681