Understanding and optimizing complex design problems involves analyzing mathematical models that simulate real-world systems. When such simulations require an enormous amount of time to evaluate a single design point, simpler and faster approximation models are built to enable exploration of the design space. Approximation models are also referred to as surrogate models or meta-models; Kriging, response surface models, and neural networks such as radial basis function networks are examples of such approximation methods. The quality of the approximations obtained with these methods depends on: (1) the type of approximation technique, (2) the distribution of sample points, i.e., the sampling strategy, (3) the number of sample points available for approximation, and (4) the topography of the function being approximated. The quality and performance of an approximation method are measured by its prediction error and the time taken to fit it, respectively. In this work, we present a study of the quality and performance of polynomial regression (general and orthogonal polynomials), radial basis functions (RBF), elliptical basis functions (EBF), and Kriging (ordinary and blind) for functions with different topographies. Specifically, the objective of this benchmark is to compare the effects of the following on the accuracy of the approximation techniques: 1. the sampling strategy, 2. the number of sample points, 3. the topography of the function. Results indicate that Kriging is best suited to cases with few, uniformly spaced sample points (typically from optimal Latin hypercube sampling). Elliptical and radial basis function neural networks are fast, robust, and accurate enough to be used with any sampling strategy. For factorial sampling, Chebyshev polynomials achieve higher accuracy than simple polynomial regression.
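
As a concrete illustration of the surrogate-modeling workflow described above, the sketch below fits a Gaussian RBF interpolant to a handful of samples of a cheap 1-D test function, then measures the prediction error at an unsampled point. The test function, uniformly spaced sample design, and kernel shape parameter are illustrative assumptions, not the actual setup of this benchmark.

```python
import math

def f(x):
    # Illustrative stand-in for an "expensive" simulation response.
    return math.sin(3 * x) + 0.5 * x

def gaussian(r, eps=2.0):
    # Gaussian radial basis function; eps is an assumed shape parameter.
    return math.exp(-(eps * r) ** 2)

def solve(A, b):
    # Naive Gaussian elimination with partial pivoting (keeps the
    # sketch dependency-free; real code would use a linear-algebra library).
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# Uniformly spaced sample points on [0, 2] -- a simple stand-in for a
# space-filling design such as an optimal Latin hypercube.
xs = [2.0 * i / 9 for i in range(10)]
ys = [f(x) for x in xs]

# Interpolation weights from Phi w = y, where Phi_ij = phi(|x_i - x_j|).
Phi = [[gaussian(abs(xi - xj)) for xj in xs] for xi in xs]
w = solve(Phi, ys)

def surrogate(x):
    # RBF prediction: weighted sum of kernels centered at the samples.
    return sum(wi * gaussian(abs(x - xi)) for wi, xi in zip(w, xs))

# Prediction error at an unsampled design point.
err = abs(surrogate(1.05) - f(1.05))
print("prediction error at x=1.05: %.5f" % err)
```

The same quality assessment the paper describes, prediction error on points not in the sample, falls out directly: the surrogate reproduces the training samples exactly (up to conditioning of the kernel matrix) and is evaluated on held-out locations.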
