An algorithm is presented for the efficient constrained or unconstrained minimization of computationally expensive objective functions. The method proceeds by creating and numerically optimizing a sequence of surrogate functions chosen to approximate the behavior of the unknown objective function in parameter space. The Recursive Surrogate Optimization (RSO) technique is intended for design applications in which the computational cost of evaluating the objective function greatly exceeds both the cost of evaluating any domain constraints present and the cost of one iteration of a typical optimization routine. Efficient optimization is achieved by reducing the number of objective function evaluations, at the expense of additional complexity and computational cost in the optimization procedure itself. Comparisons of RSO performance on eight widely used test problems with published performance data for other efficient techniques demonstrate the utility of the method.
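The surrogate-based loop described above can be illustrated with a minimal sketch. This is not the authors' RSO algorithm (whose surrogate form and recursion are detailed in the paper itself); it is a generic one-dimensional example of the same idea: fit a cheap surrogate (here, a quadratic through the three best samples) to the expensive objective, minimize the surrogate analytically, evaluate the true objective at that minimizer, and repeat. The test function and all names below are hypothetical.

```python
import math

def expensive_objective(x):
    # Stand-in for a costly simulation; this test function is illustrative only.
    return (x - 1.3) ** 2 + 0.1 * math.sin(5.0 * x)

def surrogate_minimize(f, lo, hi, n_evals=12):
    """Minimize f on [lo, hi] by repeatedly fitting and minimizing a
    quadratic surrogate through the three best samples seen so far."""
    xs = [lo, 0.5 * (lo + hi), hi]
    samples = [(x, f(x)) for x in xs]
    for _ in range(n_evals - len(samples)):
        # Surrogate data: the three lowest-valued samples, sorted by x.
        pts = sorted(sorted(samples, key=lambda p: p[1])[:3])
        (x1, y1), (x2, y2), (x3, y3) = pts
        # Newton divided differences give the quadratic's coefficients.
        d1 = (y2 - y1) / (x2 - x1)
        d2 = ((y3 - y2) / (x3 - x2) - d1) / (x3 - x1)
        if d2 <= 0.0:
            # Surrogate has no interior minimum; fall back to the
            # midpoint of the current bracket.
            x_new = 0.5 * (x1 + x3)
        else:
            # Minimizer of y1 + d1*(x - x1) + d2*(x - x1)*(x - x2).
            x_new = 0.5 * (x1 + x2) - d1 / (2.0 * d2)
        x_new = min(max(x_new, lo), hi)  # respect the domain bounds
        if min(abs(x_new - x) for x, _ in samples) < 1e-9:
            break  # proposal duplicates an existing sample; converged
        samples.append((x_new, f(x_new)))
    return min(samples, key=lambda p: p[1])
```

The expensive objective is called only `n_evals` times at most; all other work (fitting and minimizing the surrogate) is cheap, which is the trade-off the abstract describes.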
