Abstract
Topology optimization is a widely used method for finding optimal designs that satisfy constraints and maximize system performance. However, traditional iterative optimization methods such as SIMP are computationally expensive and prone to becoming trapped in local minima, limiting their effectiveness for complex or large-scale problems. To overcome these challenges, we propose a novel generative optimization method that integrates classic optimization (SIMP) as a refinement mechanism for topologies generated by a diffusion model. By introducing a computationally efficient approximation inspired by classic ODE solutions, we eliminate the need for conditioning on physical fields, cutting inference time in half. Our method generates feasible, high-performance topologies while explicitly guiding the sampling process toward regions of superior manufacturability and performance, without requiring external auxiliary models or additional labeled data. Notably, our approach overcomes drawbacks of learning-based methods on out-of-distribution constraint configurations, including floating material and high compliance error, as well as the need for extensive pre-processing and surrogate models. Experimental results show a substantial 20% improvement in performance. Overall, using deep generative models, such as diffusion models and generative adversarial networks, to warm-start optimization holds significant promise for advancing the design and optimization of structures in engineering applications, extending its applicability to a broader spectrum of performance-aware engineering design problems.