

Poster
in
Workshop: Machine Learning for Engineering Modeling, Simulation and Design

A Nonlocal-Gradient Descent Method for Inverse Design in Nanophotonics

Sirui Bi · Jiaxin Zhang · Guannan Zhang


Abstract:

Local-gradient-based optimization approaches lack the nonlocal exploration ability required to escape local minima when searching non-convex landscapes. A directional Gaussian smoothing (DGS) approach was recently proposed in \cite{2020arXiv200203001Z} and used to define a truly nonlocal gradient, referred to as the DGS gradient, in order to enable nonlocal exploration in high-dimensional black-box optimization. Promising results show that replacing the traditional local gradient with the nonlocal DGS gradient can significantly improve the performance of gradient-based methods when optimizing highly multi-modal loss functions. However, the current DGS method is designed for unbounded and unconstrained optimization problems, making it inapplicable to real-world engineering optimization problems, where the tuning parameters are often bounded and the loss function is usually constrained by physical processes. In this work, we propose to extend the DGS approach to the constrained inverse design framework in order to find better optima of multi-modal loss functions. A series of adaptive strategies for updating the smoothing radius and learning rate are developed to improve computational efficiency and robustness. Our methodology is demonstrated on the design of a nanoscale wavelength demultiplexer, and shows superior performance compared to state-of-the-art approaches. By incorporating volume constraints, the optimized design achieves equivalently high performance while significantly reducing material usage.
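To make the core idea concrete, the sketch below illustrates how a DGS-style gradient can be estimated: each component of the gradient is the derivative of the loss smoothed by a one-dimensional Gaussian along that coordinate direction, evaluated with Gauss-Hermite quadrature. This is a minimal illustration of the smoothing idea behind the DGS gradient, not the authors' implementation; the function name, quadrature degree, and smoothing radius are illustrative assumptions.

```python
import numpy as np

def dgs_gradient(f, x, sigma, deg=7):
    """Estimate a nonlocal DGS-style gradient of f at x (illustrative sketch).

    Each component i is the derivative at x of the loss smoothed by a
    1D Gaussian of standard deviation `sigma` along coordinate direction i,
    computed with `deg`-point Gauss-Hermite quadrature.
    """
    # Nodes/weights for the weight function exp(-t^2) on (-inf, inf).
    nodes, weights = np.polynomial.hermite.hermgauss(deg)
    d = len(x)
    grad = np.zeros(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = 1.0
        # Sample the loss along direction e_i at the quadrature nodes;
        # the change of variables v = sqrt(2) * t maps exp(-t^2) to N(0, 1).
        vals = np.array([f(x + np.sqrt(2.0) * sigma * t * e) for t in nodes])
        # d/dy E_{v~N(0,1)}[ f(x + (y + sigma*v) e_i) ] at y = 0.
        grad[i] = np.sqrt(2.0) / (np.sqrt(np.pi) * sigma) * np.sum(weights * nodes * vals)
    return grad
```

A larger `sigma` averages the loss over a wider neighborhood, letting the estimated gradient "see past" small local minima, which is why adapting `sigma` during optimization (as the work proposes) matters: wide early for exploration, narrow late for convergence. For a quadratic loss the smoothed gradient coincides with the exact gradient, since Gaussian smoothing of a quadratic only adds a constant.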
