 
Neural Reparameterization Improves Structural Optimization
Stephan Hoyer · Jascha Sohl-Dickstein · Sam Greydanus

Structural optimization is a popular method for designing objects such as bridge trusses, airplane wings, and optical devices. Unfortunately, the quality of solutions depends heavily on how the problem is parameterized. In this paper, we propose using the implicit bias over functions induced by neural networks to improve the parameterization of structural optimization. Rather than directly optimizing densities on a grid, we instead optimize the parameters of a neural network which outputs those densities. This reparameterization leads to different and often better solutions. On a selection of 116 structural optimization tasks, our approach produces an optimal design 50% more often than the best baseline method.
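The core idea in the abstract can be illustrated with a minimal sketch (not the paper's actual code, which couples a convolutional network to a differentiable physics solver). Here a tiny fully connected network maps a fixed latent input to a 1-D density field, and gradient descent updates the network's weights rather than the densities themselves; the volume-fraction objective and all sizes are hypothetical placeholders standing in for a real compliance objective:

```python
import numpy as np

# Hypothetical sketch: reparameterize densities through a small neural
# network and optimize the network's parameters, not the grid values.
rng = np.random.default_rng(0)
n_grid, n_hidden = 16, 8
W1 = 0.5 * rng.standard_normal((n_hidden, 1))   # parameters we optimize
W2 = 0.5 * rng.standard_normal((n_grid, n_hidden))
z = np.ones((1, 1))                             # fixed latent input

def forward(W1, W2):
    """Map network parameters to densities in [0, 1]."""
    h = np.tanh(W1 @ z)                         # hidden layer
    x = 1 / (1 + np.exp(-(W2 @ h)))             # sigmoid densities
    return h, x

def loss(x, target=0.4):
    """Placeholder objective: match a target volume fraction.
    A real structural task would minimize compliance from a solver."""
    return (x.mean() - target) ** 2

lr = 5.0
_, x0 = forward(W1, W2)
initial = loss(x0)
for _ in range(500):
    h, x = forward(W1, W2)
    # Backpropagate through mean -> sigmoid -> linear layers by hand.
    dL_dx = np.full_like(x, 2 * (x.mean() - 0.4) / x.size)
    delta2 = dL_dx * x * (1 - x)                # gradient at W2 pre-activation
    gW2 = delta2 @ h.T
    delta1 = (W2.T @ delta2) * (1 - h ** 2)     # gradient at W1 pre-activation
    gW1 = delta1 @ z.T
    W2 -= lr * gW2
    W1 -= lr * gW1

_, x = forward(W1, W2)
print(loss(x) < initial)  # descent on the parameters reduced the objective
```

The point of the sketch is only the indirection: the loss is evaluated on the network's output, but the update is applied to the network's weights, so the network architecture implicitly biases which density fields are reachable.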

Author Information

Stephan Hoyer (Google Research)
Jascha Sohl-Dickstein (Google Brain)
Sam Greydanus (Oregon State University)

I am a recent graduate of Dartmouth College, where I majored in physics and dabbled in everything else. I have interned at CERN, Microsoft Azure, and the DARPA Explainable AI Project. I like to use memory-based models to generate sequences and policies. So far, I have used them to approximate the Enigma cipher, generate realistic handwriting, and visualize how reinforcement-learning agents play Atari games. One of my priorities as a scientist is to explain my work clearly and make it easy to replicate.