

Poster in Workshop: Machine Learning and the Physical Sciences

Training physical networks like neural networks: deep physical neural networks

Logan Wright · Tatsuhiro Onodera · Martin M Stein · Tianyu Wang · Darren Schachter · Zoey Hu · Peter McMahon


Abstract:

Deep neural networks (DNNs) are increasingly used to predict physical processes. Here, we invert this relationship and show that physical processes with adjustable physical parameters (e.g., geometry, voltages) can be trained to emulate DNNs, i.e., to perform machine-learning inference tasks. We call these trainable processes physical neural networks (PNNs). We train experimental PNNs based on broadband optical pulses propagating in a nonlinear crystal, a nonlinear electronic oscillator, and an oscillating metal plate. As an extension of these laboratory proofs of concept, we train (in simulation) a network of coupled oscillators to perform Fashion-MNIST classification. Since automatic differentiation cannot be applied directly to physical processes, we introduce a technique that uses a simulation model to efficiently estimate the gradients of the physical system, allowing us to use backpropagation to train PNNs. Using this technique, we train each system's physical transformations (which do not necessarily resemble typical DNN layers) directly to perform inference calculations. Our work may help inspire novel neural network architectures, including ones that can be realized efficiently with particular physical processes, and presents a route to training complex physical systems to take on desired physical functionalities, such as computational sensing. This article is intended as a summary of the previously published work [Wright, Onodera et al., 2022] for the NeurIPS 2022 Machine Learning and the Physical Sciences workshop.
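The gradient-estimation technique described above (called physics-aware training in the published work) amounts to a hybrid forward/backward pass: the forward pass is executed on the physical system, while the backward pass differentiates a digital simulation of that system. Below is a minimal PyTorch sketch of this idea; `physical_system` and `digital_model` are hypothetical stand-ins for illustration, not the authors' experimental apparatus or simulation code.

```python
import torch

def physical_system(x, theta):
    # Hypothetical placeholder: in practice this would query the experiment
    # (e.g., pulse propagation in a nonlinear crystal). Noise stands in for
    # experimental imperfections.
    y = torch.tanh(x @ theta)
    return y + 0.01 * torch.randn_like(y)

def digital_model(x, theta):
    # Hypothetical placeholder: an imperfect but differentiable simulation
    # of the same physical transformation.
    return torch.tanh(x @ theta)

class PhysicsAwareLayer(torch.autograd.Function):
    """Forward pass runs on the physical system; backward pass uses
    gradients of the digital model, as sketched in the abstract."""

    @staticmethod
    def forward(ctx, x, theta):
        ctx.save_for_backward(x, theta)
        return physical_system(x, theta)

    @staticmethod
    def backward(ctx, grad_output):
        x, theta = ctx.saved_tensors
        x = x.detach().requires_grad_(True)
        theta = theta.detach().requires_grad_(True)
        with torch.enable_grad():
            y_sim = digital_model(x, theta)
        # Backpropagate the incoming gradient through the simulation only.
        grad_x, grad_theta = torch.autograd.grad(
            y_sim, (x, theta), grad_outputs=grad_output)
        return grad_x, grad_theta

# Usage: train the physical parameters theta by ordinary backpropagation.
theta = torch.randn(8, 4, requires_grad=True)
opt = torch.optim.Adam([theta], lr=1e-2)
x, target = torch.randn(32, 8), torch.randn(32, 4)
for _ in range(100):
    opt.zero_grad()
    loss = torch.nn.functional.mse_loss(
        PhysicsAwareLayer.apply(x, theta), target)
    loss.backward()
    opt.step()
```

A notable property of this arrangement is that the loss is always evaluated on the true physical outputs, so error in the digital model perturbs only the gradient estimates, not the training objective itself.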
