
Size-Noise Tradeoffs in Generative Networks
Bolton Bailey · Matus Telgarsky

Tue Dec 04 07:45 AM -- 09:45 AM (PST) @ Room 517 AB #141
This paper investigates the ability of generative networks to convert their input noise distributions into other distributions. First, we demonstrate a construction that allows ReLU networks to increase the dimensionality of their noise distribution by implementing a "space-filling" function based on iterated tent maps. We show this construction is optimal by analyzing the number of affine pieces in functions computed by multivariate ReLU networks. Second, we provide efficient ways (using polylog$(1/\epsilon)$ nodes) for networks to pass between univariate uniform and normal distributions, using a Taylor series approximation and a binary search gadget for computing function inverses. Finally, we indicate how high-dimensional distributions can be efficiently transformed into low-dimensional distributions.
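As a small illustrative sketch (not the paper's exact construction), the tent map $T(x) = 2\min(x, 1-x)$ on $[0,1]$ can be written with two ReLU units as $T(x) = 2\,\mathrm{relu}(x) - 4\,\mathrm{relu}(x - 1/2)$, and composing it $k$ times gives a depth-$k$ ReLU network whose output is piecewise linear with $2^k$ affine pieces; this is the kind of exponential piece count that underlies the space-filling construction:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def tent_relu(x):
    # Tent map on [0,1] expressed with two ReLU units:
    # T(x) = 2*relu(x) - 4*relu(x - 0.5) = 2*min(x, 1-x)
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5)

def iterated_tent(x, k):
    # k-fold composition T^k, i.e. a depth-k ReLU network.
    # T^k has 2^k affine pieces on [0,1], so depth buys
    # affine pieces exponentially fast.
    for _ in range(k):
        x = tent_relu(x)
    return x

# Spot checks: T(1/4) = 1/2, and T^4(1/16) = 1 since each
# application doubles the input until it crosses 1/2.
print(tent_relu(0.25))        # 0.5
print(iterated_tent(1/16, 4)) # 1.0
```

The iteration here is purely for illustration; in the paper the analogous map is realized as layers of a ReLU network, and the $2^k$ affine-piece count is what makes the construction tight against the lower bound.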

Author Information

Bolton Bailey (University of Illinois Urbana-Champaign)
Matus Telgarsky (UIUC)
