
Hard Prompts Made Easy: Gradient-Based Discrete Optimization for Prompt Tuning and Discovery
Yuxin Wen · Neel Jain · John Kirchenbauer · Micah Goldblum · Jonas Geiping · Tom Goldstein

Wed Dec 13 03:00 PM -- 05:00 PM (PST) @ Great Hall & Hall B1+B2 #606
Event URL: https://github.com/YuxinWenRick/hard-prompts-made-easy

The strength of modern generative models lies in their ability to be controlled through prompts. Hard prompts comprise interpretable words and tokens and are typically hand-crafted by humans. Soft prompts, on the other hand, consist of continuous feature vectors. These can be discovered using powerful optimization methods, but they cannot be easily edited, reused across models, or plugged into a text-based interface. We describe an easy-to-use approach to automatically optimize hard text prompts through efficient gradient-based optimization. Our approach can be readily applied to text-to-image and text-only applications alike. This method allows API users to easily generate, discover, and mix and match image concepts without prior knowledge of how to prompt the model. Furthermore, using our method, we can bypass token-level content filters imposed by Midjourney by optimizing through the open-source text encoder.
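The core idea behind the gradient-based discrete optimization described above can be sketched as follows: maintain continuous ("soft") prompt variables, project them onto their nearest vocabulary embeddings before the forward pass, and apply the resulting gradient back to the soft variables. This is a simplified, NumPy-only illustration with a toy quadratic loss and hand-written gradient; the function names, signatures, and toy setup are illustrative assumptions, not the authors' actual code (see the linked repository for that).

```python
import numpy as np

def nearest_ids(soft, table):
    # For each soft prompt vector, index of the nearest vocabulary embedding (L2).
    d = ((soft[:, None, :] - table[None, :, :]) ** 2).sum(axis=-1)
    return d.argmin(axis=1)

def optimize_hard_prompt(table, grad_fn, num_tokens, steps=300, lr=0.05,
                         init_ids=None, seed=0):
    # table: (vocab_size, dim) vocabulary embedding matrix.
    # grad_fn: returns the loss gradient evaluated at given prompt embeddings.
    if init_ids is None:
        init_ids = np.random.default_rng(seed).integers(len(table), size=num_tokens)
    soft = table[np.asarray(init_ids)].astype(float).copy()
    for _ in range(steps):
        hard = table[nearest_ids(soft, table)]  # project onto discrete tokens
        g = grad_fn(hard)                       # gradient at the projected point...
        soft -= lr * g                          # ...applied to the soft variables
    return nearest_ids(soft, table)             # final discrete (hard) prompt

# Toy usage: recover the token whose embedding is closest to a target vector.
table = np.array([[0.0], [3.0], [6.0], [9.0]])   # 4-token, 1-D vocabulary
target = np.array([[6.0]])
ids = optimize_hard_prompt(table, lambda h: 2.0 * (h - target),
                           num_tokens=1, init_ids=[0], steps=50, lr=0.1)
# ids now names the vocabulary entry nearest the target
```

The key design choice is that the forward gradient is computed at the *projected* (discrete) embeddings rather than at the soft variables themselves, so the optimized prompt remains a valid sequence of real tokens that can be pasted into any text interface.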

Author Information

Yuxin Wen (University of Maryland)
Neel Jain (University of Maryland, College Park)
John Kirchenbauer (University of Maryland, College Park)
Micah Goldblum (New York University)
Jonas Geiping (ELLIS Institute & MPI Intelligent Systems, Tübingen AI Center)

Jonas is a postdoctoral researcher at UMD. His background is in mathematics, specifically in mathematical optimization and its applications to deep learning. His current focus is on designing more secure and private ML systems, especially for federated learning, and on understanding the fundamental phenomena behind generalization.

Tom Goldstein (University of Maryland)
