I am an assistant professor in the Johns Hopkins Department of Applied Mathematics and Statistics. I joined after completing my PhD in Operations Research at Cornell University, where I was advised by Jim Renegar and Damek Davis. I spent Spring 2020 at Google Research working on adversarial optimization and Fall 2017 at UC Berkeley as part of a Simons Institute program bridging continuous and discrete optimization.
My research focuses on the design and analysis of algorithms for continuous optimization problems beyond the settings where classical theory applies. The selected works below all address fundamental issues in modern optimization, bridging the gap between classical approaches and the potentially stochastic, nonconvex, nonsmooth, non-Lipschitz, or adversarial objectives arising in many modern data science problems. During my PhD, this research was supported by an NSF Graduate Research Fellowship.
Office: N419 Wyman Hall
Email: grimmer at jhu.edu
CV: here
Twitter: @prof_grimmer (mostly sharing pretty 3D prints)
Accelerated Gradient Descent via Long Steps | YouTube, arXiv |
Benjamin Grimmer, Kevin Shu, Alex L. Wang.
Some Primal-Dual Theory for Subgradient Methods for Strongly Convex Optimization | arXiv |
Benjamin Grimmer, Danlin Li.
Radial Duality Part I: Foundations and Part II: Applications and Algorithms | Mathematical Programming, 2023 |
Benjamin Grimmer. | arXiv: Part I, Part II |
The Landscape of the Proximal Point Method for Nonconvex-Nonconcave Minimax Optimization | Mathematical Programming, 2022 |
Benjamin Grimmer, Haihao Lu, Pratik Worah, Vahab Mirrokni. | arXiv |
Ning Liu (PhD Candidate) | Johns Hopkins, AMS |
Thabo Samakhoana (PhD Candidate) | Johns Hopkins, AMS |
Alan Luner (PhD Student) | Johns Hopkins, AMS |
Eleanor Belkin (PhD Student) | Johns Hopkins, AMS |
Shengyi Yan (Masters) | Johns Hopkins, AMS |
Zhichao Jia (Masters) | Johns Hopkins, AMS |
Danlin Li (Masters) | Johns Hopkins, AMS |
Raj Gosain (Undergraduate) | Johns Hopkins, AMS |