Nonlinear Optimization I 553.761
TTh 4:30pm - 6:30pm, WHITEHEAD HALL 304



GENERAL INFORMATION

Instructor : Amitabh Basu
Office Hours : Wednesday 5:00--6:30pm in my office, Whitehead 202B, or email for an appointment.
Email : basu [dot] amitabh [at] jhu [dot] edu

Teaching Assistants : Yashil Sukurdeep and Lingyao Meng.

Yashil's email is yashil.sukurdeep [at] jhu [dot] edu.
Lingyao's email is lmeng2 [at] jhu [dot] edu.
Yashil's office hours will be on Thursdays from 2:45pm--4:15pm.
Lingyao's office hours will be on Mondays from 6:30pm--8:30pm.

Notes/Texts : I will use lecture slides prepared by Daniel P. Robinson for the course. They will be periodically posted here.

Introduction. Slides without "Notes".
Background and basics. Slides without "Notes". MATLAB demo file.
Optimality conditions. Slides without "Notes".
Newton's method. Slides without "Notes". MATLAB demo file.
Line Search Methods. Slides without "Notes".
Convexity. Slides without "Notes".
Conjugate Gradient.
Trust Region Methods. Slides without "Notes".
Least Squares. Slides without "Notes".
Smooth Convex Optimization.
Nonsmooth Convex Optimization: Sections 4.1 and 4.4 from Notes on Convexity.
Stochastic Gradient Descent. Thanks to Hongda Qiu for figuring out the right constant Q in the strongly convex analysis! (A small SGD sketch appears after this list.)
Coordinate Minimization. Slides without "Notes".
Second Order Methods. Slides without "Notes".
Linear Programming.
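
To give a small taste of the stochastic gradient descent topic above, here is a minimal MATLAB sketch (in the spirit of the MATLAB demo files) that runs SGD on a synthetic linear least squares problem. The data, the capped 1/k step size, and the iteration count are illustrative assumptions chosen for this sketch, not the setup or analysis from the lecture notes.

% A minimal SGD sketch on min_x (1/(2n))*norm(A*x - b)^2.
% The data, step size rule, and iteration count below are
% illustrative choices only.
rng(0);
n = 1000; d = 10;
A = randn(n, d);
xtrue = randn(d, 1);
b = A*xtrue + 0.01*randn(n, 1);

x = zeros(d, 1);
for k = 1:5000
    i = randi(n);                         % sample one data point uniformly
    g = A(i,:)' * (A(i,:)*x - b(i));      % stochastic gradient of the i-th term
    t = min(0.1, 1/k);                    % capped diminishing step size
    x = x - t*g;
end
fprintf('relative error: %.3e\n', norm(x - xtrue)/norm(xtrue));

The point in the large-scale setting is the cost per iteration: using one random data point per step costs O(d) work, versus O(nd) for a full gradient evaluation.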

Other useful textbooks and resources

Optimization:
Basic Numerical Analysis:
MATLAB Tutorial from Daniel P. Robinson.

Syllabus : The syllabus with a list of topics to be covered is available HERE.
This course considers algorithms for solving various important nonlinear optimization problems and, in parallel, develops the supporting theory. The primary focus will be on unconstrained optimization. Topics will include: necessary and sufficient optimality conditions; gradient, Newton, and quasi-Newton based line-search, trust-region, and cubic regularization methods; linear and nonlinear least squares problems; and linear and nonlinear conjugate gradient methods. Special attention will be paid to the large-scale case, including topics such as limited-memory quasi-Newton methods and stochastic gradient descent.
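
To give a concrete flavor of these methods, here is a minimal MATLAB sketch of Newton's method with a backtracking (Armijo) line search. The Rosenbrock test function, the starting point, the tolerance, and the Armijo constant are illustrative choices only, and the sketch assumes the Hessian stays positive definite at the iterates; a safeguard (e.g., a modification of the Hessian) would be needed otherwise.

% A minimal sketch: Newton's method with backtracking line search
% on the Rosenbrock function. All constants are illustrative.
f    = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
grad = @(x) [-400*x(1)*(x(2) - x(1)^2) - 2*(1 - x(1)); 200*(x(2) - x(1)^2)];
hess = @(x) [1200*x(1)^2 - 400*x(2) + 2, -400*x(1); -400*x(1), 200];

x = [-1.2; 1];                                 % classical starting point
for k = 1:100
    g = grad(x);
    if norm(g) < 1e-8, break; end              % approximate stationarity reached
    p = -hess(x) \ g;                          % Newton direction (assumes Hessian is PD)
    t = 1;                                     % always try the full Newton step first
    while f(x + t*p) > f(x) + 1e-4*t*(g'*p)    % Armijo sufficient decrease condition
        t = t/2;
    end
    x = x + t*p;
end
fprintf('x = (%.6f, %.6f) after %d iterations\n', x(1), x(2), k);

Trying the unit step t = 1 first matters: it preserves the fast local convergence of Newton's method once the iterates are close to the minimizer.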

EXAM AND GRADING INFORMATION

There will be one in-class Midterm and one in-class Final exam. In addition, I will post homework assignments here approximately every two weeks. You will be asked to hand in some of the homework problems, which will be graded. Seriously attempting ALL of the homework problems is imperative for your success in the class, and they will give an indication of the kind of problems that will appear on the tests.

Homeworks, Midterm, and Final Grades