Nonlinear Optimization I 553.761
TTh 4:30pm - 5:45pm, BLOOMBERG HALL 274



GENERAL INFORMATION

Instructor : Amitabh Basu
Office Hours : Wednesday 5:00--7:00pm in my office, Whitehead 202A, or email for an appointment.
Email : basu [dot] amitabh [at] jhu [dot] edu

Teaching Assistants : Tian Yang, Noah Wichrowski, and Elizabeth Reiland will be the TAs for our class.

Tian's email is tyan4 [at] jhu [dot] edu.
Noah's email is nwichro1 [at] jhu [dot] edu.
Elizabeth's email is ereiland [at] jhu [dot] edu.

Tian's office hours will be on Fridays from 9--10am.
Noah's office hours will be on Tuesdays from 3--4pm.

Notes/Texts : I will use lecture slides prepared by Daniel P. Robinson for the course. They will be periodically posted here.

Introduction. Slides without "Notes"
Background and basics. Slides without "Notes". MATLAB demo file
Convexity. Slides without "Notes"
Newton's method. Slides without "Notes"
Optimality conditions. Slides without "Notes"
Line Search Methods. Slides without "Notes". Conjugate Gradient Notes
Trust Region Methods. Slides without "Notes".
Least Squares. Slides without "Notes".
Nonlinear Equations. Slides without "Notes".
Coordinate Minimization. Slides without "Notes".
Second Order Methods. Slides without "Notes".

Other useful textbooks and resources

Optimization:
Basic Numerical Analysis:
Syllabus : The syllabus with list of topics to be covered is available HERE.
This course considers algorithms for solving various important nonlinear optimization problems and, in parallel, develops the supporting theory. Primary focus will be on unconstrained and bound-constrained optimization. Topics will include: necessary and sufficient optimality conditions; gradient, Newton, and quasi-Newton based line-search, trust-region, and cubic regularization methods; linear and nonlinear least squares problems; and linear and nonlinear conjugate gradient methods. Special attention will be paid to the large-scale case and will include topics such as limited-memory quasi-Newton methods, projected gradient methods, and subspace accelerated two-phase methods for bound-constrained optimization.
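As a small illustration of the line-search methods mentioned above, the sketch below (not taken from the course materials; the function, starting point, and parameter values are chosen for demonstration only) implements steepest descent with an Armijo backtracking line search on a simple quadratic:

```python
# Illustrative sketch: gradient descent with an Armijo backtracking
# line search, one flavor of the line-search methods covered in the course.
import numpy as np

def backtracking_gradient_descent(f, grad, x0, alpha0=1.0, c=1e-4, rho=0.5,
                                  tol=1e-8, max_iter=1000):
    """Minimize f from x0 via steepest descent with backtracking."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # first-order optimality check
            break
        alpha = alpha0
        # Armijo sufficient-decrease condition along the direction -g
        while f(x - alpha * g) > f(x) - c * alpha * g.dot(g):
            alpha *= rho              # shrink the trial step
        x = x - alpha * g
    return x

# Example: minimize f(x) = (x1 - 1)^2 + 2*(x2 + 3)^2, minimizer (1, -3)
f = lambda x: (x[0] - 1)**2 + 2 * (x[1] + 3)**2
grad = lambda x: np.array([2 * (x[0] - 1), 4 * (x[1] + 3)])
x_star = backtracking_gradient_descent(f, grad, np.array([5.0, 5.0]))
```

The Armijo condition accepts a step only if it produces a decrease proportional to the step length times the squared gradient norm, which guarantees progress without requiring an exact minimization along the search direction.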

EXAM AND GRADING INFORMATION

There will be one in-class Midterm and one in-class Final exam. In addition, I will regularly (approximately every two weeks) post homework assignments here. You will be asked to hand in some of the homework problems, which will be graded. Seriously attempting ALL the homework problems is imperative for your success in the class; they will also give an indication of the kinds of problems that will appear on the tests.

Homeworks, Midterm, and Final Grades