Lecture Notes on Large-scale Unconstrained and Constrained Optimization
- Introduction and Basics of Unconstrained Optimization (basics, characterization of solutions, overview of algorithms)
- Line Search Methods (step length conditions and algorithms, global convergence conditions, and rate of convergence)
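As a companion to the step-length conditions covered in these notes, here is a minimal sketch of backtracking line search enforcing the Armijo (sufficient decrease) condition; the function names and parameter defaults are illustrative, not taken from the notes.

```python
import numpy as np

def backtracking_line_search(f, grad_f, x, p, alpha0=1.0, rho=0.5, c=1e-4):
    """Shrink alpha until the Armijo condition holds:
    f(x + alpha p) <= f(x) + c alpha grad_f(x)^T p."""
    alpha = alpha0
    fx = f(x)
    slope = grad_f(x) @ p          # directional derivative; negative for a descent direction
    while f(x + alpha * p) > fx + c * alpha * slope:
        alpha *= rho               # backtrack: reduce the step geometrically
    return alpha

# Example: quadratic f(x) = x^T x with the steepest-descent direction
f = lambda x: x @ x
grad = lambda x: 2.0 * x
x = np.array([1.0, -2.0])
p = -grad(x)
a = backtracking_line_search(f, grad, x, p)
```

With `rho = 0.5`, the first trial step `alpha0 = 1` overshoots on this quadratic and one halving suffices.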
- Trust Region Methods (basic trust region ideas, Cauchy point and minimum progress, dogleg and two-dimensional subspace minimization, global convergence, local convergence)
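The Cauchy point named above is the minimizer of the quadratic model along the steepest-descent direction, truncated at the trust-region boundary; the sketch below follows the standard formula, with all names illustrative.

```python
import numpy as np

def cauchy_point(g, B, delta):
    """Cauchy point for the model m(p) = g^T p + 0.5 p^T B p
    within the trust region ||p|| <= delta."""
    gnorm = np.linalg.norm(g)
    ps = -(delta / gnorm) * g           # step to the boundary along -g
    gBg = g @ B @ g
    if gBg <= 0:
        tau = 1.0                       # model decreases all the way to the boundary
    else:
        tau = min(gnorm**3 / (delta * gBg), 1.0)
    return tau * ps

# Example: B = I, so the unconstrained minimizer along -g is p = -g
g = np.array([2.0, 0.0])
B = np.eye(2)
pc = cauchy_point(g, B, 10.0)   # radius large enough: interior minimizer
```

Shrinking the radius below the interior step length (e.g. `delta = 1.0` here) clips the step to the boundary instead.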
- Linear and Nonlinear Conjugate Gradients (linear CG, rate of convergence, nonlinear CG, the Fletcher-Reeves and Polak-Ribière variants, global convergence of FR CG and FR-PR CG)
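For reference alongside the linear CG material, a minimal sketch of the standard conjugate gradient recurrences for a symmetric positive definite system; names and defaults are illustrative.

```python
import numpy as np

def linear_cg(A, b, x0=None, tol=1e-10, max_iter=None):
    """Linear conjugate gradients for A x = b, A symmetric positive definite."""
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    r = b - A @ x                  # residual r = b - A x
    p = r.copy()                   # initial search direction
    rs = r @ r
    for _ in range(max_iter or n):
        Ap = A @ p
        alpha = rs / (p @ Ap)      # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        beta = rs_new / rs         # conjugacy-preserving update of the direction
        p = r + beta * p
        rs = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = linear_cg(A, b)
```

In exact arithmetic CG terminates in at most n steps, which motivates the default `max_iter` of n.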
- Quasi-Newton Methods, Nonlinear Least Squares, and Background of Constrained Optimization
- Interior Point Methods for Linear and Nonlinear Programming
Lecture Notes on Nonlinear Systems of Equations
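The workhorse method for such systems is Newton's method; a minimal sketch under the assumption of an analytic Jacobian, with illustrative names throughout.

```python
import numpy as np

def newton_system(F, J, x0, tol=1e-12, max_iter=50):
    """Newton's method for F(x) = 0: solve J(x) s = -F(x), set x <- x + s."""
    x = x0.astype(float).copy()
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        s = np.linalg.solve(J(x), -Fx)   # Newton step from the linearized system
        x += s
    return x

# Example: intersect the circle x^2 + y^2 = 4 with the line y = x
F = lambda x: np.array([x[0]**2 + x[1]**2 - 4.0, x[1] - x[0]])
J = lambda x: np.array([[2.0 * x[0], 2.0 * x[1]], [-1.0, 1.0]])
x = newton_system(F, J, np.array([1.0, 1.0]))
```

Starting near the root, the iterates converge quadratically to (sqrt(2), sqrt(2)).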
Lecture Notes on Large, Sparse Eigenvalue Problems
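Large-sparse eigensolvers access the matrix only through matrix-vector products; the simplest method with that access pattern is power iteration, sketched below as a baseline (names and the small test matrix are illustrative).

```python
import numpy as np

def power_iteration(A, num_iters=200, seed=0):
    """Estimate the dominant eigenpair of A by repeated
    multiplication and normalization (needs only A @ v)."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(A.shape[0])
    v /= np.linalg.norm(v)
    for _ in range(num_iters):
        w = A @ v
        v = w / np.linalg.norm(w)      # renormalize to avoid overflow/underflow
    lam = v @ (A @ v)                  # Rayleigh quotient estimate
    return lam, v

# Example: diagonal matrix with dominant eigenvalue 3
A = np.diag([3.0, 1.0, 0.5])
lam, v = power_iteration(A)
```

Convergence is geometric with ratio |lambda_2 / lambda_1|, which is why subspace methods such as Lanczos are preferred for clustered spectra.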
Useful material on Numerical Linear Algebra, based on David Watkins, Fundamentals of Matrix Computations (2nd ed.), Wiley:
Large, sparse eigenvalue problems and sensitivity/accuracy of eigenvalue problems (from a Summer School organized by the Materials Computation Center):
- Derivation of Methods
- Sensitivity and Accuracy