Optimization (45 hours)
Course Description
The course covers the essentials of linear and nonlinear optimization from both the theoretical and the numerical points of view. First, classical pattern search, evolutionary-based, and gradient methods are presented, and the underlying theory is addressed. Then, advanced algorithms in nonlinear programming and multiobjective optimization are presented and analyzed. Particular attention is paid to the Python implementation of the most representative algorithms.
- Prerequisites: Basic knowledge of Linear Algebra, Matrix Computation, Hilbert Spaces, and Differential Calculus
Tentative Schedule
- Introduction to the major problem families and classes of solution methods in optimization
- Gradient-free methods
  - Introduction – Motivation
  - Pattern search methods
  - Evolutionary-based methods; particle swarms
  - Examples and applications
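As a taste of the gradient-free block, the pattern search idea can be sketched as a minimal compass search: probe along each coordinate direction and shrink the step when no probe improves. The quadratic objective, starting point, and parameter choices below are illustrative assumptions, not course material.

```python
# Minimal compass (pattern) search sketch: probe +/- step along each
# coordinate; keep an improving probe, otherwise contract the step.
def compass_search(f, x, step=1.0, tol=1e-6, max_iter=10_000):
    """Derivative-free minimization of f starting from x."""
    x = list(x)
    fx = f(x)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for s in (step, -step):
                trial = x.copy()
                trial[i] += s
                ft = f(trial)
                if ft < fx:            # accept an improving probe
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= 0.5                # contract the pattern
            if step < tol:
                break
    return x, fx

# Illustrative objective with minimizer (1, -2).
f = lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2
x_opt, f_opt = compass_search(f, [0.0, 0.0])
```

No gradient is ever evaluated, which is the defining feature of this method family.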
- Continuous Optimization
  - Convexity and differentials
  - Optimality conditions – Existence of solutions
  - Gradient-type methods – Analysis of their convergence properties
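The gradient-type methods covered in this block can be sketched as a fixed-step gradient descent on a smooth convex quadratic, with the step chosen as 1/L where L is the largest eigenvalue of the Hessian. The matrix, right-hand side, and tolerances below are illustrative assumptions.

```python
import numpy as np

def gradient_descent(grad, x0, step, tol=1e-10, max_iter=10_000):
    """Fixed-step gradient method: x_{k+1} = x_k - step * grad(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # stop when the gradient vanishes
            break
        x = x - step * g
    return x

# Quadratic f(x) = 0.5 x^T A x - b^T x, whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
L = np.linalg.eigvalsh(A).max()       # Lipschitz constant of grad f
x_star = gradient_descent(lambda x: A @ x - b, [0.0, 0.0], step=1.0 / L)
```

With a step in (0, 2/L) the iteration converges for this class of problems; the analysis of such conditions is exactly the topic of this part of the course.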
- Constrained Optimization
  - Penalization methods
  - Projected gradient methods
  - Lagrangian and duality theory
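The projected gradient idea from this block admits a very short sketch: take a gradient step, then project back onto the feasible set (here a box, where projection is a clip). The objective, box, and step size are illustrative assumptions.

```python
import numpy as np

def projected_gradient(grad, project, x0, step, n_iter=5_000):
    """Projected gradient iteration: x <- P(x - step * grad(x))."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        x = project(x - step * grad(x))   # gradient step, then project
    return x

# Illustrative problem: min ||x - (2, -1)||^2 subject to x in [0, 1]^2.
# The unconstrained minimizer (2, -1) lies outside the box, so the
# constrained solution sits on the boundary, at (1, 0).
grad = lambda x: 2.0 * (x - np.array([2.0, -1.0]))
project = lambda x: np.clip(x, 0.0, 1.0)  # Euclidean projection onto box
x_star = projected_gradient(grad, project, [0.5, 0.5], step=0.1)
```

The method only requires that the projection onto the feasible set be cheap to compute, which is what makes boxes and balls the standard teaching examples.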
- Advanced Algorithms in Continuous Optimization
  - Conjugate gradient and nonlinear conjugate gradient methods
  - Newton and quasi-Newton methods; spectral gradient methods
  - Uzawa algorithms
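The linear conjugate gradient method, the building block behind the nonlinear variants listed above, can be sketched for a symmetric positive definite system A x = b. The test matrix and tolerance are illustrative assumptions.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10):
    """Solve A x = b for symmetric positive definite A by CG."""
    x = np.zeros_like(b)
    r = b - A @ x                 # residual = negative gradient of the quadratic
    p = r.copy()                  # first search direction
    rs = r @ r
    for _ in range(len(b)):       # exact convergence in at most n steps
        Ap = A @ p
        alpha = rs / (p @ Ap)     # exact line search along p
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p # conjugate direction update
        rs = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

Each new direction is A-conjugate to the previous ones, which is what gives finite termination in exact arithmetic.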
Course Materials
Course slides and reading material will be available from the instructor.
Reference Books
- Algorithms for Optimization by Mykel J. Kochenderfer and Tim A. Wheeler, MIT Press, 2019.
- Numerical Optimization by Jorge Nocedal and Stephen J. Wright, Springer, 2006.
- Optimization: Algorithms and Consistent Approximations by Elijah Polak, Springer, 1997.
Marks Distribution
- Midterm Exam: 25%
- Project: 25%
- Final Exam: 50%