Course Contents for Optimisation 2005


Prerequisites: Courses in calculus (one and several variables) and linear algebra (including eigenvalues, eigenvectors and quadratic forms).

Applications: Control theory, signal analysis, statistics, economics, and various industrial problems.

Schedule: The course gives 4 credits and is offered during the first half of the autumn (lp 1). It comprises 28 lectures, 14 problem-solving sessions, and two computer laboratory exercises of 2 hours each, in which MATLAB is used to demonstrate algorithms. There is also an individual programming project that usually takes a couple of days to complete. Written (occasionally oral) examination.

Character: The aim of the course is to present the mathematical ideas and derivations behind the basic general optimization methods, since short "recipes" cannot be given. There is thus more emphasis on derivations, different interpretations and proofs than in the compulsory courses in mathematics at LTH, even though the major part of the exercises and the examination is problem solving.

Literature: L-C Böiers, Lectures on Optimization (Lund 2005). These notes cover the course. Complementary reading: Bazaraa, Sherali and Shetty, Nonlinear Programming: Theory and Algorithms, 2nd edition, Wiley, 1993.

Lecturer: Andrey Ghulchak, tel. 046/222 8546

Problems in mathematics and its applications very often come down to minimizing or maximizing some function of several variables, possibly subject to constraints. A common situation is the determination of parameters in a physical model so as to obtain the best agreement with a set of measured data. Another is finding an optimal way to transmit information from one point to another in a network.
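The data-fitting problem just mentioned is commonly posed as a least-squares problem. In generic notation (the symbols below are illustrative, not taken from the course notes), with measurements y_1, ..., y_m taken at points t_1, ..., t_m and a model f(t; θ) depending on a parameter vector θ, one solves

    \min_{\theta} \sum_{i=1}^{m} \bigl( y_i - f(t_i; \theta) \bigr)^2,

an unconstrained minimization problem of exactly the kind treated in the course.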

Differential calculus can often be used to formulate conditions for optimality. This appears already in introductory courses in calculus, notably in the theory of Lagrange multipliers. In the present course it is generalized to more complicated situations by the Kuhn-Tucker theory, sketched below. A fundamental concept here is convexity.

Differential calculus is also used, together with a good deal of linear algebra, in the construction of numerical methods for optimization. The algorithms are often iterative. The course covers the basic methods for unconstrained optimization: Steepest Descent, Newton's Method, Quasi-Newton Methods and the Conjugate Gradient Method; a sketch of a typical iteration is given below. In the presence of constraints the task becomes much harder, especially when the constraints are non-linear, and some general methods and ideas will be presented.

In linear programming both the function to be optimized and the constraints are linear. Such problems frequently arise in practice, often involving thousands of variables, and the availability of fast algorithms is of great economic importance. The most important method here is the simplex method; a small example is given below.
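To indicate how the Lagrange multiplier conditions generalize, consider a problem of the form minimize f(x) subject to g_i(x) ≤ 0, i = 1, ..., m. At a suitably regular local minimum x* the Kuhn-Tucker conditions hold; they are stated here in standard textbook form, with the regularity (constraint qualification) assumptions omitted:

    \nabla f(x^*) + \sum_{i=1}^{m} \lambda_i \nabla g_i(x^*) = 0,
    \qquad \lambda_i \ge 0, \qquad \lambda_i \, g_i(x^*) = 0, \qquad i = 1, \dots, m,

together with feasibility, g_i(x^*) ≤ 0. When f and all g_i are convex, these conditions are also sufficient for a global minimum, which is one reason convexity plays such a central role.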
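As a small taste of the unconstrained methods, here is a minimal MATLAB sketch of Steepest Descent with a backtracking line search. The objective, starting point, iteration cap and tolerance are arbitrary choices for illustration, not code from the course material:

    % Steepest Descent with backtracking (Armijo) line search.
    f     = @(x) (x(1) - 1)^2 + 10*(x(2) + 2)^2;   % illustrative quadratic objective
    gradf = @(x) [2*(x(1) - 1); 20*(x(2) + 2)];    % its gradient
    x = [0; 0];                                    % starting point
    for k = 1:100
        d = -gradf(x);                             % steepest descent direction
        t = 1;
        while f(x + t*d) > f(x) - 0.5*t*(d'*d)     % note: gradf(x)'*d = -(d'*d)
            t = t/2;                               % halve the step until sufficient decrease
        end
        x = x + t*d;
        if norm(gradf(x)) < 1e-6, break, end       % stop when the gradient is small
    end

Newton's method would instead take d = -H\gradf(x), with H the Hessian; the Quasi-Newton and Conjugate Gradient methods treated in the course obtain fast convergence without forming H explicitly.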
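A linear program is easy to state and to experiment with. The sketch below assumes the MATLAB Optimization Toolbox routine linprog is available (in the course the simplex method itself is derived, rather than used as a black box); the numbers form a classic textbook example whose optimum is x = (2, 6):

    % Maximize 3*x1 + 5*x2, i.e. minimize -3*x1 - 5*x2, subject to
    % x1 <= 4, 2*x2 <= 12, 3*x1 + 2*x2 <= 18, and x1, x2 >= 0.
    f  = [-3; -5];
    A  = [1 0; 0 2; 3 2];
    b  = [4; 12; 18];
    lb = [0; 0];
    x  = linprog(f, A, b, [], [], lb)   % returns the optimal vertex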

Andrey Ghulchak