
CSLS Workshop on Optimization of Eigenvalues

Module by: Pascal Vontobel.

Workshop Overview

A wealth of interesting problems in engineering, control, finance, and statistics can be formulated as optimization problems involving the eigenvalues of a matrix function. These very challenging problems cannot usually be solved via traditional techniques for nonlinear optimization. However, they have been addressed in recent years by a combination of deep, elegant mathematical analysis and ingenious algorithmic and software development. In this workshop, three leading experts will discuss applications along with the theoretical and algorithmic aspects of this fascinating topic.

Remark: This workshop was held on October 7, 2004 as part of the Computational Sciences Lecture Series (CSLS) at the University of Wisconsin-Madison.

Semidefinite Programming

By Prof. Stephen Boyd (Stanford University, USA)

Slides of talk [PDF] (Not yet available.) | Video [WMV] (Not yet available.)

ABSTRACT: In semidefinite programming (SDP) a linear function is minimized subject to the constraint that the eigenvalues of a symmetric matrix are nonnegative. While such problems were studied in a few papers in the 1970s, the relatively recent development of efficient interior-point algorithms for SDP has spurred research in a wide variety of application fields, including control system analysis and synthesis, combinatorial optimization, circuit design, structural optimization, finance, and statistics. In this overview talk I will cover the basic properties of SDP, survey some applications, and give a brief description of interior-point methods for their solution.
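The constraint at the heart of SDP described above can be illustrated numerically. The following is a minimal sketch (not from the talk), using numpy to check membership in the positive semidefinite cone: a symmetric matrix satisfies the SDP constraint exactly when its eigenvalues are nonnegative. The helper name `is_psd` and the example matrices are illustrative choices, not part of the original.

```python
import numpy as np

def is_psd(A, tol=1e-10):
    """Check the basic SDP cone constraint A >= 0 for a symmetric matrix A.

    eigvalsh is the symmetric/Hermitian eigenvalue routine; the constraint
    holds exactly when every eigenvalue is nonnegative (up to tolerance).
    """
    return bool(np.all(np.linalg.eigvalsh(A) >= -tol))

A = np.array([[2.0, -1.0], [-1.0, 2.0]])  # eigenvalues 1 and 3 -> feasible
B = np.array([[1.0,  2.0], [ 2.0, 1.0]])  # eigenvalues -1 and 3 -> infeasible
print(is_psd(A), is_psd(B))  # True False
```

A full SDP minimizes a linear objective over an affine slice of this cone; interior-point solvers handle that part, but the feasibility test above is the geometric core.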

Eigenvalue Optimization: Symmetric versus Nonsymmetric Matrices

By Prof. Adrian Lewis (Cornell University, USA)

Slides of talk [PDF] (Not yet available.) | Video [WMV] (Not yet available.)

ABSTRACT: The eigenvalues of a symmetric matrix are Lipschitz functions with elegant convexity properties, amenable to efficient interior-point optimization algorithms. By contrast, for example, the spectral radius of a nonsymmetric matrix is neither a convex function, nor Lipschitz. It may indicate practical behaviour much less reliably than in the symmetric case, and is more challenging for numerical optimization (see Overton's talk). Nonetheless, this function does share several significant variational-analytic properties with its symmetric counterpart. I will outline these analogies, discuss the fundamental idea of Clarke regularity, highlight its usefulness in nonsmooth chain rules, and discuss robust regularizations of functions like the spectral radius. (Including joint work with James Burke and Michael Overton.)
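The contrast the abstract draws can be seen in a few lines of numpy. This is an illustrative sketch (the matrices below are standard textbook examples, not taken from the talk): two nilpotent matrices each have spectral radius zero, yet their midpoint does not, violating convexity; and a small perturbation of a Jordan block moves the spectral radius like a square root, so the function is not Lipschitz.

```python
import numpy as np

def spectral_radius(A):
    """Maximum modulus of the eigenvalues of a (possibly nonsymmetric) matrix."""
    return float(np.max(np.abs(np.linalg.eigvals(A))))

# Two nilpotent matrices: each has spectral radius 0.
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0, 0.0], [1.0, 0.0]])

# Convexity would require rho((A+B)/2) <= (rho(A) + rho(B))/2 = 0,
# but the midpoint has eigenvalues +/- 0.5.
mid = spectral_radius((A + B) / 2)
print(spectral_radius(A), spectral_radius(B), mid)  # 0.0 0.0 0.5

# Non-Lipschitz behaviour: rho([[0, 1], [t, 0]]) = sqrt(t), so a
# perturbation of size t = 1e-8 moves the spectral radius by 1e-4.
t = 1e-8
C = np.array([[0.0, 1.0], [t, 0.0]])
print(spectral_radius(C))
```

For a symmetric matrix, by contrast, the largest eigenvalue is a convex function of the matrix and changes by at most the norm of the perturbation, which is what makes the symmetric case tractable for interior-point methods.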

Local Optimization of Stability Functions in Theory and Practice

By Prof. Michael Overton (Courant Institute of Mathematical Sciences, New York University, USA)


Slides of talk [PDF] (Not yet available.) | Video [WMV] (Not yet available.)

ABSTRACT: Stability measures arising in systems and control are typically nonsmooth, nonconvex functions. The simplest examples are the abscissa and radius maps for polynomials (maximum real part, or modulus, of the roots) and the analogous matrix measures, the spectral abscissa and radius (maximum real part, or modulus, of the eigenvalues). More robust measures include the distance to instability (smallest perturbation that makes a polynomial or matrix unstable) and the $\epsilon$ pseudospectral abscissa or radius of a matrix (maximum real part or modulus of the $\epsilon$-pseudospectrum). When polynomials or matrices depend on parameters it is natural to consider optimization of such functions. We discuss an algorithm for locally optimizing such nonsmooth, nonconvex functions over parameter space and illustrate its effectiveness, computing, for example, locally optimal low-order controllers for challenging problems from the literature. We also give an overview of variational analysis of stability functions in polynomial and matrix space, expanding on some of the issues discussed in Lewis's talk. (Joint work with James V. Burke and Adrian S. Lewis.)
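The stability functions named in the abstract are easy to compute pointwise, which is a useful sanity check even though optimizing them is the hard part. The following numpy sketch is illustrative only: the function names are invented here, and the sampled pseudospectral estimate is a crude random-perturbation stand-in, not the algorithm the speakers use. The example matrix is a standard highly nonnormal case: it is stable, yet tiny perturbations make it far less so.

```python
import numpy as np

def spectral_abscissa(A):
    """Maximum real part of the eigenvalues of a matrix."""
    return float(np.max(np.linalg.eigvals(A).real))

def poly_abscissa(coeffs):
    """Maximum real part of the roots of a polynomial (coefficients, highest first)."""
    return float(np.max(np.roots(coeffs).real))

def pseudo_abscissa_estimate(A, eps, samples=200, seed=0):
    """Crude lower bound on the eps-pseudospectral abscissa:
    sample random perturbations of 2-norm eps and take the worst abscissa."""
    rng = np.random.default_rng(seed)
    best = spectral_abscissa(A)
    for _ in range(samples):
        E = rng.standard_normal(A.shape)
        E *= eps / np.linalg.norm(E, 2)
        best = max(best, spectral_abscissa(A + E))
    return best

print(poly_abscissa([1.0, 3.0, 2.0]))     # roots -1, -2 -> abscissa -1.0

A = np.array([[-1.0, 10.0], [0.0, -1.0]])  # stable, but highly nonnormal
print(spectral_abscissa(A))                # -1.0
print(pseudo_abscissa_estimate(A, 0.1))    # well above -1.0
```

The gap between the last two numbers is exactly why the robust measures in the abstract (distance to instability, pseudospectral abscissa) matter: the plain spectral abscissa can badly misrepresent the behaviour of nearby matrices.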
