Course (online): "Variational Analysis: Foundations and Frontier Developments", October 26 – December 10, 2021
Speaker: Professor Tyrrell Rockafellar (University of Washington)
Tyrrell Rockafellar is Professor Emeritus at the University of Washington. He received his bachelor's degree from Harvard University in 1957 and his doctorate from Harvard in 1963. A long-time researcher in optimization and variational analysis, he is a founding figure of convex analysis and variational analysis. He has published numerous papers in leading international journals on optimization and control and has authored six monographs; among them, *Convex Analysis* and *Variational Analysis* are essential classics for researchers in nonlinear analysis and optimization theory. His honors include the John von Neumann Theory Prize, the Frederick W. Lanchester Prize, and the Dantzig Prize, among others.
Course Materials
SOLUTION MAPPINGS AND STABILITY
Lecture replays on Bilibili
https://space.bilibili.com/1254993141
Lecture replays on the professor's homepage
Topic: AN OVERVIEW OF VARIATIONAL ANALYSIS
Lectures aimed at introducing the basic themes of variational analysis to researchers who are interested in optimization and who want a level of mathematical understanding and capability beyond convex analysis.
Origins and Motivations
December 6, 2021, 9:00–10:30 a.m.
Why and how variational analysis developed from historical subjects such as nonlinear programming and the calculus of variations, while departing in major ways from the framework of classical analysis.
Variational Geometry
December 7, 2021, 9:00–10:30 a.m.
The tangent and normal subspaces associated with the classical geometry of curves and surfaces need to be replaced by one-sided cones of tangent vectors and normal vectors, each of two different kinds. Polarity relations underlie a concept of variational regularity.
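As a hedged sketch of what these one-sided cones look like (in the standard notation of Rockafellar and Wets' *Variational Analysis*, not material quoted from this course page), the tangent cone, the regular normal cone, and the polarity between them can be written as:

```latex
% Tangent cone to a set C \subset R^n at a point \bar{x} \in C:
% directions attainable by sequences in C converging to \bar{x}.
T_C(\bar{x}) = \bigl\{\, w \;:\; \exists\, t^\nu \searrow 0,\ w^\nu \to w
    \ \text{with}\ \bar{x} + t^\nu w^\nu \in C \,\bigr\}.

% Regular normal cone: the polar of the tangent cone, i.e. the vectors
% making a nonpositive inner product with every tangent direction.
\widehat{N}_C(\bar{x}) = \bigl\{\, v \;:\; \langle v, w \rangle \le 0
    \ \text{for all}\ w \in T_C(\bar{x}) \,\bigr\}.

% The (limiting) normal cone N_C(\bar{x}) collects limits of regular
% normals at nearby points of C; the set C is called regular at \bar{x}
% when N_C(\bar{x}) = \widehat{N}_C(\bar{x}), and then the polarity
% relation between tangents and normals holds in both directions.
```

When C is a smooth surface, both cones reduce to the classical tangent and normal subspaces; the one-sided versions are what survive at corners and boundaries.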
Subgradients and Optimality
December 8, 2021, 9:00–10:30 a.m.
The subgradients of convex analysis can be extended with localizing adjustments to nonconvex analysis in a manner compatible with traditional gradients. This amounts to applying variational geometry to epigraphs. It leads to a nonsmooth calculus that supports optimality conditions.
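A minimal rendering of that epigraph connection, using the standard definitions rather than anything specific to these lectures:

```latex
% The epigraph of f is the set of points on or above its graph:
\operatorname{epi} f = \bigl\{\, (x,\alpha) \in R^n \times R
    \;:\; \alpha \ge f(x) \,\bigr\}.

% Subgradients of f at \bar{x} correspond to normals to the epigraph:
v \in \partial f(\bar{x})
    \iff (v,-1) \in N_{\operatorname{epi} f}\bigl(\bar{x}, f(\bar{x})\bigr).

% Fermat rule (necessary condition for a local minimum):
\bar{x} \ \text{locally minimizes}\ f
    \;\Longrightarrow\; 0 \in \partial f(\bar{x}).

% For smooth f one has \partial f(\bar{x}) = \{\nabla f(\bar{x})\},
% recovering the classical condition \nabla f(\bar{x}) = 0.
```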
Variational Approximation
December 9, 2021, 9:00–10:30 a.m.
When can one problem of optimization be said to be a close approximation of another? This is a key question for which the answer should imply closeness of optimal values and solutions, but classical concepts of analysis miss the target.
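One answer developed in variational analysis is epi-convergence of the objective functions. As a sketch (standard definitions, stated here as background rather than quoted from the course):

```latex
% Epi-convergence: f^\nu \to_e f when, at every point x,
% (i)  \liminf_\nu f^\nu(x^\nu) \ge f(x) \ \text{for every sequence } x^\nu \to x,
% (ii) \limsup_\nu f^\nu(x^\nu) \le f(x) \ \text{for some sequence } x^\nu \to x.

% Key consequence (under a suitable compactness condition on minimizers):
\inf f^\nu \to \inf f, \qquad
\limsup_\nu \bigl(\operatorname{argmin} f^\nu\bigr)
    \subset \operatorname{argmin} f.

% Pointwise convergence neither implies nor is implied by
% epi-convergence, which is one sense in which the classical
% concepts "miss the target" for approximating optimization problems.
```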
Solution Mappings and Stability
December 10, 2021, 9:00–10:30 a.m.
When a problem is described by an equation with parameters, the standard implicit function theorem describes how the solution depends on those parameters. Problems of optimization are not modeled simply by equations, so more is needed for understanding such dependence, which relates to error analysis.
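A hedged illustration of why "more is needed": optimality conditions for a constrained problem typically take the form of a generalized equation rather than a plain equation, so the solution mapping is set-valued. The format below is the standard one from the implicit-function literature, assumed here for illustration:

```latex
% A parametric equation f(p,x) = 0 generalizes to
%   f(p,x) + N_C(x) \ni 0,
% where N_C is the normal cone to a constraint set C. Taking C = R^n
% gives N_C(x) = \{0\} and recovers the plain equation.

% The solution mapping is then set-valued:
S(p) = \bigl\{\, x \;:\; f(p,x) + N_C(x) \ni 0 \,\bigr\}.

% Implicit-function-type theorems ask when S admits a single-valued,
% Lipschitz continuous localization around a reference pair
% (\bar{p}, \bar{x}) with \bar{x} \in S(\bar{p}); such localizations
% are the key to quantitative stability and error analysis.
```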