Foundations and Trends® in Optimization > Vol 6 > Issue 1

A Tutorial on Hadamard Semidifferentials

By Kenneth Lange, Departments of Computational Medicine, Human Genetics, and Statistics, University of California, Los Angeles, USA, klange@ucla.edu

 
Suggested Citation
Kenneth Lange (2024), "A Tutorial on Hadamard Semidifferentials", Foundations and Trends® in Optimization: Vol. 6: No. 1, pp 1-62. http://dx.doi.org/10.1561/2400000041

Publication Date: 13 May 2024
© 2024 K. Lange
 
Subjects
Optimization, Calculus and mathematical analysis
 
MSC Codes
Primary 28A15, 65C60.
 

In this article:
1. Introduction
2. Background on Convexity
3. Basic Properties of Semidifferentials
4. Tangent Vectors and Adjacent Cones
5. First-Order Optimality Conditions
6. A KKT Rule for Semidifferentiable Programs
7. Second-Order Optimality Conditions
8. Differentials in Optimization Practice
9. Optimization on Embedded Submanifolds
10. Discussion
Acknowledgements
References

Abstract

The Hadamard semidifferential is more general than the Fréchet differential now dominant in undergraduate mathematics education. By slightly changing the definition of the forward directional derivative, the Hadamard semidifferential rescues the chain rule, enforces continuity, and permits differentiation across maxima and minima. It also plays well with convex analysis and naturally extends differentiation to smooth embedded submanifolds, topological vector spaces, and metric spaces of shapes and geometries. The current elementary exposition focuses on the more familiar territory of analysis in Euclidean spaces and applies the semidifferential to some representative problems in optimization and statistics. These include algorithms for proximal gradient descent, steepest descent in matrix completion, and variance components models.
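The "slight change" mentioned above is that the Hadamard semidifferential df(x; v) takes the limit of the difference quotient [f(x + tw) - f(x)] / t as t decreases to 0 and the direction w tends to v simultaneously, rather than holding the direction fixed. The following sketch (not from the tutorial; the function names are illustrative) approximates this double limit numerically for f(x) = max(x1, x2), a function that fails to be Fréchet differentiable wherever x1 = x2 but whose Hadamard semidifferential there is simply max(v1, v2):

```python
def hadamard_quotient(f, x, t, w):
    """Difference quotient [f(x + t*w) - f(x)] / t from the definition."""
    return (f([xi + t * wi for xi, wi in zip(x, w)]) - f(x)) / t

def approx_semidiff(f, x, v, steps=6):
    """Approximate df(x; v) by letting t -> 0+ and w -> v simultaneously."""
    val = None
    for k in range(1, steps + 1):
        t = 10.0 ** (-k)
        w = [vi + t for vi in v]      # perturbed direction: w -> v as t -> 0+
        val = hadamard_quotient(f, x, t, w)
    return val

f = lambda x: max(x)
x = [1.0, 1.0]                        # a kink of f: the components agree

print(approx_semidiff(f, x, [1.0, -1.0]))   # close to max(1, -1) = 1
print(approx_semidiff(f, x, [-2.0, -3.0]))  # close to max(-2, -3) = -2
```

Because the quotient converges even as the direction wobbles, the resulting semidifferential survives composition, which is what rescues the chain rule for nonsmooth functions like this maximum.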

DOI: 10.1561/2400000041
ISBN (print): 978-1-63828-348-5
ISBN (e-book): 978-1-63828-349-2
78 pp.

A Tutorial on Hadamard Semidifferentials

This tutorial presents a brief survey of semidifferentials in the familiar context of finite-dimensional Euclidean space. This restriction exposes the most critical ideas, their connections to convexity and optimization, and a few novel applications. The full text delves more deeply into these topics and is well suited to a systematic course or self-study.

The main focus of this tutorial is the Hadamard semidifferential. As the abstract above describes, it generalizes the Fréchet differential by relaxing the definition of the forward directional derivative, and in doing so preserves the chain rule, enforces continuity, and permits differentiation across maxima and minima. It also meshes naturally with convex analysis and extends differentiation to smooth embedded submanifolds, topological vector spaces, and metric spaces of shapes and geometries. The exposition stays within Euclidean spaces and applies the semidifferential to representative problems in optimization and statistics, including proximal gradient descent, steepest descent in matrix completion, and variance components models.

This tutorial will be of interest to students in advanced undergraduate and beginning graduate courses in the mathematical sciences.

 
OPT-041