Convex conjugate functions and Fenchel’s inequality
Let $f: \mathbb{R}^n \to \mathbb{R}$ be some function. The convex conjugate of $f$ (also known as the Fenchel transformation) is the function $f^*$ defined as
$$f^*(y) = \sup_x \left[ y^\top x - f(x) \right].$$
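For a quick concrete example, take $f(x) = \tfrac{1}{2}\|x\|_2^2$. The expression $y^\top x - \tfrac{1}{2}\|x\|_2^2$ is maximized at $x = y$, so
$$f^*(y) = y^\top y - \tfrac{1}{2}\|y\|_2^2 = \tfrac{1}{2}\|y\|_2^2,$$
i.e. this particular $f$ is its own conjugate.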
Fenchel's inequality is the following statement:
Theorem (Fenchel's inequality). For any $x, y \in \mathbb{R}^n$,
$$f(x) + f^*(y) \geq y^\top x.$$
The proof of Fenchel's inequality follows directly from the definition of the convex conjugate: for any $x$ and $y$,
$$f^*(y) = \sup_{x'} \left[ y^\top x' - f(x') \right] \geq y^\top x - f(x).$$
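As a quick sanity check, with $f(x) = \tfrac{1}{2}\|x\|_2^2$ (so $f^* = f$ as computed above), Fenchel's inequality reads
$$\tfrac{1}{2}\|x\|_2^2 + \tfrac{1}{2}\|y\|_2^2 \geq y^\top x,$$
which is just a rearrangement of $\tfrac{1}{2}\|x - y\|_2^2 \geq 0$.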
A direct application of Fenchel's inequality shows that the conjugate of the conjugate (the biconjugate), denoted $f^{**}$, always satisfies
$$f^{**}(x) \leq f(x) \quad \text{for all } x.$$
To see this: Fenchel's inequality says $y^\top x - f^*(y) \leq f(x)$ for all $y$. Taking a supremum over all $y$, the inequality becomes
$$f^{**}(x) = \sup_y \left[ y^\top x - f^*(y) \right] \leq f(x).$$
It turns out that if $f$ is closed and convex, then we actually have $f^{**} = f$. The proof is a bit cumbersome and so I'm not providing it in this post. (For a proof, see Reference 1 or Slides 5.13-5.14 of Reference 2.) More generally, the Fenchel-Moreau theorem gives necessary and sufficient conditions for $f^{**} = f$.
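As a one-dimensional illustration of this fact, take $f(x) = |x|$, which is closed and convex. Its conjugate is
$$f^*(y) = \sup_x \left[ yx - |x| \right] = \begin{cases} 0 & \text{if } |y| \leq 1, \\ +\infty & \text{otherwise}, \end{cases}$$
and the biconjugate recovers $f$:
$$f^{**}(x) = \sup_{|y| \leq 1} yx = |x|.$$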
Subdifferentials and optimality
A vector $g$ is a subgradient of $f$ at point $x$ if
$$f(z) \geq f(x) + g^\top (z - x)$$
for all $z$.
The subdifferential of $f$ at point $x$, denoted $\partial f(x)$, is the set of all the subgradients of $f$ at point $x$.
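For example, for $f(x) = |x|$ on $\mathbb{R}$, a number $g$ satisfies $|z| \geq |x| + g(z - x)$ for all $z$ exactly when $g = \mathrm{sign}(x)$ for $x \neq 0$, and when $g \in [-1, 1]$ for $x = 0$. Hence $\partial f(x) = \{\mathrm{sign}(x)\}$ for $x \neq 0$ and $\partial f(0) = [-1, 1]$.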
Convex conjugates appear in the following theorem concerning subdifferentials:
Theorem. If $f$ is closed and convex, then for any $x, y \in \mathbb{R}^n$,
$$y \in \partial f(x) \iff x \in \partial f^*(y) \iff f(x) + f^*(y) = y^\top x.$$
Proof (adapted from Slide 5.15 of Reference 2 and Reference 3): First we show that $y \in \partial f(x)$ implies $x \in \partial f^*(y)$. Using the definitions of subgradients and convex conjugates, $y \in \partial f(x)$ means $f(u) \geq f(x) + y^\top (u - x)$ for all $u$, i.e. $y^\top u - f(u) \leq y^\top x - f(x)$ for all $u$, so that
$$f^*(y) = \sup_u \left[ y^\top u - f(u) \right] = y^\top x - f(x).$$
Thus, for any $v$,
$$f^*(v) = \sup_u \left[ v^\top u - f(u) \right] \geq v^\top x - f(x) = x^\top (v - y) + \left[ y^\top x - f(x) \right] = f^*(y) + x^\top (v - y),$$
which implies that $x \in \partial f^*(y)$.
To get the reverse implication $x \in \partial f^*(y) \Rightarrow y \in \partial f(x)$, apply the same logic above with $f^*$ in place of $f$ and use the fact that $f^{**} = f$ for closed convex $f$.
Next, we show that $y \in \partial f(x)$ implies $f(x) + f^*(y) = y^\top x$. Using the definitions of subgradients,
$$f(z) \geq f(x) + y^\top (z - x) \quad \text{for all } z, \quad \text{i.e.} \quad y^\top z - f(z) \leq y^\top x - f(x) \quad \text{for all } z.$$
Taking supremum over all $z$, the inequality becomes
$$f^*(y) \leq y^\top x - f(x), \quad \text{i.e.} \quad f(x) + f^*(y) \leq y^\top x.$$
Combining this with Fenchel's inequality gives us equality: $f(x) + f^*(y) = y^\top x$.
Conversely, if $f(x) + f^*(y) = y^\top x$, we have, for any $z$,
$$y^\top z - f(z) \leq f^*(y) = y^\top x - f(x),$$
which rearranges to $f(z) \geq f(x) + y^\top (z - x)$, i.e. $y \in \partial f(x)$. This completes the proof.
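As a sanity check of the theorem, take $f(x) = |x|$ again, with $f^*$ the indicator of $[-1, 1]$ computed earlier. At $x = 0$: for any $y$ with $|y| \leq 1$ we have $y \in \partial f(0) = [-1, 1]$, $0 \in \partial f^*(y)$ (since $f^*(v) \geq 0 = f^*(y) + 0 \cdot (v - y)$ for all $v$), and $f(0) + f^*(y) = 0 = y \cdot 0$, so all three statements hold; for $|y| > 1$ all three fail.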
References:
1. StackExchange. "How to prove the conjugate of the conjugate function is itself?"
2. Vandenberghe, L. (2022). "5. Conjugate functions."
3. StackExchange. "Proof about Conjugate and subgradient."