Jeremi Forman-Durañona

Discriminant Inequality Trick

2026-01-24

I came across a really cool trick while reading Ros and Montiel, which I now recall from first-year calculus can be used to prove Cauchy-Schwarz. If you want an inequality between two quantities that appear as the (non-leading) coefficients of a quadratic polynomial, then this trick might help you!

The trick is this: if a quadratic polynomial with real coefficients has distinct real (non-real complex) roots, then its discriminant is positive (negative).
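For concreteness, here is a minimal numerical sanity check (the sample coefficients are illustrative picks of mine, not from anywhere in particular):

```python
# Sanity check: distinct real roots <-> positive discriminant,
# non-real complex roots <-> negative discriminant.
import numpy as np

for a, b, c in [(1, -3, 2),   # x^2 - 3x + 2 has roots 1, 2
                (1, 2, 5)]:   # x^2 + 2x + 5 has roots -1 +/- 2i
    disc = b**2 - 4*a*c
    print(disc, np.roots([a, b, c]))
```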

The Cauchy-Schwarz proof uses the negative version of the trick, whereas the inequality between Gauss and mean curvature uses the positive version (in its non-strict form: real roots, possibly repeated, give a nonnegative discriminant).

Examples!

I’ll prove Cauchy-Schwarz and then I’ll prove an inequality between Gauss curvature and mean curvature, which is the application I found in Ros and Montiel that inspired me to write this post.

Cauchy-Schwarz

Take $v,w \in \R^n$. We want to show that $|\left< v,w \right>| \leq |v||w|$, with equality iff $v,w$ are collinear.

Two steps:

  1. $v,w$ are collinear if and only if $|\left< v,w \right>| = |v|\cdot |w|$.
  2. Otherwise, we get the strict inequality.

Step 1.

If $v,w$ are collinear, then one easily gets equality. Conversely, suppose the two sides are equal, i.e. $\left< v,w \right>^2 = \left< v,v \right> \left< w,w \right>$. If $w = 0$ then $v,w$ are trivially collinear, so assume $w \neq 0$ and set $u = \left< w,w \right> v - \left< v,w \right> w$. Expanding with bilinearity, $$ |u|^2 = \left< w,w \right> \left( \left< v,v \right> \left< w,w \right> - \left< v,w \right>^2 \right) = 0, $$ so $u = 0$, which says exactly that $v,w$ are collinear.
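A quick numerical check of the identity above (a sketch of my own, with random vectors standing in for $v,w$):

```python
# Check |u|^2 = <w,w>(<v,v><w,w> - <v,w>^2) for u = <w,w> v - <v,w> w.
import numpy as np

rng = np.random.default_rng(0)
v, w = rng.standard_normal(4), rng.standard_normal(4)
u = w.dot(w) * v - v.dot(w) * w
print(np.isclose(u.dot(u), w.dot(w) * (v.dot(v) * w.dot(w) - v.dot(w)**2)))
# For collinear inputs, e.g. v = 3w, u vanishes and equality holds.
print(np.allclose(w.dot(w) * (3*w) - (3*w).dot(w) * w, 0))
```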

Step 2. (We use the discriminant trick here)

We can now suppose that $v,w$ are not collinear. Therefore, $v-xw \neq 0$ for any real $x$. Equivalently, $|v-xw| \neq 0$ for any $x \in \R$. Expanding, we find that the following quadratic polynomial in $x$ has no real roots: $$ |v-xw|^2 = \left< v-xw, v-xw \right> = \left< v,v \right> - 2 x\left<v,w \right> + x^2 \left< w,w \right>. $$ It follows that the discriminant $$\left( -2 \left< v,w \right> \right)^2 - 4 \left< w,w \right> \left< v,v \right>$$ (the stuff under the square root in the quadratic formula) is negative, which is exactly the (strict) Cauchy-Schwarz inequality!
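Here is a small numerical illustration (again my own sketch): generic random vectors are almost surely not collinear, so the discriminant comes out negative, while collinear vectors make it vanish.

```python
# Discriminant of |v - x w|^2 = <w,w> x^2 - 2<v,w> x + <v,v>, as a quadratic in x.
import numpy as np

def disc(v, w):
    return (-2 * v.dot(w))**2 - 4 * w.dot(w) * v.dot(v)

rng = np.random.default_rng(0)
v, w = rng.standard_normal(5), rng.standard_normal(5)
print(disc(v, w))        # negative: v, w are (almost surely) not collinear
print(disc(3 * w, w))    # ~0: collinear vectors give a double root
```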

It should be noted that this proof works for inner product spaces of any dimension.

Inequality between Gauss curvature and mean curvature

Let $S \subset \R^3$ be an orientable closed surface. In particular, it admits a smooth Gauss map $N: S \to S^2$ sending a given point $p \in S$ to a unit normal $N(p) \in S^2$. Since $T_{N(p)}S^2$ and $T_pS$ are both the orthogonal complement of $N(p)$, we may regard the differential as a map $dN_p : T_pS \to T_pS$. The Gauss curvature $K(p)$ at $p$ is defined as the determinant of $dN_p$, whereas the mean curvature $H(p)$ is defined as $\frac{1}{2}$ times the trace of $dN_p$. It is a fact, which follows from a computation, that $dN_p$ is self-adjoint and thus diagonalizable over the reals. The corresponding eigenvalue/eigenvector pairs in $T_pS$ are called the principal curvatures and principal directions, respectively. I will denote the principal curvatures by $\kappa_1, \kappa_2 \in \R$. This means that $K = \kappa_1 \cdot \kappa_2$ and $H = \frac{1}{2}(\kappa_1 + \kappa_2)$ (NB: these quantities vary with $p$).
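To make the linear algebra concrete, here is a toy check (the matrix is an arbitrary symmetric stand-in for $dN_p$, not computed from any actual surface):

```python
# For a symmetric 2x2 stand-in for dN_p: K = det = k1*k2, H = trace/2 = (k1+k2)/2.
import numpy as np

dN = np.array([[2.0, 1.0],
               [1.0, 3.0]])          # symmetric => real eigenvalues
k1, k2 = np.linalg.eigvalsh(dN)      # "principal curvatures" of the toy matrix
K, H = np.linalg.det(dN), np.trace(dN) / 2
print(np.isclose(K, k1 * k2), np.isclose(H, (k1 + k2) / 2))
```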

For a particular problem I did out of Ros and Montiel (problem 15 in Chapter 5) I needed the following application of the discriminant trick.

I wanted to show for a compact surface $S$ that $\int_S H(p)^2\, dA(p) \geq 4 \pi$, knowing that $\int_{S^+} K(p)\, dA(p) \geq 4\pi$, where $S^+ = \{p \in S : K(p) \geq 0\}$ (this is itself a consequence of the Area formula, since the Gauss map already covers the sphere on $S^+$). A pointwise inequality like $H(p)^2 \geq K(p)$ would then suffice, since $$\int_S H^2\, dA \geq \int_{S^+} H^2\, dA \geq \int_{S^+} K\, dA \geq 4\pi,$$ and the discriminant trick is enough to produce it.

Here’s the trick!

The characteristic polynomial of $dN_p$ is $$(\kappa_1 - x)(\kappa_2 - x) = \kappa_1\kappa_2 - (\kappa_1 + \kappa_2)x + x^2 = K - 2H x + x^2,$$ so using that $\kappa_1, \kappa_2 \in \R$ we have that the discriminant of the characteristic polynomial is nonnegative: $$ 4H^2 - 4K \geq 0, $$ i.e. $H^2 \geq K$ at every point. On $S^+$, where $K \geq 0$, this reads $H^2 \geq |K|$, which was exactly what I needed for my problem.
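One more sanity check (my own sketch): the discriminant $4H^2 - 4K$ is exactly $(\kappa_1 - \kappa_2)^2$, so its nonnegativity is visible directly.

```python
# 4H^2 - 4K = (k1 - k2)^2 >= 0 for any real principal curvatures k1, k2.
import numpy as np

rng = np.random.default_rng(0)
k1, k2 = rng.standard_normal(2)
K, H = k1 * k2, (k1 + k2) / 2
print(np.isclose(4 * H**2 - 4 * K, (k1 - k2)**2))   # True
print(H**2 >= K)                                    # True pointwise
```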

Thoughts

Seeing as so many important inequalities are proved using Cauchy-Schwarz, it seems to me that this trick is well worth recognizing in the wild.