2025 USA-NA-AIO Round 2, Problem 3, Part 18

Part 18 (5 points, non-coding task)

Lemma 1 implies that the projection of \mathbf{x} onto any unit direction is a standard normal. Therefore, the distribution of \mathbf{x} is rotationally invariant: all directions are statistically equivalent.

Therefore,

\Bbb P \left( \frac{\mathbf{x}^\top \mathbf{y}}{|| \mathbf{x} ||_2 || \mathbf{y}||_2 } > \epsilon \right) = \Bbb P \left( \frac{\mathbf{x}^\top \mathbf{y}}{|| \mathbf{x} ||_2 || \mathbf{y}||_2 } > \epsilon \mid \mathbf{x} = \mathbf{\hat x} \right), \ \forall \ \mathbf{\hat x} \in \Bbb R^d \setminus \{ \mathbf{0} \} .

For simplicity, we consider

\mathbf{\hat x} = \begin{bmatrix} 1 \\ 0 \\ \vdots \\ 0 \end{bmatrix} \in \Bbb R^d .

Therefore, since \mathbf{\hat x}^\top \mathbf{y} = y_0 and || \mathbf{\hat x} ||_2 = 1 for this choice of \mathbf{\hat x}, we only need to bound

\Bbb P \left( \frac{y_0}{|| \mathbf{y}||_2 } > \epsilon \right) .

Flipping the sign of y_0 leaves || \mathbf{y} ||_2 unchanged, so the distribution of y_0 / || \mathbf{y} ||_2 is symmetric about 0. Hence

\text{E} \left[ \frac{y_0}{|| \mathbf{y}||_2} \right] = 0 .

Hence, we get

\begin{align*} \Bbb P \left( \frac{y_0}{|| \mathbf{y}||_2 } > \epsilon \right) & \leq \frac{\text{Var} \left[ \frac{y_0}{|| \mathbf{y}||_2} \right]}{\epsilon^2} \\ & = \frac{\text{E} \left[ \frac{y_0^2}{|| \mathbf{y}||_2^2} \right]}{\epsilon^2} \\ & = \frac{1}{\epsilon^2 d} \ \text{E} \left[ \frac{y_0^2}{\frac{1}{d}\sum_{i=0}^{d-1} y_i^2} \right] \end{align*}

where the inequality follows from Chebyshev's inequality, and the second step uses the fact that the mean is 0, so the variance equals the second moment.
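As a numerical sanity check (a Monte Carlo sketch, not part of the proof; the values of d, \epsilon, and the trial count are arbitrary choices), the empirical tail probability can be compared against the \frac{1}{\epsilon^2 d} bound that this derivation targets:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_trials, eps = 100, 200_000, 0.3

# Each row is an independent draw of y ~ N(0, I_d).
y = rng.standard_normal((n_trials, d))
proj = y[:, 0] / np.linalg.norm(y, axis=1)  # y_0 / ||y||_2

empirical_tail = np.mean(proj > eps)
bound = 1.0 / (eps**2 * d)  # the 1/(eps^2 d) bound from the derivation

print(empirical_tail, bound)
```

For these settings the empirical tail is far below the bound, which is expected: Chebyshev's inequality is loose for near-Gaussian tails.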

To prove the theorem, it therefore suffices to prove the following lemma.

Lemma 2:

  • Let y_0, \cdots , y_{d-1} be independent and identically distributed standard normal random variables. Then for large d,
\text{E} \left[ \frac{y_0^2}{\frac{1}{d}\sum_{i=0}^{d-1} y_i^2} \right] \approx 1 .

In this part, your task is to prove this lemma.

Hint: It is hard to prove this statement in an exact way. You can make any reasonable approximation.

\color{green}{\text{### WRITE YOUR SOLUTION HERE ###}}

Since each y_i is a standard normal, \text{E} \left[ y_i^2 \right] = 1 and \text{Var} \left[ y_i^2 \right] = 2 (each y_i^2 is a \chi^2_1 random variable).

For large d, the central limit theorem implies that, approximately,

\frac{1}{d}\sum_{i=0}^{d-1} y_i^2 \sim N \left( 1, \frac{2}{d} \right) .

Hence, for large d, the standard deviation \sqrt{2/d} vanishes, so \frac{1}{d}\sum_{i=0}^{d-1} y_i^2 concentrates around, and can be approximated by, its mean value, 1.
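This concentration claim can be checked numerically (an illustrative sketch; the dimension and sample size below are arbitrary choices): the sample mean of \frac{1}{d}\sum_{i=0}^{d-1} y_i^2 should sit near 1 and its sample variance near \frac{2}{d}.

```python
import numpy as np

rng = np.random.default_rng(1)
d, n_trials = 50, 100_000

y = rng.standard_normal((n_trials, d))
s = np.mean(y**2, axis=1)  # (1/d) * sum_i y_i^2, one value per trial

print(s.mean(), s.var(), 2 / d)  # sample variance should be close to 2/d
```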

Therefore, for large d,

\begin{align*} \text{E} \left[ \frac{y_0^2}{\frac{1}{d}\sum_{i=0}^{d-1} y_i^2} \right] & \approx \text{E} \left[ y_0^2 \right] \\ & = 1 . \end{align*}
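A direct Monte Carlo estimate (an illustrative sketch; d and the trial count are arbitrary choices) confirms that the expectation is close to 1:

```python
import numpy as np

rng = np.random.default_rng(2)
d, n_trials = 100, 500_000

y = rng.standard_normal((n_trials, d))
ratio = y[:, 0]**2 / np.mean(y**2, axis=1)  # y_0^2 / ((1/d) sum_i y_i^2)

print(ratio.mean())  # should be close to 1
```

As a side remark, symmetry gives \text{E} \left[ y_j^2 / \sum_{i} y_i^2 \right] = \frac{1}{d} exactly for each j (the d terms are exchangeable and sum to 1), so the expectation in the lemma equals 1 for every d, consistent with the large-d approximation proved above.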

\color{red}{\text{""" END OF THIS PART """}}