It's a bit mind-blowing when you try to understand it geometrically:

We have vectors $x_1, \dots, x_n$ in $n$-dimensional space.

Let's take an orthonormal basis $e_1, \dots, e_n$ in this space and form new vectors from the scalar products:

$$z_i = (\langle e_i, x_1 \rangle, \langle e_i, x_2 \rangle, \dots, \langle e_i, x_n \rangle) $$

So the theorem is:

$$ \det_{i,j} \langle z_i, z_j \rangle = \det_{i,j} \langle x_i, x_j \rangle $$

And the proof is actually very simple. Let's introduce the matrix $A$ with $A_{ij} = \langle x_i, e_j \rangle$. Then $z_i$ is the $i$-th column of $A$, so $\langle z_i, z_j \rangle = (A^T A)_{ij}$, while $\langle x_i, x_j \rangle = (A A^T)_{ij}$:

$$ \det_{i,j} \langle z_i, z_j \rangle = \det A^T A = (\det A)^2 = \det A A^T = \det_{i,j} \langle x_i, x_j \rangle $$

Voila! Beautiful, but absolutely unclear.
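The identity is also easy to check numerically. Here is a minimal NumPy sketch; the random vectors and the orthonormal basis obtained via a QR decomposition are my own illustration, not part of the original argument:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
X = rng.standard_normal((n, n))  # rows of X are the vectors x_1, ..., x_n

# An orthonormal basis e_1, ..., e_n: columns of Q from a QR decomposition
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))

A = X @ Q        # A[i, j] = <x_i, e_j>
Z = A.T          # row i of Z is z_i = (<e_i, x_1>, ..., <e_i, x_n>), i.e. column i of A

gram_z = Z @ Z.T  # matrix of <z_i, z_j>, equals A^T A
gram_x = X @ X.T  # matrix of <x_i, x_j>, equals A A^T

print(np.isclose(np.linalg.det(gram_z), np.linalg.det(gram_x)))  # True
```

Changing the seed or the dimension `n` does not affect the outcome: the two Gram determinants agree up to floating-point error.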

