Let $A \in \mathrm{U}(n) \subset \mathbb{C}^{n \times n}$ be a unitary matrix. Show that there exists $S\in \mathrm{U}(n)$ such that

$\overline{S}^tAS=D:=\begin{pmatrix}\lambda_1 & & 0\\ & \ddots & \\ 0 & & \lambda_n\end{pmatrix}$

where the $\lambda_i\in \mathbb{C}$ are the eigenvalues of $A$. Show, moreover, that $|\lambda_i|=1$.

My idea is to show that the orthogonal complement $W$ of an eigenvector $v$ of $A$ is mapped into itself by $A$, meaning $AW \subset W$. But how do I start this?

Let $\lambda\in \mathbb{C}$ be an eigenvalue and $v \in \mathbb{C}^n$ a corresponding eigenvector with $|v|=1$. Let $W := v^{\perp} = \{w \in \mathbb{C}^n \mid \langle w,v\rangle=0\}$ be the orthogonal complement of $v$.

${\langle Aw,v\rangle}=\overline{\langle v,Aw\rangle}=\overline{\langle v,\lambda w\rangle}=\lambda \overline{\langle v,w\rangle}=\lambda\cdot 0=0$,
so $Aw\in v^{\perp}=W$.

So now choose an orthonormal basis $v_2,\ldots,v_n$ of $W$ (Gram–Schmidt process). Let $v_1:=v$ and $S:=(v_1,v_2,\ldots,v_n) \in \mathrm{GL}(n,\mathbb{C})$. Then one has $\langle v_i,v_i\rangle=1$ for all $i$ and $\langle v_i,v_j\rangle=0$ for all $i \neq j$. Let $w_j:=Av_j\in W$ for $j=2,\ldots,n$.

$\Rightarrow \overline{S}^tAS=\overline{S}^t(\lambda v_1,w_2,\ldots,w_n)=\begin{pmatrix}\lambda & 0 & \cdots & 0\\ 0 & & & \\ \vdots & & B & \\ 0 & & & \end{pmatrix}.$
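Spelling out the entries (assuming the convention $\langle x,y\rangle=\overline{y}^{\,t}x$, linear in the first argument), the first column and first row are

$$(\overline{S}^tAS)_{i1}=\overline{v_i}^{\,t}Av_1=\lambda\langle v_1,v_i\rangle=\lambda\delta_{i1},\qquad (\overline{S}^tAS)_{1j}=\overline{v_1}^{\,t}Av_j=\langle w_j,v_1\rangle=0\quad(j\geq 2),$$

the last equality because $w_j\in W=v_1^{\perp}$.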

Is this correct so far? Why is $S$ orthogonal?

    
The last part follows like this: you have $I_n=AA^*=SDS^*SD^*S^*=SDD^*S^*$. Now focus on the equality $I_n=SDD^*S^*$ and try to conclude that the eigenvalues are on the unit circle. For the first part, do you know anything about diagonalizability of normal matrices? – Git Gud Jun 22 '14 at 20:24
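(Spelling out that hint, assuming $S$ is already known to be unitary, i.e. $S^*S=I_n$: multiplying $I_n=SDD^*S^*$ by $S^*$ on the left and $S$ on the right gives

$$DD^*=S^*I_nS=I_n,$$

and comparing the $(i,i)$ entries yields $|\lambda_i|^2=\lambda_i\overline{\lambda_i}=1$.)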
    
How to start? By assuming that $w\perp v$, and looking at $\langle Aw,v\rangle$. You want to see that that is $0$. You have an inner product in which a matrix appears in one argument. What is a very common operation in such circumstances? – Daniel Fischer Jun 22 '14 at 20:43
    
The property $|\lambda|=1$ follows from the very definition of unitarity $\|Ax\|=\|x\|$. – Peter Franek Jun 22 '14 at 21:43
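(Explicitly: if $Av=\lambda v$ with $v\neq 0$, then

$$\|v\|=\|Av\|=\|\lambda v\|=|\lambda|\,\|v\|,$$

and dividing by $\|v\|\neq 0$ gives $|\lambda|=1$.)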
    
Are you aware of the spectral theorem? – Omnomnomnom Jun 23 '14 at 23:20
    
@Omnomnomnom No, the spectral theorem is unknown to us so far – fear.xD Jun 24 '14 at 8:17

How to show $A(v^\perp) \subset v^\perp$ for an eigenvector $v$:

Consider any $u \in v^\perp$. We have $$ \langle Au,v\rangle = \frac 1 {\bar\lambda} \langle Au,\lambda v\rangle = \frac 1 {\bar\lambda} \langle Au, Av\rangle = \frac 1 {\bar\lambda} \langle u, v\rangle =0 $$ In your proof, you assume that $w$ is an eigenvector, which is an invalid assumption.
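Two facts are used here, both coming from unitarity (written with the convention $\langle x,y\rangle=y^*x$, conjugate-linear in the second argument): $A^*A=I$ gives

$$\langle Au,Av\rangle=(Av)^*(Au)=v^*A^*Au=v^*u=\langle u,v\rangle,$$

and $\lambda\neq 0$ because $A$ is invertible (in fact $|\lambda|=1$), so dividing by $\overline\lambda$ is allowed.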

For your second question: a matrix $S$ is orthogonal (unitary) if and only if its columns form an orthonormal basis of $\mathbb{R}^n$ ($\mathbb{C}^n$). To see why this is the case, note that $S^*S$ is the matrix of pairwise inner products of columns of $S$. That is, if $s_1,\dots,s_n$ are the columns of $S$, then $$ (S^*S)_{ij} = \langle s_j,s_i \rangle $$
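In particular, if the columns are orthonormal, then

$$(S^*S)_{ij}=\langle s_j,s_i\rangle=\delta_{ij},\qquad\text{i.e. }S^*S=I_n,$$

which is precisely the statement that $S$ is unitary. In your construction the vectors $v_1,\ldots,v_n$ are orthonormal by the choice of $v_1$ and Gram–Schmidt, so $S=(v_1,\ldots,v_n)$ is unitary and $\overline{S}^t=S^*=S^{-1}$.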
