Let $v_1,\dotsc,v_k \in \mathbb{R}^d$ be unit-length vectors such that $$\sum_{1\leq i,j\leq k} |\langle v_i,v_j\rangle|^2 \leq \epsilon k^2.$$ What sort of lower bound can we give on $d$ in terms of $k$ and $\epsilon$?

Must it be the case that $d\gg \min(\log k,\epsilon^{-1})$, say, or anything of the sort? Is that tight?

I guess a more flexible reformulation is to let $v_1,\ldots,v_k$ be unit-length vectors in some vector space of large (infinite?) dimension, and ask for a lower bound on the dimension of their span given the almost-orthogonal condition. – Peter Humphries 9 hours ago
Isn't this equivalent? – H A Helfgott 9 hours ago
    
Yes, of course. – Peter Humphries 9 hours ago
    
Yes, that's what I meant. Thanks! – H A Helfgott 9 hours ago

The Johnson–Lindenstrauss lemma implies that there exist $k$ unit vectors whose pairwise inner products are all at most $\epsilon$ in absolute value, provided $d\geq C \epsilon^{-2} \log k.$
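
As a quick numerical illustration (a sketch, not a proof; it assumes numpy, and the constant $C=4$ below is an arbitrary choice), random unit vectors in dimension $d \approx C\epsilon^{-2}\log k$ typically have all pairwise inner products of size about $\epsilon$, and a much smaller averaged squared inner product:

```python
# Sanity check: random unit vectors in d ~ C * eps^{-2} * log k dimensions
# are nearly orthogonal, both in the max and in the averaged sense.
import numpy as np

rng = np.random.default_rng(0)
k, eps = 1000, 0.2
C = 4                                            # hypothetical choice of the constant C
d = int(np.ceil(C * eps**-2 * np.log(k)))        # d ~ C eps^{-2} log k

V = rng.standard_normal((k, d))
V /= np.linalg.norm(V, axis=1, keepdims=True)    # unit-length rows

G = V @ V.T                                      # Gram matrix of inner products
off = G - np.eye(k)                              # zero out the diagonal
print("d =", d)
print("max_{i != j} |<v_i, v_j>|        :", np.abs(off).max())
print("(1/k^2) sum_{i,j} |<v_i, v_j>|^2 :", (G**2).mean())
```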

If you actually want to bound the maximum absolute value of the inner product over distinct vectors, rather than the average you stated (bounding the maximum is of course the stronger requirement), then tighter bounds apply.

Relevant results for this case are due to Welch, Kabatianski, Levenshtein, and Sidelnikov. Welch's bound applies to arbitrary vectors, real or complex; the others apply to vectors constructed from complex roots of unity of some finite order. Welch's bound states:

Let $e\geq 1$ be an integer and let $a_1,\ldots,a_k$ be distinct vectors in $\mathbb{C}^d.$ Then the following inequality holds: $$ \sum_{i=1}^k \sum_{j=1}^k \left| \langle a_i, a_j \rangle \right|^{2e} \geq \frac{\left(\sum_{i=1}^k \lVert a_i \rVert^{2e}\right)^2}{\binom{d+e-1}{e}}. $$ If the set of vectors you are interested in has size roughly $d^u,$ the tightest lower bound is obtained by choosing $e=\lfloor u\rfloor.$
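
As a sanity check of this inequality for $e=1$ and $e=2$ (a sketch assuming numpy; the random complex unit vectors and the parameters $k=50$, $d=8$ are arbitrary choices):

```python
# Numerical check of the displayed inequality for e = 1, 2 on random unit
# vectors in C^d; any distinct vectors would do.
import numpy as np
from math import comb

rng = np.random.default_rng(1)
k, d = 50, 8
A = rng.standard_normal((k, d)) + 1j * rng.standard_normal((k, d))
A /= np.linalg.norm(A, axis=1, keepdims=True)    # unit-length rows of C^d

G2 = np.abs(A @ A.conj().T) ** 2                 # |<a_i, a_j>|^2

for e in (1, 2):
    lhs = (G2 ** e).sum()                        # sum_{i,j} |<a_i, a_j>|^{2e}
    rhs = k**2 / comb(d + e - 1, e)              # ||a_i|| = 1, so sum_i ||a_i||^{2e} = k
    print(f"e = {e}:  LHS = {lhs:.3f}  >=  RHS = {rhs:.3f}  ->", lhs >= rhs)
```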

Edit:

Since you required $\langle a_i,a_i\rangle=1$ for $1\leq i\leq k,$ the case $e=1$ gives $\sum_{1\leq i,j\leq k} |\langle a_i,a_j\rangle|^2 \geq k^2/d$; subtracting the $k$ diagonal terms (each equal to $1$) gives $$ \sum_{1\leq i\neq j\leq k} \left| \langle a_i, a_j \rangle \right|^{2} \geq \frac{k^2}{d} - k = \frac{k^2-kd}{d}, $$ recovering the dependence on $k.$

Edit 2:

The Johnson–Lindenstrauss lemma is tight up to a constant factor. The Welch bound is tight in some cases, namely when so-called Welch Bound with Equality sets of vectors exist; these correspond to all inner products between distinct vectors having the same absolute value.

Wait. Doesn't the inequality you cite as Welch's imply that $d\geq \epsilon^{-1}$, with no dependence on $k$? This seems a little too strong. – H A Helfgott 8 hours ago
    
@HAHelfgott, please see edit. – kodlu 8 hours ago
    
Thanks. That answers the question, then. But how far is it from being tight? Is the Johnson Lindenstrauss Lemma tight? – H A Helfgott 7 hours ago
As I pointed out earlier in a now deleted comment, we can (trivially) achieve the bound for a given $\epsilon$ without any real size restrictions on $k$ if $d\ge\epsilon^{-1}$ by just repeating an ONB of $\mathbb R^d$. (I guess it would be more honest to say that if $d$ is very close to $\epsilon^{-1}$, then $k$ should be a multiple of $d$ to be safe.) – Christian Remling 5 hours ago
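
A minimal sketch of this construction (assuming numpy; the values $d=10$ and $m=7$ are arbitrary): repeating the standard basis of $\mathbb R^d$ $m$ times gives $k=md$ unit vectors with $\frac{1}{k^2}\sum_{i,j}|\langle v_i,v_j\rangle|^2 = 1/d$ exactly.

```python
# Repeat an orthonormal basis of R^d m times, giving k = m*d unit vectors.
# The average of |<v_i, v_j>|^2 over all ordered pairs is then exactly 1/d.
import numpy as np

d, m = 10, 7
k = m * d
V = np.tile(np.eye(d), (m, 1))                   # k x d matrix, rows are unit vectors

G = V @ V.T
avg_sq = (G**2).sum() / k**2                     # (1/k^2) * sum_{i,j} |<v_i, v_j>|^2
print("average |<v_i, v_j>|^2 =", avg_sq, " vs  1/d =", 1/d)
```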
