I'd like to compute $\sum_{i=0}^k {{N}\choose{i}}$. Is there a computable approximation for that?
One of the more convenient and popular approximations of the sum is $$\frac{2^{nH(\frac{k}{n})}}{\sqrt{8k(1-\frac{k}{n})}} \leq \sum_{i=0}^k\binom{n}{i} \leq 2^{nH(\frac{k}{n})}$$ for $0< k < \frac{n}{2}$, where $H$ is the binary entropy function (and $n$ stands for your $N$). (The upper bound is exactly what Aryeh Kontorovich mentions.) You can find its proof in many textbooks; I probably first learned it from Chapter 10 of The Theory of Error-Correcting Codes by MacWilliams and Sloane. This post on MO asks a similar question and is, in my opinion, a good resource for the sum; you can find several other useful bounds there.
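If you want to see the bounds in action, here is a minimal numerical sketch (the helper names `binary_entropy`, `partial_binom_sum`, and `entropy_bounds` are my own, not from the references; the bounds are only claimed for $0 < k < n/2$):

```python
from math import comb, log2, sqrt

def binary_entropy(p):
    """Binary entropy H(p) in bits, with H(0) = H(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def partial_binom_sum(n, k):
    """Exact value of sum_{i=0}^{k} C(n, i)."""
    return sum(comb(n, i) for i in range(k + 1))

def entropy_bounds(n, k):
    """Lower and upper entropy bounds, valid for 0 < k < n/2."""
    upper = 2 ** (n * binary_entropy(k / n))
    lower = upper / sqrt(8 * k * (1 - k / n))
    return lower, upper

if __name__ == "__main__":
    n, k = 100, 30
    exact = partial_binom_sum(n, k)
    lo, hi = entropy_bounds(n, k)
    print(f"lower ≈ {lo:.4e}, exact = {exact:.4e}, upper ≈ {hi:.4e}")
```

For $n=100$, $k=30$ the exact sum lands between the two bounds, with the upper bound off by roughly the $\sqrt{8k(1-k/n)}$ factor, as expected.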
A well-known upper bound, for $k\le N/2$, is $$ \sum_{i=0}^k {N\choose i} \le 2^{N H(k/N)},$$ where $H$ is the binary entropy function $$ H(x) = -x\log_2(x)-(1-x)\log_2(1-x).$$ This bound was sharpened in Lemma 5 of http://www.sciencedirect.com/science/article/pii/S0012365X12000544
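As a quick sanity check of this upper bound (my own sketch, not taken from the linked paper), one can compare $\log_2$ of the exact sum with $N\,H(k/N)$; working in the log domain keeps the numbers manageable even when the sum itself is astronomically large:

```python
from math import comb, log2

def H(x):
    """Binary entropy in bits."""
    return 0.0 if x in (0.0, 1.0) else -x * log2(x) - (1 - x) * log2(1 - x)

# log2 of a Python int is exact enough for this comparison, even though
# the sum itself far exceeds the range of a float.
N, k = 2_000, 600
log2_sum = log2(sum(comb(N, i) for i in range(k + 1)))
print(f"log2(sum) = {log2_sum:.2f}  <=  N*H(k/N) = {N * H(k / N):.2f}")
```

The gap between the two printed values is the slack that the sharpened bound in Lemma 5 is tightening.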