It is well known that Shannon's channel capacity provides an upper bound on the rate (measured in bits/s, or bits per channel use) at which information can be reliably transmitted (i.e., with vanishing decoding-error probability) through a communication channel.
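For reference, by channel capacity I mean the standard quantity
$$C = \max_{p(x)} I(X;Y),$$
for which the noisy-channel coding theorem says that every rate $R < C$ is achievable with vanishing error probability, while no rate above $C$ is.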

My question is pretty basic; I hope it is not too naive for this community.

Can Shannon's channel capacity also be interpreted as the maximum amount of information that can be reliably stored in a storage medium? (A toy sketch of what I mean appears after the two bullet points below.)

  • If so, does this imply that information storage and information transmission are the same "thing" in information-theoretic terms?

  • If not, do there exist information-theoretic measures that quantify the "efficiency" of storing information?
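To make the question concrete, here is the kind of toy calculation I have in mind. It is only my own illustration, not an established model of any particular medium: it treats a single write-then-read of a memory cell that flips the stored bit with probability $p$ as one use of a binary symmetric channel, so that $C = 1 - H_2(p)$ would bound the number of information bits reliably storable per physical cell. The function names are mine.

```python
import math

def binary_entropy(p: float) -> float:
    """Binary entropy H2(p) in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(flip_prob: float) -> float:
    """Capacity of a binary symmetric channel, C = 1 - H2(p), in bits per use."""
    return 1.0 - binary_entropy(flip_prob)

# Toy storage model (my assumption): one write-then-read of a memory cell
# that flips the stored bit with probability 0.01 is treated as one BSC use,
# so at most ~0.919 information bits per cell would be reliably storable.
print(bsc_capacity(0.01))  # ≈ 0.9192
```

Is this "storage medium = channel" picture the right way to think about it, or does it break down (e.g., because successive writes interact)?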

Any help in clarifying these questions is very welcome. Pointers to textbooks/papers are appreciated as well. Thanks!

Edit (10/17/18). I don't understand why my question has received downvotes. If its content does not fit within this community (even though, according to this link, I believe it does), please close it or move it to another (more suitable) SE community.

  • @PeterShor: Thank you for your comment. Could you please elaborate a little more on why modeling information storage as a communication channel would be inaccurate? Is this related to the fact that previously stored information could "interfere" with the current one? Thanks – Ludwig
