It is well known that Shannon's channel capacity provides an upper bound on the rate (measured in bits/s) at which information can be reliably (i.e., with vanishing decoding-error probability) transmitted through a communication channel.
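To fix notation, the definition of capacity I have in mind is the standard one for a discrete memoryless channel (please correct me if this is not the right setting for my question):

$$C = \max_{p(x)} I(X;Y),$$

where $I(X;Y)$ is the mutual information between the channel input $X$ and the channel output $Y$, and the maximization is over input distributions $p(x)$.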
My question is pretty basic; I hope it is not too naive for this community.
Can Shannon's channel capacity also be interpreted as the maximum amount of information that can be reliably stored in a storage medium?
If so, does this imply that information storage and information transmission are the same "thing" in information-theoretic terms?
If not, do there exist information-theoretic measures that quantify the "efficiency" of storing information?
Any help in elucidating this point is very welcome. Pointers to textbooks/papers are appreciated as well. Thanks!
Edit (10/17/18). I don't understand why my question has received downvotes. If its content does not fit within this community (even though, according to this link, I believe it does), please close it or move it to another (more suitable) SE community.