
Shannon's original paper "A Mathematical Theory of Communication" which introduced information theory is so accessible as to be useful as a tutorial itself:

http://people.math.harvard.edu/~ctm/home/text/others/shannon...



Here is a list of books for those wishing to dig deeper:

1) The Information by Gleick is a book covering everything from biographies of the key historical figures and contributors to contemporary applications of information theory in quantum mechanics.

2) Elements of Information Theory by Joy Thomas and Thomas Cover is a thorough and engaging textbook.

3) Quantum Computation and Quantum Information by Nielsen and Chuang is a textbook explaining both classical and quantum information, with some practical examples.

4) Entropy and Information by Volkenstein is a Soviet popular-science book with math-heavy examples from biology, chemistry, and physics.

And as an aside, I can also recommend "Willful Ignorance: The Mismeasure of Uncertainty" by Weisberg, an engaging historical review of probability theory and its key inventors.


> 2) Elements of Information Theory by Joy Thomas and Thomas Cover is a thorough and engaging textbook.

Thomas & Cover is THE textbook on information theory -- no doubt -- but you and I have very different definitions of engaging.

For a different kind of engaging, I would recommend Dave MacKay's "Information Theory, Inference, and Learning Algorithms", which can be found on his website:

http://www.inference.org.uk/mackay/itila/book.html


+1 for the original paper. I used it to pass my information theory class in college because it made more sense than the official materials and my lecture notes.

It’s a fantastic explanation of the theory.


The edition with Weaver's introduction is also very good, especially if you would like the context and ramifications of Shannon's theory.

https://pure.mpg.de/rest/items/item_2383164/component/file_2...


Except that some of the notation has changed a little. But it's insanely good and impactful for a piece that is only 8 pages long, if I remember correctly.


it's 55 pages


His master's thesis was much shorter, if I remember correctly. Gonna check with my former teacher :)


His master's thesis was on applying boolean algebra to the design of switching circuits, which was similarly groundbreaking (although discovered in parallel by another).
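To make the thesis's core idea concrete, here is a toy sketch (mine, not Shannon's notation): a relay network computes a boolean function, with switches in series acting as AND and switches in parallel acting as OR, so boolean identities let you simplify circuits.

```python
# Toy illustration of Shannon's thesis idea: switching circuits as boolean
# algebra. Series connection = AND, parallel connection = OR.
# The function and variable names here are my own, for illustration only.

def series(a, b):
    # Current flows only if both switches are closed.
    return a and b

def parallel(a, b):
    # Current flows if either switch is closed.
    return a or b

# The four-switch network (a AND b) OR (a AND NOT b) simplifies,
# by the identity x·y + x·y' = x, to a single switch a:
for a in (False, True):
    for b in (False, True):
        assert parallel(series(a, b), series(a, not b)) == a
print("simplification verified")
```

The payoff in circuit design is exactly this kind of reduction: an algebraic identity tells you four relays can be replaced by one.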


Is there an annotated version or walkthrough of this paper available? I've read it a few times before, but I couldn't really understand what's so groundbreaking about it. It could be because I couldn't grasp the more detailed technical material that well; I understood, at a high level, the model of information he was talking about, but I'm pretty sure that by itself isn't what made it so groundbreaking at the time.


As the other reply to your comment said: the high level things you understand were groundbreaking at the time. The fact that arbitrary data can be encoded as bits? Minds blown!

What passes for mundane obviousness today was once a stroke of genius nobody else saw coming.

And that's part of what makes the paper so groundbreaking, too. Some things are groundbreaking in terms of explanatory power but don't make it into the fabric of society. What Shannon's paper did was essentially weave the fabric of the digitally powered society.

And it did it so successfully that you think of it as obvious banalities today.


It wasn’t like there were other theories of information at the time. He established a very neat theory for the first time, and showed a very intriguing idea of entropy that proceeded naturally from it. He showed how the information a channel can transmit is related to its bandwidth. He showed how you can estimate the entropy of languages.

If this seems obvious to you and not remarkable it is only because of hindsight, because his insights have been absorbed into the common culture.
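For a feel of the entropy-of-languages point, here is a minimal sketch (my own, not from the paper) of a first-order estimate from symbol frequencies; the function name and sample strings are assumptions for illustration.

```python
# First-order (unigram) entropy estimate, H = -sum p_i * log2(p_i),
# in the spirit of Shannon's estimates for English.
# Function name and examples are my own, for illustration only.
from collections import Counter
from math import log2

def entropy_bits_per_symbol(text):
    """Estimate entropy in bits/symbol from single-symbol frequencies."""
    counts = Counter(text)
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A fair coin written out as text carries exactly 1 bit per symbol:
print(entropy_bits_per_symbol("HT" * 1000))  # 1.0
# A constant stream carries none:
print(entropy_bits_per_symbol("AAAA"))  # 0.0
```

Real English text comes out well below the log2(26) ≈ 4.7 bits/letter of random letters, and Shannon's higher-order estimates (conditioning on preceding letters) push the figure lower still.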


Continuous entropy is more subtle than the direct analogy to the discrete case he uses in the original paper, see:

https://en.wikipedia.org/wiki/Differential_entropy

https://en.wikipedia.org/wiki/Limiting_density_of_discrete_p...
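A quick numerical illustration of the subtlety (my own sketch, not from the linked articles): discretizing a uniform distribution on [0, 1) into n equal bins gives discrete entropy log2(n), which diverges as the bins shrink, so the naive limit of the discrete formula is infinite.

```python
# Why continuous entropy is subtle: the discrete entropy of a uniform
# distribution over n bins is log2(n), which grows without bound as the
# discretization gets finer. Function name is my own, for illustration.
from math import log2

def discrete_entropy_uniform(n_bins):
    # Each bin has probability 1/n, so H = -sum (1/n) * log2(1/n) = log2(n).
    p = 1.0 / n_bins
    return -sum(p * log2(p) for _ in range(n_bins))

for n in (2, 16, 1024):
    print(n, discrete_entropy_uniform(n))
```

The divergent log2(n) term is exactly what differential entropy drops, and the limiting-density construction (the second link) is one way to make the continuous case well defined.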



