This is an interesting topic to be sure. I'm not an expert and lack the mathematical training (or brain?) to understand the technical details if they are not very basic, but I have been fascinated by information theory ever since I first read Shannon's famous paper back at university.
Part of the difficulty here seems to be that there is massive confusion around the relationship (or non-relationship) between entropy in thermodynamics and "entropy" in information theory. Lore has it that John von Neumann suggested the word "entropy" to Shannon as a political maneuver, kind of a marketing gimmick. And there are those who strongly oppose any conflation of the two, even if just by analogy. An excellent paper about that (almost a book really) is this (no math needed to understand it):
Thermodynamics ≠ Information Theory: Science’s Greatest Sokal Affair
Libb Thims, Institute of Human Thermodynamics, Chicago, IL
Even if one does not agree with the author in everything he says, he gives a great overview of the controversy and the fascinating history behind it. It is also useful IMO to prevent one from jumping to conclusions too soon, and he provides tons of useful references.
Then there is the other side: those who try to bring physics/statistical mechanics and information theory together. One must be careful though, because a lot of it seems to be somewhat woo-woo, unclear, and based on playing with words, often resulting in circular reasoning. But it seems to me there IS something going on there, so in that respect the die-hard critics of applying information theory to other fields might be too black-and-white in their thinking.
At the root of this conflict, it seems to me, once again lies materialism. Ironically, the materialist assumptions seem (in part) to fuel both sides of the argument.
There are those who desperately want to get any talk about information out of physics (or thermodynamics), because they (perhaps rightly) fear that this opens the Pandora's box of going beyond materialism and dealing with more than just "pure matter".
But then there are also those who seek to turn information into "stuff", i.e. something material. They actually WANT to sneak information into physics or other sciences, while treating it as some kind of material commodity. I think they are driven by the hope that by that maneuver, they can explain away some of the contradictions with materialism (but IMO they are just sneaking in language related to consciousness and are under the illusion that this preserves their materialist outlook...)
From the above cited paper:
Related to that, it is obvious that Shannon didn't really talk about information, but about the transmission of information. None of it makes any sense without the meaning of information, which in Shannon's case is the agreed-upon code and its interpretation on both ends of the transmission. He even initially used the word "intelligence" rather than "information", following Boole.
This inevitably involves consciousness. It's easy to see why: for written communication, you need the alphabet. Then you can calculate the probability of each letter occurring and develop compression algorithms based on that (for example, you use a shorter code for a letter that occurs often, or a shorthand for the English word "the" instead of its three letters, and so on). But for all that you need the alphabet, and indeed language, and both sender and receiver need to know what to do with it. Even if both sides are machines, someone still needs to construct and program them; they must be designed for a specific purpose, etc. And there is also the question: when you have maximally compressed a signal, what do you get? That's what Shannon originally called "intelligence"... And "intelligible" again implies that there is some intelligence/mind at the end of the transmission chain...
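Just to make the letter-frequency idea concrete for myself, here is a minimal toy sketch in Python (my own illustration, not anything from Shannon or the papers cited above; the sample sentence and variable names are made up). It counts letter frequencies in a bit of text and computes the Shannon entropy, which is the "maximally compressed" average number of bits per letter that the paragraph above gestures at:

```python
# Minimal sketch (toy example, assumptions: sample text and names are illustrative):
# estimate letter probabilities from a sample text and compute the Shannon entropy,
# i.e. the lower limit on the average number of bits per letter for lossless coding.

from collections import Counter
from math import log2

text = "the quick brown fox jumps over the lazy dog the end"
letters = [c for c in text if c.isalpha()]          # ignore spaces/punctuation

counts = Counter(letters)
total = len(letters)
probs = {ch: n / total for ch, n in counts.items()}

# Shannon entropy H = -sum p(x) * log2 p(x): the compression limit in bits/letter.
entropy = -sum(p * log2(p) for p in probs.values())

# Treating all 26 letters as equally likely would cost log2(26) ~ 4.7 bits/letter;
# a good variable-length code (short codewords for frequent letters, as with 'e'
# or the word "the") can get close to the entropy instead.
print(f"distinct letters: {len(counts)}")
print(f"entropy: {entropy:.2f} bits/letter (vs. {log2(26):.2f} for a uniform alphabet)")
```

Of course, the whole calculation only means anything once you have already agreed on what counts as a letter and how the receiver will interpret the codewords, which is exactly the point about meaning above.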
Well, I have no idea either what exactly is going on. In philosophy there is the idea that a computer (and maybe even the brain) is some kind of physical instantiation of higher information processes. So a computer with its Boolean logic is sort of a physical replica of processes at a higher level, "bringing them down to earth" so to speak, a bit like how in object-oriented programming a certain object is "instantiated". But that might again be just cheap talk and doesn't help...
I recently stumbled upon this paper by Ben-Naim, fwiw:
This article is about the profound misuses, misunderstanding, misinterpretations and misapplications of entropy, the Second Law of Thermodynamics and Information Theory. It is the story of the “Greatest Blunder Ever in the History of Science”. It is ... (www.ncbi.nlm.nih.gov)
I'm out of my depth here, because you need a math/physics background to really understand it, apart from his witty criticism of some of the "crimes" various thinkers/scientists have committed in the name of Shannon. But this is exactly what Libb Thims accuses him of (!):
A horrible mess all of that, and I can't help but think that at some level, this may have been by design. The understanding of information (whatever it is) seems to be crucial in some way for science to advance, but it seems there are more dead-ends, illusions and conceptual errors than real progress...?
Thanks for this thread btw, these are just some thoughts going on in my mind at the moment.