Most of my interesting thoughts are, I'd wager, about information. I guess that's trivially true of anyone, so I'll clarify by saying I mean it in the meta-sense: modern Information Technology and Information Theory of course, but also vague pop-culture stuff about memes, and Economics and Politics and Evolutionary Biology as forms of Information Warfare (so, essentially, Game Theory). Theological speculation, always an on-again, off-again area of interest for me, often comes back to the Greek philosophical idea of logos - as made famous by John's "In the Beginning was the (Logos), and the (Logos) was with God, and the (Logos) was God." Christ literally is the Word of God, made flesh - or as a modern blogger might phrase it, he is the Christianity Meme, embodied. That may sound trivial, but I don't think it is.
I'm vastly undereducated in a lot of the key areas needed to come up with more than fluff on this vast topic - which, ironically, I'll blame on modern information overload. It'd be nice to have learned some Game Theory in a formal setting rather than getting the two-minute version via Wikipedia; but what I really kick myself for not taking while I was still at University is Statistics, which high school had portrayed as boring stuff for people who couldn't hack real mathematics. That might be true, but knowing zero statistics is, in retrospect, the thing that most contributes to my utter scientific illiteracy. Every time I try to read any real, not-completely-dumbed-down science, I am reminded of this fact. If you're going to care about information, it doesn't help if you can't talk meaningfully about uncertainty.
Anyway, out of this quagmire of uninformed opinions, I do occasionally come up with something that still seems worth pursuing on closer inspection. The most significant is that I think Kolmogorov Complexity holds the key to unravelling a lot of misplaced notions about information in a formal setting, and has been underutilised for this task.
Ever since Shannon started the Information Age (a feat he gets precious little attention for), his ideas have been pretty dominant. For good reason. "Data" Entropy is a great formalisation of ideas about information, and its link to Physical Entropy is clearly one of those deep, profound links between disparate fields that Science fortuitously stumbles upon from time to time.
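To pin down what I mean by "formalisation": Shannon's entropy is just the average number of bits of surprise per symbol. Here's a toy Python sketch of it - entirely mine, nothing from Shannon beyond the formula itself, with character frequencies standing in for probabilities and a made-up function name:

    # Toy sketch: Shannon entropy of a string, using each character's
    # relative frequency as its probability. H = sum of p * log2(1/p).
    from collections import Counter
    from math import log2

    def shannon_entropy(text):
        counts = Counter(text)
        total = len(text)
        return sum((n / total) * log2(total / n) for n in counts.values())

    print(shannon_entropy("aaaaaaaa"))  # 0.0 bits per character: no surprise at all
    print(shannon_entropy("abcdefgh"))  # 3.0 bits per character: eight equally likely symbols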
The problem is, the Entropy story about Information doesn't seem to capture our intuitions as well as it could. When a Creationist tries to argue that Intelligent Design follows from the Second Law of Thermodynamics, a simple accounting of the energy the Earth receives from the Sun shows them to be ignorant and misinformed - the Earth is not a closed system. But there is something seductive about the intuition they're appealing to: that in some sense, PageRank and Quantum Computers and Mozart and all the rest coming about through what started as a chance process is simply implausible (in a whole different sense from mere gas particles huddling into the corner of the room).
This might just be a broken intuition - our finite brains can't really comprehend the calculable unlikelihood of the gas-molecule trick, and we can't even begin to calculate the odds of modern human civilisation coming into being by chance; in fact, we can't even meaningfully formulate the question in a rigorous setting. Hence Randall Munroe's spot-on snipe at the Drake Equation.
However, I stumbled upon a neat turn in the argument. It has all the rigour of an undergrad paper in Literary Criticism, and may only appeal to me because I find Computer Science much easier than Physics. But if we frame the debate in terms of Kolmogorov Complexity, instead of Entropy alone, we can easily say:
What if the universe is like a giant file being subjected to gzip-style compression? Entropy is increasing, but in a sense the universe is getting "more organised", and this crazy spike of local knowledge is like a symbol table for the rest of the universe. Thus, we can build room-sized computers that generate concise descriptions of how galaxy-sized supermassive black holes behave.
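The gzip analogy is easy to play with, too: the compressed size of a string is a crude but computable stand-in for its Kolmogorov Complexity - the length of the shortest description that regenerates it. A toy Python sketch of my own, purely illustrative:

    # Structured data compresses down towards its short description;
    # random data has no structure for gzip to exploit.
    import gzip
    import os

    structured = b"ABAB" * 25000      # 100 KB with an obvious short description
    random_ish = os.urandom(100000)   # 100 KB of noise from the operating system

    print(len(gzip.compress(structured)))  # a few hundred bytes
    print(len(gzip.compress(random_ish)))  # roughly 100 KB - slightly more, in fact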
Of course, for this to work in converting Intelligent Designers to your view, you have to argue that the universe somehow evolved a genetic algorithm to zip itself. Maybe that argument goes: "A generalised Second Law (which could have been made by a Deist God, a Theist God, or no god at all; the theory is 'Design-wise neutral') is at the heart of the Theory of Everything: Time maximises Entropy, which is actually information density - it just looks like randomness. All the rest - every other law of science, and everything in the universe that follows from them - is details."
You can make that argument without even bringing in Kolmogorov Complexity, I think, but it seems much clearer to me with that analogy. I've never convinced myself that I really understood the subtleties of Thermodynamics; but I can explain to a high school student how WinZip makes files shorter.
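Roughly the explanation I'd give, as a toy Python sketch of my own (nothing like the real DEFLATE algorithm, just the "symbol table" idea from above, with made-up function names): store a repeated phrase once, and replace every occurrence of it with a short placeholder.

    # Toy "symbol table" compression: as long as the phrase repeats enough,
    # the shortened text plus the table is smaller than the original.
    def toy_compress(text, phrase, placeholder="\x01"):
        assert placeholder not in text
        return text.replace(phrase, placeholder), {placeholder: phrase}

    def toy_decompress(packed, table):
        for placeholder, phrase in table.items():
            packed = packed.replace(placeholder, phrase)
        return packed

    song = "happy birthday to you, happy birthday to you, happy birthday dear..."
    packed, table = toy_compress(song, "happy birthday")
    assert toy_decompress(packed, table) == song
    print(len(song), len(packed) + sum(map(len, table.values())))  # 68 vs 43: shorter, even counting the table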
Tuesday, July 7, 2009