
Information, Part Deux

First, a note of personal triumph: I have a paper up on the arXiv! For those who don't know, the arXiv is a repository where researchers can distribute their work freely to all readers, but in a way that is also official, so that no one can scoop you once you've posted to the site. In the paper, I argue that a new and more general mathematics of information is needed, and I present an axiomatic framework for this mathematics using the language of category theory.

For those unfamiliar with such highfalutin language, it's really not as complicated as it sounds. I'll probably do a post soon explaining the content of the paper in layperson's terms. But first, and based partly on the feedback to my last post, I think it's important to say more about what information is and why I, as a complex systems theorist, am interested in it.

I'm currently thinking that information comes in three flavors or, more specifically, that there are three broad situations where the concept comes in handy.

  • Statistical Information: Some things in life appear to be random. Really, this means that there's information we don't have about what's going to happen. It turns out there's a formula to quantify the uncertainty of an event, that is, how much we don't know. This enables us to make statements like "event A is twice as uncertain as event B" and, more powerfully, statements like "knowing the outcome of event C will give us half of the information needed to predict event B." The second statement uses the concept of mutual information: the amount of information that something tells you about something else. Mutual information can be understood as quantifying the statistical relationship between two uncertain events, and it forms the basis of a general theory of complex systems proposed by Bar-Yam. (A short Python sketch of both quantities appears just after this list.)


  • Physical Information: If the "uncertain event" you're interested in is the position and velocity of the particles in a system, then calculating the statistical uncertainty gives you what physicists call the entropy of the system. Entropy has all the properties of statistical information, but it also satisfies physical laws like the second law of thermodynamics (the entropy of a closed system does not decrease).


  • Communication Information: Now suppose that the "uncertain event" is a message you'll receive from someone. In this case, quantifying the uncertainty yields communication information (which is also called entropy; there's a funny reason* why). Communication information differs from statistical information in that, for communication, the information comes in the form of a message, which is independent of the physical system used to convey it.


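Since the same quantities come up again below, here is a minimal sketch in Python of the two ideas from the first item: Shannon's entropy, H = -sum(p * log2(p)), which measures the uncertainty of an event, and the mutual information, I(X;Y) = H(X) + H(Y) - H(X,Y), which measures how much one event tells you about another. The function names and the probability values are mine, invented purely for illustration.

    import math
    from collections import Counter

    def entropy(probs):
        # Shannon entropy in bits: H = -sum(p * log2(p)),
        # skipping zero-probability outcomes.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    def mutual_information(joint):
        # I(X;Y) = H(X) + H(Y) - H(X,Y), where `joint` maps
        # (x, y) outcome pairs to probabilities.
        px, py = Counter(), Counter()
        for (x, y), p in joint.items():
            px[x] += p
            py[y] += p
        return entropy(px.values()) + entropy(py.values()) - entropy(joint.values())

    # Two coins glued together: learning one outcome tells you everything
    # about the other, so the mutual information is a full bit.
    print(mutual_information({("H", "H"): 0.5, ("T", "T"): 0.5}))          # 1.0

    # Two independent fair coins: learning one tells you nothing.
    print(mutual_information({(a, b): 0.25 for a in "HT" for b in "HT"}))  # 0.0
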
The neat thing about these flavors of information is that they are all described by the same mathematics. The first time I learned that the same formula could be used in all these situations, it blew my mind.
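
To see the point in action, the entropy function from the sketch above handles all three flavors unchanged; only the interpretation of the probabilities differs. The numbers here are toy values I made up, not measurements.

    # Statistical: a fair coin flip, the most uncertain two-outcome event.
    print(entropy([0.5, 0.5]))            # 1.0 bit

    # Physical: a toy system with four unequally likely microstates.
    print(entropy([0.7, 0.1, 0.1, 0.1]))  # ~1.36 bits

    # Communication: uncertainty about the next letter of a message,
    # estimated from the message's letter frequencies.
    message = "information"
    print(entropy(n / len(message) for n in Counter(message).values()))  # ~2.91 bits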

One might think this concept is already amazingly broad; why is a "more general mathematics of information" needed? The answer is that people have been so inspired by the concept of information that they've applied it to fields as diverse as linguistics, psychology, anthropology, art, and music. However, the traditional mathematics of information doesn't really support these nontraditional applications. To use the standard formula, you need to know the probability of each possible outcome of an event, but "probability" doesn't really make sense when talking about art, for example. So a big part of my research project is trying to understand the behavior of information when the basic formula does not apply.

*Having just invented the mathematical concept of communication information, Claude Shannon was unsure of what to call it. Von Neumann, the famous mathematician and physicist, told him "You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage."

2 comments:

  1. The name "Shannon" is missing from the first sentence of your footnote.

    You mentioned at the n-Category Café that you didn't have a way of doing fancy math and equation stuff on this site. My friend Randall recently wrote a JavaScript program for enabling TeX in web pages, which you might find useful.

  2. "The name "Shannon" is missing from the first sentence of your footnote."

    Geez, who edits this thing?

    Thanks for the software reference. I'll try it out when the need arises.

