Sunday, July 23, 2017

Notes on a Cosmology - Part 3, What Is Information?

Information is, arguably, the core concept underlying the Simulation Hypothesis. Seth Lloyd, a physics professor at MIT, wrote in a 2013 paper titled The Universe as Quantum Computer:
It is no secret that over the last fifty years the world has undergone a paradigm shift in both science and technology. Until the mid-twentieth century, the dominant paradigm in both science and technology was that of energy: over the previous centuries, the laws of physics had been developed to understand the nature of energy and how it could be transformed. In concert with progress in physics, the technology of the industrial revolution put the new understanding of energy to use for manufacturing and transportation. In the mid-twentieth century, a new revolution began. This revolution was based not on energy, but on information. The new science of information processing, of which Turing was one of the primary inventors, spawned a technology of information processing and computation. This technology gave rise to novel forms and applications of computation and communication. The rapid spread of information processing technologies, in turn, has ignited an explosion of scientific and social inquiry. The result is a paradigm shift of how we think about the world at its most fundamental level. Energy is still an important ingredient of our understanding of the universe, of course, but information has attained a conceptual and practical status equal to – and frequently surpassing – that of energy. Our new understanding of the universe is not in terms of the driving power of force and mass. Rather, the world we see around us arises from a dance between equal partners, information and energy, where first one takes the lead and then the other. The bit meets the erg, and the result is the universe.
But what, exactly, is information? Energy is intangible, but like matter it seems to have existence in itself, without dependence on an external observer. But how can information, which is a category of the activity of the mind, have existence in itself, that is, exist apart from the mind? How can something that does not have existence in itself be the basis underlying our own existence? The statements, "I am made of matter" and "I move the world and the world moves me through energy exchange" are almost true by definition - they are just how we use the words "matter" and "energy". But the statements, "I am made of information" and "I move the world and the world moves me through the exchange of information" seem like nonsense at first blush[1]. In this post, we will not tackle the larger problem of information-based physics - also known as digital physics or "It from Bit", among other names. But we are going to need to think about what information is and why it is so important to understanding the Simulation Hypothesis.

Information theory has its origins in signaling theory which, in turn, has its origins in radio and telegraphy. Telegraphy enabled remote, wired communication. Telegrams were revolutionary. Prior to the telegraph, communication time was a function of distance: the further away the destination of your message, the longer it took for your message to arrive. But the telegraph made communication between all points that had a telegraph station equally fast. A message could be sent from New York to San Francisco at virtually the same speed as from New York to Chicago. For especially time-sensitive applications - such as stock-trading - telegraphy enabled essentially instantaneous communication from point to point. Radio brought even more profound changes in the nature of communication. Radio is a wireless technology, so messages could be "broadcast" across broad regions. Weather stations, news stations and many other broadcast systems enabled highly valuable, general-purpose information to be shared widely, at no ongoing cost to the receivers and very low operating costs for the transmitters.

There are two great enemies of any signaling system: noise and power loss. The longer a telegraph wire, the more noise is present on the wire and the more electrical power is lost in the wire itself. Engineers call the ratio of signal to noise at the receiver the SNR - the signal-to-noise ratio. The precise definition of SNR depends on the type of equipment under examination: it is measured differently in microwave-frequency communication systems than in lower-frequency radios, and both of these are measured differently than the SNR in wired communication systems. The key is that, in a communication system, we want the signal at the receiver to be large enough, and the noise low enough, that the message can be clearly decoded.
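To make the ratio concrete, here is a minimal sketch in Python, assuming we already know the signal and noise power at the receiver in watts; the numbers in the example are purely hypothetical. For power ratios, SNR is conventionally expressed in decibels as 10·log10(Psignal/Pnoise).

```python
import math

def snr_db(signal_power_watts: float, noise_power_watts: float) -> float:
    """Signal-to-noise ratio in decibels for the given signal and noise power."""
    return 10 * math.log10(signal_power_watts / noise_power_watts)

# Hypothetical receiver: 2 milliwatts of signal against 0.5 microwatts of noise.
print(snr_db(2e-3, 0.5e-6))  # about 36 dB
```

The higher the SNR, the further the message stands above the noise floor and the easier it is to decode correctly.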

Claude Shannon wrote a 1948 article on information theory titled A Mathematical Theory of Communication [MTC]. In the introduction, he explains the design goal of any communication system:
The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages. The system must be designed to operate for each possible selection, not just the one which will actually be chosen since this is unknown at the time of design. [Emphasis original]
The key to understanding information - in the sense of the word that is relevant to signaling systems, modern technology and the Simulation Hypothesis - is that meaning is optional. In other words, information is information from the technological or mathematical point of view, whether or not it has meaning to anyone. This is a modern, technical sense of the word, and it is a common source of difficulty for non-experts who are accustomed to using the word "information" only in its ordinary sense of "information that conveys meaning." Whether or not a message has any meaning, it is still a message, and the goal of any communication system - as Shannon explains - is to reproduce a message selected at one point, exactly or approximately, at some other point.

The idea of information as something separate from meaning can be a significant obstacle for the uninitiated. For this reason, I will give an illustration of the difference between a message and the meaning it conveys. Consider the problem of transmitting a cryptogram over a telegraph system. Let us encipher the message "SEND MORE MONEY", using some unspecified ciphering system, into the letters "GSIL AGUL JHKVH". To the telegraph operator, this message is gibberish. But the task of faithfully transmitting it to the receiving end is exactly the same as if the message had not been encrypted. Either way, we say that the message contains information. In this case, its meaning will only be known by the intended recipient after decryption.
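To make the illustration concrete, here is a minimal Python sketch. The cipher above is unspecified, so the substitution scheme below is purely hypothetical and will not reproduce the exact ciphertext; the point is only that the channel carries the same number of symbols whether or not anyone can read them.

```python
import random
import string

def make_substitution_cipher(seed: int = 0) -> dict:
    """Build a random one-to-one mapping from uppercase letters to letters."""
    rng = random.Random(seed)
    letters = list(string.ascii_uppercase)
    shuffled = letters[:]
    rng.shuffle(shuffled)
    return dict(zip(letters, shuffled))

def encipher(plaintext: str, key: dict) -> str:
    """Replace each letter via the key; leave spaces and other symbols untouched."""
    return "".join(key.get(ch, ch) for ch in plaintext)

key = make_substitution_cipher()
plaintext = "SEND MORE MONEY"
ciphertext = encipher(plaintext, key)

# The telegraph operator's job is identical for both strings:
# each is fifteen characters to be transmitted faithfully.
print(plaintext, len(plaintext))
print(ciphertext, len(ciphertext))
```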

In engineering, systems are usually designed with the aid of mathematics, and communication systems are no different. What we need is a way of quantifying information. In addition, we want to quantify noise and signal loss. We also want to find a relation between all three of these: information, noise, and signal loss. Shannon proposes a measure for quantifying information in the introduction of MTC:
If the number of messages in the set [of possible messages] is finite then this number or any monotonic function of this number can be regarded as a measure of the information produced when one message is chosen from the set, all choices being equally likely. As was pointed out by Hartley the most natural choice is the logarithmic function. Although this definition must be generalized considerably when we consider the influence of the statistics of the message and when we have a continuous range of messages, we will in all cases use an essentially logarithmic measure.
Shannon gives several reasons why the logarithm is a good choice for measuring information content. The third reason is especially notable: "It is mathematically ... suitable. Many of the limiting operations are simple in terms of the logarithm but would require clumsy restatement in terms of the number of possibilities." When comparing transmission systems that send messages drawn from differently-sized sets of possible messages, we would like to be able to state mathematical facts about such systems without having to explicitly compare the number of possible messages. We are more interested in the generalities of the systems than in their particulars. The logarithmic measure allows us to state limits in terms of the measure itself, without needing to refer to how particular systems encode their messages.
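Here is a minimal sketch of the measure in Shannon's simplest case, assuming all messages in the set are equally likely; using the base-2 logarithm gives the answer in bits.

```python
import math

def information_bits(num_messages: int) -> float:
    """Information, in bits, of selecting one message from a set of
    num_messages equally likely possibilities: the base-2 logarithm."""
    return math.log2(num_messages)

print(information_bits(2))   # 1.0 bit  - a single yes/no choice
print(information_bits(16))  # 4.0 bits - one choice out of 16
print(information_bits(26))  # ~4.7 bits - one letter of the alphabet

# Under this simple model, a 13-letter message built from 26 equally
# likely letters carries 13 * log2(26), or roughly 61 bits.
print(13 * information_bits(26))
```

Note the convenient additivity: doubling the number of possible messages adds exactly one bit, and combining two independent choices multiplies the number of possibilities while simply adding their information.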

Now that we have a measure for information, we can begin to answer questions of the form, "How much information is in this message?" Thus, information is a quantity like any other. It is in this technical sense of information-as-a-measurable-quantity that information can form the basis of a physical theory. We will return to that topic in a future post.

The mathematical theory of communication is crucial to understanding the Simulation Hypothesis. For this reason, in the next post, I will be delving into the details of Shannon's theory of a communication system. This will enable us to really get a handle on the relationship between information, noise and signal power. These concepts are actually powerful tools of thinking, disguised as a communications engineering theory. They will play an integral role in the remainder of the series.

Next: Part 4, The Mystery of Entropy

---

1. As stated, these are nonsense. I am merely drawing the contrast between matter-energy as a fundamental basis for physics versus information as a fundamental basis for physics in the starkest possible terms.
