To complete our discussion of how science works, let's talk for a while about information. Since we live in the information age, what I really want to do is connect information with knowledge, which is what science is all about. Let me use a very simple example. Suppose you put a dot on a piece of paper and you want to be able to describe where it is with increasing levels of accuracy. How would you do that? You could say the dot is on the piece of paper, but that doesn't localize it or describe it in any detail. However, you could fold the piece of paper in half and say the dot is above the line or below the line. You've localized the dot to within a factor of two. If you then folded the piece of paper the other way, you'd divide it into quarters, and the dot would be in one of the quarters. So you'd have localized it to one part in four, 25 percent. You can follow this logic and see that each time you fold the paper in half, and then in half again, the dot will clearly lie in one of the small rectangles, and you will have localized it with increasing accuracy. Each time you fold the piece of paper, you can ask a simple question: is the dot to the left of the line or to the right, above the line or below it? The answer is yes or no. One bit of information is a yes-no answer. A bit is a binary digit, and it really just means the difference between two states: yes or no, on or off, light or dark. So in the folding of a piece of paper, we see how a simple accumulation of bits of information defines, with increasing precision, the position of the dot on the page. We could extend this into three dimensions and imagine a room where a particular molecule happened to be in one place. We could divide the room in half, and half again, along its length, its width, and its height, and gradually we would localize that particular molecule. Each question asked, is it on this side or that side, has a yes-or-no answer.
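The paper-folding procedure can be sketched in a few lines of Python. This is a minimal illustration, not from the lecture: the paper is modeled as the unit interval along one axis, and the function name and the example coordinate (0.3) are made up.

```python
# Minimal sketch of the paper-folding idea. Each yes/no question
# halves the region the dot can occupy, so n answers localize it
# to a strip 1/2**n of the page wide.

def localize(x, n_bits):
    """Return the bits recorded and the interval still containing x."""
    lo, hi = 0.0, 1.0
    bits = []
    for _ in range(n_bits):
        mid = (lo + hi) / 2     # "fold" the remaining strip in half
        if x < mid:             # is the dot left of the fold? yes -> 0
            bits.append(0)
            hi = mid
        else:                   # no -> 1, keep the right half
            bits.append(1)
            lo = mid
    return bits, (lo, hi)

print(localize(0.3, 4))  # ([0, 1, 0, 0], (0.25, 0.3125))
```

Four folds leave an interval one sixteenth the width of the page, and each extra bit halves the remaining uncertainty again.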
Each such answer is one bit of information, a one or a zero. So there's a direct connection between a bit, a binary digit, which is how computers work, and knowledge, or increasing knowledge. What I've said is of course not specific to localizing a dot on a piece of paper. We can imagine a very crude thermometer, or way of telling temperature, where all we can do is ask: is it hot or is it cold? That's a binary distinction, one bit of information: hot or cold, yes or no, on or off, one or zero. A slightly better thermometer might divide the range into four segments of temperature; that's two bits of information. Or you could have a very fine thermometer graded into 100 one-degree segments from zero to 100 degrees Celsius, and you'd characterize the temperature with more bits. So there's this direct relationship, whether you're measuring sound, or light, or position, or anything else, between measurement and bits of information. Let me give you another example. Suppose I asked you to think of a number between one and 1,000, and not tell me what it is, and I told you I could find that number in only ten guesses. You'd probably think that implausible: just by guessing numbers at random, surely it would take me 500 guesses to have a 50 percent chance of getting it right. But that's not the most powerful use of information. If instead my question were, is the number above or below 500, the answer is yes or no, and I've localized it to half the number line. If I then ask, is it above or below 250, I've localized it by another factor of two. As you can see, following this logic, each question I ask, each answer you give, one new bit of information, divides the remaining range in two, and in two, and in two again. So with only ten bits of information, since two to the power ten is 1,024, I can pin down any number from one to 1,000. This is the power of information. It's a very efficient way of packing information into questions asked about the world.
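The guessing game is just binary search over the number line. Here is a minimal sketch in Python; the function name and the sample secret are invented for illustration.

```python
import math

# Binary search over 1..1000. Each question ("is your number above m?")
# yields one bit of information, so ceil(log2(1000)) = 10 questions
# always suffice, far fewer than guessing numbers at random.

def guess(secret, lo=1, hi=1000):
    """Return (number found, questions asked)."""
    questions = 0
    while lo < hi:
        mid = (lo + hi) // 2
        questions += 1
        if secret > mid:   # answer "yes": the number is above mid
            lo = mid + 1
        else:              # answer "no": the number is mid or below
            hi = mid
    return lo, questions

print(guess(737))                   # (737, 10)
print(math.ceil(math.log2(1000)))  # 10
```

Each question halves the surviving range, 1,000 to 500 to 250 and so on, so ten questions are always enough, whatever number you picked.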
We live in an amniotic fluid of information, in ever-increasing volumes, and we rather take it for granted. The power of our computers, and the amount of information we're subjected to and have available to us, is increasing exponentially, doubling roughly every eighteen months to two years; this is Moore's Law. It applies not only to the speed of computers but also to the bandwidth of the internet and the amount of information available online. Maybe you're not aware of it, but when you watch a silly cat video on YouTube, you're watching a stream of ones and zeros, binary digits of information, billions of them changing every second, produced by this power of information technology. It's the same process that leads to the localization of that dot on a piece of paper, by defining smaller and smaller regions of space. What level of information are we talking about in modern culture, after several decades of the computer and Internet revolution? You probably know how much data you personally have access to, some gigabytes, I suspect. The amount of information in the modern world is growing at a phenomenal rate; it is indeed growing exponentially, doubling every year or so. The amount of new information created last year was something like 50 exabytes, close to a one with 20 zeros after it in bytes, a simply unimaginable amount of information. So it's not just scientists who have to deal with and parse information and know how to characterize it; our everyday lives are awash with information, and we need to understand it at some level. The packing of information is remarkably efficient: we can characterize a large number of items with a very small number of bits. As we saw, it takes only about ten decisions, or questions, to characterize a number line from one to 1,000, which means 1,000 items are characterized by ten bits.
Twenty bits would be enough to characterize 1,000 squared, a million items. Thirty bits would be enough to characterize a billion items, and so on. If we extrapolate this, only about 80 bits of information would be needed to characterize the position of any atom in the universe to within the size of an atom, which is an extraordinary concept. James Watson, co-discoverer of the structure of DNA, said in the mid-1950s, "Life is digital information," quite an extraordinary statement from someone before the era of the personal computer. What he meant was that the base-pair sequence of biological material encodes, in the genome, the information for the functioning of every organism. If we run with his metaphor and look at the history of information in this world over the billions of years since the Earth formed, we can characterize different phases. Life codifies information in biological molecules, and as life has grown more complex, the information content has increased. The human genome contains about three billion base pairs that characterize all the information about a human being. That's the biological rate of information growth on this planet. When human culture started, and especially with the invention of the printing press by Gutenberg, we elevated our rate of information gathering, retrieval, and storage with books. That corresponds to the uptick in this graph, where the rate of information growth started growing billions of times faster than it had by simple biological evolution. Then in the final phase of this progression, at an even more rapid exponential rate, starting only a few decades ago, computers allowed us to increase information storage, transmission, and retrieval by another factor of a million, to the phenomenal rate I just mentioned. So in terms of the age of information, we are in an unparalleled situation compared to biological evolution, which may occur elsewhere in the universe.
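The bit counts quoted above, ten bits for a thousand items, twenty for a million, thirty for a billion, follow from the ceiling of log base two of N. A quick check in Python, with an invented helper name:

```python
import math

# Singling out one item among N takes ceil(log2(N)) yes/no questions,
# so the number of bits grows only logarithmically with the number of
# items. bits_needed is a made-up name for this sketch.

def bits_needed(n_items):
    return math.ceil(math.log2(n_items))

for n in (10**3, 10**6, 10**9):
    print(n, bits_needed(n))   # 10, 20, and 30 bits respectively
```

Doubling the bit count squares the number of items you can distinguish, which is why such modest bit budgets cover such enormous ranges.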
But let me finish talking about science not with bits and bytes, but with the imagination that's at the heart of scientific progress. Our brains encounter the things of the universe through our observations, but are we really able to imagine everything that happens in the universe? Imagine the universe divided into things that are and things that aren't, things that do happen and things that don't. I think our imaginations are good enough to imagine most of the things that actually happen in the universe, but perhaps not all; we're eternally surprised in cosmology and astronomy by things we never anticipated. The power of our brains, of course, is that we can also imagine lots of things that don't happen. That's the basis of science fiction, of poetry and art. So while I think we can imagine most of the things that actually happen in the universe, there are perhaps things that do happen that we're not smart enough, clever enough, or imaginative enough to conceive of. Nonetheless, our brains are supple, and we do imagine things that don't happen in the universe. The power of imagination puts all of this within the landscape of our heads. Information is related to knowledge. Information in science refers to questions we can ask about nature that have a yes-no answer and teach us something about the situation. So we can analogize information in science to the bits of information that underlie the computer revolution and the information age, and that helps us understand how we can characterize the world in terms of ones and zeros, yes-and-no decisions based on the observations we make.