The importance of information in the evolution debate

Creationists often mention the concept of information as a challenge to the grand theory of evolution.

Information as a concept has long been recognised. It is something all people agree exists, even if there is debate about how it should be categorised. Archaeologists discovering inscriptions know there is a meaning even if they do not know what that meaning is, and most people would not dispute this. People recognise many things as designed. One could say they do so only because they already know such things are designed, as with a car or a computer; yet there are examples of things we have never previously encountered in which we would still recognise intention.

As I have mentioned previously, information is not composed of, nor derived from, matter; of course it can be stored in matter.

There are several concepts of what information is, at least in terms of how information theory should be represented mathematically.

The application to evolution centres on the connection to DNA. DNA is recognised as carrying information. It has meaning. It resembles a blueprint, and metaphorically is one.

We can study how information originates. If the source of all information can be shown to be greater information (that is, intelligence), then this conclusion also applies to DNA.

There are two potential ways one could show that information cannot arise by itself. It may be possible to show this mathematically, in which case we can be absolutely certain (or essentially certain if the proof is statistical).

If not mathematically, it may be possible to show this empirically: that is, by investigating all the billions of examples of information that have been directly observed. If all are shown to have come from higher information sources and none are self-producing, then we can be extremely confident of our thesis.

Therefore the impossibility of information coming from non-information, shown mathematically or empirically, disproves Darwinism. Rather than being a red herring, as is sometimes claimed, information theory is absolutely central.

  1. Roxolan
    2009 August 30 at 13:30

    Whenever there’s a new mutation in a population, information is gained. That particular combination of DNA did not exist before, and now it does. As far as we know, no intelligence is involved in random mutations.
    That wasn’t so hard. Unless you’re using a pretty weird definition of information (or no definition at all) as some creationists tend to do.
    Mathematically, you could make the argument that, for each mutation, a mutation with the reverse effect is possible (a duplicated sequence might be deleted etc.). So if there’s any mutation where information is *lost* without the intervention of an intelligence, it means information can be gained too.
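
    A minimal sketch of this reversibility on a hypothetical toy sequence (coordinates and bases are illustrative, not from any real genome):

    ```python
    # A duplication mutation and the deletion that reverses it, on a
    # hypothetical toy sequence (illustrative only).

    def duplicate(seq: str, start: int, length: int) -> str:
        """Insert a copy of seq[start:start+length] immediately after it."""
        return seq[:start + length] + seq[start:start + length] + seq[start + length:]

    def delete(seq: str, start: int, length: int) -> str:
        """Remove seq[start:start+length]."""
        return seq[:start] + seq[start + length:]

    original = "ACGTTGCA"
    mutated = duplicate(original, 2, 3)   # 'ACGTTGTTGCA'
    reverted = delete(mutated, 5, 3)      # deleting the copy restores the original
    assert reverted == original
    ```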

  2. Ken
    2009 August 30 at 22:50

    Information is generally expressed mathematically as negative entropy, where entropy S = -k Σ p_i log(p_i). Consequently the claim that information cannot increase is equivalent to the claim that entropy cannot decrease – often used by creationists to “disprove” evolution. But of course they use this concept dishonestly – applying a law for a closed system to parts of an open system.
    Natural selection actually leads to changes in information. Increases and decreases. In a sense you can say that information comes from the environment.
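
    A minimal sketch of the entropy measure referred to here, with k = 1 and log base 2, computed over a message’s empirical character frequencies (the example strings are toy illustrations):

    ```python
    # Empirical Shannon entropy of a message, H = -sum(p_i * log2(p_i)),
    # taken over its character frequencies.
    from collections import Counter
    from math import log2

    def shannon_entropy(message: str) -> float:
        """Entropy in bits per symbol of the empirical character distribution."""
        n = len(message)
        return -sum((c / n) * log2(c / n) for c in Counter(message).values())

    print(shannon_entropy("AAAA"))      # -0.0: no uncertainty in a constant message
    print(shannon_entropy("ACGTACGT"))  # 2.0: four equiprobable symbols
    ```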

  3. 2009 August 31 at 04:24

    Nobody in their right mind says that information cannot increase. Nobody in their right mind should insist that information can increase without intelligent input.

  4. Ken
    2009 August 31 at 05:07

    “without intelligent input”
    Grant – I consider I am in my right mind and I insist that intelligence is not required for information to increase. Look around you.
    You guys just make bald statements without any connection to reality – that’s why you trip over the 2nd law of thermodynamics (which is really what the “information” talk is all about).

  5. 2009 August 31 at 07:57

    Ken – I have looked around.
    Information is not about the second law. It is about entropy.

  6. 2009 August 31 at 09:07

    Whenever there’s a new mutation in a population, information is gained. That particular combination of DNA did not exist before, and now it does. As far as we know, no intelligence is involved in random mutations.
    That wasn’t so hard. Unless you’re using a pretty weird definition of information (or no definition at all) as some creationists tend to do.

    You go wrong in your first statement. A new mutation does not equal increased information.
    The man ran around the park.
    The maf ran around the park.
    No new information there, just a loss of information.
    Mathematically, you could make the argument that, for each mutation, a mutation with the reverse effect is possible (a duplicated sequence might be deleted etc.)
    Duplicated genes give no more information. They just give the same information twice.
    So if there’s any mutation where information is *lost* without the intervention of an intelligence, it means information can be gained too.
    This is true, but it is trivial. And the increased information is merely the information that you had before the mutation decreased it. You started with high information; regaining it this way, aside from being a trivial example, does not increase the total amount of information in the biosphere.
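
    For concreteness, a sketch of the storage-versus-meaning contrast in this example, using the standard empirical character-entropy estimate (nothing here depends on the exact sentences):

    ```python
    # Character-level entropy of the two sentences: the statistical,
    # storage-level measure barely moves, though the meaning is destroyed.
    from collections import Counter
    from math import log2

    def entropy_bits(s: str) -> float:
        n = len(s)
        return -sum((c / n) * log2(c / n) for c in Counter(s).values())

    before = "The man ran around the park."
    after = "The maf ran around the park."
    print(round(entropy_bits(before), 3))  # the two values are nearly identical
    print(round(entropy_bits(after), 3))
    ```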

  7. 2009 August 31 at 10:28

    Information is generally expressed mathematically as negative entropy, where entropy S = -k Σ p_i log(p_i). Consequently the claim that information cannot increase is equivalent to the claim that entropy cannot decrease – often used by creationists to “disprove” evolution. But of course they use this concept dishonestly – applying a law for a closed system to parts of an open system.
    I agree that entropy is a reasonable analogy for information when information is considered at the level of data storage.
    However if we use your analogy we see that information comes from information. In a closed system entropy cannot increase, but it can locally in an open system because energy is provided externally; so long as we have an energy converter such as a photosynthesising leaf or a refrigerator.
    Thus we have an external source.
    If we apply this analogy to information in an organism, we could postulate that information increases locally in the organism by external input of information. Thus information comes from information.

  8. 2009 August 31 at 10:48

    Grant Dexter Information is not about the second law. It is about entropy.
    Yes the second law of thermodynamics is about entropy, but entropy is related to the statistical distribution of energy. When information is considered purely as data storage (i.e. meaning is not considered, just the initial state, irrespective of whether the initial state has meaning embedded in it or not) we can apply the idea of introducing “error” into the data and thus use somewhat similar statistical concepts. The disorder of the code increases as error grows; “entropy” increases.
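
    A toy simulation of this picture – random substitutions accumulate and the character-level “entropy” of the message rises (the four-letter alphabet and error counts are arbitrary choices):

    ```python
    # Randomly corrupt a maximally ordered message and watch its
    # character-level entropy rise -- a toy model of "error grows,
    # entropy grows".
    import random
    from collections import Counter
    from math import log2

    def entropy_bits(s: str) -> float:
        n = len(s)
        return -sum((c / n) * log2(c / n) for c in Counter(s).values())

    random.seed(0)
    data = list("A" * 64)                  # start fully ordered
    for errors_so_far in (0, 16, 32, 48):
        print(errors_so_far, round(entropy_bits("".join(data)), 3))
        for _ in range(16):                # introduce 16 more random substitutions
            data[random.randrange(len(data))] = random.choice("ACGT")
    ```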

  9. 2009 August 31 at 10:55

    By the way Grant, I have had a look at theology online. I am not certain how much time I wish to give to a forum, and I don’t really want to join up to one. I saw their response to my article, including the hash job by The Barbarian. I would like to say he didn’t get it, which he didn’t, but in fact the greater issue was that he spent more time stating errors of historical fact.

  10. Roxolan
    2009 August 31 at 14:46

    As I suspected, you are using a peculiar definition of “information”, since you claim that “The man ran around the park” contains more information than “The maf ran around the park”.
    Please provide a precise definition of information.

  11. 2009 August 31 at 17:37

    When information is considered purely as data storage (i.e. meaning is not considered, just the initial state, irrespective of whether the initial state has meaning embedded in it or not) we can apply the idea of introducing “error” into the data and thus use somewhat similar statistical concepts. The disorder of the code increases as error grows; “entropy” increases.
    Exactly right. It is only reasonable to discuss this topic using such a definition of information. Roxolan looks to be too busy responding to bother reading. ;)
    My point was that there is no reason to speak at all about the second law of thermodynamics. The formulas for information work nicely without having to confuse the issue by talking about heat transfer.
    Entropy has many applications. Each of the sciences should have a well-defined set of entropic statements or formulae limiting them. I’ve noticed a few terms for ‘biological entropy’. Might be time for a bit more research. :)
    By the way Grant, I have had a look at theology online. I am not certain how much time I wish to give to a forum, and I don’t really want to join up to one. I saw their response to my article, including the hash job by The Barbarian. I would like to say he didn’t get it, which he didn’t, but in fact the greater issue was that he spent more time stating errors of historical fact.
    Yeah no worries, mate. Barbarian tends to rabbit on about most issues with little in the way of grace or understanding. One tends to get him easily confused with an atheist. :)

  12. Ken
    2009 September 1 at 00:00

    bethyada – you got the change of entropy in a closed system the wrong way around. However, you acknowledge the difference for an open system. Entropy can decrease for part of a system. Now you express this as energy flowing in or out of that open system.
    Bit like an evolving species, really. It happens in an open system. In particular random changes are selected by interaction with the environment. That is the source of information (Don’t have to postulate demons, fairies or gods.)
    In a real sense our genes are a record of our past – because organisms have adapted to past environments.
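
    A toy illustration of this claim, in the spirit of Dawkins’ well-known “weasel” program: the fixed target string stands in for the environment, and the mutation rate and brood size are arbitrary:

    ```python
    # Random variation plus selection against a fixed "environment"
    # (the target string) accumulates a specific sequence.
    import random

    random.seed(1)
    TARGET = "METHINKS IT IS LIKE A WEASEL"
    CHARS = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

    def score(s: str) -> int:
        """How many positions match the environment's 'preference'."""
        return sum(a == b for a, b in zip(s, TARGET))

    def mutate(s: str, rate: float = 0.05) -> str:
        """Randomly substitute each character with probability `rate`."""
        return "".join(random.choice(CHARS) if random.random() < rate else c for c in s)

    parent = "".join(random.choice(CHARS) for _ in TARGET)
    generation = 0
    while parent != TARGET:
        generation += 1
        # keep whichever of parent and offspring the "environment" favours
        parent = max([parent] + [mutate(parent) for _ in range(100)], key=score)
    print(generation, parent)
    ```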

  13. 2009 September 1 at 06:31

    Corrected:
    However if we use your analogy we see that information comes from information. In a closed system entropy must increase, but it can decrease locally in an open system because energy is provided externally; so long as we have an energy converter such as a photosynthesising leaf or a refrigerator.

  14. 2009 September 1 at 10:11

    Please provide a precise definition of information.
    Information = Meaning.

  15. 2009 September 1 at 10:23

    Bit like an evolving species, really. It happens in an open system. In particular random changes are selected by interaction with the environment. That is the source of information
    So where does the external information come from?
    I don’t happen to think there is a lot of external information being incorporated into DNA. There are examples like plasmids and DNA repair mechanisms.
    But even if we accept the analogy to entropy, one needs both a source of higher information and a method of incorporating that information. In the case of entropy we have decreased entropy locally (despite global increased entropy) because a machine does the work of decreasing entropy.
    What is the mechanism of incorporating information? Intracellularly, if we lose information on a DNA strand we can gain it back by using the other DNA strand as a template (the information source) and DNA repair enzymes (the mechanism). But information external to the cell?
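
    A minimal sketch of the intracellular case just described: the complementary strand fully determines the damaged one via Watson-Crick pairing (the sequence is made up, and real repair enzymes are of course far more elaborate):

    ```python
    # Rebuild a damaged strand from its intact complement using
    # Watson-Crick base pairing (toy sequence, illustrative only).
    PAIR = {"A": "T", "T": "A", "C": "G", "G": "C"}

    def rebuild_from_template(template: str) -> str:
        """The template strand fully determines its partner."""
        return "".join(PAIR[base] for base in template)

    intact = "ATGGTACCT"
    print(rebuild_from_template(intact))  # TACCATGGA -- the strand that was lost
    ```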

  16. Roxolan
    2009 September 1 at 12:27

    >> Information = Meaning.
    You’re just moving back the question. Please provide a precise definition of meaning. It has to be applicable to DNA too, mind you.

  17. Ken
    2009 September 1 at 22:30

    Precise definitions of “information” (such as Shannon information) are usually expressed in terms of negative entropy. This enables mathematical treatment.
    To say information = meaning is just an attempt to justify an imposition of religion onto science. Always bad.
    Bethyada – if you want to find the mechanism for something – you investigate it. Collect the evidence, draw inferences, develop hypotheses and theories, validate these against reality, develop deeper understanding. You cannot get anywhere in understanding reality with bald assertions of “information = meaning” or “god did it.” They are just attempts to stop science.
    Your vague assertions about what information is reveal simply an attempt to replace science with religion – but putting a sciency veneer on it. Sort of like Dembski, really.

  18. 2009 September 2 at 10:29

    Roxolan You’re just moving back the question. Please provide a precise definition of meaning. It has to be applicable to DNA too, mind you.
    No I am not; I chose my answer deliberately. One has to think about what one is trying to describe.
    And you can discuss topics with common definitions that don’t have mathematical precision. In fact mathematical manipulations can remove focus on reality. Sure, your equations may be solved, but are they modelling what you think they are modelling?
    Information at its most fundamental is the concept of meaning. Like reading a book, or looking at a car (or its blueprint), or following an algorithm.
    Ken Precise definitions of “information” (such as Shannon information) are usually expressed in terms of negative entropy. This enables mathematical treatment.
    And this is where we end up, discussing Shannon’s theories without regard to whether they capture the concept of information. Which they don’t fully, because meaning was not what Shannon was addressing.

  19. Roxolan
    2009 September 2 at 13:05

    But that is because “meaning” is not a precise enough concept. It is impossible to quantify the amount of meaning in a sentence.
    And the problem is even worse with living organisms and the DNA, because the DNA is *not* like letters forming sentences. It indicates what proteins will be produced. How could you say whether producing one kind of protein is more or less “meaningful” than producing another kind or none at all? Could you take two genomes of the same species and find the more meaningful one?

  20. Ken
    2009 September 3 at 00:29

    Ah – so it’s not information you are talking about after all! It’s meaning!
    Which of course is in the eye of the beholder.
    (So why talk about information and evolution?)
    Makes your argument irrelevant though. Scientists are in the business of understanding reality. Your mutterings are not going to help there.

  21. 2009 September 5 at 01:30

    But that is because “meaning” is not a precise enough concept. It is impossible to quantify the amount of meaning in a sentence.
    Difficult perhaps, but not impossible. The problem with establishing theory before identifying the concepts is the modelling, as I said.
    It may be difficult to decide which protein carries the most meaning, but it is not difficult to say that there is more information (in the meaning sense) in a person than in a bacterium. It is also not hard to say that the existence of a gene represents higher information content than the absence of the same gene.
    I am quite happy with the term information, but because it has been used in several ways it is important to focus on the meaning component of it (which is what information means) and not on storage without regard to what is being stored.
    Ah – so it’s not information you are talking about after all! It’s meaning!
    No, I am talking about information, but I used the term meaning because of the possibility of equivocation. You latched on to Shannon, and while his theories are important, they are insufficient. “Information” is more than Shannon, and some of Shannon is not really what people mean by information.
    Consider Hamlet, and consider a similar-length book that has random numbers in it. Shannon treats them as having the same information because he talks in terms of storage. But it is clear that Hamlet has high information (meaning) and the numbers have no information (meaning).
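
    One crude way to see structured text and random data come apart at the storage level is compressibility – English prose is highly redundant and compresses heavily, while random bytes hardly compress at all (the excerpt and repetition factor below are illustrative):

    ```python
    # Compression ratio as a rough proxy for storage-level redundancy.
    import os
    import zlib

    english = (b"To be, or not to be, that is the question: "
               b"Whether 'tis nobler in the mind to suffer ") * 40
    random_bytes = os.urandom(len(english))

    for name, blob in (("english", english), ("random", random_bytes)):
        ratio = len(zlib.compress(blob)) / len(blob)
        print(name, round(ratio, 2))  # english far below 1.0; random near 1.0
    ```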

  22. Roxolan
    2009 September 5 at 12:16

    How can you tell that a bacterium contains less “information” than a human, or that a gene is better than no gene? The length of the genetic code? It’s a metric I might use, but then again I have no problem with the idea that some species of ferns have more information in their DNA than humans.
    Unless you have a precise definition of information, one that also works on small scales, this discussion is meaningless, and your claim that information cannot appear without intelligence even more so. If you’re only able to compare the information content of two vastly different organisms, you won’t be able to apply your definition in evolution, since mutations usually cause only a tiny change in the DNA.
    Thus I repeat my previous question: using your “definition”, how could you ascertain whether producing one kind of protein is more or less “meaningful” than producing another kind or none at all?

  23. 2009 September 5 at 14:23

    In order to determine which of two different things has the more information, one would probably be required to have made those things.
    However, I agree with bethyada. I think it’s perfectly reasonable to assume that a human is built from a more informed source than a bug is. :)

  24. 2009 September 5 at 20:50

    Roxolan, perhaps it is best to leave DNA aside for a time and consider the issue more philosophically. I have no problems applying information theory to DNA, but DNA is no different from other sorts of information in terms of its “substance.” That is, messages are not matter; they are merely stored in it.
    If I have the only copy of a book, and that book gets burned in a fire, we have less information than before (unless I have memorised it). If you are saying that we can’t tell whether there is now less information, then discussions about the applicability to DNA are not going to go very far.
    Now I agree that the amount of information in Hamlet compared with Macbeth may be difficult to establish, but I do not doubt that both contain significantly more information than The Very Hungry Caterpillar.

  25. Roxolan
    2009 September 5 at 23:32

    I am not going to leave DNA aside, because DNA is the very topic and because it *is* different from a book. DNA does not contain a “message”. It only contains instructions for the production of proteins. No meaning is to be found in there. Unless you have a specific definition of meaning that you wish to apply, in which case I invite you to share it.
    >> Now I agree that the amount of information in Hamlet compared with Macbeth may be difficult to establish, but I do not doubt that both contain significantly more information than The Very Hungry Caterpillar.
    Unless you can provide a viable definition of information, this is a meaningless claim. In number of words, Hamlet beats The Very Hungry Caterpillar. But The Very Hungry Caterpillar contains color pictures, which would make it a much larger computer file for example.
    And I remind you that, while a broad definition like “number of English words” may suffice to rank Hamlet versus The Very Hungry Caterpillar versus a book of randomized letters, you’re going to have to be much more precise when comparing two genomes which only differ by a few minor mutations.
    I repeat my previous question: using your definition of “information” (/”meaning”), how could you ascertain whether producing one kind of protein is more or less “meaningful” than producing another kind or none at all?

  26. 2009 September 6 at 03:35

    Roxolan, your comments suggest that you don’t get message theory. I suggested leaving DNA aside (temporarily) because your evolutionary preconceptions colour the way you see the issue. By looking at message theory sans DNA we can establish what information is (or the types of information). Then one can see whether or not the theory is applicable to DNA. There is already much material written on the statistics of data to establish whether the data is likely to have been tampered with.
    The fact that you are thinking about the size of picture files suggests you are thinking in terms of storage capacity, not in terms of information quality, but as I said above (to Ken), Shannon’s theory treats a book of random numbers the same as a novel. But the former clearly (from first principles!) has absolutely no information (in terms of meaning).
    Size carries some weight in a definition of information content, but not all of it; and in the case of random data it carries absolutely no weight.

  27. 2009 September 7 at 10:13

    I can’t resist…
    Here’s a thought about information and messages:
    The sender of the information may have meaning in mind when he sends information.
    When the information reaches the receiver unchanged there is the potential for the meaning that was intended to be fully received. There is also potential for partial meaning or even misunderstood meaning. There could even be unintended meaning.
    All of this is very much like the conversation you guys are having here.
    Amino acids do not have the same interpretation issues (that I know of); they just behave in certain ways.

  28. 2009 September 7 at 14:17

    Chemicals are not people. Of course they cannot misinterpret anything.
    What point were you trying to make?

  29. 2009 September 8 at 08:10

    TMYU, yes, you are correct in your assessment here; I am hoping to post on this shortly. The sender-receiver problem depends on the strictness of the rules, so it is less of a problem for DNA.

  30. 2009 September 8 at 10:16

    what does DNA stand for anyway?
    National Dyslexic Association!
