- Paul Ehrlich studied the evolution of canoe design in Polynesia as a model system for how cultural evolution works in general. He found, not surprisingly, that artistic variation occurred rapidly, whereas variation in the canoes' functional design was slower (due to the need to remain seaworthy). (blog post, journal article)
- Simon Kirby et al. simulated the evolution of a new language. Human subjects were shown a collection of nonsense words, with a picture associated with each word. They were then asked to recall these word-picture associations. Whether or not the recollections were correct, they were used as the basis for a new set of word-picture associations, which was then shown to a new set of subjects. As the associations changed each round based on what people could remember, a structured language began to develop.
In other words, human memory was the environment in which the language was evolving: the more structure in the language, the easier it was to remember, and therefore the more of it got passed on. Very cool! (blog post, journal article)
- Arne Traulsen et al. (the et al. includes Nowak) found that if you assume a much higher rate of "mutation" in ideas than in genes (a reasonable assumption), you get qualitatively different results. For example, cooperation can become viable in situations where it otherwise wouldn't be. (blog post)
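The iterated-learning dynamic in Kirby's experiment can be sketched in a few lines of code. This is a toy model, not the published experimental design: meanings are hypothetical (shape, color)-style feature pairs, "memory" is a fixed recall probability, and forgotten words are reconstructed from the prefix/suffix statistics of what was recalled. Under those assumptions, compositional structure tends to accumulate across generations, because structured words survive reconstruction.

```python
import random
from collections import Counter

random.seed(0)
MEANINGS = [(a, b) for a in range(3) for b in range(3)]  # two features, three values each
SYLL = ["ka", "mo", "ti", "re", "wu", "la"]

# Generation 0: an arbitrary (holistic) word for every meaning.
language = {m: random.choice(SYLL) + random.choice(SYLL) for m in MEANINGS}

def learn(lang, recall_p=0.6):
    """One learner: recall each word with prob recall_p; otherwise
    reconstruct it from the prefix/suffix statistics of recalled words."""
    recalled = {m: w for m, w in lang.items() if random.random() < recall_p}
    pref = {a: Counter(w[:2] for (x, _), w in recalled.items() if x == a) for a in range(3)}
    suff = {b: Counter(w[2:] for (_, y), w in recalled.items() if y == b) for b in range(3)}
    new_lang = {}
    for (a, b) in MEANINGS:
        if (a, b) in recalled:
            new_lang[(a, b)] = recalled[(a, b)]
        else:
            p = pref[a].most_common(1)[0][0] if pref[a] else random.choice(SYLL)
            s = suff[b].most_common(1)[0][0] if suff[b] else random.choice(SYLL)
            new_lang[(a, b)] = p + s  # generalize from remembered fragments
    return new_lang

def compositionality(lang):
    """Fraction of meanings expressed as (modal prefix for feature a) + (modal suffix for b)."""
    pref = {a: Counter(w[:2] for (x, _), w in lang.items() if x == a).most_common(1)[0][0]
            for a in range(3)}
    suff = {b: Counter(w[2:] for (_, y), w in lang.items() if y == b).most_common(1)[0][0]
            for b in range(3)}
    return sum(lang[(a, b)] == pref[a] + suff[b] for (a, b) in MEANINGS) / len(MEANINGS)

for gen in range(30):  # chain of 30 learners
    language = learn(language)
print(round(compositionality(language), 2))
```

The key feature this captures is that the selection pressure comes entirely from the transmission bottleneck: nothing rewards structure directly, yet structure spreads because it is what survives imperfect memory.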
amazing blog, thank you.
I used to do chemistry, moved into econ/finance.
I have a question: is the amount of chaos (entropy) within a system related to the degree of complexity of that system?
What I am asking is: does a higher degree of disorder lead to a higher degree of complexity?
Also, since the entropy of the Universe goes up, does that mean that the number, variety, and probability of formation of complex structures, and maybe life (since life could be viewed as a complex structure rather than a simple chemical reaction), go up as well?
Thank YOU, Alex! The relationship between entropy and complexity is subtle, and depends a lot on what is meant by "complexity." I touched on this issue here. According to some definitions, complexity is roughly the same as entropy/disorder/randomness. However, totally random systems cannot exhibit the kinds of "complex structure" you are referring to---the word "structure" itself connotes a certain amount of order.
Yaneer Bar-Yam likes to say that complex systems exhibit entropy and complexity on multiple scales. I've been meaning to do a blog post on this idea for a long time; perhaps I'll bump it towards the front of my list.
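The distinction between entropy and structure is easy to see numerically. A minimal sketch (my own illustration, using per-symbol Shannon entropy as the measure of disorder): a perfectly regular string has low entropy, an i.i.d. random string has near-maximal entropy, yet neither has the kind of multiscale structure a complexity measure would want to score highly.

```python
import math
import random
from collections import Counter

def shannon_entropy(s):
    """Shannon entropy, in bits per symbol, of the character distribution of s."""
    counts = Counter(s)
    n = len(s)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

random.seed(1)
ordered = "ab" * 500                                             # perfectly regular
rand = "".join(random.choice("abcdefgh") for _ in range(1000))   # i.i.d. noise

print(shannon_entropy(ordered))  # exactly 1.0 bit/symbol: low disorder
print(shannon_entropy(rand))     # close to 3 bits/symbol: near-maximal disorder
```

By a pure-entropy definition the random string is "most complex," which is exactly the problem: it contains no structure at all. That mismatch is what motivates multiscale and other structure-sensitive definitions of complexity.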
One of the key concepts that is often overlooked (in my mind) is that of coevolution. Namely, we can look at populations of agents as evolving within a relatively static environment, but often it's better to model the environment (or at least parts of it) as an evolutionary system itself which is coevolving with the original population. It is important to remember that the choice of which system is the evolving one and which is the environment is an arbitrary one, and the roles can always be flipped and looked at from the opposite perspective.
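The point about the arbitrariness of the population/environment split can be made concrete with a toy coevolutionary model (my own sketch, not from any of the references below): two populations in a matching game, where each type's fitness is a function of the *other* population's composition, so each population literally is the other's environment. Swapping which one you call "the population" just relabels the code.

```python
import random

random.seed(2)

def step(hosts, parasites, n=200, mut=0.05):
    """One generation: each type's fitness depends on the OTHER population's
    composition; reproduce proportionally to fitness, with mutation."""
    h_freq = hosts.count(1) / len(hosts)
    p_freq = parasites.count(1) / len(parasites)
    # Toy assumption: hosts pay to MISmatch parasites, parasites pay to match hosts.
    h_fit = {1: 1.0 - p_freq + 0.01, 0: p_freq + 0.01}
    p_fit = {1: h_freq + 0.01, 0: 1.0 - h_freq + 0.01}
    def reproduce(pop, fit):
        kids = random.choices(pop, weights=[fit[t] for t in pop], k=n)
        return [1 - t if random.random() < mut else t for t in kids]  # flip type on mutation
    return reproduce(hosts, h_fit), reproduce(parasites, p_fit)

hosts = [random.randint(0, 1) for _ in range(200)]
parasites = [random.randint(0, 1) for _ in range(200)]
for _ in range(50):
    hosts, parasites = step(hosts, parasites)
print(hosts.count(1) / 200, parasites.count(1) / 200)
```

Because each population chases a target the other keeps moving, neither settles into the fixed optimum it would find against a static environment; that moving-target dynamic is what a static-environment model throws away.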
There is a great diagram of a coevolutionary model of language and brain on page 12 of this paper, and a less formal, but far more thorough, exploration of the coevolution of genes and culture in Durham's seminal text.
In terms of formalizing coevolutionary dynamics, while I have not yet read it, Vincent and Brown's book seems to be far-reaching and thorough.