How life (and death) spring from disorder

Life was long thought to obey its own set of rules. But as simple physical systems increasingly show lifelike behavior, scientists are debating whether this apparent complexity is simply a consequence of thermodynamics.

What is the difference between physics and biology? Take a golf ball and a cannonball and drop them off the Leaning Tower of Pisa. The laws of physics let you predict their trajectories about as accurately as anyone could wish.

Now do the same experiment again, but replace the cannonball with a pigeon.

Biological systems do not defy the laws of physics, of course, but those laws do not seem able to predict their behavior. What sets biological systems apart is that they are geared to survive and reproduce. We can say that they have a purpose, or what philosophers have traditionally called teleology, that guides their behavior.

Similarly, physics now lets us predict, starting from the state of the universe a billionth of a second after the Big Bang, what it looks like today. But no one imagines that the appearance of the first primitive cells on Earth led predictably to the emergence of the human race. The course of evolution, it seems, is not dictated by laws.

The teleology and historical contingency of biology, said the evolutionary biologist Ernst Mayr, make it unique among the sciences. Both features probably stem from biology's single general guiding principle: evolution. Evolution is random and arbitrary in character, yet natural selection gives it the appearance of intention and purpose. Animals are drawn to water not by any magnetic attraction but by instinct, the drive to survive. Legs serve, among other things, to take us to the water.

Mayr argued that these features make biology an exceptional science, a law unto itself. But recent developments in nonequilibrium physics, complex-systems theory and information theory have challenged that view.

Once we regard living things as agents performing a computation, collecting and storing information about an unpredictable environment, their capacities and limitations (reproduction, adaptation, agency, purpose and meaning) can be understood as arising not from evolutionary improvisation but as inevitable corollaries of physical laws. In other words, there appears to be a kind of physics behind what organisms do and how they evolve to do it. Meaning and intention, long considered the defining characteristics of living systems, may then emerge naturally through the laws of thermodynamics and statistical mechanics.

Last November, physicists, mathematicians and computer scientists met with evolutionary and molecular biologists to talk, and sometimes argue, about these ideas at a workshop at the Santa Fe Institute in New Mexico, the mecca for scholars of “complex systems”. The question posed was: just how special (or not) is biology as a scientific discipline?

Not surprisingly, opinions were divided. But one idea came through clearly: if there is a kind of physics behind biological teleology and agency, it must deal with the same concept that seems to have become central to the most fundamental physics: information.

Disorder and demons

The first attempts to bring information and intention into the laws of thermodynamics were made in the mid-19th century, when the Scottish physicist James Clerk Maxwell helped invent statistical mechanics. Maxwell showed how introducing these two ingredients seemed to make it possible to do things that thermodynamics proclaimed impossible.

By then, Maxwell had already shown how the predictable and reliable mathematical relationships between the properties of a gas (pressure, volume and temperature) could be derived from the random and unknowable motions of countless molecules colliding frantically under the action of thermal energy. In other words, thermodynamics, the new science of heat flow that united large-scale properties of matter such as pressure and temperature, was the outcome of statistical mechanics at the microscopic scale of molecules and atoms.

According to thermodynamics, the capacity to extract useful work from the energy resources of the universe is always diminishing. Pockets of energy are dwindling; concentrations of heat are gradually smoothed away. In every physical process, some energy is inevitably dissipated as useless heat, lost among the random motions of molecules. This randomness is measured by the thermodynamic quantity called entropy, a measure of disorder, which is always increasing. That is the second law of thermodynamics. Eventually the whole universe will be reduced to a uniform, random jumble: a state of equilibrium in which entropy is maximized and nothing meaningful will ever happen again.

Are we really doomed to such a grim fate? Maxwell was reluctant to believe it, and in 1867 he set himself the task of, as he put it, “picking a hole” in the second law. His aim was to start with a container of gas whose molecules move at random, then separate the fast molecules from the slow ones, reducing entropy in the process.

Imagine a tiny creature (the physicist William Thomson later called it, rather to Maxwell's dismay, a demon) that can see every individual molecule in the vessel. The demon divides the container into two compartments and installs a sliding door in the partition between them. Every time he sees a particularly fast molecule approaching the door from the right-hand compartment, he opens the door to let it through to the left. And every time a slow, “cold” molecule approaches the door from the left, he lets it pass to the right. Eventually he has a compartment of cold gas on the right and hot gas on the left: a heat reservoir that can be tapped to do work.
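
To make the demon's sorting concrete, here is a minimal, purely illustrative sketch in Python (our own toy, not anything from Maxwell's writings): molecule speeds are drawn at random from a rough thermal distribution, and a simulated demon sends each molecule to a hot or cold compartment by comparing its speed with the median.

```python
import random
import statistics

# Toy model of Maxwell's demon (illustrative only, arbitrary units).
random.seed(0)
speeds = [abs(random.gauss(0, 1)) for _ in range(10_000)]
threshold = statistics.median(speeds)

hot = [v for v in speeds if v > threshold]    # fast molecules sent left
cold = [v for v in speeds if v <= threshold]  # slow molecules sent right

def mean_ke(vs):
    # Mean kinetic energy per unit mass, a stand-in for temperature.
    return statistics.mean(v * v for v in vs) / 2.0

print(f"hot side ~ {mean_ke(hot):.3f}, cold side ~ {mean_ke(cold):.3f}")
# The demon has apparently created a temperature difference for free:
# the paradox that Landauer's bound, discussed below, resolves.
```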

This is only possible because the demon has two things we lack. First, information: he can see every molecule individually, not just as a statistical average. And second, intention: a plan to separate the hot from the cold. By exploiting his knowledge with deliberate purpose, he can apparently defy the laws of thermodynamics.

Or so it seemed. It took a hundred years to understand why Maxwell's demon cannot, in fact, subvert the second law and avert the inexorable slide toward universal equilibrium. And the reason reveals a deep connection between thermodynamics and the processing of information, or in other words, computation. The German-American physicist Rolf Landauer showed that even if the demon can gather information and move the (frictionless) door at no energy cost, a reckoning must eventually come. Because his memory, which stores the information about every molecule's motion, cannot be infinite, he must occasionally wipe it clean, erasing what he has already seen and starting afresh, before he can continue harvesting energy. This act of erasing information has an unavoidable cost: it dissipates energy and therefore increases entropy. All the gains a clever demon might make against the second law are canceled out by the “Landauer limit”: the finite cost of erasing information (or, more generally, of converting information from one form to another).
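
Landauer's limit is easy to state numerically: erasing one bit of information must dissipate at least kT ln 2 of energy as heat, where k is Boltzmann's constant and T is the temperature. The short calculation below (a worked example we have added, not a figure from the text) shows how tiny that floor is at room temperature.

```python
import math

# Landauer's bound: erasing one bit dissipates at least k_B * T * ln(2).
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

e_bit = k_B * T * math.log(2)
print(f"minimum erasure cost: {e_bit:.2e} J per bit")  # ~2.9e-21 J
```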

Living organisms are somewhat like Maxwell's demon. Whereas a beaker full of reacting chemicals will eventually expend its energy and fall into boring stasis and equilibrium, living systems have collectively been avoiding the lifeless equilibrium state ever since the origin of life, roughly three and a half billion years ago. They harvest energy from their surroundings to sustain this nonequilibrium state, and they do it with intention. Even simple bacteria move with “purpose” toward sources of heat and nutrition. In his 1944 book “What Is Life?”, the physicist Erwin Schrödinger expressed this idea by saying that living organisms feed on “negative entropy”.

They achieve this, Schrödinger argued, by capturing and storing information. Some of that information is encoded in their genes and passed down from generation to generation: a set of instructions for reaping negative entropy. Schrödinger did not know where the information is kept or how it is encoded, but his intuition that it is written into what he called an “aperiodic crystal” inspired Francis Crick, a physicist by training, and James Watson, who in 1953 figured out how genetic information can be encoded in the molecular structure of DNA.

The genome, then, is at least in part a record of the useful knowledge that has enabled an organism's ancestors, right back into the distant past, to survive on our planet. According to David Wolpert, a mathematician and physicist at the Santa Fe Institute who convened the recent workshop, and his colleague Artemy Kolchinsky, the key point is that well-adapted organisms are correlated with their environment. If a bacterium reliably swims to the left or right when a food source lies in that direction, it is better adapted, and will flourish more, than one that swims in random directions and finds food only by chance. A correlation between the state of the organism and the state of its environment implies that they share information in common. Wolpert and Kolchinsky argue that it is this information that helps the organism stay out of equilibrium, because, like Maxwell's demon, it can tailor its behavior to extract work from fluctuations in its surroundings. If it did not acquire this information, the organism would gradually revert to equilibrium: that is, to death.
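
The “shared information” here can be quantified as the mutual information between organism and environment. The sketch below is our own illustration with a made-up joint distribution for the bacterium example: the stronger the correlation between where the food is and which way the bacterium swims, the more bits the two share.

```python
import math

# Illustrative only: mutual information between a bacterium's swim
# direction and the location of food, for an invented distribution
# p[(food_side, swim_side)].
p = {("L", "L"): 0.4, ("L", "R"): 0.1,
     ("R", "L"): 0.1, ("R", "R"): 0.4}

def marginal(dist, idx):
    m = {}
    for outcome, prob in dist.items():
        m[outcome[idx]] = m.get(outcome[idx], 0.0) + prob
    return m

pf, ps = marginal(p, 0), marginal(p, 1)
mi = sum(v * math.log2(v / (pf[f] * ps[s])) for (f, s), v in p.items())
print(f"I(food; swim) = {mi:.3f} bits")  # ~0.278 bits for this table
```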

From this point of view, life can be regarded as a computation that aims to optimize the storage and use of meaningful information. And life, it turns out, is extremely good at it. Landauer's resolution of the puzzle of Maxwell's demon set an absolute lower limit on the energy required by a computation with finite memory: namely, the energetic cost of forgetting. The best computers today are hugely more wasteful than that; they typically consume and dissipate around a million times more energy. But, says Wolpert, “according to a very conservative estimate, the overall thermodynamic efficiency of the computation done by a cell is only about 10 times greater than the Landauer limit.”

The implication, he says, is that “natural selection has been hugely concerned with minimizing the thermodynamic cost of computation. It will do all it can to reduce the total amount of computation a cell must perform.” In other words, biology (with the possible exception of ourselves) seems to take active measures not to overthink the problem of survival. This question of the costs and benefits of an organism computing its way through life, he says, has been largely overlooked in biology.

Inanimate Darwinism

Living organisms, then, can be regarded as entities that attune to their environment by using information to harvest energy and evade equilibrium. That is quite a mouthful. But notice that it says nothing about genes and evolution, on which many biologists, Mayr among them, assumed biological intention and purpose depend.

How far can this picture take us? Genes honed by natural selection undoubtedly occupy a central place in biology. But could it be that evolution by natural selection is itself just a particular case of a more general imperative toward function and apparent purpose that exists in the purely physical universe? It is starting to look that way.

Adaptation has long been seen as the hallmark of Darwinian evolution. But Jeremy England of the Massachusetts Institute of Technology argues that adaptation to the environment can happen even in complex nonliving systems.

Adaptation here carries a more specific meaning than the usual Darwinian picture of an organism well equipped for survival. The Darwinian view has one catch: we can identify a well-adapted organism only in retrospect. The “fittest” are those that proved better at survival and reproduction, but we cannot predict what fitness entails. Whales and plankton are both well adapted to marine life, yet in ways that bear little obvious relation to each other.

England's definition of “adaptation” is closer to Schrödinger's, and indeed to Maxwell's: a well-adapted entity can absorb energy efficiently from an unpredictable, fluctuating environment, like a person who keeps her footing on a pitching ship while everyone else falls over, because she is better attuned to the fluctuations of the deck. Applying the concepts and methods of statistical mechanics in a nonequilibrium setting, England and his colleagues argue that such well-adapted systems absorb and dissipate the energy of the environment, generating entropy in the process.

Complex systems tend to settle into these well-adapted states with surprising ease, says England: “Thermally fluctuating matter often gets spontaneously beaten into shapes that are good at absorbing work from the time-varying environment.”

Nothing in this process involves gradual accommodation to the surroundings through the Darwinian mechanisms of replication, mutation and inheritance. There is no replication at all. “What is exciting about this is that when we give a physical account of the origins of some apparently adapted structures we see, they don't have to have had parents in the usual biological sense,” says England. “You can explain evolutionary adaptation using thermodynamics, even in intriguing cases where there are no self-replicators and Darwinian logic breaks down.” That is, provided the system in question is complex, versatile and sensitive enough to respond to fluctuations in its environment.

There is no conflict, however, between physical and Darwinian adaptation. In fact, the latter can be seen as a special case of the former. If replication is present, natural selection becomes the route by which systems acquire the ability to absorb work (Schrödinger's negative entropy) from the environment. Self-replication is, in fact, an especially good mechanism for stabilizing complex systems, so it is no surprise that this is what biology uses. But in the nonliving world, where replication does not usually happen, the well-adapted dissipative structures tend to be highly organized ones, such as sand ripples and dunes crystallizing from the random dance of windblown sand. Looked at this way, Darwinian evolution can be regarded as a specific instance of a more general physical principle governing nonequilibrium systems.

Prediction machines

This picture of complex structures adapting to a fluctuating environment also lets us deduce something about how such structures store information. In short, so long as these structures, living or not, are compelled to use the available energy efficiently, they are likely to become “prediction machines”.

It is almost a defining characteristic of life that biological systems change their state in response to some driving signal from the environment. Something happens; you respond. Plants grow toward the light or produce toxins in response to pathogens. These environmental signals are typically unpredictable, but living systems learn from experience, storing information about their environment and using it to guide future behavior. (Genes, in this picture, just supply the basic, general-purpose essentials.)

Prediction, moreover, is not optional. According to the work of Susanne Still at the University of Hawaii, Gavin Crooks, formerly of Lawrence Berkeley National Laboratory in California, and their colleagues, predicting the future seems to be essential for any energy-efficient system in a random, fluctuating environment.

Still and her colleagues show that storing information about the past that has no value for predicting the future carries a thermodynamic cost. To be maximally efficient, a system has to be selective. If it remembers everything indiscriminately, it incurs large energy losses. If, on the other hand, it does not bother to store any information about its environment at all, it will constantly struggle to cope with the unexpected. “A thermodynamically optimal machine must balance memory against prediction by minimizing its nostalgia, the useless information about the past,” says co-author David Sivak, now at Simon Fraser University in Burnaby, British Columbia. In short, it must become good at harvesting meaningful information, the kind that is likely to be useful for future survival.
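
A toy calculation can illustrate the trade-off. In the sketch below (our own construction, not the model in Still and colleagues' paper), a two-state environment repeats its last state with probability p, and the system's one-bit memory simply stores the current state. The predictive part of that bit is its mutual information with the next state; the remainder is “nostalgia”.

```python
import math

def binary_entropy(p):
    # Entropy of a biased coin, in bits.
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.5, 0.7, 0.9, 0.99):
    memory = 1.0                      # I(memory; present) = 1 bit stored
    predictive = 1.0 - binary_entropy(p)  # I(memory; next state), bits
    nostalgia = memory - predictive   # stored but useless for prediction
    print(f"p={p:.2f}  predictive={predictive:.3f}  nostalgia={nostalgia:.3f}")
# The more random the environment (p near 0.5), the more of the stored
# bit is pure nostalgia, and hence wasted dissipation.
```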

You might expect natural selection to favor organisms that use energy efficiently. But even individual biomolecular devices, such as the pumps and motors in our cells, must in some important way learn from the past to anticipate the future. To acquire their remarkable efficiency, Still says, these devices must “implicitly construct concise representations of the world they have encountered so far, enabling them to anticipate what's to come.”


Thermodynamics of death

Even if some of these basic information-processing features of living systems are already prompted by nonequilibrium thermodynamics in the absence of evolution or replication, you might suppose that more complex traits, such as tool use and social cooperation, must be supplied by evolution.

But don't count on it. These behaviors, commonly thought to be the exclusive domain of the highly advanced evolutionary niche that includes primates and birds, can be mimicked by a simple model consisting of a system of interacting particles. The trick is that the system is guided by a constraint: it acts in a way that maximizes the amount of entropy (here defined in terms of the different possible paths the particles could take) it generates within a given time span.

Entropy maximization has long been thought to be a trait of nonequilibrium systems. But the system in this model obeys a rule that lets it maximize entropy over a fixed time window stretching into the future. In other words, it has foresight. In effect, the model surveys all the paths the particles could take and compels them to adopt the path that produces the greatest entropy. Crudely speaking, that tends to be the path that keeps open the largest number of options for how the particles might move subsequently.

You might say that the system of particles experiences a kind of urge to preserve freedom of future action, and that this urge guides its behavior at every moment. The researchers who developed the model, Alexander Wissner-Gross of Harvard University and Cameron Freer, a mathematician at the Massachusetts Institute of Technology, call it a “causal entropic force.” In computer simulations of configurations of disk-shaped particles moving around in particular settings, this force produces outcomes that are eerily suggestive of intelligence.
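
Wissner-Gross and Freer's actual simulations are far more sophisticated, but the flavor of a causal entropic force can be caricatured in a few lines. In this sketch (our own simplification, not their algorithm), an agent on a bounded one-dimensional track picks, at each step, the move whose random futures reach the widest spread of positions, that is, the move that keeps the most options open; it duly drifts away from the walls.

```python
import math
import random
from collections import Counter

SIZE, HORIZON, ROLLOUTS = 21, 8, 200
random.seed(1)

def entropy(counts):
    # Shannon entropy of an empirical distribution of end positions.
    n = sum(counts.values())
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def rollout(pos):
    # One sampled future: a bounded random walk, HORIZON steps long.
    for _ in range(HORIZON):
        pos = min(SIZE - 1, max(0, pos + random.choice((-1, 1))))
    return pos

def best_move(pos):
    # Score each candidate move by the diversity of where its futures end.
    scores = {}
    for move in (-1, 0, 1):
        start = min(SIZE - 1, max(0, pos + move))
        ends = Counter(rollout(start) for _ in range(ROLLOUTS))
        scores[move] = entropy(ends)
    return max(scores, key=scores.get)

pos = 2  # start near the left wall
for _ in range(15):
    pos = min(SIZE - 1, max(0, pos + best_move(pos)))
print("final position:", pos)  # tends toward the middle of the track
```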

In one case, a large disk “used” a small disk to extract a second small disk from a narrow tube, a process that looked remarkably like tool use. Freeing the disk increased the entropy of the system. In another example, two disks in separate compartments synchronized their behavior to pull a larger disk down so that they could interact with it, giving the appearance of social cooperation.

Of course, these simple interacting agents get the benefit of a glimpse into the future. Life, as a general rule, does not. So how relevant is this to biology? That is not clear, although Wissner-Gross says he is now working on “a practical, biologically plausible mechanism for causal entropic forces.” In the meantime, he believes the approach could have practical spinoffs, offering a shortcut to artificial intelligence. “I predict that a faster way to achieve it will be to discover such behavior first and then work backward from the physical principles and constraints, rather than working forward from particular calculation or prediction techniques,” he says. In other words, first find a system that does what you want it to do, and then figure out how it does it.

Aging, too, has traditionally been seen as a trait dictated by evolution. Organisms have a lifespan that creates opportunities for reproduction, the story goes, without the survival prospects of offspring being undermined by parents who linger too long and compete for resources. That is surely part of the story, but Hildegard Meyer-Ortmanns, a physicist at Jacobs University in Bremen, Germany, believes that ultimately aging is a physical process, not a biological one, governed by the thermodynamics of information.

It is certainly not just a matter of things wearing out. “Most of the soft material we are made of is renewed before it has the chance to age,” says Meyer-Ortmanns. But this renewal process is not perfect. The thermodynamics of information copying dictates a trade-off between precision and energy. An organism has a finite supply of energy, so errors inevitably accumulate over time. The organism then has to spend an increasingly large amount of energy to repair those errors. Eventually the renewal process yields copies too flawed to function properly, and death follows.
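
As a cartoon of that argument (our own toy model, not Meyer-Ortmanns's), imagine a blueprint recopied every cycle with a fixed repair budget. Damaged sites copy less faithfully, so once accumulated damage outruns the budget, the errors compound and renewal fails.

```python
import random

# Toy model of error accumulation (illustrative only).
random.seed(0)
SITES = 10_000        # sites in the blueprint being recopied
BUDGET = 100          # errors repairable per cycle (energy proxy)
BASE_RATE = 0.01      # per-site copy-error probability when undamaged
damage = 0

for cycle in range(1, 2001):
    rate = min(1.0, BASE_RATE * (1 + damage / 1_000))  # damage feeds back
    new_errors = sum(random.random() < rate for _ in range(SITES))
    damage = max(damage + new_errors - BUDGET, 0)
    if damage > SITES // 10:  # too corrupted to function
        print(f"renewal fails after {cycle} cycles (damage={damage})")
        break
else:
    print("still functioning")
```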

Empirical evidence seems to bear this out. It has long been known that cultured human cells can replicate no more than 40 to 60 times (the so-called Hayflick limit) before they stop and become senescent. And recent studies of human longevity suggest that there may be a fundamental reason why humans cannot survive much beyond 100 years.

There is a corollary to this apparent urge for energy-efficient, organized, predictive systems to appear in a fluctuating nonequilibrium environment. We ourselves are such systems, as are all our ancestors back to the first primitive cell. And nonequilibrium thermodynamics seems to be telling us that this is just what matter does under such circumstances. In other words, the appearance of life on a planet like the early Earth, with its many sources of energy, such as sunlight and volcanic activity, that keep things churning out of equilibrium, starts to look not like an extremely improbable event, as many scientists have assumed, but virtually inevitable. In 2006, Eric Smith and the late Harold Morowitz of the Santa Fe Institute argued that the thermodynamics of nonequilibrium systems makes the emergence of organized, complex systems much more likely on a prebiotic Earth far from equilibrium than it would be if the raw chemical ingredients just sat and stewed quietly in a “warm little pond” (in Charles Darwin's words).

In the decade since that argument was first made, researchers have added detail and insight to the analysis. The qualities that Ernst Mayr considered fundamental to biology, meaning and intention, may emerge as a natural consequence of statistics and thermodynamics. And those general properties may, in turn, lead naturally to something like life.

At the same time, astronomers are showing us just how many worlds, by some estimates billions, orbit other stars in our galaxy. Many are far from equilibrium, and at least some are Earth-like. And the same rules surely apply there, too.
