How Order Adds Depth to Life (and Death)


WHAT SETS BIOLOGY AND PHYSICS APART FROM EACH OTHER? Drop a ball and a cannonball off the CN Tower. The laws of physics let you forecast their paths as precisely as you'd like.
Now repeat the experiment, this time using a pigeon in place of the cannonball.

Of course, biological systems don't violate the laws of physics, but neither do they seem to be predicted by them. In contrast, they have a goal of their own: to survive and reproduce. We can say that they act with a purpose, or what philosophers have traditionally called teleology.

In a similar vein, physics now lets us predict, starting from the state of the universe a billionth of a second after the Big Bang, what it looks like today. But no one imagines that the appearance of the first primitive cells on Earth led predictably to the human race. Laws do not seem to dictate the course of evolution.

The evolutionary biologist Ernst Mayr argued that this teleology and historical contingency are what set biology apart from the other sciences. Both features stem from perhaps the only general guiding principle biology has: evolution. Evolution depends on chance and randomness, yet natural selection gives it the appearance of intention and purpose. Animals are drawn to water not by some mysterious magnetic pull but by their instinct, their intention, to survive. Legs serve the purpose of, among other things, taking us to the water.

These features, according to Mayr, make biology exceptional, a law unto itself. But recent developments in nonequilibrium thermodynamics, complex systems science and information theory are calling that belief into question.

Once we regard living things as agents performing a computation, collecting and storing information about an unpredictable environment, then replication, adaptation, agency, purpose and meaning can be understood as arising not from evolutionary improvisation but as inevitable corollaries of physical laws. In other words, there appears to be a kind of physics of things doing stuff, and evolving to do stuff. Meaning and purpose, thought to be the defining characteristics of living systems, may then emerge naturally from the laws of thermodynamics and statistical mechanics.

At a workshop held at the Santa Fe Institute in New Mexico, a hub for the study of "complex systems," physicists, mathematicians and computer scientists gathered with evolutionary and molecular biologists to discuss these ideas. They asked: just how special (or not) is biology? It shouldn't come as a surprise that there was no consensus. But one message that came through clearly was that, if there is a kind of physics behind biological teleology and agency, it has something to do with the same concept that seems to have become ingrained at the heart of fundamental physics itself: information.

Demons and disarray
The first attempt to bring information and intention into the laws of thermodynamics came in the mid-19th century, when statistical mechanics was being invented by the Scottish physicist James Clerk Maxwell. Maxwell showed how introducing these two ingredients seemed to make it possible to do things that thermodynamics declared impossible.

Maxwell had already demonstrated how the predictable and reliable mathematical relationships between the properties of a gas (pressure, volume and temperature) could be derived from the random and unknowable jiggling of countless molecules agitated by thermal energy. To put it another way, thermodynamics, the new science of heat flow that united large-scale properties of matter such as pressure and temperature, was the outcome of statistical mechanics on the microscopic scale of molecules and atoms.

According to thermodynamics, the universe's capacity to extract useful work from its energy resources is always diminishing. Concentrations of energy dissipate; pockets of heat smooth away. In any physical process, some energy is inevitably wasted as useless heat, lost among the random motions of molecules. This randomness is equated with the thermodynamic quantity called entropy, a measure of disorder, which is always increasing. That is the second law of thermodynamics. Eventually the cosmos will be reduced to a uniform, boring jumble: a state of equilibrium in which entropy is maximised and nothing meaningful ever happens again.

Do we have to go down that dark path? Maxwell was reluctant to accept it, and in 1867 he set out to, as he put it, "pick a hole" in the second law. His aim was to start with a disordered box of randomly jiggling molecules, then separate the fast molecules from the slow ones, reducing entropy in the process. Imagine some little creature, which the physicist William Thomson later called, rather to Maxwell's dismay, a demon, that can see every individual molecule in the box. The demon divides the box into two compartments, with a sliding door in the wall between them. Every time he sees a particularly energetic molecule approaching the door from the right-hand compartment, he opens the door to let it through; every time a slow, "cold" molecule approaches from the left, he lets that through too. Eventually he has a compartment of cold gas on the right and hot gas on the left: a heat reservoir that can be tapped to do work, apparently in defiance of the second law.
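
To see the demon's trick in action, here is a minimal toy simulation, a sketch of my own rather than anything from the article; the particle count, speed distribution and threshold are arbitrary choices made purely for illustration.

```python
# Toy Maxwell's demon: molecules start randomly distributed between the
# left ("L") and right ("R") compartments, and the demon only lets fast
# molecules cross to the left and slow ones cross to the right.
import random

random.seed(1)

N = 1000            # number of molecules (arbitrary)
THRESHOLD = 1.0     # demon's cutoff between "fast" and "slow" (arbitrary)

# Each molecule is [compartment, speed]; speeds drawn from an exponential
# distribution as a stand-in for a thermal spread of speeds.
molecules = [[random.choice("LR"), random.expovariate(1.0)] for _ in range(N)]

def mean_speed(side):
    speeds = [v for s, v in molecules if s == side]
    return sum(speeds) / len(speeds)

print("before: left %.2f, right %.2f" % (mean_speed("L"), mean_speed("R")))

# Each step a random molecule "approaches the door"; the demon opens it
# only when doing so sorts a fast molecule left or a slow molecule right.
for _ in range(20000):
    m = random.choice(molecules)
    if m[0] == "R" and m[1] > THRESHOLD:
        m[0] = "L"      # energetic molecule admitted to the left
    elif m[0] == "L" and m[1] <= THRESHOLD:
        m[0] = "R"      # sluggish molecule admitted to the right

print("after:  left %.2f, right %.2f" % (mean_speed("L"), mean_speed("R")))
```

The left compartment ends up hotter than the right only because the demon watches each molecule and decides; it is exactly that use of information that the next paragraphs put a price on.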

There is a catch, though. To perform this trick the demon must observe each molecule and keep a record of what it has seen, and any finite memory must eventually be wiped clean. The act of erasing information comes at an inescapable cost: energy is dissipated, and entropy increases. This is "Landauer's limit": the finite cost of erasing information (or, more generally, of converting information from one form to another) cancels out all the gains the demon's clever work makes against the second law.

Living things bear an uncanny resemblance to Maxwell's demon. Whereas a beaker full of reacting chemicals will eventually expend its energy and fall into boring stasis and equilibrium, living systems have collectively been resisting the lifeless equilibrium state since the origin of life roughly three and a half billion years ago. They harvest energy from their surroundings to sustain this nonequilibrium state, and they do it with "intention." Even simple bacteria move with purpose toward sources of heat and nutrition. The physicist Erwin Schrödinger expressed this in his 1944 book What is Life? by saying that living organisms feed on "negative entropy."

They do so, Schrödinger argued, by capturing and storing information. Some of that information is encoded in their genes and passed on from one generation to the next: a set of instructions for reaping negative entropy. Schrödinger didn't know where the information is kept or how it is encoded, but his intuition that it is written into what he called an "aperiodic crystal" inspired Francis Crick, himself a physicist by training, and James Watson when, in 1953, they figured out how genetic information can be encoded in the molecular structure of the DNA molecule. Looked at this way, a genome is at least in part a record of the useful knowledge that has enabled an organism's ancestors, right back into the distant past, to survive on our planet.

A correlation between the state of an organism and the state of its environment implies that the two share information. According to David Wolpert of the Santa Fe Institute and his colleague Artemy Kolchinsky, it is this information that helps the organism stay out of equilibrium, because, like Maxwell's demon, it can then tailor its behaviour to extract work from fluctuations in its surroundings. Without this information, the organism would gradually revert to equilibrium and perish.

Seen from this perspective, life can be regarded as a computation that aims to optimise the storage and use of meaningful information. And life turns out to be extremely good at it. Landauer's resolution of the Maxwell's demon puzzle set an absolute lower bound on the amount of energy a finite-memory computation requires: the energetic cost of forgetting. Even the most energy-efficient computers on the market today waste energy at a rate a million times or more above that bound. Yet according to Wolpert, "a very conservative estimate of the thermodynamic efficiency of the total computation done by a cell is that it is only 10 or so times more than the Landauer limit."

The implication, he said, is that "natural selection has been hugely concerned with minimizing the thermodynamic cost of computation. It will do all it can to reduce the total amount of computation a cell must perform." In other words, biology (possibly excepting ourselves) seems to take great care not to overthink the problem of survival. This issue of the costs and benefits of computing one's way through life, he said, has been largely overlooked in biology so far.
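
As a rough sanity check on those numbers, here is a small back-of-the-envelope calculation, my own illustration rather than anything from the article; the temperature and the "10 times" and "million-fold" factors are simply the round figures quoted above.

```python
# Order-of-magnitude comparison of the Landauer limit with the rough
# per-operation energies implied by the factors quoted in the text.
import math

k_B = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                   # roughly room temperature, K

landauer = k_B * T * math.log(2)     # minimum energy to erase one bit
print("Landauer limit at 300 K: %.2e J per bit" % landauer)   # about 2.9e-21 J

cell = 10 * landauer                 # cells: ~10x the limit (Wolpert's estimate)
silicon = 1e6 * landauer             # computers: a million times the limit or more
print("cell    ~ %.2e J per elementary operation" % cell)
print("silicon ~ %.2e J per elementary operation" % silicon)
```

Even at a million times the limit, each elementary operation costs only a few femtojoules; the point of the comparison is the enormous headroom between today's hardware and what cells already achieve.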

Darwinism without life
Living organisms can thus be regarded as entities that attune to their environment by using information to harvest energy and evade equilibrium. Admittedly, it is a bit of a mouthful. But notice that it says nothing about genes or evolution, which Mayr, like many biologists, assumed to be essential to biological intention and purpose.

How far can this picture take us, then? Genes honed by natural selection are undoubtedly central to biology. But could it be that evolution by natural selection is itself just a particular case of a more general imperative toward function and apparent purpose that exists in the purely physical universe? That is how it is starting to look. Adaptation has long been regarded as the hallmark of Darwinian evolution. But Jeremy England of the Massachusetts Institute of Technology has recently argued that complex nonliving systems can adapt to their surroundings too. Adaptation here means something different from the usual Darwinian picture of an organism well equipped for survival. One problem with the Darwinian view is that a well-adapted organism can only be identified in retrospect. The "fittest" are those that turned out to be better at survival and reproduction, but you cannot predict in advance what fitness entails. Whales and plankton are both extremely well adapted to aquatic life, yet in ways that bear little obvious relation to one another.

England's notion of "adaptation" is closer to Maxwell's and Schrödinger's: a well-adapted entity can absorb energy efficiently from an unpredictable, fluctuating environment. It is like the person who keeps their footing on a pitching ship while others fall over, because they are better at adjusting to the movements of the deck. Using the concepts and methods of statistical mechanics in a nonequilibrium setting, England and his colleagues argue that these well-adapted systems are the ones that absorb and dissipate the energy of the environment, generating entropy in the process.

Thermally fluctuating matter often gets spontaneously beaten into shapes that are good at absorbing work from the time-varying environment, according to England, and complex structures tend to settle into these well-adapted states with surprising ease. Nothing in this process involves gradual accommodation to the surroundings through the Darwinian mechanisms of replication, mutation and inheritance of traits. There is no replication at all. "What is interesting about this is that it means that when we give a physical account of the origins of some of the adapted-looking structures we see, they don't necessarily have to have had parents in the usual biological sense," England said. As long as the system in question is complex, versatile and sensitive enough to respond to fluctuations in its environment, thermodynamics can explain evolutionary adaptation even in intriguing cases where there are no self-replicators and Darwinian logic breaks down.

Physical adaptation and Darwinian evolution do not, however, conflict; the latter can be regarded as a particular case of the former. If replication is present, natural selection becomes the route by which systems acquire the ability to absorb work (Schrödinger's negative entropy) from the environment. It is no surprise that biology employs self-replication, because it is an extremely effective way of stabilising complex systems. But in the nonliving world, where replication rarely happens, the well-adapted dissipative structures tend to be highly organised ones, like the sand ripples and dunes that crystallise from the random dance of windblown sand. Looked at this way, Darwinian evolution can be seen as a specific instance of a more general physical principle governing nonequilibrium systems.

Prediction systems
This picture of complex structures adapting to a fluctuating environment also lets us deduce something about how such structures store information. In short, as long as these structures, living or not, are compelled to use the available energy efficiently, they are likely to become "prediction machines."

It is almost a defining characteristic of life that biological systems change their state in response to a driving signal from the environment. Something happens; you respond. Plants grow toward the light; they produce toxins in response to pathogens. These environmental signals are typically unpredictable, but living systems learn from experience, storing up information about their surroundings and using it to guide future behaviour. (Genes, in this picture, just give you the basic, general-purpose essentials.)

Prediction is not optional, though. According to the work of Susanne Still at the University of Hawaii, Gavin Crooks, formerly of the Lawrence Berkeley National Laboratory in California, and their colleagues, predicting the future seems to be essential for any energy-efficient system in a random, fluctuating environment. Still and colleagues show that there is a thermodynamic cost to storing information about the past that has no predictive value for the future. To be maximally efficient, a system has to be selective. If it indiscriminately remembers everything that happened, it incurs a large energy cost. On the other hand, if it doesn't bother storing any information about its surroundings at all, it will constantly struggle to cope with the unexpected.
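
To make "information with no predictive value" concrete, here is a small toy of my own, not taken from the papers: the environment's signal has a persistent bit that tends to carry over from one step to the next, plus a fresh noise bit, and we compare a memory that records everything with one that keeps only the persistent part.

```python
# Toy illustration: a memory that also records the noise "remembers more"
# but predicts the next signal no better; the excess is exactly the kind
# of useless stored information the argument above penalises.
import random
from collections import Counter
from math import log2

random.seed(0)

STEPS = 200_000
STICKINESS = 0.9        # probability the persistent bit keeps its value

def mutual_info(pairs):
    """Empirical mutual information (in bits) between the two coordinates."""
    joint = Counter(pairs)
    left = Counter(a for a, _ in pairs)
    right = Counter(b for _, b in pairs)
    n = len(pairs)
    mi = 0.0
    for (a, b), c in joint.items():
        p_ab = c / n
        mi += p_ab * log2(p_ab / ((left[a] / n) * (right[b] / n)))
    return mi

# Generate the signal: s is persistent (predictive), n is fresh noise each step.
s = 0
signal = []
for _ in range(STEPS):
    if random.random() > STICKINESS:
        s = 1 - s
    signal.append((s, random.randint(0, 1)))

full_memory = signal[:-1]                  # stores both bits of the present signal
lean_memory = [s for s, _ in signal[:-1]]  # stores only the persistent bit
future = signal[1:]                        # the next signal

print("full memory: stored %.2f bits, predictive %.2f bits"
      % (mutual_info(list(zip(full_memory, signal[:-1]))),
         mutual_info(list(zip(full_memory, future)))))
print("lean memory: stored %.2f bits, predictive %.2f bits"
      % (mutual_info(list(zip(lean_memory, signal[:-1]))),
         mutual_info(list(zip(lean_memory, future)))))
```

Both memories predict the next signal equally well (about half a bit with these settings), but the indiscriminate one stores an extra bit each step that buys no foresight; by the argument above, that extra bit is paid for in dissipated energy.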

"A thermodynamically optimum system must balance memory against prediction by limiting its nostalgia to the worthless information about the past," said co-author David Sivak, who is currently at Simon Fraser University in Burnaby, British Columbia. In other words, it must become proficient at gathering important data that would be likely to just be crucial for continued existence.

You would expect natural selection to favour organisms that use energy efficiently. But even individual biomolecular devices, such as the pumps and motors in our cells, should in some important way learn from the past to anticipate the future. According to Still, to achieve their remarkable efficiency these devices must "implicitly construct concise representations of the world they have encountered so far, enabling them to anticipate what's to come."

In the decade since that claim was first made, researchers have added depth and perspective to the investigation. Meaning and intention, the qualities Ernst Mayr considered fundamental to biology, may emerge naturally from statistics and thermodynamics. And those general properties may in turn lead naturally to something like life.

Astronomers, meanwhile, have shown just how many worlds there are, by some estimates stretching into the billions, orbiting other stars in our galaxy. Many are far from equilibrium, and at least a few are Earth-like. The same rules are surely playing out there, too.

Originally published in Quanta Magazine, an editorially independent publication of the Simons Foundation whose goal is to increase public awareness of science through in-depth coverage of research.
