Entropy
According to the first law of thermodynamics, energy is not created or destroyed, only transformed: the total energy of a system remains constant, even as it is converted from one form to another. The first law also establishes the equivalence between work and heat. It was introduced by Mayer around 1840. The word "energy" comes from ancient Greek, from the word "ergon", meaning "action" or "work"; the concept of energy was introduced by Young in 1807.

The first law establishes that the energy of the Universe remains constant, but it does not state how the Universe evolves with time; that is the task of the second law, which states that entropy, disorder, increases with time. The second law was developed around 1824 by Carnot, and later spread by Clapeyron and Clausius, who explained that heat does not spontaneously flow from a cold body to a hot body. The increase of entropy with time is the statement of the second law of thermodynamics. Clausius coined the term "entropy" (from the Greek for "transformation") in 1865; it refers to the transformation content of a body. The opposite of entropy is negentropy.

The first law is about the conservation of energy; the second law is about the evolution of a system. Both are supported by overwhelming experimental evidence. According to the second law, spontaneous processes are irreversible: once a system has changed, it cannot go back to its initial state. The new state will be disordered in comparison to the previous state, which means that disorder, entropy, inevitably increases with time. A consequence of this irreversibility is the general impossibility for a system to go back in time, which means that an arrow of time, pointing towards the future, is established. This arrow of time is binding for physical phenomena in general, and it goes from the past towards the future: a system does not get reordered and recover its previous state. If it recovered a state similar to the previous one, it would not be the previous state but another, similar one.
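The two laws stated above are commonly written, for a closed system, as:

```latex
% First law: the internal energy U changes only through heat Q supplied
% to the system and work W done by the system (energy is conserved).
\Delta U = Q - W

% Second law: the total entropy S of an isolated system never decreases;
% it stays constant only for reversible transformations.
\Delta S \ge 0
```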
For instance: if a jar were accidentally dropped on the floor from a shelf and then picked up and set back in place again, it would not be the same place, because even the Earth would no longer be in the same place as it orbits the Sun.
Information
Information is related to entropy, but it is an abstract mathematical concept, so, according to scholars, it is not exactly its reverse. Entropy measures the increase of disorder inside a system, it measures randomness, while information measures uncertainty (whatever that may mean). According to the definition by Shannon and Weaver in their book "The Mathematical Theory of Communication", information arises from an interaction between objects and a change of the state they are in, which results in a communication of that information.

Information appears when the elements of a system order themselves, and that happens when entropy, which invariably increases everywhere all the time, manages to remain apparently constant, or even to diminish, locally, from some particular point of view and to some effects (and this is how entropy and information would be related). This requires the presence of an observer, classically personified as "Maxwell's demon": something that interacts with the system and so behaves like an observer. In quantum physics the observer and the observation are identical, and interactions are genuine: when a given particle A, for instance a photon, interacts with a particle B, for instance an electron, the change in B is a measurement of A, so B is an observer of A and B is also the observable change, all in one.
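Shannon's measure of the uncertainty of a source can be sketched in a few lines. This is the standard textbook formula H = -Σ p·log₂(p), shown here only as an illustration of "information measures uncertainty"; the example probabilities are arbitrary.

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)).

    A uniform distribution (maximum uncertainty) gives the highest entropy;
    a certain outcome (p = 1) gives zero.
    """
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin carries one full bit of uncertainty per toss...
print(shannon_entropy([0.5, 0.5]))    # 1.0
# ...while a heavily biased coin carries much less.
print(shannon_entropy([0.99, 0.01]))  # ~0.081
```

Maximum entropy corresponds to maximum uncertainty before the observation, and therefore to the greatest amount of information gained once the outcome is observed.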
In classical mechanics, where the macroscopic interacting objects are formed by huge numbers of those elementary particles, and measurements are hugely imprecise in comparison, what we now call an observer can, for practical purposes on the macroscopic scale to which we are confined, even be a third party: another macroscopic object outside the observed interaction, for instance a spectator watching two snooker balls collide on a TV screen during a televised match. (These macroscopic "observations" are not only imprecise, unable to detect elementary particles, but also untrue and illusory, as they are based on make-believe interactions; yet they work for practical, macroscopic, illusory purposes, like playing snooker and fleeing from crocodiles, or flying planes between continents, within an acceptable margin of error: the error of considering that a crocodile is a crocodile instead of a mass of elementary particles and entropy.) Information is another way for a system to get disordered, although with apparent order in the presence, and under the particular interpretation, of an observer that measures the change when interactions take place (true interactions, or false interactions acceptable in practice within some margin of error). The result is that disorder will sometimes look like order from an observer's blurred point of view; order is just that, a misinterpretation of disorder (one we will not notice, perception being confined to a macroscopic scale, with the pixels and action potentials invisible to us, apparently out of sight and nonexistent).
That apparent local order in an open system, in the form of information, is possible because energy is consumed inside it, energy supplied from outside that open system. From inside the open system, which looks closed from the inside, from the observer's point of view, blurred for the microscopic, the interactions will seem to mean order, as if the system were a closed one from that perspective, as if the energy used in the interactions (a system consists of some elements and their interactions) never came from the outside. The trick the brain performs, in its particular case, to achieve this make-believe appearance of being a closed system, an autonomous conscious individual in charge of some peculiar voluntary and self-controlled observations, is probably a mere change of the scale of perception, as we have been mentioning (and as will be reasoned in the final chapter). During the process of conscious observation the mind appears to be a closed system; it appears ordered, and looks like a closed system on a certain scale of observation, because the process of observation internally consumes an exceptionally and sufficiently large amount of the energy income from outside the system (glucose) to make that illusion of a rational mind possible in such a complex system (with so many available synapses). The brain accounts for about two per cent of the body's weight, while it consumes around twenty per cent of the available energy (more or less, depending on age, etc.).
Statistical entropy
Boltzmann (1877) explained the rise of entropy at a microscopic level: the rise of entropy of a system, and the irreversibility of its changes of state, take place through the evolution of the elements of the system from a more ordered state to a less ordered state. To achieve this explanation he developed the concept of statistical entropy: instead of resorting to the heat of the whole system, he spoke of the microstates of the system, the probability of each of its elements being placed somewhere in it. According to Boltzmann, entropy is proportional to the logarithm of the number of different microscopic states in which the constituent particles of a piece of matter could be found such that, in any of those microstates, it would look like the same piece of matter from a macroscopic point of view. Extrapolating this description to the brain: it too is a macroscopic object from certain points of view, as in macroscopic anatomical descriptions, and it also includes microscopic anatomical elements (neurons) which go through changes of state: neurons can be at rest as well as discharging action potentials. Even though the brain changes its states at a microscopic level, it remains the same brain at a macroscopic level throughout those changes, at least to some effects: it looks like the same brain, the same organ, of the same person, from one minute to the next and throughout an entire life, and that person's conscious self-identity and individuality remain stable too, despite the billions of instantaneous changes the microscopic neurons go through at every instant. The brain, defined as a system with entropy that gives off heat, fits this description of entropy by Boltzmann as well: in spite of the microscopic changes in the brain, in spite of the millions of heterogeneous transmissions in the synapses, the conscious self remains the same macroscopic experience every consecutive instant.
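Boltzmann's definition, S = k·ln(W), can be illustrated with a toy system: count the microstates W compatible with a given macrostate and take the logarithm. The two-state units below (rest/firing) are only an illustrative analogy suggested by the text, not a physical model of the brain.

```python
import math
from math import comb

K_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

def boltzmann_entropy(n_units, n_active):
    """S = k * ln(W), where W is the number of microscopic arrangements
    (which of the n_units are 'active') that look identical macroscopically
    (same total count of active units)."""
    w = comb(n_units, n_active)  # microstates compatible with this macrostate
    return K_B * math.log(w)

# With 100 two-state units, the macrostate "50 active" is compatible with
# vastly more microstates than "1 active", so it has much higher entropy:
print(boltzmann_entropy(100, 50) > boltzmann_entropy(100, 1))  # True
```

The many microstates behind one macrostate are exactly why the brain can change at every instant at the neuronal level while remaining "the same brain" macroscopically.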
Negentropy
The brain self-organizes; this means that it works, to some effects, as if its dynamics relied on its own energy, as if it were a closed thermodynamic system. But the brain is not a closed system: its energy comes from outside, mainly as glucose, and it radiates heat ceaselessly. It is an open system. The brain is a living, self-organized system, so locally (locally in the head) entropy decreases. This is called negentropy, and when it happens, order increases locally to some extent. For instance, the functional structure of the brain is built up in an orderly way at several levels: a functional neural level recognizable as such at a microscopic scale, a network level at a macroscopic scale, etc. That ceaseless radiation of heat means that entropy is globally increasing in the system as a whole too, despite negentropy, as in the rest of the Universe. The relatively disproportionate, expensive and continuous supply of glucose to the neurons, and their relatively high rate of oxidation, makes the costly make-believe of negentropy possible.
If the Universe were like a letter soup, with the letters, the elementary particles, floating in the soup of the vacuum of space, the expansion of the Universe and the systematic movement of the particles would bring about the formation of groups of disordered letters, as if a spoon, the forces at work, were rinsing through the liquid. The words composed that way would look like order to an observer of words: order arising within disorder. But this order would be illusory, because words would get formed through the disordering of letters, not by ordering them. Negentropy simply strengthens further this apparent, local and restricted emergence of order within chaos in a particular bowl of soup. Anyway, even though crocodiles are not crocodiles but a temporary manifestation of entropy, it would be wise to keep away from them when they are hungry, because they do not know they are not crocodiles. The brain is a local system too, like the ideal bowl of letter soup of the example: it also radiates heat all the time, oxidizing the glucose taken from the outside to form letters and radiating heat in the process. For a system to break the second law, every letter of the soup would have to form words, an impossible task since the arrow of time is binding. In any case there will be letters that do not form recognizable words, meaning that heat will be radiated out of every local system, despite negentropy. Living systems, cells, plants, animals, are able to generate much local order and grow structures on several levels. Living beings are able to form more words in the soup than expected, given their peculiar way of systematic dynamic evolution in a negentropic fashion, for instance, by way of their self-organization in levels, like the molecular level (including the enzymatic activity), the cellular level, etc.
(and, in the case of the brain, the higher rate of glucose oxidation), something that helps to increase the illusion of order, although all of it is disorder. The biological molecular machinery must also mesh together, in a compatible way, at diverse scales, which implies nonlinear (chaotic) thermodynamics (the possibility of cause-effect relationships between scales and levels in some of these cases, among others, is in fact currently being investigated by scientists around the world; it is an area of interest). A system in equilibrium has fewer chances to generate so much order; that possibility becomes easier in systems further from equilibrium, which allow a constructive evolution of this kind. The more negentropic the system and the further from equilibrium, the more negentropic it will be next, and the more words will be formed. If it became even more negentropic, if it radiated more heat, words would even form phrases, and so on, successively, even on different levels, until reaching an equilibrium between, for instance, the available amount of glucose being oxidized, the amount of heat being radiated and the final complexity of the mind achieved in that particular process. The balance would be achieved when the letters were so separated in the soup as to be unable to form more words from any observer's point of view. (According to the second law of thermodynamics, in every closed system entropy does not increase if the transformation of the system is reversible; otherwise it increases, moving towards the state of equilibrium.)
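The letter-soup analogy can be made concrete with a small simulation: draw letters at random and count how often short words appear in the stream purely by chance. Everything here (the alphabet, the word list, the run length, the seed) is an arbitrary illustration, not a physical model.

```python
import random

def stir_the_soup(n_draws, word_list, rng):
    """Draw n_draws random letters and count how many times any of the
    given 'words' appears in the stream purely by chance: apparent order
    produced by a purely disordering process."""
    letters = "abcdefghijklmnopqrstuvwxyz"
    stream = "".join(rng.choice(letters) for _ in range(n_draws))
    return sum(stream.count(word) for word in word_list)

rng = random.Random(42)  # fixed seed so the run is reproducible
hits = stir_the_soup(200_000, ["at", "on", "is"], rng)
# Each two-letter word occurs roughly n / 26^2 times by chance alone,
# so several hundred "words" emerge from 200,000 random letters.
print(hits)
```

No ordering force is at work here; the "words" an observer recognizes are a by-product of random stirring, which is the point of the analogy above.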
It was discovered, by the end of the twentieth century, that this going from a less ordered state to a more ordered state without contradicting the second law, by negentropy, was possible. Prigogine's work was fundamental to understanding this possibility. Negentropy is a characteristic of living beings in particular, as they are characterized by this capability for self-organization. To understand such an increase of local order in a particular system like the brain, according to Prigogine, this idea of negentropy was required (together with a large supply of glucose), meaning an even bigger rise of entropy at this point of local negentropy, bigger than the mean local rise of entropy already taking place around it. This requisite would be met, in the case of the brain, thanks to its relatively high rate of energy consumption (glucose oxidation) and radiation of heat. The molecular machinery necessary to reach that equilibrium is complex and relatively expensive in terms of energy; life is always on thin ice. Evolution has made such an apparent waste of the usually scarce energy resources available to living cells possible in the brain through natural selection, assembling, step by step, an increasingly complex brain throughout millions of years.
Chaos, order and the brain
Self-organization carries order with it. Take, for instance, the characteristically regular neuronal activity: time and again, the discharge of a bioelectric impulse, or its absence, will regularly take place. This oscillatory character of neuronal activity has to do with the type of dynamic system the brain is: an open one. Prigogine proposed that there cannot be oscillations (periodicity) in a closed thermodynamic system, only in an open one that is continuously exchanging energy with its surroundings, which is the case of the brain, continuously receiving glucose and releasing heat (infrared radiation). Besides, a dynamic system capable of this degree of order must be in a homeostatic equilibrium, known in physiology as a "stable disequilibrium", what Prigogine called a "dissipative structure". Systems, being dynamic entities, also tend to adjust themselves throughout their dynamic physical evolution on their way towards a state of equilibrium which they will not reach (in physiology an unconscious adjustment is called "regulation" and a conscious adjustment is called "control"). Prigogine pointed out that this type of open system, with periodicity, should be nonlinear (chaotic) as far as the relations between forces and fluxes are concerned, as is the case of the brain. These are the types of systems that show the phenomenon of chaos. Chaos is the way a dynamic system gets disordered. Chaos is characterized by unpredictability and by the rise of complexity ("complexity" means that the next state of a system will be different from the previous one; therefore an evolution towards a more apparent simplicity, when it happens that way, is a way to raise complexity too; each new state also reflects the unrepeatability, or "non-ergodic" character, of a system).
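The unpredictability just mentioned, sensitive dependence on initial conditions, can be shown with the logistic map, a standard minimal example of a nonlinear chaotic system (chosen here as a generic illustration, not as a model of the brain).

```python
def logistic_orbit(x0, r=4.0, steps=50):
    """Iterate the logistic map x -> r * x * (1 - x), a textbook chaotic
    system for r = 4. Returns the state after `steps` iterations."""
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

# Two initial conditions differing by one part in a billion diverge:
# the tiny gap is roughly doubled at every iteration until it saturates,
# so after 50 steps the two orbits are effectively unrelated.
a = logistic_orbit(0.200000000)
b = logistic_orbit(0.200000001)
print(abs(a - b))
```

This is why a chaotic system is deterministic yet unpredictable in practice: no measurement is precise enough to pin down the initial state forever.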
The complexity is due to the rise of disorder, or entropy, of the system; it can also be explained as the increase in the number of states the system can be in, with reference to the number of its elements and to the types and number of interactions between those elements. Chaos can be caused by an input of energy into the system in the form of elements of the system, like a pool being filled with water, or by a change in the interactions among the elements, like in a rotating kaleidoscope, or both, like the brain. All of these ideas being taken into account here owe their content to Wiener's "cybernetics". As the brain is quite a complex system, it can present what Bonev, in his article "Teoría del caos" ("Chaos theory"), called "intermittency", which means that order can emerge from chaos and, successively, chaos from order, including an alternation between irregularity and periodicity; therefore it can include neuronal synchronization as a possibility, and it does. There is an article, "Coherencia global inducida por ruido o diversidad en sistemas excitables" ("Noise- or diversity-induced global coherence in excitable systems"), by Tessone C. J., Sciré A., Toral R. and Colet P., according to whom, in excitable systems like neuronal systems, if they are considered as "active rotors globally coupled", an increase in disorder at a microscopic level can result in greater order at a macroscopic level. According to these authors, this increase in disorder might be due to an increase of the background noise, to the diversity of the natural frequencies, or to a decrease of the coupling among the involved oscillators, which tend to synchronize, to desynchronize, and to fluctuate between both states around a fixed point. All of these ideas help explain that the brain, from a thermodynamic point of view, is a system where neuronal synchronization and consciousness are physically possible, given the proper conditions and enough time to evolve in that way.
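The "active rotors globally coupled" picture corresponds to Kuramoto-type models of coupled phase oscillators. The sketch below, with entirely arbitrary parameter values, computes the usual order parameter r (r near 1 means the oscillators are synchronized, r near 0 means they are incoherent); it illustrates the general mechanism of global coupling, not the specific results of the cited article.

```python
import cmath
import math
import random

def kuramoto_order(n=100, coupling=2.0, dt=0.05, steps=2000, seed=1):
    """Simulate n globally coupled Kuramoto phase oscillators,
        dtheta_i/dt = omega_i + (K/n) * sum_j sin(theta_j - theta_i),
    and return the final order parameter r = |mean(exp(i*theta))|.
    Strong coupling pulls the phases together; with K = 0 the phases
    drift independently and r stays small."""
    rng = random.Random(seed)
    omega = [rng.gauss(0.0, 0.3) for _ in range(n)]          # natural frequencies
    theta = [rng.uniform(0, 2 * math.pi) for _ in range(n)]  # random initial phases
    for _ in range(steps):
        mean_field = sum(cmath.exp(1j * t) for t in theta) / n
        r, psi = abs(mean_field), cmath.phase(mean_field)
        # mean-field form of the coupling: each oscillator is pulled
        # toward the mean phase psi with strength proportional to K*r
        theta = [t + dt * (w + coupling * r * math.sin(psi - t))
                 for t, w in zip(theta, omega)]
    return abs(sum(cmath.exp(1j * t) for t in theta) / n)

print(kuramoto_order(coupling=2.0))  # high: the population synchronizes
print(kuramoto_order(coupling=0.0))  # low: the phases stay incoherent
```

The transition from incoherence to collective synchronization as the coupling crosses a threshold is the kind of macroscopic order emerging from many microscopic units that the passage above appeals to.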