Andrea Ferrara
Temporal hesitations

Neurobiology of time perception

The perception of time in living creatures is undoubtedly one of the most important factors in their adaptation to their environment and thus a key element for their survival. It plays a central role as well in the evolution of the species, including human beings. Although it may seem instinctive to be able to identify temporal intervals with enough precision to foresee and carry out our actions, neurobiology to date has provided only partial explanations of the actual systems the brain uses to register the various intervals of time. Four time scales are considered particularly significant for human behavior: the day (the circadian rhythm), the second, the millisecond and the microsecond. According to recent neuroscientific findings, the suprachiasmatic nucleus of the hypothalamus is responsible for the system that registers the longest of these intervals, while the three shorter ones are registered by a diffuse network of neurons. Experiments with patients suffering from disturbances in their perception of time intervals of less than one second have in fact supported the theory that there is no single place in the brain that controls the timing mechanism, but rather that this mechanism depends on numerous interconnected sites.

Human time and natural time

The perception of time in the human being is partially conditioned by his neuronal and psychological structure. For centuries, man has lived with the conviction that there is such a thing as 'absolute' time, which can be measured thanks to the creation of ever more advanced chronometric systems. As scientific research in the field progresses, however, this belief is proving to be increasingly subjective and even illusory. The temporal units normally perceived by human beings are on the scale of their lived existence: years, and the succession of generations and epochs. Physics and the other natural sciences, on the other hand, use vastly greater or smaller intervals to express concepts such as the extension of the cosmos. The longest physical interval of time we can measure is the age of the Universe itself, calculated with remarkable precision at 13.66 billion years (compared to the Earth's 4.54 billion years). At the opposite end of the scale from these exceedingly long times are the infinitesimal periods that mark the evolution of matter at the level of atoms and the fundamental particles of which they are made. The atomic vibrations used to regulate atomic clocks, for instance, correspond to time intervals of approximately one billionth of a second. The shortest unit of time measured to date in the laboratory is the attosecond, a billionth of a billionth (10⁻¹⁸) of a second. While the extraordinary brevity of this period baffles us, it is still some 10²⁵ times larger than the fundamental unit of time, the so-called Planck time, equal to about 10⁻⁴⁴ seconds. Besides being the briefest measurable interval of time, in physics the Planck time represents a "quantum of time", that is, the briefest unit of time that still maintains the usual meaning of time.
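The gulf between these scales can be checked with a quick back-of-the-envelope calculation; the sketch below assumes the commonly quoted value of about 5.39 × 10⁻⁴⁴ s for the Planck time.

```python
import math

ATTOSECOND = 1e-18       # shortest interval measured to date in the laboratory, in seconds
PLANCK_TIME = 5.39e-44   # Planck time in seconds (commonly quoted value)

# How many orders of magnitude separate the attosecond from the Planck time?
orders = math.log10(ATTOSECOND / PLANCK_TIME)
print(round(orders))  # → 25
```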

Chaotic time

What is it that determines this indivisible interval of time? It seems to be the force of gravity, which exercises such a powerful influence at these time scales and their infinitesimal distances as to distort the very structure of space-time. Gravity distorts and fragments the temporal sequence, altering its structure from continuous to 'granular'. Just as a set of aligned points forms a line when seen from afar but resolves into individual points under a microscope, so time behaves when we probe the ultimate Planck interval. According to quantum physics and the Heisenberg uncertainty principle (which establishes the impossibility of knowing with certainty both the momentum and the position of a particle at the same time), time no longer has a specific direction that allows us to recognize its passage from past to present and from present to future in an orderly sequence of events. On the contrary, time and space, which are inextricably bound, become a chaotic process, analogous to a sort of foaming, like the froth on the surface of a liquid where bubbles are constantly being created and destroyed. We could say that in this peculiar condition time 'hesitates' and takes no definite direction. Certainly Einstein's theory of general relativity contributed enormously to the formation of a more accurate notion of our Universe as a space-time unit composed of four equivalent dimensions (three spatial and one temporal). Nonetheless, a number of aspects of time still raise complex questions yet to be answered. In fact, we come up against a possible paradox if the time coordinate is indeed fully equivalent to the three space coordinates. Suppose that we are in Florence at a certain instant. We do not doubt that, for example, the Colosseum and the Tower of Pisa continue to exist elsewhere in that same instant.
If we want to apply in a parallel fashion that same concept to the time coordinate, we are forced to conclude that not only the past (which isn't particularly disturbing) but also the future already exist. The paradox is that the future of each one of us, and of the entire cosmos for that matter, exists independently of the fact that we eventually reach it. This concept, which derives formally from the theory of relativity, has not been disproven by any scientific experiment and is therefore still recognized as valid by science. While man accepts an idea of space that expands in various directions within which the most disparate points can be reached, it is considerably more difficult to accept that scientific notion as regards time. This theory seems to directly contradict the perception that man has in his daily life, in which time seems to move in a single direction, from past to a future that does not yet exist and which must still develop.

Entropy and the arrow of time

But why should time have a single privileged direction, from the past towards the future passing through the present, when we measure it over intervals much longer than the Planck time? In other words, why does that hesitation of time not also occur at the time intervals we are capable of measuring? Time moves along at speeds that seem to vary (as discussed below) but always in the same direction. Scientists often use the term 'arrow of time' to indicate this inescapable, unalterable movement towards the future. To understand this better, we need to turn to thermodynamics and introduce the physical quantity known as entropy. We know, for example, that after the strong impact of a collision our car shatters into thousands of pieces. None of us, however, has ever seen the inverse process, in which the pieces miraculously reassemble in their original positions, bringing back our car intact. This process is (virtually) excluded by the second law of thermodynamics, which states that in an isolated system the degree of disorder, defined as entropy, always tends to increase over time. Of course, no physical law prohibits the pieces from coming back together after being shaken a bit, for instance, but the probability of such an event is vanishingly minuscule. In our example, the pieces of the car were in a highly ordered state before the collision, whereas after the accident they were in a disordered state. It is worth noting the importance of the word 'isolated' in the statement of the second law. The molecules that compose both living organisms and crystals are characterized by a high degree of order. In the human brain, for example, there are more than a hundred billion neurons connected in a network in which each individual cell can establish up to 100,000 interconnections with other cells.
The matter that constitutes living beings is organized into the complex molecules of cells and tissues, passing from a 'disordered' state to an 'ordered' one and thus decreasing its entropy, in what seems to be a contradiction of the second law. The explanation lies in the fact that living organisms are not isolated systems; on the contrary, they exchange energy and heat with the surrounding environment. It follows, according to the laws of thermodynamics, that they diminish their entropy at the expense of the entropy of the environment, in such a way that the entropy of the total system (organism + environment) increases.
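A minimal numeric sketch of this bookkeeping, with purely illustrative invented values: the organism's entropy may drop, but only because the environment's rises by more.

```python
# Illustrative (invented) entropy changes, in arbitrary units.
delta_S_organism = -5.0     # order created inside the living system
delta_S_environment = +8.0  # disorder exported to the surroundings as heat

# The second law constrains only the isolated total: organism + environment.
delta_S_total = delta_S_organism + delta_S_environment
assert delta_S_total > 0
print(delta_S_total)  # → 3.0
```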

Towards total disorder

Today scientists believe that our perception of the flow of time in the direction from past to future (that is, the arrow of time) depends on the analogous unidirectionality of the increase in entropy defined by the second law of thermodynamics. The physicist Stephen Hawking explained this connection using an analogy between human memory and computer memory. We know that electronic circuits work with binary digital signals, which can only take the value 0 or 1. When we store a piece of information in a computer's RAM, we have to force the system to settle into one of the two states, and this requires an expenditure of energy. That energy dissipates as heat, contributing to an increase in the degree of disorder (associated with the chaotic motion of molecules in a heated gas) in the Universe. The disorder of the Universe thus increases by a quantity that can be shown to be always greater than the increase in the degree of order in the memory itself. The direction of time in which the computer remembers the past is the same direction in which disorder (or entropy) increases. The human brain seems to function in much the same manner. Our psychological sense of time is thus determined within the brain by the thermodynamic arrow of time; that is, we are forced to remember things in the order in which entropy increases. This conclusion may seem self-evident, but it leaves open a basic question for physics: why did the Universe evolve from an initial highly ordered state towards the highly disordered state of the remote future? And why did the Big Bang occur in the first place? The laws of physics as we know them fail when we seek to explain what happened in the earliest moments after the Big Bang, within 'one Planck time' of it. Our only hope of finding the answers is to successfully develop a quantum theory of gravity.
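The energetic cost on which Hawking's analogy rests is usually formalized as Landauer's principle (a standard result the essay does not name): irreversibly setting or erasing one bit of memory dissipates at least k_B·T·ln 2 of heat. A quick sketch of that minimum at room temperature:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

# Minimum heat dissipated per bit irreversibly written (the Landauer limit).
E_bit = k_B * T * math.log(2)
print(f"{E_bit:.2e} J per bit")  # ≈ 2.87e-21 J
```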
Let us leave this daunting task to the physicists, however, and proceed with our analysis of what constitutes the human perception of time. We have seen that time has a direction determined by thermodynamics and its second law. Now we might ask whether the flow of time is constant, or whether it varies as it moves along that direction. In other words, are there 'temporal hesitations' at the macroscopic level of our perceptual experience of time, analogous to the chaotic time below the Planck scale?

Information and entropy

To try to answer this question, we need to turn to information theory, a branch of mathematics that aims to quantify the information content of a given message. This science uses the same quantity that we encountered earlier in the context of physics: entropy. In this new context, entropy corresponds to the degree of uncertainty in a message that we transmit, and so measures its informational content. An example may be useful here. Suppose we send our correspondent a sequence of 100 binary characters (bits), each either a 1 or a 0. If he or she already knows the sequence, no information is exchanged. If, however, each bit can be 1 or 0 with equal probability, then the informational content of the message is greatest. The formula that expresses this concept is exactly the same one used to calculate the entropy of a physical system. Indeed, the two phenomena seem to be perfectly analogous.
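The 100-bit example can be made concrete with Shannon's formula, H = −Σ p·log₂ p, summed over the two symbols and multiplied by the length of the message:

```python
import math

def entropy_bits(p_one, n_bits=100):
    """Shannon entropy of an n-bit message whose bits are 1 with probability p_one."""
    if p_one in (0.0, 1.0):
        return 0.0  # the sequence is fully predictable: no information exchanged
    h = -(p_one * math.log2(p_one) + (1 - p_one) * math.log2(1 - p_one))
    return n_bits * h

print(entropy_bits(1.0))  # sequence known in advance → 0.0 bits
print(entropy_bits(0.5))  # maximally uncertain bits → 100.0 bits
```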

The speed of time and the flow of information

In a surge of speculation, let us take a further step to explain our perception of time and its possible variations. The hypothesis I propose is that the perceived duration of time derives from the relationship between the quantity of information we can store and the speed with which that information is supplied. Given that, as we have seen, 'information content' and 'entropy' are essentially synonymous, entropy indicates not only the preferred direction of time in physical terms but also the rhythm of its flow as perceived psychologically. Assuming that our brains have the capacity to retain a great amount of information, this theory suggests that if we are exposed to a greater flow of information, the time intervals we perceive will tend to shrink; that is, time will seem to accelerate. In other words, the more information we receive, the stronger the impression that time flows quickly; conversely, the less information we receive, the slower time seems to flow. This information may be verbal, visual, auditory, or tactile: in a word, sensory. In essence, humans perceive existence and acquire information through their sense organs, and it is through this operation that the perception of time emerges. From this it follows that the flow of time is powerfully influenced by the acquisition of information.
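The hypothesis can be caricatured in a toy model (entirely my own illustration, not a formalism from the text): if the perceived length of an interval scales with the ratio between a reference information rate and the actual incoming rate, then a flood of information makes the interval feel shorter.

```python
def perceived_seconds(actual_seconds, info_rate, reference_rate=13.0):
    """Toy model: an interval feels shorter as the incoming information
    rate (bytes/s, hypothetical figures) exceeds a reference rate."""
    return actual_seconds * reference_rate / info_rate

print(perceived_seconds(60, info_rate=6.5))    # information trickle → 120.0 (feels long)
print(perceived_seconds(60, info_rate=130.0))  # information flood → 6.0 (feels short)
```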

Brain and computer: a simple comparison

The barrage of information that assaults us daily is without precedent in the history of mankind. It is worth asking whether our cerebral capacity is even remotely up to the task of maintaining a worthwhile exchange with the information technologies available. An example can illustrate this. Consider Canto I of the Inferno from Dante's Divine Comedy. It is composed of 44 tercets and one quatrain, for a total of 136 verses containing 4034 characters. In computer science a character corresponds to a Byte, that is, a string of eight ones and zeroes. Since the spaces between words also carry information, we must include them in the calculation (along with the punctuation marks), bringing the information content of the text to 5198 Bytes. Reading the entire Canto at a speed that allows me at least a minimal understanding of the text (the Inferno may not be the easiest text I could have chosen, but that does not affect the substance of the argument), the process of acquiring the information lasts 6'38", or 398 seconds. This rate of data acquisition, 13 Bytes per second, qualifies me as a genuine information sloth! Things might have gone somewhat better if, instead of a text, I had tried to acquire an image (our brains seem better adapted to that kind of acquisition), but not by much.
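The arithmetic above is easily reproduced; a minimal sketch using the figures given in the text:

```python
# Reproducing the reading-rate estimate for Inferno, Canto I.
content_bytes = 5198        # characters including spaces and punctuation (1 char = 1 Byte)
reading_time = 6 * 60 + 38  # 6'38" expressed in seconds

print(reading_time)                         # → 398
print(round(content_bytes / reading_time))  # → 13 Bytes per second
```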

Information overflow

So the problem of the perceived acceleration of time becomes a little clearer: today we are forced to absorb a huge quantity of information at a speed that far exceeds our brain's capacity, in a process called information overflow. On the basis of the entropy theory outlined above, this generates a perception of increasingly short time intervals. The brain 'defends itself' by filtering information in a variety of ways, for instance by retaining only certain patterns and/or compressing the information. Such compression, however, often involves sacrificing part of the original data. While the speed with which our brain can acquire information is clearly an essential factor, so too is its capacity for retention. Returning to the previous example: how long will I manage to remember the Canto I have just read, and how precisely? In terms of retention as well (that is, memory linked to cerebral plasticity), our brain appears to be less efficient than a computer. Might we suppose, then, that our brain capacity is under-exploited? Apparently not. The famous cosmologist Sir Martin Rees has argued that the present size of our brain represents the upper limit of an animal's possible capacity to develop intelligent concepts: beyond this size, the exchange of information between the synapses would take too long for constructive thought.

Towards the future

The sensation of the acceleration of time that we all experience today has become a critical problem of contemporary society. It is changing individual psychology and interpersonal relationships, and hence society as a whole, altering the quality of our social relations and exercising a questionable influence on human progress itself. It is likely that the constant acceleration we perceive is related to the phenomenon known as the 'Technological Singularity', a term coined in 1993 by the computer science professor Vernor Vinge. According to Vinge, by 2030 we will have all the technology needed to create super-human intelligence. Advances in hardware and biotechnological research could lead to the creation of conscious computers with super-human intelligence, as in effect already happens with programs that rewrite their own code to improve their performance. Technological prostheses applied to the human body to expand its biological capacities have been developed and manufactured for years. Artificial limbs that make motor skills bionic, and microchips inserted in the brain to restore vision to the blind, are the first steps towards the future implementation of microelectronic systems in the human neuronal network. Years have already passed since the first steps were taken towards the hybridization of the human brain and body with robotics, leading towards a cybernetic organism (cyborg). This is no longer confined to the science fiction of William Gibson, or to the vision of Donna Haraway, who theorized the cyborg as the future of the human race as early as the 1980s. So we may as well fall into step with the synthetic creatures that we ourselves have constructed, and with those that still exist only in the limbo of our (and their) imagination. This is our only hope for the day when we are forced to leave the Earth to seek a new home in the Cosmos.

Palazzo Strozzi