# The Allegory of the Algorithm

*Jamie Perrelet
Schumacher College
2013*


Since the unification of matter and energy, and of space and time, an information description of reality has been progressively emerging at the heart of physics. Algorithmic information theory (AIT) has significant implications for modern cosmology and quantum theory, placing tight constraints on certain physical processes. The determinism implied by AIT appears to prohibit notions of freewill altogether, and yet humanity has irrefutably demonstrated the ability to transcend such boundaries. The stochastic process of quantum indeterminism appears to play an important role in substantiating the existence of creativity in a deterministic universe. An information theory approach to this enquiry clearly suggests the need for physics, biology and science as a whole to supersede the mechanistic paradigm.

## The Nature of Information

Information was first quantified in 1948 by Claude E. Shannon’s landmark publication, ‘A Mathematical Theory of Communication’. Shannon’s definition of information is remarkably straightforward; however, its implications continue to shake the foundations of science. Indeed, it has been described as ‘the most basic law in physics’.

“Regard the physical world as made of information, with energy and matter as incidentals.” – John A. Wheeler

Shannon realised that information is simply ‘that which can distinguish one thing from another’ and defined a single unit of information as the quantity of information required to decide between two possibilities. He named this quantity of information ‘entropy’, for its many parallels with thermodynamic entropy, and the unit of information a ‘bit’, short for binary digit. Hence, a binary choice such as the flip of a coin has 1 bit of entropy, and the four possibilities of two coin flips have 2 bits of entropy. Information is a measure of unpredictability; the greater the number of possible outcomes of a system, the higher the entropy.
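The coin-flip figures above can be checked directly against Shannon’s definition of entropy, H = −Σ pᵢ log₂ pᵢ. A minimal sketch in Python (the function name is ours):

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin flip: two equally likely outcomes
print(shannon_entropy([0.5, 0.5]))              # 1.0 bit

# Two coin flips: four equally likely outcomes
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits
```

A biased coin (say, 90/10) yields less than 1 bit, reflecting that its outcome is more predictable.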

Although this formalism is sourced in communications theory, there is growing evidence from both the perspectives of quantum and cosmological physics that information is a fundamental and potentially elementary construct of reality. The nature of information is best illustrated through the outcome of physical experiments.

## Quantum Entanglement

A pair of particles can be prepared in such a way that their properties become ‘entangled’ and the behaviour of the two particles is entirely inseparable. To exemplify, if the spin of an entangled particle is measured to be clockwise, its partner will immediately assume an anticlockwise spin, despite the fact that neither particle had a defined spin before the measurement was taken. Astonishingly, the phenomenon occurs instantaneously over arbitrarily large distances. The experimental confirmation of entanglement marked a defining moment in the history of physics: within the framework of relativity, faster-than-light events suggest a reversal of causality, implying an effect preceding its cause. Pairs of entangled particles have been used in various experiments to reveal insights into the fundamental nature of reality. Entanglement is investigated further in the following three-stage setup, called the ‘quantum eraser’ experiment.

A photon passes through a beta barium borate (BBO) crystal, converting the single photon into a pair of entangled photons (γ_{1} & γ_{2}). The two photons follow separate paths; γ_{1} is sent straight to a detector (D_{1}) and γ_{2} to a double-slit setup with a target screen and detector (D_{2}). Both detectors are connected to a coincidence circuit, ensuring that only the entangled photons are recorded. Scanning horizontally, D_{2} records an intensity map of the screen, revealing the famous interference pattern of double-slit experiments. The recorded interference pattern implies that γ_{2} has effectively passed through both slits simultaneously, interacting with itself.

A circular polariser is placed in front of each slit, giving γ_{2} either a clockwise or anticlockwise polarisation, depending on which slit it has passed through. The polarisers have the effect of ‘marking’ γ_{2}, as it is now possible to know which slit it has passed through by measuring the photon’s polarisation. The consequence of this is that the ‘which-path’ or ‘which-slit’ information associated with γ_{2} is known, and the interference pattern seen at the screen is destroyed as a result. As with a normal double-slit experiment, the act of observation has caused the photon to pass through a single, well-defined slit.

A third polariser is placed on the path of γ_{1}, imparting the photon with a diagonal polarisation before it is recorded at D_{1}. Since γ_{1} and γ_{2} are entangled photons, their polarisations are instantaneously affected by one another. The circular polarisers at the slits are affected by the diagonal polarisation of γ_{1} and now randomly produce a mix of clockwise and anticlockwise polarisations, regardless of which slit γ_{2} has passed through. The ‘which-path’ information has now been ‘erased’, as it is no longer possible to know which slit γ_{2} has passed through, and the interference pattern on the screen reappears. Astonishingly, this effect is independent of whether ‘erasure’ happens before or after γ_{2} has passed through the slits, and can even be achieved after γ_{2} has been recorded at D_{2}!

A related experiment, called the delayed-choice quantum eraser, delays the decision as to whether to keep or erase the which-path information until after the entangled partner has been detected. The extraordinary consequence is that an event that has already happened can apparently be caused by an event which is yet to take place at some arbitrary time in the future. We have become accustomed to the mind-bending trademark of quantum phenomena; however, in these experiments, our basic understanding of a causal sequence of events comes into question.

“This isn’t right. It’s not even wrong.” – Wolfgang Pauli

In order to reconcile entanglement and the above experiments with physical causality, it is necessary to take an information perspective on the arrangements. For example, if a code were assigned to the spin direction of entangled particles (e.g. clockwise = 1, anticlockwise = 0), it would be reasonable to presume that the phenomenon could be used for faster-than-light communication. However, on closer examination it is apparent that, whilst a measurement will reveal the state of a distant particle instantaneously, due to quantum indeterminacy there is no way of knowing which spin direction the particle will actually take prior to the measurement. Therefore, in order to successfully communicate a message, a secondary signal is required to ‘unlock’ the information encoded within the states of the entangled particles. Being classical, the secondary signal must be bound by the speed of light; so whilst it is possible to observe a distant system instantaneously, no information may travel to it faster than light.

In the case of the quantum eraser experiment, the possible trajectories of a photon are defined by whether or not certain information has been recorded. The experiment illustrates how the which-path information may be erased after it has been recorded, returning the system to its original state. The delayed-choice quantum eraser further highlights the behaviour of information. In this experiment, the choice to either keep or erase the which-path information occurs after the photon has been measured, and thus there is a suggestion of retro-causality. This outcome is curiously circumvented: whilst the measurements are well defined, the path of the photon can only be deciphered retroactively. The which-path information is attained by comparing data from all the detectors in the setup, and so once again a classical signal is required to transmit information between the detectors, which is bound by the speed of light.

These experiments reveal the curious fact that the physical properties of distant quantum objects can indeed be affected nonlocally. Despite this, causality and relativity still hold because any meaningful information contained within the entangled system is, at best, unlocked at the speed of light. The subluminal boundary of information suggests a close connection between information, mass and energy. Such a relationship between information and energy has been confirmed experimentally with Szilard’s engine, a refinement of Maxwell’s demon that demonstrates how a particle is able to do work by receiving information, rather than energy.

“It is important to realize that in physics today, we have no knowledge of what energy is.” – Richard Feynman

## Algorithmic Information Theory

Born in the 1960s, algorithmic information theory (AIT) expands upon information theory by applying Shannon’s theories of information to step-by-step calculating procedures. AIT defines the amount of information in a data set as its length after maximal lossless compression, known as its Kolmogorov complexity. Put otherwise, it is the minimum length of a computer program, or algorithm, that can reproduce a message without losing any information in the process.

10101010101010101010101010101010

10010111011000101011000011010101

Consider the two strings above; both contain 32 characters of 0s and 1s. The first string is said to be compressible, as it can be rewritten as ‘16 10s’, whereas there is presumably no way of rewriting the second string more concisely. According to AIT, the first string contains less information than the second, because it is compressible into a shorter description.
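The distinction can be demonstrated with a real compressor. Kolmogorov complexity itself is uncomputable, so a tool like zlib only gives an upper bound on it, but the ordering of the two strings is still instructive:

```python
import zlib

# The two 32-character strings from the text
repetitive = b"10101010101010101010101010101010"
irregular  = b"10010111011000101011000011010101"

# zlib only upper-bounds the true Kolmogorov complexity,
# but the repetitive string clearly compresses further.
print(len(zlib.compress(repetitive, 9)))  # compresses well
print(len(zlib.compress(irregular, 9)))   # compresses less
```

The repetitive string shrinks because the compressor can encode ‘repeat the last two characters’ instead of storing each one, which is exactly the ‘16 10s’ description in mechanical form.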

11.001001000011111101101010100010

The above string isn’t obviously compressible; however, it is actually the first 32 digits of π written in binary. A short computer algorithm can easily be written that is capable of producing an indefinite number of such digits. For this reason, it would be possible to efficiently compress a large number of digits of π by writing them more succinctly as an algorithm based on Leibniz’s formula:

π/4 = 1 − 1/3 + 1/5 − 1/7 + ⋯
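This compression can be made tangible: a program of a few lines stands in for arbitrarily many digits of π. A sketch of Leibniz’s series in Python (the series converges slowly, so many terms are needed):

```python
# Leibniz's formula: pi/4 = 1 - 1/3 + 1/5 - 1/7 + ...
# The program is a handful of characters long, yet it encodes
# an unbounded number of digits of pi.
def leibniz_pi(terms):
    return 4 * sum((-1) ** k / (2 * k + 1) for k in range(terms))

print(leibniz_pi(1_000_000))  # 3.14159...
```

In AIT terms, any sufficiently long prefix of π’s digits is compressible down to roughly the length of this program plus a count of how many digits to emit.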

Algorithms are a step-by-step series of procedures that will always generate exactly the same output for a specific input. If the output of an algorithm is longer than the length of the algorithm that created it, then the output can, by definition, be compressed by rewriting it more concisely as the algorithm itself. Thus, programs are actually unable to create new information, as the information within an output is already contained within the algorithm itself. An algorithm effectively rewrites the information it already contains into a new form.

These inherent limitations on computation are not well popularised; however, they have considerable implications for mathematics, physics and philosophy. All mathematical operations can be written algorithmically and are thus bound by the same restriction. It is not possible to produce more information from an equation than is already contained within its definition. All of mathematics is achieved within a sandbox of carefully defined axioms, otherwise known as the elementary postulates or assumptions. The axioms of mathematics stand alone as the foundations for all mathematical knowledge. By definition, the axioms of mathematics cannot be derived from one another, and hence they are ultimately incompressible from the perspective of AIT. A mathematical theorem is said to be proven when it can be consistently traced back to the fundamental axioms of mathematics, and therefore all of mathematics is derivable from its axioms. Just as a computer algorithm cannot create new information, equally the theorems of mathematics cannot create new axioms. Therefore, the vast ocean of information held within mathematics is fully contained within its axioms.

## Algorithms & Freewill

Theoretically, every past and future state of a purely deterministic universe can be calculated from its present state, and hence the total information in such a universe would be constant. Freewill, the ability to choose, is a diversion from determinism and is therefore the antithesis of the algorithm. Through the ages, scientific discoveries have largely eroded the possibility of freewill in favour of a fixed causal description of reality. From an information theory perspective, an act of freewill must either involve the creation of new information or the transformation of information in a non-algorithmic manner. Hence, AIT appears to terminate freewill altogether: it forbids the creation of information by any process that can be accurately modelled by mathematics. Within the discipline of mathematics, the creation of new information would be indicated by the creation of a proposition that cannot be derived from the axioms of mathematics, namely a new axiom. Remarkably, this is exactly what humans have achieved in discovering the axioms of mathematics. It is exactly the thing that no algorithm will ever be capable of, and hence human consciousness has the ability to transcend computation.

“All theory is against the freedom of the will; all experience for it.” – Samuel Johnson

## The Arrow of Time

Is it possible to reconcile the inherent freedom of the human mind with the known laws of physics? Freewill is a discussion of creativity, of how things come into being, and as such it is helpful to look at our current understanding of time. There are only two known physical processes that are not completely time-symmetric: the second law of thermodynamics and wavefunction collapse. These irreversible processes are considered to be the source of the ‘arrow of time’, the reason that events viewed forwards and backwards in time are so distinctly different.

The second law of thermodynamics describes another quantity called entropy that was defined before Shannon’s. Thermodynamic, or physical, entropy is the amount of disorder in a system, rather than its unpredictability. Thermodynamic entropy may be seen as a measure of the number of ways the particles of a system can be arranged without changing the large-scale properties of the system. The second law states that the total thermodynamic entropy of a closed system is always increasing towards its maximum value, which corresponds to thermodynamic equilibrium. Whilst the second law does allow entropy or disorder to decrease locally, this reduction is always at the expense of some larger increase of entropy, normally in the form of heat, elsewhere in the system. The second law is statistical rather than fundamental, maintaining that there are a higher number of less organised, higher-entropy states available to the evolution of a system than more organised, lower-entropy states. For example, imagine an egg shattering on the floor; it isn’t impossible for the egg to repair itself, it’s just extremely unlikely, as there are many more broken-egg states than fixed-egg states. Thermodynamic entropy (S) and Shannon’s information entropy (H) should not be confused; however, there are many parallels between them, as seen in the similarities between their definitions:

S = −k_{B} Σ_{i} p_{i} ln p_{i}

H = −Σ_{i} p_{i} log_{2} p_{i}

The first equation is the Gibbs formula for entropy, where k_{B} is Boltzmann’s constant and p_{i} is the probability of a particular state of the system. The second equation is Shannon’s equation for information entropy, where p_{i} is the probability of a certain information string occurring. Thermodynamics is not concerned with the individual positions and velocities of particles (microstates); rather, it seeks to describe a system’s macroscopic properties, such as temperature and pressure. Macroscopic variables are derived by averaging over all the possible microstates that describe the same macroscopic system (macrostate), making it dramatically more manageable to perform useful calculations. The thermodynamic entropy (S) is simply the amount of Shannon information needed to describe the detailed microstates of a system that cannot be inferred from macroscopic variables.
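The correspondence between the two formulas can be made concrete for the simple case of W equally likely microstates, where the Gibbs formula reduces to Boltzmann’s S = k_{B} ln W and the Shannon entropy to log₂ W bits. A sketch (the function names and the toy value of W are ours):

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def gibbs_entropy(num_microstates):
    """S = -k_B * sum(p_i * ln p_i); with all p_i = 1/W this is k_B * ln W."""
    p = 1.0 / num_microstates
    return -K_B * num_microstates * (p * math.log(p))

def shannon_bits(num_microstates):
    """Bits of information needed to single out one microstate: log2 W."""
    return math.log2(num_microstates)

W = 2 ** 20  # a toy system with about a million equally likely microstates
print(gibbs_entropy(W))  # thermodynamic entropy in J/K
print(shannon_bits(W))   # 20.0 bits of missing microstate information
```

The two results differ only by the conversion factor k_{B} ln 2, the thermodynamic entropy of one bit.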

The universe is believed to have been born out of an extremely hot and dense event, in which all energy and matter were condensed at a single point. Although the known laws of physics cannot describe such a singularity, much insight has been gained regarding the moments immediately following it. As appreciated in Loschmidt’s paradox, the second law of thermodynamics actually references the state of primordial creation within its derivation. The paradox states that it should not be possible to deduce an irreversible process from time-symmetric dynamics, putting the second law at odds with the time-reversal symmetry of almost all physics. Put otherwise, the irreversible arrow of time outlined by the second law requires the assumption that the entropy of the early universe was much lower. Consequently, the persistent entropy increase described by the second law is ultimately nothing more than a boundary condition designating the initial conditions of the universe to be low entropy and well ordered.

This reflection is in agreement with cosmological observations of an expanding universe. As the universe expands, the number of degrees of freedom increases, and so the total number of possible microstates of the universe increases. Since thermodynamic entropy is a measure of the number of microstates that describe a particular macrostate, the maximum entropy of the universe is steadily increasing with expansion. The singularity represents the minimum possible degrees of freedom, and it is therefore clear that the maximum entropy of the universe was vanishingly small towards the Big Bang. Further, as Shannon entropy is defined as the number of bits required to distinguish between all possibilities, the limited freedom at the Big Bang implies an extremely low information content.

There exists an upper bound on the maximum entropy or information a spherical volume can theoretically contain, called the Bekenstein bound. The limit is derived by considering the maximum entropy an object can have without violating the second law if it were dropped into a black hole:

S ≤ 2π k_{B} E R / (ħ c)

Where k_{B} is Boltzmann’s constant, E is the total mass-energy, R is the radius of the sphere, ħ is the reduced Planck constant and c is the speed of light. Evidently, as the universe expands, its maximum possible thermodynamic entropy increases; however, applying this line of reasoning to the evolution of physical information in the universe is not so straightforward. There is a strong insinuation from quantum theory that information is a conserved quantity, and yet there is puzzling cosmological evidence to suggest otherwise. In quantum theory, the sum of the probabilities of all possible events is considered to be exactly equal to 1; a cornerstone postulate known as unitarity. A fundamental implication of unitarity is that complete information about a physical system is encoded within its wavefunction, which implies that information is conserved. This result is counterintuitive in the light of an expanding universe and the Bekenstein bound, which suggest that the information storage capacity of the universe should be increasing with time. Furthermore, the ‘black hole information paradox’ only adds to the conundrum, as seen in the tension it has created within the physics community over recent decades [1].
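To get a feel for the scale of the bound, it can be evaluated numerically and converted from thermodynamic units into bits by dividing by k_{B} ln 2. The example object (a 1 kg sphere of radius 0.1 m) is purely illustrative:

```python
import math

K_B  = 1.380649e-23     # Boltzmann's constant, J/K
HBAR = 1.054571817e-34  # reduced Planck constant, J s
C    = 299792458.0      # speed of light, m/s

def bekenstein_bound_bits(energy, radius):
    """Maximum information content in bits: S_max = 2*pi*k_B*E*R / (hbar*c),
    converted to bits by dividing by k_B * ln(2)."""
    s_max = 2 * math.pi * K_B * energy * radius / (HBAR * C)
    return s_max / (K_B * math.log(2))

# Illustrative: a 1 kg sphere of radius 0.1 m, taking E = m c^2
print(bekenstein_bound_bits(1.0 * C ** 2, 0.1))  # ~2.6e42 bits
```

Even this modest object could in principle hold an astronomically large amount of information, which makes the claim that the bound once confined the whole universe all the more striking.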

“It is entirely possible that behind the perception of our senses, worlds are hidden of which we are unaware.” – Albert Einstein

The apparent contradiction of information conservation is a direct expression of the greatest challenge facing physics to date; the fundamental incompatibility of quantum field theories and gravity. However, according to Roger Penrose, loss of unitarity in quantum systems is not actually a problem; Penrose claims that as soon as gravitation is included quantum systems do not evolve unitarily anyway.

Despite the second law’s stipulation that disorder increase, the universe is actually filled with an intricate web of organisation that appears to be at odds with entropy increase. A number of explanations have been proposed for the emergence of complex information structures and the mechanism behind the apparent decrease in entropy that accompanies them. Léon Brillouin described the phenomenon as negentropy (negative entropy), a quantity that has been closely associated with Shannon information. The second law actually allows entropy to decrease locally, provided the net evolution is a greater increase in entropy elsewhere in the system. The astrophysicist David Layzer has expanded this view by illustrating the problem in terms of maximum entropy.

Unlike popular theories that lock space and time into a predetermined geometry, Layzer’s model reflects a universe that is continually in the process of creating itself [2]. According to Layzer, as the universe expands, its maximum entropy increases faster than its actual entropy. The result is a tug of war between the second law, which seeks to increase entropy, and cosmological expansion, which increases the maximum possible entropy. This dynamic tension is regularly described in terms of the entropic force of gravity and expansion, which prevents the universe from falling directly into thermodynamic equilibrium. The difference between the maximum entropy and the actual entropy (negative entropy) allows freedom for the creation of new information, without violating the second law. The important question in the light of such theories is: where could all the new information be coming from? As has been shown by AIT, there is no algorithm or mathematical operation capable of creating new information; therefore, a physical process with such an ability would have to be inherently non-algorithmic. Is such a phenomenon observed in physics?

## Quantum Indeterminacy

A central premise of quantum theory is that the physical quantities of a particle or system, such as position, are undefined until they are measured or ‘observed’. Prior to measurement, the particle is in a simultaneous mixture of all theoretically possible configurations, known as a superposition of states. The information about these possibilities is encoded in a mathematical entity called the wavefunction, which evolves deterministically like a classical wave. Upon measurement, the superposition of states is destroyed and the particle assumes a single state, an irreversible process known as wavefunction collapse. Whilst the wavefunction contains the relative probability of each potential state occurring, it says nothing about what state will actually be observed. In contrast to the deterministic evolution of the wavefunction, the collapse of the wavefunction is fundamentally non-deterministic, as there is no way of knowing exactly what state will be measured. For example, if a particle is in a superposition of two equally probable states, it will randomly adopt one of these states each time it is measured. This situation corresponds to maximum uncertainty and therefore maximum Shannon entropy. Many interpretations of quantum theory have been formalised with the intention of explaining indeterminacy away as a mere lack of knowledge; however, these have failed in the light of Bell test experiments [3]. Profoundly, there is an inherent amount of randomness at the heart of reality, which is in one way or another choosing the state of the universe.

The production of random numbers is extremely important in the field of computer science for encryption, among other tasks. Given the deterministic nature of computation, random numbers generated by a computer program cannot, by definition, be truly random. Hence, algorithms that are capable of producing the appearance of random numbers are called pseudorandom number generators (PRNGs). PRNGs operate by adding layers of complexity to the algorithm’s input, known as its seed; however, for a given seed the same output is always produced. Like the creation of a new axiom, the production of true randomness transcends computation.
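The determinism of a PRNG is easy to demonstrate: re-seeding with the same value reproduces the identical ‘random’ sequence, exactly as the text describes. A sketch using Python’s standard library generator:

```python
import random

# A PRNG's entire output is fixed by its seed: seeding twice with the
# same value yields the same "random" sequence both times.
random.seed(42)
first_run = [random.random() for _ in range(5)]

random.seed(42)
second_run = [random.random() for _ in range(5)]

print(first_run == second_run)  # True: the sequence was never random
```

The whole sequence is, in AIT terms, compressible down to the generator’s algorithm plus the seed, so it can contain no more information than those two things combined.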

“Anyone who considers arithmetical methods of producing random digits is, of course, in a state of sin.” – John von Neumann

Whilst randomness is normally considered to be a lack of information, true randomness actually represents the maximum possible Shannon information. This is because Shannon entropy is a measure of unpredictability, which is maximal in the case of randomness. In AIT, a Kolmogorov-random string is defined as one that is shorter than any program that can create it, and is thus entirely incompressible. A number of strict definitions have been given to randomness; however, some issues remain in these formalisations. For example, there are various methods that can computationally test for randomness, yet in reality true randomness should be capable of producing all possibilities and therefore shouldn’t be confined to a certain type of behaviour. Despite this, there are many characteristics of randomness that are well understood. The distribution of a sufficiently large sample of random numbers is expected to be uniform, meaning that each number should appear roughly an equal number of times. Further, random sequences should not repeat themselves, and therefore true randomness actually implies an infinite expansion of information. The enigmatic nature of randomness is seen reflected in the words of David Bohm, who described it as an “order of infinite degree” [1].

The desire to explain reality in purely mechanical terms is inadequate in the light of the indeterministic process that underpins quantum phenomena. The generation of true randomness is of a nature beyond computation, posing a potential avenue for the creation of information in the universe. More broadly, regardless of whether or not information is a conserved quantity, the stochastic phenomenon of wavefunction collapse represents the manipulation of quantum information in an entirely meta-algorithmic manner.

This is not new knowledge; quantum indeterminacy has been recognised in physics for the best part of a century. However, due to our inability to get behind the phenomenon of wavefunction collapse, the focus has been primarily on the mechanics of wavefunction evolution. In fact, much of the attention devoted to the study of indeterminacy has been to refute its very existence. Modern physics has become comfortable with indeterminacy by taking a probabilistic approach to quantum theory, brushing uncertainty under a carpet of statistics. Quantum experiments may be repeated an enormous number of times, revealing unprecedented degrees of correlation between theory and observation and making quantum theory the most validated theory in physics. From this perspective, the law of large numbers takes care of the uncertainty and drastically simplifies the phenomena by making them mathematically manageable. The motivation is to observe how a system correlates to the statistical expectation for large sample sizes. Whilst this method is extremely powerful, it has the effect of blurring out the details of individual measurements. This is deemed acceptable because quantum decoherence is understood to prevent these variations from having an effect on the large-scale properties of the system. However, this is not a universally supported belief, and Karl Popper, among others, has developed alternative ‘propensity’ theories of statistics that attempt to bring the individual event back into context. Unfortunately these theories have gathered little support from the mainstream, as they are inherently difficult to interpret.

## Sensitivity & Indeterminism

In the limit that the number of particles in a system approaches infinity, the statistical averages of the system become the laws of classical physics. Hence, according to quantum decoherence, the dominance of statistical averages renders individual quantum measurements insignificant for large sample sizes. Therefore, despite the mysteries of indeterminacy, what possible effect can it have on macroscopic events? This question remains highly debated, however the field of dynamical systems may illuminate the enquiry.

Since the birth of the computer, it has been possible to analyse non-linear equations which were otherwise unsolvable. Mathematicians quickly discovered that seemingly simple non-linear equations can produce incredibly complex solutions, a property called deterministic chaos. Edward Lorenz, a pioneer of chaos theory, summarised it concisely: “Chaos: When the present determines the future, but the approximate present does not approximately determine the future.” Lorenz is referring to the premise that, whilst chaotic systems are deterministic, the slightest alteration in initial conditions can have a tremendous consequence on the evolution of the system. Sensitivity to initial conditions is a hallmark characteristic of chaos, crystallised in the image of the butterfly effect.
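Sensitivity to initial conditions can be exhibited in a few lines with the logistic map, a textbook one-dimensional chaotic system (the parameter values and perturbation size below are illustrative):

```python
# The logistic map x -> r * x * (1 - x) is chaotic at r = 4.
def logistic_orbit(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_orbit(0.2)
b = logistic_orbit(0.2 + 1e-12)  # perturb the 12th decimal place

# Early on the two orbits are indistinguishable; after ~50 steps the
# tiny perturbation has been amplified until they bear no resemblance.
print(abs(a[5] - b[5]))    # still minuscule
print(abs(a[-1] - b[-1]))  # vastly larger
```

The present fully determines the future here, yet the approximate present visibly fails to determine even the approximate future, just as Lorenz describes.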

From molecular interactions to stellar orbits, non-linear dynamics and chaotic behaviour are exceedingly common in the natural world. Just like the butterfly’s wings and the tornado, chaos may act to amplify quantum indeterminacy into the macroscopic world [2]. Over the past two decades, a new discipline of physics has been emerging that unites quantum mechanics and chaos theory into the single field of quantum chaos. The founding question of quantum chaos is clear: what effect does classical chaos have on quantum mechanics? As it happens, chaos appears to be as common in the quantum world as it is in the classical. The well-behaved hydrogen atom has long been revered by quantum physicists for its simplicity; a single electron orbits a single proton. Yet the electron orbitals of a hydrogen atom become chaotic in the presence of a simple magnetic field. Although the study of quantum chaos remains in its infancy, much insight has already unfolded. For example, the quantum equivalents of classically chaotic systems are actually non-chaotic and, likewise, classically well-ordered systems can become chaotic at the quantum level.

As it happens, our universe is unfathomably sensitive to initial conditions, a fact best illustrated by the following thought experiment from Michael Berry. Consider a box filled with oxygen gas and imagine that it is possible to trace the path of a single oxygen molecule as it moves around the container, colliding with other molecules billions of times a second. Next, the observation is repeated, only this time with a slight change to the initial conditions of the system: a single electron is placed at the edge of the visible universe. With the intention of minimising the disturbance, only gravity, the weakest of the fundamental forces, is considered. The electron is 1836 times lighter than a proton and has been positioned some 13.7 billion light years away, and all the forces bar gravity have been ignored, equating to a minute force of approximately 10^{-118} N. Berry asks how many collisions the oxygen molecule will need to experience before its direction is 90° away from the path it would have taken had the electron not been added. Given this inconceivably small perturbation, it is difficult to imagine that the electron will have any effect; however, in as little as 50 collisions the orientation of the oxygen molecule will have changed by 90°! [6]

“A physicist is just an atom’s way of looking at itself.” – Niels Bohr

## Biological Implications

The above example reflects the remarkable interconnectedness of two microscopic systems separated over astronomical distances; there truly are no isolated systems. However, in order for chaos to amplify indeterminacy, sensitivity needs to be expressed across magnitudes of scale, between the quantum and classical worlds. The relationship between biological forms is an unmistakable example of this type of sensitivity. Biological life is extraordinarily sensitive to initial conditions; consider the effect that altering the genetic code can have on a developing embryo.

Since genetic mutation is a fundamental principle in our understanding of evolution, this should not come as a surprise. In the theory of evolution, new biological traits are the result of random mutations at the genetic level, but what exactly is meant by ‘random’? As previously illustrated, true randomness represents a deeply metaphysical process that is intrinsically beyond the deterministic paradigm. Among the causes of random mutation is ionising radiation produced by radioactive decay: the spontaneous emission of energetic radiation from an unstable element. Radioactive decay is an inherently random process rooted in quantum indeterminism; it is not possible to know when a radioactive nucleus will decay, only how statistically likely it is to do so. Similar reasoning applies to other sources of genetic modification, which are essentially molecular interactions occurring within the domain of quantum indeterminism.
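The statistical character of decay can be made concrete with a small simulation. Physics specifies only the distribution of decay times (exponential, fixed by the half-life); the moment any individual nucleus decays is irreducibly random. A sketch, with an arbitrary illustrative half-life rather than any particular isotope:

```python
import math
import random

# Radioactive decay is memoryless: decay times follow an exponential
# distribution whose mean lifetime tau is set by the half-life.
random.seed(42)                 # seeded only so the sketch is repeatable
half_life = 10.0                # arbitrary time units, illustrative value
tau = half_life / math.log(2)   # mean lifetime

# Sample decay times for a large ensemble of identical nuclei.
decay_times = [random.expovariate(1 / tau) for _ in range(100_000)]

# The ensemble average converges on tau, yet no individual decay time
# can be predicted -- the distribution is all that physics provides.
mean_time = sum(decay_times) / len(decay_times)
print(round(mean_time, 2), round(tau, 2))
```

The contrast is the essay’s point in miniature: the aggregate is lawlike and predictable, while each individual event remains beyond deterministic description.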

The implications are clear: the theory of evolution is fundamentally rooted in a principle that lies beyond not only human understanding, but mathematics as a whole. Evidently, there is cause for further research regarding the alteration of genetic information and the creation of new biological traits in the light of AIT. Moreover, the analogy of a deterministic machine is an unmistakably obstructive and misleading metaphor for evolution and the universe as a whole.

Synthesis

Contemporary physics has revealed an information description of reality that appears to be as significant to our understanding of the universe as the unification of energy and mass. The exploration of the quantum realm continues to unveil the deeply paradoxical nature of the universe, drawing into question basic notions of reality such as locality and causality. Simultaneously, information theory has taken the concept of freedom to the brink of impossibility, laying out a conservation law that forbids all of mathematics from creating new information. Yet the very creation of mathematics invalidates any claim that humanity is bound by such a conservation law. Humans have achieved the one thing that the algorithmic computer will never be capable of: the creation of new axioms.

Information theory and thermodynamics appear to be inseparably connected, and yet the unitarity of quantum physics suggests that information is a conserved quantity. The paradox of information conservation continues to be hotly debated, though there are profound philosophical implications regardless of whether or not conservation holds. Despite the governance of the second law, which perpetually increases disorder, the universe has woven itself into an exceptionally complex web of intricate structures. Much of this organisation can be accounted for as an emergent phenomenon, arising locally as the universe as a whole descends into thermodynamic equilibrium or ‘heat death’. These deterministic explanations are, however, unable to illuminate the mystery of creativity, exemplified in humanity’s creation of mathematical axioms.

A description of reality that is inclusive of creativity must be expansive enough to encompass principles beyond the strict determinism of mathematics. Quantum physics is founded on intrinsically indeterministic phenomena; however, the significance of this uncertainty has been disregarded in favour of statistical reasoning. Although many attempts have been made to reinstate physical determinism by reducing wavefunction collapse to a mere lack of knowledge, these ‘hidden variable’ theories have all failed in the light of Bell tests. The quantum field appears to propagate deterministically as a field of potential, whilst simultaneously maintaining the freedom to evolve and define itself non-deterministically. The manipulation of the information held in quantum states through the process of wavefunction collapse is beyond not only our understanding, but the abilities of both mathematics and computation.

The sensitivity of chaotic systems offers a potential route for the amplification of quantum indeterminism to macroscopic states. Chaos is commonplace in quantum systems, and the latest theories are maturing towards a new physics that unites quantum theory and dynamical systems. Whilst the implications of quantum chaos are not well understood, the relationship between genetics and biological organisation provides a clear example of sensitivity between the quantum and classical worlds. The theory of evolution depends on alteration of the genetic code by way of random mutations, yet the theory does not encompass the mechanism behind this process. Random mutations arise in numerous ways, such as through ionising radiation from radioactive decay, an innately indeterministic process. The impossibility of replicating true randomness algorithmically raises challenging questions with regard to causality; indeed, the very foundations of evolutionary theory rest on metaphysical ground.

To close: since the creative potential of the human mind to conjure the axioms of mathematics is the one macroscopic phenomenon that irrefutably transcends determinism, would it not be more intuitive to perceive quantum indeterminism in relation to consciousness rather than matter?

“I regard consciousness as fundamental. I regard matter as derivative from consciousness. We cannot get behind consciousness. Everything that we talk about, everything that we regard as existing, postulates consciousness.” – Max Planck

[1] Time, “Hawking: Black Holes Do Exist. Now the Complicated Part,” 30 01 2014. [Online]. Available: http://science.time.com/2014/01/27/black-holes-hawking/. [Accessed 30 01 2014].

[2] D. Layzer, “The Arrow of Time,” Scientific American, 1975.

[3] P. G. Kwiat et al., “Ultra-bright source of polarization-entangled photons,” Physical Review, 1999.

[4] D. Bohm and F. D. Peat, Science, Order and Creativity, Bantam Books, 1987.

[5] R. Kane, The Oxford Handbook of Free Will: Second Edition, Oxford University Press, 2011.

[6] M. Berry, A Passion for Science, Oxford University Press, 1988.
