Tuesday, August 15, 2017

Notes on a Cosmology - Part 17, Cracks in the Standard Cosmology

Before we continue further down the rabbit-hole of the Simulation Hypothesis, we need to stop and talk about the standard cosmology. In most contexts, "standard cosmology" is synonymous with the Big Bang. The Big Bang can certainly be criticized on many different accounts - and bolstered on others - but we are after bigger prey in this post.

Cosmology is, originally, a subject of metaphysics and should be understood, in particular, as a subject of ontology. Metaphysics can be defined several different ways but I will define it this way: metaphysics is that part of philosophy that has to do with deciding on a framework for answering questions about the nature of reality. For example, is the Universe (that is, the observable world, or just the world) infinitely old or finitely old? Ontology is the part of philosophy that is concerned with deciding what has existence and what does not, and how existing things are related. For example, do numbers exist? Do quantum particles exist when we are not observing them?

Questions of metaphysics are, by their very nature, not a part of science. Otherwise, we would refer to them as "physics" or just "science". The same is true of ontology. The key is to realize that questions of metaphysics can easily be dressed up as questions of science. Some people believe that the Big Bang proves the age of the world. In fact, no scientific theory can tell us the age of the world. The question of the age of the world is a metaphysical question. The world is either finitely old or it is infinitely old. No scientific test could ever decide this question. Thus, you must choose one or the other alternative as the framework in which you will do science.

To see why this is the case, imagine that the world will eventually stop expanding and collapse back to the singularity from which it began - this is sometimes called the Big Crunch or another clever name. It is conceivable that there was a Big Crunch before the world we know began, and that after this world ends in a Big Crunch, there will be another Big Bang immediately thereafter. The following image depicts such a series of expansions and contractions (time moving downward):

Credit: Infinity and the Mind, Rudy Rucker[1]
One might argue that some theories show that expansion will never stop, but the preference for an ever-expanding Universe over an expanding-and-collapsing Universe can never be based on empirical evidence. In short, we cannot prove by any scientific means that the age of the Universe is finite or infinite. Likewise, we cannot prove by any scientific means that the Universe will go on forever, or that it will stop. Such questions are strictly beyond the purview of science and lie in the metaphysical realm of cosmology.

There are many other questions that might seem to be in the realm of science but are properly metaphysical questions. For example, does a quantum particle exist when we are not observing it? This is not a question of science because science is the study of the observable world. By definition, a quantum particle that is not being observed is not part of the observable world. The question of whether it exists is in the same philosophical category as the question of the existence of numbers. Whichever you choose to believe (that it exists, or it does not), your choice is not the result of science but, rather, is a precondition to how you do science.

The preceding remarks have been a preface to specific examples of cracks in the standard cosmology. Just because cosmology is metaphysics does not mean that all cosmological positions are created equal. We require of a cosmology that it seem to us to be correct, that is, true-as-such. Of course, we are not talking about the kind of truth that can be proven, either by logic or evidence. Rather, we are talking about a weaker form of truth, the kind that appeals to that undefinable quality of human reason that we call intuition or aesthetic sense. Our preference for such cosmologies is not arbitrary. First, Nature manifestly prefers symmetries and this agrees with our aesthetic sense. Second, by the principle of parsimony, we prefer simpler (more elegant) explanations to more complex explanations, all else equal. We prefer them so strongly that we have gone to a great deal of work to define exactly what we mean by "simpler" and "more complex". Third, while intuition is no guarantee of success, it has led to spectacular successes throughout the history of human science. Finally, we have no other stronger tools available to us - we are operating at the very edges of human knowledge and understanding.
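The standard formalization of "simpler", for the record, is algorithmic (Kolmogorov) complexity: relative to a fixed universal computer U, the complexity of an object x is the length of the shortest program that produces it,

```latex
K_{U}(x) = \min \{\, |p| \;:\; U(p) = x \,\}
```

so one explanation is "simpler" than another precisely when it is a shorter program reproducing the same observations.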

New Evidence and Paradigm Shifts

In 2008, scientists at Stanford and Purdue University found a statistical anomaly in recorded radioactive decay rates for several specific radioactive elements[2]. These are not the kind of data that can be easily waved away - the statistical anomalies are present in data gathered at many points across the Earth, across a span of years, and with some of the most highly calibrated equipment on the planet. The discovery sparked off a minor storm of controversy in the scientific community[3], with some physicists bolstering the original findings and others concluding that the whole thing is a gigantic misunderstanding[4]. From the original paper:
Unexplained periodic fluctuations in the decay rates of ³²Si and ²²⁶Ra have been reported by groups at Brookhaven National Laboratory (³²Si), and at the Physikalisch-Technische Bundesanstalt in Germany (²²⁶Ra). We show from an analysis of the raw data in these experiments that the observed fluctuations are strongly correlated in time, not only with each other, but also with the distance between the Earth and the Sun. Some implications of these results are also discussed, including the suggestion that discrepancies in published half-life determinations for these and other nuclides may be attributable in part to differences in solar activity during the course of the various experiments, or to seasonal variations in fundamental constants.
The kerfuffle is over the claim that the radioactive decay rates of these elements are not constant and "may be attributable in part to differences in solar activity during the course of the various experiments, or to seasonal variations in fundamental constants." Either possibility would deal a fatal blow to the foundations of the modern theory of radioactivity. In the status quo theory, radioactivity is solely a function of the internal configuration of an element - its mass number and the resultant arrangement of the nucleus and electron shells. If something outside of the element itself can influence the radioactive decay rate - some kind of influence from the Sun, for example - then the theory of radioactivity would have to be rewritten from the ground up to incorporate this new variable, whatever it is.
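To see what kind of claim is actually being tested, here is a minimal sketch of the analysis in question: fit an annual sinusoid to normalized decay-rate measurements and ask whether its amplitude is distinguishable from zero. The data below are synthetic, and the 0.1% modulation amplitude is an assumption for illustration, not a figure from the paper:

```python
import numpy as np

# Synthetic daily measurements over four years, normalized to their mean.
# The 0.1% annual modulation is an assumed illustrative amplitude.
rng = np.random.default_rng(0)
t = np.arange(4 * 365, dtype=float)                  # time in days
signal = 1.0 + 0.001 * np.cos(2 * np.pi * t / 365.25)
data = signal + rng.normal(0.0, 0.0005, t.size)      # add measurement noise

# Least-squares fit of an offset plus an annual cosine/sine pair.
X = np.column_stack([np.ones_like(t),
                     np.cos(2 * np.pi * t / 365.25),
                     np.sin(2 * np.pi * t / 365.25)])
coef, *_ = np.linalg.lstsq(X, data, rcond=None)
amplitude = np.hypot(coef[1], coef[2])
print(f"fitted annual amplitude: {amplitude:.5f}")   # ~0.001 if modulation is real
```

The dispute in the literature is, in effect, over whether the real-world analogue of `amplitude` is a genuine solar signal or an artifact of seasonal instrument effects.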

The theory of radioactivity is very old - as scientific theories go - and well-established. It works extremely well. That is, scientists are able to make all sorts of correct calculations using the theory. For these reasons, atomic physicists are naturally highly reluctant to scrap the theory on the first sign of weakness. But this is precisely the price of a rigorous commitment to the classical discipline of the scientific method - when a scientific theory does not work (does not match the observed phenomena), it is thrown on the scrap-heap and a new theory that does work is put in its place. Ever since the universal adoption of the scientific method, it has often been harder to implement this discipline in practice than it is to explain it in theory.[5]

This topic has been written about extensively - it is termed the problem of paradigm shift. Many reasons can be given for why it happens. Established scientists and professors are reluctant to see a lifetime's worth of work consigned to the trash heap and are liable to keep defending the obsolete theory even after overwhelming evidence against it has accumulated. Scientific specialization also plays a role, fracturing the kind of interdisciplinary thinking that is required for breaking standing paradigms and progressing towards more holistic ways of thinking.

Quantum Physics or Relativity Theory - Which One is Correct?

One of the most important cracks in modern cosmology is the unresolved conflict between the quantum and relativistic theories. Quantum mechanics and relativity theory are, ultimately, incompatible, a point that Einstein himself realized[6]. Brian Greene wrote about this conflict and how research in superstring theory has been driven by the realization, on the part of many physicists, that this fundamental cosmological problem has to be addressed before there can be a unified field theory:
The incompatibility between general relativity and quantum mechanics becomes apparent only in a rather esoteric realm of the universe. For this reason you might well ask whether it's worth worrying about. In fact, the physics community does not speak with a unified voice when addressing this issue. There are those physicists who are willing to note the problem, but happily go about using quantum mechanics and general relativity for problems whose typical lengths far exceed the Planck length, as their research requires. There are other physicists, however, who are deeply unsettled by the fact that the two foundational pillars of physics as we know it are at their core fundamentally incompatible, regardless of the ultra-microscopic distances that must be probed to expose the problem. The incompatibility, they argue, points to an essential flaw in our understanding of the physical universe. This opinion rests on an unprovable but profoundly felt view that the universe, if understood at its deepest and most elementary level, can be described by a logically sound theory whose parts are harmoniously united. And surely, regardless of how central this incompatibility is to their own research, most physicists find it hard to believe that, at rock bottom, our deepest theoretical understanding of the universe will be composed of a mathematically inconsistent patchwork of two powerful yet conflicting explanatory frameworks. Physicists have made numerous attempts at modifying either general relativity or quantum mechanics in some manner so as to avoid the conflict, but the attempts, although often bold and ingenious, have met with failure after failure. That is, until the discovery of superstring theory.[7]
Greene goes on to make the case that superstring theory will one day be able to unite quantum mechanics and relativity theory, a feat that superstring theory has still not achieved.

Naturally, physicists tend to look at the cracks in cosmology from the perspective of physics. However, modern physics is intricately married to higher mathematics and the cracks in modern mathematics go very deep, as we explored in Part 13. By extension, the cracks in cosmology are much deeper than many physicists realize. Gregory Chaitin explains the connection between uncomputable real numbers, the continuum and problems in modern physics[8]:
How do you cover all the computable reals? Well, remember that list of all the computable reals that we just diagonalized over to get Turing's uncomputable real? This time let's cover the first computable real with an interval of size ε/2, let's cover the second computable real with an interval of size ε/4, and in general we'll cover the Nth computable real with an interval of size ε/2^N. The total length of all these intervals (which can conceivably overlap or fall partially outside the unit interval from 0 to 1), is exactly equal to ε, which can be made as small as we wish! In other words, there are arbitrarily small coverings, and the computable reals are therefore a set of measure zero, they have zero probability, they constitute an infinitesimal fraction of all the reals between 0 and 1. So if you pick a real at random between 0 and 1, with a uniform distribution of probability, it is infinitely unlikely, though possible, that you will get a computable real.
Uncomputable reals are not the exception, they are the majority! The individually accessible or nameable reals are also a set of measure zero. Most reals are un-nameable, with probability one...
So if most individual reals will forever escape us, why should we believe in them? Well, you will say, because they have a pretty structure and are a nice theory, a nice game to play, with which I certainly agree, and also because they have important practical applications, they are needed in physics. Well, perhaps not! Perhaps physics can give up infinite precision reals! How? Why should physicists want to do that?
There are actually many reasons for being skeptical about the reals, in classical physics, in quantum physics, and particularly in more speculative contemporary efforts to cobble together a theory of black holes and quantum gravity. 
First of all, as my late colleague the physicist Rolf Landauer used to remind me, no physical measurement has ever achieved more than a small number of digits of precision, not more than, say, 15 or 20 digits at most, and such high-precision experiments are rare masterpieces of the experimenter's art and not at all easy to achieve. 
This is only a practical limitation in classical physics. But in quantum physics it is a consequence of the Heisenberg uncertainty principle and wave-particle duality (de Broglie). According to quantum theory, the more accurately you try to measure something, the smaller the length scales you are trying to explore, the higher the energy you need (the formula describing this involves Planck's constant). That's why it is getting more and more expensive to build particle accelerators like the one at CERN and at Fermilab, and governments are running out of money to fund high-energy physics, leading to a paucity of new experimental data to inspire theoreticians. 
... 
So perhaps continuity is an illusion, perhaps everything is really discrete. There is another argument against the continuum if you go down to what is called the Planck scale. At distances that short, our current physics breaks down because spontaneous fluctuations in the quantum vacuum should produce mini-black holes that completely tear spacetime apart. And that is not at all what we see happening around us. So perhaps distances that small do not exist.
... 
Whether or not quantum computers ever become practical, the workers in this highly popular field have clearly established that it is illuminating to study sub-atomic quantum systems in terms of how they process qubits of quantum information and how they perform computation with these qubits. These notions have shed completely new light on the behavior of quantum mechanical systems.
Furthermore, when dealing with complex systems such as those that occur in biology, thinking about information processing is also crucial. As I believe Seth Lloyd said, the most important thing in understanding a complex system is to determine how it represents information and how it processes that information, i.e., what kinds of computations are performed. 
And how about the entire universe, can it be considered to be a computer? Yes, it certainly can, it is constantly computing its future state from its current state, it's constantly computing its own time-evolution! And as I believe Tom Toffoli pointed out, actual computers like your PC just hitch a ride on this universal computation. 
[end quote]
These are questions that go far beyond anything that can be answered with empirical evidence. Is the Universe continuous or is it discrete? Quantum physics says "both." Relativity theory says "continuous, but you can get away with treating it as discrete under some conditions." The limits of mathematics tell us that, unless the Universe utilizes an infinite amount of information in every volume of space (however small), space cannot be smooth in the sense of a one-to-one mapping between R^3 and physical space. Cosmology tells us that empirical evidence cannot decide these questions - whether the Universe utilizes infinite or finite information to describe space is a question of metaphysics, not math or physics.
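For reference, the covering argument in Chaitin's quotation reduces to a one-line geometric series: cover the Nth computable real with an interval of length ε/2^N, and the total length covered is at most

```latex
\sum_{N=1}^{\infty} \frac{\varepsilon}{2^{N}}
  = \varepsilon \left( \tfrac{1}{2} + \tfrac{1}{4} + \tfrac{1}{8} + \cdots \right)
  = \varepsilon ,
```

and since ε can be made as small as we like, the computable reals have measure zero, exactly as claimed.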

Dark Matter and Dark Energy

From Wikipedia,
[Dark Matter] does not emit or interact with electromagnetic radiation, such as light, and is thus invisible to the entire electromagnetic spectrum. Although dark matter has not been directly observed, its existence and properties are inferred from its gravitational effects such as the motions of visible matter, gravitational lensing, its influence on the universe's large-scale structure, on galaxies, and its effects on the cosmic microwave background.
Prior to the development of relativistic physics, aether theories were proposed to explain light-waves and electromagnetic waves. The physical intuition of a luminiferous aether is straightforward - light is the rippling or waving of an aetheric medium in exactly the same way that mechanical waves in air or water are the rippling or waving of the media of air or water, respectively. If this is the case, then we can translate the well-developed mathematics of mechanical waves from the theory of mechanics to the theory of electromagnetism as a way to unify light, electromagnetism and - ideally - gravity.

Relativity theory essentially banishes the aether, making light a wave that can be thought of as the fundamental metric of space and time. Light is a wave, but there is no medium through which this wave is travelling. Rather, the light wave (that is, its speed) defines distances in space and time. The result is that distances in space and time become relative to the speed of light. The mathematics of these ideas did not originate with Einstein - Hermann Minkowski developed most of the mathematics that we know today as relativistic spacetime. Minkowskian spacetime, in turn, can be thought of as an application of non-Euclidean geometry to physics.
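Concretely, the structure Minkowski introduced is the invariant spacetime interval, in which the speed of light c acts as the conversion factor between time and space (written here in the -+++ sign convention):

```latex
ds^{2} = -c^{2}\,dt^{2} + dx^{2} + dy^{2} + dz^{2}
```

All inertial observers agree on ds², even though they disagree about dt and dx individually - this is the precise sense in which the light wave, via its speed, defines distances in space and time.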

After the development and refinement of relativistic physics, earlier aether theories came to be scorned as an example of inventing physical entities for the purpose of facilitating a mathematical theory. While it would be quite nice to be able to repurpose the wave equations of mechanics to describe the wave phenomena of light and electromagnetism, this is just a theoretical convenience. The job of science is to explain the phenomena as they are, not as we would wish them to be. The existence of a luminiferous aether was a reasonable hypothesis prior to the development of relativistic physics but its rejection was well-justified after a better model emerged.

Dark Matter and Dark Energy are hypothetical entities with the added complication that their existence is extremely difficult to establish through any physical experiment. By construction, dark matter can only be inferred through its gravitational effects, leaving little or no room for laboratory work. Even worse, dark matter and dark energy together are posited to make up roughly 95% of the mass-energy content of the Universe. In short, it is difficult to see how dark matter is anything more than a just-so hypothesis whose purpose is to salvage a broken theory. From PlasmaCosmology.net:
Within the limited confines of our own backyard, the Solar System, existing gravitational models seem to be holding-up. We have succeeded in sending probes to neighbouring planets ... the Huygens mission recently scored a spectacular success -- landing on Titan, a moon of Saturn, despite unexpected atmospheric conditions. 
It should be noted, however, that [gravity] models begin to break down when we look further [afield]. Gravity, of course, is generally described as a property of mass. The trouble is that we have not discovered enough mass in our own galaxy, The Milky Way, to account for its fortunate tendency not to disintegrate.
The existence of mysterious Dark Matter is hypothesised to account for this shortfall in mass [among other things]... Its existence is only inferred on the basis that [gravity] models 'must be' correct. The alternatives raise too many uncomfortable questions!
Dark Matter is no small kludge factor -- it is alleged to account for between 20% and 99% of the universe, depending on which accounts you read! This has led to further problems in relation to expansion models, and another hypothetical, Dark Energy, has been invented to overcome these. In summation, Dark Matter and Dark Energy add up to the blank cheques that postpone the falsification of bankrupt theories.
Gravity, Causality and Black Holes

Newton's formulation of the law of gravitation is non-causal. Even though the law enables us to calculate the magnitude of the gravitational force between two bodies, it does not tell us what causes this force. In itself, this is no fault - sometimes, the most that science can do is tell us how one variable correlates to another variable, without knowing why. In electronic systems, there is a similar kind of examination called characterization of an electronic component or circuit. We can characterize a circuit as a "black box", meaning, we do not know (or maybe we know, but we don't care) what the internal circuit looks like. All we care about is the circuit's response to varying input stimuli.
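Newton's law itself makes the point. It is a complete recipe for the magnitude of the gravitational force between two masses m₁ and m₂ separated by a distance r, with not a single symbol devoted to mechanism:

```latex
F = G \, \frac{m_{1} m_{2}}{r^{2}}
```

The formula characterizes the "black box" perfectly - given the masses and the separation, out comes the force - while saying nothing about what is inside.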

Einstein's general theory of relativity connects space and matter in such a way that the presence of matter alters the curvature of space. This change in the curvature of space is sometimes taken to be the cause of gravity. However, this is a mistake of reasoning. Einstein's theory tells us how matter, space (and acceleration) are related, but it still does not tell us why. Like solving a triangle, if we know some information about a gravitational system, we can calculate other information for which we do not have measurements. But this does not give us a causal theory that utilizes physical reasoning in the way that, say, Galileo's derivation of the law of inertia did.

One thing that is easy to notice from an information-theoretic perspective, and may be harder to see from other approaches, is that the action of gravity - which is obviously real - superficially contradicts the second law of thermodynamics. If we squint and imagine space as a "clumpy gas/dust cloud", the second law of thermodynamics (also known as the law of entropy) dictates that this gas and dust will eventually spread into an evenly distributed gas of constant pressure and temperature. This end state is referred to as the heat death of the Universe.

The fact that we observe clumps of matter shows that there is something that is counteracting (though, obviously, not contradicting) the second law of thermodynamics. A refrigerator is an example of a heat engine that can locally counteract the effect of the second law of thermodynamics, cooling one region of space by expelling heat into the surrounding environment. While spontaneous formation of a refrigerator is statistically improbable, it is, of course, not impossible. The human body, for example, regulates its internal temperature using cellular processes that are physically equivalent to a refrigerator. If you believe that the human body evolved, then you should not find it impossible to believe that the Universe has some mechanism by which the second law of thermodynamics is counteracted on a cosmological scale, resulting in gravity and the "clumping effect" that gravity has on matter in space.

From an information-theoretic perspective, a refrigerator is able to reverse the natural progression of entropy because it implements a "micro-mind." What I mean by this is that every heat engine can be thought of as a weakened version of Maxwell's demon. A reversible Turing machine can be thought of as an approximation of Maxwell's demon. In short, the cycle of any heat engine can be idealized as a reversible Turing machine operating a Maxwell's demon trap-door. This is one way of stating Landauer's principle. Modern cosmological theories have begun working information theory into the large-scale structure of the Universe. For example, see Hawking's work on black-hole radiation. But this is a "bolt-on" approach to information theory, that is, it is trying to shoehorn information theory into a pre-existing physical theory.
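Landauer's principle, invoked above, can be stated quantitatively: erasing one bit of information into an environment at temperature T dissipates at least

```latex
E_{\min} = k_{B} \, T \ln 2 \;\approx\; 2.9 \times 10^{-21} \ \mathrm{J}
  \quad \text{at } T = 300\ \mathrm{K} ,
```

which is why a Maxwell's demon that must record and then erase its measurements can never beat the second law, and why only the reversible (erasure-free) portion of a computation escapes this cost.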

The question, from the perspective of an information-based cosmology, is: where is the refrigerator compressor? I mean this metaphorically, of course, but the point stands - there is nothing stopping us from modeling the large-scale structure of the Universe as a Maxwell's demon trap-door chamber where we are on the cold side. It can be argued that, if you squint in just the right way, Hawking's black-hole radiation theory is compatible with our refrigerator-model.

Our Star

There is a plethora of cosmological alternatives to the Big Bang theory. One alternative that I find particularly interesting is Plasma Cosmology (PC). PC holds that the gravitational force is not the dominant force in the Universe at large scales. Rather, PC holds that the electromagnetic force dominates at very large scales and its effects are not properly accounted for in the standard model of the solar system. The Electric Universe (EU) theory extends the PC theory by positing that the energy emitted from the Sun is almost entirely electromagnetic in origin and that there is no fusion occurring in the Sun's core, among other things.

The PC/EU theory makes quick work of some of the most puzzling features of our solar system. It is well-known that sunspots are much cooler than the surrounding photosphere, even though they open into the Sun, thus exposing the Sun's ostensibly hotter, lower layers to external view. The EU theory holds that sunspots are actually inflows of charged particles into the Sun's interior. They form circular structures because the plasma flows in a plasma sheath, which creates a structure not unlike an insulated wire stretching through space, invisible to the naked eye (and to other instruments).

The temperature of the Sun's corona measures in the millions of degrees, while the surface temperature of the photosphere is several thousand degrees. This is an extraordinary phenomenon - how does it happen that the hot Sun heats its surrounding atmosphere to a much higher temperature than itself? Imagine pulling an iron cannonball from a furnace at very high temperature. Surrounded by room-temperature air, would you expect the air surrounding the cannonball to ever become hotter than the cannonball itself? Of course not. The mainstream solar theory has no satisfactory explanation of this phenomenon. But the PC theory can explain it as a plasma double-layer.

The EU theory can also explain the planar orientation of the planetary orbits around the Sun. If the Sun is modeled as a point charge moving through space, it sets up a magnetic field around itself, and this magnetic field - by interacting with the magnetic fields of the planets - plays a role in favoring orbits in the ecliptic plane. Comets, being faster bodies with more eccentric orbits, orbit the Sun more symmetrically (that is, with a more uniform angular distribution of their orbits around the Sun).

Many other features of the solar system have natural explanations under the PC/EU theories, including planetary canyons and ridges on bodies with no water. The theory that there was once water on these bodies fails when the shapes of the canyons and ridges are taken into account - they are not compatible with a hydrological cycle because there is no consistent downward gradient. Lunar cratering, Olympus Mons and many other features with complex explanations in the standard theory have natural explanations under a PC/EU theory.

None of this is to say that the PC/EU theory is proven. Rather, my purpose in mentioning these alternatives to the standard theory of our solar system is to point out that it is possible that modern cosmological theory has become myopic, focusing on one particular aspect of physics while neglecting other aspects of physics. As we quoted Chaitin in Part 14,
For any ... scientific ... facts, there is always a theory that is exactly as complicated, exactly the same size in bits, as the facts themselves. [This] doesn’t enable us to distinguish between what can be comprehended and what cannot, because there is always a theory that is as complicated as what it explains. A theory, an explanation, is only successful to the extent to which it compresses the number of bits in the facts into a much smaller number of bits of theory. Understanding is compression, comprehension is compression! That’s how we can tell the difference between real theories and ad hoc theories.
The more we cobble onto existing cosmological theory, the greater the risk that we are just tailoring our theory to handle more and more special cases without stopping to take stock and assess whether rewriting it from the ground up could result in a globally more "compressed" theory.
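Chaitin's slogan can be made concrete with an off-the-shelf compressor standing in for program length. Compressed size is only a crude upper bound on algorithmic complexity, so this toy sketch is illustrative rather than a measurement:

```python
import random
import zlib

def description_length(data: bytes) -> int:
    """Compressed size in bytes: a rough upper bound on description length."""
    return len(zlib.compress(data, level=9))

# "Lawful" data: 10,000 digits generated by a short rule (a real theory).
lawful = "".join(str((i * i) % 10) for i in range(10_000)).encode()

# "Ad hoc" data: 10,000 random digits, with no rule shorter than the data.
random.seed(0)
ad_hoc = "".join(random.choice("0123456789") for _ in range(10_000)).encode()

print(description_length(lawful))  # tiny: the rule compresses the facts
print(description_length(ad_hoc))  # large: no compression, hence no theory
```

The periodic stream compresses to a few dozen bytes while the random one stays in the kilobytes; by Chaitin's criterion, only the first has been "understood."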

A Grand Unified Theory

As we saw in Part 13, there is no physical theory of everything. Yet I would wager that, if you were to survey, say, a thousand of the world's top physicists, the majority of them would respond that they believe a physical theory of everything could be found. The impetus behind much of modern physics is the attempt to unify various parts of physics into a single theory. This single theory has gone by a variety of names, including Grand Unified Theory (GUT), Unified Field Theory, and others. Superstring theory, for example, is heavily motivated by the desire to unify quantum physics and relativistic physics within a single framework.

The impossibility of a theory of everything does not, of course, exclude the possibility of grand unifications - these have already happened several times in the history of physics. But it is crucial to keep in mind that grand unifications are local to the theories being unified. We must keep in mind that every physical theory has some domain to which it applies - no theory of physics will explain the aesthetics of situational comedy, for example.

Conclusion

We will not be wading into any of the debates covered in this post in any depth. In software engineering, the term code smells is used to refer to code that seems to work in most or all cases but which has the appearance of poor design and is, therefore, suspected to contain hidden bugs. The standard cosmology has "cosmology smells." That is, it exhibits multiple symptoms of deep and hidden flaws in its foundations.

This doesn't make the standard cosmology useless or bad - it almost always works correctly. In fact, the aspects of the theory where it breaks down are so obscure that many specialists will never encounter them in actual laboratory work. But that doesn't matter from the point of view of cosmology proper, because cosmology is a subject of metaphysics. For the purposes of metaphysics, the only interesting aspects of the standard, scientific cosmological theory are those aspects that don't work, however obscure they might be. The fact that they don't work is telling us something very important: sooner or later, the standing theory will either be reconciled with the empirical evidence or be scrapped.

It has been the habit of established scientific schools of thought throughout history to view the status quo theory as "all but a closed canon." For hundreds of years, we have been on the verge of a grand unified theory that will close the textbooks on new physical theory once and for all. Instead, what has actually happened is that the theory of physics has been repeatedly rewritten from the ground up since the time of Galileo down to today.

In place of the standard cosmology - that is, Big Bang theory - we will be positing a cosmology that organizes the Universe, at all scales, around information. Economizing information on the input to a universal function, U, automatically results in the universal prior that we discussed in Part 9. We live in a Universe in which exact measurements can only be described by a mathematics that admits both wave-like and particle-like properties. If we believe that information is economized (or even conserved), this means that we have to apply the universal prior to the Universe. We live in a quantum Universe whose prior (that is, whose prior probability distribution without empirical measurement) is identical to the universal prior. In addition, we live in a Universe that is "observationally indistinguishable from a giant quantum computer." We have proposed the term quantum monad to describe the causal structure of a cosmology that incorporates these two major features.
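For reference, the universal prior discussed in Part 9 assigns to each finite object x the combined weight of every program that outputs it on a (prefix-free) universal machine U:

```latex
m(x) \;=\; \sum_{p \,:\, U(p) = x} 2^{-|p|} ,
```

so the claim above amounts to this: prior to any measurement, the probability of a configuration of the Universe falls off exponentially with the length of its shortest description.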

Next: Part 18, Virtualization

---

1. Infinity and the Mind, Rudy Rucker

2. Evidence for Correlations Between Nuclear Decay Rates and Earth-Sun Distance, [PDF]

3. Net Advance of Physics: Variability of Nuclear Decay Rates - compendium of papers related to the subject

4. Evidence against correlations between nuclear decay rates and Earth-Sun distance

5. Anti-scientific practices have too frequently flown under the radar of scientific method. One particularly remarkable and grotesque example is the Tuskegee Syphilis Experiment.

6. Einstein-Podolsky-Rosen paradox

7. The Elegant Universe, Brian Greene, p. 63

8. Epistemology as Information Theory: From Leibniz to Ω
