ANDREJ DETELA From Entropy to Syntropy FROM ENTROPY TO SYNTROPY 2025 ANDREJ DETELA From Entropy to Syntropy © 2025 Andrej Detela Reviewers: Prof. Andrej Ule, Prof. Dušan Plut, Tibor Hrs Pandur Editing: Dean DeVos Design and page layout: Nina Semolič Editor: Nataša Gregorič Bon Issued by: ZRC SAZU Institute of Anthropological and Spatial Studies Represented by: Ivan Šprajc Published by: Založba ZRC Represented by: Oto Luthar Založba ZRC editor-in-chief: Aleš Pogačnik Print: Abo grafika, d. o. o. Print run: 300 Prva naklada, prvi natis. / Prva e-izdaja. | First edition, first print. / First e-edition. Ljubljana 2025 The first electronic edition is freely available under the terms of the Creative Commons licence CC BY-NC-ND 4.0 (except for materials for which other terms of use are explicitly stated): https://doi.org/10.3986/9789610509462 Kataložna zapisa o publikaciji (CIP) pripravili v Narodni in univerzitetni knjižnici v Ljubljani ISBN 978-961-05-0945-5 COBISS.SI-ID 227682051 ISBN 978-961-05-0946-2 (PDF) COBISS.SI-ID 227296771 ANDREJ DETELA From Entropy to Syntropy Rien ne peut arrêter une idée dont l'heure est venue. (Nothing can stop an idea for which time has ripened.) VIC TOR HUGO Contents Preface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9 PART ONE: ENTROPY: HISTORICAL BACKGROUND � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � �13 1. Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .15 2. The Law of Entropy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .17 3. Classical knowledge about entropy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20 4. The great crisis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23 PART TWO: THE BIRTH OF SYNTROPY� � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � �27 5. The paradigm shift in science. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .29 6. From biology to physics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30 7. The birth of the term syntropy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34 8. The steady decline of the outworn dogma . . . . . . . . . . . . . . . . . . . . . . . . . . . . .35 9. Transdisciplinary signs in favour of syntropy . . . . . . . . . . . . . . . . . . . . . . . . . . .37 10. The arrow of time and syntropy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40 11. Two classes of syntropic processes. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .43 PART THREE: SYNTROPY IN MAGNETIC FIELDS � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � �49 12. Symmetry properties of a magnetic field . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .51 13. Syntropy in a homogenous magnetic field . . . . . . . . . . . . . . . . . . . . . . . . . . . . .53 14. Syntropy in a chiral magnetic field . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .56 15. Discussion of both experiments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60 16. More evidence of syntropy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64 17. 
Ways to generate syntropic power from semiconductors . . . . . . . . . . . . . . . . . . .67 18. Syntropy in chiral materials . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .71 PART FOUR: SYNTROPY IN POLYPHASE QUANTUM STATES � � � � � � � � � � � � � � � � � � � � � � � � � � � � � �75 19. The nature of time in quantum theory . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77 20. Polyphase cycles and circular diagrams . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .78 21. Polyphase quantum states. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .81 22. A polyphase variant of Maxwell’s demon . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84 23. A symphony of protein structures. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .87 24. Syntropy in living organisms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .89 25. Crystals as power sources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .93 PART FIVE: SOCIAL AND PHILOSOPHICAL IMPLICATIONS� � � � � � � � � � � � � � � � � � � � � � � � � � � � � � �99 26. The question of responsibility. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .101 27. A way out of chaos . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .103 28. The balance between entropy and syntropy . . . . . . . . . . . . . . . . . . . . . . . . . . .105 29. Vyakti . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .107 30. A letter to the people. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .110 31. The syntropic perception of time . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .113 Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .117 Endnotes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .119 Preface This book aims to provide creative insight into an overlooked alternative that can contribute to renewing the balance of every form of life on our planet Earth. My hope, my vision, is firmly rooted in a fresh scientific understanding of certain natural phenomena that for long have been neglected and forgotten. In particular, we shall look beyond the established Law of Entropy in order to put it into balance with the emerging Law of Syntropy. Syntropy is the inverse of entropy, and signifies either negative entropy or any self-organizing process in nature. We have been taught about the unidirectional path from order to chaos, formally canonized as the Law of Entropy. But the time is ripe to supplement it with the path from chaos to order. Both poles do exist, and there is a bidirectional flow between them. This is not an ordinary description of nature. Present mainstream science (especially physics) is utterly silent as to the mere existence of syntropy, and it is even more silent as to the related consequences. We must invent a different approach to newly discovered levels of reality, and we need to step slightly out of the official language used today in scientific journals. However, we will remain faithful to the rules of scientific methodology. This book is divided into five parts. The first part is written in lay language, although it is not intended only for lay readers. 
The second part gradually ush- ers in a physical description of our problem, but it can still be understood by a layperson. The following two parts (parts three and four) are based on many facts from the natural sciences (mainly physics but also biology), so there I had to use more specialized language. However, I believe that non-specialists will also be delighted by most of the ideas explained therein. We need a firm scientific methodology so that our vision of syntropy has a firm foundation. And finally, the fifth part of the book is also written in lay language. If the central parts of the book have been properly understood, this finale may be considered a reward for the reader’s patience. As much as possible, a holistic worldview is respected throughout the book. Therefore, a reader with a broad scope of interests will profit the most from it. Strict reasoning, so key to the natural sciences, is entwined in philosophical 9 and social reflections. I believe that this approach can benefit both wings of our intellectual creativity – the natural sciences and the humanities. Although syntropy is not yet approved within the framework of mainstream science, a great deal of related material is scattered elsewhere on the elusive margins thereof. Regretfully, most of those works do not adequately adhere to the rules of scientific methodology, and this leads to considerable confusion. I had to follow Ariadne’s thread, so here I have cited only those published findings that are clear, convincing, and close enough to my own line of research. I hope that this book inspires readers from various fields of human creativity, and I trust that the ideas contained herein will stimulate future research. Every sentient being longs for a feasible way out of the current stress and chaos, and as the world of syntropy is becoming visible, we are even more ready to shine a light thereon. Ljubljana, November 2019 Postscript: The present treatise was completed just before the great pandemic that threw many things off course. That is why there was a long delay in publishing this book. Now, I have kept the entire text in its original form without any changes (except that I have added several recent references to literature). Ljubljana, December 2024 10 11 PART ONE ENTROPY: HIS TORIC AL BACKGROUND 1. Introduction It cannot be denied that we are living and experiencing decades of tremendous change. Most surely they are going to profoundly transform our global society, for they are already happening. Promises and, above all, warnings are coming from many events in our tangible objective world, but also from the invisible world inside us. Many signs tell us that we have already entered a great turning point.1 It is clear that we shall all have to step “outside of the box,” out of the old frame of thought. Many partial, feasible solutions to the present-day crisis are already apparent. More and more people are sharing a vision of a sustainable future for all living beings on this planet.2 The paradigmatic framework of new ideas is now assuming a convincing structure, visible to all those willing to see. However, some elements are still missing. In this treatise, I aim to contribute one such lacking solution, a solution dreamt of and longed for by so many and patiently awaited. But first, let us try to uncover the essence of our general problem. 
To begin with, on the level of nature we are exposed to a web of interconnected phenomena, such as the fast-changing climate, relentless deforestation, soil exhaustion, and water shortages, as well as all sorts of widespread pollution, and now even the threatening 6th great extinction of living species.3 Does the combination of all these factors foreshadow total ecological collapse in the near future? Nobody knows yet, but one cannot ignore the actual devastating harm to the environment produced by our recent anthropogenic activity. Secondly, on the level of human society, we are now facing growing social and economic inequality – on the level between states, but inside of them as well. Dangerous social tensions have always been a harbinger of war. Now these tensions are combined with informational chaos: we are globally connected as never before, so it is not easy to pick out the important information, and it is not so evident what is true and what is not.4 Humanity as a whole has great problems adapting to modern means of communication.5 And although the relative amount of violence has slightly decreased, the “masters of war” still cling to the many- faceted idea of national supremacy, with reference to the extremely dangerous weapons that have yet to be dismantled. They do have a serious mental problem, evidenced by how they play with their “toys” of war. 15 PART ONE And thirdly, things go wrong on the psychological level as well. Careful insight reveals that the above-listed problems actually originate here.6 The historic evolution of human beings has now reached the critical moment when bare rational thinking without wisdom has become utterly insufficient. If misused, our technology is more dangerous than ever before. An old piece of wisdom says: reason is the helper, but reason is also a bar, a mental cage. People have evolved into skilful producers of countless things, but we still lack a deeper understanding of our own human condition.7 Even when someone has quite enough to be completely satisfied with his or her material status, he or she still grasps for more and more – instead of finding internal bliss by just simply being aware. Our recent achievements in the material world allow us to make this great leap in global consciousness. Can we finally surpass our unrestrained greed, our striving for supremacy over our fellow beings? Are we able to understand that our inborn happiness is utterly vane and illusory if it is not fused with the happiness of all life on the planet? Can we dismiss everything that is not necessary and restore our basically clear, sober mind? It takes a certain amount of time to understand and accept something so simple, but it is important to know that Mother Earth is becoming desperately tired of human greed and selfish competition geared to some imaginary idea of “success”.8 Let us take a glance at the last two centuries of intense economic development. Has this age of material achievements driven us down a blind alley? The answer is yes and no. Let me explain. We must admit that the science and technology of so-called Western society has brought much good to the whole world: the global poverty rate has diminished, health has increased, modern education has spread around the whole world, women’s voices are being heard much more than before, etc.9 But our blue sky is simultaneously being filled with the black clouds of pollution mentioned above. So, what has gone wrong? Thousands of books have attempted to give thoughtful explanations of this issue. 
In recent years, many dark aspects of human thought have been clarified and many unfavourable dogmas have been uprooted – at least theoretically (but not yet in practical life). Here, I must refrain from giving an easy and complete answer since the global situation is extremely complex, and I have not been called upon to judge the future of humanity. So I shall only try to shed some light on an interesting alternative to the present dilemmas. There we can find a forgotten chance that may even be a part of our emerging redemption.

The ideas that I shall present here may give a beneficial impetus to our modern scientific thought. When we step "out of the box" we usually find interesting and promising alternatives outside the well-known mainstream.10 From such an unorthodox standpoint, perhaps we can perceive a partial solution to the present-day crisis. That is the aim of this treatise.

2. The Law of Entropy

Let us take a brief look at the historical and metaphysical background of two branches of the natural sciences: biology and theoretical physics. Everyone knows about Darwin's theory of evolution through natural selection. His famous book11 was published in 1859 and instantly garnered great influence. Herbert Spencer, Darwin's friend and a renowned philosopher of the Victorian era, introduced the idea of "the survival of the fittest" into the realms of economics, sociology, and politics as well.12 The general atmosphere of the young capitalist society broadly accepted ideas that supported liberal competition among businesspeople. Furthermore, social Darwinism was often cited as "scientific" justification of the European colonization and enslavement of so-called "Third World" countries.

Less is known about what happened during the same years in the field of theoretical physics. The period 1850–1865 was marked by the discovery and formal establishment of the Second Law of Thermodynamics (also known as the Law of Entropy; we shall also use the shorter term the Second Law). The theory of entropy was originally developed by several prominent European scientists, among them Rudolf Clausius, William Thomson (Lord Kelvin), James Clerk Maxwell, and Ludwig Boltzmann.

Entropy is an important physical quantity (like mass, speed, or energy) that can be measured and defined in mathematical terms.13 The precise definition is quite abstract, which hinders an adequate intuitive understanding thereof. In simplified terms, entropy is the rate of disorder within a defined physical system. Any physical system is assembled from many parts, many pieces. These pieces can be arranged into internal order, or they can be disordered. An ordered system has low entropy; and if this system goes towards disorder, it has higher entropy. For example: we buy a pack of 52 playing cards, neatly arranged according to their symbols (hearts, diamonds, clubs, and spades) and values (ace through ten, plus king, queen, and jack). Before we start a game, the cards are deliberately shuffled, therefore put into disorder. The entropy of the card set has increased. We do not need to shuffle the cards by hand. They can simply be thrown up into the air and so they are shuffled quite naturally.

The Law of Entropy states that, if left alone to themselves (that is, when they are isolated from the environment), ordered systems spontaneously tend towards disorder. The entropy of an isolated system always increases. In an ideal case (which, in fact, never occurs) the internal entropy of an isolated system remains unchanged, but it can never decrease.
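These statements can be given a compact quantitative form (a standard statistical-mechanics illustration, added here for orientation rather than taken from this book). For an isolated system the Second Law reads

\[
\Delta S \ge 0 , \qquad S = k_{\mathrm{B}} \ln W ,
\]

with equality only in the idealized reversible case just mentioned. Boltzmann's statistical definition measures entropy by counting the number W of microscopic arrangements compatible with a given state. For the pack of cards, the single factory ordering corresponds to W = 1 and thus S = 0, while a shuffled pack may be in any of W = 52! ≈ 8 × 10^67 orderings, so its statistical entropy has risen by about ln(52!) ≈ 156 in units of k_B.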
Here is an example: when we put together water (H₂O) and pure alcohol (C₂H₅OH), the diffusion of both types of molecules spontaneously leads to the mixture of both substances. Let us start with the molecules of alcohol on the left side and the water molecules everywhere else (see Figure 1). Then, through the natural process of diffusion (a result of molecular thermal motion), all of the molecules are slowly displaced and mixed. After a certain amount of time we get a mixture of both substances. We lose knowledge of the exact whereabouts of each single molecule of H₂O or C₂H₅OH in the container.

FIG. 1. The diffusion of alcohol in water: we start with a drop of alcohol molecules on the left side of the glass. After a certain amount of time, the alcohol molecules are dispersed all around the glass, between the water molecules. We would wait in vain for a similar natural process to take place in the opposite direction of time.

According to modern information theory, any loss of information by the observer is linked to increased entropy, to greater informational chaos.14 But one would wait in vain to see the mixture spontaneously "unmix" into pure water and pure alcohol. The direction of time in this experiment cannot be reversed, the Second Law holds firm here. We say that the process of diffusion is irreversible, and so it is with many other physical processes as well.

If we consider all physical systems taken together, their overall entropy increases along the flow of time. An inverse flow of time is impossible: in that case entropy should decrease, and that is against the Second Law. So our experienced direction of time15 is connected to the spontaneous disintegration of order into disorder, which is the path to the presumed final chaos at the end of our earthly existence. One can guess that this is a rather sombre picture of nature, and of the whole Universe.

Most physical systems are not isolated, but are open to their environment. This means: their boundary is permeable, i.e. mass and energy can be exchanged with the environment (see Fig. 6 in Chapter 7). An internal part of an open system can diminish its entropy; therefore, it can take care of its internal order. This is precisely what all living beings on planet Earth do. But this internal process occurs on account of the entropy of the environment. The sum of the internal and external entropy must always increase. This is an outcome of the Second Law and of the fact (expressed mathematically) that entropy is an additive quantity.

According to the Second Law, we are able to sustain our life on Earth only by feeding on the inherent order of our natural environment. For instance, when we consume bread, the increase in the environmental entropy (starting from the entropy of the sunrays that nurtured the crops used to make our bread) is greater than the decrease in our body entropy. And when we make use of more powerful energy sources (like coal or oil), the overall increase in entropy is even much greater. Thereby, we are introducing chaos into our environment, and, as we already know, this is leading us towards ecological collapse.
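The bookkeeping behind this "feeding on external order" can be written in a single line (a standard way of stating the Second Law for open systems, added here for illustration). If a living system lowers its own entropy, the environment must pay for it:

\[
\Delta S_{\mathrm{system}} + \Delta S_{\mathrm{environment}} \ge 0 ,
\]

so an internal decrease (ΔS of the system below zero) is admissible only when the surroundings take on at least as large an increase. This is simply the additivity of entropy, mentioned above, combined with the entropy principle.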
Are we prisoners of the Second Law? It declares that any form of life is impossible if it is not married to final disaster. Moreover, the contemporary neoliberal economy feeds on the relentless growth of both production and consumption, together with ruthless exploitation of natural resources (including so-called "human resources"). Surely this is the way to hell. But how to extract ourselves from this?

Here we can draw an eloquent parallel between biology and physics. Darwin's radical idea of "the survival of the fittest" has quite a lot in common with the idea of "feeding on external order" encountered in theoretical physics. In 1875, Boltzmann wrote: "The general struggle for existence of animate beings is [...] a struggle for [negative] entropy, which becomes available through the transition of energy from the hot sun to the cold earth." Both ideological conceptions were born during the same time, in the same cultural atmosphere, within the same zeitgeist. And both of them attempt to justify ruthless violence and aggression. Now our world is split into a myriad of fragments that are in constant struggle and mutual competition. Before we step out of this intellectual trap, let us look at the Law of Entropy in some more detail, in order to better understand our problem.

3. Classical knowledge about entropy

In 1824, the French engineer Sadi Carnot formulated the first version of the famous Second Law of Thermodynamics (the Law of Entropy).16 He observed steam engines, the main mechanical power source at that time, and discovered that only a fraction of the heat energy can be converted into mechanical energy. The remainder is waste heat, which can be used to warm our buildings, but cannot be converted into any other useful form of energy (see Fig. 2).

FIG. 2. Carnot's heat engine (symbolically the central circle in the figure) uses heat QH at high temperature TH, and transforms the energy of this heat into useful mechanical work W, but not completely. Some energy (heat QC) is dissipated at the cold temperature TC.

In the period 1850–1865, several physicists, including Rudolf Clausius, William Thomson (Lord Kelvin), and James C. Maxwell, laid down the foundations of the new theory. By the end of the century, additional contributions to the new theory had been made by Ludwig Boltzmann, Josiah W. Gibbs, and several other remarkable scientists of the time. Together they worked out the classical theory of entropy, in which the Second Law (the entropy principle) holds the central position. This law can be expressed in many different forms, and it can be proved that all of them are equivalent. Several of these include:17

• Every heat engine (such as a steam engine) must include several parts with different temperatures. The heat source is the part at high temperature (e.g. steam at high temperature), and the heat drain is the colder part (the steam condenser). The maximum possible energy efficiency of the engine is proportional to this temperature difference (a worked example follows this list). It is impossible to construct a heat engine that would work at only one temperature, for instance at the ambient temperature of our natural environment. Heat cannot be extracted from the environment (e.g. from the surrounding air or from rivers) and converted into electrical or mechanical energy. J. C. Maxwell imagined a hypothetical dream device (sometimes called Maxwell's demon)18 that would perform like this, and it would certainly solve all our energy problems. But according to the Second Law, this is utterly impossible.

• There are many processes in nature that can flow in only one direction of time, but never in the opposite direction. These processes cannot be reversed in time and are called irreversible processes. Well-known examples include the mixing of hot and cold water, or ink and water, the dampening of a ball's movement by friction, the withering of a flower petal, etc. Heat can flow spontaneously from a place with higher temperature to a place with lower temperature, but never vice versa. Therefore, heat flow is an example of an irreversible process. An increase in entropy can be defined as a degree of irreversibility.

• Each system consisting of a great number of atoms can be described by a definite degree of internal order (assigned by negative entropy). More ordered states (those with low entropy) are mathematically less probable, hence order spontaneously tends towards disorder (expressed by greater entropy).19 In the opposite way, the spontaneous generation of order out of chaos is again utterly impossible.
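As a worked illustration of the first statement in the list above (standard thermodynamics; the numbers are chosen here only as an example), the Carnot limit on the efficiency of any heat engine working between a hot source at temperature TH and a cold drain at TC is

\[
\eta_{\max} = 1 - \frac{T_C}{T_H} = \frac{T_H - T_C}{T_H} .
\]

For steam at roughly 500 K rejecting heat to surroundings at roughly 300 K this gives a maximum efficiency of about 0.4: at most about 40% of the heat can become useful work. When TH equals TC the limit drops to zero, which is why an engine running at a single temperature is excluded.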
Our entire Universe will die when, after billions of years, all the points in it reach the same equilibrium temperature. In this sense, classical physics predicts a rather sombre scenario. In principle, the stores of order within our Universe are limited, and this is even truer for our tiny planet Earth. If the activity in one region is more intense, this will be (according to the Second Law) at the expense of the internal order of the neighbouring parts of the Universe.

Biological living beings (from single-celled beings to humans) are open physical systems. This means that, through their biological evolution, they have invented ingenious mechanisms to nourish their internal order from the order in the environment. If the life of all living beings inhabiting our planet is too active, then the collective order of the planet may very soon be exhausted, and even the beneficial inflow of order from our beloved Sun (in the form of ordered energy in sunrays) will not be able to help much because it is too moderate for our crazy consumption.

For a long time (more than a century) people did not worry about the hidden philosophical implications of the Second Law. This law appeared to be only a threat in principle, and not in everyday reality. Our Earth seemed to be so large and its riches far from any danger of being exhausted. There were so many new lands yet to be discovered, so many visions of new hidden treasures, and so many possibilities! It was a dream age of technology. Miracles like electricity, self-moving vehicles (cars), thinking machines (computers), the smart devices of modern communication technology, and so forth were constantly popping up, again and again. There seemed to be no limits to human creativity.

Even the Second Law itself proved to be very useful. Today it is accepted as one of the two basic laws that govern the behaviour of nature, together with the Law of Energy Conservation. The entire delicate and beautiful structure of modern natural science is built around these two mighty pillars. The Law of Entropy is a useful and powerful tool in many practical problems and calculations, from the optimization of energy converters to the determination of the impurity rate in crystals, from the analysis of chemical reactions to the determination of the best patterns for coded messages, from the study of ecosystems to the comparative study of languages.

4.
The great crisis Until recently, nobody really worried about the dark philosophical depths of the Law of Entropy . But in the last 50 years the material prosperity of developed countries has reached a kind of saturation. As the situation changed, new questions arose. Tired of objective quantities, many people are turning to spiritual qualities.20 I grew up during the golden age of student revolt, accompanied by a profound questioning of social values.21 Millions of young people listened to music and dreamt about a spiritually warm world without complicated hierarchical systems. It was like a wave washing over a beach, overturning many pebbles and bringing about many changes. Young people were “dropping out” of the old system, and they sowed new seeds. Shoots of various kinds germinated, among them even many strange ones. But some of them have prospered, perhaps only becoming visible after decades. At the end of the summer of 1967, just when masses of flower children were gathering at San Francisco’s Summer of Love, I took a parallel path: I started to work patiently every day on the idea of syntropy. Encouraged by several awards for being the ‘best young physicist’ in my country, I started to analyse hundreds of specific physical systems by means of the well-developed tools of theoretical and experimental physics. Five years later (1972) I read a credible report22 published by the Club of Rome. This report exposed the dark philosophical depths of the Second Law and its consequences. It sounded like a warning. There were diagrams, the results of many detailed numerical simulations, which showed the future development of several important variables: • the rate of industrial production; • the reserves of natural resources (oil, coal, ores, etc.); • the contamination of the Earth, together with soil exhaustion; • the quantity of food produced; • the number of people living on our planet. According to these calculations, something like a disaster would occur around the year 2020. Within a few years, the global population would diminish to a fraction of the previous number, industrial production would disappear completely, and food production would barely be enough for the surviving part of humanity. As one might guess, the Earth would be terribly polluted and exhausted, and lack many important natural resources. And amazingly, this is not very far from our current predicament. 23 PART ONE The old analysis relied on the data and knowledge of that time. Today, things are becoming decidedly worse, but at least we are beginning to question our human responsibility. There are several dire problems that were barely considered in those times. Among them are the following:23 • the looming climate catastrophe; • the collapse of whole ecosystems lacking the time to adapt to new conditions; • the disappearance of many natural species, the 6th great extinction on the planet; • massive migrations from climatically devastated countries24. Most of these consequences have been known for a long time (especially in the scientific community) but were never explicitly addressed. Politicians have been feigning ignorance for fifty years, although we have been aware of the basic problem. Not many years are left to change our direction and thus avoid disaster.25 Certainly, there have been enormous changes on a global scale, including the growing ecological awareness. All those who consider themselves important speak about ecology: politicians use it to attract voters, and producers to attract consumers. 
The phrase “green growth” (with a stress on economic growth) is the ideology of our time. But does this awareness really work? Have we done enough to prevent the interests of capital from destroying the biosphere of our planet? Certainly not.26 Without green nature we shall slowly lose our sanity. Now the whole world is in continuous competition; most of it is striving to em- ulate the economies of the developed countries. The economic “tigers” are copying the extremely problematic social ideology that, many years ago when hardly anyone was even mentioning ecology, helped the rich to strengthen their sovereignty. Let us try to find ways out of the present dilemma. On the basis of what has been written above, let us consider these problems with regard to the Second Law of Thermodynamics. Quite logically, there are only three possible outcomes: 1. We will not obey the Second Law, but will continue to expand production and profits, waiting for a miracle. But this miracle will not come. Accordingly, a catastrophe on a global scale will follow (comparable to the one in the Club of Rome scenario). 2. We will adhere to the Second Law, and return to the lifestyle of the pre-industrial age, with very moderate production. However, this is the least likely possibility 24 4. THE GRE AT CRIsIs because human pride is so strong that man will not accept the drastic discipline that the Second Law imposes on our lifestyle and consumption. 3. We will continue to profit from science and technology (and most surely, from social inventions as well), but in a very different sense. In time, totally new production processes (including the production of energy) and activities will be discovered, using methods that do not increase the total entropy of our planet above the self-regenerative ability of the global biosphere. If this is combined with the rules that regulate sustainable societies (e.g. very careful choice of those products that are truly necessary, making intelligent use of and recycling waste materials, etc.), a global collapse may be avoidable. Our foolish pride will admit this change, because even more knowledge, greater technological perfection, and spiritual maturity will be needed to make such a transition possible. Every sane reader will agree that of the three possibilities stated above, the first one is not really a solution to our problems. And the second possibility is very improbable because it goes against one of the basic rules of human history: the same path can never be trodden twice. Even if there is an attempt to do so, there are obviously different initial conditions and they will necessarily lead to different end results. We cannot pretend that nothing has happened in the last few hun- dred years; we cannot hide all that we have learned during that time. We are also conditioned by our recent past and so we must learn to live with it or process it in the best way we can. Only the third possibility takes these new initial conditions into account. Maybe this solution, which is in equal parts optimistic and realistic, is the only possible one.27 But in this present situation, we must be careful. New technologies and scien- tific discoveries can help us only on condition that they reflect a profound change in the human world-view. We must “step out of the mental box.”28 Modern man still worships material commodities, the greedy consumption of things. 
He relies on illusory technological miracles, and that is a great danger, because such blind faith hinders us from seeing the roots of our problem. It cannot be solved by the same mindset that created it. We cannot progress with the same old bad ideas. We need a whole new way of thinking. If we sincerely wish to generate new meth- ods, new ideas, and a fresh way of thinking, then we should give full value to our imagination. It will lead us to what is yet to be known. 25 PART ONE 26 PART T WO THE BIRTH OF SYNTROPY 5. The paradigm shift in science Let us return to biology and physics. Over the last two or three decades it has become increasingly clear that the biological concept of “the survival of the fittest” is only one half of the complete truth. Modern researchers in the field of biology are discovering a wonderful balance between “the fight for survival” and cooperation among individuals. This evidence is confirmed everywhere in nature, and on every level of biological existence.29 The social ideology of struggle and competition could only put down roots within specific historical conditions. It does not function in a different cultural background, or within a society burdened by the new challenges of our time. So it is shameful that this dogma has survived up to the present, with all the resulting problems we are facing today. In fact, all living beings are interdependent as part of the extremely sensitive and complex ecological web of life. Observations with electron microscopes have shown that even biomolecules within a living cell cannot do without ‘intelligent’ mutual cooperation.30 Now the old simplified model of a biological hierarchy and the struggle for supremacy is becoming quite obsolete. This may be an example of a paradigm shift in scientific research.31 Can we extend our parallels between biology and physics? Does modern scientific research reveal that the Second Law is also only one half of the complete truth?32 This treatise will give arguments in favour of a definitively positive answer. Physics is a branch of science with a very solid, precisely defined structure. It is based on a variety of reproducible experiments that can be described with mathematical models. So any new idea in physics must undergo strict experimental tests, and it must wait for an understandable theoretical explanation before it is accepted into the formal corpus of natural science. The Second Law has long been one of the most important pillars in the theory of nature. Phenomena that stand outside this law are not easily accepted, but it seems that in physics we are now experiencing a similar paradigm shift as that which occurred in biology and medicine. 2 It will be shown that there is good reason to think that we can find a brand new, previously unknown type of phenomenon, another half of the complete truth. 29 PART T wO Localized physical systems with diminishing internal entropy (with increasing internal order) do exist, although these systems are isolated (self-sustainable) and do not feed off the environment. Therefore, their entropy does not increase along the flow of time, but just the inverse: their entropy decreases. And this is clearly in opposition to the Second Law. Here we need several new definitions in our vocabulary. It is convenient to define a physical quantity that we shall call syntropy. It is the inverse of entropy. As a physical quantity, syntropy is simply negative entropy. 
(In former times, it was sometimes named negentropy.33) But we shall affix to the word syntropy also a second meaning: let syntropy be a general description of any kind of physical process that lies outside the Second Law. Syntropic processes are those physical phenomena that do not follow the Second Law. Such processes or events decrease the entropy even within a closed (isolated) physical system.

Entropic processes and syntropic processes, taken together, are two halves of the complete truth. They are in mutual balance; the Law of Entropy is in natural balance with the Law of Syntropy. Later in this treatise (Chapters 28, 29) we shall present examples of these ideas on the basis of recent discoveries and new possibilities. Let us now turn to how the idea of syntropy evolved through the history of science. It has much to do with the fusion of new findings in mathematics, physics, and biology.

6. From biology to physics

Many events and processes that we perceive in nature are hardly compatible with the Second Law – as if they were not on an equal basis. It seems that through this law old-fashioned physicists are still struggling to see the world through a lens of inanimation. The Law of Entropy was modelled on such things as steam engines, or the diffusion of molecules of water and alcohol, but never on subtle phenomena such as the germination of a flower, cell division, or metamorphosis from caterpillar to butterfly.

A hundred years ago, adequate tools simply were not available to scientists wishing to observe these subtle phenomena. Methods like electron microscopy or computer modelling were developed only recently. Now we are far more able to admire at close range the incredible wonders that are taking place all around us, in every living organism. At first glance it would seem that many biological phenomena oppose the Second Law. Self-growing structures tend towards self-organization with increasing inherent order. But most natural scientists would say that this order develops due to the order taken in from the environment34 (we have already mentioned that living beings are open systems). The primal order that our biosphere feeds on is visible light, the highly ordered electromagnetic radiation from the Sun. This is true in part. However, many recent discoveries leave us in doubt. Molecular biology, in particular quantum biology, and such new fields as cognitive science have furnished us with pictures of tremendously minute molecular systems that display quite intelligent ways of functioning.35 The Law of Entropy begins on the level of quantum particles, but these very particles are perhaps too clever to obey a law that deals only with inanimate bodies.

A hundred years ago, there seemed to be a gap between atoms and intelligent matter. Intelligence was attributed to beings much greater in scale than single atoms or molecules. In 1867, the great Scottish physicist J. C. Maxwell proposed the idea of a minute intelligent being (now it is named Maxwell's demon) that is in subtle dialogue with single molecules, arranging them from initial chaos into final order (see Figs. 3 and 4).36

FIG. 3. Maxwell's demon (original drawing from the first publication of the idea37). The "demon" is standing by a small opening between two halves of a container with molecules of gas travelling randomly in all directions, and he only allows chosen molecules to pass (for instance, only those arriving from the left side). After some time, all of the molecules find themselves on the right half of the container. This final state is marked by higher internal order when compared to the initial state.

FIG. 4. Maxwell's demon, schematic: the initial state of the molecules (left) and their state after some time (right).
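To see how sharply this imagined sorting collides with the entropy principle, one can put a number on it (an illustrative textbook estimate, not a calculation taken from the sources cited here). Confining each of N gas molecules from the whole container into one half of it, at unchanged temperature, lowers the entropy of the gas by

\[
\Delta S = - N k_{\mathrm{B}} \ln 2 ,
\]

that is, by k_B ln 2 for every molecule sorted. This is precisely why the demon has been debated ever since as a potential loophole in the Second Law.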
Maxwell's demon would bypass the Second Law, but in the 19th century it was totally unthinkable to see such a device in reality. The idea was later dropped, despite the fact that several ingenious scientists (among them Max Planck, the "father" of quantum mechanics37), expressed clear doubts concerning the universal validity of the Second Law. But now we know more about this issue.

In some modern computer chips there are computing elements of only several atoms in scale. The scale has also diminished in biology. Using delicate techniques such as SEM, STM, and EFM (scanning electron microscopy, scanning tunnelling microscopy, electron force microscopy, respectively) we can see single biomolecules in a living cell. And everywhere we can admire the incredible complexity of self-organization, intelligent functioning, even at a scale of less than 10 nanometres (the size of a protein molecule). An example of such an "intelligent" microbiological structure is shown in Fig. 5.

FIG. 5. A microtubule is a tiny biological nanotube consisting of two different protein molecules (α- and β-tubulin). It is an important structural component of eukaryotic living cells and responsible for a number of intracellular processes. In the figure, a kinesin molecule is "reading" information from its surface, while "walking" (it has two "feet"!) along tubular strings, carrying specific molecules, and releasing them at precisely determined spots within the cell.

The gap between the micro- and macroscopic world is starting to close on the other end as well. By means of modern computer modelling, we are already able to study the quantum behaviour of very large macromolecular systems, measuring more than 10 nanometres. We are learning that these relatively large systems can be in a coherent quantum state. Quantum coherence is the term used for those quantum states that are internally completely in tune, i.e. informationally compact beyond the fragments of space and time.38 Here, this means that biomolecules can enter as a whole entity onto the level of discrete thermal particles. That is where the Law of Entropy may begin to function, and it is right here where it can stumble: How could it hold if every single particle is endowed with some rudimentary intelligence? Here is the clue of Maxwell's demon.

The Law of Entropy has foundations in statistical mechanics, with thermal particles (atoms, molecules, electrons, etc., all of them with their respective thermal energy) as players in the machinery of statistics.39 This law was not made to deal with intelligent beings. But now we are starting to see that in many cases thermal particles cannot be represented by simple inanimate bodies. The gap between animate and inanimate matter is narrowing to nothing. And if we go a step further, markedly ordered and complex quantum ensembles (such as living organisms or large parts of these organisms) manifest dynamic behaviour that is significantly different from the phenomena analysed by the old branches of thermodynamics.
7. The birth of the term syntropy

The idea of syntropy was introduced into science in 1944 by the Italian mathematician Luigi Fantappiè.40 He was investigating two solutions of d'Alembert's wave equation (each one belonging to one direction of time's arrow41), and found that one solution describes the disintegration of ordered structures (this refers to entropic processes), while the second solution (which is quite feasible as well) describes self-organizing processes. For the latter he introduced the term syntropy. From here on, Fantappiè posited a unified theory of physics and biology, with syntropy as an important ingredient of the whole. Mathematically, he expressed entropic and syntropic processes by two opposite directions of time. Therefore, syntropic processes diminish the entropy of a closed system, and this clearly opposes the Second Law.

In the same year (1944), Erwin Schrödinger (Austria and England), known as one of the founders of quantum mechanics, published the seminal booklet What is Life?42 Therein, he considered the relation between order and disorder in living organisms, with special attention devoted to the "order-from-order" principle. In his Chapter 7 (entitled Is life based on the laws of physics?) he came to the conclusion that, with regard to the Second Law, we must be prepared to find a new type of physical law prevailing in living matter.

During the two decades that followed, this challenge was taken up by two Nobel Prize laureates, Szent-Györgyi and Prigogine. Albert Szent-Györgyi (Hungary and the United States) undertook extensive research in biochemistry and medicine. Exploring cancer, he came to see it as ultimately an electronic problem at the molecular level. He was therefore one of the pioneers of quantum biology. In 1974 he proposed Fantappiè's term syntropy (in place of the old term negentropy) to describe the rate of order in living matter.43 So he brought this word and this idea from mathematics into biophysics. However, as he was not a physicist, he did not specify the relation between syntropy and the Second Law.

Ilya Prigogine (Russia, Belgium, and the United States) is now best known for his research on "dissipative structures". These are thermodynamically open systems (exchanging energy and matter with the surroundings, see Fig. 6), while internally they are self-organized and far from thermodynamic equilibrium. They feed on the negative entropy (syntropy) of the environment and do not contradict the Second Law.44 In the strict sense, they do not engender syntropic processes. But it is less known that in his younger years Prigogine expressed certain doubts about the general validity of the entropy principle.45 He reasoned that things could be different if long-range forces among particles are also considered and/or if the considered system of particles possesses a certain "memory".

FIG. 6. Prigogine's dissipative structures are thermodynamic systems (whether big or small) with a permeable boundary so that matter and energy can be exchanged between the system and the environment.

8. The steady decline of the outworn dogma

Thus far we have described those doubts concerning the entropy principle that have emerged in mainstream science. We see that most of these doubts came from other branches of science, not from physics (where the Second Law originated in the 19th century).
Most physicists were (and still are) suspicious of any doubts, because the Second Law has become firmly rooted in the great edifice of theoretical physics, and it is not easy to shake it. We have seen that it was the general social atmosphere of the mid 19th century that was responsible for its metaphysical foundation. Since then, the entropy principle has generally been accepted as scientific dogma. For instance, in 1915 the English astronomer Arthur Eddington wrote: “If your theory is found to be against the Second Law of Thermodynamics, I can give you no hope; there is nothing for it but to collapse in 35 PART T wO deepest humiliation.”46 Even Albert Einstein was of a similar opinion: “Classical thermodynamics […] is the only physical theory of universal content which I am convinced, will never be overthrown, within the framework of applicability of its basic concepts.”47 But here we must call to mind that the Second Law has never been proven, neither theoretically nor experimentally. Ludwig Boltzmann made a lifelong effort to prove it in a general sense, but he succeeded with mathematical proofs only for some special simplified cases of all possible physical systems. In despair, he committed suicide in Duino (the mostly Slovenian village Devin at the time, the sacred place where the poet Rainer Maria Rilke wrote his famous Elegies). Only a few years later, during the First World War, a massive front line with nearly a million victims erupted along the Mediterranean coast on that same spot. As if the entropy principle had foretold that dire calamity. Even less has the Second Law been proven experimentally. In fact, it can never be proven. This law gives a definite statement for physical systems with a very large number of particles (atoms, molecules, electrons, etc.), and surely not for a single particle. But many-particle systems may have many levels of complexity, and it is impossible to know in advance how these levels are interlaced with each other. Nature may display complex structures beyond the capacity of human imagination, and this is exactly what living creatures do all the time. They find ingenious new ways to protect a given physical form of life. Classical (and also quantum) thermodynamics, based on statistics, is totally incapable of dealing with these multiple levels of complexity. Statistics was not meant for complex systems, like those in biology. This is what Schrödinger alluded to in the above- mentioned essay. Mathematically, negative entropy (syntropy) can be expressed as the amount of information affixed to a certain physical system. Information is accepted today as a real physical quantity. The first steps in information theory, with regard to syntropy, were made by the French physicist Léon Brillouin.48 He applied infor- mation theory to physics and coined the concept of negentropy (with the same meaning as syntropy) to demonstrate the similarity between negentropy and information. He even offered a solution to the problem of Maxwell’s demon.49 Most of the scientists following Brillouin’s trail were exploring simplified cases, physical systems with quite limited assumptions. For many of them, the 36 9. TR ANsDIsCIPLINARy sIGNs IN fAVOUR Of syNTROPy Second Law can be proven, but this is not a great step forward. With our previous example of diffusion we have shown how the initial information is lost during the diffusion process, and this leads to an increase in entropy. 
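To attach a number to this link between lost information and entropy (a standard result quoted here only for orientation, in the spirit of Brillouin's negentropy principle and the Landauer bound discussed just below): one bit of information corresponds to an entropy of

\[
S_{\mathrm{bit}} = k_{\mathrm{B}} \ln 2 ,
\]

so acquiring or erasing a single bit at temperature T costs at least k_B T ln 2 of dissipated energy, roughly 3 × 10^-21 J at room temperature. In this accounting, any gain of information by an observer must be paid for by at least an equal amount of negentropy drawn from the physical world.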
Information theory (a branch of modern mathematics) has made rapid progress during the last few decades, thus many theoreticians have been trying to use its powerful tools to clarify the validity of the Second Law. Some new theoretical methods have been invented (for instance, the use of Landauer’s principle50), but again they have not been able to produce a “final answer” to the entropy dilemma. No one has been able to prove the Law of Entropy. If some day one did hope to prove the Second Law, either theoretically51 or experimentally, one would have to prove it for all possible forms of complexity in Nature. But the number thereof is infinite; therefore, it is utterly impossible to prove the Second Law. This is why physicists retreated from having the final say on this issue – although a century ago they were confident they knew the answer. The realm of all possible thermodynamic systems is beyond any rational analysis. In poetic words, we cannot harvest a field with such rich biodiversity. We are rediscovering the rule by which many traditional and durable societies have long prospered, i.e. respect the secrets of Mother Nature. So this is the end of “the old science” that prevailed in the last few centuries when “rational man” wanted to control just about everything.52 53 The times are changing. Now we are experiencing a historic period of great paradigm shifts in human consciousness, so we need to cultivate additional scientific methods. They should incorporate the intuitive approach as well, in order for us to perceive reality as a whole and support the necessary harmony of all sentient beings.54 9. Transdisciplinary signs in favour of syntropy So it is my firm conviction that all “absolute proofs” in favour of the Second Law lead down a blind alley. It is more interesting to consider the other side of the coin: the scientific endeavour in favour of syntropy, expressed especially through the search for Maxwell’s demon, which is some sort of holy grail of modern 37 PART T wO science. If only one specific version of this “intelligent creature” is realized, then the universal validity of the Second Law is immediately shattered once and for all.55 With modern mathematical tools and powerful computers, it is now possible to investigate complex dynamic systems. Especially interesting are those systems with non-linear dynamics, provided by positive feedback loops. Mathematicians (for instance, the Russian mathematician Andrey Kolmogorov) have discovered analytically solvable systems of this kind that do not follow the Second Law.56 Analytically unsolvable systems of this kind may be even trickier. Here it is worth remembering that constructs initially existing only in mathematical abstraction are usually quite soon found also in physical form. For instance, non- Euclidian spaces were discovered by mathematicians of the 19th century, and now they provide indispensable support for Einstein’s relativity.57 Another example is Hilbert spaces, discovered by mathematicians long before they were introduced in quantum physics.58 If we are to proceed with syntropy, we must broaden our perspective. Many ideas concerning syntropy come from cognitive science (the interdisciplinary study of the mind and its processes). The Japanese scientist Hiroomi Umezawa explicitly pointed out that living matter decreases entropy and disorder, and increases order and syntropy.59 This is possible even when the supply of external energy comes in a completely disordered form, which puts such systems beyond the Second Law. 
The dynamics of the quanta involved in the fundamental processes of living matter often do not manifest thermal disorder; they exhibit order and strong mutual correlations between quantum eigenstates. Quantum statistical mechanics is not reliable in these cases because it was meant for thermodynamic systems that are close to a “disordered” thermodynamic equilibrium – which is not true for living organisms.60 Generally, the passage from mathematics to “solid” physics is not an easy and quick jump. The formation of connections with associative branches, and if possible, experimental proofs of new ideas – all these take considerable time.61 So it is quite understandable that in the transitional period new ideas are explored by pioneering individuals whose work is largely overlooked by “mainstream scientists”. I shall mention two such daring minds who already in the first half of the 20th century promoted ideas beyond the Second Law. The great inventor Nikola Tesla wrote about a “self-acting engine” that takes heat from the natural surroundings and transforms it into electric energy.62 This idea 38 9. TR ANsDIsCIPLINARy sIGNs IN fAVOUR Of syNTROPy is clearly beyond the Second Law. Tesla’s vision was in complete accordance with the rest of his work, but regretfully (considering the currently available histor- ical record) he seems not to have expounded on it further. And Wilhelm Reich performed innumerable experiments with the hypothetical “orgone” energy, an omnipresent massless substance with the inherent ability to self-organize. According to Reich, life-giving orgone does not follow the entropic principle. On every scale (from atoms to galaxies) it is the creator of forms, and especially of life forms.63 Reich’s work was officially refuted, but his or Tesla’s ideas become much more acceptable (or at least interesting) within a broader philosophical perspective. Henri Bergson, a French philosopher of that time, promoted the idea of the élan vital (an all-embracing life-giving force in nature).64 Translated into the language of physics, élan vital could be the anti-entropic physical substratum of the Universe. Bergson has contributed to the idea of syntropy also with his philosophy of time. He delved deep into the experiential meaning of time and became quite critical of the superficial “scientific” representation of time. Namely, time in science is usually represented as the 4th dimension (in addition to the three dimensions of space) of our material world, and so it is frozen in a kind of bare static geometry. But, Bergson argued, time is qualitatively totally different than space: if time is bereft of its basic quality, i.e. duration (the informational interconnectedness of heterogeneous events within the flow of time, the fusion of every particular moment with eternity), then only the succession of distinct moments along the arrow of time remains, and we can no longer understand the living reality of natural processes.65 Bergson’s perception of time has much in common with the nature of time on the level of quantum physics (in the language of David Bohm: the sub-quantum level of reality66). In the following chapters, we shall see that in our modern search for arguments beyond the Second Law special attention should be devoted specifically to this deeper meaning of time. Moreover, it turns out that we must assume that kind of temporal interconnectedness which is natural to processes on the (sub)quantum level. 2 39 PART T wO Now let us turn to modern investigations of syntropic processes. 
We shall assume that this "modern period" started at some point after the year 1980. (Prigogine's most influential books were published around that year. But it is worth noting that his dissipative structures do not deal with syntropy in the strict sense.) The search for phenomena beyond the Second Law acquired a strong impetus quite recently, at the turn of the century. In 1990 and subsequently in 2003, many theoretical aspects of the hypothetical "demon" were published in two collections of scientific papers.67 Furthermore, in 2002 the first great international conference on this issue was held at the University of San Diego (California), with approx. 75 serious scientific contributions. An echo of this conference was held in Prague (Czech Republic) in 2004. The second conference in San Diego was organized in 2006, and the third one in 2011.68 An excellent book on this same topic was published in 2005.69 In the preface, Daniel P. Sheehan concluded: "Considered en masse, the second law's absolute status can no longer be taken for granted, nor can challenges to it be casually dismissed. […] It is remarkable that 20th century physics, which embraced several radical paradigm shifts, was unwilling to wrestle with this remnant of 19th century physics, whose foundations were admittedly suspect and largely unmodified by the discoveries of the succeeding century. This failure is due in part to the many strong imprimaturs placed on it by prominent scientists like Planck, Eddington, and Einstein. There grew around the second law a nearly impenetrable mystique which only now is being pierced."

10. The arrow of time and syntropy

When we explore new territories of human knowledge we travel on untrodden paths. First we must brush aside all that is redundant. The old methods do not take us very far in the new land of syntropy. It turns out that the splendid theoretical machinery concerning entropy (within the framework of classical or quantum thermodynamics) is of no great help as soon as the Second Law is questioned – because that machinery was based on the Second Law itself! Maybe most of those modern researchers who were looking in vain for the "demon" fell into this very trap: they were using old tools that were inadequate for exploring new scientific territory. We really need a fresh way of thinking, a sword that cuts through the clouds of illusion. Some people call it the Zen approach to the problem.70 71 72 We must develop new scientific methods, new mental tools, to deal with syntropy.73 74 75 So let us start with another kind of thought. For starters, the simplest image will do. Imagine a heat reservoir (e.g. a water basin) that is in a state of thermodynamic equilibrium (TE). This means that both temperature and pressure are equalized everywhere inside the basin, and there are no water flows (neither linear flows nor vortices – water is at rest everywhere). Such a TE is attained if the water is left to itself for a while, without any external influences. This final state of TE is the state with maximum entropy. Now imagine that we record a video of the water basin (TE state) on sensitive film. Then let us play this film in reverse. The water is still, the temperature and pressure do not change, so we cannot discern any difference when looking at the film being played in the "right" direction, or in reverse.
Even if the recording is made with an electron microscope that can “see” individual water molecules, no unique direction of time can be discerned by comparing both films. This fact is due to so-called microscopic reversibility: all physical processes are reversible in time.76 The only exception is the Second Law, but it refers to processes that cannot be defined on the microscopic level.77 What is the conclusion regarding our experience viewing both films? When the basin is in TE, the water molecules do not know time’s direction. For them, both directions are equivalent. But now let us imagine that we introduce a hypothetical Maxwell’s demon into the water basin. The water and the “demon” are two entities that constitute a physical system. For the sake of simplicity and mental clarity, let us imagine that this system (water + “demon”) is free of any external influences, therefore isolated from the external environment. Now the “demon” starts to diminish the water’s entropy. (Below I will explain how this can be imagined.) As time goes on, entropy decreases. The “demon” is responsible for a syntropic process, and this process has a definite direction of time (a time arrow). The water molecules arrange themselves into a higher state 41 PART T wO of order (greater syntropy), so they must be receiving information about the preferred direction of time. But how can they get it? 78 It is important to know that every single molecule should continuously receive this information about the chosen arrow of time. Not the system as a whole, but every single molecule (or some other tiny thermodynamic particle, e.g. the electrons). Why is that so? The Second Law originates on the level of thermodynamic particles, and now we are “inventing” a new kind of physical process called syntropy. We must find a solution on this very microscopic level.79 How, then, can water molecules collect information about the arrow of time? Syntropy is not a momentary event but a continuous process, so molecules must receive this information continuously, not on account of a certain energy ex- change. Such energy exchanges are active only for a short time, until the energy differences are exhausted. They are entropic by their nature. 2 To make things even clearer in our imagination, let us enumerate several possibilities regarding this hypothetical syntropic ordering of water molecules. They could rearrange their thermal motion in order to transform their stochastic (chaotic) movement into an ordered flow along a single direction. If the water molecules within a basin at room temperature were really capable of performing this trick, then the water flow would reach a speed of approx. 500 m/s, which is comparable to the speed of a supersonic aircraft. A water jet from the nozzle of a Pelton turbine can never reach such a tremendous speed (the maximum is approx. 100 m/s), so we can only wonder what an enormous amount of mechanical energy one could derive from water if this trick were feasible! However, this “magic” is not at all against the Law of Energy Conservation (the First Law of Thermodynamics). We are dealing with the conversion of invisible thermal energy (inherent in every single molecule) into the visible mechanical energy of the water flow. There are also other hypothetical possibilities as regards the syntropic ordering in the water basin. For instance, the “demon” could take thermal energy from the left side of the basin and transfer it to the right side. 
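Before continuing with this second possibility (heat transfer from one side of the basin to the other), the approx. 500 m/s figure quoted above for a fully ordered flow deserves a quick numerical check. Below is a minimal sketch using only Boltzmann's constant and the mass of a water molecule; the exact prefactor depends on which thermal average one takes, so the result should be read as an order-of-magnitude estimate rather than a precise value.

```python
import math

k_B = 1.380649e-23               # Boltzmann constant, J/K
T = 293.0                        # room temperature, K
m_H2O = 18.015 * 1.66054e-27     # mass of one water molecule, kg

# Root-mean-square thermal speed from (1/2) m v^2 = (3/2) k_B T
v_rms = math.sqrt(3 * k_B * T / m_H2O)

# Mean speed of the Maxwell-Boltzmann distribution, for comparison
v_mean = math.sqrt(8 * k_B * T / (math.pi * m_H2O))

print(f"v_rms  ~ {v_rms:.0f} m/s")    # roughly 640 m/s
print(f"v_mean ~ {v_mean:.0f} m/s")   # roughly 590 m/s
```

Both estimates lie in the range of several hundred metres per second, i.e. of the same order as the approx. 500 m/s quoted above, so a hypothetical fully ordered flow would indeed be supersonic.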
Also in such a case we should get a kind of flow, namely a heat transfer from left to right. After a while, the right side would get warmer, and the left side cooler. The temperature gradient would 42 11. T wO CL A ssEs Of syNTROPIC PROCEssEs increase inside the basin. Such heat flow goes in the opposite direction, as compared to the entropic heat currents in our everyday experience.80 Mathematically, it can be expressed by the inverse direction of time. This is exactly the opposite of what is usual in our everyday experience: we have syntropy versus entropy. Maxwell imagined a “demon” standing near a small opening between the two halves of the container with molecules in thermal motion. But we can change our perspective: the “demon” is not waiting besides a small opening; rather it can be spread everywhere among the molecules. This approach is a little more abstract, but it will lead us much farther. In fact, the “demon” can be represented by a field of physical influences that act on thermal particles (molecules, atoms, electrons). If the “demon” is active, then thermal particles continuously receive information about the direction of time. Now our main question is: if time-oriented information is transmitted to thermal particles, through what kind of physical influences can this be achieved? 11. Two classes of syntropic processes There are various possibilities to realize this idea, and they can be classified into several classes of syntropic phenomena. Each class refers to a definite model of informational transfer to thermodynamic particles (in practice, each class refers to a specific smart trick). As we have seen, the respective information is time-oriented but without entropic energy exchange. It is like a subtle guide that navigates every single particle. Each particle is endowed with its thermal energy of the order kT (where k is Boltzmann’s constant and T is the absolute temperature), but initially this energy is in a chaotic, disordered state. The “subtle guide” (in fact, the “demon”) tells the particle how to manage its thermal energy, so that higher thermodynamic order is achieved. Soon we shall understand this better through several examples. The “demon” is not separated from the thermodynamic particles. It is fused together with them. Physical fields that impregnate the specified thermodynamic system are carriers of time-oriented information, but in fact each thermodynamic particle by itself is the active aspect of the “demon”. We recognize a distinction between information (manifested by various physical fields) and thermal energy 43 PART T wO (of the order kT , attributed to every single particle). But if taken together, we can see the “demon” fused with both aspects: information and energy. We can com- pare this idea to a parable from Indian mythology: the god Shiva is the giver of subtle information, while the goddess Shakti is the manifested world in action.81 2 In this treatise, two classes of syntropy will be presented. Although it is too early to establish any canonized “classification” of syntropic phenomena, the examples that I am dealing with can be arranged into two groups (“classes”) enumerated below. 
Very probably there are even more than these two classes, but until now my main progress has been achieved with these two, so I can offer some knowledge about them.82 Other people can disclose additional examples of syntropy (some of them hypothetical, some of them nearly confirmed), as we can see especially from the vast collection of papers that I shall call the San Diego Files. These are from the numerous presentations at the three San Diego conferences,83 from the conference in Prague,84 and from many proposals included in the book written by Vladislav Čápek and Daniel P. Sheehan.85 My own search for syntropic phenomena is primarily based on arguments related to the arrow of time. But it seems that many cases of syntropy that are included in the San Diego Files are more easily discerned by a different methodology. So they would better fit into additional classes of syntropy. The proposals from the San Diego Files cover a wide temperature range: there are tiny superconducting rings at cryogenic temperatures, but also intriguing physical systems with high-temperature plasma. There are also many proposed devices planned to function at room temperature: oscillating p-n junctions (charge-discharge oscillation), systems with magnetostrictive materials, devices making use of gravity, and much more. So I believe that additional classes of syntropy will be established in the near future. 2 44 11. T wO CL A ssEs Of syNTROPIC PROCEssEs As already mentioned, here we shall investigate only two classes of syntropy: Class 1. Syntropy produced by the influence of a stationary magnetic field; Class 2. Syntropy based on polyphase oscillations of quantum states. Class 1: A magnetic field (like every physical field) has a specific quality with regard to its symmetry properties in space and time. We know that a magnetic field is a vector quantity with a certain direction in space. One specific feature of a magnetic field is that its direction is reversed if time is reversed. In mathematical language, we say that a magnetic field displays an odd functional dependence upon the variable of time. But this means that a magnetic field is replete with infor- mation about the direction of time. However, it is not just trivial to convey this hidden information to thermodynamic particles in such a way that the particles could display syntropic behaviour. In the following chapters (Part Three of this treatise) we shall show some tricks to see how this can be done. Class 2: A quantum state (e.g. the quantum state of an electron) can be spread over a considerable region of space. Then it is not localized within a single spot, so we refer to it as an alocal quantum state.86 Now imagine that several space-regions of this a-local quantum state are imbedded in several (at least 3) oscillating potentials (potential energies). Now imagine that these oscillating potentials are mutually phase-shifted, so that all of them together form a polyphase oscillation. This idea is not new since our three-phase electric system is based on a polyphase oscillation. Mathematically, phase shifts between discrete phases yield information about the arrow of time. But the inherent nature of each quantum state is informational interconnectedness.87 In our case, the quantum state is influenced by several (at least 3) time-shifted oscillations. The complete set of time-shifted oscillations is impressed into the quantum state as a whole, and thus the arrow of time is impressed into the quantum state as well. 
In this way an electron (or some other thermodynamic particle) can continuously receive time-oriented information, and this is the obvious condition for the emergence of syntropic behaviour. We shall clarify this rather abstract idea by examples later, in Part Four of this treatise. Syntropic processes of “Class 1” can originate within a stationary magnetic field. Such a field displays a specific property: it can exert an influence on charged particles while the field together with the particles is in thermodynamic equilibrium (TE). This is because magnetic influence is not explicitly based on energy transfer. 45 PART T wO It is a quite subtle kind of influence. Initially, a system of thermodynamic particles, impregnated with a magnetic field, is in TE, and then the syntropic process pulls the whole physical system out of TE. Entropy decreases, syntropy increases. It is important to know that the specified physical system (with the magnet included) is isolated from the environment – a condition we must take into account when confirming violations of the Second Law. Syntropic processes of “Class 2” are slightly different. A thermodynamic system of quantum particles is impregnated with a polyphase oscillation of electric potentials. At the beginning, this oscillating physical system is not exactly in TE. However, the initial entropy is nearly at maximum, and it can be shown that this initial difference to maximum entropy can be arbitrarily small.88 Due to the syntropic polyphase influence the entropy can decrease still further. This system (with a polyphase resonator included) is also isolated from the environment, and the entropy within this isolated system can decrease – which is against the Second Law. We can see an apparent difference from Prigogine’s dissipative structures: his systems are open, so they feed on the negative entropy89 of the environment. We shall further expound on both classes of syntropic behaviour in the following chapters. Theoretical predictions will be presented, and also some experimental proofs if they already exist. Let us start with “Class 1”. 46 PART T wO 48 PART THREE SYNTROPY IN MAGNE TIC FIELDS FIG. 7. Chirality: A chiral object is distinguishable from its mirror image. These shells of two different species of sea snail are mirror images of each other� Both forms are symmetrical but still different� We see the left-handed helical form of a shell (on the left) and a right-handed helix (on the right). 12. Symmetry properties of a magnetic field The study of symmetry relations is a powerful tool in contemporary theoretical physics. Here we are interested in the symmetry properties of a magnetic field. We have already mentioned its odd functional dependence with regard to time inversion. Now let us see how this field behaves with regard to mirroring in space. Maxwell’s theory of electromagnetism90 91 is based on a pair of two cardinal fields: the electric field and the magnetic field. Both of them are vector functions (they have a specific direction in space), and both vectors are complementary in many respects. Let us compare their symmetry properties. Expressed in mathematical terms, the magnetic field is an odd function of time (if time is reversed, also the field is reversed), while the electric field is an even function (the field is not reversed if time is reversed). 
And with respect to space, the magnetic field is an axial vector (its mirror plane is perpendicular to the vector), while the electric field is a polar vector (the vector lies within its mirror plane). Polar vectors display a well-defined direction (like a vector of speed, force, etc.). Furthermore, the "visible" direction of axial vectors can be defined only by means of so-called chirality (left-handedness or right-handedness, see Fig. 7).92 An example: angular momentum is an axial vector whose direction is along the axis of rotation, but we need reference to a right-hand screw in order to define whether its orientation is "up" or "down". Similarly, the physical direction of a magnetic field is determined by the right-hand rule, therefore by means of some chiral property. Chirality is the bridge between two vector families, between polar and axial vectors.

Let us imagine a "demon" that could arrange the stochastic (chaotic) movement of electrons into an ordered flow along a single direction, and let us assume that this is done through the influence of a magnetic field (syntropy according to Class 1 from the preceding chapter). The ordered flow of electrons is simply an electric current, which (like every flow) is represented by a polar vector. So we must "multiply" the magnetic field by some chiral quality that has a kind of a helical (screw-like) structure. We shall explain later how to do this. There is also another bridge between axial and polar vectors. This bridge is the so-called cross product (or vector product):93 if we take one axial vector and one polar vector, and then we multiply these two vectors, their vector product is a polar vector. A special feature of this vector product is that the three constituent vectors are oriented in three different directions (Fig. 8).

FIG. 8. The cross product of two vectors, a and b, is a third vector c, written as the product c = a × b. Vector c is perpendicular to the plane (yellow) containing a and b. The cross product is defined in 3D space, so the view is shown in perspective.

Let us assume we are analysing a hypothetical syntropic electric current that is generated by a magnetic field within conditions of thermodynamic equilibrium (TE). We could say that the magnetic field is the informational aspect of the "demon". So this aspect can assume at least two forms of combined influence: a) the magnetic field + the polar vector in a different direction from the magnetic field; b) the magnetic field + chirality (the chiral structure of the physical material or of the magnetic field). Symbolically, these two modes of syntropy can be expressed by the functional relations: a) j = a × B; b) j = C ⋅ B. Here, j stands for the vector of the syntropic current, and a stands for a polar vector (its physical nature is determined separately in each specific case). B is the vector of the magnetic field density, and C is a certain physical property with chiral symmetry that is characteristic of the system in consideration. Several experiments concerning examples (a) and (b) have been performed to date, and in both cases a syntropic current (j) was detected. One (a)-type experiment is presented in Chapter 13, and another one in Chapter 16. One (b)-type experiment is presented in Chapter 14. In all of these cases, the measured currents were quite minute because the experiments were performed with free electrons in a vacuum where the electron density was very small.
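The symmetry argument behind the relation j = a × B can be illustrated numerically. The following sketch is a toy illustration only (the vectors are arbitrary, not taken from any experiment): reversing the sign of B – which is what time reversal does to a magnetic field – reverses the predicted current, while the polar vector a is unaffected.

```python
import numpy as np

a = np.array([1.0, 0.0, 0.0])   # a polar vector (e.g. a surface normal), even under time reversal
B = np.array([0.0, 0.0, 1.0])   # magnetic field density, an axial vector, odd under time reversal

j = np.cross(a, B)              # symbolic syntropic current, j = a x B
j_time_reversed = np.cross(a, -B)

print(j)                 # [ 0. -1.  0.]
print(j_time_reversed)   # [ 0.  1.  0.]  -> the current direction is reversed together with B
```

The predicted current therefore inherits the time-orientation hidden in the field, which is the formal content of the statement that a magnetic field is an odd function of time.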
However, these minute currents were easily measurable. In fact, the currents even had to be small, so that the electrons were influenced mainly by the magnetic field, and not by collisions between electrons. The so-called coherence length of the electron quantum states had to be sufficiently long in order to provide an adequate magnetic influence on the charged particles. So in each case, the measured current was small but still not negligible. It arose from TE, and this fact is the crucial part of the proof that a syntropic process (one beyond the Second Law) was effectively detected.

13. Syntropy in a homogenous magnetic field

Here we start with utterly simple experimental facts, with a report of a surprisingly simple experiment. Due to the unusual simplicity of the complete experimental system, the measured results are even more convincing: phenomena beyond the Second Law do exist; this law does not hold universal validity. In 2003, two Chinese scientists (Xinyong Fu and Zitao Fu) published the results of a simple experiment with electrons in a vacuum tube.94 In the middle of a glass tube there was a small holder made of an electrically insulating material, and it supported two metal strips (pieces A and B, see Figures 9 and 10). There was a narrow insulating gap between the two strips. Both metal strips were covered on top with a thin layer of the chemical compound Ag-O-Cs (silver-oxygen-caesium). This compound is known to exhibit an extremely low electron work function.95 The work function is the energy that an electron at a surface of a certain substance needs in order to escape into a vacuum. Electron-emitting layers with a low work function are used in electron vacuum tubes (they are heated to considerable temperature), but by carefully choosing the material the Chinese scientists were able to make both metal pieces, A and B, release a certain amount of electrons into the vacuum even at room temperature.

FIG. 9. Experimental set-up: in the centre of a vacuum tube (circle) there is an insulating holder supporting two metal pieces (A and B). Both pieces are covered with a surface layer that can easily emit electrons even at room temperature (arrows). The metal strips A and B are electrically connected to a sensitive electrometer with input resistance R.

The metal pieces A and B were connected to a sensitive electrometer capable of measuring minute electric currents down to a flow of only 1000 electrons per second. The complete experimental system was kept at room temperature. A stationary magnetic field was produced by a permanent magnet placed next to the vacuum tube. The magnetic field density during each measurement was adjusted by changing the distance between the magnet and the vacuum tube. Many successive measurements were performed, each one relating to a definite magnetic field density.

FIG. 10. The paths of the free electrons emitted from surface layers A and B into the vacuum. Two situations: without a magnetic field (left), and with a magnetic field (right). This field is perpendicular to the plane of the drawing and is marked by the × symbols.

[Graphs of Test 1 and Test 2: the measured current I (in fA) and the measured voltage U (in mV) as functions of the magnetic field density B (in mT), recorded at 19 °C and 24 °C.] FIG. 11.
Measured DC current I and voltage V as a function of the magnetic field density B� The current reached a maximum at field density B ≈ 0.4 mT (where mT = millitesla, the unit for a weak magnetic field density), with a peak value of 15 fA (15 femtoamperes, a current of approx. 100,000 electrons per second). The voltage curve shows a similar dependence on the field density, and reaches 0.3 mV at the peak value. We see that the current and voltage curves are odd functions of the field. This fact is in complete accordance with the general theory of syntropic phenomena, and so it supports the syntropic origin of the current� (The last three figures are taken from the original paper, the one published in 2003.) In the absence of a magnetic field, the electric current between contacts A and B was zero. But with a magnetic field, the current reached a measurable DC value (Fig. 11). If we go back to Fig. 10, we can see the reason for the generation of a current: within the field, each electron travels along a circular path. After being released into the vacuum, its trajectory is bent rightward, as seen in the figure. So the electrons can easily jump from electrode A to electrode B, but not from B to A. If the magnetic field is reversed, the trajectory is bent leftward, and they travel in the opposite direction, from B to A. The crucial parts of the experimental system (the complete vacuum tube and the magnet) were in thermodynamic equilibrium (TE) throughout the tests. The current was generated from TE conditions, so there is no other clue for current generation but syntropic origin. In the years that followed the first successful attempts, the Chinese researchers continued to improve the electron-emitting layers inside the vacuum tube. The maximum measured current was nearly 200 fA at room temperature (22 °C), 55 PART THREE and also the measured voltage was considerably increased. The maximum electric power produced by this “syntropic generator” reached a value of approx. 10−15 watt (1 fW, one femtowatt).96 One can imagine this minute power like this: under a microscope we observe a tiny bacteria swimming in water (e.g. moving by means of its oscillating flagella). A typical bacterial cell is about 10 micrometres in length, and if it swims with a velocity of 20 micrometres (its double body length) per second, then it uses approx. one femtowatt of power for its slow locomotion. This is a small value but it still is a measurable macroscopic quantity, totally different from chaotic microscopic thermal fluctuations. Where does the electric power in the Shanghai experiments come from? The syntropic process produces a small potential drop (according to Fig.11, a voltage difference of 0.3 mV or less) between electrodes A and B. The electrons travelling in the vacuum from A to B are moving against this potential gradient (like a ball rolling up a slope), so on their way from A to B they lose one part of their kinetic (thermal) energy. This is the reason that the whole system is cooling down, slightly below the ambient temperature. In the Shanghai experiment, the thermal energy of our natural environment is continuously converted into measurable electric energy, and this fact goes against the Second Law of thermodynamics. 14. Syntropy in a chiral magnetic field Hitherto, a few similar experiments with electrons in a vacuum have been performed. Here I shall present two more tests with a positive outcome. 
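Before turning to those tests, the orders of magnitude quoted for the Shanghai experiment are easy to verify with elementary arithmetic. Below is a minimal sketch, using only the elementary charge and the figures given above (15 fA peak current, 200 fA and approx. 1 fW for the improved tube); the "implied voltage" at the end is simply derived from those two quoted numbers, not an independently measured value.

```python
e = 1.602176634e-19        # elementary charge, C

I_peak = 15e-15            # peak current from Fig. 11, A (15 fA)
print(I_peak / e)          # ~9.4e4 electrons per second, i.e. approx. 100,000

I_improved = 200e-15       # maximum current of the improved tube, A (200 fA)
P_max = 1e-15              # quoted maximum electric power, W (1 fW)
print(P_max / I_improved)  # ~5e-3 V: the implied voltage across the load is a few millivolts
```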
One of them was carried out as early as 1987 in Ljubljana (Slovenia) by the author of this treatise.97 There are two apparent distinctions between the Shanghai experiment and the one performed in Ljubljana (although both experiments follow a very similar idea): In Shanghai, a syntropic current was produced by a homogenous magnetic field; in Ljubljana, a syntropic current was produced by a chiral (helically twisted) magnetic field. In both cases, the magnetic field was stationary (static). In Shanghai, the rare-density electron gas was kept at room temperature; in Ljubljana, this electron gas was kept at a precisely regulated temperature of 190 °C. At this elevated temperature, it was easier to obtain a sufficient density of electrons in a vacuum.

Here I shall briefly present some details and the results of my experiment from 1987. The vacuum tube was maintained in precisely regulated thermodynamic equilibrium at an elevated temperature (190 °C). A special thermostat ensured that all parts of the vacuum tube were kept at the same temperature (temperature differences below 0.001 K, i.e. a thousandth of a kelvin), and this temperature did not fluctuate over time. This special care was taken for a serious reason: we had to be sure that only syntropy (and not any other effect) produced the measured electric current. A helically twisted magnetic field was produced in two different ways: either by a special coil (carrying DC electric current) in the form of a double helix (Fig. 12), or by a set of permanent magnets attached to iron poles in the form of a double helix.98 The measurements of the syntropic voltage gave the same results in both cases, provided the magnetic field density was equal. If the field was produced by an electric current I, measurements with a slowly varying magnetic field were carried out. A series of one-minute sweeps over a chosen interval of field density was performed (approx. from –0.25 mT to +0.25 mT).

FIG. 12. The vacuum tube containing rare-density electron gas (plasma). It is covered by two helical electrodes, denoted as H and G. Still closer to the exterior, the tube is wrapped with a coil in the form of a double helix, with helical pitch p. DC current I is passing through the coil. Note the relative position between the coil and both electrodes. One can measure a small syntropic current J if the measuring instrument is connected between both electrodes. (Figures 12 through 17 are taken from the original paper.)

FIG. 13. Experimental apparatus (electrical network, simplified). H1 = internal heater; H2 = external heater; S1 = platinum thermoresistor; S2 = thermocouple; C = coil in the form of a double helix; E = connections to the helical electrodes.

In this way, the dependence of the measured syntropic voltage on the magnetic field density was detected. The complete experimental system is outlined in Fig. 13. We can see that a considerable part (the upper half of the figure) of the whole set-up was intended for the precise stabilization of the temperature. Another part (the bottom left corner) was intended for the generation of a zigzag waveform of a slowly changing electric current. This current produced a chiral magnetic field within the helical coil.

FIG. 14.
The central part of the experimental system (not to scale and simplified): 1 – glass tube with a pair of electrodes in the form of a double helix, 2 – glass capillary, 3 – chamber with liquid caesium, 4 – supplementary thermostat maintaining a chosen vapour pressure of caesium, 5 – inner thermostat modelled as a “heat pipe”, 6 – fine metal grid soaked with water, 7 – copper/Teflon rings, 8 – inner thermostat: heater, 9 – inner thermostat: temperature sensor (Pt resistor), 10 – heat insulating material, 11 – outer thermostat, 12 – outer thermostat: heater, 13 – thermocouple between the inner and outer thermostats, 14 – coil in the form of a double helix (only its extreme left part is drawn completely). The third part (the bottom right corner) was meant for the measurement of syntropic voltage. A plotting instrument drew the voltage curves, in dependence on the magnetic field density. The next figure (Fig. 14) shows the interior part of the system in greater detail. In the centre, we see the glass tube with a pair of electrodes in the form of a double helix. The density of the electron gas within the tube was thermally regulated. The vapour pressure of caesium (which affects the electron work function) was regulated by a separate thermostat. The largest part of the system shown in the figure is the main thermostat with precise double-level regulation of the temperature. The helical coil embracing the vacuum tube is placed in the narrow space between the inner and outer parts of the thermostat. The voltage generated between both electrodes in the form of a double helix was measured. While changing the electric current through the coil (a slow “sweep” between two extreme values), the functional dependence of the measured voltage on the magnetic field density was plotted. Several plots are shown in Fig. 15. This experiment was a pioneering observation of a process beyond the Second Law – maybe even the first experimental proof in favour of syntropy. But the consequences are so profound that we need to discuss all of this in greater detail. 59 PART THREE FIG. 15. Typical plots showing how the syntropic voltage (U) between both electrodes is functionally dependent on the magnetic field density (B) in the centre of the vacuum tube. This field is linearly proportional to the field-generating current I ( I = 0�91 A at B = 0.2 mT). Ts = temperature of the vacuum tube with electron gas� 15. Discussion of both experiments My experiment was not as simple as the Shanghai experiment. In most of the variations of my experiments, a chiral magnetic field was electrically controlled, and there was also precise temperature stabilization. All of this required a great amount of laboratory equipment and a complicated electric network. A question arises: Did some side effects perhaps appear to be syntropic currents? Careful precautions are needed to exclude such a possibility. I did everything to exclude any artificial excitation of the electron gas. With the exception of the syntropic currents, the electron gas had to be at complete TE, together with the walls of the container. If there were some temperature gradients or chemical potentials, numerous side effects (e.g. a thermoelectric effect and other non-equilibrium phenomena) would have influenced the plasma behaviour, and a clear understanding of the measurements would be impossible. 60 15. 
DIsCUssION Of bOTH E xPERImENTs In order to be sure that the measured voltage was really of syntropic origin, I carried out many additional tests.99 For instance, I checked various symmetry properties of the hypothetical syntropic current. All of these tests succeeded as well. It can be shown by theoretical arguments that in principle my experiments could be modified in such a way that the syntropic current is generated without any external power source. For an explanation of these arguments, see Fig. 16 and Fig. 17. The graphs in Fig. 15 show an odd functional dependence on the magnetic field density, like the graphs from the Shanghai experiment (Fig. 11). There are many similarities between both experiments. Even the curves display a similar form. The scales of the two presentations are different (for instance, the peak voltage of 0.3 mV in comparison to 2 mV), but this is due to the different geometry of the two magnetic fields (homogenous vs. chiral), and also to the different temperatures used in the experiments. By means of symmetry arguments (Chapter 12), we can predict two distinct geometric arrangements of a syntropic set-up in a stationary magnetic field: FIG. 16. Sketch of a syntropic system in a chiral arrangement (the principle of an idealized experiment): the vacuum tube with electron gas (part A) has a pair of helical electrodes (shaded). They are attached to the resistive load R� The helical coil (C) is made of a superconducting material and by means of the permanent DC current I it produces a static magnetic field without any power consumption� All parts of this system are kept at the same temperature T� The syntropic current J brings electric power P to the resistive load R� 61 PART THREE FIG. 17. Another variant of the previous design: Here, a twisted magnetic field is produced by a permanent magnet (M, with poles N, S), attached to an iron yoke in the form of a double helix. Absolutely no power is needed to maintain the field. The whole system is kept at the same temperature T. Experiments like this (with positive results) have actually been performed. a) magnetic field + polar vector (both directions are different, preferably perpendicular); b) magnetic field + chirality (the chiral structure of material or of the magnetic field). The Shanghai experiment belonged to category (a). Symbolically, the syn- tropic current was the cross product of the magnetic field density B, multiplied by the normal vector a (a vector perpendicular to the electron-emitting surface). Meanwhile, my experiment belonged to category (b). Symbolically again, the syntropic current was the product of a magnetic field (expressed as the axial vector B), multiplied by the “chirality” C of this same field.100 The magnetic field had a chiral form because the field-generating coil was also chiral: a double helix is a classic example of a chiral structure. Chirality exists only in our 3-dimensional world, not within a fictive 2-di- mensional one.101 But mathematical modelling of 3-D systems is complicated. When I was preparing the experiment with the help of powerful computers (simulations of electron trajectories within a chiral magnetic field), I was lucky to have these computers at my disposal. They were still quite rare in those times. The calculated curves of the functional dependence between the magnetic field 62 15. DIsCUssION Of bOTH E xPERImENTs density and the syntropic current approximately matched the measured curves, but not precisely. 
It seems that the experimental set-up was a bit more complicated than the one inserted into the calculations. In practice, I could not avoid passive electric fields resulting from the surface inhomogeneity inside the vacuum tube. They could not be determined in advance, and although they were weak they could significantly affect the measured current. However, the passive electric fields in my vacuum tube still did not play such a crucial role as in the Shanghai experiment. It can be shown theoretically that the Shanghai experimental arrangement can yield a syntropic current only in the presence of a passive electric field normal to the emitting surface. If this electric field vanishes, the syntropic current inside the homogenous magnetic field (category (a) above) vanishes as well. The experimenters were lucky to have that hidden passive electric field. When preparing the experiment, they used a very simplified theoretical model that took into account neither the passive field nor the geometry of the whole space inside the vacuum tube. A simplified theory, ignoring these additional complications, gives (by good luck!) the resulting current, which was indeed measured in the Shanghai experiment. The vector a mentioned above is related to the passive electric field, which is normal to the electron-emitting surface of both metal electrodes, A and B. It results from a different work function of two different materials (the inner surface of the glass tube, and the metal electrodes). 2 With the chiral geometry of the set-up (the helical form of the electrodes and also of the field-generating coil), I succeeded in achieving considerably good efficiency of the energy conversion. The thermal energy of the electrons was continuously converted into measurable electric energy. Approx. 1% of the thermal energy within the electron gas was continuously converted into the electric energy of the measured syntropic current. In favour of using an elevated temperature (190 °C), the density of the electron gas in my vacuum tube was many orders of magnitude greater than the density in the tube from the Shanghai experiment, which was kept at room temperature (22 °C). 63 PART THREE Consequently, my syntropic current was also much greater. The electric pow- er produced by my “syntropic generator” reached a value of approx. 10−12 W (1 pW, one picowatt). Can we imagine this minute power? A small larva with a tiny body length of 0.5 mm is swimming in water, at a speed of 0.5 mm per second (its body length). It needs just 1 pW of mechanical power to swim like this. Now the movement of this tiny creature is visible with the naked eye, we do not even need a microscope to observe it! This power was 3 orders of magnitude (1,000 times) greater than the maximum power from the improved tube in the Shanghai experiment. However, that experiment was exceptionally simple, realized at room temperature, and with a permanent magnet. It is so convincing just because of its simplicity. If the results of both experiments are considered together and fused into a single scientific idea, then we can be even more assured that physical phenomena beyond the Second Law really do exist, here in our manifested world. 16. More evidence of syntropy There are even more experimental proofs in favour of syntropy. Two Russian scientists, Alexander Perminov and Alexey Nikulov, studied the behaviour of rare-density electron gas in a circular magnetic field produced by a straight-line electric current. The magnetic field forms circles around the wire. 
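The statement that the field of a straight wire "forms circles around the wire" can be quantified with the textbook formula B = μ0·I/(2πr). Below is a minimal sketch; the 10 A current and the centimetre-scale radii are illustrative assumptions introduced here, not values taken from the original Russian paper.

```python
import math

mu_0 = 4 * math.pi * 1e-7    # vacuum permeability, T*m/A

def B_around_wire(I, r):
    """Magnetic field density (T) at distance r (m) from a straight wire carrying current I (A)."""
    return mu_0 * I / (2 * math.pi * r)

# Illustrative values only: a 10 A current and radii of the order of the two glass tubes
for r in (0.005, 0.010, 0.020):
    print(f"r = {r*100:.1f} cm  ->  B = {B_around_wire(10.0, r)*1e3:.2f} mT")
```

The field falls off as 1/r, so the electrons between the two tubes experience field densities of a few tenths of a millitesla for currents of this order, comparable to the fields used in the experiments described above.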
We can enclose the wire with two concentric glass tubes of different radii (Fig. 18), and then pump the gas out of the space between both tubes. Again, electrons evaporate into the vacuum, and their trajectories are influenced by the magnetic field. The Russian scientists performed an experiment of just this type.102 They detected a small electric current (0.14 μA) creeping along the axis of both tubes. This current was trapped by a pair of electrodes and quite easily measured. The whole system was kept at an elevated temperature, in order to achieve an adequate density of electron gas (like in my own experiment). There were certain temperature differences between different parts of the system. (Therefore, unlike in my own experiment, the whole system was not kept in strict TE). However, the Russian scientists deduced by logical argumentation that the measured electric current did not originate from these temperature differences. 64 16. mORE E VIDENCE Of syNTROPy e I B FIG. 18 (a). The Russian experiment: electrons (with charge e) are hopping leftwards in magnetic field B (here represented by the large circle) of the straight-line current I along the central wire� Only the inner glass tube is shown in this drawing from the original paper� I FIG. 18 (b). The corresponding design from my own numerical analysis: the complete system with two concentric glass tubes (with a vacuum space between both tubes), and a pair of metal electrodes (shaded) on the left and right ends. 2 A few decades ago in my own country, we (Milan Hodošček and the author of this treatise) analysed a similar physical arrangement by computer modelling.103 Just like in the Russian lab, we confirmed the spontaneous generation of a syntropic current streaming between both electrodes. We calculated the dependence of the syntropic current on the magnetic field density (Fig. 19a), and on the potential difference between both electrodes (Fig. 19b). The electrons were travelling against the electric field between both electrodes, and so they were losing their natural thermal energy, and transforming it into electric energy. This is the essence of syntropic phenomena. We also varied several geometric parameters (the length of both tubes, the radii of both tubes), in order to find the best geometric arrangement of the system. We even introduced all possible electric fields (including spontaneously generated radial fields) into our calculations. The results for one set of parameters 65 PART THREE FIG. 19 (a). Dependence of the syntropic current on the magnetic field density at zero electric field. The magnetic field density is proportional to the dimensionless parameter b, and the syntropic current is expressed by the number of electrons (N) in 106 numerical units of time� Statistical dispersions are marked by short vertical lines� FIG. 19 (b). Influence of the electric field: dependence of the syntropic current (again expressed by N) on the voltage (in dimensionless units) between both electrodes (at b = 1), for two examples of different tube geometry. (Both graphs are from the original paper.) 66 17. wAys TO GENER ATE syNTROPIC POwER fROm sEmICONDUC TORs (short-circuited electrodes, zero electric field) are shown here in graph (a), and some results for a non-zero electric field are shown in graph (b). 17. Ways to generate syntropic power from semiconductors Let us review: the spontaneous generation of an electric current was detected inside at least three mutually independent laboratories, in different parts of the world. 
These currents were generated merely by the influence of a static magnetic field on electron gas in TE. The electrons were spontaneously directed towards a higher electric potential, so that their thermal energy was converted into useful electric energy. We are hot on the trail of Maxwell’s demon. Maxwell’s original idea was a “demon” located outside thermal particles. The ensuing theories of the “demon” erroneously clung to the same concept of separation: here are the particles and there is the hypothetical “demon”. Those theories had unsolvable problems with the transmission of information between the “demon” and the thermal particles. But the demon’s nature may be quite different. In our case, every single electron is Maxwell’s demon by itself. Both entities are fused into one. 2 In all of the experiments listed here, the measured syntropic currents were extremely small. The main reason for the minuteness of these currents was the very low density of the electron gas. The greatest power that we could generate was only 1 picowatt,104 enough for a tiny larva, but not enough for humans. But that larva may represent the early stage of the most beautiful butterfly, etc. So, how can we proceed? In a vacuum and at equilibrium conditions at moderate temperature, we cannot reach a much higher density of electron gas.105 Its density is low even at an elevated temperature (as it was in my experiment from 1987), so we can achieve a much lower density at an ambient room temperature (as it was in the Shanghai experiment). 67 PART THREE But we seek Maxwell’s demon, capable of working at an ambient room temperature, the natural temperature of the environment. This is an important point. We want to extract the ambient heat from the natural environment (from the air, from sea water, etc.), and convert it into useful electric energy. All energy that we use eventually returns to the natural environment – so it is the only place from which we should be allowed to extract energy. I shall expound on this statement in Part Five of this treatise (especially in Chapters 28 and 29). 2 Is there any way to modify the experimental system with electron gas in a magnetic field so that we can achieve a much higher density of the electron gas? The answer is yes – if we replace electron gas in a vacuum with electron gas inside a semiconducting material. There are many semiconductors (such as silicon, gallium arsenide, etc.) with a high density of electrons that move nearly freely inside the material, much like they wander around inside a vacuum. However, this is not a totally free movement: the electrons are scattered on crystal imperfections and also by thermal vibrations (so-called phonons).106 107 The mean free path of electrons inside semiconducting material is much less than one micrometre (except in extremely pure semiconducting crystals cooled down to extremely low temperatures). This fact poses severe experimental challenges that are not easily overcome. The bright scientific idea of Maxwell’s demon is then transformed into a tedious matter of technology. We need to develop a totally new kind of a semiconducting device. Now we know about semiconducting devices such as diodes and transistors, as well modern field-effect transistors (FETs), light-emitting diodes (LEDs), charge-coupled devices (CCDs), semiconducting lasers, etc.108 These semiconducting compo- nents are integrated into modern chips, and also into our smartphones. 
The development of each electronic component from the list above took many years of work in academic and industrial laboratories. We can hardly expect that the realization of a usable syntropic generator will take much less time. Namely, our computer simulations have revealed that this new device has very specific 68 17. wAys TO GENER ATE syNTROPIC POwER fROm sEmICONDUC TORs requirements, in many ways quite different from anything that has hitherto been tested in laboratories. But accepting this exciting challenge is more than worth it. There are strong indications that we could succeed within a few years, especially if we establish friendly cooperation between dedicated and well-qualified research teams – theoretical and experimental. Most certainly, it is a worthy and noble endeavour because we urgently need syntropic sources of electric power. Day after day we receive further serious indications of a threatening climatic catastrophe, which obviously may trigger other kinds of calamities (water and food shortages, massive migration, wars and social revolutions, etc.). A syntropic generator is ecologically a totally clean energy source; a solution for all the energy needs of the emerging post-industrial society. 2 We are already headed towards the practical realization of such devices. The theoretical basis of syntropic currents in semiconductors is becoming clearer. Many details have already been studied. In my country, we have upgraded the basic theory with extensive numerical modelling, and now we are doing some preparatory work in European laboratories where pilot semiconducting samples are being fabricated. We are focusing on the development of a device that is somewhat similar to an FET transistor, more specifically to an HEMT transistor (high electron mobility transistor).109 The electrons in such devices are distinguished by an adequately long free path, so they can be manipulated by a magnetic field in a similar way as in a vacuum. However, when seeking syntropy in semiconductors, this free path is still rather short. So we must use a magnetic field of high field density (at least one tesla). This field is practically homogenous within the range of the free path in a semiconductor. So we could fabricate something similar to the device in the Shanghai experiment (Chapter 13), with the important distinction that the electrons should not be emitted into a vacuum but into the semiconducting material. At least two different materials should bond together: one of them is the electron-emitting surface, and the semiconductor with high electron mobility (hence a long free path) is another. So this is a so-called heterostructure, 69 PART THREE a well-known topic in semiconductor technology. Several layers of different materials are joined into a single crystal. The best structures of this kind are nowadays produced by MBE technology (molecular beam epitaxy).110 It was mentioned in Chapter 15 that the passive electric field (the field perpendicular to the emitting surface) plays an important role in this type of syntropy. Numerical modelling of the proposed syntropic generator has revealed that the required value of this field is limited to a narrow interval. This is quite different from the fields in present-day heterostructures. Likewise, electron mobility should not be low, but (amazingly!) also not too high. Thus we find another interval, the interval of the allowed mobility values. 
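Staying for a moment with the magnetic-field requirement: a rough cyclotron-radius estimate shows why a field density of at least one tesla is needed when the mean free path is well below a micrometre. The sketch below assumes the literature value of the GaAs effective mass (approx. 0.067 of the free-electron mass) and a simple 3kT/2 thermal-velocity estimate; it is an order-of-magnitude argument, not a device calculation.

```python
import math

k_B = 1.380649e-23       # Boltzmann constant, J/K
m_e = 9.1093837e-31      # free-electron mass, kg
e = 1.602176634e-19      # elementary charge, C

m_eff = 0.067 * m_e      # effective electron mass in GaAs (literature value)
T = 300.0                # room temperature, K

v_th = math.sqrt(3 * k_B * T / m_eff)      # thermal velocity estimate, m/s
for B in (0.1, 1.0, 10.0):
    r_c = m_eff * v_th / (e * B)           # cyclotron radius, m
    print(f"B = {B:4.1f} T  ->  r_c ~ {r_c*1e9:.0f} nm")
```

Only at fields of the order of one tesla or more does the cyclotron radius shrink to a few hundred nanometres or less, i.e. to the scale of the electron's free path, so that the field can bend a trajectory before the electron is scattered.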
Furthermore, even the thickness of each layer is important, and there are also some other specific requirements. Most of them have not yet been sufficiently studied and tested.111 Overall, a great deal of additional work needs to be carried out in order to fabricate the first successful samples – those that can mimic the realization in a vacuum but at a much higher power level. At present, we are still learning how to find the right choice of materials and the right structure of the proposed syntropic device. Numerical models seem to promise that we could reach a maximum electric power of approximately 0.1 watt if the electron-emitting surface has an area of 1 cm². These calculations were made for electrons in gallium arsenide (GaAs) at room temperature. Most probably, subsequent technological developments will in time enable the fabrication of multiple layers inside a single heterostructure. If there are 100 structural repetitions inside the semiconducting slice, perhaps up to 10 W/cm² of useful power density can be achieved. The active volume of the GaAs (this is where the syntropic process takes place) is so small that it is almost unbelievable: theoretically, we could extract 1 kW of electric power from one cubic centimetre of a semiconducting material such as GaAs. This new kind of semiconducting chip is inserted into a strong static magnetic field. A permanent magnet with appropriately shaped iron poles can do quite well for this purpose, but large generators for massive power production may use electromagnets, possibly superconducting coils. So this could be the beginning of something really new. The social and philosophical implications of these advanced technologies are of paramount importance. They will be explained in Part Five of this treatise.

18. Syntropy in chiral materials

We have not yet exhausted all the different variants of syntropy in a homogeneous magnetic field. We have just described a generator according to point (a) from Chapters 12 and 15, but let us now turn briefly also to point (b) from those same chapters. The syntropic current j was symbolically represented by the equation j = C ⋅ B. Here, B is the magnetic field density, and C stands for a specific chiral property of the semiconducting material.112 In terms of symmetry properties, C is a scalar quantity (more precisely: a pseudo-scalar); therefore, the current j has the same direction as the magnetic field B. No other vectors but field B have the role of the "demon" here. A piece of a semiconductor with a chiral internal structure is simply inserted into a strong magnetic field. The syntropic electric current is aligned with the magnetic field, and electric power is generated throughout the whole volume. Therefore, this power is proportional to the volume of the semiconducting material. Chiral semiconductors do exist, e.g. monocrystals of elemental tellurium (Te) or cinnabar (HgS).113 However, analytical estimations show that we cannot expect usable syntropic currents from these crystals. The lattice parameter is too small, incomparable with the radius of the cyclotron resonance in semiconductors. Much more is expected from chiral structures with a larger helical pitch length, e.g. approx. 10 nanometres. The fabrication of semiconducting materials with a chiral structure of this kind has yet to be developed, but we are familiar with one exotic material that can be used to test the principle.
2 71 PART THREE Carbon nanotubes (CNTs) are assembled from carbon atoms that are arranged into a regular hexagonal mesh.114 Two types of CNTs (the “zigzag” type and the “armchair” type) are not chiral, but a third type displays chiral properties (Fig. 20). The grade of chirality is determined by the helical twist of the nanotube structure, specified by two parameters (n and m)115. Quantum theory predicts how electrons are allowed to move with regard to a definite arrangement of carbon atoms inside the tubular wall. The electric conductivity depends on the numerical pair (n, m), and the chosen CNT can behave in one of three different ways: like a metal, a semiconductor, or an insulator.116 FIG. 20 (a). A pair of single-wall carbon nanotubes (side view). We see carbon atoms arranged in a tubular mesh-like structure� Here a chiral type of CNT is presented: the cylindrical mesh is distinguished by the helical pitch (see the blue pattern at the top), so that a helical form of crystal lattice is obtained� This form can appear as a left-handed or right-handed screw (the left and the right drawing, respectfully). Both forms are mutual mirror images. FIG. 20 (b). The same pair of CNTs (top view). Again, we can see a left-handed or right-handed screw� 72 18. syNTROPy IN CHIR AL mATERIAL s Now let us apply a magnetic field to a single chiral nanotube, so that the magnetic field is aligned with the nanotube axis. If this field is very strong (e.g. a magnetic field density of 10 tesla inside a superconducting coil), then the quantum states of the electrons are dramatically modified. One can achieve that electrons circle around the tube in only one direction (e.g. clockwise), but not in the opposite direction.117 In addition, electrons also try to move along the tubular axis. This linear movement is coupled with circular movement around the tube, since the quantum state of the electron is a combination of both possible movements. Now let us observe two electrons: the first one is travelling in one direction along the nanotube axis, and the second one is moving in the opposite direction. Meanwhile, both of them are also circling clockwise around the tube, since a magnetic field does not allow counterclockwise circulation. The quantum state of the first electron can be described by a left-handed helix, while the second electron is marked by a right-handed helix. Quantum wave functions of both electrons display a chiral structure. Here we must also remember the chiral structure of our chosen CNT. With properly adjusted parameters m and n, the first electron behaves like an electron inside a semiconductor (or even in a metal), while the second one behaves like an electron inside an insulator: it cannot move, its quantum state is bound to a fixed location. In principle, both electrons have the same thermal energy (approx. kT), but only the first electron can contribute to the electric current. So we get a syntropic electric current in one direction along the nanotube axis. If the magnetic field is reversed, then only counterclockwise circulation is allowed, and the syntropic current is reversed, together with the field reversal. A syntropic effect in chiral materials (such as CNTs) has not been verified experimentally, but it is not very difficult to test for. In case of success, it could be another proof in favour of syntropy; it could open another area of feasible syntropic arrangements. Today, we can observe each single nanotube under an electron microscope in order to determine its m and n parameters. 
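The dependence of CNT conductivity on the pair (n, m) mentioned above follows a widely used rule of thumb: a single-wall tube is (nearly) metallic when n − m is divisible by 3, and semiconducting otherwise; wide-gap semiconducting tubes then behave essentially as insulators at room temperature. Below is a minimal classifier sketch under that simplified rule (curvature corrections and exact band gaps are ignored):

```python
def cnt_character(n: int, m: int) -> str:
    """Rough electronic character of a single-wall carbon nanotube with chiral indices (n, m)."""
    shape = "armchair" if n == m else ("zigzag" if m == 0 else "chiral")
    kind = "metallic" if (n - m) % 3 == 0 else "semiconducting"
    return f"{shape}, {kind}"

# A few examples: armchair tubes and some zigzag tubes are metallic, most chiral tubes are semiconducting
for n, m in [(10, 10), (9, 0), (10, 0), (6, 5), (9, 3)]:
    print((n, m), "->", cnt_character(n, m))
```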
So we are not (yet) ready to use this type of syntropy for the production of useful electric energy. However, the proposed experiment is of great theoretical value. Whenever possible, let us try to expand our perspective.

PART FOUR: SYNTROPY IN POLYPHASE QUANTUM STATES

19. The nature of time in quantum theory

Now let us turn to another great class of syntropic phenomena: syntropy in polyphase quantum states. With regard to any class of syntropy, the first question is how the arrow of time enters into the physical process (see the related arguments in Chapter 10). By what means is the unidirectional flow of time born in a physical system of thermodynamic particles? The arrow of time should not be determined by external influences. It should not feed on external order (the case of Prigogine's dissipative structures, see Chapter 7). Namely, syntropy must be born inside the system, because this is the only way to check whether we have really found a case that lies beyond the Second Law. Maxwell's demon is an integral part of the system. When we were discussing syntropy of the first great class (syntropy in a magnetic field), we found that the arrow of time is hidden inside the very physical nature of this mysterious magnetic field. Are there also some other possibilities? How can Mother Nature introduce the flow of time into a thermodynamic system without imposing external order upon it?

Careful theoretical analysis reveals that syntropic processes are quite improbable as long as we are bound to the classical description. Maybe they are even completely impossible. Is syntropy, then, a quantum process? It turns out that the first class of syntropy (the one with a magnetic field) is indeed a quantum process. In the experiments discussed, electrons were roaming around nearly freely, so that they persisted in the same coherent quantum state while "reading" the hidden message of time in the magnetic field. Besides, magnetic phenomena in general cannot be explained by classical physics. They exist only on the level of quantum reality.118

Likewise, the second great class of syntropy includes phenomena that follow the same principle: they are quantum processes by nature, and can be explained in the language of quantum physics. The most curious fact is that the flow of time is manifested quite differently depending on whether we look through a classical or a quantum lens.119 Here we need some more explanation. So, let us venture on a brief excursion into the subquantum world.120

Scientists have carried out astonishing experiments with electrons and photons. These particles were sent through a complex labyrinth of beam-splitters, polarizers, mirrors, and semi-transparent mirrors, so that a single particle (electron or photon) was spread over distant parts of the labyrinth and travelled simultaneously along several parallel paths (e.g. two paths). Definite actions (e.g. interaction with the electron's spin) were imposed on both "halves" of the electron, in a definite time sequence. Upon leaving the labyrinth, the electron was assembled again, and then the outcome was measured. The results of such experiments are amazing: the temporal sequence of actions that influence the electron's behaviour has a totally different logic than we are accustomed to in our everyday world.121 Successive moments of time are fused together (without a "before" or an "after") into one single experience of all-embracing duration. (Here again we encounter Bergson's idea of duration!)
Quantum information spans all moments of time within the duration of a definite quantum state (within a temporal period of quantum coherence), and it also includes the whole extension in space (within the area of quantum coherence in space). A quantum state is established by internal informational interconnectedness within the physical dimensions of space and time.122 The so-called informational field keeps together and organizes the integral structure of a quantum state, much as if the particle were a living organism impregnated by élan vital. The great physicist David Bohm pointed out that quantum information is the subtle pilot that organizes the space-time structure of a quantum state, much like the health of every living organism is maintained by its deeply ingrained "meaning of being."123 The informational compactness of quantum states is the door through which the arrow of time can enter into thermodynamic particles.

20. Polyphase cycles and circular diagrams

How can this be realized in practice? The trick is based mainly on two ideas, or scientific tools, from two quite distinct areas of science and technology. We merge the theory of polyphase oscillations with the theory of thermodynamic cycles.

Polyphase oscillations were invented by Nikola Tesla (a series of patents from 1888),124 who made good use of them in electric generators, polyphase motors, and in the transmission of electric power. Usually three sinusoidal oscillations of electric current (and also voltage) are used in our modern three-phase system of electric power distribution. There are three wires, and the currents in these wires are not in the same phase, but are mutually phase-shifted (Fig. 21).

FIG. 21. Electric current in the three-phase system: three sinusoidal oscillations are mutually phase-shifted. One whole cycle (running from 0° to 360°) is shown in the graph. The time axis goes from left to right. Relative to phase 1, phase 2 is delayed in time by ⅓ of the complete cycle, and phase 3 is delayed by ⅔ of the complete cycle.

The temporal delay between particular phases is pregnant with information about the arrow of time. For instance (the case in Figure 21), if we invert the phase shifts of phase 2 and phase 3 (in practical work with electric machinery, this is achieved by interchanging the electric contacts of these two phases), then we get an "inverse flow of time": the motor runs in the opposite direction. A single sinusoidal oscillation holds no information about the direction of time, but a polyphase oscillation definitely does.

The idea of thermodynamic cycles was conceived by the French engineer Sadi Carnot.125 It is widely used in descriptions of heat engines, such as the steam engine or the internal combustion engine. One cycle is a periodic process in which the thermodynamic system, after a series of changes, reaches its initial state again. Carnot plotted the pressure p of the heated gas in dependence on the changing volume V inside the cylinder of the engine. He thus obtained a graphic representation of one full cycle, and it is always a closed curve (Fig. 22).

FIG. 22. The p–V diagram representing one full cycle inside a Stirling type of heat engine (pressure p in bar, 1 bar = 0.1 MPa, versus volume V in cm³). The enclosed area (shaded) equals the work released during one cycle of operation. If we move clockwise (see the arrows), the heat engine has the role of a motor. If we move counterclockwise, the same engine functions as a heat pump.
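The sign convention behind the motor/heat-pump distinction can be checked numerically. The sketch below traces a simple closed loop in the (V, p) plane (the numbers are illustrative only, not the actual Stirling data of Fig. 22) and evaluates the circulation, the closed-path integral of p dV, with the trapezoid rule; traversing the loop clockwise gives positive work, and reversing the traversal flips the sign.

    import math

    def cycle_points(clockwise: bool, n: int = 400):
        """Closed loop in the (V, p) plane; V in m^3, p in Pa (illustrative ellipse)."""
        pts = []
        for k in range(n + 1):
            theta = 2.0 * math.pi * k / n
            if clockwise:
                theta = -theta
            V = 1.3e-3 + 2.0e-4 * math.cos(theta)   # volume around 1300 cm^3
            p = 1.2e6 + 2.0e5 * math.sin(theta)     # pressure around 12 bar
            pts.append((V, p))
        return pts

    def cycle_work(pts):
        """Closed-path integral of p dV, evaluated with the trapezoid rule."""
        W = 0.0
        for (V1, p1), (V2, p2) in zip(pts, pts[1:]):
            W += 0.5 * (p1 + p2) * (V2 - V1)
        return W

    if __name__ == "__main__":
        print("clockwise (motor):     W = %+.1f J per cycle" % cycle_work(cycle_points(True)))
        print("counterclockwise:      W = %+.1f J per cycle" % cycle_work(cycle_points(False)))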
The shaded surface inside the closed curve is proportional to the mechanical energy released by the engine in one full cycle of operation. Diagrams of this kind are also useful in analysing synchronous electric motors. There, one full cycle is a cycle of the alternating current, and Carnot's V and p are replaced by the electric current I and the magnetic flux Φ.126 Again, the shaded area inside the closed curve represents the work done during one cycle. Like a heat engine, an electric machine can also function in both roles, as a motor or as an electric generator. These two functions are symmetrical, mutually mirrored through the inversion of the time arrow (clockwise vs. counterclockwise circulation). From our examples we can see that, together with the inversion of time, the energy flow is also inverted (e.g. inside an electric motor electric energy is transformed into mechanical energy, while inside a generator it is just the inverse). We shall shortly encounter analogies on the quantum level, since this idea can be generalized: the area inside a closed curve describes the energy transfer within one cycle of a periodic process. The energy transfer can be positive or negative, depending on whether the process goes around clockwise or counterclockwise. Mathematically, the enclosed area is calculated by a special kind of integral, a specific example of the so-called circulation integral, denoted by the sign ∮.127

Two centuries ago, Carnot's mathematical tools opened the way to the Law of Entropy. Now similar tools, combined with other methods, are useful in opening the way to the Law of Syntropy. The arrow of time is found in both processes presented above: we have found it in polyphase oscillations, and also in energy transfer cycles. We can fuse these two ideas together into a single process, and bring all this to the quantum level.

21. Polyphase quantum states

Inside a semiconductor, electrons can move nearly without collisions. The quantum state of an electron can be spread over a considerably large distance, e.g. one micrometre. This distance is called the coherence length. Then imagine that with a certain trick (explained below) we build an artificial periodic electric potential inside the semiconductor, with a period shorter than one micrometre, so that the electron is spread over many periods. We can realize this type of periodic potential by means of a series of tiny nanoelectrodes positioned on the surface of our semiconducting sample. Let us connect these electrodes into three discrete arrangements. Every third nanoelectrode belongs to the same arrangement, which is connected to a certain voltage. Thus we achieve a potential that is a periodic succession of three potential steps of unequal height. One example is shown in Fig. 23.

FIG. 23. Graph of a periodic potential (three complete periods; potential U versus position x). Each period consists of three potential steps with a different potential height U. This potential can be generated by an array of nanoelectrodes (seen at the top), grouped into three arrangements (a, b, c). The three corresponding voltages (U_a, U_b, U_c) oscillate slightly up and down (marked by small arrows). However, these three oscillations (a, b, c) are mutually phase-shifted, in the mode of a three-phase oscillation.
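To make the nanoelectrode idea of Fig. 23 concrete, here is a minimal sketch that builds such a potential; every parameter in it is invented for illustration. Electrode sites are grouped into three interleaved arrangements a, b, c; each group carries its own static step height plus a weak oscillation, and the three oscillations lag one another by one third of a cycle.

    import math

    PERIOD_NM = 30.0                   # one spatial period (three electrodes), illustrative
    BASE_STEPS = (0.00, 0.04, 0.09)    # static step heights U_a, U_b, U_c in volts (invented)
    RIPPLE_V = 0.005                   # amplitude of the weak three-phase oscillation

    def potential(x_nm: float, t: float, omega: float = 2 * math.pi * 1e9) -> float:
        """Electric potential U(x, t): a three-step periodic profile whose steps
        breathe with mutually phase-shifted oscillations (phases 0, 120, 240 degrees)."""
        group = int(x_nm // (PERIOD_NM / 3)) % 3      # which arrangement (a, b, c) covers x
        phase = -2.0 * math.pi * group / 3.0          # phase lag of that arrangement
        return BASE_STEPS[group] + RIPPLE_V * math.cos(omega * t + phase)

    if __name__ == "__main__":
        t = 0.2e-9
        for x in (5.0, 15.0, 25.0, 35.0):             # sample points inside two periods
            print(f"x = {x:5.1f} nm  ->  U = {potential(x, t) * 1e3:7.3f} mV")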
We can adjust these three step heights by applying carefully chosen voltages to these three arrangements of nanoelectrodes. Now let us also imagine that these three voltages are oscillating slightly. A weak three-phase oscillation (the small arrows in the figure) is added to the periodic potential. We get a polyphase oscillation of the electric potential inside our semiconducting sample, and the arrow of time is hidden within it. An electron's quantum wave function depends on the ambient electric potential. Here we have a polyphase potential, and each electron is spread over all three phases (a, b, c). So the arrow of time can enter into the electron's behaviour.

New mathematical tools have been developed in order to calculate the expected value128 of the electron's electric charge within the region of every potential step. But since each step is oscillating, the charge under each electrode changes with time, so we calculate the respective charge during successive moments within one cycle of oscillation. For each of the three steps (a, b, c) and for every moment of time, we get a pair of variables: the electric potential U (in volts) and the calculated charge e (expressed in coulombs).

Then we are ready to plot a circular diagram, similar to what was explained in the preceding chapter. Here the variables are not p and V (or I and Φ), but the electric potential U and the calculated charge e. Again, we get a closed curve representing one full cycle. The area A inside the curve represents the amount of energy that is exchanged between one electron and the nanoelectrodes during one complete cycle (Fig. 24). There are three arrays of electrodes, so the complete energy transfer A is the sum (denoted by Σ) of three contributions A_n (n = 1, 2, 3), with each A_n corresponding to the energy exchange within the respective phase. The mathematical form of this calculus looks like this (we write it only for those who love mathematics):

A = A_1 + A_2 + A_3 = Σ_n ∮ U_n de_n  (n = 1, 2, 3).

Due to the compact informational interconnectedness on the quantum level, each electron can simultaneously "feel" all three phases of the electric potential. Its quantum state is adjusted to all three phases, and the mutual phase shifts disclose a certain direction of time. If we bring our knowledge of polyphase systems and circular diagrams to the quantum level, we get a non-zero energy conversion even on the level of thermodynamic particles. Electrons can deliver their inherent thermal energy to the set of nanoelectrodes.129

Let us review: the calculated quantity A is the energy exchange between one electron and the adjacent polyphase array of nanoelectrodes over one full temporal cycle. The circulation ∮ is used in this kind of calculus; as the small circle inside ∮ suggests, it denotes integration along a closed curve. The partial charge e_n is determined by the quantum behaviour of an electron within a three-phase potential. The mathematical machinery of this computation is certainly complex, so here I was only able to outline some basic facts and results.130

We can go around the enclosed area in two opposite ways: clockwise or counterclockwise. One can choose between them by selecting either a positive or a negative phase shift between adjacent phases. These two ways are mutually symmetrical, and can be represented by two opposite directions of time.
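The qualitative behaviour of these circular diagrams can be imitated with a deliberately classical toy model, which is only a stand-in for the quantum calculation and uses invented parameters: let the charge under one electrode respond not only to its own potential but also, more weakly, to the neighbouring phase. A single-phase drive then gives a degenerate diagram with zero enclosed area, while a phase-shifted drive gives a finite loop A_n whose sign follows the phase sequence.

    import math

    N_STEPS = 720
    SELF, CROSS = 1.0e-17, 0.3e-17   # response coefficients in C/V (invented toy values)

    def loop_area(phase_step_deg: float) -> float:
        """Enclosed area of the (U_1, e_1) diagram over one cycle, i.e. the
        circulation of U_1 de_1, when the neighbouring phase lags by phase_step_deg."""
        lag = math.radians(phase_step_deg)
        pts = []
        for k in range(N_STEPS + 1):
            wt = 2.0 * math.pi * k / N_STEPS
            u1 = 0.01 * math.cos(wt)                 # potential of phase 1 (volts)
            u2 = 0.01 * math.cos(wt - lag)           # potential of the neighbouring phase
            e1 = -(SELF * u1 + CROSS * u2)           # toy charge response under electrode 1
            pts.append((u1, e1))
        area = 0.0
        for (u_a, e_a), (u_b, e_b) in zip(pts, pts[1:]):
            area += 0.5 * (u_a + u_b) * (e_b - e_a)  # circulation of U de
        return area

    if __name__ == "__main__":
        print("single phase (0 deg lag):   A = %+.2e J (zero up to round-off)" % loop_area(0.0))
        print("three-phase, +120 deg lag:  A = %+.2e J" % loop_area(120.0))
        print("three-phase, -120 deg lag:  A = %+.2e J" % loop_area(-120.0))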
If we move counterclockwise (as in Fig. 24), the electrons deliver their thermal energy to the set of nanoelectrodes. The contacts (a, b, c) are connected to a three-phase resonator, so the three-phase electric oscillation is amplified.

FIG. 24. Two "circular diagrams", representing the energy exchange between an electron and the potential of the nanoelectrodes. The left diagram is "degenerate", since it refers to a single-phase oscillation: we go up and down along the same path, so the enclosed area is zero (A = 0). The right diagram refers to a polyphase oscillation: the enclosed area is non-zero and equals the shaded area A_n. If we move around it one way (counterclockwise, as in the figure), we get syntropy; if we move clockwise, we get entropy. U_n is the electric potential under one of the three electrodes (n = 1, 2, 3), and e_n is the partial electric charge under the corresponding nth electrode.

But if we move around in the opposite direction (clockwise, with negative phase shifts between phases), then the energy exchange is negative: the electrons receive energy from the resonator. This energy is transformed into the chaotic movement of electrons, i.e. into heat. The first process follows the Law of Syntropy, while the second process follows the Law of Entropy. We can choose one or the other simply by choosing the sign of the phase shifts in our polyphase oscillation.

22. A polyphase variant of Maxwell's demon

As we have just described, Maxwell's demon of this kind does not start from thermodynamic equilibrium. Namely, at least a weak initial polyphase oscillation is needed, but only at the start. After that, the oscillation is continuously amplified if the system includes a positive feedback loop. This can be done by adjusting the parameters of the resonator. Then gradually, through its internal syntropic process and the positive feedback, the whole physical system departs far from thermodynamic equilibrium (TE). The internal order of this physical system (the electrons in the semiconducting sample + the electrodes + the resonator) does not feed on negative entropy from the environment. On the contrary, the polyphase oscillation is generated from the electrons' thermal energy, thus from the energy of thermodynamic chaos. So our case is quite different from Prigogine's dissipative structures.131 The whole process goes like this: electrons suck their thermal energy from the natural environment, and the syntropic process converts this thermal energy into the electric energy of the polyphase oscillation, enabling us to harvest useful electric energy from the resonator.

During the years 2010–2012, our research tandem (Gorazd Lampič and the author) calculated more than 100,000 cases of circular diagrams. All of them referred to various examples with two-phase or three-phase oscillations. Each case related to a specific set of physical parameters, so that we could gain an overall picture of these syntropic phenomena. Finally, I determined the optimal parameters and made plans for a pilot experiment. A special semiconducting chip was planned, with a semiconducting surface area of 2 mm² covered by nanoelectrodes. The experiment was designed to use the semiconducting material indium antimonide (InSb), kept at the low temperature of liquid hydrogen (–253 °C = 20 K). According to our calculations, one could expect to harvest 10⁻⁹ W (1 nW, one nanowatt) of electric power.
Fig. 25 schematically outlines the complete experimental system as it was designed in 2012.132 Although this power is quite small, it would be another proof in favour of syntropic processes in nature. A positive outcome of the experiment could confirm that there are at least two great classes of syntropy: one generated with a magnetic field (as discussed above), and another, as here, generated by polyphase potentials on the quantum level. This line of research is, in my opinion, of great value. In the following chapters I will apply this class of syntropy to certain phenomena in living organisms, and to syntropy in certain crystals. Both issues are of extreme importance.

FIG. 25. Pilot experiment on syntropy in polyphase quantum states (the complete experimental system, schematically). An array of nanoelectrodes grouped in three phases (three colours) lies on the surface of a semiconducting sample. A thin insulating layer (black) is placed between the electrodes and the semiconductor with freely moving electrons (scattered dots). The three-phase resonator is assembled from six capacitors C and three impedances L. The generated syntropic power can be measured at the three contacts a, b, c (green). Three DC voltages (see the box in the top right corner) provide for three potential steps under the three sets of electrodes, and also for the adjustment of the charge density in the semiconductor. The polyphase oscillation is initiated by an auxiliary source, active only at the start (see the box in the bottom right corner).

Most regretfully, the experiment outlined above has not been carried out, due to insufficient interest from scientific authorities (which, in practice, meant a lack of financial support). But our theoretical research continues and offers good hope. New results have already superseded our old plans. Now we are heading towards solutions even better than those just described.

23. A symphony of protein structures

During the last decades of research in biology, scientists have detected that living organisms are sensitive to characteristic frequencies of electromagnetic (EM) radiation.133,134 It was discovered that the mitosis (cell division) of different bacteria and yeast cells is significantly affected when these organisms are exposed to external EM fields. The effect is measurable at very precise resonant frequencies (usually above 30 GHz), even if the radiation intensity is extremely weak. It seems that EM oscillations are a trigger of some kind that guides internal biomolecular processes. Many of these are based on conformational changes (the rearrangement of adhesive bonds between polypeptide chains) of biologically active proteins. So there may be a kind of coupling between EM oscillations and conformational changes in protein molecules.

Let us continue the story. With modern ultra-sensitive electronic equipment we can measure a wide range of EM signals that are generated by living organisms themselves (endogenous oscillations). Characteristic frequencies range from 400 Hz up to the UV part of the spectrum. For instance, a weak emission of light emanates from practically all living organisms. This can be detected by sensitive instruments positioned in total darkness. The measured spectra are very different from the spectra characteristic of thermal radiation (Stefan-Boltzmann radiation).135 So we are dealing with a new kind of biological phenomenon.
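One way to appreciate how different these emissions are from ordinary thermal radiation (the comparison is added here and is not the author's measurement): by Wien's displacement law, a body at physiological temperature, about 310 K, has its thermal emission peak near 9 micrometres, deep in the infrared, whereas the weak visible luminescence reported from living tissue lies at wavelengths roughly twenty times shorter and is correspondingly more energetic per photon. A minimal sketch of the arithmetic:

    WIEN_B = 2.897771955e-3   # Wien displacement constant, m*K

    def thermal_peak_wavelength(temp_k: float) -> float:
        """Wavelength (m) at which black-body emission peaks, by Wien's law."""
        return WIEN_B / temp_k

    if __name__ == "__main__":
        body = thermal_peak_wavelength(310.0)          # physiological temperature
        print(f"thermal peak at 310 K : {body * 1e6:.1f} micrometres (infrared)")
        visible = 0.5e-6                               # a typical visible wavelength, 500 nm
        print(f"visible emission band : {visible * 1e6:.1f} micrometres")
        print(f"ratio of wavelengths  : {body / visible:.0f}x")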
According to the new findings, there is no doubt that all living organisms, from single-celled organisms to plants and animals (including humans), are internally permeated with EM fields of various frequencies, which can be measured by instruments in modern laboratories. Each organism, and even each living cell, is characterized by a definite spectrum of such resonant frequencies. These oscillations play an important role in biological processes.136 Today we have available a vast amount of experimental data (spectra attributed to various biological processes, etc.), and this body of experimental knowledge is continuously growing. But we still do not possess an adequate theoretical explanation of how these EM oscillations are generated and processed. Many mechanisms linking EM resonances and biomolecular changes have been proposed, but a list of open questions still remains.

Let us enumerate some of them. How can an extremely weak external EM irradiation provoke a biologically active conformational change? Do those sharp resonance peaks allude to some strange quantum effect? And how are endogenous oscillations generated? Is there some kind of signal-amplifying mechanism within living matter? Syntropic processes in protein structures could be a possible explanation; similar amplifying phenomena were described in the preceding chapter.

Still another question refers to the so-called Levinthal's paradox.137 A protein molecule is a long chain of discrete amino-acid residues, and before it assumes a biologically active form (conformation), it goes through a process called protein folding. But large protein molecules could fold into many different conformations, and some of them have approximately the same free energy.138 How can the protein chain find the "right" conformation? Computer models have confirmed that a blind search cannot be completed within several milliseconds (it would take billions of years!); therefore this process cannot be explained by classical thermodynamics. Much deeper "energy valleys" would be needed. In fact, energy exchanges may govern (at least partially) the process of protein folding, but there must also be some other self-organizing factor within living matter. This is definitely something beyond formal thermodynamics, most probably beyond the Second Law itself, something Schrödinger already alluded to in his book What is Life?

It has been found that every specific protein corresponds to a definite set (spectrum) of resonant frequencies. So there is a kind of resonance between high-frequency EM fields and acoustic modes of vibration within the atomic structure of a protein molecule. Such resonances are well known in condensed matter physics139 (the physics of crystals, etc.). Resonant frequencies range up to the Debye frequency, which usually lies between 100 GHz and 10 THz (depending on the material: the larger and "softer" the molecule, the lower the Debye frequency). Biomolecules are very large, so resonances around 100 GHz (or less) can be expected, which is exactly what has been measured in many experiments.

Each resonance is responsible for a definite type of acoustic vibration. Large molecules can host a great number of vibrational modes, each one with a characteristic resonant frequency. Specific acoustic resonances in biomolecules can trigger a conformational change, so that a protein shifts from one biologically active conformation into another, i.e. into a conformation with a different biological task within a living cell.
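The order of magnitude quoted here can be checked with a standing-wave estimate that is not in the original text: treating a globular protein of diameter L as an elastic object with a sound speed v of roughly 1,700 m/s (a typical value assumed for soft condensed matter), its fundamental acoustic mode lies near f ≈ v / (2L), which for L between 3 and 10 nanometres gives frequencies of the order of 100 GHz, consistent with the measured resonances. A minimal sketch, with both numbers to be read as assumptions:

    SOUND_SPEED = 1700.0   # m/s, assumed typical for protein-like soft matter

    def fundamental_acoustic_mode(size_m: float) -> float:
        """Rough fundamental standing-wave frequency f = v / (2 L), in Hz."""
        return SOUND_SPEED / (2.0 * size_m)

    if __name__ == "__main__":
        for size_nm in (3.0, 5.0, 10.0):
            f = fundamental_acoustic_mode(size_nm * 1e-9)
            print(f"protein size {size_nm:4.1f} nm  ->  f ~ {f / 1e9:6.0f} GHz")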
New facts in biology confirm that more complex protein molecules can assume more than one biological function, each one according to a specific molecular conformation. Changes from one function to another are governed by specific resonant frequencies. From here we can deduce that biological processes within living cells are governed by a true symphony of ultra-high-frequency "music" (both acoustic and electromagnetic). This holds for an isolated cell as well as for a multicellular organism. With so many biological functions, the body is permeated with extremely rich "music". Most probably many tunes are spread over the whole body. Long-range resonant quantum states are quite common in modern physical science, so why should they not play a role in biological processes?140 Nature has always invented ingenious pathways in order to express the joy of being.

24. Syntropy in living organisms

It is practically impossible to explain these recently discovered subtle phenomena only on the basis of biochemical reactions.141 Living organisms manifest wonderful abilities of informational coordination and self-organization, even on the level of intracellular processes. Just recall the complex "dialogue" between kinesin molecules and microtubules (see Fig. 5 in Chapter 6)!142 Syntropy brings the idea of self-organization to the quantum level of thermodynamic particles. So, could we use the ideas and conclusions of this treatise to present a possible explanation of intracellular self-organization?

The first great class of syntropy concerns the influence of magnetic fields. But geomagnetic fields are too weak, and they cannot introduce syntropic processes into our natural biosphere. So, what about the second great class of syntropy (i.e. syntropy in polyphase quantum states)? Logical inspection of the examples from Chapters 21 and 22 reveals three necessary "syntropic conditions", although there they were not stated so explicitly:

1. A polyphase oscillation (of any kind, mechanical or electromagnetic) inside a bulk material makes it possible for the arrow of time to emerge inside the specific physical system.

2. The syntropic process operates on the quantum level, on the nanoscale. Discrete phases of the oscillation act upon separate regions, i.e. each localized region with a specific group of atoms.

3. A positive feedback loop inside the system is needed, so that the polyphase oscillation is amplified and sustained at a definite amplitude despite entropic dissipation.

These same syntropic conditions can also be fulfilled within an arrangement without nanoelectrodes. Here is just one example of a hypothetical model (definitely simplified): let us imagine a crystal or some biological substance, but in any case this material should be composed of various kinds of atoms (chemical elements) arranged at various positions. To explain the principle, let us assume that there are three different kinds of atoms (A, B, C), arranged in a line in the periodic succession ABCABCABC, etc. (see Fig. 26). The length of one ABC group is one period (usually denoted by the Greek letter λ). Now let us consider a travelling wave of acoustic vibration going through this piece of material. If the acoustic wavelength matches the period λ, then the same situation is repeated within every period λ, all along the crystal.
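Before following the wave from atom to atom in words, the phase relationship can be checked with a minimal numerical sketch (the wavelength and wave speed below are illustrative only): sample a travelling wave of wavelength λ at the three sublattice positions x = 0, λ/3 and 2λ/3; the three sampled signals are identical except for successive delays of one third of a period, i.e. they form exactly the kind of three-phase set discussed in Chapter 20.

    import math

    WAVELENGTH = 1.0e-9      # lambda, one structural period (illustrative)
    SPEED = 2000.0           # wave speed in m/s (illustrative)

    def wave(x_m: float, t_s: float) -> float:
        """Travelling plane wave with wavelength WAVELENGTH moving in +x."""
        k = 2.0 * math.pi / WAVELENGTH
        omega = k * SPEED
        return math.cos(k * x_m - omega * t_s)

    if __name__ == "__main__":
        period = WAVELENGTH / SPEED
        sites = {"A": 0.0, "B": WAVELENGTH / 3.0, "C": 2.0 * WAVELENGTH / 3.0}
        for frac in (0.0, 1.0 / 3.0, 2.0 / 3.0):
            t = frac * period
            row = "  ".join(f"{name}: {wave(x, t):+.2f}" for name, x in sites.items())
            print(f"t = {frac:.2f} T   {row}")
        # The value seen at site A at t = 0 reappears at site B one third of a period
        # later, and at site C two thirds of a period later: a three-phase pattern.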
The wave passes along the three regions occupied by the three sorts of atoms A, B, C, and reaches each next atom only after a successive time delay; the same delay is repeated in every period λ. So we can see that the three distinct regions, A, B, and C, are affected by the polyphase oscillation, each region after a definite time delay. This simplified example is shown in Fig. 26.

FIG. 26. Three different types of atoms (A, B, C) are arranged in periodic succession. Three periods λ are drawn. An acoustic wave (with its wavelength matching the period λ) goes from left to right and affects each successive atom only after a certain temporal delay, which is, in this case, exactly ⅓ of the complete cycle. So the atoms A, B, C are affected by a three-phase oscillation.

Due to a hypersonic143 wave, the electric potentials in the regions of atoms A, B, and C change slightly, in tune with the polyphase oscillation. Now let us look at delocalized electrons within biomolecules.144,145 If their quantum states are spread over a considerable coherence length (at least over one period λ), then the electrons "feel" the arrow of time inscribed in the polyphase oscillation. The arrangement in Fig. 26 is similar to the syntropic system presented in Fig. 23 (Chapter 21), which also featured a travelling wave of electric potential. It turns out that we get syntropy if the hypersonic wave travels in one direction, but only entropic dissipation if the wave travels in the opposite direction. It is easy to understand this: the backward movement can be expressed by the inversion of time.

In the preceding chapter, we mentioned some results of high-frequency experiments with biological tissues. All living matter is permeated with acoustic hypersound (frequencies above 10 GHz) and with the accompanying high-frequency EM radiation. If the acoustic wavelength matches the periodicity of the crystal (or the semi-periodicity of the protein structure), then we can achieve syntropic resonance and a pronounced peak in the response. This occurs only at precisely determined frequencies. Therefore, we can expect that syntropic phenomena may be generated in ordered structures with well-determined positions of atoms. This condition is most easily fulfilled in systems with at least partial periodicity. Acoustic and EM waves may be amplified within such materials. However, this does not happen just anywhere, but only under certain conditions. Syntropic amplification must prevail over entropic damping. Damping due to thermal dissipation is present everywhere, while syntropy is usually quite weak, so only materials with a proper internal structure can host the self-organization of this same structure. It seems that this is exactly what is taking place inside living organisms.

Throughout three billion years of evolution on Earth, living organisms have been persistently improving those kinds of biological structures that can host syntropic phenomena. In return, syntropy is a "magic tool" that takes care of the internal order within living cells, and hence fosters the vibrant health of every living organism. In general, syntropy supports biological evolution towards ever better expressions of life.

There are many known examples of coherent vibrations in biological structures.
Rhodopsin (Fig. 27), a light-sensitive protein in the eye retina, hosts long-range quantum oscillations excited by individual photons.146 Important information delivered by every single photon impacts the consciousness of owls and some other species of night creatures. Quantum oscillations of a similar kind permeate the tissue of all living beings, and syntropy might very well play an important role in the amplification of those frequencies that are beneficial to the internal order and well-being of a particular subject.

FIG. 27. The active part of a rhodopsin molecule, an example of a protein. Rhodopsin is a light-sensitive protein present in the eye retina. A single photon (a quantum of EM radiation) performs a conformational change in this molecule (from a cis to a trans conformation), and this triggers a complex chain of chemical reactions, leading to activation of the visual nerve. This example is well studied, since a photon of visible light carries an energy of approximately 2 eV (two electronvolts) at a frequency of approximately 500 THz. Much less is known about those conformational changes that are triggered by lower energies and lower frequencies.

We can go even further. Today it is known that water molecules (H₂O) at room temperature stick together in considerably large clusters that are internally ordered, although macroscopically they seem to be in a liquid state.147 The origin of this order is usually explained by electrostatic forces among molecular dipoles. But new theories suggest that long-range interactions must also be taken into account. Coherent quantum states at precisely determined EM frequencies spread all over the clusters, which behave like tiny ice crystals, constantly assembling and dissolving again. A plausible explanation is that these oscillations govern the growth of ice crystals, and hence also the growth of snowflakes. Snowflakes can assume innumerable forms, which are extremely sensitive to various influences from the environment, including acoustic sounds and EM fields.148

Here we can raise a justified hypothesis. Does syntropy pilot the self-organization within crystallized clusters of water molecules? As yet we do not have a good physical model for this type of syntropy. Different models should be used for syntropy in semiconductors and for syntropy in dielectric materials such as water. However, the hypothesis is enticing. Many protein structures are covered by an adhesive water sheath in crystallized form. The hydrogen bonds of water molecules support delocalized electron states in the neighbouring proteins. So the mutual interplay of water and protein molecules supports long-range oscillations in biological materials. This is another fact in favour of syntropy in living organisms.

25. Crystals as power sources

Throughout this text we have mentioned several experiments that have generated minute amounts of syntropic power. The improved Shanghai experiment yielded 10⁻¹⁵ W (1 fW, one femtowatt) of power. My own experiment from 1987 generated 10⁻¹² W (1 pW, one picowatt) of electric power. The estimated power from the unrealized experimental set-up (see Chapter 22) would be even "greater", namely 10⁻⁹ W (1 nW, one nanowatt, according to calculations).149 And finally, it seems quite possible that we could obtain several watts (or even more?) of useful electric power from semiconducting heterostructures, as presented in Chapter 17 of this treatise.
Most of these cases belong to the first class of syntropy (syntropy in a magnetic field). But what about the second class (syntropy in polyphase quantum states)? Besides the experimental set-up referred to in Chapter 22, are there some other (potentially feasible) variants of syntropic power sources? Can we come up with brand new scientific ideas, and invent new engineering tricks, in order to construct a device capable of producing at least one watt of electric power? Over the last seven years, my intensive research on this subject has shown that lofty expectations of this kind may be feasible, although a great amount of additional work is needed before a working prototype sees the light of day. No one knows whether the goal can be reached in five or ten years, but what is sure is that it is worth striving in this direction. Here I shall briefly explain some basic ideas.

In the variants with nanoelectrodes, as presented in Chapter 22, the polyphase potentials are generated only within a thin layer (e.g. 20 nanometres) beneath the surface of the semiconducting sample. Syntropic power is produced only within this thin layer, so the power produced is extremely small. Now the question is: could we produce polyphase potentials within the whole volume of a bulk material?150 This cannot be achieved by nanoelectrodes positioned on the surface. Think a little about those electrodes. Successive phases were arranged along one dimension in space (e.g. the x-dimension), so that was a one-dimensional case. What about two-dimensional systems? Imagine a two-phase oscillation, with the first phase acting upon those atoms that are aligned along the x-axis, and with the second phase acting upon atoms aligned along the y-axis. Such an ordered alignment of atoms in discrete directions is found in crystals. We must search among anisotropic crystals, which display different physical properties along different directions.151

Electric and magnetic fields are both vector fields, with three discrete components along the x, y, and z directions in space. We can generate an electric field with only two components (x and y). Let it be a two-phase oscillating field, so that the x-component belongs to the first phase of the oscillation and the y-component to the second phase. Then let the phase shift between the two oscillations be exactly one quarter of a full cycle. In this arrangement we get a homogeneous rotating field (electric or magnetic). So the idea is that we put our anisotropic crystal into a rotating field. One possible arrangement can be seen in Fig. 28.

FIG. 28. An anisotropic crystal (grey) inside a rotating electric field. This field is produced by two pairs of electrodes (black), connected to a source of two-phase oscillation. If the phase shift between them is ¼ of a full cycle, we get a rotating electric field inside the crystal (the arrowed circle). The syntropic behaviour of electrons inside a properly chosen crystal may amplify this oscillation. In such a case, we pump energy from the crystal.

Syntropy theory imposes several additional requirements on crystal symmetry. Not just any anisotropic crystal can serve our purpose. Crystals are, according to the internal symmetry of their atomic structure, classified into seven crystal families. It turns out that only crystals from two families (monoclinic and triclinic) can fulfil our requirements.
Several crystals from other families can be added to this list if the crystal is under the influence of an additional strong DC field (electric or magnetic) along a fixed direction, or under very strong pressure in a fixed direction. But this is already a technical problem, and not easy to fulfil in practice.

Suppose that we put the crystal into a rotating electric field (as in Fig. 28). It should be a dielectric crystal that does not conduct electric current. Electrons are trapped in discrete cells of the crystal lattice. But since the crystal structure is anisotropic, the electrons behave differently under the influence of the x-component of the rotating electric field than under the influence of the y-component. So electron quantum states are pierced by the arrow of time hidden in the electric two-phase potential, and then we can continue with the general theory of syntropic processes.

We can use computing methods similar to those explained in Chapter 21. We use a certain mathematical tool denoted by ∮ (a line integral over a closed curve, also called a cyclic integral or circulation integral). In this way, we can explore and determine circular diagrams, which describe a physical process during one cycle of oscillation. The value of the circulation represents the energy transfer from a discrete electron to the ambient electric potential of the electrodes. This computation is mathematically more complicated than the computation of the circular diagrams referred to in Chapter 21, because electron quantum states in dielectric crystals are more structured than the quantum states of free electrons in semiconductors. Moreover, it seems that at least two sets of quantum states (the ground state and the first excited state) should be calculated. When computing the circulation, the thermal transitions between ground states and excited states must be taken into account. The respective calculus is quite complicated, but can still be managed.

Certain properties of dielectric crystals are represented by the so-called dielectric tensor of the given crystal.152 Let us look at two off-diagonal elements of this tensor, ε_13 and ε_31. Long ago it was deduced from the Second Law that dielectric tensors are symmetrical, which means that ε_13 and ε_31 always have the same value.153 It is difficult to measure these two off-diagonal elements, so serious attention was not devoted to this question. But new computations (those outlined here) for certain crystals from the monoclinic and triclinic families reveal that ε_13 and ε_31 are not quite exactly equal. The difference between them is responsible for syntropic phenomena and is distinctly dependent on temperature.

The framework of the numerical model has been prepared, so we already have a rough idea about syntropy in dielectric crystals. Various parameters in the model are related to the structural properties of the chosen crystal. With proper adjustment of these parameters, we could obtain considerable energy conversion. Amazingly, it turns out that crystals with large organic molecules should serve best. So here again, physics meets biology.

If the electric field is rotating in one direction (e.g. clockwise), then the energy transfer is positive: electrons yield their thermal energy to the electrodes and to the attached polyphase resonator. This is a syntropic process.
But if the external field is rotating in the opposite direction (counterclockwise), just the inverse happens: the energy transfer is negative and we get entropic energy dissipation.

The syntropic process takes place throughout the whole volume of our crystal. It is not impossible that quite useful amounts of electric power could be extracted from the crystal (more than one watt per cubic centimetre of crystal volume). So this is another promising type of syntropic generator, in addition to the one already mentioned in Chapter 17.154

PART FIVE: SOCIAL AND PHILOSOPHICAL IMPLICATIONS

26. The question of responsibility

So far we have encountered many plausible arguments for the presence of syntropic processes in nature. The human perspectives based on the assumption that such processes exist are so far-reaching that we must stop here for a while and ponder the consequences.

In this treatise, I pursue two parallel paths, that of physics and that of philosophy. Both of them should be intertwined into a single vivid current of creative thinking.155 Whoever follows only the path of physics meets with a growing burden156 of scientific achievements, and he or she does not know where this burden should be deposited in order not to cause any harm. Namely, in our modern times we must be aware that science and technology are making rapid strides, and their influence on our lives is also enormous. So this is a question of scientists' responsibility to humankind and to the whole of nature in general.157

On the other hand, whoever follows only the path of philosophy encounters the phenomenon of "thin air". In the open space of the mind, one can prepare wonderful solutions to human problems, one can even erect a New Jerusalem; but does all this suffice to remove the fetters on the material level of our existence? There is too much suffering in this world, and in order to help other sentient beings find a way out of this universal suffering, the spiritual space of compassion should not disregard the material conditions of our life. This is a simple question of sincerity. We are not paving a comfortable way for those who are well protected and want to have even more than they need (in fact, that would pave a communal path to the grave), but for any one of us, breathing the same air, polluted or unpolluted.

The idea of syntropy brings us to the core of human existence, to the "sacred place" where the material and spiritual worlds meet. Namely, syntropy is a vivid link between the realm of divine order (formerly attributed only to Heaven) and the material world. Today the times are a-changing, the old dogma of the hierarchical split between "above" and "below" is gone, and we could even say that Heaven has met Earth. A scientist working in this field of research bears the role of ancient Orpheus, who was able to convey the music of Heaven down to Earth. And most surely, this is anything but simple and easy. A scientist must listen to the silent whispers of Mother Nature, and should learn from her willingness to cooperate, but just as well from her resistance.

So, how to carry on between the open blue sky and the muddy soil? At the dawn of his fame, the then 19-year-old Russian poet Sergey Yesenin met another great poet, Alexander Blok, and asked him for advice. He got the following answer (1915): "Sooner or later, one must take full responsibility for every step in our life. But it is difficult to plod on these days, and to plod on in literature is getting even more difficult day by day.
I tell you this from my heart, since I know how difficult it is to proceed, so that someone is not blown away by the wind or suffocated in mud."158

In these days of growing natural troubles and social unrest, it is so important to retain this broad poetic perspective. Natural scientists should always maintain this subtle balance between physics and philosophy.159 In these years of global confusion, under the dark threatening sky of profound changes, where lies the place for an ethically pure fountain of faith in our scientific investigations? How can we proceed when we are dying of thirst?

But still, what is to be praised is our boundless wonder at natural beauty. We are like children playing by the banks of a running mountain stream. The murmuring ripples are covered by innumerable facets of silver light. It is this innocent primal wonder that makes us truly human. Scientists are engaged in a continuous search for so-called "scientific truth", although we know that it can never be the absolute truth. The Universe is so rich, with an infinite variety of forms and multiple expressions of life. And here breathes our tiny, vulnerable drop of cosmic awareness. Let it radiate this cosmic light; let us protect it! Let us be sensible when we examine how scientific achievements resound in our human condition.160

Besides its profound spiritual meaning (see Chapter 31), the science of syntropy provides us with a handful of practical gifts. It offers a better understanding of subtle phenomena in living organisms, hence also in our own bodies, and thus it even supports certain complementary methods of medical treatment. And secondly, there is another gift that is revealing itself, at this critical moment of human evolution, which is even more important in everyday practice: the science of syntropy sheds light on new methods of energy conversion. The far-reaching meaning of this latter point will be explained in Chapters 28 and 29. So I do hope that it is worth continuing with this story.

27. A way out of chaos

One might wonder why, until recent years, the idea of syntropy has not been taken seriously by the mainstream scientific community. From my own life experience, I dare to posit four main reasons. Amazingly, they have nothing to do with the rules of scientific methodology.

1. The Second Law is firmly rooted in the structure of theoretical physics. It is usually still deemed that the Law of Entropy permits no exceptions. Only careful inspection reveals that this law is without adequate theoretical and experimental support; it was a child of the zeitgeist of the 19th century. So it turned out that the Second Law was assigned the status of scientific dogma. History tells us that only the bravest people are willing to challenge firmly rooted dogmas of any kind. One needs to "step out of the box": one's mental box and the box of social safety, both at once.

2. The Second Law is in tune with some deeply embedded religious dogmas of Western civilization. Many religiously devoted scientists (even the most prominent ones) cannot abandon this law because it offers an illusion of "scientific proof" of the existence of a higher being. This is an over-simplified analogy between spirit and matter. Moreover, the idea of syntropy introduces a different relationship between the animate and inanimate worlds: there is no sharp dividing line between the two.
But this is not in accordance with the Western division into living and non-living matter (Descartes' schism between res cogitans and res extensa).

3. Another reason relates to the personal reactions of established scientific authorities. Syntropy is a new field of research, so the "authorities" cannot claim this new ground as part of their own "feudal territory". They are more willing to support those young people who tend to continue their mentors' own line of research, since in this manner the mentors can profit from their students' contributions. Alas, vain jealousy also plays an important role here. Most regretfully, such selfish reactions appear quite often, and sometimes they even escalate into violent opposition to new ideas.

4. Today, scientific research is increasingly prone to follow the rules of marketing, because it is too heavily enslaved by the chains of almighty capital. Appearance is becoming more important than content, so project evaluations are becoming increasingly superficial. New light cannot penetrate through the thick layers of bureaucracy. Diffuse criteria prevail (e.g. research projects in line with considerable capital interests often have priority over promising ones with a modest budget). We are no longer living in an age of "scientific romanticism", yet we need the free rein of our imagination when introducing new scientific ideas.

I think the hindering factor is a complex combination of these reasons, which is why visions beyond the Second Law have been so commonly, and too easily, refuted. A reaction of this kind went against the results of the Shanghai experiment (Chapter 13), and against the facts from my own work (Chapters 14 and 22), to mention only a few examples. But fortunately the paradigm shift that we are experiencing today is more favourable to the emerging idea of syntropy.

In this modern age, no one can deny that the accelerating effects of human activity are becoming increasingly adverse and ambiguous. The net result is that chaos is rising rapidly on so many levels (environmental chaos, social chaos, spiritual darkness, etc.). So we could say that the modern age is becoming increasingly entropic. Let us remember: total confusion is a state with maximum entropy, much like a physical system in thermodynamic equilibrium. Are we still far away, or are we already quite near that singular point of maximum entropy?161

Knowledge of syntropy can help us rise from a state of maximum entropy. I wonder if it was necessary to become so deeply immersed in our modern confusion before we were ready to open our eyes. Now the hidden message of our new understanding is being revealed in many parts of the world. But was it necessary to wait so long?

28. The balance between entropy and syntropy

On several occasions in this treatise we have encountered a physical structure functioning as Maxwell's demon. I have described three experimentally verified examples (Chapters 13–16), and also other types of syntropic systems that have been tested by extensive computer simulations (Chapters 21–22). Furthermore, I have even disclosed two feasible variants that could someday provide us with usable amounts of electric power (Chapters 17 and 25). Let us be clear: we are speaking about an energy source that converts the thermal energy of our natural environment into electric energy.
Electrons yield their own thermal energy to external electrodes, so the electrons get cooler, and with them the whole crystal (semiconductor or dielectric) also cools down, considerably below the ambient temperature. In brief, the crystal sucks up ambient heat and transforms it into electric energy. Until now, no device of this kind has been known or even acknowledged. Its possibility has been ignored because the Second Law (the Law of Entropy) forbids it.162 According to this law, we can produce electric power only if we burn some kind of fuel to raise the temperature above the natural temperature of the environment.163 But knowledge from the last few decades has given us new hope. Let me repeat: in the universal sense the Second Law has never been proven, theoretically or experimentally. It is the result of the mid-19th-century zeitgeist. But throughout this treatise we have adduced arguments (based on new scientific discoveries) that support the idea of syntropy. Processes beyond the Second Law do exist in nature.

Our novel energy source enables something that hitherto has not been possible: the recycling of energy. The explanation of this concept is straightforward. The classical science of thermodynamics claims that various forms of energy are characterized by a different weight of entropy (specific entropy, the entropy attached to each unit of energy). Electric or mechanical energy has the lowest specific entropy, while the natural heat of the environment has the highest specific entropy. Here, let us remember that the Second Law allows only those natural processes that result in an increase in entropy. So, according to this disputable law, electric energy can be converted into ambient heat, but the inverse is not possible (Fig. 29).

FIG. 29. Classification of energy types according to their specific entropy.

I do not claim that entropic processes do not exist in nature. Surely they exist; we can see them just about everywhere. Spring water is dirtied by mud, a rose blossom withers, a dead body decays: all these are entropic phenomena. But I do claim that syntropic processes also exist in nature. They are not so easily perceivable, since here on Earth they exist on the level of microscopic biomolecules, most probably also on the level of so-called "etheric matter",164,165 and on the level of novel devices that are yet to be realized. Entropic and syntropic processes reside side by side (as seen in Fig. 30); both of them are in mutual balance. Perfect balance is not necessarily perceived on the local level. It seems that here on Earth entropic processes prevail over syntropic ones because we can afford it to a certain extent, owing to the beneficial inflow of low-entropy radiation from the Sun.166 But as was shown in the opening chapters of this treatise, in our time the safe limit has already been breached, and we cannot survive without the manifested presence of syntropy. Human civilization cannot continue without it.167

If there were only entropic processes, the material world would, sooner or later, be engulfed in increasing chaos (see Chapter 2). In the end, the whole Universe would die a so-called thermal death. According to Eddington, the substance of time would disappear, and any process (including life) would disappear as well.
On the other hand, if there were only syntropic processes, in the end they would lead to perfect, completely static order, so time would disappear in this case as well. But if both principles are in balance, they bestow upon us the entire wealth of living phenomena that we are able to see and enjoy.

FIG. 30. The right side of the diagram presents the usual flow of entropic processes, which transform electric energy (E) into the thermal energy (T) of the natural environment. The left side shows the other half, which until now has not been accepted. Syntropic processes transform ambient heat back into electric energy. In this manner the energy circle is closed; energy can be recycled.

29. Vyakti

Let us return to our syntropic energy generator. My vision is that quite soon, perhaps in ten years, humanity will make good use of freely accessible energy produced in this way. I shall speak about it as if this device were already here. This is my dream,168 but it is very real, and supported by scientific confirmations over the last few decades.

Outside on the windowsill, or on the balcony, or somewhere in the garden, my family has installed a metal box (the size of a large shoebox) incorporating our syntropic energy generator. Let us call it the vyakti.169 A fan inside the box forces the air to circulate through it. The air that comes out of the box is cooler than the air at the entrance, because a certain amount of thermal energy is converted into electric energy. There is a cable leading from the box into our flat, so that we can use the electric power for our everyday needs. All the conventional devices in the flat (lamps, computers, boilers, heaters, etc.) dissipate heat, and some of this heat finally returns to the vyakti, which sucks it up and converts it again into useful energy.

According to the Law of Energy Conservation, the dissipated heat in the room is in perfect balance with the production of "cold" on the balcony. The temperature difference is felt only within a short distance, between the room and the balcony. On a larger scale, the vyakti does not leave any imprint or influence on the environment. In the ecological sense, the vyakti is a totally pure energy source, because it takes energy from the ambient heat, which is exactly where every form of energy finally returns. In this way, energy is continuously recycled (see the closed circle in Fig. 30). Ambient heat with the greatest specific entropy (at the bottom of the energy conversion chain) is transformed into electric energy with zero entropy (at the top of this chain). An entity that was "the lowest" is transformed into "the highest": is this not the holy grail of modern times?

We are living in the final years of relentless economic growth. The main problem is that our economic system (capitalism in this final, neo-liberal phase) is based on an endless craving for financial profits, not on our communal welfare.170 Climate and biodiversity catastrophes are at our doorstep. Many other mutually interconnected natural conditions for our life are totally out of balance as well.171 We are on the brink of destroying our whole biosphere, of destroying the life of all successive generations. Now only a few years of dire reflection remain at our disposal, and during this precious time we must eagerly look for new energy sources that are not based on fossil fuels. Many renewable sources (wind energy, solar energy, etc.) are now in rapid development. However, it seems that most countries in the world are not adequately endowed with truly clean renewables.
We are living in the final years of relentless economic growth. The main problem is that our economic system (capitalism in this final, neo-liberal phase) is based on an endless craving for financial profits, not on our communal welfare.170 Climate and biodiversity catastrophes are at our doorstep. Many other mutually interconnected natural conditions for our life are totally out of balance as well.171 We are on the brink of destroying our whole biosphere, of destroying the life of all successive generations. Now only a few years of dire reflection remain at our disposal, and during this precious time we must eagerly look for new energy sources that are not based on fossil fuels. Many renewable sources (wind energy, solar energy, etc.) are now in rapid development. However, it seems that most countries in the world are not adequately endowed with truly clean renewables. On the other hand, the development of nuclear fusion reactors (like the prototype ITER reactor near Marseille, France)172 continues, but again, at least two or three additional decades may pass before they start functioning – and that will be too late.

Another way to "solve" the problem of global heating is through geoengineering, i.e. large-scale intervention in the Earth's climate system. One method is the artificial removal of CO2 from the atmosphere, and another is solar radiation management (SRM). But I have serious doubts regarding any kind of geoengineering. New scientific data (also disclosed by various UN panels) reveal that the problem is much larger than just global heating. The harmful impact on the biosphere is due to our ruthless expansion, our excessive and harmful activities, and our material greed on every level. We cannot solve this delicate problem with new global interventions (such as geoengineering) that are, again, mainly in the interest of capital.173 Mother Nature is seriously damaged, and it is high time to let her recover, mainly through her own wisdom. For instance, we should stop global deforestation this very moment. Forests cannot be replaced by artificial CO2-removal systems, and likewise the spraying of aerosols into the atmosphere cannot bring the climate into the same beneficial balance as the cooperative natural processes provided during the Holocene.

Many are striving to restore the balance of the natural processes on this vulnerable Earth. It is time to put the Law of Entropy in balance with the Law of Syntropy. That is why I am firmly convinced that syntropic energy generators are the best long-term solution to our energy needs. They would have no impact on our ecosystems, provided the infrastructure is not centralized. Every household could be fitted with its own vyakti, so that the heat currents are weak and the temperature discrepancies negligible. The largest generators (e.g. those that supply electric power for the massive production of hydrogen by electrolysis) should be installed near rivers or by the sea, so that they take thermal energy from the water and turn it into usable electric energy. In such cases, too, a practically negligible impact on the ambient temperature is easily achievable.

A vyakti could also be very useful as a cooling device. In hot parts of the world we could use one instead of an air conditioner. A vyakti could be installed inside the house to cool the interior, while the electric power generated is sent to the grid for some other use – maybe to run the neighbour's bakery. Again, the temperature differences are balanced within a short range and there is no noxious impact on the environment.

30. A letter to the people

Although syntropic phenomena were introduced into the natural sciences quite recently, their existence is now supported by a growing body of experimental confirmation. This gives us new hope. The naughty boy Entropy has finally found his lost twin sister, Syntropy. Together, they will cause no more problems because they are in complete balance. Relentless entropic degradation can be neutralized and transcended. Universal Creation lives and functions quite differently from the way that has been believed over the last few centuries. Mother Nature herself is endowed with a wonderful inherent ability of self-organization. The miracle of life can be sought on every scale of the material world, even within a minute speck, perhaps the size of a molecule.
In fact, there is no clear boundary between animate and inanimate levels of reality. The material world is permeated by the syntropic principle, so we can say that everything is alive. Nature's Creation is boundlessly precious, even miraculous.

A few centuries ago, natural science had to start from easily understandable origins, referring to the simplest examples (e.g. the elliptical movement of the Earth around the Sun). But now we have developed theoretical, numerical, and experimental tools to study much more elaborate physical systems. Such systems are interlaced with many different levels of complexity. So we are beginning to shed new light on the world we live in. We are committed to respecting Mother Nature's hidden languages and her subtle instruments of manifestation. Syntropy is one of these subtle mechanisms, or even natural languages. Every tiny dot in the Universe is breathing in and out its own life. In the next chapter, I will try to explain this principle with the help of the syntropic perception of time.

What does it tell us? We do not need to enslave and exhaust our environment, and we can still live quite happily, even much better. We do not need to compete with Mother Nature, since she is very cooperative as soon as we understand her ways. She is our companion. As soon as we get in tune, Mother Nature is willing to provide us with so many gifts simply on her own.

Remember John Lennon and his songs "Imagine" and "Power to the People"? Yes, syntropy can literally provide us with practically inexhaustible electric power. But now it is time to accept the condition that comes with being human: that we must be responsible enough for this gift of power. In the history of civilization, power of any kind has too often been misused for the sake of supremacy. For instance, we know all too well what the first use of nuclear power was. The aggressive dualistic split into contradictions has done so much harm to humans and nature. That is an ever-present warning, and hence the reason why I wrote this chapter as a "letter to the people".

Many a modern human being feels enchanted by our extremely rapid scientific and technological progress. He or she may even believe that our technology will solve all human problems. Here I dare say that this belief is a complete illusion. Such things as autonomous vehicles, artificial intelligence, and bioengineering may arrange a certain number of small pieces into a practical position, and they may be useful within their own limits, but we cannot model our vast reality on particulars. A society that relies excessively upon technology is bound to be transformed into a network of unresponsive automatons – incapable of genuine human feelings, and even unaware of the terrible mistakes produced by their "unreflective" activity. In this way we cannot expect to realize our noble human nature. The genuine nature of the human soul cannot be expressed by algorithms, and our living perception of reality far transcends anything that technology can offer. Likewise, the vyakti in its material form is only a technical device meant to bestow upon us some benefits within its own realm (e.g. a long-term solution to the problem of global heating).
But this can be beneficial only on the condition that any harvesting of the material benefits it delivers is accompanied by a deep human understanding of our new situation.174 The scientific introduction of syntropy (together with all its implications) into our global worldview is much more important than the technological spin-offs of this same discovery. We need to understand what the very existence of syntropy means in a philosophical, psychological, and sociological sense. We need a radical shift in our individual and global consciousness, or the recent discovery of syntropy might lead to even greater chaos in our world (material and social chaos). We have enough of it already. I do hope that we are willing to step out of the old mental box.

I am not alone in this faith: numerous, rapidly growing social movements and networks all around the world uphold the same social and spiritual values that I advocate in this treatise. I especially feel proud of those young people (some of them very young indeed, like Greta) who are not indoctrinated by certain obsolete, rotten beliefs, and are thus able to perceive the pure suchness of the present global situation.

In the opening chapters of this treatise we mentioned the emerging dark clouds of our times. They remind us of the inevitable conclusion that a new form of social integrity must be implemented as soon as possible. Those who have social power are still misusing it for the sake of their own supremacy. The current neo-liberal politics still follow the doctrine of social Darwinism ("survival of the fittest"), although the old rules of supremacy and exploitation threaten a global catastrophe.

Here the idea of syntropy can offer new insight. Our Universe is permeated with cosmic life energy in a myriad of visible manifestations, all of them in mutual cooperation. By means of syntropy, "the lowest" can be quite softly transformed into "the highest" (see the preceding chapter). The rigid structures of violence are melting down. We can speak about a great paradigm shift, and the shift in the natural sciences runs parallel to the shift in human society. This is exactly the same message that we can hear from so many new social movements in our globalized world. A growing percentage of people all around the world are demanding honest social and economic equality, real democracy, compassion for all those who suffer, and full ecological responsibility. The idea of syntropy can provide additional impetus to this newly emerging awareness: further physical support for our dreams. The vyakti transforms thermodynamic chaos into valuable energy, darkness into light. In fact, the vyakti is born out of the darkness of our modern world. The great musician Keith Jarrett wrote:175 "I now see that the dark can be looked at two ways: forbidding or enticing. I chose to be enticed and was finally allowed to turn on a light in there. Now the darkness is that much smaller, and my faith is greater that what is found in the darkness is not destructive, but Creative."

31. The syntropic perception of time

Arthur Eddington, an astronomer and influential advocate of the Second Law (see Chapters 8 and 28), declared that this law provides an origin to the arrow of time, which is always entropic, pointing towards greater disorder. But is it so simple? What gives direction to the flow of time? This question has greatly intrigued theoretical physicists.
Namely, all fundamental physical laws are totally symmetrical in relation to the temporal direction.176 They do not give priority to any definite direction. Then how does the arrow of time get into our world? We can find innumerable explanations of this problem in the scientific literature.177 178 Practically all of them are based on a statistical interpretation of temporal irreversibility (Boltzmann's approach), but they hardly get any further than that. A critical analysis shows that such an interpretation of irreversibility rests on the temporal asymmetry of the initial conditions, so the asymmetry is already built into the argument.179 In fact, this is Isaac Newton's perception of the Universe: at the beginning of Time, the Creator created the initial order of the Universe and then set it in motion, and now it is functioning according to physical laws. But I believe that we can now transcend the artificial split between spirit and matter.

An adequate answer to the dilemma of the time arrow is, in my opinion, much simpler than one might expect – but until now the answer has not been sought in the right place. The preferred direction of time is manifested as soon as the observer's consciousness is included in our physical model of the world.180 Let me explain this idea. We know (and every child knows it even better) that on the level of consciousness, our innermost perception of time is related to the bliss of pure being.181 This is a worldview that has nothing to do with final destruction and chaos. All living beings grow and breathe in the collective flow of time, in the all-embracing universal symphony of dancing interconnectedness. Time is an important natural parameter in this symphony, just as it is in every piece of music. Here, we are introducing something quite opposite to Eddington's entropic perception of time, so we shall call it the syntropic perception of time.

How can we reconcile both perceptions? The material and spiritual levels of reality have been separated for too long, and this has caused much damage during human history. It is time to surmount the apparent dichotomy. So let us ask: is our paradigm shift in tune with the syntropic perception of time?

Knowledge about syntropy is liberating, since we are no longer trapped in the cage of the entropic perception of reality. We do not need to rely on exhaustible low-entropy sources (material or psychological), since ever-new forms of order are at any time and everywhere simultaneously created by syntropic processes in nature. There is an internal balance between entropy and syntropy. On the level of subquantum reality, one cannot declare whether the world is entropic or syntropic – it is neither one nor the other. So we are still completely free to choose between both time directions. We are free to enjoy a new kind of spiritual freedom that has never been experienced before, since until now (especially in Western society) we have been bound to this or that theory about some unique direction of time. And from the standpoint of this new mystical insight, we can finally perceive the answer to the old dilemma regarding the arrow of time. While we calmly abide in this serene freedom, an apparently shocking truth silently reveals itself: the free choice regarding the direction of time is a great illusion. Namely, we cannot exclude the observer's consciousness.
My consciousness and your consciousness, just like the consciousness of every sentient being, are linked to cognition, and proceed along an informational increase (an increase in syntropy, a decrease in entropy). A being that decides to exist in the opposite (retrograde) direction of time cannot be in tune with the universal cognition of our common world, so for us this being would drop out of our experienced reality. Amazingly, this principle is true not only for living beings but for every physical entity interacting with the rest of the world.182 Our experienced direction of time is continuously getting in tune with the future, and so it survives, while the retrograde direction disconnects itself from reality and vanishes. Through this subtle rule, our unique direction of time enters into our experienced (physical) world together with the eternal phenomenon of consciousness. This has nothing to do with classical thermodynamics.

Maybe some additional explanation is needed here. When a sentient being experiences a moment of awareness, this "moment" is not an isolated and dimensionless point, but rather has an open duration in time (Bergson's term). In terms of quantum physics, this duration is described mathematically as informational interconnectedness in space and time (quantum coherence or Bohm's holomovement). The quantum nature of duration engenders a new quality: the "abiding calm" along both directions of time. A quantum entity is not tied to any direction of time. It is completely free to decide, autonomously, on its own direction of the time arrow. As beings breathe and vibrate together in the Universe, from their love for this collective existence (dependent arising, Skt. pratītyasamutpāda),183 all of them decide to harmonize their cognition, to get in tune with each other. So they do decide to live in the same direction of time. If this were not so, there would not be any interaction among sentient beings, or among quantum entities either, and each one of them would exist in isolation in the Universe. Sentient beings are tuned in to our common direction of time because it is beneficial to all of them. Beings accept the common time as the prime binding agent of their innermost physical existence.

In Chapter 24 we devoted attention to the syntropic vibration of living organisms in space and time. That common symphony pervades all beings in the Universe, and is based on the mutually agreed direction of time. The evolution acting inside diverse beings is mutually supportive. Living beings are tuned together in their collective direction of time. The subtle awareness of this agreed direction is the deepest foundation of love among sentient beings. So here is the prime origin of love, and here is its prime purpose. This novel explanation has a precise significance in the realm of theoretical physics, but it nevertheless has a deeper spiritual meaning too. Love is not some isolated idea in the realm of human emotions, and it is not some mysterious mystical power; it is rather a clearly determined quality in relation to the temporal structure of our Universe, a quality that is most easily found when manifested forms exhibit elaborate structural complexity in space and time.184 Poets often say that love is the perpetual unveiling of life. But in the experience of love, everyone's existence is faced with an ever-new understanding of time.
Therefore, as a certain philosopher says, love is continuously inventing a new kind of duration in life: "Happiness in love is the proof that time can accommodate eternity."185

Acknowledgments

I feel very grateful to the many dear people who have, over the last half-century, sincerely supported my investigation of syntropy. It is utterly impossible to give a full list of them here, so I shall mention only those individuals who have most directly contributed to my own results presented in this treatise.

Computer modelling of the syntropic system with the magnetic field of a straight-line current (Chapter 16) was carried out by Milan Hodošček. With his help, I was also able to develop a numerical model of the experiment with a chiral magnetic field (Chapter 14), so that the optimal physical parameters were determined in advance. That experiment was performed in the laboratories of the company Iskra Elektrooptika (present-day Fotona), where Jože Petkovšek (head of development) and the late Feri Bogataj provided their utmost support. Many other people in that high-tech company and in some other institutions contributed to the experiment. I must especially mention the late Evgen Kansky, who prepared the vacuum tube with caesium, and Miran Zgonik, the head of development at the company TGN (thermal devices).

The path to the confirmation of syntropy in semiconducting materials (Chapter 17) is another story. It has not yet produced conclusive results and is still in progress. Here, too, the first step in our research was the development of an appropriate numerical model. In this regard, Gorazd Lampič, the CEO of the company Elaphe Propulsion Technologies, was of indispensable help and support. The first semiconducting samples were prepared by Urban Medič, and subsequently Janko Kobal designed and fabricated an improved batch of samples. The physical steps of this process were carried out at the laboratories of CNR-IOM, located in Basovizza (Italy), where Giorgio Biasiol and Simone Dal Zilio offered their time and support. Up to the present, this work has been realized under grants for two NFFA projects, ID-430 and ID-577.

My scientific knowledge and understanding of subtle processes within living organisms (Chapters 23 and 24) have been sharpened and have evolved through innumerable discussions with Igor Jerman, the head of the Bion Institute. Together we have been looking for bridges between many observable biological processes and physical models of syntropic phenomena. Free-flowing discussions of this kind were also held with Mitja Peruš, quite often together with the late Andrej O. Župančič or Karl Pribram. Insight into their quantum models of cognitive processes follows the same line of thought as my own research on syntropy.

The extensive numerical computation of circulations (outlined in Chapters 21 and 22) was carried out in a firm partnership with Gorazd Lampič. This work lasted several years, day after day. Our cooperation was an extremely fulfilling experience, also because it was crowned by the interesting results hatched every day by our inventive mathematical tools.

I also give special thanks to my many dear friends who have inspired me to write this treatise. They have supported the emerging idea of syntropy from the very beginning, and many of them have read parts of this text while it was still in the germination phase and advised me on how to improve it.
The incomplete list includes (without those already mentioned above, in alphabetical order): Bojan Brecelj, Violeta Bulc, Stan Coenders, Tamara Ditrich, Biljana Dušić, Howie Firth, Marjeta Godler, Karel Gržan, Darko Korbar, Barbara Korun, Manca Košir, Andrej Lukšič, Jaka Modic, Vida Mokrin Pauer, Saša Pavček, Dušan Plut, Maja Remškar, Janko Rožič, Andrej Ule, Joško Valentinčič, Nastja Vidmar, Rok Zavrtanik, Ira Zorko, and Tomaž Zwitter.

The drawings were prepared by Jaka Modic (Figs. 18b, 23, 26), by Svit Merc (Figs. 20ab), and by the author of this treatise (Figs. 12 through 17, 19ab, 24, 25, 28, 29, 30); some were copied from the cited papers (Figs. 9, 10, 11, 18a), and the rest were taken from Wikipedia. In addition, I owe my thanks to Jaka for the final processing of the drawings in this book.

Finally, Tibor Hrs Pandur, with whom I also exchange research on the holistic legacy of Nikola Tesla, did great work reading the manuscript: he went through it in great detail and improved the text with many valuable suggestions. I would also like to thank Dean J. DeVos for his invaluable help with the English manuscript; our discussions were fruitful and enjoyable.

Endnotes

1 A continuous process with a respectable history. For instance, see: Fritjof Capra, The Turning Point (Simon and Schuster, 1982)
2 Dušan Plut, Ekosistemska družbena ureditev (Ecosystematic Social Order; a work in two large volumes, with an extensive summary in English; Publ. University of Ljubljana, 2022)
3 https://ipbes.net/news/Media-Release-Global-Assessment (retrieved 22 February 2020)
4 Noam Chomsky, Manufacturing Consent (Pantheon Books, 1988), Necessary Illusions: Thought control in democratic societies (Pluto Press, 1993), The Conquest Continues (Publ. Verso, 1993), Powers and Prospects: Reflections on human nature and the social order (Pluto Press, 1996).
5 Slavoj Žižek, The Plague of Fantasies (Publ. Verso, 1997); The Courage of Hopelessness: Chronicles of a Year of Acting Dangerously (Penguin Books, 2017).
6 Mattias Desmet, The Psychology of Totalitarianism (Chelsea Green Publ., 2022)
7 Vimala Thakar, Totality in Essence (Motilal Banarsidass, Delhi, 1971)
8 A poetic description of this global confusion can be found in my essay Brave New World. See www.andrejdetela.si/umetnost/brave-new-world/
9 Niall Ferguson, Civilization: The six killer apps of Western power (Penguin Books, 2012)
10 David Bohm, On Creativity (Routledge, 1998)
11 Charles Darwin, On the Origin of Species (Publ. John Murray, London, 1859). The complete title is On the Origin of Species by Means of Natural Selection, or the Preservation of Favoured Races in the Struggle for Life.
12 Herbert Spencer, Principles of Biology (London, 1864, 1867, revised and enlarged 1898).
13 Alan Lightman, Great Ideas in Physics (McGraw-Hill, 2000), Chapter 2: The Second Law of Thermodynamics.
14 Claude Shannon (1948), A Mathematical Theory of Communication, Bell System Technical Journal 27 (3): 379-423, 623-656; Claude Shannon and Warren Weaver: The Mathematical Theory of Communication (University of Illinois Press, Urbana, Illinois, 1949)
15 We shall expound on the relation between experienced time and physical time at the end of this treatise. We shall see that it has much to do with the profound meaning of syntropy. The concept of syntropy cannot be understood without taking a fresh look at the phenomenon of experienced temporal asymmetry.
16 Sadi Carnot, Réflexions sur la Puissance Motrice du Feu (Paris, 1824; transl. Reflections on the Motive Power of Fire, Dover Publ., 1960, 2005)
17 W. Benenson et al., Handbook of Physics (Springer-Verlag, 2000), 20.5.3
18 James C. Maxwell, Theory of Heat (1871); (reprint: Longmans Publ., 1908): 308-309
19 Alan Lightman, Great Ideas in Physics (McGraw-Hill, 2000), Chapter 2, pp. 64-82
20 Colin Campbell, The Easternization of the West: A Thematic Account of Cultural Change in the Modern Era (Paradigm Publishers, 2007)
21 Remarkable events took place in Paris (May 1968), and all over the Western world as well.
22 Meadows D. H., Meadows D. L., Randers J., Behrens W.W., The Limits to Growth (Universe Books, New York, 1972). Accessible at: http://www.donellameadows.org/wp-content/userfiles/Limits-to-Growth-digital-scan-version.pdf (retrieved 22 February 2020). In 1972, I read the essential results published in the French magazine Science et vie.
23 https://en.wikipedia.org/wiki/Planetary_boundaries
24 https://en.wikipedia.org/wiki/Climate_migration
25 According to the recent analysis of the Intergovernmental Panel on Climate Change (published in the IPCC 6th Assessment Report, March 2023): "It is only possible to avoid warming of 1.5 °C or 2 °C if massive and immediate cuts in greenhouse gas emissions are made." In order to avoid a global climatic collapse, CO2 emissions should decline by 50% in the coming years, and reach net zero by around 2050.
26 Hickel J. and Kallis G., Is Green Growth Possible? (included in the book New Political Economy, Routledge, 2019). Accessible at: https://doi.org/10.1080/13563467.2019.1598964 (retrieved 22 February 2020).
27 This same viewpoint was clear a couple of decades ago, and remains so up to the present day. See my essay Entropy – our perspective at the end of the 20th century at https://www.andrejdetela.si/znanost/entropy-perspective-at-the-end-of-20th-century/
28 More about this will be explained in Part Five of this book (especially in Chapter 30).
29 Andreas Weber, Alles fühlt. Mensch, Natur und die Revolution der Lebenswissenschaften (Berlin Verlag GmbH, Berlin, 2007)
30 One example will be shown in Chapter 6, Fig. 5: cooperation between a microtubule and kinesin molecules.
31 Thomas S. Kuhn, The Structure of Scientific Revolutions (The University of Chicago, 1962, 1970, 1996)
32 An early hint at this conjecture: Erwin Schrödinger, What is life – the physical aspect of the living cell (Cambridge University Press, 1944). See also: H. Fröhlich, Can biology accommodate laws beyond physics? (included in the book: Quantum Implications, Routledge, 1987)
33 Léon Brillouin, Negentropy principle of information, J. Appl. Phys., vol. 24 (9) (1953): 1152-1163
34 Ilya Prigogine, Gregoire Nicolis, Self-organization in Non-equilibrium Systems (Wiley, 1977)
35 Modern research in quantum biology has led to the justifiable hypothesis that certain protein structures inside living cells (e.g. microtubules, see Fig. 5) play the role of a quantum computer (Stuart Hameroff, Quantum computation in brain molecules?, University of Arizona, 1999). It is known that even simple quantum computers in certain ways surpass the most efficient classical computers.
36 James Clerk Maxwell, Theory of Heat (1872).
37 Max Planck, Treatise on Thermodynamics (1897-1922).
38 Roy Jay Glauber, Phys. Rev. 130: 2529 and 131: 2766 (1963)
39 Ludwig Boltzmann, Vorlesungen über Gastheorie (Leipzig, 1896-1898)
40 https://web.archive.org/web/20060619231414/http://www.sintropia.it/english/ (retrieved 22 February 2020).
41 The arrow of time is an official expression denoting the direction of temporal flow – either from the past towards the future, or backwards, from the future to the past.
42 Erwin Schrödinger, What is Life? (Cambridge University Press, 1944).
43 Albert Szent-Györgyi, Drive in Living Matter to Perfect Itself, Synthesis, Vol. 1/1 (1977), pp. 14-26.
44 Ilya Prigogine, Isabelle Stengers, Order out of Chaos (Bantam Books, 1984)
45 Ilya Prigogine, Time, Structure and Entropy (included in the book Time in Science and Philosophy, ed. Zeman J., Elsevier, 1971).
46 Later published in: Arthur Eddington, The Nature of the Physical World (MacMillan, 1928), Chapter 4
47 Albert Einstein, Autobiographical Notes (ed. Schilpp P., A Centennial Edition, Open Court Publ. Co., 1979), p. 31.
48 Léon Brillouin, Science and Information Theory (Academic Press, 1956; 2nd edition 1962, reprinted Dover Publ., 2004).
49 Brillouin L., Maxwell's demon cannot operate: information and entropy, J. Appl. Phys., Vol. 22 (1951), pp. 334-337; Brillouin L., Life, thermodynamics and cybernetics, included in the book Biology and Computation: A physicist's choice (World Scientific, Advanced series in neuroscience, Vol. 3, 1994), pp. 554-568.
50 Rolf Landauer, Minimal energy requirements in communication (Science, 28 June 1996), pp. 1914-1918.
51 Minagawa S. et al. (2023), Universal validity of the second law of information thermodynamics, arXiv:2308.15558.
52 Fritjof Capra, The Turning Point (Simon and Schuster, 1982)
53 Madawala Hemananda, Nature and Buddhism (2002): Chapter 15 (Ecology & the environment)
54 Leon N. Cooper, Science and Human Experience: Values, Culture and the Mind (Cambridge Un. Press, 2014)
55 Many examples and studies on this are found in a pair of books: Maxwell's Demon: Entropy, Information, Computing (eds. Leff H.S. and Rex A.F., Princeton University Press, 1990); Maxwell's Demon 2: Entropy, Classical and Quantum Information, Computing (eds. Leff H.S. and Rex A.F., Institute of Physics Publ., Bristol and Philadelphia, 2003). Both books are a collection of studies encompassing both old and modern beliefs concerning the existence of this "demon".
56 A. N. Kolmogorov, Inf. Transmission, Vol. 1 (1965), p. 3; IEEE Trans. Inf. Theory, Vol. 14 (1968), p. 662.
57 Albert Einstein, Relativity: The Special and General Theory (3rd ed., London, 1920)
58 Paul A.M. Dirac, Quantum Mechanics (Oxford Univ. Press, 1958): § 10.
59 Mari Jibu and Kunio Yasue, Quantum Brain Dynamics and Consciousness (John Benjamins Publ. Co., 1995), Sections 16 and 17.
60 Some recent ideas about syntropy in living matter (ordered quantum systems) will be shown in Chapters 23 and 24.
61 Although there have been many genuine efforts to find the "demon" up to this day, most of these efforts have not followed strict scientific criteria, or they have disregarded the complex reality of many physical systems. Something like this was not even expected, since, until quite recently, the natural sciences and experimental techniques were not yet at a sufficient level. But as soon as these particular branches of the natural sciences reached an adequate level of maturity, a new problem arose: specialization in scientific research. Very few creative minds can link together the (apparently remote) areas of quantum physics, electromagnetism, and thermodynamics.
62 Nikola Tesla, The problem of increasing human energy (originally published in Century Magazine, June 1900; later reprinted in many separate editions).
63 Wilhelm Reich, The Bion Experiments on the Origin of Life (Octagon Books, New York, 1979).
64 Henri Bergson, L'évolution créatrice (Oeuvres, Presses universitaires de France, Paris, 1970).
65 Bergson H., Essai sur les données immédiates de la conscience (ibidem).
66 David Bohm, Causality and Chance in Modern Physics (Routledge, 1957).
67 Maxwell's Demon: Entropy, Information, Computing (eds. Leff H.S. and Rex A.F., Princeton University Press, 1990); Maxwell's Demon 2: Entropy, Classical and Quantum Information, Computing (eds. Leff H.S. and Rex A.F., Institute of Physics Publ., Bristol and Philadelphia, 2003).
68 References to these three conferences in San Diego, and to the conference in Prague, are cited in Chapter 11 ("San Diego Files"). See endnotes 83, 84, 85.
69 Vladislav Čápek and Daniel P. Sheehan, Challenges to the Second Law of Thermodynamics – Theory and Experiment (Springer, 2005).
70 Thomas Cleary, Kensho – The Heart of Zen (Shambala, Boston & London, 1997)
71 Philip Kapleau, Straight to the Heart of Zen: Eleven Classic Koans and their Inner Meanings (Shambala, 2001)
72 Chögyam Trungpa, Shambala: The Sacred Path of the Warrior (Shambala, Boston & London, 1995)
73 David Bohm, On Creativity (ed. Lee Nichol, Routledge, London & New York, 1998)
74 Questioning Krishnamurti: J. Krishnamurti in dialogue with leading twentieth century thinkers (Krishnamurti Foundation Trust, 1996)
75 Andrej O. Župančič, O ustvarjalnosti v znanstvenem raziskovanju (On creativity in scientific research, ZRC SAZU, Ljubljana, 2006)
76 Lars Onsager, Reciprocal relations in irreversible processes, Phys. Rev. (A) 37 (1931): 405; 38 (1931): 2265
77 See also the first footnote in the final chapter (about antiparticles).
78 The arrow of time could be imposed on molecules through any entropic process, for instance if we expose one half of the basin to a heater. This can be done only from the outside, but since the system at issue is isolated, there is no chance of this. It is also important to know that the "demon" itself does not act through an entropic process – in accordance with the definition of Maxwell's demon.
79 Many inventors tried in vain to find the "demon" by simply putting together several macroscopic parts of a complete physical system (e.g. they put together various gas containers, turbines and compressors, heat exchangers, etc.), but all these parts are simple enough for us to know that each one of them is firmly within the realm of the Second Law. And since entropy is an additive quantity (i.e. the entropy of the entire physical system is the sum of the entropies attributed to its distinct parts), the Second Law holds also for the entire system. So this false approach to the "demon" always led down a blind alley. It seems that even Nikola Tesla made this mistake.
80 Sybren Ruurds de Groot, Thermodynamics of Irreversible Processes (North Holland Publ. Company, 1963)
81 Mircea Eliade, Le Yoga, immortalité et liberté (Editions Payot, 1954, 1975, 1991): Chapter 6
82 This treatise does not deal with physical systems that are far from TE from the start. Through syntropic processes, some systems of this kind may escape even further from TE. Living matter is by definition far from TE, so it is worth noting that certain biological phenomena inside living cells may justify the above assumption.
83 1st conference: Quantum Limits to the Second Law: Theory and Experiment (First International Conference on Quantum Limits to the Second Law, University of San Diego, 2002): http://www.ipmt-hpm.ac.ru/SecondLaw/part1.en.html; 74 contributions, most of them accessible in PDF format at https://aip.scitation.org/toc/apc/643/1?size=all&expanded=643 (retrieved 5 September 2019). 2nd conference: Sheehan D.P. (ed.), The Second Law of Thermodynamics: Foundations and Status (Proceedings of the AAAS Symposium, 19-22 June 2006, University of San Diego, CA. Special issue of Foundations of Physics, Vol. 37.12, 2007). 3rd conference: Sheehan D.P. (ed.), The Second Law of Thermodynamics: Status and Challenges (AIP Conference Proceedings, University of San Diego, 14-15 June 2011).
84 Frontiers of Quantum and Mesoscopic Thermodynamics (Prague, Czech Republic, 26-29 July 2004).
85 Čápek V. and Sheehan D.P., Challenges to the Second Law of Thermodynamics – Theory and Experiment (Springer Verlag, 2005).
86 This alocality is related to Heisenberg's uncertainty principle, expressed also by the wave-particle duality. Schrödinger's description of the quantum world gives some insight into the form of a-local quantum states. See also David Bohm & Basil J. Hiley, On the intuitive understanding of non-locality as implied by quantum theory, Foundations of Physics, 5 (1975): 93-109
87 David Bohm, Quantum theory as an indication of a new order in physics. Part A: The development of new orders as shown through the history of physics, Foundations of Physics, 1 (1971): 359-381; Part B: Implicate and explicate order in physical law, Foundations of Physics, 3 (1973): 139-168
88 We can say that these systems start from TE, since the initial difference from maximum entropy can be infinitely small.
89 Several synonyms for negative entropy: informational structure; negentropy (old term); syntropy.
90 Richard Feynman, Lectures on Physics, Vol. II (Addison-Wesley Publ. Company, 1964)
91 Minoru Fujimoto, Physics of Classical Electromagnetism (Springer Verlag, 2007)
92 https://en.wikipedia.org/wiki/Chirality_(mathematics)
93 https://en.wikipedia.org/wiki/Cross_product
94 Fu Xinyong, Fu Zitao, Realization of Maxwell's Hypothesis: A heat-electric conversion in contradiction to Kelvin's statement, arXiv.org/physics/0311104 (Shanghai Jiao Tong University, 2003). Updated version: Fu X. & Fu Z., Realization of Maxwell's Hypothesis: An Experiment of Heat-Electric Conversion in Contradiction to the Kelvin Statement. Preprints 2016, https://doi.org/10.20944/preprints201607.0028.v7
95 Alfred Hermann Sommer, Photoemissive Materials (Wiley, 1968)
96 Calculated from results in the updated published version (2016).
97 For a complete explanation of the experiment, see the original text published in unaltered form at https://doi.org/10.5281/zenodo.14192042. Although it was written a long time ago (in 1987), it is still relevant. In that paper, I used the obsolete term dimentropic effect (something that diminishes the entropy of an isolated physical system) instead of the modern term syntropy (or syntropic process). The only statement with which I disagree today is at the end of the article: "Rather complex theoretical arguments suggest that it is very hard (if not impossible) to find a case of Second Law violation, when a homogeneous magnetic field is applied to any system of particles." My further research, along with the Chinese experiment, disproved this claim.
98 The basic geometry of the experiment with permanent magnets is shown in Chapter 15 (Fig. 17).
99 These tests are explained in the above reference https://doi.org/10.5281/zenodo.14192042.
100 The chiral factor C displays the symmetry properties of a pseudoscalar. One consequence of this fact is that syntropic currents are aligned with the magnetic field lines – hence the helical position of the electrodes, together with the relative position between the coil and both electrodes, as shown in the drawings.
101 This refers to chirality as it is defined in physics, chemistry, and biology. In mathematics, chirality exists also in the two-dimensional world.
102 Alexander Perminov and Alexey Nikulov, Transformation of thermal energy into electric energy via thermionic emission of electrons from dielectric surfaces in magnetic fields (2011), https://www.researchgate.net/publication/258476601 (retrieved 29 February 2020).
103 Andrej Detela and Milan Hodošček, Kritika entropijskega zakona – Numerično modeliranje sintropnega toka v prostoru med dvema cevema (internal publication in the Slovene language, 1984). English translation: The Law of Entropy – a Critique: Numerical modelling of a syntropic current in the space enclosed by two tubes. See https://doi.org/10.5281/zenodo.14191729.
104 In the Russian experiment, the measured power even reached 20 nanowatts, since one part of the system was intensely heated in order to increase the electron gas density. A violation of the Second Law was deduced only on the basis of logical argumentation.
105 The dependence of electron density on temperature is described by the Boltzmann distribution of energy levels. For a definition, see Handbook of Physics (ed. W. Benenson et al., Springer-Verlag, 2002): 17.1.1.2
106 Robert G. Chambers, Electrons in Metals and Semiconductors (Chapman and Hall, 1990)
107 L. Solymar, D. Walsh, R.R.A. Syms, Electrical Properties of Materials (10th edition, Oxford University Press, 2019)
108 Richard Turton, The Physics of Solids (Oxford University Press, 2000): Chapter 6
109 https://en.wikipedia.org/wiki/High-electron-mobility_transistor
110 Hans-Eckhardt Schaefer, Nanoscience (Springer-Verlag, 2010): Chapter 3.4
111 The complete theory based on analytical and numerical models will be published separately.
112 In this chapter, C does not describe the chirality of the field itself (as was the case in Chapter 14), but describes the structural chirality of the material.
113 Crystals of pure tellurium, and also crystals of cinnabar, belong to the rhombohedral crystalline class 32, space group P3121 or P3221 (two chiral enantiomers).
114 Hans-Eckhardt Schaefer, Nanoscience (Springer-Verlag, 2010): Chapter 5.1
115 Single-wall carbon nanotubes are characterized by two main parameters (m and n) that define their physical properties: the rate of chirality, metal vs. insulator behaviour, and several other properties of each specific nanotube. In order to test syntropy in CNTs, we must select a nanotube with a well-defined combination of these two leading parameters.
116 Michael P. Marder, Condensed Matter Physics (Wiley, 2010): Chapter 10, Problem 3 (p. 287)
117 This is not a simple classical movement, since the electron quantum states are spread all around the nanotube. But we can talk about the privileged direction of the orbital angular momentum.
118 Richard Feynman, Lectures on Physics, Vol. II (Addison-Wesley Publ. Company, 1964): Chapter 34-6
119 The Oxford Handbook of Philosophy of Time (ed. Callender C., Oxford University Press, 2011): Part V, Time in a Quantum World
120 David Bohm, Quantum Theory (Prentice-Hall, 1951): Part I, Chapter 8; David Bohm, Causality and Chance in Modern Physics (Routledge & Kegan Paul, London, 1957): Part IV, Chapter 4.
121 For instance, the delayed-choice quantum eraser experiments. See Brian Greene, The Fabric of the Cosmos: Space, Time, and the Texture of Reality (Alfred A. Knopf Publ., 2004): Chapter 7 (Time and the Quantum)
122 David Bohm, Wholeness and the Implicate Order (Routledge & Kegan Paul, 1980). See also: Quantum Implications (ed. B.J. Hiley and D. Peat, Routledge, 1987)
123 David Bohm and Renée Weber, Meaning as being in the implicate order philosophy of David Bohm: a conversation (included in the above cited book Quantum Implications, Routledge, 1987).
124 Nikola Tesla, Radovi iz oblasti elektroenergetike (Works in the field of power engineering, ed. Nikola Tesla Museum, Belgrade, 1988)
125 Sadi Carnot, Réflexions sur la Puissance Motrice du Feu (Reflections on the motive power of fire, 1824)
126 Andrej Detela, Mathematical theory of a new type of synchronous electric motors (internal publication, 2000). Cooperation between the Jožef Stefan Institute (Ljubljana) and Harmonic Drive Systems (Hotaka, Japan).
127 https://en.wikipedia.org/wiki/Circulation_(physics)
128 However, given the interaction with the nanoelectrodes, this is a probabilistic average – in the spirit of quantum measurement theory.
129 A single-phase oscillation bears no information about the arrow of time (and the enclosed area A is zero), but even with a polyphase oscillation the enclosed area would be zero if we did not take into account the miraculous nature of the quantum world.
130 The complete mathematical theory of the energy transfer in polyphase quantum states, together with many examples and the detailed design of proposed experiments, was presented in my book Syntropy in Polyphase Quantum States (original title in Slovene: Sintropija v polifaznih zibelkah, 2014). At present, only the original Slovene version of the book is available at https://www.andrejdetela.si/knjige/
131 Ilya Prigogine and Gregoire Nicolis, Self-Organization in Non-Equilibrium Systems (Wiley, 1977)
132 The challenges and details of this planned experiment have been presented in my book Syntropy in Polyphase Quantum States. See endnote 130.
133 Many relevant data have been presented at three consecutive conferences on EM phenomena in living matter that took place in Prague, Czech Republic. See the corresponding proceedings: Jiři Pokorny (ed.), Electromagnetic aspects of selforganization in biology (Proceedings of international symposium, Prague, 2000); Jiři Pokorny (ed.), Endogenous physical fields in biology (Proceedings of int. symposium, Prague, 2002); Jiři Pokorny (ed.), Coherence and electromagnetic fields in biological systems (Proceedings of int. symposium, Prague, 2005).
134 See an overview of the relevant achievements in: Cifra M., Fields J., Farhadi A., Electromagnetic cellular interactions, Progress in Phys. and Mol. Biology, Vol. 105/2 (2011), pp. 223-246.
135 https://en.wikipedia.org/wiki/Black-body_radiation
136 Due to spatial limitations, I am unable to mention a huge number of details here, so I do not cite the relevant authors involved in this research. A large list of references, including new discoveries in biology, can be found in my above-cited book Syntropy in Polyphase Quantum States (Slovene version 2014), Chapter 24.
137 https://en.wikipedia.org/wiki/Levinthal%27s_paradox
138 Free energy is a specific quantity in the theory of thermodynamics, and is slightly different from plain energy. Today the term "free energy" is loosely (and incorrectly) used also for that kind of energy that can be extracted from some unknown, freely accessible source (see also endnote 154 – the final remark there).
139 Michael P. Marder, Condensed Matter Physics (Wiley, 2010)
140 Mari Jibu and Kunio Yasue, Quantum Brain Dynamics and Consciousness (John Benjamins Publ. Co., 1995), especially Part Three of the book.
141 Rupert Sheldrake, A New Science of Life: the hypothesis of formative causation (J.P. Tarcher Publ., Los Angeles, 1981)
142 See a short video of kinesin's guided walking at https://en.wikipedia.org/wiki/Kinesin
143 The length λ is no more than a single molecule of ABC, thus the corresponding acoustic frequency is extremely high – it is in the range of hypersound (a frequency above 10 GHz).
144 Albert L. Lehninger, Biochemistry (Worth Publishers, 1975): Chapter 18 (Oxidation-reduction enzymes and electron transport)
145 Marian Wnuk, Istota procesow życiowych w świetle koncepcji elektromagnetycznej natury życia (The essence of life processes in the light of the concept of the electromagnetic nature of life, University KUL, Lublin, Poland, 1996): Chapter 2.3; Summary: https://www.kul.pl/the-essence-of-life-processes-in-the-light-of-the-concept-of-electromagnetic-nature-of-life,26372.html
146 Albert L. Lehninger, Biochemistry (Worth Publishers, 1975): Chapter 13, p. 354; W.T. Keeton, J.L. Gould, C.G. Gould, Biological Science, 5th edition (W.W. Norton & Company, 1993): Chapter 36, p. 1042
147 Boris F. Sergeev, Physiology for Everyone (Mir Publishers, Moscow, 1978): Chapter "Water – a personal ocean"
148 See the related experiments with snowflakes in: Masaru Emoto, The Hidden Messages in Water (orig. Mizu wa Kotae o Shitteiru, Sunmark Publishing, 2001).
149 A derivative of the experiment from Chapter 22 could yield several watts of electric power. It was described in detail in the cited book Syntropy in Polyphase Quantum States (2014), but its realization would require extremely expensive nanotechnology. Another allied variant was conceived later and is simpler, but is still close to the edge of feasibility. These complicated variants are not included in the present treatise.
150 The hypothetical model of syntropy in living matter (Chapters 23 and 24) also deals with syntropy in bulk material. However, there the applied frequencies are extremely high, the oscillations are weak, and a totally different theoretical framework is necessary.
151 For instance, heat conductivity is different along two (or three) different directions of heat transfer. Dielectric permittivity can also be different for each of the three directions x, y, z.
152 J. F. Nye, Physical Properties of Crystals (Oxford Univ. Press, 1957, 1985)
153 J. F. Nye, ibidem, Chapters III-2 and IV-4, and also Appendix F.
154 Here we discussed a rotating electric field, but said nothing about a magnetic analogy: syntropy produced by the influence of a rotating magnetic field upon an anisotropic magnetic crystal. I attempted to develop a theoretical model for this process, but experiments did not confirm it. However, I would still argue that magnetic phenomena of this kind do exist. I can cite at least two names in favour of such a hypothesis: Floyd and Yildiz.
In the 1980s, Sweet Floyd (United States) developed a two-phase magnetic system (a permanent magnet fitted with two coils, one wound perpendicular to the other) that cooled down and produced electric power, sometimes more than 100 watts. See http://panacea-bocaf.org/floydsweet.htm; www.rexresearch.com/sweet/1nothing.htm; https://www.youtube.com/watch?v=Ljypdy8yDNU (retrieved 26 February 2020). And in the 2010s, Moamer Yildiz (Turkey) assembled his "magnetic motor", which provided considerable mechanical power (sometimes more than 1 kW) without being attached to the grid. Again, several parts of this motor became cooler than the ambient temperature. According to Prof. Jorge Duarte (Eindhoven University of Technology), a respected expert in electric motors, there were no hidden "fake parts" in this "magnetic motor". See https://gaia-energy.org/en/projekt-yildiz-announcement/ (retrieved 26 February 2020). In both cases, the experiments were not regularly reproducible, so the right parameters could not be determined – they were sought only by means of trial-and-error. In fact, there was a total lack of any credible theoretical model. This was the main reason why these sporadic successes did not attract serious scientific consideration. Presentations of this kind are usually classified as "free energy devices". One can find hundreds of them on the Internet, many of them quite dubious (without any reference to where the energy comes from). Also the denomination is not well chosen, since free energy is already a canonized name for a certain physical quantity (with a different meaning) in the theory of thermodynamics.
155 In recent decades there have been a growing number of authors expounding on these ideas of great integration. For instance, see Ken Wilber, Integral Buddhism and the Future of Spirituality (Shambala Press, 2018)
156 Every human activity becomes a burden if it is entrapped within a framework of excessive competition, without integral creative insight and loving kindness for all living beings.
157 Sheila Jasanoff, The Ethics of Invention: Technology and the Human Future (W.W. Norton & Company, 2016)
158 Translated into English from the Slovene version of the original. Source: Sergej Jesenin, Pesmi (Ljubljana, 1967). The original is displayed at: https://artsandculture.google.com/asset/letter-a-a-blok-to-s-a-yesenin/GAEgz348AKL9fQ?hl=en
159 Most of those scientists who introduced radically new ideas into our human perspective were deeply interested in the arts and philosophy. Two examples in the realm of quantum physics: Erwin Schrödinger secretly studied Indian philosophy, and David Bohm had many inspiring dialogues (most of them published) with the Indian philosopher Jiddu Krishnamurti.
160 For an expansion of this topic, see my essay Silent Message of Nature at https://www.andrejdetela.si/umetnost/silent-message-of-nature/
161 In fact, it is theoretically impossible to define the entropy of the entire human society, also because the number of subsystems of internal order in each human society depends on our imagination.
162 Alan Lightman, Great Ideas in Physics (McGraw-Hill, 2000), Chapter 2: The Second Law of Thermodynamics.
163 We can also produce electricity with renewable sources (such as solar, hydro and wind power), but they currently do not satisfy our contemporary energy needs. These sources are still within the realm of the Second Law.
164 Several material forms exist at the subtle levels of our material world.
They are recognized by sensitive people as an aura in the form of several radiating sheaths around living beings. Other subtle forms of matter include the "informational fields" that play a role in certain psychic phenomena (telepathy, etc.) and in animal orientation (for the migration of birds, etc.). These phenomena are not yet explained by present-day science.
165 Barbara Ann Brennan, Hands of Light (Bantam Books, 1988); Light Emerging (Bantam Books, 1993)
166 Perhaps a complete balance between entropic and syntropic processes is manifested only on the scale of the whole Universe, and over a sufficiently long span of time. Can such an idea shed some light on our understanding of the Universe? My opinion is that the language of science (as we know it today) cannot give a reply to this hypothesis. It can be assumed only as a philosophical conjecture.
167 This is a necessary condition but not a sufficient condition. Syntropy can contribute only a small part of the solution to today's problems. Every specific human problem must be tackled in its own way. It is our noble human heart that brings all these solutions together.
168 A quotation from Martin Luther King. He demonstrated that our dreams are forerunners of quite feasible social changes.
169 Vyakti is a word of Sanskrit origin. Here I cite a great Sanskrit-English Dictionary (Monier-Williams M.): the root vy-añj = to decorate, adorn, beautify; to cause to appear, manifest, display, make clearly visible, etc. Vyakti (f) = visible appearance or manifestation, specific appearance, distinctness, individuality, beauty, etc. Vyakta (m) = heat, learned man, etc. According to these meanings, our vyakti makes "hidden" thermal energy clearly visible, and beautifully manifests it in the form of electric power.
170 Yanis Varoufakis, The Global Minotaur: America, Europe and the Future of the Global Economy (Zed Books, London, 2011)
171 Here I did not mention the social conditions, which are also totally out of balance (rapidly growing social and economic inequality, manipulative media and social networks, global stress, the threat of great wars and violent upheavals, etc.). If we do not recover our sanity in due time, then Nature will pass severe and final judgement.
172 https://en.wikipedia.org/wiki/ITER
173 Naomi Klein, This Changes Everything: Capitalism vs. the Climate (Simon & Schuster, 2014): Part Two
174 Bertrand Russell, Has Man a Future? (George Allen & Unwin, London, 1961)
175 The commentary accompanying his LP record In the Light (ECM Records, 1974).
176 Here we exclude the Second Law, which relates only to many-particle systems. On the fundamental level, the only known exception to symmetry as it applies to both temporal directions is the so-called CPT invariance in the physics of elementary particles. But that is already a different story.
177 Time's Arrows Today: Recent Physical and Philosophical Work on the Direction of Time (ed. Savitt S.F., Cambridge University Press, 1995)
178 The Oxford Handbook of Philosophy of Time (ed. Callender C., Oxford University Press, 2011): Part Two (The Direction of Time)
179 Victor Mansfield, Time and Impermanence in Middle Way Buddhism and Modern Physics, included in Buddhism and Science (ed. B.A. Wallace, Motilal Banarsidass & Columbia Un. Press, 2003): pp. 305-321
180 Henri Bergson, Oeuvres (Presses universitaires de France, Paris, 1970): Essai sur les données immédiates de la conscience; L'évolution créatrice
181 Poetically expressed in Ātmaṣaṭkam (Song of the Soul), a hymn presumably written by the great Indian philosopher Śaṅkarācārya (most probably in the first half of the 8th century). See the whole text in B.K.S. Iyengar, Light on Yoga (Unwin Paperbacks, 1985): p. 53
182 Antiparticles (antiprotons, positrons, etc.), according to CPT invariance, live in the opposite direction of time, and so they are annihilated as soon as they come into contact with our experienced reality.
183 Dependent co-arising, dependent origination (pratītyasamutpāda), an important concept in Buddhist philosophy. See Joanna Macy, The Ecological Self, included in the book Buddhist Philosophy: essential readings (ed. W. Edelglass & J. L. Garfield, Oxford University Press, 2009)
184 For a poetic description of the ideas presented here, see my essay Syntropic perception of time at https://www.andrejdetela.si/umetnost/syntropic-perception-of-time/
185 Alain Badiou with Nicolas Truong, Éloge de l'amour (Flammarion, Paris 2009). English translation: In Praise of Love (Flammarion, 2012), p. 48.