E-book
39.38
printed A5
59.25
Cognitive Prosthesis

Free excerpt - Cognitive Prosthesis

Length:
141 pp.
ISBN:
978-83-8384-047-5

Only Physics counts, the rest is philately

— Ernest Rutherford

Introduction

At the very beginning of this publication, I would like to clarify that this will be a book mainly about preserving and improving the condition of a person's mind and life using Quantum Tools, primarily the Philosopher's Stone Algorithm, of which I am the author. Of course, you can also use other Tools for this purpose, but because I am accustomed to the Algorithm and personally consider it the best Tool ever invented, I use it more or less automatically. Technical Tools such as Quantec or Healy (and others will probably be created soon) are expensive, however, and you need a large enough budget for them. I know, I know: sick people sell what they own to buy such modern devices, but for now I still think that „my" Philosopher's Stone Algorithm is the best. It only requires commitment and time. Someone who does not want to delve into the theory of how these Quantum Tools work, and who has the cash, will probably choose to buy a technical quantum device of the Quantec or Healy type, or something else that, I predict, will be created in the near future. Such a person will have the problem settled, will be able simply to enjoy the benefits of Quantum Tools and the whole theory passively, and go about their own business.

However, in order to focus on the substance of this topic, I must refer here to the theory of esoPhysics and the Two-Level Interpretation of Quantum Mechanics, which I have been promoting in my books; from now on, however, I will call it the Theistic Interpretation of Quantum Mechanics. For a deeper introduction to the subject, I would recommend reading my previous publications. These books, written over the years, are: „esoPhysics", „The Uber Mage", „The Quantum Condition of the Mind", „The Four Pillars" and „Mind and esoPhysics". You don't have to buy them right away, as most are available through the Legimi and Empik Go subscription services.

Here I must refer to this theory to justify my view of the nature of the human psyche and consciousness, in the context of the possibility of intervening in it positively and „repairing" it; that is, of properly correcting the psyche, human behavior and everything that affects us humans in life. Normally, after all, we live without deep reflection and without attention to how the turns of life sculpt us, or rather our brains. And, as it happens in life, some are more fortunate and others less so (at least it seems so to us from the perspective of the Measurable Level, but in reality …?!). Thus our psyches, our consciousnesses and even our souls can be corrected with the help of Quantum Tools. Someone will say: but this, after all, is the content of our Path of Spiritual Development; it is inscribed in the meaning of our existence here on Earth. Yes and no. Indeed, until now there were no effective methods of intervening in this, but today, in the Quantum Age, there are finally the right Quantum Tools to face this subject. There is also a relevant theory, which, let me say here, I have mostly developed myself, or possibly unearthed: people discovered it long ago, only the deluge of Atheism and Materialism that took over Science and the official narrative pushed it out of the public domain. Now I will state at the outset what I am going to try to demonstrate before I move on to the proper forms of the Cognitive Prosthesis, which makes it possible to influence the quality of our lives. First of all, I will demonstrate that the world is based on two levels: the Measurable Level, that of Hard Matter, and the Non-Measurable Level, the level of Transcendence, the Spiritual Level.
Then I will argue that on the Non-Measurable Level there must manifest the Causal and Purposeful Cause, which I call God, and which is already sufficient for all the Effects it causes; those Effects are then manifested on the Measurable Level. I will show that God has given humans the attribute of influencing this Non-Measurable Level, which is the proper base of Causal and Purposeful Causes, whose Effects manifest later on the Measurable Level, the one that can be subjected to measurement. At this level (the Non-Measurable Level), as I will also show, there is no randomness. I will need this to show that man, an entity that functions on the Measurable Level with a material body and a material brain, must, like any entity, also be linked to the Non-Measurable Level, because everything must have a link to it. Linked through what? Through the Soul, which is the manifestation of a person's Spirit from the Non-Measurable Level onto the Measurable Level. There is sufficient empirical evidence, I claim, that the human brain, through superposition, contains the material part related to the Measurable Level, this Animality, and has a Soul, an expression of the Spirit, which is perceived by the brain, as the Occultists claim, probably through the pineal gland; but it is not even important through what we perceive the Soul in the brain, because the Soul is certainly not a product of the work of the brain's neurons. What is important is that the end result of this superposition of the Spiritual and the Animal is human consciousness. Human consciousness cannot be explained by the neural structure of the brain and nervous system alone; attempts to study and explain it go back more than a hundred and fifty years, and so far the phenomenon of human consciousness could not, and still cannot, be explained in materialistic and atheistic categories.
So Descartes was right that there is in man a dualism of the Spiritual and the Material (the Animal). Next, I will show how it is possible, using Quantum Tools and according to this whole theory, to influence the brain and the quality of human life by means of what I have called, perhaps erroneously or imprecisely, the Cognitive Prosthesis.

1. Historical outline

a. Physics up to the 20th century

The science of our modern civilization is short, very short; it amounts to just the last few minutes in, to put it conventionally, the daily history of our species. Therefore, although we already boast of quantum computers, it is not surprising that we still know so little. We are lost in conjecture, and knowledge, even that of the Universities, is based mainly on usus: what the authorities deem to be true is taught universally. And although the history of science (I am thinking here mainly of physics and of metaphysics, though the latter is often regarded, not without reason, as a collection of confabulations) is so short, it has abounded in theories that were rejected over time by the authorities. Take, for example, Ptolemy's geocentric cosmology, the caloric theory of heat, the concept of the ether as a carrier of electromagnetic waves, or Kelvin's vortex model of the structure of atoms: all of these, it is believed, were falsified and rejected. Falsification was widely regarded as the primary logical tool in establishing the prevailing Paradigm of Science, at least until the end of the 20th century. Today there are theories that do not lend themselves to falsification, although most authorities consider them at least promising; I am referring to M-Theory, String Theory and Supersymmetry. This is because science has reached the limit of human perception, that is, it has descended below the Planck time and the Planck length, which can no longer be subjected to falsification, something done in Science mainly empirically, that is, by experiment. And although empirical confirmation of these theories is being frantically sought at CERN, for the time being this has not yielded the desired result. Physics as a Science originates in the deliberations of the early Greek philosophers, who passionately considered the ontological nature of entities.
And this, along with philosophy, remains a favorite theme of physics. Antiquity abounded in philosophers who are still highly regarded today, but first-order scientists among them could be counted on the fingers of one hand. Certainly Pythagoras, Archimedes and Aristotle can be counted among them. Aristotle, who after all „knew everything", unfortunately introduced into common Science a large number of confusions and inaccuracies, which persisted almost until the time of Galileo. And although he based his theories on verbal formulations (the algebraic form of representing the laws of nature was not popularized until the 17th century and later), it is known that one of the postulates he believed correct was the famous formulation F = mv: according to him, force was proportional to mass and velocity. Hence a common conclusion from this formula was that when no force acts on a body of mass m, i.e. F = 0, the body does not move, because then v = 0. This seems total nonsense to us today, but it was believed to be so until the advent of Galileo Galilei. Galileo was probably one of the first full-fledged physicists to operate mainly with knowledge derived from experiment. He was the first to correct Aristotle: with a series of experiments he showed that if no force acts on a body, F = 0, then the body moves uniformly along a straight line or remains at rest. From this postulate it followed that F ≠ mv; what, then, is F equal to? Galileo partially found the answer to this question, but did not formulate and disseminate it. It was only Newton, the creator of integral and differential calculus, who gave the complete answer. Galileo stated that for F = 0 we have v = constant; it was Newton who knew that a = dv/dt, hence d(constant)/dt = 0, because the derivative of a constant is zero, so his correct conclusion was that for F = 0 we have a = 0, i.e. that F must be directly proportional to a; from there it was a small step to formulating Newton's universal law of Dynamics, F = ma. In this Newtonian formulation, m (mass) acts as a constant coefficient of proportionality. This, then, was an effective refutation of Aristotle's claim. But before this happened, the history of physics saw the emergence of such names as Tycho Brahe and his heir Kepler. The former compiled, with the naked eye, very accurate astronomical tables, from which, after deep analysis, Kepler derived his laws, establishing among other things that the Planets move not in circular orbits, as hitherto assumed, but in ellipses, with the Sun always located at one of the foci of the ellipse. Kepler's cosmology was cemented by Newton, who finally explained the cosmological laws with his formula for the gravitational force, which is directly proportional to the product of the masses of the Sun and the Planet, and inversely proportional to the square of the distance between them. It turns out that all of Kepler's Laws follow directly from this simple relationship. Newton was also the first scientist to formulate the so-called Scientific Method, the way of working of a physicist: a scientist who objectively investigates and analyzes the conditions of a physical problem, without recourse to extraordinary or mystical circumstances. Such a scheme can be found in all his available works, especially in the Principia, where Newton explicitly shuns formulations that cannot be proved scientifically. This can be seen in his treatment of the action of the Gravitational Force through „empty" space, which was difficult for him to accept (a force acting without direct contact); he preferred to draw no metaphysical conclusions on the subject.
This problem was solved only by Einstein in the General Theory of Relativity, who determined that there is no Gravitational Force: there is only a disturbance of space-time, and cosmic bodies move along geodesic lines determined by the masses disturbing that space. For Newton it was difficult to assume that Gravitational Forces act without intermediaries through the empty space of the Cosmos, and yet this is what his Cosmology boiled down to. In modern physics it is likewise accepted that Forces act through force-exchange particles and cannot act without intermediaries, unless they act through a Force Field, as in Maxwell's Classical Theory; but the concept of a Field is a separate, deep physical concept. Today we know that there must be Force-Exchange particles in any case. And although Newton is considered the founder of modern Science, free of reference to extraordinary phenomena, it is worth knowing that privately he was an avid Alchemist and Magician, who later devoted almost his entire life to the study of the Bible Code, from which, he hoped, he would gain deep knowledge of the world, of reality and of Spirituality. It is something of a paradox that the Founding Father of rational Science was de facto a Magus, someone who partook of the extrasensory world. Even so, his achievements in Science are undeniable and indisputable.
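The claim above, that Kepler's Laws follow from the inverse-square law, can be spot-checked numerically. The sketch below is my own illustration, not from the book: it integrates Newton's F = ma for an inverse-square force (with GM set to 1 in dimensionless units) and verifies that the areal velocity of Kepler's second law, and the orbital energy, stay constant along an elliptical orbit.

```python
import math

def accel(x, y):
    # inverse-square acceleration toward the origin, GM = 1
    r3 = (x * x + y * y) ** 1.5
    return -x / r3, -y / r3

x, y = 1.0, 0.0          # start at perihelion, distance 1
vx, vy = 0.0, 1.2        # faster than circular speed, so the orbit is an ellipse
dt = 1e-3
ax, ay = accel(x, y)
L0 = x * vy - y * vx     # angular momentum = 2 * areal velocity (Kepler's 2nd law)
E0 = 0.5 * (vx * vx + vy * vy) - 1.0 / math.hypot(x, y)

for _ in range(20000):   # velocity-Verlet steps, roughly 1.3 orbital periods
    vx += 0.5 * dt * ax; vy += 0.5 * dt * ay
    x += dt * vx; y += dt * vy
    ax, ay = accel(x, y)
    vx += 0.5 * dt * ax; vy += 0.5 * dt * ay

L1 = x * vy - y * vx
E1 = 0.5 * (vx * vx + vy * vy) - 1.0 / math.hypot(x, y)
print(abs(L1 - L0), abs(E1 - E0))   # both drifts are tiny
```

The areal velocity is conserved essentially to round-off, which is exactly Kepler's second law emerging from F = ma plus the inverse-square force.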

The Enlightenment and the period after it saw the emergence of the classical formalisms, mainly formalisms based on purposive causality: the principle of least action, Lagrange's formalism and, later, Hamilton's formalism. These differ somewhat from the Newtonian formalism, which was based on causal causality, where the main role was played by the Forces that caused the action. In the formalisms based on purposive causality it is not the Forces that are at the core, but the purposive conditions (e.g. the Hamiltonian) that the bodies subject to such a formalism must satisfy. But for both Causal Causes and Purposeful Causes under the Determinism in effect at the time, Causes were already sufficient for Effects to take place in a definite way. This distinguishes the formalism of classical physics (mechanics) from quantum physics (mechanics), as I will write later. It will turn out that for the Quantum at our Measurable Level, Indeterminism applies: Causes (Causal or Purposeful) are only necessary, but not sufficient, for Effects to take place. The distinction between the two types of causes, Causal Causes and Purposeful Causes, dates back to antiquity, and in fact originates with the already mentioned Aristotle himself. He introduced the concept of four types of Causes: formal, material, causal (efficient) and purposive (final), but only the Causal and Purposive Causes have survived to our time. The former two pertained to his metaphysical conception of the ontology of being (the structure of matter); since that idea turned out to be at least inaccurate, and even wrong, today these Causes (material and formal) are no longer included in the description of physics.
And indeed, the Newtonian formalism based on the concept of Force is consistent with Causal Causes, while the formalisms of Lagrange and Hamilton are based on and strictly related to Purposeful Causes: in Hamilton's formalism the elements of a system (a body) must satisfy the purposive condition that the Hamiltonian of the system defines. But Causality, the Law of Cause and Effect, must take into account both types of Causes and applies to both. In my earlier books, being more attached to the Newtonian formalism, which was after all the first, I defined the Law of Causality in the context of Causal Causes; but all conclusions based on Causality, i.e. on Determinism or Indeterminism, can just as well be related to Purposeful Causes, so from now on I will speak of Causes generally, in both contexts, Causal and Purposeful, without specifying. It may be noted here that the Hamiltonian and Lagrangian formalisms are more readily used in modern physics because of a number of advantages, including computational ones, over the Newtonian formalism. These include the ease of transformation from one reference system to another (generalized coordinates and momenta) and the resulting simplicity of describing the physics of such systems in these formalisms based on Purposeful Causes. However, the concept of Force is very persistent, as most descriptions and laws of physics use it. Four types of Elementary Forces are distinguished: the Gravitational Force, the Electromagnetic Force, and the Strong and Weak Nuclear Forces. As will become clear later, my Two-Level Interpretation of the Quantum takes into account two more types of Forces, but of that in its own time. It may be noted that these four Elementary Forces are equivalent to Causal Causes, because they act as the sources of causation.
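As a toy illustration of the equivalence just described (my own sketch, not from the book), the same harmonic oscillator can be written in the force-based Newtonian form and in the Hamiltonian form; under Determinism the two descriptions generate identical motion. All names and units here are illustrative.

```python
import math

m, k, dt, steps = 1.0, 4.0, 1e-4, 10000   # integrate up to t = 1

x, v = 1.0, 0.0   # Newton:   a = F/m = -k*x/m  (Causal Cause: the Force)
q, p = 1.0, 0.0   # Hamilton: dq/dt = p/m, dp/dt = -k*q  (purposive condition H)

for _ in range(steps):
    x, v = x + dt * v, v + dt * (-k * x / m)     # explicit Euler, Newtonian form
    q, p = q + dt * (p / m), p + dt * (-k * q)   # explicit Euler, Hamiltonian form

exact = math.cos(2.0)   # analytic solution cos(omega*t) with omega = 2, t = 1
print(abs(x - q), abs(x - exact))
```

The two trajectories agree (here, with m = 1, step by step), and both track the analytic solution; the difference between the formalisms is one of bookkeeping and convenience, not of the physics they predict.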

It is worth referring here to the concept of Causality (the Law of Cause and Effect). This is, and actually has been so far, a concept treated rather nonchalantly in Science, in Physics. Everyone agrees that it is a very important concept, yet it has actually been given little consideration in Science so far. It is common knowledge that in a physical process the (Causal) Cause must precede in time the Effect it causes. Purposeful Causes condition physical processes, which, however, always end in some kind of Effect. In this context they, too, are either sufficient (Determinism) or merely necessary (Indeterminism) for these Effects.

Until now it was also always assumed that Causes are sufficient for Effects (the opinion of Aristotle, who first proposed this), and this was thought to be so until the early 20th century. But in fact this case refers to Determinism. It is worth realizing that Causality in general divides into two branches: Determinism and Indeterminism. The case of the sufficiency of the Cause for the Effect it causes applies only to Determinism. But what about Indeterminism? How is it to be understood, and when do we have to deal with it? This is my concept, which I promote in my publications, and it should be understood this way: Indeterminism occurs when Causes are only necessary, but insufficient, for the Effects they cause. In other words, when these Causes can, but do not have to, produce their Effects. Such a case, I will argue, occurs in the Quantum: when, at the Measurable Level, Effects take place in physical processes, but only with some probability, or, to put it a little differently, with some uncertainty. I will write more about this in my description of Quantum Mechanics; at this point, however, I must mention it. Why? Because practically all Classical Physics from the Measurable Level is subject to de facto Determinism: all its formulas are clear, transparent, unambiguous and Deterministic. And the whole Quantum, seen from the Measurable Level, is subject to Indeterminism, to uncertainty. Here, for now, I must stop; I will return to this later in the book.
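The distinction drawn above can be caricatured in a few lines of code. This is my illustrative sketch, not the book's: a deterministic process, where the cause is sufficient and the same cause always yields the same effect, versus an indeterministic one, where the cause is necessary but yields its effect only with some probability p.

```python
import random

def deterministic_effect(cause):
    # Determinism: the cause is sufficient; same cause, same effect, every time
    return cause * 2

def indeterministic_effect(cause, p, rng):
    # Indeterminism: the cause is necessary (no cause, no effect),
    # but the effect occurs only with probability p
    return cause * 2 if rng.random() < p else None

rng = random.Random(0)  # seeded only so the demonstration is repeatable
runs = [indeterministic_effect(1, 0.5, rng) for _ in range(1000)]
print(deterministic_effect(3), runs.count(2), runs.count(None))
```

Repeated runs of the deterministic function never vary, while the indeterministic one produces a mixture of effect and no-effect, which is the pattern the book attributes to quantum processes at the Measurable Level.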

From the 18th century the era of Electromagnetism begins in Science. It is with this phenomenon that the concept of the Force Field, introduced by Faraday, is associated. From the very beginning these seemed to be forces similar to gravitational forces, that is, forces acting at a distance, something that could not be explained Scientifically, and it was partly to solve this that the concept of the Force Field was introduced. Maxwell's Classical Theory is based on the concepts of the Electromagnetic Force Field and EM (electromagnetic) waves. Maxwell, who gave the formulas describing all of Classical Electromagnetism, also showed that light is a kind of electromagnetic wave. Maxwell's opinion and theses were instrumental in the fact that from then until the advent of the Quantum in the 20th century, light was considered a wave. The trouble is that waves need a carrier, just as the carrier of sea waves is water, and of acoustic waves, air. But although the carrier of EM (Electromagnetic) waves, the so-called Ether, has been sought ever since (and in fact there are still some who are looking for it today), no one has succeeded in this art. I will describe this issue in detail when I write about the Quantum. Then I will also try to argue that there is no Ether, because light and EM waves are particles, as, by the way, Newton himself maintained: Field Quanta (particles) that propagate through space at the absolute speed „c". The fact is that these particles behave as if they were waves, because their physics manifests the hallmarks of waviness, for example diffraction and interference, so for the people of those years the natural conclusion was that light is a wave. Even today, in the Copenhagen Interpretation of Quantum Mechanics, there is a wave-particle dualism, which assumes that matter, including light and elementary particles, can be considered at one time waves and at another time particles.
I will try to write about this when I present the Quantum part of the book. For now I will only mention that such views (dualism) are associated with a certain misunderstanding, which can easily be explained; the explanation lies almost as if on a platter in the mathematical formalism of the Quantum.

Discoveries within the theory of electromagnetism brought people significant and lasting technological progress and improved everyday living conditions. With the advent of electrical engineering came numerous electromagnetic devices and motors in homes, widespread lighting, power grids and electrical appliances improving the comfort of life. In fact, progress in this field continues to this day. True, electronic and quantum engineering and computer science have since joined it, but these have only enriched a progress whose source lies as early as the 18th century.

The development of physics, theoretical mathematics and mathematical physics since the time of Newton, since his famous Principia, is and was so great that practically all the main mathematical formalisms were established then, and they are still widely used today, even in new physical and scientific theories. The knowledge established since those days is today practically beyond the grasp of even brilliant individuals. This has forced scientists into very narrow specialization within the branches of Science, including physics and mathematics. In the Universities, physics, mathematics, chemistry, biology and virtually all other branches of knowledge had to be broken down into specialized faculties. Universities around the world have grown immeasurably, and the number of scientists, people professionally engaged in Science and teaching, has grown disproportionately as well.

It is worth noting that practically until the emergence of Quantum Mechanics, physical formulas and theories were Deterministic. Even Thermodynamics, established in this first period, was based on deterministic assumptions, and only generalization to the vast number of molecules found in a single mole of gas or matter led to probabilistic rules and laws of thermodynamics. These theories were governed by Determinism as one form of Causality. The definition of Determinism:

∀Cause (Cause → Effect).

At the same time, these were Causal or Purposeful Causes. That is, in these formulas the Causes were already sufficient for the Effects carried out in the physical processes.

The main reason for this was that only Real numbers appeared in these formulas. But, as was soon to become clear, the Real numbers, the Field of Real Numbers, were no longer a sufficient set of numbers to describe all physical phenomena. With the arrival of the 20th century, the Field of Complex Numbers joined the game of describing the physics of phenomena. And this was a significant, qualitative change, one that changed the whole understanding of physics and of the structure of the World.
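One way to see why this was a qualitative change, and not merely a bookkeeping one, is a small sketch of my own using the quantum rule (discussed later in the book) that probabilities are squared magnitudes of complex amplitudes: two contributions carrying opposite phases can cancel, which is impossible if only real, positive weights are added.

```python
import cmath

# two complex amplitudes of equal magnitude for the same outcome
a1 = cmath.exp(1j * 0.0) / 2 ** 0.5        # path 1, phase 0
a2 = cmath.exp(1j * cmath.pi) / 2 ** 0.5   # path 2, opposite phase

p_destructive = abs(a1 + a2) ** 2    # opposed phases cancel: probability ~ 0
p_constructive = abs(a1 + a1) ** 2   # aligned phases add up: probability ~ 2
print(p_destructive, p_constructive)
```

With Real numbers alone, adding two equal positive contributions can only ever increase the result; the phase carried by a Complex amplitude is what makes interference expressible at all.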

b. Twentieth-century physics

As late as the end of the 19th century, Lord Kelvin was firmly convinced that the theory of physics was already practically finished, and that the rest would be taken care of by the Academies, where engineers would be trained to make practical use of the findings of the great theorists of an already completed physics. In his opinion, that is, nothing interesting was going to happen in theoretical physics anymore. He was undoubtedly a great physicist who left behind a considerable body of work, but in this respect he was cardinally wrong, for in retrospect we know that the fun in physics was only just beginning. In the 20th century, at its very beginning, two gigantic branches of Science were created: the Theory of Relativity and Quantum Mechanics. Actually, the Theory of Relativity is two theories, the Special and the General Theory of Relativity. The former is in essence a generalization of Newtonian Kinematics and Dynamics to the case of bodies moving with significant velocity, say v ≥ c/3, above a third of the speed of light. In this theory Einstein, its founder, assumed that all inertial systems are equivalent for physical processes. From this it follows that there is a maximum velocity of bodies (finite, but enormous), the so-called absolute velocity „c", which is the same in all inertial systems. In his thought experiments, conducted while still very young, for which Einstein later became widely known, the creator of the theory came to the conclusion that light moves at exactly this speed. And indeed, experiment confirmed this conclusion. Today we know that all massless particles (massless entities) move at the speed „c". Importantly, according to this, light has the same absolute velocity „c" relative to any system, no matter how that system moves. A great many seemingly paradoxical conclusions follow from the Special Theory of Relativity.
First, time loses its rank as an absolute parameter, that is, a parameter that does not change when passing from system to system. Second, space and time are inextricably linked, forming space-time. When you pass from system to system, the entire space-time is transformed, and this is expressed in the formulas of the Lorentz Transformation. In detail it turns out, according to these formulas, that time undergoes dilation and length undergoes Lorentz contraction: in a moving system, relative to one at rest (a laboratory system), time slows down, and length in the direction of that system's motion shortens. And here, right at the start, Einstein had to deal with the first paradox (which turns out to be a pseudo-paradox). The point is that motion is relative: if system A moves with velocity „v" with respect to system B, then it can equivalently be said that it is system B that moves with velocity „-v" with respect to A. The sign „-" changes nothing here; it only reverses the direction of motion. So in which system, then, does time slow down? This was the main weapon of all opponents of the STR (Special Theory of Relativity), because it was a conclusion that supposedly led to outright absurdity, a straightforward proof that the STR is fundamentally false. And if we now caused one of these moving systems (A or B) to turn around, so that the systems came into contact again, as at the beginning of the motion, in which of them would less time have passed? This is the content of the so-called Twin Paradox (because one can imagine a twin in each of these systems). The solution, in turn, is trivially simple: refer to the dynamics.
Note that one of the systems must change its direction of motion at some point, and then we are no longer dealing with an inertial system but with one subject to acceleration, to which the relativity findings for inertial motion do not apply. It is in this system, the one in which forces occur (so that it can turn around), that the greater time dilation accrues. This follows directly from somewhat tedious mathematical operations based on the formulas of the Lorentz Transformation, which I omit here for convenience; anyone can carry them out themselves on a piece of paper.
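The dilation factor itself is easy to compute. A small sketch of my own (illustrative numbers, units with c = 1): the travelling twin's proper time is the stay-at-home time divided by the Lorentz factor gamma, applied over the legs of the journey.

```python
import math

def gamma(v):
    # Lorentz factor for speed v given as a fraction of c
    return 1.0 / math.sqrt(1.0 - v * v)

v = 0.8            # the traveller cruises at 80% of the speed of light
t_home = 10.0      # years elapsed for the twin who stays at home
t_travel = t_home / gamma(v)
print(gamma(v), t_travel)   # gamma = 5/3, so the traveller ages only 6 years
```

Note that the formula is applied asymmetrically, to the twin who turned around; that asymmetry, as explained above, is what dissolves the apparent paradox.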

Something else is important when considering transformations from system to system, something that physicists have often treated in passing, without special attention. On all transformations from system to system one must still impose the condition that, for time-like separated events, Causality between these events must be preserved, that is, the Law of Cause and Effect must hold. In other words, if in some reference system the temporal order for such events is that the Cause (A) precedes the Effect (B), then under any transformation of systems this order must be preserved. It is only for space-like separated events that the temporal order need not be preserved, and such events cannot be causally connected. It is worth noting here that it is difficult to obtain temporal simultaneity for two different points in space. Hence various strange conclusions about events arise in the STR. It is all unintuitive, but there are no paradoxes in it; so far the STR has not been falsified, and experiment confirms all its unexpected conclusions.
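This condition can be checked numerically. The sketch below is my own illustration (units with c = 1): a Lorentz boost preserves the space-time interval and the time order of time-like separated events, but can reverse the coordinate order of space-like separated ones, which is why only the former can stand in a Cause-Effect relation.

```python
import math

def boost(t, x, v):
    # Lorentz boost of an event (t, x) to a frame moving at speed v, c = 1
    g = 1.0 / math.sqrt(1.0 - v * v)
    return g * (t - v * x), g * (x - v * t)

t_like = (2.0, 1.0)    # event after (0, 0); time-like:  t^2 - x^2 = 3 > 0
s_like = (1.0, 2.0)    # event after (0, 0); space-like: t^2 - x^2 = -3 < 0

tt, tx = boost(*t_like, 0.9)
st, sx = boost(*s_like, 0.9)
print(tt, st)              # tt stays positive; st has become negative
print(tt * tt - tx * tx)   # the interval is still 3, up to round-off
```

The time-like event still happens after the origin event in the boosted frame, while the space-like one now happens before it; since no signal can connect space-like events, no Cause ever ends up after its Effect.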

The first half of the 20th century undoubtedly belonged to Albert Einstein, who first formulated the STR (1905) and then the GTR, the General Theory of Relativity (1915). His contribution to the theory of physics was not limited to these two powerful ideas, for he was also a co-creator of the then-nascent Quantum Mechanics (1900-1925). And although, admittedly, once the theory of QM (Quantum Mechanics) had somewhat solidified, he was unable to accept and come to terms with the blatant Indeterminism (in short: randomness) that flowed from it, as in his famous phrase that „God does not play dice with the world", he nevertheless followed closely, to the end of his life, all the findings of QM, which in time came to dominate Science as the most important physical theory of the 20th century. Before I briefly discuss the GTR and move on to QM, I would like to note at this point that Einstein was both wrong and very much right, which I will justify in detail a little later. For now I will note, enigmatically, that Quantum Mechanics seen from the Measurable Level is indeed Indeterministic, that is, largely probabilistic (random); but seen from the second level, to which I will return later, the Non-Measurable Level, it is not Indeterministic. In essence, then, it is the case that to us, at the Measurable Level, QM appears Indeterministic, while at the deeper, second level, the Non-Measurable Level, there is no randomness, and God, in fact, does not play dice with the world. So Einstein's tremendous intuition is here bowed to and confirmed. What I am writing now may indeed seem puzzling and enigmatic, but I will address it in more detail later in the book and try to explain it. I can only state that almost all theories written from the Measurable Level, i.e. written up to the 20th century, are Deterministic, are written precisely in this spirit. This still applies to the STR and the GTR as well.
They too are thoroughly Deterministic. Only MK and the theories sharing the common denominator of Quantum are written from the Non-Measurable Level; from that level (the Non-Measurable one) they are not Indeterministic, while from the point of view of the Measurable Level they actually are Indeterministic, and this is qualitatively a completely different form of description of the physics of phenomena. It is interesting that only after a century of MK is someone (namely me) making such obvious statements. Before I begin describing this division of the structure of the world into two levels (Measurable and Non-Measurable) and describe Quantum Mechanics, and Quantum in general, let us go back to the General Theory of Relativity (OTW), Einstein's second theory.

The General Theory of Relativity treats all types of reference systems, both inertial ones and those in which forces act, as equivalent in the description of physics. It applies mainly to Cosmology, because it replaces Newton's cosmology, which was thereby, so to speak, falsified. It assumes the equivalence of gravitational (heavy) mass with inertial mass, thus postulating that the acceleration resulting from the action of the gravitational „force” should be considered equivalent to the acceleration resulting from the action of an inertial force. The whole theory (OTW) constitutes a kind of nonlinear geometry with a specific metric, which is formed by the masses of bodies (planets, stars, space objects); these curve space-time locally, so to speak, causing those masses (those space bodies) to move along the geodesic lines they determine. The most important equation in the General Theory of Relativity is Einstein's field equation, which describes the relationship between the curvature of space-time and the distribution of masses and energies in it. Important elements of the OTW are the energy-momentum tensor, the Riemann curvature tensor, the metric tensor, and constants of nature such as the Gravitational constant; these are the main elements of the equation. In fact, the OTW is curvilinear geometry, with the constraint that Einstein's Field Equation imposes on it, and bodies, space objects, must satisfy these conditions. That is, there are Purposeful Causes (Einstein's Field Equation); forces do not exist, so there are no Causal Causes (as far as the force of Gravity is concerned). Purposeful Causes are therefore necessary and sufficient: in a word, pure Determinism. Although this description is seemingly simple, there are infinitely many solutions to this equation. Whether a world given by such an equation will shrink, expand, or remain static is determined by the average density of the world's matter.
According to empirical findings, our World is expanding and will continue to expand until the moment of „heat death”, when, after a tumultuous history, after the phase of formation of stars, planets and then black holes, the masses (black holes) will evaporate into the form of photon radiation. A full Eon of the world's existence will then come to an end, which will be the origin of a new world (a new Eon); that is, according to this theory, the world is eternal and passes through successive Eons. One such full Eon is estimated to last about 10^13 years. This is the version of Cosmology behind Sir Roger Penrose's theory, which, it is true, not everyone agrees with, but as of today it seems to be the correct one. In that case, however, there must be some remnants in the Cosmos from past Eons, and this is already open to falsification. For now, a strenuous search is underway for these remnants, which will confirm or refute Roger Penrose's theory. For the sake of sobriety, it is worth noting that the present Universe (the current Eon) is about 13.8 billion years old, on the order of 10^10 years (officially), although recent findings suggest that this age may be as much as twice that. There is still, however, a radically long time until the end of the Eon. Moreover, the Solar System in which we find ourselves, and the Earth, have not yet reached even an estimated half of their potential existence.

Albert Einstein is considered if not the most outstanding, then certainly one of the most outstanding physicists not only of modern times but in general. Certainly no one else (perhaps only Isaac Newton can match him in this) has created such coherent and great systems in physics. His greatness would have been perfect and unquestionable were it not for one small detail. Although he was, after all, one of the pioneers of Quantum Mechanics, he did not leave there the mark of genius that he could have, as he did in his flagship Theories of Relativity. This, by the way, aroused frustration and embitterment in him almost to the end of his life, reflected in the fact that he could not come to terms with this system (MK). In fact, he could not come to terms with the interpretations of Quantum Mechanics popular and accepted at the time, mainly the Copenhagen Interpretation, and more strictly still with the Indeterminism that overtly followed from MK. Toward the end of his life (until his death), he worked to unify his Theory of Relativity and Quantum Mechanics into a single Deterministic system. Unfortunately, these efforts proved futile. I am not some outstanding theoretician or physicist, but in my opinion this was because, both then and in modern Science, Quantum Mechanics (MK) and all its accepted interpretations are one-level. And this is, among other things, the reason that while Classical Physics, up to the beginning of the 20th century, was written mainly in the manner of Determinism, because it was written from the Measurable Level, Quantum Mechanics is already written mainly from the Non-Measurable Level. I will explain how I understand this in a moment. Now let us return to the history of physics.

Quantum Mechanics, or rather its framework, was created at the turn of the century, around 1900. The first was Planck. In order to explain the spectrum of blackbody radiation, he proposed the novel solution that the energy emitted by the Black Body in the form of radiation is quantized into portions, today we would say into quanta of energy. This was, in his opinion at the time, a purely mathematical operation, not a formal change to or undermining of the existing Paradigm of Science. It was supposed to save the then-current formulae for the radiation spectrum of the Perfect Black Body, which struggled with the so-called ultraviolet catastrophe: the emitted energies diverged to infinity as the radiation frequency increased. Empiricism did not confirm this, and only such a maneuver (quantizing the energy, i.e. dividing it into separate portions) saved the whole thing. Interestingly, as the anecdote goes, earlier, when he was about to get serious about Science, he was advised against studying physics, on the claim that nothing new and interesting could be discovered in physics anymore and that he should rather take up some other, more promising branch of Science. In fact, as we know, he became an absolute pioneer of a new branch of physics, probably the most important physical system (Quantum Mechanics and Quantum in general), which changed not only the Paradigms of Science but the entire modern world. The fact that today we use computers and electronics, and that AI was born and is becoming ever more common, has its roots in this „mathematical” trick of Planck, in the quantization of energy radiated in the form of light. Very soon after, in 1905, Einstein reasoned that light, and EM radiation in general, should be treated as corpuscles, particles that move at the absolute speed „c” (very large, but nonetheless finite).
That is, Einstein de facto clarified that this „mathematical” trick of Planck has a fully physical basis, because the energy emitted in the form of light (EM radiation, waves) always comes in the form of particles, quanta of the Field (Energy), which have henceforth been called photons. The dilemma of light (waves or particles?) is known as corpuscular-wave dualism in the prevailing modern interpretations of Quantum. I will return to it later in the book.

It turns out that the radiation quantum, the quantum of radiation energy, equals E = hf, where „h” is Planck's constant and „f” is the frequency of the EM wave. Quantization of energy does not mean that energy (E) is discontinuous or has holes in it. It only says that for a given frequency of an EM wave this energy is emitted or absorbed in portions, quanta, and for a given frequency such a quantum of energy equals E = hf, while the frequency „f” of the EM wave itself remains fully continuous. It should also be kept in mind that such a quantum of energy is the smallest portion of energy for a given frequency „f”, that is, it is the energy of one photon of the given „f”. The frequency of photons is not arbitrary, however, and is determined by the physical process under consideration. In this sense it is discrete, and the energy, the quantum of energy, for such a process is determined and likewise discrete; the energy is then a multiple of a specific quantum. But formally, across different physical processes and different elements, the frequency of a photon has no mathematical limits, and so the quantum of energy for one process can correspond to fa = 2.42·10^15 Hz and for another to fb = 6.4·10^16 Hz.
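The arithmetic of E = hf can be checked directly. A small sketch using the two example frequencies from the text (the conversion to electronvolts is added here purely for illustration):

```python
# Sketch: photon energy E = h*f for the two example frequencies in the text.
h = 6.62607015e-34  # Planck's constant in J*s (exact value in SI since 2019)
eV = 1.602176634e-19  # one electronvolt in joules

for name, f in [("fa", 2.42e15), ("fb", 6.4e16)]:
    E = h * f  # energy of a single photon of frequency f, in joules
    print(f"{name} = {f:.3g} Hz  ->  E = {E:.3g} J  ~  {E / eV:.3g} eV")
```

Running this shows that a single photon carries a tiny amount of energy in everyday units (on the order of 10^-18 J for the first frequency), which is why the discreteness of light goes unnoticed at the macroscopic scale.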

Physicists were also very quickly able to explain on this basis the atomic structure of hydrogen (H), the conditions an electron in a hydrogen atom must satisfy, and the values of energy absorbed or emitted when an electron jumps from one orbit to another, because Quantum turned out to describe the world at the atomic level perfectly. The first co-creators and discoverers of Quantum were Planck, Einstein, Bohr and others. This was that first, one might say very naive, period of Quantum Mechanics. It was followed by a period of forming and developing the mathematical formalisms of Quantum Mechanics, which lasted almost two decades. Each year brought astonishing discoveries about Quantum. The final stage was the blossoming of Quantum Mechanics into the definitive and final mathematical formalism that we use practically to this day.

So the final formalism of Quantum Mechanics was defined between 1925 and 1926. The first was the matrix formalism presented in 1925 by Heisenberg, now called the Heisenberg picture; next in order was the Schrödinger formalism of 1926, called the Schrödinger picture, which is the more popular. There are other pictures as well, including the Dirac picture. These are all equivalent formalisms: with the right unitary transformation it is easy to switch from one to another. The Heisenberg formalism differs from the others in that in it the matrices of the Observables (operators) change over time, governed by a slightly different equation, called the Heisenberg equation, while in the Schrödinger formalism the State Function of the system changes over time, governed by the Schrödinger wave equation familiar to most physicists. This certainly has some metaphysical overtones, which I will not write about now, but let us remember that Classical Mechanics can also use equivalent formalisms, such as Hamilton's or Newton's.
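The equivalence of the pictures under a unitary transformation can be illustrated with a toy two-level system (a hypothetical example, not taken from the book): transforming both the state and the observable with the same unitary U leaves every expectation value, that is, everything measurable, unchanged.

```python
# Sketch: a unitary change of description preserves all expectation values,
# which is the sense in which the different quantum pictures are equivalent.
import math

def matmul(A, B):  # product of two 2x2 complex matrices
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

def dagger(A):  # conjugate transpose
    return [[A[j][i].conjugate() for j in range(2)] for i in range(2)]

def apply(A, v):  # matrix acting on a 2-component vector
    return [sum(A[i][k] * v[k] for k in range(2)) for i in range(2)]

def expectation(v, A):  # <v|A|v>
    Av = apply(A, v)
    return sum(v[i].conjugate() * Av[i] for i in range(2))

s = 1 / math.sqrt(2)
U = [[s, s], [s, -s]]               # a Hadamard-like unitary: dagger(U) @ U = I
Z = [[1, 0], [0, -1]]               # an example observable (Pauli Z)
psi = [complex(0.6), complex(0.8)]  # a normalized example state

before = expectation(psi, Z)
after = expectation(apply(U, psi), matmul(matmul(U, Z), dagger(U)))
print(before, after)  # the two values agree
```

The particular matrices are arbitrary illustrative choices; the point is only that the transformed state and transformed observable reproduce the same measurable number.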

In the following years of the 20th century a number of other theories were developed, but all were based on the idea of Quantum. They arose directly from the ground of Quantum Mechanics: Quantum Field Theory (QFT), and from it Quantum Electrodynamics (QED), Quantum Chromodynamics (QCD) and others. Prominent contributors to these theories include Feynman, Dirac, Pauli, Wigner, and others. This book does not claim to be a physics textbook, which is why I describe the history of 20th-century physics so briefly. It is worth recalling, however, that back in the early years of the 20th century the German scientist Emmy Noether managed to create and develop the foundations of an important branch of mathematical physics, without which it is difficult to imagine modern physics today, and which can be encapsulated in the words Symmetries and Conservation Principles. The formalisms of Quantum plus the applied laws of symmetry form the so-called Standard Model (MS), which is the reigning Paradigm of modern physics.

In the final years of the 20th century, additional physical theories were developed which were intended to Unify all known forces, in particular gravity with the forces already unified by the MS (Standard Model). String Theory was created, and later its generalization, M-Theory. All of them, however, are based on the concept of Quantum; all of them are fundamentally different from Classical physics; they all represent a new quality. Very importantly, practically none of them is subject to falsification. They practically cannot be verified empirically, because the energies physicists would need at their disposal to test them are gigantic, beyond the reach not only of all research centers but of humanity in general. Thus it has come to pass that so-called Physicalism, the current in physics in which physicists, Scientists, recognize as scientific facts only those concepts, objects and theories that can be empirically experienced, became seriously overextended in the late 20th century.

2. Quantum Mechanics vs. Classical Mechanics

The twentieth century brought, from the perspective of today, a real breakthrough. Behold, Classical Mechanics, and classical physics in general, was written and formulated in the language of Determinism. That is, in other words, the Causes that initiate physical processes are necessary and sufficient for the Effects that are realized. It can be written like this:


∀Cause (Cause → Effect).

And as we have already explained, Causes here can be Causal or Purposeful (Intentional).

And Quantum Mechanics, and Quantum in general, is already written from a different standpoint; it is not written in the spirit of Determinism. In what spirit, then, you ask? This will become strictly apparent later in the book. For now I can state rather enigmatically that from our sense perception (the Measurable Level) it is fully Indeterministic.

It can be written like this:


∀Cause (Effect → Cause).

In other words, Causes (Causal or Purposeful) are indeed necessary, but they are insufficient. This should be understood to mean that even knowing the Causes from our level of observation, we cannot be sure of the Effects that will occur in a given physical process. However, if an Effect has arisen, it is sufficient and unambiguously determines the Cause that produced it.

So the great unknown?

Yes and no. I will write about this, but for now it is important to emphasize that this is so from the level of our observation, from the sensory level (the Measurable Level). It turns out, as I will explain and clarify later, that there are two levels of the World. The first is the Measurable Level, the level of hard matter, the level of the senses; the second is the Non-Measurable Level. And this second level, the Non-Measurable Level, is considered only in Quantum, that is, Quantum Mechanics and its further offshoots. The first level is the one that has always been considered, from Aristotle to Einstein, in classical physics, while the second, the Non-Measurable Level, is inaccessible to our senses, our observations, our experiences. It is unregistered by our ordinary five senses and by physical experiments and measuring apparatus, which are, after all, extensions of our ordinary senses. And, most importantly, the processes that happen at this level, although they are unmeasurable, beyond observation, manifest themselves finally at the Measurable Level in the form of concrete and real (observed) Effects. So how does one know that such a level, this Non-Measurable Level, exists, indeed must exist?

So let us first look at the mathematics (the formalisms) used in classical physics and in Quantum. It is commonly taught at a general level that we have several types of numbers: Natural numbers, Integers, Rational numbers and Irrational numbers. Together they constitute the entire set of Real numbers. It turns out, however, that this is not entirely precise and exhaustive. Although the numbers I will now write about were already partly known in antiquity, or at least their existence was suspected, it was not until the 16th century that Italian mathematicians were able to convince themselves of their reality, when they searched for general solutions of third-degree algebraic equations. So there is another type of number, which in my publications I intuitively called non-real, and which the Italian Renaissance mathematician Bombelli, in his 1572 work L'Algebra, treated as non-real. That is, my intuition on this point did not deceive me. Later, however, Descartes gave them, in my opinion, the misleading name of imaginary numbers, a name that has become widely accepted. What are these numbers? Unlike Real numbers, they are numbers whose square is negative. This is their distinguishing feature. That is why I quite intuitively called them non-real numbers, because the square of a Real number is always greater than or equal to zero. You will ask: and what does this give us? It gives us this: the set of numbers containing both Real numbers and non-real numbers algebraically closes the mathematical systems that contain them. That is, all algebraic (mathematical) equations and problems have solutions in this set; the solutions do not escape into some larger, more general set of numbers. And, importantly, this is the smallest such numerical set.
This would perhaps be just a mathematical curiosity, but I have arrived, by way of deliberation, if only over the problems I describe in my publications, at the conclusion that the Pythagorean Principle is correct and valid: all true mathematical concepts and quantities have their reflection in phenomena, processes and physical concepts. This is not an axiom seen intuitively by the path of inspiration; rather, by a circuitous route, after painstaking deliberation, as it were from end to beginning, I came to this conclusion. Thus, from the point of view of physics, I can now say, though with a certain advance that I will justify later, that the numerical set containing both Real numbers and non-real numbers constitutes the basic and underlying numerical spectrum for considering „all” physical processes. That smallest set is the set of Complex numbers.

z = a + ib, where a, b belong to the set of Real numbers and i² = -1; z belongs to the set of Complex numbers.
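For readers who want to experiment, Python's built-in complex type implements exactly this arithmetic. A minimal sketch (the particular values are arbitrary):

```python
# Sketch: z = a + ib with i*i = -1, using Python's built-in complex numbers.
import cmath

i = complex(0, 1)
print(i * i)  # the square of the imaginary unit is negative: (-1+0j)

z = 3 + 4j    # a = 3, b = 4
print(abs(z))  # the modulus sqrt(a^2 + b^2) = 5.0

# Algebraic closure in action: x^2 + 1 = 0 has no real solution,
# but the square root of -1 exists among the complex numbers.
print(cmath.sqrt(-1))
```

This is the closure property described above: an equation with no real solution does not force us out of the complex numbers.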

Now I will state, rather vaguely for the moment, something that will be important later: complex numbers are non-measurable. So all concepts, quantities and values represented by these numbers cannot be measured in reality. Do they therefore not exist? This was one of several premises that led me to the conclusion that the structure of our World cannot be one-level. There must be deeper structures; let us call them the generalized Non-Measurable Level. So the answer to this question is: yes, they exist, but not on the ordinary Measurable Level, on the Non-Measurable Level.

It turns out that it was only in the 20th century that Science, and more precisely Quantum Mechanics, became the first branch of Physics to reach this second level of the World, the Non-Measurable Level. So far, everything seems clear and logical (logically consistent). Why, then, does Science, physics, not register this, and why has it not allowed it from the very beginnings of Quantum? The omnipresent Determinism, Physicalism and Atheism prevailing in modern Science reject, and have rejected, any idea that there might be some level not accessible to our senses. Such ideas are immediately labeled paranoia. This is still the aftermath of the fascination with the Rationalism of the Enlightenment era, the power of the mind and man's independence from Nature. But, come to think of it, if there is indeed a generalized second level, called by me the Non-Measurable Level, does it not associate unequivocally, Reader, with Spirituality, a deeper level of existence? Let us leave this aspect of Spirituality for now and focus on the scientific implications and effects of allowing the concept of a second level of reality, which arises from the mathematical formalisms used in Quantum (Quantum Mechanics). Let us look at the basic wave equation of Quantum Mechanics (the Schrödinger picture).

iħ ∂ψ/∂t = Hψ

Here we have the partial derivative with respect to time of the state function of the system (ψ) and the Hamiltonian of the system (H). We also have, very importantly, the imaginary unit (i) in this equation. So must we not objectively conclude that this equation and its solutions, which also have a complex dimension, are not from the Measurable Level commonly available to us? Someone will say: this is, after all, just an equation; how many formulas do we have in physics that contain complex numbers, and they do not define any second level; it is all a matter of interpretation. One could shallow the matter in this way and go into denial (in accordance with Physicalism), but the solutions to this equation are complex wave functions, which are themselves represented by complex numbers. That is, they cannot be measurable at our Measurable Level. And this equation and these functions, its solutions, describe the physics of a fully real physical system. The only logically consistent interpretation of this fact seems to be that there must be a second level, the Non-Measurable Level, and that this equation and its solutions represent physics from the point of view of that level.
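The point that the solutions are complex while everything measurable is real can be made concrete with a toy system. A sketch under assumed values (a hypothetical two-level system with a diagonal Hamiltonian, units with ħ = 1; none of the numbers is from the book):

```python
# Sketch: solutions of the Schrodinger equation are complex amplitudes,
# yet the measurable probabilities |psi|^2 are real (and here sum to 1).
import cmath

E1, E2 = 1.0, 2.5  # assumed energy eigenvalues of a diagonal Hamiltonian
c1, c2 = 0.6, 0.8  # initial amplitudes, normalized: 0.36 + 0.64 = 1

def psi(t):
    """Exact solution: each component only rotates a complex phase in time."""
    return (c1 * cmath.exp(-1j * E1 * t), c2 * cmath.exp(-1j * E2 * t))

for t in (0.0, 0.7, 3.1):
    a, b = psi(t)
    total = abs(a) ** 2 + abs(b) ** 2
    print(f"t={t}: amplitudes {a:.3f}, {b:.3f}; total probability = {total:.6f}")
```

The amplitudes themselves acquire nonzero imaginary parts as time passes, yet the total probability stays exactly 1 at every instant: the complex description lives, in the author's terms, at the Non-Measurable Level, while only real numbers reach the Measurable one.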

However, let us take a look at how modern Science (the Science that accepts as scientific fact only what can be measured, i.e. Physicalism; the atheistic Science, thoroughly attached to the mechanistic, Deterministic vision of Nature) interprets the phenomena of Quantumness. In general, as for the mathematical formalisms of Quantum, no one disputes them. They are reasonably simple and transparent, and the various technologies of modern times have developed on their basis. Today, after nearly a century, transistors, lasers and complex electronics have been invented on the basis of the mathematical formalisms of Quantumness, as have processors, computers and everything related to modern technology and its practical application. The problems begin with the metaphysical conclusions. Here, to this day, Science has had its problems. Science does not want, at all costs, to allow the possibility of the existence of the other level, the Non-Measurable one. Hence there are only single-level Interpretations of Quantum, all of which but one (about which in a moment) lead to logical inconsistencies and contradictions, at best to unexplained paradoxes or violations of the laws of logic. So, in fact, all of them can and must be rejected, according to a generalized proof by contradiction: if something leads to an absurdity, it is false (reductio ad absurdum). This does not mean, however, that Quantum itself is false. It only means that all interpretations of this Quantum, except for one, Everett's, are certainly false, because they lead to outright absurdities. But Everett's Interpretation is also, in my opinion, false, just from a different point of view. However, because it is not falsified by proof by contradiction (reductio ad absurdum), most modern atheistic physicists have recognized it as the one correct interpretation. Why is it not falsified? Because de facto it is not one-level but infinitely-many-level.
It assumes the absurd vision of the world budding into infinitely many disconnected worlds. Why, then, do I believe that it too is false? Because there is another, simple and non-contradictory solution. This, plus Ockham's razor, allows me to effectively reject this last resort of atheists of all sorts. The interpretation that alone does not lead to absurdities and contradictions is the Two Levels Interpretation of Quantum. It is not an interpretation that divides physics into macro and micro worlds. It is an interpretation that assumes the existence of two levels of the world: the Measurable Level, of Hard Matter, and the second level, the Non-Measurable Level, the one that does not yield to the senses, to measurement. I will write about this in detail in a moment. But now I will turn to the contradictions and absurdities that all single-level interpretations carry, that is, those that do not recognize processes and phenomena that do not yield to measurement and observation. It is interesting to note that at the dawn of Quantum it would not even have occurred to theoretical physicists of the stature of Dirac, Schrödinger, Heisenberg or even Einstein that tangible reality might not be the one and only acceptable reality. This was, and still is, the aftermath of the Copernican Revolution, which knocked God and the Spiritual off the pedestal of Science. It was the legacy of the Enlightenment, the tradition of trust in the idea that man is the measure of all things and that only what he experiences exists. The Enlightenment era, with its victory of Determinism and Mechanism, seemed to affirm this human self-reliance and self-sufficiency. But we today, with the benefit of hindsight, can confidently state that this was only an illusion.
It is already difficult enough for scientists to accept that Quantumness implies the triumph of Indeterminism over Determinism (from the point of view of the Measurable Level); for this reason physics seems illogical and incomprehensible to them. Unfortunately, all the single-level and atheistic interpretations, like the Copenhagen Interpretation and its many clones, only add to this confusion. So what is the trouble with all one-level interpretations, whatever they may be?

First of all, it is that all these one-level interpretations try (there is really no other way for them) to fit all these concepts and ideas of Quantum, these complex ideas, together with concepts and quantities of a Real nature, onto the same single level, the Level of observable reality, because for them there is no other. And this leads straight to paradoxes, contradictions and violations of the elementary laws of logic. Hence the basic problem in these interpretations (e.g. the Copenhagen Interpretation of Quantum) is, among other things, the Problem of Measurement: how to pass seamlessly from a complex function representing the state of the system to the real value of a measured parameter? Hence were born such monstrosities of ideas as the Reduction of the State Function (Quantum Collapse), and how many scientific contrivances have been created to justify the reality of such an approach.

In general, how many studies by natural philosophers have been written, and probably will yet be written, to explain in some meaningful way the subtle differences between the various (one-level) interpretations of Quantum. Sometimes the matter comes down to one word, one concept, one reference, and there is so much smoke that no one really understands it. I must honestly admit that most of the interpretations of Quantum are incomprehensible to me because of their over-stretched argumentation: the Copenhagen Interpretation, the Relational Interpretation, the Transactional Interpretation, the Consistent Histories Interpretation, Riegl's Interpretation and a multitude of others. God forbid I question the competence of the authors, the physicists who proposed these interpretations, but I must say that they are terribly vague and only deter amateurs from trying to learn about Quantum at all. And it all boils down to this confusion, on one (the accessible) level, of complex concepts and quantities with measurable ones. As the book proceeds, we will look at all these problems of single-level interpretations more closely.

Now I will just note that Complex Numbers, or more generally the Field of Complex Numbers, were discovered much earlier than Quantum Mechanics, at least several centuries earlier. That their proper functioning is related to the Non-Measurable Level, as it should be according to the Pythagorean Principle, of which I am a fervent advocate, was not realized at the time of their discovery, and in fact there is quite a bit of trouble about it to this day. By whom was it not realized? By the „great ones”, those who created Science and its Paradigms. All this has caused impossible confusion. Professional mathematicians „suffer” from a strange affliction which manifests itself in the fact that they often forget that mathematics is the language of physics, and that it is through mathematics that we learn how the world (physics) works. Mathematicians would like their domain to be some abstract, detached branch of Science. However, it has been known since antiquity that mathematics always has concrete applications, and, as Rutherford said, „only physics counts, the rest is philately”. Indeed, everything real comes down to physics. Originally, after all, the basics of arithmetic were developed as far back as Sumer simply so that the rams in a herd could be counted. Mathematical physicists, or physical mathematicians, are firmly convinced of the existence of a Platonic world, a place where mathematical ideals ontologically exist. And so the idea of a circle, although in practice we will never realize it, we will never draw a perfect circle even with a compass, exists, this idea, in the world of Platonic Ideas. Therefore many would be inclined to hold that complex numbers function in such a Platonic world, and in this context they would probably concede that the Non-Measurable Level may exist in the Platonic world.
I, for one, believe that one should go a step further and follow the path of the Pythagorean Principle, which assumes that every true mathematical fact has its counterpart in the real world, in physics. That is why I think all single-level interpretations of Quantum are inaccurate, to say the least, and that there are actually two levels: the Measurable Level and the generalized Non-Measurable Level. And I believe that this Non-Measurable Level really exists, although it does not yield to observation, as I will argue later in this book.

As I wrote, all single-level interpretations of Quantumness generate logical problems and paradoxes of their own. A paradox is something accepted as true although it contradicts logic. But, as is well known, anything that leads to a contradiction of logic is the basis of a proof by contradiction. So paradoxes may be accepted in the humanities, but not in physics. Even the so-called paradoxes of STW turn out, on deeper consideration, to be only pseudo-paradoxes and are explained correctly by classical logic. However, if popular interpretations of Quantumness lead straight to paradoxes, this is proof of their incorrectness. I know of only two interpretations that are free of these logical flaws: Everett's Interpretation and the Two Levels Interpretation, the one I present in this book. What should be emphasized, however, is that all of them, the correct and the false interpretations alike, are based on the same mathematical formalism. This further obscures the whole problem, because even when we interpret Quantumness falsely, it is difficult to falsify the interpretation. There are, however, always a few elements that point the right way. That additional element is the correct form of the Law of Causality, the Law of Cause and Effect. That is, one can say that the ontological sense of the complex formalism, the correct form of Causality, the Pythagorean Principle, and reliance on the laws of Classical Logic together fully confirm the truth and validity of the Two Levels Interpretation of Quantum.

3. The essential theory of esoPhysics

You have read a free fragment.
Buy the book to read to the end.