“We wish to make statements about ‘trajectories’ of observers. However, for us a trajectory is constantly branching (transforming from state to superposition) with each successive measurement”

Hugh Everett III, 1957

The Far-Reaching Consequences of a Ph.D. Dissertation

When treating the ideas of Relativity, we saw that the Relativity Principle implies infinitely many 3D spaces associated with each instant of time, which in turn implies that the world is at least 4-dimensional. This explained why Michelson and Morley's experiments failed to detect absolute motion with respect to an *Ether*, supposed to be the absolutely steady spatial reference system. This idea of an infinity of 3-dimensional spaces associated with a single instant of time must be kept in mind when thinking about the organization of anything into systems and subsystems. Also, the classic Principle of Superposition, treated elsewhere on this website, is still presented today in elementary textbooks of Physics and Electrotechnics in high schools and in university courses for future Electronics and Telecommunications engineers. What only a few, mainly physicists and mathematicians, know is that the classic Principle underwent a radical revision after 1957. What changed the course of history was a decision taken during a meeting (circa 1955) attended by some of the top American physicists. The key point: the necessity to bridge the theories of General Relativity and Quantum Mechanics (QM). The dream of the participants in the meeting: to reach a recipe to quantize gravitation, thus obtaining a unique theory encompassing all physical phenomena at every scale, from the subatomic to the cosmological. What is going to be recounted happened when the majority of American physicists were still doubting some basic ideas of the so-called *Copenhagen Interpretation of Quantum Mechanics*. A graduate student particularly brilliant in mathematics at Princeton University, Hugh Everett III, received from his mentor, Prof. John Archibald Wheeler, this task of the maximum level of difficulty.
A research activity that none of the senior professors had been able to accomplish was passed to a graduate student. And that youngster encountered, deeply buried in the successful original formulation of Quantum Mechanics, much more than that. He started from a basic assumption of QM, so basic that it had been overlooked during the previous three decades by the fathers of QM: its formulas were conceived for an *insulated* or *closed physical system*, not for the non-insulated, open systems of our daily life, systems constantly exchanging photons and gravitons with the environment on a cosmological scale. The Universe is the only system truly respecting that basic assumption: by definition, the domain including all others and not included in any other. Since its early start around 1928, QM was conceived to have full validity only when applied to a system like that. This implies that the quantum-mechanical computations made until well beyond 1957 were approximations: calculating what eigenvalue to expect for a certain eigenfunction of a physical system supposed insulated from any other when, in reality, it is not.

**Hugh Everett III, physicist and mathematician**

**Albert Einstein, Hideki Yukawa and John Archibald Wheeler at Princeton's Institute for Advanced Study, 1953. Three years later, Wheeler passed to Everett the task of finding a bridge between General Relativity and Quantum Mechanics, allowing their unification (H. Schrader/Princeton University/1953)**

“Measurement is Superposition”

Bernhard Riemann, “On the hypotheses which lie at the foundations of geometry”

One of the first to understand the nature of this approximation was Erwin Schroedinger. Everett attacked the problem by considering the entire Universe as the *superposition of all superpositions* (or grand total of all possible sums), thus discovering the *wave function of the Universe*, based on new meanings for the:

- Principle of Superposition;
- Theory of Measurement.

The book above, in the *Princeton Series in Physics*, 1973 edition, includes Everett's 164-page dissertation (Princeton University Press, Princeton, NJ, USA). The book is today available only through Physics Department libraries worldwide. The comprehensive dissertation, occupying over one-half of the academic text, can be downloaded above. Everett was the first to fully recognise the centrality of the Principle of Superposition for Physics and its technological applications. This work was intentionally kept in university files, away from the limelight, mainly because it efficiently countered Quantum Mechanics' standard interpretation, then centred around Niels Bohr's institute at Copenhagen, Denmark. The dissertation was kept hidden until the 1970s [Note: in 1986 the author of these notes had to ask the twelve main universities of his own country for a copy of the book, later finding just one, at the Physics Department of Padua University, Padua, Italy], when it started to be dug up in a desperate attempt to reconcile General Relativity and Quantum Mechanics.

Following Everett, we have just to interpret what Quantum Mechanics really tells us, without trying to change its rules to suit our own prejudices. The first step lies in admitting all of the implications of the Principle of Superposition: it implies the existence of macroscopic superpositions as a reality, and not just as useful temporary mathematical fictions. All liquid waves, as titanic as the ocean waves used by surfers (see video below) or as small as those in the sink of your home, are examples of the reality of macroscopic superpositions. What follows refers to all scales, not only the subatomic. Quantum Mechanics describes an ensemble of many coexisting *branches* of the Multiverse, branches whose difference may reduce to 1 bit. Therefore, the discovery of Quantum Mechanics one century ago started a scientific revolution in which a colossal Multiverse replaced an already huge Universe.

**Niels Bohr in 1960**

Atomic Physics

The Rationale of a Revolution

In the text above we hinted at the dissatisfaction of the American physicists around 1955 with respect to some aspects of the Copenhagen interpretation of QM. In the following, we'll explain the reasons for that dissatisfaction. It is known that Erwin Schroedinger, who personally discovered the wave function *Ψ*, observed that his formula showed a multitude of separate, distinct states, namely a multitude of Universes. Schroedinger considered this evidence not a key point and did not deepen its implications. Maybe he did the correct thing: when he observed that a Multiverse was what his formula was describing, the instrumentation (coincidence counters and single-photon detectors) did not yet exist. An announcement of such a discovery would surely have been disproved by the experimental tests of the time. The image at side represents by a 3D diagram the location of the electron in a hydrogen atom.

**3D diagram of the location of the electron in a hydrogen atom**

The electron there is a point particle; in the picture it is in the n = 14 energy level and has 7 units of angular momentum. The quantity plotted is the probability of finding the electron, should you measure at that location. Note the many island-like shapes surrounded by zero probability, areas between “orbitals” through which the electron can pass but where it cannot lie in any stable state. The axes are labeled in units of the Bohr radius: 0.53 Å.

Focus on The Probability

Something which, because of its scale, is unfortunately difficult to discern in the figure, is that as the radius, the distance from the nuclear proton, increases, the probability of encountering the electron never becomes zero; it simply decreases. In a classical perspective, this could be interpreted as the electron being simultaneously everywhere. The probability is non-zero even for a radius greater than the known dimension of the part of the Multiverse with which our Galaxy has had enough time to enter into causal relation of state, < 18 billion light years. A subtle point whose relevance cannot be overstated.
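The never-vanishing probability can be checked numerically. A minimal sketch, using for simplicity the hydrogen 1s ground state (not the n = 14, l = 7 state of the figure, whose form is more involved): the radial probability density peaks at the Bohr radius and decays exponentially, but strictly never reaches zero.

```python
import numpy as np

# Sketch, assuming the simplest hydrogen state (1s ground state):
# psi(r) = (1 / sqrt(pi * a0^3)) * exp(-r / a0), a0 = Bohr radius.
# The radial probability density P(r) = 4*pi*r^2 * |psi|^2 decreases
# exponentially at large r but never becomes exactly zero.

a0 = 0.53  # Bohr radius in angstroms

def radial_probability(r):
    """P(r) dr = probability of finding the electron between r and r + dr."""
    psi_sq = np.exp(-2.0 * r / a0) / (np.pi * a0**3)
    return 4.0 * np.pi * r**2 * psi_sq

r = np.linspace(0.01, 50.0, 5000)   # radii out to ~100 Bohr radii
P = radial_probability(r)

print("P(r) is maximal at r =", r[np.argmax(P)], "angstrom (the Bohr radius)")
print("P(50 angstrom) =", P[-1], "-> tiny, but strictly > 0")
```

Even 50 Å from the proton, almost a hundred Bohr radii away, the computed probability density is a minuscule but strictly positive number.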

**Progressive time-related spreading of the wavefront coming out of an Event P at the origin of the bicone. The wavefront carries the information commonly named signal. At right side is shown the 3D description of the future cone as an expanding succession of concentric spheres originating at P. Only the points in the past light cone contribute to what is later the physical Event at P. The points of 3D space outside the bicone centred at the Event P host physical Events causally disconnected from P (abridged by R. Penrose/2010)**

The key question becomes then: *how can an electron be simultaneously everywhere?* The answer to this question of 1926, given by Hugh Everett III between 1956 and 1957, is:

* “...the electron is not simultaneously everywhere!”*

The wave function Ψ (read: *“psi”*), of which this image is simply a plot, a plot confirmed by billions of experiments made everywhere during the past century, already had all the necessary information in it.

Ψ (read “psi”):

- could not represent a single position for that electron, the single electron being encountered in a (non-infinite, however enormous) multitude of positions. A non-contradictory because non-coexisting multitude: each position in a definite and separate branch of the Multiverse;
- represented the measures, the weights of the individual positions, with respect to the 100% probability of encountering the electron.

The key point is that we cannot have prior knowledge of the future position of that electron: we don't know now where it'll be later, because that future position is outside the future light-cone, outside the spacetime related to us here and now (further details here).

A Branched Reality

Events and Eventualities

In the following we'll adopt Paul Dirac's *bra-ket* notation of 1939, applied to John von Neumann's QM perspective. It is a common habit to refer to *Events* when meaning *happened facts*. A broader idea is that of *eventuality*, where the *Event* may or may not happen. An example of *eventualities* below, in a common roulette, a “Montecarlo randomizer device” well emulating nearly all kinds of human and machine measurements: 38 numbered slots, but also the possibility that the ball disappears, and other baffling cases observed or imagined.

Interestingly, all of the possible position states of the ball spinning around the roulette wheel superimpose to produce an interference pattern. Before observing (or measuring) the outcome, we have to consider the roulette system as being in a *superposition of all its possible states*, something players intuitively know to be true when waiting for the stabilisation into a single value of the previously superimposed (summed) outcomes. The grand total of all the *eventualities* associated with a physical system is the system's *state space*.

Observables

Another fundamental idea of Physics is that of the *observable*. [In the following, the symbol “ ⊕ ” identifies a vectorial sum. When referred to bi-dimensional or higher-dimensional vectors, it identifies tensorial sums]. The notion of *observable* is used to describe a set *{e}* of *eventualities*, subject to the conditions:

- *mutual exclusivity*: a pair of *eventualities* *e1* and *e2* are *mutually exclusive* if they satisfy the condition of incompatibility *e1* ∩ *e2* = ∅, this last a necessary condition for the Events' observability;
- if *e1* and *e2* are the subspaces of the Hilbert space representing a couple of admissible *eventualities*, their set intersection *e1* ∩ *e2* will also be a Hilbert subspace. It represents the conjoint *eventuality*;
- the set union *e1* ∪ *e2* will in general not have the structure of a Hilbert subspace, and then does not represent an *eventuality*;
- the eventualities' additive structure is an effect of the Hilbert space structure. The sum of eventualities *e1* ⊕ *e2* is the Hilbert subspace spanned by the separate Hilbert subspaces *e1* and *e2*;
- the state *|Ψ⟩* ∈ *e1* ⊕ *e2* if *|Ψ⟩* is a Hilbert space vector whose form is:

* |Ψ⟩ = |Ψ1⟩ + |Ψ2⟩*

for a couple of Hilbert space vectors *|Ψ1⟩* and *|Ψ2⟩*, named *wave functions* or *state functions*, satisfying the conditions:

* |Ψ1⟩ ∈ e1*

* |Ψ2⟩ ∈ e2*

In all these formulas the marks | ⟩ in the symbol indicate the vector nature of |Ψ⟩.
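The decomposition |Ψ⟩ = |Ψ1⟩ + |Ψ2⟩ with each component living in its own orthogonal subspace can be sketched numerically. A minimal illustration in a toy 4-dimensional Hilbert space (the basis choice and amplitudes are illustrative, not tied to any physical system):

```python
import numpy as np

# Toy 4-dimensional Hilbert space. Subspace e1 = span{|0>, |1>},
# subspace e2 = span{|2>, |3>}; they are mutually orthogonal, and
# e1 ⊕ e2 spans the whole space.
psi1 = np.array([0.6, 0.0, 0.0, 0.0], dtype=complex)   # |Psi_1> ∈ e1
psi2 = np.array([0.0, 0.0, 0.8, 0.0], dtype=complex)   # |Psi_2> ∈ e2

psi = psi1 + psi2                      # |Psi> = |Psi_1> + |Psi_2> ∈ e1 ⊕ e2

# Orthogonality of the two components:
print(np.vdot(psi1, psi2))             # 0 -> e1 ⊥ e2

# |Psi> is a unit vector, so its components' squared norms act as weights:
print(np.vdot(psi, psi).real)          # 0.36 + 0.64 = 1.0
```

The squared norms of the two components, 0.36 and 0.64, are exactly the *measures* (weights) of the two eventualities discussed below.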

Orthogonality and Observables

Orthogonality is what characterizes an *observable* in Quantum Theory. An *observable* for a system *A* consists of a complete set *{e}* of mutually orthogonal Hilbert subspaces:

e1 ⊕ e2 ⊕ · · · ⊕ eN-1 ⊕ eN

spanning the entire Hilbert space *I{A}*, such that their sum equals the complete set *I*, whose probability *P{I}* = 1:

e1 ⊕ e2 ⊕ · · · ⊕ eN-1 ⊕ eN *= I.*
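Completeness means the projectors onto the subspaces of {e} sum to the identity, so the probabilities of the eventualities always sum to P{I} = 1. A minimal sketch with three orthogonal one-dimensional subspaces (an illustrative basis choice, not tied to a particular physical system):

```python
import numpy as np

# An observable as a complete set {e} of mutually orthogonal subspaces:
# here, three projectors onto orthogonal subspaces of a 3D Hilbert space.
e = [np.diag([1.0, 0.0, 0.0]),
     np.diag([0.0, 1.0, 0.0]),
     np.diag([0.0, 0.0, 1.0])]

# Mutual orthogonality: e_i e_j = 0 for i != j
print(np.allclose(e[0] @ e[1], 0))     # True

# Completeness: e1 ⊕ e2 ⊕ e3 = I, so for any normalized state |psi>
# the probabilities over the full set sum to P{I} = 1.
identity = sum(e)
psi = np.array([0.5, 0.5, np.sqrt(0.5)])
probs = [float(psi @ ei @ psi) for ei in e]
print(identity)        # 3x3 identity matrix
print(sum(probs))      # 1.0
```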

Measurements are Branchings

The original idea of *branching* is not ascribed to Hugh Everett III. Rather, it dates back to before John von Neumann, when it was assumed that the probability distribution would be provided by a *pure state*, specified by a unit Hilbert space vector, represented as a sum:

|*Ψ⟩* = *Σi |Ψi⟩*

where *i* = 1, 2, …, N-1, N, of the eigenvectors |*Ψi⟩ ∈ ei* of the observable *{e}* being considered.

**A trajectory is constantly branching, transforming from state to superposition of states, with each successive measurement. The Hilbert space represents a system before the measurement, when all of the states of the system coexist in superposition. What is visible at side is Everett's original idea, where Time has the direction of the branching. Also the opposite happens, where different branches join: the modern meaning associated to the classic term *interference***

*“Quantum Theory is predicting the probabilities of sequences of alternatives at a series of times”*

The summation symbol *Σ* above refers to a vectorial or tensorial sum, an alias of *superposition*. We touch this point to remark that the Principle of Superposition played a central role already in the original Quantum Mechanics. It was considered necessary by its many developers, like John von Neumann, Max Planck, Niels Bohr, Werner Heisenberg, Wolfgang Pauli, Max Born, Erwin Schroedinger, Paul Langevin, etc., before Hugh Everett III was born. Already before Everett, the Copenhagen viewpoint described the observation (or measurement) process as having a first step consisting of splitting *|Ψ⟩* into the set of alternative projections:

* |Ψi⟩ = ei |Ψ⟩ *

**[1]**

onto the relevant eigenspaces named ** branches**.

Different Histories

Doing this, Quantum Theory is predicting the probabilities of sequences of alternatives at a series of times, hence *histories*. As an example, the sequence of ranges of centre-of-mass position of a container at a series of times gives a coarse-grained description of its motion when transported by a conveyor. Sequences of sets of alternatives at a series of times specify a set of different histories of the model. An individual history in the set corresponds to a particular sequence of alternatives and is represented by the corresponding chain of *projection operators*. To prevent the disturbing implication of the many alternative histories experienced by a single object, the creators of Quantum Mechanics added a second step to the observation process, clarified by the figure below. Here, a projection of a quantum state-vector |ψ⟩ into a vector subspace S by a projector *P*(S), shown as the projection of |ψ⟩ onto a ray corresponding to |ψm⟩, with which it makes an angle θ. The probability for this transition to occur is the squared cosine of the angle θ between the subspace S and the vector |ψ⟩, thus illustrating von Neumann's concept of *probabilities evaluated by the measurement of angles*.
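Von Neumann's “probabilities measured by angles” can be checked in a couple of lines. A minimal sketch, with an illustrative real 2-dimensional example where |ψ⟩ makes an angle of 30° with the target ray |ψm⟩:

```python
import numpy as np

# Probability of the transition |psi> -> |psi_m> as the squared cosine
# of the angle theta between the state vector and the target ray.
psi   = np.array([np.cos(np.pi / 6), np.sin(np.pi / 6)])  # unit vector, 30 deg
psi_m = np.array([1.0, 0.0])                              # target ray

# Projector P(S) onto the ray spanned by |psi_m>
P = np.outer(psi_m, psi_m)

projection = P @ psi
prob = float(projection @ projection)      # squared length of the projection

theta = np.arccos(float(psi_m @ psi))
print(prob, np.cos(theta) ** 2)            # both equal cos^2(30 deg) = 0.75
```

The squared length of the projected vector and cos²θ coincide: the geometry of the Hilbert space *is* the probability rule.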

The (disturbing) implication of the *many actual histories experienced by a single object after an interaction* (or measurement) is deeply ingrained in the Principle of Superposition: the single object *coexists* in each one of the histories. It is exactly because Quantum Mechanics makes, since the start, the Principle of Superposition its cornerstone, that the fathers of Quantum Mechanics encountered, since the start, those *disturbing implications*. A second step that the theory itself, resumed in the superposition **[1]**, does not encompass. This second step, famously named *collapse*, conjectured that the set should have been replaced by a single renormalized branch vector:

*|Ψi⟩ / ⟨Ψi |Ψi⟩*^{1/2}

that would turn up with the corresponding conditional probability:

*Pi = ⟨Ψi |Ψi⟩*

Branches’ Measures

To make sense of the Everettian perspective about branching, in the following we'll use some pretty decent though imperfect analogies. The *wave* or *state function* *branches*, in general, have different *measures* or, in the word used by Everett, *weights*. The concept of branch *weight* is hinted at by the figure at side. The base of a physical object shaped following the geometry of a cone is its most stable, extended surface, visibly more stable and extended than its vertex, a vertex that from a mathematical point of view is a singularity. The cone's base represents its unique stable position. In this perspective, a branch of the wave function due to the projection of the vertex has smaller measure in the Hilbert space than another deriving from the many more eigenvectors of the base. Also, a long-lasting object like, as an example, a neutron bound in a stable nucleus, with a mean lifetime > 10^{29} years, is an eventuality re-proposed in many individual *Events*, thus having a measure in the Hilbert space greater than that of, e.g., the subatomic particle named neutral pion, whose mean lifetime is just ~10^{-17} seconds. *Events* which, in the relativistic viewpoint, are 3D leaves composed of all that exists, tagged by different instants of Time.

**Cone's base is not just bigger than its vertex: it also represents its unique stable position. Its measure in the Hilbert Space is bigger than that of its vertex and of the many lateral sides over which it'd lie inclined. Which does not mean that only the cone's base exists**

Time and the Quantum

One of the main problems in the Copenhagen view is related to the *timing of the collapse*. Relativity Theory since the start objects that a question about when something happens refers to the concept of *time*, a concept subjectively dependent on the choice of the reference system. The problem is so acute that it was later discovered that also classical Newtonian Mechanics shows paradoxical effects of that time-ordered idea of “collapse”, paradoxical effects hinting at an intrinsic error. Also, when Hugh Everett III was a child he exchanged mail with the creator of Relativity theory, Albert Einstein, then at Princeton, New Jersey. In some way, the adult Everett did justice to many of Einstein's famous remarks about the Copenhagen view. In opposition to the Copenhagen view, following Everett, before a measurement the system *A* would not be in a *pure state*. Rather, it would be in a *mixed state*, simultaneously existing in a mix of all its allowed states, its corresponding von Neumann operator temporarily representing the probability:

**P** = Σi |Ψi⟩⟨Ψi | **[2]**

distinguished from the *pure* probability operator:

**P**(0) = |Ψ⟩⟨Ψ | **[3]**

and whatever may turn out to be the value of the a posteriori probability operator:

**P**[i] = 1/Pi |Ψi⟩⟨Ψi | **[4]**
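The difference between the operators [2] and [3] can be made concrete: a pure operator P(0) = |Ψ⟩⟨Ψ| satisfies tr(P²) = 1, while the mixed operator P = Σi |Ψi⟩⟨Ψi| built from the branches has tr(P²) < 1. A minimal sketch with an illustrative 2-level state (amplitudes chosen arbitrarily):

```python
import numpy as np

# Pure vs mixed probability operators, equations [2]-[3].
psi = np.array([np.sqrt(0.3), np.sqrt(0.7)], dtype=complex)  # unit vector

P0 = np.outer(psi, psi.conj())                 # [3] pure operator |Psi><Psi|

# Branch vectors |Psi_i> = e_i |Psi> for the two basis projectors:
branches = [np.array([psi[0], 0]), np.array([0, psi[1]])]
P = sum(np.outer(b, b.conj()) for b in branches)   # [2] mixed operator

# Purity tr(P^2) distinguishes the two cases:
print(np.trace(P0 @ P0).real)    # 1.0            -> pure state
print(np.trace(P @ P).real)      # 0.09 + 0.49 = 0.58 -> mixed state
```

Both operators have unit trace and predict the same single-measurement probabilities; only the purity tr(P²) reveals whether the coherence between branches is still there.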

**Quantum superposition in a subset of the State space named Hilbert space. The Ψi (i = 1, 2, 3) ∈ ei of the observable {e} are orthogonal vectors named eigenvectors. Their sum Ψsuperposition spans the entire complex vector space, representing the State of a physical system. Everett recognized the permanent superposition of all the physical systems**

Everett replaced the assumption that the system *A* was initially in a *pure state* by the more general assumption that it was in an initial state described by an a priori probability operator **P**(0), an *arbitrary sum of pure state operators*. The effect of the first step of the observation process is to provide a temporary probability operator given no longer by equation **[2]**, but rather by the more general:

** P** = Σi Pi **P**[i] **[5]**

where the operators **P**[i] are the a posteriori probabilities for the outfeeding *branches*, an example of them being the eventualities *ei*. These a posteriori probability operators, and the corresponding values of probability, are given in terms of the a priori probability operator **P**(0) by:

**P**[i] = 1/Pi **e**i **P**(0) **e**i

where the probability is the trace:

P*i* = tr{*P**(0) ***e***i*}

or, from the temporary probability operator given in the equation **[5]**, by expressions like:

** P**[i] = 1/Pi **e**i **P** **e**i

where: * P**i** = tr{**P** **e**i**} *
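The chain of formulas around equation [5] can be verified directly. A minimal sketch for an illustrative 2-level system: from an a priori operator P(0), compute the probabilities Pi = tr{P(0) ei}, the a posteriori operators P[i] = (1/Pi) ei P(0) ei, and the temporary operator P = Σi Pi P[i]:

```python
import numpy as np

# Update rules around equation [5], illustrative 2-level example.
psi = np.array([0.6, 0.8], dtype=complex)      # a priori pure state
P0 = np.outer(psi, psi.conj())                 # a priori operator P(0)

e = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])] # observable {e}

P_i  = [np.trace(P0 @ ei).real for ei in e]    # probabilities tr{P(0) e_i}
P_br = [ei @ P0 @ ei / p for ei, p in zip(e, P_i)]  # a posteriori P[i]

P = sum(p * Pb for p, Pb in zip(P_i, P_br))    # temporary operator [5]

print(P_i)                # [0.36, 0.64]
print(np.trace(P).real)   # 1.0 -> total probability conserved
```

Each a posteriori operator P[i] has unit trace by construction, and weighting them by the Pi reproduces an operator of total trace 1: no probability is lost in the first step of observation.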

Multiverse, in Synthesis

Bryce Seligman DeWitt, who edited the collection of papers (DeWitt, 1973) in the publication cited before, wrote a Preface we'll quote in the following. It offers a straight and clear insight into the logical machinery underlying *The Theory of the Universal Wave Function*, including the modern version of the Principle of Superposition:

“In 1957, in his Princeton doctoral dissertation, Hugh Everett, III, proposed a new interpretation of quantum mechanics that denies the existence of a separate classical realm and asserts that it makes sense to talk about a state vector for the whole universe. This state vector never collapses and hence reality as a whole is rigorously deterministic. This reality, which is described jointly by the dynamical variables and the state vector, is not the reality we customarily think of, but is a reality composed of many worlds. By virtue of the temporal development of the dynamical variables the state vector decomposes naturally into orthogonal vectors, reflecting a continual splitting of the universe into a multitude of mutually unobservable but equally real worlds, in each of which every good measurement has yielded a definite result and in most of which the familiar statistical quantum laws hold. (…) Looked at in one way, Everett's interpretation calls for return to naive realism and the old fashioned idea that there can be direct correspondence between formalism and reality”.

**However astounding it may appear, the modern version of the Principle of Superposition says that a multitude of chess games are being simultaneously played, played in different 3-dimensional spaces superimposed in the 4-dimensional one we inhabit. Part of the games correspond to the summation of all the possible familiar time-ordered sequences conceived by chess grandmasters and by the supercomputers programmed to play chess. The arena where the theory started to be experimentally verified, by means of the Mach-Zehnder interferometer, was that of Quantum Optics, later extended to mesoscopic-sized objects visible to the unaided eye**

The Debate

What DeWitt wrote above hints at a fact: Everett's interpretation of Quantum Mechanics (QM) is the simplest and straightest statement of what QM is telling us. Everett's understanding of the QM formalism was such as to produce the first real step-by-step analysis of what a Measurement is. In the debate about the interpretation of QM, Hugh Everett III was a follower of Albert Einstein's position. To have an idea of the kind of ‘philosophy’ accompanying the imposed “shut up and calculate” *credo*, the Copenhagen interpretation was insisting that *reality is attributed to objects by the observation*. Einstein and Everett, on the opposite side, were convinced defenders of the idea that *Reality exists unrelated to Measurements and existed before any human started to observe Nature*. A historically new and deeper operative meaning had been established for the Principle of Superposition, based on the Schroedinger equation **[6]**:

**Bryce G. DeWitt, the physicist who published the theory by Everett, including the Principle of Superposition in its modern version (Larry Murphy)**

**After 1990 it became clear that choices among the way-outs are apparent: all of them are explored. Following modern insights of Quantum Field Theory, to that instant of time are referred many more results than the 3 here visible. Each result is an Event itself, triggered in correspondence with different properties. In recent years the less intuitive results have been experimentally observed on macroscopic scales**

**How do we know a bottle is closed ?**

The formalism of the 1957 version of the Principle of Superposition (at left side) can be translated in practical terms by means of an example referred to a cap-presence digital inspection in a Bottling Control. We choose the Cap Presence inspection because it is one of the simplest existing, hence easier to imagine also for non-specialists. One of these is visible in the figure below.

**An everyday life object, a PET bottle, immediately before establishing a strict however brief superposition of states with two apparatuses acting as measurement systems. The lateral black fork-like pair of objects is a high-frequency (27 MHz) radiator to detect the filling level. The upper central cylindric object is a Photoscanner devoted to Cap Presence inspection, including a LED illuminator and a light detector based on a phototransistor**

Let:

*S* *composite system*

The *apparatus* we name Cap Inspection in a Bottling Control with several kinds of inspection, and a cap mainly composed of Carbon atoms;

*S1* *subsystem of the system S*

An atom of Silicon in the phototransistor, which has to detect the reflection of a 700 nm photon (emitted by a LED encased in the same Photoscanner) indicating the status of cap present;

*S2* *subsystem of the system S*

An atom of the cap, which can be present or absent;

*Ri* *property of the subsystem S1*

The energetic level of the Silicon atom, high after absorption of a red photon whose wavelength is 700 nm;

*Qi* *property for S, after the establishment of the correlation* (a superposition, with the topology of a bifurcation or an interference) *between S1 and S2*;

Then:

Ψ^{S1 + S2} *wave function after interaction of S1 and S2*.

Wave function encoding the status of the atom in the cap and of the atom in the phototransistor of the Photoscanner, whose layman translation is: “Cap Present”.

and:

Σi ai Ψ_{i}^{S1 + S2} *eigenstate in which the apparatus has recorded an eigenvalue.*

The relative state in which a component of the Cap Presence inspection, namely an atom of Silicon of the phototransistor in the Photoscanner, has increased its own energetic level after having absorbed a 700 nm photon reflected by an atom of Carbon of the cap

iħ ∂*ψ*( r, *t*)/∂*t* = - (ħ^{2}/2m) ∇^{2}*ψ*( r, *t*) + *V*( r, *t*) *ψ*( r, *t*) **[6]**

where:

- i: imaginary unit defined in the complex numbers' set *ℂ* = { a + i b; a, b ∈ ℝ }, i.e. √-1;
- *V*( r, *t*): potential energy influencing the particle;
- m: mass of the particle;
- *ħ*: Planck constant divided by 2π, equal to 1.05459 × 10^{-34} J s;
- *ψ*( r, *t*): wave function, defined over space and time;
- ∇^{2}: Laplacian operator.

A meaning where the *wave function* Ψ is the fundamental deterministic entity. The Schroedinger equation is the equation of the Irish mathematician William Rowan Hamilton in disguise, thus bridging classical mechanics' discoveries to the extremely fine-detailed quantum scales. This picture makes sense only when the observation processes are treated within the theory. It is only in this manner that the apparent existence of definite macroscopic objects, as well as localized phenomena (such as tracks in cloud chambers), can be satisfactorily explained in a wave theory where the waves are continually diffusing.

A deduction of this theory is that phenomena will appear to observers to be subject to the discontinuities which are everywhere observed. The *quantum jumps* exist as relative phenomena (e.g., the states of an object-system relative to chosen observer states show this effect), while the absolute states change quite continuously. That's why one of the names of the theory is the Relative State formulation. Each result is no less true and real than the others. We indicate in the figure above this multitude of coexisting, rather than alternative, histories by means of the three ways branching out of a single one. The new Principle is capable of including the entire step-by-step evolution of a system composed of subsystems. And, most important, this modern version holds for any system of Quantum Mechanics for which the classic version of the Superposition Principle holds, and is applicable to all physical systems, regardless of size. As a consequence, what is being presented in these pages is the most modern conception of Measurement.

The Modern Principle

**The atoms of Carbon in the screw of this bottle were already correlated with the Carbon atoms we name Cap well before they were aggregated in that shape by a Cap Moulding Machine. Each atom of Silicon today functionally shaped and doped to act as detector of the reflected wave packets, indicating presence of a cap, in the Cap Presence inspection of a Food and Beverage Bottling Control, is correlated with all other existing particles. To measure by means of a photoscanner the property Φi^{S1} *closure* means to establish a cap-photoscanner state describing the photoscanner as definitely perceiving that particular system state. But, to definitely perceive a Cap + Photoscanner state, it is necessary... time. Time to transform the previous state, in which all possible kinds of correlation of the Photoscanner coexist, into a following state in which the Photoscanner is “aware” of being correlated to a Cap, because it has recorded eigenvalues for the eigenfunction describing a Cap**

In the following, the text of the modern version of the Principle of Superposition due to Everett, where:

- A is a quantity with eigenfunctions Φ_{i}^{S1} measured in a system S1 by an apparatus;
- ai = (Φ_{i}^{S1}, Ψ^{S1}) are the projections on the eigenspaces of the different eigenvalues of A, after the measurement;
- the brackets [ … ] denote values recorded in the memory of the measurement apparatus. Exactly what invariably happens in the measurements performed by the nonclassic subsystems (namely, detectors' semiconductors, part of the inspections) of other subsystems of those apparatuses we name Electronic Inspectors.

*“**.…*For any situation in which the existence of a property Ri for a subsystem S1 of a composite system S will imply the later property Qi for S, then it is also true that an initial state for S1 of the form:

Ψ^{S1} = Σi ai Ψ^{S1}[Ri]

will result in a later state for S of the form:

Ψ^{S} = Σi ai Ψ^{S}[Qi]

which is also a superposition of states with the property Qi. That is, for any arrangement of an interaction between two systems S1 and S2 which has the property that each initial state:

Φ_{i}^{S1} Ψ^{S2}

will result in a final situation with total state Ψ_{i}^{S1 + S2}, an initial state of S1 of the form:

Σi ai Φ_{i}^{S1}

will lead the whole system, after interaction, to the superposition:

Σi ai Ψ_{i}^{S1 + S2} *…..”*
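Everett's statement is, at bottom, a consequence of linearity: if an interaction maps each basis state of S1 (paired with the apparatus state) into a correlated total state, then by linearity it must map a superposition into the superposition of the correlated (relative) states. A minimal sketch with a CNOT-like toy unitary; the labels “cap absent/present” and “detector ready/fired” are purely illustrative:

```python
import numpy as np

# Toy interaction U: maps each basis state phi_i^(S1) ⊗ psi^(S2) into a
# correlated state, here a CNOT-like unitary on a 2x2 composite system.
# Basis order: |absent,ready>, |absent,fired>, |present,ready>, |present,fired>
U = np.array([[1, 0, 0, 0],      # |absent, ready>  -> |absent, ready>
              [0, 1, 0, 0],      # |absent, fired>  -> |absent, fired>
              [0, 0, 0, 1],      # |present, ready> -> |present, fired>
              [0, 0, 1, 0]], dtype=complex)

a = np.array([0.6, 0.8])                 # amplitudes a_1, a_2 (illustrative)
s1 = np.array([a[0], a[1]])              # a_1|absent> + a_2|present>
s2_ready = np.array([1.0, 0.0])          # detector initial state |ready>

# Initial product state: (a_1|absent> + a_2|present>) ⊗ |ready>
initial = np.kron(s1, s2_ready)

final = U @ initial
# By linearity: a_1|absent,ready> + a_2|present,fired> -- two correlated
# branches, both present in the final superposition, neither collapsed away.
print(final.real)          # [0.6, 0.0, 0.0, 0.8]
```

The final state is not a product state: it is exactly the superposition Σi ai of correlated branches that Everett's principle asserts, with the branch weights 0.36 and 0.64 inherited from the initial amplitudes.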

*Repeatability* and *False Positives*:

no longer a statistical meaning

*Photoscanners* are the modern version of what long ago were known as *photo-electric cells*, an evolved industrial version of *photo-detectors*. They are in common use in a huge number of industrial technological applications. They are interesting because designed in such a way as to *emulate* the simplest neuronal chains, like those comprised between the retina and the groups of neurons in the lobes devoted to perception. What to expect today as the modern meaning of terms like *Repeatability* and *False Positives*, when the proved nature of everything is a wave, since the demonstration given in 1927 by Louis de Broglie? All waves are superpositions, that is, sums of other terms featuring different frequencies and amplitudes.

**All everyday-life objects, including PET bottles and their content, share the same wave nature: Superpositions of multitudes of terms at different frequencies and amplitudes. An atom, the resonance arising from a vast collection of atoms like an aluminium lid exposed to ultrasounds, and the movement of the hit surface of a drum share a similar nature (COMSOL/2013)**

The same applies fully to all matter, including the atoms composing a common cap and the Photoscanner itself. In the modern viewpoint, to determine by means of a Photoscanner that a bottle is capped, say to definitely perceive a Cap + Photoscanner state, it is necessary to have:

**Time**, to transform the previous state, in which all possible kinds of correlation (superpositions) of the Photoscanner coexist, into a following state in which the Photoscanner is “aware” of being correlated to a Cap, because it has recorded eigenvalues for the eigenfunction Φ_i^{S1} describing a Cap. The correlation between the two systems (Photoscanner and Cap) is progressively established during the interaction and is proportional to the natural logarithm ( ln t ) of the interaction time t. An ideal correlation, corresponding to a maximised information of the Photoscanner about the Cap, could only be reached by allowing an infinite time. This causes the measurements’ fluctuations, synonymous with the spectrum of the eigenvalues, resulting in the Electronic Inspector’s false positives (false rejects).
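The logarithmic growth of the correlation can be sketched numerically. The constants K and T0 below are hypothetical normalisations introduced only for illustration; the point is the shape of the curve, not the numbers:

```python
import math

# Sketch of the logarithmic correlation law stated above: the correlation
# between Photoscanner and Cap grows as K * ln(t / T0) with interaction
# time t. K and T0 are hypothetical constants chosen only for illustration.
K = 1.0     # hypothetical growth constant
T0 = 1e-9   # hypothetical reference time, in seconds

def correlation(t):
    """Photoscanner-Cap correlation after an interaction time t > T0."""
    return K * math.log(t / T0)

# Each doubling of the interaction time adds only the same fixed increment,
# so the ideal (infinite) correlation is never reached in finite time.
for t in (1e-6, 1e-3, 1.0):
    print(f"t = {t:.0e} s  ->  correlation = {correlation(t):.2f}")
```

A curve of this shape saturates very slowly, which is why any finite interaction time leaves a residual spread of eigenvalues, and hence fluctuations.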

**Individual wave packets, of wavelength 700 nm, emitted by a red LED can be reflected back to a phototransistor, encased jointly with the LED to form a Photoscanner, by a carbon atom among the many composing what we name a cap, applied over a passing bottle. The correlation between the two systems (Photoscanner and Cap) is progressively established during the interaction and is proportional to the natural logarithm ln t of the interaction time t. An ideal correlation, corresponding to a maximised information of the Photoscanner about the Cap, could be reached only by allowing an infinite time.**

This is the cause of the measurements’ fluctuations, synonymous with the spectrum of the eigenvalues, resulting in the Electronic Inspection equipment’s false positives, later translated into false rejects. To make an example, refer to the images here above. If the speed of the beer bottles visible were, say, 15 times faster than the 2 m/s it really is, then none of the six inspections visible in the figure could accomplish its task correctly any more. At a container speed of 30 m/s, low-energy photons, like all those adopted above as “interactant” by:

- 3 LASER tracking triggers (red colour visible light),
- 1 container inner pressure inspection (ultrasounds),
- 1 high frequency fill level inspection (~21 MHz),
- 1 cap optic inspection (visible light),

are no longer efficient. As an example, at 30 m/s:

- LASER tracking trigger,
- Fill Level inspection,
- Cap Optic inspection,

have to be forcedly replaced by more expensive Gamma-ray detectors. The inner pressure can no longer be established by ultrasounds and, as a matter of fact because of other reasons, it is challenging to inspect it by Gamma-rays too.
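A back-of-the-envelope dwell-time calculation makes the speed argument above concrete. The beam width is a hypothetical figure chosen only for illustration:

```python
# Sketch of the dwell-time argument above: the time a passing cap spends
# inside an optical beam shrinks linearly with container speed. The beam
# width below is a hypothetical value chosen only for illustration.
BEAM_WIDTH_M = 0.005  # hypothetical 5 mm optical beam

def dwell_time(speed_m_s):
    """Seconds the cap spends inside the beam at the given container speed."""
    return BEAM_WIDTH_M / speed_m_s

# At 15x the real 2 m/s speed, the interaction time drops by the same
# factor, leaving low-energy photons too little time for the correlation.
for v in (2.0, 30.0):
    print(f"{v:4.0f} m/s  ->  dwell time = {dwell_time(v) * 1e3:.3f} ms")
```

Combined with the logarithmic correlation law, a 15-fold shorter dwell time means a fixed loss of correlation per crossing, regardless of the detector technology, which is why higher-energy probes become necessary.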

**Interaction between the systems such that the information in the marginal distribution of the object inspected is never decreased.** Otherwise we could no longer have repeatability of the measurements. As an example, this would be the case if we erroneously tried to use a beam of high-energy neutrons, rather than the LED’s low-energy photons, to interact with the Cap. The neutrons would modify the molecular structure of the Cap, modifying its eigenstates and hence the eigenvalues we expected to derive from the measurement.

**This causes the measurements’ fluctuations, synonymous with the widened spectrum of the eigenvalues, resulting in the Electronic Inspector’s false positives (false rejects). The interaction between the systems has to be such that the information in the marginal distribution of the object inspected is never decreased; otherwise we could no longer have repeatability of the measurements. As an example, if we erroneously tried to use a beam of high-energy atomic nuclei, rather than the LED’s low-energy photons, to interact with the Cap, the nuclei could easily change the molecular structure of the Cap (damaging it), changing its eigenstates and hence the eigenvalues we expected to measure**

1982: the Crossroad

John Bell’s analysis and the successive experiments demonstrated that the phenomenon named Entanglement, whose initial ideas date back to 1935, has to be part of reality, rather than being the consequence of an incomplete description, say, of mere statistical correlations. Single-photon detectors, coincidence counters and powerful computers allowed a team of researchers at the Institut d’Optique in Orsay, near Paris, led by **Alain Aspect**, a thorough statistical verification of the Entanglement phenomenon over the years 1980 to 1982. Quantum Mechanics, considering only two of its technological applications, namely Electronics and Information Technology, is still the most successful scientific theory. In 1982, the part of humanity aware of what was going to shape everyone’s future understood that a historical crossroad had been reached.

**Alain Aspect. During 1980-1982, he directed the group of researchers who made the first decisive tests of Entanglement and non-locality. The experiments confirmed the non-local character of everything, matter and radiation**

3 mutually exclusive interpretations for the single meaning of *Reality*:

1. the limit speed of propagation of physical influences is higher than light speed

(Bohm-De Broglie interpretation, 1952);

or:

2. Ψ represents a Quantum Field and multiple paths are actual

(Everett interpretation, 1957);

or:

3. an observation (or measurement) is what lets a physical state exist

(Bohr-Copenhagen interpretation, ~1932);

where:

- interpretation 1. is contradicted by:
  - the generalized idea that causes exist and precede their effects;
  - Einstein’s idea about the existence of a limit speed for light signals, the validity of Special Relativity being constantly reconfirmed by experiments on macroscales;
- interpretation 2. seems Science Fiction;
- interpretation 3. is a solipsistic position, criticised by Einstein, who observed he could not believe that a mouse could bring about drastic changes in the universe simply by looking at it.

The key point of the Copenhagen Interpretation lies in John von Neumann’s concept of Measurement. Following von Neumann, a measurement corresponds to the set of possible projections onto a complete orthogonal set of rays of the Hilbert space being measured.
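Von Neumann's projection scheme just mentioned can be sketched in a few lines. The state vector below is a hypothetical three-dimensional example chosen only for illustration:

```python
# Sketch of von Neumann's measurement concept described above: a measurement
# is the set of projections onto a complete orthogonal set of rays. For a
# basis ray e_i, the outcome probability is |<e_i|ψ>|² (the Born rule).
# The amplitudes below are a hypothetical example.
state = [complex(1, 0), complex(0, 1), complex(1, 1)]  # unnormalised amplitudes

# Normalise the state vector so the probabilities sum to 1.
norm = sum(abs(a) ** 2 for a in state) ** 0.5
state = [a / norm for a in state]

# Projections onto the three basis rays yield the outcome probabilities.
probabilities = [round(abs(a) ** 2, 12) for a in state]

print(probabilities)  # -> [0.25, 0.25, 0.5]
```

Each probability is the squared length of the projection of the state onto one ray; the complete orthogonal set guarantees the probabilities sum to one.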

**Programmable Logic Controllers (PLCs) and Frequency Converters are applications of the same nonclassic rules of Quantum Mechanics in all Packaging Lines. The Modern Principle of Superposition was modelled in 1957 over complex automata, for which PLCs are today’s name**

Since ~1990 the definition of an Event as the detection of a quantum field excitation emitted (absorbed) by a source (detector) is built over the cornerstone of those multiple coexisting Paths made of Events, but goes further. Today, after the Theory of Information’s revolution, backed by theorems discovered in the last decades whose impressive yield has only started to bear fruit, Information is no longer considered a *passive* element, passive like a mere way to label properties and states of energy and matter. The concept of Event, its Information content and its connections with the other branches of Science and Technology are much broader today than they were in 1915. An Event is today, basically and in general, the name of the status of a physical or logical property.

**Quantum optic bench devoted to interferential microscopy. The bench itself is typically a very thick, extremely heavy block of granite (M. Peshkin, A. Tonomura/1989)**

The entire line of reasoning based on the chaining of cause and effect, something which appeared obvious in 1915, is one century later the object of deeper examination. The ancient causal ideas came at odds with the outcomes of experiments like those associated with Entanglement. A key point regards the wave function Ψ, which enters constantly into the mathematical models related to the design of semiconductors: exactly those accounting for > 99.9999 % of the components in CPUs and other Integrated Circuits. CPUs and Integrated Circuits (ICs) which also allowed the production speed of the world’s fastest Food and Beverage Bottling Line to increase from the ~9000 containers per hour of 1948 to the present ~140000 containers per hour.

There are plenty of practical and successful technological applications, like the Programmable Logic Controllers (PLCs) and Frequency Converters in the Control Rooms of the industrial Packaging Lines. They are all applications derived from the superposed waves building up the localized wave packet Ψ of formula **[6]**. It merits being remembered that all these everyday applications owe their existence to Science, and precisely to the ideas synthesized in formulas like that of the wave packet Ψ, to the fundamental idea of Superposition, and others. Formalisms whose explanatory power is corroborated by optic experiments like those above, including LASER sources, mirrors, lenses, beam splitters, slits and photodetectors.
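The wave-packet idea recalled above (formula [6] is not reproduced here) can be sketched numerically. The central wavenumber and spread below are illustrative values of our own choosing:

```python
import cmath
import math

# Sketch of the superposition building a localized wave packet Ψ, as recalled
# above: many plane waves exp(i k x), weighted by a Gaussian envelope in k,
# interfere constructively only near x = 0. All numbers are illustrative.
K0, SIGMA = 10.0, 2.0  # hypothetical central wavenumber and spread

ks = [K0 + SIGMA * (n / 10.0) for n in range(-30, 31)]
weights = [math.exp(-((k - K0) ** 2) / (2 * SIGMA ** 2)) for k in ks]

def psi(x):
    """Superposed amplitude Σ_k w(k) · exp(i k x) at position x."""
    return sum(w * cmath.exp(1j * k * x) for k, w in zip(ks, weights))

# The packet is localized: |Ψ| peaks at the centre and decays away from it.
for x in (0.0, 0.5, 2.0):
    print(f"x = {x:3.1f}  ->  |Ψ| = {abs(psi(x)):.3f}")
```

The wider the spread of wavenumbers in the sum, the more sharply the resulting packet is localized: the same trade-off that underlies the design models of semiconductor components.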

**An experiment conceived in 1935 by Einstein and his collaborators Podolsky and Rosen (EPR), to verify Quantum Mechanics’ respect of the basic idea of causality (resumed in the locality concept underlying the Special Relativity theory), implied controlling the physical properties of individual particles. Following the basic human assumption of causality, no action on a physical property (for example, the spin) of an object here and now may instantaneously influence a physical property of another object elsewhere. The experiment required a technology unavailable in 1935. A Bohm-designed variant had to wait nearly 50 years to be tested. Its impressive result is that just Bohm’s and Everett’s interpretations of Quantum Mechanics correspond to Nature’s behaviour (E.S. Fry and X. Qu/2009)**

The example above shows a pair of particles emitted from the source, each headed for its respective Stern-Gerlach magnet. Each magnet is set to measure spin in some direction, and the result is that the particle is deflected either ‘up’ or ‘down’, to be detected in an ‘up’ or ‘down’ region of its detector. This setup helped to realise the relevance of the parametric + hardware + procedural setup in the determination of the outcomes: in this special case, the former being the orientation of the Stern-Gerlach magnets, and the latter the flash at either the ‘up’ or ‘down’ region of a detector.

Following the basic human assumption of causality, no action on a physical property (for example, the spin) of an object here and now may instantaneously influence a physical property of another object elsewhere. The original experiment was conceived by Einstein and his collaborators Podolsky and Rosen in 1935. Unfortunately, it implied a technology for individual photon or electron generation, detection and coincidence counting then unavailable. Nearly 50 years later a David Bohm-designed variant, named EPR-Bohm, was tested. Its impressive result is that just David Bohm’s **(1)** and Hugh Everett’s **(2)** interpretations of Quantum Mechanics correspond to Nature’s behaviour.
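The statistical content of the EPR-Bohm test can be sketched through the CHSH combination of correlations. The angle choice below is the standard textbook one, used here only as an illustration:

```python
import math

# Sketch of the statistics behind the EPR-Bohm experiments described above:
# for the spin singlet, Quantum Mechanics predicts E(a, b) = -cos(a - b)
# between Stern-Gerlach settings a and b. The CHSH combination of four such
# correlations exceeds 2, the bound satisfied by every local-realist model.
def E(a, b):
    """Quantum singlet correlation for analyser angles a and b (radians)."""
    return -math.cos(a - b)

# Standard optimal settings (textbook choice, used here as an illustration).
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

S = abs(E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2))
print(f"CHSH S = {S:.4f}  (local realism requires S <= 2)")
```

The computed value is 2√2 ≈ 2.828, the quantum maximum: any theory of mere statistical correlations between preset properties cannot exceed 2, which is what the coincidence-counting experiments verified.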

*(to be continued)*

Links to pages on related subjects:

### Introduction. In the Industrial Automation and Electronic Inspection fields, however named and operatively disguised, Triggers play the role of the most elementary measurement instruments. Simpler than any single-channel (e.g., the High Frequency fill level inspection) or multi-channel (for example, all camera-based inspections) analog-digital variable measurement system, and a permanent input device to the Programmable Logic Controllers (PLCs) for the automations. …

### Objects and Measurements’ hidden Nature. In 1915 Einstein extended Relativity Theory to include also some of the ideas underlined by the figure on side. To every point of a curved differentiable manifold …

### Triggers. Sharp and Unsharp Signals. Triggered is said of Events with the strictest kind of correlation that may be imagined: the causative. Their effects are other Events. After the introduction given here …

### Classic View. All the engineers engaged on a daily basis with calibration operations in the equipment and machinery of Food and Beverage Production Lines know they continuously apply an idea named the Principle of Superposition. …

### The Far Reaching Consequences of a Ph.D Dissertation. When treating the ideas of Relativity, we saw the Relativity Principle implying infinite 3D spaces associated to each instant of time, which in turn implies that the world is at least 4-dimensional. …

### Quantum in Brief. With reference to the figure on side, a quantum system is specified by: Hilbert space H: a subset of the Banach space, whose rays are non-nil complex-valued vectors, each of them representing possible states of a physical system. …

Links to pages on other subjects:

### Total Cost of Ownership of a Full Containers Electronic Inspector, on the Basis of its Fill Level Inspection Technology. Counting the number of Technologies existing for the measurement of the fill level in Bottling Lines, we encounter at least seven different ones. …

### When thinking of its applications in industrial Machinery and equipment, everyone thinks to know what a Trigger is. Their best-known examples are the Container Presence electromagnetic detectors (e.g., photoelectric, inductive, by means of ultrasounds or Gamma-rays) which let the Machinery operate. …

### A Fundamental Question. What do Detectors detect? Their purpose is known: the conversion of light (photons) into electric currents (electrons). Photodetectors are among the most common optoelectronic devices; they automatically record pictures in the Electronic Inspectors’ cameras, the presence of labels in the Label Inspectors or the fallen bottles lying on a Conveyor belt. …

### Introduction. The light generated by a LASER LED in the figure above may be used to detect an excessive inclination or height of a closure, and also the fill level of a beverage in a transparent container. …

### The subject of Classification is studied by Statistics and Applied Probability Theory. It is concerned with the investigation of sets of objects in order to establish whether they can be summarized in terms of a small number of classes of similar objects. …

### An optical rotary joint using injection moulded collimating optics (Poisel, Ohm University of Applied Sciences/2013). Runt pulses & nonclassic Packaging Controls’ components. Also consumer cameras use a Trigger. …

### Inspections in a Decohering Environment. What is a Measurement? Measurement’s nature is like time: one of those things we all know until we have to explain it to someone else. An explanation invariably passing through the idea of comparison between a standard established beforehand and something else. …

### First In First Out. Application to Food & Beverage packaging of an idea developed to handle the highest production. The FIFO (First-In-First-Out) concept started to be applied some decades ago to industrial production, specifically to manage the highest-speed production lines. …

- Fill level inspection tech: a TCO point of view
- Physics of Triggering
- What Detectors detect ?
- Electromagnetic Measurements of Food, Beverages and Pharma: an Information Retrieval problem
- Binary Classification fundamentals
- Electronic Inspectors’ nonclassic components
- Measures in a Decohering Environment
- FIFO: Bottling Quality Control applications of an idea born to manage the highest production performances
- Photodetectors fundamental definitions
- Media download
- Containers