Inspections in a Decohering Environment

“Around the Electronic Inspectors lies the decisive factor, and the most often overlooked: the Environment”

What is a Measurement ?

Technical/White paper

     Flash animation


Measurement’s nature is like time: one of those things we all know until we have to explain it to someone else.  The explanation invariably passes through the idea of a comparison between a previously established standard and something else.  In reality this answer, learnt in Electronics, Electrotechnics and Engineering courses at whatever level, is today considered a classic one.  A soft way to say obsolete.  The idea of measurement has slowly changed from the initial mere comparison rule to something completely different.  It has been reshaped, following what was understood from the discoveries of the past century.  In the Classic perspective, a physical measurement requires a collection of devices such as clocks, encoders, phototransistors, counters, LASER diodes, temperature or pressure sensors, and so on.   The operational control of this instrumentation is exercised by the observer, who decides what to measure, how to perform a measurement, and how to interpret the results.

“As models for Observers we can, if we wish, consider automatically functioning machines, possessing sensory apparata and coupled to recording devices capable of registering past sensory data and machine configurations.   We can further suppose that the machine is so constructed that its present actions shall be determined not only by its present sensory data, but by the contents of its memory as well.   Such a machine will then be capable of performing a sequence of observations (measurements), and furthermore of deciding upon its future experiments on the basis of past results” (H. Everett III, 1957).   Visibly, these automatically functioning machines closely resemble an inspection in an Electronic Inspector, or one of the assemblies composing an Automatic Machine, like the Conveyors, Filler, Closer, Blowformer, Palletiser, etc.   Having established these basics, we all agree a physical measurement is meaningful only if one identifies in a non-ambiguous way:

  • who is the observer (or, apparatus, or automatically functioning machine);
  • what is being observed.

The same observable can be the target of more than one observer (or, automatically functioning machine); for this reason a suitable algorithm is also needed to compare their measurements.   In the following, we’ll just briefly introduce two points of view about the nature of physical measurements:

  • classic, a relativistic point of view based on Differential Geometry;
  • modern, the Quantum Physics point of view based on Quantum Field Theory.

Being two distinct points of view about the same subject, some terms are deliberately repeated. This introduction avoids most of the necessary formalism, replacing it with figures and graphics. The goal of this introduction along two different pathways is to show how similar conclusions about what a measurement really is emerge naturally from theories substantially different and developed in different epochs.

Local and non-local measurements

The Electronic Inspector and the environment containing it cover a finite spatial volume, making measurements that last for a finite interval of time.  Seen from a point of view which for over thirty years has been considered obsolete and far from the truth, the measurement’s domain is the space-time region in which a process of measurement takes place.   In a following section we’ll deepen this point, showing how wide the domain of automated quality control equipment and packaging machinery actually is.    A volume erroneously imagined as reduced to the Bottling Hall.    If the background curvature can be neglected, then the measurements will not suffer from curvature effects and will be termed local.   On the opposite side, if the curvature is strong enough that it cannot be neglected over the measurement’s domain, the response of the instruments will depend on the position therein, and they will therefore require a careful calibration to correct for curvature perturbations. In this case the measurements, carrying a signature of the curvature, will be termed non-local.

Apparatus and physical measurements

In Mathematics, the tangent space of a manifold simplifies the generalization of vectors from affine spaces to general manifolds.  An example of a tangent space is given in the figure at right side.   The generalization is made necessary because, on a general manifold, what we all learnt in elementary vector algebra, subtracting two points to obtain a vector pointing from one to the other, is not possible. An automatically functioning machine (e.g., an Electronic Inspector) and the environment containing it are mathematically modeled by a family of non-intersecting time-like curves, denoted Cu, having u as tangent vector field.

  A tangent space at a single point on a sphere. Generalising the concept, to every point of a differentiable manifold a tangent space can be attached. The tangent space is a vector space containing all possible directions along which one can pass through the point (image credit)


                          A vector normal to the surface

Here, Cu is the congruence of curves with tangent field u. Each curve of the congruence represents the history of a point in the laboratory or automatically functioning machine.   We choose the parameter on the curves of Cu so as to make the tangent vector field unitary, an always-possible choice for non-null curves.   Let Σ be a space-like three-dimensional section spanned by the curves which cross a selected curve γs of the congruence orthogonally.   The concepts of unitarity and orthogonality are relative to the assumed background metric.   The curve γs will be termed the fiducial curve of the congruence and referred to as the world line of the observer, apparatus or automatically functioning machine.
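In formulas (a minimal sketch, assuming the metric signature (−,+,+,+) common in the relativistic-measurement literature), the unitarity and orthogonality conditions read:

```latex
u^{\alpha} u_{\alpha} = -1
\qquad \text{(unitary, time-like tangent field)}
```
```latex
g_{\alpha\beta}\, u^{\alpha} X^{\beta} = 0
\quad \text{for every } X \text{ tangent to } \Sigma
\qquad \text{(orthogonality of the section)}
```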

  Comparison between the actual course of a point on a test geodesic and the fiducial course. The fiducial course is the course it would have had to take to keep a constant separation from the point moving on the fiducial geodesic (credit J. A. Wheeler et al./1973)


   A vector normal to the surface and an infinite family of associated tangent vectors

We’ll observe the section Σ spanning a 4-dimensional volume. This volume represents the space-time history of the environment including the observer, industrial equipment or Detector.   Limiting Σ's extension to a range much smaller than the average radius of its induced curvature, it becomes possible to identify:

  • Cu   →   the geodesic curve γs;
  • Σ     →   the point γs ( t ).

Any time-like curve γu with tangent vector u can then be identified as the world line of an observer (or, electronic inspector), which will be referred to as “the observer u”.   If the parameter on γu is such as to make the tangent vector unitary, then its physical meaning is that of the proper time of the observer u: for example, the time read on its clock, in units of the speed of light in vacuum.
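As a toy illustration (a minimal numeric sketch in flat spacetime; the function names and the constant 0.6 c speed profile are our own assumptions, not part of the formalism above), the proper time read by the clock of an observer u is the integral of dτ = dt √(1 − v²/c²) along its world line:

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def proper_time(velocity, t_total, steps=100_000):
    """Integrate dtau = sqrt(1 - v(t)^2 / c^2) dt along a time-like
    worldline given as a coordinate-speed profile v(t), flat spacetime."""
    dt = t_total / steps
    tau = 0.0
    for i in range(steps):
        v = velocity((i + 0.5) * dt)   # midpoint rule
        tau += math.sqrt(1.0 - (v / C) ** 2) * dt
    return tau

# A clock moving at a constant 0.6 c for one coordinate second
tau = proper_time(lambda t: 0.6 * C, 1.0)
print(round(tau, 6))  # -> 0.8
```

The 0.8 s result is the familiar time-dilation factor √(1 − 0.6²) applied to one coordinate second.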

Reference frames

The path extremizing the lapse of proper time 

This concept of observer, apparatus or automatically functioning machine remains ambiguous without the definition of a reference frame adapted to it.    A reference frame is defined by a: 

  • clock which marks Time as a parameter on γ;  
  • spatial frame made of 3 space-like directions identified at each point by space-like curves stemming orthogonally from it.    

While the time direction is uniquely fixed by the vector field u, the spatial directions are defined up to spatial rotations, i.e. transformations which do not change u.   Obviously there are infinitely many such spatial perspectives, whose effect is evidenced by the figure at left side.
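This rotational freedom can be stated compactly (a minimal sketch, under the same signature assumption as above): an adapted spatial triad {e_a} satisfies

```latex
e_{0} = u,\qquad g(e_{a}, e_{b}) = \delta_{ab},\qquad
e_{a}' = R_{a}{}^{b}\, e_{b},\quad R \in SO(3)
```

and any rotated triad e′a defines an equally legitimate spatial perspective, since R leaves u unchanged.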

  A family of alternative courses that a test geodesic could have taken, all of them through the point Q.   f is the fiducial curve.  The phrase “could have taken” has been replaced, after 1990, by the modern “took”; the word “alternative” by “actual”.  The family of geodesics shown features different degrees of convergence toward the right side, or divergence toward the left side (image credit J. A. Wheeler, et al./1973)

Here are the alternative courses a test geodesic could have taken.   The result of a physical measurement is mathematically described by a scalar, a quantity which is invariant under general coordinate transformations.   A scalar quantity, however, is not necessarily a physical measurement.   The latter, in fact, needs to be defined with respect to an observer and, in particular, to one of the infinitely many spatial frames adapted to him.   The aim of the relativistic theory of measurement is to enable one to devise, out of the tensorial representation of a physical system and with respect to a given frame, those scalars which describe specific properties of the system.

Another classic way to understand the meaning of reference frame may be inferred from the figure at right side.   Here five world lines c, d, e, f, g join a single Event A to the future Event B.    Five different histories, meant as just five in the multitude of correct answers to a single question: 

“What is the path to go from the point A in the past to the point B in the future ?”     

A family of alternative courses that a test geodesic took, when going from a point in the past to a point in its future.  The red coloured path e is the one that extremizes the lapse of the affine parameter.  Considering that the graphic refers to spacetime, the affine parameter is the proper time t: for example, the time measured by means of the clock of the particle moving from A to B

“the 'Principle of Democracy of Histories', implies that no path, no history is favoured or more real than any other”

This is also a question constantly answered in Industrial Machinery applications by the multitude of Detectors which let the equipment (i.e., PLCs) operate future actions driven by past information.  With respect to the figure on side, the paths:

  • d, e, f         cross each other;
  • f                 shortest path from A to B. 

Rewriting the previous question as:

“What is the path of a test particle to go from the point A in the past to the point B in the future, minimising the proper time measured by its clock ?”

In red colour, the path f extremizing the lapse of proper time from A to B, with respect to all other world lines.   Two Fourier amplitudes a1, a2 let the world lines (or, histories) c, d, f, g be distinguished from the fiducial world line e.    Then, for any given value of the proper-time affine parameter t [ scaled so that t( A ) = 0, t( B ) = 1 ], each of the histories is distinguished with respect to the fiducial line by its amplitude δa( t ), obtained by:

δa( t )  =  a1 sin( π t )  +  a2 sin( 2π t )

What precedes is a point of view more than one century old.   Fifty years later, Quantum Mechanics introduced the so-called principle of democracy of histories, implying that no path, no history is favoured or more real than any other.   Since then, the only difference between one history and another lies in the phase angle of the complex probability amplitude: 

exp( i S / ħ )

where S is the action computed along the history and ħ the reduced Planck constant.

Meaning that the more than a century old assumption about what path the test particle “could have taken”, has been replaced after 1990 by the modern “took”.
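The democracy of histories can be sketched numerically. In the toy model below (entirely our own construction: a one-amplitude family of histories with a hypothetical quadratic action near the extremum), every history contributes the same unit-magnitude amplitude exp( i S / ħ ); only the phases differ. Histories near the extremal path add up almost in phase, while those far from it cancel, which is why the single classical path appears favoured even though no history is “more real” than another:

```python
import cmath

HBAR = 1.0  # natural units (assumed)

def action(a, s0=0.0, k=40.0):
    """Hypothetical action of the history labelled by Fourier amplitude a:
    stationary at a = 0 (the extremal path), quadratic nearby."""
    return s0 + k * a * a

def amplitude_sum(a_values):
    """Coherent sum of exp(i S / hbar) over a family of histories."""
    return sum(cmath.exp(1j * action(a) / HBAR) for a in a_values)

n = 2001
near = [-0.1 + 0.2 * i / (n - 1) for i in range(n)]   # histories near the extremum
far  = [ 2.0 + 0.2 * i / (n - 1) for i in range(n)]   # histories far from it

# Near the extremum the phases barely vary and the sum is large;
# far from it the phase sweeps many radians and the sum cancels.
print(abs(amplitude_sum(near)) > abs(amplitude_sum(far)))  # -> True
```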

The elementary measurement

It can be a pressure sensor in a squeezer™ Leakage Electronic Inspector, a capacitive sensor in a Weigh Checker for cases or clusters of bottles or cans, a High Frequency fill level inspection, X-rays, or the most daily sight we give to the world when awaking: these are all electromagnetic measurements.  The only non-electromagnetic measurements whose effects we can feel are those happening at other levels, namely:

“A physical measurement is meaningful only if one identifies in a non ambiguous way who is the observer (or, automatically functioning machine) and what is being observed”

The today-classic relativistic point of view on the most elementary measurement. Imagine a source of light moving along the geodesic γ.  It emits a photon at the point A1, reflected back by a mirror at the point P, and reabsorbed by the observer source of light at the point A2.  Such a configuration allows one to establish the spatial distance ζs, along the curve, separating the observer (or his operative models, like the inspections) from the mirror (image credit De Felice, Bini, 2010)   

  • nuclear, like those originating the gamma-ray fill level inspections;
  • gravitational.

For an electromagnetic interaction to be considered a good measurement, two things are necessary:

  1. Time, to transform the previous state, in which all possible kinds of correlation (superpositions) between observer (or his models, like all automatically functioning machines) and the object of the measurement coexist, into a following state in which the observer is aware of being correlated to an object, having recorded eigenvalues for the eigenfunction ΦiS1 describing the object.  The correlation between the two systems is progressively established during the interaction, proportionally to the natural logarithm ( ln t ) of the interaction time t.   An ideal correlation, corresponding to maximised information of the observer about the object, can only be reached by allowing an infinite time. The fact that we cannot wait an infinite time causes the measurements’ fluctuations, a synonym for the spectrum of the eigenvalues, resulting in the Electronic Inspector’s false positives (false rejects). 

  2. Interaction between the systems such that the information in the marginal distribution of the inspected object is never decreased.  Otherwise we could no longer have repeatability of the measurements.  As an example, the instrument used to establish the correlation, for instance an electromagnetic wave in a common photo-electric sensor, should never modify the molecular structure of the object.  Otherwise, it would modify its eigenstates and hence the eigenvalues we expected to derive from the measurement.
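The point of item 1 can be sketched numerically. In the toy model below (our own choice of concrete forms: the text only states that the correlation grows like ln t), the residual spread of the measurement shrinks as 1/ln t, so the probability that a pure fluctuation crosses the reject threshold, a false reject, falls as the interaction time grows but never reaches zero for any finite t:

```python
import math

SIGMA0 = 1.0       # initial measurement spread, arbitrary units (assumed)
THRESHOLD = 2.0    # reject threshold in the same units (assumed)

def residual_sigma(t):
    """Spread left after an interaction of duration t: the correlation
    grows like ln(t), so the spread is modelled as SIGMA0 / ln(t)."""
    return SIGMA0 / math.log(t)

def false_reject_rate(t):
    """Gaussian tail probability that noise alone exceeds the threshold."""
    s = residual_sigma(t)
    return math.erfc(THRESHOLD / (s * math.sqrt(2.0)))

# Longer interaction time -> narrower spread -> fewer (but never zero)
# false rejects
for t in (10.0, 100.0, 10_000.0):
    print(t, false_reject_rate(t))
```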

An example of such an elementary measurement is visible at right side. Its meaning, following the relativistic and today-classic point of view, is the time-ordered sequence where: 

  1. a source of light moving along the geodesic γ,
  2. emits a photon at the point A1,
  3. instantaneously reflected back by a mirror existing at the point P,
  4. reabsorbed by the Observer source of light when this reaches the point A2.  

Such a configuration allows one to establish the spatial distance ζs between observer and mirror, along the curve separating them.  A configuration whose essential idea has been reproduced everywhere for decades: in Industrial Machinery, in Electronic Inspectors, in the inspections of which the Inspectors are a superposition, up to the Observers. In the figure below, as an example, the source of light is fixed with respect to the mirror.  Pallets travel over a Conveyor whose speed v is relayed to an Electronic Inspector in the form of a train of pulses, whose frequency is proportional to the Conveyor’s mechanical speed.  With respect to the pallet, the signal is dual: 

  1. presence, deduced from the interruption of the light beam, corresponding to a measurement of few or no returning photons infeeding the photocell’s phototransistor;
  2. length l, deduced from the number of Encoder pulses received as a digital input by the Electronic Inspector during the conditioning status of pallet presence. 

Our technological example is an approximation of the general case, where the speed of light is finite and the space where the movement happens is 4-dimensionally curved.  The light speed is here visibly assumed infinite, and the evaluation of the spatial distance ζs is aliased by that of the pallet length l.   As seen from the point of view of a pallet in movement, the light barrier here represented moves at the Encoder-measured speed, in the opposite direction.  
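The dual presence/length signal can be sketched in a few lines (a minimal sketch; the 500 pulses-per-metre resolution and the sampling scheme are illustrative assumptions, not any vendor’s specification):

```python
PULSES_PER_METRE = 500  # encoder resolution (assumed)

def pallet_length_m(beam_samples):
    """beam_samples: one boolean per encoder pulse, True while the light
    beam is interrupted (pallet present).  The length is the pulse count
    during the presence window divided by the encoder resolution."""
    blocked = sum(1 for s in beam_samples if s)
    return blocked / PULSES_PER_METRE

# A pallet that keeps the beam interrupted for 600 consecutive pulses
samples = [False] * 50 + [True] * 600 + [False] * 50
print(pallet_length_m(samples))  # -> 1.2
```

Note that the length is measured in encoder pulses, not seconds: the conveyor can speed up or slow down without changing the result.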

  The basic or advanced applications of the Industrial Machinery alias prime concepts of Physics (image credit InterSystem® AB, 2014)

Modern interpretation of measurement

Who or what measures what, where and when ?

The basic measurement mechanism just described corresponds to a classic point of view, replaced for decades now by a completely different one.    Measurements are in general observer-, inspection- or automatically-functioning-machine-dependent.   For this reason, a criterion should also be given for comparing measurements made by different observers, inspections or automatically functioning machines. The relevance of this comparison was discovered nearly one century ago; more than fifty years ago it started to become clear why this dependence exists.   A basic role in this procedure of comparison, implicit in the measurement, is played by the Lorentz group of transformations.   A measurement which is observer-independent is termed Lorentz-invariant, and Lorentz-invariant measurements are of key importance in physics. Why ?   Because the description of a physical system depends both on the observer, inspection or automatically functioning machine and on the chosen frame of reference. In most cases, the result of a measurement is affected by contributions from the:

  • background curvature;
  • peculiarity of the reference frame. 

  The measurement of a physical system depends both on the Observer, inspection or Automatically functioning Machine and on the chosen frame of reference. In this example, the reference frame is the metal “frame” of the micrometer 

Riding geodesics

  Oscillation characteristic values (eigenvalues) are modes of vibration. A visible example is the upper surface of a drumhead, each mode oscillating at its own frequency (image credit J. Kreso, 1998-2010)

As long as it is not possible to discriminate among them, a measurement remains plagued by an intrinsic ambiguity. This ambiguity, whose existence was already clear sixty years ago to Albert Einstein, who created General Relativity, is not accidental.  It hints at the multiple actual courses that a test geodesic can take. And we, our inspections and automatically functioning machines, ride these geodesics. The statistical nature of measurement, hinted at by the video at the start of this web page, is an illusion. No measurement by an observer, inspection or automatically functioning machine is statistical.  Rather, it is strictly deterministic and linear.  

The measurements’ results, e.g. the expected Gaussian distribution of independent measurements of a random variable, are not random at all.  It is the statistical interpretation of measurement itself, over one century old, which is flawed by ...obsolescence.  

In the modern, experimental and theoretical perspective, each measurement corresponds to one of the existing (existing or actual, not merely possible) eigenvalues of the eigenvector including the observer and its generalisations (the inspections or the automatically functioning machines).  An eigenvector whose complete set of values exists only from the point of view of a superior superposition, the sum of all the superpositions.  An existence in the sense that only the superior superposition has all of the knowledge of all of the eigenvalues of all of its eigenvectors.  We observers, and all our operative generalisations like the inspections or the automatically functioning machines, have energy and extension limits. For this excellent reason, quite obviously we can only have an extremely limited perception and memory of the multitude of coexisting eigenvalues.   To clarify this truly fundamental concept, please refer to the figure on right side.  There, each point of the surface of the drumhead is indexed by a triple of (x, y, z) coordinates, where x, y are the common two-dimensional Cartesian coordinates, and z the height (or, depth) of the point.   Each of the infinite points of the surface of the drumhead exists before and after the measurement, even if associated with different values of the height z. There is a multitude (not infinite) of values of the z coordinate which may be associated with each point x, y.    Returning to the differential geometry classic point of view described by the figures above: the sum of all of those courses taken by a test geodesic.   We are speaking of an actual multiplicity, obviously facing the immediate objection: 

                           why do we only see a single measurement result ?       

The animation below, based on an insight by Dieter Zeh dating from 1970, allows one to start answering the question. It shows the fine details of how a measurement happens. Specifically, it shows what happens to a macroscopic System (a cat), initially in a superposition of states, under the effect of the same Environment, in standard conditions of temperature, pressure and humidity, where all Food and Beverage Controls also operate.  It is, visibly, a smooth continuous process, displaying the reconstructed Wigner function of the System, averaged over a 4 ms sliding time-window of recorded data.  Two different phenomena are visible: 

  • fast decay of the quantum interference feature, in the first few milliseconds;
  • much slower evolution of the classical components towards the phase-space origin. 

 A cat, initially in a superposition of states, under the effect of the same Environment, in standard conditions of temperature, pressure and humidity where Food and Beverage Controls also operate.  It is, visibly, a continuous process (credit CNRS, Laboratoire Kastler Brossel)

In the following, it’ll become clear how deeply and constantly these modern subjects intervene in the functioning of the Electronic Inspectors and their measurements (inspections).  Over the past three decades, Decoherence explained why: 

  • certain microscopic objects, commonly named “particles”, seem to be localized in space: in reality, particles do not exist and there are only waves (see figure on right side); 
  • microscopic systems are usually found in their energy eigenstates and therefore seem to jump between them, when in fact there are no quantum jumps;
  • there appeared to exist two contradictory levels of description in physics (classical and quantum), when there is a single framework for all physical theories: the quantum theory;
  • the Schrödinger equation of General Relativity (also named the Wheeler-DeWitt equation), born in 1967, may describe the appearance of Time in spite of being Time-less.  It has been understood that Time does not exist, and what really exists is an arrow of time in the form of a special initial condition.

Electronic Inspection and Quantum Physics: 

a common ground


Inspections can be thought of as finalised measurements or, measurements with a scope, where the scope is the binary classification and eventual rejection of objects.  Whatever the object (bottles, cans, crates, cases, kegs, but also wheels, smartphones, blood samples, glue, pens, gaseous substances, etc.), it is rejected when at least one measured physical quantity results out of a pre-defined range.  Physical measurements are the core of Bottling Controls Technology and operation.  Around the Electronic Inspector there is the decisive factor, and the most often overlooked: the Environment.  Electronic Inspectors (or, Bottling Controls) in Food and Beverage production Lines are collections of assemblies, named inspections, whose components are Optoelectronic devices performing physical measurements of the properties of objects (containers, crates, cases, etc.) and their content, mainly by means of electromagnetic measurements.  In this framework, Triggers are the most basic devices, those controlling the successive actions of physical measurement and eventual rejection in a Binary Classifier.  
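A binary-classifying inspection, “at least one measured physical quantity out of a pre-defined range”, can be sketched in a few lines (the quantity names and limits are illustrative assumptions, not any inspector’s actual parameter set):

```python
# Pre-defined acceptance ranges, one per measured physical quantity (assumed)
LIMITS = {
    "fill_level_mm":     (63.0, 68.0),
    "cap_colour_idx":    (0.9, 1.1),
    "seal_pressure_bar": (2.4, 3.2),
}

def inspect(measurements):
    """Binary classification: return (reject, reasons), where reject is
    True if at least one measured quantity is out of its range."""
    reasons = [name for name, value in measurements.items()
               if name in LIMITS
               and not (LIMITS[name][0] <= value <= LIMITS[name][1])]
    return (len(reasons) > 0, reasons)

reject, why = inspect({"fill_level_mm": 62.1, "cap_colour_idx": 1.0})
print(reject, why)  # -> True ['fill_level_mm']
```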

    The Environment causally connected with our Bottling Controls is wider than the Factory in which they operate. An entire Food and Beverage Bottling Line can be stopped by false rejects whose unique cause, delayed 8.5 minutes, lies 150 million kilometres away: the Sun.  But also, apart from the limit case described here, sun beams act constantly on the productive process, creating a daily cycle which causes apparent spontaneous sensitisations of the Bottling Controls’ inspections, and the opposite effect around twelve hours later 
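The figures in the caption above can be checked in a few lines: light from roughly 150 million kilometres away (more precisely 149.5 million, as given later in the text) needs about 8.3 minutes to reach the Factory, close to the ~8.5 minutes quoted, and a sphere of that radius has a volume of about 1.4 × 10²⁵ km³:

```python
import math

C_KM_S = 299_792.458     # speed of light, km/s
R_SUN_KM = 149.5e6       # Sun-Factory distance, km (from the text)

delay_min = R_SUN_KM / C_KM_S / 60.0
volume_km3 = 4.0 / 3.0 * math.pi * R_SUN_KM ** 3

print(round(delay_min, 1))   # -> 8.3   (minutes of light travel)
print(f"{volume_km3:.1e}")   # -> 1.4e+25  (km^3, the causally connected sphere)
```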

A physical measurement requires a collection of devices, in our case clock systems integrated in the Inspector’s electronics, Encoders, counters, LASER photosensors, CMOS-cameras, High Frequency residual liquid controls, and so on. The operational control of this instrumentation is exercised, in the phase of the Electronic Inspector’s startup and commissioning, by the Field Engineer. 

It is the Field Engineer who calibrates the quantities corresponding to Contractual Agreements about: 

  • what to measure, 
  • how, where and when to perform a measurement, 
  • how to interpret the results. 

Environment, the measurements' domain

The Food and Beverage Bottling Line covers a finite spatial volume, and the measurements last for a finite interval of time, in our case commonly ranging over (0.1 - 20) ms. In general, the measurement’s “domain” is defined as the space-time region in which a process of measurement takes place. The Sun (see figures on right side) is capable of repeatedly stopping, for tens of minutes, an entire Beverage Bottling Line.  How ?   In its most natural way, by means of beams of light, which are one of the two known causative relations (the second being gravitons) between the measurements accomplished in our Factories here and now, and that object 149.5 million kilometres away and ~8.5 minutes in the past.   We’ll cite four different examples, all of them referred to our specific field of application, automated Quality Controls in the Food and Beverage Bottling Lines:  

    The smallest and strictly causally connected Environment is a huge sphere, centered on the Factory, whose volume is 1.4 × 10²⁵ km³

  1. photons in the visible part of the spectrum, due to reflections in the Inspectors’ mirrors, are later amplified, forcing massive false rejects (> 40 %) in Full Bottle and Empty Bottle Inspectors equipped with Vision cameras; 
  2. photons in the visible part of the spectrum, infeeding cap Colour tri-chromatic sensors and later amplified, emulate caps of wrong colour, forcing massive false rejects (> 10 %) in Full Bottle Inspectors equipped with cap Colour inspection; 
  3. thermal photons act on the beverage characteristics, creating a diurnal cycle.  A beverage whose fill level is being inspected with high frequency em radiation appears more underfilled at ~3 PM than at ~5 AM.  Net effect, in front of a single sensitivity setup: huge false rejects at ~3 PM;
  4. thermal photons act on the PET bottles’ characteristics, creating a diurnal cycle.  PET container tension is minimised at ~3 PM and maximised at ~5 AM. A PET container whose sealing (leakage) is inspected by means of a Squeezer Full Bottle Inspector, equipped with inspections for pressure, inductive seal and difference of fill level, may appear defective at ~3 PM and correctly sealed at ~5 AM.  
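Cases 3 and 4 suggest a practical mitigation: instead of a single sensitivity setup, let the setpoint follow the diurnal cycle. The sketch below is hypothetical (the sinusoidal ambient-temperature model and the mm-per-degree coefficient are invented for illustration, not measured values):

```python
import math

def ambient_temp_c(hour):
    """Crude diurnal cycle: minimum around 5 AM, maximum around 3 PM."""
    return 20.0 + 8.0 * math.sin(math.pi * (hour - 9.0) / 12.0)

def fill_threshold_mm(hour, base_mm=63.0, mm_per_deg=-0.05):
    """Lower the apparent-underfill reject limit when the beverage is
    warm, so the same container is judged alike at 5 AM and 3 PM."""
    return base_mm + mm_per_deg * (ambient_temp_c(hour) - 20.0)

# The compensated threshold is stricter at 5 AM than at 3 PM
print(fill_threshold_mm(5) > fill_threshold_mm(15))  # -> True
```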

Classic interpretation

Reading the last two of the four cases above from the classic Physics point of view, we see a common cause (written in italics) for the “measurement anomaly”: 

     3.  the beverage at ~3 PM cannot be inspected for HF fill level like at ~5 AM, because the Environmental conditions are different;

     4.  the PET container at ~3 PM cannot be inspected for sealing like at ~5 AM, because the Environmental conditions are different.

Modern interpretation

Re-reading the last two of the four cases above from the point of view of modern Physics, an ambiguity in the Classic Physics point of view, a cause of approximation, is detected (and tentatively eliminated):      

    3.  the beverage at ~3 PM cannot be inspected for HF fill level like at ~5 AM, because the correlation between Environment and beverage is different;

     4.  the PET container at ~3 PM cannot be inspected for sealing like at ~5 AM, because the correlation between Environment and container (mechanical characteristics) is different.

Out of this sphere there is the much wider heliosphere, where the Sun also acts, preventing the dangerous arrival on the Earth’s surface of the majority of atomic nuclei and electrons flying at relativistic speed, and of high energy photons, gamma- and X-rays.  All these are mere byproducts of the multitude of physical Events happening in clouds and in those gigantic fusion-energy-based reactors collectively named stars.  A small portion of them reaches however the surface and our Machinery, implying one more reason why it is unavoidable to experience measurement fluctuations, also in the presence of standard cables’ shielding.  These examples are not simply extending the radius of the spherical space-time region, the “domain” (or, Environment) in which our measurements take place, well out of the assumed perimeter of the Food and Beverage Factories.    This, because what really performs the inspection function in the over 100000 Electronic Inspectors in Food and Beverage Bottling Factories is, nearly invariably, atoms of Silicon.   Atoms of Silicon in the billions of transistors, themselves part of Integrated Circuits processing signals mainly incoming from CMOS- and CCD-cameras. Several stages of amplification, filtering and comparison of these quantities with parameters are the essence of the inspection process.  A process finalised to Binary Classification, a typical task for Quantum Computers processing qubits rather than bits through a long chain of nonclassic measurement stages. 

Tunnel-effect: a way toward a technological breakthrough 

                   Tunnel-effect diodes sport an extremely high speed of operation, ~ 1 THz (1 terahertz equals 1000 GHz).  This results from the fact that the Tunnel diode only uses majority carriers, i.e., electrons in an N-type material and holes in a P-type material. Minority carriers slow down the operation of a device and, as a result, devices relying on them are slower  

 Negative differential resistance of a Tunnel-effect diode, in a current-voltage graph. The nonlinear and nonclassic feature of the diode is identified in the red coloured negative differential electric resistance, the basis of its impressive speed performance.  The tunnelling effect is inherently very fast..., too fast when we consider that recent thorough testing has determined superluminal speed.  Superluminal if, and only if, the object is a unique instance existing in a single world (image public domain under CC 3.0)

    Prof. Günter Nimtz.  His impressive breakthroughs are confirmed by a number of other independent experiments and theories. Some of them, of Bell-Aspect type about Entanglement, verified up to 30 σ, ...thirty standard deviations ! (image published with permission of Prof. Günter Nimtz, 2014)

   Nimtz and Stahlhofen’s double prism experiment of 2006. Photons can be detected behind the right-hand prism until the gap reaches up to about one meter (image credit Jochen Magnus, 2011)

A practical example of the many logical and experimental threads which led to this modern scenario is given by the non-classic quantum mechanical Tunnel-effect, discovered in 1958 by Esaki.   Below, on the right side, a practical application: the Tunnel-effect diode and its nonlinear characteristic current-voltage curve.  The animation below shows the time-evolution of the wave function of the electrons in the atoms of Germanium building up this electronic component.  Electrons doing what the Classic Laws of Physics consider impossible: passing through a potential barrier.  Heisenberg’s Uncertainty Principle allowed and made sense of this behaviour decades before the experimental discovery.  What was discovered later is that the Tunnelling effect is ...too fast.  

Thorough testing by several Laboratories, the first of them that of prof. Günter Nimtz at Koeln University, Germany (see image on the right side), at different frequencies and for different kinds of particles, allowed the determination of a superluminal speed across the barrier.  

As an example, the graph below shows two microwave pulses at the same frequency of 8.2 GHz travelling through:

                                                    (1)   air  (light violet)

                                                    (2)   a barrier  (dark violet);

the latter traversed the same distance ~ 1 ns faster, a speed of 4.7 c, i.e. nearly five times the speed of light in vacuum (299 792 458 m/s).  
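As a consistency check, the quoted 1 ns advance and 4.7 c effective speed together pin down the path length involved. The short sketch below is back-of-the-envelope arithmetic on the quoted numbers, not taken from Nimtz's papers:

```python
# Back-of-the-envelope check relating the quoted numbers: a pulse that
# arrives dt = 1 ns earlier over the same path, at an effective speed of
# 4.7 c, must have travelled a path of a few tens of centimetres.
c = 299_792_458.0   # speed of light in vacuum, m/s
ratio = 4.7         # quoted effective speed as a multiple of c
dt = 1e-9           # quoted time advance of the barrier pulse, s

# t_air = d/c and t_barrier = d/(ratio*c), so t_air - t_barrier = dt:
d = dt * c / (1 - 1 / ratio)     # implied path length, m
v_eff = d / (d / c - dt)         # effective speed recovered from d and dt

print(f"implied path length ~ {d:.2f} m")
print(f"recovered effective speed = {v_eff / c:.1f} c")
```

The implied path length comes out at roughly 0.4 m, a laboratory-bench scale fully compatible with the microwave setup described.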

  Evolution of the electron wave function through a potential barrier.  The animation renders what lets the Tunnel-effect diode (figures on the right side) be so fast with respect to other components: the Heisenberg Uncertainty Principle.  The central white colour vertical bar is the potential barrier Classic Physics considered impossible to break through.  If we assume the video shows a single electron in a single Universe, then the frequent superluminal propagation of the wave function implies paradoxical situations, synthesized in violations of the basic postulate of Relativity: the existence of a maximum limit speed. Tunnel-effect diodes are simply 'too fast' to exist in a unique instance. What seems to cross the barrier five times faster than light is, when considering that the electron coexists in several interfering branches of a common tree-like structure, with today only a few residual doubts, a side effect of our perspective  

A deviation beyond any possibility of explanation in terms of mere statistical fluctuations, i.e. the fluctuations implicit in all physical measurements, whose true origin we are not detailing here.

 Tunnel-effect hints at an interpretation of the events described by Quantum Field Theory. In the example on the left side, photons crossing air (1) or a barrier (2). When crossing the barrier they reach the opposite side ~ 1 ns before those which crossed the air, i.e. 4.7 times faster than the maximum speed of light in vacuum postulated by Relativity (abridged from G. Nimtz, 2006)

A deviation superluminal if, and only if, the wave packet propagated is a unique instance existing in a single universe, i.e. the classic point of view, dated 1905-1915, of the Special and General Relativity theories.  On the contrary, there is no violation at all of the relativistic postulates (no superluminal propagation) if what we are integrating into our measurement is a multitude of superimposed instances of the same object.  An object whose existence is multiversal, i.e. in several mutually interfering branches, each one an instance of the same object in a slightly different Environment.  This, after the discovery of Decoherence, is the meaning of World.    Wave packets which, because of the:

  • Heisenberg’s Uncertainty Principle; 
  • linearity of the Quantum Mechanics wave functions, which lets the semiconductors switch and amplify;

are irreducibly superimposed multiversal objects, behaving in a way that depends on what is happening elsewhere.  The Tunnel-effect can only be understood within the nonclassic point of view of Quantum Mechanics where, like a dam, the Uncertainty Principle separates the Classic and Modern ideas we have about the physical world.

Massively confirmed evidence

The concept of propagation speed makes sense if Time exists as a fundamental, because the concept of Speed derives from those of Space and Time.  Then, the conundrum is Time.  We saw elsewhere the General Relativity assumption about the time-ordered sequence of submanifolds (slices or, leaves, of the manifold M ) constituting a Foliation, whose details and properties we examined here.   In this framework, what clocks measure is the proper time s along their own worldline γ, maintaining coherence with the General Covariance Principle on which the Relativity theory is based. 
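For reference, the proper time accumulated by a clock along its worldline γ is the standard line integral of General Relativity (a sketch in the usual (-,+,+,+) signature; the notation is the conventional one, not taken from the text):

```latex
s \;=\; \int_{\gamma} d\tau
  \;=\; \frac{1}{c} \int_{\gamma} \sqrt{-\,g_{\mu\nu}\, dx^{\mu}\, dx^{\nu}}
```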


Superluminal motion. To have an idea of the yield of the experimental discovery by Günter Nimtz and the group of researchers led by him, it is necessary to recall that since 1905 all material bodies were assumed to be moving always and only at velocities v < c, measured with respect to inertial reference frames. Velocities keeping them inside the light-cone, whose boundary is represented in the figure above by a dotted line coming out of the origin O(0,0).  In the example, the blue colour represents a body in superluminal motion at velocity v = 4c.  All the surface under the dotted line was considered forbidden to anything, be it matter or radiation.  Visibly, the blue coloured vector displaces itself a distance OB > OA in a time shorter than that necessary to the subluminal, red coloured material body

In 1967 the Wheeler-DeWitt equation was capable of joining the relativistic description of an object by means of the Hamiltonian, where the object is a constructive interference in a sea of destructive ones, with the quantisation of whatever is implicit in Quantum Mechanics:  


                                                     Ĥ Ψ  =  0

and, what is truly relevant, without any reference to Time.   How ?   In brief, the Ψ term above is the superposition of all of the elemental wave functions related to all of the existing wave packets.  Moreover, from the point of view of that superposition, no time evolution exists at all.  On the contrary, the correlated sets of wave packets which are part of the superposition witness the initial-condition effect historically named Time.   


In 1983 Don Page and W. K. Wootters showed how entangled particles could be used in a Quantum Physics test, to see that the time-ordered sequencing (of the relativistic spatial foliation M) is in reality only felt by objects correlated with others through Entanglement.

Entangled couples of particles:

  • notoriously have one of their properties strictly correlated, while remaining unrelated to the Environment, until a measurement is accomplished on one of them by a third party or Environmental Decoherence prevails;
  • continue to share the same small Hilbert space, even when widely separated in the 4-D ordinary space-time;
  • are explicitly cited by the Relative State formulation of Quantum Mechanics.  
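These properties can be made concrete with a small numerical sketch (an illustration of standard quantum mechanics; the state, angles and names below are assumptions, not from the text): the two spin-1/2 particles of a singlet share one small four-dimensional Hilbert space, and their measured properties are strictly correlated, with the CHSH combination reaching the 2√2 value whose violation of the classical bound 2 is what the Bell-Aspect experiments verify:

```python
# Illustrative sketch: strict correlations of an entangled singlet pair.
import numpy as np

sz = np.array([[1, 0], [0, -1]], dtype=float)
sx = np.array([[0, 1], [1, 0]], dtype=float)

def spin(theta):
    """Spin observable measured along an axis at angle theta."""
    return np.cos(theta) * sz + np.sin(theta) * sx

# Singlet state (|01> - |10>)/sqrt(2): one small shared Hilbert space,
# however widely the two particles are separated in ordinary space-time.
psi = np.array([0, 1, -1, 0], dtype=float) / np.sqrt(2)

def E(a, b):
    """Correlation <psi| A(a) (x) B(b) |psi>; QM predicts -cos(a - b)."""
    return psi @ np.kron(spin(a), spin(b)) @ psi

# CHSH combination at the standard angles: |S| = 2*sqrt(2) > 2.
a1, a2, b1, b2 = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(f"E(0,0) = {E(0.0, 0.0):.3f}, |S| = {abs(S):.3f}")
```

Measuring both particles along the same axis gives perfect anticorrelation, E = -1, while |S| = 2.828 exceeds the classical limit of 2.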

Objects related to the Environment feel the effects of Time: Thermodynamics being a relevant example.  


The experiment conceived by Page and Wootters involves Entangled photons.  It was first executed in 2013 by a multinational team guided by Ekaterina Moreva. The figure below shows the Optoelectronics layout of the test, including beam splitters, lenses, filters, plates and LASER light commonly adopted in camera-equipped Electronic Inspectors.  It allowed the formation of an entangled state of the polarization of two photons, one of which is used as a clock to gauge the evolution of the second: 

  • an "internal” Observer that becomes correlated with the clock photon sees the other system evolve;
  • an “external" Observer that only observes global properties of the two photons can prove it is static. 

 Optoelectronics layout allowing the formation of an entangled state of the polarization of two photons, one of which is used as a clock to gauge the evolution of the second.  To an "internal” Observer that becomes correlated with the clock photon the other system appears to evolve, while to an “external" Observer that only observes global properties of the two photons, it looks static (figure credit Moreva, et al., 2013)

  2013 is the year when Time ceased to claim its existence as a fundamental of Physics. The possibility of well defined quantum states entirely free of the notion of Time was first observed in 1967. Later, in 1983, the mechanism inducing the sensation of Time was conceived: a mere effect of Entanglement, only observable by those who live along branches, histories of the Multiverse. The experiment implies measurements so complex and delicate that it was necessary to wait for the technological developments available in 2013 to perform it

Time is an Emergent Property.

The term Multiverse has no relation to the well known “Parallel Universes” made popular by Science-Fiction. “Parallel” means non-interacting, causally disconnected ambients, i.e. no exchange at all of Signals or Energy.  

The Multiverse is nearly the opposite: a superposition of all of the mutually interfering wave packets, corresponding to the wave functions of all objects.  Renamed “Multiverse” to mark the conceptual difference.  

A tree-like structure from our point of view; a multiply-connected object from the point of view of Topology.  Several coexisting instances of each object, each one part of a slightly different (δ  = 1 bit) Environment.

The recently published results confirm the analysis given in 1983 by Don N. Page and W. K. Wootters: Time is an emergent property, deriving from quantum correlations (namely, Entanglement), and not a fundamental of Physics. Then, now that it is established on the dual theoretical and experimental basis that Time is a derived concept of Physics, the superluminal speed of the experiments developed on the Tunnel-effect has to be moved from the rank of paradox to that of unavoidable effect.  We cannot calculate any speed where no Time evolution exists.   Following the Quantum Theory of Measurement, each time a "good measurement” happens (a correlation between two systems and their respective wave functions), a new history branches out of the others. This process was introduced by Everett (1957) and later adopted by many other eminent physicists.   Among them, the Nobel laureates:

  • Richard Feynman, 
  • Steven Weinberg, 
  • Murray Gell-Mann,

and also some of the most brilliant minds of Physics like: 

  • Alan Guth,
  • David Deutsch,
  • James Hartle, 
  • John A. Wheeler, 
  • Stephen Hawking, 
  • Leonard Susskind, 
  • Lev Vaidman, 
  • Avshalom Elitzur, 
  • Yakir Aharonov,
  • Wojciech Hubert Żurek,
  • Dieter Zeh.

    CESIUM ATOMIC CLOCK ON A CHIP.  This innovative atomic clock by NIST is smaller than 8 mm high and 3 mm wide. In that space are contained two radio antennas, a heating coil, and a glass ampule with the element Cesium trapped inside.  Is this device really measuring time ?  It really depends on what is meant by “time”. After 1983 the very idea of time as a fundamental of Physics fell, at least on theoretical grounds. After 2013 came the experimental confirmation that time is a secondary concept, rather than a fundamental one like space (video credit Theodore Gray, Max Whitby and Nick Mann, 2009)

"Time is an emergent property, deriving from quantum correlations and not a fundamental of Physics"

Each “good measurement” establishes a new additional thread along the sum of all of the yet existing histories

The idea is clearly explained in DeWitt (2004, pages 138-144).  That’s why no violation of the light limit speed c exists in the Tunnel-effect: the new Events are observed along a new branch, a new history, and no reference to prior measurements and results makes sense.  For two decades this has been the mechanism conceived as underlying correlations exceeding 30 σ (thirty standard deviations !) in the worldwide Bell-Aspect experiments studying Entanglement.    Also, it allowed us decades ago to understand that the mechanism effectively prevents the causal-violation effects of a hypothetical topologic structure like the Einstein-Rosen bridge (also known as wormholes or Closed-Timelike-Curves, CTCs).  Einstein-Rosen bridges are implicit in the General Relativity theory.  However, until recently the basic idea of CTCs hit hard against the solid wall of philosophical logic and of common sense: can an effect precede its own cause ?   In fact, it really is impossible.   The new concept of Measurement, as a matter of fact, prevents CTCs from creating paradoxical causal violations at all scales:

  • microscopic;
  • macroscopic.

Considering all this, Quantum Mechanics is today backing the coherence of General Relativity, showing that these extremal scenarios of another theory are not in contradiction with its basic assumptions.  The Entanglement idea derives from what Einstein, Podolsky and Rosen in 1935 figured as what at first sight appeared to be a flaw in Quantum Mechanics, one proving at least its incompleteness.  It later resulted that the single-world classic point of view of General Relativity is an approximation.  Each “good measurement” establishes a new additional thread along the sum of all of the yet existing histories. In this framework of correlated, non-interacting systems it is explained (see Everett in eds. DeWitt, et al., 1973, pages 78-83) why and how they are implicit consequences of the Quantum Theory of Measurement, however incomprehensible they may appear as seen from the classic approximation.  The two figures below synthesise the situation:

  • left side, the modern paradigm, where Time does not exist any more.  Measurements are a natural process continuously happening, and each possible measurement result is actual: the starting point of a new branch of the general history;
  • right side, the classic point of view, today disproved by theory and experiments. An initial condition is perceived like Time by apparatuses and Observers in some of the branches, which have no information about the content of the other branches of the history.  Time is, in reality, a proven effect of the Entanglement condition of apparatuses and Observers.

                   Modern                                                        Classic









Large scale effects of a change of paradigm 

about the meaning of Measurement

  Evolution of a defined volume Γ(t)  in the phase space.  The region Γ(t) represents the information we have about a system at three distinct and successive times t = 1, 2, 3.   Visibly, the information we have does not increase. Liouville’s Theorem holds its full validity, including at those mesoscopic and macroscopic space-time scales where the Optoelectronic devices in the Electronic Inspectors are sensitive (abridged from Susskind, 2005)

  Time t evolution of a region x.  y(t, x) is its Liouville function. The spreading of the Liouville function with time is made evident by the fact that the graphic is not isometric: 1 cm on the y axis equals 4 cm on the x axis.  The Liouville function and its underlying phase-space concepts operate everywhere: we’ll encounter it again as a useful tool to evaluate what reject rates may be expected after changing an inspection's sensitivity, given the initial conditions


We saw here that in the Electronic Inspectors the simplest measurement subsystems, named Triggers, are constantly labelling objects (or, applying an identification to them) according to their position in space-time, coarse graining (enclosing) them into macroscopic Shifting-Register cells.  We have shown how an Event can be associated to the label of a single slice (or, leaf) in a vast foliation.  There is no fundamental distinction between measuring devices and other physical systems.  What Triggers differentiate are the identities of the objects.  Triggers are the most elementary kind of inspection which can be conceived in a Bottling Control. A measurement is a special case of interaction between physical systems, an interaction which has the property of correlating a quantity in one subsystem with a quantity in another.  With reference to the formalism of the modern version of the Principle of Superposition presented here, it becomes clear that an interaction is a measurement (pages 54-60) if it satisfies three requirements:

  1. the measurement is associated to a fixed interaction H between systems;
  2. the measured quantities are the limit of the instantaneous canonical operators, as the time goes to infinity.  Our common measurements, subject to finite times, are approximate and approach exactness as the time of interaction increases indefinitely;
  3. if the fixed interaction H is to produce a measurement of A in S1 by B in S2 then H shall never decrease the information in the marginal distribution of A.  In other terms, if H is to create a measurement of A by correlating it with B, we expect that a knowledge of B shall give us more information about A than we had before the measurement took place since, otherwise, the measurement would be useless. 
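The third requirement can be illustrated with a toy classical sketch (an example of ours, not from the text): before a measurement-type interaction, the apparatus B carries no information about the system A; after a correlating interaction, knowing B gives a full extra bit about A:

```python
# Toy sketch of requirement 3: a "good measurement" correlates apparatus B
# with system A, so knowing B afterwards tells us more about A than before.
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(joint):
    pa = joint.sum(axis=1)
    pb = joint.sum(axis=0)
    return entropy(pa) + entropy(pb) - entropy(joint.ravel())

# System A: a bit with a 50/50 prior. Apparatus B starts in a fixed state,
# so before the interaction A and B are independent.
p_before = np.array([[0.5, 0.0],
                     [0.5, 0.0]])   # rows: A = 0/1, columns: B = 0/1
# A measurement-type interaction (a classical CNOT here) copies A into B.
p_after = np.array([[0.5, 0.0],
                    [0.0, 0.5]])

print("I(A;B) before:", mutual_information(p_before))  # 0 bits
print("I(A;B) after: ", mutual_information(p_after))   # 1 bit
```

A knowledge of B now gives one full bit of information about A; an interaction leaving the mutual information at zero would, as the text says, be useless as a measurement.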

The last requirement, on first sight obscure, regards the fact that the interaction H:

  • cannot decrease the information content of the distribution of the measured object A; 
  • has to increase the information we had about A before we accomplished the measurement;

is the choice which lets the conservation of Probability hold true, the only choice which makes statistical deductions possible: Liouville’s Theorem.  A classic concept redressed in new words.  Its full comprehension needs an aid to intuition, an aid which may come from the following two figures, referring to cases with two and three dimensions.  In Mechanics and Thermodynamics there is a precise meaning in saying that Information is never lost by a closed system.  Please refer to the two figures below, showing in two and three dimensions the time evolution of a defined volume in the phase space. The volume of the region Γ(t) represents the information we have about a system at three successive times t = 1, 2, 3.   Visibly, the information does not increase.  Liouville’s Theorem holds its full validity also at those mesoscopic and macroscopic space-time scales where the Optoelectronic devices in the Electronic Inspectors are sensitive. 

This is the origin of the third constraint appearing above, about the fact that Information cannot decrease in A but has to increase the Information we had about A before the measurement; in practice, the origin of the Second Principle of Thermodynamics.
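A minimal numerical sketch of this conservation (assuming a harmonic oscillator with unit mass and unit frequency; all names are illustrative): the phase-space flow is a rotation of the (q, p) plane, so the area of the region Γ(t) is exactly preserved at the successive times t = 1, 2, 3:

```python
# Sketch of Liouville's Theorem for a unit harmonic oscillator: the exact
# time evolution in phase space is a rotation, which preserves area.
import numpy as np

def evolve(points, t):
    """Exact harmonic-oscillator flow: rotation in the (q, p) plane."""
    rot = np.array([[np.cos(t),  np.sin(t)],
                    [-np.sin(t), np.cos(t)]])
    return points @ rot.T

def shoelace_area(poly):
    """Area of a polygon given its vertices, via the shoelace formula."""
    x, y = poly[:, 0], poly[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

# A unit square of initial conditions in phase space: area = 1.
gamma0 = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
for t in (1.0, 2.0, 3.0):
    print(f"t = {t}: area of Gamma(t) = {shoelace_area(evolve(gamma0, t)):.12f}")
```

The region Γ(t) changes shape and position, but its area, i.e. the information we hold about the system, stays exactly constant.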

Mesoscopic scale: 

an arena for experimental verifications

In 2010, objects of the size of a visible hair (~ 0.1 mm) started to be spotted existing in two separate places, their separation measured with a scanning electron microscope. In 2012, one more counter-intuitive result, already envisaged by Quantum Mechanics, was experimentally detected for the first time: LASER photons spontaneously jumping backward, rather than proceeding forward, in a crystal lattice.  

 Setup used in 2001 by Zeilinger et al., which allowed the verification that macromolecules too display their properties fully in the non-classic domain of Quantum Mechanics and respect the Heisenberg Uncertainty Principle. Fullerene C70 is an allotropic form of Carbon with 70 atoms (figure abridged from Zeilinger, et al., 2001)

And this, not in the atomic or subatomic realm where Quantum Mechanics was supposed to act alone.  Rather, on the same mesoscopic scale as the semiconductor junctions in the phototransistors with which nearly all of the photo-sensors used as Triggers by most Beverage Bottling Controls are equipped.  A new paradigm is arising, one whose powerful fruits are commercial applications as different as Quantum Computers, simultaneously parallel-processing in several other Universes, now used by companies like Google, Inc. or Lockheed Martin Corp., and academic research sponsored by the Society of Lloyd’s as early as 2007.  Why should these private companies be investing money in something, at first sight, seemingly theoretical ?   The answer, in some way, is related to the discovery of Decoherence.

 Close up on the molecule of Fullerene C60, an allotropic form of Carbon with 60 atoms.  Visible are the isosurfaces of ground state electron density.  Discovered in 1990, it was the first molecule used after 1993 to test Decoherence at the mesoscopic scale of dimensions.  The Van der Waals diameter of the C60 molecule, accounting also for the thickness of the electronic clouds around the nuclei, is 1 nm

Close-up on Decoherence

"Decoherence:  process that classicalizes a quantum phenomenon, so that its former wavy character disappears"

  Mesoscopic scale of dimensions.  Individual Tungsten atoms directly sighted in this 2008 image of the tip of the sharpest Tungsten needle in existence. The small central red colour dot is an atom: ~0.30 nm its visible diameter. The image was obtained by means of a Field Ion Microscope. The brighter red colour lines are an effect of smearing due to the atoms' displacements along the 1 second long exposure (image abridged from Silverman, 2008)

Triggered Events lie in the space-time and energy boundaries separating the fields of application of Classic and Quantum Physics.  We all agree that macroscopic objects are composed of collections of microscopic ones, like molecules and atoms.  As a consequence, Triggered Events are finely rooted there, at the microscopic scale of distances.  But we are all convinced we see individual Events referred to individual Objects. The Objects to which we are referring are always and only human-, space-, time- and energy-proportioned objects.  No one, in the absence of instruments, is capable of discerning:

  • grains of dust whose diameter is < 0.01 mm;
  • Events separated by a time interval < 0.001 s;
  • radiant energy < 0.1 nJ;      

because our eyes, and the neuronal system supporting their function, are not biologically developed for that.   But, in the end, these limits do not matter that much.   Since 1985 the factors are known which let the multitudes of paths described above, fruit of Feynman’s own intuition, be reduced to the individual alternative we are experiencing. An introductory definition for this modern concept is: the process that classicalizes a quantum phenomenon, so that its former wavy character disappears.  Decoherence is the theory of universal Entanglement: it does not describe a distortion of the system by the Environment, but rather a change of state of the Environment by the system.  The Environment includes a multitude of air molecules and photons, mainly at thermal frequencies. Imagine a macroscopic body as massive as a Bowling Ball, like that in one of the figures below on the right side.  In this case, the scattering of photons or atoms off such a macroscopic object, or even off very small dust particles or macromolecules, causes no recoil.  But this inefficiency of the measurement is overcompensated by the multitude of scattering Events occurring in our daily-life and industrial Environmental conditions, even along small time intervals, say:

  • atmospheric pressure:   ~1 bar;
  • temperature:                   (-30 - 60) °C;
  • relative humidity:            (0 - 100) %.
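How a multitude of individually ineffective scattering events destroys coherence can be sketched numerically. The per-event coherence loss and the event rate below are loud assumptions chosen only for illustration, not measured values:

```python
# Illustrative sketch: each single scattering event barely disturbs a
# macroscopic body, yet the sheer number of events per second suppresses
# the off-diagonal (coherent) terms as exp(-eps * N * t).
import math

eps = 1e-12        # assumed coherence loss per single scattering event
N = 1e21           # assumed scattering events per second (air + photons)
for t in (1e-12, 1e-9, 1e-6):
    coherence = math.exp(-eps * N * t)
    print(f"t = {t:.0e} s -> remaining coherence {coherence:.3e}")
```

Even with these cautious numbers, the wavy character of a macroscopic body is effectively gone within a microsecond, which is why superpositions are never directly seen at the scale of bottles and Machinery.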

Young’s experiment with and without air 

The two figures below represent the Young double-slit experiment with and without molecules of gas (air) interposed along the paths of the electrons.  Two completely different distributions of the hits are counted on the screen behind:

  1. on left side, in a vacuum;
  2. on right side, with the gas molecules of air.  

Decoherence is what prevents us from seeing all objects in their superimposed reality.  It is mainly due to the collisions between the molecules of air and the electronic clouds around each atom.  These collisions carry away the phase correlations between the histories in which the electron arrived at several other points.  The next two sections shall detail what is detected with and without the interposed gas.   It will be recounted how the Decoherence phenomenon, discovered in 1970, was hiding from direct sight the true, constantly happening behaviour of matter and radiation, first detected only in 1989.  
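The two distributions can be sketched numerically (geometry, wavenumber and names below are illustrative assumptions, not the experimental parameters): with full coherence the two path amplitudes are summed before squaring, producing fringes; with Decoherence the cross term is lost and only the sum of the two single-slit intensities survives:

```python
# Sketch: two-slit intensity with coherence (fringes) and with the cross
# term removed by Decoherence (smooth envelope). Units are arbitrary.
import numpy as np

y = np.linspace(-5, 5, 1001)        # position on the screen
k, d = 10.0, 1.0                    # wavenumber and slit half-separation (assumed)
r1 = np.sqrt(1 + (y - d) ** 2)      # path length from slit L to each point y
r2 = np.sqrt(1 + (y + d) ** 2)      # path length from slit U
psi1 = np.exp(1j * k * r1) / r1
psi2 = np.exp(1j * k * r2) / r2

I_vacuum = np.abs(psi1 + psi2) ** 2              # coherent: interference fringes
I_air = np.abs(psi1) ** 2 + np.abs(psi2) ** 2    # decohered: cross term gone

centre = slice(400, 601)            # central part of the screen
def visibility(I):
    """Fringe visibility (max - min) / (max + min) near the centre."""
    return (I[centre].max() - I[centre].min()) / (I[centre].max() + I[centre].min())

print(f"visibility in vacuum: {visibility(I_vacuum):.2f}")
print(f"visibility with air:  {visibility(I_air):.2f}")
```

In vacuum the visibility is close to 1 (sharp fringes); with the cross term destroyed, the pattern reduces to a smooth envelope with almost no contrast, matching the two figures.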

  Thomas Young's double-slit experiment, on the left side in vacuum and on the right side with air along the paths of electrons emitted by lamp filaments. Counting the hits on the screen, two completely different distributions arise, a difference due to Decoherence.  Decoherence is mainly due to the collisions between molecules of gas and the electron, carrying away the phase correlations between the histories where the electron arrived at point y on the screen by passing through the L slit and those histories where the electron arrived at point y on the screen passing through the U slit

“...There are only waves and, knowingly, waves are superpositions of other waves."

1.  Double-slit in a Vacuum

Many of the ideas about the concepts of measurement, space, matter and radiation in today’s academic journals originate from ideas published, or anyway circulating, decades ago.  There are reasons why this is happening.  No theory can ignore the experimental evidence, and the latter is constantly improved.  And there are some special experiments, like Thomas Young’s, first accomplished centuries ago and providing strong clues about the reality of everything, which had to wait centuries.  The Nobel laureate Richard Feynman imagined this decades ago when he called it: "a phenomenon which is impossible ... to explain in any classical way, and which has in it the heart of quantum mechanics. In reality, it contains the only mystery of quantum mechanics”.  Feynman was writing about the interference fringes appearing in the double-slit Young's arrangement when many electrons were fired simultaneously.  "Many simultaneous" means a high probability of interference fringes due to the superposition of several electrons on the screen, something absolutely expected and normal.   But when, later, Young’s experiment was first performed with individually fired electrons, the surface of something much bigger than the wavelike properties of electromagnetic radiation was touched.


"Field emission:

electrons are emitted from a very sharp tungsten tip (thinner than 1/1000 mm) when a potential of (3 – 5) kV is applied between the tip and a first anode ring; this effect is known as field emission"

It happened in 1989. A group at Hitachi's Advanced Research Laboratory (Tokyo, Japan), led by Akira Tonomura, developed the first double-slit system allowing observation of the build-up of the fringe pattern with a very weak electron source: single electrons fired one by one toward a double slit. What results is unimaginable.   The figure below shows a schematic representation of the modifications that Tonomura made to a Transmission Electron Microscope to develop his experimental setup. Electrons are emitted from a very sharp tungsten tip (thinner than 1/1000 mm) when a potential in the range (3 – 5) kV is applied between the tip and a first anode ring; this effect is known as field emission.  Assorted Optoelectronics within the modified electron microscope attenuates and focuses the electron beam.  The hits are fine-detailed by the four figures on the right side.  

                          Time = 1

                          Time = 2

                          Time = 3

                          Time = 4

   Common sense defies our imagination when trying to figure out how these images could truly exist. Thomas Young’s experiment was first made with light waves two centuries ago. But for a century it has been observed that firing massive particles like electrons or neutrons, on a one-by-one basis, creates the same wave-like pattern.  And the same is true when firing massive mesoscopic molecules composed of hundreds of atoms, on a one-by-one basis: the same wave-like pattern. Each object, fired after the preceding one has hit the screen, contributes to an interference shape on the screen, meaning that mesoscopic bodies too are, when closely looked at, waves.  Moreover, either electrons have now to be supposed to have their own brain (and surely they have not), or there is only one theory capable of explaining how they can know which path was chosen by each of the electrons fired before. The only theory with the necessary explanatory power is the Relative State formulation of Quantum Mechanics, also named the Many-Worlds Interpretation of Quantum Mechanics (images abridged from Amelino-Camelia, Kowalski, 2005)

Missiroli experiment

  Schematic representation of Tonomura's team experiment. Their 1989 single-electron double-slit experiment (image abridged from Prutchi, et al., 2012)

These show how, hit after hit, what at the start looks like mere noise develops in the end into a wave-like pattern.  The bright spots begin to appear here and there at random positions: these are the electrons’ constructive wave packets, detected one by one and looking like particles.  These electrons were accelerated through 50000 V, and therefore their speed is ~40 % of the speed of light, i.e. ~120000 km/second.  These electrons could go around the Earth three times in a second.  They pass through a one-meter-long electron microscope in 1/100 000 000 of a second. The De Broglie wavelength for the accelerated electrons is λ = 0.0055 nm.  
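The quoted figures can be cross-checked with a short relativistic computation (a sketch with rounded constants; the variable names are ours): 50 keV of kinetic energy gives β ≈ 0.41 and λ ≈ 0.0054 nm, in good agreement with the ~40 % of c and λ = 0.0055 nm quoted above:

```python
# Relativistic cross-check of the 50 kV electron numbers quoted in the text.
import math

mc2 = 511.0              # electron rest energy, keV (rounded)
hc = 1.23984             # Planck constant times c, keV * nm
c_kms = 299_792.458      # speed of light, km/s

KE = 50.0                # kinetic energy from 50 kV acceleration, keV
gamma = 1 + KE / mc2     # Lorentz factor
beta = math.sqrt(1 - 1 / gamma ** 2)
pc = gamma * beta * mc2  # relativistic momentum times c, keV
lam_nm = hc / pc         # De Broglie wavelength, nm

print(f"speed = {beta:.3f} c = {beta * c_kms:,.0f} km/s")
print(f"De Broglie wavelength = {lam_nm:.4f} nm")
```

The non-relativistic formula would be off by a few percent here, which is why the relativistic momentum is used.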

Classically, interference fringes would be produced only if two electrons passed through the two sides of the electron biprism simultaneously. If there were two electrons in the microscope at the same time, such interference might happen. But this cannot occur, because there is never more than one electron in the microscope at one time, since only ten electrons are emitted per second. Yet, when a large number of electrons is accumulated, something like regular fringes begins to appear in the perpendicular direction.  Clear interference fringes can be seen in the last scene of the experiment, after 20 minutes. It should also be noted that the fringes are made up of bright spots, each of which records the detection of an electron. Each individual hit on the screen does not resemble anything interferential; rather, it hints at a corpuscular character of the objects.  Although the electrons were sent one by one, interference fringes could be observed.  Interference fringes are expected only when electron waves pass on both sides of the electron biprism at the same time, and nothing other than this. Whenever electrons are observed, they are always detected as individual particles. When accumulated, however, interference fringes are formed.   Please recall that at any one instant there was at most one electron in the microscope.    We remark that these figures are what is detected on the screen after it has been hit by material particles, like molecules, atoms, neutrons, or electrons.  We are not speaking of objects considered wavelike for centuries, like light (photons). This confirms that, in reality, no material particle exists at all, rather only waves with different properties like energy, spin, etc.   The interpretational bifurcation reached in 1982 after Alain Aspect’s group experiment on Entanglement: 

  1. Bohm-De Broglie's interpretation explains it, but only at the unacceptable price of postulating that light (and gravitational interaction) does not define the limit speed for all causal correlations: it needs to introduce tachyons; 
  2. the Copenhagen interpretation of Quantum Mechanics has no explanation for what was registered during this experiment;
  3. Everett’s Relative State formulation, since the start, included exactly what is observed.
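The build-up described above, individual corpuscular hits slowly accumulating into fringes, can be sketched numerically. The geometry below (effective slit separation, screen distance, envelope width) is purely illustrative and is not Tonomura's actual biprism setup; single hits are sampled one by one from a two-slit intensity distribution:

```python
import math
import random

random.seed(0)

# Hypothetical, illustrative geometry (NOT Tonomura's biprism parameters):
lam = 0.0055e-9      # de Broglie wavelength, m (value quoted in the text)
d = 1e-6             # effective slit separation, m
L = 1.0              # distance to the screen, m

def intensity(x):
    """Two-slit intensity with a Gaussian single-slit envelope."""
    fringe = math.cos(math.pi * d * x / (lam * L)) ** 2
    envelope = math.exp(-(x / 2e-5) ** 2)
    return fringe * envelope

# Discretise the screen: -50 to +50 micrometres in 0.1 um steps
xs = [i * 1e-7 - 5e-5 for i in range(1001)]
weights = [intensity(x) for x in xs]

# Detect electrons one by one: each hit lands at a single random position
hits = random.choices(xs, weights=weights, k=50_000)

# Accumulate the hits: maxima of the pattern fill up, minima stay almost empty
counts = {}
for x in hits:
    counts[x] = counts.get(x, 0) + 1
```

With a handful of hits the screen looks like noise; only the accumulated histogram reveals the fringes, mirroring what the video shows over 20 minutes.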

 The Clifford torus is a 4-dimensional object, of which we are forcedly seeing a projection here.  We have developed fewer than 100 billion neurons: not enough to perceive events happening on an arena whose spatial dimensionality is, as a minimum, 4.  What we see in the video on the side is never what the Clifford torus really is and looks like from a tetradimensional point of view we shall never have.  As a consequence, the details of the branching superpositions of states, corresponding to Measurements and looking like bifurcations, can be finely followed, but only mathematically. 

re-proposed itself in 1989, this time with far more accumulated experimental evidence.  The interference fringes, visible above on the right side, always develop, also for individual particles: even when each molecule, atom, neutron, proton or electron is fired only after the previously fired particle has already hit the screen.  The experiment has been repeated with material bodies of progressively increasing size and mass: no longer in the domain of particles, rather in the mesoscopic scale close to human direct unaided sight.  Mesoscopic macromolecules including several hundreds of atoms, appearing like a small grain of dust well visible with common microscopes, were also tested, without any change in the final result.  There are only waves and, knowingly, waves are superpositions of other waves.  Otherwise, neutrons would have to be supposed to have a brain, because each seems to know what path was chosen by every neutron fired before it.

Physics has had since 1957 a unique theory with the explanatory power for the images on the right side and, what is more important, it is not an ad-hoc one born to explain the Tonomura experiment, because it was conceived decades before Tonomura's result.  This interpretation (“The Many-Worlds Interpretation of Quantum Mechanics”), in brief, explains that all objects are waves, all superimposed and part of one last grand superposition.  Thomas Young’s experiment with matter has been extensively repeated worldwide after 1989, reconfirming the validity of the interference fringes on the side.  Today there also exist cheap and valuable optoelectronics-plus-software kits which, connected to a computer, allow anyone to witness and register single-electron interference in the two-slit configuration.
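That interference lives entirely in the superposition can be verified in a couple of lines: the intensity of a sum of two amplitudes differs from the sum of the two intensities by exactly one cross-term. The amplitudes below are arbitrary illustrative values:

```python
import cmath

# Two hypothetical single-path amplitudes at the same screen point
psi1 = 0.6 * cmath.exp(1j * 0.3)     # path on one side of the biprism
psi2 = 0.6 * cmath.exp(1j * 2.1)     # path on the other side

# Intensities: |psi1 + psi2|^2 is NOT |psi1|^2 + |psi2|^2 ...
separate = abs(psi1) ** 2 + abs(psi2) ** 2
together = abs(psi1 + psi2) ** 2

# ... the difference is the interference cross-term 2*Re(psi1 * conj(psi2)),
# the part that builds the fringes and that decoherence destroys
cross = 2 * (psi1 * psi2.conjugate()).real
assert abs(together - (separate + cross)) < 1e-12
```

When decoherence suppresses the cross-term, only the classical sum of intensities remains, which is precisely the no-fringe, particle-like pattern.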

2.  Double-slit with air

On the opposite side, a decoherent set of histories is one for which the quantum mechanical interference between individual histories is small enough to guarantee an appropriate set of probability sum rules, as represented by the bell-shaped distribution observed above, on the right side.  It is the continuous measurement of an object by all other objects, in our industrial Environment mainly the molecules of that mix of N2, O2, Ar and CO2 we name air, under the permanent bath of light, which is the reason why we do not see simultaneous alternative Triggered Events.  In other terms, the Environment induces a superselection, separating into two or more subspaces the (Hilbert) space where objects really exist.  Around one century ago, what precedes induced the then mainstream interpretation of Quantum Mechanics (the Copenhagen school) to establish a Wave-Particle Duality which further experiments, improved technologies and theories showed to be a mere illusion.  For decades it has been clear that it is Decoherence which lets us perceive as a unique status what is a superposition of states, and that all is:

  Evolution of the coherent history of the wave function Ψ: two branches Ψ(1) and Ψ(2), separately evolving into five branches Ψ(1,1), Ψ(1,2), Ψ(2,1), Ψ(2,2) and Ψ(2,3), which evolve in turn into seventeen further branches.  Such scenarios are cited very frequently in today's scientific literature; a number of technological facts encounter there their only explanation (figure adapted from Zurek, Riedel, Zwolak, 2013)

“Why do we only experience individual sharp superpositions, single bowling balls, rather than multitudes ?”.  

Because all the others get damped out by decoherence before we have the time to observe them

  • waves;
  • branching superpositions of waves.

Decoherence speed

Imagine a physical object as heavy as a bowling ball, in an Environment at standard conditions of temperature, air pressure and humidity.  It is a superposition of a multitude of possible correlations between its elementary components (quarks, gluons, leptons, etc.) and all the others building up what we name Environment.  Its even and odd components have equal classical parts but opposite quantum interferences.  The Laboratoire Kastler Brossel of the French CNRS processed such an object by subtracting their Wigner functions, thus isolating the interference feature displaying their quantumness.  The result is the evolution of this signal over 50 ms, exhibiting, due to Decoherence, a fast decay after only a few milliseconds of the original pure interference pattern which represented the physical object.   

  The magnesium fluoride anti-reflective multicoating, well visible by its pink colour on this camera, is a practical example of quantum interference.  Here, destructive interference is used to increase the Signal-to-Noise ratio of the Information contained in the image

Bowling balls decohere in an extremely short time, explaining our sensation of their unique existence in a definite place   

How fast Decoherence happens has been known for two decades.  The Table below, showing the Decoherence Rates (or Localization Rates) expressed in units of cm-2 s-1, represents three cases, representative of objects at the micro-, meso- and macroscopic scales:

  • an electron, not bound to any atom;
  • a dust particle, at the limit of unaided-eye visibility;
  • a bowling ball.



  Classic or non-classic behaviour of objects as an effect of Decoherence by the surrounding environment.  Tabular values show the Localization Rate, expressed in units of cm-2 s-1, for the centre-of-mass of three different objects, in order of decreasing strength.  The Localization Rate measures how fast interference between different positions disappears, for distances smaller than the wavelength of the scattered objects.   Shown here are the localization rates for 3 cases: an electron not bound to any atom, a dust particle and a bowling ball.  Our operative ambient conditions are met at a thermal background temperature of ~300 K (~27 ºC) in air at a pressure of 1 atm, implying extremely high localization rates for an object the size of a bowling ball, but fourteen orders of magnitude smaller for an object as small as an electron (table adapted from Tegmark, 1993)
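The meaning of a Localization Rate can be made concrete with a short sketch: coherence between two positions a distance dx apart is suppressed on a timescale tau ~ 1/(Λ·dx²). The rates below are order-of-magnitude placeholders chosen only to reproduce the fourteen-orders-of-magnitude gap quoted in the caption; they are not the table's actual values:

```python
def decoherence_time(localization_rate_cm, separation_cm):
    """Timescale over which spatial coherence across a given separation
    is suppressed: tau ~ 1 / (Lambda * dx^2), with Lambda the localization
    rate (cm^-2 s^-1) and dx the separation of the superposed positions (cm)."""
    return 1.0 / (localization_rate_cm * separation_cm ** 2)

# Hypothetical placeholder rates (NOT the values of Tegmark's table),
# separated by the fourteen orders of magnitude cited in the caption:
Lambda_electron = 1e8       # cm^-2 s^-1, weak coupling to air and light
Lambda_bowling  = 1e22      # cm^-2 s^-1, fourteen orders of magnitude larger

dx = 1e-4                    # 1 micrometre separation, expressed in cm

tau_e = decoherence_time(Lambda_electron, dx)   # seconds
tau_b = decoherence_time(Lambda_bowling, dx)    # 1e14 times shorter
```

Whatever the absolute numbers, the quadratic dependence on dx shows why macroscopic superpositions over visible distances vanish essentially instantaneously.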

 All measurement equipment, controls and machinery are themselves superpositions of waves, like the photon wave packet depicted above.  Particles do not exist at all; the reason we still name these waves that way is historical. 

 The measurement of a physical property by a macroscopic instrument, e.g. by an inspection in a Bottling Control, is subject to additional limits.  Only the properties of the objects S and of the environment E inside the causally connected (blue-coloured) volume of space, part of the future lightcone of the Event M (the Measurement), shall be decohered.  This means that the macroscopic measurement device A continues to remain in a superposition of states with respect to all that lies in the causally disconnected, red-coloured, space (image adapted from R. Bousso, L. Susskind, 2012)

We shall now be more precise about how our operative ambient conditions [thermal background temperature ~300 K (~27 ºC) in air at a pressure of 1 atmosphere] imply extremely high localization rates for an object the size of a bowling ball, but fourteen orders of magnitude smaller for an object as small as an electron.  The effect of air, or of any other surrounding substance, and of the black-body radiation from the surroundings is strongly temperature dependent (typically as a power of T), and can hence be reduced by nine orders of magnitude by working at liquid-Helium temperatures or, to a smaller extent, by taking advantage of a Peltier-effect cell.  This is exactly the strategy followed when looking for maximum performance from optoelectronic devices, first of all CCD sensors.   Compared with conditions of minimised scattering, such as the exposure of these objects to the cold ambient with only the cosmic background radiation, at a temperature of ~3 K (~ -270 ºC), under our everyday conditions the macroscopic bowling ball decoheres in a time 10^28 times shorter.


These studies allowed us to answer the main question arising after 1957:  

“Why do we only experience individual sharp superpositions, 

single bowling balls, rather than multitudes?”  

… because all the others get damped out by Decoherence before we have the time to observe them.  

Decoherence limits

Decoherence is a process subject to the same fundamental limits seen at the start of this page with respect to the volumes of space causally connected with the Trigger Events, a subject we shall deepen in what follows.   Refer to the figure below, where:

  • blue coloured,   3-D volume (encoded in 2-D for graphic rendering) of space hosting the environmental factors E (e.g. air, thermal photons) and the quantum system S;
  • red coloured,     3-D volume causally disconnected;
  • M,                      Trigger Event;
  • S,                       quantum system interacted with (“measured”) by the Trigger;
  • A,                       macroscopic measurement instrument, e.g. a Trigger photosensor.

The majority of space is always causally disconnected.  The worldline of the macroscopic measurement instrument A (the Trigger) is indicated as a bold inclined black line, inclined with respect to the Time axis to reflect the fact that the instrument sits on a non-inertial platform.  What precedes has an interesting implication: the macroscopic measurement device A (the Trigger, in our case) continues to remain in a superposition of states, not decohered, with respect to all that is far enough to be causally disconnected: everything that, in the figure below, lies in the red-coloured space.  The reduced volume that we consider the Environment of a Food and Beverage Bottling Line assures that all triggerings and measurements are always derived from interactions with decohered states.  As the figure below makes visible, this assumption is a fiction, useful to simplify an extremely complex relation.  Like all fictions, it cannot provide definitive solutions nor improvements on hard-to-tackle technical issues.

This subject is reminiscent of the Problem Solving method that searches for the root cause of a problem centred on the point where the effects are felt: a strategy commonly followed when an evident cause cannot be found in the space-time volume we chose to consider the connected Environment.  In these cases, we search for all the thinkable cause-effect relations:

  • in progressively larger volumes of space;
  • going backward in time. 

The answer of modern Physics to this point is dual, stating that:

  • all subsystems are in a superposition of states, as seen by all the other subsystems that are causally disconnected because too far away;
  • all that lies in the future lightcone of the measurement Event M is related to M and gives rise to effects on S + A conditioned by the Environment E. 
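The causal-connection criterion underlying this dual answer is just the lightcone condition: an event can influence another only if a signal at speed at most c can span their spatial separation within their time separation. A minimal sketch (the 10 ns / 1 m numbers in the example are hypothetical):

```python
C = 2.998e8  # speed of light, m/s

def causally_connected(dt_s, dx_m, c=C):
    """True when the second event lies inside the future lightcone of the
    first: the cause must precede the effect (dt >= 0) and a signal at
    speed <= c must be able to cover the distance: (c*dt)^2 >= dx^2."""
    return dt_s >= 0 and (c * dt_s) ** 2 >= dx_m ** 2

# A Trigger Event M and a photosensor A one metre apart (hypothetical numbers):
print(causally_connected(1e-8, 1.0))   # True: light covers 1 m in ~3.3 ns < 10 ns
print(causally_connected(1e-9, 1.0))   # False: 1 ns is too short, disconnected
```

Everything for which the second call's answer is False belongs, at that instant, to the red-coloured region: no measurement, and no decoherence, can relate it to M.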


 Photons are superpositions of wave packets of something, a fundamental concept (an axiom) we shall here name energy, without even trying to explain its nature.  The well-known diffraction of white light into a spectrum corresponds to a macroscopic observation of a reality whose arena lies at the microscale: something we are capable of perceiving directly only thanks to the participation of a multitude of photons 



                                                                                                            Copyright Graphene Limited 2013-2019