The statistical nature of the measurements is an illusion.  No measurement is statistical; rather, it is strictly deterministic and linear.  The measurements’ results, e.g. the expected Gaussian distribution of independent measurements of a “random” variable, are not random at all

Inspections in a Decohering Environment

“Around the Electronic Inspectors lies the decisive and most often overlooked factor: the Environment"


What is a measurement?

Technical/White paper





Measurement’s nature is like time, one of those things we all know until we have to explain it to someone else.  The explanation invariably passes through the idea of a comparison between a previously established standard and something else.  In reality, this answer, learnt in Electronics, Electrotechnics and Engineering courses at whatever level, is today considered a classic one.   A soft way to say obsolete.  The idea of measurement has slowly changed from the initial mere comparison rule to something completely different.  It has been reshaped, following what was understood from the discoveries of the past century.  In the Classic perspective, a physical measurement requires a collection of devices such as clocks, encoders, phototransistors, counters, LEDs, temperature or pressure sensors, and so on. 

The operational control of this instrumentation is exercised by the observer, who decides what to measure, how to perform a measurement, and how to interpret the results.

“As models for Observers we can, if we wish, consider automatically functioning machines, possessing sensory apparata and coupled to recording devices capable of registering past sensory data and machine configurations. We can further suppose that the machine is so constructed that its present actions shall be determined not only by its present sensory data, but by the contents of its memory as well. Such a machine will then be capable of performing a sequence of observations (measurements), and furthermore of deciding upon its future experiments on the basis of past results” (Everett, in eds. DeWitt, et al., 1973). 

Visibly, these automatically functioning machines closely resemble an inspection in an Electronic Inspector, or one of the assemblies composing an Automatic Machine, like the Conveyors, Filler, Closer, Blowformer, etc. 

With these basics established, we all agree a physical measurement is meaningful only if one identifies, in a non-ambiguous way:

  • who is the observer (or, apparatus, or automatically functioning machine);
  • what is being observed.

The same observable can be the target of more than one observer (or automatically functioning machine); because of this, a suitable algorithm is also needed to compare their measurements. 

In the following, we’ll just briefly introduce two points of view about the nature of physical measurements:

  • classic, a relativistic point of view based on Differential Geometry;
  • modern, the Quantum Physics point of view based on Quantum Field Theory

Being two distinct points of view about the same subject, many terms shall be purposely repeated. This introduction sets aside much of the necessary formalism, here replaced by figures and graphics. The goal of this introduction, along two different pathways, is to show how similar conclusions about what a measurement really is emerge naturally from theories substantially different and developed in different epochs. 


Local and non-local measurements

The Electronic Inspector and the environment containing it cover a finite spatial volume and make measurements lasting a finite interval of time.   As seen from a point of view which for over thirty years has been considered obsolete and far from the truth, the measurement’s domain is the space-time region in which a process of measurement takes place.   In a following section we’ll deepen this point, showing how wide the domain of the automated quality control equipment turns out to be, a volume commonly imagined reduced to the Bottling Hall.   

If the background curvature can be neglected, then the measurements will not suffer from curvature effects and will be termed local.   If, on the contrary, the curvature is strong enough that it cannot be neglected over the measurement’s domain, the response of the instruments will depend on the position therein; they therefore require a careful calibration to correct for curvature perturbations. In this case the measurements, carrying a signature of the curvature, will be termed non-local.


Apparatus and physical measurements in space

In Mathematics, the tangent space of a manifold makes it possible to generalize vectors from affine spaces to general manifolds.   An example of a tangent space is given in the figure on the right side.   The generalization is necessary because in the latter case what we all learnt in elementary vector algebra, subtracting two points to obtain a vector pointing from one to the other, is not possible.   

In any case, an automatically functioning machine and the Environment containing it are then mathematically modeled by a family of non-intersecting time-like curves having u as tangent vector field (see figure at right side), denoted Cu .

  Tangent space of a single point on a sphere.  Generalising the concept, a tangent space can be attached to every point of a differentiable manifold.  The tangent space is a vector space containing all possible directions along which one can pass through the point (image credit csdn.net, 2014)















                         A vector normal to the surface


Here, Cu is the congruence of curves with tangent field u.  Each curve of the congruence represents the history of a point in the laboratory or automatically functioning machine.   We choose the parameter t on the curves of Cu so as to make the tangent vector field unitary, an always possible choice for non-null curves.   

Let Σ be a space-like three-dimensional section of Cu spanned by the curves which cross a selected curve γs of the congruence orthogonally.   The concepts of unitarity and orthogonality are relative to the assumed background metric.   The curve γs will be termed the fiducial curve of the congruence and referred to as the world line of the observer, apparatus or automatically functioning machine.  

  Comparison between the actual course of a point on a test geodesic and the fiducial course. The fiducial course is the course it would have had to take to keep constant separation from the point moving on the fiducial geodesic (credit J. A. Wheeler et al., 1973)






Tangent and Normal Vectors in a manifold









  A vector normal to the surface and an infinite family of associated tangent vectors


    Four non-intersecting time-like curves having u as tangent vector field.  All automatically functioning machines are mathematically modeled by a family of such curves, denoted Cu








At time t, let the point of intersection of Σ with γs be: 

                                            Σ(t) ∩ γs  =  γs(t)


As a consequence, during the continuous variation of t over γs we’ll observe the section Σ spanning a 4-dimensional volume.  This volume represents the space-time history of the observer’s laboratory, or the machine and its environment.  

Limiting the extension of Σ to a range much smaller than the average radius of its induced curvature, it becomes possible to identify:

  • Cu  with the geodesic curve γs
  • Σ    with the point γs(t)

Any time-like curve γs with tangent vector u can then be identified as the world line of an observer (or automatically functioning machine, or electronic inspector), which will be referred to as “the observer u”.   If the parameter t on γs is such as to make the tangent vector unitary, then its physical meaning is that of the proper time of the observer u: as an example, the time read on his/its clock, in units of the speed of light in vacuum.  


Reference frames

This concept of observer, apparatus or automatically functioning machine, however, needs to be specialized further, by defining a reference frame adapted to it.   A reference frame is defined by a clock which marks the time as a parameter on γ, as already noted, and by a spatial frame made of three space-like directions identified at each point of γ by space-like curves stemming orthogonally from it.   


While the time direction is uniquely fixed by the vector field u, the spatial directions are defined up to spatial rotations, i.e. transformations which do not change u.   Obviously there are infinitely many such spatial perspectives, whose effect is visible in the figure on the left side.  Here, in the classic relativistic picture, are the alternative courses that a test geodesic could have taken.  The "could have taken through" has been replaced after 1990 by the modern "took through".  

The result of a physical measurement is mathematically described by a scalar, a quantity which is invariant under general coordinate transformations.  A scalar quantity, however, is not necessarily a physical measurement. 




























  A family of alternative courses that a test geodesic could have taken, all of them through the point Q.   The "could have taken through" has been replaced after 1990 by the modern "took through".   The word "alternative" has been replaced by "actual".   The family of geodesics shown features different degrees of convergence toward the right side, or divergence toward the left side (image credit J. A. Wheeler, et al., 1973)


The latter, in fact, needs to be defined with respect to an observer and in particular to one of the infinitely many spatial frames adapted to him.  The aim of the relativistic theory of measurement is to enable one to devise, out of the tensorial representation of a physical system and with respect to a given frame, those scalars which describe specific properties of the system. 


The elementary measurement

Whether it is a pressure sensor in a Leakage Electronic Inspector, a capacitive sensor in a Weigh Checker for cases or clusters of bottles or cans, a High Frequency fill level inspection, X-rays, or the most everyday glance we give to the world when awaking: these are all electromagnetic measurements.

  Essential measurement.  Electromagnetic signalling from a light source at A1 to a mirror at P and its reabsorption at the point A2.   The true spatial distance between the light source and the mirror is measured along the geodesic curve ζs joining the points P and A(s).  The arrangement is similar to the one shown in the figure below; the only difference is that the photocell barrier below transmits and integrates many photons rather than just the one of the case above (image credit De Felice, Bini 2010)

The only non-electromagnetic measurements whose effects we can feel, are those happening at other levels, namely:

  • nuclear, like those originating the gamma-rays fill level inspections;
  • gravitational.

For an electromagnetic interaction to be considered a good measurement, two things are necessary:

  1. Time, to transform the previous state, in which all possible kinds of correlation (superpositions) between the observer (or his models, like all automatically functioning machines) and the object of the measurement coexist, into a following state in which the observer is “aware” of being correlated to an object, having recorded eigenvalues for the eigenfunction Φi of the system S1 describing the object.  The correlation between the two systems is progressively established during the interaction and grows with the natural logarithm ( ln t ) of the interaction time t.   An ideal correlation, corresponding to maximised information of the observer about the object, can only be reached by allowing an infinite time.   The fact that we cannot wait an infinite time causes the measurements’ fluctuations, a synonym of the spectrum of the eigenvalues, resulting in the Electronic Inspector's false positives (false rejects); a sketch after this list illustrates the finite-time effect. 
  2. Interaction between the systems such that the information in the marginal distribution of the object inspected never decreases; otherwise we could no longer have repeatability of the measurements.   As an example, the instrument used to establish the correlation, for instance an electromagnetic wave in a common photo-electric sensor, should never modify the molecular structure of the object; otherwise, it would be modifying its eigenstates and then the eigenvalues we expected to derive from the measurement.
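A minimal numerical sketch of the first requirement may help: it models the residual fluctuation left by a finite interaction time as simple averaging noise, and counts the resulting false rejects of a good container against a fixed acceptance limit. All numbers (fill level, limit, sensor noise, sampling rate) are illustrative assumptions, not values from any real Inspector, and the averaging law is only a stand-in for the logarithmic correlation growth described above.

```python
import numpy as np

rng = np.random.default_rng(0)

TRUE_LEVEL_MM = 63.0        # assumed true fill level of a good container
REJECT_BELOW_MM = 62.5      # assumed acceptance limit of the inspection
SENSOR_NOISE_MM = 1.2       # assumed instantaneous sensor noise (1 sigma)

def measured_level(integration_time_ms: float, n_containers: int) -> np.ndarray:
    """Average the noisy signal over a finite interaction time.

    Averaging reduces, but never removes, the residual fluctuation:
    only an infinite interaction time would give the ideal correlation."""
    n_samples = max(1, int(integration_time_ms * 10))   # assumed 10 samples per ms
    samples = TRUE_LEVEL_MM + SENSOR_NOISE_MM * rng.standard_normal((n_containers, n_samples))
    return samples.mean(axis=1)

for t_ms in (0.1, 1.0, 20.0):
    levels = measured_level(t_ms, n_containers=100_000)
    false_rejects = np.mean(levels < REJECT_BELOW_MM)
    print(f"integration {t_ms:5.1f} ms -> false reject rate {false_rejects:.4%}")
```

The longer the interaction lasts, the smaller the residual fluctuation and the false-reject rate, without ever reaching the ideal, infinite-time correlation.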

An example of such an elementary measurement is visible on the right side. 

Its meaning, following the relativistic and today classic point of view, is the time-ordered sequence where: 

  1. a source of light moving along the geodesic γ,
  2. emits a photon at the point A1,
  3. which is instantaneously reflected back by a mirror at the point P,
  4. and reabsorbed by the observing light source when this reaches the point A2.  

Such a configuration allows one to establish the spatial distance ζs between observer and mirror, along the curve separating them.

A configuration whose essential idea has been reproduced everywhere for several decades: in Industrial Machinery, in the Electronic Inspectors, in the inspections of which the Inspectors are a superposition, up to the Observers.  In the figure below, as an example, the source of light is fixed with respect to the mirror.   The pallets travel over a Conveyor whose speed v is relayed to an Electronic Inspector in the form of a train of pulses whose frequency is proportional to the Conveyor’s mechanical speed.  

With respect to the pallet, the signal is dual: 

  1. presence, deduced from the interruption of the light beam, corresponding to a measurement of few or no returning photons feeding the photocell's phototransistor;
  2. length l, deduced from the number of Encoder pulses received as a digital input by the Electronic Inspector during the conditioning status of pallet presence (a sketch of this counting follows the next paragraph). 

Our technological example is an approximation of the general case, in which the speed of light is finite and the space where the movement happens is 4-dimensionally curved.  Here the light speed is assumed infinite, and the evaluation of the spatial distance ζs is aliased by that of the pallet length l.    As seen from the point of view of a pallet in movement, the light barrier represented here moves at the Encoder speed, along the opposite direction.  
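As a rough sketch of this dual presence/length signal, the fragment below counts Encoder pulses while the beam is interrupted and converts them into a pallet length. The pulse weight and the sample trigger trace are hypothetical, chosen only for illustration.

```python
# Minimal sketch of the dual presence/length signal described above.
# Pulse weight and the sample trigger trace are illustrative assumptions,
# not values of any specific Encoder or Inspector.

MM_PER_PULSE = 2.5   # assumed Conveyor displacement per Encoder pulse

def pallet_lengths(trigger_trace, mm_per_pulse=MM_PER_PULSE):
    """trigger_trace: one boolean per Encoder pulse, True while the light
    beam is interrupted (pallet present). Returns one length per pallet."""
    lengths, pulses, inside = [], 0, False
    for beam_blocked in trigger_trace:
        if beam_blocked:
            pulses += 1
            inside = True
        elif inside:                      # trailing edge: pallet has passed
            lengths.append(pulses * mm_per_pulse)
            pulses, inside = 0, False
    if inside:                            # pallet still in front of the barrier
        lengths.append(pulses * mm_per_pulse)
    return lengths

# A 480-pulse-long pallet followed by a 320-pulse-long one (hypothetical trace).
trace = [False] * 10 + [True] * 480 + [False] * 50 + [True] * 320 + [False] * 10
print(pallet_lengths(trace))   # -> [1200.0, 800.0] millimetres
```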

  The basic or advanced applications of the Industrial Machinery are aliasing prime-concepts of Physics (image credit InterSystem® AB, 2014)


Modern interpretation of physical measurement

Who measures what, from where and when?









The basic measurement mechanism described immediately before corresponds to a classic point of view, replaced decades ago by a completely different one.   Measurements are, in general, observer-, inspection- or automatically functioning machine-dependent.   For this reason, a criterion should also be given for comparing measurements made by different observers, inspections or automatically functioning machines.   The relevance of this comparison was discovered nearly one century ago; more than fifty years ago it started to become clear why this dependence exists.   

A basic role in this procedure of comparison, implicit in the measurement, is played by the Lorentz group of transformations.   A measurement which is observer-independent is termed Lorentz-invariant, and Lorentz-invariant measurements are of key importance in physics.    Why?   Because the description of a physical system depends both on the observer, inspection or automatically functioning machine and on the chosen frame of reference. 
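A small numerical check may make the term Lorentz-invariant concrete: the squared spacetime interval s² = c²t² − x² of one and the same event is computed by observers boosted at different velocities and comes out identical. The event coordinates and the boost velocities below are arbitrary illustrative values.

```python
import numpy as np

C = 299_792_458.0            # speed of light in vacuum, m/s

def boost(event, v):
    """Lorentz boost of an event (t, x) along x, at velocity v."""
    t, x = event
    gamma = 1.0 / np.sqrt(1.0 - (v / C) ** 2)
    return gamma * (t - v * x / C**2), gamma * (x - v * t)

def interval2(event):
    """Squared spacetime interval s^2 = c^2 t^2 - x^2 (a Lorentz scalar)."""
    t, x = event
    return (C * t) ** 2 - x ** 2

event = (2.0e-6, 450.0)                       # hypothetical event: 2 microseconds, 450 m
for v in (0.0, 0.3 * C, 0.9 * C):
    print(v / C, interval2(boost(event, v)))  # same s^2 for every observer
```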


  Measurement of a property of a physical system depends both on the Observer, inspection or automatically functioning Machine, and on the chosen frame of reference.  In this example, the reference frame is the metal “frame” of the micrometer 


Riding geodesics










  Oscillation characteristic values (eigenvalues) correspond to modes of vibration.  An example is the upper surface of a drumhead, each mode oscillating at its own frequency (image credit J. Kreso, 1998-2010)












In most cases, the result of a measurement is affected by contributions from the:

  • background curvature;
  • peculiarity of the reference frame. 

As long as it is not possible to discriminate among them, a measurement remains plagued by an intrinsic ambiguity.  This ambiguity, whose existence was already clear sixty years ago to Albert Einstein, who created General Relativity, is not accidental.  It hints at the multiple actual courses that a test geodesic can take.   And we, our inspections and automatically functioning machines, ride these geodesics.  The statistical nature of measurement, hinted at by the video at the start of this web page, is an illusion.  No measurement by an observer, inspection or automatically functioning machine is statistical.   Rather, it is strictly deterministic and linear.  

The measurements’ results, e.g. the expected Gaussian distribution of independent measurements of a random variable, are not random at all.  It is the statistical interpretation of measurement itself, over one century old, that is flawed by ...obsolescence.  

In the modern, experimental and theoretical perspective, each measurement corresponds to one of the existing (existing or actual, not merely possible) eigenvalues of the eigenvector including the observer and its generalisations (the inspections or the automatically functioning machines).  An eigenvector whose complete set of values exists only from the point of view of a superior superposition, the sum of all the superpositions.  An existence in the sense that only the superior superposition has all of the knowledge of all of the eigenvalues of all of its eigenvectors.  All of us observers, and all our operative generalisations like the inspections or the automatically functioning machines, have energy and extension limits.  For this excellent reason, we can only have an extremely limited perception and memory of the multitude of coexisting eigenvalues.  

To ease the comprehension of this truly fundamental concept, please refer to the figure on the right side.  There, each point of the surface of the drumhead is indexed by a triad of (x, y, z) coordinates, where x, y are the common two-dimensional Cartesian coordinates and z the height (or depth) of the point.    Each one of the infinite points of the surface of the drumhead exists before and after the measurement, even if associated with different values of the height z.   There is a multitude (however not an infinity) of values of the z coordinate which may be associated with each point (x, y).    Returning to the classic differential-geometry point of view described by the figures above, this is the sum of all of those courses taken by a test geodesic.  
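The drumhead analogy can be made concrete. The characteristic values mentioned in the caption above are the eigenfrequencies of an ideal circular membrane, f = α·c/(2πa) with α a zero of a Bessel function; the sketch below lists a few of them for a hypothetical radius and wave speed (both assumed, not taken from the figure).

```python
import numpy as np
from scipy.special import jn_zeros

# Hypothetical drumhead: 0.15 m radius, 150 m/s wave speed on the membrane.
RADIUS_M = 0.15
WAVE_SPEED_M_S = 150.0

def mode_frequency(m: int, n: int) -> float:
    """Frequency of the (m, n) eigenmode of an ideal circular membrane:
    f = alpha_mn * c / (2 * pi * a), alpha_mn = n-th zero of Bessel J_m."""
    alpha_mn = jn_zeros(m, n)[-1]
    return alpha_mn * WAVE_SPEED_M_S / (2.0 * np.pi * RADIUS_M)

for m in range(3):
    for n in range(1, 3):
        print(f"mode ({m},{n}): {mode_frequency(m, n):7.1f} Hz")
```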

We are speaking of an actual multiplicity, which obviously faces the immediate objection: 

                           why do we only see a single measurement result?

The animation below, based on an insight by Dieter Zeh dated 1970, allows us to start answering the question.     It shows the fine details of how a measurement happens. Specifically, it shows what happens to a macroscopic System (a cat), initially in a superposition of states, under the effect of the same Environment, in the standard conditions of temperature, pressure and humidity where all Food and Beverage Controls also operate.    It is, visibly, a smooth continuous process, displaying the reconstructed Wigner function of the System averaged over 4 ms, reconstructed with the data recorded in a 4 ms sliding time-window (credit CNRS, Laboratoire Kastler Brossel).



 A cat, initially in a superposition of states, under the effect of the same Environment in standard conditions of temperature, pressure and humidity where Food and Beverage Controls also operate.  It is, visibly, a continuous process

Two different phenomena are visible: 














  • fast decay of the quantum interference feature, in the first few milliseconds;
  • much slower evolution of the classical components, towards the phase-space origin. 

In the following, it’ll become clear how deeply and constantly these modern subjects intervene in the function of the Electronic Inspectors and their measurements (inspections).

Along the past three decades, Decoherence explained why: 

  • certain microscopic objects, commonly named “particles", seem to be localized in space: in reality, particles do not exist and there are only waves (see figure on right side); 
  • microscopic systems are usually found in their energy eigenstates and therefore seem to jump between them, when in fact there are no quantum jumps;
  • there appeared to exist two contradictory levels of description in physics (classical and quantum), when there is a single framework for all physical theories: the quantum theory;
  • the Schrödinger equation of General Relativity, also named the Wheeler-DeWitt equation, born in 1967, may describe the appearance of Time in spite of being itself time-less.   It has been understood that Time does not exist as a fundamental, and what really exists is just a special initial condition.



Electronic Inspection and Quantum Physics: a common ground


Inspections can be thought of as finalised measurements, or measurements with a purpose, where the purpose is the binary classification and eventual rejection of objects: whatever object (bottles, cans, crates, cases, kegs, but also wheels, smartphones, blood samples, glue, pens, gaseous substances, etc.) for which at least one measured physical quantity resulted outside a pre-defined range.  Physical measurements are the core of Bottling Controls Technology and operation.  


  The Environment causally connected with our Bottling Controls is wider than the Factory in which they operate. An entire Food and Beverage Bottling Line can be stopped by false rejects whose unique cause, delayed by 8.5 minutes, lies 150 million kilometres away: the Sun.  But, apart from the limit case described here, sunbeams act constantly on the productive process, creating a daily cycle which causes apparent spontaneous sensibilisations of the Bottling Controls’ inspections, and the opposite effect around twelve hours later 
























Around the Electronic Inspector there is the decisive and most often overlooked factor: the Environment.  Electronic Inspectors (or Bottling Controls) in Food and Beverage production Lines are collections of assemblies, named inspections, whose components are Optoelectronic devices performing physical measurements of the properties of objects (containers, crates, cases, etc.) and their content, mainly by means of electromagnetic measurements.  

In this framework, Triggers are the most basic devices, those controlling the successive actions of physical measurement and eventual rejection in a Binary Classifier.  

A physical measurement requires a collection of devices, in our case clock systems integrated in the Inspector’s electronics, Encoders, counters, LASER photosensors, CMOS cameras, High Frequency residual liquid controls, and so on. The operational control of this instrumentation is exercised during the Electronic Inspector's startup and commissioning by the Field Engineer. 

He is the one who calibrates the quantities corresponding to Contractual Agreements about: 

  • what to measure, 
  • how, where and when to perform a measurement, 
  • how to interpret the results. 



Environment, the measurements' domain

The Food and Beverage Bottling Line covers a finite spatial volume and the measurements last for a finite interval of time, in our case commonly ranging over (0.1 - 20) ms.  In general, the measurement’s “domain" is defined as the space-time region in which a process of measurement takes place.  The Sun (see figures on right side) is capable of repeatedly stopping an entire Beverage Bottling Line for tens of minutes.  How?   In its most natural way, by means of beams of light, which are one of the two known causal connections (the second being gravitons) between the measurements accomplished in our Factories here and now, and that object 149.5 million kilometres away and ~8.5 minutes in the past. 

We’ll cite four different examples, all of them referring to our specific field of application, automated Quality Controls in Food and Beverage Bottling Lines:  

    The smallest and strictly causally connected Environment is a huge sphere, centered on the Factory, whose volume is over 1.4 × 10^25 km^3
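Both figures quoted in this section, the ~8.5 minutes of delay and the causally connected sphere of over 1.4 × 10^25 km^3, follow directly from the Sun-Earth distance; a short check, assuming the 149.5 million km stated above:

```python
import math

SUN_EARTH_KM = 149.5e6          # distance quoted in the text, km
C_KM_S = 299_792.458            # speed of light, km/s

light_delay_min = SUN_EARTH_KM / C_KM_S / 60.0
sphere_volume_km3 = 4.0 / 3.0 * math.pi * SUN_EARTH_KM ** 3

print(f"light delay Sun -> Factory : {light_delay_min:.1f} minutes")   # ~8.3
print(f"causally connected sphere  : {sphere_volume_km3:.2e} km^3")    # ~1.4e25
```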








                   Tunnel-effect diodes sport an extremely high speed of operation, ~ 1 THz (1000 GHz).  This results from the fact that the Tunnel diode only uses majority carriers, e.g., electrons in an N-type material and holes in a P-type material. Minority carriers slow down the operation of a device; devices relying on them are, as a result, slower  

  1. photons in the visible part of the spectrum, reflected into the Inspectors’ mirrors and later amplified, force massive false rejects (> 40 %) in Full Bottle and Empty Bottle Inspectors equipped with Vision cameras; 
  2. photons in the visible part of the spectrum, infeeding cap Colour tri-chromatic sensors and later amplified, emulate caps of the wrong colour, forcing massive false rejects (> 10 %) in Full Bottle Inspectors equipped with cap Colour inspection; 
  3. thermal photons act on the beverage characteristics, creating a diurnal cycle.  A beverage whose fill level is being inspected with high-frequency EM radiation appears more underfilled at ~3 PM than at ~5 AM.  Net effect, in front of a single sensitivity setup: huge false rejects at ~3 PM;
  4. thermal photons act on the PET bottles’ characteristics, creating a diurnal cycle.  The PET containers’ tension is minimised at ~3 PM and maximised at ~5 AM. A PET container whose sealing (leakage) is inspected by means of a Squeezer Full Bottle Inspector, equipped with inspections for pressure, inductive seal and difference of fill level, shall appear defective at ~3 PM and correctly sealed at ~5 AM.  




Classic interpretation

Reading the last two of the four cases above from the classic Physics point of view, we see a common cause (written in italics) for the “measurement anomaly": 

     3.  the beverage at ~3 PM cannot be inspected for HF fill level like at ~5 AM, because the Environmental conditions are different;

     4.  the PET container at ~3 PM cannot be inspected for sealing like at ~5 AM, because the Environmental conditions are different;





Modern interpretation

Re-reading the last two of the four cases above from the point of view of modern Physics, an ambiguity in the Classic Physics point of view, source of approximation, is detected (and tentatively eliminated):      

    3.  the beverage at ~3 PM cannot be inspected for HF fill level like at ~5 AM, because the correlation between Environment and beverage is different;

     4.  the PET container at ~3 PM cannot be inspected for sealing like at ~5 AM, because the correlation between Environment and container (mechanical characteristics) is different.

Beyond this sphere lies the much wider heliosphere, where the Sun also acts, preventing the dangerous arrival on the Earth's surface of the majority of massive atomic nuclei and electrons flying at relativistic speed, and of high energy photons, gamma- and X-rays.  All these are mere by-products of the multitude of physical Events happening in gas clouds and in those gigantic fusion-energy reactors collectively named stars.  

A small portion of them however reaches the surface and our Machinery, implying one more reason why it is unavoidable to experience measurement fluctuations, even in the presence of excellent cable shielding and in places deeply buried underground.  These examples do more than simply extend the radius of the spherical space-time region, the “domain” (or Environment) in which our measurements take place, well beyond the assumed perimeter of the Food and Beverage Factories.  

This, because what really performs the inspection function in the over 100000 Electronic Inspectors in Food and Beverage Bottling Factories is, nearly invariably, atoms of Silicon: atoms of Silicon in the billions of transistors, themselves part of Integrated Circuits processing signals mainly coming from CMOS and CCD cameras.  Several stages of amplification, filtering and comparison of these quantities with parameters are the essence of the inspection process.   A process finalised to Binary Classification, a typical task for Quantum Computers processing qubits rather than bits through a long chain of nonclassic measurement stages. 



Tunnel-effect: a way toward a technological breakthrough








 Negative differential resistance of a Tunnel-effect diode, in a current-voltage graph. The nonlinear and nonclassic feature of the diode is identified in the red-coloured negative differential electric resistance, the basis of its impressive speed performance.  The tunnelling effect is inherently very fast..., too fast when we consider that recent thorough testing has determined superluminal speed.  Superluminal if, and only if, the object is a unique instance existing in a single world (image public domain under CC 3.0)








    Prof. Günter Nimtz.  His impressive breakthroughs are confirmed by a number of other independent experiments and theories. Some of them, of Bell-Aspect type about Entanglement, verified up to 30 σ, ...thirty standard deviations! (published with permission of prof. Günter Nimtz, 2014)




   Nimtz and Stahlhofen's double prism experiment of 2006.  Photons can be detected behind the right-hand prism for gaps up to about one metre (Jochen Magnus, 2011)




A practical example of the many logical and experimental threads which led to this modern scenario is given by the non-classic quantum mechanical Tunnel-effect, discovered in 1958 by Esaki.   Below, on the right side, is a practical application: the Tunnel-effect diode and its nonlinear characteristic current-voltage curve.  The animation below shows the time-evolution of the wave function of the electrons in those atoms of Germanium building up this electronic component.  Electrons doing what the Classic Laws of Physics consider impossible: passing through a potential barrier.   Heisenberg’s Uncertainty Principle allowed and made sense of this behaviour decades before the experimental discovery.  What has been discovered later is that the Tunnelling effect is ...too fast.    Thorough testing by several Laboratories, the first of them that of prof. Günter Nimtz at Köln University, Germany (see image below at right side), at different frequencies and for different kinds of particles, allowed the determination of superluminal speed across the barrier.  

As an example, the graph below shows two microwave pulses at the same frequency of 8.2 GHz travelling through:

                                                    (1)   air  (light violet)

                                                    (2)   a barrier  (dark violet);


the latter traversed the same distance ~ 1 ns faster, corresponding to a speed of 4.7 c, that is nearly five times the speed of light in vacuum (299 792 458 m/s).  
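Taking the two quoted numbers at face value, and assuming that both the ~1 ns lead and the 4.7 c effective speed refer to the barrier segment alone (an assumption, since the original measurement details are not reproduced here), the implied barrier length can be estimated:

```python
C = 299_792_458.0      # m/s, speed of light in vacuum

dt_lead   = 1.0e-9     # the barrier pulse arrives ~1 ns earlier (from the text)
v_barrier = 4.7 * C    # effective speed across the barrier quoted in the text

# Time saved crossing a barrier of length L: L/c - L/v_barrier = dt_lead
barrier_length = dt_lead * C / (1.0 - C / v_barrier)
print(f"implied barrier length: {barrier_length:.2f} m")   # ~0.38 m
```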


  Evolution of the electron wave function through a potential barrier.   The animation renders what lets the Tunnel-effect diode (figures on right side) be so fast with respect to the other components: the Heisenberg Uncertainty Principle.   The central white vertical bar is the potential barrier Classic Physics considered impossible to break through.   If we assume the video shows a single electron in a single Universe, then the frequent superluminal propagation of the wave function implies paradoxical situations, synthesized in violations of the basic postulate of Relativity: the existence of a maximum limit speed.   Tunnel-effect diodes are simply 'too fast' to exist in a unique instance.  What seems to cross the barrier five times faster than light, when considering that the electron coexists in several interfering branches of a common tree-like structure, is, with today only a few residual doubts, a side effect of our perspective  



A deviation beyond any possibility of explanation in terms of mere statistical fluctuations, that is, the fluctuations implicit in all physical measurements, whose true origin we are not detailing here.












 Tunnel-effect hints at an interpretation of the events truly and only predicted by Quantum Field Theory.   In the example on the left side, photons cross air (1) or a barrier (2). When crossing the barrier they reach the opposite side ~ 1 ns before those which crossed the air, that is, 4.7 times faster than the maximum speed of light in vacuum postulated by Relativity (abridged from G. Nimtz, 2006)


A deviation superluminal if, and only if, the wave packet propagated is a unique instance existing in a single universe: the classic point of view, dated 1905-1915, of the Special and General Relativity theories.   On the contrary, there is no violation at all of the relativistic postulate (no superluminal propagation) if what we are integrating into our measurement is a multitude of superimposed instances of the same object.  An object whose existence is multiversal, that is, spread over several mutually interfering branches, each instance of the same object in a slightly different Environment.   This, after the discovery of Decoherence, is the meaning of 'World'.   

Wave packets which, because of the:

  • Heisenberg’s Uncertainty Principle; 
  • linearity of the Quantum Mechanics wave functions, which lets the semiconductors switch and amplify;

are irreducibly superimposed multiversal objects, behaving dependently on what is happening elsewhere.  The Tunnel-effect can only be understood within the nonclassic point of view of Quantum Mechanics where, like a dam, the Uncertainty Principle separates the Classic and the Modern ideas we have about the physical world.



Massively confirmed evidence

Superluminal motion. To have an idea of the yield of the experimental discovery by Günter Nimtz and the group of researchers led by him, it is necessary to recall that since 1905 all material bodies were assumed to be moving always and only at velocities v < c, measured with respect to inertial reference frames: velocities keeping them inside the light-cone's boundary. The boundary is represented in the figure above by a dotted line coming out of the origin O(0,0).  In the example, the blue colour represents a body in superluminal motion at velocity v = 4c.   All the surface under the dotted line was considered forbidden to anything, be it matter or radiation.  Visibly, the blue-coloured vector displaces itself a distance OB > OA in a time shorter than that necessary for the subluminal red-coloured material body












The concept of propagation speed makes sense if Time exists as a fundamental, because the concept of Speed derives from those of Space and Time.  Then, the conundrum is Time.  We saw elsewhere the General Relativity assumption about the time-ordered sequence of submanifolds (slices or leaves) of the manifold M, constituting a Foliation, whose details and properties we examined here.   

In this framework, what clocks measure is the proper time s along their own worldline γ, maintaining coherence with the General Covariance Principle on which Relativity theory is based. 



1967

In 1967 the Wheeler-DeWitt equation was capable of joining the relativistic description of an object by means of the hamiltonian, where the object is a constructive interference in a sea of destructive ones, with the quantisation of everything implicit in Quantum Mechanics:  

                                            Ĥ Ψ  =  0

and, what is truly relevant, without any reference to Time. How?   In brief, the Ψ term above is the superposition of all the elemental wave functions related to all existing wave packets.   Moreover, from the point of view of that superposition, no time evolution exists at all.   On the contrary, correlated sets of wave packets, part of the superposition, witness the initial-condition effect historically named Time.   


1983

In 1983 Don Page and W. K. Wootters showed how entangled particles could be used in a Quantum Physics test, to see that the time-ordered sequencing (of the relativistic spatial foliation M) is in reality only felt by objects correlated with others because of Entanglement.

Entangled couples of particles:

  • have, notoriously, one of their properties strictly correlated, while remaining uncorrelated to the Environment, until a measurement is accomplished on one of them by a third party or Environmental Decoherence prevails;
  • even when widely separated in the 4-D ordinary space-time, they continue to share the same small Hilbert space;
  • are explicitly cited by the Relative State formulation of Quantum Mechanics.  

Objects related to the Environment feel the effects of Time: Thermodynamics being a relevant example.  


2013

The experiment conceived by Page and Wootters involves Entangled photons.  It was first executed in 2013 by a multinational team guided by Ekaterina Moreva. 

The figure below shows the Optoelectronics layout of the test, including beam splitters, lenses, filters, plates and LASER light commonly adopted into camera-equipped Electronic Inspectors:

 Optoelectronic layout allowing the formation of an entangled state of the polarization of two photons, one of which is used as a clock to gauge the evolution of the second.  To an "internal” Observer that becomes correlated with the clock photon, the other system appears to evolve, while to an “external" Observer that only observes global properties of the two photons, it looks static (Moreva, et al., 2013)








It allowed the formation of an entangled state of the polarization of two photons, one of which is used as a clock to gauge the evolution of the second: 

  • an "internal” Observer that becomes correlated with the clock photon sees the other system evolve;
  • an “external" Observer that only observes global properties of the two photons can prove it is static. 


  2013 is the year when Time ceased to claim its existence as a fundamental of Physics.  In 1967 the possibility of well-defined quantum states entirely without the notion of Time was first observed.  Later, in 1983, the mechanism inducing the sensation of Time was conceived: a mere effect of Entanglement, only observable by who lives along branches, histories of the Multiverse.  The experiment is so complex that it was necessary to wait for the technical and scientific development of 2013 to perform it


Time is an emergent property, not a fundamental






The term Multiverse has no relation with the known “Parallel Universes”, made popular by Science-Fiction. “Parallel” means not interacting, or environments causally disconnected, that is, no exchange at all of Signals and Energy.  

Multiverse is nearly the opposite: a superposition of all of the mutually interfering wave packets, corresponding to the wave functions of all objects.  Renamed “Multiverse” to mark the conceptual difference.  

A tree-like structure, from our point of view; a multiply-connected object, from the point of view of Topology.  Several coexisting instances of each object, each one part of a slightly different (δ = 1 bit) Environment.

The recently published results confirm the analysis given in 1983 by Don N. Page and W. K. Wootters: Time is an emergent property, deriving from quantum correlations (namely, Entanglement), and not a fundamental of Physics.   Now that it is established, on the dual theoretical and experimental basis, that Time is a derived concept of Physics, the superluminal speed of the experiments on the Tunnel-effect has to be moved from the rank of paradox to that of unavoidable effect.  We cannot calculate any speed where no Time evolution exists.  

Following the Quantum Theory of Measurement, each time a “good measurement” happens, a correlation between two systems and their respective wave functions, a new history branches itself out of the others.   An idea in which a measurement is the branching of a new history, introduced in 1957 and later adopted by many other eminent physicists.  

Among them, Nobel laureates and other eminent physicists such as: 

  • Richard Feynman, 
  • Steven Weinberg, 
  • Alan Guth, 
  • Murray Gell-Mann,
  • Andrei Dmitriyevich Linde,
  • Alexei Starobinsky,

and also some of the most brilliant minds of Physics like: 

  • David Deutsch,
  • James Hartle, 
  • John A. Wheeler, 
  • Stephen Hawking, 
  • Leonard Susskind, 
  • Lev Vaidman, 
  • Avshalom Elitzur, 
  • Yakir Aharonov,
  • Wojciech Hubert Żurek 
  • Heinz-Dieter Zeh.


    CESIUM ATOMIC CLOCK ON A CHIP.  Just an inch across, this innovative atomic clock by NIST is smaller than 8 mm high and 3 mm wide. In that space are contained two radio antennas, a heating coil, and a glass ampule with the element Cesium trapped inside.  Is this device really measuring time?  It really depends on what is meant by “time”. After 1983 the very idea of time as a fundamental of Physics fell, at least on theoretical grounds. After 2013 came the experimental confirmation of the fact that time is a secondary concept, rather than a fundamental like space (video credit Theodore Gray, Max Whitby and Nick Mann, 2009)


"Time is an emergent property, deriving from quantum correlations and not a fundamental of Physics
















Each “good measurement” establishes a new additional thread along the sum of all of the yet existing histories







The idea is clearly explained in DeWitt (2004, pages 138-144).  That’s why no violation of the light limit speed c exists in the Tunnel-effect: the new Events are observed along a new branch, a new history, and no reference to prior measurements and results makes sense.  For two decades, this has been the mechanism conceived as underlying correlations above 30 σ (thirty standard deviations!) in the worldwide Bell-Aspect experiments studying Entanglement.  

Also, it allowed us to understand, decades ago, that the mechanism effectively prevents the causal-violation effects of a hypothetical topological structure like the Einstein-Rosen bridge (also known as wormholes or Closed Timelike Curves, CTCs).  Einstein-Rosen bridges are implicit in the General Relativity theory.  However, until recently the basic idea of CTCs hit hard against the solid wall of Philosophical logic and of common sense: ...how can an effect precede its own cause?   In fact, it really is impossible.

The new concept of Measurement, as a matter of fact, prevents CTCs from creating paradoxical causal violations at all scales:

  • microscopic;
  • macroscopic.

Considering all this, Quantum Mechanics is today backing the coherence of General Relativity. 

Showing that these extremal scenarios of another theory are not in contradiction with its basic assumptions.  The idea of Entanglement derives from what, in 1935, Einstein, Podolsky and Rosen figured as what at first sight appeared to be a flaw in Quantum Mechanics, one proving at least its incompleteness.  It later turned out that the single-world classic point of view of General Relativity is an approximation.  

Each “good measurement” establishes a new additional thread along the sum of all of the already existing histories. In this framework of correlated, non-interacting systems it is explained (see Everett in eds. DeWitt, et al., 1973, pages 78-83) why and how they are implicit consequences of the Quantum Theory of Measurement, however incomprehensible they may appear as seen from the classic approximation.  

The two figures below synthesise the situation:

  • left side, the modern paradigm, where Time does not exist any more.  Measurements are a natural process continuously happening, and each possible choice is actual.    Each measurement's result is the starting point of a new branch of the general history;
  • right side, the classic point of view, today disproved by theory and experiments.  An initial condition is perceived as Time by apparatuses and Observers in some of the branches, which have no information about the content of the other branches of the history.  Time is, in reality, a proven effect of the Entanglement condition of apparatuses and Observers.


                   Modern                                                        Classic



  

                                                        

                                          

   


        


                

  


Large scale effects of a change of paradigm about the Measurement

    Signature of John Von Neumann, who created much of the Quantum logic and terminology  


Basic Quantum Terminology

Coherent, typically applied to a system, is in modern Physics a synonym of superimposed.  In this sense, superimposed (or coherent) are the orthonormal eigenvectors constituting a basis for the wave function Ψ representing an object, however complex or massive it may be.   Then, decoherent means reduced by a measurement to one definite value (the eigenvalue of the eigenstate), the only one we perceive among the multitude of alternatives.

Eigen means “characteristic” or “intrinsic”.

Eigenstate is the measured state of some object possessing quantifiable properties like position, momentum, energy, etc.  The state being measured and described has to be observable (e.g. as in the common electromagnetic measurements of Bottling Controls), and have a definite value, called an eigenvalue.

An eigenfunction of a linear operator, defined on some function space, is any non-zero function in that space that returns from the operator exactly as it is, except for a multiplicative scaling factor. 
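A minimal numerical illustration of these three definitions, using an arbitrary 2×2 Hermitian operator (the matrix is illustrative, not taken from the text): applying the operator to each eigenvector returns it unchanged except for the multiplicative factor, the eigenvalue.

```python
import numpy as np

# A Hermitian operator on a two-state system (illustrative matrix, not from the text).
H = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eigh(H)

for value, vector in zip(eigenvalues, eigenvectors.T):
    # Applying the operator to an eigenvector returns it unchanged
    # except for the multiplicative factor: H v = lambda v.
    assert np.allclose(H @ vector, value * vector)
    print(f"eigenvalue {value:.1f}, eigenvector {vector}")
```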


  Evolution of a defined volume Γ(t) in the phase space.  The region Γ(t) represents the information we have about a system at three distinct and successive times t = 1, 2, 3.   Visibly, the information we have does not increase. Liouville’s Theorem holds its full validity, including at those mesoscopic and macroscopic space-time scales where the Optoelectronic devices in the Electronic Inspectors are sensitive (abridged from Susskind, 2005)








We saw here that in the Electronic Inspectors the simplest measurement subsystems, named Triggers, are constantly labelling (or applying an identification to) objects by their position in space-time, coarse-graining (enclosing) them into macroscopic Shift-Register cells.  We have shown how an Event can be associated with the label of a single slice (or leaf) in a vast foliation.  There is no fundamental distinction between measuring devices and other physical systems.  What Triggers differentiate are the identities of the objects.  Triggers are the most elementary kind of inspection which can be conceived in a Bottling Control.  
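A toy sketch of the Shift-Register bookkeeping just described: one cell per Encoder step between the Trigger and the rejection point, each cell carrying the label of the object coarse-grained into it. Cell count, labels and the sample event sequence are hypothetical, not a vendor format.

```python
from collections import deque

class ShiftRegister:
    """Toy Shift-Register: one cell per Encoder step between Trigger and rejector."""

    def __init__(self, cells: int):
        self.cells = deque([None] * cells, maxlen=cells)

    def shift(self, new_label=None):
        """Called once per Encoder pulse; returns the label reaching the rejector."""
        at_rejector = self.cells[-1]
        self.cells.appendleft(new_label)
        return at_rejector

register = ShiftRegister(cells=5)
events = ["bottle-001", None, "bottle-002", None, None, None, None, None]
for step, label in enumerate(events):
    out = register.shift(label)
    if out is not None:
        print(f"pulse {step}: {out} reaches the rejection point")
```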

A measurement is a special case of interaction between physical systems, an interaction which has the property of correlating a quantity in one subsystem with a quantity in another. 

With reference to the formalism of the modern version of the Principle of Superposition presented here, it becomes clear that an interaction is a measurement (pages 54-60) if it satisfies three requirements:

  1. the measurement is associated with a fixed interaction H between the systems;
  2. the measured quantities are the limit of the instantaneous canonical operators as the time goes to infinity.  Our common measurements, subject to finite times, are approximate and approach exactness as the time of interaction increases indefinitely;
  3. if the fixed interaction H is to produce a measurement of A in S1 by B in S2, then H shall never decrease the information in the marginal distribution of A.  In other terms, if H is to create a measurement of A by correlating it with B, we expect that a knowledge of B shall give us more information about A than we had before the measurement took place since, otherwise, the measurement would be useless. 

The last requirement, at first sight obscure, regards the fact that the interaction H:

  • cannot decrease the information content of the distribution of the measured object A; 
  • has to increase the information we had about A before we accomplished the measurement;

is the choice which lets the conservation of Probability hold true, the only choice which makes statistical deductions possible: Liouville’s Theorem.  A classic concept redressed in new words.  

Its full comprehension needs an aid to intuition, an aid which may come from the following two figures, referring to cases with two and three dimensions.  In Mechanics and Thermodynamics there is a precise meaning in saying that Information is never lost by a closed system.  













Please refer to the two figures above and below, showing in two and three dimensions the time evolution of a defined volume in the phase space. The volume of the region Γ(t) represents the information we have about a system at three successive times t = 1, 2, 3.   Visibly, the information does not increase.  

Liouville’s Theorem holds its full validity also at those mesoscopic and macroscopic space-time scales where the Optoelectronic devices in the Electronic Inspectors are sensitive.   This is the origin of the third constraint appearing above, about the fact that the Information about A cannot decrease but has to increase with respect to what we had before the measurement; in practice, the origin of the Second Principle of Thermodynamics.
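The phase-space statement can be checked numerically on the simplest closed system. In the sketch below a square patch of initial conditions is evolved with the exact flow of a harmonic oscillator (a hypothetical unit-frequency oscillator, chosen only because its flow is known in closed form), and its area, the Liouville volume, stays constant.

```python
import numpy as np

def evolve(points, t, omega=1.0):
    """Exact phase-space flow of a unit-mass harmonic oscillator:
    a linear, area-preserving map of the (q, p) plane."""
    q, p = points
    c, s = np.cos(omega * t), np.sin(omega * t)
    return np.vstack((c * q + s / omega * p, -omega * s * q + c * p))

def polygon_area(points):
    """Shoelace formula for the area enclosed by an ordered set of corners."""
    q, p = points
    return 0.5 * abs(np.dot(q, np.roll(p, -1)) - np.dot(p, np.roll(q, -1)))

# A square patch of initial conditions in the phase space (area 1).
corners = np.array([[0.0, 1.0, 1.0, 0.0],
                    [0.0, 0.0, 1.0, 1.0]])

for t in (0.0, 1.0, 2.0, 3.0):
    print(f"t = {t:.0f}: area = {polygon_area(evolve(corners, t)):.6f}")
```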



  Time evolution of a region x; y(t, x) is its Liouville function. The spreading of the Liouville function with time is made evident by the fact that the graphic is not monometric: 1 cm on the y axis equals 4 on the x axis.  The Liouville function and its underlying phase-space concepts operate everywhere: we’ll encounter it again as a useful tool to evaluate what reject rates may be expected after changing an inspection's sensitivity, given initial conditions





Mesoscopic scale: arena for experimental verifications










In 2010, objects of the size of a visible hair (~0.1 mm) started to be spotted existing in two separate places, whose separation was measured with a scanning electron microscope.  

In 2011 Gerlich, Arndt, Eibenberger, et al. demonstrated that the wavelike behaviour of matter can be followed also for extremely large compounds.   They used the Kapitza-Dirac-Talbot-Lau interferometric layout, conceived in 2007 and visible in the figure below.    The quantum wave nature and delocalization of compounds have been followed up to 430 atoms, with a maximal size of up to 60 Å, masses up to 6910 AMU and de Broglie wavelengths as short as: 

                                         λdB  =  h/(mv)  ≈  1 pm  


meaning particularly heavy packets of waves and energy.
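Reading λdB = h/(mv) backwards with the quoted mass of 6910 AMU and wavelength of ~1 pm gives the order of magnitude of the beam velocity implied by those experiments (a back-of-the-envelope estimate, not a figure from the cited paper):

```python
PLANCK_H = 6.626e-34      # Planck constant, J*s
AMU_KG   = 1.6605e-27     # atomic mass unit, kg

mass_kg = 6910 * AMU_KG             # heaviest compound quoted in the text
wavelength_m = 1.0e-12              # de Broglie wavelength quoted, ~1 pm

velocity = PLANCK_H / (mass_kg * wavelength_m)   # from lambda = h / (m v)
print(f"implied beam velocity: {velocity:.0f} m/s")   # roughly 60 m/s
```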

    Layout of the interference experiment of Kapitza, Dirac, Talbot and Lau (   Nature Publishing Co., Macmillan Publishers Group/CC BY-NC-ND 3.0/2011)





In 2012, one more counter-intuitive result, already envisaged by Quantum Mechanics, was experimentally detected for the first time: LASER photons spontaneously jumping back, rather than proceeding forward, in a crystal lattice.   And this not in the atomic or subatomic realm where Quantum Mechanics was supposed to act only, but rather on the same mesoscopic scale of the semiconductor junctions that are part of the phototransistors with which nearly all photo-sensors used as Triggers by Beverage Bottling Controls are equipped.   


  Setup used in 2001 by Zeilinger et al., which allowed verification that macromolecules too keep their properties fully in the non-classic domain of Quantum Mechanics and respect the Heisenberg Uncertainty Principle. Fullerene C70 is an allotropic form of Carbon with 70 atoms (abridged from Zeilinger, et al., 2001)







A new paradigm is arising, one whose powerful fruits are commercial applications as different as the Quantum Computers, simultaneously parallel processing in several other Universes, now used by companies like Google, Inc. or Lockheed-Martin Corp., and academic research sponsored by the Society of Lloyd's as early as 2007.   Why should these private companies be investing money in something at first sight seemingly theoretical?   

The answer, in some way, is related to the discovery of Decoherence.


 Close-up on the molecule of Fullerene C60, an allotropic form of Carbon with 60 atoms.  Visible are the isosurfaces of ground-state electron density.  Discovered in 1990, it was the first molecule used after 1993 to test Decoherence at the mesoscopic scale of dimensions.  The Van der Waals diameter of the C60 molecule, accounting also for the thickness of the electronic clouds around the nuclei, is 1 nm



Experimental close-up on Decoherence

"Decoherence:                                                                                                                                              process that classicalizes a quantum phenomenon, so that its former wavy character disappears"


  Mesoscopic scale of dimensions.  Individual Tungsten atoms directly sighted in this 2008 image of the tip of the sharpest existing Tungsten needle. The small central red dot is an atom, ~0.30 nm its visible diameter. The image was obtained by means of a Field Ion Microscope. The brighter red lines are an effect of smearing due to the atoms' displacements during the 1-second-long exposure (abridged from Silverman, 2008)








Triggered Events lie in the space-time and energy boundaries separating the fields of application of Classic and Quantum Physics.  We all agree that macroscopic objects are composed of collections of microscopic ones, like molecules and atoms.  As a consequence, Triggered Events are finely rooted there, at the microscopic scale of distances.  But we are all convinced that we see individual Events referred to individual Objects, the Objects to which we are referring being always and only human-, space-, time- and energy-proportioned objects.  

No one, in the absence of instruments, is capable of discerning:

  • grains of dust whose diameter is < 0.01 mm;
  • Events separated by a time interval < 0.001 s;
  • radiant energy < 0.1 nJ;      

because our eyes, and the neuronal system supporting their function, are not biologically developed for that.   But, in the end, these limits do not matter that much.   Since 1985 the factors are known which let the multitude of paths described above, fruit of Feynman’s own intuition, be reduced to the individual alternative we are experiencing. An introductory definition for this modern concept is: the process that classicalizes a quantum phenomenon, so that its former wavy character disappears.  

Decoherence is the theory of universal Entanglement: it does not describe a distortion of the System by the Environment, but rather a change of state of the Environment by the System.  

The Environment includes a multitude of air molecules and photons, mainly at thermal frequencies. Imagine a macroscopic body as massive as a Bowling Ball, like that in one of the figures below on the right side.  In this case the scattering of photons or atoms off such a macroscopic object, or even off very small dust particles or macromolecules, causes no recoil.  But this inefficiency in the measurement is over-compensated by the multitude of scattering Events occurring, even along small time intervals, in our daily life and industrial Environmental conditions, say:

  • atmospheric pressure:       ~1 bar;
  • temperature:                       (-30 - 60) °C;
  • relative humidity:                (0 - 100) %.

Decoherence's effects can be felt also at the space and time scales we perceive directly, without any instrumental aid.   

Two examples of Decoherence's effects are visible in the figure below:  

  1. it explains why one seems to observe individual tracks in a Wilson chamber as apparent particle trajectories, described in terms of wave packets;
  2. the observation of radioactive decay is directly perceived in the Food and Beverage Packaging Industry, when using gamma-ray sources and detectors, like in the kegs' Fill Level Inspectors.      

  Two examples of Decoherence's effects are visible with unaided eyes: the individual tracks, some centimeters long, in a Wilson chamber as apparent particle trajectories, and the radioactive decays originating these tracks.  Decoherence's effects are also directly perceived in the Food and Beverage Packaging Industry, when using gamma-ray sources and detectors


Young’s experiment with and without air 












The two figures below represent the Young double-slit experiment with and without molecules of gas (air) interposed along the paths of the electrons.

They show two completely different distributions of the hits counted on the screen:

  1. on the left side, in a vacuum;
  2. on the right side, with the gas molecules of air.  

Decoherence is what prevents us from seeing all objects in their superposed reality.   It is mainly due to the collisions between the molecules of air and the electronic clouds around each atom.  

These collisions carry away the phase correlations between the histories in which the electron arrived at different points.   
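
What follows is a minimal numerical sketch of this mechanism, ours and not part of the source: two toy Gaussian amplitudes stand in for the histories "through slit L" and "through slit U", and a visibility parameter between 1 (vacuum) and 0 (full decoherence by the gas) scales the surviving cross term.

```python
# Minimal toy sketch: two-slit hit distribution with and without decoherence.
import numpy as np

def hit_density(y, visibility):
    """Screen distribution at positions y for a given fringe visibility.

    Gaussian packets stand in for the amplitudes of the two histories
    "through slit L" / "through slit U" (arbitrary units).
    visibility = 1: vacuum, the phase correlation between the histories survives.
    visibility = 0: full decoherence, the gas carried the phase correlation away.
    """
    amp_L = np.exp(-y**2 / 8.0) * np.exp(+1j * 1.5 * y)   # history through slit L
    amp_U = np.exp(-y**2 / 8.0) * np.exp(-1j * 1.5 * y)   # history through slit U
    single_slit = np.abs(amp_L)**2 + np.abs(amp_U)**2     # two bell-shaped terms
    interference = 2.0 * np.real(amp_L * np.conj(amp_U))  # cross (interference) term
    return single_slit + visibility * interference

y = np.array([0.0, np.pi / 3.0])        # a bright-fringe and a dark-fringe position
print("vacuum (fringes)      :", hit_density(y, 1.0))   # ≈ [4.0, 0.0]
print("with air (bell shape) :", hit_density(y, 0.0))   # ≈ [2.0, 1.5]
```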

The next two sections detail what is detected with and without the interposed gas.   They recount how the Decoherence phenomenon, discovered in 1970, was hiding from direct sight the true, constantly occurring behaviour of matter and radiation, first detected in 1989.  

  Thomas Young's double-slit experiment, on the left side in vacuum and on the right side with air along the paths of electrons emitted by lamp filaments.  Counting the hits on the screen, two completely different distributions arise, a difference due to Decoherence.  Decoherence is mainly due to the collisions between the gas molecules and the electron, which carry away the phase correlations between the histories where the electron arrived at point y on the screen by passing through the L slit and those histories where the electron arrived at point y on the screen by passing through the U slit








“...There are only waves and, as is known, waves are superpositions of other waves.”





1.  Double-slit in a Vacuum

Many of the ideas about the concepts of measurement, space, matter and radiation in today's academic journals originate from ideas published, or in any case circulating, decades ago.  There are reasons why this happens.  No theory can ignore the experimental evidence, and the evidence is constantly improving.  And there are some special experiments, like Thomas Young's, first accomplished centuries ago and providing strong clues about the reality of everything, which had to wait centuries.  

The Nobel laureate Richard Feynman imagined this decades ago when he called it: "a phenomenon which is impossible ... to explain in any classical way, and which has in it the heart of quantum mechanics. In reality, it contains the only mystery of quantum mechanics”.  Feynman was writing about the interference fringes appearing in Young's double-slit arrangement when many electrons are fired simultaneously.  "Many simultaneous" means a high probability of interference fringes due to the superposition of several electrons on the screen, something absolutely expected and normal.

But when, later, Young's experiment was performed for the first time with individually fired electrons, it touched the surface of something much bigger than the wavelike properties of electromagnetic radiation.



  Tonomura's team experiment


1989

"Field emission:

electrons are emitted from a very sharp tungsten tip (thinner than 1/1000 mm) when a potential of (3 – 5) kV is applied between the tip and a first anode ring; this effect is known as field emission


















It happened in 1989.   A group at Hitachi's Advanced Research Laboratory (Tokyo, Japan) led by Akira Tonomura developed the first double-slit system allowing observation of the build-up of the fringe pattern with a very weak electron source: single electrons fired, one by one, toward a double slit.  What results when single electrons are fired one by one toward a double slit is unimaginable.  

The figure below shows a schematic representation of the modifications that Tonomura made to a Transmission Electron Microscope to develop his experimental setup.  Electrons are emitted from a very sharp tungsten tip (thinner than 1/1000 mm) when a potential in the range (3 – 5) kV is applied between the tip and a first anode ring; this effect is known as field emission.  Assorted Optoelectronics within the modified electron microscope attenuates and focuses the electron beam.  The hits are detailed in the four figures below.  

These show how, hit after hit, what at the start looks like mere noise develops in the end into a wave-like pattern.  The bright spots begin to appear here and there at random positions: these are the electrons' constructive wave packets, detected one by one and looking like particles.  These electrons were accelerated through 50 000 V, and therefore their speed is ~40 % of the speed of light, i.e. ~120 000 km/s.  Such electrons could go around the Earth three times in a second.  They pass through the one-meter-long electron microscope in 1/100 000 000 of a second.  The de Broglie wavelength of the accelerated electrons is λ = 0.0055 nm.  
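
A quick, hedged check of these figures (the constants and the relativistic formulas are standard; the script itself is ours, not part of the experiment):

```python
# Minimal sketch: relativistic speed and de Broglie wavelength of electrons
# accelerated through 50 kV, as in Tonomura's setup.
import math

h  = 6.62607015e-34      # Planck constant, J*s
c  = 2.99792458e8        # speed of light, m/s
me = 9.1093837015e-31    # electron rest mass, kg
e  = 1.602176634e-19     # elementary charge, C

V = 50_000.0                          # accelerating potential, volts
E_rest = me * c**2                    # rest energy, J
E_tot  = E_rest + e * V               # total energy, J

gamma = E_tot / E_rest                # Lorentz factor
beta  = math.sqrt(1.0 - 1.0 / gamma**2)
v     = beta * c                      # electron speed, m/s

p   = math.sqrt(E_tot**2 - E_rest**2) / c   # relativistic momentum, kg*m/s
lam = h / p                                  # de Broglie wavelength, m

print(f"speed  ≈ {v/1e3:,.0f} km/s  ({beta:.0%} of c)")
print(f"lambda ≈ {lam*1e9:.4f} nm")
# Output is close to the figures quoted in the text:
# speed ≈ 124,000 km/s (41% of c), lambda ≈ 0.0054 nm
```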

Interference fringes would be produced only if two electrons passed through the two sides of the electron biprism simultaneously.  If there were two electrons in the microscope at the same time, such interference might happen.  But this cannot occur, because there is never more than one electron in the microscope at a time, since only ten electrons are emitted per second.  Yet, when a large number of electrons is accumulated, something like regular fringes begins to appear in the perpendicular direction. 

 Clear interference fringes can be seen in the last scene of the experiment, after 20 minutes.  It should also be noted that the fringes are made up of bright spots, each of which records the detection of an electron.  Taken alone, each individual spot on the screen does not resemble anything interferential at all; rather, it hints at a corpuscular character of the objects.  Yet, although the electrons were sent one by one, interference fringes could be observed.  Interference fringes are expected only when electron waves pass through both sides of the electron biprism at the same time, and in no other case. 
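
A quick arithmetic check of the one-electron-at-a-time claim, using only the figures quoted in the text (about ten electrons emitted per second, a speed of roughly 1.2 × 10⁸ m/s, a one-meter-long column); the script is ours:

```python
# Quick check that at most one electron is in flight at any time.
rate = 10.0          # electrons emitted per second (from the text)
speed = 1.2e8        # electron speed in m/s (~40% of c, from the text)
column_length = 1.0  # length of the electron microscope column, m (from the text)

mean_spacing = speed / rate            # average distance between successive electrons
transit_time = column_length / speed   # time each electron spends in the column
occupancy = rate * transit_time        # average number of electrons in the column

print(f"mean spacing   ≈ {mean_spacing/1e3:,.0f} km")   # ≈ 12,000 km
print(f"transit time   ≈ {transit_time:.1e} s")         # ≈ 8e-9 s
print(f"mean occupancy ≈ {occupancy:.1e} electrons")    # ≈ 8e-8, effectively zero
```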

  Schematic representation of Tonomura's team experiment: their 1989 single-electron double-slit setup (abridged from Prutchi et al., 2012)


                          Time = 1


                          Time = 2


                          Time = 3


                          Time = 4





Common sense defies our imagination when trying to figure out how these images could truly exist.  Two centuries ago, Thomas Young's experiment involved light waves.   But for a century it has been observed that firing massive particles like electrons or neutrons one by one creates the same wave-like pattern.   And the same is true when firing massive mesoscopic molecules composed of hundreds of atoms: the same wave-like pattern.   Each object, fired only after the preceding one has hit the screen, contributes to an interferential shape on the screen, meaning that also mesoscopic bodies are, when closely examined, waves.   Moreover, either electrons have to be supposed to have their own brain (and surely they do not), or there is only one theory capable of explaining how they can know the path chosen by each of the electrons fired before, so as to rebuild every time the visible interferential pattern.   The only theory predicting these well-visible facts, the only one with the necessary explanatory power and not violating the axioms of Special Relativity, is the Relative State formulation of Quantum Mechanics, also named the Many-Worlds Interpretation of Quantum Mechanics (abridged from Amelino-Camelia, Kowalski, 2005)    



Whenever electrons are observed, they are always detected as individual particles. When accumulated, however, interference fringes are formed. Please recall that at any one instant there was at most one electron in the microscope.  

We remark that these figures are what is detected on the screen after it has been hit by material particles, like molecules, atoms, neutrons, or electrons.  We are not speaking of objects considered wavelike for centuries, like light (photons).  This confirms that, in reality, no material particle exists at all, rather only waves with different properties like energy, spin, etc.  

The interpretational bifurcation reached in 1982 after Alain Aspect’s group experiment on Entanglement: 

  1. Bohm-De Broglie's interpretation explains it, but only at the unacceptable price of postulating that light (and the gravitational interaction) does not define the limit speed for all causal correlations: it needs to introduce tachyons; 
  2. the Copenhagen interpretation of Quantum Mechanics has no explanation for what was registered during this experiment;
  3. Everett's Relative State formulation, since the start, included exactly what is observed;

reproposed itself in 1989, this time with many more accumulated experimental evidences.   The pattern of interference fringes, visible on the right side, always develops, also for individual particles, even if each molecule, atom, neutron, proton or electron is fired only after the previously fired particle has already hit the screen.   

The experiment has since been repeated with material bodies of progressively increasing size and mass: we are no longer in the domain of particles, but at the mesoscopic scale, close to direct, unaided human sight.  Mesoscopic macromolecules too, including several hundreds of atoms and appearing like small grains of dust well visible with common microscopes, were tested, without any change in the final result.   There are only waves and, as is known, waves are superpositions of other waves.    Moreover, either electrons have to be supposed to have their own brain (and surely they do not), or there is only one theory capable of explaining how they can know the path chosen by each of the electrons fired before, so as to rebuild every time the visible interferential pattern. 






 The Clifford torus is a 4-dimensional object of which we are forced to see here a projection.   We have developed fewer than 100 billion neurons: not enough to perceive events happening in an arena whose dimensionality is at least 5.   What we see in the video at the side is never what the Clifford torus really is and looks like from a 4D point of view we will never have.   As a consequence, the details of the branching superpositions of states, corresponding to Measurements and looking like bifurcations, can be followed only mathematically 

Since 1957 Physics has had a unique theory with the explanatory power for the images on the right side and, what is more important, it is not an ad hoc theory born to explain Tonomura's experiment, because it was conceived 32 years before Tonomura's result.  

This interpretation (the Many-Worlds Interpretation of Quantum Mechanics), in brief, explains that all objects are: 

  • waves;
  • superimposed;
  • part of a last grand superposition.  

After 1989, Thomas Young's experiment with matter has been extensively repeated on a worldwide basis, always and only reconfirming the veracity of the interference fringes visible above.   Today there also exist cheap and valuable Optoelectronics plus software kits which, connected to a computer, allow anyone to witness and record single-electron interference in the two-slit configuration.











2.  Double-slit with air

On the contrary, a decoherent set of histories is one for which the quantum mechanical interference between individual histories is small enough to guarantee an appropriate set of probability sum rules.   This is what is represented by the bell-shaped distribution observed above, on the right side.   

The continuous measurement of an object by all other objects, in our industrial Environment mainly the molecules of that mix of N2, O2, Ar and CO2 we name air, under the permanent bath of light, is the reason why we do not see simultaneous alternative Triggered Events.  













   

Repeating Young's experiment with air molecules allows us to witness the effect of the relation between a simple object (in this case, an electron) and the Environment all around it.   

In other terms, the Environment induces a superselection, separating into two or more subspaces the Hilbert space where objects really exist.  

What precedes, around one century ago, induced what was then the mainstream interpretation of Quantum Mechanics (the Copenhagen school) to establish a Wave-Particle Duality which further experiments, improved technologies and theories showed to be a mere illusion. 

For decades it has been clear that Decoherence is what lets us perceive as a unique state of an object what is a superposition of states, and that everything is:

  • waves;
  • branching superpositions of waves.

 Evolution of the coherent history of the wave function Ψ.  Two branches Ψ(1) and Ψ(2) separately evolve into five branches Ψ(1,1), Ψ(1,2), Ψ(2,1), Ψ(2,2) and Ψ(2,3), which evolve into 17 further branches.   A scenario today very frequently cited in the scientific literature when considering Events.   A number of technological facts find there their only explanation (abridged from Zurek, Riedel, Zwolak, 2013)


 Heinz-Dieter Zeh, discoverer of Decoherence






Decoherence's implications apply to all measurements, including the gamma rays counted in Geiger counters and also the gamma- or X-ray fill level inspections in Food and Beverage Packaging Lines   

The point is clearly explained by the physicist Heinz-Dieter Zeh, Decoherence's discoverer, portrayed here at the right side. 

Below we prefer to quote directly his own illuminating words, written in 2010 as "Quantum discreteness is an illusion" and published in Foundations of Physics, which approach the subject from a historical angle:

“(…)  When Ernst Mach was confronted with the idea of atoms, he used to ask: “Have you seen one?”   He finally accepted their existence when Loschmidt’s number had been confirmed beyond doubt to be finite.   At this time he could hardly imagine that this number represented no more than the number of nodes of some not-yet-known high-dimensional wave function.

However, can we today not observe individual atoms and other kinds of particles in many ways?   When experimentalists store single “particles” in a cavity, this may again be understood in terms of wave functionals with one single node – but why do they often appear as pointlike (discrete) objects in space and time?

What we actually observe in all such cases are only “pointer positions” of appropriate measurement devices, that is, positions of macroscopic objects, such as spots on a screen, droplets in a Wilson chamber, bubbles, or clicks of a counter.   So one naturally expects local or instantaneous microscopic causes for these phenomena.   While particles remained an essential ingredient for Heisenberg’s quantization procedure and its interpretation, Niels Bohr was more careful – at least during his later years.  

He presumed classical concepts only for macroscopic objects.  In a recent paper, Ulfbeck and Aage Bohr (Niels Bohr’s son) concluded that, when the decay of a nucleus is observed with a Geiger counter, “No event takes place in the source itself as a precursor of the click in the counter ...”.   They refer to this interpretation as “the new quantum theory”, although they do not specify whether they thereby mean Niels Bohr’s later interpretation or their own generalization of it.   So far I agree with them, but they assume furthermore that “the wave function loses its meaning” when the click occurs.  With this latter assumption, Ulfbeck and Bohr are missing a better and more consistent description of the quantum measurement and its classical outcome.  

When the apparatus and the Environment are both included into the description by a Schrödinger wave function, unavoidable interactions lead to a dislocalization of the initially prepared superposition, the process known as Decoherence.  This consequence may well explain Niels Bohr’s pragmatic rules, but without postulating any genuine classical properties to assume definite values “out of the blue”, and without changing the rules of logic. 

Although Ulfbeck and Bohr’s claim may thereby even be justified in the sense that the nonlocal wave function becomes inaccessible to any local observer, this consequence is here derived precisely by assuming a global wave function that obeys the Schrödinger equation.







Since misinterpretations of decoherence (such as in terms of perturbations by, rather than entanglement with, the Environment) are still quite popular, let me here briefly review its mechanism and meaning.   For this purpose, assume that some effective macroscopic variable y had been brought into a superposition described by the wave function ψ(y).  Its uncontrollable Environment χ(z), where z may represent very many variables, would then unavoidably and extremely fast be transformed into a state χy(z) that depends strongly, but uncontrollably, on y, while a reaction (recoil) of the macroscopic system, such as ψ(y) → ψ’(y), can often be neglected.   

The macroscopic superposition described by ψ(y), which would represent a “Schrödinger cat”, is thus (in practice irreversibly) dislocalized: it no longer exists either in the system or in the environment. 

Their combined state is entangled, as its wave function ψ(y) χy(z) is not a product of functions that depend separately on y or z.   In general, we do not even have an interpretation for it, since we can only perform local measurements.   However, there do exist nonlocal microscopic states whose individual meaning is well defined, such as total angular momentum eigenstates of spatially separated objects.


So one might expect never to find any macroscopic variable in a superposition ψ(y).  Rather, its reduced density matrix, obtained by tracing out the Environment, would be the same as that of an ensemble of narrow wave packets.   

This is an essential step to understand the classical appearance of the world as well as the observed quantum indeterminism.   The correspondence principle is not only unrealistic (in assuming isolated systems), but would also be insufficient to describe the transition from quantum to classical physics.

 Decoherence implications apply to all measurements, including what is shown by the dials of the analog multimeters










An entangled state that includes a macroscopic variable can be created by von Neumann’s unitary measurement interaction.  For example, if a microscopic superposition:


                                φ(x)  =  Σ cn  φn(x) 

  Decoherence implications apply to all measurements, including those of the common digital multimeters







is measured by means of a pointer variable y, this means in quantum mechanical terms:


              Σ cn  φn(x) ψ0(y)    →    Σ cn  φn(x) ψn(y)


where ψn(y) are narrow and mutually (approximately) orthogonal wave packets centered at pointer positions yn.   If y were a microscopic variable that could be isolated from its environment, this process would represent a reversible measurement (describing “virtual” decoherence of the local superposition φ(x) ).   

But according to what has been said above, the superposition of macroscopically different pointer states ψn is immediately and irreversibly decohered by its Environment, giving rise to further, now uncontrollable entanglement:


                Σ cn  φn(x) ψn(y) χ(z)   →   Σ cn  φn(x) ψn(y) χy(z)


This superposition can never be relocalized (“recohered”) any more to become accessible to a local observer.  Therefore, the fast but continuous process of decoherence describes an apparent collapse into an ensemble of narrow wave packets of the pointer (that is, of quasi-classical states that may discriminate between different values of n).


The very concept of quantization can be understood as the conceptual reversal of this physical process of decoherence, since it formally re-introduces the superpositions that were classically missing.   The effective quantum states are thus described by wave functions on the classical configuration space.   It is not clear, though, whether field configurations form the ultimate Hilbert space basis – as it is assumed in unified field theories.   String theory is just a specific playground to search for other possibilities, while there is as yet no reason to question the universal validity of the superposition principle. 

Insofar as Poincaré invariance remains valid, low excitations can nonetheless be classified by Wigner’s irreducible representations, which may then lead to effective quantum fields.   Their formal simplicity may even explain the mathematical beauty of emerging classical (Maxwell’s or Einstein’s) field equations, which has often given the impression that they must represent an ultimate truth.   On the other hand, interactions between the effective quantum fields lead to their intractable entanglement, a situation that allows only phenomenologically justified perturbation methods in connection with appropriate renormalization procedures.


     All the measurement systems, Bottling Controls and Machinery, are themselves superpositions of waves, like the photon wave packet depicted above.  The particles do not exist.  The reason we still name the waves in that way is mere custom



Restrictions of the superposition principle, such as those applying to macroscopic variables because of their unavoidable decoherence, are called “superselection rules”. 

Others, for example those that exclude superpositions of different electric charges, can similarly be explained by entanglement with the environment – in this case between a local charge and the quantum state of its distant Coulomb field.  The Coulomb constraint, which requires this specific entanglement, can either be understood as part of the kinematics, or again as being “caused” in the form of the retarded Coulomb field of the conserved charge in its past.

The arrow of time representing the irreversibility of decoherence requires initial conditions similar to those used in classical statistical physics: all correlations (now including Entanglement) must form “forks of causality” based on common local causes in their past.   It then follows from statistical arguments, taking into account the complexity of macroscopic systems, that these correlations usually remain irrelevant for all relevant future times.   













So they have no local effects, and there is no recoherence in the quantum case.  In spite of Decoherence, there always remains a global superposition.   We have two options to understand this consequence of the Schrödinger equation in order to remain in accordance with the observed world: either we assume that decoherence triggers a global collapse of the wave function by means of an unknown modification of the unitary dynamics, such that all but one of the decohered components disappear, or, according to Everett, that all unitarily arising components exist simultaneously, forming a “multiverse” that consists of many different quasi-classical worlds containing many different successors of the same observers.   

While the different quasi-classical “worlds” that emerge by means of decoherence of a superposition of pointer positions need not be exactly orthogonal (their wave functions may slightly overlap), decoherence of discrete neuronal states in the sensory system and the brain may also contribute to dynamically separate the “many minds” of an observer.

Superpositions of macroscopic variables are thus permanently being decohered, for example by scattered light.   While thermal radiation would suffice to cause decoherence, ordered light carries away usable and even redundant information.   A macroscopic trajectory is then said to be “overdetermined by the future” (separately in each quasi-classical branch). 



“Dislocalization”, a term by John A. Wheeler in place of Zeh's original “delocalization”, used to characterize the transformation of local superpositions into entangled (nonlocal) ones – in contrast to objects that are merely extended in space

 Photons are superpositions of wave packets of energy.  The well-known diffraction of white light into a spectrum corresponds to a macroscopic observation of a reality whose arena lies in the microscale, something we are capable of perceiving directly only thanks to the participation of a multitude of photons.  What at first sight looks like a banal coloured iris hides the impressive reality of all objects



This is the physical reason why the macroscopic past – in contrast to the future – appears to “already exist” and to be fixed.

An almost classical trajectory can be observed for an α-particle in a Wilson chamber by means of repeated measurements of its position by the undercooled gas.   

Although its center-of-mass wave function may continuously escape from a decaying atomic nucleus in the form of a spherical wave, Mott’s analysis has shown how interaction of the α-particle with electrons of the gas molecules gives rise to the superposition of a continuum of narrow angular wave packets (representing quasi-rays pointing in all directions) that are correlated with ionized molecules lying along almost straight tracks. 

The wave function does evidently not lose its meaning according to Ulfbeck and Bohr when the first event on a track occurs.   The ions lead to the formation of macroscopic droplets, which are in turn irreversibly decohered and documented by scattered light (a consequence not yet taken into account by Mott).  

Decoherence separates even branches with slightly different droplet positions along the same track.  












This situation differs only quantitatively from that of trajectories of macroscopic bodies by: 

  1. not completely negligible recoil of the α-particle (leading to slight deviations from straight tracks related to quantum Brownian motion), 
  2. somewhat weaker interaction of the α-particle wave functions with their environment (leading to noticeable gaps between successive “particle positions” that may be regarded as forming “discrete histories”).

If recoil is strong, such as for the scattering between “particles” of similar mass in a gas, the thereby decohered variables (positions) are also localized, but they cannot follow quasi-deterministic trajectories any more.   Boltzmann’s stochastic collision equation is then a more realistic quasi-classical approximation than the deterministic particle mechanics from which it is traditionally derived.   

A particle picture for the gas molecules is thus nothing but a prejudice derived from classical physics. (…)  Decoherence does not only explain quasi-classical states as narrow wave packets (apparent points) in the thus emerging “configuration” space, but also apparent decay and other events.   

If a decaying system were described by a Schrödinger equation, its time-dependence would be continuous and coherent – corresponding to a superposition of different decay times.   This is known to require small deviations from an exponential decay law, which are observable under specific circumstances.   It demonstrates that the decay of isolated systems is incompatible with the assumption of stochastic decay events or quantum jumps.   

  Decoherence's implications apply to all measurements, including those related to our own brain's neuronal functions.  It has been demonstrated by Max Tegmark that neuronal systems decohere in extremely brief times.  Decoherence of discrete neuronal states in the sensory system and the brain may also contribute to dynamically separating the “many minds” of an observer

However, when the outgoing wave front interacts with an environment, it is decohered from the later emitted partial wave. 

In this way, the wave is decohered into many partial waves which correspond to different decay times, whereby the time resolution depends on the strength of the interaction.  In this way, decoherence leads to an apparent ensemble of stochastic events, and to their consequence of an exact exponential decay.   

In the case of clearly separated energy eigenvalues, as they usually exist for microscopic systems, interactions of the decay products with the environment also tend to decohere superpositions of different energies.   This can be directly observed for individual atoms under permanent measurement (that is, under strong decoherence), and it explains why microscopic systems are preferentially found in energy eigenstates.


Decoherence acts at all scales.  The constructive interferences of light wave packets explore our own skin; after the air molecules, they are the second cause of our own body's fast decoherence

After decoherence has become irreversible, the global superposition can for all practical (or “operational”) purposes be replaced by an ensemble. A decay or transition is then regarded as “real” rather than “virtual”, even when we have not yet observed it.   This may justify the pragmatic though in general insufficient interpretation of the wave function as representing “quantum information”.   I do not know of any discrete quantum phenomenon in space or time that can not be described by means of decoherence. (…)”   
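
The pointer-measurement scheme quoted above, where each component φn of the superposition becomes correlated first with a pointer packet ψn and then with an environment state, can be illustrated by a minimal numerical sketch of ours (a toy two-component system, not Zeh's own formalism or code): tracing out the environment multiplies each off-diagonal element of the reduced density matrix by the overlap of the corresponding environment states, so orthogonal environmental records suppress the coherences.

```python
# Minimal sketch: decoherence of a two-component superposition.
# The system starts in c0|0> + c1|1>; the measurement correlates each component
# with an environment state chi_0 or chi_1. Tracing out the environment leaves
# a reduced density matrix whose off-diagonal terms carry the factor <chi_n|chi_m>.
import numpy as np

def reduced_density_matrix(c, chi):
    """System density matrix after entanglement with environment states.

    c   : length-2 array of amplitudes c_n of the initial superposition
    chi : 2 x d array, chi[n] is the (normalised) environment state that
          becomes correlated with component n
    """
    rho = np.zeros((2, 2), dtype=complex)
    for m in range(2):
        for n in range(2):
            # rho_mn = c_m * conj(c_n) * <chi_n | chi_m>  (environment traced out)
            rho[m, n] = c[m] * np.conj(c[n]) * np.vdot(chi[n], chi[m])
    return rho

c = np.array([1.0, 1.0]) / np.sqrt(2.0)            # equal-weight superposition

# Before the environment has recorded anything: chi_0 = chi_1
chi_same = np.array([[1.0, 0.0], [1.0, 0.0]])
print(np.round(reduced_density_matrix(c, chi_same), 3))   # off-diagonals 0.5: coherent

# After decoherence: chi_0 and chi_1 are orthogonal records in the environment
chi_orth = np.array([[1.0, 0.0], [0.0, 1.0]])
print(np.round(reduced_density_matrix(c, chi_orth), 3))   # off-diagonals 0: mixture
```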





Decoherence speed


“Why do we only experience individual sharp superpositions, single bowling balls, rather than multitudes ?”.  

Because all others get damped out by decoherence, before we have the time to observe them

Imagine a physical object as heavy as a Bowling Ball in an Environment in standard conditions of temperature, air pressure and humidity.  It is a superposition of a multitude of possible correlations between its elementary components (quarks, gluons, leptons, etc.) and all of the others building up what we name Environment.  Its even and odd components have equal classical components but opposite quantum interferences.  

The Laboratoire Kastler Brossel of the French CNRS prepared and processed such a superposition, subtracting the Wigner functions of its even and odd components and thus isolating the interference feature displaying their quantumness.  The result is the evolution of this signal over 50 ms, exhibiting, after only a few milliseconds, a fast decay, due to Decoherence, of the original pure interference pattern which represented the physical object.   








How fast Decoherence happens has been known for two decades.  

The Table below shows the Decoherence Rates (or Localization Rates), expressed in units of cm-2 s-1, for three cases representative of objects at the micro-, meso- and macroscopic scales, respectively:

  • an electron, not bound to any atom;
  • a dust particle, at the limit of unaided-eye visibility;
  • a bowling ball.

Bowling balls decohere in an extremely short time, explaining our sensation of their unique existence in a definite place    




  The magnesium fluoride anti-reflective multicoating, well visible by its pink colour on this camera, is a practical example of quantum interference.  Here, destructive interference is used to increase the Signal-to-Noise ratio of the Information contained in the image






  

      





  Classic or non-classic behaviour of objects as an effect of Decoherence by the surrounding environment.  Tabular values show the Localization Rate, expressed in units of cm-2 s-1, for the center-of-mass of three different objects, in order of decreasing strength.  The Localization Rate measures how fast interference between different positions disappears, for distances smaller than the wavelength of the scattered objects.   Shown here are the localization rates for 3 cases: an electron not bound to any atom, a dust particle and a bowling ball.  Our operative ambient conditions are met at a thermal background temperature of ~ 300 K (~ 27 ºC) in air at a pressure of 1 atm, implying extremely high localization rates for an object of the size of a bowling ball, but fourteen orders of magnitude smaller for an object as small as an electron (table abridged from Tegmark, 1993)


 





























We will now be more precise about how our operative ambient conditions [thermal background temperature ~ 300 K (~ 27 ºC) in air at a pressure of 1 atmosphere] imply extremely high localization rates for an object of the size of a bowling ball, but fourteen orders of magnitude smaller for an object as small as an electron.  The effect of air, or of any other surrounding substance, and of the black-body radiation from the surroundings, is strongly temperature dependent (typically ∝ T), and can hence be reduced by nine orders of magnitude by working at liquid Helium temperatures or, to a smaller extent, by taking advantage of a Peltier-effect cell.  

This is exactly the strategy followed when looking for the maximum performance of Optoelectronic devices, first of all CCD sensors.   Comparing these results with conditions of minimised scattering, such as exposing these objects to a cold environment containing only the cosmic background radiation, at a temperature of ~ 3 K (~ -270 ºC), the macroscopic bowling ball in our ordinary conditions decoheres in a time about 10^28 times shorter than it would there.
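
As a rough illustration of how a localization rate Λ is used (the relation τ ≈ 1/(Λ·Δx²) reflects the definition recalled in the table caption, namely that coherence between positions a distance Δx apart decays roughly as exp(−Λ·Δx²·t); the numerical rates below are placeholder orders of magnitude chosen by us, not the entries of Tegmark's table):

```python
# Illustrative sketch only: from a localization rate to a decoherence timescale.
# Coherence between two positions separated by dx decays ~ exp(-rate * dx**2 * t),
# valid for separations smaller than the wavelength of the scattered particles.

def decoherence_time(rate_cm2_s: float, dx_cm: float) -> float:
    """Timescale (s) over which coherence across a separation dx is suppressed."""
    return 1.0 / (rate_cm2_s * dx_cm**2)

# Placeholder localization rates in cm^-2 s^-1 for air scattering at ~300 K, 1 atm.
# Hypothetical orders of magnitude (electron ~14 orders below the bowling ball,
# as stated in the text), NOT the actual entries of Tegmark's table.
rates_300K = {
    "free electron": 1e24,
    "dust particle": 1e31,
    "bowling ball":  1e38,
}

dx = 1e-7   # cm (1 nm): separation between the two superposed positions

for name, rate in rates_300K.items():
    tau_300K = decoherence_time(rate, dx)
    # Air-scattering rate taken as roughly proportional to T, as stated above:
    # cooling from 300 K to 4 K (liquid helium) reduces the rate ~75 times.
    tau_4K = decoherence_time(rate * (4.0 / 300.0), dx)
    print(f"{name:13s}  tau(300 K) ≈ {tau_300K:.1e} s   tau(4 K) ≈ {tau_4K:.1e} s")
```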



           

These studies allowed us to answer the main question arising after 1957:  

“Why do we only experience individual sharp superpositions, single bowling balls, rather than multitudes ?”.  

…because all of the others get damped out by Decoherence before we have the time to observe them.  



Decoherence's limits

Decoherence is a process subject to the same fundamental limits seen at the start of this page with respect to the volumes of space causally connected with the Trigger Events, a subject we'll deepen in the following.

Refer to the figure below, where:

  • blue coloured,   3-D volume (encoded in 2-D for graphic rendering) of space hosting the environmental factors E (e.g. air, thermal photons) and the quantum system S;
  • red coloured,     3-D volume causally disconnected;
  • M,                      Trigger Event;
  • S,                       quantum system interacted with (“measured”) by the Trigger;
  • A,                       macroscopic measurement instrument, e.g. a Trigger photosensor.

The majority of space is always causally disconnected.  The worldline of the macroscopic measurement instrument A (the Trigger) is indicated as a bold, inclined black line.  It is inclined with respect to the Time axis to show that the instrument sits on a non-inertial platform. 

What precedes has an interesting implication: the macroscopic measurement device A (the Trigger, in our case) remains in a superposition of states, not decohered, with respect to everything that is too far away to be causally connected; say, everything that in the figure below is shown in red.   The reduced volume that we consider the Environment of a Food and Beverage Bottling Line assures that all triggerings and measurements are always derived from interactions with decohered states.  As visible in the figure below, this assumption is a fiction useful to simplify an extremely complex relation.  Like all fictions, it cannot provide definitive solutions nor improvements on hard-to-tackle technical issues.   
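
A minimal sketch of the causal-connection criterion underlying the figure, assuming flat spacetime and ignoring the non-inertial character of the Trigger's worldline (the function and the numbers are ours, purely illustrative):

```python
# Minimal sketch (flat spacetime assumed): is an event causally connected to,
# i.e. inside the future light cone of, the measurement Event M?
import math

C = 2.99792458e8  # speed of light, m/s

def in_future_lightcone(t_M, x_M, t, x):
    """True if event (t, x) can be influenced by Event M at (t_M, x_M).

    Times in seconds, positions as (x, y, z) tuples in metres.
    """
    dt = t - t_M
    if dt <= 0.0:
        return False                      # not in the future of M
    dr = math.dist(x, x_M)                # spatial separation
    return dr <= C * dt                   # inside (or on) the light cone

# Example: a photosensor 10 m away from M, read 1 microsecond after M.
# Light needs ~33 ns to travel 10 m, so the reading can be causally related.
print(in_future_lightcone(0.0, (0, 0, 0), 1e-6, (10.0, 0, 0)))   # True
# The same sensor read only 10 ns after M is causally disconnected from it.
print(in_future_lightcone(0.0, (0, 0, 0), 1e-8, (10.0, 0, 0)))   # False
```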

This subject is reminiscent of the Problem Solving method that searches for the Root Cause of a problem starting from the point where the effects are felt, a strategy commonly followed when an evident cause cannot be found in the space-time volume we chose to consider the connected Environment.    

In these cases, we are searching for all of the thinkable cause-effect relations:

  • in progressively larger space volumes;
  • going backward in time. 

The modern answer of Physics to this point is twofold, stating that:

  • all subsystems are in a superposition of states, as seen by all the other subsystems that are causally disconnected because too far away;
  • all that lies in the future light cone of the measurement Event M is related to M and gives rise to effects on S + A conditioned by the Environment E. 



 The measurement of a physical property by a macroscopic instrument, e.g. by an inspection in a Bottling Control, is subject to additional limits.  Only the properties of the objects S and of the environment E within the causally connected (blue) volume of space, part of the future light cone of the Event M (the Measurement), are decohered.  This means that the macroscopic measurement device A continues to remain in a superposition of states with respect to all that lies in the causally disconnected, red-coloured, space (adapted from L. Susskind, R. Bousso, 2012)















