Classic View
Principle of Superposition, classic version
For all linear systems, the net response at a given place and time caused by two or more inputs (or stimuli) is the sum of the responses which would have been caused by each input (or stimulus) individually.
Then, if x and y are the inputs and a and b are two scalars, the corresponding output is:
f (a x + b y) = a f (x) + b f (y)
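In modern numerical terms, this defining identity can be checked directly. The sketch below, with arbitrary illustrative functions and values (none taken from the article), contrasts a map that satisfies the identity with one that does not:

```python
import math

# A minimal numerical check of the classic Principle of Superposition.
# f_linear is an illustrative linear map; f_nonlinear violates linearity.
def f_linear(v):
    return 3.0 * v            # pure scaling: additive and homogeneous

def f_nonlinear(v):
    return v ** 2             # squaring: violates additivity

a, b = 2.0, -1.5              # two arbitrary scalars
x, y = 4.0, 7.0               # two arbitrary inputs

# Linear system: f(a x + b y) = a f(x) + b f(y) holds
assert math.isclose(f_linear(a * x + b * y), a * f_linear(x) + b * f_linear(y))

# Nonlinear system: the same identity fails
assert not math.isclose(f_nonlinear(a * x + b * y),
                        a * f_nonlinear(x) + b * f_nonlinear(y))
print("superposition holds for the linear map only")
```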
All the engineers engaged on a daily basis with calibration operations on the equipment and machinery of Food and Beverage Production Lines know they continuously apply an idea named the Principle of Superposition. As an example, we apply it during the most diverse activities, when:
- applying Root Cause Analysis, during problem solving;
- troubleshooting electronics, electrotechnics and electromechanical problems;
- designing simple or complex electronic circuits;
- imagining how an effect could be related to a hypothetical cause.
A canon some centuries old, always part of the design calculations of Electronics (Electronic Inspectors and Food and Beverage Bottling Controls fully included), Mathematics and Analytical Mechanics. The classic version of the Principle of Superposition, basically stated in the form above, is based on the assumption that the system:
- is linear (the function satisfies the conditions of additivity and homogeneity);
- is isolated from the Environment: no influences (inputs or stimuli) act on the system except the inputs considered (x and y, in the example above);
- has no subsystems and, then, no correlation of subsystems’ states.
Considering that the state of a system is uniquely defined by the correlations of its subsystems, the third point above could surely be a reasonable assumption centuries ago, when the Principle was born. In the meantime, Science and Technology have advanced greatly in the reduction of measurement uncertainties.
All clockworks are systems of linear mechanical components, known for centuries to be extremely sensitive to the Environment. The entire Physics was tentatively modelled as a clockwork world until the year 1899. In 1900 the Nobel laureate Max Planck made the decisive breakthrough which changed the course of Mankind's history.
Already in the year 1900, the evaluation by Planck of the constant that has since carried his name implied an update of some assumptions valid for centuries. Without entering into fine details, we all know that even the simplest linear systems, when closely looked at, are in reality extremely complex. Known examples: inductors and capacitors connected to alternating current generators, mechanical resonance of metal bars, waves, etc. The very term complexity hints at the existence of subsystems, and subsystems’ states are correlated. In reality, the supposedly linear behaviour was, already centuries ago, the object of the doubts and investigations of many. Also the clockwork visible below is a known complex and purely mechanical system, where the Environment (e.g., the ambient temperature and atmospheric pressure) is decisive, and different times shall be presented under different conditions.
Example 1
Superposition of two Monochromatic Waves
To understand how many ideas lie today behind the word Superposition, we suggest simply observing a liquid wave motion like the one in the video below. The wave motion of liquids, e.g. sea waves, is a particularly complex case of what the formulae in the following show about the superposition of two waves. In the wave motion of liquids, undercurrent, foam and spray are influenced by the flip movements of the particles.
Not an exaggeration, when considering that each frame of the video, in its native extremely high definition, took a high-end personal computer 8 hours: months of computation were necessary to make this video. The Machine Vision systems inside Empty and Full Bottle Inspectors and Case or Crate Inspectors show similar levels of computational complexity. The Principle of Superposition in its classic version, when applied to two (or more) waves with the same wavelength λ, superposed and propagating along the same direction, states that they can be considered a single wave whose instantaneous amplitude is the sum V of the individual instantaneous amplitudes of the separate waves U and U′. As an example, let us consider two monochromatic (isofrequential) waves U and U′ with the same amplitude a, differing only in their phases φ and φ′:
U = a cos (ωt − φ)
U' = a cos (ωt − φ′)
The instantaneous value of their combined amplitudes U and U′, is:
V = U + U′ = a[cos(ωt − φ) + cos(ωt − φ′)] = 2a cos[(φ − φ′)/2] cos[ωt − (φ + φ′)/2]
The resulting intensity A (the square of the combined amplitude), depending upon the difference of the optical paths x and x′ of each wave, expressed in units of the wavelength λ, results:
A = 4 a^{2} cos^{2}[π (x − x′) / λ] = 2 a^{2}{1 + cos[2 π (x − x′) / λ]}
Thus, there can be both constructive and destructive interference between the two waves, and the resulting amplitude can be anything between 0 and 2a. This result is extremely important in that it states that the sum of two waves can also be their mutual annihilation and disappearance. The example above, however complex it may appear, treated a case where the entire dynamical phenomenon lies in a spatially limited volume. No external Environment affects the 4D evolution of the wave. It is, literally, infinitely less complex than an apparently banal liquid wave motion.
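The identities above can be verified numerically. The sketch below (wave parameters are illustrative assumptions, not taken from the article) superposes the two monochromatic waves U and U′ for three phase differences and compares the observed peak of V with the envelope 2a cos[(φ − φ′)/2]:

```python
import numpy as np

# Illustrative sketch of the superposition V = U + U' of two monochromatic
# waves of equal amplitude a, differing only in their phases phi and phi'.
a = 1.0                           # common amplitude (assumed)
omega = 2 * np.pi * 50.0          # angular frequency (arbitrary 50 Hz)
t = np.linspace(0.0, 0.04, 2000)  # two full periods

def wave(phase):
    return a * np.cos(omega * t - phase)

for phi, phi_p, label in [(0.0, 0.0,       "in phase (constructive)"),
                          (0.0, np.pi,     "opposite phase (destructive)"),
                          (0.0, np.pi / 2, "quadrature")]:
    V = wave(phi) + wave(phi_p)
    # Envelope predicted by the identity: |2 a cos((phi - phi') / 2)|
    predicted = abs(2 * a * np.cos((phi - phi_p) / 2))
    print(f"{label}: max |V| = {np.abs(V).max():.3f}, predicted = {predicted:.3f}")
```

The opposite-phase case prints a vanishing peak: the mutual annihilation discussed above.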
Example 2
Supposedly Linear System: the RLC Series Resonant Circuit
James Clerk Maxwell
An example of the strict correlation of the subsystems, and of the correlation between a system and its Environment, is known to Electronic Engineers operating in Food and Beverage Bottling Lines: the behaviour of the RLC resonant circuit when varying the frequency of the alternating current signals applied. Inductors, resistors and capacitors are linear components, where linear means they are bipoles whose behaviour follows the classic version of the Principle of Superposition. These circuits are described by Maxwell’s differential equation [3] below, where E is the potential, L the inductance, C the capacitance, R the resistance, i the current, t the time, f the frequency, and ω = 2 π f:
[3]   L di/dt + R i + (1/C) ∫ i dt = E cos(ωt)
having solution [4] for the current intensity i:
[4]   i = E cos(ωt − θ) / √[R^{2} + (ωL − 1/ωC)^{2}],   with tan θ = (ωL − 1/ωC) / R
These relations between currents, voltages, resistance, inductance and capacitance are what we expect to see when applying the Principle of Superposition, assuming it fully applies to this kind of system and its subsystems. Connect the RLC (Resistor-Inductor-Capacitor) series circuit to a Signal or Function Generator, creating what is visible on the right side, and power on, setting a sinusoidal wave form:
The fiction. The linear character of the inductor, resistor and capacitor is what we expect to see after connecting them to a Function or Signal Generator in an RLC series circuit, powered by a sinusoidal wave form. Linear means they are bipoles whose behaviour follows the classic version of the Principle of Superposition
The reality. R, L and C are correlated subsystems, each one of them separately and differentially exposed to an Environment whose relation with the correlated RLC is impossible to fully account for
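Before the bench procedure that follows, "the fiction" can be quantified: a short sketch of the ideal current amplitude predicted by equation [4], with assumed illustrative component and generator values, shows the clean resonance the classic theory expects:

```python
import numpy as np

# Ideal, textbook current amplitude of a series RLC circuit versus
# frequency: "the fiction" of equation [4], before any parasitic or
# environmental effect intervenes. All values are assumptions.
R = 100.0       # ohm
L = 10e-3       # henry
C = 100e-9      # farad
E = 1.0         # volt, generator amplitude

def current_amplitude(f):
    omega = 2 * np.pi * f
    reactance = omega * L - 1.0 / (omega * C)  # net series reactance
    return E / np.sqrt(R**2 + reactance**2)    # |i| = E / |Z|

f_res = 1.0 / (2 * np.pi * np.sqrt(L * C))     # resonant frequency
print(f"resonance at {f_res:.0f} Hz, peak current {current_amplitude(f_res)*1e3:.2f} mA")

# The decadic sweep suggested in the procedure
for f in [50.0, 500.0, 5e3, 50e3, 500e3, 5e6, 50e6]:
    print(f"f = {f:>10.0f} Hz: i = {current_amplitude(f)*1e6:10.3f} uA")
```

At resonance the reactances cancel and the current is limited by R alone; "the reality" described below departs from this curve more and more as the frequency rises.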
- ideally, to allow precise measurements, the behaviour of the circuit should be controlled by means of an instrument measuring the intensity of high frequency alternating current, with scales of 10 μA, 100 μA, 1 mA and 10 mA;
- if such an instrument is not available, to understand what happens a visual display of the wave form and its features may be enough:
- directly in the Function Generator (like in the figure on the side), or:
- by means of an Oscilloscope connected to the same leads where the Function Generator or Signal Generator is connected;
- start at a frequency of 50 Hz and increase the frequency by decadic steps to 500 Hz, 5 kHz, 50 kHz, 500 kHz, 5 MHz, 50 MHz;
- after each increase of frequency:
- observe and take note of the intensity of the current (or evaluate the oscilloscope’s wave forms, if you have no microamperometer);
- move your hand to a distance of 300 mm from the circuit, and look for effects on the measurement readouts (or waveforms' shapes);
- for frequencies > 200 kHz, the mere oscillation of the hand by a few centimetres, at a distance of 300 mm, lets a circuit built with three surely linear components behave in an apparently nonlinear way. The hand, lying outside of the RLC circuit visible on the side, is now part of an Environment undoubtedly correlated with the resonant circuit;
- staying far from the circuit and increasing the value of the frequency, we are spectators of the circuit's increasingly complex behaviour, due to the superposition of the effects of distinct behaviours:
- RLC circuit resonance;
- correlations between RLC circuit's subsystems;
- correlation between the resistor R and Environment;
- correlation between the inductor L and Environment;
- correlation between the capacitor C and Environment;
- closing the entire circuit in a Faraday cage and increasing the value of the frequency, we are spectators of the circuit's increasingly complex behaviour, due to the superposition of the effects of distinct behaviours:
- RLC circuit resonance;
- correlations between RLC circuit's subsystems;
- correlation between the resistor R and the Environment, where the superposed terms due to electromagnetic induction by the power network, motors, data, radio and TV are minimised, but several other contributions still remain;
- correlation between the inductor L and the Environment, where the superposed terms due to electromagnetic induction by the power network, motors, data, radio and TV are minimised, but several other contributions still remain;
- correlation between the capacitor C and the Environment, where the superposed terms due to electromagnetic induction by the power network, motors, data, radio and TV are minimised, but several other contributions still remain;
implying a complex behaviour which the definition of the Principle of Superposition we are adopting as our Polar Star does not appear fully capable of handling;
Nonlinear classic devices were born long before transistors. Legacy vacuum tubes are nonlinear devices, operating on the basis of electrostatics. A portion of their characteristic curve shows the equivalent of a negative differential resistance, which made of them the first signal amplifiers in history
- increasing the frequency of the signal from 500 kHz to 50 MHz, the circuit markedly shows a complex behaviour. The RLC circuit's expected operation is progressively replaced by a vectorial superposition of different effects:
- Resistor R
- is progressively showing properties we’d expect from a superposed inductor, because of the solenoidal geometry of its material and also because the pair of terminals are metallic, hence themselves inductors;
- a superposed capacitor, because of the potential existing between different sections;
- skin effect is progressively increasing the impedance, reducing the section of its terminals actually involved in the passage of electrons;
- the metal oxides of which the resistor is made behave in a different way, introducing unexpected superimposed effects;
- changes in the Environmental temperature around the resistor seem to intervene in the operation, introducing unexpected superimposed effects;
- there is an additional alternating current induced in the terminals and in the resistive metal oxide materials lying in the em field created by the Inductor L;
- there is an additional alternating current induced in the terminals and in the resistive metal oxide materials lying in the em field created by the terminals of the Capacitor C;
- there are additional alternating currents induced in the Resistor, originating from the fact that the Generator is not ideal (it radiates);
- there are additional alternating currents induced in the Resistor, due to sudden fluctuations in frequency, polarization and amplitude of the electromagnetic fields in the Environment (e.g., data, radio and TV transmissions, cables of the power network radiating at 50 or 60 Hz);
- there are additional alternating currents induced in the Resistor, impulses of brief duration due to cosmic rays;
- there are additional alternating currents induced in the Resistor, impulses of brief duration due to environmental radioactivity;
- the electromagnetic fields of the prior point also fluctuate because of sudden changes in the local value g of the Earth's gravitational field, since no way exists to shield the RLC circuit gravitationally, and the gravimeters’ measurement repetition rate, < 10 Hz, is always delayed with respect to the em induced interfering signals we’d like to compensate;
- (…………..).
- Capacitor C
- shows a progressively marked behaviour equivalent to the one we'd expect from an inductor, because its terminals are metallic;
- an additional parallel capacitor, because of the potential existing between its terminals;
- skin effect is progressively increasing the impedance, reducing the section of its terminals actually involved in the passage of electrons;
- the dielectric behaves in a different way, introducing unexpected superimposed effects;
- changes in the Environmental temperature around the capacitor seem to intervene in the operation, introducing unexpected superimposed effects;
- there is an additional alternating current induced in the terminals and in the metal plates lying in the em field created by the Inductor L;
- there is an additional alternating current induced in the terminals and in the metal plates lying in the em field created by the passage of current through the terminals of the Resistor R;
- there are additional alternating currents induced in the Capacitor, originating from the fact that the Generator is not ideal (it radiates);
- there are additional alternating currents induced in the Capacitor, due to sudden fluctuations in frequency, polarization and amplitude of the electromagnetic fields in the Environment (e.g., data, radio and TV transmissions, cables of the power network radiating at 50 or 60 Hz);
- there are additional alternating currents induced in the Capacitor, impulses of brief duration due to environmental radioactivity;
- there are additional alternating currents induced in the Capacitor, impulses of brief duration due to cosmic rays;
- the electromagnetic fields of the prior point also fluctuate because of sudden changes in the local value g of the Earth's gravitational field, since no way exists to shield the RLC circuit gravitationally, and the gravimeters’ measurement repetition rate, < 10 Hz, is always delayed with respect to the em induced interfering signals we’d like to compensate;
- (…………).
- Inductor L
- is progressively showing properties we’d expect from a superposed capacitor, because its resistance is not zero and therefore a potential exists between different sections of the solenoid;
- an additional parallel capacitor, because of the potential existing between its terminals;
- skin effect is progressively increasing the impedance, reducing the section actually involved in the passage of electrons;
- the air in which the winding lies is clearly intervening in the operation, following relatively small changes in the Environmental humidity, introducing unexpected superimposed effects;
- there is an additional alternating current self-induced in the terminals and in the solenoid, lying in the em field created by the same Inductor L;
- there is an additional alternating current induced in the terminals and in the solenoid lying in the em field created by the passage of current through the terminals of the Capacitor C;
- there is an additional alternating current induced in the terminals and in the solenoid lying in the em field created by the passage of current through the terminals of the Resistor R;
- there is an additional parasitic capacitance originating from the fact that the winding has nonzero resistance and that there is a dielectric in between the turns;
- there are additional alternating currents induced in the Inductor, originating from the fact that the Generator is not ideal (it radiates);
- there are additional alternating currents induced in the Inductor, due to sudden fluctuations in frequency, polarization and amplitude of the electromagnetic fields in the Environment (e.g., data, radio and TV transmissions, cables of the power network radiating at 50 or 60 Hz);
- there are additional alternating currents induced in the Inductor, impulses of brief duration due to environmental radioactivity;
- there are additional alternating currents induced in the Inductor, impulses of brief duration due to cosmic rays;
- the electromagnetic fields of the prior point also fluctuate because of sudden changes in the local value g of the Earth's gravitational field, since no way exists to shield the RLC circuit gravitationally, and the gravimeters’ measurement repetition rate, < 10 Hz, is always delayed with respect to the em induced interfering signals we’d like to compensate;
- (…………..).
The values we registered for the current i, so different from what is expected from equation [4], have shown that the impedance characteristics of common circuit elements (resistors, capacitors, inductors) utilized in circuit theory are simply low-frequency asymptotes of the overall frequency responses of these components. Also, these tests show that each one of the three components, even when taken separately from the others, violates the rules for a linear device, because it has internal subsystems and, worse, subsystems whose behaviour with respect to the frequency is differential. This means that also the three circuits on the right side, equivalent to Resistor, Capacitor and Inductor even when separately considered, are frequency-dependent: their aspect and performances change, following the frequency. Also, these equivalent circuits abstract away the existence of the Environment. Comparing the reality we discovered, with fourteen elementary components appearing below rather than three (R, L, C), it becomes easier to imagine why the observed behaviour is so complex and divergent from what the differential equation [3] defines.
Equivalent circuits of a single Resistor, Capacitor and Inductor. The values we registered for the current i, so different from what is expected from equation [4], show that the impedance characteristics of the common linear circuit elements R, L and C are only low-frequency asymptotes of the overall frequency responses of these components
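As a sketch of why such equivalent circuits become frequency-dependent, the snippet below models a resistor with two assumed parasitics (a parallel capacitance and a series lead inductance, values purely illustrative) and shows its impedance departing from the nominal R at high frequency:

```python
import numpy as np

# First-order equivalent circuit of a "real" resistor: the nominal R in
# parallel with a parasitic capacitance Cp, in series with the lead
# inductance Ls. Parasitic values below are illustrative assumptions.
R = 1000.0      # ohm, nominal resistance
Cp = 0.5e-12    # farad, assumed parasitic parallel capacitance
Ls = 10e-9      # henry, assumed lead inductance

def impedance(f):
    w = 2 * np.pi * f
    z_rc = R / (1 + 1j * w * R * Cp)   # R in parallel with Cp
    return z_rc + 1j * w * Ls          # plus the series lead inductance

for f in [1e3, 1e6, 100e6, 1e9]:
    print(f"f = {f:>12.0f} Hz: |Z| = {abs(impedance(f)):8.1f} ohm")
```

At 1 kHz the parasitics are invisible and |Z| sits on the low-frequency asymptote R; near 1 GHz the same component no longer looks like a resistor at all.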
Up to 750 GHz
Imagine now having access to a Signal Generator capable of reaching an even higher frequency, e.g. 750 GHz. Then, the only model explaining what is observed is the one where the amount of superposed and correlated effects diverges to infinity. When confronted with the necessary answers to the following questions, we start to feel much more than the superposed effects of the multitude of particles building up an atom of Copper in a conductor:
- what part of the effects, and in what amount, is due respectively to the RLC circuit and to the Environment?
- of the space-time existing around the circuit, what section is to be considered Environment?
- of all of the objects (e.g., elementary particles, atoms, molecules, macroscopic bodies) existing in the section of space-time we have chosen to consider Environment, which ones, and to what extent, are to be considered causally connected?
- for each infinitesimal frequency in such a wide range, we’d be forced to deduce the complete detailed behaviour from the measurement of an amount of properties which, having been forced to include the Environment, is no longer limited to three RLC components, but rather extended to a mind-boggling amount;
- the measurement instruments we are using are not ideal, and the amplitudes of their uncertainties, as we increase the Signal frequency and extend the exam to the Environment, become of the same size as what we are looking at as Signals. What to consider Signal, and what a fluctuation due to the instrument uncertainty?
- what should happen when increasing the frequency toward infinity?
- how to treat the chained effects of a change in the temperature T, implying a new value for the resistance R, inductance L and capacitance C? Say, the equation in the partial derivatives:
∂ f (R, L, C) / ∂ T
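One way to picture this last question: give the components assumed, purely illustrative temperature coefficients and estimate the partial derivative of the resonant frequency with respect to T numerically:

```python
import math

# Sketch of the chained temperature dependence behind ∂f(R, L, C)/∂T:
# the resonant frequency f0 = 1 / (2π sqrt(L C)) inherits the drift of
# L and C. Temperature coefficients below are illustrative assumptions.
L0, C0 = 10e-3, 100e-9       # values at the reference temperature T0
alpha_L = 50e-6              # 1/K, assumed tempco of the inductor
alpha_C = -150e-6            # 1/K, assumed tempco of the capacitor
T0 = 25.0                    # degrees Celsius, reference temperature

def f0(T):
    L = L0 * (1 + alpha_L * (T - T0))
    C = C0 * (1 + alpha_C * (T - T0))
    return 1.0 / (2 * math.pi * math.sqrt(L * C))

# Central-difference estimate of the partial derivative at T0
dT = 0.01
df_dT = (f0(T0 + dT) - f0(T0 - dT)) / (2 * dT)
print(f"f0(T0) = {f0(T0):.1f} Hz, df0/dT ~ {df_dT:.3f} Hz/K")
```

Even with these modest coefficients the resonance drifts with every fluctuation of the ambient temperature, chaining the Environment into the circuit's response.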
Introducing the module below in the output feed of a Signal Generator like the one above, it becomes possible to extend to 750 GHz the frequency of the signals introduced into electrical and electronic circuits, thus allowing us to witness the divergence of the superposed correlated effects. An infinity converging to a limited, however mind-boggling, amount due to the discretization of the frequencies readable in the Generator (abridged by Rohde & Schwarz™, Virginia Diodes™/2013)
Common cables, linear when transferring DC power (not closely looking at what in the meantime is going on at the atomic scale), start to reveal their true nonlinear nature as we increase the frequency of the Signals
We saw above only a small fraction of all the aspects that an allegedly simple RLC series circuit presents when closely examined, by means of the Principle of Superposition, in its Environment and increasing the frequency of the alternating current i. Above we considered a relatively complex case with three electronic components. May linearity be assured when considering simpler examples? Of all the electrical components, none looks simpler than a cable. What is written above is true also for the least suspect of the linear devices, like the common copper cables and their connectors, visible in the figure on the right side. Cables too, linear when transferring DC power (not closely looking at what in the meantime is going on at the atomic scale), start to reveal their true nonlinear nature as we increase the frequency of the Signals. How true this statement is can be inferred from the graphics below, showing a cable powered in a range of frequencies of (0 - 34) GHz. On the vertical axis, the ratio between transmitted and incident signals, expressed in dB. The cable transmitted response (voltage transmitted / voltage incident) was quite finely evaluated, sampling the signals 80 billion times per second. In the figure, evidenced by a red square, is the oscillating behaviour shown above signal frequencies of 27 GHz: a classically unexpected behaviour, manifestly complex and nonlinear. We leave the Reader to imagine what it signifies, in terms of deviations, to power that same cable at 750 GHz, considering that already at 30 GHz the frequency response is what is visible in the graphics below.
Classically unexpected behaviour, manifestly nonlinear, of the simplest electric components shows itself when increasing the signals' frequency. In the example, referred to a cable, to signal frequencies ranging over (27 - 34) GHz correspond wide deviations from what is expected by classical Electrodynamics. Nature has an agenda different from our classic one, itself based on a completely different set of assumptions. In evidence is the fact that the standard deviation (“Std Dev”) of the timing is only 1.74587 ps (picoseconds, 10^{-12} s). We are looking at the Events with improved precision, but still compelled to zoom an additional 29 orders of magnitude to see directly their natural Planck scale
Information Flow underlying Measurements
Knowingly, in Classical Mechanics the Lagrangian L of a system, i.e. of a material body, is the relation between its kinetic and potential energy. Its analogue in Quantum Electrodynamics (QED) is the Lagrangian density, noted L, an amount which can be integrated over all spacetime to get the action S. The Lagrangian density L of quantum electrodynamics is a typical example of interacting systems:
L = − e ψ̄ γ^{μ} A_{μ} ψ + ψ̄ (i γ^{μ} ∂_{μ} − m) ψ − (1/4) F_{μν} F^{μν}
Left to right, it contains three terms:
Year 2000: in the absence of any Intelligence coordinating all of the particles of the Universe, the extreme relevance of the Information Flow for all physical phenomena started to be understood
The Italian physicist, mathematician and astronomer Joseph Lagrange (Giuseppe Lodovico Lagrangia) is also known for his discovery of the today-named Lagrangian Points. The animation above shows the case of the L1, L2, …, L5 Lagrangian points of the Earth-Moon physical system (CC 3.0)
- the coupling between an ideal electromagnetic field and an ideal electron field;
- an ideal electron field in isolation, characterized in terms of a mass parameter m;
- an ideal electromagnetic field in isolation, based on the elementary charge e.
These parameters pertain to a literally bare electron, stripped of all electromagnetic field, which is described separately. Shortly we’ll see how deeply all this applies to the operation of industrial Machinery. Readers are invited to observe that the rationale used to evaluate the QED Lagrangian density L passes through a first term giving due relevance to the coupling: the coupling of two ideal fields, an electron field and an electromagnetic field. In other words, Quantum Electrodynamics recognized and calculated the fundamental action of a charged particle, like the electron, over itself. Since a long time, this is the recognized origin of a row of practical industrial applications of electromagnetic induction, like the motors. Something presented to high school students under the heavy, today 150-years-old, Classic dress of reactance, impedance and reactive power. After this brief introduction to the revolution started by Paul Dirac in 1938 when founding Quantum Field Theory, we ask the Readers to extend, as much as they are capable, in space and time the rationale of his mental strategy. What is the widest thinkable collection of particles? The Universe. How many couplings between particles? For more than seven decades it has been known that our measurement instruments (RLC circuit included) and Machinery are causally related, a near-synonym of coupled, with the other ≳ 10^{80} particles: particles close enough to allow the establishment of a causal relation. To coordinate the movements and energy levels of so many particles, spread in such an enormous volume, both of the following are vital:
- Intelligence;
- Information Flow
To be sure of this last statement, imagine you have been given the task to move over 10^{80} material particles, assuring their positions with accuracies of ~10^{-31} m and timings as precise as 10^{-43} s. Needing an extremely high processing power and memory, you’ll surely ask for the intelligence of an impressive Supercomputer. Also, you’ll expect to have to transmit and receive an equally impressive amount of Signals: an extremely intense Information Flow. And now, try to answer the last two fundamental questions, those attacking the source of all the other problems:
1. Where is the Intelligence necessary to coordinate, over such distances and at such infinitesimal levels of precision, all these atoms?
- Never detected: it does not exist. The anthropic idea that natural facts correspond to the manifestation of a superior intelligence was abandoned centuries ago. Then, Nature has to apply a different mechanism to create that sensation of Physical Laws used, as an example, to design everything technological. A different definition for Superposition, one allowing to have the correct answer, the fitting eigenvalue, ready anyway, also in the absence of an intelligence to precalculate and transmit the correct one. The next question 2., and its answers, strictly related to this, describe what is considered the mechanism allowing the coordination.
2. Where is the Information Flow underlying such coordination ?
- Virtual Particles play the role. The photons are called virtual because their creation and annihilation in the interaction does not conserve energy and momentum. One electron creates some virtual photons, which are annihilated by the other. Thus the electron and electromagnetic fields form an integral system. If an electron has enough energy, it can easily create photons or excite the electron field and create electron-positron pairs. A photon created by an electron may be reabsorbed by the same electron. In fact, the electron cannot escape from interacting with the electromagnetic field it itself generates. The self-energy of interaction gives rise to infinities. The solution to the problem of infinities is based on the insight that the electron with which the theory starts is fictitious. It has to be considered that the electromagnetic field originates from the electron. The electron is always accompanied by electromagnetic field excitations, or surrounded by virtual photons: an aspect of the electron closer to its reality. Thus the mass parameter m and charge parameter e, which pertain to bare electrons, make no physical sense.
Pair production. The creation of an elementary particle and its antiparticle. Here, a photon with energy above the 1.022 MeV threshold creates a positron-electron pair. A short time later the pair annihilates, leaving photons of the same total energy, so that in the long term the total Energy is conserved. Positrons may be conceived as electrons moving toward the Past, or imagining that their own clock, as seen from our own point of view, rotates counterclockwise.
At first sight something purely theoretical; yet for three decades positrons have been used in Positron Emission Tomographs (PET), with improved non-invasive capability of diagnosis in Medicine. Virtual Particles, introduced by the Nobel laureate Paul Dirac as consequences of his quantum relativistic theory of 1928, are a reality with practical effects
In the following, an abridged list of phenomena and technological applications whose existence is assured by Virtual Particles, in the framework of the Quantum Field Theory:
- Electromagnetic induction. This phenomenon, transferring energy to and from a magnetic coil via an electromagnetic field, is a near-field effect (see the Electromagnetic near-field point below). It is the basis for power transfer in transformers, electric generators and motors, and for signal transfer in metal detectors.
- Coulomb force between electric charges. It is caused by the exchange of virtual photons. In symmetric 3-dimensional space this exchange results in the inverse square law for electric force. Since the photon has no mass, the Coulomb potential has an infinite range.
- Magnetic field between magnetic dipoles. It is caused by the exchange of virtual photons. In symmetric 3-dimensional space this exchange results in the inverse cube law for magnetic force. Since the photon has no mass, the magnetic potential has an infinite range.
- Hawking radiation, where the gravitational field is so strong that it causes the spontaneous production of photon pairs and particle pairs.
- Strong nuclear force between quarks is the result of the interaction of virtual gluons. The residual of this force outside quark triplets (neutrons and protons) holds neutrons and protons together in nuclei, and is due to virtual mesons such as the pi meson and the rho meson.
- Weak nuclear force. It is the result of exchange by virtual W and Z bosons.
- Decay of an excited atom, accompanied by the spontaneous emission of a photon. Such a decay is prohibited by ordinary Quantum Mechanics and requires the quantization of the electromagnetic field for its explanation.
- Casimir effect, where the ground state of the quantized electromagnetic field causes attraction between a pair of electrically neutral metal plates.
- Van der Waals force, partly due to the Casimir effect between two atoms.
- Lamb shift of positions of atomic levels.
- Vacuum polarization, involving pair production or the decay of the vacuum, which is the spontaneous production of particle-antiparticle pairs.
- Electromagnetic near-field. Here the magnetic and electric effects of the changing current in the antenna wire, and the charge effects of the wire's capacitive charge, may be important contributors to the total em field close to the source; both are dipole effects that decay with increasing distance from the antenna much more quickly than the influence of conventional electromagnetic waves far from the source. Far-field waves, for which the electric field intensity E is, in the limit of long distance, equal to cB, are composed of actual photons. Actual and virtual photons are mixed near an antenna, with the virtual photons responsible only for the extra magnetic-inductive and transient electric-dipole effects, which cause any imbalance between E and cB. As the distance from the antenna grows, the near-field effects (as dipole fields) damp themselves much more rapidly, and only the radiative effects due to actual photons remain important. Naming r the radius measured from the source, the virtual effects extend to infinity, but their field amplitudes drop off as r^{-2}, rather than as the field of electromagnetic waves composed of actual photons, which drops off as r^{-1}. Also, the powers of the virtual photons decrease as r^{-4}, while the powers of the actual photons decrease as r^{-2}.
Information-Computation-Energy Link
“4 hours of processing work time for a cellular automaton 10^{63} times smaller than the Uni-verse 4D-volume and content are enough to write a complete simulation of its 14.5 billion years long history”
A discovery of the last forty years is that all physical processes, once suitably scaled in terms of space-time extension, mass or energy, emulate a computation. Then, how much computing is going on in a small speck of matter, making it appear as what our eyes see? The answer depends on the amount of energy involved, but even for very modest figures the information processing rate is extraordinarily high. As an example, as early as 1996 a maximum processing rate of ~10^{33} operations per second per joule of energy was estimated [see: Margolus, 1996]: roughly a billion billion times faster than today's computers. Seemingly too fast: what, and where, is processing so many bits in so short a time? Passing from tiny specks of matter to the huge 4D-volume named Uni-verse, where ~10^{80} material particles had enough time (14.5 billion years) to causally connect with each other, it turns out that our algorithms and computers could potentially be too powerful. In 1992 a cellular automaton was imagined [Fredkin, 1992] capable of emulating the grand sum of all existing material particles' mutual interactions along 14.5 billion years, thus reproducing the assumed unique history of the Uni-verse: the unique history seemingly manifest in the single 4D-frame observed today, after 14.5 billion years of (assumed) unique historical evolution. Just 4 hours of processing work time for a cellular automaton 10^{63} times smaller than the Uni-verse 4D-volume and content are enough to write a complete simulation of its 14.5 billion years long history. A definitely too powerful algorithm.
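The ~10^{33} figure quoted above can be reproduced from the Margolus-Levitin bound, which limits the rate of transitions between orthogonal quantum states to 2E/(πħ). A minimal sketch, treating each orthogonal transition as one elementary operation:

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s (CODATA)

def max_ops_per_second(energy_joule):
    """Margolus-Levitin bound: the maximum number of orthogonal state
    transitions per second for a system of mean energy E is 2E/(pi*hbar)."""
    return 2 * energy_joule / (math.pi * HBAR)

rate = max_ops_per_second(1.0)    # bound for a single joule of energy
print(f"{rate:.2e} ops/s")        # of order 10^33, as quoted in the text
```

For 1 joule the bound evaluates to roughly 6 × 10^{33} operations per second, matching the order of magnitude cited in the paragraph above.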
Intimations of Reality
We hope to have conveyed to the Reader that the foregoing suggests interesting incoherences in our most basic assumptions. Incoherences whose far-reaching consequences cannot be fully sighted when observed through the deforming glass of today's widespread relativism. Physics does not conceive relativism and openly disproves it by means of its Laws. In layman's words: objects are one way or another. For example, one must not confuse the relatively different results for the measurement of a property named length of an object [an object's precise mathematical definition, grounded in the theories of Differential Manifolds and Topology, here] as seen from differently accelerated platforms. Relativity theory never made statements about the non-existence of the length property. On the contrary, it includes precise instructions to determine the best reference frame in which to evaluate that and other physical properties. After this brief epistemological introduction, we remark that elevating to the rank of Truth two Classic ideas:
- an object is a uniquely existing instance;
- the collection of all objects is a unique instance itself, named Uni-verse.
leads us to a long row of contradictions. The entire history of the Uni-verse calculated in just 4 hours by a computer the size of a big star. Also, too many computations necessary to handle an energy transfer amounting to just 1 joule: around ten times less than the radiant energy emitted each second by the LCD screen used to read this page. At the end of the thirties of the past century, the geniality of Paul Dirac and others tried to circumvent some problems which were logically starting to surface, creating the concept of virtual particles. Virtual particles have to be imagined as the way out from these and many other incoherences. On appearance, virtual particles should be capable of reproducing what is visible in a Uni-verse. But that way out was conceived 50 years before Quantum Computation started to rise, hence not including, nor capable of explaining, what arose 50 years later, since the eighties of the past century, from new theories and practical applications of non-standard Computation.
Why the Classic Principle of Superposition Fails
What precedes demonstrates that the majority of the linear systems we encounter are simplifications of a nonlinear reality, valid under some basic assumptions which, in the case of electronic circuits, can be summarized as:
- no Environment;
- no high frequency Signals;
- no Virtual Particles;
- full linearity of any subsystems of the linear system, in our case the RLC circuitry;
- no correlations between any subsystems of the linear system, in our case the RLC circuitry.
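Under these assumptions the additivity condition can be tested directly on a model system. The following is a minimal sketch (the component values and the Shockley diode parameters are illustrative assumptions) checking whether f(a·x + b·y) = a·f(x) + b·f(y) holds for a linear resistor and fails for a nonlinear diode:

```python
import math

def is_linear(f, x, y, a=2.0, b=3.0, tol=1e-9):
    """Check the superposition property f(a*x + b*y) == a*f(x) + b*f(y)."""
    return abs(f(a * x + b * y) - (a * f(x) + b * f(y))) < tol

# i = v/R: Ohm's law, a linear element (R = 100 ohm, assumed value)
resistor = lambda v: v / 100.0
# Shockley diode model, strongly nonlinear (illustrative parameters)
diode = lambda v: 1e-12 * (math.exp(v / 0.026) - 1)

print(is_linear(resistor, 0.1, 0.2))  # True:  superposition holds
print(is_linear(diode, 0.1, 0.2))     # False: superposition fails
```

The same check, applied to a circuit containing even one nonlinear element, immediately shows why the classic principle can only be an approximation of the real circuitry.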
Six stages in the history of a Virtual Particle and of its antiparticle counterpart traveling backward in time. The Classic Principle of Superposition, still taught today in high schools and colleges, was coined centuries ago, when the effects of the electromagnetic and coulombian forces were considered fully accounted for. Since 1928, the description agreeing with what pops into existence in the experiments has been Paul Dirac's Virtual Particles
The intrinsic nonlinearity under these conditions is not perceptible in those measurements which are accessible to direct observation, because in them the nonlinear terms are quite negligible in comparison with the linear ones. That is why the Classic version of the Principle of Superposition is found to be confirmed. Surely, the Classic Principle of Superposition presents the advantage of having to solve relatively simple algebraic or (linear) differential equations, rather than much more difficult nonlinear differential equations. But can this way of proceeding let our knowledge advance toward the solution of intrinsically nonlinear problems? Ignoring the existence of Virtual Particles, when it is they that assure the electromagnetic and coulombian forces, is an ill-fated position. In the end, only the dissipative thermal effects of the active power P (watt) in the resistor R are truly accounted for by the Classic Principle of Superposition. No impedance Z, no reactive power Q (measured in VAR) nor apparent power S (measured in VA), due to an Inductor, a Capacitor or the inductance and capacitance of a real Resistor, can be explained without Virtual Particles. Say, without Quantum Field Theory (QFT). The denomination Virtual Particles, comprehensible when it was coined several decades ago, appears today a bit inappropriate.
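The power quantities named above can be made concrete with phasor arithmetic. A minimal sketch, assuming a sinusoidal supply driving a series RLC branch (the 230 V / 50 Hz figures and component values are illustrative assumptions), computing the impedance Z and splitting the complex power into active P, reactive Q and apparent S:

```python
import cmath
import math

def series_rlc_powers(v_rms, freq, R, L, C):
    """Return (P, Q, S) in (W, VAR, VA) for a series RLC branch."""
    w = 2 * math.pi * freq
    Z = complex(R, w * L - 1 / (w * C))  # impedance of the series branch
    I = v_rms / Z                        # phasor current, V as phase reference
    S = v_rms * I.conjugate()            # complex power S = V * I*
    return S.real, S.imag, abs(S)        # P (W), Q (VAR), |S| (VA)

P, Q, S = series_rlc_powers(v_rms=230.0, freq=50.0, R=10.0, L=0.1, C=100e-6)
print(f"P = {P:.1f} W   Q = {Q:.1f} VAR   S = {S:.1f} VA")
# Only P shows up as heat in R; Q and S exist because of the reactive elements.
```

Note that S² = P² + Q² always holds: the reactive and apparent components are exactly the part of the energy exchange that a purely resistive, classic treatment leaves unexplained.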
All existing motors use induction phenomena due to virtual photons (Paul Nylander/2014)
A virtual particle is not a particle at all: the term refers to a disturbance in a field, and a field is surely not a particle. A particle is a regular ripple in a field, one that can travel through space. On the contrary, a virtual particle generally is a disturbance in a field that will never be found on its own; it is caused by the presence of other particles, and often of other fields. After this brief digression, it is however necessary to make it clear that, in terms of mere existence, virtual particles are no less real than all other particles. For decades, technological progress has allowed observing them in experiments made over the widest range of ambient temperatures, from a little above absolute zero to billions of kelvins. They were the New Physics …seventy years ago!
An Apparent Exception to Superposition
Feynman diagram depicting a collision between two gamma-photons, themselves originated by an electron-antielectron pair, with a quark-antiquark pair being the final outcome (Ch. Berger/1983)
A relevant, apparent exception to superposition regards light freely moving through space, i.e. not absorbed or detected by atoms or molecules. For centuries, expressions like interference of light have been used, in spite of the observation that light beams do not interact with each other. They cross through each other unperturbed: no direct photon-photon interaction exists at all. Light would even be invisible to us were it not for the fact that we absorb it in our own material detectors, the eyes' retinas. When superposed, light beams do not create the interference fringes by themselves. Why? Mainly because we are speaking of visible light, whose photon energies are in the order of a few eV (electron volt). Gamma-rays are a relevant exception: high energy photons, notoriously adopted as interactants in the kegs' and cans' fill level inspections in some Beverage Bottling Lines. At photon energies as high as ~500 MeV, a pair of photons can annihilate, creating an electron-positron pair. But this exception is discussed in terms of higher-order diagrams, which are higher-order terms in a perturbation series. Consider now a beam splitter, an optic component always present in Full and Empty Glass Bottles Inspectors with cameras.
“No impedance, no reactive nor apparent powers due to the Inductor or the Capacitor or the inductance and capacity of a real Resistor, can be explained without Virtual Particles, say without the Quantum Field Theory (QFT)”
A natural question arises when practicing common Optoelectronics experiments: how does a beam splitter succeed in redirecting the energies of two opposing light beams into one or another direction in different interferometers? The answer is not a direct interference between photons. Rather, it is the response characteristics of the dipoles of the material medium to the superposed electromagnetic fields. In conclusion, and with reference to the figure above:
- the white coloured light indicates the superposition of photons whose frequencies are the three well-known fundamentals: red, green and blue;
- the photons are not directly superposed; the effect of superposition originates in the water molecules with which the three-coloured photons are separately interacting.
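The field (rather than photon-photon) origin of interference can be illustrated numerically. A minimal sketch, assuming two coherent beams of equal intensity combined by an idealized lossless 50/50 beam splitter: the fields superpose linearly, the output intensities redistribute with the relative phase, and their sum stays constant (energy conservation):

```python
import cmath
import math

def beamsplitter_outputs(E1, E2):
    """Ideal lossless 50/50 beam splitter: the input fields superpose
    linearly, with a 90-degree phase shift on reflection (the usual
    unitary convention). Returns the two output intensities."""
    out_a = (E1 + 1j * E2) / math.sqrt(2)
    out_b = (1j * E1 + E2) / math.sqrt(2)
    return abs(out_a) ** 2, abs(out_b) ** 2

E1 = 1.0  # reference beam, unit amplitude
for phase_deg in (0, 90, 180):
    E2 = cmath.exp(1j * math.radians(phase_deg))
    Ia, Ib = beamsplitter_outputs(E1, E2)
    print(f"phase {phase_deg:3d}: Ia={Ia:.2f}  Ib={Ib:.2f}  total={Ia + Ib:.2f}")
# The total is always 2.00: the superposed fields redistribute the energy
# between the two ports, while the photons never interact with each other.
```

At 90 degrees all the energy leaves one port; at 0 and 180 degrees it splits evenly. The redistribution comes entirely from linear field addition in the medium, exactly as argued above.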
Links to pages on related subjects:
- Introduction (Triggers in Industrial Automation and Electronic Inspection)
- Objects and Measurements' hidden Nature
- Triggers. Sharp and Unsharp Signals
- Classic View
- The Far Reaching Consequences of a Ph.D Dissertation
- Quantum in Brief
Links to pages on other subjects:
- Fill level inspection tech: a TCO point of view
- Physics of Triggering
- What Detectors detect?
- Electromagnetic Measurements of Food, Beverages and Pharma: an Information Retrieval problem
- Binary Classification fundamentals
- Electronic Inspectors’ nonclassic components
- Measures in a Decohering Environment
- FIFO: Bottling Quality Control applications of an idea born to manage the highest production performances
- Photodetectors fundamental definitions
- Media download
- Containers