Complex numbers do matter

Do you remember how √-1 was handled when solving equations like:

a x^{2} + b x + c = 0

in the set ℝ of real numbers?

Solutions are given by the relation:

x_{1,2} = ( –b ± ( b^{2} – 4ac )^{0.5} ) / 2a

may imply square roots of negative numbers.

An example is the case:

x^{2} + 1 = 0

whose roots are:

x_{1,2} = ± ( –1 )^{0.5} = { –i; +i }

Square roots of minus one are considered roots without meaning in the set ℝ. As a matter of fact, on the contrary, these same roots have full meaning (also physical, e.g. when dealing with the concepts of reactive power in Electrotechnics and resonance in Electronics) when the solution of the equations is extended to the set ℂ of the complex numbers, of which the real numbers ℝ are just an infinitesimal subset.
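
As a concrete sketch, Python's standard-library `cmath` module performs exactly this extension: where a real square root of a negative discriminant would fail, the complex one succeeds (the function name below is ours):

```python
import cmath

def quadratic_roots(a, b, c):
    """Roots of a*x**2 + b*x + c = 0, valid even when the
    discriminant b**2 - 4*a*c is negative: cmath.sqrt returns
    a complex number instead of raising an error."""
    d = cmath.sqrt(b * b - 4 * a * c)
    return (-b + d) / (2 * a), (-b - d) / (2 * a)

# x**2 + 1 = 0 has no real solution, but two complex ones:
r1, r2 = quadratic_roots(1, 0, 1)
print(r1, r2)
```

The same function solves every real quadratic, with no special case for a negative discriminant.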

In synthesis

The guiding principles for today's understanding of the fine scales of all phenomena are:

- the linear *superposition principle*;
- the probabilistic interpretation of Quantum Mechanics.

These two principles suggest that the *state space* is a linear space, thus accounting for the *superposition principle*, endowed with a scalar product used to calculate probability amplitudes. Such a linear space endowed with a scalar product is named a “Hilbert Space” and abbreviated *H*. Moreover, the Hilbert Space *H* is a linear space over the field of the complex numbers *ℂ*.

This means that if a, b ∈ *H*, then:

- a + b ∈ *H*;
- c a ∈ *H*

where c is any complex number in *ℂ*.
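
A toy sketch of these two closure rules, and of the scalar product used for probability amplitudes, in plain Python (the two-component representation is our simplification):

```python
# A toy state space: vectors in C^2, represented as tuples of
# complex numbers. Closure under addition and complex scaling
# is what makes the superposition principle possible.

def add(a, b):
    return tuple(x + y for x, y in zip(a, b))

def scale(c, a):
    return tuple(c * x for x in a)

def inner(a, b):
    """Scalar product <a|b>, conjugating the first factor --
    this is what probability amplitudes are computed with."""
    return sum(x.conjugate() * y for x, y in zip(a, b))

up = (1 + 0j, 0 + 0j)
down = (0 + 0j, 1 + 0j)

# A superposition with complex coefficients: still an element
# of the same space.
psi = add(scale((1 + 1j) / 2, up), scale((1 - 1j) / 2, down))

# Born rule: probability of finding psi in the state `up`.
prob = abs(inner(up, psi)) ** 2
print(psi, prob)
```

Note that the coefficients are genuinely complex, yet the probability extracted from the scalar product is a real number between 0 and 1.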

Superposition basics

The *superposition principle* requires a continuum. As an example, the *superposition* of classical configurations, such as the positions of N particles, originates the idea of *wave functions*. *Wave functions* evolve continuously in Time according to an equation named the Schrödinger equation, until a measurement disturbs the system.

The writer of these notes remembers that he was personally taught Quantum Mechanics starting from the study of single-particle problems, then following the early Schrödinger equation. This frequent recall of studies whose value is today nearly only historic gives the impression that the *wave function* Ψ was a kind of spatial field. And that is why many Telecommunications or Chemical engineers and scientists still erroneously believe that a *wave function exists for each electron in an atom*.

On the contrary, according to Schrödinger's quantization procedure and the *superposition principle*, *wave functions* are defined on *Configuration Space*. And it is this that leads to the unavoidable *entanglement* of all systems existing in the Universe. These discoveries of one century ago have to be seen from today's point of view, one based on fields rather than particles. The effect of the entanglement of all systems existing in the Universe can be immediately perceived in the superposition of classical configurations. If we superpose the amplitudes of certain fields (e.g., the electromagnetic field) rather than particles' positions, the *wave function* Ψ becomes an *entangled wave functional for all of them*.

And what is explained here above, the modern principle of Superposition, is the basic reason why the methods and formalism of Quantum Mechanics, different from those of Relativity, establish that subsystems evolve separately, along a tree-like structure occupying a multitude of spaces. A multitude of spaces reminiscent of the (infinite) multitude of spaces attributed since 1907 by Relativity Theory to each instant of Time. The information about the entire set of historical, time-ordered ramifications is only known to the *superposition of all of the subsystems*, named the Multiverse, and *never fully known to the subsystems*. This causes what, from the subsystems' point of view, is the existence of limits to knowledge, synthesized in the Uncertainty Principle. The Multiverse is an idea implicitly embedded since the start in the same wave function Ψ: the same function describing how, as an example, semiconductor-based transistors operate, grouped into the Integrated Circuits of the Industrial Machinery Programmable Logic Controllers (PLCs) devoted to automation, or of the Bottling Controls (Electronic Inspectors).

*“all possible measurement values, say all statuses, are actual”*

We explained above that a *wave function* Ψ, originally born only to define the probability of localizing the electron of an atom, since the start spread the answer over the infinite range of positions x. Erwin Schrödinger, the Nobelist who fathered the *wave function* formalism in 1926 on the base of prior ideas of Louis de Broglie, saw that the wave function Ψ was describing a multitude of simultaneously coexisting electron positions in the space *ℂ* of complex numbers { a + i b, a, b ∈ ℝ } rather than a single position in the space *ℝ* whose elements are real numbers.

Schrödinger chose to treat the multitude of solutions in the particular case he studied (a multitude of instances of the electron in a Hydrogen atom) the way we treat square roots of minus one in the set of real numbers ℝ.

Another thirty-one years were necessary before someone else, Hugh Everett III, recognised their physical relevance and meaning: the wave function Ψ represents a physical field, and each solution differs from the others by a minimum amount of information, an amount today considered equal to 1 bit.

The detailed example we considered elsewhere should have fully satisfied Maxwell's equations **[3]** in the case of an *ideal RLC series circuit* (something which never existed nor shall ever exist) in an ideal Environment where:

- temperature is kept constant with an infinite precision;
- humidity is kept constant with an infinite precision;
- electromagnetic fields do not exist;
- no gravitational fields exist (say, an RLC resonant circuit in a never-existed flat Euclidean space-time);
- no induction of the electromagnetic field of the inductor in the resistor and capacitor, etc.

In other terms: ideal RLC components decoupled from any real Environment.

Conditions like these cannot be encountered in the Universe, because they imply no electromagnetic nor gravitational fields. In practice, they describe a system causally disconnected (hence, uncorrelated) from its Environment.
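
For the record, the textbook behaviour of such an ideal series RLC circuit is easy to compute. A short Python sketch (the component values are purely illustrative):

```python
import math

def rlc_resonance(R, L, C):
    """Ideal series RLC circuit: resonant frequency f0 and quality
    factor Q. Only valid under the idealisations listed above --
    a real circuit is never causally disconnected from its
    Environment."""
    f0 = 1.0 / (2.0 * math.pi * math.sqrt(L * C))
    Q = (1.0 / R) * math.sqrt(L / C)
    return f0, Q

# Illustrative (hypothetical) values: 10 ohm, 1 mH, 100 nF
f0, Q = rlc_resonance(10.0, 1e-3, 100e-9)
print(f"f0 = {f0:.1f} Hz, Q = {Q:.1f}")
```

A real circuit's response deviates from these figures precisely because of the environmental couplings the list above idealises away.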

Superposition of different states

In 1935 Schrödinger conceived a case where a complex (biological) system lies in an ambient assuring total disconnection from the external world, the Environment. He was the first to understand that an object lying in a truly closed ambient, without any exchange (no matter, radiation, nor information flow) with the external ambient, would be in a superposition of states, rather than in the single one we perceive directly or by means of measurement instruments. In this case, *the insulated system would have a multiple existence: different states of different instances of the system in the insulated ambient.*

The object would be:

- out of the Environment;
- into an infinite-dimensional space, where a multitude of instances of the system exists.

How this may be possible also depends on the fact that Ψ is defined in the Hilbert space *H*, itself a vector space, a generalization of the familiar Euclidean space *ℝ*^{3}.

Hilbert Space *H* extends the methods of vector algebra and calculus from three-dimensional space to spaces with any, even infinite, number of dimensions. Here we are using the term *dimension* in the usual way, as the number of quantities fully defining the position of a point in the space.

Imagine a dot and let infinitely many radii spread out from it spherically: that is the dimensionality of a dot in the Hilbert Space *H*, where each radius identifies one of the properties (e.g., position, time, momentum, polarization, curvature, action, energy, temperature, information, etc.) of that dot.

**Hilbert Space *H* cannot treat physical fields, like the electromagnetic or gravitational. By embedding the separable Hilbert space in a rigged Hilbert space, it can house fields, representing them as blurred sets of Hilbert vectors**

*“The system derived by the superposition of the two original systems, each one described by 6 dimensions, is fully described in the Hilbert Space H by 36 dimensions”*

Dimensionality of a system in the Hilbert Space

As an example of the differences existing between the classic Euclidean space and the modern Hilbert Space *H*, consider two classical independent systems for which we know:

- the Euclidean position, in the 3-dimensional space (x, y, z);
- their velocity components along each one of these axes (vx, vy, vz).

If we join these two systems and let them interact, we *add the number of dimensions of each one *of them.

The system derived by the superposition of the two original systems, each one described by 6 dimensions, is fully described by 12 dimensions.

But repeating exactly the same steps in the Hilbert Space *H* yields a completely different result, because we now have to *multiply the number of dimensions* of the original systems. In this case the dimensionality necessary to fully describe the superposed system increases to 36.
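
The two counting rules can be stated in two lines of Python (a trivial sketch, only to fix the arithmetic):

```python
# Classical joining of systems ADDS dimensions (direct sum of
# configuration spaces); quantum joining MULTIPLIES them (tensor
# product of Hilbert spaces).

def classical_join(dim_a, dim_b):
    return dim_a + dim_b

def quantum_join(dim_a, dim_b):
    return dim_a * dim_b

# Two systems, each with 3 position + 3 velocity coordinates:
print(classical_join(6, 6))  # 12
print(quantum_join(6, 6))    # 36
```

The multiplicative rule is why the dimensionality of quantum descriptions explodes so quickly as systems are joined.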

Noncommutativity in the Hilbert Space

Moreover, in the Hilbert Space the superposition of two systems like those described before (a measurement) is noncommutative **[6]**, meaning that the order of the operations implies different results.

A binary operation, indicated by the asterisk symbol *, on a set S is noncommutative if there exist a, b ∈ S such that:

a * b ≠ b * a **[6]**

This last point truly makes the difference with respect to a common sense still based on the classic algebra of real-valued functions over the set *ℝ*. Our idea of commutativity is strictly related to the deeper idea of symmetry.

How can it be possible that a different order of the operations changes the result?

An idea like this, judged from the familiar point of view of the real numbers, seems unthinkable, as if suggesting that:

2 * 3 ≠ 3 * 2

After the implications of noncommutative rules became evident, a first comfortable, tentative explanation was coined, one trying to sidestep the new scenario they forced us to confront:

*“No paradox exists: we are treating non-classic complex operators in the space of Hilbert”. *

But the Hilbert Space is the very base for all practical technological applications. How can it be possible that *a useful fiction, a mathematical-only artifice*, could bring us such a bonanza of Wall Street-quoted technological and industrial applications?

**The rotations in three dimensions of a solid are noncommutative (abridged by Jeff Atwood, 2011)**

As an example, no one doubts the reality of the switchings of the CPUs and of the logic gates in the Programmable Logic Controllers (PLCs) which let Food and Beverage Packaging Lines worldwide run at this very moment.

No one doubts that what those CPUs and I/Os are performing are computations over a programmed algorithm. A second attempt to sidestep the uncomfortable implications of the reality of the Quantum Field tried to limit the domain of application of noncommutative geometry rules to the atomic and subatomic scales.

But other examples of noncommutativity were already known for objects of the spacetime macroscale, of dimensions we directly perceive without any instrumental aid: as an example, the geometric rotations of a solid, massive three-dimensional book, in the figure below. Noncommutativity thus represents rules of general application.
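
The book rotations in the figure can be checked numerically. A minimal Python sketch (pure standard library; the 90º angle is our choice) composing two rotations in both orders:

```python
import math

def matmul(A, B):
    """Product of two 3x3 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(3))
             for j in range(3)] for i in range(3)]

def rot_x(t):
    c, s = math.cos(t), math.sin(t)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(t):
    c, s = math.cos(t), math.sin(t)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

q = math.pi / 2  # a 90-degree turn, as with the book in the figure
xy = matmul(rot_x(q), rot_y(q))
yx = matmul(rot_y(q), rot_x(q))
print(xy == yx)  # False: the order of the rotations matters
```

Swapping the order of the two rotations leaves the book in a visibly different orientation, exactly as the figure shows.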

What Hilbert Space represents

*“Hilbert space represents a system before the measurement action, when all of the states of the system exist in superposition”*

The key point is that:

**Hilbert Space *H* represents a system before the measurement action, when all of the states of the system exist in superposition.**

*The act of measuring a status for a variable (or Trigger, e.g. the time of passage of a container in front of a photoelectric sensor) reduces the observed value to a single point, ...which does not mean that the others do not exist any more, or that they did not exist at all before the measurement.*

This is an additional confirmation that Quantum Mechanics had, since the start, the multiversal scenario deeply embedded in its logic: the tree-like branching logic which explains why and how our industrial world exists and operates.

Hilbert Space and Hypercomplex numbers

In the course of the last century it was discovered that the Hilbert Space *H* can be defined over different number fields.

These include:

- Real numbers *ℝ*, numbers on a line, a 1-dimensional number system;
- Complex numbers *ℂ*, a 2-dimensional number system;
- Quaternions *ℍ*, a 4-dimensional number system;

and, to some extent, also the **Octonions**, included with the quaternions among the Hypercomplex numbers *ℍ* in the figure below. The Hypercomplex numbers are all those whose dimension is over 3. The Octonions can be thought of as octets (or 8-tuples) of real numbers building up an 8-dimensional number system.

**The set of all numbers includes the subsets of the > 3-dimensional Hypercomplex numbers *ℍ*, visibly dropping the commutative property. *ℍ* includes the 2-dimensional Complex numbers set *ℂ*, of special interest for all *dynamical systems*, including the Industrial Machinery object of Root Cause Analysis (credit stratocaster47, 2014)**

An Octonion is a real linear combination of the unit octonions:

*{ e0, e1, e2, e3, e4, e5, e6, e7 }*

*“Triggers, the simplest thinkable instance of all kinds of electronic inspection measurements, are one of the ways to differentiate leaves otherwise similar”*

where *e0* is the scalar, or Real element, which can be identified with the Real number 1.

That is, every Octonion *x* can be written in the form:

* x = x0 e0 + x1 e1 + x2 e2 + x3 e3 + x4 e4 + x5 e5 + x6 e6 + x7 e7*

with Real coefficients *{ xi }*.
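
Quaternion noncommutativity, which the Hypercomplex sets above illustrate, can be verified directly from Hamilton's multiplication rules. A minimal Python sketch (the tuple layout (w, x, y, z) is our convention):

```python
# Quaternion multiplication (Hamilton's rules: i*j = k, j*i = -k),
# sketched with 4-tuples (w, x, y, z) of real numbers.

def qmul(a, b):
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

i = (0, 1, 0, 0)
j = (0, 0, 1, 0)
print(qmul(i, j))  # (0, 0, 0, 1)  = k
print(qmul(j, i))  # (0, 0, 0, -1) = -k: noncommutative
```

The 4-dimensional quaternions visibly drop commutativity; the 8-dimensional Octonions go one step further and drop associativity too.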

The reason why we pointed out the relation of the Hilbert Space *H* to the Octonions is that today's best mathematical instrument to scan *all* is String Theory. And the 10-dimensional version of String Theory, namely the version exempt from anomalies, is the one using the Octonions rather than other number systems.

The transition from classical to quantum

The classical physics “state”

**Relation between Hilbert Space *H* and Topological Space. Topological spaces are defined by the most basic Set Theory**

**Oscillation characteristic values (eigenvalues) are modes of vibration, each one oscillating at its own frequency in this image of a drumhead (image credit J. Kreso, 1998-2010)**

In classical physics, the notion of the “state” of a physical system is intuitive. It focuses on certain measurable quantities of interest, for example the position and momentum of a moving body, and subsequently assigns mathematical symbols to these quantities (such as “x” and “p”). The state of motion of a body is then specified by assigning numerical values to these symbols. In other words, there exists a one-to-one correspondence between the physical properties of the object and their mathematical representation in the theory.

To be sure, we may certainly think of some cases in classical physics where this direct correspondence is not always established as easily as in the example of Newtonian mechanics used here. As an example, it is rather difficult to relate the formal definition of temperature in the theory of Thermodynamics to the underlying molecular processes leading to the physical notion of temperature. However, reference to other physical quantities and phenomena usually allowed one to resolve this identification problem at least at some level.

Quantum Physics' *state*

The one-to-one correspondence between the physical world and its mathematical representation in the theory came to an end with quantum theory after 1926. Instead of describing the state of a physical system by means of intuitive symbols that corresponded directly to the “objectively existing” physical properties of our experience, in quantum mechanics we have only an abstract quantum state that is defined as a vector (or, more generally, as a ray) in a similarly abstract Hilbert vector space. The conceptual leap associated with this abstraction cannot be overestimated.

As a matter of fact, the discussions regarding the interpretation of quantum mechanics since the early years of quantum theory are to a large extent due to the question of:

* ...how to relate the abstract quantum state to the physical reality out there ?*

The connection with the familiar physical quantities of our experience is only indirect, through measurements of physical quantities, that is, of observables, the objects of our measurements and everyday life, represented by (Hermitian) operators in a Hilbert space.

To a certain extent, the measurement allows us to revert to a one-to-one correspondence between the mathematical formalism and the “objectively existing physical properties” of the system, say to the concept familiar from classical physics. But, due to the fact that many observables are mutually incompatible (noncommutativity), a quantum state will in general be a simultaneous eigenstate of only a very small set of operator-observables. Accordingly, we may ascribe only a limited number of definite physical properties to a quantum system, and additional measurements will in general alter the state of the system, unless we measure, by virtue of luck or prior knowledge, an operator-observable with an eigenstate that happens to coincide with the quantum state of the system before the measurement.
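
The mutual incompatibility mentioned above can be made concrete with a standard two-level example (our choice, not specific to this page): the Pauli matrices σx and σz, which do not commute and share no eigenstate:

```python
# Two incompatible observables, sketched with the 2x2 Pauli
# matrices sigma_x and sigma_z as nested lists.

def mul2(A, B):
    """Product of two 2x2 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2))
             for j in range(2)] for i in range(2)]

sx = [[0, 1], [1, 0]]    # sigma_x
sz = [[1, 0], [0, -1]]   # sigma_z

print(mul2(sx, sz))  # [[0, -1], [1, 0]]
print(mul2(sz, sx))  # [[0, 1], [-1, 0]]
```

The two products differ (in fact they are opposite), so no state can carry a definite value for both observables at once.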

A consequence is that it is impossible to uniquely determine an unknown quantum state of an individual system by means of measurements performed on that system only. This situation is in contrast with classical physics, where we can enlarge our “catalog” of physical properties of the system by performing an arbitrary number of measurements of additional physical quantities. Furthermore, many independent observers may carry out such measurements (and agree on the results) without running any risk of disturbing the state of the system, even though they may have been initially completely ignorant about this state. The idea of the preexistence of the classical states is a mere remnant of the limited knowledge we had along the past centuries.

Hilbert Space and Fields

Following quantum logic, the Hilbert Space *H* cannot treat physical fields, like the familiar:

- electromagnetic field,
- gravitational field.

There is compelling evidence that, when a continuous spectrum is present, the natural mathematical setting for Quantum Mechanics is the *rigged Hilbert Space* rather than just the *Hilbert Space H*. By embedding the separable Hilbert space in a *rigged Hilbert space*, it can house fields by representing them as blurred sets of Hilbert vectors.

Each field is the convolution of a typical blur with a set of Dirac delta functions that represent Hilbert vectors. When the blur is differentiable, the field is differentiable too. The field values can be attached to all Hilbert vectors, so that quantum logic can be expanded to treat fields.

Actual or potential ?

In view of the properties of quantum states introduced above, it has often been argued that these states represent only *potentialities *for the various *classical *observed states.

At the same time quantum states:

- represent a complete description of a quantum system, encapsulating all there is to say about the physical state of the system;
- do not tell us which particular outcome will be obtained in a measurement but only the probabilities of the various possible outcomes.

This intrinsic probabilistic character of quantum mechanics, taught as a canon only thirty years ago, is only apparent and, rather, an effect of our cultural perspective. In an experimental situation, the probabilistic aspect is represented by the fact that, if we measure the same physical quantity on a collection of systems all prepared in exactly the same quantum state, we will in general obtain a set of different outcomes. What has been fully understood along the past decades, marrying the best experimental techniques to the most powerful theoretical ideas, is that **all possible measurement values, say all statuses, are actual.**
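
The experimental situation just described is easy to simulate. A Python sketch (the amplitudes are hypothetical) drawing repeated outcomes from one and the same preparation via the Born rule:

```python
import random

random.seed(0)  # reproducible run

# One qubit, identically prepared many times: the amplitudes are
# hypothetical, chosen so that |a0|^2 + |a1|^2 = 1.
a0 = 3 / 5            # amplitude of outcome "0"
a1 = 4j / 5           # amplitude of outcome "1"
p0 = abs(a0) ** 2     # Born rule: probability = |amplitude| squared

# Measuring the same quantity on identically prepared systems
# still yields a spread of outcomes, with Born-rule frequencies.
outcomes = [0 if random.random() < p0 else 1 for _ in range(10_000)]
freq = outcomes.count(0) / len(outcomes)
print(freq)  # close to p0 = 0.36
```

Every run of identically prepared systems reproduces the same frequencies, yet no single outcome is predictable in advance.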

**Erwin Schrödinger, one of the founders of Quantum Mechanics. He was the first to understand that a multitude of simultaneous instances of the electron around the Hydrogen proton were modelled by his newly created Wave Function Ψ**

Schroedinger. Only waves

Reading textbooks written sixty years ago, you'll be surprised by the difficulty of figuring out the so-called *wave-particle duality*. This idea stood among those which delayed many initial efforts to understand Quantum Mechanics. The following decades demonstrated it was one more side effect of the past cultural perspective.

Already Erwin Schrödinger, the physicist depicted in the figure on the right side, in the early days of Quantum Mechanics, when attempting to identify narrow wave packets in real space with actual physical particles, observed two problems:

- initially localized wave packets spread out very rapidly over large regions of space, a behaviour irreconcilable with the idea of particles, by definition localized in space;
- the Wave Function Ψ describing the quantum state of N > 1 particles in 3-dimensional space resides in a 3N-dimensional Hilbert space, no more in the familiar 3-dimensional space of our experience.

In 1952, during a conference in Dublin, Ireland, he first openly introduced the idea that what his own and Werner Heisenberg's matrix quantum formalisms are really suggesting is the simultaneous superposition of all the possible instances of the same physical object.

A physical object, from his point of view, reduced to a wave, with *particles* as mere idealisations for wave packets of relatively limited extension in space-time.

“One of the applications of the quantum computers is the Binary Classification.

All Electronic Inspectors, whatever their size, complexity or task, are Binary Classifiers”

The Past

We prefer to start this section with a visit to one of Google, Inc.'s Data Centres. Why this visit? Because it shows the Past. And, specifically, it shows one of the world's greatest Classifiers at the dawn of its design. The aligned rows of thousands of Servers consume impressive amounts of active energy.

That is the reason why Google privileges naturally cold ambients, like the Icelandic North Sea or the Swedish Baltic Sea, for its Data Centres, so as to have cheaper cooling of the CPUs. Parallelizing Support Vector Machines on distributed computers is one of the main characteristics of these Data Centres.

In some way, the alignment of the Servers and of the many CPUs embedded there suggests one of those mirror-like effects where it is possible to see many instances of a single object. It is important to start to understand that, living in a multiversal environment, all states are elements of a single Superposition. A single CPU, in reality, has a multitude of counterparts, parallel-processing data whose difference is limited to 1 bit.

Binary Classification: Present and Future

The video below synthesises in a few words a new paradigm. Binary Classification, the task of all of the Electronic Inspectors of the world, is an activity intrinsically optimized for massive parallel computing. Massive parallel computing is expensive and impractical if arising from the ideas underlying all of today's Data Centres, like the Google, Inc. one above. A single Quantum Computer outperforms several Data Centres, at a fraction of their cost.

**Rainier processors and Josephson-effect junctions operated inside 16 concentric levels of electromagnetic shielding at temperatures close to −273 ºC. Special conditions to recreate small and brief-duration Hilbert spaces in our hot, decohering environment (image courtesy of D-Wave Systems Inc., 2014)**

The following video, originating from the world leader in quantum computing, D-Wave Systems, Inc., is meant as an introduction to commercial applications of the superposed structure named the Multiverse. One of the applications of quantum computers is Binary Classification. Electronic Inspectors in the Food and Beverage Bottling Lines are Binary Classifiers. (In another page of this site we'll explain why the ideal Bottling Control is a device calculating in a binary way as usual but in the Hilbert space.)

From Decoherence we learnt that the general mechanisms and phenomena arising from the interaction of a macroscopic quantum system with its Environment strictly depend on the strength of the coupling between the considered degree of freedom and the rest of the world. For this reason, D-Wave's “Vesuvius” Central Processing Units are a system of 512 quantum bits (qubits).

512 superpositions of the Ψ wave functions, preserved as long as possible in the Hilbert space:

- operating at temperatures of 0.02 K (-272.98 ºC), extremely close to the absolute zero;
- protected from the induction of external electromagnetic fields by fifteen levels of Faraday cages.

A mind-boggling amount of other branches of the Multiverse where the computer exists, each one processing the same original qubit, with differences from one to another in the variable processed reduced to 1 bit.

Moreover they:

- adopt Rainier processors and Josephson-effect junctions, rather than silicon-based processors. Josephson junctions are the most basic mesoscopic superconducting quantum device;
- handle qubits which can slowly be tuned (annealed) from their superposition state (where they are 0 and 1 at the same time) into a classical state where they are either 0 or 1. When this is done in the presence of the programmed memory elements on the processor, the 0 and 1 states that the qubits end up settling into give the answer to a user-defined problem.
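
The annealing idea in the last point can be caricatured classically. The following Python sketch (the couplings and the cooling schedule are invented, and a real quantum annealer does not work by Metropolis flips) lets three coupled binary variables settle from a random start into a low-energy classical configuration:

```python
import math
import random

random.seed(1)  # reproducible toy run

# Programmed couplings between three binary variables (spins);
# the values are hypothetical, only meant to give the flavour
# of the annealing process described above.
J = {(0, 1): -1.0, (1, 2): -1.0, (0, 2): 1.0}

def energy(s):
    """Ising-like energy of a spin configuration s (each spin ±1)."""
    return sum(j * s[a] * s[b] for (a, b), j in J.items())

s = [random.choice([-1, 1]) for _ in range(3)]  # random start
T = 2.0
while T > 0.01:                    # slowly lower the schedule
    k = random.randrange(3)
    before = energy(s)
    s[k] = -s[k]                   # propose flipping one spin
    dE = energy(s) - before
    if dE > 0 and random.random() >= math.exp(-dE / T):
        s[k] = -s[k]               # reject the uphill flip
    T *= 0.99

print(s, energy(s))                # the spins settled into a low-energy state
```

As in the D-Wave description, the variables end up in a definite 0/1-like classical state, and the programmed couplings determine which low-energy configurations they settle into.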

Creating a Hilbert Space *H* of reduced volume and duration here on the Earth's surface requires a technology completely different from that of today's supercomputers. That technology is the core of the new Quantum Computers. Nearly **Science Fiction**, as seen from the point of view of today's Industrial Automation.

One of the key points lies in the fact that superpositions of the Ψ wave functions are preserved as long as possible in the Hilbert Space when:

- **operating at temperatures of 0.02 K (−272.98 ºC), extremely close to absolute zero;**
- **protected from the induction of external electromagnetic fields by the shielding provided by fifteen levels of Faraday cages.**

**The figure above shows extreme care with respect to both points:**

- **the entire circuit lies in a fridge;**
- **EMI-induced parasitic currents flowing in the shields of the cables carrying signals are immediately discharged to Ground, limiting to < 250 mm their extension away from massive, extremely low-impedance Ground discharge points. In other terms, also EMI-induced parasitic currents are conceived in the framework of the wider circuital design (image courtesy of D-Wave Systems Inc., 2014)**
