However, the situation is completely reversed for the theory of the quarks and gluons that compose the strongly interacting particles in the atomic nucleus. While the natural energy scale of these particles, such as the proton and the ρ meson, is on the order of hundreds of millions of electron volts, the masses of the light quarks are about one hundred times smaller. Likewise, the gluons are quanta of a Yang-Mills field which obeys highly non-linear field equations. As a result, strong interaction physics has no known analytical approach, and numerical methods are said to be the only possibility for making predictions from first principles and developing a fundamental understanding of the theory. This theory of the strongly interacting particles is called quantum chromodynamics, or QCD, and the non-linearities in it have dramatic physical effects. One coherent, non-linear effect of the gluons is to “confine” both the quarks and the gluons, so that none of these particles can be found directly as excitations of the vacuum. Likewise, a continuous “chiral symmetry”, normally exhibited by a theory of light quarks, is broken by the condensation of chirally oriented quark/anti-quark pairs in the vacuum. The resulting physics of QCD is thus entirely different from what one would expect from the underlying theory, with the interaction effects having a dominant influence.
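The source of these non-linearities can be made concrete with the standard textbook form of the QCD Lagrangian (a sketch of the conventional expression, not tied to any particular author's formulation):

$$\mathcal{L}_{\mathrm{QCD}} = -\tfrac{1}{4}F^{a}_{\mu\nu}F^{a\,\mu\nu} + \sum_{q}\bar{\psi}_{q}\,(i\gamma^{\mu}D_{\mu} - m_{q})\,\psi_{q},\qquad F^{a}_{\mu\nu} = \partial_{\mu}A^{a}_{\nu} - \partial_{\nu}A^{a}_{\mu} + g\,f^{abc}A^{b}_{\mu}A^{c}_{\nu}.$$

The term $g\,f^{abc}A^{b}_{\mu}A^{c}_{\nu}$ is what makes the field equations non-linear: the gluons themselves carry colour charge and interact with one another, which is the root both of confinement and of the failure of straightforward analytical methods.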
It is known that the much celebrated Standard Model of Particle Physics is incomplete, as it relies on certain arbitrarily determined constants as inputs, as “givens”. Newer formulations of physics, such as Super String Theory and M-theory, do provide mechanisms by which these constants can arise from the underlying model. However, the problem with these theories is that they postulate the existence of extra dimensions that are said to be either “extra-large” or “compactified” down to the Planck length, where they have no impact on the visible world we live in. In other words, we are told to believe blindly that extra dimensions must exist, but on a scale that we cannot observe. The existence of these extra dimensions has not been proved. However, they are postulated not to be fixed in size. Thus, a change in the ratio between the compactified dimensions and our normal four space-time dimensions could cause some of the fundamental constants to change! If this could happen, it might lead to physics in contradiction with the universe we observe.
The concept of “absolute simultaneity”, an off-shoot of quantum entanglement and non-locality, poses the gravest challenge to Special Relativity. But here also a different interpretation is possible for the double-slit experiment, Bell’s inequality, entanglement and decoherence, one which can strip them of their mystic character. The Ives–Stilwell experiment, conducted by Herbert E. Ives and G. R. Stilwell in 1938, is considered to be one of the fundamental tests of the special theory of relativity. The experiment was intended to use a primarily longitudinal test of light wave propagation to detect and quantify the effect of time dilation on the relativistic Doppler effect of light waves received from a moving source. It was also intended to indirectly verify and quantify the more difficult to detect transverse Doppler effect associated with detection at a substantial angle to the path of motion of the source, specifically the effect associated with detection at a 90° angle to the path of motion of the source. In both respects it is believed that a longitudinal test can be used to indirectly verify an effect that actually occurs at a 90° transverse angle to the path of motion of the source.
Based on recent theoretical findings on the relativistic transverse Doppler effect, some scientists have shown that such a comparison between longitudinal and transverse effects is fundamentally flawed, and thus invalid, because it assumes compatibility between two different mathematical treatments. The experiment was designed to detect the predicted time-dilation-related red-shift effect (increase in wavelength with corresponding decrease in frequency) of special relativity at the fundamentally longitudinal angles at or near 0° and 180°, even though the time dilation effect is based on the transverse angle of 90°. Thus, the results of the said experiment do not prove anything. More specifically, it can be shown that the mathematical treatment given by special relativity to the transverse Doppler effect is invalid, and thus incompatible with the longitudinal mathematical treatment, at distances close to the moving source. Any direct comparison between the longitudinal and transverse mathematical predictions under the specified conditions of the experiment is invalid.
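For reference, the standard special-relativistic predictions being compared are, with $\beta = v/c$ and $\gamma = 1/\sqrt{1-\beta^{2}}$:

$$f_{\mathrm{recede}} = f_{0}\sqrt{\frac{1-\beta}{1+\beta}},\qquad f_{\mathrm{approach}} = f_{0}\sqrt{\frac{1+\beta}{1-\beta}},\qquad f_{\mathrm{transverse}} = \frac{f_{0}}{\gamma}.$$

The wavelengths of the approaching and receding lines average to $\bar{\lambda} = \gamma\lambda_{0}$, which is why Ives and Stilwell could read the pure time-dilation factor $\gamma$ off a longitudinal measurement; the criticism summarized above is directed precisely at the legitimacy of equating this longitudinal average with the genuinely transverse 90° effect.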
Cosmic rays are particles, mostly protons but sometimes heavy atomic nuclei, that travel through the universe at close to the speed of light. Some cosmic rays detected on Earth are produced in violent events such as supernovae, but physicists still don’t know the origins of the highest-energy particles, which are the most energetic particles ever seen in nature. As cosmic-ray particles travel through space, they lose energy in collisions with the low-energy photons that pervade the universe, such as those of the cosmic microwave background radiation. The special theory of relativity dictates that any cosmic rays reaching Earth from a source outside our galaxy will have suffered so many energy-shedding collisions that their maximum possible energy cannot exceed 5 × 10¹⁹ electron volts. This is known as the Greisen–Zatsepin–Kuzmin (GZK) limit. Over the past decade, the University of Tokyo’s Akeno Giant Air Shower Array, with its 111 particle detectors, has detected several cosmic rays above the GZK limit. In theory, they could only have come from within our galaxy, avoiding an energy-sapping journey across the cosmos. However, astronomers cannot find any source for these cosmic rays in our galaxy. One possibility is that there is something wrong with the observed results. Another possibility is that Einstein was wrong. His special theory of relativity says that space is the same in all directions, but what if particles found it easier to move in certain directions? Then the cosmic rays could retain more of their energy, allowing them to beat the GZK limit. A recent report (Physics Letters B, Vol. 668, p. 253) suggests that the fabric of space-time is not as smooth as Einstein and others have predicted.
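The order of magnitude of the GZK limit follows from a back-of-envelope threshold estimate for pion photo-production, $p + \gamma_{\mathrm{CMB}} \to p + \pi$, assuming a head-on collision with a typical CMB photon of energy $\varepsilon \approx 6 \times 10^{-4}$ eV:

$$E_{\mathrm{th}} \approx \frac{(m_{p}+m_{\pi})^{2} - m_{p}^{2}}{4\varepsilon} = \frac{2m_{p}m_{\pi} + m_{\pi}^{2}}{4\varepsilon} \approx \frac{2.7\times10^{17}\ \mathrm{eV}^{2}}{2.4\times10^{-3}\ \mathrm{eV}} \approx 10^{20}\ \mathrm{eV},$$

which, once averaged over the photon spectrum and collision angles, comes down to the quoted cutoff of about 5 × 10¹⁹ eV. Observations above this threshold are therefore genuinely anomalous if their sources are extragalactic.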
In 1919, Eddington started his much publicised eclipse expedition to observe the bending of light by a massive object (here the Sun) to verify the correctness of General Relativity. The experiment in question concerned the problem of whether light rays are deflected by gravitational forces, and took the form of astrometric observations of the positions of stars near the Sun during a total solar eclipse. The consequence of Eddington’s theory-led attitude to the experiment, along with alleged data fudging, was a claim that the data favoured Einstein’s theory over Newton’s, when in fact the data supported no such strong construction. In reality, both predictions were based on Einstein’s own calculations, made in 1908 and again in 1911 using Newton’s theory of gravitation. In 1911, Einstein wrote: “A ray of light going past the Sun would accordingly undergo deflection to an amount of 4·10⁻⁶ = 0.83 seconds of arc”. He never clearly explained which fundamental principle of physics used in that paper, which gave the value of 0.83 seconds of arc (dubbed the half deflection), was wrong. He revised his calculation in 1916 to hold that light coming from a star far away from the Earth and passing near the Sun will be deflected by the Sun’s gravitational field by an amount that is inversely proportional to the ray’s radial distance from the Sun (1.745″ at the Sun’s limb, dubbed the full deflection). Einstein never explained why he revised his earlier figures. Eddington was testing which of the two values calculated by Einstein was correct.
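The two rival numbers follow from simple closed-form expressions. Taking the modern textbook values (not Einstein's original figures, which used slightly different solar parameters), with $M_{\odot}$ the solar mass and $R_{\odot}$ the solar radius as the impact parameter of a grazing ray:

$$\delta_{\mathrm{half}} = \frac{2GM_{\odot}}{c^{2}R_{\odot}} \approx 0.87'',\qquad \delta_{\mathrm{full}} = \frac{4GM_{\odot}}{c^{2}R_{\odot}} \approx 1.75''.$$

The deflection falls off inversely with the impact parameter, so stars photographed farther from the Sun's limb show proportionately smaller displacements; this is the fall-off referred to above.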
Specifically, it has been alleged that a sort of data fudging took place when Eddington decided to reject the plates taken by the one instrument (the Greenwich Observatory’s Astrographic lens, used at Sobral) whose results tended to support the alternative “Newtonian” prediction of light bending (as calculated by Einstein). Instead, the data from the inferior (because of cloud cover) plates taken by Eddington himself at Principe and from the inferior (because of a reduced field of view) 4-inch lens used at Sobral were promoted as confirming the theory. While he claimed that the result proved Einstein right and Newton wrong, an objective analysis of the actual photographs shows no such clear-cut result: both theories are consistent with the data obtained. It may be recalled that when someone remarked that only three people in the world understood relativity, Eddington is said to have replied that he could not think who the third person was. This arrogance clouded his scientific acumen, as was confirmed by his distaste for the theories of Dr. S. Chandrasekhar, for which Chandrasekhar subsequently won the Nobel Prize.
Heisenberg’s uncertainty relation is still a postulate, though many of its predictions have been verified and found to be correct. Heisenberg never called it a principle; Eddington was the first to call it one, and others followed him. But as Karl Popper pointed out, the uncertainty relation cannot be granted the status of a principle, because theories are derived from principles, whereas the uncertainty relation does not lead to any theory. We can never derive an equation like the Schrödinger equation, or the commutation relation, from the uncertainty relation, which is an inequality. Einstein’s distinction between “constructive theories” and “principle theories” does not help, because this classification is not a scientific classification. Serious attempts to build up quantum theory as a full-fledged theory of principle on the basis of the uncertainty relation have never been carried out. At best it can be said that Heisenberg created “room” or “freedom” for the introduction of some non-classical mode of description of experimental data. But this does not uniquely lead to the formalism of quantum mechanics.
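The logical direction is visible in the standard derivation: the inequality follows from the commutation relation (via the Robertson relation), never the other way around. As a minimal sketch:

$$\sigma_{A}\,\sigma_{B} \ \ge\ \tfrac{1}{2}\left|\langle[\hat{A},\hat{B}]\rangle\right|,\qquad [\hat{x},\hat{p}] = i\hbar \ \Rightarrow\ \sigma_{x}\,\sigma_{p} \ge \frac{\hbar}{2}.$$

Given the operator formalism and the commutator, the inequality drops out in a few lines; given only the inequality, neither the commutator nor the Schrödinger equation can be recovered, which is precisely Popper’s point.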
There is a plethora of other postulates in quantum mechanics, such as the operator postulate, the Hermitian property postulate, the basis set postulate, the expectation value postulate, the time evolution postulate, etc. The list goes on and on, and includes such undiscovered entities as strings and such exotic particles as the Higgs particle (dubbed the “God particle”) and the graviton, not to speak of squarks et al. Yet, till now it is not clear what quantum mechanics is about. What does it describe? It is said that a quantum mechanical system is completely described by its wave-function. From this it would appear that quantum mechanics is fundamentally about the behavior of wave-functions. But do the scientists really believe that wave-functions describe reality? Even Schrödinger, the founder of the wave-function, found this impossible to believe! He dismissed one common objection (Schrödinger 1935): “That it is an abstract, unintuitive mathematical construct is a scruple that almost always surfaces against new aids to thought and that carries no great message”. Rather, he was worried about the “blurring” suggested by the spread-out character of the wave-function, which, as he puts it, “affects macroscopically tangible and visible things, for which the term ‘blurring’ seems simply wrong”.
Schrödinger goes on to note that it may happen in radioactive decay that “the emerging particle is described … as a spherical wave … that impinges continuously on a surrounding luminescent screen over its full expanse. The screen however does not show a more or less constant uniform surface glow, but rather lights up at one instant at one spot …”. He observed further that one can easily arrange, for example by including a cat in the system, “quite ridiculous cases”, with the ψ-function of the entire system having in it the living and the dead cat mixed or smeared out in equal parts. Resorting to epistemology cannot save such doctrines.
The situation was further complicated by Bohr’s interpretation of quantum mechanics. But how many scientists truly believe in his interpretation? Apart from the issues relating to the observer and observation, it is usually believed to address the measurement problem. Quantum mechanics is fundamentally about micro-particles, such as quarks and strings, and not about the macroscopic regularities associated with the measurement of their various properties. But if these entities are somehow not to be identified with the wave-function itself, and if the description is not about measurements, then where is their place in the quantum description? Where is the quantum description of the objects that quantum mechanics should be describing? This question has led to the issues raised in the EPR argument. As we will see, this question has not been settled satisfactorily.
The formulations of quantum mechanics describe the deterministic unitary evolution of a wave-function. This wave-function is never observed experimentally. The wave-function allows computation of the probability of certain macroscopic events being observed. However, there are no events, and no mechanism for creating events, in the mathematical model. It is this dichotomy between the wave-function model and observed macroscopic events that is the source of the various interpretations of quantum mechanics. In classical physics, the mathematical model relates directly to the objects we observe. In quantum mechanics, the mathematical model by itself never produces an observation: we must interpret the wave-function in order to relate it to experimental observation. Often these interpretations are related to the personal and socio-cultural bias of the scientist, which gets weightage based on his standing in the community. Thus, the arguments of Einstein against Bohr’s position have roots in Lockean notions of perception, which oppose the Kantian metaphor of the “veil of perception” that pictures the apparatus of observation as a pair of spectacles through which a highly mediated sight of the world can be glimpsed. According to Kant, “appearances” simply do not reflect an independently existing reality; they are constituted through the act of perception in such a way as to conform to the fundamental categories of sensible intuition. Bohr maintained that “measurement has an essential influence on the conditions on which the very definition of physical quantities in question rests” (Bohr 1935, 1025).
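Stated compactly, the dichotomy described at the start of the preceding paragraph is that the formalism supplies a deterministic evolution law and a separate probability rule, with no mechanism connecting the two:

$$i\hbar\,\frac{\partial\psi}{\partial t} = \hat{H}\psi \quad\text{(deterministic, unitary)},\qquad P(a) = \left|\langle a|\psi\rangle\right|^{2} \quad\text{(Born rule, probabilistic)}.$$

The first equation never produces a definite event; the second assigns probabilities to events whose occurrence the first does not describe. Every interpretation of quantum mechanics is, in effect, a proposal for bridging this gap.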