David Brown
Registered: May 2009
Posts: 173 
NKS Chapter 9 and f(div)-theory critique of Eötvös-type experiments
Is NKS Chapter 9 the key to a new foundation for both quantum field theory and general relativity theory? Do Eötvös-type experiments rule out the f(div) theory of modified general relativity theory?
In terms of simple physical hypotheses, what might be the essence of NKS Chapter 9?
Consider Wolfram’s cosmological principle:
THE MAXIMUM PHYSICAL WAVELENGTH IS THE PLANCK LENGTH TIMES THE FREDKIN-WOLFRAM CONSTANT.
If the preceding principle holds, then energy, as well as time and distance, must be physically discrete. If energy, time, and distance are digital entities, then is it highly plausible that there is an informational substrate underlying quantum field theory? If energy, time, and distance are secondary (emergent) entities, then is it highly plausible that there is a Wolframian mobile automaton that explains all of physics?
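The arithmetic consequence of the principle can be sketched directly: a maximum physical wavelength implies a minimum photon energy. The sketch below is only illustrative; the numerical value of the Fredkin-Wolfram constant (N below) is a hypothetical placeholder, since neither NKS Chapter 9 nor this post fixes it.

```python
import math

# Minimal sketch: if the maximum physical wavelength is the Planck length
# times a Fredkin-Wolfram constant N, then photon energies are bounded
# below by E_min = h*c / lambda_max.  N = 1e60 is purely illustrative.
h = 6.62607015e-34      # Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s
G = 6.67430e-11         # Newton's gravitational constant, m^3/(kg*s^2)
hbar = h / (2 * math.pi)

planck_length = math.sqrt(hbar * G / c**3)   # ~1.6e-35 m
N = 1e60                                     # hypothetical Fredkin-Wolfram constant

lambda_max = planck_length * N               # maximum physical wavelength, m
E_min = h * c / lambda_max                   # minimum photon energy, J
print(planck_length, lambda_max, E_min)
```

A finite E_min (rather than wavelengths and energies ranging over a continuum) is the sense in which the principle forces a discrete energy spectrum.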
Consider the following hypothesis:
M-THEORY IS THE ONLY MATHEMATICALLY SATISFACTORY WAY OF UNIFYING QUANTUM FIELD THEORY AND GENERAL RELATIVITY THEORY.
If one believes in the Wolframian mobile automaton and in the essential value of M-theory, then should one believe in the following fundamental law of the multiverse:
INFORMATIONAL SUBSTRATE MAKES NAMBU DIGITAL DATA MAKES DIGITAL PHYSICAL REALITY?
Is the M-theoretic 11-dimensional supersymmetric model the only way of defining the Fredkin delivery machine and the Nambu transfer machine — the only way of defining the Fredkin alternate-universe engine?
Consider two hypotheses:
(1) The Bekenstein-Hawking radiation law is 100% correct in M-theory without alternate universes.
(2) The Bekenstein-Hawking radiation law is significantly wrong in M-theory with alternate universes. The modified radiation law and a digital model of black holes yield a prediction of paradigm-breaking photons that would explain the GZK paradox.
What are some problems with hypothesis (2) and its prediction? Testing the prediction requires a considerable investment of time and effort in ultra-high-energy cosmic ray research. Worse, perhaps, it is a qualitative rather than a quantitative prediction. Most physicists would argue that the prediction is speculation rather than science. The quantitative development of the prediction requires M-theory.
Does the f(div) theory make quantitative predictions even without huge new developments in M-theory? Is the f(div) theory consistent with numerous Eötvös-type experiments?
Does a non-zero cosmological constant suggest that alternate-universe forces exist? Let Λ represent the cosmological constant in general relativity theory. In cosmology, physical units are generally chosen so that Newton’s gravitational constant and c (the speed of light) are equal to 1. In the standard cosmological model, the value of Λ is taken to be roughly (10**(-35)) / (sec**2), or (10**(-47)) * ((GeV)**4), or (10**(-29)) gram / (cm**3) (assuming the usual conventions in cosmological theory).
http://en.wikipedia.org/wiki/Cosmological_constant
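The three quoted magnitudes are unit conversions of one quantity, and their mutual consistency can be checked with a few lines of arithmetic. This is a hedged sketch of standard textbook conversions, not part of f(div) theory: the mass density follows from rho = (Λ·c**2) / (8·pi·G), and the natural-unit energy density from multiplying by the conversion factor (ħc)**3.

```python
import math

# Consistency check of the quoted orders of magnitude for the
# cosmological constant, starting from Lambda*c^2 ~ 1e-35 / sec^2.
c = 2.99792458e8          # speed of light, m/s
G = 6.67430e-11           # Newton's gravitational constant, m^3/(kg*s^2)
hbar_c = 1.97327e-16      # hbar*c in GeV*m
GeV_in_J = 1.60218e-10    # joules per GeV

lambda_c2 = 1e-35                               # s^-2, quoted order of magnitude
rho = lambda_c2 / (8 * math.pi * G)             # vacuum mass density, kg/m^3
rho_g_cm3 = rho * 1e-3                          # kg/m^3 -> g/cm^3
rho_gev4 = (rho * c**2 / GeV_in_J) * hbar_c**3  # energy density in GeV^4

print(rho_g_cm3)   # within an order of magnitude of 1e-29 g/cm^3
print(rho_gev4)    # within an order of magnitude of 1e-47 GeV^4
```

The three quoted values agree with each other only to order of magnitude, which is all the text above claims.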
In relativity theory,
(relativistic force) = (relativistic mass) * (relativistic acceleration).
In f(div) theory,
(non-gravitational relativistic force) = (relativistic mass) * (relativistic acceleration).
However, f(div) theory hypothesizes that (gravitational relativistic force) + (Fredkin force) = (gravitational relativistic force) * (1 + (Fredkin factor)) = (relativistic mass) * (relativistic acceleration). In the future, Fredkin forces should be calculated from M-theory with alternate universes as part of the Wolframian mobile automaton. For now, a plausible guess might be that the Fredkin factor = 10**(-5); the sign of the Fredkin factor should make the Fredkin force point in the same direction as the gravitational relativistic force.
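The modified force law above can be written out as a minimal sketch. The function name and the 10**(-5) value of the Fredkin factor are taken from the guess in the text, not from any derivation:

```python
# Sketch of the f(div) force law described above: the gravitational force
# is rescaled by (1 + Fredkin factor), while non-gravitational forces obey
# the usual relativistic law.  The default factor 1e-5 is the text's guess.
def acceleration(f_grav, f_nongrav, rel_mass, fredkin_factor=1e-5):
    """a = (F_grav * (1 + Fredkin factor) + F_nongrav) / m"""
    return (f_grav * (1.0 + fredkin_factor) + f_nongrav) / rel_mass

# Example: a 1 kg test mass in Earth surface gravity, no other forces.
a = acceleration(f_grav=9.81, f_nongrav=0.0, rel_mass=1.0)
print(a)   # slightly larger than 9.81 m/s^2
```

With a positive Fredkin factor, the correction strengthens gravity by one part in 10**5, which is the scale the Eötvös-type discussion below is concerned with.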
In the f(div) theory, there is a Fredkin force correction to the gravitational force, and dark matter corrections are necessary because quantum field theory ignores the gravitational effects of CANCELLED VIRTUAL PHOTONS.
The guess for the magnitude of the Fredkin factor is suggested by:
http://www.zarm.uni-bremen.de/2fors...r_Magdeburg.pdf
How can the f(div) theory be compatible with highly accurate verifications of Einstein’s equivalence principle as found in Eötvös-type experiments?
From http://en.wikipedia.org/wiki/Eötvös_experiment:
“Eötvös' original experimental device consisted of two masses on either end of a rod, hung from a thin fiber. A mirror attached to the rod, or fiber, reflected light into a small telescope. Even tiny changes in the rotation of the rod would cause the light beam to be deflected, which would in turn cause a noticeable change when magnified by the telescope.
Two primary forces which are of interest here act on the balanced masses, apart from the string tension, as seen from the earth's frame of reference (or "lab frame"); gravity and the centrifugal force due to the rotation of the Earth (this being a noninertial frame of reference). The former is calculated by Newton's law of universal gravitation, which depends on gravitational mass. The latter is calculated by Newton's laws of motion and depends on inertial mass.”
Eötvös’s experiment does indeed give strong empirical evidence that two masses on either end of a rod obey the equivalence principle. In f(div) theory, the rod masses together with their internal binding energies should likewise obey the equivalence principle. But, in f(div) theory, the rod masses leave weird trails of gravitational effects caused by virtual mass-energy from cancelled virtual photons. Relative to the reference frame of the Eötvös experiment, there should be a center of mass of dark matter that changes according to the history of Fredkin forces from alternate universes. How does the Eötvös experiment take dark matter into account? Cosmological observations prove that dark matter in some form exists. If the f(div) theory is correct, then Eötvös-type experiments are either irrelevant to f(div) theory or are not accurate in their tracking of dark matter and/or of the Fredkin force adjustment needed in general relativity theory.
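Eötvös-type experiments are conventionally summarized by the Eötvös parameter, eta = 2|a1 - a2| / (a1 + a2), comparing the free-fall accelerations of two test bodies. A hedged sketch of the tension the text must resolve: if the guessed Fredkin factor of 10**(-5) acted on one test body and not the other, eta would be roughly 10**(-5), far above the roughly 10**(-13) bounds reported by modern torsion-balance experiments; the argument above is that the effect instead hides in an untracked dark-matter distribution.

```python
# Eötvös parameter: a standard figure of merit for equivalence-principle
# tests, comparing the free-fall accelerations of two test bodies.
def eotvos_parameter(a1, a2):
    return 2.0 * abs(a1 - a2) / (a1 + a2)

g = 9.81  # m/s^2
# Worst case for f(div): the guessed Fredkin factor is fully differential,
# i.e. it rescales gravity for one test body but not the other.
eta_if_differential = eotvos_parameter(g * (1.0 + 1e-5), g)
print(eta_if_differential)   # ~1e-5, vs. experimental bounds near 1e-13
```

So a differential Fredkin force of that size is excluded by roughly eight orders of magnitude, which is why the text must argue that Eötvös-type experiments do not see the effect at all.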
Note that in f(div) theory there is dark matter from alternate universes on the one hand and standard matter (either observed or unobserved) on the other. In f(div) theory, unobserved “dark matter” is not really dark matter — it is merely unobserved standard matter. Does the f(div) theory lack a scientific foundation? Perhaps the foundation of f(div) theory is currently inadequate, but what of the foundation of quantum field theory in terms of explaining the vacuum energy?
http://en.wikipedia.org/wiki/Vacuum_catastrophe
In M-theory without alternate universes, there is force derived from standard mass-energy. In M-theory with alternate universes, there is force derived from standard mass-energy, force derived from informational mass-energy with positive gravitational mass-energy and zero inertial mass-energy, and force derived from informational mass-energy with negative gravitational mass-energy and zero inertial mass-energy. The f(div) theory of modified general relativity theory deals with the Fredkin force derived from informational mass-energy with positive gravitational mass-energy. The Fredkin factor results from the informational mass-energy, which is virtual mass-energy entering the observable universe from alternate universes. Dark matter consists of the gravitational effects of virtual photons that cancel out in quantum field theory calculations.
Last edited by David Brown on 05-19-2010 at 06:32 PM