I'm getting close to publishing book 3. I am past, if not well past, the halfway point, and most of the math that will be presented in book 3 is done. Dare I look for a July publication date? Stay tuned; the weekend may well tell.
Chapter 38 Quantum information theory and traditional physics
Force is a result.
For example, it is speculated that compression results in long periods of ct4 super-compression in the presence of ct5. This is solution based: in high compression states (ct5), you get super-compression, 1111 to 11111 coordinate solutions, and these last as the ct4 states move away, leaving them as compressed ct4 which we experience as neutrons and electrons.
This transition is based on the necessary crowding of substitution states and is probably the strongest evidence that, for example at ct2-3 (photon to wave), the substitution is not purely ct1 substitution but ct2 substitution in waves; that is, substitution in higher states involves at least temporary super-compression states of the next lower state. Time dilation, on the other hand, shows that external ct1 substitution rates remain important in the complete management of math solutions. In magnetism, as soon as the "current is shut off," the related spiral of ct2 temporary states also disappears, remembering that this is a math result and not a force consequence of shutting off the current.
It appears that the same type of state substitution occurs during fusion reactions to some extent, but more likely this is an after-effect of the generation of ct4 super-compression during very high, black-hole-type compression states such as those that occur during the big bang inflection point events. The continuation of these nucleus super-compression states indicates that for higher compression states, the temporary super-compression states last longer, presumably because of internal sharing, which leads to history and time dilation in conjunction with external ct1 substitution. Whether very large black holes reflect the same effect in the presence of as-yet-undetected ct6, or whether these very high compression states can generate their own super-compression (e.g., by fusion in stars at ct4), remains to be determined with specificity, but it can be determined by examining the results of compression in fusion reactions in stars for ct4.
When you get ct1 linearity in the presence of massive overlaps of ct1 states, you get comparative states between ct1 and ct2, and the result of these separation states is that you experience gravity, which is more pronounced where there is ct1 sharing between higher compression states. Again, this is solution based, resulting from math solutions that generate ct1 substitution, and not force driven.
Space, and the effects on space over long periods of time in the presence of high compression, may easily account for dark matter, but this is only necessary to the extent that dark matter is compressed in certain circumstances.
The idea that light is bent in areas where we believe there to be concentrations of dark matter potentially reflects intermediary states present with longer lifetimes. The fact that it may be ct6, super-compressed ct1 (space) resulting from earlier compression in the presence of a high ct state, or some other result doesn't change the ability of AuT to provide an explanation if enough data can be obtained.
In circumstances of very high compressive states (whether localized or during a big bang), super-compression of ct states appears to occur, and these compressive states persist and continue to circulate as magnetism and similar mathematical solutions as the combined high compression spirals begin to disperse during the phases of decompression.
A lasting effect in
the form of dark energy is net decompression which will eventually change to
net compression lest the universe continue to expand exponentially.
To call dark matter a particle is nonsense, just as looking for bosons is nonsense. It is attempting to force non-dimensional mathematical solutions into dimensional particles.
While we see dimension, it has nothing to do with dark energy except that dark energy is a decompression state resulting from the exponential growth of F-series solutions of ct1 without offsetting higher state compression. When the higher state compression is greater than the F-series growth, you get a compressing universe which approaches the big bang inflection point.
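As a purely illustrative toy (my own sketch, not AuT's published algorithm), that balance can be pictured numerically: treat the F-series as the Fibonacci series adding new ct1 with each change in x, and treat higher state compression as an assumed amount removed at each step. Whichever term grows faster determines whether the net result is decompression (expansion) or compression (approach to the inflection point):

# Toy sketch in Python (illustrative assumptions only): Fibonacci "F-series"
# growth of ct1 versus an assumed per-step compression term.

def fib(n):
    """n-th Fibonacci number, with fib(1) == fib(2) == 1."""
    a, b = 1, 1
    for _ in range(n - 1):
        a, b = b, a + b
    return a

def net_change(x, compressed_at):
    """Net ct1 change at step x: new F-series ct1 minus the amount assumed
    to be locked into higher compression states.  Positive means net
    decompression (expansion); negative means net compression."""
    return fib(x) - compressed_at(x)

if __name__ == "__main__":
    # The assumed compression term (2**x // 8) grows faster than the F-series
    # (which grows roughly as 1.618**x), so the sign eventually flips from
    # expansion to compression; only that flip, not the specific function,
    # is the point of the illustration.
    for x in range(1, 13):
        print(x, net_change(x, compressed_at=lambda x: 2 ** x // 8))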
Since the "length" measured in quantum states between these "turns" or "carrier lengths" at each carrier state is exponentially higher (almost but not quite double) at each change, the carrier lengths between turns have epoch-type lengths, and intersections between intersecting spirals or other compressive solutions, instead of turns, can play a role in "reversing" the compression/decompression solution theorized in the primary model.
Substitution becomes a replacement for velocity; 1:256 is light speed. It is noted that in a slowly changing ct4 state, the rate of ct1 substitution remains at 1:256 for the ct1 states that make up the slower carriers at the remove of 3 steps (ct1 substitutions, ct1-ct2, ct2-ct3, ct3-ct4).
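One way to read the paragraph above, sketched in Python with my own assumed mapping (the linear scaling and the sample rates are illustrative, not taken from the book):

# Illustrative only: treat "substitution replaces velocity" as a linear map
# from ct1 substitution rate to velocity, with the stated 1:256 maximum
# corresponding to light speed.

LIGHT_SPEED_RATE = 1 / 256  # one ct1 substitution per 256 states per change in x

def velocity_fraction(substitution_rate):
    """Map a ct1 substitution rate to a fraction of light speed (v/c),
    assuming a linear relationship capped at the 1:256 maximum."""
    if substitution_rate > LIGHT_SPEED_RATE:
        raise ValueError("rate exceeds the 1:256 maximum")
    return substitution_rate / LIGHT_SPEED_RATE

if __name__ == "__main__":
    print(velocity_fraction(1 / 256))   # 1.0  -> a ct2 (photon) at light speed
    print(velocity_fraction(1 / 1024))  # 0.25 -> a hypothetical slower carrier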
We see the rate of change of ct2 as energy and we see ct1 as having more energy than ct2, but the truth remains that these things are just different forms of information, more specifically different solutions with different substitution rates, the substitution rate for space having no context in time or dimension and therefore being difficult for us to understand until it reacts with higher ct states to give a spacetime-relevant result.
To say that this means space has "more energy" than anything else is inaccurate, but it also has a grain of truth from our perspective because we see relative speed of information exchange as energy, and from that standpoint you can say either (1) that space has infinite energy or, more accurately, (2) that the effect that space has on ct2 gives it the maximum energy that we experience (movement at the speed of light) before AuT.
Likewise, to say
there is more of ct1 than anything else is a tautology because everything else
is made of ct1.
The mathematical solution, or algorithm solution, for the 256 states aligned by one ct1 carrier defines a quantum photon or ct2 state. At first glance, it appears that the one ct1 carrier state is added to 255 "carried" states to reach the 256 magic number, and that each time x changes there is a substitution, so that ct1 is always changing at light speed, although other solutions are suggested by the mesh created when higher states are formed. It can be noted here that ct1 has no internal substitution (at least none that is dimensional), so space does not experience velocity as we understand it, but it does have solution order, allowing it to provide separation without movement.
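A minimal sketch of that first-glance construction, again with my own data structures and naming rather than anything from AuT's notation: one ct1 carrier plus 255 carried ct1 states make up the 256-state ct2, and each change in x substitutes the carrier, so exactly 1 in 256 constituents changes per step:

# Toy model (illustrative): a ct2 (photon) as one ct1 carrier plus 255
# carried ct1 states, with the carrier substituted at every change in x.

from collections import deque
from itertools import count

def make_ct2(first_id=0):
    """Build a ct2 state as 256 ct1 identifiers; index 0 is the carrier."""
    return deque(range(first_id, first_id + 256))

def step_x(ct2_state, external_ct1):
    """One change in x: the carrier is replaced by an external ct1 state,
    so 1 of 256 constituents changes per step (the 1:256 light-speed rate)."""
    outgoing = ct2_state.popleft()       # carrier leaves the ct2 state
    ct2_state.appendleft(external_ct1)   # external ct1 becomes the new carrier
    return outgoing

if __name__ == "__main__":
    photon = make_ct2()
    surrounding_space = count(1000)      # stream of external ct1 identifiers
    for _ in range(3):
        step_x(photon, next(surrounding_space))
    print(photon[0], len(photon))        # 1002 256: three substitutions, size unchanged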
Space varies along with x (the single variable). The suggestion of exponential change with each change in x is hard to escape without the math of F-series addition being altered. Overlap seems to be suggested by shared ct state origins along with proximity of solution. Compression seems to be based on the association of the "net" state of pi for a higher ct state.
https://medium.com/starts-
This article contains a podcast that suggests the fundamental is material and exists in spacetime. It isn't even close to accurate, but it discusses the standard model. It notes that particles can be reduced to identical features, at least among like particles.
This suggests that on a more fundamental level these different particles can be reduced to a generic form: (1) fermions or (2) bosons, categorized by spin (half-integer or whole number).
He discusses annihilation as a feature. AuT rejects the destruction of information, but its conversion into space, a non-dimensional element, is expected.
He talks about sharing binding particles, binding bosons, the "exchange" of gluons, which corresponds to information states being exchanged.
Half-size information is important for AuT geometry because of this figure: pi/2 means that you have to have half measurements, and how is this possible? The answer suggested by AuT is the one shown in this figure. Each result leading to geometry is a function of two parts, plus-plus to minus-minus.
The math required to get from information theory to quark theory is complex, but given that the observed forces and masses are reflections of information solutions, the math is also a reflection of the simpler underlying algorithms. The simplicity of this
The other complicated difference between fermions and bosons is the Pauli exclusion principle: no two identical fermions can occupy the same quantum state, that is, not at the same place at the same time with the same properties. Bosons, of course, are allowed to be the same, meaning that bosons correspond to ct1, which isn't the "same" because of solution order (relative solution order), but it does show how the features of quark theory line up with AuT, which is merely a more fundamental, more accurate way of approaching these issues.
Now this goes into molecular construction, but reconciling those features of the universe is for another book, not book 3, because it is very, very distant from quantum mathematics. The idea here is that separation features correspond to the math, not the forces arising from the math, and have to do with separation features of the geometry.
What's on your white board?