
Wednesday, April 19, 2017

AuT-The interaction between larger particles and ct4 quantum elements 7 of 10

It is strange to see so far beyond everyone else in this regard.  And yet the mathematics of space compression is very complex in the presence of the high ct states, ct4 and ct5 for this discussion.  The gradual compression equation, so prevalent locally even during the latter half of an expansion cycle, not only compresses ct4 states into electron and proton pairs but also defines a process that associates these in bulk with the intermediate states.
This result comes from an algorithm that must "force" concentration into its solution; otherwise we would have a homogeneous nothing of a universe, the result of an algorithm where the geo function does not grow in the presence of compressed states and where higher solutions do not remain with higher solutions.  We call this effect gravity, but in fact it is a result of the underlying linearity that gives rise to gravity, an effect that increases with higher solutions.
In terms of the solution, it works something like this:
1) The algorithm provides that compressed states tend to stay in proximity to compressed states but not to the exclusion of decompression of space.
2) The algorithm provides that there be enough ct1 in the presence of compressed states to allow for ct1 changes with each change in the value of x.
3) Shared ct1 states between compressed states are provided by having proximate solutions of multiple higher states in the presence of a common ct1 solution.
This is not fully satisfactory as an explanation because of the general separation between black holes and the space within locally dense ct4 states.  It speaks to a relative proximity of solutions rather than a precise one; but in truth what it really concerns is the necessity of ct1 for movement.  Space, and the resulting separation, is necessary because it is required for movement, and change is required for each ct state because of how the formula is defined.
There is a conflict between the forced movement, which requires separation, and the tendency toward compression, even as separation is the preferred condition.  The forced movement arises because every coordinate has to change for every change in the value of x.  This means that no matter how concentrated the coordinate change, there must be enough ct1 states, shared or otherwise, to make the change occur, and this is entirely consistent with a universe based on alternating converging and diverging series.  That is, the observed result is the logically expected result.
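To make the interplay concrete, here is a minimal toy sketch, my own construction rather than the actual AuT algorithm, of the rule that a compressed point can only take its forced change when a ct1 point (possibly a shared one) sits next to it:

# Toy illustration (not the author's algorithm): a 1-D line of quantum points
# where every point must change when x increments, and a compressed point can
# only change if a ct1 point is adjacent; a shared ct1 neighbour can serve two
# compressed points at once.
def can_advance(points):
    """points is a list of 'ct1' or 'ct4'; return True if every compressed
    point has at least one adjacent ct1 point available for its change."""
    for i, p in enumerate(points):
        if p != "ct1":
            neighbours = points[max(i - 1, 0):i + 2]
            if "ct1" not in neighbours:
                return False
    return True

print(can_advance(["ct1", "ct4", "ct1", "ct4", "ct1"]))  # True: the middle ct1 is shared
print(can_advance(["ct4", "ct4", "ct4"]))                # False: no ct1 to carry the change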

This post will cover the problem with focusing on the results instead of the algorithm itself.
One of these processes that has been studied is fusion.  But while mathematics has explored the process of fusion, and while practice has accomplished fusion in both controlled and uncontrolled environments, the most observed being the uncontrolled processes of the sun, we fail to recognize our role, as tools, in the spiral mathematics of an extremely complex system when we do simple fusion reactions on earth.
It is a conceit to think that we are studying the process or using the process when we are being used by the super symmetry underlying space-time and illusory entropy.
Nevertheless, our role in the process is rather minor compared to the actual process itself.
We know the idea of combining masses to release mass, and this is important in the present analysis.  It isn't important because of e=mc^2, or more specifically ct4=f(4)^(2^4); it is important because it shows a reaction of spirals moving both towards and away from higher compression states, a reaction of the form U(x) = U_ct4(x) + U_ct3(x) + U_ct2(x) + U_ct1(x), going from U(x1) to U(x2) for a localized system where geo is largely the same for all of U(x) but where x changes between the two time periods, governed by a change in x roughly 10^37 times every second.
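As a purely illustrative sketch of that bookkeeping (the numbers below are placeholders, not values derived from the theory), the localized system can be treated as a sum of per-state contributions evaluated at two successive values of x:

# Toy sketch (not the author's code): evaluate U(x) = U_ct4(x) + U_ct3(x)
# + U_ct2(x) + U_ct1(x) at two successive values of x and take the difference.
# All contribution values are placeholders.
def U_total(parts):
    """Sum the per-state contributions for one value of x."""
    return sum(parts.values())

U_x1 = {"ct4": 940.0, "ct3": 12.0, "ct2": 3.0, "ct1": 45.0}   # before the change in x
U_x2 = {"ct4": 938.0, "ct3": 12.5, "ct2": 3.2, "ct1": 46.3}   # after the change in x

print(U_total(U_x2) - U_total(U_x1))   # net change of the localized system between instants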
Outlier factors, such as the spirals governing our actions, become relatively remote in terms of defining what happens within a short period of time in the concentrated area around the partial universe represented by the reacting particles and the adjacent space, even though the amounts of ct1 exchange are enormous over even a few seconds.  Because this is all part of an interconnected matrix going all the way back to x=1, only approximate results are possible, but localized analysis is largely accurate even though the permutations are enormous.
We expect these permutations in a system as dense as, say, the sun, but even in a vacuum in a lab, the vacuum necessarily being a location of ct1 exchange, the system is very dense and specifically includes our interaction with the process.
We can estimate the value of the changes based on the ratios that exist between ct2, ct3, ct4 and ct5 and on the mass measurements of the various portions of the neutron (roughly 1,830 to 1).
What we would like to assume is that the intermediary ct4 states for ct2 substitution and ct3 substitution, neither of which is pure since each involves elements of the other, would somehow yield a ratio of 1 to 1,830.
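For reference, the measured figure behind that "roughly 1,830 to 1" can be checked directly from the published proton and electron masses; this is ordinary arithmetic, not part of the AuT derivation:

# Quick check of the measured ratio mentioned above, using the published
# proton and electron masses (CODATA values).
m_proton = 1.67262192e-27    # kg
m_electron = 9.1093837e-31   # kg
print(m_proton / m_electron)  # ~1836.15, i.e. "roughly 1,830 to 1"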
There is another element brought up here, which is the converging-series nature of half-lives, and it appears to be present in this process.  That means that the existence of the higher states lasts forever in steadily decreasing amounts.  This suggests that the amounts of anti-matter and matter never match up but approach an equal quantity, which would tend to convert all matter eventually to space.
Proton decay half-lives have been estimated in a range from 10^30 to 10^35 years, although half-lives longer than 1.67*10^34 years have been posited, as has the order of 10^32 years, but these models are based more on string theory than on anything else involving quarks.  The decay in this case is the decay towards minimal ct4 states, and then to lower states.
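A minimal numeric sketch of that converging-series behavior, using the low end of the quoted range (10^30 years) purely as an example half-life:

# Sketch of the converging-series behaviour of half-lives described above:
# the surviving amount never reaches zero, it just keeps halving.
import math

half_life_years = 1e30   # example value taken from the range quoted above

def surviving_fraction(t_years):
    """Exponential decay: fraction of the original population still present."""
    return math.exp(-math.log(2) * t_years / half_life_years)

for n in range(1, 6):
    print(n, "half-lives:", surviving_fraction(n * half_life_years))  # 0.5, 0.25, 0.125, ...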
So how do we account for continued compression stages in the ct4 state?  To do this we have to use the parts we have, either the geo function or the compression function, and all that is open in this regard is the geo function.  This new rule has to explain why we have greater compression at high compression states, particularly proximate to big bangs, and why less compression when we have high decompression states, sort of where we are now.  In looking at the spiral function we know that you need long enough spirals to have stable states between turns.  We have different values of curvature depending on where a point is relative to other solutions in order.  And so we come up with greg's rule: the greater the curvature (the closer in solution distance between spiral solutions), the shorter the spiral length between turns necessary to obtain compression.
Of course, this whole thing is greg's rule, so whatever.
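As an illustration of the direction greg's rule describes, an ordinary logarithmic spiral, used here only as a stand-in since the AuT spiral function is not written out in this post, already shows curvature falling while the length of each turn grows:

# Illustration only: on a logarithmic spiral r = a*exp(b*theta), curvature
# falls as 1/r while the arc length of one full turn grows with r, so higher
# curvature goes with shorter length per turn.
import math

a, b = 1.0, 0.1

def curvature(theta):
    r = a * math.exp(b * theta)
    return 1.0 / (r * math.sqrt(1.0 + b * b))

def turn_length(theta):
    # arc length of the full turn ending at theta
    r = a * math.exp(b * theta)
    return (math.sqrt(1.0 + b * b) / b) * r * (1.0 - math.exp(-2.0 * math.pi * b))

for theta in (2 * math.pi, 6 * math.pi, 10 * math.pi):
    print(round(curvature(theta), 4), round(turn_length(theta), 2))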
Where there are 4 coordinates being solved at once, the compression at any one point must come from the solution to all 4 of those coordinates taken together.
To understand this better, let's look at one coordinate at a time, ct1: 1, 1, 2, 3, 5, etc.  For this you have a different value of pi unless it is in the presence of a higher state (we exist where the higher state is at least ct5), so you have to sum all the points, weighted by distance, and curvature becomes a function of the specific geo function of pi (evolving) plus Sum(1, max x) K dA, where K is the Gaussian curvature for the spiral set in question and dA is the quantum area for that curvature.  We showed this with drawings earlier:
[The two drawings from the earlier post are not reproduced here.]
Although these drawings suffer somewhat at the ct1 state, where curvature and dimension are yet to be resolved, you need at least three points to define this type of curvature.
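A toy sketch of that running sum, with placeholder values for K and dA rather than anything derived from the geo function:

# Toy sketch of the curvature sum described above:
# curvature ~ pi_evolving + Sum over quantum areas of K*dA.
pi_evolving = 3.14159          # stand-in for the evolving geo value of pi
quantum_areas = [
    (0.8, 0.01),               # (Gaussian curvature K, quantum area dA)
    (0.5, 0.01),
    (0.3, 0.01),
]

total_curvature = pi_evolving + sum(K * dA for K, dA in quantum_areas)
print(total_curvature)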
The complexities of the measurements in question arise from the complex nature of defining curvature based on the type of information included (curvature is different for each type of information, obviously, but only based on the number of coordinates being solved together) and from the fact that curvature is well defined for any point only at a quantum instant, and that definition requires a solution for whatever value of x is present in the universe at that quantum instant.  Since it is a function of the relative order of solution, giving rise to what we perceive as distance, it gives rise to the complex scalars which exist at the quantum instant only as the relative timing of solutions with respect to whatever point is being examined.  A point in the middle of space, like we are, as opposed to a point on one of the edges, will have a much more complex approximate solution, the exact solution requiring a solution to every point in the universe and therefore being impractical and, practically speaking, unnecessary, especially since the solution will shift drastically when x changes each 1.07x10^-37th of a second.
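As a quick consistency check between the two figures used in this post, a change in x every 1.07x10^-37 seconds is the same thing as roughly 10^37 changes per second:

# Consistency check on the post's own numbers: one change in x every
# 1.07e-37 seconds is on the order of 1e37 changes per second.
seconds_per_change = 1.07e-37
print(1.0 / seconds_per_change)   # ~9.3e36, i.e. roughly 10^37 per second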
One problem with using observed phenomena to arrive at answers is that we assume three dimensional space, which we know is inaccurate both because three dimensional space is a function of 4 coordinate changes and because of the presence of 5 dimensional space (ct5) and lower dimensions of space based on lower orders of information combining to create the changing face of ct4.

Einstein's curvature, for example, recognized that there was a matrix of space-time, what we call an order of information solutions, and focused largely on the concentration of ct4 solutions in a given environment.  What's interesting about this analysis is that it recognizes a relationship between the higher order of curvature and the lower order of curvature based on energy, but it incorrectly assumes that pi is a fixed part of the equation, when pi itself only exists relative to the position of information with respect to the point under examination.  Even the approximation of pi varies greatly (although without much practical relevance) according to how many coordinates are changing at once for a given solution and according to the state of compression (how many points are being proximately solved at the quantum instant, where proximate refers to the relevant locality, irrelevant locality being that so far away as to have no practical effect on the solution), so that you have some very complex underlying symmetry to take into account.
Hence the matter/curvature equations work well in the proximity of large amounts of matter, and fail pretty miserably when you are dealing with lower information states or higher ones although the basic concepts remain the same.
Another aspect of this is that curvature is therefore a perspective issue.  From one position it is one thing, from another it is something different.
We must eventually get to a discussion of gravitational waves, which use space-time itself as a medium, and how that occurs in an algorithm-based system.  But not tonight.

https://physics.stackexchange.com/questions/109731/how-to-measure-the-curvature-of-the-space-time

http://mathworld.wolfram.com/GaussianCurvature.html



http://www.johnagowan.org/proton.html
