
Tuesday, April 19, 2016

AuT-Compression vs capacitance and another name for dark matter

There will, shortly, be an unsatisfying, largely mathematical post dealing with dune formation, Bernoulli flow issues and the like.  There will be a related post on the derivation of the capacitance equation, equally unsatisfying.  While these models hover around the correct answer, the exact model has to go with the equation (sin(pi/2)) given earlier for the spiral formation and with a compression function, which we will deal with in this post.
Ignoring dark matter (97% of the universe will be assigned to dark matter/space), of the 3% remaining, perhaps 1/1000 is stellar-mass black holes.
We could use a number between 3% and 5%, but I have a reason for the smaller number.
This indicates that, at the point of compression where we find ourselves, space has compressed into black holes through 5-6 states and 4-5 intersections (assuming the first state existed without an intersection).
If a black hole is 10 (10 to 20) times our sun's mass and an "average" supermassive black hole is 4.1 million times the mass of our sun, then the difference is only a factor of roughly 10^5 to 10^6, which may or may not work out well.  Fortunately there are black holes that are potentially billions of times our sun's mass, which adds a few zeros (roughly 10^8).  Even these, presumably, may be only accumulations of ct5 states, which would mean there are no true ct6 states in the universe at this time.
Given their relatively small part of the 3%, we are assuming for this discussion that the supermassive black holes are NOT ct6; they can easily be just large accumulations of stellar black holes.   Hence we have only 5 states with 4 intersections.
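As a rough sanity check on those ratios, here is a short Python sketch; the 10 solar mass stellar figure, the 4.1 million solar mass "average" supermassive figure, and the billion solar mass upper end are just the numbers quoted above, and the rest is arithmetic:

```python
stellar_bh = 10        # solar masses: stellar black hole, low end of the 10-20 range above
avg_smbh = 4.1e6       # solar masses: "average" supermassive black hole quoted above
large_smbh = 1e9       # solar masses: the billions-of-solar-masses class mentioned above

for label, mass in [("average supermassive", avg_smbh), ("billion-solar-mass", large_smbh)]:
    ratio = mass / stellar_bh
    order = len(str(int(ratio))) - 1          # order of magnitude of the ratio
    print(f"{label} / stellar: {ratio:.1e} (roughly 10^{order})")
```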
Now, pre-AuT science doesn't understand that space, energy and matter are all information (see the NASA articles below), hence they come up with a number of wrong assumptions and confuse the make-up of space with the results of information theory in a capacitance/compression framework.  But I'm not here to dis NASA; if they'd figured it out there would be nothing for me to do.
Likewise, they don't understand the effect of an algorithm-driven universe, so they're stuck with the resulting forces rather than the cause of the forces, which leads to even more confusion.  It's like they're looking at shadows on a cave wall and trying to figure out what's happening; that's the allegory of the cave, I suppose.
None of this is overly relevant to the discussion.
But it's important to understand that the entire concept of dark energy is just the failure to recognize that after the period of capacitance/intersection/compression there is a period of decompression, which is what is driving the expansion of the universe.  This decompression is a fairly simple concept: space that was compressed, but not to the point of stable compression, is 'leaking' out of its compressed state and forcing the universe apart. This is NOT a physical event; it is a mathematical event, the algorithm running according to some version of the capacitance equations we will cover shortly.
So much for science.  Why is this relevant to our discussion other than to point out how poorly pre-AuT science saw things?  It's because the 71% dark energy envisioned is really the amount of space that failed to compress to a higher state during the last intersection of the linear F-series spirals.  Pretty simple stuff!
So we have, by this analysis, a potential answer, though it is uncertain to some extent because the 71% doesn't line up perfectly with the 24% dark matter.  The reason is that the split between dark energy and dark matter is a poor way of weighing these two, and until someone does the analysis knowing what they are looking at, we're potentially comparing apples to oranges.
However, for purposes of this post we'll make yet another (very risky) assumption. We're going to say space is 95% of the universe after 5 compression states.
So we have this type of transition:
Original universe: 100% ct1 (space)
Post collision 1: xct1, yct2
Post collision 2: x''ct1, y'ct2, zct3
Post collision 3: x'''ct1, y''ct2, z'ct3, act4
Post collision 4: x''''ct1, y'''ct2, z''ct3, a'ct4, bct5
At post collision 4 (which we may still be in) we are going to say that 95% of the universe is still space.  This means the other states are 5%.  We said above that bct5 is 1/1000th of the universe's matter, but it appears more likely that ct5 will one day be that high and is actually much lower at this quantum moment, or this one, or this one.  While ct4 is a very big number indeed (1000 times or more than matter), we have to assume that energy is much higher still.  It would be nice to be able to say each was 1/1000th of the next, but that type of progression seems inconsistent with compression.  Indeed, by the design of the underlying algorithm, it would appear on the surface that the universe might not have the "energy" to fully compress.  The algorithm model for a single spiral is relatively simple compared to the actual compression of every quantum bit of information in the universe, where each bit at each quantum moment must be solved for x as x is (not goes from) 0 to infinity.
Looking at the post-collision numbers, what we're left with for the moment is:
Post collision 4: x''''ct1 is >95% of everything else (as we will see, this is really "much greater"); y'''ct2 is >1000 times everything else except ct1; z''ct3 has the same general concentration as ct2 (this is something of a grey area due to the interchange between these states, and again all energy is probably "much greater" than ct4 and ct5 in terms of information content); a'ct4 is >1000 times ct5.
Each (') indicates that the amount of the ct state in question is growing at the expense of space (ct1 getting constantly smaller, but remaining the lion's share of the universe for now) or of other states.  Ultimately we'll have two ctx states with essentially no space between them, but that is a prior post.
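The bookkeeping in the list above can be sketched as a simple state vector updated at each collision. To be clear, the compression rate below is a placeholder I picked for illustration, not something derived from the AuT algorithm; the point is only the shape of the transition, each collision opening one new ct state and shifting a little material up the chain:

```python
def collide(fractions, compress_rate=0.001):
    """One 'collision': a fixed fraction of each existing ct state compresses
    into the next state up, and one new (higher) ct state opens up."""
    new = fractions + [0.0]                    # the collision opens one new ct state
    for i, amount in enumerate(fractions):     # work from the pre-collision amounts
        moved = amount * compress_rate         # placeholder compression fraction
        new[i] -= moved
        new[i + 1] += moved
    return new

universe = [1.0]                               # original universe: 100% ct1 (space)
for n in range(1, 5):                          # post collision 1 through post collision 4
    universe = collide(universe)
    print(f"post collision {n}: " +
          ", ".join(f"ct{i + 1}={f:.3e}" for i, f in enumerate(universe)))
```

A real version would need compression fractions chosen to reproduce the roughly 95% space and 1000-to-1 spacing described above.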
What we're seeing here is that during capacitance approximately 71% of space is compressed, but that at the end of the day, even after 4 collisions, only 5% of it has stayed compressed.  I believe the number is closer to 3%, but because we're calling (idiotically) decompressing space by the totally wrong name, dark matter, the amount of it could be far greater.  Indeed, the 3% number may be impossibly far off, and yet until I figure it out we don't know...or do we?
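Taking only the numbers in this post at face value (71% compressed during capacitance, 3% to 5% remaining compressed afterward), the share that has "leaked" back out of compression works out like this:

```python
compressed_at_capacitance = 0.71          # 71% of space compressed during capacitance (from this post)
for retained in (0.03, 0.05):             # the 3% and 5% figures discussed above
    decompressed = compressed_at_capacitance - retained
    share = decompressed / compressed_at_capacitance
    print(f"retained {retained:.0%}: decompressed {decompressed:.0%}, "
          f"i.e. {share:.0%} of what was compressed has leaked back out")
```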
It just so happens that I have figured this out previously, in principle anyway.  If we look to the compression charts I put together based on the number of intersections and the amount of material changing at each intersection, and we take the total amount of information at any quantum moment, then we should be able to figure out how many spirals it takes to turn the universe into a compressed mass.  At each compression state a number of these are "aligned" in terms of ct state if the universe is made up of smaller universes.  Some of these cannot fully compress, meaning there will always be lower states, and this indicates that at full compression "our universe" (as opposed to the next one out, with an additional spiral past the spiral that we have) would have at least one of each lower state (one space, one photon, one wave energy, etc.) because those other universes cannot collapse all of the way.  If the total compression state of our universe was, for example, ct170, presumably you'd have at least one of each of the 169 lower states at full compression.  The actual indication is that the number would be higher, because many of these sub-universes would have the same level of compression, based on the idea of one universe growing from each smaller universe at a constant rate. This is the same exponential growth that you see in the growth of spirals reflected in the drawing, but with each line of common length being an additional universe with the same number of ct states as the others of common length.
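The counting argument in that paragraph can be sketched crudely as follows. The ct170 figure is just the example used above, and the growth rate of 2 per level is a placeholder (the F-series spirals would imply their own growth law), so only the shape of the result matters: all 169 lower states keep at least one representative, and the lower the state, the more sub-universes sit at it.

```python
def residual_states(top_ct, growth=2):
    """Counting sketch: if one universe grows from each smaller universe at a
    constant rate ('growth' is a placeholder, not derived from the F-series),
    the number of sub-universes at each lower compression level grows
    geometrically, and each level leaves at least one representative behind."""
    counts = {}
    n_universes = 1
    for ct in range(top_ct - 1, 0, -1):   # the lower states, ct(top-1) down to ct1
        counts[ct] = n_universes
        n_universes *= growth
    return counts

counts = residual_states(170)             # the ct170 example used above
print(len(counts), "lower ct states represented")   # 169
print("ct169 count:", counts[169], "| ct1 count:", counts[1])
```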

This will probably be a solution saved for volume 2, because the 300 or so intersections that I have looked to previously may be woefully inadequate.  Indeed, the number could be as high as 10^100 spirals making up the primary spiral of our universe!  Probably not that high, by the way, but there is a model that suggests it, which I will get to in due time.
For more on the wrong answers, see this list of articles.
http://map.gsfc.nasa.gov/universe/uni_matter.html
http://www.universetoday.com/112500/how-much-of-the-universe-is-black-holes/
http://science.nasa.gov/astrophysics/focus-areas/black-holes/
http://science.nasa.gov/astrophysics/focus-areas/what-is-dark-energy/
https://profmattstrassler.com/articles-and-posts/particle-physics-basics/mass-energy-matter-etc/matter-and-energy-a-false-dichotomy/
