
Let's start with the analogy of Indra's Net.

Think of atoms/matter = jewels, net = superfluid, superconducting, wormhole criss-crossed space.

Remember the concept of 'quantum foam'? Essentially, spacetime is so highly energetic at the quantum scale due to quantum uncertainty that it's stretching spacetime into a highly turbulent fabric. At the most fundamental level, spacetime isn't smooth, it's multiply connected through wormholes. That means essentially, when you move your hand, the atoms are 'hopping pixels'. You are constantly tunneling around.

Anyway...keep that concept sort of in mind.

Quantum theory essentially began when Max Planck discovered that energy moves in discrete packets. For example, a blackbody emits radiation in discrete quanta.

We didn't think energy moved in packets, for example when you heat up your oven it doesn't seem to 'jump' temperatures - but it actually is. The jumps are just extremely tiny so it appears to be a smooth process.

Even when a field is at rest / in its ground state, it is still made up of these packets. At the smallest level, these are what mainstream physics commonly refers to as 'vacuum fluctuations'.

When you add up the vacuum fluctuations in a single cubic centimetre of space, you get 10^93 grams. This is an absurdly high amount of energy. For comparison, if you squished the entire observable universe into the same space, you would get about 10^55 grams. The gap between the predicted and observed values of the vacuum energy is known as the vacuum catastrophe, and at roughly 122 orders of magnitude it is one of the biggest unsolved problems in physics.
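As a back-of-the-envelope sanity check on those figures, here's a rough version of the vacuum catastrophe calculation (a sketch only; the constant values and the observed density used here are assumed round numbers, not precise measurements):

```python
import math

# Assumed round values for the constants (SI units)
l_planck = 1.616e-35   # Planck length, m
m_planck = 2.176e-8    # Planck mass, kg

# Planck density: roughly one Planck mass per Planck-length cube
rho_planck = m_planck / l_planck**3      # kg/m^3, ~5e96

# Convert to grams per cubic centimetre (1 kg/m^3 = 1e-3 g/cm^3)
grams_per_cm3 = rho_planck * 1e-3        # ~5e93 g/cm^3 -> the '10^93 grams'

# Observed cosmological (critical) density, very roughly
rho_observed = 9.5e-27                   # kg/m^3

# The ratio between the two is the famous ~122 orders of magnitude
orders = math.log10(rho_planck / rho_observed)
print(f"{grams_per_cm3:.1e} g/cm^3, ~{orders:.1f} orders of magnitude")
```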

Because of this issue, we have been unable to link the mass of matter to the vacuum - to these fundamental natural quanta.

From the wiki page on Planck units:

This is known as the hierarchy problem (why is the proton mass so small, and why is the Planck mass so large?).

We commonly think of these vacuum fluctuations as 'virtual' because we assume this energy isn't actually affecting anything (even though we've extracted photons from the vacuum via the Casimir effect), and even the Higgs field relies on a non-zero vacuum expectation value.

What Nassim Haramein has done is figured out how we can derive the mass of matter from the fundamental Planck unit. He starts with a Planck spherical unit - a spherical oscillator with the Planck mass and a Planck-length diameter. Remember, these values aren't defined by humans; they are natural units. Since it's a fluctuation, it has a length, an energy/mass, a time/frequency, and so on.

If you simply divide the proton's volume by the volume of these spheres and multiply by the Planck mass, you get the mass of the observable universe: 10^55 grams.

What this is stating, plainly, is that exactly enough vacuum fluctuations fit in the proton's volume to equal the mass of the universe.

If we run with this, it obviously makes the proton a black hole - it has more than enough mass within its radius to become one.

Once it's a black hole, we can borrow a theoretical but mathematically valid concept from string theory - the holographic principle - which simply states that the surface information of a black hole can encode the volume information.

When you do this - simply divide the surface Planck spheres by the volume Planck spheres and multiply by the Planck mass - you go from the mass of the universe (the mass of all protons) to the mass of a single proton, its rest mass, at ~10^-24 grams. We have derived the mass responsible for gravitation from discrete quanta - in units that aren't anthropocentrically defined (Planck units).

Proton charge radius: 0.8775 × 10^-15 m

Proton volume with given radius: 2.831 × 10^-45 m³

Planck-length-diameter sphere volume: 2.21 × 10^-105 m³

Divide them and multiply by the Planck mass:

((2.831 × 10^-45 m³) / (2.21 × 10^-105 m³)) × Planck mass

Yields: 1.281 × 10^60 × Planck mass = 2.788 × 10^55 grams.

And here is the proton rest mass via the same principles but applying the holographic principle (Planck masses that fit on the surface / Planck spheres in the volume):

2 × (1.02656 × 10^36 grams / 1.2804 × 10^60) = 1.603498 × 10^-24 grams

So it's one equation to go from the holographic mass to the rest mass of the proton.
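The whole two-step calculation fits in a few lines. A minimal sketch, assuming the CODATA 2010 charge radius of 0.8775 fm (which reproduces the 2.831 × 10^-45 m³ volume quoted above) and round values for the Planck constants; the surface term here counts equatorial Planck-sphere discs tiling the proton's surface area:

```python
import math

# Assumed constant values
l_planck = 1.616e-35       # Planck length, m
m_planck_g = 2.176e-5      # Planck mass, g
r_proton = 0.8775e-15      # proton rms charge radius, m (CODATA 2010)

# Volumes of the proton and of a Planck-length-diameter sphere (PSU)
V_proton = (4 / 3) * math.pi * r_proton**3        # ~2.83e-45 m^3
V_psu = (4 / 3) * math.pi * (l_planck / 2)**3     # ~2.21e-105 m^3

# Step 1: 'holographic mass' - Planck spheres filling the volume, times Planck mass
R = V_proton / V_psu                              # ~1.28e60 spheres
M_holographic = R * m_planck_g                    # ~2.79e55 g, the quoted universe mass

# Step 2: rest mass - surface disc count over volume count, times two
A_proton = 4 * math.pi * r_proton**2
eta = A_proton / (math.pi * (l_planck / 2)**2)    # ~4.7e40 surface discs
m_rest = 2 * eta * m_planck_g / R                 # ~1.6e-24 g
```

Note that `eta * m_planck_g` reproduces the 1.02656 × 10^36 grams figure in the surface term quoted above.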

So, simply put: each proton contains the information of all protons holographically. The surface Planck spheres are the terminations of wormholes that connect all protons' surfaces through a superfluid/superconducting aether, allowing instantaneous information transfer through the vacuum of space - creating a universal holographic network in which each piece contains the entirety. Quantum foam isn't a disorganized chaos of connecting and disconnecting wormholes - space is made of structured, organized, coherent wormhole geometries. Matter is the result of these coherent entanglement relationships.

This is how you resolve the immense vacuum energy to the tiny energy of matter. Gravity isn't 'leaking into other dimensions' or 'curled up in higher dimensional strings'. Energy is non-local and 'shared' across the entire Universe in a single quantum network - and buffered by limited surface holographic horizons of black hole objects.

This allows for a continually evolving and learning universe across scales.

For all this in a very digestible format, check out the 2015 lecture.

The short answer is probably, yes. But the connotations of 'simulation' are a little bit off, imo.

The reality described by a Universe that is essentially a holographic quantum system is more like a fractal self-configuring, self-evolving/complexifying and self-referencing system rather than some VR type deal that was programmed by a higher being. IMO of course.

What holofractal is saying is that the Universe is made up of bits of information - and that the information of the entire system is fractally encoded at every point through harmonic nesting/layering.

Through entanglement, systems can evolve into higher and higher orders of complexity. Essentially, think of the Universe, then add an entire layer or 'dimension' overtop that is allowing the entire Universe to talk to itself. The Universe came out of the box pre-wired with a network that can sustain virtually instantaneous information transfer. If you can begin to imagine the effects that this could have instead of a disconnected Universe, concepts such as biogenesis and ordering systems in general / negentropy start to make a whole lot more sense -- especially when you realize that time is not linear in one sense, and entangled future states would have an attractor effect on current systems - morphic resonance.

It has implications for consciousness as well as all sorts of phenomena considered supernatural that would in effect be just natural, like remote viewing.

There's an amazing paper that came out of Resonance Science Foundation called The Unified Spacememory Network. It may take a few reads, but IMO this is the most important paper in the modern era.

There's that Talbot book from 15-20 years ago, but it gets bogged down in the mystical - Hawaiian shamans hacking the holomatrix to trot over lava fields and that kind of thing.

But more recently there was the holographic information encoded on the spherical surfaces which is mainstream cosmology.

this must be the paper https://www.neuroquantology.com/index.php/journal/article/view/961

OK, well I'll have a go at this. Disclaimer: I've basically stopped reading about physics in the decade or so since I stopped studying, so really, someone with an A-level in the subject and a subscription to *New Scientist* is likely to have a more in-depth knowledge of current developments in cosmology and theoretical physics than I do.

That said, the first thing that strikes me about this idea is that it isn't new at all. It actually goes back nearly a hundred years to ideas that were being kicked around by Arthur Eddington and Paul Dirac. Eddington was an early champion of Einstein and is the first person after Einstein himself who can really be called an expert in general relativity. He also led the astronomical observation of the deflection of starlight by the sun's gravitational field during the 1919 solar eclipse, which was the first empirical test of the new theory. Point being, he was very much a proper scientist and not in any way a crank. But he became obsessed with certain 'coincidences' involving pure numbers in physics (which, as the article rightly points out, are hugely interesting to people working in fundamental physics and cosmology, because they're independent of any system of measurement and in a sense represent the 'settings' of our universe).

Now Eddington conceived of the number 136 as being of enormous importance, because the fine structure constant, which determines the strength of the electromagnetic interaction in what was then the brand new theory of quantum electrodynamics (QED), developed in large part by Dirac, had been measured to be about 1/136. Eddington decided it was *exactly* 1/136 and used it as the basis for some very arcane arithmetic that was supposed to connect the total mass of the universe to the total number of protons (which in those days were thought to be elementary particles). When experimental techniques improved and showed that the constant was more like 1/137, Eddington produced a post-hoc explanation for why 136 should really have a 1 added to it to make the 'true' ultimate cosmic number, 137. Naturally this led to widespread derision and his scientific reputation never really recovered. (The accepted value today is 1/(137-and-a-little-bit).) He even claimed to have predicted the *precise number of protons in the entire universe*. What had happened is that he'd left the path of real science and had ended up doing a sort of physics-flavoured numerology - pseudoscience, in other words. Possibly you could call it a conceptual version of pareidolia, or seeing patterns because your brain expects or wants to see them, whether or not they're actually there.

Dirac spent some time working with similar ideas, trying to connect the observed age and scale of the universe to physical constants such as Planck's constant, Newton's constant (i.e. the universal gravitational constant) and the speed of light, as well as the masses of electrons and protons. One number that comes up in this context is 10^120, or the approximate ratio of the density of the cosmological constant predicted by theories of quantum gravity and the observed value consistent with a more-or-less flat universe. All this stuff is regarded as fairly fringe-y these days, I think, and Dirac, unlike Eddington, didn't get too hung up on it and carried on doing productive physics until his death in the (and his) 80s.



Back to the paper. One of the things that makes it look rather anachronistic is the fixation on protons as some sort of fundamental unit of matter, when we've known for decades they are no such thing. They just happen to be the lightest baryon and therefore the only baryon that's (as far as we know) stable in isolation, so it makes up nearly all the baryonic matter in the universe. But it's actually made up of three quarks bound together by gluons (force carriers of the strong force). These (apparently) elementary particles, not protons, should have properties explained by ratios of pure numbers, if there's any mileage in this idea. And the mass of a proton isn't even simply the sum of the masses of its constituent quarks: the quarks are very light and make up only about 1% of the proton's mass. The rest comes from the self-energy of the 'gluonic field' binding them together. The strength of this field, and therefore the mass and other properties of the proton such as its effective diameter, are determined by the strong coupling constant (analogous to the fine structure constant in QED). Now if there really is a final 'theory of everything' that ties together all forms of physical interaction, including gravity, then the strong coupling constant will be related in some numerical way to the speed of light, Planck's constant and so on. But this theory says nothing at all about this aspect, so I think "What's so special about protons?" remains a valid objection.

A further objection is that calculations supposedly including the total mass of the universe need to factor in that baryonic matter (i.e. protons) makes up only a fraction of the amount of mass required to make astronomical models make sense. The rest is so-called dark matter, which cannot be made of protons because (being 'dark') it doesn't interact with light. (There are serious researchers today who think dark matter is nonsense and observed effects in the behaviour of galaxies can be explained instead by modifying gravity at very long range, but so far the consensus is very much on the side of dark matter.) And it gets even worse if you consider that matter is just one aspect of mass-energy, and that to account for the observed acceleration of the universe's expansion, we need an even larger component of 'dark energy'. Anyway, the point is that protons aren't fundamental in terms either of their own structure or of making up most of the 'stuff' that appears to exist in the universe. (Edit: current observations suggest protons make up less than 5% of the universe's mass-energy content.)

Then we come to the 'Indra's Net' bit (I know I'm not going in order here). This is a neat analogy, or metaphor, or something, but it's also nothing new. It's in Fritjof Capra's *The Tao of Physics*, which came out in 1975 and starts with some fairly engaging ideas about how some ideas in modern physics are vaguely analogous to metaphysical concepts and allegorical images in Hindu, Buddhist and Taoist thought, before jumping into the deep end and trying to use quantum mechanics to explain telepathy and whatnot. The line at the end of the piece about explaining "remote viewing" unfortunately places it in the same category of 'quantum woo' pseudoscience.



All of which is a bit unfortunate, because there are some snippets of real and valuable physics here. The intersection of quantum mechanics, cosmology and information theory has occupied some of the most brilliant minds in physics, most notably Stephen Hawking for much of his career. The author rather overextends himself with stuff I don't think he understands properly, such as quantum entanglement seemingly enabling 'instantaneous communication'. (This is forbidden by a very abstruse but apparently cast-iron proof called the no-communication theorem. This theorem, strictly speaking, applies only in 'orthodox' quantum mechanics. Various alternatives to QM have been put forward, most significantly the so-called 'hidden variables' theory, but none of these theories has had the predictive power of QM and many variants have been explicitly ruled out by experiment.)

Finally, even without these objections, I'd still be asking: OK, nice idea, but so what? Does it give an ontological or epistemological picture of the physical cosmos that's neater, more natural or involves fewer ad-hoc assumptions or fundamental constants that have to be measured (as opposed to calculated) than the theories we have already? Even more importantly, does it allow any predictions that are different from those of accepted theories and that can be measured using current technology, or any technology that might conceivably exist in the foreseeable future? The answer would seem to be 'no'. Appeals to teleology (which violate the laws of thermodynamics) and flat-out woo like ESP (which has never once been adequately demonstrated under laboratory conditions) are the nail in the coffin.

Further reading:

https://en.wikipedia.org/wiki/Arthur_Eddington#Fundamental_theory_and_the_Eddington_number

https://en.wikipedia.org/wiki/Dirac_large_numbers_hypothesis

https://en.wikipedia.org/wiki/No-communication_theorem

https://en.wikipedia.org/wiki/Bell's_theorem <-- theoretical background to the tests that have disproven hidden variables in favour of quantum mechanics



Lazy. You could at least have used that gif from The Simpsons!

Amazing. Best title of any thread in dissensus history. Worst content. Like eating cold porridge.

You don't leave much room for debate, and I don't know enough physics to counter what you say. Nor have I had the opportunity to read the Neuroquantology paper yet. C'mon though, surely that stimulated *some* thought!

Dark matter is probably bollocks though, I think absence of evidence is evidence of absence in this case. The ballooning of dark matter to account for the shortcomings of gravitational cosmology reminds me of pre-Keplerian epicycle proliferation.


Not an unreasonable comparison, I admit. But fudging a theory as elegant and successful as general relativity to make it fit empirical data is very unattractive too, I think.

Just read through the Unified Spacememory Network paper, and while I can safely say that the overwhelming bulk of it is beyond me, what little I've read on this Haramein fellow I find interesting.

Any of you have any thoughts on this guy? Seems to be a trend of dismissive accusations of pseudoscience (...) and cultishness.

Also, I remember enjoying this presentation, which seemed to connect intelligence (of any kind?), or perhaps the subjectivity of intelligence, to Mandelbrot's infinite recursion formula (?). Anyway, as a layman, I found it immensely interesting.