I've read that all math breaks down as you approach the big bang. I'm not educated enough in math to understand how, or why, but apparently they cannot mathematically understand the origin of the universe.
Another problem lies within the mathematical framework of the Standard Model itself—the Standard Model is inconsistent with that of general relativity, to the point that one or both theories break down under certain conditions (for example within known spacetime singularities like the Big Bang and the centres of black holes beyond the event horizon).[4]
My ELI5: Both theories work great, supported by vast amounts of evidence and excellent theoretical models. It seems they are two tools with distinct purposes. One for big and heavy stuff, the other for small and energetic stuff. The problem arises when big and heavy stuff is compressed into tiny spaces. This case is relevant for both theories, but here they don't match, and we don't know which to apply. It's a strong hint we lack understanding, one of the biggest unsolved problems in physics.
So math itself is probably fine, we're just at a loss how to use it in these extreme cases.
But if you condense it all into something infinitely dense, then is it suddenly finite in size? Does it still have infinite size and simultaneously infinite density?
Why didn't the immense density cause it to form a black hole?
I don't think the current understanding is that the universe is infinite. We can estimate the size of the universe we know about, because we know how fast it has been expanding and for how long. Wiki says: "Some disputed estimates for the total size of the universe, if finite, reach as high as 10^(10^(10^122)) megaparsecs." We don't know whether that's all there is, though. We don't even know whether the universe has the same properties everywhere, which complicates things.
It's impossible to measure precisely enough to know for sure that it is completely flat, or even saddle-shaped (either of which would make it infinite in size). The generally accepted understanding among cosmologists is that it is infinite. But just due to the nature of measurement and tools, we can't completely rule out a finite universe. However, we do know based on the measurements that it is really really... really really really big if it's not infinite.
My understanding is that it has a 14 billion light-year radius from any given point. We can only see 14 billion light-years away, since the universe is only 14 billion years old (actually 13.8). Light can only travel at a given speed, so we can't see beyond the distance light has traveled during the existence of the universe. But since the universe expanded in all directions, from everywhere all at once, it's truly infinite. If you were to teleport 14 billion light-years in any direction, you would still see 14 billion light-years in every direction, since the universe expanded from that point too during the big bang. It's mindfuck level stuff.
That understanding is intuitive but very wrong. We can see parts of the universe that are up to 46 billion light years away because of the expansion of space. The actual physical universe extends beyond that, further than we can observe.
The light didn't travel 46 billion light-years, but the objects whose light we are seeing are 46 billion light-years away by the time we collect that light, due to expansion. So the agreed-upon "radius of the observable universe" is 46-something Gly.
https://youtu.be/XBr4GkRnY04 this old video from Veritasium explains the concepts of the Hubble sphere and the particle horizon, both of which are further than 13.8 billion light-years away
https://youtu.be/eVoh27gJgME this newer video from PBS Spacetime goes into much further detail about how they're calculated
They use the Lambda-CDM (ΛCDM) model, which outputs the rate of expansion of the universe at every moment in the past, present, and future. You measure the amount of light + matter + dark matter + dark energy that your universe has, plug those values into the Friedmann equation, and it spits out the rate.
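To make that concrete, here's a minimal sketch of what "plug the values into the Friedmann equation" looks like, assuming a flat universe and using the illustrative density values quoted further down the thread (a real analysis uses properly measured densities):

```python
import numpy as np

# Minimal sketch of the first Friedmann equation for a flat universe:
#   H(a) = H0 * sqrt(Om_r/a^4 + Om_m/a^3 + Om_L)
# Parameter values are illustrative (they match the ones quoted later
# in this thread), not the result of any particular survey.
H0   = 69.6                # Hubble constant today, km/s/Mpc
Om_m = 0.286               # matter + dark matter
Om_r = 8.6e-5              # radiation (photons + neutrinos), rough value
Om_L = 1.0 - Om_m - Om_r   # dark energy, fixed by assuming flatness

def hubble_rate(a):
    """Expansion rate H(a) in km/s/Mpc at scale factor a (a = 1 today)."""
    return H0 * np.sqrt(Om_r / a**4 + Om_m / a**3 + Om_L)

for a in (0.001, 0.01, 0.1, 0.5, 1.0):
    print(f"a = {a:>5}:  H = {hubble_rate(a):,.1f} km/s/Mpc")
```

The distances, lookback times, and the ~46 Gly horizon discussed below all come from integrating this H(a) in one way or another.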
You can try out an online calculator yourself! It already has those values filled in; all you need to do is enter the z value - the "redshift" - and click generate. So for example when you hear in the news something like "astronomers took a photo of a galaxy at redshift 3", you put in 3 for "z", and you see that the galaxy is 21.1 Gly (billions of light-years) away! That's the "comoving distance", a convenient way to define distance on cosmic scales that factors out the ongoing expansion. It's the same definition of distance that gives you that "46 Gly" value for the size of the observable universe. But the light from that galaxy only took 11.5 Gyr to reach us. The universe was 2.2 Gyr old when the light started. So the light itself only traveled 11.5 Gly of distance, but that distance is 21.1 Gly long right now because it kept expanding behind the photon.
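If you'd rather not treat the calculator as a black box, here's roughly what it's doing under the hood - a sketch assuming flat ΛCDM with the same parameter values, radiation ignored (which is fine at z = 3):

```python
import numpy as np
from scipy.integrate import quad

# Sketch of the two quantities reported for a redshift z, flat LCDM:
#   comoving distance  D_C = c * integral_0^z dz' / H(z')
#   lookback time      t_L =     integral_0^z dz' / ((1+z') * H(z'))
c  = 299792.458                  # speed of light, km/s
H0 = 69.6                        # km/s/Mpc
Om_m, Om_L = 0.286, 0.714

def E(z):
    return np.sqrt(Om_m * (1 + z)**3 + Om_L)   # H(z) = H0 * E(z)

def comoving_distance_gly(z):
    integral, _ = quad(lambda zp: 1.0 / E(zp), 0, z)
    return (c / H0) * integral * 3.2616e-3     # Mpc -> Gly

def lookback_time_gyr(z):
    integral, _ = quad(lambda zp: 1.0 / ((1 + zp) * E(zp)), 0, z)
    return (977.8 / H0) * integral             # 1/H0 in Gyr is 977.8/H0

z = 3
print(comoving_distance_gly(z))   # ~21 Gly, the "comoving distance"
print(lookback_time_gyr(z))       # ~11.5 Gyr of light travel time
```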
Crucially, we are able to determine the distance by redshift via observations of objects with known distance (like standard candles) and their redshifts. The ΛCDM model only becomes necessary for extrapolating to redshifts for which we don't otherwise know the distance, but that extrapolation can't be made without redshift measurements of objects whose distances are known.
That's true! There is a kind of incestuous relationship between the cosmic distance measurements and the cosmic model. Astronomers are able to measure parallax only out to about 1000 parsecs, and standard candles like type Ia supernovae to a hundred megaparsecs. But the universe is much bigger than that. So as I understand it they end up climbing a kind of cosmic distance ladder, where they plug the measured distances up to 100 Mpc into the ΛCDM model to calculate the best-fit values for the amounts of matter/dark matter and dark energy. Then they plug those values along with the redshift into the model to calculate the distances to ever more distant things like quasars and the Cosmic Microwave Background, and even the age of the universe itself. Then they use observations of those distant objects to plug right back into the model and refine it. So those values - 28.6% matter, 71.4% dark energy, 69.6 km/s/Mpc Hubble constant, 13.7 billion years age of the universe - are not the result of any single observation, but the combination of all observations taken to date. These values have been fluctuating slightly in my lifetime as ever more detailed and innovative observations have been flowing in.
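Just to illustrate the shape of that fitting step (not the real pipeline, and with entirely synthetic data standing in for the standard-candle measurements), it's conceptually something like this:

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import curve_fit

# Toy version of one rung of the ladder: fit flat-LCDM parameters
# (H0, Om_m) to a set of redshift/distance pairs. The "observations"
# here are synthetic, generated from a fiducial model plus noise,
# purely to show the procedure - not real data.
c = 299792.458  # km/s

def comoving_distance_mpc(z, H0, Om_m):
    integrand = lambda zp: 1.0 / np.sqrt(Om_m * (1 + zp)**3 + (1 - Om_m))
    integral, _ = quad(integrand, 0, z)
    return (c / H0) * integral

def model(zs, H0, Om_m):
    return np.array([comoving_distance_mpc(z, H0, Om_m) for z in zs])

rng = np.random.default_rng(0)
zs = np.linspace(0.01, 0.5, 30)
truth = model(zs, 69.6, 0.286)                                 # fiducial universe
observed = truth * (1 + 0.03 * rng.standard_normal(len(zs)))   # 3% scatter

(best_H0, best_Om), _ = curve_fit(model, zs, observed, p0=[70.0, 0.3],
                                  bounds=([50.0, 0.01], [90.0, 1.0]))
print(best_H0, best_Om)
```

At these low redshifts H0 comes out well but the matter density is only loosely constrained, which is exactly why the real fits lean so heavily on high-redshift data like the CMB.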
Are you an astronomer? Maybe you can help me, I've been thinking - how do you even measure the redshift of the CMB? Say we know that the CMB is at redshift z ≈ 1100 and the surface of last scattering is 45.5 Gly of comoving distance away. There is no actual way to measure that distance directly, right? Plugging the redshift into the model calculator is the only way? And how do we know it's 1100? Is there some radio-astronomy spectroscopy way to detect elemental spectral lines in the CMB, or is that too difficult?
If we match the CMB to the blackbody radiation spectrum, we can say that its temperature is 2.726 K. Then if we assume the temperature of the gas at the moment of recombination was 3000 K, we get the z ≈ 1100 figure. Is that the only way to do it? By using external knowledge of plasma physics to guess at the 3000 K value?
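(The arithmetic itself is just 1 + z = T_recombination / T_CMB = 3000 K / 2.726 K ≈ 1100.)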
So if you take 2 things that started, say, ~3 billion light-years apart (which is roughly 1000 megaparsecs), that means every single second the universe has existed, those 2 points have gotten 70,000 km further apart. And now that they're further apart, they separate even faster the next second.
For reference:
31.5 million seconds in a year. ( 3.15 x 10^7 )
universe is 13.8 billion years old ( 1.38 x 10^10 )
So we're talking about this 70,000 km getting added between the 2 points ~4 x 10^17 times.
Then you gotta bring calculus into it to factor in the changing distance over time.
It ... adds up. Which is why you'll see estimates for the observable universe's radius being ~46.5 billion light-years (93 billion light-year diameter), even though the universe has only existed for ~14 billion years.
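Here's a rough sketch of the calculus version - flat ΛCDM with the parameter values quoted above, radiation left out, so it only lands in the right neighbourhood of the published figure rather than exactly on it:

```python
import numpy as np
from scipy.integrate import quad

# Where the ~46 Gly radius comes from: the comoving particle horizon,
#   D = c * integral_0^inf dz / H(z),
# i.e. the "keep adding up the expansion" sum done properly. Flat LCDM,
# radiation ignored (it only matters at the very start), so this lands
# around 46-47 Gly rather than on any exact published number.
c  = 299792.458                 # km/s
H0 = 69.6                       # km/s/Mpc
Om_m, Om_L = 0.286, 0.714

def inv_H(z):
    return 1.0 / (H0 * np.sqrt(Om_m * (1 + z)**3 + Om_L))

integral, _ = quad(inv_H, 0, np.inf)
radius_gly = c * integral * 3.2616e-3      # Mpc -> Gly
print(radius_gly)                          # ~47 Gly with these inputs
```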
"And now that they're further apart, they separate even faster the next second."
That's a common misconception! Barring effects of matter and dark energy, the two points do NOT separate faster as they get farther apart; the speed stays the same! The Hubble constant H0 is defined for the present. If you are talking about one second in the future, you have to use the time-dependent Hubble parameter H(t), which in this empty-universe case just scales as 1/t. So instead of 70 km/s/Mpc, in your one-second-in-the-future example the Hubble parameter will be 70 * age of the universe / (age of the universe + 1 second) = 69.999...9, and your two test particles will still be moving apart at exactly 70,000 km/s.
Including dark energy does mean that the recession velocity of any given distant galaxy increases with time, so the expansion is accelerating in that sense, but that's not what you meant. (The Hubble parameter itself actually keeps decreasing even with dark energy; it just levels off toward a constant instead of dropping to zero.) Moreover, the expansion hasn't always been accelerating! It was actually decelerating for most of the age of the universe! The trend only reversed around 5-6 billion years ago, when the effects of matter became less dominant than the effects of dark energy. This is why cosmologists were worried about the idea of a Big Crunch for a while - if there had been a bit more matter, the expansion could have slowed down to zero and reversed entirely!
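If it helps to see it numerically, here's a toy sketch (arbitrary units, two idealised cases rather than the real ΛCDM history) of why "farther apart" doesn't automatically mean "separating faster":

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy comparison: follow the proper separation D(t) of two comoving
# points, where dD/dt = H(t) * D(t).
#  - "coasting": H(t) = 1/t (no matter, no dark energy) -> constant speed
#  - "de Sitter": H constant (pure dark energy)         -> growing speed
# Units are arbitrary; in the thread's example the constant speed would
# be the 70,000 km/s figure.
t0 = 13.8              # "now", in Gyr
D0 = 1.0               # separation now, arbitrary units
H0 = 1.0 / t0          # so both cases start from the same H today

cases = {
    "coasting":  lambda t: 1.0 / t,
    "de Sitter": lambda t: H0,
}

ts = np.linspace(t0, 3 * t0, 5)
for name, H_of_t in cases.items():
    rhs = lambda t, D: H_of_t(t) * D
    sol = solve_ivp(rhs, (t0, 3 * t0), [D0], t_eval=ts, rtol=1e-8)
    speeds = [H_of_t(t) * D for t, D in zip(sol.t, sol.y[0])]
    print(name, np.round(speeds, 4))
# coasting:  speed stays at H0 * D0 forever
# de Sitter: speed keeps climbing as the points separate
```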
Oh wow, thanks. You learn something new every day! I'm definitely an "armchair physicist", and still find it hard to think about geometry non-statically.
Sounds like the Hubble Constant ain't so constant :)