If you put a TV in a Faraday cage that blocked the relevant radio spectrum, would there be no static on it? I expected the answer to be a quick Google, but it wasn't.
That is a good question, but I suspect if you tried this in real life it would still show static.
The waves are amplified by a circuit that attempts to find a signal even if it's very weak (so you can get a picture whether you're close to or very far from the TV station).
At a certain point, the electromagnetic field from the running TV itself would start to get picked up.
I suspect a better thought experiment would be if you just disconnected the input and amplification circuit entirely from the CRT tube, in which case you would probably just get white as the electron beam scans back and forth without any modulation.
You'd still see static from the TV itself and any radiation that passed in to the cage. It's not a perfect EM blocking device like TV shows and movies would have you believe.
This is a very non-scientific answer, but when I was a kid (a good 40 years ago) I remember having a science book that called TV static "an echo of the big bang". I guess that would mean just randomly scattered energy bouncing around on all bands?...
I could probably Google it and give you an answer, but I'll just wait for someone with a more convincing and authoritative reply.
Not all of it. But part of it really is due to the cosmic microwave background radiation: light from the moment the universe became transparent enough for light to travel freely, about 380,000 years after the big bang. It's the earliest image of the universe we have, and it's more or less everywhere.
Now that you mention it, I remember something similar! I may have to follow up on that to see (but I'm also curious about others' responses, hence asking).
That's false. Most of them still agree that the Big Bang happened; it's just that the first tiny fraction of a second of the Big Bang can't be explained with our current understanding of physics, and there are still a lot of unanswered questions about it.
The TV will try to amplify and display any signal. Without a station, it ends up amplifying random radio noise and tiny fluctuations in the amplifier circuits themselves.
The momentary signal strength is interpreted as brightness of a spot which is rapidly scanned over the display. In this case the signal is random so every spot on the screen will be a random brightness, changing every frame.
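To make that concrete, here's a rough sketch of that rendering model in Python (numpy assumed; the raster size and frame count are made-up illustration values, not anything from a real set):

```python
import numpy as np

HEIGHT, WIDTH = 480, 640  # made-up raster size, just for illustration

def static_frame(rng: np.random.Generator) -> np.ndarray:
    """One frame of 'snow': each spot gets an independent random brightness."""
    return rng.integers(0, 256, size=(HEIGHT, WIDTH), dtype=np.uint8)

rng = np.random.default_rng()
frames = [static_frame(rng) for _ in range(30)]  # roughly one second of static
```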
Modern digital TVs won't do this, because with compressed video the set needs recognizable data before it can even attempt to display a picture.
As for the sources of the radio noise, most of it comes from electrons being jostled by heat, and some from space (including the cosmic microwave background others have mentioned).
That electron jostling (thermal noise) is the reason the receivers on radio telescopes are cooled to insanely low temperatures, often with liquid helium.
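For a sense of scale, here's a back-of-the-envelope Johnson-Nyquist calculation in Python (the 50-ohm front end and 6 MHz bandwidth are assumptions, roughly one analog TV channel):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def thermal_noise_vrms(temp_k: float, resistance_ohm: float, bandwidth_hz: float) -> float:
    """Johnson-Nyquist RMS noise voltage: sqrt(4 * k_B * T * R * B)."""
    return math.sqrt(4 * K_B * temp_k * resistance_ohm * bandwidth_hz)

# Assumed values: a 50-ohm front end, 6 MHz bandwidth (about one analog TV channel)
for temp in (290.0, 4.2):  # room temperature vs. liquid-helium cooling
    v = thermal_noise_vrms(temp, 50.0, 6e6)
    print(f"{temp:>6.1f} K -> {v * 1e6:.2f} uV RMS")
```

Cooling from room temperature down to liquid-helium temperature cuts the noise voltage by a factor of sqrt(290/4.2), roughly 8x.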
This is the closest to the correct explanation. The reason televisions based on AM radio reception showed static is a circuit called the AGC (Automatic Gain Control), which worked like a robotic volume control. Its job was to keep the recovered video signal within a certain amplification range. As long as there was a carrier (the TV station was "on the air"), you'd see whatever the station broadcast. But when they turned off their transmitter, the signal strength would fall and the AGC would increase the amplification until what you saw was white noise, mostly due to the random motion of electrons in the electronic components. We can minimize that by cooling, but it can't be totally eliminated. Audio amplifiers often come with a "hiss" specification that tells you how much of this kind of noise to expect at normal operating temperature.
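Here's a toy model of that feedback loop in Python; every constant in it (target level, adaptation rate, noise floor, carrier strength) is made up for illustration, not taken from any real TV circuit:

```python
import random
import statistics

def agc_demo(carrier_on: bool, steps: int = 20000) -> tuple[float, list[float]]:
    """Toy AGC loop: nudge the gain so the average output magnitude
    sits near a target level. With a carrier, the output tracks the
    strong signal; without one, the gain winds up until amplified
    noise fills the target range."""
    target = 1.0   # desired output magnitude
    gain = 1.0
    rate = 0.5     # how aggressively the gain adapts
    out = []
    for _ in range(steps):
        noise = random.gauss(0.0, 0.001)           # tiny thermal noise floor
        signal = (0.5 if carrier_on else 0.0) + noise
        y = gain * signal
        out.append(y)
        gain += rate * (target - abs(y))           # simple feedback
    return gain, out

for label, carrier in (("carrier on", True), ("carrier off", False)):
    gain, out = agc_demo(carrier)
    level = statistics.fmean(abs(v) for v in out[-1000:])
    print(f"{label}: final gain ~{gain:.0f}x, output level ~{level:.2f}")
```

The point is that the output level ends up about the same either way; what changes is the gain, which in this toy model winds up around a thousandfold when there's nothing but the noise floor to lock onto.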
BTW, modern digital TVs -will- show a noise picture if they lack a video muting function when no carrier is detected. I have an LG bought in 2019 that does this, and it's hella annoying when I accidentally hit the input selection button on the remote, switching from HDMI to TV reception.
Waves are everywhere. The TV picks up whatever waves it can. Some of those waves are signals meant to transmit an image (e.g. from a broadcast tower); others are just random noise in our environment.
It's been a while since I've messed about with this, so I don't remember (and you may not either, so this is an open question), but wouldn't it produce the effect even if disconnected from an antenna?
If so... would the same principle be in play, with it picking up general EM waves to cause the effect?
The effect can change slightly if you unplug or touch the antenna or the TV's socket for it, because that may change what contributes to the signal noise and how much. It can, for example, become brighter, and the pitch of the audio noise can change.
Afaik the antenna is picking up the background waves/radiation and the TV is displaying them. If you disconnect the antenna, the TV will have no signal to display; it'll be as blank as it can get.
What's displayed on the screen is the strongest signal the set can find. Without a strong signal from a TV tower, you just get noise from 60 Hz AC running through the walls, or radio towers, or power lines, or whatever else makes that radio noise.
In another bit of poorly-aged prediction by Gibson, Case, the main character, brings some RAM with him to sell for a quick buck on the street. How much RAM? Three entire megabytes.
Ha, I remember that. I also recall that in the 80s there was a pop song popular in Poland entitled "Glass Weather". It was about those rainy autumn evenings when there's nothing better to do than sit in front of your (black and white) TV. The lyrics mentioned an "apartment window blue from the TV glow".
I remember putting my finger near the CRT display of these televisions during the animated static and noticing the weird electrical tingle in my finger. I even did this with my hair. It was so fun... and also potentially dangerous.
Now that's something I can't replicate anymore with my modern telly.
It's just static electricity. Not dangerous at all, and not exclusive to the scramble screen.
Copied from an old reddit post:
Old cathode ray tube (CRT) televisions have an electron gun which fires electrons at the back of the screen, which is coated with phosphors that emit light whenever struck by an electron. A side-effect of this process is that each electron increases the static charge on the screen, so the charge builds up over time as the image on the TV changes. Meanwhile, rubbing your hand, which has a slight negative charge, across the screen will remove some of this built-up static.
There are many sources of electromagnetic noise which cause the characteristic display patterns of static. Atmospheric sources of noise are the most ubiquitous, and include electromagnetic signals prompted by cosmic microwave background radiation,[1] or more localized radio wave noise from nearby electronic devices.[2]
The display device itself is also a source of noise, due in part to thermal noise produced by the inner electronics. Most of this noise comes from the first transistor the antenna is attached to.[2]
And what happens if you broadcast static? Like, point it at a car that's blasting some shitty radio station while you transmit on that same frequency? The distortion will destroy their speakers.