Compression artifacts will exist as long as we use lossy video compression. You can, however, eliminate artifacts visible to the eye with a high enough bitrate, but that has always been the case.
Considering the ungodly size of raw video or even video with lossless compression, we will need lossy compression for the next century.
It will only change if bandwidth and storage become practically free, which would require some unforeseen breakthrough in technology.
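To put "ungodly" in perspective, here's a rough back-of-the-envelope sketch. The numbers (4K resolution, 24-bit color, 60 fps, a 2-hour runtime) are just illustrative assumptions, not anyone's actual spec:

    # Rough estimate of uncompressed (raw) video storage needs.
    # Illustrative assumptions: 3840x2160, 24 bits per pixel, 60 fps, 2 hours.
    width, height = 3840, 2160
    bytes_per_pixel = 3          # 24-bit RGB, no chroma subsampling
    fps = 60
    seconds = 2 * 60 * 60        # 2-hour movie

    bytes_per_frame = width * height * bytes_per_pixel
    raw_bytes = bytes_per_frame * fps * seconds

    print(f"Raw bitrate: {bytes_per_frame * fps * 8 / 1e9:.1f} Gbit/s")
    print(f"2-hour movie: {raw_bytes / 1e12:.1f} TB uncompressed")

That works out to roughly 12 Gbit/s and about 10 TB for one movie, versus the tens of Mbit/s a typical 4K stream actually gets, so lossy compression isn't going anywhere.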
It's so rare to see someone be the voice of reason on future technology matters. People think we'll be able to do anything, when there are physical limits to just how far we can advance current technology.
Even if we invented something new, you still have to deal with the size restrictions of atoms. Silicon has an atomic radius of 1.46 Å, gold 1.35 Å, and the most advanced manufacturing process in development is a 2 nm, or 20 Å, process, although that number doesn't mean much, since the actual measurements (metal pitch) are closer to 20 nm. There are experiments dating back about a decade where someone created a transistor out of a single phosphorus atom. We're a lot closer to the end than we might realize.
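For a sense of scale, here's a quick sketch counting how many silicon unit cells fit across a pitch of that size. It uses silicon's lattice constant of about 0.543 nm; the 20 nm pitch is just the ballpark figure from above:

    # How many silicon unit cells fit across a ~20 nm metal pitch?
    si_lattice_nm = 0.543        # silicon lattice constant, about 5.43 angstroms
    metal_pitch_nm = 20.0        # ballpark pitch figure from the discussion above

    cells_across = metal_pitch_nm / si_lattice_nm
    print(f"~{cells_across:.0f} silicon unit cells across a {metal_pitch_nm:.0f} nm pitch")

That's only a few dozen unit cells of width to play with, which is why the single-atom transistor feels less like sci-fi and more like the floor we're approaching.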
As you mentioned, chip lithography is hitting a wall. They're not going to be able to make things much smaller with current technology. People have been working on new ways to build these mechanisms, researching ways to do things at the quantum level, which is some seriously sci-fi tech.
Yes, compression will always be required. The raw video they work with in production takes up enormous amounts of drive space. As bandwidth and storage become less restricted, it just means we'll be able to use less aggressive compression and higher bitrates. You can already use fairly liberal bitrates with local storage, though. I have zero issues with local videos, but I can run into compression artifacts when streaming from sites with less-than-optimal bitrates.
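As a rough illustration of how hard those streams are being squeezed, here's a sketch comparing a few hypothetical delivery bitrates against raw 4K60. The bitrate values are just typical ballpark assumptions, not any site's published numbers:

    # Compression ratio of a few example delivery bitrates vs. raw 4K60 video.
    raw_mbps = 3840 * 2160 * 3 * 8 * 60 / 1e6   # ~12,000 Mbit/s uncompressed

    # Illustrative bitrates in Mbit/s (assumed, not measured)
    examples = {
        "generous local file": 80,
        "decent 4K stream": 25,
        "stingy 4K stream": 8,
    }

    for name, mbps in examples.items():
        print(f"{name}: {mbps} Mbit/s, about {raw_mbps / mbps:.0f}:1 compression")

No surprise that it's the bottom end of that table, squeezed well past a thousand to one, where the artifacts start to show.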