I remember when chips first hit 1GHz around 1999. Tech magazines were claiming that we'd hit 7GHz in 5 years.
What they failed to predict is that you start running into major heat issues once you try to push much past ~3GHz, which is why CPU manufacturers started focusing on other ways to improve performance, such as multiple cores and better memory management.
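(Rough intuition for the wall, as a ballpark relation rather than an exact figure: dynamic power scales roughly as P ≈ C·V²·f, and reaching higher clocks generally requires higher voltage too, so the heat you have to dissipate grows much faster than the frequency does.)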
My dad had one of the first consumer 3GHz chips available. By the time I inherited it in 2009, it was completely outclassed by a <2GHz dual-core laptop.
That would've been a single 3GHz CPU core. Now we have dozens in one chip. The instruction sets and microcode have gotten way better since then as well.
Clock speed isn't improving that quickly anymore. Other aspects, such as power efficiency, memory speeds, cache sizes, instructions that take fewer cycles, and core counts, have been improving faster instead.
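A minimal sketch of why core count matters even when the clock is flat: the same CPU-bound work spread across all available cores finishes roughly N times faster. The workload and chunk sizes here are arbitrary, just for illustration, not from any real benchmark.

    import os
    import time
    from multiprocessing import Pool

    def burn(n):
        # CPU-bound busy work: sum of squares up to n
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        jobs = [5_000_000] * 8  # eight identical chunks of work

        start = time.perf_counter()
        serial = [burn(n) for n in jobs]  # one core, one chunk at a time
        t_serial = time.perf_counter() - start

        start = time.perf_counter()
        with Pool(os.cpu_count()) as pool:  # spread chunks over all cores
            parallel = pool.map(burn, jobs)
        t_parallel = time.perf_counter() - start

        assert serial == parallel
        print(f"{os.cpu_count()} cores: serial {t_serial:.2f}s "
              f"vs parallel {t_parallel:.2f}s")

On a typical modern machine the parallel run wins by a wide margin even though no individual core is any faster, which is the whole point of the shift away from chasing clock speed.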
We're running into hard physical limits now: the transistors in each chip are already so small that if they got much smaller, they'd start running into quantum effects that would render them unreliable.