"For GPU-accelerated AV1 playback, you will need an Nvidia GeForce RTX 30 series card, an AMD Radeon RX 6000 series, or an 11th-gen Intel CPU with Iris Xe graphics."
Yeah. This seems a bit quick given how limited hardware decoder adoption still is. I would have thought they'd wait a while longer before actually changing the default.
The benefits are real, though. AV1 is extremely efficient and it looks markedly better than alternatives even at lower bitrates.
You obviously need an AV1 hardware decoder to do hardware decoding of AV1. The GPUs you listed are the only ones with a built-in AV1 hardware decoder on the PC side, though that list is also out of date by now (RTX 40 series, RX 7000 series, and Intel Arc all decode AV1 too). There are also multiple smartphone SoCs from MediaTek and Qualcomm that have an integrated AV1 hardware decoder.
It's also not in any way an indicator of the hardware needed for software decoding of AV1; an efficient software decoder like dav1d runs fine on ordinary CPUs.
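On the web, the hardware-vs-software distinction can actually be probed: the MediaCapabilities API reports whether playback would be `powerEfficient`, which is a reasonable hint that a hardware decoder is present. A minimal sketch, assuming the standard `av01.P.LLT.DD` codec-string convention from the AV1 ISOBMFF binding; the resolution and bitrate numbers are purely illustrative:

```javascript
// Build an AV1 codec string of the form "av01.<profile>.<level><tier>.<bitdepth>",
// e.g. "av01.0.08M.08" = Main profile, level 4.0 (seq_level_idx 8), Main tier, 8-bit.
function buildAv1CodecString(profile, seqLevelIdx, tier, bitDepth) {
  const level = String(seqLevelIdx).padStart(2, "0");
  const depth = String(bitDepth).padStart(2, "0");
  return `av01.${profile}.${level}${tier}.${depth}`;
}

// In a browser, this asks whether AV1 playback is supported and power-efficient
// (power efficiency usually implies a hardware decoder). Parameters are illustrative.
async function canHardwareDecodeAv1() {
  const info = await navigator.mediaCapabilities.decodingInfo({
    type: "media-source",
    video: {
      contentType: `video/mp4; codecs="${buildAv1CodecString(0, 8, "M", 8)}"`,
      width: 1920,
      height: 1080,
      bitrate: 4000000, // 4 Mbit/s, illustrative
      framerate: 30,
    },
  });
  return info.supported && info.powerEfficient;
}
```

Note that `supported: true` with `powerEfficient: false` typically means the browser would fall back to software decoding.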
That would require not only storing the video in every resolution, but also in both codecs for every resolution. By going with AV1, they're trying to reduce the cost of storage and bandwidth, not double it.
YouTube has exactly one purpose: playing videos. If the technology they're trying to use can't achieve that, the technology isn't ready, simple as that.
AV1 won't magically reduce storage by 90%; it's an incremental improvement, roughly 30% smaller than VP9 at comparable quality.
YouTube already stores every video in H.264 and VP9, with some videos also encoded in AV1. They aren't removing support for the other codecs yet; they're just making AV1 the default.
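That "default with fallback" behavior can be sketched as a pure selection function: prefer AV1 when the client reports a hardware decoder, otherwise fall back to VP9, then H.264. The capability objects mirror the `{ supported, smooth, powerEfficient }` shape that `MediaCapabilities.decodingInfo` returns; the preference order here is an assumption for illustration, not YouTube's actual policy.

```javascript
// Pick a codec from per-codec capability results. Each entry looks like the
// { supported, smooth, powerEfficient } object MediaCapabilities returns.
// Preference order is illustrative, not YouTube's real selection logic.
function pickCodec(caps) {
  const order = ["av1", "vp9", "h264"];
  // First pass: prefer any codec with a hardware decoder (powerEfficient).
  for (const codec of order) {
    const c = caps[codec];
    if (c && c.supported && c.powerEfficient) return codec;
  }
  // Second pass: accept anything merely supported (software decoding).
  for (const codec of order) {
    const c = caps[codec];
    if (c && c.supported) return codec;
  }
  return null; // nothing playable
}

// Example: AV1 supported only in software, VP9 in hardware.
pickCodec({
  av1: { supported: true, powerEfficient: false },
  vp9: { supported: true, powerEfficient: true },
}); // → "vp9"
```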