Finally, a use case where AI/machine learning would absolutely make sense. If we can have AI that generates text or images, imitates people's voices, or writes code, we can also have a lightweight model that detects ads and skips them during playback. There's already a model trained on SponsorBlock data for detecting sponsored segments: https://github.com/xenova/sponsorblock-ml
I'm sure that we can have something similar but for embedded ads.
It's called a classifier, and it could easily detect an embedded ad. The issue is that now everyone needs to run it on their own hardware to do the detection, and that will cost some electricity.
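Roughly, a minimal sketch of what the playback side could look like: sample frames at a low rate, run some small ad/content classifier on each, and merge consecutive "ad" frames into segments a player could skip. The `classify_frame` callback, the sampling approach, and `min_len` are all placeholders for illustration, not any real SponsorBlock or player API.

```python
# Hypothetical sketch: turn per-frame ad predictions into skippable segments.
from typing import Callable, List, Tuple

def find_ad_segments(
    frames: List,                              # decoded frames, one per sampled timestamp
    timestamps: List[float],                   # matching timestamps in seconds
    classify_frame: Callable[[object], bool],  # returns True if the frame looks like an ad
    min_len: float = 2.0,                      # ignore blips shorter than this
) -> List[Tuple[float, float]]:
    segments, start = [], None
    for t, frame in zip(timestamps, frames):
        is_ad = classify_frame(frame)
        if is_ad and start is None:
            start = t                          # ad segment begins
        elif not is_ad and start is not None:
            if t - start >= min_len:
                segments.append((start, t))    # ad segment ends; keep if long enough
            start = None
    if start is not None and timestamps and timestamps[-1] - start >= min_len:
        segments.append((start, timestamps[-1]))
    return segments
```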
Ads have definitely added more load to electrical grids in aggregate than locally hosted, lightweight models have, especially given that ads are fucking everywhere all the time: websites, apps, the servers behind them, even 24/7 electric billboards. I'm not worried about a few nerds using slightly more electricity sometimes for their own benefit and joy (it's still less power than gaming), as opposed to a corp that burns through power and breaks its climate pledges (Microsoft) for the benefit of its bottom line and nothing else. Corps don't get to have a monopoly on AI that was built with our data, only to have it fed back to us to pull more data and siphon more money.
Do you understand we're still talking about less energy than the monitor it displays on?
I would bet even a plain VGG16 could do that without fine-tuning. Advertising looks starkly different from content, and the output is just an "ad = yes/no" signal. It's a very small workload, probably less than the plain hardware video decoder.
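For a sense of how small that model could be, here's a hedged sketch (not a tested detector) of a per-frame classifier built on a pretrained VGG16 backbone from torchvision, frozen, with only a tiny binary head on top that would need training on labelled ad/content frames:

```python
# Hypothetical per-frame ad classifier: frozen VGG16 features + small binary head.
import torch
import torch.nn as nn
from torchvision import models, transforms

class AdFrameClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        backbone = models.vgg16(weights=models.VGG16_Weights.DEFAULT)
        self.features = backbone.features       # pretrained convolutional feature extractor
        self.pool = backbone.avgpool
        for p in self.features.parameters():     # freeze the pretrained weights
            p.requires_grad = False
        # Small trainable head: 512*7*7 VGG16 features -> single "is this an ad?" logit
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(512 * 7 * 7, 256),
            nn.ReLU(),
            nn.Linear(256, 1),
        )

    def forward(self, x):                        # x: (batch, 3, 224, 224) frames
        return self.head(self.pool(self.features(x)))  # one logit per frame

# Standard ImageNet preprocessing for frames pulled from the video decoder
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])
```

A thresholded sigmoid over that single logit is exactly the "ad = yes/no" signal described above; the per-frame output is one bit, so the data volume really is negligible.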
It's also not a new type of load: it runs off the same power supply as any computer, a slightly capacitive load, so it won't even change the grid power factor.
Give it five more years of hardware performance improvements and software/model optimization and I don't see a problem. The important part is that the improvements are made public for everyone to use and build upon, instead of letting OpenAI and Microsoft take the whole cake.