Nvidia reveals new A.I. chip, says costs of running LLMs will ‘drop significantly’

Currently, Nvidia dominates the market for AI chips, with over 80% market share, according to some estimates.
Not all bad. Compared to crypto, the vector transformations done for ML are relatively similar to those done by graphics processing, so any innovations on the ML front will probably yield improvements in graphics.
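To make that concrete, here's a minimal numpy sketch (shapes and values are made up purely for illustration) showing that a graphics vertex transform and a neural-net layer both boil down to the same matrix-multiply primitive GPUs are built to accelerate:

```python
import numpy as np

# Graphics: transform a homogeneous 3D vertex with a 4x4 model-view matrix.
model_view = np.eye(4)
model_view[:3, 3] = [1.0, 2.0, 3.0]            # hypothetical translation
vertex = np.array([0.5, 0.5, 0.0, 1.0])
transformed_vertex = model_view @ vertex        # 4x4 matrix times 4-vector

# ML: a dense layer applies a weight matrix to an activation vector.
rng = np.random.default_rng(0)
weights = rng.standard_normal((128, 64))        # hypothetical layer shape
activations = rng.standard_normal(64)
layer_output = np.maximum(weights @ activations, 0.0)  # matmul plus ReLU

# Same primitive (a matrix product) in both cases, just at different scales.
print(transformed_vertex, layer_output.shape)
```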
I'm not worried about that. There will be open competition, because most of this stuff is open-source. Cheaper hardware will open the door for anyone like you or me to set up our own services. Anyone can set up a server with their own hardware (or rent it from Amazon or wherever) and run their own chatbot (with blackjack! and hookers!) instead of using ChatGPT.
This is already possible on consumer hardware, just not with the biggest and best networks. Right now, if I wanted to run, say, BLOOM (an open-source LLM), I'd need to spend close to $100K on hardware. Obviously, that's out of reach for a hobbyist, so I'm limited to using smaller, less advanced networks like LLaMA or GPT-J. Cheaper hardware will help break the hold that the big players currently have over the industry.
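For anyone wondering what "run it yourself" actually looks like in practice, here's a rough sketch using the Hugging Face transformers library to load one of those smaller models; the model name, precision, and generation settings are only examples, not a recommendation:

```python
# Rough sketch of self-hosting a smaller open model with Hugging Face
# transformers. GPT-J-6B needs roughly 12 GB of VRAM for its weights in fp16,
# which is within reach of a single high-end consumer GPU.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "EleutherAI/gpt-j-6b"   # example model; other small open models load the same way
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,     # halves memory use vs fp32
    device_map="auto",             # needs the accelerate package; offloads to CPU if VRAM runs out
)

prompt = "The easiest way to self-host a chatbot is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Swap in one of the really big models and the same few lines hit a memory wall fast, which is exactly why cheaper hardware matters here.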
While Nvidia is killing its consumer market to go all-in on something that might not survive the next decade, other brands should get their shit together and fill that gap. I still have faith in Intel, although it's not like they aren't just as evil.
I'm still liking AMD. They're not perfect, of course, but they seem to have far less fuckery going on than Intel and Nvidia, and they have open-source drivers that play nice with Linux.
I always have this thought in the back of my mind too, but the issue is that while AMD's raw performance is a bit better than Nvidia's counterparts, Nvidia still offers more features for the money, and I don't always have money to throw away. Typically I'd upgrade my GPU once every 5 years or so.
AI might not survive the next decade? I already use it every day at work. The productivity gains are enormous and far from saturated. I think it's more likely that AI will survive and consumers (humans) will not survive.
I think people simultaneously overestimate the capability of current machine learning models and underestimate their long-term impact. These models are going to be in everything. They are very resource-hungry and will absolutely be a driver of hardware innovation for the next decade and probably longer.
You've answered your own question. They used to release upgraded hardware with a reasonable generational boost almost yearly. Now the gap has widened, and they're iterating on old hardware by giving it more juice and a larger cooler. Not to mention the astronomical prices, with current mid-range cards now costing more than previous top-end cards.