If I have to deal with Blockchain cloud computing IoT bullshit as a software engineer, I want everyone else to feel my buzzword pain in the tech they use.
It's repurposed cryptocurrency-mining architecture with a price hike. The cards have no video output, and you can string dozens of GPUs onto a single motherboard to sell processing power online.
Thus, Windows will again be instrumental in driving growth for the minimum memory capacity acceptable in new PCs.
I love that the primary driver towards more powerful hardware is Windows just bloating itself bigger and bigger. It's a grift in its own way: consumers are subsidizing the hardware requirements for Microsoft's idiotic data processing. And MSFT is not alone in this; Google doing away with cookies also conveniently shifts most ad processing away from their servers into Chrome (while killing their competition).
They called it Federated Learning of Cohorts at one point. Instead of you sending raw activity data to Google's servers and them running their models there, the model runs in Chrome and only the ad-targeting groups you belong to get sent back. All in the name of privacy, of course.
Microsoft is desperate to regain the power they had in the 00s and is scrambling to find that killer app. At least this time they're not just copying Apple's homework.
And maybe that's why it isn't working. They try too hard to persuade or force you, giving people icky feelings from the get-go... and they try too little to just make a product that people want.
At least it should result in fewer laptops being made with ridiculously small amounts of non-upgradable RAM.
Requiring a large amount of general compute power for AI is just stupid, though. It will probably come in the form of some sort of dedicated AI accelerator that isn't usable for general-purpose computing.
And remember that your data and telemetry are sent to Microsoft servers to train Copilot AI. You may also need to subscribe to some advanced AI features.
Hi kids, do you like violence?
Wanna see me stick nine-inch nails through each one of my eyelids?
Wanna copy me and do exactly like I did?
Try 'cid and get fucked up worse than my life is?
Everyone here is practically praising Microsoft, when in fact you can just buy any PC with 16 GB of RAM you like, without the additional AI spyware (and cost, if I may assume).
Do it: it's easy and fun, and you'll learn about the actual capabilities of the tech. I started a week ago and I'm a convert on the utility of local AI. I had to go back to Reddit for it, but r/localllama has tons of good info. You can actually run useful models at a conversational pace.
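For anyone who wants to try, here's roughly what getting started looks like: a minimal sketch, assuming you've installed llama-cpp-python and downloaded a quantized GGUF model (the model path below is just a placeholder, not a specific recommendation).

```python
# Minimal local chat sketch with llama-cpp-python.
# Assumptions: `pip install llama-cpp-python` and a quantized GGUF model on disk;
# the model path is a placeholder.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/some-7b-model.Q4_K_M.gguf",  # placeholder path
    n_ctx=4096,        # context window size in tokens
    n_gpu_layers=-1,   # offload as many layers as fit into VRAM; use 0 for CPU-only
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Give me three tips for running local models."}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```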
This whole thread is silly because VRAM is what you need. I'm running some pretty good coding and general-knowledge models on a 12 GB Radeon, and almost none of my 32 GB of system RAM is used, lol. Either Microsoft is out of touch or they're hiding an amazing new algorithm.
Running in system RAM works, but inference on a regular CPU is painfully slow, over 10x slower.
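If anyone wants to measure that gap themselves, here's a rough sketch, assuming the same llama-cpp-python setup and placeholder model path as above: load the model once fully offloaded to the GPU and once CPU-only, and time a short generation.

```python
# CPU vs GPU inference timing sketch (llama-cpp-python assumed installed; model path is a placeholder).
import time
from llama_cpp import Llama

PROMPT = "Explain the difference between VRAM and system RAM in two sentences."

def time_generation(n_gpu_layers: int) -> float:
    """Load the model with the given GPU offload setting and time one short completion."""
    llm = Llama(
        model_path="./models/some-7b-model.Q4_K_M.gguf",  # placeholder path
        n_ctx=2048,
        n_gpu_layers=n_gpu_layers,
        verbose=False,
    )
    start = time.perf_counter()
    llm(PROMPT, max_tokens=128)  # plain completion call, output discarded
    return time.perf_counter() - start

gpu_seconds = time_generation(-1)  # -1 = offload every layer to VRAM
cpu_seconds = time_generation(0)   # 0 = keep everything in system RAM, run on the CPU
print(f"GPU: {gpu_seconds:.1f}s  CPU: {cpu_seconds:.1f}s  ({cpu_seconds / gpu_seconds:.1f}x slower)")
```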
Makes sense; 16 GB is sort of the new "normal", although 8 GB is still quite enough for everyday casual use. "AI PC" is a marketing term, just like "AI" itself.
That doesn't work even as hyperbole. I literally just opened an Excel spreadsheet with 51,192 rows (I had Outlook already open), and those two programs still only take 417 MB of RAM combined. Meanwhile, Firefox is at 2.5 GB.
Yes, my total RAM currently in use is 13.8 GB, but I have 64 GB installed, and you should know that generally the more RAM you have, the more of it the system will utilize (this is true of all modern OSes, not just Windows). That's a good thing: it means better performance, because more things can be cached in RAM that would otherwise have to be read from disk. Unused RAM is wasted RAM. So even if one computer uses 16 GB of RAM for some relatively simple tasks, it doesn't necessarily mean the same workload wouldn't run, or would grind to a halt, on a system with less RAM.
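If you want to see the caching effect on your own machine, here's a quick sketch, assuming psutil is installed (`pip install psutil`). The "available" figure includes cache the OS can reclaim on demand, which is why "used" alone overstates memory pressure.

```python
# Snapshot of system memory to illustrate the "unused RAM is wasted RAM" point.
# Assumption: psutil is installed (`pip install psutil`); works on Windows, Linux, and macOS.
import psutil

mem = psutil.virtual_memory()
gib = 1024 ** 3
print(f"total:     {mem.total / gib:5.1f} GiB")
print(f"used:      {mem.used / gib:5.1f} GiB")
print(f"available: {mem.available / gib:5.1f} GiB")  # what processes can get right now without swapping
print(f"in use:    {mem.percent}%")
```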