New memory tech unveiled that reduces AI processing energy requirements by 1,000 times or more
palordrolap @ kbin.run
To stick with the analogy, this is like putting a small CPU inside the bottle, so the main CPU<->RAM bottleneck doesn't get used as often. That said, any CPU, whether it's on the RAM silicon or not, still has to shift data around, so there will still be choke points; they'll just be quicker. Theoretically.
Thinking about it, this is kind of the counterpart to CPUs having an on-chip cache of memory.
Edit: changed "counterpoint" to "counterpart"
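
For anyone who wants the back-of-envelope version of the bottleneck argument, here's a rough Python sketch. The per-byte and per-op energy numbers are placeholders I made up for illustration, not figures from the article; the point is just that when the math happens inside the memory and only the result crosses the bus, the transfer term shrinks dramatically.

```python
# Back-of-envelope sketch of why moving compute into memory saves energy.
# All energy numbers below are made-up placeholders, NOT figures from the
# article -- swap in real ones if you have them.

DRAM_TRANSFER_PJ_PER_BYTE = 100.0   # assumed cost to shuttle one byte over the CPU<->RAM bus
IN_MEMORY_OP_PJ = 1.0               # assumed cost of one logic op done inside the memory array
CPU_OP_PJ = 1.0                     # assumed cost of one ALU op once the data is on-chip

def conventional_energy(n_bytes: int, n_ops: int) -> float:
    """Every operand crosses the bus, then the CPU does the math."""
    return n_bytes * DRAM_TRANSFER_PJ_PER_BYTE + n_ops * CPU_OP_PJ

def in_memory_energy(n_result_bytes: int, n_ops: int) -> float:
    """The math happens in the memory array; only the result crosses the bus."""
    return n_ops * IN_MEMORY_OP_PJ + n_result_bytes * DRAM_TRANSFER_PJ_PER_BYTE

if __name__ == "__main__":
    # Example: a dot product over a million byte-sized weights, with one small result out.
    n = 1_000_000
    conv = conventional_energy(n_bytes=2 * n, n_ops=n)   # weights + activations shipped in
    pim = in_memory_energy(n_result_bytes=8, n_ops=n)    # only the accumulated sum comes out
    print(f"conventional: {conv / 1e6:.1f} uJ, in-memory: {pim / 1e6:.1f} uJ, "
          f"ratio ~{conv / pim:.0f}x")
```

With those placeholder numbers the in-memory version comes out a couple of hundred times cheaper, which is the same flavour of saving as the headline 1,000x claim, though obviously not the same measurement.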