According to repair biz iFixit, the issue with the power-frugal LPDDR memory chips is that the lower voltage they operate at calls for more attention to be paid to signal integrity between the CPU and memory. In practice, this has meant shorter track distances on the circuit board, leading to LPDDR being soldered down as close to the processor as possible.
LPCAMM2 is intended to address this by putting LPDDR onto a circuit board module that is "cleverly designed to mount right up next to the CPU," with "very short traces to help maximize signal integrity," the iFixit team explains in a blog and video detailing their hands-on with the ThinkPad P1 Gen 7.
the lower voltage they operate at calls for more attention to be paid to signal integrity between the CPU and memory
And they aren't kidding around: modern high-speed signals are so fast that a millimeter or less of difference in length between two traces might be enough to cause the signals to arrive at the other end with enough time skew to corrupt the data.
Edit: if you've ever looked closely at a circuit board and seen strange, squiggly traces that seem to be shaped that way for no reason, it's done so their lengths can be matched with other traces.
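To put a rough number on that claim (this is a sketch, not a spec figure: the effective dielectric constant is an assumed FR4-typical value, and 10 Gbit/s is just a convenient example rate):

```python
# Rough skew estimate: how much timing error 1 mm of trace-length
# mismatch introduces. EPS_EFF is an assumed FR4-like value.
C = 299_792_458                       # speed of light in vacuum, m/s
EPS_EFF = 3.2                         # assumed effective dielectric constant

v = C / EPS_EFF ** 0.5                # propagation speed along the trace, m/s
delay_per_mm_ps = 1e-3 / v * 1e12     # delay per millimetre, picoseconds

# At 10 Gbit/s one bit occupies a 100 ps "unit interval" (UI).
unit_interval_ps = 1e12 / 10e9
skew_fraction = delay_per_mm_ps / unit_interval_ps

print(f"~{delay_per_mm_ps:.1f} ps of skew per mm of mismatch")
print(f"= {skew_fraction:.0%} of a 10 Gbit/s unit interval")
```

So with these assumptions a single millimetre of mismatch already eats roughly 6% of the bit period, before any jitter or other error sources are accounted for.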
USB3 is quite forgiving regarding the layout. The standard ±10% impedance matching is fine, and because there is no dedicated clock line you don't need to do length matching either. Even differential-pair length mismatch is not that big of a deal. If 0.1mm is easy to achieve, sure, go for it, but I'd rather compromise on this in favor of more important parameters.
So, does it just have really advanced error checking? How does it handle the mismatches? I believe you, it’s just that the phrase “not that big of a deal” is doing a lot of heavy lifting here.
The signal does not care about how it gets from the sender to the receiver. The only thing that matters is that at the receiver's end the 0s and 1s can be separated. One common measurement is the eye pattern: if the eye is "open" enough (= matches the spec), communication is possible.
Impedance mismatch causes reflections (visible as oscillation after a rising/falling edge), while differential-pair length mismatch degrades the slope of the signal transition (the rising/falling edge). Geometric features only matter if they are large compared to the signal wavelength. As a rule of thumb, features smaller than 1/20th of a wavelength can be safely ignored; oftentimes a ratio as large as 1/5 works just fine. USB3 uses 2.5 GHz (5 Gbit/s) or 5 GHz (10 Gbit/s), where 1/20th results in 3.4mm and 1.7mm respectively (assuming an effective dielectric constant of 3.17). This is still grossly simplified, because in many real systems you don't control the entire transmission line (e.g. the user buys a random cable and expects it to work), so it makes sense that the USB consortium specifies eye patterns and factors in various system uncertainties.
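Those 1/20th-wavelength figures can be reproduced directly (using the same effective dielectric constant of 3.17 assumed above):

```python
# Rule-of-thumb "ignorable feature size" for USB3: features smaller
# than ~1/20 of the on-board wavelength can usually be ignored.
C = 299_792_458        # speed of light in vacuum, m/s
EPS_EFF = 3.17         # effective dielectric constant, as assumed above

def ignorable_feature_mm(freq_hz, ratio=20):
    """Wavelength on the trace divided by `ratio`, in millimetres."""
    wavelength_m = C / EPS_EFF ** 0.5 / freq_hz
    return wavelength_m / ratio * 1000

gen1 = ignorable_feature_mm(2.5e9)   # 5 Gbit/s -> 2.5 GHz fundamental
gen2 = ignorable_feature_mm(5e9)     # 10 Gbit/s -> 5 GHz fundamental
print(f"USB3  5 Gbit/s: {gen1:.1f} mm")   # ~3.4 mm
print(f"USB3 10 Gbit/s: {gen2:.1f} mm")   # ~1.7 mm
```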
RAM, on the other hand, uses 16/32/64/128 single-ended data lines with a dedicated clock line. Data does not have to arrive perfectly at the same time, but the margin may be as little as 1/10th of a clock cycle, so here accurate length matching is absolutely required. It's also the reason why the same CPU + RAM combination may achieve higher stable clock rates on some mainboards than on others.
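A back-of-the-envelope sketch of what that margin means in millimetres (all numbers illustrative: the 1/10-cycle margin is the figure above, the 3.2 GHz clock is roughly DDR5-6400 territory, and the delay per metre assumes an FR4-like board):

```python
# How much trace-length mismatch a parallel RAM bus could tolerate
# if the whole timing budget were 1/10 of a clock cycle.
# NOTE: in reality that budget is shared with jitter, setup/hold
# times, etc., so layout rules demand far tighter matching.
C = 299_792_458
EPS_EFF = 3.2                       # assumed effective dielectric
delay_per_m = EPS_EFF ** 0.5 / C    # seconds of delay per metre of trace

clock_hz = 3.2e9                    # illustrative: ~DDR5-6400 clock
margin_s = (1 / clock_hz) / 10      # 1/10 of a clock cycle
max_mismatch_mm = margin_s / delay_per_m * 1000

print(f"timing margin: {margin_s * 1e12:.2f} ps")
print(f"max mismatch if the whole budget went to routing: "
      f"{max_mismatch_mm:.1f} mm")
```

Since routing only gets a small slice of that budget in practice, the usable mismatch ends up well under those few millimetres, which is why the squiggly length-matching traces are everywhere around RAM.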
Ok, wow. Thank you for educating me on a great deal I didn’t know when I asked the question. And while it does a great deal to bridge that gap… the question remains unanswered: how is this breakthrough achieved?
Same, but now I'm working on very high-speed stuff for work and starting to get into that hobby-wise as well. Just yesterday had a conversation with a colleague about how things are getting too small to hand-solder.
My dedicated AI machine uses 1866 MHz DDR3. Consumers don't know what they need and will buy whatever the latest new thing is. Smart phones are so dumb. Like wow, your brand new $2500 phone has a benchmark 4x faster than my refurbished $250 phone. Now tell me what you do with all that power. "...well I save 27ms per Instagram post which adds up with how much I use it". I want to run headfirst into a brick wall.
A couple of old pieces of metrology equipment dating back to the '80s that I still use call them 'mils'. They've got dual dials for mil/mm. It gets me confused sometimes because the gauge can go down to a couple millionths of an inch / a couple tens of nanometers.
Yeah, I’ve never heard of that before either. What I have heard of is either MOA or MIL reticles. In that context a Mil stands for milliradian, which is a representation of angle. That definitely doesn’t track with the post though.
And it's especially confusing for people who use sane measurement systems where "mil" is short for "millimetre", because it's just the start of the word. I think anyone that still insists on measuring things in thousandths of an inch should keep their own bespoke lingo too, and everyone else should steadfastly refuse to acknowledge "mil" in this context.
In the design and manufacture of PCBs (aka circuit boards) a "mil" is one thousandth of an inch, so it makes sense that's what is being used in this context.
Also the maths checks out: 0.005 inches is equal to approx. 0.127mm, i.e. "just over 0.1mm".
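The conversion itself is trivial, since a mil is exactly 1/1000 of an inch and an inch is exactly 25.4 mm:

```python
# mil <-> mm conversion; 1 mil = 0.001 inch, 1 inch = 25.4 mm exactly.
MM_PER_MIL = 25.4 / 1000

def mil_to_mm(mil):
    return mil * MM_PER_MIL

def mm_to_mil(mm):
    return mm / MM_PER_MIL

print(mil_to_mm(5))     # ~0.127 -> the "just over 0.1mm" trace above
print(mm_to_mil(2.54))  # ~100   -> the classic 0.1" DIP pin pitch is 100 mil
```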
Yeah, I found it weird too, when I started designing PCBs (as a hobby), that "mil" actually stood for a thousandth of an inch.
Probably for historical reasons; there are tons of things in the older domains of electronics that are based on inches rather than metric units. For example, the spacing between the legs of a microchip in the older chip package formats (so-called DIP, the ones with legs that go into holes) is exactly 0.1".
The sizes in more modern electronics aren't usually based on inches anymore, but circuit boards are old tech (even if made with new materials), so there are still a number of measurements in there that are based on inches.
I'm guessing regular non-LP DDR works fine socketed in desktops because power is nearly a non-issue. Need to burn a few watts to guarantee signal integrity? We've got a chonky PSU, so no problem. On mobile devices, however, every watt matters.
I recently got a Mini-PC with a processor with a TDP of 6W, and it uses run-of-the-mill SODIMMs; the power supply for that stuff is a pretty regular wall-socket power adapter, the same kind you'd see for, say, a media box.
I suspect it's not even a few watts (at 3.3V, 1W is around 300mA, which is quite an insane amount of current for a signal line), more like tenths or even hundredths of a watt.
Mind you, what really changes here is voltage rather than current: these things run at a lower voltage, which helps with speed and reduces the power dissipated as heat (so they waste less power and heat up less). That's where signal integrity on longer signal traces becomes more of a problem: lower-voltage signals are closer to the noise level, and the drop in voltage from the resistance of the circuit-board traces becomes a higher proportion of the original voltage, so the longer the trace, the more likely it is that whatever reaches the other side is pretty much at the same level as the noise.
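As a rough illustration of that proportion argument (the trace resistance and drive current below are made-up round numbers for the sketch, not from any DDR spec):

```python
# Illustration only: the same resistive drop eats a larger share of
# the signal swing as the voltage goes down. r and i are assumed
# round numbers, not taken from any memory specification.
def drop_fraction(v_signal, r_trace_ohm, i_amp):
    """Fraction of the signal swing lost to resistive drop in the trace."""
    v_drop = r_trace_ohm * i_amp
    return v_drop / v_signal

r, i = 0.5, 0.02           # 0.5 ohm of trace, 20 mA drive (assumed)
for v in (3.3, 1.2, 0.5):  # older I/O, DDR4-era, LPDDR5-era-ish levels
    print(f"{v:.1f} V swing: {drop_fraction(v, r, i):.1%} lost in the trace")
```

Same trace, same current, but the fraction of the swing lost grows as the swing shrinks, which is the commenter's point about low-voltage signals sitting closer to the noise.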
Still matches what you wrote, by the way: power = voltage * current, so all else being equal, lower voltage does mean less power consumed. It's just that you were a bit off on the scale of the power involved, plus there's more to using a lower voltage than just lower power dissipation: also lower heat generation (which follows directly from lower power dissipation) and higher speeds (which is for different reasons).
Normal DIMMs work fine, but soldered RAM can just be much faster and in general better. It's not an acceptable compromise on most desktops, but for laptops, which also have to be smaller and need to worry about stuff like battery life, it matters more.
There was perfectly fine memory that was upgradable before. They (system integrators/OEMs) saw soldering it down as a way to kill the upgrade market, boosting profits.
"It's more performant than the old SODIMM sticks, vastly more efficient, it saves space, and it should even help with thermals as well. All that, and it's still about as repairable as anything we've ever seen," iFixit concluded.
Yes, there was a perfectly fine, upgradable memory standard before. And many 486s were also perfectly fine, upgradable computers.
The fact that a new technology makes it so we can have our cake and eat it too --- upgradability without any compromise --- is a fantastic innovation.
No, not at the cost of locking in customer choice and flexibility. I have soldered-on ram in my ThinkPad, but not in my Predator gaming laptop. There is a -157% chance that Lenovo was trying to extract a few percent of extra speed so that I can open Firefox 0.13 seconds faster. Perhaps they'd try to cry "but battery life!", in which case I'd respond with "well it's not fucking working" as that machine barely gets 2.5h on a brand-new battery, browsing the web + terminal windows doing server admin stuff. (ThinkPad X13 Gen 2, Intel, with WWAN if you're curious. Fucking 1.5k and it's just passable for basic usage on the go.)
I'm not really upset with this 'new' standard; what pisses me off is that OEMs are absolutely going to use it for bullshit marketing: "look, we fixed the problem! get our un-fucked RAM for only $129 per stick!"
I mean, I'm not sitting here defending soldered-on RAM, but your unnecessary aggression and sarcasm in your previous responses overshadows the fact that, while soldered-on RAM sucks for the upgrade and repair market, the underlying tech has very tangible improvements, and now we can keep those improvements along with the ability to upgrade and repair.
I agree, soldered ram is bad. But I disagree that LPDDR ram is fundamentally bad and this improvement allowing it to be modular while maintaining its improvements is a very good thing.
As far as your complaints about battery life on your ThinkPad go, there is much more to battery life than the memory's consumption, but naturally every part plays a role, and small improvements in multiple places add up to a larger net improvement. I'm assuming you're running Linux, which in my experience has always suffered from less-than-optimal power usage. I'm far from an expert in that particular area, but it's always been my understanding that it is largely caused by insufficient firmware support.
Looking at this article in a vacuum, I only see good things. A major flaw with LPDDR has been addressed, and I'll be able to expect these improvements in future systems.