This knowledge comes in handy with marketing BS around CPU coolers. If an aftermarket cooler gets a CPU to 35°C when the stock cooler is at 70°C, marketing will sometimes claim it cut temperatures in half.
That's not how it works: an "idle" CPU is already generating a not insignificant amount of heat. That's why you measure the difference against ambient air if you're at all serious about it.
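A quick back-of-the-envelope sketch of what that comparison looks like, assuming a hypothetical 25°C ambient (that value is just for illustration):

```
# Compare coolers by temperature rise over ambient, not raw readings.
# The 25 C ambient is an assumed value for illustration.
ambient_c = 25.0
stock_c = 70.0
aftermarket_c = 35.0

stock_delta = stock_c - ambient_c              # 45 C above ambient
aftermarket_delta = aftermarket_c - ambient_c  # 10 C above ambient

print(f"Stock rise over ambient:       {stock_delta:.0f} C")
print(f"Aftermarket rise over ambient: {aftermarket_delta:.0f} C")
print(f"Improvement factor:            {stock_delta / aftermarket_delta:.1f}x")
# -> 4.5x better at shedding heat above ambient, which is the honest
#    comparison, not "half the temperature".
```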
Celsius and Fahrenheit are interval scales, not ratio scales. The step from one degree to the next is consistent, but the zero point is arbitrary (you can go into the negatives), so 2° is not double 1°.
Kelvin and Rankine are ratio scales because their zero is absolute zero.
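A minimal sketch of the difference, just converting to Kelvin before comparing (the example readings are arbitrary):

```
# Interval vs ratio scales: on Celsius the zero point is arbitrary,
# so ratios between readings are meaningless.
def c_to_k(c):
    """Convert Celsius (interval scale) to Kelvin (ratio scale)."""
    return c + 273.15

print(c_to_k(2.0) / c_to_k(1.0))     # ~1.004 -- 2 C is not "twice" 1 C
print(c_to_k(10.0) / c_to_k(-10.0))  # ~1.076 -- a negative reading isn't "negative heat"
```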
But °C was mentioned in the units, and it's well understood that 0°C is a cold temperature for humans.
I'm not a fan of marketing doublespeak either, but I think the right scale and the right terminology were used here. They cut the temperature in half, in Celsius, on the basis that 0°C is very cold.
That's where the physics comes in. If halving the temperature in Celsius takes it from 70° to 35°, then in your case, starting at 100°, the same energy difference would only bring the temperature down to something closer to 65° than 50°.
The specific cooling capacity of the cooler in question only "halves" the temperature if you start at one very specific point.
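A rough sketch of that, assuming a fixed heat capacity so equal energy removed means an equal drop in degrees (a simplification, but it carries the point):

```
# Simplified model: with a fixed heat capacity, removing the same amount of
# energy always produces the same drop in degrees, regardless of start point.
drop_c = 70.0 - 35.0   # the "halving" case: a 35-degree drop

for start_c in (70.0, 100.0):
    end_c = start_c - drop_c
    print(f"Start {start_c:.0f} C -> end {end_c:.0f} C "
          f"({end_c / start_c:.0%} of the Celsius reading)")
# 70 C -> 35 C looks like "half", but the same energy removal
# takes 100 C to 65 C, not 50 C.
```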
My entire argument rests on the premise that 0° is a rational starting point for both C and F, but I concede that halving a reading doesn't describe the absolute change.
But centigrade isn't an absolute scale, so framing it that way is disingenuous. Using your argument requires the consumer/reader to make a number of inferences or assumptions, which isn't a good method of communication in general. It is perfectly valid to say that the cooler took CPU temperatures from 70°C to 35°C.
Why not just say that? It's an impressive stat!
Scales exist for a reason. Cutting 70°C in half is by definition -101.5°C. But let's assume somehow everyone is on the same page and that anything below 0°C should just be ignored in this specific scenario and not in any other (confusing, right?): saying the temperature was cut in half is still confusing! Half from where? Did it go from 20°C to 10°C? From 80°C to 40°C? It just doesn't mean anything, and as I said before, I would argue that just stating the numbers is more impressive and informative.
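For what it's worth, the arithmetic behind that figure (the -101.5 above uses 273 K as the offset; with 273.15 K it comes out to about -101.6°C):

```
# "Half of 70 C" only means something on an absolute scale.
celsius = 70.0
kelvin = celsius + 273.15        # 343.15 K
half_kelvin = kelvin / 2         # 171.575 K
back_to_c = half_kelvin - 273.15
print(f"Half of {celsius:.0f} C is {back_to_c:.1f} C")  # -101.6 C
```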
I agree that the numbers should just speak for themselves
Cutting 70°C in half is by definition -101.5°C
I'd argue here that no one would make this leap or do that mental calculation; most people would just divide X by 2 and gauge what the resulting Y is based on their familiarity with the weather.
it requires the consumer/reader to make a number of inferences or assumptions which isn’t a good method of communication in general
They still have to make those inferences to understand whether 70 to 35 is a remarkable feat or not.
If it's 30 / 2 = 15, people would think "Huh, 15 is pretty cool compared to room temperature (~20ish), that's significant." If it's 90 / 2 = 45, people would think "Huh, both 90 and 45 are pretty hot, but it seems like a meaningful reduction nonetheless."
All I can say is that in my professional career, where I have to write technical reports and summarize technical information, I would never represent it that way, and I would be concerned if a colleague, customer, or supplier did, even when communicating to a non-technical audience. I would also call out my employer or management if they ever tried to change the representation of the data to something like this.
That could say more about me than anything else, but that's where I am at.
If you convert those temperatures to Kelvin, they become 308 K and 343 K. Since Kelvin is absolute and we're measuring the same material, that tells you how much thermal energy is actually there and the real proportion between the two: 343 K is only about 11% higher than 308 K, nowhere near double.
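A quick check of those figures:

```
# The 70 C -> 35 C drop in absolute terms: 343 K vs 308 K.
hot_k = 70.0 + 273.15    # 343.15 K
cool_k = 35.0 + 273.15   # 308.15 K
print(f"Ratio: {hot_k / cool_k:.3f}")  # ~1.114 -- about 11% more, not double
```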
I just want to chime in and say I appreciate your willingness to absorb knowledge, as well as not doing the "I was mistaken so I'll delete my comment" thing so that other people can read along and learn as well.