Americans always regurgitate the "Fahrenheit is how people feel" nonsense, but it is just that: nonsense. Americans are familiar with Fahrenheit, so they think it is more intuitive than other systems, but unsurprisingly, people who are used to Celsius have no problem using it to measure "how people feel" and will think it is a very intuitive system.
Can confirm. Moved from the US to Canada, and maybe a year of using Celsius revealed to me just how fucking stupid and convoluted Fahrenheit is. My dad spent three weeks out here and started using Celsius on his phone. Now I only use Fahrenheit when dealing with fevers or temping cases of suspiciously overripe produce.
Fellow Americans: Celsius is superior and more intuitive for those who take a moment to adjust to it. It is okay to accept this as fact without developing an inferiority complex. USA not always #1. USA quite often not #1, and that is okay.
The universe is mostly empty space with an average temperature of like... 3 Kelvin or some shit. Why not use a system that reflects that? Oh, we do? Right. Kelvin is Celsius + 273.15.
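A quick sanity check of that offset in Python (nothing assumed beyond the SI definition of the kelvin):

```python
def celsius_to_kelvin(c: float) -> float:
    """Kelvin = Celsius + 273.15, exact by definition."""
    return c + 273.15

def kelvin_to_celsius(k: float) -> float:
    return k - 273.15

print(celsius_to_kelvin(0))    # freezing point of water: 273.15 K
print(kelvin_to_celsius(2.7))  # cosmic microwave background, roughly -270 °C
```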
I don't know why "techtarget" would be a credible source on physics questions, but the SI convention, which is, according to Wikipedia, the "only system of measurement with an official status in nearly every country in the world, employed in science, technology, industry, and everyday commerce", states that "kelvin is never referred to nor written as a degree."
But I also made the mistake of writing it as "Kelvin" instead of "kelvin".
So then we should use the system that puts the freezing and boiling points of water at nice round values such as 0 and 100? Sounds like Celsius is the better system.
Slightly off topic, but is 23°C a nice room temperature? We have our thermostats at 20°C and I find it quite warm. In the bedroom we have 18°C, and so does my office, which I find quite comfortable. I hate visiting my parents; they always have 22.5°C, which I find uncomfortably warm.
Well it's all subjective after all, I'll be happy about chilly 23°C inside when summer comes.
What is your point? That people who use Celsius can't feel the difference between 21.7°C and 22.8°C?
If you're worried about your thermometer, you'll be happy to hear that metric ones usually have finer precision than Fahrenheit ones, since they go in 0.5°C steps. Since +1°F means +5/9°C (about 0.56°C), the Fahrenheit steps are the coarser ones!
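To make the step-size comparison concrete, here is a tiny Python sketch (assuming the typical consumer steps of 0.5°C and 1°F):

```python
def f_delta_to_c(delta_f: float) -> float:
    """Convert a temperature *difference* from °F to °C (the 32 offset cancels)."""
    return delta_f * 5 / 9

metric_step_c = 0.5              # a typical 0.5 °C thermostat step
us_step_c = f_delta_to_c(1.0)    # a 1 °F step expressed in °C

print(metric_step_c, round(us_step_c, 3))  # 0.5 vs 0.556
assert metric_step_c < us_step_c           # the 0.5 °C step is the finer one
```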
It's just an incredibly weak defense. Why is it worse for C to use an extra decimal for these differences? I can just as well argue that C is a more accurate representation, because small differences in temperature are smaller. Just like your argument, this is purely an opinion - until you can show me that not needing the extra decimal is objectively better, or until I can show you that smaller differences being represented as such is objectively better, neither of them holds any weight.
It's the same reason we use abbreviations and contractions when speaking. A trivial simplification is still a simplification.
Why bother with Celsius at all when there is Kelvin? Even Kelvin is arbitrary. Best to use Planck-normalized temperature. The scale would run from 0 to 100, where 0 is absolute zero and 100 is 10^32 Kelvin.
So whenever you have to tell someone the temperature outside, you say it's 0.000000000000000000000000000288 Planck
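For fun, the conversion onto that hypothetical 0-100 scale (where 100 corresponds to 10^32 K, per the comment above) is a single multiplication:

```python
def kelvin_to_planckish(k: float) -> float:
    """Map kelvin onto a hypothetical 0-100 scale where 100 = 1e32 K."""
    return k * 100 / 1e32

outside = 15 + 273.15  # a mild 15 °C day, in kelvin
print(kelvin_to_planckish(outside))  # about 2.88e-28 "Planck-scale" units
```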
If 3 digits isn't a tiny bit more cumbersome than 2, then 32 digits is fine too.
We don't have issues with decimals in many places. For example, why are there pennies? Why aren't dollars just scaled up by 100? Generally speaking: why don't people immediately shift to the lower unit when talking about, e.g., 3.5 miles? If you're correct, those should be simplified too - yet they aren't.
Why bother with Celsius at all when there is Kelvin?
Because Celsius uses a scale that relies on temperatures you're encountering in your everyday life.
Even Kelvin is arbitrary. Best to use Planck-normalized temperature. The scale would run from 0 to 100, where 0 is absolute zero and 100 is 10^32 Kelvin.
Because Celsius uses a scale that relies on temperatures you’re encountering in your everyday life.
But that's the same reason given for Fahrenheit!
Why? That scale is still arbitrarily chosen
It's not arbitrary in that it represents the fundamental limits of temperature in the universe. Planck units are fundamental to the nature of the universe rather than based on any arbitrary object.
I would argue it's because of historical usage, familiarity, and resistance to change. Most countries and most people living in hot climates use Celsius.
Do not say anything positive about Fahrenheit in this thread.... the Temperature Scale Inquisition is watching closely for any dissent from the party line.
Hmm... Around here water boils at ~96°C (some labs measure that). And it seems not to freeze at 0°C anywhere on Earth, as it's never pure water, and there is never a homogeneous freezing point.
It is repeatable and not very arbitrary, but "intuitive" doesn't apply in any way.
You must be at altitude. That definitely makes a difference for the boiling point, but of course water freezes at 0. Impurities that you'll encounter in tap water, for example, will not have a large effect on freezing point.
Even if it was different by a few degrees, how does that make the scale any less intuitive?
No it really doesn't. Knowing water freezes at 0 gives you no help in day to day life vs knowing 32 or 300 for water to freeze. You still have to be cautious driving above the freezing point. Your refrigerator sits a few degrees above 0 instead of 35 or 305.
Knowing it's 20 out only tells you useful information because you memorized what that feels like. You could just have internalized what 375 feels like.
Celsius is nice if you need to build a thermometer from scratch. That's not something people generally do.
I mean, you're 100% wrong. Fahrenheit isn't "how people feel" arbitrarily, it's almost literally a 0-100 scale of how hot it is outside. You need no prior knowledge to interpret a Fahrenheit measurement. Which really reflects poorly on everyone who says "Fahrenheit doesn't make any sense" because if they were capable of any thought at all they would figure it out in 2 seconds, like everyone else. I'm a lab rat that uses Celsius all day every day, I'm just not a pretentious stuck up tool about alternate measurements just because I refuse to understand them.
I like that Fahrenheit has a narrower range for degrees. 1C is 1.8 degrees F. So, F allows you to have more precision without the use of decimals. Like, 71F feels noticeably different to me than 64F, but that is only a 3.8 degree difference in C.
Where in the chicken I jam the thermometer makes several degrees difference. If you truly require that level of granularity whilst grilling, I'd wager reading a decimal figure isn't the end of the world. Us normies can continue to bring chicken to 74 and call it a day
3 degrees Celsius is easily noticeable too, so that's a bit of a moot point. If anything, 1 degree Celsius is much harder to discern, and therefore an even more granular scale is unnecessary.
It is really easy to map onto human feel though. 0-100 pretty accurately maps onto our minimum and maximum realistically survivable temps, long-term, and the middle temperatures of those are the most comfortable. It's far more round, when it comes to describing human preference and survivability, than Celsius is.
I bet a lot more people know what 0°C feels like than 0°F. One is freezing point, one is a completely arbitrary temperature which only gets called "the lowest you'll experience" as a post hoc rationalisation of Fahrenheit. Most people will never experience anything that cold, some people experience colder.
I even bet more people know what 100°C feels like than 100°F. One is accidentally getting scalded by boiling water, the other is a completely arbitrary temperature which is quite hot but not even the hottest you'll experience in America.
Boiling water isn't necessarily 100°C. If you're boiling water, it can be any arbitrary temperature above 100.
That's like going to a geyser pit and saying that's 100°C, when it isn't. When you cook and let water come to a boil, the chef doesn't care that it's exactly 100°C, only that it's in the state above 100.
If anything it'll be below 100 due to altitude. For example, salt water for making pasta still boils at approximately 100°C. It takes quite a lot of salt (way more than you would ever want to consume) to meaningfully raise the boiling point.
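A back-of-the-envelope check in Python, assuming textbook values (ebullioscopic constant of water Kb ≈ 0.512 °C·kg/mol, NaCl treated as fully dissociated so the van 't Hoff factor i ≈ 2):

```python
KB_WATER = 0.512   # ebullioscopic constant of water, °C·kg/mol (textbook value)
I_NACL = 2         # van 't Hoff factor for fully dissociated NaCl (assumption)
M_NACL = 58.44     # molar mass of NaCl, g/mol

def boiling_point_rise(grams_salt: float, kg_water: float) -> float:
    """Boiling-point elevation dT = i * Kb * molality."""
    molality = (grams_salt / M_NACL) / kg_water  # mol solute per kg solvent
    return I_NACL * KB_WATER * molality

# A generous 10 g of salt per litre of pasta water:
print(round(boiling_point_rise(10, 1.0), 2))  # about 0.18 °C - barely anything
```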
if you're boiling water, it can be any arbitrary temperature above 100.
That's not how boiling works. The water heats up to its boiling point, where it stops and boils. While boiling, the temperature does not increase; it stays exactly at the boiling point. This is called "latent heat": at its boiling point, water will absorb heat without increasing in temperature until it has absorbed enough for its phase to change.
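The numbers make the point vividly. Using rough textbook values (specific heat of water ≈ 4186 J/(kg·°C), latent heat of vaporization ≈ 2.26 MJ/kg), a Python sketch:

```python
C_WATER = 4186    # specific heat of water, J/(kg·°C) (textbook value)
L_VAP = 2.26e6    # latent heat of vaporization, J/kg (textbook value)

mass = 1.0  # kg of water
heat_to_reach_boil = mass * C_WATER * (100 - 20)  # heating from 20 °C to 100 °C
heat_to_vaporize = mass * L_VAP                   # boiling off at a constant 100 °C

print(heat_to_reach_boil)  # 334880.0 J to bring it to the boil
print(heat_to_vaporize)    # 2260000.0 J absorbed with zero temperature rise
```

Boiling the water away takes nearly seven times the energy of heating it up, all of it absorbed while the temperature sits at 100°C.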
0-150 is the better range, and 75 is right in the middle. 100 is just a hot air temperature most people don't want to be in but it's not an extreme.
Saunas can get up to 200 degrees
Hot tubs are usually at 100
Freezers need to be at least 0
You say 15°C. 6° cooler than room temperature. But how much is 6°?
That's about 60°F.
50°F or 10°C is where you need clothes to survive
300, 325, 350 is where you bake cookies (149-176°C)
Fahrenheit has a bunch of 5s and 10s.
Saying something like high 70s or low 70s for temp represents an easy way to tell temperature.
21° to 26° for Celsius
I walk outside and say "It feels like high 70s today"; someone using Celsius would say "Feels like 25°". If it was a little warmer, then "low 80s" compared to "Ehh, about 26 or 27°C".
Why is it okay to say high 70s/low 80s and not high 20s? No one goes outside and says, "Ehh, it feels like 26.6°C today."; we just know it is a bit warmer than 25.
Yeah, I get your point. I think I'm just trying to explain that it all just matters where you grew up and what you used. I go outside today and I do say it feels like a 12 degree day. It's not that much different.
I must admit, the oven temps are nice, but they are a product of being written in Fahrenheit (if they were written in Celsius, they would be round too, like 150°C, 160°C, 170°C, 175°C, etc.).
But the more I look at it, the more I see it's all just numbers. We attach importance to these numbers, but they're all pretty arbitrary, except Celsius using 0 as the freezing point of water and 100 as the boiling point - these are two very important measures that are just weird in Fahrenheit.
What makes 0F (-18C) special? How do you estimate survivability at such a temperature? If I were out on the street naked, I would die there in a matter of minutes. At the same time, there are plenty of places where winter temperatures go to -40F (-40C) and even below, yet people very much survive and live there.
Similar with 100F (38C). There are places with higher temps in the summer, up to 120F (49C) in some places, yet people survive. Still, if you're not equipped with anything, 100F (38C) will burn you alive.
All that not to mention that 50F (10C) is actually cold, not comfortable.
Fahrenheit is only intuitive and "feeling-descriptive" because you're used to it. From a person born in Celsius country, it's really not less intuitive.
I know I can be comfortable in my birthday suit at around 25C. Less than 20 is chilly, less than 10 - cold, less than 0 - freezing. More than 30 is hot, more than 40 is deadly.
Guess what, Canada sets the freezer at -15 Celsius. The USDA just chose 0F because it's good enough and a nice easy to remember number, there is nothing special about it.
Same with all your other numbers: you're just using whatever the closest even F value is that's easy to remember. There's nothing special about any of them, and we have equivalents in Celsius.
Keep your freezer at -18 °C (0 °F) or lower. This will keep your food out of the temperature danger zone between 4 °C (40 °F) and 60 °C (140 °F), where bacteria can grow quickly.
Every 2 F is basically 1 C. You have more whole numbers with F.
Like -15°C is 5°F
6°F is -14.4444°C
-14°C is 6.8°F
So 5, 6, and 7°F are about equal to -15, -14.5, and -14°C.
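Those pairs check out against the standard conversion formulas; a quick Python verification:

```python
def c_to_f(c: float) -> float:
    return c * 9 / 5 + 32

def f_to_c(f: float) -> float:
    return (f - 32) * 5 / 9

print(c_to_f(-15))            # 5.0
print(round(f_to_c(6), 4))    # -14.4444
print(round(c_to_f(-14), 1))  # 6.8
```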
And it's not just a random number. You know how much more energy would be used if everyone kept their freezer just a couple degrees colder? It's the optimum recommended temperature.
No it's random and arbitrary.
Those couple of degrees improve shelf life and allow better extermination of many organisms, and a higher temperature gradient lets the water freeze faster, which shows in the quality of the product after thawing, as it is less affected by poorly formed, expanded ice crystals.
There is no "golden temperature", and so everyone flips it how they like it, and instead of what's actually right this is often dictated by convenience.
There are strong benefits to keeping your freezer at -80°C (-112°F), even, but at this point it crosses the line of practicality by both freezer cost and electricity consumption.
Also, the whole-numbers argument is extremely weird.
Like, do you know the difference between 71 and 72°F? Is it pronounced in any way?
I can assure you, I cannot tell the difference between 21°C and 22°C. And where it actually matters (precise measurements etc.) you'll need decimals for both (and there's nothing wrong with them!)
If I asked you: would you stick your hand in 50°C water for 100 dollars, would you do it?
What about 60°C?
65°C?
I bet you don't know what would happen if you stuck your hand in 65°C water without looking it up. There's a huge jump from 60° to 65°C. 70°C will instantly scald you.
Someone out there is stupid enough to think: water boils at 100°C, so 65 should be perfectly fine. Meanwhile, even though water doesn't boil until 212°F, most people would be cautious about sticking their hand in 100°F+ water.
Yes if you think 40°C+ is hot then you can gather that 65°C would be hotter. But why compare to 40° when you can do 100°.
Why compare it to 40°? Because I know what 40° feels like, since I've been living in a civilized country with a civilized measurement system all my life. I can tell you that 65° is too hot, because I make my tea with 70° to 80° water; anything just below that will probably still be too hot for my skin.
In the end, there is no objectively better system when it comes to day to day temperatures. But there is one when it comes to science, reliability and universality and that is Celsius.
All international science uses metric and slowly but surely the resistance amongst US universities melts away and they switch to metric as well. Give it another one or two generations and we'll finally be rid of the outdated and arbitrary imperial system!
Fahrenheit is grouped with US Customary units but is not one.
I agree the metric system is superior and there isn't a reason to use inches, feet, yards, etc.
But Fahrenheit is a great system for weather and works great for everything else.
For science if I have to heat a beaker to 280° it doesn't matter if it's C or F. I'm not going to be able to relate to 280° in either system. The instrument is going to have to tell me the exact measurement.
Same with like a tape measure. I can measure out 3 meters. I don't need to know how long 3 meters is to do that.
However, mark two lines on a piece of paper and I will get closer guessing in inches than cm because I know the US customary units better.
Eventually US will change to metric. But I doubt we will ever not use Fahrenheit for normal day things like weather
I can absolutely tell 65°C is too hot, that's 5°C short of what is piped as literal hot water in the taps in my area.
I would not recommend going above 40°C for washing, and there is literally zero issue remembering that. Body temperature of a healthy human is 36.6-36.7°C (97.9-98.1°F); everything above that is hot.
As such, there is literally zero issue figuring 40°C is reasonably hot and 65°C is unreasonably hot, it doesn't take a genius.
Speaking of water, Celsius is obviously superior as a water-based system. I can easily tell temperature in my kettle goes to 100°C (212°F, huh?), or steps down to 90°C (194°F??) or 80°C (176°F??) to brew a perfect cup of tea. When temperature outside goes 0°C (32°F??), I know I can expect ice and snow. And for everything in between, I can make a pretty accurate approximation.
And finally, the modern Fahrenheit scale is literally defined through Celsius: 32°F at the freezing point of water (i.e. 0°C) and 212°F at its boiling point (i.e. 100°C). You're literally using Celsius but making it harder for no reason.
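That pinning shows up directly in the conversion formula; a short Python illustration:

```python
def c_to_f(c: float) -> float:
    """Fahrenheit from Celsius: pinned at 0 °C -> 32 °F and 100 °C -> 212 °F."""
    return c * 9 / 5 + 32

assert c_to_f(0) == 32     # freezing point of water
assert c_to_f(100) == 212  # boiling point of water

# The in-between kettle temperatures are where the odd numbers come from:
for c in (80, 90, 100):
    print(c, "°C =", c_to_f(c), "°F")  # 176.0, 194.0, 212.0
```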