For proof that this thread is just people justifying what they know as better somehow, look no further than Canada.
We do cooking temps in Fahrenheit, weather in Celsius. Human weights in pounds, but never pounds and oz. Food weights in grams, cooking weights in pounds and oz. Liquid volume in millilitres and litres, but cooking in cups, teaspoons and tablespoons. Speed & distance in kilometres, heights in feet and inches.
Try and give this any consistency and people will look at you like you’re fucked. The next town is 100km over, I’m 5ft 10in, a can of soda is 355ml, it’s 21c out and I have the oven roasting something at 400f. Tell me it’s 68f out and I will fight you.
People like what they are used to, and will bend over backwards to justify it. This becomes blatantly obvious when you use a random mix of units like we do, because you realize that all that matters is mental scale.
If Fahrenheit is “how people feel” then why are feet useful measurements of height when 90% of people are between 4ft and 6ft? They aren’t. You just know the scale in your head, so when someone says they’re 7ft tall you say “dang that’s tall”. That’s it.
Fahrenheit: let’s use “really cold weather” as zero and “really hot weather” as 100.
Celsius: let’s use “freezing water” as zero, and “boiling water” as 100.
Canucks:
I don’t really have a horse in this race but this logic doesn’t seem legit to me.
How is -17°C really cold weather AND 37°C really hot weather?
One is actively trying to kill you if you weren’t already dead by the time the weather got that bad. The other just makes your nuts stick to your thighs – if you’re in a humid place.
I’d agree with the logic if 100F was equal to something like 65°C. 🤷♂️
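Just to put numbers to that, here’s the standard conversion arithmetic (a throwaway sketch, nothing thread-specific):

```python
# Standard Fahrenheit/Celsius conversions, used only to check the figures above.
print((0 - 32) * 5 / 9)    # ~ -17.8: 0 F is about -18 C
print((100 - 32) * 5 / 9)  # ~  37.8: 100 F is about 38 C
print(65 * 9 / 5 + 32)     # 149.0  : 65 C would be 149 F, nowhere near 100 F
```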
It makes no sense because that’s not what the 0 of the Fahrenheit scale is. The 0 point is the coldest an ammonium chloride brine mixture can be cooled to. The 90 point was an estimated average for human body temperature (it was adjusted up over time). These were chosen because the goal of the scale was to provide a way for people to have a defined temperature scale with a range and degree size that could be reliably reproduced without passing around standardized tools. 100 is really hot because human bodies were used as a reference for the high end, but the low end has nothing to do with the human body.
Geometric construction plays a role in there as well: the 180 degrees between the boiling point and the freezing point of water was not accidental.
but like isn’t that the whole point of Celsius? all you need to calibrate a C thermometer is some water: when it starts freezing it’s 0°C and when it’s boiling it’s 100°C, super simple and accessible.
It’s not like “the estimated average human body temperature” is particularly accurate, and surely no matter what you mix into water it won’t magically boil at the same temperature regardless of air pressure?
You’re totally correct that Celsius is the more sensible scale with easier to replicate reference points (when using water). It was also invented almost 30 years after the Fahrenheit scale and with all the insights gained from that period of technological advancement. In fact in the modern day the Celsius degree size is defined in reference to the Boltzmann constant since Celsius is essentially the Kelvin scale with the numbers moved around.
It also used 100 as the freezing point of water and 0 as the boiling point when originally proposed, which changed after Anders Celsius died because everyone knew that was a weird way to do it.
At what molar concentration? Was it just as much NH4Cl as he could dissolve at ambient temperature and pressure?
As I understand it, yes it was a saturated solution.
Thank you. That argument bugs the heck out of me.
Removed by mod
So why not make the temperature go to the hottest? Let me guess, 0 isn’t the coldest either in America, right? It’s just so arbitrary, and pure cope to say it’s the best way to describe temperature.
Removed by mod
The records are -80°F and 134°F
That’s quite an error in a “whole human experience in zero to one hundred” system
All of them are. The decision to use water at all is completely arbitrary. Even Kelvin and Rankine are completely arbitrary: the “width” of the degrees is not defined by a physical factor, but relative to an entirely arbitrary concept.
Technically all arbitrary, but Fahrenheit is definitely on a whole different level of arbitrary.
Celsius - 0 = precise freezing point of water and 100 = precise boiling point
Kelvin - same as C, but shifted so 0 is the precise lowest possible temperature
Fahrenheit - 0 is the imprecise freezing point of some random brine mixture, 100 is the imprecise average body temperature of the developer
That’s a myth. It’s no more true than the myth that it was the body temperature of horses, or that the scale was designed to reflect how humans experience the weather. (It happens to reflect how humans experience the weather, but this was an incidental characteristic and not the purpose for which the scale was designed.)
The Fahrenheit scale starts to make sense when you realize he was a geometer. It turns out that a base-10 system of angular measurement objectively sucks ass, so the developer wasn’t particularly interested in geometrically irrelevant numbers like “100”, but in geometrically interesting numbers like “180”. He put 180 degrees between the freezing and boiling points of water. (212F - 32F = 180F)
After settling on the “width” of his degree, he measured down to a repeatable origin point, which happened to be 32 of his degrees below the freezing point of water. He wanted a dial thermometer to point straight down in ice water, straight up in boiling water, and to use the same angular degrees as a protractor.
The calibration point he chose wasn’t the “freezing point” of the “random brine mixture”. The brine was water, ice, and ammonium chloride, which together form a frigorific mixture due to the phase change of the water. As the mixture is cooled, it resists getting colder than 0F due to the phase change of the water to ice. As it is warmed, it resists getting warmer than 0F due to the phase change of ice to water. (Obviously, it can’t maintain this relationship indefinitely. But so long as there is ice and liquid brine, the brine will maintain this temperature.) This makes it repeatable, in labs around the world.
And it wasn’t a “random” brine mixture: it was the coldest and most stable frigorific mixture known to the scientific community.
This criticism of Fahrenheit is borne of simple ignorance: people don’t understand how or why it was developed, and assume he was an idiot. He wasn’t. He had very good reasons for his choices.
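To spell out the 180-degree span described above with actual arithmetic (a minimal sketch of the argument, not anything Fahrenheit himself wrote down):

```python
# Water's liquid range: 32 F to 212 F, which is 0 C to 100 C at standard pressure.
ratio = (212 - 32) / (100 - 0)   # 180 / 100 = 1.8, i.e. 9/5

def c_to_f(c):
    """Celsius -> Fahrenheit, derived from the two water fixed points."""
    return ratio * c + 32

print(c_to_f(0), c_to_f(100))    # 32.0 212.0
```

That 9/5 factor is the whole reason the conversion formula looks the way it does.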
We live on a water planet. The weather we care about is water.
If you look at the overnight low you probably want to know if frost was likely. Guess what Celsius temperature frost happens at.
That factoid makes Celsius relevant for about 4 out of the 12 months, and humans lack the capacity to distinguish between 60 and 100 on the Celsius scale. Anything at those temperatures just feels like blisters.
Removed by mod
I’m saying that 0F is waaaaaaay more dangerous than 100F so the logic of those particular temperatures being the 0-100 ends of the scale can’t be explained by how dangerous they each are.
Almost everyone would be fine staying outside for 30 minutes at 100F without any external help (shade, cool drinks etc). Almost nobody would be fine after staying outside at 0F without external help (parka, thermals etc).
To me, with absolutely no data, it feels like:
0F is as dangerous as 140F (you’re long dead if you’re outside in both cases)
100F is as dangerous as 40F (mildly uncomfortable but safe for a while)
So calling 0F and 100F both “really dangerous” and using that to justify them being the respective points of 0 and 100 is disingenuous. Like, use Fahrenheit if that’s what you’re used to - I use it too because that’s what I’m used to. But I don’t explain the insane system with “it’s because the two ends are reallllly dangerous.”
Removed by mod
Every time a heat wave brings 100F, the news starts reporting about old people dying. Every time the temperatures reach zero, same thing.
Personally, I can handle the cold much easier than the heat. I get stupid-brain after working more than 30 minutes at 95F. Another 15 minutes and I can’t catch my breath, lose fine motor control, and start feeling faint. Drenching myself in water - the colder the better - every 20 minutes or so is the only way I’ve found to be productive above 100F. I feel like 100F is actively trying to kill me.
0F is where it starts getting difficult for me to stay warm without an additional heat source.
Lmao are you a penguin or something? Please tell me that you’re exaggerating to make a point and aren’t seriously saying that you’re capable of staying warm at -10°C (14°F) “without an additional heat source.”
I mean, I have clothes. Long underwear? Layers? Coats, gloves, hats, scarves?
They say you can always put on more clothes if you’re cold, but that’s not really true. Insulation adds bulk, and bulk reduces mobility. Around 0F is where I start to have real trouble wearing enough clothing to stay warm while still being able to perform the activity that has me outside in that weather. Somewhere around 0F, clothing doesn’t really cut it, and I need shelter or additional heat.
That’s a lot of moved goalposts to justify the weird temperature scale logic but okay.
You’ve essentially justified that 0F and 100F are what they are because some old people died when it was 100F (most people, including the old, are perfectly fine at this temperature all around the world) and because you can manage at 0F while wearing a ton of layers and not need a heat source (do all old people manage to survive just fine at 10F or 20F by just putting on some layers?).
Either way, this pointless conversation has gone on for way too long. Have a good day! :)
Celsius is for scientists and nerds, Fahrenheit is for normal idiots. It’s not rocket surgery.
Those are two different things. Hope this helps.
It doesn’t help at all, it’s being intentionally obtuse. You know what I mean, it’s unhelpful to pretend otherwise and pick a fight over it.
If an argument is being made for one thing, Fahrenheit, it’s not relevant to bring up a different thing. Why is feet a useful measurement? Maybe it’s not, we’re talking about temperature.
Yeah like the metric system has good arguments for why its measurements and weights are better, mainly conversion being easier, but for temperature there really isn’t an argument. I would make an argument for Fahrenheit as it gives more precision without having to use decimals, which at least in America aren’t a thing for temperature. But those are pretty minor things and I do tend to agree it comes down to what you grew up with.
This fear of decimals is a strictly American thing. Celsius achieves more precision with decimals than Fahrenheit without decimals. And this American fear of decimals is pretty funny, considering you will happily do advanced fractions as soon as you are doing length measurements.
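For what it’s worth, the degree-size arithmetic behind that precision claim (nothing fancy, just the ratio of step sizes):

```python
# How big is one step in each scheme, measured in Celsius degrees?
whole_f_step = 5 / 9        # one whole Fahrenheit degree is ~0.56 C
tenth_c_step = 0.1          # Celsius read to one decimal place

print(whole_f_step)                  # 0.555...
print(whole_f_step / tenth_c_step)   # ~5.6: whole-degree F is ~5-6x coarser than 0.1 C
```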
I don’t mind decimals at all, it’s more that I don’t trust companies to actually deal with supporting decimals when making the switch. Plus the last time I discussed this on Lemmy, someone was saying that decimals aren’t even universally used, so whether you get that precision might depend on what you buy. Either way, the main point of my post was that these are minor arguments, and at the end of the day there isn’t really a reason to use Celsius vs Fahrenheit.
Can you feel the difference between 23.5° and 24? I can’t. You don’t often need precision to tenths.
In Australia most weather providers give you whole degrees, the bureau of meteorology gives you to one decimal in reports and whole degrees in forecasts
My coffee and beer boilers can hit high precision temperatures to variously 0.1° or 0.5° precision. The beer boiler gives 3 digits - hundredths below 10°, tenths below 100°, whole numbers 100° and over
You can choose the precision of thermometers you wish to buy for yourself
I have seen Fahrenheit thermometers which are hard to read to better precision than 5 degrees
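If anyone’s curious, the beer boiler’s three-digit readout described above works out to a simple rule; here’s my guess at the formatting logic (not the actual firmware, obviously):

```python
def boiler_display(temp_c: float) -> str:
    """Three-digit readout: hundredths below 10 degrees, tenths below 100, whole numbers above."""
    if temp_c < 10:
        return f"{temp_c:.2f}"   # e.g. 9.87
    elif temp_c < 100:
        return f"{temp_c:.1f}"   # e.g. 67.3
    else:
        return f"{temp_c:.0f}"   # e.g. 102

print(boiler_display(9.872), boiler_display(67.34), boiler_display(101.6))  # 9.87 67.3 102
```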
1 cm³ of water weighs 1 g and needs 1 calorie to rise 1°C.
But calories are now obsolete and the unit is the joule.
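Those two facts slot together with simple arithmetic, if anyone wants the joules version (1 cal ≈ 4.184 J; the cup-of-water numbers below are just an illustration):

```python
# Heating water: mass (g) x temperature rise (C) x 1 cal/(g*C), then convert to joules.
CAL_TO_J = 4.184            # one calorie in joules (approximately)

mass_g = 250                # ~250 mL of water is ~250 g, since 1 mL weighs ~1 g
delta_c = 100 - 20          # warm it from 20 C to boiling

energy_cal = mass_g * delta_c      # 20,000 cal
energy_j = energy_cal * CAL_TO_J   # 83,680 J, call it ~84 kJ

print(energy_cal, energy_j)
```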
Removed by mod
This makes a lot of sense, and why I’d never survive in Canada.
As a Canadian idk why you’re using us as an example; we are wrong to do so and we blame Americans for giving us this bad habit.
I just see it positively and choose to believe you’re in the process of transitioning to enlightenment (metric). ;)
Outdoor temperature in °C, unless you’re talking about an outdoor pool then it’s often enough °F :-)
I think part of the reasons it’s so mixed might just be due to how many Amero-centric devices and parts are common between the two countries.
Y’all can take your shitty Phillips screws though. Robertson is by far superior ;-)
Imagine weighing people as big rocks, though.
Until the UK changes that, us Americans and Canadians can rest assured that nothing we are doing is quite that ridiculous.
Removed by mod
Most of Europe just uses metres for people’s height. 1.67m, like that. I have no mental picture of that, so it doesn’t work for me. But they don’t seem to have any trouble, further evidence that it’s all just what you know.
I find it weird that when measuring height in metric, people use cm exclusively. I’ve noticed this a lot actually: people will use cm or mm in places where it arguably doesn’t make any sense. I could see the justification for doing math maybe, but like, that defeats the whole point of it being metric, no?
Why is that defeating the whole point of being metric? If you know someone is 183 cm tall, you also know that they are 1.83 m tall. If it’s easier to say the length in cm, you do. No need for “one meter and eighty-three centimeters” or “one point eighty-three meters”, just “a hundred and eighty-three centimeters”. Often you just skip saying the “centimeters” part as well, because most people can see that you’re not the size of a skyscraper without getting a ruler out.
Removed by mod
As you pointed out previously, nobody uses decimeters, so x10 errors are not that common.
Removed by mod
To you. But you are aware that this is not the case for people (almost the rest of the world) who are using metric, right?
Removed by mod
Note to self: High heat levels make Canadians cranky.