• Wolf_359@lemmy.world · 10 months ago

    I can feel the difference between 71 and 73 in my house.

    At 73, my kids’ room is uncomfortably hot. At 71, it has a perfect chill for sleeping.

    • FooBarrington@lemmy.world · 10 months ago

      What is your point? That people who use Celsius can’t feel the difference between 21.7°C and 22.8°C?

      If you’re worried about your thermometer, you’ll be happy to hear that metric ones usually have finer precision than Fahrenheit ones, since they go in 0.5°C steps. A +1°F step is +5/9°C ≈ +0.56°C, so Fahrenheit actually gives you less precision!
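      A quick sketch of the arithmetic behind the numbers above, using the standard °F→°C formula (the rounding to one decimal matches the 21.7/22.8 figures in the thread):

```python
def f_to_c(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

print(round(f_to_c(71), 1))  # 21.7
print(round(f_to_c(73), 1))  # 22.8

# One 1 degF thermostat step expressed in degC:
print(round(5 / 9, 3))       # 0.556, slightly coarser than a 0.5 degC step
```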

      • Blue_Morpho@lemmy.world · 9 months ago

        The point was that they need that extra decimal because Celsius isn’t well matched to human temperature sense.

        It’s not that you’re prohibited from using decimals in Fahrenheit. It’s that you don’t need three digits, because the scale works better for people.

        And fuck you for making me defend the most ass backwards measurement system on the planet.

        • FooBarrington@lemmy.world · 9 months ago

          It’s just an incredibly weak defense. Why is it worse for Celsius to use an extra decimal for these differences? I can just as well argue that Celsius is a more accurate representation, because small differences in temperature show up as small differences in the number. Just like your argument, this is purely an opinion. Until you can show me that not needing the extra decimal is objectively better, or until I can show you that representing smaller differences as smaller is objectively better, neither argument holds any weight.

          • Blue_Morpho@lemmy.world · 9 months ago

            It’s the same reason we use abbreviations and contractions when speaking. A trivial simplification is still a simplification.

            Why bother with Celsius at all when there is Kelvin? Even Kelvin is arbitrary. Best to use Planck-normalized temperature: the scale would run from 0 to 100, where 0 is absolute zero and 100 is roughly 10^32 Kelvin.

            So whenever you have to tell someone the temperature outside, you say it’s 0.000000000000000000000000015237 Planck.

            If 3 digits isn’t a tiny bit more cumbersome than 2, then 32 digits is fine too.
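            For scale, a minimal sketch of the proposed 0–100 scale, assuming (as in the comment) that 100 corresponds to ~10^32 K; the actual Planck temperature is about 1.4×10^32 K, and the exact digits depend on the outdoor temperature assumed:

```python
PLANCK_APPROX_K = 1e32  # assumed top of the 0-100 scale, per the comment

def kelvin_to_planck_scale(k):
    """Map a temperature in Kelvin onto a 0-100 'Planck-normalized' scale."""
    return 100 * k / PLANCK_APPROX_K

# A mild 20 degC (293.15 K) day lands around 2.9e-28 on this scale,
# which is why everyday readings would need ~30 digits.
print(kelvin_to_planck_scale(293.15))
```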

            • FooBarrington@lemmy.world · 9 months ago

              We don’t have issues with decimals in many places. For example, why are there pennies? Why aren’t dollars just scaled up by 100? And generally speaking: why don’t people immediately shift to the smaller unit when talking about, e.g., 3.5 miles? If you were correct, those would be simplified too - yet they aren’t.

              Why bother with Celsius at all when there is Kelvin?

              Because Celsius is anchored to temperatures you encounter in everyday life.

              Even Kelvin is arbitrary. Best to use Planck-normalized temperature: the scale would run from 0 to 100, where 0 is absolute zero and 100 is roughly 10^32 Kelvin.

              Why? That scale is still arbitrarily chosen.