• FooBarrington@lemmy.world

    It’s just an incredibly weak defense. Why is it worse for Celsius to use an extra decimal place for these differences? I can just as well argue that Celsius is the more accurate representation, because small differences in temperature read as small numbers. Like your argument, this is purely an opinion: until you can show me that not needing the extra decimal place is objectively better, or I can show you that representing smaller differences as smaller numbers is objectively better, neither position holds any weight.

    • Blue_Morpho@lemmy.world

      It’s the same reason we use abbreviations and contractions when speaking. A trivial simplification is still a simplification.

      Why bother with Celsius at all when there is Kelvin? Even Kelvin is arbitrary. Best to use Planck-normalized temperature. The scale would run from 0 to 100, where 0 is absolute zero and 100 is 10^32 Kelvin.

      So whenever you have to tell someone the temperature outside, you’d say it’s 0.000000000000000000000000015237 Planck.
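
      For concreteness, a minimal sketch of that conversion, assuming a purely linear mapping where 0 on the scale is 0 K and 100 is 10^32 K (the function name here is made up for illustration):

          # Hypothetical linear scale: 0 = absolute zero, 100 = 1e32 K.
          def to_planck_scale(kelvin: float) -> float:
              return kelvin * 100.0 / 1e32

          # A mild day outside, about 22 °C (295.15 K):
          print(to_planck_scale(295.15))  # ~2.95e-28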

      If 3 digits isn’t a tiny bit more cumbersome than 2, then 32 digits is fine too.

      • FooBarrington@lemmy.world

        We don’t have issues with decimals in plenty of places. For example, why are there pennies? Why aren’t dollars just scaled up by a factor of 100? Generally speaking: why don’t people immediately shift to the smaller unit when talking about, e.g., 3.5 miles? If you’re correct, those should be simplified too - yet they aren’t.

        > Why bother with Celsius at all when there is Kelvin?

        Because Celsius is anchored to temperatures you encounter in everyday life (0 °C is where water freezes, 100 °C is where it boils).

        > Even Kelvin is arbitrary. Best to use Planck-normalized temperature. The scale would run from 0 to 100, where 0 is absolute zero and 100 is 10^32 Kelvin.

        Why? That scale is still arbitrarily chosen.