• starman2112@sh.itjust.works · 60 points · 1 year ago

    Fun fact! Y2K was going to be an awful nightmare for computers, but engineers and programmers managed to fix most of the potential issues with enough time to spare that most people didn’t even notice. Now it’s a widely held (but incorrect) belief that there was never anything to worry about in the first place. This story likely has no impact on our day-to-day lives, but thank a programmer for your local hospital’s computer system surviving the turn of the century.

    • tilcica@lemm.ee · 20 points · 1 year ago

      Fun fact! 32-bit systems were only fixed until 2038, because that’s when the signed 32-bit integer storing Unix epoch time will overflow, flipping it to the lowest representable negative number.

      And a lot of public stuff still runs on 32-bit systems.
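
      A minimal sketch of that wraparound (the 2038-01-19 03:14:07 UTC limit and the 1901-12-13 20:45:52 UTC wrap target are the well-known facts; the code itself is just an illustration):

      ```c
      #include <inttypes.h>
      #include <stdint.h>
      #include <stdio.h>

      int main(void) {
          /* Last second a signed 32-bit time_t can hold:
             2^31 - 1 seconds after the epoch = 2038-01-19 03:14:07 UTC. */
          int32_t t = INT32_MAX;
          printf("2038-01-19 03:14:07 UTC -> %" PRId32 "\n", t);

          /* One second later the bit pattern wraps. The addition is done in
             unsigned arithmetic to avoid signed-overflow UB; the converted
             value is what a wrapped counter would hold. */
          int32_t wrapped = (int32_t)((uint32_t)t + 1u);
          printf("one second later        -> %" PRId32 "\n", wrapped);
          /* -2147483648, which decodes as 1901-12-13 20:45:52 UTC */
          return 0;
      }
      ```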

      • redcalcium@lemmy.institute · 9 points · 1 year ago

        I’m sure we’ll come up with a shitty way to work around the issue later. A recent example is how IPv4 addresses were supposed to run out years ago, but thanks to shitty workarounds deployed by telcos, no one felt the need to migrate to IPv6, even though the workarounds make the internet more restrictive and shittier.

        • R0cket_M00se@lemmy.world · 1 point · 1 year ago

          Other than being annoying to deal with, how does PAT/Dynamic NAT make the Internet “more restrictive”?

            • R0cket_M00se@lemmy.world · 2 points · 1 year ago

              Damn, I need to reread my CCNA textbooks; apparently I’ve forgotten a few things about IPv6, since I’ve never worked anywhere it’s been used.

              • asbestos@lemmy.world · 1 point · 1 year ago

                Oh no, I was talking about IPv4; if it were IPv6 there wouldn’t be a need for providers to put people behind NAT.

          • redcalcium@lemmy.institute · 2 points · 1 year ago

            It actually takes power away from ordinary users and puts it in the hands of big corporations. It might sound ridiculous, but you’ll start to notice it if you compare how people used the internet 20 years ago vs. now. For example, it’s no longer possible to communicate with other people over the internet without going through an intermediary. Sending text, files, voice and video calls: all of it has to go through an intermediary to make sure your data gets through. Even modern p2p protocols require intermediaries in the form of STUN/TURN servers, or chances are high that the participants can’t reach each other.
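
            A concrete picture of that intermediary step, as a minimal sketch (an RFC 5389 Binding Request; stun.l.google.com:19302 is one commonly used public STUN server, and error handling is pared down):

            ```c
            #include <arpa/inet.h>
            #include <netdb.h>
            #include <stdio.h>
            #include <stdlib.h>
            #include <string.h>
            #include <sys/socket.h>
            #include <unistd.h>

            int main(void) {
                /* 20-byte STUN header: type 0x0001 (Binding Request), length 0,
                   magic cookie 0x2112A442, then a 12-byte transaction ID
                   (rand() is fine for a demo, not for production). */
                unsigned char req[20] = {0};
                req[1] = 0x01;
                req[4] = 0x21; req[5] = 0x12; req[6] = 0xA4; req[7] = 0x42;
                for (int i = 8; i < 20; i++) req[i] = rand() & 0xFF;

                struct addrinfo hints = {0}, *srv;
                hints.ai_family   = AF_INET;
                hints.ai_socktype = SOCK_DGRAM;
                if (getaddrinfo("stun.l.google.com", "19302", &hints, &srv)) {
                    fprintf(stderr, "DNS lookup failed\n");
                    return 1;
                }

                int fd = socket(AF_INET, SOCK_DGRAM, 0);
                sendto(fd, req, sizeof req, 0, srv->ai_addr, srv->ai_addrlen);

                unsigned char resp[512];
                ssize_t n = recvfrom(fd, resp, sizeof resp, 0, NULL, NULL);

                /* Walk the attributes looking for XOR-MAPPED-ADDRESS (0x0020):
                   the address/port the server saw, XORed with the magic cookie. */
                for (ssize_t off = 20; n >= 20 && off + 4 <= n; ) {
                    int type = (resp[off] << 8) | resp[off + 1];
                    int len  = (resp[off + 2] << 8) | resp[off + 3];
                    if (off + 4 + len > n) break;
                    if (type == 0x0020 && len >= 8) {
                        int port = ((resp[off + 6] << 8) | resp[off + 7]) ^ 0x2112;
                        printf("public mapping: %d.%d.%d.%d:%d\n",
                               resp[off + 8]  ^ 0x21, resp[off + 9]  ^ 0x12,
                               resp[off + 10] ^ 0xA4, resp[off + 11] ^ 0x42,
                               port);
                        break;
                    }
                    off += 4 + ((len + 3) & ~3);  /* attrs are 4-byte aligned */
                }
                close(fd);
                freeaddrinfo(srv);
                return 0;
            }
            ```

            Even this “look up my own address” step needs a third party, and actually getting packets to a peer behind another CGNAT can require relaying everything through a TURN server.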

            As an exercise, try to communicate (text, voice, video, file transfer, gaming) with a group of friends over the internet without using any 3rd-party service except DNS. It used to be a no-brainer; today it’s outright impossible if both parties are behind CGNAT, which is very likely (and almost certain if you live in a 3rd-world country, thanks to the disproportionate allocation of IP blocks favoring Western countries).

            Over the years, this has trained internet users to think the internet is unusable without an account on some tech giant’s online service. Imagine if this restriction didn’t exist: the internet might be less centralized today, the internet giants might not be as giant, and people might use more p2p tech to communicate with each other, with better privacy because less of their data would be captured by 3rd-party services.

            • R0cket_M00se@lemmy.world · 1 point · 1 year ago

              Over the years, this has trained internet users to think the internet is unusable without an account on some tech giant’s online service. Imagine if this restriction didn’t exist: the internet might be less centralized today, the internet giants might not be as giant, and people might use more p2p tech to communicate with each other, with better privacy because less of their data would be captured by 3rd-party services.

              My friend, that’s just laziness. Most people don’t want to know and don’t care to learn how to use technology. I don’t think we’d be in some Free and Open Internet era had NAT not been deployed; you’d still have people like us with tech knowledge splitting off into our own areas based on that ability and desire, while the herd flocked to the “do it for me” solutions provided by big tech.

              • redcalcium@lemmy.institute · 1 point · 1 year ago

                True, most people don’t out of laziness, but at least people who care would still have an alternative instead of the mess we have now. Also, in a parallel universe where the internet isn’t crippled, maybe 20 years of p2p development would have been enough to propel it to the point of mainstream usability, but I guess we’ll never know.

      • Ryantific_theory@lemmy.world · 7 points · 1 year ago

        Y2K: Passed ✅

        2038: “Wanna see me do it again?”

        Ha ha, well I have absolutely no faith that we will collectively solve that unless 32-bit systems stop working on their own before then. If Y2K happened again today, a handful of companies would be handed billions of dollars to fix everything, and it’d wind up half done with demands for more money.

        • redcalcium@lemmy.institute · 3 points · 1 year ago

          I’d love to be proven wrong, but I have a suspicion that it will be solved by good ol’ planned obsolescence: “Your device will no longer be supported after 2038. Buy this shiny new device and get a 10% discount with our coupon code: YAY2K38”

      • SuperDuper@lemmy.world · 1 point · 1 year ago

        If we were to update Unix time to be stored in a 64-bit signed integer, we could handle about 292 billion years in either direction of the 1970 epoch, over 584 billion years in total.
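
        The arithmetic checks out; here’s a back-of-the-envelope version in C (using the average Gregorian year of 365.2425 days as the conversion factor):

        ```c
        #include <stdint.h>
        #include <stdio.h>

        int main(void) {
            /* Average Gregorian year: 365.2425 days * 86400 s = 31,556,952 s. */
            const double SECONDS_PER_YEAR = 365.2425 * 86400.0;
            /* INT64_MAX seconds on each side of the 1970 epoch. */
            double years = (double)INT64_MAX / SECONDS_PER_YEAR;
            printf("~%.0f billion years in either direction\n", years / 1e9); /* ~292 */
            return 0;
        }
        ```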

        • tilcica@lemm.ee · 1 point · 1 year ago

          It depends on the platform, though not strictly on the CPU: a 32-bit CPU can still do 64-bit arithmetic in software, but on many 32-bit systems time_t was defined as a 32-bit signed long, and widening it breaks the ABI of every binary compiled against it. And Unix epoch time itself is just the number of seconds since 1970-01-01; you can’t “update” the epoch, only the width of the type that stores it.
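
          For what it’s worth, the storage width is a software choice rather than a hardware limit. A quick check, assuming a reasonably recent glibc (2.34+ added 64-bit time_t on 32-bit targets behind -D_TIME_BITS=64 together with -D_FILE_OFFSET_BITS=64):

          ```c
          #include <stdio.h>
          #include <time.h>

          int main(void) {
              /* On a 32-bit glibc target this prints 32 by default and 64 when
                 built with -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64; the CPU word
                 size isn't the obstacle, the ABI is. */
              printf("time_t here is %zu bits\n", sizeof(time_t) * 8);
              return 0;
          }
          ```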

    • andthenthreemore@startrek.website · 9 points · 1 year ago

      And it cost an estimated $200–600 billion in 1999 dollars. Adjusted for inflation, that’s roughly $370 billion to $1.1 trillion in 2023 money.

    • R0cket_M00se@lemmy.world · 6 points · 1 year ago

      It’s like the ATLA episode with the town that’s about to be destroyed by a volcano. The townspeople kept insisting the prophecy was true, even though without Aang they’d all be dead.

      It’s not that the threat didn’t exist; it’s that people worked night and day on things like energy-infrastructure SCADA networks and other critical areas so that we wouldn’t lose a bunch of shit during the rollover.

      Some stuff still did go down, but it was mostly government systems, and thankfully small enough that it didn’t really matter.