  • teft@lemmy.world · 1 day ago

    The ‘san’ in SanDisk is a replacement for ‘sun’, not for ‘sans’. They basically didn’t want to be confused with Sun Microsystems, so they renamed themselves.

    • adarza@lemmy.ca · 1 day ago (edited)

      that is true; it was originally ‘sundisk’. the ‘new’ name (from 1995), though, is a rather clever double play on words:

      sandisk -> sans disk (no disk)
      sandisk -> sand disk (made from sand)

  • smokebuddy [he/him]@lemmy.today · 1 day ago

    I realized it when I bought a WD_Black SSD and the chips read SanDisk, so I looked it up.

    (Excuse the lil’ cat hair there, or don’t)

    • TrickDacy@lemmy.world · 1 day ago

      > (Excuse the lil’ cat hair there, or don’t)

      Resisting the urge to blow it away even though it’s a photo 😁

  • adarza@lemmy.ca · 1 day ago

    western digital bought sandisk ~8 years ago as a form of self-preservation. their core business of mechanical hard drives was already in decline, being displaced by solid-state flash storage.

    • Steak@lemmy.ca · 1 day ago

      I have an old PC, built at least 6 years ago, maybe more. It has a 1080 FTW2 GPU in it, and that still plays everything at 1080p and 120-144 fps, no problems at all. I only have a 1 TB HDD and a 240 GB SSD. The SSD has Windows and then one or MAYBE two games I’m playing a lot on Steam. Everything else is on the HDD. It’s kinda just getting to the point where it’s annoying and I want more SSD space. But I’ve made it this far. Whenever I upgrade in the next few years it’s gonna hurt. But I honestly don’t think I’ll go to 4K. Maybe 1440. 4K just doesn’t seem worth it and doesn’t look that much better, if at all really. I think we peaked at 1080p and 24-inch computer monitors. Sorry lol

      • fuckwit_mcbumcrumble@lemmy.dbzer0.com · 2 hours ago

        1080p at 24” looks kinda like butt. My last monitor was 1440p at 25”, and I thought that was perfection. 2x anti-aliasing was all you needed to make everything look nice, and it wasn’t hard to run. 4k isn’t worth it unless you have a REALLY big screen. It takes so much more power to draw 4k vs 1440p, and at regular monitor sizes it doesn’t really look that much better.

        Also my 1080 absolutely struggled to play games, even at 1080p (what I’d have to revert to when it couldn’t run them at 4k). I tried Battlefield 2042 but struggled to even get 60 fps in that. This Reddit post from 3 years ago mirrors that, so I know it’s not just me. And looking at Black Ops 6, the 1080 would struggle to run that at over 100 fps on anything other than the bare minimum settings.
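
        The “so much more power to draw 4k vs 1440p” point checks out on pixel count alone. A minimal back-of-the-envelope sketch (Python; it assumes render cost scales roughly linearly with pixels per frame, which is only a first-order approximation):

        ```python
        # Rough pixel-count comparison for the resolutions discussed above.
        # Assumption: GPU render cost scales roughly linearly with pixels
        # per frame (a first-order approximation only).
        resolutions = {
            "1080p": (1920, 1080),
            "1440p": (2560, 1440),
            "4k":    (3840, 2160),
        }

        base = resolutions["1440p"][0] * resolutions["1440p"][1]
        for name, (w, h) in resolutions.items():
            px = w * h
            print(f"{name}: {px:>9,} pixels ({px / base:.2f}x the work of 1440p)")
        # 4k comes out to 2.25x the pixels of 1440p, and 4x those of 1080p.
        ```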

      • glimse@lemmy.world · 1 day ago

        Resolution matters a lot less than pixel density. 1080p above 24" looks like shit, so if you’re happy with your monitor size, there’s definitely no need to upgrade.
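
        To put numbers on “pixel density”: PPI is just the diagonal pixel count divided by the diagonal size in inches. A minimal sketch for the setups mentioned in this thread (Python; the 27" panel size for 4k is an assumed common size, not something from the thread):

        ```python
        import math

        # PPI (pixels per inch) = diagonal resolution in pixels / diagonal in inches.
        def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
            return math.hypot(width_px, height_px) / diagonal_in

        setups = [
            ("1080p @ 24in", 1920, 1080, 24.0),
            ("1440p @ 25in", 2560, 1440, 25.0),
            ("4k    @ 27in", 3840, 2160, 27.0),  # assumed size, not from the thread
        ]
        for label, w, h, d in setups:
            print(f"{label}: {ppi(w, h, d):.0f} PPI")
        # ~92 PPI, ~117 PPI, and ~163 PPI respectively.
        ```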

  • j4k3@lemmy.world · 1 day ago

    That’s interesting, but most SD cards are based on the old Intel 8051 microcontroller running in the background, while Bill Mensch’s Western Design Center is behind the 6502, the processor typically running in the background of hard drives.