• lemillionsocks@beehaw.org · 1 year ago

    We’ve already got numerous examples of how these AI models and face-recognition systems have biases, or are fed data that accidentally carries a racial bias. It’s not a stretch of the imagination to see how this can go wrong.

    • Scrubbles@poptalk.scrubbles.tech · 1 year ago

      Yep, the age-old “garbage in, garbage out”. If policing had a perfect track record we could just feed in all the historical cop data, but we know for a fact that the poor and PoC are stopped more often than others. Feed that data into an AI and it will learn those same biases.
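
      The “garbage in, garbage out” point above can be sketched with a toy simulation (all numbers are hypothetical, invented purely for illustration): two neighborhoods with the same underlying offense rate, where one is simply stopped more often. A naive model trained on the resulting stop records “learns” that the over-policed neighborhood is riskier.

      ```python
      import random

      random.seed(0)

      # Hypothetical setup: both neighborhoods have the SAME offense rate,
      # but neighborhood B is stopped three times as often as A.
      true_offense_rate = {"A": 0.05, "B": 0.05}
      stop_rate = {"A": 0.1, "B": 0.3}  # biased data collection

      # Generate "historical stop records" — the data a naive model trains on.
      residents = 10_000
      records = []
      for hood in ("A", "B"):
          for _ in range(residents):
              if random.random() < stop_rate[hood]:  # resident gets stopped
                  offended = random.random() < true_offense_rate[hood]
                  records.append((hood, offended))

      # A frequency model scores each neighborhood by stops per resident.
      # B comes out ~3x "riskier" — purely an artifact of over-policing.
      risk_score = {
          h: sum(1 for r, _ in records if r == h) / residents
          for h in ("A", "B")
      }
      print(risk_score)
      ```

      The model never sees the equal offense rates, only the biased stop records, so retraining on its own predictions would only widen the gap (the feedback loop often raised in predictive-policing critiques).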