Code used in the analysis is here

  • CameronDev@programming.dev · 8 months ago

    Except that they are now protected by the “We aren't racist, the algorithm did it” defence, so realistically, only us plebs will lose.

    • gedaliyah@lemmy.world · 8 months ago

      The courts have already established that the user is still responsible for the tool, even if the tool is very sophisticated.

      • CameronDev@programming.dev · 8 months ago

        Have they? There is the Air Canada thing, but that was kind of a different situation: the chatbot was explicitly acting for the company and made direct claims on the company's behalf.

        IANAL, but proving discrimination was already hard, and now they can just point at the black box and blame it, so it's gonna get harder?

        • T156@lemmy.world · 8 months ago (edited)

          IANAL, but proving discrimination was already hard, and now they can just point at the black box and blame it, so it's gonna get harder?

          Especially if it gets rolled into other checks, like a police check or a “personality fit”, which makes it even more ambiguous.

      • CameronDev@programming.dev · 8 months ago

        Yeah, it really is. On the upside, if you get rejected from a company that doesn't even have the time to manually review your CV, that might be a blessing in disguise.