• r3df0x@7.62x54r.ru · 8 months ago

    Someone would have made one eventually. Unless the government monitors every computer in existence, AI is inevitable.

    • JackGreenEarth@lemm.ee · 8 months ago

      And just to make it clear, we should not give the government the ability to monitor every computer in existence, or even any computer not owned by them.

      • spujb@lemmy.cafe · 8 months ago

        also, there are absolutely other ways to regulate technology, especially since it’s a tech that’s being bought and sold.

        “monitor every computer” is emphatically not the only solution, and it’s weird that they suggested that lol

      • r3df0x@7.62x54r.ru · 8 months ago

        That’s why AI is inevitable without a massive surveillance state.

    • spujb@lemmy.cafe · 8 months ago

      it’s not the “making one” that’s a problem. it’s the making, optimizing and rabid marketing of one in the service of capital instead of humans.

      if only a bunch of open source, true non-profits released language models, the landscape might still suck but would be distinctly less toxic.

      and if the government (or even a decently sized ngo standards entity) had worked proactively with computer scientists to find solutions like watermarking, labor replacement protections, and copyright protections, things might be arguably perfect. not one of those things happened and so further into the hellscape we descend.