The Trump administration recently published “America’s AI Action Plan”. One of the first policy actions in the document is to eliminate references to misinformation, diversity, equity, inclusion, and climate change from NIST’s AI Risk Management Framework.

Lacking any sense of irony, the very next point states that LLM developers should ensure their systems are “objective and free from top-down ideological bias”.

Par for the course for Trump and his cronies, but the world should know what kind of AI the US wants to build.

  • brianpeiris@lemmy.caOP · 3 days ago

    I think the bipartisan TAKE IT DOWN bill has had, and will continue to have, a substantial effect on the proliferation of open source deepfake models. Sure, a tech-savvy individual will still be able to download a model themselves and do whatever, but that is significantly different from having deepfake services readily available for millions to use. Is it absolute enforcement? No, but it has a substantial effect on the world.

    • Tony Bark@pawb.social · 3 days ago

      How did we get to talking about deepfakes? They’re trying to stop honest discussions about misinformation, diversity, equity, inclusion, and climate change.

      • brianpeiris@lemmy.caOP · 3 days ago

        I was giving an example of regulation that has an effect on open source AI.

        • Tony Bark@pawb.social · 3 days ago

          Fair enough. That being said, deepfake services don’t need to be open source. Anything presented to the masses is obviously going to face enforcement, but that doesn’t necessarily translate back to the open source supply chain.

          • brianpeiris@lemmy.caOP · 3 days ago

            I’m not sure if this has happened yet, but in theory the TAKE IT DOWN act could be used to shut down an open source deepfake code or model repository. In that case you’re right that copies will spring up, but I think it is significant that popular projects could be taken down like that.

    • explodicle@sh.itjust.works · 2 days ago

      At this point any idiot can buy drugs on the dark web. We know this because many do it wrong. The same will apply to deepfake websites.