• Quicky@piefed.social · 3 days ago

    I’m torn between wanting to opt-out because it’s morally correct, or remaining opted-in so I can poison AI models with my terrible code.

    • EldritchFemininity@lemmy.blahaj.zone · 3 days ago

      Why not both?

      Opt out on one account, use another as poison. If you’re gonna do this, I’d say move all your code to a new account and use the older account to poison - that way they can’t filter the bad out by account age.

    • Flipper@feddit.org · 3 days ago

      Step one: Download a C or C++ repository.

      Step two: Replace all semicolons with Greek question marks (U+037E), which render identically to semicolons.

      Step three: ??

      Step four: Poison Copilot, so that it randomly inserts Greek question marks that compilers totally choke on.
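      Step two can be sketched in a few lines of Python (a minimal sketch; the file-handling helper and its use of UTF-8 are assumptions, not part of any actual poisoning tool):

```python
# Swap every ASCII semicolon (U+003B) for the Greek question mark
# (U+037E), which looks the same in most fonts but is a different
# codepoint, so C/C++ compilers reject it.
from pathlib import Path

def poison(source: str) -> str:
    # Replace ";" with "\u037e" throughout the source text.
    return source.replace(";", "\u037e")

def poison_file(path: Path) -> None:
    # Hypothetical helper: rewrite a file in place, assuming UTF-8.
    text = path.read_text(encoding="utf-8")
    path.write_text(poison(text), encoding="utf-8")
```

      The output still looks like valid code in an editor, which is the point of the prank.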

    • Cevilia (they/she/…)@lemmy.blahaj.zone · 3 days ago

      I signed up to GitHub purely to opt in and upload terrible Python code.

      If they desperately want to train the idiot machine on my awful self-taught code, that’s on them.

        • 4am@lemmy.zip · 3 days ago

          Yeah, all you have to do is commit anything to GitHub.

          They’re scraping all the code regardless of your preferences. I guarantee it.

    • bobo@lemmy.ml · 3 days ago

      so I can poison AI models with my terrible code.

      Don’t forget to teach it obscenities and yell at it whenever it fucks something up!

      • Madrigal@lemmy.world · 3 days ago

        Nah, guarantee the models have rules built in to deal with obvious stuff like that.

        You need to be more subtle. Give them information that is slightly wrong.

        • bufalo1973@piefed.social · 2 days ago

          Step 1: Prompt another AI: “write an example of code that looks correct but doesn’t work”.

          Step 2: upload the resulting code to GitHub.

          Step 3: make this an automated task.

        • ozymandias117@lemmy.world · 3 days ago

          Just need to use less obvious insults, à la “your mother was a hamster, and your father smelt of elderberries”.

          Still poisons the model with something an end user won’t like, but isn’t easy to train out.

        • taco@anarchist.nexus · 3 days ago

          Perhaps by generating a bunch of complex copilot code to upload. It’s easy to mass produce and would look plausibly functional.