Found in NeMo.

  • @[email protected]
    31 points · 4 months ago

    This feels like suing gun manufacturers over murder. They made the tool but they’re not the ones responsible for the crime.

        • @[email protected]
          14 points · 4 months ago

          I doubt the kitchen knife industry is worried about getting sued, even though knives are readily available for use in stabbings.

          • Saik0
            13 points · 4 months ago

            Yeah, that “unlike other industries” line is a crock. Nobody goes after car manufacturers for all those deaths either, unless there’s an actual flaw in the car itself that caused them.

            Hell, we barely go after the pharmaceutical industry, and they KNOWINGLY cause harm when they downplay all the risks their products carry.

            • @[email protected]
              4 points · 4 months ago

              +1 for the pharmaceutical industry. Those motherfuckers are actively causing harm. They pay doctors to overprescribe, which leads to addiction, which leads to more pharmaceutical sales. They are basically on par with the tobacco industry.

    • @[email protected]
      10 points · 4 months ago

      It’s more like charging the iron ore mining companies over gun murders.

      NVIDIA doesn’t have any say over how their GPUs are used.

      • @locknessmeownster
        4 points · 4 months ago

        No offense, but the article is not about Nvidia GPUs; it’s about their own AI model, NeMo. Literally the first line of the report. In this case, they definitely have a say in what training data gets used in the model.
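
        For anyone wondering what “having a say in the training data” actually looks like, here’s a rough Python sketch. This is not NVIDIA’s real NeMo code and the folder names are made up; the point is just that the training corpus is an explicit list somebody assembles before training ever starts.

        ```python
        # Illustrative sketch only -- not NeMo's actual API; folder names are hypothetical.
        from pathlib import Path

        # Whoever builds the model picks these sources. Nothing about the hardware
        # forces copyrighted books into the list; it's a deliberate choice.
        CHOSEN_SOURCES = [
            Path("corpora/public_domain/gutenberg"),
            Path("corpora/licensed/news_archive"),
        ]

        def collect_training_texts(sources):
            """Gather raw text files from the folders the builder decided to include."""
            texts = []
            for folder in sources:
                for path in sorted(folder.glob("**/*.txt")):
                    texts.append(path.read_text(encoding="utf-8"))
            return texts

        if __name__ == "__main__":
            corpus = collect_training_texts(CHOSEN_SOURCES)
            print(f"{len(corpus)} documents selected for training")
        ```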

    • @[email protected]
      6 points · 4 months ago

      Creating material that is copyright infringement is not a desired output; the purpose of guns is to kill (when used). AI manufacturers depend on copyrighted material to “train” the AI, and the method of creation makes it more likely to infringe.

      • @[email protected]
        1 point · 4 months ago

        Creating material that is copyright infringement is not a desired output

        Agreed.

        the purpose of guns is to kill (when used).

        “Guns” is a term with varied definitions, and not all of them are intended to kill. There are rubber bullets, airsoft, small-caliber, and even paintball guns. These MAY be lethal but were made with other goals in mind.

        Nvidia, on the other hand, made GPUs for applications that revolve around video; the G literally stands for graphics. Some people found out that they are also efficient at other tasks, so Nvidia made a new line of products for that workload because it was more lucrative. Gamers usually only buy 1 graphics card per machine; a few years ago some would even buy up to 3. In contrast, AI researchers/architects/programmers buy as many as they can afford and constantly buy more. This has pushed Nvidia to change their product stack to cater to the more lucrative customer.

        AI manufacturers depend on copyrighted material to “train” the AI

        With everything I said, these AI creators CHOOSE what to feed into these new tools. They could choose to input things in the public domain, or even paid, licensed content, but instead using copyrighted and pirated content is the norm. That is because this is a new field and we are collectively learning where the boundaries are and what is considered acceptable and legal.

        Reddit recently signed a deal to license its data (user-generated content like posts and comments) for use with AI generation. Other companies are using internal data to tailor their AIs to solve field-specific problems. The problem is that AI, just like guns, is a broad term.

        the method of creation makes it more likely to infringe.

        Nvidia has given us the tools, but until we define what is considered acceptable, these kinds of things will be inevitable. I do believe that the authors had their copyrights infringed, but they are also going after the wrong people. There have been reports of AI spitting out full books on command, clearly proving that those works were used to train (a rough sketch of that kind of test is at the end of this comment). The authors should be going after the creators of those specific AIs, not Nvidia.

        There is a long and bumpy road ahead.
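
        To make the “spitting out full books” test concrete, here is a rough sketch. It uses the small, publicly available gpt2 model from the transformers library purely as a stand-in (the reports in question involve much larger commercial models), and the excerpt is a public-domain line from A Tale of Two Cities standing in for a suspect text:

        ```python
        # Rough sketch of a verbatim-regurgitation check -- stand-in model and text.
        from transformers import pipeline

        generator = pipeline("text-generation", model="gpt2")

        # Prompt with the opening of a work you suspect was in the training data,
        # then compare the model's continuation against the real text.
        prompt = "It was the best of times, it was the worst of times,"
        known_continuation = "it was the age of wisdom, it was the age of foolishness"

        output = generator(prompt, max_new_tokens=30, do_sample=False)[0]["generated_text"]
        continuation = output[len(prompt):].strip()

        # A verbatim match is strong evidence the work was in the training corpus.
        print("Model continuation:", continuation)
        print("Verbatim match:", continuation.startswith(known_continuation[:40]))
        ```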

        • @[email protected]
          1 point · 4 months ago

          I don’t like Nvidia, as they’re anti-consumer and anti-software-freedom, so that’s my bias here. In my country guns are not a legal right and I don’t think they should be legalised. That said, guns have a purpose: to stop people who can’t reasonably be stopped non-violently.

          Nvidia is creating AI cards with the expectation that people will train models with copyrighted works. Maybe most generated art isn’t infringing (or rather, isn’t caught infringing). I’m interested to see how much responsibility they are found to share, but I’m confident it won’t stop people from doing it.

          I’m not aware of people training AI using only public-domain works, and even in that case I’d have further objections regarding the impact on artists.