Blade Runner director Ridley Scott calls AI a “technical hydrogen bomb” | “we are all completely f**ked”

  • bionicjoey@lemmy.ca · 109 points · 1 year ago

    I’m sure that a film director is an expert on the technical underpinnings of large language models, which primarily are used to generate blocks of text that have the appearance of being coherent.

    • PerogiBoi@lemmy.ca · 37 points · 1 year ago

      Several departments where I work had massive layoffs in favour of implementing customized GPT-4 chatbots (both client-facing services and internal stuff). That’s just the LLM end of AI.

      That’s not even considering the generative image side of AI. I fear for my company’s graphics, web design, and UX/UI teams, who will probably be gone this time next year.

      • M500@lemmy.ml · 29 points · 1 year ago

        I work freelance and occasionally needed to partner with artists and other specialists. But I now use various “AI” tools and no longer need to pay people for that work, as the computer can do it well enough.

        I’m not some millionaire, I’m just a guy trying to save money to buy a house one day, so on my own it’s not a large economic impact, but I can’t be the only one.

      • jackalope@lemmy.ml · 14 points · 1 year ago

        UX is not about drawing pictures. That work is already automated by UI kits anyway. UX is about thinking through requirements and research.

        • PerogiBoi@lemmy.ca · 13 points · 1 year ago

          I know very well what UX is, having studied it as my major in uni. Senior executives do not know what it is, and they have made and are making decisions to “replace” UX teams with LLMs and “prompt engineers”. I see it daily at work.

          There is a great disconnect: hiring managers and executives see LLMs as a quick win and make cost-cutting moves without doing any analysis.

      • remus989@sh.itjust.works · 6 points · 1 year ago

        I can tell you now that AI won’t come for UX/UI teams, at least not in the near future. Clients are rarely able to articulate what they need out of software, and until AI is smart enough to suss that out, we’re good. That being said, I’m sure there will be companies that try to go that route, but I doubt it will work, again, in the near term.

        • PerogiBoi@lemmy.ca · 3 points · 1 year ago

          I’m not saying that AI will come for UX/UI teams someday.

          It already is. As you said, AI is not smart enough to properly replace UX/UI teams, but managers, executives, and C-suite people don’t understand that. AI has been sold to them as a quick win that lowers costs. To give you an example, 3 members of our CX team were replaced by an annual license for Enterprise GPT-4 and some custom training on business material. In the last 2 months so much of it has broken down or not worked well, and clients have complained, so now we are subcontracting a firm in Bangalore to try to fix it. Pretty sure we’ve exceeded those 3 people’s salary costs by now.

    • bh11235@infosec.pub · 18 points · 1 year ago

      Jules Verne wasn’t a technical expert either, but here we are somehow. Don’t underestimate a keen and observant imagination.

      • SkaveRat@discuss.tchncs.de · 12 points · 1 year ago

        “They’re not even really AI.”

        Sigh. Can we please stop this shitty argument?

        They are AI, in a very broad sense. They’re just not AGI.

        • Mahlzeit@feddit.de · 4 points · edited · 1 year ago

          So much this. Most people under 40 must have grown up with video games. Shouldn’t they have noticed at some point that the enemies and NPCs are AI-controlled? Some games even say that in the settings.

          I don’t see the point in the expression “AGI” either. There’s a fundamental difference between the if-else AI of current games and the ANNs behind LLMs. But there is no fundamental change needed to make an ANN-AI that is more general. At what point along that continuum do we talk of AGI? Why should that even be a goal in itself? I want more useful and energy-efficient software tools. I don’t care if it meets any kind of arbitrary definition.
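
          To make the if-else point concrete, here’s a minimal, hypothetical sketch (the function name and thresholds are made up) of the kind of hand-written rule AI that games have shipped for decades. An ANN-based agent keeps the same observations-in, action-out interface but computes the action from learned weights instead of rules, which is why it sits on the same continuum:

          ```python
          # Hypothetical "if-else AI": enemy behaviour as a handful of hand-written rules.
          def npc_action(distance_to_player: float, health: float) -> str:
              if health < 0.2:
                  return "flee"    # rule 1: survive when badly hurt
              if distance_to_player < 5.0:
                  return "attack"  # rule 2: engage when the player is close
              return "patrol"      # rule 3: default behaviour

          print(npc_action(distance_to_player=3.0, health=0.9))  # -> "attack"
          # An ANN-based agent would map the same inputs to an action via learned
          # weights rather than hard-coded thresholds; the interface is identical.
          ```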

          • FishFace@lemmy.world · 6 points · 1 year ago

            It’s never going to go away. AI is like the “god of the gaps” - as more and more tasks can be performed by computers to the same or better level compared to humans, what exactly constitutes intelligence will shrink until we’re saying, “sure, it can compose a symphony that people prefer to Mozart, and it can write plays that are preferred over Shakespeare, and paint better than van Gogh, but it can’t nail references to the 1991 TV series Dinosaurs so can we really call it intelligent??”

      • Not_mikey@lemmy.world · 4 points · 1 year ago

        “they’re a particularly beefed-up auto complete”

        Saying this is like saying you’re a particularly beefed-up bacterium. In both cases the two things share the same basic objective (survive and reproduce for you and the bacterium; guess the next word for the LLM and autocomplete), but the former is vastly more complex in how it achieves that goal.
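
        To make “guess the next word” concrete, here’s a toy sketch (made-up corpus, hypothetical names) of the autocomplete end of that shared objective. An LLM is trained on the same next-token objective, just with billions of learned parameters instead of a frequency table:

        ```python
        from collections import Counter, defaultdict

        # Count which word follows which in a tiny, made-up corpus.
        corpus = "the cat sat on the mat and the cat slept".split()

        next_word_counts = defaultdict(Counter)
        for prev, nxt in zip(corpus, corpus[1:]):
            next_word_counts[prev][nxt] += 1

        def autocomplete(prev_word: str) -> str:
            # Phone-keyboard style: suggest the word that most often followed prev_word.
            return next_word_counts[prev_word].most_common(1)[0][0]

        print(autocomplete("the"))  # -> "cat"
        # An LLM optimises the same "predict the next token" objective, but with a
        # vastly more complex model than a lookup table.
        ```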

    • erwan@lemmy.ml · 2 points · 1 year ago

      Yes, I thought he was talking about the film industry (“we’re fucked”) and how AI is or would be used in movies, in which case he would be competent to talk about it.

      But he’s just confusing science fiction with reality. Maybe all those ideas he’s got will make good movies, but they’re poor predictions.

      • LwL@lemmy.world · 6 points · 1 year ago

        You kinda do, as anyone in tech who has ever had to communicate with customers can attest.