• Lvxferre · 3 points · 1 month ago (edited)

    Good luck now selling those products, given the negative sentiment around products advertised as having “AI”.

    • Zos_Kia · 2 points · 1 month ago

      Eh, that one study was mostly about stupid products like “AI coffee machine” or “AI fridge”. AI products that make sense sell pretty well.

      • Lvxferre · 1 point · 1 month ago

        At least acc. to TechSpot, the negative sentiment is general. It’s just more pronounced for some products (high risk and/or price) than others.

        So even where plopping an LLM or similar would make sense, there’ll likely be strong market resistance.

        The fact that plenty of actually sensible products are plagued with issues due to GAFAM disingenuousness/stupidity doesn’t help either. (See: Windows Recall, Google abusing its search monopoly to feed its AI, etc.)

        • Zos_Kia · 1 point · 1 month ago

          > At least acc. to TechSpot, the negative sentiment is general. It’s just more pronounced for some products (high risk and/or price) than others.

          That’s not what the study says. I’m no AI-hater but I would sure stay away from an AI car or medical diagnosis. Those products make absolutely no sense.

          > So even where plopping an LLM or similar would make sense, there’ll likely be strong market resistance.

          I work tangentially to the industry (not making models, not making apps based on models, but making tools to help people who do), and that is not what I observed. Just like in every market, products that make sense make fucking bank. It’s mostly boring B2B stuff which doesn’t make headlines, but there is some money being made right now, with some very satisfied customers.

          The “market resistance” story is anecdotal clickbait.

          • Lvxferre · 1 point · 30 days ago

            > At least acc. to TechSpot, the negative sentiment is general. It’s just more pronounced for some products (high risk and/or price) than others.

            > That’s not what the study says.

            Thanks for linking the study itself. I was having a hard time finding it, which is why I relied on press coverage, even with all its associated small inaccuracies.

            Look at Study 3b:

            > More specifically, the indirect effect through emotional trust was significant for the high-risk product (indirect effect = −0.656, SE = 0.163, 95% CI = [−0.979, −0.343]), but not significant for the low-risk products (indirect effect = −0.099, SE = 0.143, 95% CI = [−0.383, 0.181]), which provided further support for hypothesis-3.

            With H₃ being “Perceived product or service risk moderates the indirect effect of inclusion of the AI term in the product or service description on purchase intention, mediated by emotional trust.”

            What they’re saying here is neither the same as my statement (as what I said implies that the effect would remain significant even for low-risk products) nor the same as yours (as you implied that the effect is not general - it is, but modulated by another factor).
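
            To make that reading concrete: the indirect effect counts as significant when its bootstrap 95% CI excludes zero. Here’s a toy check in Python that just plugs in the figures quoted above (an illustration only, not the study’s own analysis code):

            ```python
            # Toy check of the quoted figures (illustration only, not the study's analysis):
            # an indirect effect is read as significant when its 95% CI excludes zero.
            effects = {
                # product risk: (indirect effect, SE, (CI lower, CI upper)), as quoted
                "high-risk": (-0.656, 0.163, (-0.979, -0.343)),
                "low-risk": (-0.099, 0.143, (-0.383, 0.181)),
            }

            for risk, (est, se, (lo, hi)) in effects.items():
                significant = hi < 0 or lo > 0  # zero lies outside the interval
                print(f"{risk}: indirect effect = {est} (SE = {se}), "
                      f"95% CI = [{lo}, {hi}] -> "
                      f"{'significant' if significant else 'not significant'}")
            ```

            Running it flags the high-risk CI as significant (it lies entirely below zero) and the low-risk one as not (it straddles zero), which is exactly what the quoted passage reports.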

            > It’s mostly boring B2B stuff

            That’s a fair point: if they’re marketing it to businesses, the attitude is bound to be different from marketing it to end customers.

    • @[email protected]
      link
      fedilink
      129 days ago

      Smartphones were a joke until the iPhone. There will be some resistance, but someone will crack the code to make it easily acceptable, and then the race will be on with AI products.

      • Lvxferre · 1 point · 29 days ago

        I do think that some technologies currently being marketed as AI might eventually become features of popularly used products. However:

        1. I don’t think that they’ll come from the so-called “AI companies”. As someone on HN commented, the idea of an “AI company” is as ridiculous as that of a “Python company”.
        2. I don’t think that they’ll be marketed as AI, but as something else.

        As such I partially agree with your conclusion (although I’m not too eager to assert certainty about future events). The reasoning that you used to back it up (the analogy with smartphones) is bad though, due to survivorship bias.