• @[email protected]
    43 · 5 months ago

    So Apple is paying them in exposure? First time I’ve ever seen that where it might actually be worth something.

    • @[email protected]
      31 · 5 months ago

      It’s worth something more often than you’d think; it’s just usually hawked by random “influencers” with a paltry 50k followers or some shit. That kind of exposure is worthless.

      I’ve heard from photographers and artists that they will consider exposure offers, but only if the one offering has actual status: a major brand, a near-celebrity, or a top-50 social media “influencer” with millions upon millions of followers.

      And they usually want a contract, i.e. you have to pin our work for X days, you have to tag us, etc.

    • Rhaedas
      5 · 5 months ago

      And Apple gets more usage. It’s a win-win for both companies.

    • @[email protected]
      13 · 5 months ago

      What data? The data that the user affirmatively agrees to send them that is anonymized? That data?

      • @[email protected]
        21 · 5 months ago

        I’m sure you understand this, but “anonymized” doesn’t mean the data can’t be deanonymized. Given the right kind of data, or enough context, they can figure out who you are fairly quickly.

        Ex: You could “anonymize” GPS traces, but they would still show the house you live in and where you work unless you strip out a lot of the info.

        http://androidpolice.com/strava-heatmaps-location-identity-doxxing-problem/

        Now with LLMs, sure, you could “anonymize” which user said or asked for what… but if something identifying is sent in the request itself, it won’t be hard to deanonymize that data.
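
        To make that concrete, here’s a toy sketch (made-up coordinates and a crude heuristic of my own, nothing Strava- or Apple-specific): strip every name and ID off a GPS trace and the densest cluster of night-time points still lands on someone’s house.

        ```python
        from collections import Counter

        # Hypothetical "anonymized" trace: (hour_of_day, lat, lon), no user ID attached.
        trace = [
            (2,  47.6101, -122.2011),   # night-time points -> home
            (3,  47.6102, -122.2012),
            (23, 47.6103, -122.2013),
            (10, 47.6205, -122.3493),   # daytime points -> workplace
            (14, 47.6204, -122.3492),
        ]

        def likely_home(points):
            """Most common night-time location, rounded to roughly a city block."""
            night = [(round(lat, 3), round(lon, 3))
                     for hour, lat, lon in points
                     if hour < 6 or hour >= 22]
            return Counter(night).most_common(1)[0][0] if night else None

        print(likely_home(trace))  # (47.61, -122.201) -> the "anonymous" user's house
        ```

        That’s basically what the Strava heatmap mess linked above was, just at scale.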

        • @[email protected]
          3 · 5 months ago

          So you would rather submit your non-anonymized data? Because those bastards will find a way to deanonymize it. Is Apple doing the right thing or not?

          • @[email protected]
            9 · 5 months ago

            What? No. I would rather use my own local LLM where the data never leaves my device. And if I had to submit anything to ChatGPT I would want it anonymized as much as possible.

            Is Apple doing the right thing? Hard to say; any answer here will just be an opinion. There are pros and cons to this decision, and it’s up to the end user to decide whether the benefits of using ChatGPT are worth the cost of their data. I can see some useful use cases for this tech, and I don’t blame Apple for wanting to strike while the iron is hot.

            There’s not much you can really do to strip identifying data out of prompts/requests made to ChatGPT. Any anonymization of that part of the data is on OpenAI to handle.
            Apple can obfuscate which user is asking for what, as well as specific location data, but if I’m using the LLM and I tell it to write up a report that includes my full name in the prompt/request… all of that goes directly into OpenAI’s servers and logs, which they can eventually use to refine/retrain their model.
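
            For what it’s worth, here’s roughly why prompt-side scrubbing is a losing game (hypothetical sketch, not anything Apple or OpenAI actually ships): a redactor can catch structured stuff like emails and phone numbers, but it has no idea that a name typed into free text is the sender’s real identity.

            ```python
            import re

            # Illustrative patterns only; real PII detection is much harder than this.
            PATTERNS = {
                "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
                "PHONE": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
            }

            def scrub(prompt: str) -> str:
                """Mask obviously structured identifiers; free-text identity leaks remain."""
                for label, pattern in PATTERNS.items():
                    prompt = pattern.sub(f"[{label}]", prompt)
                return prompt

            prompt = ("Write up my weekly report and sign it Jane Doe, "
                      "then email a copy to [email protected] or text +1 555 010 1234.")
            print(scrub(prompt))
            # The email and phone get masked, but "Jane Doe" goes straight to the API.
            ```

            Which is why the only real fix is keeping the model local, or accepting that whatever you type ends up in OpenAI’s logs.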

              • @[email protected]
                8 · 5 months ago

                IIRC they demonstrated an interaction with Siri where it asks the user for consent before enriching the data through ChatGPT. So yeah, that seems to mean your data is sent out (if you consent).

              • @[email protected]
                4 · 5 months ago

                I’d say the burden of proof is on Apple to show that it’s being done on-device or that all processing is done on iCloud servers.

                You’re saying that OpenAI is just going to hand over their full ChatGPT model for Apple to set up on their own servers for free?

                But from the article itself:

                the partnership could burn extra money for OpenAI, because it pays Microsoft to host ChatGPT’s capabilities on its Azure cloud

                I get it if they created a small version of their LLM to run locally, but I would expect Apple to pay a price even for that.

                I think you may be confusing this ChatGPT integration with Apple’s own LLM that they’re working on… Again, from the linked article:

                Still, Apple’s choice of ChatGPT as Apple’s first external AI integration has led to widespread misunderstanding, especially since Apple buried the lede about its own in-house LLM technology that powers its new “Apple Intelligence” platform.

        • Zos_Kia
          3 · 5 months ago

          I don’t know about the US, but in European GDPR parlance, if it can be reversed then it is NOT anonymized, and it is illegal to claim otherwise. The correct term is pseudonymized.
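
          Rough illustration of the difference (my own sketch, obviously not legal advice): pseudonymization swaps the identifier for a token but keeps a lookup table somewhere, so whoever holds the table can reverse it. Anonymization would mean destroying that link entirely.

          ```python
          import secrets

          lookup = {}  # token -> original identity; this table is the re-identification key

          def pseudonymize(user_id: str) -> str:
              """Swap an identifier for a random token, keeping the mapping around."""
              token = secrets.token_hex(8)
              lookup[token] = user_id  # while this mapping exists, the data is NOT anonymous
              return token

          token = pseudonymize("[email protected]")
          print(token)          # looks anonymous on its own...
          print(lookup[token])  # ...but anyone holding the table can reverse it
          ```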

      • @[email protected]
        5 · 5 months ago

        The point is that they can use that data for further training. They want to build a monopoly like Google has in search.

        • @[email protected]
          4 · 5 months ago

          Like Google did with user queries and crawling data. I’m just saying everyone is happily giving these companies data. You are welcome to not use the GPT functionality just like you are welcome to use DuckDuckGo. I’m not getting the hostility to Apple. Microsoft on the other hand…

    • Jesus
      10 · edited · 5 months ago

      If you look at the announcement, they’re pretty damn boxed in. They can’t scrape the local device or iCloud. OpenAI only gets queries that the dumber Apple models think would be better served by OpenAI. And each of those queries triggers a dialog that says “Do you want me to use ChatGPT to do that? Cancel / Use ChatGPT”
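
      My rough mental model of that flow, as a sketch (the function names and the “can the local model handle it” check are placeholders of my own, not Apple’s actual API):

      ```python
      def local_model_can_handle(query: str) -> bool:
          # Stand-in for Apple's on-device model deciding it can answer well enough.
          return len(query) < 80  # arbitrary placeholder heuristic

      def user_consents(query: str) -> bool:
          # Stand-in for the "Do you want me to use ChatGPT to do that?" dialog.
          answer = input(f'Use ChatGPT for "{query[:40]}..."? [y/N] ')
          return answer.strip().lower() == "y"

      def handle(query: str) -> str:
          if local_model_can_handle(query):
              return "answered on-device; nothing leaves the phone"
          if user_consents(query):
              return "this query (and only this query) goes to ChatGPT"
          return "cancelled; nothing is sent"

      print(handle("Plan a three-week itinerary through Japan with daily budgets, "
                   "restaurant picks, and train schedules for a family of four."))
      ```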

      That said, on stage Apple briefly mentioned that ChatGPT Plus users would have more functionality. I’ll bet money that’s the real play: LLM subscriptions in the App Store. Apple loves that sweet, sweet App Store and subscription money.

      Question is, do they take a cut like with Spotify, or is basic, free GPT-4 access payment enough?

  • Jesus
    19 · 5 months ago

    I’ll take bets here.

    I don’t think the user data is where OpenAI makes its money. It’s the $20-a-month GPT Plus subscriptions.

    Apple announced that Plus users would get more functionality. Also, OpenAI is basically limited to collecting data from queries that the user explicitly agrees to send to a 3rd-party model. Each GPT-4 query prompts the user with “Do you want me to use ChatGPT to do that?”

    Apple’s not really in the business of selling data, but they are famously and infamously in the business of selling subscriptions to shit.

      • Jesus
        3 · 5 months ago

        Google: tries to track your behavior so they can sell targeted ads in their freeware

        Apple: aggressively tries to lock you into its platforms and ecosystem so they can sell you subscriptions, apps, and hardware

        Both shitty, but one company’s business model is much more dependent on user data to exist.

    • @[email protected]
      1 · 5 months ago

      Plus users do get more functionality, but it sounded like these are just the extra features a Plus subscription gives you anyway (so analysis, image generation etc.).

  • Kokesh
    9 · 5 months ago

    How will this be financially viable for OpenAI? It costs lots of money to run this crap.

    • Jesus
      5 · 5 months ago

      GPT Plus subscriptions.

      During the WWDC keynote they mentioned that GPT Plus users would get additional functionality in iOS.

    • Jesus
      15 · 5 months ago

      Every query is prompted with a consent alert, and OpenAI isn’t allowed to rummage around through iOS, macOS, or iCloud. If the more private, but dumber, Apple model can’t do the job, you get this alert.

      My guess is that, since the data collected comes from GPT-4 queries and requires expensive compute, the business move is actually around subscriptions. The data probably isn’t worth the expense of the query compute.

      Apple announced that GPT Plus users would have more functionality, and if there is one thing Apple is exceedingly good at, it’s selling subscriptions to shit.

    • @[email protected]
      2 · 5 months ago

      And yet, if you look at the technical details, between Google, Apple, and Microsoft this does seem to be the lesser evil for privacy.

      Of course, if you have like a de-Googled Fairphone I’ll bow down to you, but if you also have a Windows PC you’re screwed either way.

  • @[email protected]
    3 · edited · 5 months ago

    No, the payment is Apple users’ data, which is me. If I can’t turn it off completely, I won’t update the OS.

    • @[email protected]
      11 · 5 months ago

      Chill and do some reading. It’s not only opt-in but can be disabled at any time, and it’s opt-in per request. It’ll tell you before anything goes to ChatGPT and even then it’s anonymized.