… and neither does the author (or so I believe - I made them both up).

On the other hand, AI is definitely good at creative writing.

      • @[email protected]
        link
        fedilink
        13 months ago

        With LLMs, the energy usage is mainly on the training side; generation afterwards is fairly cheap. Maybe what you want is fewer companies trying to train their own models from scratch, and more collaboration instead?

          • @[email protected]
            link
            fedilink
            1
            edit-2
            3 months ago

            Indeed. Though what we should be thinking about is not just the cost in absolute terms, but in relation to the benefit. GPT-4 is one of the more expensive models to run right now, and you can accomplish very good results with their smaller GPT-4o mini at 0.5% of the energy cost[1]. That’s the cost of running 0.07 LED bulbs for an hour, or running 1 LED bulb for 0.07 hours (i.e. roughly 4 minutes). If that saves you 5 minutes of time writing an email while the room is lit with a single LED bulb and your computer is drawing energy, that might just be worth it, right?

            [1] Estimated by using https://huggingface.co/spaces/genai-impact/ecologits-calculator and the pricing difference between GPT-4o, 4o mini, and 3.5 (https://openai.com/api/pricing/). The assumption I’m making is that the total hardware and energy cost scales linearly with the API pricing.
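
            A minimal sketch of that back-of-envelope math, assuming a ~10 W LED bulb and reading the 0.07 bulb-hours figure as the GPT-4o mini cost; the price-to-energy scaling is the same assumption as in [1]:

            ```python
            # Back-of-envelope estimate behind the numbers above.
            # Assumptions: one LED bulb draws ~10 W, and the API price ratio
            # between GPT-4 and GPT-4o mini stands in for the energy ratio.

            LED_BULB_WATTS = 10            # assumed typical LED bulb
            MINI_COST_BULB_HOURS = 0.07    # GPT-4o mini cost per task, from the comment
            ENERGY_RATIO = 0.005           # 4o mini at ~0.5% of GPT-4, per [1]

            mini_wh = MINI_COST_BULB_HOURS * LED_BULB_WATTS        # ~0.7 Wh
            mini_bulb_minutes = MINI_COST_BULB_HOURS * 60          # ~4.2 minutes of one bulb
            gpt4_bulb_hours = MINI_COST_BULB_HOURS / ENERGY_RATIO  # ~14 bulb-hours for GPT-4

            print(f"GPT-4o mini: ~{mini_wh:.1f} Wh, i.e. one LED bulb for ~{mini_bulb_minutes:.0f} min")
            print(f"GPT-4, scaled by the price ratio: ~{gpt4_bulb_hours:.0f} LED-bulb-hours per task")
            ```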

              • @[email protected]
                link
                fedilink
                13 months ago

                Yeah, they operate very opaquely, so we can’t know the true cost, but based on what I can verify with models I can run on my own machines, the numbers seem reasonable. In any case, that’s not really relevant to this discussion. Treat it as a hypothetical, then work out the math later to figure out where we want to be and what threshold we should be setting.