• MudMan@fedia.io · 5 months ago

    The tragic irony of the kind of misinformed article this links to is that the server farms that would be running this stuff are fairly efficient. The water is reused and recycled, and the heat is often put to other uses. Because wasting fewer resources is cheaper than wasting more resources.

    But all those locally-run models on laptop CPUs and desktop GPUs? That’s grid power being turned into heat and vented into a home (probably with air conditioning on).

    The weird AI panic, driven by an attempt to repurpose the popular anti-crypto arguments whether or not they match the new challenges, is going to PR this tech into wasting far more energy than it otherwise would by distributing it across billions of consumer devices paid for by individual users. And nobody is going to notice or care.

    I do hate our media landscape sometimes.

    • XeroxCool@lemmy.world · 5 months ago

      If I make a gas engine with 100% heat efficiency but only run it in my backyard, do the greenhouse gases not count because it’s so efficient? Of course they do. The high efficiency of a data center is great, but that’s not what the article laments. The problem it’s calling out is the absurdly wasteful reason these farms will flourish: powering excessively animated programs that feign intelligence, vainly burning energy on tasks a simple program was already handling.

      It’s the same story with lighting. LEDs seemed like a savior for energy consumption because they were so efficient. Sure, they save energy overall (for now), but their cheapness prompted people to multiply the number of lights and the total output by an order of magnitude. That in turn creates a secondary problem of further increasing light pollution and intrusion.

      Greater efficiency doesn’t make things right if it comes with an increase in use.

      • MudMan@fedia.io · 5 months ago

        For one thing, it’s absolutely not true that what these apps provide is the same as what we had. That’s another place where the AI grifters and the AI fearmongers are both lying. This is not a 1:1 replacement for older tech. Not in search, where LLM queries are less reliable at surfacing accurate facts but better at matching fuzzy queries without specific parameters. Not with image generation, obviously. Not with tools like upscaling, frame interpolation and so on.

        For another, some of the numbers being thrown around are not realistic or factual, are not presented in context, or are part of a power-consumption trend that was already under way with earlier applications. The average high-end desktop PC ran on about 250W in the 90s and 500W in the 2000s; mine now runs at 1000W. Playing a videogame used to burn as much power as a couple of lightbulbs; now it’s the equivalent of running your microwave oven.
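        To put rough numbers on that comparison (the PC wattages are the ones above; the microwave draw of ~1100W and the one-hour session are my own assumptions, and real draws vary a lot), here is a quick back-of-envelope sketch:

            # Back-of-envelope energy per hour of use.
            # PC wattages are the ones quoted above; the microwave figure
            # (~1100 W) and the one-hour session length are assumptions.
            SESSION_HOURS = 1.0

            devices_w = {
                "90s desktop PC": 250,
                "2000s desktop PC": 500,
                "current high-end PC": 1000,
                "microwave oven": 1100,
            }

            for name, watts in devices_w.items():
                kwh = watts * SESSION_HOURS / 1000  # W * h -> kWh
                print(f"{name}: {kwh:.2f} kWh per hour of use")

        The point of the arithmetic is only that an hour on a current high-end PC and an hour of microwave use land in the same ballpark, per the figures above.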

        The argument that we are burning more power because we’re using more compute for entertainment purposes is not factually incorrect, but it’s both hyperbolic (some of the cost estimates being shared virally are deliberate overestimates taken out of context) and entirely consistent with how we’ve used other compute-hungry features for ages.

        The only reason you’re so mad about me wasting some energy asking an AI to generate a cute picture, but not about me using an AI to generate frames for my videogame, is that one of those is a viral panic that maps neatly onto the viral panic people already had about crypto, while the other is a frog that has been slow-boiling for three decades, so people never had a reason to form an opinion about it.