cross-posted from: https://lemmy.ml/post/44996161

March 25, 2026

The policy, announced by Bernie Sanders, an independent senator from Vermont, and Alexandria Ocasio-Cortez, a New York Democratic representative, on Wednesday morning, aims to ensure the AI boom protects the environment and communities, and benefits workers instead of harming them. A temporary ban, the lawmakers say, would give the US government time to create strong federal safeguards for AI, which is “affecting everything from our economy and wellbeing to our democracy, warfare and our kids’ education”.

“AI and robotics are creating the most sweeping technological revolution in the history of humanity,” Sanders said in an emailed statement. “The scale, scope, and speed of that change is unprecedented. Congress is way behind where it should be in understanding the nature of this revolution and its impacts.”

  • tino_408@lemmy.world · 10 hours ago

    Why are data centers bad for the environment? Wouldn’t this cause American AI innovation to slow down? I thought we were in an AI race?

    • wholookshere@piefed.blahaj.zone · 10 hours ago

      They gobble up power and water.

      Moore’s law is dead, and the only trick left is to add more power. So hardware is only getting more power-hungry.

      Most US electricity is generated from fossil fuels.

    • BigDaddySlim@lemmy.world · 10 hours ago

      Sucking up water resources from local communities, poisoning water with its waste, noise causing illnesses in people living nearby, and using a fuckload of energy, usually from non-renewable sources, adding CO2 to our atmosphere, just to name a few things. This isn’t news.

    • jaycifer@lemmy.world · 9 hours ago

      “Why are data centers bad for the environment?”

      Computer servers use a lot of electricity when they run. I believe most data centers are focused on data storage and retrieval, which means there are upswings and downswings in their usage as demand to access that data increases and wanes, so it’s not always running at 100% power consumption. My understanding is that AI data centers are primarily used for training new models, which means they are nearly always running at or near 100% to maximize training.

      Not only does this consume a lot of electricity from the grid, but a significant byproduct of servers running is heat, requiring strong cooling systems for the data center, which ironically use even more electricity. I think they rely heavily on water cooling to achieve this as well, since water is good at absorbing, moving, and then dissipating heat. I’ve read comments that this makes the water difficult to reuse, but I don’t know why that would be the case.

      In short, they use a lot of electricity to generate heat that then needs even more electricity and water to manage.
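      The "electricity to run, plus more electricity to cool" point above is often summarized with a single ratio, PUE (Power Usage Effectiveness: total facility power divided by IT power). A minimal back-of-envelope sketch, assuming an illustrative 100 MW IT load and a PUE of 1.3 (neither figure describes any real facility):

```python
# Back-of-envelope data-center energy estimate using PUE.
# Both input figures below are illustrative assumptions.

IT_LOAD_MW = 100.0  # assumed power drawn by the servers themselves
PUE = 1.3           # assumed Power Usage Effectiveness (total / IT power)

total_facility_mw = IT_LOAD_MW * PUE                    # servers + cooling + overhead
cooling_and_overhead_mw = total_facility_mw - IT_LOAD_MW

HOURS_PER_YEAR = 8760
annual_energy_mwh = total_facility_mw * HOURS_PER_YEAR  # if run flat-out all year

print(f"Total facility draw: {total_facility_mw:.0f} MW")
print(f"Cooling/overhead:    {cooling_and_overhead_mw:.0f} MW")
print(f"Annual energy:       {annual_energy_mwh:,.0f} MWh")
```

      The "always running at or near 100%" point matters here: for a training cluster the flat-out annual figure is close to reality, whereas a storage-focused data center would average well below its peak.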

      “Wouldn’t this cause American ai innovation to slow down?”

      Sure, this could cause the base level processing power available for training to taper off, but I think that would actually breed more innovation in making better training methods that use that power more efficiently. I recall a lot of early Chinese models being just as good for end users as American models despite being trained on less processing power. That sounds innovative to me.

      I would liken it to video game optimization. When gaming tech was weaker it was more necessary to optimize games to run on the limited hardware. Modern gaming consoles have enough processing overhead to achieve the same thing that developers can get away with less optimization, which ironically can lead to worse performing games than when that overhead was missing.

      • Pieisawesome@lemmy.dbzer0.com · 29 minutes ago

        If they build the cooling system as a closed loop with cooling towers, the water usage is mostly a one-time fill.

        But it’s more expensive, so they don’t.

      • RedGreenBlue@lemmy.zip · 4 hours ago

        Part of the water evaporates on each heating and cooling cycle, so new water needs to be added continuously. They could build more expensive (and less efficient) cooling systems, but it turns out they can just get lots of cheap water instead, probably because environmental protection regulation is lacking.
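        The evaporation point above can be roughly quantified: evaporating water absorbs about 2.26 MJ per kg (its latent heat of vaporization), so a facility rejecting heat evaporatively loses on the order of heat_load / 2.26e6 kg of water per second. A sketch assuming an illustrative 100 MW heat load and that all heat is rejected by evaporation (an upper bound; real systems reject some heat by other means):

```python
# Rough upper-bound estimate of evaporative cooling water loss.
# The 100 MW heat load is an illustrative assumption.

LATENT_HEAT_J_PER_KG = 2.26e6  # latent heat of vaporization of water
HEAT_LOAD_W = 100e6            # assumed heat to reject (100 MW)

kg_per_second = HEAT_LOAD_W / LATENT_HEAT_J_PER_KG  # water evaporated per second
liters_per_day = kg_per_second * 86400              # 1 kg of water is about 1 L

print(f"Evaporated: {kg_per_second:.1f} kg/s, "
      f"about {liters_per_day / 1e6:.1f} million L/day")
```

        Even this crude bound shows why a large AI campus in a dry region is a meaningful draw on a local water supply.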

    • AlexSage@piefed.social · 9 hours ago

      The fact that China is catching up with open-source models basically proves it’s not a money problem. Competition isn’t helping us; it’s just bleeding us dry of resources (power, water, and computer hardware). China has basically proven that competition isn’t the most efficient way of making the best AI.