We are still waiting to see whether the AI revolution will be the ruination of humanity or its salvation, or whether sacrificing low-level jobs will be worth the leaps and bounds in productivity brought by our robot coworkers. But something that hasn't gotten enough attention is whether the entire enterprise is even sustainable, given that the computing power it requires, scaled upward, consumes incredible amounts of energy.

And in addition to electrical power, new AI data centers, like the Meta-owned facility in Newton County, Georgia, discussed in a New York Times piece today, use a huge amount of water. The Times reports that Meta, along with competitors like OpenAI, Google, and Anthropic, has sought to locate its data centers in places around the country where electricity comes cheap. But the companies haven't really addressed these facilities' water needs: their cooling systems require large amounts of water to keep all those acres of machines doing kids' English class homework from overheating.

Meta's Georgia facility, per the Times, "guzzles around 500,000 gallons of water a day," or about 10 percent of the total water used in the small community surrounding it. One couple unfortunate enough to own a home 1,000 feet from the site claims that the center's 2019 construction clogged their groundwater with sediment, causing their well to run dry at various times over the last few years and forcing them to spend thousands of dollars, on top of constantly stocking up on bottled water because they no longer trust their water for drinking.

Meta, of course, denies that its facility has any impact on its neighbors. And while the homeowner claims a Facebook employee told her just to boil her water before using it, the company denies that any employee ever said this.

As data centers grow larger, they will demand ever larger amounts of water, a finite if not outright scarce resource for some communities. The Times suggests that this is an afterthought, or an inconvenient truth, for poorer municipalities enticed by the tax revenue these projects bring: a grid can always draw more power from new solar or wind capacity, but finding more water to run these cooling systems could be tougher.

An explainer from MIT published earlier this year details the massive increase in energy use spurred by generative AI, with scientists estimating "that the power requirements of data centers in North America increased from 2,688 megawatts at the end of 2022 to 5,341 megawatts at the end of 2023," and those requirements are only continuing to climb.

"The demand for new data centers cannot be met in a sustainable way," says Noman Bashir, who wrote a paper on the environmental impacts of generative AI. "The pace at which companies are building new data centers means the bulk of the electricity to power them must come from fossil fuel-based power plants."

Regarding water usage, Bashir adds, "Just because this is called ‘cloud computing’ doesn’t mean the hardware lives in the cloud. Data centers are present in our physical world, and because of their water usage they have direct and indirect implications for biodiversity."

Bashir suggests that it is the responsibility of AI companies to let users know just how much computing power, and therefore how large an environmental footprint, their simple chatbot tasks can require, so that users can better decide whether the results are worth those impacts.