China wants to lead the world on AI regulation — will the plan work?

nature.com

Having placed artificial intelligence at the centre of its own economic strategy, China is driving efforts to create an international system to govern the technology’s use.

Chinese President Xi Jinping speaking at the 2025 Asia-Pacific Economic Cooperation meeting in Gyeongju, South Korea. Credit: Yonhap via AP/Alamy

Despite risks ranging from exacerbating inequality to causing existential catastrophe, the world has yet to agree on regulations to govern artificial intelligence. Although a patchwork of national and regional regulations exists, for many countries binding rules are still being fleshed out.

Continue reading “China wants to lead the world on AI regulation — will the plan work?”

The cost of human labor behind AI development

Do Better Team

Behind most of today’s AI models lies the labor of workers in the Global South, who are exposed to disturbing content and poor working conditions. This reality raises urgent questions about the transparency and ethics of AI development.

Picture working 10-hour days tagging distressing images to train an AI model — and getting paid not in money, but in a kilogram of sugar. This isn’t dystopian fiction, but reality for some of the workers behind today’s most advanced artificial intelligence.

While the development of AI is undoubtedly enhancing the lives of many by streamlining processes and offering efficient solutions, it also raises a pressing question: What is the true cost of AI, and who is paying for it?

Antonio Casilli, Professor of Sociology at Télécom Paris and Founder of DipLab, addressed this question during an Esade seminar on the promises and perils of the digitalization of work. The event was part of the kick-off for the DigitalWORK research project, which explores how digital technologies are transforming work and promotes fair, equitable and transparent labor conditions, with Anna Ginès i Fabrellas and Raquel Serrano Olivares (Universitat de Barcelona) as principal investigators.

AI isn’t autonomous, it’s human-powered

Digital sweatshops of the Global South

So, where does this hidden labor take place? According to Casilli’s research, workers are in countries including Kenya, India, the Philippines, and Madagascar — regions with high levels of digital literacy, access to English- or French-speaking workers, and little in the way of labor protection or union representation.

Continue reading “The cost of human labor behind AI development”

What happens when you say “Hello” to ChatGPT?

The Hidden Behemoth Behind Every AI Answer

Billions of daily queries are reshaping energy and infrastructure

IEEE.org

Such a simple query might seem trivial, but making it possible across billions of sessions requires immense scale. While OpenAI reveals little information about its operations, we’ve used the scraps we do have to estimate the impact of ChatGPT—and of the generative AI industry in general.

This article is part of The Scale Issue.

OpenAI’s actions also provide hints. As part of the United States’ Stargate Project, OpenAI will collaborate with other AI titans to build the largest data centers yet. And AI companies expect to need dozens of “Stargate-class” data centers to meet user demand.

ChatGPT uses 8.5 Wh/day per user in 2025, equal to running a 10W LED bulb for 1 hour.

Estimates of ChatGPT’s per-query energy consumption vary wildly. We used the figure of 0.34 watt-hours that OpenAI’s Sam Altman stated in a blog post without supporting evidence. It’s worth noting that some researchers say the smartest models can consume over 20 Wh for a complex query. We derived the number of queries per day from OpenAI’s usage statistics below.
Illustrations: Optics Lab

ChatGPT uses 850 MWh daily, equaling 14,000 EV charges for 2.5 billion global queries.

OpenAI says ChatGPT has 700 million weekly users and serves more than 2.5 billion queries per day. If an average query uses 0.34 Wh, that’s 850 megawatt-hours, enough to charge thousands of electric vehicles every day.
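
As a rough sanity check, that arithmetic fits in a few lines of Python. The roughly 60 kWh assumed here for one full EV charge is a ballpark for a typical battery, not a figure from OpenAI or the report.

```python
# Back-of-envelope check of ChatGPT's estimated daily energy use.
queries_per_day = 2.5e9      # OpenAI: more than 2.5 billion queries per day
wh_per_query = 0.34          # Sam Altman's per-query estimate, in watt-hours

daily_wh = queries_per_day * wh_per_query
print(f"Daily energy: {daily_wh / 1e6:,.0f} MWh")              # ~850 MWh

# Assumed comparison point: ~60 kWh for one full EV charge (typical battery size).
kwh_per_ev_charge = 60
print(f"Full EV charges per day: {daily_wh / 1e3 / kwh_per_ev_charge:,.0f}")  # ~14,000
```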

ChatGPT's 912B queries yearly need 310 GWh, equal to powering 29,000 US homes.

2.5 billion queries per day add up to nearly 1 trillion queries each year—and ChatGPT could easily exceed that in 2025 if its user base continues to grow. One year’s energy consumption is roughly equivalent to powering 29,000 U.S. homes for a year, nearly as many as in Jonesboro, Ark.
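
The yearly comparison works the same way. In the sketch below, the roughly 10,700 kWh per U.S. household per year is an assumed average, used only to reproduce the 29,000-home figure.

```python
# Back-of-envelope check of ChatGPT's estimated yearly energy use.
queries_per_day = 2.5e9
wh_per_query = 0.34

queries_per_year = queries_per_day * 365
yearly_gwh = queries_per_year * wh_per_query / 1e9
print(f"Queries per year: {queries_per_year / 1e9:,.0f} billion")   # ~912 billion, nearly 1 trillion
print(f"Yearly energy: {yearly_gwh:,.0f} GWh")                      # ~310 GWh

# Assumed average U.S. household consumption: ~10,700 kWh per year.
kwh_per_home_per_year = 10_700
print(f"U.S. homes powered for a year: {yearly_gwh * 1e6 / kwh_per_home_per_year:,.0f}")  # ~29,000
```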

AI queries need 15 TWh/year, equal to two nuclear reactors’ output.

Though massive, ChatGPT is just a slice of generative AI. Many companies use OpenAI’s models through the API, and competitors like Google’s Gemini and Anthropic’s Claude are growing. A report from the Schneider Electric Sustainability Research Institute puts generative AI’s overall annual energy consumption at 15 terawatt-hours. Using the report’s per-query energy consumption figure of 2.9 Wh, we arrive at 5.1 trillion queries per year.
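
That query count is simply the report’s total energy divided by its per-query figure, as this minimal sketch shows:

```python
# From industry-wide energy back to an implied query count (2025).
industry_twh = 15        # Schneider Electric estimate for all generative AI
wh_per_query = 2.9       # the report's average energy per query, in watt-hours

queries_per_year = industry_twh * 1e12 / wh_per_query
print(f"Implied queries per year: {queries_per_year / 1e12:.2f} trillion")  # ~5.17, i.e. roughly 5.1 trillion
```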

Generative AI queries projected to reach 120 trillion annually by 2030.

AI optimists expect the average number of queries per day to jump dramatically in the next five years. Based on a Schneider Electric estimate of overall energy use in 2030, the world could then see as many as 329 billion prompts per day—that’s about 38 queries per day for every person alive on planet Earth. (That’s assuming a global population of 8.6 billion in 2030, which is the latest estimate from the United Nations.) As unrealistic as that may sound, it’s made plausible by plans to build AI agents that work independently and interact with other AI agents.
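
Applying the same 2.9 Wh per-query figure to Schneider Electric’s 2030 energy projection reproduces those numbers; a sketch, using the UN population estimate quoted above:

```python
# Implied 2030 query volume from Schneider Electric's projected energy use.
energy_2030_twh = 347        # projected generative-AI energy use in 2030
wh_per_query = 2.9           # per-query figure carried over from the 2025 report
population_2030 = 8.6e9      # UN estimate for 2030

queries_per_year = energy_2030_twh * 1e12 / wh_per_query
queries_per_day = queries_per_year / 365
print(f"Queries per year: {queries_per_year / 1e12:.0f} trillion")            # ~120 trillion
print(f"Queries per day: {queries_per_day / 1e9:.0f} billion")                # ~328; quoted as 329 after rounding the yearly total
print(f"Queries per person per day: {queries_per_day / population_2030:.0f}") # ~38
```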

Diagram of 38 Stargate-class data centers with racks of GPUs and construction needed.

The Schneider Electric report estimates that all generative AI queries consume 15 TWh in 2025 and will use 347 TWh by 2030; that leaves 332 TWh of energy—and compute power—that will need to come online to support AI growth. That implies the construction of dozens of data centers along the lines of the Stargate Project, which plans to build the first ever 1-gigawatt facilities. Each of these facilities will theoretically consume 8.76 TWh per year—so 38 of these new campuses will account for the 332 TWh of new energy required.
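
The campus count follows from simple division, assuming each 1-gigawatt facility runs flat out for all 8,760 hours of the year:

```python
# How many 1-GW "Stargate-class" campuses the projected growth implies.
energy_2025_twh = 15
energy_2030_twh = 347
new_twh_needed = energy_2030_twh - energy_2025_twh          # 332 TWh of new demand

hours_per_year = 8760
twh_per_campus = 1 * hours_per_year / 1000                  # 1 GW * 8,760 h = 8.76 TWh/year
print(f"New energy needed: {new_twh_needed} TWh")
print(f"Stargate-class campuses required: {new_twh_needed / twh_per_campus:.0f}")  # ~38
```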

Graphic: 347 TWh requires 44 nuclear reactors with icons of cooling towers.

While estimates for AI energy use in 2030 vary, most predict a dramatic jump in consumption. The gain in energy consumption will be driven mostly by AI inference (the energy used when interacting with a model) rather than AI training. The 2030 total could be much lower or much higher than the Schneider Electric estimate used here, depending on the success of AI agents that can work together—and consume energy—independent of human input.

The AI arms race is on. Are regulators ready?

BY REBECCA KLAR – 02/14/23 5:03 AM ET


The Microsoft Bing logo and the website’s page are shown in this photo taken in New York on Tuesday, Feb. 7, 2023. Microsoft is fusing ChatGPT-like technology into its search engine Bing, transforming an internet service that now trails far behind Google into a new way of communicating with artificial intelligence. (AP Photo/Richard Drew)

The race among tech companies to roll out generative artificial intelligence (AI) tools is raising concerns about how mistakes in technology and blind spots in regulation could hasten the spread of misinformation, elevate biases in results and increase the harvesting and use of Americans’ personal data.

So far tech giants Microsoft and Google are leading the race in releasing new AI tools to the public, but smaller companies and startups are expected to make progress in the field.

Continue reading “The AI arms race is on. Are regulators ready?”