AI is 'an Energy Hog,' but DeepSeek Could Change That
02-01-2025, 11:19 AM

DeepSeek claims to use far less energy than its rivals, but there are still big questions about what that means for the environment.


by Justine Calma




DeepSeek stunned everyone last month with the claim that its AI model uses roughly one-tenth the computing power of Meta's Llama 3.1 model, upending an entire worldview of how much energy and resources it will take to develop artificial intelligence.


Taken at face value, that claim could have significant implications for the environmental impact of AI. Tech giants are rushing to build out massive AI data centers, with plans for some to use as much electricity as small cities. Generating that much electricity creates pollution, raising fears about how the physical infrastructure undergirding new generative AI tools could exacerbate climate change and worsen air quality.


Reducing how much energy it takes to train and run generative AI models could alleviate much of that stress. But it's still too early to gauge whether DeepSeek will be a game changer when it comes to AI's environmental footprint. Much will depend on how other major players respond to the Chinese startup's breakthroughs, especially considering plans to build new data centers.


" There's a choice in the matter."


" It simply shows that AI does not have to be an energy hog," states Madalsa Singh, a postdoctoral research fellow at the University of California, Santa Barbara who studies energy systems. "There's a choice in the matter."


The buzz around DeepSeek began with the release of its V3 model in December, which cost just $5.6 million for its final training run and took 2.78 million GPU hours to train on Nvidia's older H800 chips, according to a technical report from the company. For comparison, Meta's Llama 3.1 405B model, despite using newer, more efficient H100 chips, took about 30.8 million GPU hours to train. (We don't know exact costs, but estimates for Llama 3.1 405B have been around $60 million, and between $100 million and $1 billion for comparable models.)
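As a rough sanity check, the reported GPU hours alone imply about an elevenfold gap. The sketch below compares only hours, not actual electricity use, since the two chip generations draw different power and per-hour energy isn't reported:

```python
# Back-of-the-envelope check on the reported figures. GPU hours come from
# the technical reports cited above; note H800 and H100 chips differ, so
# this is a ratio of hours, NOT a ratio of energy consumed.

deepseek_v3_gpu_hours = 2.78e6    # H800 hours, per DeepSeek's report
llama_31_405b_gpu_hours = 30.8e6  # H100 hours, per Meta

ratio = llama_31_405b_gpu_hours / deepseek_v3_gpu_hours
print(f"Llama 3.1 405B used ~{ratio:.1f}x more GPU hours")  # ~11.1x
```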


Then DeepSeek released its R1 model recently, which venture capitalist Marc Andreessen called "a profound gift to the world." The company's AI assistant quickly shot to the top of Apple's and Google's app stores. And on Monday, it sent competitors' stock prices into a nosedive on the assumption DeepSeek was able to create an alternative to Llama, Gemini, and ChatGPT for a fraction of the budget. Nvidia, whose chips enable all these technologies, saw its stock price plummet on news that DeepSeek's V3 required only 2,000 chips to train, compared to the 16,000 or more needed by its competitors.


DeepSeek says it was able to cut down on how much electricity it consumes by using more efficient training methods. In technical terms, it uses an auxiliary-loss-free strategy. Singh says it boils down to being more selective with which parts of the model are trained; you don't have to train the whole model at the same time. If you think of the AI model as a big customer service firm with many experts, Singh says, it's more selective in choosing which experts to tap.
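This isn't DeepSeek's actual code, but the "selective experts" idea Singh describes is the general mixture-of-experts pattern. Here's a minimal, hypothetical sketch with toy sizes and random weights, showing how a router activates only a few experts per token:

```python
import numpy as np

# Illustrative sketch (assumed, not DeepSeek's implementation): a
# mixture-of-experts layer only consults a few "experts" per token, so
# most parameters sit idle on any given pass -- the intuition behind
# the claimed compute savings.

rng = np.random.default_rng(0)

NUM_EXPERTS = 8  # total experts in the layer
TOP_K = 2        # experts actually consulted per token
DIM = 16         # hidden dimension (toy size)

# Each expert is a small feed-forward weight matrix; the router scores them.
experts = [rng.standard_normal((DIM, DIM)) * 0.1 for _ in range(NUM_EXPERTS)]
router = rng.standard_normal((DIM, NUM_EXPERTS)) * 0.1

def moe_forward(x):
    """Route a token vector to its top-k experts and mix their outputs."""
    logits = x @ router                  # score every expert
    top = np.argsort(logits)[-TOP_K:]    # indices of the k best-scoring experts
    weights = np.exp(logits[top])
    weights /= weights.sum()             # softmax over the chosen experts only
    # Only TOP_K of NUM_EXPERTS experts do any work for this token.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

token = rng.standard_normal(DIM)
out = moe_forward(token)
print(f"activated {TOP_K}/{NUM_EXPERTS} experts; output shape {out.shape}")
```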


The model also saves energy on inference, which is when the model is actually tasked to do something, through what's called key-value caching and compression. If you're writing a story that requires research, you can think of this technique as being able to reference index cards with high-level summaries as you're writing rather than having to reread the entire report that's been summarized, Singh explains.
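Singh's index-card analogy maps onto key-value caching in simplified form: rather than recomputing attention keys and values for every past token at each decoding step, the model stores them once and reuses them. A toy sketch, under the same illustrative assumptions as above:

```python
import numpy as np

# Illustrative sketch (simplified): key-value caching in autoregressive
# attention. Each step appends one new entry to the cache instead of
# recomputing keys/values for the whole history -- the "index cards."

rng = np.random.default_rng(1)
DIM = 8

def softmax(v):
    e = np.exp(v - v.max())
    return e / e.sum()

k_cache, v_cache = [], []  # grows by one entry per generated token

def attend_with_cache(query, key, value):
    """One decoding step: cache the new key/value, attend over the cache."""
    k_cache.append(key)
    v_cache.append(value)
    K = np.stack(k_cache)  # (t, DIM) -- reused, never recomputed
    V = np.stack(v_cache)
    scores = softmax(K @ query / np.sqrt(DIM))
    return scores @ V      # weighted mix of cached values

for step in range(4):
    q, k, v = (rng.standard_normal(DIM) for _ in range(3))
    out = attend_with_cache(q, k, v)
    print(f"step {step}: attended over {len(k_cache)} cached entries")
```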


What Singh is especially optimistic about is that DeepSeek's models are largely open source, minus the training data. With this approach, researchers can learn from each other faster, and it opens the door for smaller players to enter the industry. It also sets a precedent for more transparency and accountability so that investors and consumers can be more critical of what resources go into developing a model.


There is a double-edged sword to consider


" If we've demonstrated that these sophisticated AI abilities don't need such enormous resource intake, it will open up a bit more breathing space for more sustainable infrastructure preparation," Singh says. "This can also incentivize these developed AI laboratories today, like Open AI, Anthropic, Google Gemini, towards establishing more efficient algorithms and methods and move beyond sort of a strength technique of just adding more data and calculating power onto these models."


To be sure, there's still skepticism around DeepSeek. "We've done some digging on DeepSeek, but it's hard to find any concrete facts about the program's energy consumption," Carlos Torres Diaz, head of power research at Rystad Energy, said in an email.


If what the company claims about its energy use is true, that could slash a data center's total energy consumption, Torres Diaz writes. And while big tech companies have signed a flurry of deals to procure renewable energy, soaring electricity demand from data centers still risks siphoning limited solar and wind resources from power grids. Reducing AI's electricity consumption "would in turn make more renewable energy available for other sectors, helping displace faster the use of fossil fuels," according to Torres Diaz. "Overall, less power demand from any sector is beneficial for the global energy transition as less fossil-fueled power generation would be needed in the long term."


There is a double-edged sword to consider with more energy-efficient AI models. Microsoft CEO Satya Nadella wrote on X about Jevons paradox, in which the more efficient a technology becomes, the more likely it is to be used. The environmental damage grows as a result of efficiency gains.


" The concern is, gee, if we might drop the energy usage of AI by a factor of 100 does that mean that there 'd be 1,000 information providers can be found in and saying, 'Wow, this is excellent. We're going to construct, build, develop 1,000 times as much even as we prepared'?" states Philip Krein, research teacher of electrical and computer system engineering at the University of Illinois Urbana-Champaign. "It'll be a truly intriguing thing over the next 10 years to enjoy." Torres Diaz likewise stated that this concern makes it too early to modify power intake forecasts "substantially down."


No matter how much electricity a data center uses, it's important to look at where that electricity is coming from to understand how much pollution it creates. China still gets more than 60 percent of its electricity from coal, and another 3 percent comes from natural gas. The US also gets about 60 percent of its electricity from fossil fuels, but a majority of that comes from gas, which creates less carbon dioxide pollution when burned than coal.
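To see why the fuel mix matters, here's a rough back-of-the-envelope comparison. The emission factors and mix shares below are ballpark assumptions loosely based on the figures above, not measured data:

```python
# Rough illustration (emission factors and mix shares are approximate,
# assumed values): the same data-center load produces very different CO2
# depending on the grid mix it draws from.

EMISSION_FACTORS = {  # kg CO2 per kWh, ballpark figures
    "coal": 1.0,
    "gas": 0.45,
    "other": 0.0,     # lumping renewables/nuclear together for simplicity
}

def grid_emissions(kwh, mix):
    """Total kg CO2 for `kwh` of load given fuel shares summing to 1."""
    return kwh * sum(share * EMISSION_FACTORS[fuel] for fuel, share in mix.items())

load_kwh = 1_000_000  # hypothetical data-center load
coal_heavy = {"coal": 0.60, "gas": 0.03, "other": 0.37}  # China-like mix
gas_heavy = {"coal": 0.16, "gas": 0.43, "other": 0.41}   # US-like mix

print(f"coal-heavy grid: {grid_emissions(load_kwh, coal_heavy):,.0f} kg CO2")
print(f"gas-heavy grid:  {grid_emissions(load_kwh, gas_heavy):,.0f} kg CO2")
```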


To make things worse, energy companies are delaying the retirement of fossil fuel power plants in the US in part to meet skyrocketing demand from data centers. Some are even planning to build out new gas plants. Burning more fossil fuels inevitably leads to more of the pollution that causes climate change, as well as local air pollutants that raise health risks for nearby communities. Data centers also guzzle up a lot of water to keep hardware from overheating, which can lead to more stress in drought-prone areas.


Those are all problems that AI developers can minimize by limiting energy use overall. Traditional data centers have been able to do so in the past. Despite workloads almost tripling between 2015 and 2019, power demand managed to stay relatively flat during that period, according to Goldman Sachs Research. Data centers then grew much more power-hungry around 2020 with advances in AI. They consumed more than 4 percent of electricity in the US in 2023, and that could nearly triple to around 12 percent by 2028, according to a December report from the Lawrence Berkeley National Laboratory. There's more uncertainty about those kinds of projections now, but calling any shots based on DeepSeek at this point is still a shot in the dark.
