Energy Saving AI

 


BitEnergy AI is an inference technology company. The team has published a paper describing a new technique that it says could reduce the energy consumption of AI applications by 95%.

The use of AI applications has gone mainstream, and that has led to a sharp rise in energy needs and costs. Large language models like ChatGPT require enormous amounts of computing power, which means enormous amounts of electricity to run them. ChatGPT reportedly uses about 564 MWh daily, enough energy to power approximately 18,000 homes. Some have suggested that AI might use around 100 TWh annually in just a few years. (That's a Bitcoin mining operation's worth of power!)
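A quick sanity check on those numbers (using the 564 MWh and 18,000-home figures above):

```python
# Sanity check: spread 564 MWh/day across 18,000 homes
daily_mwh = 564
homes = 18_000
kwh_per_home = daily_mwh * 1_000 / homes  # 1 MWh = 1,000 kWh
print(round(kwh_per_home, 1))  # ~31.3 kWh per home per day
```

Roughly 31 kWh per household per day is in line with typical US residential usage, which is why the 18,000-home comparison works.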

The new method is conceptually simple. AI applications today rely on floating-point multiplication (FPM) to handle very large (or very small) numbers, which lets them carry out calculations with high precision, but FPM is also the most energy-intensive part of AI computing. The new method, called Linear-Complexity Multiplication (L-Mul), approximates floating-point multiplication using integer addition instead. In testing, the new approach reduced the energy demand of the operation by 95%.
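The core idea can be sketched in a few lines. The snippet below is a minimal illustration of Mitchell-style logarithmic multiplication, the same family of approximation that L-Mul builds on, not the paper's exact algorithm (which adds a correction term), and it handles positive float32 values only: adding the raw bit patterns of two floats approximates multiplying them, because exponents add and mantissa sums approximate the mantissa product.

```python
import struct

def f2b(x: float) -> int:
    """Raw IEEE 754 float32 bit pattern of x as an unsigned int."""
    return struct.unpack("<I", struct.pack("<f", x))[0]

def b2f(b: int) -> float:
    """Interpret an unsigned int as a float32 bit pattern."""
    return struct.unpack("<f", struct.pack("<I", b & 0xFFFFFFFF))[0]

BIAS = 127 << 23  # bit pattern of 1.0f; cancels the doubled exponent bias

def approx_mul(x: float, y: float) -> float:
    """Approximate x * y (x, y > 0) with a single integer addition."""
    return b2f(f2b(x) + f2b(y) - BIAS)

print(approx_mul(2.0, 4.0))  # exact when mantissas are zero: 8.0
print(approx_mul(3.0, 5.0))  # approximate: 14.0 instead of 15.0
```

Because an integer adder is far simpler than a floating-point multiplier, each such operation costs a small fraction of the energy, which is where the claimed 95% savings comes from.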

The team has designed, built and tested this new type of hardware; however, such hardware would have to be licensed. GPU maker Nvidia is the dominant AI hardware manufacturer, and how it responds to this energy-saving method will set the pace at which the technique is adopted.
