Smarter Pruning Boosts AI Efficiency
Why this is here: Cost-Weighted Magnitude Pruning (CWMP) accounts for the actual energy cost of each AI computation, unlike standard pruning methods that assume all updates cost the same.
Researchers have developed a technique called Cost-Weighted Magnitude Pruning (CWMP), a new method for training artificial intelligence (AI) models that uses less energy without sacrificing performance.
The method improves “federated learning,” in which AI models learn from data held on many devices, such as phones, without that data leaving them. Federated learning often suffers from slow communication and high energy use.
CWMP builds on “gradient pruning,” which cuts the amount of update data devices send during training by dropping the smallest, least significant updates. Existing methods, however, treat every update as equally expensive in energy.
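For intuition, a conventional magnitude-pruning step keeps only the largest entries of each update and zeroes the rest so they need not be sent. Here is a minimal sketch of that baseline; the function name and the `keep_ratio` parameter are illustrative, not taken from the paper:

```python
import numpy as np

def magnitude_prune(grad, keep_ratio=0.1):
    """Baseline sketch: keep only the largest-magnitude entries of a
    gradient update; zeroed entries need not be transmitted."""
    flat = np.abs(grad).ravel()
    k = max(1, int(keep_ratio * flat.size))    # how many entries survive
    threshold = np.partition(flat, -k)[-k]     # k-th largest magnitude
    return grad * (np.abs(grad) >= threshold)  # zero everything smaller
```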
CWMP starts from the observation that some updates take more memory, processing power, or transmission energy than others. Rather than keeping updates by size alone, it weighs each update's usefulness against its cost, in effect a per-update cost-benefit analysis.
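The article does not spell out CWMP's exact scoring rule, but a natural reading of “cost-weighted magnitude pruning” is to rank entries by magnitude per unit of energy cost. The sketch below assumes a per-entry cost estimate (`costs`) is available; the function name and parameters are hypothetical:

```python
import numpy as np

def cost_weighted_prune(grad, costs, keep_ratio=0.1):
    """Hypothetical sketch: score each entry by magnitude per unit
    energy cost, so cheap-but-informative updates survive while
    expensive, low-value ones are dropped (assumed scoring form)."""
    score = np.abs(grad) / costs                     # benefit per cost
    k = max(1, int(keep_ratio * score.size))
    threshold = np.partition(score.ravel(), -k)[-k]  # k-th best score
    return grad * (score >= threshold)
```

Under this reading, a large update that is expensive to compute or send can lose out to a slightly smaller one that is much cheaper to keep, which is the trade-off the method is described as making.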
Tests on CIFAR-10, a standard image-classification dataset, showed that CWMP outperformed traditional pruning, achieving a better trade-off between accuracy and energy consumption. This is early-stage research, but it points toward a promising path to more sustainable AI.