Deep learning is an inefficient energy hog. It requires massive amounts of data and abundant computational resources, which explodes its electricity consumption. In the last few years, the overall research trend has made the problem worse. Models of gargantuan proportions—trained on billions of data points for several days—are in vogue, and likely won’t be going away any time soon.
Source: "AI can now train on tiny 4-bit computers" — MIT Technology Review
https://www.technologyreview.com/2020/12/11/1014102/ai-trains-on-4-bit-computers/
Powerful neural networks could soon train on smartphones with dramatically faster speeds and less energy.