Early Bird uses 10 times less energy to train deep neural networks

added 19.05.2020 19:05


CAPTION: Rice University's Early Bird method for training deep neural networks finds key connectivity patterns early in training, reducing the computations and carbon footprint of the increasingly popular form of artificial intelligence known as deep learning.

Early Bird is an energy-efficient method for training deep neural networks (DNNs), the form of artificial intelligence (AI) behind self-driving cars, intelligent assistants, facial recognition and dozens of other high-tech applications.

A 2019 study by the Allen Institute for AI in Seattle found that the number of computations needed to train a top-flight deep neural network increased 300,000-fold between 2012 and 2018, and a separate 2019 study by researchers at the University of Massachusetts Amherst found that the carbon footprint of training a single, elite DNN was roughly equivalent to the lifetime carbon dioxide emissions of five U.S. automobiles.

A study by lead authors Haoran You and Chaojian Li of Rice's Efficient and Intelligent Computing (EIC) Lab showed that Early Bird could use 10.7 times less energy to train a DNN to the same level of accuracy as, or better than, typical training.

"The state-of-the-art way to perform DNN training is called progressive prune and train," said Yingyan Lin, an assistant professor of electrical and computer engineering in Rice's Brown School of Engineering.
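To make the prune-and-train idea above concrete, here is a minimal sketch of magnitude-based pruning: zero out the smallest-magnitude weights, then check whether the surviving connectivity pattern has stabilized. This is an illustrative NumPy sketch with invented helper names, not the Rice team's actual Early Bird implementation.

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction of weights.

    Returns the pruned weights and a boolean mask of surviving
    connections. Illustrative only; real pruning is typically done
    per layer or per channel inside a training loop.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy(), np.ones(weights.shape, dtype=bool)
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    mask = np.abs(weights) > threshold
    return weights * mask, mask

def mask_overlap(mask_a, mask_b):
    """Fraction of positions where two pruning masks agree."""
    return float(np.mean(mask_a == mask_b))

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))          # stand-in for one layer's weights
pruned, mask = magnitude_prune(w, 0.5)
print(int(mask.sum()))               # 8 of 16 weights survive at 50% sparsity
```

The "key connectivity patterns" in the caption correspond to the mask: if `mask_overlap` between masks drawn at successive points in training stays high, the pattern has emerged, and (per the Early Bird idea) the expensive dense training can stop early in favor of training the smaller pruned network.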
