Solving Big AI’s Big Energy Problem


It seems that the more groundbreaking a deep learning model is, the more massive it gets. This summer's most buzzed-about model for natural language processing, GPT-3, is a perfect example. To write with human-like accuracy and speed, the model needed 175 billion parameters, 350 GB of memory, and $12 million to train (think of training as the "learning" phase). But beyond cost alone, big AI models like this have a big energy problem. UMass Amherst researchers found that the computing power needed to train a large AI model can produce over 600,000 pounds…
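As a quick sanity check (my own illustration, not from the article), the 350 GB memory figure is consistent with storing 175 billion parameters as 16-bit floats, an assumption on the numeric precision:

```python
# Sketch: estimate GPT-3's parameter memory footprint,
# assuming each parameter is stored as a 16-bit (2-byte) float.
params = 175_000_000_000          # GPT-3 parameter count (from the article)
bytes_per_param = 2               # fp16 assumption (not stated in the article)
gigabytes = params * bytes_per_param / 1e9
print(f"{gigabytes:.0f} GB")      # prints "350 GB"
```

Full precision (32-bit) would double that, which is one reason large models are often trained and served in reduced precision.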

This story continues at The Next Web

from The Next Web



Note: I hope you enjoyed this article, "Solving Big AI's Big Energy Problem," on DigioNews. Trust DigioNews to keep you updated with the latest news every day. Only on DigioNews: Solving Big AI's Big Energy Problem.
