Machine learning hardware at a billionth of the power cost

The computing power required to run machine learning programs can be prohibitively large, and is one of the prime complaints leveled against machine learning and artificial intelligence initiatives.

Aatmesh Shrivastava, associate professor of electrical and computer engineering, wants to change all that, and a Young Faculty Award from the Defense Advanced Research Projects Agency will help.

Shrivastava’s research focuses on building analog computing hardware, as distinct from the conventional digital computers most consumers are familiar with.

All digital computing, Shrivastava says, relies on an analog electric signal that is converted into zeroes and ones — the binary at the heart of most modern computers. “The hardware does the computation on that binary,” Shrivastava says, “and if the information is to be used in the real world, you convert it back into analog.”

“So the real signal is analog, you convert it into digital, do the computation in digital using a computer, and bring it back” to analog, he says.

Shrivastava’s systems bypass the analog-to-digital (and back again) conversion process entirely. “You can use the same electronics that you used for computing in digital,” he says, using “transistors to build computational circuits.”

“The advantage is that you need fewer transistors,” he says, and information doesn’t need to go through that conversion process “into digital to compute. You just use incoming information directly and compute it directly.”
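To illustrate the idea in the passage above, here is a minimal sketch, not Shrivastava’s actual circuit design, of how an analog system can compute a weighted sum (the core operation of machine learning inference) directly from incoming signals: input voltages applied across conductances produce branch currents that simply add at a shared node, per Ohm’s and Kirchhoff’s laws. The voltage and conductance values below are invented for illustration.

```python
# Illustrative sketch only -- not a description of Shrivastava's hardware.
# In an analog multiply-accumulate, each input voltage V driven across a
# conductance G (1/resistance) contributes a current I = G * V, and the
# currents physically sum at a node: the weighted sum comes "for free",
# with no analog-to-digital conversion step.

voltages = [0.3, 0.7, 0.5]         # analog input signals (volts), hypothetical
conductances = [2e-3, 1e-3, 4e-3]  # programmable weights (siemens), hypothetical

# The node current is the sum of the per-branch currents G * V.
node_current = sum(g * v for g, v in zip(conductances, voltages))

print(f"weighted-sum current: {node_current * 1e3:.2f} mA")
```

A digital chip would instead digitize each voltage, multiply and add the resulting binary numbers through many transistors, and convert the answer back, which is the round trip the article says this approach avoids.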

This means that the power costs of Shrivastava’s machines will be far, far lower than consumer devices. “We can enable machine learning capability at nanowatt power level,” he says.

For comparison, Shrivastava offers a ballpark figure: consumer-level laptops might draw from twenty to two hundred watts.

A nanowatt is one billionth of a watt.
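The scale of that gap is easy to check with the article’s own figures: even at the low end of the quoted laptop range, the ratio works out to roughly ten orders of magnitude.

```python
# Ballpark arithmetic using the figures quoted in the article.
laptop_watts = 20.0  # low end of the quoted 20-200 W laptop range
device_watts = 1e-9  # one nanowatt, i.e. one billionth of a watt

ratio = laptop_watts / device_watts
print(f"a 20 W laptop draws about {ratio:.0e} times the power of a 1 nW device")
```

The ratio is about 2 × 10¹⁰, so a nanowatt-scale machine learning device would use tens of billions of times less power than a laptop at the low end of that range.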

By “significantly lowering the power consumption,” Shrivastava says, they are working “to enable applications that are not [currently] possible due to the want of power.”

While the power-saving benefits of analog computing in specific contexts, like machine learning, have long been known, Shrivastava’s “research goals are to make analog possible,” he says, “by addressing the implementation challenges.”

To begin with, Shrivastava’s hardware will focus on proof-of-concept machine learning tasks, he says, such as handwritten character recognition.

The Young Faculty Award Shrivastava received from DARPA includes $1 million in funding to support researcher salaries, laboratory space and materials.

“This is one of the most competitive” grants researchers can receive, Shrivastava says.

Analog computing has been around for a long time, Shrivastava says, and the earliest computers were entirely analog. “That does not necessarily mean that analog has to be traditional.”

Noah Lloyd is a Senior Writer for NGN Research. Follow him on X/Twitter at @noahghola.
