Are we bright, or really kinda dim? IFL Science reports that the human brain uses about as much electricity as the average computer monitor:
Considered as an organ, the brain is admittedly extremely energy-hungry: it accounts for about two percent of your body by weight, but about 20 percent of your basal energy consumption. But objectively, that’s not very much – given an average daily intake of, say, 2,700 calories, it only adds up to a paltry 340 or so calories to power the brain.
It’s the equivalent of 0.4 kilowatt-hours – enough to power an old-fashioned 60W incandescent lightbulb for less than seven hours. It’s less than the amount of energy you’d get from three bananas. It is, basically, a trifling amount, especially compared to the alternative: “One of the most powerful supercomputers in the world, the Oak Ridge Frontier, has recently demonstrated exaflop computing,” [NIST researcher Advait] Madhavan pointed out. “But it needs a million times more power [than the brain] – 20 megawatts – to pull off this feat.”
Running the Oak Ridge Frontier, therefore, would take the energy equivalent of – well, three million bananas, for one thing. But perhaps a more sensible comparison is this: to run it for a day, you’d need to burn 207 tonnes of coal, producing 340 tonnes of CO2 in the process. If not that, then 120,000 liters of liquid petroleum would do, producing only about half as much CO2; alternatively, you could burn 84 million liters of natural gas, and release just 74 tonnes of CO2.
…
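Those figures are easy to verify on the back of an envelope. Here’s a quick sanity check in Python; the coal energy density (~24 MJ/kg) and power-plant efficiency (~35%) are my own assumptions, not numbers from the article:

```python
# A quick sanity check of the numbers above (all figures approximate).
KWH_PER_KCAL = 0.001163                    # 1 kcal = 1.163 Wh

brain_kwh_day = 340 * KWH_PER_KCAL         # ~0.40 kWh, as quoted
brain_watts = brain_kwh_day * 1000 / 24    # ~16.5 W of continuous draw

FRONTIER_WATTS = 20e6                      # 20 MW, as quoted
print(f"Frontier / brain: ~{FRONTIER_WATTS / brain_watts:,.0f}x")  # ~1.2 million

# One day of Frontier: 20 MW * 24 h = 480 MWh of electricity.
# Coal needed, assuming ~24 MJ/kg coal and ~35% plant efficiency
# (my assumptions, not the article's):
mj_electric = 480 * 3600                   # MWh -> MJ
coal_tonnes = mj_electric / 0.35 / 24 / 1000
print(f"Coal per day: ~{coal_tonnes:.0f} tonnes")  # ~206, close to the quoted 207
```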
As uncannily human as modern AI programs can seem, they are, on a very fundamental level, not. It’s not just that they can’t process information in ways we can – the opposite is also true: “Traditional AI models rely heavily on backpropagation,” explained Suin Yi, an assistant professor of electrical and computer engineering at Texas A&M’s College of Engineering, in March last year – “a method used to adjust neural networks during training [which] is not biologically plausible”.
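To see what Yi means, here’s a minimal sketch of a single backpropagation step for one linear layer. The tell-tale part is the error signal: it is computed globally at the output and then sent backwards to update the weights, a step with no obvious biological counterpart. (This is a generic illustration, not code from Yi’s work.)

```python
import numpy as np

# One backpropagation step for a single linear layer, y = W @ x,
# with squared-error loss. Generic illustration only.
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 5))        # weights
x = rng.normal(size=5)             # input
target = rng.normal(size=3)        # desired output

y = W @ x
error = y - target                 # global error, known only at the output
grad_W = np.outer(error, x)        # gradient, propagated *backwards* to W
W -= 0.1 * grad_W                  # gradient-descent update
```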
Creating a computer more akin to the human brain, therefore, would require a wholesale redesign. New algorithms would be needed, running on new topographies; a ground-up re-imagining of how connections are arranged and how processing is carried out and prioritized.
Luckily, there’s a bunch of people already doing exactly that. “What we did […] is troubleshoot the biological implausibility present in prevailing machine learning algorithms,” Yi said. “Our team explores mechanisms like Hebbian learning and spike-timing-dependent plasticity – processes that help neurons strengthen connections in a way that mimics how real brains learn.”
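In code, the two mechanisms Yi names look roughly like this. Both are textbook formulations with arbitrary constants, not the Texas A&M team’s actual implementation; the thing to notice is that each update uses only information local to the synapse, with no backpropagated error:

```python
import numpy as np

def hebbian_update(w, pre, post, lr=0.01):
    """Hebbian learning: 'neurons that fire together, wire together'.
    The weight change depends only on local pre- and post-synaptic activity."""
    return w + lr * np.outer(post, pre)

def stdp_update(w, dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Spike-timing-dependent plasticity: spike *order* matters.
    dt = t_post - t_pre in ms. If the presynaptic spike came first
    (dt > 0), the synapse strengthens; otherwise it weakens."""
    if dt > 0:
        return w + a_plus * np.exp(-dt / tau)
    return w - a_minus * np.exp(dt / tau)

# A presynaptic spike arriving 5 ms before the postsynaptic one:
print(stdp_update(0.5, dt=5.0))   # slightly above 0.5, i.e. potentiation
```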
Yi and his team are just one of many groups now working on AI systems inspired by nature’s remarkable efficiency.
Researchers at the University of Surrey, for example, are toying with a technique called Topographical Sparse Mapping – a method in which each neuron is connected not to every other neuron it could reach, but only to those directly nearby, massively reducing a network’s energy requirements.
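The Surrey group’s actual method is more involved, but the core idea, wiring each neuron only to its spatial neighbours, can be sketched in a few lines. The 1-D layout and radius below are my own illustrative choices:

```python
import numpy as np

def local_mask(n_in, n_out, radius=2):
    """Connectivity mask: each output neuron connects only to inputs
    within `radius` positions of it on a 1-D map, not to all inputs."""
    in_pos = np.arange(n_in)
    out_pos = np.linspace(0, n_in - 1, n_out)
    dist = np.abs(out_pos[:, None] - in_pos[None, :])
    return (dist <= radius).astype(float)

mask = local_mask(n_in=100, n_out=100, radius=2)
weights = np.random.randn(100, 100) * mask   # pruned weight matrix

kept = int(mask.sum())
print(f"Connections kept: {kept} of {100 * 100} ({100 * kept / 10000:.0f}%)")
# ~5% of the dense connections -> far fewer multiply-accumulates,
# and correspondingly less energy per forward pass.
```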