An MIT doctoral candidate is developing neuromorphic computing — technology that emulates the human brain — to make AI more energy-efficient.

Photo Credit: Massachusetts Institute of Technology

An MIT doctoral candidate is working on technology that copies how the human brain operates to make artificial intelligence less power-hungry, MIT News reported.

Miranda Schwacke studies neuromorphic computing in the Department of Materials Science and Engineering. Her work centers on building devices that process and store data in the same place, mirroring how the brain operates through its neurons and synapses.

Schwacke’s path to this field began in Charleston, South Carolina. Her mother worked as a marine biologist, investigating how pollutants harm dolphin groups. Watching her mom apply science to grasp and address environmental issues shaped Schwacke’s goals.

“That was an example of how science can be used to understand the world, and also to figure out how we can improve the world,” Schwacke told MIT News.

She learned about materials science during high school and developed a senior research project to explore dye-sensitized solar cells. At the California Institute of Technology, she studied tiny materials and energy storage technologies, including battery systems.  

She joined professor Bilge Yildiz’s lab at the Massachusetts Institute of Technology, where she now researches ionic synapses. These devices can be adjusted through chemical processes to change how well electricity moves through them, replicating how brain cells enhance or diminish their connections.

The problem Schwacke tackles is massive. Training large AI models eats up huge amounts of electricity. Standard computers move information back and forth between separate storage and processing areas, which burns through power. 

The brain works differently. It processes and stores information in the same place, making it far more efficient.

Schwacke’s devices use tungsten oxide, a material she can adjust with precision. She examines what happens when magnesium ions enter the material and alter its resistance, which controls how strongly signals pass through in ways that mirror synapses in our brains. She received a MathWorks Fellowship for her research in 2023 and 2024.

AI can help optimize renewable energy grids, speed up climate solutions, and reduce pollution. But AI also demands enormous quantities of electricity and water for cooling data centers. 

Training a single large language model can produce as much carbon pollution as five cars over their lifetimes. The technology creates electronic waste and raises concerns about misuse and security. As AI spreads through every industry, recognizing these trade-offs becomes more important.

If you’re concerned about technology’s environmental footprint, support research into energy-efficient computing. Ask tech companies about their energy sources and cooling methods.

“If you look at AI in particular, to train these really large models, that consumes a lot of energy. And if you compare that to the amount of energy that we consume as humans when we’re learning things, the brain consumes a lot less energy,” Schwacke said. “That’s what led to this idea to find more brain-inspired, energy-efficient ways of doing AI.”
