AI Accurately Predicts the Useful Life of Batteries
March 26, 2019 | Stanford University | Estimated reading time: 4 minutes

If manufacturers of cellphone batteries could tell which cells will last at least two years, then they could sell only those to phone makers and send the rest to makers of less demanding devices. New research shows how manufacturers could do this. The technique could be used not only to sort manufactured cells but to help new battery designs reach the market more quickly.
MIT professor Richard Braatz, left, and William Chueh, assistant professor in materials science and engineering at Stanford, led researchers at their institutions who developed a better battery testing technique. (Image credit: Amos Enshen Lu)
Scientists at Stanford University, the Massachusetts Institute of Technology and the Toyota Research Institute have found that combining comprehensive experimental data with artificial intelligence is the key to accurately predicting the useful life of lithium-ion batteries before their capacities start to wane. After the researchers trained their machine learning model with a few hundred million data points of batteries charging and discharging, the algorithm predicted how many more cycles each battery would last, based on voltage declines and a few other factors from the early cycles.
The predictions were within 9% of the number of cycles the cells actually lasted. Separately, the algorithm categorized batteries as having either long or short life expectancies based on just the first five charge/discharge cycles. Here, the predictions were correct 95% of the time.
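The article does not describe the model itself, but the two tasks it mentions, regressing remaining cycle life on features drawn from the early cycles and classifying cells as long- or short-lived, can be sketched in a few lines. The sketch below is illustrative only: the features, the synthetic data and the scikit-learn models are assumptions for demonstration, not the authors' published pipeline.

```python
# A minimal, illustrative sketch (not the authors' code): predict cycle life
# from early-cycle features, and classify cells as long- vs. short-lived.
import numpy as np
from sklearn.linear_model import ElasticNet, LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_cells = 124  # illustrative number of cells

# Hypothetical per-cell features computed from early cycles, e.g. an early
# capacity-fade slope, a discharge-voltage-curve statistic, a resistance change.
X = rng.normal(size=(n_cells, 3))

# Synthetic "true" cycle lives spanning very roughly the 150-2,300 cycle range
# reported in the article; the log-linear link to the features is an assumption.
log_life = 6.5 + X @ np.array([0.5, -0.3, 0.1]) + rng.normal(0.0, 0.15, n_cells)
cycle_life = np.exp(log_life)
long_lived = cycle_life > np.median(cycle_life)

X_tr, X_te, y_tr, y_te, c_tr, c_te = train_test_split(
    X, cycle_life, long_lived, random_state=0)

# Task 1: regression -- predict how many cycles each cell will last.
reg = ElasticNet(alpha=0.01).fit(X_tr, np.log(y_tr))
pred = np.exp(reg.predict(X_te))
print(f"mean percentage error: {np.mean(np.abs(pred - y_te) / y_te):.1%}")

# Task 2: classification -- label each cell long- or short-lived from the same features.
clf = LogisticRegression().fit(X_tr, c_tr)
print(f"classification accuracy: {clf.score(X_te, c_te):.0%}")
```

In the published study the predictors were built from real cycling measurements rather than synthetic features, but the overall shape of the workflow, early-cycle features in, cycle-life estimate and long/short label out, is the same.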
Published March 25 in Nature Energy, this machine learning method could accelerate research and development of new battery designs and reduce the time and cost of production, among other applications. The researchers have made the dataset—the largest of its kind—publicly available.
Stanford researchers developed a machine learning technique to identify how long batteries will last.
“The standard way to test new battery designs is to charge and discharge the cells until they fail. Since batteries have a long lifetime, this process can take many months and even years,” said co-lead author Peter Attia, Stanford doctoral candidate in materials science and engineering. “It’s an expensive bottleneck in battery research.”
The work was carried out at the Center for Data-Driven Design of Batteries, an academic-industrial collaboration that integrates theory, experiments and data science. The Stanford researchers, led by William Chueh, assistant professor in materials science and engineering, conducted the battery experiments. MIT’s team, led by Richard Braatz, professor in chemical engineering, performed the machine learning work. Kristen Severson, co-lead author of the research, completed her doctorate in chemical engineering at MIT last spring.
Optimizing Fast Charging
One focus in the project was to find a better way to charge batteries in 10 minutes, a feature that could accelerate the mass adoption of electric vehicles. To generate the training dataset, the team charged and discharged the batteries until each one reached the end of its useful life, which they defined as a capacity loss of 20 percent. En route to optimizing fast charging, the researchers wanted to find out whether it was really necessary to run their batteries into the ground: could the answer be found in data from just the early cycles? (A concrete version of that end-of-life definition is sketched below.)
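The snippet below makes the 20 percent end-of-life definition concrete: cycle life is simply the first cycle at which discharge capacity has faded 20 percent from nominal. It is a generic sketch, not the study's processing code; the 1.1 Ah nominal capacity and the synthetic fade curve are assumptions.

```python
import numpy as np

def cycle_life(discharge_capacity_ah, nominal_capacity_ah, fade_fraction=0.20):
    """First cycle at which capacity has faded by `fade_fraction` (20% here),
    matching the end-of-life definition used to build the training dataset."""
    threshold = (1.0 - fade_fraction) * nominal_capacity_ah
    below = np.flatnonzero(discharge_capacity_ah < threshold)
    return int(below[0]) + 1 if below.size else None  # None: end of life not reached

# Example with a made-up fade curve for a hypothetical 1.1 Ah cell.
cycles = np.arange(1, 1001)
capacity = 1.1 * (1.0 - 0.05 * (cycles / 800.0) ** 8)  # synthetic curve with a late "knee"
print(cycle_life(capacity, nominal_capacity_ah=1.1))   # end of life around cycle 950
```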
Stanford graduate students Nicholas Perkins, left, Peter Attia and Norman Jin are among the researchers who found the key for accurately predicting the useful life of lithium-ion batteries. (Image credit: Dean Deng)
“Advances in computational power and data generation have recently enabled machine learning to accelerate progress for a variety of tasks. These include prediction of material properties,” Braatz said. “Our results here show how we can predict the behavior of complex systems far into the future.”
Generally, the capacity of a lithium-ion battery is stable for a while. Then it takes a sharp turn downward. The plummet point varies widely, as most 21st-century consumers know. In this project, the batteries lasted anywhere from 150 to 2,300 cycles. That variation was partly the result of testing different methods of fast charging but also due to manufacturing variability among batteries.
“For all of the time and money that gets spent on battery development, progress is still measured in decades,” said study co-author Patrick Herring, a scientist at the Toyota Research Institute. “In this work, we are reducing one of the most time-consuming steps – battery testing – by an order of magnitude.”
Possible Uses
The new method has many potential applications, Attia said. For example, it can shorten the time for validating new types of batteries, which is especially important given rapid advances in materials. With the sorting technique, electric vehicle batteries determined to have short lifespans – too short for cars – could be used instead to power street lights or back up data centers. Recyclers could find cells from used EV battery packs with enough capacity left for a second life.
Yet another possibility is optimizing battery manufacturing. “The last step in manufacturing batteries is called ‘formation,’ which can take days to weeks,” Attia said. “Using our approach could shorten that significantly and lower the production cost.”
The researchers are now using their model to optimize ways of charging batteries in just 10 minutes, which they say will cut the process by more than a factor of 10.
Chueh is also a center fellow at Stanford’s Precourt Institute for Energy, which funded his exploratory work for this project. Other co-authors are Stanford students Norman Jin, Nicholas Perkins and Michael Chen; MIT Professor Martin Bazant, postdoc Benben Jiang and student Dimitrios Fraggedakis; Muratahan Aykol at Toyota Research Institute; Stephen Harris at Lawrence Berkeley National Laboratory and a visiting scholar at Stanford; and University of Michigan student Zi Yang, a Stanford intern.
This work was supported by the Toyota Research Institute, the Thomas V. Jones Stanford Graduate Fellowship, the National Science Foundation, SAIC through Stanford Energy 3.0, and the U.S. Department of Energy.