AIhub.org
 

Machine learning fine-tunes graphene synthesis


08 February 2022




Rice University chemists are employing machine learning to fine-tune their flash Joule heating process for making graphene. A flash signifies the creation of graphene from waste. (Credit: Jeff Fitlow/Rice University)

By Mike Williams

Rice University scientists are using machine learning techniques to streamline the process of synthesizing graphene from waste through flash Joule heating.

The flash Joule heating process has expanded beyond making graphene from various carbon sources to extracting other materials, such as metals, from urban waste.

The technique is the same for all of the above: blasting a jolt of high energy through the source material to eliminate all but the desired product. But the details for flashing each feedstock are different.

In Advanced Materials, the researchers describe how machine learning models that adapt to process variables and show how to optimize procedures are helping them push the technique forward.

“Machine learning algorithms will be critical to making the flash process rapid and scalable without negatively affecting the graphene product’s properties,” said Rice chemist James Tour.

“In the coming years, the flash parameters can vary depending on the feedstock, whether it’s petroleum-based, coal, plastic, household waste or anything else,” he said. “Depending on the type of graphene we want — small flake, large flake, high turbostratic, level of purity — the machine can discern by itself what parameters to change.”

Because flashing makes graphene in hundreds of milliseconds, it’s difficult to tease out the details of the chemical process. So Tour and his colleagues took a cue from materials scientists who have worked machine learning into their everyday process of discovery.

Machine learning is fine-tuning Rice University’s flash Joule heating method for making graphene from a variety of carbon sources, including waste materials. (Credit: Jacob Beckham/Tour Group)

“It turned out that machine learning and flash Joule heating had really good synergy,” said Rice graduate student and lead author Jacob Beckham. “Flash Joule heating is a really powerful technique, but it’s difficult to control some of the variables involved, like the rate of current discharge during a reaction. And that’s where machine learning can really shine. It’s a great tool for finding relationships between multiple variables, even when it’s impossible to do a complete search of the parameter space.

“That synergy made it possible to synthesize graphene from scrap material based entirely on the models’ understanding of the Joule heating process,” he said. “All we had to do was carry out the reaction — which can eventually be automated.”
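The closed loop Beckham describes — a model proposes flash parameters, the reaction is run, and the measured result feeds back into the model — can be sketched as a simple surrogate-model optimization. Everything below is hypothetical: the quality function, the parameter ranges and the quadratic surrogate are illustrative stand-ins, not the actual model from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ground truth: graphene quality as an unknown function of two
# flash parameters, voltage (V) and pulse duration (ms). The optimizer never
# sees this function directly; it only observes quality at tried settings.
def flash_quality(voltage, duration):
    return 1.0 - 0.001 * (voltage - 120) ** 2 - 0.002 * (duration - 50) ** 2

def quadratic_features(X):
    v, d = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(v), v, d, v * d, v**2, d**2])

# Seed with a handful of random trials, then iterate: fit a quadratic
# surrogate by least squares, "flash" at the surrogate's predicted optimum,
# and add the new measurement to the training set.
X = rng.uniform([80, 10], [160, 90], size=(8, 2))
y = flash_quality(X[:, 0], X[:, 1])

grid = np.array([[v, d] for v in np.linspace(80, 160, 41)
                        for d in np.linspace(10, 90, 41)])

for _ in range(10):
    coef, *_ = np.linalg.lstsq(quadratic_features(X), y, rcond=None)
    candidate = grid[np.argmax(quadratic_features(grid) @ coef)]
    X = np.vstack([X, candidate])
    y = np.append(y, flash_quality(candidate[0], candidate[1]))

best_trial = X[np.argmax(y)]
print(best_trial, y.max())  # converges to the hypothetical optimum near 120 V, 50 ms
```

In a real setting the surrogate would be richer (the paper's appeal is precisely that the parameter space is too large to search exhaustively), but the loop structure — model, propose, measure, refit — is the part that can eventually be automated.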

The lab used its custom optimization model to improve graphene crystallization from four starting materials — carbon black, plastic pyrolysis ash, pyrolyzed rubber tires and coke — over 173 trials, using Raman spectroscopy to characterize the starting materials and graphene products.
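Raman spectroscopy characterizes graphene largely through the relative intensities of the D, G and 2D bands; a high 2D/G intensity ratio is a common indicator of well-crystallized, turbostratic graphene. As a rough illustration only — the spectrum below is synthetic, with invented peak positions, widths and amplitudes:

```python
import numpy as np

# Synthetic Raman spectrum: Lorentzian peaks at the D (~1350 cm^-1),
# G (~1580 cm^-1) and 2D (~2700 cm^-1) bands of graphene.
def lorentzian(x, center, width, amplitude):
    return amplitude * width**2 / ((x - center) ** 2 + width**2)

shift = np.linspace(1000, 3000, 2000)           # Raman shift, cm^-1
spectrum = (lorentzian(shift, 1350, 15, 0.2)    # D band (disorder)
            + lorentzian(shift, 1580, 15, 1.0)  # G band
            + lorentzian(shift, 2700, 20, 1.4)) # 2D band

def band_intensity(shift, spectrum, lo, hi):
    """Peak intensity within a Raman-shift window."""
    mask = (shift >= lo) & (shift <= hi)
    return spectrum[mask].max()

i_g = band_intensity(shift, spectrum, 1500, 1650)
i_2d = band_intensity(shift, spectrum, 2600, 2800)
ratio = i_2d / i_g
print(round(ratio, 2))  # ~1.4 here; 2D/G > 1 is often read as high-quality graphene
```

Scalar summaries like this ratio are what turn thousands of raw spectra into labels a model can train on.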

The researchers then fed more than 20,000 spectroscopy results to the model and asked it to predict which starting materials would provide the best yield of graphene. The model also took the effects of charge density, sample mass and material type into account in its calculations.
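A minimal sketch of that kind of prediction, assuming a linear model over one-hot-encoded material type plus charge density and sample mass — the records, units and values below are invented for illustration, not data from the study:

```python
import numpy as np

# Hypothetical training records: (material, charge density in A/g,
# sample mass in mg) -> graphene yield fraction. All values invented;
# the real model was trained on >20,000 Raman spectra.
materials = ["carbon_black", "plastic_ash", "rubber_tire", "coke"]
records = [
    ("carbon_black", 0.8, 100, 0.90), ("carbon_black", 0.5, 120, 0.75),
    ("plastic_ash",  0.8, 100, 0.70), ("plastic_ash",  0.5, 120, 0.55),
    ("rubber_tire",  0.8, 100, 0.65), ("rubber_tire",  0.5, 120, 0.50),
    ("coke",         0.8, 100, 0.85), ("coke",         0.5, 120, 0.70),
]

def encode(material, charge, mass):
    # One-hot material type, plus numeric features (mass rescaled so the
    # columns are on comparable scales).
    one_hot = [1.0 if material == m else 0.0 for m in materials]
    return one_hot + [charge, mass / 100.0]

X = np.array([encode(m, c, g) for m, c, g, _ in records])
y = np.array([yld for *_, yld in records])

# Fit by least squares, then rank materials at fixed flash conditions.
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
preds = {m: float(np.dot(encode(m, 0.8, 100), coef)) for m in materials}
best = max(preds, key=preds.get)
print(best)
```

The point of the encoding is that categorical feedstock type and continuous flash conditions enter the same model, so one fit can answer both "which material?" and "at what settings?".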

Co-authors are Rice graduate students Kevin Wyss, Emily McHugh, Paul Advincula and Weiyin Chen; Rice alumnus John Li; and, from the University of Missouri, postdoctoral researcher Yunchao Xie and Jian Lin, an associate professor of mechanical and aerospace engineering. Tour is the T.T. and W.F. Chao Chair in Chemistry as well as a professor of computer science and of materials science and nanoengineering.

The Air Force Office of Scientific Research (FA9550-19-1-0296), the U.S. Army Corps of Engineers (W912HZ-21-2-0050) and the Department of Energy (DE-FE0031794) supported the research.



