Superconductors are a key component of cutting-edge technology in transportation, energy, and other industries. A new paper from Georgia Tech and the Hanoi University of Science and Technology proposes an AI-powered method to identify potential superconductors faster.
According to the paper, the joint team incorporates atomic-level information into machine learning pipelines to discover new conventional superconductors, particularly at ambient pressure. At issue is the challenging task of predicting high-temperature superconductivity at zero pressure when atomic-level information is unavailable.
To address this, the researchers curated a dataset of 584 atomic structures with more than 1,100 values of λ (the electron–phonon coupling constant) and ωlog (the logarithmic average phonon frequency). These were computed at different pressures, and machine learning models for λ and ωlog were trained on the resulting data.
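In conventional superconductivity theory, λ and ωlog together determine the critical temperature Tc through the McMillan–Allen–Dynes formula, which is the standard way such predicted quantities are translated into a Tc estimate. A minimal sketch, assuming the simplified Allen–Dynes form with a typical Coulomb pseudopotential μ* = 0.1 (the paper's exact choice of formula and μ* is not stated here):

```python
import math

def allen_dynes_tc(lam, omega_log_K, mu_star=0.10):
    """McMillan-Allen-Dynes estimate of Tc (in kelvin) from the
    electron-phonon coupling lam and the log-averaged phonon
    frequency omega_log (given in kelvin)."""
    denom = lam - mu_star * (1.0 + 0.62 * lam)
    if denom <= 0:
        return 0.0  # coupling too weak to sustain superconductivity
    return (omega_log_K / 1.2) * math.exp(-1.04 * (1.0 + lam) / denom)

# Illustrative numbers only (not from the paper):
print(round(allen_dynes_tc(1.0, 300.0), 1))  # → 20.9
```

Note how sensitive Tc is to λ: because λ appears inside the exponential, small errors in the predicted coupling translate into large errors in Tc, which is why accurate λ models matter for screening.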
The models were then used to screen more than 80,000 entries in the Materials Project database. Follow-up first-principles computations identified two thermodynamically and dynamically stable materials predicted to superconduct at Tc ≈ 10–15 K at P = 0.
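The screening step itself is conceptually a filter over the database; a minimal sketch, where `predict_lam`, `predict_wlog`, and `estimate_tc` are hypothetical stand-ins for the trained models and a Tc formula (none of these names come from the paper):

```python
# Hypothetical screening loop; all function names are illustrative only.
def screen(entries, predict_lam, predict_wlog, estimate_tc, tc_min=10.0):
    """Keep database entries whose ML-estimated Tc clears a threshold."""
    hits = []
    for entry in entries:
        lam = predict_lam(entry)    # predicted electron-phonon coupling
        wlog = predict_wlog(entry)  # predicted log-averaged phonon frequency
        if estimate_tc(lam, wlog) >= tc_min:
            hits.append(entry)
    return hits

# Dummy usage: entries are dicts, predictors just read stored values,
# and the Tc estimate is a toy formula for demonstration.
entries = [{"id": "a", "lam": 1.2, "wlog": 300.0},
           {"id": "b", "lam": 0.3, "wlog": 300.0}]
hits = screen(entries,
              lambda e: e["lam"],
              lambda e: e["wlog"],
              lambda lam, wlog: wlog * lam / 20.0)
print([e["id"] for e in hits])  # → ['a']
```

Screening with cheap ML predictions first, and reserving expensive first-principles calculations for the survivors, is what makes searching 80,000+ entries tractable.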
To achieve this, they employed the matminer package to convert atomic structures into numerical vectors and used Gaussian process regression as the machine learning algorithm. According to the paper, the machine learning models identified 35 candidate materials.
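This featurize-then-regress pattern can be sketched with scikit-learn's Gaussian process regressor. In the real pipeline each input row would be a descriptor vector produced by one of matminer's featurizers from an atomic structure; since those settings are not given here, the sketch below uses random placeholder descriptors and a synthetic target standing in for λ:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Placeholder for matminer output: in practice each row would be a
# fixed-length descriptor vector computed from an atomic structure.
X = rng.normal(size=(100, 8))
y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=100)  # synthetic stand-in for lambda

# One GPR model per target (lambda, omega_log); shown here for one target.
model = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
model.fit(X, y)

# GPR returns a predictive mean plus an uncertainty estimate, which helps
# prioritize which screened candidates to verify with first-principles runs.
mean, std = model.predict(X[:5], return_std=True)
print(mean.shape, std.shape)  # (5,) (5,)
```

A design note: Gaussian process regression suits this setting because the training set is small (hundreds of structures) and its built-in uncertainty estimates flag predictions that extrapolate far from the training data.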
Of those 35 candidates, six had the highest predicted Tc values; the others were either too unstable or required further stabilization calculations. After verifying the stability of two candidates, CrH and CrH2, the researchers computed their superconducting properties using first-principles calculations.
This allowed the researchers to evaluate their predictions and perform further calculations to confirm them; the predictions currently agree to within 2–3% of the reported values. So why is this important?
Well, identifying new candidates for potential superconductors in a faster and more reliable way is important for a number of reasons. Some of these include enhancing energy efficiency, advancing power transmission and storage, and driving scientific understanding of quantum phenomena.
The latter two, storage and quantum phenomena, are especially relevant to data science. In the case of storage, massive amounts of data are produced every second, and storing it all can become a problem. As for quantum phenomena, deeper understanding there could hold the key to advancements in general artificial intelligence and other sub-fields of data science.