The Center for Artificial Intelligence Innovation (CAII) at the National Center for Supercomputing Applications (NCSA) has announced the winners of its recent hackathon, the “Ashby Prize in Computational Science at the University of Illinois at Urbana-Champaign,” sponsored by CAII, the Department of Atmospheric Sciences, and the Department of Mechanical Science and Engineering.
The Ashby Prize in Computational Science at the University of Illinois at Urbana-Champaign is awarded to a multidisciplinary team of students for the innovative use of high-performance computing to address a problem of societal significance. The Prize was established by Steven and Beth Ashby. Steve was one of the early users of NCSA, and has a background in computational science. He is currently the Director of the Department of Energy’s Pacific Northwest National Laboratory. He holds a B.S. in Mathematics/Computer Science from the University of Santa Clara and earned his M.S. and Ph.D. in Computer Science from the University of Illinois at Urbana-Champaign.
The winning team from the University of Illinois was Kwok Sun Tang, Chu-Chun Chen, Tin-Yin Lai, Kedar Phadke and Labdhi Jain. “This was very challenging to judge, as the quality of the teams and their work was excellent across the board,” said Dr. Volodymyr Kindratenko, Director of the Center for Artificial Intelligence Innovation at NCSA. “The talent and innovation from each team were high from first to third place, and we are pleased with the outcome.”
1st place: Team #6: Kwok Sun Tang, Chu-Chun Chen, Tin-Yin Lai, Kedar Phadke, Labdhi Jain
2nd place: Team #4: Ziyang Xu, Enyi Jiang, Yuhang Ren, Xiuyi Qin, and Team #5: Hao Bai, Ruike Zhu, Xiyue Zhu, Muhil Arumugam
3rd place: Team #1: Kastan Day, Daniel Christl, Seonghwan Kim, Vardhan Dongre, and Team #2: Aniruddha Mukherjee, Richwell Perez, Philip Chmielowiec, William Eustis.
“I have been looking forward to this inaugural competition for some time and the teams did not disappoint!” said Dr. Ashby. “Each team did a terrific job and I commend their efforts. Their results were impressive and demonstrate the power of computational science to serve societal needs—which is the purpose of the Ashby Prize in Computational Science. I am pleased to know that my alma mater continues to lead the way in computer and computational sciences.”
“NCSA was proud to host the inaugural Ashby prize competition,” said NCSA Director Dr. Bill Gropp. “These students saw firsthand the potential of high-performance computing to provide innovative solutions to challenging problems.”
The objective of this competition was to create a machine learning model, trained on accurate WRF-PartMC data, that predicts climate-relevant aerosol properties from only the features that current GCMs can output. The problem statement and the training data were provided by Prof. Nicole Riemer and her postdoc Dr. Jeffrey Curtis from the Department of Atmospheric Sciences and Prof. Matthew West from the Department of Mechanical Science and Engineering. Prof. Riemer’s research group develops computer simulations that describe how aerosol particles are created, transported, and transformed in the atmosphere, and Prof. West’s team developed the particle-resolved Monte Carlo code, PartMC, for atmospheric aerosol simulation, which was used to generate the datasets. Dr. Curtis spent countless hours during the competition helping the teams make sense of the data and explaining the science behind the challenge. NCSA provided the computational resources, its purpose-built HAL cluster for deep learning, to host the data and enable model development and training. Dr. Dawei Mu from NCSA helped competition participants use the system.
The winning team applied a tabular data learning architecture called TabNet, which uses sequential attention to automatically select the features on which each prediction is based. The resulting feature importance map indicates which input variables are useful for predicting the output variables. Several other teams also tried to analyze and understand which features to base the prediction on; however, the approach used by the winning team is both unique and very powerful, as it automatically selects variables to ignore and creates masks for training and inference. As a result, the output variables were predicted with a high degree of accuracy compared to other models.
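The key ingredient behind TabNet’s feature masks is the sparsemax transformation, which, unlike softmax, can assign exactly zero weight to inputs and thereby ignore features entirely. As a minimal illustration of that masking idea (a conceptual sketch in NumPy, not the winning team’s actual competition code), consider:

```python
import numpy as np

def sparsemax(z):
    """Sparsemax: a softmax-like projection that can return exact zeros.
    TabNet uses it to turn attention logits into sparse feature masks,
    so irrelevant tabular features receive no weight at all."""
    z = np.asarray(z, dtype=float)
    z_sorted = np.sort(z)[::-1]          # logits in descending order
    k = np.arange(1, len(z) + 1)
    cumsum = np.cumsum(z_sorted)
    support = z_sorted + 1.0 / k > cumsum / k   # features kept in the mask
    k_max = k[support][-1]               # size of the support set
    tau = (cumsum[k_max - 1] - 1.0) / k_max     # threshold
    return np.maximum(z - tau, 0.0)      # weights sum to 1, rest are 0

# Toy attention logits over five tabular features: two clearly relevant,
# three not. Softmax would spread some weight everywhere; sparsemax
# produces a hard mask that zeroes out the irrelevant features.
logits = np.array([2.0, 1.5, -1.0, -2.0, -3.0])
mask = sparsemax(logits)
print(mask)  # → [0.75 0.25 0.   0.   0.  ]
```

In TabNet, such masks are produced at every decision step, and aggregating them across steps yields the feature importance map the judges highlighted.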
“We really liked the interpretability of the model, as shown by the feature maps indicating which features were dominant spatially and temporally,” said Drs. Riemer and West, part of the judges panel. “These maps have the potential to offer physical insight into the workings of the machine learning model and could lead to new scientific hypotheses.”
The two teams that placed second in the competition tried several models, including a 3D U-Net model, which proved to work well for some of the features. “We liked the progression of models tried by Team 4, from simple to increasingly complex,” said Dr. Riemer. “The feature selection was performed well and powerfully motivated. The team also considered the physical background of the predicted quantities and used this to guide the development of the models.” For Team 5, “A great aspect of their solution was the data augmentation,” said Dr. West. “This was a very sensible idea to boost the volume of training data and avoid learning spurious features. We also appreciated the attempt to include time correlations via the use of an LSTM.”
The teams that placed third in the competition applied drastically different and more complex models with very promising results. However, these models also proved more challenging to train in the amount of time allocated for the hackathon. “We liked the use of pre-trained ResNets for spatial embeddings and the incorporation of temporal relationships via a bidirectional LSTM done by Team 1,” said Drs. Riemer and West. They also noted “the novel use of a transformer to capture temporal information in the data by Team 2. The pairing of this with a U-Net encoder/decoder architecture could represent a powerful architectural approach.” “These are very interesting findings that, with some additional work, could lead to even better results,” said Dr. Kindratenko.
The National Center for Supercomputing Applications at the University of Illinois Urbana-Champaign provides supercomputing and advanced digital resources for the nation’s science enterprise. At NCSA, University of Illinois faculty, staff, students and collaborators from around the globe use these resources to address research challenges for the benefit of science and society. NCSA has been advancing many of the world’s industry giants for over 35 years by bringing industry, researchers and students together to solve grand challenges at rapid speed and scale.