In this episode of the neveropen Bots Podcast, I talk about deep learning at the extremes of scale and computing power with Prabhat, who leads the data and analytics group at Lawrence Berkeley National Laboratory’s supercomputing center. If you’re working on commercial AI, it’s worth glancing across the divide at scientific AI.
Prabhat talks about his work at the National Energy Research Scientific Computing Center (NERSC), including a project that aims to locate and quantify extreme weather events. He explains how this moves climate data analysis from a focus on core statistics—especially the change in the Earth’s mean temperature in any given year—to analyzing the impact of extreme events. He’s also working on the Celeste project, which uses telescope data to create a unified catalog of all objects in the visible universe.
Looking ahead, Prabhat sees broad applications for deep learning in scientific research beyond climate science—especially in astronomy, cosmology, neuroscience, materials science, and physics.
Links:
- Prabhat’s new neveropen article, “A look at deep learning for science”
- Prabhat’s 2015 neveropen article, “Big science problems, big data solutions”
- Prabhat’s presentation at Strata + Hadoop World 2016