
Science and Technology

Caltech Researchers: How Can AI Propel Exploration of the Cosmos, the Oceans, and the Earth?

Published on Wednesday, February 7, 2024 | 5:35 am
 

BLACK HOLES

Computational imaging expert Katie Bouman describes how machine learning enables imaging of black holes and other astrophysical phenomena.

We use machine learning to produce images from data gathered by the Event Horizon Telescope (EHT), a network of telescopes located around the world that work in sync to observe a black hole simultaneously. Together, these telescopes function as one giant virtual telescope. We combine their data computationally and develop algorithms to fill in the gaps in coverage. Because of the holes in our virtual telescope’s coverage, a lot of uncertainty is introduced in forming the picture. Characterizing this uncertainty, or quantifying the range of possible appearances the black hole could take on, took the worldwide team months for the M87* black hole image and literally years for the more recent picture of the Sgr A* black hole at the center of our Milky Way.
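
As a rough illustration of why those coverage gaps create uncertainty, here is a minimal numpy sketch, not the EHT pipeline itself: the image size and the sampling mask are invented for illustration. An interferometer-like instrument measures only a fraction of an image's Fourier components, and a naive reconstruction from the incomplete data comes out ambiguous and artifact-ridden.

```python
# Minimal numpy sketch (not the EHT pipeline): an interferometer only samples
# some Fourier components of the sky, so the image must be recovered from
# incomplete Fourier data. Image size and sampling mask are invented here.
import numpy as np

rng = np.random.default_rng(0)
n = 32                                   # toy sky image is n x n pixels
true_sky = np.zeros((n, n))
true_sky[12:20, 12:20] = 1.0             # a simple bright "source"

# Each telescope pair contributes Fourier ("visibility") samples; with few
# telescopes, most of the Fourier plane is never measured.
full_ft = np.fft.fft2(true_sky)
mask = rng.random((n, n)) < 0.15         # pretend ~15% of components are observed
measured = np.where(mask, full_ft, 0.0)

# Naive reconstruction from the incomplete data (the "dirty image"): the gaps
# in coverage show up as artifacts and ambiguity, which is what the imaging
# algorithms and the uncertainty analysis have to handle.
dirty_image = np.real(np.fft.ifft2(measured))
print("peak of true sky:   ", true_sky.max())
print("peak of dirty image:", round(dirty_image.max(), 3))
```

In the sketch the missing Fourier samples are simply zeroed out; the algorithms described above instead fill in the gaps with physically motivated assumptions, which is exactly where the months-to-years effort of characterizing the uncertainty comes in.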

We can use the image-generating power of machine learning methods called deep-learning generative models to more efficiently capture the uncertainty in the EHT images. The models we are developing quickly generate a whole distribution of possible images that fit the complicated data we collect, not just a single image. We have also been using these same generative models to evaluate uncertainty for exoplanet orbits, medical imaging, and seismology.
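
To make the "distribution of possible images" idea concrete, here is a hedged numpy sketch that uses a small Gaussian linear model as a stand-in for the deep generative models described above; the measurement operator, noise level, and problem sizes are all invented. Many candidate reconstructions consistent with the data are drawn, and their per-pixel spread serves as an uncertainty map.

```python
# Hedged numpy sketch of "many images that fit the data": a small Gaussian
# linear model stands in for the deep generative models described above.
# The measurement operator A, noise level, and sizes are invented.
import numpy as np

rng = np.random.default_rng(1)
n_pixels, n_measurements = 16, 6
x_true = rng.random(n_pixels)                        # toy "image" (flattened)
A = rng.standard_normal((n_measurements, n_pixels))  # under-determined sensing
noise_sigma, prior_sigma = 0.05, 1.0
y = A @ x_true + noise_sigma * rng.standard_normal(n_measurements)

# Gaussian posterior over images given the data y (ridge-regression form):
#   cov  = (A^T A / sigma^2 + I / prior_sigma^2)^(-1)
#   mean = cov @ A^T y / sigma^2
cov = np.linalg.inv(A.T @ A / noise_sigma**2 + np.eye(n_pixels) / prior_sigma**2)
mean = cov @ A.T @ y / noise_sigma**2

# Draw a whole distribution of candidate "images" consistent with the data,
# then summarize their spread as a per-pixel uncertainty map.
samples = rng.multivariate_normal(mean, cov, size=1000)
pixel_std = samples.std(axis=0)
print("largest per-pixel uncertainty:", round(pixel_std.max(), 3))
```

A deep generative model plays the same role as the Gaussian posterior here, but it can represent far more complicated image distributions and produce its samples much faster than the classical analysis.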

One area where we are excited about using machine learning is in optimizing sensors for computational imaging. For instance, we are currently developing machine learning methods to help identify locations for new telescopes to add to the EHT. By designing the telescope placement and the image-reconstruction software simultaneously, we can squeeze more information out of the data we collect, recovering higher-fidelity imagery with less uncertainty. This idea of co-designing computational imaging systems also extends far beyond the EHT into medical imaging and other domains.
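
The co-design idea can be sketched in a few lines of PyTorch. The following toy example is an assumption-laden illustration, not the group's method: the problem sizes, the linear "sensor" model, and the small reconstruction network are all invented. The point is that the measurement weights and the reconstructor are optimized against a single objective, so the learned "placement" adapts to what the reconstruction network can actually recover.

```python
# Toy PyTorch sketch of sensor/reconstruction co-design (invented sizes and a
# linear "sensor" model; not the group's method): the measurement weights and
# the reconstruction network are optimized jointly.
import torch

torch.manual_seed(0)
n, m = 64, 12                                  # signal length, number of sensors
basis = torch.randn(8, n)                      # toy low-dimensional structure
signals = torch.randn(512, 8) @ basis          # training "images" in a subspace

sensor_weights = torch.nn.Parameter(0.1 * torch.randn(m, n))   # learnable "placement"
reconstructor = torch.nn.Sequential(
    torch.nn.Linear(m, 128), torch.nn.ReLU(), torch.nn.Linear(128, n)
)
optimizer = torch.optim.Adam([sensor_weights, *reconstructor.parameters()], lr=1e-2)

for step in range(200):
    measurements = signals @ sensor_weights.T  # simulate sensing the signals
    recon = reconstructor(measurements)        # reconstruct from measurements
    loss = torch.mean((recon - signals) ** 2)  # one joint objective for both parts
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print("final reconstruction error:", loss.item())
```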

Caltech astronomy professors Gregg Hallinan and Vikram Ravi are leading an effort called the DSA-2000, in which 2,000 telescopes in Nevada will image the entire sky at radio wavelengths. Unlike the EHT, where we have to fill in gaps in the data, this project will collect a huge amount of data: about 5 terabytes per second. All the processing steps, like correlation, calibration, and imaging, have to be fast and automated. There is no time to do this work using traditional methods. Our group is collaborating to develop deep learning methods that automatically clean up the images so users receive images within minutes of collecting the data.
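
For a sense of scale, a quick back-of-the-envelope Python calculation using only the roughly 5 terabytes per second figure quoted above shows why the pipeline cannot simply store the raw data for later, traditional processing.

```python
# Back-of-the-envelope check, using only the ~5 terabytes per second quoted above,
# of why the DSA-2000 pipeline has to be automated and run as the data arrives.
data_rate_tb_per_s = 5
seconds_per_day = 24 * 60 * 60

tb_per_day = data_rate_tb_per_s * seconds_per_day
print(f"raw data per day: {tb_per_day:,} TB (~{tb_per_day / 1000:.0f} PB)")
# -> roughly 432,000 TB (~432 PB) per day, far too much to store and revisit,
#    so correlation, calibration, imaging, and cleanup must happen on the fly.
```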

—Katherine Bouman, assistant professor of computing and mathematical sciences, electrical engineering and astronomy; Rosenberg Scholar; and Heritage Medical Research Institute Investigator

ASTRONOMY

Astronomer Matthew Graham, the project scientist for the Zwicky Transient Facility, explains how AI is changing astrophysics.
