Machine learning meets space: How scientists corrected Webb’s optical distortions

Produced by: Mohsin Shaikh

Historic Launch

The US$10 billion James Webb Space Telescope was launched in December 2021, marking the biggest leap in telescope technology since Hubble.

Distance Challenge

Webb orbits the Sun at the second Lagrange point (L2), 1.5 million km from Earth, making it impossible to service in person, unlike Hubble, which astronauts visited for repairs.

Australian Contribution

The telescope carries the only Australian-designed hardware aboard: the Aperture Masking Interferometer (AMI), designed by astronomer Peter Tuthill to sharpen image resolution.

Optical Precision

AMI filters incoming light through a precisely patterned mask of holes, letting astronomers detect and correct nanometre-scale distortions across the 18 hexagonal segments of Webb's primary mirror.
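The idea behind a masked aperture can be sketched numerically: each pair of holes acts as a two-element interferometer whose separation vector (a "baseline") samples one spatial frequency of the target. The hole coordinates below are made up for illustration only; they are not the real mask geometry, and unlike Webb's actual 7-hole non-redundant mask they are not guaranteed to be non-redundant.

```python
import numpy as np
from itertools import combinations

# Illustrative hole positions (metres, projected onto the pupil).
# These are NOT the real AMI mask coordinates.
holes = np.array([
    [0.0, 2.6], [2.3, 1.3], [2.3, -1.3],
    [0.0, -2.6], [-2.3, -1.3], [-2.3, 1.3], [-1.2, 0.0],
])

# Every pair of holes defines a baseline vector; the interference
# fringes it produces encode one spatial frequency of the scene.
baselines = [b - a for a, b in combinations(holes, 2)]
print(len(baselines))  # 7 holes give 21 baselines
```

Measuring the fringe phases on each baseline is what lets AMI pin down tiny optical distortions that a full, unmasked aperture would wash out.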

Electronic Blur

Initial images showed subtle blurring from an electronic effect in the detector: charge from bright pixels leaked into darker neighbouring pixels, limiting Webb's ability to detect faint planets close to bright stars.

Data Correction

Scientists built a computer model and a machine learning algorithm that simulate AMI's optics and detector behaviour, restoring full performance by correcting the blur during data processing.
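A toy version of the correction step conveys the principle (this is a minimal sketch, not the team's actual pipeline): model the leakage as convolution of the true image with a small kernel, then undo it by dividing out the kernel's transfer function in the Fourier domain. The kernel, leakage fraction, and test image below are all invented for illustration.

```python
import numpy as np

def leakage_kernel(leak=0.05):
    """3x3 toy kernel: each pixel keeps most of its charge and
    leaks a fraction `leak` into its four nearest neighbours."""
    return np.array([[0.0,  leak,          0.0],
                     [leak, 1 - 4 * leak,  leak],
                     [0.0,  leak,          0.0]])

def _padded_transfer(kernel, shape):
    """Zero-pad the kernel to the image size and centre it at the
    origin so circular convolution does not shift the image."""
    pad = np.zeros(shape)
    ky, kx = kernel.shape
    pad[:ky, :kx] = kernel
    pad = np.roll(pad, (-(ky // 2), -(kx // 2)), axis=(0, 1))
    return np.fft.fft2(pad)

def apply_leakage(image, kernel):
    """Forward model: blur the true image with the leakage kernel."""
    return np.real(np.fft.ifft2(np.fft.fft2(image)
                                * _padded_transfer(kernel, image.shape)))

def correct_leakage(blurred, kernel):
    """Inverse model: deconvolve by dividing in the Fourier domain."""
    return np.real(np.fft.ifft2(np.fft.fft2(blurred)
                                / _padded_transfer(kernel, blurred.shape)))

# A bright "star" beside a faint "companion" on a 32x32 detector.
truth = np.zeros((32, 32))
truth[16, 16] = 1000.0   # star
truth[16, 20] = 1.0      # faint companion

k = leakage_kernel()
blurred = apply_leakage(truth, k)
restored = correct_leakage(blurred, k)
```

In the blurred frame the companion's single count is swamped by charge spilling from the star; after deconvolution `restored` matches `truth` to floating-point precision. The real correction is far more involved, since it also has to learn the detector's behaviour from data rather than assume a known kernel.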

Exoplanet Breakthrough

The correction revealed the faint planet orbiting HD 206893 and the reddest known brown dwarf, objects undetectable in Webb's uncorrected data.

Complex Imaging

AMI now resolves intricate targets, from volcanoes on Jupiter's moon Io to the black hole jet in NGC 1068 and ribbons of dust around the star WR 137, with unmatched precision.

Future Potential

This optical and electronic calibration framework sets the stage for Webb and future telescopes, like the Nancy Grace Roman Space Telescope, to hunt for Earth-like planets around nearby stars.