We may be fatally misinterpreting Webb's data


The James Webb Space Telescope shows the universe with unprecedented clarity. The observatory's razor-sharp infrared view cuts through cosmic dust to reveal some of the earliest structures in the universe, along with hidden stellar nurseries and spinning galaxies hundreds of millions of light-years away.

What are Webb's tasks?

Webb is also expected to give astronomers their most complete picture yet of objects within the Milky Way, including some of the more than 5,000 exoplanets discovered so far. Astronomers rely on the precision of the telescope's light analysis to decipher the atmospheres surrounding some of these nearby worlds.

What's the problem?

Researchers at the Massachusetts Institute of Technology have carried out a new study suggesting that the tools astronomers typically use to decode light signals may not be good enough to interpret the new telescope's data accurately. In particular, opacity models, the tools that describe how light interacts with matter depending on the matter's properties, may need significant retuning to match Webb's precision.

If these models are not refined, the inferred properties of planetary atmospheres, such as their temperature, pressure, and elemental composition, could be off by an order of magnitude, meaning scientists would be fatally mistaken in their calculations.

There is a scientifically meaningful difference between a water content of 5% and one of 25%, yet current models cannot distinguish the two. The model scientists use to decipher spectral information no longer matches the precision and quality of the data coming from the James Webb telescope.

What is opacity?

Opacity is a measure of how easily photons pass through a material. Photons of certain wavelengths can pass straight through a material, be absorbed, or be reflected back, depending on whether and how they interact with particular molecules inside the material. That interaction also depends on the material's temperature and pressure.
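The relationship between opacity and the fraction of light that survives a pass through a material is commonly described by the Beer-Lambert law. A minimal sketch (illustrative numbers only, not values from the study):

```python
import math

def transmitted_fraction(opacity, column_density):
    """Beer-Lambert law: fraction of photons that pass straight through
    a slab of material, given its opacity (absorption cross-section per
    unit of material) and the column density along the line of sight."""
    tau = opacity * column_density   # optical depth (dimensionless)
    return math.exp(-tau)            # surviving fraction of the light

# A more opaque material, or a thicker slab, blocks more light.
print(transmitted_fraction(0.5, 1.0))   # tau = 0.5, most light gets through
print(transmitted_fraction(0.5, 4.0))   # tau = 2.0, most light is absorbed
```

The key point for what follows: the same dimming can be produced either by a high opacity or by a large amount of material, which is exactly the kind of ambiguity the MIT study probes.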

An opacity model rests on a set of assumptions about how light interacts with matter. Astronomers use it to infer the properties of a material from the spectrum of light coming from it. In the context of an exoplanet, an opacity model makes it possible to decipher the types and amounts of chemicals in the planet's atmosphere from the light it transmits or reflects and the telescope captures.
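In spirit, such an analysis inverts a forward model: predict the spectrum from assumed atmospheric properties, then adjust those properties until the prediction matches the observation. A purely illustrative toy sketch (single wavelength, Beer-Lambert absorption, made-up numbers; real retrievals fit thousands of wavelengths at once):

```python
import math

def model_depth(abundance, opacity=2.0):
    # Toy forward model: the absorption feature deepens as the amount
    # of the absorbing gas grows, saturating via the Beer-Lambert law.
    return 1.0 - math.exp(-opacity * abundance)

def retrieve_abundance(observed_depth, opacity=2.0):
    # Invert the forward model: scan candidate abundances and keep the
    # one whose predicted depth best matches the observation.
    grid = [i / 1000 for i in range(1001)]
    return min(grid, key=lambda a: abs(model_depth(a, opacity) - observed_depth))

true_abundance = 0.25
observation = model_depth(true_abundance)
print(retrieve_abundance(observation))  # recovers 0.25 with the correct opacity
```

With the correct opacity assumed, the inversion recovers the true abundance; the study's concern is what happens when that assumption is slightly wrong.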

What's going on now?

The current state-of-the-art opacity model, which the MIT team compared to a classical language-translation tool, "works well." It reliably deciphers spectral data obtained by telescopes such as the Hubble Space Telescope.

"As long as the Rosetta Stone is okay," scientists write, but now that scientists are moving to the next level, working with Webba's ultra-precision tools, the current translation process "is not going to capture important subtleties." For example, those who distinguish a lifeable planet from an unsuitable planet.

What did scientists do?

In the new study, the MIT team asked how inferred atmospheric properties would change if the opacity model were adjusted to reflect known gaps in our understanding of how light and matter interact. In the end, the scientists created eight such "perturbed" models.

They then fed each model, including the real one, "synthetic spectra": patterns of light the scientists simulated at the level of precision the James Webb telescope can capture.

What did the scientists find out?

It turned out that, given the same light spectrum, each perturbed model produced broadly different predictions of the planet's atmospheric properties. Based on this analysis, the scientists concluded that if existing opacity models are applied to light spectra captured by the Webb telescope, they will hit an "accuracy wall." Put simply, the models will not be sensitive enough to determine a planet's true temperature, or whether a given gas makes up 5% or 25% of an atmospheric layer.

That difference matters if scientists are to constrain planet-formation mechanisms and reliably identify biosignatures, the chemical signs of life.

The team also found that every model nonetheless produced a "good fit" to the data: even when a perturbed model returned the wrong chemical composition, the light spectrum it generated was close enough to the original to count as a match.
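This degeneracy can be illustrated with the same toy Beer-Lambert setup (a sketch only, not the study's actual method; all numbers are made up): a model whose assumed opacity is biased can reproduce the identical observation by compensating with a very different abundance.

```python
import math

def model_depth(abundance, opacity):
    # Toy Beer-Lambert forward model: depth of an absorption feature.
    return 1.0 - math.exp(-opacity * abundance)

# "True" atmosphere: the gas makes up 5% of the layer, seen through
# the correct opacity (hypothetical value).
observed = model_depth(0.05, opacity=10.0)

# A perturbed model whose opacity is biased low fits the very same
# observation by inflating the abundance instead.
biased_opacity = 2.0
compensating = -math.log(1.0 - observed) / biased_opacity
print(round(compensating, 3))           # ~0.25: a 25% abundance fits just as well
print(model_depth(compensating, biased_opacity) == observed or
      abs(model_depth(compensating, biased_opacity) - observed) < 1e-12)
```

Both models match the spectrum essentially perfectly, so the fit quality alone cannot reveal which one, 5% or 25%, describes the real atmosphere, echoing the ambiguity the article describes.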

What's the outcome?

There are enough adjustable parameters that even a wrong model can be tuned to produce a good fit, which means you cannot tell from the fit alone that the model is wrong.

The scientists proposed several ideas for improving existing opacity models. For example, additional laboratory measurements and theoretical calculations are needed to refine the assumptions about how light interacts with different molecules. Collaboration between scientists from different fields is also needed, in particular between astronomers and spectroscopy specialists.

Much more could be done if we knew precisely how light and matter interact.
