NASA’s Roman Mission Gets Cosmic ‘Sneak Peek’ From Argonne Supercomputer
June 14, 2024 | BUSINESS WIRE
Estimated reading time: 2 minutes
Researchers are diving into a synthetic universe to help us better understand the real one. Using the Theta supercomputer at the U.S. Department of Energy’s (DOE) Argonne National Laboratory in Illinois, scientists have created nearly four million simulated images depicting the cosmos as NASA’s Nancy Grace Roman Space Telescope and the Vera C. Rubin Observatory in Chile, jointly funded by the National Science Foundation (NSF) and DOE, will see it.
“Using Argonne’s now-retired Theta machine, we accomplished in about nine days what would have taken around 300 years on your laptop,” said Katrin Heitmann, a cosmologist and deputy director of Argonne’s High Energy Physics division who managed the project’s supercomputer time. “The results will shape Roman and Rubin’s future attempts to illuminate dark matter and dark energy while offering other scientists a preview of the types of things they’ll be able to explore using data from the telescopes.”
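As a rough, back-of-the-envelope illustration of that comparison (a minimal sketch using only the figures quoted above; the nine-day and 300-year numbers come from the article, everything else is illustrative), the implied speedup works out to about four orders of magnitude:

```python
# Back-of-the-envelope check of the quoted Theta-vs-laptop comparison.
# Only the 9-day and ~300-year figures come from the article; the rest
# is illustrative arithmetic, not a statement about the actual workload.

DAYS_PER_YEAR = 365.25

theta_days = 9            # wall-clock time reported on Theta
laptop_years = 300        # estimated single-laptop equivalent
laptop_days = laptop_years * DAYS_PER_YEAR

speedup = laptop_days / theta_days
print(f"Laptop-equivalent runtime: {laptop_days:,.0f} days")
print(f"Implied speedup on Theta:  ~{speedup:,.0f}x")
# -> roughly 12,000x, i.e. about four orders of magnitude
```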
The team is releasing a 10-terabyte subset of the data from the simulation project, known as OpenUniverse, with the remaining 390 terabytes to follow this fall once they’ve been processed.
For the first time, this simulation factored in the telescopes’ instrument performance, making it the most accurate preview yet of the cosmos as Roman and Rubin will see it once they start observing. Rubin will begin operations in 2025, and NASA’s Roman will launch by May 2027.
The simulation’s precision is important because scientists will comb through the observatories’ future data in search of tiny features that will help them unravel the biggest mysteries in cosmology.
Roman and Rubin will both explore dark energy — the mysterious force thought to be accelerating the universe’s expansion. Since it plays a major role in governing the cosmos, scientists are eager to learn more about it. Simulations like OpenUniverse help them understand signatures that each instrument imprints on the images and iron out data processing methods now so they can decipher future data correctly. Then scientists will be able to make big discoveries even from weak signals.
Once the observatories are collecting real data, scientists will continue using simulations to explore the physics and instrument effects that could reproduce what the observatories see in the universe.
It took a large and talented team from several organizations to conduct such an immense simulation.
“Few people in the world are skilled enough to run these simulations,” said Alina Kiessling, a research scientist at NASA’s Jet Propulsion Laboratory (JPL) in Southern California and the principal investigator of OpenUniverse. “This massive undertaking was only possible thanks to the collaboration between the DOE, Argonne, SLAC National Accelerator Laboratory and NASA, which pulled all the right resources and experts together.”