NASA’s Roman Mission Gets Cosmic ‘Sneak Peek’ From Argonne Supercomputer
June 14, 2024 | BUSINESS WIRE | Estimated reading time: 2 minutes
Researchers are diving into a synthetic universe to help us better understand the real one. Using the Theta supercomputer at the U.S. Department of Energy’s (DOE) Argonne National Laboratory in Illinois, scientists have created nearly four million simulated images depicting the cosmos as NASA’s Nancy Grace Roman Space Telescope and the Vera C. Rubin Observatory in Chile, jointly funded by the National Science Foundation (NSF) and DOE, will see it.
“Using Argonne’s now-retired Theta machine, we accomplished in about nine days what would have taken around 300 years on your laptop,” said Katrin Heitmann, a cosmologist and deputy director of Argonne’s High Energy Physics division who managed the project’s supercomputer time. “The results will shape Roman and Rubin’s future attempts to illuminate dark matter and dark energy while offering other scientists a preview of the types of things they’ll be able to explore using data from the telescopes.”
The team is releasing a 10-terabyte subset of this data now, with the remaining 390 terabytes to follow this fall once processing is complete.
For the first time, this simulation factored in the telescopes’ instrument performance, making it the most accurate preview yet of the cosmos as Roman and Rubin will see it once they start observing. Rubin will begin operations in 2025, and NASA’s Roman will launch by May 2027.
The simulation’s precision is important because scientists will comb through the observatories’ future data in search of tiny features that will help them unravel the biggest mysteries in cosmology.
Roman and Rubin will both explore dark energy — the mysterious force thought to be accelerating the universe’s expansion. Since it plays a major role in governing the cosmos, scientists are eager to learn more about it. Simulations like OpenUniverse help them understand the signatures each instrument imprints on the images and refine data processing methods now, so they can interpret future data correctly and make big discoveries even from weak signals.
Once real observations arrive, scientists will continue using simulations to explore the physics and instrument effects that could reproduce what the observatories see in the universe.
It took a large and talented team from several organizations to conduct such an immense simulation.
“Few people in the world are skilled enough to run these simulations,” said Alina Kiessling, a research scientist at NASA’s Jet Propulsion Laboratory (JPL) in Southern California and the principal investigator of OpenUniverse. “This massive undertaking was only possible thanks to the collaboration between the DOE, Argonne, SLAC National Accelerator Laboratory and NASA, which pulled all the right resources and experts together.”