Faster Prediction of Wireless Downtime
July 5, 2016 | KAUST

An efficient simulation scheme that homes in on the rarest events in a dataset can help predict capacity exceedances in wireless networks.
As the number of mobile devices grows along with demand for faster connections and larger data volumes, wireless networks can easily exceed capacity, resulting in severe network slowdowns and outages. While engineers have developed various sophisticated signal processing methods to accommodate sudden changes in network loads, it has been challenging to evaluate and compare the performance of different approaches in realistic network environments. The difficulty is that network outages due to capacity saturation can be such rare events that simulating enough network activity to observe them is computationally intensive and can take a very long time.
Raul Tempone and colleagues from the KAUST Strategic Research Initiative on Uncertainty Quantification in Science and Engineering (SRI-UQ) have now applied an importance sampling technique that can simulate rare events to the problem of wireless outage capacity.
“The outage capacity is one of the most important performance metrics of wireless communication systems,” explained Tempone. “It measures the percentage of time that the communication system undergoes an outage, which is typically on the order of one second per million or more. There are no efficient analytical solutions to this problem, and to simulate this situation using conventional simulation methods might take more than a billion simulation runs.”
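To see why conventional simulation is so costly, consider a toy stand-in for the outage problem (this model is an illustrative assumption, not the system studied in the paper): an "outage" occurs when an exponentially distributed channel sample exceeds a high threshold. A naive Monte Carlo estimator needs on the order of 1/p runs just to observe one event of probability p:

```python
import math
import random

# Hypothetical toy outage model: an outage occurs when a standard
# exponential sample X ~ Exp(1) exceeds a threshold t.
# The true outage probability is P(X > t) = exp(-t); for t = 14
# that is about 8.3e-7, i.e. roughly one event per 1.2 million runs.
t = 14.0
true_p = math.exp(-t)

random.seed(0)
n = 100_000  # far fewer than the ~1/p runs needed to see even one outage
hits = sum(1 for _ in range(n) if random.expovariate(1.0) > t)
naive_estimate = hits / n

# With n << 1/p, the naive estimator almost always returns 0 (or a wildly
# noisy value), which is why billions of runs are needed for rare events.
```

The threshold and distribution here are chosen only to make the scaling visible; the same 1/p sample-size barrier applies to any rare-event probability estimated by direct simulation.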
Motivated by the need for a much faster simulation method, Tempone and his team turned to importance sampling. This well-known approach uses a clever problem transformation to sample more frequently from the event of interest, effectively turning rare events in the original problem into common events in the transformed one. For a typical outage capacity on the order of one in 100 million, for example, the importance sampling approach allows the outage capacity to be estimated with around 100 million times fewer simulation runs than conventional methods, dramatically reducing the time needed for estimation.
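The transformation can be sketched on the same toy exponential model (again an illustrative assumption, not the paper's actual estimator): sample from a "tilted" exponential with a much smaller rate so that threshold crossings become frequent, then reweight each sample by the likelihood ratio between the original and tilted densities so the estimate remains unbiased:

```python
import math
import random

# Importance-sampling sketch for the toy model P(X > t), X ~ Exp(1).
# Proposal: a tilted exponential Exp(theta) with theta = 1/t (an assumed,
# convenient choice), under which crossings of t occur ~37% of the time.
t = 14.0
theta = 1.0 / t
true_p = math.exp(-t)  # analytic value, ~8.3e-7, used only for checking

random.seed(1)
n = 100_000
total = 0.0
for _ in range(n):
    y = random.expovariate(theta)  # draw from the tilted (proposal) law
    if y > t:                      # the rare event is now a common event
        # Likelihood ratio f(y)/q(y) = e^{-y} / (theta * e^{-theta*y})
        total += math.exp(-(1.0 - theta) * y) / theta
estimate = total / n

# The reweighted average recovers the tiny probability to within a few
# percent from only 1e5 runs, where naive simulation would need well over
# a million runs on average just to observe a single event.
```

The choice of tilted density governs the variance of the estimator; the paper's contribution lies in constructing such transformations for realistic outage-capacity models rather than this one-dimensional illustration.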
Unlike existing methods for estimating outage capacity that are only applicable to specific scenarios, the importance sampling approach is generic, making it suitable for a wide range of challenging network scenarios.
“Despite continuous advances in the concept of importance sampling in the field of rare events simulations, its popularity among researchers in the field of wireless communication systems is still quite limited,” Tempone said. “Our work is the first to bridge the gap between the framework of rare event algorithms and the evaluation of outage capacity for wireless communication systems.”