When disaster strikes—be it a flood, wildfire, or other extreme event—evacuating people safely and efficiently becomes a critical challenge. Roads may become impassable as water rises or fires spread, and overcrowding at intersections can lead to dangerous bottlenecks. A new study tackles this problem head-on, investigating the computational complexity of scheduling evacuation routes in networks where connections decay over time. The researchers, from Technische Universität Berlin and Humboldt-Universität zu Berlin, focused on a model called Smooth Routing in Decaying Graphs (SRDG), which aims to find evacuation schedules that avoid collisions and respect capacity limits, even as parts of the network become unavailable. Their findings reveal that this task is surprisingly difficult, even in highly simplified scenarios, but they also provide a practical tool that could help emergency planners in real-world situations.
The core result is that SRDG is NP-hard, meaning it is computationally intractable in the worst case, even when the road network is as simple as a straight path or a star-shaped intersection. For example, on a path graph—akin to a single street with sequential intersections—the problem remains NP-hard even if each intersection can only handle one evacuation route at a time, as shown in Theorem 1. Similarly, on a star graph—representing a major crossing or roundabout—NP-hardness holds even with constant deadlines or with all intersections having a capacity of one, as proven in Theorems 4 and 5. Perhaps most strikingly, the problem stays hard on tree-shaped networks without any capacity constraints at all, as long as connections have constant deadlines, as demonstrated in Theorem 6. These results imply that, under standard complexity assumptions, no efficient algorithm exists to always find optimal evacuation schedules in these settings, highlighting the inherent difficulty of coordination during disasters.
To explore this complexity, the researchers combined theoretical proofs with experimental validation. They defined SRDG using decaying graphs, where each connection has a traversal time and a deadline after which it becomes impassable, and each vertex has a capacity limiting how many routes can occupy it simultaneously. A schedule is considered smooth if paths do not meet on edges and vertex capacities are respected. The study systematically analyzed subclasses: paths, stars, and general trees, with variations in capacities and deadlines. For the experimental part, they developed an integer linear program (ILP) to compute the minimum extension to deadlines needed for feasibility, essentially finding the latest possible evacuation time. This ILP was tested on both artificial instances—simulating paths and stars—and semi-artificial ones based on real street networks from German cities prone to flooding, such as Koblenz and Trier, where deadlines were set based on distance to rivers.
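To make the model concrete, the core objects can be sketched in a few lines of Python. This is a minimal illustration of the decaying-graph setting described above, not the paper's formulation: the names (`Edge`, `is_smooth`) are invented here, routes are given as lists of `(edge, start_time)` pairs, and the vertex-capacity check is simplified to count only simultaneous arrivals at the same time step.

```python
from collections import defaultdict

class Edge:
    """A decaying connection: usable only until its deadline."""
    def __init__(self, u, v, traversal_time, deadline):
        self.u, self.v = u, v
        self.traversal_time = traversal_time
        self.deadline = deadline  # edge is impassable after this time

def is_smooth(routes, capacities):
    """Check a candidate schedule against the smoothness conditions.

    routes: list of routes; each route is a list of (edge, start_time)
            pairs meaning the route enters `edge` at `start_time`.
    capacities: dict mapping vertex -> max routes present at once.

    Returns True if every traversal finishes before its edge's deadline,
    no two routes use the same edge at overlapping times, and (as a
    simplification) no more routes arrive at a vertex at the same time
    than its capacity allows.
    """
    edge_usage = defaultdict(list)                    # edge -> [(start, end)]
    arrivals = defaultdict(lambda: defaultdict(int))  # time -> vertex -> count

    for route in routes:
        for edge, start in route:
            end = start + edge.traversal_time
            if end > edge.deadline:        # edge decays mid-traversal
                return False
            for s, e in edge_usage[id(edge)]:
                if start < e and s < end:  # two routes meet on one edge
                    return False
            edge_usage[id(edge)].append((start, end))
            arrivals[end][edge.v] += 1     # route occupies target vertex
            if arrivals[end][edge.v] > capacities.get(edge.v, 1):
                return False
    return True
```

For example, on a two-edge path a–b–c with capacity one at b, two routes can be scheduled smoothly only by staggering their start times; starting both at time 0 makes them meet on the first edge, which `is_smooth` rejects.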
The ILP experiments, detailed in Section 5, show that while the problem is hard, practical solutions are often within reach. On artificial instances, runtime increased with more paths, lower capacities, and longer path lengths, but many cases were solved quickly—often in seconds. For example, on decaying paths with up to 16 vertices, median runtimes remained low across parameter variations, though means spiked due to outliers. On semi-artificial data, the ILP handled real-world scenarios effectively: for zone-C evacuations in cities like Koblenz, with 10% of nodes serving as evacuation paths, average runtimes were under 40 seconds. The ILP relaxation performed remarkably well, achieving optimal solutions in 97.7% of path instances and 100% of city-based instances, while requiring 98.2% of the exact ILP's runtime on average. This suggests that heuristic approaches based on the relaxation could be viable for larger-scale applications.
The implications of this research extend beyond theoretical computer science into practical disaster management. By proving hardness, the study underscores why evacuation planning is so challenging and warns against overly simplistic solutions. The provided ILP offers a concrete tool that emergency planners could adapt, using it to assess evacuation feasibility or to identify critical bottlenecks in networks. For instance, in flood-prone cities, authorities could simulate different disaster scenarios to determine safe evacuation windows or prioritize infrastructure upgrades. The model's focus on smooth scheduling—avoiding collisions and overloading—aligns with real-world safety concerns, potentially reducing chaos and improving survival rates during crises.
Despite these advances, the study acknowledges limitations. The NP-hardness results mean that, in worst-case scenarios, optimal solutions may be unattainable for large networks, necessitating approximations or heuristics. The experiments, while comprehensive, were limited to specific graph structures and parameter ranges; scaling to massive metropolitan areas with thousands of nodes and paths remains untested. Additionally, the model assumes fixed evacuation routes and perfect knowledge of decay times, which may not hold in dynamic disaster environments. Future work could explore restrictions such as few sink vertices or faster algorithms for special cases, as well as integration with real-time data for adaptive planning. Overall, this research provides a rigorous foundation for understanding and addressing the complexities of evacuation in decaying networks, blending theoretical insights with practical computational tools.
About the Author
Guilherme A.
Former dentist (MD) from Brazil, 41 years old, husband, and AI enthusiast. In 2020, he transitioned from a decade-long career in dentistry to pursue his passion for technology, entrepreneurship, and helping others grow.