Low Earth Orbit (LEO) satellites are transforming global connectivity by providing internet access from space, but their rapid motion and low altitudes create a fundamental challenge: signals can be blocked or weakened by terrain, buildings, and vegetation, especially at low elevation angles where the satellite sits near the horizon. Existing models often ignore these real-world environmental factors, relying on statistical assumptions that may not capture the complexity of diverse landscapes like mountains, forests, or urban areas. This gap limits the ability to predict link reliability for applications such as remote communication in polar regions or rural connectivity, where accurate forecasting is crucial for network planning and user experience. A new approach addresses this by integrating actual geographic data into channel modeling, offering a more realistic way to assess signal propagation in varied environments.
The researchers developed an environment-aware channel modeling framework that uses real environmental data to determine whether a satellite link has a clear line-of-sight or is obstructed by terrain or vegetation. They leveraged digital elevation models (DEMs), which provide terrain elevation at a 10-meter resolution, and land cover information to identify obstacles like mountains and forests. For example, in the Canadian Arctic region studied, elevation ranged from -109.9 meters to 5083.8 meters, highlighting significant terrain variation that affects signal paths. By combining this with ray tracing simulations, the framework classifies links as line-of-sight (LOS) or non-line-of-sight (NLOS) and identifies reflection paths from surfaces like water or ice, which can cause multipath interference. This approach moves beyond abstract statistics to account for specific environmental features, such as using the Fresnel zone criterion to assess clearance margins for signal propagation.
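To make the Fresnel zone criterion concrete, here is a minimal sketch of how a LOS/NLOS check over a DEM terrain profile could look. This is illustrative, not the paper's actual code: the function name, the Ku-band frequency default, and the 0.6 clearance fraction are assumptions.

```python
import numpy as np

def is_los(terrain_profile, distances, h_tx, h_rx, freq_hz=12e9, clearance=0.6):
    """Classify a link as LOS or NLOS by checking first-Fresnel-zone
    clearance along a sampled terrain profile (all lengths in meters).

    terrain_profile: elevation samples along the path toward the satellite
    distances: distance of each sample from the ground terminal
    h_tx, h_rx: ray heights at the two ends of the profile
    clearance: required fraction of the first Fresnel radius (0.6 is common)
    """
    wavelength = 3e8 / freq_hz
    total = distances[-1]
    for d, elev in zip(distances[1:-1], terrain_profile[1:-1]):
        # Height of the direct ray above the datum at distance d
        ray_h = h_tx + (h_rx - h_tx) * d / total
        # First Fresnel zone radius at this point along the path
        r1 = np.sqrt(wavelength * d * (total - d) / total)
        # Terrain intruding into the clearance zone means NLOS
        if elev > ray_h - clearance * r1:
            return False
    return True
```

In practice the profile would be sampled from the 10-meter DEM along the path from the terminal toward the satellite; a ridge that rises into the clearance zone flips the result from LOS to NLOS even when it does not fully block the geometric ray.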
To make this scalable for real-time use, the team employed a diffusion model, a type of AI that learns to predict channel conditions from sampled data without performing computationally intensive ray tracing for every possible link. They first clustered the region based on terrain features like slope and roughness, land cover types such as forests or urban areas, and functional zones like residential or protected areas. Using k-means clustering, they guided sampling to cover diverse environmental conditions, prioritizing areas with high terrain elevation or strong reflectivity. For instance, in obstruction analysis, they weighted ridges and valleys more heavily, while for reflection analysis, they emphasized water surfaces. The diffusion model was trained on these samples, with inputs including DEM, slope, aspect, and embedded land cover data, to predict channel loss across arbitrary satellite and ground terminal positions, reducing computation time from roughly 144,000 seconds for full ray tracing to about 300 seconds for inference.
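The clustering-guided sampling step can be sketched as follows. This is a minimal numpy-only illustration under stated assumptions: the feature vectors, cluster count, and per-cluster weighting scheme (e.g. over-weighting ridge clusters for obstruction analysis or water clusters for reflection analysis) are hypothetical, not the paper's exact pipeline.

```python
import numpy as np

def kmeans(features, k, iters=50, seed=0):
    """Minimal k-means: group terrain feature vectors (e.g. elevation,
    slope, roughness, land-cover embedding) into k environment clusters."""
    rng = np.random.default_rng(seed)
    centers = features[rng.choice(len(features), k, replace=False)]
    for _ in range(iters):
        # Assign each cell to its nearest cluster center
        labels = np.argmin(
            np.linalg.norm(features[:, None, :] - centers[None, :, :], axis=2),
            axis=1)
        # Move each center to the mean of its assigned cells
        for j in range(k):
            if np.any(labels == j):
                centers[j] = features[labels == j].mean(axis=0)
    return labels, centers

def stratified_sample(labels, rate, weights=None, seed=0):
    """Draw a fraction of cells from every cluster, optionally
    over-weighting clusters that matter more for the analysis."""
    rng = np.random.default_rng(seed)
    weights = weights or {}
    picked = []
    for j in np.unique(labels):
        idx = np.flatnonzero(labels == j)
        n = max(1, int(len(idx) * rate * weights.get(j, 1.0)))
        picked.extend(rng.choice(idx, min(n, len(idx)), replace=False))
    return np.array(picked)
```

Sampling per cluster rather than uniformly at random ensures that rare but propagation-critical environments, such as steep ridges or reflective water surfaces, appear in the training set even at a 1% overall sample rate.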
The results demonstrated the model's effectiveness in capturing environmental impacts on signal propagation. In the Canadian Arctic region, obstruction rates increased sharply at lower elevation angles, reaching 24.28% at 25 degrees, due to terrain and vegetation blockages. Reflection maps showed stronger signals in flat central areas compared to forested or rugged regions. Validation with real-world data from Starlink, OneWeb, and cellular networks confirmed the model's accuracy. For example, when compared to OneWeb signal-to-noise ratio (SNR) measurements, the model achieved a Pearson correlation coefficient of 0.906, indicating strong consistency. In cellular tests on Galiano Island, the correlation was even higher at 0.932. The diffusion model's prediction performance varied with sampling density; at a 1% sample rate, mean absolute error (MAE) was 5.22 dB in rugged terrain, but improved to 1.23 dB at a 4% sample rate, showing that denser observations enhance accuracy, particularly in complex environments.
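For readers unfamiliar with these two validation metrics, here is a minimal sketch (not the paper's code) of how the Pearson correlation and the MAE in dB are computed when comparing predicted channel values against measurements:

```python
import numpy as np

def pearson_r(pred, meas):
    """Pearson correlation between predicted and measured values;
    1.0 means perfect linear agreement, 0 means no linear relation."""
    x = np.asarray(pred, float)
    y = np.asarray(meas, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

def mae_db(pred, meas):
    """Mean absolute error between predicted and measured loss in dB."""
    return float(np.mean(np.abs(np.asarray(pred, float) - np.asarray(meas, float))))
```

Note that the Pearson coefficient rewards consistent trends (a prediction that tracks SNR rises and falls) even with a constant offset, while MAE penalizes any absolute deviation, which is why both are reported.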
The implications of this research are significant for improving satellite network reliability and efficiency. By accurately predicting channel conditions, network operators can optimize resource scheduling, such as dynamically allocating bandwidth or managing handovers between satellites, to maintain connectivity in challenging areas like the Arctic or rural regions. This could lead to better service for remote communities, enhanced disaster response communications, and more robust internet coverage globally. The framework's ability to use environmental data without extensive field measurements makes it practical for inaccessible regions, reducing costs and enabling broader deployment. Additionally, the AI-driven approach supports real-time decision-making, which is essential for adaptive networks that must respond quickly to changing conditions, such as satellite movements or weather effects.
Despite its advancements, the study has limitations that point to future research directions. The model's performance depends on the quality and resolution of environmental data, such as DEMs and land cover maps, which may not be uniformly available worldwide. Seasonal variations in vegetation, like leaf growth or snow cover, are not fully accounted for, potentially affecting accuracy in dynamic environments. The validation relied on datasets from specific regions, such as the Canadian Arctic and Seattle, so generalization to other geographic areas with different terrain or climate patterns requires further testing. Additionally, while the diffusion model reduces computational cost, it still involves training on sampled data, and its accuracy degrades with very sparse observations, as shown by higher errors at low sample rates. Future work could refine the model to incorporate real-time weather data or expand validation to more diverse environments to enhance robustness and applicability.
About the Author
Guilherme A.
Former dentist (MD) from Brazil, 41 years old, husband, and AI enthusiast. In 2020, he transitioned from a decade-long career in dentistry to pursue his passion for technology, entrepreneurship, and helping others grow.