
Doklady BGUIR


Dynamic Relational Graph Modeling for Multi-Agent Motion Trajectory Prediction

https://doi.org/10.35596/1729-7648-2025-23-4-109-117

Abstract

Accurate trajectory prediction of multiple agents is a critical task in the fields of autonomous driving, human-computer interaction, and behavior analysis. However, the dynamic and interactive nature of agent behavior poses significant challenges, since it requires modeling complex spatio-temporal dependencies and dynamically evolving interactions between agents. A novel approach to modeling dynamic relational graphs is proposed, whose core component is an attention block that takes into account the relative positions of agents in the graph. By treating objects in a scene (e.g., vehicles and road elements) as graph nodes and their interactions as edges, the proposed approach effectively captures both local and global dependencies in the scene and predicts future trajectories. The presented approach is evaluated on the Argoverse 1 trajectory prediction dataset. Experimental results show that the proposed model outperforms existing methods.
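The abstract describes the key idea only in outline: agents and map elements are graph nodes, their interactions are edges, and attention over this graph is conditioned on relative positions. The following is a minimal, hypothetical PyTorch sketch of such a relative-position-aware graph attention block; it is not the authors' implementation, and all names (RelationalAttention, hidden_dim, the 2D offset encoding) are illustrative assumptions.

```python
# Illustrative sketch only: one attention block over agent nodes in which
# pairwise relative positions are mixed into the keys and values.
import torch
import torch.nn as nn

class RelationalAttention(nn.Module):
    """Each agent node attends to all nodes; relative 2D offsets are
    concatenated to neighbour features before computing keys/values."""
    def __init__(self, hidden_dim: int = 64):
        super().__init__()
        self.query = nn.Linear(hidden_dim, hidden_dim)
        self.key = nn.Linear(hidden_dim + 2, hidden_dim)    # node feature + 2D relative offset
        self.value = nn.Linear(hidden_dim + 2, hidden_dim)
        self.scale = hidden_dim ** 0.5

    def forward(self, node_feats: torch.Tensor, positions: torch.Tensor) -> torch.Tensor:
        # node_feats: [N, hidden_dim] encoded agent/map histories
        # positions:  [N, 2] current 2D positions of the nodes
        rel = positions.unsqueeze(0) - positions.unsqueeze(1)                 # [N, N, 2] offsets of j relative to i
        feats = node_feats.unsqueeze(0).expand(node_feats.size(0), -1, -1)    # [N, N, hidden_dim] neighbour features
        kv_in = torch.cat([feats, rel], dim=-1)                               # [N, N, hidden_dim + 2]
        q = self.query(node_feats)                                            # [N, hidden_dim]
        k = self.key(kv_in)                                                   # [N, N, hidden_dim]
        v = self.value(kv_in)                                                 # [N, N, hidden_dim]
        attn = torch.softmax((k @ q.unsqueeze(-1)).squeeze(-1) / self.scale, dim=-1)  # [N, N] weights over neighbours
        return node_feats + (attn.unsqueeze(-1) * v).sum(dim=1)               # residual node update

if __name__ == "__main__":
    block = RelationalAttention(hidden_dim=64)
    feats = torch.randn(5, 64)       # 5 agents with encoded motion histories
    pos = torch.randn(5, 2)          # their 2D positions
    print(block(feats, pos).shape)   # torch.Size([5, 64])
```

In the full pipeline described by the abstract, a block like this would be stacked with a trajectory encoder and a decoder head that regresses future waypoints; those parts are omitted here.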

About the Authors

Yi Tang
Belarusian State University of Informatics and Radioelectronics
Belarus

Yi Tang, Postgraduate at the Department of Information Technologies in Automated Systems 

220013, Minsk, Brovki St., 6 

Tel.: +375 25 764-41-91 



D. Yu. Pertsau
Belarusian State University of Informatics and Radioelectronics
Belarus

Cand. Sci. (Tech.), Associate Professor, Associate Professor at the Department of Electronic Computer Science 

Minsk 



References

1. Yurtsever E., Lambert J., Carballo A., Takeda K. (2020) A Survey of Autonomous Driving: Common Practices and Emerging Technologies. IEEE Access. 8, 58443–58469.

2. Wang S., Bao Z., Culpepper J. S., Cong G. (2021) A Survey on Trajectory Data Management, Analytics, and Learning. ACM Computing Surveys (CSUR). 54 (2), 1–36.

3. Singh A. (2023) Trajectory-Prediction with Vision: A Survey. In Proceedings of the IEEE/CVF International Conference on Computer Vision. 3318–3323.

4. Casas S., Luo W., Urtasun R. (2018) IntentNet: Learning to Predict Intention from Raw Sensor Data. In Proceedings of the Conference on Robot Learning. Zürich, Switzerland. 947–956.

5. Hong J., Sapp B., Philbin J. (2019) Rules of the Road: Predicting Driving Behavior with a Convolutional Model of Semantic Interactions. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. Long Beach, CA, USA. 8454–8462.

6. Gao J., Sun C., Zhao H., Shen Y., Anguelov D., Li C., et al. (2020) VectorNet: Encoding HD Maps and Agent Dynamics from Vectorized Representation. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 11525–11533.

7. Liang M., Yang B., Hu R., Chen Y., Liao R., Feng S., et al. (2020) Learning Lane Graph Representations for Motion Forecasting. In Computer Vision – ECCV 2020: 16th European Conference, Glasgow, UK, Aug. 23–28, 2020, Proceedings, Part II. Springer International Publishing. 541–556.

8. Zhou Z., Ye L., Wang J., Wu K., Lu K. (2022) HiVT: Hierarchical Vector Transformer for Multi-Agent Motion Prediction. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 8823–8833.

9. Zhang L., Li P., Liu S., Shen S. (2024) SIMPL: A Simple and Efficient Multi-Agent Motion Prediction Baseline for Autonomous Driving. IEEE Robotics and Automation Letters.

10. Qi C. R., Su H., Mo K., Guibas L. J. (2017) PointNet: Deep Learning on Point Sets for 3D Classification and Segmentation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 652–660.

11. Chang M.-F., Lambert J., Sangkloy P., Singh J., Bak S., Hartnett A., et al. (2019) Argoverse: 3D Tracking and Forecasting with Rich Maps. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 8748–8757.

12. Gu J., Sun C., Zhao H. (2021) DenseTNT: End-to-End Trajectory Prediction from Dense Goal Sets. In Proceedings of the IEEE/CVF International Conference on Computer Vision. 15303–15312.

13. Gilles T., Sabatini S., Tsishkou D., Stanciulescu B., Moutarde F. (2022) THOMAS: Trajectory Heatmap Output with Learned Multi-Agent Sampling. In International Conference on Learning Representations.

14. Wang M., Zhu X., Yu C., Li W., Ma Y., Jin R., et al. (2023) GANet: Goal Area Network for Motion Forecasting. In 2023 IEEE International Conference on Robotics and Automation (ICRA). 1609–1615.

15. Ngiam J., Vasudevan V., Caine B., Zhang Z., Chiang H. T. L., Ling J., et al. (2023) Scene Transformer: A Unified Architecture for Predicting Future Trajectories of Multiple Agents. In International Conference on Learning Representations.

16. Feng C., Zhou H., Lin H., Zhang Z., Xu Z., Zhang C., et al. (2023) MacFormer: Map-Agent Coupled Transformer for Real-Time and Robust Trajectory Prediction. IEEE Robotics and Automation Letters.



For citations:


Tang Y., Pertsau D.Yu. Dynamic Relational Graph Modeling for Multi-Agent Motion Trajectory Prediction. Doklady BGUIR. 2025;23(4):109-117. https://doi.org/10.35596/1729-7648-2025-23-4-109-117



This work is licensed under a Creative Commons Attribution 4.0 License.


ISSN 1729-7648 (Print)
ISSN 2708-0382 (Online)