Title: An end-to-end pipeline for team-aware, pose-aligned augmented reality in cycling broadcasts
Authors: Clinckemaillie, Winter; Vanhaeverbeke, Jelle; Slembrouck, Maarten; Verstockt, Steven
Date issued: 2026-04-13
ISSN: 1077-3142
URI: https://imec-publications.be/handle/20.500.12860/59061
Type: Journal article
Language: English
DOI: 10.1016/j.cviu.2025.104602
Accession: WOS:001640781900001

Abstract: Advanced computer vision and machine learning technologies are transforming how we experience sports events. This work enriches helicopter footage of cycling races with dynamic, in-scene, pose-aligned augmented reality (AR) overlays (e.g., rider name, speed, wind direction) that remain visually attached to each rider. To achieve this, we propose a multi-stage pipeline: cyclists are first detected and tracked, followed by team recognition using a one-shot learning approach based on Siamese neural networks, which achieves 85% classification accuracy on a test set composed of teams unseen during training. This design allows easy adaptation and reuse across different races and seasons, accommodating frequent jersey and team changes with minimal effort. We introduce a pose-based AR overlay that anchors rider labels to moving cyclists without fixed field landmarks or homography, enabling dynamic overlays in unconstrained cycling broadcasts. Real-time feasibility is demonstrated through runtime profiling and TensorRT optimizations. Finally, a user study evaluates the readability, informativeness, visual stability, and engagement of our AR-enhanced broadcasts. The combination of advanced computer vision, AR, and user-centered evaluation showcases new possibilities for improving live sports broadcasts, particularly in challenging environments such as road cycling.