A 3-D detection method of moving traffic loads based on the stereo vision
Bibliographic Details
Author(s):
Xinhong Liu (Department of Bridge Engineering, Tongji University, Shanghai 200092, China)
Xiaoliang Li (Department of Bridge Engineering, Tongji University, Shanghai 200092, China)
Ye Xia (Shanghai Tongji Testing Technology Co., Ltd, Shanghai 200092, China)
Limin Sun (State Key Laboratory for Disaster Reduction in Civil Engineering, Tongji University, Shanghai 200092, China)
Medium: conference paper
Language(s): English
Conference: IABSE Congress: Structural Engineering for Future Societal Needs, Ghent, Belgium, 22-24 September 2021
Published in: IABSE Congress Ghent 2021
Page(s): 467-474
Total no. of pages: 8
DOI: 10.2749/ghent.2021.0467
Abstract:
Vehicle tracking based on computer vision is an important part of traffic load monitoring, and stable, reliable vehicle detection is its primary task. The widely used 2-D detection works accurately in simple situations but struggles with multi-vehicle overlap. This paper therefore proposes a 3-D vehicle detection method based on stereo vision, which can accurately position vehicles that are partly occluded in the image and thus maintain continuous tracking. The method builds spatial relations between observed objects, such as the wheels, and feature points on the vehicle. Once these spatial relations are determined, the positions of observed objects that are partly covered by other objects can be inferred accurately from the vehicle's visible feature points. Experiments were conducted to verify the effectiveness of the proposed method.
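The idea summarized in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a rectified stereo pair with known calibration (all numeric values are made up), recovers 3-D feature points by triangulation, and applies a stored rigid offset, expressed in a vehicle-fixed frame, to infer the position of a wheel that is no longer visible.

```python
import numpy as np

def triangulate(xl, xr, focal, baseline, cx, cy):
    """Recover a 3-D point (camera frame) from a rectified stereo pair.

    xl, xr   : (u, v) pixel coordinates in the left/right image
    focal    : focal length in pixels
    baseline : camera separation in metres
    cx, cy   : principal point in pixels
    """
    disparity = xl[0] - xr[0]
    z = focal * baseline / disparity           # depth from disparity
    x = (xl[0] - cx) * z / focal
    y = (xl[1] - cy) * z / focal
    return np.array([x, y, z])

# Assumed calibration (illustrative values only).
F, B, CX, CY = 800.0, 0.5, 640.0, 360.0

# Two feature points on the vehicle body, visible in both cameras.
p1 = triangulate((700.0, 300.0), (660.0, 300.0), F, B, CX, CY)
p2 = triangulate((760.0, 310.0), (724.0, 310.0), F, B, CX, CY)

# Spatial relation learned earlier, while the wheel was still visible:
# the wheel's offset from p1, expressed in a vehicle frame whose
# x-axis points from p1 to p2 (hypothetical numbers).
offset_in_vehicle_frame = np.array([1.2, 0.8, 0.0])

# Rebuild the vehicle frame from the currently visible points and
# apply the stored offset to infer the occluded wheel's position.
x_axis = (p2 - p1) / np.linalg.norm(p2 - p1)
up = np.array([0.0, 1.0, 0.0])
z_axis = np.cross(x_axis, up)
z_axis /= np.linalg.norm(z_axis)
y_axis = np.cross(z_axis, x_axis)
R = np.column_stack([x_axis, y_axis, z_axis])   # vehicle -> camera
wheel_pos = p1 + R @ offset_in_vehicle_frame    # inferred 3-D wheel position
print(wheel_pos)
```

In a full pipeline the vehicle frame would be estimated from more than two feature points (e.g. by a least-squares rigid fit), but the principle is the same: the rigid relation, once learned, lets occluded parts be positioned from whatever features remain visible.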
Keywords: stereo vision; 3-D detection; vehicle tracking; traffic load monitoring; multi-vehicle overlap; feature points
Copyright: © 2021 International Association for Bridge and Structural Engineering (IABSE)
License: This creative work is copyrighted material and may not be used without explicit approval by the author and/or copyright owner.