- Journal
- Remote Sensing
- Date
- 2024.05.14
- Abstract
High-performance radar systems are being actively developed to reliably detect obstacles in front of unmanned vehicles in conditions such as fog, snow, rain, and darkness. Beyond their existing basic functions of moving-target detection and tracking, these radars are increasingly used for tasks such as free-space detection and environment perception. In this paper, a three-dimensional point cloud imaging algorithm was implemented using our high-resolution radar system. An axis-translation correction was applied to minimize the point-spreading phenomenon caused by the different mounting positions of the global positioning sensor and the radar and by the alignment error between them (a minimal sketch of this correction is given after the reference below). After applying the correction, point cloud images of a corner reflector target and a parked vehicle were generated to directly compare the improved results. The recently developed radar system was then mounted on a vehicle, and data were collected during actual road driving. Using these data, a three-dimensional point cloud image incorporating the correction was generated; not only the road curbstones but also street trees and walls were well represented. The point cloud image was overlaid on a Qt web-based navigation map image to implement an algorithm that determines the vehicle's position. This method can be very useful for positioning unmanned vehicles in urban areas with many buildings, where GNSS signals cannot be received. Finally, sensor fusion, in which the three-dimensional radar point cloud is projected onto the camera image, was implemented; the alignment between the sensors was achieved through intrinsic and extrinsic parameter optimization. Because it can obtain detailed spatial information and detect obstacles not only in front of the vehicle but also around it, regardless of weather, this high-performance radar application algorithm is expected to be useful for route planning and emergency avoidance maneuvers of unmanned ground and aerial vehicles.
- Reference
- Remote Sens. 2024, 16(10), 1733
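The paper itself does not spell out the axis-translation (lever-arm) correction, so the following is only a minimal sketch of the general idea: radar detections are translated into the global frame using the vehicle pose from the GNSS/INS plus the known mounting offset of the radar relative to the GNSS antenna. The function names, the fixed offset values, and the yaw-only rotation are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

# Hypothetical lever-arm offset of the radar relative to the GNSS antenna,
# expressed in the vehicle body frame (x forward, y left, z up), in metres.
RADAR_LEVER_ARM = np.array([1.8, 0.3, -0.9])

def yaw_rotation(yaw_rad: float) -> np.ndarray:
    """Rotation matrix from the vehicle body frame to the global frame (yaw only)."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def radar_points_to_global(points_radar: np.ndarray,
                           gnss_position: np.ndarray,
                           yaw_rad: float) -> np.ndarray:
    """Translate radar detections into the global frame.

    points_radar  : (N, 3) detections in the radar frame.
    gnss_position : (3,) vehicle position reported by the GNSS sensor.
    yaw_rad       : vehicle heading from the GNSS/INS.

    Without the lever-arm term, detections of the same target collected from
    different vehicle poses spread around the true position; adding the rotated
    mounting offset collapses them onto the target.
    """
    R = yaw_rotation(yaw_rad)
    # Radar origin in the global frame = GNSS position + rotated mounting offset.
    radar_origin = gnss_position + R @ RADAR_LEVER_ARM
    return (R @ points_radar.T).T + radar_origin

if __name__ == "__main__":
    # Single corner-reflector detection, 20 m directly ahead of the radar.
    detection = np.array([[20.0, 0.0, 0.0]])
    print(radar_points_to_global(detection,
                                 gnss_position=np.array([100.0, 50.0, 0.0]),
                                 yaw_rad=np.deg2rad(30.0)))
```

Accumulating the transformed detections over successive frames would then yield the kind of three-dimensional point cloud image described in the abstract.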