Person detection and tracking in a hall using multiple 3D lidar configurations

The purpose of this project was to track multiple people in real time within a defined tracking space. Lidar was chosen for this application because the hall is dark: the only light comes from the animations displayed by the projectors. The surface of the hall is not flat, and targets may be stationary, seated, or moving. The tracked positions are streamed over Ethernet to another system for a variety of interactive applications.

The first phase merges the lidar data: a transformation is applied to each sensor's output to account for its mounting position, and the transformed clouds are combined into a single point cloud that covers the entire space.
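The merge step can be sketched in a few lines of numpy: each cloud is lifted to homogeneous coordinates, multiplied by its sensor's extrinsic transform, and stacked. The yaw angles and offsets below are placeholders, not the hall's actual calibration values.

```python
import numpy as np

def make_transform(rotation_z_deg, translation_xyz):
    """Build a 4x4 homogeneous transform from a yaw angle and a translation.

    The values passed in are illustrative; in practice they come from the
    extrinsic calibration of each lidar's mounting position in the hall.
    """
    theta = np.radians(rotation_z_deg)
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    T[:3, 3] = translation_xyz
    return T

def merge_clouds(clouds, transforms):
    """Apply each lidar's extrinsic transform and stack into a single cloud."""
    merged = []
    for pts, T in zip(clouds, transforms):
        homo = np.hstack([pts, np.ones((len(pts), 1))])  # N x 4
        merged.append((homo @ T.T)[:, :3])               # back to N x 3
    return np.vstack(merged)
```

In the real pipeline this happens per ROS message with one static transform per sensor; the maths is the same.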
Next, ground removal is applied to the merged point cloud and the cloud is cropped to the region of interest using the PCL library. The resulting cloud is streamed to the next phase, where deep learning (CenterPoint) is used to find the position of each person. Each individual is assigned a unique ID and a bounding box is drawn around them. The ID and x, y coordinates are then sent to the final phase, where they serve as inputs to Unreal Engine for an interactive experience in which dynamic animations are projected onto the floors and walls. This solution was tested in simulation and with live data.
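The preprocessing and output steps can be sketched as follows. The real pipeline uses PCL's crop box and plane segmentation; this numpy stand-in uses a simple height threshold instead, and the bounds, port, and packet layout are illustrative assumptions, not the project's actual values.

```python
import socket
import struct
import numpy as np

def crop_and_remove_ground(cloud, bounds, ground_z=0.05):
    """Keep points inside the region of interest and above the ground.

    bounds = (xmin, xmax, ymin, ymax). ground_z is a plain height threshold
    standing in for PCL's plane segmentation; all numbers are illustrative.
    """
    xmin, xmax, ymin, ymax = bounds
    mask = ((cloud[:, 0] >= xmin) & (cloud[:, 0] <= xmax)
            & (cloud[:, 1] >= ymin) & (cloud[:, 1] <= ymax)
            & (cloud[:, 2] > ground_z))
    return cloud[mask]

def send_tracks(tracks, addr=("127.0.0.1", 9000), sock=None):
    """Pack (id, x, y) tuples into a little-endian UDP payload and send it."""
    sock = sock or socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    payload = b"".join(struct.pack("<Iff", tid, x, y) for tid, x, y in tracks)
    sock.sendto(payload, addr)
    return payload
```

Each track costs 12 bytes (a 32-bit ID plus two 32-bit floats), so one datagram comfortably carries every person in the hall per frame.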

CenterPoint detection framework


Above shows the two-stage CenterPoint model used to detect each individual. Detection runs in real time on the ROS topic as the data arrives, and the solution does not require a map.
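The core idea of CenterPoint's first stage is that each person appears as a peak in a bird's-eye-view center heatmap. A minimal numpy sketch of that peak-extraction step is shown below; it is a simplified illustration of the decoding idea, not the project's actual model, which uses learned heatmaps and max-pooling on the GPU.

```python
import numpy as np

def extract_centers(heatmap, threshold=0.5):
    """Pick local maxima from a BEV center heatmap.

    A cell counts as a detection if its score exceeds `threshold` and it is
    the maximum of its 3x3 neighbourhood -- a numpy-only version of the
    max-pooling peak extraction used in CenterPoint-style decoding.
    """
    h, w = heatmap.shape
    padded = np.pad(heatmap, 1, constant_values=-np.inf)
    centers = []
    for i in range(h):
        for j in range(w):
            window = padded[i:i + 3, j:j + 3]
            if heatmap[i, j] >= threshold and heatmap[i, j] == window.max():
                centers.append((i, j, float(heatmap[i, j])))
    return centers
```

Each surviving peak is then refined by the second stage and associated with a track ID from frame to frame.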

Simulation of the hall in Gazebo (ROS)

Above shows a 3D model of the hall in Gazebo.

Live sensor data and simulated results in RViz

Reference: CenterPoint (https://arxiv.org/pdf/2006.11275.pdf)