TY - GEN
T1 - Real-Time Point Cloud Action Recognition System with Automated Point Cloud Preprocessing
AU - Lai, Yen Ting
AU - Lin, Cheng Hung
AU - Chou, Po Yung
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Point cloud action recognition is less affected by changes in lighting and viewing angle than image-based methods, as it relies on the three-dimensional positions of objects rather than pixel values, enabling robust recognition even in complex and dark environments. It also has widespread applications in fields such as robotics, virtual reality, autonomous driving, human-computer interaction, and game development. For instance, understanding human actions is crucial for interaction and collaboration in robotics, while in virtual reality it can capture and reproduce user movements to enhance realism and interactivity. To build a smoothly operating point cloud action recognition system, background and irrelevant points must typically be filtered out to obtain clean, aligned data. In previous work, point cloud filtering and action recognition were usually performed separately; few systems integrate both, and some perform action recognition without background filtering at all. In this paper, we propose a pipeline that acquires point cloud data directly from the Azure Kinect DK and performs comprehensive automated preprocessing, producing cleaner point cloud data free of background points and suitable for action recognition. Our approach uses PSTNet for point cloud action recognition and trains the model on a 12-class action dataset obtained through this automated preprocessing. The result is a real-time point cloud action recognition system with integrated automated point cloud preprocessing.
KW - action recognition
KW - dynamic point cloud
UR - http://www.scopus.com/inward/record.url?scp=85186984758&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85186984758&partnerID=8YFLogxK
DO - 10.1109/ICCE59016.2024.10444448
M3 - Conference contribution
AN - SCOPUS:85186984758
T3 - Digest of Technical Papers - IEEE International Conference on Consumer Electronics
BT - 2024 IEEE International Conference on Consumer Electronics, ICCE 2024
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2024 IEEE International Conference on Consumer Electronics, ICCE 2024
Y2 - 6 January 2024 through 8 January 2024
ER -