Enhanced Vision-Based Speed Estimation By Roadside Surveillance Cameras

Cheng Wei Peng*, Tai You Lin, Chen Chien Hsu, Sheng Chung Huang

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Vision-based applications have gradually been applied to city-management demands in recent years. This study proposes an enhanced speed-estimation method that uses video data from publicly accessible government surveillance cameras. By incorporating YOLO and DeepSORT, the system deduces each vehicle's position and accurately estimates its speed in a regression manner. Preliminary experimental results show that the proposed method produces vehicle counts and speed estimates comparable to the ground truth provided by the transportation web service platform.
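The abstract describes estimating each vehicle's speed "in a regression manner" from positions produced by a detector-tracker pipeline (YOLO + DeepSORT). A minimal sketch of that idea, assuming a fixed pixel-to-metre scale, a known frame rate, and a per-vehicle track of the hypothetical form `(frame_index, x_px, y_px)` (none of these specifics are given in the paper):

```python
def estimate_speed_kmh(track, fps=30.0, metres_per_pixel=0.05):
    """Estimate a tracked vehicle's speed by least-squares regression.

    track: list of (frame_index, x_px, y_px) observations from a tracker
    such as DeepSORT. The scale and frame rate here are assumed values.
    """
    # Convert each observation to (time in seconds, distance travelled
    # in metres from the first observation).
    f0, x0, y0 = track[0]
    times, dists = [], []
    for f, x, y in track:
        times.append((f - f0) / fps)
        dx = (x - x0) * metres_per_pixel
        dy = (y - y0) * metres_per_pixel
        dists.append((dx * dx + dy * dy) ** 0.5)

    # Least-squares slope of distance vs. time gives speed in m/s.
    n = len(times)
    mt, md = sum(times) / n, sum(dists) / n
    num = sum((t - mt) * (d - md) for t, d in zip(times, dists))
    den = sum((t - mt) ** 2 for t in times)
    speed_mps = num / den
    return speed_mps * 3.6  # convert m/s to km/h
```

Fitting a regression line over the whole track, rather than differencing consecutive frames, smooths out per-frame detection jitter; the actual paper additionally handles camera geometry, which this sketch ignores.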

Original language: English
Title of host publication: GCCE 2023 - 2023 IEEE 12th Global Conference on Consumer Electronics
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 889-890
Number of pages: 2
ISBN (Electronic): 9798350340181
DOIs
Publication status: Published - 2023
Event: 12th IEEE Global Conference on Consumer Electronics, GCCE 2023 - Nara, Japan
Duration: 2023 Oct 10 - 2023 Oct 13

Publication series

Name: GCCE 2023 - 2023 IEEE 12th Global Conference on Consumer Electronics

Conference

Conference: 12th IEEE Global Conference on Consumer Electronics, GCCE 2023
Country/Territory: Japan
City: Nara
Period: 2023/10/10 - 2023/10/13

Keywords

  • Advanced Traffic Management System
  • ITS
  • Object Detection
  • Object Tracking

ASJC Scopus subject areas

  • Artificial Intelligence
  • Energy Engineering and Power Technology
  • Electrical and Electronic Engineering
  • Safety, Risk, Reliability and Quality
  • Instrumentation
  • Atomic and Molecular Physics, and Optics
