A real-time camera match-moving method for virtual-real synthesis image composition using temporal depth fusion

An Chun Luo, Sei-Wang Chen, Kun Lung Tseng

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This paper presents a virtual-real synthesis image composition method, called match-moving, suitable for both graphic and stereoscopic image composition. The proposed method consists of two subsystems: camera tracking and virtual-real preview. The camera tracking subsystem, based on temporal depth fusion, acquires the camera pose and trajectory. To compensate for noise in dynamic scenes, an a priori model is employed together with a human skeleton detection method and a spatial-temporal attention analysis method. The composition of real and virtual objects is performed by a real-time virtual-real synthesis preview system. Experimental results on camera pose measurement show that the proposed system achieves high accuracy. Our goal is to design an intuitive camera match-moving system that performs virtual-real synthesis with real-time 3D preview visualization.
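The published record does not include source code. As a rough, non-authoritative illustration of the temporal-depth-fusion tracking pipeline the abstract describes (back-project depth, mask dynamic regions, estimate camera pose, fuse into a running model), the following minimal Python sketch may help. Every function name, parameter, and the box-based dynamic masking below are assumptions made for illustration; the paper's actual method uses human skeleton detection and spatio-temporal attention analysis, and its fusion is temporal depth fusion rather than this simple point-set merge.

# Illustrative sketch only (not the authors' implementation): a simplified
# temporal-depth-fusion camera tracker. All names and parameters are assumptions.
import numpy as np

def backproject(depth, fx, fy, cx, cy):
    """Back-project a depth image (H x W, metres) into camera-frame 3-D points."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    pts = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    return pts[depth.reshape(-1) > 0]               # keep valid depth only

def mask_dynamic(points, boxes):
    """Drop points inside axis-aligned boxes around detected people.
    A crude stand-in for the paper's skeleton / spatio-temporal attention step."""
    keep = np.ones(len(points), dtype=bool)
    for lo, hi in boxes:                            # each box: (min_xyz, max_xyz)
        inside = np.all((points >= lo) & (points <= hi), axis=1)
        keep &= ~inside
    return points[keep]

def estimate_pose(src, dst):
    """Rigid (R, t) with dst ~ R @ src + t via the Kabsch/SVD solution.
    Assumes src and dst are in correspondence (a proxy for one ICP iteration)."""
    src_c, dst_c = src.mean(0), dst.mean(0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                        # correct an improper reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, dst_c - R @ src_c

def fuse(model, new_pts, max_points=200_000):
    """Temporal fusion: merge newly observed world-frame points into the model.
    A point-set merge stands in for proper volumetric (TSDF) depth fusion."""
    model = np.vstack([model, new_pts]) if model.size else new_pts
    if len(model) > max_points:                     # down-sample to bound memory
        idx = np.random.choice(len(model), max_points, replace=False)
        model = model[idx]
    return model

# Tiny self-check with synthetic data: recover a known 5-degree yaw + translation.
rng = np.random.default_rng(0)
model_pts = rng.uniform(-1.0, 1.0, size=(500, 3))   # stands in for the fused model
yaw = np.deg2rad(5.0)
R_true = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                   [np.sin(yaw),  np.cos(yaw), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.02, 0.00, 0.01])
observed = model_pts @ R_true.T + t_true             # simulated next-frame points
R_est, t_est = estimate_pose(model_pts, observed)
print("rotation ok:", np.allclose(R_est, R_true, atol=1e-6), "t:", np.round(t_est, 3))
fused = fuse(np.empty((0, 3)), model_pts)            # seed the running model
print("fused model size:", len(fused))

The single Kabsch step stands in for the iterative alignment a real tracker would run; in practice the pose would be refined over nearest-neighbour correspondences across frames and the masked depth fused volumetrically before the virtual-real preview stage.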

Original language: English
Title of host publication: Proceedings - 2016 International Conference on Optoelectronics and Image Processing, ICOIP 2016
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 35-39
Number of pages: 5
ISBN (Electronic): 9781509008797
DOI: 10.1109/OPTIP.2016.7528515
Publication status: Published - 2016 Aug 1
Event: 2016 International Conference on Optoelectronics and Image Processing, ICOIP 2016 - Warsaw, Poland
Duration: 2016 Jun 10 - 2016 Jun 12

Publication series

Name: Proceedings - 2016 International Conference on Optoelectronics and Image Processing, ICOIP 2016

Other

Other: 2016 International Conference on Optoelectronics and Image Processing, ICOIP 2016
Country: Poland
City: Warsaw
Period: 16/6/10 - 16/6/12

Keywords

  • camera tracking
  • match-moving
  • temporal depth fusion
  • virtual-real synthesis

ASJC Scopus subject areas

  • Computer Science Applications
  • Computer Vision and Pattern Recognition
  • Signal Processing

Cite this

Luo, A. C., Chen, S-W., & Tseng, K. L. (2016). A real-time camera match-moving method for virtual-real synthesis image composition using temporal depth fusion. In Proceedings - 2016 International Conference on Optoelectronics and Image Processing, ICOIP 2016 (pp. 35-39). [7528515] (Proceedings - 2016 International Conference on Optoelectronics and Image Processing, ICOIP 2016). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/OPTIP.2016.7528515
