A collaborative client participant fusion system for realistic remote conferences

Wei Song, Mingyun Wen, Yulong Xi, Phuong Minh Chu, Hoang Vu, Shokh Jakhon Kayumiy, Kyungeun Cho

Research output: Contribution to journal › Article › peer-review

3 Scopus citations

Abstract

Remote conferencing systems provide a shared environment in which people in different locations can communicate and collaborate in real time. Current remote video conferencing systems present separate video images of the individual participants. To achieve a more realistic conference experience, we enhance video conferencing by integrating the remote images into a shared virtual environment. This paper proposes a collaborative client participant fusion system based on a real-time foreground segmentation method. In each client system, foreground pixels are extracted from the participant images using a feedback background modeling method. Because the segmentation results often contain noise and holes caused by adverse environmental lighting conditions and substandard camera resolution, a Markov Random Field model is applied within the morphological operations of dilation and erosion. This foreground segmentation refinement is implemented using graphics processing unit (GPU) programming to enable real-time image processing. Subsequently, the segmented foreground pixels are transmitted to a server, which fuses the remote participant images into a shared virtual environment. The fused conference scene is presented as a realistic holographic projection.
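The client-side pipeline described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the feedback background model is approximated here by a simple running average, the MRF-regularized refinement is replaced by a plain morphological closing (dilation followed by erosion), and all parameter names (`alpha`, `threshold`) are illustrative assumptions.

```python
import numpy as np

def update_background(background, frame, alpha=0.05):
    """Feedback background model: blend each new frame into the model."""
    return (1 - alpha) * background + alpha * frame

def foreground_mask(background, frame, threshold=30):
    """Mark pixels that differ strongly from the background as foreground."""
    return np.abs(frame.astype(int) - background.astype(int)) > threshold

def dilate(mask):
    """3x3 binary dilation: a pixel is set if any neighbour is set."""
    h, w = mask.shape
    padded = np.pad(mask, 1)
    out = np.zeros_like(mask)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out |= padded[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
    return out

def erode(mask):
    """3x3 binary erosion: a pixel survives only if all neighbours are set."""
    h, w = mask.shape
    padded = np.pad(mask, 1)
    out = np.ones_like(mask)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out &= padded[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
    return out

def refine(mask):
    """Closing (dilate, then erode) fills small holes in the foreground."""
    return erode(dilate(mask))
```

In the paper this refinement runs as a per-pixel GPU kernel; the nested loops above merely spell out the same neighbourhood logic on the CPU for clarity.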

Original language: English
Pages (from-to): 2720-2733
Number of pages: 14
Journal: Journal of Supercomputing
Volume: 72
Issue number: 7
DOIs
State: Published - 1 Jul 2016

Keywords

  • Foreground segmentation
  • GPU
  • Human-centric communication
  • Mixed reality
  • Remote conferencing systems
