Color coherence-based scene-change detection for frame rate up-conversion

Ho Sub Lee, Sung In Cho

Research output: Contribution to journal › Article › peer-review

Abstract

Existing scene-change detection methods usually use differences in luminance values between consecutive frames to detect scene changes. As a result, they can have difficulty detecting scene changes in diverse video content because luminance values alone cannot properly represent region characteristics. To solve this problem, this paper proposes a new scene-change detection method for frame rate up-conversion that uses color coherence values. We define the distribution patterns of color features, the so-called color coherence patterns, as the feature used to determine whether a given frame contains a scene change. The proposed method converts the color coherence patterns of corresponding regions into bit codes and then uses both the difference between the converted bit codes and the difference in average luminance values between the previous and current frames to locally determine whether each region contains a scene change. In this process, local scene changes are determined per divided block by identifying the regions where a scene change occurred, and if the number of flagged blocks in the entire frame exceeds a certain threshold, a global scene change is declared. In addition, a refinement process is applied to enhance the detection accuracy. With these detection processes, the proposed method further improves detection accuracy. Experimental results showed that the proposed method improved the average F1 score to 0.5398 compared with benchmark methods.
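The abstract does not specify how the color coherence pattern is constructed or which thresholds are used, so the following is only a minimal Python sketch of the described pipeline: a per-block color feature thresholded into a bit code, a block-level decision combining bit-code and average-luminance differences, and a global vote over the flagged blocks. The histogram-based bit code and all names and parameters (block_bit_code, detect_scene_change, code_thresh, luma_thresh, global_ratio) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def block_bit_code(block_rgb, bins=4):
    """Hypothetical color coherence bit code: quantize the block's RGB
    distribution into a histogram and threshold each bin against the
    histogram mean, yielding one bit per bin."""
    q = block_rgb.astype(int) // (256 // bins)          # values in [0, bins)
    q = q.reshape(-1, 3)
    idx = (q[:, 0] * bins + q[:, 1]) * bins + q[:, 2]   # joint color bin index
    hist = np.bincount(idx, minlength=bins ** 3).astype(float)
    hist /= hist.sum()
    return hist > hist.mean()                           # boolean bit code

def detect_scene_change(prev, curr, block=32,
                        code_thresh=0.25, luma_thresh=20.0,
                        global_ratio=0.5):
    """Block-wise local detection followed by a global vote, loosely
    following the pipeline described in the abstract. prev and curr are
    uint8 RGB frames of identical shape (H, W, 3)."""
    luma_weights = np.array([0.299, 0.587, 0.114])      # ITU-R BT.601
    h, w, _ = prev.shape
    flagged, total = 0, 0
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            bp = prev[y:y + block, x:x + block]
            bc = curr[y:y + block, x:x + block]
            # Normalized Hamming distance between the two bit codes.
            code_diff = np.mean(block_bit_code(bp) != block_bit_code(bc))
            # Difference of average luminance values between the frames.
            luma_diff = abs((bp @ luma_weights).mean()
                            - (bc @ luma_weights).mean())
            if code_diff > code_thresh and luma_diff > luma_thresh:
                flagged += 1                            # local scene change
            total += 1
    # Global scene change if enough blocks flag a local change.
    return flagged / total > global_ratio

# Toy usage with synthetic frames (real input would be decoded video):
prev = np.random.randint(0, 256, (240, 320, 3), dtype=np.uint8)
curr = np.random.randint(0, 256, (240, 320, 3), dtype=np.uint8)
print(detect_scene_change(prev, curr))
```

The paper's refinement step is omitted here, since the abstract does not describe it in enough detail to reconstruct.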

Original language: English
Journal: Multimedia Tools and Applications
DOIs
State: Accepted/In press - 2024

Keywords

  • Color coherence patterns
  • Frame rate up-conversion
  • Motion estimation
  • Scene-change detection
