Frame Difference-Based Real-Time Video Stylization in Video Calls

Zheyang Xiong , Cornelius Weber , Xiaolin Hu
International Conference on Intelligent Control and Information Processing (ICICIP), pages 333--339, doi: 10.1109/ICICIP.2018.8606694 - Nov 2018
A naive approach to video stylization is to perform neural style transfer on each frame independently, but this produces a flickering effect that is particularly visible in static regions. Previous remedies extract optical flow from the video and use it to stabilize the stylized output; however, computing optical flow is complex and time-consuming. We consider stylizing videos in which the background is fixed and only the foreground object moves, as is the case in video calls. We propose a simple frame-difference-based method to stylize such videos in real time. The main idea is to use the frame difference to detect the foreground and rebuild it in the next frame while retaining the stylized background from the previous frame. The method is easy to implement and stylizes videos in real time with stable frames.
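The core idea can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the paper's implementation: the `stylize` function stands in for a feed-forward style-transfer network (here a trivial pixel inversion), and the per-pixel `threshold` is a hypothetical parameter; the paper's actual detection and compositing details may differ.

```python
import numpy as np

def stylize(frame):
    # Placeholder for a neural style-transfer network; a simple
    # inversion stands in for the real model in this sketch.
    return 255 - frame

def stylize_stream(frames, threshold=25):
    """Frame-difference stylization: restyle only the moving foreground,
    reuse the stable stylized background from the previous output frame."""
    prev_frame = frames[0]
    prev_out = stylize(prev_frame)
    outputs = [prev_out]
    for frame in frames[1:]:
        # Foreground mask: pixels whose difference to the previous
        # raw frame exceeds the threshold are treated as moving.
        diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
        mask = diff.max(axis=-1, keepdims=True) > threshold
        # Rebuild the foreground from the freshly stylized frame;
        # keep the previous stylized background everywhere else.
        out = np.where(mask, stylize(frame), prev_out)
        outputs.append(out)
        prev_frame, prev_out = frame, out
    return outputs
```

Because static background pixels are copied verbatim from the previous stylized frame, they cannot flicker between frames; only the detected foreground is re-stylized.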


@InProceedings{XWH18, 
 	 author =  {Xiong, Zheyang and Weber, Cornelius and Hu, Xiaolin},  
 	 title = {Frame Difference-Based Real-Time Video Stylization in Video Calls}, 
 	 booktitle = {International Conference on Intelligent Control and Information Processing (ICICIP)},
 	 pages = {333--339},
 	 year = {2018},
 	 month = {Nov},
 	 publisher = {IEEE},
 	 doi = {10.1109/ICICIP.2018.8606694}, 
 }