Typically, expensive high-speed cameras are required to record fast motion at high temporal resolution. Fortunately, video interpolation, the task of estimating the intermediate frames between two given frames, offers a cheaper alternative.
A recent paper published on arXiv.org proposes a new video frame interpolation method.
The researchers propose to use both video frames and event streams as inputs in order to handle complex non-linear motion. An event camera supplies the intermediate-time information that traditional frame-based cameras lack, and the event streams are fed into an unsupervised learning framework that directly estimates the optical flow between the intermediate frame and the input frames.
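The paper does not publish its implementation details here, but the core idea above, warping the two input frames toward the intermediate time with estimated optical flow and blending the results, can be sketched as follows. The flow fields are assumed to come from the paper's event-guided network; here they are plain arrays, and the warp uses nearest-neighbour sampling for brevity (real implementations use bilinear sampling).

```python
import numpy as np

def backward_warp(frame, flow):
    """Sample each output pixel at its flow-displaced source location.
    `flow[..., 0]` / `flow[..., 1]` are per-pixel x / y displacements
    (nearest-neighbour rounding for brevity)."""
    h, w = frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    src_x = np.clip(np.round(xs + flow[..., 0]).astype(int), 0, w - 1)
    src_y = np.clip(np.round(ys + flow[..., 1]).astype(int), 0, h - 1)
    return frame[src_y, src_x]

def synthesize_middle(frame0, frame1, flow_t0, flow_t1):
    """Blend the two warped inputs into the intermediate frame.
    flow_t0 / flow_t1 are hypothetical flows from the intermediate
    time back to frame0 / frame1 -- in the paper these are estimated
    by the network with the help of the event stream."""
    warped0 = backward_warp(frame0, flow_t0)
    warped1 = backward_warp(frame1, flow_t1)
    return 0.5 * warped0 + 0.5 * warped1
```

With zero flow the warp is the identity, so the synthesized middle frame of two identical inputs is the input itself; non-zero flow shifts content toward the intermediate position.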
The proposed approach performs favorably against state-of-the-art approaches on both synthetic benchmarks and real data.
Recording fast motion at high FPS (frames per second) requires expensive high-speed cameras. As an alternative, interpolating low-FPS video from commodity cameras has attracted significant attention. If only low-FPS videos are available, motion models (linear or quadratic) are necessary to estimate the intermediate frames, but such models fail to capture complex motions. The event camera, a new sensor whose pixels report brightness changes at a temporal resolution of microseconds (10⁻⁶ s), is a game-changing device for enabling video interpolation in the presence of arbitrarily complex motion. Since the event camera is a new sensor, its potential has not been fulfilled due to the lack of processing algorithms. The pioneering work Time Lens introduced event cameras to video interpolation by designing optical devices to collect a large amount of training data, with high-speed frames paired to events, which is very expensive at scale. To fully unlock the potential of event cameras, this paper proposes the novel TimeReplayer algorithm to interpolate videos captured by commodity cameras with events. It is trained in an unsupervised cycle-consistent style, eliminating the need for high-speed training data and bringing in the additional capability of video extrapolation. Its state-of-the-art results and the demo video in the supplement reveal the promising future of event-based vision.
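The cycle-consistent training mentioned in the abstract can be made concrete with a toy sketch. The idea: synthesize the unobserved middle frame, run the model back toward an observed input, and penalise the mismatch with the real frame, so no high-speed ground truth is needed. Everything below is a hypothetical stand-in, not the paper's actual model: `interpolate` and `extrapolate` are simple linear placeholders where TimeReplayer would use its learned, event-guided networks.

```python
import numpy as np

def interpolate(frame_a, frame_b):
    # Stand-in for the learned event-guided interpolator (plain average).
    return 0.5 * (frame_a + frame_b)

def extrapolate(frame_mid, frame_b):
    # Stand-in for the learned extrapolator: linear extrapolation
    # from frame_b through frame_mid back to the earlier timestamp.
    return 2.0 * frame_mid - frame_b

def cycle_loss(frame0, frame1):
    """One possible cycle-consistency signal: the middle frame at t = 0.5
    has no ground truth, but mapping it back to t = 0 should reproduce
    the observed frame0, giving a self-supervised training loss."""
    middle = interpolate(frame0, frame1)      # t = 0.5, unobserved
    frame0_hat = extrapolate(middle, frame1)  # back to t = 0
    return float(np.mean(np.abs(frame0_hat - frame0)))
```

With these linear stand-ins the cycle closes exactly (loss near zero); in training, the loss instead drives the networks to make the cycle close on real data.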
Research Paper: He, W., "TimeReplayer: Unlocking the Potential of Event Cameras for Video Interpolation", 2022. Link: https://arxiv.org/abs/2203.13859
Project Page: https://sites.google.com/view/timereplayer/