Old film classics are plagued by low resolution and distracting artifacts. Film restoration takes painstaking manual effort; therefore, an algorithm that automates this tedious task would be highly beneficial.

Retro photo camera. Image credits: PXFuel, free license

A recent paper published on arXiv.org proposes to integrate the entire film restoration pipeline into a single framework that performs spatio-temporal restoration.

The researchers rely on the insight that content occluded by defects in one frame is often revealed in adjacent frames. Degradation can therefore be repaired by exploiting the spatio-temporal context rather than relying on hallucinated content. Visual comparisons and quantitative evaluations show that the proposed approach performs well on both synthetic data and real old films.
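The core insight can be illustrated with a deliberately simple toy sketch (this is not the paper's learned method, which aligns frames and fuses them with a neural network): for a roughly static scene, a pixel hidden by a scratch in one frame can be copied from the same location in a neighboring frame where that pixel is clean. The function name and fallback strategy below are our own illustrative choices.

```python
import numpy as np

def repair_with_temporal_context(frames, masks):
    """Toy illustration of spatio-temporal repair (not the paper's RTN):
    fill defective pixels in each frame with valid pixels at the same
    location in neighboring frames, falling back to the frame's own
    clean-pixel mean if no neighbor can help.

    frames: list of HxW float arrays (a static or pre-aligned sequence)
    masks:  list of HxW bool arrays, True marking defective pixels
    """
    restored = []
    n = len(frames)
    for t in range(n):
        out = frames[t].copy()
        defect = masks[t].copy()
        # look at increasingly distant neighbors for clean replacements
        for dt in (-1, 1, -2, 2):
            s = t + dt
            if not (0 <= s < n):
                continue
            # borrow pixels that are defective here but clean in frame s
            borrow = defect & ~masks[s]
            out[borrow] = frames[s][borrow]
            defect &= masks[s]  # keep only still-unfilled defects
        if defect.any():
            # no neighbor had this pixel clean: crude spatial fallback
            out[defect] = frames[t][~masks[t]].mean()
        restored.append(out)
    return restored
```

In the actual paper, real footage has motion, so frames must first be aligned (via flow estimation) before such temporal borrowing makes sense; the transformer blocks then compensate for alignment errors.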

Furthermore, the researchers show that the same framework can be readily adapted for film colorization. On this task, the method compares favorably with leading colorization methods.

We present the Recurrent Transformer Network (RTN), a learning-based framework for restoring heavily degraded old films. Instead of performing frame-wise restoration, our method is based on hidden knowledge learned from adjacent frames, which contain abundant information about occluded content; this is beneficial for restoring the challenging artifacts of each frame while ensuring temporal coherence. Furthermore, contrasting the representation of the current frame with the hidden knowledge makes it possible to infer scratch positions in an unsupervised manner, and such defect localization generalizes well to real-world degradation. To better resolve mixed degradation and compensate for flow estimation errors during frame alignment, we propose to leverage more expressive transformer blocks for spatial restoration. Experiments on both synthetic datasets and real-world old films demonstrate the significant superiority of the proposed RTN over existing solutions. In addition, the same framework can effectively propagate color from keyframes to the entire video, ultimately producing attractive restored films. The implementation and model will be released at this https URL.
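The "contrast the current frame against accumulated hidden knowledge" idea from the abstract can be sketched with a toy recurrent loop (again, purely illustrative: RTN learns this contrast end-to-end, whereas the running average and threshold below are arbitrary stand-ins):

```python
import numpy as np

def locate_scratches(frames, threshold=0.5):
    """Toy sketch of unsupervised defect localization: maintain a running
    average of past frames as a crude 'hidden state', and flag pixels in
    the current frame that deviate strongly from it as likely scratches.
    (RTN learns both the hidden representation and the contrast; the
    fixed threshold here is an illustrative assumption.)

    frames: list of HxW float arrays of an aligned, mostly static sequence
    Returns one HxW bool defect map per frame (True = suspected defect).
    """
    hidden = frames[0].astype(float)
    defect_maps = []
    for frame in frames:
        deviation = np.abs(frame - hidden)
        defect = deviation > threshold
        defect_maps.append(defect)
        # update the hidden state only from pixels believed to be clean,
        # so scratches do not contaminate the temporal memory
        hidden[~defect] = 0.5 * hidden[~defect] + 0.5 * frame[~defect]
    return defect_maps
```

The design point mirrored here is that defects are transient while scene content persists, so a temporal memory gives a reference against which defects stand out without any labeled masks.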

Research Paper: Wan, Z., Zhang, B., Chen, D., & Liao, J., “Bringing old films back to life”, 2022. Link: https://arxiv.org/abs/2203.17276
