Earlier this year, I digitized a few old family movies from the mid-1930s. I have wanted to try AI frame interpolation and colorization on original 8mm film for some time, and these movies are a perfect test case. The video below was shot in 1938 and was digitized with a telecine machine.

For frame interpolation, I ran the original video through the Depth-Aware Video Frame Interpolation (DAIN) network developed by Wenbo Bao. The code is available on GitHub, and I ran the network through Google Colab.
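
For anyone curious about the mechanics, the rough workflow in Colab is to mount Google Drive, split the digitized clip into individual frames, and hand those frames to the interpolation network. Below is a minimal sketch of that preprocessing step; the paths and file names are placeholders rather than the exact ones I used.

```python
import os
import cv2
from google.colab import drive

# Mount Google Drive so the digitized clip is visible to the Colab runtime,
# then split it into numbered PNG frames (DAIN-style interpolators operate on
# folders of individual frames). Paths and names are placeholders.
drive.mount("/content/drive")

src = "/content/drive/MyDrive/family_films/1938_original.mp4"
frame_dir = "/content/frames"
os.makedirs(frame_dir, exist_ok=True)

cap = cv2.VideoCapture(src)
i = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imwrite(os.path.join(frame_dir, f"{i:06d}.png"), frame)
    i += 1
cap.release()
print(f"Wrote {i} frames to {frame_dir}")
```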
I suppose a brief explanation of frame interpolation would be helpful before showing the results. The idea behind DAIN and similar deep learning algorithms is to synthesize new in-between frames and insert them between the actual frames of a film. In the example film, the frame rate of the 8mm camera was probably between 15 and 18 fps; with older films (think silent movie era), the frame rate might be even lower. These low frame rates are why old movies look ‘sped-up’: the scene is represented by fewer images than modern playback expects, so motion appears compressed. DAIN generates simulated frames between the extant ones, each approximating what a frame would look like had it been captured there. In this way, we can ‘upscale’ a low-fps movie such as this one to look as though it had been shot with a modern 30 fps camera. The resulting film appears smoother, with slower, more natural motion.
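
To make the idea concrete, here is a deliberately naive sketch of frame-rate doubling, where each synthesized frame is just a 50/50 blend of its two neighbours. DAIN does something far more sophisticated (it warps the neighbouring frames using estimated optical flow and depth), but the overall structure is the same: insert a generated frame between each pair of real ones. File names here are placeholders.

```python
import cv2

# Conceptual illustration only: double a clip's frame rate by inserting one
# synthesized frame between each pair of real frames. Here the in-between
# frame is a simple average; DAIN instead warps the neighbouring frames using
# estimated optical flow and depth, which handles motion far better.
cap = cv2.VideoCapture("original_15fps.mp4")
fps = cap.get(cv2.CAP_PROP_FPS)
w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
out = cv2.VideoWriter("interpolated_2x.mp4",
                      cv2.VideoWriter_fourcc(*"mp4v"), fps * 2, (w, h))

ok, prev = cap.read()
while ok:
    ok, curr = cap.read()
    out.write(prev)  # real frame
    if ok:
        # Synthesized in-between frame: average of its two neighbours.
        out.write(cv2.addWeighted(prev, 0.5, curr, 0.5, 0))
        prev = curr

cap.release()
out.release()
```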

Original 8mm film

Below you can see the output of the DAIN network.

Black and White, 30 fps film

Next, using another AI film-manipulation network, I colorized the film. To do this, I used the DeOldify network (see my super8 movie project), also run in Google Colab. I think the results are quite good, and all that is left is getting the output of both the DAIN and DeOldify networks into higher definition for YouTube embedding.
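
For reference, a colorization pass in Colab looks roughly like the sketch below. The helper names follow the DeOldify repository's own Colab notebook (get_video_colorizer, colorize_from_file_name); exact names, defaults, and folder layout may differ between DeOldify versions, and the file name is a placeholder.

```python
# Minimal sketch of a DeOldify video-colorization run in Colab. It assumes the
# DeOldify repo has been cloned, its requirements installed, the pretrained
# video weights downloaded, and the source clip placed in the folder DeOldify
# expects for source videos. Helper names follow the project's Colab notebook.
from deoldify import device
from deoldify.device_id import DeviceId
device.set(device=DeviceId.GPU0)  # run on the Colab GPU

from deoldify.visualize import get_video_colorizer

colorizer = get_video_colorizer()
# render_factor trades colour stability for detail; ~21 is the notebook default.
result_path = colorizer.colorize_from_file_name("1938_interpolated.mp4",
                                                render_factor=21)
print(result_path)
```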

Final Colorized Movie