The result is beautiful, but necessarily imperfect. For example, we can’t be certain that the AI got the color perfectly right for those Victorian hats. Shiryaev’s enhancements fictionalize certain aspects of a scene that are unknowable a century after the film was shot.
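That uncertainty is baked into the math, not just the history. A tiny illustration (not from the article, and not how DeOldify actually works internally): converting color to grayscale discards information, so many different colors collapse to the same gray value, and any colorizer has to guess which one the camera originally saw.

```python
# Illustration: why AI colorization is inherently guesswork.
# Black-and-white film records only brightness, so distinct colors
# that share a luma value become indistinguishable.

def to_gray(r, g, b):
    """ITU-R BT.601 luma: the standard weighted average used for grayscale."""
    return round(0.299 * r + 0.587 * g + 0.114 * b)

# A saturated red and a mid-tone green, as they might appear on a hat...
red = (255, 0, 0)
green = (0, 130, 0)

# ...both land on the same gray value in black-and-white footage.
print(to_gray(*red), to_gray(*green))  # prints: 76 76
```

Given only that gray value of 76, no algorithm can know whether the hat was red or green; it can only pick whichever is statistically more plausible.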
On April 14, 1906, the Miles brothers left their studio on San Francisco’s Market Street, boarded a cable car, and began filming what would become an iconic short movie. Called A Trip Down Market Street, it’s a fascinating documentation of life at the time: As the cable car rolls slowly along, the brothers aim their camera straight ahead, capturing women in outrageous frilly Victorian hats as they hurry across the tracks. A policeman strolls by wielding a billy club. Early automobiles swerve in front of the cable car, some of them convertibles, so we can see their drivers bouncing inside. After nearly a dozen minutes, the filmmakers arrive at the turntable in front of the Ferry Building, whose towering clock stopped at 5:12 am just four days later, when a massive earthquake and consequent fire virtually obliterated San Francisco.

Well over a century later, an artificial intelligence geek named Denis Shiryaev has transformed A Trip Down Market Street into something even more magical. Using a variety of publicly available algorithms, Shiryaev colorized and sharpened the film to 4K resolution (that’s 3,840 horizontal pixels by 2,160 vertical pixels) and bumped the choppy frame rate up to 60 frames per second, a process known as frame interpolation. We can finally see vibrant colors on those flamboyant Victorian hats. We can see the puckish looks on those newsboys’ faces. And perhaps most importantly, we can see in unprecedented detail the … byproducts that horses had left on the ground along the cable car’s tracks.

To be clear, you can’t call these restorations of films, because the algorithms aren’t just getting rid of imperfections: they’re actually filling in approximations of the data missing from old, blurry, low-frame-rate films. Basically, the algorithms are making stuff up based on their previous training. For example, the algorithm DeOldify, which handles colorization, was trained on over 14 million images to build an understanding of how objects in the world are usually colored. It can then apply that knowledge to old black-and-white films, painting the old footage with vibrant hues.

“This is an important thing,” says Shiryaev. “We call it an enhancement, because we are training neural networks. And when neural networks redraw in pictures, it's adding a new layer of data. So colorization is enhancement,” he adds. “Frame interpolation is an enhancement.” Shiryaev also removes visual noise (those momentary blips and black lines that flash across the screen), and maybe that could be considered a restoration. But film archivists would scoff at the idea of the rest of Shiryaev’s AI wizardry being a kind of restoration, because it layers in so much extra data, and much of that data is machine-learning guesswork, which is not necessarily historically accurate. “We don't want to argue with people from archives,” Shiryaev says. But let’s go ahead and dive deeper down this philosophical rabbit hole.
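Frame interpolation, the technique that smooths the film's choppy frame rate, can be sketched in a few lines. This is a deliberately naive version (not Shiryaev's neural pipeline, which estimates motion to render plausible in-between content): to raise the frame rate, you synthesize a new frame between each consecutive pair, here by simply averaging them.

```python
# Minimal sketch of frame interpolation: insert a blended frame between
# every pair of consecutive frames, turning n frames into 2n - 1.
import numpy as np

def interpolate_frames(frames: np.ndarray) -> np.ndarray:
    """frames has shape (n, height, width, channels); returns 2n - 1 frames."""
    out = []
    for a, b in zip(frames[:-1], frames[1:]):
        out.append(a)
        # Average in float to avoid uint8 overflow, then cast back.
        out.append(((a.astype(np.float32) + b) / 2).astype(frames.dtype))
    out.append(frames[-1])
    return np.stack(out)

# Toy clip: three 2x2 single-channel frames fading from black to bright.
clip = np.array([[[0, 0], [0, 0]],
                 [[100, 100], [100, 100]],
                 [[200, 200], [200, 200]]], dtype=np.uint8)[..., None]
smooth = interpolate_frames(clip)
print(smooth.shape)  # prints: (5, 2, 2, 1) -- frame count goes from 3 to 5
```

A real interpolator does far more work per frame, but the contract is the same: the new frames between the originals are synthesized, which is exactly why Shiryaev calls the result an enhancement rather than a restoration.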