A Canadian enthusiast has used a neural network to reconstruct the faces of Roman emperors from their sculptures. The researcher then had to refine the resulting images in Photoshop.
The appearance of the Roman emperors survives only in hundreds of sculptures, and even the most detailed works cannot convey what these men looked like in life. To find out, Canadian cinematographer and virtual reality designer Daniel Voshart turned to machine learning – computer algorithms that improve with experience. In this case, the system processes information through a hierarchy of nodes that interact much like neurons in the brain.
The Artbreeder neural network analyzed 800 busts to model more realistic facial shapes, features, hair, and skin, and to add color. Voshart then refined Artbreeder's output in Photoshop, adding details gathered from coins, works of art, and written descriptions of the emperors in historical texts to bring the portraits to life.
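Artbreeder is built on generative adversarial networks (GANs), in which each face corresponds to a point – a "latent vector" – in a high-dimensional space, and blending faces amounts to interpolating between those points. The sketch below illustrates that idea only; the function name, the 512-dimensional size (typical of StyleGAN), and the random stand-in vectors are assumptions for illustration, not Voshart's actual pipeline.

```python
import numpy as np

def blend_latents(z_a, z_b, weight):
    """Linearly interpolate between two latent vectors.

    weight = 0.0 returns z_a unchanged; weight = 1.0 returns z_b;
    values in between produce a blend of the two faces.
    """
    return (1.0 - weight) * z_a + weight * z_b

# Hypothetical latent codes standing in for encoded source images:
# in a real GAN tool these would come from an encoder, not random noise.
rng = np.random.default_rng(0)
bust_latent = rng.normal(size=512)       # e.g. encoded from a bust photo
coin_latent = rng.normal(size=512)       # e.g. encoded from a coin portrait

# A blend weighted 70% toward the bust, 30% toward the coin portrait.
mixed = blend_latents(bust_latent, coin_latent, 0.3)
```

In practice a tool like Artbreeder exposes such blending as sliders, and the mixed vector is fed back through the generator network to produce the composite face.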
"A well-lit, well-sculpted bust with minor damage and standard facial features is fairly easy to reproduce," said the researcher. "Conversely, a dataset that includes damaged sculptures or photographs taken in low light can produce the notorious 'junk' images that are not very realistic."
The busts that Voshart preferred as primary sources were made during the emperors' lifetimes, so they give a reliable idea of what these people actually looked like. For skin color, Voshart trusted the AI: the machine itself distributed the shades so that the surface of the model resembled realistic human skin.