
Deepfake, the technology that makes it possible to replace people's faces in photos and videos with someone else's, has evolved in recent years to the point where telling real footage from fabricated material is becoming increasingly difficult. On pornographic sites, for example, deepfakes are used to create titillating videos featuring the likenesses of famous actors. All of this happens without the consent of the people targeted, and as the underlying machine-learning technology grows more sophisticated, fears of other forms of abuse are spreading. The threat that deepfakes could completely discredit digital recordings as evidence in court cases is real and hangs over the justice sector like the Sword of Damocles. The good news now comes from Truepic, which has come up with a simple way to verify the authenticity of recordings.

Its creators call the new technology Foresight. Instead of analyzing a video after the fact to determine whether it is a deepfake, it ensures authenticity by tying each recording to the hardware on which it was created. Foresight tags every recording at the moment of capture with a special set of encrypted metadata. The data is stored in common formats; in a preview for Android Police, the company demonstrated that an image secured this way can be saved as an ordinary JPEG, so there is no need to worry about incompatible file formats.
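To illustrate the general principle, here is a minimal sketch in Python of how a capture could be cryptographically bound to the device that produced it. This is not Truepic's actual implementation: the device key, device ID, and sidecar file used here are assumptions made for the example, and a real phone would keep the key in secure hardware and embed the signed metadata directly in the image file rather than in a separate JSON.

```python
import hashlib
import hmac
import json
from pathlib import Path

# Hypothetical device-bound secret; on a real device this would live in
# secure hardware (e.g. a TEE or secure enclave), not in source code.
DEVICE_KEY = b"example-device-secret"
DEVICE_ID = "camera-0001"


def sign_capture(image_path: str) -> None:
    """Hash the captured image and store an HMAC signature alongside it."""
    data = Path(image_path).read_bytes()
    digest = hashlib.sha256(data).hexdigest()
    signature = hmac.new(DEVICE_KEY, digest.encode(), hashlib.sha256).hexdigest()
    meta = {"device_id": DEVICE_ID, "sha256": digest, "signature": signature}
    # Sidecar file stands in for metadata embedded in the JPEG itself.
    Path(image_path + ".auth.json").write_text(json.dumps(meta, indent=2))


def verify_capture(image_path: str) -> bool:
    """Recompute the hash and check the signature; any edit to the file fails."""
    meta = json.loads(Path(image_path + ".auth.json").read_text())
    data = Path(image_path).read_bytes()
    digest = hashlib.sha256(data).hexdigest()
    expected = hmac.new(DEVICE_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return digest == meta["sha256"] and hmac.compare_digest(expected, meta["signature"])
```

Any change to the image bytes after capture breaks the signature, which is exactly the property a scheme like Foresight relies on to vouch for authenticity.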

The technology still has a few kinks, though. The biggest is probably that the files do not yet record the changes that have been made to them after capture. The remedy is to get more companies on board to support this security method, so the technology's success will largely depend on the involvement of the biggest camera and mobile-device manufacturers, led by Samsung and Apple. Are you afraid that someone might abuse your likeness? Share your opinion with us in the discussion below the article.
