Deggers Posted May 8, 2021 (edited)
Here's a bit of digital wizardry that's been doing the rounds recently . . . "Deep Nostalgia" is an app capable of animating still photographs. After you upload a photo to MyHeritage.com (a company specialising in video reenactment using deep learning technology), the app analyses the image and animates it by digitally applying a sequence of movements and gestures, such as smiles and blinks. By way of demonstration, here's a photo of Miss Monroe. And the same image again, animated with "Deep Nostalgia". Of course, should you suddenly find yourself inspired to rush off to the attic to grab the family albums, first click on the link below to visit the "How To" page of MyHeritage.com, which further explains the process: How to use "Deep Nostalgia" to animate family photos. However, a gentle word of advice: do be prepared for the possibility of stirring up some emotional memories. Deggers
Edited May 9, 2021 by Deggers
Crawfie Posted May 8, 2021
This is as bad as the dead rock stars being turned into holographic images and sent on tour. The past is the past and is best left well alone.
Peter Cobbold Posted May 9, 2021
Amazing technology, but with potential for abuse in the political arena. Not so much reanimating dead politicians as putting words into the mouths of the living. e.g. Sturgeon: "Long live the Union".
Deggers Posted May 9, 2021 (edited)
9 hours ago, Peter Cobbold said: Amazing technology. But with potential for abuse in the political arena.
The MyHeritage technology, though impressive, is limited to subtly animating a single still photograph. However, significantly more powerful technology is already available, capable of realistically manipulating video. "Deepfakes", as they are known, digitally replace one person's face with the likeness of another, and the technique is already causing some very real concern, not least amongst politicians and law enforcement agencies. From Wikipedia: "Deepfakes are synthetic media in which a person in an existing image or video is replaced with someone else's likeness. While the act of faking content is not new, deepfakes leverage powerful techniques from machine learning and artificial intelligence to manipulate or generate visual and audio content with a high potential to deceive. Deepfakes have garnered widespread attention for their uses in fake news, hoaxes, and financial fraud. This has elicited responses from both industry and government to detect and limit their use." The technology is becoming increasingly powerful, and the results can be very convincing to those who don't know how to spot them. For example, here is a recent deepfake video featuring footage from the 007 movie Dr. No, in which Sean Connery's face has been digitally replaced by Burt Reynolds'. And it's not just video footage which can be faked. "Audio deepfakes have been used as part of social engineering scams, fooling people into thinking they are receiving instructions from a trusted individual. In 2019, a U.K.-based energy firm's CEO was scammed over the phone when he was ordered to transfer €220,000 into a Hungarian bank account by an individual who used audio deepfake technology to impersonate the voice of the firm's parent company's chief executive."
This clip of Boris Johnson, although admittedly somewhat amusing, was created entirely using deepfake technology. Deggers
Edited May 9, 2021 by Deggers
Bleednipple Posted May 9, 2021
This has been inevitable for some time, and we're now seeing the first incidents of it actually being used. In future, no video content can be considered trustworthy in itself; we'll have to rely on our own assessment of its provenance and the trustworthiness of whoever passed it to us. In my view that will strengthen the role of traditional/mainstream media brands, acting as our 'agents' and vouching for the veracity of what we read and view - contrary to today's presumption that "all content is equal" whether it comes from mainstream or social media. However, some people will fail to understand that and will be even more wide open to being duped and manipulated via social media. Nigel
Crawfie Posted May 9, 2021
1 hour ago, Bleednipple said: This has been inevitable for some time and we're now seeing the first incidents of it actually being used. In future no video content can be considered trustworthy of itself ...
So ... what you're saying is that we will never get rid of the Kardashians??? God help us!!
Peter Cobbold Posted May 9, 2021
So it's not really Einstein in his bath. Phew ... I was nearly duped into a smart meter.
stillp Posted May 9, 2021
I thought colourising old b/w images was bad enough... Pete