Thursday, April 18th, 2024

Breaking News

Deepfake Technology Can Put Words in ANYONE’S Mouth – Even YOURS (VIDEO)

Did you know that technology exists that can create images of people that are not real?

And, did you know that technology exists that can make those “people” talk?

We’ve written about all sorts of dystopian technology on this website, but this might be the creepiest yet. Known as “deepfakes” (a portmanteau of “deep learning” and “fake”), this technology uses artificial intelligence to synthesize human images. It combines and superimposes existing images and videos onto source images or videos using a machine learning technique known as a generative adversarial network (GAN).
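If the phrase “generative adversarial network” sounds abstract, the core idea is a contest between two neural networks: a generator that produces fakes and a discriminator that tries to tell fake from real, each improving against the other. Below is a minimal, illustrative sketch of that training loop in Python using the PyTorch library. It works on simple one-dimensional numbers rather than faces, and every network size, learning rate, and variable name here is an arbitrary choice made for illustration, not a description of any real deepfake system.

```python
# Toy generative adversarial network (GAN): a generator learns to imitate
# "real" data while a discriminator learns to tell real from generated.
# Illustrative 1-D example only, not a face-synthesis model.
import torch
import torch.nn as nn

def real_batch(n=64):
    # "Real" data: samples from a normal distribution the generator must imitate.
    return torch.randn(n, 1) * 1.5 + 4.0  # arbitrary target: mean 4, std 1.5

generator = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
discriminator = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()

for step in range(2000):
    # Train the discriminator: label real samples 1, generated samples 0.
    real = real_batch()
    fake = generator(torch.randn(64, 8)).detach()
    d_loss = (loss_fn(discriminator(real), torch.ones(64, 1)) +
              loss_fn(discriminator(fake), torch.zeros(64, 1)))
    d_opt.zero_grad()
    d_loss.backward()
    d_opt.step()

    # Train the generator: try to make the discriminator output 1 on fakes.
    fake = generator(torch.randn(64, 8))
    g_loss = loss_fn(discriminator(fake), torch.ones(64, 1))
    g_opt.zero_grad()
    g_loss.backward()
    g_opt.step()

# After training, generated samples should drift toward the "real" mean of ~4.0.
print(generator(torch.randn(1000, 8)).mean().item())
```

The same adversarial push-and-pull, scaled up to millions of images, is what makes GAN-generated faces so convincing.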

Deepfake technology is rapidly evolving and the likely consequences are troubling.

Deepfake technology has already been used to create fake news, malicious hoaxes, fake celebrity pornographic videos, and revenge porn. We know how quickly the internet can turn on a person over something as small as an unflattering camera angle. Imagine the chilling possibilities with this kind of technology.

In a February 2019 report called ThisPersonDoesNotExist.com Uses AI to Generate Endless Fake Faces, James Vincent raises chilling concerns over the technology:

As we’ve seen in discussions about deepfakes (which use GANs to paste people’s faces onto target videos, often in order to create non-consensual pornography), the ability to manipulate and generate realistic imagery at scale is going to have a huge effect on how modern societies think about evidence and trust. Such software could also be extremely useful for creating political propaganda and influence campaigns.

In other words, ThisPersonDoesNotExist.com is just the polite introduction to this new technology. The rude awakening comes later. (source)

It looks like “later” is here:

A new algorithm allows video editors to modify talking head videos as if they were editing text – copying, pasting, or adding and deleting words.

A team of researchers from Stanford University, Max Planck Institute for Informatics, Princeton University and Adobe Research created such an algorithm for editing talking-head videos – videos showing speakers from the shoulders up. (source)
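To make the “edit video like text” idea a little more concrete, here is a deliberately oversimplified Python sketch. It is not the researchers’ actual method (their system also synthesizes new mouth movements so that inserted words look natural); it only shows the first piece of the puzzle, that once each spoken word is time-stamped, editing the transcript maps directly onto cutting and reordering segments of the video. The example sentence, words, and timestamps are hypothetical.

```python
# Oversimplified sketch of "editing video like text": given word-level
# timestamps, an edit to the transcript maps to cuts in the video timeline.
# Hypothetical word-level alignment: (word, start_seconds, end_seconds)
alignment = [
    ("I", 0.0, 0.2), ("never", 0.2, 0.6), ("said", 0.6, 0.9),
    ("that", 0.9, 1.2), ("about", 1.2, 1.6), ("you", 1.6, 1.9),
]

def segments_for_edit(alignment, edited_words):
    """Return the (start, end) video segments to keep so the clip
    speaks the edited word sequence, using only existing footage."""
    segments = []
    for word in edited_words:
        # Reuse the first occurrence of the word in the original alignment.
        for w, start, end in alignment:
            if w == word:
                segments.append((start, end))
                break
    return segments

# Deleting the word "never" from the transcript:
edited = ["I", "said", "that", "about", "you"]
print(segments_for_edit(alignment, edited))
# -> [(0.0, 0.2), (0.6, 0.9), (0.9, 1.2), (1.2, 1.6), (1.6, 1.9)]
```

Delete one word from the transcript, and the clip now appears to say something its subject never said.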

The possible consequences of this technology are horrifying to ponder, but Melissa Dykes of Truthstream Media did just that in this chilling video.


It is getting more and more difficult to discern fakes from reality.

“Our reality is increasingly being manipulated and mediated by technology, and I think we as a human race are really starting to feel it,” Dykes writes:

We are coming to a point in our post-post-modern society where seeing and hearing will not be believing. The old mantra will be rendered utterly worthless. The more we consider the sophistication of these technologies, the more it becomes clear that we are just scratching the surface at attempting to understand the real meaning of the phrase “post-truth world”. It’s not a coincidence this was an official topic at last year’s ultra-secretive elite Bilderberg meeting, regularly attended by major Silicon Valley players including Google and Microsoft, either. (source)

What do you think?

What will deepfake technology be used for? Do you think it will eventually become impossible to identify deepfakes? What will the consequences be? Please share your thoughts in the comments.

Courtesy of The Organic Prepper

Dagny Taggart is the pseudonym of an experienced journalist who needs to maintain anonymity to keep her job in the public eye. Dagny is non-partisan and aims to expose the half-truths, misrepresentations, and blatant lies of the MSM.
