Deepfake Detectors Are Not As Accurate As You Think

Deepfakes have created something of a crisis because they can make it difficult for people to ascertain whether a video or any other piece of media they are looking at is genuine. Deepfakes have been fooling viewers for quite some time now, which has prompted experts at major tech companies such as Microsoft and Facebook to try to build some kind of “deepfake detector”.

This has led a lot of people to assume that deepfakes will not become all that big of a problem in the future, but recent research indicates that these deepfake detectors might not be as reliable as people think. Researchers at UC San Diego looked into tools that are used to detect deepfakes and discovered that they can be fooled by something as simple as inserting inputs known as “adversarial examples” into every frame of a video, which tricks the detector into reporting that the video is genuine.
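The basic idea behind an adversarial example can be sketched in a few lines. The snippet below is a toy illustration, not the researchers’ actual method: it stands in a tiny linear classifier for a real detector and applies a small FGSM-style perturbation (a gradient-sign nudge) to each frame so the “fake” score drops. All names here (`toy_detector`, `adversarial_frame`, the weights) are hypothetical.

```python
import numpy as np

def toy_detector(frame, w, b=0.0):
    """Stand-in linear 'deepfake detector': sigmoid score, >0.5 means 'fake'.
    Purely illustrative; real detectors are deep neural networks."""
    z = float(frame.ravel() @ w) + b
    return 1.0 / (1.0 + np.exp(-z))

def adversarial_frame(frame, w, eps=0.03):
    """FGSM-style perturbation: nudge each pixel a tiny amount against the
    gradient of the 'fake' score so the detector reads the frame as 'real'.
    For a linear model the input gradient is proportional to w."""
    grad = w.reshape(frame.shape)
    adv = frame - eps * np.sign(grad)   # step opposite the gradient
    return np.clip(adv, 0.0, 1.0)       # keep pixels in valid range

rng = np.random.default_rng(0)
w = rng.normal(size=16)                               # 4x4 grayscale "frames"
video = [rng.uniform(size=(4, 4)) for _ in range(3)]  # three toy frames

for f in video:
    before = toy_detector(f, w)
    after = toy_detector(adversarial_frame(f, w), w)
    print(f"fake score: {before:.2f} -> {after:.2f}")
```

Because the per-pixel change is tiny (here at most 0.03 on a 0–1 scale), the perturbed frames look essentially identical to a human viewer even though the classifier’s score shifts, which is what makes attacks of this kind so hard to spot.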

This shows that a lot more research needs to be done in this area. Deepfakes are among the most dangerous things out there because they can make it seem like a famous public figure said something they never actually did. In a day and age where things can get really bad, really fast, people need some reliable way to figure out whether the media they are watching reflects anything that actually happened.

Research will continue, but until it matures you should be wary of any videos you see online, since they might be deepfakes.

Read next: Researchers have shown that even debiasing cannot remove racism from hate speeches