The time has come for us to make passwords for identifying each other…
I’m in the US and have a well-off friend who had his Facebook hacked. The bad actors sent messages to his friends asking to borrow $500 until tomorrow because his bank accounts were locked and he needed the cash. One of the people they messaged posted a screenshot of a deepfaked video call he received, which is what caused him to fall for it. Wild times we live in!
I know someone who fell for a similar scam but it involved purchasing gift cards.
I routinely get emails from the owner of the company I work for asking me to kindly purchase several large gift cards and forward them and the receipt to him for prompt reimbursement.
asking me to kindly purchase several large gift cards
kindly give me your money, thanks
Right now deepfakes don’t work well when the face is viewed from extreme angles, so you can ask them to slowly turn their face to the side or up/down as far as they can, until the face is no longer visible. It also doesn’t work well when something obstructs the face, so ask them to put their hand in front of their face. It also can’t seem to render the mouth right if you open it too wide, or stick out your tongue.
I base this on a deepfake app I tried: https://github.com/s0md3v/roop . But as the tech improves, it might be able to handle those cases in the future.
Edit: there’s a chance the scammer uses a live deepfake app like this one: https://github.com/iperov/DeepFaceLive . It also supports using the Insight model, which only needs a single well-lit photo to impersonate someone.
Right now deepfakes don’t work well when the face is viewed from extreme angles, so you can ask them to slowly turn their face to the side or up/down as far as they can, until the face is no longer visible.
or, you know, you can just pick up the phone and call them.
I got one of these a few months ago. I could tell it was fake before I even answered, but I was curious so I pointed my own camera at a blank wall and answered. It was creepy to see my friend’s face (albeit one that was obviously fake if you knew what to look for) when I answered.
How do these scammers know who our friends are? Also, how are they able to get pictures or video of said friend to create the fake?
In my case, the friend’s Facebook account was compromised, so they were able to get his pictures and call me from his account.
Here’s hoping secure communication protocols get popularised. It’s gonna become a must at some point.
WhatsApp video calls are end-to-end encrypted. A secure protocol means nothing in this context.
But key exchanges work.
Signal, for example, will warn you when the person you are talking to is using a new device.
As long as the user heeds the warning, it is an effective stop, and at the very least gives the user pause.
If the Signal safety number changes but the communication stays on track, as in the context of the conversation is the same, it’s unlikely to be a problem. But if the safety number changes and the next message is asking for money, that is a very simple and easy-to-process situation.
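For anyone wondering what a safety number actually is: it’s essentially a fingerprint derived from both parties’ public identity keys, so if either side’s key changes (a new device, or someone else’s key entirely), the number changes and both people can spot the mismatch by comparing it out of band. A rough sketch of the idea in Python (not Signal’s actual algorithm; the keys here are just placeholders):

    import hashlib

    def safety_fingerprint(my_identity_key: bytes, their_identity_key: bytes) -> str:
        # Toy illustration only: derive a short, human-comparable number
        # from both parties' public identity keys. If either key changes
        # (new device, or an impostor's key), the number changes too.
        # Sort the keys so both sides compute the same value.
        material = b"".join(sorted([my_identity_key, their_identity_key]))
        digest = hashlib.sha256(material).digest()
        # Render the first 12 bytes as six groups of five digits.
        groups = [str(int.from_bytes(digest[i:i + 2], "big") % 100000).zfill(5)
                  for i in range(0, 12, 2)]
        return " ".join(groups)

    # Both users compute the number and compare it out of band (in person, by phone).
    alice_key = bytes.fromhex("aa" * 32)  # placeholder public keys
    bob_key = bytes.fromhex("bb" * 32)
    print(safety_fingerprint(alice_key, bob_key))                   # matches on both phones
    print(safety_fingerprint(alice_key, bytes.fromhex("cc" * 32)))  # differs -> warning

Signal’s real scheme is more involved, but the property that matters is the same: a changed identity key means a changed number.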
With deepfake technology being so advanced nowadays, how will we ever know if the person we are talking with on the internet is who they say they are?
Dude had too much money. Simple.