Beware of voice cloning: The latest cybersecurity scam


Connected to social media network. — © Digital Journal / File

One of the cybersecurity issues that has arisen recently, as artificial intelligence systems have advanced, is voice cloning. The advancement of technology has led to the creation of digital doppelgängers that can mimic voices with terrifying precision.

The technology can be used in various scams, fooling even close friends and family members into believing they are interacting with you. Deepfakes have become more sophisticated. In 2019, a deepfake video of Facebook CEO Mark Zuckerberg circulated on Instagram, featuring Zuckerberg making statements he never actually made and highlighting the capability of AI to fabricate believable content.

The company Geonode has provided advice to help readers prevent their voices from being cloned. The advice is:

Limit publicly available recordings

The more recordings of your voice that are available, the easier it is for scammers to create a convincing deepfake. Be mindful of what you’re sharing online, especially on social media. Limit public posts that include your voice and consider changing privacy settings to restrict who can access your content.

Use voice modulation apps

Consider using voice modulation apps when you need to share audio online. These tools distort your voice in a way that maintains your natural inflections while making it harder for AI algorithms to build a precise model of your voice.

Secure personal data

Always secure your personal data. Don’t share sensitive information like your phone number, address, or bank details publicly or on unsecured platforms. Scammers can use this information in combination with a voice deepfake to lend credibility to their schemes.

Two-factor authentication

Use two-factor authentication wherever possible. It provides an additional layer of security, making it difficult for scammers to break into your accounts even if they have managed to mimic your voice.
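To illustrate why a second factor resists voice mimicry: many services use time-based one-time passwords (TOTP), where the code is derived from a shared secret and the current time, so a scammer who can only imitate your voice cannot produce it. Below is a minimal sketch of the standard RFC 6238 computation; the function name and secret are illustrative, not tied to any particular service.

```python
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32, period=30, digits=6, now=None):
    """Compute an RFC 6238 time-based one-time password (sketch)."""
    key = base64.b32decode(secret_b32, casefold=True)
    # Counter = number of whole time steps since the Unix epoch.
    counter = int((time.time() if now is None else now) // period)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation per RFC 4226: pick 4 bytes at an offset
    # given by the low nibble of the last digest byte.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Because the code changes every 30 seconds and depends on a secret never spoken aloud, a cloned voice alone gives an attacker nothing to replay.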

Education

Educate close friends and family members about the existence of voice deepfakes and how to recognize potential scams. They should be aware that unusual requests, especially those involving money, should be verified through other communication methods.

Safeword strategy

Agree on a safeword with your circle of friends and family, so that if in doubt you can ask for it and expose AI voice scammers.


About the Author: Chimdi Blaise