Deepfake of principal's voice is the latest case of AI being used for harm

Author: Earthly Elements news portal | Source: opinions | 2024-04-30 17:32:30

The most recent criminal case involving artificial intelligence emerged last week from a Maryland high school, where police say a principal was framed as racist by a fake recording of his voice.

The case is yet another reason why everyone — not just politicians and celebrities — should be concerned about this increasingly powerful deepfake technology, experts say.

“Everybody is vulnerable to attack, and anyone can do the attacking,” said Hany Farid, a professor at the University of California, Berkeley, who focuses on digital forensics and misinformation.

Here’s what to know about some of the latest uses of AI to cause harm:

AI HAS BECOME VERY ACCESSIBLE

Manipulating recorded sounds and images isn’t new. But the ease with which someone can alter information is a recent phenomenon. So is its ability to spread quickly on social media.

The fake audio clip that impersonated the principal is an example of a subset of artificial intelligence known as generative AI. It can create hyper-realistic new images, videos and audio clips. It has become cheaper and easier to use in recent years, lowering the barrier for anyone with an internet connection.
