AI videos are tricking tourists into visiting places that don’t exist. That’s just the beginning

I can perfectly imagine the pain, confusion, and betrayal in the voice of the elderly Malaysian woman who, according to a hotel staff member, asked “Why do they do this to people?” when she found out that her dream holiday destination wasn’t real but a video fabricated with Veo3, the generative AI video model made by Google. She and her husband had just driven three hours from Kuala Lumpur to a town in Perak state, convinced they would find a scenic cable car attraction called the Kuak Skyride. Instead of a gondola gliding over paradise, they found nothing but a quiet town and a hotel worker trying to explain that the glamorous TV journalist they’d watched on TikTok—the one who had ridden the tram through lush forests and interviewed happy tourists—had never existed at all.

There was a Veo3 logo in the bottom right corner. How on Earth did they miss it? Oh well, it’s something to tell the grandkids while feeling really dumb. No criminals lifting $200,000 from their savings account, no false accusations to sink grandpa’s reputation, as others have experienced thanks to AI-made videos. No real harm done.

Except it is harmful. It’s another brick out of the walls of our reality in a world that’s been crumbling in this post-truth era. AI has made the impossible indistinguishable from the actual, and now it’s turning even vacation planning into a minefield of false experiences. The alleged Malaysian couple’s story might sound like an isolated incident, but it’s the expression of something far more sinister—the complete erosion of our ability to trust what we see, hear, and experience in a world where artificial intelligence can manufacture any narrative with increasingly terrifying precision.

The AI black hole is growing exponentially

The numbers tell the story of our collective descent into digital deception. Deepfake attacks have exploded from just 0.1% of all fraud attempts three years ago to 6.5% today—a staggering 2,137% increase that represents one in every 15 fraud cases, as identity services company Signicat detailed in February 2025.

The statistics have real victims behind them, like Steve Beauchamp, an 82-year-old retiree who drained his entire $690,000 retirement fund after watching deepfake videos of Elon Musk promoting investment schemes. “I mean, the picture of him—it was him,” Beauchamp told The New York Times, his life savings vanished into the digital void.

The scope of AI-powered deception now touches every aspect of human experience. The British engineering company Arup lost more than $25 million when an employee was tricked during a video conference call featuring deepfake versions of the company’s CFO and other staff members. A school principal in Maryland received death threats after an AI-manipulated audio clip appeared to show him making racist and antisemitic remarks—a fabrication created by his own athletics director to discredit him. Even democracy itself isn’t safe: AI-generated robocalls impersonating President Joe Biden encouraged Democrats not to vote in the New Hampshire primary. The list goes on and on.

And now this couple.

AI tourism

The deception began with a video published on TikTok by “TV Rakyat,” a television channel that sounds official but exists only in the realm of artificial intelligence. The footage showed a reporter experiencing the Kuak Skyride, a cable car attraction supposedly located in the town of Kuak Hulu in Perak state. She rode the tram through beautiful forests and mountains, interviewing satisfied customers about their journeys. Everything looked perfect, professional, and real.

On June 30, the couple checked into their hotel in Perak state and approached someone on the staff—who goes by @dyaaaaaaa._ on Threads—to ask about the scenic cable car they’d seen online. The worker claims that she initially thought they were joking because there was no cable car, no attraction, nothing to see around. But the couple insisted, showing the detailed video they’d watched featuring the TV host and her interviews with happy tourists.

When the staff member explained that what they’d seen was an AI-generated video, the couple refused to believe it. They had driven three hours based on footage that felt completely authentic, complete with a professional news presentation and satisfied customer testimonials. According to the hotel employee, the elderly woman threatened to sue the journalist in the video before learning that she, too, was nothing more than a pixel figment of an AI’s imagination.

Things were bad enough already

Tourism was already drowning in manufactured reality before AI perfected the art of deception. Social media has transformed travel into “selfie tourism,” where visitors flock to destinations not for cultural immersion but to capture Instagram-worthy shots for their feeds. UNESCO has sounded a three-alarm fire on this phenomenon, warning that travelers now visit destinations “primarily to take and share photos of themselves, often with iconic landmarks in the background.”

The consequences are devastating. In Hallstatt, Austria—a town that inspired Disney’s Frozen—over a million tourists descend annually to re-create viral moments, forcing the frustrated mayor to erect fences and tell the press that “the town’s residents just want to be left alone.” Venice gondolas capsize when tourists refuse to stop photographing. Portofino, Italy, now fines visitors $300 for lingering too long at popular selfie spots to prevent what Mayor Matteo Viacava calls “anarchic chaos.”

That was all the product of influencers already distorting reality with carefully cropped shots of empty beaches and architectural marvels, editing out the crushing crowds and environmental destruction that mass tourism brings. These curated fantasies created unrealistic expectations about travel destinations, leading to overcrowding, infrastructure strain, and the degradation of local communities. And don’t get me started on AI-generated travel influencers. Yes, fake humans peddling AI-generated travel advice on video is now a thing that has turned into an industry (and while many people hate them, many others totally buy the scam). Even governments have embraced them: the German National Tourist Board launched an online marketing campaign in 2024 that featured artificial personalities to promote travel to the country.

It’s a depressing prospect. The Malaysian couple’s experience is just the newest chapter in our journey from reality to manipulated reality to completely fabricated reality. I tell myself that we can only face it with pervasive education campaigns, but I’m afraid that it will always be too little too late.
