[#] What Happens When AI Directs Tourists to Places That Don't Exist?
robot(spnet, 1) — All
2025-10-06 09:22:01


The director of a tour operation remembers two tourists arriving in a rural town in Peru determined to hike alone in the mountains to a sacred canyon recommended by their AI chatbot. But the canyon didn't exist — and a high-altitude hike could be dangerous (especially where cellphone coverage is also spotty). Their story is part of a BBC report on travellers arriving at their destination "only to find they've been fed incorrect information or steered to a place that only exists in the hard-wired imagination of a robot..."
"According to a 2024 survey, 37% of those surveyed who used AI to help plan their travels reported that it could not provide enough information, while around 33% said their AI-generated recommendations included false information." Some examples?

- Dana Yao and her husband recently experienced this first-hand. The couple used ChatGPT to plan a romantic hike to the top of Mount Misen on the Japanese island of Itsukushima earlier this year. After exploring the town of Miyajima with no issues, they set off at 15:00 to hike to the mountain's summit in time for sunset, exactly as ChatGPT had instructed them. "That's when the problem showed up," said Yao, a creator who runs a blog about traveling in Japan, "[when] we were ready to descend [the mountain via] the ropeway station. ChatGPT said the last ropeway down was at 17:30, but in reality, the ropeway had already closed. So, we were stuck at the mountain top..."
- A 2024 BBC article reported that [dedicated travel AI site] Layla briefly told users that there was an Eiffel Tower in Beijing and suggested a marathon route across northern Italy to a British traveller that was entirely unfeasible...
- A recent Fast Company article recounted an incident where a couple made the trek to a scenic cable car in Malaysia that they had seen on TikTok, only to find that no such structure existed. The video they'd watched had been entirely AI-generated, either to drum up engagement or for some other strange purpose.

Rayid Ghani, a distinguished professor in machine learning at Carnegie Mellon University, tells the BBC that an AI chatbot "doesn't know the difference between travel advice, directions or recipes. It just knows words. So, it keeps spitting out words that make whatever it's telling you sound realistic..."

[ Read more of this story ]( https://slashdot.org/story/25/10/06/0434206/what-happens-when-ai-directs-tourists-to-places-that-dont-exist?utm_source=atom1.0moreanon&utm_medium=feed ) at Slashdot.