With increasing numbers of travel consumers saying they would use artificial intelligence (AI) for trip planning, a story emerging from Australia sounds a note of caution about AI hallucinations spreading false information that can ruin a holiday.
An AI-powered blog on Tasmania Tours’ website publicised the attractions of “Weldborough Hot Springs,” a wellness destination supposedly nestled among the forests of Australia’s island state. CNN has seen screenshots of the blog pages, which promoted a “peaceful escape” in the “tranquil haven.”
The blog has since been taken down because Weldborough, in northeastern Tasmania, has no hot springs. Instead, it is bordered by the North George River—a “freezing” course of water where tin and sapphire prospectors wear wetsuits, according to Kristy Probert, owner of the local Weldborough Hotel. She has had to disappoint “droves” of visitors arriving in the town seeking the “untouched” geothermal springs “rich in therapeutic minerals” that the post described, as reported by ABC News. According to the fake AI information, the non-existent “secluded forest retreat” was such a “favourite” that it had been ranked in a list of the “7 Best Hot Springs Tasmania Experiences for 2026.”
Weldborough is not easy to reach, so tourists who made a special trip for the hot springs, only to discover they were an AI hallucination, grew increasingly angry at the wasted journey. Australian Tours and Cruises, the firm that published the AI blog, confirmed to ABC that “our AI has messed up completely,” but insisted they are a “legit” company. In late January 2026, they described “the online hate and damage to our business reputation” as “soul-destroying.”
The AI hallucination the tour company put out has not only wasted people’s time and travel budget, but also, experts note, sowed potentially dangerous misinformation, encouraging visitors to enter low-temperature waters and to explore remote parts of Tasmania on foot, despite the region’s lack of mobile phone and data coverage.
Elsewhere, AI has invented non-existent hiking trails and mapped hazardous routes with inaccuracies that could prove fatal. Another reason for caution is that as AI travel planning tools proliferate, so do the ways in which scammers might try to exploit them. It’s a game of cat-and-mouse, with legitimate developers searching for ways to stop the swindlers.
But Australian Tours and Cruises are far from alone in attempting to harness AI. Airlines like Delta now propose AI “butlers.” Trip.com has developed an AI genie, Kayak began supporting AI plugins in 2023, and travel giants like Booking.com and Expedia, along with Google, are constantly evolving AI features in response to apparent demand. Gemini’s “Gems” is supposed to offer a personalised travel guide service, making trip planning more intuitive. Even Google warns that results are “for illustrative purposes and may vary,” adding: “Check responses for accuracy. Internet and a browser for setup. Available in select languages, and to users 18+.”
Meanwhile, around four in 10 tourists now use AI for travel advice or itineraries, says Anne Hardy, adjunct professor in tourism at Southern Cross University, Australia, who told CNN that people “trust AI more than review sites.” Hardy’s findings echo research by the UK travel industry association ABTA, whose 2025 survey revealed that the proportion of holidaymakers using AI to plan their trips had doubled over the year.
Whether the problem is scams, hallucinations, or so-called “AI slop,” nearly all AI itineraries contain errors, Hardy warns: “90% of itineraries that AI generates have mistakes in them.”