How would you feel if you let someone stay in your house for a few nights, only to return and find they had thrown a loud party?

Looking aghast at the damage to your property, you would be livid, upset, or some combination of the two.

Such scenarios have been widely reported around the world in recent years, especially during the coronavirus pandemic. With bars and nightclubs then closed, young adults, in particular, wanted to find somewhere else to hang out, dance and potentially drink too much.

It sparked a fightback from short-term rental giant Airbnb, which announced a “global party ban” and vowed to do all it could to prevent such behaviour. This included banning offenders from making new bookings, and placing restrictions on under-25s without a history of excellent reviews.

Airbnb said recently that as a result of its clampdown the number of reported parties dropped by 55% between 2020 and last year. But with the battle not yet won, the US firm has now upped the ante and introduced an artificial intelligence (AI) powered software system to help weed out potential troublemakers.

Now operating worldwide, the AI automatically checks each attempted Airbnb booking for signals such as how recently you created your account and – a big red flag – whether you are trying to rent a property in the same town or city where you live.

It also considers the duration of your stay – one night only is a potential concern – and whether the planned visit falls during a revelry-heavy period such as Halloween or New Year’s Eve.

Naba Banerjee, head of safety and trust at Airbnb, says the system has stopped thousands of house parties

“If someone is booking a room during New Year’s Eve for one night only, and they are from the same city as the host, that’s likely to be a party,” says Naba Banerjee, head of safety and trust at Airbnb.

Ms Banerjee adds that if the AI deems the risk of a party booking too high, it will block the booking, or instead guide the person to the website of one of Airbnb’s partner hotel companies. She says it is an issue of trust – making sure that people renting out their homes via Airbnb are as reassured as possible.
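The signals described above amount to a risk check over a handful of red flags, with high-risk bookings blocked or redirected. As a rough illustration only – the signal names, weights, and threshold below are invented for this sketch and are not Airbnb’s actual model – the logic might look like this:

```python
from datetime import date

# Revelry-heavy dates mentioned in the article (month, day)
REVELRY_DATES = {(10, 31), (12, 31)}  # Halloween, New Year's Eve

def party_risk_score(account_age_days: int,
                     guest_city: str,
                     listing_city: str,
                     nights: int,
                     check_in: date) -> int:
    """Tally simple red flags: new account, local booking,
    one-night stay, party-heavy date."""
    score = 0
    if account_age_days < 30:          # recently created account
        score += 1
    if guest_city == listing_city:     # booking in the guest's own city
        score += 2
    if nights == 1:                    # one night only
        score += 1
    if (check_in.month, check_in.day) in REVELRY_DATES:
        score += 2
    return score

def decide(score: int, threshold: int = 4) -> str:
    """High-risk bookings are blocked or redirected elsewhere."""
    return "block_or_redirect" if score >= threshold else "allow"

# A local guest with a new account booking one night on New Year's Eve
risk = party_risk_score(10, "Toronto", "Toronto", 1, date(2024, 12, 31))
print(decide(risk))  # prints "block_or_redirect"
```

In practice a learned model would weigh many more signals than this hand-tuned score, but the block-or-redirect decision at a threshold mirrors the behaviour Ms Banerjee describes.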

Lucy Paterson is one such person. She rents out the one-bedroom annex beside her home in Worcestershire, and has had more than 150 bookings since she first listed the apartment.

“Part of my planning to be an Airbnb host is that I have only a one-bedroom place, to minimise the potential for parties,” she says. “Of course it hasn’t always been perfect, but I’d say 99% of my guests have been fantastic.”

She adds that Airbnb’s new use of AI has given her “more reassurance”.

Going forward, Ms Banerjee says the AI will only improve: the more data it processes, the more it learns.

Lucy Paterson says she is reassured by Airbnb’s use of AI to weed out potential partygoers

In the car sharing sector, one of the biggest online marketplaces, Turo, also uses an AI system, to help protect people who let others hire their cars.

The software, a platform called DataRobot AI, can quickly detect a risk of theft. It also sets the prices for cars, determined by their size, power and speed, and the time of the day or week that a person wishes to begin their hire period.

Separately, Turo also uses AI to allow some users to talk to its app, telling it which car they want and when. The AI then replies, with text on the screen, offering a personalised list of recommended vehicles that match the criteria. This service is currently available to subscribers of popular consumer AI system ChatGPT-4, which is incorporated into Turo’s system.

“We want to make it easy to browse Turo, and that can help build trust between us and our customers,” says the firm’s chief data officer Albert Mangahas.

Using AI to filter out potential problem customers is a good idea, says Edward McFowland III, assistant professor of technology and operations management at Harvard Business School. “Having that layer of AI can help ease the friction on both sides, for both business and consumer.”

He points out, though, that even a perfectly calibrated AI model can create false positives, such as shutting out a young person who wants to rent an apartment for New Year’s Eve but has no intention of throwing a party. “And that’s why AI technology is still very hard to get right, all the time.”

Lara Bozabalian doesn’t accept first-time renters

In Toronto, Canada, Lara Bozabalian uses Airbnb to rent out her family’s 3,000 sq ft (280 sq m) cottage. She says that no matter how attractive the booking may be, or what the AI has determined, she follows her own rule.

“I don’t take on clients who are first-time users. I need to know they’ve been vetted by someone at some point.”

Source: BBC
