Air Canada tried to argue that it was not responsible for the false promises made by its chatbot, but the tribunal refused to allow it.

Could AI chatbots ease the burden on customer service representatives? Many companies seem to think so, but they had better pray their chatbots don't cause problems like Air Canada's. The airline was just forced to honor a refund policy that its chatbot apparently made up on the fly and to issue a partial refund to a customer.

The incident in question happened to Jake Moffatt. After his grandmother's death, he asked an Air Canada chatbot for help understanding the airline's bereavement travel policy. The chatbot told him that he could book a flight immediately and request a partial refund within 90 days.

Unfortunately, Air Canada's actual bereavement travel policy states that refunds cannot be claimed retroactively once a flight has been booked, so Moffatt's refund request was denied. Air Canada acknowledged that the bot had given incorrect information, offered a $200 flight voucher instead, and promised to update the chatbot so this would not happen again.

Unsatisfied, Moffatt filed a small claims case with the British Columbia Civil Resolution Tribunal.

Air Canada's main argument was that Moffatt should have known better because the chatbot's answer linked to the official bereavement travel policy page. It also argued that the chatbot should not have been trusted in the first place and that the airline could not be held liable because the bot was a "separate legal entity."

The tribunal was clearly unconvinced by this excuse, noting that Air Canada did not explain why it should not be liable for information provided by its agents, including the chatbot and human customer service representatives. It also ruled that Air Canada did not explain why customers should be responsible for fact-checking its own website, and that Moffatt had no reason to believe the chatbot was providing inaccurate information.

Air Canada was ordered to pay Moffatt a partial refund of C$650.88 ($428.29), plus additional damages covering interest and court fees.

It is rather perplexing that Air Canada chose to argue this in court. The idea that a company would deploy a chatbot in place of a human customer service representative and then refuse to accept responsibility when things go wrong is rather ridiculous, especially when such a small amount of money is at stake.

Adam Leon Smith, chair of F-TAG, the BCS's technical advisory group, and a leading AI safety expert, told Tom's Guide: "I'm surprised Air Canada even tried to fight this claim. In this context, the chatbot's advice is clearly on par with the advice given on their web page." He added that "AI deployers need to understand its limitations, and that they are ultimately responsible for the mistakes made by AI on their behalf."

Meanwhile, Ryan Carrier, CEO of the AI safety and certification body ForHumanity, similarly criticized Air Canada's response and the precedent it could have set. Carrier told Tom's Guide that ForHumanity "has consistently argued that all tools, including AI, algorithms, and autonomous systems, should always have a beneficial owner who is responsible and accountable." In other words, companies need to understand the risks of using chatbots and similar tools, and be willing to stand behind them if they go off the rails and make false promises.

Or they could stop trying to cut corners on customer service and hire extra human operators, who can be trained and directed not to make up company policies on the fly. That way, customers would know where they stand, and companies would not have to deal with angry complaints about false promises.

Additionally, human operators would actually be fired if they swore at customers or wrote poetry criticizing their employer during working hours.
