Legal liability for misrepresentation by an AI chatbot: Moffatt v Air Canada, 2024 BCCRT 149

Technology has evolved, and continues to evolve, at a rapid pace. From electronic communication to advanced AI chatbots designed to mimic human interaction, technology is now part and parcel of everyday life. A legal question arising from this increased use of technology is: what liability arises in law for misleading information offered by a chatbot?

A Canadian case, Moffatt v Air Canada, offers insights in answer to this question. The claimant, Mr Moffatt, booked an air ticket following his grandmother’s death. Air Canada offered reduced ‘bereavement fares’ where the death of a close relative had occurred. Mr Moffatt interacted with an AI chatbot on Air Canada’s website as he searched for and eventually booked a flight. The chatbot incorrectly responded to one of Mr Moffatt’s queries by informing him that he could apply for the discounted bereavement fare by submitting a form within 90 days of the ticket’s issuance. The words ‘Bereavement Fares’ in the chatbot’s response were hyperlinked to a page on Air Canada’s website setting out its policy on bereavement fares. That policy stated that the discount was not offered retroactively. Mr Moffatt did not click on the link, and was therefore unaware of the actual policy as Air Canada intended it.

Mr Moffatt applied for the discount within the 90-day window, relying on his interaction with the chatbot. Air Canada rejected his claim on the basis of its policy, because Mr Moffatt had booked the flight after his grandmother had passed away. Under the policy, the discount would only have been available if Mr Moffatt had booked the flight before his grandmother’s demise.

After some back and forth with Air Canada in which the dispute was not resolved, Mr Moffatt brought an action in the British Columbia Civil Resolution Tribunal. The Tribunal held that Air Canada was responsible for the misleading information provided by its chatbot, and awarded Mr Moffatt damages for negligent misrepresentation. For the purposes of such liability, the Tribunal treated the chatbot as equivalent to an ‘agent’ of the airline. It dismissed Air Canada’s argument that Mr Moffatt could have found the correct information by clicking on the hyperlink, finding that it was reasonable for Mr Moffatt to rely on the chatbot’s response.

The upshot of this decision is that those who deploy AI chatbots and similar technologies in their operations should be aware that any liability arising from such tools will fall on them. Beyond misrepresentation, such liability may well extend to copyright infringement, defamation, and breaches of privacy.
