Air Canada claims its chatbot is liable, not AC!

<https://www.cbc.ca/news/canada/british-columbia/air-canada-chatbot-lawsuit-1.7116416> There's surely more to this story. A customer sued AC for the consequences of bad advice AC's chatbot gave. In an argument that appeared to flabbergast a small claims adjudicator in British Columbia, the airline attempted to distance itself from its own chatbot's bad advice by claiming the online tool was "a separate legal entity that is responsible for its own actions."

D. Hugh Redelmeier via talk wrote on 2024-02-16 08:25:
<https://www.cbc.ca/news/canada/british-columbia/air-canada-chatbot-lawsuit-1.7116416>
There's surely more to this story.
Standard operating procedures - "it wasn't us, guv, it was a contractor". I particularly like this part:
Air Canada argued Moffatt could have found the correct information about bereavement rates on another part of the airline's website.
But as Rivers pointed out, "it does not explain why the webpage titled 'Bereavement Travel' was inherently more trustworthy than its chatbot."
"There is no reason why Mr. Moffatt should know that one section of Air Canada's webpage is accurate, and another is not," Rivers wrote.
Also, CBC looked up similar cases on CanLII:
A survey of the Canadian Legal Information Institute — which maintains a database of Canadian legal decisions — shows a paucity of cases featuring bad advice from chatbots; Moffatt's appears to be the first.
We can bet everything we own that it won't be the last... rb

I think this is a great example of the caution needed when using AI chatbots. We are still in the early stages of adoption. With AI hallucinations being quite common, more "misunderstandings" are sure to occur.

On Fri, 16 Feb 2024 at 13:54, Ron / BCLUG via talk <talk@gtalug.org> wrote:
We can bet everything we own that it won't be the last... rb

Don Tai wrote on 2024-02-16 12:02:
With AI hallucinations being quite common
Yes, but let's not keep stating that as though AIs are the only source of bad or wrong information on the internet. And limiting it to the internet is itself reductive: humans produce mountains of wrong information and, worse, sometimes do so intentionally.

Ron / BCLUG via talk wrote on 2024-02-16 10:35:
I particularly like this part:
Coincidentally, it was just yesterday that I caught one of Canada's most talented comedians doing a four-minute skit on dealing with the frustration of *human* customer service at "Air Canaday": https://www.youtube.com/watch?v=4kq2SaRwXeo

If my customer support agents behaved like that, I'd consider replacing them with chatbots too. Although the better solution would be retraining (after an introduction to corporal punishment).

Very talented writing, acting, and editing on display there. Truly a Canadian who deserves wider recognition.

Yeah, I had them change my booking for a stop in Montreal with only an hour connection, and yes, I missed the connection... I had to stop watching the video; it was too close to a documentary! She is one funny woman.

On 2/16/24 18:18, Ron / BCLUG via talk wrote:
Coincidentally, it was just yesterday that I caught one of Canada's most talented comedians do a 4 minute skit on dealing with the frustration of *human* customer service at "Air Canaday":
--
Peter King  peter.king@utoronto.ca
Department of Philosophy
The University of Toronto
170 St. George Street #521
Toronto, ON M5R 2M8 CANADA
(416)-946-3170 ofc
http://individual.utoronto.ca/pking/
GPG keyID 0x7587EC42 (2B14 A355 46BC 2A16 D0BC 36F5 1FE6 D32A 7587 EC42)
gpg --keyserver pgp.mit.edu --recv-keys 7587EC42

B.C. lawyer reprimanded for citing fake cases invented by ChatGPT: Chong Ke ordered to pay costs for opposing counsel to discover precedent was AI 'hallucination'
https://www.cbc.ca/news/canada/british-columbia/lawyer-chatgpt-fake-preceden...

"The cases would have provided compelling precedent for a divorced dad to take his children to China — had they been real.

But instead of savouring courtroom victory, the Vancouver lawyer for a millionaire embroiled in an acrimonious split has been told to personally compensate her client's ex-wife's lawyers for the time it took them to learn the cases she hoped to cite were conjured up by ChatGPT.

In a decision released Monday <https://www.bccourts.ca/jdb-txt/sc/24/02/2024BCSC0285cor1.htm>, a B.C. Supreme Court judge reprimanded lawyer Chong Ke for including two AI "hallucinations" in an application filed last December. The cases never made it into Ke's arguments; they were withdrawn once she learned they were non-existent."

On Tue, 20 Feb 2024 at 10:19, Peter King via talk <talk@gtalug.org> wrote:
Yeah, I had them change my booking for a stop in Montreal with only an hour connection, and yes I missed the connection...