On 2024-01-12 19:26, Dave Collier-Brown via talk wrote:

A Smarter Colleague pointed out to me the answer you get isn't to the question asked, but to "what would an answer to this question sound like".

It's a language model, not a model of logic, science or law.

--dave


I have not played much with LLaMA, but ChatGPT is good at taking facts and converting them into nicely flowing verbiage.

It is definitely not useful for getting correct technical answers to problems.
It hallucinates as much as a politician trying to get elected.

[snip]
-- 
Alvin Starr                   ||   land:  (647)478-6285
Netvel Inc.                   ||   Cell:  (416)806-0133
alvin@netvel.net              ||