Just another example of why companies are tripping over themselves to force AI into everything without doing the real work of actually stopping this shit. It seems like it would need very direct rules in the code to just defer to a human tech when it doesn't "know" the answer. Just like how human level 1 customer support will say the issue needs escalation to a higher-level person to get correctly resolved. Training all these bots to focus on sounding correct above everything else will eventually cause much worse problems as greed and hype rule.