"Tech experts are starting to doubt that ChatGPT and A.I. 'hallucinations' will ever go away..."
@baldur@toot.cafe
Tech experts are starting to doubt that ChatGPT and A.I. hallucinations will ever go away: This isn't fixable
More accurately, AI researchers have always said that this isn't fixable, but y'all were too obsessed with listening to con artists to pay attention, but now the con is wearing thin. https://fortune.com/2023/08/01/can-ai-chatgpt-hallucinations-be-fixed-experts-doubt-altman-openai/
https://toot.cafe/@baldur/110820093838025969
from the fortune article:
Spend enough time with ChatGPT and other artificial intelligence chatbots and it doesn't take long for them to spout falsehoods.
Described as hallucination, confabulation or just plain making things up, it's now a problem for every business, organization and high school student trying to get a generative AI system to compose documents and get work done. Some are using it on tasks with the potential for high-stakes consequences, from psychotherapy to researching and writing legal briefs.
"I don't think that there's any model today that doesn't suffer from some hallucination," said Daniela Amodei, co-founder and president of Anthropic, maker of the chatbot Claude 2.
"They're really just sort of designed to predict the next word," Amodei said. "And so there will be some rate at which the model does that inaccurately."
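Amodei's point can be illustrated with a toy sketch (hypothetical probabilities, not any real model's): a language model samples the next word from a probability distribution over continuations, so whenever a false continuation carries any probability mass, it will be emitted at some rate, stated just as fluently as a true one.

```python
import random

# Toy "language model": a table of next-word distributions conditioned on
# the previous two words. The words and probabilities here are invented
# purely for illustration.
NEXT_WORD_PROBS = {
    ("the", "capital"): {"of": 0.9, "city": 0.1},
    # Note the wrong continuation still gets probability mass, so the
    # model will sometimes assert it, fluently and confidently.
    ("capital", "of"): {"france": 0.7, "atlantis": 0.3},
}

def sample_next(context, rng=random):
    """Sample one next word from the distribution for this context."""
    probs = NEXT_WORD_PROBS[context]
    words = list(probs)
    weights = [probs[w] for w in words]
    return rng.choices(words, weights=weights, k=1)[0]
```

Run `sample_next(("capital", "of"))` repeatedly and roughly 30% of the outputs are "atlantis": nothing in the sampling step distinguishes true statements from false ones, which is why some hallucination rate is inherent to the design.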
Jerry2144
(2,627 posts)I am so sorry for your loss and I hope you find peace in your heart and soul and that you find peace and peace in your life and that you will be able to get through this difficult time with your family and friends and family and friends in the future and I hope that you will be in my prayers and that you can be in my thoughts and prayers for you and your family and your family and your family I know
Yup. On drugs. Or maybe my phone is a Republican, like Caribou Barbie or Trumplethinskin? Sounds at home in front of any MAGA speaker's podium.
zipplewrath
(16,692 posts)He was struggling with what to do about AI and research papers.
His thoughts reminded me of when I was in school and full-function calculators were becoming cheaply available. There were discussions in both high school and college about their use on tests. Ultimately, teachers decided that using a calculator on a physics test was no big deal. The test was on physics, not mathematics.
I pointed out that in my day, prior to Google, I had to use a "Reader's Guide" to do the research for my papers. Was the class about how to use the Reader's Guide or how to write research papers? Should students not be allowed to use Google?
Right now my text editor is correcting or suggesting spelling corrections and grammar improvements. Is that wrong?
So a person uses AI to write out a paper on the impact of the emancipation proclamation. His real job, just like when I was using reader's guide, is to evaluate the information he is presenting to support his larger thesis. Are the references reputable? Heck, do they even exist? Were the quotes taken out of context? How close to "original source" are the references?
I suggested his job may get even easier since his primary job in grading papers would be to skim through the references, and check out several of them. If they are all just various authors parroting what they heard on a single Hannity show, he doesn't even have to read the paper.
It's a comparable problem to plagiarism. An older brother was a TA in various literature courses while in graduate school. After a few semesters, he started to recognize many of the common references, and even specific quotes and paragraphs. The real problem is that students tend to find common and widely used papers, books, or studies and subconsciously end up using them as an "outline" for their own papers. That can lead to them writing very common insights and conclusions as their own. The question really was whether that meant the work was plagiarized. The professor ultimately started warning students about the problem. He suggested they double-check their work against some of their own most often used references.
I'd do the same today with AI. You want it to write you an outline, fine. You'd better check those references, and not just to ensure that they exist, but that they say what is written and are well sourced as well.
yonder
(10,005 posts)Recently I was listening to a YouTube narrator and he/it kept on referring to the 911 attacks as nine hundred eleven.
Another narrator bot kept saying something like fitlibs instead of foot pounds (ft lbs) when talking about torque.
Other signs are weirdly accented or pronounced syllables on common words and continuously used odd phrasing for what was meant to be idiomatic speech.
Strange times.