“Research”
Angela Collier just uploaded a video about how embarrassing the original article is. Frankly, the situation is ridiculous and speaks to the brain rot that sets in as some people become more reliant on these chatbots, regardless of how intelligent-seeming some individuals may start off.
He lost his chat logs. I was going to mock him for not backing up his work, but I’m not sure I want to punch down on someone who considers his LLM chat logs research.
Wait wait wait. ALL of his research was in his chat logs in ChatGPT‽ So NONE of his research was outside of ChatGPT.
I’m not sure it’s meaningful to call that research
ChatGPT did the world a favour it seems
It might have been research on ChatGPT itself, like glitch token archaeology
Nah, that’s fucking stupid on his part and he should 100% know better.
Absolutely. Angela Collier did a video on this where she kind of roasted the guy. Imagine openly stating that you haven’t actually worked for the past two years, by saying “hey guys just a word of warning, if you click the delete button it’ll delete stuff.”
Brain is mush.
I’m a department chair. I would certainly interpret the situation that way or worse if it was brought to me.
I think that scientists embracing AI as naively as this guy contribute to a broader discrediting of science. We’ll probably have antivax people in the future saying that “such and such vaccine was made of AI slop”.
Same for me but the reason is different
… I don’t want to punch down on someone for not using backups while I probably have a lot stored on sites, ready to be wiped accidentally
He basically deleted them
Lol
Lmao, even
In Germany we say “Kein Backup, kein Mitleid!” (“No backup, no pity!”).
This moron is even listed as a “professor” at Cologne University. There goes my respect for the German educational system.
I wonder if he could face consequences for being so sloppy. I mean, he is a Beamter, a Prof, and he has to lead by example. Here is the science policy of the university. He is responsible.
Good point. The most idiotic thing about this is that he thought publishing this giant clusterfuck was a good idea.
And with science papers being found in journals with AI content in them, I’m not surprised.
As someone who knows a decent bit about machine learning, I can say Dr. Collier seems to know very little of how it works on a technical level, which presents such a beautiful contrast when I’m watching her be fucking dead-on about its societal consequences. Seeing someone who isn’t familiar with the field but still knows what they’re talking about is so cathartic in a subject strewn with misconceptions from laypeople. Angela’s integrity is really admirable; you can tell she takes a lot of care not to overstep into areas she doesn’t understand.
Yeah, I agree. Sometimes I see people who are criticising AI in terms of its societal consequences, and whilst they’re decently on the mark with that, they say things that are just straight up wrong on the technical side. It makes me wince, because I worry that incorrect info may end up serving pro-AI discourse instead.
Angela Collier is good at avoiding falling into this trap, and she does so by not pretending she knows more than what she does.
I see your point, and it’s a valid one. It’s just so exhausting that you need to be an expert in the thing you hate, the thing that’s obviously destroying society, in order to talk about its effects on society. It’s like that with gun nuts: if you don’t know the specific differences between an AR-15 and an M16A1, you’re not allowed to have an opinion on how easy it is for a child to get one and the societal problems stemming from it.
Did you write this comment yourself?
I always do. I write as a hobby, and I’m not stuck here, so I don’t see why I’d be here if I didn’t give enough of a shit to write what I want.
At some point, we need to stop caring or stop interacting entirely. It’s so exhausting to have to question the authenticity of every interaction.
we need to stop caring
We should never stop caring.

—the person whose comment was questioned
Not really his research when using an LLM to compile the content, now is it…
If he only used it as a tool, I’d still say it’s his work. It should of course be included in the methodology section.
But considering he lost 2 years worth of work, just because his chats got wiped? That’s more than just a tool
In 2 years, mr scientist never made a backup?
Fake af, can’t be real.
The buffoon.
“Oops all the research was set on fire” is a tale as old as grad students.
In the 2000s it was trusting everything in a multi-year study to a single thumb drive. In the 70s it was carrying the only copy of everything in a single briefcase. Presumably at some point someone was devastated that the rain came early and wiped out all the work on the sand table.
Well we are talking about the guy who pressed the delete everything button and then was surprised that everything was deleted.
Yes, but all that was deleted was the ChatGPT chats. Is that all the research produced in 2 years? Probably nothing of value was lost.
Probably not. It seems misleading to characterize what he lost as his research, and a blatant clickbaity lie to claim it was all of it.
Bucher admitted he’d “lost” two years’ worth of “carefully structured academic work” — including grant applications, publication revisions, lectures, and exams
Is it the LLM’s research? Do guns shoot people?
Researching other people’s work is still research. If they start claiming other people’s work as their own without having added any value, then it’s obviously stealing, but otherwise this argument makes no sense.

When you type “dear robot girlfriend, do my work, make no mistakes” into a chatbot window, and then use the slop that the average word-predicting machine shat out, you’re not doing research. In fact, you’re not doing anything besides frying your brain while frying the planet.
It’s not “research” at all, but whatever that is, the LLM generated it; you did nothing.

So if I just quickly search Wikipedia or leaf through a few books, is that research? How many prompts or hours until my unnamed activity becomes research? Is it a hard limit, or just based on how hateful you’re feeling for the day?
If I go to a restaurant and order something, does that count as me cooking? How many times do I need to point at the menu and ask the waiter to bring me something until I officially count as a professional cook? If I ask them to make it less salty and add cheese, does that count as the restaurant employing me as a chef, or only as a line cook?
Just in case your chat “research” fried your brain completely and it needs to be spelled out: no, to be called a cook you need to cook the food. To do research you need to do the research, not ask a word prediction machine to do it. https://dictionary.cambridge.org/dictionary/english/research
You don’t need to be a chef to cook a meal.
I don’t need to be a scientist to do research.

Is that what you think is being said to you? Is that what you consider an appropriate response? Damn, that’s even worse than I thought.
My dear internet stranger, I use a hammer to hammer a nail. God bless
Yes, that’s quick research. Enough for a post on the internet. Far from enough for a science paper
Well that’s a nice big strawman you got there. Lotta assumptions
Despite what your clanker waifu told you, just saying the word “strawman” doesn’t actually constitute a proper argument.
You just made an assumption about how the research was done; that’s your strawman. Then you attacked me and assumed something completely untrue; that’s ad hominem. You’ll be hard-pressed to find someone who hates Artificial Idiots more than me, but I also hate bad-faith arguments that reduce any person to a handful of stereotypes or a caricature. Instead of throwing insults, try engaging in constructive dialogue.
ChatGPT’s chat history isn’t “carefully structured academic work”.
He basically used the 0/0/0 backup strategy, with not even one instance of his data saved anywhere.

Nor did he do any research at all. There was nothing useful to back up.
He basically used the 0/0/0 backup strategy
thanks, I love this way of phrasing it xD
Can you imagine coauthoring a paper with this guy? Or being one of his students?
Thinking back to some of my college professors…yes.
He disabled the feature because he “wanted to see whether I would still have access to all of the model’s functions if I did not provide OpenAI with my data.”
Well I guess you found out

He should write a paper about it.
“ChatGPT, write a paper about how you deleted all the previous papers you wrote for me”
He did, check the link (well, “paper”)
If it was really “Intelligence” then it would have saved a backup copy first. “AWTF?” is a more fitting name for it.
“Scientist”
“Ay GPT, research this shit for me.”
Carefully structured academic work?
AI slop has been given a facelift!