ChatGPT erased two years of a professor’s work with one click — and there was no way back
A professor lost two years of research after disabling the data consent option in ChatGPT's settings.
When a professor trusted ChatGPT to help manage his research, he didn’t expect the tool to erase two full years of work with a single click. But that’s exactly what happened — and the worst part is that OpenAI couldn’t restore any of it. This incident is a stark reminder that AI convenience comes with real‑world risks.
It's evident that most of us, if not everyone, have transitioned into the digital era. Gone are the days when we depended on physical files and flash drives as the conventional ways of saving important data and information.
People are now more inclined towards sophisticated methods like backing up their data in the cloud using platforms like Google Drive, Microsoft OneDrive, or even Apple's iCloud. But as some of us have come to learn the hard way, these methods aren't exactly foolproof.
Last year, a software engineer lost 10 years' worth of data saved on the Amazon Web Services (AWS) cloud, but later managed to recover the work after an AWS employee intervened on his behalf. Similarly, a OneDrive user was locked out of "30 years' worth of photos and work", receiving no help from Microsoft support.
In an article recently published in Nature, Marcel Bucher, Professor of Plant Sciences at the University of Cologne in Germany, revealed that ChatGPT cost him two years of work, including grant applications, teaching materials, and publication drafts.
The professor indicated that he had signed up for OpenAI's ChatGPT Plus plan and used the service "to write e-mails, draft course descriptions, structure grant applications, revise publications, prepare lectures, create exams and analyse student responses, and even as an interactive tool."
According to the professor:
"I was well aware that large language models such as those that power ChatGPT can produce seemingly confident but sometimes incorrect statements, so I never equated its reliability with factual accuracy, but instead relied on the continuity and apparent stability of the workspace."
However, the professor revealed that he wanted to see whether disabling the "data consent" option would still give him access to all the model's functions while keeping OpenAI from using his data.
"At that moment, all of my chats were permanently deleted and the project folders were emptied — two years of carefully structured academic work disappeared. No warning appeared. There was no undo option. Just a blank page. Fortunately, I had saved partial copies of some conversations and materials, but large parts of my work were lost forever."
Professor Marcel Bucher
The professor indicated that he initially thought it was a mistake, prompting him to confirm whether he had indeed lost his data across multiple browsers, devices, and networks. His efforts to clear the cache, reinstall the app, and even change the settings back and forth proved futile.
He contacted OpenAI for support, but the first response came from an AI agent. After several attempts, he managed to get through to a human employee. Unfortunately, his data could not be restored.
While speaking to Nature, OpenAI indicated that:
"Regarding additional warnings and recovery, we do provide a confirmation prompt before a user permanently deletes a chat. However, once deleted, content cannot be recovered via the user interface, application programming interfaces (APIs), or support, as this aligns with privacy best practices and legal requirements around user data.
For data protection, we always recommend users maintain personal backups for professional work."
To that end, the professor argued that basic safeguards should be in place to prevent such losses, including warnings about irreversible deletion and a recovery option:
"If a single click can irrevocably delete years of work, ChatGPT cannot, in my opinion and on the basis of my experience, be considered completely safe for professional use. As a paying subscriber (€20 per month, or US$23), I assumed basic protective measures would be in place, including a warning about irreversible deletion, a recovery option, albeit time-limited, and backups or redundancy."
This incident underscores the importance of always having a backup for your backup. Beyond storing data in the cloud, it’s wise to take it a step further by keeping a physical backup as well.
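For ChatGPT users specifically, one practical habit is to periodically archive whatever you have saved locally, whether that's copies of important conversations or the zip file ChatGPT's data export produces, so a deleted workspace isn't your only copy. The sketch below is a minimal, hypothetical example of that idea: it bundles a folder of saved material into a timestamped zip using only Python's standard library. The folder names are placeholders, not anything OpenAI provides.

```python
import shutil
import time
from pathlib import Path


def backup_workspace(src_dir: str, backup_dir: str) -> Path:
    """Archive src_dir into a timestamped zip under backup_dir and return the zip's path."""
    src = Path(src_dir)
    dest = Path(backup_dir)
    dest.mkdir(parents=True, exist_ok=True)  # create the backup folder if missing

    # Timestamp in the filename means each run keeps a separate snapshot
    stamp = time.strftime("%Y%m%d-%H%M%S")
    archive_base = dest / f"{src.name}-{stamp}"

    # shutil.make_archive appends ".zip" to the base name it is given
    return Path(shutil.make_archive(str(archive_base), "zip", root_dir=src))


if __name__ == "__main__":
    # Hypothetical folder names for illustration only
    zip_path = backup_workspace("chatgpt-exports", "backups")
    print(f"Saved snapshot to {zip_path}")
```

Run on a schedule (cron, Task Scheduler), this gives you the kind of redundancy the professor assumed a paid subscription already included.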
Losing two years of work in seconds is the kind of nightmare that sticks with you. As AI becomes more embedded in our workflows, incidents like this will force companies to rethink how they protect user data — or risk losing trust entirely.
After ChatGPT erased two years of academic work, where do you stand on trusting AI? Share your thoughts in the comments!

Kevin Okemwa is a seasoned tech journalist based in Nairobi, Kenya with lots of experience covering the latest trends and developments in the industry at Windows Central. With a passion for innovation and a keen eye for detail, he has written for leading publications such as OnMSFT, MakeUseOf, and Windows Report, providing insightful analysis and breaking news on everything revolving around the Microsoft ecosystem. While AFK and not busy following the ever-emerging trends in tech, you can find him exploring the world or listening to music.
