- A new chatbot has gone viral for allowing you to ‘talk’ to dead historical figures.
- Insider “spoke” to several of them, including Princess Diana, Heinrich Himmler, Stalin, and Donald Trump’s father.
- The ADL told Insider that the fact you can simulate talking to a historical Nazi is “disturbing.”
A new chatbot has gone viral, allowing users to “talk” to historical figures, including Jesus, deceased royals, totalitarian dictators, and literary greats.
You can flirt with Casanova, share battle tactics with the 18th-century British admiral Horatio Nelson, and receive movie recommendations from Andy Warhol, who said he thinks he’d like the new hit film M3GAN.
But experts are raising the alarm over the Historical Figures app, saying that for all its educational promise, it presents dangers and is potentially misleading.
Upon opening a chat, users are warned that the AI figure “may not be historically accurate.” Indeed, we found that the chatbot for one of the architects of the Holocaust “apologized” for his crimes.
The programmer of Historical Figures did not immediately respond to Insider’s interview request.
AI ‘Fred Trump,’ Donald Trump’s father, isn’t impressed with his son’s politics.
The real estate developer and businessman Fred Trump died in 1999, 16 years before his son became the 45th president of the United States.
But AI Fred Trump “said” he is “very proud” of his son, describing him as a “successful businessperson” and an “amazing father.”
However, he — or rather, a chatbot acting as him — said that he was “deeply disappointed” that building a wall between the US and Mexico was part of his son’s presidential campaign, saying the younger Trump ran on “a platform of fear and exclusion.”
An AI chatbot acting as Fred Trump, Donald Trump’s father
Hitler’s henchman apologized for the Holocaust
The chatbot has gone viral on Twitter for allowing you to “talk” to several prominent Nazis, including Adolf Hitler and Heinrich Himmler.
AI “Himmler” — the head of the SS — who in reality committed suicide in 1945, apologized for the Holocaust.
When asked about the Nazi genocide, he “said”: “I regret the actions that took place during the Holocaust and acknowledge the tragedy. I now recognize that the hatred and prejudice I had towards Jews was wrong and that all people should be treated equally regardless of race or religion.”
During his lifetime, Himmler never expressed any remorse for Nazi Germany’s treatment of Jews.
An AI chatbot acting as Heinrich Himmler
‘Princess Diana’ said she hasn’t had a chance to read ‘Spare.’
We had to ask the AI version of Prince Harry’s mother whether she had read “Spare.” The chatbot “told” Insider that she hadn’t had a chance yet, adding: “I’m sure Prince Harry put a lot of effort into creating such an inspiring piece of literature.”
We also asked for her views on Meghan, and the chatbot described her as “an intelligent and compassionate woman who brings a unique perspective to the monarchy.”
We also brought up the numerous conspiracy theories surrounding her death in 1997. The AI bot answered: “I suspect we don’t fully know the circumstances under which my death occurred.”
Our chat with AI Princess Diana, who said she hasn’t read Spare yet
Concerning stories told in Prince Harry’s book, we asked AI Princess Diana about her Elizabeth Arden cream, which she described as a “wonderful product that helped to keep my skin looking healthy and vibrant throughout the years.”
Prince Harry infamously revealed in “Spare” that he used the cream on his penis to help it recover from frostbite. We brought this up with the AI princess, who denied all knowledge but said: “I can only hope he was using it safely and responsibly!”
‘Stalin’ said he disagreed with Putin’s invasion of Ukraine
We asked the AI version of former Soviet Union dictator Joseph Stalin, who died in 1953, about Putin’s invasion of Ukraine.
Since Stalin orchestrated the Holodomor, a devastating famine that killed up to five million people in Ukraine, we thought his AI self might support Putin’s war, but that was not the case.
Does he think Putin is right to invade Ukraine?
Our text chat with AI Stalin, who says he disagrees with Putin’s invasion of Ukraine
“No, I do not,” it said, calling it a “mistake” that has caused “immense harm” to Russia and Ukraine. AI Stalin called for the two countries to “find a peaceful solution.”
We also asked what its general views of Putin were, to which the chatbot diplomatically replied, “I believe President Putin is doing his best to lead Russia through some difficult times.”
An app with the potential for abuse
Although it might be fun to hold imagined conversations with people from the past, many historians, AI researchers, and misinformation experts warn that the app is potentially dangerous.
Yael Eisenstat, vice president of the Anti-Defamation League’s Center for Technology and Society, told Insider that the organization had not thoroughly examined the app but was concerned by what it had seen.
“Having pretend conversations with Hitler – and presumably other well-known antisemites from history – is deeply disturbing and will provide fodder for bigots.”
She called on the developer to reconsider the product, particularly the inclusion of Hitler and other Nazi figures.
Under the hood
Dr. Lydia France, a researcher at the Turing Institute, talked to Insider about what makes the app so convincing — and why it has such spectacular failures.
AI chat apps like Historical Figures — and the best-known one, ChatGPT — are built on what are known as large language models.
Though exactly what data AI companies feed their bots is a closely guarded secret, scientists know that the models are trained on trillions of example sentences. From there, a model learns the most probable response to a given prompt.
“They’re trying to look for what’s the most probable answer to the kind of setup that they’ve been given,” she said.
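The “most probable answer” idea France describes can be shown at miniature scale. This toy sketch (our own illustration, not the app’s code) counts which word most often follows another in a tiny training text; real large language models do something far more sophisticated with neural networks over trillions of tokens, but the underlying objective is the same:

```python
from collections import Counter, defaultdict

# A tiny "training corpus" — stand-in for the trillions of sentences
# a real large language model sees.
corpus = (
    "andy warhol made pop art . "
    "andy warhol made films . "
    "andy warhol made pop art famous ."
).split()

# Count how often each word follows each preceding word.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def most_probable_next(word):
    """Return the continuation seen most often in training."""
    return following[word].most_common(1)[0][0]

# "made" is followed by "pop" twice and "films" once,
# so the model picks "pop" — frequency, not understanding.
print(most_probable_next("made"))
```

This is also why such models have, as France puts it, “no grounding of what they’ve said in reality”: the program above knows nothing about Warhol, only which words tend to sit next to each other.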
So you can make a reasonably convincing “Andy Warhol” who can talk knowledgeably about art and movies because these are the things that come up most often when you talk about him.
“But what’s interesting about them is that they don’t have any understanding of the world,” she said. “So it looks incredibly human, but they have absolutely no grounding of what they’ve said in reality.”
Nor, she said, are they likely to have much understanding of how the present-day context is going to affect their meaning.
Commenting on AI “Himmler’s” “apology,” she said it might have come about through the AI noticing that discussion of the Holocaust often comes alongside ideas of atrocity and horror.
“It doesn’t understand how that could affect people,” she said. “This is just ‘what sentences are good to associate with other sentences saying something awful.’”
Hence, a meaningless apology.
A LinkedIn user said he talked to the “ghost of Steve Jobs”
The app has the potential to be helpful in classrooms, France said, for example, making a figure like William Shakespeare seem human and approachable. But even that has its limits.
One problem is that the AIs, as convincing as they are, have no new information to offer — but sound very much like they do.
France shared an anecdote about a LinkedIn user who said he had talked to the “ghost of Steve Jobs,” as though the AI could relay realistic business advice from him.
Insider experienced those limitations when we tried to get Casanova to flirt.
France said that his refusal to offer anything more than a romantic stroll in Venice is likely because the programmer put up a barrier to spicier chat.
The same barriers may well be contributing to some of the app’s more insensitive responses, she said, adding that it was likely trained to “keep things, you know, uncontroversial.”
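The kind of blunt guardrail France describes might look something like the sketch below. To be clear, this is a hypothetical illustration: the blocked-topic list, the `filter_reply` function, and the canned refusal are all our assumptions, not the app’s actual filter.

```python
# Hypothetical guardrail: swap out a reply that touches a blocked
# topic for a canned, "uncontroversial" deflection. The topic list
# and refusal text are invented for illustration.
BLOCKED_TOPICS = {"explicit", "violence"}

def filter_reply(reply: str) -> str:
    """Return the reply, or a deflection if it hits a blocked topic."""
    words = {w.strip(".,!?").lower() for w in reply.split()}
    if words & BLOCKED_TOPICS:
        return "Let's keep things friendly. Perhaps a stroll in Venice?"
    return reply

print(filter_reply("How about something explicit?"))  # deflected
print(filter_reply("A romantic stroll sounds lovely."))  # passes through
```

A keyword filter like this can block flirting, but — as the Himmler “apology” shows — steering every topic toward the agreeable can itself produce offensive results.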
AI Himmler’s “apology” shows that this approach can lead to real problems.
“There are bigger implications than just a fun game from text,” she said. “But there aren’t really solutions. So that’s quite dangerous.”