
source: https://www.theatlantic.com/technology/archive/2022/12/openai-chatgpt-chatbot-messages/672411/
Ta-da! A five-hundred-word essay on the theme of the hero in contemporary literature. Or how about the history of the Parthenon? Or a letter to a friend one hasn’t seen in a long time? Ten seconds later and you have an advanced piece of writing, thanks to OpenAI’s ChatGPT.
It’s scary. And so much fun! Type in “Write a blog post on ChatGPT in Sweden” and you’ll get a fine answer, like “ChatGPT, a chatbot developed by OpenAI, has been gaining popularity in Sweden as a useful tool for business and organizations. It uses a variant of the GPT (Generative Pre-training Transformer) language model to generate human-like responses to text input in a conversation context.”
Boring, but accurate. After reading a dozen or more ChatGPT texts, it’s easy to tire of the dry, lifeless language it uses. When it comes to facts and details, ChatGPT can also be flat-out wrong. But it’s early days.
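For what it’s worth, the same kind of request can also be made programmatically rather than through the chat window. A minimal sketch, assuming OpenAI’s official Python client is installed and an API key is set in the OPENAI_API_KEY environment variable; the model name here is only an illustration:

```python
# Minimal sketch of sending the same prompt through OpenAI's API, assuming
# the openai Python client (pip install openai) and an OPENAI_API_KEY
# environment variable. The model name is illustrative.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # any chat-capable model works here
    messages=[{"role": "user", "content": "Write a blog post on ChatGPT in Sweden"}],
)

print(response.choices[0].message.content)
```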
We speak Flashback
It’s also in English, so far. ChatGPT has been trained on the internet (lots of Wikipedia), archived books, and even real human conversations it’s been party to. All in English. A Swedish chatbot, GPT-SW3, is of course in the works. However, as Per Gudmundson (SvD) wrote the other day, its development is hindered by the fact that the National Library of Sweden (Kungliga biblioteket) won’t yet allow chat developers access to its vast digital content. Instead, the developers have had to turn to, among other things, posts on Flashback and Swedish Reddit for examples of human speech and interaction. With content like that, it’s going to be great.
The issue is sensitive. Sweden’s research institute, RISE, says on its website that GPT-SW3 is absolutely not training on Flashback. So who knows? The National Library took issue with Gudmundson as well. In SvD’s online edition the next day, the library protested that it is indeed helping Sweden develop its AI capabilities, by allowing its texts to be read and understood by computer programs. Chat development, though, gets a hard no. The library cannot foresee the consequences of allowing its database to be used for that purpose, wrote the chief librarian.
For chat-interaction development, two capabilities are apparently crucial: Natural Language Understanding (NLU) and Natural Language Processing (NLP). Together, they are meant to make it possible to sort through, tag, and find relevant information in huge amounts of Swedish text. Then, in the next step, a chatbot built on them can respond with relevant answers and information, even when the questions it gets are poorly posed or grammatically imperfect.
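To make that tagging step concrete, here is a minimal sketch using spaCy’s off-the-shelf Swedish pipeline; the example sentence and the choice of library are mine, not GPT-SW3’s actual stack:

```python
# Minimal NLU/NLP sketch: part-of-speech tagging and named-entity extraction
# for Swedish text, assuming spaCy and its Swedish pipeline are installed:
#   pip install spacy
#   python -m spacy download sv_core_news_sm
import spacy

nlp = spacy.load("sv_core_news_sm")  # tokenizer, tagger, parser, NER for Swedish

text = "Kungliga biblioteket i Stockholm vill inte dela sina texter med chattutvecklare."
doc = nlp(text)

# Tagging: one row per token, with part of speech and lemma
for token in doc:
    print(f"{token.text:<18} {token.pos_:<6} {token.lemma_}")

# Named entities: the kind of "relevant information" a search or chat layer could index
for ent in doc.ents:
    print(ent.text, ent.label_)
```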
“Thank God I’m retiring.”
Until GPT-SW3 is well sourced and developed, it’s in English that the rubber meets the road. A teacher’s “Thank God I’m retiring soon” is echoed everywhere. Articles entitled “The End of High School English” and the like have popped up like pimples on a teenager. In the New York Times Opinion section, columnist Frank Bruni wonders if his career is over. Somewhat alarmingly, he doesn’t actually answer that question. Instead, he takes it to a more philosophical level: if we are what we do, and we outsource what we do, what is left is aimlessness, purposelessness, even pointlessness. This is surely not OpenAI’s intention, but we all know where intentions can lead.
OpenAI was cofounded by Elon Musk and Sam Altman, among a few others. If you’re not a Musk fan, you might find purpose in not using ChatGPT, but it could be pointless. Ta-da!

