The esoteric market (divination, horoscopes, and prophecies) is worth hundreds of billions of dollars worldwide, and it is no less popular in Poland. Every day, hundreds of thousands of Poles turn to divination and prophecies, and websites analyzing dreams (dream dictionaries) are visited by tens of thousands of people daily. One could dismiss this as a harmless form of entertainment, were it not for the fact that many people make significant decisions about their future based on divination or prophecies.
A very popular attraction in early 20th-century amusement parks and traveling circuses was the ZOLTAR machine: a wizard in a turban who, for a few cents and to the accompaniment of Persian rhythms, would print a ticket bearing a life prophecy. As we know, such prophecies can always be tailored to fit reality, so everyone can find something for themselves. It is much like the fortune cookies at a Chinese restaurant or the horoscopes on gazeta.pl. Many studies have been conducted to explain why people not only believe such prophecies but also remember them for a long time. The mechanical Zoltar was credited with reasoning abilities and empathy toward the customer, all while operating 24/7/365.
A few months ago, the Apple team published a paper titled "GSM-Symbolic: Understanding the Limitations of Mathematical Reasoning in Large Language Models," which argues that LLMs do not possess reasoning abilities, only statistical mechanisms for estimating probabilities. One would think everyone should be aware of this by now, but that is not the case. LLMs are the Zoltar of the 21st century. A few days ago, with the release of the Sora model, the internet was flooded with generated videos whose characters, despite looking very realistic, broke all the laws of physics: skiers going uphill, gymnasts with three pairs of legs performing mid-air somersaults as if gravity did not apply to them. Laughter, mockery, and widespread criticism followed. Crowds of LLM enthusiasts, who had treated these models as a prelude to the downfall of humanity and the coming of an omnipresent artificial superintelligence, received a cold shower. In the Gospel of John (20:24–29), Doubting Thomas had to see in order to believe. In our AI world, "Blessed are those who have not seen and yet have believed" has become grounds for mockery, as many people were persuaded that their professional and personal futures depended on a quick, unreflective plunge into the world of Zoltar.
Artificial intelligence (machine learning models) has powerful applications in industry, transport, and science; this is indisputable. Wherever real-time (or near real-time) decision-making gives a competitive edge, or where initiating the correct, optimal action is valuable in itself, AI is thriving, and these areas are developing at an incredible pace. The workings of such AI are not visible, but its effects are, and they bring real benefits. Zoltar-like applications, by contrast, are not only extremely costly (OpenAI fears it will not be able to offer users access to its latest models for lack of computational power) but also produce results that, outside of entertainment and the ability to trigger a rush of endorphins, bring no business benefit. In other words, they are useless. It is fun to generate an image (especially if I have no talent and can suddenly make something I have always dreamed of) or to draft a marketing strategy, but once you weigh the costs against the benefits, it cannot be justified.
And here is a reflection: Zoltar was never meant to predict the future. It was a tool for making money off emotionally vulnerable people and off those who treated such divinations as entertainment. The machine paid for itself in a month, which is why it was a hit in its time. LLMs do not pay for themselves nearly so quickly, so they are gradually becoming a burden on their environment. Undoubtedly, they have raised awareness of advanced analytical technologies and sparked the imagination.