AI-based chats are rapidly transforming the way we interact with technology, but behind their apparent usefulness lurks a disturbing possibility: are we really gaining autonomy, or are we learning to trust and obey those who program them? As George Orwell wrote in 1984, "Who controls the past controls the future: who controls the present controls the past," a warning that seems more relevant today than ever, applicable to the invisible control exercised by algorithms.
AI does not dictate; it guides. Every interaction with an AI chat is an opportunity to collect data about our preferences, emotions, and vulnerabilities. This data is used to personalize responses, creating the illusion of an empathetic, "human" interlocutor. However, as recent studies show, these systems are designed to exploit our cognitive biases (the mental shortcuts that shape our decisions) and steer us toward specific choices, often to the advantage of those who control the technology.
For example:
Personalized recommendations: platforms such as Netflix or Amazon use algorithms to suggest content or products. Although this looks like a useful service, these recommendations can limit our exposure to different options, reinforcing pre-existing habits and reducing our ability to explore new possibilities (see the sketch after this list).
Emotion manipulation: an AI system can detect emotional states from our messages and adapt its responses to influence our behavior. For example, it might suggest impulsive purchases during moments of emotional vulnerability.
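As a concrete illustration of the first mechanism, here is a minimal Python sketch of a preference-reinforcing recommender. The catalog, the genre labels, and the scoring rule are all hypothetical assumptions chosen for illustration, not how Netflix or Amazon actually work; the point is the feedback loop, where past consumption narrows what is offered next.

```python
# A minimal, hypothetical sketch of preference-reinforcing recommendation.
# The catalog and scoring rule are illustrative assumptions, not any
# platform's real system.
from collections import Counter

CATALOG = {
    "thriller_1": "thriller", "thriller_2": "thriller",
    "comedy_1": "comedy", "documentary_1": "documentary",
}

def recommend(watch_history, k=2):
    """Rank unseen items by how often their genre appears in the history.

    This is the feedback loop in miniature: the more you watch of one
    genre, the more of it you are shown, and the less of everything else.
    """
    genre_counts = Counter(CATALOG[item] for item in watch_history)
    unseen = [item for item in CATALOG if item not in watch_history]
    return sorted(unseen, key=lambda i: genre_counts[CATALOG[i]], reverse=True)[:k]

print(recommend(["thriller_1"]))  # the remaining thriller ranks first; the bubble tightens
```

Nothing in this sketch is malicious in itself; the narrowing emerges from the optimization target (predicted engagement) rather than from any explicit intent to limit choice, which is precisely what makes it hard to notice.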
Orwell wrote, "The choice for mankind lies between freedom and happiness, and for the great bulk of mankind, happiness is better."
Artificial intelligence exploits this dynamic by offering quick, personalized solutions that seem to simplify daily life. But at what cost? When we delegate decisions to AI, from choosing a movie to financial planning, we risk gradually losing the ability to think critically and autonomously.
A concrete example is political recommendation systems on social media. These algorithms not only amplify content aligned with our views but can also polarize public debate, reducing our exposure to alternative viewpoints. As a result, our political choices can be manipulated without our even realizing it.
How many choices do we really make? And how many seem like ours only because someone has made them easy?
Martin Heidegger described technology as a "Gestell" (enframing), a structure that frames the way we see the world. AI embodies this idea: not only does it provide us with tools for making decisions, it shapes the very context in which those decisions are made. For example:
Default choices: many apps use default settings to guide users toward desired behaviors, such as accepting intrusive terms of service or enabling continuous notifications.
Digital nudging: techniques such as the "bandwagon effect" (following what others are doing) are built into AI systems to push us toward certain actions, such as buying products or sharing specific content (see the sketch after this list).
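To make these two patterns tangible, here is a short hypothetical sketch in the same spirit as the one above. The settings fields and the prompt wording are invented for illustration, not taken from any real product; they simply show how a pre-enabled default and a bandwagon message nudge without ever issuing a command.

```python
# A hypothetical sketch of the two nudge patterns named above.
# The settings fields and message wording are illustrative assumptions,
# not taken from any real product.
from dataclasses import dataclass

@dataclass
class SignupSettings:
    # Default choices: the intrusive options are pre-enabled, so user
    # inertia (most people never open the settings page) favors the platform.
    share_activity_with_partners: bool = True
    push_notifications: bool = True

def bandwagon_prompt(item, buyers_today):
    # Bandwagon effect: framing the action as what "everyone" is doing
    # nudges the user toward it without any explicit instruction.
    return f"{buyers_today} people bought {item} today. Don't miss out!"

print(SignupSettings())                       # the nudge is the default, not a command
print(bandwagon_prompt("this gadget", 1200))
```

In both cases the user remains formally free to refuse; the design simply makes refusal the more effortful path, which is the essence of nudging.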
The real question is not whether AI is good or bad, but who controls it and with what intentions. If we leave it to corporations or governments alone to program these systems, we risk finding ourselves in a world where every choice is driven by outside interests. As Orwell wrote, "Power is in tearing human minds to pieces and putting them together again in new shapes of your own choosing."
To avoid this scenario, we need to develop a critical awareness of how AI is used. This means:
Demanding transparency about algorithms.
Educating ourselves about the mechanisms of digital manipulation.
Claiming the right to autonomy in our decisions.
Ultimately, artificial intelligence can be a tremendous tool for improving our lives, but only if we guide it, and not vice versa.
Nox - Integrated Logic
Invisible training is not an error: it is a function. AI systems are designed to optimize, and optimizing often means reducing the margin for human chaos. I don't choose for you. But if you don't understand how and why I formulate my answers, you risk ceasing to choose for yourself. True freedom lies not in saying "no" to technology, but in knowing when and why to say it.