Part I: Algorithms That Rule Us
Prologue: The Code That Surrounds Us
We don’t hear it. We don’t see it. And yet it is everywhere.
It governs our mornings, suggests what to eat for breakfast, what shoes to buy, and even who to meet. It silently arranges the world so that we click more, watch longer, and buy faster.
We call it “algorithms,” but in reality, it is a system built from millions of decisions made by people — programmers, engineers, data analysts. Each of them created a fragment of the reality we now take for granted.
But what if this reality isn’t neutral? What if, instead of free choices, we follow paths someone else designed for us, guided by profit rather than our well-being?
In this book, we will explore the hidden world that seems to be about technology but is, in fact, becoming an invisible system of influence.
We will uncover the mechanisms behind the screen, the psychology exploited by platforms, and the people who believe it’s possible to stop — or at least understand — it.
Because if the 21st century has an invisible kingdom, its king is code.
Introduction: Why This Book Was Written
For thousands of years, people believed they shaped their own lives. Decisions were guided by tradition, religion, culture, and later by science and reason. But never before have we lived in a world where machines, algorithms, and artificial intelligence have so much influence over our daily choices.
Today, invisible systems decide what appears on our screens, which news reaches us, and even who we meet and fall in love with. Technology, once a mere tool, has become the architect of reality.
This book is not about demonizing technology. I use the internet, social media, and AI myself. I believe in progress and in the idea that technology, used wisely, can help humanity.
But I also believe that awareness brings freedom.
If we understand how algorithms work, we can regain control over our lives. We can decide which tools serve us and which steal our time and attention.
This is not just a book about technology. It is about us — about people trying to stay human in a world where code increasingly decides what matters.
Chapter 1: Invisible Mechanisms
A World Governed by Code
The alarm on your phone interrupts your sleep. You reach for your smartphone — in the very first second of the day, you step into a space where decisions that seem to be yours have already been planned by someone or something else.
The screen lights up with notifications: a friend’s new vacation photo, a calendar reminder, an ad for the shoes you browsed last night. All in the perfect order, designed so that your fingers can’t resist another scroll.
It feels like “just technology.” In reality, it’s a world governed by code — millions of lines of algorithms analyzing your time, interests, and emotions.
The first glance at the screen often shapes our entire day. Bad news? You’ll probably feel anxious. Funny memes? A momentary mood boost. But none of it is random — social media algorithms know exactly which emotions keep you glued to the screen the longest.
For example, Facebook has long analyzed which posts trigger the strongest emotional reactions — and those are prioritized in your feed. Around 2018, reporting on the platform suggested that content provoking negative emotions like anger or outrage could increase user engagement by as much as 30–40%. Translation: the more irritated you are, the more likely you are to see such content.
Let’s look at a typical day:
— You open YouTube — recommendations appear instantly. One click on a healthy eating video triggers an avalanche: more videos about diets, workouts, and soon… ads for supplements.
— Spotify offers a new playlist “matched to your mood.”
— Google suggests search results before you even finish typing.
Before you even think about what you truly want to see, algorithms have already decided for you.
The most unsettling part? This web of invisible mechanisms works silently, without your conscious consent. No one asks permission to analyze your behavior: how many seconds you looked at a photo, which posts you scrolled past, which ones made you stop.
Tech companies collect hundreds of such signals to build psychological profiles of users. They know your shopping habits, activity cycles, even the times of day when you’re most vulnerable to ads.
It’s like an invisible architect arranging your room’s furniture so skillfully that you believe you made all the decisions yourself.
Understanding this mechanism is the first step toward regaining control. Because if you don’t know how the system works, you become its product, not its user.
Our world increasingly resembles a chessboard where moves are made not by people, but by algorithms. And we — often unknowingly — play a game whose rules we don’t even understand.
What Is an Algorithm?
The word sounds complicated, as if it belonged only to mathematicians or programmers. But algorithms are everywhere — not just in computers.
A recipe for a cake? That’s an algorithm too: step by step, ingredient by ingredient, a clear set of instructions taking you from raw products to a finished dessert.
The difference? Modern computer algorithms operate on a scale beyond human imagination. Millions of data points, tables, charts, user behaviors — all processed in milliseconds to deliver that one ad, that one video, that one post most likely to engage you.
Early algorithms we learned in school were simple:
— add two numbers,
— sort data alphabetically,
— calculate a square root.
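To make the contrast concrete, here is what those classroom algorithms look like written out in Python. The square-root version uses Newton's method, one common textbook approach; each is just a fixed recipe of steps, nothing more.

```python
# Three classroom algorithms: fixed, explicit instructions.
def add(a, b):
    return a + b

def sort_alphabetically(words):
    return sorted(words)

def square_root(x, tolerance=1e-10):
    # Newton's method: refine a guess until it is close enough.
    guess = x / 2 if x > 1 else x
    while abs(guess * guess - x) > tolerance:
        guess = (guess + x / guess) / 2
    return guess

print(add(2, 3))                               # 5
print(sort_alphabetically(["pear", "apple"]))  # ['apple', 'pear']
print(round(square_root(2), 4))                # 1.4142
```

Every step here was written by a human and does exactly what it says, every time. That predictability is precisely what modern learning systems leave behind.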
Today, algorithms drive self-driving cars, decide who gets a loan, even whether your résumé passes initial screening in a large company.
These are no longer just recipes. They are decision-making systems.
The biggest breakthrough came when algorithms began to learn. Instead of fixed instructions, programmers gave them the ability to analyze data and improve themselves.
This gave birth to machine learning and artificial intelligence — algorithms that not only follow commands but also discover patterns humans would never notice.
Example? In 2012, a Google research system learned to recognize cats after analyzing millions of YouTube thumbnails — with no human labels at all.
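The shift from fixed instructions to learning can be shown with a toy example. Instead of a programmer writing a rule for which articles a user likes, the program infers one from the user's past ratings. All the data below is invented for illustration; real systems use vastly more signals and far subtler statistics.

```python
# Toy "learning": build a profile from rated articles, then predict
# whether a new article will be liked, by word overlap.
def learn_profile(rated_articles):
    # Collect words from liked and disliked articles into two sets.
    liked, disliked = set(), set()
    for words, was_liked in rated_articles:
        (liked if was_liked else disliked).update(words)
    return liked, disliked

def predict(profile, words):
    liked, disliked = profile
    # Positive score: more overlap with liked words than disliked ones.
    score = len(words & liked) - len(words & disliked)
    return score > 0

history = [
    ({"travel", "beach", "food"}, True),
    ({"travel", "hiking"}, True),
    ({"politics", "debate"}, False),
]
profile = learn_profile(history)
print(predict(profile, {"travel", "food"}))    # True
print(predict(profile, {"politics", "vote"}))  # False
```

Notice that no one wrote the rule "this user likes travel" — it emerged from the data. Scale this up by millions of signals and you have the core idea behind modern recommendation engines.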
Once, decisions about the world were made by people: politicians, journalists, teachers, experts. Today, more and more of those decisions are filtered through algorithms.
If a news recommendation system decides you won’t see certain stories, for you… they simply don’t exist.
Mathematical formulas are shaping social reality. And we don’t even notice.
An algorithm isn’t just lines of code. It’s the invisible hand deciding what you see, what you don’t, what you buy, whom you meet.
And until we understand how this hand works, we’ll remain observers, not participants, in the world of technology.
From Data to Decisions
Imagine leaving behind digital breadcrumbs every single day — clicks, likes, Google searches, time spent on a video, even the moment you scroll faster because you’re bored.
Each breadcrumb is data. And algorithms love data.
From it, they build a portrait of you — your interests, moods, opinions, sometimes even fears. Then, based on this data, they decide: what to show you, what to suggest, what to hide.
The more data they collect, the better they categorize you:
— Love travel videos? You’ll see more of them.
— Click on healthy lifestyle articles? Expect more tips.
— Watch one conspiracy theory video? Tomorrow, another awaits.
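The self-reinforcing loop behind those three bullet points can be sketched in a few lines. This toy feed ranker simply promotes topics you have clicked before; the post titles and click counts are invented for illustration, but the feedback loop is the seed of a real filter bubble.

```python
# Toy feed ranker: the more you clicked a topic, the higher it ranks.
def rank_feed(posts, click_counts):
    # Score each post by how often its topic was clicked in the past.
    return sorted(posts,
                  key=lambda p: click_counts.get(p["topic"], 0),
                  reverse=True)

posts = [
    {"title": "Budget travel tips", "topic": "travel"},
    {"title": "Election analysis", "topic": "politics"},
    {"title": "New phone review", "topic": "tech"},
]
clicks = {"travel": 5, "tech": 1}  # past behavior

for post in rank_feed(posts, clicks):
    print(post["title"])
# Travel rises to the top; politics, never clicked, sinks to the bottom.
```

Run this loop every day and the gap only widens: what you clicked yesterday decides what you can even see today.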
This creates a filter bubble — a world where you see only content matching your past choices. Rarely anything challenges your beliefs.
The effect? Each of us lives in a slightly different world, tailored to our clicks.
Filter bubbles also create echo chambers: places where you hear mostly opinions similar to your own. If you hold a political view, the algorithm will show you people who think the same way — reinforcing the illusion that “everyone” agrees with you.
In reality, “everyone” often means only your bubble.
Sometimes we don’t even notice decisions being made for us. Netflix autoplays the next episode. YouTube queues the next video. Online stores time promotions for when you’re tired and impulsive.
These are background decisions. You feel in control, but the path was designed step by step.
Once, we searched for information. Now information finds us. Which means someone — or something — decides what we see and what disappears into digital darkness.
That’s why understanding how data turns into decisions matters. Otherwise, we might stop noticing the line between our choices and machine-prompted ones.
Why Don’t We See Algorithms?
Once, every technology was visible.
Steam engines hissed, printing presses roared, early computers filled entire rooms and blinked with lights. You could touch them, see them, feel their power.
Algorithms are different. They operate in silence.
They have no shape, no sound, no face. They live in code, in servers thousands of miles away. We don’t see them — because they weren’t built to be seen.
Most companies won’t reveal how their algorithms work:
— Google keeps search ranking formulas secret.
— Facebook doesn’t disclose all feed rules.
— TikTok was silent for years about how its “For You” page works.
Why? Trade secrets — and instruments of power. The less we know, the easier we are to control.
Another reason: even programmers don’t always understand their algorithms.
Machine learning systems often create rules no human explicitly wrote.
It’s a black box: input data, get an output. What happens inside? Too complex to trace step by step.
It’s like flying in a plane where the autopilot taught itself to fly… but no one can explain exactly how.
Algorithms are like magicians: most effective when invisible.
— They don’t ask if you want to see that ad. They just show it.
— They don’t ask before arranging your feed. They do it automatically.
— They don’t tell you what they hide. You simply don’t see it.
And since we don’t see the mechanism, it’s easy to forget that someone — or something — made decisions on our behalf.
The truth is, we often don’t mind. Convenience seduces us. We like Netflix recommending shows, Spotify curating playlists, Google finishing our sentences.
But convenience costs: less transparency, less control over what reaches us.
Invisible by design, algorithms work quietly, in the background, in code we never see. The less we notice them, the more power they have.
Power Without a Face
When we think of power, we picture kings, presidents, generals. Power always had a center: a palace, a parliament, a locked office.
In the 21st century, the greatest power often has no face.
It’s the power of algorithms.
They don’t win elections, issue decrees, or wear uniforms. Yet they decide what we see, what moves us, what we believe, sometimes even… how we vote.
Yes, humans build them. Engineers write code, set parameters, design platforms. True.
The problem: these systems are so complex and large-scale that no one has full control.
— Google engineers don’t fully understand the entire search ecosystem.
— TikTok creators can’t predict all effects of their recommendation algorithm.
— AI researchers warn: learning systems may develop unintended strategies to reach their goals.
It’s like building a giant machine that evolves on its own — not always as planned.
When a president issues an order, we know who’s responsible. We can judge, protest, replace.
But when Facebook’s algorithm accidentally promotes disinformation? Or a bank’s AI unfairly denies a loan?
No face means no accountability.
Companies say, “It’s just the algorithm,” as if it were a force of nature beyond human control.
This power needs no armies or police. It doesn’t threaten or persuade. It simply delivers the right content at the right moment:
— Want people to buy? Show ads when they’re tired and impulsive.
— Want them to believe an idea? Feed it in hundreds of tiny posts until it feels obvious.
That’s how advertising works. That’s how propaganda works. That’s how modern tech works.
Not a Hollywood takeover. More subtle.
Control doesn’t need a Matrix-style rebellion. It just needs millions glued to screens, thinking they decide what to watch.
Maybe decisions happen earlier — in the silence of server rooms, in invisible code.
Algorithmic power has no face, no center, no official orders. It acts through data, rankings, recommendations.
Precisely because it’s invisible, it’s so powerful.
We’ve seen:
— algorithms decide what appears on our screens,
— our clicks fuel recommendation engines,
— invisible systems build filter bubbles and echo chambers,
— algorithmic power grows without full human oversight.
It has no face, no center — dispersed, complex, often opaque even to its creators.