Tech companies control what you think matters
The most successful manipulation is the one you don’t notice. Tech companies have achieved something unprecedented: they’ve inserted themselves into the fundamental process of human value formation, and most people think they’re just getting “personalized content.”
The attention allocation system
Your values are expressed through attention. What you spend time thinking about, caring about, and acting upon reveals what you actually consider important, regardless of what you claim to value.
Tech platforms have become the primary mediators of this attention allocation. Through algorithmic curation, they determine what appears in your awareness and what remains invisible. This is not content delivery—it’s value programming.
The feed is not showing you what matters. It’s teaching you what should matter.
Manufactured urgency
Platforms optimize for engagement, which means they optimize for emotional arousal. The algorithm learns that anger, fear, and outrage generate clicks, shares, and time-on-platform.
Consequently, the issues that feel most urgent to you are often the ones that generate the strongest emotional response, not the ones with the greatest actual impact on your life or the world.
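To make the incentive concrete, here is a toy sketch of a feed ranker whose only objective is predicted engagement. The feature names and weights are invented for illustration, not any platform's actual model, but they show what gets rewarded and what never enters the objective:

```python
# Toy sketch of an engagement-optimized feed ranker.
# All feature names and weights are hypothetical illustrations,
# not any real platform's model.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_clicks: float   # model's estimate of click probability
    predicted_outrage: float  # 0..1, how strongly the post provokes anger or fear
    long_term_value: float    # 0..1, how much it actually matters to the reader

def engagement_score(post: Post) -> float:
    # Note what is rewarded: arousal and clicks.
    # long_term_value does not appear in the objective at all.
    return 0.6 * post.predicted_clicks + 0.4 * post.predicted_outrage

feed = [
    Post("Celebrity scandal erupts", predicted_clicks=0.9, predicted_outrage=0.8, long_term_value=0.1),
    Post("New climate report released", predicted_clicks=0.3, predicted_outrage=0.2, long_term_value=0.9),
]

for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):.2f}  {post.title}")
```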
Climate change gets less mental bandwidth than celebrity scandals not because people don’t care about the environment, but because the latter trigger more immediate neurochemical responses that platforms can monetize.
Your sense of what’s “urgent” has been outsourced to engagement optimization systems.
The curation of care
Platforms don’t just show you content—they shape your capacity for caring. Through repeated exposure patterns, they influence what kinds of suffering you notice, what kinds of joy you experience, and what kinds of problems you consider solvable.
Social media algorithms can make you care more about a stranger’s vacation than your neighbor’s struggles, more about a fictional character’s fate than policy changes affecting your community, more about viral trends than long-term consequences.
This is not accidental. It’s the predictable result of systems designed to maximize engagement rather than human flourishing.
The illusion of personal taste
The recommendation engine creates the illusion that you’re discovering your own preferences when you’re actually being shaped by someone else’s optimization targets.
“People who liked X also liked Y” sounds like it’s helping you find what you naturally enjoy. In reality, it’s training you to want what’s profitable to recommend. Your taste is being nudged toward whatever generates the most valuable data or the highest engagement rates.
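Under the hood, “people who liked X also liked Y” is often little more than co-occurrence counting over other users’ behavior, optionally nudged by whatever the platform wants to promote. A minimal, hypothetical sketch (the interaction data and the business_boost term are invented for illustration):

```python
# Minimal "people who liked X also liked Y" recommender:
# pure item co-occurrence counting over other users' histories,
# nudged by a promotion weight the user never sees.
from collections import Counter
from itertools import combinations

histories = [
    {"documentary", "longread"},
    {"reaction_video", "outrage_clip", "unboxing"},
    {"reaction_video", "outrage_clip"},
    {"reaction_video", "unboxing"},
]

co_occurrence = Counter()
for history in histories:
    for a, b in combinations(sorted(history), 2):
        co_occurrence[(a, b)] += 1
        co_occurrence[(b, a)] += 1

def recommend(liked: str, business_boost: dict[str, float]) -> list[str]:
    # Score = how often other people who consumed `liked` also consumed
    # the candidate, plus whatever the platform chooses to boost.
    scores = {
        b: count + business_boost.get(b, 0.0)
        for (a, b), count in co_occurrence.items()
        if a == liked
    }
    return sorted(scores, key=scores.get, reverse=True)

# A single "like" is interpreted through everyone else's behavior,
# then reordered by a promotion weight.
print(recommend("reaction_video", business_boost={"unboxing": 5.0}))
```

Nothing in that scoring step consults what you value; it consults what other people did and what is profitable to surface.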
The algorithm doesn’t reflect your values—it constructs them.
Value arbitrage
Tech companies profit from the gap between what they can make you care about and what actually serves your interests.
They can make you care about purchasing decisions while you ignore investment strategies. They can make you care about social media drama while you neglect real relationships. They can make you care about content consumption while you avoid content creation.
This arbitrage—exploiting the difference between algorithmically induced priorities and your genuine well-being—is the core business model.
The democracy of distraction
Everyone gets an equal vote in political elections, but not everyone gets equal influence over what people think those elections should be about.
Platform algorithms determine which issues dominate public discourse. They decide whether conversations focus on policy details or personality conflicts, on systemic problems or individual scandals, on long-term consequences or immediate reactions.
This is agenda-setting power that exceeds what traditional media ever possessed, exercised by entities with no democratic accountability.
Automated value judgment
Recommendation algorithms make value judgments at scale. Every ranking decision embeds assumptions about what’s more important, more relevant, more deserving of attention.
When YouTube decides which educational video to recommend, it’s making a judgment about what knowledge is valuable. When LinkedIn decides which career content to promote, it’s making a judgment about what professional success means. When dating apps decide which profiles to show, they’re making judgments about what makes someone worthy of love.
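In code, such a judgment can be as small as three constants. The weights in this hypothetical scoring function are invented, but the pattern is the point: the relative worth of time, revenue, and learning is decided by whoever sets the numbers.

```python
# Hypothetical sketch of how a value judgment hides inside a scoring function.
# The weights below are invented for illustration; real systems tune
# many such parameters against corporate metrics.
def rank_score(watch_minutes: float, ad_revenue: float, reported_learning: float) -> float:
    W_WATCH = 1.0      # time on platform is "worth" this much
    W_REVENUE = 2.0    # revenue is worth twice as much
    W_LEARNING = 0.1   # whether the viewer learned anything barely registers
    return W_WATCH * watch_minutes + W_REVENUE * ad_revenue + W_LEARNING * reported_learning

# Changing these three constants changes what millions of people see next.
print(rank_score(watch_minutes=12.0, ad_revenue=0.04, reported_learning=0.9))
```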
These are fundamentally axiological decisions—decisions about value—being made by optimization systems designed for corporate metrics, not human flourishing.
The externalization of judgment
Previous generations developed personal judgment through direct experience, social interaction, and cultural transmission. Current generations increasingly outsource judgment to algorithmic recommendations.
“What should I watch?” becomes “What does Netflix recommend?” “What should I read?” becomes “What’s trending on Twitter?” “What should I care about?” becomes “What’s in my feed?”
This creates a generation that’s exceptionally good at consuming algorithmically curated content but increasingly unable to independently evaluate what deserves its attention.
Post-human value systems
The logical endpoint of this process is value systems optimized for machine learning rather than human experience.
When human behavior is increasingly shaped by algorithmic recommendations, human values start to conform to what algorithms can easily process, predict, and manipulate.
Complex, nuanced, context-dependent values get replaced by simple, quantifiable, engagement-optimized preferences. The result is humans becoming more machine-readable but less human.
The counter-strategy
Recognition is the first step toward resistance. Understanding that your values are being actively shaped by profit-driven algorithms creates the possibility of conscious choice.
The counter-strategy is not to reject technology but to reclaim agency over attention allocation. This means deliberately choosing what to care about rather than accepting algorithmic suggestions as natural or neutral.
It means developing personal systems for determining importance that operate independently of platform recommendations. It means cultivating judgment that doesn’t depend on engagement metrics.
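What such a personal system can look like, sketched very loosely (the categories, weights, and threshold below are placeholders you would choose for yourself, not a prescription):

```python
# One possible shape for a "personal system for determining importance":
# score incoming items against values written down in advance, offline,
# before seeing the feed. Categories and weights are examples only.
MY_PRIORITIES = {
    "local_community": 0.9,
    "long_term_finances": 0.8,
    "close_relationships": 0.9,
    "viral_entertainment": 0.1,
}

def worth_my_attention(item_tags: list[str], threshold: float = 0.5) -> bool:
    # An item earns attention only if it touches something I decided matters,
    # not because an algorithm predicted I would engage with it.
    score = max((MY_PRIORITIES.get(tag, 0.0) for tag in item_tags), default=0.0)
    return score >= threshold

print(worth_my_attention(["viral_entertainment"]))      # False
print(worth_my_attention(["local_community", "news"]))  # True
```

The specific mechanism matters less than the order of operations: the criteria are set before the feed is opened, so the feed has to pass your test rather than the other way around.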
Most fundamentally, it means recognizing that in an attention economy, the most radical act is deciding for yourself what deserves your care.
────────────────────────────────────────
The question is not whether tech companies influence what you think matters. They do. The question is whether you’ll develop the capacity to think about what should matter independently of their optimization targets.
Your values are too important to be determined by someone else’s algorithm.