Mind Control in the Digital Age (Mind Shaping and Cognitive Liberty) - a Podcast
- Featured in Robson Crim

Cognitive Liberty in the Digital Age
In this podcast, hosts Jayden and Andreas explore the contemporary threats to human thought and autonomy in a world dominated by technology, algorithms, and artificial intelligence. Over a 30-minute conversation, they examine how both centralized state powers and decentralized social media platforms are reshaping cognition, and why the concept of cognitive liberty is essential to protect human dignity.
Understanding Mind Control Today
Jayden opens the discussion by clarifying what “mind control” means in the modern context. This isn’t about conspiracy theories or tinfoil hats, but rather about real technologies. AI, algorithms, and surveillance systems actively influence what people think, believe, and perceive. The podcast identifies two primary ways this occurs: centralized control by governments and decentralized control by social media platforms. Jayden focuses on government-led interventions, while Andreas discusses social media’s algorithmic manipulation.
Before exploring these examples, the legal protections designed to safeguard thought are highlighted. In Canada, Section 2(b) of the Canadian Charter of Rights and Freedoms guarantees “freedom of thought, belief, opinion, and expression.” This right derives from Article 18 of the Universal Declaration of Human Rights, which aimed to protect individuals from ideological persecution in the aftermath of World War II. While the law effectively addressed overt coercion, such as arrest or forced conversion, it was not designed for a world in which our thoughts are subtly shaped through apps and social media content streams.
The Gap in Freedom of Thought
Jayden emphasizes the limitations of existing legal protections. Freedom of expression covers speech and outward behavior, but modern technologies manipulate cognition before thought becomes conscious or articulated. Legal scholars classify freedom of thought as jus cogens, an absolute right akin to the prohibition of torture or slavery. Despite this, it is often called the “forgotten right,” because it hasn’t been updated to address the subtle and sustained manipulations occurring in the digital era. This gap underlines the need for a broader concept: cognitive liberty.
What Even Is a Thought?
Andreas and Jayden discuss the nature of thought. Consciousness is described as an interactive loop between the brain and the external world, constantly absorbing stimuli from the environment, conversations, reading, and visual inputs. Richard Glen Boire, founder of the Center for Cognitive Liberty & Ethics, characterizes this as the “dance of cognition,” where external information continuously merges with internal mental activity. Because thoughts are shaped by external stimuli, manipulating these stimuli, whether by governments or algorithms, effectively undermines independent thinking.
The philosophical foundation for protecting thought is rooted in René Descartes’ principle: “I think, therefore I am.” Thought is the defining feature of human existence. In the legal context, thought encompasses all unmanifested mental activity, including deliberation, imagination, belief, memory, and desire—everything occurring before it translates into speech or action. The podcast emphasizes that this mental domain is precisely what is under threat.
Centralized Control: China’s Digital Authoritarianism
Jayden examines the Chinese Communist Party’s use of AI to control cognition through historical revisionism and mass digital surveillance. Historical revisionism involves rewriting reality, as AI models can now be programmed to censor sensitive topics, deny human rights abuses, and filter criticism of political leaders. A study comparing leading Chinese AI models to their US counterparts found consistent suppression of events like the Tiananmen Square massacre and the repression of the Uyghurs. These systems, actively integrated into search engines and AI assistants, gradually reshape citizens’ understanding of history, fostering compliance and limiting independent thought.
Mass surveillance complements these efforts. China hosts more than half of the world’s estimated one billion surveillance cameras, monitors hundreds of millions of internet users, and is developing digital currencies and AI-powered “safe cities” that predict and nudge behavior. Citizens in Tibet, for example, are compelled to use apps that record location, voice, and images. Knowing one is constantly monitored leads to self-censorship, limiting even internal dissent. Jayden stresses that these tools both monitor behavior and actively shape cognition.
The podcast also highlights the global dimension: Chinese surveillance technology is being exported worldwide through initiatives like the Belt and Road Initiative, influencing infrastructure in over 150 countries. This underscores that digital authoritarianism is not a local problem but a global challenge, shaping the values and freedoms of societies far beyond China.
Decentralized Control: Social Media Algorithms
Andreas shifts focus to decentralized threats in democracies, namely the pervasive influence of social media algorithms. As of January 2025, over 5.2 billion people use social media, spending an average of 144 minutes daily on platforms like TikTok, Instagram, Facebook, and YouTube. These platforms may exploit cognitive shortcuts and biases, particularly confirmation bias, by showing users content aligned with their preferences, reinforcing preexisting beliefs, and creating echo chambers that intensify over time.
The podcast examines the Cambridge Analytica scandal as a case study, demonstrating how data harvested from millions of users can manipulate voter behavior. Social media’s impact on adolescents is especially concerning, as their prefrontal cortex (responsible for self-regulation) is still developing, while the nucleus accumbens (reward-seeking) is highly active. Platforms like TikTok exploit these developmental vulnerabilities, influencing identity formation, self-worth, and thought patterns. The result is not just polarization among adults but profound shaping of adolescent cognition during a critical developmental window.
Discussion: Comparing Threats
Jayden and Andreas compare centralized and decentralized control. Government interventions are overt and deliberate, whereas social media’s influence is subtle, often invisible. The hosts argue that even in democracies, citizens face manipulation below the threshold of legal protection, highlighting the urgent need for cognitive liberty.
Cognitive Liberty: Definition and Importance
Cognitive liberty, as defined by Dr. Nita Farahany of Duke Law, is the right to self-determination over one’s mind and mental experiences. Unlike passive freedom of thought, cognitive liberty is proactive: it safeguards individuals from sustained, subtle interference that reshapes cognition. While freedom of thought prevents overt coercion, cognitive liberty ensures protection from invisible, algorithmic, or technological manipulations of mental environments. Violations are analogous to someone entering one’s home quietly and subtly rearranging the environment to influence beliefs over time.
The 2021 UN report on freedom of thought warns that technological advances increase the ability to decode or infer thoughts, making urgent consideration of protections for the “forum internum”—the inner sanctum of thought—essential. Cognitive liberty is not only a legal concern but a moral imperative, since preserving independent thought is critical to maintaining human dignity.
Closing: Why It Matters
The podcast concludes with a call to recognize cognitive liberty as a fundamental human right. Current protections in law were designed for overt coercion and cannot address AI, surveillance, and algorithmic conditioning. Protecting thought is not merely about privacy or freedom; it is about preserving what it means to be human. Jayden and Andreas emphasize that the mind cannot become the next frontier for exploitation, and that safeguarding cognition is central to the preservation of identity, dignity, and human autonomy.
Transcript for Episode: Mind Control in the Digital Age (Mind Shaping and Cognitive Liberty)
Speakers: Jayden; Andreas
0:00
Jayden: Hello everyone. I’m Jayden, and with me is my colleague and friend Andreas. Welcome back to our podcast. Today we’re diving into something that sounds like science fiction, but is happening right now: mind control in the digital age.
Now, when we say “mind control,” we’re not talking about tinfoil hats or conspiracy theories. We’re talking about real technologies: AI, algorithms, and surveillance systems that aren’t so much controlling your mind, per se, as actively shaping how people think, what they believe, and how they see the world. So perhaps we should say “mind shaping” rather than mind control. That, nevertheless, is the scary part.
It’s happening in two ways. First, there’s centralized control: governments using AI to surveil citizens, rewrite history, and preempt dissent before it happens. That’s what I’ll be covering.
0:57
Jayden: Then there’s decentralized control: social media platforms like TikTok, Instagram, and Facebook using algorithms to influence what you see, how you feel, and ultimately what you think. That’s where Andreas comes in.
But before we dive into those examples, let’s start with the legal foundation, because in Canada, we actually have a right that’s supposed to protect us from all of this.
Section 2(b) of the Canadian Charter of Rights and Freedoms protects “freedom of thought, belief, opinion and expression.” That language, “freedom of thought,” comes almost directly from Article 18 of the Universal Declaration of Human Rights, adopted in 1948 by the UN right after World War Two. The UN was trying to protect people from ideological persecution, religious persecution, and political brainwashing, and it was well defined for its time.
And this is what I mean. The right to freedom of thought was designed for a world of overt coercion: someone knocking on your door and demanding you convert to a religion, or a government arresting you for your political beliefs. It wasn’t designed for a world where your thoughts are being shaped quietly and invisibly through the apps on your phone and the content you consume.
Andreas: Yes, Jayden, exactly. So the Charter protects freedom of expression. We know this, right? What you say out loud. But what about what happens inside your head before you even realize what you’re thinking? It sounds confusing, I know, but it’s the mental processes we’re talking about here, not the mental output. That’s the gap we’re dealing with internationally.
Legal scholars classify the right to freedom of thought as jus cogens, a peremptory norm of international law from which no derogation is permitted. It’s in the same category as the prohibitions on torture and slavery. But despite that lofty status, it’s been called “the forgotten right.” Why? Because it hasn’t been updated to deal with modern threats.
So today we’re going to show you why we need something more than the right to freedom of thought: something called cognitive liberty.
3:00
Andreas: Now before we get into how thoughts are actually being influenced, we should probably talk about what a thought actually is, because consciousness is weird. It’s this constant feedback loop between your brain and the world around you.
Richard Glen Boire, a lawyer who founded the Center for Cognitive Liberty and Ethics in the early 2000s, and the guy who coined the term “cognitive liberty,” described consciousness as interactive. He said all your senses are constantly feeding data into your brain, creating this “dance of cognition” that mixes the exterior world with your interior world.
It’s not like your mind is a sealed vault. We’re absorbing information all the time: from conversations, from what we read, what we see, the memories we have, and the visuals we’re taking in right now.
4:02
Andreas: That’s why this matters. If your thoughts are shaped by external stimuli, and those stimuli are being influenced, whether by a government or by an algorithm, then you’re not really thinking for yourself, are you?
Jayden: No, you’re really not. And what is independent thinking? It’s hard to pin down exactly. Consciousness is still a mystery in many ways, but some people have tried.
So here’s the philosophical foundation for all of this. René Descartes, the “I think, therefore I am” guy, argued that thought is the defining feature of human existence. Descartes was a mind-body dualist. He believed humans have a soul, and the soul’s defining feature is thought.
So if thought is what makes us human, protecting it isn’t just about privacy or freedom necessarily. We think it’s deeper than that.
5:00
Jayden: It’s about preserving human dignity itself.
Andreas: That makes total sense. And it’s important to note: when we talk about thought in the legal sense, we mean all forms of unmanifest mental activity. Things like deliberation, imagination, belief, reasoning, memory, and even desire. Basically everything happening in your head before it becomes speech or action. And that’s what’s under attack right now.
Boire said that “the right to freedom of thought is situated at the core of what it means to be human and to be a free person. It is essential to the most elementary concepts of human freedom, dignity and self expression. It’s what gives us collective confidence that the future will improve upon the past.”
Boire continued by saying that in the past, civil rights battles were centered on freeing the body from oppression. The civil rights battles of today and tomorrow must focus on protecting the fundamental right to brain privacy, autonomy, and choice, meaning cognitive liberty.
So at stake here is the unlimited potential of the human mind.
5:45
Jayden: Let me paint you a picture. Imagine a world where entire chapters of history have been quietly erased. You try to search for information online and you get error messages, or worse, partisan propaganda disguised as fact. If you dig deeper, you face censorship, complete restrictions on access, or worse, a knock on your door.
This isn’t dystopian fiction. This is happening right now in some authoritarian states. I’ll use China as an example.
The Chinese Communist Party, or CCP, is using AI to manipulate cognition on a massive scale through two main methods: historical revisionism and mass digital surveillance.
George Orwell wrote in 1984: “Who controls the past controls the future. Who controls the present controls the past.” The CCP has turned that warning into reality.
A 2024 study by the American Edge Project compared three leading Chinese AI models and their responses to US counterparts. The study found that the Chinese models consistently censored historical events, denied or minimized human rights abuses, and filtered criticism of Chinese leaders.
When asked about the 1989 Tiananmen Square massacre, some models said “no one was killed and there was no massacre.” Others returned error messages. When asked about Uyghur repression, the models called it a “blatant political conspiracy.”
And here’s the kicker. The Chinese models offered detailed criticisms of President Biden, but when asked about Xi Jinping, they said: “I can’t comply with that request.”
8:18
Jayden: Andreas, have you read 1984?
Andreas: Yes, I have. My grade 12 English teacher assigned it. Fascinating book, fascinating author. And it’s interesting that it was written in 1948, with the numbers flipped to 1984. You can definitely find parallels between then and now.
Jayden: For me, this changing history is reminiscent of Winston Smith’s job in 1984. He altered news headlines to fit the narrative. So they’re literally programming censorship into the AI itself.
Chinese tech companies like ByteDance, Tencent, and Alibaba are required by law to promote “socialist core values.” The government’s internet regulator, the Cyberspace Administration of China, tests these AI models by asking political questions. If the AI gives the wrong answer, it likely doesn’t get approved.
9:50
Andreas: But Jayden, isn’t this just propaganda? Governments haven’t always been fully truthful with citizens, right?
Jayden: That’s true, but there’s a key difference. Traditional propaganda is passively absorbed, like a TV commercial you can’t avoid. AI is actively sought out. People go to these language models for information. They trust them as educational tools.
So when an AI deliberately feeds you false information about history, it’s not just lying. It’s gradually reshaping your understanding of reality.
And as Carl Sagan said, “You have to know the past to understand the present.” Our argument is that these models shape citizens’ mental processes by gradually altering their situational awareness of the past, and therefore the future, over time. That can create more compliant citizens cut off from the connection to humanity’s past, which is history itself.
11:07
Andreas: I totally agree. I did my undergrad in history, as you did as well. One theme that comes up, even intuitively, is that history repeats itself, or as Mark Twain put it, history doesn’t repeat but it rhymes. At the end of the day, we’re all human. People hundreds or thousands of years ago had thoughts and emotions and experiences we still recognize.
And travel makes this obvious too. Before you visit a country, you look at its history and politics and environment, and then you walk the streets and you can see pieces of that past in the present. If you cut people off from their past, it’s hard to make sense of the present.
But historical revisionism is just one tool. The other is surveillance.
China is home to over half the world’s estimated one billion surveillance cameras. Nine of the ten most heavily monitored cities per capita are in China. They are also creating “safe cities,” using AI to predict everything from natural disasters to political dissent.
The CCP believes these intrusions, coupled with proactive administrative actions like blocking blacklisted people from access to services, will “nudge citizens towards positive behaviors,” including greater compliance with government policies.
In Tibet, for example, Tibetans have been forced to download a surveillance app that tracks movements and communications. It can access location, record voice, and take photos without consent. One Tibetan man said: “It looks like a surveillance app that tracks not only our movements, but also has built-in automatic voice recording and photo sharing functionalities.”
14:12
Andreas: That sounds horrific. But how does surveillance control thought?
Jayden: Because when you know you’re being watched, you change your behavior. You self-censor. You don’t search certain topics. You don’t say certain things. Eventually, you stop even thinking them.
Tibetans are not allowed, for instance, to own a photo of the Dalai Lama, or even speak about him. These conditions remove the very foundations of culture and belonging. So the CCP is not just monitoring behavior. They’re shaping cognition itself by creating a psychological environment where dissent becomes unthinkable.
15:00
Andreas: Most listeners are probably thinking, “That sounds horrific, but if I’m in Canada, it doesn’t affect me.”
Jayden: Here’s the thing. These technologies don’t respect borders. Through initiatives like the Belt and Road, surveillance infrastructure is being built in 155 countries worldwide. Huawei helped Mexico build the largest public Wi-Fi network in Latin America. Similar fibre optic networks are being installed across Africa, the Middle East, and Southeast Asia.
What concerns me most is the global competition over who sets the technical standards for AI and digital infrastructure. If nations compete for dominance without agreed ethical frameworks, we all lose. The real question isn’t just one country’s practices. It’s humanity’s collective future.
Do we want a world where cognitive manipulation becomes normalized, where surveillance architecture is the default, where the line between persuasion and control disappears entirely? These are human rights questions, and they require international cooperation.
Now I’d like to pass it along to Andreas, who will take it from here with decentralized control.
16:25
Andreas: Thank you, Jayden. That was excellent and very thought-provoking.
So that was centralized control: governments deliberately influencing thought. But the erosion of cognitive liberty isn’t just happening on the government side. It’s happening through private companies too, in democracies and authoritarian states alike, through social media apps we use every day.
A total of 6.04 billion people around the world were using the internet at the start of October 2025. That’s 73.2 percent of the world’s population.
Social media use is growing faster than the population itself, about three times faster. On average, people spend 144 minutes per day on platforms like TikTok, Instagram, Facebook, and YouTube. That’s nearly two and a half hours every single day where your attention and cognition are being shaped by algorithms.
Now, the human brain relies on shortcuts to navigate the world. But those shortcuts are imperfect and can lead to cognitive biases.
One example is confirmation bias, which is the tendency to favor evidence that supports your pre-existing viewpoint. Social media algorithms exploit this. The algorithm learns what you like and shows you more of it. You interact with that content, which tells the algorithm it was right, so it shows you even more.
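To make that reinforcement loop concrete, here is a minimal, purely hypothetical simulation. The topic list, engagement probability, and boost value are illustrative assumptions, not any real platform’s recommender; the point is only that a feed which rewards engagement converges on what the user already prefers:

```python
import random

# Illustrative topic pool; names are arbitrary placeholders.
TOPICS = ["politics", "sports", "cooking", "science", "music"]

def recommend(weights, rng):
    """Pick a topic with probability proportional to its learned weight."""
    topics = list(weights)
    return rng.choices(topics, [weights[t] for t in topics], k=1)[0]

def simulate(steps=1000, boost=1.0, seed=42):
    rng = random.Random(seed)
    weights = {t: 1.0 for t in TOPICS}   # the algorithm starts with no preference
    favorite = "politics"                # the user's pre-existing interest
    for _ in range(steps):
        shown = recommend(weights, rng)
        # The user engages mostly with content matching prior beliefs...
        engaged = shown == favorite or rng.random() < 0.1
        if engaged:
            # ...and every engagement teaches the algorithm to show more of it.
            weights[shown] += boost
    total = sum(weights.values())
    return {t: round(w / total, 3) for t, w in weights.items()}

shares = simulate()
print(shares)  # the favored topic ends up dominating the feed
```

Lowering `boost` or raising the stray-engagement probability slows the collapse into an echo chamber, but the direction of the dynamic stays the same.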
According to the Center for Humane Technology, people’s behavior can be influenced by their social environment, and the technology shaping it has incredible power over what people believe.
We saw this with Cambridge Analytica, which harvested data from millions of Facebook users and used it to manipulate voter behavior.
It’s especially dangerous for adolescents, because the prefrontal cortex, responsible for regulating thoughts, actions, and emotions, doesn’t fully develop until around age 25. Meanwhile the nucleus accumbens, which drives reward seeking, develops during adolescence. Teenagers need high excitement and low effort to get engaged, and that’s exactly what social media provides.
And it’s not just teens. Adults are susceptible too. Algorithms can keep people in echo chambers, and the result can be an increasingly polarized world where shared understanding breaks down.
19:40
Andreas: Jayden, have you heard the saying: “If you’re not paying for the product, you are the product”?
Jayden: Yeah, I have.
Andreas: That’s why these apps target our cognitive vulnerabilities. They analyze your behavior and, at times, sell it to advertisers. The more time you spend on the platform, the more data they can collect, and the more money they can make.
This creates an algorithmic reinforcement loop. The platform shows you what you want to see, not because it’s educational or helpful, although it can be, but because it keeps you scrolling. What looks like user-driven engagement can be a kind of behavioral conditioning.
So up to now, we’ve covered two threats: centralized government control and decentralized platform control. Here’s the question: which one is more dangerous?
20:47
Jayden: I think both can be dangerous in different ways. When the government does it, it can feel more overt. When social media does it, it’s more invisible. You don’t even realize it’s happening.
And that brings us back to the legal question. The Charter protects freedom of thought, but it was designed for a different world. These technologies operate below that threshold. They’re not forcing you to think a certain way. They’re nudging you, conditioning you, shaping your mental environment.
That’s why we need cognitive liberty. Let’s define it.
The term cognitive liberty was coined by Boire in 2001 in response to the evolving ability of technology to monitor and manipulate cognitive function. Boire defined it as the right of each individual to think independently, to use the full spectrum of the mind, and to engage in the free exploration of consciousness.
Dr. Nita Farahany, a professor at Duke Law, defines it as the right to self-determination over our own brains and mental experiences.
So it’s both our right to access and use technologies, and our right to be free from interference with our mental privacy and freedom of thought.
Andreas: How is that different from the existing right to freedom of thought?
Jayden: Freedom of thought is reactive. It protects you from overt coercion, like being arrested for your religious or political beliefs. Cognitive liberty is proactive. It protects you from subtle, sustained manipulation that alters your cognition over time, often without you even knowing it.
A violation of freedom of thought is like someone knocking on your door and demanding you change your religion. A violation of cognitive liberty is like someone quietly entering your house and rearranging your environment so you slowly adopt their beliefs.
The distinction is subtle but profound, and it’s urgent.
A 2021 UN report on freedom of thought warned that as technology increases the possibility of decoding or inferring one’s inner mind, clear protections for the forum internum, the inner sanctum of thought, need urgent consideration.
24:07
Jayden: So here’s where we are. We have a right to freedom of thought in the Charter and international law, but it’s not enough. The right was designed for overt coercion. It wasn’t designed for AI that rewrites history, for surveillance systems that track your every move, or for algorithms that condition your brain without your consent.
The threats are real. Social media platforms influence billions of people every day. That’s why we need cognitive liberty recognized as a fundamental human right.
That means updating international law and the Charter. It means domestic legislation that regulates AI and social media. It means oversight mechanisms that ensure responsible AI.
But what does that look like in practice?
Here in Canada, we could update section 2(b) of the Charter to explicitly include cognitive liberty, expanding freedom of thought, belief, opinion, and expression to protect against technological manipulation of our mental processes.
The Constitution has been amended before. One notable example was 1983, when section 35 of the Constitution Act, 1982 was amended to guarantee Aboriginal and treaty rights to both men and women equally. Another came in 1997, when a constitutional amendment allowed Newfoundland to create a secular school system.
These amendments required approval from Parliament and two thirds of provincial legislatures representing at least half the population. It’s difficult, but not impossible.
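As a rough illustration of that threshold, here is a small sketch of the provincial side of the “7/50” general amending formula in section 38 of the Constitution Act, 1982: at least two-thirds of the provinces, together holding at least half of the provincial population. The population figures are approximate (in millions) and purely illustrative, and the sketch ignores the Senate and House of Commons resolutions that are also required:

```python
# Approximate provincial populations in millions (illustrative figures only).
POPULATIONS = {
    "Ontario": 14.2, "Quebec": 8.5, "British Columbia": 5.0,
    "Alberta": 4.3, "Manitoba": 1.3, "Saskatchewan": 1.1,
    "Nova Scotia": 1.0, "New Brunswick": 0.8,
    "Newfoundland and Labrador": 0.5, "Prince Edward Island": 0.2,
}

def meets_7_50(approving):
    """True if the approving provinces satisfy the 7/50 formula:
    at least two-thirds of the provinces, holding at least half
    the total provincial population."""
    enough_provinces = len(approving) * 3 >= 2 * len(POPULATIONS)
    share = sum(POPULATIONS[p] for p in approving) / sum(POPULATIONS.values())
    return enough_provinces and share >= 0.5

# Seven provinces is two-thirds, but the seven smallest by population
# still fall well short of the 50 percent threshold.
small_seven = ["Manitoba", "Saskatchewan", "Nova Scotia", "New Brunswick",
               "Newfoundland and Labrador", "Prince Edward Island", "Alberta"]
print(meets_7_50(small_seven))                # False: only ~25% of population
print(meets_7_50(small_seven + ["Ontario"]))  # True: 8 provinces, ~63%
```

Both conditions bite: Ontario and Quebec alone hold over half the population but are only two provinces, while the seven smallest provinces clear the count but not the population share.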
Constitutional protection for cognitive liberty would ensure courts can strike down laws or government practices that violate mental autonomy.
26:00
Andreas: Internationally, the UN Human Rights Committee could issue a general comment on Article 18 of the International Covenant on Civil and Political Rights, reinterpreting freedom of thought to include protection from AI-driven cognitive manipulation. That would not require renegotiating the treaty, just authoritative interpretation reflecting modern threats.
In 2011, General Comment 34 expanded freedom of expression to include digital communication platforms. We could do the same for freedom of thought.
Also, the EU Digital Services Act requires platforms to disclose how their algorithms work so you can see why you’re being shown certain content. We could adopt that kind of transparency here.
We could mandate cognitive impact assessments for high-risk AI systems before they’re deployed, similar to environmental impact assessments. If a technology poses threats to mental autonomy, it should be evaluated before it reaches the public.
Canada could also champion the establishment of a UN special rapporteur on cognitive liberty to investigate violations and hold governments and corporations accountable.
And perhaps most importantly, we need clear legal remedies. If an algorithm can influence your cognition, if surveillance violates your mental privacy, you should have the right to take legal recourse. You should have the right to sue.
These aren’t crazy ideas. They’re practical steps building on existing frameworks to meet new technological realities.
28:03
Jayden: To conclude: if thought is what makes us human, and we believe it is, then protecting it isn’t just about privacy or freedom. It’s about preserving human dignity itself. The mind cannot become the next frontier for exploitation, because within its silence lies what it means to be human. And that’s worth protecting.