Democratizing Information
When Johannes Gutenberg unveiled his movable-type printing press in the mid-15th century, few could have imagined how profoundly it would reshape society. What seemed at first to be a technical innovation in book production soon proved to be one of the most disruptive forces in human history. The press democratized access to knowledge, lowered the cost of books, and fueled literacy and the spread of ideas that ignited the Renaissance. Yet it also provoked fear. Authorities—both religious and political—recognized the printing press as a threat to their control over information and attempted to contain it through censorship, licensing, and even book burnings. The story of the printing press is not just a historical curiosity. It offers a striking parallel to our current moment with generative AI. Much like the press, AI is a technology that enables alterity—radical change that challenges existing systems, hierarchies, and access to information. Alterity here means more than just “otherness”; it describes the transformative potential of a new way of producing and sharing knowledge.
Who Resisted the Printing Press
- Religious authorities: The Catholic Church had long controlled access to scripture and its interpretation. The press broke that monopoly, enabling vernacular translations of the Bible and fueling the Protestant Reformation.
- Political authorities: Monarchs feared that dissenting or revolutionary ideas could spread too quickly to contain. Early presses faced licensing requirements and censorship.
- Knowledge gatekeepers: Monasteries, universities, and scribes who had once mediated access to texts found their influence diminished.
Who Resists Generative AI
- Corporations: Media and tech companies fear loss of control over content, profits, and intellectual property. Ironically, some of the loudest voices warning of AI’s dangers come from within Big Tech itself—often in ways that could cement their control by shaping regulation.
- Political leaders: Democracies fear AI-driven misinformation; authoritarian states fear its potential to weaken censorship and empower dissent.
- Intellectual gatekeepers: Scholars, journalists, and cultural critics worry about the erosion of expertise, authorship, and human creativity.
- Ethical doomsayers: Some warn of existential risks, framing AI as a possible extinction-level threat. These narratives often capture attention, funding, and influence.
The Deeper Pattern: Knowledge and Power
At its core, resistance to disruptive technology reflects a deeper pattern: the fear of losing control over knowledge. In Gutenberg’s time, the ability to read, interpret, and spread ideas moved from a select elite to the broader public. Today, generative AI similarly lowers barriers to content creation, analysis, and creativity. When everyone can generate ideas, summaries, art, or even code, the monopoly of traditional gatekeepers is destabilized.

Using Generative AI Positively
Rather than seeing AI only through a lens of fear, we can embrace it as a tool for learning, connection, and creativity. Here are a few practical ways to use it constructively:
- Foster dialogic learning: Use AI to spark conversations, pose multiple perspectives, and support reflective dialogue.
- Assess for misinformation: Use AI as a tool to cross-check claims and strengthen critical thinking about sources.
- Generate better questions: Let AI help formulate thoughtful questions that improve human interaction and discussion.
- Support reading comprehension: Read a book and use AI to generate a clear, accessible summary to reinforce understanding.
- Create non-sensationalized news summaries: Use AI to distill current events into balanced, fact-based summaries that avoid sensationalism and reduce stress (see the sketch after this list).
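To make the last item concrete, here is a minimal sketch of how a reader might ask an AI model for a calm, fact-focused news summary. It assumes the openai Python package is installed and an API key is set in the OPENAI_API_KEY environment variable; the model name, the function name summarize_calmly, and the prompt wording are illustrative placeholders rather than recommendations, and any comparable AI service could be used in the same way.

```python
# Minimal sketch: asking an AI model for a balanced, non-sensational summary.
# Assumptions: the `openai` package is installed and OPENAI_API_KEY is set;
# the model name and prompt wording below are placeholders, not endorsements.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def summarize_calmly(article_text: str) -> str:
    """Request a short, neutral, fact-based summary of a news article."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {
                "role": "system",
                "content": (
                    "Summarize news neutrally. Avoid loaded language, state "
                    "only what the text supports, and note any claims that "
                    "lack a cited source."
                ),
            },
            {"role": "user", "content": article_text},
        ],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    print(summarize_calmly("Paste an article here to try it out."))
```

The same pattern, with a different system prompt, could support the other items on the list, such as cross-checking a claim against its stated sources or generating discussion questions for a reading group.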
A Closing Thought
The printing press reminds us that disruptive technologies are not inherently good or bad. They are tools that enable alterity—a break from the old order, with both liberating and unsettling effects. Generative AI, like the press, will expand access to knowledge while provoking those invested in gatekeeping to amplify fears. The challenge for us today is to recognize the resistance for what it often is: a defense of power and profit. And the opportunity is to ask—how can we use this technology to broaden human potential, deepen well-being, and write a new chapter of collective learning?
Alterity: Technology that enables alterity can be understood as technology that democratizes generative learning, creates access for more people to produce and share knowledge, and, in doing so, has the potential to radically change existing systems of power, authority, and culture.
At Next Level Safety, we believe technology can serve community well-being when AI is used in a safe and ethical manner.