You’ve felt it, even if you couldn’t name it. That study tool students relied on suddenly locks features behind paywalls. The MOOC platform that promised free education now requires payment just to watch lectures. Your learning management system keeps adding “features” that make everything more complicated while support gets slower and bugs pile up. There’s a term for this: ensh**tification. And once you understand how it works, you’ll see it everywhere in educational technology.
What this means for education
Cory Doctorow coined the term to describe a predictable lifecycle for digital platforms. The pattern unfolds in three stages: first, platforms are good to their users; then they abuse users to benefit their business customers; finally, they abuse those business customers to claw back all the value for themselves. At that point the platform dies, or becomes so degraded it barely functions.
The term resonated because it captured something people had experienced but couldn’t articulate. For those of us working in educational technology, the framework explains dynamics we’ve watched unfold across the sector for years.
Here’s what makes this particularly dangerous in education: power asymmetry. When Facebook degrades, you can delete your account. When Coursera puts courses behind paywalls, students can’t just switch platforms mid-semester. When your district signs a multi-year LMS contract, teachers have no exit option even as the platform deteriorates. Students and educators become captive audiences while platforms optimize for shareholder value instead of learning outcomes.
The MOOC promise, broken
The clearest story of this in education comes from the MOOC revolution. When Coursera launched in 2012, founders promised to make the best education in the world freely available to anyone who seeks it. EdX, created by MIT and Harvard as a nonprofit, offered an alternative to commercial platforms. By 2025, that founding vision lies shattered.
Coursera followed the three-stage pattern with remarkable precision. In its early years, the platform delivered genuinely free education: full course access, video lectures, readings, and practice exercises at no cost. It attracted millions of learners worldwide. Then came gradual monetization: paid certificates, specializations, and degree programs, while free course auditing remained available.
The third stage began when new leadership from major tech companies implemented “Preview Mode,” a time-based paywall that restricts full course access and allows users only to sample course beginnings. The timing reveals everything about priorities. Coursera had never been financially stronger, with substantial revenue, positive cash flow, and massive cash reserves. Yet pressure from investors demanded more aggressive extraction.
EdX’s trajectory proved even more dramatic. After years of operating at a loss, MIT and Harvard sold the nonprofit to 2U, a for-profit education company, for roughly $800 million in 2021. Universities had partnered with edX specifically because of its nonprofit status. The sale shocked the higher education community and revealed how financial pressure overwhelms educational mission.
The consensus among education technology observers: the original vision of free, open, high-quality online education is dead. What remains is a platform optimized for revenue extraction from locked-in users.
When everyday tools turn predatory
Student-facing platforms have implemented these moves with surgical precision. Quizlet, used by tens of millions of students monthly, paywalled its core Learn and Test modes after years of free access. The timing? Right at the start of the school year. Students who had built study habits around these features suddenly faced barriers.
The backlash highlighted the real impact on students. High schoolers protested that the platform was excluding students who couldn’t afford subscriptions from vital education tools. The hypocrisy stung all the more because Quizlet’s own educational content advocated for the very features now locked away.
Duolingo transformed from educational tool into advertisement delivery system. In recent years, users have reported spending far more time in each session watching advertisements than actually learning. The company uses machine learning to optimize ad delivery, generating substantial revenue from the system. With revenue approaching unprecedented levels for a language learning app, Duolingo moved the “Explain My Answer” feature behind a paywall. Helping learners understand their mistakes is a core pedagogical function, not a premium luxury.
When employees at Duolingo expressed concerns that aggressive monetization clashed with their mission, leadership acknowledged community frustration but proceeded anyway. The resulting monetization strategy vindicated those concerns. Machine learning-optimized advertisement delivery maximizes extraction regardless of educational impact.
LMS platforms as surveillance infrastructure
Learning management systems deserve special attention because they combine the ensh**tification playbook with surveillance capitalism. Canvas and Blackboard dominate the U.S. market, creating powerful network effects that trap institutions.
Recent class action lawsuits allege that major LMS companies collect data including names, student IDs, messages, discussion comments, test results, grades, search activity, essays, and creative writing. This data surpasses what we traditionally consider an education record and allows companies to build detailed profiles of children.
Legal complaints cite violations of children’s privacy laws, constitutional protections, and state privacy acts. These platforms use APIs allowing third-party developers to build integrations, raising concerns about data sharing without adequate transparency or consent. Information about data practices and third-party partners often scatters across multiple separate privacy policies.
Vendor lock-in makes LMS ensh**tification particularly insidious. Switching costs are extraordinarily high: proprietary data formats, complex integrations, and multi-year contracts trap institutions even as platforms degrade. Teachers report entering the same data into multiple incompatible systems. Platform lock-in also threatens academic freedom and intellectual property rights, because faculty often lose control of their own course materials once uploaded.
The scale of privacy violations
Recent research found that the vast majority of educational apps share student data with third parties. Categories of data commonly collected include personal identifiers, academic performance, behavioral data, biometric data like facial recognition and fingerprints, location data, browsing history, and psychological profiles.
The Federal Trade Commission has issued warnings that parents and schools must navigate an industry dominated by the commercial surveillance business model. Potential civil penalties for privacy violations have increased substantially.
Google Classroom and Zoom Education demonstrate how “free” educational services extract value through data. Despite company claims about privacy protections in educational contexts, these platforms collect extensive information including account details, activity data, device information, voice and audio, location data, IP addresses, and sensor data.
Zoom’s privacy failures proved particularly egregious. Before policy changes, the platform allowed third-party advertising cookies on its education version, sharing student data with major tech and advertising companies. Major school districts banned its use. Even after reforms, Zoom’s policies still permit automated content scanning and list biometric identifiers, including iris and retina scans, among the data it may collect.
AI as accelerator
Artificial intelligence represents what some researchers call an “ensh**tification engine.” The business model is clear: many aspects of everyday life get downgraded to an AI-as-default version, and the things you currently do and take for granted are sold back to you as the premium tier. Pay a subscription and you get access to real people; otherwise, it’s AI for you.
The lock-in proves particularly insidious with AI because once everyone adopts it, where can you go? Students themselves recognize the degradation. Research suggests students use ChatGPT to offload thinking, with some studies finding that students felt AI stole their voice when forced to use it for assignments.
Major AI companies position themselves as educational saviors while pursuing profit extraction. Partnerships with teachers’ unions drew criticism because unions exist to protect workers, not offer them up for exploitation. Company leaders declare education a priority initiative while pursuing aggressive monetization strategies.
Why the “sustainability” defense fails
When edtech companies defend their practices, the arguments center on financial sustainability. Yet the evidence often contradicts these claims. EdX justified its sale by arguing that nonprofit status created a financial disadvantage, but the massive sale price suggests substantial value was being extracted from a platform that was far from unviable.
Platform executives claim everything focuses on students and learning. Yet companies implement paywalls while maintaining substantial cash reserves and positive cash flow. Language learning apps approach unprecedented revenue levels while implementing aggressive advertising that makes platforms nearly unusable.
Companies claim premium tiers fund free users, but the evidence suggests free tiers exist primarily to create lock-in before monetization. Small numbers of business accounts generate large percentages of annual revenue while millions of students use degraded free access. The business model depends on locking in massive user bases through “free” offerings, then systematically degrading experience until payment becomes necessary.
The exception that proves the rule
Khan Academy represents a notable exception: a platform that has largely resisted these pressures by maintaining nonprofit status. The organization receives substantial donations from major foundations and companies but keeps its core educational content, serving millions of students, freely accessible without advertisements.
The contrast with for-profit competitors proves instructive. While major platforms generate hundreds of millions annually and implement new paywalls despite strong financials, Khan Academy demonstrates that educational technology can remain mission-driven when freed from shareholder demands. The platform’s persistence suggests that ensh**tification stems not from technological or business necessity but from the specific demands of venture capital and public market investors.
Understanding the pattern to make better decisions
Recognizing ensh**tification helps us make informed choices about educational technology. The pattern is consistent enough to predict: platforms attract users with promises of accessible education, achieve lock-in through network effects and switching costs, then systematically prioritize investor returns over learning outcomes.
What makes educational ensh**tification particularly harmful is the power imbalance. Students cannot simply switch platforms when coursework requires specific access. Parents cannot opt out when schools mandate surveillance-laden tools. Teachers cannot resist when districts sign multi-year contracts with platforms that degrade.
The most vulnerable students suffer disproportionately as free educational resources become paywalled luxuries, while their data gets extracted and monetized without meaningful consent or compensation. This represents not just platform degradation but a fundamental betrayal of education’s democratic promise.
Key takeaways for educational stakeholders
For administrators
- Evaluate total cost of ownership beyond initial price tags. Free or low-cost platforms may become expensive after lock-in occurs. Factor in switching costs, vendor lock-in risks, and the likelihood of future price increases or feature paywalls when making platform decisions.
- Prioritize interoperability standards and data portability. Require vendors to support open standards like LTI, Common Cartridge, and QTI, along with full data export capabilities (a minimal export sketch follows this list). Ensure you can migrate content and user data if a platform degrades or pricing becomes unsustainable.
- Shorten contract terms when possible. Multi-year agreements create vulnerability to ensh**tification. Annual or two-year contracts provide more flexibility to respond when platforms degrade service quality.
- Establish data governance policies. Require clear documentation of what data gets collected, where it goes, who has access, and how it can be deleted. Reject platforms with unclear or scattered privacy policies. Make data minimization a selection criterion.
- Consider open-source alternatives seriously. While proprietary platforms may offer polish and convenience, open-source learning management systems such as Moodle avoid the profit-extraction cycle. Factor in long-term sustainability and mission alignment, not just immediate features.
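To make the data-portability test concrete, here is a minimal sketch in Python, assuming a Canvas-style REST API (the endpoint shapes follow Canvas’s documented content-exports API, but the instance URL, token, and course ID are placeholders, and field names should be verified against your own LMS). If a vendor offers no equivalent of this script, your data is already locked in.

```python
"""Sketch: confirm you can bulk-export course content before you depend on
a platform. Assumes a Canvas-style REST API; verify against your instance."""
import time

import requests

BASE_URL = "https://lms.example.edu/api/v1"  # hypothetical instance URL
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}  # placeholder credential


def export_course(course_id: int) -> str:
    """Request a Common Cartridge export and return the download URL."""
    # Kick off the export job; Common Cartridge is the open, portable format.
    resp = requests.post(
        f"{BASE_URL}/courses/{course_id}/content_exports",
        headers=HEADERS,
        data={"export_type": "common_cartridge"},
        timeout=30,
    )
    resp.raise_for_status()
    export_id = resp.json()["id"]

    # Poll until the export job finishes, then grab the file URL.
    while True:
        status = requests.get(
            f"{BASE_URL}/courses/{course_id}/content_exports/{export_id}",
            headers=HEADERS,
            timeout=30,
        ).json()
        if status["workflow_state"] == "exported":
            return status["attachment"]["url"]  # field name may vary by LMS
        if status["workflow_state"] == "failed":
            raise RuntimeError(f"Export failed for course {course_id}")
        time.sleep(10)


if __name__ == "__main__":
    print(export_course(12345))  # hypothetical course ID
```

The design point is the open target format: a Common Cartridge archive can be re-imported into Moodle, Canvas, or any other standards-compliant LMS, which is exactly the exit option that lock-in removes.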
For educators
- Design for platform independence. Create content and courses that can migrate between systems. Use portable formats like standard HTML, PDF, or open document formats rather than proprietary tools. When platforms degrade, you want exit options.
- Maintain local copies of your content. Don’t trust that platform-hosted materials will remain accessible or exportable. Keep your syllabi, assignments, rubrics, and other course materials in formats you control outside the LMS (see the conversion sketch after this list).
- Build in multiple modalities. Don’t make any single platform essential to course delivery. If your entire course lives inside one LMS or tool, you’re vulnerable when that platform degrades. Create redundancy through email, course websites, or alternative delivery methods.
- Question “free” tools before adoption. When a new platform offers free access, ask what the business model will look like in a few years. How will this platform monetize once students are dependent? What’s your backup plan if things change?
- Teach students about platform dynamics. Help learners understand why some educational tools degrade over time and how to recognize when platforms prioritize profit over learning. Build critical digital literacy into your curriculum.
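One way to keep local copies honest is to automate them. Here is a minimal sketch, assuming pandoc is installed and a hypothetical folder layout: it converts proprietary word-processor files to standalone HTML and copies already-portable formats unchanged.

```python
"""Sketch: mirror course materials into portable formats you control.
Assumes pandoc is installed and on PATH; the folder layout is hypothetical."""
import subprocess
from pathlib import Path

SOURCE_DIR = Path("course_materials")   # wherever your working copies live
ARCHIVE_DIR = Path("portable_archive")  # the copy you actually own
ARCHIVE_DIR.mkdir(exist_ok=True)

for src in SOURCE_DIR.rglob("*"):
    if not src.is_file():
        continue
    if src.suffix.lower() in {".docx", ".odt"}:
        # Convert proprietary word-processor files to standalone HTML.
        dest = ARCHIVE_DIR / src.with_suffix(".html").name
        subprocess.run(["pandoc", str(src), "--standalone", "-o", str(dest)],
                       check=True)
    elif src.suffix.lower() in {".pdf", ".md", ".html"}:
        # Already-portable formats are copied as-is (flat layout, for brevity).
        (ARCHIVE_DIR / src.name).write_bytes(src.read_bytes())
```

Run something like this at the end of every term and your materials survive any platform change.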
For professional developers and trainers
- Prioritize transferable skills over platform-specific features. When training educators or corporate learners, focus on pedagogical principles and instructional design fundamentals that apply across tools. Platform-specific training loses value when that platform ensh**tifies.
- Develop awareness of the ensh**tification cycle. Help your audiences recognize early warning signs: sudden feature changes, degraded support, introduction of premium tiers for previously free features, or changes in company ownership and leadership.
- Create resources that outlast specific platforms. Design training materials, templates, and frameworks that work across multiple tools. Your professional development should remain valuable even when the specific platforms change.
- Model platform-agnostic thinking. Demonstrate how to accomplish learning goals using multiple tools or approaches. Show learners they have options rather than dependence on any single platform.
- Advocate for sustainable models. Use your voice to push back against platform consolidation and vendor lock-in. Support open educational resources, open-source tools, and nonprofit alternatives in your recommendations and training.
For tech coordinators
- Build technical infrastructure for portability. Implement systems and workflows that make platform switching feasible. Use middleware and integration layers that separate your data and processes from specific vendor platforms.
- Document platform dependencies systematically. Maintain clear records of what tools connect to what, what data flows where, and what the switching costs would be for each major platform (a minimal inventory sketch follows this list). This visibility helps you respond quickly when platforms degrade.
- Establish platform evaluation criteria that include ensh**tification risk. When assessing new tools, consider the vendor’s business model, ownership structure, market position, and incentives. Venture-backed startups and newly public companies face the highest risk.
- Create contingency plans for major platforms. For your critical systems like LMS, student information systems, and communication tools, document what you would do if that platform doubled prices, was acquired, or significantly degraded service. Having a plan reduces panic when changes occur.
- Support and advocate for open standards. Participate in communities and initiatives that promote interoperability. The more the sector adopts open standards, the less power any individual vendor has to lock users in and then extract value.
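A dependency inventory doesn’t require a dedicated product; a small script can carry it. The sketch below uses hypothetical platform names and fields to show the idea: record export capability, standards support, contract terms, and ownership structure, then flag risks automatically.

```python
"""Sketch: a minimal platform-dependency inventory with risk flags.
Platform names and field choices here are hypothetical examples."""
from dataclasses import dataclass
from datetime import date


@dataclass
class Platform:
    name: str
    data_exportable: bool  # can we bulk-export our data today?
    open_standards: bool   # LTI / Common Cartridge / documented APIs?
    contract_end: date
    venture_backed: bool   # ownership structure signals extraction risk


def risk_flags(p: Platform) -> list[str]:
    """Return human-readable warnings for one platform."""
    flags = []
    if not p.data_exportable:
        flags.append("no bulk export: switching will be painful")
    if not p.open_standards:
        flags.append("proprietary formats: content is effectively locked in")
    if p.venture_backed:
        flags.append("investor pressure: expect paywalls or price hikes")
    if (p.contract_end - date.today()).days > 730:
        flags.append("contract runs past two years: no near-term exit")
    return flags


inventory = [
    Platform("ExampleLMS", data_exportable=True, open_standards=True,
             contract_end=date(2026, 6, 30), venture_backed=False),
    Platform("StudyToolX", data_exportable=False, open_standards=False,
             contract_end=date(2028, 8, 1), venture_backed=True),
]

for p in inventory:
    for flag in risk_flags(p) or ["no obvious red flags"]:
        print(f"{p.name}: {flag}")
```

The specific fields matter less than the habit: every tool in your stack gets an entry, and every entry gets re-reviewed before contract renewal.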
For families
- Ask schools about data practices. Request clear information about what educational technology tools collect data about your children, what companies receive that data, and how you can review or delete it. Schools should be able to answer these questions.
- Recognize that “free” tools have business models. When your student uses a free app or platform, ask how that company makes money. If the answer isn’t clear, they’re likely monetizing through data collection or advertising.
- Support educational alternatives when possible. Choose nonprofit educational resources like Khan Academy over for-profit alternatives when you have options. Model critical thinking about technology platforms for your children.
- Teach children about digital literacy. Help kids understand that platforms don’t always have their best interests at heart. Discuss why apps ask for certain permissions, how companies make money from user data, and what it means when a free service becomes paid.
- Advocate within parent organizations. Raise questions about educational technology decisions at PTO meetings, school board sessions, and in parent communities. Your voice matters in pushing schools toward more sustainable and ethical technology choices.
Conclusion
The research connecting ensh**tification to educational technology reveals a pattern we can no longer ignore. Every decision about educational technology represents a choice: will we prioritize mission or profit extraction? Will we demand platforms that serve learners or platforms that serve shareholders? Will we build systems we control or rent access to systems that will inevitably degrade?
Yet resistance persists and matters. We can support nonprofit models, open-source alternatives, and platform cooperatives. We can design for portability and independence. We can teach critical digital literacy and demand transparency. We can resist the assumption that education must be subordinated to extraction logic.
The framework makes the stakes visible. Use it to evaluate platforms, question vendor promises, and push back against the assumption that this trajectory is inevitable. Education deserves better than platforms optimized for quarterly returns at the expense of learning outcomes.
The question isn’t whether ensh**tification will continue. The question is whether we’ll recognize it, name it, and build alternatives that put learning first.