In Mary Doria Russell’s The Sparrow, published in 1996, one of the earliest plot points involves a professional role that Russell invented for her near-future setting: the vulture. A vulture is a freelance AI analyst who is assigned to a specialist, spends weeks shadowing them and interrogating the details of their work, and then determines how much of that work can be encoded into software and handed over to machines. The name tells you everything about how the workers feel about it. When Jimmy Quinn, an astronomer at Arecibo, gets assigned a vulture, his colleague Peggy Soong’s first reaction is to urge him not to cooperate. If he does, it gets that much harder for the rest of them to keep their jobs.
Russell wrote this nearly thirty years ago, and it reads like a job description you could post on LinkedIn tomorrow.
I’ve been thinking about vultures a lot lately because, in a way that I find both exciting and uncomfortable, the work I described in my recent article on DIY EdTech automation is the same fundamental activity. When I sit with an instructional coach like Sarah and walk through her Friday PD certificate routine, asking her to articulate every step, every decision point, every edge case she handles without conscious thought, I am doing what Sofia Mendes does when she arrives at Arecibo to be Jimmy Quinn’s vulture. I’m picking apart the structure of someone’s expertise so that a machine can take over parts of it.
The difference, and I think it’s a meaningful one, is in who holds the power.
The Dystopian Read
In Russell’s world, vultures work on behalf of institutions. They’re hired by employers to extract worker knowledge, encode it, and potentially eliminate the positions those workers hold. Sofia Mendes doesn’t choose Jimmy Quinn as a client because she wants to help him. She’s contracted by ISAS, the organization that owns Arecibo, to determine whether Quinn’s job can be done by software. The knowledge flows upward: from worker to analyst to institution. The benefits flow upward too.
It doesn’t take much imagination to see how the kind of automation I’m advocating for could go this direction in education. Picture a consulting firm that positions itself as a “workflow optimization partner” for school districts. They send analysts into buildings to shadow administrative assistants, instructional coaches, and support staff. They map every process, identify automation candidates, build the tools, and hand them to administration. The pitch to the school board writes itself: we can maintain the same level of service with fewer staff, or at least without replacing the positions that open up through attrition.
This isn’t hypothetical. It’s a business model that already exists in the private sector, and the language of “efficiency” and “doing more with less” has been creeping into public education for decades. When someone external maps your processes and builds automation on your behalf, the implicit question is always the same one Russell dramatized: now that we know how to encode your work, do we still need you to do it?
The vulture framing also captures something about the emotional experience that I think gets underplayed in most conversations about workplace automation. Quinn’s colleagues don’t just dislike vultures on principle. They’re afraid. They’ve watched it happen to others. They know that cooperating with the process makes it easier for the institution to justify cuts. There’s a real betrayal embedded in the act of honestly describing what you do all day to someone whose job is to figure out if a machine could do it instead. You’re handing over the keys to your own replaceability.
Any honest conversation about bringing automation tools into schools has to sit with that feeling rather than rushing past it.
The Utopian Read
But here’s where the Sarah story from my article diverges from the vulture model, and where I think there’s something genuinely different happening.
Sarah wasn’t assigned a vulture. Nobody from central office sent an analyst to shadow her and determine which parts of her job could be eliminated. She looked at her own Friday routine, decided she was tired of spending three hours on certificate generation, and chose to automate it herself. The knowledge still got mapped and encoded, but it stayed in her hands the entire time. She decided what to automate, how to automate it, and what to do with the time she got back.
That last part matters enormously. When an institution automates a worker’s tasks, the reclaimed time belongs to the institution. They can redistribute it, eliminate it, or pile more work into the gap. When a worker automates their own tasks, the reclaimed time is, at least initially, theirs. Sarah used her recovered hours for the work she’d been wanting to do all along: deeper coaching conversations, more classroom observations, actual relationship-building with the teachers she supports. Nobody told her to do that. She just finally had the space.
This is the version I want to believe in, and it’s the version I try to support in my work with educators. Self-elected automation, driven by the person who does the work, where the benefits accrue to the worker first. You identify your own pain points. You map your own processes. You build (with AI’s help) your own tools. You decide what the freed-up time becomes.
In The Sparrow, Jimmy Quinn actually tries something like this. Rather than simply cooperating or refusing, he proposes a pilot project: let Sofia build the AI program from his expertise, then pit him against it to see which performs better. He’s trying to prove that the human still adds value beyond what the encoded version can replicate. He’s asserting that the map of his expertise is not the same as his expertise. That the automation can handle the routine pattern recognition, but the novel observations, the creative leaps, the judgment calls in ambiguous situations, those still require him.
That’s essentially the argument I make when I tell educators to automate the routine 80% and keep the human judgment for the remaining 20%. The certificate generation, the hour calculations, the email routing: those are pattern-matching tasks that follow clear rules. The decision about whether a particular professional development experience actually changed a teacher’s practice, the conversation with a struggling educator about what support they really need, the ability to read a room during a training and pivot on the fly: those are the things that require a person, and they’re the things that get crowded out when that person is buried in spreadsheet work.
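To make the 80/20 split concrete, here is a minimal sketch of what the routine side of a certificate workflow might look like. Everything in it is hypothetical: the column names, the session data, and the completion rule are assumptions for illustration, not a description of any real district system. The point is the shape of the logic: clear-rule cases get automated, ambiguous cases get routed to a person.

```python
import csv
from io import StringIO

# Hypothetical attendance export; the columns and values here are
# invented for illustration, not taken from a real system.
ATTENDANCE_CSV = """name,session,minutes,status
Alice Rivera,Literacy PD,180,complete
Ben Okafor,Literacy PD,95,partial
Cara Liu,Literacy PD,180,complete
"""

CERT_TEMPLATE = "Certificate: {name} earned {hours:.1f} PD hours for {session}."

def process(rows):
    """Split rows into auto-generated certificates and human-review cases."""
    certificates, needs_review = [], []
    for row in rows:
        hours = int(row["minutes"]) / 60
        if row["status"] == "complete":
            # Routine 80%: a clear rule applies, so the machine handles it.
            certificates.append(CERT_TEMPLATE.format(
                name=row["name"], hours=hours, session=row["session"]))
        else:
            # Judgment 20%: partial attendance needs a human decision
            # about whether the experience still merits credit.
            needs_review.append(row["name"])
    return certificates, needs_review

certs, review = process(csv.DictReader(StringIO(ATTENDANCE_CSV)))
```

The design choice worth noticing is the explicit `needs_review` list: the tool doesn't guess at the ambiguous cases, it surfaces them. That's the encoded version of keeping the judgment calls with the person.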
My Own Conflict
I’d be dishonest if I presented this as a clean binary. The truth is that even the self-elected version makes me uneasy, because I don’t fully trust the systems these educators work within to respond the way I’d hope.
I don’t even have to look outside my own organization to find this tension. Part of my work involves developing online courses, and some of my colleagues deliver in-person professional development on related topics. These are people who are not opposed to online learning in any philosophical sense. They understand its value. But I’ve felt the hesitation when the conversation turns to building an online version of something they currently deliver face-to-face. The concern isn’t about pedagogy. It’s about what happens when a school district looks at a catalog and sees that they can get an asynchronous course on the same topic for a fraction of the cost of bringing someone on-site for a full day. The worry is that the cheaper, easier-to-implement option becomes a replacement rather than a complement, and that the people who built their professional lives around delivering that training lose the work that sustains them.
I understand that worry completely, because it’s the same vulture logic playing out at a smaller scale. The knowledge that my colleagues carry, the ability to read a room, adjust pacing on the fly, respond to the specific challenges a particular group of teachers is facing, none of that translates into an asynchronous course. But the institutional decision-makers comparing line items on a budget spreadsheet may not see it that way. They see two products that cover the same topic, and one costs significantly less. The nuance about what gets lost in translation doesn’t always survive a budget meeting.
This is the automation anxiety made personal and immediate. It’s not about robots or AI or some distant future. It’s about the people I work with wondering whether the things I build will eventually be used to argue that their work is redundant. And the fact that I can see both sides, that I genuinely believe online and in-person learning serve different purposes AND that I understand why the fear is rational given how organizations actually make decisions, doesn’t resolve the discomfort. It just means I carry it with me.
Here’s the scenario that keeps me up at night on a broader scale: Sarah automates her certificate process and saves two to three hours a week. She uses that time for deeper coaching work, and her teachers benefit. Then her principal notices that Sarah seems to have extra capacity. New responsibilities get added. A committee assignment here, a data reporting task there. Within a year, the automation didn’t free Sarah up. It just made room for more work to fill the gap. The efficiency gains got captured by the institution, not the worker, even though Sarah did everything “right.”
Or worse: Sarah shares her automation tools with other coaches in the district. Central office sees that coaches are now significantly more efficient at administrative tasks. The next budget cycle, they reduce coaching positions because “we’ve automated a lot of that work.” The tools Sarah built to help herself end up being used as justification for cutting her colleagues.
This is not a problem unique to education. It’s the fundamental tension of automation under capitalism, and it extends well beyond for-profit enterprises. I’ve watched nonprofit organizations, school districts, and cooperatives all fall into the same pattern: someone figures out how to do more with less, and leadership’s response is to demand more with even less. The efficiency gains never settle into reduced workload or improved quality of life for the people doing the work. They get absorbed into the organism’s growth imperative.
And that growth imperative is worth naming plainly. Continuous growth is the logic of cancer, not of a healthy organization. A healthy system grows to the size its environment can sustain, then maintains itself. It invests in resilience, in relationships, in the capacity to respond to change. An organization that treats every efficiency gain as an opportunity to expand, to take on more clients, serve more students, cover more territory with fewer people, is consuming its own people as fuel.
I see this in education constantly. Districts that respond to staffing shortages not by making the remaining positions more sustainable but by finding ways to squeeze more output from fewer workers. Service cooperatives (including ones I’ve worked with) where the response to a good tool or a streamlined process is always “great, now what else can we take on?” rather than “great, now our people can do their current work better and go home at a reasonable hour.”
Automation, even self-elected automation, feeds into this dynamic if the surrounding organizational culture treats human capacity as infinitely expandable.
What Sofia Teaches Us
There’s one more thread from The Sparrow that I think is worth pulling. Sofia Mendes is an extraordinary character for reasons beyond her role as a vulture. She’s essentially an indentured worker. Her technical skills were developed by someone who bought her as a child, and her earnings flow to the man who “invested” in her. Her expertise in AI analysis is simultaneously her greatest asset and the mechanism of her captivity. The work she does for others is the same work that, eventually, helps purchase her freedom.
There’s something in that paradox that maps onto what I see with educators and technical literacy. The same skills that could be used by institutions to replace workers can also be the thing that gives workers agency. When Sarah can build her own automation tools, she’s not dependent on IT approval that takes months. She’s not waiting for a vendor to build something that almost-but-not-quite fits her needs. She’s not vulnerable to a consulting firm that might map her processes and hand them to administration.
She has, in a small but real way, bought back a piece of her professional autonomy. She understands what she does well enough to articulate it, and she understands the technology well enough to decide which parts to encode and which parts to keep. That combination of domain expertise and technical literacy is, I think, the closest thing to a safeguard against the dystopian version of this story.
It’s not a complete safeguard. Sofia’s skills don’t protect her from the system she’s trapped in. They just give her the means to eventually work her way out. Similarly, an educator who can build their own tools isn’t protected from an administration that decides to cut positions or pile on responsibilities. But they’re better positioned than someone who can’t, because they understand the terrain well enough to advocate for themselves and to see when the efficiency narrative is being weaponized against them.
Where I Try to Land
I don’t have a clean resolution for any of this. The honest answer is that I believe the work I’m doing, helping educators build small automation tools to reclaim their time, is worthwhile and also potentially dangerous, depending on the organizational context it happens within.
What I try to hold onto is the distinction between self-elected and imposed automation. When a teacher or coach or administrative assistant looks at their own work and says “I want to stop spending three hours on this every Friday,” that’s qualitatively different from an efficiency consultant or a building administrator saying “we’ve identified three hours of automatable work in your schedule.” Even if the technical outcome is identical, the power dynamics are not.
I also try to be honest with the educators I work with about the risks. If you build a tool that saves you significant time, think carefully about who you tell and how you frame it. Not because you should hide your work, but because you should be intentional about the narrative. “I built a tool that lets me spend more time on coaching” is a different story than “I built a tool that eliminated three hours of work from my week.” Both are true. They invite very different institutional responses.
Russell understood something in 1996 that we’re still wrestling with: the act of mapping human expertise for machine consumption is never neutral. It’s always embedded in power relationships, economic incentives, and institutional dynamics. The question isn’t whether to automate. That ship has sailed. The question is who controls the process, who benefits from the results, and whether the humans involved get to keep the parts of their work that make them human.
In the best version of this story, teachers become their own vultures, and they do it on their own terms. In the worst version, someone else does it to them. The technology is the same either way. The difference is entirely about power, and about whether we build organizational cultures that treat reclaimed time as a gift to the worker or a resource to be harvested.
I’d rather help people build the tools themselves and take that chance than wait for the consultants to show up and do it for them.