How decentralized administration and phased rollout balanced autonomy with consistency across a fragmented entity
- Project Type: Statewide Learning Management System Implementation
- Audience or Client: Statewide entity serving employees across departments (obfuscated per contract)
- Timeline: Implementation 2022–2023, public launch August 1, 2023, first-year reflection through 2024
- Role: Implementation Lead and Acting Administrator
- Collaborators: Departmental sub-admins; cross-functional content migration teams
- Format or Deliverables: Canvas LMS deployment with Canvas Catalog and Canvas Studio, sub-admin training, course quality rubric, accessibility guidelines
The Situation
A large statewide entity needed to replace a professional development system that had grown incrementally over years and was no longer working for the people relying on it. Offerings were fragmented across departments, tracking of participation and completion was inconsistent, and the existing platform could not meaningfully support blended or asynchronous learning. Statewide initiatives and mandated trainings were especially difficult to manage.
The people using the system knew it was not serving them, but the scope of replacing it was daunting enough that the project had stalled before. My role was to lead the implementation of a centralized platform that would serve employees across every department in the state.
The Approach in Brief
The central design problem was balancing autonomy with rigor across an entity where departments operated differently, had different needs, and required genuine ownership of their learning spaces. The solution was a decentralized administration model: each department appointed sub-admins who had real authority over their course areas, while a shared quality rubric, accessibility standards, and a peer community of practice provided the consistency that a centralized approach would have enforced top-down. The reasoning was straightforward: at this scale, departments would adopt the platform if they owned it, and they would own it only if they had actual control.
What Came Out of It
The platform launched publicly on August 1, 2023 and has served as the primary professional learning environment for the entity since. A Teams channel for instructors and sub-admins grew into an active community of practice, and departments developed genuine ownership of their course areas. Coaching instructors through Cathy Moore’s action mapping approach moved a portion of the catalog away from the “recorded webinar with a completion checkbox” pattern that had been the default. Integration with the entity’s HR system and single sign-on solution removed a meaningful adoption barrier by eliminating new accounts and passwords, and a peer mentor program pairing less tech-comfortable users with colleagues who had adapted faster proved more effective than formal training sessions in many cases.
Specific engagement and completion data are confidential to the entity and cannot be shared here.
Key Takeaway
Large-scale LMS implementations succeed or fail based on how well they account for the people using them, not the systems they run on. Decentralized ownership and phased rollouts work because they treat adoption as a continuing relationship rather than a one-time launch.
Deeper Context
The starting position had history. The existing professional development system had been built up department by department over years, with each addition solving a local problem without consideration for the system as a whole. The result was the kind of organic accumulation that is hard to reverse: every department had something that worked for them, even if poorly, and replacing any one piece risked disrupting the workarounds people had built into their daily routines.
Canvas by Instructure was selected as the LMS, with Canvas Studio for video content and Canvas Catalog for course registration and management. The selection was driven by Canvas’s adult learning features, its scalability for statewide deployment, its integration capabilities with the entity’s existing HR system and employee database, and an interface that did not require significant prior LMS experience to use. The integration capabilities mattered because the entity already had a single sign-on solution in place, and any platform that required users to remember new credentials would create friction we did not want to add to a transition that was already going to require people to change habits.
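The HR and SSO integration described above typically runs through Canvas's SIS Import feature, which accepts CSV files in a documented format. The sketch below is illustrative only: the column names follow Canvas's published users.csv format, but the HR-side field names, status mapping, and records are all hypothetical, and the entity's actual integration details are confidential.

```python
import csv
import io

# Hypothetical HR export records; real field names vary by HR system.
hr_records = [
    {"employee_id": "E1001", "sso_login": "jdoe", "first": "Jane", "last": "Doe",
     "email": "jdoe@example.gov", "active": True},
    {"employee_id": "E1002", "sso_login": "rroe", "first": "Rick", "last": "Roe",
     "email": "rroe@example.gov", "active": False},
]

def to_sis_users_csv(records):
    """Render HR records as a Canvas SIS Import users.csv string.

    Column headers follow Canvas's documented users.csv format; everything
    on the HR side (field names, status mapping) is an assumption.
    """
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf,
        fieldnames=["user_id", "login_id", "first_name",
                    "last_name", "email", "status"],
    )
    writer.writeheader()
    for r in records:
        writer.writerow({
            "user_id": r["employee_id"],   # stable HR identifier
            "login_id": r["sso_login"],    # must match the SSO login exactly
            "first_name": r["first"],
            "last_name": r["last"],
            "email": r["email"],
            "status": "active" if r["active"] else "suspended",
        })
    return buf.getvalue()
```

The detail that matters most in practice is `login_id`: it has to match what the single sign-on provider sends, or users authenticate successfully and still land on an empty account.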
The Journey
The work began with a needs analysis that combined surveys, stakeholder interviews, participation data, and a review of the existing content creation and delivery process. The quantitative data was useful and necessary, but the insight that ended up shaping the project came from time spent listening to the people who would actually be using the system every day.
What I heard consistently was that the problem was not just technical. People felt disconnected from professional development as a concept. The existing system made offerings hard to find, the tracking was unreliable enough that completing a course did not always feel like it counted, and instructors had limited tools for creating engaging experiences and limited support for improving their practice. These conversations clarified that the design challenge was not just replacing a system. It was rebuilding a relationship between people and the idea of professional learning, which is a different and harder problem.
The decentralized administration model, which I touched on above, is worth expanding on here. The decision was not about reducing central workload. It was about putting authority where the contextual knowledge lived. A central admin team could not understand the specific professional development needs of every department in the state, but the people inside those departments could. Giving them real control meant the platform could serve genuinely different needs without becoming inconsistent in the ways that matter.
The consistency came from somewhere else: a shared course quality rubric, standardized templates for common course types, accessibility guidelines built into the development process rather than retrofitted afterward, and the Teams channel that grew into a community of practice where sub-admins compared notes. The rubric and templates were the rigor. The community of practice was where the rigor actually got reinforced in day-to-day work, which is harder than enforcing it through policy.
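To make the rubric-plus-review idea concrete, here is a minimal sketch of how a quality rubric can function as a structured checklist rather than a policy document. The dimensions and threshold are invented for illustration; the entity's actual rubric criteria are confidential.

```python
# Illustrative rubric dimensions; the real rubric's criteria are confidential.
RUBRIC = {
    "clear_outcomes": "Learning outcomes are stated and measurable",
    "aligned_practice": "Activities give practice toward the outcomes",
    "accessibility": "Meets the accessibility checklist (captions, alt text, headings)",
    "navigation": "Uses the standard template; learners can find everything",
}

def review(course_ratings, threshold=3):
    """Return the rubric dimensions scoring below threshold.

    `course_ratings` maps rubric keys to integer scores (assumed 1-4)
    that a sub-admin assigned during course review.
    """
    return {k: RUBRIC[k] for k, score in course_ratings.items()
            if k in RUBRIC and score < threshold}
```

A structure like this gives sub-admins a shared vocabulary for feedback ("accessibility scored a 2, here's why") without dictating how each department designs its content.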
Alongside the administration model, we committed to a phased rollout. The pilot phase involved select departments and employee groups, and we expanded gradually before the full public launch on August 1, 2023. This approach allowed for iterative improvements at each stage and generated early success stories from pilot departments that became useful for building buy-in with departments that came on board later. The phased approach also meant that when we encountered problems, the consequences were contained to a smaller group while we worked them out.
The integration work was easy to underestimate at the time. Connecting Canvas to the employee database, the HR system, and the single sign-on solution removed one of the most common adoption barriers. When people can access a system without creating a new account or remembering a new password, the friction drops considerably, and the message that this is a real institutional system rather than a side project gets reinforced.
Where the design shifted most significantly was in the content itself. As the platform matured, I recognized a pattern that I suspect is common in organizations making this kind of transition: many of the “courses” on the platform were essentially recorded webinars with a completion checkbox attached. The content might have been solid, but the design was not taking advantage of what an asynchronous platform could offer. The response was to introduce action mapping, the approach developed by Cathy Moore, to help instructors redesign their content around specific learning goals, the actions needed to reach those goals, and practice activities that would drive real-world application. This shift improved the quality of offerings, but it required sustained coaching to actually take hold. It was not enough to explain the concept once. We had to work alongside instructors as they applied it to their own content, which meant the coaching work extended well beyond the initial training period and became a continuous part of the platform’s evolution.
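Action mapping's structure, a measurable goal, the observable actions that serve it, and practice activities attached to each action, lends itself to a simple representation. The sketch below is my own illustration of the framework, with a hypothetical course; it is not the entity's material. The flagging function captures the coaching heuristic in miniature: an action with no practice attached is a recorded webinar waiting to happen.

```python
from dataclasses import dataclass, field

@dataclass
class Action:
    """An observable on-the-job behavior that serves the goal."""
    behavior: str
    practice_activities: list = field(default_factory=list)  # scenarios, not quizzes

@dataclass
class ActionMap:
    """Action-mapping skeleton: goal -> actions -> practice -> minimal info."""
    goal: str  # a measurable business goal, not a topic
    actions: list = field(default_factory=list)

def content_only_actions(course):
    """Flag actions with no practice attached -- the 'recorded webinar
    with a completion checkbox' pattern in miniature."""
    return [a.behavior for a in course.actions if not a.practice_activities]

# Hypothetical course used for illustration.
course = ActionMap(
    goal="Cut data-entry error rates in quarterly reports",
    actions=[
        Action("Validate entries against the source record",
               practice_activities=["Spot the errors in three sample reports"]),
        Action("Escalate discrepancies through the correct channel"),  # no practice yet
    ],
)
```

In coaching sessions, a gap surfaced this way turns an abstract design principle into a specific question: what would someone practice in order to get better at this action?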
Adoption itself was uneven in ways the change management literature predicts but that still feel different when you are living through them. Some departments leaned in early and became reference cases for others. Some moved slowly, and some individual users were genuinely uncertain about engaging with the platform at all. The response was layered: targeted change management workshops, showcasing department-specific wins in the Teams channel, regular coaching and office hours, and a peer mentor program that paired less tech-comfortable users with colleagues who had adapted more quickly. The peer mentor program turned out to be more effective than the formal training in many cases, because it met people where they actually were rather than where we hoped they would be.
The Work Itself
Because the entity remains obfuscated per the contract, the artifacts from this project cannot be shown directly. What I can describe are the categories of materials the implementation produced:
- a course quality rubric used by sub-admins and instructors across departments
- accessibility guidelines and checklists built into the course development workflow
- standardized templates for self-paced, instructor-led, and blended course formats
- sub-admin training modules for platform management
- a content migration triage system used by cross-functional teams to prioritize and adapt existing material for the new platform

Where representative excerpts can be shared without identifying the entity, I can include those on request. The frameworks and approaches behind these artifacts also appear, in adapted form, in articles and resources on bradylicht.com.
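Since the triage system itself cannot be shown, here is a generic sketch of how that kind of migration prioritization can be scored. The criteria, weights, and example items are entirely invented for illustration; the real system's inputs are confidential.

```python
def triage_score(item, weights=None):
    """Score a legacy course for migration priority.

    Criteria and weights are illustrative, not the entity's actual model.
    `item` is expected to carry 0-1 ratings for each criterion.
    """
    weights = weights or {
        "demand": 0.35,         # how many employees need it
        "mandate": 0.30,        # tied to a statewide or mandated training
        "shelf_life": 0.20,     # content still accurate and current
        "redesign_cost": -0.15, # heavier rework lowers near-term priority
    }
    return sum(weights[k] * item.get(k, 0.0) for k in weights)

def triage(items):
    """Return (name, score) pairs sorted highest priority first."""
    return sorted(((i["name"], round(triage_score(i), 3)) for i in items),
                  key=lambda pair: pair[1], reverse=True)

# Invented legacy-catalog entries for illustration.
legacy = [
    {"name": "Mandated ethics training", "demand": 1.0, "mandate": 1.0,
     "shelf_life": 1.0, "redesign_cost": 0.2},
    {"name": "2019 tool walkthrough", "demand": 0.2, "mandate": 0.0,
     "shelf_life": 0.3, "redesign_cost": 1.0},
]
```

The useful property of even a crude scoring model is that it forces the migration conversation onto explicit criteria, so "we should move this first" becomes a claim the cross-functional team can actually argue about.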
What I’d Change
A few things I would approach differently if I ran this implementation again.
If I were starting over, I would build action mapping into the initial instructor training rather than introducing it as a mid-stream correction. The “recorded webinar with a completion checkbox” pattern is predictable enough at the start of an LMS implementation that addressing it from day one would have saved coaching time and produced a stronger first wave of courses. Introducing action mapping after instructors had already built and launched material meant asking them to revisit work they had just finished, which is harder than asking them to design with the approach in mind from the beginning.
The peer mentor program would also start earlier. We leaned on formal training in the first several months and only built out peer mentorship after noticing that informal peer support was doing more of the adoption work than the structured sessions were. Recognizing that pattern at the start would have shifted peer mentorship from a supplemental support to a core piece of the rollout architecture.
I would also set a more honest content migration timeline. Moving existing material to a new platform while maintaining quality and adapting it to take advantage of new features was a bigger job than the original schedule reflected. A more realistic timeline would have set better expectations with stakeholders and given the migration teams more room to do the rethinking the work actually required, rather than the simpler work of transfer.
