Implementing a Statewide Professional Learning Platform


Note: Identifying details of the entity described in this case study have been deliberately obscured, per our contract with them.

When a large statewide entity approached us about replacing their professional development system, the situation was familiar in the way that these situations often are. Their existing system had been built incrementally over years, and the result was fragmented professional development offerings spread across departments, inconsistent tracking of employee participation and completion, limited capacity for blended or fully online learning, and no clean way to manage statewide initiatives or mandated trainings. The people using the system knew it was not working well, but the sheer scope of replacing it felt daunting.

My role was to lead the implementation of a modern, centralized learning management system that would serve employees across the entire state. What follows is a narrative of that process, including the decisions that shaped it, the challenges that emerged during the first year, and the lessons I took away from it. My goal in writing this is not to present a tidy success story but to document what actually happened in enough detail that other professionals facing similar projects can find something useful in it.

Starting With the People, Not the Platform

I began the project with a comprehensive needs analysis, and the most important thing I learned during that phase was that the quantitative data only told part of the story. We ran surveys of employees, managers, and department staff, conducted in-depth interviews with key stakeholders, analyzed existing participation data, and reviewed current content creation and delivery processes. All of that was necessary. But the insight that ended up shaping the rest of the project came from spending extended time listening to the people who would actually use the system every day.

What I heard consistently was that the problem was not just technical. People felt disconnected from professional development as a concept. The existing system made it difficult to find relevant offerings, and the tracking was unreliable enough that completing a course did not always feel like it counted. Instructors, meanwhile, had limited tools for creating engaging experiences and limited support for improving their practice. Understanding these perspectives, not just the system requirements, led to better decisions throughout the project.

The entity selected Canvas by Instructure as the LMS platform, based on its features for adult learning, its scalability for statewide deployment, its integration capabilities with existing systems, and its relatively approachable interface for both instructors and learners. The selection also brought access to Canvas Studio for video content and Canvas Catalog for course registration and management, both of which proved valuable as the implementation matured.

The Decisions That Shaped the Implementation

Three early decisions had an outsized impact on how the project unfolded.

A phased rollout. Rather than launching statewide all at once, we started with a pilot phase involving select departments and employee groups, then gradually expanded before the full public launch. This approach allowed for iterative improvements at each stage and helped manage the change resistance that inevitably accompanies a system transition of this scale. It also generated early success stories that became powerful tools for building buy-in with departments that came on board later.

A decentralized administration model. This was one of the most consequential decisions of the entire project. Instead of managing the platform centrally, we appointed sub-admins for each department and division, provided them with tailored training on platform management, and allowed departments to customize their course areas within established guidelines. The effect was significant. Departments developed a sense of ownership over their section of the platform, which drove both engagement and innovation. It also distributed the workload in a way that made the system sustainable and ensured that each department had someone who understood their unique requirements.

Flexible course development with quality guardrails. To accommodate diverse professional development needs without sacrificing consistency, we created standardized templates for common course types, supported multiple formats (self-paced, instructor-led, and blended), and developed a course quality rubric that gave instructors creative freedom within a clear framework. We also built accessibility into the process from the start, creating guidelines and checklists to ensure all content met established accessibility standards rather than trying to retrofit accessibility after courses were already built.

Integration with the state’s existing systems was another priority. Connecting the platform to the employee database, the human resources management system, and the state’s single sign-on solution created a seamless login experience that removed one of the most common barriers to adoption. When people can access a system without creating a new account or remembering a new password, the friction drops considerably.
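To make the integration work above concrete, here is a minimal sketch of one common pattern for syncing an HR system with Canvas: formatting employee records as a SIS `users.csv` import, which Canvas accepts through its SIS Import API. The field names on the HR side (`employee_id`, `sso_username`, and so on) are illustrative assumptions, not the state's actual schema, and the sync cadence is hypothetical.

```python
import csv
import io

def build_sis_users_csv(employees):
    """Format HR records as a Canvas SIS users.csv import.

    `employees` is a list of dicts from the HR system; the incoming
    field names here are illustrative, not a real schema.
    """
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf,
        fieldnames=["user_id", "login_id", "first_name",
                    "last_name", "email", "status"],
    )
    writer.writeheader()
    for emp in employees:
        writer.writerow({
            "user_id": emp["employee_id"],       # stable HR identifier
            "login_id": emp["sso_username"],     # must match the SSO identity
            "first_name": emp["first_name"],
            "last_name": emp["last_name"],
            "email": emp["email"],
            "status": "active" if emp.get("active", True) else "deleted",
        })
    return buf.getvalue()

# In a real deployment the resulting CSV would be posted to Canvas's
# SIS Import endpoint (POST /api/v1/accounts/:account_id/sis_imports,
# authenticated with an admin token), typically on a nightly schedule.
sample = [{"employee_id": "E1001", "sso_username": "jdoe",
           "first_name": "Jane", "last_name": "Doe",
           "email": "jdoe@example.gov"}]
print(build_sis_users_csv(sample))
```

Keeping `login_id` aligned with the single sign-on identity is what makes the "no new account, no new password" experience possible: Canvas trusts the SSO assertion and matches it to the imported login.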

The First Year

We officially launched the platform to the public on August 1, 2023. The initial response was positive, with many users appreciating the centralized access to professional development. But adoption, as I have learned on other projects, is not a moment. It is an ongoing process that requires sustained support and attention.

Building a community of practice. One of the most rewarding developments was the emergence of a genuine community among our users. We established a dedicated Teams channel for Canvas instructors and administrators, which became a hub for sharing updates, best practices, and creative solutions to common problems. Regular communication through this channel kept stakeholders informed about new features, course design tips, and success stories from across departments. What started as a top-down communication tool evolved into something more collaborative, with users sharing ideas and helping each other in ways we had not anticipated.

Shifting from webinars to real learning experiences. As the platform matured, we recognized a pattern that I suspect is common in organizations making this transition: many of the “courses” on the platform were essentially recorded webinars with a completion checkbox attached. The content might have been solid, but the design was not taking advantage of what an asynchronous platform could offer.

In response, we introduced action mapping, an approach developed by Cathy Moore, to help instructors redesign their content around specific learning goals, the actions needed to reach those goals, and practice activities that drive real-world application. This shift improved the quality of offerings and made better use of the platform’s capabilities, but it also required sustained coaching. It was not enough to explain the concept once; we needed to work alongside instructors as they applied it to their own content.

Staying current with platform updates. Canvas releases new features on a regular cycle, and keeping pace with those releases while also supporting day-to-day operations proved to be its own challenge. After attending InstructureCon, we learned about developments including Lucid integration for collaborative activities, screen capture in SpeedGrader for more dynamic feedback, smart search for improved content discovery, AI-generated discussion summaries, a rubrics redesign, and selective release for personalized content delivery. We assigned a dedicated team member to monitor and evaluate new releases, shared updates through the Teams channel, and developed targeted training for the features with the most immediate impact.

Refining evaluation and quality. As we neared the end of the first year, we took time to reflect on what was working and what needed adjustment. We refined our course evaluation checklist with a stronger emphasis on accessibility and instructional design best practices, and we began exploring personalized learning pathways and adaptive strategies as a direction for the platform’s next phase.

What We Ran Into

No implementation of this scale goes smoothly, and I want to be honest about the friction points because they are where the most useful lessons tend to live.

Adoption was uneven. Despite initial enthusiasm, some departments and individual users were slow to engage with the platform. Our response was multifaceted: targeted change management workshops, showcasing early department-specific wins through the Teams channel, regular coaching and “office hours” for questions, a comprehensive onboarding program, just-in-time resources like video tutorials and quick guides, and a peer mentor program that paired less tech-comfortable users with colleagues who had gotten up to speed. What I observed was that the peer mentor approach was often more effective than formal training, because it met people where they actually were rather than where we hoped they would be.

Content migration was more complex than anticipated. Moving existing content to the new platform while maintaining quality and taking advantage of new features turned out to be a bigger lift than the timeline suggested. We developed a triage system to prioritize what got migrated first, created small cross-functional teams to handle migration by content area, and offered training on optimizing content for online delivery. The action mapping work fed directly into this process, helping instructors rethink their content rather than simply transferring it to a new container.

The synchronous-to-asynchronous transition required patience. Many instructors were comfortable with webinars and less certain about designing effective asynchronous experiences. We addressed this by providing ongoing coaching, showcasing successful asynchronous examples from within the platform, and encouraging a blended approach that combined asynchronous content with optional synchronous touchpoints. This gave instructors a bridge rather than asking them to make the leap all at once.

Sustaining engagement in a digital environment took deliberate effort. Keeping learners engaged over time in a fully online platform required more than good content. We encouraged the use of multimedia and interactive elements, promoted collaborative features, and experimented with gamification where it made sense. The discussion summary feature in Canvas turned out to be particularly helpful for managing engagement in courses with large enrollments.

What I Took Away

Several lessons from this project have stayed with me and shaped how I approach similar work.

Empower departments. The decentralized administration model was not just an operational convenience. It was the single most important factor in building genuine buy-in across the organization. When departments have real ownership over their section of the platform, they invest in making it work.

Phase the rollout. The gradual implementation gave us room to adjust, and the early phases generated the success stories and practical knowledge we needed for the later ones. I would not do a statewide launch any other way.

Spend the time on empathy. The needs analysis data was important, but the time I spent listening to employees and instructors, understanding their frustrations and their hopes for the new system, informed better decisions than the numbers alone would have. This is a lesson I keep relearning, and it keeps proving true.

Coach from want to need. One of the most effective strategies I discovered was initiating coaching and training around what was initially appealing to users, then gradually guiding the experience toward organizational priorities. This “want to need” transition, borrowed from theatrical principles, proved far more effective at driving meaningful adoption than starting with mandated requirements and hoping enthusiasm would follow.

Communicate continuously. The Teams channel became more than an information distribution tool. It became the connective tissue of the entire implementation. Regular, transparent communication about what was happening, what was changing, and what was coming next kept people engaged through the inevitable rough patches.

Where It Goes From Here

The first year established the foundation, but the platform’s value will ultimately be determined by how it evolves. The directions we are exploring include establishing a regular feedback loop with users to drive ongoing improvements, investigating micro-credentialing and badging as ways to recognize professional growth, developing partnerships with higher education institutions for credit-bearing options, and continuing to shift traditional webinar-style content into more engaging asynchronous experiences.

On the technical side, we are talking about AI-driven course recommendations based on employee profiles and goals, adaptive learning pathways for more differentiated professional development, and features like selective release for more targeted content delivery. The analytics capabilities of the platform also have room to grow, and better data on learning patterns and outcomes would strengthen both the courses and the case for continued investment.

The implementation of this platform was one of the most complex and rewarding projects I have worked on. It reinforced my belief that large-scale educational technology projects succeed or fail based on how well they account for the people using them, not just the systems they run on. Structure matters, flexibility matters, and the willingness to adapt continuously matters. But none of it works without genuine attention to the humans at the center of the process.

If you are working on a similar implementation, or thinking about starting one, I am happy to talk through what we learned. You can reach me at licht.education@gmail.com, and you can find more case studies and articles at bradylicht.com.

