Individual Reflection 2: Stakeholder Challenge Proposals Round 2


Looking Back: Lessons from Round 1

The first round of stakeholder challenges taught me the importance of specificity, stakeholder communication, and user-centered design. While crafting our cybersecurity challenge for Round 2, I deliberately included specific parameters (30% of users requiring assistive technologies, a 40% remote workforce, 60% of staff over age 45) to provide clear context. Ironically, this attention to specificity later highlighted an interesting tension when we received guidance about our mental health framework response – to “keep the project universal and straightforward” while still providing implementable solutions.

Theme 1: The Universal-Specific Paradox in Instructional Design

This round revealed a fascinating paradox in instructional design: the simultaneous demand for universal applicability and specific implementation detail. When reviewing Team 1’s cybersecurity response, I noted: “Looking at how we approached their broad challenge with actionable details in our mental health framework, I would have expected them to do the same (or even more so) with our more narrowly defined cybersecurity challenge.” Yet when they evaluated our mental health framework, they critiqued it for being too broad – despite having explicitly requested a universal approach.

This parallel critique exposed a fundamental tension in instructional design: how do we create solutions that are both broadly applicable and specifically actionable? The experience taught me that what appears as “surface-level” to one team might represent “flexible adaptability” to another.

Theme 2: Context and Design Interpretation

Our team discussions revealed how context shapes both design approaches and their interpretation. When Team 1 asked us to create a universal framework, we responded with a detailed three-tiered system that we believed provided both structure and flexibility. Yet some of the very elements we saw as providing necessary detail were interpreted as making the solution too broad.

Similarly, when reviewing their cybersecurity response, we found ourselves wanting more specific implementation details about accessibility features and LMS integration – even though they might have seen their approach as appropriately high-level. This dynamic highlighted how designers’ interpretations of terms like “universal,” “specific,” and “implementable” can vary significantly based on their perspective and priorities.

Theme 3: Beyond Knowledge Checks – Ensuring Real Behavioral Change

A key insight from this round centered on the gap between knowledge testing and actual behavioral change. As I noted in our discussions, “We’ve all worked with folks who can pass security tests but still use weak passwords!” This observation led me to advocate for more sophisticated assessment approaches, including “pop quizzes” and real-world application tests.

This focus on behavioral change represented a significant evolution in my thinking about assessment design. Rather than being satisfied with traditional knowledge checks, I began pushing for assessment strategies that could verify whether training actually modified workplace behaviors. This included suggestions for periodic security checks and simulated phishing tests that would occur months after the initial training.

Theme 4: Collaborative Leadership in Design Teams

The project revealed valuable insights about team dynamics and collaborative leadership in instructional design. When summarizing our team’s evaluation approach, I consciously worked to integrate multiple perspectives: “I tried to make sure our evaluation reflected both of your approaches while still pushing for the deeper technical and implementation details we all felt were missing.”

Our team’s success came from balancing different viewpoints and expertise. When Jenna reminded us about maintaining a constructive learning focus, and Malissa raised concerns about peer support implementation, we were able to integrate these perspectives into a stronger overall evaluation. This experience taught me that effective instructional design often emerges from the synthesis of diverse team perspectives rather than any single viewpoint.

Looking Forward

This round has deepened my understanding of instructional design’s complexities. Specifically, it has shown me that:

  • Universal frameworks and specific implementations can coexist
  • Assessment must focus on behavioral change, not just knowledge retention
  • Team collaboration enhances both design and evaluation processes
  • Different perspectives on design can be equally valid

Moving forward, I aim to:

  • Develop frameworks that better balance universal applicability with specific guidance
  • Create assessment strategies that verify real behavioral change
  • Further develop skills in collaborative leadership and team synthesis
  • Better understand and articulate the reasoning behind design decisions

These experiences have shown me that instructional design’s most valuable insights often come from engaging with its complexities rather than seeking simple solutions. Whether wrestling with the universal-specific paradox or working to ensure genuine behavioral change, the key lies in embracing these challenges while maintaining focus on practical, implementable solutions.
