
This section outlines my approach to evaluating the course’s effectiveness. The plan blends qualitative and quantitative feedback to confirm that the course meets its goals and to support continuous improvement. The process follows Kirkpatrick’s Evaluation Model (Reaction, Learning, Behavior, and Results).
Below, you’ll find an interactive visualization showing how I applied Kirkpatrick’s Model to this course.
To further demonstrate my evaluation methodology recommendations, I have included a data visualization that outlines how qualitative data will be collected and analyzed. Click each interactive button below to learn about the data sources for each stage, the rationale for collecting them, and how the data will be analyzed.
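For readers who prefer a text-only summary, the sketch below shows one way the four Kirkpatrick levels could be paired with data sources and analysis approaches. The specific sources and methods listed here are generic, illustrative examples, not the course-specific details captured in the interactive visualization.

```python
# Illustrative (hypothetical) mapping of Kirkpatrick's four levels to example
# data sources and analysis approaches. The interactive visualization contains
# the course-specific details; these entries are generic placeholders.
KIRKPATRICK_PLAN = {
    "Reaction": {
        "data_sources": ["end-of-module surveys", "open-ended feedback prompts"],
        "analysis": "descriptive statistics plus thematic coding of comments",
    },
    "Learning": {
        "data_sources": ["pre/post assessments", "performance tasks"],
        "analysis": "score gains and item-level analysis",
    },
    "Behavior": {
        "data_sources": ["classroom observations", "follow-up self-reports"],
        "analysis": "rubric ratings compared against baseline practice",
    },
    "Results": {
        "data_sources": ["learner outcome metrics", "program-level indicators"],
        "analysis": "trend analysis against program goals",
    },
}

# Print a compact summary of the mapping, one line per level.
for level, details in KIRKPATRICK_PLAN.items():
    print(f"{level}: {', '.join(details['data_sources'])} -> {details['analysis']}")
```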
Data Analysis Plan
I designed a comprehensive data analysis plan that can serve as a template for any district or organization adopting this course, with room for local adaptation. The plan emphasizes equity, continuous improvement, and actionable insights, supporting both instructional quality and learner success. Key elements of the plan include:
Key metrics to track
Data collection methods
Data preparation and integration
Data analysis framework and methods (illustrated in the sketch below)
Challenges in data interpretation
Actionable insights and continuous improvement strategies
End-to-end data workflow
Reporting and review timeline
Appendix with templates and examples to ensure reliability and validity in data collection (e.g., teacher evaluation rubric, data dictionary)
You can interact with the full plan in the embedded content below.
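As a complementary, text-only illustration of the "Data analysis framework and methods" element above, the following minimal sketch (in Python with pandas) shows how pre/post assessment scores might be merged with reaction-survey data and disaggregated by subgroup to surface equity gaps. The file names, column names, and subgroup field are hypothetical placeholders; the embedded plan and its data dictionary define the actual metrics, sources, and analysis methods.

```python
import pandas as pd

# Hypothetical input files; the plan's data dictionary defines the real fields.
assessments = pd.read_csv("assessments.csv")   # columns: learner_id, subgroup, pre_score, post_score
surveys = pd.read_csv("reaction_surveys.csv")  # columns: learner_id, satisfaction (1-5)

# Learning gains (Kirkpatrick Level 2): change in score per learner.
assessments["gain"] = assessments["post_score"] - assessments["pre_score"]

# Merge with reaction data (Level 1) so satisfaction and learning can be examined together.
merged = assessments.merge(surveys, on="learner_id", how="left")

# Disaggregate by subgroup to support the plan's equity emphasis:
# comparable gains across groups suggest the course is serving all learners.
summary = (
    merged.groupby("subgroup")
    .agg(
        learners=("learner_id", "count"),
        mean_gain=("gain", "mean"),
        mean_satisfaction=("satisfaction", "mean"),
    )
    .round(2)
)
print(summary)
```

A summary table like this would feed the reporting and review timeline, flagging any subgroup whose gains or satisfaction lag so that instructional adjustments can be prioritized.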
As this instructional design project moves into the evaluation phase, the focus shifts to continuous improvement and real-world learner success. While this plan has not yet been implemented, it offers a comprehensive approach grounded in user feedback, user experience assessments, and quality assurance practices. The goal: to ensure the course truly meets its objectives and supports all learners’ engagement and growth.
Each stage of the ADDIE process demonstrates a user-centered mindset, anticipating challenges and adapting solutions as needed. The detailed evaluation framework shows a commitment to collecting meaningful data, analyzing outcomes, and applying those insights for future improvements.
Looking ahead, this structured evaluation proposal sets up the project for ongoing adaptation and innovation. By fostering a cycle of learning and reflection, this approach supports continuous improvement, equity, and responsiveness across diverse learning environments. Ultimately, effective instructional design is a long-term process that centers learners, values feedback, and drives quality at every stage.
For a closer look at how this project shaped my approach to instructional design, explore the lessons and takeaways on my Reflections page.