Evidence of Teaching Effectiveness & Impact
A. Effectiveness Data
Summary Narrative of Course Evaluations (Four Most Recent Online Semesters)
Across four recent fully online semesters (Spring 2024 through Fall 2025), I taught nine online course sections (INTS 210 Sustainable World, INTS 410 Contemporary Health, INTS 249 Digital Literacy, and INTS 204 Leadership Theory and Practice). Response rates varied by course but averaged approximately 64%, with higher participation in INTS 210 and INTS 410 and lower participation in INTS 249 (a skills-oriented elective course).
Using the CHSS reporting method (the “average of the medians” within each category), my course-evaluation results show consistently strong performance across four dimensions:
Course Environment/Experience
Course environment and experience represented my strongest and most stable area across semesters (overall average median approximately 4.78/5, range 4.60–5.00). Students consistently experienced the courses as organized and navigable.
Learning Outcomes
Learning outcomes were similarly strong (overall average median approximately 4.54/5, range 4.00–5.00), aligning with my emphasis on applied learning products that make thinking visible and transferable.
Student Participation
Student participation was also rated highly (overall average median approximately 4.72/5, range 4.25–5.00), suggesting that students perceived the learning environment as one that prompts engagement even in asynchronous conditions.
Instructor Preparation & Course Organization
Instructor preparation and course organization remained positive overall (overall average median approximately 4.50/5, range 4.00–5.00), with the expected variability that comes with different course formats (asynchronous vs. synchronous history, new builds vs. mature course iterations, and differing student expectations by course type).
Collectively, the evaluation data support a clear story: students consistently rate the online course experience as well-structured and supportive, and they report strong learning outcomes and engagement across multiple preparations and semesters. The area with the most room to strengthen is how students interpret "teaching style" in a fully asynchronous environment. I address that directly in my improvement plan below through expanded instructor presence, feedback modalities, and optional in-person touchpoints. The chart below summarizes the course evaluations across the four semesters.
Evidence of Improved Student Learning Outcomes
A Case Study of One Student's Journey
One of the most difficult tasks for an online educator is providing evidence of achievement beyond a letter grade. In my INTS 210 Sustainable World and INTS 410 Contemporary Health courses, I know my students have achieved mastery because they move through a step-by-step methodology that builds the confidence to find specific leverage points and design solutions that are truly desirable, feasible, and viable, as demonstrated in the student case study below:
The Learning Continuum
Student: Noelle Talarek (with permission)
Meaningful Assignments for Finding Leverage Points
Noelle used the CRAAP test (a tool to assess the Currency, Relevance, Authority, Accuracy, and Purpose of sources) to establish factual evidence of the athlete mental health crisis and create an infographic to visualize the information. However, the learning became "meaningful" when she conducted six interviews with student athletes and their coaches and discovered a leverage point: the fear that vulnerability would jeopardize an athletic career.
Infographic
Variety of Assessment Methods
I use formative and summative assessments to ensure mastery. I assessed Noelle's ability to synthesize raw qualitative data into a Mind Map (Formative) before she moved to her Final Design Thinking Pitch (Summative).
Mind Map
Final Design Thinking Pitch (without audio or video)
Developing Desirable Solutions
By the time Noelle reached the Prototype phase, her "Athlete Check-In" system was not a top-down solution she had contrived; it was a desirable solution built on the "pain points" and "stress cycles" identified by the athletes themselves.
Clear Evaluation & Mentoring
My feedback is prompt and clear, often involving "weeding through" themes and insights with students through Zoom calls or in-person office hours to distill the most viable ones. This mentoring ensures students remain agile and flexible, responding to feedback by making fundamental changes to their designs when necessary.
Transformative Learning
From Bias to Strategic Insight
The most significant "meaning" students take away from my course is the ability to unlearn "action" biases ("here's what I am going to do" or "here's what I think they need").

Cognitive Shift
Transformation is evidenced when a student realizes that "real solutions to problems cannot just be one thing" and must consider the entire context.

Confidence Through Competency
Noelle's confidence grew from having a "structured approach to problem discovery".

Human-Centered Impact
The final "Athlete Check-In" prototype is the ultimate proof of achievement: it represents a student who has learned to suspend their own biases to truly hear another person's pain points and motivations.
In INTS 204: Leadership Theory and Practice, I can see evidence of learning and growth because students begin the semester with a "sensemaking" assignment in which they surface a real leadership challenge from their own lives, one that did not resolve the way they hoped, and make their assumptions, instincts, and unanswered questions explicit. Near the end of the course, they return to the same scenario and complete the same sensemaking prompts again, this time using the theories and language they have learned across the semester to reinterpret what happened, what mattered most, and what they would do differently. The culminating assessment then asks them to translate that revised thinking into an Escape Room Finale: a branching leadership simulation built from their own case, with theory-labeled decision points, evidence checks that signal whether their approach is working, and explicit guardrails (e.g., equity, trust, morale, transparency, well-being). The result is visible evidence of mastery that is difficult to fake and easy to recognize: students move from "this is what happened to me" to "this is how I would lead now," showing how theory changes interpretation, choices, consequences, and definitions of success.
Evidence of Learning Outcomes as Reflected Upon by Students
2030 Skills Matrix
The 2030 Skills Matrix Summary Table serves as a vital artifact of my Online Teaching Effectiveness and Impact. This table triangulates student feedback to demonstrate the "meaning and competencies" learners take away from the course, specifically focusing on the high-complexity capabilities required for future professional success. Here is the accompanying prompt for the last day’s assignment:
As we move toward the future, certain skills are becoming increasingly important. Please take a look at the 2030 Skills Matrix shared in the link above. Focus especially on the skills in the UPPER RIGHT QUADRANT, which represent high-value and high-complexity capabilities needed for future success. Don't miss the competency on the 'line,' which is empathy and active listening!
PROMPT/INSTRUCTIONS:
You may write a reflection (about 3-4 paragraphs) addressing the following:
- Which of the 2030 skills (from the upper right quadrant) do you feel you have developed in this class?
- Provide specific examples from the course assignments, tools, or experiences that helped you strengthen these skills.
- If applicable, mention how this skill might support your future academic, professional, or personal goals.

Below is a matrix of selected student reflections and alignments to the skills:
B. Support Letters


C. Closing Statement: Evidence, Reflection, and Forward Plan
My online teaching has been shaped by an unusual pairing of careers: I am an academic trained to value evidence, theory, and methodological rigor, and I am also an entrepreneur who has spent more than two decades building digital learning experiences where the design of the experience is the instruction. Over time, I have learned that these two identities strengthen each other. Entrepreneurship has made me relentlessly interactive and learner-centered; academic training has made me disciplined about scaffolding, assessment clarity, and intellectual integrity.
The game I designed and executive produced, Go Nisha Go, was downloaded by more than 700,000 girls because it responded to needs revealed through formative, human-centered research. Each episode was intentionally scaffolded to align with learning objectives and the core rhythm of game-based learning: challenge, explore/discover, decide, get feedback, and try again.
Across my fully online courses, the goal is not simply comprehension; it is agency within constraints, in the same way that players of Go Nisha Go build confidence and self-efficacy through scaffolded choices. I want students to be able to say: I can relate to this issue, I care deeply about it, and I can investigate it rigorously enough to design a response that I, as a college student working within the limits of this semester, can realistically carry out. That is the throughline behind my design choices: students move from "I understand this topic" to "I can do something with it," because the course repeatedly asks them to translate evidence and lived perspectives into feasible, desirable action.
What my evidence suggests about effectiveness
For INTS 210 and INTS 410, I can observe evidence of learning in the quality of student prototypes and in students' own reflections about skill development—especially when they describe how interviewing, synthesis tools, and iteration changed what they were able to produce. In INTS 204, I see the same "beyond the grade" evidence in a different form: students begin with a baseline "sensemaking" assignment about a real leadership challenge that did not turn out as they hoped. Near the end of the semester, they revisit the same scenario, answer the same prompts again, and then build an Escape Room Finale that turns that lived dilemma into a branching leadership simulation grounded in theory. When a student can show how their interpretation, choices, guardrails (e.g., equity, trust, morale), and evidence checks evolved from week one to the final simulation, growth becomes visible, not just asserted.
Student course-evaluation data across the most recent online semesters reinforces this pattern of “visible learning.” Using the CHSS-required approach (reporting the average of the medians within each factor), my overall results across nine online course offerings (total enrollment 221) show consistently strong performance in the areas that matter most in asynchronous teaching:
- Course Environment/Experience: 4.78
- Student Participation: 4.72
- Learning Outcomes: 4.54
- Instructor Preparation & Course Organization: 4.50
Participation rates varied by course and cohort (approximately 32%–88%, mean ~64%), which is typical of voluntary end-of-term surveys in fully online contexts. Even with that variability, the pattern is stable: students rate the courses highly for structure, clarity, and outcomes—precisely the dimensions that reflect intentional course design rather than live delivery.
The area with the most room to strengthen is how students interpret and score "teaching style" in a fully asynchronous environment. Many evaluation instruments implicitly assume an in-person or synchronous teaching presence (performance, real-time delivery, spontaneous interaction). In my asynchronous courses, "teaching style" shows up differently: it is the experience architecture—how the course is sequenced, how students are oriented and re-oriented, how choices and expectations are made explicit, and how feedback is delivered so students can revise and improve. I address that directly in my improvement plan below by expanding instructor presence and feedback modalities in ways that fit a fully online course without increasing burden for students.
The two support letters in this portfolio align with—and humanize—the evaluation data. One letter, written from a student perspective, describes how a design-thinking structure paired with accessibility and responsiveness supported confidence and applied problem-solving in a fully asynchronous course. A second letter, from a colleague/administrator who reviewed my online teaching, highlights navigability, module-to-module scaffolding, and my intentional combination of design thinking and gamification to help students engage deeply and produce substantive work. Taken together, the letters validate what my course design intends: students are not only learning content; they are learning how to do real work with ideas in a way that transfers beyond the course.
Finally, I included student reflections aligned to the 2030 Skills Matrix as an additional form of learning-impact evidence. These reflections capture transfer in students’ own language. Across INTS 210 and INTS 410, students repeatedly describe growth in skills that map directly to the matrix’s “core/high-value” quadrant—especially empathy and active listening (through structured interviews), analytical thinking (through synthesis and pattern finding), systems thinking (through mapping tools), creative thinking (through ideation and prototyping), and resilience/flexibility (through iteration after feedback). These outcomes are not incidental; they are exactly what the assignments are designed to evoke.
What I am working to improve (the next design cycle)
It is hard for me to write about “improvement” because my teaching already runs on iteration: each semester produces new evidence of what worked, what was unclear, and what students needed more of. So, I frame improvement the way I teach it: not as deficit, but as the next design cycle.
1) Strengthen instructor presence in ways that fit asynchronous learning.
Because “teaching style” can read as “live performance” in student evaluations, I will make my asynchronous presence more legible—without turning the course into a synchronous requirement. Concretely, I will expand short weekly orientation videos (what matters this week, what success looks like, common pitfalls), add optional community touchpoints (lightweight, time-bounded), and diversify feedback modalities (brief audio/video feedback where it adds clarity). The goal is not more content; it is more “human signal” at the moments students most need it: when they are choosing a topic, narrowing scope, interpreting interviews, and moving from insight to prototype.
2) Make career readiness more explicit—without narrowing intellectual ambition.
I believe part of our responsibility in higher education is to prepare students to do real work with ideas: communicate clearly, collaborate responsibly, interpret evidence ethically, and produce audience-specific deliverables. I also recognize that “career readiness” can be misunderstood as training rather than education. My aim is to bridge that divide by showing—through assignments and student work—that applied work raises the intellectual bar. It demands evidence, synthesis, trade-off thinking, and accountability. Going forward, I will continue refining language in prompts and rubrics so students can name the competencies they practiced (not just the product they produced), and I will share those patterns with colleagues who are navigating the same campus-wide tension.
3) Keep generative AI inside the learning process.
I no longer treat AI as a policing problem; I treat it as a design problem. I will further develop structured, theory-driven workflows where students use AI transparently for bounded purposes (for example, forecasting plausible consequences in a branching scenario), and then critique, correct, and revise those outputs using course concepts. This strengthens academic integrity by requiring judgment rather than text production. The improvement goal here is sharper AI-use guardrails and tighter reflection prompts that make students’ reasoning unmistakably their own.
4) Expand the pathway from student prototypes to implementation.
The most meaningful learning evidence I see is when students move from insight to action: when a prototype becomes a campus conversation, a pilot, a proposal, or a fundable idea. I am committed to building institutional pathways for that transition. I have already begun meeting with university leadership to explore mechanisms—internal seed funds, cross-unit partnerships, and external grants—that can help student ideas travel beyond the classroom. My next cycle goal is to document “what happened next” for select student projects so the portfolio reflects not only excellent prototypes, but credible steps toward implementation.
How I will extend impact beyond my own courses
Going forward, I want to formalize what I already do informally: mentor other faculty in how to make online courses interactive using principles like design thinking and game-based learning. A faculty member once told me that he teaches a class on poverty and that there is nothing fun about it. I responded that game-based learning is about engagement, and design thinking is about solving a problem by understanding root causes; design thinking is therefore a perfect approach to a class on poverty.
Closing reflection
I began my career as an investigative journalist trained to pursue the who, what, where, when, and why. Public health and environmental science deepened my understanding of underlying systems and determinants. Design thinking and game design gave me a way to make those systems teachable—especially online—by turning complexity into navigable choices, visible trade-offs, and improvable prototypes. That arc comes full circle in my teaching: I do not use games or design methods as novelty. I use them because they make thinking visible, transferable, and human-centered.
Taken together—evaluation summaries, letters, and student learning reflections—my evidence points to an online teaching practice that is organized, applied, and deeply student-centered. Students are asked to engage real problems, listen to people affected, interpret evidence with care, and build responses that are feasible and desirable within constraints. My next phase is not a reinvention; it is a scale-up: supporting colleagues who want to teach this way, and helping more student ideas travel beyond the classroom into the university and the world it serves.
Bringing the classroom to the world:
Two George Mason students attending the Game Changers 2024 conference in India

