Overview
This use case highlights how Togeder's Assessments Dashboard served as the core component of an alternative midterm exam in a graduate-level course on mathematics education.
Facing the growing challenge of assessing student understanding in the age of AI-generated content, Professor Baruch Schwarz replaced the traditional, proctored exam with an alternative midterm assessment: a collaborative group problem-solving session conducted live on Zoom, powered by Togeder's Assess platform and designed to capture authentic learning, critical thinking, problem-solving, and teamwork.
Why This Matters
This was not just an enhancement to existing pedagogy; it was a direct response to the limitations of traditional exams in online, AI-rich environments. By rethinking the structure of assessments, Prof. Schwarz turned the midterm exam into:
- A real-world simulation of collaborative problem-solving
- A method to evaluate process over product
- An AI-resilient, scalable solution that still allowed for individual accountability
Context
The alternative assessment was conducted in a graduate-level course on mathematics education. The course had 16 students, with weekly lectures delivered over Zoom in a hybrid format. A key topic in the course was mathematical problem-solving (e.g., Polya's four-stage model, heuristic methods, metacognitive processes).
Group work was a central component of the course. Zoom Breakout Rooms were used frequently, and Togeder facilitated most sessions in real time to guide student collaboration.
Solution: Togeder-Assisted Group Exam
Overview of the Exam
The course evaluation consisted of two parts:
- 25% Midterm Exam: Group problem-solving activity conducted live over Zoom using the Togeder Assess platform
- 75% Final Exam: Individual written exam
Group Assignment Format
- Students were grouped into teams of 3–4
- Each group received a complex problem: "The Gardener Problem" (an open-ended problem with infinite solutions)
- Roles were assigned within each group: solvers, a guide, and a reporter
Togeder Assess was used to:
- Monitor real-time participation
- Provide private channels for mentoring guides
- Collect transcripts and summaries
- Generate metrics on participation and group balance
Assessment Criteria & Automation
Criterion | Weight | Automated Support via Togeder
---|---|---
Group balance | 10% | Calculated from the standard deviation of word counts relative to the median
Problem-solving stages compliance | 20% | AI analyzed summaries for adherence to Polya's model
Articulation of heuristics/methods | 30% | AI identified problem-solving strategies in the conversation
Reasoning processes | 20% | AI flagged use of inductive reasoning, deductive reasoning, or appeals to theorems
Solution attainment | 20% | Manually reviewed with AI-generated suggestions
Participation (individual) | Scaling factor | Word count tracked by Togeder; used to scale individual grades
Students were told that all criteria except participation were scored collectively, with the same score given to the entire group.
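To make the rubric concrete, here is a minimal sketch of how the weighted group score and the group-balance criterion could be computed. The weights come from the rubric above; the exact balance formula is an assumption, one plausible reading of "standard deviation vs. median" in which a spread as large as the median word count scores zero.

```python
import statistics

# Rubric weights from the table above (participation is handled
# separately as a per-student scaling factor).
WEIGHTS = {
    "group_balance": 0.10,
    "polya_stages": 0.20,
    "heuristics": 0.30,
    "reasoning": 0.20,
    "solution": 0.20,
}

def group_balance(word_counts):
    """Balance in [0, 1]: 1.0 means every member spoke equally.

    Hypothetical formula: the population standard deviation of
    per-member word counts, relative to the median; a spread as
    large as the median scores 0.
    """
    median = statistics.median(word_counts)
    if median == 0:
        return 0.0
    return max(0.0, 1.0 - statistics.pstdev(word_counts) / median)

def weighted_group_score(scores):
    """Combine per-criterion scores (0-100) using the rubric weights."""
    return sum(WEIGHTS[name] * value for name, value in scores.items())

# Example: a fairly balanced three-person group
scores = {
    "group_balance": 100 * group_balance([480, 520, 450]),
    "polya_stages": 90,
    "heuristics": 80,
    "reasoning": 85,
    "solution": 75,
}
total = weighted_group_score(scores)
```

The weights sum to 1.0, so the combined score stays on the same 0-100 scale as the individual criteria.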
Process
- Real-time Collaboration: Groups worked in breakout rooms, with Togeder Assess tracking all conversations during the assessment session
- Data Collection: Togeder automatically collected:
- Full transcripts
- Summaries per group
- Participation stats, including word counts
- AI Support:
- The AI engine triggered alerts when a group did not clearly understand the instructions
- The teacher used AI summaries to track the progress of each group
- Manual Review:
- Final scores for each group and student were reviewed and adjusted
- The teacher added 2–3 sentence justifications for each criterion
- Assessing each criterion took ~5 minutes per group
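The case study notes that individual grades were scaled by participation, measured via word counts. The exact scaling rule is not specified; the sketch below assumes one simple possibility, in which a member who spoke at least as much as the group median keeps the full group grade and quieter members are scaled down proportionally.

```python
import statistics

def scale_by_participation(group_grade, member_words, all_words):
    """Hypothetical scaling rule: compare a member's word count to
    the group median; the factor is capped at 1.0 so talkative
    members cannot exceed the group grade."""
    median = statistics.median(all_words)
    factor = min(1.0, member_words / median) if median else 0.0
    return round(group_grade * factor, 1)

# Example: one member spoke half as much as the group median
words = [480, 520, 240]  # words spoken per member
grades = [scale_by_participation(88.0, w, words) for w in words]
```

Whatever rule is used, it preserves the collective nature of the rubric while giving the instructor a principled handle for individual accountability during manual review.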
Benefits Observed
- Authentic Assessment: Students solved problems collaboratively using real-world processes; the assessment was a learning opportunity for the students
- AI-Resilience: Group structure and oral collaboration made it difficult to outsource answers to generative AI
- Scalability: Togeder's automation drastically reduced the instructor's evaluation workload
- Transparency: Students knew exactly how they would be assessed; the rubric was shared in advance
- Pedagogical Depth: Encouraged application of concepts taught during the course in real-time, guided practice
Conclusion
This case demonstrates that Togeder's Assess is not only a tool for guiding group work, but also a powerful platform for authentic, AI-resilient assessment. The hybrid evaluation model—combining human teaching expertise, structured group collaboration, and AI-powered summaries—offers a scalable and pedagogically rich alternative to traditional exams.
For additional details, see our blog post, "Revolutionizing Assessment in the AI Era: How Togeder's Platform Transforms Math Education Evaluation".