Use Case: Oral Presentation and Defense

A Peer-Reviewed Approach to Oral Presentations in a Large Class

Implemented By

Dr. Liat Eyal, lecturer at the Levinsky-Wingate Academic College

Date of Implementation

June 2025

Overview

This semester, I experimented with a new way of assessing students’ oral presentations, using Togeder to create a structured, collaborative, and authentic learning experience. To my surprise, it turned out to be one of the most engaging class sessions we have had.

Designing the Experience

In large courses, it is often challenging to give every student a meaningful opportunity to present their work and receive individual feedback. To address this, I divided the class (around 50 students) into small groups of 4–5.
This was a hybrid class, with the presentations taking place on Zoom, which made it even more important to structure time carefully and ensure everyone stayed engaged.

Each student had 7 minutes to present their work and 3 minutes to respond to questions from peers. The guideline was clear: presentations had to be delivered orally, not just read from slides, and questions from peers had to be authentic and thought-provoking, not superficial.

This structure made it possible to complete an entire round of presentations and feedback within a single 60-minute session — something that’s nearly impossible in traditional settings.

Facilitating the Session

The experience was truly enjoyable, and even empowering. I loved the sense of oversight it gave me: being able to hear and see the interesting things my students were presenting while keeping a clear handle on time management across groups. Not in a controlling sense, but in a way that allowed me to stay connected to the learning process as it unfolded in multiple rooms at once.

In a traditional setting, I am physically present at every presentation, which takes a great deal of time, and yet students rarely get to learn from one another’s work or feedback. When I let students present in groups, I miss out on most of the discussions. This time, I could actually follow what was happening: what ideas emerged, how students interacted, and how they reasoned.

The students, too, responded positively. I kept them informed throughout the process, explaining the goals of the activity, how the tool works, and even sharing the screens and participation data afterward. They appreciated the transparency and were curious about how such tools could be used in schools themselves.

What We Learned

A few interesting insights emerged:

  • Summary view: It was extremely helpful to get a short overview of each room’s discussion.
  • Participation metrics: The data on who spoke, how often, and for how long gave valuable insights into engagement and group dynamics.
  • Student reactions: Having a “bot” in the room didn’t bother them. Their main anxiety came from the act of presenting, but that eased as the session went on.

Next Steps

Now, I want to explore how the transcript data can be used for assessment purposes, to identify patterns of reasoning, argumentation, or collaboration quality.
There is clear potential here for developing new ways to evaluate oral performance and peer feedback at scale. This was a meaningful step toward making large-class assessment more personal, interactive, and authentic, and I am eager to refine it further.

Oral Presentations AI-Enhanced Teaching Higher Education