In 2022, two professors were drowning in 8,224 student questions. Instead of hiring more staff, they built an AI that's changing elite education.

When Kevin Bryan and Joshua Gans got tired of answering the same MBA student questions repeatedly, they didn't just complain. They built. All Day TA, an AI assistant trained on their actual course materials, became their solution to an age-old academic problem.

The results speak for themselves. In one course with 250 students, the AI handled 8,224 questions, over 30 interactions per student, with projections of 50 per student by semester's end. This isn't theoretical conjecture or wishful thinking. This is real classroom data from the University of Toronto, where innovation meets necessity.

Students could ask questions anytime: during 2am cramming sessions, moments before class, or while puzzling through feedback. No more waiting for office hours or hoping for an email response in time.

The system knew its limits, too. When questions ventured beyond its scope, it escalated to professors rather than guessing or overreaching. No hallucinations. Just intelligent guardrails respecting human judgment.

Student feedback revealed something powerful: students felt more confident asking questions. The AI eliminated that familiar "am I bothering the professor?" hesitation we've all felt when our hand hovers over the send button.

Impact breakdown:
• Faster responses
• Fewer repeat questions
• More consistent answers
• Less teaching fatigue

Now MIT, Berkeley, Princeton, UCLA, and others have piloted All Day TA or are implementing similar systems, recognizing the potential for transformation. This isn't about replacing professors with robots. It's about liberation. By offloading the repetitive inquiries that silently drain teaching energy, professors can teach once, then let technology handle the rest.
They protect their energy for what truly matters: deeper connections, complex concepts, and the meaningful mentorship that machines can never provide. The schools leading the future won't just have smarter students. They'll have freed-up faculty with bandwidth for what matters. And that changes everything for everyone involved in education.
Enhancing Student Feedback with Tech Solutions
Summary
Enhancing student feedback with tech solutions means using digital tools, artificial intelligence, and data systems to provide students with quicker, more personal, and actionable responses about their learning progress. This approach helps teachers manage large workloads, supports critical thinking, and keeps the feedback cycle moving smoothly so students can continuously improve.
- Adopt AI assistants: Integrate AI-driven feedback platforms to allow students to ask questions and receive instant answers, making learning more interactive and accessible any time of day.
- Streamline data workflows: Set up automated systems to track student performance and deliver regular updates, freeing up educators to focus on deeper teaching instead of manual data entry.
- Empower student reflection: Use tech tools to prompt students to review, respond to, and discuss feedback points before meeting with teachers, building confidence and encouraging critical engagement.
Two hundred students. One course. And each one expects, and deserves, personalized feedback. At the University of Passau, teacher education faculty faced this challenge. Imagine reviewing 3,000 reflective entries in a single semester. Without help, it’s an impossible task. Their solution? KI-Folio, an e-portfolio platform enhanced with generative AI. Students write reflections on their learning and experiences; the AI offers instant, tailored suggestions. Later, teachers step in with human nuance, context, and empathy. The result: faster feedback cycles, deeper critical thinking, and no compromise on quality. This week, in EdTech Research Insights, we dive into this case study: > How AI + human feedback can scale personalization without losing pedagogical depth > Lessons from the first deployments > A practical checklist to launch your own AI-supported e-portfolio What’s your take — can AI truly amplify rather than replace formative feedback? 📩 Read a real case study in the latest edition of the Edtech Research Insights newsletter (link in comments)
-
𝗘𝘅𝗽𝗲𝗰𝘁𝗮𝘁𝗶𝗼𝗻: "I'll harness data and AI for smarter decisions" 🌟
𝗥𝗲𝗮𝗹𝗶𝘁𝘆: "My data is a mess" "Data collection is such a manual chore" 😣

Drawing from my experience as a Machine Learning Engineer in Cambridge and a management consultant at McKinsey helping CEOs drive strategic insights, I've seen the power of data. When I founded GuruLab, I was determined to integrate data analytics into our core operations and make that our competitive advantage. In the next 3 posts, I will share a few examples of how our data initiatives drive outcomes and the invaluable lessons we learned.

---

🔍 𝗛𝗼𝘄 𝗚𝘂𝗿𝘂𝗟𝗮𝗯 𝗢𝗽𝘁𝗶𝗺𝗶𝘇𝗲𝗱 𝗣𝗿𝗼𝗴𝗿𝗲𝘀𝘀 𝗥𝗲𝗽𝗼𝗿𝘁𝗶𝗻𝗴 𝗯𝘆 𝟭𝟬𝗫

Traditionally, parents awaited report cards for an update on their child's performance, often just once every six months. By building a streamlined data workflow, we made continuous feedback possible without overwhelming our educators. This allowed us to send progress updates to >500 students easily, with an impressive 80% open rate and each report revisited ~7 times! Here's how we did it:

1️⃣ Automate Backend Metrics
The one-off investment in tech setup enabled us to collect clean, accurate, and continuous data without further manual work. We used existing SaaS tools like PostHog and built bespoke trackers. For example, attendance is tracked by join and leave class clicks, and engagement by patterns of reward points earned. Soon, we will also be able to monitor whether students remain focused in our live classes. No more attendance marking or noting down which students struggled, improved, or enjoyed the lessons.

2️⃣ Integrate Data Collection into Existing Workflows
Let's be real: if your users don't currently do an action, they are not gonna do it just so you can collect data. When we asked tutors to write student feedback, they were slow and reluctant about the additional chore. Instead, we built a seamless process to extract scores from classwork and translate a wall of numbers into parent-facing comments that tutors could easily verify.
With generative #ai on the rise, we are also exploring opportunities in this area 🚀 3️⃣ Collaborative Approach between Tech and Student Success Purely tech-driven data systems risk losing touch with the actual user needs. On the flip side, non-tech teams might not envision the full technological potential (you don't know what you don't know 😉) The report you see below has data automatically pulled and populated - and parents resonated with the content. We achieved this by engaging our developers. We learned how to communicate context clearly and to come prepared with a manual workflow, which forces the non-tech team to think through each step of what they need. --- How does your organisation automate data collection? Would love to exchange notes ☕ Stay tuned next week on how we analyse the data at GuruLab! Make sure to hit that 🔔 on my profile to get notified.
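The score-to-comment step described in 2️⃣ can be sketched in a few lines. This is an illustrative toy, not GuruLab's actual pipeline; the thresholds, comment templates, and function name are all assumptions:

```python
# Illustrative sketch: turn raw classwork scores into a short parent-facing
# comment that a tutor only needs to verify, not write from scratch.
# Thresholds and wording are invented for the example.

def score_to_comment(student: str, scores: list[float]) -> str:
    """Translate a list of classwork scores into a parent-facing note."""
    avg = sum(scores) / len(scores)
    trend = scores[-1] - scores[0]  # crude improvement signal
    if avg >= 80:
        level = "is performing strongly"
    elif avg >= 60:
        level = "is making steady progress"
    else:
        level = "would benefit from extra practice"
    if trend > 0:
        direction = "and has improved over recent classes"
    else:
        direction = "and should focus on consolidating recent topics"
    return f"{student} {level} (avg {avg:.0f}%) {direction}."

print(score_to_comment("Aisha", [55, 62, 71]))
```

A real pipeline would pull the scores from the tracking backend and batch-generate these notes for tutors to review before sending.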
-
One of the ways I'm incorporating #AI in the feedback loop for students in my writing class is to use it as a guide for talking points when they go to the language lab for support. I told students I would be using Brisk Teaching for round 1 (maintaining transparency about when I'm using AI and hopefully leading by example), which creates feedback points based on my rubric and inserts them in a table at the top of their essay. Using Google Docs, I converted the bullet points to checkboxes (though it would be nice if Brisk did this part automatically), so students can go through point by point and show me that they're at the very least looking at the feedback before the next round of writing.

Next, I asked students to highlight one point from each category and use the comment feature to speak to it. This could be any variation of responses:
🔦 Spotlighting an issue they know they need to work on and how they're dealing with it in this paper
🙅♀️ Disagreeing with the AI and explaining why they don't want to make the change it's suggesting
❓ Asking for clarification on how to respond to a point
➕ Etc.

Then, when they go to the lab for help, these highlights and the changes they made become the foundation for the talking points when they work with the professor. One of the biggest challenges when students go to a lab for support is training them to arrive prepared with a specific learning goal rather than simply saying "please check my paper." So the goal here is to have them go in with 5 already acted-upon (or at least considered) points to discuss, making lab time more productive.

The screenshot is a sample I sent to my students to help them understand the concept. I'm sure there will be some fine-tuning, but already many of them are interacting more with their early drafts and even coming to me to make sure they're building good responses to discuss with the professor in the lab.
This still needs more exploration, but to me it's a good way to take advantage of the strengths of AI, continue to challenge students to think critically about what it generates, and wrap it all in a human-centered approach focused on student learning rather than just using a shiny toy for the sake of it. #AIinESL #ArtificialIntelligence #TESOL #TESL #ELT #LanguageLearning #Composition #StudentSuccess
-
Rice & Harvard study: Socratic, Enhancing Human Teamwork via AI-enabled Coaching ....

👉 WHY THIS MATTERS
Students today face a paradox: they have access to more information than ever, yet struggle to retain knowledge, connect ideas, and think critically. Traditional education often prioritizes memorization over understanding, leaving learners unprepared for real-world problem-solving. As co-founder of StudentCentral.ai, I have seen firsthand how this gap undermines academic success, and why solving it requires reimagining how learning tools work. A new collaborative paper from Rice University and Harvard researchers explores this challenge through the lens of adaptive learning systems. Their findings align with our core belief: true mastery comes not from passive consumption, but from active engagement that mimics how humans naturally teach and learn.

👉 WHAT THE RESEARCH REVEALS
The study identifies three key barriers to deep learning:
1. Fragmented Knowledge: Students often memorize isolated facts without grasping underlying principles.
2. Cognitive Overload: Complex subjects become overwhelming when presented as monolithic blocks.
3. Feedback Gaps: Without timely guidance, errors compound and confidence erodes.
The researchers propose a framework where AI systems break concepts into "explainable units," adapt to individual misconceptions, and simulate teaching scenarios. This approach mirrors how people achieve mastery: by simplifying ideas until they can articulate them clearly.

👉 HOW WE'RE APPLYING THIS
At StudentCentral.ai, our AI tutor operates on a similar principle: if you can't teach it, you don't know it. Here's how we translate the research into practice:
1. Deconstruct Complexity
- Break subjects into core components
- Use analogies rooted in a student's existing knowledge
2. Identify Hidden Gaps
- Track patterns in mistakes to pinpoint specific misunderstandings
- Ask targeted questions to test whether knowledge is surface-level or deeply rooted
3. Build Through Dialogue
- Challenge students to explain concepts back in their own words
- Provide incremental feedback, focusing on one improvement at a time
4. Reinforce Through Repetition
- Reintroduce concepts in new contexts to strengthen neural connections
- Schedule practice sessions based on proven memory decay models

The Rice-Harvard paper validates what we've observed: students using systems that force active articulation outperform those relying on passive study methods by 2.3x in long-term retention. This isn't about replacing human educators; it's about scaling their most effective techniques. By combining these insights with our work at StudentCentral.ai, we're moving closer to a world where personalized, concept-driven learning isn't a luxury for the few, but a standard for all. Interested in the technical details? I'm happy to share key excerpts from the paper or discuss how we implement these principles at StudentCentral.ai.
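The "memory decay models" behind scheduling practice sessions are typically spaced-repetition rules: intervals grow after each successful recall and reset after a lapse. A minimal sketch, loosely in the SM-2 family and not necessarily what StudentCentral.ai actually uses:

```python
# Toy spaced-repetition scheduler: grow the review interval on successful
# recall, reset it on a lapse. The ease factor of 2.0 is an illustrative
# assumption, not a tuned parameter.

def next_interval(days: int, recalled: bool, ease: float = 2.0) -> int:
    """Return the next review interval in days."""
    if not recalled:
        return 1                        # lapse: review again tomorrow
    return max(1, round(days * ease))   # success: widen the gap

# A student reviewing one concept over five sessions:
interval = 1
schedule = []
for recalled in [True, True, True, False, True]:
    interval = next_interval(interval, recalled)
    schedule.append(interval)
print(schedule)  # days until the next review after each session
```

The point of the growing gaps is to place each review just before the concept would otherwise be forgotten, which is what makes repetition in new contexts efficient rather than redundant.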
-
🌍 Is AI effective in enhancing students' written response to Poetry? 🌍 That was the bold question our team set out to answer with our Capstone research project. Yesterday, we had the privilege of presenting our findings at the National Institute of Education. Project Overview: 🤖 Why AI Poetica? As educators, we know the struggle of giving personalized, timely feedback to every student, especially in large classrooms. 🤖 Poetry analysis, in particular, requires a deep understanding and nuanced responses—something that can be hard to nurture with limited time. 🤖 This is where the idea of AI Poetica was born: could an AI tool provide the kind of instant, detailed feedback that helps students not just improve, but thrive in their literary analysis? Our Journey: 👾 We started by developing AI Poetica, an AI-powered chatbot designed to assist students in crafting stronger, more analytical responses to unseen poetry. 👾 We had to understand the needs of students, the gaps in traditional teaching and how technology could fill those gaps. 👾 There were moments of doubt—would the AI really understand the subtlety of poetry? Would students trust feedback from a machine? But our pilot studies showed promise, and we forged ahead. Key Insights: 🔍 We observed a notable improvement in students' writing, with an average band increase of +0.33 and a +1.3 mark improvement across the class. 🤖 Students found AI Poetica both useful and easy to use, indicating that AI can effectively supplement traditional teaching methods. 🎯 Our findings suggest that AI Poetica not only enhances students' analytical writing but also boosts their confidence in tackling complex poetry tasks. 👦 Beyond the numbers, what really struck us was the change in the students themselves. 👧 Many reported feeling more confident and motivated, seeing the AI not as a replacement for their teacher, but as an extra layer of support. 🤧 Of course, it wasn’t perfect. 
🤧 Some students didn’t see the same level of improvement, and a few even struggled with the additional feedback, but these challenges only highlighted the potential for further development. Looking Ahead: 🏄♂️ As we reflect on our presentation and the journey that brought us here, one thing is clear: AI Poetica is just the beginning. 🏄♂️ There’s so much potential to refine this tool, perhaps by incorporating more interactive elements. 👥 Presenting at the National Institute of Education was a milestone, but it’s also a reminder of how much more there is to learn and achieve. 👥 We’re incredibly grateful for the opportunity and for the support from our mentors, peers and most importantly, our students, who were willing to embrace this new approach with open minds. If you’re interested in the intersection of AI and education, I’d love to connect and discuss how we can push the boundaries even further. Thank you for reading, and here’s to the future of learning! 🚀
-
If our students passively absorb info, we failed them. They need active, meaningful, enduring learning. We do that by increasing conceptual friction (nod to Jason Gulya). Students need challenges and complexities to increase Critical thinking, problem-solving, deeper understanding. ✅ 𝗧𝗶𝗽𝘀 𝘁𝗼 𝗹𝗲𝘃𝗲𝗿𝗮𝗴𝗲 #AI 𝗳𝗼𝗿 𝗶𝗻𝗰𝗿𝗲𝗮𝘀𝗶𝗻𝗴 𝗰𝗼𝗻𝗰𝗲𝗽𝘁𝘂𝗮𝗹 𝗳𝗿𝗶𝗰𝘁𝗶𝗼𝗻 ➡️ Structured academic controversy Assign students different stances on an issue. Use AI to generate arguments for each side. ➡️ Predict-observe-explain (POE) activities Students predict outcomes, observe results, and explain observations. Use AI to simulate physical phenomena or historical events. Students test predictions and refine their understanding. ➡️ AI-generated prompts for critical thinking Generate complex, open-ended questions. Require students to apply knowledge in new ways. (Use Ruben Hassid Prompt Maker GPT to improve prompts.) ➡️ Interactive simulations and scenarios Create interactive simulations that mimic real-world scenarios. In a physics class, AI can simulate different frictional forces and their effects on motion, allowing students to experiment and observe outcomes in a controlled environment. ➡️ Analyzing AI responses Ask AI to write an essay or solve a problem. Students analyze and critique the AI responses. Identify errors, biases, and areas for improvement. ➡️ AI as a debate partner Use AI to simulate a debate partner. Help students practice argumentation skills. They respond to AI-generated counterarguments in real-time. ➡️ Scaffolded assignments Students use AI tools at different stages of their work. Brainstorm ideas, draft an outline, and refine final product. ➡️ Role-playing and simulations Simulate negotiations or market analysis. Provide a dynamic, interactive learning experience. Students and AI take on different roles in a simulated environment. ➡️ Feedback and revision cycles Provide instant feedback on student work. Encourage multiple revision cycles. 
➡️ Ethical and societal implications Explore ethical and societal implications of decisions. Simulate the impact of different policies on society. ✅ 𝗦𝘁𝗿𝗮𝘁𝗲𝗴𝗶𝗲𝘀 𝗳𝗼𝗿 𝗲𝗳𝗳𝗲𝗰𝘁𝗶𝘃𝗲 𝗶𝗺𝗽𝗹𝗲𝗺𝗲𝗻𝘁𝗮𝘁𝗶𝗼𝗻 ➡️ Co-create expectations With students, define appropriate use and how AI should be cited. ➡️ Encourage reflection After using AI, students reflect on their experiences: How they'll use AI differently in the future. How AI influenced their thinking. What they learned. ➡️ Provide support and resources Tutorials, help sessions, online resources. Explain how to use AI effectively and ethically. ------------------------- Thoughtfully integrate AI into your classroom to ⬆️ conceptual friction. Challenge students. Promote critical thinking. Prepare them for an AI-infused future. ------------------------- ♻️ 𝗿𝗲𝗽𝗼𝘀𝘁 𝘁𝗼 𝘀𝗵𝗮𝗿𝗲 𝘄𝗶𝘁𝗵 𝘆𝗼𝘂𝗿 𝗻𝗲𝘁𝘄𝗼𝗿𝗸 𝘀𝗼 𝘄𝗲 𝗰𝗮𝗻 𝗹𝗲𝗮𝗿𝗻 𝘁𝗼𝗴𝗲𝘁𝗵𝗲𝗿
-
🎙️ Amplifying Learning Through Student Voice with Snorkl In Episode 268 of My EdTech Life, I had a great conversation with Jeff Plourd and Jon Laven, the founders of Snorkl, about their mission to transform education by harnessing the power of students' voices. Snorkl's innovative platform allows students to record verbal explanations of their problem-solving process, such as walking through how they determine the width of a rectangle given its perimeter and length. By capturing students' spoken thoughts, Snorkl creates powerful tools for personalized learning and deeper engagement. Its AI analyzes each student's response and provides timestamped feedback, helping students solidify their understanding and catch their own mistakes. This technology supports students' individual learning journeys and empowers teachers with automatic scoring tools. Discover how Snorkl amplifies learning and transforms math education by giving students a voice.
-
Using AI to grade student assignments is controversial. I know this. I've seen how heated the conversations get in teacher communities. Some say it's the only way to keep up with grading loads that have become unsustainable. Others feel assessment is too personal, too contextual, too central to the teacher-student relationship to hand off to a machine. I'm sympathetic to both sides. But I side with those who embrace AI. Because that's where things are heading. To bury your head in the sand and pretend AI isn't already reshaping how we work is not going to help anyone. Students are using it. Teachers are using it. The question now is how we use it responsibly. We need to leverage the potential of AI in such a way that we keep our judgment at the center. A Gallup survey found that teachers who use AI weekly save an average of 5.9 hours per week. That adds up to roughly six extra weeks over a school year. And 57% of teachers report that AI actually improves their grading and feedback. In this new guide, I'm sharing tools and tips to help you use AI in grading thoughtfully. I cover how to use ChatGPT, Gemini, and Claude for grading. I walk through dedicated platforms like CoGrader, Gradescope, and Brisk Teaching. And I discuss the limitations every teacher should know. #AIinEducation #EdTech #TeacherResources #AIGrading #Assessment
-
Tools in Data Science Sep 2025 edition is live: https://tds.s-anand.net/. Major update: a new AI-Coding section and fresh projects. I teach TDS at the Indian Institute of Technology, Madras as part of the BS in Data Science. Anyone can audit. The course is public. You can read the content and practice assessments. I fed the May 2025 term student feedback into ChatGPT and asked: • 𝘞𝘩𝘢𝘵 𝘢𝘳𝘦 𝘵𝘩𝘦 𝘵𝘰𝘱 𝘯𝘰𝘯-𝘪𝘯𝘵𝘶𝘪𝘵𝘪𝘷𝘦 / 𝘴𝘶𝘳𝘱𝘳𝘪𝘴𝘪𝘯𝘨 𝘪𝘯𝘧𝘦𝘳𝘦𝘯𝘤𝘦𝘴? • 𝘞𝘩𝘢𝘵 𝘢𝘳𝘦 𝘪𝘯𝘵𝘦𝘳𝘦𝘴𝘵𝘪𝘯𝘨 𝘰𝘣𝘴𝘦𝘳𝘷𝘢𝘵𝘪𝘰𝘯𝘴? • 𝘞𝘩𝘢𝘵 𝘢𝘳𝘦 𝘩𝘪𝘨𝘩 𝘪𝘮𝘱𝘢𝘤𝘵 𝘢𝘤𝘵𝘪𝘰𝘯𝘴? Full analysis: https://lnkd.in/gVWVqaxN: summary, outliers, and action ideas. Most students find the course tough (or at least time-consuming), especially the Remote Online Exam (ROE). 𝗦𝘂𝗿𝗽𝗿𝗶𝘀𝗲: students who mentioned ROE time limits rated it 2.61 vs 2.33 (+12%!). Those who felt time pressure also saw more value -- suggesting "desirable difficulty," rather than frustration. A minority even asked for 𝘵𝘰𝘶𝘨𝘩𝘦𝘳 𝘱𝘳𝘰𝘫𝘦𝘤𝘵𝘴. The main actions are faster feedback loops, automated pre-checks, mock ROEs, clear rubrics, etc. But my two takeaways are: • Students value rigor and challenge, even if it makes the course harder. • Using LLMs to analyze student feedback is a force multiplier for instructors.
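The ROE comparison above (2.61 vs 2.33 for students who mentioned time limits) is the kind of slice an LLM, or a few lines of code, can pull from raw feedback. A toy sketch with invented records, just to show the shape of the analysis:

```python
# Compare mean ratings between students whose free-text feedback mentions
# a theme and the overall mean. The records below are invented for
# illustration; the actual analysis fed real feedback into ChatGPT.

feedback = [
    {"rating": 3.0, "text": "The ROE time limit was brutal but I learned a lot"},
    {"rating": 2.5, "text": "ROE time pressure forced me to prepare seriously"},
    {"rating": 2.0, "text": "Too many tools covered too quickly"},
    {"rating": 2.5, "text": "Projects were fun, lectures dense"},
]

def mean_rating(records, keyword=None):
    """Mean rating, optionally restricted to records mentioning keyword."""
    if keyword is not None:
        records = [r for r in records if keyword.lower() in r["text"].lower()]
    ratings = [r["rating"] for r in records]
    return sum(ratings) / len(ratings)

print(f"mentions ROE: {mean_rating(feedback, 'roe'):.2f}  "
      f"overall: {mean_rating(feedback):.2f}")
```

Keyword matching is crude next to an LLM's thematic clustering, but it makes the same point: segment respondents by what they chose to talk about, then compare how they rated the course.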