S1E4 Human Presence in Age of AI
===
[00:00:00]
[00:00:10] Hi, and welcome back to the AI for Educators Design Lab podcast. I'm Jennifer Maddrell. In episode one, we looked at how AI is exposing vulnerabilities in our assignments. Then in episode two, we stepped back and examined whether our learning goals still hold up when AI can handle so many of the tasks those goals were originally built around. And in episode three, we dug into AI literacy. I suggested that students need more than tool proficiency. They also need the judgment to evaluate AI output.
[00:00:41] Today I want to focus on the interactions that make learning possible. I want to make the case that learning in an AI integrated environment is shaped not only by how students interact with the technology, but also by the people who design, guide and participate in those interactions. I also want to use this episode to draw on [00:01:00] decades of research on designing technology mediated learning where I see a lot of parallels with teaching and learning with AI.
[00:01:07] I've chosen this focus today because I find that many of the conversations we're having about AI and education are focused on what AI can do. In other words, how it can generate content, give feedback, tutor students, or offer personalized learning pathways.
[00:01:23] While some teachers and schools are resisting AI, a growing number are integrating it in some way. For example, you may have heard about Alpha School, where AI systems play a primary role in core instruction. In these AI driven learning environments, the human role shifts from teacher to guide or coach.
[00:01:43] You may also have followed the coverage of the hype and recent adjustments to Khan Academy's AI chatbot called Khanmigo.
[00:01:50] When Sal Khan introduced Khanmigo at a TED talk back in 2023, he hailed AI as probably the biggest positive transformation that education has ever [00:02:00] seen.
[00:02:00] At the time, he noted Khan Academy was going to do that by giving every student on the planet an AI-based personal tutor. And then every teacher, an amazing AI teaching assistant.
[00:02:11] But here we are now three years later, and those bold predictions haven't fully materialized.
[00:02:16] Instead, Khan Academy found that many students weren't really using the AI tutor as intended, and they compared the offer of AI's help to Brussels sprouts.
[00:02:25] In other words, while the AI tutor was provided as something good for the learners, the students simply didn't want it. And then in recent reporting, some teachers have said it generated more excitement among administrators than it did among teachers.
[00:02:39] And so what concerns me most about stories like this is the way they're framed.
[00:02:44] It's as if we're entering entirely new territory, or that we've never grappled with what happens when technology reshapes those relationships between teachers and learners.
[00:02:53] But we have been here before. Many times.
[00:02:57] Well before the internet, when correspondence [00:03:00] courses became popular as a distance education option, there were genuine concerns about whether students could learn without the teacher in the room.
[00:03:07] When radio and television started educational programming, people wondered if these technologies would replace instructors.
[00:03:14] And then with online learning, we've spent decades researching what happens to learning when interaction is mediated by technology rather than those face-to-face interactions between a teacher and learners.
[00:03:26] And this research base is extensive. For example, if you've spent any time looking into the theory and research about online education, you've probably come across the Community of Inquiry framework.
[00:03:37] Hundreds, if not thousands of studies have examined the role of social presence, teaching presence, and cognitive presence.
[00:03:45] The framework focuses on these three elements as interdependent conditions for deep learning in computer-mediated learning experiences.
[00:03:53] However, much of the theory behind that work was built on earlier distance education research on things like learners' [00:04:00] interaction with teachers, peers, and instructional content.
[00:04:03] It also leaned into findings related to teacher immediacy and overcoming transactional distance, which is the psychological distance between the learner and instructor in these mediated environments.
[00:04:15] So here we are again with a new technology. And unfortunately the conversation about how to design AI integrated learning experiences is proceeding as if none of this research exists.
[00:04:26] And I'm hardly alone in calling out this collective amnesia we often experience when a new technology comes on the scene in education.
[00:04:34] Whenever a new technology emerges, we often fixate on the disruptive potential of the tool itself instead of drawing on what we've already examined about integrating technology into learning experiences.
[00:04:46] It happened with distance education. It happened with online learning, and it's happening now with AI.
[00:04:53] Every new wave of educational technology generates a fresh round of both excitement and anxiety. And [00:05:00] unfortunately, research from the prior wave is largely set aside rather than built upon.
[00:05:05] But this history is worth digging into as it might relate to AI. For example, in my own doctoral research 15 years ago on the Community of Inquiry framework, I found something that I think is relevant to this conversation.
[00:05:18] My findings suggested that social presence, teaching presence, and cognitive presence are all correlated with student satisfaction and perceived learning.
[00:05:27] In other words, students who felt more connected, who experienced stronger instructor presence and who engaged in deeper inquiry also reported better experiences.
[00:05:37] And those findings were consistent with what the broader Community of Inquiry literature has repeatedly found.
[00:05:43] However, when I looked at objective measures of achievement, the picture was more complicated.
[00:05:48] Unfortunately, I didn't find any correlation between the students' perceptions of community and three different teacher-assessed measures of their learning. In other words, students who [00:06:00] reported a stronger sense of community also reported greater satisfaction and perceived learning.
[00:06:06] However, those perceptions did not correlate with the teacher's assessment of their learning.
[00:06:10] These findings suggested that the presence of human interaction had a positive effect on satisfaction, and then by extension, perhaps on persistence.
[00:06:20] However, that feeling of connection was not sufficient to positively influence their actual learning outcomes.
[00:06:27] Instead, it appears that learning requires something beyond the mere presence of interaction.
[00:06:32] My findings pointed in the same direction as decades of other educational research. What appears to matter is how the interactions with teachers, peers, and content are designed.
[00:06:43] So I'm not saying that human connection isn't important. Instead, it means that connection alone isn't enough to produce learning unless that interaction itself is well designed.
[00:06:54] Therefore, much like the Khanmigo example, adding in opportunities for interaction [00:07:00] isn't enough. What matters is the kinds of interactions students are having and how that interaction is structured to support learning.
[00:07:07] And I feel this distinction is highly relevant right now. Because with regard to AI, there's a real risk of falling into that same trap that earlier technology waves fell into.
[00:07:18] This body of research acknowledges that technologies may offer different affordances, but what matters most for learning is how the interactions using those affordances are designed.
[00:07:30] And that's the core of what I want us to sit with today.
[00:07:33] How do we design for the conditions of human interaction in the age of AI that produce real learning?
[00:07:40] How do we avoid the past mistakes of focusing merely on the presence of opportunities for connection versus focusing on what types of interaction strategies support learning?
[00:07:50] And then how do we draw on this past research to inform these design decisions?
[00:07:56] The answers to these questions have real implications for how we design [00:08:00] any learning experience that involves AI.
[00:08:02] So to get us started down this path,
let me walk through five design considerations I think are worth exploring. This list certainly isn't exhaustive.
[00:08:12] Instead, these are what I see as several relevant design questions to get you started thinking about human connection and interaction in your own context and with your own learners.
[00:08:22] The first question is this, where in your current learning experience does human presence most support student learning? In other words, what kinds of human support do students need to feel seen, heard, and known in order to learn? And which of those interactions do you feel are most directly tied to what students actually learn?
[00:08:42] For example, some studies suggest students benefit from human presence at moments of what is called productive struggle. This is when they're working through confusion and need feedback or guidance from someone who can say the right thing at the right time.
[00:08:56] They might benefit from it when receiving feedback that requires [00:09:00] knowledge of their individual context, or their strengths, their history, or even their goals.
[00:09:06] Or they could need it when navigating emotionally significant experiences such as processing a difficult situation or presenting their ideas publicly for the first time.
[00:09:16] Or maybe confronting assumptions that challenge what they thought they knew.
[00:09:21] These are those moments when trust and human connection really seem to matter. A student is more likely to push through difficulty, take intellectual risks, and revise their thinking when they trust the person on the other side of that interaction.
[00:09:35] AI can simulate responsiveness, but it remains to be seen whether AI can build the kind of trust that comes from a human who has real stakes in the student's growth.
[00:09:45] So the design question here is this. Have you mapped where human presence is doing real work in your learning experiences? And are you intentionally protecting those moments when AI is in the mix?
[00:09:57] And moving on, the second design question [00:10:00] follows directly from the first. How are you allocating tasks between AI and humans? How can you use AI to automate tasks in ways that serve the learner and free up more time for the kinds of support humans best provide?
[00:10:13] So to frame this, I want to share a distinction between what I've seen referred to as either backstage tasks or frontstage tasks.
[00:10:22] You can think of backstage tasks as the operational work of teaching. So this could include things like scheduling or generating practice problems. Maybe organizing resources or drafting your rubric.
[00:10:36] These are also likely good candidates for AI support because they help teachers free up their time.
[00:10:41] In contrast, the frontstage tasks are those moments of direct interaction where the educator's knowledge of the student, their own professional judgment, and then their relational presence with a learner all come together.
[00:10:55] So this could be during one-on-one mentoring conversations.
[00:11:00] Or it could be targeted feedback that names what a specific student is doing well or where they need encouragement to push harder.
[00:11:08] It could also be facilitated discussions with a whole class where the instructor reads the room and then makes a judgment call about when to push or when to pause or when to redirect.
[00:11:20] Another useful distinction here is between automation and augmentation.
[00:11:25] Automation helps replace or streamline a task.
[00:11:29] In contrast, augmentation helps a person do the task better.
[00:11:33] And in general, the most thoughtful AI integration I've seen uses AI to buy educators time not to replace their presence.
[00:11:42] So in other words, that's using AI to enable both automation and augmentation in service of furthering human relationships.
[00:11:50] And I think Khanmigo is a useful example again here.
Early on, it sounds like it was designed to be an automated help resource on the side that students [00:12:00] could choose to use.
[00:12:01] However it appears their redesign efforts signal a shift toward how they're using AI.
[00:12:07] It seems like they plan to use AI for interactions that scaffold student learning and augment those frontstage tasks of teaching rather than an option for students to simply seek answers.
[00:12:19] So the design question here is this: For each task in your learning experience, are you using AI to automate a backstage task, or are you using it to augment frontstage tasks in ways that serve both the teacher and the learner?
[00:12:36] Okay, now moving on to the third design consideration.
[00:12:39] I want to now drill down into what teaching presence looks like in an AI supported environment.
[00:12:45] Circling back to the Community of Inquiry framework, teaching presence encompasses three aspects:
1) design and organization, 2) facilitating discourse, and 3) direct instruction.
And traditionally, the [00:13:00] instructor was responsible for all three. A major focus in research on technology supported learning has been the question of what role technology should play in supporting each of these aspects.
[00:13:13] And already AI tools can perform a lot of these teaching presence tasks.
Depending on the AI's capabilities, it might generate content summaries, facilitate practice, structure discussion prompts, or even grade assignments.
[00:13:30] However, as these responsibilities shift to technology, there's certainly a concern that human presence may become less visible to students.
[00:13:39] And this is an area of research that really interests me.
[00:13:42] Our existing Community of Inquiry research on teaching presence suggests it's really a binding element. It binds social presence and cognitive presence together.
[00:13:53] So for example, when a student can see their instructor's thinking, or experience an engaging lesson, [00:14:00] and see their needs addressed as the course progresses, it then can strengthen the entire learning experience.
[00:14:07] So then the nagging question is, what happens when human teaching presence becomes less visible when those teaching interactions are mediated primarily by automated systems? What would be the impact if students lose the sense that a person is present and paying attention to their experience?
[00:14:26] If we look at our prior body of research for potential answers, it signals the need to make deliberate choices about where and how teaching presence remains visible.
[00:14:37] That might look like recording a short personal video at the start of each module, rather than relying on AI generated summaries.
[00:14:45] It might mean adding your own audio commentary to AI generated feedback so that students experience your input alongside the tools. It might mean scheduling a quick synchronous one-on-one [00:15:00] session to share what you've noticed in students' recent work. Something that signals you're paying attention to them specifically, and not just running a system that handles things automatically.
[00:15:11] So the point is that teaching presence in an AI supported environment needs to be designed. And the more AI handles, the more intentional that design needs to be.
[00:15:22] So the design question here is this, if a student walked through your entire learning experience, where would they encounter the teaching presence you intentionally designed?
[00:15:32] Where would they feel that a human was present, paying attention, and invested in their learning?
And now our fourth design consideration shifts the lens to ask this: Who is most affected by the shift away from human connection when AI is introduced? And by extension, how are you ensuring that AI use is inclusive and supportive for all learners?
[00:15:55] In other words, how are you ensuring that AI integration doesn't widen the gap [00:16:00] for students who most depend on human relationships to persist and to succeed?
[00:16:05] And again, the existing theory and research on topics such as belonging or community in educational settings suggests that students who feel they belong, who feel accepted and valued and recognized by their instructors and peers are much more likely to persist and more likely to engage.
[00:16:22] And the learners who are already navigating barriers are often those whose persistence most depends on feeling seen and supported.
[00:16:30] This might include first generation college students, or learners who have struggled or failed in formal learning settings.
[00:16:38] Maybe they're your learners who don't have the necessary digital literacy skills or even prior experience working with technology. Or it could be learners returning to education after many years away.
[00:16:50] So when we use AI to either automate or augment elements of the learning experience, we need to ask whose experiences are most [00:17:00] affected and how? For example, if an AI chatbot replaces the advisor's check-in conversation, that might be sufficient for a confident, well connected student who already knows how to navigate the system.
[00:17:14] But for another student who doesn't feel connected or isn't used to working with technology, swapping out the human to human conversation might influence whether the student persists and stays engaged.
[00:17:26] And that's really the design tension. AI integration has the potential to expand educational access, but the benefits are not always equally distributed.
[00:17:36] So the design question here is this. In your learning experiences, which students depend most on human relationships to persist and succeed?
[00:17:46] And from your experience, what types of interactions are most essential to their success?
[00:17:52] And importantly, does your AI integration plan design those interactions and protect those relationships?
And now [00:18:00] our fifth and final design consideration. It raises questions about how AI integration might be meaningfully different from earlier waves of educational technology.
[00:18:10] As I've noted already, edtech's impact on interaction and presence has long been studied.
[00:18:16] However, there's an important distinction. The research largely focused on how the technology was used to mediate learners' interaction with the content, the teacher, or other learners.
But AI is a different type of technology. When an AI chatbot can remember your name, adapt to your preferences, and respond immediately in ways that feel conversational and warm, the technology then starts to feel less like a neutral channel and more like a participant in the learning experience.
[00:18:46] That difference raises a new design question.
[00:18:50] Does AI introduce a distinct kind of presence in the learning environment? And if it does, what does that mean for how we intentionally design AI [00:19:00] mediated interactions?
If you've spent any time at all using generative AI, you know it's getting very good at responding in ways that feel personal.
[00:19:10] For some learners, especially those who are used to feeling invisible, maybe in a large lecture hall or an asynchronous course, that responsiveness can feel like connection.
[00:19:22] But tied to my other design questions, is AI's responsiveness the same as connection among real people?
[00:19:28] People who have genuine stakes in your growth.
[00:19:31] Historically, belonging in education sounds like an instructor saying things like, "I see what you're trying to do here, and I think you're onto something." Or it could be a peer saying, "that's not how I read it, and here's why."
[00:19:46] It's that productive discomfort of being challenged by someone who knows you well enough to do so.
[00:19:52] So these are those relational experiences that happen between people, whether they meet in person or online. And often they're [00:20:00] strengthened over time.
[00:20:00] So then what happens when some of those relational signals in education come from an AI instead of a person? That could be encouragement, validation, or even a gentle challenge or nudge.
[00:20:14] What might students feel about either the positives or limitations of having AI's presence in their educational experiences?
[00:20:21] For example, they might really appreciate having rapid just in time guidance to get them past a struggle or a deadline.
[00:20:29] Or they may appreciate lowering the social pressure of publicly sharing unfinished ideas.
[00:20:35] However, the AI's presence might also cause frustration.
For example, the learners may feel shortchanged when the feedback is superficial or the guidance doesn't offer the deeper nuance that comes when an instructor has followed a student's progress over a course, not just a single chat session.
[00:20:53] And the implication for design isn't whether AI presence feels like human presence. Instead, it's whether and [00:21:00] how that AI presence matters for learning.
[00:21:03] So as you reflect on your own learning experiences, you can ask yourself this, where if at all, does AI presence already show up? And how does that AI presence contribute to learning?
[00:21:15] And further, how can you design those interactions so that the AI's presence strengthens learning and augments the learner's engagement with both the content and other people?
[00:21:26] So as I wrap things up in this episode, the argument I've been hoping to make today is that AI integration is a design challenge with deeply human dimensions.
[00:21:36] Our existing research on distance and online learning tells us that connection matters and that the quality of interactions within the learning environment matters too.
[00:21:46] And my own research points in the same direction. What matters is how those interactions are designed.
[00:21:52] While AI is clearly a rapidly emerging educational technology, we have decades of research that can help us map the [00:22:00] conditions under which human interaction supports learning.
[00:22:03] So if nothing else, I hope a takeaway from this episode is that we need to build on that knowledge rather than setting it aside and starting over as if none of it happened.
[00:22:12] And to help you think through these design considerations in your own context, I've created a free companion Design Brief for this episode. It includes a worksheet called the Human Presence Design Map.
[00:22:25] I created it to help you identify the non-negotiable human moments in one of your learning experiences. You can then start thinking about where AI might free up some of your time for some of those moments.
[00:22:37] You can find it along with all the Design Briefs in this podcast series at nextpathdesign.com/designbriefs.
[00:22:45] And when you sign up for the free Design Briefs, you get access to the full library.
[00:22:50] The library includes this and all future Design Briefs as they're released.
[00:22:55] Looking ahead to episode five, we're going to tackle the ethical considerations associated with [00:23:00] data privacy, security, and safety in the age of AI. When our students and educators use AI tools, they too often forget they're sharing data. Sometimes a lot of data is shared. And as educators, many of us don't think about what we're asking our students to expose about themselves.
[00:23:17] We're also not thinking about what we might be putting at risk or what our responsibilities are to protect our learners.
[00:23:24] That's the topic we'll be taking up next.
[00:23:27] Thank you so much for thinking through these questions with me.
[00:23:30] Keep an eye out for new episodes, which I release about twice a month on Apple Podcasts, Spotify, YouTube, and on our website at nextpathdesign.com/podcast.
[00:23:43] And most importantly, I really look forward to continuing these conversations about AI and education with you.