
Can AI Personas Help Students Learn, or Just Keep Them Talking?

  • Jan 27

By Rick Grammatica (the AI Learning Designer) and Dr Gary F. Fisher (Liverpool School of Tropical Medicine)


AI is already woven into higher education in multiple ways, from automating routine administrative tasks to supporting student services and learning. Within this broad landscape, one strand gaining particular momentum is the use of persona-based chatbots: AI tools designed to adopt a specific persona or role. Early case studies show how universities are experimenting with these AI personas in different contexts, from subject-specific teaching to pastoral support, raising important questions about their potential to shape student learning and engagement.


Image source: Nano Banana Pro (Google Gemini)


The Appeal of Talking to Someone

Millions of young adults worldwide are interacting with AI personas through chatbots like Character.AI, which offer users the opportunity to engage in open-ended conversations with avatars ranging from historical figures to fictional characters. Their popularity reflects an appetite for AI that feels realistic, conversational and tailored to individual needs.


But the very qualities of flexibility, accessibility, and entertainment value that make these tools appealing can sit uneasily with academic goals. A chatbot that simplifies a complex idea too much, or strikes the wrong emotional note, risks undermining learning. The challenge for universities is how to capture the benefits of personalisation and engagement without compromising academic depth.


Research suggests that the most effective educational AI is designed with learning in mind from the start. That means aligning the chatbot’s “voice” with course objectives, offering meaningful interaction, and adapting to varied student needs. Personalisation, familiarity and clarity of purpose matter as much as technological sophistication.


At Nanyang Technological University, Singapore, a custom AI chatbot known as Professor Leodar provided 24/7, personalised learning support. The majority of students responded positively, highlighting its help in reinforcing their understanding and application of course content. We should not assume that all students engage with AI in the same way, however. Other research highlights challenges, such as low perceived ease of use or difficulty formulating effective questions, pointing to the need for AI literacy and support from course tutors.



Encouraging Reflective Dialogue

These opportunities are not just theoretical. Institutions are already experimenting with AI personas in discipline-specific teaching, with encouraging results. At the University of Derby (where both authors worked previously), AI personas are being embedded within Blackboard Ultra (the virtual learning environment, or VLE) to support student learning outcomes in discipline-specific contexts. The VLE’s built-in ‘AI Conversation’ tool allows tutors to assign a specific voice or character to the AI, shaping how it responds to students through carefully crafted prompts.


In one nursing module, the AI takes on the role of Florence Nightingale, offering insights into historical approaches to care and preparing students, in collaboration with their academic tutors, to reflect on how contemporary practices have evolved. In a sustainability in business operations module, the AI assumes the persona of a small business owner. Students consult with it as if advising a real client, using the exchange to test their ideas and apply their learning to practical sustainability challenges within a controlled, yet agile, environment. Meanwhile, within a forensic psychology module examining the psychological processes underpinning terrorism, the AI persona takes on the role of a critical friend who invites students to interrogate and refine their definition of terrorism, preparing them for a more in-depth discussion with their tutor.


Research supports this approach. In one study of undergraduate statistics students, those who engaged with a Socratic chatbot achieved stronger learning outcomes when they engaged more frequently in reflective and evaluative activities, rather than merely searching for answers or checking their understanding. These findings suggest that chatbot design should prioritise deeper, more thoughtful interaction over transactional efficiency alone.



The Case for Caution

Universities must balance innovation with care. A recent Jisc report into student perceptions of AI in higher education reveals a mixed picture of curiosity tempered by uncertainty. While students value quick support and clarification, they also express doubts about accuracy, bias and whether they are “allowed” to rely on such tools without clearer institutional guidance.


Transparency matters. Students want to know who has created a chatbot, how it was trained, and what happens to their data. Without clarity, even well-intentioned tools can feel impersonal or alienating.


Institutions therefore need to go beyond the technology itself. Thoughtful communication, clear boundaries and ongoing dialogue with students are as important as the chatbot’s design. When implemented carefully, AI personas can stimulate critical thinking, build confidence and enrich learning. But their success depends on transparency and purpose. The most effective AI personas will not be those that mimic reality, but those that push students to think harder, question more deeply and engage with course materials in meaningful ways.



References

Bilquise, G., Ibrahim, S. & Salhieh, S.M. (2024). Investigating student acceptance of an academic advising chatbot in higher education institutions. Education and Information Technologies, 29, 6357–6382. https://doi.org/10.1007/s10639-023-12076-x


Jisc (2025). Student perceptions of AI 2025. https://www.jisc.ac.uk/reports/student-perceptions-of-ai-2025


Lai, J. W., Qiu, W., Thway, M., Zhang, L., Jamil, N. B., Su, C. L., Ng, S. S. H., & Lim, F. S. (2025). Leveraging Process-Action Epistemic Network Analysis to Illuminate Student Self-Regulated Learning with a Socratic Chatbot. Journal of Learning Analytics, 12(1), 32–49. https://doi.org/10.18608/jla.2025.8549


Sánchez-Vera, F. (2025). Subject-Specialized Chatbot in Higher Education as a Tutor for Autonomous Exam Preparation: Analysis of the Impact on Academic Performance and Students’ Perception of Its Usefulness. Education Sciences, 15(1), 26. https://doi.org/10.3390/educsci15010026


Thway, M., Recatala-Gomez, J., Lim, F. S., Hippalgaonkar, K., & Ng, L. W. T. (2024). Battling Botpoop using GenAI for Higher Education: A Study of a Retrieval Augmented Generation Chatbot’s Impact on Learning. arXiv. https://doi.org/10.48550/arXiv.2406.07796



ChatGPT was used to help edit this article. Nano Banana Pro (Google Gemini) was used to create the image.

