This post is part of our [work in] Progress series, an effort to share our preliminary findings on the impact of artificial intelligence in higher education to help the field move at the pace of technology.
Executive Summary
Providing increasingly personalized learning pathways is a prized goal in education, particularly in online programs that leverage personalized pacing and learn-anywhere models to help busy students fit learning into their lives and progress as quickly as possible. Technology has long been a powerful tool for personalization, and recent developments in AI and large language models have given us even more tools to explore.
Kyron Learning is an education start-up that is rapidly building new learning solutions that use technology to provide more individualized and interactive experiences. The company offers adaptive video-based lessons with personalized feedback intended to enhance student engagement and improve course outcomes.
A small note: We were thrilled to partner with Kyron on this project and appreciate their commitment to learning and research. We also want to acknowledge that Kyron's tools have evolved rapidly. Recent advancements have added even more interactive capabilities beyond what we tested in this project.
In collaboration with WGU’s School of Technology, we examined the impact of Kyron’s AI-assisted learning platform on student outcomes in the Web Development Foundations course. In doing so, this research study explored how Kyron Learning’s AI-driven personalization approach might improve student experience and outcomes in a real online learning environment. This effort represents one of the first attempts to integrate AI-assisted learning support into WGU’s academic programs. The teams at WGU Labs, the WGU School of Technology, and Kyron Learning each approached the implementation as an exploratory study, hoping to better understand how students engage and interact with AI-assisted learning technologies.
One of our main findings was that only a fraction of students who were invited to use Kyron Learning actually used it, likely because it was not required and it was not possible to fully integrate the technology into their course website. We also did not find measurable improvements in overall academic outcomes such as assessment pass rates or time to course completion. Despite this, those who did engage with the platform reported positive learning experiences. Our key findings were:
- Most students did not opt into using the supplemental AI learning tool: Less than half of the students who consented to participate in the study engaged with the Kyron Learning platform.
- Students who used the AI learning tool liked it: Students who completed lessons through Kyron Learning reported favorable experiences with the platform, highlighting its potential to enhance learning, particularly through interactive video-based feedback.
- The AI learning tool did not improve student outcomes: Despite the positive reception from engaged students, there were no significant differences in key academic metrics such as assessment attempt rates, pass rates, or time to course completion when comparing students who used Kyron Learning and those who did not.
- The limited engagement and lack of impact on outcomes may be due to our deployment strategies rather than the tool’s inherent potential: Factors such as the lack of full integration of the lessons into the course structure, the absence of an AI assistant to support students, and limited variation in video content may have constrained Kyron Learning’s effectiveness in this context.
Introduction: A New Frontier for Learning Resources
Artificial intelligence (AI) offers new opportunities to transform education by creating more personalized and effective learning experiences for diverse learners. WGU Labs is committed to exploring the potential of AI to change the landscape of higher education. As part of this broader mission, the current project empirically assessed the impact of Kyron Learning, an AI-assisted learning tool, on the academic outcomes of School of Technology students at Western Governors University (WGU) and on their perceptions of faculty, course materials, and technology in education. More broadly, this research contributed to our understanding of how students engage with and benefit from AI learning tools.
Kyron Learning uses a set of short videos created by the instructor to build adaptive branching scenarios in which students see specific video sequences depending on how they interact with the platform. This is made possible by an AI engine that listens to the conversation, such as the Q&A between an instructor in a pre-recorded video and a student, and seamlessly stitches together pre-recorded video segments to create a natural, conversational flow. When done well, the lesson resembles a live one-on-one video conference with the instructor. This contrasts with traditional online learning environments, where students often do not receive personalized instruction or real-time feedback due to high student-to-teacher ratios and limited time and resources. Kyron’s AI-assisted learning platform seeks to address these issues.
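To make the branching idea concrete, below is a minimal sketch of how such a lesson might be represented as a graph of pre-recorded segments, with the next segment chosen by classifying a student’s answer. Everything here, including the segment names, the keyword-based answer classifier, and the lesson structure, is our own illustrative assumption rather than Kyron Learning’s actual implementation.

```python
# Illustrative sketch of an adaptive branching video lesson (not Kyron's actual design).
# Each node is a pre-recorded segment; edges are chosen by classifying the student's answer.
from dataclasses import dataclass, field

@dataclass
class Segment:
    video_url: str                                 # hypothetical location of the pre-recorded clip
    prompt: str                                    # question the instructor asks at the end of the clip
    branches: dict = field(default_factory=dict)   # answer label -> next segment id

LESSON = {
    "intro": Segment("intro.mp4", "What does the <p> tag do?",
                     {"correct": "forms", "incorrect": "tags_review"}),
    "tags_review": Segment("tags_review.mp4", "Ready to try again?",
                           {"correct": "forms", "incorrect": "tags_review"}),
    "forms": Segment("forms.mp4", "", {}),  # terminal segment
}

def classify_answer(answer):
    """Stand-in for the AI engine: a naive keyword check, purely for illustration."""
    return "correct" if "paragraph" in answer.lower() else "incorrect"

def next_segment(current_id, student_answer):
    """Return the id of the next video segment, or None when the lesson is finished."""
    seg = LESSON[current_id]
    if not seg.branches:
        return None
    return seg.branches[classify_answer(student_answer)]

# Example: a student answers the intro question incorrectly, then correctly.
print(next_segment("intro", "it makes text bold"))          # -> "tags_review"
print(next_segment("tags_review", "it marks a paragraph"))  # -> "forms"
```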
Overall, our findings showed that only a fraction of students invited to use Kyron Learning engaged with it, likely due to the integration and support limitations described below. Among students who did use the platform, feedback was favorable, with many highlighting the interactive video-based feedback as a helpful addition to their learning experience. Despite these positive experiences, no significant differences in key academic metrics, such as assessment attempt rates, pass rates, or time to course completion, were observed between students who used Kyron Learning and those who did not.
This initial pilot offered valuable early insights into how AI-assisted learning tools may impact student learning and outcomes by examining Kyron Learning in a real online learning environment. However, certain structural and implementation limitations were noted, which may have restricted the platform’s overall impact. For example, the AI-assisted tool was not fully integrated into the course website, requiring students to access it separately, and, as an optional tool, it saw low engagement. Additionally, the absence of an AI assistant for ongoing support and limited video content customization may have further constrained student interaction and the tool’s effectiveness. While the pilot had its limitations, it provided clear directions for refining future implementations of such AI-powered education technologies.
Research Conducted
We conducted a randomized controlled trial (RCT) that allowed us to compare the impact of the Kyron Learning platform against a business-as-usual group of non-participating students. We also created a version of Kyron Learning that eliminated the dynamic branching of pre-recorded videos and instead presented the information contained in the videos in a more static format. This allowed us to assess whether any impact of Kyron Learning on academic outcomes derived from its interactive, AI-driven personalization or simply from the presentation of the information itself.
Between April 29th and July 9th, we recruited all students enrolled in the Web Development Foundations (D276) course in the School of Technology at WGU by sending them an email invitation to participate. In total, 1,374 students received an invitation, and 235 consented to participate and were given access to Kyron in one of two experimental conditions. Among the 235 participants, 121 were randomly assigned to the Experimental condition, in which students watched instructor-developed, pre-recorded instructional videos and interacted with short pre-recorded clips of anticipated questions and their answers. The Experimental condition offered dynamic branching sequencing of the videos based on how students answered questions posed by the instructor in the video clips. The 114 students assigned to the Control condition watched the same videos but in a predetermined order (see Figure 1 for an illustration). Students were not required to use Kyron Learning as part of their course curriculum; they accessed the platform via a web link emailed to them after they consented to participate in the study.
Both versions of the course were hosted on the Kyron Learning platform and consisted of the following five lessons:
- HTML Basics
- HTML Forms
- CSS Basics
- CSS Advanced
- XML, Media, and Devtools
Each lesson built upon the previous one, but students were free to use the Kyron Learning platform however they liked. At the end of each lesson, we asked the following survey questions on a scale of 1-7:
- Please indicate your agreement with the following statement: The integration of cutting-edge technology in modern education is essential. (1 – Strongly disagree; 7 – Strongly agree)
- Please indicate your agreement with the following statement: My course instructor effectively facilitated my understanding and engagement with the course material. (1 – Strongly disagree; 7 – Strongly agree)
- How interested are you in seeing Kyron Learning integrated into other courses you are taking or plan to take? (1 – Not at all interested; 7 – Very interested)
- How effective do you believe the lesson is in facilitating your learning for your course? (1 – Not at all effective; 7 – Very effective)
We also asked an open-ended question inviting students to describe their overall experience with the lesson they had just completed on the platform. These questions served as short-term outcome measures of Kyron Learning’s success, as they were designed to assess whether using Kyron Learning affected students’ perceptions of faculty, course materials, and technology in education. In addition to these short-term survey measures, we assessed longer-term, more critical measures of success: attempt and pass rates on the course’s objective assessment, as well as the number of days it took students to complete the course. Throughout the four-week period during which we assessed students’ experiences with Kyron Learning, students in both conditions received weekly reminder emails encouraging them to engage with Kyron Learning, along with links to access the platform.
Key Findings
Finding 1: Most students did not opt into using the supplemental AI learning tool
After using multiple outreach channels to encourage students to use Kyron Learning, we found that just under half of participating students meaningfully engaged with the platform. Our recruitment email presented participation in the study as an opportunity for students to contribute to educational innovation and enhance their chances of passing the objective assessment. We also engaged various individuals, such as the course instructor, the dean of the School of Technology, and program mentors, to encourage student participation. Despite these efforts, we did not see a clear improvement in student engagement. It is worth noting that, on average, WGU students open emails from WGU at a rate of 33% within seven days, so delivering access to Kyron Learning via email likely limited its uptake compared to a tool seamlessly integrated within the course. Notably, the largest reduction in potential participants (an 83% decrease) resulted from this delivery method.
Specifically, of the 1,374 students who received recruitment emails, only 17.1% (235) consented to participate in the study (Figure 2). Of these participants, 88.09% (207) logged into Kyron Learning at least once. There was no statistical difference between the Experimental (86.78%) and Control (89.47%) conditions in the likelihood of logging into Kyron Learning.
However, because some students may have logged into the platform without ever meaningfully engaging with the lessons, we also examined whether participants completed at least one lesson. Far fewer did: only 45.41% (94) of those who logged into Kyron at least once completed at least one lesson, meaning 54.59% (113) of students who logged in did not complete a single lesson (Figure 2).
Interestingly, there was a statistically significant difference in the proportion of students who completed a lesson with Kyron Learning between the Experimental (37.14%) and Control (53.92%) participants, χ²(1) = 5.22, p = .02 (Figure 3). That is, participants in the Control condition, who watched video lessons in a non-adaptive, predetermined order, were more likely to complete a lesson than their counterparts in the Experimental condition, who watched video lessons in adaptive, dynamically branching orders. This effect held even when we controlled for students’ momentum indicator, which uses a statistical model to predict student outcomes (e.g., course completion rates) in a given term, indicating that pre-existing differences in students’ overall engagement in their coursework did not explain the significantly lower completion rate among Experimental participants compared to Control participants.
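For readers interested in how a comparison like this is computed, the sketch below reproduces the chi-square test of lesson completion by condition with scipy. The cell counts are inferred from the reported percentages (roughly 39 of 105 Experimental and 55 of 102 Control participants who logged in completed a lesson) and are shown only to illustrate the calculation.

```python
# Chi-square test of lesson completion by condition, using counts inferred from the
# reported percentages (assumption: 39/105 Experimental vs. 55/102 Control completers).
from scipy.stats import chi2_contingency

table = [
    [39, 105 - 39],   # Experimental: completed a lesson, did not
    [55, 102 - 55],   # Control: completed a lesson, did not
]

chi2, p, dof, expected = chi2_contingency(table)  # Yates correction applied for 2x2 tables
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3f}")   # approximately chi2(1) = 5.22, p = .022
```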
Given prior findings that older online students prefer more passive learning experiences—such as video lectures—over interactive activities, which tend to be more time-consuming, it is unsurprising that more students completed a lesson in the Control condition, which involved non-interactive videos. While dynamic branching may require more time and effort from students, there is no direct evidence from this study that it negatively impacts student experience. Instead, this highlights an opportunity to further explore how Kyron Learning’s interactive elements could be refined to better fit the time constraints and learning preferences of students. Iterating on these features while gathering real-time student feedback could help improve both engagement and learning outcomes in future implementations of Kyron Learning.
Next, we examined the number of lessons completed by students who completed at least one lesson on Kyron Learning. Overall, 44 students (44.68%) completed just one lesson, 17 (18.09%) completed two, 13 (13.83%) completed three, 10 (10.64%) completed four, and 12 (12.77%) completed all five (Figure 4). On average, students completed 2.29 lessons (SD = 1.45), with no significant difference in lessons completed between Experimental (2.38) and Control (2.22) participants.
Our findings indicate that, overall, students did not engage much with Kyron Learning, but they also point to several opportunities for future pilots. One possible reason for the low engagement rate may be that our message framing—encouraging students to use the platform “to improve their chances of passing the assessment”—was too abstract to motivate students effectively. To increase engagement, we suggest the following strategies:
- Provide more concrete messages: Rather than an abstract promise to improve their chances, a more concrete message (e.g., “increase your chance of success by XX%”) may increase the likelihood of student participation.
- Supplement with nudges and targeted messages: Instead of generic weekly reminders, implementing more personalized messaging and nudging that take into account how students interact with the platform could encourage more consistent engagement with Kyron Learning.
- Tap behavioral interventions: Leveraging well-established behavioral strategies, such as implementation intentions facilitated by the course instructor or program mentor, may help students overcome initial hesitancy and engage more actively.
- Consider platform integration: Integration of Kyron Learning directly into the course website, rather than relying solely on email outreach, may further streamline access and encourage participation.
Finding 2: Students who used the AI-assisted learning tool liked it
Although most students did not engage with Kyron Learning, those who did reported positive experiences. Among those who completed at least one Kyron Learning lesson, most agreed that the integration of cutting-edge technology in modern education is essential (mean = 6.15, SD = 1.06; on a scale of 1-Strongly disagree to 7-Strongly agree). Additionally, students, on the whole, agreed that their course instructor effectively facilitated their understanding and engagement with the course material (mean = 5.73, SD = 1.15). Similarly, students who completed at least one lesson reported that they believed the lessons were effective in facilitating their learning for the course (mean = 5.71, SD = 1.15 on a scale of 1-Not at all effective to 7-Very effective). Lastly, students showed strong interest in seeing Kyron Learning integrated into other courses they were taking or planned to take (mean = 5.72; SD =1.39 on a scale of 1-Not at all interested to 7-Very interested). No statistically significant difference emerged between the Experimental and Control conditions in any of these four measures.
Next, because we administered the same four-question survey after each lesson, we explored whether the responses would differ depending on how many lessons students completed. All four responses became more positive (i.e., higher numbers) as students took more lessons (Figure 5). This finding may indicate that repeated engagement with Kyron Learning reinforced positive experiences even further or that those who reacted more positively to Kyron Learning tended to engage with Kyron Learning more.
Finally, to identify differences in themes and sentiment, we examined students’ open-ended responses to the question asking them to describe their overall experience with the lesson they had just completed on the platform. Text responses from both conditions were tokenized, meaning they were broken down into individual words. We then removed common filler and stop words (e.g., ‘the,’ ‘and,’ ‘to’) that do not contribute to the analysis. After this cleaning process, we analyzed the remaining words to compare their frequencies and identify thematic differences between the Experimental and Control conditions. A simple sentiment analysis was performed by assigning +1 for each occurrence of a positive word (e.g., “good,” “great,” “excellent,” “helpful,” “clear,” and “effective”) and -1 for each occurrence of a negative word (e.g., “bad,” “confusing,” “difficult,” “unclear,” and “frustrating”). A sentiment score was calculated for each response, and the average sentiment score for each condition was computed.
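To make the scoring scheme concrete, the sketch below implements the kind of tokenization, stop-word removal, and +1/-1 word counting described above. The stop-word and sentiment word lists are abbreviated illustrations, not the full lists used in the analysis.

```python
# Minimal sketch of the word-count sentiment scoring described above.
# The stop-word and sentiment word lists here are abbreviated examples, not the full lists used.
import re
from statistics import mean

STOP_WORDS = {"the", "and", "to", "a", "of", "it", "is", "was", "i"}
POSITIVE = {"good", "great", "excellent", "helpful", "clear", "effective"}
NEGATIVE = {"bad", "confusing", "difficult", "unclear", "frustrating"}

def tokenize(text):
    """Lowercase the response, split it into words, and drop stop words."""
    words = re.findall(r"[a-z']+", text.lower())
    return [w for w in words if w not in STOP_WORDS]

def sentiment_score(text):
    """Score a response: +1 per positive word occurrence, -1 per negative word occurrence."""
    tokens = tokenize(text)
    return sum(1 for w in tokens if w in POSITIVE) - sum(1 for w in tokens if w in NEGATIVE)

# Example: average sentiment for a hypothetical set of responses in one condition.
responses = [
    "The questions were helpful and the feedback was clear.",
    "A bit confusing at first, but overall a good lesson.",
]
print(mean(sentiment_score(r) for r in responses))  # -> 1.0
```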
We found that the Experimental condition had an average sentiment score of 0.4, indicating slightly positive feedback overall. The Control condition had an average sentiment score of 0.53. The two scores did not statistically differ.
Additionally, there were some qualitative differences between conditions in terms of the most frequently used words, reflecting the design differences between the two conditions (Table 1). For example, the word “question” was one of the most frequently used words to describe the Experimental condition, whereas the word “video” was one of the most frequently used words to describe the Control condition. We also highlight these qualitative differences with examples of student feedback on Kyron Learning in Table 2.
These results suggest that, while most students did not sign on to use Kyron Learning, those who did found it to be a positive addition to their learning experience. The qualitative feedback indicates distinct preferences between the Experimental and Control conditions, reflecting different aspects of the learning experience. Students in the Experimental condition valued interactive Q&A elements, while those in the Control condition appreciated the clarity and structure of video explanations. Although we did not observe significant differences between conditions, it is clear from the qualitative feedback that those who meaningfully engaged with Kyron Learning saw potential benefits. These findings suggest that Kyron’s personalization features may be effective when fully integrated with course structures. Future research could explore ways to maximize the benefits of Kyron Learning’s personalization features. Given the low overall engagement and the small sample of students who completed lessons and provided survey responses, it is possible that our study simply lacked the statistical power necessary to detect meaningful differences.
Moving forward, we recommend focusing on identifying and refining specific features of Kyron Learning that have the potential to enhance student engagement and outcomes. Future research should aim to investigate these mechanisms more closely, using a larger sample and ensuring that the platform’s features are fully integrated into the course structure to encourage higher participation.
Finding 3: The AI learning tool did not improve student outcomes
Finally, we evaluated whether students who engaged with the AI-assisted resources had higher attempt and pass rates on the objective assessment, or a shorter time to course completion, than students in the Control condition and non-participating students. We found no significant differences across the three groups in assessment attempt rates, assessment pass rates, or time to course completion.
First, we found that 996 of 1,346 students (73.99%) who started the Web Development Foundations course attempted the assessment at least once. The attempt rates did not differ significantly among those in the Experimental condition (72.28%; 76 out of 105), those in the Control condition (81.37%; 83 out of 102), and those who did not participate in the study (73.49%; 837 out of 1,139).
Next, among those who attempted the assessment at least once, 888 out of 996 students (89.16%) passed it. Again, the pass rates did not differ significantly among groups: the Experimental condition (82.89%; 63 out of 76), the Control condition (91.57%; 76 out of 83), and the non-participating students (89.48%; 749 out of 837).
Lastly, we found that those who passed the objective assessment and completed the course did so in 35.69 days on average (SD = 25.86). Similar to the previous two metrics, the days to complete the course did not differ among groups: the Experimental condition (mean = 37.86, SD = 25.02), the Control condition (mean = 39.93, SD = 24.63), and the non-participating students (mean = 35.09, SD = 26.04).
Additionally, we explored the effects of students’ momentum indicator on these objective outcomes and whether it interacted with the effects of Kyron Learning. The momentum indicator significantly predicted all three outcome measures: students with higher momentum scores were more likely to attempt and pass the objective assessment and did so in fewer days (Figure 6). However, Kyron Learning did not alter these relationships; neither the Experimental nor the Control condition affected outcomes for students with either low or high momentum scores.
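As an illustration of how such an interaction can be tested, the sketch below fits a logistic regression of assessment passing on study condition, the momentum indicator, and their interaction using statsmodels. The data are simulated and the column names are hypothetical; the point is only the model structure, not our actual analysis pipeline.

```python
# Illustrative interaction test: does the momentum indicator moderate the effect of condition?
# Simulated data with hypothetical column names, for illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "condition": rng.choice(["experimental", "control", "none"], size=n),
    "momentum": rng.uniform(0, 1, size=n),
})
# Simulate passing as a function of momentum only (no condition effect), for illustration.
p_pass = 1 / (1 + np.exp(-(-1 + 3 * df["momentum"])))
df["passed"] = rng.binomial(1, p_pass)

# Logistic regression with main effects of condition and momentum, plus their interaction.
model = smf.logit("passed ~ C(condition) * momentum", data=df).fit(disp=False)
print(model.summary())
```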
It is important to note, however, that this study may have lacked the statistical power necessary to detect meaningful differences in these outcomes due to the low overall engagement and the limited sample size of those who completed the lessons. It will be important for future pilots to improve the communication and engagement limitations noted above.
Future studies could focus on targeted pilot programs that address both the integration of AI-powered educational technologies like Kyron Learning with course content and the effects of personalized support for students at different levels of engagement. By refining deployment strategies and understanding how students interact with the platform, we can better assess its potential impact on student outcomes.
Next Steps
AI has the potential to significantly improve education by enabling more personalized and engaging learning experiences. The current study on Kyron Learning aligns with WGU Labs' recent efforts to investigate how such AI tools can enhance the student learning experience. The insights gained from this research help us understand both the strengths and limitations of current AI applications in educational settings, as well as guide our approach for the next phase of research. Despite the positive feedback from students who engaged with Kyron Learning, overall engagement was notably low. Additionally, there were no significant differences in the objective measures of student performance between those who used Kyron Learning and those who did not.
The findings offer valuable opportunities to refine both the platform’s interactive capabilities and the way it integrates into existing course structures to better support engagement. For example, making the platform easier to use within Learning Management Systems (LMS) and providing seamless access for both students and instructors would help ensure that all students know about the resource and have a frictionless way to use it. Making the tool required within courses, embedding it in the curriculum to ensure consistent usage, or offering it as a supplemental resource that enhances, rather than disrupts, the existing course flow could all boost engagement. Instead of relying on active student participation in research studies, we could also use historical data to evaluate the impact of Kyron Learning on student outcomes. This approach would allow us to assess the effectiveness of Kyron Learning with larger sample sizes, circumventing many of the limitations observed in the current study.
Building on the results of this study, a follow-up investigation will explore whether enhancing Kyron Learning with a generative AI chatbot can increase student engagement and personalization. By integrating AI more deeply into the course structure, we hope to address the specific issues identified in this experiment. The generative AI chatbot is expected to offer more dynamic and tailored responses, enhancing the interactivity and personalization of the learning experience. While this approach may lack the “human-like” interaction of video clips, we expect it to improve engagement by providing real-time, personalized feedback.
By addressing these key areas, our next phase of work with Kyron Learning will more rigorously evaluate its functionality and effectiveness, with the goal of improving WGU’s key results of return for graduates, personalized on-time completion, and equitable access and attainment. This research has not only uncovered insights into Kyron Learning’s impact but also highlighted ways to enhance its implementation and boost student engagement. Understanding these factors will allow for more targeted improvements, ensuring the platform better supports student success. Additionally, this work is part of WGU Labs’ broader mission to explore and understand how AI can transform higher education, particularly in developing effective mentoring and personalized learning tools that meet the diverse needs of students.