The ways artificial intelligence will change how we learn, work, and live are still unfolding. At WGU Labs, we’re largely enthusiastic about advancements in technology. Yet we also apply a healthy dose of skepticism to promises of tech-driven transformation. In AI, we see the potential for tools to help learners succeed and scale solutions, while simultaneously recognizing the ways they could uphold historic and current systemic inequities. It will be vital for administrators and EdTech founders to understand these nuances as they adopt and build AI-based solutions. If they don’t, we risk leaving underserved learners and under-resourced institutions behind once again.
Higher education continues to face immense and intricate challenges, or as we call them, wicked problems:
- The paths to postsecondary learning experiences are limited for people who don’t fit the historic student profile.
- One-size-fits-all classrooms, high tuition costs, and a lack of clear information are driving students to disengage and drop out.
- The transition from learning to work is also unclear, muddying the return on investment of postsecondary experiences.
AI has the potential to help postsecondary education solve these wicked problems. But turning theory into reality requires an assessment of how AI truly helps and hurts each wicked problem.
Wicked Problem: Access to Postsecondary Learning
How AI Improves Access
Many people find the routes and entry points to postsecondary education confusing, hidden, or out of reach. Higher education needs better ways to meet learners and potential learners where they are and guide them through paths to success. AI can help remove barriers by personalizing the journey.
Even before the ChatGPT explosion, institutions were experimenting with AI to help students navigate the postsecondary system. Chatbots have been used to help guide students by providing basic information about admission requirements, program and course options, tuition costs, and deadlines for financial aid and scholarships.
AI can also help smaller institutions with fewer resources compete by tapping into capabilities that, until now, have been accessible only to larger institutions. For example, well-resourced institutions have been using data to identify students who are at risk of falling behind or dropping out so they can intervene with resources and support. These systems require substantial financial resources and skilled talent to manage effectively. AI can unlock the large amounts of data under-resourced institutions already hold and improve touchpoints like academic support, career counseling, and emergency aid.
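To make the early-alert idea above concrete, here is a minimal sketch of how an institution might score dropout risk from records it already holds. The file name, column names, and risk threshold are hypothetical, and any real deployment would need careful validation and bias auditing, as the next section discusses.

```python
# Hypothetical early-alert sketch: predict withdrawal risk from existing records.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Hypothetical student records an institution might already hold
# (LMS activity, grades, enrollment data).
df = pd.read_csv("student_records.csv")
features = ["logins_last_30d", "assignments_submitted", "current_gpa", "credits_attempted"]
X, y = df[features], df["withdrew"]  # withdrew: 1 = later withdrew, 0 = persisted

# Hold out a test set and fit a simple logistic regression.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("Holdout AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

# Flag students whose predicted risk crosses a (hypothetical) threshold for outreach.
df["risk_score"] = model.predict_proba(X)[:, 1]
at_risk = df[df["risk_score"] > 0.7]
print(f"{len(at_risk)} students flagged for proactive advising")
```

The point of a model like this is not the prediction itself but the intervention it triggers: flagged students are routed to advising, emergency aid, or academic support rather than simply labeled.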
How AI Hurts Access
Relying completely on chatbots to provide information to students is risky. AI models are known to produce inaccurate or misleading information, sometimes referred to as hallucinations. Estimated hallucination rates range from as little as 3% to as much as 27% of responses. For a chatbot advising prospective or enrolled learners, that is a significant margin of error: even a low-end rate means a meaningful share of students could receive wrong answers about admissions, costs, or deadlines. Chatbot hallucinations may also expose institutions to financial and brand repercussions, including a greater risk of lawsuits.
Then there is the problem of bias. Institutions have employed big data sets and predictive analytics to better guide learners in enrolling in and completing programs. However, reports demonstrate that these models, as well as newer AI tools, carry built-in biases. Without accounting for the biases embedded in AI or the lived experiences of the learners who need support, the numbers can yield inaccurate assumptions and unintended harm. As institutions look to use AI to provide better wayfinding for students, it will be important to take learner-centered approaches informed directly by the people the technology intends to serve.
Wicked Problem: The Learning Experience
How AI Personalizes Learning
Learner-centered AI solutions have the greatest potential to democratize education and give students the personalized experience they deserve.
Students are already using AI tools like ChatGPT — and mostly not to write essays or engage in other academic misconduct. Last year’s Student EdTech Survey from the College Innovation Network (CIN) at WGU Labs found that students most often use ChatGPT to simplify complex topics, brainstorm creative ideas, and conduct research. Still, about 31% of ChatGPT users in our sample reported using the tool to write homework responses or discussion points, and 23% reported using it to write papers and exam responses.
Institutions still need clear policies on how AI is used in learning and by learners, but it’s time to shift the conversation from prohibiting AI to exploring how students can use it to expand their thinking. Some faculty members are already taking these steps, using ChatGPT to inspire new ideas when students get stuck and to teach AI literacy skills.
Testing AI’s Impact
As part of our Solutions Lab, we are conducting two research projects with Western Governors University (WGU) that aim to better understand how AI tools can be designed and used to support learning.
The first project is a randomized controlled trial that empirically assesses the impact of an AI-assisted learning platform on students’ academic outcomes, as well as on their perceptions of the course and its faculty, compared to traditional learning methods. Students in the treatment group receive additional support videos with interventions personalized to their progress. The pilot measures the platform’s impact on academic performance, student perceptions of EdTech and faculty, and students’ ability to persist through challenging topics. We are currently analyzing the results from the pilot and will share findings soon.
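For readers curious how treatment and control outcomes in a trial like this are typically compared, here is an illustrative sketch. The file name, group labels, and outcome variable are hypothetical, and the actual analysis plan for the pilot may differ.

```python
# Illustrative comparison of treatment vs. control outcomes in an RCT.
import pandas as pd
from scipy import stats

# Hypothetical pilot data: one row per student with condition and final score.
df = pd.read_csv("pilot_outcomes.csv")
treatment = df.loc[df["group"] == "ai_platform", "final_score"]
control = df.loc[df["group"] == "traditional", "final_score"]

# Difference in mean course performance between conditions (Welch's t-test).
t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)
print(f"Mean difference: {treatment.mean() - control.mean():.2f} points")
print(f"Welch's t = {t_stat:.2f}, p = {p_value:.3f}")
```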
The second project integrates AI (i.e., ChatGPT Team) into an introductory programming course. While the course is introductory, some of the content is advanced, and students find it challenging. We are introducing AI-based learning support that students can access in real time and measuring its impact. The tool can be trained on the course’s specific learning materials.
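As a hedged sketch of one common way an assistant can be grounded in course materials, the snippet below retrieves the most relevant passages and folds them into the prompt. The directory name, prompt wording, and retrieval approach are assumptions for illustration, not a description of how ChatGPT Team or this course’s setup actually works.

```python
# Sketch of grounding an AI assistant in course materials via simple retrieval.
from pathlib import Path
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical course materials exported as plain-text files.
passages = [p.read_text() for p in Path("course_materials").glob("*.txt")]

vectorizer = TfidfVectorizer(stop_words="english")
passage_vectors = vectorizer.fit_transform(passages)

def build_grounded_prompt(question: str, top_k: int = 3) -> str:
    """Pair the student's question with the most relevant course passages."""
    question_vector = vectorizer.transform([question])
    scores = cosine_similarity(question_vector, passage_vectors).ravel()
    context = "\n\n".join(passages[i] for i in scores.argsort()[::-1][:top_k])
    return (
        "Answer using only the course materials below.\n\n"
        f"Course materials:\n{context}\n\n"
        f"Student question: {question}"
    )

# The assembled prompt would then be sent to whichever model the course uses.
print(build_grounded_prompt("How does a for loop differ from a while loop?"))
```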
The sample for this project is small: 13 students, eight of whom have replied to surveys gauging their use of and confidence in AI and their perceptions of the learning experience. We are still collecting data, but early findings show that, over time, students report greater AI use, more confidence with the tools, and higher perceived helpfulness. We will continue to collect and evaluate data and share qualitative findings as we have them.
How AI Hurts the Learning Experience
In April 2023, as part of the Student EdTech Survey, we found an uneven uptake of ChatGPT among students. Awareness among first-generation students was particularly low: only about a third of students whose parents had never attended a four-year institution knew about ChatGPT, compared to roughly half of students with a parent who had attended or graduated from college.
Usage gaps among first-generation community college students persisted even when we restricted our analyses to those who were aware of ChatGPT. Among first-generation students aware of the tool, around 20% of those at four-year colleges had used ChatGPT, whereas only about 10% of those enrolled at community colleges had.
The good news is that the gap is closing, albeit slowly. In our August 2024 AI brief, we found that although awareness and usage have increased, first-generation students are still 12% less likely to know about ChatGPT and other AI tools. Additionally, fewer than half of all students feel confident using AI effectively, and support to build that confidence is scarce.
These results indicate that, similar to previous technologies, privileged populations may adopt AI tools faster and widen existing equity gaps. Institutions must be ready for varying levels of student awareness and usage of AI platforms and should provide guidance on available tools and their optimal applications.
Wicked Problem: The Learn-Work-Learn Cycle
How AI Helps Advance Careers
As AI becomes a bigger part of all aspects of our world, administrators and educators need to consider how their programs prepare students to succeed at work and beyond. Already, 80% of Fortune 500 companies are reportedly using ChatGPT, and it’s clearer than ever that building AI skills will be critical to staying competitive in today’s job market.
Micro-credentials, short-term certificate programs, and other flexible, workforce-oriented skills acquisition programs can help workers develop these skills quickly to meet demand.
AI tools also have the potential to build a clearer map between the skills students learn in academic programs and the skills employers need. Among job seekers, 45% have used AI to create or improve their resumes, giving students a new way to translate their abilities into the qualities employers look for. On a larger scale, institutions and employers could use AI to power databases that unify data about job types, hiring rates, salaries, and student success rates and map them to training programs, making short-term training directories more powerful.
AI may also prove effective in connecting students with valuable networking contacts, like alumni from their school or program. We studied the impact of Protopia, an AI tool that builds connections between students and alumni, and learned that alumni are an ideal secondary tie for students because they already share similar university or college experiences. More than half of the Protopia users we interviewed said they were motivated to use the platform because it connected them with alumni who they believed would be more likely to respond to their questions than strangers on platforms such as LinkedIn.
How AI Can Hurt Learners’ Chances in the Workforce
The majority of Americans (75%) believe AI will reduce the number of jobs in the next decade. Workers can reskill, but without employer support or affordable education options, doing so will be challenging. There will be an even greater need for employers to invest in retraining programs and champion a cycle of learning and working.
Simultaneously, AI will redefine entry-level roles, requiring new skills from graduates. The AI literacy skills students develop in coursework will continue to carry value in the workforce as they navigate the what, when, and how of AI tools in a professional setting. To address this shift, partnerships between postsecondary institutions and employers are essential. Together, stakeholders from both groups will need to evolve internships, apprenticeships, and other programs that prepare entry-level workers. Academic programs must also put greater emphasis on empathy, critical thinking, and analysis to help learners build future-proof skills and provide value that technology can’t replicate.
Discerning the ways AI can help or hold back learners and the learning experience is pivotal to realizing the technology’s potential in education. Ignoring the potential harm AI can cause could exacerbate resource and opportunity gaps that already exist in postsecondary education.
But addressing what’s wrong with AI and leaning into what’s right could help institutions and learners overcome those gaps. WGU Labs will continue to pursue projects that help identify potential drawbacks and advance beneficial uses of AI to ensure all learners are ready for a tech-enabled future.