AI and Education: Promises and Pitfalls of Artificial Intelligence in Higher Education 

By Emil Bjerg, journalist and editor 

Artificial intelligence is making its way into higher education, from how students learn and get help to how universities and colleges retain students and address admin tasks. This presents promising opportunities, but avoiding its pitfalls might be just as vital. 

In 2020, AI in education was a 1 billion USD market; just two years later, in 2022, it was a 4 billion USD industry. By 2032, it is expected to reach 30 billion USD. In other words, we’re just starting to see the effects of AI in education – as the numbers suggest, its impact will grow exponentially over the next ten years. 

Let’s take a look at the promises and pitfalls of artificial intelligence in higher education. Towards the end, we’ll look at a few principles to use for successful implementation. 

Potentials for students 

Personalized learning 

The learning opportunities that AI represents are incredibly exciting. Whereas lecturers and professors can hardly mentor every student they have in detail, AI-driven learning platforms can identify learning styles, strengths, and weaknesses and tailor learning material to the specific needs of each student. Machine learning algorithms excel at recognizing patterns. Given enough data, these algorithms can predict with a fair degree of accuracy what type of content a student might prefer or benefit from next. 

This allows students to advance in complexity at a pace tailored to them, something both the strongest and the most challenged students can benefit from. One student might learn well with video, while another prefers text. 
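As a minimal illustration of how such adaptation could work in principle – a sketch with hypothetical data, not any particular platform’s algorithm – a system could track how a student performs after material in each format and recommend the format with the strongest track record:

```python
# Minimal sketch: recommend the content format (video, text, ...) with the
# highest average quiz score for a given student. All data is hypothetical.

def recommend_format(history):
    """history: list of (format, quiz_score) pairs for one student."""
    totals = {}
    for fmt, score in history:
        total, count = totals.get(fmt, (0.0, 0))
        totals[fmt] = (total + score, count + 1)
    # Pick the format with the best average observed score.
    return max(totals, key=lambda f: totals[f][0] / totals[f][1])

history = [("video", 0.9), ("text", 0.6), ("video", 0.8), ("text", 0.7)]
print(recommend_format(history))  # video: higher average score in this data
```

Real adaptive-learning platforms use far richer signals, but the principle is the same: observe, compare, and adjust what is served next.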

A lot has been said and written about the ability of ChatGPT and similar large language models to help students cheat on papers and exams. Others believe AI – and its ability to give everyone a personal tutor – will save, not destroy, education. 

One of the firm believers in AI and personalized learning is Vistasp Karbhari, a former president of The University of Texas at Arlington, who says: “Just as the Gutenberg press irreversibly changed the way knowledge was shared, digital technologies are helping transform education from an industrial revolution-based ‘one size fits all’ paradigm where students receive the same information, at the same time, and at the same pace, akin to an assembly line, to one that can be self-paced, adaptive, and personalized — focusing on the learner.” 

An AI tutor can not only tailor and deliver personalized teaching material, but it can also generate it. With generative AI, the cost to produce teaching materials falls dramatically, allowing for an even finer personalization. 


While lecturers and professors need to have certain allocated time slots for teaching and counseling, with AI, help is always accessible. This can help students during intense learning, such as exam periods. 

This can benefit both the best and the most challenged students. For the best students, it means they’re not held back by only following their cohort. The most challenged students are not held back by a lack of resources that work for their particular learning style. 

Potentials for universities and colleges 

Student retention 

The US is undergoing what the New York Times has called a College Dropout Crisis. Data from 2022 indicated that around 33 percent of students don’t finish their degree. 

Nova Southeastern University in Florida has been an early adopter of AI to limit dropouts. By using AI to identify students most likely to drop out, the university’s Center for Academic and Student Achievement could better prioritize retention efforts toward the students needing them the most. 

What AI adds is a proactive approach in an area that is usually reactive. Most retention efforts start only after a student is already struggling, but AI can “help a college target curricular changes, intensify its advising and offer support services much earlier, before the time when a student begins to experience troubles”. 
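In simplified form, a retention workflow like the one described above ranks students by a risk score and directs advising toward the highest-risk group first. The features and weights below are invented placeholders for illustration, not any university’s actual model:

```python
# Hypothetical sketch: rank students by a simple weighted dropout-risk score
# so advisers can reach the highest-risk students first. Real systems use
# trained models; the features and weights here are invented.

def risk_score(student):
    # Lower GPA, lower attendance, and more missed assignments raise risk.
    return (
        0.5 * (4.0 - student["gpa"]) / 4.0
        + 0.3 * (1.0 - student["attendance"])
        + 0.2 * min(student["missed_assignments"] / 10, 1.0)
    )

def prioritize(students, top_n=2):
    """Return the top_n highest-risk students, most at-risk first."""
    return sorted(students, key=risk_score, reverse=True)[:top_n]

students = [
    {"name": "A", "gpa": 3.8, "attendance": 0.95, "missed_assignments": 0},
    {"name": "B", "gpa": 2.1, "attendance": 0.60, "missed_assignments": 7},
    {"name": "C", "gpa": 3.0, "attendance": 0.80, "missed_assignments": 3},
]
for s in prioritize(students):
    print(s["name"])  # prints B then C: the two highest-risk students
```

The point of the sketch is the workflow, not the formula: scoring turns a reactive process into a ranked outreach list advisers can act on early.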

Administrative help 

AI can also help with administrative work, benefiting employees and students alike. Georgia State University is a good example: it deployed an AI-driven chatbot called “Pounce” to answer students’ enrollment questions, reducing “summer melt” (admitted students who intend to start but never enroll) by over 20 percent. At the same time, it cut administrative costs, since the chatbot handled questions staff would otherwise answer. Just as AI is taking on service tasks across a number of industries, it can greatly improve student support – saving higher education institutions time and money. 

Challenges and ethical considerations 

There are a number of ethical considerations to address before implementing AI systems, both in handling data and in involving AI in teaching.

Privacy and bias considerations 

Ensuring data privacy is essential when dealing with sensitive information, such as learning patterns or predictions of which students are most likely to drop out. Concerns over data responsibility often hold higher education institutions back from implementing AI systems. 

One way to follow best practice is to store data on the university’s or college’s own servers instead of a third-party platform. Another is to apply AI systems to larger groups: in a small group – say, fewer than 25 students – individuals can easily be identified from their stored information, so the data becomes more secure as the group grows. 
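The group-size rule can be enforced mechanically: suppress any aggregate statistic computed over a group below a minimum size. The sketch below uses the threshold of 25 mentioned above; it is an illustration of the principle, not a complete anonymization scheme:

```python
# Sketch: only release aggregate statistics for groups at or above a minimum
# size, so individuals cannot be singled out. The threshold of 25 follows the
# rule of thumb mentioned in the text.

MIN_GROUP_SIZE = 25

def safe_average(scores, min_size=MIN_GROUP_SIZE):
    """Return the group average, or None if the group is too small to report."""
    if len(scores) < min_size:
        return None  # suppress: too few people to protect anonymity
    return sum(scores) / len(scores)

print(safe_average([2] * 30))  # large enough group: reports the average
print(safe_average([2] * 10))  # too small: None
```

A hard floor like this is a blunt instrument – proper anonymization also has to consider combinations of attributes – but it makes the “bigger groups” principle checkable in code rather than a matter of policy alone.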

Just as reflections on privacy and data are always appropriate when introducing new AI systems, so are reflections on bias. Humans are biased, and humans not only build AI systems, they also write and curate the data those systems are trained on. Whether it’s an AI teaching system or a retention system, the key is to know what data a system is built on and to investigate its outputs with curiosity, rather than treating AI as inherently intelligent. 

Considerations in learning applications 

Just as there are good reasons to be excited about the personalization of learning that AI brings, there are good reasons to introduce it in moderation. Some critics argue that over-personalization might create an “educational echo chamber”. 

A potential pitfall of personalized learning is that a learning system can adapt too much to a student’s preferences, which might limit a student’s exposure to diverse learning methods. For example, if a student leans towards video content because it’s more engaging but might benefit from reading to improve literacy skills, an overly adaptive system might deprive them of a necessary challenge. 

If the digital optimism that defined the early part of this century made us forget it, COVID-19 reminded us of the value of physical, social encounters – and of the consequences of losing them collectively. A survey among college students during the pandemic showed a 20 percent spike in depression, an 11 percent rise in anxiety, and a 16 percent increase in loneliness. These figures reflect a pandemic, of course, not just a transition to digital teaching, but they point to the importance of being around peers. 

These examples suggest there is good reason to be wary of over-adopting AI in higher education. The skills we learn from being around other humans – and the natural joy that comes with it – cannot be replaced by AI encounters. 

As we’re in the middle of an AI revolution – and in the middle of AI hype – carefully considering appropriate integration will be the way to reap the benefits and avoid the pitfalls. The following section discusses how universities can successfully implement AI systems.

How can universities and colleges implement AI? 

Following the enormous success of OpenAI’s ChatGPT, a lot of companies have jumped on the AI train, promising to increase efficiency, deliver results, and cut costs. Some provide more value than others. When implementing new AI systems, it’s a good idea to start with the insight that just because something is AI doesn’t mean it’s the most intelligent solution. 

According to Arijit Sengupta, founder of Aible, many universities are disappointed with the results of their initial implementation of AI systems, finding them a waste of time and money. He says that’s because “it’s not built to achieve tangible goals and specific outcomes that are most important to the institution.” In other words, starting with the end in mind is a great idea for implementing AI systems in higher education. 

Once implemented, the performance of the AI system should be regularly monitored by collecting feedback from users to understand any challenges they face and, from there, make necessary adjustments. This is the best way to continually check and adjust for bias, bugs, and other initial issues. 

Just as it’s beneficial to have a project leader, it’s equally important to involve a diverse group of stakeholders – faculty, staff, and students – in decisions about AI systems. Inviting representatives both of those who will work with the system and of those whose education it will affect is a great way to get valuable feedback on the practicality and effectiveness of AI solutions. Likewise, properly training those who will operate and use the systems is – quite obviously – a way to increase the chances of a successful implementation. 

AI can be a supplement, not a cure. Great AI systems can identify students at risk of dropping out and offer them academic challenges they otherwise wouldn’t get. But for students to have a memorable college or university experience and graduate with a diverse skillset, including strong social skills, it’s vital to treat AI as a supplement rather than a cure – especially as AI’s impact on higher education grows exponentially in the years to come. 

