ChatGPT is disrupting the way we assess learning and credential talent, forcing higher education to strip down and rebuild better. Dr Nancy Gleason identifies four areas for significant change: adapting outdated workload calculations, offering flexible timetabling, instating grade-free learning, and socialising ourselves to shifting ethics. Governments and industry need to work together to fund these changes to ensure students have equal access and the economy has the talent it needs. If higher education does not make significant changes, an alternative model of credentialing will replace it.
ChatGPT is the disruption that higher education needs. In case you have missed it, ChatGPT is a deep learning platform running on the large language model GPT-3.5, developed by OpenAI. It was released free to the public on 30 November 2022, and within five days it had over 1 million users. Two months later, it is reported[1] to have over 100 million monthly users. It uses a natural language processing (NLP) model that generates responses based on a conversation with the user. It is iterative and learns from the conversation to fine-tune its answers. It can write any structured language – poems, PowerPoint decks, recipes, essays, code – and translate across languages. It can summarise texts, produce an outline, and edit your writing. The sophistication and convincing nature of the text it can generate signal a phase transition in the capabilities of artificial intelligence (AI). We are in a new world now.
A key point here is that this is just the beginning. OpenAI's CEO, Sam Altman, tweeted on 11 December that it is a “preview of progress.” He released it to the public to socialise us to it and to train the model. Future iterations will certainly be monetised, and there is already a $20-per-month option which promises ease of access at high speeds during peak hours. Products, services, and even new jobs will come from the inevitable applications of this platform. We are in the early days. Any discussion of what it cannot do is beside the point. It is true that it cannot do math well, it cannot understand logic, and it can make up narratives that are not true. But the point is that the model now exists, and it is improving with time. Later versions will get us very close to artificial general intelligence, or the singularity.
Some organisations and sectors of the economy need to move faster than others in determining the extent to which generative AI is appropriate to integrate into business practices and systems. Higher education is front and centre. Universities credential students by verifying that they know certain subjects and have various competencies. Their reputations are based in large part on how well their graduates are regarded in the workplace.
But ChatGPT can produce much of what we ask undergraduate students to generate in order to assess their abilities. It can match their quality in listing facts, detailing procedures, and preparing presentations. There are no reliable countermeasures for catching ChatGPT-generated text. Herein lies the disruption. Higher education can no longer verify the skills and capabilities of a given student with existing formats of asynchronous assessment such as homework and take-home exams. Conversations around academic integrity and the ethics of producing your own work are a hot topic among students, faculty, and staff. We will find some pedagogical countermeasures (such as oral exams), but GPT-3.5 will increase in sophistication with newer iterations. Human cognitive processes, strategic knowledge, and emotional intelligence are everything now. Universities need to teach and assess process, not completed products.
In the meantime, students still want to learn. And employers are looking for human talent that can think critically for itself. Without reliable university reputations and corresponding grade-point-average credentials, industry is likely to recruit young talent without requiring tertiary degrees. The question is: can higher education change fast enough to avoid being made redundant?
Higher education has been in need of an overhaul for years now. Pricing structures, fair-work policies, and a crushing bureaucracy continue to put pressure on quality, access, and equity in the academy. The content is often divorced from the learner, the mode of delivery has become less and less engaging as class sizes balloon, the assessments often fail to measure learning, and the facilities at many institutions are not conducive to learning. This conversation is not about elite and famous institutions, which can fend for themselves and, on reputation alone, can afford to be slow to change. The fourth industrial revolution (4IR) and the automation of many cognitive tasks demand a different kind of education, because thinking in the workplace is changing. The content of education alone cannot sustain a person, because knowledge has a shortened shelf life – in some areas remaining relevant for only a few years. Knowing how to learn is essential; you have to learn ways of thinking to thrive in adulthood. ChatGPT makes assessing learning at scale more difficult because the methods we currently deploy are no longer robust against academic misconduct, nor relevant to the new world of humans plus technology.
Higher education failed to change despite the massive disruption of COVID-19. After training millions of teachers to move their courses online and deliver them via Zoom, and altering billions of assessments, most institutions have put all the old ways of doing things back in place. Not that everything worked well during the pandemic, but most colleges and universities have not implemented the lessons learned from COVID-19. It is a shocking waste.
Now we have ChatGPT offering a similar-sized disruption as COVID-19, threatening business continuity in an unanticipated way. Generative AI – which includes a massive landscape of platforms – is changing how we produce, and the skills we need to generate and assess the veracity of outputs of many different kinds. The scale and pace of this shift means fundamentals have to change beyond how we teach and assess students. There are four ways in which higher education can strip itself down and build back better in the wake of this current disruption.
Adaptive Workload Calculations
First, ChatGPT may change the workload associated with learning outcomes. What that means is that we can learn faster now. Higher education institutions award degrees based on completed credit hours, determined not by the conventions of a given discipline but by the pitch and rigour of the degree being earned. How many credits a student needs to complete is determined by the government. In Europe, this is the European Credit Transfer and Accumulation System (ECTS), whereby recognition of qualifications is based on contact hours. The United States credit system, measured in Carnegie Units, is used in countries around the world; this antiquated system, developed in 1906, determines the appropriate amount of time a student must study. Accreditation bodies operationalise, monitor, and determine quality assurance. ChatGPT likely means we do not need 120 credit hours of instruction across all disciplines at the undergraduate level. People now get information faster, and the shortened relevance of knowledge means time can be better spent learning on the job.
Here is where government comes in. Higher education cannot change until governments change their accreditation rules. Part of the reason higher education institutions could not embrace desired improvements in the post-COVID-19-vaccine classroom is that they aren’t allowed to. Accreditation bodies will likely need to rethink how student workload is calculated and make the time it takes to get an undergraduate bachelor’s degree shorter for some disciplines.
Flexible Timetabling
Second, timetabling needs to adjust to the ways in which students engage with technology. Timetabling is a complex process that takes into account the availability of faculty and students, class enrolment size, classroom types, and the availability of resources, to optimise efficient use of the campus and ensure students can complete their degrees in the allotted number of years. Shifting timetabling away from fixed blocks of time will be very difficult, but something has to give. Which disciplines can be learned in a shorter time because of generative AI platforms? Which classes need more contact hours? Because generative AI affects disciplines differently, we need a better system for addressing those differences rather than timetabling all courses for the same amount of time. For some disciplines, online instruction for 30 minutes four times a week may be a better mode of delivery than sitting in a classroom for an hour and a half twice a week. And contact hours during examination periods need to be rethought to enable things like oral exams in the social sciences and humanities. But again, that is an accreditation issue that higher education institutions cannot address on their own. Timetabling is currently linked to workload, not to learning outcomes. Generative AI means that needs to change for higher education to stay relevant.
Grade-Free Learning
Third, grading and marks as we know them need to be phased out. Grade-free learning is practised around the world to encourage intellectual risk-taking – most frequently in the format of covered transcripts for first-year students. Evergreen State College[2] in the United States is an accredited four-year college with no grading system in place, and the highly regarded Brown University has no final exams. A massive movement of “ungrading”[3] occupies the chat rooms and libraries of pedagogically engaged university teachers the world over. Essentially, ungrading means minimising the use of points and weights on assignments and instead providing feedback focused on student growth and learning. It costs a lot of time per student – something many institutions of higher education do not have – but that is part of the structural change that generative AI forces. Implementing such a change at scale means administrative leadership, government, and industry need to work together to allocate resources – both financial and temporal – to train faculty in new ways of teaching.
Shifting Ethics
Fourth, ChatGPT is shifting the ethics around academic integrity. It changes what it means to cheat. The future for those of us connected to the internet is humans plus AI. In the near future, there will be very few tasks that professionals who work with computers do without consulting an intelligent machine. But universities and colleges still need to legitimately verify student learning, and how students achieve that learning needs to be fair. We need new ways to develop and verify knowledge that integrate generative AI tools. The structures, layout, and expectations of assessments and tests need to shift. And we need to socialise ourselves to AI-generated ideas. John Dewey, an influential educational philosopher of the early 20th century, argued that education is the renewal of life by transmission – the social continuity of life. It is time to prepare ourselves and our societies for a world in which that continuity is co-developed with non-human ideation, shifting our understanding of the ownership of ideas.
The other ethical challenge is access to the model. Despite its power, and despite decisions by some countries and local school boards to ban it, the wealthy will have better access than those with fewer economic resources. There is no equity in education without equal access to this tool. And you can be certain it will be monetised with lofty subscription fees for educational institutions – fees only well-endowed universities and colleges can afford. That is not ethical. The power of this technology means every enrolled student needs equal and fair access to it. Not to mention that it is a fabulous tool for neurodiverse students, non-native speakers of the language of instruction, and those who struggle with literacy.
We need a plan
All of this is to say, we need a plan, not plasters. AI-detection software is a plaster – one worthy of our attention in the very short term only. If we need to know what is human-generated and what is AI-generated, then governments need to regulate the industry to make it so. We cannot expect faculty to adjudicate this alone. That is not only unrealistic but immoral, given the certainty of failure.
Education is a process, not a collection of facts. That has always been the case but challenges of access to quality education at scale have meant many schools are only delivering facts. That approach is now rendered obsolete, as it is fully automated. We need leaders to make bold proposals. And we need industry to help fund the necessary shifts or there will be no talent available to run the economy.
There are a lot of crises in the world today, and it is difficult to hold anyone’s attention beyond the news cycle. ChatGPT has our attention for the moment, but the next war, the next ecological disaster, or the next tech breakthrough may push it from the headlines. Despite this, it is not going away. Education is disrupted, and the status quo is not an option; taking action for systemic change is a forced move. If higher education and the government bodies that regulate it do not make significant changes, then an alternative model of credentialing will replace them.
About the Author
Nancy W. Gleason, PhD, is the Director of the Hilary Ballon Center for Teaching and Learning at NYU Abu Dhabi. Her research focuses on the Fourth Industrial Revolution’s impact on higher education, employment disruption, and the upskilling of adults. She is the editor of Higher Education in the Era of the Fourth Industrial Revolution (Springer, 2018).