Instructors after AI

How will the current wave of artificial intelligence change college teaching?

I’ve been thinking about AI and education for years, and it’s all come into sharp focus lately, due to the advent of large language model (LLM) bots like ChatGPT (previously).  I’ve posed the question to academics and other people interested in higher education, and many answer by wondering what a post-AI instructor’s job looks like.  One extreme view holds that AI is failing now and will collapse soon, so our task is to protect higher education from the damage.  Opposed to this is the fear that we might not need human instructors* at all, if these new technologies keep developing.

Today I’d like to chart a middle path and ask what happens to the job of teaching in higher education if AI continues and improves.  What happens if generative AI gets better – not to the level of artificial general intelligence, but good enough to create content that large numbers of people value, somewhere between a calculator and a mad scientist’s assistant?  What does an instructor do in a post-LLM world?

“University faculty after artificial intelligence,” visualized by Midjourney

Here’s the list I came up with in conversation with many people over the past month, from students to Patreon supporters, Facebook pals, friends in person and strangers online.  It’s not in any particular order and there’s a lot of overlap between points.

  1. Teaching prompt engineering (showing how to use the tech: the best ways to write a prompt, how to iterate on results, how to go beyond simple content generation). This includes teaching learners how to interact with AI so that it teaches them best, apart from the human instructor.
  2. Instilling a critical stance about technology.  This should certainly include criticizing AI, which can take place in various ways and through different disciplines – e.g., science and technology studies, rhetoric and composition, computer science, etc.
  3. Offering students emotional support, both in the class context and in their lives.
  4. Facilitating group work.  Right now Bard, Bing, etc. are good at interacting with a single user, but don’t seem to have much capacity for wrangling clusters of students.
  5. Guiding students through a curriculum, or answering the “what to learn now?” question. Teachers do this within classes, as well as through advising.
  6. Related to #5: teaching students what they need to know but aren’t interested in.  This might be according to an instructor’s views, or what a larger authority (state government, community) prefers.  For example, it could take the form of encouraging an arts-loving student to learn math.
  7. Structuring learning over the long haul. It’s easy now to learn something small on demand (what’s the French word for “cat”? what happens inside a biological cell?), but people have a harder time persisting in learning over weeks and years.
  8. Protecting students in their learning process. This can mean defense against political attacks (example: studying evolution in a creationist context) as well as against social and interpersonal pressures.  (Related to #3 above.)
  9. Nurturing curiosity.  Generative AI can satisfy one’s curiosity, but how do we spark and sustain it?
  10. Teaching critical thinking. It’s not easy to find consensus among educators about what that means or how we do it, and I think we overstate how much we actually do this, but it’s something we tend to value highly. (#2 above is part of this.)
  11. Teaching, inspiring, and supporting creativity. There are other sources for this, but teachers can be good at helping students exercise and explore their creative sides.
  12. Modeling.  Former student and current teacher Justin Kirkes thoughtfully explained on Facebook: “Modeling vulnerability in the learning process, excitement in exploration, curiosity in the unknown. These aren’t behaviors that are always innate in learners, but can be called forth.”

It’s a messy list, with redundancies and parts I’m not sure about. And assembling it raises questions.  How much of this can be automated, either now or in the near future? How are we likely to value (culturally, financially, professionally) such counter-AI skills?

“University faculty after artificial intelligence,” visualized by Stable Diffusion

What do you think of this outline of the college teaching profession post-generative AI?

*I’m using the term “instructor” here to emphasize the college or university faculty member’s teaching mission.

This entry was posted in automation.

5 Responses to Instructors after AI

  1. Roxann says:

    #4 brings to my attention why we need to shape AI to prioritize human learning needs. Going beyond personalization to what you mention as group work (what I think of as collective learning engagements) can be incredibly valuable for teaching and learning environments. Hopefully, this will be part of what educators can create and facilitate.

  2. Although several of the points listed allude to it, a key aspect of what human instructors will need to accomplish, in order to thrive and stay relevant in a near-future dominated by AI, is a strong presence in developing a community of inquiry: directly motivating students, explicitly helping them see the relevance of what they learn, and forming an emotional connection to the learning to foster deep learning and better retention.

  3. Pingback: Looking Back and Looking Forward — Prompts Are the New Thesis Sentence | Rob Reynolds

  4. Joe says:

    I think in the not too distant future, universities will reorient themselves as research institutions, and the job of teaching students will largely be taken out of their hands.

    Graduate schools and highly technical subjects (medicine, engineering, chemistry) may well be a different matter – but I think the vast majority of book-based subjects will no longer be taught by humans in the vast majority of universities around the world.

    Ultimately, I imagine there will no longer be such a thing as a “degree” in many subjects – employers will simply ask employees to follow a course of study directly with an AI to develop the skills they require.

    That might not happen for a while; in the meantime, I think universities will simply offer examinations in many or most subjects without any actual matriculated students. The students will then have to find their own ways to study, probably significantly assisted by rapidly improving LLMs.
