AI isn’t a tool, it’s an environment

There’s a fear that students will use artificial intelligence as a “magic button” that cuts out the need for deeper thinking. For Josh Thorpe, another world is possible

Josh Thorpe is a learning developer and academic skills advisor at the University of Stirling

Previously on Wonkhe, I argued for caution around the use of generative AI tools in education. Abstinence, however, is not something we can reasonably expect from students.

In fact, I’ve been asking students, and, in some groups, a vast majority are using AI. Some in impressively reflective ways, others – sadly – to outsource their own thinking (i.e. “to cheat”).

Given this reality, it’s time higher education embraced a critical and creative AI literacy for educators and students alike. With this approach, caution and curiosity can – and should – go hand in hand. It will help us all to grapple with our new tech landscape, and may lead to wonderful new forms of learning.

The magic button

What does this mean? First it means acknowledging a chief fear about AI: that we’ll treat it simply as a magic button that does our bidding, a kind of “replicator” that farms out human knowledge and cuts us out of the loop. This was my first assumption. Hence the message of caution above.

The next step is finding a different path, which I’ll get to in a moment.

But first it needs to be said that the fear mentioned just now, and others, should be taken seriously. We also need to consider academic integrity, access and accessibility, intellectual property, privacy, environmental sustainability, coloniality, and more. It’s a lot to keep in view, and it can make a person dizzy just to try.

One thing you don’t hear about as often is student experience and autonomy. With large language models churning out reams of “knowledge” at the push of a button, students must wonder: what the heck am I supposed to do? What’s my role in the creation of knowledge?

Again, this fear is based on the assumption that AI is primarily a content mill. I’m optimistic that society as a whole, and the education sector, can see past this limited view with a bit of AI literacy and creative practice.

Ideas in space

Hence the need to find a different path. This is what motivated me to write my book, AI for Students, and this is what I’m advocating for with academics and students in my job.

Here it is: don’t treat AI as a tool, but as a space to work in, a virtual classroom, a situation in which to have experiences. This move from magic button to magic space demands a new norm, a cultural shift from fear to critical curiosity.

It begins with using the tools creatively. Many of our first intuitions with ChatGPT are to simply make it do stuff, primarily writing. Write me a joke about kittens in the style of Jerry Seinfeld. Write me a sonnet. You know the drill.

The results can be thrilling for a moment but become pretty dull pretty quickly.

And we see this in student work that treats AI as a content mill. “Post-traumatic stress disorder” becomes “post horrendous pressure problem”, as one colleague found in a student essay. It gets worse from there.

We want to avoid these uses of AI. I don’t think it’s unreasonable to think we can.

My ah-ha moment came when it occurred to me to have ChatGPT help me write a joke. I typed something like: “You’re a brilliant scholar of comedy. I’ll improvise a joke, you critique, I’ll try again. We’ll repeat.”

Now, the chatbot’s critiques were not brilliant. But they caused me to think more carefully about my task. This was the key. Ultimately what was important was that, in tandem with the technology, I’d opened up a space for practice and reflection. Now this was an exciting feeling.

So this is the principle: Treat AI not as “tools” but as environments, as spaces for play with ideas.

In practical terms, this means we don’t just make simple demands like “write me a paragraph.” And even questions such as “how should I structure my essay?” are only a start. Instead, students can learn to set up the AI to engage them, to create an interactive setting with specific aims, behaviours, and structures that stimulate, rather than stifle, creativity.

Discussion and improvisation

It’s a nice image, but how is this achieved? It’s largely in something called “prompt design”. You need to be good at your inputs to get good things from the AI. In other words, garbage in, garbage out. Becoming more skilled with prompt design is a straightforward project, but it can lead to much richer experiences. The joke-writing crit-session described above is a simple example. It can get much more nuanced than that.
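To make the idea concrete, here is a minimal sketch, in Python, of how a prompt like the joke crit-session might be set up. The function and parameter names are my own illustration, not from any particular product; the model call itself is left out (any chat interface would do), because the point is how the opening prompt casts the AI as a critic rather than a ghostwriter:

```python
# A hypothetical sketch of "prompt design" for the crit-session described
# above. No AI service is called here; we only build the conversation
# structure that a chat-style model would receive.

def build_crit_session(topic: str) -> list[dict]:
    """Return the opening messages for an interactive critique loop.

    The system prompt sets up roles, behaviour, and structure: the AI
    responds to drafts but is told never to produce the joke itself.
    """
    system_prompt = (
        "You are a brilliant scholar of comedy. I will improvise a joke; "
        "you critique it briefly, then I will try again. We'll repeat. "
        "Never write the joke for me."
    )
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": f"My first attempt, on the topic of {topic}:"},
    ]

def add_turn(messages: list[dict], role: str, content: str) -> list[dict]:
    """Append one turn of the back-and-forth to the running transcript."""
    messages.append({"role": role, "content": content})
    return messages
```

The design choice here is the one the article argues for: the prompt creates an interactive setting with a defined aim, instead of a one-shot demand for finished content.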

Now, I don’t suggest that the interactive approach solves all the problems AI creates. But it can help. If education adopts this model, it will lead to a significant cultural shift. This will happen slowly, through lots of discussion and practice, but it needs to start now with each of us.

The pedagogical value of this cultural shift seems clear to me. To start, it creates more opportunities for active recall and immersion in course concepts. It also enables students to practise using vocabulary and ideas with almost no social anxiety. It’s an entry point to discussion and improvisation, it can spark creativity, and it even makes authorship more accessible and multimodal by opening ways to translate our messy non-linear thought patterns into more ordered texts. All of this can be designed into assignments so that the process is documented and assessable.

But the radical thing in this experiential or interactive approach is this: it’s simply more fun. More fun than just cogitating or ruminating. More fun than solitaire-style index cards. And definitely more fun than letting the bot do all the work. I think most students will see this immediately and happily learn to use AI in ways that are human and healthy.

There is a lot of good work to be done in this area. If we change the frame and develop a culture of critical AI literacy, it is possible to create effective, cautious-yet-curious, pro-student, pro-academic integrity uses for AI in education. If any of this is meaningful to you, please join the conversation.

4 responses to “AI isn’t a tool, it’s an environment”

  1. Great piece – I’d argue this is a threshold concept for many educators within HE. If we can position gen AI as playful environment, and not as a tool, we can redress the balance between expert/novice. The fear of not being the expert seems to stymie creativity for some.

  2. What about this part, though? “We also need to consider academic integrity, access and accessibility, intellectual property, privacy, environmental sustainability, coloniality, and more.” What are the answers to these serious issues?

  3. Thanks Josh, that is an interesting piece. Dr. Mairead Pratschke made an interesting observation about how we interact with AI, namely that it is like a dialogue. Rather amazingly, I found myself trying the newly released talking version of ChatGPT premium last Saturday, where it responds in your choice of voice rather than just in text. This is going to alter how we interact with and use AI even more fundamentally. What is still at issue is whether this form of interaction will actually change how we think and perceive reality!

  4. Having encountered a slowly growing number of people in industry using AI for design, I am coming to feel that Josh’s proposed approach has the additional benefit of beginning to engage students with ways it may/will be used in the workplace. Indeed there is an increasing chance that they will encounter such approaches when on placements or doing projects with companies. While I am sure some are simply using it to turn out reports easily, there is generally too high a risk to reputation, potential loss of customers and even safety if it is utilised in an uncritical way, for companies to plough on using it regardless. Thus, we are more likely to see the kind of dialogue opening up with AI systems which can be challenged by or may indeed challenge our perceptions of a particular topic. Obviously a voice for AI emphasises this dialogue, but even text-based interaction can highlight this, given how often students engage in discussions via texts.
