In the past few months, the internet has witnessed an impressive array of demos powered by GPT-3, the latest language model released by OpenAI. From writing fiction, to generating code, to summarizing legal text, to turning bullet points into full emails, just to name a few, GPT-3 is impressively versatile.

So I thought of experimenting with it to understand its potential use in childhood education.

Here are the use cases that I explored. For the first one (creative writing), I got the help of my 9-year-old daughter, Safa, to co-write the pieces with GPT-3. 

  1. Creative writing – short fiction, poem
  2. Generating ideas – fiction writing, science project
  3. Learning a topic – science, math
  4. Making quizzes – trivia, multiple-choice question

As I went through these use cases, I also pondered over the bigger picture.

How can technology like this help children learn better? Is it safe to use? Is this bordering on plagiarism? Should the idea even be entertained in the first place?

And it’s not just about GPT-3 per se. As language models like this become bigger, better, and more widely accessible, will they make their way into education?

About GPT-3

GPT-3, released in June 2020, is the third generation of the Generative Pre-trained Transformer (GPT) language model developed by OpenAI, an AI research company. It uses deep learning to produce sequences of text based on the context given to it.

GPT-3 is one massive deep learning model. It has 175 billion parameters, more than 100 times as many as its predecessor, GPT-2, which was released only a year earlier.

What does this translate into? It means that GPT-3 can be applied to pretty much any language task. And the most impressive part is that it can carry out these tasks by learning from just a few examples. You don’t have to train it on lots of data, as you would a typical machine learning model.

This ‘few-shot learning’ approach is what makes it so powerful. The heavy lifting has already been done: the model was pre-trained on billions of words from the internet and various other sources. All you need to do is provide the right (normally short) context, and it will adapt to whatever task it’s given. We’ll see how this works in the use cases.
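As a rough illustration before we get to the use cases, a few-shot prompt for something like the trivia quiz in use case 4 might look like the snippet below. This is a generic sketch of my own, not one of the actual prompts I used: a couple of worked examples set the pattern, and GPT-3 is left to complete the last one.

```python
# A hypothetical few-shot prompt: the two completed examples are the
# "context", and GPT-3 continues the pattern for the final subject.
prompt = """Subject: Astronomy
Question: Which planet is known as the Red Planet?
Answer: Mars

Subject: Geography
Question: What is the longest river in Africa?
Answer: The Nile

Subject: Biology
Question:"""
```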

GPT-3 produces these text outputs via an API. But since I was just experimenting, I used its web interface called ‘Playground’ to generate the text.
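For completeness, calling the API directly from code looked roughly like this at the time, using the openai Python package. Treat it as a sketch rather than my exact setup, since I worked in the Playground; the prompt is the few-shot example from above.

```python
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder key

response = openai.Completion.create(
    engine="davinci",      # the base GPT-3 model
    prompt=prompt,         # e.g. the few-shot trivia prompt above
    max_tokens=64,         # cap on how much text to generate
    temperature=0.7,       # higher values make the output more creative
    stop=["\n\n"],         # stop at a blank line so it completes one item
)

print(response.choices[0].text)
```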

Use Cases

Before going through the use cases, here are some points to note about the GPT-3 generated text shown in the coming screenshots:

  • The text in bold indicates human-generated text, and the text not in bold came from GPT-3.
  • Due to the length, I am going to show only snippets of the text.
  • In most cases, I didn’t edit the text generated by GPT-3. I kept it as-is. But on some occasions, I did ask GPT-3 to re-generate the text, for example, when the content was not very appropriate.
  • The prompts in the first use case (creative writing) came from Safa. For the rest of the use cases (2 to 4), all the prompts came from me.
  • You will notice that in most cases, all I had to do was provide a few lines of prompts. Those were enough for GPT-3 to understand the output pattern I was looking for.

1. Creative Writing

This is my favorite and it’s the first in the list because it’s something I’ve been trying this year with Safa and her elder brother, Adel. I try to encourage them to explore creative writing, which I believe offers a rich learning experience.

Creative writing can be pretty daunting even for an adult starting out, let alone a child, especially when it comes to continually coming up with ideas.

We have tried a few tricks to make it feel more natural. For example, voice-recording whatever stories or key points came to mind, then getting an app (yet another AI!) to transcribe them into text. That way, they wouldn’t be staring at a blank page when writing. But even so, it still takes a lot to maintain motivation.

Now, GPT-3 can easily generate long-form text given a context. But of course, this defeats the purpose of Safa learning to write. I like the idea of collaborative writing, especially in the early stages of learning. So I thought of using that with GPT-3.

We first tried this with a short story that Safa called ‘The Enchanted Forest’. She started with a single-line prompt describing what the story was about and the age range it was intended for (I found this helpful for ensuring the generated content would be clean). Then she added a few lines of text to begin the story, triggered GPT-3 to output some more, and repeated this back and forth a few times (with some adult help) until the story was finished.
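For anyone who would rather script this back-and-forth than use the Playground (we did it all in the Playground), a minimal sketch of the loop could look like the code below. The function name and the number of rounds are placeholders of mine, not the actual setup.

```python
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder key

def continue_story(story_so_far: str) -> str:
    """Ask GPT-3 to add the next few sentences to the story."""
    response = openai.Completion.create(
        engine="davinci",
        prompt=story_so_far,
        max_tokens=80,       # a short continuation, not the whole story
        temperature=0.8,     # a little extra creativity for fiction
    )
    return response.choices[0].text

# One-line description of the story and audience, then the opening lines.
story = "A story for children aged 8 to 10 about an enchanted forest.\n\n"
story += input("Write the opening lines: ")

for _ in range(5):  # a handful of rounds, roughly what we did
    story += continue_story(story)                            # GPT-3's turn
    story += "\n" + input("Your turn, add the next part: ")   # the child's turn

print(story)
```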

Here are the beginning lines of the story.

Since the only idea Safa had was how the story started, it was interesting to see how the plot unfolded through this back-and-forth process. The text GPT-3 produced was reasonably good, though it did sometimes veer off track.

But that’s where it got interesting. Even when GPT-3’s suggestions didn’t make much sense, she managed to work them into a fairly coherent story by the end. The writing process became more natural when she had something to continue from, and compared with writing alone, ideas came at a much faster rate. It felt like she had a writing buddy to bounce ideas off, which is effectively what happened.

And the whole process was fun too, as some of GPT-3’s suggestions were quite witty and amusing. For example, it created a magpie character out of nowhere, which Safa continued to use in the story. It even picked up on the nuances of Safa’s writing, responding to an emoticon she wrote (-_-) with one of its own (:O)!

We then tried poems, using the same collaborative format but writing one line each. The result turned out to be much more coherent than the long-form piece. Plus, this was the first time Safa had written a poem, and it didn’t feel like a chore!

2. Generating Ideas

One of the books Safa enjoyed reading was A Series of Unfortunate Events. So I thought perhaps we could get some writing advice from Mr Lemony Snicket, its author. It turns out, we could.

…or ask Albert Einstein for science experiment ideas!

3. Learning a Topic

Chatbots are starting to emerge as a useful tool in education. Their conversational, casual approach can help make learning more enjoyable. So I wondered whether GPT-3 would make a good educational chatbot. Could it be used to help a child learn a new topic, say, in science?

What if we could learn about relativity theory from Einstein himself? Better still, what if we could make it more fun by giving Einstein a bit more personality? 

What about math? Could we tailor the explanation of a solution to how a child would understand it? (Note that there are some errors in the arithmetic, especially with large numbers.)

4. Making Quizzes

What about generating trivia questions? Is there a way to quickly create a set of questions, in any subject? A potential lifesaver for educators.

Or perhaps coming up with multiple choice questions.

Thoughts

I think it’s a matter of when, rather than if, language models like GPT-3 will make an impact on childhood education. The age of personalized learning is coming, and I can’t help but see this kind of technology playing a key role in it.

On the one hand, having observed Safa playing around with it, I am excited by the prospect of using such technology to make education more personalized and enjoyable. The possible use cases are limited only by the imagination.

But on the other hand, I do feel some trepidation about its potential for harm. It’s plain to see how easily things could go wrong. It wouldn’t take long, for example, before GPT-3 outputs text that’s not appropriate for children. Bias is also easy to observe, such as text that stereotypes certain groups of people. And can we trust the accuracy and validity of the output?

As with any technology, it is up to us to decide whether it becomes beneficial or harmful. Now is the time to nip potential harms in the bud while exploring the exciting possibilities it can bring. The last thing we should do is pretend it’s never going to make its way into education. It may or may not, but it’s better to be prepared than not.