Using AI to Generate Detailed Lesson Plans

Dan McCreary
Sep 12, 2020 · 6 min read


Data flow diagram showing how machine learning and NLP Transformer models can generate detailed lesson plans using GPT-3.

On May 28th, 2020, OpenAI announced its new monument to general AI: the GPT-3 system. It is an incredible piece of engineering: a deep neural network with 175 billion parameters. And it does perform some incredible parlor tricks:

  1. It can translate from English to other languages and back
  2. Given a prompt, it can generate some pretty readable short stories
  3. Given a description of a user interface, it can generate the HTML web page layout
  4. Given a description of a database search, it can generate the program to query a database
  5. Given some text, it can use common sense and knowledge of the real world to answer questions about the text
  6. It can summarize long documents
  7. It can emulate a real person in a chat (we are getting closer to passing the Turing Test)
  8. When asked to judge a short paragraph, real people can only tell whether it was written by GPT-3 or by another person 52% of the time (50% would be a random guess)
  9. It can answer simple math questions

However, it’s debatable how “smart” it really is. It is very good at many language tasks that involve complex pattern matching, but it falls down on many common-sense tasks. In this blog, we will discuss whether these new “generative” systems will be able to generate detailed lessons that can be customized to the needs of a classroom of students or an individual student.

At the heart of GPT systems is the ability to “Generate” text. GPT stands for Generative Pre-trained Transformer, which means that the GPT-3 neural network was built using a Transformer model. The key difference between GPT-3 and its predecessor, GPT-2, is its mammoth size. It was trained on many English-language data sources (Wikipedia is only about 6% of the input) that together contain almost 1 trillion words. The model size of 175B parameters is over 100 times larger than the 1.5B parameters of GPT-2, which was released in February of 2019. So that is a factor of roughly 100x in about 16 months! But generating these models is not easy. It is estimated that just the GPU time needed to train GPT-3 cost around 10 million USD.

I should also mention that the Transformer models are not unique to OpenAI and GPT models. They are also used by Google in the autocomplete features of Gmail and in automatic grammar checking programs like Grammarly, which I am using to help write this article. Transformer models like BERT and their kin have revolutionized natural language processing (NLP) technologies in the past two years.

So how could we use Transformers to generate lesson plans for our classrooms? First, we need to understand how Transformers and GPT systems work. You can see from the list above that a single GPT-3 model does lots of different things. The way we vary its behavior is by giving it a few short new prompts on top of the underlying “base” pre-trained model. Once the prompts are given, the network is “primed” to return the right types of answers to questions. It is almost like telling people the “rules” of a new game you want them to play. These systems are called “few-shot learners,” because the programmer only needs to provide a few short examples and the GPT system then learns to generalize the intent of the prompts.
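To make that “priming” idea concrete, here is a minimal sketch of a few-shot prompt. The wording and examples are hypothetical; the point is that a couple of worked input/output pairs followed by an unfinished item steer the model toward continuing the pattern.

# A minimal, hypothetical few-shot prompt. The worked examples "prime" the
# model, which is then asked to continue the text after the final "French:".
few_shot_prompt = """Translate English to French:

English: Where is the library?
French: Où est la bibliothèque ?

English: The lesson starts at noon.
French:"""

A completion model given this text will almost always reply with a French translation of the last sentence, even though it was never explicitly told which task it is performing.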

So to generate new lesson plans, we would have to provide a “training set” of sample descriptions of lesson plans and the results we would expect back. This dataset of input-output pairs would be our “few-shot learning” examples that we layer on top of the 175B-parameter GPT-3 model. Once this is done, we would send it a short English-language description of the lesson plan we desired, and GPT-3 would return the lesson plan. Sounds simple, right? We could have AI in every classroom by the end of the year.
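Here is a rough sketch of what that might look like in code, assuming the OpenAI Python client and the Completions endpoint that were expected to become commercially available later in 2020. The example descriptions, the “###” separator, the engine name, and the parameter values are all assumptions for illustration, not a tested configuration.

import os
import openai  # the OpenAI Python client; access requires an API key

openai.api_key = os.environ["OPENAI_API_KEY"]

# Hypothetical input-output pairs: a one-line description of a lesson and the
# Markdown lesson plan we would expect back. Real examples would be much longer.
examples = [
    ("A beginner lesson on drawing a circle with SVG",
     "# Drawing a Circle with SVG\n## Prerequisites\n...\n## Experiments to try\n..."),
    ("A beginner lesson on drawing a rectangle with SVG",
     "# Drawing a Rectangle with SVG\n## Prerequisites\n...\n## Experiments to try\n..."),
]

def build_prompt(examples, new_description):
    """Layer the few-shot examples on top of the base model as one long prompt."""
    parts = []
    for description, lesson in examples:
        parts.append(f"Description: {description}\nLesson plan:\n{lesson}\n###")
    parts.append(f"Description: {new_description}\nLesson plan:\n")
    return "\n".join(parts)

prompt = build_prompt(examples, "A beginner lesson on changing the fill color of an SVG circle")

response = openai.Completion.create(
    engine="davinci",   # illustrative engine name
    prompt=prompt,
    max_tokens=600,
    stop=["###"],       # stop when the model starts a new example
)
print(response["choices"][0]["text"])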

Well, hold on a minute, folks! Generating detailed lesson plans is not quite that simple! Let’s take a look at some of the challenges of using GPT-3 and some of the complexities that will come up when we attempt to build this system. We will also describe the strategies we can take to overcome these limitations.

Since the start of the COVID pandemic, I have been helping organizations move their lesson plans online. Let's see how we could build a training set for lesson-plan generation from these existing lesson plans. We have standardized on using GitHub Pages, Markdown, mkdocs, and Google Material widgets to build these pages. You can see examples of our curriculum for Scratch, Python, Arduino, and Web.
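Before anything can be generated, we need those existing lessons gathered into a machine-readable pile. Below is a minimal sketch of how we might harvest them from a local checkout of the curriculum repositories; the "curriculum" folder name and the 100-word cutoff are assumptions for illustration.

from pathlib import Path

# A minimal sketch of harvesting existing Markdown lesson plans as training
# examples. mkdocs sites keep their Markdown sources under a docs/ directory.
def collect_lessons(repo_root="curriculum"):
    lessons = []
    for md_file in Path(repo_root).glob("**/*.md"):
        if "docs" not in md_file.parts:
            continue  # only keep pages that are part of a published mkdocs site
        text = md_file.read_text(encoding="utf-8")
        if len(text.split()) > 100:  # skip stubs too short to be useful examples
            lessons.append({"path": str(md_file), "markdown": text})
    return lessons

lessons = collect_lessons()
print(f"Collected {len(lessons)} candidate lesson plans")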

The first thing to understand is that we don’t really want to generate the raw, lower-level HTML code for these sites. That would be difficult to manage. What we want to do is generate easy-to-understand and easy-to-maintain Markdown. Here is an example of what that input might look like for learning how to use the SVG circle element:

# Drawing a Circle with SVG
In this lesson, we will generate a circle using the SVG language. We will show you how to position the circle, change the size of the circle, and change the fill and border color of the circle.
## Prerequisites
Before you begin, you will need to know how to edit markup and add new attributes to elements. To test the drawing you will need to render the code in a web browser.
## Circle Attributes
cx = x or horizontal position of the center of the circle from the left
cy = y or vertical position of the center of the circle from the top
r = radius
## Sample Code
...
## Rendering
...
## Experiments to try
1. What would happen if you change the fill from blue to red?
2. How would you change the color of the border of the circle?
## Resources
https://developer.mozilla.org/en-US/docs/Web/SVG/Element/circle

Note that you can see the actual source of the SVG Circle Markdown here.

In the SVG circle example above, the block of text after the first title line is the descriptive “input” preamble. It is the job of the Transformer model to generate the rest of the Markdown file.
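In other words, each existing lesson can be split into a prompt (the title and description) and a target completion (everything from the first “##” heading onward). A minimal sketch, assuming the section conventions shown above and a hypothetical file name:

def split_lesson(markdown_text):
    """Split a lesson plan into the title-plus-description preamble (the input
    prompt) and the remaining sections (the text we want the model to generate).
    Assumes the convention shown above: one '#' title line, a short description,
    then '##' section headings."""
    lines = markdown_text.splitlines()
    for i, line in enumerate(lines):
        if line.startswith("## "):  # the first second-level heading ends the preamble
            return "\n".join(lines[:i]), "\n".join(lines[i:])
    return markdown_text, ""  # no sections found; nothing left to generate

# "svg-circle.md" is a hypothetical file name for the example above
with open("svg-circle.md", encoding="utf-8") as f:
    prompt_part, completion_part = split_lesson(f.read())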

How would it do this? It would need to be trained on many example tutorials showing how other courses taught Web, HTML, and SVG labs. It would build a neural network that learns, for each word in these tutorials, which words tend to follow it within the context of each tutorial document. What is important to understand is that GPT-3 is already doing some of this today. Here is an example of GPT-3 generating an HTML web page from a written description of that page.

One of the challenges here is understanding the implied context of the lesson-plan generation. If you are working with junior-high students, they may need some additional background information and a slower, gentler introduction to the concepts. College students, on the other hand, usually have prior experience that will accelerate their learning. Their lesson plans can assume prior knowledge or simply provide links to any necessary background concepts they must master.

So although this is a good example, I have not been able to test this yet since GPT-3 keys are only being given to a few people. By October 2020 we think that there will be commercial versions of the GPT-3 API available that we can start testing. Note that the pricing is based on “tokens generated”. This is roughly equivalent to the number of words in a lesson plan.
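That rule of thumb makes it easy to sketch a back-of-the-envelope cost per generated lesson plan. The per-1,000-token price below is a placeholder assumption, not a published rate.

# Rough cost estimate for one generated lesson plan. Billed tokens are
# approximated by word count, per the rule of thumb above. The price per
# 1,000 tokens is a placeholder assumption, not OpenAI's published rate.
PRICE_PER_1K_TOKENS_USD = 0.06  # hypothetical

def estimate_cost(lesson_markdown, price_per_1k=PRICE_PER_1K_TOKENS_USD):
    approximate_tokens = len(lesson_markdown.split())  # word count as a token proxy
    return approximate_tokens / 1000 * price_per_1k

print(f"${estimate_cost('word ' * 1500):.2f} for a roughly 1,500-word lesson plan")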

I hope I have convinced some of you that using Transformer models to generate lesson plans is not that far off. What we will need are good training sets and some volunteers who want to try this out in their classrooms and mentoring sessions.

If you would like to help out building these tools for teachers, mentors, and students, please connect with me on LinkedIn.

I look forward to hearing from you! — Dan
