You may have heard of the wonder that is the generative pre-trained transformer, more commonly referred to as GPT-3.
GPT-3 is a "large language model," an artificial intelligence algorithm that has achieved extremely high levels of fluency in English, with some speculating that AI may soon be able to produce much of the text we encounter.
The goal is for the large language model to be asked a question and produce an answer that is fluent, accurate, thorough, and useful.
I would like to set aside the larger debate about AI and education, AI and writing, what it means and how we should prepare students for a world where these things exist, and instead share some interesting notes about how GPT-3 works.
I assumed that in order to create fluent prose, GPT-3 had been programmed with the rules of English grammar and syntax, the kind of thing Mrs. Thompson tried to drill into me and my classmates.
When using the subjective case, the verb… blah blah blah.
The difference between a gerund and a participle is … etc.
You know the stuff. That's all material I was taught once, then taught to others for a while, and now spend essentially zero time thinking about.
I thought the big advantage GPT-3 had over us carbon-based life forms was its wide and instant access to these rules, but I was 100% wrong.
Writing in The New York Times, Steven Johnson offers a description of how GPT-3 works, drawn from Ilya Sutskever, one of the people who works on the system.
Sutskever told Johnson that the underlying idea of GPT-3 is a way of linking an intuitive notion of understanding to something that can be measured and understood mechanistically, and that is the task of predicting the next word in a text.
So as GPT-3 "composes," it is not drawing on a vast knowledge of the rules of grammar. It is simply asking, based on the words that came before, what would be a good word to use next.
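To make the "what word comes next" idea concrete, here is a minimal sketch in Python. This is nothing like GPT-3's actual machinery (a giant neural network trained on billions of words); it is a toy bigram model, invented here for illustration, that predicts the next word purely from how often words followed one another in a tiny sample corpus.

```python
from collections import Counter, defaultdict

# A tiny, made-up training corpus (GPT-3's real training data is
# billions of words scraped from the web, books, etc.).
corpus = (
    "the cat sat on the mat "
    "the dog sat on the rug "
    "the cat chased the dog"
).split()

# For each word, count how often every other word follows it.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(word):
    """Return the word that most frequently followed `word` in the corpus."""
    return following[word].most_common(1)[0][0]

print(predict_next("sat"))  # "on" -- "on" always follows "sat" above
```

Even this crude version captures the core move Sutskever describes: no grammar rules anywhere, just a guess about the next word based on patterns in prior text.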
Interestingly, this is quite close to how human writers compose: one word after another, as we try to get something intelligible onto the page. This is why I say I teach students "sentences" rather than "grammar." Writing is a sense-making activity, and the way audiences perceive what we are saying is through words formed into sentences, sentences into paragraphs, paragraphs onto the page, and so on.
Audiences do not evaluate writing by the technical accuracy of its grammar so much as by the sense the words make.
Considering the complexity of sense-making, we can see that human writers are working at a much more sophisticated level than GPT-3. As we make our choices, we don't just think about which word makes sense next, but which word makes sense in terms of our purpose, our medium, our audience: the full rhetorical situation.
As I understand it, GPT-3 does not have this level of awareness. It really is moving from one word to the next, guided by the huge body of text and examples it has been trained on. Because that training text is full of pleasing sentences, it "learns" to produce sentences of a similar kind. To enhance the sophistication of GPT-3's output, programmers have also trained it to write in particular styles, essentially working out which word comes next within the parameters of the style it has been assigned.
The current (and possibly permanent) flaws in GPT-3 further illustrate both the similarities and differences between how it writes and how people write. GPT-3 can apparently start making things up in response to a prompt. As long as it has a next word at hand, it does not worry about whether the information is correct or true. In fact, it has no way to know.
GPT-3 also has no qualms about spreading racist rhetoric or misinformation. Garbage in, garbage out, as the saying goes.
Of course, human writers can do all this too, which is why in working with students we need to do more than help them understand how to place words into sentences and paragraphs. We must also help them absorb and internalize what I call the "writer's practice": the skills, knowledge, attitudes, and habits of mind that writers employ.
I think it might be interesting to give GPT-3 a prompt asking it to compare and contrast how GPT-3 and Joan Didion use grammar in their writing, based on Didion's famous line: "Grammar is a piano I play by ear, since I seem to have been out of school the year the rules were mentioned. All I know about grammar is its infinite power. To shift the structure of a sentence alters the meaning of that sentence, as definitely and inflexibly as the position of a camera alters the meaning of the object photographed."
I wonder what it would say?
Last year I wrote about an experiment in which GPT-3 attempted to answer writing prompts from a college curriculum, and how it successfully reproduced the dreary responses many students churn out to prove they did something class-related, even if they haven't learned much of interest.