In collaboration with Emilia Riane & Priyanka Kumar
GPT-2 is an open-source language model with 1.5 billion parameters, trained on 40 GB of internet text, that predicts the next word in a given context. Someone once called it "basically autocorrect on steroids."
Author¹ GPT2 is a little book of curated dialogues with the machine on the nature of language. I gave it an initial prompt, then kept feeding interesting passages of its output back in as new prompts until it had produced 31 pages of raw material.
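For readers curious to try the same loop, here is a minimal sketch of the feedback process, assuming the Hugging Face `transformers` library (the original tooling isn't stated in the text), with `pick_interesting` as a hypothetical stand-in for the human curation step.

```python
# A sketch of the prompt-feedback loop described above.
# Assumptions: Hugging Face transformers as the GPT-2 backend;
# pick_interesting() is a hypothetical placeholder for human curation.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

def pick_interesting(text: str) -> str:
    """Placeholder for the human step: choosing which generated
    passage to feed back in. Here it just keeps everything."""
    return text

prompt = "What is the nature of language?"
pages = []
while len(pages) < 31:  # roughly approximating "31 pages of raw material"
    out = generator(prompt, max_new_tokens=120, do_sample=True)[0]["generated_text"]
    pages.append(out)
    # Feed the tail of the curated output back in as the next prompt.
    prompt = pick_interesting(out)[-512:]
```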
The idea was to perform each idea rather than just state it. Everything in black is the machine; everything in red is the author. The entire book can be found here.