Experiments: Language



Zasve is a fictional language created for a very logical people in a densely stratified society that suppresses emotion. Based on principles of mathematical equations and geometric purity, with the triangle central to the culture, Zasve was an interesting experiment in world-building, the mechanics of language creation, and the possibilities of alternative methods of communication.

Find the entire Zasvian travel guide and a little book of poetry here.




   Zasve    ·    Mentored by Abe Burickson     ·     2017     ·    Created with Emilia Riane & Priyanka Kumar    ·      Read here






GPT-2 is an open-source AI model with 1.5 billion parameters, trained on 40 GB of internet text to predict the next word in context. Someone once called it ‘autocorrect on steroids.’
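The predict-the-next-word principle can be sketched in miniature. The toy below is not GPT-2 (no neural network, no billions of parameters) — just a bigram model over a made-up ten-word corpus — but it runs on the same idea: given the words so far, sample a plausible next word, append it, and repeat.

```python
import random

# Toy bigram "language model": count which word follows each word
# in a tiny corpus, then sample the next word from those counts.
# GPT-2 applies the same next-token prediction at vastly larger scale.
corpus = "the machine writes and the author reads and the machine answers".split()

counts = {}
for prev, nxt in zip(corpus, corpus[1:]):
    counts.setdefault(prev, []).append(nxt)

def next_word(word, rng=random):
    """Sample a plausible next word given the previous one."""
    return rng.choice(counts.get(word, corpus))

# Generate a short continuation from a one-word prompt.
rng = random.Random(0)
text = ["the"]
for _ in range(5):
    text.append(next_word(text[-1], rng))
print(" ".join(text))
```

In this corpus, "and" is always followed by "the", so the model predicts it every time; words with several observed successors get a weighted coin flip instead — the same statistical logic, stripped of everything that makes GPT-2 interesting to talk to.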

Author¹ GPT-2 is a little book of curated dialogues with the machine on the nature of language. I gave it an initial prompt, then kept feeding the most interesting content it generated back in until I had 31 pages of raw data.

The idea was to perform each idea rather than just state it. Everything in black is the machine; everything in red is the author. The entire book can be found here.




       Author¹ GPT-2    ·    Mentored by Abe Burickson     ·     2017     ·    Created with GPT-2    ·      Read here