Language Models

Markov Chains

Sequential Data and Recurrent Neural Networks

Transformers and Large Language Models

among the reasons I use large pre-trained language models sparingly in my computer-generated poetry practice is that being able to know whose voices I'm speaking with is... actually important, as is being able to understand how the output came to have its shape - @aparrish, full thread

LLM Training

Datasets for LLMs

Climate Impact

Code Examples and Implementations

Markov chains
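
The linked examples are the course versions; as a rough illustration of the underlying idea, here is a minimal word-level Markov chain in plain JavaScript (the source text, order, and function names are arbitrary, not taken from the course code):

```js
// Build a map from each sequence of `order` words to the words that follow it.
function buildModel(text, order = 1) {
  const words = text.split(/\s+/);
  const model = {};
  for (let i = 0; i < words.length - order; i++) {
    const key = words.slice(i, i + order).join(' ');
    if (!model[key]) model[key] = [];
    model[key].push(words[i + order]);
  }
  return model;
}

// Walk the model, picking a random continuation at each step.
function generate(model, order = 1, length = 20) {
  const keys = Object.keys(model);
  let current = keys[Math.floor(Math.random() * keys.length)];
  const output = current.split(' ');
  for (let i = 0; i < length; i++) {
    const options = model[current];
    if (!options) break; // dead end: no observed continuation
    output.push(options[Math.floor(Math.random() * options.length)]);
    current = output.slice(-order).join(' ');
  }
  return output.join(' ');
}

const source = 'the quick brown fox jumps over the lazy dog and the quick red fox sleeps';
console.log(generate(buildModel(source, 1), 1, 15));
```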

Replicate

Examples will be shared over email due to the use of the ITP proxy server.

  • 🎨 Single Prompt + Reply to Llama hosted on Replicate
  • 💬 ChatBot Conversations with Llama hosted on Replicate. This follows the specification in the Llama 3 Model Card. (A rough sketch of the underlying API call appears below.)
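
Without the ITP proxy, the single-prompt pattern above boils down to something like this sketch, assuming the official replicate npm client and the meta/meta-llama-3-8b-instruct model (the input parameter names come from Replicate's public docs and may vary by model; the class examples route through the proxy instead):

```js
// Minimal single prompt + reply via Replicate's Node client.
// Assumes: npm install replicate, and REPLICATE_API_TOKEN set in the environment.
import Replicate from 'replicate';

const replicate = new Replicate({ auth: process.env.REPLICATE_API_TOKEN });

const output = await replicate.run('meta/meta-llama-3-8b-instruct', {
  input: {
    prompt: 'Write a two-line poem about fog.',
    max_tokens: 128, // assumed parameter name; check the model page on Replicate
  },
});

// Llama models on Replicate return the reply as an array of text chunks.
console.log(output.join(''));
```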

Ollama

These examples require working with p5.js locally on your computer and outside of the web editor. Some resources for doing so can be found in my workflow video series.
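
Once Ollama is installed and a model has been pulled (for example with "ollama pull llama3"), the local examples amount to a single HTTP request to Ollama's REST API. A minimal sketch, assuming the default port and the llama3 model:

```js
// Minimal request to a locally running Ollama server (default: http://localhost:11434).
// Assumes Ollama is running and the llama3 model has already been pulled.
const response = await fetch('http://localhost:11434/api/generate', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    model: 'llama3',
    prompt: 'Explain a Markov chain in one sentence.',
    stream: false, // return one JSON object instead of a stream of tokens
  }),
});

const data = await response.json();
console.log(data.response); // the generated text
```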

Transformers.js
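
Transformers.js runs models directly in the browser (or in Node) with no server at all. A minimal text-generation sketch, assuming the @xenova/transformers package and the small Xenova/distilgpt2 model rather than any specific course example:

```js
// Minimal text generation with Transformers.js; the model is downloaded
// and cached in the browser (or locally in Node) on first run.
import { pipeline } from '@xenova/transformers';

const generator = await pipeline('text-generation', 'Xenova/distilgpt2');

const result = await generator('Once upon a midnight dreary', {
  max_new_tokens: 30,
});

console.log(result[0].generated_text);
```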

Assignment

  • Read Language models can only write ransom notes by Allison Parrish and review The Foundation Model Transparency Index.
  • Experiment with prompting a language model in some way other than through a provided interface (e.g., ChatGPT) and document the results in a blog post. You can use any of the code examples above and/or try a variety of LLMs locally with ollama. Reflect on one or more of the following questions:
    • How does the concept of LLMs as “ransom notes” influence your perception of using these models creatively?
    • How does the hidden origin of the text that LLMs generate affect your sense of authorship or originality in your creative work?
    • How does the metaphor of "collage" used to describe LLMs align with or differ from your creative process?
    • How would you compare working with an LLM to other forms of text generation, such as using a Markov chain?
  • Document your experiments in a blog post and add a link to the Assignment Wiki page. Remember to include visual documentation (screenshots, GIFs, etc.).