202004062216
Language Generation
tags: [ nlp , fun , gpt , _teaching ]
- What if you started with GPT-2, fine-tuned it on erotica (the bdsm tag from an erotic story site), and then prompted it with Bible verses? comedy gold via reddit
- seems a little crude to have to do the “conditioning” via these “prompts”
- although, in some sense, the prompt is sort of like an initial seed. i wonder if the conditioning could instead be exposed as an explicit parameter
- or, at the very least, it seems really crude to have to keep re-prompting with the generated text via some kind of sliding window (which I don’t completely understand yet — need to figure out how that actually works)
- this is part of something called NaNoGenMo, a really fun application of all these NLP things #teaching
- links:
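A toy sketch of the sliding-window re-prompting idea above — this is not real GPT-2 code, and `toy_model` is a made-up stand-in for the language model. The point is just the mechanics: each step, the model conditions only on the last `window` tokens, so the original prompt gradually falls out of the context as generation proceeds.

```python
def generate(prompt_tokens, model, steps, window):
    """Autoregressive generation with a fixed-size context window."""
    tokens = list(prompt_tokens)
    for _ in range(steps):
        context = tokens[-window:]       # truncate to the last `window` tokens
        next_token = model(context)      # model conditions only on this window
        tokens.append(next_token)
    return tokens

# Hypothetical stand-in "model": predicts the successor of the last token
# in a tiny 5-token vocabulary (just enough to exercise the loop).
vocab_size = 5
toy_model = lambda ctx: (ctx[-1] + 1) % vocab_size

out = generate([0, 1, 2], toy_model, steps=4, window=3)
print(out)  # [0, 1, 2, 3, 4, 0, 1]
```

With a real model the loop is the same shape; the window exists because the transformer has a fixed maximum context length, which is why long generations have to be continually "re-prompted" with their own recent output.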