Last year I went right over the top for National Novel Generation Month, writing a dynastic succession simulator and blogging about it at length. Where did I get the time for that, I find myself asking.
This year I made something simpler. Formations is a parody or homage to …
Here are all 1000 classes in the ImageNet challenge, rendered with the Visual Geometry Group's 19-layer neural net, using the same deepdraw technique I used for the thousand faces of CaffeNet. These ones are rendered on random grayscale noise, not my face.
This is the net which the neuralgae Twitter …
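For anyone curious what deepdraw is actually doing under the hood: it is gradient ascent on the input image itself, nudging the pixels so that one class's score keeps going up, with occasional blurring so the result doesn't dissolve into high-frequency noise. Below is a minimal sketch of that idea. It uses PyTorch and torchvision's VGG-19 rather than the Caffe setup these renders were actually made with; the class index, step count and step size are placeholders, and the real recipe's octave scaling and input normalisation are skipped.

```python
import torch
import torchvision.transforms.functional as TF
from torchvision.models import vgg19

# Frozen, pretrained VGG-19 classifier (torchvision >= 0.13 weights API).
model = vgg19(weights="IMAGENET1K_V1").eval()
for p in model.parameters():
    p.requires_grad_(False)

target_class = 130                      # placeholder ImageNet class index
img = torch.rand(1, 1, 224, 224).repeat(1, 3, 1, 1)  # random grayscale noise
img.requires_grad_(True)

for step in range(200):
    score = model(img)[0, target_class]  # how strongly the net sees the class
    score.backward()
    with torch.no_grad():
        # Normalised gradient ascent step on the pixels themselves.
        img += 0.05 * img.grad / (img.grad.abs().mean() + 1e-8)
        img.grad.zero_()
        if step % 10 == 0:               # light blur as a regulariser
            img.copy_(TF.gaussian_blur(img, kernel_size=3))
        img.clamp_(0.0, 1.0)
```

Run for long enough, the noise turns into the net's own idea of what that class looks like.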
Glossatory is a recurrent neural net trained on all of the definitions in the lexical database WordNet. I recently retrained it on a GPU, increasing the number of hidden units to 512 and the layers from 2 to 3. The results are now a lot more grammatical, and there's a …
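For a sense of what "increasing the number of hidden units to 512 and the layers from 2 to 3" means in code, here is a minimal PyTorch sketch of a recurrent language model with that shape. Treating it as a character-level model with a 128-symbol vocabulary is purely an assumption for illustration, as is the class name; neither is taken from the actual Glossatory setup.

```python
import torch.nn as nn

class DefinitionRNN(nn.Module):
    """LSTM language model: 3 layers of 512 hidden units."""

    def __init__(self, vocab_size=128, hidden_size=512, num_layers=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.lstm = nn.LSTM(hidden_size, hidden_size, num_layers,
                            batch_first=True)
        self.out = nn.Linear(hidden_size, vocab_size)

    def forward(self, tokens, state=None):
        h, state = self.lstm(self.embed(tokens), state)
        return self.out(h), state       # logits for the next symbol

model = DefinitionRNN()   # train on WordNet glosses, then sample definitions
```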
I was tinkering around trying to get Justin Johnson's neural-style implementation working on the HPC. This is an algorithm which uses a deep-learning image-recognition net to extract low-level style features from one image and use them to render the large-scale contours of another. This technique has been commercialised …
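Concretely, the "low-level style features" in this algorithm are Gram matrices of the convolutional feature maps: for a given layer they record which filters fire together while throwing away where they fired. Here is a minimal sketch of just that piece in PyTorch; the full method also needs a content loss and an optimiser running over the output image, which I'm leaving out.

```python
import torch

def gram_matrix(features: torch.Tensor) -> torch.Tensor:
    """Gram matrix of one layer's activations, shaped (channels, H, W)."""
    c, h, w = features.shape
    flat = features.reshape(c, h * w)        # one row per filter
    return (flat @ flat.t()) / (c * h * w)   # filter co-activation, position-free

# Style loss is then a sum of MSE terms between the Gram matrices of the
# image being generated and those of the style image, at several layers.
```

Matching these matrices reproduces the style image's textures and colour statistics without copying its layout, which is why the content image's large-scale contours survive.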
@voronidols are on Twitter, and also on Mastodon at @voronidols@botsin.space and on the bots page.
The third of three posts about the details of my NaNoGenMo 2016 project, Annales
1. Vocabularies | 2. TextGen | 3. Events
The event loop is the least interesting part of Annales, but it took the most effort and time. The basic idea is that there's a structured variable which describes the …
The second of three posts about the details of my NaNoGenMo 2016 project, Annales
1. Vocabularies | 2. TextGen | 3. Events
I'm proud that Annales is the only representative of Haskell in 2016's entries, but there's a reason that it's the only one: knocking together quick text generators is really a …
The first of three posts about the details of my NaNoGenMo 2016 project, Annales
1. Vocabularies | 2. TextGen | 3. Events
Annales was originally going to be a long-form version of @GLOSSATORY, a Twitter bot I made earlier this year. Glossatory is driven by a recurrent neural net trained on around …