Samuel, beat poet

I didn't contribute to NaNoGenMo this year: on the spur of the moment, I decided to try writing 50,000 words of fiction, rather than generating it, and blogged briefly about it at Nannygoat Hill. I didn't even get around to feeding the results into a neural net, like I …

Sartor Resartus RNN

g CINA CONTENC MI CINE SI CENTENI SEE CONDENN DE CRE SU DE QUAENT ME CONNE CUNTUR DEACLETICE (FIUD SEE DED SEE SER SER SER SEENNN SER SER SEENN SER SER SEENDENNENCEN (_DER SER K of my own striking the more than the spirit of the morning of the Professor …

Glossatory retrained

Glossatory is a recurrent neural net trained on all of the definitions in the lexical database WordNet. I recently retrained it on a GPU, increasing the number of hidden units to 512 and the number of layers from 2 to 3. The results are now a lot more grammatical, and there's a …
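
As a rough sketch of what that configuration amounts to, here is an equivalent model in PyTorch, assuming a character-level LSTM. This is illustrative only: the class name, vocabulary size and other details below are assumptions, not the code Glossatory was actually trained with.

    import torch
    import torch.nn as nn

    class CharRNN(nn.Module):
        # Character-level RNN matching the retrained configuration:
        # 512 hidden units, 3 recurrent layers. The vocabulary size is a
        # placeholder; the real character set comes from the WordNet
        # definitions used for training.
        def __init__(self, vocab_size=128, hidden_size=512, num_layers=3):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, hidden_size)
            self.rnn = nn.LSTM(hidden_size, hidden_size, num_layers,
                               batch_first=True)
            self.decode = nn.Linear(hidden_size, vocab_size)

        def forward(self, x, state=None):
            # x: (batch, seq_len) tensor of character indices
            out, state = self.rnn(self.embed(x), state)
            return self.decode(out), state

    # Move to GPU if one is available (the retraining was done on a GPU).
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = CharRNN().to(device)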

NaPoGenMo 2017

I found out about NaPoGenMo halfway through April, and decided to do something quick and throwaway for it, unlike my last couple of NaNoGenMo entries, which took much more effort than the end results justified. (I found out about it on Mastodon, but that's a story for another blog …

Annales: Details

This was the first time I submitted a NaNoGenMo entry which aimed at being readable, in contrast to my previous efforts, Everywhere Dense and Neuralgae, which were really abstract. Annales is a generated parody of a very ancient genre, the chronicle, which tells history as a sequence of reigns, deaths …
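
The generator itself isn't excerpted here, but the basic shape, history told as a flat sequence of reigns and deaths, can be sketched in a few lines of Python. The names, dates and causes of death below are placeholders for illustration, not the vocabulary Annales actually draws on.

    import random

    # Placeholder vocabulary, for illustration only.
    RULERS = ["Aelric", "Berthane", "Cuthwin", "Doria"]
    DEATHS = ["of a fever", "in battle", "at a great age", "by treachery"]

    def chronicle(start_year=701, reigns=4):
        # Each reign is an accession, a span of years, and a death,
        # reported entry by entry in the style of an annal.
        year = start_year
        entries = []
        for ruler in random.sample(RULERS, reigns):
            entries.append(f"In the year {year}, {ruler} began to reign.")
            year += random.randint(3, 40)
            entries.append(f"In the year {year}, {ruler} died {random.choice(DEATHS)}.")
            year += 1
        return "\n".join(entries)

    print(chronicle())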
