Today, @horse_ebooks sorta died. Long live @horse_ebooks!
I remember playing around with Markov chains back in the day, and thought that it would be fun to see what that looked like when fed all 15,000 of my Tweets.
How do Markov chains work?
They are pretty neat. There are a couple of different ways to do them, but the general idea is that they take some corpus of text and keep track of how often various words appear before and after each other. Then, using some amount of randomness, they construct sentences based on those frequencies — sentences that have most likely never been put together before, but that, in some strange alternate grammatical universe, might have been.
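Here's a minimal sketch of that idea in Python. This isn't the code I actually used — the corpus below is made up for illustration, and a real version would be fed a tweet archive and track longer word prefixes — but it shows the two steps: count which words follow which, then walk those links at random.

```python
import random
from collections import defaultdict

# Toy corpus standing in for a pile of tweets (illustrative only).
corpus = (
    "I like to build things. I like to write things. "
    "I want to build a robot that can write things."
)

def build_chain(text):
    """Map each word to the list of words observed following it.

    Repeats are kept on purpose: a word that follows 'I' twice is
    twice as likely to be picked during generation.
    """
    chain = defaultdict(list)
    words = text.split()
    for before, after in zip(words, words[1:]):
        chain[before].append(after)
    return chain

def generate(chain, start, max_words=12):
    """Walk the chain from a start word, picking each successor at random."""
    word = start
    out = [word]
    for _ in range(max_words - 1):
        followers = chain.get(word)
        if not followers:
            break  # dead end: no word ever followed this one
        word = random.choice(followers)
        out.append(word)
    return " ".join(out)

chain = build_chain(corpus)
print(generate(chain, "I"))
```

Even with a corpus this tiny you get recombinations like "I want to write things." — the Frankensentence effect gets much better (and weirder) with thousands of tweets.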
The result? Frankensentences that have a bit of the personality of the original text, but often in sort of ridiculous new configurations. Pretty funny (to me at least).
And of course many more.
A couple of people asked how this is done:
So I posted the (relatively disorganized but hopefully comprehensible) code here:
Follow @buster_ebooks here. And let me know if you do something with this. Enjoy!