This one was a doozy.
Tweet ‘births’ are location-random, and their gentle fading into existence is accompanied by a random note chime in A major, the most optimistic of keys. Between that and the cotton-candy pastel background, the whole thing feels a little like a Prozac overdose – that is, unless you run an extremely popular keyword (“lol”, “god”, “music” and “fuck” are some very effective usual suspects), in which case a delightful cacophony of sound and color ensues.
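The chime mechanic can be sketched as a tiny helper. This is a minimal sketch, assuming the chimes are just frequencies picked at random from one octave of A major above A4 – the actual notes, octaves and samples in the project may well differ:

```javascript
// Semitone offsets of the A-major scale degrees from the tonic (A).
const A_MAJOR_STEPS = [0, 2, 4, 5, 7, 9, 11];

// Pick a random chime frequency (Hz) from one octave of A major.
// base defaults to A4 = 440 Hz; equal temperament: f = base * 2^(semitones/12).
function randomChimeFrequency(base = 440) {
  const step = A_MAJOR_STEPS[Math.floor(Math.random() * A_MAJOR_STEPS.length)];
  return base * Math.pow(2, step / 12);
}
```

In a browser you'd then feed the frequency to a Web Audio oscillator (or just trigger one of a handful of pre-recorded samples, which is closer to what the post describes).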
The text isn’t always legible due to the distances, but then again, these are tweets. It’s okay to not catch them all.
Playing around with the Twitter API was one of those easy, then challenging, then dumbfounding experiences. I used an npm package called ‘node-twitter’ and, as with every third-party package, the trick was learning to speak its language rather than the streaming API’s. Then there was the issue of isolating each tweet in the stream (true to its name, it tended to cluster up, but of course I wanted each individual tweet to be born in a different place, like a precious snowflake).
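The de-clustering boils down to buffering the raw stream and splitting on the `\r\n` delimiter Twitter uses between tweets, since a single socket chunk can carry several tweets or end mid-tweet. A hypothetical sketch of that logic (the helper name and exact handling are mine, not from the project):

```javascript
// Returns a chunk handler that buffers partial data and calls onTweet
// once per complete, parseable tweet.
function makeTweetSplitter(onTweet) {
  let buffer = '';
  return function handleChunk(chunk) {
    buffer += chunk;
    const parts = buffer.split('\r\n');
    buffer = parts.pop(); // keep the trailing partial tweet for the next chunk
    for (const part of parts) {
      if (part.trim().length === 0) continue; // skip keep-alive newlines
      try {
        onTweet(JSON.parse(part));
      } catch (e) {
        // ignore malformed fragments rather than crash the stream
      }
    }
  };
}
```

You'd wire the returned handler to the stream's `data` event, so each tweet gets its own moment of birth instead of arriving in clumps.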
THEN there was the issue of setting up a web socket connection to dynamically inform my three.js-based “twitter-maker” every time a new tweet should be born. While we’re at it, I didn’t have a lot of experience with canvas-based textures – or canvas-based anything, for that matter – so that was illuminating. You can certainly do a lot with canvas, if you play by its rules… Yadda yadda yadda.
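The socket wiring can be sketched roughly like this; the payload builder, the `'tweet-birth'` event name, and the client-side `spawnTweetPlane` are all invented for illustration, assuming socket.io on both ends:

```javascript
// Hypothetical payload builder: the server strips a raw tweet down to just
// what the three.js client needs, plus a random spawn position.
function tweetToBirthEvent(tweet, range = 100) {
  const rand = () => (Math.random() - 0.5) * range; // centered on the origin
  return { text: tweet.text, position: { x: rand(), y: rand(), z: rand() } };
}

// Server side (assuming a socket.io `io` instance):
//   io.emit('tweet-birth', tweetToBirthEvent(tweet));
// Client side:
//   socket.on('tweet-birth', (ev) => spawnTweetPlane(ev.text, ev.position));
```

Sending only the text and position keeps the wire payload tiny compared to shipping the full tweet object to every connected browser.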
I tried to put a much bigger emphasis on aesthetics this time around: go for a clean, clear user experience and make sure every element supported the desired ambience. It’s not perfect, but I think I pulled it off better than in the last two attempts.
Honestly, the biggest hurdle wasn’t programmatic, it was conceptual. Up until a fairly late point in the process I had this notion that I’d set up a point cloud (the obvious three.js solution for particles), and it didn’t bother me one bit that those didn’t seem to allow for dynamic creation; I’d have to gather all my tweets while I may, then “plunk” them into the scene instead of gracefully tweening them into existence one by one. It’s definitely feasible, but far from desirable (even with repeating processes – and I’d have to clean the tweet array every time, etc.). Instead, I went for the laughably obvious solution and created planes. Their position, rotation, font and color are randomized to create a nice heterogeneous feeling.
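A minimal sketch of that per-plane randomization – the fonts, palette, ranges and property names here are my own stand-ins, not the project's actual values:

```javascript
// Invented examples; swap in whatever fonts/palette the scene actually uses.
const FONTS = ['Georgia', 'Courier New', 'Verdana'];
const COLORS = ['#f4a7b9', '#a7d8f4', '#c9f4a7']; // pastel-ish placeholders

// Generate the randomized attributes for one tweet plane.
function randomPlaneAttributes() {
  const rand = (range) => (Math.random() - 0.5) * range;
  return {
    position: { x: rand(200), y: rand(200), z: rand(200) },
    rotation: { x: rand(0.5), y: rand(0.5), z: rand(0.5) }, // gentle tilts, not full spins
    font: FONTS[Math.floor(Math.random() * FONTS.length)],
    color: COLORS[Math.floor(Math.random() * COLORS.length)],
    opacity: 0, // tweened up toward 1 as the tweet fades into existence
  };
}
```

Each new plane would get these attributes applied to a three.js `Mesh` (with the text drawn to a canvas texture), then a tween animates the opacity up for the fade-in.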
I feel great about this one. The week-long deadline is continually proving its merit. I may revisit it – this one actually has “portfolio material” written all over it, if I pimp it out a little and clean up the code and comments. But even as-is it’s in a good place to leave for now; as my girlfriend demands of good work, I believe “it speaks for itself”. She also likes the chimes a lot.
Resources that really saved my butt at the many walls I hit:
On Draw Calls and why you want to take it easy with them (didn’t end up helping that much as I needed dynamic creation, but a GREAT explanation): http://simonschreibt.de/gat/renderhell/
Ilmari Heikkinen WebGL’d and animated the entire text of Kafka’s Metamorphosis! (Way more intricate than this, and very inspiring.)
This Codepen example helped me with some mouse-movement-based rotation ideas (that I didn’t end up using, but I now know of!)
Max (not that one!) is a wonderful little free utility to batch convert audio files – I used it to turn all the WAVs from Logic to OGGs which play nicer and are smaller.
This article absolutely saved me when I was stuck with a weird Heroku-related problem that ended up being unique to socket.io (watch out for those ports!).
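For the record, that Heroku gotcha boils down to binding to the port Heroku assigns via the `PORT` environment variable instead of a hard-coded one; a minimal sketch (the helper name and the local fallback value are my own):

```javascript
// Heroku assigns a dynamic port through process.env.PORT; a hard-coded port
// means the dyno never binds and requests time out.
function resolvePort(env = process.env) {
  return Number(env.PORT) || 3000; // 3000 is an arbitrary local-dev fallback
}

// server.listen(resolvePort()); // the same HTTP server socket.io is attached to
```

Attaching socket.io to that same server (rather than opening its own port) is what keeps it happy behind Heroku's router.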
That’s it for week 3 – onwards and upwards!