My friends were complaining that wedding photobooths were too expensive to rent. Could we make one for them?
Glen and I from the Porkbuns Initiative stepped up in full armor, ready to help.
What is it? It’s a self-running photobooth that uses your Mac for brains, a DSLR for eyes, a WebKit browser for its clothes, and a photo printer for… a printer. You can connect an iPad as the frontend for a nice visual touch (pun intended).
We built it on a Rails backend, serving an SVG+HTML5 frontend and using the gphoto4ruby gem as a wrapper around the camera library.
All dressed up and ready to go.
An early UI prototype.
This comes out of the printer.
Try it out
Check it out on Github.
Some UI work I did for a stealth startup in early ‘11. I was responsible for look & feel and the frontend chat interactions: jQuery/jQuery UI communicating with a CakePHP/node.js backend.
We developed this prototype with statecharts, a concept common in event-driven programming that I first learned about through SproutCore. I found statecharts really helped map out all the complex user interactions we had to handle on the page.
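To give a flavor of the approach, here’s a minimal statechart sketch in plain JavaScript. This is not our actual SproutCore code, and the state and event names are made up for illustration:

```javascript
// A tiny statechart: each state declares which events it handles
// and which state they transition to. Events a state doesn't handle
// are ignored, which keeps stray clicks from wedging the UI.
const chatStatechart = {
  initial: 'idle',
  states: {
    idle:       { OPEN_CHAT: 'connecting' },
    connecting: { CONNECTED: 'chatting', FAILED: 'idle' },
    chatting:   { SEND: 'chatting', CLOSE_CHAT: 'idle' },
  },
};

function createMachine(chart) {
  let current = chart.initial;
  return {
    get state() { return current; },
    send(event) {
      const next = chart.states[current][event];
      if (next) current = next; // ignore unhandled events
      return current;
    },
  };
}

const m = createMachine(chatStatechart);
m.send('OPEN_CHAT'); // idle -> connecting
m.send('CONNECTED'); // connecting -> chatting
console.log(m.state); // "chatting"
```

The win over ad-hoc flags is that every interaction the page supports is enumerated in one table, so you can see at a glance what’s possible from each state.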
See more screenshots from the set.
I’m on a team with BRUTE LABS, a volunteer-led design agency, working on StudentsConnect, a prototype that creates chatroulette-style interactions between students from the global North and South. Here’s a UX flow we worked on:
Is that Jon Chan I see? Yes it is. Photo credit Oakland Local.
A few notes from this one-day barcamp/hackfest. The goal was to create mobile apps for Oakland, with a special emphasis on serving underserved populations.
- Text messaging is still king. While smartphones and their apps are at the leading edge and grabbing all the attention these days, most underserved populations don’t have access to these services. Lots of apps today are using Twilio or Tropo APIs as their SMS/voice gateways.
- Open data is awesome. I learned about ScraperWiki, which basically puts site scraping code into the cloud so people can maintain your scraper script long after you’ve tossed it. Some staff from maplight.org are here.
- Open data is also difficult to maintain. One presenter mentioned how Oakland keeps a database of social services — that’s great, but what if the organization shuts down and/or changes its hours? Who’s responsible for updating the information? IIRC, the estimate was that 30% of the listings kept by the City are no longer valid. There need to be active efforts to combat dataset decay.
- By the way, here’s a list of Oakland GIS datasets. Some of it really sucks (filled with garbage spam data). Some of it is useful.
- comtxt: text gateway for community organizers. github.com/ryanjarvinen/comtxt
- freexchange.org. mobile interface to donated goods.
- oakland:pm. get oakland high-schoolers connected to afterschool programs. github.com/jedp/oakland.pm
- txt2wrk: text-based job matching. this one is unique because it’s a complete SMS/voice-based interface. it connects to craigslist: call it and it reads back craigslist job postings. targeted at parolees.
- Oakland Food Finder: helping people find healthy, locally-grown food.
- BettaSTOP: helps people find buses and access bus schedules. in oakland, many underserved communities depend on catching the right bus. it also lets users give feedback on buses, remark on their timeliness, and discuss bus route features. Live & in production: http://www.bettastop.net.
What I did
I’m helping out the Oakland:PM team, which is in the process of building out a service to get high schoolers connected to city-funded afterschool programs. The idea is that they can pull up their mobile phones and see what’s available to them while they’re kicking it with friends and bored out of their minds.
While the others hacked on wireframes and some code, I worked on a few user stories and resolved to interview a few of my contacts who work for the YMCA in East Oakland. We won a $500 grant from the City to see this thing through in July. Here’s hoping that we’ll make it.
So far the app is a node.js app with pages served by the Express framework. We threw around ideas of using Sencha Touch, but I think that decision is out of my hands. We’ll see how we proceed.
Source code can be found at: http://www.github.com/jedp/oakland.pm
This design was created from a Processing sketch that breaks up the preamble to the UN Declaration of Human Rights and connects adjacent words together with lines. More frequent word associations are drawn with darker, thicker lines.
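The pair-counting at the heart of the sketch is simple enough to show in a few lines. This is a hypothetical JavaScript re-sketch of the idea, not the actual Processing source:

```javascript
// Count how often each adjacent word pair occurs in a text.
// In the sketch, higher counts map to darker, thicker lines.
function adjacentPairCounts(text) {
  const words = text.toLowerCase().split(/\W+/).filter(Boolean);
  const counts = {};
  for (let i = 0; i < words.length - 1; i++) {
    const pair = words[i] + ' ' + words[i + 1];
    counts[pair] = (counts[pair] || 0) + 1;
  }
  return counts;
}

const counts = adjacentPairCounts(
  'freedom and justice and freedom and peace'
);
console.log(counts['freedom and']); // 2 — drawn thicker than pairs seen once
```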
The source code (albeit messy) can be found at www.github.com/andrewhao/freedom-sunday. I’m running the sketch in full Java mode, so be sure to compile with your Java IDE of choice (rather than the Processing IDE).
Inspiration taken from the designs of Harry Kao (http://www.hairycow.name) and Jer Thorp (http://blog.blprnt.com).
My fellow interns and I are headed off to Cebu for 10 days to observe organizations involved in anti-trafficking efforts. More info: interns.regenerationweb.com
Here’s how the installation looked on the day of the art show.
We mounted the installation on the inside of the Regeneration cafe. The Arduino lies behind the MacBook behind the monitor.
The LEDs are mounted on breadboards suspended on fishing wire, binder clips, rubber bands, chopsticks, and a prayer.
Finished it just in time.
Save our souls - Twitter art installation from Andrew Hao on Vimeo.
What are people saying about the ashes in the world today? This installation visualizes a live Twitter stream on heartache, injustice, loss, and our city, and matches those tweets with the redemptive promises of Isaiah.
Life is difficult, and redemption is something we all long for. What changes do you hope for in your life or in the world? Send a response from your Twitter account to @sos_61 and watch the installation react. If you’d like to be kept anonymous, send your response in a DM to @sos_61.
“I hope for ____”
“I wish that ____”
“I want to see ____”
A few notes
- The Web interface is a fullscreen Google Chrome window. socket.io is the WebSocket interface to the node.js backend. The slide transition is animated via CSS3, and the red overlay is a simple SVG shape plotted with the help of RaphaëlJS.
- The Twitter backend is a collection of four self-updating Twitter searches, one each for heartache (“i feel lonely, sad, depressed”), injustice (“violence, war, oppression, justice”), death (“rest in peace, passed away”), and Oakland (“oakland”). A blacklist filters out tweets containing undesirable keywords (“justin bieber”).
- Additionally, the backend connects to Twitter via the Streaming API and displays a special animation for users who reply via tweet to the @sos_61 account.
- The installation picks a tweet to display and pulses the corresponding LED array.
- Communication with the Arduino happens via a Python script speaking the Firmata protocol, using the python-firmata library. The node.js server signals the script over a socket connection, and the script runs the pulse animation on the correct pin.
- I printed the graphic on an oversize printer with the good folks at Alameda Copy. Friendly service, fast turnaround, very reasonable prices. Ask for Joe.
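The search-and-filter step can be sketched roughly like this. This is a hypothetical JavaScript illustration, not the actual installation code; the term lists are abbreviated from the notes above:

```javascript
// Assign a tweet to a category if it matches that category's search
// terms, unless it trips the blacklist first. The real term lists
// were longer than these.
const categories = {
  heartache: ['i feel lonely', 'sad', 'depressed'],
  injustice: ['violence', 'war', 'oppression', 'justice'],
  death:     ['rest in peace', 'passed away'],
  oakland:   ['oakland'],
};
const blacklist = ['justin bieber'];

function categorize(tweet) {
  const text = tweet.toLowerCase();
  if (blacklist.some((term) => text.includes(term))) return null;
  for (const [name, terms] of Object.entries(categories)) {
    if (terms.some((term) => text.includes(term))) return name;
  }
  return null;
}

console.log(categorize('Rest in peace, grandpa'));           // "death"
console.log(categorize('justin bieber makes me depressed')); // null (blacklisted)
```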
I just spent five hours trying to figure out why none of the Firmata libraries for Python were working over my serial connection. I was wondering why the previous program remained on the board and none of the signals I sent were registering.
Hint: you need to load Firmata first onto the board before it can understand the protocol. Oh. Duh.
Don't forget to load up "OldStandardFirmata"
Only “OldStandardFirmata” (Firmata 2.0) seems to work with my version of python-firmata. The newer Firmata PDEs can talk Firmata 2.1 and 2.2, but I’m too tired to figure them out.
So I got the poster printed, and the LEDs currently show through the board pretty well. This is good:
But last night I spent a good chunk of my evening and early morning hitting a lot of walls:
- I might have to throw out the idea of using the Twitter Streaming API. For the kind of searches I need to do, I just can’t get enough granularity out of live information. Plus, I can only open one connection to the API at a time, which is no good since I need to run four listeners at once.
- I couldn’t get the Classifier Ruby gem to work, which looked like the easiest implementation of an LSI/naive Bayes classifier out there. The next closest thing was NLTK, and there was no way in heck I had the time to figure that out. Plus, I realized that creating a training set was a LOT more work than I’d thought. So… scratch the machine intelligence. I’m just going to search for specific search terms manually.
- New solution: Periodically use the Search API to grab results. This allows more exact search results and gives me the ability to tweak the search terms while the demo loop is running.
- Event-driven programming is throwing me for a loop (ha, get it?). After perusing the node.js docs for the better part of an evening, I think I need to re-architect the code. I need to create a simple event-driven queue, which is frustrating because it seems like such a simple thing, yet it isn’t built in. node provides very little out of the box.
- Setting up a socket connection to the Arduino could be more difficult than I thought. I may have to open a socket in Python with python-firmata to interface with the Arduino, since other Arduino/serial proxies reportedly don’t work well with Snow Leopard.
- I haven’t yet thought about the Web interface.
- I haven’t thought about how I’m going to hang the piece. I have some scrap wood and fishing wire, but I haven’t thought about whether it’s possible to hang all those LED arrays w/o some weird gravity issues.
- Woodwork help? I need to figure out how to use a rotary saw.
So uh, yeah, I’m getting nowhere but at least I know what I still need to do.
A quick update on the art project:
- I switched from my normal LEDs to “straw hat” wide-angle LEDs. The viewing angle is much better, meaning I get a more diffuse glow at closer distances, so the light comes through the paper better.
- I found out my deadline is Sunday so I’m a little stressed.
- No progress on the Arduino/Processing side or the Twitter feed. I still have to figure out whether to use a Bayesian classifier for better fuzzy matching on tweets or just rig the feed with a simple text filter on “I feel ___”. I suspect the former is nice in theory but wouldn’t be good enough, and the latter is probably best given the short time span.
- No idea how I’m going to solder the LEDs to the breadboards. Heck, I think my breadboards aren’t even wide enough, meaning I either need to find a larger breadboard or cut one up and paste the two pieces together lengthwise.
- I still need to find wood to mount/suspend the project.
- I called around and got some good quotes from print shops. I’m probably going to go with Alameda Copy. Friendly staff, good Yelp review, best price I saw around. I’ll be printing the poster off their oversize printer on a roll of 70lb paper. Quote: $6-8 per sq. foot, color.
- I’ve made some good progress on the design, though. I think I’m lengthening the poster but decreasing the width. I’ll post more screens if I get a chance.
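For the “rig the feed” option, the simple text filter would look something like this hypothetical sketch:

```javascript
// The low-tech option: a regex that matches "I feel <something>"
// and pulls out what follows, rather than trying to classify the
// whole tweet with machine learning.
function extractFeeling(tweet) {
  const match = tweet.match(/\bi feel ([\w\s']+)/i);
  return match ? match[1].trim() : null;
}

console.log(extractFeeling('Some days I feel so alone in this city')); // "so alone in this city"
console.log(extractFeeling('Go Warriors!')); // null — no match, skip the tweet
```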
Left: conventional LEDs. Right: straw hat LEDs