The Sweet Spot

Andrew Hao's thoughts about software engineering, design, and anything shiny.

Ohm gotchas


Here’s a list of things that have been annoying, or at least a bit frustrating, about using Ohm, the Redis ORM, in a Rails app. A warning to those who assume Ohm is just ActiveRecord in new clothes: it looks the part, but it isn’t:

CRUD

Don’t make the mistake of treating your Ohm objects like AR:

ActiveRecord        Ohm
destroy             delete
self.find(id)       self[id]
update_attributes   update
create              create

Also note that Ohm’s update_attributes behaves differently from Rails’: it doesn’t persist the updates to the DB. That owned me for a good part of the day.
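To make the gotcha concrete, here’s a minimal plain-Ruby sketch of the semantics described above. This is not Ohm’s actual implementation; the FakeModel class and its persisted store are made up purely to illustrate the difference:

```ruby
# FakeModel is a stand-in, NOT Ohm. The point it illustrates:
# update_attributes only assigns attributes in memory, while
# update assigns AND persists.
class FakeModel
  attr_reader :attributes, :persisted

  def initialize(attributes = {})
    @attributes = attributes
    @persisted  = attributes.dup
  end

  # In-memory assignment only -- the store never sees it.
  def update_attributes(attrs)
    attributes.merge!(attrs)
  end

  # Assign, then save to the store.
  def update(attrs)
    update_attributes(attrs)
    save
  end

  def save
    @persisted = attributes.dup
  end
end

post = FakeModel.new(title: "draft")
post.update_attributes(title: "final")
post.persisted[:title] # => "draft"  (nothing was saved!)
post.update(title: "final")
post.persisted[:title] # => "final"
```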

Callbacks

Thankfully, these are ActiveRecord-like with the addition of ohm/contrib.

Associations

ActiveRecord          Ohm
has_one / belongs_to  reference
has_many              collection
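As a rough illustration of that mapping, here’s a toy stand-in for the two declarations. ToyModel and these class bodies are hypothetical (real Ohm wires these up against Redis); this only shows the shape: reference acts like a single-object association, collection like a list.

```ruby
# ToyModel is a made-up stand-in, NOT Ohm's implementation.
class ToyModel
  # Like belongs_to/has_one: a single associated object.
  def self.reference(name, _model)
    attr_accessor name
  end

  # Like has_many: a list of associated objects.
  def self.collection(name, _model)
    define_method(name) do
      instance_variable_get("@#{name}") ||
        instance_variable_set("@#{name}", [])
    end
  end
end

class Post < ToyModel
  reference  :author, :User      # ActiveRecord: belongs_to :author
  collection :comments, :Comment # ActiveRecord: has_many :comments
end

post = Post.new
post.author = "andrew"
post.comments << "first comment"
post.comments # => ["first comment"]
```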

Read this article if you’re considering creating associations from AR objects to Ohm objects and the other way ‘round.

Now at Blurb


I should have mentioned this long ago, but I started work at Blurb in early August. It’s been a quick ramp-up and I’m loving it there, surrounded by smart engineers and great designers. I do Rails/JS work there, and I’m building a lot of chops around Agile/TDD methodologies.

Anyways, they had me write a Camera Thursdays blog post about my Nikon/1.8 camera combo:

mmtss, a collaborative loop station


mmtss is a loop station built for live performances.

Let’s make music together! This project simplifies a traditional loop tracking station and is designed for interactive collaborative music performances.

The idea: Everybody adds or modifies one “part” of a 32-bar loop. The user gets to play an instrument over the existing mix and record the 32-bar phrase when she or he is ready. Once the person is finished, the project selects another instrument at random for the next viewer to record.

It’s an Ableton Live controller serving a Webkit view: node.js on the backend, socket.io + RaphaelJS on the front. Communication with Live happens through the LiveOSC plugin via sockets.

Displayed at the Regeneration “We Collaborate” art show in Oakland, CA. 9/24/2011.

Screenshots

Practice mode

mmtss in practice/playback mode. Here the user is able to practice/mess around with the current instrument to prepare to record the next track.

Cued mode

Pressing “record” puts the user in a wait state. They are prompted to begin recording when all the black boxes count down and disappear.

Record mode

mmtss in record mode.

More screenshots: http://www.flickr.com/photos/andrewhao/sets/72157627640840853/

Source code

Github: http://www.github.com/andrewhao/mmtss.

MIT/GPL-sourced for your coding pleasure.

Installation

  • Make sure you have npm installed: http://www.npmjs.org

  • Copy lib/LiveOSC into /Applications/Live x.y.z OS X/Live.app/Contents/App-Resources/MIDI\ Remote\ Scripts/ folder

  • Set it as your MIDI remote in the Ableton Live Preferences pane, in the “MIDI Remote” tab.

Running it

  • Open Mmtss_0.als as a sample Live project.

  • Install all project dependencies with npm install from the project root.

  • Start the Node server with node app.js from the root directory.

  • Open a Web browser and visit localhost:3000

Modifying the sample project

You can modify this project to suit your own needs. Note that there are two sets of tracks: instrument (MIDI input) tracks, and loop tracks that actually store the clips.

For n instrument tracks, you can add or remove your own instruments. Just make sure that the instrument at track x corresponds to loop track x + n.
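The layout rule above is easy to sketch. The value of n and the numbering-from-zero convention here are assumptions for illustration:

```ruby
# With n instrument tracks, instrument track x pairs with
# loop track x + n (zero-based numbering assumed).
def loop_track_for(instrument_track, n)
  instrument_track + n
end

n = 4 # hypothetical: four instrument/loop pairs
(0...n).map { |x| [x, loop_track_for(x, n)] }
# => [[0, 4], [1, 5], [2, 6], [3, 7]]
```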

Credits

License

MIT and GPLv3 licensed. Go for it.

You will, however, need to get a license for Ableton Live yourself.

The handsome collaborators

Introducing Boink, a photobooth for the rest of us.


My friends were complaining that wedding photobooths were too expensive to rent. Could we make one for them?

Glen and I from the Porkbuns Initiative stepped up in full armor, ready to help.

What is it? It’s a self-running photobooth that uses your Mac for brains, a DSLR for eyes, a Webkit browser for its clothes, and a photo printer for… a printer. You can connect an iPad as the frontend for a nice visual touch (pun intended).

We built it on a Rails backend, pushing SVG+HTML5 on the frontend and using the gphoto4ruby gem as a camera library wrapper.

All dressed up and ready to go.

An early UI prototype.

This comes out of the printer.

Try it out

Check it out on Github.

Chat App - Frontend Prototype


Chat View - 1

Interview View

Some UI work I did for a stealth startup in early ’11. I was responsible for the look & feel and the frontend chat interactions: jQuery UI communicating with a CakePHP/node.js backend.

We developed this prototype with statecharts, a concept commonly found in event-driven programming that I first learned about from Sproutcore. I found it really helped map out all the complex user interactions we had to deal with on the page.

See more screenshots from the set.

Code For Oakland Barcamp


Is that Jon Chan I see? Yes it is. Photo credit Oakland Local.

A few notes from this one-day barcamp/hackfest. The goal was to create mobile apps for Oakland, with a special emphasis on serving underserved populations.

  • Text messaging is still king. While smartphones and their apps are at the leading edge and grabbing all the attention these days, most underserved populations don’t have access to these services. Lots of apps today are using Twilio or Tropo APIs as their SMS/voice gateways.

  • Open data is awesome. I learned about ScraperWiki, which basically puts site-scraping code into the cloud so people can maintain your scraper script long after you’ve tossed it. Some staff from maplight.org were there.

  • Open data is also difficult to maintain. One presenter mentioned how Oakland keeps a database of social services—that’s great, but what if the organization shuts down and/or changes its hours? Who’s responsible for updating the information? IIRC, the estimate was that 30% of the listings kept by the City are no longer valid. There need to be active efforts to combat dataset decay.

  • By the way, here’s a list of Oakland GIS datasets. Some of it really sucks (filled with garbage spam data). Some of it is useful.

Today’s Projects:

  • comtxt: text gateway for community organizers. github.com/ryanjarvinen/comtxt

  • freexchange.org. mobile interface to donated goods.

  • oakland:pm. get oakland high-schoolers connected to afterschool programs. github.com/jedp/oakland.pm

  • txt2wrk: text-based job matching. this one is unique because it is a complete SMS/voice based interface. connects to craigslist. call it and it reads back craigslist job postings. targeted for parolees.

  • Oakland Food Finder: helping people find healthy, locally-grown food.

  • BettaSTOP: help people find buses, access bus schedules. in oakland, many underserved communities depend on finding and catching the right bus. it also allows users to give feedback on buses, remark on their timeliness, and talk about bus route features. Live & in production: http://www.bettastop.net.

What I did

I’m helping out the Oakland:PM team, which is in the process of building out a service to get high schoolers connected to city-funded afterschool programs. The idea is that they can pull up their mobile phones and see what’s available to them while they’re kicking it with friends and bored out of their minds.

While the others hacked on wireframes and some code, I worked on a few user stories and resolved to interview a few of my contacts who work for the YMCA in East Oakland. We won a $500 grant from the City to see this thing through in July. Here’s hoping that we’ll make it.

So far the app is a nodejs app with pages served with the Express framework. We threw around ideas of using Sencha Touch, but I think that decision is out of my hands. We’ll see how we proceed.

Source code can be found at: http://www.github.com/jedp/oakland.pm

UN Declaration of Human Rights (Visualization)



This design was created from a Processing sketch that breaks up the preamble to the UN Declaration of Human Rights and connects adjacent words together with lines. More frequent word associations are drawn with darker, thicker lines.

The source code (albeit messy) can be found at www.github.com/andrewhao/freedom-sunday. I’m running the sketch in full Java mode, so be sure to compile with your Java IDE of choice (rather than the Processing IDE).

Inspiration taken from the designs of Harry Kao (http://www.hairycow.name) and Jer Thorp (http://blog.blprnt.com).

My fellow interns and I are headed off to Cebu for 10 days to observe organizations involved in anti-trafficking efforts. More info: interns.regenerationweb.com

Save our souls - a Twitter art installation


Here’s how the installation looked on the day of the art show.

We mounted the installation on the inside of the Regeneration cafe. The Arduino lies behind the Macbook behind the monitor.

The LEDs are mounted on breadboards suspended on fishing wire, binder clips, rubber bands, chopsticks, and a prayer.

Finished it just in time.

Save our souls – Twitter art installation from Andrew Hao on Vimeo.

What are people saying about the ashes in the world today? This installation visualizes a live Twitter stream on heartache, injustice, loss, and our city and matches them up with the redemptive promises of Isaiah.

Life is difficult, and redemption is something we all long for. What changes do you hope for in your life or in the world? Send a response from your Twitter account to @sos_61 and watch the installation react. If you’d like to be kept anonymous, send your response in a DM to @sos_61.

“I hope for ____” “I wish that ____” “I want to see ____”

A few notes

  • Web interface is a fullscreen Google Chrome window. socket.io is the Websocket interface to the node.js backend. The slide transition is animated via a CSS3 animation, and the red overlay is a simple SVG shape plotted with the help of RaphaëlJS.

  • The Twitter backend is a collection of four self-updating Twitter searches, one for heartache (“i feel lonely, sad, depressed”), injustice (“violence, war, oppression, justice”), death (“rest in peace, passed away”), and Oakland (“oakland”). A blacklist filters out undesirable tweet keywords (“justin bieber”).
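The backend is node.js, but the blacklist idea is simple enough to sketch here in Ruby. The BLACKLIST contents and the display? helper are illustrative, not the installation’s actual code:

```ruby
# Illustrative sketch of the tweet keyword blacklist described above.
BLACKLIST = ["justin bieber"].freeze

# A tweet is displayable only if it contains no blacklisted keyword.
def display?(tweet_text)
  text = tweet_text.downcase
  BLACKLIST.none? { |keyword| text.include?(keyword) }
end

display?("rest in peace, old friend") # => true
display?("Justin Bieber 4ever")       # => false
```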

  • Additionally, the backend connects to Twitter via the Streaming API and displays a special animation for users who reply via tweet to the @sos_61 account.

  • The installation picks a tweet to display and pulses the LED array corresponding to the right tweet.

  • Communication to the Arduino happens via a Python script over the Firmata protocol, using the python-firmata library. The nodejs server signals the script over a socket connection, which then runs the pulse animation on the correct pin.

  • I printed the graphic on an oversize printer with the good folks at Alameda Copy. Friendly service, fast turnaround, very reasonable prices. Ask for Joe.

Links