my_method.should =~ <my_expectation>
See the source.
Here’s a list of things that have been annoying, or at least a bit frustrating, about using Ohm, the Redis ORM, in a Rails app. Beware if you assume Ohm is just ActiveRecord in new clothes. It is, but it’s not:
Don’t make the mistake of treating your Ohm objects like AR:
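The dynamic finders and where() chains you'd reach for in ActiveRecord simply aren't there, for one thing; Ohm only queries attributes you've explicitly indexed. A minimal sketch (the model and attribute here are hypothetical, just for illustration):

```ruby
require "ohm"

class User < Ohm::Model
  attribute :email
  index :email   # without this, you can't query on :email at all
end

User.create(email: "andrew@example.com")

# No User.where(...) or User.find_by_email(...) here.
# Ohm's finder takes a hash of *indexed* attributes and returns a set:
User.find(email: "andrew@example.com").first
```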
Also note that Ohm's update_attributes behaves differently from Rails': it doesn't persist the updates to the DB. That owned me for the better part of a day.
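A minimal sketch of the difference, assuming a hypothetical Post model (and the Ohm API as I understood it in the versions I was using):

```ruby
require "ohm"

class Post < Ohm::Model
  attribute :title
end

post = Post.create(title: "Hello")

# In Ohm, update_attributes only assigns the values in memory;
# nothing hits Redis until you call save.
post.update_attributes(title: "Goodbye")
Post[post.id].title   # => "Hello"  (still the old value)

# Ohm's update assigns *and* saves, which is closer to what Rails'
# update_attributes does.
post.update(title: "Goodbye")
Post[post.id].title   # => "Goodbye"
```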
Thankfully, these are ActiveRecord-like with the addition of
Read this article if you’re considering creating associations from AR objects to Ohm objects and the other way ‘round.
I should have mentioned this long ago, but I started work at Blurb in early August. It’s been a quick ramp-up and I’m loving it there, surrounded by smart engineers and great designers. I do Rails/JS work there, and I’m building a lot of chops around Agile/TDD methodologies.
Anyways, they had me do a Camera Thursdays blog post, which I wrote about my Nikon/1.8 camera combo:
Let’s make music together! This project simplifies a traditional loop tracking station and is designed for interactive collaborative music performances.
The idea: Everybody adds or modifies one “part” of a 32-bar loop. The user gets to play an instrument over the existing mix and record the 32-bar phrase when she or he is ready. Once the person is finished, the project selects another instrument at random for the next viewer to record.
It’s an Ableton Live controller serving a WebKit view, with a node.js backend and a socket.io + RaphaelJS frontend. Communication with Live happens through the LiveOSC plugin over sockets.
Displayed at the Regeneration “We Collaborate” art show in Oakland, CA. 9/24/2011.
mmtss in practice/playback mode. Here the user is able to practice/mess around with the current instrument to prepare to record the next track.
Pressing “record” puts the user in a wait state. They are prompted to begin recording when all the black boxes count down and disappear.
mmtss in record mode.
More screenshots: http://www.flickr.com/photos/andrewhao/sets/72157627640840853/
MIT/GPL-sourced for your coding pleasure.
Make sure you have npm installed: http://www.npmjs.org
Copy the LiveOSC remote script into your /Applications/Live x.y.z OS X/Live.app/Contents/App-Resources/MIDI\ Remote\ Scripts/ folder.
Set it as your MIDI remote in the Ableton Live Preferences pane, in the “MIDI Remote” tab.
Open Mmtss_0.als, included as a sample Live project.
Install all project dependencies with npm install from the project root.
Start the Node server with node app.js from the root directory.
Open a Web browser and visit the address the Node server is listening on.
You can modify this project to suit your own needs. Note that there are two sets of tracks: instrument (MIDI input) tracks and loop tracks that actually store the clips. Since there are n tracks of each kind, you can add or remove your own instruments; just make sure that the instrument at track x corresponds to its matching loop track.
Design and architectural inspiration taken from vtouch, an HTML5/Node/Canvas Ableton controller.
You will, however, need to get a license for Ableton Live yourself.
My friends were complaining that wedding photobooths were too expensive to rent. Could we make one for them?
Glen and I from the Porkbuns Initiative stepped up in full armor, ready to help.
What is it? It’s a self-running photobooth that uses your Mac for brains, a DSLR for eyes, a WebKit browser for its clothes, and a photo printer for… a printer. You can connect an iPad as the frontend for a nice visual touch (pun intended).
We built it on a Rails backend, pushing SVG+HTML5 on the frontend and using the gphoto4ruby gem as a wrapper around the camera library.
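If you're curious about the camera half, triggering a capture through gphoto4ruby looks roughly like this (a sketch from memory of the gem's README, not the photobooth's actual code; the folder path is made up):

```ruby
require "gphoto4ruby"

# Grab the first DSLR that libgphoto2 can see over USB.
ports  = GPhoto2::Camera.ports
camera = GPhoto2::Camera.new(ports.first)

# Fire the shutter, then pull the new frame off the camera.
camera.capture
camera.save(:to_folder => "/tmp/photobooth")
```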
Check it out on Github.
Some UI work I did for a stealth startup in early ‘11. Responsible for look & feel and frontend chat interactions: jQuery/UI communicating with a CakePHP/node.js backend.
We developed this prototype with statecharts, a concept commonly found in event-driven programming that I first learned about from SproutCore. I found it really helped map out all the complex user interactions we had to deal with on the page.
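Statecharts are language-agnostic (ours lived in JavaScript); flattened down to a plain state machine for brevity, the core idea is that UI events are only handled by whatever state the page is currently in, so impossible transitions simply can't fire. A toy sketch in Ruby:

```ruby
# Toy statechart for a chat pane, flattened to a simple state machine.
# Events are only honored by the current state, so e.g. "send" while
# disconnected is silently ignored.
class ChatStatechart
  TRANSITIONS = {
    disconnected: { connect: :idle },
    idle:         { start_typing: :composing, disconnect: :disconnected },
    composing:    { send: :idle, disconnect: :disconnected }
  }.freeze

  def initialize
    @state = :disconnected
  end

  def fire(event)
    next_state = TRANSITIONS.fetch(@state, {})[event]
    @state = next_state if next_state
    @state
  end
end

chart = ChatStatechart.new
chart.fire(:send)          # => :disconnected (ignored; not connected yet)
chart.fire(:connect)       # => :idle
chart.fire(:start_typing)  # => :composing
chart.fire(:send)          # => :idle
```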
I’m on a team with BRUTE LABS, a volunteer-led design agency, working on StudentsConnect, a prototype project that creates chatroulette-style interactions connecting students from the global North and South. Here’s a UX flow we worked on:
A few notes from this one-day barcamp/hackfest. The goal was to create mobile apps for Oakland, with a special emphasis on serving underserved populations.
Text messaging is still king. While smartphones and their apps are at the leading edge and grabbing all the attention these days, most underserved populations don’t have access to these services. Lots of apps today are using the Twilio or Tropo APIs as their SMS/voice gateways (there’s a short Ruby sketch of this after these notes).
Open data is awesome. I learned about ScraperWiki, which basically puts site scraping code into the cloud so people can maintain your scraper script long after you’ve tossed it. Some staff from maplight.org are here.
Open data is also difficult to maintain. One presenter mentioned how Oakland keeps a database of social services—that’s great, but what if the organization shuts down and/or changes its hours? Who’s responsible for updating the information? IIRC, the estimate was that 30% of the listings kept by the City are no longer valid. There need to be active efforts to combat dataset decay.
By the way, here’s a list of Oakland GIS datasets. Some of it really sucks (filled with garbage spam data). Some of it is useful.
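To give a flavor of the SMS-gateway point above: sending a text through Twilio from Ruby is only a few lines. This is a sketch using the twilio-ruby gem's current API; the credentials, numbers, and message are placeholders:

```ruby
require "twilio-ruby"

# Placeholders -- substitute your own Twilio account SID, auth token,
# and phone numbers.
client = Twilio::REST::Client.new("ACCOUNT_SID", "AUTH_TOKEN")

client.messages.create(
  from: "+15105550100",   # your Twilio number
  to:   "+15105550199",   # the recipient
  body: "Your afterschool program starts at 3:30 today."
)
```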
freexchange.org: a mobile interface to donated goods.
oakland:pm: gets Oakland high-schoolers connected to afterschool programs. github.com/jedp/oakland.pm
txt2wrk: text-based job matching. This one is unique because it is a completely SMS/voice-based interface. It connects to Craigslist: call it and it reads back Craigslist job postings. Targeted at parolees.
Oakland Food Finder: helping people find healthy, locally-grown food.
BettaSTOP: helps people find buses and access bus schedules. In Oakland, many underserved communities depend on finding and catching the right bus. It also lets users give feedback on buses, remark on their timeliness, and talk about bus route features. Live & in production: http://www.bettastop.net.
I’m helping out the Oakland:PM team, which is in the process of building out a service to get high schoolers connected to city-funded afterschool programs. The idea is that they can pull up their mobile phones and see what’s available to them while they’re kicking it with friends and bored out of their minds.
While the others hacked on wireframes and some code, I worked on a few user stories and resolved to interview a few of my contacts who work for the YMCA in East Oakland. We won a $500 grant from the City to see this thing through in July. Here’s hoping that we’ll make it.
Source code can be found at: http://www.github.com/jedp/oakland.pm
This design was created from a Processing sketch that breaks up the preamble to the UN Declaration of Human Rights and connects adjacent words together with lines. More frequent word associations are drawn with darker, thicker lines.
The source code (albeit messy) can be found at www.github.com/andrewhao/freedom-sunday. I’m running the sketch in full Java mode, so be sure to compile with your Java IDE of choice (rather than the Processing IDE).
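The drawing lives in the Processing code, but the underlying idea is just counting adjacent-word pairs. Here's that part sketched in Ruby (not the actual sketch) so you can see where the line weights come from:

```ruby
# Count how often each pair of adjacent words occurs; in the Processing
# sketch, more frequent pairs are drawn with darker, thicker lines.
text = "Whereas recognition of the inherent dignity and of the equal " \
       "and inalienable rights of all members of the human family ..."

words = text.downcase.scan(/[a-z']+/)

pair_counts = Hash.new(0)
words.each_cons(2) { |a, b| pair_counts[[a, b]] += 1 }

pair_counts.sort_by { |_, count| -count }.first(5).each do |(a, b), count|
  puts "#{a} -> #{b}: #{count}"
end
```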
My fellow interns and I are headed off to Cebu for 10 days to observe organizations involved in anti-trafficking efforts. More info: interns.regenerationweb.com