Jul
23

Speech at Vivianna & Greg’s wedding dinner

Meg and I gave a speech at Vivianna and Greg’s wedding dinner:

Viv and Greg Wedding Reception Speeches from Alex Hessler on Vimeo.

Jun
20

The Breakers Blues Band

“Play that thing!”

My stage presence was lacking.  I wasn’t confident, and it showed.  I hunched over the guitar and stared at my hands.  I didn’t smile or ever look at the audience.

Breakers Blues Band LIVE album cover

Suddenly, Big Ray pointed his huge, meaty finger at me and shouted: “Kevin! PLAY THAT THING!”  Unsatisfied, he grabbed a spare guitar, and waded in to save me.

Ray’s guitar playing was not technically very good.  He played very simply, perhaps one note in rhythm, bending the string.

But his playing was totally different from mine, and BETTER.  He stood tall, he radiated confidence, he got louder, he got quieter, he looked at the audience, he smiled, he walked around and interacted with the drummer and the bassist.

… I was by far my own worst critic.  Not everyone in the audience is musical; many can’t tell when you play wrong notes.  If you are smiling, and moving with the rhythm of the song, and sounds are coming from your guitar, all seems well.

It’s insulting if people come to see you play and you don’t attempt to give a convincing performance by smiling and showing good stage presence.

As with many other things in life, fake it till you make it.

Here’s a recording of us LIVE at Michigan Tech in the fall of 2003:
https://soundcloud.com/kevin-trowbridge/cadillac-assembly-line-breakers-blues-band-live

The Beginning

Posing on the shore of the Portage in Hancock, MI.

The genesis of the Breakers Blues Band was the Michigan Tech Jazz program, led by the talented and wonderful Mike Irish.

I played the trumpet in these bands all through college.  But honestly, I hated the trumpet.  The musician must match the instrument.  The trumpet was everything I was not: brassy, loud, and over-confident.

The guitar, on the other hand, was foundational: a rhythm instrument, it hung out in the background, keeping time and sounding groovy.  Much more my speed.  And I had been secretly playing the guitar (and the keyboard) since back in high school.  I taught myself from “The Complete Easy Beatles,” with its simplified chord diagrams.

My trumpet hatred must have been obvious, because in the fall of 2003, there was an opening in the MTU Jazz band for a guitarist, and Mike Irish gave the spot to me.  Thus my escape from the trumpet was accomplished.

Soon afterwards, there was a series of ‘jam sessions’ at the Motherlode cafe in downtown Houghton, which I shyly attended, although I rarely played.

One night a strange and interesting guy showed up:

It was a typical scene — the cafe full of stereotypical engineering students: shy, pale, slight, quiet, introverted, no one talking. Suddenly a tall, large, bearded, joyous man leapt onto the stage with his saxophone and passionately played many notes, up and down the scale, moving his body with the rhythm, totally dominating the stage.

It was kind of shocking!  Such a lack of reserve!  Such confidence!  Did we like it?  Not sure.  Did we notice? Definitely.

This was “Big Ray” Haywood, and shortly afterwards, he joined the MTU jazz band.

The charts we were playing were good practice, and very beautiful.  But we played them straight off the page, and solos were generally short and somewhat rehearsed.

So, the opportunity was definitely there for smaller groups to form.  But we were lacking id, or primal desire.  If you had asked me if I wanted to form a band and go around gigging at bars, the idea wouldn’t have appealed to me.  But I didn’t know what I was missing.  ;)

However, Ray had a passion for the blues!  He had already been in a band (the Crossroads Blues Band with his brother and others) back at home in Detroit, and here, he saw opportunity: music-making, and money-making potential.

The Perks of Fame

My junior year of college, I spent in Switzerland, learning French.  It was so amazing that it reset all my expectations.  When I came back, I felt really down.  I didn’t want to be at Michigan Tech.  For almost two years, I limped along, not enjoying much, frustrated, apathetic.

The root problem was my poor attitude.  However, time cured me eventually: I just got tired of feeling down all the time, and I decided to live again, in the here and now, where I was.

The Breakers Blues Band was an important part of this new period of self-growth: I “came out of my shell.” We were successful.  We had groupies.  Our shows were always full: we regularly drew crowds of more than 100 people.

Also, I had a responsibility to pull my weight.  This included having a good stage presence by appearing to enjoy myself.  At first it was an act.  But slowly, it became real.

I switched to the keyboard: piano blues can be so simple.  I got a lot better.  I learned how to solo: start off softly, build tension with a repetitive note or a riff, and then build a vocabulary of little tricks that sound cool.  Our whole group got tighter: we were aggressively gigging: 2 shows a week for more than a year.  It was a whole epoch of my life.

We even had away shows, in Marquette and Copper Harbor.  We would throw afterparties and invite our fans to party with us.  And, at one of these shows, I met my first real girlfriend: Mandy Edwards.

Apr
09

Moving to San Francisco

In all honesty, the time I spent in Mountain View was one of the loneliest and emptiest periods of my life. That sounds pretty dire; I wasn’t depressed or sad, but the time was — basically empty. When I thought about it later, I couldn’t remember much happening.

I spent many nights at home alone, saving money, drinking “Full Sail Amber Ale,” eating pasta, and watching films from “Ebert’s Great Movies” list. ;)

Ian, me, and Alex Mayer at the 3656A housewarming party on October 6th, 2006.


My dear friend Dan Simon came to visit me and he hated, hated, hated Mountain View. He paraphrased the poem Slough by John Betjeman:

“Come friendly bombs and fall on Mountain View!
It isn’t fit for humans now,

Tinned fruit, tinned meat, tinned milk, tinned beans,
Tinned minds, tinned breath.”

So, in my weekly meetings with my manager Stephanie Schoch, I started talking about how I would like to move to San Francisco. Moving to San Francisco was a constant tension among the young folks at Ariba and on the Peninsula. The glittering city was the lure; the cost was a commute of more than an hour every morning and evening.

Stephanie encouraged me, so I started taking the Caltrain up to SF on weekends and at night to go to apartment open houses. At first I fell into a common trap: renting a studio in the TenderKnob. When I told Stephanie about this, she said: “You’re not living in some bullshit studio in the Tenderloin! You’re living in the Mission!”

Gatsby and I at the housewarming party.


So I decided that I wanted to move in with a group of housemates in a shared apartment in the Mission. I went to a few open houses, but they were so competitive, and I hated feeling judged or having to “sell myself” to potential roommates.

So, I came up with a new plan. I found a “Roommate Meetup” group where people seeking roommates could meet in a bar and try to connect. People looking for a place to live wore red nametags, and those looking for roommates to move in with them wore blue nametags.

The first few times I went, I wore the red nametag, indicating that I didn’t have an apartment and I was looking for someplace to live. But, since we were drinking in a bar, there was a blurry line between socializing, and actually looking for someone to move in with. It wasn’t very efficient and it was still very competitive.

So I had a brainstorm: I could wear a blue tag, and “pretend” that I already had an apartment, but I would actually be looking for people to join me, in looking for an empty apartment, to “found” together.

This was the winning approach. At the next meetup, wearing the blue nametag, I started meeting people. It was a little embarrassing to admit that I didn’t actually have a place, but I stood out, and I met other ambitious people.

Ian at the 3656A housewarming party on October 8th, 2006.

Ian Gunn

One of these people was Ian Gunn. He had a job with Google directly out of college and he was living in corporate housing in Mountain View. At first I didn’t like him; he seemed young and pushy. We talked for a few minutes and then moved on.

However, at the end of the night, just as I was contemplating the long, uncomfortable ride home on Caltrain, he approached me again, saying that he had a car and he would give me a ride back to Mountain View. I somewhat half-heartedly agreed.

But during the 45 minute car ride south, we started talking, and I realized that I really liked Ian Gunn. He was smart, and funny, and a good guy. We decided that we would be partners and look for an apartment together.

This was the beginning of an even more epic stage, which would be so boring to encyclopedically relate, but suffice it to say that we needed a 3rd roommate to join, who we found eventually: Alex Mayer. Using his skills with the ladies, Ian met a real estate agent, Tatianna, who eventually gave us the keys to a quasi-shitty apartment with a great location: 3656A 20th Street.

The view of SF downtown from 3656A's back porch the night of the housewarming party.


Alex Mayer was away in Thailand when we needed to sign the lease, so we called hotels in Thailand and faxed the documents to him there. On the night we needed him to actually sign, he was out on a 50-mile bike ride, and Ian and I found ourselves driving aimlessly around Palo Alto, because he had agreed to meet us at a certain place on the side of the road, at a certain time, to sign the lease. It was hilarious and very memorable.

But finally we were in! We got a place and we all moved in together on September 28th, 2006. It was one of the best decisions I had made in my life thus far, because so many good things flowed from it.

Jun
07

MonkeyPatch NewRelic so that it doesn’t completely hijack the ‘process_action’ method

I recently upgraded a site from Rails 2 to Rails 3 and moved it to Heroku. Once the upgrade was complete, I eagerly opened up the NewRelic performance monitoring tools to see how much speedier the application had become.

To my dismay, the charts were all completely blank. I carried out the usual debugging steps … turned up the NewRelic agent logging verbosity … but I couldn’t find anything wrong. Then I discovered that NewRelic has a ‘development’ mode — which logs requests made locally. I turned this on and took a look at these charts. They were also completely blank! So I realized the problem was not that my application was failing to report information to the NewRelic servers; the application wasn’t even recording the data in the first place.

All of NewRelic's charts were totally blank!  But other information was being reported correctly ...


So I opened up RubyMine’s trusty ‘External Libraries’ menu and started placing breakpoints in the ‘newrelic_rpm’ gem, hoping to find the source of the problem. Hours later, I finally found it:

NewRelic overrides the ‘ActionController#process_action’ method to initialize its performance monitoring code. This trick is somewhat well known and other codebases may well fiddle with this method as well. In my case, the (Rails 2 era) user session / state (pre-Devise) code uses ‘alias_method_chain’ on this method, renaming it to ‘process_action_with_current_user_assignment’ and ‘process_action_without_current_user_assignment’. Renaming the method in this way was blocking NewRelic from working correctly.
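To see why the renaming matters, here is a plain-Ruby sketch of what ‘alias_method_chain’ expands to (the real helper lives in ActiveSupport; the class and the instance variable below are illustrative, not from the actual session code):

```ruby
class Controller
  def process_action
    "rendered"
  end
end

# `alias_method_chain :process_action, :current_user_assignment`
# effectively does this: stash the original under a new name ...
class Controller
  alias_method :process_action_without_current_user_assignment, :process_action

  # ... define a wrapper that adds the new behavior ...
  def process_action_with_current_user_assignment
    @current_user_assigned = true   # illustrative side effect
    process_action_without_current_user_assignment
  end

  # ... and point the original name at the wrapper.
  alias_method :process_action, :process_action_with_current_user_assignment
end
```

Any later code that re-aliases or redefines ‘process_action’ (as NewRelic’s instrumentation does) now has to cope with this chain, which is exactly where the silent breakage came from.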

Once I knew that this was the problem, I was able to come up with a MonkeyPatch to get NewRelic to allow other pieces of code to use ‘alias_method_chain’ on the ‘process_action’ method as well.

Place the following code into your /config/initializers directory:

module NewRelic
  module Agent
    module Instrumentation
      module Rails3
        module ActionController

          def self.newrelic_write_attr(attr_name, value) # :nodoc:
            write_inheritable_attribute(attr_name, value)
          end

          def self.newrelic_read_attr(attr_name) # :nodoc:
            read_inheritable_attribute(attr_name)
          end

          # determine the path that is used in the metric name for
          # the called controller action
          def newrelic_metric_path(action_name_override = nil)
            action_part = action_name_override || action_name
            if action_name_override || self.class.action_methods.include?(action_part)
              "#{self.class.controller_path}/#{action_part}"
            else
              "#{self.class.controller_path}/(other)"
            end
          end

          def process_action_with_newrelic_trace(*args)
            # skip instrumentation if we are in an ignored action
            if _is_filtered?('do_not_trace')
              NewRelic::Agent.disable_all_tracing do
                return process_action_without_newrelic_trace(*args)
              end
            end
            perform_action_with_newrelic_trace(:category   => :controller,
                                               :name       => self.action_name,
                                               :path       => newrelic_metric_path,
                                               :params     => request.filtered_parameters,
                                               :class_name => self.class.name) do
              process_action_without_newrelic_trace(*args)
            end
          end

        end
      end
    end
  end
end

DependencyDetection.defer do
  @name = :rails3_controller

  depends_on do
    defined?(::Rails) && ::Rails::VERSION::MAJOR.to_i == 3
  end

  depends_on do
    defined?(ActionController) && defined?(ActionController::Base)
  end

  executes do
    NewRelic::Agent.logger.debug 'Installing Rails 3 Controller instrumentation'
  end

  executes do
    class ActionController::Base
      include NewRelic::Agent::Instrumentation::ControllerInstrumentation
      include NewRelic::Agent::Instrumentation::Rails3::ActionController
      alias_method_chain :process_action, :newrelic_trace
    end
  end
end

Credit: https://gist.github.com/959784

May
29

Heroku Necessities: generate CSV files in the background with “delayed_job” and store them on S3 with “paperclip” …

I’m trying to get back into technical blogging as I encounter interesting situations on a daily basis … and I get so much information from others doing the same thing.

In this case I’m moving a fairly large blog from a custom deployment platform on EngineYard, to Heroku.  Heroku enforces a 30-second request timeout — so the webserver can’t be used for heavy, long-running tasks like generating a large CSV file.

The solution is to move the generation of the CSV file into a background task, and store the generated CSV file on Amazon S3.  Since in my case the data that I am compiling into the CSV file is private, I also show how to configure Paperclip to make the generated CSV file only downloadable to authenticated users.

Here’s a brief (30 second) video showing the UI you can build by following these steps:

The Model: ExportedDataCsv.rb

In my case I have a few large sets of data, stored in the database, that need to be exportable from the system for reporting and administrative tasks.  Think of the ‘Users’ table (the full list of users with email addresses, names, and so on) … or the ‘Stories’ table (for a blog, all of the ’stories’ that have ever been written for the site).  We’re going to turn the Users table into a CSV file and save it on Amazon S3.  The process is stateful: we’ll be storing specific information about the file:

  • What’s its exact name?
  • When was it generated?
  • Is it actively generating right now, or is it available for download?

We’re using Paperclip to handle the mechanics of saving the file to S3, but we’ll need to set up a model in order to configure Paperclip, as well as to store that stateful information.

class ExportedDataCsv < ActiveRecord::Base
  has_attached_file :csv_file, {:s3_protocol => 'https', :s3_permissions => "authenticated_read"}

  acts_as_singleton

  def generating?
    job_id.present?
  end

  def csv_file_exists?
    !self.csv_file_file_name.blank?
  end

  def trigger_csv_generation
    job = Delayed::Job.enqueue GenerateCsvJob.new({:csv_instance => self})
    update_attribute(:job_id, job.id)
  end

  def write_csv
    file = Tempfile.new([self.filename, '.csv'])
    begin
      file.write self.data_string
      self.csv_file = file
      self.save
    ensure
      file.close
      file.unlink # deletes the temp file
    end
  end

  protected

  # Kevin says: override me in subclasses ...
  def filename
    'exported_data_csv_'
  end

  def data_string
    ''
  end
end

Now that you’ve seen it, let’s discuss this model in more detail:

  • The first line ‘has_attached_file’ is the familiar way of configuring paperclip.
  • acts_as_singleton — I’m only storing a single version of each ExportedDataCSV file … so I am using the acts_as_singleton gem … the model associated with the exported CSV file will be a singleton.
  • ‘generating?’ & ‘csv_file_exists?’ are two methods I can use in my view to determine the immediate state of the CSV file.
  • ‘trigger_csv_generation’ — this method gets called by the controller to queue up the background job that runs ‘write_csv’.
  • ‘write_csv’ — this is the actual method that turns a CSV string into a Tempfile, which is then handed off to Paperclip.
  • Then there are two methods to be overridden in subclasses … oh yes, did I fail to mention? Since we are generating several distinct types of CSV files, each with its own name and data, I am using what’s called Rails ’single table inheritance’ to create a set of subclasses to model this.
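One note on the ‘has_attached_file’ line: it only sets the protocol and permissions, so Paperclip also needs storage and credential settings somewhere for the S3 upload to work. Here is a sketch of one way to supply them (assuming Paperclip 3.x; the ENV key names and bucket are placeholders, not from the original app):

```ruby
# config/environments/production.rb -- a sketch, not the original app's config.
# The ENV variable names and bucket are placeholders.
config.paperclip_defaults = {
  :storage => :s3,
  :s3_credentials => {
    :access_key_id     => ENV['AWS_ACCESS_KEY_ID'],
    :secret_access_key => ENV['AWS_SECRET_ACCESS_KEY']
  },
  :bucket => ENV['S3_BUCKET_NAME']
}
```

On Heroku these values would come from config vars, which keeps the credentials out of the repository.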

Database Migration

Here’s the migration to create the ExportedDataCSV table in the database.

  • The ‘timestamps’ will keep track of when it was last updated.
  • The presence of the ‘type’ string makes the Rails Single Table Inheritance work.
  • The ‘has_attached_file’ is the paperclip migration helper.
  • The ‘job_id’ is used to track the delayed_job and make the model’s ‘generating?’ method work.

class CreateExportedDataCsv < ActiveRecord::Migration
  def up
    create_table :exported_data_csvs do |t|
      t.timestamps
      t.has_attached_file :csv_file
      t.string :type
      t.integer :job_id
    end
  end

  def down
    drop_table :exported_data_csvs
  end
end

Subclassed Models

With the previous two files written, it’s trivial to create a CSV file:

class UsersCsv < ExportedDataCsv

  protected

  def filename
    'users_'
  end

  def data_string
    User.all.to_comma
  end
end

The information to be put into the CSV file is simply a string. Please see https://github.com/crafterm/comma for more information on working with CSV files in Ruby.
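If you’d rather not pull in the comma gem, ‘data_string’ can also be built with Ruby’s standard-library CSV. A minimal sketch (the helper name and column choices here are illustrative, not from the original post):

```ruby
require 'csv'

# A hand-rolled alternative to `User.all.to_comma`, using stdlib CSV.
# Takes an array of hash-like records and returns one CSV string.
def users_csv_string(users)
  CSV.generate do |csv|
    csv << ['Name', 'Email']                            # header row
    users.each { |u| csv << [u[:name], u[:email]] }     # one row per user
  end
end
```

Either way, the rest of the pipeline only cares that ‘data_string’ returns a string.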

The Delayed Job

We use the now-standard delayed_job gem (https://github.com/collectiveidea/delayed_job) to hand the long-running task (the ‘write_csv’ method on the base model) off to a background worker.

Here’s my ‘job’ file:

class GenerateCsvJob < Struct.new(:options)
  def perform
    csv_instance = options[:csv_instance]
    begin
      csv_instance.write_csv
    ensure
      csv_instance.update_attribute(:job_id, nil)
    end
  end
end

Credit — this stackoverflow post was very helpful to me: http://stackoverflow.com/questions/5582017/polling-with-delayed-job

The Controller

The controller is pretty simple … there are two methods.

  • ‘generate_csv’ — queue up a new delayed job to generate the CSV file and immediately redirect_to :back
  • ‘index’ — point the client to the S3 ‘expiring url’ path (the URL only lasts 5 minutes) to download the CSV file, if it exists.

  def index
    respond_to do |format|
      format.csv do
        if Rails.env[/production|demo/]
          redirect_to UsersCsv.instance.csv_file.expiring_url(5.minutes)
        else
          send_file UsersCsv.instance.csv_file.path
        end
      end
    end
  end

  def generate_csv
    UsersCsv.instance.trigger_csv_generation
    flash[:notice] = "We're generating your CSV file. Refresh the page in a minute or so to download it."
    redirect_to :back
  end

The Routes file …

In the routes file we just need to add a custom route to allow the client to access the ‘generate_csv’ action that we created in the controller:

    resources :users do
      collection do
        post :generate_csv
      end
    end

The last tricky bit … the View

The last tricky piece is the view. In the view we determine whether a CSV has been generated yet … if not we allow the user to trigger the generation of a CSV file … if so we show the link to it, but also allow the user to refresh the file as it may be far out of date.

Since we’re building a framework that will allow us to have many different CSV files … we first create an abstracted partial that will accept various input variables and that we can use all over our site:

- if csv_object.generating?
  Generating CSV ...
- else
  - unless csv_object.csv_file_exists?
    No CSV exists.
  - else
    - shortened_filename = csv_object.csv_file_file_name.slice(/(^.*)_/, 1) + '.csv'
    = link_to shortened_filename, download_path
    Last updated:
    = csv_object.updated_at.to_s(:viewable)
  = link_to "#{csv_object.csv_file_exists? ? 'Update' : 'Generate'} CSV.", trigger_generation_path, :method => :post

Here’s an example of how to call the partial:

    %li
      = link_to 'Users', admin_users_path
      %br/
      Download all:
      = render :partial => '/common/csv_generation_ui', :locals => {:csv_object => UsersCsv.instance, :trigger_generation_path => generate_csv_admin_users_path, :download_path => users_stories_path(:format => :csv)}

Summary

There are lots of moving parts in this scheme but once you get your head around it all, it’s a pretty straightforward pattern and a variant of this could be used in other situations as well. Enjoy and good luck!
