24 March 2008

Steak and Chard

My wife is visiting friends with the kids this week, during my daughter's first week of spring break. I'm home doing a lot of work, but it's also a real rarity that I'm the one at home while away from my family (usually it's me who's traveling). Anyway, I'm definitely doing some cooking, and I'll also investigate one or two "sketchy" Mexican taco places (I scout them out, then take the fam if they're good :) They're sketchy in appearance (hopefully not in food), but for me, sketchier usually means better. But I digress...

Tonight I made probably the best steak I've made in a long time, definitely one of my best ever. Not a new recipe, just perfectly executed, if I do say so myself. Alongside it was an experiment with chard, a recipe of my own on-the-fly creation.

The Steak

First I went and got an absolutely top quality ribeye from my local meat market (Long's, here in Eugene) - a roughly 1 lb beauty. Then I ground up some fresh Blue Bottle Roman Espresso coffee (ground at a setting approximately between drip and French press, so on the coarse side, but not huge chunks). It is absolutely key to use fresh coffee beans and grind them yourself - none of this canned or pre-ground crap. Also, the better the coffee, the better the result. I could go on a long time here, but I won't (because I'm working on a blog entry about Blue Bottle :) Next, combine that with a pinch or three of kosher salt, Fleur de Sel, or a similar salt of your choice. Then add fresh ground pepper to the mix - about a third as much as you have coffee (more or less to taste, I suppose, but don't add so much that you drown out the coffee). Liberally coat your steak with that - hide the steak in it.

I then recommend grilling the steak over a very hot grill. I use a gas BBQ, with my burners all on high - about 500 degrees on average. For the thickness of steak I had tonight (1.25"?), I cooked it just short of 11 minutes - about 5 minutes a side or so. This yields a medium-rare steak, and I mean truly medium-RARE, plenty of pink, but not bleeding. Once done grilling, pull it off and let it rest a few minutes. Stellar.

The Chard

I'm a big fan of chard, usually sautéed. Tonight I had some organic red chard. At Long's I'd also picked up some prosciutto, although a last-minute decision to try something new yielded some green peppercorn infused prosciutto cotto (cooked). First I sautéed some chopped red onion with a pinch of Napa herbs, fresh ground pepper, and a bit of the helpful chunky grey garlic sea salt (go light here, this is not a garlic thing). Then a bit of red wine (from a bottle I had open, oddly enough a tempranillo-syrah blend). Sauté and fry that prosciutto up a bit.

Next, I separated the stalks from the chard, as they need to cook longer. Toss those in with the above mixture and steam/saute a bit to soften the stalks up. Then, put the chard leaves in, and essentially steam until done. Doesn't take long. Given that I made this up while I was cooking it, it worked out really well. Of course, most things with prosciutto do :)

Drink

I went with the easy choice here, although unexpected. I think most people would expect a nice bold red wine, and I do have some nice ones in the wine fridge. But, when I'm alone and not at a restaurant (thus not opening a bottle, or ordering by the glass) I'll go with a cocktail. My standard favorite is gin rocks with onions. Tonight this was Zuidam dry gin (battling for top spot with my usual favorite No. Ten by Tanqueray), and the best cocktail onions, Sable and Rosenfeld Tipsy Onions. I prefer my gin over just a couple large cubes of ice, so that it's not so cold as to take away flavor. Good botanical gin has a myriad of wonderful flavors, and I think shaking it with ice just kills some of that - No Ten is FAR better just slightly cooled over a couple cubes of ice.

All this, while listening to some great jazz (not typical for me, but "completed" the evening), and sitting at the bar-counter in our house. I felt like I was in a great restaurant, eating a wonderful meal at the bar, only it was in the comfort of my own home, relaxing, and loving it. I guess it was my own great restaurant; how nice!

20 March 2008

Creating Sparkle Appcast via Rake Tasks

I have a RubyCocoa application that self-updates using Sparkle. To do so, you need to create an "appcast" file that contains the version and download information for your application, as well as create the zip file that holds your app. Then you of course have to upload these to the server location you've specified in the SUFeedURL key of your app's Info.plist. For general instructions on using Sparkle and setting it up, see their Basic Instructions page.

My Rake tasks do not create the zip file. I may enhance it to do this at some point, but so far I haven't needed to, and have had cases where I need to create it myself for various reasons. What the tasks do is to build an appcast.xml file from a YAML file that contains all the necessary information. Note that the name of my app is "Linker", so you'll see that in various spots. The tasks do rely on a simple directory structure:


  • Your app root directory
    • Rakefile
    • appcast
      • version_info.yml
      • build
        • Your app zip files go here (e.g. Linker_0.8.zip, Linker_0.9.zip, etc.)
        • The Rake task will create the linker_appcast.xml file here

So, you have a spot you drop your zip files into, and this same dir is where the Rake tasks create the appcast file. The version_info.yml file is where you put the info needed to generate the appcast. It looks like this:


linker-04:
  title: Linker 0.4
  filename: Linker_0.4.zip
  description: Added Sparkle updating mechanism.

linker-05:
  title: Linker 0.5
  filename: Linker_0.5.zip
  description: Added help (see Help menu). Added bookmarklet support/custom URL protocol handling. See the new help for information on how to use the bookmarklet.


Note that you can put HTML into the "description" field, and my Rake task will deal with and preserve that.

Finally, I have two Rake tasks: one for building the appcast, and the other for uploading it and the latest zip file to the server. Each is simply a one-liner that calls a corresponding Ruby method within the Rakefile:


namespace :appcast do
  desc "Create/update the appcast file"
  task :build do
    make_appcast
  end

  desc "Upload the appcast file to the server"
  task :upload do
    upload_appcast
  end
end


The two methods rely on a couple of variables being defined in your Rakefile (adjust these as desired), along with requires near the top of the Rakefile for the libraries they use (yaml, time, builder, and net/ssh):

APPCAST_SERVER = 'your_appcast_server.com'
APPCAST_URL = "http://#{APPCAST_SERVER}"
APPCAST_FILENAME = 'linker_appcast.xml'


Here are the methods, first the one that builds the appcast, which you'll need to modify for your app:

def make_appcast
  begin
    versions = YAML.load_file("appcast/version_info.yml")
  rescue Exception => e
    raise StandardError, "appcast/version_info.yml could not be loaded: #{e.message}"
  end

  appcast = File.open("appcast/build/#{APPCAST_FILENAME}", 'w')

  xml = Builder::XmlMarkup.new(:target => appcast, :indent => 2)

  xml.instruct!
  xml.rss('xmlns:atom' => "http://www.w3.org/2005/Atom",
          'xmlns:sparkle' => "http://www.andymatuschak.org/xml-namespaces/sparkle",
          :version => "2.0") do
    xml.channel do
      xml.title('BWA Linker')
      xml.link(APPCAST_URL)
      xml.description('Linker app updates')
      xml.language('en')
      xml.pubDate(Time.now.rfc822)
      xml.lastBuildDate(Time.now.rfc822)
      xml.atom(:link, :href => "#{APPCAST_URL}/#{APPCAST_FILENAME}",
               :rel => "self", :type => "application/rss+xml")

      versions.each do |version|
        guid = version.first
        items = version[1]
        file = "appcast/build/#{items['filename']}"

        xml.item do
          xml.title(items['title'])
          # Wrap the description in CDATA so any HTML you put in the YAML is preserved as-is
          xml.description { xml << "<![CDATA[#{items['description']}]]>" }
          xml.pubDate(File.mtime(file).rfc822)
          xml.enclosure(:url => "#{APPCAST_URL}/#{items['filename']}",
                        :length => "#{File.size(file)}", :type => "application/zip")
          xml.guid(guid, :isPermaLink => "false")
        end
      end
    end
  end

  appcast.close
end

Looking through the above, you'll want to modify at least the channel title and description for your app. Now on to the uploader method:

def upload_appcast
  remote_dir = "/var/www/apps/bwa/shared/public/updaters/"

  Net::SSH.start( APPCAST_SERVER, 'deploy' ) do |session|
    cwd = Dir.pwd
    Dir.chdir('appcast/build')

    shell = session.shell.sync

    begin
      out = shell.cd remote_dir
      raise "Failed to change to proper remote directory." unless out.status == 0

      out = shell.ls("-1")
      raise "Failed to get directory listing." unless out.status == 0

      files = Array.new
      out.stdout.each { |file| files << file.strip }

      # Look through the list of files and see what we need to upload, as
      # compared to what we have locally - but always upload the appcast itself
      local_files = Dir.glob('*')
      files.delete(APPCAST_FILENAME) # we always upload this
      local_files.each do |local_file|
        unless files.include?(local_file)
          print "Uploading: #{local_file}... "
          `scp #{local_file} deploy@#{APPCAST_SERVER}:#{remote_dir}`
          puts $?.exitstatus == 0 ? "done." : "FAILED!"
        end
      end
    rescue => e
      puts "Failed: #{e.message}"
    ensure
      Dir.chdir(cwd)
      shell.exit
    end
  end
end

You will of course want to modify the remote_dir, and the login credentials towards the bottom (where it does the scp command). This also relies on you having your SSH keys set up, so you don't have to enter a password when it does the scp.

You could obviously generalize this further, but this is what I have, it works fine, and I haven't needed to extract anything else out. I'm posting it here per a request; hopefully it saves someone else a few minutes.
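
As mentioned at the top, these tasks don't create the zip file itself. If you wanted to add that, here's a minimal sketch of what such a task might look like - the build path and version handling are assumptions, so adjust them for your setup - and it could live inside the same appcast namespace:

desc "Zip the built app into the appcast build directory"
task :package do
  version  = '0.9'                        # or read this out of your Info.plist
  app_path = 'build/Release/Linker.app'   # wherever Xcode puts your built app
  zip_path = "appcast/build/Linker_#{version}.zip"
  # ditto preserves resource forks and metadata, which a plain zip may not
  sh "ditto -c -k --keepParent #{app_path} #{zip_path}"
end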

18 March 2008

Webcam Recommendations?

I set up the site Basecamp Silverton for a friend, and he wants to add a webcam to it, to show what the mountain and a bit of town look like at a fairly frequent interval. He had gotten what looked like a relatively decent webcam - the optics are fairly good for the price, but its software is atrocious and it only does FTP in terms of sending images out (it does provide a live feed, but we're not after that). So, I'm seeking recommendations.

What we want is a networked webcam with the following characteristics:


  • A decent enough lens to point it from his house in town up at the mountains (which are right there - as in less than a mile away).

  • It should be low cost (preferably under $300, over that will be considered, but there has to be a good reason).

  • Support SFTP preferably (FTP is ok), as a way to send images at regular intervals from the camera to a server.

  • RSS Feed for the images is an acceptable alternative to FTP/SFTP.

  • Be easy to configure and manage.

  • WiFi is ideal, but not required.

  • Must be configurable from a Mac or a Mac browser (Safari or Firefox).

  • Multiple mounting options would be good - bolt on, simply sit on a desk/shelf, etc.

  • No special networking requirements.

  • Exported image size of about 500 pixels wide or more, i.e. enough resolution to provide reasonable detail.

  • Exported images in JPEG, PNG, or GIF.


Anyway, let me know your recommendations.


Update: It turned out that there was a network configuration issue with the camera we have. I was luckily able to figure this out within literally a minute of logging into the camera's configuration web app, after Matt opened up a hole in his firewall so I could access it remotely. There still seem to be some problems with the FTP, but this got us closer, and I suspect we'll be staying with this camera after all. Of course, don't hesitate to recommend what you like - it may be useful later on (and we may add a second camera if this one proves as successful as the demand has indicated so far).


Tech Books for Free

I have some tech books that I will be donating to the local library, unless someone wants any of them. If you would like one or more of these books, I'm happy to send them to you for the price of shipping. Click through to the Flickr page and message me, or leave a comment on my blog (make sure you include your email address, which doesn't get published, in your comment so I can reply). The books primarily cover Java and Linux, but also Python, Jabber, Mozilla, Emacs, etc.

Books For Free

Also, not in the picture, but most likely available is, "Object Oriented Perl" by Damian Conway, from Manning Press. It's on eBay, but doesn't appear it'll sell.

17 March 2008

Another Reason to NOT Put Seed Data in Rails Migrations

I discussed an approach I took recently to getting standard or seed data into your app. While I've used Migrations quite successfully in the past for this, I am no longer doing so. And, on an older app, I just got bit by it. So, here's another reason not to do it...

Now that I've run into this, it's extremely obvious, but: If you change model code for a model which is used to create records in previous migrations, you can easily break those prior migrations. This won't matter when you have existing databases you are migrating, but it will matter if you need to create a database and migrate it from scratch (maybe your Continuous Integration server does that for example, or you are simply setting up a new DB in your development environment).

For example, here's the case that bit me: I recently changed a model that had some data created by migrations to specify "acts_as_list". In doing so, I created a new migration that added the position column - an attribute that gets filled in automatically when you create objects of that type. However, when recreating the database and running up through the older migrations, the prior seed data failed, since the position column did not yet exist, yet the model's code was trying to populate it.
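
To illustrate the shape of the problem (the model and migration names here are hypothetical, not my actual code):

# An older migration that seeds data through the model:
class AddDefaultCategories < ActiveRecord::Migration
  def self.up
    # The *current* Category model declares acts_as_list, so this create
    # tries to set a position attribute...
    Category.create!(:name => "General")
  end
end

# ...but the column it needs is only added by a later migration:
class AddPositionToCategories < ActiveRecord::Migration
  def self.up
    add_column :categories, :position, :integer
  end
end

Running the full migration set against a fresh database fails at the older migration, because the model code is newer than the schema at that point in the history. (If you do keep seed data in migrations, one common way to sidestep this is to define a minimal, standalone model class inside the migration itself, so it isn't coupled to the real model's current code.)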

Luckily, I was actually adding an administrative interface to manage CRUD and other ops on this particular model, and as part of that I no longer needed the seed data anyway, so I was able to just nuke it from the older migrations (and luckily no tests depended on it, and the production and staging systems were well past those migrations).

13 March 2008

Facebooker Publisher and URL Fixes for Rails 1.2.x Use

I'm using Facebooker, and specifically the new Publisher class it has, with a Rails 1.2.6 app (hopefully I'll get us onto Rails 2 sooner rather than later). But Publisher appears to use some methods that are only available in Rails 2, and the mechanism it uses to look up Publisher view templates doesn't work properly in Rails 1.2. Link generation also doesn't work quite right in all cases, so I have a fix for that too. I'm documenting my changes here for my own reference, as well as for anyone else they may help.

For Publisher, I've made two changes to remedy these issues, both in facebooker/lib/facebooker/rails/publisher.rb:

In the initialize_template_class method, change the line:

returning ActionView::Base.new([template_root,File.join(template_root,self.class.controller_path)], assigns, self) do |template|

to instead be:

returning ActionView::Base.new(template_root, assigns, self) do |template|

This fixes the problem where the Publisher would look for views in a directory path that contained your publisher's name twice.

The second one is to change the inherited method's call to send! to instead simply call send.

For the link generation, I tweaked the implementation of Facebooker's UrlRewriter#link_to_canvas? method, shown in its entirety here:

def link_to_canvas?(params, options)
  option_override = options[:canvas]
  options[:only_path] = false if !option_override.nil?
  return false if option_override == false # important to check for false. nil should use default behavior
  option_override || @request.parameters["fb_sig_in_canvas"] == "1" || @request.parameters[:fb_sig_in_canvas] == "1"
end

The result is that if the canvas parameter is specified, then we force a full URL, instead of only a path (which is the default). This covers apps that have both a regular web application and a Facebook app, where you are generating links that point to one from the other (e.g. you're in Facebook, but generating a link that points to your regular web app).
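
For a feel of how this gets used, here's a hedged example (the controller and action names are hypothetical): from a view rendered inside the Facebook canvas, passing :canvas => false forces a full URL back to the regular web app instead of a canvas-relative path.

# In a view rendered inside the Facebook canvas:
link_to 'Open in the main site', :controller => 'dashboard', :action => 'index', :canvas => false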

Seed Data for Your Rails 2 Apps - Another Approach

Historically, I've used migrations to set up standard data that my database must contain in a Rails app - things like the standard Roles for the system, or maybe country codes or such things. However, it appears this simply won't work in Rails 2.x because, as far as I can tell, when you run something like "rake test" it blows away ALL data in your database (not just fixture data). If I'm wrong about that, please correct me. This makes sense given that the drive seems to be towards schema.rb being the official way to create a DB from scratch, and that you have the equivalent of Foxy fixtures, which do lots of magic to make creating your fixtures easy (but are likely quite painful to explicitly clean up in certain cases - so it's easier to just wipe the DB clean).

There are various solutions for creating seed/standard/structured data for your app. However, from what I've seen, none address the problem that the data will get wiped out when testing. For many people that may not matter; their tests may not hinge on it. But I like to stay DRY, and when you have standard roles or similar types of data, there is no reason I should have to recreate them in fixtures (and risk being out of sync), leave them out, etc. I likely have app functionality that directly depends on such things, and thus I need this data during testing as well.

My solution as of now is a simple one, and one that does not scale well for large amounts of data, but for the five records I need at this point in the particular app I'm working on, it's an approach (I very much welcome better approaches!)... I simply created a "seed_data.rb" file in my config/initializers directory. Within this file I have code that does a create_or_update (or similar) of the standard data I need. This seems to work out quite well.
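
For the sake of illustration, here's roughly the shape of mine - the Role model and role names are just placeholder examples, not my actual data:

# config/initializers/seed_data.rb
# Create-or-update the standard records the app expects to exist.
['admin', 'editor', 'member'].each do |name|
  Role.find_by_name(name) || Role.create!(:name => name)
end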

Update: the above breaks things like "rake db:reset", because the initializers run as part of the Rake environment; when the DB has been dropped, the initializer fails, and that in turn fails the rake task.

11 March 2008

SVN Externals are Evil; Use Piston or Braid

I've recently spent a considerable amount of time rectifying problems caused by SVN externals. One of the codebases I work on had been developed with a large number of Rails plugins pulled in as SVN externals. In general, it was a good approach, as these were external code, or shared code, etc. This, I think, is at least better than directly checking the code in, as you have a more precise record of where it's from, etc. I should also note that our externals were all set to specific tags or branches specific to our code (i.e. not to trunk, where you'd be getting updates outside your control). Sounds good, so what about this "evil"?

The problem comes in when you need to make changes to the code of an external. You might think: well, go change the upstream code and then adjust your tag, etc. In some cases you can't do that - maybe it's not code you have commit rights to, or you're making a change that's specific to your app and can't be done another way, or, as was often the case for me, we were on a much older version, and trunk and the other tags had major differences I didn't want to integrate.

Thus, what I needed to do was remove this as an external, and check the code in directly. Another approach would be to branch it from where you were and modify that, etc. I wasn't able to do that due to various Subversion permissions (probably not a common case, but I had no choice). This action itself (remove external, add code) is not a real problem in SVN. But, it IS a problem when you go to update. A simple "svn up" on other machines failed. That is pathetic. Instead, what I had to do was go delete the existing (svn externaled) directories, then do "svn up". This of course broke our continuous integration server, and I also had to go manually fix this up on machines I was deploying to. Crappy, but if that was the end of it, I'd probably not be as unhappy...

When it comes to merging these kinds of changes into branches, watch out! This is where SVN just flails. First, if you happen to use svnmerge.py to manage your branch merging, forget it. It just can't deal with it, and will leave you with a partially complete merge. Doing it manually, even with things like --ignore-ancestors, does not work either. I had to do something similar to the "svn up" fix: I had to go in and delete all the directories that were previously svn externals, and then do my merge. And note, do NOT delete the parent directories. For example, if all of your Rails app's plugins were externals, don't go and nuke "vendor/plugins" - SVN will then be totally confused, not do anything, and fail. Nope, you need to specifically delete each offending svn external directory. I make extensive use of branches (I do most daily work on a branch), so you can multiply these problems across the number of branches you might need to be merging to, etc.

Having said all that, this problem isn't really all that illogical. I don't know how SVN works internally, but the whole svn:externals thing seems a bit like a hack, or at least not a first-class citizen in SVN land. SVN merge or update should be able to see: hey, you were up to date (for your current revision) on directory X, but this update is going to replace that with new code under the same dir name. But it doesn't, maybe because it doesn't look at the externals properly in relation. I don't know, and I don't care, since it's broken, and my fix is that I'm moving to Git soon enough :) Also, as another point of view, I know Perforce handles this kind of thing just fine (we used remote-mounted Perforce depots all the time at Adobe, and made seriously extensive use of branches - in fact, we required working on a branch).

Now that I've spent entirely too much time on the build-up, what's the solution? Simple: use Piston (or Braid if using Git). What Piston does is skip svn:externals and instead check the code in directly, while maintaining linkage to the external it came from. My take is that this is probably how svn:externals should have worked in the first place (I presume that constantly updating an external is actually a rarely desired trait). You import an svn external using Piston, and it will pull the latest code from whatever SVN URL you supply. In this case, you could use trunk, or you could as usual use a tag or branch. But then it's fixed - it will not update anytime you do "svn update". Instead, it is up to you to explicitly tell it to update. This avoids svn externals as far as your daily operations go, and also causes zero problems for merges. It does more, though.

The second benefit of Piston is that you can then modify the external code, but still bring down updates from the external, allowing a synergy between using external code and your app's specific needs. This is exactly what I needed on a couple of plugins we use, where those plugins' code had deviated significantly from our codebase so I couldn't use a newer version, but I needed to make some changes.

To summarize, the evil is SVN itself not handling changing of externals (i.e. to/from an external) in basic operations like updates and merges, which may cause a lot of manual work on your end, and break automated builds or similar. The solution: use Piston or Braid and get the best of everything.

06 March 2008

Rails Applications and Gems: Solving the Dependency Problem

There's a post today on the Relevance blog about Frozen Gems Generator. I tried posting a comment there, but it seems to have not gone through, so I'll blog my solution here instead.

Chad Woolley at Pivotal Labs created GemInstaller to solve the problem of specifying exactly what gems you want your Rails app (or other Ruby code) to use. I've dealt with this issue a lot over the course of building Rails apps, and while at first blush I didn't think this was a good solution, I now really like it, and use it on most of my projects (basically all the projects I control or can :)

So, why is it better than other solutions, or at least the other solutions I've seen? First, a quick synopsis: it is a simple gem that lets you create a geminstaller.yml file specifying the version(s) of the gems your app wants. This can be an explicit version, or can use things like >= a version, etc. It can then automatically install the gems for you on app launch, on deploy, or just at the command line (I've sketched a sample config after the list below). The benefits of this solution for a Rails app include:


  • Solves the architecture/platform-specific gems problem. I haven't seen any of the other solutions do this, or do it well. Most just punt on it; others require a convoluted process or hacking up your other code. Because GemInstaller simply relies on the gems being installed on your system, it will use the proper version for whatever system it is running on. This also ties into the next point...

  • No polluting your source control with gems. This speeds up your source control, as well as your deploys. Further, for architecture specific gems, you now don't have to have every version of each gem in your source control for each platform you need (which is quite likely at least two: your dev boxes (e.g. Macs) and your deployment boxes (Linux), but could be even more).

  • Easier, single location, statement of what gems your app requires. By using the geminstaller.yml file, you have a single place to go see what gems and which versions of those gems your app uses. This is much better than trying to look through your vendor directory, and determine what version of a gem you might have.

  • Great for bootstrapping your development environment. Sure, frozen gems usually solve this too (except for the architecture specific ones!). You can just run geminstaller after pulling down the code and it'll go install all the specific version gems you need.

  • Allows for multiple config files, so that you can build common ones you use across projects, etc. Or even cooler, your plugins or whatever can provide a file to specify what they need and you can integrate that into your config!

  • Easy to install and use. In Rails 2 environments, you can simply drop the few lines needed to use it into its own file in config/initializers. In Rails 1.x, you add these lines to your environment.rb.

  • You determine what level of function you want geminstaller doing in your app: e.g. do you want it automatically installing missing gems or just warning you? Should it put them on the load path so you are guaranteed the proper version loads, or do you want to just use it to bootstrap and live dangerously otherwise ;-)

  • Makes it easier to experiment with new versions of gems. Since you'll have to install the gem anyway (or most solutions need that to freeze them in, but not all), you can experiment by simply changing the version number in your geminstaller.yml file. To undo it, just change the number back. No need to copy the gem into vendor or a private gem repo, etc. Easy.

  • GemInstaller can tell you what gems you have on your system, but are not in your config file, as a way to see what you might need.
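
As promised above, here's roughly what a geminstaller.yml looks like. I'm writing this from memory, so treat it as a sketch and check the GemInstaller docs for the exact syntax (the gems and versions listed are just examples):

gems:
  - name: rails
    version: '= 1.2.6'
  - name: mongrel
    version: '>= 1.0.1'
  - name: rmagick
    version: '>= 1.15.0'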



Check out the GemInstaller page for more details. I highly recommend this, and thanks Chad for creating it.

05 March 2008

Innovation and the New Gauge of a Good Job

Sam Davidson has a great post over on the new-ish Brazen Careerist blog/site, "New Gauge of a Good Job: Freedom." I couldn't agree more. This pretty much sums up why I left Adobe.

There are parts of Adobe, in my opinion, doing interesting things (Air and Lightroom, for example), but for the most part it's so corporate, slow, and risk-averse that they are still not in the web app game - they're not just late, they're non-existent (and note that I say that as someone who has shipped network and web apps for Adobe). And to think that Microsoft said they were late to the game 10 years ago! (I don't have the quote from Gates on that, so 10 years give or take :)

I have worked on web apps there, and there are actually a couple out there, but none anybody talks about. It is sad to me, because there are awesome people there! I've worked in the web services & apps groups, Photoshop group, on consumer software, and so on, and there are so many super smart people, and lots of great ideas. I've worked for Adobe twice in my career, so it's not as if it's a bad or evil company. Moving on...

In Mr. Davidson's article, it's mentioned that Fast Company's latest issue says Google is the world's most innovative company. I've been discussing innovation with friends lately, and our (if I may) take is that that's not actually true. It is potentially the most visible and obvious candidate for that, but look at many of Google's "innovations" lately: they're actually acquisitions. The percentage of in-house developed apps and innovation has gone down as they've grown (not surprisingly).

At the core of our discussions is that big companies just aren't the ones doing the bulk of innovation these days, or even a noticeable percentage of it. Most innovation is coming from the tiny startups, the "garage"-built companies, or much smaller companies. To many people this may not be obvious, because what often seems to happen is that Google, Yahoo, etc. snatch up those companies. I would argue that Apple is more innovative from within than any other big company right now (iPhone, iPod, MacBook Air). Amazon might be my next candidate (Amazon Prime, their web services). It's of course all debatable, and that in and of itself is fun.

The best part of it all to me, is simply that there is a lot of great stuff being done! Lots of cool web apps, interesting hardware bits, intriguing business models, and so on. So, here's to all the innovation going on out there, regardless of where it's being done! And, for folks sitting there in a non-creative, or constrained environment, take a serious look outside. It can be a bit scary to leave that cushy, well paying, great benefits job, but there's a lot more to life, and having made the jump myself, I find I'm constantly saying I wish I'd done it sooner!

02 March 2008

Beautiful Dining Table, Chairs, and Mirror For Sale

DiningSet-1

Normally I don't post eBay or Craigslist stuff I'm selling on my blog, but this is a special item (items). We have an amazing dining room table, chairs, and mirror that we're selling, because it very unfortunately doesn't work (color wise) in our new house. We've held out nearly a year (since moving) and are finally ready to part with it :)

DiningSet-2

Check out the ad on Craigslist, as well as the bigger/better photo gallery.