29 June 2006

Feedback on Scale with Rails Seminar?

I'm looking into going to the Scale with Rails seminar put on by the Joyent/TextDrive folks. If you're reading this and have attended a prior one, let me know what you thought. Also, let me know what your background with Rails, web apps, sysadmin, etc. is, to provide context. I've gone through the slides they link to on the site, but obviously you get more detail and interaction in person.

The one thing that worries me is the related discussion about how they had such horrid network throughput with BSD, as compared to Solaris (or really, just in general). I find it exceedingly difficult to believe that they didn't mess something up on their BSD systems. You can't tell me that Yahoo (who uses BSD) has these problems - their business wouldn't work. So that calls their competency into question to some degree. The rest of the material seems fine, but this part worries me. Joseph Scott also blogged about this, in more detail.

Three Big Questions from The Number

I wanted to mention one of my favorite parts from reading The Number. It's near the end (so stop reading if you want to stay in suspense :), but it was a set of questions called "The Three Big Ones." You are asked, or ask yourself these questions, and you should physically write down the answers, answering the first one before reading the second question, and so on. He discusses this more in the book, and how people tend to answer them (the general trend/traits of the answers). I think they are a great way to think about not just retirement and financial planning, but what you are doing in life. Note that these are actually quoted in the book from George Kinder, a speaker on "life planning":

  1. Assume that you've got all the money you need - enough for the rest of your life. Maybe you're not as rich as Warren Buffett, but you never have to worry about money for any reason. The question is, what would you do with it? How would you live?
  2. You go to the doctor. The doctor discovers you have a rare illness. He says that you are going to feel perfectly fine for the rest of your life. But, he says, the illness will prove fatal. The sorry outcome will occur sometime within five to ten years. It will be sudden. The question is, now that you know that your life will be over in five years, how would you live it? What would you do?
  3. [it will sound like the previous one, but is different...] It starts the same way. You go to the doctor. You're feeling perfectly healthy. And again the doctor says you have a serious illness. But then the doctor says, 'You only have twenty-four hours to live.' What I want to know is, what did you miss? Who did you not get to be? What did you not get to do?
Pretty good eh? Obviously, #3 is the heavy hitter. It's a good way to think about things. It sure made me think.

Bigger Drives for MacBook Pro?

I have a mere 80GB disk in my MacBook Pro. It's maxed out (it was truly at 100% this morning when I woke it up). I need something bigger. Apple only offers disks as large as 120GB, and those are the slower ones. I'd like to get, say, a 200-300GB disk. What have folks found? Anyone successfully using a >120GB disk?

Universal Binary of Flash Player 9 (beta) Now Available

I'm quite happy (as I type this on my MacBook Pro) that a beta of Flash Player 9 for MacTel (universal binary) is now available.


28 June 2006

Latest Reading: JPod by Douglas Coupland

JPod : A Novel by Douglas Coupland
I read JPod while on vacation last week, and I loved it! I'd read Microserfs (first as the short story in Wired) when it came out, and loved that too. JPod is similar in terms of following a person and his group, and is based at a video game company. But the book's style and story are wild. And the sort of indifference to evil was great. It was really just such a fun book to read, and wacky too. I was laughing out loud at times, and was rather bummed to finish it. I also particularly liked how Coupland included himself in the book. Highly recommended.

Warning: spoiler-like material ahead...
Today, I was listening to the Distributing the Future podcast Playing with Location. During this, they briefly talked with Onomy Labs, who has the "spinny table" (or tilty table as it's called on their web site). It is very interesting to compare this to the Dglobe product in the book.

27 June 2006

Flex 2 and Flash 9 Released! (and a smidge of Rails)

Great news: Adobe Flex 2 - the Flex 2 SDK and Flex Builder (the Eclipse-based Flex IDE), amongst other pieces - has been released (press release)! There's also a new Flex resources web site, flex.org, in addition to the Adobe developer center's Flex area. And there's a new Flex Team blog. I really like Flex. I've been experimenting with it for a while, and in particular with using Flex to do RIAs on top of Ruby on Rails back ends. Some links related to this:

As part of this, version 9 of the Flash Player has also been released. The press release is here. To me, two of the coolest things with this are the 10x speed improvements (try it!), and ActionScript 3.0, which is really a nice language and has great improvements (good data typing, E4X integration, etc.). Download it now.

25 June 2006

The Number

What is "The Number" you ask? Well, it is two things. First, it's this notion of how much money you'd need when you retire, or in order to retire. Second, it's a great book:

The Number : A Completely Different Way to Think About the Rest of Your Life by Lee Eisenberg.

First, if you jump right to the Amazon link, I'd ignore the (currently first) review by Gaetan Lion. This person completely misses the point of the book. They clearly didn't read the jacket cover, or other info about the book, and expected it to have a magic formula and be specific about calculating your number. Second, if you know how to calculate your number so well, why'd you bother reading the book?!

Anyway, I'd recommend it. It's very well written, and quite an interesting read. From the wit, to the fun stories, to some quite interesting bits of history surrounding financial planning and retirement planning, it's great. No, there is no magic formula, and the book isn't about that. It's more of a philosophical exploration of The Number, retirement planning, and financial planning in general. I'm only 35, but I want to ensure I set my family up for retirement, amongst other things. It was also a timely read, as I'm working with our financial planner right now on some of these issues.

The book is a quick read, and I think of good general interest. It's not the typical kind of thing I read, but I liked it.

Espresso and Coffee in Bend

On our recent trip to Bend, OR, I of course had to scope out a good place for espresso (and WiFi). Starbucks doesn't count. I was happy to find Bellatazza, right in downtown. They had WiFi (with plenty of folks using it), and great espresso - damn good in fact. I had three espresso macchiatos (the real kind, not that abomination of a latte that Starbucks makes; i.e. a doppio with just a tiny layer of foam on top, and well-done foam at that). Very good. My wife had assorted mochas and other such froufrou drinks, and enjoyed them as well. They also had paninis they'd grill up right there for you. Good place. There were a bunch of other indie coffee shops, but I didn't have enough time to give the others a shot, and Bellatazza was so good that I wasn't motivated to risk being disappointed elsewhere.

21 June 2006

Rocklin vs. Bend temperatures

We're currently in Bend, OR, scouting it out as a potential place to move to. People often ask why we are interested in moving from Rocklin, CA. Heat is our #1 reason. For example, check out this comparison of high temps for the next week or so:

Rocklin (°F)   Bend (°F)
102            72
106            75
108            78
111            85
109            89
105            87
100            81
96             76

So far, Bend is good, more on it later.

19 June 2006

Yahoo Maps Beta is Awesome

Have you tried the beta of Yahoo's Maps? You really should, as it really is the best out there right now. Google certainly raised the bar when they brought their maps out, but I almost never use it now, instead much preferring Yahoo's. Yes, it requires Flash (it was built with Flex), but that's part of what makes it a good experience.

Why do I think it's so great? Well, the feature that really hooked me was the ability to right-click on the map and say "Drive from here," or "Drive to here." This is just awesome for those times when it has given you directions, but you know you want to use a different highway or whatnot. You can add additional points, or re-route using this with ease. Also, removing points is a simple click, and it auto-updates the route.

I also really like the little popup you get that looks up addresses and asks you if it's a certain business or other known entity. This is often a nice validation that you've got the right address. But what's even cooler is that when you select a place, it puts that in as part of your directions, and includes the phone number. Right now I've been making maps for our trip to Portland and Bend, Oregon, and this has been very nice for, say, the hotels and other places we're going.

One last thing that's slick is that when you hover over a segment of the route instructions, it highlights that segment on the visual map. The whole system is just really well done, and I find it far nicer to use, and more useful. If you haven't, it's definitely worth taking a look.


16 June 2006

A View Inside Adobe Minnesota and Lightroom Team

Jeff Schewe did a photo writeup on a visit with the Lightroom team at Adobe's Minnesota office. This is rare stuff, as I thought pictures usually weren't allowed in our offices. Of course, here's a shot of Seetha in our San Jose office, and an article on the hilarious fan club he has!

And yes, I currently work on Photoshop, amongst other things.

JSON Presentation by Yahoo at WebGuild

On Wednesday night, I headed to Google to listen to Yahoo's (no irony there, is there?) presentation on JSON, Yahoo web services, and mash-ups. This was a presentation set up by the Silicon Valley WebGuild. It turned out to be quite good, in particular Doug Crockford's presentation on JSON. I also, by complete coincidence (it took both of us a bit of discussion to figure out the connection), ran into a colleague from my original days at Adobe (he was one of the PageMill folks, and I worked on PageMill and related stuff very briefly when I was first hired).

Doug first went over the basics of what JSON is, and then got into what the advantages are, why you'd use it, etc. One of the things he mentioned was that folks have been using JSON, or something extremely close to it, for some time now, but it's only now being recognized and more formalized. I couldn't agree more. I was using JSON about three years ago in a project, yet I'd never heard of JSON. I simply took the JavaScript object literal notation as an easy way to serialize data between C++ and JavaScript code. I parsed it myself on the C++ side (having no clue about JSON, I don't even know if C++ JSON parsers existed then), and obviously just eval'ed it on the JS side.

Here are some of the bits of info I found interesting from the JSON portion (mash-up and web service stuff is below):

  • Yahoo uses JSON to bridge C/C++ and JS code in their web apps
  • There is a MIME type, which is application/json. Doug tried to get a plain type, but application is all they'd allow.
  • There are two ways to parse JSON in JavaScript: first with eval, and then with parseJSON (see below). One person commented that they've used eval to parse 100MB JSON strings with no problems. The main difference is that eval is not strictly safe, but parseJSON is, and should be used when you don't necessarily trust the source.
  • You can use iframes, or the dynamic script tag hack to deal with cross domain and other security issues.
  • There is a proposal for a new JSONRequest, which deals with the security/cross-domain issues in a non-hackish way.
  • JSON.org/json.js is a JS library that adds JSON parsing and handling methods to strings, arrays, and objects. Use this until they get this stuff into the ECMAScript standard.
  • There is a neat template system, that lets you create JS templates, and then substitute into them using JSON. See the supplant method in json.js. Also, going further is the JSONT (JSON transform, vaguely like XSLT) system.
  • I liked their mention of the fact that JSON has no version numbers. There are no plans for changes, and it's totally stable. It's not extensible, so basically, it's what it is now, and that's what it'll be "forever".
  • JSLint is a lint tool for JavaScript, that also does a variety of other code validation, verification, and check-ups. This looks very interesting.
I'm quite sold on JSON myself. I've recently been using Atom, and something similar to GData, in my REST web services, but JSON just makes more sense, and would be a lot easier to use from the apps that are using these services. For example, I have apps in Ruby, as well as in ActionScript (Flash/Flex), both of which can easily parse JSON (Ruby has YAML, which is a superset of JSON, and Adobe has a library for JSON (and other things) for AS). Atom is great, but even though it's extensible, it just seems like you are shoehorning things into it at times, and that raises a flag for me.
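Since I brought up Ruby and YAML: here's a minimal sketch of what I mean - just the approach I'd take, not anything official. It assumes the service's JSON uses spaces after its colons, which keeps Ruby's current YAML parser (Syck) happy:

require 'yaml'

# A made-up JSON payload, as a web service might return it
json = '{"user": "chris", "sites": ["flickr", "del.icio.us"], "count": 2}'

data = YAML.load(json)   # JSON is (nearly) a subset of YAML, so this just works
data['user']             # => "chris"
data['sites']            # => ["flickr", "del.icio.us"]

The ActionScript side can do much the same with the Adobe library I mentioned.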

After the JSON portion, Dan Theurer of Yahoo spoke about their web services and related developer bits and mash-ups. In particular he showed their browser-based authentication system that lets mash-ups have their users authenticate with the Yahoo services that are being mashed in, but without the user ever supplying passwords or account info to the mash-up itself. There was also some discussion including Bill Scott of Yahoo, and some of us in the audience around Yahoo's JS UI toolkit (YUI). Bill is also one of the Rico guys (a competing JS UI lib). Some notes from this portion of the talk:
  • Some specific reasons mentioned to do web services: extend the reach of your apps/services; enable people to work with their data in their own ways; etc.
  • Driving web service adoption: make it free, support it, get feedback, create a community around it, provide sample apps, good docs and API's, etc.
  • The YUI has interesting widgets like a calendar, status/loading, image animation, etc. It is very well documented, but not as extensive as Rico or similar. They will be adding more high level items in the future. There is also the YUI blog.
  • There is a gallery of third party apps that use Yahoo web services.
  • Yahoo's web services use JSON of course, or at least some do, but also Atom, RSS, serialized PHP, SOAP, REST, yREST, RESTful, and more.
Van also mentioned to me that if you're looking at YUI, to also check out Rico's LiveGrid component, which is a pretty cool way of lazy loading data into a table that lives in a scrollable area (i.e. load data as it comes into view, using AJAX).

15 June 2006

New beta of Lightroom

There's a new beta of Adobe Lightroom that just came out. Run out and get it. I've been using Lightroom for a while now, and store all my photos in it. It's really nice.

13 June 2006

Root.net and the Attention Stream

Have you seen Root.net? This is quite interesting. It's a system (a server plus a Firefox extension) to monitor your "attention stream," which mainly amounts to where you go on the Internet, what searches you do, what you post to del.icio.us, etc. And to answer the immediate question: yes, you can blacklist domains you don't want it to monitor (e.g., your corporate intranet).

I am just starting to play with it. I also have to commend them on their UI. Heavy use of AJAX, and a very nice draggable, module-based presentation of your data. It's clean, easy to use, and obvious. Once I get this going a bit more, and maybe get a few friends using it, I'll be interested in using their Attention Exchange, which lets you exchange your attention stream with others to see what sites you have in common, compare your surfing time, etc.

Finally, they have a RESTful web service API. Now just don't let your manager get ahold of your clickstream; you don't want them coming around telling you you spend way too much time on <insert a favorite site here>.

12 June 2006

The Peer to Patent Proposal

I listened to a podcast from ITConversations (my current favorite podcast) today with Beth Noveck, about her Peer to Patent proposal. This was awesome.

Her proposal is not to completely redo the system, or to chuck it altogether, but to use peer review and make things far more like typical scientific review systems. It's relatively simple, and just a great idea. If you're at all influenced by, work with, or affected by patents, or you just work in the software or other tech industries, you should give this a listen or read.

Microsoft Blogging is PR

I thought this was an interesting article over at Microsoft Monitor. It discusses how Microsoft's blogs are a prime PR vehicle for them.

10 June 2006

Adobe blog entry on Microsoft and PDF issue

Mike Chambers has blogged about the Microsoft PDF issue.

09 June 2006

Book review: Micro-ISV: From Vision to Reality

I recently finished reading Micro-ISV: From Vision to Reality by Bob Walsh. I've been thinking more and more about a startup, and in particular a "Micro-ISV", as coined by Eric Sink, who also has a book, Eric Sink on the Business of Software. In general, a Micro-ISV is a one-person, or maybe a couple-person, company: self-funded, producing a software product or web app/service (thus "ISV"). A friend bought both of these books, and we're swapping reading them.

Micro-ISV is a pretty quick, and good, read. It is very practical, with a ton of interviews with Micro-ISV folks. The foreword of both books is by Joel Spolsky, of Joel on Software (I'm personally more in the Paul Graham (blog) camp though :), and there's a longer interview with him later in the book. What I liked about this book is all the practical, and very specific, bits. Walsh covers specific payment/e-commerce systems like PayPal, 2Checkout, and Verisign. He talks about specific associations and communities. He covers the various types of company you can form (sole proprietorship, LLC, S-Corp, etc.) in a very easy to understand, right to the point way. And all the interviews are quite nice.

The main takeaway for me, and this rings true in many things I read from various indie folks, is that the software part is usually the easiest part, and it's all the other stuff (sales, marketing, PR, legal, taxes, etc.) that is the challenge. I'm looking forward to doing it some day (soon?) though.

07 June 2006

Identical UTC dates don't always match in Rails tests?

Here's an odd one. Check out the results of this test:


1) Failure:
test_update_existing_asset(WsControllerTest) [/Users/chbailey/Code/Stingray/test/functional/ws_controller_test.rb:146]:
<Thu Jun 08 05:11:36 UTC 2006> expected but was
<Thu Jun 08 05:11:36 UTC 2006>.

1 tests, 11 assertions, 1 failures, 0 errors

So, uh, how are those time stamps not equal?!
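My best guess so far (unverified) is that the two times agree down to the second but differ beyond it: Time#to_s only prints to second precision, so a Time built in the test (which carries microseconds) can compare unequal to one that round-tripped through the database (which drops them), even though both print identically. A rough illustration, with made-up values:

t1 = Time.utc(2006, 6, 8, 5, 11, 36, 123456)   # usec present, e.g. a freshly built Time
t2 = Time.utc(2006, 6, 8, 5, 11, 36)            # usec zero, e.g. reloaded from the DB
t1.to_s == t2.to_s   # => true, both print as "Thu Jun 08 05:11:36 UTC 2006"
t1 == t2             # => false, the comparison includes microseconds
t1.to_i == t2.to_i   # => true

If that's what's happening, asserting on to_i (or to_s) in the test would be one workaround.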

Testing REST in Rails

I've been working on a Rails app that has a REST API. This API uses the Atom syndication format, very similar to GData. It works, but there are issues, primarily in testing it within Rails. This thread on the Rails mailing list describes it well. The problem is that Rails doesn't easily facilitate passing raw XML data in your POST; it tries to turn that data into a hash. Unlike the call shown in the thread, I've gone and done the following prior to the POST call:

@request.env['RAW_POST_DATA'] = my_xml_post_data

to try to deposit the Atom XML directly. However, Rails seems to still try to hash this, and I get the same warnings as shown in the mailing list thread.

I will be digging into the Rails source to see what can be done about this. But if you've got a workaround, please comment. The one other bit here, and I hope this will be an easy fix/change in the Rails code, is support for content type and accept headers of "application/atom+xml". If you use this today, it does not result in your XML request handler getting invoked. This is not a strict requirement, but in the interest of following standards and so on, it would be good to have.
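For reference, here's roughly the shape of the test I'm attempting (the action and fixture names are made up for illustration, and as noted above, setting the content type this way doesn't yet get the XML handler invoked):

def test_update_existing_asset
  atom_entry = File.read('test/fixtures/asset_entry.xml')   # hypothetical Atom entry fixture
  @request.env['RAW_POST_DATA'] = atom_entry                # deposit the raw XML body
  @request.env['CONTENT_TYPE']  = 'application/atom+xml'    # what I'd like Rails to honor
  post :update, :id => 1
  assert_response :success
end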

05 June 2006

Changed blog software/host

I've moved my blog to Blogger, as you may now be noticing. I'm in the process of setting up redirects and so on to maintain all the old URLs. So far all I have redirected is the home/index/default page of the site, and the Atom/RSS feeds. I'm going to write a script to actually move all my typo blog entries out of my DB and over to Blogger, at which point I'll set up redirects as needed.

Why did I make this switch? Well, while I'm a huge Ruby and Rails fan, and I think typo is actually a superb blog engine, it wasn't where I wanted to spend my effort. I was having a hard time keeping up with updates, and also, my config on TextDrive with Lighty wasn't very stable. Blogger is completely adequate, has a few nice bits, and is stable (in comparison at least). Plus, I've been ramping up more experiments, and will use my domain more for that, where I can take Lighty up and down as needed, etc.


Going to OSCON 2006

I'm quite excited, as I'll be going to OSCON this year. It was a bit of a stretch to get the company to pay for it. There are a few interesting tracks on TDD, unit testing, scripting, etc., and I'm one of the big troublemakers at work in pushing this stuff. In particular I've been, oh the evil, writing some of my features completely in JavaScript, instead of the usual C++! This is actually very cool when you can do it. I took one feature that was only partially done at 2000+ lines of code, rewrote it, and finished it (with all error handling, test support, etc.) in around 650 lines of JS. Of course, while the sessions I sold the company on will be good, I'm even more excited about many of the others, plus the interesting networking at this conference.

03 June 2006

Mapping Rails app hits

This is a great post about how to geographically view/analyze your Rails web app hits. I find it particularly interesting because they also used Flex in the solution.


Cleaning out the computer graveyard


[Photo: Bookshelf, originally uploaded by Chris Bailey]
Today my wife and I spent a good chunk of time cleaning out the home office, aka the computer graveyard. The accumulation of equipment is kind of amazing. People are already awed when they learn we have eight computers in use in the house (plus three others not in use). But the office had gotten out of control. Of course my bookshelf is still completely overflowing. I really need to purge. I rarely do Java work anymore, and a large chunk of space is occupied by Java books (click on the pic to see the books more closely). Some of the things that we found/have, and that are either going on eBay or getting trashed:
  • The Be Book - yes, this is the original manual for the Be computer - a collector's item now I'm sure :) This will go on eBay for sure.
  • 3 dead hard drives
  • Dead SonicWall VPN box
  • Old video cards, NIC's, SCSI cards, etc.
  • USB Bluetooth adaptors
  • Mac Cube computer
  • US Robotics Mac&Fax 28k modem
  • Several Iomega Zip and Jazz drives (I still use a Zip for some things)
  • Tons of cables, many useful, many I have no idea what they even are. Some include parallel, serial, SCSI (in about 4 formats), ADB, telephone, power - the list goes on.
  • Tons of converters, parallel, serial, ADB, null modems, PS/2 to USB, etc.
  • A couple 10Mbit hubs (10MBit is so last century!)
  • A half dozen cruddy old mice (the computer kind, not the rodent!)
  • Lucent Orinoco Gold WiFi PCCard
  • Inkjet printer
  • A flatbed scanner
  • About 10 bags of software/instructions for APC backup power units
The various cables and such really take one back to the pre-USB/firewire days. What a pain. So many varying formats and junk and hoops to jump through.

02 June 2006

Rails and Perforce vs. SVN

I've been doing Rails work, both at work and for some personal projects. At work we use Perforce (http://www.perforce.com) for version control, but at home I use Subversion. It's interesting to compare the two. For example, Jonathan Rentzsch does not like Perforce (http://rentzsch.com/notes/whyNotPerforce). Also, Rails is a little harder to use with Perforce (P4), and has some specific integration with SVN (for example, the "--svn" switch to some scripts).

My take is that SVN is a HUGE improvement over CVS - the atomic checkins alone make SVN all of a sudden a very viable version control system, whereas I think CVS pretty much sucks. But how does Perforce compare? First, I agree with most points Rentzsch makes. P4 is not ideal when you want to work disconnected. And yes, once you've used a system like SVN or CVS where you aren't required to check out a file, P4's requirement for this (shared by several other VC systems) is somewhat painful. However, P4 completely blows most others (all others I've used, including CVS, SVN, Visual SourceSafe, PVCS, and one or two I can't remember) away when it comes to branches and the use of what we call at work "sandboxes" or workspaces.

I've come to almost not want to work without sandboxes. What are they, and why are they so great? A sandbox is essentially just a branch; in fact, it specifically is a branch. Every developer has their own sandbox, and then there's the main code line (on my current project there is another sandbox between those, for our sub-team, but I won't complicate things with that right now). What this does is allow you, as an individual developer, to fully leverage version control in your own work area. Then, when you are ready, you can integrate your code back to the main line.

Before I go further, I want to say how incredibly easy it is to merge/integrate code back and forth between main and your sandbox when using P4. This is, to me, probably Perforce's top strength, and what sets it apart from most other version control systems. In CVS and SVN, it's a complete pain to merge back and forth between branches. You have to look up last merge times, etc. With Perforce, you simply run an integrate command on your sandbox branchspec, and P4 takes care of the whole thing, in either direction. It does a merge, shows you what conflicts, and then has many automated ways to resolve conflicts. I should also note that P4's merge abilities are top notch (SVN has no auto-merge that I'm aware of). Perforce's merging and branch handling is so good that I regularly do this multiple times on any given day.
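To give a flavor of it (the branchspec name is just an example, and this assumes the branchspec maps main to the sandbox), the whole round trip is basically:

p4 integrate -b chris_sandbox      # bring main line changes into my sandbox
p4 resolve -am                     # auto-merge everything that merges cleanly
p4 resolve                         # step through whatever conflicts remain
p4 submit

p4 integrate -b chris_sandbox -r   # and -r reverses it, pushing sandbox work back toward main

That's it - no hunting down revision numbers or remembering when you last merged.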

Anyway, once you work in this manner, you come to rely on it. It's superb as well if you work in a cross-platform environment with compiled code. For example, you can develop a feature on, say, your Mac, build and test, then check it in to your sandbox. Then you go to your Windows box, sync up to your sandbox, build and test. Once it's dialed in on both platforms, you can merge to main.

These techniques are very useful with teams, as one can imagine. However, it's also useful just for a solo developer as well. With such easy branch use, you can create branches for experiments, or longer term/parallel work.

Coming back around to Rails: I use P4 with it at work, but it's more difficult due to P4's requirement to check files out prior to editing them. As Rentzsch mentions, various editors make this easier with their P4 integration, but that doesn't help with Rails' various generators or, say, plugin installer scripts. These scripts just expect to be able to directly edit a file. Luckily you can at least do a dry run and pre-check-out files, but that's kind of a pain. So, I'm hoping to start working on adding a "--p4" parameter as a corollary to the "--svn" parameter. If/once I get that done, I will of course contribute it back. Feel free to let me know if you'd find that interesting.
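For the record, the dry run workaround I mentioned looks roughly like this (the generator and file names are purely illustrative):

ruby script/generate scaffold Asset --pretend   # dry run: list the files it would create or overwrite
p4 edit app/views/layouts/assets.rhtml          # pre-open any existing files it wants to overwrite
ruby script/generate scaffold Asset             # now run it for real
                                                # (newly created files still need a p4 add afterward)

A "--p4" switch would ideally make those p4 edit/p4 add calls for you, much as "--svn" does for Subversion.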