Ski Touring at White Pass

The wife and I checked out White Pass this past Sunday to get some early-season skiing in.  The ski resort wasn't open yet, so we ski toured (aka backcountry skiing) up the resort using our own legs.  It was our first tour of the season, with the goal of getting re-acquainted with the equipment and getting our climbing legs back.  I'd say it was a success - pretty much any day in the mountains is better than a day out of the mountains.

Base of White Pass Ski Area - we saw three other groups on the day

On the way up, looking down at Hourglass I believe (from Cascade Cat track) - definitely needs more snow

Happy wife at the top

Top part of Cascade - choose your line wisely!

Making turns

Whoops, not the best photo positioning, but still fun

The dog had a blast. Here he is telling me to throw his squeaker ball for him again.

Post-tour soup and hanging out

All-in-all, a fun day. Now let's get more snow!


First Olympic Triathlon

On September 1st, my wife and I each completed our first Olympic-distance triathlon, the Titanium Man Triathlon, hosted by the local triathlon group in the Tri-Cities of Washington. Olympic distance means a 1-mile swim, 25-mile bike, and 6.2-mile (10k) run. Overall, it went pretty well: my time was 2 hours, 35 minutes, and my wife finished in 3 hours, 5 minutes. Here's how the morning went.


First, let's set up the swim-to-bike transition. It turns out the wife's back tire (not pictured) was going flat, so we had to use her spare tube. Fortunately, the nice people at REI gave us a free spare tube, and it turns out we didn't need the spare after all.

Setting up the bike-run transition was relatively uneventful (which is good), so now let's get on to the start! 


Everyone swim! This actually didn't go so hot for me: I did well in the first and last quarters, but in the middle half I couldn't get into a good rhythm. I was passed by a number of people, so I just had to try to relax and do well on the bike and run.


Lots of bikes at T1.  The bike leg went much better - I felt quite good on it. The benefit of "underperforming" in the swim is that I got to "overperform" on the bike and pass a lot of people, especially on the uphills - an advantage of being skinny, I suppose.

The Wife on the bike

Me coming out of T2 starting the run

The run was relatively uneventful, though I was pretty gassed in the latter half, particularly miles 4-5.5.  But in the end we both finished!


Looking forward to participating in more triathlons in the years to come!


Charting a Temperature Over Time Plot with Python and JQuery

The finished product!

I’ve got a few tech side projects in my head, and one of them involves charting multiple data sets on the same graph.  As we are in the middle of summer, I thought it might be interesting to look up a few different zip codes across the country and see what the temperatures are like throughout the day.  Here are the basic requirements:

  • Query a weather service on a set timeline for a variety of locations.  Record the current weather at that time to a DB.
  • Using a JavaScript library, create a graphical representation to show the weather across the different areas
  • Updates must happen automatically on the hour
In theory it sounds simple, but as with most things that sound simple, it takes a little longer than expected.

Identify the Weather Source

Let’s tackle item 1 first.  It turns out my favorite weather site, Wunderground, has a developer API.  So I signed up for an API key, and it’s very easy to create a RESTful query to retrieve the current weather for a zip code: {API_KEY}/conditions/q/99354.json

Output (response truncated in areas): 

    "response": {
        "version": "0.1",
        "termsofService": "",
        "features": {
            "conditions": 1
    "current_observation": {
        "display_location": {
           "full": "Richland, WA",
            "city": "Richland",
            "state": "WA",
            "state_name": "Washington",
            "country": "US",
            "country_iso3166": "US",
            "zip": "99354",
            "latitude": "46.28040314",
            "longitude": "-119.29050446",
            "elevation": "120.00000000"
        "local_tz_offset": "-0700",
        "weather": "Clear",
        "temperature_string": "98.7 F (37.1 C)",
        "temp_f": 98.7,
        "temp_c": 37.1,
        "relative_humidity": "31%",

Using Python to Retrieve the Weather

Perfect! Now I just need a way to programmatically query this service and interpret the response. I’ve been experimenting with Python in some Machine Learning books/tutorials, so let’s continue to get familiar with it (though in hindsight it sounds like Ruby may have been the ideal language for this). First I used urllib2 to make the GET request and installed/imported the simplejson library to interpret the JSON response. Then I created an array of the zip codes I wanted to query:

import urllib2
import simplejson
import _mysql

def getTemperature (location) :
        url = '{API_KEY}/conditions/q/' + location + '.json'
        response = simplejson.load(urllib2.urlopen(url) )
        temperature = response["current_observation"]["temp_f"]
        return temperature

locations = ['99354', '55403', '80482', '46953']
for location in locations :
        temperature = str(getTemperature(location))

Note – the zip codes were semi-randomly chosen based on what was relevant at the time. Version 2 of this will likely have a little more variety. Now, we just need to store it. Since the site already runs on a LAMP stack, let’s just use the existing MySQL DB to store this. So after connecting to the DB (not shown), I added another line in the for loop:

db.query("insert into weather (zip, temperature, time) values ('" + location + "', '" + temperature + "', NOW());")
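As a side note, building the INSERT by string concatenation works, but it is vulnerable to SQL injection and quoting bugs. A safer pattern is a parameterized query, where the driver escapes the values. Here is a minimal sketch of that pattern; it uses Python's stdlib sqlite3 purely for illustration (the post uses MySQL via _mysql, where MySQLdb's cursor.execute with %s placeholders plays the same role).

```python
import sqlite3

# Illustrative stand-in for the post's MySQL table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE weather (zip TEXT, temperature REAL, time TEXT)")

def store_temperature(conn, zip_code, temperature):
    # Placeholders let the driver escape the values, avoiding SQL injection
    conn.execute(
        "INSERT INTO weather (zip, temperature, time) VALUES (?, ?, datetime('now'))",
        (zip_code, temperature),
    )

store_temperature(conn, "99354", 98.7)
row = conn.execute("SELECT zip, temperature FROM weather").fetchone()
print(row)  # -> ('99354', 98.7)
```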


Great, so now I have a Python script I can run to query a list of zip codes and store them into my DB. Let’s work on item 3 next – scheduling. Let’s use the “L” portion of the LAMP stack and add a cron job to run my python script once per hour on the hour.

0 * * * * python ~/python/

Items 1 and 3 now done!

Which Graphical Library?

So now my next decision is which library to use to display the graphs.  I figured I wanted to use either an extension for JQuery or Dojo.  I had heard Dojo has good charting capabilities, which appears to be true. I'd used Dojo on a previous project in 2008, and I thought it would be quick to pick it up again, find a charting example, create some sort of data grid, and call it done.  However ... unfortunately it wasn't quite that easy.  I had trouble running the examples until I put the source code onto my Mac's built-in local web server.  After a little frustration, I then had trouble getting a good plot-over-time graph.  I was finally able to get one plot with poor labels, but then I had trouble putting more than one series on a single graph. Boo!

Enter JQPlot.  Pretty quickly I found an example with multiple series on a single chart, then a way to zoom in, plus an example of loading data through AJAX.  The only thing I didn't quite figure out right away was the format the data needed to be in.  If you have the data locally, you can use single quotes, but if loading it via AJAX, you need double quotes around any non-numeric values - in other words, it must be valid JSON.
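The single-vs-double-quote distinction is really "JavaScript literal vs. valid JSON." One way to sidestep it entirely is to generate the payload server-side with a JSON library, which always emits double quotes. A small sketch in Python (the sample values here are made up for illustration):

```python
import json

# jqPlot expects each series as a list of [x, y] pairs; when fetched via
# AJAX the payload must be valid JSON, which requires double-quoted strings.
# json.dumps guarantees that, so there is nothing to hand-format.
series = [
    [["2013-07-01 00:00:00", 98.7], ["2013-07-01 01:00:00", 96.2]],  # 99354
    [["2013-07-01 00:00:00", 71.3], ["2013-07-01 01:00:00", 69.8]],  # 80482
]
payload = json.dumps(series)
print(payload[:40])
```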

Retrieve the Data

It seems I have everything I need except a way to programmatically pull the data.  Well, let's use the P in LAMP and create a small script to pull the data out (yes, I got a little lazy on the logic for pulling multiple zip codes):

@mysql_select_db($database) or die("Unable to select database");

echo "[";
retrieveZipAndPrint('99354');
echo ",";
retrieveZipAndPrint('80482');
echo ",";
retrieveZipAndPrint('55403');
echo ",";
retrieveZipAndPrint('46953');
echo "]";

function retrieveZipAndPrint($zip) {
  $query = "SELECT * FROM weather where zip='".$zip."' and time >= date_sub(NOW(), interval 7 day)";
  $result = mysql_query($query);

  $first = true;
  echo "[";
  while ($row = mysql_fetch_array($result)) {
    if ($first != true)
      echo ",";
    echo "[\"".$row['time']."\",".$row['temperature']."]";
    $first = false;
  }
  echo "]";
}

To see a snapshot of my data you can view/download it here

Graphing the Data

After some investigation and tweaking of the examples mentioned above, I finally have a working graph (see above)!

var ajaxDataRenderer = function(url, plot, options) {
    var ret = null;
    $.ajax({
      // have to use synchronous here, else the function
      // will return before the data is fetched
      async: false,
      url: url,
      dataType: "json",
      success: function(data) {
        ret = data;
      }
    });
    return ret;
};

var plot1 = $.jqplot('chart1', "./temperatureData.txt", {
    title: 'Temperature Over Time (GMT)',
    dataRenderer: ajaxDataRenderer,
    series: [
        { label: '99354 - Richland, WA' },
        { label: '80482 - Winter Park, CO' },
        { label: '55403 - Minneapolis, MN' },
        { label: '46953 - Marion, IN' }
    ],
    axes: {
        xaxis: {
            tickRenderer: $.jqplot.CanvasAxisTickRenderer,
            tickOptions: {
                angle: -40
            }
        }
    },
    cursor: {
        show: true,
        zoom: true
    },
    legend: { show: true }
});
To see the full HTML you can view it at

Next Steps

This was a good learning experience. After doing this I know there is a version 2 of this graph that I'd like to do, including:

  • Updating the zip codes, let's include Alaska, San Diego, and maybe some foreign cities.
  • Include humidity, I usually hate humidity!
  • Local time - it's probably not quite fair to compare cities at the exact same instant, since around sunrise/sunset the readings can vary quite a bit
  • Unstructured/NoSQL DB, such as Apache Cassandra.  I used MySQL because it was quick and my data model was planned to be static for this phase.  As I start to think about the above attributes, I'm going to need to update my table structure, maybe it's time to go to an unstructured NoSQL DB...
  • Define an "ideal day" of temperature and humidity, and start running calculations. This starts to get into a little bit of machine learning. As I collect more data, can I rate different cities by how well suited they are to my tastes - or to someone who loves heat, or an animal that loves the coldest possible weather? Lots of interesting ideas to play with.
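For the local-time idea above, the core mechanic is converting each stored GMT timestamp to the observation's local zone before comparing cities. A minimal sketch using Python's stdlib zoneinfo (3.9+); the zip-to-timezone mapping is my own assumption based on the cities in the post:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Hypothetical mapping for the post's four zip codes
zip_to_tz = {
    "99354": "America/Los_Angeles",            # Richland, WA
    "80482": "America/Denver",                 # Winter Park, CO
    "55403": "America/Chicago",                # Minneapolis, MN
    "46953": "America/Indiana/Indianapolis",   # Marion, IN
}

def to_local(utc_dt, zip_code):
    """Convert a UTC observation time to the zip code's local time."""
    return utc_dt.astimezone(ZoneInfo(zip_to_tz[zip_code]))

utc = datetime(2013, 7, 1, 18, 0, tzinfo=timezone.utc)
print(to_local(utc, "99354").hour)  # 11 (PDT is UTC-7 in July)
```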

Migrating an Expiring Amazon EC2 Free Account to a New Free Account

My site is hosted on the Amazon Cloud and was free until April 30, 2012.  Specifically, it was hosted using the EC2 Micro Instance free tier.  I first created my account so that I could experiment with having a LAMP server in the cloud and learn a bit more about it.  It then turned into the Sammy Cam, to see what my dog was up to in the corporate apartment.

So my free tier is ending - what do I do?!  Can I migrate my web server to another free account for a year and shut off the server that I now must pay actual money for?  Yes!  Do I need to restart from scratch?  No!  It turns out that it's pretty easy to create an AMI (Amazon Machine Image - a snapshot of the current server) and share it with another account.  I'm hoping that no one from Amazon reads this (or cares).  Here were my steps:

  1. Log in to the Amazon Management Console (let's call this one account A), then go to the EC2 section
  2. Select your web server instance and create an AMI:

  3. It will take some time to create the image, so let's work on setting up our second EC2 free account (let's call this account B)
  4. Log out of AWS or open a new web browser and sign up for another EC2 account
  5. Enter your credit card, confirm your info, and activate your account
  6. Log in to account B, and obtain your account number.  You can find this under Account Activity (which you can get to by clicking your name in the upper-right).
  7. The AMI is probably created by now, go back to account A and check on the status, you should see it under EC2 -> AMIs

  8. Now we need to share this AMI from account A to account B. Select the AMI and choose "Permissions"

  9. Enter the account number from account B (you will need to remove the dashes)
  10. Log in to account B and look in the AMIs section under the EC2 tab
  11. Now it's time to launch your instance in account B.  Choose your availability zone (I do the default)

    The steps below could be optional, depending on how much pre-work you have done with account B. 

  12. Set up your security groups (I add ports for HTTP, HTTPS, and SSH)

  13. I also like to set up an elastic IP.  In the EC2 tab, choose "Elastic IPs" and associate your instance
  14. You should now have an elastic IP address - try confirming this IP takes you to your web server and conduct a quick smoke test to ensure functionality is working as-expected
  15. Now let's set up our DNS.  I'm using Route 53 and GoDaddy for my domain name (probably not the best choice, but it works). Go to the Route 53 tab and sign up
  16. Create a hosted zone with your domain name ( in my case)
  17. Next, create a record set.  I added two additional record sets that conduct simple (A) routing to an IPv4 address, filled in with my elastic IP address.

  18. Next I go to my GoDaddy account and update my four nameserver addresses with what was provided to me by Route 53
  19. If we're feeling adventurous, let's go back into account A and stop the existing web instance
  20. Now, let's try going to our original domain name.  If all goes well we will see our original page!  You can also try SSH'ing into your web server

    If you don't see your new web server, make sure your elastic IP is still a valid address.  If that's still working as-expected, it could be the DNS takes some time to update.  If all-else fails, start up your original instance on account A while you continue to troubleshoot.
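One quick way to do that elastic-IP check is a small DNS lookup script. A sketch using Python's stdlib socket module; the hostname and IP below are placeholders for your own domain and elastic IP:

```python
import socket

def dns_matches(hostname, expected_ip):
    """Return True if the hostname currently resolves to expected_ip."""
    try:
        return socket.gethostbyname(hostname) == expected_ip
    except socket.gaierror:
        # Name doesn't resolve at all - DNS propagation may still be pending
        return False

# Placeholder values; substitute your domain and elastic IP address
print(dns_matches("localhost", "127.0.0.1"))  # True on most systems
```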

    Good luck! 

