Blog

Skiing on the North Shore of Lake Tahoe

Last week, work brought me to San Francisco.  My good buddy Jim moved to the Berkeley area in the fall of last year, so I took the opportunity and headed up to Lake Tahoe with him.  Tahoe is an interesting place: it tends to be either very sunny or very snowy, and rarely anywhere in between!  My new project, WhereShouldISki, said it would be sunny, and it was right!

We started off Friday with a trip to Mt Rose, NV; I was intrigued by the famed chutes.  They were definitely a fun challenge, and I would be glad to go back.

Heading over to NV

The Chutes

Mt Rose Ski Area

After a backcountry route recommendation from Mike at the BackCountry store in Truckee, we decided to head towards Rose Knob Peak near Incline Village, NV.  On Saturday morning we started skinning (like hiking on skis) up towards Rose Knob Peak.  We were treated to spectacular weather and even better views.

Getting started in the morning

Taking a break for lunch

Amazing views from the top of Rose Knob Peak

A great day out

The snow itself didn't ski particularly well on the way down; it was a bit sun-crusted and extremely variable.  That's what happens sometimes in the backcountry, though.  It's not all about the way down, it's about spending time in the mountains with friends!  Looking forward to heading back!

Skiing down

Flying out, Heavenly Ski Area from the plane

Ski Touring at White Pass

The wife and I checked out White Pass this past Sunday to get some early-season skiing in.  The ski resort wasn't open yet, so we ski toured (aka backcountry skiing) up the resort using our own legs.  It was our first tour of the season, with the goal of getting re-acquainted with the equipment and getting our climbing legs back.  I'd say it was a success - pretty much any day in the mountains is better than a day out of the mountains.


Base of White Pass Ski Area - we saw three other groups on the day


On the way up, looking down at what I believe is Hourglass (from the Cascade cat track) - it definitely needs more snow


Happy wife at the top


Top part of Cascade - choose your line wisely!


Making turns


Whoops, not the best photo positioning, but still fun


The dog had a blast. Here he is telling me to throw his squeaker ball for him again.


Post-tour soup and hanging out

All-in-all, a fun day. Now let's get more snow!


First Olympic Triathlon

On September 1st, my wife and I each completed our first Olympic-distance triathlon, the Titanium Man Triathlon hosted by the local triathlon group in the Tri-Cities of Washington. Olympic distance means roughly a 1 mile swim, 25 mile bike, and 6.2 mile (10k) run. Overall, it went pretty well. My time was 2 hours, 35 minutes; the wife finished in 3 hours, 5 minutes. Here's how the morning went.


First, let's set up the swim-to-bike transition. It turns out the wife's back tire (not pictured) was going flat, so we had to put in her spare tube. Fortunately, the nice people at REI gave us a free replacement spare, and it turns out we didn't need it after all.

Setting up the bike-run transition was relatively uneventful (which is good), so now let's get on to the start! 


Photo from tri-cityherald.com

Everyone swim! This actually didn't go so hot for me: I did well in the first quarter and last quarter, but in the middle half I couldn't get into a good rhythm. I was passed by a number of people, so I just had to try to relax and do well on the bike and run.


Photo from tri-cityherald.com

Lots of bikes at T1.  The bike went much better; I felt quite good on it.  The benefit of "underperforming" in the swim is that I got to "overperform" on the bike and pass a lot of people, especially on the uphills - advantages of being skinny, I suppose.


The Wife on the bike


Me coming out of T2 starting the run

The run was relatively uneventful.  I was pretty gassed in the latter half, particularly miles 4-5.5, but in the end we both finished!


Looking forward to participating in more triathlons in the years to come!


Charting a Temperature Over Time Plot with Python and jQuery

The finished product!

I've got a few tech side projects in my head, and one of them involves charting multiple data sets on the same graph.  As we are in the middle of summer, I thought it might be interesting to look up a few different zip codes across the country and see what the temperatures are like throughout the day.  Here are the basic requirements:

  • Query a weather service on a set schedule for a variety of locations and record the current weather at that time to a DB.
  • Using a JavaScript library, create a graphical representation of the weather across the different areas.
  • Updates must happen automatically on the hour.
In theory it sounds simple, but as with most things that sound simple, it takes a little longer than expected.

Identify the Weather Source

Let's tackle item 1 first.  It turns out my favorite weather site, Wunderground, has a developer API.  So I signed up for an API key, and it's very easy to create a RESTful query to retrieve the current weather for a zip code: http://api.wunderground.com/api/{API_KEY}/conditions/q/99354.json

Output (response truncated in areas): 

{
    "response": {
        "version": "0.1",
        "termsofService": "http://www.wunderground.com/weather/api/d/terms.html",
        "features": {
            "conditions": 1
        }
    },
    "current_observation": {
...
        "display_location": {
           "full": "Richland, WA",
            "city": "Richland",
            "state": "WA",
            "state_name": "Washington",
            "country": "US",
            "country_iso3166": "US",
            "zip": "99354",
            "latitude": "46.28040314",
            "longitude": "-119.29050446",
            "elevation": "120.00000000"
        },
...
        "local_tz_offset": "-0700",
        "weather": "Clear",
        "temperature_string": "98.7 F (37.1 C)",
        "temp_f": 98.7,
        "temp_c": 37.1,
        "relative_humidity": "31%",
}

Using Python to Retrieve the Weather

Perfect! Now I just need a way to programmatically query this service and interpret the response. I've been experimenting with Python in some machine learning books/tutorials, so let's continue getting familiar with it (though in hindsight it sounds like Ruby may have been the ideal language to do this in). I used urllib2 to make the GET request and installed/imported the simplejson library to parse the JSON response. Then I created an array of the zip codes I wanted to query:

weatherping.py

import urllib2
import simplejson
import _mysql

# Query the Wunderground conditions endpoint and return the current
# temperature (deg F) for the given zip code.
def getTemperature(location):
    url = 'http://api.wunderground.com/api/{API_KEY}/conditions/q/' + location + '.json'
    response = simplejson.load(urllib2.urlopen(url))
    temperature = response["current_observation"]["temp_f"]
    return temperature

locations = ['99354', '55403', '80482', '46953']
for location in locations:
    temperature = str(getTemperature(location))

Note – the zip codes were semi-randomly chosen based on what was relevant at the time. Version 2 of this will likely have a little more variety. Now, we just need to store it. Since the site already runs on a LAMP stack, let’s just use the existing MySQL DB to store this. So after connecting to the DB (not shown), I added another line in the for loop:

db.query("insert into weather (zip, temperature, time) values ('" + location + "', '" + temperature + "', NOW());")

Scheduling

Great, so now I have a Python script I can run to query a list of zip codes and store the results in my DB. Let's work on item 3 next - scheduling. Let's use the "L" portion of the LAMP stack and add a cron job to run my Python script once per hour, on the hour.

0 * * * * python ~/python/weatherping.py
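
If the script hiccups at 3 AM I'd like some record of it, so one small variation worth considering is redirecting the script's output to a log file (the path below is just an example, not what my crontab actually uses):

0 * * * * python ~/python/weatherping.py >> ~/python/weatherping.log 2>&1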

Items 1 and 3 now done!

Which Graphical Library?

So now my next decision is what library I should use to display the graphs.  I figured I wanted to use either a jQuery extension or Dojo.  I had heard Dojo has good charting capabilities, which appears to be true. I'd used Dojo on a previous project in 2008 and thought it would be quick to pick it up again, find a charting example, create some sort of data grid, and call it done.  However … unfortunately it wasn't quite that easy.  I had trouble running the examples until I put the source code into my Mac's local built-in web server.  After a little frustration, I then had trouble getting a good plot-over-time graph.  Finally I was able to get one plot with poor labels, but then I had trouble getting more than one series onto a single chart. Boo!

Enter jqPlot.  Pretty quickly I found an example with multiple series on a single chart: http://www.jqplot.com/tests/line-charts.php, then I found a way to zoom in: http://www.jqplot.com/tests/zooming.php, plus an example of loading data through AJAX: http://www.jqplot.com/tests/data-renderers.php.  The only thing I didn't quite figure out right away was the format the data needed to be in: if you have the data locally, you can use single quotes, but if you're loading it via AJAX, you need double quotes around any non-numeric values.
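
To make that concrete, here's the shape of what my data endpoint returns: an array of series (one inner array per zip code, in the order the PHP script in the next section prints them), where each point is a double-quoted timestamp paired with a temperature. The numbers below are made up purely to illustrate the format:

[
  [["2013-07-20 17:00:00", 98.7], ["2013-07-20 18:00:00", 97.2]],
  [["2013-07-20 17:00:00", 71.4], ["2013-07-20 18:00:00", 69.8]],
  [["2013-07-20 17:00:00", 84.1], ["2013-07-20 18:00:00", 82.5]],
  [["2013-07-20 17:00:00", 88.3], ["2013-07-20 18:00:00", 86.0]]
]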

Retrieve the Data

It seems I have everything I need except a way to programmatically pull the data.  Well, let's use the P in LAMP and create a small script to pull the data out (yes, I got a little bit lazy on the logic for pulling multiple zip codes):

mysql_connect("localhost",$username,$password);
@mysql_select_db($database) or die( "Unable to select database");

 

echo "[";
retrieveZipAndPrint("99354");
echo ",";
retrieveZipAndPrint("80482");
echo ",";
retrieveZipAndPrint("55403");
echo ",";
retrieveZipAndPrint("46953");
echo "]";
mysql_close();

  

function retrieveZipAndPrint($zip) {
  $query="SELECT * FROM weather where zip='".$zip."' and time >= date_sub(NOW(), interval 7 day)";
  $result=mysql_query($query);

  $first=true;
  echo "[";
  while($row = mysql_fetch_array($result)){
    if ($first != true)
      echo ",";
    echo "[\"".$row['time']."\",".$row['temperature']."]";
    $first = false;
  }
  echo "]";
}
 

To see a snapshot of my data you can view/download it here

Graphing the Data

After some investigation and tweaking of the examples mentioned above, I finally have a working graph (see above)!

$(document).ready(function(){
   
var ajaxDataRenderer = function(url, plot, options) {
    var ret = null;
    $.ajax({
      // have to use synchronous here, else the function 
      // will return before the data is fetched
      async: false,
      url: url,
      dataType:"json",
      success: function(data) {
        ret = data;
      }
    });
    return ret;
  };
 
    var plot1 = $.jqplot('chart1', "./temperatureData.txt", { 
        title: 'Temperature Over Time (GMT)', 
        dataRenderer: ajaxDataRenderer,
        series: [{ 
            label: '99354 - Richland, WA'
        }, 
        {
            label: '80482 - Winter Park, CO'
        }, 
        {
            label: '55403 - Minneapolis, MN'
        },
        {
            label: '46953 - Marion, IN'
        }],
        axes: { 
            xaxis: { 
                renderer:$.jqplot.DateAxisRenderer,
                tickRenderer: $.jqplot.CanvasAxisTickRenderer,
                tickOptions: {
                  angle: -40
                } 
            }
        }, 
        cursor:{
            show: true, 
            zoom: true
        },
        legend: { show: true } 
    });
});
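
One gotcha worth calling out: the date axis, the rotated canvas tick labels, and the zooming cursor each live in separate jqPlot plugin files, so the HTML page needs those includes on top of jQuery and core jqPlot. The paths below assume the stock jqPlot download layout (and a jQuery copy named jquery.min.js); adjust them to wherever you unpacked the library:

<link rel="stylesheet" type="text/css" href="jquery.jqplot.min.css" />
<script type="text/javascript" src="jquery.min.js"></script>
<script type="text/javascript" src="jquery.jqplot.min.js"></script>
<script type="text/javascript" src="plugins/jqplot.dateAxisRenderer.min.js"></script>
<script type="text/javascript" src="plugins/jqplot.canvasTextRenderer.min.js"></script>
<script type="text/javascript" src="plugins/jqplot.canvasAxisTickRenderer.min.js"></script>
<script type="text/javascript" src="plugins/jqplot.cursor.min.js"></script>

<div id="chart1" style="height:400px; width:700px;"></div>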
 

To see the full HTML you can view it at http://ericfahsl.com/temperature/temperaturePlot.html

Next Steps

This was a good learning experience. After doing this I know there is a version 2 of this graph that I'd like to do, including:

  • Updating the zip codes - let's include Alaska, San Diego, and maybe some foreign cities.
  • Including humidity; I usually hate humidity!
  • Local time - it's probably not quite fair to compare cities at the exact same moment, since around sunrise/sunset they could be at very different points in their day.
  • An unstructured/NoSQL DB, such as Apache Cassandra.  I used MySQL because it was quick and my data model was planned to be static for this phase.  As I start to think about the above attributes, I'm going to need to update my table structure - maybe it's time to go to an unstructured NoSQL DB...
  • Defining an "ideal day" of temperature and humidity and starting to run calculations. This gets into a little bit of machine learning: as I collect more data, can I rate different cities on how well suited they are to my tastes, or to someone who loves heat, or to an animal that loves the coldest possible weather?  Lots of interesting ideas to play with (a very rough sketch of such a scoring follows below).
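
To give a flavor of that last idea, here's a very rough, hypothetical sketch of such a scoring function. The "ideal" targets and the penalty weights are completely made up, and the real version would need actual humidity data in the table first:

# Hypothetical comfort score: 100 is a perfect reading, and we subtract a
# penalty for every degree / percentage point away from made-up targets.
IDEAL_TEMP_F = 72.0
IDEAL_HUMIDITY = 40.0

def comfort_score(temp_f, humidity):
    penalty = 1.5 * abs(temp_f - IDEAL_TEMP_F) + 0.5 * abs(humidity - IDEAL_HUMIDITY)
    return max(0.0, 100.0 - penalty)

def rate_city(readings):
    # Average the comfort score over a list of (temp_f, humidity) readings.
    scores = [comfort_score(t, h) for (t, h) in readings]
    return sum(scores) / len(scores)

# A warm, dry stretch should score higher (for my tastes) than a hot, humid one.
print rate_city([(75.0, 35.0), (80.0, 30.0)])
print rate_city([(95.0, 70.0), (98.0, 65.0)])

Swapping in a different set of targets and weights would model someone who loves heat, or the cold-loving dog.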