Before the Pacing
With my Grindstone 100 service requirement looming over my head, I decided early on that I would help out at the Vermont 100 as a volunteer, a pacer, or both. After emailing the pacing director, I was matched up with Andy N. from Massachusetts, a two-time, ~20 hour finisher of this race. Initially, I was a little worried about pacing someone of this caliber since I know I tend to be erratic with my mile times during longer races, but with a shortage of available pacers, I stayed with Andy. I arrived at the race site Friday afternoon just in time for the pre-race meeting, which was highly informative. I'm not sure how I missed this fact over my last two years in the ultrarunning community, but the race director informed us that the Vermont 100 is the only race of its kind where humans race the same course as horses. Yes, I said horses.
The pre-race meeting ended and I went back to my car while the runners ate a wonderful pre-race feast (very worth the money if you’re wondering – pacers can get a meal ticket and eat free). I listened to the Yankees game on the radio (only station I could get) and fell asleep in my car (too lazy to break out the tent).
I was awakened by the sound of heavy rain at 3:45am Saturday morning, perfect timing since the race starts at 4am. Luckily, the rain did not last long and the runners were off. I went back to bed so that I would be rested for the 30 miles of running I would be doing later. I woke up at 10am and was immediately bored and anxious. I started questioning why I hadn't just signed up for this race myself, but tried to relax and slowly get ready. As I walked around the camp, I could feel that the air wasn't overly hot, but it was thick and humid, and I wondered how the runners would hold up in the overbearing humidity. Eventually, I got dressed in my running gear, grabbed my water bottles and went to the main tents to wait for the shuttle to Camp 10 Bear, which does double aid station duty at the 47 and 70.1 mile marks. I got there around 2pm and decided to hang around and take in the excitement of the runners coming into 10 Bear.
Watching the runners come into 10 Bear was both exciting and informative. Just by looking at their clothing, you could see that a lot of runners had been pummeled by the humidity and were moving slower than they had expected. Quite a few looked extremely dehydrated, and I heard plenty of stomach-related complaints. Seeing the condition of the runners made me focus on hydrating while I waited, even though it meant about 15 trips to the Porta-Potty. Then the heavens opened up, and some gnarly lightning and thunder accompanied by rain drenched 10 Bear on and off for an hour or so.
The Pacing Begins - Mile 70.1 Camp 10 Bear Aid Station
An hour ahead of schedule, my runner came into 10 Bear at the 70.1 mile mark right at 5pm (13 hours in). He had been running near the top 20, but as he shuffled into 10 Bear I could see that he was hurting, though definitely not near that red line ... yet. He did mention in passing that he had contemplated dropping, thinking he had gone out too fast, but I think I pressured him into heading back out. While he took care of his feet, I grabbed his drop bag and got him some M&Ms, cookies and HEED. We set out from 10 Bear at a decent walking pace in preparation for the uphill trail section ahead. We handled the trails at a slow but easy pace, quick enough that only 2 runners passed us while we passed one ourselves. Andy and I strolled into Seabrook (74.7 miles), a small aid station along a gravel road, and my runner was starting to get a little cranky; I still wasn't concerned, though, because we were moving at a decent pace and a 20 hour finish was well within our grasp. Once we left Seabrook, however, I realized my runner was in serious mental trouble.
When you're pacing, I think you're almost more conscious of the things you have learned from running and from reading other ultrarunners than when you are running your own race. I could see that there was nothing medically wrong with my runner; he was hitting a major mental hurdle, and it was only getting worse. The talk of dropping at West Winds started again, especially as we encountered some steep, very muddy singletrack. As horse riders passed us saying how good we looked, his comments were all negative and defeatist. For this section, my approach was to be neither encouraging nor discouraging, but analytical: I told him that this is a common feeling and that once he was beyond West Winds he would clear the mental hurdle.
77 Miles Down - West Winds Aid Station
We arrived at West Winds and the situation didn't improve as much as I had hoped. He was turning down water, stating he couldn't drink any more (not a good sign, as this leads to other medical problems later), although he was still eating. To his credit, he didn't stay at West Winds for more than 5 minutes and was soon back out shuffling along. This ended up being the calm before the storm: we jogged a nice gravel road section, passing a few 100K runners, and arrived at the unmanned Goodmans aid station. I was actually thinking we had a real chance of keeping a decent pace and still reaching that 20 hour goal, until we left Goodmans and night started to roll in.
It amazed me, from the pacer's perspective, that you can watch a fellow runner deteriorate mentally right before your eyes. As dusk settled in and it grew darker, the incessant pleas to quit, DNF and just sleep started raining down on me. Again, I tried to be analytical about the situation, stating that "it's a normal reaction to night rolling in and we've all been there." Our conversation for the next 2.6 miles was straight out of a bad marriage, with him saying what he wanted and not listening to a word I said, and vice versa. It was an extremely long 2.6 miles for me because I continued to walk silently when all I wanted to do was squirt my water bottle at him. We finally arrived at Cow Shed, where he immediately stated his intention to lie down; he took two blankets and lay face down on the ground outside the tent. When he went down, I started to think he wasn't going to get over the mental hurdle. After he had been down for 30 minutes, I caught a break: the Vermont bugs started biting his face, making him uncomfortable enough to want to keep going. I knew he was leaving not because he wanted to run but because he wanted to leave, so I was dreading the long 5 miles between Cow Shed and Bill's, but slowly we left the great volunteers there.
It's important to note that the night air was so humid and foggy that our headlamps were rendered basically useless; you couldn't see more than a few feet in front of you. This added to Andy's demoralized state, as progress was difficult to discern.
The 5 Mile Road to Bill's & 88.6 Miles
I did all that I could to keep him upright those next 5 miles. He was wobbly; he couldn't walk in a straight line, and the slightest elevation change brought him close to toppling over. In this section I was completely silent because no amount of encouragement would drive him further, so I opted instead to focus on nothing but our forward progress. It took almost a full 2 hours to get to Bill's, and there were several times I actually caught him to prevent him from falling over. We crashed into Bill's (88.6 miles), he immediately headed to a medical cot, and I thought for sure our race was over. At this point it was about 11:30pm, and I knew that any significant downtime at our current pace might mean missing out on a sub-24 hour buckle finish. As he lay down on the cot, the determining struggle began.
Heading into Bill's, my runner kept stating how Bill's had medical personnel, that he should be checked out, and that maybe they would pull him. Translation: he didn't want to quit, but he didn't want to run anymore, and he wanted someone else to make that decision for him. Knowing this, I decided I wouldn't make it easy on him or the medical personnel. As they asked him how he felt, I interjected, stating that medically he was fine: he had urinated 3 times in the last 3 hours and was still eating; he was just tired, needed to drink more fluids, and his problem was mostly mental. I, of course, said this loud enough for my down runner to hear. They took his blood pressure and monitored his oxygen level, and both were in excellent condition. We wrapped him up in a foil wrap and a blanket, and the waiting game began. I watched him try to rest as other runners came in, most in much worse physical or medical shape than my partner. The minutes ticked by; at 12:30am we were still down, I had given up all hope, and I was starting to make plans to pace another runner, Jeff, the rest of the way. Then a medical volunteer came over and gave my runner a yellow Vitamin Water, which he sipped slowly before pouring the rest into his water bottle. Miraculously, he decided to try to get up and said we were going to give it a go. We tied the foil wrap around him like a cape, and out of Bill's we went just before 1am, with 3 hours and 11.4 miles left for a sub-24 hour finish.
After being down a total of 90 minutes, his decision to get up at the minute he did was an amazing feat of mental strength on his part and saved his race.
Don't Call it a Comeback
Out of Bill's we moved very gingerly at first, and I told him this was normal after an extended period of downtime because your leg muscles stiffen. I encouraged him to try running some, and what do you know, he could walk quickly and jog again. He came alive with excitement, screaming out all types of gibberish, and I was now motivated to switch gears and become the "pushy pacer". I knew I would have to take advantage of the energy burst now, get him close to the finish, and hope he could stay motivated enough to fight through the inevitable pain. Luckily, not only did his energy level dramatically increase, but so did his belief in my ability to get him in under the 24 hour mark. Our pace quickened to around 12-15 minutes per mile depending on the terrain's slope, which was going to make it a very close finish.
My goal for the next three aid stations, Keating's, Polly's and Sargent's, was to have Andy hand me his water bottle, which I would fill while he got food, and for him to be out of the aid station in under a minute. I would then fill my own water bottles, grab some food if needed, and sprint to catch Andy a few hundred yards beyond the aid station. When we hit Keating's (92 miles) our pace was solid, I now had him believing in a sub-24, and I turned into the encouraging "Great work/Good job/Looking strong" pacer. Instead of running side-by-side, I decided to run in front of him to call out terrain issues as well as to set a pace that pushed him a little to stay with me. To his credit, very few times did I have to turn around and slow up, as he did a tremendous job of powering through those tough last 11.4 miles. Polly's is a great late-race aid station (95.5 miles), and we arrived at 2:40am with only an hour and 20 minutes to get in. As a comparison, Andy informed me that the year prior, when he was feeling good, Polly's to the end took him 1 hour and 8 minutes, so we were living dangerously close to not finishing in time.
4.5 Miles to Go - Can We Pull Victory from the Jaws of Defeat?
Then disaster struck: we reached the end of the road out of Polly's and found a T intersection with no trail markings (plates or glow sticks). I sprinted back up the small hill, found that we had missed a left turn, and shouted back down that we had, in fact, missed a turn. We lost about 2 minutes, and now I was really starting to worry that we weren't going to make it. To his credit again, Andy didn't let his motivation slide, and we took advantage of the non-trail terrain. The terrain between Polly's and Sargent's is nothing but gravel roads, and our plan was to keep as fast a pace as possible, since Andy informed me that the rest of the race after Sargent's is singletrack trail. Andy and I arrived at Sargent's (97.7 miles) with ~45 minutes left until 4am.
By this time, we had built up a nice little convoy of 3 runners and 2 pacers, and I really believe we all fed off each other's energy. One runner, Christopher Martin (whom we had passed earlier, when he said he "didn't have the heart to get sub-24"), was now right on our heels, looking extremely strong and motivated; in fact, he stated that the previous year he had finished in 26+ hours, so this was a great finishing time for him. I again led the way, helping the runners navigate the trail, and when we hit the half-mile-to-go mark with 17 minutes left, we all started to celebrate while keeping our pace up. With about a quarter mile left, we could hear cheering in the distance and we all started shouting back. After emerging from the woods, I left Chris and Andy to cross the finish line, arms raised together, in 23:49, with less than 11 minutes to spare before the 24 hour buckle cutoff.
A Little About Me After 30 Miles
Interestingly enough, I was so worried about Andy finishing and keeping a good pace for the 3 runners following that I didn't notice my own deterioration over the last 2.3 miles until after we finished. I didn't dare stop to re-tie my shoe, which had come undone, so I developed 2 gnarly blisters on my right foot. While I wasn't anywhere near tired, being in a rush and completely consumed with Andy's condition would have put me in jeopardy had my pacing duties been extended, since I would have needed downtime to deal with the blisters and other irritations. My body responded extremely well, I had tons in the tank, and I could still change speeds when needed, but I learned that even over a short and slow 30 miles, a pacer needs to account for his own needs so that he remains helpful to his runner. Pacing was an extremely rewarding experience and an adventure in which I gained a lot of valuable insight into common runner problems and how to combat them. Hopefully, next year, I'll be receiving my silver buckle alongside Andy, who I am sure plans on running the Vermont 100 again.
Microsoft Virtual Server 2005 R2: Fixing the "An error occurred accessing the website application data folder" Error
Recently, while trying to work on simultaneous consulting projects, I needed to install and run multiple VPN clients on my development machine. The result of actually trying to do this? A blue screen. I had pretty much figured something nasty was going to happen, but curiosity killed the server. Therefore, I figured I would finally turn to virtualization to solve my dilemma. I had recently received my MSDN Universal disks and decided to install Microsoft Virtual Server 2005 R2 on my Windows 2003 Server development box. The install is quick and painless, and the initial post-installation documentation states that a web site was created to administer your Virtual Server. I loaded up the URL and received the error from this post's title - "An error occurred accessing the website application data folder".
Just to give you an idea what type of environment I was working in, I was trying to set this up using -
- Workstation: Windows Server 2003 Standard Edition
- My workstation was part of a Windows 2003 Domain
- I was signed into my workstation with my own account which is a Domain Admin
As a Domain Admin, I would have assumed that I had the proper credentials; however, that didn't seem to be the case. To resolve this problem, I had to create (or use an existing) local administrator account and, when presented with a login box, use the local login and not my domain login. In addition, if the issue persists, open IIS and, under the Authentication options for the Virtual Server site, make sure that anonymous access is unchecked. Those two steps should resolve any initial security issues you may have after first installing Virtual Server 2005 R2.
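For reference, the anonymous-access change can also be scripted with the IIS metabase utility instead of clicking through IIS Manager. This is only a sketch: the site ID (1 below) is an assumption, and you would substitute the actual ID of the Virtual Server administration website as shown in IIS Manager.

```
REM Hypothetical sketch: disable anonymous access via the IIS metabase script.
REM Replace "1" with the site ID of the Virtual Server administration website.
cd /d %systemdrive%\Inetpub\AdminScripts
cscript adsutil.vbs set w3svc/1/root/AuthAnonymous false
```

After running this (or unchecking the box in the GUI), the site should prompt for credentials, at which point the local administrator login applies.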
While extremely unrecommended, sometimes you just want to fix a problem quickly and without jumping through a ton of hoops using the SharePoint SDK and writing a small C# program. I recently encountered this when I needed to delete a Folder from a whole bunch of Document Libraries across hundreds of sites. Making sure that I checked to see if the folder was there first, calling the right Web, calling the right List, getting the right Folder GUID, etc. all seemed like a lot of work for a simple SQL statement so I cheated.
Folders are all stored in the AllUserData table with tp_ContentType = 'Folder'. I was able to write a simple UPDATE statement like the following to delete the unwanted folder across the sites I wanted -
UPDATE AllUserData
SET tp_DeleteTransactionID = 0x00000010
WHERE tp_ContentType = 'Folder' and ...
Quick and very dirty. Not recommended; kids, don't try this at home.
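If you do go down this road anyway, it's worth sanity-checking the scope of the statement before running it. A hedged sketch, using only the table and columns named above (the trailing condition is a placeholder for whatever narrows the statement to your sites):

```sql
-- Count what the UPDATE would touch before touching anything
SELECT COUNT(*)
FROM AllUserData
WHERE tp_ContentType = 'Folder' -- and ... your narrowing conditions
```

If the count comes back larger than the number of folders you expect to delete, your WHERE clause is too broad.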
Google recently started their Online Security blog, which discusses security-related issues encountered on the web and with the infrastructure that powers the Internet. While they only have 3 posts currently, I can see this blog becoming one of my favorites: Google has access to so much security data that I believe they will be able to identify Internet trends extremely quickly and potentially police the Internet in a "gray hat" manner.
Their most recent post talks about web server software and malware infection rates. The Google Online Security team drew some pretty interesting conclusions about web server software and its infection rates in different parts of the world. According to the data they uncovered, malware-infected machines were split right down the middle, 49% to 49%, between Apache web servers and IIS servers. However, when looking at the regional infection rates for each, we see that IIS is far more affected in the Pacific region than anywhere else. Now, one might conclude that people in the Pacific can't secure IIS well, but that would be a poor conclusion. Instead, the Google Online Security team identified (correctly, in my opinion) that the issue stems from the rampant piracy in those areas and the fact that pirated copies of Windows are not eligible to receive updates from Microsoft. The team goes on to state that this may be evidence that Microsoft needs to change their policy and allow pirated copies of Windows to still receive security updates, since pirates will continue to run the software and these infections hurt valid users more than pirates.
Google has the opportunity here, with their unparalleled data collection abilities to really make a difference in web security and I look forward to seeing exactly what their online security team comes up with in the future.
Speculation is running rampant after Blizzard announced that they will be giving a beta key to BlizzCon attendees for "an upcoming Blizzard game". Now, while Blizzard probably has several games in development, I can almost guarantee that the beta key will be for Starcraft 2. Why? Blizzard had mentioned early beliefs that they could get the game out by Christmas 2007, which would mean they would have to be in beta by August, which is exactly when BlizzCon takes place.
In other Starcraft 2 news, there was an interview with Blizzard's Cinematic Creative Director, Nick Carpenter, in which he talks about making the cinematic tools used to create certain images available to users with the Starcraft 2 release. I find this pretty interesting, and it should allow users to create some pretty nice fan content.
Update - As of 17 May 2010, it looks as if secured search will be offered by Google!
After reading about another vulnerability found in the Google Desktop software (a new man-in-the-middle attack), I was reading through some user comments on Slashdot when one of them hit me like a ton of bricks -
Why doesn't Google offer secured search?
Wait a second, they don't? They offer a secured version of their Gmail application. Although Google doesn't make it very public that a secured version exists, you can log in through an SSL page and all subsequent pages will be served with the https prefix. But what happens when you try to navigate to https://www.google.com ? You are simply redirected to the basic, unsecured Google search page.
So, is this even a big deal? Probably not, since many services are not encrypted over SSL, and the data being passed consists simply of search terms. However, if search terms can be sniffed, recorded and indexed by parties outside of Google, there is a certain level of privacy that doesn't exist that maybe should. When Google released their search history and web history functionality, many outsiders complained about the privacy violations. If a third party could garner the same information through back channels, shouldn't we be equally concerned? When I think about the biggest sites on the Internet, I cannot think of a single site that gets nearly the same traffic Google does and that offers an SSL connection for all of its pages. Therefore, one has to wonder how well a secured search solution would scale and what type of overhead would be involved in offering the solution at all - no less by default.
My final thought - I think it would be nice if the option existed, but it definitely doesn't need to be a default connection that "normal" users need to concern themselves with. I think you will see a secure search page within the next year or two for sure.
Today, Google officially announced the purchase of FeedBurner, the RSS analytics company providing statistics for RSS feeds such as subscriber counts, clickthroughs and other web metrics. This is an extremely wise purchase by Google because it closes a gap: their other analytics offering, Urchin (and its free web-based version for webmasters), didn't provide RSS metrics. Prior to the FeedBurner purchase, Google could only estimate RSS metrics through Google Reader subscriber counts for a specific feed - highly inaccurate, since Google had access to only a slice of the RSS-using population.
What does this mean for webmasters? Well, I would think the SEO implications are pretty obvious - either take advantage of the FeedBurner service or sacrifice the RSS metrics, potentially affecting your site's ranking. Sure, Google could still get some of the information from the user side (i.e., Google Reader subscribers to your feed), but webmasters would lose the metrics race to those opting in to the Google services. Since Google currently drives a large portion of traffic to all sites, webmasters really do need to decide how much of their site's information to tie into Google-related services and what that means for their search position on other engines.
While most of the other Google analytics initiatives still allowed webmasters to avoid sacrificing their position with other search engines, pointing readers directly at a FeedBurner feed URL is a dangerous move long-term. Instead, I would suggest webmasters keep publishing their original feed URL to site users and redirect that URL to FeedBurner on the server side. This lets webmasters retain some measure of control, but it still does not solve the problem of having to potentially multicast the feed to other RSS analytics services.
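One common way to implement that kind of server-side redirect is with Apache's mod_rewrite, sketched below. Everything here is an assumption for illustration: the feed path (/feed), the FeedBurner feed name, and the use of Apache at all; the one essential detail is excluding FeedBurner's own fetcher by user agent so it can still read the source feed.

```apache
# .htaccess sketch (hypothetical paths and feed name)
RewriteEngine On
# Let FeedBurner's fetcher read the source feed directly
RewriteCond %{HTTP_USER_AGENT} !FeedBurner [NC]
# Send everyone else requesting the original URL to FeedBurner
RewriteRule ^feed/?$ http://feeds.feedburner.com/examplefeed [R=302,L]
```

Because readers keep subscribing to your own URL, you can remove the redirect later and take the subscribers with you.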
In an interview with MSNBC, Blizzard Vice President Rob Pardo announced that Starcraft 2 would not be released in 2007. He stated that some inside Blizzard had initially thought they could have Starcraft 2 released by Christmas 2007, so I have to imagine they are pretty deep in the development cycle. When you consider that much of Starcraft 2's gameplay (how each unit works, etc.) was already established, most of the work that remained was the 3D engine and the Battle.Net enhancements.
I admit - I'm crying a little inside right now.
Although there are numerous free bulletin board software packages available, ever since Hagrin.com was registered I have made phpBB my board of choice. Recently, phpBB announced their version 3 Release Candidate 1 package, and I decided to give the new version a test run to see how the package is shaping up. Since, barring any major bugs, this RC will become the final release, I felt comfortable evaluating the package in a production environment, under the same careful eye I would apply to any production application.
The one area where I will only cut them a little slack is documentation. When you're installing and setting up a brand new piece of software, you sometimes rely heavily on the documentation - especially when, as in my case, you are upgrading an older system and your main concern is data preservation. Problems started immediately when I went to read the upgrade instructions on the phpBB website and found limited "just point and click" instructions for performing the conversion. What that page fails to tell you is that this is the furthest thing from the truth for completing a successful 2.0 to 3.0 upgrade. To actually upgrade successfully, you need to:
- Do not, I repeat, do not overwrite your phpBB 2.0 files. You need to keep these in place.
- Copy the phpBB 3.0 files onto your web server into a directory different from your current forums directory.
- When creating the database tables, make the new phpBB 3.0 tables in the same database/schema as your old 2.0 tables, but remember to use a different table name prefix.
- Complete the installation process.
- Complete the conversion process (the point and click interface mentioned in the link posted above).
- Move your old phpBB 2.0 files out and move your new phpBB 3.0 files into your old forums directory.
Not so point and click, huh? However, I'll cut them some slack because I was eventually able to find the documentation, and documentation usually catches up over time. Oh, and don't forget to clean out your database of the old phpBB 2.0 tables that are still there.
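The file shuffle behind those steps can be sketched in a few shell commands. The directory names here are hypothetical stand-ins for your real web root, and the web-based installer and converter steps obviously happen in the browser, not the shell:

```shell
# Sketch of the 2.0 -> 3.0 file shuffle; paths are hypothetical stand-ins.
mkdir -p forums phpBB3        # forums = live 2.0 board, phpBB3 = unpacked 3.0 files
cp -r phpBB3 forums3          # stage 3.0 alongside, never over, the 2.0 files
# ... browse to forums3 to run the installer, then the converter ...
mv forums forums_2.0_backup   # keep the 2.0 files until the conversion is verified
mv forums3 forums             # promote the converted 3.0 board into place
```

Keeping the 2.0 directory as a backup until you've verified the converted board is what makes the "do not overwrite" rule above cheap insurance rather than a leap of faith.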
Once the board is up and running, you have to marvel at how far phpBB has progressed over time. Most of the changes are on the backend; however, the default prosilver theme definitely gives users a brand new experience in phpBB 3.0. First, in the prosilver theme, user information for a post is located on the right-hand side as opposed to the traditional left-hand side. New user options, such as reporting posts and grabbing in-depth information about a poster or post in a single click, prove to be worthwhile features. A "Friends & Foes" option was introduced to give forums a more social-networking, Slashdot-like feel, where having friends and foes allows users to filter through posts easily. On the admin side of things, phpBB did tremendous work on how bots crawl your site: bots are not assigned a SID (session ID), so your URLs remain consistent and free of long, ever-changing querystrings. The phpBB developers also improved the caching system, which should help server load during events like the "Digg effect", when large sites pick up your site's content. Finally, phpBB at last gave admins the ability to edit templates through the Admin Control Panel, as opposed to having to edit files manually.
However, a few features missing from the newest phpBB version disappointed me. The lack of an RSS feed for a board really seems like the biggest omission, especially with the advent of iGoogle, Netvibes, RSS readers, etc. In addition, especially with the explosion of CSS layouts, I'm surprised that users do not have the ability to move poster information from the right side to the left in the default prosilver theme.
Overall, I give phpBB a thumbs up on their newest release, but I would still like to see some very rudimentary improvements and features added to bring the board more in line with how users use the web these days.
After a long hiatus from writing in my Search Engine Optimization Guide, I have finally added a new entry, the first in a series of hopefully many new articles covering SEO issues raised on today's Internet. Today's entry concerns Drupal 5 and assigning unique META tags to help differentiate your content. Check back for more SEO articles as I crank them out (hopefully one a day for a while).