One of the greatest strengths of today's computing environments, and one that is completely underutilized by most of the user base around the world, is virtualization. Virtualization allows you to run multiple "guest" operating systems simultaneously on a single machine. You might ask, "Why is this so groundbreaking?" Virtualization offers numerous advantages over a traditional single operating system setup, including but not limited to the following -
- No more need to dual boot (or triple boot, or more) your main machine, so you can stay in the operating system that is most comfortable to you while you learn, test and experiment with other operating systems (a great way to learn Linux).
- No need to run multiple computers, saving on power and equipment costs while also saving on space (only one keyboard/mouse needed, no A/B switch, no KVM, etc.)
- Quicker and more efficient restores of system settings and data, allowing for more testing of applications, settings, etc.
- Increased flexibility in moving virtual machines from one physical machine to another.
Those are just some of the advantages of setting up a virtual environment. Now that I have convinced you of virtualization's merits, how do we get everything set up? My virtualization software of choice is VMWare Server 2.0, which is free to use and installs on top of your current operating system. My computer at home was running Windows Vista Ultimate at the time, which made the choice of virtualization software easy, since Windows Virtual Server 2008 doesn't support Windows Vista.
Installing VMWare Server 2.0
Here's a pretty easy guide to follow for installing VMWare Server.
Once VMWare Server 2.0 is up and running on your "host" machine, you can start setting up "guest" machines. In preparation for creating guest virtual machines, you want to obtain an ISO of every operating system you want to set up. I get all my Windows ISOs through my MSDN Universal subscription, but you can definitely create Windows ISOs other ways. Linux distros can be downloaded freely from each distro's website. After you have all the operating system ISOs you need, you are ready to create your first guest virtual machine.
Creating a Guest Virtual Machine
For this example, I am going to create a Windows 7 Ultimate guest virtual machine because it's probably the most complicated (and that's not saying much). I followed this guide for creating a Windows 7 virtual machine. You would follow very similar steps for setting up any other VM whether Windows or Linux.
Using Your New Guest VM
I won't rehash too much of what is already spelled out in the above guide; however, there are a few things I want to point out. When you load up the VMWare Web Access home page, you may get an authentication prompt and have no idea what to enter. Use the same username and password that you use to log into Windows. After that, make sure you install VMWare Tools as directed by the guide; this allows for much easier copying and pasting between your host and guest machines. In addition, I have found that VMWare Web Access doesn't work properly in some browsers like Google Chrome, so you might be forced to use a browser that you don't normally use, like Internet Explorer - especially when trying to launch the console plug-in.
Once you have powered on your guest VM, click on the Console tab in the Web Access page, load the console plug-in viewer and start using your new VM! Good luck to everyone trying to set up virtualization - once you go virtual, you'll never go back.
Finally, finally, finally. Blizzard has opened the StarCraft 2 beta! I was lucky enough to win a beta key through Twitter by following the SC2 team's updates there. Promptly after the beta opened, I received my email with download instructions and a beta key.
The download was 1.64 GB and went pretty quickly considering I am sure the beta servers were being hammered by downloads from all over. Once the download was complete, I was presented with a well-done installation screen, and installation was remarkably painless for a beta product. On my 64-bit system, it defaulted to the Program Files (x86) directory.
Upon setting up your StarCraft 2 user ID, you actually have to pick two names separated by a period, which is definitely a little confusing. This is supposed to prevent naming collisions, but I think the entire system is too confusing and could definitely be simplified. After setting up my account, I immediately created a 1v1 game against a computer AI. Currently, the computer AI is locked into "Very Easy" mode, which basically means you can build uninterrupted with no aggression from the computer. Immediately, there are some pretty noticeable differences from the original StarCraft - mainly, old buildings have some new key shortcuts and some buildings have new add-on options and research capabilities, which I will hopefully address in future, separate posts.
Currently, the only way to really see the tech trees, since there is no manual, is from within the game itself. Therefore, I decided to use the game's screenshot function to capture the tech trees for each of the races.
One of the biggest omissions I noticed from the Zerg tech tree is that the Lurker unit has been removed from the beta. The Lurker was a powerful, burrowed attack unit in the original StarCraft that dealt ranged damage to everything in its path, and I believe it was originally in the plans for StarCraft 2, but it is currently unavailable.
Original StarCraft vs. StarCraft 2 Beta
Wow, there are so many differences it's almost nothing like the original, but similar enough that outsiders may just see "only new pretty graphics". I will try to address the differences race by race as I find them.
- Command Center - Mostly the Command Center is the same - it builds SCVs, can take off and has a scan add-on just like the original. However, there is an additional add-on that turns the Command Center into a powerful defensive turret, but anchors the CC to the ground. This "replaces" the nuclear silo add-on from the original.
- Refinery - no real change from the original.
- Barracks - Almost completely different. The only holdover from the original is that the Barracks can still produce Marines. Barracks now have two possible add-ons - one that allows you to produce 2 Marines at a time but prevents you from building any advanced units, and one that allows you to build more advanced units like Reapers and Marauders. Medics are no longer built at the Barracks; they have been replaced by the Medivac airship.
- Engineering Bay - Mostly the same, with your infantry upgrades done here plus a few new additions. The biggest new mechanic here is the ability to queue research, which wasn't available in the original.
- Bunkers - Very similar to the original.
- Factory - Similar to the original in purpose; however, the units you can create, outside of the Siege Tank, are different. Still, the same game mechanics apply.
- Ghost Academy - A brand new building that wasn't in the original. As the name implies, it allows Ghosts and nukes to be built.
- Armory - Very similar to the original.
- Starport - Very similar to the original in terms of game mechanics. The main difference with Starports in SC2 is that all of the ships available, except the Battlecruiser, are different from the original. Medivacs replace ground-unit Medics and also act as transports, since dedicated cargo ships have been removed from SC2.
- Fusion Core - A new building used specifically to enable the creation of Battlecruisers.
Recently, on my new 64-bit home development machine, I fired up Visual Studio 2008 for the first time in a while and decided to start cranking out C# solutions to the Project Euler questions. Since I had already completed 14 questions about a year ago, I was able to code a solution to Question #1 very quickly. I decided to debug the solution and ... wham. I received the following error -
The components for the 64-bit debugger are not registered. Please repair your Visual Studio 2008 Remote Debugger installation via 'Add or Remove Programs' in Control Panel.
To resolve this issue, you will need your Visual Studio 2008 installation disk. On this disk, you will find the following path -
Inside this folder should be an executable called "rdbgsetup.exe". Run this EXE, open up Visual Studio and you will now be able to debug your 64-bit projects.
Recently, seemingly at random, our company's SharePoint sites all experienced a serious error when trying to use the "Export to Spreadsheet" list functionality. When users clicked on "Export to Spreadsheet", everything would seem normal - they would get a prompt asking them to either Open or Save an .iqy file, Excel would launch (assuming they chose Open), they would get the macros warning depending on their security settings, and then ... error. The error reads -
"Excel cannot connect to SharePoint list."
That's pretty much it - no details, no stack trace, no real logging, no event log notification - nothing. With pretty much no information to go on and nothing in our Change Management Log for our SharePoint server farm, I started troubleshooting this issue mostly in the dark. Turning to the search engines, I didn't find much information on this error, and even less when taking into account that we were running Windows SharePoint Services (WSS) v3. I tried a few of the fixes I found online with no success. At this point I opened a case using my MSDN incidents and started working with Microsoft. The MSFT tech had me check some of the basics, such as which authentication type my site was using (Kerberos vs. NTLM), user permissions, SQL security, etc. However, he finally found the issue when we opened the web.config file located at C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\12\ISAPI. In this web.config file, there was the following entry -
.0.0.0,Culture=neutral,PublicKeyToken=31bf3856ad364e35" priority="1" group="0" />
According to the MSFT tech, this entry means that at some point Web Services Enhancements (WSE) 2.0 was installed on the server (although I really have no recollection of this ever occurring - maybe it was done through a patch?). After removing this line from the web.config specified above and performing an iisreset /noforce, I tried exporting to Excel again and the functionality was restored. Hopefully, if you have found this article, I can save you a support call to Microsoft.
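Before hand-editing a production web.config, it is worth making a backup and keeping the change repeatable. A minimal sketch of that idea in Python (the marker string and path in the usage comment are placeholders - match them to the actual offending entry on your server):

```python
import shutil

def remove_config_line(path: str, marker: str) -> int:
    """Back up `path` to `path + '.bak'`, then remove any line containing
    `marker`. Returns the number of lines removed."""
    shutil.copy2(path, path + ".bak")
    with open(path) as f:
        lines = f.readlines()
    kept = [line for line in lines if marker not in line]
    with open(path, "w") as f:
        f.writelines(kept)
    return len(lines) - len(kept)

# Example (placeholder marker - use whatever uniquely identifies the entry):
# remove_config_line(r"C:\Program Files\Common Files\Microsoft Shared"
#                    r"\Web Server Extensions\12\ISAPI\web.config",
#                    "PublicKeyToken=31bf3856ad364e35")
```

Remember to follow the edit with an iisreset /noforce, as described above.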
Well, after almost no updates for 5 months (although the Hagrin.com Forum has been pretty active), I finally decided to take a look at the blog portion of this site and address it once again. I finally upgraded the core Drupal installation and all the installed modules from version 5 to version 6 - a task that I really didn't want to have to do, but one that was surprisingly easy. I can't stress this enough - make sure you follow the UPGRADE.txt file's instructions verbatim. The instructions posted on the official Drupal site are actually not as complete as those found in this text file, and I avoided a lot of headaches by taking my time and working through the steps outlined.
A couple of other suggestions that I would recommend -
- Not only should you back up the web files as directed in the UPGRADE document, but you might as well back up all the core Drupal files while you are at it - it doesn't take much time at all, and I found it better to be overly cautious.
- Take a screenshot and/or leave open your module page so that when performing a major upgrade you know which modules you had installed and need to update to their new versions. This information is otherwise lost when you upgrade if you didn't back up your modules folder.
- Expect to be down/in offline mode for at least an hour. How long you are down will depend on your expertise level and how many non-core modules your site has installed. I would expect that most sites have about 5-10 non-core modules installed, and downloading and updating each one takes time. You can shorten your downtime with a little prep work - download all the updated module packages in advance and extract them prior to bringing your site offline; this should help speed up the upgrade process.
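The full-site backup suggested above can be sketched in a few lines of Python (the paths in the usage comment are placeholders - point them at your real docroot and backup location, and remember to dump the database separately):

```python
import shutil
import time
from pathlib import Path

def backup_site(docroot: str, backup_dir: str) -> str:
    """Archive the entire Drupal docroot (core files included), not just the
    files UPGRADE.txt names. Returns the path of the created archive."""
    Path(backup_dir).mkdir(parents=True, exist_ok=True)
    dest = Path(backup_dir) / f"drupal-backup-{time.strftime('%Y%m%d')}"
    # make_archive appends the .tar.gz extension itself
    return shutil.make_archive(str(dest), "gztar", root_dir=docroot)

# Example (placeholder paths):
# archive = backup_site("/var/www/drupal", "/var/backups")
```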
Now, just think - we all get to do this again when version 7 comes out. Good luck everyone.
Here's a quick tip post that will help those doing web development on an Apache web server who cannot see their .htaccess file while FTPing. If you use Filezilla as your FTP client, by default Filezilla may hide the .htaccess file during a remote directory listing. To view the file in Filezilla, click on Server and make sure the "Force showing hidden files" option is checked. Your .htaccess file should now be visible in the remote directory.
After having issues installing Magento on a GoDaddy Virtual Dedicated Server (VDS), I had two clients who wanted Magento installed on 1and1's Virtual Private Server (VPS) hosting package. After struggling with this process, I decided to document it.
- Log into your 1and1 Control Panel.
- Click on Domains and setup your domain as necessary. This will link your domain to your hosting package.
- Click on Server Administration -> Server Access Data. Here, you will find all the login information needed to login to your Virtuozzo and Plesk control panels as well as your SSH login.
- Log into the Virtuozzo control panel. Click System Services and then drill down into the psa and mysqld services to make sure they are started and that auto-start is enabled.
- Log into the Plesk control panel. Click Domains -> Create Domain. Fill out the domain information as necessary and create your FTP account at the bottom of this page. Click Next when done.
- On the resulting page, uncheck the option for PHP safe_mode. Click Finish at the bottom of the page.
- Upload the Magento files to the web server. The easiest way is to FTP using the login you created when setting up the domain in the Plesk panel. You can also SSH into your server, upload the Magento tar and untar the archive into your /var/www/vhosts/ directory.
- Try loading your domain in a browser. If you get a "Whoops" error stating that you have an invalid PHP version, this is because when 1and1 images your server it uses an older version of PHP. You will therefore have to update your version of PHP, so continue with the following steps. If not, try to install Magento and disregard the remaining steps.
- SSH into your server and type the following command - wget -q -O - http://www.atomicorp.com/installers/atomic.sh |sh
- Type yum update to see the available updates, but answer No when asked to install them.
- Type yum update php and choose Yes to all the prompts.
- Type yum update mcrypt* and choose Yes to any prompts.
- Type yum update mhash* and choose Yes to any prompts.
- Type yum update php-mcrypt* and choose Yes to any prompts.
- Type yum update php-mhash* and choose Yes to any prompts.
- Type /etc/init.d/httpd restart to restart Apache.
- Re-load your domain in your browser and you should be able to install Magento now.
Hopefully, this guide helps everyone get their Magento installation up and running with little effort on 1and1 VPS hosting.
Yesterday, I ran a somewhat successful Long Island Marathon, completing the race in 3:35:12, which was good enough for a 95th place finish. Considering I had run 70 miles only 3 weeks earlier at the McNaughton 150 and was battling some right foot problems, I am generally happy with how I ran and how my body held up.
The LI Marathon is built to be fast - it's held right at the beginning of May before it gets too hot, has an 8am starting time, has very little elevation change, has fewer runners than most races, and it hasn't been very windy in the years I have been out on the course. However, the race isn't without its faults: it has one of the more mind-numbingly boring courses I have ever run, and spectator enthusiasm is sparse compared to other races. Personally, I despise running this race and its ~12 mile Wantagh Parkway section, but I do it because I have friends who run it and I should definitely support my local races more.
Newton Running Shoes
After reading some interesting emails on the Ultra List and seeing Pam Reed wear them at the 6-day race here in NY, I decided to purchase a pair of 2009 Newton Neutral Racer shoes. While these shoes are on the expensive side, I have to say that they are worth every penny. They are lightweight and help force runners to strike with the correct part of the foot; a pronounced red "lug" really helps runners learn to midfoot strike as opposed to heel striking. These shoes really protected the bottoms of my feet, which tend to get very sore when road racing, while still coming in at ~8.6 ounces. This is the perfect shoe once you get used to the different feel of the sole, and I definitely recommend it to anyone looking to improve their foot strike and running technique.
Before the race I had decided that I couldn't run this race slowly, as my right foot really couldn't take the extra pounding of going slower (somewhat counterintuitive, but that's how my foot felt in warm-ups), so I told my running partner, Allegra, that I was going to go out at a much faster pace than she had planned on running (more on this later). My first 4 miles put me at sub-7-minute pace and I was definitely holding back, as I could have run that section much faster, but I was worried my foot wouldn't hold up. By the 10K mark, I was somewhere in the 7:15 range (maybe faster) and was generally feeling pretty good. I went through the half marathon at ~1:37, so I was making pretty decent time considering I wasn't really trying to run at 100% effort (I had run a 1:35 first half in Miami in January when I was going out at 100%). By this point I was still feeling pretty good, and the cool mist that had existed since the race start was turning into a light drizzle.
This put me about 3 miles into the section of the course I dread - the Wantagh Parkway where there are basically no spectators, there are basically no turns and it's nothing but empty parkway. Being used to and in love with trail running, I can only describe this section as depressing. However, having run this race before, I was ready for it and just tried my best to zone out. By mile 18 I was starting to fall apart with the pre-race injury in my right foot really starting to act up. In addition, my running partner Allegra, who had planned on running 10 minute miles, zoomed right by me at mile 19 running 7:30 pace leaving me as if I was standing still (so much for running slow huh?). At that point I was pretty demoralized and decided to shut it down and save myself for the 50K I will most likely be running in 6 days. Between miles 20 and 24.5 I mailed it in just sort of plodding along, but with less than 2 miles left I decided to pick up the pace knowing I could still run sub 3:40. Unfortunately, the only runners in front of me were also kicking pretty hard so there was no one to pass in the last 1.5 miles and I crossed the finish line with the announcer yelling out someone else's name. Allegra ended up finishing in 3:21 and placing in the Women's Overall group while running a PR - not bad for someone who had planned on running "10 minute mile pace".
The normally festive post-race area was a little mellow this year due to the fact that it had turned pretty cold and the rain was coming down more heavily at race end. The race does a great post-race setup where you get your medal, protective heat wrap and goodie bag which I honestly haven't opened yet. I'll probably be back again next year with a goal of running a 3:30 or so depending on my ultrarunning schedule.
For the last 5 days, I have been encountering an error when trying to automate the creation of an Excel document through a scheduled SQL Server job. The error I was receiving was -
Exception from HRESULT: 0x800A03EC
Let's take a step back. The same code I had written worked on my development machine and worked on other target servers when scheduled as SQL jobs, but it would not work on the following target server -
- Windows Server 2008 Standard
- 64-bit platform
- SQL Server 2005
- Excel/Office 2007 Professional
After adding some StackTrace code to my program, I was able to determine that the EXE was failing on the Microsoft.Office.Interop.Excel.WorkbookClass.SaveAs line. After trying multiple iterations of the SaveAs command (a quick Google search provided some potential solutions), I continued to get the same 0x800A03EC error code. I went down the road of thinking it was a problem with my 32-bit development environment and had our sysadmin build a 64-bit virtual machine on which to compile my application; however, this yielded the same error. I made sure that the Excel assemblies on the development environment and the target machine were the same version. On the target server, I went to dcomcnfg, selected the Microsoft Excel Application entry and made sure it was using the "interactive user" - still no luck. I made sure that, for testing only, all the SQL services ran under my domain admin account to rule out SQL security differences, and still nothing.
At this point I decided that I was spinning my wheels and I would call Microsoft and use one of my MSDN support cases.
After speaking to several departments, I finally reached the Office department, who tried to help. After explaining my issue several times, I finally got the response I had dreaded - it can't be done that way. Ugh. I had feared this response after reading that using Open XML to create Office documents is the recommended approach, that using the Office COM references this way is no longer supported (if it ever was), and that Windows Server 2008 has additional security that prevents the old approach from working properly. Well, I guess that explains why it wasn't working.
So where do I go from here? Even though Microsoft support couldn't provide me with sample code or a link to some code, I was able to find this knowledge base article detailing how to create Excel files using Open XML. Hopefully, this approach will work on my target server environment.
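To illustrate why the Open XML route sidesteps the COM problem entirely: an .xlsx file is just a ZIP package of XML parts, so you can produce one from any process, interactive session or not, without Excel installed. Here is a minimal sketch of that package structure in Python rather than the C#/Open XML SDK the KB article uses; the part contents are hand-written for illustration, not SDK output:

```python
import zipfile

# Minimal XML parts for a single-sheet workbook package.
PARTS = {
    "[Content_Types].xml": """<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<Types xmlns="http://schemas.openxmlformats.org/package/2006/content-types">
  <Default Extension="rels" ContentType="application/vnd.openxmlformats-package.relationships+xml"/>
  <Default Extension="xml" ContentType="application/xml"/>
  <Override PartName="/xl/workbook.xml" ContentType="application/vnd.openxmlformats-officedocument.spreadsheetml.sheet.main+xml"/>
  <Override PartName="/xl/worksheets/sheet1.xml" ContentType="application/vnd.openxmlformats-officedocument.spreadsheetml.worksheet+xml"/>
</Types>""",
    "_rels/.rels": """<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<Relationships xmlns="http://schemas.openxmlformats.org/package/2006/relationships">
  <Relationship Id="rId1" Type="http://schemas.openxmlformats.org/officeDocument/2006/relationships/officeDocument" Target="xl/workbook.xml"/>
</Relationships>""",
    "xl/workbook.xml": """<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<workbook xmlns="http://schemas.openxmlformats.org/spreadsheetml/2006/main" xmlns:r="http://schemas.openxmlformats.org/officeDocument/2006/relationships">
  <sheets><sheet name="Sheet1" sheetId="1" r:id="rId1"/></sheets>
</workbook>""",
    "xl/_rels/workbook.xml.rels": """<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<Relationships xmlns="http://schemas.openxmlformats.org/package/2006/relationships">
  <Relationship Id="rId1" Type="http://schemas.openxmlformats.org/officeDocument/2006/relationships/worksheet" Target="worksheets/sheet1.xml"/>
</Relationships>""",
}

def write_xlsx(path: str, cell_text: str) -> None:
    """Write a one-cell .xlsx by zipping up the XML parts directly."""
    sheet = (
        '<?xml version="1.0" encoding="UTF-8" standalone="yes"?>'
        '<worksheet xmlns="http://schemas.openxmlformats.org/spreadsheetml/2006/main">'
        '<sheetData><row r="1"><c r="A1" t="inlineStr"><is><t>%s</t></is></c></row></sheetData>'
        "</worksheet>" % cell_text
    )
    with zipfile.ZipFile(path, "w", zipfile.ZIP_DEFLATED) as z:
        for name, xml in PARTS.items():
            z.writestr(name, xml)
        z.writestr("xl/worksheets/sheet1.xml", sheet)
```

In production .NET code you would use the Open XML SDK rather than assembling parts by hand, but the underlying package is the same either way.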
Recently, a client wondered how they could add an external document to a document library on their Intranet. The problem with simply saving the document and uploading it to the Intranet is one of document "freshness" as changes are made by the document owner. Creating an external link to that document is a much better solution; however, adding a link to a document library would seem difficult on its face.
To get around this issue, I took the following steps -
1. Create a TXT file.
2. Open the TXT file and add the following HTML -
3. Save the file as a HTML file.
4. Upload to your Document Library.
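The file in step 2 just needs to send the browser on to the externally maintained document. One common approach (a sketch - the filename and target URL below are placeholders) is a meta-refresh redirect, which you can also generate with a short script:

```python
# Sketch: generate the redirect stub described in the steps above.
# The target URL is a placeholder; point it at the externally owned document.
REDIRECT_TEMPLATE = """<html>
  <head>
    <meta http-equiv="refresh" content="0; url={url}" />
  </head>
  <body>
    <p>Redirecting to <a href="{url}">the current document</a>...</p>
  </body>
</html>
"""

def write_redirect_stub(path: str, url: str) -> None:
    """Write a tiny HTML file that forwards the browser to `url`."""
    with open(path, "w") as f:
        f.write(REDIRECT_TEMPLATE.format(url=url))

# Example (placeholder names):
# write_redirect_stub("LinkToBudget.html", "http://example.com/docs/Budget.xlsx")
```

Users who click the item in the document library then always land on the owner's current copy, so freshness is no longer a problem.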