
Deploying to Multiple Locations in TFS 2010

Just a forewarning: I am not an expert in the ways of TFS. I’ve only been working with TFS in general for about a year, and with TFS 2010 for about two months. That being said, if anyone has recommendations on a better way to do this, please point me in the right direction 🙂

Scenario:

We want to deploy our built code to different environments, but we don’t want everything that gets dumped into the Release folder.

The Plan:

The plan is to create a custom build activity that copies files from one place to another, checking each file against an exclusion list.

The Guts:

Referencing my previous post, Dependency Replication in TFS 2010, I will build upon the write-ups by Ewald Hofman.

You can grab the file here: DeployFiles.cs

Here’s the whole activity:

using System;
using System.Activities;
using System.IO;
using System.Text.RegularExpressions;
using Microsoft.TeamFoundation.Build.Client;
using Microsoft.TeamFoundation.Build.Workflow.Activities;
using Microsoft.TeamFoundation.Build.Workflow.Tracking;

namespace BuildProcess.Activities
{
    [BuildActivity(HostEnvironmentOption.Agent)]
    public sealed class DeployFiles : CodeActivity
    {
        //Source dir being deployed from
        [RequiredArgument]
        public InArgument<string> SourceDir { get; set; }
        //Destination dirs being copied to
        [RequiredArgument]
        public InArgument<string[]> DestinationDir { get; set; }
        //Files to exclude
        [RequiredArgument]
        public InArgument<string> FileExclusions { get; set; }
        //Folders to exclude
        [RequiredArgument]
        public InArgument<string> FolderExclusions { get; set; }

        //globals
        public string fileExclusionPattern = "";
        public string folderExclusionPattern = "";

        protected override void Execute(CodeActivityContext context)
        {
            // Obtain the runtime values of the input arguments
            string fileExclusions = context.GetValue(this.FileExclusions);
            string folderExclusions = context.GetValue(this.FolderExclusions);
            string[] destinations = context.GetValue(this.DestinationDir);
            DirectoryInfo sourceDir = new DirectoryInfo(context.GetValue(this.SourceDir));

            //parse exclusions, add them to regex patterns
            if (!String.IsNullOrWhiteSpace(fileExclusions))
            {
                string[] fileexarr = fileExclusions.Split(',');
                fileExclusionPattern = "(";
                foreach (string s in fileexarr)
                {
                    fileExclusionPattern += s.ToUpper().Trim().Replace(".", @"\.").Replace("*", @"[a-zA-Z0-9]*") + "|";
                }

                if(fileExclusionPattern.EndsWith("|"))
                    fileExclusionPattern = fileExclusionPattern.Substring(0, fileExclusionPattern.Length - 1);

                fileExclusionPattern += ")";
            }

            if (!String.IsNullOrWhiteSpace(folderExclusions))
            {
                string[] folderexarr = folderExclusions.Split(',');
                folderExclusionPattern = "(";

                foreach (string s in folderexarr)
                {
                    folderExclusionPattern += s.ToUpper().Trim().Replace(".", @"\.").Replace("*", @"[a-zA-Z0-9]*") + "|";
                }

                if(folderExclusionPattern.EndsWith("|"))
                    folderExclusionPattern = folderExclusionPattern.Substring(0, folderExclusionPattern.Length - 1);

                folderExclusionPattern += ")";
            }

            foreach (string dir in destinations)
            {
                DirectoryInfo destDir = new DirectoryInfo(dir);

                context.Track(new BuildInformationRecord<BuildMessage>()
                {
                    Value = new BuildMessage()
                    {
                        Importance = BuildMessageImportance.High,
                        Message = "Source: " + sourceDir.FullName + "\nDestination: " + destDir.FullName +
                            "\nFolder Exclusion Pattern: " + folderExclusionPattern +
                            "\nFile Exclusion Pattern: " + fileExclusionPattern,
                    },
                });

                CopyDir(sourceDir, destDir);
            }
        }

        private void CopyDir(DirectoryInfo sourceDir, DirectoryInfo destDir)
        {
            //create destination dir if it doesn't exist
            if (!destDir.Exists)
            {
                destDir.Create();
            }

            // get all files from current dir
            FileInfo[] files = sourceDir.GetFiles();

            //copy ze files!!
            foreach (FileInfo file in files)
            {
                if (!String.IsNullOrWhiteSpace(fileExclusionPattern))
                {
                    if (!Regex.IsMatch(file.Name.ToUpper(), fileExclusionPattern))
                    {
                        if (File.Exists(Path.Combine(destDir.FullName, file.Name)))
                        {
                            File.Delete(Path.Combine(destDir.FullName, file.Name));
                        }
                        file.CopyTo(Path.Combine(destDir.FullName, file.Name));
                    }
                }
                else
                {
                    if (File.Exists(Path.Combine(destDir.FullName, file.Name)))
                    {
                        File.Delete(Path.Combine(destDir.FullName, file.Name));
                    }
                    file.CopyTo(Path.Combine(destDir.FullName, file.Name),true);
                }
            }

            // get subdirectories.
            DirectoryInfo[] dirs = sourceDir.GetDirectories();

            foreach (DirectoryInfo dir in dirs)
            {
                // Get destination directory.
                string destinationDir = Path.Combine(destDir.FullName, dir.Name);

                if (!String.IsNullOrWhiteSpace(folderExclusionPattern))
                {
                    if (!Regex.IsMatch(dir.Name.ToUpper(), folderExclusionPattern))
                    {
                        // Call CopyDirectory() recursively.
                        CopyDir(dir, new DirectoryInfo(destinationDir));
                    }
                }
                else
                {
                    // Call CopyDirectory() recursively.
                    CopyDir(dir, new DirectoryInfo(destinationDir));
                }
            }
        }
    }
}

Explanation:

Now I’ll go section by section and explain exactly what’s going on. First up are the input arguments:

        //Source dir being deployed from
        [RequiredArgument]
        public InArgument<string> SourceDir { get; set; }

The SourceDir is where the deployment starts from; in other words, it’s the subdirectory of the BinariesDirectory that gets deployed. This is important for websites, since they get dropped into a _PublishedWebsites folder inside the BinariesDirectory.

        //Destination dirs being copied to
        [RequiredArgument]
        public InArgument<string[]> DestinationDir { get; set; }

The DestinationDir argument is a string array of destination paths (we use UNC network paths, e.g. \\deploymentbox\websites\sitename).

        //Files to exclude
        [RequiredArgument]
        public InArgument<string> FileExclusions { get; set; }

The FileExclusions argument is a comma-delimited string of file names and patterns you want to exclude (e.g. “web.config, *.pdb”).
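
To make that concrete, here’s a minimal, standalone sketch (not part of the activity) that runs the same transformation the Execute() method performs on the example string above and prints the resulting Regex pattern:

using System;

class PatternDemo
{
    static void Main()
    {
        // Same per-entry transformation the activity performs.
        string fileExclusions = "web.config, *.pdb";
        string pattern = "(";
        foreach (string s in fileExclusions.Split(','))
        {
            pattern += s.ToUpper().Trim().Replace(".", @"\.").Replace("*", @"[a-zA-Z0-9]*") + "|";
        }
        pattern = pattern.TrimEnd('|') + ")";

        // Prints: (WEB\.CONFIG|[a-zA-Z0-9]*\.PDB)
        Console.WriteLine(pattern);
    }
}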

        //Folders to exclude
        [RequiredArgument]
        public InArgument<string> FolderExclusions { get; set; }

The FolderExclusions argument works the same way as FileExclusions, except it lists any folders you don’t want deployed.

        //globals
        public string fileExclusionPattern = "";
        public string folderExclusionPattern = "";

These two variables hold the Regex patterns that files and folders will be checked against.

Now we’ll go into

protected override void Execute(CodeActivityContext context)

explaining each part in more detail.

            // Obtain the runtime values of the input arguments
            string fileExclusions = context.GetValue(this.FileExclusions);
            string folderExclusions = context.GetValue(this.FolderExclusions);
            string[] destinations = context.GetValue(this.DestinationDir);
            DirectoryInfo sourceDir = new DirectoryInfo(context.GetValue(this.SourceDir));

Here we get the values passed in at execution time.

            //parse exclusions, add them to regex patterns
            if (!String.IsNullOrWhiteSpace(fileExclusions))
            {
                string[] fileexarr = fileExclusions.Split(',');
                fileExclusionPattern = "(";
                foreach (string s in fileexarr)
                {
                    fileExclusionPattern += s.ToUpper().Trim().Replace(".", @"\.").Replace("*", @"[a-zA-Z0-9]*") + "|";
                }

                if(fileExclusionPattern.EndsWith("|"))
                    fileExclusionPattern = fileExclusionPattern.Substring(0, fileExclusionPattern.Length - 1);

                fileExclusionPattern += ")";
            }

            if (!String.IsNullOrWhiteSpace(folderExclusions))
            {
                string[] folderexarr = folderExclusions.Split(',');
                folderExclusionPattern = "(";

                foreach (string s in folderexarr)
                {
                    folderExclusionPattern += s.ToUpper().Trim().Replace(".", @"\.").Replace("*", @"[a-zA-Z0-9]*") + "|";
                }

                if(folderExclusionPattern.EndsWith("|"))
                    folderExclusionPattern = folderExclusionPattern.Substring(0, folderExclusionPattern.Length - 1);

                folderExclusionPattern += ")";
            }

Here we check whether anything was passed into the file or folder exclusion parameters. If something was, we split the comma-delimited string into an array and then iterate through it, adding each exclusion to the appropriate Regex pattern.
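
As a side note, the same pattern could be built a bit more compactly with Regex.Escape and String.Join. This is a hedged alternative sketch under the same comma-delimited input assumption, not the code the activity uses:

using System;
using System.Linq;
using System.Text.RegularExpressions;

public static class ExclusionPatterns
{
    // Escape each entry, turn the "*" wildcard back into a character class,
    // and OR the entries together, just like the loops above.
    public static string Build(string exclusions)
    {
        if (String.IsNullOrWhiteSpace(exclusions))
            return "";

        var parts = exclusions.Split(',')
            .Select(s => Regex.Escape(s.Trim().ToUpper()).Replace(@"\*", "[a-zA-Z0-9]*"));

        return "(" + String.Join("|", parts) + ")";
    }
}

For the example input “web.config, *.pdb”, this produces the same (WEB\.CONFIG|[a-zA-Z0-9]*\.PDB) pattern shown earlier.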

            foreach (string dir in destinations)
            {
                DirectoryInfo destDir = new DirectoryInfo(dir);

                context.Track(new BuildInformationRecord<BuildMessage>()
                {
                    Value = new BuildMessage()
                    {
                        Importance = BuildMessageImportance.High,
                        Message = "Source: " + sourceDir.FullName + "\nDestination: " + destDir.FullName +
                            "\nFolder Exclusion Pattern: " + folderExclusionPattern +
                            "\nFile Exclusion Pattern: " + fileExclusionPattern,
                    },
                });

                CopyDir(sourceDir, destDir);
            }

Now we loop through all the destinations we need to deploy to, logging a build message for each one and then copying the files and folders.

Now let’s take a look at

private void CopyDir(DirectoryInfo sourceDir, DirectoryInfo destDir)

            //create destination dir if it doesn't exist
            if (!destDir.Exists)
            {
                destDir.Create();
            }

Let’s make sure the destination directory exists, creating it if it doesn’t already.

            // get all files from current dir
            FileInfo[] files = sourceDir.GetFiles();

Here we get all the files from the source directory.

            //copy ze files!!
            foreach (FileInfo file in files)
            {
                if (!String.IsNullOrWhiteSpace(fileExclusionPattern))
                {
                    if (!Regex.IsMatch(file.Name.ToUpper(), fileExclusionPattern))
                    {
                        if (File.Exists(Path.Combine(destDir.FullName, file.Name)))
                        {
                            File.Delete(Path.Combine(destDir.FullName, file.Name));
                        }
                        file.CopyTo(Path.Combine(destDir.FullName, file.Name));
                    }
                }
                else
                {
                    if (File.Exists(Path.Combine(destDir.FullName, file.Name)))
                    {
                        File.Delete(Path.Combine(destDir.FullName, file.Name));
                    }
                    file.CopyTo(Path.Combine(destDir.FullName, file.Name),true);
                }
            }

Let’s copy some files! Each file is checked against the file exclusion pattern, if one is set, and copied only if it doesn’t match. If a file that needs to be copied already exists in the destination, we delete it first and then copy the new one over.
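
As an aside, the delete-then-copy pair can be collapsed into a single call by passing CopyTo’s overwrite flag, which the else branch above already does. A minimal sketch of that simplification (an assumed tidy-up, not what the exclusion branch currently does):

using System.IO;

public static class CopyHelpers
{
    // Copies a file into destDir, replacing any existing file with the same name.
    public static void CopyOverwriting(FileInfo file, DirectoryInfo destDir)
    {
        string target = Path.Combine(destDir.FullName, file.Name);
        file.CopyTo(target, true); // overwrite: true replaces an existing file in one call
    }
}

Either approach will still fail if the existing target file is read-only, so that caveat doesn’t change.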

            // get subdirectories.
            DirectoryInfo[] dirs = sourceDir.GetDirectories();

            foreach (DirectoryInfo dir in dirs)
            {
                // Get destination directory.
                string destinationDir = Path.Combine(destDir.FullName, dir.Name);

                if (!String.IsNullOrWhiteSpace(folderExclusionPattern))
                {
                    if (!Regex.IsMatch(dir.Name.ToUpper(), folderExclusionPattern))
                    {
                        // Call CopyDirectory() recursively.
                        CopyDir(dir, new DirectoryInfo(destinationDir));
                    }
                }
                else
                {
                    // Call CopyDirectory() recursively.
                    CopyDir(dir, new DirectoryInfo(destinationDir));
                }
            }

Last, we check whether the directory has any subdirectories and call CopyDir() recursively to copy each subdirectory and the files inside it, provided, of course, that it isn’t on the folder exclusion list.
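
One caveat worth calling out: Regex.IsMatch does substring matching and the patterns aren’t anchored, so a folder exclusion of “bin” (pattern “(BIN)”) would also skip a folder named “BINARIES”, and a file exclusion of “web.config” would also catch “web.config.bak”. If exact-name matching is what you want, anchoring the pattern is one possible tweak; here’s a small sketch of that assumption:

using System.Text.RegularExpressions;

public static class ExclusionChecks
{
    // Returns true only when the whole name matches the exclusion pattern,
    // not just a substring of it.
    public static bool IsExcluded(string name, string exclusionPattern)
    {
        return Regex.IsMatch(name.ToUpper(), "^" + exclusionPattern + "$");
    }
}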

Dependency Replication in TFS 2010

I’m not sure how many people do this, but there doesn’t seem to be much documentation out there on exactly how to do it. After lots of trial and error, I’ve finally gotten DLL replication to work in our TFS environment.

First, I’ll give a general overview of what needs to be done to get TFS to replicate files during a build. Just a reminder: this is for TFS 2010, not any previous version of TFS.

1) You’ll need a custom build template
2) You’ll need a custom TFS Activity

Second, I’ll note that I basically followed the tutorials by Ewald Hofman and then modified things to work the way I needed them to.

Our scenario is that we have multiple products that use DLLs from a core product. When that core product changes, we don’t want to copy the DLLs manually from the build location to the locations the other products need. Each of our products has a folder called SharedBinaries, and this is where those DLLs live.

Now the question is, how do we do this? If you’re using TFS 2008 or 2005, you can simply grab TFS Dependency Replicator, which will handle all of that for you. For TFS 2010, however, this doesn’t work anymore; or, more precisely, I was unable to get it to work. So it was time to learn Windows Workflow. At first this seems very daunting, but once you take it a bit at a time, things start to make sense.

I would highly recommend you take a look at the tutorials I mentioned earlier; they will give you a good base to work from. And if I were to document the whole process here, I would feel like I was plagiarizing someone else’s work. Go at least as far as Part 4 – Creating your own activity, since that’s where I’ll start my documentation.

I created a new Code Activity and called it ‘ReplicateFiles.cs’. You can download it here.
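
Since download links don’t live forever, here’s a rough, hedged sketch of the general shape such an activity can take: create a temporary workspace mapped to the target branch, get latest, pend edits, copy the freshly built DLLs over, and check the changes in. Everything here is for illustration only; the argument names other than SourceLocation, the tfsName URL, the "*.dll" filter, and the check-in comment are placeholders, and the real ReplicateFiles.cs differs in its details.

using System;
using System.Activities;
using System.IO;
using Microsoft.TeamFoundation.Build.Client;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.VersionControl.Client;

namespace BuildProcess.Activities
{
    [BuildActivity(HostEnvironmentOption.Agent)]
    public sealed class ReplicateFilesSketch : CodeActivity
    {
        [RequiredArgument]
        public InArgument<string> SourceLocation { get; set; }   // e.g. the BinariesDirectory
        [RequiredArgument]
        public InArgument<string> TargetServerPath { get; set; } // e.g. "$/OtherProduct/SharedBinaries" (placeholder name)
        [RequiredArgument]
        public InArgument<string> WorkspaceRoot { get; set; }    // local folder for the temporary workspace (placeholder name)

        protected override void Execute(CodeActivityContext context)
        {
            string tfsName = "http://yourtfsserver:8080/tfs/DefaultCollection"; // placeholder URL
            string source = context.GetValue(this.SourceLocation);
            string serverPath = context.GetValue(this.TargetServerPath);
            string localPath = context.GetValue(this.WorkspaceRoot);

            TfsTeamProjectCollection tpc = new TfsTeamProjectCollection(new Uri(tfsName));
            VersionControlServer vcs = tpc.GetService<VersionControlServer>();

            // Temporary workspace mapped to the target branch's shared binaries folder.
            Workspace ws = vcs.CreateWorkspace("TempReplication_" + Guid.NewGuid().ToString("N"), vcs.AuthorizedUser);
            try
            {
                ws.Map(serverPath, localPath);
                ws.Get();

                foreach (string file in Directory.GetFiles(source, "*.dll"))
                {
                    string target = Path.Combine(localPath, Path.GetFileName(file));
                    ws.PendEdit(target);           // pend an edit so the file becomes writable
                    File.Copy(file, target, true); // overwrite with the freshly built DLL
                }

                PendingChange[] changes = ws.GetPendingChanges();
                if (changes.Length > 0)
                    ws.CheckIn(changes, "Replicated shared binaries from build");
            }
            finally
            {
                ws.Delete();
            }
        }
    }
}

This sketch only handles DLLs that already exist in the target branch; brand-new files would need a PendAdd, and you’d want real error handling around the workspace lifetime.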

You’ll want to add three Arguments to your custom template.


In the Metadata Argument, you’ll want to add those three Arguments as required fields.

In the custom template, I added the custom activity just before the gated check-in process.

Lastly, you’ll want to add the three Arguments into the appropriate fields of the custom activity’s properties, as well as specify the BinariesDirectory as the SourceLocation to copy the files from.

While setting up your build, make sure you fill out those newly required fields in the Process section of the build definition editor.

Before you go wild and start running builds, go through this checklist.

  • Does your build controller know the path to the custom binaries you’ve made (for the custom build activity)? (reference)
  • Have you checked in the latest version of your custom template, as well as the custom activity DLL?
  • The required arguments in ReplicateFiles.cs are what you pass into the class from the custom template you’ve created; make sure those are filled out in the build template.
  • Make sure you have the correct path to your TFS server for tfsName in ReplicateFiles.cs.
  • Make sure that the user you run your build service under has file system permissions for the location where you want to create your temporary workspaces.
  • I had this issue, and it almost drove me nuts! I kept getting access denied errors when trying to overwrite the files in the target branches. Make sure that the files being replicated ARE NOT checked out and locked!

If you have any corrections, feedback, or ideas, please comment and let me know!

A few weeks in the making

Over the last few weeks I’ve been collecting links I think are neat or interesting or adorable or funny. Some of these links aren’t even current events anymore (in Internet terms, where yesterday’s news is already old news and no one cares), but I’d still like to share them. I think in the future I’ll be posting some more stuff with actual content. Things have been kinda blah lately. I think we might go shoosting this weekend; I’ll be sure to take pictures if we do.

Benoît Mandelbrot Has Passed Away
Honestly, I don’t know much about Mandelbrot sets or fractal geometry, but I do recognize that Mandelbrot was a genius. Enough so that he even has songs written about him (Mandelbrot Set).

Angry NES’s version of Cee Lo’s “F*ck You”


I like this song.

MySQL Fanboy’s GIS Articles
Not everyone is into GIS, but I think it could have some really neat applications in mobile development.

Raytheon XOS2 — Pretty cool
I think before long we’re going to have real life bipedal weapons platforms. Gundams and Evas and Dragonauts! 😛

Perceived Wealth Distribution and Reality
I found this interesting. It seems that people think the income gap is a lot smaller than it actually is.

Americans don’t know much about religion
This I found amusing, mainly because I know so many people who think themselves quite religious, but when you want to have an actual discussion or debate about theological issues, they can’t seem to defend their positions.

Arduino UNO FAQ on LadyAda.net
For those of you who use Arduinos and are interested in how the new UNOs differ from the older series, take a look here.

Tupperware Computing!

A few days ago my roommate got a box of random electronic parts donated to him. Most of the stuff was junk, only good for salvaging parts off of. But we did find two items of interest: two motherboards that seemed to still be in pretty good condition.

What if we could get them up and running again?

And so our weekend project was started!

First we got an old power supply and some old RAM we had lying around and plugged everything in to see if it would even turn on. That turned out to be a bit more trouble than we initially thought it would be. Sean searched the internet for the motherboard’s manual, and we found the pins the power button was supposed to plug into. One small problem: we didn’t have a power button. So while Sean got something together, we used a set of hemostats to bridge the pins and power on the pile of parts. Something was wrong though; the monitor we had hooked up to it didn’t show anything, and since we didn’t have a PC speaker, we couldn’t hear any beep codes. We almost resorted to the ancient art of poke-at-it-with-a-screwdriver when I noticed a jumper in between the two RAM banks. I pulled it out, tried the power, and BAM! POST!

By this time Sean had soldered together a power button and a light for us to use. Now we needed some sort of boot medium, and we decided that a USB drive would be optimal. We got a 2GB drive from Walmart and loaded it up with Lubuntu.

After tweaking the BIOS settings a bit, we were finally able to boot off the USB drive and get to its boot menu. On our first attempt to boot from the drive we got a kernel panic, so we decided to test the RAM. That ended up being a good idea, since we discovered all of our extra DDR RAM was bad. Into the trash with that stuff! Luckily, there was a second bank of RAM slots on the motherboard that supported SDRAM, an older format of RAM.

Sean pulled out the motherboard from one of his first computers, Jenny, and luckily it still had RAM in its banks! Thank god for hoarding computer parts 🙂 We pulled out two of the RAM sticks and tested them. They were good to go! We booted up and tada! We got a loading screen. It seemed to take forever to load; then we noticed we’d forgotten to put in the second stick of RAM, whoops :3

Now we’re in the process of updating the Ubuntu core, and then we’re going to see if we can get some sort of clustering working. It would be neat to have a little farm up and running.

There are more pictures on my Flickr account.

outrageous to some; rational to others

Obama invokes ‘state secrets’ claim to dismiss suit against targeting of U.S. citizen al-Aulaqi

When I first read about this, I was outraged! I thought that we had taken one of our own citizens and just decided to execute him without reason, without a trial, without a chance of self-defense. But when I read the original article (linked above), there were some things that caught my eye. The first was the link to a previous story, giving a bit more history about al-Aulaqi and why he was put on the capture-or-kill list, which was the second thing that caught my eye. The operative words are CAPTURE or KILL. This means that he’s not being executed or assassinated; he’s a wanted man. I make the connection between this and the posters from the Old West: “WANTED: Dead or Alive”.

I can see how people might be upset that we’re going after one of our own citizens, but if that citizen has blatantly turned his back on our country and everything it stands for (good and bad), and is openly trying to harm it, I don’t see why we have to extend a hand and say, “Come back, we’ll give you a trial in our extremely flawed and overly complicated legal system.” We will either capture you and treat you as an enemy of the state, or, if it comes down to it, we won’t hesitate to kill you. That doesn’t mean, though, that we’re just going to shoot you on sight.

If this guy actually is just a cleric, and not the terrorist the mounds of evidence make him out to be, then he should turn himself in.

Things and stuff and happenings

Last night I went to see The National play in Louisville. The show was great! What really impressed me the most was their special guest, Owen Pallett. What he can do with a violin and a mixer is pretty awesome. After his set, I went and got two of his albums, and I’ve been listening to them ever since! I should be uploading pictures soon.

Things I’ve been collecting over the last week. Posts that I thought were pretty interesting — to me anyway.

Building the Wall
I remember going to the Vietnam Memorial while we lived in Maryland, about 15 years ago. I don’t think it really meant anything to me then; I just thought it was cool to see so many names on a shiny black wall. I think if I were to go now it would mean a whole lot more, after having served in the military.

4chan takes on the MPAA
So after the MPAA hired a company to DDoS BitTorrent trackers that wouldn’t comply with its takedown requests, 4chan planned to DDoS the hired company. Well, it seems someone beat them to the punch. So what do the hordes of Anon do? Go after the MPAA itself! It only took 8 minutes for them to take the site down. It made me lul.

From single-core to quad-core in less than a year
Multi-core phones would be amazing! That, coupled with the concept phone by Mozilla (linked below), would probably be enough for most people — who just surf the web, check email, and play Farmville.

Nifty Artwork
I think the fact that I’ve been watching Mad Men like a.. well.. mad man lately attracted me to this page. I want a suit like Don Draper.

Obama talks about taxes and strengthening the economy
While I don’t agree with everything this administration is doing / has done, I do think that taxing people more evenly would make things simpler and help bridge gaps between social groups. I’d like to see a day when we all pay the same percentage in taxes, instead of this confusing system of earned and unearned income, capital gains, donations to charities, etc. How about we all just pay 15 or 20 percent of our income? I think that would make things much easier, not just in terms of administering and collecting, but also in planning for what you’ve got to pay. But of course, a simple idea like this would probably just get screwed up by the government. It’s sad we can’t trust them to do something simple.

Paul Lekakis – Boom Boom
I lol’d at this, thanks Katie.

Wind Power!
Let’s see who else will pony up the big bucks and commit to something of this size. Or will there be more protests that a wind farm 20 miles offshore would disrupt the scenery?

Old People know stuff, kinda
I watched this video of Walter Breuning talking about his life and the advice he has for the current generation. You’ll have to take what he says with a grain of salt though; some things are exaggerated a little. It’s still really neat to hear firsthand accounts of times so long ago from someone who actually lived them!

Community Wi-Fi?
While the speeds are slow compared to 802.11n, the range is so much better. I could see community or municipal Wi-Fi using this technology.

Phone of the future!?
How great would it be if your phone became the end-all-be-all tool you’d use? Instead of having to have a laptop, a phone, a projector, a watch, and a list of other gadgets, you could have one piece of technology to take many of those items and consolidate them into one. The big thing that really appeals to me is being able to carry your information in “physical” form around with you, instead of having to rely on some sort of “cloud” storage. With newer phones now also being multi-core, a phone like the Seabird would have more than enough power to handle web browsing, email, watching online videos, along with most office type applications.

Full Moon Beer
This sounds interesting. Being a lover of beer, I would definitely try it at least once.

I love fall weather
I think it’s the colors and lower temperatures. I can’t wait to get a motorcycle (something like this or this) and be able to ride around in this weather.

oh.. and this:

and this:
TEAMWORK!

Things of the day

Happy Birfday Supah Mario Brudders! — is it weird that this makes me feel kinda old? I used to love playing Mario Bros. on the NES; I never beat it or really got that far in it, but I just liked the music and jumping on koopa troopahs. (I know this is from yesterday; I started this blog then, and it’s stretched over a day or so :P)

Indestructible 3-ring Binder — this looks like something a post-apocalyptic bean counter would carry around with him.

Don’t want to shell out 500 bucks for MS Project? Try Gantt Project. — something an old coworker and I were talking about brought me across this. He was saying that oftentimes he was working from home after work to get things done. Besides adding extra stress to your life, once you’re off shift you shouldn’t have to put in extra hours (especially if you’re not being paid for them). Being able to give management a project plan, detailing what tasks need to be done, how long they’ll take, and when the estimated completion date is, should allow him to have more time at home NOT being a slave to the man 😉

About time the US started really upgrading its network infrastructure. — I was astounded when I read the number ‘1Gbps’! That’s how fast my local network at home is. I can only imagine having that kind of speed to the internet! Not only would that let me do more things with my server(s), but download speeds and gaming would be so faaaaaaaaaast!

Pandas love to party! — it’s true. they love teh cakes. but they don’t quite move-it-move-it

I love this house! — I love it when people incorporate their environment into their home-building.

The database nerd in me likes reading about stuff like this 🙂 — I always find it amazing when you can aggregate and search data efficiently. Especially when you’re talking about the amount and size that Google has to deal with.

This is old, but I still find it funny 😛 — I think it’s a combination of people not knowing how to use computers, and then believing everything they read on the internets.

New MC Frontalot.

Last but not least, Happy Birthday to my dad! If you’re friends with him on Facebook, go wish him a happy B-Day 🙂

A run around the Internet

I’ve been perusing the internet today while waiting on RM stuff to get done, and I’ve found some pretty neat things. In no particular order:

Woven Steel Wallet!
I think the idea is neat and it supposedly blocks RFID from being read. But at 75 bucks, I think it’s a bit pricey.

The United States of Inequality
To be honest, I didn’t read the entire article, mainly because it’s quite long, but from what I did read, I thought it was an interesting take on the traditional thinking about what economists call “The Great Divergence”. Even though everyone seems to be blaming the government for something (legitimate complaints or not), this piece takes data gathered over multiple administrations and comes up with a rather interesting hypothesis.

Tractor Beams are becoming REAL!
It’s the nerd in me that finds this awesome. How cool would it be to fire a beam at something and have it move toward you!? Granted, the technology is nowhere near fruition, but the idea that it could one day be a reality is awesome.

Pictures I think are neato. Nuff said.

Detroit is not dead.
I found this rather refreshing. A look at what people are doing in Detroit to rebuild and reclaim, even though the rest of the world seems to think the city is dead.

TFS Builds and Build Servers for TFS 2008

It seems that a lot of companies have become fans of virtualization. I can understand why. Instead of having 5 separate machines (and as such 5 sets of hardware) each doing one task, you can have one machine run 5 virtual servers, each handling its one task. It saves space, power consumption, and, in the end, money.

There’s a problem though. If those servers are chugging along doing their thing and need to utilize CPU cycles or I/O to hardware, things slow down. A lot. We’ve run into this issue lately since we’ve started doing a lot of parallel development, meaning development on multiple projects that release at different times on the same product. We have code check-ins happening at all hours of the day (we also have an offshore team hard at work), and as such we have continuous replication builds firing off all the time as well. If multiple products or projects are being worked on, all those builds queue up, which normally isn’t a problem. It’s only when we need to move a product to another environment (e.g. Integration Testing) that the problem becomes evident. Once we start a build that also stages files on a different server (which happens to be just another virtual machine on the same physical box), everything creeps to a halt. Couple that with other builds trying to run while large numbers of files are being transferred from one virtual machine to another, and things get even slower.

Solution time. To help take the load off the Dev environment server, we thought of setting up a new build server on its own physical machine. Makes total sense. But we’d have to go through all the bureaucracy of getting approved hardware, putting in requests, blah blah blah. Developers don’t have time for that kind of thing! So we thought about what we could do, and it struck us: why not use our own machines? Release management has three members; that could be three extra build machines!

So I used myself as a guinea pig and installed the build service on my machine. I configured it, and with a few minor tweaks I was able to get builds running without too much impact on performance. Since then, we’ve planned to take our workstations and dedicate them to one product each. This way we can have 3 CI (Continuous Integration) builds running at a time. We’re still planning on keeping the staging builds local until we get all the issues ironed out, because I’m sure issues will come up.

So far, so good though. Everything seems to be running well, and our Integration Testing push times have been cut in half.

Links to pages I used:
How to: Set up and Install Team Foundation Build (for TFS 2008)
How to: Add the Build Service Account to the Build Services Security Group
How to: Create and Manage Build Agents

Some notes:

  • If you have existing builds that use special build properties (e.g. using VB6 to compile an executable), make sure that VB6 is in the same location on all build machines that will be used for such builds.
  • If your initial build server has special properties for MSBuild, make sure you copy the MSBuild directory to your build machines. This will help with issues caused by certain Imports not being available.

I LIVE!

I’m not sure what’s happened to me in the last few months. Actually, that’s a lie. I do know, but I’m not really comfortable talking about it in the open. The short version is, I was kind of depressed. But I think I’m getting better; I’m starting to be interested in things again. I have the desire to go out and do stuff, instead of just moping around the house and going out to get shit-faced on the weekends.

Yay me!

Ok, so, now for some updates as to what I’ve been doing. About a month ago I started P90X, which is a pretty intense workout program. Since then I’ve slacked off a bit, since I’ve also picked up Judo on Mondays and Fridays. I’ve been going with some friends, and it’s quite enjoyable (aside from the conditioning portion of the lessons; that’s always quite... exhausting). So right now I’m only exercising twice a week, and I’d like to get that back to at least 3 times a week. I think we (the friends I’m doing Judo with) are going to try to do something on Wednesdays. Maybe yoga, or some sort of stretching routine.

I’ve been pretty busy with work too. This weekend will be the second weekend in a row where I have to work on Saturday, and the third weekend this month. It’s not too bad because I get days comped, but it does suck that I can’t sleep in on Saturday 😛 This Saturday will be extra strenuous. We’re doing our scheduled monthly release, which means I get to be at work at 3am and work until everything (our software) is out in production. I should be home sometime between 9am and 10am. That means I’ll have a few hours to sleep before Brew at the Zoo! I’m sure we’ll all have a blast 😀

After a 3-ish month hiatus I’m playing Eve again. I always come crawling back 😛 I’m taking it slow this time, only playing for an hour or so a few days a week. But it’s fun to talk to all my old friends, even that crazy nut borked 😉

I also have plans to work on my car again. I think next up will be the valve cover gasket and the VANOS. That’s going to be a weekend project, I think, and it doesn’t look like I’ll have enough time over the next few weekends to do it. I also have to find a fairly inexpensive VANOS (maybe from DrVANOS?).

Either way, I think I’ll be busy-busy for the next few weeks. And it feels pretty good.