Showing posts with label Automation. Show all posts

Saturday, 23 May 2009

Encrypting Connections strings in .Net with NAnt

I have been working a lot with databases over the last couple of weeks. One thing that is a novelty to the people I am working with now is the ability to store multiple connection strings in our .config files. It's a really simple and flexible thing to do, allowing us to move any of our apps from environment to environment with very little hassle. The only downer is that the connection strings are stored in plain text, which naturally is a massive security hazard.
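As a reminder, a typical connectionStrings section looks something like the following (the server, database and credential values here are invented purely for illustration):

```xml
<connectionStrings>
  <!-- Hypothetical entries - one per environment -->
  <add name="Dev"
       connectionString="Data Source=DEVSQL01;Initial Catalog=Widgets;Integrated Security=True"
       providerName="System.Data.SqlClient" />
  <add name="Test"
       connectionString="Data Source=TESTSQL01;Initial Catalog=Widgets;User ID=testuser;Password=secret"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```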



That is, of course, unless you encrypt them.



There are loads of examples on the Internet explaining how this can be done, and they are very simple to follow and implement - but I wonder how many people make use of this fairly simple feature? One thing that became apparent to me was that, even though I can encrypt the file, I still need to decide just when and how I want to do it. The when is simple: I want my connection strings encrypted when the application moves into a testing environment, and I want them to stay encrypted from then on. The how is slightly different. There is no point in having the encryption take place while I am developing the app, as I want the flexibility of changing the connection strings as and when I need to. So, the ideal time to encrypt them is during the build itself. It's a very straightforward thing to do: simply write a small program that encrypts the connectionStrings section of a web or app .config and have Visual Studio run it as a post-build event. That is possibly the easiest way to do this. But it isn't so flexible - it would only really encrypt one file relevant to one project, and whilst the code is reusable, it isn't very agile.

So, why not move this from being a post-build event in the solution file out to being a post-build event in your larger build process? Using NAnt (or MSBuild), we can create a custom task that will encrypt the connectionStrings section in any .config file. This can be used over and over again, as and when requirements dictate that a section in a config file should be encrypted.

There are a couple of things we need to capture:
  • The path of the .config file
  • What sort of encryption we want to use.

It would appear that .NET offers two providers to encrypt a .config file: DPAPI and RSA. DPAPI is provided by Microsoft; there is a nice article on Wikipedia discussing this encryption method. For a lot of people, though, RSA may be the more suitable option (it's the one I am using, for what that's worth).
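Whichever provider you choose, the encrypted section ends up looking broadly like this - the cipher text below is obviously just a placeholder, and the exact child elements vary by provider:

```xml
<connectionStrings configProtectionProvider="RSAProtectedConfigurationProvider">
  <EncryptedData Type="http://www.w3.org/2001/04/xmlenc#Element"
                 xmlns="http://www.w3.org/2001/04/xmlenc#">
    <CipherData>
      <!-- placeholder cipher text -->
      <CipherValue>AQAAANCMnd8BFdER...</CipherValue>
    </CipherData>
  </EncryptedData>
</connectionStrings>
```

The runtime decrypts this transparently when the application reads the section, so no application code needs to change.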

Back to the task. To start with, I want to get the location of the .config file and decide which encryption method I want to use. This is easy enough:


[TaskName("config-encrypt")]
class Encryption : Task
{
    private string strProvider;

    private string _filePath;
    [TaskAttribute("configfile", Required = true)]
    public string FilePath
    {
        get { return _filePath; }
        set { _filePath = value; }
    }

    private string _rsa;
    [TaskAttribute("RSAencryption", Required = false)]
    public string RSA
    {
        get { return _rsa; }
        set { _rsa = value; }
    }

    private string _dpapi;
    [TaskAttribute("DPAPIencryption", Required = false)]
    public string DPAPI
    {
        get { return _dpapi; }
        set { _dpapi = value; }
    }

    protected override void ExecuteTask()
    {
        if (checkFileExists())
        {
            if (FilePath.Contains("web.config"))
            {
                encryptWebConfig();
            }
            else
            {
                encryptAppConfig();
            }
            rename();
        }
        else
        {
            throw new BuildException("The .config nominated does not seem to exist");
        }
    }

I have left the encryption attributes as optional - this way, you don't need to worry about which type you need. However, by doing this I need to set up a default encryption provider. A simple method like the one below takes care of that for me; I have decided that RSA is the default if no other is specified:
private void chooseEncryptionProvider()
{
    // Optional NAnt attributes that are not set come through as null,
    // so don't compare against "" here.
    if (RSA == "true")
    {
        strProvider = "RSAProtectedConfigurationProvider";
    }
    else if (DPAPI == "true")
    {
        strProvider = "DataProtectionConfigurationProvider";
    }
    else
    {
        // Default to RSA when neither attribute is specified.
        strProvider = "RSAProtectedConfigurationProvider";
    }
}

I also need to make sure that the file I want to apply encryption to exists in the first place. Again, this is very simple to do:


private bool checkFileExists()
{
    return File.Exists(FilePath);
}

Now I am all set to get encrypting. I want my task to be able to handle both web.config and app.config files, so I have a method for each. They are pretty much identical in operation; the only real difference comes down to the way .NET opens web.config files. To encrypt a web.config I have done the following:
private void encryptWebConfig()
{
    chooseEncryptionProvider();
    try
    {
        Configuration _configFile = WebConfigurationManager.OpenWebConfiguration(FilePath);
        if (_configFile != null)
        {
            try
            {
                ConnectionStringsSection _section =
                    (ConnectionStringsSection)_configFile.GetSection("connectionStrings");
                _section.SectionInformation.ProtectSection(strProvider);
                _section.SectionInformation.ForceSave = true;
                _configFile.Save();
            }
            catch (Exception e)
            {
                throw new BuildException("Failed to encrypt the connection strings section",
                    e.InnerException);
            }
        }
    }
    catch (Exception e)
    {
        throw new BuildException(e.Message, e.InnerException);
    }
}

You can see where I am choosing the encryption provider at the start of the method. One thing I haven't done here is implement any logging. For the sake of an audit trail, it would be a very wise thing to log what sort of encryption is being used. That way, you can look back across the builds you have completed for a specific environment and see which encryption provider was used. The following method encrypts an app.config:


private void encryptAppConfig()
{
    chooseEncryptionProvider();
    // Note: OpenExeConfiguration appends ".config" to the path it is given,
    // which is what produces the ".config.config" file cleaned up by rename().
    Configuration _configFile = ConfigurationManager.OpenExeConfiguration(FilePath);
    if (_configFile != null)
    {
        try
        {
            ConnectionStringsSection _section =
                (ConnectionStringsSection)_configFile.GetSection("connectionStrings");
            _section.SectionInformation.ProtectSection(strProvider);
            _section.SectionInformation.ForceSave = true;
            _configFile.Save();
        }
        catch (Exception e)
        {
            throw new BuildException("Failed to encrypt the connection strings section",
                e.InnerException);
        }
    }
}

As you can see, it is pretty much the same idea as encrypting a web.config. With all this in place, we now have everything we need to automatically encrypt a .config file as part of an automated build. This can be added to the end of a build target in a NAnt script, or better yet given its own target to be used as a post-build event. One thing I found when using this is that after the .config had been encrypted, a .config.config file was created, leaving the unencrypted .config in the same directory. Naturally, this defeats the point, so I put the following method together to clean up for me:
private void rename()
{
    if (File.Exists(FilePath + ".config"))
    {
        // Replace the original, unencrypted .config with the encrypted .config.config.
        string oldname = FilePath;
        string newname = FilePath + ".config";
        File.Delete(oldname);
        File.Copy(newname, oldname);
        File.Delete(newname);
    }
}

And there we go, all done. The syntax for the build script is insanely simple - if you don't mind using RSA encryption, all you need to do is point it at the .config you want to encrypt:

<target name="encrypt">
    <config-encrypt configfile="C:\test\web.config" />
</target>
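If you wanted DPAPI instead, the optional attribute defined on the task should do the trick - in theory, at least; I have mostly exercised the RSA path:

```xml
<target name="encrypt">
    <config-encrypt configfile="C:\test\web.config" DPAPIencryption="true" />
</target>
```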

I have been using this task for the past week or so, and it has been working like a dream :). The complete code for the task is below:
using System;
using System.Collections.Generic;
using System.Configuration;
using System.IO;
using System.Linq;
using System.Text;
using System.Web;
using System.Web.Configuration;
using NAnt.Core;
using NAnt.Core.Tasks;
using NAnt.Core.Attributes;

namespace Custom.NantTasks.Encryption
{
[TaskName("config-encrypt")]
class Encryption : Task
{
private string strProvider;
private string _filePath;
[TaskAttribute("configfile", Required = true)]
public string FilePath
{
get { return _filePath; }
set { _filePath = value; }
}
private string _rsa;
[TaskAttribute("RSAencryption", Required = false)]
public string RSA
{
get { return _rsa; }
set { _rsa = value; }
}
private string _dpapi;
[TaskAttribute("DPAPIencryption", Required = false)]
public string DPAPI
{
get { return _dpapi; }
set { _dpapi = value; }
}
protected override void ExecuteTask()
{
if (checkFileExists())
{
if (FilePath.Contains("web.config"))
{
encryptWebConfig();
rename();
}
else
{
encryptAppConfig();
rename();
}
}
else
{
throw new BuildException("The .config nominated does not seem to exist");
}
}
private void encryptWebConfig()
{
chooseEncryptionProvider();
try
{
Configuration _configFile = WebConfigurationManager.OpenWebConfiguration(FilePath);
if (_configFile != null)
{
try
{
ConnectionStringsSection _section = (ConnectionStringsSection)_configFile.GetSection("connectionStrings");
_section.SectionInformation.ProtectSection(strProvider);
_section.SectionInformation.ForceSave = true;
_configFile.Save();
}
catch (Exception e)
{
throw new BuildException("Failed to encrypt the connection strings section",
e.InnerException);
}
}
}
catch (Exception e)
{
throw new BuildException(e.Message, e.InnerException);
}
}

private void encryptAppConfig()
{
chooseEncryptionProvider();
Configuration _configFile = ConfigurationManager.OpenExeConfiguration(FilePath);
if (_configFile != null)
{
try
{
ConnectionStringsSection _section = (ConnectionStringsSection)_configFile.GetSection("connectionStrings");
_section.SectionInformation.ProtectSection(strProvider);
_section.SectionInformation.ForceSave = true;
_configFile.Save();
}
catch (Exception e)
{
throw new BuildException("Failed to encrypt the connection strings section"
, e.InnerException);
}
}
}

private void chooseEncryptionProvider()
{
// Optional NAnt attributes that are not set come through as null,
// so don't compare against "".
if (RSA == "true")
{
strProvider = "RSAProtectedConfigurationProvider";
}
else if (DPAPI == "true")
{
strProvider = "DataProtectionConfigurationProvider";
}
else
{
// Default to RSA when neither attribute is specified.
strProvider = "RSAProtectedConfigurationProvider";
}
}

private bool checkFileExists()
{
if (File.Exists(FilePath))
{
return true;
}
else
{
return false;
}
}

private void rename()
{
if (File.Exists(FilePath + ".config"))
{
string oldname = FilePath;
string newname = FilePath + ".config";
File.Delete(FilePath);
File.Copy(newname, oldname);
File.Delete(newname);
}
}
}
}

Saturday, 9 May 2009

NAnt, .Net, Sandcastle and Documentation

Documentation is great; it makes people feel good about themselves. After they have created a particularly cunning piece of work, it's nice to throw together some documentation so that others can see their genius. But wait, that means you need to go through all your work and come up with some documentation of your own! Gasp, shock - horror!!

Let's face it: for a lot of people, creating standardised documentation after they have finished their project isn't at the forefront of their minds. This is a shame because, flippancy aside, people really do like documentation. It enables developers to get on with their work without asking people for info on an API, and it lets managers see that there is "stuff being done with documentation". And don't forget, standardisation is key here - you don't want a hundred different developers all creating documentation in different ways; it would be a nightmare.

There are a lot of ways documentation can be generated at build time. One popular way is to use NDoc, and this is catered for within NAnt. However, NDoc doesn't support .NET 2.0, let alone 3.5, and it hasn't seen a release since 2005... So there appears to be a massive gap for documentation generation - that is, if you don't count Sandcastle. This is a fairly useful, if slightly complex, method of generating documentation. There are lots of examples of how to drive Sandcastle via batch files and PowerShell scripts (the version I downloaded came packed with them). In an effort to understand the generation process, I decided to put together a set of NAnt tasks so this could be automated as part of continuous integration.

Sandcastle itself is not really one application. Right now, Sandcastle is a number of programs working together. It consists of:

  • MRefBuilder.exe
  • XslTransform.exe
  • BuildAssembler.exe
  • CHMBuilder.exe
  • DBCSFix.exe

Each of these programs is used in turn to generate first the XML documentation, followed by a CHM (should you need it). The following examples give an indication of how you could put a NAnt task together to make use of these programs for all your documentation needs.

All we need to do is create a suite of tasks that represent each of these programs. Essentially, all we are doing here is capturing the arguments we need to run the apps themselves; there isn't much going on in the task per se. For example, here is the complete code for MRefBuilder:

using System;
using System.Collections.Generic;
using NAnt.Core;
using NAnt.Core.Tasks;
using NAnt.Core.Attributes;
using Custom.NantTasks.Common;

namespace Custom.NantTasks.Sandcastle
{
/// <summary>
/// NAnt task to make use of Sandcastle from MS
/// </summary>
[TaskName("mrefbuilder")]
class MRefBuilder : ExternalProgramBase
{
private string _assemblyPath;
[TaskAttribute("assembly-path", Required = true)]
public string AssemblyPath
{
get { return _assemblyPath; }
set { _assemblyPath = value; }
}
private string _outputPath;
[TaskAttribute("output-path", Required = true)]
public string OutPutPath
{
get { return _outputPath; }
set { _outputPath = value; }
}
private string _executable;
[TaskAttribute("executable", Required = true)]
public string Executable
{
get { return _executable; }
set { _executable = value; }
}
public override string ExeName
{
get
{
return _executable;
}
set
{
base.ExeName = value;
}
}
public override string ProgramArguments
{
get { return AssemblyPath + " /out:" + OutPutPath; }
}
protected override void ExecuteTask()
{
Log(Level.Info, "Running MRefBuilder");
Task();
}

private void Task()
{
try
{
base.ExecuteTask();
}
catch (Exception e)
{
throw new BuildException(e.Message);
}
}
}
}


As you can see, this is pretty much just an argument-gathering exercise. Things become a little trickier for XslTransform. This app can apply multiple transformations, so we need to be able to specify at least one transformation in the task itself - but reserve the ability to specify a second. I am not going to go into exactly how Sandcastle works here; all I would be doing is copying information. For an in-depth explanation of how Sandcastle works, take a look at the Sandcastle blog. So, for the XslTransform task, I simply specify two attributes for the .xsl files:


private string _xslPath;
[TaskAttribute("xsl-path", Required = true)]
public string XslPath
{
get { return _xslPath; }
set { _xslPath = value; }
}
private string _xslPath2;
[TaskAttribute("xsl-path2", Required = false)]
public string XslPath2
{
get { return _xslPath2; }
set { _xslPath2 = value; }
}

You may wish to use more than one .xsl - I don't; the example batch file in the version I downloaded didn't seem to make use of more than two. However, if you need more, it is very simple to add them to the task. Once the arguments have been captured, we need to use them with the app itself. We need to provide at least one .xsl in the argument string we pass to the app, and if a second is specified, we need to provide that as well. To achieve this, I used the following:


private string checkXSLPath()
{
    string _arg;
    if (XslPath2 == null)
    {
        _arg = "/xsl:\"" + XslPath + "\" " + InputFile + " /out:" + OutputPath;
    }
    else
    {
        _arg = "/xsl:\"" + XslPath + "\" " + "/xsl:\"" + XslPath2 + "\" "
            + InputFile + " /out:" + OutputPath;
    }
    Log(Level.Info, _arg);
    return _arg;
}

All this does is check whether a second .xsl file has been specified. If it has, it returns an argument string that makes use of it; if not, it returns one that doesn't. The full source looks like this:


using System;
using System.Collections.Generic;
using NAnt.Core;
using NAnt.Core.Tasks;
using NAnt.Core.Attributes;

namespace Custom.NantTasks.Sandcastle
{
[TaskName("xsltransform")]
class XslTransform : ExternalProgramBase
{
private string _xslPath;
[TaskAttribute("xsl-path", Required = true)]
public string XslPath
{
get { return _xslPath; }
set { _xslPath = value; }
}
private string _xslPath2;
[TaskAttribute("xsl-path2", Required = false)]
public string XslPath2
{
get { return _xslPath2; }
set { _xslPath2 = value; }
}
private string _inputFile;
[TaskAttribute("inputfile", Required = true)]
public string InputFile
{
get { return _inputFile; }
set { _inputFile = value; }
}
private string _outPutPath;
[TaskAttribute("output", Required = true)]
public string OutputPath
{
get { return _outPutPath; }
set { _outPutPath = value; }
}
private string _executable;
[TaskAttribute("executable", Required = true)]
public string Executable
{
get { return _executable; }
set { _executable = value; }
}
public override string ExeName
{
get
{
return _executable;
}
set
{
base.ExeName = value;
}
}
public override string ProgramArguments
{
get { return checkXSLPath(); }
}
protected override void ExecuteTask()
{
Log(Level.Info, "Starting XslTransform");
Task();
}

private void Task()
{
try
{
base.ExecuteTask();
}
catch (Exception e)
{
throw new BuildException(e.Message);
}
}
private string checkXSLPath()
{
string _arg;
if (XslPath2 == null)
{
_arg = "/xsl:\"" + XslPath + "\" " + InputFile + " /out:" + OutputPath;
Log(Level.Info, _arg);
return _arg;
}
else
{
_arg = "/xsl:\"" + XslPath + "\" " + "/xsl:\""
+ XslPath2 + "\" " + InputFile + " /out:" + OutputPath;
Log(Level.Info, _arg);
return _arg;
}

}
}
}


Looking good so far. We now have two NAnt tasks that will generate documentation as XML and then transform it. Now we get to the meat of the whole process: BuildAssembler. Again, I am not going to go into detail about what it does here - instead, check the Sandcastle blog entry on BuildAssembler for a complete explanation. Whilst the process behind BuildAssembler is very involved, the NAnt task for it is quite simple; the complete source is below:


using System;
using System.Collections.Generic;
using NAnt.Core;
using NAnt.Core.Tasks;
using NAnt.Core.Attributes;

namespace Custom.NantTasks.Sandcastle
{
[TaskName("buildassembler")]
class BuildAssembler : ExternalProgramBase
{
private string _manifest;
[TaskAttribute("manifest", Required = true)]
public string Manifest
{
get { return _manifest; }
set { _manifest = value; }
}
private string _config;
[TaskAttribute("config", Required = true)]
public string Config
{
get { return _config; }
set { _config = value; }
}
private string _executable;
[TaskAttribute("executable", Required = true)]
public string Executable
{
get { return _executable; }
set { _executable = value; }
}
public override string ExeName
{
get
{
return _executable;
}
set
{
base.ExeName = value;
}
}
public override string ProgramArguments
{
get { return " /config:\"" + Config + "\" " + "\"" + Manifest + "\""; }
}
protected override void ExecuteTask()
{
Log(Level.Info, "Starting BuildAssembler");
base.ExecuteTask();
}
}
}


Essentially, all we are doing is giving BuildAssembler a config file to work with, as well as the manifest file we generated with the previous task, XslTransform. Even though there are more steps to go through, we need to start looking at the NAnt script itself. On its own, Sandcastle won't do everything you want; there are things you need to initiate yourself, such as creating directories and copying files. Let's take a look at the NAnt script - at this stage it should look something like this:


<target name="Sandcastle">
<echo message="Trying out my Sandcastle task" />
<echo message="Checking for output directory.." />
<mrefbuilder assembly-path="Custom.NantTasks.dll"
output-path="C:\somefolder\sandcastletest.xml"
executable="C:\Program Files\Sandcastle\ProductionTools\MrefBuilder.exe" />
<xsltransform xsl-path="C:\Program Files\Sandcastle\ProductionTransforms\ApplyVsDocModel.xsl"
xsl-path2="C:\Program Files\Sandcastle\ProductionTransforms\AddFriendlyFilenames.xsl"
inputfile="C:\somefolder\sandcastletest.xml"
output="C:\somefolder\reflection.xml"
executable="C:\Program Files\Sandcastle\ProductionTools\xsltransform.exe" />
<xsltransform xsl-path="C:\Program Files\Sandcastle\ProductionTransforms\ReflectionToManifest.xsl"
inputfile="C:\somefolder\reflection.xml"
output="C:\somefolder\manifest.xml"
executable="C:\Program Files\Sandcastle\ProductionTools\xsltransform.exe" />
<mkdir dir="C:\somefolder\html" />
<mkdir dir="C:\somefolder\icons" />
<copy todir="C:\somefolder\icons">
<fileset basedir="C:\Program Files\Sandcastle\Presentation\vs2005\icons">
<include name="*" />
</fileset>
</copy>

<mkdir dir="C:\somefolder\scripts" />
<copy todir="C:\somefolder\scripts">
<fileset basedir="C:\Program Files\Sandcastle\Presentation\vs2005\scripts">
<include name="*" />
</fileset>
</copy>
<mkdir dir="C:\somefolder\styles" />
<copy todir="C:\somefolder\styles">
<fileset basedir="C:\Program Files\Sandcastle\Presentation\vs2005\styles">
<include name="*" />
</fileset>
</copy>
<mkdir dir="C:\somefolder\media" />
<mkdir dir="C:\somefolder\intellisense" />
<buildassembler config="C:\Program Files\Sandcastle\Presentation\vs2005\configuration\sandcastle.config"
manifest="C:\somefolder\manifest.xml"
executable="C:\Program Files\Sandcastle\ProductionTools\buildassembler.exe" />
</target>

As you can see, my script is very static: it specifies one presentation model for Sandcastle (vs2005) and one assembly to generate from. All of the paths are hard-coded too - but this doesn't stop you from creating a generic target to handle multiple assemblies as part of a CI process. The main points to note in this script are simply the creation and population of directories prior to using BuildAssembler. These files come directly from the location Sandcastle has been installed to (or from wherever you choose to store them).
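If you did want to make it generic, NAnt's built-in foreach task could loop the mrefbuilder step over every assembly in a drop folder - a sketch along these lines, where the folder paths and property name are made up for illustration:

```xml
<target name="sandcastle-all">
    <!-- Run MRefBuilder once per assembly in the drop folder -->
    <foreach item="File" property="assembly">
        <in>
            <items basedir="C:\somefolder\bin">
                <include name="*.dll" />
            </items>
        </in>
        <do>
            <mrefbuilder assembly-path="${assembly}"
                output-path="C:\somefolder\${path::get-file-name-without-extension(assembly)}.xml"
                executable="C:\Program Files\Sandcastle\ProductionTools\MrefBuilder.exe" />
        </do>
    </foreach>
</target>
```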

So, we are nearly there. By now we have XML documentation generated. Following the scripts provided in the Sandcastle download, the next step would be to create a .chm file. Personally, I don't really need .chm files generated - I want my help files to go straight into a website/wiki. To do that, though, requires a bit more thought and code than I cover here. So stay tuned!!

Saturday, 4 October 2008

Generating SQL Server Documentation with NAnt and RedGate

The other day I posted a method by which you can automatically sync schema from one MS SQL Server db to another. While I was looking at the RedGate tools, I found a handy utility called SQL Doc. This doesn't have an API (and doesn't really need one); it consists of a GUI and a command-line client, which is good.

Basically, this app allows you to create a set of documentation on any given db, including the setup scripts needed to replicate that particular db. It can output this report as a set of web pages, a .chm or a .doc. I thought that this was pretty cool; it represents a really good way of documenting the db at certain points throughout its life. One good use of this tool would be to generate documentation just before a new release (or just after). That way, you can maintain a handy record of the changes going into your db that can be published on your intranet/wiki etc.

Of course, as a GUI tool, this isn't so great for CI, so whacking the command-line parameters into a NAnt script will make it slot into your automated process nicely. Plus, it's an easy win - the task itself is very easy to write:

using System;
using System.Text;
using NAnt.Core;
using NAnt.Core.Tasks;
using NAnt.Core.Attributes;

namespace Custom.NantTasks.RedGate
{
[TaskName("sqldoc")]
class SQLDoc : ExternalProgramBase
{
private string _projectPath;
[TaskAttribute("project", Required = true)]
public string ProjectPath
{
get { return _projectPath; }
set { _projectPath = value; }
}
private string _reportType;
[TaskAttribute("report-type", Required = true)]
public string ReportType
{
get { return _reportType; }
set { _reportType = value; }
}
private string _password;
[TaskAttribute("password", Required = true)]
public string Password
{
get { return _password; }
set { _password = value; }
}
private string _exePath;
[TaskAttribute("app-path", Required = true)]
public string ExePath
{
get { return _exePath; }
set { _exePath = value; }
}
public override string ExeName
{
get
{
return _exePath;
}
set
{
base.ExeName = value; // assign to the base property to avoid infinite recursion
}
}

public override string ProgramArguments
{
get { return "/project:\"" + ProjectPath + "\"" + " /password:" + Password +" /filetype:" + ReportType; }
}
protected override void ExecuteTask()
{
Log(Level.Info, "Generating SQL documentation");
Log(Level.Info, "Running the following command: " + ProgramArguments.ToString());
base.ExecuteTask();
}
}
}


And that's it!! There are, however, a few caveats. In order to automate this, you first need to set up a project file through the GUI; unfortunately, there doesn't seem to be a command-line route to do this. Once you have configured the project and saved it, you basically get an XML file - which is very handy, as you could use NAnt to alter the values held within it (I haven't done this). The build file would look a little something like this:
<target name="doc">
    <sqldoc project="C:\WidGetDev.sqldoc"
        password="password"
        report-type="html"
        app-path="C:\Program Files\Red Gate\SQL Doc 2\SQLDoc.exe" />
</target>
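Since the saved project is plain XML, NAnt's built-in xmlpoke task could, for example, retarget it before the doc step runs. The XPath below is purely hypothetical - I haven't inspected the .sqldoc file's actual schema, so check yours before trying this:

```xml
<target name="retarget-doc-project">
    <!-- Hypothetical XPath - inspect your own .sqldoc file for the real structure -->
    <xmlpoke file="C:\WidGetDev.sqldoc"
        xpath="/project/database/name"
        value="WidGetTest" />
</target>
```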

Unfortunately, there is no way to specify a username on the command line, only a password, which I guess is good enough. Once done, you can run this as a NAnt task pretty much anywhere. I did this on Vista, so I got some interesting security notifications popping up. I also hit a very annoying bug following a successful build: for some reason, SQL Doc would fail to exit correctly (exiting with code 77, which means the user does not have permission). However, this just went away for me in the end... which was odd. Regardless, once the task has run and completed, you will be able to pick up your documentation. SQL Doc places it in a location specified during project creation, but you can change this whenever you like. After the task has run, just do whatever you fancy with the results; my idea was to post them to a wiki, but you should be able to attach them as a build artefact in whichever CI server you are using (if any).

Friday, 3 October 2008

Sync Schema with NAnt and RedGate SQL Compare

This was a problem that was bothering me a little throughout last year. On my previous blog I had a very basic NAnt task that made use of RedGate's SQL Compare API. RedGate makes some fantastic tools for use with MS SQL Server; if you have to spend time working on multiple databases, then I can't recommend them enough. For more information, pop over to their website: http://www.redgate.com/.

First up, you need to have a little think about the problem at hand - you have a number of databases that you need to sync. SQL Compare gives you a nice GUI to do all of this, which is fine if you only have one db to worry about (or, at a stretch, a couple), but if you have a large number of dbs spread across multiple environments this will become a problem very quickly. To get around it, you can make use of the command-line version of SQL Compare, or you can do what I did and make use of the API to automate things just a little further. One clear benefit I have noticed so far is that using the API over the command-line version results in a quicker operation.

Another thing to consider is this: if you have more than one db to manage, you are going to have to put the login credentials somewhere. This will potentially panic your IT manager or administrator - they don't really want login credentials floating around on machines, which is a very valid and fair point. From the example below, you will see that I am managing my login credentials by placing them in NAnt's app.config. This will send a shiver of fear down some people's spines, as it means they are there for all to see in plain text - but don't panic, it is a very simple thing to encrypt them so that no one can see them. Just plonk all your db credentials in there, encrypt it and make sure that this .config is deployed correctly whenever the details change.

I am sure there will be other considerations to take into account depending on your own setup (there always are). But for me, this is all I needed. So, let's move on - what do I need to capture in my NAnt build script? Well, if all the db credentials are stored away in a .config file, then all I need to do is retrieve them by name. This means that in my NAnt script I just need to specify the source and destination dbs by their key values, e.g.:

<add key="test" value="db-server-name, db-name, username, password" />

It's up to you if you want this to be in plain text in NAnt's config file; personally I would encrypt it - but that's just me.

Now, onto the task. As I said above, I am only getting two attributes: the source and destination database names. This means the task would look a little like this:

[TaskName("schema-sync")]
public class SchemaSync : Task
{
    public Sync sync = new Sync();

    private string _sourceDBName;
    [TaskAttribute("source", Required = true)]
    public string SourceDBName
    {
        get { return _sourceDBName; }
        set { _sourceDBName = value; }
    }

    private string _destinationDBName;
    [TaskAttribute("destination", Required = true)]
    public string DestinationDBName
    {
        get { return _destinationDBName; }
        set { _destinationDBName = value; }
    }

Now, we get onto the good stuff - getting the sync underway!

I am going to present this in the way I figured it out - I am fairly sure that a lot of people reading this will want to rejig it to fit their own needs. Let's start off by getting a collection of differences between two databases:



private Differences GetDifferences(string dbServerNameSource,
                                   string dbUserNameSource,
                                   string dbPasswordSource,
                                   string dbNameSource,
                                   string dbServerNameDest,
                                   string dbUsernameDest,
                                   string dbPasswordDest,
                                   string dbNameDest)
{
    Database db1 = new Database();
    Database db2 = new Database();
    db1.Register(new ConnectionProperties(dbServerNameSource,
                                          dbNameSource,
                                          dbUserNameSource,
                                          dbPasswordSource),
                 Options.Default);
    db2.Register(new ConnectionProperties(dbServerNameDest,
                                          dbNameDest,
                                          dbUsernameDest,
                                          dbPasswordDest),
                 Options.Default);
    Differences differences = db1.CompareWith(db2, Options.Default);
    // Mark which differences should be included in the sync script.
    foreach (Difference difference in differences)
    {
        difference.Selected = ExcludeObject(difference);
    }
    return differences;
}

This method returns a collection of differences between the two databases supplied through the NAnt script. Note that this doesn't actually make any changes to your dbs; all it does is find out what the differences are. Next, we want to build the SQL needed to sync the schema from DB1 to DB2, which is fairly simple:


private ExecutionBlock ScriptDifferences(Differences differences)
{
    Work work = new Work();
    work.BuildFromDifferences(differences, Options.Default, true);
    ExecutionBlock block = work.ExecutionBlock;
    return block;
}

The returned block now contains the SQL script that will alter the destination db. To execute this SQL, we need the following:


private BlockExecutor SyncDifferences(string dbServerNameDest,
                                      string dbNameDest,
                                      string dbUserNameDest,
                                      string dbPasswordDest,
                                      ExecutionBlock block)
{
    BlockExecutor executor = new BlockExecutor();
    executor.ExecuteBlock(block,
                          dbServerNameDest,
                          dbNameDest,
                          true,
                          dbUserNameDest,
                          dbPasswordDest);
    return executor;
}

And that's it - this will effectively sync db1 to db2 for you. Incidentally, this seemed to me to be a lot simpler than fudging a script together using the command-line Compare app... You will notice that I am also filtering some objects out of the compare itself; the following will do this for you:



private static bool ExcludeObject(Difference difference)
{
// Despite the name, returning true here marks a difference as selected,
// i.e. included in the sync; users, roles, queues, services and the
// special objects below are the ones excluded.
if (difference.DatabaseObjectType != ObjectType.User &&
difference.DatabaseObjectType != ObjectType.Role &&
difference.DatabaseObjectType != ObjectType.Queue &&
difference.DatabaseObjectType != ObjectType.Service &&
SpecialObjects(difference) == false)
{
return true;
}
return false;
}
private static bool SpecialObjects(Difference difference)
{
// ASP.NET membership support objects and query helpers that should
// never be scripted across.
if ((difference.DatabaseObjectType == ObjectType.Table &&
difference.Name.ToLower().StartsWith("[dbo].[aspnet_sql")) ||
(difference.DatabaseObjectType == ObjectType.StoredProcedure &&
difference.Name.ToLower().StartsWith("[dbo].[aspnet_sql")) ||
(difference.DatabaseObjectType == ObjectType.StoredProcedure &&
difference.Name.ToLower().StartsWith("[dbo].[sqlquery")))
{
return true;
}
return false;
}

To tie it all up, I have one public method, which is called from my task class:


public object SyncDB(string dbServerNameSource,
string dbUserNameSource,
string dbPasswordSource,
string dbNameSource,
string dbServerNameDest,
string dbUserNameDest,
string dbPasswordDest,
string dbNameDest)
{
object obj = null;
Differences diff = GetDifferences(dbServerNameSource,
dbUserNameSource,
dbPasswordSource,
dbNameSource,
dbServerNameDest,
dbUserNameDest,
dbPasswordDest,
dbNameDest);
ExecutionBlock script = ScriptDifferences(diff);

obj = SyncDifferences(dbServerNameDest,
dbNameDest,
dbUserNameDest,
dbPasswordDest,
script);
return obj;
}
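
If you want to see what the compare decided to act on, a small helper like this can be called after GetDifferences(). It is only a sketch, using nothing beyond the Difference members already shown (Name, DatabaseObjectType and Selected):

```csharp
// Sketch: log each difference that will be scripted. Console.WriteLine
// could be swapped for the NAnt Log() call inside the task itself.
private void LogDifferences(Differences differences)
{
    foreach (Difference difference in differences)
    {
        if (difference.Selected)
        {
            Console.WriteLine("{0}: {1}",
                difference.DatabaseObjectType,
                difference.Name);
        }
    }
}
```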

Et voila, we now have enough code to automatically sync schema from one db to another. All that is left is to fit it all into my NAnt task. Below is the complete code I used:


using NAnt.Core;
using NAnt.Core.Attributes;
using NAnt.Core.Tasks;
using NAnt.Core.Util;
using System;
using System.Collections.Generic;
using System.Configuration;
using System.Text;
using RedGate.Shared.SQL.ExecutionBlock;
using RedGate.SQLCompare.Engine;
using Custom.NantTasks.RedGate.Compare;

namespace Custom.NantTasks
{
[TaskName("schema-sync")]
public class SchemaSync : Task
{
public Sync sync = new Sync();
private string _sourceDBName;
[TaskAttribute("source", Required = true)]
public string SourceDBName
{
get { return _sourceDBName; }
set { _sourceDBName = value; }
}
private string _destinationDBName;
[TaskAttribute("destination", Required = true)]
public string DestinationDBName
{
get { return _destinationDBName; }
set { _destinationDBName = value; }
}
protected override void ExecuteTask()
{
string dbServerNameSource;
string dbNameSource;
string dbUserNameSource;
string dbPasswordSource;
GetSourceDetails(out dbServerNameSource,
out dbNameSource,
out dbUserNameSource,
out dbPasswordSource);
string dbServerNameDest;
string dbNameDest;
string dbUserNameDest;
string dbPasswordDest;
GetDestDetails(out dbServerNameDest,
out dbNameDest,
out dbUserNameDest,
out dbPasswordDest);
try
{
SyncDB(dbServerNameSource,
dbNameSource,
dbUserNameSource,
dbPasswordSource,
dbServerNameDest,
dbNameDest,
dbUserNameDest,
dbPasswordDest);
}
catch (Exception e)
{
Log(Level.Error, "Sync failed");
throw new BuildException(e.Message, e);
}
}

private void SyncDB(string dbServerNameSource,
string dbNameSource,
string dbUserNameSource,
string dbPasswordSource,
string dbServerNameDest,
string dbNameDest,
string dbUserNameDest,
string dbPasswordDest)
{

Log(Level.Info, "Syncing {0} to {1}", SourceDBName, DestinationDBName);
try
{
sync.SyncDB(dbServerNameSource,
dbUserNameSource,
dbPasswordSource,
dbNameSource,
dbServerNameDest,
dbUserNameDest,
dbPasswordDest,
dbNameDest);
Log(Level.Info, "Sync complete");
}
catch (Exception e)
{
throw new BuildException(e.Message, e);
}
}

private void GetDestDetails(out string dbServerNameDest,
out string dbNameDest,
out string dbUserNameDest,
out string dbPasswordDest)
{
Log(Level.Info, "Getting credentials for: " + DestinationDBName);
string[] destDetails =
ConfigurationManager.AppSettings.GetValues(DestinationDBName)[0].ToString().Split(',');
dbServerNameDest = destDetails[0].Trim();
dbNameDest = destDetails[1].Trim();
dbUserNameDest = destDetails[2].Trim();
dbPasswordDest = destDetails[3].Trim();
}

private void GetSourceDetails(out string dbServerNameSource,
out string dbNameSource,
out string dbUserNameSource,
out string dbPasswordSource)
{
string[] sourceDetails =
ConfigurationManager.AppSettings.GetValues(SourceDBName)[0].ToString().Split(',');
Log(Level.Info, "Getting credentials for: " + SourceDBName);
dbServerNameSource = sourceDetails[0].Trim();
dbNameSource = sourceDetails[1].Trim();
dbUserNameSource = sourceDetails[2].Trim();
dbPasswordSource = sourceDetails[3].Trim();
}
}
}



GetDestDetails() and GetSourceDetails() handle the retrieval of db credentials from NAnt's .config file. To put this new task into use, the build script would look a little like this:

<?xml version="1.0" encoding="utf-8" ?>
<project name="Test" default="Test">
<target name="Test">
<schema-sync source="WidGetDev" destination="WidGetLive" />
</target>
</project>
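
For the task to find its credentials, the source and destination names used in the script need matching appSettings entries in NAnt's config file. The keys below match the WidGetDev/WidGetLive names in the script above, but the server, database, user and password values are purely illustrative; each value is the comma-separated string the task splits apart:

```xml
<configuration>
  <appSettings>
    <!-- format: server, database, user, password (split on ',' by the task) -->
    <add key="WidGetDev" value="DEVSERVER, WidGet, widgetUser, devPassword" />
    <add key="WidGetLive" value="LIVESERVER, WidGet, widgetUser, livePassword" />
  </appSettings>
</configuration>
```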

If an error is encountered, it will be thrown as a build exception and fail the build (which is nice). If you are syncing a number of DBs in one go, this means the operation will stop at the last successful sync. If you have a number of DBs to sync, it would be a good idea to use NAnt's foreach over a list of the DBs you want synced. This lets you skip over anything you don't want to go from the source to the destination (like a DB full of environment variables, for instance).
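
A foreach-driven target might look like the sketch below. The database names and the Dev/Live suffix convention are assumptions for illustration; the pattern just feeds each name into the schema-sync task in turn:

```xml
<!-- Sketch: sync several DBs in one target. Names and the Dev/Live
     suffix convention are examples only. -->
<target name="SyncAll">
  <foreach item="String" in="WidGet,Orders,Billing" delim="," property="db">
    <schema-sync source="${db}Dev" destination="${db}Live" />
  </foreach>
</target>
```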

And that's pretty much it: one simple and fairly quick way to use the power of RedGate's SQL Compare engine with NAnt, which in turn gives you a nice opportunity to use CI servers like CruiseControl.Net or TeamCity.

Below is the complete code for the schema compare:


using NAnt.Core;
using NAnt.Core.Attributes;
using NAnt.Core.Tasks;
using NAnt.Core.Util;
using RedGate.Shared.SQL.ExecutionBlock;
using RedGate.SQLCompare.Engine;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

namespace Custom.NantTasks.RedGate.Compare
{
public class Sync
{
private Differences GetDifferences(string dbServerNameSource,
string dbUserNameSource,
string dbPasswordSource,
string dbNameSource,
string dbServerNameDest,
string dbUsernameDest,
string dbPasswordDest,
string dbNameDest)
{
Differences differences = null;
Database db1 = new Database();
Database db2 = new Database();
db1.Register(new ConnectionProperties(dbServerNameSource,
dbNameSource,
dbUserNameSource,
dbPasswordSource),
Options.Default);
db2.Register(new ConnectionProperties(dbServerNameDest,
dbNameDest,
dbUsernameDest,
dbPasswordDest),
Options.Default);
differences = db1.CompareWith(db2, Options.Default);
foreach (Difference difference in differences)
{
difference.Selected = ExcludeObject(difference);
}
return differences;
}
private ExecutionBlock ScriptDifferences(Differences differences)
{
Work work = new Work();
work.BuildFromDifferences(differences, Options.Default, true);
ExecutionBlock block = work.ExecutionBlock;
return block;
}
private BlockExecutor SyncDifferences(string dbServerNameDest,
string dbNameDest,
string dbUserNameDest,
string dbPasswordDest,
ExecutionBlock block)
{
BlockExecutor executor = new BlockExecutor();
executor.ExecuteBlock(block,
dbServerNameDest,
dbNameDest,
true,
dbUserNameDest,
dbPasswordDest);
return executor;
}
private static bool ExcludeObject(Difference difference)
{
// Despite the name, returning true here marks a difference as selected,
// i.e. included in the sync; users, roles, queues, services and the
// special objects below are the ones excluded.
if (difference.DatabaseObjectType != ObjectType.User &&
difference.DatabaseObjectType != ObjectType.Role &&
difference.DatabaseObjectType != ObjectType.Queue &&
difference.DatabaseObjectType != ObjectType.Service &&
SpecialObjects(difference) == false)
{
return true;
}
return false;
}
private static bool SpecialObjects(Difference difference)
{
// ASP.NET membership support objects and query helpers that should
// never be scripted across.
if ((difference.DatabaseObjectType == ObjectType.Table &&
difference.Name.ToLower().StartsWith("[dbo].[aspnet_sql")) ||
(difference.DatabaseObjectType == ObjectType.StoredProcedure &&
difference.Name.ToLower().StartsWith("[dbo].[aspnet_sql")) ||
(difference.DatabaseObjectType == ObjectType.StoredProcedure &&
difference.Name.ToLower().StartsWith("[dbo].[sqlquery")))
{
return true;
}
return false;
}


public object SyncDB(string dbServerNameSource,
string dbUserNameSource,
string dbPasswordSource,
string dbNameSource,
string dbServerNameDest,
string dbUserNameDest,
string dbPasswordDest,
string dbNameDest)
{
object obj = null;
Differences diff = GetDifferences(dbServerNameSource,
dbUserNameSource,
dbPasswordSource,
dbNameSource,
dbServerNameDest,
dbUserNameDest,
dbPasswordDest,
dbNameDest);
ExecutionBlock script = ScriptDifferences(diff);

obj = SyncDifferences(dbServerNameDest,
dbNameDest,
dbUserNameDest,
dbPasswordDest,
script);
return obj;
}
}
}