At this point in my blog series on Continuous Delivery with TFS / VSTS we have finally reached the stage where we are ready to start using the new web-based release management capabilities of VSTS and TFS. The functionality has been in VSTS for a little while now but only came to TFS with Update 2 of TFS 2015 which was released at the end of March 2016.
Don’t tell my wife but I’m having a torrid love affair with the new TFS / VSTS Release Management. It’s flippin’ brilliant! Compared to the previous WPF desktop client it’s a breath of fresh air: easy to understand, quick to set up and a joy to use. Sure there are some improvements that could be made (and these will come in time) but for the moment, for a relatively new product, I’m finding the experience extremely agreeable. So let’s crack on!
Setting the Scene
The previous posts in this series set the scene for this post but I'll briefly summarise here. We'll be deploying the Contoso University sample application, which consists of an ASP.NET MVC website and a SQL Server database that I've converted to a SQL Server Database Project so that deployment is by DACPAC. We'll be deploying to three environments (DAT, DQA and PRD) as I explain here, and not only will we be deploying the application, we'll first be making sure the environments are correctly configured with PowerShell DSC using an adaptation of the procedure I describe here.
My demo environment in Azure is configured as a Windows domain and includes an instance of TFS 2015 Update 2 which I'll be using for this post as it's the lowest common denominator, although I will point out any VSTS specifics where needed. We'll be deploying to newly minted Windows Server 2012 R2 VMs which have been joined to the domain, configured with WMF 5.0 and had their domain firewall turned off – see here for details. (Note that if you are using versions of Windows Server earlier than 2012 that don't have remote management turned on you have a bit of extra work to do.) My TFS instance is hosting the build agent and as such the agent can 'see' all the machines in the domain. I'm using Integrated Security to allow the website to talk to the database, and using three different domain accounts (CU-DAT, CU-DQA and CU-PRD) to illustrate passing different credentials to different environments. I assume you have these set up in advance.
As far as development tools are concerned I’m using Visual Studio 2015 Update 2 with PowerShell Tools installed and Git for version control within a TFS / VSTS team project. It goes without saying that for each release I’m building the application only once and as part of the build any environment-specific configuration is replaced with tokens. These tokens are replaced with the correct values for that environment as that same tokenised build moves through the deployment pipeline.
Writing Server Configuration Code Alongside Application Code
A key concept I am promoting in this blog post series is that configuring the servers that your application will run on should not be an afterthought and neither should it be a manual click-through-GUI process. Rather, you should be configuring your servers through code and that code should be written at the same time as you write your application code. Furthermore the server configuration code should live with your application code. To start then we need to configure Contoso University for this way of working. If you are following along you can get the starting point code from here.
- Open the ContosoUniversity solution in Visual Studio and add new folders called Deploy to the ContosoUniversity.Database and ContosoUniversity.Web projects.
- In ContosoUniversity.Database\Deploy create two new files: Database.ps1 and DbDscResources.ps1. (Note that SQL Server Database Projects are a bit fussy about what can be created in Visual Studio so you might need to create these files in Windows Explorer and add them in as new items.)
- Database.ps1 should contain the following code:
[CmdletBinding()]
param(
    [Parameter(Position=1)]
    [string]$domainSqlServerSetupLogin,
    [Parameter(Position=2)]
    [string]$domainSqlServerSetupPassword,
    [Parameter(Position=3)]
    [string]$sqlServerSaPassword,
    [Parameter(Position=4)]
    [string]$domainUserForIntegratedSecurityLogin
)

<# The Write-Verbose output below is intentional: it helps track down environment cloning
   errors where a secret variable wasn't re-entered in the cloned environment and a blank password is passed through #>
Write-Verbose "The value of parameter `$domainSqlServerSetupLogin is $domainSqlServerSetupLogin" -Verbose
Write-Verbose "The value of parameter `$domainSqlServerSetupPassword is $domainSqlServerSetupPassword" -Verbose
Write-Verbose "The value of parameter `$sqlServerSaPassword is $sqlServerSaPassword" -Verbose
Write-Verbose "The value of parameter `$domainUserForIntegratedSecurityLogin is $domainUserForIntegratedSecurityLogin" -Verbose

$domainSqlServerSetupCredential = New-Object System.Management.Automation.PSCredential ($domainSqlServerSetupLogin, (ConvertTo-SecureString -String $domainSqlServerSetupPassword -AsPlainText -Force))
$sqlServerSaCredential = New-Object System.Management.Automation.PSCredential ("sa", (ConvertTo-SecureString -String $sqlServerSaPassword -AsPlainText -Force))

$configurationData =
@{
    AllNodes =
    @(
        @{
            NodeName = $env:COMPUTERNAME
            PSDscAllowDomainUser = $true
            PSDscAllowPlainTextPassword = $true
            DomainSqlServerSetupCredential = $domainSqlServerSetupCredential
            SqlServerSaCredential = $sqlServerSaCredential
            DomainUserForIntegratedSecurityLogin = $domainUserForIntegratedSecurityLogin
        }
    )
}

Configuration Database
{
    Import-DscResource -ModuleName PSDesiredStateConfiguration
    Import-DscResource -ModuleName @{ModuleName="xSQLServer";ModuleVersion="1.5.0.0"}
    Import-DscResource -ModuleName @{ModuleName="xDatabase";ModuleVersion="1.4.0.0"}
    Import-DscResource -ModuleName @{ModuleName="xReleaseManagement";ModuleVersion="1.0.0.0"}

    Node $AllNodes.NodeName
    {
        WindowsFeature "NETFrameworkCore"
        {
            Ensure = "Present"
            Name = "NET-Framework-Core"
        }

        xSqlServerSetup "SQLServerEngine"
        {
            DependsOn = "[WindowsFeature]NETFrameworkCore"
            SourcePath = "\\prm-core-dc\DscInstallationMedia"
            SourceFolder = "SqlServer2014"
            SetupCredential = $Node.DomainSqlServerSetupCredential
            InstanceName = "MSSQLSERVER"
            Features = "SQLENGINE"
            SecurityMode = "SQL"
            SAPwd = $Node.SqlServerSaCredential
        }

        xDatabase DeployDac
        {
            DependsOn = "[xSqlServerSetup]SQLServerEngine"
            Ensure = "Present"
            SqlServer = $Node.NodeName
            SqlServerVersion = "2014"
            DatabaseName = "ContosoUniversity"
            Credentials = $Node.SqlServerSaCredential
            DacPacPath = "C:\temp\Database\ContosoUniversity.Database.dacpac"
            DacPacApplicationName = "ContosoUniversity.Database"
        }

        xTokenize ReplacePermissionsScriptConfigTokens
        {
            DependsOn = "[xDatabase]DeployDac"
            recurse = $false
            tokens = @{LOGIN_OR_USER = $Node.DomainUserForIntegratedSecurityLogin; DB_NAME = "ContosoUniversity"}
            useTokenFiles = $false
            path = "C:\temp\Database\Deploy"
            searchPattern = "*.sql"
        }

        Script ApplyPermissions
        {
            DependsOn = "[xTokenize]ReplacePermissionsScriptConfigTokens"
            SetScript =
            {
                $cmd = "& 'C:\Program Files\Microsoft SQL Server\Client SDK\ODBC\110\Tools\Binn\sqlcmd.exe' -S localhost -i 'C:\temp\Database\Deploy\Create login and database user.sql' "
                Invoke-Expression $cmd
            }
            TestScript = { $false }
            GetScript = { @{ Result = "" } }
        }

        # Configure for debugging / development mode only
        #xSqlServerSetup "SQLServerManagementTools"
        #{
        #    DependsOn = "[WindowsFeature]NETFrameworkCore"
        #    SourcePath = "\\prm-core-dc\DscInstallationMedia"
        #    SourceFolder = "SqlServer2014"
        #    SetupCredential = $Node.DomainSqlServerSetupCredential
        #    InstanceName = "NULL"
        #    Features = "SSMS,ADV_SSMS"
        #}
    }
}

Database -ConfigurationData $configurationData
- DbDscResources.ps1 should contain the following code:
$customModulesDestination = Join-Path $env:SystemDrive "\Program Files\WindowsPowerShell\Modules"
# Modules need to have been copied to this UNC from a machine where they were installed
$customModulesSource = "\\prm-core-dc\DscResources"
Copy-Item -Verbose -Force -Recurse -Path (Join-Path $customModulesSource xSqlServer) -Destination $customModulesDestination
Copy-Item -Verbose -Force -Recurse -Path (Join-Path $customModulesSource xDatabase) -Destination $customModulesDestination
Copy-Item -Verbose -Force -Recurse -Path (Join-Path $customModulesSource xReleaseManagement) -Destination $customModulesDestination
- In ContosoUniversity.Web\Deploy create two new files: Website.ps1 and WebDscResources.ps1.
- Website.ps1 should contain the following code:
[CmdletBinding()]
param(
    [Parameter(Position=1)]
    [string]$domainUserForIntegratedSecurityLogin,
    [Parameter(Position=2)]
    [string]$domainUserForIntegratedSecurityPassword,
    [Parameter(Position=3)]
    [string]$sqlServerName
)

<# The Write-Verbose output below is intentional: it helps track down environment cloning
   errors where a secret variable wasn't re-entered in the cloned environment and a blank password is passed through #>
Write-Verbose "The value of parameter `$domainUserForIntegratedSecurityLogin is $domainUserForIntegratedSecurityLogin" -Verbose
Write-Verbose "The value of parameter `$domainUserForIntegratedSecurityPassword is $domainUserForIntegratedSecurityPassword" -Verbose
Write-Verbose "The value of parameter `$sqlServerName is $sqlServerName" -Verbose

$domainUserForIntegratedSecurityCredential = New-Object System.Management.Automation.PSCredential ($domainUserForIntegratedSecurityLogin, (ConvertTo-SecureString -String $domainUserForIntegratedSecurityPassword -AsPlainText -Force))

$configurationData =
@{
    AllNodes =
    @(
        @{
            NodeName = $env:COMPUTERNAME
            DomainUserForIntegratedSecurityLogin = $domainUserForIntegratedSecurityLogin
            DomainUserForIntegratedSecurityCredential = $domainUserForIntegratedSecurityCredential
            SqlServerName = $sqlServerName
            PSDscAllowDomainUser = $true
            PSDscAllowPlainTextPassword = $true
        }
    )
}

Configuration Web
{
    Import-DscResource -ModuleName PSDesiredStateConfiguration
    Import-DscResource -ModuleName @{ModuleName="cWebAdministration";ModuleVersion="2.0.1"}
    Import-DscResource -ModuleName @{ModuleName="xWebAdministration";ModuleVersion="1.10.0.0"}
    Import-DscResource -ModuleName @{ModuleName="xReleaseManagement";ModuleVersion="1.0.0.0"}

    Node $AllNodes.NodeName
    {
        # Configure for web server role
        WindowsFeature DotNet45Core
        {
            Ensure = 'Present'
            Name = 'NET-Framework-45-Core'
        }

        WindowsFeature IIS
        {
            Ensure = 'Present'
            Name = 'Web-Server'
        }

        WindowsFeature AspNet45
        {
            Ensure = "Present"
            Name = "Web-Asp-Net45"
        }

        # Only turn off whilst sorting out the web files - needs to be on for the rest of the script to work
        Script StopIIS
        {
            DependsOn = "[WindowsFeature]IIS"
            SetScript =
            {
                Stop-Service W3SVC
            }
            TestScript = { $false }
            GetScript = { @{ Result = "" } }
        }

        # Make sure the web folder has the latest website files
        xTokenize ReplaceWebConfigTokens
        {
            Recurse = $false
            Tokens = @{DATA_SOURCE = $Node.SqlServerName; INITIAL_CATALOG = "ContosoUniversity"}
            UseTokenFiles = $false
            Path = "C:\temp\website"
            SearchPattern = "web.config"
        }

        Script DeleteExistingWebsiteFilesSoAbsolutelyCertainAllFilesComeFromTheBuild
        {
            DependsOn = "[xTokenize]ReplaceWebConfigTokens"
            SetScript =
            {
                Remove-Item "C:\inetpub\ContosoUniversity" -Force -Recurse -ErrorAction SilentlyContinue
            }
            TestScript = { $false }
            GetScript = { @{ Result = "" } }
        }

        File CopyWebsiteFiles
        {
            DependsOn = "[Script]DeleteExistingWebsiteFilesSoAbsolutelyCertainAllFilesComeFromTheBuild"
            Ensure = "Present"
            Force = $true
            Recurse = $true
            Type = "Directory"
            SourcePath = "C:\temp\website"
            DestinationPath = "C:\inetpub\ContosoUniversity"
        }

        File RemoveDeployFolder
        {
            DependsOn = "[File]CopyWebsiteFiles"
            Ensure = "Absent"
            Force = $true
            Type = "Directory"
            DestinationPath = "C:\inetpub\ContosoUniversity\Deploy"
        }

        Script StartIIS
        {
            DependsOn = "[File]RemoveDeployFolder"
            SetScript =
            {
                Start-Service W3SVC
            }
            TestScript = { $false }
            GetScript = { @{ Result = "" } }
        }

        # Configure custom app pool
        xWebAppPool ContosoUniversity
        {
            DependsOn = "[WindowsFeature]IIS"
            Ensure = "Present"
            Name = "ContosoUniversity"
            State = "Started"
        }

        cAppPool ContosoUniversity
        {
            DependsOn = "[xWebAppPool]ContosoUniversity"
            Name = "ContosoUniversity"
            IdentityType = "SpecificUser"
            UserName = $Node.DomainUserForIntegratedSecurityLogin
            Password = $Node.DomainUserForIntegratedSecurityCredential
        }

        # Advanced configuration
        xWebsite ContosoUniversity
        {
            DependsOn = "[cAppPool]ContosoUniversity"
            Ensure = "Present"
            Name = "ContosoUniversity"
            State = "Started"
            PhysicalPath = "C:\inetpub\ContosoUniversity"
            BindingInfo = MSFT_xWebBindingInformation
            {
                Protocol = 'http'
                Port = '80'
                HostName = $Node.NodeName
                IPAddress = '*'
            }
            ApplicationPool = "ContosoUniversity"
        }

        # Clean up the unneeded website and application pools
        xWebsite Default
        {
            Ensure = "Absent"
            Name = "Default Web Site"
        }

        xWebAppPool NETv45
        {
            Ensure = "Absent"
            Name = ".NET v4.5"
        }

        xWebAppPool NETv45Classic
        {
            Ensure = "Absent"
            Name = ".NET v4.5 Classic"
        }

        xWebAppPool Default
        {
            Ensure = "Absent"
            Name = "DefaultAppPool"
        }

        File wwwroot
        {
            Ensure = "Absent"
            Type = "Directory"
            DestinationPath = "C:\inetpub\wwwroot"
            Force = $True
        }

        # Configure for debugging / development mode only
        #WindowsFeature IISTools
        #{
        #    Ensure = "Present"
        #    Name = "Web-Mgmt-Tools"
        #}
    }
}

Web -ConfigurationData $configurationData
- WebDscResources.ps1 should contain the following code:
$customModulesDestination = Join-Path $env:SystemDrive "\Program Files\WindowsPowerShell\Modules"
# Modules need to have been copied to this UNC from a machine where they were installed
$customModulesSource = "\\prm-core-dc\DscResources"
Copy-Item -Verbose -Force -Recurse -Path (Join-Path $customModulesSource xWebAdministration) -Destination $customModulesDestination
Copy-Item -Verbose -Force -Recurse -Path (Join-Path $customModulesSource cWebAdministration) -Destination $customModulesDestination
Copy-Item -Verbose -Force -Recurse -Path (Join-Path $customModulesSource xReleaseManagement) -Destination $customModulesDestination
- In ContosoUniversity.Database\Scripts move Create login and database user.sql to the Deploy folder and remove the Scripts folder.
- Make sure all these files have their Copy to Output Directory property set to Copy always. For the files in ContosoUniversity.Database\Deploy the Build Action property should be set to None.
The Database.ps1 and Website.ps1 scripts contain the PowerShell DSC to both configure servers for either IIS or SQL Server and then to deploy the actual component. See my Server Configuration as Code with PowerShell DSC post for more details. (At the risk of jumping ahead to the deployment part of this post, the bits to be deployed are copied to temp folders on the target nodes – hence references in the scripts to C:\temp\$whatever$.)
In the case of the database component I’m using the xDatabase custom DSC resource to deploy the DACPAC. I came across a problem with this resource where it wouldn’t install the DACPAC using domain credentials, despite the credentials having the correct permissions in SQL Server. I ended up having to install SQL Server using Mixed Mode authentication and installing the DACPAC using the sa login. I know, I know!
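If you hit the same problem and would rather stay with domain credentials, one possible workaround (a sketch only; I haven't battle-tested it, and the SqlPackage.exe path is an assumption that varies by installation) is to sidestep xDatabase and publish the DACPAC with SqlPackage.exe from a Script resource, which then runs under whatever identity applies the configuration:

```powershell
# Sketch only: deploy the DACPAC with SqlPackage.exe instead of xDatabase.
# The SqlPackage.exe path below is an assumption - check your DAC Framework installation.
Script DeployDacWithSqlPackage
{
    DependsOn = "[xSqlServerSetup]SQLServerEngine"
    SetScript =
    {
        $sqlPackage = "C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\SqlPackage.exe"
        & $sqlPackage /Action:Publish `
            /SourceFile:"C:\temp\Database\ContosoUniversity.Database.dacpac" `
            /TargetServerName:"localhost" `
            /TargetDatabaseName:"ContosoUniversity"   # Windows authentication by default
    }
    TestScript = { $false }   # always runs; a smarter test could compare deployed DacPac versions
    GetScript = { @{ Result = "" } }
}
```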
My preferred technique for deploying website files is plain xcopy. For me the requirement is to clear the old files down and replace them with the new ones. After some experimentation I ended up with code to stop IIS, remove the web folder, copy the new web folder from its temp location and then restart IIS.
Both the database and website have files with configuration tokens that needed replacing as part of the deployment. I’m using the xReleaseManagement custom DSC resource which takes a hash table of tokens (in the __TOKEN_NAME__ format) to replace.
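Under the covers the idea is simple. As a stand-alone illustration (this is not the actual xTokenize implementation, just the concept), replacing __TOKEN_NAME__ markers from a hash table looks something like this:

```powershell
# Minimal illustration of __TOKEN__-style replacement - not the real xTokenize code.
$tokens = @{ DATA_SOURCE = "PRM-DAT-AIO"; INITIAL_CATALOG = "ContosoUniversity" }
$path = "C:\temp\website\web.config"   # example path

$content = Get-Content -Path $path -Raw
foreach ($key in $tokens.Keys)
{
    # __DATA_SOURCE__ becomes PRM-DAT-AIO, and so on
    $content = $content -replace "__$($key)__", $tokens[$key]
}
Set-Content -Path $path -Value $content
```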
In order to use custom resources on target nodes the custom resources need to be in place before attempting to run a configuration. I had hoped to use a push server technique for this but it was not to be since for this post at least I’m running the DSC configurations on the actual target nodes and the push server technique only works if the MOF files are created on a staging machine that has the custom resources installed. Instead I’m copying the custom resources to the target nodes just prior to running the DSC configurations and this is the purpose of the DbDscResources.ps1 and WebDscResources.ps1 files. The custom resources live on a UNC that is available to target nodes and get there by simply copying them from a machine where they have been installed (C:\Program Files\WindowsPowerShell\Modules is the location) to the UNC.
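Getting the resources onto the UNC in the first place is a one-off manual step along these lines, run on a machine where the modules are already installed (the share name matches my environment; yours will differ):

```powershell
# One-off: copy installed DSC resource modules to the UNC that target nodes can see.
$source = "C:\Program Files\WindowsPowerShell\Modules"
$destination = "\\prm-core-dc\DscResources"
"xSqlServer", "xDatabase", "xWebAdministration", "cWebAdministration", "xReleaseManagement" |
    ForEach-Object { Copy-Item -Recurse -Force -Path (Join-Path $source $_) -Destination $destination }
```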
Create a Release Build
With Visual Studio now configured (don't forget to commit the changes) we need to create a build to check that initial code quality checks have passed and, if so, to publish the database and website components ready for deployment. Create a new build definition called ContosoUniversity.Rel and follow this post to configure the basics and this post to create a task to run unit tests. Note that for the Visual Studio Build task the MSBuild Arguments setting is /p:OutDir=$(build.stagingDirectory) /p:UseWPP_CopyWebApplication=True /p:PipelineDependsOnBuild=False /p:RunCodeAnalysis=True. This gives us a _PublishedWebsites\ContosoUniversity.Web folder (that contains all the web files that need to be deployed) and also runs the transformation to tokenise Web.config. Additionally, since we are outputting to $(build.stagingDirectory) the Test Assembly setting of the Visual Studio Test task needs to be $(build.stagingDirectory)\**\*UnitTests*.dll;-:**\obj\**. At some point we'll want to version our assemblies but I'll return to that in another post.
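If you want to sanity-check those MSBuild arguments locally before committing, something along these lines reproduces what the build task does (the MSBuild and output paths are assumptions for a Visual Studio 2015 machine):

```powershell
# Rough local equivalent of the Visual Studio Build task - paths are illustrative.
$msbuild = "C:\Program Files (x86)\MSBuild\14.0\Bin\MSBuild.exe"
& $msbuild "ContosoUniversity.sln" `
    /p:OutDir="C:\temp\staging\" `
    /p:UseWPP_CopyWebApplication=True `
    /p:PipelineDependsOnBuild=False `
    /p:RunCodeAnalysis=True
```

After a successful build you should find the _PublishedWebsites\ContosoUniversity.Web folder under the output directory.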
One important step that has changed since my earlier posts is that the Restore NuGet Packages option in the Visual Studio Build task has been deprecated. The new way of doing this is to add a NuGet Installer task as the very first item and then in the Visual Studio Build task (in the Advanced section in VSTS) uncheck Restore NuGet Packages.
To publish the database and website as components – or Artifacts (I’m using the TFS spelling) as they are known – we use the Copy and Publish Build Artifacts tasks. The database task should be configured as follows:
- Copy Root = $(build.stagingDirectory)
- Contents =
- ContosoUniversity.Database.d*
- Deploy\Database.ps1
- Deploy\DbDscResources.ps1
- Deploy\Create login and database user.sql
- Artifact Name = Database
- Artifact Type = Server
Note that the Contents setting can take multiple entries on separate lines and we use this to be explicit about what the database artifact should contain. The website task should be configured as follows:
- Copy Root = $(build.stagingDirectory)\_PublishedWebsites
- Contents = **\*
- Artifact Name = Website
- Artifact Type = Server
Because we are specifying a published folder of website files that already has the Deploy folder present there’s no need to be explicit about our requirements. With all this done the build should look similar to this:
In order to test the successful creation of the artifacts, queue a build and then – assuming the build was successful – navigate to the build and click on the Artifacts link. You should see the Database and Website artifact folders and you can examine the contents using the Explore link:
Create a Basic Release
With the artifacts created we can now turn our attention to creating a basic release to get them copied on to a target node and then perform a deployment. Switch to the Release hub in the web portal and use the green cross icon to create a new release definition. The Deployment Templates window is presented and you should choose to start with an Empty template. There are four immediate actions to complete:
- Provide a Definition name – ContosoUniversity for example.
- Change the name of the environment that has been added to DAT.
- Click on Link to a build definition to link the release to the ContosoUniversity.Rel build definition.
- Save the definition.
Next up we need to add two Windows Machine File Copy tasks to copy each artifact to one node called PRM-DAT-AIO. (As a reminder, the DAT environment as I define it is just one server which hosts both the website and the database and where automated testing takes place.) Although it's possible to use just one task here, the result of selecting artifacts differs according to the node selected in the artifact tree: at the root, folders are created for each artifact, but one node lower they aren't. I want a procedure that works for all environments, which is as follows:
- Click on Add tasks to bring up the Add Tasks window. Use the Deploy link to filter the list of tasks and Add two Windows Machine File Copy tasks:
- Configure the properties of the tasks as follows:
- Edit the names (use the pencil icon) to read Copy Database files and Copy Website files respectively.
- Source = $(System.DefaultWorkingDirectory)/ContosoUniversity.Rel/Database or $(System.DefaultWorkingDirectory)/ContosoUniversity.Rel/Website accordingly (use the ellipsis to select)
- Machines = PRM-DAT-AIO.prm.local
- Admin login = Supply a domain account login that has admin privileges for PRM-DAT-AIO.prm.local
- Password = Password for the above domain account
- Destination folder = C:\temp\Database or C:\temp\Website accordingly
- Advanced Options > Clean Target = checked
- Click the ellipsis in the DAT environment and choose Deployment conditions.
- Change the Trigger to After release creation and click OK to accept.
- Save the changes and trigger a release using the green cross next to Release. You’ll be prompted to select a build as part of the process:
- If the release succeeds a C:\temp folder containing the artifact folders will have been created on PRM-DAT-AIO.
- If the release fails switch to the Logs tab to troubleshoot. Permissions and whether the firewall has been configured to allow WinRM are the likely culprits. To preserve my sanity I do everything as domain admin and I have the domain firewall turned off. The usual warnings about these not necessarily being best practices in non-test environments apply!
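Before digging through the logs it can be quicker to confirm WinRM connectivity from the build agent with a couple of diagnostics (machine name as per my environment; 5985 is the default HTTP WinRM port):

```powershell
# Quick connectivity checks from the build agent to a target node.
Test-WSMan -ComputerName PRM-DAT-AIO.prm.local                        # is WinRM answering?
Test-NetConnection -ComputerName PRM-DAT-AIO.prm.local -Port 5985     # is the port reachable?
```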
Whilst you are checking the C:\temp folder on the target node have a look inside the artifact folders. They should both contain a Deploy folder holding the PowerShell scripts that will be executed remotely using the PowerShell on Target Machines task. You'll need two of these tasks, one for each artifact, configured as follows:
- Add two PowerShell on Target Machines tasks to alternately follow the Windows Machine File Copy tasks.
- Edit the names (use the pencil icon) to read Configure Database and Configure Website respectively.
- Configure the properties of the task as follows:
- Machines = PRM-DAT-AIO.prm.local
- Admin login = Supply a domain account that has admin privileges for PRM-DAT-AIO.prm.local
- Password = Password for the above domain account
- Protocol = HTTP
- Deployment > PowerShell Script = C:\temp\Database\Deploy\Database.ps1 or C:\temp\Website\Deploy\Website.ps1 accordingly
- Deployment > Initialization Script = C:\temp\Database\Deploy\DbDscResources.ps1 or C:\temp\Website\Deploy\WebDscResources.ps1 accordingly
- With reference to the parameters required by C:\temp\Database\Deploy\Database.ps1 configure Deployment > Script Arguments for the Database task as follows:
- $domainSqlServerSetupLogin = Supply a domain login that has privileges to install SQL Server on PRM-DAT-AIO.prm.local
- $domainSqlServerSetupPassword = Password for the above domain login
- $sqlServerSaPassword = Password you want to use for the SQL Server sa account
- $domainUserForIntegratedSecurityLogin = Supply a domain login to use for integrated security (PRM\CU-DAT in my case for the DAT environment)
- The finished result will be similar to: ‘PRM\Graham’ ‘YourSecurePassword’ ‘YourSecurePassword’ ‘PRM\CU-DAT’
- With reference to the parameters required by C:\temp\Website\Deploy\Website.ps1 configure Deployment > Script Arguments for the Website task as follows:
- $domainUserForIntegratedSecurityLogin = Supply a domain login to use for integrated security (PRM\CU-DAT in my case for the DAT environment)
- $domainUserForIntegratedSecurityPassword = Password for the above domain account
- $sqlServerName = machine name for the SQL Server instance (PRM-DAT-AIO in my case for the DAT environment)
- The finished result will be similar to: ‘PRM\CU-DAT’ ‘YourSecurePassword’ ‘PRM-DAT-AIO’
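If the argument order ever gives you trouble, one way to check it is to run the scripts manually on the target node after a release has copied the files, since they are just plain parameterised PowerShell (the values below are placeholders):

```powershell
# Placeholder values - run on the target node itself to test the positional parameters.
# Run the matching *DscResources.ps1 first so the custom DSC resources are in place.
& C:\temp\Database\Deploy\DbDscResources.ps1
& C:\temp\Database\Deploy\Database.ps1 'PRM\Graham' 'YourSecurePassword' 'YourSecurePassword' 'PRM\CU-DAT'
& C:\temp\Website\Deploy\WebDscResources.ps1
& C:\temp\Website\Deploy\Website.ps1 'PRM\CU-DAT' 'YourSecurePassword' 'PRM-DAT-AIO'
```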
At this point you should be able to save everything and the release should look similar to this:
Go ahead and trigger a new release. This should result in the PowerShell scripts being executed on the target node and IIS and SQL Server being installed, as well as the Contoso University application. You should be able to browse the application at http://prm-dat-aio. Result!
Variable Quality
Although we now have a working release for the DAT environment it will hopefully be obvious that there are serious shortcomings in the way we've configured it. Passwords in plain view are one issue and repeated values are another, and the latter becomes doubly concerning when we start creating further environments.
The answer to this problem is to create custom variables at both 'release' level and 'environment' level. Pretty much every text box seems to take a variable so you can really go to town here. It's also possible to create compound values based on multiple variables; I used this to separate the location of the C:\temp folder from the rest of the script location details. It's worth having a think about your variable names in advance of using them, because if you change your mind you'll need to edit every place they were used. In particular, if you edit the declaration of a secret variable you will need to click the padlock to clear the value and re-enter it. This tripped me up until I added Write-Verbose statements to output the parameters in my DSC scripts and realised that passwords were not being passed through (they are asterisked so there is no security concern). (You do get the scriptArguments as output to the console but I find having them each on a separate line easier.)
Release-level variables are created in the Configuration section and if they are passwords can be secured as secrets by clicking the padlock icon. The release-level variables I created are as follows:
Environment-level variables are created by clicking the ellipsis in the environment and choosing Configure Variables. I created the following:
The variables can then be used to reconfigure the release as per this screen shot which shows the PowerShell on Target Machines Configure Database task:
The other tasks are obviously configured in a similar way, and notice how some fields use more than one variable. Nothing has actually changed by replacing hard-coded values with variables, so triggering another release should still succeed.
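To make that concrete, with hypothetical variable names (yours will be whatever you declared) the relevant fields of the Configure Database task might end up looking like this, where $(TempFolder) is a release-level variable holding C:\temp:

```
PowerShell Script = $(TempFolder)\Database\Deploy\Database.ps1
Script Arguments  = '$(DomainSqlServerSetupLogin)' '$(DomainSqlServerSetupPassword)' '$(SqlServerSaPassword)' '$(DomainUserForIntegratedSecurityLogin)'
```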
Environments Matter
With a successful deployment to the DAT environment we can now turn our attention to the other stages of the deployment pipeline: DQA and PRD. The good news here is that all the work we did for DAT can easily be cloned for DQA, which can then be cloned for PRD. Here's the procedure for DQA which, don't forget, is a two-node deployment:
- In the Configuration section create two new release level variables:
- TargetNode-DQA-SQL = PRM-DQA-SQL.prm.local
- TargetNode-DQA-IIS = PRM-DQA-IIS.prm.local
- In the DAT environment click on the ellipsis and select Clone environment and name it DQA.
- Change the two database tasks so the Machines property is $(TargetNode-DQA-SQL).
- Change the two website tasks so the Machines property is $(TargetNode-DQA-IIS).
- In the DQA environment click on the ellipsis and select Configure variables and make the following edits:
- Change DomainUserForIntegratedSecurityLogin to PRM\CU-DQA
- Click on the padlock icon for the DomainUserForIntegratedSecurityPassword variable to clear it then re-enter the password and click the padlock icon again to make it a secret. Don’t miss this!
- Change SqlServerName to PRM-DQA-SQL
- In the DQA environment click on the ellipsis and select Deployment conditions and set Trigger to No automated deployment.
With everything saved, and assuming the PRM-DQA-SQL and PRM-DQA-IIS nodes are running, the release can now be triggered. Assuming the deployment to DAT was successful the release will wait for DQA to be manually deployed (almost certainly what is required, as manual testing could be going on here):
To keep things simple I didn’t assign any approvals for this release (ie they were all automatic) but do bear in mind there is some rich and flexible functionality available around this. If all is well you should be able to browse Contoso University on http://prm-dqa-iis. I won’t describe cloning DQA to create PRD as it’s very similar to the process above. Just don’t forget to re-enter cloned password values! Do note that in the Environment Variables view of the Configuration section you can view and edit (but not create) the environment-level variables for all environments:
This is a great way to check that variables are the correct values for the different environments.
And Finally…
There’s plenty more functionality in Release Management that I haven’t described but that’s as far as I’m going in this post. One message I do want to get across is that the procedure I describe in this post is not meant to be a statement on the definitive way of using Release Management. Rather, it’s designed to show what’s possible and to get you thinking about your own situation and some of the factors that you might need to consider. As just one example, if you only have one application then the Visual Studio solution for the application is probably fine for the DSC code that installs IIS and SQL Server. However if you have multiple similar applications then almost certainly you don’t want all that code repeated in every solution. Moving this code to the point at which the nodes are created could be an option here – or perhaps there is a better way!
That’s it for the moment but rest assured there’s lots more to be covered in this series. If you want the final code that accompanies this post I’ve created a release here on my GitHub site.
Cheers – Graham
The post Continuous Delivery with TFS / VSTS – Server Configuration and Application Deployment with Release Management appeared first on Please Release Me.