I just finished presenting a session at SPTechCon Boston entitled “VSTS, ARM and DSC for SharePoint 2019 – A match made in heaven“. The session had very few attendees, which made me wonder whether the audience I was trying to reach even exists in modern enterprises. Of course, the session targeted SharePoint 2019 on-premises (on Azure VMs) deployments, which may have been a turn-off for some people, but the idea presented in it applies to much more than just SharePoint.
Microsoft has done an amazing job of providing us with mature, enterprise-scale deployment options. With Azure Resource Manager (ARM), you can now define your Data Center as code: with a single JSON file, you can specify how many vCPUs and how much RAM your Virtual Machine should have, what type of storage to use (SSD or hard drive), and so on. With Desired State Configuration (DSC), you can define what software needs to be installed on your VMs and how it needs to be configured. Since we are now talking about Data Center and configuration as code, should those follow the same best practices as any other piece of code produced in the enterprise?
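To illustrate, here is a minimal sketch of what the VM portion of an ARM template can look like; the resource name, API version, and size values below are illustrative assumptions, not taken from the actual template discussed later:

```json
{
  "type": "Microsoft.Compute/virtualMachines",
  "apiVersion": "2017-12-01",
  "name": "SP2019-APP01",
  "location": "[resourceGroup().location]",
  "properties": {
    "hardwareProfile": {
      "vmSize": "Standard_DS3_v2"
    },
    "storageProfile": {
      "osDisk": {
        "createOption": "FromImage",
        "managedDisk": {
          "storageAccountType": "Premium_LRS"
        }
      }
    }
  }
}
```

Here, the vmSize value controls the vCPU and RAM allocation, and Premium_LRS requests SSD-backed managed disks; that is your hardware spec, expressed as code.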
The answer to this question is obviously yes, but how can you change the mindset of IT Pros who have been managing their servers the same way for decades, using PowerShell scripts they developed and ran in all of their environments (from Dev to Prod), hoping for the same results? You do it the way you convince anyone to do anything: you show them what’s in it for them! That is exactly what I hope this article will achieve.
To better prove my point that organizations would benefit from adopting a Software Development Lifecycle (SDLC) methodology for their environments’ configuration, we will look at a real-life scenario. Imagine that the SharePoint development team is working on a new client-side web part that exposes all forms in the organization and that, in order for it to work properly, a new Search Content Source named Forms needs to be created. What we want is to establish a fully automated pipeline that will allow our IT Admin folks to write the additional configuration for that one Content Source and have it automatically pushed to the SharePoint dev environment for the development team to test their new web part against. To achieve this, we will build our dev environment using ARM templates, store our DSC configuration code in a Visual Studio Team Services (VSTS) project, and leverage its Continuous Integration and Continuous Delivery (CI/CD) pipelines to have VSTS automatically push and apply our configuration changes to our dev environment.
Azure Resource Manager (ARM) Templates
As mentioned above, ARM templates let you define as code what we know as the physical hardware components of our traditional Data Centers. You don’t have to be an expert at writing ARM templates to get started: Microsoft does an amazing job of letting you build your components using the graphical Web portal and then export them as a JSON file. The beauty of ARM templates is that we can actually have them deployed to our environment with a single click, by using the Azure deployment APIs. For our scenario, our ARM template will deploy a five-server environment: one Domain Controller, one SQL Server 2017 server, and three SharePoint 2019 servers. My ARM template that defines those is stored in my personal GitHub repository at https://GitHub/NikCharlebois/Conferences, and can be deployed via the deployment engine by clicking on the button below:
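If you prefer scripting the deployment over clicking a button, the same template can be pushed with the AzureRM PowerShell module. A sketch, assuming you are already signed in with Login-AzureRmAccount and that the resource group name and template file name are placeholders:

```powershell
# Create (or reuse) the target resource group
New-AzureRmResourceGroup -Name "SPTechCon" -Location "EastUS"

# Deploy the five-server environment defined in the ARM template
New-AzureRmResourceGroupDeployment -ResourceGroupName "SPTechCon" `
    -TemplateFile ".\azuredeploy.json"
```

The button and the cmdlet both end up calling the same Azure deployment APIs; the scripted version is simply easier to fold into an automated pipeline later.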
Clicking the button above will automatically bring you to the Azure Portal where, once logged in, you will be able to pick an existing Resource Group (or create a new one) and deploy your five-server environment to it.
The SharePoint DSC configurations for my project will be stored within a single file named CorporateIntranet.ps1, which defines a configuration named CorporateIntranet. This is the file where we will be making the configuration changes we want deployed to our dev environment.
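To give an idea of its shape, CorporateIntranet.ps1 could look something like the following minimal skeleton; the node name and parameter are placeholders, not the actual project content:

```powershell
Configuration CorporateIntranet
{
    param (
        [Parameter(Mandatory = $true)]
        [PSCredential]
        $SPSetup
    )

    # SharePointDsc provides the SharePoint-specific DSC resources
    Import-DscResource -ModuleName SharePointDsc

    Node "localhost"
    {
        # Farm, service application, and Search resources go here;
        # this is the section the IT Admin will be editing later.
    }
}
```

Everything the scenario needs, including the new Search Content Source, gets added inside that Node block.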
Visual Studio Team Services
VSTS is where our code will be stored, and it is also where the deployment automation will happen. We will be creating a new build definition that performs two very simple tasks: copy the artifacts from our project and publish them into a folder named DSC on our build agent.
In order for our build to be triggered as soon as code changes are committed to the repository, we will enable Continuous Integration from the build definition’s Triggers tab.
The next thing we need to do is define a release definition that, upon a successful build, will take the DSC configuration changes in our project and publish them to an Azure Automation DSC account. Our release definition will have a single task, an Azure PowerShell task, that runs three simple lines of PowerShell to upload the DSC file to our Azure Automation DSC account and initiate a compilation job on it.
The PowerShell code used for the task is the following:
$azAccount = Get-AzureRmAutomationAccount -ResourceGroupName SPTechCon -Name SPTechConAADSC
$azAccount | Import-AzureRmAutomationDscConfiguration -SourcePath "$env:SYSTEM_ARTIFACTSDIRECTORY\Dev Build\DSC\CorporateIntranet.ps1" -Published -Force
$azAccount | Start-AzureRmAutomationDscCompilationJob -ConfigurationName CorporateIntranet
Here, SPTechConAADSC is the name of the Azure Automation account I want to deploy the DSC script to, CorporateIntranet.ps1 is the name of the DSC script in my code project that contains the SharePoint configurations, and CorporateIntranet is the name of the configuration it defines.
Azure Automation DSC
Azure Automation DSC is where the official DSC configurations will end up, and it is the engine that will take care of deploying the latest configuration changes to our SharePoint environment.
Seeing it in Action
Our DevOps pipeline is now operational; all the pieces are in place. In our scenario, the IT Admin now opens the CorporateIntranet.ps1 file in the free Visual Studio Code editor and makes the configuration change to include the new Search Content Source required by the development team.
SPSearchContentSource FormsContentSource
{
    Name                 = "Forms"
    ServiceAppName       = "Search Service Application"
    ContentSourceType    = "SharePoint"
    Addresses            = @("http://localhost")
    PSDscRunAsCredential = $SPSetup
}
Once the changes to the file are done, the admin pushes them back into VSTS source control. The moment a change is detected, VSTS initiates a new build:
The moment the build successfully completes, a new Release will get triggered:
Once the release completes, the configuration has been automatically uploaded to our Azure Automation DSC account:
All that is left for you to do is assign the compiled configuration to your SharePoint farm. Once that assignment is done, every time you commit a change to the CorporateIntranet.ps1 file, those changes will automatically make their way up to the SharePoint dev farm.
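That assignment can itself be scripted with the AzureRM Automation cmdlets. A sketch, where the VM name is an assumption and CorporateIntranet.localhost is the node configuration produced by compiling the CorporateIntranet configuration for the localhost node:

```powershell
# Register a SharePoint VM with the Azure Automation DSC pull server and
# assign it the compiled node configuration in one step
Register-AzureRmAutomationDscNode -ResourceGroupName "SPTechCon" `
    -AutomationAccountName "SPTechConAADSC" `
    -AzureVMName "SP2019-APP01" `
    -NodeConfigurationName "CorporateIntranet.localhost" `
    -ConfigurationMode "ApplyAndAutocorrect"
```

With ApplyAndAutocorrect, the Local Configuration Manager on the VM will also repair any drift it detects, not just apply new versions of the configuration.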
As you can see from the scenario above, by ensuring IT Admins follow the same Software Development Lifecycle best practices the development team should, we can automate environment changes and reduce snowflake servers by making the process repeatable. Every organization that is serious about DevOps should get their IT Admins on board, sooner rather than later!