How to Write your Own ReverseDSC Orchestrator

ReverseDSC is a module that allows you to extract the PowerShell Desired State Configuration out of an existing environment, in order for you to analyze it, onboard it onto DSC, or replicate it somewhere else. ReverseDSC as it stands is a technology-agnostic PowerShell module: it only provides methods that allow you to properly convert extracted values into DSC notation. In order to obtain these values, you need to dynamically call into the Get-TargetResource function of a given DSC Resource.

Every DSC Resource needs to include 3 core functions in order for it to be valid: Get-TargetResource, Set-TargetResource, and Test-TargetResource. For more information on the role of each of these functions, you can consult the readme content on the SharePointDSC.Reverse repository. As explained in my How to use the ReverseDSC Core article, in order to obtain the values of a Resource instance, you need to call Get-TargetResource for it, passing in the mandatory parameters that allow the function to retrieve the instance (e.g. the Primary Key of the instance).
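
As a minimal sketch of that dynamic call (the module path, resource name, and parameter name below are hypothetical and for illustration only; use your resource's actual mandatory parameters):

# Hypothetical resource path, for illustration only.
$resourcePath = "C:\Program Files\WindowsPowerShell\Modules\MyModule\DSCResources\MSFT_MyResource\MSFT_MyResource.psm1"
Import-Module $resourcePath

# Pass the mandatory (key) parameters that identify the instance.
$values = Get-TargetResource -Url "http://sptechcon-wfe1"

# $values is a hashtable of the instance's current key/value pairs.
$values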

An Orchestrator script is responsible for determining these mandatory parameters and for calling the Get-TargetResource function for each instance, to obtain the complete set of key/value pairs for that instance. It then calls the ReverseDSC Core for each of these key/value pairs to obtain the DSC notation, collects them all, and saves them into a resulting .ps1 file. The Orchestrator script is technology specific, meaning that the person writing the script needs to be familiar to some level with the technology stack it is for. As an example, when writing the Orchestrator script for SharePoint and trying to retrieve information about all the Web Applications, you need to know how to call the Get-SPWebApplication cmdlet in order to retrieve the URL (Primary Key) of each Web Application instance.

ReverseDSC is all about community effort, and to help contributors get started I published a new Orchestrator Script Template to allow people to quickly get their script up and running. In the script, you will find several placeholders starting with “[**“. Simply replace these with the values specified to get started. The next thing for you to do is to start writing the set of Read- (Read-Dash) methods in the Reverse Functions section of the template, as sketched below. For every DSC Resource you wish to reverse, you should define a unique Read-Dash function. The template provides a very generic example of how to write that method, but you may wish to refer to existing Orchestrator scripts for more complex scenarios and see how they are done.
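
To give a rough idea, here is a minimal sketch of what such a Read-Dash function can look like for the SharePoint Web Application example mentioned above (the Get-TargetResource parameter names are illustrative and the resource path is hypothetical; Get-DSCBlock is the ReverseDSC Core function that converts the values into DSC notation):

function Read-SPWebApplications
{
    # Hypothetical path to the DSC Resource being reversed.
    $module = "C:\Program Files\WindowsPowerShell\Modules\SharePointDSC\DSCResources\MSFT_SPWebApplication\MSFT_SPWebApplication.psm1"
    Import-Module $module
    $content = ""

    # Use the technology's own cmdlets to discover each instance's primary key.
    foreach ($webApp in Get-SPWebApplication)
    {
        # Illustrative parameters; pass the resource's actual mandatory keys.
        $params = Get-TargetResource -Name $webApp.DisplayName -Url $webApp.Url

        # Let the ReverseDSC Core convert the key/value pairs into DSC notation.
        $content += "        SPWebApplication " + [System.Guid]::NewGuid().ToString() + "`r`n        {`r`n"
        $content += Get-DSCBlock -Params $params -ModulePath $module
        $content += "        }`r`n"
    }
    return $content
}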

The last thing left for you to do once all your Read-Dash functions have been written is to make sure that you are actually calling them from within the Orchestrator function. Try to precede each of these calls with a Verbose output line that will help users identify where the script is at in its execution. Once your script is completed, you should be able to execute it by simply running the .ps1 file within a PowerShell session. In order to properly test your script, make sure that you don’t get any errors running it, but also try to execute the resulting output .ps1 file, which will attempt to compile the .MOF file, and make sure you don’t get errors at compilation time either.
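
Inside the Orchestrator function, those calls can then look something like the following (the Read- function names are the hypothetical ones from the sketch above):

$dscContent = ""

Write-Verbose "Scanning Web Applications..."
$dscContent += Read-SPWebApplications

Write-Verbose "Scanning Site Collections..."
$dscContent += Read-SPSites

# Once every Read- function has run, save the assembled DSC notation to disk.
$dscContent | Out-File -FilePath ".\Output.ps1"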

Should you have any questions or comments regarding the Orchestrator template or how to get started, please use the Issues section on the GitHub repository for the templates.

Add Site Title in the Search Filters

In this article I will be covering the process of adding sites’ titles to the Search Filters of a SharePoint 2013/2016 site. A client I am currently working with has a dozen of what they call “legacy” sites. They went ahead and created a dedicated Result Source to allow people in their organization to search only for content stored in these sites. What they want is for a new “Site Title” section to show up under this Result Source to allow people to filter their search results based on a specific site.

Scenario

The SharePoint web application has a wildcard managed path named “Legacy”, with 3 site collections created:

  • /legacy/HR
  • /legacy/Finances
  • /legacy/Communications

I have a Search Center created at /sites/Search, and a Result Source defined within it that only searches content of sites located under the Legacy managed path.

Process

Create a New Managed Property

By default, searching for documents inside the Legacy Result Source will only provide me with the Time Range slider and Author list as Filters (see Figure below).

In order for us to add sites’ titles as a filter, we first need to ensure the site title has an associated Search Managed Property. To do so, navigate to your Search Service Application in Central Administration and select Search Schema from the left navigation.

SharePoint Search Schema

Once in there, search for a RefinableString property that is not yet mapped to any crawled properties. In my case, I will select RefinableString00.

On the managed property edit page, scroll to the bottom of the page, to the Mappings to crawled properties section. Click on Add a mapping.

On the Crawled property selection, search for and select the ows_SiteName property.

Back on the managed property screen, click OK to save the mapping. Go back to the Search Administration page and initiate a Full Crawl of your content.
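
If you prefer to script the mapping and the crawl, something along these lines should work (a sketch; the Search Service Application name and content source name are assumptions based on common defaults):

# Grab the Search Service Application and the two properties to map.
$ssa = Get-SPEnterpriseSearchServiceApplication -Identity "Search Service Application"
$managed = Get-SPEnterpriseSearchMetadataManagedProperty -SearchApplication $ssa -Identity "RefinableString00"
$crawled = Get-SPEnterpriseSearchMetadataCrawledProperty -SearchApplication $ssa -Name "ows_SiteName"

# Map the crawled property to the managed property.
New-SPEnterpriseSearchMetadataMapping -SearchApplication $ssa -ManagedProperty $managed -CrawledProperty $crawled

# Kick off a full crawl so the new mapping gets populated.
$contentSource = Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa -Identity "Local SharePoint sites"
$contentSource.StartFullCrawl()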

Configure the Filter Panel

Now that the information about our sites’ titles is available as a managed property within Search, we can go ahead and update the refinement (filter) webpart on our Legacy search page to include it as a refiner. To do so, navigate to your search page and edit the page. Click on the Refinement web part and select Edit Web Part.

In the Refinement properties panel on the right hand side, click on Choose Refiners….

In the Available refiners section, find the RefinableString00 property we just modified, and click Add > to add it to the Selected refiners section on the right. Once that is done, change its Display name to Site Title and click OK.

Click OK in the web part properties panel on the right, and save (or check-in) your page. You should now be able to use your new refiner as shown in the Figure below.

Deploy a SharePoint 2016 Standalone VM in Azure using PowerShell Desired State Configuration (DSC)

In the PowerShell Desired State Configuration (DSC) world, you really have two options when it comes down to configuring a machine. You can either use Push mode to manually “push” a DSC script onto a machine’s Local Configuration Manager (LCM), or use a Pull Server, connect your machines to it, and let them obtain their DSC scripts themselves. In the Azure world, we have something called “Azure Automation DSC”, which is effectively a PowerShell Desired State Configuration Pull Server in the cloud, managed as “Software-as-a-Service” (SaaS). With Azure Automation DSC, you can manage both on-premises and Azure VMs by having them connect back to your Azure Automation Account as if it were just a regular DSC Pull Server.

In this article, we will go through the process of setting up a SharePoint 2016 Standalone Virtual Machine using nothing but Azure Automation DSC. The idea is for people to easily create SharePoint Development machines in Azure Infrastructure-as-a-Service (IaaS).

The Scenario

In Azure IaaS I have already set up two VMs:

  • SPTechCon-DC is a Windows Server 2016 VM acting as a Domain Controller for the contoso.com domain.
  • SPTechCon-Share is a Windows Server 2016 VM that acts as a File Share, where I have the installation media for both SQL Server 2016 Enterprise and SharePoint Server 2016. These two shares are exposed at:
    • \\SPTechCon-Share\Share\SQL2016Media\
    • \\SPTechCon-Share\Share\SP2016Media

The Process

By following the steps below in order, you will be able to deploy a new Windows Server 2016 VM in Azure IaaS, have it automatically join the contoso.com domain, and install both SQL Server 2016 and SharePoint 2016 on it. By the end of this process, you will have a fully functioning SharePoint 2016 development VM onto which you can simply install Visual Studio 2017, and use it as the main development environment for developers within your enterprise. This process is completely reusable, and can help ensure your development team all have VMs with a configuration that matches your production environment.

1 – Create a new VM

In this article, this is the only manual step. Of course this could be automated, but for this example I will leave it up to you to decide how you wish to create your VM. In my case, I will be creating my VM with the following specs:

  • Name: SPTechCon-SA
  • OS Version: Windows Server 2016 Datacenter
  • Memory: 7 GB of RAM
  • CPU: 2 cores

To create the VM, start by selecting Windows Server as your Template category.

Creating Azure Windows VM

From the following screen, I select Windows Server 2016 Datacenter.

Make sure you select Resource Manager as your deployment model and click Create.

Azure Resource Manager

Fill in all the mandatory information: give your machine a meaningful name, and make sure you create it as part of the same resource group where your Domain Controller and File Share servers are. Click OK.

Create an Azure Virtual Machine

Choose an appropriate VM size for your environment. In my case, I use a DS11_V2 Standard size. Click Select.

DS11_V2 Azure Machine

Leave the default values on the Settings screen as they are. Click OK.

Review the details for your machine. Click OK.

Summary of Azure VM

Wait a few minutes until you receive notification that the VM was successfully provisioned.

Azure VM being provisioned


2 – Create a New Azure Automation Account

Remember we mentioned that Azure Automation is somewhat of a DSC Pull Server in the cloud. What we need to do next is create an instance of an Azure Automation Account to manage our DSC configurations. Azure Automation Accounts are available in the marketplace; simply do a search for Azure Automation to find and select it.

Create a new Azure Automation Account

Click on it to select it from the marketplace and then click Create.

Azure Automation Account

Give your Azure Automation Account a name, and make sure you select the same Resource Group as the VMs we have created so far in this demo. Click Create.

Setting up Azure Automation

The Azure Automation Account creation process is almost instantaneous; it should only take a few seconds.

3 – Review the Desired State Configuration Script

To configure our Standalone SharePoint box, we will be using the following PowerShell Desired State Configuration (DSC) script. I strongly encourage you to quickly read through it and try to understand what is really happening under the covers. Below is the complete script.

Configuration SharePoint2016StandAlone
{
    param(
        [String]$ParamDomain,
        [String]$ParamInternalDomainControllerIP,
        [String]$ParamMachineName,
        [String]$ParamProductKey,
        [String]$ParamUsername,
        [String]$ParamPassword,
        [String]$ParamShareName
    )

    Import-DSCResource -ModuleName xDSCDomainJoin
    Import-DSCResource -ModuleName xNetworking    
    Import-DSCResource -ModuleName SharePointDSC
    Import-DSCResource -ModuleName xSQLServer    

    $secdomainpasswd = ConvertTo-SecureString $ParamPassword -AsPlainText -Force
    $ParamCredsJoindomain = New-Object System.Management.Automation.PSCredential($ParamUsername, $secdomainpasswd)

    Node $ParamMachineName
    {
        xFirewall SQLFirewallRule
        {
            Name = "AllowSQLConnection"
            DisplayName = 'Allow SQL Connection' 
            Group = 'DSC Configuration Rules' 
            Ensure = 'Present' 
            Enabled = 'True' 
            Profile = ('Domain') 
            Direction = 'InBound' 
            LocalPort = ('1433') 
            Protocol = 'TCP' 
            Description = 'Firewall Rule to allow SQL communication' 
        }

        xDNSServerAddress DNS
        {
            Address = $ParamInternalDomainControllerIP
            AddressFamily = "IPv4"
            InterfaceAlias = "Ethernet 2"
        }

        xDSCDomainJoin Join
        {
            Domain = $ParamDomain
            Credential = $ParamCredsJoindomain
            DependsOn = "[xDNSServerAddress]DNS"
        }

        xSQLServerSetup SQLSetup
        {
            SetupCredential = $ParamCredsJoindomain
            InstanceName = "MSSQLServer"
            SourcePath = "\\$ParamShareName\Share\SQL2016Media\"
            Features = "SQLENGINE,FULLTEXT,RS,AS,IS"
            InstallSharedDir = "C:\Program Files\Microsoft SQL Server"
            SQLSysAdminAccounts = $ParamCredsJoindomain.UserName
            DependsOn = "[xDSCDomainJoin]Join"
        }

        SPInstallPrereqs SP2016Prereqs
        {
            InstallerPath = "\\$ParamShareName\Share\SP2016Media\prerequisiteinstaller.exe"
            OnlineMode = $true
            DependsOn = "[xSQLServerSetup]SQLSetup"
        }

        SPInstall InstallSharePoint 
        { 
             Ensure = "Present" 
             BinaryDir = "\\$ParamShareName\Share\SP2016Media\" 
             ProductKey = $ParamProductKey
             DependsOn = @("[SPInstallPrereqs]SP2016Prereqs", "[xFirewall]SQLFirewallRule")
        } 

        SPCreateFarm CreateSPFarm 
        { 
            DatabaseServer           = $ParamMachineName
            FarmConfigDatabaseName   = "SP_Config" 
            Passphrase               = $ParamCredsJoindomain 
            FarmAccount              = $ParamCredsJoindomain 
            AdminContentDatabaseName = "SP_AdminContent" 
            PsDSCRunAsCredential     = $ParamCredsJoindomain
            ServerRole               = "SingleServerFarm"
            CentralAdministrationPort = 7777
            DependsOn                = "[SPInstall]InstallSharePoint" 
        }

        SPManagedAccount FarmAccount
        {
            AccountName = $ParamCredsJoindomain.UserName
            Account = $ParamCredsJoindomain
            PsDSCRunAsCredential     = $ParamCredsJoindomain
            DependsOn = "[SPCreateFarm]CreateSPFarm"
        }

        SPServiceAppPool SharePoint80
        {
            Name = "SharePoint - 80"
            ServiceAccount = $ParamCredsJoinDomain.UserName
            PsDSCRunAsCredential     = $ParamCredsJoindomain
            DependsOn = "[SPManagedAccount]FarmAccount"
        }

        SPWebApplication RootWebApp
        {
            Name = "RootWebApp"
            ApplicationPool = "SharePoint - 80"
            ApplicationPoolAccount = $ParamCredsJoinDomain.UserName
            Url = "http://$ParamMachineName"
            DatabaseServer = $ParamMachineName
            DatabaseName = "WebApp-SharePoint-80"
            Port = 80
            PsDSCRunAsCredential = $ParamCredsJoinDomain
            DependsOn = "[SPServiceAppPool]SharePoint80"
        }

        SPSite RootSite
        {
            Url = "http://$ParamMachineName"
            OwnerAlias = $ParamCredsJoinDomain.UserName
            Template = "STS#0"         
            PsDSCRunAsCredential = $ParamCredsJoinDomain
            DependsOn = "[SPWebApplication]RootWebApp"
        }
    }
}

Let’s take a closer look at what the script actually defines. Note that the Configuration script expects 7 parameters to be passed at compilation time. These parameters are:

  • ParamDomain (contoso.com): specifies the domain name that our machine will be joining.
  • ParamInternalDomainControllerIP (10.0.10.5): internal IP address of our Domain Controller VM (note that this will likely differ for you).
  • ParamMachineName (SPTechCon-SA): name of the Azure VM we created at Step 1 above.
  • ParamProductKey (XXXXX-XXXXX-XXXXX-XXXXX-XXXXX): your own SharePoint 2016 (Standard or Enterprise) Product Key.
  • ParamUsername (contoso\sp_farm): username for your SharePoint Farm Account.
  • ParamPassword (pass@word1): password for the SharePoint Farm Account used.
  • ParamShareName (SPTechCon-Share): name of the File Share VM.

We will now break down each resource block and give you a quick overview of what it actually does.

The following creates a Domain Firewall rule on port 1433, to allow connections to our SQL Server (in our case hosted on the local machine) in case we wished to add more servers to our farm.

xFirewall SQLFirewallRule
{
    Name = "AllowSQLConnection"
    DisplayName = 'Allow SQL Connection' 
    Group = 'DSC Configuration Rules' 
    Ensure = 'Present' 
    Enabled = 'True' 
    Profile = ('Domain') 
    Direction = 'InBound' 
    LocalPort = ('1433') 
    Protocol = 'TCP' 
    Description = 'Firewall Rule to allow SQL communication' 
}

This block changes the DNS Server IP address to point to our domain controller.

xDNSServerAddress DNS
{
    Address = $ParamInternalDomainControllerIP
    AddressFamily = "IPv4"
    InterfaceAlias = "Ethernet 2"
}

The following joins the machine to the contoso.com domain.

xDSCDomainJoin Join
{
    Domain = $ParamDomain
    Credential = $ParamCredsJoindomain
    DependsOn = "[xDNSServerAddress]DNS"
}

This block installs SQL Server 2016 from our Shared Media Installation location.

xSQLServerSetup SQLSetup
{
    SetupCredential = $ParamCredsJoindomain
    InstanceName = "MSSQLServer"
    SourcePath = "\\$ParamShareName\Share\SQL2016Media\"
    Features = "SQLENGINE,FULLTEXT,RS,AS,IS"
    InstallSharedDir = "C:\Program Files\Microsoft SQL Server"
    SQLSysAdminAccounts = $ParamCredsJoindomain.UserName
    DependsOn = "[xDSCDomainJoin]Join"
}

This block installs the SharePoint 2016 prerequisites. The server will automatically reboot itself once it reaches that step and will resume the DSC configuration process on its own (see the note after the block below).

SPInstallPrereqs SP2016Prereqs
{
    InstallerPath = "\\$ParamShareName\Share\SP2016Media\prerequisiteinstaller.exe"
    OnlineMode = $true
    DependsOn = "[xSQLServerSetup]SQLSetup"
}
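
Note that this automatic reboot-and-resume behavior requires the node’s LCM to allow reboots (in our case, the Reboot Node if Needed option we check later at registration time takes care of it). For reference, in an on-premises Push scenario the equivalent LCM meta-configuration would look something like this sketch:

[DSCLocalConfigurationManager()]
Configuration LCMSettings
{
    Node "localhost"
    {
        Settings
        {
            # Allow DSC to reboot the node and resume the configuration afterwards.
            RebootNodeIfNeeded = $true
        }
    }
}

# Compile the meta-configuration and apply it to the local LCM.
LCMSettings
Set-DscLocalConfigurationManager -Path .\LCMSettings -Verbose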

This block installs the actual SharePoint 2016 bits on the machine.

SPInstall InstallSharePoint 
{ 
    Ensure = "Present" 
    BinaryDir = "\\$ParamShareName\Share\SP2016Media\" 
    ProductKey = $ParamProductKey
    DependsOn = @("[SPInstallPrereqs]SP2016Prereqs", "[xFirewall]SQLFirewallRule")
} 

This block creates the SharePoint Farm. Think of it as being the equivalent of running PSConfig.

SPCreateFarm CreateSPFarm 
{ 
    DatabaseServer           = $ParamMachineName
    FarmConfigDatabaseName   = "SP_Config" 
    Passphrase               = $ParamCredsJoindomain 
    FarmAccount              = $ParamCredsJoindomain 
    AdminContentDatabaseName = "SP_AdminContent" 
    PsDSCRunAsCredential     = $ParamCredsJoindomain
    ServerRole               = "SingleServerFarm"
    CentralAdministrationPort = 7777
    DependsOn = "[SPInstall]InstallSharePoint" 
}

This block creates a SharePoint Managed Account for our farm admin.

SPManagedAccount FarmAccount
{
    AccountName = $ParamCredsJoindomain.UserName
    Account = $ParamCredsJoindomain
    PsDSCRunAsCredential     = $ParamCredsJoindomain
    DependsOn = "[SPCreateFarm]CreateSPFarm"
}

This block creates a SharePoint Application Pool for the Web Application we are about to create.

SPServiceAppPool SharePoint80
{
    Name = "SharePoint - 80"
    ServiceAccount = $ParamCredsJoinDomain.UserName
    PsDSCRunAsCredential     = $ParamCredsJoindomain
    DependsOn = "[SPManagedAccount]FarmAccount"
}

This block should be self-explanatory: it creates a SharePoint Web Application on port 80.

SPWebApplication RootWebApp
{
    Name = "RootWebApp"
    ApplicationPool = "SharePoint - 80"
    ApplicationPoolAccount = $ParamCredsJoinDomain.UserName
    Url = "http://$ParamMachineName"
    DatabaseServer = $ParamMachineName
    DatabaseName = "WebApp-SharePoint-80"
    Port = 80
    PsDSCRunAsCredential = $ParamCredsJoinDomain
    DependsOn = "[SPServiceAppPool]SharePoint80"
}

This last block simply creates a Site Collection at the root of our Web Application.

SPSite RootSite
{
    Url = "http://$ParamMachineName"
    OwnerAlias = $ParamCredsJoinDomain.UserName
    Template = "STS#0"         
    PsDSCRunAsCredential = $ParamCredsJoinDomain
    DependsOn = "[SPWebApplication]RootWebApp"
}

What we need to do now is upload this DSC configuration into our Azure Automation Account. To do this, start by navigating to your Automation Account and click on DSC Configurations.

DSC Configuration Node

Click on Add a configuration.

Upload DSC Configuration

Click on the folder icon and browse to the .ps1 script above. Note that you will need to copy the complete script and save it locally first. Click on OK.

Our DSC Configuration Script is now in the Cloud, contained within our Azure Automation Account.
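
As an alternative to the portal, the upload can also be scripted with the AzureRM Automation cmdlets (a sketch; the local path is a placeholder, and the resource group and account names are the ones used later in this article):

Login-AzureRmAccount

# Upload and publish the local .ps1 configuration into the Automation Account.
Import-AzureRmAutomationDscConfiguration -SourcePath "C:\Scripts\SharePoint2016StandAlone.ps1" `
    -ResourceGroupName "SPTechCon" `
    -AutomationAccountName "SPTechCon-Automation" `
    -Published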

4 – Import the Required DSC Modules

If you paid close attention to our full script above, you’ll have noticed that it imports 4 different DSC modules:

  • xDSCDomainJoin
  • xNetworking
  • SharePointDSC
  • xSQLServer

However, by default Azure Automation knows nothing about these modules. We need to import them first for Azure Automation to be able to properly configure our servers. Think of this as being the equivalent of putting the required modules and resources on a Pull Server for your registered nodes to consume (in an on-premises type of scenario). To import a resource, you need to go back to the main Azure Automation screen and click on Assets.

On the next screen, select Modules.

Adding Azure Automation Module

From here we have two choices: upload the required modules as individual .zip files from our local machine, or import them from the PowerShellGallery.com repository. In my case, I chose to import them from the gallery, so I need to click on Browse gallery.

PowerShell Gallery Import

In the search box, type in the name of the first module we are trying to import: xDSCDomainJoin. Select the proper module from the search results by clicking on it.

Import DSC Domain Join

To finalize the import process, click on the Import icon.

Module imported

On the next screen, simply click OK.

Imported Module DSC

Repeat the same import process with the remaining missing modules: xNetworking, SharePointDSC, and xSQLServer.
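
The imports can also be scripted. The sketch below pulls each module into the Automation Account straight from the PowerShell Gallery (the package URL pattern is an assumption to verify against the gallery; omitting a version pulls the latest published package):

$modules = @("xDSCDomainJoin", "xNetworking", "SharePointDSC", "xSQLServer")

foreach ($module in $modules)
{
    # ContentLink must point at the module's .nupkg in the gallery.
    New-AzureRmAutomationModule -ResourceGroupName "SPTechCon" `
        -AutomationAccountName "SPTechCon-Automation" `
        -Name $module `
        -ContentLink "https://www.powershellgallery.com/api/v2/package/$module"
}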

5 – Initiate Compilation Job

In an on-premises scenario, you need to invoke the Configuration function defined in your DSC script in order for it to get “compiled” into a .MOF file. In Azure Automation, this is normally done by clicking on your DSC Configuration (uploaded at Step 3 above) and then clicking Compile. However, in our case, we have credentials that need to be passed to our configuration. Therefore, instead of manually initiating the compilation job from the Azure Portal, we will use a local PowerShell script to remotely initiate the compilation job, allowing us to pass in parameters.

We will be using the following PowerShell script to remotely initiate that compilation job. Note that these are all the parameters we mentioned previously that are simply passed up to my Azure Automation Account.

$ProductKey = Read-Host "Please enter your SharePoint 2016 Product Key"
$MachineName = "SPTechCon-SA"
$ConfigData = @{
    AllNodes = @(
        @{
            NodeName = $MachineName
            PSDscAllowPlainTextPassword = $True
        }
    )
}

$Parameters = @{
    ParamDomain = "contoso.com"
    ParamInternalDomainControllerIP = "10.0.10.5"
    ParamMachineName = $MachineName
    ParamProductKey = $ProductKey
    ParamUsername = "contoso\sp_farm"
    ParamPassword = "pass@word1"
    ParamShareName = "SPTechCon-Share"
}

Login-AzureRmAccount
Start-AzureRmAutomationDscCompilationJob -ResourceGroupName "SPTechCon" -AutomationAccountName "SPTechCon-Automation" -ConfigurationName "SharePoint2016StandAlone" -ConfigurationData $ConfigData -Parameters $Parameters

Upon executing this script, you will get prompted to enter your Azure credentials, which you’ll need to do in order for the compilation job to get queued up.

Provide Azure credentials

The script should only take a second or two to execute and will automatically initiate a compilation job in Azure.

Azure Automation DSC Compilation

Give Azure about 5 minutes to initiate and finalize the compilation. Once the job has completed, the compilation status will get updated to Completed.

Completed Compilation of Azure DSC
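
If you would rather poll the job status from PowerShell than watch the portal, a sketch like the following should do it:

# List compilation jobs for our configuration and show the most recent one.
Get-AzureRmAutomationDscCompilationJob -ResourceGroupName "SPTechCon" `
    -AutomationAccountName "SPTechCon-Automation" `
    -ConfigurationName "SharePoint2016StandAlone" |
    Sort-Object -Property CreationTime -Descending |
    Select-Object -First 1 -Property Status, CreationTime, EndTime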

6 – Register a DSC Node

If we recap what we have done so far, we started off by creating a new Azure IaaS VM that we wish to configure as a SharePoint 2016 Standalone development machine. We then wrote the Desired State Configuration script for it, and uploaded and compiled it in Azure. Now what we need to do is actually associate the VM we created with the DSC script we’ve uploaded. To do this, go back to your Azure Automation Account’s main page, and this time click on DSC Nodes.

Register an Azure DSC Node

Azure Automation gives you the option of managing both Azure and on-premises Virtual Machines. On-premises Virtual Machines will be covered in another article, when time permits. In our case we want to register an existing Azure VM. Click on Add Azure VM.

Register Azure VM with DSC

Azure Automation will then ask you for two things: the VM you wish to register, and the DSC Configuration to associate with it. Start off by clicking on Virtual Machines, and select the Virtual Machine we created from the list (in my case SPTechCon-SA). Click OK. One interesting thing to note here is that because we have generalized our DSC script (meaning no values are hardcoded in it), we can easily select multiple VMs in this step and they will each get assigned the exact same configuration.

Associate Azure VM with Automation DSC in Azure

Now that you have selected your VM, it’s time to pick the DSC Configuration we wish to deploy onto it. Click on Registration. From the Node Configuration Name list, pick the Node Configuration we compiled previously. The rest of the properties listed on the page should look familiar to you; they represent the LCM settings that can normally be set via PowerShell in on-premises scenarios. Leave everything as default, with the exception of the Reboot Node if Needed checkbox, which absolutely needs to be checked for the installation to complete properly. Click OK.

Associate Configuration

The last thing left for us to do now is initiate the registration process by clicking on Create.

Initiate DSC Registration

Now sit back and relax; your machine is going to go and configure itself. Depending on several factors (machine size, region, etc.), the process may take up to 45 minutes to complete.
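
For the record, this whole registration step can also be done from PowerShell (a sketch; note that the node configuration name follows the <ConfigurationName>.<NodeName> pattern produced by the compilation job):

# Register the Azure VM against the Automation Account's pull endpoint
# and assign it the compiled node configuration in a single call.
Register-AzureRmAutomationDscNode -ResourceGroupName "SPTechCon" `
    -AutomationAccountName "SPTechCon-Automation" `
    -AzureVMName "SPTechCon-SA" `
    -NodeConfigurationName "SharePoint2016StandAlone.SPTechCon-SA" `
    -RebootNodeIfNeeded $true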

How Does it Work?

If you were to connect to your VM before registering it to the Azure Automation Account and run the Get-DSCLocalConfigurationManager cmdlet on it, you would see that by default the machine’s LCM is set to PUSH mode.

Get-DSCLocalConfigurationManager

Upon registering your machine against the Azure Automation Account, a DSC Extension is assigned to your VM. That extension will automatically change the configuration of the machine’s LCM to set it in PULL mode and register it against your Azure Automation’s Pull Server endpoint.

DSC Azure Pull mode
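
You can verify the switch yourself by re-running the cmdlet on the VM after registration; the RefreshMode should now read Pull:

# Inspect the LCM settings; RefreshMode is PUSH before registration, PULL after.
Get-DscLocalConfigurationManager | Select-Object RefreshMode, ConfigurationMode, RebootNodeIfNeeded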

Deploying a Multi-Server SharePoint Farm with PowerShell Desired State Configuration

In this article we will cover the process of writing a DSC Configuration with SharePointDSC and deploying it to multiple servers to create a SharePoint 2016 farm (note that this would also work for SharePoint 2013 with minor changes to the SPFarm block). We will be using the Push refresh mode, meaning that the Configuration will be manually applied to our servers rather than centrally managed by a Pull Server. Using this approach, if our configuration was to change, it would need to be manually re-pushed onto the servers in the farm. While I believe that in most cases you will wish to use the Pull refresh mode along with an enterprise Pull Server to manage your SharePoint deployments, we are using Push mode to keep things simple for the sake of this article.

The article will cover a scenario I will be demonstrating over at SPTechCon Austin next week. In this demo, I have a very small multi-server SharePoint 2016 farm that is made up of one dedicated SQL Server 2016 machine (SPTechCon-SQL), one Web Front-End (SPTechCon-WFE1), and one Application Server (SPTechCon-APP1). The configuration script will be built on a separate machine named SPTechCon-Pull and will be remotely applied to the servers in my farm from that machine. The figure below gives you a complete overview of the landscape of my demo.

Every server in my farm is a Windows Server 2016 Datacenter instance. As mentioned above, SPTechCon-SQL has SQL Server 2016 installed on it, and both SharePoint boxes (SPTechCon-WFE1 and SPTechCon-APP1) have the SharePoint 2016 bits installed on them (PSConfig was not run, just the SP2016 bits were installed).

As part of the demo, I will be using the following DSC configuration script to deploy my farm:

Configuration SPTechCon-OnPrem
{
    Import-DSCResource -ModuleName SharePointDSC

    $farmAccount = Get-Credential -UserName "contoso\sp_farm" -Message "Farm Account"
    $adminAccount = Get-Credential -UserName "contoso\sp_admin" -Message "Admin Account"

    Node SPTechCon-WFE1
    {
        SPFarm SPFarm 
        { 
            DatabaseServer           = "SPTechCon-SQL"
            FarmConfigDatabaseName   = "SP_Config" 
            Passphrase               = $farmAccount 
            FarmAccount              = $farmAccount
            AdminContentDatabaseName = "SP_AdminContent" 
            PsDSCRunAsCredential     = $farmAccount
            ServerRole               = "WebFrontEnd"
            Ensure                   = "Present"
            RunCentralAdmin          = $true
            CentralAdministrationPort = 7777
        }

        SPManagedAccount FarmAccount
        {
            AccountName = $farmAccount.UserName
            Account = $farmAccount
            PsDSCRunAsCredential     = $farmAccount
            DependsOn = "[SPFarm]SPFarm"
        }

        SPManagedAccount AdminAccount
        {
            AccountName = $adminAccount.UserName
            Account = $adminAccount
            PsDSCRunAsCredential     = $farmAccount
            DependsOn = "[SPFarm]SPFarm"
        }

        SPServiceAppPool SharePoint80
        {
            Name = "SharePoint - 80"
            ServiceAccount = $adminAccount.UserName
            PsDSCRunAsCredential     = $farmAccount
            DependsOn = "[SPManagedAccount]FarmAccount"
        }

        SPWebApplication RootWebApp
        {
            Name = "RootWebApp"
            ApplicationPool = "SharePoint - 80"
            ApplicationPoolAccount = $adminAccount.UserName
            Url = "http://SPTechCon-WFE1"
            DatabaseServer = "SPTechCon-SQL"
            DatabaseName = "WebApp-SharePoint-80"
            Port = 80
            PsDSCRunAsCredential = $farmAccount
            DependsOn = "[SPServiceAppPool]SharePoint80"
        }

        SPSite RootSite
        {
            Url = "http://SPTechCon-WFE1"
            OwnerAlias = $adminAccount.UserName
            Template = "STS#0"         
            PsDSCRunAsCredential = $farmAccount
            DependsOn = "[SPWebApplication]RootWebApp"
        }
    }
    Node SPTechCon-APP1
    {
        SPFarm SPFarm 
        { 
            DatabaseServer           = "SPTechCon-SQL"
            FarmConfigDatabaseName   = "SP_Config" 
            Passphrase               = $farmAccount 
            FarmAccount              = $farmAccount
            AdminContentDatabaseName = "SP_AdminContent" 
            PsDSCRunAsCredential     = $farmAccount
            ServerRole               = "Application"
            Ensure                   = "Present"
            RunCentralAdmin          = $false
        }
        SPServiceInstance BusinessDataConnectivityServiceInstance
        {
            Name = "Business Data Connectivity Service";
            Ensure = "Present";
            PsDSCRunAsCredential = $farmAccount;
        }
    }
}
$ConfigData = @{
    AllNodes = @(
        @{
            NodeName = "SPTechCon-WFE1"
            PSDscAllowPlainTextPassword = $True
            PSDscAllowDomainUser = $true
        },
        @{
            NodeName = "SPTechCon-APP1"
            PSDscAllowPlainTextPassword = $True
            PSDscAllowDomainUser = $true
        }
    )
}
SPTechCon-OnPrem -ConfigurationData $ConfigData
Start-DSCConfiguration SPTechCon-OnPrem -Wait -Verbose -Force

In summary, that script will automatically create the SharePoint 2016 farm, assign the Web Front-End MinRole to SPTechCon-WFE1, create a Web Application on port 80 and a root site collection, then add SPTechCon-APP1 to the farm with the Application MinRole and have it run the Business Data Connectivity Service. Nothing too complicated, again to keep the demo focused and concise. Now, if you take a look at the last two lines of the script, you see that we are passing ConfigurationData to our Configuration to ensure passwords can be passed as plain text. In an enterprise context, you will normally want to encrypt the credentials using a certificate. In our case here, I omitted to specify a certificate for simplicity’s sake.
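
For reference, encrypting the credentials instead would mean pointing each node at a certificate, roughly like the sketch below (the certificate path and thumbprint are placeholders for a real certificate whose private key is installed on each node):

$ConfigData = @{
    AllNodes = @(
        @{
            NodeName = "SPTechCon-WFE1"
            # Public key the compiler uses to encrypt credentials in the MOF.
            CertificateFile = "C:\Certificates\DscPublicKey.cer"
            # Thumbprint of the matching certificate on the node; its LCM
            # uses the private key to decrypt the credentials at run time.
            Thumbprint = "0000000000000000000000000000000000000000"
        }
    )
}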

Let us now head onto our SPTechCon-Pull machine, which is nothing but a Windows 10 machine hosted on the same domain as the servers to be configured. You will need to make sure that this machine has the SharePointDSC module installed on it, because it is required in order for us to compile our MOF files from this machine. Installing the module is as easy as running the following cmdlet if your machine has internet connectivity:

Install-Module SharePointDSC

If your machine doesn’t have internet connectivity, then you will need to manually copy the SharePointDSC module onto the machine, under C:\Program Files\WindowsPowerShell\Modules\.

Upon executing the script above, you will be prompted to enter the credentials for the Farm account, which is required to execute the configuration steps, as well as the admin account’s credentials, which I use as the owner of my site collection. Using remoting, PowerShell will remotely contact the two SharePoint servers and initiate the configuration. After a few minutes, you should see that the execution completed (see figure below).

You should also be able to navigate to the site collection we created (in our case at http://sptechcon-wfe1/) or to central administration, which based on our script, is hosted on SPTechCon-WFE1 and exposed through port 7777.

What is important for you to take away from this example is that you don’t have to physically go on each server node that is part of your DSC configuration script in order to push the configuration onto it. This can all be done remotely from a machine that is external to the farm, as long as that machine has the SharePointDSC bits installed on it.
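
Once the push has completed, you can also verify the servers’ state remotely from that same machine; a sketch like this should work:

# Ask each node whether it is still compliant with the configuration it received.
Test-DscConfiguration -ComputerName "SPTechCon-WFE1", "SPTechCon-APP1" -Verbose

# Or review the outcome of the last configuration run on a given node.
$session = New-CimSession -ComputerName "SPTechCon-APP1"
Get-DscConfigurationStatus -CimSession $session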

Content Type Hub Packages

Every now and then I like to spend some time understanding the internals of some of the various components that make up SharePoint. This week, while troubleshooting an issue at a customer’s, I decided to crack open the Content Type Hub to see how exactly Content Types get published down to subscriber site collections. In this article I will explain in detail the process involved in publishing a Content Type from the Content Type Hub to the various site collections that consume it.

First off, let us be clear: the Content Type Hub is nothing more than a regular site collection onto which the Content Type Syndication Hub feature (a site collection feature) has been activated.

The moment you activate this feature onto your site collection, a new hidden list called “Shared Packages” will be created in the root web of that same site collection.

The moment you activate the feature, that list will be empty and won’t contain any entries. You can view the list by navigating to http://<Content Type Hub Url>/Lists/PackageList/AllItems.aspx.

However, the moment you publish a Content Type in the Hub, you will see an entry for that Content Type appear.
Publish a SharePoint Content Type
SharePoint Content Type Hub Package

In the two figures above, we can see that we have published Content Type “Message”, which has an ID of 0x0107. Therefore the entry that gets created in the Shared Packages list has that same ID value (0x0107) set as its “Published Package ID” column. “Published Package ID” will always represent the ID of the Content Type that was recently Published/Updated/Unpublished and which is waiting to be synchronized down to the subscriber site collections. The “Taxonomy Service Store ID” column contains the ID of the Managed Metadata Service that is used to syndicate the Content Type Hub changes to the subscriber sites. In my case, if I navigate to my instance of the Managed Metadata Service that takes care of the syndication process and look at its URL, I can see that its ID matches the value of that column (see the two figures below).

SharePoint Taxonomy Service Store ID
Managed Metadata Service Application ID

The “Taxonomy Service Name” is self-explanatory; it represents the display name of the Managed Metadata Service Application instance represented by the “Taxonomy Service Store ID” (see the two figures below).

SharePoint Taxonomy Service Name
SharePoint Managed Metadata Service Application Name

The “Published Package Type” column will always have a value of “{B4AD3A44-D934-4C91-8D1F-463ACEADE443}” which means it is a “Content Type Syndication Change”.
SharePoint Published Package Type

The last column, “Unpublished”, is a Boolean value (true or false) that indicates whether the operation that was added to the queue is a Publish/Update, in which case the value will be set to “No”, or an Unpublish operation, in which case the value will be set to “Yes”. The two figures below show the results of sending an “Unpublish” operation on a previously published Content Type to the queue.
SharePoint Unpublish a Content Type

Now what is really interesting is that even after the subscriber job (the Content Type Subscriber timer job) has finished running, entries in the “Shared Packages” list persist. In fact, these are required for certain operations in the Hub to work as expected. For example, when you navigate to the Content Type Publishing page, if there are no entries in the “Shared Packages” list for that content type, you will never get “Republish” and “Unpublish” as options. The page queries the list to see if there is an entry for the given Content Type proving it was published at some point before offering the option to undo the publishing.

To better demonstrate what I am trying to explain, take the following scenario. Go to your Content Type Hub and publish the “Picture” Content Type. Once published, simply go back to that same “Content Type Publishing” page. You will be presented with only two options: Republish or Unpublish. The “Publish” option will be greyed out, because SharePoint assumes that since you have an entry in the Shared Packages list marked with “Unpublished = No”, the Content Type has already been published. Therefore you can only “Republish” or “Unpublish” it. Now navigate to the “Shared Packages” list and delete the entry for that Content Type. Once the entry has been deleted, navigate back to the Content Type Publishing page for the Picture Content Type. The “Publish” option is now enabled, and the “Republish” and “Unpublish” ones are disabled. That is because SharePoint couldn’t find proof in the “Shared Packages” list that this Content Type was published in the past.

Also, if you were to publish a Content Type and later unpublish it, you would not end up with two entries in the “Shared Packages” list (one for the publish and one for the unpublish); the Unpublish operation simply updates the existing entry in the “Shared Packages” list and sets its “Unpublished” flag to “Yes”.

If you were to create a custom Content Type, publish it, and then delete it, SharePoint is not going to automatically remove its associated entry in the “Shared Packages” list. Instead, the next time the “Content Type Hub” timer job runs, it will update the associated entry and set its “Unpublished” flag to “Yes”, making sure that the deleted Content Type never makes it down to the subscriber Site Collections.

How Does Synchronization Work?

By now you are probably wondering how the synchronization process works between the Hub and the subscriber Site Collections if entries are always persisted in the “Shared Packages” list. The way this process works is actually quite simple. The “Content Type Subscriber” timer job is the one responsible for that operation. By default that timer job runs on an hourly basis and indirectly (via Web Services) queries the “Shared Packages” list to retrieve all changes that have to be synchronized. The root web of every Site Collection that subscribes to the Content Type Hub exposes a property called “metadatatimestamp” that represents the last time the Content Type Gallery for that given Site Collection was synchronized. The following PowerShell script can help you obtain that value for any given subscriber Site Collection.

$url = Read-Host "URL for your subscriber Site Collection"
$site = Get-SPSite $url
Write-Host $site.RootWeb.Properties["metadatatimestamp"]

When the “Content Type Subscriber” timer job runs, it goes through every subscriber Site Collection, retrieves its “metadatatimestamp” value, and queries the “Shared Packages” list, passing that time stamp. The query then returns only the entries whose “Last Modified” date is more recent than that time stamp. Upon receiving the list of changes to sync, the timer job retrieves the Content Type information associated with the listed changes from the Hub and applies them locally to the subscriber Site Collection’s Content Type gallery. Once it finishes synchronizing a Site Collection, it updates that site’s “metadatatimestamp” to reflect the new synchronization time.

If you really wanted to, you could force every single Content Type listed in the “Shared Packages” list to be synchronized down to a given Site Collection by emptying the “metadatatimestamp” property on that site. As an example, when you create a new Site Collection, its root web won’t have that value set, and therefore every Content Type that was ever published in the Hub will make its way down to that new Site Collection. Using the interface, you can also blank out that property by going to the Content Type Publishing page and selecting the option to “Refresh all published content types on next update”. All that this option does is empty the value of that property on the site.

Refresh all published content types on next update
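
If you wanted to script that same reset instead of using the page, a sketch like the following should do it (it simply blanks the same property the page does; test on a non-production site first):

$url = Read-Host "URL for your subscriber Site Collection"
$site = Get-SPSite $url
$rootWeb = $site.RootWeb

# Blanking the time stamp forces a full re-sync on the next timer job run.
$rootWeb.Properties["metadatatimestamp"] = [string]::Empty
$rootWeb.Properties.Update()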

Document Sets

Let’s now look at a more complex scenario (which is really why I started taking a closer look at the publishing process in the first place). Let us investigate what happens if we publish Content Types that inherit from the Document Set Content Type. In my example, I’ve created a new custom Content Type named “DocSetDemo”. This custom Content Type, since it inherits from its Document Set parent, defines a list of Allowed Content Types. In my case, I only allow “Picture” and “Image” as Allowed Content Types.
Allowed Content Types

The moment you try to publish this custom Content Type, you will get the following prompt, which lets you know that every Allowed Content Type identified within your custom Content Type will also be published.

What that truly means is that not only is SharePoint going to create an entry in the “Shared Packages” list for your custom Content Type, it will also create one for every Allowed Content Type identified. In the figure below, we can see that after publishing my custom Content Type “DocSetDemo”, which has an ID of 0x0120D5200049957D530FA0554787CFF785C7C5C693, there are 3 entries in the list: one for my custom Content Type itself, one for the Image Content Type (ID of 0x0101009148F5A04DDD49CBA7127AADA5FB792B00AADE34325A8B49CDA8BB4DB53328F214), and one for the Picture Content Type (ID of 0x010102).