Cloud LCM – Prototype for DSC-as-a-Service

I’ve been experimenting over the past few weeks with a project I’ve been planning to work on for years now: the concept of deploying PowerShell Desired State Configuration (DSC) in a server-less mode. In any traditional DSC project, the actual configuration gets executed by a local service that ships with PowerShell, called the Local Configuration Manager (LCM). This is the agent responsible for bringing the machine into its Desired State and keeping it there by running regular checks referred to as Consistency Checks. Since its inception, DSC has always focused on configuring software for on-premises scenarios. However, with the recent introduction of DSC modules that configure and manage Software-as-a-Service, such as the Microsoft365DSC project, traditional deployment processes no longer apply.

One of the main challenges I’ve been facing since starting the Microsoft365DSC project two years ago is that it has always required what I refer to as a middle-man agent. Sure, you can run and deploy a configuration from your own laptop if you wish, but the moment you close the lid, the LCM service that monitors your Microsoft 365 tenant for configuration drift stops. You can instead run it on a server that is kept up and running, but that server simply becomes an extra “middle-man”: all it does at that point is make remote calls to your Microsoft 365 tenant. Of course, you can also run it inside an Azure DevOps pipeline, but this requires you to create scheduled Release pipelines for monitoring and may not be ideal for some organizations. I’ve described this challenge at length in my The PaaS/SaaS DSC Paradigm article.

What I am after here is a true way of running PowerShell DSC to monitor hosted software; something that would truly be agentless. This is where the concept of a Cloud LCM comes into play. The idea is to run DSC inside an Azure Function, thereby removing the requirement for any “middle-man” agent. Now, hosting a DSC agent in an Azure Function is not as straightforward as it might sound to some of you. Azure Functions currently supports running PowerShell 7, and at the time of writing this article, DSC is still not an integral part of PowerShell 7. That version of PowerShell only includes a few of the DSC functions normally found in Windows PowerShell 5.1. The good news, however, is that one of these is the Invoke-DSCResource cmdlet, which allows us to manually call into a DSC resource. Therefore, our solution will need to orchestrate calls to this cmdlet to replicate the behavior of a traditional LCM. By leveraging the DSCParser module, we can parse any PowerShell DSC configuration into an array of PowerShell objects, and then dynamically call into Invoke-DSCResource for each DSC resource block in our configuration. This blog article presents an overview of a prototype I’ve built for what I refer to as the Cloud LCM.
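
To make the idea more concrete, here is a rough local sketch of that core loop, stripped of any Azure Function plumbing (the configuration file name and the Get-Credential prompt are just placeholders for this illustration):

Import-Module PSDesiredStateConfiguration
Import-Module DSCParser

# Parse the DSC configuration into an array of resource blocks (hashtables).
$content   = Get-Content -Path .\CloudLCMDemo.ps1 -Raw
$resources = ConvertTo-DSCObject -Content $content

# Credentials the Microsoft365DSC resources will authenticate with.
$creds = Get-Credential -Message "Global Admin credentials"

foreach ($resource in $resources)
{
    $name = $resource.ResourceName
    $resource.Remove("ResourceName") | Out-Null
    $resource.GlobalAdminAccount = $creds

    # Ask each resource whether the tenant is currently in its desired state.
    Invoke-DSCResource -ModuleName Microsoft365DSC -Name $name -Method Test -Property $resource -Verbose
}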

Overview

My prototype will consist of the following Azure artifacts:

  • Azure Function: A PowerShell 7 HTTP-triggered function that accepts two query string parameters. The first is the Method parameter, which defines which method we wish to call inside our resources (Get, Set or Test). The second is ConfigurationUrl, the URL to our DSC configuration, hosted in an Azure Storage blob and accessible via HTTP.
  • Azure Storage Account: Will host our PowerShell DSC configurations. Each configuration will be uploaded as its own .ps1 file and made publicly available via HTTP. You don’t have to use Azure Storage here, as long as you have a way of making the configuration available over the web.
  • Azure Key Vault: Will store the credentials for our Microsoft 365 administrator’s account as secrets.

Azure Key Vault

For my prototype, I’ve created a new Azure Key Vault named CloudLCM and granted my Azure Function’s Service Principal access to manage Secrets for it. In my case, my Azure Function is named BaaS-Managed.

Create Azure KeyVault
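
If you prefer to script this part instead of using the portal, a rough equivalent with the Az PowerShell module could look like the snippet below (the resource group, location and service principal lookup are placeholders for illustration; adjust them to your own environment):

# Create the Key Vault used by the prototype (names and location are illustrative).
New-AzKeyVault -Name 'CloudLCM' -ResourceGroupName 'CloudLCM-RG' -Location 'EastUS'

# Allow the BaaS-Managed function's service principal to read secrets.
$functionSp = Get-AzADServicePrincipal -DisplayName 'BaaS-Managed'
Set-AzKeyVaultAccessPolicy -VaultName 'CloudLCM' `
    -ObjectId $functionSp.Id `
    -PermissionsToSecrets Get, List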

In my Azure Key Vault, I will define two secrets:

  • UserName: represents the username of my administrator account. E.g. admin@contoso.onmicrosoft.com
  • AccountPassword: represents the password for my administrator’s account.

Key Vault Secret
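
Again, this step can be scripted; a minimal sketch with the Az module, assuming the CloudLCM vault from the previous step, would be:

# Store the admin account's username and password as the two secrets above.
Set-AzKeyVaultSecret -VaultName 'CloudLCM' -Name 'UserName' `
    -SecretValue (ConvertTo-SecureString 'admin@contoso.onmicrosoft.com' -AsPlainText -Force)
Set-AzKeyVaultSecret -VaultName 'CloudLCM' -Name 'AccountPassword' `
    -SecretValue (Read-Host -Prompt 'Account password' -AsSecureString)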

Azure Storage Account

As mentioned previously, I am hosting the PowerShell DSC (.ps1) configuration I wish to apply in an Azure Storage Account. In my case, the configuration is named CloudLCMDemo.ps1. Remember that your configuration doesn’t have to be hosted in Azure Storage; it can be hosted anywhere, as long as you can access it with a URL.

Azure Storage DSC Configuration
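
If you do go with Azure Storage, uploading the configuration only takes a couple of Az cmdlets; here is a sketch (the storage account, resource group and container names are made up for this illustration, and the container is assumed to allow anonymous blob access):

# Upload CloudLCMDemo.ps1 to a publicly readable blob container.
$account = Get-AzStorageAccount -ResourceGroupName 'CloudLCM-RG' -Name 'cloudlcmstorage'
Set-AzStorageBlobContent -File .\CloudLCMDemo.ps1 `
    -Container 'configurations' `
    -Blob 'CloudLCMDemo.ps1' `
    -Context $account.Context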

In my case, my configuration simply contains an Azure AD Groups Lifecycle Policy definition and a Groups Naming Policy with a list of blocked words.


# Generated with Microsoft365DSC version 1.0.5.127
# For additional information on how to use Microsoft365DSC, please visit https://aka.ms/M365DSC
param (
    [parameter()]
    [System.Management.Automation.PSCredential]
    $GlobalAdminAccount
)

Configuration M365TenantConfig
{
    param (
        [parameter()]
        [System.Management.Automation.PSCredential]
        $GlobalAdminAccount
    )

    if ($null -eq $GlobalAdminAccount)
    {
        <# Credentials #>
        $Credsglobaladmin = Get-Credential -Message "Global Admin credentials"
    }
    else
    {
        $Credsglobaladmin = $GlobalAdminAccount
    }

    $OrganizationName = $Credsglobaladmin.UserName.Split('@')[1]
    Import-DscResource -ModuleName Microsoft365DSC

    Node localhost
    {
        AADMSGroupLifecyclePolicy aa1d0235-e1aa-4c52-a496-f96c81f7d2f4
        {
            IsSingleInstance            = "Yes";
            GroupLifetimeInDays         = 819;
            ManagedGroupTypes           = "All";
            AlternateNotificationEmails = "Nik.Charlebois@Microsoft.com";
            Ensure                      = "Present";
            GlobalAdminAccount          = $Credsglobaladmin;
        }
        AADGroupsNamingPolicy GroupsNamingPolicy
        {
            CustomBlockedWordsList        = @("CEO", "President");
            GlobalAdminAccount            = $Credsglobaladmin;
            IsSingleInstance              = "Yes";
            PrefixSuffixNamingRequirement = "[Title]Test[Company][GroupName][Office]Redmond";
        }
    }
}
M365TenantConfig -ConfigurationData .\ConfigurationData.psd1 -GlobalAdminAccount $GlobalAdminAccount

Azure Function

As part of my Azure Function project, I will need to include several PowerShell modules, since these can’t be installed at runtime within an Azure Function using the Install-Module cmdlet. The modules included in my Azure Function project are the Microsoft365DSC module and all of its dependencies (e.g. MicrosoftTeams, DSCParser, ReverseDSC, etc.), as well as the PowerShell 7 PSDesiredStateConfiguration module, which is normally located at C:\Program Files\PowerShell\7\Modules\PSDesiredStateConfiguration. These modules will all be put under a folder named Modules, which I created at the root of my project (see screenshot below).

Add Custom PowerShell Modules to Azure Function
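
For reference, the resulting project layout looks roughly like this (the function app and HttpTrigger folder names are illustrative; what matters is the Modules folder at the root, which Azure Functions adds to the PSModulePath):

MyFunctionApp
├── HttpTrigger
│   ├── function.json
│   └── run.ps1
├── Modules
│   ├── Microsoft365DSC
│   ├── DSCParser
│   ├── ReverseDSC
│   ├── MicrosoftTeams
│   └── PSDesiredStateConfiguration
├── host.json
├── profile.ps1
└── requirements.psd1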

Now, the first thing you will need to do in your function’s code (run.ps1 in my case) is to import the PSDesiredStateConfiguration module. To do so, simply run:


Import-Module PSDesiredStateConfiguration

However, for Microsoft365DSC, we will need to use the -UseWindowsPowerShell switch in order to load it as a Windows PowerShell module. Microsoft365DSC doesn’t yet support PowerShell Core, due to some of its dependencies not supporting it. One odd behavior of the Microsoft365DSC module with PowerShell 7 is that you actually have to explicitly load its AzureADPreview dependency as a Windows PowerShell module before attempting to import Microsoft365DSC.
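
In practice, that means the following two imports, in this order (they also appear in the full listing further down):

Import-Module AzureADPreview -UseWindowsPowerShell -Force
Import-Module Microsoft365DSC -UseWindowsPowerShell -Force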

The overall logic of the function is as follows:

  1. Retrieve the Method query string. If it is not present, default the method to Test.
  2. Import all required PowerShell modules.
  3. Retrieve the ConfigurationUrl property from the query string.
  4. Retrieve the Microsoft 365 administrator’s credentials from Azure Key Vault and create a PSCredential object from them.
  5. Retrieve the content of the Configuration from its URL.
  6. Parse the content of the configuration using DSCParser’s ConvertTo-DSCObject cmdlet.
  7. Loop through each DSC resource block in the parsed configuration. For each resource block:
    1. Update the GlobalAdminAccount property with the credential object generated from the Azure Key Vault secrets.
    2. Retrieve the name of the current DSC Resource from the parsed object.
    3. Remove the ResourceName property from the current resource block’s properties.
    4. Call into the Invoke-DSCResource cmdlet, passing in the ResourceName, the method retrieved from the query string, and all the parsed properties.

My complete function’s code is as follows:


using namespace System.Net

# Input bindings are passed in via param block.
param($Request, $TriggerMetadata)

# Import Required Modules. AzureADPreview has to be explicitly imported as
# a Windows PowerShell module before we can import the Microsoft365DSC module.
Import-Module PSDesiredStateConfiguration -Force
Import-Module AzureADPreview -UseWindowsPowerShell -Force
Import-Module Microsoft365DSC -UseWindowsPowerShell -Force

# Retrieve the method to call (Get/Set/Test) from the query string.
# Default to Test if none is passed.
$Method = "Test"
if (-not [System.String]::IsNullOrEmpty($Request.Query.Method)) {
    $Method = $Request.Query.Method
}

# Retrieve the Configuration's URL from QueryString
$ConfigurationURL = $Request.Query.ConfigurationURL

# Retrieve Credentials from Azure Key Vault
[string]$userName = (Get-AzKeyVaultSecret -VaultName "CloudLCM" -Name "UserName").SecretValueText
[string]$userPassword = (Get-AzKeyVaultSecret -VaultName "CloudLCM" -Name "AccountPassword").SecretValueText
[securestring]$secStringPassword = ConvertTo-SecureString $userPassword -AsPlainText -Force
[pscredential]$creds = New-Object System.Management.Automation.PSCredential ($userName, $secStringPassword)

# Retrieve the configuration's content from the web
$Configuration = Invoke-WebRequest -Uri $ConfigurationURL -UseBasicParsing
# Use Content (the response body only); RawContent would also include the HTTP headers.
$ConfigurationContent = $Configuration.Content

# Parse the DSC content into an array of PowerShell Objects using the DSCParser;
$ConfigurationComponents = ConvertTo-DSCObject -Content $ConfigurationContent

# For each DSC Resource block in the parsed array, replace the credentials by
# the ones retrieved from Azure Key Vault, and dynamically call into the
# Invoke-DSCResource cmdlet.
foreach ($DSCResourceBlock in $ConfigurationComponents) {
    # Replace the credentials by the ones from Azure Key Vault;
    $DSCResourceBlock.GlobalAdminAccount = $creds

    # Obtain the ResourceName property from the parsed DSC content
    $ResourceName = $DSCResourceBlock.ResourceName

    # Remove the ResourceName property, which is created by the DSC parser
    # for usability purposes but is never part of the set of
    # accepted parameters of the resource.
    $DSCResourceBlock.Remove("ResourceName") | Out-Null

    # Dynamically call the Invoke-DSCResource cmdlet.
    Invoke-DSCResource -ModuleName Microsoft365DSC `
        -Name $ResourceName `
        -Property $DSCResourceBlock `
        -Method $Method -Verbose | Out-Null
}
$body = "Successfully called the {$Method} Method on configuration {$ConfigurationURL}"

# Associate values to output bindings by calling 'Push-OutputBinding'.
Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = [HttpStatusCode]::OK
    Body       = $body
})

Putting it into Action

The only thing left to do is to trigger my function by opening up a browser and navigating to my function’s URL. As a demo, I will trigger the function by passing in the Method query string property as Set, telling my function to apply the configuration. In my case, the URL will be https://<Url of my Azure Function>?Method=Set&ConfigurationUrl=https://<URL of my configuration in Azure Storage>. Upon successfully completing its execution, my function will return a message stating that it successfully applied the configuration. If you then navigate to Azure AD, you should see the Lifecycle Policy and blocked words list applied to your tenant.
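
If you’d rather call it from a script than a browser, the same request can be made with Invoke-RestMethod (the two placeholder URLs below are the same ones as above):

# Trigger the Cloud LCM function, asking it to apply (Set) the configuration.
$functionUrl = "https://<Url of my Azure Function>"
$configUrl   = "https://<URL of my configuration in Azure Storage>"
Invoke-RestMethod -Uri "$($functionUrl)?Method=Set&ConfigurationUrl=$configUrl"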
