This article describes how to upload a PowerShell DSC module into an Azure Automation account using PowerShell. When using the Azure portal interface, you have two options to upload a module: import it from the PowerShell Gallery, or upload a zip file from your computer.
Import Module from the PowerShell Gallery
Importing a DSC module from the PowerShell Gallery requires you to know the module's package URL. You can find it by navigating to the PowerShell Gallery, locating the module you want to import, and determining the version you need, then generating the package URL from the base address https://www.powershellgallery.com/api/v2/package/ followed by the module name and version.
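As a quick illustration, the package URL can be built by appending the module name and version to the base address (using the SharePointDSC 2.3.0.0 example from this article):

```
# Pattern: https://www.powershellgallery.com/api/v2/package/<ModuleName>/<Version>
$moduleName    = "SharePointDSC"
$moduleVersion = "2.3.0.0"
$packageUrl    = "https://www.powershellgallery.com/api/v2/package/$moduleName/$moduleVersion"
# $packageUrl is now https://www.powershellgallery.com/api/v2/package/SharePointDSC/2.3.0.0
```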
Login-AzureRMAccount
New-AzureRMAutomationModule -Name "SharePointDSC" -ResourceGroupName "TBD" -AutomationAccountName "NikDemo" -ContentLinkUri "https://www.powershellgallery.com/api/v2/package/SharePointDSC/2.3.0.0"
This will automatically initiate an import process within your Azure Automation Account as shown in the following screenshot:
Upload Module from Local Computer
Uploading a custom module from the local computer is a little trickier than importing one from the gallery. As we've seen in the section above, the New-AzureRMAutomationModule cmdlet takes a URL for its ContentLinkUri parameter. Therefore, in order to import our custom module, the zip file needs to be accessible from the web. The simplest solution here is to upload it to a Blob Storage account, and then call the cmdlet passing in that URL. The following PowerShell script will create a new storage account named "modulerepo" in our existing TBD resource group and create a container named "blobstorage" within it:
$storageAccount = New-AzureRmStorageAccount -ResourceGroupName "TBD" -Name "modulerepo" -Location "EastUS" -SkuName "Standard_GRS" -Kind "BlobStorage" -AccessTier Hot
$ctx = $storageAccount.Context
Set-AzureRmCurrentStorageAccount -ResourceGroupName "TBD" -Name "modulerepo"
New-AzureStorageContainer -Name "blobstorage" -Context $ctx -Permission blob
If we were to go and verify within the Azure portal that everything was created as expected, we would see the following in the Storage Account section:
The next step is to upload our zipped module into that newly created container. That can be achieved with the following lines of PowerShell. In my example, I will be using a local module named xDownloadISO.
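If the module is not yet packaged as a zip file, it can be compressed first. A minimal sketch, assuming the module folder lives at C:\Modules\xDownloadISO (the folder inside the zip should carry the module's name so Azure Automation can resolve it):

```
# Compress the module folder itself (not just its contents), so the zip
# contains a top-level "xDownloadISO" folder matching the module name.
Compress-Archive -Path "C:\Modules\xDownloadISO" -DestinationPath "C:\Modules\xDownloadISO.zip" -Force
```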
$xDownloadISOPath = "C:\Modules\xDownloadISO.zip"
$blob = Set-AzureStorageBlobContent -Container "blobstorage" -File $xDownloadISOPath -Blob "xDownloadISO.zip" -Force
We can now verify in the portal that our file was properly uploaded to our blob container, as shown in the following screenshot:
Now that our zip file is available on the internet, we can call the New-AzureRMAutomationModule cmdlet, passing in that URL as follows:
$xDownloadISOUrl = $blob.ICloudBlob.Uri.AbsoluteUri
New-AzureRmAutomationModule -Name "xDownloadISO" -ResourceGroupName "TBD" -AutomationAccountName "NikDemo" -ContentLinkUri $xDownloadISOUrl
In the Azure portal, verify that the module was properly imported into the Azure Automation account. The following screenshot shows that my xDownloadISO custom module was successfully imported into my account.
We are now ready to go and deploy DSC configurations that can make use of those new DSC modules!
Is this still accurate? I did it and it could never find the blob because of permissions, but I didn’t want to make the module world readable (it’s proprietary code), so I had to generate a SAS signature and pass that to New-AzAutomationModule.
$SASToken = New-AzStorageBlobSASToken -CloudBlob $result.ICloudBlob -Context $storageAccountContext -FullUri -ExpiryTime (Get-Date).AddMinutes(5) -Permission r
New-AzAutomationModule -Name 'mymodule' -ResourceGroupName 'mygroup' -AutomationAccountName 'myaccountname' -ContentLink $SASToken
Hi Justin,
This is still applicable, but it mostly applies to public resources in the PowerShell Gallery, as you already figured out. A SAS token would be the way to go for private blobs. You could even use Azure DevOps pipelines (if available) to automate this further, and use Key Vault to store the token for additional security.
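For completeness, here is a sketch of the private-blob flow end to end using the newer Az cmdlets, combining the upload step from the article with the SAS approach from the comment above. The resource names ('mygroup', 'myaccountname', the paths) are placeholders, and the storage account and container names are reused from the article:

```
# Sketch: upload a private module zip, then import it via a short-lived SAS URL.
Connect-AzAccount
$ctx = (Get-AzStorageAccount -ResourceGroupName "mygroup" -Name "modulerepo").Context
# Upload without making the container/blob publicly readable.
$result = Set-AzStorageBlobContent -Container "blobstorage" -File "C:\Modules\mymodule.zip" -Blob "mymodule.zip" -Context $ctx -Force
# Read-only SAS URL valid for 5 minutes; -FullUri returns the blob URL with the token appended.
$sasUrl = New-AzStorageBlobSASToken -CloudBlob $result.ICloudBlob -Context $ctx -FullUri -ExpiryTime (Get-Date).AddMinutes(5) -Permission r
New-AzAutomationModule -Name "mymodule" -ResourceGroupName "mygroup" -AutomationAccountName "myaccountname" -ContentLinkUri $sasUrl
```

The short expiry keeps the window of exposure small: Azure Automation only needs the URL long enough to fetch the zip at import time.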