Will Francillette

M365DSC: Conditional Access Monitoring Automation

Updated: Mar 7


microsoft 365 dsc

In the previous blog, we used the Microsoft 365 DSC module to monitor Entra Conditional Access policy drift. This works fine in a standalone environment, but for a wider and more flexible deployment we want to take advantage of M365DSC's original Configuration-as-Code design and, by extension, its integration with DevOps and automation.

In this blog, we will automate the provisioning of Azure resources and configure them to monitor our workload and generate Sentinel alerts in a no-touch approach.


You may ask, why would we do such a thing?

1- Maintenance:

M365DSC is updated every week, so to minimize cost and effort we should automate the update process.

2- Redeployment:

What if you wanted to redeploy this setup in multiple environments or different tenants?

3- Scope:

What if we wanted to monitor more than Conditional Access and split M365DSC into workloads as I mentioned in a previous blog?

4- Centralized management:

What if you wanted to manage all your frameworks from a single place?


This project is only a proof of concept and is subject to improvement. I already have a few ideas in mind, which I will work on next.


The first part of this post describes all the different components of the project so that we can understand the purpose of each element.

The second part describes the configuration and deployment.


I've also decided to use a few pre-existing resources, such as the Sentinel instance and analytics rules, because they are a one-off configuration.


Now let's get started!




The Project

1. The source repo

The project has been designed to run in an Azure DevOps organization, and we will use my GitHub repository as a starting point.


github repo

This repo contains:

  1. The YAML pipeline file.

  2. A couple of Bicep templates to provision the Azure resources.

  3. An M365DSC template with our Conditional Access policies. A User Managed Identity will be used to authenticate the Microsoft 365 DSC workload.

  4. A few PowerShell scripts to configure the compute resource.

2. Azure DevOps

We won't go through the configuration of the Azure DevOps organization and project as this is outside the scope of this blog. The pipeline runs using a Windows Hosted agent.

For those not familiar with Azure DevOps, hosted agents are Microsoft-managed virtual machines in charge of running the sequence of tasks included in a pipeline. We could say that a pipeline in Azure DevOps is equivalent to a task sequence in SCCM.


azure devops project

2.1 Service Connection

To allow Azure DevOps to create resources in the Azure tenant, we use a service connection, which is a Service Principal managed by Azure DevOps. It can be security-trimmed to be used only on specific pipelines and scoped to a subscription or a resource group.

You can update those permissions as required at creation time or later from the Azure portal.
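
For example, if you later decide to narrow the scope yourself, you can re-assign the role of the underlying service principal with the Az PowerShell module. A minimal sketch, assuming the Az.Resources module and a placeholder object Id:

# Grant the service connection's service principal Contributor on a single
# resource group instead of the whole subscription
New-AzRoleAssignment `
    -ObjectId '<service-connection-object-id>' `
    -RoleDefinitionName 'Contributor' `
    -ResourceGroupName 'camonitoring-rg'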


2.2 Variable Groups

We use a variable group to store values we don't want to expose in the scripts. For example, I have added my subscription Id as a variable and the password for the VM as a secret. It's a convenient way to keep your scripts clean and secure.

You can also link a variable group to an Azure Key Vault and keep all your secrets in one place.


3. The M365DSC export file


I used the export from the previous blog but slightly modified it:

Note:

The DSC export file included in this repo is for test purposes only. Please make sure to export your own policies. Refer to these blogs for more information:

1- M365DSC: Getting Started Part 2: Installation, authentication and export configuration (french365connection.co.uk)

2- Taking a Snapshot of Existing Tenant - Microsoft365DSC - Your Cloud Configuration

1- I have added the OrganizationName (i.e. the M365 tenant name) as a parameter

m365dsc export modification

2- I removed the version dependency so that the configuration doesn't rely on a specific module version

m365dsc export modification 1

3- I replaced the credentials configuration with ManagedIdentity

m365dsc export modification 2

4- The default M365DSC export uses a configuration data file to authenticate via certificate or (encrypted) password, which is not required in our case because we are using a managed identity.

m365dsc export modification 3
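
Putting these four changes together, the top of the modified export looks roughly like the sketch below. The policy properties are illustrative placeholders, not the real exported values:

Configuration M365TenantConfig
{
    param
    (
        # Modification 1: the M365 tenant name, e.g. contoso.onmicrosoft.com
        [Parameter(Mandatory = $true)]
        [System.String]
        $OrganizationName
    )

    # Modification 2: no -ModuleVersion, so the latest installed version is used
    Import-DscResource -ModuleName 'Microsoft365DSC'

    Node localhost
    {
        AADConditionalAccessPolicy 'Example-CAPolicy'
        {
            DisplayName     = 'Example CA policy'  # illustrative placeholder
            Ensure          = 'Present'
            ManagedIdentity = $true                # modification 3: no credentials
            TenantId        = $OrganizationName
        }
    }
}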

4. The PowerShell scripts


We have four scripts for this project:

4.1 AssignPermission.ps1

This is an interactive script used to assign the required permissions to our managed identity.

$managedIdentityObjectId = "0e36d981-23b5-46ad-b6b8-85b0b583b0d9" # Your Managed Identity Object Id here

# Connect to the Graph SDK with the required permissions
Connect-MgGraph -Scopes 'Application.Read.All','AppRoleAssignment.ReadWrite.All' 

$serverApplicationName = "Microsoft Graph"
$serverServicePrincipal = Get-MgServicePrincipal -Filter "DisplayName eq '$serverApplicationName'"
$serverServicePrincipalObjectId = $serverServicePrincipal.Id

# Permissions required to run our M365DSC resource
$appRoleName = @(
    'Policy.Read.All'
    'Policy.ReadWrite.ConditionalAccess'
    'Application.Read.All'
    'RoleManagement.Read.Directory'
    'Group.Read.All'
    'User.Read.All'
    'Agreement.Read.All'
)

$appRoleIds = ($serverServicePrincipal.AppRoles | Where-Object {$_.Value -in $appRoleName }).Id

# Assign the managed identity access to the app role.
foreach ($appRoleId in $appRoleIds)
{
    New-MgServicePrincipalAppRoleAssignment `
        -ServicePrincipalId $managedIdentityObjectId `
        -PrincipalId $managedIdentityObjectId `
        -ResourceId $serverServicePrincipalObjectId `
        -AppRoleId $appRoleId
}
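
Once the script has run, you can verify the result with a quick check (Get-MgServicePrincipalAppRoleAssignment comes from the same Microsoft Graph PowerShell SDK):

# List the app roles now assigned to the managed identity
Get-MgServicePrincipalAppRoleAssignment -ServicePrincipalId $managedIdentityObjectId |
    Select-Object AppRoleId, ResourceDisplayName, CreatedDateTime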

4.2 InstallM365DSCModule.ps1

This script installs the Microsoft 365 DSC module and updates all its dependencies.

[Net.ServicePointManager]::SecurityProtocol =
    [Net.ServicePointManager]::SecurityProtocol -bor
    [Net.SecurityProtocolType]::Tls12

$module = Get-Module PowerShellGet -ListAvailable -ErrorAction SilentlyContinue
if ($null -eq $module)
{
    Install-PackageProvider -Name NuGet -Force
    Write-Output "Installing PowerShellGet"
    Install-Module PowerShellGet -Force -AllowClobber
}
$module = Get-Module Microsoft365DSC -ListAvailable -ErrorAction SilentlyContinue
if ($null -eq $module)
{
    Write-Output "Installing Microsoft 365 DSC module"
    Install-Module Microsoft365DSC -Confirm:$false -Force
}
Write-Output "Upgrading Microsoft 365 DSC module"
Update-Module Microsoft365DSC -Force -Confirm:$false

Write-Output "Updating Microsoft 365 DSC dependencies"
Update-M365DSCDependencies

4.3 ConfigureDSC.ps1

This script is used to configure and start the DSC engine. The LCM is set up as 'ApplyAndMonitor' only.

You can change it to 'ApplyAndAutoCorrect' if you want to automatically revert settings when drift is detected.

param(
    [Parameter()]
    [String]
    $resourcePath = "C:\DevOps\M365DSC\Export",
    [Parameter()]
    [String]
    $OrganizationName
)

Write-Output "Configuring LCM"

$LCMConfigPath = 'C:\M365DSC\LCM'
if (-not (Test-Path -Path $LCMConfigPath))
{
    New-Item -ItemType Directory -Path $LCMConfigPath -Force
}

$LCMConfig = @'
[DSCLocalConfigurationManager()]
configuration LCMConfig
{
    Node localhost
    {
        Settings
        {
            RefreshMode = 'Push'
            ConfigurationMode = 'ApplyAndMonitor'
            ConfigurationModeFrequencyMins = 15
        }
    }
}
LCMConfig
'@
$LCMConfig | Out-File "$LCMConfigPath\LCMConfig.ps1"
Set-Location $LCMConfigPath
.\LCMConfig.ps1
Set-DscLocalConfigurationManager -Path "$LCMConfigPath\LCMConfig" -Force

# Start the DSC configuration
Write-Output "Starting DSC Configuration and Engine"
if(-not (Test-Path $resourcePath))
{
    New-Item -ItemType Directory -Path $resourcePath -Force
}
Set-Location $resourcePath
.\M365TenantConfig.ps1 -OrganizationName $OrganizationName
Start-DscConfiguration -Wait -Force -Verbose -Path .\M365TenantConfig
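
Once the engine is running, you can check for drift on the VM at any time with the standard DSC cmdlets, for example:

# Report whether the tenant still matches the compiled configuration
Test-DscConfiguration -Detailed |
    Select-Object InDesiredState, ResourcesNotInDesiredState

# Inspect the LCM's recent consistency-check runs
Get-DscConfigurationStatus -All |
    Select-Object Status, StartDate, Type, NumberOfResources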

4.4 extractCAPolicies.ps1 (alternative)

This script is an alternative approach: instead of using the M365DSC export file from the repo, you extract one directly from the VM. This script is not used in the pipeline.

param(
    [Parameter()]
    [String]
    $TenantId,
    [Parameter()]
    [String]
    $resourcePath = "C:\DevOps\M365DSC\Export"
)

$ProgressPreference = 'SilentlyContinue'

# Export the Entra ID Conditional Access policies
# (other authentication options such as -Credential or -ApplicationId/-ApplicationSecret
# could be passed to Export-M365DSCConfiguration instead of the managed identity)

Write-Output "Extracting resource"
Export-M365DSCConfiguration `
    -Components @("AADConditionalAccessPolicy") `
    -Path $resourcePath `
    -TenantId $TenantId `
    -ManagedIdentity
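
If you wanted to wire this alternative into the pipeline, it could be invoked the same way as the other scripts. A sketch, with the resource names taken from the pipeline variables and a placeholder tenant GUID:

# Run the extraction on the VM instead of copying the export from the repo
Invoke-AzVmRunCommand `
    -ResourceGroupName 'camonitoring-rg' `
    -VMName 'camonitoring-vm' `
    -CommandId 'RunPowerShellScript' `
    -ScriptPath 'Scripts\extractCAPolicies.ps1' `
    -Parameter @{ TenantId = '<your-tenant-guid>' }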


5. The pipeline


To be brief, a pipeline is a set of steps/tasks used to automate the deployment of an application or cloud resources as code. It is written in YAML, a declarative language similar to JSON, and includes all the tasks to complete our deployment. It can be triggered manually, on a schedule, or by events such as a new release.


The YAML file is composed of blocks (mappings or sequences) differentiated by their indentation.


5.1 The trigger

trigger:
- main

The pipeline triggers automatically when a commit is pushed to the main branch of our repo.

You can also trigger a pipeline run directly from Azure DevOps or on a schedule.

Azure devOps trigger

5.2 The pipeline variables

variables:
- group: "CA Monitoring - Variables"
- name: location 
  value: 'uksouth'
- name: rgTemplateFile
  value: 'Bicep/main.bicep'
- name: vmTemplateFile  
  value: 'Bicep/vm.bicep'
- name: azureserviceconnection
  value: 'F365C-ServiceConnection'
- name: resourcePath
  value: 'C:\DevOps\M365DSC\Export\AADConditionalAccess'
- name: resourcePrefix
  value: 'camonitoring'

Here we define the pipeline variables.

group is a reference to our variable group, and the other key-value pairs are ad hoc variables. These appear in plain text in the script and repo, so make sure no sensitive data is included.


5.3 The agent

pool:
  vmImage: windows-latest

The pool defines our agent, i.e. the VM running the pipeline. Two types are available:

  1. Hosted: managed by Microsoft. This agent is initialized at every run, so it may take longer if you need to install prerequisites like modules and dependencies.

  2. Self-hosted: a VM you link to Azure DevOps by installing the agent yourself.

Both types of agents can run Windows or Linux.


Here we choose Windows because the AzureFileCopy task only supports Windows agents. It also requires a public IP on the target VM.


5.4 The steps


5.4.1 Initialisation

We reinitialise the existing resource group and resources, if any, using an inline PowerShell script and the AzurePowerShell@5 task.

steps:
- task: AzurePowerShell@5
  inputs:
    azureSubscription: '$(azureserviceconnection)'
    ScriptType: 'InlineScript'
    azurePowerShellVersion: LatestVersion
    Inline: |
      $rg = Get-AzResourceGroup -Name '$(resourcePrefix)-rg' -ErrorAction SilentlyContinue
      if ($null -ne $rg)
      {
        $rg | Remove-AzResourceGroup -Force
      }

Note: In this PoC, we recreate the resource group and resources every time the pipeline runs. If you plan to deploy this in production, you may want to prevent any monitoring downtime by, for example, only removing the existing resources after confirming the new deployment is up and running.


5.4.2 Resource group and permissions

We redeploy the resource group and permissions with the main.bicep template, using AzureResourceManagerTemplateDeployment@3.

- task: AzureResourceManagerTemplateDeployment@3
  displayName: Deploy Main Bicep
  inputs:
    deploymentScope: 'Subscription'
    deploymentMode: 'Incremental'
    azureResourceManagerConnection: '$(azureserviceconnection)'
    subscriptionId: '$(subscriptionid)'
    location: '$(location)'
    templateLocation: 'Linked artifact'
    csmFile: '$(rgTemplateFile)'
    overrideParameters: ' -resourcePrefix $(resourcePrefix) -location $(location) -spId $(serviceconnectionid)'

Deploying a resource group has to happen at the subscription level, but the permission assignment applies at the resource group level, so to change scope I use a module that references the rgRole.bicep template.
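
If you want to validate this stage locally before committing, the Az equivalent is a subscription-scope deployment with -WhatIf. A sketch, using the same parameter names as overrideParameters above and a placeholder service principal object Id:

# Preview the subscription-scope deployment without applying it
New-AzSubscriptionDeployment `
    -Location 'uksouth' `
    -TemplateFile 'Bicep/main.bicep' `
    -TemplateParameterObject @{
        resourcePrefix = 'camonitoring'
        location       = 'uksouth'
        spId           = '<service-connection-object-id>'
    } `
    -WhatIf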


5.4.3 Resource deployment

In this task, we deploy all the resources required for our solution, which include a VM and its extensions, a VNet, an NSG and more.

- task: AzureResourceManagerTemplateDeployment@3
  displayName: Deploy VM Bicep
  inputs:
    deploymentScope: 'Resource Group'
    deploymentMode: 'Incremental'
    azureResourceManagerConnection: '$(azureserviceconnection)'
    subscriptionId: '$(subscriptionid)'
    action: 'Create Or Update Resource Group'
    resourceGroupName: '$(resourcePrefix)-rg'
    location: '$(location)'
    templateLocation: 'Linked artifact'
    csmFile: '$(vmTemplateFile)'
    overrideParameters: ' -adminUsername "adminPipeline" -adminPassword $(adminPassword) -resourcePrefix $(resourcePrefix) -location $(location) -spId $(serviceconnectionid)'

5.4.4 M365DSC template

Here, we copy the M365DSC export file to the VM using AzureFileCopy@5.

This task requires a storage account, a public IP (both deployed by the previous Bicep file) and a Windows agent. The files are first copied to a storage account and then to the VM using AzCopy.

- task: AzureFileCopy@5
  inputs:
    SourcePath: 'CAMonitoring\DSC\M365TenantConfig.ps1'
    azureSubscription: '$(azureserviceconnection)'
    Destination: 'AzureVMs'
    MachineNames: '$(resourcePrefix)-vm'
    storage: "$(resourcePrefix)sa"
    resourceGroup: "$(resourcePrefix)-rg"
    vmsAdminUserName: 'adminPipeline'
    vmsAdminPassword: '$(adminPassword)'
    TargetPath: '$(resourcePath)'
    CopyFilesInParallel: false
    skipCACheck: true
    enableCopyPrerequisites: true
    AdditionalArgumentsForVMCopy : '--log-level=INFO'

I'm not a particular fan of this task and only use it temporarily. I configured the NSG to only allow traffic from the Azure DevOps region, but I don't like using a storage account and exposing the VM unnecessarily. I will work on a new version that doesn't use this task and doesn't expose the VM to the internet.

Keep posted!

5.4.5 Installing M365DSC

In this task, we install the latest version of M365DSC on the VM and update all the dependencies using a PowerShell script and AzurePowerShell@5.

- task: AzurePowerShell@5
  inputs:
    azureSubscription: '$(azureserviceconnection)'
    ScriptType: 'InlineScript'
    azurePowerShellVersion: LatestVersion
    Inline: |
      Install-Module Az.Compute -force -confirm:$false
      Invoke-AzVmRunCommand `
      -ResourceGroupName "$(resourcePrefix)-rg" `
      -VMName "$(resourcePrefix)-vm" `
      -CommandId "RunPowerShellScript" `
      -ScriptPath "Scripts\installM365DSCModule.ps1"

This task is the most time-consuming.
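
One small improvement worth considering: Invoke-AzVmRunCommand returns the VM-side output in its Value property, so you can surface it in the pipeline logs instead of running blind. A sketch of the same inline script with logging:

# Capture the run-command result and echo the VM-side output into the log
$result = Invoke-AzVmRunCommand `
    -ResourceGroupName "$(resourcePrefix)-rg" `
    -VMName "$(resourcePrefix)-vm" `
    -CommandId 'RunPowerShellScript' `
    -ScriptPath 'Scripts\installM365DSCModule.ps1'
$result.Value | ForEach-Object { Write-Output $_.Message }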


5.4.6 Start DSC engine

In this task, we start the DSC engine and configure it to monitor the settings from our M365DSC export, once again using a PowerShell script and AzurePowerShell@5.

- task: AzurePowerShell@5
  inputs:
    azureSubscription: '$(azureserviceconnection)'
    ScriptType: 'InlineScript'
    azurePowerShellVersion: LatestVersion
    Inline: | 
      Invoke-AzVmRunCommand `
      -ResourceGroupName "$(resourcePrefix)-rg" `
      -VMName "$(resourcePrefix)-vm" `
      -CommandId "RunPowerShellScript" `
      -ScriptPath "Scripts\configureDSC.ps1" `
      -Parameter @{'ResourcePath'='$(resourcePath)';'OrganizationName'='$(entraidtenant)'}

6. The Bicep templates


Bicep is a declarative language based on ARM templates, developed by Microsoft and the community to ease and improve the deployment of resources in Azure.

We use three templates in this deployment:

6.1 main.bicep

main.bicep

We use this template to create our resource group (camonitoring-rg) and assign permissions. Resource groups are created at the subscription level and permissions at the resource group level. To change scope, I invoke the second template as a module.

main.bicep 2

6.2 rgRole.bicep

rgRole.bicep

This template enables our service connection's service principal to create resources in camonitoring-rg.


6.3 vm.bicep

This template creates all the resources required to run the workload:

  • a VNET

vm.bicep
  • an NSG

vm.bicep 2
  • a Public IP

vm.bicep 3
  • a VM to run our workload

vm.bicep 4
  • The Azure Monitor Agent (AMA) extension

vm.bicep 5
  • The data collection rule to associate the AMA agent with our Sentinel Log Analytics workspace

vm.bicep 6

I invite you to check the other settings from vm.bicep.


I wanted to run everything from main.bicep and include rgRole and vm as modules, but unfortunately the pipeline kept failing because the permissions were not applied before vm.bicep started. I tried creating module dependencies without success, so for now I simply split the deployments and it worked like a charm.

If someone knows the trick please contact me on LinkedIn or in the comments 😜


Setting up your environment

If you've been patient enough to read all my explanations and still have energy, it's time to set up the environment and play with the solution.


1- The Managed Identity


We will use a User Managed Identity to enable our VM to run the M365DSC workload. I've decided to use this type of identity so that we can reuse it across multiple resources with a single set of permissions.


1- In the Azure portal search for Managed identities

2- Press the create button

3- Fill the basic settings


managed identity

4- To add the relevant permissions required to manage the Conditional Access workload, use the AssignPermission.ps1 script from the repo. Make sure to set your own managed identity object Id at the top of the script.
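
If you prefer PowerShell to the portal for steps 1 to 3, here is a minimal sketch, assuming the Az.ManagedServiceIdentity module and illustrative names:

# Create the user-assigned managed identity from PowerShell instead of the portal
Install-Module Az.ManagedServiceIdentity -Force
New-AzUserAssignedIdentity `
    -ResourceGroupName '<your-resource-group>' `
    -Name 'camonitoring-identity' `
    -Location 'uksouth'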


2- The Azure Pipeline agent

Let's jump to Azure DevOps and configure the Windows hosted agent.

1- Go to the Organization or Project settings and then Agent pools

azure pipeline agent

2- Select Azure Pipelines

3- Select Agents and make sure Hosted Agent is enabled

azure pipeline 2

4- You may need to enable your hosted agent by submitting a request to Microsoft: Changes to Azure Pipelines free grants | Microsoft Learn

You can use the Microsoft-hosted agent free for 1,800 minutes per month on a private project. Alternatively, you can create a self-hosted agent, which is a virtual machine with the Azure Pipelines agent installed: Deploy an Azure Pipelines agent on Windows - Azure Pipelines | Microsoft Learn


3- The repo


1- Go to your Azure DevOps organization and create a new project

2- Go to Repos and select Import under Import a repository


repo

3- Under Clone URL, add:

https://github.com/William-Francillette/camonitoring
repo 2

4- Press Import
5- Wait until initialization completes


repo 3

6- The repo should then be ready

repo 4

4- The Service Connection

To create your Service Connection, navigate to your project:

1- Open the project settings at the bottom left corner, then under Pipelines select Service connections

repo 5
repo 6

2- Press New Service Connection

repo 7

3- Select Azure Resource Manager

repo 8

4- Select Service Principal (automatic) or Workload identity federation (automatic) -- the latter is in public preview


repo 9

5- Set the scope to Subscription, choose your subscription and give the connection a name. Leave the resource group empty.

repo 10

Your service connection is also available in the Entra ID portal.

repo 11


We also need to assign the Log Analytics Contributor role to the service connection (a PowerShell alternative is sketched after these steps):

1- Go to your Log Analytics workspace

2- Select Access Control (IAM)

3- Select Role Assignment and Add Role Assignment

4- Choose Log Analytics Contributor

5- Search for your Service Connection and then complete the assignment
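
The same assignment can be scripted. A sketch, assuming the Az.OperationalInsights and Az.Resources modules and placeholder names for your Sentinel workspace:

# Assign Log Analytics Contributor on the workspace to the service connection
$workspace = Get-AzOperationalInsightsWorkspace `
    -ResourceGroupName '<sentinel-rg>' `
    -Name '<workspace-name>'
New-AzRoleAssignment `
    -ObjectId '<service-connection-object-id>' `
    -RoleDefinitionName 'Log Analytics Contributor' `
    -Scope $workspace.ResourceId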


5- The variable group

It's now time to add the variables we don't want to store in the pipeline.

1- In your Azure Devops project, under Pipelines, select Library


variable group

2- Create the variable group "CA Monitoring - Variables"


variable group 2

Note:

You can use a different name, but make sure to update the variables section of the pipeline file accordingly:


variable group 3


3- Add the following variables:

  • adminPassword: don't forget to press the padlock icon to register this as a secret. This is the VM local admin password and has to comply with Azure password complexity; use a long, complex generated password/passphrase (see the sketch after this list). Note that you should only ever need to sign in to the VM for troubleshooting purposes.

  • entraidtenant: your Entra ID tenant GUID

  • serviceconnectionid: your Service Connection GUID

  • azureserviceconnection: your Service Connection name

  • subscriptionid: the Azure subscription Id hosting the new resources
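
As mentioned for adminPassword above, one quick way to generate a suitable secret locally is to Base64-encode some random bytes. A sketch; do verify the result satisfies Azure's complexity rules before using it:

# Generate 32 random bytes and Base64-encode them as the VM admin password
$bytes = New-Object byte[] 32
[System.Security.Cryptography.RandomNumberGenerator]::Create().GetBytes($bytes)
[Convert]::ToBase64String($bytes)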


variable group 4

6- The pipeline

We need to import and configure our pipeline:

1- In your project, select Pipelines > Pipelines and then Create pipeline

pipeline

2- Choose Azure Repos Git

pipeline 2

3- Select your repo

pipeline 3

4- Select Existing Azure Pipelines YAML file and choose the camonitoring-cd-m365dsc-with-public-ip.yml file

pipeline 4

5- Select Save


pipeline 5

6- Rename your pipeline as camonitoring-cd-m365dsc-with-public-ip


pipeline 6

7- Finally, run your pipeline

pipeline 7

If it all goes well, your pipeline should complete successfully

pipeline 8

And you should see all your resources in Azure

pipeline 9


The last words

One of the major advantages of managing our Microsoft 365 tenant as code with M365DSC and Azure DevOps is the flexibility to redeploy this configuration (and other policies) to a single tenant or to multiple tenants. You could, for example, create an M365DSC export file for your EXO/EOP settings and deploy it as a baseline.
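
For instance, a sketch of exporting a couple of EXO/EOP resources with the same managed-identity approach (the component names are examples; pick the ones relevant to your baseline):

# Export an Exchange Online / EOP baseline the same way as the CA policies
Export-M365DSCConfiguration `
    -Components @('EXOAntiPhishPolicy', 'EXOMalwareFilterPolicy') `
    -Path 'C:\DevOps\M365DSC\Export' `
    -TenantId '<your-tenant-guid>' `
    -ManagedIdentity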

This project is still a work in progress: I want to remove the VM's internet exposure and later use containers instead of a VM, but that will be for a future blog 😜


I hope this will encourage you to use Microsoft 365 DSC in your environment. Don't hesitate to contact me if I've made any mistakes or if you need help with the setup 👍

Happy coding!



 
Will Frenchie

I am a Microsoft Solutions Architect specialized in Microsoft 365, Entra and Azure security products at Threatscape.

I love learning, blogging and coding. My interests are very diverse, ranging from architecture and security to cloud engineering, automation, DevOps and PowerShell.

I own over a dozen Microsoft certifications and have worked in IT across multiple and diverse industries for over 15 years now.


1 comment


chrirest78
Mar 24

This looks really cool and your comparison between pipelines and task sequences has blown my mind. I've been trying it out and most of it seems to work. The only problem I've got is the pipeline fails at the AzureFileCopy stage with this error: "failed to parse user input due to error: cannot use wildcards in the path section of the URL except in trailing "/*". If you wish to use * in your URL, manually encode it to %2A". I had a look at the debug verbose log and at the azcopy copy command there are additional asterisk symbols at the end of the URI.


Do you know if there is any workaround for this?
