In our previous post, we created an SCCM environment and deployed Desktop Analytics. We left off when the DA console notified us that we need to wait 72 hours for the data to be processed.

Let’s validate that everything is OK.

However, we will not use the DA console in our work from now on. Instead, we will get the data directly from the Desktop Analytics Log Analytics workspace.

Let’s validate that we can get the data:

  • Open the Azure portal: https://portal.azure.com/
  • Navigate to Log Analytics workspaces and select the Desktop Analytics workspace we created when deploying DA
  • Click Logs
  • Enter the following query in the query window
MADevice | where TimeGenerated > ago(48h) | limit 10

Click Run

The query should return one device, as you see below.

Considering our SCCM environment is not a real-life production one, we don’t expect Desktop Analytics to capture any errors or other real-life data. The query we use is very simple, returning just a handful of device records. However, in real life you can use queries to find devices experiencing problems that DA captures in the Log Analytics workspace.
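As an illustration of what a slightly more useful query might look like, the sketch below counts the distinct devices that reported in over the last two days. Treat the `DeviceName` column as an assumption: check the `MADevice` table schema in your own workspace before relying on it.

```kusto
// Count distinct devices seen in the last 48 hours
// DeviceName is an assumed column name - verify it against your MADevice schema
MADevice
| where TimeGenerated > ago(48h)
| summarize DeviceCount = dcount(DeviceName)
```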

Creating Storage Account and Table

With the SCCM environment ready and DA deployed, it is time to create some Azure resources. Let me remind you what we need to create:

  • Resource group
  • Storage account
  • Storage table
  • Automation Account

To make things less time-consuming, we will use a PowerShell script to deploy most of the resources.

Update the parameters in the script below to reflect the values you want to use.

Let’s see what the script does:

  • Create a resource group in a specified location
  • Create a storage account and a table
  • Create a SAS token with a 1-year validity for tables in the storage account
  • Create an automation account
  • Add all the parameters required by the PowerShell runbooks we will create later
  • Import the modules required by the PowerShell runbooks into the automation account

Below is the script:

#define parameters

$rgName="DASCCM-RG"
$location = 'eastus'

#make sure you use a valid storage account name (globally unique, lowercase letters and numbers only)
$storageAccountName='dastorage101'
$tableName = 'SCCMRemediationsTable'
$pkey = 'SCCMRemediation' 
$autoaccName='DASCCMAutoAccount'

#connect to Azure
Connect-AzAccount

#run Get-AzSubscription and Select-AzSubscription if you don't have a default subscription or don't know which one to select
Select-AzSubscription "Microsoft Azure"

#Create Resource Group
New-AzResourceGroup -Name $rgname -Location $location

#Create Storage account
$Storageaccount = New-AzStorageAccount -ResourceGroupName $rgname -Name $storageAccountName -Location $location -SkuName Standard_LRS -Kind Storage
$ctx = $Storageaccount.context

#create table
New-AzStorageTable -Name $tableName -Context $ctx	

#create sas token
$now = Get-Date
$nowoneyearplus = $now.AddYears(1)	
$sastkn = New-AzStorageAccountSASToken -Service Table -ResourceType service, container, object -Permission "racwdlup" -StartTime $now -ExpiryTime $nowoneyearplus -Context $ctx

#create Automation Account
$automationAccount = New-AzAutomationAccount -ResourceGroupName $rgname -Location $location -Name $autoaccName
$automationAccountId = (Get-AzResource -ResourceType "Microsoft.Automation/automationAccounts" -ResourceGroupName $rgname -Name $autoaccName).ResourceId

#add automation account parameters
New-AzAutomationVariable -AutomationAccountName $autoaccName -Name "SASToken" -ResourceGroupName $rgname -Value $sastkn -Encrypted $false
New-AzAutomationVariable -AutomationAccountName $autoaccName -Name "StorageAccount" -ResourceGroupName $rgname -Value $storageAccountName -Encrypted $false
New-AzAutomationVariable -AutomationAccountName $autoaccName -Name "TableName" -ResourceGroupName $rgname -Value $tableName -Encrypted $false
New-AzAutomationVariable -AutomationAccountName $autoaccName -Name "PartitionKey" -ResourceGroupName $rgname -Value $pkey -Encrypted $false

#import Az modules for the runbooks 
#Modules need to be imported in a specific order

$modules = @(
    [PSCustomObject]@{ OrderID = 1; Name = 'Az.Accounts';  Value = '1.6.4' }
    [PSCustomObject]@{ OrderID = 2; Name = 'Az.Storage';   Value = '1.9.0' }
    [PSCustomObject]@{ OrderID = 3; Name = 'Az.Resources'; Value = '1.8.0' }
    [PSCustomObject]@{ OrderID = 4; Name = 'AzTable';      Value = '2.0.2' }
)


$DebugMode = $false ## set this to $true to skip the actual Azure calls, or to $false to pass the values to Azure

for ($modulesCounter = 1; $modulesCounter -le $modules.Count; $modulesCounter++){
    $CurrentModule = $modules.Where({$_.OrderID -eq $modulesCounter})
    Write-Output "Working on $($CurrentModule.OrderID) - $($CurrentModule.Name) - $($CurrentModule.Value)"

    $mod = $CurrentModule.Name
    $version = $CurrentModule.Value

    if ($DebugMode -ne $true){
        New-AzAutomationModule -AutomationAccountName $autoaccName -ResourceGroupName $rgname -Name $mod -ContentLinkUri "https://www.powershellgallery.com/api/v2/package/$mod/$version"
    }

    $wait = $true
    $count = 1
    $failed = $false
    while ($wait){
        Start-Sleep -Seconds 10
        $count++

        if ($DebugMode -ne $true){
            $imported = Get-AzAutomationModule -AutomationAccountName $autoaccName -ResourceGroupName $rgname -Name $mod
            Write-Output "Provisioning state for $mod is $($imported.ProvisioningState)"
        } else {
            Write-Output "Debug mode - no actual provisioning done"
        }

        if ($imported.ProvisioningState -eq "Succeeded"){
            Write-Output "$mod has been successfully imported"
            $wait = $false
        }
        if ($imported.ProvisioningState -eq "Failed"){
            Write-Output "$mod import failed"
            $wait = $false
            $failed = $true
        }
        if ($count -ge 30){
            Write-Output "$mod timed out"
            $wait = $false
            $failed = $true
        }
        if ($DebugMode -eq $true){
            Write-Output "$mod has been successfully processed in debug mode"
            $wait = $false
        }
    }
}

 

The script will take roughly 10-20 minutes to complete. A successful run will look like this
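If you prefer to check from PowerShell rather than the portal, the module import status can also be queried from the same session. This is a quick sketch that reuses the `$autoaccName`, `$rgname`, and `$modules` variables defined in the script above:

```powershell
# List the runbook modules and their provisioning state in the automation account
Get-AzAutomationModule -AutomationAccountName $autoaccName -ResourceGroupName $rgname |
    Where-Object { $_.Name -in $modules.Name } |
    Select-Object Name, Version, ProvisioningState
```

Each module should show a ProvisioningState of Succeeded before you move on.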

Let’s go to the Azure portal to see if our resources are there

  • Select the resource group; you should see one automation account and one storage account

Click the automation account and select Variables under Shared Resources to validate the parameters.

Go back to the resource group, select the storage account, and then Tables.
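The same validation can be done without leaving the PowerShell session that ran the script; this sketch reuses the `$autoaccName`, `$rgname`, `$tableName`, and `$ctx` variables from above:

```powershell
# Validate the automation account variables created by the script
Get-AzAutomationVariable -AutomationAccountName $autoaccName -ResourceGroupName $rgname |
    Select-Object Name, Value

# Confirm the storage table exists in the storage account
Get-AzStorageTable -Name $tableName -Context $ctx
```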

We are now ready to create the runbooks and the alert. We will continue in the next post