Leveraging Log Analytics to Query Secure Boot Certificate Update Status

Inconceivable!

Hey Everyone! So I’ve just returned from a break (almost 3 months) that I spent making Workplace Ninja US 2025 in Dallas happen. I couldn’t have returned at a better time, because this Secure Boot certificate thing is heating up.

Recently, Microsoft posted an article discussing the upcoming update of the Secure Boot certificates, which will expire at the end of June 2026. Easy enough, right? Don’t I wish! I investigated this a bit: despite Device Query being able to pull this information, Multi-Device Query (MDQ) cannot! Inconceivable! Today, we will walk through the Secure Boot certificate update architecture and then build a Log Analytics solution to capture the status across your fleet.

Secure Boot Certificate Update Architecture

Overall, it’s simple. Devices that shipped before 2024 will have their certificates expire in 2026. The entire framework for this update revolves around registry keys and a Windows scheduled task.

The registry keys all live in these two locations:

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecureBoot
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecureBoot\Servicing

The key registry keys are:

UEFICA2023Status: States whether the new certificate has been applied to the device. It will likely start at “NotStarted,” move to “InProgress,” and finally land on “Updated.”
UEFICA2023Error: After the update, this will be set to “0” on success. An error is reflected by a non-zero number.
WindowsUEFICA2023Capable: Tells you whether the certificate is now in use. “0” means the certificate isn’t in the database yet, “1” means it’s there but not in use, and “2” means the device is using the 2023 boot manager.
AvailableUpdates: Typically set to “0,” but you can set it to 0x5944 to make the device update the certificate.
HighConfidenceOptOut: Setting this to “1” lets you opt out of automatic updates of the certificate via the cumulative update.
MicrosoftUpdateManagedOptIn: Lets you opt in to the Microsoft-managed update via “Controlled Feature Rollout.”
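
If you want to eyeball these values on a single device before building anything bigger, here is a minimal PowerShell sketch (my own illustration, not the script we deploy later) that reads them from both registry paths:

$paths = 'HKLM:\SYSTEM\CurrentControlSet\Control\SecureBoot',
         'HKLM:\SYSTEM\CurrentControlSet\Control\SecureBoot\Servicing'
$names = 'UEFICA2023Status','UEFICA2023Error','WindowsUEFICA2023Capable',
         'AvailableUpdates','HighConfidenceOptOut','MicrosoftUpdateManagedOptIn'
foreach ($path in $paths) {
    foreach ($name in $names) {
        # Not every value exists on every device, so missing ones are skipped quietly
        $value = Get-ItemPropertyValue -Path $path -Name $name -ErrorAction SilentlyContinue
        if ($null -ne $value) { '{0}\{1} = {2}' -f $path, $name, $value }
    }
}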

The other aspect of this is the scheduled task Secure-Boot-Update, which lives under Microsoft\Windows\PI\Secure-Boot-Update. Setting the “AvailableUpdates” key will trigger the scheduled task within 12 hours:

The Secure Boot scheduled task
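
A quick way to confirm the task is present and see when it last ran, using the standard Task Scheduler cmdlets (nothing exotic here):

# Check the task's last run time and result code
Get-ScheduledTask -TaskPath '\Microsoft\Windows\PI\' -TaskName 'Secure-Boot-Update' |
    Get-ScheduledTaskInfo |
    Select-Object TaskName, LastRunTime, LastTaskResult, NextRunTime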

From a technical perspective, it’s interesting because it uses a “custom handler,” which runs a compiled system process to update the certificate. If you want to try it out manually, you can run this PowerShell:

reg add HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecureBoot /v AvailableUpdates /t REG_DWORD /d 0x5944 /f

Start-ScheduledTask -TaskName "\Microsoft\Windows\PI\Secure-Boot-Update"

Manually reboot the system when AvailableUpdates becomes 0x4100, then run the task one more time:

Start-ScheduledTask -TaskName "\Microsoft\Windows\PI\Secure-Boot-Update"

It’s a bit convoluted: running the scheduled task once kicks off the process triggered by the “AvailableUpdates” registry key, then you reboot and run it again so it can do its certificate magic. It’s pretty chaotic from a logical perspective.
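
If you’d rather script that two-phase dance end to end, here is a rough sketch of what the wait could look like. This is purely my illustration (the 60-second polling interval is an arbitrary choice), not official guidance:

# Illustration only: set the trigger value, run the task, and wait for 0x4100
$key = 'HKLM:\SYSTEM\CurrentControlSet\Control\SecureBoot'
Set-ItemProperty -Path $key -Name AvailableUpdates -Value 0x5944 -Type DWord
Start-ScheduledTask -TaskName "\Microsoft\Windows\PI\Secure-Boot-Update"
do {
    Start-Sleep -Seconds 60
    $value = (Get-ItemProperty -Path $key -Name AvailableUpdates).AvailableUpdates
    'AvailableUpdates is currently 0x{0:X}' -f $value
} while ($value -ne 0x4100)
# Reboot here, then run Start-ScheduledTask once more to finish the job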

My friend Richard Hicks wrote a great article on how to update these here, and he also wrote a really nice PowerShell script to inspect the certificate itself, found here.

Now, let’s get into my solution.

Leveraging Log Analytics to Capture Secure Boot Certificate Status Across Your Fleet

We know that we want to query the status of our devices, but out of the box we don’t have much visibility into the fleet as a whole. After considering several different options, we decided we could create a database of some sort and use PowerShell to write each device’s state to it.

We chose Log Analytics because it’s good at exactly that, and despite me NOT loving KQL, let’s take a shot at it anyway.

We have a handful of main steps:

Create the Log Analytics Workspace

First, we start by going into Azure, searching for “Log Analytics Workspace,” and clicking “Create.”

This part is simple; we just set the resource group and click “Create”:

Creating the Log Analytics workspace
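
If you’d rather skip the portal clicks, the same workspace can be stood up with the Az PowerShell module. A sketch with placeholder names (the resource group, workspace name, and region are all assumptions):

# Sketch: create the workspace via Az.OperationalInsights; names are placeholders
New-AzResourceGroup -Name 'rg-secureboot' -Location 'eastus' -Force
New-AzOperationalInsightsWorkspace -ResourceGroupName 'rg-secureboot' `
    -Name 'law-secureboot' -Location 'eastus' -Sku 'PerGB2018'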

Create the Data Collection Endpoint

Now, let’s head over to the Data Collection Endpoints area and create one. It’s just as simple as the workspace:

Creating the data collection endpoint

Now, drill into the newly created data collection endpoint and write down the logs ingestion endpoint, which you will need later.

Capturing the logs ingestion endpoint

Create the Log Analytics Custom Table

After the endpoint is created, go back to your Log Analytics workspace, drill into “Tables,” and click Create > New Custom Log (Direct Ingest).

Creating the Log Analytics Custom Table

You can set a name for the table, select “Analytics”, create a new data collection rule, and click “Next”

Setting the Custom Table settings

After clicking next, you upload the JSON file found here

It looks like this:

{
  "streamDeclarations": {
    "Custom-Json-SecureBootServicing_CL": {
      "columns": [
        { "name": "TimeGenerated", "type": "datetime" },
        { "name": "Hostname", "type": "string" },
        { "name": "RegistryPath", "type": "string" },
        { "name": "UEFICA2023Status", "type": "string" },
        { "name": "WindowsUEFICA2023Capable", "type": "string" },
        { "name": "KEK_Thumbprint", "type": "string" },
        { "name": "KEK_IssueDate", "type": "datetime" },
        { "name": "KEK_ExpirationDate", "type": "datetime" }
      ]
    }
  }
}

Once done, you just click “Next” and “Create” to finish creating that custom table.

Once finished, go into the Data Collection Rule and document the immutable ID, which you will need for the script later:

Grabbing the Data Collection Rule immutable ID

Drill into Configuration > Data Sources and document the Stream Name for your script.

Grabbing the Data Collection Rule Stream Name

Create the App Registration

Now, go to App Registrations and click “New Registration.”

Name it and click “Register”

Creating the App Registration

Document the App ID and the Tenant ID. Once done, go to Certificates & secrets to create the client secret for your PowerShell script.

Documenting the Client ID and Tenant ID

From there, just click “New client secret,” follow the prompts, and document the client secret:

Generating the client secret for your script
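
Under the hood, the script trades that client secret for a bearer token scoped to Azure Monitor. For the curious, a client credentials token call generally looks like this (a simplified sketch, not a verbatim excerpt from my script):

# Standard OAuth2 client credentials flow against Entra ID
$body = @{
    client_id     = $ClientId
    client_secret = $Secret
    scope         = 'https://monitor.azure.com/.default'
    grant_type    = 'client_credentials'
}
$token = (Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$TenantId/oauth2/v2.0/token" `
    -Body $body).access_token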

Grant Permissions to the Data Collection Rule

Drill into the Data Collection Rule’s Access Control (IAM) section to grant a role assignment to the App Registration you just created, then click Add > Add role assignment.

Adding the Role Assignment for the Data Collection Rule

Select the “Monitoring Metrics Publisher” role:

Finding the Monitoring Metrics Publisher role

Add your app registration and click “Review and Assign”

Adding the Role Assignment
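
If you prefer PowerShell for the role assignment too, here’s a sketch (the subscription ID, resource group, and DCR name in the scope are placeholders you’d fill in):

# Sketch: grant Monitoring Metrics Publisher on the DCR to the app's service principal
$dcrScope = '/subscriptions/<sub-id>/resourceGroups/<rg-name>/providers/Microsoft.Insights/dataCollectionRules/<dcr-name>'
New-AzRoleAssignment -ApplicationId $ClientId `
    -RoleDefinitionName 'Monitoring Metrics Publisher' `
    -Scope $dcrScope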

Test the Log Analytics Script Locally

You can download my Log Analytics script here.

Simply copy the script locally and update the variables below with the information you collected in the previous steps.

$TenantId            = ""  # Entra ID tenant (Directory) ID, collected from the App Registration
$ClientId            = ""  # App registration (Application) ID, collected from the App Registration
$Secret              = ""  # Client secret VALUE, collected from the App Registration
$DceLogsIngestionUri = ""  # Logs ingestion endpoint, from the Data Collection Endpoint
$DcrImmutableId      = ""  # Immutable ID, from the Data Collection Rule
$StreamName          = ""  # Stream name, from the Data Collection Rule
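
Those variables feed a POST to the Logs Ingestion API: the payload is a JSON array matching the table schema, sent to the DCE with the DCR immutable ID and stream name in the path. A simplified sketch of that call (the sample field values are made up; this is not the full script):

# Build a one-record payload matching the custom table's columns
$payload = ConvertTo-Json -Depth 5 -InputObject @(
    @{
        TimeGenerated            = (Get-Date).ToUniversalTime().ToString('o')
        Hostname                 = $env:COMPUTERNAME
        RegistryPath             = 'HKLM:\SYSTEM\CurrentControlSet\Control\SecureBoot\Servicing'
        UEFICA2023Status         = 'Updated'
        WindowsUEFICA2023Capable = '2'
    }
)
$uri = "$DceLogsIngestionUri/dataCollectionRules/$DcrImmutableId/streams/$StreamName" +
       '?api-version=2023-01-01'
Invoke-RestMethod -Method Post -Uri $uri -Body $payload `
    -Headers @{ Authorization = "Bearer $token" } -ContentType 'application/json'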

Run it and see the magic happen:

The magical results of your Log Analytics PowerShell script

Once you run it, it will take a few minutes, and then you can see the data in your Log Analytics workspace:

Results written to the Log Analytics workspace

Deploy Log Analytics Script via Intune

Now, you just need to save your script and deploy it as a platform script here

First, name the script, upload it, and set the settings like this:

Deploying the script via Intune

After that, assign it to all devices and deploy!

Setting the script to go to All Devices

Shortly after, you will start seeing those sweet devices coming in with this easy query:

SecureBootServicing_CL
| where ingestion_time() > ago(24h)
| order by ingestion_time() desc
| take 50
Running your KQL to see the results
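
Once data is flowing, you can get fancier. For example, this query (assuming the column names from the schema above) keeps only the latest record per device and counts where the fleet stands:

SecureBootServicing_CL
| summarize arg_max(TimeGenerated, *) by Hostname
| summarize Devices = count() by UEFICA2023Status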

Final Thoughts

Be sure to let me know what you think and leave some comments below!

I thought it was an interesting exercise overall to create my own Log Analytics workspace and write to it, a pattern that can be reused for many different purposes.

You can also create new Data Collection Rules (DCRs) and build your own JSON schema if the DCRs I made aren’t useful or don’t work like you expect. Growing the schema later can be a little annoying. A good rule of thumb I learned: if data isn’t writing to your workspace properly, it’s likely the schema or how the payload is formatted.
