Filling the Gaps in Workspace ONE Access Audit Logs



SIEM/syslog integration is one of the most important pieces of defending against cyber attacks today, providing intelligence to large organizations. One of the few VMware products lacking in that area is WS1 Access. Over the last year or so, WS1 Access finally got some much-needed love with its Workspace ONE Intelligence integration, which pulls logins and app launches. Today, we're here to discuss the bigger piece of this, the audit logs, which are currently a gap. Workspace ONE Access audit logs are becoming more relevant with attacks, bad admins doing bad things, etc.

The Challenges of the VMware Workspace ONE Access API

The WS1 Access API is not without its challenges. It becomes quite obvious when you look at it that it is largely undocumented and out of date. We've been fortunate that a few people like William Lam and Sascha Warno have been good partners in filling in some of those gaps. I can honestly say it's one of the more challenging APIs I have worked with, because shifting requirements around Content-Type and Accept headers can frustrate just about anyone. Today, we will cover the automation I wrote to ingest Workspace ONE Access audit logs from the API directly into a SIEM as CSV, which takes some manipulation and creativity. The specific use case I am targeting is monitoring changes to the access policy (aka someone trying to circumvent MFA).

Collecting the Data from WS1 Access API

We will take some time here and cover piece by piece the current code and how it works.

Setting Variables and Authenticating to the WS1 Access API

The first piece of the code sets variables, because the API leverages UNIX time (thanks to Sascha for helping me work this through in my head). It's fairly basic, but following this baseline sets you up well for the rest of the script, and it's similar to my other WS1 Access scripts. What I've learned through trial and error here is that less is more: avoid Accept and Content-Type headers where possible, because often we don't need them at all.

#Forces the use of TLS 1.2
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
##Declares Variables for the Date Filter for Audit Logs##
$date1 = Get-Date -Date "01/01/1970"
$date2 = (Get-Date).adddays(-7)  
$date3 = (Get-Date) 
$timespan =  (New-TimeSpan -Start $date1 -End $date2).TotalMilliSeconds
$timespan2 = (New-TimeSpan -Start $date1 -End $date3).TotalMilliSeconds
##Declares the Variables for the Filter itself
$fromMillis = [math]::Floor($timespan)
$toMillis = [math]::Floor($timespan2)

##Specify your Access Hostname
$AccessURL = ''
##Specify your oAuth Client ID and Secret
$ClientId = ''
$ClientSecret = ''
$text = "${ClientId}:${ClientSecret}"
$base64 = [Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes($text))
$headers = @{
        "Authorization"="Basic $base64";
}
##Auth and Get your Bearer Token##
$results = Invoke-WebRequest -Uri "https://$AccessURL/SAAS/auth/oauthtoken?grant_type=client_credentials" -Method POST -Headers $headers
$accessToken = ($results.Content | ConvertFrom-Json).access_token
$authHeader = @{
        "Authorization"="Bearer $accessToken";
}
$global:workspaceOneAccessConnection = New-Object PSObject -Property @{
        'Server' = "https://$AccessURL"
        'headers' = $authHeader
}

Collecting the Workspace ONE Access Audit Logs from the API

First, let's discuss the API command itself (querying audit data), for which you can find "some" info here. The actual API endpoint is "/analytics/reports/audit", so again, thanks Sascha, since it's undocumented.

Let’s start by covering your potential filters:

  • Filter by user (actorUserName)
  • objectType
  • objectAction like Create, Delete, Update, Link, Unlink
  • linkedObjectType
  • objectName
  • fromMillis and toMillis
  • startIndex and pageSize

Now that I've spent some time with these filters, here are a few helpful hints:

  • Always use an objectAction filter, because a single query does NOT capture everything: the default pageSize is 5000 (which is also the max). Don't forget my article on pagination
  • Leverage objectAction if it matters to you, e.g. Delete or Update actions
  • fromMillis and toMillis are epoch time IN milliseconds. I'll help you with that!
  • You can use offsets (startIndex) to build in pagination.

So my code below will do a few fun things. You can see how it creates the header, performs the audit query, builds an array, populates that array, and writes it to CSV. This particular code gives me all results over the last 7 days.

##Declare the Header for your Audit Log Query##
$Headers = @{
        "Authorization"="Bearer $accessToken";
}
##Perform the Audit Log Query##
$Response = Invoke-RestMethod -Uri "https://$AccessURL/analytics/reports/audit?objectType=RuleSet&fromMillis=$fromMillis&toMillis=$toMillis&objectAction=Update" -Method GET -Headers $Headers
##Build the Array
$list = New-Object System.Collections.ArrayList
##Populate the Array
for ($i=0; $i -lt $Response.data.Count; $i++)
{$list.Add(($Response.data[$i][4] | ConvertFrom-Json)) | Out-Null}
$list | Export-Csv "C:\temp\auditlog.csv" -NoTypeInformation
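
Because a single query tops out at a pageSize of 5000, larger result sets need paging. Here is a minimal sketch using the startIndex and pageSize filters from the list above; it assumes the same $AccessURL, $Headers, and response shape as the query we just ran, and since this API is undocumented, treat the exact paging behavior as an assumption to verify against your tenant:

$pageSize = 5000
$startIndex = 0
$list = New-Object System.Collections.ArrayList
do {
    ##Request one page of audit rows at a time##
    $uri = "https://$AccessURL/analytics/reports/audit?objectType=RuleSet&fromMillis=$fromMillis&toMillis=$toMillis&startIndex=$startIndex&pageSize=$pageSize"
    $Response = Invoke-RestMethod -Uri $uri -Method GET -Headers $Headers
    for ($i=0; $i -lt $Response.data.Count; $i++)
    {$list.Add(($Response.data[$i][4] | ConvertFrom-Json)) | Out-Null}
    $startIndex += $pageSize
##A short page means we have reached the end of the results##
} while ($Response.data.Count -eq $pageSize)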

Now that we have our functional code, where do we go from here? One thing I am working on for policy changes is leveraging the "jersey/manager/api/authmethods" endpoint to translate auth method GUIDs to their friendly names before syslog ingestion. More on that this week, I hope!


Thanks to my co-pilot on this mission, Sascha, we have the translation-engine part of the script functioning. Let's check it out!

##Perform Audit Search
$Response = Invoke-RestMethod -Uri "https://$AccessURL/analytics/reports/audit?objectType=RuleSet&fromMillis=$fromMillis&toMillis=$toMillis" -Method GET -Headers $Headers
$list = New-Object System.Collections.ArrayList
for ($i=0; $i -lt $Response.data.Count; $i++)
{$list.Add(($Response.data[$i][4] | ConvertFrom-Json)) | Out-Null}
##Capture AuthMethods into Array and Re-write Audit Log##
$authmethods = Invoke-RestMethod -Uri "https://$AccessURL/SAAS/jersey/manager/api/authmethods" -Method GET -Headers $Headers
$authmethods = $authmethods.items | Select-Object authMethodName, uuid
$authnmethods = Get-Content -Path C:\temp\internalauthmethodlist.json | ConvertFrom-Json
##Translate internal auth method IDs to friendly names##
foreach ($item in $list){
    foreach ($authmethod in $authnmethods.Methods) {
        $item.values = $item.values -replace $authmethod.ID, $authmethod.Name
        if ($item.PSObject.Properties.Name -contains "oldValues") {
            $item.oldValues = $item.oldValues -replace $authmethod.ID, $authmethod.Name
        }
    }
}
##Translate auth method UUIDs from the API to friendly names##
foreach ($item in $list){
    foreach ($authmethod in $authmethods) {
        $item.values = $item.values -replace $authmethod.uuid, $authmethod.authMethodName
        if ($item.PSObject.Properties.Name -contains "oldValues") {
            $item.oldValues = $item.oldValues -replace $authmethod.uuid, $authmethod.authMethodName
        }
    }
}

So if we discuss this code a little bit, we do some really cool things. On my Github you will now find a JSON file with the internal IDs of some of the auth methods, which we will use in a fancy little for loop to convert UUIDs to friendly names. We ALL like friendly names!
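
For reference, the translation loop reads a Methods array with ID and Name properties from that JSON file. I don't want to misquote the real file on GitHub, so this is only a hypothetical illustration of the shape the code expects; the UUIDs and names below are placeholders, not real internal IDs:

{
  "Methods": [
    { "ID": "11111111-2222-3333-4444-555555555555", "Name": "Password (cloud deployment)" },
    { "ID": "66666666-7777-8888-9999-000000000000", "Name": "Certificate (cloud deployment)" }
  ]
}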

Once we finish that up, we will use the authmethods API call to convert even more UUIDs to names, because we like our audit logs to be ACTUALLY readable. Hope you enjoy; I will probably add even more to this and translate network ranges as well.

Now, we will talk about how I leveraged this code to ingest logs into Azure’s SIEM.

Leveraging WS1 Access Automation to Ingest Logs into Azure Sentinel

I was fairly surprised by how easy it was to bring data into Sentinel with this. Let’s start with the code:

$list | .\Upload-AzMonitorLog.ps1 -WorkspaceId $WorkspaceId -WorkspaceKey $AzureKey -LogTypeName "WS1-Access"

With that done, I could see the logs inside of Azure pretty easily.

Overall, not too painful. The big thing I will be working on quite a bit over the next few weeks is massaging and formatting the data in these tables to be friendlier, like converting timestamps from epoch to human dates, etc. All in all, I'm very happy with what I was able to put together, which should work nicely in Splunk or whatever else I want to use.
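
As a quick taste of that epoch cleanup, here is a small sketch (the sample value is just an illustration) that converts an epoch-milliseconds timestamp, like fromMillis/toMillis or the timestamps in the audit rows, into a human-readable UTC date:

##Convert an epoch-milliseconds value to a readable UTC date##
$epochMillis = 1672531200000   #example value: 2023-01-01 00:00:00 UTC
$humanDate = [DateTimeOffset]::FromUnixTimeMilliseconds($epochMillis).UtcDateTime
$humanDate.ToString("yyyy-MM-dd HH:mm:ss")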

Final Thoughts

Albeit short, I thought this was a really beneficial topic to write about. Not much is out there, and I think many of us forget these logs are not being captured. Leveraging these scripts to extend your visibility helps build trust throughout your organization and really takes you from a Mobile Zero to a Mobile Hero!


