Call Microsoft Graph Organization Scope to Check Last AAD Connect Sync Time – Part 7

In any organization running a hybrid Office 365 setup, AAD Connect has a sync cycle of 30 minutes by default. Unless someone happens to be on one of the sync servers, or in the admin portal, and spots an error, directory sync (Dirsync is deprecated technology, but the term is still used to refer to AAD Connect) could fail and go unnoticed for some time.

There is more than one way to generate an alert, such as a PowerShell job, or a task on the server using Task Scheduler that runs PowerShell, but for some organisations these things are not permitted or possible, so in step Flow & Microsoft Graph.

Firstly, I'd advise popping over to the Microsoft Graph Explorer and authenticating with your admin account.

Check the box to Consent on behalf of your organization.

We are now going to access the organization resource type and look at 2 properties: onPremisesSyncEnabled and onPremisesLastSyncDateTime.

Now let's make that call and look at the results.
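Outside Graph Explorer, the same call can be sketched with Python's standard library. The bearer token below is a placeholder and the `build_request` helper is mine for illustration, but the endpoint is the standard v1.0 organization resource and the `$select` properties are the two we care about:

```python
import json
import urllib.parse
import urllib.request

GRAPH_ORG_URL = "https://graph.microsoft.com/v1.0/organization"

def build_request(token: str) -> urllib.request.Request:
    # Only ask Graph for the two properties this check needs.
    query = urllib.parse.urlencode(
        {"$select": "onPremisesSyncEnabled,onPremisesLastSyncDateTime"}
    )
    return urllib.request.Request(
        f"{GRAPH_ORG_URL}?{query}",
        headers={"Authorization": f"Bearer {token}"},
    )

if __name__ == "__main__":
    # Placeholder token -- substitute a real bearer token from your sign-in.
    req = build_request("<access-token>")
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp))  # a "value" array holding the organization object
```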

As you can see above, "onPremisesSyncEnabled": true, which assures us we are looking at a hybrid environment with AAD Connect synchronisation in place.


The information of most interest to us here is "onPremisesLastSyncDateTime": "2018-12-03T08:19:12Z", and with the default sync cycle of 30 minutes, wouldn't it be great to know if the time since that last sync were to exceed 30 minutes?

In this example, we don't need to be precise and alert at exactly 31 minutes, although you could do that if you wanted to be extremely up to date. The shortfall of doing so is that your Flow would have to recur at shorter intervals, hence using more Flow runs. For this example we will check onPremisesLastSyncDateTime every hour, and if more than 30 minutes have passed since the last sync there is a fair chance we have a problem to address, so an alert will be generated by email.

So here is the full Flow we will create.

Before we create the Flow, we need to allow ourselves a path to communicate with Microsoft Graph, so we will create an App Registration within Microsoft Azure.

Please see my blog post called Register an App in the Azure Active Directory, where more information can be found on how to do this.

Note: You'll want to call this App something like MicrosoftGraphAPI, and it will require permissions to access Microsoft Graph under both Application Permissions and Delegated Permissions. Please see below:

So, let's cover the Flow trigger and its first 3 actions. For this, you will require information from your App Registration, namely: TenantID, ClientID and SecretID.

Once the above has been created, we will add an HTTP action as below:
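As a rough sketch of what that HTTP action posts, here is the client-credentials token request in Python. The helper names are mine, and the IDs are placeholders for your own App Registration values; the endpoint and form fields are the standard Azure AD v2.0 token request:

```python
import urllib.parse

def token_endpoint(tenant_id: str) -> str:
    # The same URL the Flow HTTP action posts to.
    return f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"

def token_request_body(client_id: str, client_secret: str) -> dict:
    # Form fields for the client-credentials grant against Microsoft Graph.
    return {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "https://graph.microsoft.com/.default",
    }

if __name__ == "__main__":
    # Placeholders -- substitute your TenantID, ClientID and SecretID.
    print(token_endpoint("<TenantID>"))
    print(urllib.parse.urlencode(token_request_body("<ClientID>", "<SecretID>")))
```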

Then on to parsing the JSON from the body of the HTTP Request action.

Before we can move on to the next step, we will need to make sure we have built an accurate schema. That can be achieved by going back to the HTTP Request in Graph Explorer, copying the JSON output and pasting it into the Flow, allowing the Parse JSON action to build the schema, as below:

The next step will see us Filter the Array.


@equals(item()?['displayName'], 'ExchangeOrg123')
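In plain terms, the Filter array step keeps only the organization entry whose displayName matches. A Python sketch of the same filter, where the sample data below is illustrative only:

```python
def filter_orgs(value: list, display_name: str) -> list:
    # Equivalent of the Flow filter expression:
    #   @equals(item()?['displayName'], 'ExchangeOrg123')
    return [org for org in value if org.get("displayName") == display_name]

# Illustrative sample of the parsed "value" array from the Graph response.
sample = [
    {"displayName": "ExchangeOrg123",
     "onPremisesLastSyncDateTime": "2018-12-03T08:19:12Z"},
    {"displayName": "OtherOrg",
     "onPremisesLastSyncDateTime": None},
]
print(filter_orgs(sample, "ExchangeOrg123"))
```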

Then we target the property we are looking for.

Now to do the comparison. But first, please consider the below and how we work with the array.

body('Select') = "onPremisesLastSyncDateTime": "2018-12-03T08:19:12Z"

body('Select')?[0] = onPremisesLastSyncDateTime

body('Select')?[1] = 2018-12-03T08:19:12Z

Based on the above we can compare the contents of body('Select')?[1] with utcnow().

Let's have a look at the all important condition.

@less(body('Select')?[1], addMinutes(utcNow(), -30))

EDIT: 6th February 2019 –

When this condition returns false, it means the last sync was less than 30 minutes ago, so there is no need to react; hence we use Terminate to complete the Flow as a success.

When it returns true, the sync time has exceeded 30 minutes, and since the Flow runs every hour we will definitely have to attend to a sync issue. Please feel free to drop this to 30 minutes or approach it in another way, as it could be a little sweeter. I am fine with it as is, and it saves Flow runs.
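The condition and its two outcomes can be sketched in Python, using the post's example timestamp and the same 30-minute threshold (the `sync_is_stale` helper is mine, purely for illustration):

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

THRESHOLD = timedelta(minutes=30)

def sync_is_stale(last_sync_iso: str, now: Optional[datetime] = None) -> bool:
    # Mirrors @less(body('Select')?[1], addMinutes(utcNow(), -30)):
    # true means the last sync is more than 30 minutes old -> raise the alert.
    last_sync = datetime.fromisoformat(last_sync_iso.replace("Z", "+00:00"))
    now = now or datetime.now(timezone.utc)
    return last_sync < now - THRESHOLD

# Example: pretend "now" is 2018-12-03 08:38:12 UTC.
now = datetime(2018, 12, 3, 8, 38, 12, tzinfo=timezone.utc)
print(sync_is_stale("2018-12-03T08:19:12Z", now))  # False -> Terminate as success
print(sync_is_stale("2018-12-03T07:53:12Z", now))  # True  -> send the alert email
```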

And to finish off, let's create the alert we require to go to the O365 Sync sub-folder.

Now let's close off this series of blogs with the final part, Accessing and alerting on data produced from Microsoft 365 Roadmap.

Please continue to part 8 …


Part 8 >>>

Do you need assistance building this Flow?

Visit the Get Help section of the Power Automate Community