Monitor changes to Intune using Azure Functions, GraphAPI & PowerShell

In my last post, I showed you how to move a very common task – authenticating into the GraphAPI – up into an Azure Function App.

Now that our authentication function has been turned into a REST endpoint, we can stop focusing on how we get authenticated and start doing some really interesting stuff inside the environment. Case in point: a colleague (@OnPremCloudGuy) and I were wondering if we could create a solution to monitor the Intune configuration of a tenant for any changes that might get made without our approval. After a bit of digging around the GraphAPI documentation, and a few beers later, we had a very rough proof of concept to show that you can indeed report on changes in Intune!

The basic requirements of the solution involve 3 things:

  • A way to capture the existing configuration
  • Somewhere to store that configuration
  • And finally, a way to compare that configuration against the current Intune configuration

We can achieve all three of these requirements with a single Azure Function App, so let’s jump right in and configure everything.

    First, let’s create the Function App – give it a cool name, stick it in a resource group, and make note of the storage account used (new or existing doesn’t matter – we just need to know where everything is stored).

    function app

    Once your new Function App is created, we will now set up a container to store the configuration snapshots.

    Remember the storage account name? Let’s go into the storage accounts blade and select that account, and head into the containers blade.

    containers

    You won’t see much here – just a single container housing your Function App. Let’s create a new container – I’ll call mine intunesnapshots. Keep the public access level at private.

    new containers

    Go into the properties of your freshly made container and make a note of the URL – you’ll need this later.

    While we are in the storage account, let’s also create a Shared Access Signature so that we can access the snapshots without too much effort. There are better ways to do this, and if I wasn’t specifically using PowerShell as the language of choice I’d go into them, but for now let’s just create SAS codes and move on to the cool stuff. Set the expiry time to a few days from now (or years – it’s your tenant, just be aware of the security implications), generate the SAS codes and, again, store them for later.

    SASsy

    Alright – your storage is set up and access to it is secured, so now let’s set up the functions. Head over to the Function App blade and add a new function to your newly created Function App. Let’s start with the snapshot creation endpoint.

    As always (or until PowerShell moves out of the experimental phase), set Experimental Language Support to Enabled, choose PowerShell as your language of choice, and create an HTTP trigger.

    Snapshot Function

    As you will come to learn, I am a big fan of moving any user customizable settings/script configuration out of the main code block and into an appConfig.json file. It’s a handy way for the end user to be able to modify what the script does without being overwhelmed by a wall of code (and it makes demoing code that might have sensitive data all that easier!), so like in the last post, we are going to add a new file to the function. Your file structure should end up looking like this:

    appconfig.json

    We aren’t going to put much into this config file right now – we just want to move the URL of the GraphAPI Connector that we made in the last post outside of the code so that if we ever want to change it, we know where to do so! The layout of your config should look as follows:
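A minimal sketch of what that config could look like – the property name graphConnectorURI is the one used later in this post, and the URL is a placeholder for your own function endpoint:

```json
{
    "graphConnectorURI": "https://<yourfunctionapp>.azurewebsites.net/api/GraphConnector?code=<your-function-key>"
}
```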

    Now we need to add some extra outputs to our function, so head over to the Integrate section, hit New Output and select Azure Blob Storage.

    Azure Blob Storage

    Set your storage account connection to AzureWebJobsStorage and update your path as shown in the screenshot below (anything in curly braces is a variable placeholder that gets its value from user input – very cool, very handy!).

    Blobby!!
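Behind the portal UI, this output binding ends up in the function’s function.json. A sketch of what the binding could look like is below – the binding name outputBlob and the exact path pattern are illustrative assumptions; the curly-brace tokens resolve from properties of the incoming request:

```json
{
    "type": "blob",
    "name": "outputBlob",
    "path": "intunesnapshots/{tenant}-{query}.json",
    "connection": "AzureWebJobsStorage",
    "direction": "out"
}
```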

    Back to the main script file run.ps1, replace the default “hello world” sample code with the following:
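A rough sketch of that run.ps1 is below. This is not the exact code from the post – the config path, function folder name (IntuneSnapshot) and property names are assumptions – but it shows the shape of the logic. In the experimental PowerShell runtime, $req and $res are file paths for the request body and response, and $outputBlob is the file path backing the blob output binding we just created:

```powershell
# $req is a file path containing the raw request body in the experimental runtime
$requestBody = Get-Content $req -Raw | ConvertFrom-Json
$tenant = $requestBody.tenant
$query  = $requestBody.query

# Pull the GraphConnector endpoint out of appConfig.json (path is an assumption)
$appConfig = Get-Content "$env:HOME\site\wwwroot\IntuneSnapshot\appConfig.json" -Raw | ConvertFrom-Json

# Authenticate by calling the GraphConnector function from the last post
$auth = Invoke-RestMethod -Method Post -Uri $appConfig.graphConnectorURI `
    -Body (@{ tenant = $tenant } | ConvertTo-Json) -ContentType 'application/json'

try {
    # Query the GraphAPI and write the result out through the blob binding
    $graphData = Invoke-RestMethod -Method Get -Uri "https://graph.microsoft.com/beta/$query" `
        -Headers @{ Authorization = "Bearer $($auth.access_token)" }
    $graphData.value | ConvertTo-Json -Depth 20 | Out-File -Encoding Ascii -FilePath $outputBlob
    $result = @{ snapshotCreated = $true; tenant = $tenant; query = $query }
}
catch {
    $result = @{ snapshotCreated = $false; error = $_.Exception.Message }
}

# Send the requestor a pass/fail object back
Out-File -Encoding Ascii -FilePath $res -InputObject ($result | ConvertTo-Json)
```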

    A quick breakdown of the code above: we are capturing the tenant details and the final endpoint of the Intune GraphAPI from end-user input – I’ll show how this is done in the next step. Everything else is set up from the appConfig.json file. All we are doing is querying the GraphAPI (using our handy-dandy GraphConnector function), storing the resultant JSON file in our storage account and then sending the requestor a pass/fail object back.

    To validate that this function is working how we want it, let’s go open the test pane and well… test it out!

    For this demonstration, let’s just pull back the list of sidecar scripts in our Intune environment – use a tenant you have configured in your GraphConnector function and the query value as shown below:
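The test request body could look something like the following – the tenant value is a placeholder for your own, and deviceManagement/deviceManagementScripts is the beta GraphAPI endpoint that, to my understanding, houses the sidecar scripts:

```json
{
    "tenant": "yourtenant.onmicrosoft.com",
    "query": "deviceManagement/deviceManagementScripts"
}
```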

    If all is good, you should receive the following in the output pane:

    success!

    We can now validate that the blob file was created by going to the intunesnapshots container in the Function App storage account and confirming that the file exists.

    More Success!

    See how the name has been created using our tenant name & the value of the query we provided in the test? The benefit of this is that we can query multiple tenants and multiple endpoints of the Intune environment and store everything inside the same location.

    Alright, now that our snapshots are being created, let’s create a CRON job function to periodically monitor our Intune environment for any changes.

    This time when creating the function, select Timer Trigger. Again, as always, give the function a cool name, but this time set up a schedule – those who know their CRON expressions won’t have any issues here. For those of you that see the sunlight from time to time, here’s a handy guide on how Azure Function Apps expect the format.

    In the below screenshot, I’ve set my schedule to run every hour.

    CRON Job
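For reference, Azure Function timer triggers use a six-field NCRONTAB expression rather than the five-field format most cron users know – there is a leading seconds field. A few examples:

```
# {second} {minute} {hour} {day} {month} {day-of-week}
0 0 * * * *      # at the top of every hour
0 */5 * * * *    # every 5 minutes
0 0 6 * * 1-5    # 6:00 AM, Monday to Friday
```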

    As always, create yourself an appConfig.json file. You’ll want to get the URL of the container that you stored at the top of this guide and place it at the start of the currentSnapshot property, followed by the SAS code. The graphConnectorURI is the full function URL of the graphConnector function that you created from the last post.

    Unlike in the last function, because we are triggering this script on a schedule where there will be no input, we will be placing the tenant details & the query property in here as well – if you want to monitor multiple tenants, it is as simple as duplicating this function and updating the appConfig.json file.

    Finally, the graphVer property is there because right now most of the Intune config data is only accessible via the beta GraphAPI. This will eventually change, so it’s best to move this variable outside of the code for easy modification once it becomes GA.
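Putting those properties together, the appConfig.json for this function could look something like the sketch below – all values are placeholders, and whether currentSnapshot points at the container or the specific blob depends on how you build the snapshot URL in your code:

```json
{
    "currentSnapshot": "https://<storageaccount>.blob.core.windows.net/intunesnapshots/<snapshotname>.json?<SAS-token>",
    "graphConnectorURI": "https://<yourfunctionapp>.azurewebsites.net/api/GraphConnector?code=<your-function-key>",
    "tenant": "yourtenant.onmicrosoft.com",
    "query": "deviceManagement/deviceManagementScripts",
    "graphVer": "beta"
}
```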

    Now in the main script file, again as before, replace the default “hello world” sample code with the following:
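A rough sketch of what that comparison script could look like is below. Again, the config path, folder name (IntuneMonitor) and property names are assumptions, and Compare-ObjectProperties is the helper from Jamie Nelson’s TechNet article, assumed to be defined earlier in the file:

```powershell
$appConfig = Get-Content "$env:HOME\site\wwwroot\IntuneMonitor\appConfig.json" -Raw | ConvertFrom-Json

# Load the stored snapshot straight from blob storage via the SAS-signed URL
$snapshot = Invoke-RestMethod -Uri $appConfig.currentSnapshot

# Capture a fresh copy of the same Intune configuration
$auth = Invoke-RestMethod -Method Post -Uri $appConfig.graphConnectorURI `
    -Body (@{ tenant = $appConfig.tenant } | ConvertTo-Json) -ContentType 'application/json'
$current = (Invoke-RestMethod -Uri "https://graph.microsoft.com/$($appConfig.graphVer)/$($appConfig.query)" `
    -Headers @{ Authorization = "Bearer $($auth.access_token)" }).value

# Compare each snapshot object against its current counterpart, property by property
$changes = foreach ($old in $snapshot) {
    $new = $current | Where-Object { $_.id -eq $old.id }
    $diff = Compare-ObjectProperties $old $new
    if ($diff) {
        [PSCustomObject]@{ id = $old.id; snapshot = $old; current = $new; differences = $diff }
    }
}

if ($changes) {
    $result = @{ ChangesFound = $true; Changes = $changes }
}
else {
    $result = @{ ChangesFound = $false }
}

# On a timer trigger there is no HTTP response object, so report the result as output
Write-Output ($result | ConvertTo-Json -Depth 20)
```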

    The code above is, again, fairly simple – the heavy lifting is done by the Compare-ObjectProperties function that appears in Jamie Nelson’s TechNet article (found here). What we are doing is simply loading up the snapshot JSON object, capturing a new copy of the Intune config and comparing each property and its value. If there are any changes, they are noted and reported back to the requestor. If no changes are found, that is also noted and sent back.

    Alright, so let’s see this in action! Before we make any changes, let’s just make sure that when we trigger this new function that it confirms there are no changes. Run the code and you should receive the output as shown below.

    Success!! Nothing Changed!!

    ChangesFound = False.

    Perfect – the script has run, compared itself against the snapshot and of course, nothing has changed.

    Now, let’s break stuff.

    I’ll go into my Intune environment and change one of our scripts to run as the logged-on user instead of the system account (nothing groundbreaking, I know, but if this happened in a production environment, there’s a high chance the script would fail).

    Very Scary Stuff

    Alright, now we could wait an hour to see the function work, but let’s rush things along and force the script to run.

    For this, I’m going to use Postman because it handles the large amount of data we will be returning better than the test pane in Azure does. Remember to grab the function URL from the Function App!

    All you’ll need to do here is paste the URL into Postman, set the method to POST and hit send.
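If you’d rather stay in PowerShell than reach for another tool, Invoke-RestMethod can fire the same request – the URL below is a placeholder for your own function URL:

```powershell
# Force the monitoring function to run right now instead of waiting for the timer
$functionUrl = 'https://<yourfunctionapp>.azurewebsites.net/api/<functionname>?code=<your-function-key>'
$response = Invoke-RestMethod -Uri $functionUrl -Method Post
$response
```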

    Changes ahoy!

    Success!! As you can see, we receive back very detailed information about the original snapshot configuration, what the new configuration looks like, as well as an itemized breakdown of each value that changed!

    Ahh, but Ben, I hear you say – the way the monitoring script is currently configured, the resultant output will not have anywhere to go… it’s on a timer and it is designed to spit the output back to a requestor!

    You are correct – but now that we know we can detect changes, let’s mull over what we could possibly do with all this power…

    As always, the code from this post will be available on my GitHub, and I always appreciate feedback, so either leave a comment below or reach me on Twitter @powers_hell.
