Autopilot Prestager – A Power Automate Flow for staging Autopilot devices using an in-house inventory database.


This year we, like many others, were inundated with new laptops coming through the back door. Part of our onboarding process was to add group tags to devices depending on where they were going, which could be any of 100+ locations. So, while our OEM could add a group tag for us through the partner portal, that would not have done us much good.

Many of these orders were in quantities of 500 or more, but once they got to receiving, a stack of a hundred would get grabbed and scanned to go to location A, another stack would get scanned to location B, and so on. Even with smaller orders, if the group tag were something like a PO number, it wouldn’t be very efficient to add every PO to a dynamic rule for group memberships. But if every group tag were set to the location the device was going to, that would sync perfectly with our dynamic device groups, which is where most of our configurations and applications are assigned.

My quick fix for this was to write a PowerShell script that would add a group tag to each device in a CSV of serial numbers. This worked well for us, but it was still a manual, tedious process that would sometimes get stuck in the ticket queue, or someone would be out sick and the imaging bench would sit empty waiting for devices to get staged.
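As a rough illustration of what that quick fix did — not the author’s actual script, which was PowerShell — here is a Python sketch against the same Graph endpoints the Flow uses later (the beta windowsAutopilotDeviceIdentities list and its updateDeviceProperties action). The CSV column name and the token handling are assumptions:

```python
import csv
import io
import json
import urllib.parse
import urllib.request

GRAPH = "https://graph.microsoft.com/beta/deviceManagement/windowsAutopilotDeviceIdentities"

def serial_filter(serial: str) -> str:
    """OData filter that finds the Autopilot identity for one serial number."""
    return f"contains(serialNumber,'{serial}')"

def update_body(group_tag: str) -> dict:
    """Request body for the updateDeviceProperties action (group tag only here)."""
    return {"groupTag": group_tag}

def tag_devices(csv_text: str, group_tag: str, token: str) -> None:
    """For each serial in the CSV, look up the Autopilot identity and set its group tag."""
    headers = {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}
    for row in csv.DictReader(io.StringIO(csv_text)):  # assumes a SerialNumber column
        query = urllib.parse.urlencode({"$filter": serial_filter(row["SerialNumber"])})
        with urllib.request.urlopen(
            urllib.request.Request(f"{GRAPH}?{query}", headers=headers)
        ) as resp:
            devices = json.load(resp)["value"]
        if devices:  # update the first (and should-be-only) match
            update = urllib.request.Request(
                f"{GRAPH}/{devices[0]['id']}/updateDeviceProperties",
                data=json.dumps(update_body(group_tag)).encode(),
                headers=headers,
                method="POST",
            )
            urllib.request.urlopen(update)
```

The two helper functions mirror the two custom connector actions built later in this post.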

At the same time, we really wanted to go back to naming our devices by their asset tags instead of the PREFIX-%SERIAL% naming convention that we started with. That convention only seemed to cause confusion, especially for users calling in help desk tickets, and ultimately for the techs working those tickets: we would end up with asset tags in the ticket but no quick and convenient way to look up a device name by the asset tag provided.

Ideally, before the device was even taken out of the box, the name would be set by the asset tag and the group tag by the location in Autopilot. And no one would have to do it.


This one is not too hard to do if you have a little help from your DBAs. I am lucky enough to have some great ones on my team, and they gave me read-only credentials to run an on-premises data gateway with. I highly encourage you to get with them, show them what you are trying to do, and offer up any data that you can give them in return. It turns out mine could use a lot of data from Intune that they had been getting in other ways.

For this demo, we need to make three things: an AAD app registration, a custom connector, and the Flow. We’ll start with the app registration.

App Registration

I’m going to go over this pretty quickly since I covered it more in depth in my last post (MEM Script Helper – A Power App for viewing and editing MEM Scripts). Navigate to Azure AD, open the App registrations blade, and create a new application. Once in the Overview blade, copy the Application (client) ID (now’s a good time to turn on cloud clipboard if you haven’t already).

Assign it the DeviceManagementServiceConfig.ReadWrite.All Graph API permission and grant admin consent.

Open the Certificates & secrets blade and create a new client secret. Make sure to copy and secure it now because you will not be able to later. Treat this like a…secret.

While we’re here, we might as well add the redirect URI. This isn’t given to you until you save the security details in the custom connector but it’s always the same, so we’ll do it now.

Open the Authentication blade, click Add a platform, choose Web, and add “” as the redirect URI.

Custom Connector

Now that we have the permissions that we need for our custom connector, we can go build it. I also covered this more in-depth on the MEM Script Helper post.

Navigate to Power Apps (or Power Automate), open the Custom Connectors blade, create a new one from blank, and give it a name.

Once the connector opens, flip the Swagger Editor switch to expose the Swagger YAML editor and replace all of that text with the connector definition. This will add the two actions we need: one to list the Autopilot devices with a serial number filter applied, and one to update the device’s group tag and display name properties. It does not add any of the security information for you, though.
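The original definition isn’t reproduced here, but a minimal Swagger 2.0 sketch of such a connector might look like the following. The operation names, paths, and parameters are assumptions modeled on the two actions described above, and the security details still get filled in on the Security tab:

```yaml
swagger: '2.0'
info:
  title: Autopilot Prestager
  version: '1.0'
host: graph.microsoft.com
basePath: /
schemes:
  - https
paths:
  /beta/deviceManagement/windowsAutopilotDeviceIdentities:
    get:
      operationId: GetAutopilotDeviceBySerial
      summary: Get Autopilot Device by Serial
      parameters:
        - name: $filter
          in: query
          type: string
          description: "e.g. contains(serialNumber,'{serial}')"
      responses:
        '200':
          description: OK
  /beta/deviceManagement/windowsAutopilotDeviceIdentities/{id}/updateDeviceProperties:
    post:
      operationId: UpdateDeviceProperties
      summary: Update Autopilot Device Properties
      parameters:
        - name: id
          in: path
          required: true
          type: string
        - name: body
          in: body
          schema:
            type: object
            properties:
              groupTag:
                type: string
              displayName:
                type: string
      responses:
        '200':
          description: OK
```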

Flip the Swagger Editor switch back off and click on the Security tab. Select OAuth 2.0 and change the Identity Provider to Azure Active Directory. Paste the Client ID and Client secret that you copied from the app registration into their respective fields. Use “” as the Resource URL and click Create connector.

You can double-check that everything is working by going to the Test tab and supplying the serial number, display name, and group tag manually. If everything succeeds, we’re ready for the next step.


Navigate to Power Automate and create a new automated Flow, give it a name, and use “When an item is created (V2)” as the trigger.

Remember those DBAs? This is where their help pays off. You need the inventory database server name, database name, and table name, as well as a login with read permissions. In my case, I only wanted it to run when laptops or desktops came in, so I filtered the CategoryID column by 3 or 4 since that’s how my database was set up.
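One way to do that filtering is with a trigger condition in the Flow’s settings, so runs for other categories never even start. The expression below is a sketch that assumes the column is named CategoryID:

```
@or(equals(triggerOutputs()?['body/CategoryID'], 3), equals(triggerOutputs()?['body/CategoryID'], 4))
```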

For this demo, I made an Azure SQL database with two tables: a facility table and an equipment table. That part is out of scope for this blog post, so I’m not going to go into it. I’m not a DBA, but it only took me a couple hours of googling to get a database in my demo tenant that matched the structure of my work database.
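Since the demo database itself is out of scope, here is a tiny SQLite stand-in for the two-table shape being described. The table and column names are guesses modeled on this post, not the author’s actual schema:

```python
import sqlite3

# A minimal stand-in for the demo database: a facility table and an equipment
# table whose FacilityID column points at a facility row. Names are assumptions.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE Facility (
    FacilityID   INTEGER PRIMARY KEY,
    FacilityName TEXT NOT NULL
);
CREATE TABLE Equipment (
    EquipmentID  INTEGER PRIMARY KEY,
    SerialNumber TEXT NOT NULL,
    TagNumber    TEXT NOT NULL,
    CategoryID   INTEGER NOT NULL,   -- 3 = laptop, 4 = desktop in this post's setup
    FacilityID   INTEGER NOT NULL REFERENCES Facility(FacilityID)
);
INSERT INTO Facility VALUES (1, 'Sparks MS');
INSERT INTO Equipment VALUES (1, '5CD1234XYZ', 'A157122', 3, 1);
""")

# What the Flow effectively does across its two lookups: resolve the received
# device's row, then join over to the facility name that becomes the group tag.
row = con.execute("""
SELECT e.SerialNumber, e.TagNumber, f.FacilityName
FROM Equipment e JOIN Facility f ON f.FacilityID = e.FacilityID
WHERE e.SerialNumber = ?
""", ("5CD1234XYZ",)).fetchone()
print(row)  # ('5CD1234XYZ', 'A157122', 'Sparks MS')
```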

The next step is to use the serial number and the facility ID from the newly received device that was just scanned into inventory to get two things: the Autopilot identity, which was created well before we received the device since our OEM enters it at the factory, and the facility name, which lives in a separate table since it’s not in the equipment table. First up is the Autopilot identity.

In the New step “Choose an operation” menu, select Custom, pick the connector that we just created, and choose the second action, which should show “(serialNumber,{serial})”. In the Filter field, replace “{serial}” with the SerialNumber dynamic content wrapped in single quotes. We also need to rename the action because it’s over the character limit as it is.

Now click the “plus” button above the step you just created and select “Add a parallel branch”. This will allow the action we just created and our next action to run at the same time, making it just a bit faster.

For this step, we’ll be getting the facility name using the facility ID from the first step. Most databases use a structure like this: whether it is location, facility, department, or some other property that you want to use for your group tag, the equipment table probably won’t have it. It will usually just have a propxID column that has a relationship to the propx table.

On the “Choose an operation” menu, select “Get Row (V2)”. Select your database info in the dropdowns and use the FacilityID Dynamic content as the Row id.

Now create a new step and choose the other (first) action from the custom connector. We want the id value from the Get Autopilot Device action, but getting it is a little tricky. If you just select the dynamic content, it will create a “for each” loop, since the response to that action is part of an array, even if it’s an array of one object. This is because the JSON response from our Graph request was to list the devices; we just filtered by something we should only ever have one of. So, to avoid the unnecessary loop, we can tell it to use the first “id” for our value with this expression: “first(outputs('Get_Autopilot_Device_by_Serial')?['body/value'])?['id']”. For groupTag and displayName, use the FacilityName and TagNumber dynamic content.
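To see why that expression is needed, here is the shape of the response and the same first-element logic in Python. The payload below is a made-up sample shaped like the real one:

```python
# A filtered list call to Graph still returns its matches nested in a "value"
# array, even when exactly one device matches. Sample ID and serial are made up.
response = {
    "value": [
        {
            "id": "c8a2b7d0-0000-0000-0000-000000000000",
            "serialNumber": "5CD1234XYZ",
            "groupTag": "",
        }
    ]
}

# Equivalent of the Flow expression
#   first(outputs('Get_Autopilot_Device_by_Serial')?['body/value'])?['id']
# take the first element of the array (if any) and read its id.
matches = response.get("value") or []
first_id = matches[0]["id"] if matches else None
print(first_id)  # c8a2b7d0-0000-0000-0000-000000000000
```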

This completes the Flow. After saving, you can go to the Flow’s details page and see that there is no run history yet. I added a device to my equipment table and refreshed the run history, which then showed a successful run.

If you view the results of the run, it will show all the inputs and outputs of each step. Here you can see the inputs used in our final step: the Autopilot ID, “Sparks MS” for the groupTag, and “A157122” for the displayName.

You can also view the audit logs to verify what change was made, the application that made the change, and what properties were modified.

After a few minutes, that change will be reflected in the Autopilot devices blade in MEMac.


Using this method, you can fully automate your Autopilot device onboarding. In my organization, all of the data gathering is already done by receiving, so this eliminates someone having to double back on that work and duplicate it in Intune. This isn’t just a one-way street, either. We can also assign a user to the device in Intune and push that back to the database so inventory will always show who has the device. Our inventory also has a field showing whether a device belongs to a student or a teacher; we can use that property to add the device to an AAD group that has different restrictions or configurations applied based on the user type.

This is just the framework for building this kind of automation, but it’s highly customizable and minimally manual. Build it to suit your needs and stop worrying about those “lost” tickets.


  1. Hi, this looks like a great solution that we will look into.

    Would it be possible to have the screenshot be bigger? This would help better understand the entire process.

    Thank you in advance.

    1. Yup! I’m not sure what happened with the pictures but it should be better now.

  2. Thanks for the great walkthrough! I implemented something similar with Logic Apps, for API-based onboarding to our helpdesk. I think Flows are linked to user accounts by default though, so you might want to migrate to Logic Apps at some point.

    1. That’s a good point. I haven’t dived in yet but I should give it a go. Any starter tips?

