We recently had a higher education prospect ask us whether we could use PowerApps to facially recognize students and retrieve information on them as they enter an exam room. Being aware of the new Cognitive Services in Azure, and having some experience in creating PowerApps, I responded: well… yes, of course we can.

Later that week, I decided to put my response to the test and attempt to create a PowerApp which would capture an anonymous image using the camera control, use FaceAPI from Azure Cognitive Services to verify the image against all images stored on my Dynamics 365 contacts, and finally respond with the contact’s first name, last name and student number if a match was found.

This post will show what PowerApps is capable of when combined with Azure technologies and Dynamics 365.

What is Azure Cognitive Services?

Before we start, some background: Microsoft has released a set of APIs in Azure which implement machine learning algorithms, enabling natural and contextual interaction with your applications. These fall into five categories:

  • Language
  • Speech
  • Vision
  • Search
  • Knowledge

We are going to focus on Vision, and more specifically, FaceAPI. FaceAPI exposes image processing algorithms through several operations which provide face verification, face search, face grouping and face identification.

We want to utilize the functionality FaceAPI offers to iterate a set of stored images and validate a given anonymous image against them—in this case, the image captured by the PowerApp against the CRM contact images.

At the time of building this demo, FaceAPI was in preview and therefore had a few limitations, most notably rate limits on how frequently you could call the API.

Solution Overview

There are several ways to achieve facial recognition using the FaceAPI methods, but I have chosen to keep it as simple as possible by using the Verify method and simply providing it with the images to verify against. Alternatively, one could build a Person Group and individually add each Dynamics 365 contact to the group as a Person with an associated list of faces.

More information and API documentation on FaceAPI can be found at Microsoft Cognitive Services.

To deliver this solution, we need five components to work in harmony:

  • Azure Storage Account

We want to use an Azure Storage Account as a blob container for the Dynamics 365 contact images as well as the anonymous images submitted by the PowerApp. The container stores and provides an absolute URL to the images which is needed in the FaceAPI methods.

  • Azure App Service

A Web API service will be responsible for uploading the anonymous image to the storage blob, using FaceAPI to verify the images, and querying Dynamics 365 contact data.

  • FaceAPI

We will be using the FaceAPI methods to detect the faces in our images, as well as verify an anonymous image against a list of stored images.

  • D365 PowerApps

We will build a simple PowerApp to act as our user interface in capturing an image and surfacing Dynamics 365 data.

  • Dynamics 365

Dynamics 365 is acting as our data store for contacts.


Azure Storage Account

We want to use an Azure Storage Account to serve as the container for the Dynamics 365 contact images and our anonymous images uploaded when captured in the PowerApp.


I created two containers within the blob service named:

  • entityimages
  • powerappimages


Entityimages will contain all the images stored in Dynamics 365 on the contact records. I have chosen to populate the blob via a simple console application. Of course, in the real world, you may want to consider using a plugin on the contact to maintain these images in the blob instead, or possibly an Azure Function App scheduled to run at a specific interval.

I am naming the image files by the contact ID and adding some metadata to the images when uploading them to the blob.

Powerappimages will contain the anonymous images captured by the PowerApp camera control. Initially, my intention was to use Microsoft Flow to simply upload the captured image to the Azure blob; however, after much tinkering, I found it was not possible out of the box. Because of this limitation with Flow, I opted to create and use a Web API service responsible for uploading the image to the Azure blob and executing the FaceAPI logic. More on that below.

Console App Sample:
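A minimal sketch of such a console app, using the WindowsAzure.Storage and XrmTooling client libraries, might look like the following. The connection strings, the .jpg extension and the metadata key are illustrative assumptions, not taken from the original sample:

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.Xrm.Sdk.Query;
using Microsoft.Xrm.Tooling.Connector;

class Program
{
    static void Main()
    {
        var crm = new CrmServiceClient("AuthType=Office365;Url=https://yourorg.crm.dynamics.com;Username=...;Password=...");
        var storage = CloudStorageAccount.Parse("DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...");
        var container = storage.CreateCloudBlobClient().GetContainerReference("entityimages");
        container.CreateIfNotExists();

        // Pull the contacts along with the image stored on each record.
        var query = new QueryExpression("contact")
        {
            ColumnSet = new ColumnSet("contactid", "fullname", "entityimage")
        };

        foreach (var contact in crm.RetrieveMultiple(query).Entities)
        {
            if (!contact.Contains("entityimage")) continue; // no image on this record

            // Name the blob by contact ID so a verified match can be mapped
            // back to the Dynamics 365 record later.
            var blob = container.GetBlockBlobReference(contact.Id + ".jpg");
            blob.Metadata["contactid"] = contact.Id.ToString();
            var image = (byte[])contact["entityimage"];
            blob.UploadFromByteArray(image, 0, image.Length);
        }
    }
}
```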

Azure App Service

As mentioned, PowerApps and Flow do not currently offer any OOB functionality to upload a captured image to a specific Azure Storage blob; therefore, we need to create a custom Web API service, hosted as an Azure App Service, which will do this work for us.

First, we need to create a Cognitive Services resource in Microsoft Azure. You can simply search on “Cognitive” to quickly find the Cognitive Services resource within Azure.


Create a new account and make sure you use Face API as the API Type.


Once created, take note of the Face API Resource Key which will be used when connecting to the service going forward.


To simplify things, I am going to create the WebApi solution based on the solution in this blog post by Microsoft: How to upload images from the camera control to Azure blob storage. This will cover the creation of the WebApi service, publishing of the app to Azure, and finally the registration of the custom service in PowerApps!

I will be making changes to the UploadImage method to include the FaceAPI functionality and the Dynamics 365 contact lookup on successful facial recognition. I will also add to the UploadedFileInfo class to include FirstName, LastName, StudentID and Confidence properties which will be used in the PowerApp form.

To implement these changes, you will first need to add the NuGet packages for Dynamics 365 and FaceAPI to the solution mentioned above:

  • Microsoft.ProjectOxford.Face
  • Microsoft.CrmSdk.CoreAssemblies
  • Microsoft.CrmSdk.Deployment
  • Microsoft.CrmSdk.Workflow
  • Microsoft.CrmSdk.XrmTooling.CoreAssembly

For us to facially verify a person in an image, we first need to detect the face within the image. We do this by calling the DetectAsync method, which returns a list of detected faces with their attributes. We perform a Detect on both the anonymous image and each image stored in the entityimages blob from Dynamics 365. Once we have detected the face in the anonymous image, we then verify it against the face in the Dynamics 365 contact image by calling the VerifyAsync method and passing the two detected face IDs for comparison.

The VerifyAsync method will return a result with IsIdentical and Confidence properties telling us if the faces were a match. Once we validate that the faces are a match, we can strip the contact ID from the filename (or alternatively use the metadata on the blob object) to retrieve the contact entity from Dynamics 365.

Changes made to the UploadImage method:
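In outline, and assuming a FaceServiceClient built with the Face API key from earlier, the added logic looks something like this. Here fileInfo is the UploadedFileInfo being returned, and new_studentid is a stand-in for whatever custom field holds the student number:

```csharp
// Sketch of the added verification logic. faceClient is a
// Microsoft.ProjectOxford.Face.FaceServiceClient; anonymousUrl is the
// absolute blob URL of the image the PowerApp just uploaded.
var anonymousFaces = await faceClient.DetectAsync(anonymousUrl);
if (anonymousFaces.Length == 0)
    return fileInfo; // no face detected in the captured image

foreach (var blob in entityImagesContainer.ListBlobs().OfType<CloudBlockBlob>())
{
    var storedFaces = await faceClient.DetectAsync(blob.Uri.AbsoluteUri);
    if (storedFaces.Length == 0) continue;

    var result = await faceClient.VerifyAsync(anonymousFaces[0].FaceId, storedFaces[0].FaceId);
    if (result.IsIdentical)
    {
        // The blob is named by contact ID, so the filename gives us the GUID.
        var contactId = new Guid(Path.GetFileNameWithoutExtension(blob.Name));
        var contact = crm.Retrieve("contact", contactId,
            new ColumnSet("firstname", "lastname", "new_studentid"));

        fileInfo.ContactID = contactId.ToString();
        fileInfo.FirstName = (string)contact["firstname"];
        fileInfo.LastName = (string)contact["lastname"];
        fileInfo.StudentID = (string)contact["new_studentid"];
        fileInfo.Confidence = result.Confidence;
        break;
    }
}
```

Note that detecting every stored image on every request is wasteful under the preview rate limits; caching the stored face IDs, or using a Person Group as mentioned earlier, would scale much better.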


D365 PowerApp

Next, I have created a basic PowerApp consisting of a camera control, toggle button, and four text fields to surface the student data.


I will go through each control and the configuration needed to make the solution work.

Camera Control

The camera control has an event named OnSelect available for us to write some code against, and this is exactly where we will plug in our Web API service.

If you followed the MS blog correctly, you should have a data source available named ImageUploadAPI which represents the Web API service we created and published to Azure. This data source gives us access to the service method UploadImage and its return type UploadedFileInfo.


So, by clicking on the camera control, selecting the Action tab and clicking On select, we can add code to the event, much like you would write an Excel formula.


The UpdateContext function is used to create a context variable, which temporarily holds a piece of information: in our case, the object returned by the Web API service. The ResultValue variable acts as a data source once created and can be bound to any control. We can also string multiple methods one after another, separated by semicolons; the PowerApp will execute them sequentially.
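Assuming the camera control has the default name Camera1, and with the ImageUploadAPI data source registered as above, the OnSelect formula is along these lines:

```
UpdateContext({ResultValue: ImageUploadAPI.UploadImage(Camera1.Photo)})
```

Once this runs, the properties of the returned UploadedFileInfo object become available to the rest of the screen through ResultValue.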

In my demo, I chained a Microsoft Flow after the UploadImage method; the Flow accepts the ContactID, retrieves the contact from Dynamics 365 and fires off an email notifying them that they have just been successfully verified.
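Sketched out, that chained formula looks roughly like this, where Camera1 is the default camera control name and NotifyStudentFlow is a stand-in name for the Flow (your Flow's name will differ):

```
UpdateContext({ResultValue: ImageUploadAPI.UploadImage(Camera1.Photo)});
NotifyStudentFlow.Run(ResultValue.ContactID)
```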

Toggle Button

The toggle button simply allows the user to switch between the front and rear cameras on a mobile device, and the configuration is really straightforward.

Click on the camera control, and within the right-hand side properties pane, click the Advanced tab.

In the Design section, enter Toggle1.Value in the Camera property.


Text Boxes

Finally, we have four textboxes representing First Name, Last Name, Student ID and Confidence. Earlier on the camera control, we explicitly created the variable ResultValue which will be populated by our Web API service when uploading and verifying the captured image. All that is left to do is to set these textbox values to the ResultValue object and relevant property.

ResultValue represents the UploadedFileInfo class in the Web API service, and we will be using the FirstName, LastName, StudentID and Confidence properties to populate the textbox values. We also want to validate that the ContactID property contains a valid contact GUID when a match was verified, or an empty GUID when no match was found.

Click on the textbox, and within the right-hand side properties pane, click Advanced and set the Text properties to:


First Name field:

Last Name field:

Student ID field:

Confidence field:
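In text form, the four Text properties are along these lines, assuming the ResultValue context variable created on the camera control, that ContactID is returned as a string, and that an empty GUID indicates no match:

```
// First Name
If(ResultValue.ContactID <> "00000000-0000-0000-0000-000000000000", ResultValue.FirstName, "No match")
// Last Name
If(ResultValue.ContactID <> "00000000-0000-0000-0000-000000000000", ResultValue.LastName, "")
// Student ID
If(ResultValue.ContactID <> "00000000-0000-0000-0000-000000000000", ResultValue.StudentID, "")
// Confidence
If(ResultValue.ContactID <> "00000000-0000-0000-0000-000000000000", Text(ResultValue.Confidence), "")
```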


The Final Result


Ensure you have set a few contact images on your contacts in Dynamics 365.

Make sure you run the console application code to upload all Dynamics 365 contact images to the Azure Storage Blob.


Install the PowerApps app on your mobile device of choice, or simply use the PowerApps designer to test the PowerApp. I have installed PowerApps on my iPhone 7 and have signed in using my Dynamics 365 credentials.


Now you can simply capture an image of the contact by tapping the camera control, or take a picture of yourself if you have uploaded a selfie to one of your contacts. The PowerApp will query the Web API service and finally return the result of the verification.

Facial Recognition using Dynamics 365, PowerApps and FaceAPI


Happy coding!