Tuesday, 18 September 2018

An Azure Function to keep other Functions/URLs warmed up

Just a very quick post to share some code that might be useful. If you’re running Azure Functions on a consumption plan (i.e. pay for what you use), you might know that a function app can go to sleep, leading to cold start issues. If it’s code you’re running behind a web part, PowerApp or other user-facing thing (usually on an HTTP trigger), then every now and then users will get really bad performance as they wait for the function app to spin back up. Currently the idle timeout period for Azure Functions is 20 minutes – so if there are periods where your function doesn’t run, you’ll suffer from this issue.

As noted in Understanding serverless cold start and Azure Functions cold starts in numbers, just how long it takes to warm up your function depends on many things. JavaScript functions can be slower than other languages if you’re using lots of npm modules, for example.

You *could* switch to an App Service plan for your function. But then your costs will (typically) be much higher, because you’re paying for dedicated VM instances in Azure rather than taking advantage of one of the key benefits of serverless: paying only for the time your code is executing.

So, why not have another function which runs on a timer and just hits a series of URLs you specify in config?  This is a nice way of having one centralized function that can help out with *many* other functions distributed across many function apps on a consumption plan. So long as this code runs more frequently than the timeout period, you can be guaranteed that your other functions will be kept alive and performance will be good. Of course, there are numerous ways you could do this (a PowerShell script running somewhere, a Flow which executes on a schedule etc.) but if you like the idea of another function, then you can copy/paste my code below and get the solution implemented in minutes.

Here’s some code for a C# function to do this. You might want to take things further in terms of error-handling or something else, but it does the core job and might save you a quick coding/testing exercise - simply create a new C# function and use this as the implementation:
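The original listing isn’t reproduced here, but a sketch of such an implementation (v1 C# script model, i.e. a run.csx; it assumes an app setting named “EndPointUrls” as described below – everything else is illustrative) might look like:

```csharp
// run.csx - a sketch, not a definitive implementation. Assumes an app setting
// named "EndPointUrls" containing semicolon-separated URLs to warm up.
using System;
using System.Net.Http;
using System.Threading.Tasks;

// Static HttpClient, shared across executions (best practice for functions).
private static readonly HttpClient httpClient = new HttpClient();

public static async Task Run(TimerInfo myTimer, TraceWriter log)
{
    log.Info($"Warm-up function executed at {DateTime.Now}. Past due: {myTimer.IsPastDue}");

    string endPointUrls = Environment.GetEnvironmentVariable("EndPointUrls");
    if (string.IsNullOrEmpty(endPointUrls))
    {
        log.Warning("No 'EndPointUrls' app setting found - nothing to warm up.");
        return;
    }

    foreach (string url in endPointUrls.Split(new[] { ';' }, StringSplitOptions.RemoveEmptyEntries))
    {
        try
        {
            // A simple GET is enough to keep the target function app warm.
            HttpResponseMessage response = await httpClient.GetAsync(url.Trim());
            log.Info($"Pinged {url} - status {response.StatusCode}");
        }
        catch (Exception ex)
        {
            log.Error($"Failed to ping {url}", ex);
        }
    }
}
```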

The code above expects an App Setting named “EndPointUrls”, where the value contains a semicolon-separated list of URLs - the function will hit each of these:
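For local testing, the equivalent setting could live in local.settings.json along these lines (the URLs here are hypothetical placeholders):

```json
{
  "IsEncrypted": false,
  "Values": {
    "EndPointUrls": "https://myfuncapp1.azurewebsites.net/api/HttpTrigger1?code=key1;https://myfuncapp2.azurewebsites.net/api/HttpTrigger2?code=key2"
  }
}
```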


You should then see a successful execution every 10 minutes (or whatever schedule you chose):


NOTE – you should be able to run *this* function on a consumption plan too without any problems. I’ve seen issues with timer functions not running in some circumstances, but none should apply in this case. Perhaps look out for these in your other timer functions though:

  • Functions which make async HTTP calls with HttpClient or similar, but are not async “all the way down”. To make use of this with “await”, you should change the signature of your Run() method to be:
  • Functions which run for longer than the schedule e.g. runs for 1 minute 20 seconds but is scheduled every minute. Of course you’ll miss executions!
    • NOTE – the myTimer.IsPastDue property shown in my first log statement can help you understand if a given execution was delayed for this reason
  • Functions being missed due to app restarts
    • You might want to consider what other things exist in the same Function App
  • Look at the “useMonitor” flag for frequently-running functions
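On the first bullet above: in the v1 C# script model, an “async all the way down” timer function has a Task-returning signature - something like this sketch (httpClient and url would be declared elsewhere in your function):

```csharp
public static async Task Run(TimerInfo myTimer, TraceWriter log)
{
    // Because Run returns Task, HTTP calls can be awaited properly
    // rather than blocked on with .Result or .Wait():
    HttpResponseMessage response = await httpClient.GetAsync(url);
}
```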

Also see:

Friday, 31 August 2018

Using a custom Azure Function in PowerApps (e.g. to call the Graph)

In the last post we looked at how to call out to your custom code from Flow. In this article I want to talk about the similar process for PowerApps, so that we can get to the point where our own code is running - perhaps when a button is clicked in the PowerApp, or when a screen in the app loads. This article is essentially a companion article to the last one. In fact, we did the important work of creating a custom connector (required by both PowerApps and Flow to call into our function code) there – so if you’re using my posts as a guide, you’ll need to reference my Flow article even if you’re only working with PowerApps. Here are the links again:

In both cases, it’s all about creating a custom connector which talks to your Azure Function, and then using the connector appropriately in PowerApps/Flow.

A recap on the overall process

Essentially the bit that’s different between PowerApps and Flow is the bit at the end – the overall process looks like this:


A PowerApp to update the user’s profile

In my last article, I used the idea of an Azure Function/custom connector that can update the user’s Office 365 profile from Flow. We’ll use the very same Azure Function/custom connector but from PowerApps this time. Since we already have the connector defined, it’s fairly easy to start using it in PowerApps.

I created a fairly simple PowerApp that uses each of my API’s functions:

  • Fetch profile details –> run when the “Lookup user” button is clicked
  • Update profile details –> run when the “Update user” button is clicked

In some ways, this could potentially be a useful PowerApp in the enterprise with a bit more work. Perhaps it could be useful for first line employees who rarely use a desktop PC, but occasionally need to update their profile for example. In any case, the overall recipe of a PowerApp which can do powerful things by calling into an Azure Function is certainly a useful technique to have in the toolbox.

For my app, I didn’t do too much on the user interface - here’s what it looks like:


In real life I’d lose the “Lookup user” button and just have the user’s existing data be loaded when the app loads – but for now, having the two buttons is useful to help illustrate my two methods and how to wire things up. If you did want to fetch the data without a button click, you would just take the PowerApps formula in the “Lookup user” button’s OnSelect event and use it in the first screen’s OnStart or OnVisible event instead.

Adding a data source

Once you have your connector created and made available, to use it in PowerApps you need to add a data source which points to the connector. You can then use your methods in PowerApps formulas. Creating the data source is done like this:



In the panel that appears now, find the connector which represents your API/Azure Function:


It should then be successfully added:


That’s it – you can now call your Azure Function from PowerApps.

Wiring up to app controls

Essentially, what you’ll find is that when you’re typing a formula somewhere in PowerApps, your API is available and has some auto-completion/IntelliSense on the methods. So in my case, I can start typing ‘cob’ and then my API’s methods appear:


When I select a method, I get prompted for the parameter(s) required:



Using in practice

The first thing we do is set the default value of the text input controls to the appropriate value of the JSON object returned from fetching the user’s data – for example, here’s the display name text box:
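The screenshot isn’t reproduced here, but the idea is that the Default property of the display name text box points at a child value of the variable we’ll populate shortly - along the lines of:

```
fetchedUserData.displayName
```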


All we’re doing here is working with variables in PowerApps, but if you’re new to this there is some weirdness compared to coding. You need to understand PowerApps commands such as Set, UpdateContext, Collect/ClearCollect etc. This guide helps you out - Understand canvas-app variables in PowerApps

Once our text input controls are set to display the values of the “fetchedUserData” variable, we populate that object by calling into our custom connector.

On the “Lookup user” button’s OnSelect event  (or screen OnVisible event) – note that we’re passing in the current user’s account name from the label next to the photo. This is simply obtained from User() function in PowerApps, specifically User().Email (I’m assuming the e-mail address matches the account name in this case):
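The formula itself was shown in a screenshot; it amounts to something like the following, where 'COBUserProfileApi' and FetchUserProfile are hypothetical names standing in for the custom connector and its fetch operation:

```
Set(fetchedUserData, 'COBUserProfileApi'.FetchUserProfile(User().Email))
```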

Consider that what we're doing here is calling the function once, and then storing the overall JSON returned as a variable. This matches up with how I've assigned the default value of my text input controls i.e. fetchedUserData.displayName.

On the “Update user” button:
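Again the formula was a screenshot; the shape of it is a single call to the update operation, passing the current values of the input controls (connector, operation and control names here are all hypothetical):

```
'COBUserProfileApi'.UpdateUserProfile(User().Email, txtDisplayName.Text, txtJobTitle.Text)
```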

NOTE: Don’t make the mistake of calling your function multiple times, once for each time you reference a value. That would look like this:
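In formula terms, the anti-pattern amounts to each control invoking the connector in its own Default property (hypothetical names again):

```
// BAD - Default property of txtDisplayName, calling the function:
'COBUserProfileApi'.FetchUserProfile(User().Email).displayName

// BAD - Default property of txtJobTitle, calling the function AGAIN:
'COBUserProfileApi'.FetchUserProfile(User().Email).jobTitle
```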

As per the comment, this approach means that your function is called multiple times rather than just once. Instead, ensure you call your function just once and then store the returned JSON in a variable. Each form control then references a child element of that JSON.


That’s it! You can now use Azure Functions within your PowerApps, which opens up lots of possibilities. In production you’ll need to consider who the custom connector is shared with/how it is made available (amongst other things), but the general process is quite straightforward once you’ve done the hard work of creating your OpenAPI definition etc.

Sunday, 22 July 2018

Using a custom Azure Function in Flow (e.g. to call the Graph)

PowerApps and Flow are awesome, but at some point you might need to extend them with some custom code. When integrating with other systems, you can do a lot with Flow (and the HTTP action in particular), but a lot more doors are open to you if you can plug in an Azure Function to your app or process. Perhaps you have a particular system you need to talk to, or some custom logic that is easier to express in code rather than in Flow. Or perhaps you need to call the Graph, and do it from your custom code rather than another way. Over the next two articles I want to talk about how to create a custom connector so that you can talk to your function from different places in Office 365:

  • Using a custom Azure Function in Flow [this article]
  • Using a custom Azure Function in PowerApps

We’ll do most of the work in this article. If you haven’t already gone through the process of working with an OpenAPI/Swagger definition for a custom API, the key point is that your methods get individual parameters which are strongly-typed, and this becomes easy to work with in PowerApps/Flow. You have to do some work for this, but it’s worth it considering that underneath, it’s just a big ugly string of JSON being passed between PowerApps/Flow and the function. In the example I’m using in these articles, I have a function which updates the Office 365/AAD record for the current user (using the Graph) – having done the OpenAPI work and defined a custom connector, in Flow the action for my API exposes individual parameters, which makes it simple to call:


It’s a similar story in PowerApps. So these steps are pretty much mandatory if you want to plug in some custom code to these technologies.

Let’s walk through the process needed to piece these things together.

Defining the Azure Function

First off, we need an Azure Function which is going to do the work. I won’t walk through every aspect of this since our focus here is on use of Swagger to define the API and make it callable from PowerApps/Flow. But the setup is:

  • A HTTP trigger function which receives some data passed as JSON in the request body (from Flow/PowerApps/some other client)
    • For function authentication, we just use the “key” option
  • An AAD app registration which has permissions to call the appropriate Graph endpoints
  • OPTIONAL - Use of Managed Service Identity + Azure Key Vault to obtain the Client ID/secret for the AAD app registration (N.B. this is what you should do and is fairly trivial, but I’ll keep this example simple by omitting it – see How to use Azure Managed Service Identity in App Service and Azure Functions for the MSFT docs on this)
  • Code in the Function to take the passed data and call the /v1.0/users/{id} endpoint with the PATCH method to update the user profile
A note on authentication
I’m using function/key authentication (where the caller has to pass the code as a querystring parameter on the function URL e.g. /api/DoUpdate?code=foo). If you didn’t want the caller to have to provide the key, one option could be to use an Azure Function proxy in front of your function – this would supply the key to the back-end, but provide a plain URL to callers.
Also note it is possible to use AAD authentication between PowerApps/Flow and your function. Things are more involved here though, since you need 2 x AAD app registrations (one for your connector, one for your API) and the right configuration – see https://docs.microsoft.com/en-us/connectors/custom-connectors/create-web-api-connector#add-the-custom-connector-to-microsoft-flow for more on this. You might want to start with key authentication first though.

Here’s the code – some notable points:

  • There’s quite a lot of logging – useful as you’re getting things working, but you might dial it down slightly once you’re happy
  • You should use a static class level HttpClient object, rather than instantiating a new one during each execution (this is a best practice with Azure Functions which make HTTP calls with this object)
  • The data passed from the client is basically in exactly the same format that the Graph endpoint needs – this simplifies our function code, since we just need to serialize it again
  • ADAL.Net is used to obtain the token for AAD app registration
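The listing itself isn’t shown here, but a condensed sketch of the approach (C# script model with ADAL.NET pulled in via project.json/NuGet; the tenant placeholder, setting names, and the idea of passing the user’s UPN on the query string are all illustrative) looks like:

```csharp
#r "System.Web"
// ADAL.NET (Microsoft.IdentityModel.Clients.ActiveDirectory) referenced via project.json.
using System;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;
using Microsoft.IdentityModel.Clients.ActiveDirectory;

// Static HttpClient, reused across executions (best practice).
private static readonly HttpClient httpClient = new HttpClient();

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("Profile update function triggered.");

    // The request body is already in the shape the Graph expects, so pass it straight through.
    string json = await req.Content.ReadAsStringAsync();

    // Illustrative only: identify the user to update via a 'upn' querystring parameter.
    string upn = System.Web.HttpUtility.ParseQueryString(req.RequestUri.Query)["upn"];

    // Get an app-only token for the Graph using the AAD app registration (ADAL.NET).
    var authContext = new AuthenticationContext("https://login.microsoftonline.com/[tenant].onmicrosoft.com");
    var credential = new ClientCredential(
        Environment.GetEnvironmentVariable("ClientId"),
        Environment.GetEnvironmentVariable("ClientSecret"));
    AuthenticationResult authResult = await authContext.AcquireTokenAsync("https://graph.microsoft.com", credential);

    // PATCH the user record in the Graph.
    var request = new HttpRequestMessage(new HttpMethod("PATCH"), $"https://graph.microsoft.com/v1.0/users/{upn}");
    request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", authResult.AccessToken);
    request.Content = new StringContent(json, Encoding.UTF8, "application/json");

    HttpResponseMessage graphResponse = await httpClient.SendAsync(request);
    log.Info($"Graph returned status: {graphResponse.StatusCode}");

    return req.CreateResponse(graphResponse.IsSuccessStatusCode ? HttpStatusCode.OK : graphResponse.StatusCode);
}
```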

Our Azure Function also needs CORS to be allowed so that it can be called from PowerApps/Flow - either with a wildcard (*) entry, or with the domains used by PowerApps/Flow.

    Defining the API using OpenAPI (previously known as Swagger)

    Azure tries to be useful by providing a default API definition for your functions, but typically we need to extend this before your function can be used with PowerApps/Flow. In this step we’re going to expand the default API definition, so that the lower level detail of how JSON is passed and returned is also defined. There are a number of ways you can do this – options include:

      1. Directly in the Azure editor
      2. Using an online Swagger editor which provides a bit more support (e.g. http://editor.swagger.io/)
      3. Using a tool to generate the definition from your code. Mikael Svenson has a Swagger definition generator - I ran into some issues with it, but I’m sure it was just user error (missing annotations?) and to be honest I didn’t spend too much time on it. Looks useful though.
      4. Defining your API from scratch in the PowerApps/Flow/Logic Apps “create custom connector from scratch” process.

      Options 1-3 have you working with the OpenAPI definition file in some form. Option 4 allows you to use a designer which builds the OpenAPI file for you (which we’ll look at later) – this sounds good, but for some reason I find I prefer the quick “copy/paste of some JSON” experience of options 1 or 2. Unless your functions have lots and lots of parameters, you’ll probably find these options get the job done fine, but you can choose.

      When working with the OpenAPI definition for a function, we update the definition and then point PowerApps and Flow at it:

      1. Go to the “API definition” area of your Function app:
      2. Notice the default API definition that is provided for you – this is what we’re going to update:
      3. Using one of the above approaches, ensure that:
        1. Each function in your API has an entry in “paths”
        2. The “parameters” and “responses” section for each method properly describes how JSON is passed in and out of the function – you need to provide a “schema” element which describes this
        3. The “produces” and “consumes” elements are set to “application/json”
      4. Save the updated definition.

      The best way to illustrate this is with a before and after – in the definitions below, you can see that before editing, the “parameters” and “responses” sections are empty or minimal. I added the detail of the JSON objects which are passed into these sections (along with a couple of other bits of added info):
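The definitions in the screenshots aren’t reproduced here, but the shape of the change is like this - an “after” fragment for a hypothetical UpdateUserProfile operation, with “consumes”/“produces” set and a schema describing the JSON body (operation and property names are illustrative):

```json
"paths": {
  "/api/UpdateUserProfile": {
    "post": {
      "operationId": "UpdateUserProfile",
      "summary": "Updates the user's Office 365 profile",
      "consumes": [ "application/json" ],
      "produces": [ "application/json" ],
      "parameters": [
        {
          "name": "body",
          "in": "body",
          "required": true,
          "schema": {
            "type": "object",
            "properties": {
              "displayName": { "type": "string" },
              "jobTitle": { "type": "string" }
            }
          }
        }
      ],
      "responses": {
        "200": {
          "description": "Profile updated",
          "schema": { "type": "string" }
        }
      }
    }
  }
}
```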


      Once the definition of your API is looking good, it’s time to make it available to where it will be used.

      Export your API to PowerApps and Flow

      The next step is to define a custom connector in PowerApps/Flow/Logic Apps from your API – the OpenAPI definition is the foundation of this. You can start this process from either the Azure Function side or the PowerApps/Flow side. In PowerApps/Flow, you would do:



      Or, you can start from your Azure Function. Either approach gives the same result. On the Azure side, within the “API definition” area you’ll find this button:


      Let’s use this approach. When you do this, Azure asks you for some details to link your OpenAPI definition with PowerApps/Flow – be careful because you can’t rename it later (without affecting any PowerApps/Flows which use it):


      1. Provide a name for your API, and set other initial details:

        NOTE – in the “API Key Name” field above, I’m providing a hint to users of my connector about how to get the key (which must be passed to my function in the back-end). This appears whenever someone uses the connector for the first time.

      2. Your API will now show up as a “custom connector” in PowerApps or Flow:

      NOTE – the connector is only shared with you at this point. To enable it to be used by other people creating PowerApps/Flows in your organisation, it will need to be shared with them (or Everyone) too.

      [OPTIONAL] Extending your API definition for PowerApps/Flow/Logic Apps

      Now that your API is known to PowerApps/Flow, you can optionally play with the definition some more on the PowerApps/Flow/Logic Apps side. This is known as “extending an OpenAPI” definition. Microsoft have some specific extensions to the OpenAPI spec, and you can do some nifty things with them such as marking parameters as “important” or “advanced” in the x-ms-visibility tag. When you use “advanced” for example, the setting is hidden away behind an “advanced” or “more” option, which might make more sense for some of your parameters:
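For example, properties of the body schema can be decorated with the x-ms-summary and x-ms-visibility extensions (the property names here are illustrative):

```json
"properties": {
  "displayName": {
    "type": "string",
    "x-ms-summary": "Display name",
    "x-ms-visibility": "important"
  },
  "department": {
    "type": "string",
    "x-ms-summary": "Department",
    "x-ms-visibility": "advanced"
  }
}
```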


      The process of extending an OpenAPI definition looks like this – and although it’s technically optional if your OpenAPI definition is perfect, you’ll often find that this process is great for validating your definition and helping you find missing attributes etc. In my case, I found my methods were missing a summary which I needed to add:

      Note that although we are still editing the OpenAPI definition here and your changes are saved, they do not go back to the definition stored with your Azure Function – they go into a separate version held by PowerApps/Flow/Logic Apps.

    1. Click the edit button on your custom connector:
    2. On the “General” tab, provide a description (and perhaps an icon):
    3. On the “Security” tab, leave the defaults as they are:
    4. On the “Definition” tab you may have some work to do - in my case, here’s where I found my methods had a warning initially because they had no value for the summary. You’ll see an “Action” for each function you’re exposing in your API – you can see my fetch and update operations for example:
      Take each operation one by one. After adding any missing details and ensuring any warnings have disappeared, you should drill into the “Body”:     
      Here you should check that the individual bits of data in the JSON passed to (or returned by) your API are reflected. In my case, I’m returning user profile information and it looks like this:
      You should also check the parameter validates OK:
      Once you’ve done this for all your methods, save the changes:
      You’re now ready to use your API in PowerApps/Flow/Logic Apps.

      Using your connector in a Flow

      When in the Flow designer, you can now add an action which calls your API. If you create or edit a Flow and search for your connector, it will show up with the actions that are defined (in my case fetch or update the user profile):


      Of course, I now get the benefit of individually-defined fields that are easy to work with (shown here for my update operation):


      If you test the Flow by performing the trigger action (I’m using Postman to make a HTTP POST call), then you should see things work beautifully:


      ..and of course, my Office 365 profile is indeed updated with the values:



      Being able to plug custom code in to PowerApps/Flow/Logic Apps is an important tool in the toolbox for the modern Office 365 architect or developer. You create the OpenAPI definition for your function or API, and then create a custom connector for it, ensuring that all operations get validated and the parameters passed in the POST body are individually-defined. Once you’re at this point, you can pretty much build anything into PowerApps and Flow – which is awesome! In my example, I used a custom connector which calls an Azure Function which calls the Graph – which can probably be the foundation for many different scenarios.

      Waldek has some great information on this too, especially around async connectors which can be used for long-running operations. There’s a model where your API returns “location” and “retry-after” headers to Flow if the operation is still in progress, and this tells Flow to check back again in X seconds (however many you specified in “retry-after”). This could be very useful in certain scenarios, so I recommend reading his article What you should know about building Microsoft Flow connectors. Other good reading:

      Tuesday, 19 June 2018

      Running a PowerShell script in the cloud with an Azure Function, PnP and the Graph

      We all know that Azure Functions are really useful in many Office 365 scenarios, and this goes beyond developers. IT Pros can certainly benefit from the ability to run a script in the cloud – perhaps it’s something you want to run every hour, day or week on a schedule, and maybe you want to use PnP PowerShell too for its awesome commands around managing Groups/Teams, provisioning of SharePoint sites and more. This post is aimed at anyone who has such a need – we’re going to use Visual Studio Code (not full Visual Studio) for some bits, but that’s just because it’s becoming a great PowerShell editor even for non-coders. In fact, I’m now convinced that VS Code is the best option for PowerShell scripts regardless of who you are. If you’re a dev, you’ll probably like the way VS Code can deploy your files to an Azure Function, but for IT Pros or anyone less comfortable with this, I’ll also show how you can drag and drop the script (and related files which make it run as a function) to the cloud. In both cases, we’ll benefit from VS Code’s support for creating the function with the right set of files, the code formatting/coloring and the ability to debug our script.

      The scenario we’ll use is creating Office 365 Groups (but if you’re interested in a Flow approach to this, also see my Control Office 365 Group creation – a simple no-code solution post) – I like this scenario because with a tiny tweak, in the future it will be possible to create a Microsoft Team also, which is arguably more useful. We’re just waiting for the app-only APIs for that one, although you could do it today with user context if you needed to.

      There are many ways of creating an Azure Function these days, and I’ll cover these in a future post. So as mentioned, IT Pros shouldn’t be scared of this, but we’ll use:

      You should follow these links to download/install those things if you don’t already have them.

      Steps we’ll go through

      I’m going to try to be fairly comprehensive in this post with lots of screenshots. Here’s what we’ll cover:

      • Overview of PowerShell script contents
      • Installing the PnP PowerShell module
      • Registering an Azure AD app to talk to the Graph
    • Granting admin consent to this app by constructing and hitting a consent URL (which is interesting if you haven’t seen this process before)
      • Create the Function locally using VS Code, using PowerShell as the language
        • Debugging/stepping through the function
      • Prepare for running in the cloud - copy files from SharePointPnPPowerShellOnline module into local folder
      • Create the Azure Function app in the portal
      • Publishing the files to the function:
        • Using VS Code
        • Using drag and drop with Kudu
      • Testing the function
      • Final step – storing App ID and Client Secret securely
      Update – consider storing credentials in Azure Key Vault now it has hit General Availability
      Later in this post I’ll show how we register an app in AAD to talk to the Graph, which has credentials in the form of an app ID and secret. I show storing these away from the code in Azure environment variables, but there’s a better option now. Azure Key Vault provides the ability to securely store these, where you define “who” can access them (e.g. which Azure functions, or other things which provide a service principal).
      Two of my friends have great articles on this:

        Overview of PowerShell script contents

        Let’s start with the script, just so you can see how simple it is. We won’t do anything with it yet, but here it is for your info – notice that the example we’re using (creating an Office 365 Group) is very simple if you’re able to use the PnP command (New-PnPUnifiedGroup) and have an AAD app registered.

        Here are the script contents I used:
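The script was embedded as a gist; in outline it is along these lines (the placeholder values need substituting for your tenant/app, and the group details shown are just examples):

```powershell
# Sketch of the script - substitute the placeholder values for your tenant/app.
$appId = "[AAD app ID]"
$appSecret = "[AAD app secret/key]"
$aadDomain = "[tenant].onmicrosoft.com"

# Connect to the Graph using the AAD app registration..
Connect-PnPOnline -AppId $appId -AppSecret $appSecret -AADDomain $aadDomain

# ..and create the Office 365 Group.
New-PnPUnifiedGroup -DisplayName "My test group" -Description "Created from an Azure Function" -MailNickname "my-test-group"
```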

        With that in mind, let’s deal with the prerequisite steps such as obtaining the PnP PowerShell module and registering the AAD app to be able to create Groups.    

        Installing the PnP PowerShell module

        There are a couple of ways to install this, but the PnP PowerShell page has everything you need - full instructions and links to installers. See:


        Registering an Azure AD app to talk to the Graph

        If we’re using a PnP PowerShell command that talks to the Graph (as I am to create Office 365 Groups), then we need an AAD app registration which our script/code will use to authenticate. We define this with appropriate permission scopes for the actions we’re actually performing – the process below shows what I need to do to create Groups, but if you’re using a different script which does something else, then most likely you’ll need other permission scopes.

        1. In the Azure portal, go to the “Azure Active Directory” section and find the “App registrations” area:
        2. Click the “New application registration” button:
        3. Complete the details – note in this case the sign-on URL can be anything that uniquely identifies your function:
        4. Hit the “Create” button.
        5. We now need to grant the appropriate permissions to this app registration:
          1. Go in to the “Settings” area, and select “Required permissions”:
          2. Click the “Add” button, then “Select an API” and find and select “Microsoft Graph”:
          3. In the “Application Permissions” section of this list, check the box for “Read and write all groups”:
          4. Click “Select” and then “Done” to save this config.
        6. Now we need to obtain the app ID for our PowerShell script – it can be copied from the settings area. Store it somewhere safe for later pasting into our PowerShell script:
        7. We also need to obtain a key. This can be done by:
          1. Going into the “Keys” area:
          2. Give the key a name – when you click “Save”, the key value will appear in the box:
          3. Again, store it somewhere safe for later use in our PowerShell script.

        Grant admin consent to the app

        Now that we’ve defined our AAD app registration, a tenant admin needs to grant admin consent to the app. This is the security gate which means that the organization is permitting this access to Office 365 data. What happens is that you construct a URL in the form of:

        https://login.microsoftonline.com/[tenant].onmicrosoft.com/adminconsent?client_id=[App ID]

        So for my app, the URL looked like this: https://login.microsoftonline.com/chrisobriensp.onmicrosoft.com/adminconsent?client_id=348841f1-6c9d-4044-b014-ed346f2572f2

        When the tenant admin hits this URL, he/she sees the list of permission requirements for this app and can make a decision whether to grant consent or not:


        Once consent is granted, the app can request auth tokens which can perform actions related to the selected permission scopes. Read more about this process at Tenant Admin Consent if you need to.

          Create the Function locally, using PowerShell as the language

          OK, now the pre-reqs are in place, we’re ready to create an Azure Function to run our PS script from.

          1. First, create a folder on your machine to host the files – I named my folder “COB-GroupCreator-Function” to reflect what we’re doing here.
          2. Open VS Code to this folder.
          3. Hit CTRL SHIFT + P for the VS Code command bar, and start typing “Azure Functions” to pull up those commands. Select the “Create New Project” command:
          4. Follow the prompts in the command bar to select the folder – accept the default if you’re already in the folder you created earlier:
          5. When it comes to selecting a language, select any for now – but in this case we said we’d use PowerShell. PowerShell doesn’t appear in the list currently because support is only experimental (as at June 2018), but we can switch to it later:
          6. Now let’s add an individual function. Hit CTRL SHIFT + P again, and find the “Create Function” command under Azure Functions:
          7. Accept that we’ll use the current folder:
          8. In the next step we can change the language to PowerShell:
          9. Select PowerShell from the list:
          10. In the next step, we need to include “non-verified” templates whilst PowerShell support is still experimental – select the “Change template filter” option:
          11. Select “All” from the list:
          12. And now additional templates appear – in this case, our function is going to be a simple function which runs once per day, so we’ll select “Timer trigger”:
          13. Now enter the name for your function:
          14. For a function which runs daily at 10am, the CRON expression would be “0 0 10 * * *”, so let’s enter that when we’re asked for the schedule for the function (by the way, here’s a good cheat sheet for other CRON expressions):
          15. A bunch of new files will now be added to your folder, including the “run.ps1” file which is the PowerShell script for your function implementation:

          We’re now ready for our function implementation. One thing to note: if you’re following these steps while PowerShell support is still experimental, you may have some leftover JavaScript config. Let’s fix that next.

          Debugging our PowerShell script locally with F5

          VS Code works great as a debugger for PowerShell scripts. The best approach is to get your PowerShell script 100% correct by developing and debugging locally before publishing it to the cloud as an Azure function. You’ll be able to stop on breakpoints, see variables and so on – doing this locally is much easier than when you’re running in Azure on a daily schedule (although you have options for that too).

          So, all we need to do is remove the JS config from our launch.json file. Find it in the .vscode folder:


          Remove the section for the JS configuration – before:




          VS Code is now smart enough to do the right thing based on your .ps1 file extension. You’ll find that you can add a breakpoint to your script (which is currently still just a single line of boilerplate code) and hit F5 to debug locally:


          You can actually do lots more to customize what happens when F5 is pressed – for example, if your PowerShell scripts accepts parameters and you want to pass some values in when debugging. This is done by adding back some PowerShell specific configurations into your launch.json file – see Using the launch.json file in VS code for improved PowerShell debugging for some examples of this. But even without this, we’re now able to step through our script locally and get the feedback we need to develop and test it properly.
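As a minimal example, a PowerShell-only launch.json (essentially what the PowerShell extension generates for you) looks something like this:

```json
{
  "version": "0.2.0",
  "configurations": [
    {
      "type": "PowerShell",
      "request": "launch",
      "name": "PowerShell Launch Current File",
      "script": "${file}",
      "args": [],
      "cwd": "${workspaceFolder}"
    }
  ]
}
```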

          Prepare for running in the cloud - copy files from SharePointPnPPowerShellOnline module into local folder

          We now need to copy the files from the SharePoint PnP PowerShell module into our local folder. This isn’t necessary for running locally – in that case, the files are loaded from your local modules path – but it is necessary for when the script runs as an Azure Function. So, let’s find the files on your local machine by running $Env:PSModulePath in PowerShell – you should see a bunch of paths listed, and one of them will contain the SharePointPnPPowerShellOnline module: 


          Open that folder on your machine and copy the “Modules” subfolder and all its contents into the folder containing the PowerShell script for your function.
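
          If you prefer to script this step, something like the sketch below should do it. This assumes the module is installed on your machine – adjust the $functionFolder path (hypothetical here) to point at the folder containing your run.ps1:

          ```powershell
          # Locate the installed SharePointPnPPowerShellOnline module on this machine..
          $module = Get-Module -ListAvailable SharePointPnPPowerShellOnline | Select-Object -First 1

          # ..and copy its files into a "modules" subfolder alongside run.ps1, which is
          # where the Azure Functions runtime looks for modules to load.
          $functionFolder = "C:\dev\my-function"   # hypothetical path - use your own
          Copy-Item -Path $module.ModuleBase -Destination (Join-Path $functionFolder "modules") -Recurse
          ```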

          Your set of files should now look like this:


          When your script runs in the cloud, the Azure Functions runtime will now load any modules stored in this folder and you can therefore use the PnP commands successfully.

          Paste the script contents into run.ps1

          Now we're ready to deal with the script implementation. Paste in the script we've seen already (repeated below for your convenience), or implement your own steps.

          You’ll need to substitute in values for your AAD app ID, AAD app secret (key) and Office 365 tenant prefix into the placeholders/variables in the script – for now, let’s do that in a hard-coded way to test our script. Later on, we’ll swap out these hardcoded values and fetch them from config in Azure instead.

          As a reminder, here are the script contents I used:

          Once you've pasted those details in, run your script to check it works (by pressing F5 in VS Code). If your AAD app is registered with the right details and permissions, and you have the PnP PowerShell module installed properly, things should work just fine when running locally. Hooray! If not, you should work through the issues until you're successfully running locally before publishing to an Azure Function.
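
          If you don’t have the script to hand, its overall shape is something like the sketch below. The hard-coded placeholder values are deliberate at this stage, and the cmdlet parameters assume the legacy SharePointPnPPowerShellOnline module – check them against the version you have installed:

          ```powershell
          # Hard-coded for local testing only - we'll move these to app settings later.
          $appId = "[your AAD app ID]"
          $appSecret = "[your AAD app secret]"
          $tenant = "[your tenant prefix]"

          # Connect to the Microsoft Graph using the AAD app registration..
          Connect-PnPOnline -AppId $appId -AppSecret $appSecret -AADDomain "$tenant.onmicrosoft.com"

          # ..and do the work - in our example, creating an Office 365 Group.
          New-PnPUnifiedGroup -DisplayName "Demo group" -Description "Created from an Azure Function" -MailNickname "demo-group"
          ```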

          Create the Azure Function app in the portal

          We need an Azure Function app to host our script in the cloud. You can actually create this through VS Code (use the command “Azure Functions: Create Function App in Azure”), and it works great! However, for simplicity let’s do it the normal way through the Azure portal. Even there, we have a couple of options but anything that gets you to the create function app process is good – for example:

          1. Click “Create a resource”:
          2. Start typing “function app” to search for the item – select it when you find it, and then hit the “Create” button:
          3. Then complete the details – you can see in the image below I’m reusing some things I already have in Azure (e.g. a Resource Group, Storage Account etc.) – but just create new ones if you need to. Of course, be aware of the selections you make if you’re doing this for real in production (especially whether you’re choosing the Consumption Plan or App Service plan). Hit the “Create” button when you’re ready:
          4. Your function app, as well as an App Insights instance if you selected that option, should now have been created and are ready to host code:

          A note on authentication
          In this case, our function won't actually have *any* authentication for callers, because we're never going to call it over HTTP (which might be different to other functions you create). This one is purely a timer trigger function.

          Publishing files to an Azure Function from VS Code

          The next step, once we’re sure the script is working correctly, is to publish it to an Azure Function. You can do this in a variety of ways, but VS Code makes this easy:

          1. Switch to Azure tab in VS Code (bottom icon):

          2. Find the Function App you created earlier. Right-click on it, and select “Deploy to Function App”:
          3. Step through the prompts in the VS Code command palette – the defaults are generally what you need:

          4. Accept the prompt:


          5. VS Code will then start the publish operation:

          6. You’ll then find that your files have been deployed as a Function to the selected Function App:



          Of course, you don’t *have* to use VS Code to publish files up to your function app. Anything that gets your files there (FTP, manual copy via Kudu, web deploy, sync from source control) is fine – you just need to ensure the files end up in the right structure. This is what you need to end up with:
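
          As a rough sketch of the target structure (using a hypothetical function name of “Create-Group” – the exact layout under the modules folder depends on how you copied the PnP files):

          ```
          site/wwwroot
          └── Create-Group
              ├── function.json
              ├── run.ps1
              └── modules
                  └── (SharePointPnPPowerShellOnline module files)
          ```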



          To deploy the files using Kudu, simply drag and drop them from Windows Explorer into the site/wwwroot folder within Kudu:


            Final step - Storing App ID and Client Secret securely

            At this point, you should now have your script running successfully in the cloud. It should execute on the schedule you defined for the function or from using the “Test” button in the Azure portal. But remember that we hard-coded some values for the app authentication into the script – we need to fix that.

            A better option than having these values hard-coded in your script is to add App Settings to your Azure Function app, and read them from there. They are stored securely by Azure, although you could take this further and use Key Vault or similar if preferred. Either is better than having secrets/passwords in code or scripts! In the next steps, we’ll define them as app settings in Azure and tweak the script to pick them up from there:

            1. Go to the Platform Features tab, then select “Application Settings”:

            2. Scroll down to the “Application Settings” section, then click the “Add new setting” link:

            3. Add the App ID and secret for the Azure AD app registration which has permissions to do the work (create an Office 365 Group in our case):

            4. Also add an item for your tenant prefix and name it “Tenant”.
            5. Don’t forget to click “Save” back at the top of the page:

            Accessing App Settings from PowerShell in a Function
            App Settings can be retrieved with the $env:MyAppSetting syntax in PowerShell, just like an environment variable. So in our script, we simply read variables whose names match the keys we used when adding our config values:
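
            The tenant setting was named “Tenant” above; assuming the app ID and secret were stored under “ClientId” and “ClientSecret” (hypothetical names – use whatever keys you actually chose), the hard-coded lines become:

            ```powershell
            # Read the values from the Function App's application settings -
            # these are surfaced to the script as environment variables.
            $appId = $env:ClientId
            $appSecret = $env:ClientSecret
            $tenant = $env:Tenant
            ```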


            What this means is that our final script should look like this:

            Voila! Your script should now run from the cloud and be implemented the way it should be. I won’t show the end result of this example (an Office 365 Group) because you most likely know what that looks like, but hopefully you get the overall idea and the steps involved.


            The idea of running PowerShell scripts from the cloud is very useful, especially for scripts which should execute on a schedule. Our scenario of creating an Office 365 Group isn’t necessarily something you’d want to do on a schedule (or maybe it is, if you need a collaboration space for a weekly meeting or other periodic event?). However, the power of PnP PowerShell means that there are many scenarios you could deal with in this approach. We showed how to get external modules like PnP running in your Azure Function, and also how to deal with the authentication to the Graph if needed – overall, it’s a powerful approach that can be useful to both IT Pros and developers. I recommend using Visual Studio Code to work with PowerShell these days, especially since it helps package and deploy Azure Functions as well as providing debugging support. Happy coding!