Sunday, 22 July 2018

Using a custom Azure Function in Flow (e.g. to call the Graph)

PowerApps and Flow are awesome, but at some point you might need to extend them with some custom code. When integrating with other systems, you can do a lot with Flow (and the HTTP action in particular), but a lot more doors are open to you if you can plug in an Azure Function to your app or process. Perhaps you have a particular system you need to talk to, or some custom logic that is easier to express in code rather than in Flow. Or perhaps you need to call the Graph, and do it from your custom code rather than another way. Over the next two articles I want to talk about how to create a custom connector so that you can talk to your function from different places in Office 365:

  • Using a custom Azure Function in Flow [this article]
  • Using a custom Azure Function in PowerApps

We’ll do most of the work in this article. If you haven’t already gone through the process of working with an Open API/Swagger definition for a custom API, the key point is that your methods get individual parameters which are strongly-typed, and this becomes easy to work with in PowerApps/Flow. You have to do some work for this, but it’s worth it considering that underneath, it’s just a big ugly string of JSON being passed between PowerApps/Flow and the function. In the example I’m using in these articles, I have a function which updates the Office 365/AAD record for the current user (using the Graph) – having done the OpenAPI work and defined a custom connector, in Flow the action for my API exposes individual parameters, which makes it simple to call:


It’s a similar story in PowerApps. So these steps are pretty much mandatory if you want to plug in some custom code to these technologies.

Let’s walk through the process needed to piece these things together.

Defining the Azure Function

First off, we need an Azure Function which is going to do the work. I won’t walk through every aspect of this since our focus here is on use of Swagger to define the API and make it callable from PowerApps/Flow. But the setup is:

  • A HTTP trigger function which receives some data passed as JSON in the request body (from Flow/PowerApps/some other client)
    • For function authentication, we just use the “key” option
  • An AAD app registration which has permissions to call the appropriate Graph endpoints
  • OPTIONAL - Use of Managed Service Identity + Azure Key Vault to obtain the Client ID/secret for the AAD app registration (N.B. this is what you should do and is fairly trivial, but I’ll keep this example simple by omitting it – see How to use Azure Managed Service Identity in App Service and Azure Functions for the MSFT docs on this)
  • Code in the Function to take the passed data and call the /v1.0/users/ endpoint with the PATCH method to update the user profile
A note on authentication
I’m using function/key authentication (where the caller has to pass the code as a querystring parameter on the function URL e.g. /api/DoUpdate?code=foo). If you didn’t want the caller to have to provide the key, one option could be to use an Azure Function proxy in front of your function – this would supply the key to the back-end, but provide a plain URL to callers.
Also note it is possible to use AAD authentication between PowerApps/Flow and your function. Things are more involved here though, since you need 2 x AAD app registrations (one for your connector, one for your API) and the right configuration – see https://docs.microsoft.com/en-us/connectors/custom-connectors/create-web-api-connector#add-the-custom-connector-to-microsoft-flow for more on this. You might want to start with key authentication first though.
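For illustration, the proxy idea above could be configured with a proxies.json along these lines – the function app name, public route and the app setting holding the key are all made up for this sketch:

```json
{
  "$schema": "http://json.schemastore.org/proxies",
  "proxies": {
    "UpdateProfileProxy": {
      "matchCondition": {
        "methods": [ "POST" ],
        "route": "/api/UpdateProfile"
      },
      "backendUri": "https://myfunctionapp.azurewebsites.net/api/DoUpdate?code=%DoUpdate_Key%"
    }
  }
}
```

Callers then hit /api/UpdateProfile with no key, and the proxy appends the key (read from the hypothetical DoUpdate_Key app setting via the %...% substitution syntax) when forwarding to the real function.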

Here’s the code – some notable points:

  • There’s quite a lot of logging – useful as you’re getting things working, but you might dial it down slightly once you’re happy
  • You should use a static class-level HttpClient object, rather than instantiating a new one on each execution (this is a best practice for Azure Functions which make HTTP calls)
  • The data passed from the client is in exactly the same format the Graph endpoint needs – this simplifies our function code, since we just need to serialize it again
  • ADAL.NET is used to obtain the token for the AAD app registration
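As a hedged sketch of those points (the class name, tenant, app credentials and target user below are illustrative placeholders, not the actual implementation):

```csharp
// Illustrative sketch only – tenant, app ID/secret and the target user are placeholders.
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.IdentityModel.Clients.ActiveDirectory; // ADAL.NET

public static class UpdateUserProfile
{
    // Static HttpClient shared across executions – the Azure Functions best practice noted above.
    private static readonly HttpClient httpClient = new HttpClient();

    public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
    {
        log.Info("UpdateUserProfile was triggered.");

        // The body is already in the shape the Graph expects, so pass it straight through.
        string json = await req.Content.ReadAsStringAsync();

        // Obtain an app-only token via ADAL.NET for the AAD app registration.
        var authContext = new AuthenticationContext("https://login.microsoftonline.com/[tenant].onmicrosoft.com");
        var credential = new ClientCredential("[app ID]", "[app secret]");
        AuthenticationResult authResult = await authContext.AcquireTokenAsync("https://graph.microsoft.com", credential);

        // PATCH the user record in the Graph.
        var graphRequest = new HttpRequestMessage(new HttpMethod("PATCH"),
            "https://graph.microsoft.com/v1.0/users/[userPrincipalName]")
        {
            Content = new StringContent(json, Encoding.UTF8, "application/json")
        };
        graphRequest.Headers.Authorization = new AuthenticationHeaderValue("Bearer", authResult.AccessToken);

        HttpResponseMessage graphResponse = await httpClient.SendAsync(graphRequest);
        log.Info($"Graph returned status code: {graphResponse.StatusCode}");

        return req.CreateResponse(graphResponse.StatusCode);
    }
}
```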

Our Azure Function also needs CORS to be allowed so that it can be called from PowerApps/Flow - either with a wildcard (*) entry, or the domains used by PowerApps/Flow.

    Defining the API using OpenAPI (previously known as Swagger)

    Azure tries to be useful by providing a default API definition for your functions, but typically you need to extend this before your function can be used with PowerApps/Flow. In this step we’re going to expand the default API definition, so that the lower-level detail of how JSON is passed and returned is also defined. There are a number of ways you can do this – options include:

      1. Directly in the Azure editor
      2. Using an online Swagger editor which provides a bit more support (e.g. http://editor.swagger.io/)
      3. Using a tool to generate the definition from your code. Mikael Svenson has a Swagger definition generator - I ran into some issues with it, but I’m sure it was just user error (missing annotations?) and to be honest I didn’t spend too much time on it. Looks useful though.
      4. Defining your API from scratch in the PowerApps/Flow/Logic Apps “create custom connector from scratch” process.

      Options 1-3 have you working with the OpenAPI definition file in some form. Option 4 allows you to use a designer which builds the OpenAPI file for you (which we’ll look at later) – this sounds good, but for some reason I find I prefer the quick “copy/paste of some JSON” experience of options 1 or 2. Unless your functions have lots and lots of parameters, you’ll probably find these options get the job done fine, but you can choose.

      When working with the OpenAPI definition for a function, we update the definition and then point PowerApps and Flow at it:

      Process
      1. Go to the “API definition” area of your Function app:
      2. Notice the default API definition that is provided for you – this is what we’re going to update:
      3. Using one of the above approaches, ensure that:
        1. Each function in your API has an entry in “paths”
        2. The “parameters” and “responses” section for each method properly describes how JSON is passed in and out of the function – you need to provide a “schema” element which describes this
        3. The “produces” and “consumes” elements are set to “application/json”
      4. Save the updated definition.

      The best way to illustrate this is with a before and after. Before editing, the “parameters” and “responses” sections are empty or minimal; after editing, they contain the detail of the JSON objects which are passed (along with a couple of other bits of added info).
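For illustration, an edited “after” entry for a single POST method might look something like this – the operation and property names are made up, and your schema should describe whatever JSON your function actually passes and returns:

```json
{
  "paths": {
    "/api/UpdateUserProfile": {
      "post": {
        "operationId": "UpdateUserProfile",
        "summary": "Updates the current user's Office 365 profile",
        "consumes": [ "application/json" ],
        "produces": [ "application/json" ],
        "parameters": [
          {
            "name": "profileData",
            "in": "body",
            "required": true,
            "schema": {
              "type": "object",
              "properties": {
                "jobTitle": { "type": "string" },
                "mobilePhone": { "type": "string" },
                "officeLocation": { "type": "string" }
              }
            }
          }
        ],
        "responses": {
          "200": {
            "description": "Profile updated",
            "schema": {
              "type": "object",
              "properties": {
                "status": { "type": "string" }
              }
            }
          }
        }
      }
    }
  }
}
```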

      Once the definition of your API is looking good, it’s time to make it available to where it will be used.

      Export your API to PowerApps and Flow

      The next step is to define a custom connector in PowerApps/Flow/Logic Apps from your API – the OpenAPI definition is the foundation of this. You can start this process from either the Azure Function side or the PowerApps/Flow side. In PowerApps/Flow, you would do:


      Or, you can start from your Azure Function. Either approach gives the same result. On the Azure side, within the “API definition” area you’ll find this button:


      Let’s use this approach. When you do this, Azure asks you for some details to link your OpenAPI definition with PowerApps/Flow – be careful because you can’t rename it later (without affecting any PowerApps/Flows which use it):

      Process:

      1. Provide a name for your API, and set other initial details:

        NOTE – in the “API Key Name” field above, I’m providing a hint to users of my connector about how to get the key (which must be passed to my function in the back-end). This appears whenever someone uses the connector for the first time.

      2. Your API will now show up as a “custom connector” in PowerApps or Flow:

      NOTE – the connector is only shared with you at this point. To enable it to be used by other people creating PowerApps/Flows in your organisation, it will need to be shared with them (or Everyone) too.

      [OPTIONAL] Extending your API definition for PowerApps/Flow/Logic Apps

      Now that your API is known to PowerApps/Flow, you can optionally play with the definition some more on the PowerApps/Flow/Logic Apps side. This is known as “extending an OpenAPI” definition. Microsoft have some specific extensions to the OpenAPI spec, and you can do some nifty things with them such as marking parameters as “important” or “advanced” in the x-ms-visibility tag. When you use “advanced” for example, the setting is hidden away behind an “advanced” or “more” option, which might make more sense for some of your parameters:
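For illustration, the x-ms-visibility extension sits alongside a property or parameter in the definition – the property names in this fragment are made up:

```json
"schema": {
  "type": "object",
  "properties": {
    "jobTitle": {
      "type": "string",
      "x-ms-visibility": "important"
    },
    "faxNumber": {
      "type": "string",
      "x-ms-visibility": "advanced"
    }
  }
}
```

With this, “jobTitle” is surfaced prominently in the designer while “faxNumber” is tucked away behind the “advanced” option.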


      The process of extending an OpenAPI definition looks like this – and although it’s technically optional if your OpenAPI definition is perfect, you’ll often find that this process is great for validating your definition and helping you find missing attributes etc. In my case, I found my methods were missing a summary which I needed to add:

      NOTE
      We are still editing the OpenAPI definition here and your changes are saved – however, they are not written back to the definition stored with your Azure Function, but to a version held by PowerApps/Flow/Logic Apps.

    1. Click the edit button on your custom connector:
    2. On the “General” tab, provide a description (and perhaps an icon):
    3. On the “Security” tab, leave the defaults as they are:
    4. On the “Definition” tab you may have some work to do - in my case, here’s where I found my methods had a warning initially because they had no value for the summary. You’ll see an “Action” for each function you’re exposing in your API – you can see my fetch and update operations for example:
      Take each operation one by one. After adding any missing details and ensuring any warnings have disappeared, you should drill into the “Body”:     
      Here you should check that the individual bits of data in the JSON passed to your API are reflected. In my case, I’m returning user profile information and it looks like this:
      You should also check the parameter validates OK:
      Once you’ve done this for all your methods, save the changes:
      You’re now ready to use your API in PowerApps/Flow/Logic Apps.

      Using your connector in a Flow

      When in the Flow designer, you can now add an action which calls your API. If you create or edit a Flow and search for your connector, it will show up with the actions that are defined (in my case fetch or update the user profile):


      Of course, I now get the benefit of individually-defined fields that are easy to work with (shown here for my update operation):


      If you test the Flow by performing the trigger action (I’m using Postman to make a HTTP POST call), then you should see things work beautifully:

      ..and of course, my Office 365 profile is indeed updated with the values:


      Summary

      Being able to plug custom code in to PowerApps/Flow/Logic Apps is an important tool in the toolbox for the modern Office 365 architect or developer. You create the OpenAPI definition for your function or API, and then create a custom connector for it, ensuring that all operations get validated and the parameters passed in the POST body are individually-defined. Once you’re at this point, you can pretty much build anything into PowerApps and Flow – which is awesome! In my example, I used a custom connector which calls an Azure Function which calls the Graph – which can probably be the foundation for many different scenarios.

      Waldek has some great information on this too, especially around async connectors which can be used for long-running operations. There’s a model where your API returns “location” and “retry-after” headers to Flow if the operation is still in progress, and this tells Flow to check back again in X seconds (however many you specified in “retry-after”). This could be very useful in certain scenarios, so I recommend reading his article What you should know about building Microsoft Flow connectors.

      Tuesday, 19 June 2018

      Running a PowerShell script in the cloud with an Azure Function, PnP and the Graph

      We all know that Azure Functions are really useful in many Office 365 scenarios, and this goes beyond developers. IT Pros can certainly benefit from the ability to run a script in the cloud – perhaps it’s something you want to run every hour, day or week on a schedule, and maybe you want to use PnP PowerShell too for its awesome commands around managing Groups/Teams, provisioning of SharePoint sites and more. This post is aimed at anyone who has such a need – we’re going to use Visual Studio Code (not full Visual Studio) for some bits, but that’s just because it’s becoming a great PowerShell editor even for non-coders. In fact, I’m now convinced that VS Code is the best option for PowerShell scripts regardless of who you are. If you’re a dev, you’ll probably like the way VS Code can deploy your files to an Azure Function, but for IT Pros or anyone less comfortable with this, I’ll also show how you can drag and drop the script (and related files which make it run as a function) to the cloud. In both cases, we’ll benefit from VS Code’s support for creating the function with the right set of files, the code formatting/coloring and the ability to debug our script.

      The scenario we’ll use is creating Office 365 Groups (but if you’re interested in a Flow approach to this, also see my Control Office 365 Group creation – a simple no-code solution post) – I like this scenario because with a tiny tweak, in the future it will be possible to create a Microsoft Team also, which is arguably more useful. We’re just waiting for the app-only APIs for that one, although you could do it today with user context if you needed to.

      There are many ways of creating an Azure Function these days, and I’ll cover these in a future post. So as mentioned, IT Pros shouldn’t be scared of this, but we’ll use:

      You should follow these links to download/install those things if you don’t already have them.

      Steps we’ll go through

      I’m going to try to be fairly comprehensive in this post with lots of screenshots. Here’s what we’ll cover:

      • Overview of PowerShell script contents
      • Installing the PnP PowerShell module
      • Registering an Azure AD app to talk to the Graph
        • Granting admin consent to this app by constructing and hitting a consent URL (which is interesting if you haven’t seen this process before)
      • Create the Function locally using VS Code, using PowerShell as the language
        • Debugging/stepping through the function
      • Prepare for running in the cloud - copy files from SharePointPnPPowerShellOnline module into local folder
      • Create the Azure Function app in the portal
      • Publishing the files to the function:
        • Using VS Code
        • Using drag and drop with Kudu
      • Testing the function
      • Final step – storing App ID and Client Secret securely
      Update – consider storing credentials in Azure Key Vault now it has hit General Availability
      Later in this post I’ll show how we register an app in AAD to talk to the Graph, which has credentials in the form of an app ID and secret. I show storing these away from the code in Azure environment variables, but there’s a better option now. Azure Key Vault provides the ability to securely store these, where you define “who” can access them (e.g. which Azure functions, or other things which provide a service principal).
      Two of my friends have great articles on this:

        Overview of PowerShell script contents

        Let’s start with the script, just so you can see how simple it is. We won’t do anything with it yet, but here it is for your info – notice that the example we’re using (creating an Office 365 Group) is very simple if you’re able to use the PnP command (New-PnPUnifiedGroup) and have an AAD app registered.

        Here are the script contents I used:
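As a sketch only – the IDs and group details below are made-up placeholders, not the author’s actual values – a minimal version built around New-PnPUnifiedGroup might look like this:

```powershell
# Sketch only – substitute your own AAD app details and tenant prefix.
$appId = "[AAD app ID]"
$appSecret = "[AAD app key]"
$tenant = "[tenant prefix]"

# Connect to the Graph using the AAD app registration (app-only auth).
Connect-PnPOnline -AppId $appId -AppSecret $appSecret -AADDomain "$tenant.onmicrosoft.com"

# Create the Office 365 Group – display name/alias here are illustrative.
New-PnPUnifiedGroup -DisplayName "Weekly Ops Meeting" -Description "Collaboration space for the weekly ops meeting" -MailNickname "weekly-ops-meeting"
```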

        With that in mind, let’s deal with the prerequisite steps such as obtaining the PnP PowerShell module and registering the AAD app to be able to create Groups.    

        Installing the PnP PowerShell module

        There are a couple of ways to install this, but the PnP PowerShell page has everything you need - full instructions and links to installers. See:

         https://docs.microsoft.com/en-us/powershell/sharepoint/sharepoint-pnp/sharepoint-pnp-cmdlets?view=sharepoint-ps

        Registering an Azure AD app to talk to the Graph

        If we’re using a PnP PowerShell command that talks to the Graph (as I am to create Office 365 Groups), then we need an AAD app registration which our script/code will use to authenticate. We define this with appropriate permission scopes for the actions we’re actually performing – the process below shows what I need to do to create Groups, but if you’re using a different script which does something else, then most likely you’ll need other permission scopes.

        1. In the Azure portal, go to the “Azure Active Directory” section and find the “App registrations” area:
        2. Click the “New application registration” button:
        3. Complete the details – note in this case the sign-on URL can be anything that uniquely identifies your function:
        4. Hit the “Create” button.
        5. We now need to grant the appropriate permissions to this app registration:
          1. Go in to the “Settings” area, and select “Required permissions”:
          2. Click the “Add” button, then “Select an API” and find and select “Microsoft Graph”:
          3. In the “Application Permissions” section of this list, check the box for “Read and write all groups”:
          4. Click “Select” and then “Done” to save this config.
        6. Now we need to obtain the app ID for our PowerShell script – it can be copied from the settings area. Store it somewhere safe for later pasting into our PowerShell script:
        7. We also need to obtain a key. This can be done by:
          1. Going into the “Keys” area:
          2. Give the key a name – when you click “Save”, the key value will appear in the box:
          3. Again, store it somewhere safe for later use in our PowerShell script.

        Grant admin consent to the app

        Now that we’ve defined our AAD app registration, a tenant admin needs to grant admin consent to the app. This is the security gate which means that the organization is permitting this access to Office 365 data. What happens is that you construct a URL in the form of:

        https://login.microsoftonline.com/[tenant].onmicrosoft.com/adminconsent?client_id=[App ID]

        So for my app, the URL looked like this: https://login.microsoftonline.com/chrisobriensp.onmicrosoft.com/adminconsent?client_id=348841f1-6c9d-4044-b014-ed346f2572f2

        When the tenant admin hits this URL, he/she sees the list of permission requirements for this app and can make a decision whether to grant consent or not:


        Once consent is granted, the app can request auth tokens which can perform actions related to the selected permission scopes. Read more about this process at Tenant Admin Consent if you need to.

          Create the Function locally, using PowerShell as the language

          OK, now the pre-reqs are in place, we’re ready to create an Azure Function to run our PS script from.

          1. First, create a folder on your machine to host the files – I named my folder “COB-GroupCreator-Function” to reflect what we’re doing here.
          2. Open VS Code to this folder.
          3. Hit CTRL SHIFT + P for the VS Code command bar, and start typing “Azure Functions” to pull up those commands. Select the “Create New Project” command:
          4. Follow the prompts in the command bar to select the folder – accept the default if you’re already in the folder you created earlier:
          5. When it comes to selecting a language, select any for now – but in this case we said we’d use PowerShell. PowerShell doesn’t appear in the list currently because support is only experimental (as at June 2018), but we can switch to it later:
          6. Now let’s add an individual function. Hit CTRL SHIFT + P again, and find the “Create Function” command under Azure Functions:
          7. Accept that we’ll use the current folder:
          8. In the next step we can change the language to PowerShell:
          9. Select PowerShell from the list:
          10. In the next step, we need to include “non-verified” templates whilst PowerShell support is still experimental – select the “Change template filter” option:
          11. Select “All” from the list:
          12. And now additional templates appear – in this case, our function is going to be a simple function which runs once per day, so we’ll select “Timer trigger”:
          13. Now enter the name for your function:
          14. For a function which runs daily at 10am, the CRON expression would be “0 0 10 * * *”, so let’s enter that when we’re asked for the schedule for the function (by the way, here’s a good cheat sheet for other CRON expressions):
          15. A bunch of new files will now be added to your folder, including the “run.ps1” file which is the PowerShell script for your function implementation:
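Among those generated files is function.json, which holds the timer binding – with the schedule from step 14 it would contain something like this (the binding name is whatever the template generated):

```json
{
  "disabled": false,
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 0 10 * * *"
    }
  ]
}
```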

          We’re now ready for our function implementation – except that, if you’re following these steps while PowerShell support is still experimental, you may have some leftover JavaScript config from the earlier project creation step. Let’s fix that next.

          Debugging our PowerShell script locally with F5

          VS Code works great as a debugger for PowerShell scripts. The best approach is to get your PowerShell script 100% correct by developing and debugging locally before publishing it to the cloud as an Azure function. You’ll be able to stop on breakpoints, see variables and so on – doing this locally is much easier than when you’re running in Azure on a daily schedule (although you have options for that too).

          So, all we need to do is remove the JS config from our launch.json file. Find it in the .vscode folder:


          Remove the section for the JS configuration from the “configurations” array.

          VS Code is now smart enough to do the right thing based on your .ps1 file extension. You’ll find that you can add a breakpoint to your script (which is currently still just a single line of boilerplate code) and hit F5 to debug locally:


          You can actually do lots more to customize what happens when F5 is pressed – for example, if your PowerShell scripts accepts parameters and you want to pass some values in when debugging. This is done by adding back some PowerShell specific configurations into your launch.json file – see Using the launch.json file in VS code for improved PowerShell debugging for some examples of this. But even without this, we’re now able to step through our script locally and get the feedback we need to develop and test it properly.
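As a sketch, a PowerShell-specific entry in launch.json typically looks like this – the args value is hypothetical and would be whatever parameters your own script expects:

```json
{
  "version": "0.2.0",
  "configurations": [
    {
      "type": "PowerShell",
      "request": "launch",
      "name": "PowerShell Launch Current File",
      "script": "${file}",
      "args": [ "-GroupName 'Test group'" ],
      "cwd": "${workspaceFolder}"
    }
  ]
}
```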

          Prepare for running in the cloud - copy files from SharePointPnPPowerShellOnline module into local folder

          We now need to copy the files from the SharePoint PnP PowerShell module into our local folder. This isn’t necessary for running locally – the files will be loaded from your local modules path (which can be found using $Env:PSModulePath in PowerShell) – but it is necessary when the script runs as an Azure Function. So, let’s find the files on your local machine by running $Env:PSModulePath – you should see a bunch of paths in there, but one of them will be for the SharePointPnPPowerShellOnline module:


          Open that folder on your machine and copy the “Modules” subfolder and all its contents into the folder containing your PS script for your function.

          Your set of files should now look like this:


          When your script runs in the cloud, the Azure Functions runtime will now load any modules stored in this folder and you can therefore use the PnP commands successfully.

          Paste the script contents into run.ps1

          Now we're ready to deal with the script implementation. Paste in the script we've seen already (repeated below for your convenience), or implement your own steps.

          You’ll need to substitute in values for your AAD app ID, AAD app secret (key) and Office 365 tenant prefix into the placeholders/variables in the script – for now, let’s do that in a hard-coded way to test our script. Later on, we’ll swap out these hardcoded values and fetch them from config in Azure instead.

          As a reminder, here are the script contents I used:

          Once you've pasted those details in, run your script to check it works (by pressing F5 in VS Code). If your AAD app is registered with the right details and permissions, and you have the PnP PowerShell module installed properly, things should work just fine when running locally. Hooray! If not, you should work through the issues until you're successfully running locally before publishing to an Azure Function.

          Create the Azure Function app in the portal

          We need an Azure Function app to host our script in the cloud. You can actually create this through VS Code (use the command “Azure Functions: Create Function App in Azure”), and it works great! However, for simplicity let’s do it the normal way through the Azure portal. Even there, we have a couple of options but anything that gets you to the create function app process is good – for example:

          1. Click “Create a resource”:
          2. Start typing “function app” to search for the item – select it when you find it, and then hit the “Create” button:
          3. Then complete the details – you can see in the image below I’m reusing some things I already have in Azure (e.g. a Resource Group, Storage Account etc.) – but just create new ones if you need to. Of course, be aware of the selections you make if you’re doing this for real in production (especially whether you’re choosing the Consumption Plan or App Service plan). Hit the “Create” button when you’re ready:
          4. Your function app, as well as an App Insights instance if you selected that option, should now have been created and are ready to host code:

          A note on authentication
          In this case, our function won't actually have *any* authentication for callers, because we're never going to call it over HTTP in this case (which might be different to other functions you create). This one is purely a timer trigger function.

          Publishing files to an Azure Function from VS Code:

          The next step, once we’re sure the script is working correctly, is to publish it to an Azure Function. You can do this in a variety of ways, but VS Code makes this easy:

          1. Switch to Azure tab in VS Code (bottom icon):

          2. Find the Function App you created earlier. Right-click on it, and select “Deploy to Function App”:
          3. Step through prompts in the VS Code command palette – the defaults are generally what you need:

          4. Accept the prompt:


          5. VS Code will then start the publish operation:

          6. You’ll then find that your files have been deployed as a Function to the selected Function App:


          ALTERNATIVE – DEPLOY FILES A DIFFERENT WAY

          Of course, you don’t *have* to use VS Code to publish files up to your function app. Anything that gets your files there (FTP, manual copy via Kudu, web deploy, sync from source control) is fine – you just need to ensure the files end up in the right structure. This is what you need to end up with:


          To deploy the files using Kudu, simply drag and drop them from Windows Explorer into the site/wwwroot folder within Kudu:


            Final step - Storing App ID and Client Secret securely

            At this point, you should now have your script running successfully in the cloud. It should execute on the schedule you defined for the function, or when you use the “Test” button in the Azure portal. But remember that we hard-coded some values for the app authentication into the script – we need to fix that.

            A better option than having these values hard-coded in your script is to add App Settings to your Azure Function app, and read them from there. They are stored securely by Azure, although you could take this further and use Key Vault or similar if preferred. Either is better than having secrets/passwords in code or scripts! In the next steps, we’ll define them as app settings in Azure and tweak the script to pick them up from there:

            1. Go to the Platform Features tab, then select “Application Settings”:

            2. Scroll down to the “Application Settings” section, then click the “Add new setting” link:

            3. Add the App ID and secret for the Azure AD app registration which has permissions to do the work (create an Office 365 Group in our case):

            4. Also add an item for your tenant prefix and name it “Tenant”.
            5. Don’t forget to click “Save” back at the top of the page:

            Accessing App Settings from PowerShell in a Function
            App Settings can be retrieved using the $env:[MyAppSetting] syntax in PowerShell, similar to an environment variable. So in our script, we simply use the following to match the keys we added our config values under:

            $env:AppID 
            $env:AppSecret 
            $env:Tenant

            What this means is that our final script should look like this:
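As a sketch (the group details remain illustrative placeholders), the only change from the earlier version is that the credentials and tenant now come from the app settings:

```powershell
# Credentials/tenant now come from Function app settings rather than hard-coded values.
$appId = $env:AppID
$appSecret = $env:AppSecret
$tenant = $env:Tenant

Connect-PnPOnline -AppId $appId -AppSecret $appSecret -AADDomain "$tenant.onmicrosoft.com"
New-PnPUnifiedGroup -DisplayName "Weekly Ops Meeting" -Description "Collaboration space for the weekly ops meeting" -MailNickname "weekly-ops-meeting"
```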

            Voila! Your script should now run from the cloud and be implemented in the way it should be. I won’t show the end result of this example (an Office 365 Group) because you most likely know what that looks like, but hopefully you get the overall idea and the steps involved.

            Summary

            The idea of running PowerShell scripts from the cloud is very useful, especially for scripts which should execute on a schedule. Our scenario of creating an Office 365 Group isn’t necessarily something you’d want to do on a schedule (or maybe it is, if you need a collaboration space for a weekly meeting or other periodic event?) However, the power of PnP PowerShell means that there are many scenarios you could deal with in this approach. We showed how to get external modules like PnP running in your Azure Function, and also how to deal with the authentication to the Graph if needed – overall, it’s a powerful approach that can be useful to both IT Pros and developers. I recommend using Visual Studio Code to work with PowerShell these days, especially since it helps package and deploy Azure Functions as well as providing debugging support. Happy coding!