Tuesday, 19 June 2018

Running a PowerShell script in the cloud with an Azure Function, PnP and the Graph

We all know that Azure Functions are really useful in many Office 365 scenarios, and this goes beyond developers. IT Pros can certainly benefit from the ability to run a script in the cloud – perhaps it’s something you want to run every hour, day or week on a schedule, and maybe you want to use PnP PowerShell too for its awesome commands around managing Groups/Teams, provisioning of SharePoint sites and more. This post is aimed at anyone who has such a need – we’re going to use Visual Studio Code (not full Visual Studio) for some bits, but that’s just because it’s becoming a great PowerShell editor even for non-coders. In fact, I’m now convinced that VS Code is the best option for PowerShell scripts regardless of who you are. If you’re a dev, you’ll probably like the way VS Code can deploy your files to an Azure Function, but for IT Pros or anyone less comfortable with this, I’ll also show how you can drag and drop the script (and related files which make it run as a function) to the cloud. In both cases, we’ll benefit from VS Code’s support for creating the function with the right set of files, the code formatting/coloring and the ability to debug our script.

The scenario we’ll use is creating Office 365 Groups (but if you’re interested in a Flow approach to this, also see my Control Office 365 Group creation – a simple no-code solution post) – I like this scenario because with a tiny tweak, in the future it will be possible to create a Microsoft Team also, which is arguably more useful. We’re just waiting for the app-only APIs for that one, although you could do it today with user context if you needed to.

There are many ways of creating an Azure Function these days – I’ll compare some of them in a future post. As mentioned, IT Pros shouldn’t be scared of any of this. For this walkthrough, we’ll use:

  • Visual Studio Code, with the Azure Functions extension
  • The PnP PowerShell module (SharePointPnPPowerShellOnline)
  • An Azure subscription, to host the Function app

You should download/install those things if you don’t already have them.

Steps we’ll go through

I’m going to try to be fairly comprehensive in this post with lots of screenshots. Here’s what we’ll cover:

  • Overview of PowerShell script contents
  • Installing the PnP PowerShell module
  • Registering an Azure AD app to talk to the Graph
    • Granting admin consent to this app by constructing and hitting a consent URL (which is interesting if you haven’t seen this process before)
  • Create the Function locally using VS Code, using PowerShell as the language
    • Debugging/stepping through the function
  • Prepare for running in the cloud - copy files from SharePointPnPPowerShellOnline module into local folder
  • Create the Azure Function app in the portal
  • Publishing the files to the function:
    • Using VS Code
    • Using drag and drop with Kudu
  • Testing the function
  • Final step – storing App ID and Client Secret securely
Update – consider storing credentials in Azure Key Vault now that it has hit General Availability
Later in this post I’ll show how we register an app in AAD to talk to the Graph, which has credentials in the form of an app ID and secret. I show storing these away from the code in Azure environment variables, but there’s a better option now. Azure Key Vault provides the ability to securely store these, where you define “who” can access them (e.g. which Azure functions, or other things which provide a service principal).
Two of my friends have great articles on this:

    Overview of PowerShell script contents

    Let’s start with the script, just so you can see how simple it is. We won’t do anything with it yet, but here it is for your info – notice that the example we’re using (creating an Office 365 Group) is very simple if you’re able to use the PnP command (New-PnPUnifiedGroup) and have an AAD app registered.

    Here are the script contents I used:
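In essence, it boils down to a connect call and a single PnP command. Here’s a minimal sketch of the shape of such a script – the app ID, secret and tenant are placeholders (we register the AAD app below), and the group name is purely illustrative:

    # Credentials of the AAD app registration (placeholders – see the registration steps below)
    $appId = "[App ID]"
    $appSecret = "[App secret]"
    $tenant = "[tenant]"   # e.g. contoso for contoso.onmicrosoft.com

    # Connect to the Microsoft Graph via the AAD app (app-only)
    Connect-PnPOnline -AppId $appId -AppSecret $appSecret -AADDomain "$tenant.onmicrosoft.com"

    # Create the Office 365 Group
    New-PnPUnifiedGroup -DisplayName "My sample group" -Description "Created from an Azure Function" -MailNickname "my-sample-group"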

    With that in mind, let’s deal with the prerequisite steps such as obtaining the PnP PowerShell module and registering the AAD app to be able to create Groups.    

    Installing the PnP PowerShell module

    There are a couple of ways to install this, but the PnP PowerShell page has everything you need - full instructions and links to installers. See:

     https://docs.microsoft.com/en-us/powershell/sharepoint/sharepoint-pnp/sharepoint-pnp-cmdlets?view=sharepoint-ps
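If you have PowerShell 5.0 or later, the PowerShell Gallery route is usually the quickest – a one-liner:

    # Install the PnP PowerShell module for SharePoint Online from the PowerShell Gallery
    Install-Module SharePointPnPPowerShellOnline -Scope CurrentUser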

    Registering an Azure AD app to talk to the Graph

    If we’re using a PnP PowerShell command that talks to the Graph (as I am to create Office 365 Groups), then we need an AAD app registration which our script/code will use to authenticate. We define this with appropriate permission scopes for the actions we’re actually performing – the process below shows what I need to do to create Groups, but if you’re using a different script which does something else, then most likely you’ll need other permission scopes.

1. In the Azure portal, go to the “Azure Active Directory” section and find the “App registrations” area.
2. Click the “New application registration” button.
3. Complete the details – note that in this case, the sign-on URL can be anything that uniquely identifies your function.
4. Hit the “Create” button.
5. We now need to grant the appropriate permissions to this app registration:
   1. Go into the “Settings” area, and select “Required permissions”.
   2. Click the “Add” button, then “Select an API”, and find and select “Microsoft Graph”.
   3. In the “Application Permissions” section of this list, check the box for “Read and write all groups”.
   4. Click “Select” and then “Done” to save this config.
6. Now we need to obtain the app ID for our PowerShell script – it can be copied from the settings area. Store it somewhere safe for later pasting into our PowerShell script.
7. We also need to obtain a key. This can be done by:
   1. Going into the “Keys” area.
   2. Giving the key a name – when you click “Save”, the key value will appear in the box.
   3. Again, storing it somewhere safe for later use in our PowerShell script.

    Grant admin consent to the app

    Now that we’ve defined our AAD app registration, a tenant admin needs to grant admin consent to the app. This is the security gate which means that the organization is permitting this access to Office 365 data. What happens is that you construct a URL in the form of:

    https://login.microsoftonline.com/[tenant].onmicrosoft.com/adminconsent?client_id=[App ID]

    So for my app, the URL looked like this: https://login.microsoftonline.com/chrisobriensp.onmicrosoft.com/adminconsent?client_id=348841f1-6c9d-4044-b014-ed346f2572f2

When the tenant admin hits this URL, he/she sees the list of permission requirements for this app and can make a decision whether to grant consent or not.

    Once consent is granted, the app can request auth tokens which can perform actions related to the selected permission scopes. Read more about this process at Tenant Admin Consent if you need to.

      Create the Function locally, using PowerShell as the language

      OK, now the pre-reqs are in place, we’re ready to create an Azure Function to run our PS script from.

1. First, create a folder on your machine to host the files – I named my folder “COB-GroupCreator-Function” to reflect what we’re doing here.
2. Open VS Code to this folder.
3. Hit CTRL + SHIFT + P for the VS Code command bar, and start typing “Azure Functions” to pull up those commands. Select the “Create New Project” command.
4. Follow the prompts in the command bar to select the folder – accept the default if you’re already in the folder you created earlier.
5. When it comes to selecting a language, select any for now. PowerShell doesn’t appear in the list currently because support is only experimental (as at June 2018), but we can switch to it later.
6. Now let’s add an individual function. Hit CTRL + SHIFT + P again, and find the “Create Function” command under Azure Functions.
7. Accept that we’ll use the current folder.
8. In the next step we can change the language to PowerShell.
9. Select PowerShell from the list.
10. In the next step, we need to include “non-verified” templates whilst PowerShell support is still experimental – select the “Change template filter” option.
11. Select “All” from the list.
12. Additional templates now appear – in this case, our function is going to run once per day, so we’ll select “Timer trigger”.
13. Now enter the name for your function.
14. For a function which runs daily at 10am, the CRON expression would be “0 0 10 * * *” (Azure Functions CRON expressions include a seconds field), so let’s enter that when we’re asked for the schedule (by the way, here’s a good cheat sheet for other CRON expressions). The schedule ends up in the generated function.json file – see the sketch after these steps.
15. A bunch of new files will now be added to your folder, including the “run.ps1” file which is the PowerShell script for your function implementation.
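For reference, here’s roughly what the timer trigger binding in the generated function.json looks like – a sketch assuming the default binding name the template produces:

    {
      "disabled": false,
      "bindings": [
        {
          "name": "myTimer",
          "type": "timerTrigger",
          "direction": "in",
          "schedule": "0 0 10 * * *"
        }
      ]
    }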

We’re now ready for our function implementation – except that, if you’re following these steps while PowerShell support is still experimental, you may have some leftover JavaScript config from the project creation step. Let’s fix that next.

      Debugging our PowerShell script locally with F5

      VS Code works great as a debugger for PowerShell scripts. The best approach is to get your PowerShell script 100% correct by developing and debugging locally before publishing it to the cloud as an Azure function. You’ll be able to stop on breakpoints, see variables and so on – doing this locally is much easier than when you’re running in Azure on a daily schedule (although you have options for that too).

So, all we need to do is remove the JavaScript configuration from our launch.json file – you’ll find it in the .vscode folder. Delete the section for the JS configuration, leaving the rest of the file intact.

VS Code is now smart enough to do the right thing based on your .ps1 file extension. You’ll find that you can add a breakpoint to your script (which is currently still just a single line of boilerplate code) and hit F5 to debug locally.

You can actually do lots more to customize what happens when F5 is pressed – for example, if your PowerShell script accepts parameters and you want to pass some values in when debugging. This is done by adding some PowerShell-specific configurations back into your launch.json file – see Using the launch.json file in VS Code for improved PowerShell debugging for some examples of this. But even without this, we’re now able to step through our script locally and get the feedback we need to develop and test it properly.
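As an illustration, a PowerShell launch configuration along these lines (based on the standard configurations the PowerShell extension offers – treat the exact values as an example) runs the current file under the debugger:

    {
      "type": "PowerShell",
      "request": "launch",
      "name": "PowerShell Launch Current File",
      "script": "${file}",
      "args": [],
      "cwd": "${workspaceFolder}"
    }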

      Prepare for running in the cloud - copy files from SharePointPnPPowerShellOnline module into local folder

We now need to copy the files from the SharePoint PnP PowerShell module into our local folder. This isn’t necessary for running locally – the files will be loaded from your local modules path for that – but it is necessary for when the script runs as an Azure Function. So, let’s find the files on your local machine by running $Env:PSModulePath in PowerShell – you should see a bunch of paths in there, and one of them will contain the SharePointPnPPowerShellOnline module.

Open that folder on your machine and copy the “Modules” subfolder and all its contents into the folder containing the PS script for your function.
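If you’d rather script this step, something like the following works – a sketch, assuming the module was installed via Install-Module and that your function folder is the current directory:

    # Find the installed SharePointPnPPowerShellOnline module (highest version if several)
    $module = Get-Module -ListAvailable SharePointPnPPowerShellOnline |
        Sort-Object Version -Descending | Select-Object -First 1

    # Copy it into a "Modules" subfolder next to run.ps1, so the Azure Functions
    # runtime can load it when the script runs in the cloud
    Copy-Item -Path $module.ModuleBase -Destination ".\Modules\SharePointPnPPowerShellOnline" -Recurse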

Your function folder should now contain the “Modules” subfolder alongside the files for your function.

      When your script runs in the cloud, the Azure Functions runtime will now load any modules stored in this folder and you can therefore use the PnP commands successfully.

      Paste the script contents into run.ps1

Now we're ready to deal with the script implementation. Paste in the script we've seen already, or implement your own steps.

      You’ll need to substitute in values for your AAD app ID, AAD app secret (key) and Office 365 tenant prefix into the placeholders/variables in the script – for now, let’s do that in a hard-coded way to test our script. Later on, we’ll swap out these hardcoded values and fetch them from config in Azure instead.

As a reminder, the script contents I used are the ones shown in the “Overview of PowerShell script contents” section earlier.

      Once you've pasted those details in, run your script to check it works (by pressing F5 in VS Code). If your AAD app is registered with the right details and permissions, and you have the PnP PowerShell module installed properly, things should work just fine when running locally. Hooray! If not, you should work through the issues until you're successfully running locally before publishing to an Azure Function.

      Create the Azure Function app in the portal

      We need an Azure Function app to host our script in the cloud. You can actually create this through VS Code (use the command “Azure Functions: Create Function App in Azure”), and it works great! However, for simplicity let’s do it the normal way through the Azure portal. Even there, we have a couple of options but anything that gets you to the create function app process is good – for example:

1. Click “Create a resource”.
2. Start typing “function app” to search for the item – select it when you find it, and then hit the “Create” button.
3. Then complete the details – I reused some things I already had in Azure (e.g. a Resource Group, Storage Account etc.), but just create new ones if you need to. Of course, be aware of the selections you make if you’re doing this for real in production (especially whether you’re choosing the Consumption Plan or an App Service plan). Hit the “Create” button when you’re ready.
4. Your function app, as well as an App Insights instance if you selected that option, should now have been created and is ready to host code.
      A note on authentication
In this case, our function won't actually have *any* authentication for callers, because we're never going to call it over HTTP (which might be different for other functions you create) – this one is purely a timer trigger function.

Publishing files to an Azure Function from VS Code

      The next step, once we’re sure the script is working correctly, is to publish it to an Azure Function. You can do this in a variety of ways, but VS Code makes this easy:

1. Switch to the Azure tab in VS Code (bottom icon).
2. Find the Function App you created earlier. Right-click on it, and select “Deploy to Function App”.
3. Step through the prompts in the VS Code command palette – the defaults are generally what you need.
4. Accept the prompt to confirm the deployment.
5. VS Code will then start the publish operation.
6. You’ll then find that your files have been deployed as a Function to the selected Function App.

      ALTERNATIVE – DEPLOY FILES A DIFFERENT WAY

Of course, you don’t *have* to use VS Code to publish files up to your function app. Anything that gets your files there (FTP, manual copy via Kudu, web deploy, sync from source control) is fine – you just need to ensure the files end up in the right structure: your function’s folder (containing run.ps1, function.json and the Modules subfolder) needs to sit under site/wwwroot, alongside the host.json file.

To deploy the files using Kudu, simply drag and drop them from Windows Explorer into the site/wwwroot folder within Kudu.

        Final step - Storing App ID and Client Secret securely

At this point, you should now have your script running successfully in the cloud – it should execute on the schedule you defined for the function, or when using the “Test” button in the Azure portal. But remember that we hard-coded some values for the app authentication into the script – we need to fix that.

        A better option than having these values hard-coded in your script is to add App Settings to your Azure Function app, and read them from there. They are stored securely by Azure, although you could take this further and use Key Vault or similar if preferred. Either is better than having secrets/passwords in code or scripts! In the next steps, we’ll define them as app settings in Azure and tweak the script to pick them up from there:

1. Go to the Platform Features tab, then select “Application Settings”.
2. Scroll down to the “Application Settings” section, then click the “Add new setting” link.
3. Add the App ID and secret for the Azure AD app registration which has permissions to do the work (create an Office 365 Group in our case) – name them to match what the script will read (“AppID” and “AppSecret”).
4. Also add an item for your tenant prefix and name it “Tenant”.
5. Don’t forget to click “Save” back at the top of the page.
        Accessing App Settings from PowerShell in a Function
App Settings can be retrieved with the $env:[MyAppSetting] syntax in PowerShell – Azure surfaces them to the function process like environment variables. So in our script, we simply use the following, matching the keys we added our config values under:

        $env:AppID 
        $env:AppSecret 
        $env:Tenant

        What this means is that our final script should look like this:
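Building on the earlier sketch, the final shape is the same script with the hard-coded values swapped for app settings – something like:

    # Read config from the Function app's Application Settings
    $appId = $env:AppID
    $appSecret = $env:AppSecret
    $tenant = $env:Tenant

    # Connect to the Microsoft Graph via the AAD app (app-only)
    Connect-PnPOnline -AppId $appId -AppSecret $appSecret -AADDomain "$tenant.onmicrosoft.com"

    # Create the Office 365 Group
    New-PnPUnifiedGroup -DisplayName "My sample group" -Description "Created from an Azure Function" -MailNickname "my-sample-group"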

Voila! Your script should now run from the cloud and be implemented in the way it should be. I won’t show the end result of this example (an Office 365 Group) because you most likely know what that looks like, but hopefully you get the overall idea and the steps involved.

        Summary

The idea of running PowerShell scripts from the cloud is very useful, especially for scripts which should execute on a schedule. Our scenario of creating an Office 365 Group isn’t necessarily something you’d want to do on a schedule (or maybe it is, if you need a collaboration space for a weekly meeting or other periodic event?) However, the power of PnP PowerShell means there are many scenarios you could handle with this approach. We showed how to get external modules like PnP running in your Azure Function, and also how to deal with authentication to the Graph if needed – overall, it’s a powerful approach that can be useful to both IT Pros and developers. I recommend using Visual Studio Code to work with PowerShell these days, especially since it helps package and deploy Azure Functions as well as providing debugging support. Happy coding!

        Wednesday, 23 May 2018

        Updating your SPFx web part/extension – the nuclear approach

One situation that Office 365 and SharePoint developers will encounter more and more is the need to upgrade an existing SharePoint Framework (SPFx) project to a later version of the framework. SPFx gets updated fairly regularly, and as more development moves over to this approach, you’ll increasingly find that when you go back to enhance or maintain an existing solution, it’s SPFx code that you’re going back to. When you do, you have a decision – should you upgrade the SPFx version of the web part/extension while you’re making your code changes? Of course, you don’t have to (and maybe there hasn’t even been a new version of SPFx since your original release). But maybe you want to take advantage of some newer SPFx features – or maybe you just want to stay current and benefit from possible performance or developer enhancements.

        The pains of upgrading a project

Unfortunately, updating an SPFx application is often not as easy as it should be – you’ll often run into build errors when you run gulp serve or gulp bundle until you’ve got everything straight. Most of the guidance you’ll see advises you to use ‘npm outdated’ to identify packages with newer versions, and then update them individually (with npm install [package]@[version]) – indeed the SPFx docs have a useful page on this approach at Update SharePoint Framework packages. I recommend always trying this process first. However, I find that sometimes you need some extra steps too. I upgraded a project yesterday and ended up using a slightly different process to get to a working build, so I thought that would be a good topic to briefly discuss.
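For reference, that standard process is just a couple of commands – the package and version below are purely illustrative:

    # List packages which have newer versions available
    npm outdated

    # Update an individual package to a specific version (illustrative example)
    npm install @microsoft/sp-core-library@1.4.1 --save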

        One issue you can hit with the npm outdated/npm install [package]@[version] approach is if there are new or removed packages which make up the new SPFx version you are moving to. In this case, simply updating your existing packages isn’t enough.

        The nuclear approach – a recipe

        Sometimes the right thing to do is to be a bit more drastic. You could certainly create an entirely new SPFx application and merge your code into that – but that’s too drastic. It’s a lot of work, and I’ve yet to see a case where it’s actually necessary. The process below is somewhere in between – it’s based on creating a new SPFx project on the latest version, merging the package.json files to ensure you end up with all the right packages at the expected versions and then rebuilding the node_modules folder based on that. I’d use it when:

• Updating existing packages using npm outdated/npm install [package]@[version] doesn’t work (still getting build errors)
• I’m confident the previous build is in source control, including a package-lock.json file (so I can revert everything back if needed)

        The process would be:

        • Install the latest Yeoman generator version
        • Create a new temporary SPFx app (web part or extension, to match the application you’re updating)
        • Merge the package.json from the temporary app into yours
          • The aim is to have all packages needed – all the SPFx packages for the new framework version, AND any other packages you have previously installed for this web part/extension
        • Remove any package-lock.json or npm-shrinkwrap.json files from local directory (but do not remove from source control - you still might need to go back to a previous build)
        • Delete everything from node_modules
        • Run npm install
        • Make any tweaks to tsconfig.json if needed - newer SPFx versions sometimes need a change here too
          • Build errors should point you to the solution for these. When running an upgrade to SPFx 1.4.1 recently, I needed to add an entry in my “lib” section for “es2015” – but look for build errors for the specific tweak you need
        • Run gulp clean (to delete previous build outputs)
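Condensed into commands, the recipe looks roughly like this (run from your project root; the package.json merge itself is a manual step):

    # Install the latest Yeoman generator for SPFx
    npm install -g @microsoft/generator-sharepoint

    # (Create a temporary SPFx app in a separate folder with "yo @microsoft/sharepoint",
    # then merge its package.json into your project's package.json by hand)

    # Remove lock files locally – but keep them in source control!
    Remove-Item package-lock.json, npm-shrinkwrap.json -ErrorAction SilentlyContinue

    # Delete everything from node_modules and rebuild
    Remove-Item node_modules -Recurse -Force
    npm install

    # Clear previous build outputs (tweak tsconfig.json first if build errors say so)
    gulp clean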

          At this point, you should have a working build again when you run gulp bundle or gulp serve. If you do, and you’re comfortable everything is working, you should:

          • Commit to source control
            • Ensure your package-lock.json file is included – this represents the node_modules tree for this build, and you need it to recreate the build (since you won’t be checking-in the node_modules folder itself)
          • Continue making the code changes you actually came here to do :)

            Other notes

• Extra care needs to be taken for on-prem SPFx projects. The npm repository holds the latest versions of packages, but remember that when you’re on-prem you’re frozen at a certain version of the SPFx packages (not whatever the npm repository holds as the latest). So, your package.json file needs to reflect the appropriate versions for your on-prem build.

            Good luck!

              Thursday, 10 May 2018

              Licensing for push notifications in PowerApps/Flow

If you’re doing more with PowerApps and Flow in Office 365, sooner or later you might hit an interesting question when you want to use a premium connector – just who needs the P1 or P2 license? Premium connectors require extra licensing beyond what comes with Office 365 E1, E3 and even E5, and the licensing requirements can vary depending on the connector and how it is used – meaning costs can vary too.

The one I’m focusing on here is push notifications – let’s say you’re building a PowerApp or using a Flow, and you want to send a notification to a mobile device when something happens (i.e. an alert on the device, rather than an e-mail or SMS). In this case, you’ll most likely find your way to the PowerApps push notification action (a premium connector). Whilst there is also a Flow push notification, that one can only send to the Flow owner – but frequently, you want to send the notification to some kind of approver or recipient, not the person who created the Flow. So, the PowerApps notification is probably what you need – but don’t be fooled by the name. This thing isn’t constrained to use within a PowerApp (e.g. a button click) – it can also be used within a Flow (e.g. when the Flow gets to a certain stage). Here’s what it looks like there:

              PowerApps notification - in picker


              Who needs the license?

So the question you get to when you need to notify many people is: who exactly needs a P1/P2 license?

• Is it just the user/sender (which in this case would be the user the Flow runs as)?
• Is it every recipient?
• Is it everyone who could be a recipient?

              It’s an interesting one and I couldn’t find the answer in the documentation. We had the situation come up in some work for a client, so I reached out to a friendly contact in Microsoft (thanks Dan!), and he did some digging internally to find out.

              The answer

              In fact, it depends on how your Flow is triggered – since it can be done by a user action, or indirectly through an event:

Trigger: Manually
Who needs the license: Any user of the app which triggered the Flow (e.g. your PowerApp, Flow button or other app)
License type required:
• Flow is triggered by a PowerApp –> PowerApps P1
• Flow is triggered by a Flow button –> Flow P1

Trigger: By an event (e.g. new item in a SharePoint list, new file in OneDrive etc.)
Who needs the license: Just the Flow author
License type required: Flow P2 (* but see my caveat below)

              It’s worth being aware that these P1 and P2 additive licenses come in both PowerApps and Flow flavors – but PowerApps licenses always include Flow capabilities. PowerApps licenses are correspondingly more expensive since you unlock the whole, um, power of PowerApps too. Broadly the idea is that:

• Users creating “basic” PowerApps (e.g. canvas apps) need a P1 license (currently $7 per user per month for PowerApps P1)
• Users creating “advanced” PowerApps (e.g. model-driven apps, business process flows, entities in CDS etc.) need a P2 license (currently $40 per user per month for PowerApps P2)

              For many PowerApps, that will mean the P1 license is sufficient though (in my view).

                Also note that if you’ll only use Flow and never PowerApps, maybe you just need Flow P1/P2 licenses. These are much cheaper than PowerApps P1/P2 licenses (currently $5 per user per month for Flow P1, $15 per user per month for Flow P2).

                * By the way, with reference to my table above, my understanding is that if you have a PowerApp which adds a SharePoint list item (or file to OneDrive etc.), then a PowerApps license is required for the user – not just a Flow license. After all, the event is really being triggered by the PowerApp – rather than it being a purely indirect thing, from a manually added list item/file for example.

                If you need to know up-to-date pricing, see the PowerApps plans pricing and Flow plans pricing pages. Another good resource (along with licensing examples) is the PowerApps licensing overview page – this has some great info, but it just didn’t have the detail I needed around premium connectors and recipients.

                Other scenarios

                Maybe you want to use the PowerApps push notification purely within PowerApps (i.e. not in Flow). In this case, there are similar considerations:

                • Makers of the app which has the Push Notifications connection need a PowerApps P1 or P2 license (as you’d expect)
                • Users of the app need a PowerApps P1 or P2 license (also as you’d expect, since this is a premium connector remember)

But there’s an interesting possibility if a different PowerApp receives the notification (in the PowerApps world, it’s possible for one app to send the notification and another to receive it). In this case:

                • Users of the recipient app do NOT need a PowerApps P1/P2 license – their Office 365 license is sufficient
                  • (This is because this app does not use the Push Notifications connection)

                So, that’s definitely a trick which might be helpful, so long as the implementation details are OK with your requirements (i.e. you are able to split your app in this way).

                Summary

There are definitely a few complexities around licensing of PowerApps and Flow, and as you build more solutions with them, there are more scenarios to consider. I’m really not trying to be the man to answer all your licensing questions here though :) You’ll need to contact Microsoft if the above links don’t have the detailed information you need, as they’ll give a better answer than I ever could. However, I thought some folks might run into the same questions that I did, and I figured some of this info needs more prominence.

                Have fun building your solutions!