Thursday 5 December 2013

Using CSOM in PowerShell scripts with Office 365

In my recent “Office 365 developer decisions, tips and tricks” talk I mentioned that we’d been doing a lot of “PowerShell with CSOM” work, and this was enabling us to run scripts against SharePoint Online in the same way that we are used to (for on-premises SharePoint). This is becoming a popular approach, but since I got questions on it I thought it worth writing about.

When we first started to work with Office 365, I remember being quite concerned at the lack of PowerShell cmdlets – basically all the commands we’re used to using do not exist there. Here’s a gratuitous graph to illustrate the point:


So yes, nearly 800 PowerShell commands in SP2013 (up from around 530 in SP2010), down to a measly 30 in SharePoint Online. And those 30 mainly cover basic operations with sites, users and permissions – no scripting of, say, Managed Metadata, user profiles, search and so on. It’s true to say that some of these things are now available down at site-collection scope (needed, of course, when you don’t have a true “Central Admin” site), but there are still “tenant-level” settings that you’d want to script rather than change manually through the UI.

So what’s a poor developer/administrator to do?

The answer is to write PowerShell as you always did, but embed CSOM code in there. More examples later, but here’s a small illustration:

# get the site collection scoped Features collection (e.g. to activate one) – not showing how to obtain $clientContext here..
$siteFeatures = $clientContext.Site.Features

So we’re using the .NET CSOM, but instead of C# we are using PowerShell’s ability to call any .NET object (indeed, nearly every script will use PowerShell’s New-Object command). All the things we came to love about PowerShell are back on the table:

  • Scripts can be easily amended, no need to recompile (or open Visual Studio)
  • We can debug with PowerGui or PowerShell ISE
  • We can leverage other things PowerShell is good at e.g. easily reading from XML files, using other PowerShell modules and other APIs (including .NET) etc.

Of course, we can only perform operations where the method exists in the .NET CSOM – that’s the boundary of what we can do.

Getting started

Step 1 – understand the landscape

The first thing to understand is that there are actually 3 different approaches for scripting against Office 365/SharePoint Online, depending on what you need to do. It might just be me, but I think that when you start it’s easy to get confused between them, or not fully appreciate that they all exist. The 3 approaches I’m thinking of are:

  • SharePoint Online cmdlets
  • MSOL cmdlets
  • PowerShell + CSOM

This post focuses on the last flavor. I also wrote a short companion post about the overall landscape, with some details/examples of the other flavors: Using SharePoint Online and MSOL cmdlets in PowerShell with Office 365.

Step 2 – prepare the machine from which you will run scripts against SharePoint Online

Option 1 - if you will NOT run scripts from a SP2013 box (e.g. a SP2013 VM):

You need to obtain the SharePoint DLLs which comprise the .NET CSOM, and copy them to a folder on your machine – your scripts will reference these DLLs.

  1. Go to any SharePoint 2013 server, and copy every DLL whose name starts with Microsoft.SharePoint.Client from the C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI folder.
  2. Store them in a folder on your machine e.g. C:\Lib – make a note of this location.


Option 2 - if you WILL run scripts from a SP2013 box (e.g. a SP2013 VM):

In this case, there is no need to copy the DLLs – your scripts will reference them in the original SharePoint install location (C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI).

The top of your script – referencing DLLs and authentication

Each .ps1 file which calls the SharePoint CSOM needs to deal with two things before you can use the API – loading the CSOM types, and authenticating/obtaining a ClientContext object. So, you’ll need this at the top of your script:

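As a sketch of what this “top of script” section needs to do (the DLL folder, site URL and account name below are placeholders – substitute your own values):

```powershell
# TopOfScript.ps1 - load the CSOM types, then authenticate and obtain a ClientContext..
# (paths/URL/username below are placeholders - substitute your own values)
Add-Type -Path "C:\Lib\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Lib\Microsoft.SharePoint.Client.Runtime.dll"

$siteUrl = "https://yourtenant.sharepoint.com/sites/yoursite"
$username = "admin@yourtenant.onmicrosoft.com"
$securePassword = Read-Host -Prompt "Enter password for $username" -AsSecureString

$clientContext = New-Object Microsoft.SharePoint.Client.ClientContext($siteUrl)
$clientContext.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($username, $securePassword)

# make a trivial call to prove the context works..
$web = $clientContext.Web
$clientContext.Load($web)
$clientContext.ExecuteQuery()
Write-Host "PS CSOM got context for web:" $web.Title
```

If you’re running from a SP2013 box (Option 2 above), point Add-Type at the DLLs in the 15\ISAPI folder instead of C:\Lib.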

In the scripts which follow, we’ll include this “top of script” stuff by dot-sourcing TopOfScript.ps1 in every script below – you could follow a similar approach (perhaps with a different name!) or simply paste that stuff into every script you create. If you enter a valid set of credentials and URL, running the script above should see you ready to rumble:

PS CSOM got context

Script examples

Activating a Feature in SPO

Something you might want to do at some point is enable or disable a Feature using script. The script below, like the others that follow it, references my TopOfScript.ps1 script above:

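A sketch of such a script (the Feature GUID here is a placeholder – substitute the ID of your own Feature):

```powershell
# dot-source the script which loads the CSOM types and creates $clientContext..
. .\TopOfScript.ps1

# placeholder GUID - substitute the ID of the Feature you want to activate..
$featureId = [Guid]"00000000-0000-0000-0000-000000000000"

# get the site collection scoped Features collection and activate the Feature..
# (use $clientContext.Web.Features instead for a web-scoped Feature)
$siteFeatures = $clientContext.Site.Features
$clientContext.Load($siteFeatures)
$clientContext.ExecuteQuery()

$siteFeatures.Add($featureId, $true, [Microsoft.SharePoint.Client.FeatureDefinitionScope]::None)
$clientContext.ExecuteQuery()
Write-Host "Activated Feature $featureId"
```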

PS CSOM activate feature

Enable side-loading (for app deployment)

Along very similar lines (because it also involves activating a Feature), is the idea of enabling “side-loading” on a site. By default, if you’re developing a SharePoint app it can only be F5 deployed from Visual Studio to a site created from the Developer Site template, but by enabling “side-loading” you can do it on (say) a team site too. Since the Feature isn’t visible (in the UI), you’ll need a script like this:

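A sketch – the GUID below is the commonly documented ID of the hidden side-loading Feature, so verify it against your environment before relying on it:

```powershell
. .\TopOfScript.ps1

# the hidden side-loading Feature (commonly documented ID - verify before use)..
$sideLoadingFeatureId = [Guid]"AE3A1339-61F5-4F8F-81A7-ABD2DA956A7D"

$siteFeatures = $clientContext.Site.Features
$clientContext.Load($siteFeatures)
$clientContext.ExecuteQuery()

$siteFeatures.Add($sideLoadingFeatureId, $false, [Microsoft.SharePoint.Client.FeatureDefinitionScope]::None)
$clientContext.ExecuteQuery()
Write-Host "Side-loading enabled for" $clientContext.Url
```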

PS CSOM enable side loading

Iterating webs

Sometimes you might want to loop through all the webs in a site collection, or underneath a particular web:

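A recursive sketch along these lines does the job:

```powershell
. .\TopOfScript.ps1

# print the URL of a web, then recurse into its subwebs..
function Process-Web($web)
{
    $subWebs = $web.Webs
    $clientContext.Load($web)
    $clientContext.Load($subWebs)
    $clientContext.ExecuteQuery()

    Write-Host $web.Url
    foreach ($subWeb in $subWebs)
    {
        Process-Web $subWeb
    }
}

# start from the root web of the site collection (or any web you like)..
Process-Web $clientContext.Web
```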

PS CSOM iterate webs

(Worth noting that you also see SharePoint-hosted app webs in the image above, since these are just subwebs – albeit ones which get accessed on the app domain URL rather than the actual host site’s web application URL.)

Iterating webs, then lists, and updating a property on each list

Or how about extending the sample above to not only iterate webs, but also the lists in each – the property I'm updating on each list is the EnableVersioning property, but you could just as easily use any other property or method in the same way:

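Extending the recursion from the previous sample, a sketch which sets EnableVersioning on every list it finds:

```powershell
. .\TopOfScript.ps1

function Enable-Versioning($web)
{
    $lists = $web.Lists
    $subWebs = $web.Webs
    $clientContext.Load($web)
    $clientContext.Load($lists)
    $clientContext.Load($subWebs)
    $clientContext.ExecuteQuery()

    # update the EnableVersioning property on each list in this web..
    foreach ($list in $lists)
    {
        $list.EnableVersioning = $true
        $list.Update()
        Write-Host "Enabled versioning on list '$($list.Title)' in web $($web.Url)"
    }
    $clientContext.ExecuteQuery()

    # ..then recurse into the subwebs..
    foreach ($subWeb in $subWebs)
    {
        Enable-Versioning $subWeb
    }
}

Enable-Versioning $clientContext.Web
```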

 PS CSOM iterate lists enable versioning

Import search schema XML

In SharePoint 2013 and Office 365, many aspects of search configuration (such as Managed Properties and Crawled Properties, Query Rules, Result Sources and Result Types) can be exported and imported between environments as an XML file. The sample below shows the import operation handled with PS + CSOM:

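A sketch of the import, using the SearchConfigurationPortability object – note the extra search DLL; the paths/filenames here are placeholders:

```powershell
. .\TopOfScript.ps1

# the search CSOM lives in an additional DLL (placeholder path)..
Add-Type -Path "C:\Lib\Microsoft.SharePoint.Client.Search.dll"

# read the search schema XML previously exported from the source environment..
$searchSchemaXml = Get-Content "SearchSchema.xml" -Raw

$portability = New-Object Microsoft.SharePoint.Client.Search.Portability.SearchConfigurationPortability($clientContext)
$owner = New-Object Microsoft.SharePoint.Client.Search.Administration.SearchObjectOwner($clientContext, [Microsoft.SharePoint.Client.Search.Administration.SearchObjectLevel]::SPSite)

# import at site collection level..
$portability.ImportSearchConfiguration($owner, $searchSchemaXml)
$clientContext.ExecuteQuery()
Write-Host "Search schema imported at site collection level"
```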

PS CSOM import search schema


As you can hopefully see, there’s lots you can accomplish with the PowerShell and CSOM combination. Anything that can be done with the CSOM API can be wrapped into a script, and you can build up a library of useful PowerShell snippets just like the old days. There are some interesting things that you CANNOT do with CSOM (such as automating the process of uploading/deploying a sandboxed WSP to Office 365), but there ARE approaches for solving even these problems, and I’ll most likely cover this (and our experiences) in future posts.

A final idea on the PowerShell + CSOM front is the idea that you can have “hybrid” scripts which can deal with both SharePoint Online and on-premises SharePoint. For example, on my current project everything we build must be deployable to both SPO and on-premises, and our scripts take a “DeploymentTarget” parameter where the values can be “Online” or “OnPremises”. There are some differences (i.e. branching) in the scripts, but for many operations the same commands can be run.

Related post - Using SharePoint Online and MSOL cmdlets in PowerShell with Office 365

Using SharePoint Online and MSOL/WAAD cmdlets in PowerShell with Office 365

This post is a companion post to Using CSOM in PowerShell scripts with Office 365. As I mention over in that article, broadly there are 3 different flavors to writing PowerShell for Office 365 – exactly what commands you need to run will dictate which you use, but it’s also conceivable that you might use all 3 in the same script. When you start with Office 365, I think it’s easy to get confused between them, or not fully appreciate that they all exist. What I’m thinking of here is:




When installed, you’ll have:

SharePoint Online (SPO) cmdlets – these are SharePoint Online-specific, and can be identified by “SPO” in the noun part of the cmdlet. Examples:
  • Get-SPOSite (to list your site collections in Office 365)
  • New-SPOSite (to create a new site collection)

MS Online (MSOL)/WAAD cmdlets – these are commands related to an Office 365 tenancy (but not necessarily specific to Exchange, Lync or SharePoint), and can be identified by “Msol” in the noun part of the cmdlet. Examples:
  • Get-MsolUser (to list users in your tenancy)
  • Set-MsolUserPassword (to update a password)

Using SP CSOM in PS scripts – the main focus of my other post, Using CSOM in PowerShell scripts with Office 365. Examples:
  • Activating a Feature in SharePoint Online
  • Updating webs or lists in SharePoint Online
No install is needed for this flavor – you can run this type of script in a regular Windows PowerShell command prompt.


A note on MSOL/Windows Azure AD cmdlets

You might be wondering why the MSOL cmdlets show “Windows Azure Active Directory..” in the shortcut title (full name is “Windows Azure Active Directory Module for Windows PowerShell”), despite everything else being labelled “MSOnline” or “MSOL”. The answer is that originally these cmdlets were known as the “Microsoft Online Services Module for Windows PowerShell cmdlets”, but since that time Microsoft have introduced Windows Azure Active Directory as a formal service offering. Every Office 365 tenancy is backed by Windows Azure Active Directory (WAAD) – and since the MSOL cmdlets always were focused around directory stuff (managing users/groups, managing synchronization with your Active Directory and so on), these have now been absorbed into the WAAD offering.

Getting started

If you’re a developer or administrator who will be working with Office 365 regularly, I recommend installing the ‘shells’ for both the SharePoint Online and Office 365 PowerShell commands – you’ll probably need them at some point.

  1. Install PowerShell 3.0 if you don’t already have it. It’s included in the Windows Management Framework 3.0 -
  2. Install the SPO cmdlets -
  3. Install the MSOL cmdlets -

Once installed, you’re ready to start thinking about the “top of script” stuff (e.g. authenticating to Office 365). You’ll find that it’s very similar for both the SPO and MSOL scripts, but a different cmdlet must be run to start the session:

  • Connect-SPOService
  • Connect-MsolService

Script examples – SPO scripts

Authenticating to SharePoint Online to run SPO cmdlets:

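A sketch (the admin site URL – note the ‘-admin’ suffix – is a placeholder for your own tenancy):

```powershell
# note that the SPO cmdlets connect to the *admin* site for the tenancy..
$cred = Get-Credential
Connect-SPOService -Url "https://yourtenant-admin.sharepoint.com" -Credential $cred
```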

List all site collections in SharePoint Online:

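Once connected, something like:

```powershell
# list every site collection in the tenancy, with a few useful properties..
Get-SPOSite | Select-Object Url, Owner, StorageQuota, Template
```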

Recreate a site collection in SharePoint Online:

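A sketch – the URL/owner/quota/template values are placeholders, and note that Remove-SPOSite only moves the site collection to the recycle bin, hence the second delete:

```powershell
$url = "https://yourtenant.sharepoint.com/sites/yoursite"

# delete the existing site collection, then purge it from the recycle bin..
Remove-SPOSite -Identity $url -Confirm:$false
Remove-SPODeletedSite -Identity $url -Confirm:$false

# ..and create a fresh site collection in its place..
New-SPOSite -Url $url -Owner "admin@yourtenant.onmicrosoft.com" -StorageQuota 1000 -Template "STS#0" -Title "My recreated site"
```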

Script examples – MSOL/WAAD scripts

Authenticating to Office 365 to run MSOL/WAAD cmdlets:

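Unlike Connect-SPOService, no URL is needed here:

```powershell
$cred = Get-Credential
Connect-MsolService -Credential $cred
```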

Getting all users in the directory

A simple MSOL example, just for completeness:

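Once connected, something like:

```powershell
# list all users in the directory, with a few useful properties..
Get-MsolUser | Select-Object DisplayName, UserPrincipalName, IsLicensed
```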

Further reading

Appendix – full lists of SPO and MSOL/WAAD cmdlets

To help you get a sense of the full range of commands in each family (in case you don’t already have them installed), I’m listing them below:

SPO cmdlets


MSOL/WAAD cmdlets


Wednesday 13 November 2013

Slides from my “Office 365 – developer decisions, tips and tricks” talk published

Recently I gave a talk at SharePoint Saturday UK, following some work my team and I have done on a large Office 365 project. The project involves a reasonable amount of customisation, and we found that many of the techniques we were used to using on classic SharePoint projects cannot simply be applied. Personally I feel like I’ve learnt a lot over the last few months – at my current employer (Content and Code) we are on our 3rd or 4th “Office 365 project with SharePoint customisations”, and we were lucky to also have some time to prepare techniques/scripts etc. before the first one started.

I can’t share all this work, but if you’re starting a similar project you might find some useful info in the ground we have covered and decisions we made. I also point to some useful resources which might be helpful.

Office 365 – developer decisions, tips and tricks

The presentation is embedded below (along with a link to it on SlideShare if you don’t see it).


  • Decision – how we deal with test environments in the cloud (e.g. do you need a separate Office 365 tenancy for test?)
  • Decision – do developers still need powerful desktops/laptops to run virtual machines?
  • Decision – should you use (server-side) code in the sandbox?
    • And should you worry about the “sandbox is deprecated” message?
    • What are the alternatives to server-side code?
  • Using the magic “PowerShell + CSOM” combination to write PowerShell scripts for Office 365
  • Dealing with Managed Metadata fields
    • Provisioning taxonomy (Term Sets) with CSOM, to ensure consistent IDs
    • Our approach - 100% declarative provisioning of Managed Metadata fields
  • The new way of working with Managed Properties – search schema XML
    • Using the SearchConfigurationPortability object in CSOM
  • Automated deployments to Office 365 – using PS/CSOM to:
    • Recreate site collections
    • Import taxonomy term sets and terms
    • Import search schema
    • Upload sandbox WSPs to the site’s Solution Gallery
    • Activate Features, apply custom WebTemplate to site
  • Wrapping the above in TFS build for Continuous Integration (nightly builds of latest packages to Office 365)

I’ll expand on some of these topics in future articles. Similarly I’m expecting to further develop this talk and incorporate new content.

View/download link -

Friday 25 October 2013

Waiting for a search crawl in Office 365 – plan search-driven sites carefully

Here in autumn/fall 2013, if you’re working with Office 365 you might notice that content changes (such as new pages and documents) take some time to appear in search results. I spent a little time thinking about this recently, as my team and I finished building a search-driven news site. On this project, we are mainly developing against Office 365 – we use local virtual machines also, but since O365 is the target we are deploying our customisations there frequently as we develop.

We noticed that “index latency” – the time taken for new content to appear in the search index – was poorer than we expected on Office 365. We run several tenancies on different subscription levels (e.g. SharePoint P2, Office 365 E3 etc.), and we experience the problem across all of them. Some days are good, some days are bad. One memorable (read: stressful) time, we had an “end of sprint demo” - our solution was provisioned 2 days before the demo, giving us lots of time to create test content so that the demo to the business users would go well. We completed adding our pages, documents, pictures and videos a full 24 hours before the demo, and waited for our home page to “light up” as content was crawled in Office 365.

Unfortunately, only some of the content was indexed in time. The demo itself went well, but perhaps only because a bit of narrative helped the business users imagine the ‘full’ picture. Overall, it’s difficult not to feel that 24 hours is a long time to wait for content to be indexed in SharePoint! Business users these days have higher expectations, and most on-premises environments I’ve worked with have used incremental crawls with a frequency of 15 or 30 minutes.

How long is normal in Office 365?

The poor performance surprised us somewhat. My colleagues and I thought that we had originally read that a delay of up to 15 minutes was expected in Office 365, perhaps suggesting that SharePoint 2013’s “Continuous Crawl” is used. The Office 365 Service Descriptions – Search page now suggests that isn’t the case, but however it is managed in the back-end, we certainly weren’t expecting such long delays. Some further digging will lead you to this KB article:

Search doesn't return all results in SharePoint Online – KB2008449

“Search crawls occur continuously to make sure that content changes are available through search results as soon as possible. Recently uploaded documents may not immediately be displayed in search results because of the time that's required to process them. SharePoint Online targets between 15 minutes and an hour for the time between upload and availability in search results (also known as index freshness). In cases of heavy environment use, this time can increase to six hours.”

OK, so at least that’s something official, even if it’s not necessarily what we wanted to hear. But why are we sometimes seeing delays even longer than 6 hours? I raised a Service Request with Microsoft to find out..

The support line

In short, I didn’t get a 100% satisfactory answer from Office 365 support. Ultimately it sounds like this kind of thing is fairly normal in Office 365 right now. I asked if other customers were reporting this issue, and the answer was “yes, but we just ask them to wait another day”. Hmm, OK then! Of course, if your site deals with time-sensitive content (or you are just looking for fresh content to be shown in search in a reasonable timeframe) this isn’t a great situation.

Working around the issue

So you may need to consider other alternatives:

  • If you are dealing with search-driven functionality, could the same thing be provided with query rather than search (e.g. if you do not need to aggregate across site collections)?
  • If you are in a hybrid situation, could the functionality be delivered by an on-premises environment?
  • Do you need a solution right now, or can you afford to wait for improvements? (I personally am hopeful that upgrades to Office 365 will improve the situation in the future.)

For us, in fact, all three are options we could use. In our situation the 2nd option could be the simplest if we need an immediate solution - everything we are building for this client can be deployed to Office 365 or to on-premises SharePoint. This requires quite a lot of careful engineering (not only in terms of the solution, but also deployment scripts/processes etc.), but it results in a nice position to be in for a hybrid deployment.

In general though, let’s hope that Microsoft work on this in Office 365. I’ll keep you posted if we see improvements - and if anyone has any useful information in this area, feel free to share in the comments below.

Sunday 20 October 2013

Speaking at SharePoint Saturday UK 2013 – developing for Office 365

SharePoint Saturday UK

As usual, I’ll be speaking at this year’s SharePoint Saturday UK event – I think this is the 4th one, and also my 4th time speaking there. It’s a great event, and I’m looking forward to learning from the other sessions. For my part, I’ll be doing my best to share knowledge gained from recent Office 365 and SharePoint 2013 projects.

My session is mainly targeted towards developers and technical people, but hopefully there’s something in it for anyone looking at Office 365. Myself and some colleagues have spent an intense few weeks/months delivering a cloud project recently, and it feels like every day has been a learning day. My talk hopefully conveys some of this, and I’ll be evolving it for future conferences:

Developer decisions, tips and tricks - lessons learnt from Office 365 projects

As a developer or technical lead, your early Office 365 projects are likely to throw up some interesting questions. Should you avoid the sandbox? How should test environments be handled in the cloud? How should a site template be implemented? And just how do you provision Managed Metadata fields, when the on-premises techniques cannot be used?

This session walks through the “dev strategy” decisions we made at Content and Code, and why. Over the course of several demos, we’ll also discuss apps, automation scripts and also advanced techniques such as Continuous Integration for Office 365.

Event details

This year’s event is on 9th November 2013, and is held in Hinckley, near Leicester. If you’re interested, you can register here:


Wednesday 18 September 2013

Provisioning Managed Metadata fields in Office 365 – Part 2: building WSP packages for a specific environment/tenancy

In the previous post I talked about how provisioning Managed Metadata fields is tricky for Office 365 (since developers cannot use server-side code to “hook-up” the field to the Term Store, like we can on-premises). I talked about the way we approach this, and in this post I’ll show the Visual Studio/MSBuild customisation we use to build WSPs specific to an Office 365 tenancy. Here’s how the articles relate to each other:

  1. Provisioning Managed Metadata fields in Office 365 – dealing with multiple environments
  2. Provisioning Managed Metadata fields in Office 365 – building WSP packages for a specific environment/tenancy [this article]

Using MSBuild/custom Project Configurations to build packages per Office 365 tenancy

Here’s what I came up with – it took quite a few long nights in the murky hall-of-mirrors world of “batching” in MSBuild, roughly 5 million discarded approaches and several new gray hairs. But finally, it works great for us:

We now have some custom project configurations which can be selected in Visual Studio:

So instead of just ‘Debug’ and ‘Release’ we are using:

  • “CandCVM” – Content and Code (name of my employer) virtual machine. Used for when you are developing against your local VM rather than the cloud
  • “DevTenancy” – the Office 365 tenancy we are using for DEV purposes
  • “TestTenancy” – the Office 365 tenancy we are using for TEST purposes

Those are all the environments we need to worry about for now, but of course others will be needed eventually.

We also have an XML file in the VS project, which defines the Managed Metadata IDs for each environment – since we now only have to worry about the SspId, that’s the only value the file contains:


SspId replacements file_thumb[2]
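To give the idea, such a file could be structured along these lines – the element/attribute names are illustrative (the only requirement is a mapping from Configuration name to SspId), and the GUIDs are placeholders:

```xml
<?xml version="1.0" encoding="utf-8"?>
<SspIdReplacements>
  <!-- one entry per Visual Studio Configuration/environment (placeholder GUIDs) -->
  <Environment Name="DevTenancy" SspId="00000000-0000-0000-0000-000000000001" />
  <Environment Name="TestTenancy" SspId="00000000-0000-0000-0000-000000000002" />
</SspIdReplacements>
```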

Once the project (or solution) configuration has been switched, we just build/publish the WSP to the filesystem as usual:


..and as you’d expect, we get the WSP(s) which will work against that environment:


What has happened, and how?

As the WSPs were being packaged, some custom MSBuild executed. What this does is:

  • Finds the SspId_Replacements.xml file
  • Reads the SspId value for the appropriate environment
  • Performs a find/replace across all of the XML files about to go into the package(s), and replaces the SspId value with the one read from the ‘replacements’ file


  1. Although it seems inefficient to look for an SspId and try to replace across *all* XML files (and it is), we do need to ensure that we catch:
    1. Managed Metadata field declarations (i.e. in elements.xml files)
    2. The section in schema.xml files (for list definitions) where the fields are duplicated

      I did look at trying to constrain the search beyond just the .xml extension, but found this difficult in MSBuild. I was also mindful of the fact that a developer could choose to have a Feature elements file NOT named elements.xml :) Fortunately the performance hit for us is negligible – the overall penalty in packaging for using this stuff (compared to a plain Debug or Release build) seems to be around 1-3 seconds – perfectly tolerable right now.
  2. The replacement is XML-based (using XPath rather than string manipulation), so if you have XML comments nearby, this won’t trip up the replacement.

Benefit: integrating with Continuous Integration/automated build

One reason I wanted to implement this in MSBuild is that I knew it would be painless to use in a CI process. When implementing the build in TFS, we select the solutions/projects to build, and then define any custom Configurations we wish to use there too (i.e. in addition to developer machines). In our case, our CI builds deploy to our test tenancy:

Continuous Integration - Office 365 - custom VS configuration_thumb[2]

We then make sure our build definition is using this Configuration (instead of Debug or Release):

Continuous Integration - Office 365 - Managed Metadata solution_thumb[2]

Want to use this? Here’s the MSBuild..

If you think this could be useful to you, here are the details – essentially you need to edit your .csproj file, add the SspId_Replacements.xml file to your project and finally define the custom Configuration in Visual Studio. The sequence doesn’t matter, so long as all pieces are in place before you try a build. In team development, these steps need to be performed just once per VS project (e.g. by the lead developer).

Step 1 – add the custom MSBuild to your project:

Edit your .csproj file (each of them if you have multiple – the solution is currently “project-scoped”) to include this at the bottom:

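As a simplified illustration of the approach (not the full implementation): the AfterLayout target is an extension point of the SharePoint tools packaging targets, and the file path, XML schema and XPath queries below are assumptions which would need adjusting for your project (e.g. to handle the SharePoint XML namespace in elements.xml/schema.xml files, and to skip files with no SspId):

```xml
<!-- simplified sketch - goes at the bottom of the .csproj, inside the <Project> element -->
<Target Name="AfterLayout" Condition="'$(Configuration)' != 'Debug' And '$(Configuration)' != 'Release'">
  <!-- read the SspId for the active Configuration from the replacements file -->
  <XmlPeek XmlInputPath="$(MSBuildProjectDirectory)\_Replacements\SspId_Replacements.xml"
           Query="/SspIdReplacements/Environment[@Name='$(Configuration)']/@SspId">
    <Output TaskParameter="Result" PropertyName="SspId" />
  </XmlPeek>
  <ItemGroup>
    <XmlFilesToUpdate Include="$(LayoutPath)**\*.xml" />
  </ItemGroup>
  <!-- poke the environment-specific SspId into every XML file about to be packaged -->
  <XmlPoke XmlInputPath="%(XmlFilesToUpdate.FullPath)"
           Query="//Property[Name='SspId']/Value"
           Value="$(SspId)" />
</Target>
```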

Step 2 – add the XML file with your config values:

Add an XML file within your project at the path “_Replacements\SspId_Replacements.xml” (N.B. this is configurable, see the MSBuild above). Add in the XML shown above, substituting the relevant SspId(s) for your environment(s). If you don’t know where to find the SspId for your environment, my colleague Luis mentions that in his post.

Step 3 – define the custom Configurations in Visual Studio:

In Visual Studio, define a custom Configuration for each environment you need to build WSPs for. Start by going to Configuration Manager and selecting “New..”:

Define custom configuration_thumb[2]

Define custom configuration 2_thumb[2]
..then create the Configuration, ensuring the same name is used as specified in the XML file – these two need to match. You’ll generally want to copy the settings from “Release”:

Define custom configuration 3_thumb[2]

Now close Configuration Manager – the implementation is complete.

Building packages

To create WSPs, use the Configuration dropdown to switch to the “TestTenancy” (or whatever label you used) build type – be careful that the Platform stays as “Any CPU” here; you may need to change it back if not. Then just right-click on the project in Solution Explorer and click “Package” as usual – you should get a WSP package in which the SspId find/replace has occurred. You can now test deploying this to the Office 365 tenancy which corresponds to this build type.

A note on requirements

Note that the MSBuild used here is not compatible with SP2010/Visual Studio 2010 (but that’s OK because the entire technique is not needed there). There is a dependency on .NET 4.0 for the XmlPeek/XmlPoke MSBuild activities which are used to update the XML files.

Hopefully this is useful to some folks.

Tuesday 17 September 2013

Provisioning Managed Metadata fields in Office 365 - Part 1: dealing with multiple environments

Managed Metadata fields have always been slightly painful for SharePoint developers, but if you did any kind of site templating or Feature development in SharePoint 2010, chances are that you did some research, read some blog articles and came to understand the solution. Here I’d like to talk about it for Office 365, and show a little customization I built to help with the process we currently use. This article became long, so I split it over two articles:

  1. Provisioning Managed Metadata fields in Office 365 – dealing with multiple environments [this article]
  2. Provisioning Managed Metadata fields in Office 365 – building WSP packages for a specific environment/tenancy

So, back on the SharePoint 2010 situation - the deal is that some Managed Metadata details change between SharePoint environments – so unlike other fields, the same provisioning XML could not be used across dev/test/UAT/production. To be specific, the IDs of the Term Store, Group and Term Set all change between environments. As a reminder, broadly the solution was to:

  • Use XML to define the “static” details of the field
  • Use server-side code to “finish the job” – i.e. use the API to ask SharePoint what the IDs are (for the current environment), and then update the field config with that value

Without the 2nd step, the field will be created but will be broken – it will be grayed out and users cannot use it (SP2010 example shown here):


Posts by Ari Bakker (whose image I’m using above, thanks Ari), Wictor Wilen and Andrew Connell were popular in discussing the steps to solve this.

It’s the same deal in SharePoint 2013/Office 365 – but it turns out we need different techniques.

Why the approach doesn’t work for Office 365/SharePoint Online

Well firstly, code in sandboxed solutions is deprecated. Full-stop. [RELATED - did you hear? Microsoft are starting to clarify that sandboxed solutions without code aren’t really deprecated after all (and hopefully MSDN will reflect this soon), but CODE in sandboxed solutions is deprecated and could be phased out in future versions. Clearly this is a very important distinction.]

But even if we were happy to use sandboxed code - in Office 365/SharePoint Online, we cannot use the Microsoft.SharePoint.Taxonomy namespace in server-side code anyway – the net result is that we are unable to “finish the job” in this way to ensure the field is correctly bound to the Term Store. This is a problem! Even worse, whilst it is possible in the CSOM API to bind the field, having this execute in the provisioning process (e.g. as a site is being created from the template) is challenging, maybe impossible. Maybe you could come up with some imaginative hack, but that’s probably what it would be. And what happens if this remote code (e.g. a Remote Event Receiver) fails?

Possible solutions

A colleague of mine, Luis Mañez, did some great research – I strongly recommend reading his article, Deploying Managed Metadata Fields declaratively in SharePoint 2013 Online (Office 365). Here’s a summary:

In fact, it IS possible to provision Managed Metadata fields without any code, if you are willing to accept a big trade-off – you can declaratively specify the key details (such as the Term Store ID (also known as the SspId), the Group ID, the Term Set ID etc.) in your XML field definitions. Wictor alluded to this possibility in his post. But remember, these details change between environments!

So in other words, the trade-off is that you would need to rebuild your WSPs for each environment.

This is tricky for us, because on this project we chose to run multiple Office 365 tenancies for development/test/production (something I’ll talk about in future posts) – just like a traditional mature process. So at first we said “No way! That’s against all of our ALM principles! The exact same packages MUST move between the environments!”. But then we rationally looked at the alternatives we could see:

  • Option 1 - Some elaborate “remote code” solution, perhaps involving code running separately AFTER the site has been provisioned. Until this code executed, it would not be possible to upload documents to libraries with MM fields within the sites (and similarly, if this remote call failed for any reason, these libraries would not function correctly until an administrator intervened).
  • Option 2 - The client needs to fix-up any Managed Metadata fields manually – across all 5000 sites we were expecting. In every list and library. Knowing that some lists/libraries have up to 5 such fields. Yeah….

Since neither of these was attractive, we continued looking at this idea of a 100% declarative definition of Managed Metadata fields. And then we realized that..

..if you do things in a certain way, you can get to the point where ONLY THE TERM STORE ID (SspId) CHANGES BETWEEN ENVIRONMENTS. That’s kinda interesting. It means that just one find/replace operation is all that’s needed – assuming you’re happy to accept the overall trade-off. Of course, having to replace the SspId is still sub-optimal, error-prone and less than awesome. But maybe we could work on that too – and that’s what these posts are really about: to show a Visual Studio customization we made to simplify this process, and make it less prone to human error. If you want to skip ahead to this, see Provisioning Managed Metadata fields in Office 365 – Part 2: building WSP packages for a specific environment/tenancy.

But first, let’s talk about that “if you do things in a certain way” thing (which means that only the SspId changes between environments)..

The full recipe for Managed Metadata fields

Taking a step back for a second, if you are in “development mode” (e.g. creating a template for SharePoint sites), successful provisioning actually involves more than just provisioning the field itself in a certain way. Effectively you should seek to provision both the Term Sets AND the fields, rather than allowing administrators to create new Term Sets through the admin interface. This is because:

  • This way, you can control the ID of all your Term Sets – rather than let SharePoint generate that GUID
  • Because this is a static “known” ID, we can reference it elsewhere

Here’s what needs to happen:

  • Term Sets are provisioned into the Term Store with “known” IDs
  • The “known IDs” are then used in the XML definition of the fields
    • The code sample below is an example of a Managed Metadata field being provisioned the 100% declarative way. Notice all the properties being specified in the ‘Customization’ section (something a field using the combined declarative + code approach does not have):
      ** N.B. My newer code samples do not show in RSS Readers - click here for full article **
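To make this concrete, here is a sketch of the general shape of a fully declarative Managed Metadata field definition – all names and GUIDs below are placeholder values, and note that such a field also needs a companion hidden Note field (referenced via the TextField property) provisioned alongside it:

```xml
<!-- Illustrative sketch only – all names and GUIDs are placeholders -->
<Field ID="{11111111-1111-1111-1111-111111111111}"
       Type="TaxonomyFieldType"
       Name="DocumentClassification"
       DisplayName="Document Classification"
       ShowField="Term1033"
       Required="FALSE"
       Group="Custom Columns">
  <Customization>
    <ArrayOfProperty>
      <Property>
        <Name>SspId</Name>
        <!-- The Term Store ID – the one value which changes between environments -->
        <Value xmlns:q1="http://www.w3.org/2001/XMLSchema" p4:type="q1:string"
               xmlns:p4="http://www.w3.org/2001/XMLSchema-instance">22222222-2222-2222-2222-222222222222</Value>
      </Property>
      <Property>
        <Name>TermSetId</Name>
        <!-- The "known" Term Set ID, because we provisioned the Term Set ourselves -->
        <Value xmlns:q2="http://www.w3.org/2001/XMLSchema" p4:type="q2:string"
               xmlns:p4="http://www.w3.org/2001/XMLSchema-instance">33333333-3333-3333-3333-333333333333</Value>
      </Property>
      <Property>
        <Name>TextField</Name>
        <!-- ID of the hidden Note field which backs this Managed Metadata field -->
        <Value xmlns:q3="http://www.w3.org/2001/XMLSchema" p4:type="q3:string"
               xmlns:p4="http://www.w3.org/2001/XMLSchema-instance">{44444444-4444-4444-4444-444444444444}</Value>
      </Property>
    </ArrayOfProperty>
  </Customization>
</Field>
```

It is the SspId value here which needs to change per environment – the Term Set and Group IDs stay constant because we control them.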

If you do this, your Managed Metadata fields will work just fine:

Managed Metadata fields - working

Great. It’s certainly very valuable to know this is possible for Office 365. So now we have things so that only the SspId value needs to change between environments. But that’s still a nasty find/replace operation – how could we make this situation better?

I describe the mechanism we use in the next post - Provisioning Managed Metadata fields in Office 365 – Part 2: building WSP packages for a specific environment/tenancy.

Wednesday 21 August 2013

Working with web parts within a SharePoint app

I’ve previously mentioned that “you don’t get very much” when you start creating an app which has some SharePoint-hosted components (e.g. pages). You get a bare bones ASPX page, and from there it’s down to you as the developer. Whilst playing around with apps I’ve found it interesting to experiment by adding out-of-the-box components (e.g. web parts/controls in the SharePoint namespace) to app pages. The results are quite interesting, and lead to a couple of findings which might be useful tricks in the toolbox. If nothing else, you might find this an interesting “leftfield” post about some questions you hadn’t thought to ask.

Before we dive in, here’s where we are in my overall app series:

  1. SharePoint 2013 apps – architecture, capability and UX considerations
  2. Getting started – creating lists, content types, fields etc. within a SharePoint app (provisioning)
  3. Working with data in the app web, and why you should
  4. Access end-user data (in the host web) from a SharePoint 2013 app
  5. Rolling out SharePoint 2013 apps to the enterprise - tenant scope and PowerShell installs
  6. Azure is the new SharePoint ‘_layouts’ directory
  7. “Host web apps” – provisioning files (e.g. master pages) to the host web
  8. “Host web apps” – provisioning fields and content types
  9. Deploying SP2013 provider-hosted apps/Remote Event Receivers to Azure Websites (for Office 365 apps) 
  10. Working with web parts within a SharePoint app [this article]

Deploying web parts within an app

If you create a SharePoint-hosted app, you might notice that the default page you get is an ASPX page rather than HTML. Also, it has several declarations at the top, allowing SharePoint controls to be used:

App page default markup 
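For reference, the directives at the top of the default app page look broadly like this (a sketch from a typical SharePoint-hosted app project – treat the exact attribute set as indicative rather than definitive):

```aspx
<%-- Sketch of the default app page directives --%>
<%@ Page Inherits="Microsoft.SharePoint.WebPartPages.WebPartPage, Microsoft.SharePoint, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
    MasterPageFile="~masterurl/default.master" Language="C#" %>
<%@ Register TagPrefix="SharePoint" Namespace="Microsoft.SharePoint.WebControls"
    Assembly="Microsoft.SharePoint, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>
<%@ Register TagPrefix="Utilities" Namespace="Microsoft.SharePoint.Utilities"
    Assembly="Microsoft.SharePoint, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>
<%@ Register TagPrefix="WebPartPages" Namespace="Microsoft.SharePoint.WebPartPages"
    Assembly="Microsoft.SharePoint, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>
```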

The idea here is that you might want your app to take advantage of some SharePoint building blocks (web parts/controls) rather than recreating that functionality from scratch. But since you don’t have the full page editing model in an app (it’s not SharePoint content that a contributor would edit, after all), any web parts would have to be included in the markup of the page (by the developer). I did some testing to see which web parts appear to work, and this post is really about my findings.

I expanded the markup to include some common web parts within web part zones (and added any @Register directives which I started to need). My markup was then:

** N.B. My newer code samples do not show in RSS Readers - click here for full article **
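The general pattern for the expanded markup is a web part zone containing the web parts under test – here’s a minimal, illustrative sketch (the control names/attributes shown are deliberately pared down, not an exhaustive set):

```aspx
<%-- Illustrative only: web parts added directly to the page markup, inside a zone --%>
<WebPartPages:WebPartZone ID="WebPartZone1" runat="server" FrameType="TitleBarOnly">
  <ZoneTemplate>
    <%-- e.g. an XsltListView web part pointed at a list in the app web --%>
    <WebPartPages:XsltListViewWebPart ID="MyListViewWebPart" runat="server"
        ListUrl="Lists/MyAppList" Title="My app list" />
  </ZoneTemplate>
</WebPartPages:WebPartZone>
```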

Which web parts can I (probably) use within an app?

Here’s a summary of my results:

Web part | Can it be used in an app page?
Content Query web part | No
Content Search web part | Yes
XsltListView web part | Yes
Core Search Results web part | Yes
Page Viewer web part | Yes
Data View web part | No
Result Script web part (for search results) | No (EDIT: Yes – see the update below)

Content Query web part:

The CQWP appears to be a flat “no”. If you don’t have publishing enabled on the host web, you’ll get this:

This seems to be related to being unable to fetch ContentQueryMain.xsl. I notice it is a 401 rather than a 404, which is kinda interesting, but anyway if you activate publishing you then get:


So, it feels like the Content Query web part may not have received any attention to make it work within an app web. It’s worth noting here that the path to the XSL references the app web – although it is “structurally” correct (in that the Style Library is referenced at the root), I think effectively it’s not possible to “browse” host web content on the app web domain. I even tried providing an absolute host web URL to the XSL file, but that also fails with the ‘page not found’ error.

So, perhaps the CQWP would need to be re-engineered for it to work within an app.

Content Search web part:

This web part appears to work fine, which is good news:

Content Search web part

It’s worth knowing that Display Templates are still pulled from the host web (master page gallery), but it seems no problems arise from that.

XsltListView web part:

As I’ve mentioned in previous articles, it’s somewhat confusing the first time you add a list to an app you are developing – by default, there’s nothing on the page (e.g. a navigation link to the list) which tells you whether the list is being provisioned successfully. However, since you’ll know the list URL, typing the address into the browser address bar shows it is there. To display items from the list, you’ll most likely want to use the XsltListView web part, adding the markup to one of your pages manually. When you do, you’ll see that displaying items from a list in your app web works just fine:

List view web part

I wondered if it was possible to use this web part to show items from a list in the host web. Putting aside potential issues of including a token such as ~site in the URL (by testing with a hardcoded absolute URL), I found that it is not:

XsltListViewWebPart - host web 

Page Viewer web part:

The idea of using the Page Viewer web part in an app could be quite interesting. I was intrigued to see if I could display a page from the host web within my app, and found that it is possible, with some caveats (below). The image below shows just that – a page from the host web. Some points:

  • The “This page is allowed to be IFramed” message is just some text on the page
  • The yellow “customized” message is because I manually edited the page in SPD to add the <WebPartPages:AllowFraming ID="AllowFraming" runat="server" /> tag. If this was provided by something else higher up (page layout/master page), this message would not be present
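In markup terms, allowing the page to be IFramed just means that control appears somewhere in the page – a sketch of the relevant fragment (the Register directive shown is the usual one for this namespace):

```aspx
<%@ Register TagPrefix="WebPartPages" Namespace="Microsoft.SharePoint.WebPartPages"
    Assembly="Microsoft.SharePoint, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>

<%-- Opts this page out of SharePoint's framing protection so it can be shown in an IFrame --%>
<WebPartPages:AllowFraming ID="AllowFraming" runat="server" />
```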

Page viewer web part

But the main caveat is probably that it all works fine with a hardcoded/absolute URL, but tokens such as ~site do not appear to work. So in real life you might struggle if you need to have any kind of relative URL here. Still, useful to know perhaps.

Result Script web part (search results):

[UPDATED AUGUST 22 2013: Scot Hillier got in touch, telling me that it IS possible to use this web part. The solution can be found at the bottom of this section.]

[Original text follows, to show errors you might run into:] The Result Script web part is effectively the modern day “search results” web part in SharePoint 2013. It can be used to display the results from a given search query – it is the web part used on the default SP2013 search results page, and very powerful it is too. It seems like this one should work in a SharePoint-hosted app, but I just couldn’t quite get there. I’m hoping I’ve overlooked something and it is actually possible, because the issue seems to simply be getting the right format of escaping quotation marks in the search query (the DataProviderJSON property). I tried various forms, but I either see this:

Result script web part

..or this:

Result script web part 2

Permutations of quotation marks/escape characters I tried include:

  • \"
  • ""
  • ..and so on

UPDATED – THE SOLUTION (thanks Scot!):

  • The answer is to use XHTML encoding of the quotation marks – i.e. &quot; 
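As a hedged illustration (the tag prefix and JSON keys shown here are typical of an exported Result Script web part, and may vary in your environment), the encoded property looks something like:

```aspx
<%@ Register TagPrefix="SearchWC" Namespace="Microsoft.Office.Server.Search.WebControls"
    Assembly="Microsoft.Office.Server.Search, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>

<%-- Quotation marks inside DataProviderJSON are entity-encoded as &quot; --%>
<SearchWC:ResultScriptWebPart ID="SearchResults" runat="server"
    DataProviderJSON="{&quot;QueryTemplate&quot;:&quot;{searchboxquery}&quot;,&quot;SourceName&quot;:&quot;Local SharePoint Results&quot;}" />
```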

When you do this, you’ll see the Result Script web part does actually work within your app:

Result script web part - working

So, that’s good news, and means that we CAN take advantage of this web part inside an app! 

(Before this fix, it seemed a shame – I can imagine lots of apps which might want to display search results, and without this web part, code would be needed, such as I show in Calling SharePoint search using REST (e.g. from JavaScript or an app).)

Core Search Results web part:

Although it turns out to be possible to use the Result Script web part, if for any reason it’s not right for your circumstance there’s always the Core Search Results web part – this is the search results web part you know and love from earlier versions of SharePoint. It uses XSL (instead of JavaScript display templates) for rendering and doesn’t have the sophisticated Query Builder (which you wouldn’t be able to use within an app anyway), but works fine for showing search results on a page:

Core Search Results web part


Some web parts can be used within an app, some cannot. In terms of other types of controls (e.g. SharePoint server controls), I found that the SafeControl settings of the web.config used by apps are more restrictive than normal – several namespaces and specific controls are marked as unsafe, so many server controls will be off-limits in an app page. Still, it’s quite possible that some can be used – more testing would be needed here.

The usual mantra should apply to developers of SharePoint apps as in general – check if there’s something in the box which can be used before you write code. It could be that something in there saves you considerable effort.

Next time - Using app parts (ClientWebPart) to bring app elements into the host web

Friday 12 July 2013

Deploying SP2013 provider-hosted apps/Remote Event Receivers to Azure Websites (for Office 365 apps)

Before SharePoint 2013, we were all used to the idea of custom code running on SharePoint servers. However, this changes with SP2013 – Microsoft are either forcing us (Office 365) or steering us (on-premises) to run such code off the SharePoint boxes. And you can understand why – frankly, Microsoft have no chance of providing a stable Office 365/SharePoint Online platform if it has everyone’s custom code running on it. So, SharePoint 2013 allows event receiver code to run on a remote server. This post looks at deploying Remote Event Receiver components to an Azure Website for a SharePoint site running in Office 365. Since RERs take the form of a provider-hosted app in SharePoint 2013, everything I write here also applies to provider-hosted apps in general. Some changes would be required (to code and configuration steps) to achieve the same for an on-premises farm, unless you have configured that environment to trust ACS. Before we dive in, here’s where we are in my overall app series:

  1. SharePoint 2013 apps – architecture, capability and UX considerations
  2. Getting started – creating lists, content types, fields etc. within a SharePoint app (provisioning)
  3. Working with data in the app web, and why you should
  4. Access end-user data (in the host web) from a SharePoint 2013 app
  5. Rolling out SharePoint 2013 apps to the enterprise - tenant scope and PowerShell installs
  6. Azure is the new SharePoint ‘_layouts’ directory
  7. “Host web apps” – provisioning files (e.g. master pages) to the host web
  8. “Host web apps” – provisioning fields and content types
  9. Deploying SP2013 provider-hosted apps/Remote Event Receivers to Azure Websites (for Office 365 apps) [this article]
  10. Working with web parts within a SharePoint app

Why would I deploy this stuff to Azure?

I think a good option for SharePoint remote code (e.g. some provider-hosted apps, Remote Event Receivers etc.) is to host this code on the “Azure Websites” offering (*update May 2015 - Azure Websites are now known as "Azure Web Apps"*). This is compelling because it’s free, quick and easy to spin up, and you don’t need to provide any in-house servers (with the resulting high availability, scalability, backup/restore and performance work which is required). Another significant factor is that you don’t need to involve your gateway/networks team in getting the website (which hosts your provider-hosted app pages and/or WCF service for your Remote Event Receivers) published (on SSL of course) and accessible externally - e.g. through UAG, or whatever your perimeter device may be. Azure is already outside of your firewall and accessible/addressable over the internet of course, and Azure Websites have automatic SSL support on the default domain (which is *.azurewebsites.net). If you want to use a custom DNS host name instead, then you can do that too, with some extra steps. In general, using Azure Websites is a great way to sidestep many of the infrastructure roadblocks which can derail you.

What about building my app/RER as an auto-hosted app?

* Update early 2015 - auto-hosted apps are no longer available in Office 365!*

It’s a fair question, since not only do auto-hosted apps for Office 365 “just work”, but in fact they actually run on Azure Websites underneath. So why would you not just do that? Well, this works great for demo and proof-of-concept code. But personally I wouldn’t feel comfortable recommending this architecture to a client for production, and I notice others feel the same way. Frankly there is too much “black box” going on with auto-hosted apps – there aren’t really any knobs and dials right now, and technical details are scarce (e.g. scale limits, scale-up possibilities). Also, anyone building apps for the Store will note that auto-hosted apps cannot be sold there, and it’s unclear if they will be in the future.
So, I prefer the more manual approach of deploying to Azure Websites myself. The benefits I get are:
  • I can scale up from Azure Websites (shared) to reserved hardware (e.g. if my app is more heavily used, or uses more processor/memory than anticipated)
  • I can use Azure’s AutoScale capabilities (currently in preview) to do this automatically based on rules I set (e.g. processor thresholds)
  • I have better monitoring of my app
  • I can make changes to the Azure pieces of my app without redeploying the SharePoint pieces
  • I can publish in many different ways (e.g. FTP, WebDeploy, Continuous Deployment from TFS, git etc.)
  • I can examine the files in Azure to do trouble-shooting by opening an FTP client

How does my O365/SP2013 permutation fit in with this article? What other things should I think about?

Permutation | Consideration
Office 365 + app in Azure Websites | The focus of this article. As detailed above, quite similar to auto-hosted apps but with far more control.
On-premises + app in Azure Websites | You’d need to make some changes to the code/process discussed here. Effectively you need to configure high-trust/S2S authentication (rather than OAuth to ACS), and ensure you’re using the correct TokenHelper methods or equivalent custom code. An alternative to S2S could be on-the-fly auth.
Office 365 + app on an on-premises server | Can be attractive because it could easily integrate with other on-premises applications/data. However, usually more complex due to the operational I.T. challenges listed above. Solvable with techniques like Azure Service Bus, BCS, or simply exposing your data through custom services (e.g. WCF) from on-premises to the internet/DMZ.
On-premises + app on an on-premises server | Fairly simple because everything is behind your firewall. Requires S2S authentication configuration or on-the-fly auth.

What kind of code might this be?

Examples of such remote code in SharePoint 2013/Office 365 are:
  • Remote event receivers
    • List events (e.g. ListAdding)
    • ListItem events
    • Web events
  • App events
    • AppInstalling/AppInstalled
    • AppUpgrading/AppUpgraded
    • AppUninstalling/AppUninstalled
  • Other app code – i.e. the entire set of functionality in a provider-hosted app
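To make the Remote Event Receiver piece concrete, the declaration for one typically looks something like the sketch below (the list template ID, names and service URL are all placeholders – in development, Visual Studio uses a ~remoteAppUrl token here instead of an absolute URL):

```xml
<?xml version="1.0" encoding="utf-8"?>
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <!-- Sketch: a Remote Event Receiver on document libraries (template 101) -->
  <Receivers ListTemplateId="101">
    <Receiver>
      <Name>MyRemoteItemAddedReceiver</Name>
      <Type>ItemAdded</Type>
      <SequenceNumber>10000</SequenceNumber>
      <!-- The WCF service hosted on your Azure site -->
      <Url>https://yourapp.azurewebsites.net/Services/AppEventReceiver.svc</Url>
    </Receiver>
  </Receivers>
</Elements>
```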

What you need – a summary

Although a simple F5 deployment will take care of many aspects during development, to properly package/deploy your app for "real use", here are some of the key things you need:
  • A website to be created on Azure Websites
  • A solution which uses a Remote Event Receiver – for the first time you do this, I recommend using something guaranteed to work rather than your own code (to avoid getting the code and/or RER declaration wrong). I used the BasicDataOperations SP2013 provider-hosted app on MSDN
  • To have registered the app with AppRegNew.aspx on your SP2013 environment or Office 365 tenancy – this creates a new App Principal “known” to the environment.
  • Within the app:
    • All the URL references in the app to be updated with absolute URLs pointing to your Azure website e.g:
      • The declaration of the Remote Event Receiver
      • The app start page listed in AppManifest.xml etc.
    • The ID of the remote web application (i.e. the App Id) listed in the AppPrincipal section of the AppManifest.xml
  • For the Visual Studio project which represents the remote web (rather than the app itself):
    • The web.config to be updated with the ClientId and ClientSecret (note that in Azure, it's also possible to specify AppSettings values in the web app configuration, as an alternative to web.config – the steps below use this approach)
    • The project to be published to your Azure website – there are many options for this, but I like WebDeploy (shown below)
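Pulling the AppManifest.xml points together, here’s a hedged sketch of the relevant sections once the absolute URLs and App Id are in place (all names, GUIDs and URLs are placeholders):

```xml
<App xmlns="http://schemas.microsoft.com/sharepoint/2012/app/manifest"
     Name="MyProviderHostedApp"
     ProductID="{55555555-5555-5555-5555-555555555555}"
     Version="1.0.0.0"
     SharePointMinVersion="15.0.0.0">
  <Properties>
    <Title>My provider-hosted app</Title>
    <!-- Start page pointing at the remote web on Azure -->
    <StartPage>https://yourapp.azurewebsites.net/Pages/Default.aspx?{StandardTokens}</StartPage>
  </Properties>
  <AppPrincipal>
    <!-- ClientId = the App Id generated by AppRegNew.aspx -->
    <RemoteWebApplication ClientId="66666666-6666-6666-6666-666666666666" />
  </AppPrincipal>
</App>
```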

The detailed process

Here’s a more detailed run through of the process - I’m focusing very much on the infrastructure/configuration here rather than the code, although I do call out some things:
  1. Create or obtain your solution. As with any SharePoint app with remote components, in the BasicDataOperations app I’m using, there are two Visual Studio projects:
    1. One for the SharePoint app
    2. One for the remote components (i.e. this is the ASP.NET website which hosts the web service for the Remote Event Receiver)
  2. Register your app with AppRegNew.aspx (i.e. the page which can be found at /_layouts/15/appregnew.aspx), and make a note of the App Id and App secret:

    AppRegNew confirmation
  3. Create your Azure Website in the Azure Management Portal – make a note of the URL.
  4. Update the SharePoint app project so that the URL references point to this Azure site:
    1. The RER declaration should look something like this:
      ** N.B. My newer code samples do not show in RSS Readers - click here for full article **
    2. The AppManifest.xml should look something like this:
      ** N.B. My newer code samples do not show in RSS Readers - click here for full article **

      Note that Visual Studio will replace the "~remoteAppUrl" tokens when the app is packaged - a dialog box will appear asking you where the remote web app will be hosted, and the value you enter in the textbox (e.g. your URL in Azure) will be used.

  5. Also ensure the SharePoint AppPrincipal section of AppManifest.xml lists a RemoteWebApplication, with the ClientId attribute set to the App Id (from the app registration):
    ** N.B. My newer code samples do not show in RSS Readers - click here for full article **
  6. [OPTIONAL] If you want to debug your remote code, you should set up a Service Bus instance on Azure and configure your SharePoint app project with it (*update 2015 - this step is no longer necessary - remote debugging in Azure Web Apps now happens without Service Bus configuration*):
    1. Go to the “Service Bus” area within the Azure Management Portal, and create a new namespace for this app:
      App debug - service bus in Azure 
    2. Then click “Connection Information” for this namespace and view the details – copy the connection string:
      App debug - service bus conn string
    3. Finally, in Visual Studio go to the project properties for the app project (not the web project). Go to the SharePoint tab, and scroll to the bottom – ensure “Enable remote event debugging” is checked, and paste the Service Bus connection string into the textbox:
      App debug - configuring VS project
      If you need more information on this, see Update to debugging SharePoint 2013 remote events using Visual Studio 2012.

  7. Publish the ASP.NET website to Azure – I’m using WebDeploy here (*update 2015 - note that things are simpler than this if you have the Azure SDK installed - Web Apps for your Azure subscription will be listed as a target once you click the "Publish..." button in VS. Note that you will need to sign-in to Azure from VS if you aren't already*):
    1. Download the Publish Profile for your Azure Website:

      Download publish profile
      Download publish profile - save 
    2. Publish the app – importing the Publish Profile since this is the first time:
      Publish web app
      Publish web app - import   
      Publish web app - import - select file

      Publish web app - validate settings 
    3. Once the settings in the Publish Profile have been validated (as in the previous image), click “Publish” to deploy to your Azure site. You should then see a success message:

      Publish web app - publish success
      The remote components are now deployed to Azure.

  8. As we noted earlier, in Azure Web Apps AppSettings can either be specified in the web.config file as usual, or in the properties for the web app in Azure. Generally you'd use the former, but to illustrate the mechanism in Azure, here's what that option looks like - you'd go to the Azure portal and select your website. Click CONFIGURE, then scroll down to the AppSettings section – enter the ClientId and ClientSecret here:

    Azure - app settings  
  9. Publish the SharePoint app to Office 365 (by first publishing to the filesystem):

    Publish app 
    You’ll be presented with a dialog similar to the below (*update 2015 - the Client Secret is no longer specified here in later versions of the Visual Studio tools. Simply make sure it's correct in AppSettings for the app*) – ensure the URL and Client ID are correctly specified, and these values will be packaged into the app manifest properly:

    Publish app - enter settings

    Publish app - files generated 

  10. At this point, the app is ready and can be added to the App Catalog in your Office 365 tenancy. Go to the “Apps for SharePoint” library within the associated App Catalog site, and either upload the .app file conventionally, or drag it in:

    Publish app - upload to app catalog
  11. Now the app can be added to a site:

    App install - in site
  12. Once the permission request has been accepted, the app is installed and can be run. Enter the app by clicking on it in the Site Contents page:

    App - in site
..and there is the BasicDataOperations MSDN sample app running in your Azure site, in all its ninja CSS and responsive design glory:

The point, of course, is that you now have remote code running, and a location to host it.


In scenarios where your code must run “off-box” to SharePoint (such as Office 365), Azure Web Apps can provide a much easier way of doing this than with on-premises IIS servers. You can use Azure’s flexibility to scale up from the free option (not resilient) to one of the pay-for options which give a production-grade level of operations.

You have to consider if you are happy for your code to run there and who might "own" the use of Azure within your organization, and maybe some considerations such as authentication/integration with on-premises systems could rule it out for you. Otherwise, it can free you from dealing with lots of infrastructure aspects (especially getting the website published externally), and so is an incredibly useful tool in the toolbox.