Wednesday, 13 April 2016

Office 365 changes - new document libraries, Groups enhancements and new APIs

Microsoft recently made some changes to Office 365 which I think are particularly interesting. Of course, being an “evergreen” service, things are evolving all the time, but these particular cases stand out either because they have a big impact, or because of what they indicate for the future. I’m not sure it’s all good news, but there you go! I previously talked about some of the things now being rolled out back in December 2015 in Enhancements to SharePoint collaboration, Office 365 groups, user profiles, PowerApps and more on the way! (end 2015) – but now they’re here. The ones I’m focusing on in this article are:

  • The new experience for SharePoint document libraries
  • Office 365 Groups – now with “full” SharePoint document library
  • A new approach to synchronizing user data to Office 365
These changes are now available in “First Release” tenants, and will be made generally available soon.

Modern SharePoint document libraries

Many of us knew changes in this area were coming, but the impact is pretty big in my view. At its core, the new doc lib experience is very similar to the user interface previously rolled out to OneDrive for Business sites. The new experience may not be enabled by default (we’ll see), but when it is enabled the ‘All Documents’ view has the option of a ‘Grid’ view to give document libraries a much more visual experience:

Modern doc lib - grid view - small

Note that up to 3 documents can be pinned to the top area of the screen – I really like this, it’s a nice way to highlight key content in a library. When the ‘All Documents’ view is set to ‘list’ instead of grid, the pinned documents continue to be shown but the view is more appropriate to larger numbers of items:

Modern doc lib - list view - small

A big thing for me is that it’s now easy to move or copy a document within SharePoint. It was *hugely* annoying to have to download a document just to do this, but now there’s a simple interface to facilitate this:

Modern doc lib - move and copy
Unfortunately this only seems to support move/copy operations *within* the document library at the moment. I’m really hoping this gets extended, including the ability to copy a document to the same location (e.g. suffixed with “_copy” or similar) as part of a copy/modify kind of flow.
Other notable aspects include a new ‘details’ pane on the right-hand side. This is used for:
  • Viewing/editing properties of an item
  • Seeing recent activity
  • Seeing sharing details (including sharing to new people)
Here’s what it looks like when details are being edited – note the new controls, including a new taxonomy picker:

Modern doc lib - new taxonomy picker

Something else I really like is the view of ‘Recent activity’ across all documents in the library. That should be *very* useful in identifying changes as you land in a doc lib:

Modern doc lib - recent activity

As you’d expect, the user interface is quick and looks good on a mobile device too (shown here in ‘PC’ mode rather than ‘mobile’ mode):


Other bits and pieces include a new toolbar:

Modern doc lib - toolbar
..and a new experience for sorting and filtering:

Modern doc lib - sorting and filtering
And finally, another thing to love is that document libraries can now have *links* to other things (anything):

Modern doc lib - add link 1 Modern doc lib - add link 2 Modern doc lib - add link 3
So, all good right? Well no, I wouldn’t say that..

Modern SharePoint document libraries – the downside

Whilst the new experience is great in many ways, I note the following things which, depending on your circumstances, may not be good:
  • No JavaScript or CSS customization (aside from some JSLink scenarios I believe). Basically there is no way to get your custom JavaScript to execute on this page.
  • No ribbon customizations – so in addition to any you might have implemented, any 3rd party products you’re using (including things like Nintex as far as I can tell?) will not be surfaced.
  • No branding – if the site uses a custom master page, any document libraries using the new experience will not respect it. This means you may lose branding colors and so on (aside from Office 365 themes), but also any custom global navigation your site might be using – which may be pretty vital cross-site navigation.
On this last point, here’s a screenshot showing the confusing user experience – when I navigate from the Site Contents page (on the left) into the doc lib (on the right):

Modern doc lib - no custom master page - UX - small

Overall, one of the most frustrating things for clients here is the lack of communication around this change. Sure, it’s only in First Release tenants at the moment, but the big question is why this item was not conveyed in the Office 365 roadmap. I have no answer on that one unfortunately, but personally I’d love to hear Microsoft’s thinking on it. Let’s hope that the final rollout of this change happens in a way which at least gives some choice and control over how things happen.

Anyway, on to the next change I want to talk about today..

Office 365 Groups enhancements – steps towards “full” document libraries

We all know that Office 365 Groups are the future in many ways. They tie many aspects of Office 365 together – providing conversations, file storage, calendars, a notebook, projects and tasks in Planner and so on. But one of the things holding me back from recommending their usage has been the limitations around files stored there. So far it’s been a cut-down document library (a OneDrive for Business library), and it wasn’t possible to apply content types, change versioning and content approval settings and so on. The content type restriction is particularly limiting – if you’re tagging documents in SharePoint in other locations, you couldn’t do the same here. And that can hurt the search experience: if you have search refiners to help filter results down, these won’t work across your files in Groups because they’re not tagged in that way.

The good news is that things are gradually being unlocked, and Microsoft are taking the first steps to allow content types in the doc lib behind a Group:

Content types in Office 365 group libraries
Unfortunately I notice it’s not all there yet. It’s not actually possible to add a custom content type at this time, because you can’t get to the site settings in order to create one, and furthermore you can’t add a list content type directly. You *can* add custom columns to the out-of-the-box ‘Document’ content type though, so that’s something:


Wait! Full support for content types not there yet..

However, overall the real enterprise support is not there yet – it’s not possible to deploy content types through the content type hub, and I can’t find API coverage which would allow me to deploy content types through code (because it’s not possible to get a reference to the “site” behind an Office 365 Group). So, we still have gaps that prevent using this across many sites/libraries. Let’s hope real support for content types is coming soon, along with the ability to deploy content types in an enterprise way to these libraries.

On that note, what I really hope is coming soon is more than a document library. I’d like to see a full team site become available behind an Office 365 Group (something Microsoft have alluded to) – AND, I want to be able to easily make simple customizations to that site. For example, maybe I’ve got some specific document libraries I want to add, or any number of other tweaks which help my users in their use of the platform. It would be nice if there were some kind of web hook that fired, so I could plug in some code which adds these bits as the site behind the group is created.

We’ll see!

And the final item I wanted to highlight is..

User profile properties – bulk import API in CSOM

The other thing that caught my eye is that Microsoft have made available their Office365Tenant.QueueImportProfileProperties API which I talked about back in December (introduced in CSOM version 16.1.4727). This allows you to update user profiles in SharePoint Online with attributes that aren’t synchronized by the existing native tools such as AAD sync. So if you have some custom user properties such as Division, Department, Country, Favorite Pasta or whatever, you can now sync values into these fields from some existing data you have. It works by reading a JSON file you provide (by uploading somewhere into your tenant) – this gives you the flexibility to *get* the data from anywhere you like, so long as you can generate that file. So, whether your data is in AD, on-premises SharePoint user profiles, SAP or some other HR system, you can implement the code to run on a scheduler every night and sync from the source to the target.
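As a rough sketch of what such a nightly sync job might produce, here’s how source records could be transformed into the JSON file the API reads. The “value”/“IdName” shape follows the format shown in Microsoft’s documentation at the time of writing, but the custom properties (Division, CostCentre) are hypothetical examples – check the current docs before relying on this:

```javascript
// Sketch: build the JSON import file the bulk profile import API reads.
// The "value"/"IdName" shape follows the documented format at the time of
// writing - verify before relying on it. "Division" and "CostCentre" are
// hypothetical custom properties, standing in for whatever your HR system holds.
function buildImportFile(users) {
  var records = users.map(function (u) {
    return {
      IdName: u.email,         // identifier used to match the SPO profile
      Division: u.division,    // custom property value to import
      CostCentre: u.costCentre
    };
  });
  return JSON.stringify({ value: records }, null, 2);
}

// Example: two source records (e.g. exported from an HR system)
var json = buildImportFile([
  { email: "anna@contoso.com", division: "Sales", costCentre: "CC-100" },
  { email: "ben@contoso.com", division: "IT", costCentre: "CC-200" }
]);
console.log(json);
```

The file would then be uploaded to the tenant and its URL passed to the queue call – the point is that any source system works, as long as you can generate this file.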

This is useful because it fills a gap we previously had to deal with ourselves – we implemented a custom solution for some of our clients (see Improving the Office 365 SharePoint hybrid experience). However, it’s nice to know there’s now a Microsoft approach. It’s still just building blocks of code (which you need to implement) rather than a simple configuration switch, but it’s still nice to have. And since it does bulk updates, this should work better in larger organizations with many profiles to update.

For more details, start with Vesa’s very nice article here:


So that’s a quick run down of some recent changes that Office 365 practitioners should be aware of. Of course, there are many more things going on that I’m not covering here – don’t forget the rollout of Delve analytics (which I covered with screenshots in Enhancements to SharePoint collaboration, Office 365 groups, user profiles, PowerApps and more on the way! (end 2015)), the rollout of PowerApps to quickly build simple apps which work on mobile devices, and also more things that will be announced on May 4 at the Future of SharePoint event!

Thursday, 31 March 2016

Office 365 performance – our Azure CDN image renditions solution

In the last post image renditions causing slow page loads in SharePoint Online, I talked about how Office 365/SharePoint Online has some sub-optimal performance around images and image renditions, at least at the present time. Numerous people got in touch to say they also see the same issue. However, we are implementers - and we bring solutions, not problems! So in this post I’ll go into a bit more detail on our way of working around this challenge, and how it can improve page load times.

To recap, the problem is related to the image renditions functionality of SharePoint. This is a useful feature which automatically creates additional versions (sizes) of images in a publishing site such as an intranet. However, when a user hits a page which has these images – often a home page or section page – and they need to be downloaded, we see a big delay of up to 3 seconds. Clearly if a page is taking, say, 5 or 7 seconds to download in total, this is a big chunk of that time. Surprisingly, the delay is NOT the actual image being sent over the wire to the user. Instead, analysis shows the 3 seconds or so pause happens on the server in SharePoint Online – most likely because of “cache misses”, due to the fact that the renditions framework wasn’t originally designed for the architectures used in Office 365. So, performance of this bit of the platform isn’t optimal – our solution was to roll our own renditions framework, and this post describes what we did.

Using Azure to implement renditions

Before delving into the implementation, here’s how I described the process last week:

  1. An intranet author adds or changes an image in SharePoint
  2. A remote event receiver executes, and adds an item to a queue in Azure (NOTE – RERs are not failsafe, so we supplement this approach with a fall-back mechanism which ensures broken images never happen. More on this below).
  3. An Azure WebJob processes the queue item, taking the following steps:
    1. Fetches the image from SharePoint (using CSOM)
    2. Creates different rendition sizes (using the sizes defined in SharePoint)
    3. Stores all the resulting files in Azure BLOB storage
  4. The Azure CDN infrastructure then propagates the image file to different Azure data centers around the world.
  5. When images are displayed in SharePoint, the link to the Azure CDN is used (courtesy of a small tweak to our display template code). Since CDNs work by supplying one URL, the routing automatically happens so that the nearest copy of the image to the user is fetched.
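To make step 2 concrete, here’s a minimal sketch of the queue message the RER might enqueue. The post only says the message carries the URL of the added/modified image, so the exact fields here are my own illustrative assumption:

```javascript
// Sketch of the queue message the RER enqueues for the WebJob to process.
// Only the image URL is described in the post; the other fields are
// illustrative assumptions.
function buildQueueMessage(imageUrl, webUrl) {
  return JSON.stringify({
    imageUrl: imageUrl, // full URL of the image that was added/updated
    webUrl: webUrl,     // web hosting the image library
    queuedAt: new Date().toISOString()
  });
}

var msg = buildQueueMessage(
  "https://contoso.sharepoint.com/sites/intranet/PublishingImages/news1.jpg",
  "https://contoso.sharepoint.com/sites/intranet"
);
console.log(msg);
```

Keeping the message this small is what lets the RER stay lightweight – the heavy lifting all happens later in the WebJob.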

For those interested, let’s go into the major elements:

The Remote Event Receiver and associated SharePoint app/add-in

There are two elements here:

  • A SharePoint add-in used for our remote code (hosted in Azure) to authenticate back to SharePoint
    • We register this add-in using AppRegNew.aspx, specifying a Client ID and client secret which will be used for SharePoint authentication
  • A remote event receiver used to detect when new images are added

The two are related because we implement the RER code as a provider-hosted add-in using the Visual Studio template (which gives 2 projects, one for the app package and one for the app web). In actual fact, this particular RER doesn’t need to communicate back to SharePoint – when it fires, it simply adds an item to a queue in Azure which we created ahead of time. The object added to the queue contains the URL of the image which was just added or modified, and we use the Azure SDK to make the call.

We apply this RER to all the image libraries in the site which needs the solution. We do this simply as a one-off setup task with a PowerShell/CSOM script that iterates through all the subsites, and for each image library it finds it binds the RER. My post Using CSOM in PowerShell scripts with Office 365 shows some similar snippets of code which we extended to do this. The script can be run on a scheduled basis if needed, so that any new image libraries automatically “inherit” the event receiver.

The Azure WebJob

The main work is done here. The job is implemented as a “continuous” job in Azure, and we use an Azure QueueTrigger to poll the queue for new items. This is a piece of infrastructure in Azure that means that a function in our WebJob code is executed as soon as a new item is added to the queue – it’s effectively a monitor. We initially looked at using a BlobTrigger instead (and having the RER itself upload the image to Azure BLOB storage to facilitate this), but we didn’t like the fact that BlobTrigger can have a bigger delay in processing – we want things to be as immediate as possible. Additionally, remote event receivers work best when they do minimal processing work – and since a quick async REST call is much more lightweight than copying file bytes around, we preferred this pattern. When a new item is detected, the core steps are:

  1. Fetch details of the default rendition sizes defined in SharePoint for this site. This tends to not change too much, so we do some caching here.
  2. Fetch details of the *specific* rendition sizes for this image, using a REST call to SharePoint. We need to do this to support the cool renditions functionality which allows an author to specifically zoom-in/crop on a portion of the image for a specific rendition – y’know, this thing:

    Image renditions - crop image

    If an author uses this feature to override the default cropping for a rendition, these co-ordinates get stored in the *file-level* property bag for the item, so that’s where we fetch them from.
  3. Fetch the actual image from the SharePoint image library. We use CSOM and SharePoint add-in authentication to make this call – our WebJob knows the Client ID and client secret to use. We obtain the file bytes i.e. effectively downloading the file from Office 365 to where our code is running (Azure), because of course we’re going to need the file to be able to create different versions of it.
  4. For each rendition size needed:
    1. Resize the image to these dimensions, respecting any X and Y start co-ordinates we found (if the author did override the cropping for this image). There are many ways to deal with image resizing, but after evaluating a couple we chose to use the popular ImageProcessor library to do this.
    2. Upload each file to Azure BLOB storage. We upload using methods in the Azure SDK, and ensure the file has a filename according to a URL convention we use – this is important, because our display templates need to align with this.
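To illustrate step 4.1, here’s a sketch of the cropping decision – use the author’s override co-ordinates when present, otherwise centre-crop to the rendition’s aspect ratio. The field names are illustrative, not SharePoint’s actual property bag keys:

```javascript
// Sketch: work out the source rectangle to crop for one rendition.
// If the author overrode the crop in SharePoint, we use their stored
// co-ordinates; otherwise we take the largest centred rectangle matching
// the rendition's aspect ratio. Field names here are illustrative.
function getCropRect(imgW, imgH, rendition, override) {
  if (override) {
    return { x: override.x, y: override.y, w: override.w, h: override.h };
  }
  var targetRatio = rendition.width / rendition.height;
  var w = imgW, h = Math.round(imgW / targetRatio);
  if (h > imgH) { h = imgH; w = Math.round(imgH * targetRatio); }
  return {
    x: Math.round((imgW - w) / 2),
    y: Math.round((imgH - h) / 2),
    w: w,
    h: h
  };
}

// A 1600x1200 image cropped for a 400x200 (2:1) rendition:
var rect = getCropRect(1600, 1200, { width: 400, height: 200 });
console.log(rect); // centred 1600x800 region
```

The actual resize of the resulting rectangle down to the rendition dimensions is then handed off to the imaging library (ImageProcessor in our case).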

Once the files have been uploaded to Azure BLOB storage, that’s actually all we need to worry about. The use of Azure CDN comes automatically from files stored there, if you’ve configured Azure CDN in a certain way. I’ll cover this briefly later on.

Authentication for the Azure WebJob

I thought long and hard about authentication. In the end, we went with SharePoint app-only authentication, but we also considered using Office 365/Azure AD authentication for our remote code. Frankly that’s my “default” these days for any kind of remote code which talks to SharePoint (assuming we’re talking about Office 365) – as discussed in Comparing Office 365 apps with SharePoint add-ins, there are numerous advantages in most cases, including the fact that there is no “installation” of an add-in, and the authentication flow can be started outside of SharePoint.

However, one advantage of using SharePoint authentication is that we aren’t tied to using the same Azure subscription/directory as the one behind the Office 365 tenant. Our clients may not always be able to support that, and that was important for us in this case – using this approach means we don’t have that dependency.

Display templates

As mentioned previously, a big part of the solution is ensuring SharePoint display templates align with file URLs in the CDN. So if we’re using roll-up controls such as Content Search web parts around the site and these reference rendition images, these also need to “know the arrangement”. Effectively it’s a question of ensuring the thing that puts the file there and the thing that requests the file are both in on the deal (in terms of knowing the naming convention for URLs). It’s here that we also implement the fall-back mechanism (more on this later) to deal with any cases where a requested image isn’t found on the CDN. In terms of swapping out the default behaviour of fetching images from SharePoint to fetching them from the CDN instead, it just comes down to how the value used within the <img src> attribute is obtained:

<img src="_#= imgSrc =#_" />

Simply implement a function to get that value according to your URL convention, and you’re good. Although not shown in the snippet above, it’s here that our fall-back mechanism is called, courtesy of the ‘onerror’ handler on the <img> tag.
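As a sketch of what such a function might look like – the container name and naming pattern here are assumptions, since what matters is only that the WebJob (uploading) and the display template (requesting) derive the same URL from the same inputs:

```javascript
// Sketch of a shared URL convention. The "renditions" container and the
// "_r<id>" suffix are illustrative assumptions - any convention works as
// long as the upload side and the display template side both use it.
function getCdnRenditionUrl(cdnHost, imageUrl, renditionId) {
  // e.g. ".../PublishingImages/news1.jpg" -> name "news1", extension "jpg"
  var file = imageUrl.substring(imageUrl.lastIndexOf("/") + 1);
  var dot = file.lastIndexOf(".");
  var name = file.substring(0, dot);
  var ext = file.substring(dot + 1);
  return "https://" + cdnHost + "/renditions/" + name + "_r" + renditionId + "." + ext;
}

var src = getCdnRenditionUrl(
  "cdn.contoso.example", // hypothetical CDN endpoint hostname
  "https://contoso.sharepoint.com/sites/intranet/PublishingImages/news1.jpg",
  4 // rendition ID
);
console.log(src);
// https://cdn.contoso.example/renditions/news1_r4.jpg
```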


Since we’re talking about architecture pieces, there’s some WebAPI thrown in there too – this is part of the fall-back mechanism, described later.

Azure CDN configuration

As mentioned earlier, the CDN part is easy with Azure. When a file gets uploaded to Azure BLOB storage, it gets a URL in the form:


..but if you configure the CDN, it can also be accessed on something like:


When the latter URL is used, the file will in fact be requested from the nearest Azure CDN data center to the user. If the file hasn’t propagated to that location yet, then the first user to be routed through that location will cause the file to be cached there for other users in that geographical region. Our testing found this additional delay is minimal. There are a few more CDN things to consider which I won’t go into detail on here, but initial configuration is easy – simply create a CDN configuration in Azure, then specify that it is backed by Azure storage and select the container where you’re putting your files. The images below show this process:


Create CDN endpoint from BLOB storage

The fall-back mechanism

So I mentioned a few times what we call “the fall-back mechanism”. I was always worried about our solution causing a broken image at some point – I could just imagine this would be on some critical news article about the CEO, on a big day for one of our clients. Fortunately, we were able to implement a layer of protection which seems to work well. In short, we “intercept” a broken image using the HTML 5 ‘onError’ callback for the <img> tag. This fires if an image isn’t found on the CDN for any reason, and this kicks off our mechanism which does two things:

  1. Substitutes the original rendition image from SharePoint - this means we’re “back to the original situation”, and we haven’t made anything worse.
  2. Makes a background async call to our WebAPI service – this adds an item to our queue in Azure, meaning the image gets processed for next time. This is the same as if the RER fired against this particular file.
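A minimal sketch of how that interception might look – the `data-sp-src` attribute and the WebAPI endpoint are hypothetical names, not the actual implementation:

```javascript
// Sketch of the fall-back: if the CDN image 404s, swap the <img> back to
// the original SharePoint rendition URL and fire a background call so the
// image gets queued for processing next time. Names are illustrative.
function onCdnImageError(img, notify) {
  var original = img.getAttribute("data-sp-src"); // original SharePoint URL
  img.onerror = null;   // avoid an error loop if the original also fails
  img.src = original;   // step 1: user still sees the image
  (notify || notifyMissing)(original); // step 2: queue it for next time
}

function notifyMissing(imageUrl) {
  // Background async call to a (hypothetical) WebAPI endpoint which adds
  // the image to the Azure queue - same effect as the RER firing.
  var xhr = new XMLHttpRequest();
  xhr.open("POST", "/api/renditions/enqueue", true);
  xhr.setRequestHeader("Content-Type", "application/json");
  xhr.send(JSON.stringify({ imageUrl: imageUrl }));
}
```

In the markup this would be wired up as something like `<img src="[CDN URL]" data-sp-src="[SharePoint URL]" onerror="onCdnImageError(this)" />`.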

The image below shows what happens (click to enlarge):

CDN image renditions - fallback mechanism 2

One nice thing about this mechanism is it works for existing images in a site. So if the mechanism is implemented in an existing site with lots of images, there’s no need to go round “touching” each image to trigger the remote event receiver. Instead, all that needs to happen is for someone to browse around the site, and the images will be migrated to the CDN as they are requested.

Challenges we encountered

Along the way we faced a couple of challenges, or at least things to think about. A quick list here would include:

  • Thinking about cache headers from Azure CDN, cache expiration and so on – this relates to scenarios where an author may update an image in SharePoint but not change the filename. Clearly end-user browsers may cache this image (and an end-user can’t be expected to press CTRL F5 to do a hard refresh just because you’ve updated a file!). My colleague Paul Ryan wrote a great post on this at Azure CDN integration with SharePoint, cache control headers max-age, s-maxage
  • Parallel uploads to Azure (e.g. if we’re creating 8 different sizes for each image found, we may as well upload them in parallel!)
  • Ensuring we understand how to handle different environments (dev/test/production tenants with different Azure subscriptions)
  • Implementing a nice logging solution
  • Testing
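On the first point above – cache expiration – a tiny sketch of the browser-side behaviour that makes it tricky: until max-age expires, the cached copy is served without revalidating, regardless of what has changed at the origin:

```javascript
// Sketch: why an updated-but-same-name image can look stale. A browser
// (or CDN edge) serves its cached copy without revalidating until
// max-age expires, regardless of changes at the origin.
function isServedFromCache(fetchedAtMs, maxAgeSeconds, nowMs) {
  return (nowMs - fetchedAtMs) / 1000 < maxAgeSeconds;
}

var fetched = Date.parse("2016-03-01T09:00:00Z");
// With max-age=86400 (24h), a request 6 hours later never hits the origin:
console.log(isServedFromCache(fetched, 86400, fetched + 6 * 3600 * 1000));  // true
// ...but a request 25 hours later does:
console.log(isServedFromCache(fetched, 86400, fetched + 25 * 3600 * 1000)); // false
```

This is why versioning filenames (or choosing the max-age/s-maxage values carefully) matters when authors can overwrite an image in place.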


As I summarized last time, it would be great if the original performance issue in Office 365 didn’t occur. But CDNs have always been useful in optimizing website performance, and in many ways all we’re doing is broadening Microsoft’s existing use of CDNs behind Office 365. The building blocks of Azure WebJobs, Azure BLOB storage, CDN, SharePoint add-ins, remote event receivers, WebAPI and so on mean that the Office 365/SharePoint Online platform can be extended in all sorts of ways where appropriate. This was a solution developed for clients of Content and Code so it’s not something I can provide the source code for, but hopefully these couple of articles help raise awareness of the issue and the architectural details of one way of working around it.

Tuesday, 22 March 2016

Office 365 performance – image renditions causing slow page loads in SharePoint Online

Just like any website, there are many reasons why page load times might not be amazing in SharePoint Online. Perhaps it’s a page with too many ‘heavy’ controls (e.g. search web parts), a particularly slow custom control, the amount of data going over the wire (e.g. due to large images or JavaScript/CSS files), use of a known performance killer such as structural navigation, or maybe things are slow from the office due to network infrastructure such as reverse proxies slowing things down. If users are far away from where the Office 365 tenant is located, that can certainly exacerbate things. As always, if the site has any kind of customization, some optimization steps need to be taken – good performance won’t always happen by default. Recently however, we’ve been noticing slow page loads even in:

  • Sites we have optimized
  • Out-of-the-box publishing sites

Analysis showed that the issue was related to image renditions in SharePoint. If you’re not familiar with this feature, it does something useful which, ironically, is intended to improve site performance. For each image uploaded, multiple resized versions are automatically created in the background – the idea is that end-users don’t download a large ‘original’ image when only a tiny thumbnail is needed. A classic example is large images added to content pages, which are then shown as a list of rolled-up links on a home page e.g. “most recent news articles”.

Performance issue aside, the principle works well – a 4MB image is typically shrunk to around 200KB for a typical rendition size, and that’s a lot less data being downloaded to users. I’m not sure if anything has changed recently in Office 365 (since most sites I’ve been involved in use image renditions), but a couple of clients noticed the issue around the same time we did. Specifically, pages with renditions are slow on “first-time” page loads i.e. whenever the images need to be downloaded because they are not served from the local browser cache. But unfortunately it’s not just that – rendition image files are served with expiry headers of 24 hours, meaning even regular users will have at least one very slow page load every 24 hours. And we might not just be talking about their home page – rather, every page they hit could be slow once every 24 hours. That’s certainly enough to damage user perceptions of Office 365.

Sidenote – first-time page loads vs. returning user page loads
So renditions are slow even outside of first-time page loads. But whilst we’re on the subject, how much do we need to care about first-time page loads anyway? I typically advise my clients not to worry too much – frankly, Office 365 will always be slow here since there are some pretty heavy JavaScript files that need to be downloaded (even if they do come from a CDN). It’s a rich, highly-functional platform after all. But this is very much part of the Office 365 design – I believe Microsoft take the view that users are forgiving, so long as their *subsequent* browsing experience is quick. Most users don’t have the same expectations of their corporate intranet/collaboration platform as they do of public consumer sites such as Facebook and Google – and since most intranet usage is *not* first-time page loads, things work out in the end. I agree with this viewpoint frankly.

But why are image renditions slow?

When further analysis is performed, we see the delay happens on the server – the big surprise is that the delay is not for the actual image to be downloaded, which is what you’d normally expect. The following image shows a real home page being loaded, and we can see a delay of multiple seconds for many images (each one being a “rendition” image), as indicated by the long green bars:

Rendition image delays_Small

When we dig deeper, we see the delay is not in the content download, but is in the “waiting” stage – this indicates the delay is with Office 365 itself, and not in the actual file being downloaded:

Rendition image delays - detail_Small

We believe this happens because there are typically “cache misses” on rendition images being served from the BLOB cache. When the image is not served from the BLOB cache, the SharePoint Online infrastructure is very slow to process and serve the image. It seems that cache hits are very rare for end-users – possibly due in part to BLOB cache settings in SPO (e.g. the disk size allowed), but more likely due to the sheer number of front-end servers in a typical SPO farm. I’m told that some of the larger farms in the service now have between 100 and 200 front-end servers – clearly this is a very different situation to the on-premises environments SharePoint was originally designed for. So whilst the renditions architecture would be very effective in a typical on-prem farm of, say, 3-6 front-end web servers, in the Office 365 world this is not the case. Of course, if you work closely with the product you sometimes see examples of things like this that didn’t quite translate perfectly to the cloud world. That said, having worked on many deployments I’m always amazed at just how well SharePoint does work as a service (no doubt due to some hard work from talented Microsoft engineers, including some folks I know) – but there will always be “opportunities for improvement”, and the service often evolves to include these.
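A back-of-the-envelope model (my own simplification, not Microsoft’s numbers) shows why farm size matters so much here. If each request lands on a random front-end and a server caches an image after serving it once, the chance of hitting a warm cache after k earlier requests is roughly 1 − ((N−1)/N)^k:

```javascript
// Back-of-the-envelope sketch (a simplifying assumption, not measured
// data): requests are spread uniformly across N front-ends, and a
// front-end caches an image after serving it once.
function warmCacheProbability(frontEnds, priorRequests) {
  return 1 - Math.pow((frontEnds - 1) / frontEnds, priorRequests);
}

// On-prem scale: 4 front-ends, 20 earlier requests -> near-certain hit
console.log(warmCacheProbability(4, 20).toFixed(2));   // "1.00"
// SPO scale: 150 front-ends, 20 earlier requests -> mostly misses
console.log(warmCacheProbability(150, 20).toFixed(2)); // "0.13"
```

Even under these generous assumptions, a 100-200 server farm makes cache misses the common case for any given image.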

Our solution (based on Azure CDN)

So what can we do about it? Well, we could just avoid using SharePoint image renditions completely, but then performance would still be poor due to large files being downloaded to users. So we definitely do want to use different image sizes – and since the core work of resizing images isn’t that hard, why not do it ourselves? We could then take advantage of other things, like some automatic use of a CDN such as Azure CDN (N.B. that link explains what a CDN does if you’re not familiar). This is the direction we took to work around the performance issue in SharePoint Online. Things work pretty well, and what we implemented has the following benefits:

  • The Office 365 renditions delay does not occur
  • There is no impact to intranet end-users (except the improved performance)
  • Intranet authors do not need to do anything different or have additional training
  • The image files are hosted in Azure CDN (Content Delivery Network), which places the files in various Azure datacentres around the world, to ensure they are close to the user. This can significantly boost performance further for some users, especially those far away from the Office 365 datacentres (e.g. non-European users in the case of most of our clients)
  • All of the capabilities of the image renditions framework are supported (e.g. the ability for an author/administrator to crop an image rendition so that a certain portion of the image is used)

Technical architecture of our solution

I’ll go into more detail in the next post, but briefly our solution works as follows:

  1. An intranet author adds or changes an image in SharePoint
  2. A Remote Event Receiver executes, and adds an item to a queue in Azure (NOTE – RERs are not failsafe, so we supplement this approach with a fall-back mechanism which ensures broken images never happen. More on this next time).
  3. An Azure WebJob processes the queue item, taking the following steps:
    1. Fetches the image from SharePoint (using CSOM)
    2. Creates different rendition sizes (using the sizes defined in SharePoint)
    3. Stores all the resulting files in Azure BLOB storage
  4. The Azure CDN infrastructure then propagates the image file to different Azure datacentres around the world.
  5. When images are displayed in SharePoint, the link to the Azure CDN is used (courtesy of a small tweak to our display template code). Since CDNs work by supplying one URL, the routing automatically happens so that the nearest copy of the image to the user is fetched.

This is depicted below:

CDN image renditions for SPO

Fall-back mechanism

Clearly it’s critical that an intranet home page never displays a broken or missing image – that would be A Very Bad Thing for most organizations. So how can we guard against that? Also, we said that Remote Event Receivers cannot be 100% reliable (and also should not do “heavy” processing work) – so what about that? And what about existing images in a site that was running before a solution like this is implemented? My colleague Paul Ryan and I wrestled with these challenges and more as we architected the solution and wrote the code – I’ll talk more about the fall-back mechanism (which takes care of these aspects) and go into more detail on the technical implementation in the next post.


It would be great if this issue didn’t exist in Office 365 in the first place, of course. But I write this post to show that, with the right building blocks, we can certainly supplement Office 365/SharePoint functionality with some effort. This was a solution developed for clients of Content and Code, so it’s not something I can provide the source code for, but hopefully this write-up raises awareness of the issue and potential ways of working around it. More next time…

Thursday, 25 February 2016

Get started with Office 365/SharePoint Online dev – part 2: Developing SharePoint Add-ins (apps)

This is the second post around getting started with Office 365/SharePoint Online development. In these couple of articles, I discuss a process where developers new to this space can get started by using trial environments and an Azure virtual machine. The idea is that you don’t need an MSDN subscription, existing Office 365 environment or even a development machine to get running – in the last post, we created all those things. Now it’s time to use them. Specifically, we’ll get some add-in code running in the Azure VM, which talks back to SharePoint Online - we’ll do this by obtaining a sample app and configuring our dev environment to run it. In my opinion, this is a good way to learn about modern SharePoint development which uses “provider-hosted” remote code – which, of course, can be used on-premises or in the cloud. Before we get started, a reminder on the contents of these two articles:

  1. Get started – part 1: Create trial environments and a VM
  2. Get started – part 2: Developing SharePoint Add-ins/apps (this post)

Deploying a provider-hosted SharePoint Add-in in your VM

Here’s an overview of what we’ll cover here:

  • Create a Developer Site in Office 365
  • Complete some information in your user profile in SharePoint Online (since our lab demo will use it)
  • Prepare the development environment to host apps locally:
    • Create a local IIS website to run apps
    • Create a self-signed SSL certificate and bind it to the site
  • Register the SharePoint Add-in using AppRegNew.aspx
  • Download the “app script part” SharePoint Add-in sample from GitHub (from the Microsoft OfficeDev Patterns and Practices library), and run it locally in the VM
  • Review how the solution is working

Here are the steps to go through…

Create a developer site collection in Office 365

Navigate to the SharePoint admin area of Office 365, and go to the “Site Collections” page. Click “New” to create a new site collection:

Use settings like the below:

Once the site has been created, check it can be accessed in the browser:

Enter some details into your user profile in Office 365/SharePoint Online

In any SharePoint page in your Office 365 tenant, click the user photo in the top right corner, and then click the “About me” link to go to the profile page:

Click the “Edit profile” link:

Enter some details into fields such as “About me” and “Ask Me About”. Also use the “Change your photo” link to change your photo:

Once done, click “Save all and close”.

Create a new site in IIS for app hosting

Open IIS Manager, and create a new website. Use the following settings:

| Config item | Value |
| --- | --- |
| Name | spsites |
| Path | C:\inetpub\wwwroot\spsites (N.B. the “spsites” folder will need to be created) |
| Host name | spsites |

The Add Website dialog should look like this:

Click “OK” to finish creating the IIS site.

Add the site to your hosts file:
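For reference, the entry in C:\Windows\System32\drivers\etc\hosts simply points the “spsites” host name at the local machine:

```
127.0.0.1    spsites
```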

Create a self-signed SSL certificate and apply it to the site

Open a PowerShell window as an administrator. Create a self-signed cert using the following command:

New-SelfSignedCertificate -DnsName spsites -CertStoreLocation cert:\LocalMachine\My

Now we need to install the certificate as a Trusted Root certificate in the store, and apply it to the “spsites” IIS website. The following steps are used:

  1. Certificate is exported to a file.
  2. Certificate is installed from the file on the filesystem to the “Trusted Root Certification Authorities” cert store.
  3. Certificate is applied to the IIS website.
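If you prefer to script these three steps rather than click through IIS Manager, something like the following PowerShell should work (the certificate lookup assumes only one cert in the store has the “spsites” subject – adjust if you have more):

```powershell
# Find the self-signed cert created earlier (assumes a single match)
$cert = Get-ChildItem cert:\LocalMachine\My | Where-Object { $_.Subject -eq 'CN=spsites' }

# 1. Export the certificate to a file
Export-Certificate -Cert $cert -FilePath C:\spsites.cer

# 2. Install it into the Trusted Root Certification Authorities store
Import-Certificate -FilePath C:\spsites.cer -CertStoreLocation cert:\LocalMachine\Root

# 3. Create the HTTPS binding on the IIS site and attach the cert
Import-Module WebAdministration
New-WebBinding -Name "spsites" -Protocol https -Port 443
$binding = Get-WebBinding -Name "spsites" -Protocol https
$binding.AddSslCertificate($cert.Thumbprint, "My")
```

The GUI steps below achieve exactly the same result.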

Go to “Server Certificates” in IIS, and find the certificate you just created. Follow the sequence in the steps below to export the certificate to a file:

The certificate should now be exported as spsites.cer.

Now find the file on the filesystem, and right-click > “Install Certificate”:

Follow the process in the images below to install the certificate to the Trusted Root cert store:

The final step around IIS and SSL certificates is to apply the certificate to the site. Find the website in IIS, right-click on it and select “Edit Bindings…”:

Add a new site binding on port 443 – select the “spsites” certificate:

Click “OK”, and the certificate should now be applied to the site. You should be able to browse it on https://spsites (although note you will get a default IIS page, since there is no site running there at this point).

Obtain the “app script part” sample from the Microsoft OfficeDev Patterns and Practices library

Navigate to - click the “Download zip” link:

Save the zip file to C:\Code on your virtual machine (create the folder since it won’t exist already), and then unzip:

Now run Visual Studio 2015 as an administrator. Note that it can take some time to open in an Azure virtual machine.

In Visual Studio, use “File > Open > Project/Solution…” to open the “App Script” sample from the zip in Visual Studio – this can be found at:


Once the project has opened, enter the URL for your Office 365 developer site in the Site URL property of the main project in Visual Studio:

A dialog box should appear – enter the credentials for your Office 365 identity:

Visual Studio should now be signed-in to your developer site in Office 365.

Configure the web project to match your hosting environment

Go to the properties page for the “Core.AppScriptPartWeb” project, and go into the “Web” area. Change the settings to match the IIS site we created, and click the “Create Virtual Directory” button to allow the files to be hosted in their dev location:

Save the Visual Studio project.

Also edit the userprofileinformation.webpart file in the project – find the line which sets the URL of the JavaScript file being linked by the web part in the sample, and set it to https://spsites/Core.AppScriptPartWeb/scripts/userprofileinformation.js:
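The edited line is the script reference inside the web part definition – roughly of this shape (abbreviated and illustrative; the exact markup in the sample may differ):

```xml
<!-- userprofileinformation.webpart (sketch) - the Content property of the
     Script Editor web part holds an encoded script tag pointing at our JS -->
<property name="Content" type="string">
  &lt;script type="text/javascript"
    src="https://spsites/Core.AppScriptPartWeb/scripts/userprofileinformation.js"&gt;&lt;/script&gt;
</property>
```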

Create the registration for your Add-in in SharePoint Online

In a browser window to your developer site, navigate to the following URL:

[site]/_layouts/15/AppRegNew.aspx – in my case, this is:

Click the “Generate” button to generate new values for both the Client ID and Client Secret fields, then complete the other information as below:

On the next screen, be sure to copy the Client ID and Secret somewhere safe – you’ll need these later:

Now return to Visual Studio and perform the following steps:

Open the web.config file in the web project, and update the ClientId and ClientSecret app settings values:

Open the AppManifest.xml file in the app project, and update the ClientId attribute:
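The two edits look like this – the GUID and secret shown are placeholders for the values you copied from AppRegNew.aspx:

```xml
<!-- web.config (Core.AppScriptPartWeb project) -->
<appSettings>
  <add key="ClientId" value="00000000-0000-0000-0000-000000000000" />
  <add key="ClientSecret" value="your-client-secret-from-appregnew" />
</appSettings>

<!-- AppManifest.xml (add-in project) -->
<AppPrincipal>
  <RemoteWebApplication ClientId="00000000-0000-0000-0000-000000000000" />
</AppPrincipal>
```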

Your add-in should now be configured to run in development – in the next step we will run the app to test it.

Run the add-in

Press F5 in Visual Studio to run the project. Enter the credentials for your Office 365 identity if prompted:

You should be prompted to trust the app:

Click “Trust It”, and you should then be taken to the default page for the add-in. Note that this page is not a key piece of the add-in – it simply uploads the web part in an automated way to make it available.

To do this, click the “Run scenario” button:

Now click the “Back to Site” link:

Edit the page, and add the web part deployed by the add-in to the page. Go to add a web part, and you’ll find it in the “Add-in Script Part” category:

Once the web part has been added to the page, it should show details from your user profile (including a photo if you have one – I don’t in this case!):

If you got this far, well done! Now let’s consider what we just got working.

Key takeaways

  • We implemented a web part which is truly cloud-friendly – it has NO server-side code, unlike a legacy farm-solution web part.
    • In fact, it’s just an out-of-the-box Script Editor web part – pointing to a particular JavaScript file, which implements the actual functionality
  • The JavaScript file has some CSOM code to fetch the user profile details
  • The JavaScript file is actually hosted remotely from SharePoint. In the case of our development scenario, it was running in IIS – but we could publish it to Azure or similar for production use (remember we’d need a new app registration with the appropriate URL etc.)

The key bit is essentially some simple JSOM code to fetch user profile details and output them to HTML:
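A minimal sketch of that pattern is below – the function names and rendering markup are illustrative rather than the sample’s actual code, and the JSOM portion only runs inside a SharePoint page where SP.js and SP.UserProfiles.js have loaded:

```javascript
// Pure rendering step - takes a map of profile properties and builds markup.
function renderProfileHtml(props) {
  var html = '<div class="profile">';
  html += '<h3>' + props.DisplayName + '</h3>';
  html += '<p>' + (props.AboutMe || '') + '</p>';
  html += '</div>';
  return html;
}

// JSOM retrieval step - fetches the current user's profile properties,
// then hands the rendered HTML to a callback.
function loadMyProfile(onDone) {
  var ctx = SP.ClientContext.get_current();
  var peopleManager = new SP.UserProfiles.PeopleManager(ctx);
  var me = peopleManager.getMyProperties();
  ctx.load(me);
  ctx.executeQueryAsync(function () {
    onDone(renderProfileHtml({
      DisplayName: me.get_displayName(),
      AboutMe: me.get_userProfileProperties()['AboutMe']
    }));
  }, function (sender, args) {
    console.log('Profile request failed: ' + args.get_message());
  });
}
```

The important point is the split: all the SharePoint interaction happens client-side via JSOM, and the output is plain HTML injected into the page by the Script Editor web part.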


So that’s an introduction to the world of remote code and cloud-friendly development in SharePoint. Hope someone found it useful!