Wednesday, 17 June 2015

Implementing AD integration with Office 365 using a sub-domain (for dev/test)

As discussed in Challenges in Office 365 development – and how to address them, it’s fairly common to create multiple Office 365 tenancies to get around the fact that there’s currently no such thing as a “test environment” in Office 365. However, as the previous post discusses in detail, some trade-offs come with this method – generally related to environmental differences, code issues, testing, Application Lifecycle Management and so on. This article series provides some options and techniques which might help.

Summarizing the problem

Non-production tenants used for dev and test typically lack some of the aspects of the production environment – things like:

  • Directory integration (i.e. users sign-on with their “real” Windows account – e.g. “chris@mycompany.com” rather than “onmicrosoft.com” cloud accounts)
  • Synchronization of user profile data between AD and Office 365
  • Yammer SSO (or Yammer in general!)

And this leads to a long list of compromises – from not being able to test the real user experience properly, to sometimes having to write code to use a different “mode” in dev/test compared to production (perhaps because user account names work in different ways, or profile data isn’t populated).

Background – using one AD and multiple subdomains for Office 365 integration

So, we’ll look at one option for mitigating this issue – a technique which allows configuring non-production Office 365 tenancies with full directory integration and synchronization of user account data, but WITHOUT the need for a separate custom domain and Active Directory for each (and all that entails). This post is effectively a big HOW-TO article which describes the configuration process. In a team working on perhaps 15+ Office 365 implementations at any one time, you can probably see the attraction of this technique for us! Frankly, it can be challenging in the enterprise to get a test AD and domain set up which can be used with test Office 365 environments – with servers, networks, AD, security and operations all involved, quite a few people across I.T. might be needed there. So, the idea of having ONE Active Directory which could be used for directory integration with MULTIPLE Office 365 environments can be useful for lots of different teams in many different contexts.

What we're doing here
Ultimately the way we achieve this is to perform the regular “custom domain integration with Office 365” config steps, but with a couple of tweaks. Office 365 administrators and platform people will be very familiar with the standard process, and it’s a key element of hybrid SharePoint/Office 365 configuration. However, most developers will be less familiar with this stuff – but I personally think it’s hugely beneficial for any technical person working with Office 365 to have been through this process at least once.

At this point, I need to make clear that a lot of the smart thinking behind this approach actually comes from one of my colleagues at Content and Code – I’m really just being the mouth-piece here ;) Tristan Watkins is Head of Research and Innovation at Content and Code, and Tristan did the initial work of proving the approach – awesome work as usual.

What we actually do is use a sub-domain for each different Office 365 tenancy we want to integrate with the Active Directory/domain. So this could be:

  • Client1.MyCompany.com
  • Client2.MyCompany.com

Of course, you might substitute “Client” with “Project”, or anything else that you’re using different tenancies for.

Or perhaps:

  • Dev.MyCompany.com
  • Test.MyCompany.com
  • UAT.MyCompany.com

..and so on.

Ultimately where we’re going with this is that we’ll have one AD which can provide accounts to multiple tenancies. Whilst it may not be appropriate for production, this can really help us with our mission to make dev/test environments more like production. User accounts in Office 365 don’t *have* to come from AD when cloud identities are used (more on this later), but it can be useful to stay close to the model that you’ll use in production and implement sync between AD and O365.

Things you’ll need

You’ll need each item in the list below – if you don’t already have them, you’ll need to purchase/obtain them before starting:

  • An Office 365 tenant ready and provisioned
  • The top-level domain (e.g. MyCompany.com) – this needs to be something you/your organization already owns and has registered
  • Ability to manage DNS for the domain (usually at the ISP hosting the domain)
  • Ability to create user accounts in the AD you’ll use

At a high level, we go through the usual steps to add a custom domain to an Office 365 tenancy – I’m using GoDaddy to host my domain/DNS, and Office 365 offers some useful integration/simplified admin with this and some other hosters. However, with the sub-domain approach, even if we ARE using GoDaddy to host our domains, we’ll need to configure our DNS records manually rather than being able to click any of the “magic buttons” to buy or configure the domain (like this one):

GoDaddy shortcut

So, just be aware that although Office 365 has numerous “simplified admin” options if you’re using GoDaddy, for the most part we’ll be ignoring these and doing some extra configuration ourselves.
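As an aside, the add/verify flow can also be scripted rather than clicked through. Here’s a minimal sketch using the Azure AD (MSOnline) PowerShell module – it assumes the module is installed, and the sub-domain name is just my example:

```powershell
# Sketch only: add and verify the sub-domain via the MSOnline module (domain is an example)
Import-Module MSOnline
Connect-MsolService                                   # prompts for tenant admin credentials

New-MsolDomain -Name "cob.chrisobrien.com"            # register the sub-domain with the tenancy

# Get the TXT record value to create at your DNS hoster
Get-MsolDomainVerificationDns -DomainName "cob.chrisobrien.com" -Mode DnsTxtRecord

# ...add the TXT record at the hoster, wait for DNS to propagate, then:
Confirm-MsolDomain -DomainName "cob.chrisobrien.com"
```

The rest of this post sticks with the browser wizard, since the screens make each step clearer.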

HOW-TO: configure Office 365 domain integration with a sub-domain

For reference, in this walkthrough I’m using the following:

  • Office 365 domain: cobsp – so the tenancy URL is cobsp.sharepoint.com, and usernames are of the form user1@cobsp.onmicrosoft.com
  • Custom domain (top-level): chrisobrien.com
  • Subdomain used for this tenancy: cob.chrisobrien.com

So here’s the process. Firstly we go to the Office 365 admin center, and then into the “Domains” area. Once there, we click the “Add domain” button on the “Manage domains” screen:

Add domain

We then start to step through the configuration of adding a custom domain to Office 365:

[screenshot]

Hit the “Let’s get started” link on this screen. On the next screen (below), we should enter the subdomain/domain we are intending to use for this Office 365 tenant – as mentioned previously, we’re going to need to add DNS records there so we should be ready for that.

[screenshot]

Once the domain is entered, hit “Next” and see the screen below. Here, you should NOT sign in to GoDaddy and have Office 365 attempt the config for you. Instead, the DNS records need to be created manually, so click the “use a TXT record” link as shown below:

[screenshot]

This next screen gives details of the TXT record you’ll need to add at your hoster (GoDaddy in my case):

[screenshot]

Doing this helps prove that you do indeed own and control the domain, and aren’t trying to pull some sneaky internet trick. So, we now need to go to the control panel of the ISP hosting our domain – specifically the DNS area. At GoDaddy, it looks like this:

[screenshot]

I use the “Add Record” link to add a TXT record with the details Office 365 gave me. Once done, I wait a while and then go back to Office 365 and confirm I’ve now done this step. If I added the TXT record as specified and DNS has propagated, I should see something like the following:

[screenshot]
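If you’d like to check propagation yourself before clicking through, here’s a quick sketch using Resolve-DnsName – the record value shown in the comment is just the general shape, use whatever Office 365 gave you:

```powershell
# Sketch: check the verification TXT record has propagated (querying a public DNS server)
Resolve-DnsName -Name "cob.chrisobrien.com" -Type TXT -Server 8.8.8.8 |
    Select-Object Name, Strings
# Expect a record whose value looks something like "MS=msXXXXXXXX"
```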

I click the “Next” button here. This will update the e-mail addresses of any existing users – from user1@cobsp.onmicrosoft.com to user1@cob.chrisobrien.com in my case:

[screenshot]

You can then optionally add some additional users (manually). If you plan to sync users from an on-premises AD using AAD Sync you probably won’t use this option, but for dev/test environments you might want to add a couple more users at this stage (assuming you have licenses for them):

[screenshot]
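If you’d rather script this step than click through the wizard, something like the sketch below creates a licensed test user on the new sub-domain – the UPN, usage location and SKU name are all examples from my tenancy:

```powershell
# Sketch: create a licensed test user on the sub-domain (UPN/SKU/location are examples)
Get-MsolAccountSku                                    # lists available license SKUs, e.g. "cobsp:ENTERPRISEPACK"

New-MsolUser -UserPrincipalName "user2@cob.chrisobrien.com" `
             -DisplayName "Test User 2" `
             -FirstName "Test" -LastName "User2" `
             -UsageLocation "GB" `
             -LicenseAssignment "cobsp:ENTERPRISEPACK"
```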

Now we get to the important part – adding the main set of DNS records to support the integration with your custom domain:

[screenshot]

Office 365 asks you which services you plan to use/integrate, so that you only need to worry about the relevant DNS records:

[screenshot]

On the screen which follows, there’s another one of those super-handy shortcut links that unfortunately we can’t use (because we’re doing the non-standard thing of using a sub-domain) :) Skip the “Add records” easy button, and instead click the “add these records yourself” link:

[screenshot]

The screen which follows lists all the DNS records you need to add at your hoster. You now need to cross-reference this screen with the DNS panel there – for each item listed, configure the DNS record in the admin panel. The key difference compared to the standard process is that the sub-domain is specified in each DNS record, rather than just the top-level domain:

[screenshot]
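Once the records are in place, you can spot-check a couple of them from PowerShell. The targets mentioned in the comments are the usual Office 365 values at the time of writing, but go with whatever the screen above listed – this is just a sketch:

```powershell
# Sketch: spot-check some of the sub-domain DNS records after adding them at the hoster
# MX for the sub-domain (target typically ends in .mail.protection.outlook.com)
Resolve-DnsName -Name "cob.chrisobrien.com" -Type MX -Server 8.8.8.8

# Autodiscover CNAME for the sub-domain (typically points at autodiscover.outlook.com)
Resolve-DnsName -Name "autodiscover.cob.chrisobrien.com" -Type CNAME -Server 8.8.8.8

# SPF TXT record for the sub-domain
Resolve-DnsName -Name "cob.chrisobrien.com" -Type TXT -Server 8.8.8.8 | Select-Object Strings
```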

Here’s an example of adding a record in the GoDaddy DNS admin panel:

[screenshot]

Once you’ve added all the records, go back to Office 365 and click that “Okay, I’ve added the records” link. Office 365 will now check the records you added, and if successful you should see a confirmation message as shown below:

[screenshot]

Your custom domain is now integrated, and the status should be reflected on the “Manage domains” admin page:

[screenshot]

Review - so what do we have now?
At this point, we have our custom sub-domain integrated with our Office 365 tenancy – the foundation for directory integration. However, we don’t have user accounts actually existing in Office 365 yet – accounts can be created there in different ways, and we need to use one of them to actually get identities up into Office 365 for testing. This applies regardless of whether we’ll actually log in with these accounts, or simply use them as “passive” users. Ultimately these are cloud identities, so if these users will log in, our model is to use cloud authentication rather than the other option (federated authentication, e.g. with ADFS or similar).

By the way, if some of this is confusing and you need a good background on the different options for identities and authentication in Office 365, I’d suggest starting with User Account Management on TechNet.

Actually creating users in Office 365 – AAD Sync, manual, or bulk update

With or without a custom domain (or subdomain), users can be created in the following ways:

  • Manually in the Office 365 administration UI
  • Using PowerShell
  • Using bulk upload from a CSV file
  • Using AAD Sync
  • Using an Exchange mailbox migration

In other words, it’s worth remembering that even when you’ve integrated a custom domain with an Office 365 tenancy, users do not *have* to come from AD. However, this is definitely possible if you want to stay close to production and perhaps have synchronization occurring from AD to Office 365.
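For the PowerShell/CSV routes in that list, here’s a minimal sketch – the file path and column names are made up for illustration:

```powershell
# Sketch: bulk-create users from a CSV with columns UPN, First, Last, DisplayName (example schema)
Import-Module MSOnline
Connect-MsolService

Import-Csv "C:\temp\testusers.csv" | ForEach-Object {
    New-MsolUser -UserPrincipalName $_.UPN `
                 -FirstName $_.First -LastName $_.Last `
                 -DisplayName $_.DisplayName `
                 -UsageLocation "GB"
    # Licenses can be assigned here too, or later with Set-MsolUserLicense
}
```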

Using AAD Sync to sync users from AD to Office 365

Implementing sync from AD with this sub-domain approach is made possible by adding an alternative UPN suffix to the AD domain (as detailed in the “Assign a UPN domain suffix” section of Configure Office 365 for SharePoint hybrid on TechNet) – this effectively allows us to match different users in the directory to different Office 365 tenancies:

[screenshot]

Once the UPN suffix has been added in AD, it’s then possible to assign this suffix to individual user accounts – thus “matching” them to a particular Office 365 tenancy and URL domain:

[screenshot]

I also tend to put each set of accounts in a different OU so I can easily see accounts for each tenancy.
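The equivalent steps in PowerShell (using the ActiveDirectory module) look roughly like the sketch below – the suffix and OU path are examples from my environment:

```powershell
# Sketch: add an alternative UPN suffix to the forest, then apply it to the accounts in one OU
Import-Module ActiveDirectory

# 1. Register the alternative UPN suffix (a forest-level setting)
Get-ADForest | Set-ADForest -UPNSuffixes @{ Add = "cob.chrisobrien.com" }

# 2. Switch the users in a particular OU over to that suffix
Get-ADUser -Filter * -SearchBase "OU=COB Tenancy Users,DC=mydomain,DC=local" |
    ForEach-Object {
        Set-ADUser -Identity $_ -UserPrincipalName "$($_.SamAccountName)@cob.chrisobrien.com"
    }
```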

At this point, you could now implement AAD Sync to synchronize accounts – you would download and install/configure the AAD Sync tool, which is a step that Office 365 administrators will be familiar with:

[screenshot]

This will provision FIM and the scheduled task for ongoing sync from AD. In FIM, you could then restrict the sync for a particular tenancy to the users you earlier put in a specific OU:

[screenshot]

If you want to do this against multiple Office 365 tenancies, one thing to be aware of is that you’ll need multiple instances of FIM and the sync tool/task (as far as I’m aware). Since none of this dev/test sync is mission-critical, it can be managed pretty easily by just deploying AAD Sync in a couple of different small VMs (i.e. one per tenancy).

At this point, you’d now have users being sync’d from your AD instance to your dev/test Office 365 tenancy. Cool!

And finally, don’t forget that if you want these test users from AD to be able to log in and consume Office 365 services, you’ll need to assign them a license in the tenancy.
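That can be done in the admin UI or scripted – here’s a sketch with example UPN and SKU values (note that a usage location must be set before a license can be assigned):

```powershell
# Sketch: license a user that arrived via AAD Sync (UPN and SKU name are examples)
Connect-MsolService

Set-MsolUser -UserPrincipalName "user1@cob.chrisobrien.com" -UsageLocation "GB"
Set-MsolUserLicense -UserPrincipalName "user1@cob.chrisobrien.com" `
                    -AddLicenses "cobsp:ENTERPRISEPACK"
```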

Summary

Implementing directory integration is required in some Office 365 contexts (e.g. hybrid, and where Yammer Enterprise is used), but is useful in many others too. If you’re using multiple Office 365 tenancies to represent non-production environments, it’s a pain when dev/test don’t match the production set-up, especially if you find yourself implementing any custom functionality around user profiles. The approach detailed here provides one option for facilitating directory integration for dev/test Office 365 environments, without some of the infrastructure requirements and headaches that would usually come with this.

This series will continue to discuss options and techniques for improving Office 365 development.

Monday, 15 June 2015

Challenges in Office 365 development - and ways to address them

Over the last 2 years, I've spent quite a lot of time thinking about "cloud-friendly" SharePoint development - approaches and techniques which will work for Office 365, but also on-premises SharePoint deployments which need to be designed with the same principles. With my team having now done at least 20 or so Office 365/SharePoint Online implementations (and counting), we’ve established certain ways of working with Office 365. A good example is creating multiple tenancies to represent dev/test/production to help with our ALM processes, since Office 365 doesn't have the concept of a test environment. Sure, on the SharePoint side (my focus here) you could just create a special site collection - but that doesn’t give you the isolation needed across any global elements. Examples include things like user profiles, taxonomy and search. And since our clients often ask for minor customizations which interact with these areas, the “test site collection” idea just doesn’t cut it for us. Instead, we go with the multiple tenancy approach, and I’ve advocated this for a while for anyone with similar needs.
As I’ve previously discussed at conferences, our choice for most clients/projects is to create separate dev and test tenancies, and it generally looks something like this:
[screenshot]
It’s not the most important point here, but we tend to use different Office 365 plan levels for these environments - most of our clients use the “Office 365 E3” plan in production, and we’ll ensure TEST is on the same plan, but reduce costs by having DEV use “SharePoint P2” (so no Exchange, Lync or Yammer Enterprise). This generally works fine, because most of our development work centers on the SharePoint side. But regardless of what you do with plan levels, it’s also true that some trade-offs come with using multiple Office 365 tenancies (of any kind) – recently I’ve been thinking about options to mitigate this, and these broadly can be categorised as:

  • Making dev/test Office 365 tenancies more like production
  • Finding ways to test safely against the production Office 365 environment
The next few blog posts will detail some techniques which can help here. In this first post I’ll discuss the problem space - some problems I see with current Office 365 development which might lead you to consider these approaches. But in the article series I’ll be discussing things like:
  • Implementing AD integration with Office 365 using a sub-domain (for dev/test)
    • N.B. Directory integration is mandatory for true hybrid deployments or where Yammer Enterprise is used, but frankly is also useful to avoid other issues in dev/test
  • Enabling Yammer Enterprise with SSO in dev/test environments
  • Using Azure Deployment Slots to implement dev/test/production for apps hosted in Azure
For now, let’s talk about some of the issues you might hit in Office 365 development.

Challenges which come with multiple Office 365 tenancies

When we talk about dev and test environments, implementation teams always have a need to make these as similar to production as possible. The more differences, the more likely you’re going to have a problem - usually related to invalid testing or defects which only become apparent in production. Unfortunately, I notice our Office 365 projects do have certain trade-offs here. We really do want the multiple Office 365 environments for dev/test/prod (with the way Office 365 dev currently works at least), but it can be hard to make those other environments closely reflect production. Here’s a list of things which might be different:
  • Typically, directory integration is configured in production, but NOT for other environments
    • In other words, users sign-in with “chris@mycompany.com” in production, but “.onmicrosoft.com” accounts are used in dev/test
    • [N.B. You might know this step as “implementing Azure AD Sync”, or “implementing DirSync” to use its previous name]
  • Lack of SSO to Office 365 for users logged on to the company network (which relates to the point above)
  • Lack of a full directory of users
  • User profiles are not synchronized from AD in dev/test environments
  • Lack of Yammer Enterprise
  • Lack of Yammer SSO
  • Different license types (e.g. E3/E4 in production, but something else in dev/test)

OK, but why should I care about these things?

Depending on what you’re developing, some of these things can definitely cause problems! Some tangible examples from our experience are:
  • It’s not possible to do end-to-end testing – we can’t see the “real” user experience, especially across connected services e.g. Office 365 and Yammer
  • The experience on mobile devices is different
  • Code sometimes has to be written a different way, or use a “mapping” in dev/test - especially anything around user profiles. For example, any kind of user name lookup/e-mail lookup/manager lookups and so on
  • Any integrations with 3rd party/external systems might not work properly if they use the current user’s details in some way (because a different identity is used)
  • Yammer – the lack of SSO means a couple of things:
    • Any standard usage e.g. Yammer web parts or Yammer Embed won’t “just work” - a login button is displayed, and the user has to supply secondary credentials to authenticate/get the cookie
    • Any Yammer API code might need a special “mode” – because you probably have Yammer SSO in production, but not elsewhere

What can we do about these challenges?

So, it would be nice if we could make this situation better. Many of the issues stem from the fact that dev/test environments don’t have identity integration and AAD Sync configured, and what’s generally getting in the way there is that standing up a dedicated URL domain and Active Directory often isn’t trivial. The good news is that the sub-domain/UPN suffix approach I’ll talk about in the next post allows you to get past this – regardless of how many Office 365 environments you have, all you need is one URL domain and one Active Directory. In dev/test for us, this means ONE URL that we registered at GoDaddy for all our clients/projects, and we simply run the on-premises AD in a small VM on developer machines.
Once the domain integration is implemented, the next step is to implement AAD Sync to actually get some users from AD into Office 365. It will run off a set of users you choose (perhaps you would group the users for each dev/test tenancy into different OUs), and will perform the step of actually creating the users in Office 365. You could then optionally assign licenses to any test users who need to be able to log in and use Office 365 functionality, and actual authentication will happen in the cloud if/when the user logs in. If you want to implement Yammer Enterprise and Yammer SSO, you can now do that too. Directory integration is a pre-requisite for both of these things, but having solved that problem without major platform/infrastructure headaches, these possibilities open up for our dev/test environments.

Summary

So that’s a little on the problem space. Ultimately, developing for Office 365 at enterprise level does have some challenges, but many can be overcome and we can still strive for robust engineering practices. The next few blog posts will cover some of this ground – I’ll add links here as the articles get published.
Thanks for reading!

Tuesday, 19 May 2015

Presentation deck/videos – Comparing SharePoint add-ins (apps) with Office 365 apps

If you’re implementing customizations in Office 365, it’s fair to say that numerous approaches exist now, and that the service has certainly been enhanced as a development platform in recent times. However, one area of potential confusion is around how apps are developed. Developers and other technical folks became used to “the app model” over the past 2 years or so, as a way of implementing custom functionality which ran on non-SharePoint servers (perhaps Azure) and called into SharePoint using remote APIs such as CSOM or REST. But since then, Microsoft have introduced new APIs and authentication models for Office 365 – these sit above the underlying SharePoint, Exchange, Lync/Skype for Business and Yammer services in Office 365, and come with some differences in how the solution must be developed.

Notably, there are also differences in how end-users access the app, and in how administrators can make the app available (e.g. to specific users) in the first place. In all this, the original app model is not going away – SharePoint apps have now been renamed to “SharePoint add-ins”, but they are still a very valid implementation choice. So technical teams working with Office 365 often have decisions to make: should a customization be implemented as a SharePoint add-in or an Office 365 app?

For me, the key is understanding the impact on the different stakeholders – end-users especially, but also administrators and developers. The last group in particular need to understand the capabilities of Office 365 apps (e.g. what the APIs can do), to decide whether the functionality needed by the business can indeed be implemented with this approach.

My presentation

I presented on this topic at the SharePoint Evolutions 2015 Conference, and have now published the presentation deck and demo videos. The deck is shown below, and can also be downloaded from this link: SlideShare - Comparing SharePoint add-ins (apps) with Office 365 apps. There are 3 demo videos, and I’ve added captions to these and published them to YouTube – you can see these if you use the presentation below or obtain it from SlideShare. 

 

Hope you find it useful!

Tuesday, 12 May 2015

Info and some thoughts on Office 365 “NextGen” portals – Microsites, Boards and the Knowledge Management portal

Microsoft made some pretty big announcements at the Ignite conference I spoke at last week, which I think may change how intranets will be built in Office 365 (and maybe on-premises SharePoint too in the future?). I don’t normally try to be a “news blogger”, but for things that have a big impact on me/my team I like to make the occasional exception. Microsoft’s moves on “NextGen portals” and in particular the new Knowledge Management portal (codename “Infopedia”) are definitely in that category - there are several new CMS-like concepts here for Office 365/SharePoint. Some new “ready-to-go portals” will be made available to Office 365 users, and these come with a philosophy of “use more, build less”. As well as the KM portal, we can expect to see some kind of blogging capability soon, which may or may not be known as “stories”. These tools should work well for organizations who aren’t strongly tied to their own (often carefully-crafted!) list of requirements. If you’re already wondering about what happens when this isn’t the case, Microsoft *are* thinking about the customization strategy for NextGen portals. In the future it sounds like it will be possible to use various bits of what I’ll describe here as building blocks for custom portals – but that’s further down the line, and the “ready-to-go” offerings are likely to be around for a while first.

The impact

Immediately, a couple of things jump out at me:

  • If using the new portals, organizations will need to decide where they fit in the user experience with any existing intranet sites (just as they already do with Delve, Yammer and so on)
    • Notably, the NextGen portals come with their own look and feel. It’s not yet clear whether some level of branding can be inherited – but in the end, you’re kinda getting something for free here, so.. :)
  • The Knowledge Management portal could be seen as an “intranet-in-a-box”. There’s been a certain trend towards 3rd party SharePoint products of this type in recent years, but thinking as a client/user, I’m not sure why you wouldn’t just use Microsoft’s now. After all:
    • It’s free (i.e. part of your existing Office 365 licensing)
    • It’s got the development muscle of Microsoft behind it
    • It will be enhanced continually
    • It will be 100% supported by Microsoft
    • Whilst it might not match the precise functionality of a particular example product, you can probably get close enough to what you were trying to achieve. And it’s free ;)

Understanding how Boards, Microsites and portals relate to each other

The Knowledge Management portal is an example of a “ready-to-go” portal – just like Delve and the existing Office 365 video portal. You can immediately start creating sites and pages, but you won’t have much control over the functionality or look and feel. You get what you’re given, and if it works, great. If it doesn’t, well you don’t have to use it – ultimately it’s just another tool in the toolbox. I see it almost as a cross between a team site and a publishing site. Before we go into detail on different aspects, to help you frame things here’s what a page looks like:

Article page

The image shown above is an article page. As you’d expect, other page types exist such as a landing/front page – but implementers do not create custom page templates or page layouts.

The Knowledge Management portal isn’t the only thing to consider here though - Boards are another concept (which we see already in Delve) which can be used to present information, and Microsites provide another layer. I think of it like this:

  • Boards
    • Notes: Some similarities to a board in Pinterest. Add links to documents and pages.
    • Good for: Lightweight/informal knowledge management – a relatively small amount of information around a topic.
  • Microsites
    • Notes: A simple website, with a landing page and some navigation. Can contain boards, pages and documents.
    • Good for: More structured/larger amounts of information. Something more organised and perhaps with a more formal approach to contributions.
  • Knowledge Management portal
    • Notes: Contains microsites, SharePoint sites and boards. Adds other tools such as personalisation, simple analytics and use of Office Graph for a level of “intelligence” (e.g. content you might like).
    • Good for: A central area to sit above microsites – likely to become a big part of your intranet (maybe used as the home page, maybe not).

So, the KM portal contains microsites (amongst other containers, such as “regular” SharePoint sites), which in turn contain boards (and pages and documents):

Relationships between NextGen concepts

I’d imagine boards might be able to exist outside of a microsite too.

Boards

Boards can be used right now to collect information together – they are surfaced already in Delve, but will be used in other places such as microsites in the future. They are intended to be fairly simple to use, and have the following characteristics:

  • Boards show a collection of cards
  • Cards are currently *always* shown in date descending order (of when they were added to the Board) – there is no other control over which content is shown where
  • Anyone can create a new board
  • Anyone can add an item to a board
  • Boards are always public

For example, here’s a board I recently created here at Content and Code:

Our Development board

In the future, we can probably expect to see other hooks into boards – the image below was badged as “ideas in the pipeline”, but I’d be surprised if something like it didn’t appear:

Add to board in doc lib

Microsites

Microsites are pretty much what you’d expect – relatively simple websites, where lots of things are taken care of for you. Whilst it’s really the parent portal (e.g. Knowledge Management) that’s providing the “intranet-in-a-box” capability, some of the aspects are effectively provided by microsites:

  • More functionality and control than a board – i.e. not just cards, but pages too
  • Landing page
  • Article pages
  • Auto-generated navigation
  • Simple permissions model
  • Some social features
  • Responsive design – good mobile experience
  • Easy to create/contribute to

Here’s what the experience might look like down at the article page:

Microsite article page

The Knowledge Management portal (“InfoPedia”)

Whilst you’ll create many microsites, there is only one KM portal in the Office 365 tenant. It is effectively a container for microsites, but it’s more than that too:

  • The ability to bring in existing (or new) SharePoint sites and boards
  • Some extra top-level constructs to help users find what is valuable – navigation to different microsites, search tools and so on
  • An enhanced article page – with some additions focused on presenting knowledge
  • Tagging – to group together content in different sites
  • Personalised recommendations via Delve – both for consuming (“see related content”) and creating (“suggested content you might want to link to from this page”)
  • Analytics
  • A great mobile experience – potentially even for light content creation

Here are some screenshots:

The landing page:

Landing page

Showing sections within a microsite:

Sections

An article page within the KM portal:

Notably, this isn’t quite the same as an article within a regular microsite – there’s a greater focus on design (with the banner image), and some automatic in-page navigation (TOC style):

KM portal article

Some of the authoring experience:

Microsite authoring

A particularly rich article page (long vertical scroll in this case):

Microsite article - long

Responsive design/adaptive design: 

Providing a great mobile experience is a fundamental pillar to the NextGen portals vision. As you’d expect, pages scale down well and adapt well to the screen size of the device:

[screenshot]

As you can see, it’s something like a publishing site with a specific look and feel. I think it could probably be used for any intranet site which focuses somewhat on presenting information – potentially even team sites which don’t feature heavy collaboration. Like other ready-to-go-portals, it’s a “railed experience” – meaning authors focus on the content, and how it gets presented to consumers is largely taken care of and not super-customisable.

Page creation/editing

Whether in the KM portal or any microsite, there is a streamlined experience to creating pages – this is great news because SharePoint was never the best or simplest CMS in this respect. Authors often found the ribbon complex, and the whole “navigate to a specific place, use the Site Actions menu, select a page layout” experience left a lot to be desired. Here are some notes I made on the new page editing experience:

  • No selection of a page template – just start authoring content
  • Simple formatting controls – bold, italics, font colour and size etc.
  • Easy to add text/images/documents/video
  • Auto-save
  • Auto table of contents style navigation for long pages
  • Simple picker for choosing images
  • Documents can be embedded in pages – this uses the Office Online rendering, just like the hover panel in search results or a document library. This means a PowerPoint deck can be browsed within the page, a Word document can be paged through and read within the page, and so on

The authoring experience:

Authoring in KM portal

How portals are implemented

There’s a whole layer of technical implementation detail to understand around all these concepts, and I look forward to digging around when they become available. Here are some of the notes I made:

  • Portals are effectively built as an extra layer above SharePoint. This layer encompasses a couple of things, including:
    • A Single Page App which supports portals and the specific things they do – this has various page controls (e.g. table of contents control, page rollup control etc.) and calls into the REST APIs underneath
    • The page rendering layer – all implemented as client-side code
  • They use site collection storage, but in a specific way – for example:
    • Pages are stored as BLOBS of JSON in the Pages library
    • Images are stored in the Assets library
    • Permissions are managed across the page and related assets (they are treated as one unit effectively)
    • Some other libraries are used e.g. a new “Settings” library

Here’s an image which depicts the railed experience (of the page template):

Article - railed 1

Article - railed 2

There’s a difference in how page data is stored too. Whilst we’re used to content types with various fields storing different data types, here it looks like the data will be stored in a larger JSON structure (still within the Pages library, but perhaps “fields” will only be represented in the JSON rather than physically exist on the content type):

[screenshot]

Other Office 365 platform services such as Azure (for Video) and Office Graph (for suggestions) are used in the implementation too.

What customization might look like

As I mentioned earlier, it feels like Microsoft are (understandably) a lap ahead in terms of the end-user functionality compared to the extensibility story. However, the kind of things that were suggested in the Ignite talks were:

  • A future ability to build custom portals using the same building blocks used in the ready-to-go portals (I’m guessing the page rendering, data storage, page controls for roll-ups etc.)
  • Potentially some form of extensibility of the ready-to-go portals
  • A framework (e.g. APIs) for re-using JavaScript and CSS used in Microsoft’s portals
  • It should be possible to host custom portals in SharePoint Online – you wouldn’t need to provide your own hosting

I was interested to hear Dan Kogan (long-time Microsoft product manager) also hint they might even look at open-sourcing the entire front-end layer of NextGen portals.

Summary

It feels to me like this is a fairly big evolution of Office 365/SharePoint as a tool for managing information. The past few SharePoint releases have seen incremental improvements to publishing sites and team sites, but they still required a reasonable amount of experience for business users to get the most out of them. By providing tools at a higher level, it seems Microsoft are making a departure here – and technical implementers will need to “understand product” well if they are to make the right decisions and provide the right guidance. This is something I’ve been telling my team, and I think it applies whether you’re a developer, IT Pro, consultant or just about any other implementer.

As I mentioned earlier, it will also be interesting to see the impact on parts of the product industry which surrounds SharePoint and Office 365. As any product company surely knows, building around a stack from a company with deep pockets like Microsoft can be risky – as we saw with Microsoft’s backing of Yammer and the corresponding impact on other companies selling social products.

But personally I’m looking forward to getting deep with the new stuff when it arrives, and the challenge of continuing to provide expert guidance to clients as changes in the landscape continue to accelerate.

Sources: talks at the Ignite conference.

Thursday, 16 April 2015

Improving the Office 365 SharePoint hybrid experience

Back in February 2014, I wrote Office 365 SharePoint hybrid – what you DO and DO NOT get. In that article, I discuss the fact that whilst there’s lots of talk about hybrid, and indeed lots of organizations doing *something* in this space, what Microsoft provide is far from a turn-key solution and there’s quite a long list of things you need to work on. That is, if your organization wants to provide any kind of “joined-up” experience that works well for end-users, across sites in the cloud and sites in an on-premises SharePoint environment.

It’s fair to say that at the current time, hybrid from Microsoft really just consists of:

  • Search
  • BCS
  • A model for personal sites (OneDrive for Business sites) to live in the cloud, with some navigation for users, when other things are on-premises

The list of things you don’t get includes things like (N.B. the earlier article has a more complete list):

  • Global navigation
  • A global site directory
  • A joined-up social experience (unless you use Yammer)
  • Any kind of synchronisation of:
    • Branding
    • Customizations
    • Taxonomy/Managed Metadata
    • Content types
    • Apps
    • User profile data

..and so on.

I fully expect SharePoint 2016 to improve on this situation – I think it will be a key focus area for the product. But here in April 2015, having SP2016 in production is still some considerable time away for most.

Common requests for improving hybrid – user profiles, taxonomy and user photos

Since then I’ve worked on several hybrid deployments, and have been involved in work to close the hybrid gap and improve the experience. In this post I want to talk about some of the ground my team and I have covered – what we’ve implemented, and what our experiences have been. From where I stand now, I still believe that hybrid is the most complex option in some ways (for now at least). If you have a choice of simply running all of your SharePoint “things” in Office 365, at a stroke you will reduce complexity, cost and effort! However, this isn’t always possible and indeed, I talk to more and more organizations who are on a roadmap to eliminating on-premises SharePoint in favour of SharePoint Online – but it will take time to get there. So, hybrid definitely makes sense in many cases.

For one client in particular, we focused on solving three core problems:

User profiles

Since hybrid effectively gives each user two completely independent/disconnected user profiles, we wanted to provide one place for a user to edit their profile, with changes replicated to the other. We picked the on-premises SharePoint profile as the “editable master”, since some investment had already been made in integrating data from other on-premises data sources in this case (via BCS) and also custom AD fields. Our job was to replicate the changes to the SharePoint Online user profile (particularly custom/organization-specific fields not sync’d by Azure Active Directory sync – more on this later), so that user data was consistent wherever it was accessed.

Taxonomy data

There were two drivers behind the need to sync taxonomy data from the on-premises SP2013 term store to the SharePoint Online term store – tagging of documents (as you’d perhaps expect), but also the fact that several term sets were used in custom user profile properties. Examples would be user profile properties for, say, “Business unit” or “Division” – values for these come from custom term sets to ensure consistency and valid input. In order for us to sync user profile data, this taxonomy data would need to also exist in Office 365, otherwise the profile cannot be saved because it’s pointing to a missing value. In other words, there was a dependency on taxonomy data so that we could sync user profile data.

User photos

On the surface this was simpler than the others – the client didn’t want to allow users to edit their own photos, and had some processes to deal with user photos in existing systems (including resizing, applying a border for consistency etc.). The desire was for the SharePoint Online user profile to also use this photo. If you’re going to bulk sync photos up to SharePoint Online, in reality the best way to do this is to deal with this at the Office 365 level (so that other service elements also show the photo), and the Set-UserPhoto cmdlet is provided in this space. However, we found that there was more complexity here than we expected – more later.

What we implemented – leveraging the OfficeDev Patterns and Practices code

Overall, custom code was needed to bridge these gaps. From the start, we knew that the OfficeDev Patterns and Practices libraries had code which would help, and we designed our solution around some pieces in there. With much of the heavy lifting taken care of, our focus was implementing some of their “engine code” in an enterprise kinda way – something that could run on a schedule, have the right level of resilience, and be monitorable so that any failures could be identified. Helpdesk calls of the “hey, when I go here it doesn’t have the right value for me, but when I go there it does!” and “I can’t tag this document properly” would no doubt occur otherwise – to a certain extent we definitely expect this occasionally, but we want to be able to identify/manage any issues proactively.

High-level architecture

Whilst the various bits of code supplied by the OfficeDev P and P team do their particular job (e.g. help sync taxonomy or user profile data), exactly how they are implemented and where they run from is up to you – options could include:

  • A simple console app, installed to an on-premises server
  • As above, but wrapped in a PowerShell script
  • A cloud option, e.g. WebJobs in Azure (but you might need to do extra work, e.g. with the Data Management Gateway or similar if access to on-premises data is needed)
  • Some combination of the above e.g. middle-tier code in Azure, “client” code or script on-premises

In our case, because all three aspects start with on-premises data we needed the code to talk to on-premises systems – be it the SharePoint 2013 environment, or on-premises fileshares (e.g. for photos). So, we used the “simple console app” option - this gave us the connectivity we needed, but also could be easily scheduled/monitored, and offered a familiar architecture that everyone was comfortable with. In code terms, however, we did separate out the core engine code, so that it could be hosted somewhere else (e.g. Azure) if ever required.
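As a rough illustration of the “scheduled console app” option, registering the job with Windows Task Scheduler might look something like this – the exe path, task name and service account are all made up:

```powershell
# Sketch: schedule the sync console app to run nightly (all names/paths are examples)
$action  = New-ScheduledTaskAction -Execute "C:\SyncTools\COB.ProfileSync.exe"
$trigger = New-ScheduledTaskTrigger -Daily -At 2am

Register-ScheduledTask -TaskName "Office 365 profile sync" `
                       -Action $action -Trigger $trigger `
                       -User "MYDOMAIN\svc-o365sync" -Password "********"
```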

User profiles

The solution here focused on sync’ing custom user profile data from on-premises to SharePoint Online. Of course, Azure Active Directory sync (formerly DirSync) is used to synchronize some user data from on-premises Active Directory to Office 365, including the initial creation of profiles – but AAD Sync only deals with a core set of fields/attributes. Organizations commonly define *many* custom user profile fields (e.g. division, department, employee ID etc.), and so a custom solution is currently needed if you want this data imported to Office 365/SharePoint Online. Our approach uses the relatively recent (November 2014) CSOM remote code methods to write the data to user profiles in SharePoint Online (not to Azure AD in this case). I implemented this as a two-step process: first, a CSV file is generated listing users whose profile has been updated (whether by the user, via BCS fields or by a sync from AD); the import then picks up this data as the second step.
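For a flavour of what those CSOM methods look like, here’s a minimal sketch of writing a single custom property to a SharePoint Online profile (shown in PowerShell for brevity; our actual tool is a .NET console app). The admin URL, account and “Division” property are examples, and the SharePoint Online client DLLs from a late-2014 or newer SDK are assumed to be available locally:

```powershell
# Sketch: write one custom user profile property in SharePoint Online via CSOM (values are examples)
Add-Type -Path "C:\lib\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\lib\Microsoft.SharePoint.Client.Runtime.dll"
Add-Type -Path "C:\lib\Microsoft.SharePoint.Client.UserProfiles.dll"

$ctx = New-Object Microsoft.SharePoint.Client.ClientContext("https://cobsp-admin.sharepoint.com")
$password = Read-Host "Password for admin account" -AsSecureString
$ctx.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials("admin@cobsp.onmicrosoft.com", $password)

$peopleManager = New-Object Microsoft.SharePoint.Client.UserProfiles.PeopleManager($ctx)
$account = "i:0#.f|membership|user1@cob.chrisobrien.com"

# "Division" is an example custom property - it must already exist in the SPO profile schema
$peopleManager.SetSingleValueProfileProperty($account, "Division", "Research and Innovation")
$ctx.ExecuteQuery()
```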

In fact, because CSOM is used for both reading from the source and writing to the target, simply by changing values in a config file (e.g. URLs, credentials) we can actually synchronize profile data between ANY SharePoint environments we have connectivity to – either side could be an on-premises or Office 365 environment. This means we have a pretty good solution which could be adapted if ever the needs change.

The end result is that custom profile data is synchronised to Office 365, and effectively users “look the same” in both environments - for example in people searches, or where a “created by” or “last modified by” link is clicked, and so on. In other words, the user experience just makes sense. (N.B. the images shown below are of the new profile experience in Office 365 [April 2015]):

[screenshot]

Specific fields can be set to "do not import" (e.g. to not conflict with fields updated by AAD Sync), or for any other reason:

[screenshot]

User accounts can also be "mapped" between the source/target environments - this is useful where *.onmicrosoft.com accounts are used/AAD Sync and identity integration isn't implemented (e.g. test environments):

[screenshot]

Any users who fail to import for some reason will automatically be re-tried (even if their profile hasn't changed since the last attempt). Information on which users are failing, how many times they've failed, and the last failure reason is maintained as the import jobs run:

[screenshot]

Since they run on an ongoing basis, the tools are designed to be monitorable. They write a summary to the Windows Event Log, ready for SCOM or another monitoring tool to pick up and send e-mail notifications of any failures:

[screenshot]
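The event log side of this is simple enough to sketch – the source name, event ID and counts below are made up, but it’s the kind of summary entry the tools write for SCOM to pick up:

```powershell
# Sketch: write a summary entry for the monitoring tool to alert on (source/ID/message are examples)
$source = "COB.O365Sync"
if (-not [System.Diagnostics.EventLog]::SourceExists($source)) {
    New-EventLog -LogName Application -Source $source      # one-off step, needs admin rights
}

Write-EventLog -LogName Application -Source $source -EntryType Warning -EventId 3001 `
    -Message "Profile sync completed: 250 profiles updated, 3 failures. See Log4Net logs for details."
```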

Further to this, Log4Net is implemented to provide rich logging to help troubleshoot any problems:

[screenshot]

Taxonomy data (e.g. term sets for tagging)

Taxonomy sync can synchronize most types of changes (new term sets, new terms, renames etc.) from source to target. The heavy lifting is actually done by Microsoft code (the OfficeDev Patterns & Practices Core.MMSSync solution).

The end result is that any term set created in the source environment (on-premises SP2013 in this case) is replicated to the target environment (Office 365):

[screenshot]

This allows documents to be tagged in a consistent way across online/on-premises sites, and also allows user profile fields which are bound to term sets to work properly in both environments.

[screenshot]

Synchronization of user photos

Our mechanism here was mainly a PowerShell script which used the Set-UserPhoto cmdlet (actually an Exchange Online cmdlet, which requires the user to have an Exchange Online mailbox in Office 365), integrated with some existing photo-handling processes. However, we decided to change this approach entirely in the end, due to some issues we hit – more on this in the next section.
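For reference, the core of that approach looks something like the sketch below – connecting to Exchange Online remote PowerShell and pushing a photo for one user (the path and UPN are examples):

```powershell
# Sketch: push a photo to Office 365 via the Exchange Online Set-UserPhoto cmdlet (values are examples)
$cred = Get-Credential
$session = New-PSSession -ConfigurationName Microsoft.Exchange `
    -ConnectionUri "https://outlook.office365.com/powershell-liveid/" `
    -Credential $cred -Authentication Basic -AllowRedirection
Import-PSSession $session

$photoBytes = [System.IO.File]::ReadAllBytes("C:\photos\user1.jpg")
Set-UserPhoto -Identity "user1@cob.chrisobrien.com" -PictureData $photoBytes -Confirm:$false

Remove-PSSession $session
```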

Challenges we encountered

For the most part, the development of these tools went fine – however, we definitely did encounter a couple of road-bumps along the way. These were things like:

  • Typical client infrastructure issues – tools would run fine in our environments, but would be less reliable communicating with the cloud when running on more locked-down servers and going through additional network infrastructure (e.g. reverse proxies).
    • In the end, this proved a significant issue for this client (but perhaps somewhat specific to their case). The user photo PowerShell script in particular would fall over with authentication problems when the network connection was less than perfect – and it was hard to “engineer in” absolute reliability here. Ultimately, it was decided a simpler approach would be to change policy and allow users to edit their own photo (in SharePoint Online) :)
  • Some taxonomy sync challenges
    • Although the P and P code is pretty resilient here, we did hit the occasional error (most likely related to connectivity/the previous point). For the most part, these would get resolved on the next run however.
  • Throttling and batch issues with remote code
    • During the development process, clearer guidance emerged on how to handle throttling in Office 365 code. This was useful, but we also found that some bits of the OfficeDev Patterns and Practices internal code caused problems in the way we were using it (even though the later versions implemented the throttling/retry pattern). There was nothing invalid about the PnP code *or* the calls we were making – it was more that some tweaking was required to be able to successfully process everything we needed to.

Summary

Ultimately we were able to implement our sync tools so that the end-user’s experience of hybrid SharePoint is improved. I think that any form of cloud development will always encounter occasional errors in communication, and at enterprise scale it’s usually a good idea to build in some kind of tolerance to this (e.g. our re-try mechanism for failed user profile updates). The OfficeDev Patterns and Practices code helped us out a ton – there’s no doubt that closing these gaps would have required WAY more effort without it.

As mentioned earlier, I’d expect SharePoint 2016 to improve on the hybrid situation natively – but this work wasn’t a huge effort, and it let us make big steps in that direction right now.

Monday, 6 April 2015

Speaking at Ignite and SharePoint Evolutions 2015 Conferences

As you might know if you’re in the SharePoint/Office 365 space, there’s a couple of big conferences around the corner – I’ll be delivering sessions at Microsoft’s Ignite conference in Chicago in May, and also the most excellent SharePoint Evolutions conference before then in April, here in London. It always takes quite a lot of preparation to pull new talks together, but after the conference I’ll be looking forward to writing in more detail about some of the topics I’ve been exploring. Before then, I think both conferences are going to be pretty special, and if you’re going to one of them (or Build, in San Francisco in April) I think you’ll have a great time and learn a lot. Here are the details of my sessions (N.B. there were some updates to these summaries which hadn’t made it to the websites at the time of writing):

Ignite

Dealing with Application Lifecycle Management in Office 365 app development

Thursday, May 7th, 10:45 AM - 12:00 PM

For teams who are undertaking cloud-friendly SharePoint or Office 365 development, apps are likely to be a key area of focus - be they Apps for SharePoint or newer Office 365/Azure AD apps. One of the benefits of stepping outside the traditional SharePoint development model is that aspects such as ALM and Continuous Integration become easier - finally, an end to "development is harder because it's SharePoint!". In this session we'll talk about how to do app development the "right" way - with a focus on ASP.NET as the framework and Azure as the hosting platform. We'll cover good practices such as implementing automated builds, take a close look at “continuous deployment” for apps, and also discuss the upgrade/push-to-live process for existing applications. Over several demos, we'll show how Visual Studio Online and Azure are a winning combination, and see how concepts such as "Deployment Slots" in Azure can help.

Transforming Your SharePoint Full Trust Code to the Office App Model (panel discussion)

Tuesday, May 5th - 5:00PM - 6:15PM
Panel – Vesa Juvonen, Frank Marasco, Bob German, Erwin van Hunen, Chris O’Brien

This session is a panel discussion covering examples and patterns for moving SharePoint customizations to the cloud app model - for use either in Office 365 or "cloud-friendly" implementations of on-premises SharePoint. The panel comprises members of the Microsoft OfficeDev Patterns and Practices team and independent MVPs. Both bring their experiences from the field, but different perspectives too. Since the landscape changes when developers must leave behind server-side SharePoint API code and feature framework elements, the discussion will centre around 5 related hot topics - branding, options around remote code (including .NET, JavaScript and PowerShell), provisioning of customized sites, the story with sandbox solutions and finally how the Office 365 APIs fit in with the app model. We promise a lively discussion, examples of code from the field, and time for Q&A!

 

SharePoint Evolutions


Dealing with Application Lifecycle Management in Office 365 app development

Same as Ignite session above

New developer choices - comparing Apps for SharePoint with external Office 365 apps

No sooner have we all learnt how to develop Apps for SharePoint, when Microsoft go and release a new development model. Don’t you just love it when they do that?! In an "external" or "standalone" app, there's no need for the app to be installed to a SharePoint site - and this can provide huge benefits. However, new APIs must be used and the developer must understand how Azure AD fits in. Whilst this model is currently Office 365 only, Microsoft have hinted that this approach may come to on-premises SharePoint soon - so this will become important to all developers. This session is designed to get you up to speed.

Closing notes

I also plan to publish videos of the demos from these sessions. There have been quite a few changes in Azure recently, and my ALM session covers some of the cool things such as Deployment Slots, with a hat tip to other related capabilities such as the “Traffic Routing” feature which can be used for A/B testing and the like.

I’m really looking forward to the panel discussion at Ignite too. Like the other panel members I’ve definitely got some opinions on the topics being discussed :)

Anyway, hope to see you at one of the conferences. Please come and say hello if you’re there!