
Wednesday, 23 June 2010

Feature upgrade (part 1) - fundamentals

In this article series:

  1. Feature upgrade (part 1) – fundamentals (this article)
  2. Feature upgrade (part 2) – a sample to play with
  3. Feature upgrade (part 3) – introducing SPFeatureUpgrade kit
  4. Feature upgrade (part 4) – advanced scenarios
  5. Feature upgrade (part 5) – using PowerShell to upgrade Features

In recent articles I’ve touched on the introduction of versioning and upgradability to the Features framework in SharePoint 2010. I’ve covered this topic recently in conference talks (SharePoint Evolutions) and as a chapter in the forthcoming Real World SharePoint 2010 book – however I also want to cover it to a certain level here, as I’m mindful that not everyone came to the talks or will buy the book. I also want to introduce a tool I’ve published on Codeplex which might be useful if you decide to use Feature upgrade - more on this in the next article.

When would I use Feature upgrade?

Feature upgrade is useful in the following scenarios (though there may be more!):

  • To make changes to existing site collections or sites, or something inside – for example:
    • Making changes to existing items e.g. adding a new column to a content type/list
    • Adding new elements to existing sites e.g. a new list
    • Any change which involves doing something to a site using the API e.g. using code to modify the navigation settings for many sites
  • To add new functionality into an existing Feature, rather than create a new one – perhaps because that’s the most logical factoring
  • Where some functionality will be upgraded several times during its lifecycle, possibly in a situation where the changes are not rolled out to every site, or are rolled out at different times

If you’re modifying or expanding on functionality developed using Features (across many sites or just one), then Feature upgrade is likely to be a good vehicle to roll out your changes. This is thanks (in part) to the new QueryFeatures() methods in the API, which provide a convenient collection to iterate over to apply the changes. Where the changes themselves are implemented in code, developers have in the past put their implementation in the FeatureActivated event and then ensured the Feature was deactivated/reactivated – however this came with baggage, since that event might contain other code which was never intended to be re-run. Feature upgrade is designed to solve such problems.
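
To give a feel for the mechanics, here’s a minimal sketch of upgrading all instances of a Feature across a web application – the URL and Feature ID are placeholder values, and more detail on SPFeature.Upgrade() and QueryFeatures() follows in the notes further down:

using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;

public class FeatureUpgradeDemo
{
    public static void UpgradeFeatureInstances()
    {
        // Minimal sketch - the URL and Feature ID below are placeholder values.
        SPWebApplication webApp = SPWebApplication.Lookup(new Uri("http://myserver"));
        Guid featureId = new Guid("cae1f65d-0365-42e9-9907-356c7983e902");

        // QueryFeatures(featureId, true) returns only the Feature instances which need upgrading
        // (i.e. where the activated version is lower than the installed definition's version).
        foreach (SPFeature feature in webApp.QueryFeatures(featureId, true))
        {
            // Runs the declarative upgrade actions and any FeatureUpgrading receiver code.
            feature.Upgrade(false);
        }
    }
}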

What does Feature upgrade look like?

Microsoft added some new XML to the Features framework to support Feature upgrade. When a new version of a Feature is created, this may involve:

  • Incrementing the version number of an existing Feature (this is mandatory for Feature upgrade to happen)
  • Adding some new XML to define new items for the Feature
  • Writing some code to execute in the new FeatureUpgrading event in the Feature receiver
  • All of the above

Here’s an example of an upgraded Feature highlighting some of the possibilities:

<?xml version="1.0" encoding="utf-8" ?>
<Feature xmlns="http://schemas.microsoft.com/sharepoint/" Version="1.0.0.0">
  <UpgradeActions>
    <VersionRange BeginVersion="0.0.0.0" EndVersion="0.9.9.9">
      <ApplyElementManifests>
        <ElementManifest Location="SomeFunctionality_Iteration2\Elements.xml" />
      </ApplyElementManifests>
      
      <AddContentTypeField ContentTypeId="0x010073f25e2ac37846bb8e884770fb7307c7"
          FieldId="{536DC46C-DC26-4DB0-A97C-7C21E4362A85}" PushDown="TRUE"/>
      <AddContentTypeField ContentTypeId="0x010073f25e2ac37846bb8e884770fb7307c7"
          FieldId="{4E7A6719-011A-47EA-B983-A4941D688CA6}" PushDown="TRUE"/>
 
      <CustomUpgradeAction Name="UpdateSomething">
        <Parameters>
          <Parameter Name="PassSomeValue">This is a string</Parameter>
        </Parameters>
      </CustomUpgradeAction>
    </VersionRange>
  </UpgradeActions>
</Feature>

[Sidenote] Note that the above XML is actually a subset of the full feature.xml file – when working with Feature upgrade, it is necessary to step outside of the Feature Designer in Visual Studio 2010 and edit the XML files directly (the old-fashioned way!). When doing this, the best option is to let VS continue managing its portion of the XML and merge your additions in alongside it. The XML managed by VS gets merged with your custom XML when the WSP is packaged – if you could see the VS-managed portion in isolation (you can’t, since you never need to), it would look something like this:

<Feature Title="Some functionality" Id="cae1f65d-0365-42e9-9907-356c7983e902" Scope="Site">
  <ElementManifests>
    <ElementManifest Location="SomeFunctionality\Elements.xml" />
    <ElementManifest Location="SomeMoreFunctionality\Elements.xml" />
  </ElementManifests>
</Feature>

Essentially Visual Studio will still manage our element manifests, but any XML around Feature upgrade needs to be edited by hand. Walking through the main upgrade XML shown above, we can see:

  • The Feature has a Version attribute (to be incremented each time the Feature is upgraded)
  • A VersionRange element defining the upgrade steps to process for a particular upgrade i.e. when an existing Feature instance within the BeginVersion and EndVersion is upgraded with an updated Feature definition
  • An ApplyElementManifests element – this is used to add new elements to an existing Feature. When the Feature is upgraded, any items (e.g. content types, modules etc.) will be provisioned according to the element manifest(s)
  • AddContentTypeField element – this is a convenience mechanism for the common task of adding a field to an existing content type (a very common upgrade scenario). Note the PushDown attribute – this is hugely useful as it does the work of pushing down the change from the site content type to all list content types within the site (and therefore, all lists), all without any code.
  • CustomUpgradeAction element – this allows the developer to point to some code to run to perform the upgrade actions. It will be common to need this approach, given the vast array of things you might want to do in an upgrade. In terms of the ‘pointing’, in fact the code will always be the FeatureUpgrading method in the receiver, but the value passed in the Name attribute is passed to this method to identify which code to run (along with any parameters). Hence your FeatureUpgrading method is likely to contain a switch statement and would look something like this if it was to match up with the above XML:
    public override void FeatureUpgrading(SPFeatureReceiverProperties properties, string upgradeActionName, System.Collections.Generic.IDictionary<string, string> parameters)
    {
        // Note: for a Site-scoped Feature (as in the feature.xml above), Parent is an SPSite
        SPSite parentSite = (SPSite)properties.Feature.Parent;
        SPWeb parentWeb = parentSite.RootWeb;
     
        switch (upgradeActionName)
        {
            case "UpdateSomething":
                string someValue = parameters["PassSomeValue"];
                // do some stuff.. 
                break;
            default:
                break;
        }
    }

In addition to these possibilities there is one further declarative element – MapFile. MapFile allows you to repoint the location of an uncustomized file – this literally updates the pointer in the database. The precise scenarios where you’d want to use this (as opposed to simply deploying an updated version of the original file) escape my tiny mind unfortunately – the only thing I can think of is that if it allows the repointing to be done at different scopes (e.g. web), in a DelegateControl kind of way, that could be very useful. I’m unable to verify this however, as I just cannot get MapFile to work, and neither can others I know who have tried (@jthake). Oh well.

Taking a step back to look at these tools, it’s easy to think that if you don’t happen to be adding a field to a content type then realistically you’re looking at code. However ApplyElementManifests is often all you need for some scenarios e.g. a set of new fields + a new content type + a new publishing page layout.

Notes

These are some ‘fundamental’ things to know – I’ll discuss some more advanced aspects in a future article:

  • Feature upgrade does NOT happen automatically (including when the Feature is deactivated/reactivated)! The only way to upgrade a Feature is to call SPFeature.Upgrade(), typically in conjunction with one of the QueryFeatures() methods. My tool which I’ll go on to talk about is a custom application page which helps you with this part – note there is no STSADM command, PowerShell cmdlet or user interface to do this out-of-the-box.
  • On the VersionRange element, BeginVersion is inclusive but EndVersion is not. In other words, a Feature instance will be upgraded if the current version number is equal to or greater than BeginVersion, and less than EndVersion.
  • Upgrade instructions are executed in the order they are defined in the file.
  • If a Feature does not have a Version attribute, the version is 0.0.0.0.
  • Enabling logging can help diagnose any issues. In the ULS settings, under the ‘SharePoint Foundation’ category, set the following sub-categories to Verbose to see more info:
    • Feature Infrastructure
    • Fields
    • General

Summary

SharePoint 2010 introduces additional lifecycle management capabilities with the ability to version and upgrade Features. There are some declarative elements such as ApplyElementManifests and AddContentTypeField, but using the CustomUpgradeAction element allows you to shell out to code where necessary. The only way to actually perform upgrade on a Feature once it has been updated is to call SPFeature.Upgrade() on each instance of the Feature (e.g. in each web) which should be upgraded – new QueryFeatures() methods help you locate Feature instances which can be upgraded. I’ve written a custom application page which helps manage this process, to be discussed next time.

Thursday, 23 April 2009

Fix to my Config Store framework and list provisioning tips

Had a couple of reports recently of an issue with my Config Store solution, which provides a framework for using a SharePoint list to store configuration values. If you're using the Config Store this article will definitely be of interest to you, but I've also picked up a couple of general tips on list provisioning which I want to pass on. I have to thank Richard Browne (no blog) of my old company cScape, as the fix and several of the tips have come from him - as well as alerting me to the problem, he also managed to fix it before I did, so many thanks and much kudos mate :-)

Config Store problem

Under some circumstances, fields in the Config Store list were not editable because they no longer appeared on the list edit form (EditForm.aspx). So instead of having 4 editable fields, only the 'Config name' field shows in the form:

[Screenshot: ConfigStoreMissingFields - the Config Store edit form showing only the 'Config name' field]

I've not fully worked out the pattern, but I think the problem may only appear if you provision the list on a server which has the October or December Cumulative Update installed - either that or it's a difference between Windows 2003 and Windows 2008 environments (which would be even more bizarre). Either way, it seems something changed in the way the provisioning XML was handled somewhere. This is why the problem was undetected in the earlier releases.

I had seen this problem before - but only when the list was moved using Content Deployment (e.g. using the Content Deployment Wizard) - the original 'source' list was always fine. We managed to work around this by writing some code which 're-added' the fields to the list from the content type, since they were always actually present on the content type and the data was still correctly stored. Having to run this code every time we deployed the list was an irritation rather than critical, but something I wanted to get to the bottom of - however, finding that some folks were running into this in 'normal' use meant it became a bigger issue.
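
For anyone hitting the same symptom, the workaround boils down to re-adding any missing fields to the list from the content type. A rough sketch of how that could look is below - the URL, list name and content type name are placeholders, and this isn't the exact code we ran:

using Microsoft.SharePoint;

public class ConfigStoreFieldFixUp
{
    public static void ReAddMissingFields()
    {
        // Placeholder URL/list/content type names - adjust for your environment.
        using (SPSite site = new SPSite("http://myserver"))
        using (SPWeb web = site.OpenWeb())
        {
            SPList list = web.Lists["Config Store"];
            SPContentType listContentType = list.ContentTypes["Config item"];

            foreach (SPFieldLink fieldLink in listContentType.FieldLinks)
            {
                // If the field is on the content type but missing from the list's
                // field collection, copy it back down from the site columns.
                if (!list.Fields.Contains(fieldLink.Id) && web.AvailableFields.Contains(fieldLink.Id))
                {
                    list.Fields.Add(web.AvailableFields[fieldLink.Id]);
                }
            }
        }
    }
}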

The cause

I always knew the problem would be down to a mistake in the provisioning XML, but since I'd looked for it on previous occasions I knew it was something I was seeing but not seeing. In my case, Richard spotted that I was using the wrong value in my FieldRef elements under the ContentType element - I was mistakenly thinking that the 'Name' attribute needed to match up with the 'StaticName' attribute given to the field; the documentation says this attribute contains the internal name of the field. So my FieldRefs looked like this:

<ContentType ID="0x0100E3438B2389F84cc3965600BC16BF32E7" Name="Config item" 
Group="Config Store content types" Description="Represents an item in the config store." Version="0">
<FieldRefs>
<FieldRef ID="{33F5C8B4-A6BB-41a4-AB24-69F2152974C5}" Name="ConfigCategory" Required="TRUE" />
<FieldRef ID="{BD413479-48AB-41f5-8040-918F32EBBCC5}" Name="ConfigValue" Required="TRUE" />
<FieldRef ID="{84D42C64-D0BD-4c76-8ED3-0A9E0D261111}" Name="ConfigItemDescription" />
</FieldRefs>
</ContentType>

..to match up with fields which looked like this:

<Field ID="{33F5C8B4-A6BB-41a4-AB24-69F2152974C5}"
Name="Config category"
DisplayName="Config category"

StaticName="ConfigCategory"
....
....
/>

The CORRECTED version looks like this (note the change in value for the Name attribute of FieldRefs):


<ContentType ID="0x0100E3438B2389F84cc3965600BC16BF32E7" Name="Config item"
Group="Config Store content types" Description="Represents an item in the config store." Version="0">
<FieldRefs>
<FieldRef ID="{33F5C8B4-A6BB-41a4-AB24-69F2152974C5}" Name="Config category" Required="TRUE" />
<FieldRef ID="{BD413479-48AB-41f5-8040-918F32EBBCC5}" Name="Config value" Required="TRUE" />
<FieldRef ID="{84D42C64-D0BD-4c76-8ED3-0A9E0D261111}" Name="Config item description" />
</FieldRefs>
</ContentType>

So, the main learning I got from this is to remember that the 'Name' attribute of the FieldRef needs to match the 'Name' attribute of the Field - that simple. Why did it work before? No idea unfortunately.

However, I also picked up a few more things I didn't know about, partly from Richard (this guy needs a blog!) and partly from some other reading/experimenting..

Some handy things to know about list provisioning

  • To make a field mandatory on a list, the 'Required' attribute must be 'TRUE'. Not 'True' or 'true' - this is one of the cases where the provisioning framework is pernickety about that 6-choice boolean ;-)
  • FieldRefs need an ID and Name as a minimum (which must match the values in the 'Field' declaration), but you can override certain other things here like the DisplayName - this mirrors what is possible in the UI.
  • You don't have to include the list .aspx files (DispForm.aspx, EditForm.aspx and NewForm.aspx) in your Feature if you use the 'SetupPath' attribute in the 'Form' element in schema.xml (assuming you don't need to associate custom list forms).
  • You can use the 'ContentTypeRef' element to associate your content type with the list (specifying just the content type ID), rather than using the 'ContentType' element which needs to redeclare all the FieldRefs - see the fragment after this list.
  • It's safe to remove all the default 'system' fields from the 'Fields' section of schema.xml
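
To illustrate the 'SetupPath' and 'ContentTypeRef' tips, the relevant fragments of a schema.xml can look something like this (an illustrative fragment only, reusing the Config Store content type ID from above):

<MetaData>
  <ContentTypes>
    <!-- Associate the content type by ID only - no need to redeclare all the FieldRefs -->
    <ContentTypeRef ID="0x0100E3438B2389F84cc3965600BC16BF32E7" />
  </ContentTypes>
  <Forms>
    <!-- SetupPath points at the standard form page, so DispForm/EditForm/NewForm.aspx
         don't need to be included in the Feature -->
    <Form Type="DisplayForm" Url="DispForm.aspx" SetupPath="pages\form.aspx" WebPartZoneID="Main" />
    <Form Type="EditForm" Url="EditForm.aspx" SetupPath="pages\form.aspx" WebPartZoneID="Main" />
    <Form Type="NewForm" Url="NewForm.aspx" SetupPath="pages\form.aspx" WebPartZoneID="Main" />
  </Forms>
</MetaData>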

Going further than these tips, the best thing I found on this is Oskar Austegard's MOSS: The dreaded schema.xml which shows how you can strip a ton of stuff out of schema.xml. I've not tried it yet, but I'm sure that will be my starting point for the next list I provision declaratively. If you're interested in the nuts and bolts of list provisioning, I highly recommend you read it.

Happy XML'ing..

Thursday, 1 November 2007

Master pages/page layouts deployed as Feature not updating

Since Deploying master pages and page layouts as a Feature has been the most heavily commented article on this blog, and several of the posters seem to have run into the same problem, I wanted to write a quick post with some more information from my experiences on this.

So this is something of a non-standard post, feel free to tune out if it doesn't affect you ;-)

Anyway, I decided to do some more testing to see if either I'd got something wrong or if perhaps I was doing something differently to the people having problems. My test was basically to knock up a publishing site with a master page and page layout (associated with a custom content type as it often would be), then go through the update process. This is what I found:

  • making updates to the files (outside of the 12 folder) and then XCOPYing these to overwrite the files in the 12\TEMPLATE\features\MyFeature\ directory successfully updated the site. No need to deactivate/activate the Feature.
  • using a Solution package to deploy the files (when using Features this is generally what I do since I'm in a farm environment) - again this updated the site correctly when I upgraded the Solution (stsadm -o upgradesolution). This is to be expected since underneath the exact same thing is happening as in the previous test. (However, I also noticed occasionally the directory would completely disappear even after the solution upgrade had completed, or was there but still locked by another process, meaning the files could not be accessed even in Windows Explorer - this is slightly irritating but running the Solution upgrade again always succeeded.)
  • after causing the file to be customized (e.g. modifying or even just checking out with SPD), any subsequent updates to the files via the Feature/Solution did not appear on the site (though further updates in SPD are fine).

In short, this is all what I expected. Assuming the file has not been customized, anything which updates the copy on the filesystem will cause an update to the site. If it has been customized, updates on the filesystem will not (since the file has now been added to the content database, and the filesystem version is no longer used). If you've not come across this before, Considerations when using Features to deploy SharePoint files - ghosting/unghosting may help.

So, I'm guessing that if your master pages etc. are not updating when you overwrite the Feature files, it's because the files have become customized somehow. Unfortunately it's not so easy to tell for publishing files - for other files, SharePoint Designer provides a handy blue dot next to the file in its Explorer view if the file is customized, but alas this doesn't happen for master pages/page layouts. The blue dot can be seen next to the AllItems.aspx file below (click to enlarge):

[Screenshot: SharePoint Designer's Explorer view, with the blue 'customized' dot shown next to AllItems.aspx]

Unfortunately this also means reverting to the file on the filesystem is not straightforward either (we can't right-click the file in SPD and select 'Reset to site definition' as we can with other SharePoint files). So this can be a pain if you do want to keep your page layouts referenced from the filesystem (e.g. because performance is critical), but you've ended up in this state. It is possible to revert the files using the API though. I've not needed to do this myself, but the property to check is SPFile.CustomizedPageStatus and if this returns SPCustomizedPageStatus.Customized, then the SPFile.RevertContentStream() method can be used - this should cause SharePoint to henceforth use the version on the filesystem (though note you may lose some updates which had been made after the file was unghosted (customized) - you will need to re-apply these to the filesystem file after the reversion. And remember, don't use SPD for this or you'll be back where you started!)
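
A minimal sketch of that check-and-revert, with placeholder URLs and page layout name:

using Microsoft.SharePoint;

public class PageLayoutReverter
{
    public static void RevertIfCustomized()
    {
        // Placeholder URLs - point these at your site and the customized page layout.
        using (SPSite site = new SPSite("http://myserver"))
        using (SPWeb web = site.OpenWeb())
        {
            SPFile file = web.GetFile("_catalogs/masterpage/MyPageLayout.aspx");

            if (file.CustomizedPageStatus == SPCustomizedPageStatus.Customized)
            {
                // Discards the customized copy in the content database so SharePoint
                // goes back to using the version on the filesystem.
                file.RevertContentStream();
            }
        }
    }
}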

So far, so (reasonably) straightforward.

However, one poster (deelpunt) had an interesting question about updating page layouts with web parts. As far as I can see, updating all page instances to have web parts in web part zones by updating the layouts will not be possible. This is because if web part zones are used, the web part is associated with the page instance rather than the page layout. Indeed, this can be the power of the architecture. It is possible to either:

  • have default web parts added to a zone when a page instance is created from a page layout. This can be done by deploying the page layout using a Feature, and using the AllUsersWebPart tag. This would not affect pages already created however.
  • add web parts to all the pages by adding them directly to the page markup in SPD, rather than in a zone. Of course, this then means the settings for the web part can only be modified by the page designer in SPD, rather than site users.
  • use the API to iterate through all pages in the site to add/modify webparts using SPFile.GetLimitedWebPartManager(). Needless to say, this is the kind of operation which requires a lot of care and planning in production!
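
To give an idea of that last option, here's a rough sketch - the web part type, zone ID and URL are placeholders, and things like check-in/approval and error handling would need proper attention on a real publishing site:

using System.Web.UI.WebControls.WebParts;
using Microsoft.SharePoint;
using Microsoft.SharePoint.WebPartPages;

public class BulkWebPartUpdater
{
    public static void AddWebPartToAllPages()
    {
        // Placeholder URL - the publishing 'Pages' library is assumed.
        using (SPSite site = new SPSite("http://myserver"))
        using (SPWeb web = site.OpenWeb())
        {
            SPList pages = web.Lists["Pages"];

            foreach (SPListItem item in pages.Items)
            {
                item.File.CheckOut();

                SPLimitedWebPartManager manager =
                    item.File.GetLimitedWebPartManager(PersonalizationScope.Shared);

                // ContentEditorWebPart used purely as an example web part.
                ContentEditorWebPart webPart = new ContentEditorWebPart();
                webPart.Title = "Added by script";

                // Zone ID and index are placeholders for a zone on the page layout.
                manager.AddWebPart(webPart, "RightZone", 0);

                item.File.CheckIn("Web part added programmatically");
            }
        }
    }
}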

As I've mentioned before, because of these issues the web part zone architecture is often not the best choice for scenarios such as WCM site development, since here we want our changes to apply across all pages which use the layout.

Hopefully this has been of some use. As always, leave a comment if you've had different experiences to those detailed here, I'd definitely be interested to hear.

[I also wanted to say sincere apologies to the commenters on the original post (and any others I've been slow in replying to) that it took a couple of weeks for me to come back. Something to do with my project going live and moving house at the same time, hopefully normal service now resumed ;-) ]

Monday, 29 October 2007

STSADM export, Content Deployment, Content Migration API, Features/Solutions - deployment options compared

Back in May I wrote a post titled SharePoint deployment options : Features or Content Deployment?, which discussed some thoughts on what was the "right" way to move assets from development to production (and perhaps environments in between) during the site development process in SharePoint. Having now worked on other projects and consciously used different deployment methods on each, I'm rapidly coming to the conclusion that the "right" way to do deployment varies according to circumstances. So I thought what might be useful is an analysis of the whole range of deployment options, with information which might help you decide more easily on how you will complete this crucial step of the process.

So let's run through the options and their characteristics. Note by the way, that none of the options use 'destructive synchronization', where all content is first deleted before import.

Using STSADM export/import

Description:

Uses STSADM commands to generate a file (export) which can then be transferred to the target for import. One of the simplest ways of moving content from one place to another, although unlikely to be suitable as a continuous deployment mechanism. Examples:

stsadm.exe -o export -url http://localhost -filename C:\Export.cab -includeusersecurity -versions 4 -overwrite

stsadm.exe -o import -url http://localhost/sites/newsite -filename C:\Export.cab -includeusersecurity

Good for:

  • Moving an entire site/web as a one off
  • Quick deployment tests
  • Reparenting webs (can be into a different site collection)

Considerations:

  • Content on target will be overwritten if already exists
  • Granularity down to web only
  • Object GUIDs are not preserved (so some things will need to be 'fixed up' e.g. anything that references a list by GUID - ListViewWebPart, using lists with InfoPath forms)
  • Not a backup/restore tool - although it's the option which is most like backup/restore, things like alerts, audit trail, recycle bin items, security state, workflow tasks/state are not exported
  • Not transactional

Using Content Deployment via Central Admin *

Description:

Configured via 'Content Deployment paths and jobs' in Central Admin ('/_admin/deployment.aspx'). A path defines the source/target and authentication details, specific jobs define exactly which content should be deployed and how often. Quick deploy functionality allows users with permissions to specify important content which should be deployed more regularly than existing job schedules configured by administrators (quick deploy items are deployed every 15 mins).

Good for:

  • Moving entire site collections/webs on a scheduled basis e.g. in an authoring/production or authoring/staging/production topology
  • Deploying only incremental changes, e-mail notifications of success/failures
  • Allowing site owners to have some control over content deployment via Quick Deploy
  • Automatically deploying dependencies of content selected for deployment, even if in different site (e.g. page layouts/content types/site columns/referenced images etc.)
  • Automatically transferring the deployment package to the target environment (via HTTP[S])

Considerations:

  • Content on target will be overwritten if already exists
  • Granularity down to web only
  • No differentiation between site content (e.g. pages/images) and site 'infrastructure' (e.g. master pages, page layouts)
  • Object GUIDs are preserved
  • Blank site template should be used for source and destination site collection (see http://support.microsoft.com/kb/923592)
  • Also not a backup/restore tool (see above)
  • Not transactional

Using the content migration API *

Involves writing code which uses the content migration API (known as PRIME) to export then import content - the API is easy to use.

Good for:

  • Complete flexibility over deployment options
  • Granular control over what gets deployed (down to item level)
  • Ability to preserve object GUIDs (so that list GUIDs do not need to fixed-up)
  • Ability to select options for security, versioning and user roles

Considerations:

  • Blank site template should be used for source and destination site collection (see http://support.microsoft.com/kb/923592)
  • Not transactional
  • Also not a backup/restore tool (see above)
  • Need development skills to write code

Using Features/Solutions

The focus of this blog for several articles. Involves defining XML configuration files which SharePoint uses to add artifacts in the correct way on the target. This can be significantly more complex than simply developing in SharePoint Designer but can allow for better management throughout a solution's lifecycle.

Good for:

  • Iterative development/deployment
  • Deployment of assemblies and filesystem files (none of the other methods deal with this)
  • Ability to deploy assemblies/filesystem files to all servers in a farm with Solution packages
  • Possibilities for continuous integration

Considerations:

  • Developer is responsible for evaluating and deploying dependencies (e.g. underlying content types).
  • Updates to content types, list definitions, site columns etc. deployed via a Feature must be done with the API - modifying original Feature files and then reprovisioning is not supported
  • Can be very time-consuming due to lack of assistance from current tools

* Some additional notes on using Content Deployment or content migration APIs:

- appropriate Features will automatically be activated on the target, but they must be present (i.e. installed) for content deployment to work (N.B. publishing feature should not be enabled on target for first deployment)

- using Content Deployment or content migration API with RetainObjectIdentity option should not be combined with STSADM -export/import, since the latter will allocate new IDs!

So clearly there can be a few aspects to consider in choosing how to go about deployment for your project. In many scenarios where Features/Solutions aren't the most appropriate option, I favor using the content migration API, mainly due to the flexibility which isn't provided in any of the other options. Of course it does mean writing code, but as I mentioned last time, I'll soon share the mini-app I wrote so you don't have to!


Sunday, 14 October 2007

Deployment using STSADM export or content migration API

Having focused on deployment using Features for several articles, back in May I wrote an article titled SharePoint deployment options : Features or Content Deployment?, which explored some of the decisions around deployment strategies for SharePoint projects. There are a variety of methods which can be used to move SharePoint artifacts and content from one place to another, and I think it's fair to say there's still a certain amount of confusion around deployment for many SharePoint developers. I certainly wouldn't claim to have all the answers, but after delivering another project last week, it seems like a good time to go over some of the experiences and reflect on the different approaches.

Needless to say, as far as deployment strategies go in general, the best idea is to have one! I see many newsgroup posts from people approaching the end of the development phase asking how they should move their work to the live servers. The problem I find with leaving deployment until the end of the project is that none of the approaches are completely straightforward (particularly depending on what your solution consists of), and so if your project is to be delivered on time, it's important to know what steps you might need to go through.

As a sidenote, let's clarify some potentially confusing terminology here:

  • Content Deployment - the "paths and jobs" functionality which can be used to move content, surfaced by screens in Central Admin
  • Content Migration API - the underlying API (sometimes referred to as PRIME) which actually is used for both STSADM export and Content Deployment (in slightly different ways), in addition to the 'Manage Content and Structure' tool and in migrations from CMS2002

This time round I had decided to use the content migration API to deploy our solution, and it worked well for our circumstances. This is a contrast to developing with Features which I've done in the past, and the main reasons for choosing this approach were: 

  • no need for iterative deployment - although our overall project is phased, for this component we were able to develop the solution and then deploy everything from our development environment. (This approach will not work for subsequent deployments since content our client has generated on the live site would be overwritten on each deployment - more on this in an upcoming post.)
  • ability to retain object GUIDs - this simplified deployment significantly for our project, since if our lists were allocated new GUIDs on deployment (as happens with STSADM export/import), our components which referenced these lists (ListViewWebPart, InfoPath forms etc.) would not hook up properly on the deployment target. This would add a lot of "fix-up" steps to the deployment process if we were to use STSADM export.
  • no direct HTTP access from source Central Admin to target Central Admin - this is a prerequisite to use the Content Deployment functionality (paths and jobs) in Central Admin, but what we needed was a file we could copy to the live server. The content migration API provides this ability and also gives a compression option for large amounts of data.
  • automatic inclusion of database dependencies - as with STSADM export, (but not with Features), SharePoint will analyze and collect all dependencies such as fields, content types, master pages etc. for us.

The API is fairly simple to use and you may have seen Stefan's series of excellent articles on the subject - these serve as a good companion to the MSDN documentation.
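
To give a flavour of what the code looks like, here's a simplified export/import sketch (URLs, paths and file names are placeholders - this isn't the mini-app code):

using Microsoft.SharePoint.Deployment;

public class ContentMigrationDemo
{
    public static void ExportAndImport()
    {
        // Export from the source site collection (placeholder values throughout).
        SPExportSettings exportSettings = new SPExportSettings();
        exportSettings.SiteUrl = "http://source";
        exportSettings.ExportMethod = SPExportMethodType.ExportAll;
        exportSettings.FileLocation = @"C:\Deployment";
        exportSettings.BaseFileName = "export.cmp";
        exportSettings.IncludeSecurity = SPIncludeSecurity.All;

        using (SPExport export = new SPExport(exportSettings))
        {
            export.Run();
        }

        // Import into the target, retaining object GUIDs so that components which
        // reference lists by ID (ListViewWebPart, InfoPath forms etc.) still hook up.
        SPImportSettings importSettings = new SPImportSettings();
        importSettings.SiteUrl = "http://target";
        importSettings.FileLocation = @"C:\Deployment";
        importSettings.BaseFileName = "export.cmp";
        importSettings.RetainObjectIdentity = true;

        using (SPImport import = new SPImport(importSettings))
        {
            import.Run();
        }
    }
}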

It's important to remember that any non-database assets (e.g. user controls, assemblies etc.) need to be deployed manually to the target environment - these will not be included by use of something like STSADM export or the content migration API. In our case, since the live environment was a single server (and versioning would be handled by our main source control system), these were deployed by XCOPY since deployment using Solution packages did not offer any compelling advantages here. 

Whilst we're talking about filesystem files, it's useful to be aware that if you see 404 errors on the target after performing (e.g.) an STSADM import, chances are you've forgotten to deploy something like a user control. The 404 is actually coming from the referenced file rather than the actual page, so don't assume something has gone wrong with the import - a check on 'View all site content' and the import log will probably confirm all the site pages are present!

Hopefully this has given some food for thought on an approach you may not have considered. I guess my main message here is that whilst STSADM export is extremely simple, it may not provide the complete answer to all your deployment challenges due to changing GUIDs. In upcoming posts I'll provide a more direct comparison of deployment strategies (extending my 'Features or content deployment' post), and also share my mini-app which provides a front-end onto the content migration API.

[P.S. Sincere apologies to people who left comments whilst I was on holiday which are still not published - I'll publish these and respond over the next few days.

C.]

Sunday, 12 August 2007

Site definitions - custom code in the site creation process

This is the second article in a series of three, where I aim to show how to customize the site creation process (known as site provisioning) with your own API code. The full introduction and series contents can be found at http://sharepointnutsandbolts.blogspot.com/2007/07/article-series-custom-permissions-with.html. The example customization I'm using is as follows: any sites created with the definition should use a specific set of permissions, and not simply follow the default behavior of inheriting the parent site's permissions. Since this can't be done with a standard site definition (like many other things you might want to do), use of the API is required.

However, today the focus is less on the permission specifics of my example, and more on how generally to add your own code which runs in the site provisioning process. And the best thing is, it's actually very simple if you understand SharePoint Features.

There are many reasons why you might have cause to use the API in the site provisioning process. Essentially, if you can't find a way to do what you want using CAML schema in the onet.xml file, chances are you'll have to write code. Hence, it's almost easier to think of what you can do in the onet.xml file and reverse the list in order to work out scenarios which require code, but some examples which spring to mind anyhow are:

  • changing the custom master page of a site
  • creating a site column which gets its data from a list (see my post on my Feature receiver which does this at Feature to create lookup fields on Codeplex)
  • adding custom unique permissions to a site (the example in this article series)
  • setting a site property from any kind of dynamic lookup

In short, there are many scenarios.


Creating site definitions with VSeWSS

If you've ever created a site definition with Visual Studio Extensions for Windows SharePoint Services, you'll notice that the VS project it gives you contains a file called SiteProvisioning.cs. Inside is an event-handler method, where you can add your custom code which will execute when a site is created from the definition. The class looks like this:

namespace COB.Demos.SiteDefinition
{
    public partial class ProjectXSiteDefinition
    {
        /// <summary>
        ///  Define your own feature activation action code here
        /// </summary>
        public void OnActivated(SPFeatureReceiverProperties properties)
        {
            // my code here..
        }
    }
}

The plumbing behind all this is interesting. At first glance, the method signature looks like a Feature receiver, but it's actually not. However, examining the VS project (you'll need to build the project with F5 at least once to generate the files) reveals that VSeWSS has in fact created some Features in the background. These files can be found under the bin\Debug\solution folder in your VS project (hidden by default - you'll need to do a 'Show All Files' in Visual Studio Solution Explorer). If you do some more delving around to see exactly what VSeWSS is doing, you'll find the following:

  • 2 hidden Features have been created - 1 deploys the 'default.aspx' file, the other has no 'elements' file but is hooked up to a Feature receiver - this is a class in an assembly named the same as your VS project. If you check the GAC, you will indeed find this assembly there.
  • a line similar to the following has been added to the onet.xml file under the 'WebFeatures' element:

    <Feature ID="67b2507c-8822-41dc-b939-3d8f34b5ad13" />


    Notably, this is the ID of the Feature which is hooked up to the Feature receiver.
  • Using Reflector on the assembly containing the Feature receiver shows that the main event-handler method performs some processing and then calls into the OnActivated method shown above, i.e. the place where VSeWSS provides for you to add your own code to execute when sites are created. This code is actually contained in the SiteProvisioning.Internal.cs file within the VS project. (If you're curious as to what on earth all the code in here is doing, the answer as far as I can tell is nothing when site definitions are created with the VSeWSS project template. However, this code is also found when Solution Generator is used to extract a site definition - in that case there are some fixups which need to be done, and this is the code which is used.)

So in summary, VSeWSS creates a hidden Feature which is added to the 'WebFeatures' section of the onet.xml, so that it is automatically activated when the definition is used to create a web*. The Feature is hooked up to a Feature receiver which calls the OnActivated method where your custom code lives.

*(Note that if the definition is used to create a site collection, the root web is also created automatically, so the Feature would also be activated then. Also note the Feature needs to be already installed in the farm for it to be activated in this way).

What we can derive from this is that there's no 'special place' in the site provisioning process to inject custom code, but it can be accomplished by use of a Feature receiver. So if you don't want to use VSeWSS to create site definitions, this is the technique to use to add your custom code to the site creation process.

In terms of what that code might look like, a 'Hello World' example could be:

public void OnActivated(SPFeatureReceiverProperties properties)
{
    SPWeb currentWeb = null;
    SPSite currentSite = null;
    object oParent = properties.Feature.Parent;

    if (properties.Feature.Parent is SPWeb)
    {
        currentWeb = (SPWeb)oParent;
        currentSite = currentWeb.Site;
    }
    else
    {
        currentSite = (SPSite)oParent;
        currentWeb = currentSite.RootWeb;
    }

    currentWeb.Title = "Set from provisioning code at " + DateTime.Now.ToString();
    currentWeb.Update();
}


Hopefully this illustrates that it's quite simple to write code which sets properties on sites created from the definition. Generally the SPWeb object is the entry point, and anything which can be modified through the API can be set at this point. So, this is a pretty powerful technique which can be used in many scenarios.

If you have this type of requirement, I'd definitely recommend using VSeWSS to simplify the process. It's certainly possible to hook everything up manually and package it into a Solution, but the tool does save a large amount of hassle. However as usual with VSeWSS, the price of this is some flexibility. As my sample code in the final article will show, it's sometimes useful to pass data into Features by using Feature properties, and this unfortunately is not supported by VSeWSS. So in case it's useful, the following link provides a zip file containing a Solution/Feature which uses the above technique, without using VSeWSS:

http://sharepointchris.googlepages.com/customcodewithsitedefinitions

In the next and final article, I'll cover the specifics of using the API to modify site permissions as sites are created. As is hopefully clear, this is in conjunction with the technique detailed here so the net result is that the specific permissions are set 'automatically', courtesy of the Feature which is automatically activated against a site when it is created.

Sunday, 5 August 2007

Creating, deploying and updating custom site definitions

This is the first article in a series of three where we'll discuss custom site definitions, and in particular how to run your own custom code in the site creation process. This technique is useful if you need to make any customizations using the API beyond what can normally be accomplished with a site definition. In my series (for the full series contents, see my introduction at http://sharepointnutsandbolts.blogspot.com/2007/07/article-series-custom-permissions-with.html), I use the example of creating a site definition with specific security permissions 'attached' - so that when any sites are created using the definition, specific permissions are applied which are different to those of the parent site (this is unlike the default, which is for new sites to inherit the parent site's permissions). More background can be found in the introductory article linked to earlier.

In this article we'll start with the site definition basics - I'll also supply the set of files used in this article in a link at the end. Fundamentally, a custom site definition is a template from which new SharePoint sites can be created. Customizations can be packaged into the definition, so that they are present automatically in sites created from the template. Consider the following about site definitions:

  • they are created by copying an existing (e.g. out-of-the-box) site definition and adding customizations
  • XML files (in particular the onet.xml file) specify what a site definition consists of (i.e. .aspx pages, images, web parts, functionality [in the form of SharePoint Features])
  • they provide a similar functionality to site templates (.stp files) - for a discussion of the differences between site definitions and site templates see http://msdn2.microsoft.com/en-us/library/aa979683.aspx
  • site definitions can be deployed using a SharePoint Solution package (.wsp file), so that files do not need to be manually copied to each web server in a SharePoint farm

The process for creating site definitions is well-documented in the WSS 3.0 SDK at http://msdn2.microsoft.com/en-us/library/ms454677.aspx, so I actually won't cover it here but would encourage you to follow the link. However I will quickly run through some of the key elements in the onet.xml file to go over what can be done with site definitions. For an example, let's take an extract of the onet.xml file for a publishing site (note for clarity this is an extract only - a full version is available at the link at the end of the article):

<Configuration ID="0" Name="BLANKINTERNET">
  <SiteFeatures>
    <Feature ID="A392DA98-270B-4e85-9769-04C0FDE267AA">
      <!-- PublishingPrerequisites -->
    </Feature>
    <Feature ID="7C637B23-06C4-472d-9A9A-7C175762C5C4">
      <!-- ViewFormPagesLockDown -->
    </Feature>
    <Feature ID="F6924D36-2FA8-4f0b-B16D-06B7250180FA">
      <!-- Office SharePoint Server Publishing -->
    </Feature>
  </SiteFeatures>
  <WebFeatures>
    <Feature ID="00BFEA71-4EA5-48D4-A4AD-305CF7030140" />
    <Feature ID="22A9EF51-737B-4ff2-9346-694633FE4416">
      <!-- Publishing -->
      <Properties xmlns="http://schemas.microsoft.com/sharepoint/">
        <Property Key="ChromeMasterUrl" Value="~SiteCollection/_catalogs/masterpage/BlueBand.master"/>
        <Property Key="WelcomePageUrl" Value="$Resources:cmscore,List_Pages_UrlName;/default.aspx"/>
        <Property Key="PagesListUrl" Value=""/>
        <Property Key="AvailableWebTemplates" Value="*-ProjectX#0"/>
        <Property Key="AvailablePageLayouts" Value="ThreeColumnLayout.aspx"/>
        <Property Key="AlternateCssUrl" Value="" />
        <Property Key="SimplePublishing" Value="false" />
      </Properties>
    </Feature>
  </WebFeatures>
  <Modules>
    <Module Name="LoginPage" />
    <Module Name="Images" />
    <Module Name="Home" />
  </Modules>
</Configuration>


  • The Configuration element represents settings to be used with the selected definition, allowing groups of settings to be reused across definitions for flexibility
  • SiteFeatures /WebFeatures - specifies which Features should be automatically activated when the definition is used to create a site collection or child site respectively. This is key to our overall aim in this series of creating a site definition which applies custom unique permissions to sites created from it.
  • AvailableWebTemplates property of publishing feature - can be used to restrict which site definitions can be used to create child sites within sites created from this definition. This can be useful to prevent your content creators adding a team site onto your public-facing .com website for example. Here I'm specifying that only the 'ProjectX' definition with Configuration '0' can be used for child sites.
  • AvailablePageLayouts property of publishing feature - can be used to restrict which page layouts can be used within sites created from this definition. Again, this can be useful in controlling the look and feel of your website.
  • Modules - a module is a set of files to be automatically added when sites are created. Note it's also possible to specify which web parts should be added by default to a web part page by using the AllUsersWebPart element.


Deploying site definitions


What I really want to focus on however, is how site definitions can be packaged into a Solution to simplify deployment. This is typically most useful when deploying to multiple environments (e.g. test, staging, production) and/or deploying to a farm which consists of multiple SharePoint web servers. If you don't have this requirement, you may want to consider the simpler process of copying the XML files around manually, as detailed in the WSS SDK.

Before we start, let me highlight that Visual Studio Extensions for Windows SharePoint Services (VSeWSS) is a useful tool for creating and deploying custom site definitions, and I'd recommend looking into it for this requirement if you haven't already. However, I'm illustrating the 'manual' way here to hopefully provide an understanding of the nuts and bolts.

So, to package these files as a Solution manually, we follow the SDK instructions to get our customized onet.xml and webtemp*.xml files, but then place the files in a similar folder structure to the existing site definition files found under the 12 folder. I recommend creating a Visual Studio project as the best way to group these files (VSeWSS also follows this approach). This means you should end up with something looking like this:


[Screenshot: the Visual Studio project, with the site definition files arranged to mirror the 12\TEMPLATE folder structure]

Note we also have a .ddf file which is used with makecab.exe to build the Solution package - see my article on building and deploying Solution packages for details on the full process here. Effectively, we need to build the Solution by passing the .ddf file as a parameter to makecab.exe, and then run the STSADM -o addsolution and STSADM -o deploysolution commands to deploy to our target SharePoint site.
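
The build and deploy steps end up looking something like this (file names are placeholders - see the linked article for the DDF contents and the full detail):

makecab.exe /f ProjectXSiteDefinition.ddf

stsadm.exe -o addsolution -filename ProjectXSiteDefinition.wsp

stsadm.exe -o deploysolution -name ProjectXSiteDefinition.wsp -immediate

stsadm.exe -o execadmsvcjobs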

Once this is done, on the 'create site' screen we should see our new template appear:

[Screenshot: the new site template appearing on the 'create site' screen]

Users can now use this definition to create sites which automatically have all the functionality and appearance we specified upfront. If we then need to deploy the definition to other environments, it's a simple case of copying the .wsp file there and running the STSADM commands. We are now well on our way to creating a site definition with custom permissions associated with it.

The set of files used in this article are at http://sharepointchris.googlepages.com/creatinganddeployingcustomsitedefinition

Updating existing site definitions


Finally, a quick note on updating definitions. Care needs to be taken here as often you will want to update files which are in use by sites already created with the definition - this can break things! Generally adding to a definition is OK, but modifying/deleting things can cause problems.

So a good technique is to copy the existing definition, add the updates and deploy for new sites to use, but also hide the earlier version so it cannot be used going forward. This is accomplished by removing the webtemp*.xml file from the TEMPLATE\XML directory. Any sites already provisioned from the earlier version of the definition will continue to run fine, since you're leaving the actual definition (onet.xml etc.) intact over in TEMPLATE\SiteTemplates.

Remember also that new Features can be stapled to existing site definitions (affecting only new sites which are created, not existing ones), and this can be useful in avoiding having to update the site definition itself. See my article on Feature-stapling for more details.
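
For reference, a staple is simply a FeatureSiteTemplateAssociation element in another Feature's element manifest - something like the fragment below, where Id is the Feature being stapled (a placeholder GUID here) and TemplateName identifies the site definition and configuration to attach it to:

<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <!-- Id = the Feature to staple (placeholder GUID); TemplateName = "SiteDefinitionName#ConfigurationId" -->
  <FeatureSiteTemplateAssociation Id="11111111-2222-3333-4444-555555555555" TemplateName="ProjectX#0" />
</Elements>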

So that's site definition basics. Next in the series - how to go beyond simple site definitions: add custom code which will execute when sites are created!