Custom claim providers for SharePoint – lessons learned

In a number of the websites we have running on SharePoint 2010 we use custom claim providers. In this post I will describe the issues/challenges we had when creating and registering these providers.


The way to register a custom claim provider is to create a feature that does the job. You do this by creating a feature receiver that inherits from SPClaimProviderFeatureReceiver, as described by Steve Peschka. This needs to be a farm scoped feature. After activation, your custom claim provider is available in every web application, on every zone. This means that after a user logs in on one of your web sites, SharePoint notifies all registered claim providers and asks for claims for the user that is logging in. This applies both to an internal user on the internal (default) zone and to an internet user logging in to your website. In our farm we have websites for multiple labels, which means that the custom claim provider for website X of label A also kicks in for website Y, belonging to label B. In a later blog post, Steve also describes how to solve that: in the feature receiver, you set the IsUsedByDefault property of the SPClaimProviderDefinition to false:

public override void FeatureActivated(SPFeatureReceiverProperties properties)
{
    // Let the base class register the claim provider first.
    base.FeatureActivated(properties);

    SPClaimProviderManager cpm = SPClaimProviderManager.Local;
    foreach (SPClaimProviderDefinition cp in cpm.ClaimProviders)
    {
        if (cp.ClaimProviderType == typeof(YourCustomClaimProvider))
        {
            cp.IsUsedByDefault = false;
        }
    }
    // Persist the change.
    cpm.Update();
}
We can now control exactly on which web applications and zones our providers kick in.

LESSON 1: Before choosing the easy way out and making your claim providers available throughout the whole farm, think about whether this is really what you want. Apart from the performance penalty, you will probably find lots of errors in your log files, generated by your claim providers running in places they were not built for.


After activating the farm feature that sets IsUsedByDefault to false, we still need to register the provider on the specific web applications and zones. We have chosen to do this using a custom PowerShell script. We run this script from our Project Installer (more about that later).

param($RedactieUrl, $ClaimProviderName)

$snapin = Get-PSSnapin | Where-Object {$_.Name -eq 'Microsoft.SharePoint.Powershell'}
if ($snapin -eq $null)
{
    Write-Host "Loading Microsoft SharePoint Powershell Snapin"
    Add-PSSnapin "Microsoft.SharePoint.Powershell"
}

function RegisterClaimProviderOnZone {
    param($WebApplication, $Zone, $ClaimProviderName)

    if ($WebApplication.IisSettings.ContainsKey($Zone))
    {
        $settings = $WebApplication.GetIisSettingsWithFallback($Zone)
        $providers = $settings.ClaimsProviders
        if (-not ($providers.Contains($ClaimProviderName))) {
            $providers += $ClaimProviderName
            Set-SPWebApplication -Identity $WebApplication `
                -Zone $Zone `
                -AdditionalClaimProvider $providers
            Write-Host "Registered $ClaimProviderName on $($WebApplication.Url) in zone $Zone"
        } else {
            Write-Host "$ClaimProviderName already registered on $($WebApplication.Url) in zone $Zone"
        }
    }
}

$WebApplication = Get-SPWebApplication $RedactieUrl
RegisterClaimProviderOnZone $WebApplication "Default" $ClaimProviderName
RegisterClaimProviderOnZone $WebApplication "Internet" $ClaimProviderName

It runs just after installing the WSP and activating the farm feature. We run this script with 2 parameters: the url of the web application and the name of the claim provider. In this case, the claim provider gets registered on both the default zone and the internet zone. We have other claim providers that only get registered on the internet zone. When we first started registering our providers, we had a different implementation of our registration script, and this caused us some headaches. The main reason was our misinterpretation of the AdditionalClaimProvider parameter of the Set-SPWebApplication cmdlet. We assumed we could pass a new claim provider and the command would add that new provider to the current collection. That is not the case! The collection you pass IS the new collection.

This is the way NOT to do it:

$claimProvider = Get-SPClaimProvider $ClaimProviderName

Set-SPWebApplication -Identity $RedactieUrl `

             -Zone $Zone `

             -AdditionalClaimProvider $claimProvider

Everything worked fine until the point we needed to register the second claim provider in a web application. After running the installation of site B, site A suddenly stopped working. It took us some time to realize what had happened and that our registration caused the issue. Glad we found this in our test environment.

LESSON 2: When you build and deploy custom claim providers, regression testing is very important, especially if you create providers that are available farm wide or multiple providers are registered in the same web application. Test if your registration works properly and if all providers still work as expected.

Hey, where did my claim providers go?

In our project installation, 95% of the work we need to do is scripted. But we all know the situations where something goes wrong, and you manually tweak some settings here and there. In this case we had to manually set the Custom Signin Page on the Edit Authentication page (Authentication Providers button in the ribbon). And by doing that, we lost the custom claim provider registrations on our web application zone(s). Of course that happened in a pretty narrow installation window and suddenly our users did not get their claims. Oops. It took us some time to find out that we had lost the registration of the providers, and why.

Another reason why we lost our providers is the deactivation of the farm feature. It is still hard to find out why the feature was deactivated, but a number of times we needed to re-activate it.

We now have a custom script that makes it real easy to check (thanks Wouter!). It checks whether a specific provider is available in the farm and whether it is registered for the Internet zone of a specific web app:




param($Url)

$Provider = Get-SPClaimProvider | ? {$_.DisplayName -eq "Our custom Claims"} | Select -First 1

$HasProvider = $Provider -ne $null

Write-Host "Claim Provider exists: $HasProvider"

$WebApplication = Get-SPWebApplication $Url

$IisSettings = $WebApplication.GetIisSettingsWithFallback("Internet")

$HasRegisteredClaimProvider = $IisSettings.ClaimsProviders.Contains("OurCustomClaimProvider")

Write-Host "Our Claim Provider is registered: $HasRegisteredClaimProvider"

And if something is broken, there is a script that fixes it. It is the same as the script above, with an extra feature activation action to activate the farm feature.

LESSON 3: If you register your claims provider on a specific web application zone, never touch the Edit Authentication page; or if you do, run your scripts afterwards to re-register the providers.


A claim provider does have context, but only in a number of methods, where you get a context parameter. And be aware: this context is the web application, it is NOT the url of the site collection your user is using. Methods without the context parameter can and will be called without any context, so there is no SPContext.Current and no HttpContext.Current. We wanted to read a setting for our claim provider that was site collection specific, but after many tries (and errors and unexpected behavior) we decided it was not the way to go. Only a limited number of methods give you a context, and you cannot be 100% sure of having it.

LESSON 4: When designing your claim provider, first study the methods of SPClaimProvider and the context the claim provider gives you. Live with the fact that the only context you will get is the url of the web application, and only in some of the methods.
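As an illustration of the limited context you do get, here is a sketch (the host name and claim type below are made up, not from our actual providers) of one of the SPClaimProvider methods that does receive a context parameter:

```csharp
// Sketch only: FillClaimsForEntity is one of the SPClaimProvider methods that
// receives a context parameter. The host name and claim type are made up.
protected override void FillClaimsForEntity(Uri context, SPClaim entity, List<SPClaim> claims)
{
    // 'context' is the url of the web application zone the user logs in to,
    // NOT the site collection url. Do not rely on SPContext.Current or
    // HttpContext.Current here; they can be null.
    if (context != null && context.Host == "www.label-a.example")
    {
        claims.Add(CreateClaim("http://schemas.example.com/claims/label", "LabelA",
            "http://www.w3.org/2001/XMLSchema#string"));
    }
}
```

CreateClaim is the protected helper SPClaimProvider offers for constructing claims; the web application url is all the targeting information you can count on here.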


LESSON 5: When you use custom claims, make sure you have a custom page or web part in place that shows the claims the current user has. This MSDN page has an example. You will need it for troubleshooting purposes. We have created a _LAYOUTS page that is deployed by our internal platform installation. That is easy, because this way the page is always available in all websites and can be used to troubleshoot multiple different claim providers. And we don't clutter the content with a web part on a page, or need administrative permissions to add one.
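A minimal version of such a troubleshooting page boils down to enumerating the claims of the current identity. This is a sketch, assuming the Microsoft.IdentityModel (WIF) assemblies that claims authentication in SharePoint 2010 is built on:

```csharp
// Minimal sketch for a claims troubleshooting page: dump all claims of the
// current user. Assumes Microsoft.IdentityModel (WIF), as used by SharePoint 2010.
using System.Text;
using System.Web;
using Microsoft.IdentityModel.Claims;

public static string DumpClaims()
{
    IClaimsIdentity identity = HttpContext.Current.User.Identity as IClaimsIdentity;
    if (identity == null)
    {
        return "The current user does not have a claims identity.";
    }

    StringBuilder sb = new StringBuilder();
    foreach (Claim claim in identity.Claims)
    {
        // ClaimType and Issuer tell you which provider/mapping produced the claim.
        sb.AppendFormat("{0} = {1} (issuer: {2})\n", claim.ClaimType, claim.Value, claim.Issuer);
    }
    return sb.ToString();
}
```

Render the result on the _LAYOUTS page and you can immediately see whether your custom provider fired and which claims it issued.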

Permanent link to this article: http://www.tonstegeman.com/blog/2012/01/custom-claim-providers-for-sharepoint-lessons-learned/


Organizing SharePoint projects – Our DTAP street

With a number of SharePoint 2010 projects running in production and a few projects currently in their second stage, I thought it was about time to write something about how we organized these projects. In this post I will describe how we applied DTAP in our projects: which environments we have and how we use them. It took us a year and a half and a few revisions to get where we are now. And here is the disclaimer: it is a setup that works pretty well in our organization. It might work in your organization, but it can also be complete overkill. It depends. Most important is to think about this before you build your street. In this post I hope to give you some insights into our situation that might help you think about yours.

Thinking about how many environments and how to use them is important, because it is good for everyone to have a clear understanding of the purpose of each environment. Also make sure everybody understands the processes around each environment. Who’s doing deployments, who to call when it is down, where to find the latest build, things like that.

We started working with SharePoint 2010, doing an internal proof of concept project around knowledge management. We had good discussions on how to organize things in Visual Studio, how to deploy our software and how to manage versions. A year and a half later things have grown a bit and we needed to re-think our strategies to make the growth possible.


I start off with giving you a schematic overview of our street. A few things to note:

  • Besides intranet projects we also started to run internet facing sites. We have a separate SharePoint production farm for those websites. This also led to 2 acceptance farms, 2 test farms, etc.
  • Multi server farms are shown as 3 servers. This is not the real topology of these farms; the real topology is out of scope for the purpose of this post. Just the fact that a farm is built using multiple servers is important.
  • Every server is virtualized.


In the following paragraphs I will talk about each environment, starting with the Development and moving forward to Production. For every environment I will discuss the following subjects:

  • Usage – The purpose of the environment. Who is using it, and what for.
  • Topology – Single server or multi server.
  • Installation and management – Who is installing the servers and who is responsible for maintaining them. How do we install SharePoint and tooling.
  • Deployment – How are projects deployed to the environment.

Development (D)

Usage – Used by developers and architects to build software and technical proof of concepts. These servers are also used by (functional) designers as demo environments to talk to users about SharePoint and build functional proof of concepts.
Most developers have a dedicated server. For the other roles, we share servers. In SharePoint – BI projects multiple project members share a development server to build and test reports and scorecards.
  • Topology – All development machines are standalone farms. Our development machines are not dedicated to one of the farms. To be flexible when people change teams, new projects start and other projects end, we want to be able to switch our development servers from intranet to internet and vice versa.
  • Installation and management – Initial installation of the OS and SharePoint is done by our IT-Pro team. We have a set of PowerShell scripts that installs SharePoint, its prerequisites and all the tooling that we need. Day to day management is done by the developers themselves. They are responsible for installing service packs etc. They also have a script to change their dev server from an intranet to an internet development server. The main differences are the service applications that are used and the service accounts, which are dedicated to the type of farm. Initially developers get a fully functional machine, with the SharePoint farm up and running. Creating web applications, site collections, etc. is the first thing they do in their projects.
  • Deployment – The software and SharePoint configuration we build in our projects are deployed to the local farm through Visual Studio deployment and through our Project Installer. This is a set of PowerShell scripts that installs WSPs, configures SharePoint and adds content. It probably is the subject of a future post.
    For the developers there is 1 simple ps1 file to kick off. This cleans up the machine and re-installs everything that is created by the whole team. This way front-end developers who don’t know anything about SharePoint are also able to do and test their work in a SharePoint environment.
    In projects that are in their second stage, we still use this procedure to install the project to dev machines. Developers are also responsible for building the upgrade path to their latest version, but this is not tested in the development farm. In the setup we currently have, it is not very easy to go back to a specific version, to test the upgrade path over and over again.

Test – Development (T-D)

Usage – T-D environments are test farms dedicated to a project team. Used by testers in that team to test the software and SharePoint configuration. Main reason to have a dedicated farm is that testers are no longer dependent on other projects that run in the same farm. They control everything themselves and decide on their own deployment schedule to this internal test server. Larger project teams have a dedicated T-D environment. Small projects use the B-D farm for this purpose. If projects have testers that build automatic tests, this typically is the environment they use to run the tests.
  • Topology – All T-D machines are standalone farms. They are the same machines as development machines. This way we are flexible when setting up new teams. T-D machines can become D machines and vice versa. After a project team stops, D and T-D machines are recycled for new projects, either in the intranet or the internet farm.
  • Installation and management – Same story as for development machines.
  • Deployment – Our Project Installer is used to install software to SharePoint and configure the environment for the testers. Generally one of the developers in the team has the task of installing a new version to this environment. Most teams do this every day. The developer gets the latest code from Team Foundation Server, builds and packages everything and runs a re-install on his own machine. This validates that everything checked in the previous day is installable. If this leads to a working site, he/she runs the same re-install (1 big PowerShell script) on the T-D server. If something is wrong, the team has time to fix it and testers still have yesterday's build to continue testing. This way testers always have a working test environment, with just a short break when the installer runs. We started off by always installing the daily build, but this turned out not to be a good idea. Having 3 testers asking every 10 minutes when they can start testing is not very good for your blood pressure.

Build – Development (B-D)

Usage – B-D environments are the build servers dedicated to a farm. We have 2 build servers, one for intranet projects and one for internet projects. We use these environments for 5 purposes:
1) Daily validation if all code builds (using TFS TeamBuild) and can be deployed.
2) Testing if everything works in a multiserver environment, e.g. we catch a lot of “it works on my machine” in this environment. And because of the daily build, we find it the next day after it is checked in, instead of weeks later in the test environment.
3) Integration testing – this is the first environment where all projects get together. We use this to test if everything keeps running after installing the daily update of every project. This currently is a manual process. In the near future we hope to use the automated tests built by the projects teams for this.
4) Internal testing by testers in smaller projects that do not have a dedicated T-D server. We first started off by using this server as the test environment for all projects. This caused too many missed testing hours, because one of the projects was still fixing their daily build. Projects were simply too dependent on each other. We fixed it by introducing T-D servers.
5) Validation by IT-Pro’s if projects that are delivered to T are properly installed and tested by the development teams.
  • Topology – Both B-D farms are multi-server, with a single web frontend.
  • Installation and management – Same story as for development machines. Our development department is also responsible for managing this environment, just like the development servers. The challenge here is that multiple project teams are dependent on this farm. They need to work together to ensure test environments are available when needed and coordinate fixing of their daily builds.
  • Deployment – Our Project Installer is used to install software to SharePoint and configure the environment. This is done by kicking off the PowerShell script for the project installer. This is done using the Windows Task scheduler on the application server, after the TFS teambuilds for all projects are finished. We are currently using TFS2010 with the MSBuild based teambuild. We haven’t yet upgraded the teambuild processes to the new workflow based teambuild. One day we will move to the approach as described by Mike Morton and Chris O’Brien.

Test (T)

Usage – For our IT-Pros, everything we have talked about until now is called Development. For them, the test environment is where it all starts. The environments are used for:
1) Admins use this environment to learn project specific installations.
2) Project teams use it to fine tune the installations before they get to production (we all sometimes miss some config changes we made at the project start and forgot to document….).
3) Admins have a test form that they go through after every deployment. They do basic tests to ensure the environment stays up and running if this goes to production. They check event logs, health analyzer, etc.
4) Project teams use this environment for demos and for testing by business users. Generally all projects are installed a few times to T before they go to A.
5) Test connections to back-end systems. Not every installation to T needs to go through to production. We can install versions on T to validate connections to back office systems.
  • Topology – Both T farms are multi-server.
  • Installation and management – SharePoint and everything it needs is installed to the servers by our scripts, of course without the development tooling. Developers do not have access to this server. They just have read permissions to the diagnostic logging folder. Everything else is done by admins.
  • Deployment – Our Project Installer is used to install software to SharePoint and configure the environment. This is done by an IT Pro. Installation is done based on release notes. When projects deliver for deployment to T, they also deliver this document. It describes all required parameters, prerequisites, how to run the Project Installer and, if needed, manual configuration steps. Projects themselves decide their deployment schedule to the test environment. Some do it every (scrum) sprint, some do it weekly and some do it after a number of sprints. It is the scrum master’s task to ensure a time slot and admin are available to do the installation.
    We have a special mode for our Project Installer; it does not just run every action, but asks whether or not to run each action. For admins this makes it easy to see what happens and to find errors more easily. Installations in T are done with the help of one of the developers.

Acceptance (A)

Usage – The acceptance servers are used by our IT-Pros as final validation before we go to production. For the A farms, deployments need to be without problems. It is the final rehearsal before the P-installation. We try not to do functional testing in this environment. After installing to the A environment we ask the product owner to approve the application (or hotfix) for installation to the production environment. If something goes wrong during installation, or the product owner decides some changes need to be made, we go back to the project team and from B-D to T, and back to A. In an ideal world there is 1 A installation for every P installation.
  • Topology – Both A farms are multi-server.
  • Installation and management – SharePoint is installed to the servers by the same PowerShell scripts, but with different parameter sets.
  • Deployment – A deployments are done by admins, based on the release notes.

Production (P)

Usage – Not a lot to tell here. It's what it's all about. We do all of the above to get everything we create to these farms.
  • Topology – Both production farms are multi-server.
  • Installation and management – SharePoint is installed to the servers by the same PowerShell scripts, but with different parameter sets. The big advantage of these PowerShell scripts is that it is way easier to ensure all environments in our street have the same setup. So if we have a multi tenant farm, all our development machines also run the same multi tenant setup. And if we need PerformancePoint, the development machines also have the PPS service application running. From a quality perspective this is pretty important.
  • Deployment – Production deployments are done by admins, based on the release notes. By the time we get here, the deployments have been tested a number of times, so they should run without problems. Our goal is to automate as much as possible by using and continuously improving our Project Installer. This way there is very little manual configuration left by the time we get to the P-installation. This greatly improves the quality of our installations.

Permanent link to this article: http://www.tonstegeman.com/blog/2011/10/organizing-sharepoint-projects-our-dtap-street/


Security Blueprints – introduction

SharePoint 2007 contains a lot of options for security configuration. In larger site collections, it is very easy to lose the overview of how the security in your site collections is configured. Publishing parts of the content in your site collections for anonymous users makes this even harder. And if you grant your power users the permissions to manage the security settings themselves, it suddenly becomes impossible to keep an overview. If you have done support for a SharePoint environment and have tried to solve security issues, you are probably familiar with this problem. Security blueprints can help you in this scenario. These blueprints are a report of all security related settings in your sites. In the first version of the product, these reports are published as XML files. By creating your own XSLT stylesheets, you can use this product to create your own reports on the security setup of your SharePoint sites.

A number of sample support questions that can be addressed easier using Security Blueprints:

  • I am getting all these request e-mails asking me to grant users permissions. How do I find all places where my e-mail address is configured to be the contact for access requests?
  • The Table of Contents web part does not take permissions into account. We have a number of sub sites with unique permissions, but my users see all subsites. In my other site collection, this works as expected. Using a security blueprint, you can easily check the permissions on these sites and find out users see the subsites in the navigation, because anonymous access is turned on for these sites.
  • I have granted my project managers our custom permission level that allows them to add people to specific SharePoint groups, but it does not work. They cannot add users. In another site collection, this works as described. By comparing security blueprints from the 2 site collections, you can quickly find out the custom permission level misses one of the critical permissions.

The most important settings that are (currently) included in the report:

  • SharePoint sitegroups and their permissions
  • Permission levels
  • Lists / Document libraries and their security settings
  • Anonymous settings
  • Request Access settings
  • Activated Site Features
  • Activated Site Collection Features
  • Site Collection Administrators

Security blueprints are generated manually by a site administrator, or on a scheduled basis by the Security Blueprints timerjob. See the installation article on this weblog for the installation and setup instructions. The reports are published as XML files in an automatically created document library. This library can be added to every site collection, or to a central storage location. Every time a blueprint is generated (manually or scheduled), the library is checked if a report was previously published for the site collection. If this is not the case, the report is published as a Full Report. If a report was previously published, this report is compared to the new report. If there are changes, a new Full Report is published. If there are no changes, a No Changes report is published.
You can exclude specific parts of your site collections by configuring Endpoints. See the installation article for details.
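Conceptually (this is a sketch of the idea, not the actual product code), the decision of which report type to publish on each run boils down to comparing the new report with the last published one:

```csharp
// Sketch of the publish decision in the blueprint process, not the actual
// Security Blueprints code. previousXml is the last published report for
// this site collection, or null if no report was published before.
private static string GetReportType(string previousXml, string newXml)
{
    if (previousXml == null)
    {
        return "Full Report";   // first run: always publish the full blueprint
    }
    return previousXml == newXml
        ? "No Changes"          // nothing changed since the last run
        : "Full Report";        // something changed: publish a new full report
}
```

The "No Changes" entries in the library then act as an audit trail showing that the security setup was verified and found unchanged on those dates.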

The screenshot below shows the blueprints library after the first 3 runs of the process in an empty site collection based on the Collaboration Portal template. After the first run, I have created custom permissions for the Reports site and the document library in the document center. This results in a new full report in the second run. In the 3rd run, there were no changes, as can be seen in the screenshot. To get an idea of what a security blueprint report looks like, the last Full Report of this site collection is available on this link.


Another scenario where security blueprints can help is when you have multiple site collections that upon launch have the same structure and security setup. Before the launch of your new site collections, you create a blueprint of the new site collection. Now your site collection administrators can go wild and do their thing. By automatically publishing a new security report if something changes in the security setup, it is much easier for you to track when security settings are changed. This can make troubleshooting these nasty security issues a lot easier. It allows you to identify the differences between the original security setup (the blueprint) and the current setup in your site collections.


You can download Security Blueprints on the SharePoint Objects site on CodePlex.

Permanent link to this article: http://www.tonstegeman.com/blog/2011/09/security-blueprints-introduction/


Introducing SharePoint Security Blueprints

My CodePlex site contains a new product called Security Blueprints. I have created this for one of our customers and made some enhancements to it. They allowed me to publish this as an open source project (thank you for that!). I hope the solution can help you as it helps them and I hope you like the idea behind it. The current version is a V1 product. There are a lot of ideas for the next version, which I will soon start to work on.

Security blueprints ‘document’ all security settings in your site collections. It comes with a timerjob that repeats this task every time the job runs. It only creates a new report (a blueprint) after something in the settings (or structure) has changed. This allows you to monitor the security setup and should make troubleshooting security issues easier. Our customer that came up with the idea of the blueprints, works with an increasing number of site collections that all have the same basic structure and security setup. During the lifecycle of these site collections, people start modifying structure and security settings. The blueprints helped us in a number of cases to identify the cause of the problem. The blueprint of the ‘master’ site collection is regarded as the documentation of the security setup. By comparing this blueprint with the current report of the site collection, we were able to quickly identify the problem.

This article contains more information about the Security Blueprints.
If you want to test the blueprints, you can download it on CodePlex. The installation instructions are documented in this article.

If you have any suggestions for improvement, or you would like to write an XSLT stylesheet to make the reports more readable, feel free to contact me.

Permanent link to this article: http://www.tonstegeman.com/blog/2009/09/introducing-sharepoint-security-blueprints/


Security Blueprints – installation

This article describes how to install the Security Blueprints in your SharePoint environment. The first step is to install the solution package. After you have done this, this article shows you how to configure the security blueprints. The last part of this article describes how you can manually start the process for 1 site collection.

Step 1 – Install the solution package

The first step is to install the Security Blueprints software to your environment. Unzip the file that you have downloaded from CodePlex to a folder on the server that is running Central Administration.

Start setup.exe and click Next.
The installer runs a system check. If none of the checks fails, you can continue the installation by clicking Next.
In this dialog, select the web applications that will use the Security Blueprints features. Click Next.
The installer will now install the software to your SharePoint environment. Click Next after the process completes.
If all steps were successful, click the Close button.

SharePoint Objects Security Blueprints are now installed in your SharePoint farm. The installation process has installed these files and folders to your server(s):

Name – Location
TST.SharePointObjects.SecurityBluePrint.dll – Global Assembly Cache
CreateSecurityBlueprint.aspx – 12\TEMPLATE\LAYOUTS\TST\
CreateBluePrintsTimerJobSettings.aspx – 12\TEMPLATE\ADMIN\TST\
tstfeature.gif – 12\TEMPLATE\IMAGES\TST\
feature.xml – 12\TEMPLATE\FEATURES\TST.SharePointObjects.SecurityBluePrint.Menu\
menu.xml – 12\TEMPLATE\FEATURES\TST.SharePointObjects.SecurityBluePrint.Menu\
feature.xml – 12\TEMPLATE\FEATURES\TST.SharePointObjects.SecurityBluePrint.CreateBluePrintsTimerJob\

Step 2 – Configure the timer job

Security blueprints are generated by a SharePoint timerjob, which can be installed by activating a feature. Navigate to the Central Administration of your SharePoint farm. On the Application Management tab, select Manage Web application features. On this page, find the web application that runs the site collections that you want to monitor using the security blueprints. Then click the Activate button for the feature 'SharePoint Objects – Security Blueprint Menu'.


The timer job is now installed. It can be configured by using a special administration page. The menu to navigate to this administration page can be made available by activating a site collection feature. Navigate to the Site Settings of the Central Administration site. In the Site Collection Administration section, click Site collection features. Find the feature called 'SharePoint Objects – Security Blueprint Menu' and click Activate.


If you now navigate to the Application Management tab in Central Administration, you will find a new section called ‘SharePoint Objects’. This section has a menu option called ‘Configure timerjob for creating security blueprints’. Click this link to configure the timer job. The first section on this page lets you choose a web application.


If you select a web application that does not have the Security Blueprint Timerjob feature activated, the Status field will notify you that the timer job is not activated. If the feature is activated, the Status field shows the last run time of the timer job. In this section you can also set the display title for the timer job and its schedule.

The second section on the configuration page allows you to configure the location where the blueprints are stored. When the blueprint timer job runs, it creates a security blueprint for every site collection in the web application. This blueprint is saved as an XML file in an automatically created document library. By configuring the Library Site Url setting, you decide where the timer job publishes the blueprint.


There are 3 options:

  • Leave the setting empty
    The blueprint library is created in the root site of each site collection.
  • Enter a relative url (e.g. ‘/admin/blueprints’)
    The blueprint library is created in each site collection, in the subsite with this url. If no subsite is found at this url, the blueprints are saved in the root site of each site collection.
  • Enter an absolute url (e.g. http://admin.intranet/blueprints)
    All blueprints of all site collections are stored in a single document library. The timer job creates a subfolder for each site collection; these folders are hidden from the user in the view. This allows you to manage the blueprints in a central location.
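To make the three options concrete, here is a small sketch of how the target location could be resolved from the Library Site Url setting. The helper name is hypothetical and this is not the actual timer job code, just the decision logic in JavaScript:

```javascript
// Sketch: resolve where a site collection's blueprint is published,
// based on the 'Library Site Url' setting (hypothetical helper).
function resolveBlueprintTarget(librarySiteUrl, siteCollectionUrl) {
  if (!librarySiteUrl) {
    // Option 1: empty setting -> library in the root site of the site collection.
    return { site: siteCollectionUrl, central: false };
  }
  if (/^https?:\/\//i.test(librarySiteUrl)) {
    // Option 3: absolute url -> one central library, subfolder per site collection.
    return { site: librarySiteUrl, central: true };
  }
  // Option 2: relative url -> subsite within each site collection.
  return { site: siteCollectionUrl + librarySiteUrl, central: false };
}
```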

The last section of the timerjob setup page allows you to configure endpoints. Endpoints are relative urls of specific subsites in your site collections. When the blueprint process reaches a site whose url equals one of the endpoints, it stops generating blueprint XML below that site. Suppose you have a subsite called ‘Projects’ with a number of subsites, one per project. You are interested in the security settings of the Projects site itself, but not in those of each individual project site. You can enter ‘/Projects’ as an endpoint, meaning the Projects site is the last site in the tree to be included in the blueprint. You can now add new project sites to your site collection(s) without changing the security blueprint; otherwise every new project site would count as a change to the security blueprint of the site collection, and a new report would be published.


You can enter multiple endpoints by putting every endpoint on a new line in the text box.
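As a sketch of how such a multi-line endpoint list could be applied while walking the site tree (the helper name and exact matching rules are assumptions, not the product's code):

```javascript
// Sketch: decide whether the blueprint walk should stop at a given subsite.
// 'endpointsText' is the multi-line text box value; 'siteRelativeUrl' is the
// subsite url relative to the site collection (e.g. '/Projects').
function isEndpoint(siteRelativeUrl, endpointsText) {
  return endpointsText
    .split(/\r?\n/)
    .map(function (line) { return line.trim().toLowerCase(); })
    .filter(function (line) { return line.length > 0; })
    .indexOf(siteRelativeUrl.toLowerCase()) !== -1;
}
```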

Step 3 – Start the process manually

The Security Blueprints allow you to start the process manually for a single site collection. If you do not have the feature activated for the site collection, navigate to the Site Settings of the root site in your site collection. In the Site Collection Administration section, select Site collection features. Find the feature called ‘SharePoint Objects – Security Blueprint Menu’ and click the Activate button.


If you navigate to the Site Settings page, it now has a new section called SharePoint Objects with a menu option called ‘Create security blueprint’. This link is available for every subsite in the site collection, which allows you to create a blueprint for a single subsite instead of a full report for all sites in the site collection. The root site of the site collection is always included in the blueprint.


After clicking this link, you can manually start the process by clicking the Create button. You can publish the blueprint to a specific location or to a central location in your farm by entering a url. See Step 2 in this article for the details; that step also explains the endpoints you can configure.


After clicking the Create button, the blueprint is created and you are redirected to the library that contains the report.

Permanent link to this article: http://www.tonstegeman.com/blog/2009/09/security-blueprints-installation/


JQuery and SharePoint – Lookup fields and event lists

In one of my recent projects I used some jQuery to change the width of a multivalued lookup field in SharePoint and to hide the Workspace field in an event list.

The script to change the width of multivalued lookup fields:

<script type="text/javascript">
// The original script body was not preserved in this post; this is a
// representative approach. A multivalued lookup field renders as two
// select boxes; widen both. Replace 'My Lookup' with your field's display name.
$(document).ready(function() {
    $("select[title='My Lookup possible values']").width(350);
    $("select[title='My Lookup selected values']").width(350);
});
</script>

When the lookup list has a lot of similar items, it is now much easier for users to pick the right items:


The script to hide the workspace checkbox in an event list:

<script type="text/javascript">
// Representative approach (original script body not preserved): hide the
// table row that contains the 'Workspace' checkbox on NewForm/EditForm.
$(document).ready(function() {
    $("input[title='Workspace']").closest("tr").hide();
});
</script>

This first bit of script hides the ‘Workspace’ checkbox in the NewForm and the EditForm. The script to hide the workspace field from the DispForm.aspx:

<script type="text/javascript">
// Representative approach (original script body not preserved): DispForm
// has no input control, so locate the row by the field label cell instead.
$(document).ready(function() {
    $("td.ms-formlabel:contains('Workspace')").closest("tr").hide();
});
</script>

There are several ways to add the script to the pages. For the lookup fields I used the technique I describe in this blog post. For the event list, I created custom EditForm, NewForm and DispForm pages and added the script to those pages directly.

Permanent link to this article: http://www.tonstegeman.com/blog/2009/08/jquery-and-sharepoint-lookup-fields-and-event-lists-2/


SharePoint Filter web parts: using a context filter in a page layout

For our e-office intranet I was working on a number of page layouts. In one of these page layouts I wanted to use the out of the box Page Field Filter web part. After creating a new page using that page layout, the page crashed immediately, showing the error message “An unexpected error has occurred”. After switching off customErrors in web.config, the error message was “The Hidden property cannot be set on Web Part ‘g_8271d6f6_a902_4fa4_88ce_ca9ae1b0d463′, since it is a standalone Web Part.”.

Context filter web parts are not visible at runtime; they only show up when the page is in edit mode. The web parts use the Hidden property to hide themselves. The way this is done does not work when using the web part directly in a page layout, resulting in this error message. I decided to change the Page Column Filter web part I released on CodePlex to make this work.

In this web part I created a new override of the Hidden property. If there is a web part zone in which the web part is used, it behaves normally. If there is no web part zone, the property always returns false to prevent the web part from throwing the error message above. Here’s the code:

public override bool Hidden
{
    get
    {
        if (base.WebPartManager == null)
            return base.Hidden;
        if (this.Zone == null)
            return false;
        return !base.WebPartManager.DisplayMode.AllowPageDesign;
    }
    set
    {
        base.Hidden = value;
    }
}

Because our web part now returns false when used in a page layout, we need another way to hide the web part at runtime. To do this I also created another override of the Visible property. Here is the code:

public override bool Visible
{
    get
    {
        if (base.WebPartManager == null)
            return base.Visible;
        if (this.Zone != null)
            return true;
        return base.WebPartManager.DisplayMode.AllowPageDesign;
    }
    set
    {
        base.Visible = value;
    }
}

Please note that this code snippet always returns true if there is a web part zone. If you do not do this, the web part will throw an error when it is used in a web part zone (by adding it to the page through the web part gallery).

Now our context filter web part works as expected when used as a normal web part and when used in a page layout.
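The combined effect of the two overrides can be summarized as a small decision table. The sketch below mirrors that logic with a hypothetical helper (plain booleans standing in for WebPartManager, Zone and DisplayMode):

```javascript
// Sketch of the combined Hidden/Visible logic from the overrides above.
// hasManager: a WebPartManager is present; inZone: the part sits in a zone;
// designMode: the current display mode allows page design (edit mode).
// A value of null means 'fall back to the base property'.
function filterPartState(hasManager, inZone, designMode) {
  var hidden = !hasManager ? null   // no manager: use base.Hidden
             : !inZone     ? false  // standalone: never Hidden (avoids the error)
             : !designMode;         // in a zone: hidden outside edit mode
  var visible = !hasManager ? null  // no manager: use base.Visible
              : inZone      ? true  // in a zone: always visible to the framework
              : designMode;         // standalone: only rendered in edit mode
  return { hidden: hidden, visible: visible };
}
```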

Permanent link to this article: http://www.tonstegeman.com/blog/2009/05/sharepoint-filter-web-parts-using-a-context-filter-in-a-page-layout/


SharePoint fabulous 40: fixing the Board of Directors template

Today I was looking at the Board of Directors template, which is one of the Windows SharePoint Services 3.0 Application Templates (also known as the fabulous 40). This template contains an event calendar for board events. When you add a new board event to that list, a SharePoint Designer workflow creates a new Meeting minutes document. Another workflow adds a new discussion to the discussion board. The display form of the board event contains a number of DataForm web parts that show the content related to the board event, as is shown in the screenshot below:


The list of discussion items shows the discussion item created by the SharePoint Designer “Create Discussion” workflow. The link however points to the details page (dispform.aspx) of the discussion item, instead of the page that shows the thread (flat.aspx or threaded.aspx). Another problem is that this view always shows 0 replies. This is because the discussion item is created using the “Create List Item” activity in the workflow. This way an item is added to the discussion board, but it is not a discussion. You can see that by navigating to the list and opening the item from the summary view. You are then also redirected to the dispform.aspx page, and if you reply to the discussion, replies are added to the list as new items instead of as replies to the initial item. The screenshot below shows the issues:


The idea of a workflow that creates a discussion item for every board event is nice, but it does not work properly. There are two ways to fix this. The nicest way is to create a new activity for SharePoint Designer workflows that creates a new discussion item and use that in the workflow. This blog post by Ricardo Costa shows how to do that. Another option is to create a workflow in Visual Studio that creates a new discussion item using SPUtility.CreateNewDiscussion, and then replace the out of the box Create Discussion workflow in your template. In both cases you will need to create the discussion item and link it to the meeting event (using the Meeting lookup field). Here is the code to do that:

foreach (SPList list in workflowProperties.Web.Lists)
{
    // Check if this list is a discussion board.
    if (list.BaseTemplate == SPListTemplateType.DiscussionBoard)
    {
        // Start a new discussion on the title of the event.
        SPListItem newDiscussion = SPUtility.CreateNewDiscussion(list.Items, workflowProperties.Item.Title);

        // Update the lookup field to the event, if the field is available.
        if (list.Fields.ContainsField("Meeting"))
        {
            newDiscussion["Meeting"] = workflowProperties.Item.ID;
        }

        // Update the discussion item.
        newDiscussion.Update();

        // Just add a new discussion thread to the first discussion board found.
        break;
    }
}
The last thing you need to do is fix the link in the DataView web part. Instead of dispform.aspx, the item needs to point to either the flat.aspx or the threaded.aspx page. Open the dispform.aspx page of the Calendar list.

Search in the XSLT of the Discussion web part for the <td> that renders the hyperlink to the dispform.aspx page using the ID of the item. Replace that <a> with this code:

<a href="../../Lists/Team%20Discussion/Threaded.aspx?RootFolder=Lists%2FTeam%20Discussion%2F{@Title}"><xsl:value-of select="@Title" /></a>
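When building the replacement link by hand, mind the url encoding of the RootFolder parameter. The XSLT above relies on {@Title} being url-safe; roughly, the link you want is produced by logic like this sketch (hypothetical helper, not part of the template):

```javascript
// Sketch: build the threaded-view url for a discussion item by title.
// RootFolder is the server-relative folder of the discussion thread.
function threadedUrl(listUrl, title) {
  return listUrl + "/Threaded.aspx?RootFolder=" +
    encodeURIComponent("Lists/Team Discussion/" + title);
}
```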

That’s it, now the discussion board in the Board of Directors template works as it should.

Permanent link to this article: http://www.tonstegeman.com/blog/2009/05/sharepoint-fabulous-40-fixing-the-board-of-directors-template/


Changing the behavior of the Close button in SharePoint

A number of customers have asked us to change the behavior of the Close button in an announcement. The default behavior is that when the user clicks the Close button, he is redirected to the default view of the list, unless the querystring contains the ‘Source’ parameter, in which case the user is redirected to the value of that parameter. The question was whether it is possible to make the Close button behave like the Back button of the browser. This is useful when you aggregate announcements to multiple places in your site collection(s). In this post I will describe how we implemented this change.
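The default redirect behavior can be sketched roughly like this; the helper name is hypothetical and the real logic lives inside SharePoint, but it shows why aggregated views break the Close button when no Source parameter is passed along:

```javascript
// Sketch: where SharePoint sends the user when the Close button is clicked.
// With no 'Source' querystring parameter, the user lands on the list's
// default view, regardless of the page the item was opened from.
function closeRedirectTarget(queryString, defaultViewUrl) {
  var match = /(?:^|[?&])Source=([^&]*)/i.exec(queryString || "");
  return match ? decodeURIComponent(match[1]) : defaultViewUrl;
}
```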

The basic idea is that we add a piece of JQuery script to the Dispform.aspx pages. This script changes the onclick of the button. To add this script to the pages, we add a custom control to the pages using the AdditionalPageHead control template. This is a custom user control that decides whether or not the script is added to the page. This way we only add the script to the pages that need it.

Step 1 – Register the control template

The first step is to create a new feature that will register our control:

<?xml version="1.0" encoding="utf-8" ?>
<Feature Id="F88ED2E4-1318-4679-B0F4-38EECDB608F6"
  Title="Close Button behavior."
  Description="Adjust the behavior of the Close button on display pages."
  Scope="Site"
  xmlns="http://schemas.microsoft.com/sharepoint/">
  <ElementManifests>
    <ElementManifest Location="controls.xml" />
  </ElementManifests>
</Feature>

The contents of the controls.xml file:

<?xml version="1.0" encoding="utf-8" ?>
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <Control Id="AdditionalPageHead"
    ControlSrc="~/_controltemplates/TST/CloseButton.ascx" />
</Elements>

Step 2 – Create the control template

Next thing to do is create the control that is loaded in the AdditionalPageHead. This control loads a custom control called ‘CloseButtonControl’:

<%@ Control Language="C#" ClassName="CloseButton" %>
<%@ Register TagPrefix="TST" Namespace="TST.SharePoint.CloseButton"
    Assembly="TST.SharePoint.CloseButton, Version=, Culture=neutral, PublicKeyToken=b2defc4f610a0b97" %>
<TST:CloseButtonControl ID="CloseButtonControl1" ListTemplates="Announcements;Tasks" runat="server" />

This custom user control has a property called ‘ListTemplates’. This is a semicolon-separated list of list template names that determines to which list templates the new behavior is applied. In case we don’t want to implement this new behavior for all lists in the site collection, we can set the ListTemplates in our ASCX file. If we want the user to go back to the previous page on every list item page, we set it to ‘*’. A control property like ‘ListTemplates’ in the ASCX file is not the best way to implement such configuration values; in a future post I will get back to this configuration topic.
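The semicolon-separated matching, including the ‘*’ wildcard, boils down to logic like this sketch (hypothetical helper; the C# version appears in the control below):

```javascript
// Sketch: does the configured ListTemplates value apply to the current list?
// listTemplates: e.g. "Announcements;Tasks" or "*";
// current: the current list's template name.
function appliesToList(listTemplates, current) {
  if (!listTemplates) return false;
  return listTemplates
    .split(";")
    .filter(function (t) { return t.length > 0; })
    .some(function (t) { return t === "*" || t === current; });
}
```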

Step 3 – Add the script to the page

The next thing to do is create the control that adds the script to the SharePoint pages that need it. The code snippet below shows the code for this control. In OnLoad our control checks if we are on a view item page (dispform.aspx). If that is the case, the template setting is validated. If the current request is the dispform page of one of the list templates that is set up in the ASCX file, the jQuery file and the script itself are added to the page. In this example the script will be added for Task and Announcement items.

namespace TST.SharePoint.CloseButton
{
    public class CloseButtonControl : UserControl
    {
        private const string JQUERYREGISTRATION = "TST.SharePoint.CloseButton.JQuery";
        private const string SCRIPTREGISTRATION = "TST.SharePoint.CloseButton.Script";

        public string ListTemplates { get; set; }

        protected override void OnLoad(EventArgs e)
        {
            base.OnLoad(e);

            // Only run on display (view item) pages of a list.
            // NOTE: the display-mode check is reconstructed; the original
            // third condition was not preserved in this post.
            if (String.IsNullOrEmpty(ListTemplates) ||
                SPContext.Current.List == null ||
                SPContext.Current.FormContext.FormMode != SPControlMode.Display)
            {
                return;
            }

            // Validate list template.
            string[] templates = ListTemplates.Split(new char[] { ';' }, StringSplitOptions.RemoveEmptyEntries);
            foreach (string template in templates)
            {
                if (template == "*" ||
                    SPContext.Current.List.BaseTemplate.ToString() == template)
                {
                    if (!Page.ClientScript.IsClientScriptIncludeRegistered(JQUERYREGISTRATION))
                    {
                        Page.ClientScript.RegisterClientScriptInclude(
                            JQUERYREGISTRATION, "/_layouts/TST/jquery-1.3.2.min.js");
                    }
                    if (!Page.ClientScript.IsStartupScriptRegistered(SCRIPTREGISTRATION))
                    {
                        StringBuilder script = new StringBuilder();
                        script.Append("<script type=\"text/javascript\">");
                        script.Append("$(document).ready( function(){");
                        script.Append("   $(\"input[name$='GoBack']\").each(function() {");
                        script.Append("        var click = \"javascript:history.go(-1);\";");
                        script.Append("        this.onclick = new Function(click);");
                        script.Append("   });");
                        script.Append("});");
                        script.Append("</script>");
                        Page.ClientScript.RegisterStartupScript(this.GetType(),
                            SCRIPTREGISTRATION, script.ToString());
                    }
                    break;
                }
            }
        }
    }
}

The jQuery script itself is very simple: a selector finds the Close button and registers a new click function. The reason for registering this jQuery script from a user control is that we do not want every page to load the jQuery js file and run the script. Otherwise the solution would have been easier: you could register the jQuery file and the script directly in the ASCX file and be done.

Step 4 – Create the solution package

Last thing to do is create the DDF and manifest.xml files to create a solution package. The WSP file created from the DDF file below contains all necessary files.

.OPTION EXPLICIT     ; Generate errors
.Set CabinetNameTemplate=TST.SharePoint.CloseButton_1.0.0.0.wsp
.set DiskDirectoryTemplate=CDROM ; All cabinets go in a single directory
.Set CompressionType=MSZIP;** All files are compressed in cabinet files
.Set UniqueFiles="ON"
.Set Cabinet=on
.Set DiskDirectory1=

manifest.xml manifest.xml

..\12HIVE\TEMPLATE\LAYOUTS\TST\jquery-1.3.2.min.js LAYOUTS\TST\jquery-1.3.2.min.js

..\12HIVE\TEMPLATE\FEATURES\TST.SharePoint.CloseButton\feature.xml TST.SharePoint.CloseButton\feature.xml
..\12HIVE\TEMPLATE\FEATURES\TST.SharePoint.CloseButton\controls.xml TST.SharePoint.CloseButton\controls.xml

These files are deployed to SharePoint using this manifest file:

<?xml version="1.0" encoding="utf-8" ?>
<Solution ResetWebServer="TRUE" SolutionId="7EC65C73-E63F-433f-9695-B0DBD41B811D" xmlns="http://schemas.microsoft.com/sharepoint/">
  <Assemblies>
    <Assembly DeploymentTarget="WebApplication" Location="TST.SharePoint.CloseButton.dll" />
  </Assemblies>
  <TemplateFiles>
    <TemplateFile Location="LAYOUTS\TST\jquery-1.3.2.min.js"/>
    <TemplateFile Location="CONTROLTEMPLATES\TST\CloseButton.ascx"/>
  </TemplateFiles>
  <FeatureManifests>
    <FeatureManifest Location="TST.SharePoint.CloseButton\feature.xml"/>
  </FeatureManifests>
  <CodeAccessSecurity>
    <PolicyItem>
      <PermissionSet class="NamedPermissionSet" version="1" Description="Allow access to TST.SharePoint.CloseButton">
        <IPermission class="AspNetHostingPermission" version="1" Level="Minimal"/>
        <IPermission class="SecurityPermission" version="1" Flags="Execution" />
        <IPermission class="Microsoft.SharePoint.Security.SharePointPermission,
              Microsoft.SharePoint.Security, Version=, Culture=neutral, PublicKeyToken=71e9bce111e9429c" version="1" Unrestricted="True" />
      </PermissionSet>
      <Assemblies>
        <Assembly Name="TST.SharePoint.CloseButton" Version="" PublicKeyBlob="0024000004800000blablabla_etc." />
      </Assemblies>
    </PolicyItem>
  </CodeAccessSecurity>
</Solution>

Step 5 – Test

After installing and deploying the WSP package, you can test the solution by activating the site collection feature. By de-activating the feature, the Close buttons in the lists return to the out of the box SharePoint behavior.

Permanent link to this article: http://www.tonstegeman.com/blog/2009/03/changing-the-behavior-of-the-close-button-in-sharepoint/


Making changes to the SharePoint wiki page template – Adding navigation using a control template

In my current project one of the requirements was to add navigation to SharePoint wikis. We defined a number of rules that drive the navigation, but those are not important for this post. In this post I will describe how I added the navigation controls to the page. Before I started, I thought it would be pretty easy. I started off creating a custom navigation provider. I added the navigation controls to the out of the box template page, which is wkpstd.aspx in the folder ‘12HIVE\TEMPLATE\DocumentTemplates’. I did the last step just on my development machine to test the navigation. Do not do this in your environment, because it is not supported! But for testing the navigation provider, it worked very nicely.

After I finished this, I planned on creating my custom version of wkpstd.aspx and changing the reference to this template page from code. It turns out that this is not possible. And of course I had just done the demo. Everybody liked it and I had just said: ‘it is almost finished, I just need to move the custom navigation controls to a custom template and then we can use it’.

The reference to the template page is stored in the property bag of the SPListItem, in a property called ‘vti_setuppath’. As Gary LaPointe documented in this blog post, you cannot update this property. This was also mentioned in the comments on this post by Mart Muller. So there is no way to change the template for existing wiki pages. I then started to look into the alternative that Mart mentions in his post: a custom page for creating new wiki pages, which then sets the reference to our custom wiki template page. But while researching this option in Reflector, I found that the out of the box page uses some internal methods to add ghosted pages. It is still very hard to get around the out of the box template wkpstd.aspx; it is hardcoded at such a deep level that I found it too difficult to replace with a custom aspx page.

I decided to take a different approach and use a custom usercontrol and add that to the page using the AdditionalPageHead control template. This control loads a custom ASCX file and adds that to the placeholder in the wiki page that is loaded. This control template is activated by a web scoped feature. The elements manifest for the feature looks like this:

<?xml version="1.0" encoding="utf-8" ?>
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <Control Id="AdditionalPageHead"
    ControlSrc="~/_controltemplates/Custom/CustomWiki.ascx" />
</Elements>

The CustomWiki.ascx file is a custom UserControl that is stored in the CONTROLTEMPLATES folder. It does not contain other controls; it just contains the Control directive that registers the inheriting class in the code behind. In this class, I override CreateChildControls:

public class CustomWiki : UserControl
{
    protected override void CreateChildControls()
    {
        if (SPContext.Current.ListItem != null &&
            SPContext.Current.ListItem.Properties.ContainsKey("vti_setuppath") &&
            SPContext.Current.ListItem.Properties["vti_setuppath"].ToString() == "DocumentTemplates\\wkpstd.aspx")
        {
            Control leftActions = GetControl(this.Page, "PlaceHolderLeftActions");
            if (leftActions != null)
            {
                // Load the navigation control and add it to the placeholder.
                Control nav = LoadControl("wikinavigation.ascx");
                leftActions.Controls.Add(nav);

                // Hide the out of the box 'Recent Changes' control.
                Control recentChanges = GetControl(leftActions, "RecentChanges");
                if (recentChanges != null)
                    recentChanges.Visible = false;
            }
        }
    }

    private Control GetControl(Control ctrl, string controlName)
    {
        if (string.Compare(ctrl.ID, controlName) == 0)
            return ctrl;
        foreach (Control c in ctrl.Controls)
        {
            Control result = GetControl(c, controlName);
            if (result != null)
                return result;
        }
        return null;
    }
}

The CreateChildControls checks if we are in a wiki page by checking the value of the vti_setuppath property of the current listitem. If this equals wkpstd.aspx, we know we are in a wiki page. In that case, we use the function GetControl to find the placeholder to which we would like to add our navigation controls; in our case, this is called ‘PlaceHolderLeftActions’. If that placeholder is found, we load a new user control called ‘wikinavigation.ascx’ and add it to the controls collection of the placeholder. Our custom control also hides the out of the box ‘Recent Changes’ control in the wiki page template.
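GetControl is a plain depth-first search over the control tree. The same recursion in a language-neutral sketch, with plain objects standing in for controls (names are illustrative):

```javascript
// Sketch: depth-first search for a node by id, mirroring GetControl.
// Each node is { id: string, children: [] }.
function findById(node, id) {
  if (node.id === id) return node;
  for (var i = 0; i < node.children.length; i++) {
    var result = findById(node.children[i], id);
    if (result !== null) return result;
  }
  return null;
}
```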

I would have preferred to do it in a different way, but I still think this is a nice way to change the out of the box, hard coded, wiki template page. And the advantage of this technique is that when you de-activate the feature, the control template is no longer used and the wiki pages behave like an out of the box SharePoint wiki page.

Permanent link to this article: http://www.tonstegeman.com/blog/2009/03/making-changes-to-the-sharepoint-wiki-page-template-adding-navigation-using-a-control-template/
