Getting Active Directory UserId from Windows Claim in SharePoint 2013

We’ve always used NTLM for our SharePoint authentication, but in SharePoint 2013 claims is the preferred authentication method.  Fortunately, SharePoint 2013 ships with something called Windows claims.  This works much like the NTLM auth from before, except the windows auth is converted into a claim that SharePoint can use.

This change means that your userid would look something like this:

i:0#.w|contoso\chris

instead of this:

contoso\chris

Sometimes when calling other services, you need the windows userid and not the claim userid.  So for these instances, I’ve created a few helper methods.

// Requires: using System.Text.RegularExpressions; (the helpers below also need Microsoft.SharePoint.Administration.Claims)
// Regex needs more testing
public const string CLAIMS_REGEX = @"(?<IdentityClaim>[ic])?:?0(?<ClaimType>.)(?<ClaimValueType>.)(?<AuthMode>[wstmrfc])(\|(?<OriginalIssuer>[^\|]*))?(\|(?<ClaimValue>.*))";

public static string GetAdUserIdForClaim(string login)
{
    string userName = login;

    foreach (Match m in Regex.Matches(login, CLAIMS_REGEX, RegexOptions.IgnoreCase))
    {
        try
        {
            // Only windows claims ("w") carry a domain\user value we can hand to other services
            if (m.Groups["AuthMode"].Captures[0].Value.ToLower() == "w")
            {
                userName = m.Groups["ClaimValue"].Captures[0].Value;
            }
        }
        catch { } // a match without the expected groups is simply ignored
    }
    return userName;
}
 
public static string GetClaimForAdUserId(string login)
{
    string userName = login;
    SPClaimProviderManager mgr = SPClaimProviderManager.Local;
    if (mgr == null) return userName;
 
    SPClaim claim = new SPClaim(SPClaimTypes.UserLogonName, login, "http://www.w3.org/2001/XMLSchema#string", SPOriginalIssuers.Format(SPOriginalIssuerType.Windows));
    userName = mgr.EncodeClaim(claim);
 
    return userName;
}
 
public static bool IsLoginClaims(string login)
{
    Regex re = new Regex(CLAIMS_REGEX, RegexOptions.IgnoreCase);
    return re.IsMatch(login);
}

First I made a regular expression to identify the different pieces of a claim (see http://social.technet.microsoft.com/wiki/contents/articles/13921.sharepoint-2013-and-sharepoint-2010-claims-encoding.aspx).  This lets me parse the claim for the windows login name (see GetAdUserIdForClaim) and also validate whether a string is a claim at all (see IsLoginClaims).
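
Here’s a quick sketch of the helpers above in use.  The sample account is just an illustration, and the round trip back to a claim only works when running inside SharePoint since it needs SPClaimProviderManager.Local:

string claimsLogin = @"i:0#.w|contoso\chris";

bool isClaim = IsLoginClaims(claimsLogin);          // true
string adLogin = GetAdUserIdForClaim(claimsLogin);  // "contoso\chris"
string roundTrip = GetClaimForAdUserId(adLogin);    // "i:0#.w|contoso\chris"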

Update 01-22-2015:

After some more usage, I found that I was being too limiting in the Claim Types and Claim Value Types in my regex.  I had based the options from the technet article above but I then ran into some other Claim Types when doing some work recently that were not in that article.  I then found this page:  http://blogs.msdn.com/b/scicoria/archive/2011/06/30/identity-claims-encoding-for-sharepoint.aspx which listed a lot more than the technet article.  It also now seems that almost any value could be there in the future.  Because of this I changed the regex in the code above to allow any value in those two fields.
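
For example, a claim coming from a trusted identity provider still parses as a claim, but because its AuthMode is not “w”, GetAdUserIdForClaim leaves it untouched (the provider name and email below are purely illustrative):

string samlLogin = "i:05.t|adfs|chris@contoso.com";

bool isClaim = IsLoginClaims(samlLogin);          // true - it is a valid claim encoding
string result = GetAdUserIdForClaim(samlLogin);   // unchanged, since the AuthMode is "t" not "w"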

Gotchas using Custom Web Parts and the Minimal Download Strategy

I’ve been playing around with some of my custom code in SharePoint 2013.  One of the issues I’ve been noticing is that when I add any of my custom web parts to a page, the minimal download strategy (MDS) fails over to the normal asp.net webforms page.  You can tell the difference by looking at the url.  A url that looks like this:

/_layouts/15/start.aspx#/SitePages/Home.aspx

is using MDS.  Notice the actual page is start.aspx, and the page it is ajax loading (MDS) is /sitepages/home.aspx (the part after the hash (#)).  Whereas a normal asp.net webforms page url would look like this:

/SitePages/Home.aspx

Both render the same content, but with MDS you get the added benefit of less being downloaded on each click, plus smoother and snappier page changes.
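
If you want to check programmatically whether MDS is even turned on for a web before chasing failovers, the SPWeb object model exposes a flag for it.  A small sketch (the url is a placeholder):

using (SPSite site = new SPSite("http://contoso/sites/team"))
using (SPWeb web = site.OpenWeb())
{
    // True when the Minimal Download Strategy feature is active on this web
    Console.WriteLine("MDS enabled: " + web.EnableMinimalDownload);

    // It can also be toggled here, though normally you (de)activate the MDS web feature instead
    // web.EnableMinimalDownload = false;
    // web.Update();
}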

Gotcha #1 – Decorate your assembly or class with MdsCompliant(true).  If your MDS isn’t working and you see this message in your ULS:

MDSLog: MDSFailover: A control was discovered which does not comply with MDS

then you will need to add the attribute (http://msdn.microsoft.com/en-us/library/microsoft.sharepoint.webcontrols.mdscompliantattribute.aspx).  Here is an example of adding it to the class:

[MdsCompliant(true)]
public class MyWebPart : WebPart
{
	//...Code
}

And here is an example of adding it to an assembly (assemblyinfo.cs):

[assembly: MdsCompliant(true)]

Every control that is loaded on the page needs this attribute.  So if you are loading other controls in your web part, you’ll need to make sure they also have this attribute.

Gotcha #2 – Declarative user controls (typically used for delegate controls) need a codebehind with the attribute set as well.  I use a lot of delegate controls and then use features to swap out the user controls inside them to add or remove functionality on the site.  Typically my user controls didn’t have a codebehind and would just add webparts, html or other controls in the markup.  The issue is that if you want these controls to be swappable under MDS, you will need to add a codebehind to the user control and decorate it with the MdsCompliant attribute.

So a normal user control like this:

<%@ Control Language="C#" AutoEventWireup="true"  %>
<%@ Register TagPrefix="MyWebParts" Namespace="MyWebParts"%>

<MyWebParts:WebPart runat="server" Title="WebPart"></MyWebParts:WebPart>

would need to be converted to this:

<%@ Assembly Name="$SharePoint.Project.AssemblyFullName$" %>
<%@ Control Language="C#" AutoEventWireup="true" CodeBehind="MyControl.ascx.cs" Inherits="MyControl" %>
<%@ Register TagPrefix="MyWebParts" Namespace="MyWebParts"%>

<MyWebParts:WebPart runat="server" Title="WebPart"></MyWebParts:WebPart>

and with the following codebehind:

[MdsCompliant(true)]
public partial class MyControl : UserControl
{
    protected void Page_Load(object sender, EventArgs e)
    {
    }
}

I couldn’t figure out a way to decorate the usercontrol without using a code behind.  If anyone else knows how to do this, please comment or contact me with the info.  Thanks!

Gotcha #3 – Inline scripts are not allowed and will need to be added using SPPageContentManager.  If you receive any of the following messages in your ULS logs, you will need to look at your content.

MDSFailover: document.write
MDSFailover: document.writeln
MDSFailover: Unexpected tag in head
MDSFailover: script in markup

The first two are obvious; you can’t have any document.write or document.writeln calls in your html.  The third is a little less obvious.  According to an MS source, these tags are not allowed in the head:

  • Meta refresh tag
  • link tag for a stylesheet (text/css)
  • script tag
  • style tag
  • title tag
  • base tag

The fourth was the big kicker for me.  I had made the decision a few years ago to switch almost all of my webparts over to being XSLT rendered, which means I had a lot of inline javascript and css in my xslt.  Luckily, I had previously created a special webpart which could ajax load other webparts using an update panel, and I had already solved the inline script issue there.  While writing that ajax loading webpart I found this page http://www.codeproject.com/Articles/21962/AJAX-UpdatePanel-With-Inline-Client-Scripts-Parser which shows how to extend the ootb update panel to automatically find inline scripts and register them so they run correctly under ajax.  I was able to slightly modify this code to work for my XSLT rendered webparts.

public bool IsAnAjaxDeltaRequest
{
	get
	{
		// MDS partial page (delta) requests carry the AjaxDelta query string parameter
		return false == String.IsNullOrEmpty(Context.Request.QueryString["AjaxDelta"]);
	}
}

protected override void Render(HtmlTextWriter output)
{
	base.Render(output);

	// GetHtml() here is this web part's own method that produces the XSLT-rendered markup
	string html = GetHtml();
	if (IsAnAjaxDeltaRequest)
	{
		// On an MDS delta request, pull the inline scripts out of the markup and register
		// them through SPPageContentManager so they run after the partial page load
		html = RegisterAndRemoveInlineClientScripts(this, this.GetType(), html);
	}
	output.Write(html);
}

public static readonly Regex REGEX_CLIENTSCRIPTS = new Regex(
"<script\\s((?<aname>[-\\w]+)=[\"'](?<avalue>.*?)[\"']\\s?)*\\s*>(?<script>.*?)</script>",
RegexOptions.Singleline | RegexOptions.IgnoreCase | RegexOptions.Compiled |
RegexOptions.ExplicitCapture);

public static string RegisterAndRemoveInlineClientScripts(Control control, Type type, string htmlsource)
{
	if (htmlsource.IndexOf("<script", StringComparison.CurrentCultureIgnoreCase) > -1)
	{
		MatchCollection matches = REGEX_CLIENTSCRIPTS.Matches(htmlsource);
		if (matches.Count > 0)
		{
			for (int i = 0; i < matches.Count; i++)
			{
				string script = matches[i].Groups["script"].Value;
				string scriptID = script.GetHashCode().ToString();
				string scriptSrc = "";

				CaptureCollection aname = matches[i].Groups["aname"].Captures;
				CaptureCollection avalue = matches[i].Groups["avalue"].Captures;
				for (int u = 0; u < aname.Count; u++)
				{
					if (aname[u].Value.IndexOf("src",
						StringComparison.CurrentCultureIgnoreCase) == 0)
					{
						scriptSrc = avalue[u].Value;
						break;
					}
				}

				if (scriptSrc.Length > 0)
				{
					SPPageContentManager.RegisterClientScriptInclude(control,
						type, scriptID, scriptSrc);
				}
				else
				{
					SPPageContentManager.RegisterClientScriptBlock(control, type,
						scriptID, script);
				}

				htmlsource = htmlsource.Replace(matches[i].Value, "");
			}

		}
	}
	return htmlsource;
}

Since this code will automatically register any script references it finds, make sure that the paths to your scripts are correct; otherwise MDS will silently fail over without any ULS messages.

Update 5/3/2013:

I have found another potential MDS error message in ULS:

MDSLog: Master page version mismatch occurred in MDS

or in the ajax response if using fiddler:

versionMismatch

I was able to resolve this by browsing to another site and then back to the original site with the issue.  Weird one; if anyone knows more about this error, please contact me or comment below.

Following Sites across Farms with SharePoint 2013 MySites

I’ve been struggling recently with trying to get site following working correctly with SharePoint 2013 mysites (see my forum post here).  If you don’t know what functionality I am referring to, see this blog post:  http://sharepoint.microsoft.com/blog/Pages/BlogPost.aspx?pID=1060.  I was attempting the scenario where you have two separate farms, each with its own user profile service application (UPA).  This could be an intranet where the mysites are located and an extranet where you are only using the UPA for AD synchronization.  Or it could be two farms located in different parts of the world, each with its own UPA and potentially its own mysite implementation.  In my case, I needed to be able to handle both situations simultaneously.

At first I was getting various combinations of the following errors in the ULS logs of either server or in the message headers, which I packet sniffed:

FollowedContent.FollowItem:Exception:System.Net.WebException: The remote server returned an error: (401) Unauthorized.
{"error_description":"Invalid JWT token. Could not resolve issuer token."}
x-ms-diagnostics: 3000006;reason="Token contains invalid signature.";category="invalid_client"
x-ms-diagnostics: 3002002; reason=App principal does not exist

Luckily, after a lot of time spent getting this to work, I believe I have a solution.  First things first: all farms that need to connect must use the same realm.  In my powershell scripts below I will use the realm myrealm.  I’ll also be using the farm names mysitehost and collaboration, where mysitehost is the farm which hosts the mysites and collaboration is the external facing farm.

First, follow the first few steps here:  http://technet.microsoft.com/en-us/library/ee704552.aspx on creating and copying the certs.  Then run this code on the mysitehost farm (the publishing farm).

$trustCert = Get-PfxCertificate "C:\ConsumingFarmRoot.cer"
New-SPTrustedRootAuthority "COLLABORATION" -Certificate $trustCert

$stsCert = Get-PfxCertificate "c:\ConsumingFarmSTS.cer"
New-SPTrustedServiceTokenIssuer "COLLABORATION" -Certificate $stsCert

$farmid = "<guid>"; #Get the farm id from the collaboration farm by typing Get-SPFarm | Select Id into powershell
$security = Get-SPTopologyServiceApplication | Get-SPServiceApplicationSecurity 

$claimProvider = (Get-SPClaimProvider System).ClaimProvider 

$principal = New-SPClaimsPrincipal -ClaimType http://schemas.microsoft.com/sharepoint/2009/08/claims/farmid -ClaimProvider $claimProvider -ClaimValue $farmid 

Grant-SPObjectSecurity -Identity $security -Principal $principal -Rights "Full Control" 

Get-SPTopologyServiceApplication | Set-SPServiceApplicationSecurity -ObjectSecurity $security 

Set-SPAuthenticationRealm -realm "myrealm"
$sts=Get-SPSecurityTokenServiceConfig
$Realm=Get-SpAuthenticationRealm
$nameId = "00000003-0000-0ff1-ce00-000000000000@$Realm"
Write-Host "Setting STS NameId to $nameId"
$sts.NameIdentifier = $nameId

$c = Get-SPSecurityTokenServiceConfig
$c.AllowMetadataOverHttp = $true  #needed if you are not using ssl
$c.AllowOAuthOverHttp=$true #needed if you are not using ssl
$c.Update()

iisreset

Write-Host "Run Consumer Server Script and then press any key to continue ..."
$x = $host.UI.RawUI.ReadKey("NoEcho,IncludeKeyDown")

New-SPTrustedSecurityTokenIssuer -MetadataEndpoint "http://collaboration.com/_layouts/15/metadata/json/1" -Name "collaboration metadata" -RegisteredIssuerName $nameId

Notice the section of the script that will pause and wait while you run the following script on the collaboration farm (consumer).

$trustCert = Get-PfxCertificate "C:\PublishingFarmRoot.cer"
New-SPTrustedRootAuthority "mysitehost" -Certificate $trustCert

Set-SPAuthenticationRealm -realm "myrealm"
$sts=Get-SPSecurityTokenServiceConfig
$Realm=Get-SpAuthenticationRealm
$nameId = "00000003-0000-0ff1-ce00-000000000000@$Realm"
Write-Host "Setting STS NameId to $nameId"
$sts.NameIdentifier = $nameId

$c = Get-SPSecurityTokenServiceConfig
$c.AllowMetadataOverHttp = $true  #needed if you are not using ssl
$c.AllowOAuthOverHttp=$true #needed if you are not using ssl
$c.Update()
iisreset

New-SPTrustedSecurityTokenIssuer -MetadataEndpoint "http://mysitehost.com/_layouts/15/metadata/json/1" -Name "mysitehost metadata" -RegisteredIssuerName $nameId

The main difference in my scripts vs the MS documentation is that I also include RegisteredIssuerName in the New-SPTrustedSecurityTokenIssuer command (the last command in each script).  I was having issues with this value not being set correctly and for some reason was unable to change it after the issuer was created.

Now on your collaboration farm, you will need to set up a separate UPA and set the trusted mysite host locations (http://technet.microsoft.com/en-us/library/ee721061.aspx).  The way site following works is that when you ask to follow a site on your farm, it looks at that farm’s UPA to find the url of your mysite (http://mysitehost/personal/domain_userid).  The server then basically forwards your request to your mysite url’s webservice, and since you are most likely using claims based authentication (best practice for SP 2013), your claim is sent over and the request is made using your creds.  So if the personal site (PersonalSpace) user profile property is empty in your farm’s UPA, it won’t work; you’ll get a message that your mysite is still being created.  To get it to work you need to put a valid site relative path in the personal site user profile property for all users.  I first tried using the correct value (/personal/domain_userid) and of course that worked, but I also tried just putting a / in the field and that worked as well.  In the interest of correctness, though, I decided to implement the User Profile Replication Engine.  I couldn’t find the 2013 version even though the cmdlets are documented (http://technet.microsoft.com/en-us/library/ee906542.aspx), so I had to download and install the SharePoint 2010 Administration Toolkit (http://technet.microsoft.com/en-us/library/cc663011.aspx#section3).  You only need to install it on one server in one of the farms.  They do suggest you install it on the source farm because there is less data being sent to the destination farm than being retrieved from the source farm.  Below is my powershell code to sync just the personal site (PersonalSpace) user profile property between my mysitehost farm and my collaboration farm.

Add-PsSnapin Microsoft.Office.Server.AdministrationToolkit.ReplicationEngine
#Full Sync
Start-SPProfileServiceFullReplication -Destination "http://collaboration.com" -Source "http://mysitehost.com" -Properties "PersonalSpace" -EnableInstrumentation 

#Incremental Sync
Start-SPProfileServiceIncrementalReplication -Destination "http://collaboration.com" -Source "http://mysitehost.com" -FeedProperties "PersonalSpace" -Properties "PersonalSpace" -ReplicationInterval 1 -EnableInstrumentation
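
To sanity check that the replication actually landed, you can read the PersonalSpace property back out of the collaboration farm’s UPA.  This is just a sketch using the user profile object model; the site url and account name are placeholders:

// Requires Microsoft.Office.Server.UserProfiles (UserProfileManager, PropertyConstants)
using (SPSite site = new SPSite("http://collaboration.com"))
{
    SPServiceContext context = SPServiceContext.GetContext(site);
    UserProfileManager profileManager = new UserProfileManager(context);

    UserProfile profile = profileManager.GetUserProfile(@"contoso\chris");
    object personalSpace = profile[PropertyConstants.PersonalSpace].Value;

    // Empty means site following will fail with the "mysite is still being created" message
    Console.WriteLine(personalSpace ?? "(PersonalSpace is empty)");
}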

Now if you are just setting up an extranet where you want internal employees to be able to follow sites but external users to have no mysite functionality beyond their AD accounts being synchronized, then you are done.  If you are connecting to another UPA which also has mysites and you need to be able to “share” feeds and utilize other social features, you will need to perform the same steps above as if it were the collaboration farm, plus a few additional steps.  Make sure you perform steps 2, 4 and 5 here:  http://technet.microsoft.com/en-us/library/ff621100.aspx.

There you have it.  It turns out the piece I was missing for so long was setting the realm and RegisteredIssuerName correctly on both farms so that the claim could be accepted and decrypted properly.  If you are planning on connecting UPAs from different farms where the farms do not reside in the same datacenter, please read this page on some known issues you might see:  http://technet.microsoft.com/en-us/library/cc262500.aspx#geodist


Process Drop Off Library Items Programmatically

I was working on a solution at work where they wanted to tag and approve all documents submitted to the site.  SharePoint 2010’s Content Organizer seemed like it would meet their needs.  I turned on approval for the drop off library and made the default content type Document.  That way, when an external user submitted a document, they would just hit save, and then the internal user approving it would change its content type, fill out the metadata, and approve it.  Then the content organizer routing kicks in and puts it in the correct location.

This all worked great when working with a single document.  What didn’t work well was having several documents that needed the same metadata applied, approved, and routed.  I turned to Batch Edit on CodePlex for this piece.  I had used it before, and with a few tweaks it worked great.  I ran into a few issues in this scenario, though.

  1. Batch Edit didn’t allow me to edit the content types, so we still had to go to each item individually to change its content type
  2. Batch Edit had no way to approve the items
  3. When saving an item programmatically in SharePoint, the routing rules did not process the item until the timer job ran (typically set to run once a day, at night)

I wrote some code to fix #1, which was pretty straightforward, and I can include the code here on the blog if someone leaves a comment saying they want it.  #2 involved adding a second button to the page that says save and approve, plus a little extra code on the save to check whether that button was pressed and approve the item.  #3 was a lot harder to resolve than I thought it would be.
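
For reference, the approval piece (#2) boils down to flipping the item’s moderation status.  This is a minimal sketch, not the exact Batch Edit code, and it assumes content approval (moderation) is enabled on the library:

private void ApproveItem(SPListItem item)
{
    // Mark the item approved so the content organizer is allowed to route it
    item.ModerationInformation.Status = SPModerationStatusType.Approved;
    item.ModerationInformation.Comment = "Approved via batch edit";
    item.Update();
}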

Initially I was using OfficialFileCore.SubmitFile().  After realizing there were some additional permissions needed (the Records Center Web Service Submitters SharePoint group), I actually got it working.  It seems that OfficialFileCore.SubmitFile() is mainly meant to put a file INTO the drop off library and run the routing rules against that file.  The issue was that the files I needed to push through were already in the drop off library, and when using OfficialFileCore.SubmitFile(), it made an extra copy of those files in the drop off library with a random name that I would somehow need to figure out and delete.  This is not exactly what I wanted and seemed overly complicated.

Remember that timer job that runs nightly to process items in the drop off library and send emails about items missing metadata?  It’s called the Content Organizer Processing job (Microsoft.Office.RecordsManagement.Internal.RecordsRepositoryJobDefinition).  I opened it in Reflector and found it calls an internal method named ProcessWaitingFiles, which loops through all of the files and calls another internal method named ProcessWaitingFile(SPListItem, bool).  This second one was really interesting to me since it only requires the SPListItem to process and a bool indicating whether or not moderation is enabled on the library.  Using reflection I was able to call this method, and my files in the drop off library got routed just as if I had processed them individually.

Here’s the code:

private void ProcessDropOffLibraryItem(SPWeb web, SPListItem item, bool listRequiresModeration)
{
	EcmDocumentRoutingWeb routingWeb = new EcmDocumentRoutingWeb(web);

	// ProcessWaitingFile is internal, so grab it off the EcmDocumentRoutingWeb type via reflection
	Type routingWebType = typeof(EcmDocumentRoutingWeb).Assembly.GetType("Microsoft.Office.RecordsManagement.RecordsRepository.EcmDocumentRoutingWeb", true);
	System.Reflection.MethodInfo processWaitingFile = routingWebType.GetMethod("ProcessWaitingFile", BindingFlags.Instance | BindingFlags.NonPublic);

	try
	{
		// Same call the Content Organizer Processing timer job makes for each waiting file
		object result = processWaitingFile.Invoke(routingWeb, new object[] { item, listRequiresModeration });
	}
	catch (System.Reflection.TargetInvocationException ex)
	{
		string message = string.Format("Unable to route file {0}: {1}", item.File.ServerRelativeUrl, ex.Message);
		SPCriticalTraceCounter.AddDataToScope(67, "SP2010 BatchEdit", 1, message + ": " + ex.StackTrace);
	}
}
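
And here is a rough sketch of how the method above might be called for everything sitting in the drop off library.  The library title and the moderation check are assumptions based on a default Content Organizer setup:

using (SPSite site = new SPSite("http://contoso/sites/records"))
using (SPWeb web = site.OpenWeb())
{
    SPList dropOff = web.Lists["Drop Off Library"];   // default Content Organizer library title
    bool requiresModeration = dropOff.EnableModeration;

    // Walk backwards by index since routed items get moved out of the library
    SPListItemCollection items = dropOff.Items;
    for (int i = items.Count - 1; i >= 0; i--)
    {
        ProcessDropOffLibraryItem(web, items[i], requiresModeration);
    }
}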

SharePoint DNS name considerations when using anonymous access

Last year we migrated our external SharePoint farm from SharePoint 2007 to SharePoint 2010. This environment has anonymous access turned on for the web application, and certain sites have anonymous access enabled. We’ve been having issues when accessing these sites anonymously where it would show us as being signed out for one request and then signed in for the next. See the video here to see what I’m talking about (please ignore the fact that the picture is not working; that’s an unrelated issue…).

We ended up opening a case with Microsoft and they were finally able to figure out what’s going on. It’s a cookie conflict problem with the WSS_KeepSessionAuthenticated cookie. This is the cookie that gets set to let SharePoint know that you have already authenticated when on an anonymous site. The reason we have a conflict is because of how the dns names for our SharePoint sites are set up. We have a SP 2007 environment at https://company.com, which sets the cookie value to 443 (the https port number). Our anonymous accessible SP 2010 environment, the one with the issues, is at https://partners.company.com. This environment sets the cookie value to a guid.

Since partners.company.com is a sub-domain of company.com, both the guid and the 443 get sent to partners.company.com under the same cookie name because of sub-domain cookie sharing. I think the SP 2010 code only looks at the first value when checking for the cookie in the request, and sometimes 443 is first and other times the guid is first in the cookie’s namevaluecollection. Thus it shows a user as being logged out when the 443 happens to be first in the namevaluecollection and logged in when the guid is first.
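
One way to see the conflict for yourself is to dump the raw Cookie header from a quick debugging page or module, since Request.Cookies will generally only surface one value per name. A rough sketch:

// Sketch: log every value sent for WSS_KeepSessionAuthenticated on the current request
string rawCookies = HttpContext.Current.Request.Headers["Cookie"];
if (!string.IsNullOrEmpty(rawCookies))
{
    foreach (string part in rawCookies.Split(';'))
    {
        if (part.TrimStart().StartsWith("WSS_KeepSessionAuthenticated", StringComparison.OrdinalIgnoreCase))
        {
            // Expect to see both the 443 value and the web application guid here
            System.Diagnostics.Trace.WriteLine("Cookie value: " + part.Trim());
        }
    }
}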

This never showed up for us with SP 2007 because we used SSL everywhere, so the cookie was set to 443 for all of the SharePoint web applications. Now that SP 2010 uses the web application guid, I see this being a bigger problem. Our ultimate solution is to change the dns names for our SharePoint sites. Until then, I was playing around with an HttpModule to “fix” the issue on our SP 2010 environment. Unfortunately, when I tried to remove the cookie that was not a guid, it caused a race condition with the SharePoint HttpModule which checks the cookie, producing the same behavior even when the correct cookies were being sent. So the next thought was to create an HttpModule for SP 2007 that would remove the cookie after it was set, because we aren’t using anonymous access in our SP 2007 environment. Unfortunately, this didn’t work either because we use blob caching, which exits the pipeline before I could remove the cookie. So my final code, which is below, always expires the cookie, and hopefully a later request will expire any that the blob cache sets.

using System;
using System.Collections.Generic;
using System.Text;
using System.Web;

public class RemoveAuthCookie : IHttpModule
{

	/// <summary>
	/// Initializes the module, and registers for application events.
	/// </summary>
	public void Init(HttpApplication context)
	{
		context.PreRequestHandlerExecute += new EventHandler(context_PreRequestHandlerExecute);
		context.PostAuthenticateRequest += new EventHandler(context_PostAuthenticateRequest);
	}

	void context_PostAuthenticateRequest(object sender, EventArgs e)
	{
		HttpApplication app = sender as HttpApplication;
		RemoveCookie(app);
	}

	void context_PreRequestHandlerExecute(object sender, EventArgs e)
	{
		HttpApplication app = sender as HttpApplication;
		RemoveCookie(app);
	}

	void RemoveCookie(HttpApplication app)
	{
		if (app == null) return;
		if (app.Response == null) return;
		if (app.Response.Cookies == null) return;

		try
		{

			if (app.Response.Cookies["WSS_KeepSessionAuthenticated"] == null)
			{
				//WSS_KeepSessionAuthenticated cookie not found
			}
			else
			{

				app.Response.Cookies["WSS_KeepSessionAuthenticated"].Expires = DateTime.Now.AddDays(-1D);
				//WSS_KeepSessionAuthenticated cookie EXPIRED
			}

			if (app.Response.Cookies["MSOWebPartPage_AnonymousAccessCookie"] == null)
			{
				//MSOWebPartPage_AnonymousAccessCookie cookie not found
			}
			else
			{
				app.Response.Cookies["MSOWebPartPage_AnonymousAccessCookie"].Expires = DateTime.Now.AddDays(-1D);
				//MSOWebPartPage_AnonymousAccessCookie cookie EXPIRED
			}                
		}
		catch { }
	}

	/// <summary>
	/// Disposes of the resources (other than memory) used by the module.
	/// </summary>
	public void Dispose()
	{
	}
}

So, this is not a fun situation to be in. Please keep it in mind when you are architecting your SharePoint solutions: a little thing such as the url used to access a site can be a hard thing to change later.