Following Sites across Farms with SharePoint 2013 MySites

I’ve been struggling recently with trying to get following sites working correctly with SharePoint 2013 mysites (see my forum post here).  If you don’t know what functionality I am referring to, see this blog post:  http://sharepoint.microsoft.com/blog/Pages/BlogPost.aspx?pID=1060.  I was attempting the scenario where you have two separate farms, each with its own user profile service application (UPA).  This could be an intranet where mysites are located and an extranet where you are only using the UPA for AD synchronization, or it could be two farms located in different parts of the world, each with its own UPA and potentially its own mysite implementation.  In my case, I need to be able to handle both situations simultaneously.

At first I was getting various combinations of the following errors in the ULS logs of either server or in the HTTP headers I captured with a packet sniffer:

FollowedContent.FollowItem:Exception:System.Net.WebException: The remote server returned an error: (401) Unauthorized.
{"error_description":"Invalid JWT token. Could not resolve issuer token."}
x-ms-diagnostics: 3000006;reason="Token contains invalid signature.";category="invalid_client"
x-ms-diagnostics: 3002002; reason=App principal does not exist

Luckily, after a lot of time spent getting this to work, I believe I have a solution.  First things first: all farms that need to connect must use the same realm.  In the PowerShell scripts below I will use the realm myrealm.  I’ll also be using the farm names mysitehost and collaboration, where mysitehost is the farm which hosts the mysites and collaboration is the external facing farm.
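
A quick way to see what realm a farm currently uses (run it on each farm before and after the scripts below; the values must match):

Get-SPAuthenticationRealm   #must return the same value, e.g. myrealm, on every farm being connected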

First, follow the first few steps here:  http://technet.microsoft.com/en-us/library/ee704552.aspx on creating and copying the certs.
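
For reference, the export side of those TechNet steps boils down to something like the following (this is my paraphrase of the linked article, so double check it against the real steps; copy the resulting .cer files between the farms when done):

#On the collaboration (consuming) farm:
$rootCert = (Get-SPCertificateAuthority).RootCertificate
$rootCert.Export("Cert") | Set-Content "C:\ConsumingFarmRoot.cer" -Encoding byte
$stsCert = (Get-SPSecurityTokenServiceConfig).LocalLoginProvider.SigningCertificate
$stsCert.Export("Cert") | Set-Content "C:\ConsumingFarmSTS.cer" -Encoding byte

#On the mysitehost (publishing) farm:
$rootCert = (Get-SPCertificateAuthority).RootCertificate
$rootCert.Export("Cert") | Set-Content "C:\PublishingFarmRoot.cer" -Encoding byte

Then run this code on the mysitehost farm (publishing farm).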

$trustCert = Get-PfxCertificate "C:\ConsumingFarmRoot.cer"
New-SPTrustedRootAuthority "COLlABORATION" -Certificate $trustCert

$stsCert = Get-PfxCertificate "c:\ConsumingFarmSTS.cer"
New-SPTrustedServiceTokenIssuer "COLlABORATION" -Certificate $stsCert

$farmid = "<guid>"; #Get the farm id from the collaboration farm by typing Get-SPFarm | Select Id into powershell
$security = Get-SPTopologyServiceApplication | Get-SPServiceApplicationSecurity 

$claimProvider = (Get-SPClaimProvider System).ClaimProvider 

$principal = New-SPClaimsPrincipal -ClaimType http://schemas.microsoft.com/sharepoint/2009/08/claims/farmid -ClaimProvider $claimProvider -ClaimValue $farmid 

Grant-SPObjectSecurity -Identity $security -Principal $principal -Rights "Full Control" 

Get-SPTopologyServiceApplication | Set-SPServiceApplicationSecurity -ObjectSecurity $security 

Set-SPAuthenticationRealm -realm "myrealm"
$sts=Get-SPSecurityTokenServiceConfig
$Realm=Get-SpAuthenticationRealm
$nameId = "00000003-0000-0ff1-ce00-000000000000@$Realm"
Write-Host "Setting STS NameId to $nameId"
$sts.NameIdentifier = $nameId
$sts.Update()  #persist the NameIdentifier change

$c = Get-SPSecurityTokenServiceConfig
$c.AllowMetadataOverHttp = $true  #needed if you are not using ssl
$c.AllowOAuthOverHttp=$true #needed if you are not using ssl
$c.Update()

iisreset

Write-Host "Run Consumer Server Script and then press any key to continue ..."
$x = $host.UI.RawUI.ReadKey("NoEcho,IncludeKeyDown")

New-SPTrustedSecurityTokenIssuer -MetadataEndpoint "http://collaboration.com/_layouts/15/metadata/json/1" -Name "collaboration metadata" -RegisteredIssuerName $nameId

Notice the section of the script that will pause and wait while you run the following script on the collaboration farm (consumer).

$trustCert = Get-PfxCertificate "C:\PublishingFarmRoot.cer"
New-SPTrustedRootAuthority "mysitehost" -Certificate $trustCert

Set-SPAuthenticationRealm -realm "myrealm"
$sts=Get-SPSecurityTokenServiceConfig
$Realm=Get-SpAuthenticationRealm
$nameId = "00000003-0000-0ff1-ce00-000000000000@$Realm"
Write-Host "Setting STS NameId to $nameId"
$sts.NameIdentifier = $nameId
$sts.Update()  #persist the NameIdentifier change

$c = Get-SPSecurityTokenServiceConfig
$c.AllowMetadataOverHttp = $true  #needed if you are not using ssl
$c.AllowOAuthOverHttp=$true #needed if you are not using ssl
$c.Update()
Iisreset

New-SPTrustedSecurityTokenIssuer -MetadataEndpoint "http://mysitehost.com/_layouts/15/metadata/json/1" -Name "mysitehost metadata" -RegisteredIssuerName $nameId

The main difference between my scripts and the MS documentation is that I also include the RegisteredIssuerName parameter in the New-SPTrustedSecurityTokenIssuer command (the last command in each script).  I was having issues with this value not being set correctly, and for some reason I was unable to change it after the issuer was created.
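
If you want to confirm that value took, a quick check on either farm (after its script has run) is:

Get-SPTrustedSecurityTokenIssuer | Select-Object Name, RegisteredIssuerName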

Now on your collaboration farm, you will need to set up a separate UPA and set the trusted MySite host locations (http://technet.microsoft.com/en-us/library/ee721061.aspx).  Here is how site following works: when you request to follow a site on your farm, SharePoint looks at that farm’s UPA to find the URL of your mysite (http://mysitehost/personal/domain_userid).  The server then essentially forwards your request to the web service at your mysite URL, and since you are most likely using claims based authentication (best practice for SP 2013), your claim is sent along and the request runs under your credentials.  So if the personal site (PersonalSpace) user profile property is empty in your farm’s UPA, it won’t work; you’ll get a message saying your mysite is still being created.  To get it to work you need to put a valid site relative path in the personal site user profile property for every user.  I first tried the correct value (/personal/domain_userid) and of course that worked, but just putting a / in the field worked as well.

In the interest of correctness, though, I decided to implement the User Profile Replication Engine.  I couldn’t find a 2013 version even though the cmdlets are documented (http://technet.microsoft.com/en-us/library/ee906542.aspx), so I had to download and install the SharePoint 2010 Administration Toolkit (http://technet.microsoft.com/en-us/library/cc663011.aspx#section3).  You only need to install it on one server in one of the farms.  They suggest installing it on the source server because there is less data being sent to the destination farm than being retrieved from the source farm.  Below is my PowerShell code to sync just the personal site (PersonalSpace) user profile property between my mysitehost farm and my collaboration farm; after it is a quick check you can run to confirm the property made it across.

Add-PsSnapin Microsoft.Office.Server.AdministrationToolkit.ReplicationEngine
#Full Sync
Start-SPProfileServiceFullReplication -Destination "http://collaboration.com" -Source "http://mysitehost.com" -Properties "PersonalSpace" -EnableInstrumentation 

#Incremental Sync
Start-SPProfileServiceIncrementalReplication -Destination "http://collaboration.com" -Source "http://mysitehost.com" -FeedProperties "PersonalSpace" -Properties "PersonalSpace" -ReplicationInterval 1 -EnableInstrumentation
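
To spot-check that PersonalSpace actually made it across, you can read it back from the collaboration farm’s UPA with the server object model.  This is just a minimal sketch; the site URL and account name are examples from this post, so adjust them for your environment.

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
$site = Get-SPSite "http://collaboration.com"
$context = Get-SPServiceContext $site
$upm = New-Object Microsoft.Office.Server.UserProfiles.UserProfileManager($context)
$up = $upm.GetUserProfile("domain\userid")
$up["PersonalSpace"].Value   #should come back as something like /personal/domain_userid
$site.Dispose()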

Now if you are just setting up an extranet where you want internal employees to be able to follow sites but external users to have no mysite functionality other than having their AD accounts synchronized, then you are done.  If you are connecting to another UPA which also has mysites and you need to be able to “share” feeds and utilize other social features, you will need to perform the same steps above as if it were the collaboration farm, plus a few additional steps.  Make sure you perform steps 2, 4 and 5 here:  http://technet.microsoft.com/en-us/library/ff621100.aspx.

There you have it; it turns out the piece I was missing for so long was setting the realm and RegisteredIssuerName correctly on both farms so that the claim could be accepted and decrypted properly.  If you are planning on connecting UPAs from farms that do not reside in the same datacenter, please read this page on some known issues you might see:  http://technet.microsoft.com/en-us/library/cc262500.aspx#geodist

 

Speaking at SharePoint Saturday Charlotte

I am speaking at the Charlotte SharePoint Saturday event tomorrow.  Below is my abstract and slide deck if you are interested.  If you are a reader of my blog and are going to the event, please stop by and say hello.

Session Title:  Skanska’s Partner Portal and External Document Management System

Session Summary:  Skanska needed a single, consistent place for all project teams to access the important tools they needed on a day to day basis.  A project document management system which allowed internal and external collaboration and an easy to use landing page were keys to success.

Session Topic:  Allowing users to find tools and documents they need to get their job done can be difficult in an enterprise where every project team has different needs and requirements.  Skanska accomplished this by creating a site template which had easy to use links to different enterprise tools and created an effective project document management system where both external and internal parties contributed content.  We will be demoing an early release product while discussing the following pieces of functionality:

  • Managed Metadata
  • Content types and the syndication hub
  • Content Organizer
  • SkyDrive Pro
  • Email enabled lists and libraries
  • Custom Solutions
    • Metadata Based Security
    • Batch Edit (Codeplex)
    • Download as a Zip

Here is the link to the presentation.

Resources for setting up Kerberos Authentication in SharePoint

I recently had a colleague ask me about Kerberos authentication in SharePoint.  They were attempting to get around the infamous NTLM double-hop issue.  Below is the list of resources which I use when getting Kerberos working in SharePoint.

SharePoint 2010 Kerberos document

  • Word Document with step by step instructions on how to setup Kerberos in SharePoint and SQL

Kerbtray

  • System tray utility that displays your current Kerberos tickets.  It helps you make sure that you are actually logging in using Kerberos and not NTLM.

Delegconfig

Fiddler2

  • This proxy-based packet sniffer is absolutely fantastic for troubleshooting authentication and other SharePoint/website issues.  It will also decrypt HTTPS traffic if you enable that in the settings and add the certificate.

Wireshark

  • If Fiddler2 isn’t showing you everything, this low-level packet sniffer will, but it will NOT decrypt HTTPS traffic.
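
Whichever guide you follow, the step that causes the most grief in my experience is SPN registration for the web application’s app pool account.  The core commands look roughly like this (the host names and service account below are made up, so substitute your own):

#-S checks for duplicate SPNs before adding, which catches the most common misconfiguration
setspn -S HTTP/portal.company.com COMPANY\svc-sp-apppool
setspn -S HTTP/portal COMPANY\svc-sp-apppool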

It may be covered in the documentation above, but if you are publishing SharePoint externally through your firewall, you will need to open a few ports on your firewall to your AD domain controllers.
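
If you want to verify a SharePoint server can actually reach a domain controller once those firewall rules are in place, something like this works on newer versions of Windows (PowerShell 4.0 or later; the DC name below is made up):

Test-NetConnection -ComputerName dc01.company.com -Port 88   #Kerberos itself uses port 88; confirm the full list of required ports with your AD team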

Also, with SharePoint 2013, claims based authentication is the preferred authentication mechanism.  SP 2013 does include a nice authentication mode called Windows claims which works with NTLM and Kerberos, but if you are running any custom code in SharePoint, you might need to change it to work correctly with claims authentication.
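
As a quick illustration of what changes: under Windows claims the same user shows up with a claims-encoded login name instead of DOMAIN\user, so any custom code doing string comparisons on login names needs to encode or decode accordingly.  A minimal sketch (the account name is made up):

$mgr = [Microsoft.SharePoint.Administration.Claims.SPClaimProviderManager]::Local
$claim = $mgr.ConvertIdentifierToClaim("CONTOSO\jdoe", [Microsoft.SharePoint.Administration.Claims.SPIdentifierTypes]::WindowsSamAccountName)
$claim.ToEncodedString()   #returns the claims encoded form, i.e. i:0#.w|contoso\jdoe
$mgr.ConvertClaimToIdentifier("i:0#.w|CONTOSO\jdoe")   #and back to CONTOSO\jdoe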

Process Drop Off Library Items Programmatically

I was working on a solution at work where they wanted to tag and approve all documents submitted to the site.  SharePoint 2010’s Content Organizer seemed like it would meet their needs.  I turned on approval for the drop off library and made the default content type Document.  That way, when an external user submitted a document, they would just hit save; the internal user approving it would then change its content type, fill out the metadata, and approve it.  The content organizer routing would then kick in and put the document in the correct location.

This all worked great when working with a single document.  What didn’t work well was when several documents needed to have the same metadata applied, approved, and routed.  I turned to Batch Edit on CodePlex for this piece.  I had used it before, and with a few tweaks it worked great.  I ran into three issues in this scenario, though.

  1. Batch Edit didn’t allow me to edit the content types, so we still had to go to each item individually to change its content type
  2. Batch Edit had no way to approve the items
  3. When saving an item programmatically in SharePoint, the routing rules did not process the item until the timer job ran (typically set to run at night once a day).

I wrote some code to fix #1, which was pretty straightforward; I can include the code here on the blog if someone leaves a comment saying they want it.  #2 involved adding a second button to the page that says save and approve, plus a little extra code on the save to check whether that button was pressed and approve the item.  #3 was a lot harder to resolve than I thought it would be.

Initially I was using OfficialFileCore.SubmitFile().  After realizing there were some additional permissions needed (the Records Center Web Service Submitters SharePoint group), I actually got it working.  It turns out OfficialFileCore.SubmitFile() is mainly meant to put a file INTO the drop off library and run the routing rules against that file.  The issue was that the files I needed to push through were already in the drop off library, and when using OfficialFileCore.SubmitFile() it made an extra copy of those files in the drop off library with random names that I would somehow need to figure out and delete.  That is not what I wanted and seemed overly complicated.

Remember that timer job that runs nightly to process items in the drop off library and send out emails about items missing metadata?  It’s called the Content Organizer Processing job (Microsoft.Office.RecordsManagement.Internal.RecordsRepositoryJobDefinition).  I opened it up in Reflector and found it calls an internal method named ProcessWaitingFiles, which loops through all of the files and calls another internal method named ProcessWaitingFile(SPListItem, bool).  This second one was really interesting to me since it only requires the SPListItem to process and a bool indicating whether moderation is enabled on the library.  Using reflection I was able to call this method, and my files in the drop off library got routed just as if I had processed them individually.

Here’s the code:

private void ProcessDropOffLibraryItem(SPWeb web, SPListItem item, bool listRequiresModeration)
{
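	//ProcessWaitingFile is internal to the RecordsManagement assembly, so grab it via reflection
	//and invoke it against the routing web to run the content organizer rules for this drop off library item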
	EcmDocumentRoutingWeb routingWeb = new EcmDocumentRoutingWeb(web);
	Type clsSPRequest = typeof(EcmDocumentRoutingWeb).Assembly.GetType("Microsoft.Office.RecordsManagement.RecordsRepository.EcmDocumentRoutingWeb", true);
	System.Reflection.MethodInfo processWaitingFile = clsSPRequest.GetMethod("ProcessWaitingFile", BindingFlags.Instance | BindingFlags.NonPublic);

	try
	{
		object result = processWaitingFile.Invoke(routingWeb, new object[] { item, listRequiresModeration });
	}
	catch (System.Reflection.TargetInvocationException ex)
	{
		string message = string.Format("Unable to route file {0}: {1}", item.File.ServerRelativeUrl, ex.Message);
		SPCriticalTraceCounter.AddDataToScope(67, "SP2010 BatchEdit", 1, message + ": " + ex.StackTrace);
	}
}

SharePoint DNS name considerations when using anonymous access

Last year we migrated our external SharePoint farm from SharePoint 2007 to SharePoint 2010. This environment has anonymous access turned on for the web application, and certain sites have anonymous access enabled. We’ve been having issues when accessing these sites anonymously where it would show us as signed out on one request and then signed in on the next. See the video here to see what I’m talking about (please ignore the fact that the picture is not working; that’s an unrelated issue…).

We ended up opening a case with Microsoft and they were finally able to figure out what’s going on. It’s a cookie conflict problem with the WSS_KeepSessionAuthenticated cookie. This is the cookie that gets set to let SharePoint know that you have already authenticated when on an anonymous site. The reason we have a conflict is because of how the DNS names for our SharePoint sites are set up. We have an SP 2007 environment at https://company.com which sets the cookie value to 443 (the HTTPS port number). Our anonymously accessible SP 2010 environment, the one with the issues, is at https://partners.company.com; it sets the cookie value to a GUID.

Since partners.company.com is a sub-domain of company.com, both the GUID and the 443 get sent to partners.company.com under the same cookie name because of sub-domain cookie sharing. I think the SP 2010 code only looks at the first value when checking for the cookie in the request, and sometimes the 443 is first while other times the GUID is first in the cookie’s NameValueCollection. Thus it shows a user as logged out when the 443 happens to be first and logged in when the GUID is first.

This never showed up for us with SP 2007 because we used SSL everywhere, so the cookie was set to 443 for all of the SharePoint web applications. Now that SP 2010 uses the web application GUID, I see this becoming a bigger problem. Our ultimate solution is to change the DNS names for our SharePoint sites. Until then, I was playing around with an HttpModule to “fix” the issue on our SP 2010 environment. Unfortunately, when I tried to remove the cookie that was not a GUID, it caused a race condition with the SharePoint HttpModule that checks the cookie, and the same behavior occurred even when the correct cookies were being sent.

The next thought was to create an HttpModule for SP 2007 that would remove the cookie after it was set, since we aren’t using anonymous access in our SP 2007 environment. Unfortunately, that didn’t work either because we use blob caching, which exits the pipeline before I could remove the cookie. So my final code, below, always expires the cookie in the hope that a later request will expire any cookie the blob cache sets.

using System;
using System.Collections.Generic;
using System.Text;
using System.Web;

public class RemoveAuthCookie : IHttpModule
{

	/// <summary>
	/// Initializes the module, and registers for application events.
	/// </summary>
	public void Init(HttpApplication context)
	{
		context.PreRequestHandlerExecute += new EventHandler(context_PreRequestHandlerExecute);
		context.PostAuthenticateRequest += new EventHandler(context_PostAuthenticateRequest);
	}

	void context_PostAuthenticateRequest(object sender, EventArgs e)
	{
		HttpApplication app = sender as HttpApplication;
		RemoveCookie(app);
	}

	void context_PreRequestHandlerExecute(object sender, EventArgs e)
	{
		HttpApplication app = sender as HttpApplication;
		RemoveCookie(app);
	}

	void RemoveCookie(HttpApplication app)
	{
		if (app == null) return;
		if (app.Response == null) return;
		if (app.Response.Cookies == null) return;

		try
		{

			if (app.Response.Cookies["WSS_KeepSessionAuthenticated"] == null)
			{
				//WSS_KeepSessionAuthenticated cookie not found
			}
			else
			{

				app.Response.Cookies["WSS_KeepSessionAuthenticated"].Expires = DateTime.Now.AddDays(-1D);
				//WSS_KeepSessionAuthenticated cookie EXPIRED
			}

			if (app.Response.Cookies["MSOWebPartPage_AnonymousAccessCookie"] == null)
			{
				//MSOWebPartPage_AnonymousAccessCookie cookie not found
			}
			else
			{
				app.Response.Cookies["MSOWebPartPage_AnonymousAccessCookie"].Expires = DateTime.Now.AddDays(-1D);
				//MSOWebPartPage_AnonymousAccessCookie cookie EXPIRED
			}                
		}
		catch { }
	}

	/// <summary>
	/// Disposes of the resources (other than memory) used by the module.
	/// </summary>
	public void Dispose()
	{
	}
}

This is not a fun situation to be in, so please keep it in mind when you are architecting your SharePoint solutions: a little thing like the URL used to access a site can be a hard thing to change later.