SharePoint Search Alerts and the Case of Only 10 Results

A puzzling SharePoint search alert behaviour kept our team of three scratching our heads for days. If you speak Spanish, you can check my teammates’ blog posts about this same error: Ignasi and Miguel.


You have a SharePoint search configured correctly in SharePoint 2010/2013. You search for a keyword and create a search alert for the results.


You make more than 10 changes that should trigger the alert. However, the search alert email you receive lists only 10 changes. The rest of the changes are not sent as alerts.


The Search Alert mechanism in SharePoint 2010/2013 is described in great detail in the following MSDN article. In a nutshell, the process is as follows:

  1. The alert is invoked once a day ("daily summary") or once a week ("weekly summary").
  2. The alert runs the query again with the user-supplied search criteria (in my example case: "Hello").
  3. If there are no new results (results more recent than the last time the alert ran), the alert finishes.
  4. If there are new results, they are formatted according to the search alert email template and sent to the user.

There is one tiny bit of missing information here. Step 2, where the search query is run again, has a hidden parameter that limits the number of results returned from the search index. If you look inside the SharePoint code that processes the alert, you will find a RowLimit parameter supplied to the query:
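The decompiled alert code is not reproduced here, but conceptually the query is built along these lines (a sketch for illustration, not the actual SharePoint source):

```csharp
// Sketch: the alert re-runs the saved query with a capped row limit.
// searchAlertNotificationQuota defaults to 10, which explains the symptom.
query.QueryText = alert.SearchCriteria;   // e.g. "Hello"
query.RowLimit = searchAlertNotificationQuota;
ResultTableCollection results = query.Execute();
```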


This searchAlertNotificationQuota is ultimately surfaced as the AlertNotificationQuota property of the SearchApplicationService object in the SharePoint server object model.


Run a PowerShell script to update this property in the Search service application and set the maximum number of returned search results to a value of your convenience. Let’s say 75:
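A minimal PowerShell sketch (the service application name is a placeholder, and the exact object exposing AlertNotificationQuota may vary between versions; verify against your farm before running):

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# "Search Service Application" is an example name; adjust to your farm
$searchApp = Get-SPEnterpriseSearchServiceApplication "Search Service Application"

# Raise the alert result quota from the default of 10 to 75
$searchApp.AlertNotificationQuota = 75
$searchApp.Update()
```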

Failed to create a custom control ‘PublishingSiteActionsMenuCustomizer’

A very weird and hard-to-pinpoint SharePoint error has haunted me these last few days.

The Symptoms

You have a SharePoint site collection that uses Publishing features. Suddenly, the users can’t access your site. All user accounts, including site collection administrators, get the dreaded "Access Denied" error. In my case, it was SharePoint 2010 with a custom site template with publishing features included in it.

The SharePoint log files mention this:

The Cause

It is really strange that SharePoint can’t load its own components. But the real cause is that the web application the culprit site collection runs on is missing its "super user" settings. The super users are the accounts configured for the Publishing infrastructure to read and write the publishing cache. It seems that if these accounts are not correctly configured, the publishing infrastructure fails badly and SharePoint reports it as "Access Denied".

Two blog posts were of great help: Khashish Sukhija’s and Nico Martens’. Thank you, guys! I checked the web application properties from PowerShell, and the super user entries were empty for the web application that was behaving strangely.

The Fix

Execute the script found in Nico’s post (reproduced here for convenience; all credit is his) and run IISRESET.
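The standard fix looks like this (a sketch based on the widely documented publishing cache configuration; the URL and account logins are placeholders, and the two accounts must also be granted Full Control and Full Read, respectively, through the web application user policy):

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Placeholder URL; point it at the misbehaving web application
$wa = Get-SPWebApplication "http://yourwebapp"

# Claims-encoded logins are placeholders; use dedicated service accounts
$wa.Properties["portalsuperuseraccount"] = "i:0#.w|DOMAIN\superuser"
$wa.Properties["portalsuperreaderaccount"] = "i:0#.w|DOMAIN\superreader"
$wa.Update()
```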

Exposing BLOB Data in Child Entities With Business Connectivity Services

Another interesting issue arose last week. I was tasked with implementing a BCS .NET connector to an OTRS web issue tracking service, as I mentioned earlier. The icing on the cake was extracting the binary data (issue attachments and images) and showing it in SharePoint, leveraging Business Connectivity Services (BCS).

I found a post on how to use BCS to expose SQL Server data, which was not applicable in my case. I also had the extra difficulty of having the attachments in a child entity. The OTRS Ticket was the primary entity in my model, with OTRS Article being a child entity (a 1:N relation between them). The attachments were properties of the article in OTRS, but in the model I attached them to the Ticket in order to make them more accessible.

So, I struggled to build a model that had to comply with two goals:

  • expose entity fields with BLOB data (attachments and their binary data)
  • keep the BLOB entities on the child side of the relation

In this post I will show you how to achieve these goals step by step with Visual Studio.

What do we want to achieve?

This is the model that I’d like to end up with. It has a main entity called Product and a set of child entities called Photo. The child entity has a compound key of both the ProductId and its own PhotoId. It also has a binary field called PhotoData, together with MIMEType and FileName fields that will govern how the photo is exposed to the browser.

Those are the minimum three components for binary data BCS compatibility: MIME type, binary content and a file name.


We will model these two entities in a custom .NET assembly connector. For brevity, I will fake the external service and return hardcoded data read from picture files embedded inside the connector.

Building the Connector

The first step is to create a new Business Data Connectivity Model named ProductModel in Visual Studio.


Visual Studio will create a new entity called Entity1 and will implement the sample entity methods ReadItem and ReadList, together with a .NET implementation of the entity. The main problem with BCS development is that the metadata has to match the implementation hand in glove, and the BCS metadata also has to be internally consistent.


We will begin by filling in Entity1 and changing its name to Product. We’ll also change the name of the model from BdcModel1 to ProductModel. After renaming several files and nodes in the BDC Explorer, we’ll have something like this.


As you can see, the Product entity has ReadItem (gets a product by its ID) and ReadList (gets a list of products) methods. The methods are declared in the BDC model (left side) and their code will reside in the ProductService.cs class.

We will model the Product entity operations first, as every change in the model triggers a change in the code generated in the ProductService class. First, we’ll change the Identifier1 field of the Product entity into a ProductId field of type Int32.


Modeling Mayhem

Then, in the BDCM editor we’ll select the methods and fill in their details in the BDC Method Details pane. This is the tricky part of modeling BDC: it’s very easy to get wrong. Luckily, the BDC Explorer lets us copy and paste metadata to save time. First, we will model the ReadList operation. It will take no parameters and will return a "Return" parameter of type "Collection of Product", which will be our entity metadata. Take a look.


When we edit the metadata, we have the following BDC Explorer tree:


Here we have to change the type and the name of Entity1List, Entity1, Identifier1 and Message. They should be: ProductList, Product, ProductId (Int32) and ProductName (String). The change is done in the Properties window (F4), where we change the Name and Type Name properties. When changing the Type Name, you have to choose the entities from the "Current Project" tab. For collections (such as ProductList) you should select the entity and check the "Is Enumerable" box.


Note: when modeling the ProductId, you also have to specify that the property maps to the Identifier (and the entity it refers to, i.e. Product).


Now, we have to do the same thing (without the collection, of course) for the ReadItem method. It should take one parameter (mapped to the ProductId identifier) and return a Product. The good news is that we can copy and paste the Product node from ReadList into the ReadItem method.


The underlying Product.cs class and the ProductService.cs service have to be changed to return the "hardcoded" data:
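A minimal sketch of what those classes could look like (names follow the entities modeled above; the actual hardcoded data is up to you):

```csharp
using System.Collections.Generic;
using System.Linq;

public partial class Product
{
    public int ProductId { get; set; }
    public string ProductName { get; set; }
}

public partial class ProductService
{
    // Fake "external system" data, hardcoded for brevity
    private static readonly List<Product> Products = new List<Product>
    {
        new Product { ProductId = 1, ProductName = "Sailboat" },
        new Product { ProductId = 2, ProductName = "Kayak" }
    };

    // SpecificFinder: returns one product by its identifier
    public static Product ReadItem(int productId)
    {
        return Products.FirstOrDefault(p => p.ProductId == productId);
    }

    // Finder: returns all products
    public static IEnumerable<Product> ReadList()
    {
        return Products;
    }
}
```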

At this moment we have a workable connector that exposes products and product details, but nothing else. We will do a quick check by deploying the connector and creating a new external list in SharePoint.




Well done! Now we have to model the photos 🙂

Adding the Photo Entity to the model

First, we’ll add a new class to the project with the following simple properties:
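Based on the fields shown in the model earlier, the Photo class could look like this (a sketch; the property names follow the model above):

```csharp
public partial class Photo
{
    public int ProductId { get; set; }    // key of the parent Product
    public int PhotoId { get; set; }      // the photo's own key
    public string FileName { get; set; }  // exposed file name
    public string MIMEType { get; set; }  // e.g. "image/png"
    public byte[] PhotoData { get; set; } // the BLOB content itself
}
```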

Then, we add an entity to the BDCM model canvas by right-clicking it and choosing "Add / Entity":



Of course, we have to change the name and add the properties of the entity. We have to add both identifiers, the PhotoId and the ProductId. They both have to refer to the Photo entity, and in the association we will let BDC know that it will provide the value of ProductId when the association is navigated.

I have also added a ReadItem method.


Before adding the association, you have to model the ReadItem method and add an instance of that method, which should be of the SpecificFinder type. It will take two In parameters with the two identifiers of the Photo entity and will return an instance of the Photo class, with all its fields.


We’ll add the association between Product and Photo entities now, right-clicking again on the BDCM canvas:


In the dialog, we’ll make sure the association is correct; in this case we will only have navigation from Product to Photo, not the other way around. We’ll remove the extra navigation method (the last one) and uncheck the Foreign Key association, as the ProductIds are returned by the code of the association method in the Product class.


Now we have a new method called ProductToPhoto in the Product entity that returns a list of photos for that product.


We still have to do the "boring" stuff of mapping the return types in the BDC Explorer pane:


After that, we have to write the code for the ProductToPhoto method. At this moment we won’t be showing the photo yet, so we can set the BLOB array to null.
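A sketch of the association method (hardcoded like the rest of the fake service; the file name and MIME type are example values, and the BLOB stays null until the stream accessor is wired up):

```csharp
using System.Collections.Generic;

public partial class ProductService
{
    // Association navigator: returns the child Photos for a given Product
    public static IEnumerable<Photo> ProductToPhoto(int productId)
    {
        return new List<Photo>
        {
            new Photo
            {
                ProductId = productId,
                PhotoId = 1,
                FileName = "sailboat.png",
                MIMEType = "image/png",
                PhotoData = null // no BLOB yet
            }
        };
    }
}
```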

Ready to roll! Deploy the solution to SharePoint and create External Content Type Profile pages in the BDC Service Application (Central Administration). It will automatically add the related Photos to the Product in its profile page.

We have to delete and recreate the external list. Now we can go to the View Profile action and see the details of the product and its photos:


Reading the photos

The only thing missing is the link to see the actual photo (the BLOB content). We have to add a StreamAccessor method and a method instance.

We can’t add this method in the entity designer. We have to open the BDCM file as an XML file and then add the Method and MethodInstance nodes to it.


We will add our method under the existing ReadItem method:


The XML snippet to insert is this one:
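A sketch of what the Method and MethodInstance nodes can look like (the names and the IdentifierEntityNamespace follow the model above and may need adjusting to your own model):

```xml
<Method Name="ReadPhotoData">
  <Parameters>
    <Parameter Name="productId" Direction="In">
      <TypeDescriptor Name="ProductId" TypeName="System.Int32"
          IdentifierName="ProductId" IdentifierEntityName="Photo"
          IdentifierEntityNamespace="ProductModel" />
    </Parameter>
    <Parameter Name="photoId" Direction="In">
      <TypeDescriptor Name="PhotoId" TypeName="System.Int32"
          IdentifierName="PhotoId" IdentifierEntityName="Photo"
          IdentifierEntityNamespace="ProductModel" />
    </Parameter>
    <Parameter Name="photoData" Direction="Return">
      <TypeDescriptor Name="PhotoData" TypeName="System.IO.Stream" />
    </Parameter>
  </Parameters>
  <MethodInstances>
    <MethodInstance Name="ReadPhotoData" Type="StreamAccessor"
        ReturnParameterName="photoData">
      <Properties>
        <!-- Tell BCS which entity fields carry the MIME type and file name -->
        <Property Name="MimeTypeField" Type="System.String">MIMEType</Property>
        <Property Name="FileNameField" Type="System.String">FileName</Property>
      </Properties>
    </MethodInstance>
  </MethodInstances>
</Method>
```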

As you can see, we return a Stream with the data. We have two additional instance properties that specify which entity property is the MIME type and which one is the file name.

Check the mappings: both identifiers should be mapped to the Photo entity, both as parameters and as return values (for the ReadItem method). If not, it will complain at runtime about "Expected 2 identifiers and found only 1". It took me some time to solve that one!

In our PhotoService.cs class we have to add the method that returns a Stream with the data. In my case I use a Base64 string with a small sailboat image in PNG format, encoded with an excellent web site that turns an image into a string. I then use the .NET Convert class to convert that string back into the original array of bytes. (In this snippet I have shortened the string for legibility):
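A sketch of the stream accessor implementation (the Base64 string is truncated here, as in the post):

```csharp
using System;
using System.IO;

public partial class PhotoService
{
    public static Stream ReadPhotoData(int productId, int photoId)
    {
        // Base64-encoded PNG of a small sailboat, shortened for legibility
        string base64Photo = "iVBORw0KGgoAAAANSUhEUg...";

        // Convert the Base64 text back into the original bytes and wrap
        // them in a MemoryStream for BCS to serve to the browser
        byte[] photoBytes = Convert.FromBase64String(base64Photo);
        return new MemoryStream(photoBytes);
    }
}
```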

Deploy again to SharePoint, rebuild the external content type profile pages and it’s done!
The complete code for this example is available on my SkyDrive.

BDC Visual Studio Project and Missing Assembly Trouble

I just had a strange error the other day, deploying a Business Connectivity Services (BCS) model arranged around a .NET assembly. When accessing the external list data, I found the following error:

Assembly was requested for LobSystem with Name 'Namespace.LobSystem', but this assembly was not returned. SystemUtility of Type 'Microsoft.SharePoint.BusinessData.SystemSpecific.DotNetAssembly.DotNetAssemblySystemUtility' requires the assembly to be uploaded.

Of course, I checked the assembly and it was loaded in the GAC. So, where’s the error coming from?

Well, our friend BCS registers the assemblies for your external content type when you activate the feature containing your BCS model and assembly. This feature is created automatically when you create a new BCS project in Visual Studio. The feature has a custom feature receiver and also a custom entry in the feature.xml declaration.

  <Property Key="GloballyAvailable" Value="true" />
  <Property Key="IncrementalUpdate" Value="false" />
  <Property Key="ModelFileName" Value="YourModel\YourModel.bdcm" />
  <Property Key="BdcModel1" Value="BdcAssemblies\YourAssembly.dll" />

The key of the last property has to match the name of the LOB system in the BDCM file (the entity model):

<LobSystem Name="LobSystemName" Type="DotNetAssembly">
    <LobSystemInstance Name="LobSystemInstance" />

My error came from renaming the model at some point. The rename went well in the model XML, but the old name ("BdcModel1") still remained in the feature.xml. After manually editing the feature.xml and pointing the property key to the new name of the LOB system, the error was gone:

  <Property Key="GloballyAvailable" Value="true" />
  <Property Key="IncrementalUpdate" Value="false" />
  <Property Key="ModelFileName" Value="YourModel\YourModel.bdcm" />
  <Property Key="LobSystemName" Value="BdcAssemblies\YourAssembly.dll" />

Configuring Content Organizer Rules with PowerShell

As you probably know from my previous posts, I have been configuring a wide-scale document management solution using the Content Organizer feature of SharePoint. The idea is to use managed metadata to tag each document with information about the business unit and region it originates from (in my case, the Region and Section metadata columns) and let SharePoint classify it into the correct site and document library. I wrote about how to expose cross-site content organizer hubs a few months ago.

As I had many regions and many sections to configure, I had to build the whole hierarchy with a PowerShell script instead of doing the work by hand. In this post I will share the things I learnt by doing so.

Anatomy of the Content Organizer Rules

When you click Content Organizer Rules in Site Settings, it shows the contents of a hidden list called "Content Organizer Rules". You can see it by appending /RoutingRules to the site URL.


Each rule is a list item with several fields of importance:

  • RoutingEnabled: this column should be set to 0 or 1 in order to disable or enable the rule.
  • RoutingPriority: a number from 1 to 10. 1 is the highest priority and 10 is the lowest one. A rule with higher priority will run BEFORE any rules with lower priority.
  • RoutingRuleName: a string with the rule name.
  • RoutingContentType: if the rule applies to a specific content type, here you should put the content type name (not the ID).
  • RoutingContentTypeInternal: the content type ID for the content type specified in the RoutingContentType field, concatenated with the content type name using the pipe ‘|’ character as a separator.
  • RoutingRuleExternal: if the content organizer rule should route to another site, this field should be set to 1. Set it to 0 if the routing is done in the same site as the rule.
  • RoutingTargetLibrary: the destination library if the document is routed in the same site.
  • RoutingTargetFolder: the destination folder if the folder classification is used.
  • RoutingTargetPath: if you are routing to another site, this should be the name of the content organizer source (set in Central Administration).
  • RoutingConditions: this is an XML string with the routing condition. The XML syntax will be explained later.

In order to create these rules programmatically, the SharePoint server object model exposes a class named EcmDocumentRouterRule. This class simply surfaces the underlying list columns as class properties.

In order to create a new rule, just instantiate the EcmDocumentRouterRule passing the SharePoint site for the rule in the constructor (the SPWeb object). Populate the properties and call the Update() method on the rule. Job done!

Rule Condition XML

The most complex part of creating the rule in code is correctly constructing the condition XML. It should be in the form:

  <Condition Column='column' Operator='operator' Value='value' />


All the children nodes of Conditions are evaluated together (i.e. it’s an AND of all of them). Each condition specifies the Column that is evaluated, the operator and the value to compare against.

The column is composed of several pieces: field GUID|field internal name|field display name (in my case, ‘5bc078e1-bcf6-4475-aadf-2b567726c696|Region|Region’).
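For example, a full condition for this column could look like this (the operator and value here are illustrative):

```xml
<Conditions>
  <Condition Column='5bc078e1-bcf6-4475-aadf-2b567726c696|Region|Region'
             Operator='IsEqual' Value='Spain' />
</Conditions>
```

Note that managed metadata columns require a special lookup format for the value, explained further down.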

The available operators are:

  • IsEqual
  • IsNotEqual
  • GreaterThan
  • LessThan
  • GreaterThanOrEqual
  • LessThanOrEqual
  • BeginsWith
  • NotBeginsWith
  • EndsWith
  • NotEndsWith
  • Contains
  • NotContains
  • EqualsOrIsAChildOf
  • NotEqualsOrIsAChildOf
  • IsEmpty
  • IsNotEmpty
  • ContainsAny
  • ContainsAnyOrChildOf
  • ContainsAllOrChildOf

The value is the literal value to compare against. If you are comparing against managed metadata columns, the value is specified as ’16;#Austria|1953bfe9-95d0-4ec8-8b9f-7a58169a9a53’. The left part is the underlying lookup value for the managed metadata field and the right part is the term ID for the selected term.

Note: all the managed metadata columns in SharePoint are implemented as lookup columns to a site-collection root site hidden list. Every time a new managed metadata value is added to a list item, SharePoint adds a new entry to this hidden list. The consequence of this implementation is that you must get the lookup value IDs to make a CAML query against managed metadata using TaxonomyField.GetWssIdsOfTerm method.
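A quick illustration of that lookup (a sketch; the method is static on TaxonomyField and the variable values are placeholders):

```powershell
# Get the site-collection lookup IDs for a term (and optionally its children),
# e.g. for building a CAML query against a managed metadata column
$wssIds = [Microsoft.SharePoint.Taxonomy.TaxonomyField]::GetWssIdsOfTerm(
    $web.Site,      # the site collection
    $tstore.Id,     # term store GUID
    $tset.Id,       # term set GUID
    $term.Id,       # term GUID
    $true,          # include descendant terms
    100)            # maximum number of IDs to return
```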

Putting it all together

So, let’s say that we have the text value of the managed metadata column and we want to create a content organizer rule that routes documents with the content type "My Content Type" to a different site when the value of that column matches our text value or its children (in my case both the column name and the term set name are Region, and the value is ‘Spain’). How should we construct our PowerShell script to do it?

First of all, we have to retrieve the Term ID that corresponds to the taxonomy node for the term set "Region" with the value of "Spain". In the $web variable we have the SPWeb object for the site we want to create the rule for.

$regionValue = 'Spain'
$ts = Get-SPTaxonomySession -Site $web.Site
$tstore = $ts.TermStores[0]
$tgroup = $tstore.Groups["Group Name"]
$tset = $tgroup.TermSets["Region"]
$term = $tset.GetTerms($regionValue, $true)[0]  # GetTerms returns a collection; take the first match
$termValueGuid = $term.Id

Now we have the term ID that corresponds to the ‘Spain’ entry in the ‘Region’ term set of the ‘Group Name’ term set group. Next we need to construct the full literal value of the managed metadata column (the lookup part and the GUID part). To do so, we use the TaxonomyFieldValue.PopulateFromLabelGuidPair method. The pair consists of the text value (‘Spain’) and its GUID, separated by the pipe character (‘|’).

$docLib = $web.Lists["Document Library Name"]
$regionField = [Microsoft.SharePoint.Taxonomy.TaxonomyField]$docLib.Fields["Region"]
[Microsoft.SharePoint.Taxonomy.TaxonomyFieldValue]$taxonomyFieldValue = New-Object Microsoft.SharePoint.Taxonomy.TaxonomyFieldValue($regionField)
$taxonomyFieldValue.PopulateFromLabelGuidPair([Microsoft.SharePoint.Taxonomy.TermSet]::NormalizeName($regionValue) + "|" + $termValueGuid)

Now we can create the content organizer rule. Remember that the column name for the condition XML is the field GUID, static name and display name. In this case we route the document to another content organizer called ‘Spain’; that’s why the rule is marked as external. The value for the taxonomy field is retrieved using the ValidatedString property of the TaxonomyFieldValue object.

[Microsoft.Office.RecordsManagement.RecordsRepository.EcmDocumentRouterRule]$rule = New-Object Microsoft.Office.RecordsManagement.RecordsRepository.EcmDocumentRouterRule($web)
$rule.ConditionsString = "<Conditions><Condition Column='" + $regionField.Id + "|Region|Region' Operator='EqualsOrIsAChildOf' Value='" + $taxonomyFieldValue.ValidatedString + "'></Condition></Conditions>"
$rule.Name = $regionValue + " rule"
$rule.ContentTypeString = $web.AvailableContentTypes["My Content Type"].Name
$rule.RouteToExternalLocation = $true
$rule.Priority = "5"
$rule.TargetPath = $regionValue
$rule.Enabled = $true
$rule.Update()  # persist the rule to the hidden list



I hope that this code snippet can save you some time if you create a lot of content organizer rules and want to avoid doing it by hand.

PDF Files Missing in Tag Results

A quick mystery solved on a customer intranet today.

The Symptoms

You tag some PDF files, alongside other content in SharePoint 2010.

Then, you search for content for that tag, either by clicking a tag or searching for a specific tag.

The results do not include the PDF files.

The Cause

Luckily, the cause is simple: SharePoint search does not index PDF files by default, skipping them completely.

The Solution

You have to install the Adobe IFilter for PDF files on your SharePoint server(s) and configure a few other things in SharePoint. After that, you have to run a full crawl to rebuild the search index.
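One of those configuration steps, adding the pdf extension to the search file types, can be scripted (a sketch; the IFilter installation and the icon/registry steps from the support article are still needed):

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Tell the crawler to include .pdf files in the content index
$searchApp = Get-SPEnterpriseSearchServiceApplication
New-SPEnterpriseSearchCrawlExtension -SearchApplication $searchApp -Name "pdf"
```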

You can find detailed instructions on how to enable PDF files to be indexed in this Microsoft Support article.

JavaScript Errors in SharePoint Application Pages

Today was a day packed with investigating and troubleshooting weird JavaScript errors at a customer’s premises. I am putting this information on my blog to help anyone who might find themselves on a similar journey.


Problem 1: Views

You have a custom SharePoint 2010 master page. You assign that master page as the master page for a site, both for content and system pages. You face the following errors when dealing with list or library views:

  • You can’t change the current view in the Ribbon view selector. The combo box doesn’t drop down. Additionally, you do not have the view selector on the page title.
  • You edit the current view (or any other view). A JavaScript error is displayed in Internet Explorer and you can’t save the changes. In Firefox, however, you can modify the view without any problems.

Problem 2: File Upload page

You have a custom SharePoint 2010 application page. The page inherits from the default SharePoint file upload page. You want to replace the file upload control with a custom control. However, you have to maintain the original control because the inherited code relies on the presence of those controls. You hide the controls but you still face JavaScript errors due to the hidden control validation scripts.


Problem 1: the culprits are the new SharePoint 2010 master page placeholders and delegate controls.

In order to have the view selector working, you have to add the following DelegateControl to the master page (as found in this thread):

<SharePoint:DelegateControl runat="server" ControlId="TreeViewAndDataSource" />

The weird JavaScript errors on the Edit View page were caused by a PlaceHolderUtilityContent that was misplaced. This content placeholder should be placed outside the <form> tag on the master page, as this MSDN article outlines.

<asp:ContentPlaceHolder id="PlaceHolderUtilityContent" runat="server"/>

It seems that Internet Explorer stops when a JavaScript error is found and skips the execution of the remaining script block, while Firefox just skips the offending line and executes the remaining lines if they are correct. The errors were about objects being null, as IE had skipped the object initialization commands due to the errors generated by the misplaced placeholder tag.

Problem 2: the culprit was the JavaScript emitted by the default validation controls for the file upload selector. The solution was to embed the default control inside an ASP.NET Panel control set to Visible="false". In this way the controls are present on the server but nothing is rendered back to the page, preventing JavaScript errors.
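The workaround can be as simple as this (a sketch; the control IDs are hypothetical, and the actual upload control on the default page may differ):

```aspx
<%-- The inherited code-behind still finds the controls server-side,
     but nothing (including the validation script) is rendered to the page --%>
<asp:Panel ID="HiddenUploadControls" runat="server" Visible="false">
    <asp:FileUpload ID="OriginalUploadControl" runat="server" />
</asp:Panel>
```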

Reconfigure A Rogue SharePoint 2010 Search Service

Hello again to all the readers of my blog. I haven’t been around lately, but I have a good excuse: I was getting married. Now that the festivities are over, I’ll be back to business as usual.

Today I stumbled upon a puzzling problem with an existing SharePoint 2010 installation. The Search Service Application was misbehaving, giving weird errors about not being able to connect to itself.

After some extensive googling, this forum post led to a clue: remove the service application altogether and rebuild it from scratch. Removing the service application is done with the old reliable STSADM:

stsadm.exe -o deleteconfigurationobject -id <GUID_of_the_service_application>

That did the trick! After that, create a new search service application in Central Administration and you are good to go.

Note: the same forum post mentioned another post, which has a script to remove the service applications, for the PowerShell-savvy among you.

My SPC 2009 Coverage Article at DotNetMania November Issue

I wrote a short article about the SharePoint Conference 2009 in Las Vegas for the Spanish .NET magazine DotNetManía. I’ve just found out that the PDF file of the article is available online at their site:

(of course, it’s in Spanish only) 😉

SharePoint 2010 Hyper-V Virtual Machines Available

Microsoft has announced that pre-configured machines with Office 2010 Beta and SharePoint 2010 Beta 2 are now available for download.

There are two machines:

  • SharePoint 2010/SQL Server 2008/Visual Studio 2010/Office 2010 VM
  • Exchange 2010 VM

joined in the same “” domain.

8 GB of RAM are recommended for running the machines.

The download page for the virtual machines is available at