Tuesday, June 25, 2013

Thoughts about Building Multilingual Publishing Site on SharePoint 2013 - Part 3 of 3

This is the third part of a series describing my experience in planning and building a multilingual WCM site on SharePoint 2013.

Part 1 discusses basic requirements and architecture of the authoring site.
Part 2 focuses on managed navigation for the authoring site.
This part discusses Cross-Site Publishing (XSP) and publishing sites.

A Quick Introduction to Cross-Site Publishing

Microsoft has released a blog series describing in detail how to set up a fictitious Contoso site leveraging XSP. Contoso is a product-centric web site that uses the concepts of category pages and catalog item pages to illustrate the applicability of XSP. A product sold at the Contoso electronics store can be categorized by a hierarchy of categories. For example, a specific product, the "Datum SLR Camera X142", is categorized as Cameras >> Digital Cameras. There are only two kinds of pages that we need here: a catalog item page showing product information, and a category page showing products matching the category. So, if you pick the "Cameras" category from the navigation menu, you will see products qualifying as cameras; if you pick "Digital Cameras", you will see a narrower set of products qualifying as digital cameras. Regardless of which category in the hierarchy you pick, the principle is the same: we need to figure out the current category and render the matching products for it. Category pages do exactly that. Next, you click on a specific product listed by the current view of the category page, and the product details are rendered by the catalog item page, which accepts a unique product identifier. And so you can surface the entire product database on the SharePoint site using just two pages: a category page and a catalog item page.

There are two "magic ingredients" here. The first is the ability to publish and consume lists as catalogs. Connecting to a catalog creates a search result source in the consuming site and, optionally, pins the terms from the term set used to categorize the catalog into the navigation term set of the consuming site. Also behind the scenes, the Microsoft.SharePoint.Publishing.Navigation.TaxonomyNavigation class, callable from the publishing HTTP module, "becomes aware" of how to detect requested URLs constructed of categories and unique identifiers, aka "friendly URLs", on this site, and routes them to the appropriate category or catalog item pages. The second is that the pages "can be made aware" of their current context and use this knowledge when issuing search queries for catalog items. This ability exists thanks to a set of well-known search query variables available to developers and information workers placing search web parts on pages.

Cross-Site Publishing as a Web Content Management Approach

Things started to look a bit confusing when I tried to apply the XSP concept to the site publication scenario that I have described in Part 1. Here are the hurdles I faced:

1. A web site page is hard to fit into the product-centric model described above, because a page can often be both a category and a catalog item at the same time. Sites commonly use section landing pages, the ones the top navigation menus point at, which contain content and at the same time act as logical parents to child pages down the site map hierarchy. Let's say we make a landing page an XSP category page. Then, according to the product-centric model, we access its child pages as we would access catalog items: by clicking on a list of these pages rendered by our landing page. Well, this is usually not how information architects expect users to navigate their web sites, unless they are online product stores. What we often need instead is navigation menus, the familiar top and current navigation controls, to let us access all pages on the site.

2. Now think for a second about the managed navigation discussed in Part 2 and the requirement we had about maintaining fidelity between the authoring and publishing sites. Every page on the authoring site has a corresponding term used for navigation and for forming a friendly URL. Because we use managed navigation on the authoring site, our top and current navigation menus are populated by these terms. We want the same behavior on the publishing site. This raises the question: "Which pages should be the category pages, and which ones should be the catalog item pages?" If we were to follow the classical product-centric approach and designate the pages corresponding to the top-level nodes as category pages, and the leaf pages as item pages, we would lose the drop-down menus on the publishing site:

When I say "designate" I mean that we "tag" the pages with the terms or, in other words, assign term values to corresponding pages by setting managed metadata fields of these pages.

3. At the other extreme, if we tag each and every page on the authoring site with a term, then when we consume the pages library catalog, all our pages logically become category pages. How do we then render the catalog item information?

We resolved the confusion by tagging all the pages on the authoring site, effectively making all of them category pages on the publishing site, and by modifying their page layouts to make them simultaneously act as catalog item pages. This looked like a promising strategy: by turning all of our pages into category pages, we would get exactly the same top and current navigation elements as on the authoring site, for free. The only thing left to do was to make the category pages render catalog item information, which should be doable on a search-driven site.

If you examine an automatically created catalog item page, you will see that it is based on a page layout, which in turn leverages Catalog Item Reuse (CIR) web parts. Each CIR web part renders the contents of a specific managed search property specified in its SelectedPropertiesJSON property. Provided that the managed properties correspond to the columns on the authoring site, this results in the same content being rendered on the publishing page as on the authoring page. Below is an example of a CIR web part rendering the value of the managed property corresponding to the Page Content column.
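The original example was a screenshot; as a hedged sketch, an auto-generated CIR web part for the Page Content column looks roughly like this (the tag prefix, attribute casing, and managed property name may differ in your farm, so verify them against your own auto-generated page layouts):

```xml
<!-- Sketch of a Catalog Item Reuse web part in a publishing page layout.
     "PublishingPageContentOWSHTML" is the typical auto-generated managed
     property for the Page Content column; check your search schema. -->
<srchrwp:CatalogItemReuseWebPart runat="server"
    Title="Page Content"
    UseSharedDataProvider="True"
    SelectedPropertiesJson='["PublishingPageContentOWSHTML"]'
    NumberOfItems="1" />
```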

Note that the UseSharedDataProvider property should be set to True on all CIR web parts on the page except for one, which serves as the data provider for the rest of them. All CIR web parts on a page form what's called a query group, with one CIR web part acting as the data provider and the rest as data consumers. In addition to the SelectedPropertiesJSON property, the data provider CIR web part has its DataProviderJSON property set as shown in the following example.

The value of DataProviderJSON property sets properties on objects of DataProviderScriptWebPart type. The key properties here are:
  • QueryTemplate – defines keyword search filtering criterion using a query variable such as {URLTOKEN.1} in the above example.
  • SourceID – a unique ID of the search result source corresponding to the catalog being consumed. The search result source is created automatically when the catalog is connected to.
  • Scope – a URL of the catalog source list.
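Since the original snippet was shown as an image, here is a hedged sketch of what a DataProviderJSON value might contain; the managed property name, GUID, and URL are placeholders, not values from the project:

```json
{
  "QueryTemplate": "VanityNameOWSTEXT:{URLTOKEN.1}",
  "SourceID": "00000000-0000-0000-0000-000000000000",
  "Scope": "http://authoring.contoso.com/en-us/Pages"
}
```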
An easy way to get started with the CIR web parts is to let SharePoint auto-generate catalog Category and Item pages and page layouts when connecting to a catalog, then harvest the web part markup from the auto-generated page layouts.

So conceptually the problem is solved: we can create our own page layout, create a category page based on it, and configure CIR web parts to select the managed properties of interest from the catalog. If the markup of the page layout and master page is the same as on the authoring site, then the CIR web parts essentially replace the content fields, the publishing page appears visually identical to its authoring counterpart, and the navigation works, even though only a single page exists on the publishing site. Pretty cool.

Now let's consider some practical aspects of getting the XSP-based publishing site up and running.

XSP Navigation Term Set and Vanity Names

To properly publish the Pages library as a catalog we need to designate a field uniquely identifying each page, and a managed metadata field used for page categorization. We came up with two fields for this purpose and defined them at the site collection level to make sure the required managed properties get created automatically during a full crawl:
  • Vanity Name - this is a text field where we enter a unique friendly name of each page.
  • XSP Category - a managed metadata field using a new term set named XSP Navigation.
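As a hedged sketch, the Vanity Name site column could be declared in CAML like this (the GUID and internal name are made up for illustration; the managed metadata column additionally needs its term set binding, which is easiest to configure through the UI or the TaxonomyField API):

```xml
<!-- Hypothetical CAML for the Vanity Name site column; the ID and
     internal name are placeholders, not the ones used on the project. -->
<Field ID="{D0C5A1B2-0000-4000-8000-000000000001}"
       Name="XspVanityName"
       DisplayName="Vanity Name"
       Type="Text"
       Required="TRUE"
       Group="XSP Columns" />
```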
On the authoring site we create and manage a master term set, which is applied to the source variation; its terms are then re-used and translated on the target variations. We certainly wanted to avoid duplicating the effort of managing the terms when tagging our pages, but we could not reuse the existing master term set, because SharePoint complains about it being already in use when we consume the catalog. We needed a new term set, so we simply pinned the top-level terms with children from the master Site Navigation term set. Another important thing we needed to do was to create a Root term in the new XSP Navigation term set so that we could hook up to it when consuming the catalog. This is illustrated in the figure below:

The convention we used was that the value of the Vanity Name field must match the label of the term assigned to the XSP Category field for any given page on the site. This is important because it allows us to structure the query we issue from the data provider CIR web part as follows:
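The original query was shown as a screenshot; a hedged reconstruction of the query template, assuming a managed property named VanityNameOWSTEXT for the Vanity Name column, would be:

```
VanityNameOWSTEXT:{URLTOKEN.1}
```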


This query means "select items where the value of the Vanity Name managed property contains the last segment of the current URL". So for the URL http://www.softforte.com/softforte-corporate/about-us, {URLTOKEN.1} == "about-us", and due to the convention the Vanity Name == "about-us" as well. The page is a category page in XSP terms, and has a managed term, created when we consumed the catalog, which points at the URL /softforte-corporate/about-us. This is exactly how we hacked the category pages, giving them the ability to act as catalog item pages at the same time.
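To make the convention concrete, here is a small illustration of what {URLTOKEN.1} resolves to for a given friendly URL (the function name is mine, not a SharePoint API):

```python
from urllib.parse import urlparse

def url_token_1(url: str) -> str:
    """Return the last path segment of a friendly URL, which is the value
    the {URLTOKEN.1} query variable resolves to at query time."""
    path = urlparse(url).path
    return path.rstrip("/").split("/")[-1]

# By our convention, the page's Vanity Name field holds the same value,
# so the data provider query matches exactly one catalog item.
print(url_token_1("http://www.softforte.com/softforte-corporate/about-us"))  # about-us
```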

Navigation Translation on Publishing Sites

The terms used on the publishing sites are already translated by virtue of setting their labels for each culture and customizing the term-driven page settings down the re-use chain originating from the authoring source variation's navigation term set. The challenge, however, lies in selecting the proper translation for the corresponding publishing site.

To localize the content and navigation of a variations-based site you do not need to install a language pack: the locale, such as fr-CA, is selected when a variation label is created. This is different for a publishing site, which does not rely on variations. The only way I found to get terms to translate to French was to install the French language pack and then create a site collection using the French target language template. In order to still manage the site in English, the Alternate Language can be set under Site Settings >> Language Settings. This makes SharePoint honor the language preferences the browser sends to it: if my preferred language is English, I will see the managed navigation in English; if it is French, SharePoint will automatically use the French term translation if one is available. The limitation here is the language packs: to the best of my knowledge, we can only publish in the languages supported by language packs.

Do I Really Need a Single Page on My Publishing Site? 

Since most real-life sites use multiple page layouts, the same page layouts need to be mirrored over to the publishing sites, with CIR web parts configured there instead of content fields. If all the content on the authoring site is static, i.e. there are no web parts, just content fields, then we can simply create catalog category pages, one per page layout, and that is all we need to do.

Real web sites, however, use web parts to render dynamic content. Since web parts are not stored in the search index, cross-site publishing will not render them. This means that the web parts need to be re-created on the publishing site. Needless to say, since publishing and authoring sites have different security requirements, the web parts would need to be created differently on the two sites.

Wherever possible, we should therefore utilize Content Search (CS) web parts, which get their content from the SharePoint index just as the CIR web parts do. Thanks to their great flexibility, we were able to meet 100% of the requirements for the dynamic elements on the site using CS web parts. When a CS web part is used on an authoring site page, it runs in a different context than when it is used on a publishing site. It therefore needs to be adapted to the publishing site; simply copying it over won't work. Most of the time the adaptation is quite simple, however, as the same managed properties are available on both sites, and therefore the search queries are similar.

How big of an issue is the fact that the dynamic elements on content pages need to be re-created on a publishing site? We found it to be quite manageable in our case. The level of work duplication was minor for us, although it depends on how many dynamic web parts you are planning for. A general corollary here: plan your pages thoroughly in advance to get the most out of XSP.

So while you could use a single category page to render all pages based on a specific page layout, once you add dynamic web parts to the mix you need to create a new page on the publishing site for each corresponding authoring page with a web part. If the same web part is used on multiple pages, it makes sense to embed it into the publishing page layout directly, in order to reduce the number of publishing pages you need to create.

Here is an example of how an authoring page corresponds with a publishing page layout and a publishing page:

In the above illustration, the legend is as follows:

  • Purple color - content fields;
  • Gray color - items inherited from a master page or page layout;
  • Green color - web parts;
  • Blue color - navigation control.

The Inconvenient Result Source ID Value

In the code snippet above, the DataProviderJSON property included a SourceID parameter, which is the ID of a search result source that is automatically configured when a catalog is consumed on the site. This property controls the scope of the search queries issued by the web parts participating in the query group, adding filtering that selects items from the list with a specific ID and source web site URL. If you let SharePoint create the catalog item and category pages automatically when connecting to a catalog, this Source ID is embedded in the page layout. Pretty inconvenient, especially since the Source ID changes each time you re-connect, and you cannot control it by exporting and importing the site search configuration.

After researching alternatives, we ended up using a provisioning script, which would first provision the publishing page layouts, then determine the result source ID, then check out the layouts, replace the ID in them, and check them back in... like I've done in the old days...
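The SharePoint-specific steps (resolving the result source ID, checking the layout files out and back in) depend on the farm, but the core of such a script is a plain token replacement, which can be sketched like this (the placeholder GUID convention here is an assumption, not the exact one we used):

```python
# Sketch of the token-replacement step of the provisioning script.
# The layouts are authored with a well-known placeholder GUID, which the
# script swaps for the result source ID resolved from the live farm.
PLACEHOLDER_SOURCE_ID = "00000000-0000-0000-0000-000000000000"

def patch_source_id(layout_markup: str, result_source_id: str) -> str:
    """Replace the placeholder SourceID in a page layout's DataProviderJSON."""
    return layout_markup.replace(PLACEHOLDER_SOURCE_ID, result_source_id)

markup = '"SourceID":"00000000-0000-0000-0000-000000000000"'
print(patch_source_id(markup, "1f9d8f6e-5a3b-4c2d-9e8f-7a6b5c4d3e2f"))
```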

The Big Picture

Getting back to our business requirements described in Part 1: by utilizing the XSP approach we created three publishing sites, one for each variation. The publishing site corresponding to the source variation is not accessible to Internet users, but information workers can preview the English content on it exactly as it would appear to anonymous visitors on the live site. This architecture required us to turn the Pages library on each variation label into a catalog and consume the catalogs from the corresponding publishing sites, meaning that we had three different setups of publishing page layouts, pages and catalogs. This may seem like a lot, yet since the pages on the publishing sites are not created by information workers, but are instead treated as part of the application provisioning exercise, and since they are almost identical across the three publishing sites, it was quite a reasonable decision.


The XSP is new, and it can be difficult to depart from the product-centric model used in the Microsoft demos. Practically, it all comes down to retrieving content from a search index, something we've been doing for a while with SharePoint. The XSP takes the concept to the next level of manageability and implementation convenience, and no longer requires us to write compiled code to take advantage of it. And of course SharePoint sticks to its traditions: planning is key to properly setting up the publishing sites and minimizing duplication of effort. As a result we get modern-looking, dynamic publishing web sites and an enterprise-class content management process.

Wednesday, June 19, 2013

Thoughts about Building Multilingual Publishing Site on SharePoint 2013 - Part 2 of 3

This is the second part of a series describing my experience in planning and building a multilingual WCM site on SharePoint 2013.

Part 1 discusses basic requirements and architecture of the authoring site.
This part focuses on managed navigation for the authoring site.
Part 3 discusses cross-site publishing and publishing sites.

Term Set Organization

SharePoint 2013 Managed Navigation is at the heart of our solution. We have defined a global term set containing the navigation hierarchy. The terms at the top level correspond to the items at the top navigation bar of the site:

So there is no single root or parent term for the global navigation term set; rather, there are several top-level terms. It is important to note that every single page needs to have a corresponding term in the term set pointing at it:

Next, on the Site Settings >> Navigation configuration page we set the managed navigation to point at the global navigation term set for the source variation label. The root site of the site collection is still using structured navigation. This site is not going to be exposed publicly and is logically above the top level the Internet users will see.

Terms Sharing (Reuse)

When variations are created by clicking Site Settings >> Variation Labels >> Create Hierarchies, a background job automatically creates site collection-scoped term sets and re-uses the terms from the source term set in the target term sets. Going forward, when an information worker publishes a new page to the source variation label, the Variations Propagate Page Job Definition background job propagates the page, and if the page is pointed at by a term, it re-uses this term on the target variation sites (term reuse scenario A). If a page is published first and propagated to the target variations, and a term pointing at it is created at a later time, then the term will not be reused on the target variations and needs to be reused manually by selecting the Reuse Terms option on the parent of the term to be reused.

Terms can also be reused by manually pinning top-level terms instead of relying on the variations propagation job. In this case, the terms created automatically by variations can be removed. Then, terms from the source term set can be pinned along with their children using the Pin Term With Children menu option at the top level of the term set used for the target variation label's navigation (term reuse scenario B). The advantage of scenario B over scenario A is that the term always gets reused, regardless of when a page is published and propagated. The advantage of scenario A is that only terms for published pages get reused, which may better align with a publishing business process. The good thing is that you can mix both scenarios in a single term set, as they are term-specific.

Terms Translation

To translate a term, you change the current language in the Language drop-down box on the GENERAL tab and enter a translated value for the Default Label. Regardless of which term set you do it on (the global term set, the one for the en-CA label site or the one for the fr-CA label site), it updates the source term's value, i.e. the one belonging to the global term set. Below is an example of translating a term to Russian:

There is an interesting side effect of term translation. If a term is translated prior to being reused (scenario A) or pinned (scenario B), then the managed URL of the propagated page is created based on the translated label. Here is an example of how this looks for a Russian variation label:

The fix is easy: selecting the Customize check box does it. In the case of scenario B, the friendly URL to a page in a target variation will return 404, since the Target Page Settings on the TERM-DRIVEN PAGES tab for the target term are not defined. The URL of the target page needs to be entered manually.

So as you might have noticed, terms connected via pinning or term reuse share certain properties, such as the Default Label, which can only be changed in one place, on the source term, while you are allowed to independently customize the term-driven pages settings. This opens up the possibility of having locale-specific URLs, especially in languages using the Latin alphabet: a page in the source variation could have one term-driven URL, and the replica of the page in a target variation could have a completely different term-driven URL. This improves the readability of the site and supports SEO.

On our project we went ahead with scenario A, assuming our information workers would get training to make the following logical workflow happen:
  1. The user creates a page in the source variation;
  2. The user creates a term in the global term set;
  3. The user points the term at the new page;
  4. The page undergoes the approval process and gets published;
  5. The Variations Propagate Page Job Definition timer job picks up the page and creates a replica in the target variations. It also reuses the term pointing at the page, and correctly points the reused term(s) at the page's replica(s);
  6. The term can now be translated without harm to the term-driven URLs;
  7. The user can optionally change the term-driven page URLs for the target variations' pages.
Should the workflow not be followed for whatever reason, the terms can always be manually reused, and incorrectly translated URLs fixed as shown above.

Search Indexing and Managed Navigation

I have noticed that a term needs to have either the Show in Global Navigation Menu or the Show in Current Navigation Menu checkbox checked (or both) in order for search to be able to index the page corresponding to the term. If the term is on neither the global nor the current navigation, and the Hide physical URLs from search field is set to True for the corresponding page, then the page will not get indexed. For the pages we did not want in the navigation but still needed in the index, we were able to work around this behavior by selecting the Show in Current Navigation Menu checkbox while using a page layout that does not render a current navigation control.

Global versus Local Term Sets

One last note about using the global term set. We could have used a site collection-scoped term set (aka a local term set) instead. The strategy was to keep all terms used for navigation in one place and then reuse them. Even for the purpose of cross-site publishing it is not necessary to have a global term set. Still, we chose a global term set on the basis of a "why not?" approach: it gave us greater flexibility and eliminated potential issues having to do with local scope.

Monday, June 17, 2013

Thoughts about Building Multilingual Publishing Site on SharePoint 2013 - Part 1 of 3

The publishing approach has changed drastically in SharePoint Server 2013, and the Web Content Management (WCM) solution and infrastructure architecture approaches need to be revised from the ground up as a result. The guidance suggests two methods of content publishing, author-in-place and cross-site publishing, and provides a decision flowchart for choosing one over the other. Although the content deployment publishing method is still supported, it is no longer in the list of recommendations. While the reasons for this are unknown to me, I do know that content deployment does not work with the new and sought-after managed navigation feature. With the above in mind, how do you practically approach the construction of a multilingual, Internet-facing, SharePoint-based WCM site utilizing clean URLs and a content publishing process suitable for an enterprise?

This blog post is the first one in a 3-part series dedicated to the aspects of a multilingual WCM site architecture. It sets the context and describes the architecture of the authoring site.
Part 2 describes managed metadata navigation on the authoring site;
Part 3 describes the architecture of publishing sites and utilizing the cross-site publishing.

Setting the Context

At Navantis, we needed to build a bi-lingual Internet presence publishing site for our Canadian customer, based on an on-premises installation of SharePoint Server 2013. The key requirements at a high level were as follows:
  1. The content of the site should be available in French and English;
  2. Although the authors could publish French version of a piece of content independently of its English version, most of the time it was required for the two versions to go live simultaneously, such as in cases of company news announcements;
  3. The site had two kinds of content - "non-volatile", updated rarely, and "volatile" corporate blog content updated monthly when new postings were published;
  4. Some pages in addition to containing content, such as text and images, needed to have dynamic elements pointing at other pages, for example products widget, blog postings widget, home page image slider widget, etc.;
  5. English and French versions of the content would be hosted on two sites addressed by two different DNS host names;
  6. Authors and approvers should be able to preview the content with high fidelity to its production appearance, and follow a translation and approval business process before exposing it to the anonymous Internet users.
Pretty typical, I would think...

General Architecture

We started by planning an application utilizing the Cross-Site Publishing (XSP) method, and by studying Microsoft guidance, including the case study of the Mavention WCM site. I must point out that the case study and the excellent blog of Mavention's Waldek Mastykarz were very helpful to us. Yet our case was not identical to Mavention's site. The key differences were as follows:
  • We used managed navigation on the authoring site. This was done in order to provide the fidelity of the look and feel between the authoring and publishing sites to the authors and approvers. Here we have willingly deviated from the guidance recommending structured navigation for authoring sites that leverage XSP. Another reason for this choice was that it provided for an easy fallback to the author-in-place publishing method, should there be implementation issues with the XSP method.
  • We hosted the authoring and publishing sites in two separate web applications in order to separate them physically. The customer's security policy allowed placing both authoring and publishing sites in the perimeter network, and so we did.

Assets Site

Some of the key architecture decisions have to do with where to store, and how to publish, binary assets created by information workers, such as images and videos. The search index does not store binary content. This means that in an authoring-publishing setup the images need to be either shared or copied between the authoring and publishing sites. While images that are part of the branding can be copied over during site provisioning, this cannot be done with the images used by information workers when creating content pages.

The two alternatives I see here are developing a solution for synchronizing the images between the authoring and publishing sides, and using a shared assets site. The synchronization solution could theoretically leverage content deployment between source and target site collections for the images, although the information workers would need to maintain "dummy" pages referencing all the used images so that they could be picked up and transferred over by content deployment. We have chosen the same approach as in the Mavention case study - a shared assets site. It is important to note some consequences of this decision:
  • Unless we keep authoring and publishing sites in the same web application and use path-based site collections, we have to use absolute URIs when referencing the assets (starting with protocol identifier, ex. http://www.softforte.com/styles/softforte.jpg). This was our case based on requirement #5 above.
  • The assets site would need to use the author-in-place publishing method and allow information workers authenticated access, while also being available publicly to anonymous users.
  • If you use host-named site collections for your publishing sites, which is the general recommendation on the grounds of improved scalability, you further limit the available options: with host-named site collections there is only one security zone, and securing authenticated traffic with SSL becomes problematic.
So we have chosen to use the unsecured HTTP channel with NTLM authentication for the assets site.

Blogs Site

We chose to keep blog postings in a sub-site of the main authoring site because only one Pages library can be used per web site, and we wanted separation between the two libraries in order to manage publishing workflows for pages and blog posts independently of each other. Since we relied on search web parts to surface content from the blogs, retrieving blog content was not a problem regardless of its location. The only thing to be aware of is the configuration of navigation on the blogs site: it needed to be inherited from the parent site in order to maintain its navigation context.

Variations and Publishing Process

In order to meet requirement #2, asking for pages in both languages to be published simultaneously, we took a rather elegant approach: we used three variations: en-US (source English), en-CA (target English), fr-CA (target French). The source variation is not visible to Internet users, but still allows approving and then publishing content in English, an action that triggers propagation of the published updates across the variations. Then, in the case of the en-CA variation label the content simply gets approved to become published, and in the case of fr-CA it undergoes a translation process, then gets approved and published. This approach lets us publish the French and English versions of a page quasi-simultaneously, leaving it up to the approver to determine the interval between approving the French and English publications.

Customizations and Search-Driven Content

We built the site 100% without having to write custom server-side components. All customizations were client-side only, and 80% of them were display templates for the Content Search web parts. The latter were used universally, for almost every widget: from a page with a dynamic list of blogs, to a widget surfacing blog postings on select pages, to a content rotator with sliding images on the home page. After solving provisioning issues thanks to Chris O'Brien's blog post, we were able to speed up deployment cycles and significantly improve the development process. After all, it wouldn't be an exaggeration to state that the site was 100% search-driven.

We have also used the traditional site search box and a search results page. For a relatively static web presence site we opted not to use a search center, and instead used a custom search results page. We wanted to be able to search for blogs separately on the blogs listing page. There we used a Content Search web part and a Search Box web part connected to it, with the query filtering out everything but the blog postings. The global search results page would include a wider set of results, and allow users to switch to blog results only. I have described earlier how to configure this functionality. Because of our use of variations, each variation label had its own search results page, with a query scoped to only the content residing in that variation label's site.

With all the advantages of surfacing content in Content Search web parts, I feel that the main drawback is the obscurity of the content selection method, namely the search query issued from a Content Search web part. This obscurity slows down development by making it more difficult to troubleshoot search queries or search results issues. The benefits clearly outweigh the complexities, and there are several tools at our disposal: firstly, the Content Search web parts themselves, with their excellent ability to preview query results; secondly, the Search Query Tool released by Microsoft; lastly, the REST interface, which can also be used to troubleshoot queries from a browser. Still, we cannot fully get away from the high complexity of the search sub-system, and from this point of view it is reminiscent of my past experience writing search-driven web parts in previous versions of SharePoint. If you compare the complexity of troubleshooting your search queries with troubleshooting equivalent SQL queries you will know what I mean. Allow some extra time to get the queries right before you start adding a lot of content and full crawls start taking dramatically longer: the less time it takes to do a full crawl in your development environment, the sooner you will get the search-driven functionality to a solid state.
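When troubleshooting queries through the REST interface from a browser, it helps to assemble the URL consistently. As a sketch (the server name and query text here are illustrative, not from our project), a small helper that builds such a troubleshooting URL might look like this:

```javascript
// Sketch: build a SharePoint 2013 search REST URL for troubleshooting
// a query in the browser. Single quotes in the query text must be
// doubled; the query string itself must be URL-encoded.
function buildSearchQueryUrl(webUrl, queryText, options) {
  options = options || {};
  var url = webUrl.replace(/\/$/, "") +
    "/_api/search/query?querytext='" +
    encodeURIComponent(queryText.replace(/'/g, "''")) + "'";
  if (options.rowLimit) {
    url += "&rowlimit=" + options.rowLimit;
  }
  if (options.selectProperties) {
    url += "&selectproperties='" +
      encodeURIComponent(options.selectProperties.join(",")) + "'";
  }
  return url;
}
```

Pasting the resulting URL into a browser shows the raw result XML, which makes it much easier to see what the query actually returns before wiring it into a web part.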

One search-related issue we encountered is worth mentioning: the pages we were authoring had the Hide physical URLs from search field set to False by default. As a result, we were getting duplicated results for some of our search queries. Setting this field to True and re-indexing the content left only the term-driven URLs in search results - duplication problem solved!

Master Page, Page Layouts and Branding

In general we used a pretty standard approach that is in principle not much different from the one used in SharePoint 2010. The new Design Manager helps by letting us use an HTML editor of choice instead of relying on SharePoint Designer. The cost of this convenience was that we had to provision both the *.html and *.aspx versions of our page layouts using features: when only the *.html versions are provisioned by a feature, their conversion to *.aspx files isn't triggered automatically. We needed both kinds of files because during development it is important to have identically configured development environments that are ready for branding work. Exporting and re-importing a design package didn't quite work for us, as we noticed this process changed columns on intrinsic content types we relied upon - a topic that deserves a blog post and research of its own.

The site had a responsive web design, which we selected over the device channels option since it allowed us to keep the same set of master pages and page layouts, and change only the style sheets, with some minimal JavaScript support for the changes in navigation at smaller viewport sizes. This "modern" style not only impacts the design of the master page and page layouts, but also affects the design of the content, in general making it more complicated. In our case we provided sample "Lorem ipsum" content for all of the default pages. The sample content was intended to demonstrate to information workers, by way of example, how to structure the real content and which CSS classes to apply to it.

The next topic, managed metadata navigation, is another key part of a WCM site architecture. It is described in Part 2 of the series.

Sunday, June 9, 2013

Filtering Items by Date Range using SharePoint 2013 REST API

For some reason I have always had extra difficulty when filtering results by dates in SharePoint, especially when I needed a "between" filter. This time I needed to render a calendar on a page accessed anonymously. I used the FullCalendar jQuery plugin for this, along with the SharePoint 2013 REST API. In my implementation, when the end user changes the month in the calendar, it calls a function, passing it the start and end dates of a date range. The function then needs to retrieve JSON from SharePoint corresponding to that date range.
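Once the JSON comes back, the list items need to be reshaped into FullCalendar's event objects. The sketch below assumes the columns used in the REST calls later in this post (Id, Title, EventDate, EndDate, fAllDayEvent) and the FullCalendar event property names (title, start, end, allDay); the response traversal is not shown since it depends on the OData format you request:

```javascript
// Sketch: convert list items returned by the SharePoint 2013 REST API
// into FullCalendar event objects. Field names match the $select used
// in this post; adjust them if your list columns differ.
function toCalendarEvents(results) {
  return results.map(function (item) {
    return {
      id: item.Id,
      title: item.Title,
      start: item.EventDate,   // ISO-8601 date strings
      end: item.EndDate,
      allDay: !!item.fAllDayEvent
    };
  });
}
```

With a mapping function like this, the calendar's data callback only has to fetch the JSON and hand the converted array to FullCalendar.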

I started by turning on the Team Collaboration Lists feature on my publishing site and creating the Calendar list. Then I put together my REST endpoint, this time around with a "greater than or equal to" filter:

http://devserver2012/_api/web/lists/getbytitle('events')/items?$select=Id,Title,Description,EventDate,EndDate,fAllDayEvent&$filter=EventDate ge datetime'2013-06-08T00:00:00'
The reason for this blog post is that the above RESTful call doesn't work. Research pointed out a few things:
  1. I am not the only one. Other people have had the issue. Rob Windsor's comment was quite useful to me.
  2. The MSDN guidance has a table of supported filtering operations. First of all, it incorrectly capitalizes the numeric comparison operators: for example, instead of the "Ge" operator, which does not work, I was using "ge", which does work. Second of all, the references pointing back at www.odata.org were unfortunately broken there, at the time of this writing at least.
  3. Later I've learned that my filter actually works with a Boolean "and" operator as I will show below. Does this make it unsupported? It probably does, according to the mentioned MSDN article, although the article itself clearly needs a revision.
  4. OData documentation is your friend. I find it hard to read, yet it is ultimately helpful. You can see the query options supported by OData in section 4.5 on this page. Notice how the operators are all lowercase in the examples. The formatting of dates in filter queries can be found here; this is where the datetime'2013-06-08T00:00:00' comes from. One interesting detail: although the format spec does not indicate this, the following works just as well: datetime'2013-06-08T00:00:00Z'. Apparently ISO-8601 is still honored.
Getting back to my original REST call - what "fixes" it? If I were to use ListData.svc instead then I'd get the following:

http://devserver2012/_vti_bin/ListData.svc/Events?$select=Id,Title,StartTime,EndTime,AllDayEvent&$filter=StartTime ge datetime'2013-06-10' and EndTime lt datetime'2013-06-12'

The above call actually works, although it ignores the time portion when filtering. You can see that I get my "between" filter in there; also the column names are different - it uses the display names of the columns with the spaces removed. Also notice how the date values are formatted, actually conflicting with the OData format spec. In general, if you examine the Atom results returned, you will find that the date formatting is relaxed there as well (no "Z" Zulu time indicator). So what! We can celebrate now... Well, almost - the call to ListData.svc won't work for anonymous users. Since this was a deal breaker for me, I kept on looking. Turning to the ULS log shows that the error message "The field 'EventDate' of type 'DateTime' cannot be used in the query filter expression." comes from here:

Microsoft.SharePoint.SPListItemEntityCollectionCamlQueryBuilder.CheckFieldRefUsage(SPField field, FieldRefUsage fieldRefUsage)

With the help of Reflector we see that the SPField.Filterable property is, logically enough, the driving force here. Then, using SharePoint Manager to examine the value of this property for the EventDate ("Start Time") field of the Event content type, we of course find that its Filterable property is set to false, and this is what causes the error message to be returned. I should mention that there is little value in working around this behavior by using calculated fields - they are also rejected as filters by this method. To prove that it works in principle, I created a DateTime column from scratch and named it "Date3". Here is how my REST call looks now:

http://devserver2012/_api/web/lists/getbytitle('events')/items?$select=Id,Title,Description,EventDate,EndDate,fAllDayEvent,Date3&$filter=Date3 ge datetime'2013-06-13T00:00:00Z' and Date3 le datetime'2013-06-14T00:00:00Z'

This works when accessing the list anonymously. Problem solved and lesson learned: do not rely on the OOB Calendar list for anonymously accessible events, or be prepared to add your own custom DateTime columns to that list.
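The working call above can also be assembled programmatically, which keeps the lessons learned (lowercase operators, datetime'...' literals) in one place. A sketch, where "Date3" is the custom column from this post and the server name is illustrative:

```javascript
// Sketch: build the "between" $filter and the full REST URL used above.
// Comparison operators must be lowercase ("ge", "le") and date values
// use the OData datetime'...' literal form with ISO-8601 dates.
function buildDateRangeFilter(field, startIso, endIso) {
  return field + " ge datetime'" + startIso +
    "' and " + field + " le datetime'" + endIso + "'";
}

function buildItemsUrl(webUrl, listTitle, selectFields, filter) {
  return webUrl.replace(/\/$/, "") +
    "/_api/web/lists/getbytitle('" + listTitle + "')/items" +
    "?$select=" + selectFields.join(",") +
    "&$filter=" + filter;
}
```

For example, buildDateRangeFilter("Date3", "2013-06-13T00:00:00Z", "2013-06-14T00:00:00Z") produces exactly the filter expression from the URL above, and the helper makes it easy for the calendar code to plug in whatever range FullCalendar hands it.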

At the time of this writing I was running SharePoint Server 2013, Enterprise CAL, March 2013 update.

Wednesday, May 29, 2013

Customizing Search Navigation on a SharePoint 2013 Publishing Site

Managed metadata-driven navigation in SharePoint 2013 lets us use clean, implementation-independent URLs, which are often required in WCM applications. I was working on a web presence site for my customer, which used clean URLs. A simple search interface was used on it: a search box on the master page, and a search results page containing a search results web part and a search box web part. I wanted that search box web part to have a drop-down list of options so that Internet users could choose whether to search the entire site or just the blogs.

In the past, with SharePoint 2007 and 2010, we used search scopes to achieve this effect. It works differently in SharePoint 2013. Under Site Settings >> Search Settings you need to define links to the different destination pages:

Fig. 1 - Site Collection Search Settings

These links point at two different search results pages and pass the user's search keywords to these pages as a query string. In my case the first link, "All Results", was pointing at the site search results page, and the second one, "Blog Results", at the blogs listing page.

Next, we need to configure the search box web part. If you want the drop-down menu in the search box residing on the master page, you simply need to select the option titled "Turn on the drop-down menu inside the search box, and use the first Search Navigation node as the destination results page." shown on Fig. 1 above. In my case I wanted no drop-down menu there, but needed one on the search results page. This is done on the search box web part as shown on Fig. 2.

Fig. 2 - Search Box ToolPart showing how to configure the drop-down menu

That's pretty easy and gets you what you need. There is one problem, however. Look at the Search Settings page illustrated on Fig. 1 - the URL is pointing at an *.aspx page, which we certainly would like to change from /auth/en-ca/Pages/Search-Results.aspx to something like /auth/en-ca/results. This should be simple, especially as exactly that is done under the "Which search results page should queries be sent to?" section on the same page. Well, it turns out that it does not work if you attempt it through the UI - it will complain about an incorrect URL format unless you give it a page reference.

These links are referred to as "Search Navigation nodes" for a reason. If you open PowerShell and check out the navigation of your current web, you will see a new property named SearchNav on the Microsoft.SharePoint.Navigation.SPNavigation type:

Fig. 3 - SearchNav is a new property of the SPNavigation type in SharePoint 2013

OK, great. The next obvious thing to do is to change the URL of the "All Results" Search Navigation node from /auth/en-ca/Pages/Search-Results.aspx to /auth/en-ca/results using PowerShell. Guess what - it silently fails to update the URL property.

The way I got it to work was by creating the navigation nodes entirely from PowerShell. I also had to use the SPNavigationNode constructor overload accepting a Boolean isExternal flag, and set it to $true in order to bypass the URL check that SharePoint performs, which for some reason rejects managed metadata-based navigation URLs. The PowerShell workaround in my case looks as follows:

$web = Get-SPWeb http://devserver2012/auth/en-ca

$node1 = new-object `
 -TypeName "Microsoft.SharePoint.Navigation.SPNavigationNode" `
 -ArgumentList "All Results", "/auth/en-ca/results", $true
$node2 = new-object `
 -TypeName "Microsoft.SharePoint.Navigation.SPNavigationNode" `
 -ArgumentList "Blog Results", "/auth/en-ca/blogs", $true

# Add the new nodes to the web's search navigation.
$web.Navigation.SearchNav.AddAsLast($node1) | Out-Null
$web.Navigation.SearchNav.AddAsLast($node2) | Out-Null
$web.Dispose()

This gets all my URLs "clean", including the ones pointed at from the search box drop-down menus. Interestingly, after creating the links through PowerShell, you can modify their title, description, or even URL properties through the UI without issues.

I was running SharePoint Server 2013 with March 2013 CU.

Sunday, August 28, 2011

Custom Alerts in SharePoint - Templates or Code?

My client needed a customized alert created each time a new issue was added to a standard Issues list. The only difference from the out-of-the-box alert functionality was that the customized alert email message needed to have the Issue ID in its subject. I hadn't customized alerts before, so I did some research, which showed that:

  • you can customize the HTML of an alert through alert templates and use placeholders to insert values into the email message;
  • there is an API letting you tap into the alert message processing pipeline and customize the email message before it is sent out.

There is a lot of information online on the subject, yet I still had difficulties with this seemingly easy problem. The following resources were most useful to me: MSDN (http://msdn.microsoft.com/en-us/library/bb802949.aspx), Albert Meerscheidt’s post about alerts in WSS 3.0 and Yaroslav Pentsarskyy’s post about customizing alerts in SharePoint 2010.

Armed with this information, the task came down to writing a correct alert template - or so I thought. Take a look at this fragment of an out-of-the-box immediate alert template defining the email message subject text (immediate alerts are sent right away, while digest alerts are sent later as a summary):

<GetVar Name="AlertTitle" />
<HTML><![CDATA[ - ]]></HTML>
<GetVar Name="ItemName" />

The interesting part here is the placeholders "AlertTitle" and "ItemName" and the way they are used. I have a field named "ID" (it is a part of a standard Issues list), and the naive approach of writing <GetVar Name="ID" /> didn't get me anywhere. Same result with adding a custom text column named "ATextColumn" and then doing <GetVar Name="ATextColumn" />. The <GetVar> element is a part of the CAML View Schema and yields the value of a local or global variable set in the current page context, but what are these placeholders, and how are they set? At this point I realized that my effort estimates were a little too optimistic. Then I bumped into a help article about alerts in WSS 2.0. Among other things it had a list of "tags" that can be included in alert templates. Here it is:

  • SiteUrl: The full URL to the site.
  • SiteName: The name of the site.
  • SiteLanguage: The locale ID (LCID) for the language used in the site. For example, 1033 for U.S. English.
  • AlertFrequency: Immediate (0), Daily (1), or Weekly (2).
  • ListUrl: The full URL to the list.
  • ListName: The name of the list.
  • ItemUrl: The full URL to the item.
  • ItemName: The name of the item.
  • EventType: ItemAdded (1), Item Modified (2), Item Deleted (4), DiscussionAdded (16), Discussion Modified (32), Discussion Deleted (64), Discussion Closed (128), Discussion Activated (256).
  • ModifiedBy: The name of the user who modified an item.
  • TimeLastModified: The time the item was last modified.
  • MySubsUrl: The full URL to the My Alerts on this Site page in Site Settings.

Some of these tags are used in the out-of-the-box templates. I tried the rest of them, and all of them worked. So it appears that we are limited to these 12 tags only, and that this is old functionality which survived the WSS 2.0, WSS 3.0 and SharePoint Foundation 2010 releases. If someone knows more about it, please post a comment to validate, disprove or complete this statement.

Another thing that comes out of reflecting over the template XML is the usage of <GetVar Name="OldValue#{Field}" />, <GetVar Name="NewValue#{Field}" /> and <GetVar Name="DisplayName#{Field}" />. These elements are descendants of the <Fields> element for immediate alerts, and of the <RowFields> element for digest alerts. If you inspect the generated alert body HTML, you will notice that the fields (or columns) are iterated over and their values inserted into the body, except when a field is listed inside the <ImmediateNotificationExcludedFields> or <DigestNotificationExcludedFields> elements. So the <Fields> element establishes a loop, and {Field} must be a contextual variable inside this loop. With the above syntax you can get the display name, and the old or new values, of each field into the email body, and exclude the fields you don't want listed.
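To make the nesting concrete, the loop described above might look roughly like this in a template - a sketch only, with the surrounding template structure abbreviated and the excluded field list chosen for illustration:

```xml
<!-- Sketch of the element nesting: <Fields> iterates over the item's
     columns; {Field} refers to the current column in each pass. -->
<Fields>
  <GetVar Name="DisplayName#{Field}" />
  <HTML><![CDATA[: ]]></HTML>
  <GetVar Name="NewValue#{Field}" />
</Fields>
<!-- Columns listed here are skipped by the loop (illustrative list). -->
<ImmediateNotificationExcludedFields><![CDATA[ContentType;Modified;Created]]></ImmediateNotificationExcludedFields>
```

Comparing a fragment like this against the out-of-the-box alerttemplates.xml is the quickest way to see which pieces you can safely change.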

Great, but how do I get the ID into my email's subject? I don't want to list a bunch of fields in the subject, just the ID, so I don't want to use the <Fields> element there. The API did the trick. I created a class implementing the IAlertNotifyHandler interface and used a regular expression to replace a placeholder with the actual value:

public class MessageCustomizer : IAlertNotifyHandler
{
    public bool OnNotification(SPAlertHandlerParams parameters)
    {
        string webUrl = parameters.siteUrl + parameters.webUrl;

        using (SPSite site = new SPSite(webUrl))
        using (SPWeb web = site.OpenWeb())
        {
            string to = parameters.headers["To"];
            string subjectTemplate = parameters.headers["Subject"];
            string itemId = parameters.eventData[0].itemId.ToString();

            // Below we are replacing a placeholder we have
            // created in our alert template with the actual value.
            string subject = Regex.Replace(
                subjectTemplate, "#ID#", itemId);

            bool result = SPUtility.SendEmail(
                web, true, true, to, subject, parameters.body);
            return result;
        }
    }
}

We still need a customized alert template - firstly, to insert our own custom placeholder (in my example, #ID#), and secondly, to register the MessageCustomizer class so that its OnNotification() method gets called. Here is the updated fragment defining the email's subject:

<HTML><![CDATA[Issue ID #ID#: ]]></HTML>
<GetVar Name="AlertTitle" />
<HTML><![CDATA[ - ]]></HTML>
<GetVar Name="ItemName" />

Registration of the MessageCustomizer class and its assembly is done inside the <NotificationHandlerClassName> and <NotificationHandlerAssembly> elements of the template:

<NotificationHandlerAssembly>AlertCustomization, Version=, Culture=neutral, PublicKeyToken=bda7bcef852778f0</NotificationHandlerAssembly>

We can now wire up the template and the handler. The handler's assembly needs to go to the GAC; then you copy and rename the alerttemplates.xml file sitting in the 14\TEMPLATE\XML folder, add your template (or, again, copy an existing one and change it), and register this file with SharePoint by running the stsadm -o updatealerttemplates command. I didn't find PowerShell cmdlets equivalent to this command. Lastly, you need to assign your template to a list using the SPList.AlertTemplate property. You can write a PowerShell script or use a feature receiver. The latter approach is demonstrated in Yaroslav Pentsarskyy's post mentioned earlier.

So we arrive at the standard "It depends..." answer to the question of whether to customize alerts via templates or via code. Regardless of which approach works for you, for any such customization that is not an ad hoc fix or a proof of concept, you are looking at creating a package including a deployment script, a SharePoint solution, and possibly a feature with a feature receiver. The functionality is almost identical between WSS 3.0 and SharePoint Foundation 2010, with the latter adding SMS support. And although with Visual Studio 2010 it is much easier to package things, you are probably still looking at a few hours of work to get it done right.

Tuesday, July 12, 2011

Calling WCF Web Services from a SharePoint Timer Job

Imagine that you are building an enterprise application on top of SharePoint 2010, which is installed on a multi-server farm. The application consumes a WCF web service – custom libraries use generated proxy classes and endpoint configuration is stored inside of a web application’s web.config file. Configuration settings are applied to all servers in the farm by a script when application is provisioned. And your application needs to be deployed to development, testing and production farms, all of which have differences in how WCF endpoints and bindings are configured, including variance in WCF binding types.

Now imagine that you need to call this web service from two places: from your custom web application code and from a timer job - perhaps because you need to cache results for better performance, but also want to be able to fall back to a synchronous call when the cached results get outdated. You will face a complication: how do you configure your WCF client when you invoke the service from a timer job?

You have a few options:

1. Create an OWSTIMER.exe.config configuration file;

2. Construct binding and endpoint objects inside the timer job, set their properties through code, then create a channel or derived ClientBase<T> object and execute a service method call on it;

3. Load the service model configuration from the application's web.config file, and create and populate the appropriate binding and endpoint objects. Then create and use a channel or a ClientBase<T> object to invoke the service method.

Option 1 has issues: it provisions files to locations not intended for custom application files, and it keeps the same configuration information in two locations on every farm server.

Option 2 hard-codes binding information, which makes it very difficult to maintain code and troubleshoot WCF issues in multiple environments.

Option 3 is clearly the best choice, since the configuration is stored in one place, can easily be changed in web.config, and the changes will affect the WCF service client objects in both locations. So let us look at what's involved in implementing option 3.

Before you can load a configuration object from the web.config file you need to locate the file. Here we can leverage the SPIisSettings object, which is available for each security zone. Next, you load the web.config content by using the WebConfigurationManager.OpenMappedWebConfiguration() method:

private System.Configuration.Configuration GetWebConfig(SPUrlZone zone)
{
    SPIisSettings iisSettings = this.WebApplication.IisSettings[zone];
    string rootPath = iisSettings.Path.ToString();

    WebConfigurationFileMap map = new WebConfigurationFileMap();
    map.VirtualDirectories.Add("/",
        new VirtualDirectoryMapping(rootPath, true));

    System.Configuration.Configuration webConfig =
        WebConfigurationManager.OpenMappedWebConfiguration(map, "/");

    return webConfig;
}

Once you have obtained the configuration object, you need to infer the type of the binding from the service model configuration and apply all of its configured attributes, as well as create an endpoint. Given the names of a binding and an endpoint, you find the corresponding BindingCollectionElement and ChannelEndpointElement. The actual binding object is created using .NET reflection and the value of the BindingCollectionElement.BindingType property:

private MyServiceClient MakeMyServiceClient(
    System.Configuration.Configuration config)
{
    // Get endpoint and binding names from settings.
    // GetValue() here stands for your own helper reading
    // application settings (e.g. from SPWebApplication properties).
    string bindingName = GetValue("Key_MyBindingName");
    string endpointName = GetValue("Key_MyEndpointName");

    // Determine endpoint and binding elements used.
    ServiceModelSectionGroup sectionGroup =
        ServiceModelSectionGroup.GetSectionGroup(config);
    ChannelEndpointElement endpointElement = null;

    for (int i = 0; i < sectionGroup.Client.Endpoints.Count; ++i)
    {
        if (sectionGroup.Client.Endpoints[i].Name == endpointName)
        {
            endpointElement = sectionGroup.Client.Endpoints[i];
            break;
        }
    }

    BindingCollectionElement collectionElement =
        sectionGroup.Bindings.BindingCollections.Find(
            item => item.BindingName == endpointElement.Binding);
    IBindingConfigurationElement bindingConfig = new
        List<IBindingConfigurationElement>(
            collectionElement.ConfiguredBindings).Find(item =>
                item.Name == endpointElement.BindingConfiguration);

    // Create address and binding of proper type and populate them.
    Binding binding = (Binding)collectionElement.BindingType.
        GetConstructor(new Type[0]).Invoke(new object[0]);
    bindingConfig.ApplyConfiguration(binding);
    EndpointAddress address = new EndpointAddress(
        endpointElement.Address);

    MyServiceClient client = new MyServiceClient(binding, address);
    return client;
}

That’s it. The credit for the MakeMyServiceClient() method goes to Microsoft. When I was searching for examples of inferring the binding type and properties from configuration, I bumped into the implementation of a read-only property named Binding on an internal type, Microsoft.SharePoint.Administration.Claims.SPSecurityTokenServiceApplication, inside the Microsoft.SharePoint assembly. I have reused that code with minor deviations in the example above. Open up Reflector and take a look at that property.