Thursday, December 23, 2010

Writing Trace Output to ULS Log in SharePoint 2010

SharePoint 2010 supports writing custom messages to the ULS log. A blog post by Waldek Mastykarz of Mavention provides a good example of how to do this. UPDATE 02/24/2011: Andrew Connell's article on MSDN from December 2010 provides essential information on diagnostics logging.

The ability to send messages from the System.Diagnostics.Trace.Write() and Trace.Fail() method overloads to ULS is yet another "logging novelty", and it is the one I will focus on in this post.

There is a new type in the Microsoft.SharePoint assembly that was not there in version 12: Microsoft.SharePoint.SPULSTraceListener. As its name suggests, it allows sending trace messages to ULS.

If you are instrumenting your custom SharePoint code and need to write diagnostic traces to ULS, you can wire up the SPULSTraceListener type in your web.config and use the tracing infrastructure available in System.Diagnostics.Trace. Here are the steps:

1. Add a trace listener element to web.config. For more information see the description of the system.diagnostics element on MSDN.

<system.diagnostics>
  <trace autoflush="true">
    <listeners>
      <remove name="Default"/>
      <!-- the listener name is arbitrary -->
      <add name="ULSListener"
           type="Microsoft.SharePoint.SPULSTraceListener, Microsoft.SharePoint, Version=, Culture=neutral, PublicKeyToken=71e9bce111e9429c" />
    </listeners>
  </trace>
</system.diagnostics>

In this XML snippet note the <remove name="Default"/> element. It removes the default trace listener, which is wired up automatically and displays a pop-up window with the message whenever Trace.Fail() is called.

2. In the Central Administration site, configure diagnostic logging (Central Administration >> Diagnostic Logging). Trace.Write() calls are logged as Verbose, so in order to actually see them in the ULS logs you need to make sure the throttling level is set to Verbose for the SharePoint Foundation > Service Connections category. Trace.Fail() calls are written at the High level, which is enabled by default, so those messages will normally end up in the log file without extra configuration. And yes, all messages are written to the log under the Service Connections category.
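To illustrate, here is a minimal sketch of custom code instrumented with System.Diagnostics tracing; with the listener above in place, these messages end up in ULS. The class name and messages are invented for the example:

```csharp
// Sketch: plain System.Diagnostics tracing calls. With SPULSTraceListener
// wired up in web.config, these messages appear in the ULS log under the
// SharePoint Foundation > Service Connections category.
using System;
using System.Diagnostics;
using System.Web.UI.WebControls.WebParts;

public class InstrumentedWebPart : WebPart
{
    protected override void CreateChildControls()
    {
        // Logged at the Verbose level; requires Verbose throttling to be seen.
        Trace.Write("InstrumentedWebPart: creating child controls");

        try
        {
            base.CreateChildControls();
        }
        catch (Exception ex)
        {
            // Logged at the High level; visible with default throttling.
            Trace.Fail("InstrumentedWebPart: CreateChildControls failed", ex.ToString());
            throw;
        }
    }
}
```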

What About TraceContext.Write()?

The UserControl, Page and HttpContext classes have a property named Trace of type System.Web.TraceContext. It allows tracing messages using the ASP.NET trace infrastructure, for example HttpContext.Current.Trace.Write(string message). You can send ASP.NET trace messages to ULS using the standard technique, but with a caveat:

Add a trace configuration element under the system.web element in your web.config as follows:

<trace enabled="true" requestLimit="1000" mostRecent="true" writeToDiagnosticsTrace="true"/>

Here the key attribute is writeToDiagnosticsTrace, which instructs ASP.NET tracing to forward messages to the System.Diagnostics infrastructure. Now trace messages written using TraceContext.Write() will be sent to ULS.
Now the caveat: adding the system.web/trace element appears to destabilize SharePoint. I tried this on SharePoint Foundation and SharePoint Server 2010 with the October 2010 CU, and although tracing works as expected, I got a consistent error when trying to create a new web part page.
Looking at the ULS log, I found these messages logged under the SharePoint Foundation category:

SharePoint Foundation    General    b9y3    High    Failed to open the file 'C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\Resources\wss.resx'.

SharePoint Foundation    General    b9y4    High    #20015: Cannot open "": no such file or folder.   
SharePoint Foundation    General    b9y4    High    (#2: Cannot open "": no such file or folder.)   
SharePoint Foundation    General    b9y9    High    Failed to read resource file "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\Resources\wss.resx" from feature id "(null)".

As soon as I disable the system.web/trace element, the error disappears. I am not sure whether this is a bug or whether adding the <trace> element under <system.web> is simply not supported. In any case there are plenty of rich diagnostic features available, so if you are using tracing in your application, just avoid System.Web.TraceContext.Write() in favor of System.Diagnostics.Trace.Write(), or at most use it temporarily if you need to. By the way, System.Diagnostics.Debug.Write() also sends messages to the trace log.

UPDATE (01-16-2011): There is yet another issue caused by turning on trace that I ran into today: SharePoint Designer cannot open a site in a web application for which tracing is enabled, and keeps prompting for credentials. If you check the traffic with Fiddler, you will find that SPD makes requests to http://devserver2010/_vti_bin/shtml.dll/_vti_rpc, which return 401 UNAUTHORIZED. Not even verbose logging reveals anything useful. Commenting out the <trace> element in web.config restores normal functionality.

Monday, December 13, 2010

Configure Links Web Part to Open URLs In a New Window With SharePoint 2010

With SharePoint 2010 (or SharePoint Foundation 2010) you can customize list views using XSLT, which is very powerful and, for simple changes, also quite simple. Here is an example of how to make the URLs in a Links list web part open pages in a new browser window.

The Links web part, like other list-based web parts, uses a predefined list view to render its content. Since views are now defined using XSLT style sheets, all that is needed to control the HTML is to override the appropriate XSLT template with a custom one containing the markup we need, for example <a href="…" target="_blank">. The web part's edit mode panel contains a text field titled XSL Link under the Miscellaneous category where you can provide the URL of an XSLT file, which will override the default rendering behavior.

Let's say we have a Links list with a few URLs in it and we have added a Links web part to a page. We want all links shown by the web part to open in a new window, so we need to find the XSLT template that renders the field containing the link. One easy way to determine the field name is to sort by the URL field and note the value of the SortField query string parameter, which tells us that the field we need is URLNoMenu. This field is not part of the default view (the default view uses the URL (URL with edit menu) column), so you need to add a column titled URL. As shown in the picture below, I added my own view "Only Url" to the Links list with this field only and set the web part to use this view.


Now we need to find the XSLT template for the URLNoMenu field. Out-of-the-box templates are stored under the 14 hive in TEMPLATE\LAYOUTS\XSL on the server file system. For our purposes there are two important files there: main.xsl and fldtypes.xsl. The former is the default file used for formatting, which imports other XSLT files, including fldtypes.xsl. The latter contains many templates for rendering different fields. A search inside this file for URLNoMenu yields:

<xsl:template name="FieldRef_URLNoMenu_body" ddwrt:dvt_mode="body" match="FieldRef[@Name='URLNoMenu']" mode="Computed_body">

Next we copy this template element into a new XSLT file; I named mine TargetBlank.xsl. Inside the new file we change the HTML anchor tag to include target="_blank" (or anything else we need) and save the file. Once we specify the URL of this new file in the XSL Link field of the web part being edited, it will override all default formatting, so we also need to make sure that the other templates still apply. Thanks to the XSLT import and override mechanism this is easy to do: we just need to import the main.xsl file, and our version of the FieldRef_URLNoMenu_body template will override the one inside fldtypes.xsl. Here is the complete listing of TargetBlank.xsl:

<xsl:stylesheet xmlns:x=""
    xmlns:xsl=""
    xmlns:msxsl="urn:schemas-microsoft-com:xslt"
    xmlns:ddwrt=""
    version="1.0"
    exclude-result-prefixes="xsl msxsl ddwrt">
  <xsl:import href="/_layouts/xsl/main.xsl"/>
  <xsl:output method="html" indent="no"/>
  <xsl:template name="FieldRef_URLNoMenu_body" ddwrt:dvt_mode="body"
      match="FieldRef[@Name='URLNoMenu']" mode="Computed_body">
    <xsl:param name="thisNode" select="."/>
    <xsl:variable name="url" select="$thisNode/@URL"/>
    <xsl:variable name="desc" select="$thisNode/@URL.desc"/>
    <xsl:choose>
      <xsl:when test="$url=''">
        <xsl:value-of select="$desc"/>
      </xsl:when>
      <xsl:when test="@Format='Image'">
        <img onfocus="OnLink(this)" src="{$url}" alt="{$desc}"/>
      </xsl:when>
      <xsl:otherwise>
        <a onfocus="OnLink(this)" href="{$url}" target="_blank">
          <xsl:choose>
            <xsl:when test="$desc=''">
              <xsl:value-of select="$url"/>
            </xsl:when>
            <xsl:otherwise>
              <xsl:value-of select="$desc"/>
            </xsl:otherwise>
          </xsl:choose>
        </a>
      </xsl:otherwise>
    </xsl:choose>
  </xsl:template>
</xsl:stylesheet>

In this file the attributes of the xsl:stylesheet element are copied from the main.xsl file. Also note the xsl:import and xsl:output statements. Lastly, we need to upload the XSLT file and reference its URL in the XSL Link field of our web part. That's all there is to it: our links should now open in a new browser window.

Wednesday, October 20, 2010

Decks, Source Code and Resources from October 19 MSPUG Meeting

Thanks a lot to the folks who showed up yesterday at the Mississauga User Group meeting, which celebrated the group's first anniversary. I'd also like to thank Ray Outair for his efforts in running the group. I have posted the deck for my presentation "Leveraging SharePoint 2010 Search Technologies" and the source code of the demo web part demonstrating FQL capabilities on the SoftForte site here.

If you are planning a search deployment, I highly recommend checking out the technical diagrams on TechNet: there are four diagrams dedicated specifically to Search. Also check out the downloadable content here; it has been updated recently.


As I mentioned at the meeting, I also recommend the book Professional Microsoft Search: FAST Search, SharePoint Search, and Search Server by Mark Bennett, Jeff Fried, Miles Kehoe and Natalya Voskresenskaya. It is a great resource on the planning and architecture of SharePoint Enterprise Search, and specifically the FAST Search products.

Last but not least is a reference to Steve Peschka's blog post where he introduces a Search Explorer tool that he wrote. Steve's blog is an essential resource on Search development and other SharePoint development topics.

To recap, here are a few questions we left unanswered yesterday:

1. FAST visual best bets were displayed but did not get filtered based on the User Context's managed properties. The question is how to create/promote properties into the user context so that such filtering happens correctly. UPDATE: I figured this one out.

2. I had no luck extending CoreResultsWebPart to make it process custom FQL queries and was forced to write a web part from scratch instead. The question is whether there is a supported way to extend CoreResultsWebPart for FAST search queries.
UPDATE: this one is now also addressed. Check it out here.

3. Somebody mentioned that they could not get the PowerPoint visual preview to work with FAST search results, suspecting that SSL encryption between the FAST Search Server and the Query SSA was to blame. It looks like some research is needed here.

I plan to look into these and post my findings. Please feel free to comment if you get answers sooner.

Sunday, September 19, 2010

Enumerate SharePoint 2010 Databases

If you work with SharePoint in a lab environment and use one shared database server to store databases from multiple farms, then with SharePoint 2010 you can quickly lose control of your databases, since there are many more of them now. When you set up a lab farm you typically run the Products Configuration Wizard, which leaves you with several databases whose names contain GUIDs, making it hard to distinguish between them and relate them to their farms.

Let's say you need to get rid of a farm and want to delete the corresponding databases afterwards. What do you do? If you haven't yet disconnected from the farm and uninstalled SharePoint, PowerShell can help. Log in to one of the farm servers, start the SharePoint 2010 Management Shell and enter this command:

Get-SPDatabase | % {$_.Name}

It will list the names of all databases used by the current farm. You can now create a script to drop these databases or do whatever else you need with them. If, however, it is too late and you have already uninstalled your servers, you can still get the list of database names using a SQL query:

USE SharePoint_Config
select distinct name from Objects WITH(NOLOCK) where Name in (
select Name COLLATE Latin1_General_CI_AS_KS_WS from sys.databases WITH(NOLOCK))

Replace SharePoint_Config with the name of the configuration database for your farm. SharePoint products use the Latin1_General_CI_AS_KS_WS collation, so you need the COLLATE cast. Issuing queries directly against SharePoint databases is of course not supported, so I would use this approach in a development environment only, and only when you are really cleaning it up and PowerShell isn't an option.
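If you do still have farm access, the PowerShell one-liner above can be turned into a cleanup script. Here is a rough sketch (the output path is just an example):

```powershell
# Sketch: generate a DROP DATABASE statement for every database in the
# current farm. Review the file carefully before running it in SQL Server.
Get-SPDatabase | ForEach-Object { "DROP DATABASE [$($_.Name)];" } |
    Out-File C:\temp\DropFarmDatabases.sql
```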

Alternatively, to avoid the mess in your database names from the start, you can pre-create them; here is the guidance. Lastly, although the guidance does not suggest sharing the database role between farms in a production environment, it is quite practical to share a SQL Server in a development environment if you are not using your farm for performance and capacity testing.

UPDATE: TechNet has a list of databases used by SharePoint (a section in the storage and SQL Server capacity planning article).

Monday, August 30, 2010

Slipstreamed Installation of SharePoint Server 2010 with June 2010 CU

I wanted to create a slipstreamed installation point for SharePoint Server 2010 that would include the June 2010 CU. I could not find guidance on how to do this for the current version. The MOSS 2007 guidance recommends extracting patches into the \Updates folder of the installation point, and this works for SharePoint 2010 as well. Would the installer also pick up and apply executable packages, or their extracts in subfolders under \Updates? I tried both variants; it does not.

The reason I went down that path is that the June 2010 CU consists of six packages, which is unusual given the single-file format of regular MOSS CUs. Also, when extracting each of the patches manually to a common folder, the following two files are duplicated between the KB983319 and KB983497 patches: osrchwfe-x-none.msp and osrchwfe-x-none.xml; and the following two between KB2281364 and KB983497: pplwfe-x-none.msp and pplwfe-x-none.xml. Judging by the KB articles the files appear to be identical, so it is OK to overwrite them, which is what I did using a script:

office-kb2124512-fullfile-x86-glb.exe /extract:C:\temp\sharepoint\Updates
office-kb2204024-fullfile-x64-glb.exe /extract:C:\temp\sharepoint\Updates
office-kb2281364-fullfile-x64-glb.exe /extract:C:\temp\sharepoint\Updates
office-kb983319-fullfile-x64-glb.exe /extract:C:\temp\sharepoint\Updates
office-kb983497-fullfile-x64-glb.exe /extract:C:\temp\sharepoint\Updates
spf-kb2028568-fullfile-x64-glb.exe /extract:C:\temp\sharepoint\Updates

Next I copied extracted files to the \Updates folder in my installation point.

After installing SharePoint and running the configuration wizard, most of the patches were applied, except for KB2124512 and KB2204024, according to a DLL version check. I am running Windows Server 2008 R2 and SQL Server 2008 R2, and made a farm installation of SharePoint. Either these patches do not apply in my environment, or something is not working in my installation; I will post an update if I find out more. For now I consider the slipstream approach to be generally the same as before, even for multi-file CUs such as this one.

Thursday, May 27, 2010

Modified Date Field is not Visible after Content Deployment

It is the simple things, or at least the ones you assume must be simple, that often create extra anxiety with SharePoint, at least from what I've seen. Here is one I ran into a couple of days ago: my customer runs a SharePoint 2007 publishing portal with the February 2010 CU, where content is created in an authoring environment and a content deployment job then transfers it to a production farm located in the perimeter zone and accessible to anonymous Internet users. The application uses a significant amount of custom code. In many cases the SharePoint API is used as a data provider and is accessed through an adapter layer, which helps decouple the custom code from the platform. Quite logical.

An example of how the adapter accesses the SharePoint API is retrieving SharePoint document library information through SPList.GetItems(SPQuery).GetDataTable(). The DataTable instance is then returned to the consuming custom code. The problem is that the Modified column is missing from the returned DataTable in the production farm, while it is present in the authoring farm, which breaks a lot of application logic.

It turns out that a Content Deployment Path setting (Central Administration > Operations > Manage Content Deployment Paths and Jobs > Content Deployment Path) is to blame: Security Information must be set to All for the issue to go away, and you need to re-create the target site collection. This will not be a completely painless move: you will start getting a content deployment warning, "User security information cannot be properly imported without setting UserInfoDateTime option to ImportAll". This happens because the Deploy User Names checkbox is cleared, as the guidance recommends. Here is a related post about this warning.

What actually happens is that when Security Information is not set to All, certain fields end up hidden, including the Modified date field. GetItems(SPQuery) will still return the correct items satisfying the SPQuery parameter, but if you inspect the SPField properties of the returned items you will see that:

 item.Fields["Modified"].CanToggleHidden == false;
item.Fields["Modified"].Hidden == true;

and the SPListItemCollection.GetDataTable() method omits hidden fields when it constructs the DataTable. Knowing this, you can either edit the content deployment path and do a clean content deployment, or, if that is problematic for some reason, change your custom code to construct the DataTable manually rather than using SPListItemCollection.GetDataTable().
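As a rough sketch of the second option (the helper name and the fixed field list are my own for illustration, not the customer's actual adapter code), the DataTable can be built by hand so that hidden fields like Modified are still included:

```csharp
// Sketch: build a DataTable from an SPListItemCollection manually, so that
// fields marked Hidden (e.g. "Modified") are still included. Columns are
// typed as object for simplicity.
using System.Data;
using Microsoft.SharePoint;

public static class ListDataHelper
{
    public static DataTable ToDataTable(SPListItemCollection items, params string[] fieldNames)
    {
        var table = new DataTable();
        foreach (string name in fieldNames)
            table.Columns.Add(name, typeof(object));

        foreach (SPListItem item in items)
        {
            DataRow row = table.NewRow();
            foreach (string name in fieldNames)
                row[name] = item[name]; // the indexer works even for hidden fields
            table.Rows.Add(row);
        }
        return table;
    }
}
```

Usage would then be something like: DataTable dt = ListDataHelper.ToDataTable(list.GetItems(query), "Title", "Modified");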

UPDATE: The issue was fixed in the April Cumulative Update. Thanks to Bill Brockbank for pointing this out. A good moment to emphasize the importance of running up-to-date software.

Thursday, April 29, 2010

Records Management in SharePoint 2010

SharePoint 2010 has a lot more to offer in the records management space than its predecessor. Rez will be presenting at the Mississauga SharePoint User Group on May 4th; the topic is SharePoint 2010 Records Management Features. Rez has many years of experience with SharePoint and deep knowledge of the technology. I look forward to, and highly recommend, next week's session.

Wednesday, April 21, 2010

Toronto SharePoint User Group Meeting Follow Up

Thanks for coming out to the April 21 TSPUG meeting. I was excited to talk about performance once again, and I hope you found the meeting useful. As you can see from my blog, I gave the same talk a month earlier at the Toronto SharePoint Camp, and the decks are pretty much identical; the difference is that yesterday I had enough time to do decent demos on load testing. If you are interested in the material from the slides or need a reference to one of the resources listed there, just use my deck from the camp.

Sunday, March 21, 2010

Toronto SharePoint Camp 2010

It was my second time taking part in the event, and it was fun again. Great thanks to Eli, Bill, Ruven, Kanwal, Graham and the other people who organized it. Thanks to the folks who showed up and dedicated their Saturday to SharePoint! SoftForte was the gold sponsor of the event.

For those interested in the references and links related to my presentation "Planning and Measuring Performance of a SharePoint Farm", the deck is available for download. It will also soon be available from the SharePoint Camp site.

Due to time constraints at the camp I excluded some details and the entire topic of performance tuning best practices from my talk, but I left these slides in the downloadable version of the deck, as it is all useful information with many good references.

Thursday, January 14, 2010

Stand-Alone Network Emulator for VS2010 Beta 2

At the "Load Testing SharePoint 2010 with VSTT" presentation at SPC 2009 I learned about the new network emulator, which allows one to model various network latencies and bandwidth values. The emulator is available as part of Visual Studio 2010. Excited, I found the following two great blog posts by Lonny Kruger: the first describing how to enable the emulator through VS2010, and the second how to write a stand-alone emulator that relies on the VS2010 API but does not require running a load test to simulate specific network characteristics. For reference, here is an MSDN page on the topic.

The only problem was that these two blog posts were written for the Beta 1 release of Visual Studio 2010, while I already had Beta 2, and some changes made in Beta 2 are worth noting here. I have written a stand-alone emulator using Lonny's post as a starting point, changing a few things to get it working with Beta 2. The tool can be downloaded here, and its source code here. Below is a list of prerequisites for getting the emulation tool running, and a description of what I changed compared to Lonny's original post and why. I recommend reading his two posts first for better context.


You will need Visual Studio 2010 Beta 2 installed in order to run the emulator. I have tested the tool with VS2010 Beta 2 Ultimate Edition on a 64-bit Windows 7 box. You will also need .NET Framework 4.0, which is installed by default with Visual Studio. Once you have that, you need to install the network emulation driver. This is done only once, through the Visual Studio UI; from that point on you don't need to start Visual Studio to run the stand-alone emulator.

Enabling the Driver

The driver is accessed via the Test Settings dialog, so you need to create test settings first. One way to do that is to create a new test project: File >> New Project… >> Test Project. Below is a screen shot of the project type selector dialog.


Once you have a test project, double-click the Local.testsettings solution item in Solution Explorer to bring up the Test Settings dialog, then enable Network Emulation as shown in the following screen shot:

Enabling Emulation

Lastly, to enable the driver, click the Configure button for the selected and checked Network Emulation entry and follow the prompts to confirm the driver installation:

Installing Driver

You can now use the network emulator as part of Visual Studio web performance and load tests, or close Visual Studio and use the stand-alone tool, which requires administrator rights.


Stand-Alone Network Emulator Source Code

I started off by copying Lonny's sample code, except that I used a WPF application as opposed to WinForms. Once I built and ran it, I kept getting negative long values returned through interop from the NativeNetworkEmulationAPI class listed in Lonny's post. I tried running the program as administrator; same errors. So, using Reflector, I inspected the Visual Studio network emulation classes and found a type that appears to serve the same purpose as NativeNetworkEmulationAPI: Microsoft.VisualStudio.QualityTools.NetworkEmulation.NetworkEmulationDriver. All I needed to do was change the method calls to use this class instead of NativeNetworkEmulationAPI.

Regarding the network profiles, you have to edit these XML files following Lonny's instructions to make them work. If you simply use the out-of-the-box profiles, you will not get an error, but the emulation won't start. I have already copied and changed the default network profiles, so you only need to do this if you want to add a new custom profile, in which case just place the profile XML file into the Profiles sub-folder and restart the emulator.

Wednesday, January 13, 2010

Presentation Deck from January 12 MSPUG Meeting

I want to thank the people who showed up at yesterday's Mississauga SharePoint User Group meeting. Thanks also to Ray for organizing it and to Peter for sponsoring it. I think it was a great evening.

Peter's part of the presentation was exciting: not only is he a good speaker, but Envision IT has also done some great work. Even though I got through only about 40% of what I wanted to cover in my part before I ran out of time, I hope it was interesting and useful. You can download my deck here. According to Ray it will also be available at some point on the MSPUG site.

Thursday, January 7, 2010

Dynamic Text Highlighting with jQuery

I got my hands dirty with jQuery once again, creating an animated text line with dynamic highlighting of its letters. As usual I started by looking for existing plug-ins; I didn't find quite what I needed, so I wrote my own animation. It is possible there are some I just missed, but thanks to jQuery my own animation ended up being pretty simple, which I want to illustrate here.

The animation dynamically writes out letters and changes their font-weight CSS property to create the highlighting effect. I think it worked out nicely; it even works on my iPhone 3G. You can see the page here. Below are a couple of notes on its source code.

It starts by displaying a semi-transparent banner, using the jQuery animate function to change its left CSS property. The banner (id="motto") is clipped by a surrounding div (id="mottoclip"), which creates the perception of it appearing out of nothing rather than simply sliding horizontally. With jQuery it is as simple as this:

$("#motto").animate(
    { left: "2px" },
    { duration: rolloverDuration, easing: "linear", complete: showMotto });

Once the banner is in place, the showMotto() callback sets up a recursive output of the letters. The recursion is implemented in the writeText() function, which outputs the motto text one letter per recursion, with a timeout between letters. In order to be able to "highlight" a letter, it needs to be placed in its own span; the letter becomes visible once its span is added to the parent container through the jQuery append() method:

var characterSpan = "<span id='mt" + i + "'>" + character + "</span>";

Then we construct a jQuery selector object for each character so that we can easily animate it, and call the animateLetter() function, where we first hide the letter completely, then gradually show it in bold (font-weight: 700), then return it to normal (font-weight: 400):

function animateLetter(jChar, letterShowTimeout, letterResetTimeout)
{
    jChar.css("display", "inline");
    jChar.css("font-weight", 400);
    // reveal the letter over letterShowTimeout ms, then make it bold
    jChar.hide().show(letterShowTimeout, function() { jChar.css("font-weight", 700); });
    // after letterResetTimeout ms return the letter to normal weight
    setTimeout(function() { jChar.css("font-weight", 400); }, letterResetTimeout);
}

At this point we move on to writing out and animating the next letter, and so on.

Lastly, there is an "artistic touch" on top of all this: the easing. Easing determines how an animation ends, to create a pleasant visual effect. You've probably seen menu fly-outs that oscillate slightly at the end of their path, or simply slow down. One great example of this can be seen in the jQuery Scrollable control.

In my case easing was implemented starting at the 14th character of the motto text. From that point the timeout between displaying letters increased quadratically, and the timeout between making each letter bold and resetting it back to normal increased linearly with each iteration:

var letterShowTimeout = 60;
var letterResetTimeout = 90;
var easingStartLength = 14;

if (i > easingStartLength)
{
    var easing = Math.ceil((i - easingStartLength) / 3);
    var easingSq = easing * easing;
    letterShowTimeout += easingSq * letterShowTimeout;
    letterResetTimeout += easing * letterResetTimeout;
}

If you think of the animation as lighting up a string of text with a flashlight, the easing creates the effect of the light spot getting wider towards the end. And yes, all the constants and exponents are empirical; if you want to use something similar, you've got to experiment with the numbers.
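To see what those empirical numbers actually produce, the easing arithmetic above can be packaged into a small standalone function (timeoutsFor is a hypothetical helper name; the constants are copied from the snippet):

```javascript
// Worked example of the easing arithmetic: compute the delays used for the
// i-th character of the motto. Up to the 14th character the delays are
// constant; after that, the show delay grows quadratically and the reset
// delay grows linearly.
function timeoutsFor(i) {
  var letterShowTimeout = 60;
  var letterResetTimeout = 90;
  var easingStartLength = 14;

  if (i > easingStartLength) {
    var easing = Math.ceil((i - easingStartLength) / 3);
    var easingSq = easing * easing;
    letterShowTimeout += easingSq * letterShowTimeout;
    letterResetTimeout += easing * letterResetTimeout;
  }
  return { show: letterShowTimeout, reset: letterResetTimeout };
}

console.log(timeoutsFor(10)); // { show: 60, reset: 90 }  -- easing not started
console.log(timeoutsFor(17)); // { show: 120, reset: 180 } -- easing = 1
console.log(timeoutsFor(23)); // { show: 600, reset: 360 } -- easing = 3
```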