Nested “Include” clauses with SharePoint JSOM API

Recently I needed to load properties two levels deep when using the JSOM (JavaScript Object Model) API in SharePoint 2013. Since it took me a while to find the syntax for it, I figured I’d save others the trouble and just put it right here.

var ctx = SP.ClientContext.get_current();
var folder = ctx.get_web().getFolderByServerRelativeUrl('/server-relative-path/to/folder');
var subfolders = folder.get_folders();

// Here are the nested "Include" clauses. The first loads the
// 'Name' and 'Files' properties of each SP.Folder object in
// the subfolders collection. The second goes a level deeper
// and loads the 'Name' property of each file in the 'Files'
// collection.
ctx.load(subfolders, 'Include(Name, Files.Include(Name))');

ctx.executeQueryAsync(
   function() {
      var folderLooper = subfolders.getEnumerator();
      while (folderLooper.moveNext()) {
         var subfolder = folderLooper.get_current();
         var fileLooper = subfolder.get_files().getEnumerator();
         while (fileLooper.moveNext()) {
            var file = fileLooper.get_current();

            // This line would throw a 'property or field not initialized'
            // exception if I hadn't used the nested Includes above to load
            // the 'Name' property of each file in the collection.
            alert(file.get_name());
         }
      }
   },
   function(sender, args) {
      alert('Error: ' + args.get_message());
   }
);

Making a SharePoint 2013 Workflow Wait for Document Check-In

I was working on a SharePoint 2013 workflow the other day and turned on the “Start workflow automatically when an item is created” setting so my workflow would run whenever a document was uploaded into a document library.

The caveat is SharePoint runs the workflow immediately upon upload and doesn’t wait for an initial check-in to occur (which is a separate step if your document library has required metadata fields that must be filled out).

I needed a way to make the workflow wait for the initial check-in to occur so all the data would be present that was needed by the workflow.

Since the “Current Item” variable in my workflow included a “Level” field (indicates the check-in status of an item), I initially thought I could use the Wait for Field Change in Current Item action and wait for that field to change to “1.”

However, that didn’t work. The workflow would get stuck at the wait action every single time and never come out of it (even when the level changed to 1). I don’t know why, but for some reason that action doesn’t work as expected with the “Level” field.

The good news is I finally found a solution that worked: using a loop.

As shown in the following screenshot, I added a “Wait for Check-In” loop that checks the value of Current Item:Level once per minute and exits the loop when the value is no longer 255 (checked out).

So far that’s worked consistently.
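In case the screenshot isn't available, the loop amounts to the following in SharePoint Designer terms (a sketch; exact action wording may differ slightly from what appears in the designer):

```
Loop: Wait for Check-In
   The contents of this loop will run repeatedly while:
      Current Item:Level equals 255
   (loop body)
      Pause for 0 days, 0 hours, 1 minutes
```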

Workflow loop to wait for check-in

SSRS Error – “Element <Query> of parameter query is missing or invalid”

As I told someone earlier this morning, I have mixed feelings about SSRS for SharePoint reporting. SSRS is a great tool, but it’s very obvious at times that it wasn’t originally designed for reporting against SharePoint data. Even in the 2012 version, the UI and behavior of SSRS with SharePoint can be a bit quirky.

Today’s strange issue was the following error: “Element <Query> of parameter query is missing or invalid”

I got this error when I tried to run a very basic report in Report Builder 3.0. I was pulling data from a list in SharePoint 2013 and using the XML (Lists.asmx) connection type.

At first, I thought I had the problem described in this StackOverflow post:

The post describes how SSRS fails to send your query properly if you directly embed CAML markup in your query XML. Instead, you must create a parameter called “query” (all lowercase) in your dataset and put your CAML markup in there. I won’t reproduce all of that here because the SO post does a great job illustrating it.


Even after I did that, I was still getting the error. The reason turned out to be a case-sensitivity problem. I’d copied my original web service query XML from another blog post, and the author of the original post had a syntax error in his XML that I missed.

Here’s how his query should have looked:

  <Method Namespace="" Name="GetListItems">
    <Parameter Name="listName">
      <DefaultValue>My List</DefaultValue>
    </Parameter>
    <Parameter Name="query" Type="xml"></Parameter>
    <Parameter Name="viewFields" Type="xml"></Parameter>
  </Method>
  <ElementPath IgnoreNamespaces="True">*</ElementPath>

In the code I originally copied from his blog, the “Type” attribute for the query parameter was spelled as “type” (all lowercase). Because XML processing in SSRS is case-sensitive, the lowercased “type” attribute caused the error.

Solution for missing SharePoint 2013 search display templates

I was doing some client work today and came across a need to configure a Content Search web part on a page.

I added the web part to the page and immediately got an error about the “Control_List.js” template. The error said the template was missing or had a syntax error.

The first thing I did was navigate to the display template folder at Master Page Gallery > Display Templates > Content Web Parts.

To my surprise, the folder was empty. However, the other display template folder (Display Templates > Search) had all the normal templates in it, so I wasn’t sure why only the web part templates were missing. I did a Google search and found absolutely nothing. The client’s environment had been upgraded from SharePoint 2010 to SharePoint 2013 earlier this year, so it’s certainly possible the display templates weren’t provisioned properly during the upgrade.

In the end, I discovered the fix was to deactivate and then re-activate the SearchTemplatesAndResources feature (with ID “8b2c6bcb-c47f-4f17-8127-f8eae47a44dd”) on the site collection. This feature isn’t visible in the UI, so I did it with PowerShell using the Disable-SPFeature and Enable-SPFeature cmdlets.
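The commands amount to something like the following, run from the SharePoint Management Shell (the site collection URL below is a placeholder; substitute your own):

```powershell
# The hidden feature that provisions the search display templates
$featureId = '8b2c6bcb-c47f-4f17-8127-f8eae47a44dd'

# Placeholder: replace with your own site collection URL
$siteUrl = 'http://yourserver/sites/yoursite'

Disable-SPFeature -Identity $featureId -Url $siteUrl -Confirm:$false
Enable-SPFeature -Identity $featureId -Url $siteUrl
```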

Triggering the “When a Task Expires” Event in a Single-Server Dev Environment

I was recently building a 2010-based workflow in SharePoint Designer 2013 and needed to test the When a Task Expires event in a task process. I wanted to make sure my task reminders worked as expected before migrating the workflow to production.

I thought that testing this event in a single-server development environment where I had complete control would be easy, but it wasn’t, especially given the lack of good information out there (even in 2015).

When I did a Google search on this, the best information I found was a few forum threads where developers pleaded with Microsoft for information on how to do this. Microsoft’s response was to run the Workflow timer job in your farm. Sometimes they also said to run the Workflow Auto Cleanup and Expiration Policy jobs for good measure.

As it turns out, Microsoft’s answer was partially correct. Running the Workflow timer job is required, but it’s not the whole solution. And the other two jobs are irrelevant.

In this post I’ll cover:

  1. The technical details of how the When a Task Expires event actually works.
  2. How to manually trigger the event for testing in a single-server development environment.

I will NOT cover how to trigger the event in a multi-server farm because I haven’t tested that. I don’t have a non-production multi-server farm available at the moment where I can try it.

And if you’re a business user reading this, you’ll need to get someone in IT to do this for you. The approach I describe here is highly technical and requires direct server access.

How “When a Task Expires” Actually Works

Before you mess around with triggering something, I think it’s important to understand how it actually works.

Here’s what actually happens.

Step 1: The due date is assigned/updated for a task in a task process

This date change triggers the workflow to determine if a task expiration event needs to be scheduled.

For 2010 workflows, SharePoint examines the value of the OverdueRepeat setting on the task process. For 2013 workflows, it looks at the OverdueReminderRepeat setting (same setting, just slightly different name).

If that setting is turned on and if the task due date occurs at or after the current time, a task expiration event is scheduled.

Step 2: A task expiration event is scheduled

The workflow adds a record to the dbo.ScheduledWorkItems table in the content database where the workflow instance is running.

This table is essentially a queue of tasks to be executed by special timer jobs called work item timer jobs. One work item timer job – the Workflow job – is specifically designed to fetch and process work items pertaining to workflows.

If you check this table, you’ll notice that the DeliveryDate column indicates when your task expiration event will be triggered. The workflow sets this to the due date of your task initially and then updates it each time the expiration event fires (if you’ve told the task process to repeat the event more than once).
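If you want to watch this happen in a development environment, a read-only query like the following will show the queued work items. This is purely a diagnostic sketch using the column names described in this post; directly querying SharePoint content databases is unsupported, so never do this against production.

```sql
-- Diagnostic only: run against a DEV content database.
-- Querying SharePoint content databases directly is unsupported.
SELECT Type, BatchId, DeliveryDate
FROM dbo.ScheduledWorkItems
ORDER BY DeliveryDate;
```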

Step 3: The task expiration event is processed and triggered

The Workflow timer job processes workflow work items each time it runs. In an on-premises SharePoint 2013 environment, this job is set to run every 5 minutes by default. In Office 365, Microsoft says it runs every minute.

The timer job relies on at least three fields in the dbo.ScheduledWorkItems table when fetching work items to process:

  • Type – A GUID value of BDEADF09-C265-11D0-BCED-00A0C90AB50F in this column indicates a workflow work item.
  • BatchId – Work items with the same batch ID belong to the same workflow instance. The batch ID is a Base64/binary-encoded version of the workflow instance ID (from the dbo.Workflow table).
  • DeliveryDate – The stored procedure that fetches work items only pulls items where this value is less than or equal to the current system time.
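To illustrate the BatchId encoding, here's a hypothetical Node.js helper (not SharePoint code, and the function name is mine) that mimics how .NET's Guid.ToByteArray() lays out a GUID's bytes before Base64-encoding them:

```javascript
// Mimics .NET's Guid.ToByteArray(): the first three GUID fields are stored
// little-endian, the last eight bytes keep their order. Base64-encoding that
// byte array yields a string of the same shape as the BatchId column.
function guidToDotNetBase64(guid) {
  var hex = guid.replace(/-/g, '');
  var bytes = [];
  for (var i = 0; i < 32; i += 2) {
    bytes.push(parseInt(hex.substring(i, i + 2), 16));
  }
  // Reverse the first three fields (4, 2, and 2 bytes) to little-endian
  var reordered = bytes.slice(0, 4).reverse()
    .concat(bytes.slice(4, 6).reverse())
    .concat(bytes.slice(6, 8).reverse())
    .concat(bytes.slice(8));
  return Buffer.from(reordered).toString('base64');
}
```

For example, the all-zeros GUID encodes to "AAAAAAAAAAAAAAAAAAAAAA==", a 24-character Base64 string representing 16 bytes.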

If your workflow has a task expiration work item where the DeliveryDate is in the past, the event will be fired, and your “When a Task Expires” section will execute.

Note that manually updating the DeliveryDate column in the SQL table (which isn’t recommended anyway) will not work. If you try that, the date will be reset automatically (when the timer job next executes) to the original date the workflow thinks it should fire.

If your task gets deleted, the workflow is canceled, or the task’s due date is reset to a date that occurs before the current system time, the work item for the task expiration event will be removed (or never created in the first place).

Unfortunately that means if you try setting your task’s due date to a past date for testing purposes, it won’t work. The “overdue” event will never fire.

How to Trigger the “When a Task Expires” Event

Triggering this event is painful, so I’ve saved you some time and packaged up all the steps in a PowerShell script.

But first, here’s an overview:

  1. Disable all your timer jobs except the Workflow job (I explain why in Step 3).
  2. Run your workflow, and make sure your task due dates are set later than the current system time (usually the next day is a safe bet).
  3. Adjust the system clock on your server to a time that’s at least as far in the future as your task due date.
    • This is why I had you disable most of your timer jobs in Step 1. Many of those jobs run every few minutes or seconds, and setting your system clock ahead will trigger them ALL to run at the same time.
  4. Manually run the Workflow timer job to process and fire the “When a Task Expires” event.
  5. Reset the system clock back to the current time.
  6. Re-enable all the timer jobs that were disabled in Step 1.

So that’s the process, and that’s why it hasn’t been summed up in a simple, quick answer in Microsoft’s forums.

But again, I’ve made this easy for you. I’ve included PowerShell code below that does everything I mentioned above.

Simply copy and paste the code below into a script file named Trigger-SPWorkflowTaskExpiration.ps1.

*** WARNINGS ***

  1. Do NOT run this on a production farm. It’s intended ONLY for single-server development farms.
  2. Do NOT run this in a multi-server farm (where SharePoint and SQL aren’t on the same box). As I said earlier, I don’t have a non-production multi-server farm available for testing at the moment, so I haven’t tested that scenario and the script doesn’t account for it.
# Disclaimer: Like any free code you find on the Internet, use this at your own
# risk. Neither Bart McDonough nor Incline Technical Group will be held
# liable for any damage done to your environment.
# Usage: Trigger-SPWorkflowTaskExpiration [time-span],
#   where [time-span] is a string that can be parsed into a .NET
#   TimeSpan structure and specifies your desired adjustment to
#   the system time on the computer on which this script is run.
# Example: "Trigger-SPWorkflowTaskExpiration 1.00:00:00"
#   would set the system time ahead 1 day from the current time and
#   then trigger the workflow task expiration event.

# Type name of the workflow timer job that processes task expiration events
$WorkflowJobTypeName = 'Microsoft.SharePoint.Administration.SPWorkflowJobDefinition'

# Default system time adjustment is 1 day into the future
$dateAdjust = [System.TimeSpan]::FromDays(1)

# Use time adjustment from command line if supplied
if ($args.Count -gt 0) {
   [System.TimeSpan]::TryParse($args[0], [ref]$dateAdjust)
}

$timerJobs = Get-SPTimerJob -ErrorAction Stop | where {
   $_.GetType().FullName -ne $WorkflowJobTypeName
}

Write-Output "$($timerJobs.Count) timer job(s) besides the workflow job were found in this farm."
Write-Output "Attempting to disable those jobs..."

$disabledCount = 0
$timerJobs | foreach {
   $job = $_
   $job.IsDisabled = $true
   try {
      # Persist the change; protected jobs throw an exception here
      $job.Update()
      $disabledCount++
   }
   catch {
      $job.IsDisabled = $false
   }
}

Write-Output "$($disabledCount) job(s) successfully disabled. The remaining jobs are protected and cannot be disabled."

$workflowJob = Get-SPTimerJob -Type $WorkflowJobTypeName -ErrorAction Stop
$newTime = (Get-Date).Add($dateAdjust)

Write-Output "Updating system time to $($newTime)..."

$x = Set-Date -Adjust $dateAdjust

Write-Output "Running workflow timer job..."

$lastRunTime = $workflowJob.LastRunTime
$workflowJob.RunNow()

Write-Output "Waiting for workflow job to complete..."

while ($workflowJob.LastRunTime -eq $lastRunTime) {
   Start-Sleep -Seconds 2
   $workflowJob = Get-SPTimerJob -Type $WorkflowJobTypeName
}

Write-Output "Resetting system time back to current time..."

$x = Set-Date -Adjust $dateAdjust.Negate()

Write-Output "Attempting to re-enable timer jobs..."

$enabledCount = 0
$timerJobs | foreach {
   if ($_.IsDisabled) {
      $_.IsDisabled = $false
      try {
         $_.Update()
         $enabledCount++
      }
      catch { }
   }
}

Write-Output "$($enabledCount) job(s) successfully re-enabled."

If you run the script with no command-line arguments, your system clock will temporarily be set one day ahead, and the Workflow timer job will be executed to trigger your task expiration events.

If you supply the optional time-span argument, whatever value you supply will be used as the system clock adjustment. The format of the time-span is a string which .NET can parse into a System.TimeSpan structure. In general, that means d.HH:mm:ss.

For example, running “Trigger-SPWorkflowTaskExpiration 2.05:30:00” would set your system clock ahead by 2 days, 5 hours, and 30 minutes.


Hopefully this will help others out there who want to test task expiration events in their SharePoint Designer workflows. If you discover something new that I missed or have problems with the script, please leave a comment and let me know.

Open-Ended Date Ranges (null dates) with SharePoint queries in SSRS reports

Yesterday I was building an SSRS report in Report Builder 3.0 for a client. The report was querying data in a SharePoint 2013 site, and the client asked me to add date parameters so he could specify a date range for the data that comes back.

Seems simple enough, right? I thought I just needed to add two date parameters to my report, update my query to use them, and I’d be good to go.

Well, that was almost enough.

You see, in my case I wanted to support open-ended date ranges, which means one or both date boundaries can be null/empty. For example, the client might specify January 1, 2015, as the start date, but by leaving the end date null, the client is telling the report to pull everything after the start date. Or, as another example, leaving both dates null would return all data in the list.

The problem is SSRS doesn’t work that way. Since SSRS was originally created to query SQL data, it’s very literal in its interpretation of a null date. As far as it’s concerned, passing a null value for a date parameter means you want data where a particular date field is null. It doesn’t provide an option to interpret the null value as “I don’t care” (open-ended boundary).

The good news is I was able to get SSRS to do exactly what I wanted. If you need this same behavior, check out the YouTube video I made showing how I did it.

SharePoint OpenSearch URL Tokens

In SharePoint 2013, we can configure OpenSearch result sources for searching external providers outside of SharePoint.

The URL for an OpenSearch result source allows several replaceable tokens to be used, but so far I’ve been unable to find the complete set of tokens documented anywhere (even on MSDN).

Therefore, I’m documenting them here for reference. All of the following are valid tokens in OpenSearch provider URLs.

SharePoint OpenSearch URL Token Reference

Token              Replaced With
{searchTerms}      Search query (keywords)
{startIndex}       Start index of the first item to display, relative to the
                   total number of results. Example: if SharePoint is displaying
                   10 items per page, the start index on page 1 is 0, the start
                   index on page 2 is 10, and so on.
{startItem}        Same as {startIndex}
{startPage}        Despite the name, behaves the same as {startIndex}
{count}            Number of items to display per page (a setting in the search
                   results web part)
{itemsPerPage}     Same as {count}
{language}         Language, for example “en-US”
{inputEncoding}    Encoding of the search query request, for example “utf-8”
{outputEncoding}   Desired encoding of the query response, for example “utf-8”
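As a hypothetical example (the host and query-string parameter names are made up; map the tokens to whatever your provider actually expects), a result source URL might look like:

```
http://search.example.com/search?q={searchTerms}&start={startIndex}&num={count}&hl={language}
```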

While the last few tokens may not be used very much, the others are fairly important because features like paging won’t work correctly without them.

As noted above, several tokens behave the same way, so just choose the one you want in those cases. For example, you can pick {count} or {itemsPerPage} to pass the “number of results per page” to the search provider. They both mean the same thing.
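The relationship between the page number, {count}, and {startIndex} can be sketched as follows (a hypothetical helper for illustration only; SharePoint computes this internally):

```javascript
// Computes the {startIndex} value for a given 1-based page number and
// page size ({count}/{itemsPerPage}).
function startIndexForPage(pageNumber, itemsPerPage) {
  return (pageNumber - 1) * itemsPerPage;
}
```

So with 10 items per page, startIndexForPage(1, 10) returns 0 and startIndexForPage(2, 10) returns 10, matching the example in the table above.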

More Details on Paging

For paging in particular, it’s important to know that two things are required to get it working properly in SharePoint. Almost every example I’ve seen of OpenSearch paging with SharePoint misses at least one of these, which is why so many people struggle with it.

#1: The OpenSearch Provider Needs to Accept Paging Parameters

The OpenSearch provider needs to accept parameters for paging such as “start index” and “items per page” and take those into account when searching. If the OpenSearch provider has those arguments, you can pass values for them using the tokens I listed above.

For some services you might want to query such as YouTube, you’ll have to write your own web service as a go-between because YouTube’s v3 API doesn’t support start indexes. Instead, it requires “back” and “next” tokens for paging, which are special strings it sends back with results. SharePoint can’t supply those.

#2: SharePoint Must Know Total Number of Results

The other part of paging is letting SharePoint know how many total results there are. The reason why is simple. When SharePoint receives results, it asks, “Are there more total results than what I’ve been asked to show on one page?” If the answer is yes, paging links are generated. Otherwise, they’re not.

For OpenSearch results, there’s a “totalresults” element which can be included in the XML to let SharePoint know this information. Take a look at the ATOM response example in the OpenSearch specifications for an example. Without this information, SharePoint can’t support paging correctly. You either won’t get paging at all, or you’ll get limited paging functionality that doesn’t traverse the entire result set.
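For instance, a minimal ATOM response carrying the OpenSearch elements might look like this (structure per the OpenSearch 1.1 specification; the titles, URL, and counts are made-up values):

```xml
<feed xmlns="http://www.w3.org/2005/Atom"
      xmlns:opensearch="http://a9.com/-/spec/opensearch/1.1/">
  <title>Example Search Results</title>
  <opensearch:totalResults>4230</opensearch:totalResults>
  <opensearch:startIndex>0</opensearch:startIndex>
  <opensearch:itemsPerPage>10</opensearch:itemsPerPage>
  <entry>
    <title>First result</title>
    <link href="http://example.com/item1"/>
  </entry>
</feed>
```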

Many ATOM and RSS feeds do not include the OpenSearch-specific elements such as “totalresults,” so depending which provider you’re using, you may need to create your own web service in the middle to add those additional elements.