Nintex Workflow

Repairing List Item Permission Actions in Exported Nintex Workflows

While working on a sizeable SharePoint and Nintex Workflow development project recently, I came across a significant issue in Nintex Workflow that the support engineers at Nintex said was not a bug. I disagree with them, and had to find a workaround anyway, so thought I would share it.

The Problem

If you export a workflow from one SharePoint system (e.g. a development farm) and import it into a different SharePoint system (e.g. a production farm), any “Set Item Permissions” actions within the workflow will be broken after the import. I did a bit of digging and discovered why: the exported workflow describes the permission sets in XML by both their names and their internal IDs (large integers), but only seems to use the IDs when importing. Nintex Workflow doesn’t try to correlate the permission sets by name on the destination system, so it presumes it cannot find the permission sets described in the workflow actions.

It’s worth repeating – the Nintex support engineer I dealt with said this was by design. I was quite shocked.

The Solution

If you’re working on a sizeable project, you probably have all the workflows exported to a folder on the filesystem. You can therefore process the files to replace the IDs from the original system with those of the target system. So – we can run the following PowerShell script on the files, while they are sitting on the destination server(s):

$url = "https://server/sites/site_collection/subsite"

# Connect to the destination site (we need its role definitions)
$web = Get-SPWeb $url

# Loop through files in Workflows subdirectory
foreach ( $source_file in $(Get-ChildItem './Workflows' -File | Sort-Object -Property Name) ) {

    Write-Host $("Processing [" + $source_file.Name + "]") -foregroundcolor white
    $file_content = Get-Content "./Workflows/$source_file"

    # repair the role definitions in the XML
    Write-Host " - Repairing Role Definition IDs in XML"
    foreach ($role_definition in $web.RoleDefinitions){
        $pattern     = $('\#' + $role_definition.Name + '\;\#None\;\#[0-9]+\$\$\#\#')
        $replacement = $('#' + $role_definition.Name + ';#None;#' + $role_definition.Id + '$$$$##')
        $file_content = $file_content -replace $pattern , $replacement
    }

    # Write the file into the Modified subdirectory
    $file_content | out-file -encoding utf8 "./Workflows/Modified/$source_file"
    Write-Host $(" - Finished Processing [" + $source_file.Name + "]")
}

# release resources
$web.Dispose()


The above snippet presumes you have all your workflows in a folder called “Workflows”, alongside the PowerShell script. It also presumes a sub-folder called “Modified” exists within the Workflows folder, to put the modified workflows into. The script does a regex search for the role definition names in the XML (the permission sets), and swaps the old IDs for the matching ones on the destination system. After running the script, you end up with a set of workflow export files that import correctly.
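As an aside, the doubled dollar signs in the replacement string are easy to get wrong: in a PowerShell `-replace`, each literal dollar sign in the replacement text must be written as `$$`. A throwaway example (the role name and IDs here are made up for illustration):

```powershell
# "Contribute" and the IDs below are made-up values for illustration.
# In the replacement string, "$$$$" collapses to the literal "$$" that
# precedes the "##" in the workflow XML.
$line        = 'Contribute;#None;#1073741826$$##'
$pattern     = 'Contribute\;\#None\;\#[0-9]+\$\$\#\#'
$replacement = 'Contribute;#None;#1073741827$$$$##'
$line -replace $pattern, $replacement
# -> Contribute;#None;#1073741827$$##
```

Forgetting to double them up (writing `$$##` instead of `$$$$##`) silently produces a single dollar sign in the output, which the import will not recognise.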

In my mind, this entire situation could have been avoided if the developers at Nintex had been a bit more forward thinking. At least there is a solution.

Posted by Jonathan Beckett in Notes

Powershell to Clear a Large SharePoint List

When a SharePoint list grows to millions of rows (cough – Nintex Workflow History List!), it becomes a huge problem to clear its contents down. The following script uses a couple of tricks to essentially set a “cursor” to loop through a huge list and clear it down. It deletes in chunks of 1000 items at a time (using the “batch” function of the SharePoint API), and then empties both the site and site collection recycle bins. It runs repeatedly until the offending list is cleared down.

$site_collection_url = ""
$list_title = "Things"
$batch_size = 1000

$site = get-spsite $site_collection_url
$web = get-spweb $site_collection_url

$list = $web.Lists[$list_title]

$query = New-Object Microsoft.SharePoint.SPQuery
$query.ViewAttributes = "Scope='Recursive'"
$query.RowLimit = $batch_size
$caml = ''
$query.Query = $caml
$process_count = 0

do {

    $start_time = Get-Date
    write-host $(" - [Compiling Batch (" + $batch_size + " items)]") -nonewline

    $list_items = $list.GetItems($query)
    $count = $list_items.Count
    $query.ListItemCollectionPosition = $list_items.ListItemCollectionPosition

    # build a CAML batch of delete operations
    $batch = "<?xml version=`"1.0`" encoding=`"UTF-8`"?><Batch>"

    for ($j = 0; $j -lt $count; $j++) {
        $item = $list_items[$j]
        $batch += "<Method><SetList Scope=`"Request`">$($list.ID)</SetList><SetVar Name=`"ID`">$($item.ID)</SetVar><SetVar Name=`"Cmd`">Delete</SetVar><SetVar Name=`"owsfileref`">$($item.File.ServerRelativeUrl)</SetVar></Method>"
    }

    $batch += "</Batch>"

    write-host " [Sending Batch]" -nonewline
    $result = $web.ProcessBatchData($batch)

    write-host " [Emptying Web Recycle Bin]" -nonewline
    $web.RecycleBin.DeleteAll()

    write-host " [Emptying Site Recycle Bin]" -nonewline
    $site.RecycleBin.DeleteAll()

    $end_time = Get-Date
    $process_count += $batch_size

    write-host $(" [Processing Time " + ($end_time - $start_time).TotalSeconds + "] [Processed " + $process_count + " so far]") -nonewline
    write-host " [Waiting 2 seconds]"

    start-sleep -s 2

} while ($query.ListItemCollectionPosition -ne $null)

# Release resources
$web.Dispose()
$site.Dispose()

The take-away from this script is the ListItemCollectionPosition property of the query object – which appears to work like a cursor. I had never seen it before I started searching for solutions to this problem. It may well be useful again in the future.

Posted by Jonathan Beckett in Notes

Deploying Nintex Workflows via PowerShell

One of the more common tasks when working on a large project is to deploy Nintex Workflows via a PowerShell script. It’s not too difficult, because the SharePoint web front ends will have a copy of NWAdmin on them – installed in the 15 hive by Nintex during installation. If you’re not aware of it, NWAdmin is a command line tool that can – drum-roll – deploy workflows (among many other things).

The snippet of code below shows the general pattern I use to deploy many workflows in one go – essentially listing them all out in an ArrayList, and then looping through it, calling NWAdmin via PowerShell. If nothing else, this is a great example of the horrible syntax PowerShell forces upon you when calling command line applications with parameters.
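As a minimal standalone illustration of that calling pattern (using `pwsh` itself as the stand-in external executable, purely so the snippet runs anywhere PowerShell 7+ is installed):

```powershell
# Stand-in external executable - any command-line tool works the same way.
$cmd = "pwsh"

# Each parameter is a separate array element; PowerShell passes them
# through to the executable as individual arguments.
$prm = "-NoProfile", "-Command", "Write-Output 'hello from child'"

# The call operator (&) invokes the command with the parameter array.
& $cmd $prm
# -> hello from child
```

The same shape appears in the deployment script below: build `$prm` as an array of strings, then invoke with `& $cmd $prm`.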

$web = Get-SPWeb "https://server/sites/site_collection/subsite"

$cmd = "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\BIN\NWAdmin.exe"

$workflows = New-Object System.Collections.ArrayList

# Fill an array with the workflow names, list names and filenames to process
$workflows.Add(("Process A","List A","process_a.nwf")) > $null
$workflows.Add(("Process B","List B","process_b.nwf")) > $null
$workflows.Add(("Process C","List C","process_c.nwf")) > $null

if (Test-Path($cmd)) {
	Write-Host " - NWAdmin Found" -foregroundcolor green
	$list_workflows_path = Resolve-Path $(".\Workflows\")
	if (Test-Path($list_workflows_path)) {
		foreach ($workflow in $workflows) {

			$workflow_name     = $workflow[0]
			$list_name         = $workflow[1]
			$workflow_filename = $workflow[2]

			$nwf_path = "$list_workflows_path$workflow_filename"

			if (Test-Path($nwf_path)) {
				if ($web.Lists[$list_name]){

					write-host $("Deploying '" + $workflow_name + "' to list '" + $list_name + "'") -foregroundcolor white
					$prm = "-o","DeployWorkflow","-workflowName",$("`"" + $workflow_name + "`""),"-nwfFile",$("`"" + $nwf_path + "`""),"-siteUrl",$("`"" + $web.Url + "`""),"-targetList",$("`"" + $list_name + "`""),"-overwrite"
					& $cmd $prm

				} else {
					Write-Host $("SharePoint List not found [" + $list_name + "]") -foregroundcolor red
				}
			} else {
				write-host $("Workflow File Not Found [" + $nwf_path + "]") -foregroundcolor red
			}
		}
		write-host "Complete!" -foregroundcolor green
	} else {
		write-host $("Workflows Directory Not Found [" + $list_workflows_path + "]") -foregroundcolor red
	}
} else {
	Write-Host " - NWAdmin Not Found" -foregroundcolor red
}

# release resources
$web.Dispose()
Posted by Jonathan Beckett in Notes

Repairing Nintex Forms in Exported Nintex Workflows

A little while ago I was working on a SharePoint development project with Nintex Workflows and Nintex Forms. The project was being developed remotely – in a virtual machine – and then deployment scripts were given to the client to install on their development, test, and production farms as testing progressed. Along the way we found a pretty serious bug in Nintex Workflow and Forms, but thankfully found a workaround.

The Problem

When you design the forms for tasks within a Nintex Workflow (using Nintex Forms), the form designs are embedded in the workflow when you export it. If you export a form directly from the form designer, you end up with an XML file describing the form. If you export a workflow containing a task form design from the workflow designer, you end up with an XML file describing the workflow, with the XML describing the task form escaped within it. The problem comes when you export from one farm and import into another: the task forms may fail in the destination system. After a bit of digging, I figured out that the XML describing the forms contains hard-coded server-relative paths to the origin system in Lookup fields, which are not dealt with during the import. At least, that is how I have seen this problem occur; there may be other field types that also have the source system URL baked into them.

It’s worth noting that I have contacted Nintex Support about this issue – I will update when they get back to me.

The Solution

The solution is pretty straightforward really – you can run some PowerShell to read the exported Workflows, replace the relative paths, and write them back. In the example below, we presume that all of the exported workflows exist within a folder called “Workflows”, and the modified versions will be stored in a subfolder called “Modified”.

# Connect to SharePoint
$url = "https://server/sites/site_collection/subsite"
$web = Get-SPWeb $url

$uri = [System.Uri]$url

# server relative path of site the workflows originally existed at
$source_localpath = "/sites/site_collection/subsite"

# server relative path of the site where the workflows are going to be imported
$destination_localpath = $uri.LocalPath

# escape the paths (because we will find both unescaped and escaped versions of the paths)
# the search pattern needs "\\/" (a regex matching a literal "\/"), while
# the replacement text just needs the literal "\/"
$source_localpath_escaped = $source_localpath -replace "/","\\/"
$destination_localpath_escaped = $destination_localpath -replace "/","`\/"

# Loop through files in Workflows subdirectory
foreach ( $source_file in $(Get-ChildItem './Workflows' -File | Sort-Object -Property Name) ) {
    # read the workflow file
    $file_content = Get-Content "./Workflows/$source_file"
    # replace the paths
    write-host " - Replace Paths"
    $file_content = $file_content -replace $source_localpath,$destination_localpath
    $file_content = $file_content -replace $source_localpath_escaped,$destination_localpath_escaped
    # Write the file into the Modified subdirectory
    $file_content | out-file -encoding utf8 "./Workflows/Modified/$source_file"
}

# release resources
$web.Dispose()
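The double layer of escaping is the subtle part, and can be tried in isolation. In the sketch below the paths and sample content are made up; the point is that the search pattern for the escaped form needs `\\/` (a regex that matches a literal `\/`), while the replacement just needs the literal `\/`:

```powershell
# Made-up source and destination paths for illustration
$source      = "/sites/dev/subsite"
$destination = "/sites/prod/subsite"

# Regex pattern that matches the JSON-escaped form "\/sites\/dev\/subsite"
$source_escaped      = $source -replace "/", "\\/"
# Literal replacement text "\/sites\/prod\/subsite"
$destination_escaped = $destination -replace "/", "\/"

# Sample content containing both the plain and the escaped form
$content = 'Url="/sites/dev/subsite/Lists/A" Json="\/sites\/dev\/subsite\/Lists\/A"'
$content = $content -replace $source, $destination
$content = $content -replace $source_escaped, $destination_escaped
$content
# -> Url="/sites/prod/subsite/Lists/A" Json="\/sites\/prod\/subsite\/Lists\/A"
```

Note that the plain-path replacement runs first and cannot accidentally touch the escaped form, because the interleaved backslashes break the match.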

The script results in a modified set of exported workflows, which will import correctly into the destination farm. It’s worth noting that if you don’t do this, you end up in a world of trouble, because you can’t repair the forms the workflow import breaks – and you don’t know they are broken until you try to use them. I discovered the cause after digging through the SharePoint ULS logs.

I’m amazed that this particular bug got past quality control at Nintex, but then it’s a fantastically complex piece of kit, as is the SharePoint platform it sits on top of. It’s also worth noting that this only affects the development of systems using multiple farms for development, testing, and production – if you do agile development in production, you would never see this problem.

Posted by Jonathan Beckett in Notes