Problems with Breaking Inheritance and Limited Access User Permission Lock Down Mode in SharePoint

What is “Limited Access User Permission Lock Down Mode”?

Let's start by describing a little-known site collection feature called “Limited Access User Permission Lock Down Mode”. When enabled, it stops users from viewing the list containing a file they have been given specific access to. In some cases it also seems to stop Microsoft Office from working correctly.

The reason you might use it is to allow a user read access to a specific file within a SharePoint library without letting them modify the URL in order to see the list – essentially, only the URL to the file will work for them.

If you switch off the site collection feature, the user will be able to at least see the library within SharePoint containing the file they have access to.
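Incidentally, the feature can be toggled from PowerShell as well as from the Site Settings page. A minimal sketch, assuming an on-premises farm with the SharePoint snap-in loaded – the site URL is hypothetical, and “ViewFormPagesLockDown” is the internal feature name I have seen on my farms, so verify it on yours with Get-SPFeature before relying on it:

```
# hypothetical site collection URL - substitute your own
$site_url = "https://server/sites/site_collection"

# confirm the feature exists, and whether it is currently active on the site collection
Get-SPFeature -Site $site_url | Where-Object { $_.DisplayName -eq "ViewFormPagesLockDown" }

# enable lockdown mode
Enable-SPFeature -Identity "ViewFormPagesLockDown" -Url $site_url

# disable lockdown mode
Disable-SPFeature -Identity "ViewFormPagesLockDown" -Url $site_url
```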

How does this relate to Permissions?

It just so happens I developed a PowerShell script for a client that creates sub-sites for projects – breaking permissions inheritance on each sub-site, and wiring up custom groups and permissions for each one (e.g. “Project A”, with groups “Project A Owners”, “Project A Members”, and so on).

It turns out the method used in the PowerShell script to break permissions inheritance on the sub-site was incorrect (although advocated by Microsoft, I might add).

I used the following method:

$web.RoleDefinitions.BreakInheritance($true,$false)

It turns out this does something that is impossible through the SharePoint interface – it not only breaks inheritance and copies the group assignments to the sub-site, it also breaks inheritance of the permission levels (aka “permission sets”), and creates new permission sets tied to the sub-site with the same names as the parent's. The tell-tale sign that this has happened is that checkboxes appear next to the permission set names when viewed from the sub-site (via “View Site Permissions”).

Why is this important? Because when the permission sets are copied, the configuration of the Limited Access User Permission Lock Down Mode feature is also copied – and then if it is enabled, or disabled at the site collection level (it’s a site collection feature, remember), it will not affect sub-sites with broken inheritance.

How can it be fixed?

When you create a sub-site via PowerShell, you need to use a slightly different method to break permissions inheritance:

$web.BreakRoleInheritance($true,$false)

This method copies the existing group assignments, but inherits the permission sets. It’s obviously the method used by the SharePoint interface, which exhibits the same behaviour.

If you have already created a number of sub-sites, they can be repaired by writing a PowerShell script to iterate through them, first reading the groups and roles assigned to them, then re-inheriting, and re-breaking permission inheritance correctly, before re-building the group and role assignments appropriately.
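A rough sketch of such a repair script follows – illustrative rather than battle-tested, assuming the SharePoint snap-in is loaded, a hypothetical site collection URL, and that nothing else customises the role assignments. Note that “Limited Access” bindings cannot be re-added directly, so the sketch skips them. Try this on a non-production farm first:

```
$site = Get-SPSite "https://server/sites/site_collection"

foreach ($web in $site.AllWebs) {
    if ((-not $web.IsRootWeb) -and $web.HasUniqueRoleAssignments) {

        # read the current group and role assignments
        $saved = @()
        foreach ($assignment in $web.RoleAssignments) {
            $role_names = @($assignment.RoleDefinitionBindings | ForEach-Object { $_.Name })
            $saved += ,@($assignment.Member, $role_names)
        }

        # re-inherit, then re-break correctly (inheriting the permission sets)
        $web.ResetRoleInheritance()
        $web.BreakRoleInheritance($false, $false)

        # re-build the group and role assignments
        foreach ($entry in $saved) {
            $assignment = New-Object Microsoft.SharePoint.SPRoleAssignment($entry[0])
            foreach ($role_name in $entry[1]) {
                if ($role_name -ne "Limited Access") {
                    $assignment.RoleDefinitionBindings.Add($web.RoleDefinitions[$role_name]) > $null
                }
            }
            $web.RoleAssignments.Add($assignment)
        }
        $web.Update()
    }
    $web.Dispose()
}

$site.Dispose()
```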

Posted by Jonathan Beckett in Notes, 0 comments

The Internet Explorer jQuery Change Event Puzzle

While working on a huge user interface development project some time ago, I came across some strange behavior from Internet Explorer that I thought might be worth sharing. In simple terms, it seems the .change jQuery event handler does not work consistently across all browsers, all environments, and all situations.

The user interface code I had written did something along the lines of the following:

$(".container").append("<input id='field_a' type='text' />");
$("#field_a").change(function(){
  // do something
});

Looks pretty straightforward, doesn't it? It worked in Chrome, Firefox, Internet Explorer (on my work laptop), and even on the version of Chrome installed within Raspbian on a Raspberry Pi. It didn't work in some cases on laptops used by the client on-site.

I started digging, and eventually found some similar conversations at StackOverflow. I resolved the issue by changing my code to look like this:

$(".container").append("<input id='field_a' type='text' />");
$("#field_a").on("change",function(){
  // do something
});

The only difference is the use of on(), rather than change() directly. The interesting thing about this is that the jQuery documentation specifically states that change() is shorthand for on("change", handler). I beg to differ – something is different about it. I haven't had time to look into the source of jQuery yet, but would be interested to find out if anybody else has ever seen this.


Updating SharePoint List Items with JavaScript

This is a simple example of updating a SharePoint list item via the Javascript CSOM API (I never quite know what to call it – Microsoft vary in their own naming of things).

The basic idea might be that this code would be called from a page within SharePoint, so is able to pick up the client context, and run the code. It will obviously fail if the user has no permissions to update the item in question.

function update_list_item(list_name, id, internal_field_name, value)
{
    // get connection to SharePoint
    var context = SP.ClientContext.get_current();

    // get the current sharepoint site
    var web = context.get_web();

    // get a list from the site
    var list = web.get_lists().getByTitle(list_name);

    // get an item from the list
    var list_item = list.getItemById(id);

    // populate a property of the list item
    list_item.set_item(internal_field_name, value);

    // force sharepoint to save the change
    list_item.update();

    // tell SharePoint to do everything we just talked about
    context.executeQueryAsync(update_list_item_success, update_list_item_failure);
	
    return false;
}

function update_list_item_success(sender, args)
{
    alert("UpdateListItem Succeeded");
}

function update_list_item_failure(sender, args)
{
    alert("UpdateListItem Failed. \n" + args.get_message() + "\n" + args.get_stackTrace());
}

It’s probably worth making a few comments about updating different field types. The two standout ones that always cause trouble are URLs and Dates.

When updating a URL field, you can set the URL and label at the same time by separating them with a comma and a space – e.g. “https://google.com, Google”.

When updating a date field, SharePoint will expect the date in a specific format if you're sending it as text. The easiest way around this without reading up on the ISO 8601 date format that SharePoint expects is just to pass it a Javascript Date object instead of text, and it will be quite happy.


Using Python to convert OPML to HTML

If you follow a number of blogs in a feed reader such as Feedly, wouldn't it be great if you could turn the OPML export directly into nicely formatted HTML for a bulleted list in your own blog, complete with descriptions of each blog from the authors themselves? That's what I thought, so I wrote this Python script to do exactly that.

It looks through each feed in an OPML file, loads the feed, and reads its description, before compiling them all into one chunk of HTML – a list of links ready to drop into a page in a blog. Here's how you might call the script:

python opml2html.py subscriptions.opml > html.txt

And here’s the script to do the work:

import sys,urllib2
import xml.etree.ElementTree as ET

# Prepare a blog object
class Blog:
    def __init__(self,title,url,rss,description):
        self.Title = title
        self.URL = url
        self.RSS = rss
        self.Description = description

# Prepare a blog list
blogs = []

# get the filename passed in
filename = sys.argv[1]
# progress messages go to stderr, because stdout is redirected to the HTML file
print >> sys.stderr, 'Processing ' + filename

# load and parse the file
opml_tree = ET.parse(filename)
opml_root = opml_tree.getroot()

# find the feeds
feeds = opml_root.findall(".//outline")

# loop through the feeds and output their titles
for feed in feeds :

    # Check we have the text and htmlUrl attributes at least (the title and url of the blog)
    if "text" in feed.attrib :

        if "htmlUrl" in feed.attrib :

            # get the properties of the feed
            feed_title = feed.attrib['text']
            feed_url = feed.attrib['htmlUrl']

            feed_description = ""
            feed_rss = ""

            if "xmlUrl" in feed.attrib :

                feed_rss = feed.attrib['xmlUrl']
                
                print >> sys.stderr, feed_rss
                
                try:
                    
                    feed_tree = ET.parse(urllib2.urlopen(feed_rss))
                    feed_root = feed_tree.getroot();
                    descriptions = feed_root.findall('channel//description')

                    if descriptions[0].text is None :
                        feed_description = "No description..."
                    else :
                        feed_description = descriptions[0].text

                    
                except IndexError, e:
                    feed_description = "No description..."
                except urllib2.HTTPError, e:
                    feed_description = "RSS Feed Not Found..."
                except urllib2.URLError, e:
                    feed_description = "RSS Feed Not Found..."

                print >> sys.stderr, feed_description
                print >> sys.stderr, "-"


            blog = Blog(feed_title,feed_url,feed_rss,feed_description)
            blogs.append(blog)

# Sort the blogs
blogs.sort(key=lambda blog: blog.Title)

# build the HTML output - a bulleted list of links with descriptions
html = "<ul>\n"
for blog in blogs :
    html += '<li><a href="' + blog.URL + '">' + blog.Title + '</a> - ' + blog.Description + '</li>\n'
html += "</ul>"

# output HTML
print html

Deploying Nintex Forms with PowerShell

When working on a sizeable project with Microsoft SharePoint, Nintex Workflow, and Nintex Forms, it makes sense to automate deployment as much as possible. While it's straightforward to automate the provisioning of SharePoint assets such as lists, content types, fields, and views through PowerShell, and it's fairly easy to call NWAdmin to deploy workflows, Nintex Forms have always been something of a problem – until Nintex released a Forms web service, that is.

It's still not easy to deploy Nintex Forms via PowerShell, as evidenced by the numerous discussions on both the Nintex community forums, and elsewhere on the internet – with people trying to stick bits of the solution together, and nobody really having the “whole story”. Well, this post describes the whole story. I debated for some time about writing this up, because the amount of effort involved was significant – it gets into that grey area of “this has commercial value”. In the end I decided to share it, because I have taken so much from the community over the years that it seemed time to pay something back.

The following PowerShell snippet essentially loops through an arraylist describing the titles of lists, and associated form XML files, and communicates with the Nintex Forms webservice to upload the XML, and publish the forms. It sounds straightforward – it’s anything but. In reality, the script does the following:

  • Calls SharePoint to get a Form Digest
  • Extracts the Form Digest from the response
  • Prepares a POST Web Request to the Nintex Forms Webservice
  • Reads the Form XML file into a byte array
  • Sends the request to the Forms Webservice, streaming the byte array
  • Captures the response from the Forms Webservice

Here’s the guts of it…

$web = Get-SPWeb "https://server/sites/site_collection/subsite"

[System.Reflection.Assembly]::LoadWithPartialName("System.IO") >> $null
[System.Reflection.Assembly]::LoadWithPartialName("Nintex.Forms.SharePoint") >> $null
[System.Reflection.Assembly]::LoadWithPartialName("Nintex.Forms") >> $null

# Build an arraylist of List names, and form xml filenames
$items = New-Object System.Collections.ArrayList
$items.Add(("List A","form_a.xml")) > $null
$items.Add(("List B","form_b.xml")) > $null
$items.Add(("List C","form_c.xml")) > $null

# Check we can see the folder where the form files are
$forms_path = Resolve-Path $(".\Forms\")
if (Test-Path($forms_path)) {

    # loop through the form files
    foreach ($item in $items) {
        
        $list_name = $item[0]
        $form_filename = $item[1]
        
        $form_path = "$forms_path$form_filename"

        Write-Host $(" - Deploying [" + $form_filename + "] to [" + $list_name + "]") -foregroundcolor white
    
        if (Test-Path($form_path)) {

            Write-Host $(" - Form XML File Found")
            
            if ($web.Lists[$list_name]){
                
                # Get Form Digest
                Write-Host " - Getting Form Digest" -NoNewLine
                
                    # Call SharePoint for the Form Digest
                    $form_digest_request = [Microsoft.SharePoint.Utilities.SPUtility]::ConcatUrls($web.Site.RootWeb.Url, "_api/contextinfo")
                    $form_digest_uri = New-Object System.Uri($form_digest_request)
                    $credential_cache = New-Object System.Net.CredentialCache
                    $credential_cache.Add($form_digest_uri, "NTLM", [System.Net.CredentialCache]::DefaultNetworkCredentials)
                    $http_request = [System.Net.HttpWebRequest] [System.Net.HttpWebRequest]::Create($form_digest_request)
                    $http_request.Credentials = $credential_cache
                    $http_request.Method = "POST"
                    $http_request.Accept = "application/json;odata=verbose"
                    $http_request.ContentLength = 0
                    [System.Net.HttpWebResponse] $http_response = [System.Net.HttpWebResponse] $http_request.GetResponse()
                    [System.IO.Stream]$response_stream = $http_response.GetResponseStream()
                    [System.IO.StreamReader] $stream_reader = New-Object System.IO.StreamReader($response_stream)
                    $results = $stream_reader.ReadToEnd()
                    $stream_reader.Close()
                    $response_stream.Close()

                    # Extract the Form Digest Value from the Response
                    $start_tag = "FormDigestValue"
                    $end_tag = "LibraryVersion"
                    $start_tag_index = $results.IndexOf($start_tag) + 1
                    $end_tag_index = $results.IndexOf($end_tag, $start_tag_index)
                    [string] $form_digest = $null
                    if (($start_tag_index -ge 0) -and ($end_tag_index -gt $start_tag_index))
                    {
                        $form_digest = $results.Substring($start_tag_index + $start_tag.Length + 2, $end_tag_index - $start_tag_index - $start_tag.Length - 5)
                    }
                    
                    Write-Host $(" - Form Digest Retrieved")
                
                # Prepare Web Request
                Write-Host " - Preparing Web Request" -NoNewLine
                    
                    $webservice_url = [Microsoft.SharePoint.Utilities.SPUtility]::ConcatUrls($web.Url, "_vti_bin/NintexFormsServices/NfRestService.svc/PublishForm")
                    $webservice_uri = New-Object System.Uri($webservice_url)

                    # Create the web request
                    [System.Net.HttpWebRequest] $request = [System.Net.WebRequest]::Create($webservice_uri)

                    # Add authentication to request 
                    $request.Credentials = [System.Net.CredentialCache]::DefaultNetworkCredentials

                    # Configure Request
                    $request.Method = "POST";
                    $request.ContentType = "application/json; charset=utf-8";
                    $request.Accept = "application/json, text/javascript, */*; q=0.01"
                    $request.Headers.Add("X-RequestDigest", $form_digest); 
                    $request.Headers.Add("X-Requested-With", "XMLHttpRequest")
                    
                    Write-Host " - Request Prepared"

                # Read XML file into byte array
                Write-Host " - Reading XML File" -NoNewLine
                
                    [system.io.stream] $stream = [system.io.File]::OpenRead($form_path)
                    [byte[]] $file_bytes = New-Object byte[] $stream.length
                    [void] $stream.Read($file_bytes, 0, $stream.Length)
                    $stream.Close()
                    
                    try
                    {
                        $form = [Nintex.Forms.FormsHelper]::XmlToObject([Nintex.Forms.NFUtilities]::ConvertByteArrayToString($file_bytes))
                    } catch [Exception] {
                        $form = [Nintex.Forms.FormsHelper]::XmlToObject([Nintex.Forms.NFUtilities]::ConvertByteArrayToString($file_bytes, [System.Text.Encoding]::UTF8))
                    }

                    $form.LiveSettings.Url = ""
                    $form.LiveSettings.ShortUrl = ""
                    $form.RefreshLayoutDisplayNames()
                    $form.Id = [guid]::NewGuid()

                    $form_json = [Nintex.Forms.FormsHelper]::ObjectToJson($form);
                
                    Write-Host $(" - Json Prepared - [" + $form_json.Length + "] chars")

                # Create the data we want to send
                Write-Host " - Generating Data to Send" -NoNewLine
                
                    $list = $web.Lists[$list_name]
                    $id = "{$($list.ID)}"
                    $data = "{`"contentTypeId`": `"`", `"listId`": `"$id`", `"form`": $form_json }"

                    # Create a byte array of the data we want to send 
                    $utf8 = New-Object System.Text.UTF8Encoding 
                    [byte[]] $byte_array = $utf8.GetBytes($data.ToString())

                    # Set the content length in the request headers 
                    $request.ContentLength = $byte_array.Length;
                    
                    Write-Host $(" - [" + $byte_array.Length + "] bytes prepared")

                # Send the Request
                Write-Host " - Sending the Request" -NoNewLine
                    
                    try {
                        $post_stream = $request.GetRequestStream()
                        $post_stream.Write($byte_array, 0, $byte_array.Length);
                    } catch [Exception]{
                        write-host -f red $_.Exception.ToString() 
                    } finally {
                        if($post_stream) {
                            $post_stream.Dispose()
                        }
                    }
                    
                    Write-Host $(" - Sent [" + $byte_array.Length + "] bytes")

                # Get the Response
                Write-Host " - Processing Response"
                
                    try {
                        [System.Net.HttpWebResponse] $response = [System.Net.HttpWebResponse] $request.GetResponse()

                        # Get the response stream 
                        [System.IO.StreamReader] $reader = New-Object System.IO.StreamReader($response.GetResponseStream())

                        try {
                            $strResult = $reader.ReadToEnd()
                            $jsonResult = ConvertFrom-Json $strResult

                        } catch [Exception] {
                            write-host -f red $_.Exception.ToString() 
                        }
                    } catch [Exception] {
                        write-host -f red $_.Exception.ToString() 
                    } finally {
                        if($response) {
                            $response.Dispose()
                        }
                    }
            
            } else {
            
                # List not found
                Write-Host $(" - List [" + $list_name + "] not found") -foregroundcolor red
                
            }
            
        } else {
        
            # form_path not found
            Write-Host $(" - Form Path [" + $form_filename + "] not found") -foregroundcolor red
        
        }
        
    } # foreach item in arraylist

} else {

    # forms_path not found
    Write-Host $(" - Forms Path [" + $forms_path + "] not found") -foregroundcolor red
}

# release resources
$web.Close()
$web.Dispose()

Hopefully this will be useful to somebody, somewhere. I read a lot of documentation to come up with this method, and also looked at pieces of the puzzle that other people had completed before solving it. Of course as with any solution like this, the complexity vanishes at runtime – with forms importing and publishing at a rate of one or two per second into SharePoint – certainly faster than importing and publishing them by hand.


Deploying Nintex Workflows via PowerShell

One of the more common tasks when working on a large project is to deploy Nintex Workflows via a PowerShell script. It’s not too difficult, because the SharePoint web front ends will have a copy of NWAdmin on them – installed in the 15 hive by Nintex during installation. If you’re not aware of it, NWAdmin is a command line tool that can – drum-roll – deploy workflows (among many other things).

The snippet of code below shows the general pattern I use to deploy many workflows in one go – essentially listing them all out in an arraylist, and then looping through it, calling NWAdmin via Powershell. If nothing else, this is a great example of the horrible syntax PowerShell forces upon you to call command line applications with parameters.

$web = Get-SPWeb "https://server/sites/site_collection/subsite"

$cmd = "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\BIN\NWAdmin.exe"

$workflows = New-Object System.Collections.ArrayList

# Fill an array with the workflow names, list names and filenames to process
$workflows.Add(("Process A","List A","process_a.nwf")) > $null
$workflows.Add(("Process B","List B","process_b.nwf")) > $null
$workflows.Add(("Process C","List C","process_c.nwf")) > $null

if (Test-Path($cmd)) {
	Write-Host " - NWAdmin Found" -foregroundcolor green
	$list_workflows_path = Resolve-Path $(".\Workflows\")
	if (Test-Path($list_workflows_path)) {
		foreach ($workflow in $workflows) {

			$workflow_name     = $workflow[0]
			$list_name         = $workflow[1]
			$workflow_filename = $workflow[2]

			$nwf_path = "$list_workflows_path$workflow_filename"

			if (Test-Path($nwf_path)) {
				if ($web.Lists[$list_name]){

					write-host $("Deploying '" + $workflow_name + "' to list '" + $list_name + "'") -foregroundcolor white
					$prm = "-o","DeployWorkflow","-workflowName",$("`"" + $workflow_name + "`""),"-nwfFile",$("`"" + $nwf_path + "`""),"-siteUrl",$("`"" + $web.Url + "`""),"-targetList",$("`"" + $list_name + "`""),"-overwrite"
					& $cmd $prm

				} else {
					Write-Host $("SharePoint List not found [" + $list_name + "]") -foregroundcolor red
				}
			} else {
				write-host $("Workflow File Not Found [" + $nwf_path + "]") -foregroundcolor red
			}
		}
	} else {
		write-host $("Workflows Directory Not Found [" + $list_workflows_path + "]") -foregroundcolor red
	}
	write-host "Complete!" -foregroundcolor green
} else {
	Write-Host " - NWAdmin Not Found" -foregroundcolor red
}

# release resources
$web.Close()
$web.Dispose()

Repairing Nintex Forms in Exported Nintex Workflows

A little while ago I was working on a SharePoint development project with Nintex Workflows and Nintex Forms. The project was being developed remotely – in a virtual machine – and then deployment scripts were given to the client to install on their development, test, and production farms as testing progressed. Along the way we found a pretty serious bug in Nintex Workflow and Forms, but thankfully found a workaround.

The Problem

When you design the forms for tasks within a Nintex Workflow (using Nintex Forms), the form designs are embedded in the workflow when you export it. If you export a form directly from the form designer, you end up with an XML file describing the form. If you export a workflow containing a task form design from the workflow designer, you end up with an XML file describing the workflow, with the XML describing the task form escaped within it. The problem comes when you export from one farm and import into another – the task forms may fail in the destination system. After a bit of digging, I figured out that the XML describing the forms contains hard-coded server relative paths to the origin system in Lookup fields, which are not dealt with during the import – or at least, that's how I have seen this problem occur. There may be other field types that also have the source system URL baked into them.

It’s worth noting that I have contacted Nintex Support about this issue – I will update when they get back to me.

The Solution

The solution is pretty straightforward really – you can run some PowerShell to read the exported Workflows, replace the relative paths, and write them back. In the example below, we presume that all of the exported workflows exist within a folder called “Workflows”, and the modified versions will be stored in a subfolder called “Modified”.

# Connect to SharePoint
$web = Get-SPWeb "https://server/sites/site_collection/subsite"

$uri = [System.Uri]$web.Url

# server relative path of site the workflows originally existed at
$source_localpath = "/sites/site_collection/subsite"

# server relative path of the site where the workflows are going to be imported
$destination_localpath = $uri.LocalPath

# escape the paths (because we will find both unescaped and escaped versions of the paths)
$source_localpath_escaped = $source_localpath -replace "/","\/"
$destination_localpath_escaped = $destination_localpath -replace "/","\/"

# Loop through files in Workflows subdirectory
foreach ( $source_file in $(Get-ChildItem './Workflows' -File | Sort-Object -Property Name) ) {
    
    # read the workflow file
    $file_content = Get-Content "./Workflows/$source_file"
    
    # replace the paths
    write-host " - Replace Paths"
    $file_content = $file_content -replace $source_localpath,$destination_localpath
    $file_content = $file_content -replace $source_localpath_escaped,$destination_localpath_escaped
            
    # Write the file into the modified subdirectory
    $file_content | out-file -encoding utf8 "./Workflows/Modified/$source_file"

}

# release resources
$web.Close()
$web.Dispose()

The script results in a modified set of exported workflows, which will import correctly into the destination farm. It’s worth noting that if you don’t do this, you end up in a world of trouble, because you can’t repair the forms the workflow import breaks – and you don’t know they are broken until you try to use them. I discovered the cause after digging through the SharePoint ULS logs.

I'm amazed that this particular bug got past quality control at Nintex, but then it's a fantastically complex piece of kit, and so is the SharePoint platform it sits on top of. It's also worth noting that this only affects the development of systems using multiple farms for development, testing, and production – if you do agile development in production, you would never see this problem.


Provisioning Large SharePoint Projects with Powershell

It makes sense when building a sizeable project in SharePoint on-premises to script the provisioning of all assets – the lists, content types, columns, views, and so on. This inevitably ends up with a colossal PowerShell script, so it makes sense to look for ways to break it up and make it easier to work on (or for multiple people to work on individual parts of the whole). The following is a method I have come up with over the course of several projects – I figured it might be useful for others.

Approach

The basic approach is to run one PowerShell file that loads all the others, and executes them. Therefore we have one core file (e.g. “deploy.ps1”), and a folder (“lib”) full of scripts it runs. It occurred to me while building the core script that we could copy the way Unix and Linux often do things, and number the files in the folder – then if the core file sorts them, we can control the execution of the children purely by filename order (e.g. “lib/1000_provision_list.ps1”, “lib/2000_provision_content_type.ps1”, and so on).

The Core Deployment Script

So – the following snippet is an example of what the core deployment script looks like. It expects a couple of parameters – the URL of the SharePoint site to deploy everything into, and a filter to optionally run a subset (or even a single script), based on its filename.

param( [string]$url = "https://server/sites/site_collection/subsite", [string]$filter = "" )

# record results to a text file in the same path as the deploy script
start-transcript ./deploy_log.txt -Append

# Add SharePoint Snap-In
if ((Get-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue) -eq $null) 
{
    Add-PSSnapin "Microsoft.SharePoint.PowerShell"
}

if ($url -ne ""){

    Write-Host "Connecting to [$url]" -foregroundcolor green
    $web = Get-SPWeb $url

    # Loop through all files in the lib folder and execute them in order of their filename
    # and if their name matches the filter parameter passed to the script
    foreach ( $source_file in $(Get-ChildItem './lib' | Sort-Object -Property Name) ) {
        
        if ($source_file.Name -like "*$filter*") {
        
            Write-Host $("***** Start  [" + $source_file.Name + "] *****") -foregroundcolor green
            
            & $source_file.FullName
            
            Write-Host $("***** Finish [" + $source_file.Name + "] *****") -foregroundcolor green

        }
    }

    # Release Assets
    $web.Close() > $null
    $web.Dispose() > $null

} else {
    Write-Host "No URL supplied" -foregroundcolor red
}

Write-Host ""
Write-Host "Finished!" -foregroundcolor green

Stop-Transcript

It's worth noting that the above example uses the “Start-Transcript” and “Stop-Transcript” PowerShell cmdlets, which very neatly record anything we write to standard output into a text file. I only discovered this a few months ago – it's very useful.

Writing the Worker Scripts

Apart from knowing we need to put the scripts in the “lib” folder, it's useful to see what one of the child scripts might look like. The snippet below shows one – which might be saved as “lib/1000_provision_list.ps1”:

function provision_list($web){
    $list_internal_name = "mylist"
    $list_display_name = "My List"
    Write-Host $("Provisioning List [" + $list_display_name + "]")
    if ($web.Lists[$list_display_name]) {
        Write-Host " - $list_display_name Already Exists"
        $list = $web.Lists[$list_display_name]
    } else {
        Write-Host " - $list_display_name ($list_internal_name)"
        $web.Lists.Add($list_internal_name, "", "GenericList") > $null
        $list = $web.Lists[$list_internal_name]
        $list.Title = $list_display_name
        $list.ContentTypesEnabled = $true
        $list.EnableVersioning = $true
        $list.NavigateForFormsPages = $false
        $list.Update() > $null
    }
}

provision_list($web)

The take-away from the library files is that they share any objects already instantiated by the core script – so we can presume $web already contains a valid SharePoint object. Each file therefore contains just a function definition, and a call to run it (using functions helps keep working variables scoped).

So there you have it – a working method of building large, flexible, and extendable provisioning scripts. Do comment if you’ve done something similar, or if you found this useful.


Bulk Uploading Files into Libraries in SharePoint with PowerShell

The following snippet shows a method of uploading a folder of files from the filesystem into a library in SharePoint. In this case the filesystem folder – called “SiteAssets” – is presumed to exist in the same location as the PowerShell script, and its contents are uploaded to the “SiteAssets” library in the SharePoint subsite.

# Connect to SharePoint Site
$web = get-spweb "https://server/sites/site_collection/subsite"

# loop through all files in the local siteassets subfolder
foreach ( $source_file in $(Get-ChildItem './SiteAssets' | Sort-Object -Property Name) ) {
    if ($source_file.Attributes -match 'Directory') {
        # it's a folder - ignore it
    } else {
        
        $library = $web.GetFolder("SiteAssets")
        $library_files = $library.Files

        # open the file, upload it, and close the stream afterwards
        $stream = $source_file.OpenRead()
        $library_files.Add("SiteAssets/" + $source_file.Name, $stream, $true) > $null
        $stream.Close()
    }
}

# Release resources
$web.Close()
$web.Dispose()

Provisioning Groups and Assigning Permissions within SharePoint Subsites with PowerShell

The following snippet shows how you might provision a new group within a subsite of SharePoint using PowerShell. Note that you not only need to create the group – you also need to associate it with the subsite, and assign permissions to the relationship between the group and the subsite. It’s not obvious at all.

# Connect to SharePoint Site
$web = get-spweb "https://server/sites/site_collection/subsite"

$group_name = "My Group"

# Remove group if it already exists
if ($web.SiteGroups[$group_name] -ne $null)
{
    $web.SiteGroups.Remove($group_name) > $null
    $web.Update() > $null
}

# Create the group
$web.SiteGroups.Add($group_name, $web.Site.Owner, $web.Site.Owner, $group_name) > $null
$web.Update() > $null

# Add Group to Associated Groups Collection
if ($web.AssociatedGroups[$group_name] -eq $null)
{
    $web.AssociatedGroups.Add($web.SiteGroups[$group_name])
    $web.Update()
}

# Assign Permissions to the Group
$group = $web.SiteGroups[$group_name]
$group_role_assignment = new-object Microsoft.SharePoint.SPRoleAssignment($group)
$full_control_role_definition = $web.RoleDefinitions["Full Control"]
$group_role_assignment.RoleDefinitionBindings.Add($full_control_role_definition) > $null
$web.RoleAssignments.Add($group_role_assignment) > $null
$web.Update()

# Release resources
$web.Close()
$web.Dispose()