Scripting

Batch File to Run PowerShell Scripts and Log Console Output to Log Files

The easiest way to schedule a PowerShell script is to run it from a DOS batch file, and schedule execution of the batch file on a Windows server via the Task Scheduler. The following batch file runs a PowerShell script, and also captures the console output from the script (written via Write-Host) into a text file, stored in a sensibly formatted sub-folder and filename structure.

@ECHO OFF

REM create a logfile name (log_yyyy-mm-dd-hh-mm-ss.log)
REM note: the %DATE% substrings assume a dd/mm/yyyy locale date format
SET logfilename=log_%DATE:~6,4%-%DATE:~3,2%-%DATE:~0,2%-%TIME:~0,2%-%TIME:~3,2%-%TIME:~6,2%.log

REM create a folder name (log_yyyy-mm-dd)
SET foldername=log_%DATE:~6,4%-%DATE:~3,2%-%DATE:~0,2%

REM create a log subdirectory
mkdir c:\logs\%foldername%

REM run the powershell script and redirect console output to the logfile
powershell.exe -ExecutionPolicy Bypass -File "c:\scripts\MyScript.ps1" > "c:\logs\%foldername%\%logfilename%"
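
Once the batch file is saved, it can be registered with the Task Scheduler from an elevated command prompt – a sketch, assuming the batch file lives at c:\scripts\run_myscript.bat (the task name and schedule are illustrative):

```batch
REM create a scheduled task that runs the batch file daily at 2am
schtasks /create /tn "Run MyScript" /tr "c:\scripts\run_myscript.bat" /sc daily /st 02:00
```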

The take-away from this script is almost certainly the arcane DOS batch file syntax required to extract bits and pieces of the date and time.
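
Because the %DATE% substrings are locale-dependent, an alternative worth considering (a sketch – the variable name is illustrative) is to ask PowerShell itself for a pre-formatted timestamp and avoid the string slicing altogether:

```batch
REM ask PowerShell for a locale-independent timestamp
FOR /F %%i IN ('powershell.exe -NoProfile -Command "Get-Date -Format yyyy-MM-dd-HH-mm-ss"') DO SET timestamp=%%i
SET logfilename=log_%timestamp%.log
```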

Posted by Jonathan Beckett in Notes, 0 comments

Updating SharePoint List Items with JavaScript

This is a simple example of updating a SharePoint list item via the JavaScript CSOM API (I never quite know what to call it – Microsoft vary in their own naming of things).

The basic idea is that this code would be called from a page within SharePoint, so it is able to pick up the client context and run. It will obviously fail if the current user does not have permission to update the item in question.

function update_list_item(list_name, id, internal_field_name, value)
{
    // get connection to SharePoint
    var context = SP.ClientContext.get_current();

    // get the current sharepoint site
    var web = context.get_web();

    // get a list from the site
    var list = web.get_lists().getByTitle(list_name);

    // get an item from the list
    var list_item = list.getItemById(id);

    // populate a property of the list item
    list_item.set_item(internal_field_name, value);

    // force sharepoint to save the change
    list_item.update();

    // tell SharePoint to do everything we just talked about
    context.executeQueryAsync(update_list_item_success, update_list_item_failure);
	
    return false;
}

function update_list_item_success(sender, args)
{
    alert("UpdateListItem Succeeded");
}

function update_list_item_failure(sender, args)
{
    alert("UpdateListItem Failed. \n" + args.get_message() + "\n" + args.get_stackTrace());
}

It’s probably worth making a few comments about updating different field types. The two standout ones that always cause trouble are URLs and Dates.

When updating a URL field, you can set the URL and label at the same time by separating them with a comma and a space – e.g. “https://google.com, Google”. Alternatively, you can construct an SP.FieldUrlValue object, populate it via set_url() and set_description(), and pass that to set_item().

When updating a date field, SharePoint will expect the date in a specific format if you’re sending it as text. The easiest way around this, without reading up on the ISO 8601 date format that SharePoint expects, is just to pass it a JavaScript Date object instead of text, and it will be quite happy.
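
To make the expected format concrete, here is a minimal sketch in plain JavaScript – toISOString() produces the ISO 8601 string that SharePoint accepts; the field name in the usage comment is hypothetical:

```javascript
// Sketch: format a JavaScript Date as the ISO 8601 string SharePoint accepts
function toSharePointDateString(d) {
    return d.toISOString();
}

// hypothetical usage within the earlier example:
// list_item.set_item("DueDate", toSharePointDateString(new Date()));
```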


Provisioning Large SharePoint Projects with PowerShell

It makes sense when building a sizeable project in SharePoint on-premises to script the provisioning of all assets – the lists, content types, columns, views, and so on. This inevitably ends up as a colossal PowerShell script, so it makes sense to look for ways to break it up and make it easier to work on (or for multiple people to work on individual parts of the whole). The following is a method I have come up with over the course of several projects – I figured it might be useful for others.

Approach

The basic approach is to run one PowerShell file that loads all the others and executes them. We therefore have one core file (e.g. “deploy.ps1”) and a folder (“lib”) full of scripts it runs. It occurred to me while building the core script that we could copy the way Unix and Linux often do things and number the files in the folder – then, if the core file sorts them, we can control the execution order of the children purely by filename (e.g. “lib/1000_provision_list.ps1”, “lib/2000_provision_content_type.ps1”, and so on).

The Core Deployment Script

So – the following snippet is an example of what the core deployment script looks like. It expects a couple of parameters: the URL of the SharePoint site to deploy everything into, and a filter to optionally run a subset of the scripts, or even a single script, based on its filename.

param( [string]$url = "https://server/sites/site_collection/subsite", [string]$filter = "" )

# record results to a text file in the same path as the deploy script
Start-Transcript ./deploy_log.txt -Append

# Add SharePoint Snap-In
if ((Get-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue) -eq $null) 
{
    Add-PSSnapin "Microsoft.SharePoint.PowerShell"
}

if ($url -ne ""){

    Write-Host "Connecting to [$url]" -foregroundcolor green
    $web = Get-SPWeb $url

    # Loop through all files in the lib folder and execute them in order of their filename
    # and if their name matches the filter parameter passed to the script
    foreach ( $source_file in $(Get-ChildItem './lib' -Filter '*.ps1' | Sort-Object -Property Name) ) {
        
        if ($source_file.Name -like "*$filter*") {
        
            Write-Host $("***** Start  [" + $source_file.Name + "] *****") -foregroundcolor green
            
            & $source_file.FullName
            
            Write-Host $("***** Finish [" + $source_file.Name + "] *****") -foregroundcolor green

        }
    }

    # Release Assets
    $web.Close() > $null
    $web.Dispose() > $null

} else {
    Write-Host "No URL supplied" -foregroundcolor red
}

Write-Host ""
Write-Host "Finished!" -foregroundcolor green

Stop-Transcript

It’s worth noting that the above example uses the “Start-Transcript” and “Stop-Transcript” PowerShell cmdlets, which very neatly record anything written to standard output into a text file. I only discovered this a few months ago – it’s very useful.

Writing the Worker Scripts

Apart from knowing we need to put the scripts in the “lib” folder, it’s useful to see what one of the child scripts might look like. The example below might be saved as “lib/1000_provision_list.ps1”:

function provision_list($web){
    $list_internal_name = "mylist"
    $list_display_name = "My List"
    Write-Host $("Provisioning List [" + $list_display_name + "]")
    if ($web.Lists[$list_display_name]) {
        Write-Host " - $list_display_name Already Exists"
        $list = $web.Lists[$list_display_name]
    } else {
        Write-Host " - $list_display_name ($list_internal_name)"
        $web.Lists.Add($list_internal_name, "", "GenericList") > $null
        $list = $web.Lists[$list_internal_name]
        $list.Title = $list_display_name
        $list.ContentTypesEnabled = $true
        $list.EnableVersioning = $true
        $list.NavigateForFormsPages = $false
        $list.Update() > $null
    }
}

provision_list($web)

The take-away from the library files is that they share any objects already instantiated by the core script – so we can presume $web already contains a valid SharePoint web object. Each file therefore contains just a function definition and a call to run it (using functions helps keep working variables scoped).
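
To illustrate the pattern with a second worker, the sketch below (a hypothetical file, say “lib/1500_provision_field.ps1” – the field and list names are illustrative) adds a text column to the list created earlier, following the same function-plus-call shape:

```powershell
function provision_field($web){
    $field_internal_name = "MyTextField"
    Write-Host $("Provisioning Field [" + $field_internal_name + "]")
    $list = $web.Lists["My List"]
    if ($list.Fields.ContainsField($field_internal_name)) {
        Write-Host " - $field_internal_name Already Exists"
    } else {
        $list.Fields.Add($field_internal_name, [Microsoft.SharePoint.SPFieldType]::Text, $false) > $null
        $list.Update() > $null
    }
}

provision_field($web)
```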

So there you have it – a working method of building large, flexible, and extendable provisioning scripts. Do comment if you’ve done something similar, or if you found this useful.
