Tome's Land of IT

IT Notes from the Powertoe – Tome Tanasovski

How to Execute PowerShell Scripts Without the CPU Hit to Start PowerShell

Background – The Problem

I have been playing a lot with Splunk recently. If you’re not familiar with the product, it is a horizontally scalable database that leverages map-reduce to give you real-time analytics about your data. That’s probably a topic for another day, but the relevant part is that they have an agent that can run on a Windows desktop called the Universal Forwarder. They also have a PowerShell SDK that lets you send data to Splunk via PowerShell. Again, the details about these topics should be saved for another day. The topic for today is that in order to send data from my system with any regularity, I run into a fairly well-known problem with PowerShell performance: starting powershell.exe takes longer than I’d like, and it incurs a noticeable CPU hit. Both of these are unacceptable if I’m going to run these scripts on every desktop in my enterprise. This is especially true when you consider that a good proportion of those desktops will be virtual desktops sharing CPU with each other.

The Solution

I’ve been thinking about this problem a lot, and I have a trimmed-down script-only version of my proposed solution. The technique is not that hard to follow.  The first step is to create a PowerShell script that will run indefinitely.  The script has the following requirements:

  • It should read through a directory for scripts. If a script exists, it should execute it in its current runspace.
  • The order in which the scripts enter the queue directory matters, i.e., the first script in should be the first script run.
  • After every script is run, it should remove the variables that were created by the script, and then call [gc]::Collect() to ensure that memory does not become unmanageable. This is the magic part of the script. For many, this post may be worth this snippet alone 🙂 You can use this technique any time your PowerShell session is using up too much RAM for your tastes.
  • It should allow an initialization script to run so that you can load any global variables that should not be deleted or modules that should stay loaded in the session.
  • It should sleep for a configurable number of seconds between runs.
  • Parameters should consist of the queue directory name, the initialization script, and the delay (in seconds) between retries in the loop.
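The variable-cleanup-plus-garbage-collection step from the list above is worth seeing in isolation. Here is a minimal sketch of the trick ($baseline and $bigData are names I made up for illustration):

```powershell
# Snapshot the names of every variable that exists before the queued code runs
$baseline = Get-ChildItem variable: | Select-Object -ExpandProperty Name
# Names created by the cleanup code itself must survive the cleanup
$baseline += 'baseline','name'

# ... a queued script runs here and leaves something large behind ...
$bigData = 1..100000

# Remove every variable that was not in the baseline snapshot
foreach ($name in (Get-ChildItem variable: | Select-Object -ExpandProperty Name)) {
    if ($baseline -notcontains $name) {
        Remove-Variable -Name $name -ErrorAction SilentlyContinue
    }
}

# Force .NET to reclaim the memory now instead of at some later point
[gc]::Collect()
```

Once the last variable referencing a large object is removed, the collection actually frees the memory; without the Remove-Variable pass, [gc]::Collect() alone would have nothing to reclaim.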

The Script

The end result is a script called longrunning.ps1 (for lack of any thought put into the name) that looks like this:

    param(
        [ValidateScript({Test-Path $_ -PathType Container})]
        [string] $QueueDirectory,
        [ValidateScript({Test-Path $_ -PathType Leaf})]
        [string] $InitializationScript,
        [int] $SleepSeconds = 15
    )

    if ($InitializationScript) {
        Write-Verbose "Dot sourcing $InitializationScript"
        . $InitializationScript
    }

    Write-Verbose "Capturing the list of variables in the session so they are not removed between executions"
    $vars = dir variable: |select -ExpandProperty name
    # There are a few variables that get set in this script, and a few others that will be seen when called as a script
    $vars += ('args','input','MyInvocation','PSBoundParameters','PSDebugContext','file','vars','files','foreach')

    # Enter the infinite loop
    while ($true) {
        $files = dir $QueueDirectory -File -Filter *.ps1 |sort lastwritetime
        if ($files) {
            foreach ($file in $files) {
                Write-Verbose ('Reading {0}' -f $file.fullname)
                $content = [System.IO.File]::OpenText($file.fullname).ReadToEnd()
                Write-Verbose ('Executing {0}' -f $file.fullname)
                Invoke-Expression $content

                # Remove any variables created by the script that just ran
                $newvars = dir variable: |select -ExpandProperty name
                foreach ($var in $newvars) {
                    if ($vars -notcontains $var) {
                        Write-Verbose ('Removing ${0}' -f $var)
                        Remove-Variable $var
                    }
                }

                Write-Verbose 'Garbage Collection'
                [gc]::Collect()

                Write-Verbose ('Deleting {0}' -f $file.fullname)
                del $file.fullname
            }
        }
        else {
            sleep $SleepSeconds
        }
    }


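To try the script out by hand, start it with -Verbose in one console and drop a one-liner into the queue from another. The paths below are placeholders I chose for this example:

```powershell
# Console 1: start the long-running queue processor (hypothetical invocation)
#   .\longrunning.ps1 -QueueDirectory $env:TEMP\queue -SleepSeconds 5 -Verbose

# Console 2: queue a job; the runner will execute it and then delete the file
New-Item -ItemType Directory -Path "$env:TEMP\queue" -Force | Out-Null
Set-Content -Path "$env:TEMP\queue\job.ps1" -Value 'Get-Date | Out-File "$env:TEMP\lastrun.txt"'
```

Within one sleep interval the runner should pick up job.ps1, execute it, clean up its variables, and remove it from the queue.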
Here’s an export of the schtasks xml I am using to ensure that it runs constantly.  I even have it set to restart every 24 hours, but that may not be necessary.

    <?xml version="1.0" encoding="UTF-16"?>
    <Task version="1.2" xmlns="http://schemas.microsoft.com/windows/2004/02/mit/task">
      <Principals>
        <Principal id="Author" />
      </Principals>
      <Actions Context="Author">
        <Exec>
          <Command>powershell.exe</Command>
          <Arguments>-windowstyle hidden -file D:\DropBox\scripts\longrunning\longrunning.ps1 -queuedirectory d:\dropbox\scripts\longrunning\queue -InitializationScript d:\DropBox\scripts\longrunning\init.ps1</Arguments>
        </Exec>
      </Actions>
    </Task>

You can load the above by running

schtasks /create /tn <taskname> /xml d:\pathtoabovexml.xml

Controlling What Gets Run

Finally, to control when things are run, we obviously cannot rely on PowerShell because we’ll be introducing the overhead we are trying to avoid. Instead you can use schtasks again to copy your scripts into the queue directory at the intervals you expect them to run. Mind you, this does not ensure that the script runs at the specified time. It only ensures that it is scheduled to run. Alternatively, you could copy files directly into the directory from some remote server that controls what is run, but for my purposes the schtasks solution is fine.
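For example, a second scheduled task along these lines (the task name, interval, and paths are all placeholders I made up) would drop a script into the queue every 15 minutes without spawning a single powershell.exe:

```shell
schtasks /create /tn "QueueSendData" /sc minute /mo 15 /tr "cmd /c copy /y d:\scripts\send-data.ps1 d:\dropbox\scripts\longrunning\queue\"
```

The copy itself costs almost nothing; the long-running session does the actual PowerShell work when it next polls the queue.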


I mentioned at the beginning of this post that this is a script-only interpretation of my solution. I originally wanted to create this as a C# compiled service that created a PowerShell runspace and managed it nearly exactly the way I’m doing it in the script. The truth is that so far the technique I’m using seems to be extremely reliable. I’m sure I’ll hit snags along the way, but for now the technique is sound and the problem is solved. Whether I’ll propose this as a production solution is TBD, but I’m happy to see my dream realized.


2 responses to “How to Execute PowerShell Scripts Without the CPU Hit to Start PowerShell”

  1. ramblingcookiemonster August 6, 2013 at 10:08 pm

    Great idea! We use SCOM, which allows a command channel for alert notifications. As you can imagine, starting ## powershell processes at one time during an alert storm would not be pleasant!

    I borrowed your idea and added a runspace pool to process incoming files. Still testing the stability, but it’s looking good thus far.


  2. ramblingcookiemonster August 21, 2013 at 8:05 pm

    Can’t seem to edit previous reply! Anyhow, published my implementation of your idea:

    Thanks again for the idea! The ‘daemon’ (no idea what to call it) hasn’t skipped a beat processing notifications through several alert storms from SCOM.
