PowerShell Quickie: Audit your SQL Server power plan settings

At work this week, I got a new computer. I ditched my old laptop and started the long and arduous process of setting up my new PC. You know the drill: back up all the old stuff, kick off your day-long install of Visual Studio (and all the different versions you might need to support older SSIS packages), new installs of SQL Server 2012, 2014, or 2016 (or all three!), and so on. Really, it’s a lost day of productivity, and yet we all seem to enjoy it because hey, new computer!

The problem is that there’s no checklist; you just sort of do it, and then you remember you forgot something and stop what you’re doing to fix it. One particularly vexing thing that gets set on every new computer (whether it’s a laptop, desktop, or server) is the power plan setting in the control panel. By default, you get the balanced plan. And believe me, you notice when you’re not running high performance. Which leads me to an interesting question: how many of your SQL Servers, right now, are set to the high performance power plan? All of them, right? You sure? Because I thought mine were all set, and lo and behold, the first server I logged into over Remote Desktop wasn’t.

Time for an audit.

We don’t want to have to log into each and every server to get this info, so what do we do? We write PowerShell! To get the power plan setting on a PC, the command is pretty simple:

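(The post’s original code block didn’t survive the page conversion; this is a minimal reconstruction based on the description, using the standard Win32_PowerPlan WMI class.)

```powershell
# Query the currently active power plan on the local machine via WMI.
# -ErrorAction SilentlyContinue suppresses errors for machines that are
# unreachable or that your account can't access.
Get-WmiObject -Class Win32_PowerPlan -Namespace "root\cimv2\power" `
    -Filter "IsActive = 'true'" -ErrorAction SilentlyContinue
```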
That should return your bog-standard WMI object (yuck) when you run it: a bunch of class information with the specific class properties added to it. You get the currently active plan for your local PC. I added the -ErrorAction part in case my user account doesn’t have access to a computer, or the computer doesn’t exist. But that’s just one PC, and we want to query a bunch of computers at once. So we’re going to take that script and add just a few lines:

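(Again, the original script was lost in conversion; here is a sketch that matches the description below. Saving it to a file such as Get-PowerPlan.ps1 is my assumption, not something the post specifies.)

```powershell
param(
    # Defaults to the local machine if no name is passed in.
    [string]$ComputerName = $env:COMPUTERNAME
)

# Grab the active power plan from the target computer.
$planSetting = Get-WmiObject -ComputerName $ComputerName `
    -Class Win32_PowerPlan -Namespace "root\cimv2\power" `
    -Filter "IsActive = 'true'" -ErrorAction SilentlyContinue

# Return just the computer name and the plan name, aliasing
# PSComputerName to ComputerName for readability.
$planSetting | Select-Object @{Name = "ComputerName"; Expression = { $_.PSComputerName }}, ElementName
```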
And now we have a much better solution. Why? There are three main reasons:

  1. We add parameter support to the script. That means we can pass in a computer name parameter. This lets us have a lot of flexibility in how we call the script, instead of having to modify it each time we run it.
  2. The Get-WMIObject cmdlet supports the -ComputerName parameter, which, you guessed it, can take our $ComputerName parameter in and apply it. I added that to the Get-WMIObject cmdlet above.
  3. Once the plan setting is returned, we just pipe it to a Select-Object cmdlet to get the details of the plan into a more readable list.

The trickiest syntax in the script is the part that returns the $planSetting object. In the last statement you’ll see I’m returning the computer name and element name. Why the goofy syntax for the computer name? That’s called a calculated property (an expression), and really the easiest way to explain it is: think of it as an “AS” clause in SQL. We’re basically writing an expression and giving a custom name to an existing object property. If you leave it off, you just have Select-Object PSComputerName, ElementName. I wanted some clarity around the name.
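To make the “AS” analogy concrete, here is the difference in isolation (assuming $planSetting already holds the WMI result described above):

```powershell
# Plain property selection: the output column is named PSComputerName.
$planSetting | Select-Object PSComputerName, ElementName

# Calculated property: same value, renamed to ComputerName,
# much like "SELECT PSComputerName AS ComputerName" in T-SQL.
$planSetting | Select-Object @{Name = "ComputerName"; Expression = { $_.PSComputerName }}, ElementName
```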

But the biggest take-away from this script is that it returns an object, so we can call it over and over again for multiple servers and build a list of server names and current power plan settings very easily. You can build your server list from a text file or from a central management server and get one object back. Take a look at this example. (Note: for this to work, we need to make sure we have a SQL Server PowerShell module loaded. I’m using the “still new-car-smell fresh” sqlserver module, but you can also load it with Import-Module SQLPS; it’ll still work.)

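(The one-liner itself was also lost in conversion. This reconstruction follows the breakdown below; the CMS name PRO-CM-SQL comes from the post, while the script file name Get-PowerPlan.ps1 and the output path are my assumptions.)

```powershell
Import-Module SqlServer  # or: Import-Module SQLPS

# Walk the central management server's registration tree, drop the
# directory entries, run the power plan script against each server
# name, and write everything to a CSV.
Get-ChildItem "SQLSERVER:\SQLRegistration\Central Management Server Group\PRO-CM-SQL" -Recurse |
    Where-Object { $_.Mode -notmatch "d" } |
    Select-Object -ExpandProperty Name |
    ForEach-Object { .\Get-PowerPlan.ps1 -ComputerName $_ } |
    Export-Csv -Path .\PowerPlanAudit.csv -NoTypeInformation
```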
It’s a classic one-liner, but if you’re not used to reading them, I’ll break it down for you. First, we use Get-ChildItem to return a list of registered servers from our central management server (named PRO-CM-SQL in my example). This returns a series of objects listing all the registered names on the central management server in each directory, so we need to filter out the directory names with a Where-Object (objects that don’t have a “Mode” value of “d”, for directory). Once we have our list, we just select the names (Select-Object). Then we pipe the list of names to a ForEach-Object and execute the script for each one. Finally, we tack on an Export-CSV cmdlet to output the results to an easy-to-read file we can open in Excel (or Notepad, or whatever).

Our script also doesn’t control output; we leave that up to the user. They can put it on the screen or pipe it to a file. And that’s an important style point: never put your users on rails. You may like output one way, but someone else may not. Just think of the next person. Because some day you might be that next person.

Remember: just because you are (most likely) a SQL Server professional of some kind doesn’t mean you can’t get down and play in the mud of systems administration. If anything, it either keeps your server admins on their toes or you can help them if they’re overwhelmed. This leads to a more diverse you, and a more diverse you means a more in-demand you.

4 thoughts on “PowerShell Quickie: Audit your SQL Server power plan settings”


  2. Konstantin

    Hello, Drew, thanks for the nice blog.

    Could you write an article about executing T-SQL commands asynchronously with PowerShell? I’ve Googled it but found no working example.

    1. Drew Furgiuele

      Hey Konstantin, thanks for reading and for reaching out. Regarding your request, can you elaborate a little on what you’re trying to accomplish? You mean you want to run queries without waiting for them to complete?

      1. Konstantin

        Hi Drew, thanks for the quick response.

        I’m trying to run my stored procedure (https://github.com/ktaranov/sqlserver-kit/blob/master/Stored_Procedure/usp_bcpTableUnload.sql – it exports a table from SQL Server using bcp, with column names) using jobs in PowerShell (trying to reproduce the behavior in http://stackoverflow.com/questions/12766174/how-to-execute-a-powershell-function-several-times-in-parallel), but all the jobs start running and then stop responding – I just see many powershell.exe processes.

        Main question: how do I run a custom PowerShell function in parallel? For example, totaling the file sizes of all the .csv files in a directory (a simple example that doesn’t use SQL Server).
