In case you didn’t notice, I’m already in love with SQL Operations Studio (now Azure Data Studio). After downloading the preview and poking around, I’m pretty excited for what this application can (and eventually will) do. I view the preview as a furnace, where a new developer and management interface for SQL Server will eventually be forged.
Much like I tell people writing PowerShell to switch from the PowerShell ISE to VS Code, I’ll probably eventually push people towards this application. It’s a smaller install, it’s pretty snappy, and the interface is way cleaner and easier to manage (and it officially supports dark themes). There’s built-in source control and support for extensions. So much to explore and learn! Not yet, though; there’s still a lot to work out, like the fact that it can’t give you actual execution plans (just estimated ones), among other things. I’m sure that with enough community engagement, it’ll continue to evolve. It’s too early to write it off, but at the same time, you can’t go whole-hog using it either.
One barrier to entry is that the initial setup can be a little daunting, especially if you use local connection groups or central management servers to keep track of registered connections in SQL Server Management Studio. If you have a lot of saved connections, you’d be in for a lot of manual clicking and typing. But there’s a better way: you can import all that saved information right into Azure Data Studio, and it’s pretty painless, too. Buckle up, because this involves a little knowledge of how settings are saved in Azure Data Studio, and how we can quickly get saved connection information out of SSMS and into your new application. Spoiler alert: we’re going to use PowerShell.
Like I mentioned in my last post, changing settings in Azure Data Studio requires you to modify a text file. No fancy GUIs here. To pull up your settings, either click “File -> Preferences -> Settings” or use Control-Comma (,) on your keyboard (note: keyboard shortcuts are king in both VS Code and SQL Ops Studio, so I’d recommend practicing them).
When you do, after a fresh install, you should see something that looks like this:
The panel on the left is the default settings for the application. Go ahead and try to edit them. What’s that, you can’t? That’s on porpoise. Here, just like in VS Code, when you want to change or add a setting, you don’t do it in the defaults; you set it in your personal settings file, which is the panel on the right. Essentially, if you want to override a setting, you find it in the left panel and copy/paste it into your settings, which trumps the defaults. Anything additional just gets added, since it won’t technically override anything. Your settings file is created in your local roaming application data folder (C:\Users\<your user name>\AppData\Roaming\sqlops\User) when you first run the application.
The settings are saved in JSON format, which is nice and easy to work with as key:value pairs. If you were to add connections through the UI, they’d get added to your settings file; here’s a sample of what that would look like:
{ "datasource.connectionGroups": [ { "name": "ROOT", "id": "C777F06B-202E-4480-B475-FA416154D458" } ], "datasource.connections": [ { "options": { "server": "someservername.boatmurder.net", "database": "master", "authenticationType": "SqlLogin", "user": "drew", "password": "", "connectTimeout": 30, "applicationName": "sqlops", "groupId": "C777F06B-202E-4480-B475-FA416154D458", "databaseDisplayName": "master" }, "groupId": "C777F06B-202E-4480-B475-FA416154D458", "providerName": "MSSQL", "savePassword": true, "id": "7d3ace26-f270-40c0-a6f3-6019894366dc" } ] }
Inside your settings file, two new sections get created. First, the “datasource.connectionGroups” key, which contains information about the groups (or folders) that your connections get stored in. By default, connections get added to the “ROOT” section of the list (i.e., no group). If this group doesn’t exist when you create a server, Azure Data Studio creates it for you and assigns it a unique GUID.
Next, the “datasource.connections” key saves all your server connection info. The nested “options” key contains all the details about the server connection, such as which server it connects to, what kind of authentication it uses, the username, and so on. The groupId field is what places it in the group hierarchy (here, we see it sits just off the ROOT group).
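To put that in PowerShell terms, here’s a rough sketch of how a new group entry could be built before it ever gets serialized back to JSON. This isn’t the function’s exact code, and the group name is just a made-up example:

# A rough sketch, not the function's actual code: build a group entry shaped
# like the "datasource.connectionGroups" sample above
$groupId  = [guid]::NewGuid().ToString()
$newGroup = [PSCustomObject]@{
    name = 'Production Servers'   # hypothetical group name
    id   = $groupId
}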
So, now that we know this, importing the server info can be pretty straightforward: we just need to capture the saved server info on our local machine, create the necessary JSON, and paste it into our settings file. Even if we could aggregate it all, though, that’s still a lot of manual work. I want to automate the entire process. The question then becomes: how? PowerShell, that’s how!
On paper, PowerShell seems like the ideal tool to get our info out of SSMS and into Azure Data Studio:
Is it as easy as it sounds, though?
I wrote a function called Import-RegisteredServers that you can snag from my GitHub repository. The code is all listed out there. I won’t paste the entire thing here, since it’s a living repository that is going to change as 1) my code gets better and 2) the application changes. Go ahead and download it and take a look.
All done? Cool, let’s move on.
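Once you have it, calling it would look something like this. Fair warning: the file name and parameter here are my guesses based on the rest of this post, so check the repository for the real signature:

# Hypothetical invocation; the script name and -PathToSettingsFile parameter
# are assumptions, so double-check the repository before running
. .\Import-RegisteredServers.ps1
Import-RegisteredServers -PathToSettingsFile "$env:APPDATA\sqlops\User\settings.json"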
In addition to the points listed above, I wanted to include some quality-of-life stuff in the script too, like checking for duplicate server registrations within a group and making sure the user has the required settings areas defined. Let’s take a look at some of the code that makes this work.
$UserSettings = (Get-Content -Path $PathToSettingsFile -Raw | ConvertFrom-Json)
The line above is what gets things started for us. We’re using Get-Content to read the contents of the settings file (which is stored in the $PathToSettingsFile variable defined a bit earlier in the code). Then, we take that content and pipe it to ConvertFrom-Json, which does exactly as advertised: it takes your input and attempts to parse the values as JSON data. Once it does, the content that was stored as JSON becomes an object:
Which means we can now compare, add, or remove things as we see fit. This functionality makes it easy to add things to your existing settings once you have them loaded.
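To give you an idea of what that looks like in practice, here’s a quick sketch (not the function’s actual code) that walks the parsed object, bolts on a missing section, and appends the $newGroup example from earlier:

# Dot notation walks the parsed settings just like any other object
$UserSettings.'datasource.connections'.options.server

# On a fresh install these sections may not exist yet, so add them as empty arrays
if ($UserSettings.PSObject.Properties.Name -notcontains 'datasource.connectionGroups') {
    $UserSettings | Add-Member -MemberType NoteProperty -Name 'datasource.connectionGroups' -Value @()
}

# Adding the group we sketched out earlier is then just an array append
$UserSettings.'datasource.connectionGroups' += $newGroup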
Now we need to get our registered servers from SSMS. We’re going to do that with the help of the SQL Server PowerShell module. Here’s a quick snippet to demonstrate how it works:
Import-Module SqlServer
Get-ChildItem -Path SQLSERVER:\SQLRegistration -Recurse
This will look at every folder you have saved as part of your registered servers in SSMS. The object you get back, though, has some quirks:
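If you want to see the quirks for yourself, grouping the results by type shows that the recursion hands back two different kinds of objects: groups and the servers inside them. (The type names here are my assumption and may vary by module version.)

Import-Module SqlServer

# The recursion returns both group objects and server objects; grouping by
# type name shows the split (type names may differ between module versions)
Get-ChildItem -Path SQLSERVER:\SQLRegistration -Recurse |
    Group-Object { $_.GetType().Name } |
    Select-Object Name, Count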
The trick, then, is to make sure you work with both types of objects. My function is basically broken up into two sections: the first section iterates over groups, determines if they are a child of another group, and writes a new connection group key:value pair to the settings object we read in ahead of time. The second part does the same thing, except for the servers.
Each section also checks to see whether the settings file has areas defined for these settings, and adds them in if they aren’t there.
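Stitched together, the server pass looks roughly like the sketch below. To be clear, this is not the function’s actual code: it carries over the $UserSettings and $groupId examples from earlier, filters on type names I’m assuming, and punts on things like SQL logins and nested groups.

Import-Module SqlServer

# Rough sketch of the second pass: walk the registered servers, skip anything
# already present, and append new connection entries to the settings object
$registered = Get-ChildItem -Path SQLSERVER:\SQLRegistration -Recurse |
    Where-Object { $_.GetType().Name -eq 'RegisteredServer' }

foreach ($server in $registered) {
    $duplicate = $UserSettings.'datasource.connections' |
        Where-Object { $_.options.server -eq $server.ServerName }

    if (-not $duplicate) {
        $UserSettings.'datasource.connections' += [PSCustomObject]@{
            options = [PSCustomObject]@{
                server              = $server.ServerName
                database            = 'master'
                authenticationType  = 'Integrated'   # assuming Windows auth; SQL logins need more handling
                connectTimeout      = 30
                applicationName     = 'sqlops'
                groupId             = $groupId
                databaseDisplayName = 'master'
            }
            groupId      = $groupId
            providerName = 'MSSQL'
            savePassword = $false
            id           = [guid]::NewGuid().ToString()
        }
    }
}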
Once the parsing is done, we now need to write the file. The final part of the file does this pretty elegantly, if I do say so myself:
$UserSettings | ConvertTo-Json -Depth 99 | Out-File -FilePath $PathToSettingsFile -Encoding "UTF8"
Here, I’m taking the $UserSettings object, which contains all the info we’ve added during our parsing process, and piping it to the ConvertTo-Json cmdlet. Note the -Depth 99 parameter here: by default, when you pipe content to ConvertTo-Json, it only goes two levels deep. Our server registrations are three levels deep, so they won’t be saved correctly unless you account for that. With a value of 99 here, I’m overcompensating, but it wouldn’t be the first (or last) time.
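If you want to see the truncation for yourself, here’s a quick, made-up example: a hashtable nested a few levels deep, much like our settings, serialized with and without an explicit depth.

# A structure nested deeper than ConvertTo-Json's default depth of 2
$demo = @{ Level1 = @{ Level2 = @{ Level3 = @{ server = 'localhost' } } } }

$demo | ConvertTo-Json             # Level3 collapses to the string "System.Collections.Hashtable"
$demo | ConvertTo-Json -Depth 99   # everything survives the round trip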
Once the object has been recombobulated into JSON, we pipe it again to Out-File, overwriting the file we imported at the start of the script. We’re specifying UTF-8 encoding, too, since Azure Data Studio is very picky about file encoding.
I’m pretty pleased with the functionality I wrote here, and I hope you find it useful. Before you download it and try it yourself though, just a few quick notes about the function:
For roughly 100 lines of code, this turned out pretty well, I think. Can it be better? Absolutely it can; there are people who can code better than me out there, and I bet you’re one of them. I encourage you to submit a PR if you come up with a better solution. I’d love to see your work! And if you liked this script and want to see more, drop me a comment. I’d love to keep contributing if you tell me where I can help.
Here’s the script in action (click for a larger, easier-to-watch version):
Please keep these SQL Ops Studio posts coming. I love the idea of it and it’s already sparking a few great ideas for my own environment.
You got it, Keith! Any particular thing about Ops Studio you’d like to see more of, or just anything?
Drew, this is really good stuff. Thank you for the time and effort you put into it.
Thanks Rob, you’re very welcome!
Really cool. Ran it and it just worked. I was a bit surprised by the lack of feedback after running it, and then seeing all of my servers in SOS. Sadly, it seems that SOS doesn’t support Azure AD MFA yet, so it won’t work for our Test/Prod servers. It imported those as “Windows Auth”, which is close, but not quite right. 🙂
I appreciate the script to help us copy our settings from SSMS. That’s always been a little painful when moving between tools or machines.
Peter, thanks for the feedback. I’m thrilled my code ran fine for someone else; that’s always a good sign. You’re right about the authentication methods; I’m sure those will get added soon, and when they do, I’ll update my code to account for them.