Databases, Development, and Other Anecdotes


This post will go through an example of adding telemetry collection to a basic Azure Data Studio extension – where you would be able to analyze the extension’s usage patterns. I won’t even begin talking about the technical details of implementing telemetry data collection without mentioning the importance of clearly disclosing the usage data that is being collected. Even for an open source project, a privacy statement in plain language should be shared if telemetry data is collected.

One of the most streamlined ways to implement telemetry collection is to avoid reinventing the wheel and leverage an existing module – the VS Code Extension Telemetry module. The module sends telemetry data to an Application Insights instance based on an instrumentation key included in the extension.
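As a sketch of what the extension side might look like: the telemetry reporter's sendTelemetryEvent call takes string properties and numeric measurements separately, so a small helper to coerce mixed values into that shape can be handy. The helper name and event contents below are illustrative assumptions, not part of the module's API:

```typescript
// Hypothetical helper: shape a payload before handing it to the telemetry
// reporter's sendTelemetryEvent(name, properties, measurements) call.
// Properties must be strings and measurements numbers, so coerce here.
type TelemetryProperties = { [key: string]: string };
type TelemetryMeasurements = { [key: string]: number };

function buildTelemetryEvent(
  name: string,
  raw: { [key: string]: string | number | boolean }
): { name: string; properties: TelemetryProperties; measurements: TelemetryMeasurements } {
  const properties: TelemetryProperties = {};
  const measurements: TelemetryMeasurements = {};
  for (const [key, value] of Object.entries(raw)) {
    if (typeof value === "number") {
      // Numbers become measurements (aggregatable in Application Insights).
      measurements[key] = value;
    } else {
      // Strings and booleans become string properties.
      properties[key] = String(value);
    }
  }
  return { name, properties, measurements };
}
```

Whatever shape you settle on, remember to dispose of the reporter when the extension deactivates so queued events get flushed.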

I’ve been weighing some complex issues lately, and one of those issues is the use of telemetry data collection in my extensions. While usage data is commonly collected in commercial software, it isn’t generally the first priority for hobby projects. When it comes to understanding how many people have installed one of the Azure Data Studio extensions I’ve worked on or how it is being used, I’m somewhat flying blind. Through the magic of GitHub APIs, I can see how many times a specific release has been downloaded – but I’m unable to see which extensions are getting the most use or which features are most popular.

While the Azure Data Studio APIs themselves are largely self-documenting, it can be tough to catch when APIs are added in the monthly release notes. As an Azure Data Studio extension developer, it is certainly helpful to know if new capabilities have been added! The solution that I am currently leveraging to monitor changes to the APIs is surprisingly simple – an RSS feed.
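For illustration, GitHub exposes an Atom feed for any file's commit history, so watching the typings file boils down to reading entry titles out of that feed. The regex-based parsing below is an assumption made for brevity – a real implementation would use a proper feed parser:

```typescript
// Minimal sketch: pull entry titles out of a commit Atom feed, e.g. the
// GitHub feed for the azdata typings file in the azuredatastudio repo.
// Regex parsing is illustrative only; prefer a real XML/RSS parser.
function parseAtomTitles(atomXml: string): string[] {
  const titles: string[] = [];
  const pattern = /<title[^>]*>([\s\S]*?)<\/title>/g;
  let match: RegExpExecArray | null;
  while ((match = pattern.exec(atomXml)) !== null) {
    titles.push(match[1].trim());
  }
  return titles; // note: the first element is the feed's own title
}
```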

The following is reposted from the Dynamics SL User Group blog as it contains my personal thoughts on a change to the annual conference for the user group.

Change is often not easy, but the Dynamics SL User Group board of directors has been preparing for the opportunity to join the collective Community Summit for the last several years.  This process involved many hours of individual and group reflection, discussions on the current Microsoft business applications ecosystem, and extended arrangements with Dynamic Communities.

With respect to all the effort that went into the decision and our commitment to the betterment of the entire Dynamics SL user community, I wanted to take a moment and answer a few questions preemptively.

I’ve had the benefit of learning through trial by fire – that is, I became a manager early in my career without any formal management or leadership training. Being a reasonably smart individual, I figured I would be able to lead a team to success in projects in areas beyond the boundaries of my relatively small experience. Without any regard for my own significant technical gaps or inability to know everything in the technical realm, I charged ahead as a young and motivated manager.

I launched a new extension for Azure Data Studio in early September – Query Editor Boost. More to come on this extension soon, including an updated tutorial for creating extensions for Azure Data Studio.

Query Editor Boost

There were a few pieces of functionality missing from the query editor in Azure Data Studio that I felt would be a good fit for an extension:

  • new query template
  • use database keyboard shortcut
  • friendly snippet editing

Unable to install version <0.0.0> of extension ‘drewsk.newprojectforfun’ as it is not compatible with Azure Data Studio ‘1.9.0’

Curses! You’ve created an extension, tested it with the VS Code debug extension for Azure Data Studio, packaged it up and now want to install it in your Azure Data Studio instance – but you get an error message. What gives?
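In my experience, a mismatch like this usually traces back to the engines field in the extension's package.json – the declared engine version has to be satisfied by the VS Code build underlying your Azure Data Studio release. A hedged sketch (the version constraints are illustrative, and the azdata engine field is only honored by newer releases, so treat this as a starting point):

```json
{
  "engines": {
    "vscode": "*",
    "azdata": ">=1.9.0"
  }
}
```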

In Dynamics SL, one of the ways to establish a test environment with fairly low overhead is a duplicate set of your APP and SYS databases. While the argument for a completely segmented test environment on additional servers can be made and is especially valid for upgrade processes – the duplicate databases are a good testing ground for interface customizations, stored procedure modifications, and quick query changes. Because these test databases are duplicates of your production data, the same data controls and permissions need to be exercised.

The nuts and bolts of this post is about sending an HTTP POST request in an Azure Logic App that utilizes the multipart/form-data content type. I don’t run into it often, but when I do, I’m sure glad I figured out how to do more than application/json request bodies in Logic Apps.

The use case I came across this week for multipart/form-data body was for the Mailgun API.
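For reference, the Logic Apps HTTP action represents multipart bodies with the $content-type and $multipart keys. Here's a sketch of the action's inputs against the Mailgun messages endpoint – the domain, API key, and field values are all placeholders:

```json
{
  "method": "POST",
  "uri": "https://api.mailgun.net/v3/<your-domain>/messages",
  "headers": {
    "Authorization": "Basic <base64 of 'api:YOUR_API_KEY'>"
  },
  "body": {
    "$content-type": "multipart/form-data",
    "$multipart": [
      {
        "headers": { "Content-Disposition": "form-data; name=\"from\"" },
        "body": "Sender <sender@example.com>"
      },
      {
        "headers": { "Content-Disposition": "form-data; name=\"to\"" },
        "body": "recipient@example.com"
      },
      {
        "headers": { "Content-Disposition": "form-data; name=\"subject\"" },
        "body": "Hello from a Logic App"
      },
      {
        "headers": { "Content-Disposition": "form-data; name=\"text\"" },
        "body": "Sent with multipart/form-data"
      }
    ]
  }
}
```

Each element of $multipart becomes one form field, with the field name carried in its Content-Disposition header.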

There are several ways to set up an environment for learning or development and testing in SQL Server – this post will outline working with Azure SQL Database. The database will be created from a sample of Stack Overflow data and we will connect to it from Azure Data Studio. Being Azure-based, your hardware requirements are essentially non-existent. The Azure SQL environment comes at a cost of about $15/month – but with a setup this straightforward, you don’t have to keep your environment running longer than you need it.

The Stack Overflow data dump is an anonymized version of the information behind the popular site StackExchange.com. The real data included is a significant improvement over the many popular Microsoft samples (AdventureWorks, WideWorldImporters, ughh).

Read on for a walk-through of transferring the Stack Overflow bacpac from Azure blob storage to your Azure subscription and importing it to an Azure SQL database following these 4 steps.

  • Import the 9GB bacpac file from another storage account to your storage account using AzCopy
  • Select the bacpac file from your storage for an Azure SQL database import
  • Wait 2-18 hours
  • Connect
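The first step above can be sketched as a single AzCopy (v10) copy between storage accounts – both URLs would carry SAS tokens, shown here as placeholders along with made-up account and container names:

```shell
azcopy copy \
  "https://sourceaccount.blob.core.windows.net/container/StackOverflow.bacpac?<source-SAS>" \
  "https://destaccount.blob.core.windows.net/container/StackOverflow.bacpac?<dest-SAS>"
```

Server-to-server copies like this stay inside Azure, so nothing has to pass through your local machine.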

This is an anecdotal recording of a quick series of tests and is not a comprehensive experiment or thorough examination. I don’t have the money for that.

Importing a 9GB .bacpac File

Azure storage account

A test database can be quickly set up by importing a .bacpac file into an Azure SQL server. I have a 9GB bacpac stored in Azure blob storage and in the same region I have an Azure SQL server.

“Why do you do what you do?”

What Do I Do?

I solve problems. I build solutions using technology. I keep a small-medium business humming along with systems and infrastructure to support the growth that drives employee-owner shareholder value. The title that I’ve been dubbed is Director of IT, and I get to work with a great group of talented technologists and lead them as the application architect.

Pick one thing you want to learn that is not SQL Server. Write down ways and means to learn it and add it as another skill to your resume. If you are already learning it or know it – explain how you got there and how it has helped you. Your experience may help many others looking for guidance on this.


In my day job, one of my hats is systems architect and integrator. One of those systems is a traditional on-premises ERP, while others are web applications and native mobile applications.  We’re a small team, but we make an honest effort to perform scalable and repeatable development practices.  The next improvement we’re making is a move towards **continuous deployment (CD)**. As a team lead, I don’t enforce process changes with unknown consequences, so the CD shift requires me to get my feet wet with NodeJS.

While the first Azure Data Studio extension I developed (First Responder Kit extension) made HTTP GET requests, it did not pass information to external services. The next extension, an extension for Paste the Plan, does send plan information to the external service via HTTP POST requests. I understand that public exposure of information isn’t for everyone – but simply installing the extension doesn’t send data to Paste the Plan.

I can be very particular about the strangest things, and when I picked up Azure Data Studio (pka SQL Operations Studio) I immediately noticed that the syntax coloring wasn’t quite what I wanted.  Since Azure Data Studio has VS Code under the hood, you can install any theme you find on the VS Code marketplace into Azure Data Studio.  The trouble with those themes is that they aren’t usually focused on TSQL syntax.
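One low-effort alternative to authoring a full theme is overriding token colors in user settings. A sketch follows – the scope name and color are assumptions, and the Developer: Inspect Editor Tokens and Scopes command will show the actual scopes the T-SQL grammar emits under your cursor:

```json
{
  "editor.tokenColorCustomizations": {
    "textMateRules": [
      {
        "scope": "keyword.other.sql",
        "settings": { "foreground": "#569CD6" }
      }
    ]
  }
}
```

Rules like this layer on top of whatever theme is active, so you can keep a marketplace theme and still nudge the SQL-specific scopes.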