Migrating Azure API Management Between Regions
We’ve had a tenancy in Azure for years – so long that our original assumptions and architecture strategies need to be overhauled. More importantly, mistakes are becoming glaring limitations. Our original strategy of deploying most primary workloads to North Central US has evolved into an ExpressRoute connection to East US. We stood up an API Management instance in North Central US and shipped code with a direct link to the instance’s native name (gateway address) before the API Management instance could be migrated to East US. Over the Thanksgiving holiday I migrated an Azure API Management instance between two regions – here’s how I did the migration:
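At its core, the move relies on API Management’s backup and restore operations on the Azure Management REST API: back the instance up to a storage account, then restore it into an instance in the target region. A minimal sketch of the backup call in TypeScript – every name and key below is a placeholder, and your api-version may differ:

```typescript
// Sketch: kick off an API Management backup to blob storage through the
// Azure Management REST API. Every name, key, and the api-version below
// is a placeholder/assumption – substitute your own values.
const subscriptionId = "<subscription-id>";
const resourceGroup = "<resource-group>";
const serviceName = "<apim-instance-name>";

async function backupApim(bearerToken: string): Promise<void> {
  const url =
    `https://management.azure.com/subscriptions/${subscriptionId}` +
    `/resourceGroups/${resourceGroup}/providers/Microsoft.ApiManagement` +
    `/service/${serviceName}/backup?api-version=2019-01-01`;

  const response = await fetch(url, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${bearerToken}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      storageAccount: "<storage-account-name>",
      accessKey: "<storage-account-access-key>",
      containerName: "apim-backups",
      backupName: "apim-northcentral-backup",
    }),
  });

  // The service replies 202 Accepted and completes the backup asynchronously;
  // restoring into the target-region instance is the same call against /restore.
  console.log(`Backup request status: ${response.status}`);
}
```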
Anticipating VS Code 1.40
With a few exceptions, improvements to VS Code reach Azure Data Studio a few months after release, since Azure Data Studio is built on the VS Code codebase. The October 2019 release of VS Code, v1.40, brought in a handful of changes along with quite a bit of repository housekeeping – combining duplicate issues and merging in a large number of pull requests. There are two changes I wanted to highlight as notable and worth considering for how they will impact Azure Data Studio in the future.
How to Add Telemetry to an Azure Data Studio Extension
This post will go through an example of adding telemetry collection to a basic Azure Data Studio extension, so that you can analyze the extension’s usage patterns. I won’t even begin talking about the technical details of implementing telemetry data collection without mentioning the importance of clearly disclosing the usage data being collected. Even for an open source project, a privacy statement in plain language should be shared if telemetry data is collected.
One of the most streamlined ways to implement telemetry collection is to avoid reinventing the wheel and leverage an existing module – the VS Code Extension Telemetry module. The module sends telemetry data to an Application Insights instance based on an instrumentation key placed in the application.
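For reference, a minimal sketch of wiring that module up in an extension’s activate function – the extension id, version, key, and event names here are placeholders:

```typescript
import * as vscode from "vscode";
import TelemetryReporter from "vscode-extension-telemetry";

// Placeholder values – in a real extension, pull the id and version from
// package.json and the key from an Application Insights resource you own.
const extensionId = "drewsk.sample-extension";
const extensionVersion = "0.1.0";
const appInsightsKey = "<application-insights-instrumentation-key>";

let reporter: TelemetryReporter;

export function activate(context: vscode.ExtensionContext): void {
  reporter = new TelemetryReporter(extensionId, extensionVersion, appInsightsKey);

  // Registering the reporter as a disposable flushes queued events on shutdown.
  context.subscriptions.push(reporter);

  // Record a usage event: string properties and numeric measurements.
  reporter.sendTelemetryEvent(
    "extensionActivated",
    { source: "activate" },
    { startupTimeMs: 42 }
  );
}
```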
Decision: Adding Telemetry to Azure Data Studio Extensions
I’ve been weighing some complex issues lately, and one of those issues is the use of telemetry data collection in my extensions. While usage data is commonly collected in commercial software, it isn’t generally the first priority for hobby projects. When it comes to understanding how many people have installed one of the Azure Data Studio extensions I’ve worked on or how it is being used, I’m somewhat flying blind. Through the magic of GitHub APIs, I can see how many times a specific release has been downloaded – but I’m unable to see which extensions are getting the most use or which features are most popular.
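For anyone curious, that download count is available without authentication – a rough sketch against the public GitHub releases endpoint, with the owner and repository passed in as stand-ins:

```typescript
// Sketch: total downloads across every release of a repository via the
// public GitHub REST API. The owner/repo passed in are stand-ins.
interface ReleaseAsset { name: string; download_count: number; }
interface Release { tag_name: string; assets: ReleaseAsset[]; }

async function totalDownloads(owner: string, repo: string): Promise<number> {
  const res = await fetch(
    `https://api.github.com/repos/${owner}/${repo}/releases`,
    { headers: { Accept: "application/vnd.github.v3+json" } }
  );
  const releases: Release[] = (await res.json()) as Release[];
  return releases
    .flatMap((release) => release.assets)
    .reduce((sum, asset) => sum + asset.download_count, 0);
}
```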
Following Changes to the Azure Data Studio APIs
While the Azure Data Studio APIs themselves are largely self-documenting, it can be tough to catch in the monthly release notes when new APIs are added. As an Azure Data Studio extension developer, it is certainly helpful to know if new capabilities have been added! The solution I am currently leveraging to monitor changes to the APIs is surprisingly simple – an RSS feed.
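To give the flavor of it: GitHub exposes commit history as an Atom feed that can be scoped to a single path, which works nicely for watching an API definition file. The branch and file path below are my assumptions for illustration, not necessarily the exact feed to use:

```typescript
// Sketch: GitHub publishes an Atom feed of commits that can be scoped to a
// single path – useful for watching an API definition file. The branch and
// file path below are assumptions for illustration.
const feedUrl =
  "https://github.com/microsoft/azuredatastudio/commits/main/src/sql/azdata.d.ts.atom";

async function latestCommitTitles(): Promise<string[]> {
  const xml = await (await fetch(feedUrl)).text();
  // Crude <title> extraction (the first hit is the feed's own title);
  // a real feed reader handles the parsing for you.
  return [...xml.matchAll(/<title>([^<]*)<\/title>/g)].map((m) => m[1]);
}
```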
Dynamics SL User Group – Summit 2020
The following is reposted from the Dynamics SL User Group blog as it contains my personal thoughts on a change to the annual conference for the user group.
Change is often not easy, but the Dynamics SL User Group board of directors has been preparing for the opportunity to join the collective Community Summit for the last several years. This process involved many hours of individual and group reflection, discussions on the current Microsoft business applications ecosystem, and extended arrangements with Dynamic Communities.
With respect to all the effort that went into the decision and our commitment to the betterment of the entire Dynamics SL user community, I wanted to take a moment and answer a few questions preemptively.
T-SQL Tuesday #119: Changing Your Mind
I’ve had the benefit of learning through trial by fire – that is, I became a manager early in my career without any formal management or leadership training. Being a reasonably smart individual, I figured I could lead a team to success in projects well beyond the boundaries of my relatively small experience. Without any regard for my own significant technical gaps – or the impossibility of knowing everything in the technical realm – I charged ahead as a young and motivated manager.
Query Editor Boost – Launch!
Azure Data Studio Extension Install Error – “Not Compatible”
Unable to install version <0.0.0> of extension ‘drewsk.newprojectforfun’ as it is not compatible with Azure Data Studio ‘1.9.0’
Curses! You’ve created an extension, tested it with the Azure Data Studio debug extension for VS Code, packaged it up, and now want to install it in your Azure Data Studio instance – but you get an error message. What gives?
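In my experience the usual culprit is the engines field in the extension’s package.json: if the vscode engine version it demands is newer than the VS Code base your Azure Data Studio release ships with, the install is rejected. Relaxing the requirement looks something like this – the version shown is a placeholder, so match it to your Azure Data Studio release:

```json
{
  "engines": {
    "vscode": "^1.30.1"
  }
}
```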
Dynamics SL Test Databases
Azure Logic Apps and Form-Data HTTP Requests
The nuts and bolts of this post are about sending an HTTP POST request from an Azure Logic App using the multipart/form-data content type. I don’t run into it often, but when I do, I’m sure glad I figured out how to do more than application/json request bodies in Logic Apps.
The use case I came across this week for a multipart/form-data body was the Mailgun API.
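Here is the shape of the trick: the HTTP action’s body can declare a $content-type of multipart/form-data plus a $multipart array, one entry per form field. The Mailgun-style field names and addresses below are placeholder values for illustration:

```json
{
  "$content-type": "multipart/form-data",
  "$multipart": [
    {
      "headers": { "Content-Disposition": "form-data; name=\"from\"" },
      "body": "sender@example.com"
    },
    {
      "headers": { "Content-Disposition": "form-data; name=\"to\"" },
      "body": "recipient@example.com"
    },
    {
      "headers": { "Content-Disposition": "form-data; name=\"subject\"" },
      "body": "Hello from a Logic App"
    },
    {
      "headers": { "Content-Disposition": "form-data; name=\"text\"" },
      "body": "Testing a multipart/form-data request body."
    }
  ]
}
```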
Database Sandboxes: Azure SQL DB
There are several ways to set up an environment for learning, development, or testing with SQL Server – this post will outline working with Azure SQL Database. The database will be created from a sample of Stack Overflow data, and we will connect to it from Azure Data Studio. Because the environment is Azure-based, your hardware requirements are essentially nonexistent. The Azure SQL environment comes at a cost of about $15/month – but with a setup this straightforward, you don’t have to keep your environment running longer than you need it.
The Stack Overflow data dump is an anonymized version of the information behind the popular site StackExchange.com. The real data included is a significant improvement over the many popular Microsoft samples (AdventureWorks, WideWorldImporters, ughh).
Read on for a walk-through of transferring the Stack Overflow bacpac from Azure blob storage to your Azure subscription and importing it to an Azure SQL database in these four steps:
- Import the 9 GB bacpac file from another storage account to your storage account using AzCopy (see the sketch after this list)
- Select the bacpac file from your storage for an Azure SQL database import
- Wait 2-18 hours
- Connect
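For step 1, the AzCopy call is a single copy command. Here’s the shape of it with AzCopy v10 syntax – the account names, container, file name, and SAS tokens are all placeholders:

```
azcopy copy "https://<source-account>.blob.core.windows.net/<container>/StackOverflow.bacpac?<source-SAS>" "https://<your-account>.blob.core.windows.net/<container>/StackOverflow.bacpac?<destination-SAS>"
```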