DrewSK.Tech

Databases, Development, and Other Anecdotes


Week 1

Whether you’re a seasoned “senior developer” or taking your first steps into development, coding challenges are an excellent way to expand your skillset. Azure Developer Advocates are hosting the 25 days of serverless development challenge and I’m quite intrigued! I’m familiar with the foundational concepts for Azure Functions in both C# and TypeScript, but I’ll take every opportunity I can get to improve that capability, because Azure Functions and serverless concepts are no small deal for modern developers.

We’ve had a tenancy in Azure for years – so long that our original assumptions and architecture strategies need to be overhauled. More importantly, mistakes are becoming glaring limitations. Our original strategy of deploying most primaries to North Central US has evolved into an ExpressRoute connection to East US. We stood up an API Management instance in North Central US and shipped code with a direct link to the instance’s native name (gateway address) before the instance could be migrated to East US. Over the Thanksgiving holiday I migrated an Azure API Management instance between two regions – here’s how I did the migration.
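One way to move an instance between regions leans on API Management’s backup and restore capability: back the service up to a storage account, then restore it into an instance already provisioned in the target region. A minimal sketch of that flow, assuming the @azure/arm-apimanagement SDK and hypothetical resource names:

```typescript
import { ApiManagementClient } from "@azure/arm-apimanagement";
import { DefaultAzureCredential } from "@azure/identity";

// Hypothetical subscription and resource names for illustration.
const client = new ApiManagementClient(new DefaultAzureCredential(), "<subscription-id>");

async function migrate(): Promise<void> {
  // Back up the North Central US instance to a blob container.
  await client.apiManagementService.beginBackupAndWait("rg-ncus", "apim-ncus", {
    storageAccount: "migrationstorage",
    accessKey: "<storage-access-key>",
    containerName: "apim-backups",
    backupName: "apim-ncus-backup",
  });

  // Restore the backup into an instance that already exists in East US
  // (restore targets an existing service of the same tier).
  await client.apiManagementService.beginRestoreAndWait("rg-eastus", "apim-eastus", {
    storageAccount: "migrationstorage",
    accessKey: "<storage-access-key>",
    containerName: "apim-backups",
    backupName: "apim-ncus-backup",
  });
}

migrate().catch(console.error);
```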

With a few exceptions, improvements to VS Code make their way into Azure Data Studio a few months after release, by way of the VS Code base within Azure Data Studio. The October 2019 release of VS Code, v1.40, brought in a handful of changes along with quite a bit of repository housekeeping, including combining duplicate issues and merging in a large number of pull requests. There are two changes that I wanted to highlight as notable and worth considering for how they will impact Azure Data Studio in the future.

This post will go through an example of adding telemetry collection to a basic Azure Data Studio extension, so that you can analyze the extension’s usage patterns. I won’t even begin talking about the technical details of implementing telemetry data collection without mentioning the importance of clearly disclosing the usage data that is being collected. Even for an open source project, a privacy statement in plain language should be shared if telemetry data is collected.

One of the most streamlined ways to implement telemetry collection is to avoid reinventing the wheel and leverage a currently available module – the VS Code Extension Telemetry module. The module sends telemetry data to an Application Insights instance based on an instrumentation key placed in the application.
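To give a sense of how little wiring this takes, here is a minimal sketch, assuming the vscode-extension-telemetry package and placeholder identifiers and key:

```typescript
import * as vscode from "vscode";
import TelemetryReporter from "vscode-extension-telemetry";

// Placeholder values for illustration; the key comes from your
// Application Insights resource in Azure.
const extensionId = "drewsk.sample-extension";
const extensionVersion = "0.1.0";
const aiKey = "<application-insights-instrumentation-key>";

let reporter: TelemetryReporter;

export function activate(context: vscode.ExtensionContext) {
  reporter = new TelemetryReporter(extensionId, extensionVersion, aiKey);
  // Disposing with the extension flushes any queued events.
  context.subscriptions.push(reporter);

  // Record a hypothetical event with a property and a measurement.
  reporter.sendTelemetryEvent("commandRun", { command: "formatQuery" }, { durationMs: 42 });
}

export function deactivate() {
  reporter?.dispose();
}
```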

I’ve been weighing some complex issues lately, and one of those issues is the use of telemetry data collection in my extensions. While usage data is commonly collected in commercial software, it isn’t generally the first priority for hobby projects. When it comes to understanding how many people have installed one of the Azure Data Studio extensions I’ve worked on or how it is being used, I’m somewhat flying blind. Through the magic of GitHub APIs, I can see how many times a specific release has been downloaded – but I’m unable to see which extensions are getting the most use or which features are most popular.
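That download count comes from the GitHub releases endpoint, where each release asset carries a download_count field. A quick sketch, assuming a Node 18+ runtime for the built-in fetch and a hypothetical repository name:

```typescript
// Sum downloads across all release assets for a repository.
async function totalReleaseDownloads(owner: string, repo: string): Promise<number> {
  const res = await fetch(`https://api.github.com/repos/${owner}/${repo}/releases`);
  if (!res.ok) throw new Error(`GitHub API returned ${res.status}`);
  const releases: { assets: { download_count: number }[] }[] = await res.json();
  return releases
    .flatMap((release) => release.assets)
    .reduce((sum, asset) => sum + asset.download_count, 0);
}

// Repository name is hypothetical; substitute a real one.
totalReleaseDownloads("drewsk", "sample-extension").then(console.log);
```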

While the Azure Data Studio APIs themselves are largely self-documenting, it can be tough to catch in the monthly release notes when new APIs are added. As an Azure Data Studio extension developer, it is certainly helpful to know if new capabilities have been added! The solution that I am currently leveraging to monitor changes to the APIs is surprisingly simple – an RSS feed.
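GitHub exposes an Atom feed for any commit history page when you append .atom to the URL, so one plausible feed to watch – assuming the typings still live at src/sql/azdata.d.ts and adjusting the branch name to the repository’s default – is the commit history of the azdata typings file:

```
https://github.com/microsoft/azuredatastudio/commits/main/src/sql/azdata.d.ts.atom
```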

The following is reposted from the Dynamics SL User Group blog as it contains my personal thoughts on a change to the annual conference for the user group.

Change is often not easy, but the Dynamics SL User Group board of directors has been preparing for the opportunity to join the collective Community Summit for the last several years.  This process involved many hours of individual and group reflection, discussions on the current Microsoft business applications ecosystem, and extended arrangements with Dynamic Communities.

With respect to all the effort that went into the decision and our commitment to the betterment of the entire Dynamics SL user community, I wanted to take a moment and answer a few questions preemptively.

I’ve had the benefit of learning through trial by fire – that is, I became a manager early in my career without any formal management or leadership training. Being a reasonably smart individual, I figured I would be able to lead a team to success on projects in areas beyond the boundaries of my relatively small experience. Without any regard for my own significant technical gaps or my inability to know everything in the technical realm, I charged ahead as a young and motivated manager.

I launched a new extension for Azure Data Studio in early September – Query Editor Boost. More to come on this extension soon, including an updated tutorial for creating extensions for Azure Data Studio.

Query Editor Boost

There were a few pieces of functionality missing from the query editor in Azure Data Studio that I felt would be a good fit for an extension:

  • new query template
  • use database keyboard shortcut
  • friendly snippet editing

Get the extension now through the Azure Data Studio extension marketplace or direct on GitHub: https://github.

Unable to install version <0.0.0> of extension ‘drewsk.newprojectforfun’ as it is not compatible with Azure Data Studio ‘1.9.0’

Curses! You’ve created an extension, tested it with the VS Code debug extension for Azure Data Studio, packaged it up and now want to install it in your Azure Data Studio instance – but you get an error message. What gives?
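The usual culprit is the engines.vscode value in your extension’s package.json – scaffolding tools tend to target the latest VS Code, which can be newer than the VS Code base your Azure Data Studio release is built on. A hedged sketch of the fix (names and version numbers here are illustrative; match the engine version to your Azure Data Studio build):

```json
{
  "name": "newprojectforfun",
  "publisher": "drewsk",
  "version": "0.0.1",
  "engines": {
    "vscode": "^1.38.0"
  }
}
```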

In Dynamics SL, one of the ways to establish a test environment with fairly low overhead is a duplicate set of your APP and SYS databases. While the argument for a completely segmented test environment on additional servers can be made and is especially valid for upgrade processes – the duplicate databases are a good testing ground for interface customizations, stored procedure modifications, and quick query changes. Because these test databases are duplicates of your production data, the same data controls and permissions need to be exercised.

The nuts and bolts of this post are about sending an HTTP POST request from an Azure Logic App that utilizes the multipart/form-data content type. I don’t run into it often, but when I do, I’m sure glad I figured out how to do more than application/json request bodies in Logic Apps.

The use case I came across this week for a multipart/form-data body was the Mailgun API.
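The trick is in the HTTP action’s body: Logic Apps represents multipart content with a $content-type property and a $multipart array, one entry per form field. A sketch of a Mailgun-style message body (field names follow Mailgun’s send API; the values are placeholders):

```json
{
  "$content-type": "multipart/form-data",
  "$multipart": [
    {
      "headers": { "Content-Disposition": "form-data; name=\"from\"" },
      "body": "sender@example.com"
    },
    {
      "headers": { "Content-Disposition": "form-data; name=\"to\"" },
      "body": "recipient@example.com"
    },
    {
      "headers": { "Content-Disposition": "form-data; name=\"subject\"" },
      "body": "Hello from a Logic App"
    },
    {
      "headers": { "Content-Disposition": "form-data; name=\"text\"" },
      "body": "Message body goes here."
    }
  ]
}
```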

There are several ways to set up an environment for learning or development and testing in SQL Server – this post will outline working with Azure SQL Database. The database will be created from a sample of Stack Overflow data and we will connect to it from Azure Data Studio. Because the environment is Azure-based, your hardware requirements are essentially non-existent. The Azure SQL environment comes at a cost of about $15/month – but with a setup this straightforward, you don’t have to keep your environment running longer than you need it.

The Stack Overflow data dump is an anonymized version of the information behind the popular site StackExchange.com. The real data included is a significant improvement over the many popular Microsoft samples (AdventureWorks, WideWorldImporters, ughh).

Read on for a walk-through of transferring the Stack Overflow bacpac from Azure blob storage to your Azure subscription and importing it to an Azure SQL database following these 4 steps.

  • Import the 9GB bacpac file from another storage account to your storage account using AzCopy (see the example after this list)
  • Select the bacpac file from your storage for an Azure SQL database import
  • Wait 2-18 hours
  • Connect
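For the first step, AzCopy v10 can perform the copy server-side between the two storage accounts. A sketch with placeholder account names and SAS tokens:

```
azcopy copy "https://sourceaccount.blob.core.windows.net/samples/StackOverflow.bacpac?<source-sas>" "https://destaccount.blob.core.windows.net/imports/StackOverflow.bacpac?<dest-sas>"
```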

This is an anecdotal recording of a quick series of tests and is not a comprehensive experiment or thorough examination. I don’t have the money for that.

Importing a 9GB .bacpac File

Azure storage account

A test database can be quickly set up by importing a .bacpac file into an Azure SQL server. I have a 9GB bacpac stored in Azure blob storage and in the same region I have an Azure SQL server.
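The import itself can be kicked off from the portal or scripted; a sketch using the Azure CLI’s az sql db import with placeholder names:

```
az sql db import \
  --resource-group rg-sandbox \
  --server sqlserver-sandbox \
  --name StackOverflow \
  --admin-user sqladmin \
  --admin-password "<sql-admin-password>" \
  --storage-key-type StorageAccessKey \
  --storage-key "<storage-account-key>" \
  --storage-uri "https://destaccount.blob.core.windows.net/imports/StackOverflow.bacpac"
```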

“Why do you do what you do?”

What Do I Do?

I solve problems. I build solutions using technology. I keep a small-medium business humming along with systems and infrastructure to support the growth that drives employee-owner shareholder value. The title that I’ve been dubbed is Director of IT, and I get to work with a great group of talented technologists and lead them as the application architect. My involvement with SQL Server is as a database developer and, on bad days, a DBA.