Getting started with Log Analytics and PowerShell logging

Log Analytics is a fantastic place to ship, store, and analyse your logs. Whether they’re coming from a linked Azure resource, machine agents, or you’re posting them from your own applications and services, Log Analytics is a key part of Azure Management & Monitoring. Whether you’re an IT Pro, working in devops, or an application developer - this platform and its capabilities are worth exploring and understanding.

Log Analytics was previously offered as part of the Operations Management Suite (OMS) bundling, though that labelling is in the process of being retired.

To get started you can create a free workspace which lets you ingest up to 5GB of data per month. Once the data is loaded there is no cost to query it, and it’ll be retained for 31 days (you can increase retention and ingestion limits as part of a paid plan later).

This post will walk through creating a Log Analytics workspace, uploading some logs with PowerShell, and then querying them via the portal. To follow along you’ll need an Azure subscription and the AzureRM PowerShell module - for installation instructions see the prerequisites section at the end of this post.
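As a taste of the first step, here’s a minimal sketch of creating a free workspace with the AzureRM cmdlets. The resource group name, workspace name, and location below are placeholders, and the Free SKU assumes that pricing tier is still available on your subscription:

# Assumes AzureRM.OperationalInsights is installed and you're signed in with Login-AzureRmAccount
New-AzureRmResourceGroup -Name "oms-example-rg" -Location "East US"

# Create the workspace on the free tier - SKU availability depends on your subscription
New-AzureRmOperationalInsightsWorkspace `
  -ResourceGroupName "oms-example-rg" `
  -Name "oms-example-workspace" `
  -Location "East US" `
  -Sku "Free"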

[Read More]

Troubleshooting a slow application with Application Insights

Application Insights (AppInsights) is a fantastic instrumentation framework that, with minimal or even zero configuration, will start giving you rich data about your application’s performance.

We recently got some reports that one of our website solutions was ‘slow’ when developing locally, and as much as we’d like to turn to the DBA (you know what DBA stands for, right? I like Database Blamed Always…), with AppInsights we can be a little more rigorous.

From our starting point of ‘it runs slow locally, I think it is the database’ we’ll figure out precisely how slow it is, and whether it really is the database or not.

[Read More]

Ensuring your Describe Tags are unique in Pester tests

The name of each test in SQLChecks is used as both the setting name in the configuration files and the tag on the Describe block. After seeing the benefit of fine-grained control over test execution (from Claudio Silva’s post dbachecks - a different approach…) this method of test invocation became the preferred way to leverage the SQLChecks library:

$config = Read-SqlChecksConfig -Path $sqlChecksConfigPath
foreach($check in (Get-SqlChecksFromConfig $config)) {
  # Note we invoke by -Tag $check - a test with no tag will never get invoked
  Invoke-SqlChecks -Config $config -Tag $check
}

There isn’t yet a convention for how to name a test, and we’ve already had some tests built with similar sounding names - it is only a matter of time before we get a duplicate. To prevent duplicate tests accidentally getting checked in (and causing unusual/broken behaviour for consumers), I recently added a test that parses the test files and ensures that each tag is not only unique within the file, but globally within SQLChecks.

Describe tags test on AppVeyor

You can find the full test on GitHub, or read on for an explanation of how it is implemented. For a more thorough exploration of tests you can run on Describe blocks see SQLDBAWithABeard’s blog Using the AST in Pester for dbachecks (which inspired this test), or the TechNet post learn how it pros can use the PowerShell AST.
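As a rough illustration of the technique (this isn’t the SQLChecks test itself, and the test folder layout here is an assumption), the PowerShell parser can find every Describe call and pull out its -Tag argument, after which any duplicate is easy to spot:

$testFiles = Get-ChildItem -Path .\tests -Filter *.Tests.ps1 -Recurse

$tags = foreach ($file in $testFiles) {
  # Parse the file into an AST without executing it
  $ast = [System.Management.Automation.Language.Parser]::ParseFile($file.FullName, [ref]$null, [ref]$null)

  # Find every Describe command in the file
  $describes = $ast.FindAll({
    param($node)
    $node -is [System.Management.Automation.Language.CommandAst] -and
    $node.GetCommandName() -eq 'Describe'
  }, $true)

  foreach ($describe in $describes) {
    # The element immediately after -Tag is the tag value
    $elements = $describe.CommandElements
    for ($i = 0; $i -lt $elements.Count - 1; $i++) {
      if ($elements[$i] -is [System.Management.Automation.Language.CommandParameterAst] -and
          $elements[$i].ParameterName -eq 'Tag') {
        $elements[$i + 1].Extent.Text.Trim('"', "'")
      }
    }
  }
}

# Any tag that appears more than once is a duplicate
$tags | Group-Object | Where-Object Count -gt 1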

[Read More]

Adding Pester tests to a PowerShell module and scheduling CI with AppVeyor

Adding Pester tests to a PowerShell module is probably one of the most valuable development activities you’ll be able to perform, and I’d encourage you to do it early in your project. I left it until rather late with SQLChecks, and as a result have broken the module several times.

While some of the breaks were definitely edge cases (and I don’t think I have the foresight to write a test that would have caught them), one of the most egregious errors caused SQLChecks to not export any functions at all. In this post I’ll walk through the steps needed to add tests and CI to the module, which in brief are:

  • Create a Pester test for the module (a sketch follows this list)
  • Run the test locally, demonstrating that it will fail when the module doesn’t export any functions
  • Schedule the test to run automatically every time a commit is pushed to the GitHub repo (using AppVeyor)
  • Display the build status and number of passing tests on the GitHub readme
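For the first bullet, the core test looks something like the following (Pester 4 syntax; the module path relative to the test file is an assumption - adjust it to your repo layout):

Describe "SQLChecks module" {
  It "exports at least one function" {
    # Import from the repo rather than any installed copy - path is a placeholder
    Import-Module "$PSScriptRoot\..\src\SQLChecks" -Force

    @(Get-Command -Module SQLChecks -CommandType Function).Count |
      Should -BeGreaterThan 0
  }
}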

And once we’re done a quick glance at our readme will show:

SQLChecks with badges

[Read More]

Setting sp_configure values with SQLChecks

As of v1.0, SQLChecks contains the Set-SpConfig command, which lets you take a file that documents a server configuration (specifically sp_configure values) and apply that configuration to a server. The configuration file is the same one used by the Pester tests (perhaps in combination with something like dbachecks), which means you now have a mechanism to document, test, and set your server’s configuration.

In order to apply the configuration to a single server you would run the following PowerShell (note that SQLChecks configuration files contain the instance name, which is why we don’t have to specify a server):

  Read-SqlChecksConfig "c:\configs\localhost.config.json" `
  | Set-SpConfig -Verbose

Running with -Verbose means it will output progress as each value is changed, as well as a summary when it finishes (x/y config values updated).

Note that the command compares the configured value against the expected value: if the configured value is correct but the running value is wrong, this will neither fail the Pester tests nor cause Set-SpConfig to update the value.
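If you keep one config file per instance, the same pattern extends to applying a whole folder of configurations (a sketch - the folder path and naming convention are assumptions):

  Get-ChildItem -Path "c:\configs" -Filter *.config.json |
    ForEach-Object { Read-SqlChecksConfig -Path $_.FullName | Set-SpConfig -Verbose }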

[Read More]