How to Use Azure Log Analytics

If you’ve spent any time in Azure Monitor, you’ve seen some of the myriad log files that your Azure resources create. Think of all the ways that data is represented in Microsoft Azure, and imagine a way to put all your logs in a single data lake and run queries against it seamlessly.

Azure Log Analytics is a platform in which you do just that: aggregate VM and Azure resource log files into a single data lake (called a Log Analytics workspace) and then run queries against the data, using a Microsoft-created data access language called Kusto (pronounced KOO-stoh) Query Language (KQL).

You’ll find that Log Analytics normalizes all these different log streams into a common tabular structure. You’ll also discover that KQL is similar to Structured Query Language (SQL), the standard data access language for relational databases.

Creating a Log Analytics workspace

The first order of business is to deploy a Log Analytics workspace. Then you can on-board as few or as many Azure resources to the workspace as you need. You can also deploy more than one Log Analytics workspace to keep your log data separate.

To create a new Azure Log Analytics workspace, follow these steps:

  1. In the Azure portal, browse to the Log Analytics Workspaces blade, and click Add.

    The Log Analytics workspace blade appears.

  2. Complete the Log Analytics workspace blade.

    You'll need to provide the following details:
    • Workspace name
    • Subscription name
    • Resource group name
    • Location
    • Pricing tier
  3. Click OK to create the workspace.
  4. Click OK to submit your deployment.
Log Analytics has a free tier as well as several paid tiers. The biggest free tier limitations are
  • Data ingestion limit of 5 GB per month
  • 30-day data retention limit
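
If you want to keep an eye on those limits after the workspace is up, the workspace records its own ingestion in the Usage table. The following is a minimal sketch (assuming data has already started flowing in) that totals billable ingestion by data type over the past 30 days:

Usage
| where TimeGenerated > ago(30d) and IsBillable == true
| summarize IngestedGB = sum(Quantity) / 1024 by DataType  // Quantity is reported in megabytes
| sort by IngestedGB desc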

Connecting data sources to the Azure Log Analytics workspace

With your workspace online, you’re ready to on-board Azure resources into it. To connect Azure resources to the workspace, go back to the Diagnostic Settings blade in Azure Monitor, enable diagnostics for each resource, and point the log streams to your workspace.
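
Once diagnostics are flowing, many resource logs land in the AzureDiagnostics table (some newer resource types write to dedicated tables instead). A quick sketch like the following, run from the workspace’s Logs blade, shows what has arrived over the past hour:

AzureDiagnostics
| where TimeGenerated > ago(1h)
| summarize count() by ResourceProvider, Category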

You can connect VMs to the workspace directly from the workspace’s Settings menu. Follow these steps:

  1. In your Log Analytics workspace settings menu, click Virtual Machines.

    You see a list of all VMs in the workspace’s region. You can see which VMs are connected to the workspace and which are not.

  2. If necessary, use the filter controls until you see the VM you want to connect.

    You can link a VM to only one workspace at a time. In the following figure, for example, the vm1 virtual machine is linked to another workspace.

    Figure: Connecting VMs to an Azure Log Analytics workspace.
  3. Select the desired VM, and click Connect.

    Behind the scenes, Azure deploys the Log Analytics agent (formerly called Microsoft Monitoring Agent) to the VM.

  4. Verify that the VM is connected to the workspace.

    You can see this information in your workspace settings, or you can revisit your VM’s Extensions blade and verify that the MicrosoftMonitoringAgent extension is installed. (The Heartbeat query sketched after these steps is another quick check.)

    You should know that Log Analytics can on-board on-premises VMs, particularly those managed by System Center Operations Manager, just as it can cloud-native Linux and Windows Server VMs.
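
For that data-side check, every connected agent writes a record to the workspace’s Heartbeat table every minute or so. Here’s a minimal sketch that lists the most recent heartbeat per connected machine:

Heartbeat
| summarize LastHeartbeat = max(TimeGenerated) by Computer
| sort by LastHeartbeat desc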

You can disconnect a VM from its current workspace and connect it to another one. This operation is trivial, taking only two minutes or so to complete. To do this, simply select the VM from within the workspace and click Disconnect from the toolbar.

Writing KQL queries

You need to know a bit about how to access your Log Analytics workspace data with KQL. KQL is fast and easy to learn, and it should seem familiar to you if you’ve used Splunk Search Processing Language, SQL, PowerShell, or Bash shell.

Touring the Log Search interface

You can get to the Log Search interface by opening Monitor and selecting the Logs blade. Another way to get there is to go to your Log Analytics workspace and click the Logs setting.

A third method is to use the Log Analytics Query Playground, where you can work with an enormous data set, getting to know Log Analytics before generating a meaningful data set.

Follow these steps to run some sample KQL queries:

  1. Go to the Log Analytics portal demo.

    This site is authenticated, but don’t worry: You’re using Microsoft’s subscription, not your own.

  2. Expand some of the tables in the Schema list.

    There’s a lot in this list. Log Analytics normalizes all incoming data streams and projects them into a table-based structure.

    Expand the LogManagement category; then expand the Alert table, where you can use KQL to query Azure Monitor alerts. The entries marked with a t (shown under the expanded SecurityEvent item in the following figure) are properties that behave like columns in a relational database table.

    Figure: The Azure Log Analytics Log Search interface.
  3. On the Log Search toolbar, click Query Explorer, expand the Favorites list, and run the query Security Events Count by Computer During the Last 12 Hours.

    This environment is a sandbox. Microsoft has not only on-boarded untold resources into this workspace but also written sample queries to let you kick the tires. (A sketch of what this particular saved query might look like appears after these steps.)

  4. In the results list, click Chart to switch from Table to Chart view.

    You can visualize your query results automatically with a single button click. Not every results set lends itself to graphical representation, but the capability is tremendous.

  5. Click Export, and save your query results (displayed columns only) to a CSV file.

    Note the link to Power BI, Microsoft’s cloud-based business intelligence/dashboard generation tool.
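
For reference, a saved query like the one you ran in Step 3 boils down to just a few lines of KQL. Here’s a sketch of roughly what it might look like (not necessarily Microsoft’s exact saved query):

SecurityEvent
| where TimeGenerated > ago(12h)
| summarize EventCount = count() by Computer
| sort by EventCount desc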

Writing basic KQL queries

For fun, let’s try an obstacle course of common KQL queries. Click the plus sign in the Log Search query interface to open a new tab; the interface supports multiple tabs, much like Visual Studio and Visual Studio Code.

To get a feel for a table, you can instruct Azure to display any number of rows in no particular order. To display 10 records from the SecurityEvent table, for example, use the following command:

SecurityEvent
| take 10
Did you notice that the query editor attempted to autocomplete your query as you typed? Take advantage of that convenience by pressing Tab when you see the appropriate autocomplete choice appear.

Use the search keyword to perform a free-text query. The following query looks in the SecurityEvent table for any records that include the string "Cryptographic":

search in (SecurityEvent) "Cryptographic"
| take 20
As you typed these queries, you doubtless noticed the pipe character (|). This character functions the same way here as it does in PowerShell or the Bash shell: output from one query segment is passed to the next segment via the pipe, a powerful construct for sure.

You can ramp up the complexity by finishing with filtering and sorting. The following code both filters on a condition and sorts the results in a descending manner based on time:

SecurityEvent
| where Level == 8 and EventID == 4672
| sort by TimeGenerated desc
If you’re thinking, “Wow, these KQL queries act an awful lot like SQL!” you’re right on the money. Welcome to Log Analytics!
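
To make the parallel concrete, here’s a sketch of a typical query annotated with its rough SQL equivalents (the table and column names come from the SecurityEvent examples above):

SecurityEvent                                  // FROM SecurityEvent
| where EventID == 4672                        // WHERE EventID = 4672
| summarize EventCount = count() by Computer   // SELECT Computer, COUNT(*) ... GROUP BY Computer
| sort by EventCount desc                      // ORDER BY EventCount DESC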

About the book author:

Timothy Warner, MCSE, MCT, A+, is an IT professional, technical trainer, and author.