{"id":4204,"date":"2023-11-04T23:14:08","date_gmt":"2023-11-04T23:14:08","guid":{"rendered":"http:\/\/localhost:10003\/real-time-analytics-with-azure-data-explorer\/"},"modified":"2023-11-05T05:47:56","modified_gmt":"2023-11-05T05:47:56","slug":"real-time-analytics-with-azure-data-explorer","status":"publish","type":"post","link":"http:\/\/localhost:10003\/real-time-analytics-with-azure-data-explorer\/","title":{"rendered":"Real-time analytics with Azure Data Explorer"},"content":{"rendered":"
Azure Data Explorer (ADX) is a fast, reliable, and highly scalable real-time analytics platform provided by Microsoft. It’s designed to collect, analyze, and visualize massive volumes of data in real time. This tutorial will walk you through the steps to set up a simple real-time analytics pipeline using Azure Data Explorer.
Before you begin, you’ll need the following:

- An active Azure subscription.
- The Azure CLI installed on your machine.
- The Azure Functions Core Tools (`func`), used later to create and deploy the function that ingests data.
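If you haven’t already, sign in with the Azure CLI and select the subscription you want to use. A minimal sketch (the subscription ID is a placeholder):

```
az login
az account set --subscription "<subscription-id>"
```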
## Create an Azure Data Explorer Cluster

The first step in setting up your real-time analytics pipeline is to create an Azure Data Explorer cluster. A cluster is a collection of Azure Data Explorer resources that work together to enable real-time data processing.
To create a new cluster, start by logging in to your Azure account and opening a new terminal window. Then, use the following command to create a new Azure Data Explorer cluster.
```
az kusto cluster create --resource-group <resource-group> --name <cluster-name> --sku D13_v2 --capacity 2 --location <location>
```

- Replace `<resource-group>` with the name of the resource group in which you want to create the cluster.
- Replace `<cluster-name>` with the name you want to give your new cluster.
- Replace `<location>` with the Azure region in which to create the cluster, for example `eastus`.

In this example, we’re creating a cluster with the `D13_v2` SKU, which is suitable for moderate workloads. The `--capacity` parameter determines how many nodes are provisioned for your cluster. Data retention is configured per database rather than on the cluster, so we’ll set it in the next step.

It can take several minutes for the cluster to be provisioned. Once the cluster is created, you can use the following command to check its status.
```
az kusto cluster show --resource-group <resource-group> --name <cluster-name>
```

## Create an Azure Data Explorer Database
The next step is to create a database within your Azure Data Explorer cluster. A database is a logical container for your data and enables you to manage access, retention, and querying.

To create a new database, use the following command.
```
az kusto database create --resource-group <resource-group> --cluster-name <cluster-name> --name <database-name> --soft-delete-period P30D
```

- Replace `<cluster-name>` with the name of your Azure Data Explorer cluster.
- Replace `<database-name>` with the name you want to give your new database.
- The `--soft-delete-period` parameter controls how long ingested data remains available for query; `P30D` retains data for 30 days.

Once the database is created, you can verify its status with the following command.
```
az kusto database show --resource-group <resource-group> --cluster-name <cluster-name> --name <database-name>
```

## Ingest Data into Azure Data Explorer
With the Azure Data Explorer cluster and database created, let’s start ingesting data. There are several ways you can ingest data into Azure Data Explorer, including Azure Event Hubs, Azure Blob Storage, and Azure Data Factory. For this tutorial, we’ll use Event Grid to trigger an Azure Function, which will then ingest data into our Azure Data Explorer database.
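Whichever ingestion path you use, the target table needs to exist in the database before data arrives, and JSON payloads typically also need an ingestion mapping. Below is a minimal sketch of the control commands you could run (one at a time) in the Azure Data Explorer query UI, which is covered later in this tutorial. The `telemetry` table name matches the queries used later in this post, but the columns and mapping name are illustrative assumptions.

```
.create table telemetry (vehicleId: string, timestamp: datetime, speed: real)

.create table telemetry ingestion json mapping "TelemetryMapping" '[{"column":"vehicleId","path":"$.vehicleId"},{"column":"timestamp","path":"$.timestamp"},{"column":"speed","path":"$.speed"}]'
```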
### Create an Azure Function

First, we need to create a function app that will ingest data into our Azure Data Explorer database.

To create a new function app, follow these steps:
1. Open a new terminal window and navigate to a local directory where your function code will be stored.
2. Use the following command to create a new function app.
```
az functionapp create --name <function-name> --resource-group <resource-group> --storage-account <storage-account> --consumption-plan-location eastus --runtime dotnet
```

- Replace `<function-name>` with the name of your new function app.
- Replace `<resource-group>` with the name of the resource group in which you want to create the function app.
- Replace `<storage-account>` with the name of a storage account in the same resource group; every function app requires one.

This command creates a new function app in the `eastus` region with the `dotnet` runtime on a consumption plan. It can take several minutes for the function app to be provisioned.

3. Once the function app is provisioned, use the Azure Functions Core Tools to scaffold a new HTTP-triggered function in your local project directory.
```
func init --worker-runtime dotnet
func new --name <http-function-name> --template "HTTP trigger"
```

- Replace `<http-function-name>` with the name of your new HTTP-triggered function.

These commands initialize a local Functions project and add a function that’s triggered by an HTTP request.
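The function will need to know which cluster and database to ingest into. One straightforward approach is to pass these values as app settings; the setting names below are hypothetical, and the cluster URI assumes the standard `https://<cluster-name>.<region>.kusto.windows.net` form.

```
az functionapp config appsettings set \
  --name <function-name> \
  --resource-group <resource-group> \
  --settings "KustoClusterUri=https://<cluster-name>.<region>.kusto.windows.net" "KustoDatabase=<database-name>"
```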
4. Next, deploy your function code to Azure. You can do this from the Azure Portal by opening your function app and following the prompts in the Deployment Center, or you can publish directly from the terminal.
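For example, publishing from the terminal might look like the following; this is a minimal sketch and assumes you run it from the project directory created by `func init`.

```
func azure functionapp publish <function-name>
```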
### Create an Azure Event Grid Topic

With our Function app created, let’s create an Event Grid topic to trigger our Function when new events are published.

To create a new Event Grid topic, follow these steps:
1. Open a new terminal window (or use Azure Cloud Shell in the Azure Portal).
2. Use the following command to create a new Event Grid topic.
```
az eventgrid topic create --name <topic-name> --resource-group <resource-group> --location <location>
```

- Replace `<topic-name>` with the name of your new Event Grid topic.
- Replace `<resource-group>` with the name of the resource group in which you want to create the Event Grid topic.
- Replace `<location>` with the Azure region for the topic, for example `eastus`.

This command creates a new Event Grid topic.
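You can confirm that the topic provisioned successfully by checking its provisioning state:

```
az eventgrid topic show --name <topic-name> --resource-group <resource-group> --query "provisioningState" --output tsv
```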
3. Next, use the following command to retrieve the endpoint URL for your new Event Grid topic.
```
az eventgrid topic show --name <topic-name> --resource-group <resource-group> --query "endpoint" --output tsv
```

4. Copy the endpoint URL to your clipboard.
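Whatever publishes telemetry events to this topic will also need one of the topic’s access keys. You can retrieve one with the following command, which reads `key1` from the standard key-list output:

```
az eventgrid topic key list --name <topic-name> --resource-group <resource-group> --query "key1" --output tsv
```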
### Create an Azure Event Grid Subscription

The last step in setting up our Event Grid pipeline is to create a new subscription that triggers our Function app when new events are published to our Event Grid topic.

To create a new subscription, follow these steps (an equivalent Azure CLI command is shown after the list):
1. Navigate back to your Function app and go to the “Integrate” tab.
2. Click the “New Input” button, and then select “Azure Event Grid” from the list of available triggers.
3. Enter the name of your new subscription in the “Subscription Name” field.
4. Paste the endpoint URL for your Event Grid topic into the “Event Grid Topic Endpoint Url” field.
5. Choose the desired options for your subscription. For this tutorial, we’ll use the default options.
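If you prefer the command line, the subscription can also be created with the Azure CLI. The sketch below assumes the function is exposed as a plain webhook; `<subscription-name>` and `<function-key>` are placeholders you’d substitute with your own values.

```
topic_id=$(az eventgrid topic show --name <topic-name> --resource-group <resource-group> --query "id" --output tsv)

az eventgrid event-subscription create \
  --name <subscription-name> \
  --source-resource-id "$topic_id" \
  --endpoint "https://<function-name>.azurewebsites.net/api/<http-function-name>?code=<function-key>"
```

Note that Event Grid validates webhook endpoints with a handshake event, so the function must respond to that validation request before the subscription is accepted.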
## Query Data in Azure Data Explorer

With our real-time analytics pipeline fully operational, let’s take a look at how we can query data in Azure Data Explorer. Azure Data Explorer provides a powerful query language called KQL (Kusto Query Language) that enables us to perform complex queries on our data.
### Navigate to the Query Portal

To open the Azure Data Explorer query portal, follow these steps:

1. Navigate to the Azure Portal and open your Azure Data Explorer cluster.
2. Go to the “Data Explorer” tab.
3. Click the “Launch Query UI” button.

You can also open the standalone web UI directly at https://dataexplorer.azure.com and connect it to your cluster.
### Create a Simple Query

Once you’re in the query portal, you can start writing queries to retrieve and analyze your data.

For example, let’s say our Function app is ingesting telemetry data from a fleet of vehicles. We can use the following query to retrieve the number of messages received from each vehicle in the fleet.
```
telemetry
| summarize count() by vehicleId
```

### Analyze Data with Visualizations
Azure Data Explorer also enables you to create interactive visualizations of your data. To create a new visualization, follow these steps (a query-based alternative using the `render` operator is shown after the list):

1. Write your query and ensure that it returns the data you want to visualize.
2. Click the “New Visualization” button in the query editor.
3. Select the desired chart type for your visualization.
4. Configure the chart properties as desired.
5. Click the “Run” button.
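Alternatively, you can render a chart directly from KQL with the `render` operator. A minimal sketch, assuming the `telemetry` table has a `timestamp` datetime column (as in the illustrative schema earlier):

```
telemetry
| summarize messages = count() by bin(timestamp, 1h)
| render timechart
```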
## Conclusion

In this tutorial, we’ve seen how to set up a simple real-time analytics pipeline using Azure Data Explorer. We started by creating an Azure Data Explorer cluster and database, and then ingested data into our database using Azure Functions and Event Grid. Finally, we explored how to retrieve and analyze our data using Azure Data Explorer’s powerful query language and visualization tools.
With Azure Data Explorer, you can easily scale your real-time analytics to handle millions of data points per second, so you can make critical business decisions in real time.