Ingesting events using Azure Monitor and Microsoft Sentinel

Configure Microsoft Entra ID and Azure Monitor to ingest Push webhook logs. You can then use Microsoft Sentinel to run analytics and configure alerting.

Overview of setup steps:

  • Create a Microsoft Entra application.

  • Create a data collection endpoint in Azure Monitor.

  • Create a custom log table in Log Analytics workspace.

  • Set up an Azure Monitor ingestion client.

  • Create a webhook in the Push admin console.

For the ingestion client, we will use the Azure Identity and Azure Monitor ingestion libraries that form part of the Azure SDK for Python.
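
Both libraries are published on PyPI, so in a typical Python environment you can install them with:

pip install azure-identity azure-monitor-ingestion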

Create a Microsoft Entra application

To get started, create an app in Entra ID. This is how the Azure Monitor ingestion client will authenticate. You will also need to grant the app the necessary permissions to make use of the data collection rule and endpoint.

1. In Microsoft Entra ID, go to Manage > App registrations. Click on New registration.

Entra - new registration - Sentinel SIEM

2. Give the app a descriptive name, select the single tenant option, and then click Register.

Entra - add app - register an application - Sentinel SIEM

3. On the app’s Overview page, make note of the Application (client) ID and Directory (tenant) ID values. You will need these later for the ingestion client step. Click on Certificates & secrets.

Entra - add app - certificates and secrets - Sentinel SIEM

4. Click New client secret, provide a description, and set the secret’s expiry date. Finally, click Add.

Entra - add app - add client secret - Sentinel SIEM

Make note of the newly created secret value. You will need this later for the ingestion client step.

Entra - add app - client secret - Sentinel SIEM
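
The ingestion client example later in this guide uses DefaultAzureCredential, which reads these values from the AZURE_TENANT_ID, AZURE_CLIENT_ID, and AZURE_CLIENT_SECRET environment variables. If you would rather pass them explicitly, here is a minimal sketch using ClientSecretCredential from the azure-identity library (the placeholder values are illustrative):

from azure.identity import ClientSecretCredential

# Values recorded from the app's Overview page and the client secret created above
credential = ClientSecretCredential(
    tenant_id="<Directory (tenant) ID>",
    client_id="<Application (client) ID>",
    client_secret="<client secret value>",
)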

Create a data collection endpoint

1. In Azure Monitor, click on Settings > Data Collection Endpoints, and click Create.

Azure data collection endpoint - Sentinel SIEM

2. Provide an Endpoint Name, then select your Subscription, Resource Group, and Region. Click Review + create. On the last pane, click Create.

Data collection endpoint step 2 - Sentinel SIEM

Create a custom log table

In Log Analytics workspaces, select your designated workspace.

1. Click on Settings > Tables, click Create and select New custom log (DCR-based).

Azure new custom log - Sentinel SIEM

2. Provide a Table name, select the Data collection endpoint created in the previous step, and click on Create a new data collection rule. Click Done, followed by Next.

Azure new custom log step 2 - Sentinel SIEM

3. On the next page, you’ll need to provide sample webhook data to create the schema for the custom table. Save the following text in a plain-text file, then drag and drop it to the page or click Browse for files and navigate to the file.

{"version": "1", "id": "76a70a6d-8fff-4517-80b9-e0d3331184c6", "timestamp": 1716279759, "object": "LOGIN", "friendlyName": "Login", "category": "ACTIVITY", "description": "user@example.com logged into https://example.com using a password", "new": {"employeeId": "be35f96b-234d-4e50-9f47-a480350baf3a", "accountId": "9adbb18c-5a8c-4540-bcbc-2dda8d09acfb", "appType": "GOOGLE_WORKSPACE", "appId": "69cb2d4d-1176-4e10-afae-98c919414a6d", "email": "user@example.com", "loginTimestamp": 1716279755, "loginUrl": "https://example.com", "passwordManuallyTyped": false, "weakPassword": false, "weakPasswordReasons": null, "leakedPassword": false, "loginType": "PASSWORD_LOGIN", "identityProvider": null, "sourceIpAddress": "192.168.0.1", "browser": "CHROME", "os": "MACOS", "userAgent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/125.0.0.0 Safari/537.36", "workApp": true, "passwordId": "1b859501-6b89-4c76-99fa-5ca8e3ff5573"}}

Note: You should see a warning that a TimeGenerated field is not present in the sample data. We’ll use the Transformation editor to map the webhook’s “timestamp” field to “TimeGenerated” so events are stored in the table with the correct event time.

Azure new custom log step 3 - Sentinel SIEM

4. Clear any pre-populated text in the transformation editor and paste the following text in its place.

source
| extend eventId = id
| extend TimeGenerated = datetime(1970-01-01) + totimespan(timestamp * 1000 * 10000)
| project-away id
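
The arithmetic in the transform treats the webhook’s timestamp field as Unix epoch seconds and converts it to 100-nanosecond ticks (seconds × 1000 × 10000) before adding it to 1970-01-01. As a quick sanity check in Python, the sample payload’s timestamp of 1716279759 works out to 2024-05-21 08:22:39 UTC:

from datetime import datetime, timedelta, timezone

# Equivalent of the KQL transform: interpret "timestamp" as Unix epoch seconds
ts = 1716279759  # "timestamp" value from the sample webhook payload
time_generated = datetime(1970, 1, 1, tzinfo=timezone.utc) + timedelta(seconds=ts)
print(time_generated.isoformat())  # 2024-05-21T08:22:39+00:00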

5. Click Run to test the transformation on the sample data. You should see a log entry that confirms the transform was successful. Click Apply.

Azure new custom log step 4 - Sentinel SIEM

6. Back at the Create a custom log page, click Next. Finally, click Create.

Azure new custom log step 5 - Sentinel SIEM

Collect information from the DCR and DCE

There are a few items we need from the data collection rule (DCR) and data collection endpoint (DCE) created in the earlier steps.

1. First, head to Azure Monitor > Settings > Data Collection Rules and click on the DCR created in an earlier step.

On the Overview page, click JSON View.

Azure data collection rule - Sentinel SIEM

2. On the Resource JSON page, make note of the immutableId value and the stream name listed under streamDeclarations. The stream name should correspond to the custom log table created in an earlier step. These values will be used for the Azure Monitor ingestion client.

Azure data collection rule step 2 - Sentinel SIEM
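
For reference, the relevant parts of the Resource JSON typically look something like the heavily abbreviated excerpt below. The dcr- ID and the PushEvents_CL table name are illustrative placeholders (and the column list is omitted); streams for DCR-based custom tables are normally named Custom-<table name>:

{
  "properties": {
    "immutableId": "dcr-00000000000000000000000000000000",
    "streamDeclarations": {
      "Custom-PushEvents_CL": {
        "columns": [ ... ]
      }
    }
  }
}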

3. Head to Azure Monitor > Settings > Data Collection Endpoints and click on the DCE object created in an earlier step.

Azure data collection rule step 3 - Sentinel SIEM

4. On the Overview page, click JSON View.

Azure data collection rule step 4 - Sentinel SIEM

5. On the Resource JSON page, make note of the logsIngestion endpoint URL. This value will be used for the Azure Monitor ingestion client.

Azure data collection rule step 5 - Sentinel SIEM

6. Once again, head back to the Azure Monitor page and click Settings > Data Collection Rules, and select the DCR created for this exercise.

Azure data collection rule step 6 - Sentinel SIEM

7. On the DCR page, click Access control (IAM) > Add > Add role assignment.

Azure data collection rule step 7 - Sentinel SIEM

8. On the Add role assignment page, scroll down and select Monitoring Metrics Publisher. This role will be assigned to the application created at the beginning of this guide, and is necessary to allow it to write to the custom logs table via this DCR.

At the bottom of the page, click Next.

Azure data collection rule step 8 - Sentinel SIEM

9. On the Members tab, click Select members and type the name of the application created at the beginning of this article. Select the app and click Select. Finally, click Review + assign twice to finish the role assignment process.

Azure data collection rule step 9 - Sentinel SIEM

Configure an Azure Monitor ingestion client

For the purposes of this guide, we will use the Azure Identity and Azure Monitor ingestion libraries that form part of the Azure SDK for Python to configure an Azure Monitor ingestion client.

Note that your environment and needs may differ from this example. The main requirements are a publicly reachable URL that the Push platform can send webhook events to, and an Azure Monitor ingestion client to handle those requests.

At Push, we handle this with an AWS Lambda function, but you should be able to adjust the example code and use it within your preferred environment.

import json

from azure.core.exceptions import HttpResponseError
from azure.identity import DefaultAzureCredential
from azure.monitor.ingestion import LogsIngestionClient

# Logs ingestion endpoint of the DCE
endpoint_uri = "Use your DCE endpoint URL here"
# immutableId property of the Data Collection Rule
dcr_immutableid = "Use your DCR immutable ID value here"
# Name of the stream in the DCR that represents the destination table
stream_name = "Use your streamDeclarations custom table value here"

# DefaultAzureCredential() expects AZURE_TENANT_ID, AZURE_CLIENT_ID, and AZURE_CLIENT_SECRET
# to be present as environment variables. You can also use other credential classes defined at
# https://learn.microsoft.com/en-us/python/api/overview/azure/identity-readme?view=azure-python#credential-classes
credential = DefaultAzureCredential()

# Note: logging_enable=True produces DEBUG-level logging and may expose credentials
client = LogsIngestionClient(
    endpoint=endpoint_uri, credential=credential, logging_enable=True
)


# AWS Lambda entry point; receives the webhook request from the Push platform
def lambda_handler(event, context):
    # Parse the incoming JSON object from the webhook request body
    try:
        incoming_data = json.loads(event["body"])
    except (KeyError, TypeError, json.JSONDecodeError) as e:
        return {
            "statusCode": 400,
            "body": json.dumps({"message": "Invalid request body", "error": str(e)}),
        }

    # Ensure incoming_data is an array of JSON objects, as the ingestion endpoint expects
    if isinstance(incoming_data, dict):
        incoming_data = [incoming_data]
    elif not isinstance(incoming_data, list):
        return {
            "statusCode": 400,
            "body": json.dumps(
                {"message": "Request body must be a JSON object or an array of JSON objects"}
            ),
        }

    # Upload the events to the custom table via the DCR
    try:
        client.upload(rule_id=dcr_immutableid, stream_name=stream_name, logs=incoming_data)
    except HttpResponseError as e:
        print(f"Upload failed: {e}")
        return {
            "statusCode": 500,
            "body": json.dumps({"message": "HttpResponseError", "error": str(e)}),
        }

    return {
        "statusCode": 200,
        "body": json.dumps({"message": "Data successfully sent to Azure"}),
    }

Essentially, the handler takes the JSON payload provided by the webhook event and packages it as an array of JSON objects, since that is what the ingestion endpoint expects.
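
If you want to sanity-check the handler before exposing it publicly, you can invoke it locally with a fake event built from the sample payload shown earlier. This is an illustrative sketch only: it assumes the function is named lambda_handler as in the example, and that the endpoint, DCR immutable ID, stream name, and Azure credentials are configured, since it performs a real upload.

import json

# Hypothetical local test: builds a webhook-style event and calls the handler directly
sample_event = {
    "body": json.dumps(
        {
            "version": "1",
            "id": "76a70a6d-8fff-4517-80b9-e0d3331184c6",
            "timestamp": 1716279759,
            "object": "LOGIN",
            "category": "ACTIVITY",
            "description": "user@example.com logged into https://example.com using a password",
        }
    )
}

print(lambda_handler(sample_event, None))  # Expect {"statusCode": 200, ...} on success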

Keep the URL of your API client handy, as you’ll be using it in the next step!

Once you’re set up, the client should be ready to start receiving webhook events from Push.

Create a webhook in Push

Finally, go to the Push admin console to add a webhook and paste in the URL of your API client.

Refer to the Generic setup steps for SIEM or SOAR for instructions.