How To Confidently Create Microsoft Sentinel Data Lake Custom Tables for Security Analytics

🔍 Introduction

Security teams often struggle to onboard custom or niche log sources into Microsoft Sentinel. While built-in connectors handle many scenarios, custom tables unlock flexibility—allowing you to store, query, and analyze unique datasets in the Microsoft Sentinel data lake.

This first post in the series shows how to:
1️⃣ Create a custom table with Azure CLI
2️⃣ Verify the configuration
3️⃣ Change its tier to Sentinel Data Lake through the Defender Portal


🧩 Before You Begin

You’ll need:

  • An Azure subscription with permission to create Log Analytics tables
  • An existing resource group and Sentinel workspace
  • Azure CLI v2.66 or newer
  • Contributor or Log Analytics Contributor role

💻 Creating a Custom Table with Azure CLI

CLI Limitation (December 2025):
The az monitor log-analytics workspace table create command currently supports only --plan Analytics or --plan Basic.
Data Lake plans aren’t yet available through the CLI. Create the table with the Analytics plan, then change its tier in the Defender Portal.

Step 1 – Set Your Context

az login
az account set --subscription <your-subscription-id>

Step 2 – Create the Custom Table

az monitor log-analytics workspace table create \
  --resource-group <your-resource-group> \
  --workspace-name <your-workspace-name> \
  --name Syslog_datalake_CL \
  --columns TimeGenerated=datetime Device=string Message=string Severity=string \
  --plan Analytics \
  --retention-time 90

✅ This creates a standard Log Analytics custom table inside your Sentinel workspace.
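If you automate table creation outside the CLI, the same request can go through the ARM Tables REST API. Below is a minimal Python sketch of the request body that mirrors the CLI call above; the api-version and placeholder values are assumptions, so check the current Azure REST reference before using it.

```python
import json

# Hypothetical placeholders -- substitute your own values.
SUBSCRIPTION_ID = "<your-subscription-id>"
RESOURCE_GROUP = "<your-resource-group>"
WORKSPACE = "<your-workspace-name>"
TABLE = "Syslog_datalake_CL"

# ARM endpoint for the Log Analytics Tables API. The api-version here is an
# assumption; verify the current one in the Azure REST API reference.
url = (
    "https://management.azure.com"
    f"/subscriptions/{SUBSCRIPTION_ID}"
    f"/resourceGroups/{RESOURCE_GROUP}"
    "/providers/Microsoft.OperationalInsights"
    f"/workspaces/{WORKSPACE}/tables/{TABLE}"
    "?api-version=2022-10-01"
)

# Request body mirroring the CLI call above: Analytics plan, 90-day
# retention, and the four custom columns.
payload = {
    "properties": {
        "plan": "Analytics",
        "retentionInDays": 90,
        "schema": {
            "name": TABLE,
            "columns": [
                {"name": "TimeGenerated", "type": "datetime"},
                {"name": "Device", "type": "string"},
                {"name": "Message", "type": "string"},
                {"name": "Severity", "type": "string"},
            ],
        },
    }
}

print(json.dumps(payload, indent=2))
```

You would send this payload in a PUT request with a valid Azure AD bearer token; the sketch only builds and prints the body.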


🧠 Verify the Table

az monitor log-analytics workspace table show \
  --resource-group <your-resource-group> \
  --workspace-name <your-workspace-name> \
  --name Syslog_datalake_CL \
  --query "{name:name,plan:plan,totalRetentionInDays:totalRetentionInDays}"

Expected output:

{
  "name": "Syslog_datalake_CL",
  "plan": "Analytics",
  "totalRetentionInDays": 90
}
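In a deployment script, you can parse that JSON and fail fast if the table isn’t configured as expected. A small Python sketch (the sample dict below is a stand-in for the real az output above):

```python
def check_table(info: dict, expected_plan: str = "Analytics",
                expected_retention: int = 90) -> bool:
    """Return True when the table's plan and total retention match expectations."""
    return (info.get("plan") == expected_plan
            and info.get("totalRetentionInDays") == expected_retention)

# In practice, feed this json.loads() of the `az ... table show` output.
# Here we reuse the expected output shown above as a stand-in.
info = {
    "name": "Syslog_datalake_CL",
    "plan": "Analytics",
    "totalRetentionInDays": 90,
}
print(check_table(info))  # True
```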

🧪 Validate in Sentinel Logs

In the Microsoft Sentinel Logs blade, run:

Syslog_datalake_CL | getschema

You’ll see your columns (TimeGenerated, Device, Message, Severity), confirming the table is active and ready for data ingestion.
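If you validate end to end in a script, the same schema check can be expressed in Python. This is a sketch: the expected column set comes from the table definition above, and the sample dict stands in for a parsed getschema result.

```python
# Expected columns from the table created earlier in this post.
EXPECTED = {"TimeGenerated": "datetime", "Device": "string",
            "Message": "string", "Severity": "string"}

def schema_matches(actual: dict) -> bool:
    """True when every expected column is present with the right type.
    `actual` maps column name -> type, e.g. built from a getschema result."""
    return all(actual.get(name) == typ for name, typ in EXPECTED.items())

# Stand-in for a parsed getschema result; extra system columns such as
# _ResourceId don't affect the check.
sample = {"TimeGenerated": "datetime", "Device": "string",
          "Message": "string", "Severity": "string", "_ResourceId": "string"}
print(schema_matches(sample))  # True
```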


🧭 Switch to Sentinel Data Lake Tier (Portal Step)

After verifying the table, you can change its storage tier to Sentinel Data Lake for cost-efficient, long-term storage.

  1. Open the Microsoft Defender Portal
  2. Go to Microsoft Sentinel → Tables → Manage Table
  3. Select Syslog_datalake_CL
  4. Under Table Tier, switch from Analytics to Sentinel Data Lake

https://learn.microsoft.com/en-us/azure/sentinel/manage-table-tiers-retention

What this does

  • Routes future ingested data to the Data Lake tier (cold / archival).
  • Keeps existing Analytics data where it is.
  • Reduces cost for high-volume or compliance logs.
  • Disables real-time analytics and hunting on data stored only in the Data Lake tier.

Note: The CLI can’t change table plans yet—this action is supported only through the Defender Portal UI.


⚙️ Troubleshooting Tips

  • Invalid plan value — Cause: the CLI doesn’t accept Data Lake plans. Resolution: use --plan Analytics, then change the tier in the Portal.
  • Missing TimeGenerated — Cause: incorrect column syntax. Resolution: use space-separated name=type pairs.
  • Need to change the plan after creation — Cause: plans can’t be updated via CLI. Resolution: manage the tier from Defender Portal → Microsoft Sentinel → Tables → Manage Table.

✅ Summary

You have now:

  • Created Syslog_datalake_CL via CLI using the Analytics plan
  • Verified its schema and retention (90 days)
  • Changed its tier to Sentinel Data Lake in the Defender Portal

This hybrid approach—CLI for automation plus portal for tier management—gives you full control and cost-optimized data governance.


💡 Next Up

Part 2: Ingesting Syslog Data into Your Custom Table
We’ll connect your syslog collector VM, configure a Data Collection Rule (DCR), and validate live data flow into Syslog_datalake_CL.

For previous posts, please take a look here: Home – Its Security Day with Mike