
10 Monitoring and Alerts using Fabric Events

Benny Austin edited this page Apr 14, 2025 · 38 revisions

The Fabric Accelerator leverages real-time Fabric events to monitor and alert on significant occurrences, enhancing the observability of your data platform. This observability covers all activities within OneLake and Fabric workspaces, as well as the execution of data pipelines and Spark notebooks. Currently, the following monitoring and alerts are available:

  • Frequently used Fabric workspaces, item types, items, users, and user actions.
  • Frequently run data pipelines and Spark notebooks, including elapsed duration, execution status, trigger types, job types, and schedules.
  • Frequently used OneLake actions by users.
  • Alerts for job execution anomalies.
  • Alerts for jobs showing regression trends compared to the last 60 days.
  • Alerts for OneLake usage anomalies.
  • Alerts for new users, among others.
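The page does not show the detection logic behind these alerts, but the duration-anomaly case can be sketched with a simple z-score check over a job's recent runs. The function name and the sample durations below are hypothetical, not part of the accelerator:

```python
from statistics import mean, stdev

def is_duration_anomaly(history_secs, latest_secs, threshold=3.0):
    """Flag a run whose duration deviates more than `threshold`
    standard deviations from the mean of recent runs."""
    if len(history_secs) < 2:
        return False  # not enough history to judge
    mu = mean(history_secs)
    sigma = stdev(history_secs)
    if sigma == 0:
        return latest_secs != mu
    return abs(latest_secs - mu) / sigma > threshold

# Hypothetical recent durations (seconds) for a data pipeline
history = [310, 295, 305, 300, 290, 315, 298]
print(is_duration_anomaly(history, 900))  # far outside the usual range -> True
print(is_duration_anomaly(history, 305))  # within the usual range -> False
```

In practice such checks would run as KQL queries or alert rules over the event tables rather than in application code; this sketch only illustrates the thresholding idea.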

How it works

Fabric Events and Alerts

The Eventstream es_fabricEvents connects to events generated by items in your data platform. The following data sources are connected to the event stream. Data is sent to the event stream whenever a user- or system-generated event occurs.

Connector                     Data source
Fabric Workspace Item Events  The default workspace for this accelerator
Fabric Job Events             Master ETL ASQL data pipeline; Optimize DeltaLake Tables Spark notebook
Fabric OneLake Events         lh_bronze lakehouse; lh_silver lakehouse; dw_gold datawarehouse

The eventstream then filters incoming data based on the schema of the data source type (workspace, job, or OneLake) and lands it as-is in the kdb_fabricEvents KQL database in the Eventhouse eh_fabricAccelerator.
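The routing step above can be sketched as a lookup from source type to destination table. The event shape and the table names below are illustrative assumptions, not the accelerator's actual schema:

```python
# Hypothetical destination tables in the kdb_fabricEvents KQL database,
# keyed by the event's source type (workspace / job / OneLake).
DESTINATIONS = {
    "workspace": "WorkspaceItemEvents",
    "job": "JobEvents",
    "onelake": "OneLakeEvents",
}

def route_event(event):
    """Return the destination table for an event based on its source type.

    `event` is assumed to be a dict with a "sourceType" field; events
    with an unrecognised type fall through to a catch-all table.
    """
    source_type = event.get("sourceType", "").lower()
    return DESTINATIONS.get(source_type, "UnmatchedEvents")

print(route_event({"sourceType": "Job", "itemName": "Master ETL ASQL"}))
print(route_event({"sourceType": "OneLake", "itemName": "lh_bronze"}))
```

In Fabric itself this filtering is configured declaratively in the eventstream rather than written as code; the sketch only makes the branching explicit.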

Event Stream → KQL Database → Update Policies → Materialized Tables → Real-Time Dashboards → Alerts

What does the output look like?

Alerts
