Power BI Integration

Use data contract schemas directly in Microsoft Power BI as semantic models.

Entropy Data integrates with Power BI for a seamless data consumer experience. From an existing data product's output port, a user can publish a Power BI semantic model into one of their own Fabric workspaces. Entropy Data generates the TMDL from the data contract, binds the semantic model to the underlying Snowflake or Databricks connection, and then creates a new data product in Entropy Data whose output port is linked to the newly published Power BI semantic model. This new data product is registered as a consumer of the existing data product.

In Power BI, you can use the published semantic model to build reports, extend it with your own measures, and combine it with other data sources. In Entropy Data, you can follow the business lineage and see exactly how your data flows.

Open in Power BI

Architecture

A business or data analyst triggers the integration from the Open in Power BI wizard in Entropy Data. Authentication goes through Microsoft Entra ID (OAuth 2.0 / OIDC). Entropy Data converts the output port's Data Contract into a TMDL definition, POSTs it to the Power BI REST API as a new Semantic Model, and binds it to an existing Fabric Connection. The resulting model is registered back in Entropy Data as a governed consumer Data Product; at runtime, Power BI reads the underlying data source (e.g. Databricks) via the bound connection.

How it works

  1. An organization owner turns on the integration by configuring Power BI app credentials for the organization — either SSO-based (if members sign in via Microsoft Entra SSO) or a custom Entra application registered for Entropy Data.
  2. Each user connects their own Microsoft Entra account to Entropy Data once, granting the permissions needed to read Fabric workspaces and create semantic models on their behalf. Tokens are stored encrypted in Entropy Data and refreshed automatically.
  3. When a user opens the Power BI action on an output port, Entropy Data:
    • generates a TMDL semantic model from the output port's data contract (ODCS),
    • creates it in the Fabric workspace the user selects,
    • binds it to a Fabric connection — reusing an existing one if the data-source parameters match, or prompting the user to enter credentials for a new one,
    • creates a consumer data product and an approved data-usage agreement in Entropy Data so the published model is visible in lineage and reuse.
  4. Previously published semantic models appear on the output port page, so users can jump straight to them in Power BI.
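The TMDL generation in step 3 can be sketched as a simple rendering pass over the contract's schema. This is an illustrative simplification, not Entropy Data's actual generator: the contract is reduced to a hypothetical mapping of column names to TMDL data types.

```python
def contract_to_tmdl(table_name: str, columns: dict) -> str:
    """Render a minimal TMDL table definition from a simplified
    (hypothetical) contract structure: {column_name: tmdl_data_type}.
    Real ODCS contracts carry far more metadata (descriptions, quality
    rules, classifications) that a full generator would also map."""
    lines = [f"table {table_name}"]
    for name, data_type in columns.items():
        lines.append(f"\tcolumn {name}")
        lines.append(f"\t\tdataType: {data_type}")
    return "\n".join(lines)

tmdl = contract_to_tmdl("orders", {"order_id": "int64", "amount": "decimal"})
print(tmdl)
```

In practice the generated TMDL spans multiple definition files (model, tables, expressions) before it is submitted to the Power BI REST API.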

Supported output port types for the underlying Fabric connection: Snowflake and Databricks.

Authentication flow

Getting the integration working is a two-step process: an organization owner configures the OAuth application once, then each user connects their own Microsoft account on first use. After that, cached tokens are reused and refreshed in the background.

Power BI authentication flow

Prerequisites

  • An Entropy Data Enterprise License or the Cloud Edition.
  • A Microsoft Entra ID tenant with either:
    • Option A — SSO-based credentials. Your organization's members sign in to Entropy Data via Entra ID / Microsoft SSO. The existing SSO application is reused for the Power BI user-delegation flow.
    • Option B — Custom Entra application. A dedicated application registration in your Entra tenant with the permissions and redirect URI described below.

Cloud Edition

The feature flag, encryption keys, and signing keys are already configured and managed for you. No environment variables to set — skip straight to Option A or Option B to register the Entra application.

Self-hosted

Set the following environment variables before starting Entropy Data. See Configuration for the full reference.

| Variable | Purpose |
| --- | --- |
| `APPLICATION_POWERBI_WEB_ENABLED=true` | Global feature flag that enables the integration |
| `APPLICATION_ENCRYPTION_KEYS` | 64-hex-character key; encrypts stored client secrets and user tokens at rest |
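`APPLICATION_ENCRYPTION_KEYS` expects 64 hexadecimal characters, i.e. 32 random bytes. One way to generate a suitable value:

```python
import secrets

# 32 cryptographically secure random bytes -> 64 lowercase hex characters,
# matching the expected APPLICATION_ENCRYPTION_KEYS format.
key = secrets.token_hex(32)
print(f"APPLICATION_ENCRYPTION_KEYS={key}")
```

Store the generated key in your secret manager; rotating it invalidates previously encrypted secrets and tokens.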

Option A: SSO-based credentials

If your organization already uses Microsoft Entra SSO to log in to Entropy Data, the same Entra application can be used for the Power BI integration.

  1. In the Entra portal, open your Entropy Data SSO application and add the following redirect URI under Authentication → Web:

    https://entropy-data.example.com/powerbi/callback
    

    Only the host changes — the /powerbi/callback path must be exactly as shown.

  2. Under API permissions, add the delegated permissions listed in the Required permissions section below.

  3. In Entropy Data, navigate to Settings > Power BI (as an organization owner) and select Use SSO credentials.

  4. Save. The settings page displays the resolved application name from Microsoft Graph so you can confirm you are pointing at the right app.

Option B: Custom Entra application

When Entropy Data SSO does not use Microsoft, or you prefer a dedicated application for the Power BI integration, register a new Entra application.

  1. In the Entra portal, Register a new application. Give it a descriptive name like Entropy Data — Power BI.

  2. Under Authentication → Web, add the redirect URI:

    https://entropy-data.example.com/powerbi/callback
    

    Only the host changes — the /powerbi/callback path must be exactly as shown.

  3. Under Certificates & secrets, create a new Client secret and note the value — it is shown only once.

  4. Under API permissions, add the delegated permissions listed below and grant admin consent for your tenant.

  5. In Entropy Data, navigate to Settings > Power BI (as an organization owner) and select Use custom credentials. Enter the Tenant ID, Client ID, and Client Secret of your new application, then click Test connection to verify.

  6. Save. Credentials are encrypted at rest and the client secret is never returned from the API after save.

Required permissions

The Entra application needs the following delegated Microsoft API permissions:

| API | Permission | Purpose |
| --- | --- | --- |
| Power BI Service | Workspace.Read.All | List the user's Fabric workspaces |
| Power BI Service | Dataset.ReadWrite.All | Create and update semantic models |
| Power BI Service | Item.ReadWrite.All | Create Fabric items (semantic models as Fabric items) |
| Power BI Service | Connection.ReadWrite.All | Create Fabric connections for new data sources |
| Microsoft Graph | offline_access | Refresh tokens so users do not need to reconnect every session |

offline_access is required so Entropy Data can refresh the user's access token in the background. Without it, the integration works only until the first access token expires (roughly an hour after connecting) and then fails.
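The background refresh goes against the Microsoft identity platform v2.0 token endpoint. The sketch below builds that request; all parameter values are placeholders, and the scope shown is the standard delegated Power BI scope, which is an assumption about Entropy Data's internals.

```python
from urllib.parse import urlencode

def build_refresh_request(tenant_id: str, client_id: str,
                          client_secret: str, refresh_token: str):
    """Build the token-refresh request for the Microsoft identity
    platform v2.0 endpoint. All argument values are placeholders."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "refresh_token",
        "refresh_token": refresh_token,
        "client_id": client_id,
        "client_secret": client_secret,
        # Delegated Power BI scope; offline_access keeps refresh tokens flowing.
        "scope": "https://analysis.windows.net/powerbi/api/.default offline_access",
    })
    return url, body

url, body = build_refresh_request("11111111-2222-3333-4444-555555555555",
                                  "app-client-id", "app-client-secret",
                                  "rt-example")
```

The response contains a fresh access token and, because offline_access was granted, a new refresh token to store (encrypted) for the next cycle.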

Connecting your Microsoft account

Each user connects once by clicking Connect to Power BI inside the Power BI modal on an output port. Users can disconnect at any time via their profile (Your Profile -> Power BI connections); disconnecting revokes the user's stored tokens.

Power BI Connect

The consent dialog is shown on the initial connect. A Reconnect link in Your Profile explicitly re-prompts for consent, which is useful if new permissions were added to the Entra application.

Entra ID Consent

Creating a semantic model

If the output port is not connected to a semantic model yet, a new semantic model will be created.

On a data product page, open a Snowflake or Databricks output port and choose Power BI from the Open in dropdown.

Power BI modal

The modal walks through:

  • Workspace — one of your Fabric workspaces.
  • Semantic model name — defaults to the output port name.
  • Connection — select an existing Fabric connection.

Creation runs asynchronously. A spinner indicates that the creation is in progress. The resulting semantic model opens in Power BI on success, showing the tables and columns generated from the output port's Data Contract.

Published Power BI semantic model in the Fabric model view

Each successful publish creates a consumer data product plus an approved data-usage agreement in Entropy Data, so the relationship between the source output port and the downstream Power BI model appears in lineage.

The sequence below shows what the single Open in Power BI click actually does behind the scenes — from loading the user's Fabric connections, through TMDL generation and semantic-model creation, to registering the result as a governed consumer Data Product in Entropy Data.

Power BI publish flow

Opening an existing semantic model

Once a semantic model has been created, you can open it again from either side of the lineage:

  • From the semantic model data product — click the location link on the output port to jump straight to the model in Power BI.
  • From the source data product — choose Open in → Power BI. Any existing connections to Power BI semantic models are listed, so you can reopen an existing one instead of creating another model.

Limitations

  • Only Snowflake and Databricks are supported as Fabric connection sources today.
  • The integration creates new semantic models; it does not update existing ones with the same name. Clicking publish repeatedly while a job is still in flight is collapsed into a single publish, but re-publishing after completion produces a fresh model with a name suffix.
  • Power BI itself allows duplicate display names inside a workspace. Choose distinct names if you intend to publish the same output port multiple times.
  • Per-user tokens expire after ~1 hour and are refreshed automatically so long as offline_access is granted.