Introduction - Power BI Data Prep & Dataflows
[Diagram: digital transformation outcomes - empowered employees, intelligent customer engagement, transformed products, and optimized operations - built on employees, customers + products, and data and intelligence]
The challenges of modern business intelligence
• Fragmented, incomplete data: pulling together data from traditional and cloud data sources, and figuring out how to enrich it, is extremely difficult.
• Requires a team of specialists: creating end-to-end (E2E) BI solutions requires multiple BI tools. This requires specific knowledge of each of the tools and complex integration to build and maintain an E2E BI solution.
• Data preparation is the most time-consuming task in analytics and BI, demanding complex system design and architecture.
[Diagram: the layers of a BI solution - data visualization, BI modeling, and data prep]
Dataflows
• Standard schema (Common Data Model)
• Data reuse
• Integral part of the Power BI stack
[Diagram: Power BI stack - Datasets (BI models), Dataflows (data prep), Storage (data in Azure Data Lake)]
[Diagram: Sales data dataflow → Sales with service calls dataflow → Power BI app]

1. Sales data (dataflow)
• Create dataflows with the dataflow editor in Power BI.
• Perform transformations and data cleansing using Power Query Online.
• Map data to the Common Data Model (see the sketch after this list).

2. Sales with service calls (dataflow)
• Reuse data from others in your department or other departments.
• Perform in-lake computations.
• Keep data consistent with smart recalc logic.

3. Power BI app
• Connect with Power BI Desktop to create models, reports, and dashboards using dataflow data.
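Each dataflow stores its output as a CDM folder in the data lake: a model.json manifest describing the entities mapped to the Common Data Model, plus CSV partition files holding the rows. As a minimal Python sketch (the folder and file names below are hypothetical), the manifest can be inspected like this:

import json
from pathlib import Path

# A dataflow's output is a CDM folder: a model.json manifest plus CSV
# partition files. The path below is a hypothetical local copy.
model_path = Path("SalesDataflow/model.json")

with model_path.open(encoding="utf-8") as f:
    model = json.load(f)

print(f"Dataflow: {model.get('name')}")
for entity in model.get("entities", []):
    print(f"Entity: {entity['name']}")
    # Attributes carry the schema (column name + data type).
    for attr in entity.get("attributes", []):
        print(f"  {attr['name']}: {attr.get('dataType')}")
    # Partitions point at the CSV files that hold the entity's rows.
    for part in entity.get("partitions", []):
        print(f"  partition -> {part.get('location')}")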
Azure integration
[Diagram: Power BI stack (Datasets, Dataflows, Storage) with your data stored in Azure]
Data + AI professionals can use the full power of the Azure Data Platform.
[Diagram: reports & dashboards and datasets in Power BI, built on shared storage]
Configure Power BI to work with your organization's Azure Data Lake Storage:
1. Provide Power BI with details of the Data Lake Storage account.
2. Enable workspace admins to assign workspaces to the storage account.
3. Workspace admins can assign workspaces to the storage account; dataflows and dataflow data will then be stored there (see the sketch after this list).
[Diagram: business analysts work low/no code on reports & dashboards and datasets; data engineers and data scientists work low-to-high code against the same storage. Data + AI professionals can use the full power of the Azure Data Platform.]
Enrich data using Azure services
[Diagram: the "Sales with service call center" dataflow is enriched in storage by Azure services]
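For example, a data engineer or data scientist could join the sales entity with service-call data and write an enriched entity back to the lake for a downstream dataflow or dataset to pick up. A minimal pandas sketch with hypothetical entity, file, and column names:

import pandas as pd

# Hypothetical CSV partitions exported by two dataflow entities.
sales = pd.read_csv("Sales/Sales.csv")
calls = pd.read_csv("ServiceCalls/ServiceCalls.csv")

# Enrich sales with a per-customer service-call count.
call_counts = (calls.groupby("CustomerId")
                    .size()
                    .rename("ServiceCallCount")
                    .reset_index())
enriched = sales.merge(call_counts, on="CustomerId", how="left")
enriched["ServiceCallCount"] = enriched["ServiceCallCount"].fillna(0).astype(int)

# Write the enriched entity back for Power BI to consume.
enriched.to_csv("SalesWithServiceCalls/SalesWithServiceCalls.csv", index=False)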
Use case: Dynamics 365 Finance & Operations data in Power BI
[Diagram: Dynamics 365 for Finance & Operations data flows through dataflows into storage, enabling self-service customizations in Power BI]
Data + AI professionals can use the full power of the Azure Data Platform.
[Diagram: business analysts (low/no code) build reports & dashboards and datasets; data engineers, data scientists, and developer resources (low-to-high code) build custom LOB solutions on the same data]
Programmability

1. Power BI APIs
• We've added new web APIs to Power BI to help enable many scenarios involving dataflows.
• You can now programmatically import Power BI dataflows or add a reference to external dataflows in Azure storage.
• You can connect datasets to dataflows, refresh dataflows, update their refresh schedules, and more (see the sketch after this list).

2. Model file SDK
• A variety of libraries and sample code will be available to enable programming scenarios.
• You can validate, create, or read CDM model files and data programmatically.
• You can easily read CDM model files from Azure Databricks or write Databricks tables to dataflows.

3. Documentation
• Documentation for the APIs, model file creation, and working with dataflows will be available at launch.
• Step-by-step guides for key dataflow scenarios will be available as well.
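As a rough illustration of the API scenarios above, the sketch below lists the dataflows in a workspace and requests a refresh through the Power BI REST API; the access token, workspace ID, and notify option are placeholders you would supply for your own tenant:

import requests

# Placeholders: an Azure AD access token for the Power BI service and the
# target workspace (group) ID.
TOKEN = "<access-token>"
GROUP_ID = "<workspace-id>"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}
BASE = f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"

# List the dataflows in the workspace.
dataflows = requests.get(f"{BASE}/dataflows", headers=HEADERS).json()["value"]
for df in dataflows:
    print(df["objectId"], df["name"])

# Request a refresh of the first dataflow.
dataflow_id = dataflows[0]["objectId"]
resp = requests.post(f"{BASE}/dataflows/{dataflow_id}/refreshes",
                     headers=HEADERS,
                     json={"notifyOption": "NoNotification"})
resp.raise_for_status()
print("Refresh requested, status", resp.status_code)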
Questions?