New announcement (2023-05-23) about Microsoft's single place for analytics; it's essentially a Databricks-like offering, except built from the Azure tools without needing to go into Azure (see more below). See the announcement Introducing Microsoft Fabric and Copilot in Microsoft Power BI.
They also announced Git integration for some Power BI artifacts (RW Introduction to git integration). But it's not there yet, as they still exclude most artifact types. It will probably take a long time until it's ready; as usual, MS products carry lots of visual metadata inside the files as well (same as in the SSAS model builder, where everything is XML with lots of IDs).
These features are possible thanks to an open data lake table format, Delta Lake, which they packaged into MS OneLake with more security and integration (lineage, etc.) on top of it. Similar to what Databricks is doing with notebooks and the Lakehouse.
You can combine different query languages and compute engines. They use Spark under the hood for some of it, but for T-SQL they might use a serverless SQL Server or Analysis Services, depending on what you need.
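A rough sketch of why one open table format lets multiple engines share data: a Delta table is essentially Parquet data files plus a JSON transaction log that any engine can replay. The sketch below is a heavily simplified toy (file names and structure are invented for illustration; real Delta Lake has protocol/metadata/remove actions, checkpoints, and much more):

```python
import json
import os
import tempfile

# Toy sketch of a Delta-style transaction log: a table directory holds
# data files plus a _delta_log of ordered JSON commit files.
# (Simplified/hypothetical; NOT the real Delta Lake protocol.)

def commit_add(table_dir: str, version: int, data_file: str, rows: int) -> None:
    """Append one commit file that registers a new data file in the table."""
    log_dir = os.path.join(table_dir, "_delta_log")
    os.makedirs(log_dir, exist_ok=True)
    commit = {"add": {"path": data_file, "numRecords": rows}}
    with open(os.path.join(log_dir, f"{version:020d}.json"), "w") as f:
        json.dump(commit, f)

def live_files(table_dir: str) -> list:
    """Replay the log in commit order to find the table's current data files."""
    log_dir = os.path.join(table_dir, "_delta_log")
    files = []
    for name in sorted(os.listdir(log_dir)):
        with open(os.path.join(log_dir, name)) as f:
            action = json.load(f)
        files.append(action["add"]["path"])
    return files

table = tempfile.mkdtemp()
commit_add(table, 0, "part-000.parquet", rows=100)
commit_add(table, 1, "part-001.parquet", rows=50)
print(live_files(table))  # both files are visible to any engine replaying the log
```

Because the "truth" is just files plus a log in open formats, Spark, a serverless SQL engine, and Analysis Services can all sit on top of the same storage.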
# Comments on Simon Whiteley's Video
See more from Simon Whiteley on Advancing Fabric - What is Microsoft Fabric? - YouTube.
Some comments and notes:
- it’s a SaaS service on top of Synapse, Power BI and more, integrated under a new name, which is Fabric
- what is different is the communication: They talk about Fabric the same way they talk about MS Excel.
- Spark in MS Synapse is now called Synapse Data Engineering
- His comment: it all looks nice, but will it be seamlessly integrated?
- There is no Azure, there is only SQL, Delta, and Spark. Logging into Fabric means logging into Power BI. It’s Software as a Service, not Azure. Wow!
# Lakehouse reference
Microsoft Fabric is basically a Lakehouse implementation by Microsoft. Others have their own: Databricks with Delta Lake, Onehouse with Hudi (calling theirs OneTable), Dremio building one on Iceberg, and Snowflake getting more open with Apache Iceberg.
# Declarative Metrics or Business Logic
Yeah, that’ll most likely be your Power BI models. You could have logical layers in either Lakehouse or Warehouse artifacts, but your metrics will be Power BI measures directly on top of the Delta table objects.
Mostly, it lives in the transformations of the ETL/ELT jobs/dataflows/pipelines between the Delta tables, similar to the Medallion Architecture known from the Databricks side.
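A toy sketch of where that transformation logic sits in a medallion-style flow, with plain Python lists standing in for Delta tables (table names and cleaning rules are invented for illustration, not Fabric APIs):

```python
# Toy medallion flow: bronze (raw) -> silver (cleaned) -> gold (business).
# Plain dicts/lists stand in for Delta tables; names/rules are illustrative.

bronze_orders = [  # raw ingested records, duplicates and bad rows included
    {"order_id": 1, "amount": "100.0", "country": "DE"},
    {"order_id": 1, "amount": "100.0", "country": "DE"},  # duplicate
    {"order_id": 2, "amount": "-5.0", "country": "FR"},   # invalid amount
    {"order_id": 3, "amount": "40.0", "country": "DE"},
]

def to_silver(rows):
    """Deduplicate, cast types, and drop invalid rows (the ELT business rules)."""
    seen, silver = set(), []
    for r in rows:
        if r["order_id"] in seen:
            continue
        amount = float(r["amount"])
        if amount <= 0:
            continue
        seen.add(r["order_id"])
        silver.append({**r, "amount": amount})
    return silver

def to_gold(rows):
    """Aggregate to the business presentation level: revenue per country."""
    gold = {}
    for r in rows:
        gold[r["country"]] = gold.get(r["country"], 0.0) + r["amount"]
    return gold

gold_revenue = to_gold(to_silver(bronze_orders))
print(gold_revenue)  # {'DE': 140.0}
```

In Fabric, each step would be a Spark notebook, dataflow, or T-SQL pipeline writing the next Delta table rather than a Python function.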
The business logic spreads over several products (Factory, Data Engineering, Synapse DWH, etc.) in different languages (Spark, T-SQL, KQL).
You can also put business logic into Power BI data models (DAX measures, calculation groups, etc.), but then it should be handled like a data mart.
The truth should be persisted in the Delta tables (with Parquet as the underlying format), which is analogous to the Gold layer of the Medallion architecture in Databricks’ logic (= the well-structured business presentation layer).
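As a toy analogy for a metric defined on top of such a gold table: a measure is an aggregation evaluated in whatever filter context the report applies. Pure Python stands in for DAX here, and the table/measure names are invented for illustration:

```python
# Toy analogy only: a "measure" is business logic evaluated over a gold
# table under a filter context (here an optional predicate). This is not
# DAX and not a Fabric API; names are invented.

gold_sales = [
    {"country": "DE", "amount": 140.0},
    {"country": "FR", "amount": 60.0},
]

def total_sales(rows, keep=lambda r: True):
    """Like a DAX measure: an aggregation that respects the filter context."""
    return sum(r["amount"] for r in rows if keep(r))

print(total_sales(gold_sales))                                  # 200.0
print(total_sales(gold_sales, lambda r: r["country"] == "DE"))  # 140.0
```

The point: the persisted gold table is the single source of truth, and the measure layer on top only aggregates and filters it.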
Maybe the Tabular Model Definition Language (TMDL) is any good; at least it’s human-readable.