By Jesse Arora

Power BI Composite Model: A No-ETL Business Intelligence Solution

#PowerBI #MODBI #CompositeModel

Gone are the days when you had to use complex ETL tools and hire ETL experts to clean data for reporting. With the latest innovations in the BI world, you can combine and refresh data from almost any source, such as Excel, the web, or SharePoint, together with any on-premises or cloud database, without writing a single ETL package. The result is the fastest possible time to deployment, along with the added advantage of agility: if a business formula or calculation changes, you can update it quickly in the data model without having to modify the DW structure or the ETL. With a composite data model you can keep all data at the source, or refresh only what is needed in the model instead of refreshing everything.

One good strategy is to refresh summary data in the model and leave the detailed drill-through data at the sources. Various products are emerging in this space, but Power BI Premium and Azure Analysis Services are the most popular. Power BI/SSAS composite data models now let you combine multiple DirectQuery sources, and even mix DirectQuery and import sources in the same model; they also support many-to-many relationships natively.

So does that mean you no longer need a data warehouse? Well, that depends:

1. Do you need to keep any additional history that your source system does not keep?

2. Is your data bigger than what Azure Analysis Services or Power BI Premium can hold in the cache and refresh within your SLA?

3. How complex are your transformations? A composite data model lets you mix and match DirectQuery sources with import sources, but how do those complex transforms perform? Are they meeting user expectations of split-second response times?

Azure Analysis Services current capacity per instance:

Tier S9: 400 GB compressed data (~4 TB uncompressed, estimated)

Power BI Premium current capacity per instance:

Tier P3: 100 GB compressed data (~1 TB uncompressed, estimated)

So if you do not need to persist the data, the total data size for a particular data mart fits within these capacity tiers, and your transformations are not so complex that they perform poorly, then you can skip the data warehouse and ETL platform altogether and use advanced DAX to transform data and build calculations.
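As a rough sizing check, the tier figures above imply roughly a 10:1 compression ratio. Here is a minimal sketch of that back-of-the-envelope math; the tier limits come from the figures above, while the 10:1 ratio and the example data-mart size are illustrative assumptions (real compression varies with cardinality and data types):

```python
# Rough capacity check for a data mart against in-memory tier limits.
# Tier limits (GB of compressed data) come from the figures above; the
# 10:1 compression ratio is an assumed estimate, not a guarantee.

TIER_LIMITS_GB = {
    "AAS S9": 400,       # Azure Analysis Services S9: 400 GB compressed
    "Power BI P3": 100,  # Power BI Premium P3: 100 GB compressed
}

ASSUMED_COMPRESSION_RATIO = 10  # e.g. ~4 TB raw -> ~400 GB compressed

def fits_in_tier(raw_size_gb: float, tier: str) -> bool:
    """Return True if the estimated compressed size fits within the tier."""
    compressed_gb = raw_size_gb / ASSUMED_COMPRESSION_RATIO
    return compressed_gb <= TIER_LIMITS_GB[tier]

# Example: a hypothetical 2 TB (2048 GB) raw data mart.
print(fits_in_tier(2048, "AAS S9"))       # ~205 GB compressed -> True
print(fits_in_tier(2048, "Power BI P3"))  # ~205 GB compressed -> False
```

If the answer is False for every tier you can afford, that is a strong signal you still need a data warehouse layer to pre-aggregate or persist the data.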

Note: if you are going to DirectQuery or import data multiple times a day, you might want to extract data from a real-time replica rather than the live application database, so you don't end up impacting application performance during refresh. You can also schedule incremental refreshes instead of full refreshes, depending on the size of the data and the type of changes. Azure Analysis Services also has partitioning, which can speed up data processing even further. Along with partitioning, Azure Analysis Services supports automatic replica synchronization, which is useful if you are refreshing data very frequently, say every 15 minutes, so BI users are not impacted during each refresh.
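The idea behind combining incremental refresh with partitioning can be sketched as follows. This is only an illustration of the concept: the partition naming scheme (`FactSales_YYYY-MM`) and the "only the last few months change" policy are assumptions for this example, and in Power BI/Azure Analysis Services you configure incremental refresh rather than hand-coding it:

```python
from datetime import date

# Sketch of incremental refresh over monthly partitions: instead of
# reloading the whole table, only reprocess the partitions that may
# have changed (the current month plus a small trailing window).
# Partition names and the trailing-window policy are illustrative
# assumptions, not a Power BI/AAS API.

def partitions_to_refresh(today: date, months_back: int = 2) -> list:
    """Return partition names (FactSales_YYYY-MM) covering the current
    month and the previous `months_back` months."""
    names = []
    year, month = today.year, today.month
    for _ in range(months_back + 1):
        names.append(f"FactSales_{year:04d}-{month:02d}")
        month -= 1
        if month == 0:
            month, year = 12, year - 1
    return names

print(partitions_to_refresh(date(2020, 1, 15)))
# ['FactSales_2020-01', 'FactSales_2019-12', 'FactSales_2019-11']
```

Refreshing three monthly partitions instead of, say, five years of history is what keeps a 15-minute refresh cadence feasible without hammering the source system.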
