scrimpton
Frequent Visitor

Power BI - Multiple Reports, One Semantic Model

I know that the ideal approach is to source multiple reports from one model, mainly for easier maintenance and to avoid redundant data and refreshes.

 

But what about an extreme scenario, such as 30 reports sourcing from one model (users can create their own reports)? Am I correct to assume there's a high chance of hitting the memory allocation per model if users access these reports concurrently while the data model is refreshing? For context, some of the reports have several visuals (particularly large table visuals).

 

1 ACCEPTED SOLUTION
nandic
Memorable Member

@scrimpton The size of the dataset and the refresh duration are very important.
If the dataset refresh takes 5-10 minutes, some reports may get a little slower, but only during a short peak.
On shared capacity (not Premium capacity), the maximum dataset size is 1 GB.
On Premium capacity you can upload datasets larger than 1 GB, and with one dataset and 30 concurrent users I doubt you will run into issues.

Where you might run into issues is very complicated DAX: even a small dataset can then have a huge impact on all reports.
DAX calculated tables and columns affect dataset refresh performance.
DAX measures do not affect dataset refresh performance, but they do affect report performance.

So, best practices:
1) Keep the data model clean and free of complex calculations.
2) Try to refresh during off-peak hours.
3) Test report performance with Performance Analyzer.
4) On Premium capacity, you can install the Fabric Capacity Metrics app to see which datasets are slow and at which hours datasets/reports slow down.
5) One additional important point: reports are rendered in the user's browser. On a fast laptop everything may look fine, but on a machine with 4 GB of RAM and a slow processor, a report with a lot of data will definitely be slow on the user's side.
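As a sketch of point 2, a refresh can also be triggered programmatically outside business hours through the Power BI REST API's dataset refresh endpoint. This is a minimal Python sketch, not the only way to schedule refreshes: the group/dataset IDs and the access token are placeholders you would supply, and the off-peak window (22:00-06:00 UTC) is an assumption for illustration.

```python
from datetime import datetime, timezone
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg"

def refresh_url(group_id: str, dataset_id: str) -> str:
    # POST to this endpoint starts an asynchronous dataset refresh
    return f"{API}/groups/{group_id}/datasets/{dataset_id}/refreshes"

def is_off_peak(hour: int, start: int = 22, end: int = 6) -> bool:
    # Off-peak window wraps past midnight: 22:00 .. 06:00
    return hour >= start or hour < end

def trigger_refresh(group_id: str, dataset_id: str, token: str) -> None:
    # Only fire the refresh outside business hours
    if not is_off_peak(datetime.now(timezone.utc).hour):
        return
    req = urllib.request.Request(
        refresh_url(group_id, dataset_id),
        data=b"{}",
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",   # token acquisition not shown
            "Content-Type": "application/json",
        },
    )
    urllib.request.urlopen(req)  # raises HTTPError if the refresh is rejected
```

In practice you would run this from a scheduler (or simply use the service's built-in scheduled refresh); the point is that the refresh window is under your control.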

 

Cheers,
Nemanja Andic


4 REPLIES

scrimpton
Frequent Visitor

Thanks for your reply. My personal license is Pro; I'm not sure about the company level (but I'm guessing Premium capacity).

 

Why do you ask? I assume that as the licensing tier goes up, the memory limit increases? But based on your answer, I'm guessing my assumption below is correct?

But what if we have an extreme scenario, like 30 reports sourcing from one model (users can create their own reports). Am I correct to assume that there's a high chance that it will hit the memory allocation per model if these reports are concurrently accessed by the users and the data model is refreshing at the same time?  To give more context, some of the reports have several visuals (particularly big table visuals).

Hello @nandic, thank you for your help with this problem. I'd like to share my solution below.
Hi @scrimpton,

Regarding the issue you raised, my solution is as follows:

1. As the license level increases, so does the size limit for refreshable content in the workspace.

2. Assuming you need 30 reports sourced from one model and need to refresh it, we recommend the following steps:

First, optimize the semantic model as much as possible: remove unnecessary columns and tables, optimize DAX expressions, and use aggregations to reduce the amount of data processed and stored in memory.

Second, as Mr. Nemanja Andic said, try to refresh during off-peak hours. If users access these reports while the data model is refreshing, you could reach the memory allocation for the model, which can cause the refresh to fail or reports to lag.

Finally, depending on your needs, consider scaling up capacity. Power BI Premium and Power BI Embedded do not impose a cumulative memory limit, so refreshing semantic models at the same time does not in itself cause resource constraints. However, each individual semantic model refresh is limited by the capacity's available memory and CPU, as well as the model refresh parallelism of the SKU.
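To see whether individual refreshes are approaching those limits, the refresh history that the Power BI REST API returns (GET .../datasets/{datasetId}/refreshes) can be scanned for long-running entries. A small Python sketch, assuming history items carry the documented startTime/endTime/status fields; the 10-minute threshold is an arbitrary example:

```python
from datetime import datetime

def refresh_minutes(entry: dict) -> float:
    # entry mirrors one item from GET .../datasets/{datasetId}/refreshes;
    # timestamps look like "2024-04-01T22:00:00.000Z", so keep the first
    # 19 characters and parse them as a naive datetime
    fmt = "%Y-%m-%dT%H:%M:%S"
    start = datetime.strptime(entry["startTime"][:19], fmt)
    end = datetime.strptime(entry["endTime"][:19], fmt)
    return (end - start).total_seconds() / 60

def slow_refreshes(history: list, threshold_min: float = 10) -> list:
    # flag completed refreshes that exceeded the threshold
    return [e for e in history
            if e.get("status") == "Completed"
            and refresh_minutes(e) > threshold_min]
```

Refreshes that consistently exceed your threshold are candidates for the model optimizations above or for a larger SKU.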

3. The reference documentation is below:

What is Power BI Premium? - Power BI | Microsoft Learn

 Pricing & Product Comparison | Microsoft Power BI

Best Regards,

Leroy Lu

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

nandic
Memorable Member

Which license model do you use:
1) pro
2) premium per user
3) premium capacity

What is the size of the semantic model, and how long does it take to refresh?

The best practice is to link multiple reports to one semantic model and to refresh this semantic model before users start using reports (during off-peak hours).
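One way to check how your workspace is actually organized is that the Power BI REST API's report listing (GET /groups/{groupId}/reports) includes each report's datasetId, so grouping by it shows how many reports share one semantic model. A minimal Python sketch over that response shape; the sample report records in the usage below are hypothetical:

```python
from collections import Counter

def reports_per_model(reports: list) -> Counter:
    # each report dict mirrors one item from GET /groups/{groupId}/reports,
    # which includes the datasetId the report is bound to
    return Counter(r["datasetId"] for r in reports)
```

A model with many bound reports is exactly the one worth refreshing during off-peak hours, since a slow refresh there touches the most users.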

Cheers,
Nemanja Andic
