Channel: Service topics

Sanity Check on "Resource Governing: This operation was canceled because there wasn't enough memory"


Good day all,

First, know that I'm new to the Power BI space and am currently troubleshooting "not enough memory" errors that occur during a semantic model refresh.

I've been looking at this for over a week and wanted a sanity check from the community.

Issue: During scheduled refreshes of one of our production workspace models, we consistently receive "Resource Governing: not enough memory" errors.

Additional data:

  • The same semantic model refresh always works in the Stage workspace. Interestingly, the Stage source SQL DB is bigger than the PROD SQL DB! Also, both Stage and PROD refresh fine in Power BI Desktop.
  • The model is set to "Large semantic model storage format": "On".
  • Using DAX Studio, I collected metrics for both the Stage and PROD models when fully refreshed. PROD size: 928.74 MB; Stage size: 836.95 MB.
  • This suggests to me that although the Stage SQL DB is larger overall, there is more relevant data in PROD, resulting in a larger model.
  • Power BI License capacity SKU: A2.
  • We are using incremental refreshes with partitions by quarter.
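For reference, one mitigation I'm considering is kicking off an enhanced refresh through the Power BI REST API with reduced parallelism, so partitions are processed one at a time instead of concurrently (concurrent partition processing raises peak memory during refresh). This is only a sketch: the table names, workspace/dataset IDs, and token acquisition are placeholders, and it assumes an Azure AD token with the `Dataset.ReadWrite.All` scope.

```python
# Sketch: trigger a Power BI "enhanced refresh" with maxParallelism=1 to
# lower peak memory during refresh. IDs, table names, and the token are
# placeholders -- adapt to your tenant before use.
import json
import urllib.request


def build_refresh_body(tables, max_parallelism=1):
    """Build the request body for the enhanced-refresh /refreshes endpoint."""
    return {
        "type": "full",
        "commitMode": "transactional",     # commit only if all objects succeed
        "maxParallelism": max_parallelism,  # process one object at a time
        "retryCount": 1,
        "objects": [{"table": t} for t in tables],
    }


def start_refresh(group_id, dataset_id, token, body):
    """POST the refresh request; the service answers 202 Accepted on success."""
    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
           f"/datasets/{dataset_id}/refreshes")
    req = urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    return urllib.request.urlopen(req)


# Hypothetical table names -- substitute the model's actual tables.
body = build_refresh_body(["FactSales", "DimDate"])
```

If the refresh succeeds serialized but fails with the default parallelism, that would point at peak refresh memory (not final model size) as the culprit.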

So, why would the model refresh successfully everywhere except the PROD workspace on the Power BI service?

Thanks in advance!

Paul

