
Direct Lake Semantic Model fails to load due to supposed Disk Size Limit

We have a semantic model in the Power BI service that connects to our Lakehouse in the same workspace via Direct Lake. When I tried to open the data model today, the following error message was displayed:

(screenshot: rvp_ordix_0-1738657789613.png)

Reports that connect to the model also fail to load, although the SQL endpoint the model is built on works fine. The blacked-out database ID is the ID of the model itself, not of the underlying Lakehouse.

The claim that this model "exceeds the maximum size limit on disk" is very confusing: according to the Direct Lake page on Microsoft Learn, even an F2 capacity has a size-on-disk limit of 10 GB for Direct Lake models. We use an F16, so our limit should be 20 GB.

Max Memory is the only limit we could plausibly be reaching, but even then, neither Size On Disk nor Max Memory should block you from working with a Direct Lake model when exceeded. According to the documentation, they merely impact performance.
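If the XMLA endpoint still answers queries, one way to sanity-check the size claim is to sum the column-segment sizes the engine reports via the DAX `INFO.STORAGETABLECOLUMNSEGMENTS()` function. A rough sketch using the semantic-link (`sempy`) package in a Fabric notebook; `"MyModel"` is a placeholder name, this assumes the model can still be queried at all, and note this measures resident segment size, not size on disk:

```python
import pandas as pd

# In a Fabric notebook with the semantic-link package installed:
#   import sempy.fabric as fabric
#   segments = fabric.evaluate_dax(
#       "MyModel",  # placeholder: your semantic model name
#       "EVALUATE INFO.STORAGETABLECOLUMNSEGMENTS()",
#   )

def total_segment_gb(segments: pd.DataFrame) -> float:
    """Sum the engine-reported segment sizes (bytes) and convert to GB."""
    # The DMV-style result exposes a USED_SIZE column per column segment;
    # the exact column label can vary (e.g. "[USED_SIZE]"), so match loosely.
    col = next(c for c in segments.columns if "USED_SIZE" in c.upper())
    return float(segments[col].sum()) / 1024**3
```

Comparing that number against the capacity's guardrail at least tells you whether the error message is plausible or spurious.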

 

Normally I would have removed tables from the model to get it back under that "limit" and figured out the cause afterwards, but I can't even do that while the data model is inaccessible.
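For what it's worth, the web modeling view is not the only way to drop tables: the XMLA endpoint often still accepts TMSL scripts even when the model won't open in the browser. A hedged sketch that builds a standard TMSL `delete` command; `"MyModel"` and `"BigTable"` are placeholders, and actually sending it requires XMLA read/write to be enabled on the capacity (e.g. via SSMS, Tabular Editor, or `sempy.fabric.execute_tmsl` in a Fabric notebook):

```python
import json

def tmsl_delete_table(database: str, table: str) -> str:
    """Build a TMSL script that deletes one table from a semantic model."""
    return json.dumps(
        {"delete": {"object": {"database": database, "table": table}}},
        indent=2,
    )

script = tmsl_delete_table("MyModel", "BigTable")  # placeholder names
# Send via any XMLA-capable client, e.g. in a Fabric notebook:
#   import sempy.fabric as fabric
#   fabric.execute_tmsl(script)
print(script)
```

Whether the engine accepts the script while the model is in this error state is exactly the open question, but it is worth trying before resorting to deleting and redeploying the model.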

 

What is happening, and how can I fix this?

