Hey everyone. I work in an environment with about 14 workspaces, each averaging 3-4 reports with a matching semantic model, plus 2-3 workspaces with 10+ reports. What has me confused and worried is that refreshes sometimes take 30-50 minutes, even though 90% of these datasets are under 1 GB. We use scheduled refresh, and everything routes through two gigantic servers (which, as far as I know, have far more resources than this workload should need). We're on Power BI Premium and really hadn't had many problems until the new year, but it has now become unbearable.

To my knowledge (I built 95% of these reports and models), there isn't any heavy DAX or ETL logic in these models that would slow them down that much. Most reports just import 10-20 MB worth of SQL tables. I'm sure I'm not covering every relevant detail, but I don't know what else to add at the moment. Please ask and I'll answer whatever helps. Or someone can tell me that 10 MB SQL tables are supposed to take 50 minutes to refresh, which I doubt.
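In case it helps anyone diagnose, here's a rough sketch of how I could quantify the "30-50 minutes" claim from the refresh history that the Power BI REST API exposes per dataset (`GET .../datasets/{id}/refreshes`). The JSON below is made-up sample data in that response's general shape, not real output from our tenant:

```python
import json
from datetime import datetime

# Hypothetical sample in the shape of the "Get Refresh History" response;
# the timestamps are invented for illustration.
sample = json.loads("""
{
  "value": [
    {"refreshType": "Scheduled", "status": "Completed",
     "startTime": "2024-01-08T06:00:00Z", "endTime": "2024-01-08T06:47:30Z"},
    {"refreshType": "Scheduled", "status": "Completed",
     "startTime": "2024-01-07T06:00:00Z", "endTime": "2024-01-07T06:09:12Z"}
  ]
}
""")

def duration_minutes(entry):
    """Minutes elapsed between a refresh entry's startTime and endTime."""
    fmt = "%Y-%m-%dT%H:%M:%SZ"
    start = datetime.strptime(entry["startTime"], fmt)
    end = datetime.strptime(entry["endTime"], fmt)
    return (end - start).total_seconds() / 60

durations = [round(duration_minutes(e), 1) for e in sample["value"]]
print(durations)  # [47.5, 9.2]
```

Logging a week of these per model would at least show whether the slowdowns are every refresh or only at certain times of day (which would point at the servers being saturated rather than the models themselves).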
Thanks ahead of time.
Semantic Model Refresh Slow and Buggy for a Large Number of Models