Hi guys,
I am currently working on a project for a client. They have an Azure SQL DW that stores data from numerous sensors in a building they own. We are talking about several hundred million rows. When I try to set up a DirectQuery connection to this DW, either from the PBI Service or Desktop, the query time is terrible. I can easily connect to the DW, select tables, and even apply filters in the Query Editor. But once I start dragging measures and columns onto the canvas, the load time is ENDLESS, or the tables/charts fail outright.
My question is whether PBI is simply not fit to handle this amount of data, or whether indexing the tables or scaling up the DW would have any real effect.
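For context, this is roughly the kind of indexing/pre-aggregation I had in mind (table and column names are made up, just to illustrate):

```sql
-- Hypothetical sensor fact table; in Azure SQL DW a clustered
-- columnstore index is the default and usually the right choice
-- for large scan-heavy workloads like DirectQuery reports.
CREATE TABLE dbo.FactSensorReadings
(
    SensorId    INT           NOT NULL,
    ReadingTime DATETIME2(0)  NOT NULL,
    Value       DECIMAL(18,4) NOT NULL
)
WITH
(
    CLUSTERED COLUMNSTORE INDEX,
    DISTRIBUTION = HASH(SensorId)  -- spread rows across distributions
);

-- Pre-aggregated hourly rollup so the report doesn't have to
-- scan hundreds of millions of raw rows on every visual refresh.
CREATE TABLE dbo.FactSensorHourly
WITH
(
    CLUSTERED COLUMNSTORE INDEX,
    DISTRIBUTION = HASH(SensorId)
)
AS
SELECT
    SensorId,
    DATEADD(HOUR, DATEDIFF(HOUR, 0, ReadingTime), 0) AS ReadingHour,
    AVG(Value) AS AvgValue,
    MIN(Value) AS MinValue,
    MAX(Value) AS MaxValue,
    COUNT(*)   AS ReadingCount
FROM dbo.FactSensorReadings
GROUP BY SensorId, DATEADD(HOUR, DATEDIFF(HOUR, 0, ReadingTime), 0);
```

So would pointing the report at a rollup table like that be the way to go, or is DirectQuery a lost cause at this scale regardless?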
Do any of you have experience with this, and can you perhaps shed some light on the issue?
Thanks,
Casper