Hi, community.
We are facing behaviour/an error that we cannot explain with a semantic model, in import mode, connected to a Dataflow Gen 1 in a workspace running on a Fabric Capacity.
The scenario we are experiencing:
- Checking the Dataflow, all the data we need is there. The data is well under 1 GB (approx. 500 MB).
- Connecting Power BI Desktop to the Dataflow, random rows that we know exist in the Dataflow are missing from the dataset. Some of them are from 2025.
- Limiting the data in the Dataflow (changing the year filter to ingest from 2020 to 2024) and refreshing the Dataflow. Now all data from 2024 up until today is loaded into the semantic model. The dataset is now under 300 MB.
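For reference, the year filter change in the last step was along these lines (a sketch in Power Query M; `Source` and `Year` are placeholder names, not our actual query or column):

```
// Hypothetical filter step in the Dataflow query.
// Narrowing the ingested range from "2020 and later" to 2020-2024
// is what made the previously missing rows load correctly.
FilteredRows = Table.SelectRows(Source, each [Year] >= 2020 and [Year] <= 2024)
```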
Is there anything in how semantic models load data from a Dataflow Gen 1 that could explain this behaviour, when the dataset is clearly within the limits of both the semantic model and the Dataflow?
If you need any more info or have questions, please just ask!
Regards
Jon Terje Tjønn