When working with large datasets (close to 1 GB), what is the best way to update the model?
My dataset is supposed to be shared with others as a content pack, and it's connected via a gateway to an on-premises SQL Server database.
From time to time I will need to update the data model: adding new tables, creating new relationships, new columns, or new measures. Doing so with Power BI Desktop means publishing a ~1 GB file to the Power BI Service every time.
Is there a way to use tools like SSDT to connect to the Power BI Service, or otherwise publish just the data model (without any data), so that the full file doesn't have to be uploaded each time?
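To illustrate what I'm hoping for: if the model metadata could be deployed on its own, a refresh could then be triggered server-side so the gateway pulls the data from the on-prem SQL Server and nothing is re-uploaded from Desktop. A rough sketch of that refresh step using the Power BI REST API (the workspace/dataset IDs are placeholders, and the Azure AD token acquisition is elided):

```python
import requests

# Placeholders -- real values would come from the workspace and an
# Azure AD token acquisition flow (e.g., via MSAL), elided here.
GROUP_ID = "<workspace-id>"
DATASET_ID = "<dataset-id>"
ACCESS_TOKEN = "<aad-access-token>"

# Trigger a server-side refresh of the published dataset: the gateway
# pulls the data from the on-prem source, so no data leaves Desktop.
url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/datasets/{DATASET_ID}/refreshes"
)
resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"notifyOption": "MailOnFailure"},  # optional request body
)
resp.raise_for_status()  # 202 Accepted means the refresh was queued
print("Refresh queued:", resp.status_code)
```

The missing piece for me is the first step: getting the changed model definition (new tables, relationships, measures) up to the Service without the data.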