Map visual is not working in either Service or Desktop for the last few days
Hi there,
I am using Power BI and Purview and have been tasked with loading Purview metadata from an S3 data source in Purview into Power BI.
Is there a way to do this?
I am aware of the Purview Hub in Power BI, and ideally what I would like to do is add the S3 metadata into this Hub.
My apologies if this is not the right forum for this question.
Yesterday I posted about Q&A suddenly no longer working in some of our workspaces.
We now have a new issue: maps are not working in any of our workspaces! Yesterday they were working.
Furthermore, if I download the PBIX of a report connected to the same dataset into Power BI Desktop, the map loads!
Why is Microsoft breaking everything? We have paying customers looking at these reports and our hands are tied!
Can anyone help here?
Dear all, how are you?
Suddenly the maps in all of my organization's dashboards have disappeared. We use the Choropleth map visual, sometimes adding a tooltip from another page for more information; now it's all blank, and I cannot find another visual that lets us color states and show a tooltip from another page.
The dashboard was working perfectly, updating every night, but suddenly it stopped showing in the Service, so I went to Desktop and I cannot see it there either.
How can I get MDSCHEMA_CUBES in a DMV query to return the timestamps in EST? The results don't match the refresh history at all.
Does anyone know why it returns a different timestamp altogether? I need it to return the exact same timestamp as the dataset's refresh history.
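The LAST_DATA_UPDATE column that MDSCHEMA_CUBES returns is typically expressed in UTC, while the Service shows refresh history in your local time zone, which would explain the apparent mismatch. A minimal sketch of converting a UTC timestamp to US Eastern time in Python (the sample timestamp below is made up):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

def utc_to_eastern(ts_utc: datetime) -> datetime:
    """Convert a (naive or aware) UTC timestamp to US Eastern time."""
    if ts_utc.tzinfo is None:
        ts_utc = ts_utc.replace(tzinfo=timezone.utc)
    return ts_utc.astimezone(ZoneInfo("America/New_York"))

# Hypothetical UTC value like one MDSCHEMA_CUBES might return
last_update = datetime(2024, 1, 15, 18, 30, 0)  # 18:30 UTC
print(utc_to_eastern(last_update))  # 13:30 EST (UTC-5 in January)
```

Using a named zone rather than a fixed offset also handles the EST/EDT daylight-saving switch automatically.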
@jeffrey_wang @AlexisOlson @bcdobbs
Thank you in advance.
When I add a Power BI Pro license to my global admin account, I can view the standard Microsoft 365 usage analytics report.
When using my normal account with a Power BI Pro license, I am not able to view/update the report.
Can anyone tell me which role or authorization I should give to my normal account in order to make the report work?
Hi all - I'm truly stumped on this problem. I have experience paginating and inserting dynamic parameters in GET calls, but it appears POST API calls in Power BI require a different method. Here is my 3-variable function in the dataflow making a POST call:
(page as number, startingdate as text, endingdate as text) =>
let
    url = "https://fakeurl.com/v1/",
    api_consumer_key = "fake-key",
    authorization = "basic fakeauthodwklsjlsdjafkljasd",
    clientId = "0495",
    header = [
        #"api-consumer-key" = api_consumer_key,
        #"authorization" = authorization,
        #"Content-Type" = "application/json; charset = utf-8"
    ],
    content = Json.FromValue([
        action = "fetchQuery",
        domainObject = [
            #"page" = Number.ToText(page),
            #"startDate" = startingdate,
            #"endDate" = endingdate,
            #"client" = [#"id" = clientId]
        ]
    ]),
    response = Web.Contents(url, [Headers = header, Content = content]),
    jsonResponse = Json.Document(response)
in
    jsonResponse
To deal with Gateway timeout (the server taking too long to respond to large calls), I am forced to shift to more calls with less volume. I addressed this by iterating the function over narrow start and end dates. I then added the pagination with List.Generate as the page loads are too small for even a narrow date range. The following function iterates across a 2 column table of [START DATE] and [END DATE]:
let
    Source = #"Post API Date Parameters ETL",
    #"Added fx" = Table.AddColumn(
        Source,
        "fxMediaFrames",
        each
            let
                // Capture the row's dates first: inside List.Generate's
                // `each` functions, `_` is rebound to the generator's state
                // record, so [START DATE] would be looked up there (and fail)
                // rather than on this table row.
                sd = [START DATE],
                ed = [END DATE],
                PageList = List.Generate(
                    () => [Result = try fxMediaFrames(1, sd, ed) otherwise null, page = 1],
                    // Guard against null before List.IsEmpty, which errors on null
                    each [Result] <> null and not List.IsEmpty([Result]),
                    each [Result = try fxMediaFrames([page] + 1, sd, ed) otherwise null, page = [page] + 1],
                    each [Result]
                )
            in
                PageList
    )
in
    #"Added fx"
When I run this iterated function, I receive this vague error:
When I manually enter the 3 variables into the POST call, I successfully retrieve a single page of records, as expected. The error arises when I attempt to paginate and parameterize the call using the [START DATE] and [END DATE] columns.
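The paginate-until-empty pattern in the List.Generate call is language-agnostic; a minimal Python sketch of the same loop can help confirm the logic is sound before debugging the M scoping (fetch_page here is a hypothetical stand-in for the POST call):

```python
from typing import Callable, List, Optional

def fetch_all_pages(fetch_page: Callable[[int, str, str], Optional[list]],
                    start_date: str, end_date: str) -> List[list]:
    """Accumulate pages until fetch_page raises, returns None, or returns an empty page."""
    pages = []
    page_number = 1
    while True:
        try:
            result = fetch_page(page_number, start_date, end_date)
        except Exception:
            result = None  # mirrors M's `try ... otherwise null`
        if not result:     # None or empty page stops the loop
            break
        pages.append(result)
        page_number += 1
    return pages

# Hypothetical paged source: three pages of records, then empty
fake_data = {1: ["a", "b"], 2: ["c"], 3: ["d"]}
print(fetch_all_pages(lambda p, s, e: fake_data.get(p, []), "2024-01-01", "2024-01-31"))
# [['a', 'b'], ['c'], ['d']]
```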
If you made it this far, any help on the function or insight on the error would be hugely appreciated. Thanks all.
Lakehouse on capacity in Fabric. When I query the data from the auto-generated SQL analytics endpoint, I can see data, tables, etc. If I attempt to "Explore Data" from the auto-generated semantic model, it never loads. I've waited up to 30 minutes and it never loads anything in the data pane.
Additionally, if I attempt to connect to either from Power BI Desktop, it says "working on it" for 10 seconds or so and then never shows any of the tables in the data pane. It is as if there is no data.
The Lakehouse is populated via a Dataflow pulling data from MySQL.
Any ideas as to what is going on?
Hello,
Is there an extension that would allow me to create a button that can add a SQL record and also delete the selected record in a SQL Server table that I am using with DirectQuery?
Or can I use PowerApps?
Thank You,
Michael
Hello, I don't imagine this is possible but thought I would ask in case.
In my tenant (Tenant A) we have a pro license, and a report that we would like to share with Tenant B.
Tenant B has a Premium capacity workspace.
If I shared the report and semantic model with Tenant B, could they republish the report into their premium capacity workspace and refresh the semantic model?
The steps I have imagined would be:
1.) Tenant A shares semantic model and report with Tenant B
2.) Tenant B downloads report and connects to shared semantic model as an external data source
3.) Tenant B republishes report in Premium Workspace
Is this possible or would Tenant B need to rebuild the report / build a new report from scratch once they have connected to the semantic model as an external data source?
Thanks
Hi Team,
Getting the below message while opening Copilot; it was opening earlier.
Is there any reason why this blank selection keeps popping up when I try to map a gateway? This prevents the gateway from auto-assigning, requiring me to click through and select my gateway. The area circled in red is what I'm referring to. Below the red circle is the actual gateway I want to select.
Hi, I am trying to embed a Power BI report in the dashboard area of my Dynamics CRM. I want to know how I can apply a context filter to the report. I understand that filtering is possible when embedding the report in a form, but is it possible to have context-based filtering for the main dashboard?
Are Power BI Gen2 dataflows still in preview?
These things are super painful to use. They are constantly kicking off queries to a web connector without my approval. To add insult to injury, the Gen2 version of dataflows won't let me cancel the refresh operation either. Or rather, it spins and spins, and once the refresh is finally done, the cancel option finally becomes available (after you no longer need it).
It is difficult to understand the design of this Power BI component. I'm not sure "Gen2" is better than "Gen1". Is anyone using this stuff?
Below (image) is the spinning refresh going on without any way to access settings, delete, or cancel. It feels like I'm being punished once again for using the tool in some way Microsoft didn't have in mind.
Hi there,
Currently, it's only possible to schedule a semantic model refresh on the hour or half hour. Is there a way to schedule a refresh to run at X:20? The source in this case is SharePoint, so DirectQuery isn't an option.
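The scheduler UI only offers hour and half-hour slots, but the Power BI REST API can trigger a dataset refresh on demand, so an external scheduler (Power Automate, an Azure Function, or plain cron) can fire at any minute. A minimal sketch, assuming you already have an Azure AD access token with the right scope; the workspace and dataset IDs are placeholders:

```python
import urllib.request

def refresh_url(group_id: str, dataset_id: str) -> str:
    """Build the Power BI REST API endpoint that triggers a dataset refresh."""
    return (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
            f"/datasets/{dataset_id}/refreshes")

def trigger_refresh(access_token: str, group_id: str, dataset_id: str) -> int:
    """POST to the refreshes endpoint; the service replies 202 Accepted on success."""
    req = urllib.request.Request(
        refresh_url(group_id, dataset_id),
        method="POST",
        headers={"Authorization": f"Bearer {access_token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

A cron entry of `20 * * * *` (or an equivalent Power Automate recurrence) would then give you the X:20 schedule the UI doesn't offer.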
Hello everyone,
I created a table with some conditional formatting icons, but after my users started to play with the column sorting, some icons started to disappear, and I have to republish my reports to be able to see the icons again.
This is how it looks before I republish the report:
Then, I republished my report.
Now, if I start to sort my columns, the issue reappears and all users are impacted by the missing values.
Any idea what I can do to solve this issue?
Many thanks in advance for your help
Hi!
I have a Power BI report connected to a SQL Warehouse on Databricks. It is set up for incremental refresh, and locally I'm dealing with only a fraction of the data, limited by the RangeStart and RangeEnd parameters. When I publish the report online and do the first (full) refresh, Power BI starts sending some LIMIT 1000 queries, some kind of preview. I can see the queries on the Databricks side and they look like this:
Is there a way to disable that?
I have disabled the background preview load, but that didn't help.
Since I'm querying from complex views on Databricks, those LIMIT 1000 previews are very costly, and incremental refresh makes almost no sense.
How do I disable that, so Power BI sends only the strictly necessary queries (pulling all historical data first and then only the increments)?
Hi!
I have a matrix on a report page using field parameters to dynamically get values for the last four months in addition to some other metrics (Image 1). However, when pinned to a dashboard, the column names in the matrix do not update as the months displayed update (Image 2). The values in the visual are updating correctly in both the report and the dashboard, but the column names on the dashboard only are not updating to reflect the change in months. When pinned in April, both the report and the dashboard displayed January – April. But now in May, the report displays column names for February – May, but the dashboard still displays January – April.
Any ideas on why the column names in the dashboard are not updating?
Image 1, report:
Image 2, dashboard:
Thanks!
I have a number of computed entities in a DF GEN1 dataflow.
This dataflow is deployed to two workspaces with virtually no changes. In one workspace the entities behave differently than in the other. I think it is based on the amount of data.
In one workspace the data is retrieved from a Web connector once, one table is stored in blob storage, and all the other tables are computed/calculated from the first table (so that the Web API isn't abused by Power BI).
In the other workspace, the exact same dataflow code works differently: Power BI hits the Web connector over and over again, disregarding the fact that the other tables are supposed to be computed. I've confirmed that both use the exact same gateway, and the code of the two dataflows is almost identical. The only difference is a slightly different amount of data in each case (they have different REST argument filters).
Below is a sample of the code, where the computed entity is annotated properly in the dataflow, yet Power BI is not respecting this and goes back to the Web connector for the data instead.
Any help would be appreciated.
Hello everyone,
I am looking at a semantic model that's giving this error; there is an issue with the fct_Order table.
You can see the lineage view.
When I click on this semantic model, I can see it only contains dim and fact tables.
However, when I click on the four individual dataflows that go into this dataset, their tables are completely different.
Because of this, I am unable to figure out how this semantic model got its 43 dim/fct tables.
And another related question: when I click on the details of this semantic model, it shows other reports and semantic models? So what exactly is a semantic model? Is it like a folder in the Power BI Service that contains data, reports, and other semantic models?