Channel: Service topics

How to Embed PowerBI paginated Report into email using Power Automate


Hi, I am using a Power BI Premium account to create a paginated report, and I have a Premium capacity license for paginated reports.

If we export the report to PDF, the paginated report generates a PDF containing 3 to 5 pages depending on the parameters, so we don't know in advance how many pages the report will have; the data is all dynamic.

I tried this but had no luck.

I want to embed the entire report (tables and charts) in the email body. Attaching the report as a PDF is fine, but I also want to embed all of the report content directly into the email.
Is there any way to embed the entire paginated report into an Outlook email?

 

 


Published a mixed storage report to workspace. It is broken there


My report uses a published dataset ("connected live" is usually displayed at the bottom right).

When I added an additional local file, it changed the connection to the published dataset to DirectQuery.

Now I have two data sources: DirectQuery to the main dataset and Import for the local file.

The storage mode is listed as Mixed.

 

It works fine in the desktop where I created the report.

However, once I publish it, it does not work and gives this error:

[screenshot of the error message]

I've never had this issue before. The error doesn't mention the local data file I imported.

 

When I go to the data source with the same name as this report, it won't let me connect to the on-premises or VNet data gateway that I usually connect a data source to. It does give me the option to use a personal gateway, but I don't want to require every user to have a personal gateway to view the report.

 

The local file is very static, so I could turn off refresh for it, but doing that hasn't eliminated the issue.

 

The published datasource I usually connect to is in another workspace.

 

What do I need to fix so this report will run in the Service?

Dynamic Parameter


I have a table check_register, which has three columns: ID, INSTR_NO, and employee_id.

I have used the following query:

let
    Source = Oracle.Database(
        "entp_prod_support",
        [
            HierarchicalNavigation = true,
            Query = "SELECT ID,instr_no FROM check_register WHERE instr_no=" & Parameter1
        ]
    )
in
    Source

 

When I change the parameter in Desktop it works fine, but when I update the parameter in the app settings, it does not fetch any records.

 

Surprisingly, the same approach works for the employee_id column.
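
For reference, here is a minimal sketch of how I understand the concatenation would need to look if the two columns have different data types (this is an assumption on my part: a text-typed INSTR_NO would need single quotes inside the SQL, while a numeric parameter would need Number.ToText instead):

let
    // Sketch only: assumes Parameter1 is a text parameter and INSTR_NO is a
    // text column, so the value must be wrapped in single quotes in the SQL.
    // If the parameter were numeric, Number.ToText(Parameter1) would be needed.
    InstrNoLiteral = "'" & Parameter1 & "'",
    Source = Oracle.Database(
        "entp_prod_support",
        [
            HierarchicalNavigation = true,
            Query = "SELECT ID, instr_no FROM check_register WHERE instr_no = " & InstrNoLiteral
        ]
    )
in
    Source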

 

Please help

Premium App and Workspace Disappeared but not Deleted


One of my organization's Premium workspaces and its app content package disappeared this morning. Only 5 people had admin permissions to delete it, and none of them are aware of how this happened. We have run an audit log from the admin portal, and no such deletions were detected. We are at a loss as to what happened and whether it can be restored.

Refresh Excel Pivots within Service


I've published an Excel Report, with a Power BI Dataset Connection, to a Premium Workspace.  

 

The report has a pivot table connected to my Power BI dataset that needs to be refreshed daily; however, I don't see any way to refresh the table while viewing the report in the Service.

  • Before the very recent Excel Online/Power BI update, I'd always get the "can't refresh from a Power BI source" error. That's now gone, which is why I'm attempting this.

 

I would prefer the "Schedule Refresh" option that was mentioned when I imported the workbook from OneDrive, but I'm not seeing that anywhere either.

 

[screenshot]

 

The workspace claims I don't have any Workbooks with a Data Model.

[screenshot]

 

 

Published report reverting to old version


Hi there,

 

Over the past month I've noticed data missing from a published Power BI report online. When I go into the Desktop dataset driving the report and refresh the data, it's there. I republish the dataset, and for a few minutes the data is there. When I refresh the page, however, the data goes missing again.

 

I've tried checking credentials and everything looks good. Any idea why there would be a discrepancy between my Power BI Desktop dataset and the Power BI online dataset?

 

Thanks,

Courtney

Incremental Refresh for Dataflows


Hi all,

I have searched for the specific questions I have on this topic but can't seem to find any existing answers. Hoping someone can enlighten me.

 

We are using dataflows to pull from SAP HANA, but the source table does not have a datetime column (only a date column). From what I have seen online, incremental refresh on datasets lets you use functions to convert the date column to datetime, so this is a non-issue there. But that does not seem to be the case with dataflows in the Service (at least according to the Microsoft documentation). Will I be able to use incremental refresh in the Service if the source table has a date column rather than a datetime column? I have attempted to get around this by converting the column to datetime within the query itself, but the refresh was extremely slow, even slower than refreshing the entire table. I should also mention that the query currently uses Date.AddMonths to filter out any rows that are older than 1 year; I wasn't sure whether I need to remove that filter for incremental refresh to work correctly.
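
For what it's worth, here is a minimal sketch of the dataset-style filter I have in mind: instead of converting the column to datetime, convert the RangeStart/RangeEnd datetime parameters to dates inside the filter. The server, table, and column names are made up, and I don't know whether dataflow incremental refresh supports writing the filter this way:

let
    // Sketch: hypothetical SAP HANA source; RangeStart/RangeEnd are the
    // datetime parameters used by incremental refresh.
    Source  = SapHana.Database("hana-server:30015"),
    Billing = Source{[Name = "BILLING_ITEMS"]}[Data],   // hypothetical navigation step
    // Filter the date-typed column against the parameters converted to dates,
    // rather than converting the whole column to datetime.
    Filtered = Table.SelectRows(
        Billing,
        each [BillingDate] >= Date.From(RangeStart)
             and [BillingDate] < Date.From(RangeEnd)
    )
in
    Filtered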

 

Any insight would be greatly appreciated, this forum has been very helpful.

 

Dataflows in Golden Dataset


I am working to implement a Power BI framework that includes best practices for the use of data in Power BI reports throughout the organization. From my research, I think using a golden dataset is the way to go. I also think using dataflows can be beneficial, so analysts can create reports using the maintained dataflows as a source of truth instead of having to recreate connections to SharePoint folders (unfortunately we cannot connect directly to SAP HANA, so flat files/workbooks hosted on SharePoint are a must). I'm trying to keep an agile mindset and make it as simple as possible for analysts to grab data for analysis.

 

Since I would be looking to use dataflows AND a golden dataset, I see the benefit of using dataflows within the golden dataset. However, I would like the data to get updated as quickly as possible in the golden dataset (not 45+ minutes), and I do not believe I can create incremental refreshes for both the dataflows AND the golden dataset. The idea would be to refresh flat files on SharePoint, which would then kick off a Power Automate flow to refresh the dataflows and the golden dataset.

 

My Questions:

1. Is it best practice to connect to dataflows in a golden dataset or connect directly to the source within the golden dataset so that I can use incremental refresh for fast refreshes in the golden dataset and the dataflows?

2. Would it be better/more efficient to not focus on dataflows and just have analysts get data from the SharePoint folder instead? I don't want to overcomplicate things by having to maintain dataflows if they are not beneficial. I'm curious what others' experience is with this. I have come across people having problems with dataflows, and if I don't need to use them and deal with those problems, then maybe I should not use dataflows.


SharePoint List to Power BI Online Error


I created a dashboard from a SharePoint list via Integrate >> Power BI >> Visualize the list, and the following error appeared when the SharePoint list was subsequently updated.

 

  • Activity ID: 20386b58-1078-4807-8523-09a0d14dd1d6
  • Request ID: c0a11beb-9cb5-a702-6d25-0885c3a14bea
  • Correlation ID: d1edbf6a-f071-8e17-82a1-067319ed5c3b
  • Time: Fri Jul 30 2021 14:01:36 GMT-0500 (Central Daylight Time)
  • Service version: 13.0.16475.45
  • Client version: 2107.3.06953-train
  • Cluster URI: https://wabi-us-east2-redirect.analysis.windows.net/

 

Please advise.


Thanks,

 

Petar

300 MB Excel file (and growing) with multiple sheets with calculations


I am trying to figure out the best way to get the Power BI connection secured with Premium. Here is my scenario:
I have an Excel file of around 300 MB with about 10 sheets that are used for data entry and validation. There are 3 other, smaller Excel files of around 10 to 15 MB that connect to the same Power BI report.
Currently these Excel files and the Power BI file are stored on a shared drive. Everyone using the report has a free license and Power BI Desktop installed.
My question is: since this is large content, what are my options? I am considering:
1. Move the Excel source files into SharePoint, where the data entry is done, and publish the Power BI report to a Premium workspace for embedding into a team site (see the sketch after this list). Is there a way to just append new data to the existing Excel data already imported? Will this reduce the refresh time, or will performance be better?

2. Use an on-premises SQL database: migrate all the content into a SQL database, have a Power Apps form to write new data to SQL, and use a BCS connection to view the data from the SQL database. Will this be a faster and better approach?
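
For option 1, this is roughly the kind of connection I have in mind (the site URL, file name, and sheet name are placeholders, not our real ones):

let
    // Sketch of option 1: read the data-entry workbook from a SharePoint
    // document library instead of the shared drive (all names are placeholders).
    Files  = SharePoint.Files("https://contoso.sharepoint.com/sites/Finance", [ApiVersion = 15]),
    Book   = Files{[Name = "DataEntry.xlsx"]}[Content],
    Sheets = Excel.Workbook(Book, true),
    Entry  = Sheets{[Item = "EntrySheet", Kind = "Sheet"]}[Data]
in
    Entry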

 

Thanks in advance.

Need Assistance: Permissions problem with viewers


Hi Everyone,

 

I built a report for a department in my company. I have a Premium Per User license. I placed the report in an app for testing purposes and shared the link via Teams with a user from that department. The user I shared the report with is a Pro user, but we will eventually want free users to view the reports as well. I was under the assumption that, since I am a Premium user, if I create a report, share it with a user, and put them in a viewer role, then that user can be either a free or a Pro user. Is this correct?

 

If so, why would the user I shared the report with (via a link to the app I created with the report inside) get a message saying that he needs a Premium account to open the report?

 

On a side note, this report was created and sent (while I was a Premium user) to this same user, but not within an app, and the user was able to see it. However, when I placed it inside an app and shared that link, he now gets the message.

 

Any help is appreciated, thanks

Help: Current Users are no longer able to export underlying data


Hi,

 

I have reports that a group in my organization has access to, and historically they have been able to export the underlying data. All of a sudden they can no longer export the data, but their access has not changed and I have not changed the settings. How can I solve this for my team?

 

Thanks!

Remove blank x axis values from the visual after applying filter


Hello,

 

Can someone please help me solve this?

 

I have a filter with the following categories:

[screenshot: the filter categories]

When I select the category "Average" I get this:

[screenshot: the chart with blank gaps on the x-axis]

Goal: Is there a way to filter out these blank spots and have a continuous graph?

 

 

When to use Linked tables


I have six Dataflows in my Power BI Service. All tables in those Dataflows point to various SQL Server tables and these are all refreshing as expected. 

 

I now want to merge data based on one main table in one of those Dataflows (we’ll call it DF1), but need to look up information in tables from the other various Dataflows (DF2, DF3, etc). I want to merge through the Power BI Service, not in a local Dataset.

 

Would it be best to first create Linked tables in DF1 to the other Dataflow tables and then merge that way? Or should I look at this in a completely different way? (I’m a newbie so I’m open to suggestions.)
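
To make the question concrete, this is roughly the computed-entity merge I have in mind inside DF1 (the entity and column names are made up):

let
    // Sketch of a computed entity in DF1: merge the main entity with a linked
    // entity that points at a table in DF2 (all names are illustrative).
    Main   = DF1_MainTable,
    Lookup = DF2_CustomerLookup,   // linked table from DF2
    Merged = Table.NestedJoin(
        Main, {"CustomerKey"},
        Lookup, {"CustomerKey"},
        "CustomerInfo", JoinKind.LeftOuter
    ),
    Expanded = Table.ExpandTableColumn(Merged, "CustomerInfo", {"CustomerName", "Region"})
in
    Expanded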

Power BI Snowflake refresh fails


Hello Team,

 

I have published a dataset that works on my local Desktop but fails in the Service with the error below:

 


Data source error: {"error":{"code":"DM_GWPipeline_Gateway_MashupDataAccessError","pbi.error":{"code":"DM_GWPipeline_Gateway_MashupDataAccessError","parameters":{},"details":[{"code":"DM_ErrorDetailNameCode_UnderlyingErrorCode","detail":{"type":1,"value":"-"}},{"code":"DM_ErrorDetailNameCode_UnderlyingErrorMessage","detail":{"type":1,"value":"Native queries aren't supported by this value."}},{"code":"DM_ErrorDetailNameCode_UnderlyingHResult","detail":{"type":1,"value":"-214746759"}},{"code":"Microsoft.Data.Mashup.ValueError.Detail","detail":{"type":1,"value":"#table({\"Name\", \"Description\", \"Data\", \"Kind\"}, {})"}},{"code":"Microsoft.Data.Mashup.ValueError.Reason","detail":{"type":1,"value":"Expression.Error"}}],"exceptionCulprit":1}}} Table: Dim_Product.
Cluster URI: WABI-NORTH-EUROPE-F-PRIMARY-redirect.analysis.windows.net
Activity ID: 3c242947-a328-44de-814b-22d648f3a7
Request ID: fdc46f31-8c81-c6eb-ea74-d82e9657b
Time: 2021-07-29 08:38:00Z

 

 

The data source connection is native and it connects, but the refresh fails with the above error in the Power BI Service.
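
For reference, the pattern I was planning to try next is Value.NativeQuery against the database-level navigation table, something like the sketch below (the account, warehouse, database, and table names are placeholders, and I'm not sure it avoids this error):

let
    // Sketch only: run the native SQL through Value.NativeQuery against the
    // database navigation table (placeholder names; EnableFolding is optional).
    Source      = Snowflake.Databases("myaccount.snowflakecomputing.com", "MY_WAREHOUSE"),
    AnalyticsDb = Source{[Name = "ANALYTICS", Kind = "Database"]}[Data],
    DimProduct  = Value.NativeQuery(
        AnalyticsDb,
        "SELECT * FROM DW.DIM_PRODUCT",
        null,
        [EnableFolding = true]
    )
in
    DimProduct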

 

Thank you for your help.

Remove page background when exporting Power BI report to PowerPoint


Hi,

 

I need help if someone knows how to fix this. I uploaded a picture to Power BI Desktop to use as a page background. There is an option to make the page background transparent, but to use my background picture I have to set transparency to 0%, otherwise the picture is not visible (it's whited out). This adds a grey background line, and when I export to PowerPoint the line is there as well. This is annoying.

[screenshots showing the grey background line]

 

Appreciate your help

 

Help with relative path and UPS tracking


Hi Guys,

 

I'm trying to create a report that brings in the latest UPS tracking information for a list of tracking numbers.

 

My initial thought was to use Web.Contents, modify the URL string to include the tracking number from another column in the table, and then use Html.Table to parse the site's table, similar to the following:

 

Table.AddColumn(#"Removed Duplicates", "Custom", each Web.BrowserContents("https://www.ups.com/track?loc=null&tracknum="&[insert tracking number column here]&"&requester=WT/trackdetails"))

 

All seemed to work until I loaded the data to the Power BI Service, where I was greeted with the following error message when I tried to set up a scheduled refresh:

 

You can't schedule refresh for this dataset because the following data sources currently don't support refresh:

  • Data source for Query1

 

I read around a bit and found that the RelativePath option may be the solution. Here's what I have so far:

 

 

= Table.AddColumn(
      #"Removed Duplicates", "Custom",
      each Web.BrowserContents(
          "https://www.ups.com",
          [RelativePath = "/track?loc=null&tracknum=" & [insert tracking number column here] & "&requester=WT/trackdetails"]
      )
  )

 

but it gives me an error:

 

DataSource.Error: Web.BrowserContents: This function doesn't support the query option 'RelativePath' with value '"/track?loc=null&tracknum=1Z5685470197406940&requester=WT/trackdetails"'.
Details:
/track?loc=null&tracknum=1Z5685470197406940&requester=WT/trackdetails

 

Any ideas on what I may be doing wrong?
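
For comparison, I believe Web.Contents (as opposed to Web.BrowserContents) does accept RelativePath and Query options; here is a sketch of what I think that call would look like (the tracking-number column name is a placeholder, the Query record values must be text, and Html.Table or Text.FromBinary would still be needed afterwards to parse the returned HTML):

Table.AddColumn(
    #"Removed Duplicates", "Custom",
    each Web.Contents(
        "https://www.ups.com",
        [
            RelativePath = "track",
            Query = [
                loc       = "null",
                tracknum  = [TrackingNumber],   // placeholder column name, must be text
                requester = "WT/trackdetails"
            ]
        ]
    )
)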

 

 

Dataflows Storage can't connect to ADLS Gen 2


Hi,

 

I'm studying the usage of PBI Dataflows + ADLS Gen 2.

 

  • Power BI Account = free, but with PPU trial
  • Power BI Tenant region = Brazil South
  • Azure Resources region = Brazil South (resource group, storage account)
  • Azure Permissions (IAM) = set owner at all levels in all resources, for Power BI Service, Power BI Premium and myself

I'm still getting an error as if the regions were different.

 

Any comments?

 

Thanks!

 

[screenshot of the error]

 

How to Connect an Azure SQL DB to Power BI Online?


Hi,

I haven't been able to find any resources online about how to connect an Azure SQL database to Power BI Online. I know it can connect to the Desktop version, but what about the online version? How can I do this? Please point me in the right direction 🙂
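
In case it helps frame the question, my understanding is that on the Desktop/dataset side this is just Sql.Database with the Azure server name; a minimal sketch (the server, database, and table names are placeholders):

let
    // Sketch: connect to an Azure SQL database (placeholder names). Once the
    // dataset is published, the Service should be able to refresh it directly,
    // since Azure SQL is a cloud source and shouldn't need an on-premises gateway.
    Source    = Sql.Database("myserver.database.windows.net", "mydatabase"),
    Customers = Source{[Schema = "dbo", Item = "Customers"]}[Data]   // placeholder table
in
    Customers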

 

Thanks!
