Channel: Service topics

Dataflow updated without errors but the result is not complete

Hi,

We created several Dataflows Gen2 that are executed on a regular schedule by a Data Pipeline.
While setting up the Pipeline, we verified several times that all dependencies (data sources, data sinks, connections, and table schemas) were configured correctly.

Nevertheless, at irregular intervals, some of our resulting tables are incomplete or still reflect the state from before the refresh,
without any errors or warnings being recorded in the Data Pipeline log or in the Dataflow logs.
SQL queries against our Lakehouses revealed that individual tables, and therefore individual Dataflow queries, were not executed.
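The check we run boils down to comparing each table's last-modified timestamp against the refresh start time. A minimal sketch of that logic (table names and timestamps are illustrative; in practice the values would come from SQL queries against the Lakehouse endpoint):

```python
from datetime import datetime

def find_stale_tables(snapshots, refresh_started_at):
    """Given {table_name: last_modified_timestamp} collected after a refresh
    (e.g. via SQL queries against the Lakehouse), return the tables whose
    data predates the refresh start, i.e. were silently skipped."""
    return sorted(
        name for name, modified in snapshots.items()
        if modified < refresh_started_at
    )

# Hypothetical example: 'dim_customer' was not touched by the 06:00 refresh.
refresh_start = datetime(2024, 5, 1, 6, 0)
snapshots = {
    "fact_sales": datetime(2024, 5, 1, 6, 12),
    "dim_customer": datetime(2024, 4, 30, 6, 11),  # stale: older than refresh_start
}
print(find_stale_tables(snapshots, refresh_start))  # → ['dim_customer']
```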

We verified that sufficient capacity was available at the time of the incomplete Pipeline runs.
The execution times of the Pipeline activities also did not deviate from the expected average.

Re-running the Pipeline manually usually fixes the problem, but this workaround is not sufficient for our requirements.
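The manual re-run could in principle be automated as a validate-and-retry loop around the pipeline run. A sketch only, under stated assumptions: `run_pipeline` and `tables_are_fresh` are hypothetical stand-ins for whatever trigger and check mechanism is used (e.g. a REST call to start the pipeline and a SQL freshness query against the Lakehouse):

```python
def refresh_with_retry(run_pipeline, tables_are_fresh, max_attempts=3):
    """Run the pipeline, then re-run it until the result validates or the
    attempt budget is exhausted. Returns the number of runs it took.
    Both callables are stand-ins: run_pipeline() triggers the Data Pipeline,
    tables_are_fresh() checks the Lakehouse tables afterwards."""
    for attempt in range(1, max_attempts + 1):
        run_pipeline()
        if tables_are_fresh():
            return attempt
    raise RuntimeError(f"tables still stale after {max_attempts} runs")

# Stubbed example: the first run "fails silently", the second succeeds.
state = {"runs": 0}
def run_pipeline():
    state["runs"] += 1
def tables_are_fresh():
    return state["runs"] >= 2

print(refresh_with_retry(run_pipeline, tables_are_fresh))  # → 2
```

This does not address the root cause, but it would at least replace the manual restart with an automatic one.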

Has anyone had similar experiences?
Are there any known solutions or at least troubleshooting measures?
