Passing Pipeline Trigger Time to Data Flows in Azure Data Factory: Use Strings, Not Timestamps!

When working with Azure Data Factory (ADF) and the Dataverse connector, passing the pipeline trigger time into a Data Flow can be trickier than expected.

The Scenario

You want to pass the pipeline’s trigger time—using the @pipeline().TriggerTime system variable—into a Data Flow. This is often needed for auditing, filtering, or other time-based logic.

The catch? You’re using Dataverse, which communicates over the Web API and handles datetime values as strings.
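For context, @pipeline().TriggerTime evaluates to an ISO 8601-style string. The value below is illustrative, not from a real run:

    @pipeline().TriggerTime
    -> "2024-05-01T08:30:00.1234567Z"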

The Common Mistake

In Azure Data Factory, you might instinctively define the Data Flow parameter as a timestamp or date type. Data Flow parameters don’t offer a dedicated datetime type, only date and timestamp, so you pick one of those, assuming it matches the trigger time value.

Then you hit an error.
To make matters worse, the error message doesn’t explain the real issue; it’s vague enough to be misleading. It tripped me up for a while, as I assumed the problem was elsewhere.
[Screenshot: the conversion error reported by the Data Flow]

The Solution

  1. Define the Data Flow parameter as a string.
    [Screenshot: the Data Flow parameter defined with type string]

  2. In the pipeline, pass the @pipeline().TriggerTime system variable directly into this parameter using a pipeline expression.
    [Screenshot: the pipeline expression assigning TriggerTime to the parameter]

This small change ensures compatibility with the Dataverse connector and avoids the cryptic conversion error.
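For reference, here is a minimal sketch of how the Execute Data Flow activity could look in the pipeline JSON once the string parameter is wired up. The activity, data flow, and parameter names (Run Dataflow1, Dataflow1, triggerTime) are made up for this example, and the exact nesting may differ depending on how the pipeline was authored:

    {
        "name": "Run Dataflow1",
        "type": "ExecuteDataFlow",
        "typeProperties": {
            "dataflow": {
                "referenceName": "Dataflow1",
                "type": "DataFlowReference",
                "parameters": {
                    "triggerTime": {
                        "value": "'@{pipeline().TriggerTime}'",
                        "type": "Expression"
                    }
                }
            }
        }
    }

The single quotes around @{pipeline().TriggerTime} are deliberate: the Data Flow treats the parameter value as an expression, so the evaluated trigger time is handed over as a quoted string literal.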

Lesson Learned

Even though it’s tempting to use date or timestamp types when dealing with datetime values in ADF, the correct approach for Dataverse is to use strings. It aligns with how the Web API expects the data and helps you avoid hard-to-decipher errors.
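If downstream logic needs the string in a particular shape, you can also reformat it in the pipeline expression before handing it to the Data Flow. This is optional, and the format string below is only an example:

    @formatDateTime(pipeline().TriggerTime, 'yyyy-MM-dd HH:mm:ss')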
