Posts

Showing posts from August, 2025

Bridging Clouds: Secure Pipelines from Azure DevOps to GCC High

Introduction

The goal of this setup is to allow an Azure DevOps pipeline running in the Commercial cloud to move files (e.g., build artifacts, documentation, or deployment packages) into a Storage Account in GCC High. Because these are two different clouds, the connection must be established carefully to remain secure, compliant, and tenant-scoped. Although this guide is written for Commercial → GCC High, the same approach can also be used for file transfers between Commercial environments, or even across tenants within Commercial Azure. By relying on federated credentials instead of secrets, the process ensures secure, governed transfers that honor existing Azure AD (Entra ID) boundaries. To achieve this, we use an Azure User-Assigned Managed Identity (UAMI) in the target environment, link it to Azure DevOps (Commercial) via Workload Identity Federation, and grant it the minimum necessary roles. This way, files can flow between environments without storing long-lived s...
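The excerpt only outlines the approach, so here is a rough sketch (not the post's exact pipeline) of what the Commercial-side upload step might look like, using a workload-identity-federated service connection. The service connection name, storage account, and container are hypothetical:

    steps:
      - task: AzureCLI@2
        displayName: Upload artifacts to GCC High storage
        inputs:
          # Hypothetical service connection backed by workload identity
          # federation to the UAMI in the GCC High tenant
          azureSubscription: 'gcchigh-uami-federated'
          scriptType: bash
          scriptLocation: inlineScript
          inlineScript: |
            az storage blob upload-batch \
              --account-name examplegcchighsa \
              --destination artifacts \
              --source "$(Build.ArtifactStagingDirectory)" \
              --auth-mode login

Note --auth-mode login: it makes the CLI authenticate with the federated Entra ID token instead of account keys, which is what lets a minimal role assignment (e.g., Storage Blob Data Contributor) do its job.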

Locking Down a Logic App (Consumption) with OAuth for Calls from Dataverse Plug-ins using Managed Identity

Why I did this

I’m using a managed identity to let a Dataverse plug-in call Azure resources without storing secrets. One of those calls hits a Logic App (Consumption) via the "When an HTTP request is received" trigger. I wanted to ensure the workflow can only be invoked by callers from my tenant using OAuth, with no shared access signature (SAS) keys. (If you’re on Logic Apps Standard, you’d typically use App Service "Easy Auth" instead.) Microsoft’s docs say you can require OAuth and, critically, that you must disable SAS for request triggers in Consumption; otherwise a valid SAS bypasses OAuth. The official instructions work, but I found a simpler way to flip the SAS switch directly in code view.

What I changed

1) Disable SAS for the HTTP trigger (Consumption only)

Open the Logic App (Consumption) in the Azure portal. Go to Development Tools ➜ Logic app code view. In the workflow JSON, add the following as a sibling of "parameters" at the top level, then Save: "accessControl...
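The JSON is cut off in this excerpt. For orientation, the accessControl block that Microsoft documents for disabling SAS on Consumption request triggers has roughly this shape (a sketch; verify against the current docs before relying on it):

    "accessControl": {
      "triggers": {
        "sasAuthenticationPolicy": {
          "state": "Disabled"
        }
      }
    }

With SAS disabled, only OAuth tokens accepted by the authorization policies you configure on the logic app can invoke the request trigger.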

Passing Pipeline Trigger Time to Data Flows in Azure Data Factory: Use Strings, Not Timestamps!

When working with Azure Data Factory (ADF) and the Dataverse connector, passing the pipeline trigger time into a Data Flow can be trickier than expected.

The Scenario

You want to pass the pipeline’s trigger time, via the @pipeline().TriggerTime system variable, into a Data Flow. This is often needed for auditing, filtering, or other time-based logic. The catch? You’re using Dataverse, which communicates over the Web API and handles datetime values as strings.

The Common Mistake

In Azure Data Factory, you might instinctively define the Data Flow parameter as a timestamp or date type. ADF doesn’t have a dedicated datetime type, only date and timestamp, so you choose one of those, thinking it aligns with your goal. Then you hit an error. To make matters worse, the error message doesn’t clearly explain the real issue; it can be vague or misleading, which only adds to the confusion. This tripped me up for a while, as I assumed the problem was elsewhere.

The Solution...
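The excerpt cuts off before the fix, but the title gives it away: pass the value as a string. A minimal sketch, assuming a hypothetical Data Flow parameter named triggerTimeString declared as type string:

    Pipeline side (Execute Data Flow activity, parameter value):

        @formatDateTime(pipeline().TriggerTime, 'yyyy-MM-dd HH:mm:ss')

    Data Flow side (convert back only where a timestamp is truly needed):

        toTimestamp($triggerTimeString, 'yyyy-MM-dd HH:mm:ss')

Formatting the value explicitly on the pipeline side keeps the string parseable by toTimestamp, and it remains a plain string wherever it is written to Dataverse over the Web API.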

Power Query: Driftless Merges using Table.Buffer

What happened

Recently I was working on data where I needed to pick one best row per group, then merge that result with a lookup table. Here’s the head-scratcher I hit: the pick looked right in preview, but after the merge some groups showed different rows, as if the merge had used the pre-pick data. What was happening is that Power Query re-evaluated and re-ordered things during the merge, which changed which row got selected. The fix was to freeze the picked result with Table.Buffer right after the pick, so the merge used exactly those rows. I also made the lookup table one row per key to avoid duplicate expands. After that, everything stayed stable on refresh.

Why and how Table.Buffer works

Why the drift happens: Power Query is lazy; it does not materialize intermediate steps until needed. A Merge can push work back to the source (query folding), and that re-evaluation can change row order. If your "pick one" step depends on order, the selected row can change during the Merge.

What Table.Buffer...
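For a concrete picture of the fix, here is a minimal, self-contained M sketch (hypothetical column and step names, not the post's actual query) that pins the picked rows with Table.Buffer before the merge:

    let
        // Hypothetical data: several candidate rows per group
        Source = #table({"Group", "Score"}, {{"A", 2}, {"A", 5}, {"B", 1}, {"B", 3}}),
        // Lookup table kept to one row per key to avoid duplicate expands
        Lookup = #table({"Group", "Label"}, {{"A", "Alpha"}, {"B", "Beta"}}),
        // Pick one best row per group: sort, then keep the first row per key
        Sorted = Table.Sort(Source, {{"Group", Order.Ascending}, {"Score", Order.Descending}}),
        Picked = Table.Distinct(Sorted, {"Group"}),
        // Freeze the picked result so the merge cannot trigger re-evaluation
        Buffered = Table.Buffer(Picked),
        Merged = Table.NestedJoin(Buffered, {"Group"}, Lookup, {"Group"}, "LookupRow", JoinKind.LeftOuter),
        Expanded = Table.ExpandTableColumn(Merged, "LookupRow", {"Label"})
    in
        Expanded

Without the Table.Buffer step, the pick can be re-evaluated during the merge under a different row order, which is exactly the drift described above.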