Posts

Showing posts from 2025

Bridging Clouds: Secure Pipelines from Azure DevOps to GCC High

Introduction The goal of this setup is to allow an Azure DevOps pipeline running in the Commercial cloud to move files (e.g., build artifacts, documentation, or deployment packages) into a Storage Account in GCC High. Because these are two different clouds, the connection must be established carefully to remain secure, compliant, and tenant-scoped. Although this guide is written for Commercial → GCC High, the same approach can also be used for file transfers between Commercial environments or even across tenants within Commercial Azure. By relying on federated credentials instead of secrets, the process ensures secure, governed transfers that honor existing Azure AD (Entra ID) boundaries. To achieve this, we use an Azure User Assigned Managed Identity (UAMI) in the target environment, link it with Workload Identity Federation from Azure DevOps (Commercial), and grant it the minimum necessary roles. This way, files can flow between environments without storing long-lived s...
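As a rough sketch of how such a federated connection might be consumed in the pipeline (the service connection name `gcchigh-federated` and the storage account name are placeholders, and the service connection is assumed to already be configured for workload identity federation against the UAMI):

```yaml
# Sketch only: names are placeholders, not from the article.
steps:
- task: AzureCLI@2
  inputs:
    # Service connection backed by workload identity federation (no stored secrets)
    azureSubscription: 'gcchigh-federated'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      # Upload staged build artifacts to the target storage account,
      # authenticating with the federated identity rather than a key
      az storage blob upload-batch \
        --auth-mode login \
        --account-name mygcchighstorage \
        --destination artifacts \
        --source "$(Build.ArtifactStagingDirectory)"
```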

Locking Down a Logic App (Consumption) with OAuth for Calls from Dataverse Plug-ins using Managed Identity

Why I did this I’m using managed identity to let a Dataverse plug-in call Azure resources without storing secrets. One of those calls hits a Logic App (Consumption) via the When an HTTP request is received trigger. I wanted to ensure the workflow can only be invoked by callers from my tenant using OAuth—no shared access signature (SAS) keys. (If you’re on Logic Apps Standard, you’d typically use App Service “Easy Auth”.) Microsoft’s docs say you can require OAuth and (critically) you must disable SAS for request triggers in Consumption, otherwise a valid SAS bypasses OAuth. The official instructions work, but I found a simpler way to flip the SAS switch directly in code view. What I changed 1) Disable SAS for the HTTP trigger (Consumption only) Open the Logic App (Consumption) in the Azure portal. Go to Development Tools ➜ Logic app code view. In the workflow JSON, add the following sibling to "parameters" (top level) and Save: "accessControl...
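For reference, the code-view fragment described above looks roughly like this (a sketch based on the documented `accessControl` workflow property; verify it against your own workflow JSON before saving):

```json
"accessControl": {
    "triggers": {
        "sasAuthenticationPolicy": {
            "state": "Disabled"
        }
    }
}
```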

Passing Pipeline Trigger Time to Data Flows in Azure Data Factory: Use Strings, Not Timestamps!

When working with Azure Data Factory (ADF) and the Dataverse connector, passing the pipeline trigger time into a Data Flow can be trickier than expected. The Scenario You want to pass the pipeline’s trigger time—using the @pipeline().TriggerTime system variable—into a Data Flow. This is often needed for auditing, filtering, or other time-based logic. The catch? You’re using Dataverse, which communicates over the Web API and handles datetime values as strings. The Common Mistake In Azure Data Factory, you might instinctively define the Data Flow parameter as a timestamp or date type. But ADF doesn’t have a dedicated datetime type—only date and timestamp. So you choose one of those, thinking it aligns with your goal. Then you hit an error. And to make matters worse, the error message doesn’t clearly explain the real issue—it can be vague or misleading, which only adds to the confusion. This tripped me up for a while, as I assumed the problem was elsewhere. The Solution...
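A minimal sketch of the string-based approach: declare the Data Flow parameter as string, then set its value in the Execute Data Flow activity with a pipeline expression that formats the trigger time (the parameter name and format string here are assumptions; match whatever format your Dataverse logic expects):

```
// Data Flow parameter (declare as string, not timestamp or date):
//   triggerTimeString : string
// Execute Data Flow activity, parameter value (pipeline expression):
@formatDateTime(pipeline().TriggerTime, 'yyyy-MM-ddTHH:mm:ssZ')
```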

Power Query: Driftless Merges using Table.Buffer

What happened Recently I was working on data where I needed to pick one best row per group, then merge that result with a lookup table. Here’s the head-scratcher I hit: the pick looked right in preview, but after the merge some groups showed different rows, as if the merge had used the pre-pick data. What was happening is that Power Query re-evaluated and re-ordered things during the merge, which changed which row got selected. The fix was to freeze the picked result with Table.Buffer right after the pick so the merge used exactly those rows. I also made the lookup one row per key to avoid duplicate expands. After that, everything stayed stable on refresh. Why and how Table.Buffer works Why the drift happens Power Query is lazy. It does not materialize intermediate steps until needed. A Merge can push work back to the source (folding). That re-evaluation can change row order. If your “pick one” depends on order, the selected row can change during the Merge. What Table.Buffer...
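A minimal M sketch of the pattern (the names Source, Lookup, GroupKey, Score, and Key are hypothetical stand-ins, not from the original data):

```m
let
    // Pick one best row per group: sort by score, then Table.Distinct
    // keeps the first row it encounters for each key (order-dependent!)
    Sorted = Table.Sort(Source, {{"Score", Order.Descending}}),
    Picked = Table.Distinct(Sorted, {"GroupKey"}),
    // Freeze the picked rows so the merge cannot re-evaluate and re-order them
    Buffered = Table.Buffer(Picked),
    // Merge against a lookup table that is one row per key
    Merged = Table.NestedJoin(Buffered, {"GroupKey"}, Lookup, {"Key"}, "LookupRow", JoinKind.LeftOuter)
in
    Merged
```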

How to View DLP Policies Applied to a Power Platform Environment

To quickly see which Data Loss Prevention (DLP) policies are applied to a specific Power Platform environment, you can use a direct URL. This article shows you how to: Find your environment ID Use the DLP filter URL View results in both the old and new Power Platform Admin Center interfaces Step 1: Locate Your Environment ID To get your environment ID: Go to the Power Platform Admin Center. Click Environments in the left-hand menu. Select the environment you want to inspect. Under the Details or Overview section, locate the Environment ID (a GUID string). Step 2: Use the URL to View Applied DLP Policies There are two URL formats, depending on which version of the Admin Center you’re using. ✅ New Admin Center: https://admin.powerplatform.microsoft.com/security/dataprotection/dlp/environmentFilter/{environmentId} 🕹️ Old Admin Center: https://admin.powerplatform.microsoft.com/dlp/environmentFilter/{environmentId} Just replace {environmentId} with the actual ID...
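As a trivial sketch, the two URL shapes above can be captured in a helper (purely illustrative; the function name is made up):

```typescript
// Hypothetical helper: build the DLP environment-filter URL for either
// the new or the old Power Platform Admin Center.
function dlpFilterUrl(environmentId: string, newAdminCenter: boolean = true): string {
  const base = "https://admin.powerplatform.microsoft.com";
  return newAdminCenter
    ? `${base}/security/dataprotection/dlp/environmentFilter/${environmentId}`
    : `${base}/dlp/environmentFilter/${environmentId}`;
}
```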

Harnessing Host Form Data with PCF Controls in Model-Driven Applications

Introduction This tutorial delves into integrating PowerApps Component Framework (PCF) controls with host form data within Microsoft Power Platform’s model-driven apps. This article will guide you through the necessary scripting to expose and consume formContext and globalContext from a custom table called new_Competitor. Aimed at enhancing both custom and Microsoft Form Component PCF controls, this approach ensures dynamic interactions with the host form data. Disclaimer: It’s important to note that there are various methods to retrieve data within PCF controls, including the use of WebAPI. While WebAPI provides a versatile way to access data across different entities and contexts, the approach described in this tutorial focuses on directly integrating with host form data, which can be particularly beneficial in specific use cases where immediate context is crucial. This method allows for real-time data interactions that are essential for certain scenarios, providing a streamlined...
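To illustrate the kind of consumption this enables, here is a hedged sketch: a helper that reads a column value from whatever form-context object the host exposes to the control. The interface shapes below are assumptions for illustration; the real formContext comes from the scripting the article describes:

```typescript
// Minimal shapes standing in for the host's form context (assumed for this sketch).
interface HostAttribute { getValue(): unknown; }
interface HostFormContext { getAttribute(logicalName: string): HostAttribute | null; }

// Read a column value from the host form, tolerating missing attributes.
function readHostValue(formContext: HostFormContext, logicalName: string): unknown {
  const attribute = formContext.getAttribute(logicalName);
  return attribute ? attribute.getValue() : null;
}
```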

Copy Hidden Power Automate Expressions

While recently walking a colleague through Power Automate, I was reminded of one of my long-standing frustrations: how difficult it is to copy expressions once they’ve been inserted into actions like a Compose, Filter Array, or For Each. For example: workflow()['tags']['environmentName'] item()?['logicalName'] Once fields are wrapped in certain expressions or live in certain actions, you can no longer open them in the expression popup or easily copy them out to reuse elsewhere. That makes it frustrating when you want to replicate logic in another flow or simply document what you’ve written. There is a copy-action feature, but it copies the entire action, and sometimes you just want a specific expression. The Workaround (Until Now) To deal with this, I used to: Add expressions to a comment or note in the action so I could reference them later Open DevTools and inspect the DOM to grab the full title attribute that contains the expression...
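Once you have scraped title attributes out of the designer DOM, filtering them down to expression-looking strings can be done with a small helper (a sketch only; the regex is a rough heuristic, not how the designer actually marks expressions):

```typescript
// Keep only titles that look like Power Automate expressions,
// e.g. workflow()['tags']['environmentName'] or item()?['logicalName'].
// Heuristic: an identifier immediately followed by "()".
function extractExpressions(titles: string[]): string[] {
  return titles.filter((title) => /\w+\(\)/.test(title));
}
```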

Connecting to Dataverse from Azure Data Factory

Introduction Azure Data Factory (ADF) provides versatile ways to connect and interact with Dataverse. Understanding the connection options and configurations is crucial for securely and efficiently managing your data integration tasks. This guide walks through the primary methods for connecting Azure Data Factory to Dataverse, covering their benefits, best practices, and detailed walkthroughs to get you started. Disclaimer: These thoughts are my own and based on my personal experience. If you have different ideas or approaches, I’d love to hear them—I’m always eager to learn more from others in the community. C...