<p><em>PowerApps RAW! Providing tips, tricks, and free stuff to the PowerApps community. By Rick A. Wilson (RAW).</em></p>
<h1>Syncing Azure DevOps Work Item Status to Microsoft Dataverse with Dataflows</h1>
<p>(Posted March 8, 2024)</p>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/de09fe1f-3949-4bb1-abde-587528e9d2ef" alt="Syncing Azure DevOps Work Item Status to Microsoft Dataverse with Dataflows"></p>
<p>In today’s fast-paced development environments, seamless integration between customer and development tracking systems can be crucial for maintaining transparency, efficiency, and alignment across teams. Our objective centers around a scenario faced by some organizations: <strong>synchronizing customer requirements captured by staff in Microsoft Dataverse with the development work tracked in Azure DevOps (ADO)</strong>.</p>
<h2 id="business-case-and-goals">Business Case and Goals</h2>
<p>Our service teams interact with customers to capture requirements, which are then stored in Microsoft Dataverse. As development plans are formulated, corresponding ADO items are created for each requirement. The primary goals of our integration efforts are:</p>
<ol>
<li>
<p><strong>Visibility for Service Staff:</strong> Enable service teams to view the status of the ADO items associated with customer requirements directly within Dataverse. This integration aims to eliminate the need for service staff to navigate away from their primary system to check development progress, fostering a more efficient and cohesive workflow.</p>
</li>
<li>
<p><strong>Enhanced Reporting Capabilities:</strong> By syncing ADO item statuses with Dataverse, we unlock the potential for advanced querying capabilities within Dataverse. This allows for the creation of detailed reports and analytics on development progress, directly correlating customer requirements with development statuses.</p>
</li>
<li>
<p><strong>Streamlined Operations:</strong> The integration ensures that information flow between customer service and development teams is automated and streamlined. This not only saves time but also reduces the potential for errors in tracking and reporting on the progress of development work against customer requirements.</p>
</li>
</ol>
<p>By achieving these goals, we aim to enhance the operational efficiency of our teams, improve the accuracy of our reporting, and ultimately deliver a better service experience to our customers. The following sections detail the technical journey we embarked on to realize this integration, navigating through authentication challenges, API limitations, and leveraging Power BI Dataflows as a creative solution to synchronize ADO Work Items with Microsoft Dataverse.</p>
<h2 id="the-challenge">The Challenge</h2>
<p>Our objective to synchronize specific ADO Work Item fields with a Dataverse table for enriched reporting introduced a multifaceted set of challenges:</p>
<ol>
<li>
<p><strong>Authentication with Personal Access Token (PAT):</strong> Although ADO supports various authentication methods, our scenario necessitated the use of a PAT for its flexibility and security. Integrating this with Dataverse Dataflows presented an initial obstacle, as these dataflows do not natively support basic authentication, which is essential when using PATs.</p>
</li>
<li>
<p><strong>Inconsistent Behavior in Authentication:</strong> Initial attempts to directly set the Authorization header in the <code>Web.Contents</code> function and configure the connection as Anonymous were met with errors, reporting incorrect credentials. This issue underscored the subtle complexities of handling authentication within Power BI and Dataverse integrations.</p>
</li>
<li>
<p><strong>Challenges with Data Source Configuration:</strong> Adding a new data source via the Web API connector and attempting to use the PAT solely in the password field (leaving the username empty) resulted in invalid credentials errors. However, starting with a blank query and then incorporating the <code>Web.Contents</code> function with Basic authentication—using only the PAT for authentication—eventually proved successful. This discovery process highlighted the trial and error involved in establishing a viable authentication method.</p>
</li>
<li>
<p><strong>API Limitations:</strong> The <code>wit/workitemsbatch</code> endpoint, ideal for our purposes, does not support <code>POST</code> requests when authenticating with basic authentication via the <code>Web.Contents</code> function in M code. This limitation, along with a constraint on processing only 200 records at a time, required a strategic approach to batch processing and API requests.</p>
</li>
<li>
<p><strong>Absence of a Direct Azure DevOps Connector in Dataflows:</strong> Unlike Power BI Desktop, Dataflows lack an out-of-the-box connector for Azure DevOps, adding an extra layer of complexity. This required us to utilize the Azure DevOps REST API, a robust yet initially daunting interface for those unfamiliar with its intricacies. Learning to navigate and effectively leverage the REST API took time but was essential for achieving our integration goals.</p>
</li>
</ol>
<p>For comprehensive guidance on utilizing the Azure DevOps REST API, including accessing work items, repositories, and other essential services, refer to the official API documentation: <a href="https://learn.microsoft.com/en-us/rest/api/azure/devops/?view=azure-devops-rest-7.2&viewFallbackFrom=azure-devops-rest-7.1">Azure DevOps Services REST API Reference</a>.</p>
<h2 id="the-solution">The Solution</h2>
<p>Despite these obstacles, a solution was crafted through a series of steps, leveraging Power BI Dataflows as an intermediary to handle the data transformation and syncing process.</p>
<h3 id="step-1-create-and-configure-the-pat">Step 1: Create and Configure the PAT</h3>
<p>The initial and crucial step in syncing Azure DevOps (ADO) Work Item states with Microsoft Dataverse involves creating a Personal Access Token (PAT) within Azure DevOps. This PAT serves as the authentication mechanism for accessing ADO’s APIs securely.</p>
<p>Here’s a step-by-step guide to ensure your PAT is properly configured:</p>
<ol>
<li><strong>Navigate to Azure DevOps:</strong> Go to your Azure DevOps organization’s user settings.</li>
<li><strong>Access Security:</strong> Find and click on the “Personal access tokens” option under the security settings.</li>
<li><strong>Create New Token:</strong> Select “New Token.” Ensure you provide it with a descriptive name that clearly indicates its usage, such as “DataverseSync.”</li>
<li><strong>Set Expiry:</strong> Choose an appropriate expiry date for the token according to your project duration and security policies.</li>
<li><strong>Assign Scopes:</strong> Assign the necessary scopes for the PAT. For this integration, you must at least include permissions to read Work Items. If your integration requires accessing other ADO API endpoints, make sure to include those permissions as well.</li>
</ol>
<p>For a detailed walkthrough on creating a PAT in Azure DevOps, refer to the official documentation available at <a href="https://docs.microsoft.com/en-us/azure/devops/organizations/accounts/use-personal-access-tokens-to-authenticate">Create Personal Access Tokens to authenticate access</a>.</p>
<p>Remember, the PAT is sensitive and should be securely stored. It provides direct access to your Azure DevOps services and should only be shared with trusted individuals and applications.</p>
<h3 id="step-2-implement-power-bi-dataflow">Step 2: Implement Power BI Dataflow</h3>
<p>After securing your PAT, the next step is to establish a Power BI dataflow that will serve as the intermediary for transferring and transforming data between Azure DevOps (ADO) and Microsoft Dataverse. This involves creating custom functions within Power BI to handle data batching and API calls. Follow these detailed instructions to set up your Power BI dataflow:</p>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/1d1a23f1-448c-4c34-acee-b8c51087ac4b" alt="Completed Dataflow"></p>
<h4 id="creating-a-new-blank-query-for-batching">Creating a New Blank Query for Batching</h4>
<ol>
<li><strong>Open Power BI Desktop:</strong> Start by opening Power BI Desktop and navigating to the Data view.</li>
<li><strong>Create a New Query:</strong> Go to the Home tab and click on “Transform Data” to open the Power Query editor. From there, create a new blank query by selecting “New Source” > “Blank Query.”</li>
<li><strong>Enter the SplitListIntoBatch Function Code:</strong> In the query editor that opens, enter the following M code to create the <code>SplitListIntoBatch</code> function. This function is designed to split your data into smaller batches for processing.</li>
</ol>
<pre class=" language-m"><code class="prism language-m">// Function to split a list into smaller lists of a given size
(list as list, batchSize as number) as list =>
let
    // Calculate the number of batches
    numBatches = Number.RoundUp(List.Count(list) / batchSize),
    // Generate the batches: each one skips the items consumed by
    // earlier batches and takes at most batchSize items
    batches = List.Transform(
        {0..numBatches - 1},
        each List.FirstN(List.Skip(list, _ * batchSize), batchSize)
    )
in
    batches
</code></pre>
<ol start="4">
<li><strong>Name Your Query:</strong> Rename the query to <code>SplitListIntoBatch</code> by right-clicking on the query name in the left pane and selecting “Rename.”</li>
</ol>
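<p>As a quick sanity check, the function can be invoked from another blank query. The example below is purely illustrative: splitting 450 hypothetical IDs into batches of 200 should yield three lists of 200, 200, and 50 items.</p>
<pre class=" language-m"><code class="prism language-m">let
    // Hypothetical IDs used only to exercise the function
    ids = {1..450},
    batches = SplitListIntoBatch(ids, 200),
    // Count the items in each batch: expect {200, 200, 50}
    sizes = List.Transform(batches, List.Count)
in
    sizes
</code></pre>
<p>Note that Power Query also ships a built-in <code>List.Split(list, pageSize)</code> function that performs the same batching, which you may prefer if it is available in your environment.</p>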
<h4 id="creating-the-function-for-azure-devops-api-calls">Creating the Function for Azure DevOps API Calls</h4>
<ol>
<li><strong>Add Another Blank Query:</strong> Repeat the steps to add a new blank query, this time for making API calls to Azure DevOps.</li>
<li><strong>Enter the GetWorkItems Function Code:</strong> In the new blank query, copy the following M code to create the <code>GetWorkItems</code> function. Adjust the <code>adoOrganization</code> and <code>adoProject</code> variables as necessary for your Azure DevOps instance.</li>
</ol>
<pre class=" language-m"><code class="prism language-m">(ids as list) as table =>
let
    // Convert the list of IDs to a comma-separated string
    idsString = Text.Combine(List.Transform(ids, Text.From), ","),
    // Specify the fields to retrieve
    fields = "System.Title,System.State",
    // ADO organization name
    adoOrganization = "PowerAppsRAW",
    // ADO project name (URL-encoded)
    adoProject = "My%20Project",
    // Set up the API URL and headers
    apiUrl = "https://dev.azure.com/" & adoOrganization & "/" & adoProject & "/_apis/wit/workitems",
    headers = [
        #"Content-Type" = "application/json"
    ],
    // Make the GET request
    Source = Json.Document(Web.Contents(apiUrl, [
        Headers = headers,
        Query = [
            #"api-version" = "7.1",
            ids = idsString,
            fields = fields
        ]
    ])),
    // Convert the 'value' array to a table
    workItemsTable = Table.FromList(Source[value], Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    // Expand the top-level properties of each work item
    expandedTable = Table.ExpandRecordColumn(workItemsTable, "Column1", {"id", "rev", "fields", "url"}, {"ID", "Revision", "Fields", "URL"}),
    // Get the list of field names dynamically from the first row (assuming
    // consistency across rows); guard against an empty result set
    fieldNames = if Table.IsEmpty(expandedTable) then {} else Record.FieldNames(expandedTable{0}[Fields]),
    fullyExpandedTable = Table.ExpandRecordColumn(expandedTable, "Fields", fieldNames, fieldNames)
in
    fullyExpandedTable
</code></pre>
<ol start="3">
<li><strong>Name Your Query:</strong> This time, rename the query to <code>GetWorkItems</code> to reflect its purpose.</li>
</ol>
<p>After completing these steps, you’ve successfully created the necessary functions within your Power BI Dataflow. These functions will allow you to batch process your Dataverse data and fetch updated work item information from Azure DevOps, respectively.</p>
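<p>To see what a call looks like end to end, note that invoking <code>GetWorkItems</code> with a handful of IDs resolves to a single GET request such as <code>https://dev.azure.com/PowerAppsRAW/My%20Project/_apis/wit/workitems?api-version=7.1&ids=101,102,103&fields=System.Title,System.State</code>. A minimal test query, assuming the hypothetical IDs below exist in your project, might look like this:</p>
<pre class=" language-m"><code class="prism language-m">let
    // Hypothetical work item IDs - replace with real IDs from your project
    testIds = {101, 102, 103},
    result = GetWorkItems(testIds)
in
    result
</code></pre>
<p>The result should be a table with one row per work item, containing the <code>ID</code>, <code>Revision</code>, <code>URL</code>, and the expanded <code>System.Title</code> and <code>System.State</code> columns.</p>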
<h3 id="step-3-efficiently-combining-data-for-synchronization">Step 3: Efficiently Combining Data for Synchronization</h3>
<p>As we proceed to optimize our synchronization process, it’s important to address the configuration needed when actually running the query. This step ensures that we only query Azure DevOps for updates on ADO item IDs present in our Dataverse environment, significantly reducing Azure API usage. Below is sample code, followed by essential guidance on configuring the connection for the Azure DevOps (ADO) URL.</p>
<h4 id="efficient-data-fetching-and-processing">Efficient Data Fetching and Processing</h4>
<ol>
<li>
<p><strong>Initiate Data Retrieval:</strong> Start the process by creating a new query for the <code>new_customerrequirement</code> table in Dataverse to identify the specific ADO item IDs that require updates. This step determines the scope of our synchronization efforts.</p>
</li>
<li>
<p><strong>Apply Batch Processing:</strong> Utilize the <code>SplitListIntoBatch</code> function to divide the list of relevant IDs into smaller batches. This approach is crucial for managing API call volume and adhering to Azure DevOps API rate limits.</p>
</li>
<li>
<p><strong>Fetch Updated Work Item Information:</strong> With each batch, invoke the <code>GetWorkItems</code> function to fetch the current status and other pertinent details for the ADO items. Targeting only the IDs identified earlier ensures that our data retrieval is as efficient as possible.</p>
</li>
</ol>
<pre class=" language-m"><code class="prism language-m">let
    // Connect to Dataverse
    Source = CommonDataService.Database("12345.crm.microsoftdynamics.com", [CreateNavigationProperties = true]),
    // Navigate to our table
    #"Navigation 1" = Source{[Schema="dbo", Item="new_customerrequirement"]}[Data],
    // Drill down to just the new_adoitemid field so we can pass the entire list to the next step
    #"Drill Down" = #"Navigation 1"[new_adoitemid],
    // Split the list into batches of 200
    #"Batch Items" = SplitListIntoBatch(#"Drill Down", 200),
    // Get the work items for each batch from the Azure DevOps API
    #"Get Work Items" = List.Transform(#"Batch Items", each GetWorkItems(_)),
    // Combine the results into a single table
    #"Combined Results" = Table.Combine(#"Get Work Items")
in
    #"Combined Results"
</code></pre>
<h4 id="running-the-query-and-configuring-the-connection">Running the Query and Configuring the Connection</h4>
<p>After setting up your query with the provided sample code, executing the query will prompt you to configure the connection settings for accessing Azure DevOps. Here’s how to accurately set up the connection:</p>
<ol>
<li>
<p><strong>Prompt for Connection Settings:</strong> When you attempt to run the query for the first time, Power BI will prompt you to specify how to connect to the Azure DevOps URL. This is a crucial step to ensure secure and successful data retrieval.</p>
</li>
<li>
<p><strong>Set Authentication Type:</strong> In the dialog box that appears, you’ll need to set the authentication method for the ADO URL connection. Choose “Basic” as the authentication method. This selection is necessary to use your Personal Access Token (PAT) for authentication.</p>
</li>
<li>
<p><strong>Configure Username and Password:</strong> For the username field, leave it blank. The PAT does not require a username to be specified. In the password field, paste in the PAT that you created earlier. Your PAT acts as the password, providing secure access to Azure DevOps data based on the permissions you’ve set when creating the token.</p>
</li>
<li>
<p><strong>Save and Proceed:</strong> After configuring the authentication settings, save your changes and proceed with running the query. This setup should allow Power BI to securely fetch the required data from Azure DevOps using your PAT, enabling the data transformation and syncing process to proceed.</p>
</li>
</ol>
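<p>For the curious, choosing Basic authentication with a blank username is equivalent to sending an <code>Authorization</code> header whose value is <code>"Basic "</code> followed by the Base64 encoding of a colon prepended to the PAT. The sketch below illustrates this with a placeholder token; in practice, let the credentials dialog manage the PAT rather than hard-coding it in your query text.</p>
<pre class=" language-m"><code class="prism language-m">let
    // Placeholder only - never commit a real PAT into query text
    pat = "your-pat-here",
    // Basic auth with an empty username: Base64(":" & PAT)
    authHeaderValue = "Basic " & Binary.ToText(Text.ToBinary(":" & pat), BinaryEncoding.Base64)
in
    authHeaderValue
</code></pre>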
<h4 id="finalizing-the-data-preparation">Finalizing the Data Preparation</h4>
<p>With the connection properly configured, you can efficiently combine the data fetched from Azure DevOps with the records in your Dataverse <code>new_customerrequirement</code> table. This process prepares the synchronized dataset for the final step of updating Dataverse records, ensuring that only relevant and updated ADO item information is processed and prepared for synchronization.</p>
<h3 id="step-4-creating-a-dataverse-dataflow-and-linking-to-power-bi-dataflow-results">Step 4: Creating a Dataverse Dataflow and Linking to Power BI Dataflow Results</h3>
<p>After setting up your Power BI dataflow to fetch and process ADO item statuses, the final step involves creating a dataflow within Microsoft Dataverse. This Dataverse dataflow will utilize the Dataflows connector to connect to your Power BI dataflow, allowing you to synchronize and update the <code>new_customerrequirement</code> table with the latest ADO item statuses. Here’s how to accomplish this:</p>
<h4 id="creating-a-dataverse-dataflow">Creating a Dataverse Dataflow</h4>
<ol>
<li><strong>Navigate to Power Apps:</strong> Start by going to the Power Apps portal and selecting your environment.</li>
<li><strong>Access Dataflows:</strong> From the left navigation pane, choose “Data” and then “Dataflows” to access the dataflows section.</li>
<li><strong>Create New Dataflow:</strong> Click on “New dataflow” and then select “Start from blank” to begin the creation process.</li>
</ol>
<h4 id="connecting-to-power-bi-dataflow">Connecting to Power BI Dataflow</h4>
<ol>
<li><strong>Use the Dataflows Connector:</strong> Within the dataflow creation process, select “Add data” and then choose the “Power BI dataflows” connector. This allows you to connect directly to the data processed by your Power BI dataflow.</li>
<li><strong>Authenticate and Select Your Dataflow:</strong> Authenticate as necessary and select the Power BI dataflow you created earlier, which contains the ADO item statuses.</li>
</ol>
<h4 id="linking-and-updating-the-new_customerrequirement-table">Linking and Updating the <code>new_customerrequirement</code> Table</h4>
<ol>
<li><strong>Combine IDs with Power BI Dataflow Results:</strong> With the data from your Power BI dataflow now accessible within Dataverse, the next step is to link this data with the corresponding IDs in the <code>new_customerrequirement</code> table. This involves matching ADO item IDs from Power BI dataflow results with those stored in Dataverse to ensure accurate updates.</li>
<li><strong>Update Table with ADO Item Statuses:</strong> Finally, utilize the linked information to update the <code>new_customerrequirement</code> table, specifically the fields related to ADO item statuses. This ensures that the service staff can view the current status of development work directly within Dataverse, without needing to access Azure DevOps.</li>
</ol>
<h4 id="finalizing-the-integration">Finalizing the Integration</h4>
<ul>
<li><strong>Test and Validate:</strong> It’s essential to test the dataflow to ensure that data is being correctly updated in the <code>new_customerrequirement</code> table. Validate the integration by checking if the ADO item statuses in Dataverse accurately reflect those in Azure DevOps.</li>
<li><strong>Schedule Refreshes:</strong> To maintain up-to-date information, schedule regular refreshes of your Dataverse dataflow. This ensures that the service staff always has the latest status updates at their disposal.</li>
</ul>
<p>By following these steps, you complete the integration cycle, effectively bridging Azure DevOps and Microsoft Dataverse. This enables your organization to streamline operations, enhance reporting capabilities, and provide your service staff with the visibility needed to offer informed customer support.</p>
<h2 id="an-alternative-approach-single-query-execution">An Alternative Approach: Single Query Execution</h2>
<p>While the method described above leverages functions within Power BI Dataflows to batch process and synchronize data, it’s crucial to note that there is an alternative strategy that does not require premium capacity. This alternative involves consolidating the entire process into a single query, thereby avoiding the creation of a computed table which necessitates a Power BI workspace with premium capacity.</p>
<h3 id="benefits-of-a-single-query-approach">Benefits of a Single Query Approach</h3>
<ul>
<li><strong>Cost Efficiency:</strong> By avoiding the need for premium capacity, this method can be more cost-effective, especially for organizations looking to optimize their use of Power BI and Azure resources.</li>
<li><strong>Simplicity:</strong> Consolidating the process into a single query can simplify the dataflow, making it easier to manage and troubleshoot.</li>
<li><strong>Performance:</strong> A single query approach might also offer performance benefits by reducing the complexity and potential overhead introduced by multiple function calls and batch processing.</li>
</ul>
<h3 id="considerations">Considerations</h3>
<p>It’s important to weigh the benefits against potential limitations, such as the complexity of crafting a single, comprehensive query that can handle all necessary operations efficiently. Additionally, while this approach avoids the need for premium capacity, it still requires careful planning around API rate limits and the handling of large datasets.</p>
<h3 id="implementation">Implementation</h3>
<p>Implementing this strategy requires a deep understanding of the Power Query M language and the ability to effectively leverage the Web.Contents function, query parameters, and data transformation capabilities within a single query block. This approach might look something like the following:</p>
<pre class=" language-m"><code class="prism language-m">let
    // Connect to Dataverse
    Source = CommonDataService.Database("12345.crm.microsoftdynamics.com", [CreateNavigationProperties = true]),
    // Navigate to our table
    #"Navigation 1" = Source{[Schema="dbo", Item="new_customerrequirement"]}[Data],
    // Drill down to just the new_adoitemid field so we can pass the entire list to the next step
    #"Drill Down" = #"Navigation 1"[new_adoitemid],
    // Define the batch size
    BatchSize = 200,
    // Split the list into batches of at most BatchSize items each
    NumBatches = Number.RoundUp(List.Count(#"Drill Down") / BatchSize),
    Batches = List.Transform(
        {0..NumBatches - 1},
        each List.FirstN(List.Skip(#"Drill Down", _ * BatchSize), BatchSize)
    ),
    // Function to fetch work items for a batch of IDs (inline definition)
    GetWorkItemsForBatch = (ids as list) as table =>
        let
            idsString = Text.Combine(List.Transform(ids, Text.From), ","),
            fields = "System.Title,System.State",
            adoOrganization = "PowerAppsRAW",
            adoProject = "My%20Project",
            apiUrl = "https://dev.azure.com/" & adoOrganization & "/" & adoProject & "/_apis/wit/workitems",
            headers = [#"Content-Type" = "application/json"],
            Source = Json.Document(Web.Contents(apiUrl, [
                Headers = headers,
                Query = [
                    #"api-version" = "7.1",
                    ids = idsString,
                    fields = fields
                ]
            ])),
            workItemsTable = Table.FromList(Source[value], Splitter.SplitByNothing(), null, null, ExtraValues.Error),
            expandedTable = Table.ExpandRecordColumn(workItemsTable, "Column1", {"id", "rev", "fields", "url"}, {"ID", "Revision", "Fields", "URL"}),
            fieldNames = if Table.IsEmpty(expandedTable) then {} else Record.FieldNames(expandedTable{0}[Fields]),
            fullyExpandedTable = Table.ExpandRecordColumn(expandedTable, "Fields", fieldNames, fieldNames)
        in
            fullyExpandedTable,
    // Fetch work items for each batch and combine the results
    FetchResults = List.Transform(Batches, each GetWorkItemsForBatch(_)),
    CombinedResults = Table.Combine(FetchResults)
in
    CombinedResults
</code></pre>
<p>This illustrative example simplifies the process into a single, cohesive query, demonstrating the potential to streamline the integration without relying on premium features.</p>
<h2 id="overcoming-obstacles">Overcoming Obstacles</h2>
<p>This journey wasn’t without its trials, particularly around authentication and the nuances of the M language for dataflows. The solution required creative thinking, such as using Power BI dataflows as a workaround for Dataverse’s authentication limitations and meticulously crafting M code to interact with the ADO API within its constraints.</p>
<h2 id="conclusion">Conclusion</h2>
<p>By bridging Azure DevOps and Microsoft Dataverse with Power BI Dataflows, we’ve established a robust process for syncing work item updates for enhanced reporting and insight into project requirements. This solution not only addresses the immediate need but also offers a template for similar challenges, showcasing the flexibility and power of Microsoft’s ecosystem when it comes to custom integrations.</p>
<h1>Govee and Power Platform: Transforming Smart Lighting Automation</h1>
<p>(Posted March 6, 2024)</p>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/bb05d532-19a7-4236-bb4e-e91e169a1997" alt="Govee and Power Platform: Transforming Smart Lighting Automation"></p>
<h2 id="introduction">Introduction</h2>
<p>In the evolving world of smart home technology, the integration of lighting solutions with sophisticated automation platforms is gaining significant traction. Our focus in this discussion is the Govee Lights Power Automate Connector, a pioneering tool that brings together the advanced capabilities of Govee’s smart lighting with the robust automation features of Microsoft Power Automate.</p>
<p>For those interested in exploring the full capabilities of this connector, including setup instructions, usage scenarios, and technical specifications, detailed information is available on the official Microsoft documentation page. You can access it here: <a href="https://learn.microsoft.com/en-us/connectors/govee/">Govee Connector for Power Automate</a>.</p>
<p>This connector not only exemplifies the practical application of integrating smart devices with automation platforms but also opens up new possibilities for enhancing home and office environments through intelligent lighting control.</p>
<h2 id="practical-integration-for-advanced-home-automation">Practical Integration for Advanced Home Automation</h2>
<p>The Govee Lights Power Automate Connector provides a pragmatic solution for controlling Govee smart lights through the Power Automate platform. By utilizing the Govee Developer API, this connector enables a straightforward yet sophisticated approach to lighting automation.</p>
<h3 id="building-the-connector-technical-insights-into-the-govee-api">Building the Connector: Technical Insights into the Govee API</h3>
<p>The development of the Govee Lights Power Automate Connector required a detailed understanding of the Govee API. This process was facilitated by two key resources:</p>
<ol>
<li>
<p><strong>Postman Quickstart Guide for Govee:</strong> This guide provides a comprehensive starting point for anyone looking to understand and use the Govee API. It is especially useful for beginners or those not familiar with API interactions. The guide walks you through the basic steps of setting up API requests and testing them in Postman, making it an invaluable resource for getting up to speed quickly. Access the guide here: <a href="https://quickstarts.postman.com/guide/govee/#0">Postman Quickstart for Govee</a>.</p>
</li>
<li>
<p><strong>Govee Developer API Reference:</strong> For a more in-depth exploration, the official Govee Developer API Reference is the go-to document. It contains detailed information on the API’s capabilities, parameters, response structures, and more. This comprehensive document is essential for developers looking to fully exploit the features of the Govee API in their applications. You can find the official documentation here: <a href="https://govee-public.s3.amazonaws.com/developer-docs/GoveeDeveloperAPIReference.pdf">Govee Developer API Reference</a>.</p>
</li>
</ol>
<p>Using these resources, we were able to effectively understand and harness the full potential of Govee’s API, paving the way for the creation of this connector. This knowledge allowed us to build a tool that can interact seamlessly with Govee lights, providing users with an enhanced level of control and automation possibilities.</p>
<h3 id="key-features">Key Features</h3>
<ol>
<li><strong>Direct Device Control:</strong> The connector allows users to manage their Govee lights through Power Automate, offering functionalities like brightness adjustment and color change.</li>
<li><strong>Device Information Access:</strong> This feature provides essential information about Govee devices, such as the MAC address, model, and supported properties. It’s vital for identifying the specific requirements to execute commands on the devices.</li>
</ol>
<h2 id="utilizing-power-automate-for-smarter-lighting-control">Utilizing Power Automate for Smarter Lighting Control</h2>
<p>The integration of this connector with Power Automate’s automation tools enhances the control and customization of home lighting. It allows for the efficient management of lights as part of a broader automated home system.</p>
<h3 id="application-example-device-information-retrieval-and-command-execution">Application Example: Device Information Retrieval and Command Execution</h3>
<p>Consider a practical application of this integration:</p>
<ol>
<li>
<p><strong>Retrieve Device Information:</strong> Create a flow in Power Automate to gather the necessary details about your Govee devices.<br>
<img src="https://github.com/rwilson504/Blogger/assets/7444929/7a48f824-cf40-4542-8792-097910903def" alt="image"></p>
</li>
<li>
<p><strong>Process and Implement Data:</strong> Use this information to prepare and structure the commands for the intended devices.</p>
</li>
<li>
<p><strong>Command Execution:</strong> Utilize this setup to execute desired actions, such as turning lights on or off and adjusting brightness.<br>
<img src="https://github.com/rwilson504/Blogger/assets/7444929/ab7cd4b6-869b-4d78-be5a-cbc6cd289d30" alt="image"></p>
</li>
<li>
<p><strong>Refine and Integrate Your Automation:</strong> Test and modify your flow for optimal performance, potentially incorporating other smart automation triggers.</p>
</li>
</ol>
<h3 id="examples-of-power-automate-triggers-for-govee-lights">Examples of Power Automate Triggers for Govee Lights</h3>
<ol>
<li>
<p><strong>Time-Based Lighting Control:</strong></p>
<ul>
<li><strong>Trigger:</strong> Schedule (Time-based trigger in Power Automate).</li>
<li><strong>Action:</strong> Automatically dim Govee lights in the evening and brighten them in the morning.</li>
</ul>
</li>
<li>
<p><strong>Calendar Event-Driven Lighting:</strong></p>
<ul>
<li><strong>Trigger:</strong> Calendar event start (Outlook Calendar trigger).</li>
<li><strong>Action:</strong> Change Govee light colors for different types of events.</li>
</ul>
</li>
<li>
<p><strong>Email Notification Response:</strong></p>
<ul>
<li><strong>Trigger:</strong> New email from a specific sender (Email trigger in Power Automate).</li>
<li><strong>Action:</strong> Flash or change the color of Govee lights for priority emails.</li>
</ul>
</li>
<li>
<p><strong>Weather-Based Lighting Adjustments:</strong></p>
<ul>
<li><strong>Trigger:</strong> Weather forecast change (Weather connectors or APIs).</li>
<li><strong>Action:</strong> Adjust light brightness or color temperature in response to weather.</li>
</ul>
</li>
<li>
<p><strong>IoT Device Interaction:</strong></p>
<ul>
<li><strong>Trigger:</strong> IoT device status change (Smart home IoT triggers).</li>
<li><strong>Action:</strong> Modify Govee light brightness with changes in room temperature.</li>
</ul>
</li>
<li>
<p><strong>Social Media Alerts:</strong></p>
<ul>
<li><strong>Trigger:</strong> New social media mention or message (Social media triggers).</li>
<li><strong>Action:</strong> Change light color or pattern for notifications.</li>
</ul>
</li>
<li>
<p><strong>Motion-Triggered Lighting:</strong></p>
<ul>
<li><strong>Trigger:</strong> Motion sensor activation (Smart home motion sensors).</li>
<li><strong>Action:</strong> Turn on or adjust Govee lights when motion is detected.</li>
</ul>
</li>
<li>
<p><strong>Custom User Input:</strong></p>
<ul>
<li><strong>Trigger:</strong> Button press on a mobile app or physical button (Power Automate Button trigger).</li>
<li><strong>Action:</strong> Switch Govee lights to a pre-set scene or color palette.</li>
</ul>
</li>
</ol>
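<p>Under the hood, connectors like this one typically call Govee’s public developer REST API. The sketch below is an assumption for illustration, not the connector’s actual implementation: the endpoint, <code>Govee-API-Key</code> header, and command schema follow Govee’s public v1 API and may have changed since this was written.</p>

```python
import json
import urllib.request

GOVEE_API_BASE = "https://developer-api.govee.com/v1"  # public Govee API (assumed)

def build_control_command(device: str, model: str, name: str, value) -> dict:
    """Build the JSON body for PUT /v1/devices/control (turn on/off, brightness, etc.)."""
    return {"device": device, "model": model, "cmd": {"name": name, "value": value}}

def send_command(api_key: str, payload: dict) -> int:
    """Send a control command; the API authenticates via a Govee-API-Key header."""
    req = urllib.request.Request(
        f"{GOVEE_API_BASE}/devices/control",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Govee-API-Key": api_key, "Content-Type": "application/json"},
        method="PUT",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Example payloads mirroring the trigger/action pairs above
# (device MAC and model are placeholders)
turn_on = build_control_command("AA:BB:CC:DD:EE:FF:00:11", "H6159", "turn", "on")
dim = build_control_command("AA:BB:CC:DD:EE:FF:00:11", "H6159", "brightness", 30)
```

<p>Each of the Power Automate triggers listed above would end with an action shaped like one of these payloads.</p>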
<h2 id="conclusion">Conclusion</h2>
<p>The Govee Lights Power Automate Connector represents a focused approach to enhancing home lighting systems through automation. It stands as a practical example of how smart technology can be effectively integrated into daily life using tools like Microsoft Power Automate.</p>
<p>Explore the functionality and benefits of this connector for a more efficient and tailored smart home lighting experience.</p>
Step-by-Step to Success: Run AutoGPT using Azure OpenAI on Docker<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/105d93ce-4983-4c37-9fc2-bbf9dbc6be3e" alt="Step-by-Step to Success: Run AutoGPT using Azure OpenAI on Docker"></p>
<p>Integrating AutoGPT with Azure OpenAI through Docker offers a direct path to unlocking advanced AI capabilities. This guide walks through the initial setup and configuration steps and highlights the adjustments required to make AutoGPT work seamlessly with Azure OpenAI services.</p>
<h2 id="what-is-autogpt">What is AutoGPT?</h2>
<p><a href="https://github.com/Significant-Gravitas/AutoGPT">AutoGPT</a> is like having a smart robot buddy that helps you achieve a specific goal by chatting with a genius AI on your behalf. Here’s how it works, broken down really simply:</p>
<ol>
<li>
<p><strong>You Set a Goal</strong>: Imagine you have a goal, like planning a surprise birthday party or learning about space. You tell this to your robot buddy.</p>
</li>
<li>
<p><strong>The Robot Starts the Chat</strong>: Your robot buddy kicks things off by asking the first question to the genius AI, aiming to get information or ideas related to your goal.</p>
</li>
<li>
<p><strong>Listening and Thinking</strong>: After getting an answer, the robot thinks about it, figures out if it’s helpful, and what to ask next to get closer to your goal.</p>
</li>
<li>
<p><strong>Asking More Questions</strong>: Based on what the genius AI says, the robot keeps the conversation going, asking more questions to dig deeper or get more specific information, all aimed at reaching your goal.</p>
</li>
<li>
<p><strong>Goal Achieved</strong>: This back-and-forth chat continues until your robot buddy has gathered enough info or ideas to help you meet your goal, like having a full plan for that surprise party or a good understanding of space.</p>
</li>
</ol>
<p>In short, AutoGPT is like a helpful middleman between you and a super-smart AI, doing all the talking and thinking for you, so you don’t have to come up with what to ask next. It makes getting to your goal easier by handling the conversation, making sure everything stays on track.</p>
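<p>The loop described above can be sketched in a few lines of Python. This is a simplified illustration of the pattern, not AutoGPT’s actual code; <code>ask_llm</code> stands in for whatever chat-completion call the agent uses, and the “GOAL COMPLETE” sentinel is a made-up convention for this sketch.</p>

```python
def run_agent(goal: str, ask_llm, max_turns: int = 10) -> list:
    """Drive a goal-oriented conversation: ask, evaluate the reply, ask again,
    and stop when the model signals the goal is met or turns run out."""
    transcript = []
    prompt = f"Goal: {goal}. What is the first step?"
    for _ in range(max_turns):
        reply = ask_llm(prompt)                 # 2. the robot asks the genius AI
        transcript.append((prompt, reply))
        if "GOAL COMPLETE" in reply:            # 5. goal achieved, stop
            break
        # 3-4. think about the answer and form the next question
        prompt = f"Given: {reply}. What should we do next toward: {goal}?"
    return transcript

# A stub LLM that "finishes" on the third exchange
replies = iter(["gather ideas", "draft a plan", "GOAL COMPLETE"])
history = run_agent("plan a surprise party", lambda p: next(replies))
```

<p>The key design point is that the human supplies only the goal; the loop owns prompt construction, evaluation, and the stopping decision.</p>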
<h2 id="detailed-configuration-steps-for-integrating-autogpt-with-azure-openai">Detailed Configuration Steps for Integrating AutoGPT with Azure OpenAI</h2>
<h3 id="initial-setup">Initial Setup</h3>
<ol>
<li>
<p>Install <a href="https://www.docker.com/get-started/">Docker</a></p>
</li>
<li>
<p><strong>Fork and Clone the AutoGPT Repository</strong>: Begin by forking the <a href="https://github.com/Significant-Gravitas/AutoGPT">AutoGPT repository on GitHub</a> and cloning it to your local machine, for instance, at <code>C:\Auto-GPT</code>.</p>
</li>
</ol>
<h3 id="configuration">Configuration</h3>
<ol start="2">
<li>
<p><strong>Environment Setup</strong>:</p>
<ul>
<li>Copy the <code>.env.template</code> file from <code>C:\Auto-GPT\autogpts\autogpt</code> to the primary folder <code>C:\Auto-GPT</code> and rename it to <code>.env</code>.</li>
<li>Edit the <code>.env</code> file, setting <code>USE_AZURE=True</code> to enable Azure OpenAI integration. Ensure <code>True</code> is capitalized to avoid issues.</li>
</ul>
</li>
<li>
<p><strong>API Key Configuration</strong>:</p>
<ul>
<li>
<p>Update the <code>OPENAI_API_KEY</code> in the <code>.env</code> file with your Azure OpenAI API key, found in the Azure portal under your OpenAI service’s “Keys and Endpoints”.</p>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/2f3968ad-27cb-4266-9b91-07d116908595" alt="image"></p>
</li>
</ul>
</li>
<li>
<p><strong>Docker and Azure YAML Setup</strong>:</p>
<ul>
<li>
<p>Copy the <code>azure.yaml.template</code> file to <code>C:\Auto-GPT</code> and rename it to <code>azure.yaml</code>; we will adjust it later with our Azure OpenAI service details.</p>
</li>
<li>
<p>Create a <code>docker-compose.yml</code> file in <code>C:\Auto-GPT</code> using the Docker setup template from the AutoGPT documentation. Add the following mapping to the volumes section to prevent a <code>/app/azure.yaml</code> file-not-found error:</p>
<p>Volume Sample:</p>
<pre class=" language-yaml"><code class="prism language-yaml"><span class="token key atrule">volumes</span><span class="token punctuation">:</span>
<span class="token punctuation">-</span> ./azure.yaml<span class="token punctuation">:</span>/app/azure.yaml
</code></pre>
<p>Entire docker-compose:</p>
<pre class=" language-yaml"><code class="prism language-yaml"> <span class="token key atrule">version</span><span class="token punctuation">:</span> <span class="token string">"3.9"</span>
<span class="token key atrule">services</span><span class="token punctuation">:</span>
<span class="token key atrule">auto-gpt</span><span class="token punctuation">:</span>
<span class="token key atrule">image</span><span class="token punctuation">:</span> significantgravitas/auto<span class="token punctuation">-</span>gpt
<span class="token key atrule">env_file</span><span class="token punctuation">:</span>
<span class="token punctuation">-</span> .env
<span class="token key atrule">ports</span><span class="token punctuation">:</span>
<span class="token punctuation">-</span> "8000<span class="token punctuation">:</span>8000" <span class="token comment"># remove this if you just want to run a single agent in TTY mode</span>
<span class="token key atrule">profiles</span><span class="token punctuation">:</span> <span class="token punctuation">[</span><span class="token string">"exclude-from-up"</span><span class="token punctuation">]</span>
<span class="token key atrule">volumes</span><span class="token punctuation">:</span>
<span class="token punctuation">-</span> ./data<span class="token punctuation">:</span>/app/data
<span class="token comment">## allow auto-gpt to write logs to disk</span>
<span class="token punctuation">-</span> ./logs<span class="token punctuation">:</span>/app/logs
<span class="token comment">## allow auto-gpt to read the azure yaml file</span>
<span class="token punctuation">-</span> ./azure.yaml<span class="token punctuation">:</span>/app/azure.yaml
<span class="token comment">## uncomment following lines if you want to make use of these files</span>
<span class="token comment">## you must have them existing in the same folder as this docker-compose.yml</span>
<span class="token comment">#- type: bind</span>
<span class="token comment"># source: ./ai_settings.yaml</span>
<span class="token comment"># target: /app/ai_settings.yaml</span>
<span class="token comment">#- type: bind</span>
<span class="token comment"># source: ./prompt_settings.yaml</span>
<span class="token comment"># target: /app/prompt_settings.yaml</span>
</code></pre>
</li>
</ul>
</li>
</ol>
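<p>Since a lowercase <code>true</code> in <code>USE_AZURE</code> is an easy mistake (as noted above), a quick sanity check of the <code>.env</code> file can save a failed container run. A minimal sketch; the file path and the two required keys are the ones used in this article, not an official AutoGPT tool:</p>

```python
from pathlib import Path

def check_env(text: str) -> list:
    """Return a list of problems found in AutoGPT .env content."""
    values = {}
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, _, val = line.partition("=")
            values[key.strip()] = val.strip()
    problems = []
    if values.get("USE_AZURE") != "True":
        problems.append("USE_AZURE must be exactly 'True' (capital T)")
    if not values.get("OPENAI_API_KEY"):
        problems.append("OPENAI_API_KEY is not set")
    return problems

# Usage: check_env(Path(r"C:\Auto-GPT\.env").read_text())
```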
<h3 id="azure-ai-models-deployment">Azure AI Models Deployment</h3>
<ol start="5">
<li><strong>Deploy Azure AI Models</strong>:
<ul>
<li>
<p>Use <a href="https://oai.azure.com/portal">Azure AI Studio</a> to deploy the necessary models, such as <code>gpt-4</code>, <code>gpt-35-turbo</code>, and <code>text-embedding-ada-002</code>, setting deployment names to match the model names for simplicity.</p>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/51e4ed6d-ffe3-4bc1-9acb-2fb43f47528b" alt="image"></p>
</li>
</ul>
</li>
</ol>
<h3 id="final-adjustments">Final Adjustments</h3>
<ol start="6">
<li>
<p><strong>Modify the <code>azure.yaml</code> File</strong>:</p>
<ul>
<li>
<p>Set <code>azure_api_type</code> to <code>azure</code> to use the API key for authentication. If you want to use Azure AD instead, set the parameter to <code>azure_ad</code>; this also requires using an auth token as your <code>OPENAI_API_KEY</code>. Instructions on how to obtain this token can be found in <a href="https://gist.github.com/primaryobjects/523577860628974501ffd3c52cd73525">How to Configure AutoGPT with Azure OpenAI Active Directory Managed Identity</a>.</p>
</li>
<li>
<p>The <code>azure_api_base</code> and <code>azure_api_version</code> were determined using the <a href="https://oai.azure.com/portal">Azure AI Studio’s</a> chat playground “View code” feature.</p>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/d5888573-532e-4f7e-880d-84280ec2e80c" alt="image"></p>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/770210c4-1aa1-4d57-8aa1-cb7b7de7a386" alt="image"></p>
</li>
<li>
<p>For <code>azure_model_map</code>, an iterative approach was taken. Initially, no mappings were specified. After running the Docker command, errors indicating missing models were used to gradually populate this section with the correct model mappings. This process involved mapping AutoGPT’s expected model names to the corresponding deployment names in Azure AI Studio.</p>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/41dc5ec2-a20f-4518-a815-4eb57beeeef0" alt="2024-02-28_16-36-06"></p>
</li>
</ul>
<p>The complete <code>azure.yaml</code> file:</p>
<pre class=" language-yaml"><code class="prism language-yaml"><span class="token key atrule">azure_api_type</span><span class="token punctuation">:</span> azure
<span class="token key atrule">azure_api_base</span><span class="token punctuation">:</span> https<span class="token punctuation">:</span>//rawopenai.openai.azure.com/
<span class="token key atrule">azure_api_version</span><span class="token punctuation">:</span> 2024<span class="token punctuation">-</span>02<span class="token punctuation">-</span>15<span class="token punctuation">-</span>preview
<span class="token key atrule">azure_model_map</span><span class="token punctuation">:</span>
<span class="token key atrule">gpt-3.5-turbo-16k</span><span class="token punctuation">:</span> gpt<span class="token punctuation">-</span>35<span class="token punctuation">-</span>turbo
<span class="token key atrule">gpt-4</span><span class="token punctuation">:</span> gpt<span class="token punctuation">-</span><span class="token number">4</span>
<span class="token key atrule">text-embedding-3-small</span><span class="token punctuation">:</span> text<span class="token punctuation">-</span>embedding<span class="token punctuation">-</span>ada<span class="token punctuation">-</span><span class="token number">002</span>
</code></pre>
</li>
</ol>
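<p>Because the model map was filled in iteratively from error messages, it is handy to confirm that every model name AutoGPT asks for has a mapping before launching Docker. A minimal sketch that parses the flat <code>azure_model_map</code> section without third-party YAML libraries; the expected model names below are the ones from this article’s map and may differ for your AutoGPT version:</p>

```python
def parse_model_map(yaml_text: str) -> dict:
    """Extract key: value pairs indented under azure_model_map in a simple azure.yaml."""
    mapping, in_map = {}, False
    for raw in yaml_text.splitlines():
        if raw.strip() == "azure_model_map:":
            in_map = True
            continue
        if in_map:
            if raw.startswith((" ", "\t")) and ":" in raw:
                key, _, val = raw.strip().partition(":")
                mapping[key.strip()] = val.strip()
            elif raw.strip():
                break  # left the indented block
    return mapping

sample = """azure_api_type: azure
azure_model_map:
  gpt-3.5-turbo-16k: gpt-35-turbo
  gpt-4: gpt-4
  text-embedding-3-small: text-embedding-ada-002
"""
# Names AutoGPT requested in this article's error-driven iteration (assumed set)
missing = {"gpt-4", "text-embedding-3-small"} - parse_model_map(sample).keys()
```

<p>An empty <code>missing</code> set means no more “missing model” errors from that cause.</p>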
<h3 id="execution">Execution</h3>
<ol start="7">
<li><strong>Running AutoGPT</strong>:
<ul>
<li>Execute AutoGPT via Docker from the <code>C:\Auto-GPT</code> directory using the command <code>docker compose run --rm auto-gpt</code>. This step confirms the successful integration and functionality of AutoGPT with Azure OpenAI.</li>
</ul>
</li>
</ol>
<h2 id="conclusion">Conclusion</h2>
<p>AutoGPT revolutionizes our interaction with AI by automating the conversation process, guiding us toward achieving specific goals with minimal effort. This transformative approach streamlines tasks ranging from content creation to complex data analysis, making it a versatile tool for anyone looking to leverage AI’s power. The simplicity of AutoGPT, coupled with its goal-oriented methodology, democratizes access to advanced AI capabilities, enabling users to focus on outcomes rather than getting bogged down in the technicalities of prompt engineering.</p>
<p>Through this article, we’ve provided a detailed blueprint for integrating AutoGPT with Azure OpenAI, ensuring you have the knowledge to harness this innovative technology effectively. Whether you’re a seasoned developer or new to the world of AI, the step-by-step guide laid out here is designed to empower you to implement AutoGPT within the Azure ecosystem successfully. Embracing AutoGPT opens up a realm of possibilities, allowing you to push the boundaries of what you can achieve with AI, turning complex tasks into manageable, goal-driven projects.</p>
Power Automate Blueprint Accessing Azure Portal Backend APIs and the Intricacies of main.iam.ad.ext.azure.com<p><img src="https://github.com/rwilson504/power-automate-azure-portal-hidden-api/assets/7444929/6f67a1f1-2456-48d0-8638-93fa3cabb301" alt="Power Automate Blueprint Accessing Azure Portal Backend APIs and the Intricacies of main.iam.ad.ext.azure.com"></p>
<p>In the realms of digital infrastructure management, automation emerges as a pivotal ally, especially when confronting repetitive and time-sensitive tasks. A recent endeavor led me to a scenario where automating the management of OAuth tokens for users within our organization was paramount. Our meticulous record-keeping of these tokens and their respective assignments is handled through Power Apps. However, the manual aspect of adding these tokens via the Azure portal, which necessitated the upload of a CSV file each time, posed a cumbersome challenge.</p>
<p>Given the current preview status of this functionality, a straightforward method through Graph API was conspicuously absent. Thus, I aimed to devise an automated framework, enabling individuals with the appropriate permissions in Power Apps to seamlessly add these tokens for users. While this wouldn’t entirely absolve admins of their duties—they would still need to activate the tokens within the Azure portal—it significantly mitigated the manual labor involved in creating and uploading the CSV file.</p>
<p>My exploration revealed that during a manual CSV file upload, an API located at <a href="http://main.iam.ad.ext.azure.com">main.iam.ad.ext.azure.com</a> is triggered in the background. However, this API is a closed door, accessible exclusively via the Azure portal and not through an app registration. This discovery beckoned a deeper dive to harness this API for automating the OAuth token management chore.</p>
<p>This trail led to the crafting of a solution using Power Automate, which is the focal point of this article. The scripts and methodologies delineated here draw inspiration and foundational knowledge from the following insightful articles:</p>
<ul>
<li><a href="https://rozemuller.com/use-internal-azure-api-with-logic-apps/">Utilizing the internal main.iam.ad.ext.azure.com API with Logic Apps</a></li>
<li><a href="https://rozemuller.com/use-internal-azure-api-in-automation/#authenticate-to-mainiamadextazurecom">Employing the internal main.iam.ad.ext.azure.com API in automation</a></li>
<li><a href="https://goodworkaround.com/2020/11/27/accessing-the-backend-azure-ad-apis-behind-portal-azure-com/">Accessing the backend Azure AD APIs behind portal.azure.com</a></li>
</ul>
<p>While these articles unveiled a treasure trove of information, I envisioned extending this knowledge by showcasing how this automation could be embodied within Power Automate. This article, therefore, aspires to bridge that informational chasm and furnish a step-by-step guide on configuring the <a href="http://main.iam.ad.ext.azure.com">main.iam.ad.ext.azure.com</a> API with Power Automate to automate OAuth token management, albeit it’s crucial to note that <a href="http://main.iam.ad.ext.azure.com">main.iam.ad.ext.azure.com</a> is an unsupported API, which calls for a cautious approach in a production environment.</p>
<h2 id="service-account">Service Account</h2>
<p>Utilizing the <a href="http://main.iam.ad.ext.azure.com">main.iam.ad.ext.azure.com</a> API requires a distinct approach for authentication due to its unsupported status and exclusive accessibility via the Azure portal. Unlike many Azure services, this API doesn’t support service principal access, which is a common method for non-interactive, automated access within Azure environments.</p>
<p>To navigate this, it’s advisable to create a dedicated service account with the precise permissions required to interact with the <a href="http://main.iam.ad.ext.azure.com">main.iam.ad.ext.azure.com</a> API. This setup aligns with the principle of least privilege (PoLP), ensuring the account has only the necessary access rights, and facilitates easier auditing and monitoring of the automated processes.</p>
<p>Creating a service account is a good practice from a security standpoint. It not only provides a workaround for the lack of service principal support but also ensures that all actions carried out by the automation processes are traceable, thus enhancing the security and accountability of the setup.</p>
<p><strong>Important Note on MFA and Token Lifetime:</strong> If your Azure environment has Multi-factor authentication (MFA) requirements set up through conditional access, it can affect the token lifetime for the service account. In scenarios where this becomes a hindrance to automation processes, you may need to consider excluding or adjusting the MFA requirements for the service account. However, always ensure that any changes made align with your organization’s security policies and best practices.</p>
<h2 id="initial-key-vault-setup">Initial Key Vault Setup</h2>
<p>Azure Key Vault is a pivotal component in this setup, acting as a secure repository for storing and managing the refresh token needed for automation. It provides a centralized platform to safeguard sensitive data such as secrets, encryption keys, and certificates.</p>
<h3 id="resource-provider-registration">Resource Provider Registration</h3>
<p>Before diving into scripts or deployment files, ensure that the necessary Azure resource providers are registered within your subscription, particularly <code>Microsoft.Authorization</code> and <code>Microsoft.KeyVault</code>. These providers are crucial for creating and managing the Key Vault and setting the necessary permissions.</p>
<p><img src="https://github.com/rwilson504/power-automate-azure-portal-hidden-api/assets/7444929/3a9826cb-ecd9-42f8-91d2-df47a48ba54f" alt="image"></p>
<h3 id="deployment-options">Deployment Options</h3>
<p>There are various avenues to set up the Key Vault – you can employ Azure PowerShell, Azure CLI, or Bicep files, among other methods. In this guide, both PowerShell scripts and Bicep files are provided to offer flexibility based on your preferences or environment constraints.</p>
<h4 id="powershell-scripts">PowerShell Scripts</h4>
<p>A script named <a href="https://github.com/rwilson504/power-automate-azure-portal-hidden-api/blob/main/01-create-resources-powershell.ps1">01-create-resources-powershell.ps1</a> is provided to facilitate the creation of a resource group and a Key Vault. It’s an executable script where you just need to replace the parameters with your environment details.</p>
<pre class=" language-undefined"><code class="prism language-*.sh-session language-undefined">.\01-create-hidden-api-environment.ps1 -Location "eastus" -ResourceGroupName "hidden-resourcegroup" -KeyVaultName "hidden-kv" -TenantId "123e4567-e89b-12d3-a456-426614174000" -SubscriptionId "123e4567-e89b-12d3-a456-426614174001"
</code></pre>
<h4 id="bicep-files">Bicep Files</h4>
<p>Bicep is a declarative language for describing and deploying Azure resources, simplifying the authoring and management of Azure Resource Manager (ARM) templates. The files included in this guide are structured to deploy a resource group and a Key Vault, and they can be deployed directly from Visual Studio Code once the Bicep extension is installed.</p>
<p><img src="https://github.com/rwilson504/power-automate-azure-portal-hidden-api/assets/7444929/dbbbe922-8346-4756-94ab-49d5be154186" alt="image"></p>
<p>Then right click on the bicep file to deploy. The <code>main.parameters.json</code> file can be used to supply all the necessary parameters for deployment.</p>
<p><img src="https://github.com/rwilson504/power-automate-azure-portal-hidden-api/assets/7444929/be99a9fc-c4c7-4fc9-91fc-c78827e521ff" alt="image"></p>
<p>Both deployment methods are designed to abstract the complexities involved in setting up Azure Key Vault, providing a straightforward path to secure the refresh token necessary for automation.</p>
<h4 id="azure-portal">Azure Portal</h4>
<ul>
<li>Navigate to the Azure portal, then select “Create a resource.”</li>
<li>In the “Search the Marketplace” field, type “Key Vault” and select it from the list.</li>
<li>Click the “Create” button to open the “Create Key Vault” blade.</li>
<li>Fill in the necessary fields such as the Name, Subscription, Resource Group, and Location.</li>
<li>Review the other settings and adjust them according to your preferences or requirements.</li>
<li>Click the “Review + create” button, review your settings one final time, and then click “Create” to deploy the Key Vault.</li>
</ul>
<h2 id="acquiring-the-initial-refresh-token">Acquiring the Initial Refresh Token</h2>
<p>For the automation to function seamlessly, acquiring a refresh token initially is crucial. This token will serve as the bridge for obtaining new access tokens, allowing the automation to interact with the <a href="http://main.iam.ad.ext.azure.com">main.iam.ad.ext.azure.com</a> API securely.</p>
<p>A PowerShell script named <a href="https://github.com/rwilson504/power-automate-azure-portal-hidden-api/blob/main/02-set-initial-refresh-token.ps1">02-set-initial-refresh-token.ps1</a> is provided to facilitate this process. When executed, this script will prompt you to sign in twice. The first login should be performed with an account that possesses the requisite permissions to update the Key Vault. Following this, a second login prompt will appear. This is where you would log in using the service account created earlier, which has been configured with just the necessary rights to interact with the Azure API. This dual-login mechanism ensures that the process is securely handled, aligning with the principle of least privilege while setting the stage for automated interactions with the <a href="http://main.iam.ad.ext.azure.com">main.iam.ad.ext.azure.com</a> API. The example below shows you how to run the script, replace the parameters with your environment details.</p>
<pre class=" language-undefined"><code class="prism language-*.sh-session language-undefined">.\02-set-initial-refresh-token.ps1 -TenantId "123e4567-e89b-12d3-a456-426614174000" -SubscriptionId "123e4567-e89b-12d3-a456-426614174001" -KeyVaultName "hidden-kv"
</code></pre>
<h2 id="automation-framework-setup">Automation Framework Setup</h2>
<p>Embarking on the automation of OAuth token management necessitates a structured setup within Power Automate and Azure. The steps below delineate the essential configurations:</p>
<h3 id="create-app-registration">Create App Registration</h3>
<ul>
<li>There’s no need to add any API roles.</li>
<li>Generate a new secret, copying the secret value for safekeeping, as it will be crucial when creating the connection for Key Vault in Power Automate.</li>
<li>Also copy the Application (Client) ID and the Directory (tenant) ID, which will be required for the connection within Power Automate.</li>
</ul>
<p><img src="https://github.com/rwilson504/power-automate-azure-portal-hidden-api/assets/7444929/56a79799-527e-41ff-8396-98ed6e1f084f" alt="image"></p>
<h3 id="install-the-azure-key-vault-custom-connectors">Install the Azure Key Vault Custom Connectors</h3>
<p>Either install the Azure Key Vault custom connectors solution <a href="https://github.com/rwilson504/power-automate-azure-portal-hidden-api/raw/main/power-platform-solutions/AzurePortalAPICustomConnector_1_0_0_2.zip">AzurePortalAPICustomConnector_1_0_0_2</a>, or follow the installation guidelines provided <a href="https://github.com/Microsoft/PowerPlatformConnectors/tree/master/custom-connectors/AzureKeyVault">here</a>. Utilizing a custom connector is vital since the certified connector for Key Vault doesn’t allow writing secrets back to the Key Vault, a feature requisite for this solution.</p>
<ul>
<li>Update the host URL to reflect your new Key Vault.</li>
</ul>
<p><img src="https://github.com/rwilson504/power-automate-azure-portal-hidden-api/assets/7444929/e6291654-ecc1-4915-b729-95c8f6b0e3d6" alt="image"></p>
<ul>
<li>Revise the security section:</li>
<li>Ensure the Enable Service Principal support option is selected.</li>
<li>Input the details copied from the app registration you created.
<ul>
<li>For the authentication URL, use:
<ul>
<li>Azure Cloud: <code>https://login.microsoftonline.com</code></li>
<li>Azure Government: <code>https://login.microsoftonline.us</code></li>
</ul>
</li>
<li>For the Resource URL use:
<ul>
<li>Azure Cloud: <code>https://vault.azure.net</code></li>
<li>Azure Government: <code>https://vault.usgovcloudapi.net</code></li>
</ul>
</li>
</ul>
</li>
</ul>
<p><img src="https://github.com/rwilson504/power-automate-azure-portal-hidden-api/assets/7444929/3a05e59e-d23e-40f5-8cad-bfbae5a11f60" alt="image"></p>
<ul>
<li>Copy the Redirect URL from the security section, to be added within the app registration later.</li>
</ul>
<p><img src="https://github.com/rwilson504/power-automate-azure-portal-hidden-api/assets/7444929/0dd6016d-b327-408e-8133-f67648bc4ce4" alt="image"></p>
<h3 id="update-app-registration">Update App Registration</h3>
<p>Incorporate the Power Automate redirect URL into the app registration.</p>
<p><img src="https://github.com/rwilson504/power-automate-azure-portal-hidden-api/assets/7444929/d060a313-94b5-4cc1-994f-dce807295d1b" alt="image"></p>
<h3 id="key-vault-access-management">Key Vault Access Management</h3>
<p>Designate the app registration as a Key Vault Secrets Officer in Key Vault, ensuring the requisite permissions for reading and writing secrets are granted.</p>
<p><img src="https://github.com/rwilson504/power-automate-azure-portal-hidden-api/assets/7444929/537e6ab1-3a38-4642-a883-f0868d35c17d" alt="image"></p>
<h2 id="power-automate-flow">Power Automate Flow</h2>
<p>The culmination of our setup is the creation of a Power Automate flow, designed to automate the process of managing OAuth tokens. Initially, the flow retrieves the existing refresh token from Key Vault, utilizes it to log in the service account, and upon a successful login, acquires a new refresh token alongside an access token. The new refresh token is then stored back in Key Vault, extending its lifespan, while the access token is employed to interact with the <code>main.iam.ad.ext.azure.com</code> API.</p>
<p>For a hands-on experience, you can download a solution containing the sample Flow from here: <a href="https://github.com/rwilson504/power-automate-azure-portal-hidden-api/raw/main/power-platform-solutions/AzurePortalAPIFlowExample_1_0_0_4.zip">AzurePortalAPIFlowExample_1_0_0_4.zip</a>.</p>
<p><strong>Important Note:</strong> Be sure to secure all input and output parameters within any actions that handle the refresh token or access token, to safeguard against unauthorized access.</p>
<p><img src="https://github.com/rwilson504/power-automate-azure-portal-hidden-api/assets/7444929/d35adbdc-446b-416a-a609-2d5472efef6a" alt="image"></p>
<h3 id="get-refresh-token">Get Refresh Token</h3>
<p>The journey begins with the ‘Get secret’ action, courtesy of our custom Key Vault connector, fetching the refresh token from Key Vault to set the stage for the login process.</p>
<p><img src="https://github.com/rwilson504/power-automate-azure-portal-hidden-api/assets/7444929/7994e384-b765-434c-9358-9e10810219c7" alt="image"></p>
<p>Upon adding this action to your flow, select <code>Service Principal Connection</code> as the authentication type, and furnish the details from the app registration created earlier.</p>
<p><img src="https://github.com/rwilson504/power-automate-azure-portal-hidden-api/assets/7444929/6c09cffc-f8f3-467c-a4e7-2434c7a8c232" alt="image"></p>
<h3 id="parse-json-from-refresh-token">Parse JSON from Refresh Token</h3>
<p>Simplify the subsequent steps by parsing the Key Vault response using the ‘Parse JSON’ action.</p>
<p><img src="https://github.com/rwilson504/power-automate-azure-portal-hidden-api/assets/7444929/37600217-adf0-401f-b272-63eb9ebdabc6" alt="image"></p>
<p>The following schema can be copy/pasted into this action.</p>
<pre class=" language-json"><code class="prism language-json"><span class="token punctuation">{</span>
<span class="token string">"properties"</span><span class="token punctuation">:</span> <span class="token punctuation">{</span>
<span class="token string">"attributes"</span><span class="token punctuation">:</span> <span class="token punctuation">{</span>
<span class="token string">"properties"</span><span class="token punctuation">:</span> <span class="token punctuation">{</span>
<span class="token string">"created"</span><span class="token punctuation">:</span> <span class="token punctuation">{</span>
<span class="token string">"type"</span><span class="token punctuation">:</span> <span class="token string">"integer"</span>
<span class="token punctuation">}</span><span class="token punctuation">,</span>
<span class="token string">"enabled"</span><span class="token punctuation">:</span> <span class="token punctuation">{</span>
<span class="token string">"type"</span><span class="token punctuation">:</span> <span class="token string">"boolean"</span>
<span class="token punctuation">}</span><span class="token punctuation">,</span>
<span class="token string">"recoverableDays"</span><span class="token punctuation">:</span> <span class="token punctuation">{</span>
<span class="token string">"type"</span><span class="token punctuation">:</span> <span class="token string">"integer"</span>
<span class="token punctuation">}</span><span class="token punctuation">,</span>
<span class="token string">"recoveryLevel"</span><span class="token punctuation">:</span> <span class="token punctuation">{</span>
<span class="token string">"type"</span><span class="token punctuation">:</span> <span class="token string">"string"</span>
<span class="token punctuation">}</span><span class="token punctuation">,</span>
<span class="token string">"updated"</span><span class="token punctuation">:</span> <span class="token punctuation">{</span>
<span class="token string">"type"</span><span class="token punctuation">:</span> <span class="token string">"integer"</span>
<span class="token punctuation">}</span>
<span class="token punctuation">}</span><span class="token punctuation">,</span>
<span class="token string">"type"</span><span class="token punctuation">:</span> <span class="token string">"object"</span>
<span class="token punctuation">}</span><span class="token punctuation">,</span>
<span class="token string">"id"</span><span class="token punctuation">:</span> <span class="token punctuation">{</span>
<span class="token string">"type"</span><span class="token punctuation">:</span> <span class="token string">"string"</span>
<span class="token punctuation">}</span><span class="token punctuation">,</span>
<span class="token string">"value"</span><span class="token punctuation">:</span> <span class="token punctuation">{</span>
<span class="token string">"type"</span><span class="token punctuation">:</span> <span class="token string">"string"</span>
<span class="token punctuation">}</span>
<span class="token punctuation">}</span><span class="token punctuation">,</span>
<span class="token string">"type"</span><span class="token punctuation">:</span> <span class="token string">"object"</span>
<span class="token punctuation">}</span>
</code></pre>
<h3 id="login">Login</h3>
<p>The ‘HTTP’ action sends a POST request to the Azure AD OAuth token endpoint to obtain the authorization token required for Azure API interactions.</p>
<p><img src="https://github.com/rwilson504/power-automate-azure-portal-hidden-api/assets/7444929/fa7a43fe-4ea2-48b3-87ee-f900e0830ea1" alt="image"></p>
<p>URI (Replace {tenant-id} with your own tenant Id):</p>
<ul>
<li>Azure Cloud: <code>https://login.windows.net/{tenant-id}/oauth2/token</code></li>
<li>Azure Government: <code>https://login.microsoftonline.us/{tenant-id}/oauth2/token</code></li>
</ul>
<p>Header:</p>
<p><code>content-type</code>:<code>application/x-www-form-urlencoded</code></p>
<p>Body:<br>
-Azure Cloud:</p>
<pre class=" language-undefined"><code class="prism language-*.txt language-undefined">resource=74658136-14ec-4630-ad9b-26e160ff0fc6&grant_type=refresh_token&refresh_token=@{body('Parse_JSON_From_Refresh_Token')?['value']}&client_id=1950a258-227b-4e31-a9cf-717495945fc2&scope=openid
</code></pre>
<p>-Azure Government:</p>
<pre class=" language-undefined"><code class="prism language-*.txt language-undefined">resource=ee62de39-b9b0-4886-aa58-08b89c4e3db3&grant_type=refresh_token&refresh_token=@{body('Parse_JSON_From_Refresh_Token')?['value']}&client_id=1950a258-227b-4e31-a9cf-717495945fc2&scope=openid
</code></pre>
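<p>Outside of Power Automate, the same refresh-token grant can be sketched in Python. This is a minimal, illustrative example: the tenant ID and refresh token are placeholders, while the <code>resource</code> and <code>client_id</code> GUIDs are the Azure Cloud values shown above.</p>

```python
from urllib.parse import urlencode

def build_token_request(tenant_id: str, refresh_token: str):
    """Return the Azure Cloud token endpoint URL and the form-encoded body."""
    url = f"https://login.windows.net/{tenant_id}/oauth2/token"
    body = urlencode({
        "resource": "74658136-14ec-4630-ad9b-26e160ff0fc6",   # Azure portal API (Azure Cloud)
        "grant_type": "refresh_token",
        "refresh_token": refresh_token,
        "client_id": "1950a258-227b-4e31-a9cf-717495945fc2",  # well-known public client
        "scope": "openid",
    })
    return url, body

# Placeholder values; POST the body with content-type: application/x-www-form-urlencoded.
url, body = build_token_request("00000000-0000-0000-0000-000000000000", "eyJ0...")
```

<p>For Azure Government, swap in the <code>login.microsoftonline.us</code> endpoint and the government <code>resource</code> GUID from the body above.</p>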
<h3 id="parse-json-from-login">Parse JSON from Login</h3>
<p>Dive into the login response to extract the access and refresh tokens using the ‘Parse JSON’ action.</p>
<p><img src="https://github.com/rwilson504/power-automate-azure-portal-hidden-api/assets/7444929/1cf15732-a132-4b80-8980-f1737ee4719d" alt="image"></p>
<p>Utilize this schema:</p>
<pre class=" language-json"><code class="prism language-json"><span class="token punctuation">{</span>
<span class="token string">"properties"</span><span class="token punctuation">:</span> <span class="token punctuation">{</span>
<span class="token string">"access_token"</span><span class="token punctuation">:</span> <span class="token punctuation">{</span>
<span class="token string">"type"</span><span class="token punctuation">:</span> <span class="token string">"string"</span>
<span class="token punctuation">}</span><span class="token punctuation">,</span>
<span class="token string">"expires_in"</span><span class="token punctuation">:</span> <span class="token punctuation">{</span>
<span class="token string">"type"</span><span class="token punctuation">:</span> <span class="token string">"string"</span>
<span class="token punctuation">}</span><span class="token punctuation">,</span>
<span class="token string">"expires_on"</span><span class="token punctuation">:</span> <span class="token punctuation">{</span>
<span class="token string">"type"</span><span class="token punctuation">:</span> <span class="token string">"string"</span>
<span class="token punctuation">}</span><span class="token punctuation">,</span>
<span class="token string">"ext_expires_in"</span><span class="token punctuation">:</span> <span class="token punctuation">{</span>
<span class="token string">"type"</span><span class="token punctuation">:</span> <span class="token string">"string"</span>
<span class="token punctuation">}</span><span class="token punctuation">,</span>
<span class="token string">"foci"</span><span class="token punctuation">:</span> <span class="token punctuation">{</span>
<span class="token string">"type"</span><span class="token punctuation">:</span> <span class="token string">"string"</span>
<span class="token punctuation">}</span><span class="token punctuation">,</span>
<span class="token string">"id_token"</span><span class="token punctuation">:</span> <span class="token punctuation">{</span>
<span class="token string">"type"</span><span class="token punctuation">:</span> <span class="token string">"string"</span>
<span class="token punctuation">}</span><span class="token punctuation">,</span>
<span class="token string">"not_before"</span><span class="token punctuation">:</span> <span class="token punctuation">{</span>
<span class="token string">"type"</span><span class="token punctuation">:</span> <span class="token string">"string"</span>
<span class="token punctuation">}</span><span class="token punctuation">,</span>
<span class="token string">"refresh_token"</span><span class="token punctuation">:</span> <span class="token punctuation">{</span>
<span class="token string">"type"</span><span class="token punctuation">:</span> <span class="token string">"string"</span>
<span class="token punctuation">}</span><span class="token punctuation">,</span>
<span class="token string">"resource"</span><span class="token punctuation">:</span> <span class="token punctuation">{</span>
<span class="token string">"type"</span><span class="token punctuation">:</span> <span class="token string">"string"</span>
<span class="token punctuation">}</span><span class="token punctuation">,</span>
<span class="token string">"scope"</span><span class="token punctuation">:</span> <span class="token punctuation">{</span>
<span class="token string">"type"</span><span class="token punctuation">:</span> <span class="token string">"string"</span>
<span class="token punctuation">}</span><span class="token punctuation">,</span>
<span class="token string">"token_type"</span><span class="token punctuation">:</span> <span class="token punctuation">{</span>
<span class="token string">"type"</span><span class="token punctuation">:</span> <span class="token string">"string"</span>
<span class="token punctuation">}</span>
<span class="token punctuation">}</span><span class="token punctuation">,</span>
<span class="token string">"type"</span><span class="token punctuation">:</span> <span class="token string">"object"</span>
<span class="token punctuation">}</span>
</code></pre>
<h3 id="set-new-refresh-token">Set new Refresh Token</h3>
<p>With a new refresh token in hand, update Key Vault using the ‘Create or update secret value’ action via the AzureKeyVault custom connector, thereby extending the token’s lifespan.</p>
<p><img src="https://github.com/rwilson504/power-automate-azure-portal-hidden-api/assets/7444929/aad5bb7d-c6a2-4ffa-8d5b-b8b6d743c4a3" alt="image"></p>
<h3 id="http-to-azure-api">HTTP to Azure API</h3>
<p>The final step uses the access token obtained from the Login action to call the Azure API. In this illustration, we’re fetching the account SKUs.</p>
<p><img src="https://github.com/rwilson504/power-automate-azure-portal-hidden-api/assets/7444929/ec34d95a-8a36-4555-aab7-5ecb8c0dd1ab" alt="image"></p>
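<p>For reference, attaching the access token as a Bearer header can be sketched in Python. The endpoint path below is a hypothetical placeholder, not the actual hidden API route.</p>

```python
import urllib.request

# 'access_token' would come from the Parse JSON from Login step.
access_token = "<access_token from Parse JSON from Login>"
url = "https://main.iam.ad.ext.azure.com/api/PLACEHOLDER-ENDPOINT"

req = urllib.request.Request(url, headers={
    "Authorization": f"Bearer {access_token}",
    "Content-Type": "application/json",
})
# urllib.request.urlopen(req) would execute the GET.
```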
<h2 id="upload-auth-token">Upload Auth Token</h2>
<p>The primary motivation for building this solution was to automate the hardware token CSV upload that is normally performed manually in the portal. The following details show the URLs and a sample body JSON for this purpose:</p>
<p>To learn more about OATH tokens in Microsoft Entra, see the following link: <a href="https://learn.microsoft.com/en-us/entra/identity/authentication/concept-authentication-oath-tokens">Authentication methods in Microsoft Entra ID - OATH tokens</a>.</p>
<h3 id="sample-urlpayload">Sample URL/Payload</h3>
<p><strong>Upload URLs:</strong></p>
<ul>
<li>Azure Cloud: <code>https://main.iam.ad.ext.azure.com/api/MultifactorAuthentication/HardwareToken/upload</code></li>
<li>Azure Government: <code>https://main.iam.ad.ext.azure.us/api/MultifactorAuthentication/HardwareToken/upload</code></li>
</ul>
<p><strong>Body:</strong></p>
<pre class=" language-json"><code class="prism language-json"><span class="token punctuation">{</span>
<span class="token string">"content"</span><span class="token punctuation">:</span> <span class="token string">"upn,serial number,secret key,time interval,manufacturer,model\r\ntest@mydomain.com,1234567,2234567abcdef2234567abcdef,30,Contoso,HardwareKey"</span><span class="token punctuation">,</span>
<span class="token string">"id"</span><span class="token punctuation">:</span> <span class="token keyword">null</span><span class="token punctuation">,</span>
<span class="token string">"mimeType"</span><span class="token punctuation">:</span> <span class="token string">"application/vnd.ms-excel"</span><span class="token punctuation">,</span>
<span class="token string">"name"</span><span class="token punctuation">:</span> <span class="token string">"760b21fc-128a-4504-9374-71f47638e27c.csv"</span>
<span class="token punctuation">}</span>
</code></pre>
<p>In the final HTTP action of the Power Automate Flow, the above URLs and body can be employed to automate the hardware token upload process. You would specify the corresponding URL (based on your Azure environment) in the HTTP action’s URI field, and the provided JSON payload in the Body field. This automation step is crucial as it streamlines the hardware token management process by allowing a seamless upload of token data through the <a href="http://main.iam.ad.ext.azure.com">main.iam.ad.ext.azure.com</a> API.</p>
<p>The payload above contains the information required for uploading hardware tokens: the user details, token attributes, and corresponding hardware specifics.</p>
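<p>As a sketch, this payload can also be assembled programmatically. The Python helper below is illustrative and not part of the original Flow; the column order and field names are taken from the sample body, and the file name is an arbitrary placeholder.</p>

```python
import json

CSV_HEADER = "upn,serial number,secret key,time interval,manufacturer,model"

def build_upload_body(rows):
    """Build the JSON body for the hardware token upload from token rows."""
    lines = [CSV_HEADER] + [
        ",".join([r["upn"], r["serial"], r["secret"], str(r["interval"]),
                  r["manufacturer"], r["model"]])
        for r in rows
    ]
    return json.dumps({
        "content": "\r\n".join(lines),
        "id": None,
        "mimeType": "application/vnd.ms-excel",
        "name": "hardware-tokens.csv",  # placeholder file name
    })

body = build_upload_body([{
    "upn": "test@mydomain.com", "serial": "1234567",
    "secret": "2234567abcdef2234567abcdef", "interval": 30,
    "manufacturer": "Contoso", "model": "HardwareKey",
}])
```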
<p>In conclusion, leveraging the <a href="http://main.iam.ad.ext.azure.com">main.iam.ad.ext.azure.com</a> API in conjunction with Power Automate facilitates a seamless automation of hardware token management, thereby significantly reducing manual labor and enhancing security by minimizing human error.</p>
<p>Rick A. Wilson (RAW), 2024-02-16</p>
<h1 id="government-api-development-playbook">Government API Development Playbook: Designing for Power Platform and Building Custom Connectors</h1>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/9693714c-2bb4-456c-860f-7e12399e681d" alt="Government API Development Playbook: Designing for Power Platform and Building Custom Connectors"></p>
<p>The article discusses integrating Government APIs with Microsoft Power Platform, emphasizing the creation and certification of custom connectors. It guides government API developers on adopting best practices and standards for API design to ensure compatibility and effectiveness within Power Platform. This includes leveraging tools like OpenAPI and Postman for development and navigating the certification process for connectors, aiming to improve government data accessibility and digital transformation efforts.</p>
<p><strong>Table of Contents</strong></p>
<!-- TOC depthfrom:2 depthto:3 -->
<ul>
<li><a href="#introduction">Introduction</a>
<ul>
<li><a href="#for-government-api-developers">For Government API Developers</a></li>
<li><a href="#for-custom-connector-developers">For Custom Connector Developers</a></li>
</ul>
</li>
<li><a href="#part-i-guidelines-for-government-api-developers">Part I: Guidelines for Government API Developers</a>
<ul>
<li><a href="#best-practices-for-government-api-developers">Best Practices for Government API Developers</a></li>
<li><a href="#the-importance-of-unique-paths-in-api-design">The Importance of Unique Paths in API Design</a></li>
<li><a href="#publishing-api-definitions">Publishing API Definitions</a></li>
<li><a href="#guidance-for-implementing-effective-pagination-in-apis-for-power-platform">Guidance for Implementing Effective Pagination in APIs for Power Platform</a></li>
</ul>
</li>
<li><a href="#part-ii-empowering-connector-development">Part II: Empowering Connector Development</a>
<ul>
<li><a href="#selecting-and-connecting-to-apis">Selecting and Connecting to APIs</a></li>
<li><a href="#endpoint-considerations-for-government-clouds">Endpoint Considerations for Government Clouds</a></li>
<li><a href="#adhering-to-published-coding-standards-for-custom-connectors">Adhering to Published Coding Standards for Custom Connectors</a></li>
<li><a href="#leveraging-openapi-and-postman-for-custom-connector-creation">Leveraging OpenAPI and Postman for Custom Connector Creation</a></li>
<li><a href="#utilizing-openapi-for-connector-generation">Utilizing OpenAPI for Connector Generation</a></li>
<li><a href="#speeding-up-development-with-postman">Speeding Up Development with Postman</a></li>
<li><a href="#combining-openapi-and-postman-for-comprehensive-connector-development">Combining OpenAPI and Postman for Comprehensive Connector Development</a></li>
<li><a href="#openapi-extensions-for-custom-connectors">OpenAPI Extensions for Custom Connectors</a></li>
<li><a href="#utilizing-ai-for-enhanced-connector-development">Utilizing AI for Enhanced Connector Development</a></li>
<li><a href="#connector-policies-bridging-the-gap-in-api-standards">Connector Policies: Bridging the Gap in API Standards</a></li>
<li><a href="#writing-custom-code-for-connectors-extending-functionality-beyond-policies">Writing Custom Code for Connectors: Extending Functionality Beyond Policies</a></li>
<li><a href="#publishing-through-the-independent-publisher-connectors-program-and-verified-publishing">Publishing Through the Independent Publisher Connectors Program and Verified Publishing</a></li>
</ul>
</li>
<li><a href="#part-iii-case-study-and-practical-implementation">Part III: Case Study and Practical Implementation</a>
<ul>
<li><a href="#my-process-for-creating-a-custom-connector-the-gsa-site-scanning-api-case-study">My Process for Creating a Custom Connector: The GSA Site Scanning API Case Study</a></li>
<li><a href="#creating-the-connector-step-by-step">Creating the Connector Step-By-Step</a></li>
<li><a href="#viewing-the-final-output">Viewing the Final Output</a></li>
</ul>
</li>
<li><a href="#conclusion-bridging-government-apis-and-the-power-platform">Conclusion: Bridging Government APIs and the Power Platform</a>
<ul>
<li><a href="#impact-on-digital-government-services">Impact on Digital Government Services</a></li>
</ul>
</li>
</ul>
<!-- /TOC -->
<h2 id="introduction">Introduction</h2>
<p>In the rapidly evolving landscape of digital government services, the Microsoft Power Platform stands out as a transformative tool for streamlining operations, enhancing data integration, and fostering automation across workflows. At the heart of this ecosystem are custom connectors, which serve as essential links between the Power Platform and a myriad of data sources, notably including a wealth of US government APIs.</p>
<p>This guide is designed with a dual purpose: not only to assist developers in crafting custom connectors that unlock the rich potential of existing government APIs but also to serve as a valuable resource for government entities embarking on the development or refinement of APIs. The aim is to ensure these APIs are inherently aligned with Power Platform connector standards, thereby simplifying the connector creation process and enabling full utilization of Power Platform capabilities, such as built-in paging and seamless data access.</p>
<h3 id="for-government-api-developers">For Government API Developers</h3>
<p>This guide begins by addressing the creators of government APIs, emphasizing the importance of designing with Power Platform integration in mind. By adhering to connector standards and implementing features like OpenAPI specifications and efficient pagination from the outset, government APIs can greatly enhance their compatibility and functionality within the Power Platform ecosystem. This section aims to equip API developers with the knowledge to build or update their APIs in a way that facilitates ease of use, maximizes accessibility, and leverages the full spectrum of Power Platform features.</p>
<p>Focusing on these areas aims to foster a more interconnected, efficient, and user-friendly landscape for digital government services, empowering developers and government entities alike to contribute to a robust and dynamic digital ecosystem.</p>
<h3 id="for-custom-connector-developers">For Custom Connector Developers</h3>
<p>Following the guidance for government API developers, this section offers a comprehensive roadmap for navigating the intricacies of connecting to US government APIs, leveraging these connections to enhance the Power Platform’s automation and data processing capabilities. Dedicated to developers seeking to bridge the Power Platform with government data sources, it provides best practices, tools, and insights to streamline development and ensure secure, effective integration. This approach ensures that custom connectors are not only efficiently developed but also fully utilize the capabilities and data provided by government APIs.</p>
<h2 id="part-i-guidelines-for-government-api-developers">Part I: Guidelines for Government API Developers</h2>
<h3 id="best-practices-for-government-api-developers">Best Practices for Government API Developers</h3>
<p>Developing APIs with a focus on ease of integration into custom connectors is crucial for maximizing the utility of government data. A well-designed API not only facilitates seamless connectivity but also enhances user experiences by providing clear, efficient access to data.</p>
<h4 id="why-building-on-standards-like-openapi-is-important">Why Building on Standards Like OpenAPI Is Important</h4>
<p>The Power Platform connectors, along with Logic Apps, utilize the OpenAPI standard, emphasizing its significance in creating adaptable and interoperable APIs. OpenAPI offers a language-agnostic way to describe RESTful APIs, which can be used to generate code-stubs and documentation. This specification plays a pivotal role in decoupling the public interface of an API from its implementation details, allowing business and design teams to outline the features they need independently of the engineering team’s work.</p>
<ul>
<li>
<p><strong>Design-First Approach</strong>: OpenAPI allows for the API to be defined with types and examples for every endpoint before implementation. This design-first approach facilitates refining API design by iterating over the specification document, ensuring the API meets both business requirements and technical constraints.</p>
</li>
<li>
<p><strong>Code Generators and Tooling</strong>: The OpenAPI ecosystem provides tools to create mock APIs, generate documentation, tests, and server stubs. This accelerates the API development process, making it easier for front-end developers to work with a practical interface even before the back-end is fully implemented.</p>
</li>
<li>
<p><strong>Huge Userbase and Stable Implementation</strong>: Backed by many large organizations, OpenAPI represents condensed knowledge from thousands of APIs developed over the years. Its stable implementation and wide adoption provide a reliable foundation for building modern APIs.</p>
</li>
<li>
<p><strong>Facilitating Better Engineering and Usage</strong>: Standards provide a common framework of communication and development, helping to pick the right tools based on specific needs. The OpenAPI Specification, in particular, aids in building consumer-centric APIs by ensuring a good developer experience through a definition-driven approach.</p>
</li>
</ul>
<p>Incorporating OpenAPI standards into the development of government APIs is instrumental in achieving interoperability, ensuring security, and enhancing the overall quality of digital services. It lays a solid foundation for APIs that not only serve the intended purpose efficiently but are also easier to use, document, and integrate with platforms like the Power Platform.</p>
<p>By adopting a standardized approach to API development, government developers can ensure their APIs are robust, scalable, and ready to meet the evolving needs of both internal and external consumers. This approach not only streamlines the development process but also paves the way for a more connected and efficient digital government infrastructure.</p>
<h3 id="the-importance-of-unique-paths-in-api-design">The Importance of Unique Paths in API Design</h3>
<ul>
<li>
<p><strong>Design Considerations</strong>: Ensure that each operation within your API has a unique path. This clarity in design helps custom connector developers to create precise actions, allowing end-users to efficiently target the information they need. The OpenAPI specification requires that each path appear only once, although a single path can define multiple operations (one per HTTP method). This underlines the necessity of segregating your API into distinct actions/paths during development. Utilizing a single path to retrieve different data types can lead to confusion and complicate the user experience.</p>
</li>
<li>
<p><strong>OpenAPI Specification Guidance</strong>: According to the <a href="https://swagger.io/docs/specification/2-0/paths-and-operations/">Swagger definition website for OpenAPI 2.0</a>, paths serve as the unique identifiers for operations within an API, and employing unique paths ensures that the API’s structure remains clear and logical. Adhering to this specification aids developers in avoiding functional overlaps within the API, thereby simplifying navigation and usage of the API through custom connectors.</p>
</li>
</ul>
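<p>As a hypothetical illustration of unique paths, an OpenAPI 2.0 fragment for a scanning API might separate list and single-item operations like this (the paths and operation IDs are invented for this example):</p>

```json
{
  "paths": {
    "/scans": {
      "get": { "operationId": "listScans", "summary": "List all scans" }
    },
    "/scans/{scanId}": {
      "get": { "operationId": "getScan", "summary": "Get a single scan by Id" }
    }
  }
}
```

<p>Each path appears once, and each distinct piece of information has its own path, which maps cleanly onto separate connector actions.</p>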
<h3 id="publishing-api-definitions">Publishing API Definitions</h3>
<p>Make your OpenAPI definitions publicly available, as exemplified by various GSA APIs, to simplify the development of custom connectors. Providing these definitions enables developers to quickly understand and integrate your API’s capabilities into the Power Platform, fostering a collaborative ecosystem.</p>
<ul>
<li>Example: The <a href="https://open.gsa.gov/api/site-scanning-api/#openapi-specification-file">Site Scanning API by GSA</a> offers its OpenAPI specification file for download, facilitating an excellent starting point for connector development.</li>
</ul>
<h3 id="guidance-for-implementing-effective-pagination-in-apis-for-power-platform">Guidance for Implementing Effective Pagination in APIs for Power Platform</h3>
<p>For APIs to be fully compatible with Power Platform and Logic Apps, supporting built-in paging functionality is crucial. This involves designing your API to return paginated responses that adhere to a specific structure, enabling Power Platform to correctly navigate through data sets across multiple pages. Below are the key requirements and examples for structuring your API responses for effective pagination.</p>
<h4 id="requirements-for-returning-paginated-responses-json">Requirements for Returning Paginated Responses (JSON)</h4>
<ol>
<li>
<p><strong>nextLink</strong>: Include a <code>nextLink</code> property in the JSON response body, which contains the URI for accessing the next page of results. This property should be present only when there are more pages to fetch. If there are no additional pages, omit the <code>nextLink</code> property.</p>
</li>
<li>
<p><strong>value</strong>: The response must have a <code>value</code> array that houses the results for the current page. The objects within this array represent the actual data items retrieved by the API.</p>
</li>
<li>
<p><strong>HTTP Response Code</strong>: Use the HTTP status code <code>200</code> for successful paginated responses, indicating that the data has been successfully returned.</p>
</li>
</ol>
<h4 id="example-paginated-api-responses">Example Paginated API Responses</h4>
<p>To illustrate, consider an API that returns data about various entities, each with <code>Name</code> and <code>Description</code> attributes. The following examples show how the API should handle pagination across different responses:</p>
<p><strong>Response #1 (First Page):</strong></p>
<pre class=" language-json"><code class="prism language-json"><span class="token punctuation">{</span>
<span class="token string">"nextLink"</span><span class="token punctuation">:</span> <span class="token string">"http://example.com/api/entities?page=2"</span><span class="token punctuation">,</span>
<span class="token string">"value"</span><span class="token punctuation">:</span> <span class="token punctuation">[</span>
<span class="token punctuation">{</span>
<span class="token string">"Name"</span><span class="token punctuation">:</span> <span class="token string">"Entity1"</span><span class="token punctuation">,</span>
<span class="token string">"Description"</span><span class="token punctuation">:</span> <span class="token string">"Description of Entity1"</span>
<span class="token punctuation">}</span><span class="token punctuation">,</span>
<span class="token punctuation">{</span>
<span class="token string">"Name"</span><span class="token punctuation">:</span> <span class="token string">"Entity2"</span><span class="token punctuation">,</span>
<span class="token string">"Description"</span><span class="token punctuation">:</span> <span class="token string">"Description of Entity2"</span>
<span class="token punctuation">}</span>
<span class="token punctuation">]</span>
<span class="token punctuation">}</span>
</code></pre>
<p><strong>Response #2 (Second Page):</strong></p>
<pre class=" language-json"><code class="prism language-json"><span class="token punctuation">{</span>
<span class="token string">"nextLink"</span><span class="token punctuation">:</span> <span class="token string">"http://example.com/api/entities?page=3"</span><span class="token punctuation">,</span>
<span class="token string">"value"</span><span class="token punctuation">:</span> <span class="token punctuation">[</span>
<span class="token punctuation">{</span>
<span class="token string">"Name"</span><span class="token punctuation">:</span> <span class="token string">"Entity3"</span><span class="token punctuation">,</span>
<span class="token string">"Description"</span><span class="token punctuation">:</span> <span class="token string">"Description of Entity3"</span>
<span class="token punctuation">}</span>
<span class="token punctuation">]</span>
<span class="token punctuation">}</span>
</code></pre>
<p><strong>Response #3 (Final Page):</strong></p>
<pre class=" language-json"><code class="prism language-json"><span class="token punctuation">{</span>
<span class="token string">"value"</span><span class="token punctuation">:</span> <span class="token punctuation">[</span>
<span class="token punctuation">{</span>
<span class="token string">"Name"</span><span class="token punctuation">:</span> <span class="token string">"Entity4"</span><span class="token punctuation">,</span>
<span class="token string">"Description"</span><span class="token punctuation">:</span> <span class="token string">"Description of Entity4"</span>
<span class="token punctuation">}</span><span class="token punctuation">,</span>
<span class="token punctuation">{</span>
<span class="token string">"Name"</span><span class="token punctuation">:</span> <span class="token string">"Entity5"</span><span class="token punctuation">,</span>
<span class="token string">"Description"</span><span class="token punctuation">:</span> <span class="token string">"Description of Entity5"</span>
<span class="token punctuation">}</span>
<span class="token punctuation">]</span>
<span class="token punctuation">}</span>
</code></pre>
<p><strong>Example Logic App Task Output (Aggregated Results):</strong></p>
<pre class=" language-json"><code class="prism language-json"><span class="token punctuation">{</span>
<span class="token string">"value"</span><span class="token punctuation">:</span> <span class="token punctuation">[</span>
<span class="token punctuation">{</span>
<span class="token string">"Name"</span><span class="token punctuation">:</span> <span class="token string">"Entity1"</span><span class="token punctuation">,</span>
<span class="token string">"Description"</span><span class="token punctuation">:</span> <span class="token string">"Description of Entity1"</span>
<span class="token punctuation">}</span><span class="token punctuation">,</span>
<span class="token punctuation">{</span>
<span class="token string">"Name"</span><span class="token punctuation">:</span> <span class="token string">"Entity2"</span><span class="token punctuation">,</span>
<span class="token string">"Description"</span><span class="token punctuation">:</span> <span class="token string">"Description of Entity2"</span>
<span class="token punctuation">}</span><span class="token punctuation">,</span>
<span class="token punctuation">{</span>
<span class="token string">"Name"</span><span class="token punctuation">:</span> <span class="token string">"Entity3"</span><span class="token punctuation">,</span>
<span class="token string">"Description"</span><span class="token punctuation">:</span> <span class="token string">"Description of Entity3"</span>
<span class="token punctuation">}</span><span class="token punctuation">,</span>
<span class="token punctuation">{</span>
<span class="token string">"Name"</span><span class="token punctuation">:</span> <span class="token string">"Entity4"</span><span class="token punctuation">,</span>
<span class="token string">"Description"</span><span class="token punctuation">:</span> <span class="token string">"Description of Entity4"</span>
<span class="token punctuation">}</span><span class="token punctuation">,</span>
<span class="token punctuation">{</span>
<span class="token string">"Name"</span><span class="token punctuation">:</span> <span class="token string">"Entity5"</span><span class="token punctuation">,</span>
<span class="token string">"Description"</span><span class="token punctuation">:</span> <span class="token string">"Description of Entity5"</span>
<span class="token punctuation">}</span>
<span class="token punctuation">]</span>
<span class="token punctuation">}</span>
</code></pre>
<p>By designing your API according to these guidelines, you ensure that it seamlessly integrates with Power Platform’s and Logic Apps’ pagination capabilities, thus facilitating efficient and effective data retrieval in workflows.</p>
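<p>To make the paging contract concrete, here is a minimal Python sketch of a client that follows <code>nextLink</code> until it is absent, mirroring what Power Platform’s built-in paging does conceptually. The fetch function is simulated with a dictionary instead of real HTTP requests.</p>

```python
def get_all_items(first_url, fetch_page):
    """Aggregate 'value' arrays by following 'nextLink' across pages."""
    items, url = [], first_url
    while url:
        page = fetch_page(url)
        items.extend(page.get("value", []))
        url = page.get("nextLink")  # absent (None) on the final page
    return items

# Simulated versions of the three responses shown above:
pages = {
    "page1": {"nextLink": "page2", "value": [{"Name": "Entity1"}, {"Name": "Entity2"}]},
    "page2": {"nextLink": "page3", "value": [{"Name": "Entity3"}]},
    "page3": {"value": [{"Name": "Entity4"}, {"Name": "Entity5"}]},
}
all_items = get_all_items("page1", pages.__getitem__)
# all_items contains Entity1 through Entity5 in order
```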
<h2 id="part-ii-empowering-connector-development">Part II: Empowering Connector Development</h2>
<h3 id="selecting-and-connecting-to-apis">Selecting and Connecting to APIs</h3>
<p>Selecting the right API is crucial for successful custom connector development. Developers should prioritize US government APIs with comprehensive documentation and a clear authentication process, ensuring smooth development and secure integration within the Power Platform.</p>
<h3 id="endpoint-considerations-for-government-clouds">Endpoint Considerations for Government Clouds</h3>
<p>When building custom connectors intended for use across various cloud environments, including Government Community Cloud (GCC), GCC High (GCCH), and Department of Defense (DoD) networks, it’s essential to be aware of potential differences in API endpoints. These distinctions are particularly relevant for connectors that aim for certification and distribution within these specialized networks.</p>
<h4 id="key-points-on-endpoint-configuration">Key Points on Endpoint Configuration</h4>
<ul>
<li>
<p><strong>Cloud-Specific Endpoints</strong>: APIs may have different endpoints depending on the cloud environment. This variance can affect how a connector accesses the API, necessitating adjustments to ensure seamless operation in GCC, GCCH, or DoD contexts.</p>
</li>
<li>
<p><strong>Certification and Distribution</strong>: For connectors to be certified and distributed within government networks, developers must provide detailed information about the API endpoints specific to each environment. This includes identifying any unique URLs or access methods required for secure government clouds.</p>
</li>
<li>
<p><strong>Awareness and Planning</strong>: Early in the development process, identify whether the API you’re connecting to has distinct endpoints for different cloud environments. This foresight enables you to design your connector with flexibility, ensuring it can adapt to various security and network requirements.</p>
</li>
</ul>
<h4 id="action-steps-for-developers">Action Steps for Developers</h4>
<ol>
<li>
<p><strong>Research and Documentation</strong>: Engage with the API provider to document all relevant endpoints for the targeted government clouds. This information is crucial for developing a connector that is compatible across different environments.</p>
</li>
<li>
<p><strong>Testing and Validation</strong>: Thoroughly test the connector in each intended environment to validate its functionality and address any issues arising from endpoint differences.</p>
</li>
<li>
<p><strong>Communication with Microsoft</strong>: When seeking certification for your connector, clearly communicate with the Microsoft team regarding the different endpoints for GCC, GCCH, and DoD networks. Providing this information upfront can streamline the certification process and facilitate the connector’s deployment across desired networks.</p>
</li>
</ol>
<p>Understanding the nuances of API endpoints for government clouds is pivotal in developing custom connectors that are not only functional but also compliant with the specific requirements of these secure environments. By taking endpoint variations into account early in the development process, you can ensure broader usability and ease of certification for your connector.</p>
<h3 id="adhering-to-published-coding-standards-for-custom-connectors">Adhering to Published Coding Standards for Custom Connectors</h3>
<p>To maintain high quality and consistency in the development of custom connectors for the Power Platform, following the published coding standards is crucial. These guidelines offer a structured approach to creating connectors that are both powerful and user-friendly. Below is an expanded summary of the key points from the official guidelines, along with a link to the full article for in-depth understanding.</p>
<ul>
<li>
<p><strong>General Standards</strong>: The foundation of a successful custom connector is its user-friendliness. Developers are encouraged to design connectors that are easy to use and understand. This includes providing detailed descriptions, actionable summaries, and logical categorizations for all actions and triggers. The aim is to reduce complexity and enhance the intuitive use of the connector by end-users.</p>
</li>
<li>
<p><strong>Schema Definitions</strong>: Effective schema definitions are vital for clear and meaningful interactions with the connector. Developers should use specific data types for all parameters and responses, steering clear of generic types whenever possible. This precision aids users in anticipating the type of data required and returned by each operation, enhancing the overall user experience.</p>
</li>
<li>
<p><strong>Operations and Actions</strong>: Clear and intuitive naming conventions for operations and actions are essential. Names should directly reflect the purpose and outcome of the action, making it easy for users to identify the correct operation for their needs. Furthermore, every parameter and output should be accompanied by comprehensive documentation, including descriptions, expected values, and examples. This detailed guidance supports users in effectively utilizing the connector.</p>
</li>
<li>
<p><strong>Error Handling</strong>: Consistent and informative error handling mechanisms are key to a seamless user experience. Connectors should return specific error messages and codes that align with standard HTTP status codes, helping users quickly understand and resolve issues. Detailed error responses facilitate easier troubleshooting and enhance the reliability of the connector.</p>
</li>
<li>
<p><strong>Security and Privacy</strong>: Adhering to security best practices is non-negotiable. Connectors must implement authentication protocols correctly, ensuring the secure handling of credentials and user data. Developers must prioritize the protection of sensitive information, following stringent guidelines for data privacy and security.</p>
</li>
<li>
<p><strong>Performance and Non-Functional Requirements</strong>: Optimizing connector performance is essential for minimizing latency and ensuring efficient data processing. Developers should also comply with non-functional requirements such as implementing proper pagination for large datasets and adhering to rate limits. These practices prevent overloading services and ensure a smooth, responsive experience for users.</p>
</li>
</ul>
<p>By closely following these coding standards, developers can create custom connectors that are not only functional but also aligned with the best practices of the Power Platform ecosystem. Close adherence to these guidelines ensures that connectors are reliable, secure, and easy to use, meeting the high expectations of users and developers alike.</p>
<p>For a comprehensive guide to the coding standards for custom connectors, including best practices and detailed examples, visit the official Microsoft documentation: <a href="https://learn.microsoft.com/en-us/connectors/custom-connectors/coding-standards">Custom Connector Coding Standards</a>.</p>
<h3 id="leveraging-openapi-and-postman-for-custom-connector-creation">Leveraging OpenAPI and Postman for Custom Connector Creation</h3>
<p>The integration of OpenAPI specifications and Postman collections into the custom connector development process on the Power Platform significantly reduces manual coding, accelerates development, and ensures accuracy in API communication.</p>
<h3 id="utilizing-openapi-for-connector-generation">Utilizing OpenAPI for Connector Generation</h3>
<p>The OpenAPI specification acts as a blueprint for custom connectors, providing a machine-readable interface description that allows for automated generation within the Power Platform. This standardization simplifies the initial creation process and ensures that connectors accurately reflect the API’s capabilities and requirements.</p>
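<p>To make the blueprint idea concrete, here is a minimal Swagger 2.0 (OpenAPI 2.0) definition of the kind the Power Platform can import, expressed as a Python dictionary for readability. The host, path, and operation names are illustrative, not taken from any real API:</p>

```python
# A minimal Swagger 2.0 definition of the kind the Power Platform can
# import to generate a custom connector. All names here are illustrative.
spec = {
    "swagger": "2.0",
    "info": {"title": "Example API", "version": "1.0"},
    "host": "api.example.gov",  # hypothetical host
    "basePath": "/v1",
    "schemes": ["https"],
    "paths": {
        "/entities": {
            "get": {
                "operationId": "ListEntities",
                "summary": "List entities",
                "responses": {"200": {"description": "OK"}},
            }
        }
    },
}
```

<p>Everything the connector wizard surfaces, including action names, parameters, and response shapes, is derived from a document of this form.</p>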
<h3 id="speeding-up-development-with-postman">Speeding Up Development with Postman</h3>
<p>Postman, a popular API client, can further streamline connector development. Developers can use Postman to create and test API requests before translating these requests into custom connector actions. Microsoft provides guidance on how to define a Postman collection for a custom connector, which can then be imported into the Power Platform.</p>
<ul>
<li>
<p><strong>Postman Collections for Custom Connectors</strong>: Postman collections offer a way to group and save API requests, including their parameters and authentication methods. By defining a Postman collection for your API, you create a reusable asset that accelerates the testing and validation of API calls. This approach enables developers to quickly iterate on their connector design, refine its functionality, and ensure its reliability before deployment.</p>
</li>
<li>
<p><strong>Microsoft’s Learn Article on Postman Collections</strong>: For developers looking to leverage Postman in their custom connector development workflow, Microsoft provides a comprehensive <a href="https://learn.microsoft.com/en-us/connectors/custom-connectors/define-postman-collection">guide on defining a Postman collection</a>. This article covers the essentials of creating a Postman collection tailored for custom connector creation, from setting up API requests to importing the collection into the Power Platform.</p>
</li>
</ul>
<h3 id="combining-openapi-and-postman-for-comprehensive-connector-development">Combining OpenAPI and Postman for Comprehensive Connector Development</h3>
<p>By leveraging both OpenAPI specifications and Postman collections, developers can harness a robust development workflow that combines the strengths of automated generation and detailed, hands-on testing. This combination ensures that custom connectors are both quickly generated and thoroughly vetted, leading to higher quality integrations and a more seamless user experience.</p>
<h3 id="openapi-extensions-for-custom-connectors">OpenAPI Extensions for Custom Connectors</h3>
<p>OpenAPI extensions such as <code>x-ms-summary</code> and <code>x-ms-visibility</code> play a pivotal role in refining custom connectors for the Power Platform, particularly by improving how triggers and actions are presented and configured within a flow. The <code>x-ms-summary</code> extension allows developers to add human-readable summaries to operations, parameters, and responses. This clarity aids users in quickly grasping the purpose of each operation and understanding the data inputs required, facilitating a smoother flow creation process.</p>
<p>The <code>x-ms-visibility</code> extension manages the visibility of parameters, enabling developers to streamline the user interface by categorizing parameters as “important,” “advanced,” or “internal.” This thoughtful organization guides users through a more intuitive setup process. Essential parameters are made prominent, ensuring that users are immediately aware of the critical inputs needed for an operation to function correctly. Advanced parameters are available for those needing to fine-tune or utilize more complex functionalities, keeping the interface clean and focused on the most commonly used options.</p>
<p>Together, <code>x-ms-summary</code> and <code>x-ms-visibility</code> significantly enhance user experience by making custom connectors more accessible and easier to use. They ensure that users can navigate complex configurations with ease, reducing the likelihood of errors and streamlining the flow development process. By leveraging these extensions, developers can create connectors that are not only powerful and flexible but also user-friendly and intuitive to use within the Power Platform environment.</p>
<p>Beyond <code>x-ms-summary</code> and <code>x-ms-visibility</code>, the OpenAPI specification for custom connectors on the Power Platform supports a wide array of other extension properties designed to enrich functionality and user interaction. These extension properties allow for a myriad of customizations, ranging from defining custom authentication methods to enabling dynamic schema generation. Developers looking to leverage the full potential of their custom connectors can explore these additional extensions to further tailor their solutions to specific requirements. For a comprehensive overview of available extension properties and guidance on how to implement them, the Microsoft documentation provides an invaluable resource. Interested developers can find detailed information on these extensions at <a href="https://learn.microsoft.com/en-us/connectors/custom-connectors/openapi-extensions">Microsoft’s official documentation on OpenAPI extensions</a>. This documentation is an essential tool for anyone aiming to develop more sophisticated, efficient, and user-friendly custom connectors for the Power Platform.</p>
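<p>As a concrete sketch, the fragment below shows where these two extensions attach on a query parameter of a Swagger 2.0 operation, written as a Python dictionary. The operation and parameter names are hypothetical:</p>

```python
# Hypothetical Swagger 2.0 operation fragment, as a Python dict, showing
# where x-ms-summary and x-ms-visibility attach to a query parameter.
operation = {
    "summary": "List scanned websites",
    "operationId": "ListWebsites",      # illustrative name
    "parameters": [
        {
            "name": "agency",           # hypothetical parameter
            "in": "query",
            "type": "string",
            "x-ms-summary": "Agency",       # friendly label shown in the flow designer
            "x-ms-visibility": "advanced",  # tucked behind "Show advanced options"
        }
    ],
}
```

<p>In the flow designer the user would see the friendly "Agency" label, and the field would only appear after expanding the advanced options.</p>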
<h3 id="utilizing-ai-for-enhanced-connector-development">Utilizing AI for Enhanced Connector Development</h3>
<p>Incorporating artificial intelligence (AI) into custom connector development offers significant advantages, streamlining the creation process and enhancing functionality. AI tools, notably ChatGPT, are instrumental in several key areas:</p>
<ul>
<li>
<p><strong>Streamlining Low-Code Design</strong>: AI analyzes API documentation to generate simplified explanations and practical use cases for each endpoint and parameter. This enriches the connector’s documentation, making it accessible to a wider audience and facilitating easier integration into the Power Platform.</p>
</li>
<li>
<p><strong>Demystifying Data Structures</strong>: Complex data returned by APIs can be challenging to navigate. AI assists by suggesting how to optimally represent this data within the Power Platform’s low-code environment, ensuring it is accessible and manageable for users.</p>
</li>
<li>
<p><strong>Automating OpenAPI Extension Generation</strong>: The process of manually creating OpenAPI extensions is time-consuming and prone to errors. AI automates this task, accurately aligning extensions with the API’s functionality, which saves time and improves the connector’s reliability and ease of use.</p>
</li>
<li>
<p><strong>Enhancing Documentation for Certification</strong>: A critical step in connector development is the creation of comprehensive <code>README.md</code> files that meet certification standards. AI leverages detailed API documentation to produce clear, concise, and informative content, highlighting the necessity of well-documented APIs for effective AI assistance.</p>
</li>
</ul>
<p>The success of employing AI in connector development heavily relies on the quality of the API documentation provided. High-quality, detailed documentation enables AI to offer more effective assistance, underscoring its importance for both API consumers and creators. This approach not only expedites the development and certification processes but also ensures the creation of connectors that are both powerful and user-friendly. By leveraging AI, developers can navigate the complexities of connector development with greater ease and efficiency, ultimately enhancing the capabilities and accessibility of the Power Platform.</p>
<h3 id="connector-policies-bridging-the-gap-in-api-standards">Connector Policies: Bridging the Gap in API Standards</h3>
<p>In the development of custom connectors for the Power Platform, encountering external APIs that do not completely adhere to expected standards can be a common challenge. Connector policies provide a robust mechanism to customize the interaction with these APIs, ensuring a seamless integration. These policies allow for modifications of requests and responses, effectively bridging any gaps in API standards.</p>
<h4 id="overview-of-policy-templates">Overview of Policy Templates</h4>
<p>Connector policies introduce specific rules into the connector’s definition, guiding the handling of incoming and outgoing data. They are designed to adapt the connector’s functionality to work with diverse API behaviors, from altering request paths to transforming response formats. Microsoft’s policy templates offer predefined solutions for common integration challenges:</p>
<ul>
<li>
<p><strong><a href="https://learn.microsoft.com/en-us/connectors/custom-connectors/policy-templates/convertarraytoobject/convertarraytoobject">Convert Array to Object</a> / <a href="https://learn.microsoft.com/en-us/connectors/custom-connectors/policy-templates/convertobjecttoarray/convertobjecttoarray">Convert Object to Array</a></strong>: Facilitate data structure transformations to match the Power Platform’s requirements, ensuring compatibility.</p>
</li>
<li>
<p><strong><a href="https://learn.microsoft.com/en-us/connectors/custom-connectors/policy-templates/stringtoarray/stringtoarray">String to Array</a></strong>: Splits strings into arrays based on specified delimiters, useful for APIs returning multi-value strings.</p>
</li>
<li>
<p><strong><a href="https://learn.microsoft.com/en-us/connectors/custom-connectors/policy-templates/routerequesttoendpoint/routerequesttoendpoint">Route Request to Endpoint</a></strong>: Dynamically directs requests to different API endpoints, enhancing flexibility in handling API requests.</p>
</li>
<li>
<p><strong><a href="https://learn.microsoft.com/en-us/connectors/custom-connectors/policy-templates/setconnectionstatustounauthenticated/setconnectionstatustounauthenticated">Set Connection Status to Unauthenticated</a></strong>: Automatically updates the connection status, prompting re-authentication when necessary.</p>
</li>
<li>
<p><strong><a href="https://learn.microsoft.com/en-us/connectors/custom-connectors/policy-templates/setvaluefromurl/setvaluefromurl">Set Value from URL</a> / <a href="https://learn.microsoft.com/en-us/connectors/custom-connectors/policy-templates/dynamichosturl/dynamichosturl">Dynamic Host URL</a></strong>: Adjusts requests dynamically based on URL parameters or changes the API host URL, accommodating APIs that require flexible request configurations.</p>
</li>
<li>
<p><strong><a href="https://learn.microsoft.com/en-us/connectors/custom-connectors/policy-templates/setheader/setheader">Set Header</a> / <a href="https://learn.microsoft.com/en-us/connectors/custom-connectors/policy-templates/setproperty/setproperty">Set Property</a> / <a href="https://learn.microsoft.com/en-us/connectors/custom-connectors/policy-templates/setqueryparameter/setqueryparameter">Set Query Parameter</a></strong>: Customize request headers, properties, and query parameters to meet specific API requirements or standards.</p>
</li>
</ul>
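<p>Policies are declared in the connector’s <code>apiProperties.json</code> rather than written as code. As a rough sketch of what a Set Header template instance looks like there, shown as a Python dictionary; the field names follow my reading of the policy template documentation and should be verified against the current schema before use:</p>

```python
# Rough sketch of one entry in the "policyTemplateInstances" array of
# apiProperties.json, as a Python dict. Field names are based on my
# reading of the policy template docs; verify against the live schema.
policy_instance = {
    "templateId": "setheader",
    "title": "Set API key header",
    "parameters": {
        "x-ms-apimTemplateParameter.name": "X-Api-Key",
        "x-ms-apimTemplateParameter.value": "@connectionParameters('api_key')",
        "x-ms-apimTemplateParameter.existsAction": "override",
        "x-ms-apimTemplate-policySection": "Request",
    },
}
```

<p>This particular instance would copy the connection’s API key into a request header before every call, a common fix for APIs that expect authentication somewhere the platform does not place it by default.</p>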
<h4 id="the-role-of-policies-in-handling-non-standard-apis">The Role of Policies in Handling Non-Standard APIs</h4>
<p>By employing connector policies, developers can ensure that custom connectors are compatible with a wide array of APIs, including those that do not conform to standard practices. These policies offer a versatile toolkit for addressing integration hurdles, making connectors more robust and user-friendly.</p>
<p>For an in-depth exploration of connector policies and practical examples, refer to the following Microsoft documentation: <a href="https://learn.microsoft.com/en-us/connectors/custom-connectors/policy-templates">Policy Templates Overview</a></p>
<p>This guidance underscores the significance of connector policies in custom connector development, highlighting how they can be leveraged to overcome the complexities of working with diverse and non-standard APIs.</p>
<h3 id="writing-custom-code-for-connectors-extending-functionality-beyond-policies">Writing Custom Code for Connectors: Extending Functionality Beyond Policies</h3>
<p>While connector policies provide a powerful toolset for modifying requests and responses to ensure seamless API integration, sometimes the requirements or complexities of an API call for a more tailored approach. Writing custom code for connectors emerges as a vital strategy in these scenarios, offering the ability to implement advanced functionality and bridge any remaining gaps between an API and the Power Platform’s standards. This section explores the role of custom code in connector development, including its capabilities and inherent limitations.</p>
<h4 id="capabilities-of-custom-code-in-connector-development">Capabilities of Custom Code in Connector Development</h4>
<p>Custom code allows developers to go beyond the constraints of predefined policies, enabling:</p>
<ul>
<li>
<p><strong>Complex Data Manipulation</strong>: When data transformation needs exceed simple conversions, custom code can perform intricate processing, ensuring the data fits the Power Platform’s structure and flows.</p>
</li>
<li>
<p><strong>Advanced Logic Implementation</strong>: Custom code supports the incorporation of sophisticated logic into connectors, such as conditional operations, loops, and custom error handling, enhancing the connector’s versatility.</p>
</li>
<li>
<p><strong>Integration of External Libraries</strong>: For specific functionalities not natively supported by the Power Platform, developers can leverage external libraries within their custom code, broadening the scope of what a connector can achieve.</p>
</li>
</ul>
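<p>Connector custom code itself is written in C# as a class deriving from <code>ScriptBase</code>, but the kind of response reshaping it typically performs can be sketched language-neutrally. The payload shape below is entirely hypothetical:</p>

```python
import json


def transform_response(raw_body: str) -> str:
    """Flatten a hypothetical nested API payload into the flat shape a
    flow consumes. In a real connector this logic would live in the C#
    script; Python is used here only to illustrate the transformation."""
    data = json.loads(raw_body)
    flattened = [
        {"id": item["id"], "status": item.get("state", {}).get("name", "unknown")}
        for item in data.get("results", [])
    ]
    return json.dumps({"items": flattened})
```

<p>The same pattern, parse the response, reshape it, and re-serialize it, is what the C# script does before the Power Platform hands the result to the flow.</p>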
<h4 id="limitations-and-considerations">Limitations and Considerations</h4>
<p>While writing custom code offers expanded possibilities, it comes with its set of limitations and considerations:</p>
<ul>
<li>
<p><strong>Performance Impact</strong>: Custom code can introduce additional processing overhead, potentially affecting the connector’s performance. It’s crucial to optimize code for efficiency to minimize latency and resource consumption.</p>
</li>
<li>
<p><strong>Maintenance and Complexity</strong>: Custom code increases the complexity of the connector, which can impact maintainability. Clear documentation and adherence to coding best practices are essential to mitigate these challenges.</p>
</li>
<li>
<p><strong>Security Concerns</strong>: Incorporating custom code and external libraries requires careful consideration of security implications, including the management of dependencies and adherence to security best practices.</p>
</li>
<li>
<p><strong>Platform Limitations</strong>: The Power Platform may impose restrictions on the execution environment of custom code, such as execution time limits and available computational resources. Developers must design their code with these constraints in mind.</p>
</li>
</ul>
<h4 id="implementing-custom-code">Implementing Custom Code</h4>
<p>To implement custom code in a custom connector, developers must navigate to the connector’s settings within the Power Platform and specify the code within designated sections for request or response processing. It’s important to thoroughly test custom code to ensure it behaves as expected and does not introduce unintended side effects.</p>
<p>For detailed guidance on writing custom code for connectors, including examples and best practices, refer to the official Microsoft documentation: <a href="https://learn.microsoft.com/en-us/connectors/custom-connectors/write-code">Write Custom Code for Connectors</a>.</p>
<p>Writing custom code for connectors is a powerful method for addressing complex integration scenarios and enhancing connector functionality. By carefully considering the capabilities and limitations of custom code, developers can create robust, efficient, and secure custom connectors that fully meet their integration needs.</p>
<h3 id="publishing-through-the-independent-publisher-connectors-program-and-verified-publishing">Publishing Through the Independent Publisher Connectors Program and Verified Publishing</h3>
<p>Making your custom connector available on the Power Platform can significantly extend its reach, potentially impacting millions of users. Two primary pathways exist for publishing connectors: Verified Publishing and Independent Publishing. Each route offers unique benefits and caters to different types of publishers.</p>
<h4 id="verified-publishing">Verified Publishing</h4>
<p>Verified Publishing is designed for companies and organizations that want to publish their connectors to all Power Platform users. This path involves a certification process that ensures the connector meets Microsoft’s standards for security, functionality, and compliance.</p>
<ul>
<li>
<p><strong>Eligibility and Process</strong>: To be eligible for Verified Publishing, the connector must be owned by an organization rather than an individual. The process involves submitting the connector for Microsoft’s certification, which includes a thorough review and testing phase. Once certified, the connector is listed as a verified connector within the Power Platform, offering high visibility and trust among users.</p>
</li>
<li>
<p><strong>Benefits</strong>: Verified connectors are prominently displayed and recommended within the Power Platform, providing a significant boost in visibility. This route is ideal for organizations looking to promote their services or integrations widely and establish a high level of trust with users.</p>
</li>
</ul>
<p>More details on Verified Publishing and eligibility can be found in Microsoft’s documentation <a href="https://learn.microsoft.com/en-us/connectors/custom-connectors/certification-submission">here</a>.</p>
<h4 id="independent-publisher-connectors-program">Independent Publisher Connectors Program</h4>
<p>The Independent Publisher Connectors Program caters to individual developers or smaller teams who wish to share their connectors with the community. This program allows for a more streamlined publication process and does not require the connector to be associated with an organization.</p>
<ul>
<li>
<p><strong>Eligibility and Process</strong>: Independent publishing is open to any developer, including those not affiliated with an organization. Developers submit their connectors through a simplified certification process that ensures basic functionality and security standards. Once approved, connectors are available on the Power Platform as independent publisher connectors.</p>
</li>
<li>
<p><strong>Benefits</strong>: Publishing as an independent publisher allows developers to contribute to the Power Platform ecosystem and share their work with a wide audience. While these connectors are marked as independently published, they still undergo a review process to ensure quality and security.</p>
</li>
</ul>
<p>Detailed information on the Independent Publisher Connectors Program and the submission process can be found <a href="https://learn.microsoft.com/en-us/connectors/custom-connectors/certification-submission-ip">here</a>.</p>
<h4 id="choosing-your-path-to-publication">Choosing Your Path to Publication</h4>
<p>Deciding between verified and independent publishing depends on your goals, affiliation, and the scale at which you wish to share your connector. Both paths offer the opportunity to enhance the Power Platform ecosystem and provide valuable tools to users worldwide. By making your connector available through either process, you contribute to the growing library of resources that empower users to create innovative solutions and automate workflows more efficiently.</p>
<h2 id="part-iii-case-study-and-practical-implementation">Part III: Case Study and Practical Implementation</h2>
<h3 id="my-process-for-creating-a-custom-connector-the-gsa-site-scanning-api-case-study">My Process for Creating a Custom Connector: The GSA Site Scanning API Case Study</h3>
<p>Creating a custom connector for the Power Platform involves careful selection of the API, understanding its documentation, and overcoming any potential challenges. For this case study, I’ve chosen the GSA Site Scanning API, an exemplary model of government transparency and digital transformation. Here’s an enhanced overview of my process, rationale, and a deep dive into the Site Scanning Program:</p>
<h4 id="the-site-scanning-program-overview">The Site Scanning Program Overview</h4>
<p>The Site Scanning program represents a pivotal federal initiative, automating the scanning of public federal websites to generate data on website health, policy compliance, and adherence to best practices. This program, provided as a no-cost shared service for federal agencies and the public, is built around the Federal Website Index, a comprehensive listing of all public federal .gov sites categorized by agency or department. Through daily scans, the program amasses over 1.5 million fields of data about approximately 26,000 federal .gov websites, all made accessible via an API and bulk download options.</p>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/8c411181-5072-496e-b4d4-1f28c01ef249" alt="GSA Site Scanning Program Website"></p>
<h4 id="key-features-and-benefits">Key Features and Benefits</h4>
<ul>
<li>
<p><strong>Comprehensive Data Collection:</strong> Scans cover a broad spectrum of compliance and best-practice metrics, including the presence of agency websites and subdomains, Digital Analytics Program participation, US Web Design System utilization, search engine optimization, third-party services, and IPv6 compliance, among others.</p>
</li>
<li>
<p><strong>Public Accessibility and Transparency:</strong> The API and bulk data downloads democratize access to a wealth of information on federal website compliance and health, reinforcing the government’s commitment to transparency and digital accountability.</p>
</li>
<li>
<p><strong>Ease of Access:</strong> The program simplifies the process of obtaining an API key, allowing developers immediate access to scan data. This ease of access facilitates the integration of scan data into developers’ workflows and applications.</p>
</li>
</ul>
<h4 id="advantages-of-using-the-gsa-site-scanning-api">Advantages of Using the GSA Site Scanning API</h4>
<p>The GSA Site Scanning API presents several significant benefits that contribute to its appeal for custom connector development:</p>
<ul>
<li>
<p><strong>Published OpenAPI File</strong>: The API’s <a href="https://open.gsa.gov/api/site-scanning-api/#openapi-specification-file">published OpenAPI specification file</a> is a critical asset for quickly generating a custom connector. This document provides a detailed overview of the API’s operations, parameters, and responses, enabling streamlined creation with minimal manual coding.</p>
</li>
<li>
<p><strong>Comprehensive Documentation</strong>: Beyond the OpenAPI specification, the GSA’s thorough documentation on API endpoints ensures developers have a deep understanding of how to interact with the API, the data available for retrieval, and the integration of this data within applications or workflows.</p>
</li>
<li>
<p><strong>Convenient and Easy API Key Acquisition</strong>: The simple process for obtaining an API key facilitates a quick start for developers, allowing immediate access to the necessary authentication to interact with the API.</p>
</li>
</ul>
<h4 id="disadvantages-of-using-the-gsa-site-scanning-api">Disadvantages of Using the GSA Site Scanning API</h4>
<p>Despite its numerous advantages, there are challenges associated with using the GSA Site Scanning API for custom connector development:</p>
<ul>
<li>
<p><strong>Pagination Challenges</strong>: The GSA Site Scanning API diverges from the <code>@odata.nextLink</code> convention that Power Platform connectors typically rely on for automatic paging, so it required a custom solution. To iterate across the API’s extensive dataset, I built a flow that reads the pagination metadata supplied with each response and dynamically adjusts the query parameters for the next request. Conditional checks and loop controls within the flow ensure that every page is retrieved completely and efficiently. This approach demonstrates how developers can employ custom logic within Power Automate to handle non-standard pagination schemes without sacrificing performance or usability.<br>
<br>
<img src="https://github.com/rwilson504/Blogger/assets/7444929/6eee21b3-6c01-46d7-a140-2da36bd674ba" alt="Illustration of the custom pagination flow in Power Automate, demonstrating how pagination metadata is utilized to navigate through pages of data."><br>
</p>
</li>
<li>
<p><strong>OpenAPI 3.0 Format</strong>: The API’s OpenAPI file is provided in OpenAPI 3.0 format, which is currently not directly supported for custom connectors on the Power Platform. This necessitates converting the file to OpenAPI 2.0 (Swagger), requiring additional steps and potentially external tooling like Apimatic or the <code>api-spec-converter</code> command-line tool. This conversion process, although manageable, introduces an extra layer of complexity to the connector development workflow.</p>
</li>
</ul>
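<p>The paging loop such a flow implements can be sketched in Python. The field names (<code>items</code>, <code>meta.totalPages</code>) and the <code>page</code>/<code>limit</code> query parameters are assumptions for illustration, not the literal GSA schema:</p>

```python
def fetch_all_pages(fetch, base_url, page_size=100):
    """Collect every page from a paginated API.

    `fetch` is any callable that takes a URL and returns the decoded
    JSON body. The `items` list, `meta.totalPages` counter, and the
    `page`/`limit` query parameters are assumed names for illustration.
    """
    page, results = 1, []
    while True:
        body = fetch(f"{base_url}?limit={page_size}&page={page}")
        results.extend(body.get("items", []))
        total_pages = body.get("meta", {}).get("totalPages", page)
        if page >= total_pages:  # stop once the reported last page is reached
            break
        page += 1
    return results
```

<p>In Power Automate the same loop is expressed with a Do until control, a page-number variable, and a condition on the pagination metadata rather than code.</p>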
<p>By carefully considering these advantages and challenges, developers can make informed decisions about using the GSA Site Scanning API for creating custom connectors, ensuring a balanced approach to leveraging this resource within the Power Platform ecosystem.</p>
<h3 id="creating-the-connector-step-by-step">Creating the Connector Step-By-Step</h3>
<h3 id="viewing-the-final-output">Viewing the Final Output</h3>
<p>After walking through the development process of the GSA Site Scanning API connector, you now have the opportunity to review the final output of what was created. This section directs you to the completed files and submission details, allowing you to examine the practical application of the steps previously discussed:</p>
<ul>
<li>
<p><strong>Completed Connector Files</strong>: Access the detailed files and configurations <a href="https://github.com/microsoft/PowerPlatformConnectors/pull/3246/commits/397f689fa540e4967af90010143b30e111e77010">here</a>.</p>
</li>
<li>
<p><strong>Submission Process Insights</strong>: Review the submission and collaborative feedback process <a href="https://github.com/microsoft/PowerPlatformConnectors/pull/3246">here</a>.</p>
</li>
</ul>
<p>This real-world example serves as a valuable reference for your own connector development journey.</p>
<h4 id="step-1-obtaining-the-api-key-for-the-gsa-site-scanning-api">Step 1: Obtaining the API Key for the GSA Site Scanning API</h4>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/e26cc294-2e67-41c7-91c2-c5a3373df144" alt="GSA API Key Signup"></p>
<p>Before you can start building your custom connector, you need to secure an API key, which serves as your access credential for the GSA Site Scanning API. Here’s how to go about it:</p>
<ol>
<li>
<p><strong>Visit the API’s Getting Started Page</strong>: Navigate to the <a href="https://open.gsa.gov/api/site-scanning-api/#getting-started">GSA Site Scanning API’s getting started section</a> to initiate the process of obtaining your API key.</p>
</li>
<li>
<p><strong>Submit the API Key Request Form</strong>: Fill out the form provided on the page with your details. This typically includes your name, email address, and the reason for requesting access. The simplicity of this process underscores the API’s accessibility and developer-friendly approach.</p>
</li>
<li>
<p><strong>Check Your Email</strong>: After submitting the form, keep an eye on your email inbox. The GSA Site Scanning API team will send you the API key, usually within a short period. This key is essential for authenticating your requests to the API during the connector development process.</p>
</li>
<li>
<p><strong>Secure Your API Key</strong>: Once you receive your API key via email, ensure to keep it secure. You’ll be using this key in your custom connector configurations to authenticate and interact with the API.</p>
</li>
</ol>
<p>This initial step is not only about gaining access but also about starting your journey with the API on a note of security and preparedness. With the API key in hand, you’re ready to move on to the technical aspects of building your custom connector.</p>
<h4 id="step-2-enhancing-the-openapi-definition-file-with-extended-attributes">Step 2: Enhancing the OpenAPI Definition File with Extended Attributes</h4>
<p>With the API key in hand, enhancing the OpenAPI definition file for the GSA Site Scanning API is crucial for developing a user-friendly custom connector. This step focuses on refining the definition file to make it as informative and accessible as possible for low-code developers.</p>
<ol>
<li>
<p><strong>Download the OpenAPI Definition File</strong>: Obtain the OpenAPI definition from the <a href="https://open.gsa.gov/api/site-scanning-api/#openapi-specification-file">GSA Site Scanning API documentation</a>. This document is your starting point, providing a detailed overview of the API’s capabilities.</p>
</li>
<li>
<p><strong>Review the OpenAPI File</strong>: Familiarize yourself with the file’s contents, identifying any areas lacking in detail that could benefit from further clarification, especially for low-code developers.</p>
</li>
<li>
<p><strong>Using ChatGPT to Fill in the Details</strong>:</p>
<ul>
<li>
<p>After identifying what’s missing, particularly in terms of extended properties, the next step involves leveraging ChatGPT to generate these details. Due to processing limitations, it’s often necessary to break down the task into smaller, more manageable segments.</p>
</li>
<li>
<p><strong>Approach</strong>:</p>
<ul>
<li><strong>Individual Path Processing</strong>: Begin by inputting individual API paths into ChatGPT, asking it to generate missing summary, description, and other relevant attributes for each. This piecemeal approach helps avoid overwhelming ChatGPT and ensures more accurate outputs.</li>
<li><strong>Sequential Processing</strong>: For APIs with multiple paths, consider processing them sequentially. Start with the first path, then proceed to the next, and so on. This method allows for focused enhancement of each part of the API definition.</li>
</ul>
</li>
<li>
<p><strong>Sample Prompt</strong>: Here’s an example prompt used for generating extended attributes for an API path:</p>
<pre class=" language-markdown"><code class="prism language-markdown">Acting as a Power Platform developer, I would like your assistance in writing a custom connector. I will provide each path for the API. Include the following:
<span class="token list punctuation">*</span> A summary and description attributes for each path.
<span class="token list punctuation">*</span> A description, and x-ms-summary attribute for each path parameter and response property; the x-ms-summary should read like a title for the name field.
<span class="token list punctuation">*</span> A title, description, and x-ms-summary attribute for each response property; the title and x-ms-summary will be the same.
<span class="token list punctuation">*</span> If the name attribute is used in the description, then update the description to use the new title attribute.
Information on those attributes can be found here: https://learn.microsoft.com/en-us/connectors/custom-connectors/openapi-extensions. Please update the file with those additional attributes and provide it back to me.
Here is the first one.
... <<span class="token tag"><span class="token tag"><span class="token punctuation"><</span>paste</span> <span class="token attr-name">the</span> <span class="token attr-name">first</span> <span class="token attr-name">path</span> <span class="token attr-name">here</span><span class="token punctuation">></span></span>>
</code></pre>
</li>
</ul>
<p>Utilizing this prompt, you can guide ChatGPT to produce the necessary extensions and attributes, tailoring the OpenAPI file to better suit low-code development needs within the Power Platform.</p>
</li>
<li>
<p><strong>Integrate ChatGPT Outputs into the OpenAPI File</strong>: After generating the extended attributes with ChatGPT, incorporate these enhancements back into the original OpenAPI definition file. This iterative process significantly enriches the file, making it a more valuable resource for connector development.</p>
</li>
<li>
<p><strong>Validate the Enhanced OpenAPI File</strong>: Ensure the modified file is error-free and adheres to the OpenAPI specification by using validation tools like Swagger Editor. This step confirms that your enhancements are correctly implemented and that the file is ready for use in developing your custom connector.</p>
</li>
</ol>
<p>By methodically enhancing the OpenAPI definition with detailed attributes through ChatGPT, you streamline the development process, offering a guided and simplified experience for developers engaging with your custom connector.</p>
<h4 id="step-3-enhancing-the-openapi-file-with-contact-and-product-metadata">Step 3: Enhancing the OpenAPI File with Contact and Product Metadata</h4>
<p>For independent publishers, it’s crucial to enrich the OpenAPI definition file with detailed contact information and additional product or end service metadata. This enhancement ensures users of your custom connector have comprehensive support resources and insights into the product or service behind the connector.</p>
<h5 id="adding-contact-information">Adding Contact Information</h5>
<p>Incorporate your contact details within the <code>info</code> section of the OpenAPI file. This inclusion offers users a direct channel for support, feedback, or inquiries:</p>
<pre class=" language-json"><code class="prism language-json"><span class="token string">"info"</span><span class="token punctuation">:</span> <span class="token punctuation">{</span>
<span class="token string">"title"</span><span class="token punctuation">:</span> <span class="token string">"GSA Site Scanning API"</span><span class="token punctuation">,</span>
<span class="token string">"description"</span><span class="token punctuation">:</span> <span class="token string">"This API provides information about sites in the federal web presence, enabling users to scan and retrieve data efficiently."</span><span class="token punctuation">,</span>
<span class="token string">"version"</span><span class="token punctuation">:</span> <span class="token string">"2.0"</span><span class="token punctuation">,</span>
<span class="token string">"contact"</span><span class="token punctuation">:</span> <span class="token punctuation">{</span>
<span class="token string">"name"</span><span class="token punctuation">:</span> <span class="token string">"Richard Wilson"</span><span class="token punctuation">,</span>
<span class="token string">"email"</span><span class="token punctuation">:</span> <span class="token string">"richard.a.wilson@microsoft.com"</span><span class="token punctuation">,</span>
<span class="token string">"url"</span><span class="token punctuation">:</span> <span class="token string">"https://www.richardawilson.com/"</span>
<span class="token punctuation">}</span>
<span class="token punctuation">}</span>
</code></pre>
<h5 id="incorporating-product-metadata">Incorporating Product Metadata</h5>
<p>Add metadata about the product or service associated with the connector, such as support details, privacy policy, and categorization. These details enhance user understanding and trust. Use the <code>x-ms-connector-metadata</code> extension for this purpose:</p>
<pre class=" language-json"><code class="prism language-json"><span class="token string">"x-ms-connector-metadata"</span><span class="token punctuation">:</span> <span class="token punctuation">[</span>
<span class="token punctuation">{</span>
<span class="token string">"propertyName"</span><span class="token punctuation">:</span> <span class="token string">"Website"</span><span class="token punctuation">,</span>
<span class="token string">"propertyValue"</span><span class="token punctuation">:</span> <span class="token string">"https://open.gsa.gov/api/site-scanning-api"</span>
<span class="token punctuation">}</span><span class="token punctuation">,</span>
<span class="token punctuation">{</span>
<span class="token string">"propertyName"</span><span class="token punctuation">:</span> <span class="token string">"Privacy policy"</span><span class="token punctuation">,</span>
<span class="token string">"propertyValue"</span><span class="token punctuation">:</span> <span class="token string">"https://www.gsa.gov/technology/government-it-initiatives/digital-strategy/terms-of-service-for-developer-resources"</span>
<span class="token punctuation">}</span><span class="token punctuation">,</span>
<span class="token punctuation">{</span>
<span class="token string">"propertyName"</span><span class="token punctuation">:</span> <span class="token string">"Categories"</span><span class="token punctuation">,</span>
<span class="token string">"propertyValue"</span><span class="token punctuation">:</span> <span class="token string">"IT Operations"</span>
<span class="token punctuation">}</span>
<span class="token punctuation">]</span>
</code></pre>
<p>The <code>contact</code> object within the <code>info</code> section and the <code>x-ms-connector-metadata</code> extension should be meticulously detailed, as shown above. The <code>Categories</code> property must accurately represent the connector’s logical classification, using a semicolon-delimited string from the following categories: AI, Business Management, Business Intelligence, Collaboration, Commerce, Communication, Content and Files, Finance, Data, Human Resources, Internet of Things, IT Operations, Lifestyle and Entertainment, Marketing, Productivity, Sales and CRM, Security, Social Media, Website.</p>
<p>To ensure the accuracy and relevance of product metadata included in your OpenAPI file, the best resources are the API publisher’s website and official API documentation. This is why it’s crucial for publishers to provide comprehensive information about their product or service, including support resources, privacy policies, and categorization details. Equally, for consumers aiming to create custom connectors, selecting APIs that offer this information is pivotal. This approach not only facilitates the creation of more reliable and user-friendly connectors but also promotes transparency and trust within the user community. By prioritizing APIs with well-documented product metadata, both publishers and consumers contribute to a more informed and efficient ecosystem within the Power Platform.</p>
<p>By integrating these contact and product metadata details into your OpenAPI file, you not only adhere to the publishing requirements for independent publishers but also significantly improve the connector’s discoverability, transparency, and trustworthiness among users.</p>
<h4 id="step-4-good-practices-in-setting-the-consumes-and-produces-attributes">Step 4: Good Practices in Setting the Consumes and Produces Attributes</h4>
<p>Defining the <code>consumes</code> and <code>produces</code> attributes in your OpenAPI specification is crucial. These attributes indicate the MIME types your API can accept and return, respectively:</p>
<h5 id="best-practices">Best Practices</h5>
<p>For APIs dealing exclusively with JSON data, explicitly specify <code>application/json</code> for both the <code>consumes</code> and <code>produces</code> attributes. This ensures that the API and client applications handle content types correctly:</p>
<pre class=" language-json"><code class="prism language-json"><span class="token string">"consumes"</span><span class="token punctuation">:</span> <span class="token punctuation">[</span><span class="token string">"application/json"</span><span class="token punctuation">]</span><span class="token punctuation">,</span>
<span class="token string">"produces"</span><span class="token punctuation">:</span> <span class="token punctuation">[</span><span class="token string">"application/json"</span><span class="token punctuation">]</span>
</code></pre>
<p>Setting these attributes correctly ensures data formats are properly understood and handled, promoting consistency and reliability in data exchange.</p>
<h4 id="step-5-addressing-pagination-issues-with-custom-logic-in-power-automate-flows">Step 5: Addressing Pagination Issues with Custom Logic in Power Automate Flows</h4>
<p>To enable effective pagination in the GSA Site Scanning API through Power Automate flows, incorporate <code>limit</code> and <code>page</code> parameters into your requests. These parameters should be defined in your OpenAPI file with precision, ensuring seamless integration and use within the Power Platform.</p>
<h5 id="adding-pagination-parameters-to-the-openapi-file">Adding Pagination Parameters to the OpenAPI File</h5>
<p>Define the <code>limit</code> and <code>page</code> parameters in the OpenAPI file as follows, ensuring they align with the standard structure used for other parameters:</p>
<pre class=" language-json"><code class="prism language-json"><span class="token string">"parameters"</span><span class="token punctuation">:</span> <span class="token punctuation">[</span>
<span class="token punctuation">{</span>
<span class="token string">"name"</span><span class="token punctuation">:</span> <span class="token string">"limit"</span><span class="token punctuation">,</span>
<span class="token string">"in"</span><span class="token punctuation">:</span> <span class="token string">"query"</span><span class="token punctuation">,</span>
<span class="token string">"required"</span><span class="token punctuation">:</span> <span class="token boolean">false</span><span class="token punctuation">,</span>
<span class="token string">"description"</span><span class="token punctuation">:</span> <span class="token string">"Specifies the number of items to return in a single page of results."</span><span class="token punctuation">,</span>
<span class="token string">"x-example"</span><span class="token punctuation">:</span> <span class="token string">"10"</span><span class="token punctuation">,</span>
<span class="token string">"type"</span><span class="token punctuation">:</span> <span class="token string">"integer"</span><span class="token punctuation">,</span>
<span class="token string">"x-ms-summary"</span><span class="token punctuation">:</span> <span class="token string">"Limit"</span>
<span class="token punctuation">}</span><span class="token punctuation">,</span>
<span class="token punctuation">{</span>
<span class="token string">"name"</span><span class="token punctuation">:</span> <span class="token string">"page"</span><span class="token punctuation">,</span>
<span class="token string">"in"</span><span class="token punctuation">:</span> <span class="token string">"query"</span><span class="token punctuation">,</span>
<span class="token string">"required"</span><span class="token punctuation">:</span> <span class="token boolean">false</span><span class="token punctuation">,</span>
<span class="token string">"description"</span><span class="token punctuation">:</span> <span class="token string">"Specifies the page number of the results to retrieve."</span><span class="token punctuation">,</span>
<span class="token string">"x-example"</span><span class="token punctuation">:</span> <span class="token string">"1"</span><span class="token punctuation">,</span>
<span class="token string">"type"</span><span class="token punctuation">:</span> <span class="token string">"integer"</span><span class="token punctuation">,</span>
<span class="token string">"x-ms-summary"</span><span class="token punctuation">:</span> <span class="token string">"Page"</span>
<span class="token punctuation">}</span>
<span class="token punctuation">]</span>
</code></pre>
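<p>To make the custom paging logic concrete, here is a minimal Python sketch of the loop a flow performs with these <code>limit</code> and <code>page</code> parameters. The response shape assumed here (an <code>items</code> array plus a <code>meta</code> block carrying <code>currentPage</code> and <code>totalPages</code>) is an assumption; adjust it to the payload the API actually returns. The fetch function is injected so the loop itself stays independent of any HTTP library:</p>

```python
def fetch_all_items(fetch_page, limit=100):
    """Collect items from every page of a paginated API.

    fetch_page is any callable taking (page, limit) and returning the decoded
    JSON body for that page. The {"items": [...], "meta": {...}} shape used
    below is an assumption about the response payload.
    """
    items, page = [], 1
    while True:
        body = fetch_page(page=page, limit=limit)
        items.extend(body.get("items", []))
        meta = body.get("meta", {})
        # Stop once the reported current page reaches the reported total.
        if meta.get("currentPage", page) >= meta.get("totalPages", page):
            break
        page += 1
    return items
```

<p>In Power Automate, the same shape appears as a Do-Until loop: initialize a page variable to 1, call the connector action inside the loop, append the returned items to an array variable, and exit when the pagination metadata indicates the last page.</p>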
<h4 id="step-6-converting-the-openapi-3.0-file-to-swagger-2.0-format">Step 6: Converting the OpenAPI 3.0 File to Swagger 2.0 Format</h4>
<p>The Power Platform requires custom connector definitions to be in Swagger 2.0 format. If your OpenAPI definition file is in version 3.0, you’ll need to convert it before proceeding with the import. There are a few tools available for this conversion:</p>
<ol>
<li>
<p><strong>Using Apimatic’s Transform API Function</strong>: Apimatic offers a convenient <a href="https://app.apimatic.io/">Transform API</a> function that allows you to easily convert between different API specification formats, including from OpenAPI 3.0 to Swagger 2.0. Simply upload your OpenAPI 3.0 file, select the target format (Swagger 2.0), and download the converted file.</p>
</li>
<li>
<p><strong>Utilizing the api-spec-converter Command Line Tool</strong>: For those who prefer working from the command line or need to integrate this step into automated workflows, the <a href="https://github.com/LucyBot-Inc/api-spec-converter">api-spec-converter</a> tool available on GitHub is an excellent resource. The tool can be installed using the following command:</p>
<pre class=" language-bash"><code class="prism language-bash"><span class="token function">npm</span> <span class="token function">install</span> -g api-spec-converter
</code></pre>
<p>This tool supports various API specification formats and can be used to convert your OpenAPI 3.0 file to Swagger 2.0 format with a simple command:</p>
<pre class=" language-bash"><code class="prism language-bash">api-spec-converter --from<span class="token operator">=</span>openapi_3 --to<span class="token operator">=</span>swagger_2 my_api_file.yml <span class="token operator">></span> my_converted_api_file.json
</code></pre>
<p>Ensure you have Node.js installed to use the <code>api-spec-converter</code>, as it is a prerequisite for running the tool.</p>
</li>
</ol>
<p>After converting your file to Swagger 2.0 format, you’re ready to proceed with importing your custom connector into the Power Platform. This conversion step ensures compatibility with the platform’s requirements, paving the way for a smoother development and publication process.</p>
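<p>Before importing, a quick sanity check that the converted file really declares Swagger 2.0 can save a failed upload. A minimal sketch (the function name is this example’s own):</p>

```python
import json

def is_swagger_2(definition_text):
    """Return True if a JSON API definition declares Swagger 2.0.

    Swagger 2.0 files carry a top-level "swagger": "2.0" field, whereas
    OpenAPI 3.x files carry a top-level "openapi" field instead.
    """
    spec = json.loads(definition_text)
    return spec.get("swagger") == "2.0" and "openapi" not in spec
```
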
<h4 id="step-7-importing-the-enhanced-openapi-file-into-the-power-platform-as-a-custom-connector">Step 7: Importing the Enhanced OpenAPI File into the Power Platform as a Custom Connector</h4>
<p>After enhancing the OpenAPI definition file with extended attributes and adding your contact information, the next step is to import this file into the Power Platform to create your custom connector. This step transforms the API definition into a functional tool within the Power Platform ecosystem. Here’s how to do it:</p>
<ol>
<li>
<p><strong>Navigate to Power Apps or Power Automate</strong>: Open either Power Apps or Power Automate, depending on where you intend to use the custom connector.</p>
</li>
<li>
<p><strong>Access Custom Connectors</strong>: From the navigation pane, select “Custom Connectors.” This area allows you to manage, create, and import custom connectors.</p>
</li>
<li>
<p><strong>Create a New Custom Connector</strong>: Opt for creating a new custom connector and select “Import an OpenAPI file” as your method, given you have a prepared API definition.</p>
</li>
<li>
<p><strong>Upload the OpenAPI File</strong>:</p>
<ul>
<li>Click “Import” and locate your updated OpenAPI file.</li>
<li>The platform will process the file and auto-generate the connector’s scaffolding based on the OpenAPI specification.</li>
</ul>
<p>Ensure the OpenAPI file is properly formatted and complete; the import validation will check the file’s integrity.</p>
</li>
<li>
<p><strong>Configure the Connector</strong>: Fine-tune your connector’s settings in the editor, paying special attention to the nuances that can affect its approval for publication:</p>
<ul>
<li><strong>Title</strong>: When naming your connector, avoid including the term “API” in the title. Microsoft’s guidelines for connector submission stipulate that the title should not contain “API” to ensure clarity and consistency across the platform.</li>
<li><strong>Icon</strong>: For this particular connector, an icon will not be used. When choosing to include an icon for other connectors, ensure it meets the <a href="https://learn.microsoft.com/en-us/connectors/custom-connectors/certification-submission#step-5-prepare-the-connector-artifacts">artifacts requirements</a> set by Microsoft, which includes specifications for size, format, and design.</li>
<li><strong>Color</strong>: Set the connector’s color to <code>#da3b01</code>. This specific color is a requirement for independent connectors, helping to distinguish them visually within the Power Platform ecosystem.</li>
</ul>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/bc7b4afc-068a-4e3a-849f-965547088c2a" alt="Custom Connector Configuration Page"></p>
<p>Incorporate these configuration details carefully to align with the platform’s standards and increase the likelihood of your connector being approved for publication. Further refine authentication methods, actions, triggers, parameters, and general information according to the functionality and data flow defined in your OpenAPI file.</p>
</li>
</ol>
<h4 id="step-8-saving-and-testing-the-connector">Step 8: Saving and Testing the Connector</h4>
<p>After configuring your custom connector, the crucial next steps involve saving your work and thoroughly testing the connector to ensure its functionality:</p>
<ol>
<li>
<p><strong>Saving the Connector</strong>: Make sure all configurations are accurately applied and save your custom connector.</p>
</li>
<li>
<p><strong>Testing the Connector</strong>: Utilize the testing feature within the custom connector editor on the Power Platform. This enables you to run each action defined in the connector to verify correct interaction with the API and appropriate response handling.</p>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/2b6b182b-f55e-4d25-b041-7a2644223eb6" alt="Custom Connector Test Area"></p>
</li>
<li>
<p><strong>Encountering and Resolving Errors</strong>: During testing, you may run into various schema validation errors. Addressing these errors is key to ensuring your connector’s reliability and user-friendliness.</p>
<ul>
<li>
<p><strong>Remove ‘Required’ Sections from Response Schemas</strong>: For return schemas (those defining API responses), removing the “required” section can prevent issues where the actual API response does not strictly adhere to the expected schema, particularly for non-mandatory fields.</p>
</li>
<li>
<p><strong>Adjusting for Type Mismatches</strong>: Use the Swagger Editor, accessible via a toggle in the custom connector editor, to quickly edit the Swagger (OpenAPI) information. This allows for rapid adjustments to resolve type mismatches and other schema validation errors highlighted during testing.</p>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/8bafc5b5-2912-46e2-87a6-acea7de1b248" alt="Using Swagger Editor in Custom Connector UI"></p>
</li>
</ul>
</li>
<li>
<p><strong>Re-testing After Making Adjustments</strong>: Once you’ve made the necessary corrections, conduct another round of tests to ensure all issues have been resolved. Continue refining and testing until the connector performs flawlessly across all operations.</p>
</li>
</ol>
<p>Thorough testing and diligent error resolution are indispensable for preparing your custom connector for deployment or submission as an independent publisher. By meticulously addressing any issues encountered during the testing phase, you ensure the connector is robust, functional, and ready for use within the Power Platform ecosystem.</p>
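<p>The “remove the required sections from response schemas” fix described above can also be scripted instead of applied by hand in the Swagger Editor. This sketch walks a Swagger 2.0 definition and strips <code>required</code> lists from response schemas only, leaving request parameters untouched; the function names are this example’s own:</p>

```python
def strip_required(schema):
    """Remove 'required' lists from an object schema and its nested schemas."""
    if not isinstance(schema, dict):
        return
    schema.pop("required", None)
    for prop in schema.get("properties", {}).values():
        strip_required(prop)
    strip_required(schema.get("items"))

def relax_response_schemas(spec):
    """Strip 'required' from every response schema in a Swagger 2.0 spec dict."""
    for path_item in spec.get("paths", {}).values():
        for operation in path_item.values():
            if not isinstance(operation, dict):
                continue
            for response in operation.get("responses", {}).values():
                if "schema" in response:
                    strip_required(response["schema"])
    return spec
```

<p>Recursing only through <code>properties</code> and <code>items</code> keeps the edit confined to schema objects, so a property that happens to be named <code>required</code> is left alone.</p>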
<h4 id="step-9-creating-a-readme.md-file">Step 9: Creating a README.MD File</h4>
<p>Creating a comprehensive README.MD file is a critical step in developing custom connectors for the Power Platform. This document serves as the first point of reference for users to understand the capabilities, setup requirements, and usage of your custom connector. Below is a guide to structuring your README.MD file, following a template inspired by the <a href="https://github.com/microsoft/PowerPlatformConnectors/blob/dev/custom-connectors/AzureKeyVault/Readme.md">Azure Key Vault connector README on GitHub</a>. Remember to customize each section based on your connector’s specific features and requirements.</p>
<p>To speed up the creation of README.MD files for custom connectors, you can leverage ChatGPT. By providing it with a detailed prompt, you can generate a comprehensive, well-structured README quickly. The following prompt can be used with ChatGPT to create these README.MD files:</p>
<pre class=" language-md"><code class="prism language-md">Acting as a technical writer, create a README.MD file for the custom connector. Below is the template for the file; do not deviate from the template. When generating the operations, please make sure to include all input attributes and utilize the friendly names for them.
TEMPLATE:
# Connector Title from OpenAPI File
Description for the connector here, take some of the information from the description field in the OpenAPI file.
## Publisher: Richard Wilson
## Prerequisites
Discuss the pre-requisites the connector such as obtaining an api key or setting up oauth or creating an account on the services website.
## Supported Operations
### Operation 1
Operation description
- **Inputs**:
- `Input 1`: input 1 description.
- **Outputs**:
- `Output 1`: output 1 description.
## Obtaining Credentials
Discuss how to get credentials for the service.
## Known Issues and Limitations
Currently, no known issues or limitations exist. Always refer to this section for updated information.
OPENAPI FILE:
<<paste in the contents of your open api file.>>
</code></pre>
<p>Remember, the output from ChatGPT might require some tweaks and customization to perfectly fit your connector’s specificities and the context in which it will be used. Always review and adjust the generated content to ensure accuracy and completeness.</p>
<h4 id="step-10-preparing-for-connector-submission-to-powerplatformconnectors-github-repository">Step 10: Preparing for Connector Submission to PowerPlatformConnectors GitHub Repository</h4>
<p>Contributing a new connector to the PowerPlatformConnectors GitHub repository requires a structured approach. Here’s a step-by-step guideline:</p>
<ol>
<li>
<p><strong>Forking the Repository</strong><br>
Create a fork of the PowerPlatformConnectors repository, targeting the dev branch as your starting point.</p>
</li>
<li>
<p><strong>Creating a Feature Branch</strong><br>
From your fork, create a new branch specifically for your connector development, such as gsasitescan.</p>
</li>
<li>
<p><strong>Setting Up the Connector Directory</strong><br>
Within your branch, create a new folder under independent-publisher-connectors named after your connector to store all related files.</p>
</li>
<li>
<p><strong>Installing and Logging in with paconn</strong><br>
Before managing your connector with <code>paconn</code>, ensure Python 3.5+ is installed and then install <code>paconn</code>. Follow these steps:</p>
<ul>
<li>
<p>Check Python installation:</p>
<pre class=" language-bash"><code class="prism language-bash">python --version
</code></pre>
</li>
<li>
<p>Install <code>paconn</code> via pip:</p>
<pre class=" language-bash"><code class="prism language-bash">pip <span class="token function">install</span> paconn
</code></pre>
</li>
<li>
<p>Authenticate with <code>paconn</code>:</p>
<pre class=" language-bash"><code class="prism language-bash">paconn login
</code></pre>
</li>
</ul>
<p>Refer to the paconn <a href="https://learn.microsoft.com/en-us/connectors/custom-connectors/paconn-cli#installing">CLI documentation</a> for detailed instructions.</p>
</li>
<li>
<p><strong>Utilizing paconn for Connector Management</strong><br>
With paconn installed and authenticated, manage your connector effectively:</p>
<ul>
<li>
<p>Download the connector:</p>
<pre class=" language-bash"><code class="prism language-bash">paconn download
</code></pre>
</li>
<li>
<p>Update the connector after modifications:</p>
<pre class=" language-bash"><code class="prism language-bash">paconn update
</code></pre>
</li>
<li>
<p>Validate the connector to ensure it meets submission standards:</p>
<pre class=" language-bash"><code class="prism language-bash">paconn validate --api-def apiDefinition.swagger.json
</code></pre>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/6cf8aedf-4a9f-4b18-be73-579163ac2b88" alt="Output of paconn validate Command"></p>
</li>
</ul>
</li>
<li>
<p><strong>Submitting a Pull Request and Providing Proof of Testing</strong><br>
After finalizing your connector, you’ll need to submit a pull request from your fork to the main PowerPlatformConnectors repository, targeting the <code>dev</code> branch. To ensure your submission will be accepted:</p>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/83321919-43a1-45b8-962b-ee2159f57184" alt="Pull Request"></p>
<ul>
<li>
<p><strong>Attach Images of Successful Action Tests</strong>: Include screenshots that show each action of your connector running successfully within Power Automate. These images should clearly demonstrate the action configurations and the successful outcomes.</p>
</li>
<li>
<p><strong>Screenshots of Working Flows</strong>: Add screenshots of the Power Automate flows you’ve created using your connector, showcasing successful execution. These provide concrete examples of how your connector operates in real-world scenarios.</p>
</li>
<li>
<p><strong>Image of <code>paconn validate</code> Command</strong>: Lastly, attach a screenshot of the <code>paconn validate</code> command executing without any errors. This image is crucial as it verifies that your connector meets the Power Platform’s custom connector certification criteria.</p>
</li>
</ul>
<p>Including these visual proofs with your pull request not only strengthens the credibility of your submission but also streamlines the review process by demonstrating your connector’s functionality and compliance with Power Platform standards.</p>
</li>
</ol>
<h2 id="conclusion-bridging-government-apis-and-the-power-platform">Conclusion: Bridging Government APIs and the Power Platform</h2>
<p>The integration of US government APIs with the Microsoft Power Platform represents a significant leap forward in the digital transformation of government services. By fostering the development of custom connectors and encouraging government API developers to adhere to Power Platform standards, this guide aims to catalyze a more interconnected, efficient, and user-friendly digital ecosystem.</p>
<h3 id="impact-on-digital-government-services">Impact on Digital Government Services</h3>
<p>The collaboration between custom connector developers and government API developers underlines a shared goal: to enhance the accessibility, efficiency, and innovation of digital government services. Custom connectors unlock the potential of government data, making it more accessible and actionable within the Power Platform’s suite of apps. This not only streamlines operations but also opens up new avenues for data analysis, automation, and citizen engagement.</p>
<h4 id="enhancing-accessibility-efficiency-and-innovation">Enhancing Accessibility, Efficiency, and Innovation</h4>
<ul>
<li><strong>Accessibility</strong>: By simplifying the integration of government APIs with the Power Platform, we ensure that valuable government data is more accessible to developers and, ultimately, to the public. This enhances transparency and empowers citizens with easy access to information and services.</li>
<li><strong>Efficiency</strong>: Streamlined workflows and automated processes reduce manual effort, accelerate service delivery, and minimize errors. Government agencies can operate more efficiently, focusing resources on innovation rather than administration.</li>
<li><strong>Innovation</strong>: The guide’s dual focus encourages not just the consumption of existing government APIs but also the thoughtful development of new APIs with Power Platform integration in mind. This fosters an environment of continuous improvement and innovation, where government services can evolve to meet changing needs and expectations.</li>
</ul>
<p>The journey towards fully leveraging the Power Platform within government services is ongoing. Developers and government entities alike are encouraged to share their success stories, engage with the community, and provide feedback. This collective effort will ensure that the ecosystem continues to grow, adapt, and serve the public with ever-greater effectiveness.</p>
<p>As we look to the future, the role of custom connectors and well-designed government APIs will only increase in importance. This guide represents a step towards realizing that future—a future where digital government services are more integrated, more accessible, and more responsive to the needs of citizens.</p>
Rick A. Wilson (RAW)http://www.blogger.com/profile/03562616659850164140noreply@blogger.com0tag:blogger.com,1999:blog-8675696861245191896.post-70844587296428372152024-02-01T14:21:00.001-05:002024-02-01T14:21:30.726-05:00Handling Graph API Pagination in Power Platform Dataflows<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/123afe82-83a8-4e29-9bd8-828bd925b8fc" alt="Handling Graph API Pagination in Power Platform Dataflows"></p>
<h2 id="introduction">Introduction</h2>
<p>When managing extensive user datasets from Microsoft Graph API, a common challenge is handling the pagination of data. This blog post explores a solution for effectively looping through multiple pages of Graph API data within Power Platform dataflows and discusses alternative methods that might be more efficient in certain scenarios.</p>
<h2 id="background">Background</h2>
<p>The need for this project arose from our requirement to comprehensively collect and set manager information for our users in Dataverse. This was integral to improving our workflow automation and communication processes. However, the pagination feature in Graph API responses presented a significant challenge in retrieving this data efficiently, prompting us to seek a solution that could integrate seamlessly with Dataverse and optimize our data management practices.</p>
<h2 id="exploring-graph-api-with-graph-explorer">Exploring Graph API with Graph Explorer</h2>
<p>To become proficient in crafting queries for the Microsoft Graph API, a powerful resource at your disposal is the <a href="https://developer.microsoft.com/en-us/graph/graph-explorer/">Graph Explorer tool</a>. This interactive tool allows you to formulate and test Graph API queries in a user-friendly environment. It provides a practical hands-on approach to learning how the API responds to different queries and helps you understand the structure of the data it returns.</p>
<p>By experimenting with Graph Explorer, you gain valuable insights into how Graph API operates, enabling you to build more effective queries for your Power Platform dataflows. Whether you’re retrieving user data, managing tasks, or accessing analytics, the Graph Explorer can be your sandbox for mastering Graph API interactions.</p>
<h2 id="solution-implementing-looping-logic-and-considering-alternatives">Solution: Implementing Looping Logic and Considering Alternatives</h2>
<h3 id="understanding-graph-api-pagination">1. Understanding Graph API Pagination</h3>
<ul>
<li>Microsoft Graph API uses pagination to manage large datasets, providing links to subsequent pages in each response.</li>
<li>Effectively handling this is crucial for comprehensive data collection.</li>
</ul>
<h3 id="setting-up-the-graph-api-connection">2. Setting Up the Graph API Connection</h3>
<ul>
<li>The first step involves integrating the Graph API into your Power Platform dataflow.</li>
<li>Create a new Dataflow in the Maker portal.<br>
<img src="https://github.com/rwilson504/Blogger/assets/7444929/6b2279b1-0964-4e7a-abe5-87128d9255b0" alt="image"></li>
<li>Select the Web API connector<br>
<img src="https://github.com/rwilson504/Blogger/assets/7444929/07f9fa85-87d8-41fe-b3d1-9a730ca7b9f0" alt="image"></li>
<li>Create a connection to the Graph API URL (e.g., <a href="https://graph.microsoft.com/v1.0">https://graph.microsoft.com/v1.0</a>). Make sure to set the authentication type to <strong>Organizational account</strong>.<br>
<img src="https://github.com/rwilson504/Blogger/assets/7444929/347b939f-4862-4cf5-8c37-10e6cbea7de5" alt="image"></li>
<li>A new query will be created showing all of the endpoints for the Graph API.<br>
<img src="https://github.com/rwilson504/Blogger/assets/7444929/8f4914d2-4d83-47b9-aa1b-c96dd695b653" alt="image"></li>
</ul>
<p>Now that you have a connection to Graph, you can use the default query that was created for you, or build your own queries following the same pattern, passing your Graph query into <strong>Json.Document(Web.Contents(“<a href="https://graph.microsoft.com/v1.0/me">https://graph.microsoft.com/v1.0/me</a>”))</strong>.</p>
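<p>For example, a minimal custom query built on this pattern could look like the sketch below (<code>displayName</code> is one of the properties the <code>/me</code> endpoint returns):</p>
<pre><code>let
    // Call the /me endpoint and parse the JSON response into a record.
    Source = Json.Document(Web.Contents("https://graph.microsoft.com/v1.0/me")),
    // Read a single property from the returned record.
    displayName = Source[displayName]
in
    displayName
</code></pre>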
<h3 id="looping-through-paginated-data">3. Looping Through Paginated Data</h3>
<ul>
<li>Develop a script or logic within your dataflow to process each page and retrieve the next page’s link.</li>
<li>The code below loops through all users within <a href="https://www.microsoft.com/en-us/security/business/identity-access/microsoft-entra-id">Entra ID</a> and retrieves each user’s manager information. You can use the <strong>Advanced editor</strong> button in the query window to copy and paste this code.</li>
</ul>
<pre><code>let
// This query returns all users who are domain members and have a usage location of 'US'.
// Expand retrieves the mail attribute of each user's first-level manager.
// Top is set to 999, the maximum page size for this call, which reduces the number of requests made.
url = "https://graph.microsoft.com/v1.0/users?$filter=userType eq 'Member' and usageLocation eq 'US' &$select=userPrincipalName&$expand=manager($levels=1;$select=mail)&$top=999",
// This function will return the data for each page.
FnGetOnePage = (url) as record =>
let
Source = Json.Document(Web.Contents(url)),
data = try Source[value] otherwise null,
next = try Record.Field(Source, "@odata.nextLink") otherwise null,
res = [Data=data, Next=next]
in
res,
// This calls the function to return data for each page that is returned and creates the combined list.
GeneratedList = List.Generate(
()=>[i=0, res = FnGetOnePage(url)],
each [res][Data] <> null,
each [i=[i]+1, res = FnGetOnePage([res][Next])],
each [res][Data]
),
// Create a combined list
CombinedList = List.Combine(GeneratedList),
// Convert the list into a table format
#"Convert To Table" = Table.FromList(CombinedList, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
// Assuming the data structure is a record, expand the columns you need. Adjust the column names based on your data structure.
#"Expand Each Record" = Table.ExpandRecordColumn(#"Convert To Table", "Column1", {"userPrincipalName", "manager"}, {"userPrincipalName", "manager"}),
// Expand the manager record to get their email address
#"Expand Manager" = Table.ExpandRecordColumn(#"Expand Each Record", "manager", {"mail"}, {"mail.1"}),
#"Renamed columns" = Table.RenameColumns(#"Expand Manager", {{"mail.1", "ADUser.ManagerEmail"}, {"userPrincipalName", "ADUser.Upn"}})
in
#"Renamed columns"
</code></pre>
<h3 id="considering-alternatives">4. Considering Alternatives</h3>
<ul>
<li>
<p>Looping through pages is straightforward, but it is not always the most efficient approach.</p>
<p><strong>Alternative Approaches</strong>:</p>
<ul>
<li><strong>Batch Processing</strong>: Send multiple requests in a single call using the Microsoft Graph batch processing feature.</li>
<li><strong>Delta Query</strong>: Use delta queries to fetch only changes since the last query, reducing data volume.</li>
<li><strong>WebJobs or Azure Functions</strong>: For more control, consider Azure services like WebJobs or Azure Functions.</li>
</ul>
</li>
</ul>
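<p>As an illustration of the batch approach, a batched request could be issued from Power Query itself using the same <code>Web.Contents</code> pattern shown earlier. The sketch below is an assumption of how such a call might look, not part of the original solution; the request IDs and relative URLs are examples only:</p>
<pre><code>let
    // Build the JSON body for the $batch endpoint; Graph allows up to 20 requests per batch.
    batchBody = Json.FromValue([
        requests = {
            [ id = "1", method = "GET", url = "/users?$top=5" ],
            [ id = "2", method = "GET", url = "/me/manager" ]
        }
    ]),
    // Supplying Content makes Web.Contents issue a POST request.
    Source = Json.Document(
        Web.Contents(
            "https://graph.microsoft.com/v1.0/$batch",
            [ Content = batchBody, Headers = [#"Content-Type" = "application/json"] ]
        )
    )
in
    // Each entry in the responses list carries the id, status, and body of one request.
    Source[responses]
</code></pre>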
<h3 id="conclusion-and-optimization-tips">5. Conclusion and Optimization Tips</h3>
<ul>
<li>While looping is effective, alternatives could offer more efficiency for specific use cases or large-scale operations.</li>
<li>Evaluate your project needs to choose the most suitable method.</li>
</ul>
<h2 id="final-note">Final Note</h2>
<p>Assess your project’s specific requirements to determine whether looping through pagination or an alternative approach is more appropriate. This decision can significantly impact the efficiency and scalability of your data management in Power Platform and Graph API.</p>
Rick A. Wilson (RAW)http://www.blogger.com/profile/03562616659850164140noreply@blogger.com0tag:blogger.com,1999:blog-8675696861245191896.post-2955139887432779892024-01-23T14:19:00.003-05:002024-01-23T14:19:47.678-05:00Enhancing Public Sector Travel with the GSA Per Diem Connector for Power Platform<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/cf4fb2d3-16a9-40de-ae6f-57b04cb20819" alt="# Enhancing Public Sector Travel with the GSA Per Diem Connector for Power Platform"></p>
<p>Navigating travel expenses in the public sector can be intricate, especially when it comes to adhering to per diem rates. To address this, I developed the GSA Per Diem Connector for Power Platform, aiming to simplify access to essential travel expense data. For those looking to delve deeper into per diem rates, the <a href="https://www.gsa.gov/travel/plan-book/per-diem-rates">U.S. General Services Administration (GSA)</a> offers comprehensive information. Additionally, the <a href="https://open.gsa.gov/api/perdiem-api/">GSA Per Diem API</a>, which serves as the backbone of this connector, provides detailed insights into the rate data.</p>
<h2 id="the-importance-of-per-diem-rates">The Importance of Per Diem Rates</h2>
<p>Per diem rates, set by the U.S. General Services Administration (GSA), are daily allowances allotted to federal employees to cover lodging, meals, and incidental expenses when traveling for work. These rates are crucial for budgeting and expense management in government-related travel.</p>
<h2 id="my-journey-to-approval">My Journey to Approval</h2>
<p>Developing this connector involved aligning with Microsoft’s <a href="https://learn.microsoft.com/en-us/connectors/custom-connectors/certification-submission-ip">certification submission guidelines</a>, ensuring both functionality and compliance. The approval signifies the connector’s readiness to serve the public sector’s specific needs.</p>
<h2 id="connector-overview">Connector Overview</h2>
<p>The GSA Per Diem Connector seamlessly integrates with Power Platform, granting users immediate access to up-to-date GSA per diem rates. This integration is key for accurately calculating travel expenses and streamlining expense reporting processes. For more in-depth information and a detailed exploration of the connector’s capabilities, visit the <a href="https://learn.microsoft.com/en-us/connectors/gsaperdiem/">official GSA Per Diem Rates connector page on Microsoft Learn</a>.</p>
<h2 id="empowering-public-sector-organizations">Empowering Public Sector Organizations</h2>
<p>The connector is specifically designed to:</p>
<ul>
<li><strong>Simplify Data Retrieval</strong>: Provides instant, up-to-date access to per diem rates.</li>
<li><strong>Ensure Accuracy and Compliance</strong>: Maintains alignment with federal travel expense standards.</li>
<li><strong>Enable Custom Solutions</strong>: Combines per diem data with other Power Platform functionalities for bespoke application development.</li>
</ul>
<h2 id="practical-applications">Practical Applications</h2>
<ul>
<li><strong>Automated Expense Reporting</strong>: Enhances travel expense forms with automatically integrated current per diem rates.</li>
<li><strong>Budget Forecasting</strong>: Leverages current and historical rate data for accurate fiscal planning.</li>
<li><strong>Custom Travel Management Tools</strong>: Enables the development of real-time, compliant travel-planning applications.</li>
</ul>
<h2 id="actions-available-in-the-gsa-per-diem-connector">Actions Available in the GSA Per Diem Connector</h2>
<p>The GSA Per Diem Connector offers a variety of actions to retrieve per diem rates, each tailored to different requirements. Here’s a breakdown of these actions and how they can be utilized:</p>
<ol>
<li>
<p><strong>Retrieve Rates by City, State, and Year</strong></p>
<ul>
<li><strong>Functionality</strong>: Fetches per diem rates for a specific city and state for a given fiscal year.</li>
<li><strong>Use Case</strong>: Ideal for applications focused on specific cities or for detailed travel planning within known destinations.</li>
</ul>
</li>
<li>
<p><strong>Retrieve Rates for All Counties in a State</strong></p>
<ul>
<li><strong>Functionality</strong>: Provides per diem rates for all counties and cities within a chosen state for a specified year.</li>
<li><strong>Use Case</strong>: Useful for applications that require a broad view of per diem rates across an entire state, such as statewide travel management systems.</li>
</ul>
</li>
<li>
<p><strong>Retrieve Rates by ZIP Code</strong></p>
<ul>
<li><strong>Functionality</strong>: Offers per diem rates based on ZIP code for a specific fiscal year.</li>
<li><strong>Use Case</strong>: Best suited for applications where the travel destination is known primarily by ZIP code, providing quick and localized rate information.</li>
</ul>
</li>
<li>
<p><strong>Lodging Rates for Continental US</strong></p>
<ul>
<li><strong>Functionality</strong>: Gives detailed lodging rate information across various locations within the Continental US for the selected fiscal year.</li>
<li><strong>Use Case</strong>: Essential for applications managing accommodation expenses, offering comprehensive data for budgeting and reimbursement processes.</li>
</ul>
</li>
<li>
<p><strong>Mapping ZIP Codes to Destination IDs</strong></p>
<ul>
<li><strong>Functionality</strong>: Delivers a mapping of ZIP codes to their corresponding Destination-IDs and state locations for a particular fiscal year.</li>
<li><strong>Use Case</strong>: This action is particularly beneficial for advanced applications that integrate geographic data with per diem rates for enhanced travel analysis and reporting.</li>
</ul>
</li>
</ol>
<p>Each of these actions is designed to make accessing and utilizing per diem rate data as seamless and efficient as possible, ensuring your Power Platform solutions are both robust and compliant with federal travel regulations.</p>
<h2 id="integrating-the-connector">Integrating the Connector</h2>
<p>Incorporating the GSA Per Diem Connector into Power Platform enhances travel-related applications with essential per diem rate data. Here’s a brief guide on how to utilize the connector in both Power Automate and Power Apps.</p>
<h3 id="building-a-power-automate-flow">Building a Power Automate Flow</h3>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/4f567874-08c5-40d5-a382-0335ee07cd0e" alt="image"></p>
<ol>
<li><strong>Create a New Flow</strong>: Start by creating a new automated flow in Power Automate.</li>
<li><strong>Trigger</strong>: Choose a trigger that suits your application needs, such as a scheduled trigger for daily updates or a manual trigger for on-demand requests.</li>
<li><strong>Add the Connector</strong>: Search for the GSA Per Diem Connector in the action panel and add it to your flow.</li>
<li><strong>Configure Parameters</strong>: Set up the required parameters (City, State Abbreviation, Year, etc.) based on your specific needs.</li>
<li><strong>Add Actions</strong>: Use the retrieved per diem rates to perform calculations, create reports, send notifications, or store data in your database.</li>
<li><strong>Test and Deploy</strong>: Test your flow to ensure it works as expected and then deploy it for use.</li>
</ol>
<h3 id="creating-a-power-app">Creating a Power App</h3>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/e41323ec-217c-4d02-9939-a5faef508fa1" alt="Sample Power App"></p>
<p>The entire sample app can be downloaded here: <a href="https://github.com/rwilson504/Blogger/raw/master/gsa-per-diem-connector/Travel%20Location.msapp">Sample Power App</a></p>
<ol>
<li><strong>Start a New App</strong>: Open Power Apps and start a new canvas app from blank or choose a template that fits your scenario.</li>
<li><strong>Add Data Connection</strong>: Connect to the GSA Per Diem Connector as a data source.<br>
<img src="https://github.com/rwilson504/Blogger/assets/7444929/843b9b07-1748-4e6f-ae2a-8046ace82758" alt="image"></li>
<li><strong>Design the User Interface</strong>: Create a user-friendly interface with input fields for city, state, and year, and display areas for the retrieved per diem rates.</li>
<li><strong>Integrate the Connector</strong>: Use Power Apps formulas to fetch per diem rates based on user input and display the results in the app.</li>
<li><strong>Add Logic and Navigation</strong>: Implement logic for data validation, error handling, and navigation between different screens of the app.</li>
<li><strong>Test and Share Your App</strong>: Thoroughly test the app for usability and accuracy. Once satisfied, share your app with users in your organization.</li>
</ol>
<p>Power Fx code for the <strong>OnSelect</strong> property of the Location gallery:</p>
<pre><code>Set(varPerDiemRates, GSAPerDiem.GetPerDiemRatesByCityStateAndYear(ThisItem.City, ThisItem.State, Dropdown1.SelectedText.Value).rates);
If(
CountRows(varPerDiemRates) = 0,
Set(
noLocationFound,
true
);
Set(
varMeals,
Blank()
);
//Set(varHotelMonths, Table({}));
Clear(colHotelMonths);
Set(
varCity,
Blank()
);
Set(
varState,
Blank()
);
Set(
varYear,
Blank()
);
,
Set(
noLocationFound,
false
);
Set(
varMeals,
Index(
Index(
varPerDiemRates,
1
).rate,
1
).meals
);
Set(
varCity,
Index(
Index(
varPerDiemRates,
1
).rate,
1
).city
);
Set(
varState,
Index(
varPerDiemRates,
1
).state
);
Set(
varYear,
Index(
varPerDiemRates,
1
).year
);
ClearCollect(
colHotelMonths,
Index(
Index(
varPerDiemRates,
1
).rate,
1
).months.month
);
)
</code></pre>
<h2 id="conclusion">Conclusion</h2>
<p>I am pleased to offer the public sector a practical tool in the GSA Per Diem Connector, simplifying the management of travel expenses in alignment with federal guidelines.</p>
<h2 id="explore-and-connect">Explore and Connect</h2>
<p>Discover the GSA Per Diem Connector’s capabilities within your Power Platform environment. For further information or support, feel free to reach out.</p>
Rick A. Wilson (RAW)http://www.blogger.com/profile/03562616659850164140noreply@blogger.com1tag:blogger.com,1999:blog-8675696861245191896.post-317241777580599792023-12-11T11:44:00.004-05:002023-12-11T11:44:17.740-05:00Building Better Tables - A patterned approach to using autonumber columns and alternate keys to drive efficiency in Dataflows<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/75e1d6d2-d761-4bb5-9f59-99042cc927ef" alt="Building Better Tables for Dataflows"></p>
<h2 id="introduction">Introduction</h2>
<p>Navigating the complexities of dataflows in Microsoft’s Power Platform, especially when dealing with Dataverse, can present unique challenges. One significant hurdle is efficiently setting up and using lookup values. This article introduces a straightforward design pattern I’ve developed, emphasizing the use of autonumber fields and alternate keys in entity creation to facilitate smoother data mapping for data analysts.</p>
<h2 id="understanding-the-role-of-entity-design-in-traditional-dataflows">Understanding the Role of Entity Design in Traditional Dataflows</h2>
<p>The effectiveness of traditional dataflows in Microsoft’s Power Platform often depends on how entities are configured in Dataverse. The challenge usually lies not in the dataflows themselves but in the nuances of entity setup. A critical aspect of this is the complexity involved in managing lookup values. The article <a href="https://www.apprising.co.nz/post/how-to-map-a-lookup-column-in-a-power-platform-dataflow">How to map a Lookup Column in a Power Platform Dataflow</a> highlights these difficulties, underscoring the importance of well-structured entities. My design pattern addresses this by optimizing entity configuration, thus enhancing the overall functionality of traditional dataflows.</p>
<h2 id="a-simplified-approach-autonumber-and-alternate-keys">A Simplified Approach: Autonumber and Alternate Keys</h2>
<p>To address these challenges, I incorporate a consistent practice in my entity creation process. Each new entity begins with an ‘ID’ field set as an autonumber type, followed by an alternate key named ‘IdKey.’ This approach not only ensures uniformity but also greatly simplifies data mapping in dataflows.</p>
<h2 id="step-by-step-implementation-guide">Step-by-Step Implementation Guide</h2>
<p>Implementing this pattern involves:</p>
<ol>
<li>
<p><strong>Creating an Autonumber Field</strong>: Establish an autonumber field in each new entity as a unique identifier.<br>
<img src="https://github.com/rwilson504/Blogger/assets/7444929/bd4297a0-46f3-4f42-bced-1ecc6e198220" alt="Create Autonumber Column"></p>
</li>
<li>
<p><strong>Setting Up an Alternate Key (IdKey)</strong>: Implement an alternate key for the entity to facilitate efficient data mapping.<br>
<img src="https://github.com/rwilson504/Blogger/assets/7444929/304f0fa4-7319-4dfe-9f47-6f2e562876f1" alt="Create Alternate Key on ID Column"></p>
</li>
</ol>
<h2 id="integrating-this-setup-in-dataflows">Integrating This Setup in Dataflows</h2>
<p>Once you have established the autonumber field and alternate key (IdKey) in your entities, the next crucial step is integrating this configuration into your dataflows for effective data mapping. This integration is key to leveraging the full potential of your design pattern in Power Platform’s data management.</p>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/6073839a-eae4-4e97-ac7a-b84e5044f5b9" alt="Using Autonumber field to set lookup value"></p>
<ol>
<li><strong>Data Mapping in Dataflows</strong>: Navigate to the data mapping stage when setting up or editing a dataflow in Power Platform. Here, you will map data from your source to the corresponding fields in your Dataverse entities.</li>
<li><strong>Mapping the ID Field</strong>: Focus on the lookup columns of your entity. Correctly mapping the ‘ID’ field, your autonumber field, is crucial. This ID field will serve as a reference, ensuring data from your source is correctly associated with the corresponding record in your entity.</li>
<li><strong>Setting the Lookup for the Entity</strong>: For each lookup field in your entity, map it to the ‘ID’ field of the related entity in your data source. This maintains relational integrity by establishing a direct link between the records in your data source and the corresponding records in Dataverse.</li>
<li><strong>Validating Data Integrity</strong>: After configuring the mappings, validate the data integrity by running a test. This helps identify any potential issues with the mapping setup.</li>
<li><strong>Regular Monitoring and Adjustments</strong>: Regularly monitor and adjust the data mappings as needed to ensure that your dataflow continues to function accurately and efficiently over time.</li>
</ol>
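<p>As a simple illustration of step 3, the source query only needs to carry the autonumber value of the related record; the table and column names below are hypothetical:</p>
<pre><code>let
    // Hypothetical source rows. The AccountId column holds the autonumber
    // value (ID) of the related Account record; in the dataflow mapping it
    // is mapped to the Account entity's IdKey alternate key, which resolves
    // the lookup without requiring the Dataverse record GUID.
    Source = Table.FromRecords({
        [ ContactName = "A. Smith", AccountId = "ACC-1001" ],
        [ ContactName = "B. Jones", AccountId = "ACC-1002" ]
    })
in
    Source
</code></pre>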
<h2 id="enhancing-existing-tables-with-autonumber-fields">Enhancing Existing Tables with Autonumber Fields</h2>
<p>For those working with existing tables in Dataverse, integrating autonumber columns into your current setup is straightforward. This addition improves data management processes, aligning them with the new entity creation pattern.</p>
<p>If you’re looking to add autonumber columns to existing tables, you can do so seamlessly using the <a href="https://mayankp.wordpress.com/2021/12/09/xrmtoolbox-autonumberupdater-new-tool/">XrmToolBox AutoNumberUpdater</a>. This tool facilitates the efficient population of autonumber fields in your tables.</p>
<h3 id="using-xrmtoolbox-autonumberupdater">Using XrmToolBox AutoNumberUpdater</h3>
<ol>
<li><strong>Download and Install XrmToolBox</strong>: First, download and install XrmToolBox from its <a href="https://www.xrmtoolbox.com/">official website</a>.</li>
<li><strong>Add the AutoNumberUpdater Tool</strong>: Locate and add the AutoNumberUpdater tool to your toolbox.</li>
<li><strong>Connect to Your Dataverse Environment</strong>: Connect to your environment and use the tool to populate the new autonumber fields.</li>
</ol>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/56de40d2-d4b6-40ff-88b2-8b1f949cc0dc" alt="AutoNumberUpdater Tool"></p>
<h2 id="adjusting-seed-values-with-auto-number-manager-in-xrmtoolbox">Adjusting Seed Values with Auto Number Manager in XrmToolBox</h2>
<p>After populating autonumber fields in existing tables, adjust the seed value using the Auto Number Manager. This step is vital to ensure new records have unique autonumber values.</p>
<ol>
<li><strong>Install Auto Number Manager</strong>: Add this tool to your XrmToolBox, following instructions from its <a href="https://jonasr.app/ANM/">dedicated page</a>.</li>
<li><strong>Connect to Your Environment</strong>: Open the Auto Number Manager and connect to your environment.</li>
<li><strong>Adjust Seed Values</strong>: Locate the autonumber fields and adjust the seed value to be greater than the highest number set by the AutoNumberUpdater.</li>
<li><strong>Save Changes</strong>: Ensure your new configuration is applied by saving the changes.</li>
</ol>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/b56a0864-20a8-443f-9fc9-9368439abb17" alt="Auto Number Manager Tool"></p>
<h2 id="conclusion">Conclusion</h2>
<p>Incorporating autonumber fields and alternate keys in entity creation offers a structured and efficient way to enhance data management within the Power Platform and Dataverse. Tools like the AutoNumberUpdater and Auto Number Manager in XrmToolBox are invaluable for integrating this pattern into both new and existing entities, ensuring optimal functionality of your data management system.</p>
Rick A. Wilson (RAW)http://www.blogger.com/profile/03562616659850164140noreply@blogger.com0tag:blogger.com,1999:blog-8675696861245191896.post-60652335230373105552023-09-27T14:52:00.002-04:002023-09-27T14:52:07.230-04:00PCF Component: Docx Templates in Canvas Apps<h1 id="pcf-component-docx-templates-in-canvas-apps">PCF Component: Docx Templates in Canvas Apps</h1>
<p>Have you ever wanted to fill in a Docx template within a Canvas App? Look no further! I’m excited to introduce a new PCF component that allows you to do just that. This component leverages the <a href="https://github.com/alonrbar/easy-template-x">easy-template-x</a> open-source library, making it a breeze to create and structure templates.</p>
<p><img src="https://raw.githubusercontent.com/rwilson504/PCFControls/master/DocxTemplatesCanvas/image.png" alt="Docx Template Sample App"></p>
<h2 id="how-to-get-started">How to Get Started:</h2>
<ol>
<li>
<p><strong>Installation:</strong> Begin by <a href="https://github.com/rwilson504/PCFControls/releases/latest/download/RAWDocxTemplatesCanvas_managed.zip">downloading</a> and importing the managed solution into your environment. Ensure you’ve enabled PCF components for Canvas apps. If you’re unsure how, you can find instructions <a href="https://docs.microsoft.com/en-us/powerapps/developer/component-framework/component-framework-for-canvas-apps">here</a>.</p>
</li>
<li>
<p><strong>Usage Instructions:</strong> Once you’re in the Power Apps Editor, navigate to <strong>Insert -> Custom -> Import Components</strong>. From there, select the <strong>Code</strong> tab and import the <strong>RAW! Docx Templates (Canvas)</strong>. Add the component to the form, and you’re good to go! The component offers various input properties, such as <code>docxTemplate</code>, <code>fillTemplate</code>, and <code>templateData</code>, allowing for a customizable experience.</p>
</li>
</ol>
<h2 id="explore-the-sample-application">Explore the Sample Application:</h2>
<p>To get a feel for the component, download the sample solution. This includes a Canvas app with a sample Docx and other essential components. You can <a href="https://github.com/rwilson504/PCFControls/raw/master/DocxTemplatesCanvas/Sample/DocxTemplateSample_1_0_0_1_managed.zip">download the sample app here</a>.</p>
<h2 id="template-features-from-easy-template-x">Template Features from easy-template-x:</h2>
<ul>
<li>
<p><strong>Text Plugin:</strong> The most basic plugin that replaces a single tag with custom text while preserving the original text style.</p>
</li>
<li>
<p><strong>Loop Plugin:</strong> This plugin allows for iterating text, table rows, and list rows. It also supports simple conditions and nested conditions, providing flexibility in template creation.</p>
</li>
<li>
<p><strong>Image Plugin:</strong> Embed images directly into the document with ease.</p>
</li>
<li>
<p><strong>Link Plugin:</strong> Insert hyperlinks into the document, preserving their original style.</p>
</li>
<li>
<p><strong>Raw XML Plugin:</strong> Add custom XML into the document to be interpreted by Word. This can be especially useful for adding elements like page breaks.</p>
</li>
<li>
<p><strong>Scope Resolution:</strong> <code>easy-template-x</code> supports tag data scoping, allowing you to reference “shallow” data from deeper in the hierarchy. This is similar to referencing an outer scope variable from within a function in JavaScript.</p>
</li>
<li>
<p><strong>Extensions:</strong> While most document manipulations can be achieved using plugins, <code>easy-template-x</code> also supports extensions for more advanced use cases. For instance, the community-developed <a href="https://github.com/sebastianrogers/easy-template-x-data-binding">easy-template-x-data-binding</a> extension supports updating custom XML files inside Word documents, enabling the automatic filling of Word forms that use content controls.</p>
</li>
</ul>
<p>In conclusion, this new PCF component, combined with the capabilities of <code>easy-template-x</code>, offers a powerful and seamless way to integrate Docx templates within Canvas Apps. Whether you’re looking to create simple text replacements or more complex templates with loops, conditions, and embedded images, this solution has got you covered. Give it a try and let me know your thoughts!</p>
<hr>
Rick A. Wilson (RAW)http://www.blogger.com/profile/03562616659850164140noreply@blogger.com2tag:blogger.com,1999:blog-8675696861245191896.post-56571658696318656982023-09-06T11:20:00.002-04:002023-09-06T11:20:47.363-04:00Installing .NET Tools on Air Gapped Systems<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/091918db-8218-4b69-aecb-104297153253" alt="Installing .NET Tools on Air Gapped Systems"></p>
<p>In today’s digital age, the vast majority of our tasks rely heavily on internet connectivity. However, there are scenarios, more common than one might think, where systems are intentionally kept offline for security or other reasons. These air-gapped or isolated systems, like Azure VMs in a restricted VNET, pose unique challenges, especially when it comes to software installation. One such challenge is installing .NET tools, a task that’s straightforward with an internet connection but can become a complex endeavor without one. In this blog, we’ll delve deep into the intricacies of using the dotnet command line reference to seamlessly install .NET tools on machines that don’t have the luxury of internet access. Whether you’re a seasoned developer or just starting out, this guide aims to simplify the seemingly daunting process and equip you with the knowledge to conquer the offline world of .NET installations.</p>
<p>Before diving into the nitty-gritty of offline installations, it’s essential to understand the foundational tools at our disposal. Central to our endeavor is the .NET SDK, a powerful suite that grants us the capability to harness the <code>dotnet</code> command. With this command, we can perform a myriad of tasks, including the installation of .NET tools. But how do we achieve this without an active internet connection? The answer lies in NuGet packages. These packages, which are typically fetched from online repositories, can also be saved locally. By leveraging locally saved NuGet packages, we can sidestep the need for online connectivity, making it possible to install our desired .NET tools on air-gapped systems. In the sections that follow, we’ll walk you through the step-by-step process of setting up the .NET SDK, accessing the ‘dotnet’ command, and utilizing local NuGet packages to achieve our installation goals.</p>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/3603f080-251d-4658-8f16-362982cf672d" alt="default behavior when installing packages"></p>
<h1 id="step-1-preparing-for-the-offline-journey">Step 1: Preparing for the Offline Journey</h1>
<p>Remember, the key to a successful installation on an air-gapped system is thorough preparation. By ensuring you have all necessary files on hand and understanding the nuances of your isolated environment, you’re setting yourself up for a smooth and hassle-free installation process.</p>
<h2 id="downloading-the-.net-sdk">Downloading the .NET SDK:</h2>
<p>Before anything else, you’ll need the .NET SDK. This is the backbone that will allow you to run the <code>dotnet</code> command on your air-gapped machine. Visit the official .NET SDK download page from a machine with internet access. Ensure you select the appropriate version and platform for your needs, then initiate the download.</p>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/c5e6a01c-8b51-4a8e-b86c-60f698bfc9c5" alt="Download .Net SDK"></p>
<h2 id="gathering-required-nuget-packages">Gathering Required NuGet Packages:</h2>
<p>Next, identify all the .NET tools you wish to install on the offline machine. For each tool, you’ll need its corresponding NuGet package. Navigate to NuGet’s official website and use the search functionality to locate each package. Once found, download the .nupkg file for the latest stable version, or the version you require.</p>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/7f922cb0-6dd4-4c54-b451-d21954901eb9" alt="Download nupkg files"></p>
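<p>If you have many packages to gather, clicking through the NuGet website can get tedious. NuGet’s v3 “flat container” endpoint serves raw .nupkg files directly, so the downloads can be scripted; here’s a hedged sketch (docfx 2.74.1 is only an example package and version, and note the id and version must be lowercase in the URL):</p>

```shell
# Build the flat-container URL for a package and pull it down with curl.
# Package id and version must be lowercase in this URL; docfx 2.74.1 is
# just an example -- substitute the tools and versions you actually need.
pkg=docfx
ver=2.74.1
url="https://api.nuget.org/v3-flatcontainer/${pkg}/${ver}/${pkg}.${ver}.nupkg"
echo "Downloading $url"
curl -fL -o "${pkg}.${ver}.nupkg" "$url" || echo "Download failed - check connectivity"
```

<p>Run this in a loop over your list of tools on the internet-connected machine, then move the resulting .nupkg files into the folder structure described below.</p>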
<h2 id="transferring-files-to-the-air-gapped-machine">Transferring Files to the Air-Gapped Machine:</h2>
<p>With the .NET SDK installer and the necessary NuGet packages in hand, you’re now faced with the task of moving them to your offline environment. Before you proceed with the transfer, it’s beneficial to establish a clear and organized folder structure to house these files. Here’s a recommended approach:</p>
<h3 id="creating-a-structured-folder-hierarchy">Creating a Structured Folder Hierarchy:</h3>
<p>Start by creating a primary folder named <code>packages</code>. This will serve as the central repository for all your installation files. Within this <code>packages</code> folder, create a subfolder specifically for NuGet, aptly named <code>nuget</code>. This subfolder will hold all the .nupkg files you’ve downloaded. Maintaining this structure not only keeps the directory neat and tidy but also simplifies locating and managing packages in the future. Whether you set up this structure directly on the air-gapped machine or on a local network drive accessible by the machine, consistency is key.</p>
<p>Once your folder hierarchy is in place, proceed to transfer the .NET SDK installer and the NuGet packages to their respective locations. The method you choose will depend on the tools and protocols available in your specific setup. Whether it’s through USB drives, optical discs, or any other secure transfer method your organization permits, ensure the files are safely and completely transferred to the destination machine.</p>
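<p>If you’re staging the files from a shell rather than File Explorer, the layout above takes only a couple of commands (shown with POSIX-style paths; on the air-gapped Windows machine this becomes <code>F:\packages\nuget</code> or wherever your transfer drive lands):</p>

```shell
# Create the packages/nuget hierarchy for staging the transfer.
mkdir -p packages/nuget

# Stage the SDK installer at the top level and the .nupkg files in the
# nuget subfolder (the globs are illustrative -- match your downloads).
cp dotnet-sdk-*.exe packages/ 2>/dev/null || true
cp *.nupkg packages/nuget/ 2>/dev/null || true

# Review the final structure before copying it to removable media.
ls -R packages
```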
<h1 id="step-2-installing-the-.net-sdk-on-the-air-gapped-machine">Step 2: Installing the .NET SDK on the Air-Gapped Machine</h1>
<h2 id="initiate-the-installation">Initiate the Installation:</h2>
<p>Navigate to the location where you’ve transferred the .NET SDK installer. Double-click the installer file to initiate the installation process. Follow the on-screen prompts, ensuring you select the appropriate options that suit your environment. Once the installation is complete, you should be able to access the <code>dotnet</code> command from the command line or terminal.</p>
<h2 id="setting-up-the-nuget-configuration-file">Setting Up the NuGet Configuration File:</h2>
<p>Given the unique constraints of an air-gapped system, the default behavior of the <code>dotnet</code> command line, which seeks out online NuGet repositories, won’t serve our purpose. Even using the <code>--add-source</code> option and pointing it to our local directory will not circumvent the default behavior unless you also include the <code>--ignore-failed-sources</code> option at the same time, which is a lot for users to remember. To get around this, we’ll create a custom NuGet configuration file that points solely to our local repository.</p>
<h2 id="creating-the-nuget-configuration-file">Creating the NuGet Configuration File:</h2>
<p>Open a text editor of your choice and paste the following configuration:</p>
<pre><code><?xml version="1.0" encoding="utf-8"?>
<configuration>
<packageSources>
<add key="Local NuGet" value="F:\packages\nuget" />
</packageSources>
</configuration>
</code></pre>
<p>Ensure that the <code>value</code> attribute in the <code><add></code> tag points to the correct location of your <code>nuget</code> folder. Save this file as <code>nuget.config</code> within your <code>packages/nuget</code> directory.</p>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/bddc68be-768a-4eda-bd95-8b9353e10b36" alt="nuget.config file"></p>
<h1 id="step-3-running-dotnet-tool-commands">Step 3: Running dotnet tool commands</h1>
<p>When you wish to install a tool or package using the <code>dotnet</code> command line, ensure you use the <code>--configfile</code> option, pointing it to your custom <code>nuget.config</code>. This ensures that the command line refers only to your local NuGet repository and doesn’t attempt to reach out to the default online NuGet provider. For example:</p>
<pre><code>dotnet tool install <tool-name> --configfile F:\packages\nuget\nuget.config
</code></pre>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/4dcb3d87-9ca9-424b-acb9-affc75b2997a" alt="installing docfx"></p>
<h1 id="alternate-configuration-modifying-the-default-nuget-configuration">Alternate Configuration: Modifying the Default NuGet Configuration</h1>
<p>For those who’d prefer a more permanent solution, rather than using the <code>--configfile</code> argument every time, there’s an alternative. You can directly modify the default <code>nuget.config</code> file that the <code>dotnet</code> command line uses. This approach ensures that the command line always refers to your local NuGet repository by default, without any additional arguments.</p>
<h2 id="navigating-to-the-default-nuget-configuration">Navigating to the Default NuGet Configuration:</h2>
<ul>
<li>On your air-gapped machine, open the File Explorer.</li>
<li>In the address bar, type <code>%appdata%\nuget</code> and press Enter. This will take you directly to the directory containing the default <code>nuget.config</code> file.</li>
<li>Open the <code>nuget.config</code> file in a text editor of your choice.</li>
</ul>
<h2 id="modifying-the-configuration">Modifying the Configuration:</h2>
<ul>
<li>Within the file, you’ll find a section named <code><packageSources></code>. This section lists all the NuGet repositories that the dotnet command line refers to.</li>
<li>Remove or comment out the entry that points to the default online NuGet repository. It typically looks like this:</li>
</ul>
<pre><code><add key="nuget.org" value="https://api.nuget.org/v3/index.json" protocolVersion="3" />
</code></pre>
<ul>
<li>Add your local NuGet repository to this section, similar to the custom configuration we discussed earlier:</li>
</ul>
<pre><code><add key="Local NuGet" value="F:\packages\nuget" />
</code></pre>
<p>Save and close the file.</p>
<p>With these changes in place, every time you use the <code>dotnet</code> command line to install a tool or package, it will refer to your local NuGet repository by default, without the need for any additional arguments. Additionally, because this is a known file location, you can build automation around it to ensure the change is applied without any user intervention in your environment.</p>
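<p>Putting both edits together, the default <code>%appdata%\nuget\nuget.config</code> might look like this in full (the local path is an example; use the location of your own <code>nuget</code> folder):</p>

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- the default nuget.org entry has been removed for the air-gapped machine -->
    <add key="Local NuGet" value="F:\packages\nuget" />
  </packageSources>
</configuration>
```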
<h1 id="conclusionreferences">Conclusion/References</h1>
<p>By following these steps, you effectively create an environment where the dotnet command line works seamlessly, even in the absence of an internet connection. This approach not only ensures successful installations but also provides a blueprint for managing and expanding your local NuGet repository in the future.</p>
<ul>
<li><a href="https://learn.microsoft.com/en-us/dotnet/core/tools/dotnet-tool-install">MSFT Docs - dotnet tool install</a></li>
<li><a href="https://learn.microsoft.com/en-us/nuget/reference/nuget-config-file">MSFT Docs - nuget.config reference</a></li>
</ul>
Rick A. Wilson (RAW)http://www.blogger.com/profile/03562616659850164140noreply@blogger.com0tag:blogger.com,1999:blog-8675696861245191896.post-34903979088543477192023-08-28T11:51:00.004-04:002023-08-28T11:51:51.407-04:00Capture User’s Last Successful Login with Portal Web API<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/30275784-cfd0-4770-8748-5bd304a6c4ee" alt="image"></p>
<p>Hey Power Pages developers! Are you sitting there scratching your head wondering why the <strong>Authentication/LoginTrackingEnabled</strong> site setting isn’t working? Unfortunately, it has been <a href="https://cloudblogs.microsoft.com/dynamics365/it/2018/03/20/portal-capabilities-for-dynamics-365-deprecated-features/">deprecated</a> 😭😭😭😭. This saddened me a lot because I utilize the <strong>Last Successful Login</strong> date field on the Contact table for a lot of reporting and automation using Power Automate. In this article I will demonstrate how you can use the Portal Web API with a little JavaScript/Liquid to populate that field. Before we dive in, here are a few other options you might consider:</p>
<ul>
<li>Power Automate: Use a Flow to capture the data; a good example of how to implement this can be found here: <a href="https://prasadmotupallicrm.blogspot.com/2021/10/last-successful-login-on-contact-record.html">Last Successful Login on Contact record</a> by <a href="https://prasadmotupallicrm.blogspot.com/">Prasad Motupalli</a>. The one thing this was missing for me, though, was security, which could be a concern since the HTTP trigger doesn’t require authentication. I want to be sure that the person updating the data is an authenticated user.</li>
<li>Application Insights: You can track additional details surrounding users on your site by using liquid and adjusting the application insights JS on your site. Details on how to do that can be found here: <a href="https://www.dancingwithcrm.com/powerappsportals-tracking-using-azure-app-insights/">PowerApps Portals tracking using Azure Application Insights</a> by <a href="https://www.dancingwithcrm.com/about/">Oleksandr Olashyn</a>. I have added this code to my site and will be using it for more detailed reports. One thing missing from this approach is having the data directly in Dataverse for reporting or easily running Power Automate against that data.</li>
</ul>
<h2 id="instructions">Instructions</h2>
<p>To populate the <strong>Last Successful Login</strong> field on the Contact table using the Web API, we’ll first need to enable table permissions and grant Web API access to that field. Finally, we will drop in some JavaScript/Liquid code which will be used to make the Web API call. Big thanks to 🎉🎉 <a href="https://www.linkedin.com/in/ronald-sease-888438111/">Marty Sease</a> 🎉🎉 for all his help writing that code, and for doing so in a way that reduces any negative performance impact by using session variables.</p>
<h3 id="create-table-permissions">Create Table Permissions</h3>
<p>To get started navigate to the Portal Management app.</p>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/791dabfa-61ba-4e67-a2ea-7f9a14a04ab4" alt="image"></p>
<p>Create a new Table permission that will allow for a user to update and read their own contact record.</p>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/5d87fd8f-d229-49ea-96e8-b57652aea452" alt="image"></p>
<p>Add the Authenticated User web role to the Table Permission.</p>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/dbac581d-0c37-4def-8390-0c9a5beada88" alt="image"></p>
<h3 id="create-web-api-settings">Create Web API Settings</h3>
<p>Create two Site Settings that will enable the Web API for the Contact record and allow access to <code>adx_identity_lastsuccessfullogin</code>, the logical name of the field displayed on the Contact form.</p>
<p>Note: In the site settings below we are using the out-of-the-box column provided in the default portal solution. You are not required to use that field; you can use your own custom date column and supply its logical name here instead.</p>
<table>
<thead>
<tr>
<th>Name</th>
<th>Value</th>
</tr>
</thead>
<tbody>
<tr>
<td>Webapi/contact/enabled</td>
<td>true</td>
</tr>
<tr>
<td>Webapi/contact/fields</td>
<td>adx_identity_lastsuccessfullogin</td>
</tr>
</tbody>
</table><p><img src="https://github.com/rwilson504/Blogger/assets/7444929/2029d2d7-bdcd-40fa-91c0-3e535bc08147" alt="image"></p>
<h3 id="drop-the-code">DROP THE CODE!!</h3>
<p>Finally, navigate to the Enable Traffic Analysis section of the app. From here, select the website, copy in the code below, and click Save. If you are also doing Application Insights tracking (which is always a good idea), just paste this code below the Application Insights code.</p>
<p><img src="https://github.com/rwilson504/Blogger/assets/7444929/f2b464a0-4354-42e3-a9a5-2b9a14b7fe96" alt="image"></p>
<p>Code to be copied into the Tracking Code content snippet:</p>
<pre><code><script type="text/javascript">
{% if user %}
(function(webapi, $) {
function safeAjax(ajaxOptions) {
var deferredAjax = $.Deferred();
shell.getTokenDeferred().done(function(token) {
// add headers for AJAX
if (!ajaxOptions.headers) {
$.extend(ajaxOptions, {
headers: {
"__RequestVerificationToken": token
}
});
} else {
ajaxOptions.headers["__RequestVerificationToken"] = token;
}
$.ajax(ajaxOptions)
.done(function(data, textStatus, jqXHR) {
validateLoginSession(data, textStatus, jqXHR, deferredAjax.resolve);
}).fail(deferredAjax.reject); //AJAX
}).fail(function() {
deferredAjax.rejectWith(this, arguments); // on token failure pass the token AJAX and args
});
return deferredAjax.promise();
}
webapi.safeAjax = safeAjax;
})(window.webapi = window.webapi || {}, jQuery)
const loginCacheKey = "lastLoginKey";
if (!sessionStorage.getItem(loginCacheKey)) {
const now = new Date();
sessionStorage.setItem(loginCacheKey, now);
webapi.safeAjax({
type: "PATCH",
url: "/_api/contacts({{ user.contactid }})",
contentType: "application/json",
data: JSON.stringify({
"adx_identity_lastsuccessfullogin": now
})
});
}
{% else %}
const loginCacheKey = "lastLoginKey";
sessionStorage.removeItem(loginCacheKey);
{% endif %}
</script>
</code></pre>
<p><strong>That’s it!</strong> Now when a user logs into your Power Pages portal, you will be able to capture the <strong>Last Successful Login</strong> date field. If you don’t see it happen right away, make sure you clear the <a href="https://learn.microsoft.com/en-us/power-pages/admin/clear-server-side-cache#metadataconfiguration-tables">Portal Cache</a>, then try logging out and back in again.</p>
Rick A. Wilson (RAW)http://www.blogger.com/profile/03562616659850164140noreply@blogger.com0tag:blogger.com,1999:blog-8675696861245191896.post-43169723267778495212022-02-04T15:49:00.002-05:002022-02-04T15:49:34.093-05:00How to Create SharePoint Items with Power Automate Desktop<p><img src="https://user-images.githubusercontent.com/7444929/152599822-85aeae57-142d-4d85-80aa-b0bd8459de4d.png" alt="How do it create..."></p>
<h2 id="overview">Overview</h2>
<p><a href="https://powerautomate.microsoft.com/en-us/desktop">Power Automate Desktop</a> is a great way to automate many of your daily tasks so you can focus on real work. A prime example of this is getting data from one place to another, especially when those data sources do not have an API, such as a legacy desktop application or a file.</p>
<p>In this example I demonstrate several ways in which an Excel sheet containing fictitious customer data could be loaded into a SharePoint list. In order to do this I first generated some fake data for my customers using <a href="https://www.mockaroo.com/">Mockaroo</a>.</p>
<p><img src="https://user-images.githubusercontent.com/7444929/152590014-68ffa7f4-365e-4c5f-9f78-a774a90ed7bd.png" alt="Excel Data"></p>
<p>I then created a new SharePoint list and added some columns to match my spreadsheet.</p>
<p><img src="https://user-images.githubusercontent.com/7444929/152590069-9ce3ffcc-4270-41e3-9fae-0a50baa6652f.png" alt="New SharePoint List"></p>
<h2 id="methods">Methods</h2>
<p>My original plan was to just use the PAD <a href="https://docs.microsoft.com/en-us/power-automate/desktop-flows/recording-flow">recorder</a>, which I did, but after creating that Flow I decided to explore other possible ways to accomplish the task. The list below is not exhaustive but should provide some ideas. The flexibility of PAD allows for an even wider range of possibilities for carrying out tasks like these.</p>
<p>At the end of each section I have placed the source for each of these Flows. If you copy the code, you can then right-click on the PAD designer action area and select Paste. The code you copy will be automatically converted to actions. This is a great way to get started.</p>
<ul>
<li><a href="#screen-recording">Screen Recording</a></li>
<li><a href="#sharepoint-rest-api">SharePoint REST API</a></li>
<li><a href="#powershell-pnp">Powershell PnP</a></li>
<li><a href="#sharepoint-connector-for-pad">SharePoint Connector for PAD - FUTURE</a></li>
</ul>
<h2 id="screen-recording">Screen Recording</h2>
<p>The first method for creating a SharePoint item I will cover is using the <a href="https://docs.microsoft.com/en-us/power-automate/desktop-flows/recording-flow">screen recorder</a>. The video below demonstrates how I used the PAD recorder to open the SharePoint site and create a new item. I was then able to use that recording in the PAD designer and drag-and-drop in the other actions needed for adding the Excel data.</p>
<p><a href="https://user-images.githubusercontent.com/7444929/152586742-33d18b4e-d95a-4ae6-80da-b95a5b137f87.mp4">Record Creating SharePoint Item</a></p>
<p>One issue I did run into while building this automation was that, after the first record creation, the validation on the SharePoint create screen always said my required fields were not filled in even though they were. To work around this, I added a Browser Reload Web Page action at the start of each loop to reload the SharePoint list URL.</p>
<p>This is the final result for the recording method.<br>
<img src="https://user-images.githubusercontent.com/7444929/152586092-fbef1fae-f7f1-4e2c-8873-6ca847749867.png" alt="Final PAD Flow for Recording"></p>
<p>If you copy the code below you can paste it into the Power Automate Desktop design surface to get started.</p>
<pre><code>Excel.LaunchExcel.LaunchAndOpen Path: $'''C:\\Files\\Desktop\\MOCK_CLIENT_DATA.xlsx''' Visible: True ReadOnly: False LoadAddInsAndMacros: False Instance=> ExcelInstance
@@timestamp: '01/27/2022 02:14:39'
@@source: 'Recorder'
@@culture: 'en-US'
WebAutomation.LaunchEdge.LaunchEdge Url: 'https://yoursharepoint.sharepoint.com/sites/Clients/Lists/Client%20Contacts/AllItems.aspx' BrowserInstance=> Browser
Excel.GetFirstFreeColumnRow Instance: ExcelInstance FirstFreeColumn=> FirstFreeColumn FirstFreeRow=> FirstFreeRow
Excel.ReadFromExcel.ReadCells Instance: ExcelInstance StartColumn: 1 StartRow: 2 EndColumn: FirstFreeColumn - 1 EndRow: FirstFreeRow - 1 ReadAsText: False FirstLineIsHeader: False RangeValue=> ExcelData
LOOP FOREACH CurrentItem IN ExcelData
WebAutomation.GoToWebPage.ReloadWebPage BrowserInstance: Browser
@@timestamp: '01/27/2022 02:16:24'
@@source: 'Recorder'
@@culture: 'en-US'
WebAutomation.Click.Click BrowserInstance: Browser Control: appmask['Recording']['span']
WebAutomation.PopulateTextField.PopulateTextField BrowserInstance: Browser Control: appmask['Recording']['input'] Text: CurrentItem[1] EmulateTyping: True UnfocusAfterPopulate: False Mode: WebAutomation.PopulateTextMode.Replace
WebAutomation.PopulateTextField.PopulateTextField BrowserInstance: Browser Control: appmask['Recording']['input 2'] Text: CurrentItem[2] EmulateTyping: True UnfocusAfterPopulate: False Mode: WebAutomation.PopulateTextMode.Replace
WebAutomation.PopulateTextField.PopulateTextField BrowserInstance: Browser Control: appmask['Recording']['input 3'] Text: $'''WR - %CurrentItem[3]%''' EmulateTyping: True UnfocusAfterPopulate: False Mode: WebAutomation.PopulateTextMode.Replace
WebAutomation.PopulateTextField.PopulateTextField BrowserInstance: Browser Control: appmask['Recording']['input 4'] Text: CurrentItem[4] EmulateTyping: True UnfocusAfterPopulate: False Mode: WebAutomation.PopulateTextMode.Replace
WebAutomation.PopulateTextField.PopulateTextField BrowserInstance: Browser Control: appmask['Recording']['input 5'] Text: CurrentItem[5] EmulateTyping: True UnfocusAfterPopulate: False Mode: WebAutomation.PopulateTextMode.Replace
@@timestamp: '01/27/2022 02:16:38'
@@source: 'Recorder'
@@culture: 'en-US'
WebAutomation.Click.Click BrowserInstance: Browser Control: appmask['Recording']['span 2']
END
Excel.CloseExcel.Close Instance: ExcelInstance
WebAutomation.CloseWebBrowser BrowserInstance: Browser
# [ControlRepository][PowerAutomateDesktop]
{
"ApplicationInfo": {
"Name": "ClipboardControlRepository",
"Version": "1.0"
},
"Screens": [
{
"Controls": [
{
"AutomationProtocol": null,
"ScreenShot": "iVBORw0KGgoAAAANSUhEUgAAAFkAAABKCAYAAADQSfiaAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsMAAA7DAcdvqGQAAAPiSURBVHhe7ZtNSxtBGMfzgfod/AS9Fjz21JPgsR56KC2C8dBWi3pQUKyQvtkGm6JGRRpEcasViaKYYhSWJg1Jca0EzZZ/Z/Ylzcuq2XHzWM3zgz+TzMzus/PLsIkhhsA0HZZMAEsmgCUTwJIJYMkEsGQCWDIBLJkAlkwASyaAJRPAkgkI5XNZcJob3skEsGQCWDIBLJkAlkwASyYg1NU+Aeq8vvsCWttDq/UaDzpB1Hv64D2+LqYcbf64FslywQiFrNZrPOgEVU+KVoEl+4wKLNlnVGDJPqMCS/YZFViyz6jAkn1GBWXJ3U9WMPsxiWhP1HP8orDkhhLB8JsMDNOEPv0FjyvG3A/+F+XHnXvWomXrNR50/NS77A8WFQKXLC9ULuimRl5/9Vqro8I17+QOFD7FgXgcpwN9NfPG8Fv0l4YHa/r959bt5EYiF2PvmkGkTpwrwTGSvZXzUsiK3uLWekWfWv7VuyX35EZSJ7l4iqIp2sw+RsvzWkhyuEfDwtQ2EjVZXj0SYkwUNlN1Y4mpJGL9n8+VXyf5JIPEyrF4YCI7P+fMq5ccHtlH1pCvhpxqovh913pRhp1jDyadGgM6DDlHTyEsnmttL616iIxZz93zqUSFSyS/xavpAkrOZD8UVjX03/c6p7fkWPsKdn6Jx2cFJDrlvBrJveL5megwclgeXcLs6rF1XcbaSlmqO9eWLjDFuWQ9cR2yXrGvp3wNqlGh8Z0cq07VTq4Zs3fyjK+dHJNjQ7as0l5S7LhqydHNU/kMO0PueZbsF8XIIOq+QD8PMdwehaaL7r2cONcpUuOiXjhq1ct3tPI92ZUsErNkCjkT4tYgHrmSpThPnGNjW/K4I2id29BNKXfdOrextoTD0Rmr3u7/ek/2TvMkd3XacmCIN0PR1O3k8SVEBiryfM6+z07mrNuHviXaYg4Loi+Rlh068h/Ex0SrHksuj4cdYZLyPXk8Y0kv5XPQIlLwBrStYxzMu8clcSAnyPfF9LZ9nsUjcQITf+L2TmbJVXOiWNizP0WUJYuMTmVQkPMdSsYRNsQ91x23dq5An3a+T+k+REF2xN61suSrL7qRBFlPBUXJE3jWn0Tym47lkRk8qujnL4jqUZZ8XuSFygXd1Mjr91qXGxUCl8w7uZ7AJTcSuZhGdk1QCbKeCizZZ1RgyT6jAkv2GRVYss+owJJ9hH9weEGCqHeln846LS1x+1sxq6WAul4N1yM5nbYXLFsKqOvVcD2SWwyWTABLJoAlE8D/wE4Q3skEsGQCWDIBLJkAlkwASyaAJRPAkglgyQSwZAJYMgEsmQCWTABLJoAlNx3gL98grdEytSjrAAAAAElFTkSuQmCC",
"ElementTypeName": "span",
"InstanceId": "30e39dae-9a80-45f5-bc73-37ec46a06b6f",
"Name": "span",
"SelectorCount": 1,
"Selectors": [
{
"CustomSelector": "html > body > div:eq(0) > div > div:eq(1) > div:eq(0) > div:eq(1) > div:eq(2) > div > div:eq(1) > div:eq(1) > div:eq(1) > div:eq(0) > div > div > div > div > div > div > div:eq(0) > div:eq(0) > button > span > span > span",
"Elements": [],
"Ignore": false,
"IsCustom": true,
"IsWindowsInstance": false,
"Order": 0
}
],
"Tag": "span"
},
{
"AutomationProtocol": null,
"ScreenShot": "iVBORw0KGgoAAAANSUhEUgAAAyYAAABgCAYAAADo3mbeAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsMAAA7DAcdvqGQAAAvESURBVHhe7d37k9X1fcdx/yR1WmNbazKTZCbqtKkx6DQRm6DB65CZkspF7ZhIWqZGbgFMq6wYcZhwkYuLkcVxCS4mIAXCRfAKAiqCiCG7yyW7vHveH/ash+0Cu3L5LvB4zDxn13O+5+xZ/OX78nuOXPH+++/Hu+++G2+99Va8+eabsXnz5tiwYUMAAABcKIYJAABQOcMEAAConGECAABUzjABAAAqZ5gAAACVM0wAAIDKXbFly+bYvOmPsXHD+vjfdW/E2jV/iNdXt/XcDQAAcP65YgIAAFTOMAEAACpnmAAAAJWrfJgcO3Ystm7dEhvWrx9weXw+DgAAuDRUPkz+9PnnMX3a1Fi7Zk2/I6RveVwen487nYMHD8Ydw4fH4sWLem459+o/4647R8Snn37ac2tE/pkOH357+Xqhde7bF22jx8SxQ3/uuQUAAIa+ITFMnp3dFEeOHOm55YSurq44evRozz99IY/L4y/UMHlj7doYN3ZMeb6+6j/juuv+Lp5paorjx4+X2ysZJrWfvXHytFg16l9j+W23x6t33RNvPT+3504AABjahtQw6ezsjKZZT8WePXti1wcfxGurVvUc9YULPUzy8fk8pxsm48eNjWHfvSXefeedcnslwyTVxsnW/5kVK4aPiB1Ll/XcCAAAQ9+QGiY7aq+l+cWl0dr6ahkmc577dcyYPi22b99We21bY9IvHo8ttdd3NsNk3759ZUhc+5VrSpMnT4qOjo5ytaOlZXnc/E/fjquvujJ+9tNHY8GC+eX7euvWret5lhPqP2PhwgUxc8aM+PmECeUqT99h0tzc3Pu8tw4bVvvzXV9unzZ1ajz44L/F88/PKa9lzIMPxttvvx0/HjUqrvnrv4pJTzwRhw8fLsfmv5u7R44sz5Ff899ZXx0ffRx/nDYjOj7eGxuemBLH2tt77gEAgKFtyAyTPAFfUhsRLctfjtlNs2L7tm3xu5Wt5eR/3m/mxotLl8SR2jFne8WkvXayvnfvx9Hd3R07d+yIW75zcxkKO3fujNtuHRYbN27sOfKEgVwxyWPyOX/wL3fE66+v/n/DZM+e3WX85M/871/9Kv79kYfLgMlhkoMlf9ft27eX73N05PFtbW3xzW98vfbvY1Ps3r077rl7ZBkt+Rw5mH4yenR5TgAAuBQMmWGyf//+WLpkcXR3dcXvayf3Ly1rjidnTo/fvrQs2l5bVe7LKyn5Nq+zGSY5bObPmxcjR/6onPjXr4Tk8+WVinxMa2trOS4NdJikFS0t8cD995Wh0zhMcnT89NFHy9WSvBIyfvy4MsRymNS/z/L7WU8/XR5Tf+58bXnFpX7Vpt6pXhMAAFyMhswwyf/6X/+we37wvbOzo1whya/5Nqusvf3P5bizGSbPzp4dY8eMKY9vPPlPOUbycy05WnKk5DGDGSb52vKD8g89ND5u//73yjD58MMPy0jJ582rHXns6YZJ/bkaX1sOk3vvvScOHTpU7gMAgEvNkBgms55+qlwxye/PVB6Xx+f3p1M/sZ87d275Putoby9j4D9+PqF3hOT/UavvZ0fWvfFGeYtXvr0rj8krHXmlpq++wyRt2rQpvvbV6+OmG28owyTLt4jl7fmacxQNdpjkW7hu+Na34oWFC8toy8/JLF/+cjkOAAAuBdUMk9oJf6xYUb7m/4lr8aIX4vk5zw24PD4fdzr1E/vGtz/lEHjvvffi+9/75zIeZs9+plzZyJP/Xbt2lQGSx+VnPfKD8HmVJofQfffdW96CVf/Qel1/wySHQ34Qvj5M8p+frg2p/HB7XoV58smZgx4m+Trysyv115evf+XKleU4AAA4JxrO0atQzTDJX/iKK058BQAAqlfxObphAgAAGCYAA
MAQYJgAAACVM0wAAIDKGSYAAEDlDBMAAKByhgkAAFC5S26Y1P9iltM1cWL5pY9PeCwOv7AgOhfMlyRJklRhf3nk4d5z9H7P4Rs7D38J47kfJvlCc2lJkiRJujTLc/5zrNIrJrnKcp0db1kuSZIkqcJi4n/2nqP3ew7f2EVxxWQg8pep/dI5Sg7s/yS6u7skSZIkVVgZJz3n6FUwTCRJkiQZJoaJJEmSVH2GiWEiSZIkVZ5hYphIkiRJlWeYGCaSJElS5RkmhokkSZJUeYaJYSJJkiRVnmFimEiSJEmVZ5gYJpIkSVLlGSZnGCZTp0yJq6+6srebbryh9nrfOen+rPEx/bVo0cIBHSdJkiRdjhkmAxgmOSrq/5zf3zH89jhw4NOTjjtThokkSZJ06gyTQQ6THCQPPHD/SVdNBpJhIkmSJJ06w+Qsh0nfwdHR0R7jxo3tfetXfp+3NR5XP6Z+X/2xkiRJ0uWaYTLIYZL/3DhE+hscjce3LH/5pGFSP6bxOSRJkqTLPcNkkB9+X7t27Un3Nw6TvO9UV0Hqx9Xre78kSZJ0OWeYDOKKSQ6Pvh98bxwmjd/3Le9rfGtXf8dIkiRJl2uGyTl8K9dAr5gYJ5IkSdLJGSZf4sPvedWk/pauxmFSv6/x+L6fMcnbjBNJkiTp5AyTQQ6TLEdJ/S9abBwcWX2c1D+TUr+v8bgcJDlM+v5ljZIkSdLlmmFyhmEiSZIk6fxnmBgmkiRJUuUZJoaJJEmSVHmGiWEiSZIkVZ5hYphIkiRJlWeYGCaSJElS5RkmhokkSZJUeYaJYSJJkiRVnmFimEiSJEmVZ5gYJpIkSVLlGSaGiSRJklR5l/UwOfzCgjJMJEmSJFVbGSQ95+hVOPfDZOfOE8PjdE2cWH7p4xMeK794/iFIkiRJqq6/PPJw7zl6v+fwjeU5/zl27odJvtDaLyRJkiTpEi3P+c+xSq+YlK/93S9JkiTpwjaYc/SL4orJQOQvc56WFgAA8CVUfI5umAAAAIYJAAAwBBgmAABA5QwTAACgcoYJAABQOcMEAAConGECAABU7rIcJvW/hPE8/MUsAADAl1DxOXo1wwQAAKCBYQIAAFTOMAEAACpnmAAAAJUzTAAAgMoZJgAAQOUMEwAAoHKGCQAAUDnDBAAAqJxhAgAAVM4wAQAAKmeYAAAAlTNMAACAylU+TI4dOxZbt26JDevXD7g8Ph8HAABcGiofJn/6/POYPm1qrF2zpt8R0rc8Lo/Px53OwYMH447hw+Pqq67sbdrUqXH06NFYtqw5Pvroo54jz97ixYvK8+fXRvnzMgAA4PSGxDB5dnZTHDlypOeWE7q6usqI6CuPy+MHOkzmzp1bvs862ttj//79ceeIEdHa2tpz5Jnl4x5//L/KoOlPDpK//Ztr47Zbh8Xu3bt7bjVMAABgoIbUMOns7IymWU/Fnj17YtcHH8Rrq1b1HPWFwQ6TvlcxvowzPVfePuy7t8QD998Xv/zltDKqkmECAAADM6SGyY7aa2l+cWm0tr5ahsmc534dM6ZPi+3bt9Ve29aY9IvHY0vt9Z3NMMnf96Ybb4h169bF4cOHY/z4cfGT0aPjx6NGlRFx4MCBeOih8XHtV64ptbW1nfSWsDw+H9cof0Yek8+ZA2XTpk3l9sZhsm/fvhg/bmzv806ePCk6Ojp6X8+SJYvj7pEj45vf+Hq0tCyP5ubm+NpXr49v/+M/9D5f/hk909RUbs/mzHmu36tKAABwsRkywyRP9pfUTvBblr8cs5tmxfZt2+J3K1vLwJj3m7nx4tIlcaR2zGCvmNQHRZbDob9h8sMf/iD27v24PG7mjBnx2GM/O+mEfyBXTPL+zz77rLx1bNzYMWV0NA6T9vb28jO6u7tj544dcct3bq79Oa/vfT0zpk+PQ4cOlZ+fbwtbsGB++R3HjhnT+3pylOT9+X0+1113jojVq1eX5wcAgIvZkBkm+dmPpUsWR3dXV/z+9dXx0rLmeHLm9PjtS
8ui7bVV5b68kpJv8xrMMGn8jEn+n7z6Gyb18ZBeeeWVuP7vr4upU6bUftbuOH78+ICHSR6Xryvf0rWipeWkYZKDav68eTFy5I/KVZH+hlLKrzla8vdM+dz5Gvd98kkZUI1DKzvVawIAgItHxP8B9F1+QH+oLEUAAAAASUVORK5CYII=",
"ElementTypeName": "input",
"InstanceId": "1ad356db-2912-4cdc-a538-f7fb3345bdd0",
"Name": "input",
"SelectorCount": 1,
"Selectors": [
{
"CustomSelector": "html > body > div:eq(0) > div > div:eq(1) > div:eq(2) > div > div:eq(3) > div > div > div:eq(3) > div > div > div:eq(1) > div > div > div > div > div:eq(2) > div:eq(0) > span > div:eq(0) > div > div > input",
"Elements": [],
"Ignore": false,
"IsCustom": true,
"IsWindowsInstance": false,
"Order": 0
}
],
"Tag": "input"
},
{
"AutomationProtocol": null,
"ScreenShot": "iVBORw0KGgoAAAANSUhEUgAAAyYAAABgCAYAAADo3mbeAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsMAAA7DAcdvqGQAAAwzSURBVHhe7d39c1WFncdx/yR1WuvuWtuZtj+0nd2udbFTNa6C0lmr9gGrgtZtiwhbKhoKtC4QQaFMbacmJDwITmkRqFgWuhAEHATkSQqDUJaHwADfvd9DTry5DUmg0BPi6zXzniT3npzc8NP9eO6N1+3YsSO2b98e27Zti3feeSc2bdoUGzZsCAAAgL8XwwQAAKicYQIAAFTOMAEAACpnmAAAAJUzTAAAgMoZJgAAQOUMEwAAoHKGCQAAUDnDBAAAqJxhAgAAVK7yYXLmzJnYvLkzNqxfP+jy+Pw+AABgeKh8mPzl6NGYNrU53l67ts8R0lgel8fn9/XnyJEjcU9TU9x4w/U9TW1ujtOnT8eiRR2xf//+7iP/dq2trxXnz4/18udlAABA/4bEMJk7pyW6urq6b7ng7NmzxYholMfl8YMdJgsWLCg+z04cPx6HDh2KUSNHxooVK7qPHFh+3+TJPy4GTV9ykPzjP9wcX7tjROzZs6f7VsMEAAAGa0gNk5MnT0bL7Jmxd+/e2P3++/HmypXdR33kUodJ41WMyzHQufL2Ef92ezz0zQfjpz+dWoyqZJgAAMDgDKlhsrP2WDraF8aKFb8thsm8V16O6dOmxtatW2qPbXNM+cnk6Kw9vr9lmOTv++UvfTHWrVsXp06dinHjxsajY8bEtx55pBgRhw8fjiefHBc3f+qmolWrVvV6SVgen99XL39GHpPnzIGycePG4vb6YXLw4MEYN/aJnvM+//yUOHHiRM/jaWtrjW+MHh1f+PznYtmy16OjoyM++5lb4yv/8s8958t/o5daWorbs3nzXunzqhIAAFxrhswwySf7bbUn+MteXxpzWmbH1i1b4ve/W1EMjFd/uSDaF7ZFV+2YS71iUg6KLIdDX8PkvvvujQMHPii+b8b06TF+/I96PeEfzBWTvP/DDz8sXjo29onHi9FRP0yOHz9e/Ixz587Frp074/av3lb7d17f83imT5sWx44dK35+vizs17/+VfE7PvH44z2PJ0dJ3p+f57nuHzUyVq9eXZwfAACuZUNmmOR7Pxa2tca5s2fjD2tWx+JFHfGzGdNiyeJFserNlcV9eSUlX+Z1KcOk/j0m+Ze8+hom5XhIb7zxRtz66Vui+YUXaj9rT5w/f37QwySPy8eVL+lavmxZr2GSg+pXr74ao0c/UFwV6WsopfyYoyV/z5Tnzsd48M9/LgZU/dDKLvaYAADgWjJkhkleYSivUuR7NE6ePFFcIcmPOQ6y48f/rzjuUoZJ4xP3gYZJ/pwcJDNmTC8GRA6VSxkmaU1tWN377/f0vDwszZ0zp7j6kY+7PN/Fhkn9uRqHyfLly4vbAQBgOBkSw2T2rJnFFZP8fKDyuDw+P+/P5Q6TUr4R/6knnyxeOpV/leu73/l2vDx3bvFSrEaNwyQH1oRnnimuaJTnzo/PTnimuHKSb+q/5ZZ/uqRhki8Fm/Lcc/HA/aNi3759xXjLAbR79+7iOAAAuJZVM0x27YrI//Jf+5gDoPW138T8ea8Mujw+v68/lztMZtVGz02f/ETxBvV8E3y+GT6vorS3txe3Pf39p3q9/yQ1DpNU/pzy3O+9917cdefXizetz5nzUtx9152XNEzysR6tjbH8s8X5OPI8Eyc+2+tnAgDAZat7jl6FaoZJ/sLXXXfhIwAAUL2Kn6MbJgAAgGECAAAMAYYJAABQOcMEAAConGECAABUzjABAAAqZ5gAAACVG3bDpPwfs/TXpEkXfumJE+PckiW1FkuSJEmqsPMTJvQ8R+/zOXx9V+F/wnjdokUd8Z9PPx1vvfXWlRkm+UDzF5IkSZI0PMvn/FfYlR8ml3DFJFdZrrNYvkySJElSldU9R+/zOXx91
8QVk8HIX6b2S+coOXO6q3bDeUmSJElVluOk+zl6FQwTSZIkSYaJYSJJkiQNgQwTw0SSJEmqPMPEMJEkSZIqzzAxTCRJkqTKM0wME0mSJKnyDBPDRJIkSaq8qofJjh07Yvv27bFt2zbDRJIkSfq4ZpgYJpIkSVLlGSYXHyZTm5uL6m87cuTDuKepKdat+2Ov21tbf1Mc23h/fhw3bmycOnWy1/GSJEmS6jJMLj5M+hoVeduNN1z/V4Mlv85xUn9bZphIkiRJg8gwufgw2bHjvWhqurv4WN6WA+Shbz7Ya2zkVZKHH36o13FlhokkSZI0iAyTiw+THBM5KnJclF8/++yE4uv6IVI/Phq/p3GY5LDJKy5Z/e3lS8DK+/LzvK38uXns8mWv9xzz5S99sc8hJEmSJF2TGSb9v/m9fO9Ifp5DIIdJDoW8rRwf9cf0N0waR8rmzZ3F5+UoqX8pWH5eHlues36s5M+rP5ckSZJ0TWeY9D9Mcozk1ZEcBPUDpPy8HA3lEGn8unGY1I+LssbBkuUx5VWZxnNm9Y+rvE2SJEm6ZjNM+h8m9QOh/ipJOQx27txRjIZyIPQ3TMqvG1+KVT94yurPY5hIkiRp2GeY9D9MshwNs2fN7DVAysGy4Bfzew2PgYZJWQ6L8o31fR2Tn+fLxvL+xnOW32+YSJIkadhkmAw8THIQNL5ZPcvBkn+hq/69IY0jon505JvXc1Dk7eWwya/z88G8x8QwkSRJ0rDNMBl4mOQIyJde1Q+HLIdC41/H6m+Y5OflX93K6odGOU7K+8rv6eucmWEiSZKkYZVhMvAwkSRJknSVM0wME0mSJKnyDBPDRJIkSao8w8QwkSRJkiqv6mHS2bkpNm383/jThvXxP+v+GG+vfSvWrF7VffdVYphIkiRJQytXTAwTSZIkqfIME8NEkiRJqjzDxDCRJEmSKs8wMUwkSZKkyjNMDBNJkiSp8gwTw0SSJEmqvI/3MFlSDBNJkiRJ1VYMku7n6FW48sNk164Lw6O/Jk0qfumYOLH4xfMfQZIkSVJ1nZ8woec5ep/P4evL5/xX2JUfJvlA8xeSJEmSNDzL5/xXWLVXTPJjX/dLkiRJ+vt2Kc/Rr4krJoORv8xVWloAAMBlqPg5umECAAAYJgAAwBBgmAAAAJUzTAAAgMoZJgAAQOUMEwAAoHKGCQAAULmP5TAp/yeMV+F/zAIAAFyGip+jVzNMAAAA6hgmAABA5QwTAACgcoYJAABQOcMEAAConGECAABUzjABAAAqZ5gAAACVM0wAAIDKGSYAAEDlDBMAAKByhgkAAFA5wwQAAKhc5cPkzJkzsXlzZ2xYv37Q5fH5fQAAwPBQ+TD5y9GjMW1qc7y9dm2fI6SxPC6Pz+8byPnz56OzszMefPA/4qZPfiI++5lbY8GCBXH27NnuIwAAgKFgSAyTuXNaoqurq/uWC3I8nD59uvurj+RxefxAwyRHSXt7e9z66Vviv198MT744INYs2Z1/NekSXHq1KnuowAAgKFgSA2TkydPRsvsmbF3797Y/f778ebKld1HfWSww+TgwYPR1HR3tLW1FiOltG/fvj4HDwAAUJ0hNUx21h5LR/vCWLHit8UwmffKyzF92tTYunVL7bFtjik/mRydtcc3mGGSo+buu+6Mw4cPd9/SW/68l1paipd35cu8vvfomNi/f39x39Tm5njsse/F/Pnz4uZP3RSPP/ZYvPvuu/GtRx4pjp3y3HPFVZds3Lix8cz48fH881OKY0eNHBm7du4sztPR0RG3/etX4sYbro87Royo/buuL27P8z86Zkw0v/BCr+/Jn3H7V2+LrVu2FMedOH48vvudb8fSpUuLrwEAYLgaMsMkn+S3tb4Wy15fGnNaZhdPzn//uxVx5MiRePWXC6J9YVt01Y4Z7BWT1tq5cjT09bKtfJlYvrzrgftHxd69e+LYsWPFsMixcOLEiWI45KDIx7B169bi82+MHl0cu2rVqvjC5z9X+3fa2
DNM7mlqip21YZHnmTz5xz3nyePz47lz5+LFn/88nv7+U8XVmvL8OVQOHDgQDz/8UMyYPr04Nr/3F/PnF48zf8Z9991bXP0BAIDhbMgMk0OHDsXCttY4VxsNf1izOhYv6oifzZgWSxYvilVvrizuyysp+TKvwQyT5cuXF0/q+zoub8v78phSjpCv3TEi8t8jh0M5asrxMXvWrOK4HEo5RNatW/dX96W8Pe/P43LU/PAHPyiuluSVlvKc9edP+fX48T8q/tJYXh3JqyR5tSTPO2vWzF4vRQMAgOEn4v8B9CulrZ4rTyUAAAAASUVORK5CYII=",
"ElementTypeName": "input",
"InstanceId": "dae70276-c98f-4162-a2f1-0fb4fc78b848",
"Name": "input 2",
"SelectorCount": 1,
"Selectors": [
{
"CustomSelector": "html > body > div:eq(0) > div > div:eq(1) > div:eq(2) > div > div:eq(3) > div > div > div:eq(3) > div > div > div:eq(1) > div > div > div > div > div:eq(2) > div:eq(1) > span > div:eq(0) > div > div > input",
"Elements": [],
"Ignore": false,
"IsCustom": true,
"IsWindowsInstance": false,
"Order": 0
}
],
"Tag": "input"
},
{
"AutomationProtocol": null,
"ScreenShot": "iVBORw0KGgoAAAANSUhEUgAAAyYAAABgCAYAAADo3mbeAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsMAAA7DAcdvqGQAAArMSURBVHhe7d37c9X1ncdx/iR1ttbtbC9T7Q+97G5tx+7aGmerlc62xd3W1gt1q1tFaK0oFLWuwFqRMsVOIdyFGdMi4sRS6EBComMltogMjEBtEgJDePe8v+TEY3oIgRz4xPh4zDwnyTnfc0jCL+c13/OFGa+//nq89tpr8corr8TevXtjz549sWvXrgAAALhUDBMAAKA4wwQAACjOMAEAAIozTAAAgOIMEwAAoDjDBAAAKM4wAQAAijNMAACA4gwTAACgOMMEAAAorvgwOXnyZHR3d8WunTsnXB6fjwMAAKaH4sPkL8eOxaKFC+Llzs6mI2RseVwen487l9OnT0dXV1d84xv/GVd+6B/iEx//WKxYsSJOnTo1cgQAADAVTIlh8vOnlsbQ0NDILWfkeDhx4sTIV+/K4/L4cw2THCVr166Nj330n+L/nngi3nrrrdi+/cX40bx5cfz48ZGjAACAqWBKDZPBwcFYuuTJ2L9/f/zpjTfiha1bR45610SHyaFDh6Kt7YZob19djZS6N998s+ngAQAAyplSw2Rf7XtZt3ZNdHQ8Xw2TZ5Y9HY8uWhi9vT2176075v/kweiqfX8TGSY5am74ypfj7bffHrnlvfLP+/+lS6u3d+XbvL733dviwIED1X0LFyyI22//Xixf/kxc9eEr447bb49XX301/uvWW6tj5z/0UHXWJZs9+664/7774uGH51fH3nzTTdG3b1/1POvWrYtrP/+vccXll8WXrruu9nvdWd2ez//d226LBY888p7H5J/xxS9cG709PdVxA/398Z1v/3ds2rSp+hoAAKarKTNM8kV+++pVsfm5TfHU0iXVi/Pf/qYjjh49Git/uSLWrmmPodoxEz1jsrr2XDkamr1tK98mlm/vuuVrN8f+/X+Od955pxoWORYGBgaq4ZCDIr+H3t7e6vOvz5xZHbtt27b41DVX135Pu0eHyY1tbbGvNizyeR588Mejz5PH58fh4eF44mc/ix/8z93V2Zr68+dQOXjwYMya9a147NFHq2Pzsb9Yvrz6PvPP+OpX/6M6+wMAANPZlBkmhw8fjjXtq2O4Nhpe2v5ibFi/Lh5/bFFs3LA+tr2wtbovz6Tk27wmMky2bNlSvahvdlzelvflMXU5Qv7tS9dF/j5yONRHTX18LFm8uDouh1IOkR07dvzdfSlvz/vzuBw1/3vvvdXZkjzTUn/OxudP+fV99/2w+pfG8uxIniXJsyX5vIsXP/met6IBAMB0NGWGSZ4tqF/7kWc0BgcHqjMk+TFfmGf9/X+tjpvIMMm3Rf3LP3+uuuB9rPowabyGJYfJ9df/e/T19TUdJnkGJjUbJvUzHKmzs7O6v6f2fHmNS/4Zecak8QxOs2FS/zrPjsyceUv1PLfeOqs6awIAANPdlBgmSxY/WZ0xyc/PVR6Xx+fn46m/XSvfdvXsypXV43Ks5EDIi+zzrVNj38p1zz0/+LvhkOXn4w2TfJ68qD6/p7u///2YO/eB6s/KMzC7d++ubr/zjjtGn3O8YZIDLM+S5FvH7rrzjmqIAQDAdFdmmPT15Xutqo85Elav+nUsf2bZhMvj83Hnkmdg1rS3j16AniNl2bKnq7dM9ff3V2MkL37PC9Dn3H//6IXy5ztMcnTkxfL5dq0cE/k8OYxyYORz50Xzjz/+2OhzjjdMUp69uebqT7roHQCAS6fhNXoJM/JF8N7u7tize3f8oTZIfl97wZ1vI7qo8geeMePMx/ex+jDJYdFK+XeS4yevpwEAgEui8Gv0MmdMDJOm8m1cR44cqf5lr3wbWp51AQCAS8Iwef9q9TDJv4u8YP/ee+6JY+e4hgYAAFrKMAEAAIozTAAAgOIMEwAAoDjDBAAAK
M4wAQAAipt2w6T+H7OM17x5Z37ouXNjeOPGWhskSZIkFez0nDmjr9GbvoZv7CL8J4ytHyb5jeYPJEmSJGl6lq/5W6zoGZNcZbnOYvNmSZIkSSVreI3e9DV8Y++LMyYTkT9M7YfOUXLyxFCcPj0sSZIkqWDVOBl5jV6CYSJJkiTJMDFMJEmSpPIZJoaJJEmSVDzDxDCRJEmSimeYGCaSJElS8QwTw0SSJEkqnmFimEiSJEnFM0wME0mSJKl4holhIkmSJBXPMGnhMFm44JH43e9ebnqfJEmSpLNnmLRomPzxj6/FrFnfjCNH3m56vyRJkqSzZ5i0aJisWvXrqmb3SZIkSRo/w2ScYTI4OBCzZ98Zm597Lm5sa4srLr8sPvuZT1dnR8Ye98AD94/eXn9cHp/lY51JkSRJks6eYTKBYXJjw7DI60jytryvflxeV9J4Ww6UHDONz+FsiiRJknT2DJMJDJPGC9qbXUtyrovec5TkMc3ukyRJkmSYTHqY5Mc8ZuxbtXKI1N/KlRkmkiRJ0tkzTCY5TJqdDcmvG9+65YyJJEmSNH6GySSGSd7feNF7s8fUvzZMJEmSpLNnmEximOTnOUzyuMbH5fH1t3Dlv+KVxxgmkiRJ0tkzTMYZJudq7Fu2JEmSJF1YhskFDpOzXfQuSZIk6fwzTCZxxkSSJElSazJMDBNJkiSpeIaJYSJJkiQVzzAxTCRJkqTiGSaGiSRJklQ8w8QwkSRJkopnmBgmkiRJUvEME8NEkiRJKp5hYphIkiRJxfuAD5ON1TCRJEmSVLZqkIy8Ri9hRm9PT+zt7o49u3fHH2qD5Pc7dkRnZ+fI3Regr+/M8BivefOqHzrmzq1+8PwlSJIkSSrX6TlzRl+jN30N31i+5m+x1p8xyW80fyBJkiRJ07N8zd9irR8m53PGJD82u1+SJEnSpe18XqO/L86YTET+MBdpaQEAABeg8Gt0wwQAADBMAACAKcAwAQAAijNMAACA4gwTAACgOMMEAAAozjABAACK+0AOk/p/wngR/mMWAADgAhR+jV5mmAAAADQwTAAAgOIMEwAAoDjDBAAAKM4wAQAAijNMAACA4gwTAACgOMMEAAAozjABAACKM0wAAIDiDBMAAKA4wwQAACjOMAEAAIorPkxOnjwZ3d1dsWvnzgmXx+fjAACA6aH4MPnLsWOxaOGCeLmzs+kIGVsel8fn48Zz9OjRuLGtLa64/LLRFi5YMHLv5Jw4cSLWr18XBw4ciOPHj8fs2Xe17LkBAOCDaEoMk58/tTSGhoZGbjnj1KlT1QAYK4/L4yc6TFasWFF9ng3094/cOzmHDx+Om2+6KTo6OgwTAABogSk1TAYHB2Ppkidj//798ac33ogXtm4dOepd5ztMVq9eNXLLxWGYAADA5E2pYbKv9r2sW7smOjqer4bJM8uejkcXLYze3p7a99Yd83/yYHTVvr/JDJP6kPjxj+bFokU/jY/841Ux/6GHoqurK77y5evjqg9fGb9Yvrw6Y3Po0KGYfded1W3Zww/Pj4GBgcjf2Wc/8+nYsWOHYQIAAC0wZYZJvsBvr42Izc9tiqeWLonenp747W86qoGx8pcrYu2a9hiqHXO+Z0warzFpHBK3fO3m6szM9u0vxjVXf7K6LR/zq189G1/8wrXVff39/XHw4FsxPDwcffv2Vbfv2rXTMAEAgBabMsMkr9tY0746hk+dipdqY2HD+nXx+GOLYuOG9bHtha3VfXkmJQfD+QyTxmtM8l/yGjsk6sdt2bKl+jp/H21tN1QfcwQ9u3JlzJx5S3zqmqtHx41hAgAArRTxN2nDgsWH9FtPAAAAAElFTkSuQmCC",
"ElementTypeName": "input",
"InstanceId": "ec86ac3c-a3cd-4454-b496-5b310b397a95",
"Name": "input 3",
"SelectorCount": 1,
"Selectors": [
{
"CustomSelector": "html > body > div:eq(0) > div > div:eq(1) > div:eq(2) > div > div:eq(3) > div > div > div:eq(3) > div > div > div:eq(1) > div > div > div > div > div:eq(2) > div:eq(2) > span > div:eq(0) > div > div > input",
"Elements": [],
"Ignore": false,
"IsCustom": true,
"IsWindowsInstance": false,
"Order": 0
}
],
"Tag": "input"
},
{
"AutomationProtocol": null,
"ScreenShot": "iVBORw0KGgoAAAANSUhEUgAAAyYAAABgCAYAAADo3mbeAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsMAAA7DAcdvqGQAAAkdSURBVHhe7d37b9X1Hcdx/yQ1m7rNbMvURHQXdXEZKmbqZDPz18nlV6lUUbSs6g8KqwI2oq4Fyj2xykUDITSBAtWBFEOckREua0tp2r53Pl966klzWlp68HPExyN5pu0531PO4afzyvd84YYvvvgijh07Fp999lkcOXIkDh06FF1dXQEAAPBdMUwAAIDsDBMAACA7wwQAAMjOMAEAALIzTAAAgOwMEwAAIDvDBAAAyM4wAQAAsjNMAACA7AwTAAAgu+zDZGhoKA4f7o6uAwemXTo+PQ4AALg+ZB8m58+di+YVTbFv796qI2Ri6bh0fHrcVM6ePRuPzpsXN99043grmprG7p2dS5cuxaZNHfHVV1/FxYsXY/HiRTX73QAA8ENUF8PkrZZVMTg4OHbLZcPDw8UAmCgdl46f7jBpbW0tvk/19/WN3Ts7p0+fjicefzw6OzsNEwAAqIG6GiYDAwOxauUbcerUqfjy5MnYtXPn2FHfmukwaW9vG7vl2jBMAABg9upqmJwoPZeOjRuis/PDYpisWf12vNq8Inp6jpae2+FY/uKy6C49v9kMk/KQeOH5xmhu/kf89Ce3xfKXXoru7u54+KG5cdutt8Q7a9cWZ2y++eabWLxoYXFb6uWXl0d/f3+kv7N775kT+/fvN0wAAKAG6maYpDf460sjYvu2rdGyamX0HD0aH3/UWQyMde+2xsYN62OwdMxMz5hUXmNSOSSe/PMTxZmZTz7ZE3fe8avitvSY999/L37/wP3FfX19ffH11/+JkZGR6D1xori9q+uAYQIAADVWN8MkXbexYX17jAwPx6elsbB5U0e8/lpzbNm8KXbv2lncl86kpMEwk2FSeY1J+pe8Jg6J8nE7duwofk5/H/PmPVJ8TSPovXXrYv78J+OuO+8YHzeGCQAA1FbdDJP0Eanyxe7pY1QDA/3FGZL0dXR0tKiv73/FcTMZJpN9lGviMEkjI6kcJm+1tMTCBQuKP6vyOMMEAABqqy6Gyco33yjOmKTvr1Q6Lh2fvp9KeUhUnjG5cOFCcYH9dIdJOua5hiXFmZN0If7tt//MMAEAgGsgzzDp7Y1IH50qfU1Dob3tX7F2zeppl45Pj5tKeXBUXmOSBkS6fbrD5Pjx48UF8b/8xc+jpeWf8cjDDxkmAABcnyreo+eQZ5ikF3zDDZe/AgAA+WV+j26YAAAAhgkAAFAHDBMAACA7wwQAAMjOMAEAALIzTAAAgOwMEwAAILvrbpiU/2OWqWpsvPyily6NkS1bSm2WJEmSlLHRhobx9+hV38NXdg3+E8baD5P0RNMLkiRJknR9lt7z11jWMyZplRXrbPs2SZIkSRmLxqXj79Grvoev7HtxxmQ60ospveg0SoYuDcbIyLAkSZKkjBXjZOw9eg6GiSRJkiTDxDCRJEmS8meYGCaSJElS9gwTw0SSJEnKnmFimEiSJEnZM0wME0mSJCl7holhIkmSJGXPMDFMJEmSpOwZJoaJJEmSlD3DpIbDpOmVV2Lfvn1V75MkSZI0eYZJjYbJsWP/jqef/lucOfPfqvdLkiRJmjzDpEbDpK3tg6Jq90mSJEmaOsNkimHS398XixYtjO3btsaj8x6Jm2+6Me69Z05xdmTicQ0NS8ZvLz8uHZ9Kj3UmRZIkSZo8w2Qaw6RyWKTrSNJt6b7ycem6ksrb0kBJY6bydzibIkmSJE2eYTKNYVJ5QXu1a0mudNF7GiXpmGr3SZIkSTJMZj1M0td0zMSPaqUhUv4oV8owkSRJkibPMJnlMKl2NiT9XPnRLWdMJEmSpKkzTGYxTNL9lRe9V3tM+WfDRJIkSZo8w2QWwyR9n4ZJOq7ycen48ke40r/ilY4xTCRJkqTJM0ymG
CZXauJHtiRJkiRdXYbJVQ6TyS56lyRJkjTzDJNZnDGRJEmSVJsME8NEkiRJyp5hYphIkiRJ2TNMDBNJkiQpe4aJYSJJkiRlzzAxTCRJkqTsGSaGiSRJkpQ9w8QwkSRJkrJnmBgmkiRJUvZ+4MNkSzFMJEmSJOWtGCRj79FzqP0w6e29PDymqrGxeNGxdGnxwtNfgiRJkqR8jTY0jL9Hr/oevrL0nr/Gaj9M0hNNL0iSJEnS9Vl6z19jec+YpK/V7pckSZL03TaT9+jfizMm05FezDVaWgAAwFXI/B7dMAEAAAwTAACgDhgmAABAdoYJAACQnWECAABkZ5gAAADZGSYAAEB2P8hhUv5PGK/Bf8wCAABchczv0fMMEwAAgAqGCQAAkJ1hAgAAZGeYAAAA2RkmAABAdoYJAACQnWECAABkZ5gAAADZGSYAAEB2hgkAAJCdYQIAAGRnmAAAANkZJgAAQHbZh8nQ0FAcPtwdXQcOTLt0fHocAABwfcg+TM6fOxfNK5pi3969VUfIxNJx6fj0uKmcPXs2Hp03L26+6cai++/7XWzfvi1GR0cjveZ775kT+/fvHzsaAADIqS6GyVstq2JwcHDslsuGh4fj0qVLYz99Kx2Xjp/uMGltbS2+37FjRzFGDh48aJgAAECdqathMjAwEKtWvhGnTp2KL0+ejF07d44d9a2ZDpP29rbi5/S7Fy5YUPxsmAAAQH2pq2FyovRcOjZuiM7OD4thsmb12/Fq84ro6Tlaem6HY/mLy6K79PxqNUzeWbs2Hn5obtx26y2xZs3q4ixNkv4e/jJ/fvERsDl33x0bN24s7rt48WIsXrwoljz7bNEtP/5RPPPM3+PMmTPF486VntOyZS8Uvy89buvWrTEyMlLcBwAATK5uhkl607++NBq2b9saLatWRs/Ro/HxR53FwFj3bmts3LA+BkvHXM0ZkzQOdu/eXVxn8vnnn48PkzQizp8/H21tbcV9vb29ceLEiXjg/vvigw/eLz5K1tV1oPh5z54948Pkscf+FL2l444fPx5/ePDB6OjoKI59rmFJ8bvSn9fT0xNz5/6x+PMAAICp1c0wOX36dGxY3x4jw8Px6Sd7YvOmjnj9tebYsnlT7N61s7gvnUlJH/OayTApX/x+1513xLbS6Kl28Xv6OY2PQ4cOFiPjqaf+GhcuXCjuS2dKXni+MVY0NY0Pk/R9Uv555ZtvFqPmt7/59fifV87HxQAA4Eoi/g/lFMmApH+CVwAAAABJRU5ErkJggg==",
"ElementTypeName": "input",
"InstanceId": "520d3912-1a9a-40ee-91a9-c4dccff2537b",
"Name": "input 4",
"SelectorCount": 1,
"Selectors": [
{
"CustomSelector": "html > body > div:eq(0) > div > div:eq(1) > div:eq(2) > div > div:eq(3) > div > div > div:eq(3) > div > div > div:eq(1) > div > div > div > div > div:eq(2) > div:eq(3) > span > div:eq(0) > div > div > input",
"Elements": [],
"Ignore": false,
"IsCustom": true,
"IsWindowsInstance": false,
"Order": 0
}
],
"Tag": "input"
},
{
"AutomationProtocol": null,
"ScreenShot": "iVBORw0KGgoAAAANSUhEUgAAAyYAAABgCAYAAADo3mbeAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsMAAA7DAcdvqGQAAAgLSURBVHhe7d3bj5R3Hcdx/qS20eq9moD10KposWDsTbWmFypt6cE7QLCYKgRq6gKpAhIPlGM5JqAUrBhgEyiwIIetISR1Q2Apy0l2f873B8/2cZmdZcnS3zB9vZJ3dg7P7Oz0aj55Zsqk06dPp5MnT6bjx4+no0ePpsOHD6fu7u4EAADwSTFMAACA4gwTAACgOMMEAAAozjABAACKM0wAAIDiDBMAAKA4wwQAACjOMAEAAIozTAAAgOIMEwAAoLjiw+TGjRvpyJH3U/fBg3ddHB+PAwAAOkPxYdJ/8WJavGhh+ue+fU1HyMjiuDg+HtfKhQsX0vdmzEiPPPxQ7onHv562b9+WhoaGUrzmx748Je3fv//20QAAQEltMUx+99bydO3atdu33HLz5s10/fr129c+FsfF8Xc7TFavXp0v79ixI4+RQ4cOGSYAANBm2mqYXLlyJS1f1pXOnj2b/v3BB+nd3btvH/Wx8Q6TdevW5uvxu1+aNStfN0wAAKC9tNUwOdP4WzZt3JB27dqZh8nKFb9PSxYvSj09xxp/25H0+i8XpPcbf99EDZM/rFqVnvrutPT5zz2aVq5ckc/ShPjv8INnnskfAZsyeXLauHFjvu/q1avplVdeTnNmz849+tnPpBdffCGdP38+P+5i429asOC1/PvicVu3bk2Dg4P5PgAAYHRtM0ziTf/6xmjYvm1remv5stRz7Fj621935YHxpz+uThs3rE/XGsfcyxmTGAd79uzJ3zM5ceLE8DCJEdHf35/Wrl2b7+vt7U1nzpxJ33ji8bRmzV/yR8m6uw/m63v37h0eJk8//f3U2zju1KlT6dtTp6ZNmzblY38+d07+XfF8PT09adq0J/PzAQAArbXNMOnr60sb1q9Lgzdvpvf+vjdtfmdT+s0bi9OWze+kPe/uzvfFmZT4mNd4hkn15fcvffELaVtj9DT78ntcj/Fx+PChPDKeffaH6dKlS/m+OFPy2i/mp0ULFw4Pk7gcquvLli7No+ZrX/3K8PNV+bgYAACMrW2GycDAwPCX3WMMXLkykM+QxM8YE9Hlyx/l48YzTOLL73Fs9TGt0GyYVNdjmPz0Jz9OA5cv5/uqYfLGkiWjDpO4HsNk6re+mccNAAAwPm0xTJYt7cpnTOLyWMVxcXxcbmXkd0zqWg2TuBxnPuof5Zr25Hfyfa2GSQym52fOTD979dX8t8VHzuL/BDbW3wkAAJQaJr29qfGuPf+ML6WvW/t2WrVyxV0Xx8fjWrnXYRJnZg4cOJC/FB8fxYrvkOzauTPf3mqYhHPnzqUXnp+ZvxQfHx3r6vptPgYAANpe7T16CWWGSbzgSZNu/QQAAMor/B7dMAEAAAwTAACgDRgmAABAcYYJAABQnGECAAAUZ5gAAADFGSYAAEBxHTdMqn+YpVXz59960fPmpcEtWxptliRJklSwoblzh9+jN30PX+8+/COMEz9M4g+NFyRJkiSpM4v3/BOs6BmTWGW3FlqcNZEkSZJUquqMSf7Z7D18vQfijMndiBfTeNExSm5cv5b+e+O6JEmSpILFOKneo5dgmEiSJEkyTAwTSZIkqXyGiWEiSZIkFc8wMUwkSZKk4hkmhokkSZJUPMPEMJEkSZKKZ5gYJpIkSVLxDBPDRJIkSSqeYWKYSJIkScUzTMY5TPr+82F67rkfpX+dOJ6vv73mz+nXv3r9juNGK46Nx8Tlff94L7380qz00aX+O45r1nifS5IkSXpQMkwME0mSJKl4holhIkmSJBXPMBljmMRoiPHwyMMP5brefHPUYVIdWx8bcX/12Lh9zpzZow6T+D0zpk/P46e6v3ps3B7PbZhIkiSpEzNMWgyTamhUQyKKYfDYlMl3DJPq2
PpwiPvqw6MaGs2GSTxu5LH1kRLPF89rmEiSJKkTM0xaDJMYA3F2pBoH0Wgf5aqqjquGSgyM6rYojhk5TFatWPF/o2TkcVXVc9VvkyRJkjohw6TFMKmf0ahuazZMqo9ptTquauQwqT6mVR8/o40aw0SSJEmdmmEyzjMmcdv06U81PWNSHyfNxkV1W32YxPU4YzJynNQHTP02w0SSJEmdmGHSYpjEUIjBUB8IMQyafcekuq8+TkZer86QjBwmcX/cVh8nI6/H8/mOiSRJkjo1w6TFMImqcRKDItraeEyz75jE5eqMSH24xH3VY+Ny1GyY1I+tzrLEcdVj47g4s1I9lyRJktRJGSZjDBNJkiRJ9z/DxDCRJEmSimeYGCaSJElS8QwTw0SSJEkqnmFimEiSJEnFM0wME0mSJKl4holhIkmSJBXPMDFMJEmSpOIZJoaJJEmSVDzDxDCRJEmSimeYGCaSJElS8T7lw2RLHiaSJEmSypYHye336CVM/DDp7b01PFo1f35+0WnevPzC4z+CJEmSpHINzZ07/B696Xv4evGef4JN/DCJPzRekCRJkqTOLN7zT7CyZ0ziZ7P7JUmSJH2yjec9+gNxxuRuxIu5T0sLAAC4B4XfoxsmAACAYQIAALQBwwQAACjOMAEAAIozTAAAgOIMEwAAoDjDBAAAKO5TOUyqf4TxPvzDLAAAwD0o/B69zDABAACoMUwAAIDiDBMAAKA4wwQAACjOMAEAAIozTAAAgOIMEwAAoDjDBAAAKK7YMDnf96EkSZKkB6T7zRkTAACgOMMEAAAozjABAACKM0wAAIDiDBMAAKA4wwQAACjOMAEAAIozTAAAgOIMEwAAoDjDBAAAKM4wAQAACkvpf2pMXZDitRP0AAAAAElFTkSuQmCC",
"ElementTypeName": "input",
"InstanceId": "298ba462-b2a0-415e-905e-8f620bf377d0",
"Name": "input 5",
"SelectorCount": 1,
"Selectors": [
{
"CustomSelector": "html > body > div:eq(0) > div > div:eq(1) > div:eq(2) > div > div:eq(3) > div > div > div:eq(3) > div > div > div:eq(1) > div > div > div > div > div:eq(2) > div:eq(4) > span > div:eq(0) > div > div > input",
"Elements": [],
"Ignore": false,
"IsCustom": true,
"IsWindowsInstance": false,
"Order": 0
}
],
"Tag": "input"
},
{
"AutomationProtocol": null,
"ScreenShot": "iVBORw0KGgoAAAANSUhEUgAAAGIAAABKCAYAAABAWoFVAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsMAAA7DAcdvqGQAAAOZSURBVHhe7Zpdb9owFIb71/srtmmatN3uar3uf6ggmKQhIYQlQAOUllab1GrSu+N88DWP0C7IZ/S80qs0OY5j+/GxQ5UziFhIQDCRgGAiAcFEAoKJBAQTCQgmEhBMJCCYSEAwkYBgIgHBRAKCiQQEEwkIJhIQTCQgmEhAMJGAYCIBwUQHg3h+fsLjw7IR/+wqPF1c5EdT3IaP2SY9dnU6GISu8GHnAa+17jDOzvKjKW7Dx2pTNWZ1ehGIpvyWQFSuk4AoLSAMcRsWEIa4DQsIQ9yGBYQhbsMCwhC34ZMDUf0w2udf5+d5p/XRFLfhl7TpNT/66tQ4CN1Q3aFTtu6jqe/7XCeq+TCZKjf5nzPi21c8fP6I2Yd3mL4v/ekLfuyWa9gnlxGHWHfGPLPukHgtOK0rKK+HOOoj7l/Dc31Mtso177+3qRnXiRmIEcLOFZxuiMXW9eNbQGzFbhApAtFyEKa3OzHtGdJAodvWZQp3rweYU2waOHTegj9cl6+uBak+z5D2FFR1r6MQje5WZQXETmw56sEtB6ujXMSTTSCUMS4NYDLGNEsQdYtlzB/QgM778PQ9XlyWLaEqHxkteZNeAcXrJ5iOYwQ689oukvuibgFhiD8uRhj61extwQ0SLE3lxtdQNPiqN6LzGeKuHlwPqY5lBJRi3XBCsQh+DmmwrmfooqMhxsW5gDDEVyYgxax3EGX62i0mkQdPtdBpF9mgXYCgbMoHl5aipFqWFOI53VcCq8pv2h8UzxIQhviml5EqBoxm7iL/mzIkHCLLbjBPNzOCyt8P4FMWKT/Kl6V1BlQZEdGSdrPl+aJ4joDYitGAdRSCgF5b81dXr9gv2gpDmtlZufm6oV7nh6s9YgWCnPp0rdMiQHrjrjbjGYb5azFd6xcQsySET6/I1WuxgNiKfUeYLzvrpUPlG3Y5oDOa8XqT1bG2g4BAbWWEdlpccxwPo1W95PsxYs9Z101Lm+vHq9dkAWGI2/DJgZB/+pldp8ZB6IbqDp2ydR9Nfd/nOlHNh8lUucmSEWbXqXEQh1h35rUz61g+dpvqJCBKCwhD3IYFhCFuwwLCELfh/wqEfIT8cjf+EXKTn+W/JRDajX6W36guL/NO50custwmOyDiuOiwPnKR5TbZASH6QwKCiQQEEwkIJhIQTCQgmEhAMJGAYCIBwUQCgokEBBMJCCYSEEwkIJhIQDCRgGAiAcFEAoKJBAQTCQgWAn4D/HRMHi6snHMAAAAASUVORK5CYII=",
"ElementTypeName": "span",
"InstanceId": "b54b28d7-719b-4e77-a97c-d97627ea5a6d",
"Name": "span 2",
"SelectorCount": 1,
"Selectors": [
{
"CustomSelector": "html > body > div:eq(0) > div > div:eq(1) > div:eq(2) > div > div:eq(3) > div > div > div:eq(3) > div > div > div:eq(1) > div > div > div > div > div:eq(4) > div > button > span > span > span",
"Elements": [],
"Ignore": false,
"IsCustom": true,
"IsWindowsInstance": false,
"Order": 0
}
],
"Tag": "span"
}
],
"ScreenShot": null,
"ElementTypeName": "Web Page",
"InstanceId": "87159776-0a1f-47cb-99f6-2a3a0826ae81",
"Name": "Recording",
"SelectorCount": 1,
"Selectors": [
{
"CustomSelector": null,
"Elements": [
{
"Attributes": [],
"CustomValue": null,
"Ignore": false,
"Name": "Web Page",
"Tag": "domcontainer"
}
],
"Ignore": false,
"IsCustom": false,
"IsWindowsInstance": false,
"Order": 0
}
],
"Tag": "domcontainer"
}
],
"Version": 1
}
</code></pre>
<h2 id="sharepoint-rest-api">SharePoint REST API</h2>
<p>The SharePoint REST API offers a huge amount of capabilities for carrying out actions within SharePoint. In order to utilize them within Power Automate Desktop we can use the <strong>Invoke web service</strong> action. In order to utilize the web service it must have the proper authentication information sent along with it. The authentication mechanism will consist of a service principal, within SharePoint there are two options for doing this. The first is App-only authentication which can be configured through the use of SharePoint pages and is detailed <a href="https://docs.microsoft.com/en-us/sharepoint/dev/solution-guidance/security-apponly-azureacs">here</a>. Additionally, you can utilize AD-only authentication which is newer and preferred for SharePoint Only but does not work in SharePoint On-Premise. For the scenario I choose to utilize the App-Only method due to the greater simplicity in setting it up and the ability for it to function Online and On-Premise.</p>
<h3 id="create-a-service-principal">Create a Service Principal</h3>
<p>Creating the service principal and configuring its permissions are well documented at <a href="https://docs.microsoft.com/en-us/sharepoint/dev/solution-guidance/security-apponly-azureacs#setting-up-an-app-only-principal-with-tenant-permissions">Setting up an app-only principal with tenant permissions</a>.</p>
<p>After you have created the service principal you may also need to ensure that app-only authentication is not disabled. This is detailed in the next section.</p>
<h3 id="enable-tenant-for-app-only-access">Enable Tenant for App-Only Access</h3>
<p>Newer tenants may have App-only access disabled. To enable it, you will need to update the setting for the SharePoint tenant. This can be done through the <a href="https://docs.microsoft.com/en-us/powershell/sharepoint/sharepoint-pnp/sharepoint-pnp-cmdlets?view=sharepoint-ps">SharePoint PnP PowerShell Cmdlets</a>.</p>
<p>To install PnP and authorize it, run the following commands.</p>
<pre><code>Install-Module -Name Microsoft.Online.SharePoint.PowerShell
Install-Module -Name PnP.PowerShell
Register-PnPManagementShellAccess
</code></pre>
<p>The Register-PnPManagementShellAccess command will open a web browser where you will need to authorize the management shell app. When this screen appears, click the Accept button. If you have the proper rights, you can also choose to authorize this connection for your entire organization.</p>
<p><img src="https://user-images.githubusercontent.com/7444929/151868432-6a41a238-42ef-44af-9dbc-8770d60e8587.png" alt="Authorize Management Shell"></p>
<p>Now you are ready to update the property that allows App-Only authentication. After you run these commands, the change can take some time to propagate, so go grab some coffee or a drink of your choice.</p>
<pre><code>Connect-PnPOnline -Url https://yoursharepoint.sharepoint.com
Set-PnPTenant -DisableCustomAppAuthentication $false
</code></pre>
<p>Now that our authentication mechanism is ready, we will add the first Invoke web service call, which returns the bearer token we need for subsequent calls. To generate the data needed for the URL and the Request Body of the message, I have created a PowerShell script that generates these values for you from just the SharePoint site URL.</p>
<p><a href="https://github.com/rwilson504/PowerShell/blob/main/SharePoint/AppOnlyWebResquestProperties.ps1">PowerShell to generate URL and Request Body for SharePoint authentication</a></p>
<p><img src="https://user-images.githubusercontent.com/7444929/152588294-bffa6e31-67b4-47cb-9c87-607035742d34.png" alt="Invoke web service for authentication"></p>
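<p>For reference, here is a sketch (in Python, purely for illustration) of how that token URL and form-encoded request body are assembled from the tenant ID, client ID, and client secret. Every ID and secret below is a placeholder value, not a real one; the linked PowerShell script emits the correct values for your tenant.</p>

```python
# Sketch of how the ACS token request URL and form-encoded body are built.
# Every ID and the secret here is a placeholder value.
tenant_id = "00000000-0000-0000-0000-000000000000"
client_id = "11111111-1111-1111-1111-111111111111"
client_secret = "your-client-secret"
sharepoint_host = "yoursharepoint.sharepoint.com"
# GUID identifying the SharePoint principal in the resource parameter
# (assumed here; the generator script produces the value for your tenant).
sharepoint_principal_id = "00000003-0000-0ff1-ce00-000000000000"

token_url = f"https://accounts.accesscontrol.windows.net/{tenant_id}/tokens/OAuth/2"
request_body = (
    f"client_id={client_id}@{tenant_id}"
    "&grant_type=client_credentials"
    f"&resource={sharepoint_principal_id}/{sharepoint_host}@{tenant_id}"
    f"&client_secret={client_secret}"
)
```

<p>Posting that body with a content type of <code>application/x-www-form-urlencoded</code> returns the JSON token response parsed in the next step.</p>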
<p>Next parse the JSON returned from the authentication call so that we can easily retrieve the bearer token.</p>
<p><img src="https://user-images.githubusercontent.com/7444929/152588548-f525f8ce-20c1-46b1-9120-8f2db9e2f28b.png" alt="Parse Authentication Response"></p>
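<p>As a quick sketch of what that parsing yields (Python, with made-up sample values): the response contains <code>token_type</code> and <code>access_token</code>, and the two are concatenated to form the Authorization header used in the next call.</p>

```python
import json

# Made-up sample of a token response; a real ACS response has more fields.
sample_response = '{"token_type": "Bearer", "access_token": "sample-token"}'

token = json.loads(sample_response)
# Mirrors the PAD custom header: %token_type% %access_token%
authorization_header = f"{token['token_type']} {token['access_token']}"
# authorization_header is now "Bearer sample-token"
```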
<p>Now we can use the Invoke web service to call a post method that will create our new item. You can see in the custom headers section where we utilized the output of the authentication call to pass our bearer token. Additional information on how to generate the Request body when working with the SharePoint REST API can be found here: <a href="https://docs.microsoft.com/en-us/sharepoint/dev/sp-add-ins/working-with-lists-and-list-items-with-rest">Working with lists and list items with REST</a></p>
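<p>As an illustration of the body shape (sketched in Python; the content type and field names match the example "Client Contacts" list used below, and your list's internal names will differ): the verbose-OData body is an ordinary JSON object whose <code>__metadata.type</code> names the list item content type, with the column values alongside it.</p>

```python
import json


def build_item_body(content_type: str, fields: dict) -> str:
    """Build the verbose-OData JSON body for creating a SharePoint list item."""
    body = {"__metadata": {"type": content_type}}
    body.update(fields)
    return json.dumps(body)


payload = build_item_body(
    "SP.Data.Client_x0020_ContactsListItem",
    {"FirstName": "Ada", "Title": "Lovelace", "Email": "ada@example.com"},
)
```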
<p>Also, you will need the content type for the SharePoint list item. There are several ways to get this, but the easiest I find is using a browser window that is already authenticated to the SharePoint site and entering the following URL to call the web API. Make sure you update the URL to match your specific SharePoint site location and list name.</p>
<pre><code>https://yoursharepoint.sharepoint.com/sites/Clients/_api/web/lists/GetByTitle('Client Contacts')/items?$select=ContentType/Name
</code></pre>
<p>After the data is returned just search for “SP.Data” and you will be able to see the content type(s) for the list.</p>
<p><img src="https://user-images.githubusercontent.com/7444929/152595257-784f6202-80dc-40be-bc1f-f1e213588d5f.png" alt="Find Content Types"></p>
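<p>If you would rather not search by eye, the same lookup can be done programmatically. The Python sketch below assumes the response was requested as verbose OData JSON (items under <code>d.results[].ContentType.Name</code>); the browser default is Atom XML, which would need different parsing.</p>

```python
import json

# Hypothetical verbose-OData response trimmed to the relevant fields.
sample = """{"d": {"results": [
  {"ContentType": {"Name": "SP.Data.Client_x0020_ContactsListItem"}},
  {"ContentType": {"Name": "SP.Data.Client_x0020_ContactsListItem"}}
]}}"""

data = json.loads(sample)
# Distinct content type names across the returned items.
content_types = sorted({item["ContentType"]["Name"] for item in data["d"]["results"]})
```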
<p>Here is the Invoke web service action I used to create the list items.</p>
<p><img src="https://user-images.githubusercontent.com/7444929/152588964-a7d7c090-6880-43c9-b848-1eebd7867a55.png" alt="Call SharePoint to Create Item"></p>
<p>This is the final result for the web service call method.</p>
<p><img src="https://user-images.githubusercontent.com/7444929/152586179-320a1fd3-f8b8-47c4-897b-a5be641c2a87.png" alt="Final PAD Flow for Web Service Calls"></p>
<p>If you copy the code below you can paste it into the Power Automate Desktop design surface to get started.</p>
<pre><code>Excel.LaunchExcel.LaunchAndOpen Path: $'''C:\\Files\\Desktop\\MOCK_CLIENT_DATA.xlsx''' Visible: True ReadOnly: False LoadAddInsAndMacros: False Instance=> ExcelInstance
Excel.GetFirstFreeColumnRow Instance: ExcelInstance FirstFreeColumn=> FirstFreeColumn FirstFreeRow=> FirstFreeRow
Excel.ReadFromExcel.ReadCells Instance: ExcelInstance StartColumn: 1 StartRow: 2 EndColumn: FirstFreeColumn - 1 EndRow: FirstFreeRow - 1 ReadAsText: False FirstLineIsHeader: False RangeValue=> ExcelData
Web.InvokeWebService.InvokeWebService Url: $'''https://accounts.accesscontrol.windows.net/ef0ff4be-5248-4843-9746-4bd7b801fc2b/tokens/OAuth/2''' Method: Web.Method.Post Accept: $'''application/json''' ContentType: $'''application/x-www-form-urlencoded''' RequestBody: $'''
client_id=2c2aa580-ef50-46ba-aeeb-289e03584fc3@ef0ff4be-5248-4843-9746-4bd7b801fc2b
&grant_type=client_credentials
&resource=00000004-0000-0aa2-cc00-000000000000/yoursharepoint.sharepoint.com@ef0ff4be-5248-4843-9746-4bd7b801fc2b
&client_secret=H1pp6FxAK3jeJu/Tjtugb5sUc7ab+5O14mDmvvTR83i=
''' ConnectionTimeout: 30 FollowRedirection: False ClearCookies: False FailOnErrorStatus: False EncodeRequestBody: False UserAgent: $'''Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.21) Gecko/20100312 Firefox/3.6''' Encoding: Web.Encoding.AutoDetect AcceptUntrustedCertificates: False ResponseHeaders=> WebServiceResponseHeaders2 Response=> WebServiceResponse2 StatusCode=> StatusCode2
Variables.ConvertJsonToCustomObject Json: WebServiceResponse2 CustomObject=> JsonAsCustomObject
LOOP FOREACH CurrentItem IN ExcelData
Web.InvokeWebService.InvokeWebService Url: $'''https://yoursharepoint.sharepoint.com/sites/Clients/_api/web/lists/GetByTitle(\'Client Contacts\')/items''' Method: Web.Method.Post Accept: $'''application/json;odata=verbose''' ContentType: $'''application/json;odata=verbose''' CustomHeaders: $'''Authorization: %JsonAsCustomObject.token_type% %JsonAsCustomObject.access_token%''' RequestBody: $'''{
\"__metadata\": {
\"type\": \"SP.Data.Client_x0020_ContactsListItem\"
},
\"FirstName\": \"%CurrentItem[1]%\",
\"Title\" : \"%CurrentItem[2]%\",
\"Company\" : \"WS - %CurrentItem[3]%\",
\"Email\" : \"%CurrentItem[4]%\",
\"Phone\": \"%CurrentItem[5]%\"
}''' ConnectionTimeout: 30 FollowRedirection: True ClearCookies: False FailOnErrorStatus: False EncodeRequestBody: False UserAgent: $'''Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.21) Gecko/20100312 Firefox/3.6''' Encoding: Web.Encoding.AutoDetect AcceptUntrustedCertificates: False ResponseHeaders=> WebServiceResponseHeaders Response=> WebServiceResponse StatusCode=> StatusCode
END
Excel.CloseExcel.Close Instance: ExcelInstance
SET NumberOfClients TO ExcelData.RowsCount
</code></pre>
<h2 id="powershell-pnp">PowerShell PnP</h2>
<h3 id="install-sharepoint-pnp">Install SharePoint PnP</h3>
<p>Open PowerShell and run the following commands.</p>
<pre><code>Install-Module -Name PnP.PowerShell
Register-PnPManagementShellAccess
</code></pre>
<p>The Register-PnPManagementShellAccess command will open a web browser where you will need to authorize the management shell app. When this screen appears, click the Accept button. If you have the proper rights, you can also choose to authorize this connection for your entire organization.<br>
<img src="https://user-images.githubusercontent.com/7444929/151868432-6a41a238-42ef-44af-9dbc-8770d60e8587.png" alt="Authorize Management Shell"></p>
<p>In order for the PowerShell script to authenticate to SharePoint, we can utilize the PnP module to save the credential on the machine within the Windows Credential Manager. For more information, see <a href="https://github.com/pnp/PnP-PowerShell/wiki/How-to-use-the-Windows-Credential-Manager-to-ease-authentication-with-PnP-PowerShell">How to use the Windows Credential Manager to ease authentication with PnP PowerShell</a>.</p>
<p>This command will store the credential for the SharePoint site so that the PowerShell script can access it.</p>
<pre><code>Add-PnPStoredCredential -Name https://yoursharepoint.sharepoint.com -Username youraccount@yourtenant.onmicrosoft.com -Password (ConvertTo-SecureString -String "YourPassword" -AsPlainText -Force)
</code></pre>
<p>This is the final result for the PowerShell method.</p>
<p><img src="https://user-images.githubusercontent.com/7444929/152586245-aac49479-5e69-4077-9ba3-462129bb0b42.png" alt="Final PAD Flow for PowerShell"></p>
<p>If you copy the code below you can paste it into the Power Automate Desktop design surface to get started.</p>
<pre><code>Excel.LaunchExcel.LaunchAndOpen Path: $'''C:\\files\\Desktop\\MOCK_CLIENT_DATA.xlsx''' Visible: True ReadOnly: False LoadAddInsAndMacros: False Instance=> ExcelInstance
Excel.GetFirstFreeColumnRow Instance: ExcelInstance FirstFreeColumn=> FirstFreeColumn FirstFreeRow=> FirstFreeRow
Excel.ReadFromExcel.ReadCells Instance: ExcelInstance StartColumn: 1 StartRow: 2 EndColumn: FirstFreeColumn - 1 EndRow: FirstFreeRow - 1 ReadAsText: False FirstLineIsHeader: False RangeValue=> ExcelData
SET PowerShellScript TO $'''Connect-PnPOnline -Url https://yoursharepoint.sharepoint.com/sites/Clients'''
LOOP FOREACH CurrentItem IN ExcelData
Text.AppendLine Text: PowerShellScript LineToAppend: $'''Add-PnPListItem -List \"Client Contacts\" -ContentType \"SP.Data.Client_x0020_ContactsListItem\" -Values @{\"Title\" = \"%CurrentItem[2]%\"; \"FirstName\" =\"%CurrentItem[1]%\"; \"Company\" = \"PS - %CurrentItem[3]%\"; \"Email\" = \"%CurrentItem[4]%\"; \"Phone\" = \"%CurrentItem[5]%\"}''' Result=> Result
SET PowerShellScript TO Result
END
Scripting.RunPowershellScript Script: PowerShellScript ScriptOutput=> PowershellOutput ScriptError=> ScriptError
Excel.CloseExcel.Close Instance: ExcelInstance
SET NumberOfClients TO ExcelData.RowsCount
</code></pre>
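<p>The loop above simply accumulates one long script in a text variable: a <code>Connect-PnPOnline</code> line followed by one <code>Add-PnPListItem</code> line per Excel row. Here is the same accumulation sketched in Python (the row data is made up, and the columns follow the same order as the Excel sheet):</p>

```python
# Made-up rows in the same column order as the Excel data:
# first name, title, company, email, phone.
rows = [
    ["Ada", "Lovelace", "Analytical Engines", "ada@example.com", "555-0100"],
    ["Grace", "Hopper", "Cobol Inc", "grace@example.com", "555-0101"],
]

script_lines = ["Connect-PnPOnline -Url https://yoursharepoint.sharepoint.com/sites/Clients"]
for first, title, company, email, phone in rows:
    # One Add-PnPListItem call per row, mirroring the PAD Text.AppendLine loop.
    script_lines.append(
        f'Add-PnPListItem -List "Client Contacts" -Values '
        f'@{{"Title" = "{title}"; "FirstName" = "{first}"; '
        f'"Company" = "PS - {company}"; "Email" = "{email}"; "Phone" = "{phone}"}}'
    )
script = "\n".join(script_lines)
```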
<h2 id="sharepoint-connector-for-pad">SharePoint Connector for PAD</h2>
<p>The FY22 Release Wave 1 plan includes a dedicated SharePoint connector within Power Automate Desktop. Once this is released there will be yet another way to connect!</p>
<p><a href="https://docs.microsoft.com/en-us/power-platform-release-plan/2022wave1/power-automate/sharepoint-connector-power-automate-desktop">SharePoint connector in Power Automate for desktop</a></p>
<p><img src="https://user-images.githubusercontent.com/7444929/151870624-483f4c07-b849-4e39-9e2c-58008e55342a.png" alt="image"></p>
<h2 id="power-apps-portal-configure-azure-ad-provider-in-azure-b2c">Power Apps Portal - Configure Azure AD Provider in Azure B2C (2022-01-07)</h2><p>It is recommended that you no longer use Local Login authentication for Power Apps Portal but instead utilize Azure Active Directory B2C to provide this type of authentication. See <a href="https://docs.microsoft.com/en-us/powerapps/maker/portals/configure/migrate-identity-providers">Migrate identity providers to Azure AD B2C<br>
</a></p>
<p>Configuring the B2C providers is fairly straightforward utilizing the new preview interface <a href="https://docs.microsoft.com/en-us/powerapps/maker/portals/configure/configure-azure-ad-b2c-provider">Configure the Azure Active Directory B2C provider</a>. Make sure you navigate to the preview version of the Maker portal for now to access this, <a href="https://make.preview.powerapps.com/">https://make.preview.powerapps.com/</a>.</p>
<p>For this article my goals were the following.</p>
<ul>
<li>Set the existing Azure AD and Local Login configuration as deprecated authentication mechanisms within the portal to migrate the users to B2C.</li>
<li>Allow users to authenticate to B2C using Azure AD or Google, or to create a local B2C account.</li>
</ul>
<p><img src="https://user-images.githubusercontent.com/7444929/148592406-10108368-93ab-4308-a721-23f4412a8a22.png" alt="Original log in screen" title="Original Login Screen"></p>
<h2 id="deprecate-old-providers">Deprecate Old Providers</h2>
<p>Once I had run through the <a href="https://docs.microsoft.com/en-us/powerapps/maker/portals/configure/configure-azure-ad-b2c-provider">instructions</a> for configuring Azure B2C authentication, I marked the Local Login and Azure AD authentication methods as deprecated. This ensures that when existing users log into the portal using those methods, they will be asked to migrate their account to B2C.</p>
<p><img src="https://user-images.githubusercontent.com/7444929/148592949-b3d4f3ad-e5e2-48a4-92b3-624f5d7a14c8.png" alt="Account Migration" title="Account Migration Screen"></p>
<p>Deprecation of the old providers can be done through the Portal Management model app within the Site Settings.</p>
<h3 id="set-local-login-authentication-as-deprecated">Set Local Login Authentication as Deprecated</h3>
<p>The site setting for deprecating local authentication was already present in my site settings, so I set the value to true.<br>
<img src="https://user-images.githubusercontent.com/7444929/148593379-6f13653c-de3b-4c42-a140-085b5a8facd2.png" alt="Deprecate Local Login" title="Deprecate Local Login"></p>
<table>
<thead>
<tr>
<th>Name</th>
<th>Value</th>
</tr>
</thead>
<tbody>
<tr>
<td>Authentication/Registration/LocalLoginDeprecated</td>
<td>true</td>
</tr>
</tbody>
</table><h3 id="set-azure-ad-authentication-as-deprecated">Set Azure AD Authentication as Deprecated</h3>
<p>To deprecate other providers you need to create the site settings for them and set the value to true. The format for these values is:<br>
<code>Authentication/[protocol]/[provider]/Deprecated</code></p>
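<p>For illustration, the naming convention can be expressed as a tiny helper (Python used here purely to show the string format):</p>

```python
def deprecation_setting(protocol: str, provider: str) -> str:
    """Compose the portal site-setting name that marks an identity
    provider as deprecated: Authentication/[protocol]/[provider]/Deprecated."""
    return f"Authentication/{protocol}/{provider}/Deprecated"

# The Azure AD entry uses the OpenIdConnect protocol.
print(deprecation_setting("OpenIdConnect", "AzureAD"))
# → Authentication/OpenIdConnect/AzureAD/Deprecated
```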
<p><img src="https://user-images.githubusercontent.com/7444929/148593704-28e66710-1b6b-4782-a7f6-5775b25ede35.png" alt="Deprecate Azure AD Authentication" title="Deprecate Azure AD Authentication"></p>
<table>
<thead>
<tr>
<th>Name</th>
<th>Value</th>
</tr>
</thead>
<tbody>
<tr>
<td>Authentication/OpenIdConnect/AzureAD/Deprecated</td>
<td>true</td>
</tr>
</tbody>
</table><h3 id="google-identity-provider">Google Identity Provider</h3>
<p>Setting up the Google identity provider was easy and the instructions provided worked without any issues. See <a href="https://docs.microsoft.com/en-us/azure/active-directory-b2c/identity-provider-google?pivots=b2c-user-flow">Set up sign-up and sign-in with a Google account using Azure Active Directory B2C</a></p>
<h3 id="azure-ad-provider">Azure AD Provider</h3>
<p>The instructions for <a href="https://docs.microsoft.com/en-us/azure/active-directory-b2c/identity-provider-azure-ad-single-tenant?pivots=b2c-user-flow">Adding an Azure Active Directory provider to Azure Active Directory B2C</a> mostly worked, but a few items were missing to get it working correctly with Power Apps Portals.</p>
<p>If you don’t complete the additional steps you will end up with users in your B2C tenant who do not have an email address assigned to them. Additionally, the person’s email, first name, and last name will not be provided to the portal, which results in the following error screen when new users attempt to register.<br>
<img src="https://user-images.githubusercontent.com/7444929/148596769-659b9c43-3bfb-42c8-a921-20c99063bfdc.png" alt="Email field is required" title="Email field is required"></p>
<p>The first thing we need to do after creating the Azure AD provider app registration is to update the token configuration. This will ensure that email, first name, and last name are included correctly in the token.</p>
<ul>
<li>Navigate to the directory in the Azure Portal where your Azure AD lives.</li>
<li>Create the app registration for the Azure AD identity provider using the instructions found <a href="https://docs.microsoft.com/en-us/azure/active-directory-b2c/identity-provider-azure-ad-single-tenant?pivots=b2c-user-flow">here</a>.</li>
<li>Under Manage click <strong>Token configuration</strong></li>
<li>Click <strong>Add optional claim</strong> button</li>
<li>Select the <strong>Token type</strong> of <strong>ID</strong></li>
<li>Click the check boxes next to <strong>email, family_name, given_name</strong></li>
<li>Click the <strong>Add</strong> button</li>
<li>You will receive a message that the optional claims will require additional API permissions. Click the <strong>Turn on the Microsoft Graph email permission (required for claims to appear in token)</strong> checkbox and click the <strong>Add</strong> button.<br>
<img src="https://user-images.githubusercontent.com/7444929/148598076-37a3f107-434d-4c59-a47c-b29c50dedb4a.png" alt="Token configuration"></li>
</ul>
<p>Next we must ensure that the API permissions that were added have admin consent.</p>
<ul>
<li>Under Manage click the <strong>API permissions</strong></li>
<li>Click <strong>Grant admin consent for </strong> button</li>
<li>Click Yes when asked to grant admin consent<br>
<img src="https://user-images.githubusercontent.com/7444929/148598239-d3d804fd-c87a-4e7e-a4b5-d1f9d9cb6f92.png" alt="Grant Admin Consent"></li>
</ul>
<p>I also found issues where the B2C configuration’s redirect URI used the tenant ID instead of the domain name, so I added an extra URI for that address.</p>
<ul>
<li>Under manage click <strong>Authentication</strong></li>
<li>You should already have one Redirect URI listed (eg. <a href="https://domainb2c.b2clogin.com/domainb2c.onmicrosoft.com/oauth2/authresp">https://domainb2c.b2clogin.com/domainb2c.onmicrosoft.com/oauth2/authresp</a>) which you added when going through the instructions for creating the app registration. Click the <strong>Add URI</strong> button to add another.</li>
<li>Add a second URI that utilizes the Tenant ID of the B2C directory instead of <a href="http://domainb2c.onmicrosoft.com">domainb2c.onmicrosoft.com</a> (eg. <a href="https://domainb2c.b2clogin.com/a89ff66d-26c2-4407-9096-b216ce8b6a10/oauth2/authresp">https://domainb2c.b2clogin.com/a89ff66d-26c2-4407-9096-b216ce8b6a10/oauth2/authresp</a>)</li>
</ul>
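<p>The two redirect URI forms can be sketched like this (Python for illustration; the tenant name and ID below are the example values used above):</p>

```python
def b2c_redirect_uris(tenant_name: str, tenant_id: str) -> list[str]:
    """Both redirect URI forms for the B2C app registration: one based on
    the onmicrosoft.com domain, one based on the B2C tenant ID."""
    base = f"https://{tenant_name}.b2clogin.com"
    return [
        f"{base}/{tenant_name}.onmicrosoft.com/oauth2/authresp",
        f"{base}/{tenant_id}/oauth2/authresp",
    ]

print(b2c_redirect_uris("domainb2c", "a89ff66d-26c2-4407-9096-b216ce8b6a10"))
```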
<p>Finally we need to update the Sign In/Sign Up user flow created during the B2C Portal setup.</p>
<ul>
<li>Navigate to the directory in the Azure Portal where your Azure B2C lives.</li>
<li>Within Azure services click on <strong>Azure AD B2C</strong></li>
<li>Under the policies area click <strong>User flows</strong></li>
<li>You should see two user flows. Select the one containing the text <strong>signupsignin</strong><br>
<img src="https://user-images.githubusercontent.com/7444929/148599190-74834b00-0555-490e-a5bc-d1a1485136a0.png" alt="Select User Flow"></li>
<li>Click on Identity providers and ensure that you have selected the new Identity providers you have created. After selecting them click the <strong>Save</strong> button<br>
<img src="https://user-images.githubusercontent.com/7444929/148599359-3dc1e68c-5ee7-4cac-ad43-47425b0edbc6.png" alt="Choose Identity providers"></li>
<li>Click on Application claims and select the Display Name, Email Addresses, Given Name, and Surname attributes then click the Save button.<br>
<img src="https://user-images.githubusercontent.com/7444929/148599552-c2d34564-a59c-4ea7-b6c4-57146117d068.png" alt="Select Application claims"></li>
</ul>
<p>Now when a user attempts to register using your AD provider the email, first name and last name will all be passed to the Portal and show up on the profile page after the user has logged in.</p>
<p><img src="https://user-images.githubusercontent.com/7444929/148600720-4b044ee1-6b12-43cf-8e40-a96328cfb66b.png" alt="B2C Login Page"><br>
<img src="https://user-images.githubusercontent.com/7444929/148600771-89a76665-4662-424a-85f7-7f5a342ec09e.png" alt="Profile Page After Login"></p>
<h2 id="invitation-info">Invitation Info</h2>
<p>One thing I discovered through all this was that the Invitation system still works correctly after moving to B2C. I was able to create Invitations for contacts and redeem those invitations with the B2C provider in exactly the same manner as with the other providers.</p>
Rick A. Wilson (RAW)http://www.blogger.com/profile/03562616659850164140noreply@blogger.com1tag:blogger.com,1999:blog-8675696861245191896.post-44262017409718689022021-08-16T09:45:00.002-04:002021-08-16T09:45:15.071-04:00Issue Connecting Through Remote Desktop Gateway<p>While attempting to connect to a remote desktop through a Remote Desktop Gateway (RDG) the connection would ask for credentials but then just drop. After some searching I found the answer to the issue.</p>
<p>First, open the regedit utility.</p>
<p>Navigate to <strong>HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Terminal Server Client</strong></p>
<p>If there is a DWORD key called RDGClientTransport, set its value to 1.</p>
<p>If that key is not there, right click and select <strong>New -&gt; DWORD (32-bit) Value</strong> from the context menu. Set the name of the key to <strong>RDGClientTransport</strong> and its value to <strong>1</strong>.</p>
<p><img src="https://user-images.githubusercontent.com/7444929/127191752-48ce74f1-2b7e-463c-b30f-8cf40b72ba4b.png" alt="Set RDGClientTransport Key"></p>
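<p>Alternatively, the same change can be applied by importing a .reg file like this (equivalent to the manual steps above):</p>

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Terminal Server Client]
"RDGClientTransport"=dword:00000001
```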
Rick A. Wilson (RAW)http://www.blogger.com/profile/03562616659850164140noreply@blogger.com3tag:blogger.com,1999:blog-8675696861245191896.post-38163651286509458962021-06-29T16:24:00.001-04:002021-06-29T16:24:47.821-04:00Power Automate Desktop Example - Downloading Image From Email To Network Drive<p><img src="https://user-images.githubusercontent.com/7444929/123496929-b928da00-d5f8-11eb-991b-e216185ec8dc.png" alt="2021-06-25_20-38-09"></p>
<p>After watching the RPA (Robotic Process Automation) demo from Steve Winward, <a href="https://www.youtube.com/watch?v=r6f0m1Bn878">Real-Life Use case using AI Builder Form Processing with Power Automate Desktop</a>, I have been looking for ways to utilize <a href="https://flow.microsoft.com/en-us/desktop">Power Automate Desktop Flows</a> in order to make my life a little easier. Yesterday, like every day, I started clicking through the emails that I receive from my kids’ schools. I open each email, download the image, then move the image to a network drive on my computer. This takes me a few minutes every day but my kids are worth it :) This scenario was a perfect one for me to automate using Desktop Flows since they can access the network drive on my desktop.</p>
<p>The basic flow of the solution will be as follows:</p>
<ul>
<li>A Power Automate Cloud Flow will be triggered to run whenever a new email comes into my Gmail account with a specific tag.</li>
<li>The Cloud Flow will gather up all the information needed to run the Desktop Flow.</li>
<li>The Cloud Flow will call the Desktop Flow by connecting through an On-Premise Data Gateway that is installed on my desktop.</li>
<li>The Desktop Flow will download the image and save it to my network drive.</li>
</ul>
<h2 id="prerequisites">Prerequisites</h2>
<p>In order to complete all this you will need the following.</p>
<ul>
<li>Power Apps Environment. If you don’t have one you can try it out by signing up for a Developer Plan <a href="https://powerapps.microsoft.com/en-us/developerplan/">here</a>.</li>
<li>Power Automate Desktop. This needs to be installed on your computer, download it <a href="https://flow.microsoft.com/en-us/desktop">here</a>.</li>
<li>On-Premise Data Gateway. This is how the online Power Automate cloud flows will connect to your Desktop Flows. You can download it <a href="https://www.microsoft.com/en-us/download/details.aspx?id=53127">here</a>. Instructions for installation can be found <a href="https://docs.microsoft.com/en-us/data-integration/gateway/service-gateway-install">here</a>.</li>
<li>Application registration for Gmail within the Google console. You cannot use the default shared application because it’s not compatible with the Encodian actions or any other external Flow actions. Learn how to create this <a href="https://docs.microsoft.com/en-us/connectors/gmail/#creating-an-oauth-client-application-in-google">here</a>.</li>
<li>API key for Encodian to utilize their Regex action in your flow. You can sign up for one <a href="https://www.encodian.com/products/flowr/#form">here</a></li>
</ul>
<h2 id="inspecting-the-email">Inspecting the Email</h2>
<p>For this example I’m using emails supplied by a service called Tadpole, which my kids’ school uses to send out images and notifications. The first thing I had to do was use the developer tools (F12) within the browser to help me understand how the URLs for the service were formatted and how they worked. After searching through the HTML I figured out the formatting and found that if I used the d=t parameter found in one of the links, it would download the full-size image.<br>
<img src="https://user-images.githubusercontent.com/7444929/123862479-613eeb80-d8f6-11eb-968b-fb25a2887e11.png" alt="2021-06-25_19-32-46"></p>
<h2 id="add-gmail-label-and-filter">Add Gmail Label and Filter</h2>
<p>The first thing I did was go into my Gmail and create a new label for the incoming emails. The name of the service the school uses is Tadpole, so I made that my label. I then created a filter so that all emails coming from the Tadpole address would get that label applied. All of this is important because otherwise your Flow will run on everything in your inbox, which could be a lot of messages, and you could end up running into API limits for Flow depending on your license.</p>
<p><img src="https://user-images.githubusercontent.com/7444929/123493442-73fdab80-d5ea-11eb-8882-e7aab0a1b7c6.png" alt="2021-06-25_15-23-53"></p>
<h2 id="create-the-desktop-flow">Create the Desktop Flow</h2>
<p>The desktop flow will download the image using its URL and save it to the network drive. To get started, open the Power Automate Desktop application you have installed and create a new Flow.</p>
<p>The first thing we need to do is define the input variables we will use. We will pass data into these variables from the Cloud Flow we create later.<br>
<img src="https://user-images.githubusercontent.com/7444929/123493549-c212af00-d5ea-11eb-83cf-439051dfa1e8.png" alt="2021-06-25_15-30-37"></p>
<p>Now we can start adding actions to the Desktop flow. The first one will be a Convert text to datetime action, which transforms the input variable from the flow into a datetime and makes the string formatting easier.<br>
<img src="https://user-images.githubusercontent.com/7444929/123493611-01410000-d5eb-11eb-8a8e-bed92c140955.png" alt="2021-06-25_15-34-46"></p>
<p>Next we will do some formatting on our date time to create flow variables which we will use later.<br>
<img src="https://user-images.githubusercontent.com/7444929/123493622-0c942b80-d5eb-11eb-95ea-38d0216e3765.png" alt="2021-06-25_15-37-10"></p>
<p><img src="https://user-images.githubusercontent.com/7444929/123493660-26357300-d5eb-11eb-8510-d73d45b4b56b.png" alt="2021-06-25_15-39-25"></p>
<p>Check to see if the network drive folder exists and if not create it.<br>
<img src="https://user-images.githubusercontent.com/7444929/123493697-3f3e2400-d5eb-11eb-8f2e-b8c84a696c36.png" alt="2021-06-25_15-42-17"></p>
<p><img src="https://user-images.githubusercontent.com/7444929/123493713-4e24d680-d5eb-11eb-84d6-923c957af126.png" alt="2021-06-25_15-45-39"></p>
<p>Finally we will download the file using the url provided as an input variable and save it to our network drive folder.<br>
<img src="https://user-images.githubusercontent.com/7444929/123493760-73b1e000-d5eb-11eb-8f67-373abdb36ccd.png" alt="2021-06-25_15-48-55"></p>
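<p>For readers who prefer code to screenshots, the Desktop Flow logic amounts to roughly the following Python sketch. Note this is an approximation: the dated folder and file naming scheme here is my own assumption, and the real flow uses Power Automate Desktop actions rather than code.</p>

```python
from datetime import datetime
from pathlib import Path
from urllib.request import urlretrieve

def build_target_path(network_root: str, received: str, child_name: str) -> Path:
    """Parse the datetime passed in from the cloud flow and build a dated
    file path on the network drive, creating the folder if needed.
    The folder/file naming convention is illustrative only."""
    dt = datetime.fromisoformat(received)
    folder = Path(network_root) / child_name / dt.strftime("%Y-%m")
    folder.mkdir(parents=True, exist_ok=True)  # the "if folder doesn't exist, create it" check
    return folder / f"{dt.strftime('%Y-%m-%d_%H-%M-%S')}.jpg"

def save_image(network_root: str, received: str, child_name: str, image_url: str) -> Path:
    """Download the image to the computed path (the final download step)."""
    target = build_target_path(network_root, received, child_name)
    urlretrieve(image_url, target)
    return target
```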
<p>We have now successfully created our Desktop flow. Make sure you save it and then you can test it by clicking the Run button in the editor.<br>
<img src="https://user-images.githubusercontent.com/7444929/123493782-8fb58180-d5eb-11eb-8ca6-f0e76e48efbe.png" alt="2021-06-25_15-51-50"></p>
<h2 id="create-the-cloud-flow">Create the Cloud Flow</h2>
<p>Now that we have created our Desktop Flow we need to run it any time an email arrives in our Tadpole inbox. This flow also needs to extract the URL for the image and pass that information to the Desktop Flow.</p>
<p>Here is a high level outline of the Flow we are going to build.<br>
<img src="https://user-images.githubusercontent.com/7444929/123493050-4fed9a80-d5e9-11eb-9b23-1d8c51ec76c9.png" alt="2021-06-25_17-08-39"></p>
<p>Now let’s get started building! We will start by creating a solution in the maker portal.<br>
<img src="https://user-images.githubusercontent.com/7444929/123493114-84f9ed00-d5e9-11eb-86fc-9d90082fe2ec.png" alt="2021-06-25_16-08-48"></p>
<p>Add a new Cloud Flow.<br>
<img src="https://user-images.githubusercontent.com/7444929/123493154-a5c24280-d5e9-11eb-8399-9490a6296802.png" alt="2021-06-25_16-12-41"></p>
<p>Setup for our Gmail trigger. Again if you haven’t created a new Application within the Google console you should do that now using the instructions located <a href="https://docs.microsoft.com/en-us/connectors/gmail/#creating-an-oauth-client-application-in-google">here</a>.<br>
<img src="https://user-images.githubusercontent.com/7444929/123493166-afe44100-d5e9-11eb-8da6-69d15274ad66.png" alt="2021-06-25_16-14-53"></p>
<p><img src="https://user-images.githubusercontent.com/7444929/123493197-c4283e00-d5e9-11eb-8b68-92c885f76b76.png" alt="2021-06-25_16-24-09"></p>
<p><img src="https://user-images.githubusercontent.com/7444929/123493181-ba063f80-d5e9-11eb-8398-2554f1e6be75.png" alt="2021-06-25_16-21-51"></p>
<p>All the emails that contain images have the kids’ names at the beginning of the subject, so I will set up a condition to make sure that this is an image email and also extract that data so I can use it to save the image later.<br>
<img src="https://user-images.githubusercontent.com/7444929/123493239-dace9500-d5e9-11eb-9e0b-1cdaa11ef310.png" alt="2021-06-25_16-27-59"></p>
<p><img src="https://user-images.githubusercontent.com/7444929/123493249-e1f5a300-d5e9-11eb-9b5d-250b57380912.png" alt="2021-06-25_16-32-34"></p>
<p>We need to get all the URLs contained in the email, so we will use the Encodian Regex action here to find them. Here is the Regex you will need to find all the URLs in the body of the email.</p>
<pre><code>(?:(?:https?|ftp):\/\/|\b(?:[a-z\d]+\.))(?:(?:[^\s()<>]+|\((?:[^\s()<>]+|(?:\([^\s()<>]+\)))?\))+(?:\((?:[^\s()<>]+|(?:\([^\s()<>]+\)))?\)|[^\s`!()\[\]{};:'".,<>?«»“”‘’]))?
</code></pre>
<p><img src="https://user-images.githubusercontent.com/7444929/123493265-efab2880-d5e9-11eb-980b-8444699df253.png" alt="2021-06-25_16-37-10"></p>
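<p>If you want to sanity-check the pattern locally before wiring up the Encodian action, here it is as a Python raw string (the sample email body below is made up):</p>

```python
import re

# The same URL pattern, split across lines for readability.
URL_PATTERN = re.compile(
    r"(?:(?:https?|ftp)://|\b(?:[a-z\d]+\.))"
    r"(?:(?:[^\s()<>]+|\((?:[^\s()<>]+|(?:\([^\s()<>]+\)))?\))+"
    r"(?:\((?:[^\s()<>]+|(?:\([^\s()<>]+\)))?\)|[^\s`!()\[\]{};:'\".,<>?«»“”‘’]))?"
)

def extract_urls(body: str) -> list[str]:
    """Return every match in the email body. Note the pattern is loose:
    besides full links it may also pick up bare "word." tokens."""
    return [m.group(0) for m in URL_PATTERN.finditer(body)]

urls = extract_urls("New photo! View it at https://example.tadpoles.com/m/p?uid=abc123 today.")
```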
<p>Now that we have all the URLs in the message we need to look at them all and see which one is the image. We will also do some parsing of the URL to make sure we get the full-size image by appending the d=t query parameter to the URL.<br>
<img src="https://user-images.githubusercontent.com/7444929/123493360-2d0fb600-d5ea-11eb-9f90-0874e908629a.png" alt="2021-06-25_16-46-10"></p>
<p><img src="https://user-images.githubusercontent.com/7444929/123493373-3a2ca500-d5ea-11eb-967d-20bc5a3200cc.png" alt="2021-06-25_16-49-05"></p>
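<p>The URL adjustment can be mimicked in a few lines (Python sketch; the domain is made up, but the d=t parameter is the one discussed above):</p>

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def with_full_size(url: str) -> str:
    """Append (or overwrite) the d=t query parameter so the link
    returns the full-size image rather than the thumbnail."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query["d"] = "t"
    return urlunparse(parts._replace(query=urlencode(query)))
```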
<p>Last but not least we will connect to the Desktop Flow we created earlier and pass in all the information we have collected from the email.<br>
<img src="https://user-images.githubusercontent.com/7444929/123493394-46186700-d5ea-11eb-908f-7528bb94683f.png" alt="2021-06-25_16-58-56"></p>
<h2 id="test-it">Test It!</h2>
<p>In order to test my flow I added another Gmail filter that would tag any email coming from myself with my kids’ names in the subject. This allowed me to then forward old emails to myself for test processing.</p>
<p><img src="https://user-images.githubusercontent.com/7444929/123495263-0e60ed80-d5f1-11eb-93ba-f880011f78a3.png" alt="2021-06-25_20-05-06"></p>
<p><img src="https://user-images.githubusercontent.com/7444929/123495267-128d0b00-d5f1-11eb-95d9-0b4db5f1a0d4.png" alt="2021-06-25_20-05-43"></p>
Rick A. Wilson (RAW)http://www.blogger.com/profile/03562616659850164140noreply@blogger.com14tag:blogger.com,1999:blog-8675696861245191896.post-45859037285056566032021-06-28T14:16:00.001-04:002021-06-28T14:16:10.731-04:00Model App Access Checker for Dataverse<p><img src="https://user-images.githubusercontent.com/7444929/123683732-91b15780-d81a-11eb-9ae8-badf315269e1.png" alt="2021-06-28_13-53-22"></p>
<p>Want to know which model applications your users have access to in Dataverse? Check out the app access checker that is available within the <a href="https://docs.microsoft.com/en-us/power-platform/admin/admin-guide">Power Platform admin center</a>. Enter a user’s ID or email address and see the list of published apps in your environment along with all the access, license, and security information specific to that user. This can be a very useful tool in troubleshooting why a user cannot see a specific app in your environment.</p>
<p>Below are the different ways you can access the app access checker.</p>
<h2 id="direct-url">Direct Url</h2>
<p>You can access the app access checker directly using a URL. Here is the format.</p>
<pre><code>https://<Your Org>.crm.dynamics.com/WebResources/msdyn_AppAccessChecker.html
</code></pre>
<h2 id="power-platform-admin-center">Power Platform Admin Center</h2>
<ul>
<li>Open the Power Platform admin center <a href="https://admin.powerplatform.microsoft.com/">https://admin.powerplatform.microsoft.com/</a></li>
<li>On the Environments list click the link for the environment you would like to check.<img src="https://user-images.githubusercontent.com/7444929/123683771-9fff7380-d81a-11eb-9e26-2a35f9541c76.png" alt="2021-06-28_13-33-25"></li>
<li>Click on <strong>See all</strong> under the User heading in the Action area. <img src="https://user-images.githubusercontent.com/7444929/123683856-b6a5ca80-d81a-11eb-9240-9fa292e5b1ec.png" alt="2021-06-28_13-26-21"></li>
<li>Click the <strong>app access checker</strong> link located above the Users list.<img src="https://user-images.githubusercontent.com/7444929/123683866-b9a0bb00-d81a-11eb-86cb-1ddeb5eb9bf3.png" alt="2021-06-28_13-26-02"></li>
</ul>
<h2 id="webapi-call">WebAPI Call</h2>
<p>The information returned to the page comes from a single WebAPI call. If you wish to call it yourself and create your own page, you can do so.</p>
<pre><code>https://<Your Org>.crm.dynamics.com/api/data/v9.0/RetrieveUserAppDebugInfo(UserIdOrEmail='testuser2@rawonet.onmicrosoft.com')
</code></pre>
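<p>If you are generating this URL from a script, remember to percent-encode the email address. A small Python sketch (the host name is a placeholder):</p>

```python
from urllib.parse import quote

def app_access_debug_url(org_host: str, user_id_or_email: str) -> str:
    """Build the unbound-function URL shown above; quote() percent-encodes
    characters like @ in the email address."""
    return (
        f"https://{org_host}/api/data/v9.0/"
        f"RetrieveUserAppDebugInfo(UserIdOrEmail='{quote(user_id_or_email)}')"
    )

print(app_access_debug_url("yourorg.crm.dynamics.com", "testuser2@rawonet.onmicrosoft.com"))
```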
Rick A. Wilson (RAW)http://www.blogger.com/profile/03562616659850164140noreply@blogger.com9tag:blogger.com,1999:blog-8675696861245191896.post-31691242233243344632021-06-25T08:39:00.001-04:002021-06-25T08:39:45.051-04:00Enable the Export to PDF Ribbon Button<p><img src="https://user-images.githubusercontent.com/7444929/123333167-658e9180-d50f-11eb-91e6-1ce770308f59.png" alt="2021-06-24_17-09-43"></p>
<p>Dataverse allows you to easily generate your Word Templates as <a href="https://docs.microsoft.com/en-us/dynamics365/sales-enterprise/create-quote-pdf">PDF files</a>. Last year they expanded the functionality beyond just the out-of-the-box sales entities to all custom entities. In order to enable the functionality, though, you need to do some configuration. Below are the different methods by which you can turn PDF generation on or off for entities.</p>
<p><strong>Note</strong>: After updating these settings by any of these methods, make sure you <a href="https://support.microsoft.com/en-us/windows/microsoft-edge-browsing-data-and-privacy-bb8174ba-9d73-dcf2-9b4a-c582b4e640dd">clear your browser data</a>. The ribbon stores the values for which entities are enabled and disabled within the browser session. If you don’t do this you may not see the PDF generation buttons show up after you enable them.</p>
<h2 id="method-1-sales-hub-installed">Method 1 (Sales Hub Installed):</h2>
<p>For those orgs that have the Sales Hub app this is fairly straightforward.</p>
<ul>
<li>Navigate to the Apps page for your organization<br>
<code>https://<Your Org>.crm.dynamics.com/apps</code></li>
<li>Open the Sales Hub app</li>
<li>Navigate to the App settings Area in the sitemap and click on Overview under the General group. <a href="https://docs.microsoft.com/en-us/dynamics365/sales-enterprise/admin-settings-overview">Microsoft Instructions</a></li>
<li>In the Overview click the Manage link next to Convert to PDF <a href="https://docs.microsoft.com/en-us/dynamics365/sales-enterprise/enable-pdf-generation-quote">Microsoft Instructions</a></li>
<li>Choose the entities you wish to enable PDF generation for and click Save.</li>
</ul>
<h2 id="method-2-sales-hub-installed">Method 2 (Sales Hub Installed)</h2>
<p>If you have the Sales solution installed you can access the Convert to PDF page using the following URL, which you could put in the sitemap.</p>
<p><code>https://<Your Org>.crm.dynamics.com/main.aspx?pagetype=control&controlName=MscrmControls.FieldControls.CCFadminsettings&data={"id":"overview","ismanage":"overview"}</code></p>
<h2 id="method-3-xrmtoolbox">Method 3 (XrmToolBox)</h2>
<p>If you don’t have sales installed on your system you can still enable this feature but it takes a bit more work. The easiest way is to utilize some of the tools in XrmToolBox.</p>
<p>First we will open FetchXML Builder and retrieve the pdfsetting entity; we will need to specify the pdfsettingid and pdfsettingsjson attributes. This will return a single record. Copy the outputs of these fields to your text editor.</p>
<p><img src="https://user-images.githubusercontent.com/7444929/123332434-7c80b400-d50e-11eb-9c85-a7bcccd1b45c.png" alt="2021-06-24_13-52-57"></p>
<p>Next I will utilize the WebAPI Launcher to update the pdfsetting entity with our updated JSON, which will contain all the entities you want to enable for PDF generation.</p>
<p><img src="https://user-images.githubusercontent.com/7444929/123332462-84d8ef00-d50e-11eb-9d55-d7101b2b611d.png" alt="2021-06-24_15-19-38"></p>
<h2 id="method-4-powershell">Method 4 (PowerShell)</h2>
<p>Because the PDF settings are stored per environment you may want to update them as part of your ALM. One way of doing this would be to run a PowerShell script which sets them during deployment. To learn more about how to connect to Dataverse using PowerShell check out <a href="https://www.richardawilson.com/2021/06/calling-dataverse-web-api-in-powershell.html">this article</a>. The following Invoke-RestMethod calls get the existing pdfsetting entity and then update that entity with the JSON you want.</p>
<pre><code>##########################################################
# GET THE PDF SETTINGS ENTITY
##########################################################
$pdfSettingRetrieveUriParams = '$select=pdfsettingid,pdfsettingsjson&$top=1'
$pdfSettingRetrieveParams =
@{
    URI = "$($dataverseEnvUrl)/api/data/v9.1/pdfsettings?$($pdfSettingRetrieveUriParams)"
    Headers = @{
        "Authorization" = "$($authResponse.token_type) $($authResponse.access_token)"
    }
    Method = 'GET'
}
# Get the pdf settings entity
$pdfSettingsRetrieveResponse = Invoke-RestMethod @pdfSettingRetrieveParams -ErrorAction Stop
# The response value is an array; take the first (and only) record
$pdfSettingsValue = $pdfSettingsRetrieveResponse.value[0]
##########################################################
# UPDATE THE PDF SETTINGS ENTITY
##########################################################
# pdfsettingsjson holds a JSON string listing the enabled entities
$pdfSettingJSON = '{ "pdfsettingsjson" : "{\"contact\": true, \"account\": true}" }'
$pdfSettingUpdateParams =
@{
    URI = "$($dataverseEnvUrl)/api/data/v9.1/pdfsettings($($pdfSettingsValue.pdfsettingid))"
    Headers = @{
        "Authorization" = "$($authResponse.token_type) $($authResponse.access_token)"
    }
    Method = 'PATCH'
    Body = $pdfSettingJSON
    ContentType = "application/json"
}
# Update the pdf settings entity
$pdfSettingsUpdateResponse = Invoke-RestMethod @pdfSettingUpdateParams -ErrorAction Stop
</code></pre>
Rick A. Wilson (RAW)http://www.blogger.com/profile/03562616659850164140noreply@blogger.com3tag:blogger.com,1999:blog-8675696861245191896.post-27921805314556310572021-06-23T11:52:00.001-04:002021-06-23T11:52:14.290-04:00Open Model Apps Url Using Unique Name<p><img src="https://user-images.githubusercontent.com/7444929/123127614-727c8980-d418-11eb-9f01-8731c4896e78.png" alt="2021-06-23_11-14-09"></p>
<p>When generating links for records, lists or reports in a Dataverse environment it is important that they open the specific application they relate to so users have the best experience. To see more details about generating links for Dataverse <a href="https://docs.microsoft.com/en-us/dynamics365/customerengagement/on-premises/developer/open-forms-views-dialogs-reports-url?view=op-9-1">click here.</a></p>
<p><img src="https://user-images.githubusercontent.com/7444929/123127398-3ba67380-d418-11eb-9eff-85b6f39d2ed5.png" alt="2021-06-23_10-59-18"><br>
This image shows the message bar displayed within Dataverse when you open a link not directed to a specific application.</p>
<p>Previously, in order to open a specific application using a link you had to create it with the <a href="https://docs.microsoft.com/en-us/dynamics365/customerengagement/on-premises/customize/manage-access-apps-security-roles?view=op-9-1">app suffix url</a> or append the appid parameter to the url. In order to get either of those dynamically you need to query the Model-driven Apps (appmodule) entity and return the url attribute for the suffix or the appmoduleid attribute for the app id.</p>
<p>Using App Suffix</p>
<p><code>https://<Your Org>.crm.dynamics.com/apps/<Your App Suffix>/main.aspx?pagetype=entitylist&etn=contact</code></p>
<p>or</p>
<p>Using App Id</p>
<p><code>https://<Your Org>.crm.dynamics.com/main.aspx?appid=82853804-d2b3-4536-ba75-f49ccca681ea&pagetype=entitylist&etn=contact</code></p>
<p>Recently when creating a new App in the maker portal I saw the creation screen now includes a Unified Interface URL which populates when you set the name of the app. The format of the url looks like this.</p>
<p><code>https://<Your Org>.crm.dynamics.com/Apps/uniquename/<Your App Unique Name>/main.aspx?pagetype=entitylist&etn=contact</code></p>
<p>Because the unique name does not change between environments you can now eliminate any calls you previously made to the Model-driven App entity. You may still need to get the host URL for the environment if using Power Apps or Power Automate.</p>
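<p>Building such a link is simple string composition; a Python sketch (the app unique name here is a made-up example):</p>

```python
def app_entity_list_url(org_host: str, app_unique_name: str, entity: str) -> str:
    """Unified Interface URL addressed by the app's unique name, which is
    stable across environments (unlike the appid), pointing at an entity
    list page."""
    return (
        f"https://{org_host}/Apps/uniquename/{app_unique_name}"
        f"/main.aspx?pagetype=entitylist&etn={entity}"
    )
```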
<p>To get the url in Power Automate you can make a call to any Dataverse entity and then parse out the url from the @odata.id value of any record you return.</p>
<p><code>uriHost(outputs('Get_CDS_Record')?['body/value'][0]?['@odata.id'])</code></p>
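<p>For reference, this is what that expression extracts, shown as an equivalent Python snippet:</p>

```python
from urllib.parse import urlparse

def uri_host(odata_id: str) -> str:
    """Equivalent of Power Automate's uriHost(): return the host portion
    of a record's @odata.id URL."""
    return urlparse(odata_id).netloc
```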
<p>To get the URL in Power Apps you have some options. You can call a Power Automate flow which will call out to a CDS record and return the host URL using the same method I described earlier, or you can utilize the <a href="https://pcf.gallery/cds-environment-url/">CDS Environment URL</a> PCF component from Dan Cox.</p>
Rick A. Wilson (RAW)http://www.blogger.com/profile/03562616659850164140noreply@blogger.com12tag:blogger.com,1999:blog-8675696861245191896.post-91868755360306592532021-06-22T14:44:00.001-04:002021-06-22T14:44:28.443-04:00Return Error in Power Automate When Using Try/Catch Scopes<p>When utilizing scopes within Power Automate to create a try/catch/finally statement it can be useful to provide additional details about any errors that occurred within the try block. The example below shows how to get the results of a try block after it has failed and return that information.</p>
<p>To replicate this do the following:</p>
<ul>
<li>Add a <strong>Control - Scope</strong> action called ‘Try’</li>
<li>Add another <strong>Control - Scope</strong> action below Try called ‘Catch’</li>
<li>Click the (…) on the Catch action and select the <strong>Configure after run settings</strong>. Then click the ‘has failed’ checkbox.</li>
<li>Follow the screen shot below which will get the results array of the Try block then filter it down to the Failed result. You can then utilize the filtered result to return errors.</li>
</ul>
<p><img src="https://user-images.githubusercontent.com/7444929/122980038-37ba1900-d366-11eb-9283-b722ac24ebdd.png" alt="2021-06-22_13-50-06"></p>
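<p>Conceptually, the Filter array step applies this predicate to the result('Try') array (a Python stand-in with a simplified result shape, just to show the idea):</p>

```python
def failed_actions(try_results: list) -> list:
    """Keep only the action results whose status is 'Failed' -- the same
    filter the 'Filter array' action applies to result('Try')."""
    return [r for r in try_results if r.get("status") == "Failed"]

sample = [
    {"name": "List_rows", "status": "Succeeded"},
    {"name": "Update_row", "status": "Failed", "error": {"message": "Record not found"}},
]
```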
<p>The image below shows the output after a completed run. We can now see the action name which failed as well as the error message. In this scenario I am using that information to populate a JSON object which will be used later for returning information back to a Power App.</p>
<p><img src="https://user-images.githubusercontent.com/7444929/122977991-02143080-d364-11eb-8326-7cd42369dd68.png" alt="2021-06-22_13-59-51"></p>
<p>If you would like to return additional information from the Failed result test the flow in a way which will force it to fail then look at the flow run history. Expand the Catch block and click on the Filter Result for Failed action. You can see the body of what is returned in the result and utilize any of that additional information in generating your error message.</p>
<p><img src="https://user-images.githubusercontent.com/7444929/122981846-32f66480-d368-11eb-9176-8aef3509c830.png" alt="2021-06-22_14-42-10"></p>
Rick A. Wilson (RAW)http://www.blogger.com/profile/03562616659850164140noreply@blogger.com0tag:blogger.com,1999:blog-8675696861245191896.post-21529715337221331572021-06-09T11:38:00.002-04:002021-06-09T11:38:05.891-04:00Custom Process Action vs Custom API in Dataverse<p><img src="https://github.com/rwilson504/Blogger/blob/master/custom-process-action-vs-custom-api/customprocessvscustomapiheader.png?raw=true" alt="Custom Process Action vs Custom API in Dataverse"></p>
<p>I recently had the opportunity to utilize the new <a href="https://docs.microsoft.com/en-us/powerapps/developer/data-platform/custom-api">Custom API</a> functionality within Dataverse. I had previously used <a href="https://docs.microsoft.com/en-us/powerapps/maker/data-platform/create-actions">Custom Process Actions</a> and was a little confused as to the difference and why I would want to use the Custom API functionality. After digging through the documentation I finally discovered the major difference is this…</p>
<p><img src="https://github.com/rwilson504/Blogger/blob/master/custom-process-action-vs-custom-api/customactionvscustomapi.png?raw=true" alt="Custom Process Action vs Custom API"></p>
<p>The use case I was working on only returned data to the user, so the Custom API allowed me to create a Function rather than an Action. This made it much easier to test my API because I can just put the URL into the web browser and see the results instantly, since it’s only a GET operation.</p>
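<p>For example, a Custom API Function can be exercised with a plain GET request in the browser. The environment URL, function name, and parameter below are hypothetical placeholders, not from my actual project:</p>
<pre><code>https://yourorg.crm.dynamics.com/api/data/v9.1/raw_GetItemStatus(ItemNumber='ABC-123')
</code></pre>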
<p>There are some additional benefits to utilizing the Custom API as well, such as the ability to require a specific security privilege.</p>
<p>To see all the differences between Custom Process Actions and Custom API, check out this <a href="https://docs.microsoft.com/en-us/powerapps/developer/data-platform/custom-actions#compare-custom-process-action-and-custom-api">article from Microsoft</a>.</p>
<h2 id="headache-alert-make-sure-to-create-custom-api-record-before-deploying-code">Headache Alert! Make Sure To Create Custom API Record Before Deploying Code</h2>
<p>On a side note, most of the articles I found about creating a Custom API talked about creating the code first. This ended up causing me a bit of a headache the first time I tried to deploy my Custom API code with <a href="https://github.com/scottdurow/SparkleXrm/wiki/spkl">spkl</a>. A null reference exception kept being thrown during the registration. I finally realized that I needed to create the Custom API record within my solution before actually attempting to deploy the code. It was a silly mistake but one that cost me about an hour of my life which hopefully you can avoid 😀</p>
Rick A. Wilson (RAW)http://www.blogger.com/profile/03562616659850164140noreply@blogger.com0tag:blogger.com,1999:blog-8675696861245191896.post-54571638895116116312021-06-09T11:35:00.002-04:002021-06-09T11:35:27.783-04:00Calling Dataverse Web API in PowerShell using Client Credentials<p><img src="https://github.com/rwilson504/Blogger/blob/master/call-dataverse-webapi-in-powershell-with-client-credentials/powershellplusdataverse.png?raw=true" alt="PowerShell Plus Dataverse"></p>
<p>Connecting to Dataverse using PowerShell can be very helpful for data migrations and use within Azure DevOps. Connecting to an instance in a non-interactive way can be tricky though. This article will provide the links you need for creating an App registration and adding the app user to your environment. You can then utilize the script provided to make Web API requests, including ones you define using the new <a href="https://docs.microsoft.com/en-us/powerapps/developer/data-platform/custom-api">Custom API</a> functionality now available.</p>
<p>The script was written so that it is not dependent on any outside libraries such as the Microsoft.Xrm.Tooling connector. This is helpful in situations where involving an outside library would slow down your deployment by requiring approval from a change control board.</p>
<p>If utilizing an outside code library is not a concern, you can create a connection to Dataverse utilizing the <a href="https://docs.microsoft.com/en-us/powerapps/developer/data-platform/xrm-tooling/use-powershell-cmdlets-xrm-tooling-connect">Microsoft.Xrm.Tooling</a> connector’s Get-CrmConnection cmdlet.</p>
<h2 id="create-app-registration">Create App Registration</h2>
<p>The first thing to do is create an App Registration within Azure AD for Dataverse. To do this follow the <a href="https://docs.microsoft.com/en-us/powerapps/developer/data-platform/walkthrough-register-app-azure-active-directory">instructions Microsoft has provided</a>.</p>
<h2 id="add-app-user-to-dataverse">Add App User to Dataverse</h2>
<p>Next, add an Application User to Dataverse. To do this follow the <a href="https://docs.microsoft.com/en-us/power-platform/admin/manage-application-users">instructions Microsoft has provided</a>. Application users are also now available within the Power Platform Admin Portal.<br>
<img src="https://github.com/rwilson504/Blogger/blob/master/call-dataverse-webapi-in-powershell-with-client-credentials/appuserinadminportal.png?raw=true" alt="App User Power Platform Admin Portal"></p>
<h2 id="powershell-script">PowerShell Script</h2>
<p>Utilize the script below to get the authorization token and make any Web API calls you want. The Web API call in this script just pulls back the fullname of the first five contacts in the system. If you want to view this file in GitHub <a href="https://github.com/rwilson504/Blogger/blob/master/call-dataverse-webapi-in-powershell-with-client-credentials/CallDataverseWebAPIUsingClientCredentials.ps1">click here</a>.</p>
<pre><code><#
.SYNOPSIS
Connect to Dataverse and run Custom API Function
.NOTES
Author : Richard Wilson
.PARAMETER $oAuthTokenEndpoint
The v2 OAuth endpoint for the App registration. This can be found by opening the App registration and
clicking the Endpoints button in the Overview area. Copy the OAuth 2.0 token endpoint (v2) url.
.PARAMETER $appId
The Application (client) ID of the App registration
.PARAMETER $clientSecret
The client secret generated within the App registration
.PARAMETER $dataverseEnvUrl
The url of the Dataverse environment you want to connect to
#>
param
(
[string] $oAuthTokenEndpoint = 'https://login.microsoftonline.com/{YourTenantId}/oauth2/v2.0/token',
[string] $appId = 'xxxxxxxxxx',
[string] $clientSecret = 'xxxxxxxxxxx',
[string] $dataverseEnvUrl = 'https://{YourEnvironmentId}.crm.dynamics.com'
)
##########################################################
# Access Token Request
##########################################################
# OAuth Body Access Token Request
$authBody =
@{
client_id = $appId;
client_secret = $clientSecret;
# The v2 endpoint for OAuth uses scope instead of resource
scope = "$($dataverseEnvUrl)/.default"
grant_type = 'client_credentials'
}
# Parameters for OAuth Access Token Request
$authParams =
@{
URI = $oAuthTokenEndpoint
Method = 'POST'
ContentType = 'application/x-www-form-urlencoded'
Body = $authBody
}
# Get Access Token
$authRequest = Invoke-RestMethod @authParams -ErrorAction Stop
$authResponse = $authRequest
##########################################################
# Call Dataverse WebAPI using Authentication Token
##########################################################
# Params related to the Dataverse WebAPI call you will be making.
# These need to be in single quotes to ensure they are not expanded.
$uriParams = '$top=5&$select=fullname'
# Parameters for the Dataverse WebAPI call which includes our header
# that carries the access token.
$apiCallParams =
@{
URI = "$($dataverseEnvUrl)/api/data/v9.1/contacts?$($uriParams)"
Headers = @{
"Authorization" = "$($authResponse.token_type) $($authResponse.access_token)"
}
Method = 'GET'
}
# Call the Dataverse WebAPI
$apiCallRequest = Invoke-RestMethod @apiCallParams -ErrorAction Stop
$apiCallResponse = $apiCallRequest
#Output the results
Write-Host $apiCallResponse.value
</code></pre>
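<p>As a side note, the token endpoint returns a JSON body containing <code>token_type</code> and <code>access_token</code>, which the script concatenates into the Authorization header. The same header can be assembled in plain shell (handy in an Azure DevOps bash step); this is a minimal sketch using a stand-in response body rather than a real token:</p>
<pre><code># Stand-in for the JSON body returned by the OAuth token endpoint (not a real token)
AUTH_JSON='{"token_type":"Bearer","expires_in":3599,"access_token":"eyJ0.sample"}'
# Extract the access_token value with sed and build the header
ACCESS_TOKEN=$(printf '%s' "$AUTH_JSON" | sed -n 's/.*"access_token":"\([^"]*\)".*/\1/p')
AUTH_HEADER="Authorization: Bearer $ACCESS_TOKEN"
echo "$AUTH_HEADER"
</code></pre>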
Rick A. Wilson (RAW)http://www.blogger.com/profile/03562616659850164140noreply@blogger.com22tag:blogger.com,1999:blog-8675696861245191896.post-3055013633369330902021-05-24T14:06:00.002-04:002021-05-24T14:06:36.853-04:00Raspberry Pi Pinout Terminal Command<p>When I first started working with the Raspberry Pi I found myself constantly searching for images of the pinout and then printing it. Finally I found there is a simple bash command for all that information when you have the Pi OS installed. 🤦🤦🤦🤦🤦🤦🤦🤦🤦</p>
<pre><code>pinout
</code></pre>
<p><img src="https://user-images.githubusercontent.com/7444929/119388569-7346c200-bc98-11eb-96de-c83d7359b643.png" alt="2021-05-24_13-54-23"></p>
Rick A. Wilson (RAW)http://www.blogger.com/profile/03562616659850164140noreply@blogger.com1tag:blogger.com,1999:blog-8675696861245191896.post-38414697261912053522021-05-24T13:31:00.001-04:002021-05-24T13:31:20.311-04:00.NET Core on Raspberry Pi - Access to the path is denied<p>After installing the .NET Core SDK on my Raspberry Pi I received the following error when attempting to create a new console application using the ‘dotnet new console’ command.</p>
<p><img src="https://user-images.githubusercontent.com/7444929/119385533-3ed10700-bc94-11eb-9df9-ad784bc565d2.png" alt="image"></p>
<pre><code>pi@raspberrypi:~/IoT/FirstProject $ dotnet new console
System.UnauthorizedAccessException: Access to the path '/home/pi/.dotnet/5.0.203.toolpath.sentinel' is denied.
---> System.IO.IOException: Permission denied
--- End of inner exception stack trace ---
at Interop.ThrowExceptionForIoErrno(ErrorInfo errorInfo, String path, Boolean isDirectory, Func`2 errorRewriter)
at Microsoft.Win32.SafeHandles.SafeFileHandle.Open(String path, OpenFlags flags, Int32 mode)
at System.IO.FileStream.OpenHandle(FileMode mode, FileShare share, FileOptions options)
at System.IO.FileStream..ctor(String path, FileMode mode, FileAccess access, FileShare share, Int32 bufferSize, FileOptions options)
at System.IO.FileStream..ctor(String path, FileMode mode, FileAccess access, FileShare share, Int32 bufferSize)
at System.IO.File.Create(String path)
at Microsoft.Extensions.EnvironmentAbstractions.FileWrapper.CreateEmptyFile(String path)
at Microsoft.DotNet.Configurer.FileSystemExtensions.<>c__DisplayClass0_0.<CreateIfNotExists>b__0()
at Microsoft.DotNet.Cli.Utils.FileAccessRetrier.RetryOnIOException(Action action)
at Microsoft.DotNet.Configurer.FileSystemExtensions.CreateIfNotExists(IFileSystem fileSystem, String filePath)
at Microsoft.DotNet.Configurer.FileSentinel.Create()
at Microsoft.DotNet.Configurer.DotnetFirstTimeUseConfigurer.Configure()
at Microsoft.DotNet.Cli.Program.ConfigureDotNetForFirstTimeUse(IFirstTimeUseNoticeSentinel firstTimeUseNoticeSentinel, IAspNetCertificateSentinel aspNetCertificateSentinel, IFileSentinel toolPathSentinel, Boolean isDotnetBeingInvokedFromNativeInstaller, DotnetFirstRunConfiguration dotnetFirstRunConfiguration, IEnvironmentProvider environmentProvider, Dictionary`2 performanceMeasurements)
at Microsoft.DotNet.Cli.Program.ProcessArgs(String[] args, TimeSpan startupTime, ITelemetry telemetryClient)
at Microsoft.DotNet.Cli.Program.Main(String[] args)
</code></pre>
<p>The .NET Core libraries apparently need full access to the user’s home folder. In the following example I am logged into my Raspberry Pi as the default pi user. In order to fix this issue I ran the following commands, which change the owner of the .dotnet folder to the user you specify in the <a href="https://linuxize.com/post/linux-chown-command/">chown</a> command.</p>
<pre><code>cd /home/pi
sudo chown pi .dotnet
</code></pre>
<p>To run this for another user, just replace <code>&lt;USER&gt;</code> with your username.</p>
<pre><code>cd /home/<USER>
sudo chown <USER> .dotnet
</code></pre>
<p>After the changes were made the project was created successfully!</p>
<pre><code>pi@raspberrypi:~/IoT/FirstProject $ dotnet new console
Welcome to .NET 5.0!
---------------------
SDK Version: 5.0.203
Telemetry
---------
The .NET tools collect usage data in order to help us improve your experience. It is collected by Microsoft and shared with the community. You can opt-out of telemetry by setting the DOTNET_CLI_TELEMETRY_OPTOUT environment variable to '1' or 'true' using your favorite shell.
Read more about .NET CLI Tools telemetry: https://aka.ms/dotnet-cli-telemetry
----------------
Installed an ASP.NET Core HTTPS development certificate.
To trust the certificate run 'dotnet dev-certs https --trust' (Windows and macOS only).
Learn about HTTPS: https://aka.ms/dotnet-https
----------------
Write your first app: https://aka.ms/dotnet-hello-world
Find out what's new: https://aka.ms/dotnet-whats-new
Explore documentation: https://aka.ms/dotnet-docs
Report issues and find source on GitHub: https://github.com/dotnet/core
Use 'dotnet --help' to see available commands or visit: https://aka.ms/dotnet-cli
--------------------------------------------------------------------------------------
Getting ready...
The template "Console Application" was created successfully.
Processing post-creation actions...
Running 'dotnet restore' on /home/pi/IoT/IoT.csproj...
Determining projects to restore...
Restored /home/pi/IoT/IoT.csproj (in 533 ms).
Restore succeeded.
</code></pre>
Rick A. Wilson (RAW)http://www.blogger.com/profile/03562616659850164140noreply@blogger.com3tag:blogger.com,1999:blog-8675696861245191896.post-88830123134745853872021-05-24T12:15:00.002-04:002021-05-24T12:15:32.375-04:00Error Connecting to Raspberry Pi in VS Code After Reset<p>While utilizing the <a href="https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.vscode-remote-extensionpack">Remote Development tools</a> to connect to a Raspberry Pi board using SSH, I encountered the following error after doing a full OS reset on the Pi board.</p>
<p><img src="https://user-images.githubusercontent.com/7444929/119375596-feb85700-bc88-11eb-835c-dd2e981f0909.png" alt="Visual Studio Code Error"></p>
<p>The detailed error within the VS Code console was the following.</p>
<pre><code>@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
@       WARNING: POSSIBLE DNS SPOOFING DETECTED!          @
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
The ECDSA host key for raspberrypi.local has changed,
and the key for the corresponding IP address fe80::a25a:b6ea:ae38:25d7%bridge100
is unknown. This could either mean that DNS SPOOFING is happening or the IP
address for the host and its host key have changed at the same time.
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
@    WARNING: REMOTE HOST IDENTIFICATION HAS CHANGED!     @
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
IT IS POSSIBLE THAT SOMEONE IS DOING SOMETHING NASTY!
Someone could be eavesdropping on you right now (man-in-the-middle attack)!
It is also possible that a host key has just been changed.
The fingerprint for the ECDSA key sent by the remote host is
SHA256:14s3gi2hHhIlFobItAxtLB1tlyB1uZFFi/C0ruoS9iI.
Please contact your system administrator.
Add correct host key in /Users/admin/.ssh/known_hosts to get rid of this message.
Offending ECDSA key in /Users/admin/.ssh/known_hosts:1
ECDSA host key for raspberrypi.local has changed and you have requested strict checking.
Host key verification failed.
</code></pre>
<p>To fix this, open File Explorer to the location of the SSH configuration file (the .ssh folder in your user profile). There you will see a file called known_hosts. You can either delete the file completely or edit it and remove the line specific to the device you are connecting to. The next time you connect using VS Code you will be prompted to update the key for that device, which will then add it back to this file.<br>
<img src="https://user-images.githubusercontent.com/7444929/119375670-11cb2700-bc89-11eb-865a-6b5cfbb07bca.png" alt="Delete known_hosts file"></p>
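<p>Alternatively, OpenSSH ships a command that removes the stale entry for you instead of editing known_hosts by hand:</p>
<pre><code># Remove any cached host key for the Pi from your known_hosts file
ssh-keygen -R raspberrypi.local
</code></pre>
<p>The next connection will then prompt you to accept the new key, just as with the manual edit.</p>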
Rick A. Wilson (RAW)http://www.blogger.com/profile/03562616659850164140noreply@blogger.com0tag:blogger.com,1999:blog-8675696861245191896.post-21746422991221702362021-05-13T13:26:00.002-04:002021-05-13T13:26:41.294-04:00Team Member License Enforcement in GCC<p><img src="https://user-images.githubusercontent.com/7444929/118153758-3ea75080-b3e4-11eb-9d7d-079a8f53a627.png" alt="image"></p>
<p>If you have custom applications built on Dynamics/Dataverse within the US Government Community Cloud (GCC) and only have Team user licenses that were purchased after October 2018, your apps screen may soon look like the one above.</p>
<p>These Team licenses were designed for access to very specific first party apps.</p>
<ul>
<li>Customer Service Team Member</li>
<li>Sales Team Member</li>
<li>Project Resource Hub</li>
</ul>
<p>For a long time the restrictions on these licenses in GCC did not have any technical enforcement behind them, but that is changing. Restrictions are being put in place that will ensure users can only utilize the first party apps listed above.</p>
<p>Additional information on the license enforcement can be found on the <a href="https://docs.microsoft.com/en-us/dynamics365-release-plan/2020wave1/dynamics365-sales/license-enforcement-users-new-team-member-licenses">Microsoft docs site</a>.</p>
<h2 id="what-are-the-team-license-types">What are the team license types?</h2>
<p>To better understand what licenses you currently have in your environment, let’s take a look at the different team licenses. The image below shows the display names of the different team licenses available and what they can access. As you can see, there is a <strong>Dynamics 365 for Team Members for Government</strong> license that has no restrictions. These licenses were the ones purchased by organizations before October 2018. If your users have this license you will not see any changes and your apps will continue to work correctly. That being said, upgrading these to Power Apps licenses can open up a lot of new opportunities. If you have the <strong>Dynamics 365 for Team Members for Government</strong> or <strong>PowerApps for Dynamics 365 Team Members for Government</strong> licenses purchased after that date, you will soon be limited to the first party apps listed above.<br>
<img src="https://user-images.githubusercontent.com/7444929/118153725-3818d900-b3e4-11eb-8269-ca393c7b60f2.png" alt="image"></p>
<h2 id="where-do-i-find-my-license-types">Where do I find my license types?</h2>
<p>Within the Azure Portal you can go to the Azure Active Directory area and look at the licenses that are displayed there.<br>
<img src="https://user-images.githubusercontent.com/7444929/118158381-b2982780-b3e9-11eb-992d-a3cd44b46e3d.gif" alt="AzureLicenseInfo"></p>
<p>As an individual user you can check your license by going to your <a href="https://portal.office.com/account/">Office 365 account page</a> and clicking the link for subscriptions.<br>
<img src="https://user-images.githubusercontent.com/7444929/118158401-b9269f00-b3e9-11eb-8d09-726c7f3fb9ea.gif" alt="UserLicenseInfo"></p>
<h2 id="so-what-should-i-do">So what should I do?</h2>
<p><strong>Don’t get caught off guard</strong> by losing access to your applications. If you are utilizing the newer Team Member licenses for apps not included within the first party list, get together with your Microsoft sales representative or <a href="https://info.microsoft.com/ww-Landing-PowerApps-Contact-Us.html?LCID=EN-US">contact sales directly</a>. They can help you figure out what license options are available and which ones best fit your organizational needs.</p>
<p>Upgrading to Power Apps licenses will also allow your organization to be part of the low code revolution.</p>
<ul>
<li><a href="https://powerapps.microsoft.com/en-us/">Power Apps Home Page</a></li>
<li><a href="https://dynamics.microsoft.com/en-us/guidedtour/power-platform/power-apps/1/1">Power Apps Guided Tour</a></li>
<li><a href="https://powerapps.microsoft.com/en-us/pricing/">Power Apps Pricing</a> Make sure you look for any promotional pricing on the page that could apply to your organization.</li>
</ul>
Rick A. Wilson (RAW)http://www.blogger.com/profile/03562616659850164140noreply@blogger.com0