Wednesday, July 18, 2018

VSTS and SourceTree

I recently started using Git with VSTS and wanted to connect it to Sourcetree.  This article by Adam Prescott helped me out.  You can view the article here:

Tuesday, April 24, 2018

Download SDK Tools for Dynamics V9.0+

Microsoft is no longer providing the entire SDK and samples as a download.  Instead they are hosting the samples on their website and providing the SDK and tools on NuGet.

You can get the tools inside Visual Studio by adding the NuGet package, but if you just want to download them to a folder on your computer, they have also provided a PowerShell script to help you do that.
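If you would rather stay inside Visual Studio, the same packages can be added from the Package Manager Console. This is just a sketch; Install-Package pulls each package into the current project rather than a standalone folder, and these are the same package IDs the script below installs:

```powershell
PM> Install-Package Microsoft.CrmSdk.CoreTools
PM> Install-Package Microsoft.CrmSdk.XrmTooling.PluginRegistrationTool
```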

Sample code website:

Here is a copy of the PowerShell script provided by Microsoft:

1. In your Windows Start menu, type Windows PowerShell and open it.
2. Navigate to the folder you want to install the tools to. For example, if you want to install them in a devtools folder on your D: drive, type cd D:\devtools.
3. Copy and paste the following PowerShell script into the PowerShell window and press Enter.
$sourceNugetExe = "https://dist.nuget.org/win-x86-commandline/latest/nuget.exe"
$targetNugetExe = ".\nuget.exe"
Remove-Item .\Tools -Force -Recurse -ErrorAction Ignore
Invoke-WebRequest $sourceNugetExe -OutFile $targetNugetExe
Set-Alias nuget $targetNugetExe -Scope Global -Verbose

##Download Plugin Registration Tool
./nuget install Microsoft.CrmSdk.XrmTooling.PluginRegistrationTool -O .\Tools
md .\Tools\PluginRegistration
$prtFolder = Get-ChildItem ./Tools | Where-Object {$_.Name -match 'Microsoft.CrmSdk.XrmTooling.PluginRegistrationTool.'}
move .\Tools\$prtFolder\tools\*.* .\Tools\PluginRegistration
Remove-Item .\Tools\$prtFolder -Force -Recurse

##Download CoreTools
./nuget install  Microsoft.CrmSdk.CoreTools -O .\Tools
md .\Tools\CoreTools
$coreToolsFolder = Get-ChildItem ./Tools | Where-Object {$_.Name -match 'Microsoft.CrmSdk.CoreTools.'}
move .\Tools\$coreToolsFolder\content\bin\coretools\*.* .\Tools\CoreTools
Remove-Item .\Tools\$coreToolsFolder -Force -Recurse

##Download Configuration Migration
./nuget install  Microsoft.CrmSdk.XrmTooling.ConfigurationMigration.Wpf -O .\Tools
md .\Tools\ConfigurationMigration
$configMigFolder = Get-ChildItem ./Tools | Where-Object {$_.Name -match 'Microsoft.CrmSdk.XrmTooling.ConfigurationMigration.Wpf.'}
move .\Tools\$configMigFolder\tools\*.* .\Tools\ConfigurationMigration
Remove-Item .\Tools\$configMigFolder -Force -Recurse

##Download Package Deployer 
./nuget install  Microsoft.CrmSdk.XrmTooling.PackageDeployment.WPF -O .\Tools
md .\Tools\PackageDeployment
$pdFolder = Get-ChildItem ./Tools | Where-Object {$_.Name -match 'Microsoft.CrmSdk.XrmTooling.PackageDeployment.Wpf.'}
move .\Tools\$pdFolder\tools\*.* .\Tools\PackageDeployment
Remove-Item .\Tools\$pdFolder -Force -Recurse

##Remove NuGet.exe
Remove-Item nuget.exe 
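After the script completes, the Tools folder should be left with just the four tool folders, since each downloaded package folder is moved and then removed along the way. A quick sanity check (a sketch, run from the same folder you installed to):

```powershell
# List what the script produced; expect CoreTools, ConfigurationMigration,
# PackageDeployment, and PluginRegistration, each with the tool binaries inside.
Get-ChildItem .\Tools | Select-Object Name
```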

Monday, April 23, 2018

Find Empty Methods in Visual Studio

To find empty methods in Visual Studio, search (with "Use Regular Expressions" enabled) using the following regular expression.

void\ .*\(.*\)(\ |(\r\n))*{(\ |(\r\n))*}
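To sanity-check the pattern outside Visual Studio, here is a quick sketch using Python's re module. It is only an approximation: Visual Studio uses .NET regular expressions, and \s here stands in for the space/newline alternations above.

```python
import re

# Approximation of the Visual Studio pattern. The method name is
# restricted to \w+ to keep the match tight, and \s covers the
# spaces and line breaks that the ( |\r\n) groups handle above.
pattern = re.compile(r"void\s+\w+\s*\(.*\)\s*\{\s*\}")

sample = """
void DoNothing() { }
void DoWork(int x)
{
    return;
}
void AlsoEmpty(string name)
{
}
"""

matches = pattern.findall(sample)
print(len(matches))  # prints 2 -- DoNothing and AlsoEmpty are empty, DoWork is not
```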

Thursday, March 15, 2018

Create Windows 10 VHD and Import as EC2 AMI

Needing to test client configurations and keep them in the same domain as our server, we recently had to add some Windows 10 machines to EC2. It was a bit of a surprise that there were no AMIs available for Windows 10; instead we needed to create our own VHD and import it as an AMI into our EC2 instance.

Prepare VM
  1. Download and install Oracle VM VirtualBox.
  2. Download your Windows 10 ISO file.
  3. Create a new Windows 10 VM
  4. When creating the hard drive make sure to choose the VHD format.

  5. I chose Fixed size for the storage on the hard disk.  I have not tried using Dynamic.

  6. Give the hard drive at least 25 GB; go bigger if you can.
  7. Attach the ISO to your Virtual Machine and start it up.
  8. Run through the Windows setup.
  9. When prompted to either enter a Windows account or a domain account, choose domain account.  You will be prompted to create a new user and password; this new user will be a local administrator on the machine, so make sure to save this information.
  10. After you have completed the setup, enable Remote Access to the machine.  If you fail to do this now, you won't be able to connect later when you create the EC2 machine instance.

  11. Restart the machine and run Windows Update.
  12. Close the Virtual Machine

Install the AWS Command Line Utility

Upload the VHD File to S3

Create JSON Documents
These documents will be used to create a role, assign rights, and import the VHD as an AMI.  It's helpful to have all these documents in the same folder.
  1. Create a trust policy document that will be used when creating the new role; it allows the VM Import service (vmie.amazonaws.com) to assume the role that will do the import. The file can be called role-trust.json. (This is the standard trust policy from the AWS VM Import documentation.)

       {
          "Version": "2012-10-17",
          "Statement": [
             {
                "Effect": "Allow",
                "Principal": { "Service": "vmie.amazonaws.com" },
                "Action": "sts:AssumeRole",
                "Condition": {
                   "StringEquals": {
                      "sts:Externalid": "vmimport"
                   }
                }
             }
          ]
       }
  2. Create a role policy document which will give the role the security permissions it needs to do the import.  You can name the new file role-security.json. Make sure to replace the two places where YOURBUCKET is mentioned with the name of the bucket where the VHD is located. (This is the standard vmimport policy from the AWS VM Import documentation.)

       {
          "Version": "2012-10-17",
          "Statement": [
             {
                "Effect": "Allow",
                "Action": [
                   "s3:ListBucket",
                   "s3:GetBucketLocation"
                ],
                "Resource": [
                   "arn:aws:s3:::YOURBUCKET"
                ]
             },
             {
                "Effect": "Allow",
                "Action": [
                   "s3:GetObject"
                ],
                "Resource": [
                   "arn:aws:s3:::YOURBUCKET/*"
                ]
             },
             {
                "Effect": "Allow",
                "Action": [
                   "ec2:ModifySnapshotAttribute",
                   "ec2:CopySnapshot",
                   "ec2:RegisterImage",
                   "ec2:Describe*"
                ],
                "Resource": "*"
             }
          ]
       }
  3. Create another document which describes the VHD that is going to be imported. Make sure to replace YOURBUCKET with the name of your bucket where the VHD is located. You can name this file image-container.json.

       [
          {
             "Description": "Windows 10 Base Install 1709",
             "Format": "vhd",
             "UserBucket": {
                "S3Bucket": "YOURBUCKET",
                "S3Key": "Windows 10.vhd"
             }
          }
       ]
Run Commands
  1. Open a command prompt as Administrator.
  2. Navigate to the folder where you have the JSON documents you created in the last step.
  3. Run this first command, which will create the role.

    aws iam create-role --role-name vmimport --assume-role-policy-document file://role-trust.json
  4. Run this command, which assigns the permissions needed by the role.

    aws iam put-role-policy --role-name vmimport --policy-name vmimport --policy-document file://role-security.json
  5. Run this command, which will import the VHD.

    aws ec2 import-image --description "Windows 10 (1709)" --disk-containers file://image-container.json --region us-east-1
  6. It can take a while to import the VHD as an AMI; in my case it took about 1.5 hours.
  7. To see the status of your import, run this command.

    aws ec2 describe-import-image-tasks --region us-east-1

  8. When the import is completed, the status will look like this. Make note of the ImageId value.  You will need this to find the AMI later when you want to launch an instance of the AMI.
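Roughly, a completed task looks like this (an abridged sketch based on the describe-import-image-tasks output format; the IDs here are placeholders):

```json
{
    "ImportImageTasks": [
        {
            "ImportTaskId": "import-ami-0123456789abcdef0",
            "ImageId": "ami-0123456789abcdef0",
            "Status": "completed",
            "Platform": "Windows",
            "Description": "Windows 10 (1709)"
        }
    ]
}
```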
Create EC2 Instance
Now that our VHD is an AMI we can create a new EC2 instance from it.
  1. You can quickly launch an instance of the AMI from AWS -> EC2 -> AMIs
  2. Select the AMI you created and click the Launch button.  You can locate the AMI by the ImageId you got from the completed status you received earlier.
  3. Make sure when you set up the security group for the Instance you allow port 3389 for RDP.
  4. When you log into the machine use the local administrator account you created when you set up the VM.

Wednesday, March 14, 2018

Uploading Large Files into Amazon S3

I recently had to upload some VHDs to Amazon S3 and found myself going beyond the limits of the web upload.  To upload the 50GB files I used the AWS Command Line Interface (CLI).  If you need to know how to install the CLI, check here (Install CLI)

1. Open a command prompt as Administrator.

2. Make sure your CLI configuration is up to date.

C:\Windows\System32> aws configure
AWS Access Key ID: yourkey
AWS Secret Access Key: youraccesskey
Default region name [us-east-1]: yourregion
Default output format [None]: json

3. Create your bucket on S3 if it's not already there.

4. Run the following command. 

C:\Windows\System32> aws s3 cp c:\temp\yourfile.vhd s3://yourbucket/VHDs

5. After you run the command, the output window will give you feedback on how much of your upload is complete and your current upload speed.

C:\Windows\System32> aws s3 cp c:\temp\yourfile.vhd s3://yourbucket/VHDs
Completed 1.1 GiB/50 GiB (14.1 MiB/s) with 1 file(s) remaining
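For files this size, aws s3 cp automatically switches to multipart upload. If uploads stall or saturate your connection, the chunk size and concurrency can be tuned through the CLI's S3 configuration; these are documented aws configure settings, and the values below are just examples:

```shell
aws configure set default.s3.multipart_chunksize 64MB
aws configure set default.s3.max_concurrent_requests 20
```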

Amazon AWS Command Line Interface (CLI) Install

Using the CLI can make doing tasks in AWS much easier.  This is the setup I usually perform to get the CLI tool.  Instead of going the Python/pip route shown here, there is also an MSI installer you can use.  (MSI Download Link for 32/64 Bit)

1. Install Python.  Use Python 2.7.9+ or Python 3.4+, which will ensure you get the pip installer included.  If you are using an earlier version of Python, you will need to go through a separate setup procedure for pip.

2. Open a command prompt as Administrator and run the following.

C:\Windows\System32> pip install awscli

3. Test to make sure that the install was successful.

C:\Windows\System32> aws --version
aws-cli/1.11.84 Python/3.6.2 Windows/7 botocore/1.5.47

4. After you confirm it was installed, you can go ahead and configure the properties for your connection. If you don't have your access/secret keys, you can add a new one to your user account in the AWS online console: IAM -> Users -> Your User -> Security Credentials Tab -> Access Keys.  If you create a new key, make sure to keep the keys in a safe place and don't lose them.

C:\Windows\System32> aws configure
AWS Access Key ID: yourkey
AWS Secret Access Key: youraccesskey
Default region name [us-east-1]: yourregion
Default output format [None]: json

It's usually a good idea to update the CLI before running it.

C:\Windows\System32> pip install --user --upgrade awscli