How to parameterize tokens in URL for web service requests in VisualStudio webtest

A while back I blogged about automating testing of your REST API using VisualStudio webtest, and one of the questions I got was how to parameterize tokens in the URL for REST API endpoints. If you have not seen that post, see this link. Let's take a look at our Northwind API as an example.

Say you want to retrieve a single customer entity and the REST API endpoint looks something like this: http://api.northwind.com/customers/{customerID}

By default, if you add a web service request to your webtest you will need to hardcode the "customerID" portion of the above API endpoint. That is not necessarily bad if those IDs never change. However, if you have multiple web service requests added to your webtest for different scenarios and you hardcode the ID in each URL, then when the IDs change you will have to go and update the URL for every web service request in your webtest, which can quickly become a nightmare. So let's see how we can solve that problem.

Approach 1:

Add a context parameter to your webtest called "customerIDParam" and set the value to a customer ID that you will use as test data for the webtest. Follow the steps below to add the context parameter.

Assuming the webtest is open in VisualStudio, right click on the context parameters node and select "Add Context Parameter".

[screenshot: Add Context Parameter menu]

Enter the name you want to use for the parameter, and for the value enter the customer ID you will use as test data.

Once the context parameter is set up, you can use it in the URL for the web service request as shown below. In the highlighted request you can see the API host portion and the customer ID portion using context parameters.

[screenshot: web service request URL bound to context parameters]
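For reference, context parameters are referenced in request URLs with the {{parameterName}} syntax, so assuming you also added an apiHostParam context parameter for the host portion, the parameterized URL would look something like http://{{apiHostParam}}/customers/{{customerIDParam}}.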

Approach 2:

In this approach we are going to use some custom code to get the parameters for the webtest. The first thing we'll need to do is add a data source to the webtest project; the data source will contain the test data that we want to use, in this case the API host and customer ID values. The next thing we'll need to do is convert the webtest into a coded webtest. One thing to keep in mind here is that the VisualStudio designer doesn't allow binding URLs to a data source; it would have been nice if we could do that, as it would eliminate having to write custom code.

To convert the webtest into a coded webtest, assuming you've opened the webtest inside of VisualStudio, click on the "Generate Code" toolbar option as shown in the screen capture below.

[screenshot: Generate Code toolbar option]

You could also simply add a class and inherit from the "WebTest" class defined in the "Microsoft.VisualStudio.TestTools.WebTesting" namespace. I'm not sure why there is no option to add a coded web test within VisualStudio that would automatically do this.

Assuming you have the coded webtest class added to the project, we can get the API host and customer ID values from the data source by adding the code below to the coded webtest and using it when we construct the "WebTestRequest".

   this.Context["CustomerTestDS.CustomerTestData.ApiHost"].ToString()
   this.Context["CustomerTestDS.CustomerTestData.CustomerID"].ToString()
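Putting it together, a minimal sketch of the coded webtest might look like the following. The data source name "CustomerTestDS" and table name "CustomerTestData" are assumptions matching the context keys above, and the generated coded webtest will also carry the [DataSource]/[DataBinding] attributes that attach the data source to the test.

using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;

public class GetCustomerCodedWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        // Pull the API host and customer ID for the current data source row
        string apiHost = this.Context["CustomerTestDS.CustomerTestData.ApiHost"].ToString();
        string customerId = this.Context["CustomerTestDS.CustomerTestData.CustomerID"].ToString();

        // Build the request URL from the data-driven values instead of hardcoding them
        WebTestRequest getCustomer = new WebTestRequest(
            string.Format("http://{0}/customers/{1}", apiHost, customerId));
        getCustomer.Method = "GET";
        yield return getCustomer;
    }
}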

Hope that helps

</Ram>

My Initial thoughts on Azure Resource Manager tooling in VisualStudio

I don't know about you, but to me Azure Resource Manager is one of the coolest features; it takes all the complexity of provisioning resources within Azure away from us. If you are new to Azure Resource Manager, I highly suggest checking out the Build and Ignite sessions on this topic. In this post I wanted to cover some initial thoughts on the Azure Resource Manager (ARM) tooling in VisualStudio; I will continue to add to this as I find new stuff.

Things I like

  • Project template to create an Azure Resource Group based on templates.
  • Full IntelliSense when editing ARM templates.
  • Easy deployment of the resource group template from VisualStudio as well as via PowerShell (a quick example follows this list).
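For example, deploying from the command line looks roughly like the sketch below. The cmdlet and parameter names assume the Azure PowerShell module of this era running in AzureResourceManager mode, and the resource group name, location and file names are placeholders.

# Switch the Azure PowerShell module into Resource Manager mode
Switch-AzureMode AzureResourceManager

# Create (or update) a resource group and deploy the template into it
New-AzureResourceGroup -Name "MyResourceGroup" -Location "West US" `
    -TemplateFile ".\Templates\azuredeploy.json" `
    -TemplateParameterFile ".\Templates\azuredeploy.parameters.json"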

Things to know

When you first create a Resource Group project, the select Azure template dialog makes a bunch of HTTP calls to get template metadata; if you don't have an internet connection, or your connection dropped, you are going to see the error below. It does, however, cache the metadata after it is loaded from the server for the first time, so if your connection drops later, or you are mobile and don't have internet but want to work on the template, you should have no problems.

[screenshot: error loading template metadata]

My Wish List

Today when you create a new resource group project, you start off with a blank template. It would be nice if the tooling provided some sort of hook into the GitHub repository that contains templates created by the community, so you are not starting off with a blank template. If you are not familiar with the quick-start templates, take a look at this link http://azure.microsoft.com/en-us/documentation/templates/. The templates you find there are indexed from this GitHub repository https://github.com/Azure/azure-quickstart-templates. You can fork this repository and contribute to the community.

If you started off creating your template inside of VisualStudio and now you want to publish it out to the community templates GitHub repo, you still have to do a lot of additional work, such as renaming the template JSON file and parameter JSON file, as well as creating the metadata file, before you can commit and send a pull request. I would like to see a way to publish the template directly into the community template GitHub repo from VisualStudio; that would be tremendously helpful and would save template authors a lot of additional work getting it ready for publishing to the GitHub repository.

Biggest Gap I see

Let's say the design of your solution requires you to build a bunch of Logic App connectors (API apps) and a Logic App (business process), and you want to create an ARM template to provision the connectors and the Logic App together. I'm finding this extremely difficult/painful with the current tooling. You have to deploy the connectors individually, then switch to Azure to design the Logic App, and after you design it you have to test it. Once I know the Logic App is working correctly, I have to take the JSON, come back to VisualStudio and wire it up to the ARM template. It's a really painful process IMO. I'm sure this will get better as Azure App Service gets out of preview.

I’d love to hear what you think of the ARM tooling in VisualStudio. Drop a comment if you have interesting things to share

Cheers,

</Ram>

Enabling Organizational Authentication on an existing MVC project

If you have been working with MVC applications you should already know that when you first create the project, you sort of have to make a call on how the application is going to authenticate users. The options are: No Authentication; Individual Accounts, which essentially means users are stored in a database and the template auto-generates all the plumbing code for authentication, user management, etc.; and lastly Organizational Accounts, where you basically outsource all authentication functions to an external authentication system, typically a federated identity management system. But what if you started off with Individual Accounts and later decide to move to Organizational Accounts? If you have run into this scenario, it's not quite easy to swap, and I wish there was an easier way in VisualStudio. The goal of this post is to document the steps you can take to swap the authentication mechanism to Organizational Accounts.

Nuget Packages to be Uninstalled

In the package manager console, run the commands in the order listed below to uninstall all the Owin nuget packages that were pulled down by the project template.

Uninstall-Package Microsoft.Owin.Security.Twitter
Uninstall-Package Microsoft.AspNet.Identity.Owin
Uninstall-Package Microsoft.Owin.Security.OAuth
Uninstall-Package Microsoft.Owin.Security.MicrosoftAccount
Uninstall-Package Microsoft.Owin.Security.Google
Uninstall-Package Microsoft.Owin.Security.Facebook
Uninstall-Package Microsoft.Owin.Security.Cookies
Uninstall-Package Microsoft.Owin.Security
Uninstall-Package Microsoft.Owin.Host.SystemWeb
Uninstall-Package Microsoft.Owin
Uninstall-Package Owin

Since Application Insights packages are not pulled down when you select Organizational Authentication while creating a new ASP.NET application, we will go ahead and uninstall those packages as well. I did not test to see if there are any issues with enabling Application Insights on an ASP.NET web application configured for Organizational Authentication. Maybe that's for another day.

In package manager console run following commands in order listed below

Uninstall-Package Microsoft.ApplicationInsights.Web
Uninstall-Package Microsoft.ApplicationInsights.Web.TelemetryChannel
Uninstall-Package Microsoft.ApplicationInsights.PerfCounterCollector
Uninstall-Package Microsoft.ApplicationInsights.JavaScript
Uninstall-Package Microsoft.ApplicationInsights.DependencyCollector
Uninstall-Package Microsoft.ApplicationInsights.Agent.Intercept
Uninstall-Package Microsoft.ApplicationInsights

Nuget Packages to be installed

Next we need to install System.IdentityModel.Tokens.ValidatingIssuerNameRegistry; this package adds token validation capabilities and ensures that the signer and issuer are a valid pair.

In package manager console run below command

Install-Package System.IdentityModel.Tokens.ValidatingIssuerNameRegistry

Assembly references to be added

Update Project references and add following assembly references listed below

  • System.IdentityModel
  • System.IdentityModel.Services

EntityFramework wireup to push keys and tenant IDs into database

Next we need to add Entity Framework classes which enable saving keys and tenant IDs into a database after the token validation process completes during organizational authentication. Follow the steps below.

  • Right click on “Models” folder and add a new class and name it “TenantRegistrationModels.cs”
  • Replace using statements in TenantRegistrationModels.cs file with below:
 using System;
 using System.Collections.Generic;
 using System.Linq;
  • Add following model classes to “TenantRegistrationModels.cs” file. You can break this into individual files if you prefer that.
public class IssuingAuthorityKey
{
	public string Id { get; set; }
}

public class Tenant
{
	public string Id { get; set; }
}
  • Right click on “Models” folder in your existing MVC project and add a new class and name it “TenantDbContext.cs”
  • Replace using statements in “TenantDbContext.cs” file with below:
 using System;
 using System.Data.Entity;
  • Replace the contents of the “TenantDbContext” class with code below
 public class TenantDbContext : DbContext
 {
         public TenantDbContext()
             : base("DefaultConnection")
         {
         }

        public DbSet<IssuingAuthorityKey> IssuingAuthorityKeys { get; set; }

        public DbSet<Tenant> Tenants { get; set; }
 }
  • Delete files below that are stored under “Models” folder in your existing MVC project as we no longer need them since we are switching to Organization Authentication model from Individual user accounts.
    • AccountViewModels.cs
    • IdentityModels.cs
    • ManageViewModels.cs

Next we need to add a folder to existing project and name it “Utils”

Right click on this folder and add a class and name it “DatabaseIssuerNameRegistry.cs”

Replace using statements in “DatabaseIssuerNameRegistry.cs” file with following

 using System;
using System.Collections.Generic;
using System.IdentityModel.Tokens;
using System.Linq;
using System.Runtime.CompilerServices;
using System.Web;
using System.Web.Hosting;
using System.Xml.Linq;

Additionally you have to include another using statement above to bring in the namespace for your “Models”.

Replace the “DatabaseIssuerNameRegistry” class with code below

 public class DatabaseIssuerNameRegistry : ValidatingIssuerNameRegistry
     {
         public static bool ContainsTenant(string tenantId)
         {
             using (TenantDbContext context = new TenantDbContext())
             {
                 return context.Tenants
                     .Where(tenant => tenant.Id == tenantId)
                     .Any();
             }
         }

        public static bool ContainsKey(string thumbprint)
         {
             using (TenantDbContext context = new TenantDbContext())
             {
                 return context.IssuingAuthorityKeys
                     .Where(key => key.Id == thumbprint)
                     .Any();
             }
         }

        public static void RefreshKeys(string metadataLocation)
         {
             IssuingAuthority issuingAuthority = ValidatingIssuerNameRegistry.GetIssuingAuthority(metadataLocation);

            bool newKeys = false;
             bool refreshTenant = false;
             foreach (string thumbprint in issuingAuthority.Thumbprints)
             {
                 if (!ContainsKey(thumbprint))
                 {
                     newKeys = true;
                     refreshTenant = true;
                     break;
                 }
             }

            foreach (string issuer in issuingAuthority.Issuers)
             {
                 if (!ContainsTenant(GetIssuerId(issuer)))
                 {
                     refreshTenant = true;
                     break;
                 }
             }

            if (newKeys || refreshTenant)
             {
                 using (TenantDbContext context = new TenantDbContext())
                 {
                     if (newKeys)
                     {
                         context.IssuingAuthorityKeys.RemoveRange(context.IssuingAuthorityKeys);
                         foreach (string thumbprint in issuingAuthority.Thumbprints)
                         {
                             context.IssuingAuthorityKeys.Add(new IssuingAuthorityKey { Id = thumbprint });
                         }
                     }

                    if (refreshTenant)
                     {
                         foreach (string issuer in issuingAuthority.Issuers)
                         {
                             string issuerId = GetIssuerId(issuer);
                             if (!ContainsTenant(issuerId))
                             {
                                 context.Tenants.Add(new Tenant { Id = issuerId });
                             }
                         }
                     }
                     context.SaveChanges();
                 }
             }
         }

        private static string GetIssuerId(string issuer)
         {
             return issuer.TrimEnd('/').Split('/').Last();
         }

        protected override bool IsThumbprintValid(string thumbprint, string issuer)
         {
             return ContainsTenant(GetIssuerId(issuer))
                 && ContainsKey(thumbprint);
         }
     }

Updates to App_Start

Make following Changes to App_Start folder

  • Delete “Startup.Auth.cs” file from App_Start folder as we no longer need to wire up authentication to Owin middleware
  • Edit the “IdentityConfig.cs” file and make changes below
    • Replace using statements with below
using System;
 using System.Collections.Generic;
 using System.Configuration;
 using System.IdentityModel.Claims;
 using System.IdentityModel.Services;
 using System.Linq;
 using System.Web.Helpers;
 using WebApplication3.Utils;
    • Replace everything inside the namespace statement with code below
public static class IdentityConfig
     {
         public static string AudienceUri { get; private set; }
         public static string Realm { get; private set; }

        public static void ConfigureIdentity()
         {
             RefreshValidationSettings();
             // Set the realm for the application
             Realm = ConfigurationManager.AppSettings["ida:realm"];

            // Set the audienceUri for the application
             AudienceUri = ConfigurationManager.AppSettings["ida:AudienceUri"];
             if (!String.IsNullOrEmpty(AudienceUri))
             {
                 UpdateAudienceUri();
             }

            AntiForgeryConfig.UniqueClaimTypeIdentifier = ClaimTypes.Name;
         }

        public static void RefreshValidationSettings()
         {
             string metadataLocation = ConfigurationManager.AppSettings["ida:FederationMetadataLocation"];
             DatabaseIssuerNameRegistry.RefreshKeys(metadataLocation);
         }

        public static void UpdateAudienceUri()
         {
             int count = FederatedAuthentication.FederationConfiguration.IdentityConfiguration
                 .AudienceRestriction.AllowedAudienceUris.Count(
                     uri => String.Equals(uri.OriginalString, AudienceUri, StringComparison.OrdinalIgnoreCase));
             if (count == 0)
             {
                 FederatedAuthentication.FederationConfiguration.IdentityConfiguration
                     .AudienceRestriction.AllowedAudienceUris.Add(new Uri(IdentityConfig.AudienceUri));
             }
         }
     }

Make following changes to files in “Controllers” folder

  • Delete the "ManageController.cs" file. If you want to provide user management functions, you'll probably want to keep this controller and update the code to do those operations against Azure AD using the Graph API.
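Note that the views and the "_LoginPartial" update later in this post assume an Account controller with WS-Federation sign-in/sign-out actions. If your existing "AccountController.cs" still contains the Individual Accounts code, a rough sketch of what the Organizational Accounts template generates is shown below; treat it as a starting point to verify against a freshly generated template project (IdentityConfig comes from the App_Start changes further down).

using System;
using System.IdentityModel.Services;
using System.Web.Mvc;

public class AccountController : Controller
{
    public void SignIn()
    {
        if (!Request.IsAuthenticated)
        {
            // Kick off a passive WS-Federation sign-in with the configured identity provider
            FederatedAuthentication.WSFederationAuthenticationModule
                .RedirectToIdentityProvider("Passive", Request.RawUrl, false);
        }
    }

    public void SignOut()
    {
        // Build a WS-Federation sign-out request and clear the local session cookie
        var config = FederatedAuthentication.FederationConfiguration.WsFederationConfiguration;
        string callbackUrl = Url.Action("SignOutCallback", "Account", routeValues: null, protocol: Request.Url.Scheme);

        var signOutMessage = new SignOutRequestMessage(new Uri(config.Issuer), callbackUrl);
        signOutMessage.SetParameter("wtrealm", IdentityConfig.Realm ?? config.Realm);
        FederatedAuthentication.SessionAuthenticationModule.SignOut();

        Response.Redirect(signOutMessage.WriteQueryString());
    }

    public ActionResult SignOutCallback()
    {
        if (Request.IsAuthenticated)
        {
            // Still signed in (for example the sign-out was cancelled); go back to the home page
            return RedirectToAction("Index", "Home");
        }
        return View();
    }
}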

Make following changes to “Global.asax.cs” by right clicking on “Global.asax” file and selecting view code option.

  • Add below code to using statements

using System.IdentityModel.Services;

  • Update the code in the "Application_Start" event and add a call to the ConfigureIdentity method in the IdentityConfig class, like below (a consolidated sketch of Application_Start follows this list):
    • IdentityConfig.ConfigureIdentity();
  • Add a new event handler, shown below, to hook into the RedirectingToIdentityProvider event on WSFederationAuthenticationModule
private void WSFederationAuthenticationModule_RedirectingToIdentityProvider(object sender, RedirectingToIdentityProviderEventArgs e)
         {
             if (!String.IsNullOrEmpty(IdentityConfig.Realm))
             {
                 e.SignInRequestMessage.Realm = IdentityConfig.Realm;
             }
         }
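For reference, after these changes Application_Start would look roughly like this, assuming the standard MVC 5 registrations your project already has; only the ConfigureIdentity call is new.

protected void Application_Start()
{
    AreaRegistration.RegisterAllAreas();
    FilterConfig.RegisterGlobalFilters(GlobalFilters.Filters);
    RouteConfig.RegisterRoutes(RouteTable.Routes);
    BundleConfig.RegisterBundles(BundleTable.Bundles);

    // Load realm/audience settings and refresh the issuer keys on startup
    IdentityConfig.ConfigureIdentity();
}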

View Updates

Add a new view under the "Account" folder, name it "SignOutCallback.cshtml" and add the snippet below. You can also add your own custom message to show when a user signs out of the application.

<p class="text-success">You have successfully signed out.</p>

Web.Config Updates

We'll need to apply some updates to the web.config of your existing MVC web site. The changes that need to be made are listed below:

  • Under the "configSections" element, add the section declarations for "system.identityModel" and "system.identityModel.services" (all of the snippets referenced in this list are consolidated in the sketch that follows it).
  • Right under the closing element for "system.web", add the corresponding snippet.
  • Add the "system.identityModel.services" section right under the "entityFramework" element.
  • Add the "ida:FederationMetadataLocation", "ida:Realm" and "ida:AudienceUri" keys to "appSettings".
  • Add an authorization element under "system.web" to deny anonymous users access.
  • Add the corresponding snippet right after the "appSettings" element.
  • Replace the contents of the "modules" element under the "system.webServer" element with the WS-Federation and session authentication modules, as shown in the sketch below.
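Here is a rough, consolidated sketch of what those web.config pieces typically look like in the VS 2013 Organizational Accounts template. Treat it as an assumption-laden starting point: the type names come from the standard template, the tenant, realm and audience URL values are placeholders you must replace with your own, and "MyApp" stands in for your project's root namespace and assembly name.

<configSections>
  <!-- the existing entityFramework section declaration stays; add these two -->
  <section name="system.identityModel"
           type="System.IdentityModel.Configuration.SystemIdentityModelSection, System.IdentityModel, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" />
  <section name="system.identityModel.services"
           type="System.IdentityModel.Services.Configuration.SystemIdentityModelServicesSection, System.IdentityModel.Services, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" />
</configSections>

<appSettings>
  <!-- placeholder values; use your tenant and site URL -->
  <add key="ida:FederationMetadataLocation" value="https://login.windows.net/yourtenant.onmicrosoft.com/FederationMetadata/2007-06/FederationMetadata.xml" />
  <add key="ida:Realm" value="https://localhost:44300/" />
  <add key="ida:AudienceUri" value="https://localhost:44300/" />
</appSettings>

<system.web>
  <authorization>
    <deny users="?" />
  </authorization>
</system.web>

<system.webServer>
  <modules>
    <remove name="FormsAuthentication" />
    <add name="WSFederationAuthenticationModule"
         type="System.IdentityModel.Services.WSFederationAuthenticationModule, System.IdentityModel.Services, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"
         preCondition="managedHandler" />
    <add name="SessionAuthenticationModule"
         type="System.IdentityModel.Services.SessionAuthenticationModule, System.IdentityModel.Services, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"
         preCondition="managedHandler" />
  </modules>
</system.webServer>

<system.identityModel>
  <identityConfiguration>
    <issuerNameRegistry type="MyApp.Utils.DatabaseIssuerNameRegistry, MyApp" />
    <audienceUris>
      <add value="https://localhost:44300/" />
    </audienceUris>
    <certificateValidation certificateValidationMode="None" />
  </identityConfiguration>
</system.identityModel>

<system.identityModel.services>
  <federationConfiguration>
    <cookieHandler requireSsl="true" />
    <wsFederation passiveRedirectEnabled="true"
                  issuer="https://login.windows.net/yourtenant.onmicrosoft.com/wsfed"
                  realm="https://localhost:44300/"
                  requireHttps="true" />
  </federationConfiguration>
</system.identityModel.services>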

Additional Clean Up

You may also want to replace the contents of the "_LoginPartial.cshtml" view with the snippet below, especially if you have the original auto-generated code; you may still have requests going to "Account" controller methods that no longer exist or are not needed.

@if (Request.IsAuthenticated)
{
     <text>
     <ul class="nav navbar-nav navbar-right">
         <li class="navbar-text">
             Hello, @User.Identity.Name!
         </li>
         <li>
             @Html.ActionLink("Sign out", "SignOut", "Account")
         </li>
     </ul>
     </text>
}
else
{
     <ul class="nav navbar-nav navbar-right">
         <li>@Html.ActionLink("Sign in", "Index", "Home", routeValues: null, htmlAttributes: new { id = "loginLink" })</li>
     </ul>
}

If you decided to delete the "ManageController" controller per the earlier step, we no longer need to keep any views related to account management activities, so I would recommend deleting them. Additionally, all the auto-generated views in the Account folder can also be removed. You should have just one view there, "SignOutCallback.cshtml", which displays a message when the user signs out of the application.

Delete the “Startup.cs” file from existing MVC project.

Add the Application to Azure AD

Lastly add the application in Azure Active Directory. I won’t go into the details about that as you might already be familiar with it.

The last thing I want to mention is that if you started off with the "No Authentication" option, the steps should be 80-90% the same as above; a couple of additional things you'll need to do are bringing in the EntityFramework, Microsoft.AspNet.Identity.Core and Microsoft.AspNet.Identity.EntityFramework packages.

Hope this helps.

Cheers,

</Ram>

Azure Api App – Continuous deployment to Azure from local git repository

You may or may not have seen this article that talks about continuous deployment to Azure for App Service. The thing I want to point out is that the article covers it within the context of a web app. You can pretty much do the same thing for API apps as well, but if you just looked at the properties of the API app it may not be quite clear; the UI is not very intuitive at this point, and hopefully by GA all of this will be sorted out. In this post I will cover how to configure continuous deployment for API apps.

Once you develop the API app in Visual Studio, you first need to publish it. If you are new to API apps, have a look at Brady Gaster's article on azure.com. After you have published the API app to Azure, select the API app in the Azure portal and click on the API app host link (see screen shot below). The API app host is basically an Azure web app, where we will need to follow the same steps described in the continuous deployment article I shared earlier. If you haven't configured continuous deployment for web apps, I suggest you read that article first; there are also plenty of walkthroughs online published by various folks within the Azure team.

[screenshot: API app host link in the Azure portal]

For the purposes of this article I've gone ahead and configured continuous deployment for the API host web app of the twitterinsights API app. Currently the API looks like below.

[screenshot: current API definition]

You can also see from the screen shot below that I have connected the API host web app to a local git repository, set the deployment credentials, and added the remote for my local git repository (the remote git URL for the API host web app).

[screenshot: deployment source configuration for the API host web app]

Next I will go ahead and update the API and commit to my local git repository; this is just to illustrate that continuous deployment is working correctly. For demonstration purposes I updated the route as shown below.

[screenshot: updated route in the API code]

Next we will need to push the changes to the remote repository by running the command below; again, this is clearly described in the continuous deployment article shared earlier.

[screenshot: git push output]
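The push itself is just standard git; a sketch is below, where the remote URL is a placeholder for the git URL Azure shows for your API host web app and the remote name is up to you.

git add .
git commit -m "updated route"
git remote add azure https://<deployment-user>@<your-apiapp-host>.scm.azurewebsites.net:443/<your-apiapp-host>.git
git push azure master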

You can see from the screenshot below that the changes were successfully deployed.

[screenshot: deployment history showing a successful deployment]

Finally, just to prove that there is no smoke and mirrors, you can see in the Swagger UI below that my changes are indeed pushed to Azure.

[screenshot: Swagger UI showing the updated API]

Hope that helps. One final word: you can use any of the supported source control repositories; it doesn't have to be a local git repository. Hope you are just as excited as I am about Azure App Service and the possibilities it brings for developers.

Cheers,

Ram

Error connecting to Azure Subscription from VisualStudio

I wanted to make this post with the hope of saving others time. For the last two days I have been troubleshooting an issue connecting to my Azure Dev/Test subscription. I had developed a custom resource group template and was getting ready to test it by doing a sample deployment to my Azure subscription. Unfortunately I simply could not get past the sign-in in the "deploy to resource group" dialog in Visual Studio 2015. Since I have multiple subscriptions on my laptop, as most of us do, when I picked the Microsoft account from the drop down, in this case "rprakashg@hotmail.com", I would get the Visual Studio sign-in page where I enter "rprakashg@hotmail.com"; STS would then redirect me to Live, I'm able to log in there, and I come back to the "deploy to resource group" dialog, but all the fields remained read only. I had no idea why. What had happened was that my sign-in had failed (not that I used an invalid login or anything); it would have been helpful if VisualStudio had shown some error message, and there was nothing visually in the dialog that would have given me any indication that there was a problem with the sign-in, as you can see from the screen shot below.

[screenshot: deploy to resource group dialog in Visual Studio 2015]

So I popped back to Visual Studio 2013, opened the same resource group template project and tried to deploy, and here is what I found. The deploy to resource group dialog in VS 2013 looks like below; when I tried to sign in to my Azure Dev/Test subscription I basically got the same behavior as in VS 2015, but the difference is I could visually see that I was not signed in, even though I successfully signed in to Live. If the sign-in is successful, the button text changes to "Sign out" and all the fields become enabled.

[screenshot: deploy to resource group dialog in Visual Studio 2013]

Either way, there should be some sort of error message shown to the user when there is a problem with sign-in. At this point I needed to find out what was really happening, so I tried creating a new ASP.NET project in Visual Studio 2015 and selected "Host in the Cloud". The project was successfully created, but no cloud resources were provisioned in my Azure subscription and there were no error messages either; Visual Studio 2015 had decided to fail gracefully. I attempted to do the same thing in Visual Studio 2013, and right after clicking OK in the project creation dialog, I got the dialog shown below.

[screenshot: Visual Studio 2013 sign-in prompt]

So I clicked on "Sign In" in the above dialog, signed in using my rprakashg@hotmail.com account, which is associated with my Azure Dev/Test subscription, and ended up getting the error below from Visual Studio 2013 after the sign-in.

[screenshot: Visual Studio 2013 error after sign-in]

At this point I'm still not sure exactly what the heck is going on. I can successfully log in to both Azure portals (old and new) using my rprakashg@hotmail.com account, but Visual Studio was still failing all over the place. What is up with the error messages, VisualStudio? Whatever happened to good user experience?

So as a last resort I tried connecting to my Azure subscription from Server Explorer and got the following error.

[screenshot: Server Explorer connection error]

The error above confused me even more. Another thing I noticed was that in VS 2015, under the account settings page, it was showing one account for rprakashg@hotmail.com and another for ram.gopinathan@marviewsolutions.com, both as Microsoft accounts. That did not make any sense to me. So I switched to PowerShell and ran Get-AzureAccount, and sure enough there were two accounts; see below.

[screenshot: Get-AzureAccount output showing two accounts]
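If you hit the same thing and want to clear the phantom entry from the local Azure profile while you troubleshoot, something like this should work. Note that Remove-AzureAccount only clears the locally cached credentials; it does not change anything in Azure or in the foreign tenant, and the account name below is a placeholder.

# List the accounts cached in the local Azure profile
Get-AzureAccount

# Remove the stale/phantom account from the local profile only
Remove-AzureAccount -Name "ram.gopinathan@marviewsolutions.com"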

What was interesting here was that the second user had no subscription associated with it; it was just associated with a different tenant. Keep in mind ram.gopinathan@marviewsolutions.com is not even an organizational account; it's just my business email hosted in Google Apps for Business. What I noticed is that Visual Studio was adding this account when I was signing in with rprakashg@hotmail.com. Really bizarre stuff. At this point I knew something was up with my rprakashg@hotmail.com Live account. To validate, I added another Live account, rprakashg@outlook.com, to my existing Dev/Test Azure subscription as co-administrator and attempted to sign in using that account instead of rprakashg@hotmail.com, and everything worked as expected, which confirmed my assumption.

I focused my full attention on trying to figure out what could be wrong with the rprakashg@hotmail.com account: I checked the Visual Studio service settings and the Azure AD side, but couldn't really figure out the root cause. As a last resort I started going back weeks in my memory, trying to recall any change I might have made that could have caused this. Suddenly it dawned on me: my client had granted access to their Office 365 SharePoint site to my business email ram.gopinathan@marviewsolutions.com, and once I received the invitation email I clicked on the link and logged into the site using my rprakashg@hotmail.com account. Now my Microsoft account rprakashg@hotmail.com is linked to my client's Azure AD tenant for Office 365 under the ram.gopinathan@marviewsolutions.com email. I'm not 100% clear on what happens under the hood after you sign in to an Azure subscription from Visual Studio, but my guess is there are multiple bugs in that code, given the behavior I'm seeing.

Basically, after that point the connect-to-Azure functionality from VisualStudio was completely broken. I was able to successfully repro this using another Live account that had been working before. The steps to reproduce the issue are quite straightforward; see below.

  1. Log in to an Office 365 SharePoint site using an administrator account and share the site to an external email address; it can be anything, as long as you can check the email.
  2. Once you receive the invite email, click on the link in the email to access the SharePoint site (make sure you clear the cache and that there are no cookies left from previous logins). This will bring up a realm selection page where you will see two options: Microsoft account and Organizational account. Depending on how you log in to your Azure subscription, choose Microsoft account if you use a Microsoft account, otherwise select Organizational account, and log in.
  3. After you are successfully logged in, open VisualStudio (2013 or 2015) and try to connect to your Azure subscription using the same account that you used to log in to the Office 365 SharePoint site. You can try connecting via Server Explorer, creating an ASP.NET project with the host-in-the-cloud option, deploying a resource group, etc.; nothing will work.

Once I removed the account from the Azure AD tenant for the Office 365 SharePoint site, everything started working again. Hopefully this saves some time for others; I've been pulling my hair out over this for the last two days.

Cheers,

</Ram>

Setting up docker on Azure and deploying sample ASPNET vNext app

This post is just to document my experience running docker on Azure and deploying ASPNET vNext sample HelloWeb app. I’m super excited about the docker support in Azure and ASPNET vNext and I couldn’t wait any longer to try it out. At a high level, these are the steps I took:

  • Set up the Docker client on a virtual machine running the Ubuntu desktop version
  • Create the Docker host VM on Azure
  • Pull down the ASPNET vNext HelloWeb sample app
  • Create the Dockerfile
  • Build the container
  • Run the container
  • Create an endpoint port mapping for TCP port 80

Setting up Docker Client

While the docker client can be set up on Windows and Mac OSX machines, I decided to set it up as a virtual machine on my Mac with Ubuntu desktop on it.

Things we need to setup on docker client

  • Install node
  • Install Javascript package manager (npm)
  • Install Azure cross platform CLI tools
  • Connect to Azure subscription
  • Install docker client

Since the Azure cross-platform CLI tools are written using nodejs, the first thing we'll need to do is log in to the Ubuntu desktop virtual machine and install nodejs by running the command below.

sudo apt-get install nodejs-legacy

Next install javascript package manager, run the command below

sudo apt-get install npm

Install Azure cross platform CLI tools, run command below

sudo npm install azure-cli --global

Connect to Azure Subscription

This process is very similar to the Azure PowerShell setup on Windows machines. Download the publish settings file by running the command below; it will launch a browser session where you will need to log in with your Windows Live account, after which the publish settings file will be downloaded to your local machine.

azure account download

Import the publish settings file

azure account import <publish settings file>

Finally Install Docker client

sudo apt-get install docker.io

Create docker host virtual machine on azure

We are going to use the azure vm docker create command to create the docker host virtual machine on Azure. It uses the virtual machine extensions feature in Azure to install docker once the virtual machine is provisioned. If you are not familiar with virtual machine extensions, I highly recommend reading this article; for a list of virtual machine extensions, see this link. One important thing to remember here is that you should run this command from your docker client machine; the walkthroughs online do not explicitly mention this. The reason is that, in addition to creating the virtual machine and installing docker on it, the command creates the certificates needed for the docker client to be able to talk to the docker host. So if you don't run this command from your docker client machine, you may need to do additional steps so the docker client can properly authenticate with the docker host.

If you want to see the azure vm docker create command usage, simply add the --help or -h switch right after azure vm docker create. Before we can create the host VM for docker we need to identify the Linux image to use; run the command below.

azure vm image list | grep "20150123"

output from the command above should look something like screenshot shown below, you can see the image circled in red

[screenshot: azure vm image list output with the Ubuntu image highlighted]

Run the command below to create docker host VM on Azure.

azure vm docker create ram-docker-host "b39f27a8b8c64d52b05eac6a62ebad85__Ubuntu-14_04_1-LTS-amd64-server-20150123-en-us-30GB" rprakashg P@ssw0rd123 -e 22 -l "West US" -u https://rprakashg.blob.core.windows.net -z "Large" -s "a8349281-7715-45c8-ac55-78787752ea7e"

Once the virtual machine is fully provisioned and running you can verify that your docker client can talk to the host by running following command.

docker --tls -H tcp://ram-docker-host.cloudapp.net:4243 info

Now that the docker client and host are set up, let's look at how we can run the sample HelloWeb app published by the aspnet team in docker.

Running ASPNET vNext sample app in docker

The ASPNET team has published an excellent walkthrough here. There are a few things I had to do differently, which I will cover below.

I had some issues cloning the aspnet/Home repository per the instructions in the article. I was getting a permission denied error, which had to do with how the client machine authenticates with GitHub, so I went down the path of setting up my docker client to talk to GitHub, and this really messed things up: the docker client could not talk to the docker host after I created a new ssh key for GitHub. This could be an issue with how I set things up; I ended up wasting hours :(. Finally I ended up doing an https clone as shown below. You can find the https clone URL right on the repository page on GitHub.

git clone https://github.com/aspnet/Home.git
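The walkthrough then has you add a Dockerfile next to the sample. The one for the HelloWeb sample at the time looked roughly like the sketch below; double check the base image tag and port against the version you clone, since the 5004 port simply matches the kestrel command in the sample's project.json (and the -p 80:5004 mapping used later).

FROM microsoft/aspnet

COPY . /app
WORKDIR /app

# Restore packages and run the app on the port the kestrel command is configured for
RUN ["kpm", "restore"]
EXPOSE 5004
ENTRYPOINT ["k", "kestrel"]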

Building the container image

On the docker client machine running Ubuntu desktop I had to run the build command with elevated privileges using sudo. Additionally, I had to make sure the docker client was talking to the host based on a public/default CA pool by adding --tls -H tcp://ram-docker-host.cloudapp.net:4243 (my docker host) to pretty much every docker command I was issuing. Not sure why the default settings did not work; I definitely need to investigate this further. If you have seen the same issue and have an explanation, let me know. Building the container image using the command below took a long time over https versus the default setting, which I think is a non-networked unix socket.

sudo docker --tls -H tcp://ram-docker-host.cloudapp.net:4243  build -t hellowebapp .

If you want to check the status of the above command, simply copy the id string returned by the previous command and run the command below

sudo docker --tls -H tcp://ram-docker-host.cloudapp.net:4243 logs -t <replace with id string>

Running the container

I ran into similar issues here as with the build; additionally, the container just wouldn't start: I got a "System.FormatException: Value for switch '/bin/bash' is missing" error when I tried to run the container using the command described in the article. It worked fine once I started running all docker commands with the --tls -H options.

sudo docker --tls -H tcp://ram-docker-host.cloudapp.net:4243 run -d -t -p 80:5004 hellowebapp

Verify running containers on host

sudo docker --tls -H tcp://ram-docker-host.cloudapp.net:4243 ps -a

Create an endpoint port mapping for TCP port 80

The last step, as discussed in the article, is to create an endpoint port mapping for TCP port 80 on the docker host virtual machine. For the name, select HTTP from the dropdown; the protocol should be TCP; and enter 80 as the value for both the public port and the private port. See the screenshot below.

[screenshot: endpoint configuration for TCP port 80]

Finally, we have the sample HelloWeb app successfully running inside docker. Check it out here: http://ram-docker-host.cloudapp.net/

As next steps, I'm hoping to build a more real-world app in ASPNET vNext and test it out. I've already got my Mac configured for ASPNET vNext development with Sublime Text.

A huge shout out to the Linux team at Microsoft and everyone in the open source community who have contributed to writing the various tools. You can see the Microsoft stack is truly becoming more and more open.

Useful links

Docker On Azure

Docker support in ASPNET vNext

Docker

Cheers,

</Ram>

Setting up Azure CLI on a Mac machine and creating Linux virtual machines from terminal

This post is more for me as a reference to steps I took to configure a Mac machine to connect to Azure and be able to create Linux virtual machines. If it helps others then great.

The Azure command line tools for Mac and Linux allow us to create and manage virtual machines, web sites and Azure Mobile Services from Mac and Linux desktops. Download and install the Azure SDK for Mac here.

Connecting to azure subscription

Before you can run operations on your Azure subscription you need to import your subscription, steps to do this are pretty similar to how you setup Azure PowerShell on windows machines

Fire up a new instance of terminal and run following command

azure account download

The above command will launch a browser session and take you to https://windows.azure.com/download/publishprofile.aspx

Save your azure publish settings file locally.

Next execute command below to import your publish settings file

azure account import <file>

If everything went OK, you should be good to run operations on your Azure subscription. Next we will create the Linux virtual machines, but before we can create the virtual machine we need to create an ssh certificate.

Creating “ssh” certificate

A compatible ssh key must be specified for authentication at the time of creating the Linux virtual machine in Azure. Supported formats for ssh certificates are .cer and .pem; virtual machine creation will fail if you use any other format. Run the command below in a terminal window to generate a compatible ssh key for authentication.

openssl req -x509 -nodes -days 365 -newkey rsa:2048 -keyout rprakashg.key -out rprakashg.pem

Creating Linux Virtual Machines

We are going to use the azure vm create command from a terminal window to create virtual machines. If you want to get help on command usage, add -h or --help at the end.

The parameters we are going to need to create Linux virtual machine are listed below

-n <virtual machine name>

-u <blob uri>

-z <virtual machine size>

-t <cert> Specify ssh certificate here for authentication.

-l <location> specifies location

-s <subscription id>

-p <password>

-u <username>

Identifying the Ubuntu image to use

azure vm image list | grep "20150123"

20150123 is the date the image was created. See the output from the above command below; you can see the full image name for the Ubuntu server to use when creating the virtual machine.

[screenshot: azure vm image list output showing the Ubuntu image name]

Sample command to create a Linux virtual machine using the ssh certificate we created above

azure vm create "ubuntu-server1" "b39f27a8b8c64d52b05eac6a62ebad85__Ubuntu-14_04_1-LTS-amd64-server-20150123-en-us-30GB" -l "West US" -u "https://rprakashg.blob.core.windows.net" -z "Medium" -s "a8349281-7715-45c8-ac55-78787752ea7e" -t rprakashg.pem -e 22 -p <password> -u "rprakashg"

Summarizing options for running Microsoft workloads on Google Cloud Platform

When it comes to Infrastructure as a Service (IaaS), Amazon and Azure IMO dominate this space. Google also has an IaaS offering on their cloud platform, called Compute Engine. I have been using Compute Engine with Windows images to run workloads such as SharePoint, strictly for testing and learning purposes, so I can make better recommendations to my customers who are looking to move their Microsoft workloads to the cloud.

Currently only Windows Server 2008 R2 Datacenter Edition is available as an operating system image for Compute Engine instances. If you have the Google Cloud Platform SDK installed, you can run the command below to see a list of Windows images.

gcloud compute images list --project windows-cloud --no-standard-images

Just like an Azure VM, when you create a Compute Engine instance using a Windows image you have to pass a username and password. You can have passwords stored in a password file and specify it at the time of creating the virtual machine.

Once the Compute Engine instance is up and running you can remote into it using an RDP client. From a Mac machine I use the Remote Desktop app, which is available for free from the App Store. You do need to enable RDP before you can remote into the Compute Engine instance; if you haven't seen my post on this, see enabling RDP.

If you have done a lot of work on Azure IaaS you will find that there is simply no PowerShell support available natively for GCE. I have been running operations on Compute Engine from PowerShell using the function below; within my PowerShell script I build the arguments for the gcloud command line utility and call this function.

Function Run-CloudCommand() {
    param(
        [Parameter(Mandatory=$True)]
        [string] $Arguments
    )

    # Launch gcloud.cmd with the supplied arguments and capture stdout/stderr
    $pinfo = New-Object System.Diagnostics.ProcessStartInfo
    $pinfo.FileName = "gcloud.cmd"
    $pinfo.Arguments = $Arguments
    $pinfo.RedirectStandardError = $true
    $pinfo.RedirectStandardOutput = $true
    $pinfo.UseShellExecute = $false
    $pinfo.WorkingDirectory = "c:\program files\google\cloud sdk\google-cloud-sdk\bin"

    $p = New-Object System.Diagnostics.Process
    $p.StartInfo = $pinfo
    $p.Start() | Out-Null
    $p.WaitForExit()

    $stdout = $p.StandardOutput.ReadToEnd()
    $stderr = $p.StandardError.ReadToEnd()

    # Surface gcloud failures as PowerShell errors, otherwise return the command output
    if ($p.ExitCode -ne 0)
    {
        Write-Error $stderr
    }
    else
    {
        return $stdout
    }
}
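For example, a call from a script might look like this (the project name is a placeholder):

# List compute engine instances in a project and parse the JSON output
$instances = Run-CloudCommand -Arguments "compute instances list --project my-project --format json" | ConvertFrom-Json
$instances | Select-Object name, status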

I would hope Google is working on a PowerShell SDK; this would be a nice open source initiative to work on if Google has no plans. It certainly wouldn't be that hard to write a set of custom PowerShell cmdlets that interact with the REST APIs. It would be nice to know if folks are interested.

Remote powershell on windows based compute engine instances

No native remote PowerShell support is available like in Azure; however, you can get remote PowerShell enabled on a Compute Engine instance. If you haven't seen my post on this, check it out here.

What can you run on Google cloud platform

You can run the server products listed below on Compute Engine.

  • MS Exchange Server
  • SharePoint Server
  • SQL Server Standard Edition
  • SQL Server Enterprise Edition
  • Lync Server
  • System Center Server
  • Dynamics CRM Server
  • Dynamics AX Server
  • MS Project Server
  • Visual Studio Deployment
  • Visual Studio Team Foundation Server
  • BizTalk Server
  • Forefront Identity Manager
  • Forefront Unified Access Gateway
  • Remote Desktop Services

For SQL Server, the number of licenses required by a Compute Engine instance is tied to the number of virtual cores; for example, if you used machine type n1-standard-2, which has two virtual cores, you would need 2 SQL Server Standard or Enterprise licenses.

Source for the above info is this article. The article also contains lot of information regarding running Microsoft Software on compute engine and provides additional details on process to go through etc.

I've seen folks in forums and online talk about Windows-based Compute Engine instances starting up slowly, but I personally have not felt that way. In fact, I found Windows instances starting up and shutting down faster than Azure VMs.

Machine types

The machine type determines the spec for your virtual machine instance, such as the amount of RAM, number of virtual cores, and persistent disk limits. Compute Engine has four classes of machine types.

  • Standard machine types
  • High CPU machine types
  • High memory machine types
  • Shared-core machine types

You can have up to 16 persistent disks with a total disk size of up to 10 TB attached to all machine types except shared-core machine types.

More info on machine types see this article

Pricing

Compute Engine is priced cheaper compared to Azure VMs, but there are far fewer machine type options available. A couple of interesting things to know: you are charged a minimum of 10 minutes, which means if you start an instance and use it for 5 minutes you still pay for 10 minutes. Apart from being priced competitively, Google also offers sustained use discounts. For more info on pricing check out this article.

If you are interested in trying out Google Cloud Platform, head over to this link and sign up for a trial. You get a $300 credit to use over 6 months.

If you run into any issues and you need support, ask a question on stackoverflow and tag it with google-compute-engine

Cheers,

</Ram>

Creating a base SharePoint 2013 image for Google Compute Engine (GCE)

In case anyone is not familiar with Google Compute Engine, it is the Infrastructure as a Service (IaaS) capability available on Google Cloud Platform. Google now supports Windows Server images; unfortunately only Windows Server 2008 R2 Datacenter Edition SP1 is available at the time of writing this article.

A bit of background: I'm testing deploying SharePoint on Google Compute Engine and I needed a base image that has all the required software in it. Additionally, there are some steps you have to perform in GCE if you want to be able to assign static IP addresses to your virtual machines, and I did not want to keep doing the same thing each time I create a virtual machine in GCE. So this article basically covers how I built a base Windows image that contains SharePoint plus the configuration required so the virtual machine can have a static IP address assigned. You can follow the same method to create base Windows images containing the software and configuration your application needs.

First thing create a new virtual machine using the windows server image that is currently available.

gcloud compute --project "ce-playground" instances create "sp-base-image" --zone "us-central1-a" --machine-type "n1-standard-2" --network "sp-farm-net" --metadata "gce-initial-windows-user=rprakashg" "gce-initial-windows-password=P@ssw0rd" --maintenance-policy "MIGRATE" --scopes "https://www.googleapis.com/auth/devstorage.read_only" "https://www.googleapis.com/auth/logging.write" --image "https://www.googleapis.com/compute/v1/projects/windows-cloud/global/images/windows-server-2008-r2-dc-v20150110" --boot-disk-type "pd-standard" --boot-disk-device-name "sp-base-image" --can-ip-forward

If you look at the above command, for the machine type I used n1-standard-2 and for the boot disk I used pd-standard; you have the option to use SSD here if you like. Also, --can-ip-forward is used to enable IP routing for virtual machines. Compute Engine does not support assigning a static network IP address to virtual machines; you can use a combination of routes and the instance's --can-ip-forward ability to work around this.

Download the RDP file for the newly created virtual machine, remote into it and perform the steps below. This is required since we want to assign a static network IP address; the reason we care about this is that the internal network IP addresses are managed by Compute Engine and can change when you start/stop instances.

Enable Windows Loopback Adapter

Windows Loopback adapter will allow assigning of static IP address to a virtual machine. Follow steps below to enable loopback adapter.

  • Type Device Manager in Start menu
  • From the device manager right click on the virtual machine name and select add legacy hardware
  • Click Next on the welcome screen.
  • Select Install the hardware that I manually select from a list and click Next.
  • Select Network Adapter from the list
  • Select Microsoft from the manufactures list and Microsoft Loopback Adapter from the network adapters list and click next

Add Windows firewall rule that allows ICMP traffic

To support pinging we will add a firewall rule to allow ICMP traffic

  • type Windows Firewall with Advanced Security in start menu
  • Right click on Inbound Rules and select New Rule
  • Select Custom for rule type and click next
  • Keep default settings for Program and click next
  • From the Protocol Type dropdown select ICMPV4 and click next
  • Keep default settings for Scope, Action, Profile
  • Provide a Name and Description for rule and click finish (For the purpose of this post I used ICMP)

Enable IP Forwarding

  • Run regedit
  • Switch to HKEY_LOCAL_MACHINE > SYSTEM > CurrentControlSet > services > Tcpip > Parameters.
  • Set value for IPEnableRouter property to 1 (This enables IP routing for the instance)
  • Click OK

Windows Updates

Next, set the Windows Update settings to "Download updates but let me choose whether to install them" (by default it's set to automatically download and install, and we don't want that since we want to control what updates get installed on the virtual machine). Then install Microsoft Update to get updates for other products such as Office, SharePoint, etc., and apply outstanding updates to keep the virtual machine up to date on patches.

Install SharePoint on the Virtual Machine

Next we are going to install SharePoint with all the prerequisites on the virtual machine. This image is going to be a base image for SharePoint 2013 with Service Pack 1, which I downloaded from my MSDN subscription. Depending on your scenario you might choose to use a different version of SharePoint.

Double click on the default application in the SharePoint install media; this will bring up a splash screen. Click on Install Software Prerequisites.

[screenshot: SharePoint installation splash screen]

This will bring up the prerequisite installer tool. Click Next, Accept the terms of agreement and install all prerequisites.

[screenshot: prerequisite installer tool]

The prerequisites installer tool will reboot once during the install; after the reboot the installer will automatically continue and complete. Once the prerequisites are successfully installed, we can go ahead and install SharePoint Server by clicking on the Install SharePoint Server link in the splash screen. (Note: due to the reboot during the prerequisites install, we need to bring up the splash screen again.)

Since I'm using SharePoint 2013 with SP1 downloaded from MSDN, a valid product key must be entered before I can continue with the SharePoint Server install. You can also use the evaluation version of the SharePoint Server bits, as you probably don't want to use a licensed version in the image. Since this image is going to be private and used strictly by me, I can safely use my MSDN product key. If you are going to share the image with others, you will want to use the evaluation version instead of giving away your key.

For Server Type keep Complete selected and click on install now

[screenshot: SharePoint server type selection]

After installation is complete uncheck Run the SharePoint Products Configuration Wizard now. Since we are building a base image we don’t want to do this.

[screenshot: SharePoint installation complete dialog]

I also ran a PowerShell script that does a couple of things (a rough sketch follows the list):

  • Turn off unneeded services
  • Apply disable loopback check fix
  • Turn off CRL check
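A rough sketch of that script is below. The service list is illustrative only, and the loopback check and CRL tweaks are the commonly used registry/hosts changes for lab SharePoint boxes, so adjust them to your own requirements.

# Disable the loopback check (avoids 401s when browsing the local SharePoint site by name)
New-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Control\Lsa" `
    -Name "DisableLoopbackCheck" -Value 1 -PropertyType DWord -Force | Out-Null

# Turn off services not needed on a lab image (illustrative list)
"Spooler", "SCardSvr" | ForEach-Object {
    Stop-Service $_ -ErrorAction SilentlyContinue
    Set-Service $_ -StartupType Disabled
}

# Short-circuit the certificate revocation (CRL) check, which slows SharePoint on boxes
# without internet access, by pointing crl.microsoft.com at the loopback address
Add-Content -Path "$env:windir\System32\drivers\etc\hosts" -Value "127.0.0.1 crl.microsoft.com"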

I also created a folder named Scripts under the C drive and copied some additional PowerShell scripts that automate the configuration of SharePoint. Once the virtual machine comes up you can simply run the scripts.

At this point we are ready to turn this virtual machine into a base image.

Run an elevated command prompt, type gcesysprep and hit enter. (Note: don't run the standard sysprep utility that we use to sysprep Windows images.) The gcesysprep utility will terminate the virtual machine instance; we can then delete the virtual machine without deleting the persistent disk by running the following command.

gcloud compute instances delete sp2013-image --keep-disks boot

Next we need to create a snapshot of the root persistent disk. A snapshot allows us to create a new persistent disk with the data from the snapshot; additionally, you can restore to a larger size than was originally used, or even to a different type of disk. Run the following command to create the snapshot of our sp2013-image disk that was sysprepped using the gcesysprep utility.

gcloud compute disks snapshot "sp2013-image" --project "ce-playground" --snapshot-names "sp2013-with-sp1-win2008r2sp1"

The above command will return a URI for the snapshot; you'll want to write this down. The next thing we'll need to do is create a new persistent disk using the snapshot that we created earlier.

gcloud compute disks create "sp2013withsp1onwin2k8r2sp1" --source-snapshot "https://www.googleapis.com/compute/v1/projects/ce-playground/global/snapshots/sp2013-with-sp1-win2008r2sp1" --project "ce-playground" --zone "us-central1-a"

Next we will create an image using the new persistent disk that was created in previous step

gcloud compute images create "sharepoint-server-2013-sp1" --source-disk "sp2013withsp1onwin2k8r2sp1" --source-disk-zone "us-central1-a" --project "ce-playground"

You can see from the screenshot below that my custom image is now available for me and I can create new virtual machines using this image.

[screenshot: custom image listed in Compute Engine]

You can run following command to see all the metadata associated with your custom image

gcloud compute images describe "sharepoint-server-2013-sp1" --project "ce-playground"

At this point we can create new virtual machines using this image. Once the virtual machine is created, you can simply RDP into it and run the PowerShell script to create a new SharePoint farm or join an existing one. This significantly cuts down the time required to get the infrastructure up and running in Google Cloud Platform to run a SharePoint workload. I plan to do some performance testing; my goal is to compare how Google Compute Engine stacks up against Azure IaaS, specifically when it comes to running large workloads like SharePoint in the cloud.

Cheers,

</Ram>