Tuesday, December 30, 2014

Validation Groups for ASP.NET MVC

If you are used to working with ASP.NET Web Forms, you are probably already familiar with Validation Groups, which allow you to validate a portion of a form at a time rather than validating the entire form.  Validation Groups were introduced with ASP.NET 2.0.

However, even if you are using the latest version of ASP.NET MVC (v. 5.0 as of this writing), you will find that this functionality is notably missing.

Of course, some projects have arisen to fill this gap in ASP.NET MVC, such as the Validation Groups project on CodePlex:  https://mvc3validationgroups.codeplex.com/.  There is even a NuGet package available for this project here: https://www.nuget.org/packages/Mvc3ValidationGroups/

Unfortunately, as the name suggests, this code is quite outdated: it only targets MVC 3 and it does not appear to have been updated in several years.

While the FluentValidation library does not offer the same functionality as Validation Groups, it does provide a good way to consolidate validation logic for your MVC Models/ViewModels: http://fluentvalidation.codeplex.com/

This project is up-to-date with support for ASP.NET MVC 5 so you can use it in your latest MVC projects.
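As a brief illustration of how FluentValidation consolidates validation logic in one place (a minimal sketch, assuming a hypothetical CustomerViewModel with Name and Email properties):

```csharp
using FluentValidation;

public class CustomerViewModel
{
    public string Name { get; set; }
    public string Email { get; set; }
}

public class CustomerViewModelValidator : AbstractValidator<CustomerViewModel>
{
    public CustomerViewModelValidator()
    {
        // All validation rules for the ViewModel live in this one class
        RuleFor(customer => customer.Name).NotEmpty();
        RuleFor(customer => customer.Email).NotEmpty().EmailAddress();
    }
}
```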

Finally, if you want Microsoft to include Validation Group support in the next release of ASP.NET MVC (MVC 6), I would encourage you to vote for this User Voice item here: http://aspnet.uservoice.com/forums/41201-asp-net-mvc/suggestions/2071605-validation-group-support

Monday, December 29, 2014

Troubleshooting "Method Not Allowed" errors in ASP.NET Web API

If you are working with ASP.NET Web API, you may have encountered a situation where you can perform a GET or POST just fine, but cannot successfully execute a PUT without receiving an "HTTP/1.1 405 Method Not Allowed" error, even though you have applied the HttpPut verb attribute to the Put method in your Web API controller.

Well, there are two likely causes of this problem:

  1. The request for PUT could be a Cross-Origin Request, which will be blocked unless CORS support is enabled.  In order to add CORS support to your ASP.NET Web API application, you can follow the steps outlined here: http://www.asp.net/web-api/overview/security/enabling-cross-origin-requests-in-web-api
  2. The other potential issue could be your configuration of IIS.  The WebDAV module in IIS could potentially conflict with PUT requests, in which case you may have to modify your Web.config file as outlined in this article: http://www.asp.net/web-api/overview/testing-and-debugging/troubleshooting-http-405-errors-after-publishing-web-api-applications
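As a sketch of the first option, enabling CORS (assuming the Microsoft.AspNet.WebApi.Cors NuGet package is installed; the origin shown is a placeholder) might look like the following in your WebApiConfig:

```csharp
using System.Web.Http;
using System.Web.Http.Cors;

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        // Allow cross-origin requests so that PUT/DELETE calls from other
        // origins are not rejected (origin/headers/methods here are examples)
        var cors = new EnableCorsAttribute("http://www.example.com", "*", "*");
        config.EnableCors(cors);

        config.MapHttpAttributeRoutes();
    }
}
```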

However, if like me, neither of these solutions worked for you, you may need to additionally remove the WebDAV module from IIS using the following Web.config section:

<modules>
    <remove name="WebDAVModule" />
</modules>

Therefore, your system.webServer section should end up looking like the following:



<system.webServer>
  <handlers>
    <remove name="WebDAV" />
    <remove name="ExtensionlessUrlHandler-Integrated-4.0" />
    <remove name="OPTIONSVerbHandler" />
    <remove name="TRACEVerbHandler" />
    <add name="ExtensionlessUrlHandler-Integrated-4.0" path="*." verb="*" type="System.Web.Handlers.TransferRequestHandler" preCondition="integratedMode,runtimeVersionv4.0" />
  </handlers>
  <modules>
    <remove name="WebDAVModule" />
  </modules>
</system.webServer>

Saturday, December 27, 2014

Working with List Controls in ASP.NET MVC

Unfortunately, Microsoft does not provide default Scaffolding for any List controls in ASP.NET MVC, such as the DropDownList, RadioButtonList or CheckBoxList controls that were previously provided in ASP.NET Web Forms.

Therefore, for the time being, developers are required to come up with their own solutions for implementing Scaffolding or Model-binding support for these controls.

Of the 3 types of controls, the DropDownList control is by far the easiest to manage:

In your C# code, you need to create the following Model properties:
public IList<SelectListItem> Countries { get; set; }

[Required]
public string SelectedCountry { get; set; }



Then in your Controller class, you need to add the appropriate model binding code:



SampleViewModel model = new SampleViewModel();
model.Countries = new List<SelectListItem>();
model.Countries.Add(new SelectListItem { Text = "USA", Value = "USA" });
model.Countries.Add(new SelectListItem { Text = "India", Value = "India", Selected = true });
model.Countries.Add(new SelectListItem { Text = "Canada", Value = "Canada" });



In your Razor View, you simply use the following Html Helper:



<div class="form-group">
    @Html.LabelFor(model => model.SelectedCountry, htmlAttributes: new { @class = "control-label col-md-2" })
    <div class="col-md-10">
        @Html.DropDownListFor(model => model.SelectedCountry, Model.Countries, new { @class = "form-control" })
        @Html.ValidationMessageFor(model => model.SelectedCountry, "", new { @class = "text-danger" })
    </div>
</div>

Here is an article that describes how to do this in a bit more detail: http://odetocode.com/blogs/scott/archive/2013/03/11/dropdownlistfor-with-asp-net-mvc.aspx

With the RadioButtonList and CheckboxList controls, the situation becomes a bit more complex, particularly for checkboxes, which ASP.NET MVC handles significantly differently than ASP.NET Web Forms did.

You have to start by creating your own custom class to support your RadioButtonList and CheckboxList controls, such as the following:



public class GenericListItem
{
    public int Value { get; set; }
    public string Text { get; set; }

    public string Checked
    {
        get { return "checked"; }
    }

    public bool Selected { get; set; }

    /// <summary>
    /// Default constructor
    /// </summary>
    public GenericListItem()
    {
        // Default all ListItems to an initial value of unchecked/false
        this.Selected = false;
    }

    /// <summary>
    /// Overloaded constructor
    /// </summary>
    /// <param name="value"></param>
    /// <param name="text"></param>
    /// <param name="selected"></param>
    public GenericListItem(int value, string text, bool selected = false)
    {
        this.Value = value;
        this.Text = text;
        this.Selected = selected;
    }
}



In your Controller class, you can add model binding code similar to what was done for the DropDownList control:




model.RadioButtonList = new List<GenericListItem>();
model.RadioButtonList.Add(new GenericListItem { Text = "RadioListItem1", Value = 1 });
model.RadioButtonList.Add(new GenericListItem { Text = "RadioListItem2", Value = 2, Selected = true });
model.RadioButtonList.Add(new GenericListItem { Text = "RadioListItem3", Value = 3 });

model.CheckBoxList = new List<GenericListItem>();
model.SelectedCheckbox = "CheckboxListName";
model.CheckBoxList.Add(new GenericListItem { Text = "CheckBoxListItem1", Value = 1, Selected = true });
model.CheckBoxList.Add(new GenericListItem { Text = "CheckBoxListItem2", Value = 2, Selected = true });
model.CheckBoxList.Add(new GenericListItem { Text = "CheckBoxListItem3", Value = 3 });



In your Razor View, you have to write some custom Razor code in order to achieve the desired rendering:



<div class="form-group">
   @Html.LabelFor(model => model.SelectedRadioButton, htmlAttributes: new { @class = "control-label col-md-2" })
   <div class="col-md-10">
       @foreach (GenericListItem radioItem in Model.RadioButtonList)
       {
           if (radioItem.Selected)
           {
               @Html.RadioButtonFor(model => model.SelectedRadioButton, radioItem.Value, new { @checked = radioItem.Checked }) @Html.Label(radioItem.Text)
           }
           else
           {
               @Html.RadioButtonFor(model => model.SelectedRadioButton, radioItem.Value) @Html.Label(radioItem.Text)
           }
       }
       @Html.ValidationMessageFor(model => model.SelectedRadioButton, "", new { @class = "text-danger" })
   </div>
</div>

<div class="form-group">
   @Html.LabelFor(model => model.SelectedCheckbox, htmlAttributes: new { @class = "control-label col-md-2" })
   <div class="col-md-10">
       @foreach (GenericListItem checkboxItem in Model.CheckBoxList)
       {
           @Html.CheckBox(Model.SelectedCheckbox, checkboxItem.Selected, new { value = checkboxItem.Value }) @Html.Label(checkboxItem.Text)
       }
   </div>
</div>

Note that the radio buttons bind to the SelectedRadioButton property (rather than the RadioButtonList collection) so that the posted value matches the property used by the label and validation message.


You will notice that even though there is a CheckBoxFor Html Helper, I avoided using it due to its lack of support for the value attribute.

If you are like me and would like to see Microsoft add support for Scaffolding/Model-binding support for List Controls especially for the CheckBoxList and RadioButtonList controls, vote for these User Voice items:

https://aspnet.uservoice.com/forums/41201-asp-net-mvc/suggestions/5849005-provide-html-checkboxlist-and-html-radiobuttonlist

https://aspnet.uservoice.com/forums/41201-asp-net-mvc/suggestions/488893-better-binding-support-for-checkbox-lists

Friday, December 26, 2014

TF30063: You are not authorized to access Microsoft-IIS/8.5

I had recently set up a single server installation of Team Foundation Server 2013 (just the application server) when one of my users reported the following message when attempting to perform a "Get Latest" operation on the server.



TF30063: You are not authorized to access Microsoft-IIS/8.5


Well, I found a similar article which describes the problem and recommends examining the log file: http://mgrowan.wordpress.com/2014/02/26/tf30063-you-are-not-authorized-to-access-microsoft-iis7-5/ 

Sure enough, the disk space on the C:\ drive was running low because the SQL Server data and log files were all stored on the C:\ drive.  One of the SQL Server database log files had grown to an astonishing 62 GB!

The remedy, of course, was to detach the SQL Server database and re-attach it so that the log file resided on a drive other than the C:\ drive.

Once I did this, my TFS instance was back up and running smoothly!

Thursday, December 25, 2014

Using the CallTarget Task to pass different parameters in MSBuild

If you have a task in MSBuild that you want to re-use over and over, similar to calling a function/method in a programming language such as C#, it is not readily obvious how to accomplish this.

Fortunately, there is a way to accomplish this in MSBuild using the CallTarget task.

  1. First of all, declare a set of Properties in a PropertyGroup that you want to use as parameters to your Target.  It is important that these Properties be declared globally in your MSBuild file so that the Target is always aware of the different values of your Properties.  If the Properties are declared locally to your Target then they will not be available across multiple calls to your Target.
  2. Next, declare your Target that you will need to call multiple times.  Within the Target, utilize the Properties that you defined in your global PropertyGroup.
  3. Finally, declare another Target which will be responsible for calling your Target using the CallTarget task multiple times.  In this Target, you will override the values defined in the Global Property between consecutive calls to your Target.
Therefore, your final MSBuild file will look like the following:

 
<Project ToolsVersion="12.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <Import Project="$(MSBuildExtensionsPath32)\ExtensionPack\4.0\MSBuild.ExtensionPack.tasks" />
  <Import Project="$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v10.0\WebApplications\Microsoft.WebApplication.targets" />
  <PropertyGroup>
    <AdvancedInstallerPath>$(ProgramFiles)\Caphyon\Advanced Installer 11\bin\x86\AdvancedInstaller.com</AdvancedInstallerPath>
    <AdvancedInstallerProjectFilePath>Sample.aip</AdvancedInstallerProjectFilePath>
    <PathVariableName>Infragistics_Dir</PathVariableName>
    <PathVariableValue>$(PrerequisitesFilesDir)\Infragistics</PathVariableValue>
  </PropertyGroup>
  <Target Name="TestBuildPathVariables">
    <PropertyGroup>
      <PathVariableName>MyPathVariable</PathVariableName>
      <PathVariableValue>$(PrerequisitesFilesDir)</PathVariableValue>
    </PropertyGroup>
    <CallTarget Targets="PathVariableTarget" />
    <PropertyGroup>
      <PathVariableName>MyPathVariable2</PathVariableName>
      <PathVariableValue>$(PackagingFilesDir)</PathVariableValue>
    </PropertyGroup>
    <CallTarget Targets="PathVariableTarget" />
  </Target>
  <Target Name="PathVariableTarget">
    <ItemGroup>
      <PathVariableArg Include="/edit" />
      <PathVariableArg Include="&quot;$(AdvancedInstallerProjectFilePath)&quot;" />
      <PathVariableArg Include="/NewPathVariable" />
      <PathVariableArg Include="-name $(PathVariableName) -value $(PathVariableValue) -valuetype Folder" />
    </ItemGroup>
    <Message Text="This is the command to be run: $(AdvancedInstallerPath) @(PathVariableArg, ' ')" />
    <Exec Command="&quot;$(AdvancedInstallerPath)&quot; @(PathVariableArg, ' ')" />
  </Target>
</Project>

Tuesday, December 23, 2014

Using ASP.NET MVC Scaffolding with ViewModels

If you read about ASP.NET MVC Scaffolding, most examples make reference to Entity Framework-based Model classes for building out your Views.  However, if you are using a layered architecture or a Repository pattern to segregate your data access logic from your Controllers, you will probably want to know how to leverage Scaffolding to bind to your ViewModels instead.

Well, fortunately, Scaffolding only changes slightly when working with ViewModels:

  1. Create your ViewModel class in your project.  This should be wired up with all of the necessary attributes (System.ComponentModel.DataAnnotations) that you will need for your ModelState validation in your Controller methods/actions.  For a complete list of DataAnnotation attributes, you can consult this MSDN documentation: http://msdn.microsoft.com/en-us/library/system.componentmodel.dataannotations%28v=vs.110%29.aspx
  2. Compile your Project/Solution.  If you do not do this first, then the newly created ViewModel class will not appear as part of your Scaffolding options.
  3. In this next step, you have to be careful about which option you choose.  If you choose "Add-->New Scaffolded Item" then you will only get options to add an empty MVC Controller.  Therefore, you should go to the Controllers directory and "Add-->Controller".  You will then be able to choose a Controller with read/write actions.
  4. Once you have created your MVC Controller, you will see a Controller with empty operations for the common actions needed in an MVC Controller.  Of course, none of the Views have actually been added to the project yet.
  5. Right click on one of the relevant MVC Controller actions and click on "Add-->View".  
  6. You will then be prompted to create a View corresponding to the appropriate Controller action and choose the respective Model/ViewModel that you will be using for binding as well as if there is any corresponding Layout page for that View.
  7. Once the View is created, you will notice that it shells out all of the necessary HTML Helpers for the View that you can use as a starting point for modifying your View to meet your requirements.  Of course, if you want to make your Scaffolding even more useful, then you can create custom Display and Editor templates to ensure your Views more closely match your business needs/requirements as described here: http://www.codeguru.com/csharp/.net/net_asp/mvc/using-display-templates-and-editor-templates-in-asp.net-mvc.htm
  8. Once you have decided on the ViewModel classes that you will use for your Controller and Views, you will want to change the generic FormCollection parameters on your Create, Edit and Delete MVC Controller methods to your ViewModel class (ex: MVCSampleModel model)
  9. Now you can proceed to create Views for each of your corresponding Controller actions and then fill in the necessary logic in your Controller actions to store and persist your ViewModel classes to your backing data store, knowing that all of your MVC Views are already strongly typed and have much of the validation logic built in!
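As a rough sketch of step 8, a Create action rewritten to accept a ViewModel (MVCSampleModel here stands in for the ViewModel named in the example above) might look like:

```csharp
using System.Web.Mvc;

public class MVCSampleController : Controller
{
    // POST: /MVCSample/Create
    // The scaffolded FormCollection parameter has been replaced with the
    // strongly typed ViewModel described in step 8
    [HttpPost]
    [ValidateAntiForgeryToken]
    public ActionResult Create(MVCSampleModel model)
    {
        if (!ModelState.IsValid)
        {
            // DataAnnotations on the ViewModel drive this validation check
            return View(model);
        }

        // Persist the ViewModel via your repository/data access layer here
        return RedirectToAction("Index");
    }
}
```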

Monday, December 22, 2014

Using Fiddler to test your ASP.NET Web API application

If you are developing an ASP.NET Web API application, you may want to be able to periodically test your ASP.NET Web API services without writing a Web Page and some JavaScript.

Fortunately, you can do just that using Telerik Fiddler which can be downloaded from here: http://www.telerik.com/fiddler

You basically just use the Fiddler Composer to compose your JSON requests to your Web API endpoints in the Request Body:

So how do you figure out how to properly compose the Request Body using JSON?

Well, fortunately, with Web API 2, you can use the /help feature to view all of your ASP.NET Web API routes/endpoints.

If you then click on the hyperlink for one of your routes, you will be able to see a sample of how to compose a JSON Request to send to your Web API Service.  You can then use Fiddler to issue the appropriate GET, PUT, POST, PATCH or DELETE request and view the resultant JSON response!
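For example, a PUT request composed in the Fiddler Composer might look like the following (the endpoint, port number and JSON fields are all hypothetical placeholders):

```http
PUT http://localhost:52874/api/products/1 HTTP/1.1
Content-Type: application/json

{
  "Id": 1,
  "Name": "Widget",
  "Price": 9.99
}
```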

How neat is that??


Calling ASP.NET Web API from an ASP.NET MVC Controller

In many cases, you will find examples all over the Internet on how to call ASP.NET Web API from JavaScript using a framework such as jQuery or AngularJS.  However, what if you want to call an ASP.NET Web API application directly from ASP.NET MVC?

You may ask, why would you want to do that? 

Well, the biggest reason for doing this is that it improves testability.  In order to verify that your JavaScript calls work properly, you would have to use a JavaScript Unit Testing framework, and few of them integrate nicely with Visual Studio.  In addition, JavaScript Unit Testing frameworks are not nearly as mature as .NET/C# Unit Testing frameworks, which can use features such as Exception Handling, Mocking and Dependency Injection.

Therefore, once you have committed to calling ASP.NET Web API from your ASP.NET MVC application, how exactly do you accomplish this?

Well, fortunately, there is an article on how to do just that right here: http://www.asp.net/web-api/overview/advanced/calling-a-web-api-from-a-net-client

Though the article does not mention ASP.NET MVC directly, since an MVC application is simply a .NET client to your ASP.NET Web API application, the same rules apply.  It basically just leverages the HttpClient class to send requests to your ASP.NET Web API application and read the responses.
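Here is a minimal sketch of such a call, assuming a hypothetical api/products route (the base address and route are placeholders to be replaced with your own Web API Urls):

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

public class ProductsClient
{
    // Base address is an example; substitute your own Web API Url
    private static readonly HttpClient client = new HttpClient
    {
        BaseAddress = new Uri("http://localhost:52874/")
    };

    public static async Task<string> GetProductAsync(int id)
    {
        client.DefaultRequestHeaders.Accept.Clear();
        client.DefaultRequestHeaders.Accept.Add(
            new MediaTypeWithQualityHeaderValue("application/json"));

        // HttpClient issues the GET just as a JavaScript client would
        HttpResponseMessage response = await client.GetAsync("api/products/" + id);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync();
    }
}
```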

That is all there is to it!


Sunday, December 21, 2014

Jetbrains TeamCity vs. Team Foundation Build

If you are already using Team Foundation Server, you may wonder whether it is better to simply use Team Foundation Build or use an alternative Continuous Integration build platform such as Jetbrains TeamCity.

Here is a brief comparison of the two platforms that you can use in making your decision:

  1. Jetbrains TeamCity is a Java-based platform that runs completely on the Web.  Team Foundation Build is a .NET/Windows Workflow based platform that largely needs to be configured and managed through the Visual Studio IDE.
  2. Jetbrains TeamCity can be set up and running in less than an hour with most build processes, while Team Foundation Build can take considerably longer if you are doing anything more complex than building your solution and running your Unit Tests.
  3. Jetbrains TeamCity is overall simpler to use and much more readily customizable than Team Foundation Build.  It also integrates well with a large variety of 3rd party plugins which are simply not available to Team Foundation Build.
  4. Team Foundation Build has a feature called Gated Check-In which prevents checking in code which breaks the build, while Jetbrains TeamCity has a pre-tested commit feature.
  5. Jetbrains TeamCity has a build definition dependency feature which allows triggering one build based on the output of another build definition.  Team Foundation Build has yet to introduce this feature even in Team Foundation Build 2013.
  6. Jetbrains TeamCity provides some handy features such as built-in runners for FxCop and bundled code coverage tools such as dotCover which allow you to overcome Visual Studio edition limitations. For example, if you are only licensed to use Visual Studio Professional Edition, you will not be able to run Code Coverage on the Unit Tests on your solution without Visual Studio Premium Edition.  However, if you are running a CI Server such as Jetbrains TeamCity, you will still be able to obtain Code Coverage reports from your CI Builds.
  7. Jetbrains TeamCity upgrades are relatively painless while Team Foundation Build upgrades can be quite time consuming depending on how many customizations were made to the original TFS Build Process templates.
Overall, unless you ABSOLUTELY need the features and functionality of Team Foundation Build, I would recommend using Jetbrains TeamCity since it is far easier and more intuitive to get up and running.  Because it is entirely web based, it is much easier to manage and maintain as a development team as well.

If you need the features of Team Foundation Build but also like what Jetbrains TeamCity has to offer, then you may be able to combine the two solutions to meet your needs.  You can use Team Foundation Build for many of your mainstream build processes, such as building solutions and running Unit Tests, while leveraging Jetbrains TeamCity for any major customization tasks such as Web.config transformations, Deploy operations, Code Analysis, Code Metrics, Code Coverage etc.  This approach allows you to easily manage upgrades to Team Foundation Server without fear of losing all of your existing build customizations, while continuing to maintain an existing CI Build Server through your upgrade process.

Saturday, December 20, 2014

Manage your NuGet Packages as part of your Continuous Integration build process

With NuGet packages now managing many of your project references, it becomes increasingly difficult to keep track of all of the various package updates that may be required to support new features in your project.

Well, fortunately, Microsoft has offered a handy solution with the release of Visual Studio 2013 to ensure that the building of your solutions in your Continuous Integration build process goes smoothly.

The new feature is called "Enable NuGet Package Restore".  This allows you to configure your solution to automatically restore any missing NuGet package references at build time.  Therefore, if some of the packages are missing from your source control repository, there is no need to worry!  With this option enabled, your build process will automatically download the necessary NuGet packages as part of your CI Build process to ensure you have successful builds each time.

This is especially important with how rapidly technologies such as Entity Framework, ASP.NET Web API and ASP.NET MVC are evolving.  It seems like there are new versions of these frameworks almost on a weekly or at least monthly basis. 

Below are screenshots on how to set up your Visual Studio solutions to utilize this great new feature:


Path Variable handling in InstallShield vs. Advanced Installer

If you have to deal with Path Variables in your build/packaging IDE in order to manage and maintain varying file/folder paths as an installation package is moved around between machines (ex: between a local development machine and a build/CI Server), there are two very different approaches used by InstallShield and Advanced Installer.

InstallShield offers a much more open and flexible approach to dealing with Path Variables by allowing you to define them yourself without forcing you into predefined values for your Path Variables as shown below:

Advanced Installer takes a very different approach and is much more restrictive in how you can manage Path Variables.  You can create your own Path Variables, but they have to be confined to pre-existing values such as a folder on disk, an environment variable or a registry value.  In addition, you have to use the Convert Paths wizard to convert many of the paths in your project to Path Variables.  There is no easy way to convert paths to already-defined Path Variables in your project.


Personally, I like InstallShield's approach to Path Variables much more than the approach taken by Advanced Installer but, overall, I prefer the Advanced Installer IDE to InstallShield's IDE.  Therefore, I truly hope that the Advanced Installer team implements a solution to handling Path Variables much more similar to what is offered by InstallShield in order to greatly improve the management and maintenance of installation packages as they move between environments. 

Friday, December 19, 2014

Handling spaces for file paths in MSBuild

If you need to deal with file/folder paths that contain spaces, this can be problematic in MSBuild, since MSBuild does not automatically quote file paths with spaces the way the command shell expects.

Fortunately, there is an easy solution to handling these file/folder paths that contain spaces in them.

My preference is to encapsulate them in a PropertyGroup element and assign them to Properties.  Then I surround them with the escaped sequence for double quote characters (&quot;) and simply use these Properties throughout my MSBuild project file!
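Here is a minimal sketch of that approach (the tool path and target name are hypothetical):

```xml
<Project ToolsVersion="12.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <!-- This path contains spaces, so wrap it in escaped double quotes -->
    <ToolPath>C:\Program Files (x86)\Sample Tool\SampleTool.exe</ToolPath>
    <QuotedToolPath>&quot;$(ToolPath)&quot;</QuotedToolPath>
  </PropertyGroup>
  <Target Name="RunTool">
    <!-- The quoted Property can now be used safely in Exec commands -->
    <Exec Command="$(QuotedToolPath) /arg1" />
  </Target>
</Project>
```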


Thursday, December 18, 2014

Setting up Jetbrains TeamCity to run Visual Studio Unit Tests

If you are using Jetbrains TeamCity as your CI (Continuous Integration) server, there is a good chance that you also want to be able to run Unit Tests as part of your CI Build processes.

Well, fortunately, TeamCity includes a built-in MSTest Build runner to run your Unit Tests as well as bundles a version of Jetbrains dotCover in order to allow you to obtain Code Coverage results if you are not running a version of Visual Studio Premium edition or higher.

In order to set up TeamCity to run Unit Tests, you will need to do the following:

  1. Make sure that your previous build steps are using the Debug configuration.  If they are using the Release configuration, you will not have the necessary .pdb files in order to successfully run your Unit Tests and obtain your Code Coverage results.  
  2. Create a build step which uses the MSTest build runner
  3. Specify the version of MSTest that you are using on your build machine
  4. On each line of the "List assembly files" text area, specify the FULL PATH to each of the Unit Test assemblies.  Therefore, if the full path of the Unit Test assembly is something like %system.teamcity.build.workingDir%\MyUnitTests\bin\Debug\MyUnitTests.dll you will have to specify this full path. 
  5. For Code Coverage using Jetbrains dotCover, simply choose Jetbrains dotCover as the Code Coverage tool.
  6. Run your Build Step to ensure that it properly executes your Unit Tests and returns your Code Coverage results.
That is all there is to it!!


Saturday, December 6, 2014

Add a Help Page to your existing ASP.NET Web API Project

If you are using Visual Studio 2013 and you create a brand new ASP.NET Web API project, chances are you will already have the necessary structure to use a Help Page for viewing all of the available ASP.NET Web API routes in your project.

However, what if you have an older ASP.NET Web API project that does not have this feature?

Well, fortunately, this is available through a NuGet package called Microsoft.AspNet.WebApi.HelpPage.

You can read more about this NuGet package here: https://www.nuget.org/packages/Microsoft.AspNet.WebApi.HelpPage

The nice thing about this package is that it displays ALL of your routes, including the custom routes you have defined by decorating your Web API Controllers with attributes!  This makes it easy to discover and document the available RESTful services in your ASP.NET Web API project.

Friday, December 5, 2014

VMWare Workstation 11 now available!

VMWare Workstation 11 was just recently released and it promises to be the best release of VMWare Workstation thus far!!

You can read more about VMWare Workstation 11 here: http://www.vmware.com/products/workstation

You can see how VMWare Workstation 11 compares to older versions of VMWare Workstation here: http://www.vmware.com/products/workstation/compare

VMWare is currently offering promotional pricing on VMWare Workstation 11 that allows you to get it at a 30% discount off of the regular price: Buy VMWare Workstation 11

Gated Check-in Builds in Team Foundation Server 2013

If you used Gated Check-In Builds previously in Team Foundation Server 2010, you may have noticed that there was no way to prevent people from overriding and bypassing the Gated Check-In build.

This became very frustrating because a developer could commit their check-in to TFS Source Control and completely avoid the validation checks that were part of the Gated Check-In Build!

Fortunately, this seems to have been remedied in Team Foundation Server 2013, with the addition of a new build permission called "Override check-in validation by build" in the Security administration area of Team Foundation Server.

Below is a screenshot of the new permissions available with Team Foundation Server 2013 builds:


So now you have a way to actually FORCE your development team to follow Gated Check-In build guidelines!!

Thursday, November 27, 2014

Connecting to LocalDB using SQL Server Management Studio

If you have an application (such as a sample Visual Studio application) that uses LocalDB and you want to move the application to IIS for scalability reasons, odds are, you will need to connect to the LocalDB instance, create a backup and then move it to the full version of SQL Server.

So, how do you connect to your LocalDB instance using SQL Server Management Studio?

Well, actually, the answer is very simple!

You can simply connect to LocalDB using the following server name in SQL Server Management Studio:
(localdb)\V11.0
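Similarly, an application connection string pointing at the same LocalDB instance might look like the following (the database name shown is hypothetical):

```xml
<connectionStrings>
  <!-- Example connection string; "SampleDb" is a placeholder database name -->
  <add name="DefaultConnection"
       connectionString="Data Source=(localdb)\V11.0;Initial Catalog=SampleDb;Integrated Security=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```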

Of course, when you take a backup of the LocalDB database, you will only be able to specify a location under the original User Profile, such as Users\<username>\<database name>.bak




However, once you have the database backup, you can readily move it over to the full version of SQL Server!

Saturday, November 22, 2014

Visual Studio 2013 and Team Foundation Server 2013 Update 4

I recently had to update my installations of Visual Studio 2013 and Team Foundation Server 2013 to Update 4, and I found it surprisingly difficult to get the right results within the first few Google searches.  In addition, it is not readily apparent whether the TFS 2013 Update 4 installation package is different from the Visual Studio 2013 Update 4 package.

For your reference, I am including the direct links to Visual Studio 2013 Update 4 and Team Foundation Server 2013 Update 4 respectively:

Visual Studio 2013 Update 4: http://www.microsoft.com/en-us/download/details.aspx?id=44921

Team Foundation Server 2013 Update 4: http://www.microsoft.com/en-us/download/details.aspx?id=44911

Tuesday, November 18, 2014

Setting up Team Foundation Server 2013 in the Windows Azure Cloud

If you have a need to use an on-premises Team Foundation Server (rather than Visual Studio Online) and lack the internal hardware to host it yourself, hosting it in the Windows Azure cloud is a great option.

However, there are numerous caveats and stumbling blocks to hosting it in the Windows Azure cloud.  The biggest of these, of course, is making all of the facets of your Team Foundation Server 2013 installation available over the Internet.

Without introducing SharePoint into the mix, these are the steps that you will need to follow in order to make your TFS instance available in the Windows Azure cloud:

  1. From the Team Foundation Server Administration Console, you will need to change the URLs to externally facing URLs (ex: myserver.cloudapp.net)
  2. In SQL Server Reporting Services Manager, you will need to add additional Report URLs that point to the externally facing URLs (ex: reportserver.cloudapp.net)
  3. In the Team Foundation Server Administration Console, in the Reporting section, you will need to edit the Reporting URLs to point to the externally facing Report Server URLs (ex: reportserver.cloudapp.net)
  4. If you are having difficulty getting the servers to recognize the external DNS name, you will probably have to add the BackConnectionHostNames registry key to get around this problem: http://support.microsoft.com/kb/896861 or http://www.technologytoolbox.com/blog/jjameson/archive/2013/05/24/powershell-scripts-for-managing-backconnectionhostnames-kb-896861.aspx
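As a rough sketch of step 4, the BackConnectionHostNames value from the KB article can be added with a couple of lines of PowerShell (the host names below are placeholders for your own external DNS names):

```powershell
# Sketch: add the BackConnectionHostNames multi-string value described in KB 896861
New-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0" `
    -Name "BackConnectionHostNames" -PropertyType MultiString `
    -Value @("myserver.cloudapp.net", "reportserver.cloudapp.net")
# A reboot (or at least a restart of IIS) is typically required afterwards
```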

This should resolve your major issues with setting up a TFS 2013 instance in the Windows Azure cloud!

Wednesday, November 12, 2014

Visual Studio 2015 now available for download!

Visual Studio 2015 Preview has just been made available for download:  http://www.visualstudio.com/en-us/downloads/visual-studio-2015-downloads-vs

There are numerous welcome enhancements, but the one that intrigues me the most is the introduction of the Lightbulb icon used while refactoring code.  I have seen the Lightbulb icon ever since I started using ReSharper for Visual Studio years ago, and it seems that Microsoft is now taking a page out of the ReSharper playbook to offer a similar (though probably reduced) set of features.


Saturday, November 8, 2014

Configure SMTP SendGrid for applications that do not support SMTP Authentication

If you are using SendGrid to send e-mails from within your Windows Azure VMs, you may encounter numerous applications that do not natively support configuring authentication for outgoing SMTP e-mail messages.  One such application is SharePoint 2013.

Fortunately, the folks at SendGrid have written a nice little article on configuring IIS to relay authenticated e-mail messages through SendGrid, as well as on testing out the e-mail functionality through a nice little set of telnet commands.

You can find the article on configuring IIS SMTP for use in conjunction with SendGrid here: https://sendgrid.com/docs/Integrate/Mail_Servers/iis75.html
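If you would rather not type out the telnet commands by hand, a quick sanity check of the local IIS SMTP relay can also be done from PowerShell (the addresses below are placeholders; this assumes you run it on the VM hosting the IIS SMTP server):

```powershell
# Send a test message through the local IIS SMTP relay,
# which in turn relays it out through SendGrid
Send-MailMessage -SmtpServer "localhost" -Port 25 `
    -From "noreply@example.com" -To "you@example.com" `
    -Subject "SMTP relay test" -Body "Sent via IIS SMTP and SendGrid"
```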

Friday, November 7, 2014

Configuring SharePoint 2013 with Team Foundation Server 2013

I recently had a requirement to set up a SharePoint 2013 server AFTER I had already set up and installed my Team Foundation Server 2013 instance.  Normally, I use the built-in capability of installing SharePoint Foundation 2013 as part of the TFS installation and have not messed with setting up a separate instance of SharePoint 2013 on a completely different server to integrate with TFS 2013, so this was a new experience to say the least.

Fortunately, I found this article which guided me through most of the difficult parts of setting up and configuring the SharePoint installation to integrate with TFS 2013: http://nakedalm.com/integrate-sharepoint-2013-with-team-foundation-server-2013/

Of course, if you read through the MSDN documentation, there are lots of things to consider when installing SharePoint 2013 in order to ensure that everything properly integrates with TFS 2013: http://msdn.microsoft.com/en-us/library/dd578615.aspx

In my case, I used AutoSPInstaller (http://autospinstaller.codeplex.com/) to install SharePoint 2013, so many of the manual steps for setting up my SharePoint installation were already taken care of for me.

I then followed the steps for installing the TFS Extensions on my SharePoint installation: http://msdn.microsoft.com/en-us/library/hh548140.aspx

After doing this, I was ready to go into my TFS Server and complete all of the remaining integration points to get my SharePoint installation up and running with all of the necessary TFS features!

Tuesday, October 28, 2014

Testing Responsive Design using Mozilla Firefox

If you are developing a website that uses Responsive Design techniques, there are several tools available for you to test Responsive Design.

One of the most commonly used tools for testing Responsive Design has been the Opera Mobile Emulator: http://www.opera.com/developer/mobile-emulator

However, if you are not concerned with specific devices that have different screen sizes and you just want to test out the media queries that are part of your Responsive Design, you can do this directly in Mozilla Firefox!

In Mozilla Firefox, use the Shortcut Key Combination: Ctrl + Shift + I.  This will bring up the Developer Tools.

On the right hand side of the Developer Tools, you will find a button for Responsive Design:


 You can then click on the Responsive Design button to highlight/select Responsive Design:



Once the Responsive Design button has been selected, you should see a screen like the following in Firefox that will allow you to choose various different screen sizes to allow you to test out your media queries:

In addition, if you will be bouncing back and forth between Responsive Design view and normal browsing, you can toggle it directly with the Shortcut Key Combination Ctrl + Shift + M!
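For reference, the media queries you would be exercising with this tool look something like the following (the breakpoint and rules below are just an illustrative sketch, not from any particular site):

```css
/* Example breakpoint: collapse the layout below a 600px-wide viewport */
@media screen and (max-width: 600px) {
    .sidebar { display: none; }   /* hide secondary content on small screens */
    .content { width: 100%; }     /* let the main content span the full width */
}
```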






Tuesday, October 21, 2014

Provide feedback on the next version of Windows!!

Microsoft has finally decided to allow users to provide feedback on what features will go into the next release of Windows (Windows 10).

You can offer suggestions and vote on existing suggestions here: https://windows.uservoice.com/forums/265757-feature-suggestions/filters/top

You can read about some of the top requested features here: http://www.pcworld.com/article/2835218/windows-10-the-top-10-features-users-want-microsoft-to-add.html#tk.nl_today

Friday, October 17, 2014

Free PowerShell Editor - PowerShell Plus

If you are used to using PowerGUI, you may already know that since Dell acquired Quest Software (the makers of PowerGUI), there has been little to no further activity or development on the product.

Fortunately, Idera has stepped up to the plate and has released PowerShell Plus, a free PowerShell Editor with many of the same features as PowerGUI:

https://www.idera.com/productssolutions/freetools/powershellplus/powershellregistrationform/thankyouconfirmation

Overall, the look and feel is much more modern than PowerGUI and it is nice to know that PowerShell Plus natively supports PowerShell v. 3.0 and above (no support is provided for PowerShell 2.0 and below).

I am looking forward to using this editor as my replacement to PowerGUI for developing all of my PowerShell scripts and to fill in the gaps left by the PowerShell ISE editor.

Friday, October 10, 2014

Unable to launch Enterprise Library 6 Configuration Console

I have worked with Enterprise Library in the past and have just recently begun working on projects that use .NET v. 4.5 or later.  Since earlier versions of Enterprise Library did not support .NET Framework v. 4.5, this was one of the first times I tried using Enterprise Library 6.

I went ahead and downloaded the Enterprise Library from here: http://www.microsoft.com/en-us/download/details.aspx?id=38789

However, when I extracted the EnterpriseLibrary6-binaries package, I discovered that I was unable to launch the Enterprise Library Configuration Console!  Each time I attempted to launch it, I would get an exception message dialog indicating that the application had crashed!

This was unusual since, with previous versions of Enterprise Library, I had always been able to simply extract the files and run the Configuration Console.

Well, after further examination, I found the following text in a ReadMe file:

MICROSOFT ENTERPRISE LIBRARY 6

Summary: This package contains Enterprise Library configuration console, MSMQ distributor service, merge configuration tool and a script to download binaries for all application blocks from NuGet.

In order to get all the binaries, run the install-packages.ps1 script.

Note: For the Semantic Logging Application Block Out-of-Process service, a separate package is available for download.

The most up-to-date version of the release notes and known issues is available online:
http://aka.ms/el6release


Microsoft patterns & practices
http://microsoft.com/practices

Aha!  So I had to run a PowerShell script in order to download all of the necessary binaries to get the Enterprise Library Configuration Console to run.



Of course, after I ran the install-packages.ps1 PowerShell script, I was able to successfully launch the Enterprise Library Configuration Console!!
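For reference, running the script is as simple as the following (the Set-ExecutionPolicy line is only needed if script execution is blocked on your machine):

```powershell
# From the folder where EnterpriseLibrary6-binaries was extracted:
Set-ExecutionPolicy -Scope Process RemoteSigned   # only if scripts are blocked
.\install-packages.ps1                            # downloads the block binaries from NuGet
```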

Send e-mails through Windows Azure

If you want to send e-mail messages through Windows Azure, you can go through the trouble of setting up IIS SMTP and then configuring it to work with your e-mail provider or host (such as GMail, Yahoo!, GoDaddy, etc.), or you can use one of the various SMTP hosting providers that offer SMTP relay services.

If you are using Windows Azure, using an SMTP Relay service becomes easier with SendGrid (http://sendgrid.com/)

SendGrid is available as an add-on to Windows Azure and it offers a variety of pricing options including sending 25,000 e-mail messages per month for FREE!!!

Best of all, you can simply open the standard SMTP port 25 on your Windows Azure Virtual Machine and then configure your application to use your SendGrid user credentials to send e-mail messages.  That is all there is to it!
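For a typical .NET application, that configuration boils down to a standard mailSettings section in web.config along the lines of the following sketch (the from address and credentials are placeholders for your own SendGrid account):

```xml
<!-- Sketch: point System.Net.Mail at the SendGrid SMTP relay -->
<system.net>
  <mailSettings>
    <smtp from="noreply@example.com">
      <network host="smtp.sendgrid.net" port="25"
               userName="your-sendgrid-username" password="your-sendgrid-password" />
    </smtp>
  </mailSettings>
</system.net>
```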


Tuesday, October 7, 2014

Deleting a Team Foundation Server project

If you need to delete a Team Foundation Server Team Project, you can delete the Team Project directly from the Team Foundation Server Admin Console as is outlined here: http://msdn.microsoft.com/en-us/library/ff357756.aspx

Of course, there is a chance that some of the artifacts from the Team Project may remain even after the deletion; therefore, the recommended strategy for deleting a Team Project is to use the command line as outlined here: http://msdn.microsoft.com/en-us/library/ms181482.aspx

While in most scenarios you will probably not need to delete a Team Project, this option is handy if you are performing a migration from another source control system (such as VSS), where the migration may not have completed successfully.  In those instances, it is useful to simply delete the Team Project, re-create it and restart the migration.
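For reference, the command-line deletion described in the second article looks roughly like this (run from a Visual Studio command prompt; the collection URL and project name below are placeholders):

```powershell
# Sketch: permanently delete a Team Project and its artifacts
TFSDeleteProject /collection:http://myserver:8080/tfs/DefaultCollection MyTeamProject
```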


Resuming a failed VSS Upgrade

I was performing a VSS Upgrade to TFS recently and after many hours of running, the VSS Upgrade seemed to freeze and not continue processing.  As you can probably guess, canceling the operation had no effect, thus requiring me to kill the actual VSS process.

I wasn't sure about the state of the migration at that point, but I decided to re-launch the VSS Upgrade wizard just to see what options were available.

Below is the dialog that I saw:

Upon re-launching the VSS Upgrade Wizard, I was presented with the option to resume a previous upgrade!  Therefore, all of the hours spent on performing the analysis and migration could potentially be saved!

Of course, you will have to determine whether or not this option is viable for you.  If you are migrating a recent VSS 2005 database that returned with a clean analysis, a resume may work out just fine.  However, if you are working with an older VSS 6.0 database that might be on its last legs due to the numerous errors in the database, it is probably safer to simply start the upgrade all over again.


Monday, October 6, 2014

Upgrading older versions of Visual SourceSafe databases

I recently was working on a project for a client where they had a Visual SourceSafe database dating back more than 15 years!

As you can probably guess, they wanted to migrate that old VSS database to TFS as well.  I was able to successfully open the database in VSS 2005 and run the analyze and repair but when I went to run the VSS to TFS upgrade, I got the following error message:


Well, as it turns out, the database had been created with a much older version of VSS than v. 6.0!  Therefore, it required an upgrade before I could successfully migrate the source code over to TFS.

Fortunately, MSDN had an article on how to run the upgrade on the VSS database to prepare it for migration to TFS:  http://msdn.microsoft.com/en-US/library/x8d2h3te%28v=vs.80%29.aspx

I had to run the ddupd command against the VSS database in order to upgrade it to the latest version of VSS (VSS 2005). 

Once I upgraded the VSS database, I was able to successfully run the VSS to TFS upgrade wizard and begin the migration process!
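If memory serves, the ddupd invocation looks roughly like the following (the path below is a placeholder for the folder containing your VSS database):

```powershell
# Sketch: upgrade an older VSS database in place so the VSS 2005 tools can read it
# (run from the VSS 2005 installation folder; the database path is a placeholder)
.\ddupd.exe "C:\VSSDatabases\OldProject"
```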

Friday, October 3, 2014

Windows 10 Technical Preview available for download

Windows 10 Technical Preview was released just a few days ago, and you can now download it and test it out.

You can read more about the new and exciting features in Windows 10 here: http://blogs.windows.com/business/2014/09/30/introducing-windows-10-for-business/

You can grab a copy of Windows 10 Enterprise edition here: http://www.microsoft.com/en-us/evalcenter/evaluate-windows-technical-preview-for-enterprise?i=1

If you want the Home version of Windows 10, you can grab that here: http://windows.microsoft.com/en-us/windows/preview-download?ocid=tp_site_downloadpage

Tuesday, September 30, 2014

Removal/uninstall of AVG Antivirus fails

I was recently helping a friend install McAfee Total Protection antivirus on their computer.  As many people know, antivirus software from one company is generally not compatible with another company's antivirus solution.

So, of course, McAfee Total Protection attempted to uninstall AVG Antivirus from the system.  Unfortunately, at the end of the uninstall process, AVG displayed an error message that it could not be removed!

Well, after several attempts to uninstall it from the machine, we did a quick Google search and found that this is a common issue for many users; it is readily solved by running a removal tool that AVG provides directly from their website: http://www.avg.com/us-en/utilities

After running the AVG Remover tool, I was able to successfully remove AVG Antivirus from the system and install McAfee Total Protection!

Hooray!!

Monday, September 29, 2014

Assigning Static IP Addresses to Windows Azure VMs

If you need to configure a Windows Azure virtual network and want to assign Static IP addresses to your Windows Azure Virtual Machines, you can do so directly through PowerShell:

http://www.bhargavs.com/index.php/2014/03/13/how-to-assign-static-ip-to-azure-vm/

This method essentially sets up an IP address binding for the specific VMs in your virtual network, similar to how you would set up a static IP reservation by MAC address on your home router.

This avoids the trouble of having to statically set the IP Address in the Network Adapter settings and allows it to be controlled centrally through the Windows Azure Virtual Network.

Therefore, if you need to set up communication between your Windows Azure VMs, this is a handy way to ensure that the Windows Azure VMs that are communicating with each other do not change IP addresses between reboots or other VM operations.  For example, if you are setting up TFS (Team Foundation Server) in the Windows Azure cloud, you may need to rely on your TFS Build machine reliably deploying to the same server on a consistent basis.  If the IP Address of your deploy VM changes, this can break your build and deploy process and therefore ruin your overall CI (Continuous Integration) strategy.  By setting the IP address statically, you can avoid the headaches associated with managing your CI strategy across multiple Windows Azure VMs.

NOTE: One thing to note when assigning static IP addresses to your Azure VMs is that if you have already configured Primary and Secondary DNS servers on your Network Adapter, those values will be wiped out immediately after the static IP address is assigned.  You will have to re-add this information to the Network Adapter once the static IP address has been assigned to the Azure VM.
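The linked article boils down to a short Azure PowerShell pipeline along these lines (the virtual network, service name, VM name, and IP address below are placeholders; this assumes the classic Azure PowerShell module of that era):

```powershell
# Sketch: check that an address is available, then pin it to an existing VM
Test-AzureStaticVNetIP -VNetName "my-vnet" -IPAddress "10.0.0.10"

Get-AzureVM -ServiceName "my-cloud-service" -Name "my-vm" |
    Set-AzureStaticVNetIP -IPAddress "10.0.0.10" |
    Update-AzureVM
```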

Sunday, September 28, 2014

Switch between apps slideout does not go away

I just recently bought a Windows 8.1 machine that works both as a tablet and as a regular laptop with a touchscreen.

I was launching a Metro app from the Metro Start menu, and as soon as my Metro application opened, I ended up with a "Switch between apps" slideout that did not automatically go away!

No matter what I did in the Metro Start menu, I could not make this slideout disappear!! 

Fortunately, after doing a bit of searching for posts on this problem, I discovered that you simply have to hover your mouse in the upper left-hand corner of the screen to get this "Switch between apps" slideout to disappear!

I even found a YouTube video which demonstrates how to resolve this annoying problem/issue: https://www.youtube.com/watch?v=7-x6sQ0WNuo

Wednesday, September 24, 2014

Running Virtual Machines in Windows Azure Virtual Machines

If you want to leverage the existing Windows Azure Virtual Machine infrastructure to run older/legacy VMs such as Windows XP and Windows Server 2003, fortunately, you can do just that!

Here is an article that describes how to do just that: http://fabriccontroller.net/blog/posts/running-a-legacy-os-in-a-windows-azure-virtual-machine-windows-xp-windows-server-2003/

As you can read in the post, only Oracle VirtualBox supports running inside a Windows Azure Virtual Machine.

However, there are also a few points when setting up the VirtualBox VM that are not mentioned:


  1. You should install the Oracle VirtualBox Extension Pack in order to ensure everything works regarding mouse and keyboard input etc.
  2. You will probably need to add a Windows Firewall exception in the Windows XP machine or turn off the Windows Firewall completely
  3. You need to set up the VM using NAT in order to use the Port Forwarding feature.  You can set up NAT either in the Preferences of Oracle VirtualBox or configure it on a machine-by-machine basis.  Using other networking options will not work.
  4. When you set up NAT, you will need to configure the Guest IP address as the target for Port Forwarding.  If the Guest IP address is not configured, Port Forwarding may not work.
  5. You may also need to set up the Remote Display feature in order to get RDP to work correctly (Display-->Remote Display)
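As a sketch of points 3 and 4, the NAT port-forwarding rule can also be created from the command line with VBoxManage (the VM name, guest IP, and ports below are placeholders):

```powershell
# Sketch: forward host port 3390 to the guest's RDP port 3389 over NAT
# Rule format: name,protocol,hostip,hostport,guestip,guestport
& "C:\Program Files\Oracle\VirtualBox\VBoxManage.exe" `
    modifyvm "WinXP-Guest" --natpf1 "rdp,tcp,,3390,10.0.2.15,3389"
```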

Leveraging the Windows Azure Virtual Machine infrastructure in this way is very handy if you need to do some testing on older legacy Windows OSes until you phase out support for them completely.  As many IT users know, numerous applications still only work on Windows XP, which is why it has been so difficult for Microsoft to completely drop support for Windows XP for so many years.

As an experiment, I decided to see if I could also get Windows 7 or Windows 8/8.1 running on VirtualBox inside of a Windows Azure VM and I got the error message "AMD-V is not supported".  Therefore, you can only run older legacy OSes on VirtualBox inside of a Windows Azure VM.