Rosher Consulting

Software Consulting and Development Services

Angular Tips: Measuring Rendering Performance

Recently I’ve had to try and improve the performance of a page that had too many watches. Specifically, I had a table where each row was generated with an ‘ng-repeat’, and within each row a ‘select’ element whose options were also generated with an ‘ng-repeat’. The ‘select’ had around 200 options, each with a binding for both the value and the text; the number of rows varied but averaged around 5, and each row had a few bindings of its own. Tally that up and you’re looking at over 2,000 bindings, which, combined with the rest of the page, was quite simply too much.

There are many ways of improving the performance of this page, but before attempting any changes I wanted to be able to measure the rendering performance, so that I would have something to compare my changes against.

I’d previously read the blog post from Scalyr on improving Angular rendering performance, in which they mentioned that they ‘detected the finish of the $digest cycle’ in order to measure rendering performance. I was curious how they did this and nothing obvious popped up in my searching; however, I then noticed a question in the comments to which Scalyr responded, pointing me in the right direction. To quote:

‘The approach we used was to use a $provider decorator on the $rootScope service to monkey patch $rootScope.$digest. In the override method, we simply record the start time, invoke the original $digest method, and then capture the end time.’

From this I was able to determine that I needed to override the regular ‘$digest’ method and replace it with my own, so I came up with the following simple override:

angular.module('myApp').run(['$rootScope', function ($rootScope) {
    // Keep a reference to the original $digest, then swap in a timed version.
    var $oldDigest = $rootScope.$digest;
    var $newDigest = function () {
        console.time("$digest");
        // Use 'this' so the patch also works for child scopes, which
        // inherit $digest from $rootScope.
        $oldDigest.apply(this, arguments);
        console.timeEnd("$digest");
    };
    $rootScope.$digest = $newDigest;
}]);

This calls 'console.time' when the ‘$digest’ starts and 'console.timeEnd' when it finishes, which allowed me to see how long each render of the DOM takes. Combined with some logging in my controller, I was able to see that after my data had been loaded from the server, it took around 3-4 seconds to render it to the page and for the page to be ready for input.
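Scalyr’s own approach used a $provide decorator rather than patching from a run block. The wrapping idea itself is plain JavaScript, so here’s a framework-free sketch of it; the Angular wiring in the comment is illustrative only and hasn’t been tested against any particular Angular version:

```javascript
// Wrap any function so each call is timed with console.time/timeEnd.
// This is the same monkey-patching idea, extracted so it works anywhere.
function timeCalls(fn, label) {
    return function () {
        console.time(label);
        try {
            return fn.apply(this, arguments);
        } finally {
            console.timeEnd(label);
        }
    };
}

// Angular wiring via a $provide decorator (sketch):
// angular.module('myApp').config(['$provide', function ($provide) {
//     $provide.decorator('$rootScope', ['$delegate', function ($delegate) {
//         $delegate.$digest = timeCalls($delegate.$digest, '$digest');
//         return $delegate;
//     }]);
// }]);
```

The decorator has the advantage of wrapping ‘$digest’ before anything else gets a reference to it, but both routes give you the same timing output.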

With some useful measurements to hand, I could now begin the process of making some improvements.

Angular Tips: Displaying an Ajax loading spinner using a custom Interceptor

I think one of the reasons Angular has become so popular so quickly is that it has a series of conventions that make extending it really easy.

A common requirement for any UI is to let the user know when something is happening, such as when making ajax calls to the server, and a common way to do this is to display a spinner. Prior to using AngularJS I would have handled this by creating my own module for talking to the server, which all other modules would then have had to call. There’s nothing wrong with that approach, but Angular already has its own module for dealing with ajax requests: ‘$http’, so we don’t really want to re-invent the wheel and create another module to sit on top of it.

This is where Angular’s extensibility comes in: we can hook into Angular’s ‘$http’ service by creating our own interceptor, which allows us to intercept the request to the server, display the spinner, handle the response and hide the spinner again.

Here’s a template for a custom interceptor that displays an ajax spinner when a request starts and hides it when the request finishes:

var App = angular.module('myApp', []).
    config(['$httpProvider', function ($httpProvider) {

        $httpProvider.interceptors.push('myHttpInterceptor');

    }]).factory('myHttpInterceptor', ['$q', function ($q) {
        var numRequests = 0;
        var ajaxSpinner = $("#ajaxSpinner");
        var hide = function (r) {
            if (!--numRequests) {
                ajaxSpinner.hide();
            }
            return r;
        };

        return {
            'request': function (config) {
                numRequests++;
                ajaxSpinner.show();

                return config;
            },

            'response': function (response) {
                return hide(response);
            },

            'responseError': function (response) {
                return $q.reject(hide(response));
            }
        };
    }]);
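The only subtle part of the interceptor is the request counting: with overlapping requests, the spinner must only hide when the last outstanding request finishes. Extracted from the interceptor, the counter logic looks like this (a framework-free sketch; the names are mine):

```javascript
// Minimal show/hide counter mirroring the interceptor's numRequests logic:
// the spinner hides only when every outstanding request has completed.
function createSpinnerCounter(show, hide) {
    var numRequests = 0;
    return {
        start: function () {
            numRequests++;
            show();
        },
        finish: function () {
            // Decrement first; only hide when the count reaches zero.
            if (!--numRequests) {
                hide();
            }
        }
    };
}
```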

This is pretty straightforward and adds a nice effect to our UI, but we can also take this a step further.

Whenever I want to send messages back from the server indicating success or failure I always send a standard object back, such as:

{
    Error: true,
    Message: 'An error occurred saving XYZ'
}

Now I could handle this at the point I make the call, but since this is a standard message, this seems like a perfect fit for our custom interceptor.
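Since the interceptor needs to recognise this envelope in more than one place, it can help to factor the check into a small predicate (a sketch; the helper name is mine, not part of any interceptor API):

```javascript
// True when a response body carries our standard { Error, Message } envelope,
// i.e. a boolean Error flag plus a non-empty Message.
function isServerMessage(data) {
    return !!(data && typeof data.Error === 'boolean' && data.Message);
}
```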

The following code not only shows and hides our ajax spinner, it also checks the response and if necessary displays an alert to the user, saving us from having to handle this everywhere in our code:

var App = angular.module('myApp', []).
    config(['$httpProvider', function ($httpProvider) {

        $httpProvider.interceptors.push('myHttpInterceptor');

    }]).factory('myHttpInterceptor', ['$q', function ($q) {
        var numRequests = 0;
        var ajaxSpinner = $("#ajaxSpinner");
        var hide = function (r) {
            if (!--numRequests) {
                ajaxSpinner.hide();
            }
            return r;
        };

        return {
            'request': function (config) {
                numRequests++;
                ajaxSpinner.show();

                return config;
            },

            'response': function (response) {
                if (response && response.data && response.data.Error === true &&
                        response.data.Message) {
                    alert(response.data.Message);

                    return $q.reject(hide(response));
                }

                if (response && response.data && response.data.Error === false &&
                        response.data.Message) {
                    alert(response.data.Message);
                }

                return hide(response);
            },

            'responseError': function (response) {
                if (!response) {
                    return $q.reject(hide(response));
                }

                if (response.data && response.data.Error === true &&
                        response.data.Message) {
                    alert(response.data.Message);
                } else {
                    alert('Sorry, there was an error.');
                }

                return $q.reject(hide(response));
            }
        };
    }]);

These are just a couple of examples of what we can do with a custom interceptor to help remove boilerplate from our code; hopefully they’ve given you some ideas for your own projects.

Angular Tips: Adding functionality to existing directives

One of the features I really like in Angular, and which is not that well known, is the ability to create new directives with the same name as existing ones and thereby add new functionality to them.

When you register two directives with the same name, Angular doesn’t complain; it merely runs those directives in the order they were registered. A common scenario I use this for is creating my own ‘ng-click’ directive to provide some feedback to the user when they click a button, such as disabling it while waiting for an Ajax request to complete.

With the above scenario, if you were to look on Stack Overflow for questions about this, the most common answer would be to create your own directive, say ‘my-own-click’, and use that instead of the standard ‘ng-click’. The downside is that you have to re-implement the standard ‘ng-click’ functionality, and you have to remember to use your ‘my-own-click’ directive everywhere. If you’re working in a team, all of your co-workers also need to know to use the new directive.

A Better Approach

The following is the basic template I use in my projects to add some feedback to my buttons when an action occurs. The directive requires that you also attach an ‘ng-model’ to your button so that it can use the model variable to know when the action starts and ends:

angular.module('MyApp').directive('ngClick', function () {
    return {
        restrict: 'A',
        link: function (scope, element, attrs) {
            if (attrs.ngModel) {
                var el = element.find("span");
                var cls = el.attr("class");

                scope.$watch(attrs.ngModel, function (newValue, oldValue) {
                    if (newValue) {
                        if (el.length) {
                            el.attr("class", "glyphicon glyphicon-refresh fa-spin");
                        }
                        element.attr("disabled", true);
                    } else {
                        if (el.length) {
                            el.attr("class", cls);
                        }
                        element.attr("disabled", false);
                    }
                });
            }
        }
    };
});

The directive does a bit more than just disable the button: it also looks for a child ‘span’ element and, if one exists, assumes it is a Glyphicon as used by Bootstrap. It then changes the class of the ‘span’ to the refresh icon and adds the ‘fa-spin’ class from Font Awesome to rotate it, actively showing the user that their action has caused something to happen.

Here’s the appropriate HTML to use this directive:

<button class="btn btn-primary" ng-click="save()" ng-model="saving">
    <span class="glyphicon glyphicon-floppy-save"></span>
    Save
</button>

And here’s the ‘save’ function in your controller. This calls a save method on a service that returns a promise, so we handle both the success and failure of the promise and update the model variable appropriately:

$scope.save = function () {
    $scope.saving = true;
    myService.save($scope.data).then(function (result) {
        $scope.saving = false;
    }, function () {
        $scope.saving = false;
    });
};

As you can see, as long as you update the appropriate model variable in your controller, your button will automatically be updated without you having to remember to use the appropriate directive.
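Since both the success and failure handlers just reset the same flag, this can be tidied up with a ‘finally’ handler ($q promises support .finally() too). The sketch below uses standard Promises so it’s easy to run standalone; the helper name is mine:

```javascript
// Reset-a-busy-flag pattern: set the flag, run the async work, and clear
// the flag when the promise settles, whether it resolved or rejected.
function withBusyFlag(state, key, promiseFactory) {
    state[key] = true;
    return promiseFactory().finally(function () {
        state[key] = false;
    });
}
```

In the controller above, the save function could then clear ‘saving’ in a single .finally() callback instead of duplicating the reset in both branches.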

Obviously this isn’t totally free, in that you still need to do some work, but it could certainly be improved further, for instance by combining a custom http interceptor with the custom ‘ng-click’ directive to identify which button was clicked and update it appropriately.

Overall I love how easy Angular makes it to do things like this and your users will appreciate the feedback from your UI.

Unit Testing AngularJS with Jasmine, Chutzpah and Visual Studio

Recently I’ve been working on a project that heavily utilised AngularJS for its client side functionality. Since all of my server side C# code was unit tested, I wanted to add some unit tests to my Angular controllers and services. Unit testing C# and server side code is well understood, with plenty of tools available and support baked into Visual Studio, but the idea of unit testing JavaScript is reasonably new, which means the number of tools available to a developer (especially a .NET/C# dev) is quite limited and the IDE support minimal.

Fortunately the community has been very active in this area, so everything I needed to add unit tests to my project was available; I just had to work out how to do it.

Since I’m working with Angular, the first thing I did was look at the code samples on the Angular site, which all come with unit tests using the Jasmine testing framework. I also came across a couple of good articles on testing Angular controllers and services with Jasmine, here and here. Note that I won’t be providing example unit tests in this blog post; the examples on the Angular site and the preceding two links cover it far better than I ever could.

This all seemed straightforward so far; however, I wanted to get this nicely integrated into Visual Studio, so after a bit more Googling I came across Chutzpah (pronounced hutz-pah). Chutzpah is an open source test runner that lets you integrate JavaScript unit tests into Visual Studio and also run them from the command line, which means you can integrate your tests into your build process.

Installing Chutzpah

Chutzpah can be installed from NuGet: just right-click on your project or solution in Visual Studio and choose ‘Manage NuGet packages’, then search for ‘Chutzpah’; it should be the first result found:

chutzpah-nuget

To run Chutzpah from within Visual Studio you will also need to install two plugins: the first, ‘Chutzpah.VS2012.vsix’, integrates into the unit test explorer window in Visual Studio; the second, ‘chutzpah.visualstudio.vsix’, integrates into Visual Studio’s context menu so you can right-click on a test and run it.

Resharper

If you’ve got Resharper installed then it supports running Jasmine unit tests out of the box, so you won’t have to install the two Chutzpah plugins above. The one thing you’ll probably want to change, however, is how the tests are run: by default they run in the local browser, but you can change this to use the PhantomJS headless browser, which is installed with Chutzpah. Go to Resharper->Options, then Tools->Unit Testing->JavaScript Tests, change the ‘Run Tests With’ setting to ‘PhantomJS’ and browse to the PhantomJS exe in the Chutzpah NuGet package folder; mine is set to ‘C:\myproject\packages\Chutzpah.2.5.0\tools\phantomjs.exe’ as below:

resharper-phantomjs

Where to put your unit tests?

There are a number of posts on Stack Overflow (see here, here and here for example) about where to put your JavaScript unit tests, i.e. should they go in the same project as the code under test, or in a separate project? Personally, I like to keep my tests separate from the actual code, and all of my C# unit tests are in a separate project in my solution file, so I’ve done the same for my JavaScript unit tests.

All you need to do is add a new ‘Class Library’ project to your solution and then add the Jasmine NuGet package to that project.

Running your tests

After installing Chutzpah and creating my first set of tests, I attempted to run them, but no matter what I did I couldn’t get my tests to pass. After taking a step back and writing some very simple tests with no dependencies, it turned out that I just needed to make sure I’d included all of the necessary references in my unit test files.

Chutzpah runs each test file in isolation; think of it like your C# unit tests, which need to pull in their dependencies with ‘using’ statements, so each unit test file has to reference any supporting JavaScript files it needs. References look like the following and should be placed at the top of your test file:

/// <reference path="../../../../myproject/scripts/libs/angular.js" /> 
These are file path references, relative from the test project to the actual file location. The easiest way to create them is to simply drag the file from the solution explorer into the JavaScript test file, and Visual Studio will create the reference for you.

In my tests I was including the references for Angular, Angular Mocks etc., but I hadn’t taken into account other files I might need, especially ones not used directly by the controller/service under test. It turns out that in my ‘App.js’ file I had some jQuery code running on start-up, prior to my Angular app being created, which meant my tests had to reference some extra files; otherwise my Angular app module couldn’t be created and my tests would fail, which wasn’t entirely obvious from the test results. (Of course, I realise now that including non-Angular code in an Angular file was a bad idea!)

Here’s the list of references I ended up having to include, which as you can see is quite a lot for what was a simple unit test:

/// <reference path="../../../../myproject/scripts/libs/angular.js" /> 
/// <reference path="../../../scripts/angular-mocks.js" /> 
/// <reference path="../../../../myproject/scripts/libs/ui-bootstrap.js" /> 
/// <reference path="../../../../myproject/scripts/libs/sanitize.js" /> 
/// <reference path="../../../../myproject/scripts/libs/jquery-1.9.1.js" /> 
/// <reference path="../../../../myproject/scripts/libs/bootstrap-datetimepicker.js" /> 
/// <reference path="../../../../myproject/scripts/libs/jquery.validate.js" /> 
/// <reference path="../../../../myproject/scripts/app/app.js" />

Once I’d included the above references, my tests all ran fine as you can see in the following screenshot from the Resharper unit test session window:

resharper-js-test-run


Integrating your tests into your build

Chutzpah comes with a console application, which means that you can run your JavaScript unit tests from the command line, all you have to do is point it to the folder where your tests are, like so:

chutzpah.console.exe C:\MyProject\MyJavaScriptTests\

One issue I found is that Chutzpah will look for all JavaScript files in that folder and its sub-folders, which means that since we’ve installed Jasmine into our project, Chutzpah will also attempt to load and run those files, which we don’t want. Unfortunately Chutzpah doesn’t support filtering or wildcards in the folder path you specify, so I worked around this by placing all of my tests into a sub-folder and pointing the command line at that folder instead, bypassing any files I didn’t want tested:

chutzpah.console.exe C:\MyProject\MyJavaScriptTests\Tests\

chutzpah-console

Adding Chutzpah to your Msbuild project

Since Chutzpah can be run from the command line, adding it to your build project is straightforward: all you need to do is create an ‘Exec’ task that executes the console application with the appropriate path to your JavaScript unit tests:

<Target Name="RunJSUnitTests">
  <Exec Command="&quot;$(SolutionFolder)\packages\Chutzpah.2.5.0\tools\chutzpah.console.exe&quot; $(SolutionFolder)\Tests.JavaScript\Tests\ /silent /junit $(MSBuildProjectDirectory)\chutzpah-output.xml"
        WorkingDirectory="$(SolutionFolder)\packages\Chutzpah.2.5.0\tools\" />

  <Message Text="$(TASK_BREAK)" />
</Target>

Here’s the result of running the above build task on my system; if any of the JavaScript tests fail, Chutzpah will return an error and the build will fail:

chutzpah-msbuild

Team City

Chutzpah automatically detects when it’s being run inside Team City and changes its output to the format Team City understands, so as long as Chutzpah has been added to your Msbuild project, you don’t need to do anything else to get support inside Team City, as you can see from the following build log:

team-city

Summary

As you can see, there really aren’t that many steps involved in getting JavaScript tests set up in both Visual Studio and your build process. The hardest thing for me was figuring out why my tests were failing, and that was merely a matter of making sure my references were set up correctly. Other than that, everything was surprisingly straightforward, certainly more so than I was expecting when I started on this journey.

House Network Part 1

I’ve had a number of posts planned for a while on the new home network; as it’s been nearly 2 years since we moved house, it’s about time I got around to blogging about it!

Our old house had a home network, but it was organic in nature: it started off with a wireless router and was then augmented with a mixture of cat5e and cat6 cables from the router to various rooms as needed. I also had a couple of small Netgear switches to help extend it even further. Basically it wasn’t very efficient, but it got the job done.

I’d seen a few home network installs on various forums over the years (in particular this), so when we decided to move house, I started to plan what I wanted and as soon as we were settled in the new house I ordered everything I would need to get started with the install.

Toys! · 48 port patch panel and patch leads

Yes, that is 610m of Cat6 cable! :-)

The loft in the new house is quite big, so it seemed the perfect location for node zero; the only problem was how to run the cables from the loft to the rest of the house. After some debate with the other half, we decided that running the cable through our bedroom and into the garage below was the best option, as that also enabled me to run the cables from the garage into the cupboard under the stairs and from there under the floorboards. Eventually I would box in the cables running through our bedroom (2yrs later and this still isn’t done, whoops).

Lots of cable! · Eeek! · That's a bit better

Once I’d got to this point, I asked a friend to help me route them downstairs and under the floorboards to the various points in the lounge. It took us a whole day, but we managed to get it all done, and we’d just put the last floorboard back when the other half came home, phew!

In total the initial run included 11 cables, with 8 points going behind the TV, 2 at the far end of the room and 1 in the hallway by the telephone point.

Through the garage · Under the stairs and floorboard · Out they pop

Finishing off · Far end of the room · Hallway

Initially I wasn’t planning on getting a server rack, as I thought they were too expensive; however, I found a seller on eBay that was selling them for a great price. In total I bought a 12U rack and 2 x 3U shelves for £103, and I have to say that the quality of the rack is excellent and the packaging was first class; I can’t recommend them highly enough.

The rack arrives · The rack installed

End of Part 1 – in Part 2 we’ll expand the system with even more cable runs and I’ll explain what the various points are being used for.

Creating custom recorded TV views in SageTV

SageTV version 7 added the ability to customise the existing 4 recordings views, as well as configure up to 4 additional views. While this is covered in the SageTV manual (page 41), it’s not immediately obvious how powerful this feature really is.

Before we discuss customising the recordings views, though, we’re going to create some custom user categories, which is also a new feature in version 7. Firstly, navigate to your favourites: TV –> Schedule recordings –> Manage favourites, choose one of the favourites to edit, then navigate to the 'User defined categories' column and select ok.

Set User Categories · Selected user category

From this screen you can add a new category; you just need to give it a name that’s useful to you. In the screenshot above I’ve already created 3 user categories: Helen, Kids and Paul. Once you’ve created a user category you can assign it to a favourite (in the screenshot above the category ‘Helen’ is assigned to the favourite ‘The Vampire Diaries’). When you’ve assigned categories to the selected favourite, just select 'Done' and from now on all recordings of that favourite will be assigned your chosen user categories. If you wish, you can then edit your other favourites and assign custom user categories to those as well.

You can also assign user categories to existing recordings: navigate to the recording you want to update, bring up its options menu and choose the 'Edit user categories' option as in the screenshot below. You'll then be asked whether you want to edit the categories for the current recording or the favourite; in this case, choose recording and you'll be presented with the same 'Set user categories' screen you've seen before.

Editing user categories on an existing recording

Now that we’ve configured the user categories, it’s time to start customising the recordings views. Navigate to one of your existing recordings views, such as ‘All recordings’, and bring up the options menu, either using the options button on the remote or by pressing Ctrl-O on the keyboard. When the options menu appears, choose the 'Menu options' item, which will bring up the following menu:

Updating the number of views

On this menu is an option called 'Number of Views'; by default this is set to 4. Go ahead and change it to a number greater than 4, but no more than 8 (in the screenshot, I've already got mine set to 8). Once done, close the menu and then bring up the options menu again; this time, as well as the 4 default recording views, you will have 4 additional ones labelled 'Recording View 5' to 'Recording View 8' (again, I've already configured mine, so the screenshot isn't exactly what you'll see).

Switching to your new view

Go ahead and select one of the new recordings views. If you’re currently in the 'All Recordings' view then it's unlikely you'll see any changes, except maybe the sorting of the recordings. Now that this view is active, you can start to customise it; the first thing you'll probably want to do is rename it, which you can do by choosing the 'Menu options' item from the options menu. This brings up the same menu we saw earlier when increasing the number of views. From here, select the menu item 'Rename the ‘Recording View X’ view' and enter a meaningful name for your view, such as 'Kids Shows'.

Now that you've got a custom view, it's time to start filtering what's displayed in it. Bring up the options menu again and choose the 'Filtering' item, which will present you with the screen below left (note: all of these screenshots are with the Diamond UI Mod installed, which presents a few more options than the standard UI, so don't worry if things aren't exactly the same). At the moment we're interested in the categories, so choose that option and you'll be presented with the screen below right, showing all of the standard categories as well as the user categories you created earlier.

Filtering options · Filtering categories

For this example I'm going to choose the 'Kids' user category and then back out of these screens, choosing 'Done' and then 'Close' to bring me back to my custom view, which should have updated itself to only display recordings that are in the 'Kids' category as per the screenshot below.

The filtered view

I could then go on to further customise this view, for instance by filtering out watched recordings or changing the sort order to show the latest recordings first. It's worth noting that each view maintains its own settings, so setting the filtering and sorting options for one view will not affect any of the other views, which is extremely handy if individuals in your family prefer different sort orders etc.

This is really just a taster of what can be done, but hopefully, with the above information you're beginning to realise just how powerful the user categories and custom views can be. In my setup I’ve now got views setup specifically for me, my partner and one for the kids, which means I'm no longer getting nagged about where the better half's recordings are ;-)