Wednesday, November 27, 2013

Common Git Tasks - So I Don't Forget Them

The title speaks for itself. I spend a lot of time hopping between three different source control systems: TFS (eww), SVN and Git. As such, sometimes my brain gets a little fried remembering some of the sequences to enter in Git to do common tasks. I really should script them out, but I'm going to write them down here so I don't forget them.

Yes I know this has been done 1000 times, but I haven't done it and this is an exercise to help me remember certain things. Also, I hope people correct me and/or this helps others.

Note: Anywhere I mention "GitHub" below I really mean "remote repository". It's just that I feel most people that are new to this stuff will be searching for "GitHub".




Checking in existing code to a new GitHub repository


Create the GitHub repo on their website from your account page. Be sure to add a .gitignore to exclude any directories or files you don't want to commit! In a prompt change directory to the root of your code folder then enter:

# initialize local repository
git init

# add all files to git tracking
git add .

# commit the tracked files to local repository
git commit -m "initial commit"

# connect to the remote repository
git remote add origin https://github.com/blesh/some-project.git

# sync remote and local
git pull origin master

# push files to remote
git push origin master



Get a project from existing GitHub repository


Change directory to the directory you want to be the parent of your project directory. Do not create a new directory for your project, Git will do that for you.

# clone the master branch to initialize your project and local repo
git clone https://github.com/blesh/some-project.git

# change to that project directory
cd some-project

# [OPTIONAL] switch to the branch you care about
git checkout somebranchname



Basic pull from GitHub


# usually you're on the master branch,
# otherwise swap "master" for your branch name.
git pull origin master

# sometimes you want to pull from a specific branch
# so switch to that branch locally
git checkout branchname

# then pull it from the remote
git pull origin branchname



Basic push up to GitHub


It's important to sync first with a pull!

# make sure you've added untracked files
git add .

# commit changes locally
git commit -m "commit message here"

# sync with remote
git pull origin master

# you might have to merge some changes at this point
# if so, generally it will involve editing the actual files
# and following the prompts in the console.

# now push your changes up to the remote
git push origin master



Overwriting all local changes with remote files from GitHub


So this one I've found will vary depending on whether or not you have multiple branches you care about. This is also one where the difference between git pull and git fetch counts. For one thing, as of this writing, git fetch has an --all flag you can use to pull down all remotes (branches and all); for another, pull will merge your files where fetch will not.

# get all files from all remotes without merging
git fetch --all

# reset your local branch to match the remote
git reset --hard origin/master

# remove all untracked files and directories
git clean -f -d

# [OPTIONAL] you might need to reset and clean other branches you've worked on.
git checkout otherbranch
git reset --hard origin/otherbranch
git clean -f -d



Updating a forked repository from the original


When you've forked someone else's code, and you'd like to update it to the latest code base, you're going to need to rebase from an "upstream".

# add the upstream remote repository
# in this case we're naming it "upstream", but you can name it whatever you choose.
git remote add upstream https://github.com/angular/angular.js.git

# fetch all files from all branches from the upstream repository
git fetch upstream

# switch to the branch you're working on
git checkout master

# rebase to replay all of the deltas from your previous commits on top
# of the files you just pulled down with the fetch.
git rebase upstream/master

# push your changes back up to your own remote fork
# the -f flag is to force the push and ignore checks and make sure
# that the remote doesn't have issues reconciling which deltas to trust
git push -f origin master

Changing a tag name


If you need to rename a tag, you need to add a new tag where the old tag was, then delete the old one:

git tag newtag oldtag
git tag -d oldtag

Inserting a change at a past commit


Sometimes you need to make a change at a prior commit in your history and propagate that change throughout. I've found you don't always have to rebase, and this solution seems a lot more kosher as well. Basically, you create a copy of master, then reset master back to the point where you want to make the change. Make the change and commit it, then merge the copy back on top of your change.

# checkout the branch you need to change
git checkout master

# make a copy of it on a new branch
git branch master_copy

# rewind the branch to the commit you'd like to insert
# a change after where [commit] is a commit # or a tag:
git reset --hard [commit]

# (make your changes here)
# and commit them:
git commit -am "changes have been made"

# now merge your copy back on top of your change
git merge master_copy

# clean up the copy
git branch -d master_copy

Monday, November 11, 2013

Quick 3D Pong Game With Three.JS

I was playing around with Three.js on Plunker last Friday and whipped up a silly little Pong Game.

I put a few notes in the readme, but please feel free to play with and/or alter the code. Just show me what you did if you do anything cool!

(Also, don't judge the code, I was sandboxing for kicks, :P haha)

Monday, October 21, 2013

Angular JS - you probably shouldn't use $watch in your controllers.

The problem with $watch isn’t so much that it doesn’t work. It definitely works. The problems are two-fold:
  1. It’s hard/hackish to test effectively.
  2. It’s inefficient.

Inefficiency: Adding complexity to your $digest

As I discussed in my other post, a $digest must occur to update the view from the model or the model from the view. This happens in Angular with great frequency. Whenever a digest occurs, it must evaluate all of your registered $watches. To make matters worse, whatever is altering the value you're $watching probably already has a $watch associated with it, or an $apply you can leverage, to update your value in your view.

Bad: $watch a value changed by user input

What do I mean? Suppose you had some sort of input that was changing a value you were watching:
<input type="text" ng-model="foo"/>
And in your controller you were watching foo for some reason so you could update some other value:
$scope.$watch('foo', function (val){
    if(val === 'test') {
        $scope.bar = 'foo is testing me';
    } else if (val === 'blah') {
        $scope.bar = 'foo seems indifferent';
    } else {
        $scope.bar = 'I do not understand foo';
    }
});
and what would the test around this look like? UGLY.
$scope.foo = 'test';
$scope.$apply(); //don't forget the magic step!
expect($scope.bar).toBe('foo is testing me'); //makes sense right? 


Better: Use ng-change

Here we can simplify this greatly by just leveraging ng-change on the <input/>:
<input type="text" ng-model="foo" ng-change="updateBar(foo)"/>
In our controller we’d have a a nice, easy to read, easy to test function:
$scope.updateBar = function(foo) {
    if(foo === 'test') {
        $scope.bar = 'foo is testing me';
    } else if (foo === 'blah') {
        $scope.bar = 'foo seems indifferent';
    } else {
        $scope.bar = 'I do not understand foo';
    }
};
And our test is much clearer and cleaner:
$scope.updateBar('test');
expect($scope.bar).toBe('foo is testing me');

Bad: Use $watch to update a value after [some event here]

In this case you might be doing something like getting a value via Ajax, and you think, “Man, $watches are SWEET, I’m going to use one of these bad boys to watch my value”:
app.controller('WidgetCtrl', function($scope, widgetService) {
    $scope.$watch('widgets', function(val) {
        var count = (val && val.length ? val.length : 0);
        $scope.theCountSays = count + ', ' + count + ' widgets! ah! ah!';
    });

    $scope.updateWidgets = function(widgets) {
        $scope.widgets = widgets;
    };

    $scope.getWidgets = function() {
        widgetService.get().then($scope.updateWidgets);
    };
});
Now look! theCountSays is updated automagically! Isn’t that awesome? No, I say. No it is not.
Look at our test related to it:
$scope.widgets = [1,2,3,4,5];
$scope.$apply(); //Weee! magic!
expect($scope.theCountSays).toBe('5, 5 widgets! ah! ah!');


Better: Use the event that triggered the change to update your value!

This is really the case for every single use of $watch in a controller… if you need to update something, update it when you need to, not in some catch all $watch.
app.controller('WidgetCtrl', function($scope, widgetService) {
    $scope.updateWidgets = function(widgets) {
       $scope.widgets = widgets;
       var count = (widgets && widgets.length ? widgets.length : 0);
       $scope.theCountSays = count + ', ' + count + ' widgets! ah! ah!';
    };

    $scope.getWidgets = function() {
        widgetService.get().then($scope.updateWidgets);
    };
});
and the tests are clear again:
$scope.updateWidgets([1,2,3,4,5]);
expect($scope.theCountSays).toBe('5, 5 widgets! ah! ah!');


Conclusion: Watches are almost never really necessary in a controller.

I realize the examples I gave above are contrived, but I’m happy to take on more specific examples. In the end, the only thing you need to remember when trying to avoid watches is: “What is triggering the change I’m worried about?” and subscribe to that. $watch is really meant to facilitate two-way binding between the model and the DOM as part of constructing directives.

Wednesday, August 28, 2013

Pseudo-Random Number Generation (Seeded) in JavaScript

What, no seed on Math.random()? But I could've sworn...

I was faced with a situation where I wanted to produce random numbers from a seeded generator in JavaScript, and I noticed (strangely for the first time) that Math.random() is a random number generator, but not a pseudo-random number generator (PRNG). The problem I was trying to solve is that I wanted to create random numbers, but I wanted to be able to recreate the same series of random numbers.
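To illustrate the idea (this is just a hypothetical, minimal generator for the sake of example, not the Mersenne Twister implementation discussed below), a seeded PRNG gives you the same sequence every time you start from the same seed:

// a tiny Park-Miller-style generator, purely for illustration
function makeGenerator(seed) {
    var state = seed % 2147483647;
    if (state <= 0) {
        state += 2147483646;
    }
    return function next() {
        state = (state * 16807) % 2147483647;
        return state / 2147483647; // a number between 0 and 1
    };
}

var randA = makeGenerator(42);
var randB = makeGenerator(42);
console.log(randA() === randB()); // true: same seed, same series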


The Mersenne Twister

So I hit the Googles, and immediately came across this post on StackOverflow that had a few solutions for PRNG. The post suggested that Mersenne Twister was one of the better algorithms for PRNG, but didn't really offer up code for that in JavaScript. So I Googled further and found that there was a JavaScript implementation of Mersenne Twister, but I didn't like it, really. No offense meant to the original authors, but it seemed to be a port of a C or C++ implementation straight to JavaScript, and was a little bulky and not factored out enough for me.


TL;DR: I made an open source Mersenne Twister implementation in JavaScript

It's here on GitHub, hopefully someone else finds it useful. The goal was to keep it concise and compact.

What does it do? It produces evenly distributed random integers between 0 and 4,294,967,296 (aka 0x100000000) from a seeded generator.

BTW: If anyone is an expert in cryptography or mathematics please review my implementation.

Wednesday, August 14, 2013

Angular - $compile: How it works, How to use it.

What is a "View"?

A view is HTML, plain and simple. A view is a DOM element and its children. Views will contain markup and directives in that markup.

What Views are "Compiled"? 

All views are compiled by Angular. This is done by code that lives in the $compile service area of Angular.

How does view compilation work?

View compilation in Angular is some of the most ingenious functional programming I've seen in JavaScript. It works (very, very roughly) like this:

  1. Step into a DOM node.
  2. Loop through registered directives to see if this node contains any.
  3. For each directive found:
    1. Determine if the current scope is to be used, or if a child or isolated scope needs to be created.
    2. Create a function that, when called, will execute the directive's linking function against the appropriate scope from step 3.1.
    3. Append that function to a list of functions to be executed later.
  4. Does the current DOM node have children?
    1. YES: Step into each one and go to step 1 above.
    2. NO: That's the end of this branch.
  5. Return a function that will execute all functions created in Step 3.2. This is your compiled view.

Binding a View to a Scope

This is done when the compiled view is called and a scope is passed to it.

When does this occur?

  • When your angular app is initially bootstrapped.
  • During the processing of some directives, such as ng-repeat, it will be called on a subsection of your views.
  • When ng-view is updated after a routing event it's called on the incoming view.
  • When ng-include changes, it's called on the incoming view.
  • I'm sure there are many more places I'm not thinking of, but those are the big ones I can think of that happen OOTB.

When should I use $compile?


Truthfully? Almost never. It's going to be pretty rare that you should have to compile some HTML and process it as a view. Generally, you can just use directives that already exist to do whatever it is you think you need to do with $compile. But I suppose there are some edge cases where you may want to use it. Such as creating a completely custom repeater or something like that.

If you do find yourself needing to use $compile, it should almost always be in a directive. Think about it: $compile is creating a function to wire up directives to a scope... in essence setting up all interactions between your app and the DOM. When you see "DOM", it's a directive. Directives are where we should be manipulating the DOM and nowhere else.

An example of compiling a view manually

/**
 * A weird directive that takes a space-separated list of property names,
 * and prints them out as JSON.
 */
app.directive('whatIsInThese', ['$compile', function($compile) {
    return function(scope, elem, attrs) {
        //getting a list of space-separated property names 
        //from the attribute.
        var these = attrs.whatIsInThese.split(' '),

        //start creating an html string for our "view".
            html = '';

        //append a bunch of bound values from the list.
        angular.forEach(these, function(item) {
            html += '{{' + item + '| json}}\n\n';
        });

        //create an angular element. (this is still our "view")
        var el = angular.element(html),

        //compile the view into a function.
            compiled = $compile(el);

        //append our view to the element of the directive.
        elem.append(el);

        //bind our view to the scope!
        //(try commenting out this line to see what happens!)
        compiled(scope);
    };
}]);

Here's a plunker showing the above directive at work. Play around with it if you like. It's a strange example, and the same thing could be done in better, more maintainable ways, for sure. But it gets the idea across, I think.

TL;DR Version

View compilation in Angular basically traverses the entire DOM tree of whatever node you give it, creating a single function, that when called will execute every linking function from every directive it finds in that DOM tree, with the appropriate scope, element, attributes and (optionally) controllers.

Let's not forget that those linking functions are what set up your $watches and event bindings and all of your ties to the DOM.


And as always: READ THE SOURCE!

Remember, Angular is open source, if you want to know more about it, don't read my stupid blog, read the source!!! It can be found on GitHub.

Tuesday, August 6, 2013

AngularJS: $watch, $digest and $apply


While browsing reddit, I read an article that I felt over-complicated an explanation of $watch, $digest and $apply, and perhaps gave people a little bit of the wrong idea about what they are.

What is a $watch? 


Let's talk about this first. $watch is arguably the most important internal feature of Angular. $watches can be used to watch any value, and trigger a function call when that value changes. A $watch can be set up from any $scope by calling $scope.$watch() as shown below.

Setting up a $watch


There are two ways to set up a watch, by expression, or by function. Technically, they both do the same thing. If you pass an expression (a string), it will be evaluated against $scope and converted to a function to be watched. If you pass a function, it just watches that function, with no conversion necessary.


By Expression


The following will watch 'foo', which is an expression evaluated against $scope.
//$scope.$watch(<function/expression>, <handler>);

$scope.$watch('foo', function(newVal, oldVal) {
    console.log(newVal, oldVal);
});


By Function


To set up a $watch by function, you can do the following, which is technically the same as what is shown above:
$scope.$watch(function() {
    return $scope.foo;
}, function(newVal, oldVal) {
    console.log(newVal, oldVal);
});


Facts about $watch:

  • A watcher can evaluate any value.
  • A watcher's handler can execute anything when aforementioned value has changed.
  • All watchers are evaluated when $digest() is called.
  • If the first argument of a $watch is a string, it is $eval'ed into a function prior to registration. It's functionally equivalent to passing a function as the first argument, just with an extra step internally.


What is $digest?


At its core, the important thing to know about $digest is that it loops through all watchers on the scope it was called on and its child scopes, and evaluates them to see if they've changed, executing their handlers if they have. That's the important part you need to know.

How to call $digest:

$scope.$digest();


What is $apply?


Simply put, it's a wrapper around $rootScope.$digest that evaluates any expression passed to it prior to calling $digest(). That's it.  So, if you're calling it by itself without passing an argument to it, you may as well just call $digest().

How to call $apply:


$scope.$apply('foo = "test"');
//or
$scope.$apply(function(scope) {
    scope.foo = 'test';
});
//or
$scope.$apply(function(){
    $scope.foo = 'test';
});

So when does a $digest/$apply happen?


At key moments defined by the framework. The Angular framework has built-in calls to $digest and $apply in its services and directives, as well as some internal calls. But basically it's like this: things like $timeout perform a setTimeout, then call $apply or $digest (it actually does a little more than that, but that's the basic idea). $http.get(), same deal: it makes an AJAX call, returns it, then queues up a $digest. Then there are directives, like inputs with ngModel for example. Updates to the input will also trigger a $digest. You get the idea.

How do $watch, $digest, and $apply relate to updating my view?


  • The directive registers a $watch that looks for a change in the model on the $scope. The handler will update the DOM element's value.
  • The directive registers an event handler of some sort in the DOM that will get a value from the DOM and apply it to the model in $scope. It will also call $apply or $digest.
  • When you update the model in the scope via some in-framework call... $http.get() for example, it kicks off a $digest after it completes.
  • The $digest checks the $watch the directive registered, sees the change, and fires the handler associated with it, updating the DOM element. (A rough sketch of this pattern follows this list.)
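Here's that rough sketch as a tiny two-way-binding directive. (The directive name myBind is made up for illustration; this is not how ngModel is actually implemented.)

app.directive('myBind', function() {
    return function(scope, elem, attrs) {
        // model -> DOM: a $watch pushes scope changes into the element
        scope.$watch(attrs.myBind, function(newVal) {
            elem.val(newVal || '');
        });

        // DOM -> model: on a DOM event, update the scope inside $apply
        // so a $digest runs and the rest of the app sees the change.
        elem.bind('keyup', function() {
            var value = elem.val();
            scope.$apply(function() {
                // assumes the attribute is a simple property name, e.g. my-bind="foo"
                scope[attrs.myBind] = value;
            });
        });
    };
});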

Why does Angular work this way?


Since Angular wanted to use plain-old-JavaScript-objects (*POJSO© Ben Lesh 2013 all rights reserved!! Brought to you by Carl's Jr), a digest was the only real choice. Why? Well, you can observe setters and getters on known properties in real time, but there's really no way to build an event into adding new values to or removing values from an object or array, particularly when it's done by an indexer, a la hash table or array index. So a digest becomes necessary. How do I know this? Because I tried to write my own framework and quickly found myself writing a half-assed version of Angular. The only other option that I know of would be to completely wrap the model observation up in a construct with set() and get() functions... think Knockout... which makes the JavaScript code uglier to deal with (IMO).


Some Guidelines For Use:


  • $watch
    • DO use $watch in directives to update the DOM when a $scope value changes.
    • DON'T use $watch in a controller. It's hard to test and completely unnecessary in almost every case. Use a method on the scope to update the value(s) the watch was changing instead.
  • $digest/$apply
    • DO use $digest/$apply in directives to let Angular know you've made changes after an asynchronous call, such as a DOM event.
    • DO use $digest/$apply in services to let Angular know some asynchronous operation has returned, such as a WebSocket update, or an event from a 3rd party library like Facebook API.
    • DON'T use $digest/$apply in a controller. This will make your code harder to test, and asynchronous operations outside of the Angular framework don't belong in your controllers. They belong in services and directives.
Remember these are guidelines, not hard, fast rules. More importantly, they're my guidelines, so take them with a grain of salt.



When to use $apply vs $digest?


[EDIT: for clarification on when these should be used] $scope.$digest should rarely be used outside of some very specific instances. Such as an isolated scope in a directive that might want to only update itself. Or, in an extreme edge case if a scope object has been created for some other isolated purpose. $scope.$apply or $rootScope.$digest should be favored most of the time, as the general desired effect is to update an entire view or model. 



READ THE CODE!!


I'm sure I butchered some part of the explanation above, but if you want to know more, get the information where I did... read the code!!!! You can get to all of this code right on Angular's repository on GitHub. It's extremely well commented, and extremely interesting code to look through.

EDIT: I'd like to thank Zeroto for his comment on reddit that pointed out a pretty serious omission on my part.

Thursday, June 27, 2013

Angular JS - Unit Testing - Directives

Testing directives is only slightly different than testing controllers or services, as I've covered in previous posts. Since you can inject services into directives, you can still mock those services by providing custom mocks as outlined in my "Unit Testing Services" post. But the major difference is that testing directives involves the DOM, which can make things difficult.

$compile is your friend.


To test a directive, you're going to need to compile a view featuring the directive, then probe DOM elements in that view to assert that they've been affected properly.

Basic Directive Example

This is a simple little directive that basically will change the text of an element to the evaluated value of an expression when the element is clicked.

app.directive('sampleOne', function (){
    // this is an attribute with no required controllers, 
    // and no isolated scope, so we're going to use all the
    // defaults, and just providing a linking function.
    
    return function(scope, elem, attrs) {
      elem.bind('click', function(){
        elem.text(scope.$eval(attrs.sampleOne));
      });
    };
});


Using $compile and angular.element() for testing


So to test this directive, we're going to create a string of html as a view that features the directive we want to test, then we're going to $compile that view against a scope we create. After that, we can alter the scope as we need to, and/or use jqLite (packaged with Angular) or JQuery (works well with Angular) to do some DOM manipulation and test some values.

IMPORTANT: Be sure to call scope.$digest() after you make changes to your scope and before you make your assertions!

Since Angular won't be doing this for you because you're testing, you have to be sure to call $digest() to update your view and model.

describe('Testing sampleOne directive', function() {
  var scope,
      elem,
      directive,
      compiled,
      html;
      
  beforeEach(function (){
    //load the module
    module('plunker');
    
    //set our view html.
    html = '<div sample-one="foo"></div>';
    
    inject(function($compile, $rootScope) {
      //create a scope (you could just use $rootScope, I suppose)
      scope = $rootScope.$new();
      
      //get the jqLite or jQuery element
      elem = angular.element(html);
      
      //compile the element into a function to 
      // process the view.
      compiled = $compile(elem);
      
      //run the compiled view.
      compiled(scope);
      
      //call digest on the scope!
      scope.$digest();
    });
  });

  it('Should set the text of the element to whatever was passed.', function() {
    //set a value (the same one we had in the html)
    scope.foo = 'bar';
    
    //check to see if it's blank first.
    expect(elem.text()).toBe('');
    
    //click the element.
    elem[0].click();
    
    //test to see if it was updated.
    expect(elem.text()).toBe('bar');
  });
});



UPDATE: I've added one more example to the Plunk, so have a look below.


And of course, here's a plunker demonstrating a basic directive test:
http://plnkr.co/edit/oTuRbYTPt8RyybUzk5ND?p=preview

Angular JS - Unit Testing - Services


Edit: (updated Feb 28, 2014)


I did a previous entry on unit testing controllers (it also covers the basic setup for Jasmine testing). That's probably the most important thing you can unit test, but what about services? Those are the pieces you're using to do things like talk to other controllers, send AJAX requests, and/or interact with any dependency in your Angular application.

Testing services isn't exactly straightforward, at least if you're coming from the mindset of testing controllers. It can get a little more convoluted when your service has dependencies on other services. How can you test it in isolation from the other pieces?


Basic service testing


To start off, let's look at an extremely basic service. Something that just does something really minor, but you use this functionality everywhere, so you'd like it to be injectable.  Below you'll see a really simple service that just manipulates some text and returns it.
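The original embedded example isn't shown here, but a service along those lines might look something like this (a sketch; the service name, module name and behavior are made up):

// a trivial service that just manipulates some text
app.factory('textService', function() {
    return {
        shout: function(text) {
            return (text || '').toUpperCase() + '!';
        }
    };
});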


To test this, what we need to do is simply inject it with angular-mocks inject() helper function in our beforeEach() call, then we can make calls to it and test the outcomes:
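A spec for that service could look something like this (again a sketch, assuming the module above is registered as 'app'):

describe('textService', function() {
    var textService;

    beforeEach(function() {
        // load the module under test
        module('app');

        // use angular-mocks' inject() to grab the service instance
        inject(function(_textService_) {
            textService = _textService_;
        });
    });

    it('should upper-case and punctuate the text', function() {
        expect(textService.shout('hello')).toBe('HELLO!');
    });
});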


Testing a service with $http using $httpBackend


Things get a little trickier when we want to test a service that depends on other services, particularly in the case of $http. Fortunately, $http relies on $httpBackend in Angular, and angular-mocks.js automatically provides a mock $httpBackend for you.

So if we were to look at a simple service that gets data over $http, it might look something like this:
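For example (a sketch with hypothetical names and URL):

// a service that loads widgets over $http
app.factory('widgetService', function($http) {
    return {
        getWidgets: function() {
            return $http.get('/api/widgets').then(function(result) {
                return result.data;
            });
        }
    };
});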


The process here is fairly similar to what we did above. Since angular-mocks.js has already provided the mock $httpBackend, we can use some functions it's added to test how $http is being used, and even to mock what it returns. We simply need to inject the $httpBackend and make the appropriate calls to its newly provided test helper functions, such as when(), expectGET() or expectPOST(). We also need to add an afterEach() call to assert that our expectations set in $httpBackend were met:
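A spec along those lines might look like this (a sketch against the hypothetical widgetService above):

describe('widgetService', function() {
    var widgetService, $httpBackend;

    beforeEach(function() {
        module('app');
        inject(function(_widgetService_, _$httpBackend_) {
            widgetService = _widgetService_;
            $httpBackend = _$httpBackend_;
        });
    });

    afterEach(function() {
        // assert that every expected call was actually made
        $httpBackend.verifyNoOutstandingExpectation();
        $httpBackend.verifyNoOutstandingRequest();
    });

    it('should GET the widgets and unwrap the response data', function() {
        var widgets;

        // expect the GET and tell the mock backend what to respond with
        $httpBackend.expectGET('/api/widgets').respond(200, [{ id: 1 }, { id: 2 }]);

        widgetService.getWidgets().then(function(data) {
            widgets = data;
        });

        // flush the mock backend so the promise resolves
        $httpBackend.flush();

        expect(widgets.length).toBe(2);
    });
});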


Providing custom mocks


Finally, you're likely to have the need to provide mocks to other services in order to test your service in isolation. For example, if you have a service foo that depends on service bar, you may want to inject a mock to bar so you can test foo in isolation from bar. So presuming we have the following service, and we want to test myService in isolation from foo:
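The original example isn't embedded here, but the shape of it would be something like this (hypothetical implementations):

// "foo" is some other service that myService depends on
app.factory('foo', function() {
    return {
        getMessage: function() {
            return 'real message from foo';
        }
    };
});

app.factory('myService', function(foo) {
    return {
        decoratedMessage: function() {
            return '*** ' + foo.getMessage() + ' ***';
        }
    };
});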


Providing mocks is as easy as adding an anonymous function to the module() call that loads your module, one that sets up values in its provider. At the same point in your beforeEach() where you're loading the module you're testing, just add a second parameter that is a function that sets up your mock (replacing whatever service was there before).
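Here's a sketch of what that looks like with the hypothetical services above:

describe('myService (with foo mocked out)', function() {
    var myService;

    beforeEach(function() {
        // the second argument to module() is a config block where we can
        // use $provide to replace the real foo with a mock for this spec
        module('app', function($provide) {
            $provide.value('foo', {
                getMessage: function() {
                    return 'mocked!';
                }
            });
        });

        inject(function(_myService_) {
            myService = _myService_;
        });
    });

    it('should use whatever foo gives it', function() {
        expect(myService.decoratedMessage()).toBe('*** mocked! ***');
    });
});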



That should be all you require to test your services.  Test well, test thoroughly, test red-green if you can. If you're testing something that injects $http, use $httpBackend where you can, but you can, of course, mock $http yourself if you wanted to. The world is your oyster here, as long as your tests cover your code well.


And, since I try to provide something for people to play with, here is a working plunk demonstrating the techniques I wrote about above.

Monday, June 3, 2013

Pittsburgh TechFest 2013 - AngularJS Single Page Application Development

I've put the source code for the project for my presentation up on GitHub. It's available here:


I had a blast at TechFest, and I'm really happy with the level of interest in my talk. Thank you very much to all of those who attended my talk, and thank you also to the other presenters for sharing their amazing knowledge on a wide variety of tech topics. I thoroughly enjoyed this TechFest and I look forward to next year's.

Tuesday, May 21, 2013

AngularJS - Unit Testing - Controllers

Updated Feb 6, 2014 - I'm changing this around a lot. A lot has changed since I wrote the original article and a lot has stayed the same. Most notably how Angular 1.2 handles promises on the $scope when processing the view, and some changes around testing promises. Since I see this gets some traffic and I hate the idea that I'm showing people the wrong thing, I'm going to try to keep this updated as time goes on. (Angular 2.0 will probably be a whole new post though)

Since Controllers carry the "business logic" of your Angular application, they're probably the single most important thing to unit test in your Application. I've run across a few tutorials on this subject, but most of them cover only the simplest scenarios. I'm going to try to add some slightly more complicated stuff in to my controller and test it, just to show examples. As I think of new examples as time goes on, I'll try to add those too.

For Unit Testing Services and Directives see these other Posts:




The Controller: What are you testing?


First off, what is a controller? A controller is an instance of an object defined by executing the controller function as a class constructor. If you're new to JavaScript, that means it's calling the function, but in the context of creating an object. All of this aside, most of what a controller is doing is setting up your $scope object with properties and functions you can use to wire it to a view. This will be the lion's share of what you're testing.


Recommended Testing Suite: Jasmine


The recommended tool for testing Angular is Jasmine. You can, of course, use any unit testing tool you like, but for this blog entry, we'll be using Jasmine. To get started with Jasmine they have some really well annotated code on their site as a tutorial of sorts, but I'd recommend just going to Plunker and starting a new "Angular + Jasmine" Plunk and fiddling around until you get the hang of it.


To TDD or not to TDD? Yes.


I'm not going to go into the specifics of TDD, whether or not you should use it, the pros and cons of TDD, or even attack this blog entry from that angle. I'm going to assume that if you're here, you've probably written some Angular controller, or you know how to write an Angular controller, and you're thinking "how do I test this thing?". So we'll just cover some of those basics, mkay?


Recommended for Later: Karma or Grunt automated tests


Generally, I'd recommend using something like Karma or grunt-contrib-jasmine to run your unit tests automatically in Node... But that's probably another lesson for another day. For now, let's just learn some Jasmine (1.3.X) basics.


Right Now: Jasmine Basics


Let's start off with the basic Jasmine setup. This is what's required to run the Jasmine specs you're going to write, and produce a report in HTML format you can read. So to do all of this, you will create some HTML file, we'll call it "specsRunner.html" for now, and this would be the basic content of it:



An Example Controller


So now we'll need something to test. I'm going to make up a completely contrived controller to create some unit testing examples against. Nothing special, and nothing that might even make sense. It's just different things you might commonly do in an Angular controller, that you might need to test. In the specsRunner.html file above, this would be our "app.js".
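The original embedded controller isn't shown here, but a contrived controller along those lines might look like this (a sketch; the module name 'app' and the dataService are made up):

app.controller('MainCtrl', function($scope, dataService) {
    // a couple of simple scope members
    $scope.title = 'Hello';

    $scope.add = function(a, b) {
        return a + b;
    };

    // a $watch on a scope property
    $scope.$watch('title', function(newVal) {
        $scope.titleLength = (newVal || '').length;
    });

    // a plain (synchronous) service call
    $scope.save = function(item) {
        dataService.save(item);
    };

    // an asynchronous service call that returns a promise
    $scope.load = function() {
        dataService.load().then(function(data) {
            $scope.items = data;
        });
    };
});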



Unit Tests For Our Example Controller


So, given the above controller, here is a battery of unit tests that tests the behavior of this controller. Well, more importantly, it tests what has been set up on the $scope by the controller function. In the specsRunner html (above), this would be in our "specs.js":
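Again, the original specs aren't embedded here, but a sketch of them against the hypothetical controller above might look like this (note: this sketch resolves the mocked promise with a plain digest, which works just as well as $timeout.flush() for $q.when() promises):

describe('MainCtrl', function() {
    var $scope, createController, mockDataService, $q;

    beforeEach(function() {
        module('app');

        // a mock service with Jasmine spies for methods
        mockDataService = {
            save: jasmine.createSpy('save'),
            load: jasmine.createSpy('load')
        };

        inject(function($rootScope, $controller, _$q_) {
            $q = _$q_;
            $scope = $rootScope.$new();

            // have the async method return an already-resolved promise
            mockDataService.load.andReturn($q.when(['a', 'b', 'c']));

            createController = function() {
                return $controller('MainCtrl', {
                    $scope: $scope,
                    dataService: mockDataService
                });
            };
        });

        createController();
    });

    it('should set a title', function() {
        expect($scope.title).toBe('Hello');
    });

    it('should add two numbers', function() {
        expect($scope.add(2, 3)).toBe(5);
    });

    it('should update titleLength when title changes', function() {
        $scope.title = 'Hello world';
        $scope.$apply(); // force a digest so the $watch fires
        expect($scope.titleLength).toBe(11);
    });

    it('should pass the item to the service on save', function() {
        $scope.save({ id: 1 });
        expect(mockDataService.save).toHaveBeenCalledWith({ id: 1 });
    });

    it('should put the loaded data on the scope', function() {
        $scope.load();
        $scope.$apply(); // run a digest so the mocked promise resolves
        expect($scope.items).toEqual(['a', 'b', 'c']);
    });
});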




The Simple Tests


I don't want to dwell too much on the first two tests. They're fairly straightforward, and I don't want to patronize anyone that's made it this far. They're your basic, basic unit tests. Make a call, assert a value, the end.


Testing a $watch()


Okay, here there's a little trick. If you have a $watch set up on a property, or on anything really, and you want to test it, all you need to do is update whatever you're watching on the $scope (or wherever it is), then call $scope.$apply(). Calling $apply will force a digest which will process all of your $watches.


Testing Service Calls

Testing service calls is easy: mock the service, spy on its methods, and use expect(service.method).toHaveBeenCalled() or expect(service.method).toHaveBeenCalledWith(arg1, arg2) to verify it's been called. Pretty simple.


...and Asynchronous Service Calls

Testing async calls with Jasmine in Angular gets a little different than it might be with other frameworks. The first thing we did was isolate the controller from its service with a mock, but that's not the end of it, since we have to handle the promise it returns by calling .then() on it. So there are a few things to make sure you're doing here:


  1. Have your mock service return a resolved Angular promise by using $q.when('returned data here').
  2. Use $timeout.flush() to force unresolved promises to resolve.



View the complete example on Plunker

Check out the Gist as well


This is just a start


This blog entry really only covers the basics of testing controllers, there are a great many unique situations that can come up while you're unit testing.

Things to consider: if it's hard to test, maybe it needs refactoring. Anything that's hard to test probably has issues with interdependence or functions that try to do too much in one go, and a refactor should be considered.

Monday, March 4, 2013

Angular JS: Validating Form Elements in a Repeater

How do I use Angular's form validation on these dynamically created elements?


The common scenario is this, you have some ng-repeat creating inputs of some sort, and you need to validate them individually. The knee-jerk reaction is to try to dynamically add names to the input like name="test{{$index}}"... but that won't really work. So now what?


ngForm directive!


The ng-form directive allows for nesting forms that can be used for things like partial validation. The idea is simple, on each repeated element, add an ng-form directive with a name. Then inside that, you can now reference inputs by name on that subform.



<!-- The "main" form directive is the outer form tag. -->
<form name="mainForm" ng-submit="submitAll()">
  <ul>        
        <!-- We add ng-form to the tag ng-repeat is on,
               to create a nested form context. -->
 <li ng-repeat="item in items" ng-form="subForm">
   <input type="text" required name="name" ng-model="item.name"/>
          
          <!-- now we can reference the validated field by name -->
          <span ng-show="subForm.name.$error.required">required</span>
          
          <!-- the nested form context itself can also be checked for validity. -->
          <button type="button" ng-disabled="subForm.$invalid" 
                     ng-click="submitOne(item)">Submit One</button>
 </li>
  </ul>

  <!-- last, but not least, the validation from our 
         subform bubbles up to our main form! -->
  <button type="submit" ng-disabled="mainForm.$invalid">Submit All</button>
</form>



So that's the idea in a nutshell. I'm not going to get too much more verbose with it than that. I've already covered form validation and custom validation elsewhere in my blog. But I will leave you with this plunker to play with that demonstrates this sort of dynamically created form validation, so you can fork it and play with it for yourself:


Monday, February 25, 2013

JavaScript - Dynamic Prototyping

There are thousands of JavaScript prototype tutorials online


Okay, so most people know about and/or understand JavaScript's prototypical inheritance system. Every object has a prototype which is an object, and when you look for a property on that object, if it doesn't have it, it checks the prototype for the property, and if it doesn't have it, it checks prototype's prototype for the property, etc. etc. (If anyone really wants me to do an entry on this, let me know, but rest assured it's pretty well covered elsewhere)


First let me go over the plain, boring ways prototype is commonly used

(It gets better further down)

In practice, 99% of the time, JavaScript's prototype ends up getting used like this:

//create some class.
function Foo() {
}

//add some properties to it.
Foo.prototype.test = function () {
   alert('wee');
};
Foo.prototype.bar = 123;

//use it.
var foo = new Foo();
foo.test(); //alert wee
console.log(foo.bar); //123


Boring. Boring. Boring.


Then occasionally you'll see some blog entry come along about "OOP in JS" or whatever and you'll see prototype getting used like so:

//create a class.
function Foo(){
}
Foo.prototype.test = 'wee';

//create another class.
function Bar() {
}
//that inherits from the first class.
Bar.prototype = new Foo();

//try it out.
var bar = new Bar();
console.log(bar.test); //wee


Okay, well that's a little more interesting, right? We've got some semblance of two class definitions where one inherits from the other. Pretty cool, I guess. Still not terribly exciting, though.


Dynamic Prototyping! (Finally something interesting)


I'm sure there's a better word for it, but I'm calling it "Dynamic Prototyping" (it's my blog!). JavaScript is dynamic and fun. Certainly we can do something cooler with this right? I mean, this seems like something we can really hack to our advantage. And then I saw what AngularJS was doing with prototypical inheritance. They were using it in a dynamic way, allowing "scopes" to inherit from one another, and changes to propagate from parent to child (but not child to parent). Check this out:

//create a class
function Foo(){ 
}

//Give it a method to create child clones of itself
Foo.prototype.spawnClone = function() {
    //Dynamically assign the current instance as a prototype.
    Foo.prototype = this;
    //return a new one
    return new Foo();
};

//Give it a try
var foo = new Foo();
foo.bar = 'test';
var fooSpawn = foo.spawnClone();
alert(fooSpawn.bar); //test

//so far, so good, but watch this:
foo.bar = 'weee';

//we actually updated the child by updating the parent!
alert(fooSpawn.bar); //weee


Now that's cool! Now I can take an object, and create a "child object" that not only inherits its properties, but is actually bound to that object programmatically. Granted, it's a one way binding, but it's still a lot of fun.


Watch out! Prototypical Madness!


So, we learned that we can dynamically prototype a constructor function. That's pretty cool. But watch out! The class will use whatever prototype was assigned to it last!

See where we can run into problems here (assuming the above code):

var foo = new Foo();
var fooSpawn = foo.spawnClone();
foo.bar = 'test';

var foo2 = new Foo();
//OOPS: We don't want this! We've messed with the prototype and it stuck!
alert(foo2.bar); //test


The fix to get around that is simple, you just need to do one of two things: copy off the original prototype and reassign it after the child is created, and/or use an extend method (such as JQuery's extend) to augment your child class's prototype prior to creating a new instance:

function Foo() {
}

Foo.prototype.betterSpawnClone = function (){
    //save a reference to the current prototype.
    var protoCopy = Foo.prototype;
    //dynamically assign the prototype.
    Foo.prototype = this;
    //create the clone.
    var clone = new Foo();
    //reset the prototype back to normal.
    Foo.prototype = protoCopy;
    //return our clone.
    return clone;
}

Anyhow, here's a fiddle to play with. It's a fun little trick with JavaScript and I hope it's useful to someone:

Wednesday, February 13, 2013

AngularJS: Creating A Service With $http

So I did a talk on AngularJS last night at the Pittsburgh .NET Users' Group. It was a great talk and you guys asked a lot of great questions. One of my friends that attended suggested people use my blog as a resource for Angular, and while I do have quite a few posts about Angular, I don't feel like it's quite as fleshed out in that department as I'd like it to be before I want my friends plugging it like that. So I've decided to try to fill that gap, at least as far as Angular goes, over the next few weeks, if I can. (Don't worry, PGHNUG attendees, I'll still try to get a well commented Angular/Web API solution up on GitHub soon).


HTTP Service Calls In Angular


One of the most common questions I see on StackOverflow regarding Angular involves the creation of AJAX-based Angular services using $http. Commonly, the pattern used is very reminiscent of JQuery, where there's a method with a callback when the data is received.

Common Callback Example


app.factory('myService', function($http) {
   return {
     getFooOldSchool: function(callback) {
       $http.get('foo.json').success(callback);
     }
   }
});

app.controller('MainCtrl', function($scope, myService) {
  myService.getFooOldSchool(function(data) {
     $scope.foo = data;
  });
});


That's fine. It's an easy to understand pattern that is predictable for most other developers using your service and most importantly, it works.


Angular Loves Promises

In Angular, there is a service called $q. It is a deferred/promise implementation built off of Q by Kristopher Kowal. I know I've talked about deferment and promises in JavaScript in the past, but as a very quick refresher, the idea behind this pattern is basically to have a mechanism to signal when one (or sometimes many) asynchronous actions are complete. It's the hub of JQuery's AJAX implementation, Angular $http implementation and Angular's $resource implementation. I like this mechanism so much I've even implemented it in my .NET Event Loop Framework.

EDIT: As of 1.2.0, promises are no longer resolved by templates.

So in code, that means if you return a promise from your service, and put it directly in a scope property, it will asynchronously update that scope property and process the changes.

Since $http methods like get() and post() return promises, we can use that promise's then() method (which also returns a promise) to pull the data out of the result. The return value from the then() method's callback is used to resolve that promise.

Simplified Promise Pattern Example


app.factory('myService', function($http) {
   return {
     getFoo: function() {
       //since $http.get returns a promise,
       //and promise.then() also returns a promise
       //that resolves to whatever value is returned in it's 
       //callback argument, we can return that.
       return $http.get('foo.json').then(function(result) {
           return result.data;
       });
     }
   }
});

app.controller('MainCtrl', function($scope, myService) {
   //DEPRECATED: The commented line below WILL NO LONGER WORK in 1.2.0
   //since promises are no longer resolved by templates.
   //$scope.foo = myService.getFoo();

   //make the call to getFoo and handle the promise returned;
   myService.getFoo().then(function(data) {
       //this will execute when the 
       //AJAX call completes.
       $scope.foo2 = data;
       console.log(data);
   });
});



And because some of you like to play around, here's a bit of code on plunker showing the promise pattern for $http calls in Angular:


Edit: Complex $http calls from within a Service


Because I've been asked by friends what to do in situations where you might have nested or simultaneous async calls in a controller, I think this blog entry is a really good place to show some examples of that, since it falls under the same domain, so to speak.

There are of course scenarios where you might have a service method that requires more than one $http call (or other async call) to be made before you want the service to return. This is where you'd want to use the $q service mentioned above.

Example of dealing with multiple async calls to return simultaneously


app.factory('myService', function ($http, $q){
  return {
    getItems: function (){
      //$q.all will wait for an array of promises to resolve,
      // then will resolve it's own promise (which it returns)
      // with an array of results in the same order.
      return $q.all([
        $http.get('items_part_1.json'),
          $http.get('items_part_2.json')
      ])
        
      //process all of the results from the two promises 
      // above, and join them together into a single result.
      // since then() returns a promise that resolves to the
      // return value of it's callback, this is all we need 
      // to return from our service method.
      .then(function(results) {
        var data = [];
        angular.forEach(results, function(result) {
          data = data.concat(result.data);
        });
        return data;
      });
    }
  };
});


Example of dealing with nested async calls in a single service call

Technically, we can deal with nested async calls without using $q ("nested" meaning that each call to $http triggers a subsequent call to $http in order to build out some data). This simplifies the code a little, but in my opinion makes it harder to follow. For example:

 getNestedData: function (){ 
      // get the parents.
      return $http.get('parents.json')
        .then(function(result) {          
          //return a promise to that data.
          return result.data;
        })
        //handle the promise to the parent data
        .then(function(parents) {         
          //get the children.
          return $http.get('children.json')
            //handle the promise to the children.
            .then(function(result) {            
              //add children to the appropriate parent(s).
              var children = result.data;
              angular.forEach(parents, function(parent) {
                parent.children = [];            
                angular.forEach(children, function(child) {
                  if(parent.childIds.indexOf(child.id) >= 0) {
                    parent.children.push(child);
                  }
                });
              });              
              //return the parents
              return parents;
            });
        });
    }


Example of nested calls using $q directly

This is a point where it really doesn't matter which route you go when it comes to what's going to work. However, it is my opinion that using $q directly in complicated async calls or nested async calls, enhances readability. This is simply because it becomes easier to see which promise was returned, when it was created, and where it was resolved:


getNestedDataBetter: function (){
  //create your deferred promise.
  var deferred = $q.defer();
  
  //do your thing.
  $http.get('parents.json')
    .then(function(result){
      var parents = result.data;
      $http.get('children.json')
        .then(function(result) {
          var children = result.data;
            angular.forEach(parents, function(parent) {
              parent.children = [];            
              angular.forEach(children, function(child) {
                if(parent.childIds.indexOf(child.id) >= 0) {
                  parent.children.push(child);
                }
              });
            }); 
            
            //at whatever point in your code, you feel your
            // code has loaded all necessary data and/or
            // resolve your promise.
            deferred.resolve(parents);
        });
    });
    
  //return your promise to the user.
  return deferred.promise;
}


Tuesday, February 5, 2013

Angular JS - Scrolling To An Element By Id

So a question on StackOverflow recently was asking about how to scroll to an element by an anchor id, and it garnered some really hacky answers. They worked, but it seemed weird that Angular would force you to jump through hoops to manage scrolling. I thought the question was intriguing because it dealt with a potentially common scenario with a client-side routed application: How do I scroll to an element on the page?



Angular Already Has It

(I'm hoping for $kitchenSinkProvider in 2.x!)

Turns out that Angular already has a mechanism for this called $anchorScroll(). Unfortunately, it's poorly documented, and I had to actually view the source code on GitHub to figure out how to use it. So here's my attempt to remedy this: A quick run down of how to use $anchorScroll to scroll to the proper element after a routed request:

$anchorScroll works off of the hash value set in $location.hash(), so to use it dynamically, you'll need to first set $location.hash to something, then you'll need to call $anchorScroll().


Scrolling To An Element By ID With Routing


So presuming you've set up routing for your application module already in your .config function, I'm going to set up the scrolling mechanism in the application module's run function like so:

app.run(function($rootScope, $location, $anchorScroll, $routeParams) {
  $rootScope.$on('$routeChangeSuccess', function(event, newRoute, oldRoute) {
    $location.hash($routeParams.scrollTo);
    $anchorScroll();  
  });
});

Pretty simple, right? What I did here is I subscribed to the $routeChangeSuccess event broadcast by the $routeProvider. Inside of that, I set the $location.hash to the value passed into $routeParams at the key scrollTo.

But where does that scrollTo value come from?

<a href="#/test/123/?scrollTo=foo">Test id=123, scroll to #foo</a>

The above link, assuming I have a route like "/test/:id" will route and then scroll to foo. It seems to be a little known thing about Angular that you can pass as many parameters as you like into $routeParams via a faux-querystring.

Anyhow, here is a demonstration on Plunker of the above technique.



Scrolling To An Element By ID Without Routing


Now something to note about $anchorScroll is that you don't have to use it just with routing, if that weren't obvious. You can inject it into any controller or directive and call it as you see fit. You just need to make sure you're setting the hash on Angular's $location.

For example, you could add an item to a list, and then scroll to it after it were added:


app.controller('MainCtrl', function($scope, $location, $anchorScroll) {
  var i = 1;
  
  $scope.items = [{ id: 1, name: 'Item 1' }];
  
  $scope.addItem = function (){
    i++;
    //add the item.
    $scope.items.push({ id: i, name: 'Item ' + i});
    //now scroll to it.
    $location.hash('item' + i);
    $anchorScroll();
  };
});


And here is a demo of that on Plunker

Now, truth be told, it's a bit of fuzzy ground as to whether or not you should be using $anchorScroll in a controller like that. Since it's sort of "DOM manipulation" maybe it belongs in a directive... Then again, it never directly references the DOM, does it? So it's most likely okay. It's down to a judgement call, and I always defer to whatever is easiest to maintain in those cases.


I Hope This Helps Someone


It's certainly not a well-documented feature. And for the sake of full (probably obvious) disclosure: I did of course contribute my answer on StackOverflow. I don't know or even care whether or not it's accepted; it's likely a better place than my blog for the answer to be, so more people find it. I just want to make sure I got what I learned down somewhere I might think to look later, or that might help someone else.

If you found this and it helped you, I'd encourage you to head over to StackOverflow and answer just one question and help someone else. Pick a question you might not even know the answer to if you want to learn something new. The important thing is that we keep technology pushing forward by contributing, gaining knowledge, and innovating.

Monday, February 4, 2013

JavaScript - Scalar Promotion aka Hoisting

Scalar Promotion or Hoisting is a property of JavaScript that is often ignored but can cause some pretty crazy issues. (I like the term Scalar Promotion because it helps me fool people into thinking I'm smart and "Code Motion" sounds like I'm bullshitting too hard. Haha). To illustrate what it is and how it works, I'm going to show a few code examples. Take the following, very simple code for example:

(function() {
    var foo = 'abc',
          bar = 'xyz';
    console.log(foo + bar); //abcxyz
})();

The output of the above, as you might expect, will be abcxyz. Now lets have a look at what happens if we don't assign bar:


(function() {
    var foo = 'abc',
          bar;
    console.log(foo + bar); //abcundefined
})();


Now the output is going to be abcundefined. So what happens if we remove the declaration of bar?


(function() {
    var foo = 'abc';
    console.log(foo + bar); //ReferenceError!!!
})();


Now we get a ReferenceError, because bar was never declared. Well that makes sense, right? But to show you what Scalar Promotion is all about have a look at this.. we'll move the declaration of bar to somewhere below the console.log():

(function() {
    var foo = 'abc';
    console.log(foo + bar); //abcundefined again
    var bar;
})();

... and the result is abcundefined. Okay, what the hell, right? Why does this behave that way? Scalar Promotion. Basically, as a compiler optimization, all declarations are moved to the top of their scope, so to speak. This is why you can call a function in code before you've declared it in JavaScript:

(function (){
  callMe();

  function callMe() {
    alert("I've been called!");
  }
})();


The reason this can become problematic is that JavaScript doesn't force you to declare your variables. So consider the following code:

// A Tale Of Two Foos
(function (){
    
    // an anonymous function to do something or other asynchronously
    // this could be anything, like an ajax callback for example.
    setTimeout(function (){
        
        //manipulate foo, but forget to declare it.
        for(var i = 0; i < 10; i++) {
          foo += 'x';   
        }
        
        // SORCERY?!!! 45xxxxxxxxxx?! 
        // this should just be ten  x's!
        console.log(foo);
    },0);
    
    /* ---
    assume some large amount of code here. So you don't 
    know/remember these two foos are in the same scope.
    --- */
    
    // declare foo and add some numbers to it.
    var foo = 0;
    for(var i = 0; i < 10; i++) {
        foo += i;
    };
    console.log(foo); //45
})();


Do you see what happened there? One little missed declaration and there's no error... just a bug. Generally the only way to avoid this is to declare your variables at the beginning of your function scope if possible/practical. Either way, be careful to declare your variables, kids!

Thursday, January 3, 2013

Structuring A Clientside JavaScript Application


World's largest bowl of spaghetti
... or how I wrote JavaScript the first half of my career




A lot of this will seem like common sense to some, but I have on many occasions seen very good developers write some pretty poorly structured JavaScript apps (including myself, haha), and I've been asked about it a few times recently, so I'll give it a shot. I hope to go over most of what I know in this arena.  There is a good amount to go over, but I'll try to hit the most important basics.


Use A JavaScript Bundler/Minifier/"Compiler"


Probably the biggest and most important thing to do with a large JavaScript client application is to give yourself the ability to separate your code out into individual files, each containing code with a single responsibility, without burdening your user experience with a ton of .js files to download. The best way to do that is by using a tool to take your individual JavaScript files and compile them into a single JavaScript file (minified or un-minified). Fortunately, there are a lot of tools for doing this. Notably tools like UglifyJS and the build tool Grunt, ASP.Net MVC4's new Bundling and Minification features, and extensions for Visual Studio like Mindscape Workbench. They all offer a variety of features; some of them will even do things like run JSLint against your code or kick off unit tests in various JS testing frameworks. The important features you're looking for are "bundling" (or concatenation) and minification.
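For example, a minimal Grunt setup for this might look something like the following (a sketch; the paths are hypothetical, and it assumes the grunt-contrib-concat and grunt-contrib-uglify plugins are installed):

// Gruntfile.js - concatenate everything into one file, then minify it
module.exports = function(grunt) {
    grunt.initConfig({
        concat: {
            dist: {
                src: ['src/app/**/*.js'],  // all of our individual files...
                dest: 'dist/app.js'        // ...bundled into a single file
            }
        },
        uglify: {
            dist: {
                src: 'dist/app.js',
                dest: 'dist/app.min.js'    // minified output
            }
        }
    });

    grunt.loadNpmTasks('grunt-contrib-concat');
    grunt.loadNpmTasks('grunt-contrib-uglify');

    grunt.registerTask('default', ['concat', 'uglify']);
};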


Use the Module Pattern


The module pattern is something that has been covered extensively all over the web, and probably better than I could do it, but the basics of the module pattern is to use JavaScript closure in an immediately executed anonymous function to contain all variables and functionality that you want to remain private to a set of code. You can do tons of stuff with this pattern, and it really saves you from junking up your global scope, be it a module in NodeJS or the window object in a browser. I recommend wrapping practically your whole JS file in the module pattern, specifically because then you're guaranteed to know what you're adding to the global scope, because you've done so explicitly.

So basically you're looking at something like this:
;var myModule = (function($, foo) {
    //here $ and foo are local to the scope.

    //here's some private variable.
    var bar = 123;

    //here's a private function
    function superThing() {
       alert('Bar is ' + bar++);
    }

    /** You can alter injected objects to
        effect the outside **/
    // here we'll alter the jQuery object by adding a plugin.
    $.fn.superPlugin = function() {
       $(this).on('click', superThing);
    };
    
    // same as above but we're removing a binding here.
    $.fn.removeSuperPlugin = function() {
       $(this).off('click', superThing);
    }

    /** You can have it return an interface 
        to whatever it's doing **/    
    return {
        incrementBar: function () {
            bar++;
        },
        callSuperThing: function (){
            superThing();
        }
    };
})(jQuery, foo);


Note: So what's the ; for in the module above? Basically to keep bundlers and minifiers from combining two function calls into one function call, causing errors: (function(a,b){})(c,d)(function(w,x){})(y,z)


Structure Your Files Appropriately


Think of your class files or code files in your usual programming language of choice: C++, Java or C# for example. You're breaking your code out into files based on single responsibility. (Or at least I hope you are). You should be doing the same with your JavaScript files. Because you're bundling them or building them into a single file prior to hosting them, you now have the freedom to structure your JavaScript files however you want.

An example of such a file structure might be (for an Angular app for example):


  +-App
  | +-Directives
  | | +-my-click-directive.js
  | | +-table-directive.js
  | | +-directives-module.js
  | |
  | +-Controllers
  | | +-HomeCtrl.js
  | | +-SearchCtrl.js
  | | +-FooCtrl.js
  | |
  | +-Services
  | | +-facebook.js
  | | +-twitterSearch.js


... you get the idea. But the point is that in the end, it doesn't matter how many files you have, as long as you can easily maintain and organize them. The alternative is to have the one JavaScript "mega-file" which I've seen way too many times.


But, How Do I Debug Bundled Or Minified Files?


This actually isn't as bad as you'd think. Some of the packages, MVC4's bundler for example, will actually output individual files when you're in debug mode. Other bundlers will tack some whitespace and a comment in with the name of the file where concatenations have been done. This is helpful if it's not minified. Also if it's not minified, your comments will show up, and if you've added good comments, and you know how you've structured your app, it shouldn't be a huge leap to find the offending lines of code in your source while you're debugging.

But what about if the file is minified? Well, now you're in for it, I guess. However, some browsers, such as Google's Chrome actually have a feature to "prettify" minified code to so you can step through it in a readable format. (In Chrome this feature is a little button with a {} in it in your developer's console)



LOOK AT OTHERS' EXAMPLES!!!


Another really great way to get an idea on how to structure your JavaScript code is to look at examples presented by others. They are all over the place these days! Just think of your favorite JavaScript framework or library and check them out on GitHub: JQuery, Angular, Ember, etc. All really awesome examples of how to effectively structure your JavaScript code and files.