Tuning AngularJS for Performance


Qualtrics adopted AngularJS in 2014 when we started building out our Vocalize product. Our goal was to select a framework that would allow us to prototype quickly while building an application that we wouldn’t need to scrap completely when the product became wildly successful. Even though it has a steep learning curve, AngularJS is well suited to building applications very quickly. It allows the developer to write modules in HTML rather than in pure JavaScript, effectively extending the language of HTML. AngularJS was well documented, in active development, and had a large community supporting it, so we decided to pick it up.

Angular does not require detailed knowledge of how it works to use it successfully. Without taking the time to learn those details, however, it is far too easy to make poor decisions that compound and negatively impact the performance of your applications.

In this post, I am going to touch on some of the lessons that we have learned that have helped us improve the performance of Vocalize. We will talk about different ways to reduce the number of watchers, tricks for reducing the digest cycle, and how to eliminate Angular-specific memory leaks.

Reducing Watchers

The Angular world revolves around the digest cycle. The basic concept is that JavaScript variables can be bound to the HTML document, and each bound expression is “watched” such that when a digest cycle triggers, Angular loops over every binding to detect and re-render changes in the data. Reducing the number of watchers will not only reduce your application’s memory footprint but also shorten the time taken by each digest cycle. Here are some things you can do to reduce watchers:

1. Use one time binding

One time binding is a useful feature introduced in Angular 1.3. From the docs: “One-time expressions will stop recalculating once they are stable, which happens after the first digest if the expression result is a non-undefined value.” Thus the expression is evaluated only once; the digest cycle does not continue to evaluate it for changes. To use one time binding, simply put :: in front of any variable, and you are in business! It even works with ng-repeat. For example:

Instead of looping over a static list of tabs:
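For example (the tab model here is illustrative), every {{tab.label}} expression below sets up a watcher that runs on each digest:

```html
<!-- Each binding is re-checked on every digest cycle. -->
<ul>
  <li ng-repeat="tab in vm.tabs">{{tab.label}}</li>
</ul>
```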

Use one time binding since they are not expected to change:
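A sketch with :: bindings (the tab model is illustrative); each watcher deregisters itself once its value stabilizes:

```html
<!-- ::vm.tabs and ::tab.label stop being watched after the first digest. -->
<ul>
  <li ng-repeat="tab in ::vm.tabs">{{::tab.label}}</li>
</ul>
```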

Here is a fiddle showing how these two examples perform:

To test the impact on the digest cycle, I created an ng-repeat over randomly generated lists of 10 to 100,000 words and ran them through Benchmark.js, using the digest loop as the comparison test. The results show that as the number of words in the list increases, bind-once performance remains consistent while the regular bindings quickly drop in the number of digests they can run in a second.


2. Avoid ng-repeat for big lists

ng-repeat is commonly used to display table rows and lists. To ensure the list stays in the right order, every row of data is monitored for changes. In addition to all of the watchers it adds, ng-repeat takes a second performance hit from the very heavy DOM manipulation it performs when there are changes. Instead of rendering one long list, consider breaking it up into smaller chunks and providing a UI for paginating through those chunks.

Another option is to avoid ng-repeat altogether and build the HTML using JavaScript. In the Vocalize application, we had several very large data tables that were built using ng-repeat. The tables didn’t need to be dynamic, and most of them didn’t have any bound interaction, but because they were using ng-repeat they were adding hundreds of unnecessary watchers. We modified the code and built the tables as HTML strings utilizing ng-bind-html. This resulted in a significant performance improvement in the Vocalize application.

Instead of looping over every word in a book:
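A sketch of the naive version (the book model is illustrative); this creates one watcher per word:

```html
<!-- Thousands of bindings for a whole book. -->
<span ng-repeat="word in vm.book.words">{{word}} </span>
```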

Try splitting the words onto pages:
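A sketch, assuming the controller slices the full list into a small vm.currentPageWords array:

```html
<!-- Only the current page's words are rendered and watched. -->
<span ng-repeat="word in vm.currentPageWords">{{word}} </span>
<button ng-click="vm.prevPage()">Previous</button>
<button ng-click="vm.nextPage()">Next</button>
```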

Or use an HTML generator:
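A minimal sketch of the generator idea; renderWords and the scope names are illustrative. The markup is built once in plain JavaScript, so no per-word watchers are created:

```javascript
// Build the markup once as a plain string; no per-word watchers are created.
function renderWords(words) {
  return words
    .map(function (word) { return '<span>' + word + '</span>'; })
    .join(' ');
}

// In the controller, mark the string as trusted for ng-bind-html
// (assumes $sce is injected; loading ngSanitize is an alternative):
// $scope.bookHtml = $sce.trustAsHtml(renderWords($scope.book.words));
```

The template then carries a single binding: `<div ng-bind-html="bookHtml"></div>`.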

Here is a fiddle demonstrating the performance differences between these three methods:

And here is the output for how many digest cycles can be performed in a second with each of the methods. As the number of words increases, both the paged solution (only showing 10 at a time) and the ng-bind-html solution remain consistent, while the full list quickly drops to zero.


3. Use ng-if over ng-show

The ng-show and ng-hide directives toggle the CSS display property on a particular element. This means that all child nodes are still rendering and watching for changes even though they may not be displayed. ng-if completely removes the DOM elements and unlinks the watchers while the content is not visible. The ng-switch directive gives you the same performance benefits as ng-if.

Instead of using ng-show and ng-hide:
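For example (the panel directives and vm.activeTab are illustrative); the hidden branch keeps all of its DOM and watchers alive:

```html
<div ng-show="vm.activeTab === 'report'">
  <report-panel></report-panel>
</div>
<div ng-hide="vm.activeTab === 'report'">
  <settings-panel></settings-panel>
</div>
```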

Use ng-if:
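A sketch with ng-if (the panel directives and vm.activeTab are illustrative); the inactive branch is removed from the DOM and its watchers are unlinked:

```html
<div ng-if="vm.activeTab === 'report'">
  <report-panel></report-panel>
</div>
<div ng-if="vm.activeTab !== 'report'">
  <settings-panel></settings-panel>
</div>
```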

Or ng-switch:
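A sketch with ng-switch (the panel directives and vm.activeTab are illustrative); only the matching branch is linked into the DOM:

```html
<div ng-switch="vm.activeTab">
  <report-panel ng-switch-when="report"></report-panel>
  <settings-panel ng-switch-default></settings-panel>
</div>
```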

Here is a fiddle comparing ng-if to ng-show. It demonstrates how child elements and their watchers are removed and completely ignored with ng-if.

As the number of bindings on the child elements increases, you can see that the number of digests per second remains consistent with ng-if but quickly decreases with ng-show.


4. Avoid filter on ng-repeat

Filters are run twice on every digest cycle: first when the watcher is triggered by changes, and a second time to check for any further changes in the digest cycle. Similar to ng-show, filters hide the filtered elements with CSS but do not remove them from the DOM, so all child scopes and watchers are still in memory being tracked, even if the elements are not visible.

One alternative to inline filters is to run the filter in your controller using the $filter service.

Instead of using a filter in your ng-repeat:
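For example (the words list and search model are illustrative); the filter re-runs on every digest, even when nothing changed:

```html
<li ng-repeat="word in vm.words | filter:vm.search">{{word}}</li>
```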

Use the $filter service in your controller:
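A sketch, assuming a words array and a search model on the scope; the filter now runs only when the search term changes:

```javascript
angular.module('app').controller('WordListCtrl', function ($scope, $filter) {
  $scope.filteredWords = $scope.words;
  // Re-filter only when the search term changes, not on every digest.
  $scope.$watch('search', function (term) {
    $scope.filteredWords = $filter('filter')($scope.words, term);
  });
});
```

The ng-repeat then iterates over filteredWords with no inline filter expression.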

Here is a fiddle that compares the two ways of filtering lists:

The results after running each method over an increasing number of words show the pre-filtered values holding a consistent 800k digests per second, while the ng-repeat filtered values continue to drop in performance.


5. Use $watch judiciously

Some developers say it is poor design in Angular to explicitly call scope.$watch.  An alternative to using $watch may be to implement the observer pattern using callbacks or events.

Instead of watching changes to your model:
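For example (the model and handler names are illustrative); this comparison runs on every single digest:

```javascript
$scope.$watch('vm.selectedReport', function (newVal, oldVal) {
  if (newVal !== oldVal) {
    vm.reloadWidgets(newVal);
  }
});
```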

Listen to an event:
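A sketch using Angular's event system (the event name and handler are illustrative); the notification fires once at the point of change instead of being dirty-checked on every digest:

```javascript
$scope.$on('reportChanged', function (event, report) {
  vm.reloadWidgets(report);
});

// Wherever the report actually changes:
$rootScope.$broadcast('reportChanged', newReport);
```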

Digest Cycle Improvements

In addition to shortening the time it takes to execute a digest cycle by reducing watchers, there are also techniques to prevent the digest cycle from running too frequently. Here are some tips to keep your digest cycle executions to a minimum:

6. Use $digest instead of $apply

If you are listening to a JavaScript event that happens outside of the Angular ecosystem (jQuery click events, for example), you may need to manually trigger a digest cycle to notify other components of your changes. The scope.$apply function starts at the $rootScope and travels down to all child scopes, triggering every watcher in the application. Alternatively, the scope.$digest function only triggers a digest cycle on itself and its own children. Thus, you only need scope.$apply if a parent scope needs to know about the changes.
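A sketch of the difference in a jQuery-style handler (element and scope are assumed to be in hand):

```javascript
element.on('click', function () {
  scope.counter++;
  // scope.$apply() would dirty-check the whole app from $rootScope down.
  // scope.$digest() only dirty-checks this scope and its children.
  scope.$digest();
});
```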

7. Avoid calling $digest on an interval or using $interval

If you are using the $interval service, know that a digest cycle will be triggered on each interval. If you have a very long digest cycle and a very short interval, your application can quickly lock up. Even if the interval is longer and your digest cycle is quicker, the app is actively triggering digest cycles without any user interaction and may cause lag. One library that does this is ng-idle, which, by default, triggers a digest loop every second.

8. Avoid calling $digest on mouse move or window scroll events

Events and handlers that fire in quick succession are terrible for performance in general. The digest cycle can, and probably will, take longer than the time between events. This can cause the browser to lock up or lag.

If you need to watch for scroll events or mouse move events, use throttling or debouncing. Libraries like lodash make this straightforward.

Instead of triggering a digest on every scroll event:
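For example ($window and scope are assumed injected); scroll can fire dozens of times per second, and each handler call runs a digest:

```javascript
angular.element($window).on('scroll', function () {
  scope.scrollTop = $window.pageYOffset;
  scope.$digest();
});
```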

Trigger on a debounce, and only if something changed:
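A sketch using lodash's _.debounce ($window and scope are assumed injected; the 100 ms wait is illustrative):

```javascript
var onScroll = _.debounce(function () {
  var top = $window.pageYOffset;
  // Skip the digest entirely when nothing actually changed.
  if (top !== scope.scrollTop) {
    scope.scrollTop = top;
    scope.$digest();
  }
}, 100);

angular.element($window).on('scroll', onScroll);
```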

9. Use $applyAsync and $evalAsync

$applyAsync waits for the JavaScript interpreter to be silent before queuing up the changes and executing the digest cycle. If you have multiple events triggering changes at the same time, instead of running a digest cycle for each, you can use $applyAsync to queue a single cycle.

Instead of triggering a digest for both the mouseup and the click:
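For example (element, scope, and the model names are illustrative); two handlers fire for the same interaction, and each $apply runs a full digest:

```javascript
element.on('mouseup', function () {
  scope.$apply(function () { scope.lastEvent = 'mouseup'; });
});
element.on('click', function () {
  scope.$apply(function () { scope.clickCount++; });
});
```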

Use $applyAsync to queue both changes in one cycle:
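A sketch of the same handlers with $applyAsync (element, scope, and the model names are illustrative); both updates are queued and flushed together in a single digest cycle:

```javascript
element.on('mouseup', function () {
  scope.$applyAsync(function () { scope.lastEvent = 'mouseup'; });
});
element.on('click', function () {
  scope.$applyAsync(function () { scope.clickCount++; });
});
```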

Eliminating memory leaks

Although not specific to Angular, memory management can quickly become an issue if not actively maintained. The browser will run garbage collection on any JavaScript variable that is no longer assigned or referenced in any context. Frequently, we forget to properly destroy or dereference variables, causing memory leaks. This is true for many Angular-specific constructs.

10. $destroy is your friend

Not only do Angular watchers and event handlers have closures and other variables that can be leaked, but they also may reference a scope, which in turn references a whole lot more. To properly clean up after yourself, you can listen for the $destroy event, which is fired when the scope is destroyed. You simply add the cleanup code to your controller or directive:
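A sketch of the pattern (the watched expression, handlers, and timeout are illustrative):

```javascript
var unwatch = $scope.$watch('vm.model', onModelChange);
var pending = $timeout(poll, 5000);
element.on('click', onClick);

$scope.$on('$destroy', function () {
  unwatch();                      // deregister the watcher
  $timeout.cancel(pending);       // cancel the pending timeout
  element.off('click', onClick);  // unbind the DOM handler
});
```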

Here are some of the things you should clean up in the $destroy handler:

  • scope.$watch
  • scope.$on
  • element.on
  • $timeout
  • Any other shared variable or reference

11. Use Chrome Devtools’ Profiler

There are a number of tools out there that can help you identify and eliminate memory leaks in your application. The Chrome Developer Tools’ Profiles tab can be extremely handy here. When you open it, you are given an option of selecting which profile type you want to create. The first option is for recording the JavaScript CPU Profile, and Google has a great article on how to use this. The other three options (Take Heap Snapshot, Record Allocation Timeline, and Record Allocation Profile) are used for memory profiling, and there are many great articles on how to use these as well.

I have found Heap Snapshot Profiles to be especially useful. This tool takes a look at all of the allocated memory and can help you identify what objects are the highest consumers.


I like to take a snapshot before and after I’ve added a new feature or bugfix to see if there are any unexpected changes. One example of making sure memory doesn’t change is the add-widget functionality in Vocalize. I take a snapshot before adding a widget, another after adding it, and one more after deleting it. The state of the application before the widget is added and after it is removed should be the same, so the first and last heap snapshots should match. If they differ, I likely have a leak.

12. Use Batarang Devtools Plugin

There is a useful plugin for Chrome DevTools built for debugging Angular applications called Batarang. Batarang comes with its own set of tools to help you profile and track the performance of your Angular application. In addition to listing out the total number of watch expressions and logging performance metrics, it can also show you all initialized scopes. You can click into each one to discover by context which component it belongs to. Thus, if you notice an increase in memory, you can easily look through the Watch Tree to determine which scopes are not being destroyed that ought to be.


Parting thoughts

Angular is an extremely powerful framework that can scale to perform for large applications if you take the time to learn what it is doing under the hood. When using it, keep in mind that there are many libraries out there that have not been built with performance as a priority. Do a performance and memory check when you integrate new libraries or add features to your application. One of the best things you can do is to constantly ask yourself the question: “How will this change affect the performance of my application?” This applies to any template bindings you add to your HTML, manual triggers of the digest cycle, and the event handlers and watchers you may need to clean up. Never make performance an afterthought!

Owen Hancock
Software Engineer at Qualtrics
Owen joined Qualtrics in June 2009 as the first Qualtrics intern. Owen loves developing for the web and is especially passionate about front end technologies and the user experience. He has worked on various products over the years and is now working on the next generation of reporting.

Owen has a Bachelor's Degree in Computer Science with an emphasis in Animation from Brigham Young University. He enjoys going on runs in the rain with his dog Albert and visiting new places with his wife Chelsey.
