Automated builds with Travis

All developers should know the value of unit testing and automated builds.  They ensure quality is built into, and maintained in, your software products.  You don’t want your interns to come in, add some cool functionality to your product, and deploy a broken product because they didn’t understand how their changes impacted existing functionality.  In my last post in this series I went over how to create unit tests for a node.js application.  Unit tests by themselves are great, but you (and your intern) need to execute them for there to be any value in having them.  Chances are that intern doesn’t know about unit testing and won’t know they should, or even how to, run the unit tests before committing their changes.  Fortunately, there are tools that automate testing and reporting on commits.  Travis is one such tool, and it is free to use.  Setting it up for my node application was stupid easy.  There were three steps.  First, I needed to define a “default” gulp task like so:

 

gulp.task('default', ['run-tests']);

 

This step is necessary because I only defined the “run-tests” task in gulp when setting up the unit tests.  Travis runs the “default” gulp task by default, so it needs to exist.  Second, you need to create a “.travis.yml” file to define how Travis should run.
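Mine was only a few lines.  Here is a sketch of a minimal .travis.yml for a gulp-based node project (the node version and the global gulp install step are assumptions; match them to your own setup):

```yaml
# Minimal Travis CI config sketch for a node project built with gulp.
language: node_js
node_js:
  - "6"                    # pin whatever node version you develop against
before_script:
  - npm install -g gulp    # make the gulp CLI available on the build machine
script:
  - gulp                   # runs the "default" task defined above
```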
[Image: travis1]
Third, you need to log in to Travis with your GitHub account and select the repository to test.  That is it.  You are then building and testing automatically.  If everything is set up right, you will see a screen like this shortly after pushing your changes to GitHub, complete with your “build passing” badge.  Awesome!
[Image: travis2]

Excel PowerPivot: The poor man’s data warehouse

Recently, one of my employees was working to automate an Excel-based report.  The old-school approach to Excel automation involved heavy use of macros and VBA.  These would execute queries or copy data around between sheets spread out on network shares to accomplish some data task.  In my experience this was very brittle and prone to failure.  This time around, I told my employee to use Power Pivot to accomplish the automation.  There is no VBA; only SQL statements get executed.  The visualization is accomplished using Pivot Tables and Pivot Charts.  Power Pivot is awesome in that it can pull data from many different types of data sources: SQL Server, Oracle, SSRS, SSAS, MySQL, and Excel sheets.  The trick is that in order to make your data model work well for reporting, you have to think a little like a data warehouse architect.  This will enable a power user to create self-refreshing data sets that can be analyzed in Pivot Tables and Pivot Charts.  Let me walk you through a simple Power Pivot model that I created to track some of my personal fitness goals and health metrics.
Let’s start with the all-important time table.  In data warehousing, almost all measurable data ties to a point in time, and we model that with a Time Dimension.  In my example, I have a sheet with one column holding a date and time.  I incremented each row by one hour and copied it down about 10,000 rows.  You then click “Add to Data Model” and you have created a Power Pivot table.  To make this more useful and interesting for slicing data, you need to add some columns.  For example, add the day of week with this expression: FORMAT([DateTime], "ddd").  Finally, create a key column.  I learned this trick from a former colleague: you can create an easy-to-read key for dates and times with a little addition and multiplication.  You could easily expand it out to minutes or even seconds: (YEAR([DateTime]) * 10000) + (MONTH([DateTime]) * 100) + DAY([DateTime]).
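To see the key arithmetic at work outside of DAX, here is the same calculation as a quick JavaScript sketch (the function name is mine):

```javascript
// Same arithmetic as the DAX DateKey column: year*10000 + month*100 + day
// produces a readable integer key, e.g. May 17, 2016 -> 20160517.
function dateKey(date) {
  return date.getFullYear() * 10000 + (date.getMonth() + 1) * 100 + date.getDate();
}

// JavaScript months are 0-based, so 4 = May.
console.log(dateKey(new Date(2016, 4, 17))); // prints 20160517
```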
Adding a Sheet to Power Pivot
[Image: add to pivot]
Adding Columns
[Image: adding columns]
Once you have your time table situated, you are ready to start creating some tables with data to measure (fact tables, in data warehouse lingo).  I wanted to start with three measures: calories, exercise, and weight.  All of these tables started out very simple: a column for the date and a column for the value.  In the Power Pivot designer I then repeated the creation of the date key on each table.  The final step is to create an association between the fact tables and the time dimension using the calculated DateKey column in Power Pivot; the easiest way to do that is by clicking and dragging in the Power Pivot designer.  Once that is complete you are ready to start adding Pivot Tables or Pivot Charts to your spreadsheet.
A Pivot Chart
[Image: pivot charts]
The Completed Model
[Image: model]

Unit Testing JavaScript with Gulp and Mocha

Immediately after getting my first node project up and running, I started to ask: how do I write and test my code?  My corporate experience has taught me the importance of automated testing, how to create tests in C# with Visual Studio, and how to automate the testing with TFS, but node and VS Code are totally different.  Visual Studio is an IDE and it will take care of almost everything for you.  You simply need to create a new project (a ‘test’ project) in your solution and you are good to start writing tests.  Executing your tests is as easy as a file menu click or a keyboard shortcut.  You don’t have to think about the testing framework, or how your code is built or executed, because the IDE will take care of that for you.
Testing Framework
The first question to answer was: what does it take to write a unit test in JavaScript?  After asking around on Google I discovered the node packages mocha and should.  These libraries allow you to write simple, human-readable tests.  In JavaScript you don’t get classes per se; you need to think in terms of functions and prototypes.  In MS Test you typically create a test class that maps to one class in your code, the target class.  This thinking needs to evolve with JavaScript into files and functions: one test file is used to test one file of code, and you use functions, not classes, to define your tests.  With mocha there are two very interesting functions you need to know about, ‘describe’ and ‘it’.  When you see ‘describe’, think test class in MS Test.  Describe creates a container and label for your tests.  Now ‘it’ is the actual test; think a test method in MS Test.  The ‘it’ function needs a name and a function to execute.  The should package is what will help you make your tests more readable.  This package is similar to the NuGet package Fluent Assertions.  It allows you to replace robotic assert statements with more fluent and readable asserts.  In my experience training interns, this fluent syntax goes a long way toward helping new people understand unit testing and what is actually going on in the tests.
Once you have these in place, you can run your tests with the test command in the command palette or the keyboard shortcut.

Getting Started with Node

Recently I attended Microsoft’s Build conference in San Francisco.  Prior to the conference I had heard about Visual Studio Code and had started fiddling around with it and MVC Core on Linux.  While I was there I had the opportunity to explore a few sessions and labs related to VS Code and Node.js.  Over my career I have learned many techniques for building quality into software products (such as code reviews, unit testing, automated builds, etc.), and I have also accepted that when it comes to the user interface (i.e. JavaScript in my world) those practices either do not apply or are too time consuming to be worth pursuing.  As a result, I have formed a biased opinion and do not see any reason why a sane developer would ever choose to use JavaScript as a server-side programming language.   And yet they are, and Microsoft is promoting it.  What?  Why?  I have realized that I am missing something.  This blog series documents how I’m exploring what I’m failing to understand at the moment.
This first post will be very short because Microsoft has already done a good job of writing it 🙂  The first step in learning any new tool/framework/language is simply saying “hello” or more specifically, “hello world.”  Microsoft has provided an excellent tutorial for getting up and running with Node.

 

npm install -g express-generator
express ExpressApp
cd ExpressApp
npm install
You are then free to write in a hello world message…
[Image: image1]
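The edit amounts to changing the title that the scaffolded index route hands to its view.  Here is a sketch of routes/index.js as express-generator lays it out (the exact scaffolding may differ by version):

```javascript
// routes/index.js as generated by express-generator; only the title changed.
var express = require('express');
var router = express.Router();

/* GET home page. */
router.get('/', function (req, res) {
  // the generated view template prints whatever title we pass it
  res.render('index', { title: 'Hello World' });
});

module.exports = router;
```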
You can then fire up the debugger and point a browser to http://localhost:3000 to see the hello world.
[Image: image2]
From here I added Bootstrap into the views and put together a nice starting point to branch out into future learnings.  You can see what I have done on my GitHub.

Dependency Injection and ASP.NET MVC

For many years I have used dependency injection in the projects I have worked on professionally.  At work we have standardized on Microsoft’s Unity package.  Unity is great; I have nothing against it, but I wanted to learn more about how these IoC packages work under the hood and what dependency injection really takes.  Turns out, with C#, it really doesn’t take much.  The .NET Framework includes reflection and allows you to dynamically create objects without much effort.  To build an object, all you need to do is reflect into the constructors, pick one to use (in my case I did a greedy search, finding the largest constructor I could satisfy), and build the object.  You have to recursively build the parameters to the constructor.  Because recursion is happening, it doesn’t matter how complex your object is; it can have zero or a hundred constructor parameters, and each parameter can have its own dependencies.  Recursion makes for a simple and elegant solution.
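The recursive construction idea is the same in any language.  Here is a minimal sketch in JavaScript (a hypothetical container of my own; .NET reflection discovers constructor parameters automatically, while this version takes an explicit dependency list per registration since JavaScript has no equivalent reflection):

```javascript
// A toy IoC container illustrating recursive construction.
class Container {
  constructor() {
    this.registrations = new Map();
  }

  // deps: names of the constructor parameters to resolve first
  register(name, ctor, deps = []) {
    this.registrations.set(name, { ctor, deps });
  }

  resolve(name) {
    const reg = this.registrations.get(name);
    if (!reg) throw new Error(`Nothing registered for '${name}'`);
    // Recursively build each constructor parameter, then invoke the
    // constructor; nesting depth doesn't matter, recursion handles any graph.
    const args = reg.deps.map((dep) => this.resolve(dep));
    return new reg.ctor(...args);
  }
}

// Usage: a controller depends on a repository, which depends on a connection.
class Connection {}
class Repository { constructor(connection) { this.connection = connection; } }
class Controller { constructor(repository) { this.repository = repository; } }

const container = new Container();
container.register('connection', Connection);
container.register('repository', Repository, ['connection']);
container.register('controller', Controller, ['repository']);

const controller = container.resolve('controller');
console.log(controller.repository.connection instanceof Connection); // prints true
```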
My next challenge, and this is something that I have used at work many times, was to make this work in the MVC framework.  MVC relies on reflection and naming conventions to build your controller objects from HTTP requests; the magic is that MVC knows exactly where to send the request and how to build the controller object it needs.  But have you ever said, “Gee, I really need a database connection in this controller, so I’ll just add it to the constructor because that will make testing really easy”?  MVC will puke on you, because it is also using a dependency injection tool under the hood: a very, very simple one that only builds objects with parameterless constructors.  Good news: almost every piece of MVC (at least everything I have ever worked with) is extensible, including its simple DI code.  MVC defines the IDependencyResolver interface for resolving dependencies, including controllers.