Microsoft Dependency Injection

ASP.NET Core recently hit the 1.0 milestone, and one of the most interesting features in this version (or edition; I’m not sure of the future of the MVC 5.x series) is that dependency injection is baked into the core of the framework.  That means our controllers are no longer created by simplistic construction logic and can have complex dependencies.  In order to facilitate my unit testing requirements, my team has (for years) overridden the default IDependencyResolver in MVC to use Unity (Microsoft’s IoC library) to resolve all dependencies (“Services”) for MVC.  This approach is functional, and great for unit testing because you can inject mocked members into your controller, but it would sure be nice if that functionality were cooked into the framework so that my team didn’t have to bolt it on.  You can see the details about how to configure DI in MVC Core here.

Unfortunately, I am not ready to jump into MVC Core for my team’s production applications.  It’s not a me thing, it’s a we thing.  Jumping into MVC Core is really a decision that all our development teams need to agree to, and we just have too many other things to focus on.  I really believe that MVC Core is where we will go in the future, so I decided to dive into the DI stack that Microsoft is using in MVC Core.  Unfortunately, Microsoft’s new DI stack is not Unity.  I decided to go against our de facto standard of Unity because, if we do go to MVC Core, there’s no reason to continue with Unity.  Unless the new DI stack is terribly complicated, it will become my team’s standard in the future.  Spoiler alert: IMHO, this new DI stack is better and easier to use than Unity.

Unity (and Ninject, and Autofac, etc.) have a ton of awesomely cool features that I have never used in a professional project.  Ever since I understood the idea of dependency injection, I have wanted a simpler tool.  In fact, to learn how DI works and create a simpler tool to use, I created my own IoC container.  From my experience and experimentation, the new DI stack is extremely simple.  The root interface, IServiceProvider, has only a single method, GetService.  There are several helper methods that make it simpler to use, such as the generic version of GetService.
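To give a feel for the API, here is a minimal sketch of registering and resolving a service with the new stack (the Microsoft.Extensions.DependencyInjection package).  The IGreeter and Greeter types are hypothetical examples of mine, not part of the framework:

using System;
using Microsoft.Extensions.DependencyInjection;

public interface IGreeter { string Greet(string name); }

public class Greeter : IGreeter
{
    public string Greet(string name) => $"Hello, {name}!";
}

public class Program
{
    public static void Main()
    {
        // register services in a ServiceCollection
        var services = new ServiceCollection();
        services.AddTransient<IGreeter, Greeter>();

        // build the root IServiceProvider
        IServiceProvider provider = services.BuildServiceProvider();

        // the generic GetService<T> helper wraps IServiceProvider.GetService
        var greeter = provider.GetService<IGreeter>();
        Console.WriteLine(greeter.Greet("world"));
    }
}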

Dependency Injection Code

https://github.com/jcwmoore/blog

Automated builds with Travis

All developers should know the value of unit testing and automated builds.  They ensure quality is built into, and maintained in, your software products.  You don’t want your interns to come in, add some cool functionality to your product, and deploy a broken product because they didn’t understand how their changes impacted existing functionality.  In my last post in this series I went over how to create unit tests for a node.js application.  Unit tests by themselves are great, but you (and your intern) need to execute them for there to be any value in having them.  Chances are that intern doesn’t know about unit testing and won’t know that they should, or even how to, run the unit tests before committing their changes.  Fortunately, there are tools that automate testing and reporting on commits.  Travis is one such tool, and it is free to use.  Setting it up for my node application was stupid easy.  There were three steps.  First, I needed to define a “default” gulp task like so:


gulp.task('default', ['run-tests']);


This step is necessary because I only defined the “run-tests” task in gulp when setting up the unit tests; Travis by default runs the “default” gulp task, so it needs to exist.  Second, you need to create a “.travis.yml” file to define how Travis should run.
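A minimal .travis.yml for a gulp-based node project looks something like this (the node version and the global gulp install here are assumptions; adjust them to fit your project):

language: node_js
node_js:
  - "6"
before_script:
  - npm install -g gulp
script: gulp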
Third, you need to log in to Travis with your GitHub account and select the repository to test.  That is it.  You are then building and testing automatically.  If all is set up right, you will see your build results shortly after pushing your changes to GitHub, complete with your “build passing” badge.  Awesome!

Excel PowerPivot: The poor man’s data warehouse

Recently, one of my employees was working to automate an Excel-based report.  The old-school approach to Excel automation involved heavy use of macros and VBA.  These would execute queries or copy data around between sheets spread out on network shares to accomplish some data task.  In my experience this was very brittle and prone to failure.  This time around, I told my employee to use Power Pivot to accomplish the automation.  There is no VBA, only SQL statements that get executed.  The visualization is accomplished using Pivot Tables and Pivot Charts.  Power Pivot is awesome in that it can pull data from many different types of data sources: SQL Server, Oracle, SSRS, SSAS, MySQL, and Excel sheets.  The trick is that in order to make your data model work well for reporting, you have to think a little like a data warehouse architect.  This enables a power user to create self-refreshing data sets that can be analyzed in Pivot Tables and Pivot Charts.  Let me walk you through a simple Power Pivot model that I created to track some of my personal fitness goals and health metrics.
Let’s start with the all-important time table.  In data warehousing, almost all measurable data ties to a point in time, and we model that with a Time Dimension.  In my example, I have a sheet with one column containing a date and time.  I incremented each row by one hour and copied it down about 10,000 times.  You then click “Add to Data Model” and you have created a Power Pivot table.  To make this more useful and interesting for slicing data, you need to add some columns.  For example, add the day of week with this expression: FORMAT([DateTime], “ddd”).  Finally, create a key column.  I learned this trick from a former colleague: you can create an easy-to-read key for dates with a little addition and multiplication, and you could just as easily expand it out to hours, minutes, or even seconds: (YEAR([DateTime]) * 10000) + (MONTH([DateTime]) * 100) + DAY([DateTime]).
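Put together, the two calculated columns look something like this in the Power Pivot formula bar (the column names DayOfWeek and DateKey are my own labels, applied by renaming the columns):

DayOfWeek:  = FORMAT([DateTime], "ddd")
DateKey:    = (YEAR([DateTime]) * 10000) + (MONTH([DateTime]) * 100) + DAY([DateTime])

For example, any row stamped on July 4, 2016 gets the key 20160704, which sorts chronologically and is still readable at a glance.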
Adding a Sheet to Power Pivot
Adding Columns
Once you have your time table situated, you are ready to start creating some tables with data to measure: fact tables, in data warehouse lingo.  I wanted to start with three measures: calories, exercise, and weight.  All of these tables started out very simple; there was a column for the date and a column for the value.  In the Power Pivot designer I then repeated the creation of the date key on each table.  The final step is to create an association between the fact tables and the time dimension using the calculated DateKey column in Power Pivot; the easiest way to do that is by clicking and dragging in the Power Pivot designer.  Once that is complete, you are ready to start adding Pivot Tables or Pivot Charts to your spreadsheet.
A Pivot Chart
The Completed Model

Unit Testing JavaScript with Gulp and Mocha

Immediately after getting my first node project up and running, I started to ask: how do I write and test my code?  My corporate experience has taught me the importance of automated testing, how to create tests in C# with Visual Studio, and how to automate the testing with TFS, but node and VS Code are totally different.  Visual Studio is an IDE and will take care of almost everything for you.  You simply need to create a new project (a ‘test’ project) in your code and you are good to start writing tests.  Executing your tests is as easy as a file menu click or a keyboard shortcut.  You don’t have to think about the testing framework or how your code is built and executed, because the IDE takes care of that for you.
Testing Framework
The first question to answer was: what does it take to write a unit test in JavaScript?  After asking around on Google, I discovered the node packages mocha and should.  These libraries allow you to write simple, human-readable tests.  In JavaScript you don’t get classes per se; you need to think in terms of functions and prototypes.  In MS Test you typically create a test class that maps to one class in your code, the target class.  This thinking needs to evolve with JavaScript into files and functions: one test file is used to test one file of code, and you use functions, not classes, to define your tests.  With mocha there are two very important functions you need to know about, ‘describe’ and ‘it’.  When you see ‘describe’, think test class in MS Test; describe creates a container and label for your tests.  Now ‘it’ is the actual test; think test method in MS Test.  The ‘it’ function needs a name and a function to execute.  The should package is what will help you make your tests more readable.  This package is similar to the NuGet package Fluent Assertions.  It allows you to replace robotic assert statements with more fluent and readable asserts.  In my experience of training interns, this fluent syntax goes a long way toward helping new people understand unit testing and what is actually going on in the tests.
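As a sketch of how these pieces fit together, here is a small mocha/should test file; the calculator module it exercises is a hypothetical example of mine:

// test/calculator.test.js
var should = require('should');
var calculator = require('../calculator'); // hypothetical module under test

// 'describe' is the container and label -- think test class in MS Test
describe('calculator', function () {
  // 'it' is the actual test -- think test method in MS Test
  it('adds two numbers', function () {
    var result = calculator.add(2, 3);
    result.should.equal(5); // a fluent assert from the should package
  });
});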
Once you have these in place, you can run your tests with the test command in the command palette or the keyboard shortcut.
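For reference, a “run-tests” task can wire mocha into gulp via the gulp-mocha plugin; this is a minimal sketch assuming the test files live under a test folder:

// gulpfile.js
var gulp = require('gulp');
var mocha = require('gulp-mocha');

gulp.task('run-tests', function () {
  // read: false because mocha only needs the file paths, not the contents
  return gulp.src('test/**/*.js', { read: false })
    .pipe(mocha({ reporter: 'spec' }));
});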

Getting Started with Node

Recently I attended Microsoft’s Build conference in San Francisco.  Prior to the conference I had heard about Visual Studio Code and had started fiddling around with it and MVC Core on Linux.  While I was there I had the opportunity to explore a few sessions and labs related to VS Code and Node.js.  Over my career I have learned many techniques for building quality into software products (such as code reviews, unit testing, and automated builds), and I had also accepted that when it comes to the user interface (i.e. JavaScript in my world) those practices either do not apply or are too time consuming to be worth pursuing.  As a result, I had formed a biased opinion and did not see any reason why a sane developer would ever choose to use JavaScript as a server-side programming language.  And yet, they are, and Microsoft is promoting it.  What?  Why?  I have realized that I am missing something.  This blog series is documentation of how I’m exploring what I’m failing to understand, at the moment.
This first post will be very short because Microsoft has already done a good job of writing it 🙂  The first step in learning any new tool/framework/language is simply saying “hello” or more specifically, “hello world.”  Microsoft has provided an excellent tutorial for getting up and running with Node.


npm install -g express-generator
express ExpressApp
cd ExpressApp
npm install
You are then free to write in a hello world message…
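In the generated routes/index.js, that amounts to changing the model passed to the view; this sketch shows the express-generator default route with the title swapped for a greeting:

// routes/index.js (generated by express-generator)
var express = require('express');
var router = express.Router();

/* GET home page. */
router.get('/', function (req, res, next) {
  res.render('index', { title: 'Hello World' }); // was 'Express'
});

module.exports = router;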
You can then fire up the debugger and point a browser to http://localhost:3000 to see the hello world.
From here I added Bootstrap into the views and put together a nice starting point to branch out from into future learning.  You can see what I have done on my GitHub.