Excel PowerPivot: The poor man’s data warehouse

Recently, one of my employees was working to automate an Excel-based report.  The old-school approach to Excel automation involved heavy use of macros and VBA that would execute queries or copy data between sheets spread out on network shares to accomplish some data task.  In my experience this was very brittle and prone to failure.  This time around, I told my employee to use Power Pivot to accomplish the automation.  There is no VBA, only SQL statements that get executed, and the visualization is accomplished with Pivot Tables and Pivot Charts.  Power Pivot is awesome in that it can pull data from many different types of data sources: SQL Server, Oracle, SSRS, SSAS, MySQL, and Excel sheets.  The trick is that in order to make your data model work well for reporting you have to think a little like a data warehouse architect.  This will enable a power user to create self-refreshing data sets that can be analyzed in Pivot Tables and Pivot Charts.  Let me walk you through a simple Power Pivot model that I created to track some of my personal fitness goals and health metrics.
Let’s start with the all-important time table.  In data warehousing, almost all measurable data ties to a point in time, and we model that with a Time Dimension.  In my example, I have a sheet with one column containing a date and time.  I incremented each row by one hour and copied it down about 10,000 times.  You then click “Add to Data Model” and you have created a Power Pivot table.  To make this more useful and interesting for slicing data you need to add some columns.  For example, add the day of week with this expression: FORMAT([DateTime], “ddd”).  Finally, create a key column.  I learned this trick from a former colleague: you can create an easy-to-read key for dates with a little addition and multiplication.  You could easily exclude the time or expand it out to minutes or even seconds: (YEAR([DateTime]) * 10000) + (MONTH([DateTime]) * 100) + DAY([DateTime]).
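The DAX key above can be sanity-checked with a few lines of C# doing the same arithmetic (`DateKeyDemo` is just an illustrative name, not part of the workbook):

```csharp
using System;

// The same YYYYMMDD arithmetic the DAX calculated column performs:
// year shifted four digits left, month two digits left, plus the day.
public static class DateKeyDemo
{
    public static int DateKey(DateTime dt) =>
        (dt.Year * 10000) + (dt.Month * 100) + dt.Day;

    public static void Main()
    {
        Console.WriteLine(DateKey(new DateTime(2016, 3, 7))); // prints 20160307
    }
}
```

The result reads naturally as a date (20160307 = March 7, 2016), which is what makes it such a friendly join key between the fact tables and the time dimension.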
Adding a Sheet to Power Pivot
Adding Columns
Once you have your time table situated, you are ready to start creating some tables with data to measure, fact tables in the data warehouse lingo.  I wanted to start with three measures: calories, exercise, and weight.  All of these tables started out very simple: there was a column for the date and a column for the value.  In the Power Pivot designer I would then repeat the creation of the date key on each table.  The final step is to create an association between the fact tables and the time dimension using the calculated DateKey column in Power Pivot; the easiest way to do that is by clicking and dragging in the Power Pivot designer.  Once that is complete you are ready to start adding Pivot Tables or Pivot Charts to your spreadsheet.
A Pivot Chart
The Completed Model

Dependency Injection and ASP.NET MVC

For many years I have used dependency injection in the projects I have worked on professionally.  At work we have standardized on Microsoft’s Unity package.  Unity is great; I have nothing against it, but I wanted to learn more about how these IoC packages work under the hood and how much effort dependency injection really takes.  Turns out, with C#, it really doesn’t take much.  The .NET Framework includes reflection and allows you to dynamically create objects without much effort.  To build an object, all you need to do is reflect into the constructors, pick one to use (in my case I did a greedy search, finding the largest constructor I could satisfy), and build the object.  You have to recursively build the parameters to the constructor.  Because recursion is happening, it doesn’t matter how complex your object is; it can have zero or a hundred constructor parameters, and each parameter can have its own dependencies.  Recursion makes for a simple and elegant solution.
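Here is a minimal sketch of that idea, assuming only concrete types with public constructors (`TinyContainer`, `Repository`, and `Service` are hypothetical names for illustration, not a real library):

```csharp
using System;
using System.Linq;
using System.Reflection;

// Toy resolver, not production code: real containers also map interfaces
// to implementations, manage lifetimes, and detect cycles.
public static class TinyContainer
{
    public static object Build(Type type)
    {
        // Greedy search: pick the public constructor with the most parameters.
        ConstructorInfo ctor = type.GetConstructors()
            .OrderByDescending(c => c.GetParameters().Length)
            .First();

        // Recursively build each constructor argument the same way,
        // so arbitrarily deep dependency graphs just work.
        object[] args = ctor.GetParameters()
            .Select(p => Build(p.ParameterType))
            .ToArray();

        return ctor.Invoke(args);
    }

    public static T Build<T>() => (T)Build(typeof(T));
}

// Hypothetical dependency graph for illustration.
public class Repository { }
public class Service
{
    public readonly Repository Repo;
    public Service(Repository repo) { Repo = repo; }
}
```

Calling `TinyContainer.Build<Service>()` builds a `Repository` first, then invokes the `Service` constructor with it; the recursion bottoms out at parameterless constructors.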
My next challenge, and this is something that I have used at work many times, was to make this work in the MVC framework.  MVC relies on reflection and naming conventions to match HTTP requests to your controllers, and the magic is that MVC knows exactly where to send the request and how to build the controller object it needs.  But have you ever said, “gee, I really need a database connection in this controller, so I’ll just add it to the constructor because that will make testing really easy”?  MVC will puke on you, because it is also using a dependency injection tool under the hood, a very, very simple one that only builds objects with parameterless constructors.  Good news: almost every piece of MVC (at least everything I have ever worked with) is extendable, including its simple DI code.  MVC defines the IDependencyResolver interface for resolving dependencies, including controllers.
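As a hedged sketch, an implementation of that interface might look like the following; `ReflectionDependencyResolver` is a made-up name, the `Func<Type, object>` stands in for whatever recursive builder you wrote, and the code assumes the System.Web.Mvc assembly is referenced:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web.Mvc; // ASP.NET MVC assembly (Microsoft.AspNet.Mvc package)

// Plugs a home-grown reflection-based builder into MVC's DI extension point.
public class ReflectionDependencyResolver : IDependencyResolver
{
    private readonly Func<Type, object> _build;

    public ReflectionDependencyResolver(Func<Type, object> build)
    {
        _build = build;
    }

    public object GetService(Type serviceType)
    {
        // MVC probes for many internal services; returning null for anything
        // we cannot build lets MVC fall back to its defaults.
        try { return _build(serviceType); }
        catch { return null; }
    }

    public IEnumerable<object> GetServices(Type serviceType)
    {
        return Enumerable.Empty<object>();
    }
}

// Registered once at startup, e.g. in Global.asax Application_Start:
// DependencyResolver.SetResolver(new ReflectionDependencyResolver(TinyBuilder.Build));
```

The try/catch-to-null pattern matters: MVC calls GetService for far more than controllers, and a resolver that throws on unknown types will break the pipeline.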

Learning Friday: Entity Framework 7

Unit testing with Entity Framework
ah… scratch that, Entity Framework Core 1.0
Recently, one of my peers and I were tasked with creating a database to help us select and identify the different technical skills (PowerShell vs. C# vs. ASP, etc.) that we expect our team members to have as they progress through the technical career track.  Anyway, the details of the exercise are very uninteresting, but what is interesting is that I was able to use this as a chance to explore Entity Framework 7 (er, Entity Framework Core) and as a chance to share what I learned with most of the developers in my organization.
Entity Framework 7 (now Entity Framework Core) recently hit release candidate status, and there have been some serious changes under the hood to how this ORM works.  Entity Framework has always been built around the EDMX for creating the model that maps between your code and the database.  Later, the “Code First” model was introduced to eliminate the need for the XML-based model files and allow developers to build their models using code only.  This code-first approach still relied on all the EDM bits inside Entity Framework to actually work.  In EF Core, that model has been retired: the code-first method is the way to build your database models, with all the EDM bits left behind.
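As a hedged illustration of what code-first looks like without any EDMX, here is a minimal model using the EF7 RC-era API (`Microsoft.Data.Entity` namespace, later renamed `Microsoft.EntityFrameworkCore`); the `Skill` entity and context name are hypothetical stand-ins, not the actual schema from the exercise:

```csharp
using Microsoft.Data.Entity; // EF7 RC-era namespace, from the EntityFramework.Core package

// Plain CLR class: no XML model file, no EDM base classes.
public class Skill
{
    public int SkillId { get; set; }
    public string Name { get; set; }
}

public class SkillContext : DbContext
{
    public DbSet<Skill> Skills { get; set; }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // Keys and required (non-nullable) columns are declared in code.
        modelBuilder.Entity<Skill>().HasKey(s => s.SkillId);
        modelBuilder.Entity<Skill>().Property(s => s.Name).IsRequired();
    }
}
```

Everything the EDMX used to express now lives in `OnModelCreating` (or data annotations on the entity classes).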

At Associated we have expectations that all of our people are leaders, to some degree, and that all team members will be effective written and verbal communicators.  It is an emphasis that I have not seen in other organizations, but I have seen first-hand the costs of poor communication: wasted time in meetings and failure to properly execute on projects.  So, to promote these skills, and to promote a culture of learning, we created the “Tech Faire” platform.  It started as an every-other-Friday activity where people gave 20x20 pecha kucha-style presentations on a work-related (usually technical, since this started in our development departments) subject.  It was an opportunity to work on your communication and presentation skills and teach the rest of the department something.  Well, the Tech Faire has grown company-wide and now happens on a much less frequent basis.  To continue promoting learning and nudging people toward teaching others and growing their communication skills, I ran the first “Learning Friday” activity.  My idea is that these activities are 30-minute classes where you dive deeper into a technical subject than you would in a pecha kucha presentation.  It is my strong belief that our senior employees should be doing something like this to promote and grow our junior employees.  This Learning Friday platform is intended to fill the gaps between our Tech Faires and provide a new (smaller and less intimidating) platform to teach technical skills and grow communication abilities.

I used the exercise that I discussed above as a chance to explore Entity Framework 7.  Building a simple ASP.NET site that talked to a database (with EF7) took about 8 hours from zero to production ready: dependency injection, unit testing, the works.  I then spent about an hour stripping out the non-EF7 bits and grabbing some screenshots to create some documentation.  So for a total investment of one hour above what I was already asked to do, I had everything ready to share my knowledge about EF7 with our other developers.  My goal was to have 6 people (one from each development team) join me to explore EF7.  The results were amazing: 18 people joined in, almost all of our developers and database administrators.  During this exercise I walked through creating a database with SQL Server Data Tools, how easy it was to ship that to your local database, and how to then ship SQL to our DBAs to create the database in a test/production environment.  Our DBAs complained about some of the “junk” that the tool generates, but they all agreed it was a big improvement over “pasting SQL into a Word document.”  I then demonstrated how to generate a code-first model from an existing database, a big concern in the room because the EDMX is going away.  I then walked through unit testing with the in-memory database provider and configuration with the SQL Server provider.  All in all, there was about one man-day invested in this learning.  For that cost we probably saved several man-days of independent research and experimentation, and we all gained a common starting point and understanding of EF7 and the code-first design approach.  In my opinion, time well spent.  Below is the documentation I created for this exercise if you would like to see it or follow along yourself.

Start with a web project, a unit testing project, and an SSDT database project with your tables modeled.

Using SSDT to compare databases
Follow these steps to push your schema to a local database
Pushing Schema changes with SSDT
Steps for creating an Entity Framework model
Select code first (note I’m still using EF6 Power Tools)
Here is your code-first context.  Note the OnModelCreating method and the simple setup to identify keys and non-nullable columns.
This is our test context.  It will use the in-memory provider; to keep the in-memory objects available after the controller disposes the context, I have overridden the Dispose method.
Here is our test fixture; note the use of “UseInMemoryDatabase”.
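A minimal sketch of what such a test context might look like, assuming the EF7 RC-era in-memory provider (the EntityFramework.InMemory package); the `TestSkillContext` and `Skill` names are illustrative, not the code from the original screenshots:

```csharp
using Microsoft.Data.Entity; // EF7 RC-era namespace

public class Skill
{
    public int SkillId { get; set; }
    public string Name { get; set; }
}

public class TestSkillContext : DbContext
{
    public DbSet<Skill> Skills { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
    {
        // The in-memory provider keeps data in a shared in-process store,
        // so no real database is needed for unit tests.
        optionsBuilder.UseInMemoryDatabase();
    }

    // Suppress disposal so the in-memory data survives the controller's
    // using-block and can still be inspected by test assertions.
    public override void Dispose()
    {
        // intentionally left empty
    }
}
```

A test fixture can then new up this context, seed a few rows, hand it to the controller under test, and assert against the same in-memory store afterwards.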
Entity Framework 7 is platform agnostic, so there is no built-in concept of schemas or identity columns; by adding the SQL Server extensions you can get those things into your context.
Finally, here is how you wire up your context to your connection string.
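Combining the two previous points, here is a hedged sketch of a SQL Server-backed context using EF7 RC-era APIs (EntityFramework.MicrosoftSqlServer package); the connection string, entity, and context names are all illustrative placeholders:

```csharp
using Microsoft.Data.Entity; // EF7 RC-era namespace

public class Skill
{
    public int SkillId { get; set; }
    public string Name { get; set; }
}

public class SqlSkillContext : DbContext
{
    public DbSet<Skill> Skills { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
    {
        // Connection string value is a placeholder; in a real app it would
        // come from configuration rather than being hard-coded.
        optionsBuilder.UseSqlServer(
            @"Server=(localdb)\mssqllocaldb;Database=Skills;Trusted_Connection=True;");
    }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // Schema and identity columns come from the relational/SQL Server
        // extensions, not the provider-agnostic core model.
        modelBuilder.HasDefaultSchema("dbo");
        modelBuilder.Entity<Skill>().HasKey(s => s.SkillId);
        modelBuilder.Entity<Skill>()
            .Property(s => s.SkillId)
            .UseSqlServerIdentityColumn();
    }
}
```

Because the SQL Server-specific calls live in `OnModelCreating`, the same entities can be reused unchanged against the in-memory provider for testing.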

Episode 5: Strategy

The team that I took over had a vacancy on it.  During one of my early conversations with Robert, he said that filling that position should be one of my top priorities.  Robert told me a story about why the position was posted the way it was.  We were (at the time) looking for a SharePoint specialist to supplement the SharePoint responsibilities of the team.  The story was that in early 2015 we lost our most experienced SharePoint specialist, and there were two individuals with relatively little experience left to support our SharePoint environment.  Due to the loss of experience and our pending upgrades to SharePoint 2013 and eventually SharePoint 2016, Robert was seeking a third person to focus on supporting our SharePoint infrastructure.  I questioned Robert’s conclusions because, even though SharePoint is very important to my organization, our SharePoint environment is really not that big.  Early in my time with the team I found a blog post from Mark Rackley about the size and makeup of a SharePoint team.  The bottom line is that to support a user base of 5 to 10 thousand people you need nine to twelve people.  My organization is a full order of magnitude smaller than that, 600 users.  Two people should be enough to support our SharePoint environment, with some supplemental talent from consultants to help with major projects or upgrades.  At the same time, I held a meeting with the team to review and outline all of the different responsibilities that we as a team had.  I used this exercise to help everyone see the bigger picture of what was going on across the entire team, but also to solicit input on what should be done with this open position.  The team identified web/HTML technologies as the area where we needed the most help and recommended that we hire a fourth programmer to focus on web technologies.  They cited several large 2016 projects and the possibility of assisting with branding in the coming SharePoint upgrade.
By the time I started my third month in the new role, I was able to focus on the stories and assumptions that my predecessor and my team were making about this open position.  For better or worse, I chose not to trust either set of assumptions and stories, and instead validated the needs against the data from our work tracking system.

Let me start by explaining what we track in our work tracking system, TFS.  We track work items that map to emergent support requests (work we allocate time to but don’t plan for), project work (that we do plan for; see my earlier post, Plotting a course), and recurring “meta” work (which includes things like training, weekly tactical meetings, one-on-ones, etc.).  We also track time for the different tasks.  I could look at the work we completed during my first 10 weeks on the job and compare that to the project plan from our tri-annual meeting to identify the type of work being done and the talent needs for the team for the coming year.

The Platform Services Team supports numerous systems, and each requires a slightly different skill set to support and maintain.  To reflect this in my analysis, I organized all of the different things we build and support into a two-tier hierarchy with the TFS Area field.  For the first level in this hierarchy, I assigned a percentage for how much of the work required an “Analyst” and how much required a “Developer.”  I also went one step further by breaking out SharePoint Analyst and Web Developer roles based on the stories I heard from my predecessor and my team.  When I looked at all of the support and project work (the work that requires specialized talent) that we completed over the last 10 weeks, a trend emerged.  Additionally, when I looked at all of the 2016 projects and their initial size estimates, that trend continued.  Historically we had 4 FTEs performing Analyst-type work, with 2 FTEs dedicated to our SharePoint environment.  For the developers, historically we had 2 FTEs performing developer work, with one FTE focused on web development.  The projects that we have planned with our business partners for 2016 had similar expectations for the type of talent that would be required from the team.  With one team member focusing solely on web development, we appeared to have our development needs covered with existing staff.  With two team members focusing solely on SharePoint, we also appeared to have our SharePoint needs covered with existing staff.  With the numbers showing that we were spending, and expected to continue spending, more time than available talent on Analyst-type work, I decided that we needed to hire a fourth analyst.

Graph of FTE work on the team

When I shared the results of the analysis with the team, they were not immediately aligned with it, but after focusing on the data and the story it told, most everyone accepted it as a logical direction to take the team.  Below is an excerpt from my team strategy document that focuses on the team structure that this analysis helped shape.


We informally view ourselves as multiple teams under a single organizational structure.  The needs of the organization are dynamic, but by organizing ourselves in a rigid manner we reduce our ability to meet the needs of our Business Partners.  As a result of this rigidness, we struggle (often as individuals) to meet peak needs from our Business Partners.

The PST is tasked with providing tier 2 technical support and IS project implementation for 4 of the 7 divisions at Associated.  At the beginning of 2015 there were 8 team members dedicated to these business units.  Through attrition and reorganization, by the end of 2015 there were 6 team members dedicated to the same business units.  This downsizing has created islands and single points of contact for several business units and products that the PST supports.

AECI has strategically operated as a fiscally lean organization.  Information Services supported this strategy by eliminating operating and capital costs through in-house software products.  Starting in the late 2000s, this strategy began shifting as the cost of purchasing software products became lower than the labor cost of building them.  Today our product strategy is to rely on the market to provide software solutions that Information Services can customize and extend, and to build software only where a market solution does not exist.  Job roles and descriptions have not kept pace with this change in strategy.

Guiding Policy:

Data from 2015 and the project outlook for 2016 indicates that a large amount of effort has been (and will continue to be) expended by Analyst talent, or developers performing in the analyst role.  As a result, better definition of the Analyst role and the Analyst career path is necessary.  Additionally, the PST requires a high degree of specialization with the SharePoint Product. Critical mass for supporting and maintaining our SharePoint environment is 2 FTE Analysts.  As such, we should explore new job descriptions that reflect this specialization and evaluate market data of SharePoint analysts vs. generalist analysts.  Finally, the existing open complement position will be filled as an Application Analyst with emphasis on the AssurX platform.

Historically, the members of the PST have been dedicated to a division within AECI.  This practice has served AECI well in the past; however, in our current environment, it has left individuals islanded and siloed.  Members of the PST do not have backup for their critical tasks.  As such, the PST will treat itself as a talent pool; team members will align themselves with a job role and not with a business unit.  The PST will be viewed as a tier-two IS support organization and IS project implementation team for 4 of the 7 divisions at Associated.

Knowledge silos must be eliminated.  The PST has historically been a software development organization; that, coupled with historical formal and informal divisions of the team, has created an environment where many critical line-of-business systems are supported and maintained by a single team member.  As our Analyst role is better defined and the new analysts ramp up, this pattern must change so that Analysts are the primary support for critical products and developers serve as secondary support.

Development staff will focus on a minimalist, web-based technology stack for custom application development.  This will enable the developer team members to create and support a multitude of applications that serve business partners outside of their traditional areas of focus.

Episode 4: Plotting a course

At the beginning of my second month, I had clarity on my immediate expectations from my manager.  Additionally, all of the team’s emergent work was being managed through a Service Level Agreement with assigned deadlines for all incoming requests.  With clarity provided, my focus turned to our tri-annual project planning process.
To begin my preparations for the tri-annual project planning, I started by organizing all the team’s projects into a central location in OneNote.  Active projects were pulled from other places in OneNote, TFS, and people’s heads.  When everything was collected there were a total of 51 active projects.  Fifty-one active efforts for seven team members; that’s more than seven per employee!  How on earth could anyone function with seven active projects?  As I dug deeper, I began to discover that there was some serious housecleaning that needed to happen.  Many of the projects were in fact completed, some years ago.  Some projects were so small that they really were not worth the effort it would take to plan them out, so we started working on them through the Service Level Agreement.  Finally, some of these projects were still only ideas in a planning phase.  After cleaning up everything, I was left with about 30 projects to plan out for a team of 6 for 2016.  With all the active and anticipated projects defined in a central location, my next effort was to standardize the way that our project work was tracked and documented.  This was not as difficult as you may think, because many people were already using OneNote as the tool for documenting their project work.  Each project had a documentation hub created (a section in OneNote), and each project was given a project work item in TFS (for work tracking).  With this base established, I could now have an intelligent conversation with my business partners about what projects we should be undertaking for them in 2016.
The Tri-Annual Update has its roots with our former CIO.  It originally started as a quarterly update that Information Services gave to each of the other 6 divisions at Associated Electric on the various projects that IS was undertaking on their behalf.  This began as a communication tool that our CEO implemented to improve communication and collaboration between IS and the other divisions.  Today, under the leadership of our current CIO, it is an open forum to present Information Services’ KPIs and gain alignment on our project plan.  To create an initial project plan for the different divisions, we estimate the size of each project using “Project Points.”  There’s a complicated formula that determines the definition of a Project Point, but the simple explanation is that one point is about 100 man-hours of effort.  Each business division is assigned a budget of project points based on its needs and the amount of time that the team was able to spend on projects in the last 4 months.  With size estimates and budgets, we simply let our business partners elect how to spend their points, see below.  The output of this exercise is the proposed project plan for the next 12 months.
At the end of the planning session, a project plan document is produced and distributed to the managers of the business unit for their review.  Immediately before the start of the new trimester (December 1st), the CIO and the IS management team meet with the director and managers of the business unit in the “Tri-Annual Update Meeting.”  Two of the business units that I support are fairly small and do not have many projects to discuss, so the updates to those two groups focused on the IT group, my team, and how well we were resolving their support requests.  The Accounting and Finance division is the largest business unit that I support and, as such, they have the most needs from my development staff.  In addition, I had just lost a member of the development staff who historically supported accounting to Michael’s team as backfill for me leaving.  This meeting was fairly tense.  I was leading the conversation on the project plan and there were two points of contention: first, a single project on the plan was taking over half of their available project points for 2016, and second, they didn’t feel they were given enough project capacity.  I directed the latter part of the conversation around the fact that capacity was governed by team members, and the aggregate team was now down two members (one for the backfill and one vacancy that had been dedicated to SharePoint).  The conversation lasted over an hour and a half and led to some very good discussions about how accounting should best spend their project points.  At the end, my liaison and I left needing to do a little revision to our project plan, but all the managers were agreeable to that plan.
I have two lessons to share from this part of my experience.  First, the importance of preparation: I spent many days reviewing, pruning, and cleaning up the documentation related to all of the team’s projects.  This was time well spent; it made the planning sessions and the actual updates fairly easy and empowered me to speak knowledgeably on all areas of my new team.  Second, in difficult conversations, stay focused on the facts.  The accounting team was, understandably, unhappy with the loss of capacity from my team, but keeping them focused on the facts and the why helped smooth that conversation over and kept us focused on the important parts of the conversation.