Spinning up a tenant


This is my first post in almost 18 months and the kickoff of a series on my role in leading my organization to the cloud, or to borrow a buzz term, #DigitalTransformation!  In the past I have been a diligent blogger, using my blog to document my explorations in technology and leadership; however, the last 18 months have been filled with many personal and professional highs and lows, and as a result I have not been keeping up with my blog.  To kick off this new series (and it will be a big one, 9 or 10 posts) I need to start with a little background.  Prior to my stewardship of SharePoint, we created an experimental SharePoint Online tenant to support a project that involved an outside engineering firm.  As of 2018, that tenant was defunct.  Also, in early 2018, I created a second, “temporary” tenant.  After quickly spinning it up and building a forum site, it was turned over to the requestors, where it has sat unused ever since.  Finally, in late 2017, I made the decision to pull the plug on a failing SharePoint 2016 upgrade.  That decision played into a series of events that resulted in me rebuilding a team that, by early 2018, was focused on fixing SharePoint.

When it came time to spin up our new SharePoint Online environment, I discovered that, in addition to the two known tenants, a third tenant already existed for our company.  This tenant had our desired tenant name, but it existed in an “unmanaged” state.  It turns out that when people sign up for online services with their work email, Microsoft will create a tenant based on their email domain, and about a dozen people had already signed up for the free version of Power BI.  As a result, I needed to go through the admin takeover process.  To complete this process, a shared inbox was created called “Office365Admin,” which ensured the admin account would not be tied to an on-prem user account that syncs to Azure AD.  Using that account, I signed up for Power BI and then visited the admin center to start the admin takeover.  The process is simple and involved adding a TXT record to our DNS to prove domain ownership.  Once that was completed, our tenant setup was done and ready for licensing.

Back in 2016, I led our company into our first Enterprise Agreement with Microsoft.  Our plan at the time was to upgrade our traditional (on-prem) software platforms and start experimenting with cloud-hosted versions.  For that reason, we purchased Office ProPlus with E3 add-ons.  This license was allocated to one of our existing tenants, which was by then dormant.  With a support request and some email verification, our old tenant was scheduled for deletion and all the licensing moved to our new home in the cloud.  All of this took less than one day to finish.

Between 2015 and 2018, SharePoint was one of many responsibilities for my team, but it was the one that was publicly failing and the one my company was losing faith in our ability to manage.  Because of this, my role was refocused on fixing that problem.  One of the main reasons I chose to make the shift to the cloud was this setup process: in one day, we had a production-ready SharePoint environment.

Ignite 2017: Dynamics 365

Day two of Ignite, and this one was kind of a letdown. I was super stoked to learn more about containers, so I scheduled two container-related classes, one focused on SQL Server and one focused on Visual Studio. The latter was a complete letdown: it was a demonstration of new and upcoming Visual Studio features, they didn’t even talk about containers, and worse, some of the stuff demoed I am already using and some of it I saw last year at Build 2016. The SQL Server container talk was very interesting; however, I didn’t learn anything new. I already knew that containers were stateless. I already knew that being stateless creates a problem for any database. Finally, I already knew that some type of attached storage would be required to make SQL Server work in containers. The session was interesting, but I didn’t leave with any new ideas. My greatest inspiration today came from my first session of the day, on Dynamics 365 for Finance and Operations.

My tweet puts it best: Dynamics 365 could be a digital transformation for my organization. If I ask myself what causes me the most pain at work, the answer would be on-prem software upgrades. These “projects” suck the life out of my team; they add zero value to the organization (probably even negative value, given the time the business must commit to communicate, train, and test) and they block my team members from doing their primary job: delivering business solutions with software. Dynamics 365 can solve this problem. Going to a hosted solution will permanently get my team out of the software patching and upgrading business. With that reclaimed time, we can learn Power BI and Power Apps and extend and enhance these platforms to add value beyond the core offering. The lesson I am leaving with today is that I must be dauntless in focusing my team on the work where they add the most value to our organization, and in getting the mundane, non-value-adding work out of my organization and into the hands of a trusted vendor. O365, SharePoint Online, and Dynamics 365, I am ready to start exploring what you offer.

Ignite 2017: Modern Office, Business Apps, and Quantum Computing?

Today at Ignite, Satya Nadella kicked off the day with the keynote. I got four takeaways from this talk. First, Microsoft is not divesting from developers, infrastructure, data, and AI; all of those topics came up in the keynote, but they were not the focus. Under Mr. Nadella, Microsoft set out to reinvent themselves, and from where I was sitting today, I can say mission accomplished. Today I am proud to call myself a Microsoft developer (even if I am getting into management), so proud, in fact, that I was able to tell the story of Microsoft’s reinvention to a potential intern when they asked me about my team’s usage and acceptance of open source technology and products. It is truly a great experience to tell my future employees that we embrace open source and are better because of it. Three years ago Microsoft was focused on open source, Azure services, and AI. Today, they are leaders in those areas, so the focus for the future is changing. The other three takeaways, and the areas of focus I took from the keynote, are “Modern Workspaces,” “Business Applications” (aka Dynamics 365), and, surprisingly, quantum computing.

Modern Workspaces
The idea of a “Modern Workspace” is the evolution of workspaces into a mobile-first, cloud-first world: seamlessly working from anywhere at any time. This idea is the evolution of Office 365 into something bigger than a software service or offering. The Microsoft Graph is building out connections to make content and documents easier to discover, and Microsoft’s acquisition of LinkedIn is also becoming part of this graph. One of the coolest announcements was Bing for Business. I sat in on a presentation on search this afternoon with Naomi Moneypenny and Kathrine Hammervold, and for the first time I truly understood why search in the enterprise is so hard. Microsoft and Google got search right in the consumer space, so it seems like it should just work in the enterprise, but it turns out the needs of enterprise search are vastly different from consumer search. Bing for Business will tie into the Microsoft Graph to make enterprise search better. Super exciting!

Business Applications
Dynamics 365 made a big appearance today in the sessions and in the keynote. There was a lot of talk about how this platform is leveraging the Microsoft Graph to improve business outcomes. I attended a Business Apps keynote after the main keynote, where they demoed uses of Dynamics 365, with the Graph and LinkedIn, to make recruiting and talent discovery easier. My observation is that Microsoft views Dynamics 365 as a mature offering and a natural extension of the core offerings of Office 365. I’m looking forward to exploring this product line in the future.

Quantum Computing
This was the biggest topic in the keynote, and for me it came out of left field. There was a full panel discussion on the advances in building a quantum computer. Microsoft announced that quantum computing is available today in simulators, both in Visual Studio (on your PC) and in Azure. I think this is a clear signal that true quantum computing is coming, and coming at scale soon; this is not some research or pet project at Microsoft. It feels very similar to Microsoft’s vision of AI three or four years ago, which today is very real. I can only imagine what quantum computing at scale will do for our world if it comes to fruition in the next four years.

Container Fest

This week I had the opportunity to attend Microsoft Ignite in Orlando. I will be making my best effort this week to document and communicate my learnings each day. Today was day zero, and I was able to attend the pre-conference training “Container Fest.” Overall, I was a little disappointed because the session was very lecture-heavy and light on demos and labs; my crew and I even got kicked out at 6 PM while we were working on the lab :(. Don’t get me wrong, I learned a ton, and this class validated my assumptions about containers: they totally rock!

It is cliché, but containers are designed to “build once, run anywhere.” My entire career, I have been chasing the dream where a line of code is written once and works anywhere I want it to. Despite all my unit testing, automated deployments, and documentation, production applications break because changes were made to the system, network, software, etc.

Infrastructure as code is a requirement with containers. Your server configuration, network setup, patching level, everything about your container, is defined in code. You have all the benefits associated with that, such as rapid deployments, scalability, and change management. Most importantly, everyone that is working on a product or solution is speaking in the same language and using the same technology stack to solve the problem at hand.

There is still overhead to maintaining and operating these containers and their hosts. You still have operating systems that need to be patched and rebooted. Even though there is zero downtime if done correctly, your software stack will still be updated, and you will still need to keep it up to date. If you set up your build and deploy process well, this overhead can be minimal, but it is still a consideration.

I see two situations where containers are a no-brainer. The first is if you value portability: containers run on-prem and in any public cloud, so if portability or diversification is part of your cloud strategy, containers enable it. The second is applications where you do not have sufficient control over the build and release process, i.e. third-party software. I still have a lot to learn about how to make this work, but containers can make updates to systems seamless and automatic. More importantly, they can make changes to the underlying software stack equally seamless and automatic. I cannot count the number of times that an operating system update had some negative impact on a production system that I had to deal with the next morning. It is not that the people applying the updates are doing anything wrong; they just don’t know everything about every system. Containers have the potential to automate and improve the delivery of changes to first- and third-party systems, and I intend to explore more and learn how.

ORM in Node.js

In my last post in this series (a long time ago), I described how to solve one of the most basic problems in web development with Node: user authentication.  In this post I’m going to talk about another common web development problem, database access; specifically, using an ORM library to facilitate (and accelerate) database access.  Since I first understood the concept of ORM (Object-Relational Mapping) and used it in a professional setting, I have been a huge fan.  My first exposure to ORM (where I truly understood what was going on) was with LINQ to SQL while building a WCF service in C#.  It was super empowering when I realized that I could query the database by writing C# code!  No more string concatenation to build queries (a major exposure to SQL injection attacks) and no more dependencies on other teams to build stored procedures.  As a developer, this was extremely liberating!  Naturally, when exploring a new framework I want to learn what libraries and tools are available to provide the ORM.  For Node, I have chosen Sequelize.  I’m going to go over defining a model (a table) and querying the database.

The Model

In ORM, the model is the representation of your database table.  It defines what columns there are, the data types they hold, relationships to other models (tables) and, in more advanced cases, things like indexes and constraints.  In my experience, no ORM library gives you the full functionality of SQL for defining your tables, so the question is: why bother with defining models?  The answer is portability.  Sequelize (like any good ORM tool) can map to different SQL dialects; it supports MySQL, SQLite, and SQL Server.  If you are running automated tests, no problem, Sequelize will use the in-memory SQLite database.  If you are deploying to a Linux cloud or an on-prem Microsoft environment, again, no problem, because the library will generate SQL for the targeted environment.  An ORM can also generate the database for you and update the schema.  This concept is called “migrations” and is a much more involved topic that I will probably devote an entire post to in the future.  Below is a very basic model that describes a database table called ‘WeatherObservation.’  I have defined a few columns to store some data from a Raspberry Pi and a DHT11 sensor.


Once you have the model defined, inserting, updating, and querying data in your database becomes native code.  It is the same as all the other code you are writing for your application.  Below are two examples.  The first is a simple GET operation that returns all records in a database table.  Notice that you don’t have to write any SQL or parse fields out of the query results; the ORM library does all of that for you, and you get your data back in a format that is easy to manipulate and work with.  You are also using the “all” method from Sequelize, so you are simply writing JavaScript.  The second example is a POST operation that generates an INSERT statement using the “create” method from Sequelize.