This is my first post in almost 18 months and the kickoff of a series on my role in leading my organization to the cloud, or, to borrow a buzz term, #DigitalTransformation! In the past I have been a diligent blogger, using my blog to document my explorations in technology and leadership; however, the last 18 months have been filled with many personal and professional highs and lows, and as a result I have not been keeping up with my blog. To kick off this new series (and it will be a big one, 9 or 10 posts) I need to start with a little background. Prior to my stewardship of SharePoint, we created an experimental SharePoint Online tenant to support a project that involved an outside engineering firm. As of 2018, that tenant was defunct. In early 2018, I also created a second, “temporary” tenant. After quickly spinning it up and building a forum site, it was turned over to the requestors, where it has sat, unused, ever since. Finally, in late 2017, I made the decision to pull the plug on a failing SharePoint 2016 upgrade. That decision played into a series of events that resulted in me rebuilding a team that, by early 2018, was focused on fixing SharePoint.
When it came time to spin up our new SharePoint Online environment, I discovered that, in addition to the two known tenants, a third tenant already existed for our company. This tenant had our desired tenant name, but it existed in an “unmanaged” state. It turns out that if people sign up for online services with their work email, Microsoft will create a tenant based on their email address, and about a dozen people had already signed up for the free version of Power BI. As a result, I needed to go through the admin takeover process. To complete this process, a shared inbox was created called “Office365Admin.” (This means there is no on-prem user account that syncs to Azure AD.) With that account I signed up for Power BI on office.com, then visited the admin center to start the admin takeover. The process is simple: it involved adding a TXT record to our DNS to prove we owned the domain. Once that was completed, our tenant setup was finished and ready for licensing.
Back in 2016, I led our company into our first Enterprise Agreement with Microsoft. Our plan at the time was to upgrade our traditional (on-prem) software platforms and start experimenting with cloud-hosted versions. For that reason, we purchased Office ProPlus with E3 add-ons. This license was allocated to one of our existing tenants, which was now dormant. With a support request and some email verification, our old tenant was scheduled for deletion and all the licensing moved to our new home in the cloud. All of this took less than one day to finish.
Between 2015 and 2018, SharePoint was one of many responsibilities for my team, but it was the one that was publicly failing and the one my company was losing faith in our ability to manage. Because of this, my role was refocused on fixing this problem. This setup process was one of the main reasons I chose to make the shift to the cloud: in one day, we had a production-ready SharePoint environment.
Today at Ignite, Satya Nadella kicked off the day with the keynote. I took four takeaways from the talk. First, Microsoft is not divesting from developers, infrastructure, data, or AI; all of those topics came up in the keynote, but they were not the focus. Under Mr. Nadella, Microsoft set out to reinvent themselves, and from where I was sitting today, I can say mission accomplished. Today I am proud to call myself a Microsoft developer (even if I am getting into management). So proud, in fact, that I was able to tell the story of Microsoft’s reinvention to a potential intern when they asked me about my team’s usage and acceptance of open-source technology and products. It is truly a great experience to tell my future employees that we embrace open source and are better because of it. Three years ago, Microsoft was focused on open source, Azure services, and AI. Today, they are leaders in those areas, so the focus for the future is changing. The three areas of focus I took from the keynote are “Modern Workspaces,” “Business Applications” (aka Dynamics 365), and quantum computing.
The idea of a “Modern Workspace” is the evolution of the workspace in a mobile-first, cloud-first world: seamlessly working from anywhere, at any time. This idea is the evolution of Office 365 into something bigger than a software service or offering. The Microsoft Graph is building out connections to help surface content and documents, and Microsoft’s acquisition of LinkedIn is also becoming part of this graph. One of the coolest announcements was Bing for Business. I sat in on a presentation on search this afternoon with Naomi Moneypenny and Kathrine Hammervold, and for the first time I truly understood why search in the enterprise is so hard. Microsoft and Google got search right in the consumer space, so it seems like it should just work in the enterprise, but it turns out the needs of enterprise search are vastly different from consumer search. Bing for Business will tie into the Microsoft Graph to make enterprise search better. Super exciting!
Dynamics 365 made a big appearance today in the sessions and in the keynote. There was a lot of talk about how this platform is leveraging the Microsoft Graph to improve business outcomes. I attended a Business Apps keynote after the main keynote, where they demoed uses of Dynamics 365, together with the Graph and LinkedIn, to make recruiting and talent discovery easier. My observation is that Microsoft is viewing Dynamics 365 as a mature offering and a natural extension of the core offerings of Office 365. I’m looking forward to exploring this product line in the future.
Quantum computing was the biggest topic in the keynote, and for me it came out of left field. There was a full panel discussion on the advances in building a quantum computer. Microsoft announced that quantum computing is available today in simulators, both in Visual Studio (on your PC) and in Azure. I think this is a clear signal that true quantum computing is coming, and coming at scale, soon; this is not some research or pet project at Microsoft. It feels very similar to Microsoft’s vision of AI three or four years ago, which today is very real. I can only imagine what quantum computing at scale will do for our world if it comes to fruition in the next four years.
In my last post (a long time ago) in this series, I described how to solve one of the most basic problems in web development with Node: user authentication. In this post I’m going to talk about another common web development problem, database access; specifically, using an ORM library to facilitate (and accelerate) database access. Since I first understood the concept of ORM (Object-Relational Mapping) and used it in a professional setting, I have been a huge fan. My first exposure to ORM (where I truly understood what was going on) was with LINQ to SQL while building a WCF service in C#. It was super empowering when I realized that I could query the database by writing C# code! No more string concatenation to build queries (a major vector for SQL injection attacks) and no more dependencies on other teams to build stored procedures. As a developer, this was extremely liberating! Naturally, when exploring a new framework, I want to learn what libraries and tools are available to provide the ORM. For Node, I have chosen Sequelize. I’m going to go over defining a model (a table) and querying the database.
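To make that injection risk concrete, here is a minimal sketch (the table name and hostile input are invented for illustration) contrasting a concatenated query with a parameterized one, which is what an ORM builds for you under the hood:

```javascript
// Illustrative only: a hostile input aimed at a hypothetical Users table.
const userInput = "'; DROP TABLE Users; --";

// Unsafe: the input is spliced straight into the SQL string, so the
// attacker's payload becomes part of the query itself.
const unsafeSql = "SELECT * FROM Users WHERE name = '" + userInput + "';";

// Safer: a parameterized query (placeholder syntax varies by driver)
// keeps the input as data, never executable SQL.
const safeSql = 'SELECT * FROM Users WHERE name = ?';
const params = [userInput];

console.log(unsafeSql.includes('DROP TABLE')); // true – the payload leaked into the SQL
```

The parameterized version never lets the `DROP TABLE` payload become part of the query text, which is exactly the guarantee an ORM gives you for free.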
In ORM, the model is the representation of your database table. It defines what columns there are, the data types they hold, relationships to other models (tables), and, in more advanced cases, things like indexes and constraints. In my experience, no ORM library gives you the full functionality of SQL when defining your tables, so the question is, why bother with defining models? The answer is portability. Sequelize (like any good ORM tool) can map to different SQL dialects: it supports MySQL, SQLite, and SQL Server. If you are running automated tests, no problem; Sequelize can use an in-memory SQLite database. If you are deploying to a Linux cloud or an on-prem Microsoft environment, again, no problem, because the library will generate SQL for the targeted environment. An ORM can also generate the database for you and update the schema. This concept is called ‘migrations’; it is a much more involved topic, and I will probably devote an entire post to it in the future. Below is a very basic model that describes a database table called ‘WeatherObservation.’ I have defined a few columns to store some data from a Raspberry Pi and a DHT11 sensor.