Automation Series 1
Summary
This blog is the first in a series, starting today and continuing over the next few weeks, that focuses on how IT shops can easily automate dashboard creation in Power BI and then present that information to their users in a very mobile-friendly way through the Power BI app. Going a step further, we will use Microsoft's Azure Automation in the cloud to run on-premises runbooks that collect data from Service Manager and upload it to Power BI using PowerShell and the Power BI API.
The Challenge with Automation
In the world of information technology, as you can imagine, information is power. The problem we run into in Service Management is the limitations of our reporting tools, and that is particularly true of Service Manager.
I'm not going to sugarcoat it, so let's get right to it: Service Manager's built-in reporting tools are limited and difficult to use. In many cases it takes a consultant, or someone very knowledgeable in the ways of Service Manager, Excel, or SQL, to put together a report that really provides value to middle and upper management. What we are really looking for is actionable intelligence, delivered in real time or near real time.
This is what we are going to focus on in this series of posts, but we are going to do it using the very latest from Microsoft in terms of cloud based reporting and automation.
The Automation Solution
To accomplish our goals in this series, we will use PowerShell to access the data points we need, and to manipulate and transform that data before uploading it to Power BI.
To provide near-real-time data in our dashboards and to drive the automation, we'll use Azure Automation with a Hybrid Runbook Worker.
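For context, here is a rough sketch of how a runbook like the one we'll build could be started on a Hybrid Runbook Worker group using the Az PowerShell module. The resource group, automation account, runbook, and worker group names are placeholders, not values from this series.

# Sketch: start a runbook on an on-premises Hybrid Runbook Worker group
# (all names below are placeholders for illustration)
Import-Module Az.Automation

Start-AzAutomationRunbook `
    -ResourceGroupName "rg-automation" `
    -AutomationAccountName "aa-servicemanager" `
    -Name "Upload-IncidentsToPowerBI" `
    -RunOn "OnPremHybridWorkers"   # run on the Hybrid Worker group instead of an Azure sandbox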
Part 1: Gathering and Transforming Incident Data
To keep the focus on the process and capabilities of this solution, we are going to keep the data collection as simple as possible for this first example, and then we will advance the solution further later in the series.
We only need a few data points to build a very functional Incident table in Power BI, as highlighted in the code walked through below.
Let's break the code down a bit further so we can see what we are doing at each step.
First we have the Debug function. All it does is give us an easy way to post output as we move through the script, which will help when we are debugging while running via an Azure Automation runbook.
#captures debug statements globally
function Debug {
    param($text="")
    Write-Output $text
}
Because the output is centralized in one function, we can change how it is handled later (for example, logging to a file) without having to change every debug line in the script.
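For example, a later version of the Debug function could also append to a log file; the log path below is just a placeholder for illustration.

#captures debug statements globally and also appends them to a log file
#(the log path is a placeholder)
function Debug {
    param($text="")
    Write-Output $text
    Add-Content -Path "C:\Logs\IncidentUpload.log" -Value ("{0} {1}" -f (Get-Date -Format "s"), $text)
}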
In the next bit of script we pull the Incidents out of Service Manager and then log how many were successfully retrieved. In this example we pull every Incident on every run, but in a production environment you would keep track of the last Incident you pulled and only grab the newest ones for upload.
#region Collect Incident data using the SMLets
$class = Get-SCSMClass WorkItem.Incident$
$incidents = Get-SCSMObject -Class $class | Select-Object Id, Title, Classification, Status, CreatedDate, LastModified
Debug -text ("Working on {0} [{1}]" -f "Incidents", $incidents.count)
#endregion
We put the Incident class into a variable for easy reuse, and then we use Get-SCSMObject to retrieve the Incidents, selecting only the properties we are interested in.
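As a rough sketch of the incremental approach mentioned earlier (not part of this post's script), you could persist a timestamp from the last run and filter on CreatedDate. In a real implementation you would push the filter into the Get-SCSMObject call itself; for clarity this sketch filters the objects after retrieval, and the watermark file path is just a placeholder.

#sketch: only process Incidents created since the last run
#(the watermark file path is a placeholder)
$watermarkFile = "C:\Automation\LastIncidentPull.txt"
$lastPull = if (Test-Path $watermarkFile) { Get-Date (Get-Content $watermarkFile) } else { (Get-Date).AddDays(-30) }

$newIncidents = $incidents | Where-Object { $_.CreatedDate -gt $lastPull }

#record this run's high-water mark for next time
(Get-Date -Format "o") | Set-Content $watermarkFile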
The last bit of code for this first post is the transformation code. Those of you familiar with Service Manager are most likely well aware that properties such as Status and Classification are actually enumeration values, so the Incident returns the internal name (for example, IncidentStatusEnum.Active) rather than the display name we want. We therefore transform each of these properties to its display name before pushing the data up into the Power BI table. The same method works for other values, and even for a type projection when you need the affected user and similar related information. Those are more advanced concepts to be covered another day, but a quick sketch follows for the curious.
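A type projection lookup might look something like the sketch below. Get-SCSMObjectProjection is part of SMLets, but the projection name, filter, and property path shown here are assumptions to verify against your own management packs.

#sketch: pull the affected user for a single Incident via a type projection
#(projection name, filter, and property path are assumptions; verify in your environment)
$projection = Get-SCSMObjectProjection -ProjectionName "System.WorkItem.Incident.ProjectionType" -Filter "Id -eq IR1234"
$affectedUser = $projection.AffectedUser.DisplayName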
Let's walk through the code below before we close out this post:
#region Transform incident data to human readable values
$allIncidents = @()
foreach($incident in $incidents){
    $obj = @{}
    $obj.Id = $incident.Id
    $obj.Title = $incident.Title
    $obj.Classification = $incident.Classification.DisplayName
    $obj.Status = $incident.Status.DisplayName
    $obj.Created = Get-Date($incident.CreatedDate) -Format "yyyy-MM-dd"
    $obj.Modified = Get-Date($incident.LastModified) -Format "yyyy-MM-dd"
    $allIncidents += New-Object -TypeName PSObject -Prop $obj
}
#endregion
The first step is to create a new array to hold our newly transformed Incidents. We then loop through each Incident, convert the dates, the Status, and the Classification to readable values, build a new object, and add it to the array.
We now have our table of Incidents, transformed into human-readable information, and we are ready to upload it to Power BI.
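To preview where this is heading (the next post covers the details and limitations), pushing the rows into a Power BI push dataset comes down to a single REST call. The dataset ID, table name, and access token below are placeholders, and acquiring the token is a topic for later in the series.

#sketch: push the transformed Incidents into a Power BI push dataset
#(dataset ID, table name, and token are placeholders)
$datasetId = "<your-dataset-id>"
$token     = "<access-token-from-azure-ad>"
$uri       = "https://api.powerbi.com/v1.0/myorg/datasets/$datasetId/tables/Incidents/rows"

$body = @{ rows = $allIncidents } | ConvertTo-Json -Depth 3

Invoke-RestMethod -Method Post -Uri $uri -Body $body -ContentType "application/json" `
    -Headers @{ Authorization = "Bearer $token" }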
In the next post we’ll focus on the pros and cons of uploading data into Power BI and what the limitations are.
Continue to Part 2 of the Automation series.
Ready to transform your SCSM experience? View all of the exciting apps Cireson has to offer.