SCOM / SCSM – Retrieve Decrypted RunAs Account Credentials


I am not sure if you have seen it, but Richard Warren from nccgroup has figured out how to decrypt the RunAs account credentials in SCOM. The problem until now was that there was no official way to retrieve the encrypted credentials from SCOM; there is just one DLL which offers the decrypt method. He has published an EXE and a PowerShell script on GitHub. I know there are always two sides of the coin, and in this case there is an evil and a good way of using this knowledge. I think I don’t have to talk about the evil way; instead I would like to talk about its benefit.

Richard Warren used it for SCOM RunAs accounts, but Service Manager (SCSM) is based on the same framework, so I was curious whether this approach would also work for SCSM. In fact, it did! Why is this awesome? Well, think about it: we are able to “securely” store credentials in SCSM (or SCOM) using RunAs accounts, and now we are able to retrieve those credentials easily. Because I do a lot of automation in SCSM using service requests and itnetX PowerShell activities, I have always had some trouble storing credentials in a safe manner. There are many ways to do so, like exporting the credentials to XML (Export-CliXml), using certificates, encrypting the credentials with a key and storing it somewhere, or storing the credentials in SMA and retrieving them using PowerShell. Whatever method you use, you will end up with more or less problems. The best approach would be to store the credentials on the system where you need them (SCSM), so that the SCSM administrator can manage these accounts without having to dig into PowerShell code, certificates, etc. Therefore RunAs accounts are a perfect way of storing credentials.
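
As an illustration of why these workarounds are clunky, here is a minimal sketch of the Export-CliXml variant mentioned above (the paths are just examples). The password is protected with DPAPI, so the file can only be decrypted by the same user account on the same machine that created it, which is exactly what makes this approach awkward for distributed automation:

# Store a credential on disk (DPAPI-protected, bound to this user and machine)
$cred = Get-Credential
$cred | Export-Clixml -Path 'C:\Secure\svc-account.cred.xml'

# Later, e.g. inside a script running as the same user on the same machine
$cred = Import-Clixml -Path 'C:\Secure\svc-account.cred.xml'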

Because of all that, I took Richard’s sample, modified the code a bit so it can be used on both SCOM and SCSM, and made it return proper output. The PowerShell module returns a credential hash table. You need to execute the module on the SCOM or SCSM management server, and the only parameter you need to provide is the RunAs account display name, as in the following example.

In SCOM the RunAs account looks like this…

[Screenshot: the RunAs account in the SCOM console]

…and if you use the PowerShell module it works like this…

[Screenshot: running the PowerShell module against the RunAs account display name]

You can download the module from the PowerShell Gallery. Be aware that you need permission to access the database and the management server.
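
As a rough sketch of the call pattern (the module and function names below are placeholders, not the published ones, and the keys of the returned hash table are assumptions):

# Hypothetical example - run this on the SCOM / SCSM management server
# Module and function names are placeholders for the module published on the PowerShell Gallery
Import-Module RunAsCredentialDecrypt

# The only required parameter is the RunAs account display name as shown in the console
$runAs = Get-DecryptedRunAsCredential -DisplayName 'SQL Monitoring Account'

# Assuming the hash table exposes UserName and Password, build a PSCredential for reuse
$secure = ConvertTo-SecureString -String $runAs.Password -AsPlainText -Force
$psCred = New-Object System.Management.Automation.PSCredential ($runAs.UserName, $secure)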


SMA Authoring Toolkit – Some Runbooks Are Not Showing Up


When you are creating runbooks in SMA (Service Management Automation) and you are using the SMA Authoring Toolkit available on the PowerShell Gallery, you might also have faced a very annoying bug. If you have a certain number of runbooks in SMA and you browse through the runbook list in ISE, you simply cannot find certain runbooks. Refreshing the list does not help at all.


If you open SMA and browse the runbook list, you can see that they are all published and in a “healthy” state, so there is no reason for them not to show up.


SMA – Invoke Runbook Error “Cannot find the ‘’ command.”


This is just a quick post about SMA. I have bumped into this error many times while writing PowerShell workflows in SMA…


At line:3497 char:21
+     PAT000287-RelateAppSetSR -AppSetID $AppSet -SRID $Applicatio ...
+     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Cannot find the 'PAT000287-RelateAppSetSR' command. If this command is defined as a workflow, ensure it is defined before the workflow
that calls it. If it is a command intended to run directly within Windows PowerShell (or is not available on this system),
place it in an InlineScript: 'InlineScript { PAT000287-RelateAppSetSR }'

…and I figured out that there are many reasons for this error.

If you are nesting runbooks like this…

workflow ParentWorkflow 
{
	ChildWorkflow -Param1 "Test" 	#Calling child workflow
}

Several reasons could lead to this problem:

  • The child workflow does not exist in SMA.
  • The child workflow has not been published.
  • The name of the child workflow does not match the name used in the invoke command.
  • There is a space in front of the child workflow call. Just edit the line where you call the child runbook and check in the parent runbook again.
  • Before you check in the parent workflow, you must check in the child workflow. This applies only when the child runbook did not exist before. If the child runbook was created at any time before the parent workflow, this error will not happen (see the sketch after this list).
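
To make the dependency explicit, here is a minimal sketch of the pattern that works: the child workflow exists in SMA, is published before the parent is checked in, and its name matches the call in the parent exactly (names are made up):

# Child runbook - create and publish this one first
workflow ChildWorkflow
{
    param(
        [string]$Param1
    )
    Write-Verbose -Message "Child received: $Param1"
}

# Parent runbook - the call must match the child workflow name exactly, with no leading space
workflow ParentWorkflow
{
    ChildWorkflow -Param1 "Test"
}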

I hope this will help you eliminate this quite common error during development.

SMA – Database Grooming: Some Things You Should Know


SMA is Microsoft’s on-premises automation engine and the successor of Opalis / Orchestrator. We have used this engine quite a lot and have plenty of experience developing PowerShell workflows for SMA. But like every system, you need to maintain and pamper it, otherwise it will strike back at some point. We recently experienced such an issue, which could also happen in your environment.

When a runbook is executed it generates a job; see more details here. A job can have different statuses: failed, stopped, suspended or running. So, if you decide you want to debug a runbook because it fails all the time, you can turn on different log levels, also known as runbook streams. There is an excellent post on the System Center: Orchestrator Engineering Blog explaining how to turn on one of the six different streams: Output, Progress, Warning, Error, Verbose and Debug. Depending on the type, you receive different levels of information.
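
If you prefer PowerShell over the portal, the SMA cmdlets can toggle these streams as well. A small sketch (the endpoint and runbook name are placeholders; verify the exact parameter names against your SMA module version):

# Requires the SMA PowerShell module (Microsoft.SystemCenter.ServiceManagementAutomation)
Import-Module Microsoft.SystemCenter.ServiceManagementAutomation

# Turn the verbose stream on for a single runbook while debugging (placeholder names)
Set-SmaRunbookConfiguration -WebServiceEndpoint 'https://sma-server' -Name 'MyRunbook' -LogVerbose $true

# Turn it off again once you are done, so the stream data does not pile up in the database
Set-SmaRunbookConfiguration -WebServiceEndpoint 'https://sma-server' -Name 'MyRunbook' -LogVerbose $false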

As soon as you turn on, for example, the verbose stream, you will see it in the job output like this…

[Screenshot: verbose stream entries in the SMA job output]

A best practice is to keep these streams turned off and only enable them if you really need them. But why is that? Well, this output has to be stored “somewhere”, otherwise it would not be “persistent”. In SMA this output is stored in the Stream.JobStreams table. If you run a select query against this table, you will see something like this…

[Screenshot: rows returned from the Stream.JobStreams table]
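
If you want to reproduce this yourself, a minimal query sketch could look like the following (the instance and database names are placeholders; adjust them to your environment):

# Requires the SqlServer / SQLPS module for Invoke-Sqlcmd; instance and database names are placeholders
Invoke-Sqlcmd -ServerInstance 'SQLSERVER01' -Database 'SMA' -Query 'SELECT TOP (100) * FROM Stream.JobStreams'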

If you take a closer look at the StreamTypeName column, you can figure out the stream type, such as Verbose, Output, Progress, etc. If you see Output, it does not necessarily mean the data comes from Write-Output; it can also be data returned by a runbook to be passed as input to the next runbook. As a side note, you should never use Write-Output for logging in your runbooks; use Write-Verbose instead. Write-Output is only meant for returning objects that are consumed by other runbooks.
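
To make that side note concrete, here is a minimal sketch (the runbook and its data are made up):

workflow Get-ServerInfo
{
    # Goes to the verbose stream - only persisted when verbose logging is turned on
    Write-Verbose -Message "Collecting server information..."

    # Goes to the output stream - reserve this for objects another runbook will consume
    $result = @{ ComputerName = 'SERVER01'; Status = 'OK' }
    $result
}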


SCOM – How Data is Encrypted


Recently I got a question from a customer about how SCOM traffic is encrypted. Well, I knew that the traffic IS encrypted, but how the encryption actually works is a different story.

First we need to know what traffic we are talking about. Is it the communication between agents, or rather healthservices? Is it the encryption of RunAs accounts / credentials within the communication channel? Or are we talking about the encryption of RunAs accounts within the SCOM database? On TechNet you will find an article about the communication and encryption (https://technet.microsoft.com/en-us/library/bb735408.aspx), but what is the context when certificates or Kerberos are in place? To get the full picture, we need to answer these questions.

No one could answer these questions better than Microsoft itself, and therefore of course “Mr. SCOM” Kevin Holman. All credit goes to him for providing me with this very interesting information and letting me publish it. Thank you, Kevin!

Let’s first talk about the healthservice to healthservice communication.

1. Healthservice to Healthservice Encryption and Authentication:

Communication among these Operations Manager components begins with mutual authentication. If certificates are present on both ends of the communications channel (and enabled for use in the registry for the healthservice), then certificates will be used for mutual authentication.  Otherwise, the Kerberos version 5 protocol is used. If any two components are separated across an untrusted domain/forest boundary that doesn’t support Kerberos, then mutual authentication must be performed using certificates.
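
As a quick way to check which mode a given healthservice will use, you can look for the certificate serial number that MOMCertImport writes to the registry. A small sketch (the path and value name are to the best of my knowledge; verify them on your own system):

# Check whether a certificate has been registered for the healthservice on this machine
$regPath = 'HKLM:\SOFTWARE\Microsoft\Microsoft Operations Manager\3.0\Machine Settings'
$serial = (Get-ItemProperty -Path $regPath -ErrorAction SilentlyContinue).ChannelCertificateSerialNumber

if ($serial) {
    # A serial number is present, so certificate authentication and encryption are configured
    Write-Output "Certificate configured, serial number bytes: $($serial -join ' ')"
}
else {
    Write-Output 'No certificate configured - Kerberos will be used where available.'
}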

If Kerberos is available, the agent is authenticated via Kerberos, and then still using Kerberos, the data channel is encrypted using Kerberos AES or RC4 cypher.  A by-product of the Kerberos authentication protocol is the exchange of the session key between the client and the server. The session key may subsequently be used by the application to protect the integrity and privacy of communications. The Kerberos system defines two message types, the safe message and the private message to encapsulate data that must be protected, but the application is free to use a method better suited to the particular data that is transmitted.

If certificates are used for mutual authentication, the same certificates are used to encrypt the data in the channel.

=> Agents are initially authenticated via Kerberos, or Certificates.  Then that same protocol is used for encryption of the channel – per:

https://technet.microsoft.com/en-us/library/bb735408.aspx

From the agent to the gateway server, the Kerberos security package is used to encrypt the data, because the gateway server and the agent are in the same domain. The alert is decrypted by the gateway server and re-encrypted using certificates for the management server. After the management server receives the alert, the management server decrypts the message, re-encrypts it using the Kerberos protocol, and sends it to the RMS where the RMS decrypts the alert.


OMS – Intelligence Packs Cheat Sheet


Operations Management Suite (OMS) is probably one of the hottest technologies Microsoft is currently working on. If you want to bet on a horse that will win the crazy technology race now and in the future, OMS is a safe choice. Because of that, I highly recommend starting to use and learn OMS today. There are plenty of sources on the internet to get you started.

OMS uses solutions / Intelligence Packs to add functionality, logic, data and visualization to OMS. As soon as you add a solution to OMS, files are downloaded (though not in all cases) to the server where the Microsoft Monitoring Agent is installed. These files look like SCOM management packs, and their internal structure is also similar. In many cases these management packs contain collection rules which are executed at a certain interval. For OMS you can choose between two different setups: either you use the Microsoft Monitoring Agent (MMA) in an “agent only” scenario, or you use the Microsoft Monitoring Agent in conjunction with SCOM. In both situations you will be able to collect data, but not in both situations will you be able to use all solutions, because some solutions require a SCOM management group. Another interesting finding is that not all data gathering processes use the same “methods”. For example, some solutions just execute PowerShell to gather event and log entries, which are then sent to OMS, while other solutions use a lot of bundled DLL files to deliver sophisticated data collection.

I was interested in getting an overview of which solution uses what “technology” to collect the information from your systems, and what targets these collection rules are using. I did some basic investigation: first I activated a solution in the OMS portal and then checked what management pack got downloaded to SCOM. After figuring out this part, I checked the management pack itself to see what rules and assemblies it contains.
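
If you want to repeat this in your own management group, a simple way to spot the downloaded Intelligence Packs is to filter the installed management packs. A sketch (the naming pattern and property names are based on what I have seen, so treat them as assumptions):

# Run on a SCOM management server; requires the OperationsManager module
Import-Module OperationsManager

# OMS / Advisor Intelligence Packs typically carry "IntelligencePacks" in their internal name
Get-SCOMManagementPack |
    Where-Object { $_.Name -like '*IntelligencePacks*' } |
    Sort-Object LastModified -Descending |
    Select-Object Name, Version, LastModified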

Because every management pack behaves differently, I tried to put the most important information into a cheat sheet. I know the solutions change rapidly and new solutions will come out, but as a first overview / impression it will help in certain meetings or troubleshooting scenarios.

You will find deeper information such as the following:

  • All rules involved in the data collection process
  • The target class the rules are using
  • What resource files (DLL, execpkg) are used?
  • How frequently is the data collected (interval)?
  • Are there RunAs profiles involved?
  • What technologies are supported?
  • Agent requirements, MMA only or MMA + SCOM?
  • The solution title is a hyperlink to the corresponding TechNet article
  • The Intelligence Pack title is a hyperlink to SystemCenterCore.com, which shows all IP / MP details

Let me know if you have any comments, updates or ideas. I will try to frequently update this sheet. You can download the PDF from TechNet here.

PowerShell – PowerShellGet Module “Publish-PSArtifactUtility : Cannot process argument transformation on parameter ‘ElementValue’”

In PowerShell 5.0, Microsoft introduced the PowerShellGet module. This module contains cmdlets for different tasks. For example, it lets you easily install and upload PowerShell modules and scripts from and to an online gallery such as PowerShellGallery.com. It even lets you find scripts, modules and DSC resources in such repositories. This is a fantastic way to share your script goodies and make them available to others, who can use them on-premises or even in Azure Automation for their runbooks or DSC projects.

In every collaboration scenario there must be some rules. Publishing scripts also has some rules to follow; otherwise all scripts will end in chaos and no one will ever find an appropriate script with the latest version, etc. Therefore we need to provide structured data for version control, prerequisites and author information. This can be done using the PowerShellGet module.
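
For modules, this structured data lives in the module manifest. A small sketch of how you might maintain it with the PowerShellGet cmdlets (the path and metadata values are just examples):

# Update the manifest that carries the version, author and prerequisite information
Update-ModuleManifest -Path 'C:\Modules\MyModule\MyModule.psd1' `
    -ModuleVersion '1.0.1' `
    -Author 'Your Name' `
    -Description 'Short description of what the module does' `
    -Tags 'SCSM','Automation'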

Here is just an overview of the cmdlets provided by this module…

[Screenshot: PowerShellGet cmdlets on Windows Server 2016 TP4]

Here comes the first pain point: if you try to run a cmdlet, e.g. from your Windows 10 client, first check the version of the module. In the screenshot above I ran it on an Azure VM with Windows Server 2016 TP4 installed. On my actual Windows 10 client I see this…

[Screenshot: PowerShellGet cmdlets on the Windows 10 client]

As you can see, there is a difference in both version and cmdlet count. If you now think you could just upgrade the PowerShell version on your Windows 10 box to the latest release, well, you need to wait until the end of February 2016. Microsoft has pulled the latest RTM release because of some major issues; you will find the post and details of the status on the PowerShell blog. If you have managed to get the latest release of the PowerShellGet module and have the full set of cmdlets available, you are ready to start.
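
A quick way to check which PowerShellGet version and cmdlet set you actually have on a given box:

# Show the installed PowerShellGet version(s)
Get-Module -Name PowerShellGet -ListAvailable | Select-Object Name, Version

# List the cmdlets and functions the module exposes, to compare against another machine
Get-Command -Module PowerShellGet | Sort-Object Name | Select-Object Name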
