SCOM – Editing SPN Quickly


If you are dealing with SCOM, you know that there is a lot to install and configure before it runs smoothly. One step during the installation process is to configure the SPNs (Service Principal Names) in Active Directory. In fact, you need to set SPNs per SCOM management server, and if you are hosting the web console on a dedicated server, you also need to set an SPN (and Kerberos constrained delegation) correctly so that authentication works properly. But how should the SPNs look? Well, Kevin Holman (Microsoft) published an awesome post a while ago on how they should look. It will probably answer most of your questions. If not, just drop me a comment 🙂 .

There are different ways to work with SPN settings. The first-born tool is SetSPN.exe, which has been around for a while and can be considered “classic”. A more modern way of doing SPN registration is, of course, PowerShell. In terms of SCOM, if you are using a domain account for the System Center Data Access Service, you can use the Set-ADUser cmdlet to register the SPNs. It would look like this…

Get-ADUser -Filter 'Name -eq "[DAS Account]"' | Set-ADUser -ServicePrincipalNames @{Add="MSOMSdkSvc/[MgmtServerFQDN]"}
Get-ADUser -Filter 'Name -eq "[DAS Account]"' | Set-ADUser -ServicePrincipalNames @{Add="MSOMSdkSvc/[MgmtServerNetBIOS]"}
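For comparison, the same registration with the classic SetSPN.exe, plus a quick check of what actually ended up on the account, would look roughly like this (the account and server names are placeholders, as above):

setspn -S MSOMSdkSvc/[MgmtServerFQDN] [Domain]\[DAS Account]
setspn -S MSOMSdkSvc/[MgmtServerNetBIOS] [Domain]\[DAS Account]
setspn -L [Domain]\[DAS Account]   # list all SPNs registered on the account

The -S switch verifies that no duplicate SPN exists before adding it, which is usually what you want, because duplicate SPNs break Kerberos authentication.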

But if you are more into GUIs, or you need to troubleshoot quickly, it might be faster to use the Active Directory Users and Computers console. Turn on Advanced Features in the View menu, then open the Attribute Editor tab on your System Center Data Access account and select the servicePrincipalName property.


There you have a quick and nice overview of what has been configured on your service account. In addition, you are able to add new SPNs and remove obsolete ones. Hope this helps 🙂.

SCOM / SCSM – Retrieve Decrypted RunAs Account Credentials


I am not sure if you have seen it, but Richard Warren from nccgroup has figured out how to decrypt the RunAs account credentials in SCOM. The problem up to now was that there was no official way to retrieve the encrypted credentials from SCOM; there is just one DLL to use, which offers the decrypt method. He has published an EXE and a PowerShell script on GitHub. I know there are always two sides to this coin: in this case, an evil and a good way of using this knowledge. I don’t think I have to talk about the evil way; instead, I would like to talk about its benefit.

Richard Warren used it for SCOM RunAs accounts, but Service Manager (SCSM) is based on the same framework, so I was curious whether this approach also works for SCSM. In fact, it does! Why is this awesome? Well, think about it: we are able to “securely” store credentials in SCSM (or SCOM) using RunAs accounts, and now we are able to retrieve those credentials easily. Because I do a lot of automation in SCSM using service requests and itnetX PowerShell activities, I have always had some trouble storing credentials in a safe manner. There are many ways to do so, like exporting the credentials to XML (Export-CliXml), using certificates, encrypting the credentials with a key and storing it somewhere, or storing the credentials in SMA and retrieving them using PowerShell. Whatever method you use, you will end up with problems of one sort or another. The best approach is to store the credentials on the system where you need them (SCSM), so the SCSM administrator can manage these accounts without digging into PowerShell code, certificates, etc. Therefore, RunAs accounts are a perfect way of storing credentials.
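To illustrate one of those problems: the popular Export-CliXml approach protects the password with DPAPI, which means only the same user on the same machine that exported the file can read it back.

# DPAPI-protected: bound to the exporting user and machine.
Get-Credential | Export-CliXml -Path C:\Secure\cred.xml
$cred = Import-CliXml -Path C:\Secure\cred.xml

This is fine for a single service account on a single server, but it does not scale to a scenario where an SCSM administrator should manage the credentials centrally.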

Because of that, I took Richard’s sample, modified the code a bit to make it usable on both SCOM and SCSM, and made it return proper output. The PowerShell module returns a credential hash table. You need to execute the module on the SCOM or SCSM management server, and the only parameter you need to provide is the RunAs account display name, as in this example.

In SCOM, the RunAs account display name is what you see in the console under Administration; that is exactly the value you pass to the PowerShell module.
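A minimal usage sketch; note that the module and function names below are placeholders and not necessarily the real ones, so check the downloaded module for the actual names:

# Sketch only: module/function names are assumptions.
# Run this on the SCOM or SCSM management server.
Import-Module .\RunAsCredentialDecrypter.psm1
$result = Get-DecryptedRunAsCredential -DisplayName 'Contoso SQL RunAs Account'
$result   # hash table with user name, domain and decrypted password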

You can download the module from the PowerShell Gallery. Be aware of the fact that you need permission to access the database and the management server.


WAP – Get Windows Azure Pack Websites via PowerShell


Windows Azure Pack was Microsoft’s first attempt to bring Azure into your on-premises datacenter. The things you can do with it are limited to IaaS VMs, PaaS databases, and PaaS websites. In addition, there is Service Bus and some networking parts that are necessary for the IaaS/PaaS services. Of course, there are other required components, like Service Provider Foundation (SPF), SC Virtual Machine Manager, etc. Because my job is to automate things using PowerShell, I sometimes need to get data out of systems, in this case with WAP as my data source. If you look a bit closer at WAP and want to get information about configured SQL Server or MySQL databases, there is a rich set of PowerShell cmdlets available, and the corresponding modules are installed on the WAP admin servers.


So what you can do is use PowerShell remoting and query these servers for the information. If you want information about provisioned VMs, you can simply query VMM using its own cmdlets.
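A quick sketch of that remoting approach, assuming remoting is enabled on the WAP admin server and your account has the required rights (the server name is a placeholder):

# Discover the WAP cmdlets (MgmtSvc* modules) available on the admin server.
Invoke-Command -ComputerName wapadmin01.contoso.local -ScriptBlock {
    Get-Command -Module MgmtSvc* | Select-Object ModuleName, Name
}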

Another way you can get information out of WAP is to use the Public Tenant API. This API provides tenant-specific information, so you need to provide a subscription to get detailed information about that specific tenant. MVP Ben Gelens has written a fantastic PowerShell module to get all sorts of information from the WAP Public Tenant and WAP Admin APIs; you can find the module here: https://github.com/bgelens/WAPTenantPublicAPI . I have tested it and it works like a charm.

So what is the point of this post? Well, so far we have seen that we can get information about SQL Server and MySQL databases using the Admin API cmdlets, and that VMM serves as a data source for VMs, but what about websites? There are also modules installed on the web controller servers themselves, e.g. the WebSites module and the WebSiteDev module; to get information about websites from the system, just use the cmdlets from these modules (see the sketch below).
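If you are not sure which cmdlets those modules expose, you can enumerate them on the web controller first (module names as mentioned above):

# Run on the web controller server.
Get-Command -Module WebSites, WebSiteDev | Sort-Object ModuleName, Name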

A more elegant way to pull website information is to go through the REST endpoint (Web Site Cloud REST Endpoint) that you have to provide when adding the websites resource to the admin portal. The details depend on how you configured it, but you can find the configured settings on the web controller server: run the Windows Azure Pack Websites MMC there and you will find all the different settings…
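As a rough sketch, querying that endpoint from PowerShell could look like the following; the URL and basic authentication are assumptions based on a typical setup, so take the real values from the Websites MMC:

# Endpoint URL and auth scheme are assumptions - verify them in the MMC.
$cred = Get-Credential   # the credentials configured for the REST endpoint
Invoke-RestMethod -Uri 'https://webcontroller.contoso.local' -Credential $cred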


SMA Authoring Toolkit – Some Runbooks Are Not Showing Up


When you create runbooks in SMA (Service Management Automation) and use the SMA Authoring Toolkit available in the PowerShell Gallery, you might have faced a very annoying bug: if you have a certain number of runbooks in SMA and browse through the runbook list in ISE, you simply cannot find certain runbooks. Trying to refresh the list does not help at all.


If you open SMA and browse the runbook list, you can see them all published and in a “healthy” state, so there is no reason for them not to show up.
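To double-check that the runbooks really exist and are published, you can also query the SMA web service directly and bypass the ISE add-on; a quick sketch, assuming the default web service port 9090 and a placeholder endpoint name:

# Endpoint name is a placeholder; 9090 is the default SMA web service port.
Get-SmaRunbook -WebServiceEndpoint 'https://sma01.contoso.local' -Port 9090 |
    Sort-Object RunbookName | Select-Object RunbookName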


Quick Post – Linux + PowerShell + DSC Blog Posts @ Hey, Scripting Guy! Blog


I would like to make you aware of a three-part blog post series which I have written for THE Microsoft Hey, Scripting Guy! Blog. Because I really like this blog post series, and of course the blog itself, I want to share it with you.

The first part shows you how to use Bash on Windows 10 and how to connect to a Linux server to install the OMI CIM server and the DSC for Linux packages. The second part installs .NET Core and PowerShell for Linux on the system using DSC for Linux. In addition, I show you how to connect from your Windows 10 machine to the OMI CIM server via PowerShell and the WSMan protocol. The last post applies a DSC configuration from Azure Automation DSC to Linux and executes a PowerShell script that sends user data to the Azure Log Analytics HTTP Data Collector API.
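As a taste of the second part, connecting from Windows to the OMI CIM server over WSMan looks roughly like this (the host name is a placeholder, and the certificate skip options are only acceptable in a lab with self-signed certificates):

# Basic authentication over HTTPS against OMI on the default port 5986.
$opt = New-CimSessionOption -UseSsl -SkipCACheck -SkipCNCheck -SkipRevocationCheck
$session = New-CimSession -ComputerName lx01.contoso.local -Port 5986 `
    -Authentication Basic -Credential (Get-Credential) -SessionOption $opt
Get-CimInstance -CimSession $session -Namespace root/omi -ClassName OMI_Identify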

You can find the posts here:

Part 1 – Install Bash on Windows 10, OMI CIM Server, and DSC for Linux

Part 2 – Install .NET Core and PowerShell on Linux Using DSC

Part 3 – Use Azure Automation DSC to Configure Linux and Executing PowerShell Script


I hope you like it as much as I do, have fun!

OMS – HTTP Data Collector API 403 (Forbidden)

A few weeks ago Microsoft released the Azure Log Analytics HTTP Data Collector API, which allows you to push JSON data into OMS Log Analytics. This is awesome news, because now almost anything is possible: you are able to use (m)any scripting language to send any data to OMS for further analysis, and you can use all the nice OMS goodies like alerting, the view designer for building awesome dashboards, the query language for some deep dives into your data, etc. I had been playing with this API on my Linux box to see what it is capable of, using a PowerShell test script on Linux which I knew had worked before. All of a sudden I received a 403 (Forbidden) error…


I was puzzled, because I was sure this script and my workspace were working fine. Actually, I had modified the script from this blog post: Azure Log Analytics HTTP Data Collector API. If I check the error code, it says that the workspace ID or connection key needs to be valid.


After a minute I got an idea and compared the time on my Linux box with the one on my client: there was a deviation of 40 minutes. I corrected the time on my Linux machine, and all of a sudden the data submission worked fine. Then I wondered what the maximum allowed deviation would be. I went back in time in 5-minute steps, and after I reached a 15-minute time difference I received the same error. If I set the time back just 14 minutes, the script worked fine.
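This behavior makes sense once you look at how a request to the API is signed: the x-ms-date header is part of the signed string, so a clock that deviates too far from Azure’s invalidates the authorization. Here is a minimal sketch of the documented SharedKey scheme ($workspaceId and $sharedKey are placeholders for your workspace ID and primary key):

# The signature covers the x-ms-date header, hence the sensitivity to clock skew.
$body     = '[{"Computer":"lx01","Status":"OK"}]'
$date     = [DateTime]::UtcNow.ToString('r')   # RFC 1123; must be within ~15 min of Azure
$length   = [Text.Encoding]::UTF8.GetByteCount($body)
$toSign   = "POST`n$length`napplication/json`nx-ms-date:$date`n/api/logs"
$hmac     = New-Object System.Security.Cryptography.HMACSHA256
$hmac.Key = [Convert]::FromBase64String($sharedKey)
$sig      = [Convert]::ToBase64String($hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes($toSign)))
$headers  = @{ Authorization = "SharedKey ${workspaceId}:$sig"; 'Log-Type' = 'MyTestLog'; 'x-ms-date' = $date }
Invoke-RestMethod -Method Post -Body $body -ContentType 'application/json' -Headers $headers `
    -Uri "https://$workspaceId.ods.opinsights.azure.com/api/logs?api-version=2016-04-01"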

Conclusion: if you are playing with the Azure Log Analytics HTTP Data Collector API, make sure your clock is set correctly; otherwise you will receive a 403 error.

SCOM 2016 – What’s New UNIX/Linux Series: Agent Task Running Script e.g. Perl

I assume you are familiar with creating SCOM tasks and know that there are tasks executed on the SCOM console side (console tasks) and ones executed on the agent (agent tasks). In the past you had only a few options, like running commands on Windows and UNIX/Linux, or scripts on Windows agents only; that is all the SCOM 2012 R2 task wizard offered. In SCOM 2016 you are now able to run scripts on UNIX/Linux agents using any kind of scripting language that is installed on the target machine.


To prove that it works, I created a simple task called Perl Ping…
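Once such a task exists, you can also trigger it from PowerShell instead of the console; a sketch, assuming the task display name 'Perl Ping' and a placeholder Linux computer:

# Run where the OperationsManager module is available (e.g. a management server).
Import-Module OperationsManager
$task     = Get-SCOMTask -DisplayName 'Perl Ping'
$instance = Get-SCOMClassInstance -DisplayName 'lx01.contoso.local'
Start-SCOMTask -Task $task -Instance $instance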
