SCOM / SCSM – Retrieve Decrypted RunAs Account Credentials


I am not sure if you have seen it, but Richard Warren from nccgroup has figured out how to decrypt the RunAs account credentials in SCOM. The problem up to now was that there was no official way to retrieve the encrypted credentials from SCOM; there is just one DLL which offers the decrypt method. He has published an EXE and a PowerShell script on GitHub. I know there are always two sides to the coin, and in this case there is an evil and a good way of using this knowledge. I don't have to talk about the evil way; instead, I would like to talk about its benefit.

Richard Warren used it for SCOM RunAs accounts, but Service Manager (SCSM) is based on the same framework, so I was curious whether this approach also works for SCSM. In fact, it does! Why is this awesome? Well, think about it: we are able to "securely" store credentials in SCSM (or SCOM) using RunAs accounts, and now we are able to retrieve those credentials easily. Because I do a lot of automation in SCSM using service requests and itnetX PowerShell activities, I have always had trouble storing credentials in a safe manner. There are many ways to do so, like exporting the credentials into XML (Export-CliXml), using certificates, encrypting the credentials with a key and storing them somewhere, or storing the credentials in SMA and retrieving them using PowerShell. Whatever method you use, you will end up with more or less problems. The best approach would be to store the credentials on the system where you need them (SCSM), so the SCSM administrator can manage these accounts without having to dig into PowerShell code, certificates, etc. Therefore, RunAs accounts are a perfect way of storing credentials.

Because of that, I took Richard's sample, modified the code a bit so it can be used on both SCOM and SCSM, and made it return proper output. The PowerShell module returns a credential hash table. You need to execute the module on the SCOM or SCSM management server, and the only parameter you need to provide is the RunAs account display name, like in this example.

In SCOM, the RunAs account is identified by its display name, and that display name is exactly what you pass to the PowerShell module.
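A minimal sketch of what a call could look like. The cmdlet name, parameter, and output shape below are assumptions based on the description above, not necessarily the published module's exact interface:

```powershell
# Hypothetical usage - run this on the SCOM/SCSM management server.
Import-Module RunAsAccount   # assumption: module name as published on the Gallery

# Pass the RunAs account's display name exactly as shown in the console.
$result = Get-RunAsCredential -Name 'My RunAs Account'

# The module returns a hash table with the decrypted values, which you
# can turn into a PSCredential for further use in your automation.
$secure = ConvertTo-SecureString -String $result.Password -AsPlainText -Force
$cred   = New-Object System.Management.Automation.PSCredential(
              "$($result.Domain)\$($result.UserName)", $secure)
```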

You can download the module from the PowerShell Gallery. Be aware that you need permission to access the database and the management server.


WAP – Get Windows Azure Pack Websites via PowerShell


Windows Azure Pack (WAP) was Microsoft's first attempt to bring Azure into your on-premises datacenter. The things you can do with it are limited to IaaS VMs, PaaS databases, and PaaS websites. In addition, there is Service Bus and some networking functionality which is necessary for the IaaS/PaaS services. Of course, there are other required parts, like Service Provider Foundation (SPF), SC Virtual Machine Manager, etc. Because my job is to automate things using PowerShell, I sometimes need to get data out of systems, in this case using WAP as my data source. If you look a bit closer at WAP and want to get information about configured SQL Server or MySQL databases, there is a rich set of PowerShell cmdlets available, and these modules are installed on the WAP admin servers…


…so what you could do is use PowerShell remoting and query these servers for information. If you want to get information about provisioned VMs, you can simply query VMM using its own cmdlets, as shown below.
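A minimal sketch of the VMM route, run on a machine with the VMM console installed; 'vmm01' is a placeholder for your VMM management server:

```powershell
# Connect to the VMM server and list the provisioned VMs.
Import-Module VirtualMachineManager
Get-SCVMMServer -ComputerName 'vmm01' | Out-Null   # establish the connection
Get-SCVirtualMachine | Select-Object Name, Cloud, Status, Owner
```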

Another way to get information out of WAP is to use the Public Tenant API. This API provides tenant-specific information, so you need to provide a subscription to get details about that specific tenant. MVP Ben Gelens has written a fantastic PowerShell module to get all sorts of information from the WAP Public Tenant and WAP Admin APIs; you can find the module at https://github.com/bgelens/WAPTenantPublicAPI. I have tested it and it works like a charm.
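To give you a feel for it, here is a rough sketch of using Ben's module. The cmdlet names and parameters are from memory and may differ from the current module version, so treat them as approximate and check the module's README; the URLs are placeholders for your own WAP endpoints:

```powershell
# Install the module from the PowerShell Gallery.
Install-Module -Name WAPTenantPublicAPI

$cred = Get-Credential   # a tenant user account

# Get a token from the tenant authentication site, connect to the
# public tenant API, and list the subscriptions for that user.
Get-WAPToken -Url 'https://sts.wap.domain.local' -Credential $cred
Connect-WAPAPI -Url 'https://api.wap.domain.local' -Port 30006
Get-WAPSubscription
```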

So what is the point of this post? Well, so far we have seen that we can get information about SQL Server and MySQL databases using the PowerShell cmdlets of the Admin API, and for VMs we can use VMM as a data source, but what about websites? There are also modules installed on the web controller servers themselves, e.g. the WebSites module and the WebSiteDev module; to get information about websites directly from the system, just use the cmdlets those modules provide.

A more elegant way to pull website information is to go through the REST API (Web Site Cloud REST Endpoint) which you need to provide when adding the websites resource to the admin portal. It depends on how you configured it, but you can find the configured settings on the web controller server by running the Windows Azure Pack Websites MMC, which shows all the different settings…
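As a hedged sketch of what querying that endpoint could look like: the host name, port 446, and the '/subscriptions' path below are assumptions based on a default WAP Websites deployment, so verify them against the values shown in the Websites MMC:

```powershell
# Credentials for the Web Site Cloud REST endpoint, as configured
# when the resource was added to the admin portal.
$cred = Get-Credential

# Query the endpoint directly; adjust host, port and path to your setup.
Invoke-RestMethod -Uri 'https://webcontroller01.domain.local:446/subscriptions' `
                  -Credential $cred -Method Get
```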


SMA Authoring Toolkit – Some Runbooks Are Not Showing Up


When you create runbooks in SMA (Service Management Automation) and use the SMA Authoring Toolkit available on the PowerShell Gallery, you might also have faced a very annoying bug: if you have a certain amount of runbooks in SMA and you browse through the runbook list in ISE, you simply cannot find certain runbooks. Trying to refresh the list does not help at all.


If you open SMA to browse the runbook list, you can see that they are all published and in a "healthy" state, so there is no reason for them not to show up. You can cross-check this from PowerShell as well, as sketched below.
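A quick way to see what SMA itself reports, independent of the ISE add-on's cached list; 'https://sma01' is a placeholder for your SMA web service endpoint:

```powershell
# List all runbooks straight from the SMA web service (default port 9090).
Get-SmaRunbook -WebServiceEndpoint 'https://sma01' -Port 9090 |
    Sort-Object RunbookName |
    Select-Object RunbookName, LastModifiedTime
```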


Quick Post – Linux + PowerShell + DSC Blog Posts @ Hey, Scripting Guy! Blog


I would like to make you aware of a three-part blog post series which I have written for THE Microsoft Hey, Scripting Guy! Blog. Because I really like this blog post series, and of course the blog itself a lot, I want to share it with you.

The first part shows you how to use Bash on Windows 10 and how you can connect to a Linux server to install the OMI CIM server and the DSC for Linux packages. The second part installs .NET Core and PowerShell for Linux on the system using DSC for Linux; in addition, I show you how to connect from your Windows 10 machine to the OMI CIM server via PowerShell and the WSMan protocol. The last post applies a DSC configuration from Azure Automation DSC to Linux and executes a PowerShell script to send user data to the Azure Log Analytics HTTP Data Collector API.
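As a small teaser for the connection step in part two, here is a minimal sketch of talking to the OMI CIM server from Windows via WSMan; 'linux01' and the root credential are placeholders:

```powershell
# Connect from Windows 10 to the OMI CIM server on the Linux box over WSMan.
$cred = Get-Credential -UserName 'root' -Message 'Linux account'
$opt  = New-CimSessionOption -UseSsl -SkipCACheck -SkipCNCheck -SkipRevocationCheck
$cim  = New-CimSession -ComputerName 'linux01' -Credential $cred `
                       -Authentication Basic -Port 5986 -SessionOption $opt

# OMI_Identify is a standard OMI class; querying it confirms the session works.
Get-CimInstance -CimSession $cim -Namespace root/omi -ClassName OMI_Identify
```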

You can find the posts here:

Part 1 – Install Bash on Windows 10, OMI CIM Server, and DSC for Linux

Part 2 – Install .NET Core and PowerShell on Linux Using DSC

Part 3 – Use Azure Automation DSC to Configure Linux and Execute a PowerShell Script


I hope you like it as much as I do. Have fun!

OMS – HTTP Data Collector API 403 (Forbidden)

A few weeks ago Microsoft released the Azure Log Analytics HTTP Data Collector API, which allows you to shoot JSON data into OMS Log Analytics. This is awesome news, because now anything is possible: you are able to use (m)any scripting language to send any data to OMS for further analytics, and you can use all the nice OMS goodies like alerting, the view designer for building awesome dashboards, the query language for deep dives into your data, etc. I had been playing with this API on my Linux box to see what it is capable of, using a PowerShell test script which I knew had worked before. All of a sudden I received this error…


I was puzzled, because I was sure this script and my workspace were working fine; actually, I had modified the script from the blog post Azure Log Analytics HTTP Data Collector API. The error code says that the workspace ID or connection key needs to be valid.


After a minute I got an idea and compared the time on my Linux box with the one on my client: there was a deviation of 40 minutes. I corrected the time on my Linux machine, and all of a sudden the data submission worked fine. I wondered what the maximum allowed deviation would be, so I went back in time in 5-minute steps; after I reached a 15-minute time difference, I received the same error. If I put the time back just 14 minutes, the script worked fine.

Conclusion: if you are playing with the Azure Log Analytics HTTP Data Collector API, make sure your clock is set correctly; otherwise you will receive a 403 error.
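The time sensitivity makes sense once you look at how the request is signed: the x-ms-date header is part of the HMAC-SHA256 signature, and the service rejects requests whose timestamp drifts too far from its own clock. A minimal sketch of the documented signing scheme; the workspace ID and key are placeholders for your own values:

```powershell
$workspaceId = '<workspace id>'
$sharedKey   = '<primary key>'
$body        = '[{"Computer":"linuxbox","State":"ok"}]'

# RFC1123 date in UTC - this is where a wrong clock causes the 403.
$rfc1123date = [DateTime]::UtcNow.ToString('r')
$bodyBytes   = [Text.Encoding]::UTF8.GetBytes($body)

# String to sign, per the Data Collector API documentation.
$stringToSign = "POST`n$($bodyBytes.Length)`napplication/json`nx-ms-date:$rfc1123date`n/api/logs"

$hmac      = New-Object System.Security.Cryptography.HMACSHA256
$hmac.Key  = [Convert]::FromBase64String($sharedKey)
$signature = [Convert]::ToBase64String(
                 $hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes($stringToSign)))

$headers = @{
    'Authorization' = "SharedKey ${workspaceId}:$signature"
    'Log-Type'      = 'LinuxTest'      # becomes the custom log type in OMS
    'x-ms-date'     = $rfc1123date
}
Invoke-RestMethod -Method Post -ContentType 'application/json' -Headers $headers -Body $body `
    -Uri "https://$workspaceId.ods.opinsights.azure.com/api/logs?api-version=2016-04-01"
```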

SCOM 2016 – What’s New UNIX/Linux Series: Agent Task Running Script e.g. Perl

I assume you are familiar with creating SCOM tasks, and you know there are tasks that are executed on the SCOM console side (console tasks) and tasks that are executed on the agent (agent tasks). In the past you had only a few options, like running commands on Windows and UNIX/Linux, or scripts on Windows agents only. The task options looked like this in SCOM 2012 R2…


In SCOM 2016 you are now able to run scripts on UNIX/Linux agents using any kind of scripting language that is installed on the target machine.


To prove that it works, I created a simple task called Perl Ping. Once the task is in place, you can also trigger it from PowerShell, as sketched below.
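A minimal sketch of running the task from the SCOM shell; the task and computer names here are illustrative placeholders for this example:

```powershell
# Trigger the new agent task against a Linux computer from PowerShell.
Import-Module OperationsManager

$task     = Get-SCOMTask -Name '*PerlPing*'   # assumption: internal task name matches
$instance = Get-SCOMClassInstance -DisplayName 'linux01.domain.local'

Start-SCOMTask -Task $task -Instance $instance
```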


SCOM 2012 – Meets MS Flow and Service Bus or How to Translate Alerts


Everything is going international and everything is interconnected. Microsoft provides many technologies to build bridges between different technologies and systems. I like the idea of building connectors to have system A talking to system B within a matter of seconds. Microsoft Flow is such a technology, which interconnects systems with each other. Although the idea behind Microsoft Flow is not new (there are other providers like IFTTT or Zapier which have been on the market much longer), the differences lie in connecting to endpoints, transforming data, and sending it to a target. Depending on your needs you will use one or the other, or you might want to interconnect one task automation engine with another. Lifehacker.com gives a short comparison:

  • IFTTT: IFTTT is super easy to use. As the name suggests, you set up a trigger: that’s the “if.” Then you pick a reaction, that’s the “that.” IFTTT supports 320 popular services, including Dropbox, Drive, WordPress, Twitter, and plenty of others. IFTTT calls these “recipes,” and you can browse recipes made by other people, which makes it easy to come up with ideas for how you can use the service on your own. On top of the web site, IFTTT also has Android and iOS apps so you can take the experience on the go. IFTTT is free. On Android, IFTTT and Tasker work very similarly.
  • Zapier: Zapier works just like IFTTT, but instead of “recipes” the service calls your actions “zaps.” Zapier focuses more on business app integration, so it supports niche corporate apps, like Recurly, HelloSign, and MySQL. Zapier is also more customizable. Where IFTTT limits itself to two steps (this happens, then that happens), Zapier supports multi-step zaps (this happens, then that, that, and that). That said, Zapier doesn’t have mobile apps. It’s also not free. While Zapier has a free plan, it limits you to five zaps at once, locks off access to certain apps, and can only make two-step zaps (just like IFTTT). For $20/month, you unlock Zapier’s real power, including access to all 500+ app integrations and multi-step zaps.
  • Microsoft Flow: Flow is the newest automation tool on the block and it’s the most limited. As you’d expect, Flow’s strength is its integration with Microsoft apps and services. Flow works like IFTTT, with two-step automation recipes called “templates.” Also like IFTTT, you can browse other people’s templates or share your own. Currently, Flow is a “preview” build on the web, which aside from being a bit limited in scope, also limits it to work or school email accounts. Chances are, that doesn’t include you, unless your company is deep in the Microsoft ecosystem or you’re a student. But hey, at least there’s also an iPhone version. For now, Flow is free as long as it’s in preview.

In our example I will use Microsoft Azure Service Bus, PowerShell, Microsoft Flow, and Office 365 to help my old friend SCOM translate alerts into many languages. Sounds cool? Yes, it really is!

First we need to set up a Microsoft Azure Service Bus queue. So what is Service Bus (source)?

Service Bus is a multi-tenant cloud service, which means that the service is shared by multiple users. Each user, such as an application developer, creates a namespace, then defines the communication mechanisms she needs within that namespace.


Within a namespace, you can use one or more instances of four different communication mechanisms, each of which connects applications in a different way. The choices are:

  • Queues, which allow one-directional communication. Each queue acts as an intermediary (sometimes called a broker) that stores sent messages until they are received. Each message is received by a single recipient.
  • Topics, which provide one-directional communication using subscriptions: a single topic can have multiple subscriptions. Like a queue, a topic acts as a broker, but each subscription can optionally use a filter to receive only messages that match specific criteria.
  • Relays, which provide bi-directional communication. Unlike queues and topics, a relay doesn't store in-flight messages; it's not a broker. Instead, it just passes them on to the destination application.
  • Event Hubs, which provide event and telemetry ingress to the cloud at massive scale, with low latency and high reliability.

As you can see, a queue is the simplest part of Service Bus to use. It reminds me a bit of a printer queue: instead of sending documents, you are able to send messages. OK, so let's set up a queue…
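Before the setup walkthrough, a taste of where this is heading: once the queue exists, a PowerShell script (for example in a SCOM alert subscription) can push the alert into it over plain REST. A minimal sketch, assuming a hypothetical namespace 'scomalerts', queue 'alerts', and SAS policy 'Send'; replace these with your own values:

```powershell
$namespace = 'scomalerts'
$queue     = 'alerts'
$keyName   = 'Send'
$key       = '<shared access key>'

# Build a SAS token for the queue URI (valid for one hour).
$uri    = [uri]::EscapeDataString("https://$namespace.servicebus.windows.net/$queue")
$expiry = [int]((Get-Date).ToUniversalTime() - [DateTime]'1970-01-01').TotalSeconds + 3600

$hmac     = New-Object System.Security.Cryptography.HMACSHA256
$hmac.Key = [Text.Encoding]::UTF8.GetBytes($key)
$sig      = [Convert]::ToBase64String(
                $hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes("$uri`n$expiry")))
$token    = "SharedAccessSignature sr=$uri&sig=$([uri]::EscapeDataString($sig))&se=$expiry&skn=$keyName"

# Push an alert as a JSON message into the queue.
$alert = @{ Name = 'Logical disk full'; Severity = 'Error' } | ConvertTo-Json
Invoke-RestMethod -Method Post -Uri "https://$namespace.servicebus.windows.net/$queue/messages" `
    -Headers @{ Authorization = $token } -Body $alert -ContentType 'application/json'
```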
