Setting up LCM with DSC PullServer – cmdlets you need to know

PowerShell DSC has a fairly steep learning curve – at the beginning, it is not straightforward to figure out how to read logs, how to trigger a consistency check, or how to get updated configurations from the Pull server.

After building the LCM .meta.mof file, you need to apply it to the machine – so that it enrolls itself with the DSC Pull server and determines which configuration states to pull and apply. In fact, the LCM is the heart of DSC on each node – and it requires some special treatment in order to deliver predictable results. So, to start with the LCM, you need to point the DSC engine to the folder where the LCM's .meta.mof is stored and register it in the system. This works as follows:

Set-DscLocalConfigurationManager -Path PATH_TO_FOLDER

As far as I have noticed, starting with the PowerShell 5.1 update, this command automatically pulls the resources from the Pull server. Previously, it was necessary to trigger the resource update manually, without waiting for the standard DSC consistency check interval of 15–30 minutes:

Update-DscConfiguration -Verbose -Wait

This command also comes in handy when the resources on the server have been updated (a new configuration state was uploaded or a PS module version changed).

To start a DSC consistency check (don't confuse it with Update-DscConfiguration – the latter only updates resources and does not run the consistency check) without waiting for the DSC scheduler to do it for us:

Start-DscConfiguration -UseExisting -Verbose -Wait

After updating resources, registering the LCM and starting the DSC check, we need to check the status. Here comes the trick – first we need to make sure that no consistency check is in progress, otherwise we can't get the status of the LCM. So, we run:

Get-DscLocalConfigurationManager

This command returns a bunch of parameters, of which we are mainly interested in LCMState.

Get-DscLocalConfigurationManager | Select-Object -ExpandProperty LCMState

It can be either “Idle” or “Busy”, or it can report an inconsistent configuration, which leaves the LCM in a blocked state. When it is “Idle”, we are good to go – and can check the actual result of applying a configuration state pulled from the Pull server.

Get-DscConfigurationStatus

The output status is either “Failed” or “Success” – and this answers the question of whether the machine is in the desired state or something went terribly wrong.
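These checks can be combined into a small wait loop – a sketch of my own, not an official pattern, and the 10-second polling interval is an arbitrary choice:

```powershell
# Wait until the LCM is idle, then read the result of the last configuration run
while ((Get-DscLocalConfigurationManager).LCMState -ne 'Idle') {
    Start-Sleep -Seconds 10
}
Get-DscConfigurationStatus | Select-Object Status, StartDate, Type
```

The Type property tells you whether the run was the initial configuration, a scheduled consistency check, or a post-reboot run.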

And the last command – how to get rid of the DSC configuration:

Remove-DscConfigurationDocument -Stage Current,Previous,Pending

DSC stores two configurations for the LCM – the current one (the last applied) and the previous one. When a configuration ends up in the “Pending” state, most likely you have a problem with your LCM or state. After using this command for clean-up, you can go and apply an updated LCM.
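Putting the whole section together, a full clean-up-and-re-register pass might look like this – a sketch built from the commands above; PATH_TO_FOLDER is the same placeholder, and the -Force switch (which suppresses the confirmation prompt) is my addition:

```powershell
# Drop all stored configuration documents, then re-register the node and re-sync it
Remove-DscConfigurationDocument -Stage Current, Previous, Pending -Force
Set-DscLocalConfigurationManager -Path PATH_TO_FOLDER -Verbose
Update-DscConfiguration -Verbose -Wait
Start-DscConfiguration -UseExisting -Verbose -Wait
```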

Making VMWare Integration Plugin 5.5 work in modern browser

When deploying a new OVF template from an exported one to a vCenter 5.5 host, you may find yourself in trouble getting it done.

The desktop client may complain about an invalid configuration for device 7, which is an extremely misleading message.


According to support forums, to import OVF templates you must use the web interface. However, the web interface complains that the VMware Integration Plugin is not installed. It even offers to download it.

But here is the trick – whenever you try to install version 5.5 or 5.6 (since you need ESXi 5.5 compatibility), it simply never works with modern browsers: the plugin is not added to the browser's extension list and is not detected as installed. I assume this is caused by the enhanced security requirements for installed extensions – the Integration Plugin from VMware requires disk and system access and is silently blocked by newer browsers.

The plugin's original requirements specification states compatibility with IE 7 and 8.

The solution is simple – run your modern IE in compatibility mode as IE 8! It works like a charm.

Nevertheless, I would recommend quitting the session after you have finished using the plugin and switching back to IE 11, so as not to expose your system to risk.

The Security Development Lifecycle book is available for download

Microsoft has recently published online the foundational book describing the SDL (Security Development Lifecycle).


The principles behind the SDL were born as a response to the Windows Longhorn project reset in the early 2000s. Back then – according to MS insiders – the entire project was wiped out and restarted from scratch due to the presence of critical vulnerabilities in various components. At the time, Microsoft had a questionable reputation with regard to the security of its products, so the company made a huge investment in security improvement. The SDL was created as a common approach to developing products, from bottom to top – from design to release.

The book was published back in 2006, which can be seen as the Stone Age compared to the threats and attack vectors present nowadays. Nevertheless, it remains a valuable source of knowledge and actions for teams and companies that struggle with improving the security of their products. In my opinion, it is impossible to deliver a secure solution without integrating SDL principles into every chunk of the development process.

The most recent overview of SDL can be found at the dedicated Microsoft page.

The best part of it is the set of tools and instruments designed and used by MS at each step of the SDL – with download links. It serves as a great reference to the spectrum of problems the SDL solves – you don't have to replicate it in your organization exactly the way it works at MS, but it at least helps you understand the challenges and possible solutions.

Install PowerShell 5.0 on Windows 8.1 with a non-en-US locale

Windows Management Framework 5 (aka PowerShell 5) fails to install if your Windows was installed with a locale other than en-US. In my case it was en-GB, so not a big deal, right?

Well, not exactly. After downloading the WMF 5.0 update package, it fails to apply – saying “Update is not applicable to your computer”. Do not expect anything more verbose, nor will you find any useful information in the system logs.

After desperate surfing through MS support tickets and trying different fixes, it turned out that the last suspect was the locale configuration. Ironically, MS support engineers from Seattle couldn't reproduce the problem since they all have en-US Windows installed.

So far, the only working way to install WMF 5 (and thus PS 5.0) when the update fails to apply is to change the locale setting – which is a non-trivial task. It requires running the system DISM utility in offline mode (when the current Windows installation is not loaded). It also requires obtaining the en-US language pack .cab archive. And finally, you may even brick your boot configuration if you don't run it properly. Sounds exciting – let's start then!

  1. Set the default language and interface language to en-US (Control Panel – Language – Advanced Options)
  2. Prepare a bootable Win 8.1 USB installation drive – I used the same image as for the initial installation. Just write it to USB (Win32DiskImager is a great tool for this).
  3. Download the en-US language pack. It can't be found as a separate package from the official resources. What I did was use the MSDN subscription downloads page and grab the installation media of the Windows 8.1 Language Pack – a DVD with a bunch of language packs on it. Then mount the ISO, navigate to “langpacks/en-US” and save the .cab file to a convenient location on your drive, e.g. C:\lp.cab
  4. Boot into troubleshooting mode with a command prompt – from a running Windows session, click Restart while holding the “Shift” key. The system will log out and the troubleshooting options menu will be loaded from the USB. Navigate to Troubleshoot -> Advanced options -> Command Prompt
  5. In the command prompt, run the DISM utility: dism /Image:C:\ /Add-Package /PackagePath:C:\lp.cab.
  6. Do not change the locale here. It is possible, and sometimes described as one of the steps to apply, e.g. using dism ... /Set-SysLocale – but better don't: it made my machine fail to boot until I reverted this change.
  7. Boot normally – the language pack is now installed but not yet applied. Open “Control Panel\All Control Panel Items\Language”, select “English – United States” and click “Options” on the right side. Under “Windows display language” there will be a link to set it as the current locale. In other situations, I've seen this done from the “Control Panel -> Language -> Advanced Settings -> Change Locale” menu.
  8. After signing out and in again, you may verify that the locale has changed, from an elevated command prompt: `dism /Online /Get-Intl`
  9. Now the WMF 5 update can be applied – it might first install some sort of fix and ask for a reboot. Afterward, run the installer again – and you will get your precious PS 5.0


This is it! I hope MS folks will fix this issue soon – so that the update can be applied to a system with any locale.

DSC Pull Server for managing… Linux nodes

Let's face it, PowerShell DSC and Linux are definitely not the most favored duo when speaking about CM. In this article, I will explain how to set up a Linux node with an existing DSC Pull Server and why this is a win-win for a Win-dominated tech stack.

PowerShell DSC is well documented for Windows, and you may find some information about using it in Push mode (when the configuration is applied TO the node FROM the server via a CIM session), but the Linux Pull scenario rarely gets more than a paragraph of attention. So, I'm here to fix that.

The main prerequisite for this exercise is a configured DSC Pull Server – you may want to check out this excellent MSDN tutorial. We run it on Windows Server 2012 R2 with WMF (Windows Management Framework) 5.0. The hostname for the purposes of this exercise is set to DSCPULLSERVER. You also have a Linux node – in this case, CentOS 7, with the hostname DSCPULLAGENT.

The first discovery about DSC on Linux is that the client doesn't run on PowerShell for Linux. It is actually a set of Python and shell scripts that provide an interface to the OMI (Open Management Infrastructure) CIM server and control the execution of pulled configurations. Therefore, it works slightly differently compared to “native” DSC.

To start with, we need to obtain both the OMI and DSC rpm's matching the installed openssl version (check with $ openssl version) – either 0.9.8 (then use rpms with ssl_98 in the name) or >= 1.0.0 (use ssl_100 rpms).

rpm -Uvh https://github.com/Microsoft/PowerShell-DSC-for-Linux/releases/download/v1.1.1-294/dsc-1.1.1-294.ssl_100.x64.rpm
rpm -Uvh omi-1.1.0.ssl_100.x64.rpm

After installing the packages, DSC files can be found under /opt/microsoft/dsc/.

The next step is to prepare a DSC Local Configuration Manager file. It tells the DSC client where to pull the configuration from and which configuration to pull. To generate it (and all other kinds of .mof files), I use a separate Windows machine.

Here is an example PS DSC script that creates a local configuration mof.

[DSCLocalConfigurationManager()]
configuration LinuxPullAgent
{
    Node localhost
    {
        Settings
        {
            RefreshMode       = 'Pull'
            ConfigurationMode = 'ApplyAndAutoCorrect'
        }
        ConfigurationRepositoryWeb main
        {
            ServerURL          = 'https://DSCPULLSERVER:8080/PSDSCPullServer.svc'
            RegistrationKey    = 'xxx-xxx-xxxxxx-xxxxxx'
            ConfigurationNames = @('DSCLINUXSTATE')
        }
    }
}
LinuxPullAgent

Here, DSCLINUXSTATE is the name of the configuration placed in the Configurations folder of the DSC Pull Server, and RegistrationKey is the key you obtain from the Pull server. You can also specify here how often to pull the configuration, and a bunch of other parameters – read more here.

After running this snippet on a machine with PS 4.0 or higher, it produces a local configuration manager file, localhost.mof, that you need to transfer to the Linux node. Basically, it should be the same file for all your Linux nodes using the DSCLINUXSTATE configuration.
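Any file transfer tool will do for getting the .mof onto the node; for example, a plain scp from the authoring machine – the user and paths here are just an illustration:

```shell
# Copy the compiled LCM file to the Linux node's home directory
scp ./localhost.mof root@DSCPULLAGENT:~/localhost.mof
```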

Assume the transferred file is located at ~/localhost.mof. Now, you need to install this Pull server configuration on the node. Navigate to the Scripts directory (under /opt/microsoft/dsc/), where you can find various Python scripts for different tasks. Do not run Register.py – it is intended for use with the Azure DSC service. Likewise, GetDscConfiguration.py and StartDscConfiguration.py are the scripts for running DSC in Push mode.

So here we are interested only in SetDscLocalConfigurationManager.py. This script configures the pull agent based on the .mof file generated for it. The second command in the next listing displays all parameters of the LCM's active configuration and helps troubleshoot issues related to it.

SetDscLocalConfigurationManager.py -configurationmof ~/localhost.mof
GetDscLocalConfigurationManager.py

After installing the configuration, pulls happen periodically, with a minimum interval of 30 minutes. To trigger a pull of the configuration manually, we need to execute:

/opt/microsoft/dsc/bin/ConsistencyInvoker

This command pulls the configuration state from the Pull server that we just configured and checks whether the current state corresponds to it.

And this is it! If everything went well, your Linux box is now managed by DSC. Using the nx, nxNetworking and nxComputerManagement modules from the PowerShell Gallery, you can cover basic configuration scenarios for Linux machines.
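As a taste of what such a state looks like, here is a minimal sketch of a DSCLINUXSTATE configuration using the nxFile resource – the file path and contents are purely illustrative, and it assumes the nx module is installed on the authoring machine:

```powershell
Configuration DSCLINUXSTATE
{
    Import-DscResource -ModuleName nx

    Node localhost
    {
        # Keep /etc/motd at a fixed content; the agent restores it on every consistency check
        nxFile Motd
        {
            DestinationPath = '/etc/motd'
            Contents        = "This node is managed by DSC`n"
            Ensure          = 'Present'
            Type            = 'File'
        }
    }
}
DSCLINUXSTATE  # compile; rename the output .mof to DSCLINUXSTATE.mof and checksum it on the Pull server
```

Remember that for pulling by configuration name, the compiled .mof on the Pull server must be named after the configuration and accompanied by a checksum file (New-DscChecksum generates it).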

You can troubleshoot the issues related to DSC states or server connectivity by checking the error messages in the log file /var/opt/omi/log/dsc.log. OMI server instance log is located under /var/opt/omi/log/omiserver.log.

Pros and cons of DSC on Linux:

  • Pros:
    • PowerShell and DSC ecosystem – very convenient when the company's tech stack mainly consists of Windows machines. Managing Linux with the same set of technologies sounds like a dream for decreasing complexity and adding transparency to the CM solution.
    • Simplicity of the Pull model's configuration – the server only provides the state; the node does all the work.
    • Extensibility of the DSC framework – it allows creating custom scripts and utilizing existing modules, but for Linux!
    • Microsoft's open-source direction – it looks very determined, and after the PowerShell release for Linux, I believe the integration between these two worlds will only improve, adding more functionality and features to both. Therefore, it might be that tomorrow DSC for Linux will be the next big thing.
    • Free of charge, compared to $120+/node for Puppet or Chef.
  • Cons:
    • The nx family of modules is still limited compared to what native CM tools (Puppet, Chef, Ansible, Salt) can do.
    • Poor logging and transparency of a node's state – it is hard to trace the results of applying states, especially from the DSC Pull server; logs need to be examined on the agent. It also requires setting up a DSC reporting server, which is not the most user-friendly interface either.
    • Requires an orchestration tool for machine deployment (in the case of using VMs). DSC cannot provision a new node by itself.

In general, it worked quite well for us – we were aware of DSC's general limitations and didn't find anything unexpected in its Linux-specific implementation. Given our tech stack, it turned out to be a perfect solution.

In the next post, I will cover the examples of using nx modules for configuring CentOS node.

4 ways of developing DevOps competences

I've been in DevOps for about 5 years now, and I recently summed up the main ways of learning in this business.

The main challenges of learning DevOps are related to the extremely wide coverage of technologies and skills that are presumed useful in this occupation.

While the DevOps toolset depends highly on the technology stack, there are common areas of knowledge that (based on my trials and failures) turned out to be crucial for a systemic vision and approach.

First of all, DevOps is not only about development, as we can see from the name – there is also an Ops part. Therefore, any reading related to Operations Management is beneficial. You need to know how the company works from the inside out in order to be efficient in DevOps when improving delivery processes. Some of my favorites are:

They will tell you how to build your company's operations like a factory – and you may argue that you work in a lean/agile/cool startup and have nothing to do with blue collars from industrial areas – and you would be mistaken. It is important to start looking at the work being done by your company as something that you can measure and improve – not ephemerally, but in terms of time, throughput and quality.

The second thing I learned about DevOps is to always keep asking yourself: am I doing the right thing? Am I delivering value and improving the processes? Do this all the time, even outside work – for example, when attending tech conferences and reading literature. Try to make sure you are on track with the latest trends, and here I mean being on track conceptually. Do not try to chase each and every new tool or release – it rarely makes a really big difference compared to using the old ones. Concentrate on following the concepts – whenever someone introduces a new way of doing DevOps, it is time to start digging in. When you introduce new concepts and take the best ideas to work, you change the rules of the game and may get much better results than by simply replacing the tools.

The third point is to constantly and continuously improve your OWN operations. Think about how you handle the work that lands on your desk; think about yourself as a factory and about the output you produce. What do you need to improve? You may start with time management, memory skills and communication.

And the last, fourth way of learning is to always communicate your ideas – in blogs, comments and discussions with colleagues – and see how people react to them. This will help you understand the main pain points and weak spots of your concepts and improve them, with the assistance of others and from their perspective.

Uninstall MS Exchange Server 2016

Today, I read through the support thread of a guy who ended up paying MS customer support to get rid of a nasty Exchange server.

And, I must admit, it is not something that can be removed with ease.

! The following manual will help you annihilate the Exchange server and its data. If you still wish to save some of it, do not use these instructions.

Let's say you need to completely wipe it from the machine – and the uninstaller always fails to remove the so-called default mailboxes.

There are a few types of them which you need to take care of manually:

Get-Mailbox -Archive | Disable-Mailbox
Get-Mailbox -Monitoring | Disable-Mailbox
Get-Mailbox -AuditLog | Disable-Mailbox
Get-Mailbox -PublicFolder | Disable-Mailbox

Now, you need to get rid of the “-Arbitration” mailboxes, but that is not as easy as the previous cases. First, find out the name of your Mailbox Database:

Get-MailboxDatabase

It will show you something like “Mailbox Database 12212842873428”. Use the entire name, not just the number! Now, you can remove all arbitration mailboxes:

Get-Mailbox -Database "NAME FROM THE PREVIOUS COMMAND" -Arbitration | Disable-Mailbox -Arbitration

It will fail on the last mailbox, which is using some default address book (are you still following the actual meaning of these error messages? I'm not sure it is physically possible…)

So, to remove the default Offline Address Book, we need to get its name. At least, that is what support says. Forget support – use the wildcard!

Remove-OfflineAddressBook -Identity "*"

And finally:

Get-Mailbox -Database "NAME" -Arbitration | Disable-Mailbox -Arbitration -DisableLastArbitrationMailbox

Phew, that was it. Now nothing can stop you from wiping Exchange off your server.

Extract ISO without mounting it

I've spent a lot of time looking for the easiest way to unpack an ISO image file under Linux without mounting it (mounting requires root privileges).

The solutions you may find around the Internet describe various tools – one option is to install p7zip and p7zip-plugins (which is OK), or other custom tools which I would rather not depend on in my configuration.

However, there is not much information about bsdtar, an excellent replacement for the common tar tool, which also allows unpacking ISO images with:

$ bsdtar xvf image.iso
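The same tool can also list the image contents without extracting, and extract into a dedicated directory to keep the working directory clean – image.iso here is of course a placeholder:

```shell
$ bsdtar -tf image.iso            # list contents without extracting
$ mkdir -p iso-contents
$ bsdtar -xf image.iso -C iso-contents
```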

Ansible: loops and registering results of commands

Ansible has a great feature of registering the results of a performed step. But recently I stumbled upon the need to do it for actions in a loop and then use the registered result in subsequent steps that are also looped. Most often, this is used to perform an action only if a previous action changed a file.

First, we register the `action_state` variable. Its `results` attribute is a list with one entry per loop item, and each entry contains a `changed` attribute (e.g. `action_state.results[0].changed`). Now, we need to use it in the next looped step. This can be done easily with the `with_together` operator:

- name: copy a file
  copy:
    src: '{{ source }}'
    dest: '{{ target }}'
  with_items: '{{ yourlist }}'
  when: item.state == 'present'
  register: action_state

- name: create .ssh dir if file changed in the previous step
  sudo_user: '{{ item.0.user }}'
  file:
    state: directory
    path: '/home/{{ item.0.user }}/.ssh'
    mode: '0700'
  with_together:
  - '{{ yourlist }}'
  - '{{ action_state.results }}'
  when: item.1.changed and item.0.state == 'present'

Here, for `yourlist`, we copied some files to the target machine (e.g. public keys to a temp location) in a loop, and then for each of the files performed an action only if the file was changed.

Books for getting more power with PowerShell

When you look at available PowerShell books, almost every link on the Internet points you to the infamous:

Jones, Don, and Jeffrey T. Hicks. Learn Windows PowerShell 3 in a Month of Lunches. Manning, 2013.

This book is indeed very good and professionally written. However, it doesn't explain some basics that are very important for understanding PS features. It feels like being dragged into the dark and scary forest of this bloody PowerShell. On a serious note, I would say this book was written by PS gurus who can barely walk in the shoes of ordinary Linux users or SW engineers.

Instead, I would recommend checking out this one:

Santos, Donabel. “PowerShell for SQL Server essentials.” (2015).

It has it all, plus puts it into SQL server context. Delicious!

I very much like the author's style and way of approaching the main concepts – explaining things to you as to a colleague, without trying to impress you with deep knowledge of the subject.

Great job, Mr. Santos!