Written by Edmund Dipple, DevOps Engineer, DevOpsGroup (aka @elmundio87)
Over the last couple of years, we have been managing the environment configuration of a Windows-based production environment with PowerShell DSC. It’s been working rather nicely so far – any configuration drift is fixed within 15 minutes, and Windows-literate engineers quickly get used to the PowerShell-based configuration syntax.
We have a great Continuous Delivery pipeline set up to promote any new configuration, so any new DSC can be rolled out from Dev to Production with relative ease.
All DSC configuration and dependencies are stored in one Git repository – TeamCity packages the repository using NuGet, then Octopus Deploy unpacks and runs the DSC configuration on each environment in the pipeline.
Trouble in paradise
As with any software project, unchecked minor issues can rear their ugly heads when you try to scale out. In our case, the problem was how quickly we could commit new code while navigating the mass of existing code. At one point, the main DSC configuration file reached nearly 2,500 lines. There were some very large “if” statements so that certain resources were only executed on certain nodes – and it was very easy to get lost.
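The pattern looked something like this – a hedged sketch of a monolithic DSC configuration, where node property names like `Role` and the resources shown are illustrative rather than our real ones:

```powershell
Configuration MainConfig
{
    Node $AllNodes.NodeName
    {
        # ...hundreds of lines of shared resources...

        # Node-specific branches like this were scattered through the file,
        # making it hard to see which resources applied to which servers.
        if ($Node.Role -eq 'WebServer') {
            WindowsFeature IIS {
                Name   = 'Web-Server'
                Ensure = 'Present'
            }
        }

        if ($Node.Role -eq 'EventStore') {
            # ...EventStore-specific resources...
        }
    }
}
```

Multiply that branching by every role in the estate and the “very easy to get lost” problem follows naturally.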
These were some of the pain points we identified, in addition to the sheer size of the codebase:
- It was difficult to test new code without running the entire configuration at once – new configuration could not be tested in isolation.
- Running the configuration on your own laptop required secret parameters that are passed in at deployment time; in the pipeline environments, these parameters are handled by Octopus Deploy.
- Third party module dependencies were committed into Git and we didn’t really know which dependencies matched up to which exact parts of the configuration. We could never be sure what impact a dependency upgrade would have on seemingly unrelated parts of the configuration code.
- It was not easy to test the DSC code on a fresh virtual environment – a lot of manual steps were required.
Time for a change
It was decided that in order to continue effectively, we would need to start refactoring our existing configuration into DSC modules, and reference those modules in the main configuration file. Any new configuration would either update existing modules, or a new module would be created.
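After refactoring in this style, the main configuration file shrinks to module imports plus per-node wiring. A hedged sketch of what that looks like – module and resource names here are illustrative, not our real ones:

```powershell
Configuration MainConfig
{
    # Each refactored area of configuration now lives in its own DSC module.
    Import-DscResource -ModuleName 'AcmeIIS'
    Import-DscResource -ModuleName 'AcmeEventStore'

    Node $AllNodes.NodeName
    {
        if ($Node.Role -eq 'WebServer') {
            # One resource call replaces a large inline block of configuration.
            AcmeIIS HardenedWebServer {
                Ensure = 'Present'
            }
        }
    }
}
```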
In order to drive this new way of working with DSC, we needed some one-click PowerShell commands that could:
- Instantly scaffold the file/folder structure of a new PowerShell DSC module.
- Create/destroy test environments in AzureRM.
- Upload module code to the test environment, and execute it.
- Run integration tests, for testing the module configuration in isolation from the main configuration file.
- Download third party modules into a local folder and upload them to the test environment.
- Run on Linux and macOS (taking advantage of PowerShell 6!)
- Create a versioned artefact that can be consumed by one or more projects (code reuse!) and also list its own dependencies.
With those requirements in mind, I started to work on the module.
PSForge was born
By running New-DSCModule, the following files are created inside a new directory:
- A PowerShell module manifest
- A tiny bootstrapper to install Paket
- A DSCResource directory, where it will optionally create resources for you
- A Gemfile for Test-Kitchen dependencies
- The Test Kitchen configuration file, with sensible defaults to create a single Windows 2012 R2 VM
- Some example Pester tests
- A README file!
It will then automatically install any missing Ruby gems, install Paket and initialise a Git repository.
This makes creating a new module as simple as running one command.
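In practice that looks something like the following – the exact argument handling and file layout may differ slightly from this sketch, so check `Get-Help New-DSCModule` in the real module:

```powershell
New-DSCModule MyWebConfig

# Resulting layout, roughly:
#
#   MyWebConfig/
#   ├── MyWebConfig.psd1   # PowerShell module manifest
#   ├── .paket/            # Paket bootstrapper
#   ├── DSCResources/      # resources, optionally scaffolded for you
#   ├── Gemfile            # Test Kitchen gem dependencies
#   ├── .kitchen.yml       # Test Kitchen configuration
#   ├── Tests/             # example Pester tests
#   └── README.md
```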
New DSC resources can be created after the initial bootstrap by running New-DSCResource.
A module can be integration tested by running Test-DSCModule – this uses Test Kitchen to orchestrate test environment creation, configuration and testing. Before the environment is created, Paket resolves any module dependencies into a local folder called “packages”, and this folder is uploaded to the Azure environment.
Why not just use Install-Module on the Azure environment? Because in our workflow, we always work under the principle that dependencies are packaged on a Continuous Integration server and the package is then transferred to the deployment target. We also want to support environments that may only be running WMF 4.0, where Install-Module is not available.
The Test Kitchen configuration file requires an environment variable to be set (to prevent Azure credentials from being checked into source control); PSForge helps you set the variable up if it’s missing.
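For illustration, a `.kitchen.yml` along these lines reads the subscription from the environment rather than hard-coding it. The field names follow the kitchen-azurerm driver; the file PSForge generates is the source of truth, so treat this as a sketch:

```yaml
driver:
  name: azurerm
  # Read from the environment so credentials never land in Git
  subscription_id: <%= ENV['AZURE_SUBSCRIPTION_ID'] %>
  location: 'West Europe'
  machine_size: 'Standard_D2_v2'

platforms:
  - name: windows2012-r2

suites:
  - name: default
```

If `AZURE_SUBSCRIPTION_ID` is unset, the ERB lookup yields nothing and the run fails – which is why PSForge prompts you to set it.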
Finally, the Export-DSCModule command creates a NuGet package for you. It auto-generates Paket configuration files, then uses Paket to create the package. I chose Paket for this task because it handles transitive dependencies much better than vanilla NuGet does. On non-Windows systems, PSForge uses the Mono framework to run Paket.
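Export-DSCModule generates the Paket files for you, but for orientation, a hand-written `paket.template` for a module might look roughly like this (module name and file list are illustrative):

```
type file
id MyWebConfig
version 1.0.0
authors DevOpsGroup
description
    DSC module that configures IIS for a web server role
files
    MyWebConfig.psd1 ==> .
    DSCResources ==> DSCResources
```

Running `paket pack` against a template like this produces the versioned NuGet package, with the module’s own dependencies recorded in `paket.dependencies` so that consuming projects pick them up transitively.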
Putting the module to work
I have created three modules using PSForge:
- A Ruby installer module, which supports installing gems.
- A module that configures IIS for maximum security.
- A module that installs EventStore and configures it as a Windows service.
The DSC code for each module was refactored from the monolithic DSC project, with some minor tweaks (to add resource parameters).
Together, those three modules removed 630 lines of code from the main DSC configuration file – which accounts for just over 25% of the entire file!
I’ve learned a lot about PowerShell 5 while writing this module. I can see PowerShell DSC becoming the de facto standard for Windows configuration management.
PSForge is still work-in-progress, but we’re eager for feedback and Pull Requests – the module code can be downloaded at https://github.com/devopsguys/PSForge.