Thursday, January 12, 2017

A PowerShell function to translate wind speed to Beaufort scale numbers

Last week I showed you how to write a PowerShell function that converts the wind direction in degrees to the corresponding italianate wind name; this time I am introducing a function, called Get-WindForce, that translates the wind speed in m/s into a readable description taken from the Beaufort scale.

While the function I wrote last week uses the Italian wind names of medieval origin, which are left untranslated because they have become an international standard still in use today, a function that describes the wind force has to return its information in the language of the reader.

That's why I had to add an optional -Language parameter that tells the function which language to use for the output. The function first relies on a Switch statement on the wind speed to choose the matching set of descriptions:


Then it uses a second Switch statement, this time on the language, to pick the description to return:


Nothing especially difficult here, apart from the fact that it took quite a bit of typing to get all the possible wind speeds and descriptions into the four languages I have pre-loaded in my function: English, Italian, French and German.

Here's the full code of my function:

function Get-WindForce {

<#
.Synopsis
   Returns wind force from speed in m/s
.DESCRIPTION
   Returns wind force in a given language from speed in m/s
.EXAMPLE
   Get-WindForce -speed 2 -language EN
.EXAMPLE
   Get-WindForce -speed 31.5 -language IT
.EXAMPLE
    15,40 | Get-WindForce -Language FR -Verbose
.NOTES
   happysysadm.com
   @sysadm2010
#>
    [CmdletBinding()]
    [OutputType([string])]
    Param
    (
        # Speed of wind in m/s
        [Parameter(Mandatory=$true,ValueFromPipeline=$true)]
        [double]$Speed,

        # Language to use for the output of the wind force
        [string]$Language = 'EN'
    )
    
    Process {
    
        Write-Verbose "working on $speed m/s"
        $windforce = switch ($speed) {
            {$_ -lt 0.3} { @('Calm','Calma','Calme','Windstille') }
            {($_ -ge 0.3) -and ($_ -le 1.5)} { @('Light air','Bava di vento','Très légère brise','Leichter Zug') }
            {($_ -ge 1.6) -and ($_ -le 3.3)} { @('Light breeze','Brezza leggera','Légère brise','Leichte Brise') }
            {($_ -ge 3.4) -and ($_ -le 5.5)} { @('Gentle breeze','Brezza tesa','Petite brise','Schwache Brise') }
            {($_ -ge 5.6) -and ($_ -le 7.9)} { @('Moderate breeze','Vento moderato','Jolie brise','Mäßige Brise') }
            {($_ -ge 8) -and ($_ -le 10.7)} { @('Fresh breeze','Vento teso','Bonne brise','Frische Brise') }
            {($_ -ge 10.8) -and ($_ -le 13.8)} { @('Strong breeze','Vento fresco','Vent frais','Starker Wind') }
            {($_ -ge 13.9) -and ($_ -le 17.1)} { @('Near gale','Vento forte','Grand frais','Steifer Wind') } 
            {($_ -ge 17.2) -and ($_ -le 20.7)} { @('Gale','Burrasca','Coup de vent','Stürmischer Wind') }
            {($_ -ge 20.8) -and ($_ -le 24.4)} { @('Strong gale','Burrasca forte','Fort coup de vent','Sturm') }
            {($_ -ge 24.5) -and ($_ -le 28.4)} { @('Storm','Tempesta','Tempête','Schwerer Sturm') }
            {($_ -ge 28.5) -and ($_ -le 32.6)} { @('Violent storm','Fortunale','Violente tempête','Orkanartiger Sturm') }
            {$_ -ge 32.7} { @('Hurricane','Uragano','Ouragan','Orkan') }
            default { 'NA','NA','NA','NA' }
            }

        Write-Verbose "Printing in choosen language: $Language"
        switch ($language) {
            'EN' {$windforce[0]}
            'IT' {$windforce[1]}
            'FR' {$windforce[2]}
            'DE' {$windforce[3]}
            Default {$windforce[0]}
            }

    }
   
}
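
A couple of quick usage examples; based on the ranges in the switch block above, the expected output would be:

PS C:\> Get-WindForce -Speed 12.5 -Language DE
Starker Wind

PS C:\> 2,18 | Get-WindForce -Language IT
Brezza leggera
Burrasca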

Feel free to adapt this function to your own language. In my case I am using it as an additional function in the module I am writing to manage my pellet stove. To be more specific, I am using these few lines of code to translate the raw information I get from an external weather data provider, but I'll talk about that part in a future post. Stay tuned for more PowerShell for domotics!


UPDATE: It looks like my function has been improved: check out MCT Carnegie Johnson's version on GitHub.

Monday, January 2, 2017

A PowerShell function to convert wind degrees to compass directions and italianate wind names

Winter has arrived and since there is no Windows Phone app for managing my pellet stove, I am working on a PowerShell module so that I know exactly how much my stove consumes by fetching the relevant data from the onboard management unit.

The module I am writing allows me to turn on and off my stove, and to configure it to suit my needs.

I have also added to this module a couple of additional functions to build a relation between the activity of my home stove with the data taken from the external weather data provider (I'll talk about this in a future post).

Now, I live in a region where strong winds can lower the temperature, so I thought it necessary to add to my module a function specifically aimed at converting the wind direction (in degrees) into something more readable, such as one of the 16 compass headings, as well as into its italianate wind name.

This is what I am going to share with you here today.

I am not going to write an entire blog post on the history of the Rose of the Winds (there's a Wikipedia article for that), but it is interesting to note that there are three possible ways to express a wind direction:
  1. in degrees, such as 90°, 220°, etc.
  2. through its cardinal (north, south, east, west) and ordinal (northeast, southeast, southwest and northwest) directions
  3. through its italianate wind name
The italianate wind names, also known as the traditional wind names, derive from the fact that during the 13th and 14th centuries Italian was the lingua franca of the whole Mediterranean region, and sailors used it to name winds in a way that remained understandable when crossing paths with people from other countries.

So basically you have the following table with the eight principal winds:

Direction Italianate Name
--------- --------------
North     Tramontana
Northeast Grecale or Bora
East      Levante or Oriente
Southeast Scirocco or Exaloc
South     Ostro or Mezzogiorno
Southwest Libeccio or Garbino
West      Ponente or Zephyrus
Northwest Maestrale or Mistral

Now, since I want my function to be more precise about the wind direction, I have chosen to adopt the 16-wind compass:

Direction       Italianate Name      
---------       --------------      
North           Tramontana
North Northeast Tramontana-Grecale
Northeast       Grecale
East Northeast  Grecale-Levante
East            Levante
East Southeast  Levante-Scirocco
Southeast       Scirocco
South Southeast Scirocco-Ostro
South           Ostro
South Southwest Ostro-Libeccio
Southwest       Libeccio
West Southwest  Libeccio-Ponente
West            Ponente
West Northwest  Ponente-Maestrale
Northwest       Maestrale
North Northwest Maestrale-Tramontana

This is the function I came up with:

function Get-WindDirection {

<#
.Synopsis
   Returns wind direction
.DESCRIPTION
   Returns wind direction and the italianate wind name
.EXAMPLE
   Get-WindDirection -Degree 90
.NOTES
   happysysadm.com
   @sysadm2010
#>

    [CmdletBinding()]
    [OutputType([string])]
    Param
    (
        # Degrees
        [Parameter(Mandatory=$true,ValueFromPipeline=$true)]
        [ValidateRange(0,360)][int]$Degree
    )
    Begin {
        $WindCompassDirection = @("North","North Northeast","Northeast","East Northeast","East","East Southeast", "Southeast", "South Southeast","South","South Southwest","Southwest","West Southwest","West","West Northwest","Northwest","North Northwest","North")
        $WindCompassName = @('Tramontana','Tramontana-Grecale','Grecale','Grecale-Levante','Levante','Levante-Scirocco','Scirocco','Scirocco-Ostro','Ostro','Ostro-Libeccio','Libeccio','Libeccio-Ponente','Ponente','Ponente-Maestrale','Maestrale','Maestrale-Tramontana','Tramontana')
        }

    Process {
        $Sector = $Degree/22.5  #Divide the angle by 22.5 because 360deg/16 directions = 22.5deg/direction change
        Write-Verbose "$Degree is in $Sector sector."
        $Value = "" | Select-Object -Property Direction,Name
        $Value.Direction = $WindCompassDirection[$Sector]
        $Value.Name = $WindCompassName[$Sector]
        return $Value
        }

    End {}
   
}
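
A quick test; the values follow directly from the sector arithmetic explained below, though the exact table formatting may differ on your system:

PS C:\> 23,177 | Get-WindDirection

Direction       Name
---------       ----
North Northeast Tramontana-Grecale
South           Ostro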

Let's look at it in detail and walk through the process of converting a wind direction in degrees into text.

The first step is to divide the given wind angle by 22.5 because 360 degrees divided by 16 directions gives sectors 22.5 degrees wide. So:

PS C:\> 23/22.5
1.02222222222222

which means that a wind coming from 23° falls into sector 1 (North Northeast) once the value is rounded.

Another example:

PS C:\> 177/22.5
7.86666666666667

which means that a wind coming from 177° falls into sector 8 (South) once the value is rounded.

Let's put that value into a $Sector variable:

$Sector = $Degree/22.5

Now I just have to tag each sector by creating an array containing all of them. Actually I want two arrays, because I want to be able to print the cardinal or ordinal direction along with the italianate wind name.

Each array has 17 elements, not 16, so that I am certain to translate a value such as 359° to North:

$WindCompassDirection = @("North","North Northeast","Northeast","East Northeast","East","East Southeast", "Southeast", "South Southeast","South","South Southwest","Southwest","West Southwest","West","West Northwest","Northwest","North Northwest","North")

$WindCompassName = @('Tramontana','Tramontana-Grecale','Grecale','Grecale-Levante','Levante','Levante-Scirocco','Scirocco','Scirocco-Ostro','Ostro','Ostro-Libeccio','Libeccio','Libeccio-Ponente','Ponente','Ponente-Maestrale','Maestrale','Maestrale-Tramontana','Tramontana')
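
A quick check of that edge case: 359 divided by 22.5 gives 15.95..., which gets rounded to 16 when used as an index, landing on that final, seventeenth element:

PS C:\> 359/22.5
15.9555555555556

PS C:\> $WindCompassDirection[359/22.5]
North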

Now, you probably already know that you can echo back the value of an item (such as the one at index 4) in an indexed PowerShell array with the following code:

$array[4]

This is the technique I use to access the value inside the array composed of wind names:

$value.Direction = $WindCompassDirection[$Sector]
$value.Name = $WindCompassName[$Sector]

It is worth noticing that when $Sector is a Double, as in the case of

PS C:\> 177/22.5
7.86666666666667

it is automatically converted to an integer when it is used as an index into an array:

PS C:\> $WindCompassDirection[7.86666666666667]
South

Of course we have to accept the banker's rounding mechanism here, which I have already explained in a previous post.
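
In practice this means that a value exactly halfway between two integers is rounded to the nearest even index, which you can check directly:

PS C:\> $WindCompassDirection[0.5]
North

PS C:\> $WindCompassDirection[1.5]
Northeast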

Stay tuned for more functions from my PowerShell module to manage a pellet stove in those cold times.

Happy new year, readers, and happy coding.

Tuesday, December 27, 2016

A function for robust command execution in PowerShell

I have always been a great fan of a tool named RoboCopy, which I bet many of you have used countless times. These days I have been in need of this very same kind of robustness for one of my functions, since I am running it in an unreliable environment.

What I wanted to achieve with PowerShell, in particular, was to fetch a great deal of data from public web servers and reuse the information in my scripts. Unfortunately, when you use a cmdlet, be it Invoke-RestMethod or Test-Connection, you can get failures that are not due to your cmdlets but to the underlying infrastructure (such as a flaky Wi-Fi network, a distorted topology caused by a flapping router, or a way too busy web server).

Sure, Invoke-RestMethod has a TimeoutSec parameter, but what if it fails and I really need the information coming from that website? Well, this reasoning brought me to write an advanced function that takes a command and its parameters and tries to run it a given number of times (three by default), with an interval of three seconds between attempts.
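
Stripped of logging, verbose messages and the -Wait mode, the core of the idea is just a do/until loop around the call operator and splatting. Here is a minimal sketch, where $Command and $CommandArgs are placeholders for the command name and its parameter hashtable (the full function below is the real thing):

$CommandArgs.ErrorAction = 'Stop'   # make failures terminating so the catch block sees them
$Tries = 0
$Done  = $false
do {
    try {
        & $Command @CommandArgs     # call operator + splatting: any command, any parameters
        $Done = $true
    }
    catch {
        $Tries++
        if ($Tries -ge 3) { throw } # give up after three failed attempts
        Start-Sleep -Seconds 3      # pause before retrying
    }
} until ($Done)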

This function, which I called Start-RoboCommand (Start is an approved verb, so PSScriptAnalyzer is happy, and I borrowed the idea of the RoboCommand noun from RoboCopy itself), also supports a mode where the command is run indefinitely, through the addition of a -Wait parameter, like the one you can find in recent versions of Get-Content.

Finally, I added a LogFile parameter to log errors (which is particularly important here, since we are dealing precisely with commands that fail) and Verbose support, which tells you exactly what's going wrong.


Now without further ado, here's my function:

function Start-RoboCommand {

<#
.Synopsis
   Function that tries to run a command until it succeeds or forever
.DESCRIPTION
   Function that tries to run a command until it succeeds or forever. By default this function tries to run a command three times with three seconds intervals.
.PARAMETER Command
    Command to execute
.PARAMETER Args
    Arguments to pass to the command
.PARAMETER Count
    Number of tries before throwing an error
.PARAMETER Wait
    Run the command forever even if it succeeds
.PARAMETER DelaySec
    Time in seconds between two tries
.PARAMETER LogFile
    The path to the error log
.EXAMPLE
   Start-RoboCommand -Command 'Invoke-RestMethod' -Args @{ URI = "http://guid.it/json"; TimeoutSec = 1 } -Count 2 -Verbose
.EXAMPLE
   Start-RoboCommand -Command 'Invoke-RestMethod' -Args @{ URI = "http://notexisting.it/json"; TimeoutSec = 1 } -Count 2 -Verbose
.EXAMPLE
   Start-RoboCommand -Command 'Invoke-RestMethod' -Args @{ URI = "http://guid.it/json"; TimeoutSec = 1 } -Wait -Verbose
.EXAMPLE
   Start-RoboCommand -Command 'Invoke-RestMethod' -Args @{ URI = "http://notexisting.it/json"; TimeoutSec = 1 } -Wait -Verbose
.EXAMPLE
   Start-RoboCommand -Command 'Test-Connection' -Args @{ ComputerName = "bing.it" } -Wait -Verbose
.EXAMPLE
   Start-RoboCommand -Command 'Test-Connection' -Args @{ ComputerName = "nocomputer" } -Wait -LogFile $Env:temp\error.log -Verbose
.EXAMPLE
   Start-RoboCommand -Command Get-Content -Args @{path='d:\inputfile.txt'} -Wait -DelaySec 2 -LogFile $Env:temp\error.log -Verbose
.NOTES
   happysysadm.com
   @sysadm2010
#>

    [CmdletBinding(SupportsShouldProcess,DefaultParameterSetName='Limited')]
    Param (
    
    [Parameter(Mandatory=$true)]
    [Alias("Cmd")]
    [string]$Command, 

    [Parameter(Mandatory=$true)]
    [hashtable]$Args, 

    [Parameter(Mandatory=$false,ParameterSetName = 'Limited')]
    [int32]$Count = 3, 

    [Parameter(Mandatory=$false,ParameterSetName = 'Forever')]
    [switch]$Wait,

    [Parameter(Mandatory=$false)]
    [int32]$DelaySec = 3,

    [Parameter(Mandatory=$false)]
    $LogFile
    )
    
    $Args.ErrorAction = "Stop"
        
    $RetryCount = 0

    $Success = $false
    
    do {

        try {

            & $command @args

            Write-Verbose "$(Get-Date) - Command $Command with arguments `"$($Args.values[0])`" succeeded."

            if(!$Wait) {
                
                $Success = $true

                }
            
            }
        
        catch {

            if($LogFile) {

                "$(Get-Date) - Error: $($_.Exception.Message) - Command: $Command - Arguments: $($Args.values[0])" | Out-File $LogFile -Append

                }
            
            if ($retrycount -ge $Count) {

                Write-Verbose "$(Get-Date) - Command $Command with arguments `"$($Args.values[0])`" failed $RetryCount times. Exiting."

                $PSCmdlet.ThrowTerminatingError($_)
                
                }

            else {

                Write-Verbose "$(Get-Date) - Command $Command with arguments `"$($Args.values[0])`" failed. Retrying in $DelaySec seconds."

                Start-Sleep -Seconds $DelaySec

                if(!$Wait) {
                
                    $RetryCount++

                    }

                }

            }

        }

    while (!$Success)

 }

Let me know how it works for you, and if you have any suggestion on the logic I'll be more than happy to improve it over time. For sure you can also find it on my GitHub.

Thursday, December 8, 2016

Spotlight on the PSReadline PowerShell module

The trend is clear: Microsoft has shifted some major projects, like .NET and PowerShell itself, into the open-source ecosystem, and has made them cross-platform. Today you can run your PowerShell scripts on a GUI-less Windows Server Core, or on a headless Nano Server, but also on Linux, and on a Mac.

There is a project in particular which reveals this kind of cross-pollination between OSes, and it is the PSReadline module, which is aimed at bringing the GNU Readline experience to your PowerShell console.
This module is installed by default on Windows 10 and brings some slick functionalities which are well worth a quick look.
The first functionality is that, with PSReadline, the console preserves command history across sessions. Sure, you were used to running Get-History to list the commands you typed, and Invoke-History (aliased as 'r') to run commands found in the history. But these two cmdlets are limited to the current session:

Now, with the arrival of PSReadline, which is loaded by default when you start a PowerShell console, you get the possibility of retrieving commands typed in previous sessions, even across reboots. This is achieved through log files stored inside the Application Data folder:
  • $env:APPDATA\Microsoft\Windows\PowerShell\PSReadline\ConsoleHost_history.txt for the PowerShell console host (conhost.exe)
  • $env:APPDATA\Microsoft\Windows\PowerShell\PSReadline\Windows PowerShell ISE Host_history.txt for the Integrated Scripting Environment (ISE)
  • $env:APPDATA\Microsoft\Windows\PowerShell\PSReadline\Visual Studio Code Host_history.txt for Visual Studio Code, the new toy for those into DevOps
How did I discover that? Simple. The PSReadline module comes with five cmdlets:
  • Get-PSReadlineKeyHandler: gets the key bindings for the PSReadline module
  • Get-PSReadlineOption: gets values for the options that can be configured
  • Remove-PSReadlineKeyHandler: removes a key binding
  • Set-PSReadlineKeyHandler: binds keys to user-defined or PSReadline-provided key handlers
  • Set-PSReadlineOption: customizes the behavior of command line editing in PSReadline.
If you issue "(Get-PSReadlineOption).HistorySavePath" you will get the location where the system keeps the command history for your current interpreter.
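
For instance, in a classic console session this returns the first of the three paths listed above (the user name part will of course be your own):

PS C:\> (Get-PSReadlineOption).HistorySavePath
C:\Users\yourname\AppData\Roaming\Microsoft\Windows\PowerShell\PSReadline\ConsoleHost_history.txt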

Now, for some reason, the only working log among those listed above is the one for PowerShell in the console, probably because PowerShell ISE and VSCode don't have a true console (conhost.exe) behind them:


Since the Application Data folder is user-specific, you only have access to the command history for your own user account: there is one ConsoleHost_history.txt file for each user on a given computer. The permissions are set in a way that an administrator can access the command history of other users, which is good for checking your systems.

Here's a script I wrote to retrieve a list of all the consolehost_history.txt files on my systems, so that I know who used PowerShell and when:
(Get-ChildItem -Path c:\users).name | % {
    Get-Item ((Get-PSReadlineOption).HistorySavePath -replace ($env:USERNAME,$_)) -ErrorAction SilentlyContinue
    } | Select-Object FullName,
                      CreationTime,
                      LastWriteTime,
                      @{Name="Kbytes";Expression={ "{0:N0}" -f ($_.Length / 1Kb) }},
                      @{Name="Lines";Expression={(Get-Content $_.fullname | Measure-Object -Line).Lines}}
To prevent PowerShell from logging any command just type:
Set-PSReadlineOption -HistorySaveStyle SaveNothing
Other interesting settings that you could adopt or make custom are:
Set-PSReadLineOption -HistoryNoDuplicates
and
Set-PSReadLineOption -MaximumHistoryCount 40960
I wouldn't bother changing the HistorySaveStyle, because the default value suits me well: SaveIncrementally means that every command you run is appended to the log before it is actually executed.
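
For reference, HistorySaveStyle accepts three values — SaveIncrementally (the default), SaveAtExit and SaveNothing — so going back to the default behavior is simply a matter of:

Set-PSReadlineOption -HistorySaveStyle SaveIncrementally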

If you want to erase your command history, you can just press ALT+F7, as you can discover by issuing:
Get-PSReadlineKeyHandler | ? Function -eq 'clearhistory'

Key    Function     Description
---    --------     -----------
Alt+F7 ClearHistory Remove all items from the command line history (not PowerShell history)
The second functionality is the possibility of accessing and searching the history log interactively. What I mean is that you can use your keyboard to search the history by pressing key combinations. The existing key bindings can be discovered with:
Get-PSReadlineKeyHandler | ? function -like '*history*'

Key       Function                Description
---       --------                -----------
UpArrow   PreviousHistory         Replace the input with the previous item in the history
DownArrow NextHistory             Replace the input with the next item in the history
Ctrl+r    ReverseSearchHistory    Search history backwards interactively
Ctrl+s    ForwardSearchHistory    Search history forward interactively
Alt+F7    ClearHistory            Remove all items from the command line history (not PowerShell history)
F8        HistorySearchBackward   Search for the previous item in the history that starts with the current input - like PreviousHistory if the input is empty
Shift+F8  HistorySearchForward    Search for the next item in the history that starts with the current input - like NextHistory if the input is empty
Unbound   ViSearchHistoryBackward Starts a new seach backward in the history.
Unbound   BeginningOfHistory      Move to the first item in the history
Unbound   EndOfHistory            Move to the last item (the current input) in the history
As you can see, pressing Ctrl+r brings up a backward search (identified by bck-i-search): just start typing and PSReadline will complete the line with commands from the history log file:


The third functionality is that PSReadLine lets you mark, copy, and paste text in the usual Windows way. It is just as if you were in Word: CTRL+C copies text, CTRL+X cuts text, and CTRL+V pastes text. CTRL+C can still be used to abort a command line, but when you select some text, with the CTRL+SHIFT+ArrowKeys key combination for instance, PSReadline switches to the CTRL+C Windows mode. Awesome.

The fourth functionality is syntax checking as you type. When PSReadline detects a syntax error it turns the greater-than sign on the left red, like in the following example where I forgot to close the double quotes after the $Computer variable:


If on top of all these functionalities you add the syntax coloring provided by PSReadline, or the possibility to use key combinations like CTRL+Z to undo code changes, you end up with a PowerShell console that is a delight to use. And you can even install it on your old Windows 7 box by installing WMF v5 and then running the following line of code to get the module from the PowerShell Gallery:
Install-Module -Name PSReadline
Now just choose your way. Here's a comparative screenshot of the four main development environments I use:


Happy coding.

Wednesday, November 30, 2016

A PowerShell function to monitor physical disk activity per storage bay during sync activities

These days I have been migrating data on an old Windows 2003 server from an old HP XP128 storage array to a newer one, an HP 3PAR. Both fiber channel SANs were mounted and managed on the server through Veritas Enterprise Administrator (VEA) version 5.1. At first I started with Robocopy to migrate data, ACLs, and all the rest from the old volume to the new one, but I soon discovered that there could be better ways to move huge amounts of data (I am talking here about several million sensitive files).

One of the main advantages of using Robocopy is that you have fine-grained control over your sync. The downside is that, after the sync, you have to stop the old volume and move all your pointers to the new volume, which has a big impact on the automation systems relying on those files to keep up their 24/7 activity.

I decided then to change plans and build a mirror on VEA between the old storage array and the new one.

The only problem with such an old version of VEA is that you don't have access to such basic information as whether the mirror sync is complete. The interface just shows you that you have successfully built your mirror, but it hides the information about the actual data sync taking place behind the curtains.

That's the moment the manager came in and asked for a way to keep an eye on the sync. And that's the moment I replied: I can do that for you with PowerShell, sir.

I knew fairly well that, though there's no PowerShell on a Windows 2003 server unless you have taken the time to install it, I could access its performance counters from a remote, more recent workstation through Get-Counter:
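
Here is the kind of query I mean, using one of the counters the function below relies on (srv1 is just a placeholder name):

Get-Counter -ComputerName srv1 -Counter '\PhysicalDisk(*)\Disk Read Bytes/sec' |
    Select-Object -ExpandProperty CounterSamples |
    Select-Object InstanceName, CookedValue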

What I wanted was to give the manager a script he could run himself that showed the activity of the disks involved in the sync. So I knew that I had to rely on cmdlets I am not used to putting in my functions, such as Clear-Host or Write-Host.
But, as with anything else, there are times when you have to make exceptions. And Write-Host does have its uses.
In the end I came up with a function that, given a set of physical disks on a source server and a target server, monitors the disk activity in terms of bytes read and written per second and, whenever those values are not null, sets the font color to green so that the active disks are highlighted.

The names of the disks can be found in the Perfmon GUI itself as well as in VEA:


Their names can be given as input to the function as a regular expression pattern. In my case this gave:

-SourceDiskPattern '\(2\)|\(14\)'
and
-DestinationDiskPattern '\(8\)|\(9\)|\(10\)|\(20\)|\(21\)|\(22\)' 
because I am trying to match those hard disk numbers.
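
A quick way to check that such a pattern matches the intended counter instances (the counter path here is just an illustrative sample):

PS C:\> '\\srv1\physicaldisk(2)\disk read bytes/sec' -match '\(2\)|\(14\)'
True
PS C:\> '\\srv1\physicaldisk(3)\disk read bytes/sec' -match '\(2\)|\(14\)'
False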

I have also added to the function a couple of parameters to show the processor activity and the disk queue, since these counters can always be of use when tracing a workload:


In the end, here's the output expected by the manager during the sync, with the green lines highlighting the disks where the data are read or written:


I rely on Clear-Host to refresh the screen so that the manager only sees the current workload. This can be considered bad practice, as Invoke-ScriptAnalyzer will tell you, but in my case it is exactly the cmdlet I needed.


Here's the code for the Get-DiskStat function, which, by the way, you can find on GitHub:

<#
.Synopsis
   Monitors physical disk activity per bay during sync activities
.DESCRIPTION
   Monitors physical disk activity during a sync and highlights the disks that are active reading or writing bytes and the bay they belong to
.EXAMPLE
   Get-DiskStat -SourceComputer srv1 -SourceStorageBayName 'HPE 3PAR' -SourceDiskPattern '\(2\)|\(14\)' -DestinationComputer srv2 -DestinationStorageBayName 'HP XP128' -DestinationDiskPattern '\(7\)|\(10\)' -Refresh -Frequency 2 -Repeat 10
.EXAMPLE
   Get-DiskStat -SourceComputer srv1 -SourceStorageBayName 'HPE 3PAR' -SourceDiskPattern '\(2\)|\(14\)' -DestinationComputer srv2 -DestinationStorageBayName 'HP XP128' -DestinationDiskPattern '\(7\)|\(10\)' -Refresh -ShowCpu
.EXAMPLE
   Get-DiskStat -SourceComputer srv1 -SourceStorageBayName 'HPE 3PAR' -SourceDiskPattern '\(2\)|\(14\)' -DestinationComputer srv2 -DestinationStorageBayName 'HP XP128' -DestinationDiskPattern '\(7\)|\(10\)' -Refresh -ShowCpu -ShowQueue
.EXAMPLE
   Get-DiskStat -SourceComputer srv1 -SourceStorageBayName 'HPE 3PAR' -SourceDiskPattern '\(2\)|\(14\)' -DestinationComputer srv2 -DestinationStorageBayName 'HP XP128' -DestinationDiskPattern '\(7\)|\(10\)' -Refresh -ShowCpu -ShowQueue -Credential (Get-Credential)
.EXAMPLE
   Get-DiskStat -sc srv1 -sbn 'HPE 3PAR' -sdp '\(2\)|\(14\)' -dc srv2 -dbn 'HP XP128' -ddp '\(7\)|\(10\)' -R -C -Q -Cred (Get-Credential) -F 1 -rep 1000
.AUTHOR
   Carlo MANCINI
#>
function Get-DiskStat
{
    Param
    (
        # Source computer for the sync
        [Parameter(Mandatory=$true,Position=0)]
        [ValidateNotNull()]
        [ValidateNotNullOrEmpty()]
        [Alias("sc")]
        $SourceComputer,

        # Source bay name for the sync
        [Parameter(Mandatory=$true,Position=1)]
        [ValidateNotNull()]
        [ValidateNotNullOrEmpty()]
        [Alias("sbn")]
        $SourceStorageBayName,

        # Source disk pattern for the sync
        [Parameter(Mandatory=$true,Position=2)]
        [ValidateNotNull()]
        [ValidateNotNullOrEmpty()]
        [Alias("sdp")]
        $SourceDiskPattern,

        # Destination computer for the sync
        [Parameter(Mandatory=$true,Position=3)]
        [ValidateNotNull()]
        [ValidateNotNullOrEmpty()]
        [Alias("dc")]
        $DestinationComputer,

        # Destination bay name for the sync
        [Parameter(Mandatory=$true,Position=4)]
        [ValidateNotNull()]
        [ValidateNotNullOrEmpty()]
        [Alias("dbn")]
        $DestinationStorageBayName,

        # Destination disk pattern for the sync
        [Parameter(Mandatory=$true,Position=5)]
        [ValidateNotNull()]
        [ValidateNotNullOrEmpty()]
        [Alias("ddp")]
        $DestinationDiskPattern,

        # Clear the screen between each execution
        [Parameter(Position=6)][Alias("r")][Switch]$Refresh,

        # Show Active and Idle CPU counters
        [Parameter(Position=7)][Alias("c")][Switch]$ShowCpu,

        # Show disk queue for selected disks
        [Parameter(Position=8)][Alias("q")][Switch]$ShowQueue,

        # Specifies a user account that has permission to perform this action
        [Parameter(Mandatory=$false,Position=9)]
        [System.Management.Automation.PSCredential]
        [System.Management.Automation.Credential()]
        $Credential = [System.Management.Automation.PSCredential]::Empty,

        # Frequency of the polling in seconds
        [Parameter(Position=10)]
        [Alias("f")]
        $Frequency = 10,

        # Total number of polling to perform
        [Parameter(Position=11)]
        [Alias("rep")]
        $Repeat = 10

    )
    
    Try { 
    
        Test-Connection $SourceComputer,$DestinationComputer -Count 1 -ErrorAction Stop | Out-Null
        
        }
    
    Catch {
    
        Throw "At least one of the target servers is not reachable. Exiting."
        
        }

    $CounterList = '\PhysicalDisk(*)\Disk Read Bytes/sec','\PhysicalDisk(*)\Disk Write Bytes/sec','\PhysicalDisk(*)\Current Disk Queue Length','\Processor(_Total)\% Idle Time','\Processor(_Total)\% Processor Time'

    1..$Repeat | % {

        $SourceCounterValue = (Get-Counter $CounterList -ComputerName $SourceComputer).countersamples

        if($DestinationComputer -eq $SourceComputer) {

            $DestinationCounterValue = $SourceCounterValue

            $SameHost = $True

            }

        else {
        
            $DestinationCounterValue = (Get-Counter $CounterList -ComputerName $DestinationComputer).countersamples

            }

        if($Refresh) {Clear-Host}

        if($ShowCpu) {

                    "$SourceComputer CPU Activity & Idle"
                    
                    $SourceCounterValue | ? {$_.path -match 'processor'} | % {
    
                            Write-Host $_.path.padright(65)`t $_.InstanceName.padright(5)`t $([math]::round($_.cookedvalue)).tostring().padright(10)
                            
                            }

                    if(!$SameHost) {
                    
                        "$DestinationComputer CPU Activity & Idle"
                    
                        $DestinationCounterValue | ? {$_.path -match 'processor'} | % {
    
                            Write-Host $_.path.padright(65)`t $_.InstanceName.padright(5)`t $([math]::round($_.cookedvalue)).tostring().padright(10)
                            
                            }
                        }

                    }

        if($ShowQueue) {

            "$SourceStorageBayName Storage Bay Disk Queue on $SourceComputer"

            $SourceCounterValue | ? {($_.path -match $SourceDiskPattern) -and ($_.path -match 'queue')} | % {
    
                    if($_.cookedvalue -gt 0) {
                    
                        Write-Host $_.path.padright(65)`t $_.InstanceName.padright(5)`t $_.cookedvalue.tostring().padright(10) -ForegroundColor Green
                            
                        }
    
                    else {
                    
                        Write-Host $_.path.padright(65)`t $_.InstanceName.padright(5)`t $_.cookedvalue.tostring().padright(10) -ForegroundColor White
                        
                        }
                    
                    }

            "$DestinationStorageBayName Storage Bay Disk Queue on $DestinationComputer"

            $DestinationCounterValue | ? {($_.path -match $DestinationDiskPattern) -and ($_.path -match 'queue')} | % {
    
                    if($_.cookedvalue -gt 0) {
                    
                        Write-Host $_.path.padright(65)`t $_.InstanceName.padright(5)`t $_.cookedvalue.tostring().padright(10) -ForegroundColor Green
                            
                        }
    
                    else {
                    
                        Write-Host $_.path.padright(65)`t $_.InstanceName.padright(5)`t $_.cookedvalue.tostring().padright(10) -ForegroundColor White
                        
                        }
                    
                    }

            }


        "$SourceStorageBayName Read stats on $SourceComputer"

        $SourceCounterValue | ? {($_.path -match $SourceDiskPattern) -and ($_.path -match 'read')} | % {
    
            if($_.cookedvalue -gt 0) {
            
                Write-Host $_.path.padright(65)`t $_.InstanceName.padright(5)`t $([math]::round($_.cookedvalue)).tostring().padright(10) -ForegroundColor Green
                
                }
    
            else {
            
                Write-Host $_.path.padright(65)`t $_.InstanceName.padright(5)`t $([math]::round($_.cookedvalue)).tostring().padright(10) -ForegroundColor White
                
                }
            
            }

        "$SourceStorageBayName Write stats on $SourceComputer"

        $SourceCounterValue | ? {($_.path -match $SourceDiskPattern) -and ($_.path -match 'write')} | % {
    
            if($_.cookedvalue -gt 0) {
            
                Write-Host $_.path.padright(65)`t $_.InstanceName.padright(5)`t $([math]::round($_.cookedvalue)).tostring().padright(10) -ForegroundColor Green
                
                }
    
            else {
            
                Write-Host $_.path.padright(65)`t $_.InstanceName.padright(5)`t $([math]::round($_.cookedvalue)).tostring().padright(10) -ForegroundColor White
                
                }
            
            }

        "$DestinationStorageBayName Read stats on $DestinationComputer"

        $DestinationCounterValue | ? {($_.path -match $DestinationDiskPattern) -and ($_.path -match 'read')} | % {
    
            if($_.cookedvalue -gt 0) {
            
                Write-Host $_.path.padright(65)`t $_.InstanceName.padright(5)`t $([math]::round($_.cookedvalue)).tostring().padright(10) -ForegroundColor Green
                
                }
    
            else {
            
                Write-Host $_.path.padright(65)`t $_.InstanceName.padright(5)`t $([math]::round($_.cookedvalue)).tostring().padright(10) -ForegroundColor White
                
                }
            
            }

        "$DestinationStorageBayName Write stats on $DestinationComputer"

        $DestinationCounterValue | ? {($_.path -match $DestinationDiskPattern) -and ($_.path -match 'write')} | % {
    
            if($_.cookedvalue -gt 0) {
            
                Write-Host $_.path.padright(65)`t $_.InstanceName.padright(5)`t $([math]::round($_.cookedvalue)).tostring().padright(10) -ForegroundColor Green
                
                }
    
            else {
            
                Write-Host $_.path.padright(65)`t $_.InstanceName.padright(5)`t $([math]::round($_.cookedvalue)).tostring().padright(10) -ForegroundColor White
                
                }
            
            }

        Start-Sleep -Seconds $frequency

        }    

}
PowerShell, once again the tool for the job.

Friday, November 25, 2016

On the road to Overlay networking on Docker for Windows

The container networking stack has gone through many rapid improvements on Windows Server 2016, and it's nice to see that new features are coming out on a regular basis: Docker's release pace is fast, and though they have had a few missteps, most of the discovered bugs are promptly addressed.

In this post I want to talk to you about the implementation of multi-host networking on Docker for Windows.

On Linux this has been supported since kernel version 3.16, but on Windows, containers are a recent feature and overlay networking is likely going to be released pretty soon.

So, let's have a look at what this is and how it works.

As you have learned from my previous posts, the Docker engine communicates with the underlying Host Network Service (HNS) through a Libnetwork plugin. This plugin implements the Docker Container Network Model (CNM) which is composed of three main components:
  • A Sandbox, where the network configuration (IP address, mac address, routes and DNS entries) of the container is stored
  • An Endpoint linking the container Sandbox to a Network: this is a vNIC in the case of a Windows Container or a vmNIC in case of a Hyper-V container
  • A Network, which is a group of Endpoints belonging to different containers that can communicate directly
Behind each Network a built-in Driver performs the actual work of providing the required connectivity and isolation.
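
By the way, you can list the networks that exist on a Docker host, together with the driver behind each one, with the following command (the output obviously depends on the host):

docker network ls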

There are four possible driver packages inside Libnetwork:
  • null
  • bridge
  • overlay
  • remote
No network interface is attached to a container which is started with the Null driver:
docker run -it --network none microsoft/nanoserver powershell
Get-NetAdapter in this case returns nothing. And upon inspection this container will show no network:

In the second case, when you use the Bridge driver, the container won’t have a public IP but will be assigned a private address from the 20-bit private range defined by RFC 1918:

172.16.0.0 - 172.31.255.255 (172.16/12 prefix)

Get-Netadapter will show the virtual Ethernet adapter:
Get-NetAdapter

Name                      InterfaceDescription                    ifIndex
----                      --------------------                    -------
vEthernet (Container N... Hyper-V Virtual Ethernet Adapter #2          19
and Get-NetIpAddress will show the private IP address:
Get-NetIPAddress | Format-Table

ifIndex IPAddress                                       PrefixLength PrefixOrigin
------- ---------                                       ------------ ------------
19      fe80::29aa:cc8a:43f2:ae0f%19                              64 WellKnown   
18      ::1                                                      128 WellKnown   
19      172.31.2.5                                                20 Manual      
18      127.0.0.1                                                  8 WellKnown   
If I inspect this container, I can see the JSON describing the network specifications:
docker container inspect 4a44649f2b8d

Now, just a couple of weeks ago (in version v1.13.0-rc1), Docker implemented the third driver (read: Swarm-mode overlay networking support for Windows), which basically means that your Windows containers will be able to communicate even if they reside on different hosts.

Actually this is a bit more complicated than that, because Overlay networking has been implemented in the Docker engine but not yet in the HNS service of Windows. So if you try to build a multi-host network you will get the following error message:
docker network create -d overlay --subnet 10.1.1.0/24 multihost
Error response from daemon: HNS failed with error : Catastrophic failure
Same output if you try the PowerShell version:
New-ContainerNet -Driver overlay -Name MultiHost
New-ContainerNet : Docker API responded with status code=InternalServerError, response={"message":"HNS failed witherror : Catastrophic failure "}
At line:1 char:1
+ New-ContainerNet -Driver overlay -Name MultiHost
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
  + CategoryInfo          : NotSpecified: (:) [New-ContainerNet], DockerApiException
  + FullyQualifiedErrorId : Docker Client Exception,Docker.PowerShell.Cmdlets.NewContainerNet
Once the Windows binaries required to build an overlay network are released, it will be interesting to see whether Microsoft is going to embed in Nano Server the required key-value store, which has to be accessible to all the containers belonging to the same overlay network for them to be discoverable.

For the moment the most widely used key-value store is the one provided by Consul, but it is Linux-only, so you won't be able to run it on Windows:
docker run -p 8500:8500 -d consul --name consul
Unable to find image 'consul:latest' locally
latest: Pulling from library/consul
C:\Program Files\Docker\docker.exe: image operating system "linux" cannot be used on this platform.
See 'C:\Program Files\Docker\docker.exe run --help'.
All the same, overlay networking is soon going to be available for Docker containers on Windows. The first step has been taken. Now it is up to Microsoft to make the next move. Stay tuned for more on the subject.

Wednesday, November 16, 2016

Building a Docker container for the Image2Docker tool

I have been playing a bit with Image2Docker with the intention of seeing how far I could go in containerizing existing workloads. To date, this PowerShell-based module by fellow MVP and Docker Captain Trevor Sullivan mounts a vhdx or wim Windows image and tries to discover running artifacts, such as IIS, SQL or Apache, and generates a Dockerfile for a container hosting these services.

Now this is still experimental, and the list of accepted artifacts is still short, but I couldn't restrain myself from trying to build a Docker container for the job.

Here's how I tackled this, knowing that, like most of us, I am taking my first steps with this new feature of Windows 2016.

First of all I built the following Dockerfile in Visual Studio Code:


Basically I am issuing five statements (a rough reconstruction of the Dockerfile follows this list):
  1. pull the microsoft/nanoserver image. Actually I could have used the microsoft/windowsservercore image as well but that would have taken longer
  2. state that I am the maintainer of the repository
  3. install the package provider called NuGet
  4. install the actual Image2Docker module (version 1.5 at the time of writing)
  5. set the ConvertTo-Dockerfile cmdlet as entry point for this container, so that I can pass the .vhdx or .wim image path straight into this dedicated container on execution
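Since the original screenshot is not reproduced here, this is a rough reconstruction of what those five statements translate to in Dockerfile form; the exact syntax and flags in my original file may differ slightly:

FROM microsoft/nanoserver
MAINTAINER happysysadm
RUN powershell -Command "Install-PackageProvider -Name NuGet -Force"
RUN powershell -Command "Install-Module -Name Image2Docker -Force"
ENTRYPOINT ["powershell", "ConvertTo-Dockerfile"]
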
Then the next steps to publish to the Docker Hub are:

docker build .  -t happysysadm/image2docker:latest -t happysysadm/image2docker:v0.1

In the step above I am issuing the build command from the folder containing the Dockerfile, and I am setting two tags for the same image: latest and v0.1.


Then I logged in to the Hub:

docker login -u user -p password

And pushed the container into my public registry:

docker push happysysadm/image2docker

At this moment this repo becomes visible on the web:


Once I got my container up in the Hub, I cleaned up my local image:

docker rmi -f happysysadm/image2docker:v0.1

and pulled it again:

docker pull happysysadm/image2docker

Every time I updated my Dockerfile, I had to rebuild and increment the version tag:

docker build .  -t happysysadm/image2docker:latest -t happysysadm/image2docker:v0.2

In the step above the latest tag moves to v0.2 and the previous image retains only the v0.1 tag.

Now this container is public and you can just do:

docker run happysysadm/image2docker sample.vhdx

and get the Dockerfile for your Windows image created for you. Let me know how it goes and remember that this project is open source so everybody's contribution is accepted.

Monday, November 14, 2016

Step up container management with PowerShell for Docker

I remember that one of the first reasons I started using Windows PowerShell is that it uses objects to represent data, which is great when you are interacting with an object-oriented Windows ecosystem. Now that some historical borders have been crossed between Linux and Windows, and that preexisting tools have been ported to Microsoft's OS, we, as PowerShell guys, could face a bit of a step backwards in the way we use the shell.

Just have a look at Docker.

Invented in 2013 by a French guy named Solomon Hykes, this open source project aimed at automating the deployment of Linux containers was quickly adopted by Microsoft for their latest operating system and can today be run on both Windows 10 and Windows 2016.

The main drawback of adopting such a tool is that it comes with a command line which looks obsolete in PowerShell terms: it only produces strings, which are hardly reusable unless you feed them to ConvertFrom-String:

docker images | ConvertFrom-String -Delimiter "\s{2,}" | Format-Table

P1                          P2     P3           P4          P5
--                          --     --           --          --
REPOSITORY                  TAG    IMAGE ID     CREATED     SIZE
microsoft/iis               latest 211fecef1e6b 5 days ago  9.48 GB
microsoft/sample-dotnet     latest c14528829a37 2 weeks ago 911 MB
microsoft/windowsservercore latest 93a9c37b36d0 7 weeks ago 8.68 GB
microsoft/nanoserver        latest e14bc0ecea12 7 weeks ago 810 MB

Now, though ConvertFrom-String is an extremely powerful cmdlet released with PowerShell 5.0 (check my blog post on the subject), it takes some time to get comfortable with its syntax. In the previous example, for instance, I am outputting the list of the images I have pulled from the Docker Hub onto my system. The text that comes through the pipeline once I run 'docker images' has to be split wherever there are at least two consecutive spaces. To achieve that I use the Delimiter parameter and match a whitespace \s at least two times {2,}.

Needless to say, knowing regular expressions becomes a must.

Happily enough, we have an alternative to this. Since Docker comes with a nice API, there is an open source project for a module exposing PowerShell cmdlets to manage Docker images, containers and networks. Though still in development, I heartily suggest you start using it to maintain consistency with your existing environment.

You can find it here:




The installation is straightforward.

Register-PSRepository -Name DockerPS-Dev -SourceLocation https://ci.appveyor.com/nuget/docker-powershell-dev

Install-Module Docker -Repository DockerPS-Dev -Scope CurrentUser

Here's the list of cmdlets that come with it:

Get-Command -Module Docker -CommandType Cmdlet

CommandType     Name                                               Version    Source
-----------     ----                                               -------    ------
Cmdlet          Add-ContainerImageTag                              0.1.0.111  Docker
Cmdlet          ConvertTo-ContainerImage                           0.1.0.111  Docker
Cmdlet          Copy-ContainerFile                                 0.1.0.111  Docker
Cmdlet          Enter-ContainerSession                             0.1.0.111  Docker
Cmdlet          Export-ContainerImage                              0.1.0.111  Docker
Cmdlet          Get-Container                                      0.1.0.111  Docker
Cmdlet          Get-ContainerDetail                                0.1.0.111  Docker
Cmdlet          Get-ContainerImage                                 0.1.0.111  Docker
Cmdlet          Get-ContainerNet                                   0.1.0.111  Docker
Cmdlet          Get-ContainerNetDetail                             0.1.0.111  Docker
Cmdlet          Import-ContainerImage                              0.1.0.111  Docker
Cmdlet          Invoke-ContainerImage                              0.1.0.111  Docker
Cmdlet          New-Container                                      0.1.0.111  Docker
Cmdlet          New-ContainerImage                                 0.1.0.111  Docker
Cmdlet          New-ContainerNet                                   0.1.0.111  Docker
Cmdlet          Remove-Container                                   0.1.0.111  Docker
Cmdlet          Remove-ContainerImage                              0.1.0.111  Docker
Cmdlet          Remove-ContainerNet                                0.1.0.111  Docker
Cmdlet          Request-ContainerImage                             0.1.0.111  Docker
Cmdlet          Start-Container                                    0.1.0.111  Docker
Cmdlet          Start-ContainerProcess                             0.1.0.111  Docker
Cmdlet          Stop-Container                                     0.1.0.111  Docker
Cmdlet          Submit-ContainerImage                              0.1.0.111  Docker
Cmdlet          Wait-Container                                     0.1.0.111  Docker
This module also exposes a bunch of aliases, though I don't recommend their use since they seem confusing to me and don't add anything in terms of command line agility:

Get-Command -Module Docker -CommandType Alias | Format-Table Name,ResolvedCommandName

Name                 ResolvedCommandName
----                 -------------------
Attach-Container     Enter-ContainerSession
Build-ContainerImage New-ContainerImage
Commit-Container     ConvertTo-ContainerImage
Exec-Container       Start-ContainerProcess
Load-ContainerImage  Import-ContainerImage
Pull-ContainerImage  Request-ContainerImage
Push-ContainerImage  Submit-ContainerImage
Run-ContainerImage   Invoke-ContainerImage
Save-ContainerImage  Export-ContainerImage
Tag-ContainerImage   Add-ContainerImageTag

So, docker images becomes:

Get-ContainerImage

RepoTags                              ID                   Created                Size(MB)
--------                              --                   -------                --------
microsoft/sample-dotnet:latest        sha256:c14528829a... 25/10/2016 13:55:28    869,05
microsoft/windowsservercore:latest    sha256:93a9c37b36... 22/09/2016 10:51:07    8 273,19
microsoft/nanoserver:latest           sha256:e14bc0ecea... 22/09/2016 09:39:30    772,81

and the returned object is a highly reusable ImagesListResponse object:

Get-ContainerImage | Get-Member

   TypeName: Docker.DotNet.Models.ImagesListResponse

Name        MemberType Definition
----        ---------- ----------
Equals      Method     bool Equals(System.Object obj)
GetHashCode Method     int GetHashCode()
GetType     Method     type GetType()
ToString    Method     string ToString()
Created     Property   datetime Created {get;set;}
ID          Property   string ID {get;set;}
Labels      Property   System.Collections.Generic.IDictionary[string,string] Labels...
ParentID    Property   string ParentID {get;set;}
RepoDigests Property   System.Collections.Generic.IList[string] RepoDigests {get;set;}
RepoTags    Property   System.Collections.Generic.IList[string] RepoTags {get;set;}
Size        Property   long Size {get;set;}
VirtualSize Property   long VirtualSize {get;set;}

Same model for the list of existing containers:

Get-Container

ID                   Image           Command              Created                Status
--                   -----           -------              -------                ------
43a05b618697033eb... microsoft/na... c:\windows\system... 14/11/2016 09:44:19    Exited...
005b51dbe002324f8... microsoft/na... --name nanoserver1   14/11/2016 09:44:04    Created
e8b31c61d5f42b271... microsoft/na... --name nanoserver1   14/11/2016 09:42:12    Created
547b7dbd3b1473127... microsoft/sa... dotnet dotnetbot.dll 06/11/2016 16:11:07    Exited...
Get-Container | Get-Member

   TypeName: Docker.DotNet.Models.ContainerListResponse

Name            MemberType Definition
----            ---------- ----------
Equals          Method     bool Equals(System.Object obj)
GetHashCode     Method     int GetHashCode()
GetType         Method     type GetType()
ToString        Method     string ToString()
Command         Property   string Command {get;set;}
Created         Property   datetime Created {get;set;}
ID              Property   string ID {get;set;}
Image           Property   string Image {get;set;}
ImageID         Property   string ImageID {get;set;}
Labels          Property   System.Collections.Generic.IDictionary[string,string] Labels...
Mounts          Property   System.Collections.Generic.IList[Docker.DotNet.Models.MountP...
Names           Property   System.Collections.Generic.IList[string] Names {get;set;}
NetworkSettings Property   Docker.DotNet.Models.SummaryNetworkSettings NetworkSettings...
Ports           Property   System.Collections.Generic.IList[Docker.DotNet.Models.Port]...
SizeRootFs      Property   long SizeRootFs {get;set;}
SizeRw          Property   long SizeRw {get;set;}
State           Property   string State {get;set;}
Status          Property   string Status {get;set;}

Now that you have this module, you have two ways to run a container. Either by using:

docker run -it microsoft/nanoserver powershell

or by using Invoke-ContainerImage (aliased as Run-ContainerImage):

Invoke-ContainerImage -ImageIdOrName microsoft/nanoserver:latest -Command powershell -Input -Terminal

which, at its best, can be shortened to:

Run-ContainerImage microsoft/nanoserver:latest powershell -In -T
None of the PowerShell syntaxes are as short as the 'legacy' one, but again, the produced object is what makes them worth using.
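
For instance, since Get-ContainerImage returns real objects, you can filter and reshape them without any string parsing; a quick sketch (the filter string is just whatever image name you happen to have pulled, and the SizeMB column assumes the Size property is expressed in bytes):

Get-ContainerImage |
    Where-Object { $_.RepoTags -match 'nanoserver' } |
    Select-Object ID, Created, @{Name='SizeMB';Expression={[math]::Round($_.Size / 1MB)}}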

I hope you have enjoyed this first post on the PowerShell module for the Docker Engine, which brings close integration between what not so long ago were distant worlds. Stay tuned for more.

Friday, November 4, 2016

Announcing the winner of the PowerShell Oneliner Contest 2016

I am excited to announce the winner of the second PowerShell Oneliner Contest. But before I do it, let me tell you one thing. This year, I received over ninety submissions from wannabe PowerShell Monks from all over the world. Some solutions stood out as the most striking and imaginative entries. Some others were not successful in achieving what I asked, but showed a lot of effort in learning and initiative. Everybody seemed to understand that the aim of such a contest is not just to push PowerShell to its limit and beyond, by bending the command line to your will. It's a matter of generating knowledge and sharing it for others to learn from. Building code that can benefit the whole community is of paramount importance here.

THE WINNER IS...

So thanks to all of the entrants and, without further ado, let's have a look at the winning solution, by Sam Seitz, with 65 chars:

([char[]](71..89)|?{!(gdr $_)2>0}|sort{[guid]::newguid()})[0]+':'
#Posted by Sam Seitz to Happy SysAdm at October 25, 2016 at 8:02 AM

I got in touch with Sam so he could share a bit about himself and his thought process for the script.

Two years into his IT career, Sam is a 25-year-old systems engineer for Network Technologies, Inc., an MSP in Olathe, KS. He spends his days perpetually amazed that his employer pays him to "play with computers" (as his father would say). Outside of work, his incredible wife and their pair of regal beagles keep him happier than a man has any right to be.

Seeing this challenge made me realize two things: 1) off the top of my head I know, maybe, five default aliases and 2) I should really start using more aliases. I'm normally extremely verbose in my scripting, so this proved to be a unique challenge. To keep it as short as possible, I used a few interesting techniques, which I'll break down step-by-step:
([char[]](71..89)
To generate the array of letters from G to Y, I took advantage of the fact that 71 through 89 is G through Y in the ASCII table. When cast as a [char], 71 is G, 72 is H, etc.
?{!(gdr $_)2>0}
I then filtered out occupied drive letters by using Where-Object (?), the alias for the "-not" operator (!), and Get-PSDrive (gdr). 2>0 redirects the inevitable error output to null. (If you don't mind seeing each error Get-PSDrive throws when it's used with a non-existent drive, the 2>0 could be removed as it isn't necessary for the success of the one-liner. But who likes all those ugly red errors on their screen? Terrorists, that's who.)
sort{[guid]::newguid()}
In order to ensure the result was random, I used Sort-Object (sort) on the array of letters and told it to sort by the new GUID created using the .Net method [guid]::newguid().
[0]+':'
Finally, I selected the first result [0] from the array of randomly sorted available drive letters in the output and threw a colon on the end (+':').

Thanks to Sam for sharing his deep knowledge of PowerShell with us. For those interested, I created a Gist with a list of working solutions I got, sorted by line length.

COMMON ERRORS

Now in the following section I will explain why I could not accept some entries.

The most common error by far was the use of Random as an alias of Get-Random. Though I understand the extreme difficulty of generating a random number without using Get-Random, I couldn't accept oneliners using Random as an alias for the simple reason that it is not one. It just works like an alias because the PowerShell interpreter, when it can't find a matching command, retries with the verb 'Get-' prepended. You can verify that no such alias exists (the following check returns nothing):

Get-Alias | ? {$_.Definition -match "Get-Random"}

Trace-Command can confirm that interpreter behavior:

Trace-Command -Name CommandDiscovery -PSHost -Expression { random }

...
DEBUG: CommandDiscovery Information: 0 : The command [random] was not found, trying again with get- prepended
DEBUG: CommandDiscovery Information: 0 : Looking up command: get-random
DEBUG: CommandDiscovery Information: 0 : Cmdlet found: Get-Random  Microsoft.PowerShell.Commands.GetRandomCommand

MY SOLUTIONS

Now a word about my solutions to the contest. I wrote four of them. Since they are pretty short, I am pleased to share them with you.

In the first solution I was actually able to get a random GUID to sort on by fetching it from the internet.

There are for sure many websites exposing an engine for GUID generation (https://www.uuidgenerator.net/ or https://www.guidgen.com/ for instance), but in our case we want the shortest URL possible, and I was lucky enough to find a website named guid.it. There's a funky cmdlet for getting stuff from the web: Invoke-RestMethod, whose alias is irm. Now the cool thing about the Internet is that nowadays many websites have adopted a JSON API to let consumers retrieve and manipulate their content using HTTP requests. And guid.it is one of them. Luck, again. So, doing the equivalent of:

Invoke-WebRequest http://www.guid.it/json -UseBasicParsing | ConvertFrom-Json

can be achieved in a simpler manner with:

Invoke-RestMethod http://www.guid.it/json

which can be shortened to:

irm guid.it/json

or even shorter:

irm guid.it/api

And there I have my random guid. Internet for the IT pro, I daresay.

For the rest, my first solution matches Sam's:
([char[]](71..89)|?{!(gdr $_)2>0}|sort{irm guid.it/api})[0]+':'

In my second solution, I leverage the .NET Framework's System.Random class, but instead of using [random]::New().Next() I went for a shorter ([random]@{}).Next():
do{$x=[char]([random]@{}).Next(71,89)}while(gdr($x)2>0)$x+':'
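
The saving here comes from the fact that casting a hashtable to a .NET type invokes its parameterless constructor and then sets any matching properties; with an empty hashtable you simply get a new instance. A quick comparison (illustrative only):

[random]::new().Next(71,89)    # PowerShell 5+ constructor syntax
([random]@{}).Next(71,89)      # same result, a few characters shorter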

My third solution relies on one of my favorite cmdlets, Select-String (aliased as sls), in conjunction with the [guid] type accelerator, whose NewGuid static method I call:
''+(ls function:[g-y]:|sls(gdr)-n|sort{[guid]::NewGuid()})[0]
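
Two facts make this work: Windows PowerShell pre-defines a function named after every drive letter (A: through Z:) in the function: drive, and Select-String stringifies whatever is piped into it, so -NotMatch (-n) drops the letters whose names match a drive that is already present. A quick look at the raw material (illustrative, the output varies per system):

ls function:[g-y]: | select -ExpandProperty Name    # G:, H:, ... Y: - the candidate letters
gdr | select -ExpandProperty Name                   # names of all current PSDrives, fed to sls as the exclusion patterns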

In my fourth and last solution I mixed my third and first solutions and was able to go down to 59 chars. It's a bit slower than the others because it fetches GUIDs from the Internet, but for the purpose of the contest this is the shortest solution I was able to come up with:
''+(ls function:[g-y]:|sls(gdr)-n|sort{irm guid.it/api})[0]

I have created a Gist with my solutions, which you can find here.

UNIT TESTING WITH PESTER

Now a word about how the testing of the posted oneliners went.

Since I was rapidly flooded with plenty of tricky oneliners, I was a bit daunted by the prospect of checking all of them manually against the contest rules. Fortunately I had already worked a bit with Pester to define test cases on other projects, so I just had to adapt what I knew to the contest I had just started.

Then, by sheer luck, I was contacted by Jakub, who proposed a complete solution for testing those oneliners.

I am glad to say that what Jakub came up with is just brilliant. So who better than him to explain his approach? Take it away, Jakub.

I have always loved one-liners. "Make the code as short as possible" is such a simple, yet so challenging restriction. Such a restriction does not exist in our day-to-day work: we care about readability, understandability and performance, but rarely about the length of our code. Putting this restriction in place and removing every other turns our usual focus on its head. For once we get to write code that is so unreadable and uses so many quirks of the language that we will need to explain it at least twice (if not three times). Finally we can put all the side notes we read in books to work, use all the features that we thought were bugs when we first saw them to push the language syntax to its limits, and then watch in awe as others produce solutions twice as short as ours.

For this reason I had to take part in the Oneliner contest of 2016 hosted by Carlo on his blog. Once I read the requirements I thought to myself: well, that's more than one requirement, what a nice opportunity to take this to another level and write some tests as well. And so I approached the whole problem in a kata-like way, which means not only taking my time to think about the problem itself, but also taking time to reason about the tests and the process of writing them. Now, since I know I have no way of winning the contest, especially after seeing how creative people were last year, I will at least walk you through my thought process.

First I read the requirements just to make sure they are quantifiable; what I mean by that is that I can measure whether each requirement was met. A quantifiable requirement is, for example, "contains no semicolon"; a non-quantifiable requirement (at least not an easily quantifiable one) would be "the code looks nice".

Once I made sure I would be able to write some tests for all of the requirements, I proceeded to categorize them and realized that they can be split into two categories: stylistic and functional. Stylistic is how the code should look, and functional is how the code should behave.

I started with the functional part of the tests as they seemed much simpler to implement.

### Test 1 - Outputs single string
The first decision I made was to store my one-liner as a script block. This enabled me to reuse the same script block in all the test cases, and it also made it very easy to change my one-liner.

The first test checks that the output of the oneliner is a single string. Pester has a built-in assertion `BeOfType`, which was my first choice, but then I realized that piping the output through the pipeline would unroll any array I might get, and I wouldn't be able to check whether I got just a single item or a whole array of items. So I went old-school and used the `-is` operator.

It "outputs single string" {
    (&$oneliner) -is [string] | Should be $True
}
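
The unrolling mentioned above is easy to reproduce (a quick illustration, not part of the test file):

$result = 'a:', 'b:'                            # an array of two strings
$result -is [string]                            # False: the value as a whole is an array
$result | ForEach-Object { $_ -is [string] }    # True, True: once piped, each element is tested separately
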

### Test 2 - Outputs one letter followed by colon
The next requirement forces me to match the text and specifies that it should be a letter followed by a colon. Text matching is easy with the `Match` assertion, which uses regular expressions. The only thing I had to watch out for was matching the start and end of the string, to make sure that no surrounding characters are matched.

It "Outputs single letter followed by colon" {
    &$oneliner | Should Match "^[a-z]\:$"
}

I decided to match the whole alphabet in this test to limit mixing the requirements. I find it a good practice to specify each requirement in one place without unnecessarily restricting other unrelated tests.

### Test 3 - Should exclude drives A-F and Z
Yet another requirement forces me to exclude some of the drive letters. I decided to use test cases to have a single test for each excluded letter, so I specified a list of test cases. This feature of Pester generates a single test per test case and also expands the name of the test to reflect the actual value of `$DriveLetter` for extra readability. The script block then declares a parameter I named $DriveLetter, which I use to write the assertion.

It "Should not output drive letter " -TestCases `
    @{DriveLetter = "a:"},
    @{DriveLetter = "b:"},
    @{DriveLetter = "c:"},
    @{DriveLetter = "d:"},
    @{DriveLetter = "e:"},
    @{DriveLetter = "f:"},
    @{DriveLetter = "z:"}{
    param ($DriveLetter)
        &$oneliner | Should Not Be $DriveLetter
}

### Test 4 - Drive should not be used
This test could not be easier. I used the `Exist` assertion, which I know uses `Test-Path` internally. Nothing else was needed here.

It "Resulting drive should not exist" {
    &$oneLiner | Should Not Exist
}

### Test 5 - Drive should be random
This test I found interesting, because randomness is something to avoid in tests as much as possible. Randomness can make a test fail from time to time, and those unexpected failures lower the trust we have in tests. But in this case I'll only be using the tests locally, so I decided to take the simplest route: run the code twice and compare the results. If the results are not the same, the output is probably "random". This is far from perfect, but in this simple case I can validate it by running the test multiple times. In a real production environment I'd run the code more than twice and compare the results.

It "Should be random" {
    &$oneLiner | Should Not Be (&$oneLiner)
}

Another interesting thing about this test is that I did not notice the randomness requirement at first and posted my solution without it, which automatically makes my solution incorrect :)

### Test 6 - Code should be error free
This test seemed straightforward because any terminating error (exception) in a Pester test makes the test fail. The difficult part was capturing non-terminating errors as well. I had to set the error action preference to `Stop` and also pipe to `Not Throw` to make the test behave correctly. That's something to be improved in the next version of Pester.

It "Should be error-free" {
    $errorActionPreference = 'Stop'
    $oneLiner | Should Not Throw
}

That was it for the functional tests. All of them were pretty easy to write, and there was not much to figure out. Next up were the stylistic tests, which were a bit more challenging, as I first needed to write some helper functions to avoid any ifs and for loops in the body of my tests.

### Test 7 - All cmdlets must have an alias
This was the most challenging test to write. There were two things that I needed to figure out. First, I needed a way to parse the code and find all the commands; for that I knew I could use PowerShell's parser, but I had to write and test the code that finds all the commands. The other thing was checking that all the found commands have aliases. I started with the tests for the parsing and then implemented the function:

Describe "Get-ScriptBlockCommand" { 
    It "Finds basic cmdlet" {
        Get-ScriptBlockCommand { Get-Date } | Should Be "Get-Date"
    }
    It "Finds basic alias" {
        Get-ScriptBlockCommand { gci } | Should Be "gci"
    }
    It "Finds multiple commands alias" {
        $actual = Get-ScriptBlockCommand { ps; get-process } 
        $actual[0] | Should Be 'ps'
        $actual[1] | Should Be 'get-process'
    }
    It "Ignores keywords" {
        Get-ScriptBlockCommand { if ($true) {} } | Should BeNullOrEmpty
    }
    It "Ignores other tokens" {
        Get-ScriptBlockCommand { $a = 10 ; $false } | Should BeNullOrEmpty
    }
}

function Get-ScriptBlockCommand ($ScriptBlock) {
     $tokens = [System.Management.Automation.PSParser]::Tokenize($ScriptBlock,[ref]$null)
     $tokens | where { $_.Type -eq 'Command' } | select -expand content
}
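
Running the helper against a throwaway pipeline shows what it returns (a hypothetical usage example):

Get-ScriptBlockCommand { gci | sort -Descending | select -First 1 }
# gci, sort, select - parameters and arguments are not command tokens and are ignored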

Then I followed up by looking up aliases and testing that every command has at least one:

Describe "Test-Alias" {
    It "Finds alias for basic cmdlet" {
        Test-Alias Get-ChildItem | Should Be $True
        Test-Alias Test-Path | Should Be $False        
    }

    It "Finds alias when given alias " {
        Test-Alias gci | Should Be $True
        Test-Alias ps | Should Be $True 
    }

    It "Returns true when all commands have aliases" {
        Test-Alias ("gci", "ps", "get-childItem") | Should Be $True
    }

    It "Returns false when any of the commands does not have an alias" {
        Test-Alias ("Test-path", "ps", "get-childItem") | Should Be $false
    }
}

function Test-Alias ([string[]] $Name) {
    end {
        $aliases = Get-Alias
        foreach ($n in $name) {
            if ($null -eq ($aliases | Where {$_.Name -eq $n -or $_.Definition -eq $n}))
            {
                return $false
            }
        }
        $true
    }
}
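
Used on its own, the helper behaves like this (illustrative calls, consistent with the tests above):

Test-Alias 'gci', 'ps'           # True: both are (or have) aliases
Test-Alias 'gci', 'Test-Path'    # False: Test-Path has no built-in alias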

Then I could finally proceed to writing the main test:

It "All used cmdlets have an alias" {
    $commands = Get-ScriptBlockCommand $oneliner
    Test-Alias $commands | Should Be $True
}

### Test 8 - Code must not contain a semicolon
And finally I finished with another primitive test, checking that a semicolon is nowhere to be found in my oneliner. The one-liner is not executed this time; rather, it is implicitly converted to a string and passed to the `Match` assertion.

It "contains no semicolons" {
    $oneliner | Should Not Match "\;"
}

And that was it for my testing. I hope you enjoyed the competition, and congratulations to the winners!!!
@nohwnd

Thanks again to all the competitors, to Mike F Robbins for the original function, to Sam Seitz for his brilliant solution and to Jakub Jares for showing us the way to functional testing. And remember, it was all about learning.

Monday, October 24, 2016

PowerShell Oneliner Contest 2016

A lot of time has passed since I last organized a PowerShell oneliner contest. So when I saw the post by fellow MVP and scripting champion Mike F Robbins on a PowerShell Function to Determine Available Drive Letters, I thought it could be fun to organize a contest to see who can write the shortest possible oneliner that achieves the same result as Mike's function.


As you can see reading his blog post, the function accepts parameters such as -Random, to return one or more available drive letters at random, or -All, to return all the available drive letters. It also allows you to exclude some letters from the match (A, B, C, D, E, F and Z) by means of an -ExcludeDriveLetter parameter.

Now, for this specific contest, what I want to get in a comment to this post is:
  • a oneliner (meaning in particular no semi-colon) that
  • returns one and only one random available drive letter on the system where it runs
  • with the exception of A-F and Z
  • whose object type is System.String (I'll check this with Get-Member)
  • and whose formatting is, say, G: or h: (case doesn't matter, we are on Windows)
For sure
  • aliases are mandatory, meaning that you can't use a cmdlet unless it has an alias
  • backticks are accepted for readability
  • you can use every PowerShell version, including 5.1, just state in the comment what version you tested it with
  • should you find a shorter oneliner to solve a task you are allowed to post additional comments (just remember to sign your comments so that I know who's who and so that I can get in touch with the winner)
A few more rules:
  • Entries (comments) will not be made public until after the submission deadline.
  • The first person to produce the shortest working solution to the task wins.
  • The winner will be announced on Friday, November 4th on this blog.
  • I'll be the only judge.
If you want to spread the word about this PowerShell contest, feel free to tweet about it. You can use the hashtags #poshcontest2016 and #powershell so that other competitors can share their thoughts (not their solutions, of course!).

UPDATE Nov 4 2016
We have a winner! Check it here.