Friday, February 21, 2014

Powershell, legacy tools and unmanaged code

Although Powershell in its latest version ships with a huge number of cmdlets (1602 under Windows 2012 R2), and the .Net Framework exposes almost anything a system administrator could dream of, there are times when you get stuck and must either fall back on what I loosely call 'legacy' executables, or call unmanaged APIs from within your code.

For example, a simple task like finding out which Windows process is holding which socket open is almost impossible for a beginner-to-intermediate Powershell developer.

Of course there is a cmdlet to list the running processes. It's called Get-Process and exposes many process properties: Process Id (PID), Process name, Handles, Non Paged Memory (NPM), Paged Memory (PM), Working Set (WS), Virtual Memory (VM) and many others, but not the ports each process holds open.
Handles  NPM(K)    PM(K)      WS(K) VM(M)   CPU(s)     Id ProcessName
-------  ------    -----      ----- -----   ------     -- -----------
     56       7     1796       7424    53     3.20    292 conhost
    201      11     1532       3612    42     1.98    344 csrss
     89       9     1260       3484    39     0.75    400 csrss
And of course, as the Scripting Guy explains, Powershell 4.0 has a cmdlet, named Get-NetTCPConnection, to show the properties of each TCP connection, such as the local or remote IP address, the local or remote port, and the connection state. But unfortunately the ID of the process that set up each connection is not fetched.
LocalAddress         LocalPort RemoteAddress     RemotePort State         AppliedSetting
------------         --------- -------------     ---------- -----          --------------
192.168.44.12        58849     192.168.44.14     389        Established    Datacenter
192.168.44.12        58845     192.168.44.15     389        Established    Datacenter
::1                  54765     ::1               135        Established    Datacenter
So, as far as I have found, if we stick to out-of-the-box Powershell cmdlets, there is no way to find the link between a process and a socket.

Looking into .Net, inside the System.Net.NetworkInformation namespace we can find an interesting class to explore: IPGlobalProperties. This class provides information about the network connectivity of the local computer and has:
  • a method GetActiveTcpConnections() that returns an array of TcpConnectionInformation objects holding three properties: LocalEndPoint, RemoteEndPoint and State
  • a method GetActiveTcpListeners() which returns, for each listener, the IP address of the EndPoint as well as the port number
Under Powershell you can query them straight away. Notice that we have to use :: to indicate to PowerShell that we want to work with a static method or property, like GetIPGlobalProperties:
[System.Net.NetworkInformation.IPGlobalProperties] | Get-Member -static

   TypeName: System.Net.NetworkInformation.IPGlobalProperties

Name                  MemberType Definition
----                  ---------- ----------
Equals                Method   static bool Equals(System.Object objA, System.Object objB)
GetIPGlobalProperties Method   static System.Net.Netwo...ion.IPGlobalProperties GetIPGlobalProperties()
ReferenceEquals       Method   static bool ReferenceEquals(System.Object objA, System.Object objB)

[System.Net.NetworkInformation.IPGlobalProperties]::GetIPGlobalProperties().GetActiveTcpConnections()

[System.Net.NetworkInformation.IPGlobalProperties]::GetIPGlobalProperties().GetActiveTcpListeners()
Unfortunately the PID of the process opening the socket is still missing. Too bad.

At this point, there are two options left. The first one is to fall back to using good old Netstat.exe and parse the result, as Shay Levy did in an old post. You can check his function on PoshCode.

Parsing the output of netstat is not an easy task, and Shay does a brilliant job of splitting the text output and turning it into objects, while adding the result of Get-Process to each connection so that the PID is shown. The resulting PSObject can be filtered, sorted and formatted to best suit our needs. The function code is explained on his blog, so I won't delve into it further; I just invite you to have a look at it, since it's a great workaround (if not a full-blown solution).
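To give an idea of the technique, here is a minimal sketch of my own (nowhere near Shay's complete function) that splits the text output of netstat -ano into objects and joins in the process name; it assumes the default four header lines and the standard TCP row layout of netstat:
netstat -ano | Select-Object -Skip 4 | ForEach-Object {
    $fields = $_.Trim() -split '\s+'
    if ($fields[0] -eq 'TCP') {
        [PSCustomObject]@{
            LocalAddress   = $fields[1]
            ForeignAddress = $fields[2]
            State          = $fields[3]
            PID            = $fields[4]
            # Resolve the owning process name from the PID
            ProcessName    = (Get-Process -Id $fields[4] -ErrorAction SilentlyContinue).Name
        }
    }
}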

But how come we can't retrieve the same information netstat does when used with the -ano parameters? Well, the answer is simple: on a Windows system there is managed code and unmanaged code.
Output of netstat -ano, with PID
To put it simply (I am not a developer), Managed Code is code that runs within the .Net Framework's Common Language Runtime (CLR) and benefits from the services provided by the CLR itself. On the contrary, Unmanaged Code is code that runs outside the CLR: COM components, ActiveX components, and Win32 API functions are all examples of Unmanaged Code.

One of these functions, GetExtendedTcpTable, which lives inside the Internet Protocol Helper (IP Helper) API, is the one that Netstat probably uses to find out the hidden link between processes and sockets.

These Win32 API functions can't be accessed directly from inside Powershell because there is no wrapper for them in the .Net Framework.

Fortunately it is possible to call Win32 and other unmanaged APIs from managed code (like C#) by using Platform Invoke (P/Invoke): the Common Language Runtime finds and loads the required Win32 API DLL into memory, then invokes the desired function.

As Lee Holmes explains on his blog, there are a few ways to access a Win32 API from Powershell. One of them consists in using the Add-Type cmdlet, which takes in the C# signature of a function (GetExtendedTcpTable in our case, whose signature is here) and builds a new class on it.
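To illustrate the Add-Type pattern with something shorter, here is a minimal sketch that wraps a much simpler Win32 function, GetTickCount64 from kernel32.dll; GetExtendedTcpTable needs a far longer signature with structures and pointers, so for the real thing see the blog post linked below:
# C# signature of the unmanaged function we want to call
$signature = @'
[DllImport("kernel32.dll")]
public static extern ulong GetTickCount64();
'@
# Compile the signature into a new .Net class and keep a reference to it
$NativeMethods = Add-Type -MemberDefinition $signature -Name 'NativeMethods' -Namespace 'Win32Demo' -PassThru
# Call the unmanaged function: milliseconds elapsed since last boot
$NativeMethods::GetTickCount64()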

This is a pretty complex process, which goes well beyond the scope of this system administration blog, but I want all the same to share the link to Justin's blog; he did an excellent job of importing iphlpapi.dll and writing a Get-Netstat wrapper function.
Add-Type cmdlet and C# signature
Both Shay's and Justin's solutions to the netstat problem are good and very educational, and both have pros and cons.

Accessing a function inside an unmanaged API is a rewarding but dangerous task, since you are leaving the comfort zone your Powershell interpreter provides. Be careful: you shouldn't be doing this unless you have very good Windows Internals and C# skills.

Shay's approach is brilliant, since it shows how strong Powershell is at handling raw text and beautifying it into a manageable object. I liked in particular the way he checks whether a string is an IPv6 address inside an IF block:
if (($la = $item[1] -as [ipaddress]).AddressFamily -eq 'InterNetworkV6')
{
}
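The -as operator is worth a note here: a failed cast returns $null instead of throwing an exception, so the same expression doubles as input validation. A couple of illustrative lines:
('::1' -as [ipaddress]).AddressFamily        # InterNetworkV6
('10.0.0.1' -as [ipaddress]).AddressFamily   # InterNetwork
'not an ip' -as [ipaddress]                  # returns $null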
I really like Powershell, because I never stop learning. Stay tuned for more!

Thursday, January 23, 2014

Passed exam 74-409 on Server Virtualization with Hyper-V and System Center 2012 R2

Today I took my first exam of the year on Microsoft Server Virtualization and succeeded in getting a quite good score (942/1000). This earned me the certification "Microsoft Certified Specialist: Server Virtualization with Hyper-V and System Center".

My thoughts:
  • The exam itself was not too difficult: if you have at least some hands-on experience with the Microsoft System Center suite, and if you are already (like me) a VMWare Certified Professional (VCP), you'll see that a strong knowledge of everything virtualization is about helps a lot. As Brad Anderson explained on his blog last October, there is an increasing demand for Hyper-V experts, and being comfortable with both Microsoft and VMWare platforms can boost your career.
  • I have learned a lot from the labs I built to prepare for this exam, since the exam comes with a lot of real-life situations. Labs, labs, labs!
  • I am very happy to see so many Powershell questions on this exam, which confirms (if you didn't know) that my favorite administration language has become a central building block of the whole Microsoft strategy.
  • There are a few tricks in the wording, and you may be misdirected toward points that are not essential to the question being asked, so be careful.
  • The 2-day show by Symon (@SymonPerriman) and Cory (@holsystems) on MVA is your best bet for success, since they cover most of the exam topics.
  • There is a virtual lab for the exam on TechNet.
  • Last but not least, Bjorn Houben has collected lots (all of them?) of resources on his blog. Check it out!

Good luck to everybody sitting this exam in the future!

Friday, January 17, 2014

First steps in Windows Azure with Powershell

A few days are left before the start of the first event of the 2014 Powershell Winter Scripting Games. In the meantime I have decided to write a blog post on taking your first steps in a Windows Azure environment with Powershell. This can be particularly useful if you want to take part in the Games and still don't have an environment with Powershell 4.0 installed, since you can create a few Windows 2012 R2 VMs with the following quick steps.

As an introduction, know that these days there is a growing interest in Cloud technologies, and Microsoft has answered the need for outsourced infrastructures with the possibility to run your Infrastructure as a Service (IaaS) in its Cloud datacenters. Windows Azure is the name Microsoft gave back in 2008 to its Cloud application platform, which became available in February 2010.

Microsoft is offering a one-month free trial, so activate your subscription and download and install the latest version of Windows Azure Powershell, which is 0.7.2.1 at the time of writing.


The package will install a new module for Azure:
Get-Module Azure -ListAvailable

    Directory: C:\Program Files (x86)\Microsoft SDKs\Windows Azure\PowerShell

ModuleType Name                  ExportedCommands
---------- ----                  ----------------
Binary     Azure                 {Disable-AzureServiceProjectRemoteDesktop, E
At this point you don't even need to import the module, since starting with Powershell 3.0 there is a module auto-load feature: just type the cmdlet you need and PowerShell will load the Azure module for you.

For general information, the latest version of the Azure module comes with 242 cmdlets. The most common nouns in this module are the following:
gcm -Module azure | group noun | sort count -desc | select name, count -first 10
Name                               Count
----                               -----
AzureVM                                9
WAPackVM                               9
AzureWebsite                           8
AzureService                           6
AzureDeployment                        5
AzureVMImage                           5
AzureSqlDatabaseServerFirewallRule     4
AzureAclConfig                         4
AzureStorageAccount                    4
AzureVNetGateway                       4
The next step is to configure Windows Azure Active Directory authentication in PowerShell with Add-AzureAccount (this is much easier than using the combination of Get-AzurePublishSettingsFile and Import-AzurePublishSettingsFile). For basic usage, this cmdlet takes no parameters: it just opens a browser dialog box asking for the Microsoft Account you registered to manage your subscriptions.
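In its basic form the call is just a single line that pops up the sign-in dialog:
Add-AzureAccount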


You can check that you have properly bound to your account with Get-AzureAccount:
Get-AzureAccount

Name                              ActiveDirectories
----                              -----------------
happysysadm.com                   {{ ActiveDirectoryTenantId = 5e649293-9...
and with Get-AzureSubscription:
Get-AzureSubscription

SubscriptionName           : Visual Studio Ultimate con MSDN
SubscriptionId             : xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
ServiceEndpoint            : https://management.core.windows.net/
ActiveDirectoryEndpoint    : https://login.windows.net/
ActiveDirectoryTenantId    : 5e649293-9842-111a-b2ab-d234d8cc5f54
IsDefault                  : True
Certificate                :
CurrentStorageAccountName  : 
CurrentCloudStorageAccount :
ActiveDirectoryUserId      : happysysadm.com
As you can see in the output of the last cmdlet, the CurrentStorageAccountName is empty. If you tried to build a new virtual machine at this moment you would get the following error:
New-AzureQuickVM : CurrentStorageAccountName is not accessible. Ensure the current storage account is accessible and in the same location or affinity group as your cloud service.
The solution is to associate your Azure storage account with your subscription prior to deploying any new VM:
Set-AzureSubscription -SubscriptionName (Get-AzureSubscription).SubscriptionName -CurrentStorageAccount (Get-AzureStorageAccount).label
Once your subscription is filled in with all the proper information, nothing is easier than using New-AzureQuickVM to create and provision your new Windows Azure virtual machines.

Here's the syntax for this cmdlet as extracted from the help:
man New-AzureQuickVM

NAME
    New-AzureQuickVM

SYNOPSIS

SYNTAX
    New-AzureQuickVM -ImageName <String> -Linux -LinuxUser <String> -Password <String> -ServiceName <String>
    [-AffinityGroup <String>] [-AvailabilitySetName <String>] [-DnsSettings <DnsServer[]>] [-HostCaching <String>]
    [-InstanceSize <String>] [-Location <String>] [-MediaLocation <String>] [-Name <String>] [-SSHKeyPairs
    <SSHKeyPairList>] [-SSHPublicKeys <SSHPublicKeyList>] [-SubnetNames <String[]>] [-VNetName <String>]
    [<CommonParameters>]

    New-AzureQuickVM -AdminUsername <String> -ImageName <String> -Password <String> -ServiceName <String> -Windows
    [-AffinityGroup <String>] [-AvailabilitySetName <String>] [-Certificates <CertificateSettingList>]
    [-DisableWinRMHttps] [-DnsSettings <DnsServer[]>] [-EnableWinRMHttp] [-HostCaching <String>] [-InstanceSize
    <String>] [-Location <String>] [-MediaLocation <String>] [-Name <String>] [-NoExportPrivateKey] [-NoWinRMEndpoint]
    [-SubnetNames <String[]>] [-VNetName <String>] [-WaitForBoot] [-WinRMCertificate <X509Certificate2>]
    [-X509Certificates <X509Certificate2[]>] [<CommonParameters>]


DESCRIPTION
    The New-AzureQuickVM cmdlet sets the configuration for a new virtual machine and creates the virtual machine. It
    can create a new Windows Azure service, or deploy the new virtual machine into an existing service if neither
    -Location nor -AffinityGroup is specified.


RELATED LINKS
    Get-AzureVM
    New-AzureVMConfig
    Remove-AzureVM
    New-AzureVM

REMARKS
    To see the examples, type: "get-help New-AzureQuickVM -examples".
    For more information, type: "get-help New-AzureQuickVM -detailed".
    For technical information, type: "get-help New-AzureQuickVM -full".
    For online help, type: "get-help New-AzureQuickVM -online"
Before we deploy a new test virtual machine, I want to make a short digression on the VM sizes you can choose from. There are eight possible sizes, ranging from Extra Small to A7:


As you can see in the image above, Small is the minimum recommended size for a Production VM (and with the free trial you can run two of them for a whole month), while Large is the minimum for a SQL node.

Keep in mind that the monthly cost for each of these virtual machine sizes is different, so plan according to your budget:
  • XS 11€ per VM per month
  • Small 49€
  • Medium 99€
  • Large 199€
  • A5 221€
  • ExtraLarge 398€
  • A6 443€
  • A7 886€

Let's move on to the provisioning of your first VM with just a one-liner:
New-AzureQuickVM -Windows -ServiceName 'cloudoftheday' -Name 'cloudvm01' -ImageName (Get-AzureVMImage | Where Label -Like "Windows Server 2012 R2 Datacenter")[-1].ImageName -AdminUsername 'happysysadm' -Password 'VerySecurePassw0rd' -InstanceSize Small -Location 'west europe'
Once the cmdlet ends, your VM is already available for use, with the OS installed and RDP and WinRM enabled on their standard ports. Public ports for these services are also opened through a Port Address Translation (PAT) mechanism.
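If you are curious about those endpoints, here is one way (using the service and VM names from the example above) to inspect them by piping the VM object to Get-AzureEndpoint; the Port property is the public PAT port, LocalPort the private one:
Get-AzureVM -ServiceName 'cloudoftheday' -Name 'cloudvm01' |
    Get-AzureEndpoint |
    Select-Object Name, Protocol, Port, LocalPort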


Now, I haven't been successful in using the New-AzureQuickVM cmdlet against an existing service. The cmdlet fails with the following error message:
"New-AzureQuickVM : ResourceNotFound: The deployment name 'existingcloud' does not exist."
It must be a bug, since the cmdlet help says that I can specify an existing service name:
-ServiceName
Specifies the new or existing service name.
So for the moment you have to stick to creating a new Service Name each time you run this cmdlet...

As a side note, I am deploying my VMs using 'West Europe' as the location (the 'West Europe' datacenter is in Amsterdam). The Microsoft Public Cloud offers various possible locations, and you should choose the one nearest to you:
Get-AzureLocation

AvailableServices    : {Compute, Storage, PersistentVMRole, HighMemory}
DisplayName          : East Asia
Name                 : East Asia
OperationDescription : Get-AzureLocation
OperationId          : f89f3076-6587-59ba-b9ff-4e47e4658c82
OperationStatus      : Succeeded

AvailableServices    : {Compute, Storage, PersistentVMRole, HighMemory}
DisplayName          : Southeast Asia
Name                 : Southeast Asia
OperationDescription : Get-AzureLocation
OperationId          : f89f3076-6587-59ba-b9ff-4e47e4658c82
OperationStatus      : Succeeded

AvailableServices    : {Compute, Storage, PersistentVMRole, HighMemory}
DisplayName          : North Europe
Name                 : North Europe
OperationDescription : Get-AzureLocation
OperationId          : f89f3076-6587-59ba-b9ff-4e47e4658c82
OperationStatus      : Succeeded

AvailableServices    : {Compute, Storage, PersistentVMRole, HighMemory}
DisplayName          : West Europe
Name                 : West Europe
OperationDescription : Get-AzureLocation
OperationId          : f89f3076-6587-59ba-b9ff-4e47e4658c82
OperationStatus      : Succeeded

AvailableServices    : {Compute, Storage, PersistentVMRole, HighMemory}
DisplayName          : East US
Name                 : East US
OperationDescription : Get-AzureLocation
OperationId          : f89f3076-6587-59ba-b9ff-4e47e4658c82
OperationStatus      : Succeeded

AvailableServices    : {Compute, Storage, PersistentVMRole, HighMemory}
DisplayName          : West US
Name                 : West US
OperationDescription : Get-AzureLocation
OperationId          : f89f3076-6587-59ba-b9ff-4e47e4658c82
OperationStatus      : Succeeded
Now the question that comes to my mind is whether I can deploy many VMs at the same time using a Workflow in Powershell 4.0. Let's find out. I have prepared the following workflow as an example:
WorkFlow Deploy-AzureVm {

    [cmdletbinding()]
    param(
        [Int]$Quantity = 5,
        [String]$ImageName = "Windows Server 2012 R2 Datacenter",
        [String]$ServiceName,
        [String]$Prefix,
        [String]$AdminUsername,
        [String]$Password,
        [String]$InstanceSize
    )

    # Retrieving the image name in the Windows Azure repository
    $Image = (Get-AzureVMImage | Where Label -Like $ImageName)[-1].ImageName

    "Deploying first VM with Location parameter"
    New-AzureQuickVM -Windows -ServiceName $ServiceName `
        -Name "$Prefix" -ImageName $Image `
        -AdminUsername $AdminUsername -Password $Password -InstanceSize $InstanceSize -Location "west europe"

    foreach -parallel ($VM in 2..$Quantity) {
        "Deploying VM $Prefix$VM"
        New-AzureQuickVM -Windows -ServiceName $ServiceName `
            -Name "$Prefix$VM" -ImageName $Image `
            -AdminUsername $AdminUsername -Password $Password -InstanceSize $InstanceSize
    }

}
Once I run it I get the following problem:
Deploy-AzureVm -Quantity 3 -ImageName "Windows Server 2012 R2 Datacenter" -ServiceName "WorkflowCloud007" `
>> -Prefix "VMcloud16Jan" -AdminUsername 'happysysadm' -Password 'VerySecurePassw0rd' -InstanceSize Small
>>
Deploying first VM with Location parameter

PSComputerName        : localhost
PSSourceJobInstanceId : de553192-bd99-4c46-be19-505f0007e819
OperationDescription  : New-AzureQuickVM
OperationId           : 2438374c-egb1-529d-af64-ce5b879ee3e1
OperationStatus       : Succeeded

PSComputerName        : localhost
PSSourceJobInstanceId : de553192-bd99-4c46-be19-505f0007e819
OperationDescription  : New-AzureQuickVM
OperationId           : 2438374c-egb1-529d-af64-ce5b879ee3e1
OperationStatus       : Succeeded

Deploying VM VMcloud16Jan3
Deploying VM VMcloud16Jan2
New-AzureQuickVM : ConflictError: Windows Azure is currently performing an operation with x-ms-requestid
123a90e4325857d986b25524987b7d9b on this deployment that requires exclusive access.
At Deploy-AzureVm:20 char:20
+
    + CategoryInfo          : CloseError: (:) [New-AzureQuickVM], CloudException
    + FullyQualifiedErrorId : Microsoft.WindowsAzure.Commands.ServiceManagement.IaaS.PersistentVMs.NewQuickVM
    + PSComputerName        : [localhost]

PSComputerName        : localhost
PSSourceJobInstanceId : de553192-bd99-4c46-be19-505f0007e819
OperationDescription  : New-AzureQuickVM
OperationId           : 2438374c-egb1-529d-af64-ce5b879ee3e1
OperationStatus       : Succeeded
It looks like operations such as virtual machine provisioning or deletion hold exclusive access on the deployment engine and don't allow parallel virtual machine setup in the same Windows Azure environment. I haven't found a solution to this, and that's sad, because I liked the idea of deploying a whole IaaS in just one Powershell Workflow. But I am sure the technology under the hood is making huge steps forward, and I would expect a feature like Powershell Workflow to be fully leveraged in the first major release of the Azure module. Meanwhile we have to stick to serial execution of our VM provisioning, which is nonetheless extremely simple.

Performance-wise, the speed of deployment of a new VM in the Cloud is pretty stunning, with the VHD disks deployed in a matter of seconds and the VM started in a couple of minutes:
Measure-Command -expression {
New-AzureQuickVM -Windows -ServiceName 'cloudofthedayxx001' -Name 'abc01' -ImageName (Get-AzureVMImage | Where Label -Like "Windows Server 2012 R2 Datacenter")[-1].ImageName -AdminUsername 'carlo' -Password 'VerySecurePassw0rd' -InstanceSize Small -Location 'west europe'
}

Days              : 0
Hours             : 0
Minutes           : 1
Seconds           : 12
Milliseconds      : 422
Ticks             : 724220786
TotalDays         : 0,000838218502314815
TotalHours        : 0,0201172440555556
TotalMinutes      : 1,20703464333333
TotalSeconds      : 72,4220786
TotalMilliseconds : 72422,0786
That's all for this first post on Windows Azure and the Windows Azure Powershell module. I hope you have found the subject fascinating, and if so I hope you'll share it and give feedback. Stay tuned for more, and good luck with the Games if you're in (I hope you are).

Thursday, January 9, 2014

Filtering left and error trapping in Powershell

The Scripting Games have started and I am pleased to see that a few teams have already published interesting approaches to the first test event. Nonetheless I see that some people still persevere with some of the common mistakes beginners make.

In this blog post, I want to shed some light on two of these common mistakes.

The first one it’s not really a mistake but a bad habit. Let's see why. In the first test event you are asked to perform some quite complex computer inventory tasks. As you know, inventorying Windows-based computers generally passes through WMI queries through the use of the Get-WmiObject cmdlet:
Get-WmiObject -Class Win32_Service -ComputerName (Get-Content serverlist.txt)
Some people naïvely tend to remove unwanted objects from the pipeline using the Where-Object cmdlet.
But, and this is very important, when you are retrieving a lot of information from many remote servers and piping the resulting objects to Where-Object, you can run into performance problems due to the huge amount of data your station has to analyze and keep or discard.
The tip here (which Don Jones named 'Filter Left' in his training) is to make use of the -Filter parameter, which is available on many Powershell cmdlets:
Get-Command -ParameterName filter

CommandType     Name
-----------     ----
Cmdlet          Add-Content
Cmdlet          Clear-Content
Cmdlet          Clear-Item
Cmdlet          Clear-ItemProperty
Cmdlet          Copy-Item
Cmdlet          Copy-ItemProperty
Cmdlet          Get-Acl
Cmdlet          Get-ChildItem
Cmdlet          Get-Content
Cmdlet          Get-Item
Cmdlet          Get-ItemProperty
Cmdlet          Get-Job
Cmdlet          Get-WmiObject
Cmdlet          Get-WSManInstance
Cmdlet          Invoke-Item
Cmdlet          Move-Item
Cmdlet          Move-ItemProperty
Cmdlet          New-ItemProperty
Cmdlet          Remove-Item
Cmdlet          Remove-ItemProperty
Cmdlet          Remove-Job
Cmdlet          Rename-ItemProperty
Cmdlet          Resume-Job
Cmdlet          Set-Acl
Cmdlet          Set-Content
Cmdlet          Set-Item
Cmdlet          Set-ItemProperty
Cmdlet          Stop-Job
Cmdlet          Suspend-Job
Cmdlet          Test-Path
Cmdlet          Wait-Job
Using this -Filter parameter boosts your WMI query performance by letting the WMI service itself on the remote server do the homework of filtering information.
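Applied to the inventory query above, the 'Filter Left' approach looks like this; the WQL filter is evaluated by the WMI service on each remote server, so only matching services travel back over the network (the filter values are just an example):
Get-WmiObject -Class Win32_Service -ComputerName (Get-Content serverlist.txt) `
    -Filter "State = 'Running' AND StartMode = 'Auto'"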

A quick script allows us to verify this statement, even on local queries:
# Note: TotalMilliseconds returns the full elapsed time, while the Milliseconds property would only return the sub-second component of the TimeSpan.
$FilterDuration = [int](Measure-Command -Expression {Get-ChildItem -Path $env:windir\system32 -Filter *.dll}).TotalMilliseconds
$WhereDuration = [int](Measure-Command -Expression {Get-ChildItem -Path $env:windir\system32 | Where-Object Extension -eq ".dll"}).TotalMilliseconds
"Filtering with -Filter: $FilterDuration ms"
"Filtering with Where-Object: $WhereDuration ms" 

Filtering with -Filter: 239 ms
Filtering with Where-Object: 803 ms 
There you go: unsurprisingly we get around 240 milliseconds for -Filter versus more than 800 milliseconds for Where-Object.

Let me move on to the second mistake, which is not knowing that you have to explicitly set the -ErrorAction parameter to Stop in your WMI queries if you want to trap errors using a Try {} construct.

This is because some exceptions returned by WMI queries aren't terminating errors.

For example, the following script returns a big red error message, as if the Catch {} block had been skipped:
Try {
    get-wmiobject -class "Win32_PhysicalMemory" -computername Ghostserver
    }
Catch {
    "WMI query failed..."
    }

Get-WmiObject : The RPC server is unavailable. (Exception from HRESULT: 0x800706BA) 
While the following one properly catches the error and shows the predefined error message:
Try {
    get-wmiobject -class "Win32_PhysicalMemory" -computername Ghostserver -ErrorAction Stop
    }
catch {
    "WMI query failed..."
    }
The major advantage of turning your WMI errors into terminating exceptions and writing nicely composed error messages to the host is that you show that you master your code and that your script is designed well enough to cope with unforeseen issues.
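As a finishing touch, you can enrich the message in the Catch {} block with the text of the underlying exception, which is available through the $_ automatic variable:
Try {
    Get-WmiObject -Class "Win32_PhysicalMemory" -ComputerName Ghostserver -ErrorAction Stop
    }
Catch {
    # $_ holds the ErrorRecord of the exception we just caught
    "WMI query failed: $($_.Exception.Message)"
    }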

Hope this helps.

Wednesday, January 1, 2014

I am a Coach for the 2014 Winter Scripting Games

The 2014 Powershell Winter Scripting Games are just about to start. I am proud to say that I have been selected by 2013 Scripting Games winner and Head Coach Mike F Robbins to act as an expert Coach for this edition and I will do my best to offer constructive feedback to all teams that post their script files to the Scripting Games website.
Here's a quick list of things you have to know and/or keep in mind during these Games:
  • The site of the Games is ScriptingGames.org
  • The rules of the games are in this PDF file.
  • There will be a total of 4 official events for these Winter Scripting Games: January 19th, January 26th, February 2nd, & February 9th. The official schedule is here.
  • For official news, regularly check out the Scripting Games announcement Category
  • MVP Richard Siddaway will be the Head Judge. Richard also authored the practice event and event 4, titled 'Monitoring Support'.
  • MVP Mike F Robbins will be the Head Coach.
  • Event 1, titled 'Pairs' has been authored by Ed Wilson.
  • Event 2, titled 'Security footprint' has been authored by Lee Holmes. 
  • Event 3, titled 'Acl, Cacl, Toil and Trouble' has been authored by Jeff Wouters.
  • You cannot play alone: the events are complex and challenging, and you can't tackle them on your own. Try to think more like colleagues and less like individuals.
  • You have to build a team of 2-6 people, or join an existing one.
  • The system can help you find a team based on your timezone.
  • Make sure your team has submitted only one entry when the deadline comes (all times UTC).
  • Registration and team formation for the 2014 Winter Scripting Games will begin on January 2nd, which is in a few hours, and a practice event will take place.
  • Regularly upload your team's latest version of the script to the Scripting Games website, so that Coaches like me can offer commentary in the private in-Game discussion thread.
  • Coaches won't be able to help if you don't post your entries during the week before the event closes.
  • Coaches' comments are flagged so teams can easily spot them.
  • Get a test environment and make sure you are running the latest version of Powershell (which is 4.0), as suggested in the FAQ.
  • Have a look at this interesting read on the appropriate use of comments.
  • Read my last post on the use of hashtables, dictionaries and objects.
  • Boe Prox does an excellent job of explaining the best way to make your code neat and clear, through the use of variables that make sense, proper error handling, and comment-based help. Check it out here.
  • Modular development is encouraged.
  • Check out this interactive Git intro, if you want to use GitHub as your collaboration tool.
People to follow on Twitter during the Games: the list provided by Mike on his blog (Introducing the Coaches of the 2014 Winter Scripting Games), as well as the Twitter hashtag #pshgames.

Remember: it's a terrific occasion to learn PowerShell techniques that will help you master the language. Have fun. Good luck!

Tuesday, December 31, 2013

My review on IT facts of 2013

It's December 31st, 2013. It's been a good year for innovation on all fronts with a lot of interesting products, services and releases of my favorite applications, so I want to write a post to talk about the IT things that have attracted my attention during these last 12 months.

First of all I am positively impressed by the release of the latest Nokia tablet, the Lumia 2520, as well as by the release of the Microsoft Surface 2. Both these tablets are well-engineered and nicely built. The Nokia Lumia 2520, featuring a Qualcomm Snapdragon 800 CPU, 2GB of RAM and a 10.1" display, impresses me with its refreshing design and cool colors. On the other side, the Microsoft Surface 2, featuring a quad-core ARM Cortex-A15, 2GB of RAM and a 10.6" display, despite its industrial design, inspires me more thanks to its USB 3.0 port, its kickstand (especially useful when watching movies), its crisp text during web browsing and, from a general point of view, its versatility.

Speaking of these two tablets, I must say that I am really in love with the latest Microsoft operating system, Windows 8.1, and its ARM-oriented version, Windows RT 8.1, which you will find on those two tablets. This OS is really powerful, modern, colorful and, despite all you might have heard, much better than good old Windows 7.

I have been using Windows 8.1 for a few months now and I really can't live without the Metro interface and its smart search engine, which looks for files and/or applications on your local hard drive, in your Skydrive folder and on the web (via Bing, naturally) at the same time.

Windows 8.1 omnisearch
My piece of advice here: forget the Start button and move to the Metro interface as fast as you can. Windows 8 (and 8.1) is more than just a new way of thinking about the desktop interface for Microsoft; it is a major bet that touch is the future of computing, and you had better get accustomed to the change. Don't worry about the Windows App Store, because it will surely grow in size to match those by Google and Apple, and all the major apps are already there: TripAdvisor, Facebook, NetFlix, Twitter, Skype, Shazam, TuneIn Radio, and the brilliant new Microsoft RDP (kudos for this from every Windows IT guy out there).

For the moment the only negative point I have found in Windows 8.1 is that background apps temporarily stop until they are brought back to the foreground. The technical cause? Windows 8 and Windows 8.1 have been built to be efficient and perform well. One of the ways this is accomplished is by suspending any application that's not currently on the screen. When you replace an app on the screen, that app is kept in RAM but is no longer processed. This allows the user to keep a large number of apps open without degrading performance or reducing the oh-so-important battery life. The downside of this life-cycle is that when your app isn't on the screen, it's no longer running. I find this particularly disturbing when listening to YouTube videos that suddenly stop when I move them to the background. The only workaround I have found for the moment is to open Internet Explorer 11 in Desktop Mode (Windows Key + R, then type iexplore). This actually works, but feel free to suggest in the comments any other workaround that worked for you.

At the end of 2013 it looks like everyone is moving to the Cloud. Me too. I have explored many ways of storing my data in a Cloud solution and have been disappointed most of the time so far. I started by testing the solution of a rapidly growing German start-up named Owncloud, and was put off by the number of bugs, which make this application unsafe for storing data.

OwnCloud user interface
The idea behind Owncloud is pretty cool: to give users the possibility to have (and therefore own and manage) their own on-premise Cloud server. Unfortunately the community behind Owncloud is quite small, and most of the expert users are oriented toward installing the engine on Linux and can't help newcomers install it in a Windows environment. Also, each update is risky, and if you have large amounts of data stored in the database, performance decreases. A last not-so-good point: there is an app for Android smartphones, but it does an average job of syncing files, and does not sync the calendar at all, which is one of the basic features I would expect from their solution. Anyhow, Owncloud 6 has just been released and I hope that the Owncloud founder Frank Karlitschek (@fkarlitschek) will broaden the support for Owncloud on Windows platforms.

I have of course also tested free apps, like Dropbox or Skydrive. Both are brilliant but, apart from the well-known privacy problems, I find the amount of free storage really small in our technology era, where data gets deduplicated on hosting servers (7GB for Skydrive is way too limited, and not everyone is willing to buy a Windows Phone to get 20GB more).

Concerning Skydrive, I have been tempted by the nice interface of this Public Cloud solution and decided to give the Pro version a try. I set up a full-grown SharePoint 2013 environment sitting on the new Windows 2012 R2 operating system, and activated the Skydrive Pro replica between my folders and this on-premise cloud. Everything worked well, much better than with Owncloud anyway, but I felt let down when I discovered that:
- the picture library does not sync
- you can sync up to a maximum of 20k items in your personal SkyDrive Pro document library
- you can't sync more than 5k items in other SharePoint libraries

So, even if this idea of storing files in the Cloud looks good, I will wait for a friendly, generous (in terms of gigabytes and services) and solid solution, be it on-premise or off-premise, before I change the way I store my data. I will let the big competitors in this field (read Microsoft, Amazon, Google and a few others) fight for the win and eventually choose the most valid Cloud Storage solution.

And if I am not yet completely ready for the Cloud, I am years away from having my washing machine or my fridge connected to the Internet of Things, nor can I imagine my neighbor debugging IPv6 addressing problems on his dishwasher. So, despite what has been said, there are other buzzwords in my head for the moment, and Internet of Things is not on my list at the end of 2013, even though I must say I have been fairly impressed by the Italian Arduino micro-controller, which allows you to create objects that can really interact with their environment through different triggers and signals. It looks like Arduino (which was invented in 2005) is rapidly becoming the building block at the heart of the Internet of Things, as its low-power processor is easy to program while its range of controller pins makes it easy to connect to the outside world. I strongly suggest you follow Massimo Banzi (@mbanzi) and David Cuartielles (@dcuartielles), co-founders of this platform, for the next steps in this field.

I now want to go back for a moment to Windows 8.1, and to the new Microsoft policy for upgrades: it has changed since Microsoft moved from a three-year upgrade cycle to a yearly one, and this is bringing fast, improved releases of my favorite administration language: Windows Powershell. I don't want to rewrite the history of Powershell (I already did that), but just mention that this language is quickly becoming the de jure standard for Windows system administration and the de facto standard for third-party application management (VMWare, NetApp, HP, just to mention a few).

With Powershell (like with VB before it), the community has the annual habit of organizing some sort of Scripting Games to determine who's the best scripter around. Starting January 2014, the Winter Scripting Games will be a team event, and for sure this will add to the fun of the competition. I heartily suggest everybody reading this post quickly join a team and find teammates to learn this wonderful language and measure themselves against other Windows admins from all around the world. It could also be the occasion for Linux/Unix administrators to take part in the Games and discover the power of this object-oriented administration language, and see how it compares with other standards such as Bash, Python or Perl. I will come back to the Games in a future post, but if you are eager to know more, follow these smart people on Twitter: Don Jones (@concentrateddon), Mike F Robbins (@mikefrobbins) and Richard Siddaway (@RSiddaway).

I previously mentioned the fight to emerge as the biggest Cloud provider out there. Well, there are many other interesting fights on the way which it is important to be aware of, and which will determine the future of IT. A few examples?
  • Skydrive: forget that name. Microsoft has lost the right to use it, since Sky users couldn't tell the difference. Which name will Microsoft adopt next?
  • Powershell, a name that Logitech has picked without realizing that Microsoft had taken it first for its administration language. Will we see them in Court?
  • YouTube: Microsoft wants a YouTube app on its phones, but Google (which, according to Amazon, is having a solid success with its new Chromebooks) is doing everything it can to block it. It looks like Windows Phone is starting to scare Android.
I don't even want to mention the battle between CPU manufacturers like Intel and AMD. 2014 will be a year of tremendous change for both, since the market dynamics are shifting and new markets have started to grow at high speed. Intel has started backing Bay Trail and Haswell-based systems across multiple OSes and at price points that compete against ARM-powered Android tablets. And next year Intel will push this strategy further, offering multiple Broadwell families, in particular the Broadwell-Y, which should compete with chips from ARM both on performance and on TDP. A post-PC era has started, and I am curious to see how historical CPU manufacturers will cope with the technology challenge set up by Qualcomm, ARM (today ARM chips power Apple's iPhones and iPads, as well as a huge number of Android phones), Samsung and Apple, even if this last one, after years of leadership with the iPhone, is actually running out of tricks to impress me. The battle will be fierce.

Broadwell-Y is coming
I said before that Microsoft is delivering new Windows upgrades at an unrivaled frequency. This is also true for most of its products (it must be the Indian summer of Microsoft CEO Steve Ballmer (@steveballmer), who, as Microsoft employee #30, has been there since the beginning). And this high frequency also applies to its hypervisor: Microsoft Hyper-V Server 2012 R2 has finally attained parity with VMWare in hypervisor functionality and is today ahead in Cloud offerings. Check expert VMWare VCDX Andrea Mauro's (@Andrea_Mauro) blog for an in-depth comparison of technical specs between Microsoft Hyper-V and VMWare vSphere. Personally, I am a VMWare VCP and I have been a great fan of VMWare for many years. Today, while VMWare continues to dominate the market, I am starting to feel like I want to move more and more to the Microsoft solution, because it is cheaper (at least for the Full edition, which is the one I need), and because Microsoft offers Hyper-V users the possibility to run their VMs in the Windows Azure Cloud, extending the Private Cloud concept to make it Hybrid (and interesting, I daresay). In addition, Microsoft has understood that most people are well used to working with VMWare ESX and is giving away free Hyper-V and System Center training in order to change the market dynamics. At the same time Microsoft is providing the opportunity for a free virtualization certification when you request a free voucher for exam 74-409. Well played.

On the other side of the river there is VMWare. I am happy to see that they have released vSphere 5.5, but, let me say it, the upgrade from vSphere 5.0 to 5.1 was a nightmare for most of us, mainly due to the issues brought by the introduction of Single Sign-On (SSO). According to VMWare, SSO is the first step toward true unification of VMware's various software components, but the truth is that they bought so many third-party applications that were difficult to integrate that they needed some sort of mechanism to keep them all bound together. The result was poor.

In 2014 I will keep an eye on the evolution of server virtualization market since it is one of the most interesting from my point of view. I won't make any prediction since I am not an analyst, but one thing is sure: Microsoft and VMWare will continue their battle and IT pros will benefit from the competition.

In this post I also want to suggest a buzzword to follow next year: DevOps. You don't know what DevOps means? Well, DevOps is not a person, nor a tool. It's a new way of thinking about (and improving) the interactions (in terms of workflow and feedback, from early stages to Production) between software developers on one side and operations on the other. If you feel you want to know more about this concept, follow on Twitter the person who invented the word, Patrick Debois (@patrickdebois), and read this explanation by Simon Maple (@sjmaple) on TheServerSide.com.

I want to end this post by suggesting all of my readers start 2014 with the purchase (if they haven't yet) of a Solid State Drive (SSD), since there is no better time of the year to treat yourself to a new bit of kit that will speed up your configuration. They are still expensive, but the technology behind them is improving fast (Samsung has been able to produce a 1TB SSD, the Samsung 840 EVO, using an astonishing 19nm process) and sooner or later prices will drop.
1TB Samsung 840EVO SSD
Let me know if you agree with the things I have written, and feel free to suggest the things that made your 2013 and the things you expect from 2014. For those living in France, see me at the Microsoft Techdays in Paris, where, as a Powershell MVP (which I became thanks to Martine Thiphaine, @mthiphaine), I will be a speaker together with Fabien Dibot (@fdibot) on Powershell Remoting and on Powershell Tips and Tricks.

Wishing all of my readers all the best for 2014, I leave you with two jokes that made me smile:

thedailydose.com
cloudtweaks.com

Tuesday, December 17, 2013

How to build a System Center Virtual Machine Manager 2012 R2 lab - part 9

In this post we will install the actual System Center Virtual Machine Manager 2012 R2 application. In the previous post you set up a new virtual machine named LAB2013VMM01 to host this application. Now you have to mount the SCVMM ISO image and run setup.exe:


Click on Install. By default none of the options are selected. However, you can't install the management server without installing the console, so when you check the first box, the second one will come along.


Enter your registration key:


Accept the license agreement:


Choose whether you want to take part in the CEIP (I suggest you do, since this gives Microsoft important information aimed at improving the software):


Set Microsoft Update to On (... or Off if you are in a lab like me, since it's of no use there):


Choose the folder where you want to install SCVMM:


Now the installer checks that all the prerequisites are met. If you get an alert telling you that you need at least 4GB of RAM, just modify the VM settings to conform, and VMWare Workstation will apply the change instantly, since it supports hot-add of memory. Hardcore.
Time to configure the connection to the MSSQL DB:


Now configure SCVMM to use the service account (SCVMMsvc) you set up using Powershell in the previous post, and enter the DN of the container for the keys: CN=SCVMMDKM,CN=System,DC=lab2013,DC=local


Review the port configuration (and take note of the ports; you could need them if your network is internally firewalled):


Review the library configuration:


Save the installation summary to a text file for future reference:

Features selected to be added
 - VMM management server
 - VMM console
Installation Location
 - C:\Program Files\Microsoft System Center 2012 R2\Virtual Machine Manager\
Database Information
 - VirtualManagerDB database will be created on LAB2013SQL01
Service Account
 - LAB2013\SCVMMsvc
Communication Ports
 - 8100 - Communication with the VMM console
 - 5985 - Communication to agents on hosts and library servers
 - 443 - File transfers to agents on hosts and library servers
 - 8102 - Communication with Windows Deployment Services
 - 8101 - Communication with Windows Preinstallation Environment (Windows PE) agents
 - 8103 - Communication with Windows PE agent for time synchronization
Library Share Name and Location
 - MSSCVMMLibrary
 - C:\ProgramData\Virtual Machine Manager Library Files
Using Microsoft Update
 - No

Wait for the installation to complete:



It was long, but there you are with a full-grown SCVMM lab! Just connect to it and get accustomed to the interface:



In a future post I will explain how to configure networking on these Hyper-V servers (we will set up additional NICs for Management traffic), how to integrate them into SCVMM, how to set up a Cluster and how to set up your first VM, so stay tuned!

Friday, December 13, 2013

How to suspend and resume VMWare Workstation VMs using PowerShell

If you have ever built a lab in your home environment, you have probably used VMWare Workstation as your preferred choice for running virtual machines. However, in such a small all-in-one-host configuration, you may face the need to shut down and restart your main computer (e.g. if you have just applied some patches or installed some heavyweight applications), and this can be very unpractical if you have, say, 5 or 6 VMs (or more!) running on top of it.

What you would like is the ability to suspend the state of these VMs so you can pick up where you left off after rebooting your host. And since we are in modern times, what you would also like is the ability to perform this operation with a Powershell script.

Unfortunately VMWare Workstation has no Powershell API, but it comes with a command-line tool your Powershell scripts can be built on: vmrun.exe.

This utility can be used to control virtual machines and allows the following operations: start (power on), stop (power off), reset (reboot), suspend (but allow local work to resume), pause (without interrupting), and unpause (continue).
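For reference, vmrun list prints a count line followed by the path of each running VM's .vmx file, which is why the first script below skips the first line of its output (the paths shown here are illustrative):
Total running VMs: 2
C:\VMs\vm001\vm001.vmx
C:\VMs\vm002\vm002.vmx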

I had the idea of writing two scripts. The first one is good for suspending VMs and should be used prior to rebooting your host (right-click, open with Windows Powershell ISE):
cd "C:\Program Files (x86)\VMware\VMware Workstation"

$RunningVMs = .\vmrun list | select-object -skip 1
 
Foreach ($RunningVM in $RunningVMs)
{
    "Suspending $RunningVM..."
    .\vmrun suspend "$RunningVM"
}
The second one is good for resuming your VMs to the exact state they were in before the host rebooted:
cd "C:\Program Files (x86)\VMware\VMware Workstation"

$SuspendedVMs = gci -Include *.vmx -Recurse -Path C:\VMs
 
Foreach ($SuspendedVM in $SuspendedVMs)
{
    .\vmrun start "$SuspendedVM"
}
As you may have gathered, the state of a VM is kept persistent across reboots because it's dumped to a .vmss file (whose name will look like this: vm001-82cc0131.vmss), while the RAM content is dumped to a .vmem file (such as vm001-82cc0131.vmem).
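If you are curious, you can spot those state files with a quick one-liner (assuming, as in the scripts above, that your VMs live under C:\VMs):
Get-ChildItem -Path C:\VMs -Include *.vmss, *.vmem -Recurse |
    Select-Object Name, Length, LastWriteTime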

I hope this helps with your labs, and if you have the occasion, check out the new Hyper-V 2012 R2 role as a good alternative to VMWare Workstation.

Monday, December 9, 2013

How to build a System Center Virtual Machine Manager 2012 R2 lab - part 8

In the previous post you saw how to configure SQL Server for your virtual infrastructure. In this post you will set up a sixth virtual machine, named LAB2013VMM01, that will host your actual System Center Virtual Machine Manager 2012 R2 installation. You will also go through all the prerequisites for SCVMM to work.



Under VMWare Workstation configure a new virtual machine and install Windows 2012 R2 with a GUI (even though it looks like you have the option to install SCVMM in Core mode starting from version 2012). Once the installation has ended and you have deployed the VMWare Tools, connect as a Domain Admin and once again use the basic configuration cmdlets:
Get-NetAdapter -Name Ethernet0 | % {
  $_ | Set-NetIPInterface -Dhcp Disabled
  $_ | New-NetIPAddress -IPAddress 192.168.134.17 -PrefixLength 24 -DefaultGateway 192.168.134.2
  $_ | Set-DnsClientServerAddress -ServerAddresses 192.168.134.10
}
Rename-Computer -NewName LAB2013VMM01 -Restart
netsh advfirewall set allprofiles state off
cscript C:\Windows\System32\Scregedit.wsf /ar 0
Add-Computer -DomainName LAB2013.local -Restart
Time to review the four main prerequisites for SCVMM:
  • During the installation of a VMM management server, on the 'Configure service account and distributed key management' page, you will need to configure the System Center Virtual Machine Manager service to use either the Local System account or a domain account. You can change this later on, of course. If you choose to use a domain account, make it a member of the local Administrators group. In any case there are many reasons to choose to run SCVMM with a domain account, though I am not going to detail them here.
  • During the installation of a VMM management server you will also need to configure Distributed Key Management. On the 'Configure service account and distributed key management' page of Setup, you can choose to use Distributed Key Management to store encryption keys in Active Directory Domain Services (AD DS) instead of storing them on the computer on which the VMM management server is installed. The advantage of storing your keys in AD DS is that if you need to move your VMM installation to another computer, all your data is retained. Cool, uh? Also, storing keys in AD DS is the only possible way if you are clustering your VMM. You must create a Container (i.e. SCVMMDKM) in AD DS before installing VMM. The domain account you created in the first step must have full rights on this Container.
  • The third prerequisite is to install Windows Assessment and Deployment Kit (Windows ADK) for Windows 8.1.
  • The fourth is to install SQL Server 2012 Command Line Utilities.
Remember that using a domain account and storing the encryption keys in AD DS are two of the pillars of a Highly Available VMM management server, which is out of scope for my lab for now. Maybe I'll come back to this topic in a future post.

The first two prerequisites are easily accomplished with Powershell.
On the Domain Controller (LAB2013AD01 in my case) type the following command to create a new account for SCVMM:
New-ADUser -Name "SCVMM Account" –SamAccountName SCVMMsvc -Description "SCVMM 2012 R2 Server Service Account" -Enabled $true -AccountPassword (Read-Host -AsSecureString "SCVMM Service Account Password")
Note the password you used! Now retrieve the SID of this new user account:
[System.Security.Principal.IdentityReference]$UserSid = (Get-ADUser scvmmsvc).SID
Create a variable containing the DN of your domain:
$ADRoot = (Get-ADDomain).DistinguishedName
Create the Active Directory Container:
New-ADObject -Name "SCVMMDKM" -Type Container -Path "CN=System,$ADRoot" -PassThru

DistinguishedName   Name                ObjectClass         ObjectGUID
-----------------   ----                -----------         ----------
cn=SCVMMDKM,CN=S... SCVMMDKM            container           59722f15-51af-4c...
Now retrieve the existing ACL of the Container (note the use of the AD: PSDrive):
$Acl = Get-Acl "AD:CN=SCVMMDKM,CN=System,$ADRoot"
Now the tricky part: you have to create the ActiveDirectoryAccessRule that goes into the AddAccessRule method. This object has six different constructors and each can be used for a different use case. Luckily we can check the syntax on MSDN or execute the following command:
[System.DirectoryServices.ActiveDirectoryAccessRule].GetConstructor
OverloadDefinitions
-------------------
System.Reflection.ConstructorInfo GetConstructor(System.Reflection.BindingFlags bindingAttr, System.Reflection.Binder binder, System.Reflection.CallingConventions callConvention, type[] types, System.Reflection.ParameterModifier[] modifiers)
System.Reflection.ConstructorInfo GetConstructor(System.Reflection.BindingFlags bindingAttr, System.Reflection.Binder binder, type[] types, System.Reflection.ParameterModifier[] modifiers)
System.Reflection.ConstructorInfo GetConstructor(type[] types)
System.Reflection.ConstructorInfo _Type.GetConstructor(System.Reflection.BindingFlags bindingAttr, System.Reflection.Binder binder, System.Reflection.CallingConventions callConvention, type[] types, System.Reflection.ParameterModifier[] modifiers)
System.Reflection.ConstructorInfo _Type.GetConstructor(System.Reflection.BindingFlags bindingAttr, System.Reflection.Binder binder, type[] types, System.Reflection.ParameterModifier[] modifiers)
System.Reflection.ConstructorInfo _Type.GetConstructor(type[] types)
To give the SCVMMsvc user account full rights on the SCVMMDKM Container you need to choose a constructor that accepts:
  • An IdentityReference object that identifies the trustee of the access rule; in our case, the SID of the user account SCVMMsvc.
  • The access rights that are assigned to an Active Directory Domain Services object. You will use GenericAll because it gives the right to create or delete children, delete a subtree, read and write properties, examine children and the object itself, add and remove the object from the directory, and read or write with an extended right.
  • An AccessControlType: Allow or Deny
$Ace = New-Object System.DirectoryServices.ActiveDirectoryAccessRule $UserSid, "GenericAll", "Allow"
$Acl.AddAccessRule($Ace)
Set the Acl:
Set-Acl -aclobject $Acl "AD:CN=SCVMMDKM,CN=System,$ADRoot"
The last step you have to perform to meet the first two prerequisites is to add the new SCVMMsvc account to the local Administrators group on your SCVMM server LAB2013VMM01:
([ADSI]"WinNT://LAB2013VMM01/Administrators,group").psbase.Invoke("Add",([ADSI]"WinNT://LAB2013.local/SCVMMsvc").path)
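If you want to double-check the result, the same ADSI WinNT provider can list the members of the group; this small sketch uses reflection to read each member's Name property:
([ADSI]"WinNT://LAB2013VMM01/Administrators,group").psbase.Invoke("Members") |
    ForEach-Object { $_.GetType().InvokeMember("Name", 'GetProperty', $null, $_, $null) }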
Easy, right?
Now, for the Windows ADK, you can find it here: http://www.microsoft.com/en-eg/download/details.aspx?id=39982. It's a small file (1.4MB) named adksetup.exe.

The two features that you need for SCVMM are Deployment Tools and Windows PE:

The installer will download these features (that's why it is so important to have a proper internet connection and valid DNS forwarding in your lab). This can take a significant amount of time depending on your download speed.

You can take a walk here, since this step involves a pretty long download. When you come back, the Windows ADK installation will be finished and you will see the following window:

Now install the SQL Server 2012 Command Line Utilities from the Microsoft SQL Server 2012 Feature Pack. You can download them from http://www.microsoft.com/en-us/download/details.aspx?id=29065

The installation ends quickly since there are no prerequisites to meet under Windows 2012 R2:

Restart this virtual machine, since you made a lot of modifications (this step is not required, but I suggest you do so):
Now you have all the SCVMM prerequisites met. Time to install System Center, which I will explain in the next post!