PowerCLI – where to start

I began using PowerShell around 18 months ago while working for a small UK-based Managed Service Provider. Prior to this, my coding/scripting experience consisted of an A-Level in Computing, which introduced me to Visual Basic 6.0 and databases, followed by a void of around 7 years, and then a few years of sysadmin VBScript and batch file type goodness.

Until I started at said company, I had only been exposed to systems running Windows Server 2003, and with security prized über alles, there was no access to PowerShell or any other exciting languages, so VBScript became our automation tool of choice.

I have posted before about good resources for learning PowerShell; this is more a rundown of how I learned, and the joy and knowledge the process gave me.

My first taste of PowerShell was working with Exchange 2010 servers, doing stuff like this to report on mailbox items over a certain age:

Get-Mailbox "username" | New-MailboxSearch -Name search123 -SearchQuery "Received:<01/01/2014" -estimateonly

Were it not for the necessity to use PowerShell to do anything remotely useful in Exchange 2010, I would have been happy to carry on automating things with batch files and VBScript; I was confident with those tools and could achieve time savings, albeit fairly slowly. But use PowerShell I must, so use PowerShell I did.

Around this time, I became more keen on working with infrastructure than applications, and transferred to a role solely looking after our fairly sizeable Cisco UCS and VMware estate. I had plenty of years of VMware experience and none with Cisco UCS, but was excited by the new challenge.

I was quickly steered by the senior engineers towards Cisco UCS PowerTool and VMware’s PowerCLI to help automate some of the administrative and reporting tasks I would soon be inundated with, so I picked them up and learned as I went.

I started small, and Google was my friend, scripting small tasks to save incrementally larger amounts of time. Stuff like this:


# Read in the list of UCS domains and the credentials to use
$podcsv = import-csv .\UCS_Pods.csv
$credcsv = import-csv .\UCS_Credentials.csv
$ucsuser = $credcsv.username
$ucspasswd = $credcsv.password
$secpasswd = convertto-securestring $ucspasswd -asplaintext -force
$ucscreds = new-object system.management.automation.pscredential ($ucsuser,$secpasswd)
$datetime = get-date -uformat "%C%y%m%d-%H%M"
# Connect to each UCS domain in turn and dump its faults to a timestamped CSV
foreach ($pod in $podcsv)
{
    $podname = $pod.name
    $podip = $pod.ip
    connect-ucs -credential $ucscreds $podip
    get-ucsfault | select ucs,id,lasttransition,descr,ack,severity | export-csv -path .\$datetime-$podname-errors.csv
    disconnect-ucs
}

This dumps out the alerts from multiple UCS systems to CSV files. It saved 20-30 minutes a day; nothing major, but clicking buttons is boring, and I can always find better things to do with my time.
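
For reference, the script expects two small input files alongside it. The exact values are up to you (those below are made up), but based on the properties the script reads, UCS_Pods.csv needs one row per UCS domain:

name,ip
pod-a,10.0.10.10
pod-b,10.0.20.10

and UCS_Credentials.csv a single row, ideally for a read-only account:

username,password
ucs-readonly,SomePassword123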

On the VMware side of things, I started really small, with stuff like this, which tells you the version of VMware Tools on all of your virtual machines:


# Ask for connection details, then connect using these
$vcenter = Read-Host "Enter vCenter Name or IP"
$username = Read-Host "Enter your username"
$password = Read-Host "Enter your password"
# Set up a timestamped output file for the report
$datetime = get-date -uformat "%C%y%m%d-%H%M"
$OutputFile = ".\" + $datetime + "_" + $vcenter + "_VMTools_Report.txt"
# Connect to vCenter with the details entered above
$Connection = Connect-VIServer $vcenter -User $username -Password $password
foreach ($Cluster in Get-Cluster) {
    # Only look at hosts which are connected or in maintenance mode
    foreach ($esxhost in ($Cluster | Get-VMHost | Where { ($_.ConnectionState -eq "Connected") -or ($_.ConnectionState -eq "Maintenance") } | Sort Name)) {
        # Pull the Tools version and status from each VM's view object
        $esxhost | Get-VM | % { Get-View $_.Id } |
            Select Name,
                @{ Name = "ToolsVersion"; Expression = { $_.Config.Tools.ToolsVersion } },
                @{ Name = "ToolStatus";   Expression = { $_.Guest.ToolsVersionStatus } },
                @{ Name = "Host";         Expression = { $esxhost } },
                @{ Name = "Cluster";      Expression = { $Cluster.Name } } |
            Format-Table | Out-File -FilePath $OutputFile -Append
    }
}
Disconnect-VIServer * -Confirm:$false

This is a real time saver, and great for getting quick figures out of your environment. As I wrote these scripts, I learned more and more about what I could do, picking up ways of doing different things here and there: for/next loops, do/while loops, arrays. As I picked these concepts up again, concepts I had learned years earlier and never used to great effect, my scripts became more complex and delivered more value, both in the output they gave and the time they saved. Scripts like this one, which reports on any datastores over 90% utilisation, soon became part of our daily reporting regime:


$datetime = get-date -uformat "%C%y%m%d-%H%M"
$vcentercsv = import-csv .\VCenter_Servers.csv
# Configure connection settings using a read-only account
$credcsv = import-csv .\VMware_Credentials.csv
$vmuser = $credcsv.username
$vmpasswd = $credcsv.password
$secpasswd = convertto-securestring $vmpasswd -asplaintext -force
$vmcreds = new-object system.management.automation.pscredential ($vmuser,$secpasswd)
$report = @()
foreach ($vcenter in $vcentercsv)
{
    connect-viserver $vcenter.ip -credential $vmcreds
    # Skip local datastores, and only keep those over 90% used
    foreach ($datastore in (get-datastore | where {$_.name -notlike "*local*" -and [math]::Round(100-($_.freespacegb/$_.capacitygb)*100) -gt 90}))
    {
        $row = '' | select Name,FreeSpaceGB,CapacityGB,vCenter,PercentUsed
        $row.Name = $datastore.name
        $row.FreeSpaceGB = $datastore.freespacegb
        $row.CapacityGB = $datastore.capacitygb
        $row.vCenter = $vcenter.name
        $row.PercentUsed = [math]::Round(100-($datastore.freespacegb/$datastore.capacitygb)*100)
        $report += $row
    }
    Disconnect-VIServer * -Confirm:$false
}
$report | Sort PercentUsed | export-csv -path .\$datetime-datastore-overuse.csv
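
Because reports like this end up running every day, the natural next step is to mail the output to the team once the CSV exists. A minimal sketch of that last step, assuming a reachable SMTP relay (the relay and addresses below are placeholders, not part of the original script):

# Mail the day's datastore report - relay and addresses are placeholders
$reportfile = ".\$datetime-datastore-overuse.csv"
if (Test-Path $reportfile) {
    Send-MailMessage -From "vmware-reports@example.com" -To "virt-team@example.com" `
        -Subject "Datastores over 90% used - $datetime" `
        -Body "Daily datastore utilisation report attached." `
        -Attachments $reportfile -SmtpServer "smtp.example.com"
}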

My knowledge of how to do things, and my confidence in what I was doing, grew rapidly, and the old adage of ‘the more I know, the more I realise I don’t know’ came to pass. I am still learning at a rapid rate: how better to put these things together, new cmdlets, new modules, new ways to do things. It’s a fun journey though, one which leaves you with extremely useful and admired skills, and one which will continue to develop you as an IT technician throughout your career.
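
A couple of simple habits help with that constant discovery: listing what a module actually provides, and leaning on the built-in help examples. For instance (the module name below assumes a modern PowerCLI install, so adjust it to whatever your version exposes):

# List every cmdlet the core PowerCLI module provides
Get-Command -Module VMware.VimAutomation.Core | Select-Object Name
# Read the worked examples for a specific cmdlet
Get-Help Get-Datastore -Examples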

I am now working on the biggest PowerShell datacenter automation project I have ever done; it is around 5,000 lines and growing every day. I feel like anything can be achieved with PowerShell and the various modules released by vendors, and finding ways to solve the constant puzzles that hit me in the face is exciting and rewarding in equal measure.

Everywhere you look in IT now, it is automation and DevOps. It has been said many times that IT engineers who do not learn some form of automation are going to be automated out of a job, and to some extent I agree. The advent of software-defined storage, networking, everything, shows that automation and policy-driven configuration are really changing the world of IT infrastructure. If you’re in IT, you probably got in because you love technology, so get out there and learn new skills, whatever they may be; you will enjoy it more than you think.
