James O'Neill's Blog

August 24, 2011

Improving my score at PowerShell Golf

Filed under: Powershell — jamesone111 @ 9:58 am

There is a mindset found among scripting and programming people known as “Golf”. The objective is to get to the end using as few [key]strokes as possible. It’s not something I like to see in scripts and functions, but most of us don’t like to type any more than we have to at the command line. Tab expansion in PowerShell is a great feature for cutting down keystrokes (not beyond improvement, but truly great).

I’ve mentioned a couple of times recently that PowerShell’s parser gives Get- commands a sort of “automatic alias”: if you type “Service” where a command is expected, tab expansion won’t work, but if you carry on and run the command then, before it gives up and reports “The term 'service' is not recognized as the name of a cmdlet, function, script file, or operable program”, PowerShell looks for “Get-Service”, finds it and calls that instead. This works for user-defined functions as well, so the Get-SQL command I wrote to send SQL commands via ODBC can be shortened to “SQL”, which is very fine and good and only has two things wrong with it.

  1. Commands need to be wrapped in quotes. It doesn’t matter whether they are single or double quotes, but if you are pasting in something which was written for MySQL you need to have your wits about you, because like PowerShell it allows either single or double quotes in the command (see the example after this list).
  2. It’s still more keystrokes than using the MySQL Monitor in an SSH session.
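
For example – a sketch of the behaviour described above, assuming Get-SQL is already loaded in the session:

# The parser can't find a command called "SQL", so it retries as Get-SQL;
# the statement still has to travel as one quoted string (problem 1 above)
SQL "select * from mytable where name='james'"
# ...which is just a shorter way of typing
Get-SQL -SQL "select * from mytable where name='james'"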

What I wanted to do was use a symbol to mean “Send this to SQL”. PowerShell already uses most of the symbols on the keyboard, although its parser is flexible enough to allow some to be used as aliases – for example, when it sees “40 % 13” the % sign is interpreted as the modulo operator, but when it sees “dir | % {…}” the % is an alias for ForEach-Object. Possible though it is, making the $ or > signs into aliases feels like the first dancing step down the path to madness, so for SQL I used ¬ (I’ve used the same technique for the Invoke-RemoteCommand wrapper I wrote for the netCmdlets’ Invoke-SSH command, for which I use ~).

I want to be able to type
¬ select * from mytable where name="james"
and have my command go into the default ODBC connection (or the default SSH connection, or whatever).
But it won’t work if I simply use Set-Alias -Name "¬" -Value "Get-SQL", because “Select”, “*”, “from”, “mytable” and so on each get treated as distinct parameters, and the quote marks around james will be lost. PowerShell has a “ValueFromRemainingArguments” option for a parameter, but that won’t deal with the quotes issue (and there will be problems deciding what belongs to other parameters, like the ODBC connection string).

My solution was to alias ¬ to an intermediate function named Hide-GetSQL, which contains one line:
Function Hide-GetSQL {
  Get-Sql -sql ($MyInvocation.line.substring($MyInvocation.OffsetInLine))
}
Set-Alias -Name ¬ -Value Hide-GetSQL

Normally I hate seeing references to $Args or $MyInvocation.Line, but here I have to make an exception: this gets the line which was typed as a single string and truncates it after the function / alias name. PowerShell’s parser will still split the line into sections if it contains semicolons – the function still works, but any attempt to process a second command will almost certainly cause a spurious error. For my use it hasn’t been a problem – your mileage may vary.

August 23, 2011

Enhancing the NetCmdlets

Filed under: Powershell — jamesone111 @ 1:11 pm

At the PowerShell Deep Dive conference in April, /n software gave away USB keys with a full set of “PowerShell inside” software including their NetCmdlets. It turns out that this is 30-day evaluation software, and initially this put me off using them; Joel wrote about an alternative, saying “It’s also possible to do this using the NetCmdlets from /n and I’m constantly surprised at how unwilling people are to pay for them.” Part of the problem is you get nearly 100 cmdlets, and if you only want half a dozen it feels like most of the cost is wasted. After checking a free library and finding it flaky I dug out the USB key, figuring that if they worked, or even mostly worked, they’d be worth the cost.

I only need to do a couple of things – use SSH to send commands to the Linux server which hosts my application (for which there are three commands: Connect-SSH, Disconnect-SSH and Invoke-SSH) and move files both ways with SFTP (which has Connect-SFTP, Disconnect-SFTP, Get-SFTP, Remove-SFTP, Rename-SFTP and Send-SFTP). I wanted to be able to invoke the commands against my server without re-making connections each time, and that meant some simple “wrapper” commands to get things how I wanted them.

  • Connect-Remote – sets up a remote session and saves the session object as a global variable; all the following functions can be passed a remote session but use the global variable as a default
  • Get-RemoteItem – gets the properties of one or more remote items (like Get-Item / Get-ChildItem) – alias rDir
  • Copy-RemoteItem – copies a remote item to the local computer (like Copy-Item) – alias rCopy
  • Copy-LocalItem – copies a local item to the remote computer
  • Remove-RemoteItem – deletes an item on the remote computer (like Remove-Item)
  • Invoke-RemoteCommand – runs a command on the remote computer (like Invoke-Command)
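
Put together, a typical session with these wrappers might look something like this (the server name and paths are made up for illustration):

# Hypothetical session: connect once, then let everything default to $Global:ssh
Connect-Remote -Server "linuxbox01"                          # prompts for credentials
rDir "/var/www/*"                                            # list remote files  (Get-RemoteItem)
rCopy "/var/www/app.log" -destination C:\Temp                # pull a file down   (Copy-RemoteItem)
Copy-LocalItem ".\config.xml" -destination "/var/www/config.xml"
Invoke-RemoteCommand "uptime"                                # run a command, get the text back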

The two Connect- NetCmdlets return different types of object, which contain the server, credential and various other connection parameters. I can set them up like this:

$credential = $Host.ui.PromptForCredential("Login",
   "Enter Your Details for $server","$env:userName","")
$Global:ssh = Connect-SSH -Server $server -Credential $credential -Force

I found downloading causes the sftpConnection object to “stick” to the path of the downloaded file. Get-, Remove-, Rename- and Send-SFTP all allow the server and credential to be passed, so I can pass the SSHConnection object into my SFTP wrappers and use its server and credential properties – even though it is the “wrong” connection type, being for SSH command lines rather than SFTP file access. I wrote a Connect-Remote function which also stores the connections so I can switch between them more easily. It looks like this:

$Global:AllSSHConnections = @{}
Function Connect-Remote {
  Param ([String]$server   = $global:server,
         [String]$UserName = $env:userName,
         [Switch]$force
        )
  if ($Global:AllSSHConnections[$server] -and -not $force) {
        $Global:ssh = $Global:AllSSHConnections[$server]
  }
  else {$credential = $Host.ui.PromptForCredential("Login","Enter Your Details for $server",$username,"")
        # Connect-SSH takes an -ErrorAction parameter but it doesn't work, so use try / catch
        try   {$Global:ssh = Connect-SSH -Server $server -Credential $credential -Force -ErrorVariable SSHErr }
        catch {continue}
        if ($?) {$remotehost = (Invoke-RemoteCommand -Connection $ssh -CmdLine "hostname")[0] -replace "\W*$",""
                 Add-Member -InputObject $ssh -MemberType "NoteProperty" -Name "HostName" -Value $remoteHost
                 $Global:AllSSHConnections[$server] = $ssh
                 $ssh | Out-Default
        }
  }
}

Storing the connection simplifies using the other cmdlets. I have a little more error checking in the real version of the script, so I can tell the user whether there was simply a password error or something more fundamental at fault.
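
Because each connection is stored against its server name in $Global:AllSSHConnections, switching the “current” connection later is just a hashtable lookup – for example (hypothetical server name):

# Point the default connection at a different, already-stored server
$Global:ssh = $Global:AllSSHConnections["webserver02"]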

The first task I wanted to perform was to get a file listing via SFTP. I have a couple of easily-solved gripes with the NetCmdlets: I’ve already mentioned the sticking SFTP connection; the other is that the EntryInfo object returned when getting a file listing doesn’t contain the fully qualified path, just the file name. So I sort the entries into order, add properties for the directory and the fully qualified path, and used New-TableFormat to give me the start of a formatting XML file to display the files nicely. The function takes the path, and a connection object – which defaults to the connection I set up before.

Function Get-RemoteItem {
    param( [Parameter(ValueFromPipeLine=$true)]
           [String]$path = "/*",
           $Connection   = $Global:ssh
         )
    process {
           if ($Connection.server -and $Connection.Credential) {
                  $Directory = $path -replace "/[^/]*$",""
                  Get-SFTP -List $path -Credential $Connection.Credential `
                           -Server $Connection.Server -Force |
                           Sort-Object -Property @{e={-not $_.isdir}}, filename |
                           Add-Member -MemberType NoteProperty   -Name "directory" -Value $directory -PassThru |
                           Add-Member -MemberType ScriptProperty -Name "Path" `
                                      -Value {$this.directory + "/" + $this.Filename} -PassThru
           }
           else {Write-Warning "We don't appear to have a valid connection."}
         }
}
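
With the directory and Path properties added, the listing pipes cleanly into the copy wrapper defined next – for example (the remote path and local folder are made up):

# List a remote directory and pull every file (not sub-directory) down to C:\Temp\logs.
# Copy-RemoteItem binds its -path parameter from the Path property added above.
rDir "/var/log/*" | Where-Object { -not $_.IsDir } | rCopy -destination C:\Temp\logs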

The next things to add were Copy-RemoteItem and Copy-LocalItem – which just have to call Get-SFTP and Send-SFTP with some pre-set parameters.

Function Copy-RemoteItem {
[CmdletBinding(SupportsShouldProcess=$true)]
param ([Parameter(ValueFromPipeLine=$true, ValueFromPipelineByPropertyName=$true, Mandatory=$true)]
       [String]$path,
       [String]$destination = $PWD,
       $Connection = $Global:ssh,
       [Switch]$force
      )
process { if ($Connection.server -and $Connection.Credential -and $path) {
              # Because we re-make the connection in the Get-SFTP command we can pass an SSH object.
              Write-Verbose "Copying $path from $($Connection.Server) to $destination"
              Get-SFTP -RemoteFile $path -LocalFile $destination -Credential $Connection.Credential `
                       -Server $Connection.Server -Force -Overwrite:$force |
                       Add-Member -PassThru -MemberType NoteProperty -Name "Destination" -Value $destination
          }
          else {Write-Warning "We don't seem to have a valid destination and path" }
        }
}
Set-Alias -Name rCopy -Value Copy-RemoteItem

Function Copy-LocalItem {
[CmdletBinding(SupportsShouldProcess=$true)]
param ([Parameter(ValueFromPipeLine=$true, ValueFromPipelineByPropertyName=$true, Mandatory=$true)]
       [String]$path,
       [Parameter(Mandatory=$true)][String]$destination,
       $Connection = $Global:ssh,
       [Switch]$force
      )
process {  if ($Connection.server -and $Connection.Credential -and $path) {
               Write-Verbose "Copying $path to $destination on $($Connection.Server)"
               Send-SFTP -LocalFile $path -RemoteFile $destination -Credential $Connection.Credential `
                         -Server $Connection.Server -Force -Overwrite:$force
           }
           else {Write-Warning "We don't seem to have a valid destination and path" }
        }
}

The last wrapper I needed was one to go round Invoke-SSH – primarily to set default parameters, and return only the response text.

Function Invoke-RemoteCommand {
[CmdletBinding(SupportsShouldProcess=$true)]
param   ([Parameter(Mandatory=$true, ValueFromPipeLine=$true)][String]$CmdLine,
         $Connection = $Global:ssh
        )
process { if ($Connection -is [PowerShellInside.NetCmdlets.Commands.SSHConnection]) {
             Write-Verbose "# $cmdline"
             Invoke-SSH -Connection $Connection -Command $CmdLine |
                        Select-Object -ExpandProperty text
          }
        }
}
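
Since -CmdLine accepts pipeline input, several commands can be run in one go against the stored connection (the commands here are only examples):

# Each string becomes one SSH command; only the response text comes back
"uname -a", "df -h", "uptime" | Invoke-RemoteCommand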

Job done … well almost. I’ll talk about a couple of tricks I’ve used in combination with these in future posts.

August 22, 2011

Easy-formatting for PowerShell with New-TableFormat

Filed under: Powershell — jamesone111 @ 10:09 am

Most of the PowerShell I write ends up as functions rather than scripts, and it’s worth differentiating between the two.

  • A function is something which can be called from somewhere else.
  • A script carries out a particular task (which might be to define some function(s), or might be done with functions).

Several of my 10 tips for better functions are about being smart with inputs and outputs (which is less important for one-off scripts). I should have included an extra point: “Ask if constants should really be defaults for parameters”. If I notice “I am solving one specific case of X, with a value set to Y, so I can solve other X-problems if I allow the value to be changed”, it makes something reusable out of a one-off. For example, I put a wrapper round PowerShell’s great Select-String cmdlet to search PS1 files, then I realised that I might want to search .SQL or .XML files, but usually it would be PS1. So “*.PS1” became the default value for an -Include parameter.
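
As a sketch of the idea (not my actual wrapper – the names here are only illustrative), the old constant simply becomes a parameter default:

Function Select-Script {
    param ( [Parameter(Mandatory=$true)][String]$Pattern,
            [String[]]$Include = "*.ps1"        # the old constant, now overridable: *.sql, *.xml ...
          )
    Get-ChildItem -Recurse -Include $Include | Select-String -Pattern $Pattern
}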

Recently I put together a simple function named Get-SQL – it uses ODBC to send SQL commands, in my case to MySQL on a Linux box, but it could be any ODBC source. To avoid constantly creating and tearing down sessions, it saves an ODBCConnection object in a global variable between invocations. Once it has been run for the first time in a session I can type Get-SQL “command” and use the existing connection; and I’ve made sure SQL statements can be piped in (that’s what I mean by being smart about inputs).

In its basic form Get-SQL takes three parameters: a connection string ($connection), a SQL statement ($SQL) and a -ForceNew switch, which remakes the connection if it already exists.
if ($forceNew -or (-not $global:ODBCconn)) {
    $global:ODBCconn = New-Object System.Data.Odbc.OdbcConnection($connection)
    $global:ODBCconn.open()
}
if ($sql) {
    $adapter = New-Object system.Data.Odbc.OdbcDataAdapter(
    New-Object system.Data.Odbc.OdbcCommand($sql,$ODBCconn))
    $table = New-Object system.Data.datatable
    [void]$adapter.fill($table)
    $table
}
else      {$ODBCconn }

So, if a SQL statement is passed, any data rows it generates will be returned; but if I am just setting up a connection for the rest of my session the ODBC connection object is returned.

PS C:\Users\james\Documents\windowsPowershell> Get-SQL -connection "DSN=foo-Customer5"
ConnectionString  : DSN=foo-Customer5
ConnectionTimeout : 15
Database          : foo
DataSource        : 192.168.1.234 via TCP/IP
ServerVersion     : 5.0.77
Driver            : myodbc5.dll
State             : Open
Site              :
Container         :

I want to show something shorter and simpler: I could finish with
$ODBCConn | format-table DataSource , Database, State
that might be acceptable now, but I don’t know what problems I’m storing up for the future by not returning the full ODBC object.
The better option is to write a formatting XML file – so the object remains the same but PowerShell displays it the way I want. But writing and debugging the XML longhand is tedious, so the temptation is to go with the Format-Table method and fix things up later; happily there is a short-cut to writing the XML which avoids parking the problem. Richard mentions that it came up during the PowerShell Deep Dive in Vegas, and I downloaded it from here. (Slightly oddly, you need to put Function New-TableFormat { } into the editor, paste the code block in, then run it.)

To use it you get the output the way you want with Format-Table and then replace “Format-Table” in the command line with New-TableFormat (with a couple of extra bits of finesse if needed). So it took about a minute to get the ODBC connection to display like this:

Data Source                Database State
-----------                -------- -----
192.168.1.234 via TCP/IP   foo      Open
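
Roughly, the recipe for that looked like this (following the “swap the cmdlet name” approach described above – the exact property list may need the odd bit of finesse):

# Get the layout right with Format-Table first...
Get-SQL -connection "DSN=foo-Customer5" | Format-Table DataSource, Database, State
# ...then swap the cmdlet name to generate the skeleton of the formatting XML
Get-SQL -connection "DSN=foo-Customer5" | New-TableFormat DataSource, Database, State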
  

I’ve also created wrappers around /n software’s NetCmdlets to make it easier to work with the Linux server, and in one of those I call
Connect-SSH -Server $server -Credential $credential -Force
which would normally output this:

AuthMode         : Password
CertPassword     :
CertStore        : MY
CertStoreType    : User
CertSubject      :
Config           :
Credential       : System.Management.Automation.PSCredential
FirewallHost     :
FirewallPassword :
FirewallPort     : 0
FirewallType     : None
FirewallUser     :
Force            : True
LocalIP          :
Password         :
Port             : 22
Server           : 192.168.1.234
ShellPrompt      :
SSHAccept        :
Timeout          : 10
User             :

It took about two minutes to get Add-Member to show what the host thinks its name is, and to create a formatting XML file to get the result to look like this:

Connected to   Port  Host name        User name
------------   ----  ---------        ---------
192.168.4.71   22    foo.contoso.com  \root

It was all of another minute to create a manifest file to load my scripts and formatting XML as a PowerShell module and have everything neat and tidy. Cracking stuff.
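
For completeness, the manifest itself is only a few lines – something along these lines (the file names are hypothetical):

# RemoteTools.psd1 : a minimal sketch of a manifest which loads the functions
# and the formatting XML together as a single module
@{
    ModuleVersion    = '1.0'
    ModuleToProcess  = 'RemoteTools.psm1'               # defines / dot-sources the wrapper functions
    FormatsToProcess = 'RemoteTools.format.ps1xml'      # the XML that New-TableFormat helped to create
}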

August 17, 2011

Adding “Out-Edit” functionality to PowerShell

Filed under: Powershell — jamesone111 @ 8:41 pm

Back in March I wrote about useful bits in my PowerShell profile. Recently I’ve wanted to take the results of what I’m doing into an editor. It’s easy enough to pipe a result into CLIP.EXE, which puts it on the clipboard, but it’s not too hard to use the object model of the PowerShell ISE to do it directly.

Function Out-Edit {
  Param   ( [Parameter(ValueFromPipeLine=$true)][Alias('Text')]$inputObject,
            [String]$Before,
            [String]$After,
            [Switch]$New
          )
  begin   { if ($new) {$Editor = $psise.CurrentPowerShellTab.Files.Add().Editor }
            else      {$Editor = $psise.CurrentFile.Editor }
            if ($before) {$Editor.InsertText($before) }
          }
  process { $Editor.InsertText($inputObject) }
  end     { if ($after) {$Editor.InsertText($after) } }
}

I gave the function a -New switch to decide whether the input goes into the current file or a new one, and -Before and -After parameters to allow the input to be topped and tailed – typically with something like $Foo=@" and "@ to turn my output into a here-string. After that it’s just a case of getting the new or current file’s editor and calling its .InsertText() method.
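
For example, to drop the current list of services into a new editor tab, already wrapped up as a here-string assignment (the variable name is arbitrary):

# Capture the services as text and paste them into a new ISE file,
# topped with $services = @" and tailed with "@
Get-Service | Out-String |
    Out-Edit -New -Before "`$services = @`"`r`n" -After "`r`n`"@"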

Note: In that March piece I showed my replacement for the default PSEDIT function; I have since merged it with the function above and put my profile.ps1 on SkyDrive. I also changed the -New switch to a -UseExisting switch, because I found that I almost always wanted a new file.

August 7, 2011

Prayer-based parsing: PowerShell regular expressions again.

Filed under: Powershell — jamesone111 @ 9:52 am


Recently I saw a quote on Scott Hanselman’s blog:

“You’ve got a problem, and you’ve decided to use regular expressions to solve it.
Ok, now you’ve got two problems…”

A telling of the history of this quote provides some other good quotes; some restate “Using X as a solution adds to the problems”, and another expands on a point I heard Thomas Lee make at one of his PowerShell camps. In PowerShell you can pipe a folder object, or a process object, between steps and get its name from a .name property. Without this model, as one of the quotes puts it:

The decades-old Unix “pipe” model is just plain dumb, because it forces you to think of everything as serializable text, even things that are fundamentally not text, or that are not sensibly serializable

(DOS copied the Unix pipe model, and it’s still in Windows – away from PowerShell.)
Thomas had a great term for de-serializing text: “prayer-based parsing”. Every time you extract the bits you want from the text, you need to pray that it (still) works with your parsing rules. Some of the more arcane switches for command-line tools (on whatever OS) exist to control their output – with one eye on simplifying the job of parsing it.

Some tasks – like “screen scraping” – leave no alternative but to parse text. (“A text-only ‘pipe’ is just automated screen scraping” is another way to describe the problem PowerShell’s pipeline solves.) Regular expressions are the big hammer for advanced parsing (some of the “quotes” article gets into whether in Perl they are the most suitable hammer, or just the most obvious one), and they’re something which PowerShell handles deftly with the -match and -replace operators.

I recently wanted to take a list of companies offering training which I had found on the web, and transform it into a more database-like format. In my screen-scraped list each company had an entry which looked like this:

Contoso Ltd S-1234
123 Station Road
Aberdeen
AB12 3DE
info@Contoso.com
http://www.contoso.com
Ph: (44) 123-456789
Fax: (44) 123-456790

Sign Up for a Online Course

approx: 0.8km / 0.5mi from London, GB, United Kingdom

I copied the list, opened a new tab in the PowerShell ISE and created a small snippet of PowerShell:
$list = @"

"@

The @” … “@ defines a multi-line “here-string” – so I can paste the data in between the quotes, hit the Run Script button and, ta-da, my list is in a PowerShell variable. Studying the format, all the data I want is on consecutive lines – with a double line break after them. The -replace operator can remove the single line breaks, leaving the double ones, to group the text I want onto one line:
$list -replace "\r\n(?=\w)", ", "
\r and \n indicate the return and newline characters, so -replace "\r\n", ", " would replace all line breaks with a comma and a space.
The (?= ) construction specifies a “look ahead”: it says “match ONLY if what you have found is followed by…”, and I want the replacement to happen only if the return/newline is followed by a ‘word’ character (\w) – so a line break followed by another line break won’t be replaced.

Now I can split my list – which is still a single giant string – into multiple strings. Where I had two line breaks I now have one line break, a comma and a space, so I can use that as the expression for the -split operator:
($list -replace "\r\n(?=\w)", ", ") -split "\r\n, "

So far so good, but this will also give me the distances and the invitation to sign up for a course. I want to discard any lines which don’t contain a company’s ID number:

($list -replace "\r\n(?=\w)", ", ") -split "\r\n, " |
    foreach { if ($_ -match "\ss-\d+\s*,") {'"' + $_ + '"'}}

This looks at each line and, if it contains a space, ‘S’, ‘-’, at least one digit, any number of spaces (including zero) and then a comma, it returns the line wrapped in double quotes. So now my text is one line per company, looking like this:

“Contoso Ltd S-1234, 123 Station Road, Aberdeen, AB12 3DE, info@Contoso.com, http://www.contoso.com, Ph: (44) 123-456789 , Fax: (44) 123-456790”

My next step is to replace the space before the ID number with ‘ “,” ’ 

(($list -replace "`r`n(?=\w)", ", ") -split "`r`n, " |
      foreach { if ($_ -match "\ss-\d+\s*,") {'"' + $_ + '"'}}
) -replace "\s+(?=S-\d+\s*,)", '","'

"\s+(?=S-\d+\s*,)" says “match on one or more spaces ONLY IF  followed by S, –, at least one digit, any spaces (or none) and then a comma.”  Wherever that Occurs I insert “,” breaking my line into two quoted strings separated by a comma – just like a CSV file.  The next step is to make it three quoted strings by replacing the coma and spaces after the ID number with the same ‘ “,” ’  combination

(($list -replace "`r`n(?=\w)", ", ") -split "`r`n, " |
      foreach { if ($_ -match "\ss-\d+\s*,") {'"' + $_ + '"'}}
) -replace "\s+(?=S-\d+\s*,)", '","'  -replace "(?<=S-\d+),\s*",'","'

"(?<=S-\d+),\s*" uses  (?<=   ) which is the look-behind construction, saying  “match on comma and any spaces (or none) ONLY IF it is preceded by ‘S’, ‘–’ and at least one digit”.  Now my line looks like this

“Contoso Ltd”,”S-1234”,”123 Station Road, Aberdeen, AB12 3DE, info@Contoso.com, http://www.contoso.com, Ph: (44) 123-456789 , Fax: (44) 123-456790″

The next task is to put the same ‘ “,” ’ combination before the email address:

(($list -replace "`r`n(?=\w)", ", ") -split "`r`n, " |
foreach { if ($_ -match "\ss-\d+\s*,") {'"' + $_ + '"'}}
) -replace "\s+(?=S-\d+\s*,)", '","' -replace "(?<=S-\d+),\s*",'","' `
–replace ",\s* (?=\w+@\w+\.\w+)" ,'","'

",\s* (?=\w+@\w+\.\w+)" says “match on comma and any spaces ONLY IF followed by at least one character , an @ sign, at least one character , a dot, and at least one character.”
Then I can use  ",\s*(?=http)" which matches on comma and any spaces ONLY IF followed by http and finally "\s*,\s* Ph:\*" and "\s*,\s*Fax:\s* " match on the ph and fax tags and surrounding spaces to put “,” before each of those.

Instead of using a different -replace operation for each step, the terms can be condensed into a single expression with | for “or” between each part. You can see that building the expression up bit by bit is a lot easier than writing it in one go.

(($list -replace "`r`n(?=\w)", ", ") -split "`r`n, " |
      foreach { if ($_ -match "\ss-\d+\s*,") {'"' + $_ + '"'}}
) -replace "\s+(?=S-\d+\s*,)|(?<=S-\d+),\s*|, (?=\w+@\w+\.\w+)|,\s* (?=http)|\s*,\s*Ph:\s*|\s*,\s*Fax:\s*" , '","'

This turns my text into

“Contoso Ltd”,”S-1234″,”123 Station Road, Aberdeen, AB12 3DE”, “info@Contoso.com”,”http://www.contoso.com”,”(44) 123-456789″,”(44) 123-456790″

There is potentially more cleaning-up I could do – for example identifying lines without email or URL sections and inserting a blank “” in their place. But this gives me all I need: by writing a header row to a file and adding my text to it I would get a ready-made CSV file for Excel. I could have turned it into a bulk database import, or used PowerShell’s ConvertFrom-Csv cmdlet to turn each row into an object, or any number of other things. I don’t think I would have tried typing that line in one go, but the habit PowerShell induces of building up these long lines a little at a time meant it only took a couple of minutes to do.
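
As a final sketch, turning the result into objects (or a file that Excel will open) only needs that header row – assuming the seven columns shown in the example line above, and with $companyLines standing in for the output of the expression:

# $companyLines holds the quoted, comma-separated company lines built above (hypothetical variable)
$header = '"Company","ID","Address","Email","Web","Phone","Fax"'
# one object per company, then a ready-made CSV file for Excel
@($header) + $companyLines | ConvertFrom-Csv | Export-Csv -Path Companies.csv -NoTypeInformation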
