Archives for category: PowerShell

I’ve recently been working a lot with cryptographic hashes, and it’s sometimes been useful to be able to check results or hash strings on my own while debugging or diagnosing something.

I started doing this so much that I ended up creating a few Windows PowerShell functions to deal with it:

function convert-tobinhex($array) {
   $str = new-object system.text.stringbuilder
   $array | %{
      [void]$str.Append($_.ToString('x2'));
   }
   return $str.ToString()
}
function convert-frombinhex([string]$binhex) {
   $arr = new-object byte[] ($binhex.Length/2)
   for ( $i=0; $i -lt $arr.Length; $i++ ) {
      $arr[$i] = [Convert]::ToByte($binhex.substring($i*2,2), 16)
   }
   return $arr
}
function get-hash($value, $hashalgo = 'MD5') {
   $tohash = $value
   if ( $value -is [string] ) {
      $tohash = [text.encoding]::UTF8.GetBytes($value)
   }
   $hash = [security.cryptography.hashalgorithm]::Create($hashalgo)
   return convert-tobinhex($hash.ComputeHash($tohash));
}

Not very efficient, but it does the trick.
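A quick usage sketch: the first digest below is the well-known MD5 of the UTF-8 string 'hello world', so it doubles as a sanity check:

get-hash 'hello world'           # 5eb63bbbe01eeed093cb22bb8f5acdc3
get-hash 'hello world' 'SHA1'    # same input, hashed with SHA-1 instead
convert-frombinhex '48656c6c6f'  # back to a byte array (the ASCII bytes of 'Hello')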

In the latest entry in the official PowerShell Blog, Jeffrey Snover leaves a little tidbit about a change coming on the next CTP of PowerShell V2:

The cmdlet keyword is going away and we'll just have function. Notice that now you can specify the [Parameter()] attribute on parameters.  When you do that, we treat the function like a cmdlet.

I don’t mind getting rid of the cmdlet keyword, honestly; but is the new syntax that much of an improvement? So now, instead of an explicit keyword telling you whether something is a function or a cmdlet, you have an attribute sprinkled around the function parameters? I’m not sure whether that’s good or bad, but at first glance it sounds confusing to me.

What rubs me the wrong way, though, is the [Parameter] attribute itself. I’m OK with using an attribute to specify the metadata associated with each parameter, but the whole Param + [Parameter] combination just reads as redundant and ugly:

function Emit-XML {
   Param ([Parameter(Mandatory=$true,ValueFromPipeline=$true)]$object)
   ...
}

Can anyone more familiar with this chime in? Am I the only one slightly bothered by this?

I tend to fire up Windows Explorer instances from my prompt very often, either on the current directory or in other spots. I was pretty used to the “explorer /e,<path>” command from CMD.EXE, but I had never figured out exactly how to use it in PowerShell and have it work reliably: paths with spaces always caused it to open the default documents folder instead.

Finally sat down and played with it until I got it right:

# open explorer in this directory
function exp([string] $loc = '.') {
   explorer "/e,`"$loc`""
}
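For example (the paths are just illustrative):

exp                      # Explorer rooted at the current directory
exp 'C:\Program Files'   # a path with spaces now works too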

Silly little helper PowerShell function that I’ve been finding useful lately when doing some debugging:

# get our own process information
function get-myprocess {
   [diagnostics.process]::GetCurrentProcess()
}
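Since it returns a regular System.Diagnostics.Process object, all the usual properties are there to poke at; for instance:

(get-myprocess).Id                   # current process id
(get-myprocess).WorkingSet64 / 1MB   # working set, in megabytes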

Neil Houghton asked a couple of days ago if I could share my PowerShell prompt() function. Here's what my prompt looks like:

[screenshot: my PowerShell prompt]

I tried to keep my prompt relatively short while still keeping everything on a single line. There are two things I care about in my prompt: the machine name I'm working on (useful when I have VMs opened) and the current path, in abbreviated form.

Thus, my prompt() function looks like this:

function prompt {
   # our theme
   $cdelim = [ConsoleColor]::DarkCyan
   $chost = [ConsoleColor]::Green
   $cloc = [ConsoleColor]::Cyan

   write-host "$([char]0x0A7) " -n -f $cloc
   write-host ([net.dns]::GetHostName()) -n -f $chost
   write-host ' {' -n -f $cdelim
   write-host (shorten-path (pwd).Path) -n -f $cloc
   write-host '}' -n -f $cdelim
   return ' '
}

The abbreviation of the current directory is partially inspired by Unix (use ~ if it's under $HOME) and partially by how GVim shortens paths for its tab captions:

[screenshot: GVim shortening paths in its tab captions]

Here's the function that takes care of this:

function shorten-path([string] $path) {
   $loc = $path.Replace($HOME, '~')
   # strip the provider prefix (shows up for UNC paths)
   $loc = $loc -replace '^[^:]+::', ''
   # make path shorter like tabs in Vim,
   # handle paths starting with \\ and . correctly
   return ($loc -replace '\\(\.?)([^\\])[^\\]*(?=\\)','\$1$2')
}
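To illustrate the effect with a couple of made-up paths (I traced these through the regex, but treat them as a sketch):

shorten-path 'C:\projects\website\source'   # C:\p\w\source
shorten-path '~\documents\notes'            # ~\d\notes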

One thing to keep in mind regarding shorten-path: I do a very simple replace of $HOME with ~. The reason it works correctly is that I ensure in my profile script that the $HOME variable holds a fully qualified path, resolved with the resolve-path command. I also change what the home directory is under PowerShell using a trick I've described previously.


I thought it would be fun to bring up a list of some small things I love about using PowerShell as my default command line / scripting tool on the Windows platform. Without further introduction, here they are:

Switching to Network Shares

One of the most annoying limitations of the old cmd.exe was that it provided almost no support for network shares that were not mapped as drives.

The PowerShell provider model, fortunately, didn't make this mistake, so you can use set-location to switch to a network share and execute commands there just like in any other local folder:

[screenshot: switching to a network share with set-location]
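A minimal sketch, with a hypothetical server and share name:

set-location \\fileserver\builds   # jump straight to a UNC share
get-childitem                      # and work there like in any other folder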

Location Stacks

Like other shells, PowerShell supports pushing folders into a stack and then popping them back out, making it a lot easier to navigate around a few folders without having to remember the complete paths each time. You manipulate the location stack using the push-location and pop-location commands, which are aliased to pushd and popd respectively.

I find this very useful when doing builds or when working with Vim and some scripting language.

The PowerShell implementation, however, has a couple of advantages:

  1. It works on the provider model, so you can push/pop more than just file system locations. For example, you could use the location stack to move around the registry.
  2. It supports multiple stacks: You can keep multiple, named location stacks at the same time by simply using the -stackName parameter: 

    [screenshot: using the -stackName parameter]

The contents of the location stacks are stored as part of your PowerShell Session State, and you can see the contents of each stack by using the get-location command:

[screenshot: listing location stacks with get-location]

Notice that the default stack will always have an empty string ('') as its name.
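Here's a quick sketch of named stacks in action (the locations are just examples):

push-location C:\Windows -stackname sys
push-location HKLM:\Software -stackname reg   # a registry location, on its own stack
get-location -stackname sys                   # peek at a named stack's contents
pop-location -stackname sys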

It's .NET-based

The fact that PowerShell is based on the .NET Framework can be confusing to some people due to some of the underlying complexity of the framework leaking to the shell (like value types).

However, for a developer like me, this is a great thing. It means that many of the things I used to create small C# apps for, like quickly trying out a framework class or method, I can now try interactively in the shell. It also means that a lot of functionality is already built and right at my fingertips, ready to be used very quickly.

Here's an example: I was recently working on a new Virtual Machine I had just set up and needed to generate a new GUID. I'm very used to going to the command line and running the old uuidgen.exe utility to do this, but, unfortunately, I had not installed the Windows SDK on the VM yet and was not planning to. With PowerShell, though, this was not a problem:

# uuidgen.exe replacement
function uuidgen {
   [guid]::NewGuid().ToString('d')
}

Shell Introspection

Another aspect that I really enjoy about PowerShell is the built-in introspection features. It's a lot easier to learn the shell when all the information about what's available and the objects you're manipulating is easily accessible.

Here are some of the introspection features I like:

  • The get-command command. Great way to see what commands are available. Coupled with get-help, it's a really nice way to figure something out.
  • The built-in providers. One really nice feature of PowerShell is that functions, aliases and variables are all exposed through providers, meaning you can list and examine them to your heart's content using the standard location commands like get-childitem. Just try running "ls function:" or "ls variable:" next time.
  • The $MyInvocation, $ExecutionState and $Host variables give you access to plenty of information about the execution environment, as well as lots of options for manipulating it.
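A few sketches of these in action:

get-command -noun item        # which commands deal with items?
ls function: | select Name    # every function defined in the session
ls variable:                  # every variable, along with its value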

No need for Calc.exe

I used to constantly start calc.exe to do a quick calculation. Since PowerShell is a full scripting language, however, I now do most of those quick calcs by directly entering expressions into the PowerShell prompt.

It's a heck of a lot easier and very convenient for me since I always keep around a PowerShell prompt opened and ready to go.
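For example:

(1024 * 768 * 4) / 1MB   # 3 -- the built-in size constants come in very handy
[math]::Sqrt(2)          # and the whole .NET Math class is a few keystrokes away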


I've mentioned in the past that I've experimented with a number of different options for styling and syntax highlighting code snippets on this weblog, with varying degrees of success.

For a few months I've been using CSS-based syntax highlighting based on the code generated by Vim's :TOhtml command, and it's been working out pretty well. It means that my RSS feed doesn't have any color, but at least the code itself is correctly formatted.

The part I didn't enjoy too much about :TOhtml, however, was that it would only generate the minimum set of style rules it needed to highlight the piece of code it was converting. This works fine in many scenarios, but it was annoying for me because it meant I had to continually update my site's code.css file as new rules were generated for snippets using highlights I didn't have in there before.

So what I really wanted was a way to convert the Vim colorscheme of my choice into a CSS useful for my site. Naturally, I turned to PowerShell and came up with convert-vim2css.ps1:

param( [string] $vimfile )

# some instructions we don't care for
$ignorable = ( 'link', 'clear' )

$nrx = 'hi (?<n>\w+)'
$fgrx = 'guifg=(?<n>#\w+)'
$bgrx = 'guibg=(?<n>#\w+)'
$frx = 'gui=(?<n>\S+)'

(gc $vimfile) | ?{
   ($_ -match $nrx) -and ($ignorable -notcontains $matches.n)
} | %{
   if ( $matches.n -eq 'Normal' ) {
      write '.codebg {'
      write '   border-left: solid 1em #303030;'
      write '   font-size: 1.1em;'
      write '   padding: 0.8em 0.5em;'
   } else {
      write ".$($matches.n) {"
   }
   if ( $_ -match $fgrx ) {
      write "   color: $($matches.n);"
   }
   if ( $_ -match $bgrx ) {
      write "   background: $($matches.n);"
   }
   # the element could carry any combination of these
   if ( $_ -match $frx ) {
      switch ( $matches.n.split(',') ) {
         "italic" { write "   font-style: $_;" }
         "bold" { write "   font-weight: $_;" }
         "underline" { write "   text-decoration: $_;" }
      }
   }
   write '}'
}

# other boilerplate code
write 'code {'
write '   font-family: Consolas, "DejaVu Sans Mono", "Lucida Console", monospace; '
write '}'

It's not very fancy and it only supports GVim schemes, but that's enough for me and it does the trick for now.


Here's a sample PowerShell script/functions to start/stop BizTalk orchestrations. This is an extended version of the Stop-Orchestration VBScript included in the BizTalk 2006 SDK, which I hope someone finds useful :-).

The script can be used to start or stop either a specific orchestration or a group of orchestrations defined in a BizTalk assembly. For example, to stop and unenlist all orchestrations in a given assembly, you could use this:

stop-orch -assembly 'MyProject.BizTalk, Version=1.0.0.0, Culture=neutral, PublicKeyToken=50b7b2906e3f8aa5' -unenlist

Here's the code for the script:

$script:bound = 2
$script:started = 4
$script:controlRecvLoc = 2
$script:controlInst = 2

function script:get-assemblyfilter([string]$assembly) {
   # The BizTalk WMI provider uses separate properties for each
   # part of the assembly name, so break it up to make it easier to handle
   $parts = $assembly.Split((',', '='))
   $filter = "AssemblyName='$($parts[0])'"
   if ( $parts.Count -gt 1 ) {
      for ( $i=1; $i -lt $parts.Count; $i += 2 ) {
         $filter = "$filter and Assembly$($parts[$i].trim())='$($parts[$i+1])'"
      }
   }
   return $filter
}
function script:find-orch([string]$name, [string]$assembly) {
   # We want to be able to find orchestrations by
   # name and/or assembly. That way we can control
   # all orchestrations in a single assembly in one call
   $filter = ""
   if ( ![String]::IsNullOrEmpty($name) ) {
      $filter = "Name='$name'"
      if ( ![String]::IsNullOrEmpty($assembly) ) {
         $filter = "$filter and $(get-assemblyfilter $assembly)"
      }
   } else {
      $filter = $(get-assemblyfilter $assembly)
   }
   get-wmiobject MSBTS_Orchestration `
      -namespace 'root\MicrosoftBizTalkServer' `
      -filter $filter
}

function start-orch([string]$name, [string]$assembly) {
   $orch = (find-orch $name $assembly)
   $orch | ?{ $_.OrchestrationStatus -eq $bound } | %{
      write-host "Enlisting $($_.Name)..."
      [void]$_.Enlist()
   }
   $orch | ?{ $_.OrchestrationStatus -ne $started } | %{
      write-host "Starting $($_.Name)..."
      [void]$_.Start($controlRecvLoc, $controlInst)
   }
}

function stop-orch([string]$name, [string]$assembly, [switch]$unenlist = $false) {
   $orch = (find-orch $name $assembly)
   $orch | ?{ $_.OrchestrationStatus -eq $started } | %{
      write-host "Stopping $($_.Name)..."
      [void]$_.Stop($controlRecvLoc, $controlInst)
      if ( $unenlist ) {
         [void]$_.Unenlist()
      }
   }
}
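For completeness, starting everything back up looks much the same (the assembly and orchestration names here are hypothetical):

start-orch -assembly 'MyProject.BizTalk, Version=1.0.0.0, Culture=neutral, PublicKeyToken=50b7b2906e3f8aa5'
start-orch -name 'MyProject.BizTalk.ProcessOrder'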

I just took a few moments to update some of my old posts on administering BizTalk Server using PowerShell:

Here are some of the changes:

  • Reformatted code samples so that they were nicely readable using the current blog theme
  • Fixed some code that still used InvokeMethod() on WMI objects. This was the big change, since the old code was written with one of the PowerShell betas and didn't work anymore on RTM.

If anyone notices any other problems with the samples, let me know!


Yesterday I needed to generate a bunch of small files to use as input for testing, as I needed to reproduce a bug I was tracking down. More to the point, I needed to generate 2000 files of size 781 bytes.

Update: I screwed up the code snippets on my first attempt. Fixed now!

Naturally, I turned to PowerShell and whipped up this script:

$fc = new-object string ('a', 781)
1..2000 | %{ [io.file]::WriteAllText("$(pwd)\$_.txt", $fc) }
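Since WriteAllText() writes UTF-8 without a byte-order mark, each 'a' takes exactly one byte, and a quick check confirms the size:

(get-item 1.txt).Length   # 781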

Is this the only way to do it? Certainly not, but a couple of things are worth mentioning about my specific solution. A savvy reader might ask: why didn't you just use the redirection operator instead?

1..2000 | %{ $fc > "$_.txt" }

That is indeed shorter, but it has one major drawback for my problem: when writing to files, the redirection operator defaults to UTF-16LE encoding, which meant my files would come out the wrong size.

Now, the redirection operator in this case is nothing more than a way to implicitly call the out-file cmdlet, which does provide a way to select the encoding:

1..2000 | %{ $fc | out-file "$_.txt" -enc ascii }

This is much better but, unfortunately, it still screws up the file size, because out-file adds a CR LF pair at the end. And it isn't remarkably shorter than my original solution using File::WriteAllText().
