James O'Neill's Blog

July 28, 2009

Update on Windows 7-E / IE 8

Filed under: Uncategorized — jamesone111 @ 9:14 am

Many thoughts have been expressed about the situation with Windows 7 and Internet Explorer in Europe, but relatively few of them on Microsoft blogs. No-one wants to be the idiot who prejudices a legal case, so anything I (or other bloggers) might be thinking of saying needs to be run past the legal folks first: few of us think a blog post is worth bothering them for*. The latest official statement is on the Microsoft PressPass site; it contains a new proposal, so you might find it interesting reading. I'll post something about the practical implications of whatever is finally agreed, but until that point is reached, I'll stick to signposting what is officially sanctioned.

Please note that reader comments to this post would technically be published by Microsoft, so I won't approve any.


* Before anyone feels there is a “Microsoft censors its bloggers” story here, I don’t feel censored. No-one has stopped me doing anything, I’ve just been told what sort of care needs to be taken and why.

Update: Someone asked me about the footnote. I picked up the story from a post Mary Jo Foley made on Twitter. Saying "I'm passing this link on, but can't add to it" reminded me of this post I made a while back. I found it an interesting one to go back to – especially when I followed the first couple of links.

This post originally appeared on my technet blog.

July 25, 2009

Vulcan hunting: a mini case study in social media

Filed under: General musings,Photography — jamesone111 @ 8:00 am

I've described some of my activities over recent weekends as the biggest hunt for a Vulcan since Star Trek III – The Search for Spock. The Vulcan I'm after isn't the pointy-eared kind but XH558, the only flying example left of the V-bombers. It's very easy to talk a lot of tosh about beautiful machines of various kinds, but big delta-winged aircraft do have a certain something… Concorde always made people stop and look, and the Vulcan has been doing the same since it first flew in the 1950s. The Vulcan set a record for the longest bombing mission in history when one bombed the airfield at Port Stanley during the Falklands war in 1982. The task sounds crazy: "Chaps, we've got an aircraft which entered service in the 1950s and is due to be decommissioned. We'd like you to use it to bomb a runway, which is defended but we don't know with what exactly. The good news is we've got you a base within 4,000 miles of the target. The rest of the news isn't so good: the aircraft's navigation system is based on terrain-mapping radar, and those 4,000 miles are over featureless ocean, but before you worry about finding the target you'll need to figure out how to get this aircraft to do air-to-air refuelling. We need this … well, as soon as possible really". They did it of course, and it's been chronicled in at least one book.

The RAF kept a Vulcan flying for display purposes up till 1992 – ten years after it was meant to have come out of service. Enthusiasts wanted to keep it flying, and when it was retired it was flown to Bruntingthorpe in Leicestershire. Getting it back in the sky made getting to the Falklands seem like a walk in the park: it took 14 years. Even then it won't fly forever: the engines have a very short life – they are rated in cycles, so the translation to hours is approximate. There are two sets of engines available and they will last about 200 hours each – I don't reboot my PC as frequently as once every 200 hours. The Vulcan to the Sky Trust wants to spin out the 400 hours they have over 10 years, so they have 40 hours a year to maximize the chances people have to see the aircraft, and that will be it.

The Trust has two linked problems. Like any charity its main problem is raising money. People only give in a crisis, and when the crisis is averted the money dries up till the next crisis comes. The other is making sure people who want to see the aircraft get a chance to do so. Of course the more people who see it, the more donors they get, and if it doesn't show up when expected they alienate those donors. So XH558 is now registered on Twitter. This has been a help for me since it has been spending the air-show season 20 miles or so from where I live. The first time Twitter worked for me, I was standing in a field hoping to catch the take-off. Late in the morning I checked Twitter on my phone: "Planned take off 14:00, return 16:00". I had to be home for 14:00, so that cut a lot of wasted time, and I came back to see it land. Next morning: "Taking off 14:30 …" back I go – knowing I have to collect my children later. 14:30 comes and goes. At 14:45 I checked Twitter on the phone. "Working on a problem" it said, so I went to pick up the children, and found later that the problem wasn't fixed in time for the day's display – flight cancelled, and again I'd been saved a pointless wait. When XH558 finally did take off and do a practice display run a few days later, I was there thanks to another tweet.

Think about this – if you're trying to build a community around what you are doing, a few moments here and there to connect with your customers can produce some real results. Does it work for generating donations? I can't extrapolate to everyone, but they've got some money out of me. And here are the pictures I got – not just of the Vulcan but of the other planes which came and went while I was waiting – produced with Microsoft Research AutoCollage.


This post originally appeared on my technet blog.

July 24, 2009

Seventh Heaven

Filed under: General musings,Photography — jamesone111 @ 8:40 pm

As I mentioned recently I have bought a new camera – the Pentax K7. As a proper photographer I'm bothered more by lenses than camera bodies, and last year I acquired Pentax's beautiful 77mm Limited series. All those 7s and a new version of Windows… so I picked up my compact (a 7MP one) and took a photo to have in the Windows 7 screenshot below. Quite by coincidence it is one of 77 pictures in the folder, and the lens is focused at about 7m. Instead of doing this post on a Friday I should have waited till the weekend – not for the 7th day of the week, but because I will be flying to the US – you guessed it, on a Boeing 777 with Air Canada. Sadly, the great circle route doesn't quite take me past seventy degrees north. Or perhaps I should have shot it at 7 minutes past 7 o'clock…

Editing a 7MP image of a Pentax K7 with 77 mm lens, one of 77 pictures in a Windows 7 folder.


This post originally appeared on my technet blog.

PowerShell on-line help: A change you should make for V2 (#3) (and How to Author MAML help files for PowerShell)

Filed under: How to,Powershell — jamesone111 @ 4:39 pm

In the last couple of "change you should make" posts I've talked about a couple of things which turn functions from being the poor relation of compiled cmdlets (as they were in PowerShell V1) into first-class citizens under V2. Originally the term "script cmdlets" was used, but now we call them "advanced functions". This is quite an important psychological difference, because writing a cmdlet – even a "script cmdlet" – sounds like difficult, real programming. On the other hand an advanced function sounds like a function with some extra bits bolted on, and nowhere near as daunting.

In V2 there is a new version of the Tab-Expansion function which completes names of functions and fills in parameters, and that doesn't need any change to your code. In part 2 of this "change you should make" series I talked about the better discoverability which comes from putting your functions into modules – simply type get-command –module <name> and you get the list. And in part 1 I talked about the fact that we get support for the important common parameters – –whatif and –confirm, for example. So we can find advanced functions, and we can test to see if they are going to trash the system before we run them for real. What's the one great thing about PowerShell that's missing? Consistent help everywhere!

In V2 you can put in a comment which actually defines help information, like this:

Filter Get-VMThumbnail
{<#
    .SYNOPSIS
        Creates a JPG image of a running VM
    .PARAMETER VM
        The Virtual Machine
    .PARAMETER Width
        The width of the image in pixels (default 800)
    .PARAMETER Height
        The height of the image in pixels (default 600)
    .PARAMETER Path
        The path to save the image to; if no name is specified the VM name is used.
        If no directory is included the current one is used.
    .EXAMPLE
        Get-VMJPEG core
        Gets an 800x600 JPEG for the machine named core, and writes it as core.jpg
        in the current folder.
    .EXAMPLE
        While ($true) { Get-VMJPEG -vm "core" -w 640 -h 480 `
                        -path ((get-date).toLongTimeString().replace(":","-") + ".JPG")
                        Sleep -Seconds 10 }
        Creates a loop which continues until interrupted; in the loop it creates an image
        of the VM "core" with a name based on the current time, then waits 10 seconds
        and repeats.
#>
  # (body of the function goes here)
}


This is all very fine and good – and there are other fields which you can read about in the on-line help. As far as I can tell there are just two things wrong with this:

  1. It needs a fair amount of processing to extract the help into a document as the basis of a manual.

  2. It adds a lot of bulk to a script – in fact most of my functions will be shorter than their help. I don't think this is truly harmful, but I like my scripts compact.

Fortunately both of these are solved by using external help files – which are in XML format. The help declaration is simple enough.

Function Get-VMThumbnail
{ # .ExternalHelp MAML-VM.XML
  # (body of the function goes here)
}

The XML schema is known as MAML and isn’t the nicest thing I’ve ever worked with….
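To give a flavour of why: here is roughly what a single command entry looks like. This is an abbreviated sketch – the real schema has many more elements per command – but the root tag and the namespace declarations are the ones PowerShell V2 expects, as far as I can tell:

```xml
<helpItems xmlns="http://msh" schema="maml">
  <command:command xmlns:maml="http://schemas.microsoft.com/maml/2004/10"
                   xmlns:command="http://schemas.microsoft.com/maml/dev/command/2004/10"
                   xmlns:dev="http://schemas.microsoft.com/maml/dev/2004/10">
    <command:details>
      <command:name>Get-VMThumbnail</command:name>
      <maml:description>
        <maml:para>Creates a JPG image of a running VM</maml:para>
      </maml:description>
    </command:details>
    <!-- syntax, parameters and examples blocks follow, each in its own namespace -->
  </command:command>
</helpItems>
```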

A little while ago I was filling in my year-end review, and inside Microsoft we do this using an InfoPath form. I could write all I knew about InfoPath on the back of a postage stamp: you fill in forms which come out as XML. Could InfoPath give me an easier way to get MAML files? As it turns out the answer was "yes", and I've learnt enough InfoPath to make half a blog post. It turns out that InfoPath can read in an existing data file to deduce the XML structure you want to have as the result of filling in the form: once it has the structure you can drag the document elements around to make a form.

It's not the most beautiful bit of forms design ever – and the XML does need a change to move between editing in InfoPath and using for PowerShell help. But compared with putting the XML together in Notepad… well, there is no comparison. I've attached the InfoPath form for those who want it. Filling it in is pretty much self-explanatory, with the exception of parameter sets. Each parameter is entered in its own right and at least one (default) parameter set is defined. The set names are actually displayed as the command when the help is displayed, so the set name needs to be the command name for every set.

I could script the change required, but I don't seem to be able to manage it in InfoPath. At the top of the file there is an opening <HelpItems> XML tag; to work as PowerShell help this needs to contain xmlns="http://msh" and schema="maml". InfoPath considers the file to be invalid if the first of these is present, so you need to take the file out of InfoPath to edit it.

One final warning when you're developing the XML help: once PowerShell has loaded the XML file it doesn't seem to want to unload it, so you have to start another instance of the shell. Using the graphical ISE version of PowerShell I found you can just press Ctrl+T to get a new instance of the shell in its own tab.

This post originally appeared on my technet blog.

A tale of two codecs. Or how not to be a standard.

Filed under: General musings,Photography,Windows 7,Windows Vista — jamesone111 @ 12:16 pm

I’ve just bought a new digital SLR camera. Being a dyed in the wool Pentax person, I’ve upgraded to their new K7.

Being fairly serious about (some of) my photography, I shoot quite a lot in RAW format. (In case you didn't know, higher-end digital cameras can save the data as it comes off the sensor without converting it to JPEG format.) There are only a small number of ways of expressing RAW data, but every camera maker embeds one of those methods into their own file format: then each new camera introduces a new sub-version of the format. This is, frankly, a right pain.

Adobe came up with an answer to this: Digital Negative format, DNG. It has been adopted, but not widely. Pentax were first to support it in parallel with their own PEF format; heavyweights like Hasselblad and Leica support it, as do some models from Casio, Ricoh and Samsung. But Canon and Nikon, who account for somewhere round 3/4 of all DSLR sales, have stuck with their own formats. Adobe maintain a converter which takes proprietary files and converts them to DNG, so if you have an application which supports DNG but not your specific camera, Adobe's tool will bridge the gap. So the take-up in photo-processing software has been quite good. My chosen RAW software, Capture One, needs an update to work with the latest PEF, but will take DNG files straight from the camera. And I'd switch the camera over from PEF to DNG format if it weren't for the vexing matter of codecs.

Before Windows Vista shipped we introduced "Windows Imaging Components", WIC, which provide RAW file support using imaging codecs (COmpressor/DECompressors). Windows 7 and Vista include WIC, and it's WIC which provides image preview in Explorer: the net effect is that if you have a suitable codec you get image preview. But only a very basic set of codecs ships with the OS, partly because of the maintenance headache and partly because some RAW processing requires a bit of reverse engineering, and we try to avoid doing that. Camera vendors provide codecs, and Pentax had a new PEF codec on-line by the time I got my K7 home. But this is 32-bit only – other camera makers also lack 64-bit support. I could take this as inspiration for a huge rant, but let's just say I'd make it a requirement for 32- AND 64-bit Windows to be able to preview a camera's files before it was granted the "Certified for Vista" logo – which the K7 sports on its packaging. Perhaps it's good for our partnerships that I don't decide such things.

I was on 64-bit Vista and I'm now on 64-bit Windows 7, so you might think the 32-bit codec would be totally useless… but no. A 32-bit codec won't work with 64-bit software, like Windows Explorer. But it will work with a 32-bit program like Windows Live Photo Gallery (Photo Gallery from Vista has been moved over to Windows Live). Since WLPG shares a thumbnail cache with Explorer, anything which you have seen in the Gallery will get a thumbnail in Explorer. Now, granted, this is a kludge, but there are worse ones out in the world – so I can see my PEFs. But using PEF format means I need to use the (less than great) bundled RAW software until Capture One supports the revised PEF. If I want to use Capture One today, I need to use DNG. So do Adobe have a DNG codec? They do, but their web site has (unanswered) complaints about the lack of 64-bit support going back to May of last year. Unlike the Pentax codec, the Adobe one catches that I am on 64-bit Windows 7 and tells me it only installs on 32-bit Vista. [Users with the Windows Imaging Components installed on XP are out of luck too.]

It's a pretty poor show on Adobe's part, but it's easy to see how this comes about. None of the camera vendors see it as their job to write a codec for DNG – especially as Adobe have started the process. Microsoft don't write codecs except for major standards like JPG, PNG and TIFF and our own formats like Windows Media Photo: DNG doesn't have enough of a foothold to be classed as a major standard. Adobe – I suspect – must feel that too many people are not pulling their weight, expecting them to do all the work. It's perhaps unfair to draw a parallel with our support for Linux in the virtualization world (which I have only just written about) – after all, it is in our interest to get our virtualization platform adopted, and Adobe aren't disadvantaged if people don't choose to adopt DNG. But it needs a bit more commitment to get something adopted than Adobe are showing. If you were a product planner at Canon or Nikon, would you write DNG support into the spec for future models? Or would you decide that the support for DNG was half-baked and you'd leave it as "something to keep an eye on" for now?

In researching this I had a look at Microsoft's Pro Photo web site, which is worth a visit just for the "Icons of imaging" page if you haven't been there before. The downloads page does feature a 3rd-party codec for DNG, which I must investigate. Sadly it's not free: it's not that I begrudge the money, but if I have to pay even a token amount to get something which is bundled with something I have bought, and is supposed to be a standard, working in all the places I'd expect it to work, then how much of a standard is it? I could level the same charge at Adobe over PDF iFilters and preview – but as I've written before, Foxit Software plugs the gaps, and is free – reinforcing the idea that PDF is a standard which is bigger than the company which devised it. I'd love to think DNG would do for RAW formats what PDF has done for documents, but sadly it doesn't look like it will go that way.

This post originally appeared on my technet blog.

July 23, 2009

Oink, flap – Microsoft releases software under GPL – oink, flap

Mary Jo has a post about the release of our Hyper-V drivers for Linux entitled Pigs are flying low: Why Microsoft open-sourced its Linux drivers. It's one of many out there, but the title caught my eye: I thought I'd give a little of my perspective on this unusual release. News of it reached me through one of those "go ahead and share this" mails earlier this week, which began

NEWSFLASH: Microsoft contributes code to the Linux kernel under GPL v2.
Read it again – Microsoft contributes code to the Linux kernel under GPL v2!
Today is a day that will forever be remembered at Microsoft.

Well indeed… but hang on just a moment: we're supposed to "hate" the GPL, aren't we? And we're not exactly big supporters of Linux… are we? So what gives? Let's get the GPL thing out of the way first.

For as long as I can remember I've thought (and so has Microsoft) that whoever writes a book or piece of software, or paints a picture, or takes a photo, should have the absolute right to decide its fate. [Though the responsibilities that come with a large share of an important market apply various IFs and BUTs to this principle.] Here in the UK that's what comes through in the Copyright, Designs and Patents Act, and I frequently find myself explaining to photographers that the act tilts things in their favour far more than they expect. Having created a work, you get the choice whether to sell it, give it away, publish the source code, whatever. The GPL breaks that principle by saying, in effect, "if you take what I have given away, and build something around it, you must give your work away too and force others to give their work away ad infinitum"; it requires an author of a derivative work to surrender rights they would normally have. The GPL community would typically say: don't create derivative works based on theirs if you want those rights. Some in that community – it's hard to know how many, because they are its noisiest members – argue for a world where there is no concept of intellectual property (would they argue you could come into my garden and take the vegetables that stem from my physical work? Because they do argue that you can just take the product of my mental work). Others argue for very short protection under copyright and patent laws: ironically a licence (including the GPL) only applies for the term of copyright; after that, others can incorporate a work into something which is treated as wholly their own. However we should be clear that GPL and Open Source are not synonyms (and Mary Jo wasn't treating them as such in her title).
Open source is one perfectly valid way for people to distribute their works – we want Open Source developers to write for Windows, and as I like to point out to people, this little project here means I am an Open Source developer and proud of it. However I don't interfere with the rights of others who re-use my code, because it goes out under the Microsoft Public License: some may think it ironic that it is the Microsoft licence which gives people freedom, while those who make most noise about "free software" push a licence that constrains people.

What are we doing? We have released the Linux Integration Components for Hyper-V under a GPL v2 license, and the synthetic drivers have been submitted to the Linux kernel community for inclusion in upcoming versions of the Linux kernel. The code is being integrated into the Linux kernel tree via the Linux Driver Project, which is a team of Linux developers that develops and maintains drivers in the Linux kernel. We worked very closely with Greg Kroah-Hartman to integrate our Linux ICs into the Linux kernel. We will continue to develop the Integration Components, and as we do we will contribute the code to the drivers that are part of the kernel.
What is the result? The drivers will be available to anyone running an appropriate Linux kernel, and we hope that various Linux distributions will make them available to their customers through their releases.
WHY? It's very simple. Every vendor would like their share of the market to come from customers who used only their technology; no interoperability would be needed. But in the real world, real customers run a mixture. Making the Linux side of those customers' lives unnecessarily awkward just makes them miserable without getting more sales for Microsoft. Regulators will say that if you make life tough enough, it will get you more sales, but interoperability is not driven by some high-minded ideal – unless you count customer satisfaction, which to my way of thinking is just good business sense. Accepting that customers aren't exclusive makes it easier for them to put a greater proportion of their business your way. So: we are committed to making Hyper-V the virtualization platform of choice, and that means working to give a good experience with Linux workloads. We'd prefer that to happen all by itself, but it won't: we need to do work to ensure it happens. We haven't become fans of the GPL: everything I wrote above about the GPL still holds. Using it for one piece of software is the price of admission to the distributions we need to be in, in order to deliver that good experience. Well… so be it. Or put another way, the principle of helping customers to do more business with you trumps other principles.
Does this mean we are supporting all Linux distributions? Today we distribute Integration Components for SLES 10 SP2. Our next release will add support for SLES 11 and Red Hat Enterprise Linux (5.2 and 5.3). If you want to split hairs we don't "support" SLES or RHEL – but we have support arrangements with Red Hat and Novell to allow customers to be supported seamlessly. The reason for being pedantic about that point is that a customer's ability to open a support case with Microsoft over something which involves something written by someone else depends on those arrangements being in place. It's impossible to say which vendors we'll have agreements with in future (if we said who we were negotiating with it would have all kinds of knock-on effects, so those discussions aren't even disclosed inside the companies involved). Where we haven't arranged support with a vendor we can only give limited advice from first principles about their product, so outside of generic problems which would apply to any OS, customers will still need to work with the vendors of those distributions for support.

You can read the press release or watch the Channel 9 Video for more information.

This post originally appeared on my technet blog.

July 22, 2009

Release the Windows 7!

Filed under: Beta Products,Windows 7,Windows Server,Windows Server 2008-R2 — jamesone111 @ 10:03 pm

It’s official. Windows 7 has released to manufacturing. http://www.microsoft.com/Presspass/press/2009/jul09/07-22Windows7RTMPR.mspx 

It’s official. Windows Server 2008 R2 has released to Manufacturing http://blogs.technet.com/windowsserver/archive/2009/07/22/windows-server-2008-r2-rtm.aspx

It’s official. Hyper-V server R2 has released to manufacturing http://blogs.technet.com/virtualization/archive/2009/07/22/windows-server-2008-r2-hyper-v-server-2008-r2-rtm.aspx 

When will you be able to get your hands on it? http://windowsteamblog.com/blogs/windows7/archive/2009/07/21/when-will-you-get-windows-7-rtm.aspx 


Woo hoo !

This post originally appeared on my technet blog.

PowerShell Modules: A change you should make for V2. (#2)

Filed under: How to,Powershell — jamesone111 @ 4:11 pm

A few days back I wrote about PowerShell version 2's ability to confirm whether it should be changing something. Since I was writing something which would make some pretty drastic changes, supporting –WhatIf and –Confirm for almost no effort seemed like a huge win.

The next thing I wanted to cover was modules. I’ve written some quite large function libraries in PowerShell V1 and I met a few problems which V2 solves by use of modules.

  1. Collaboration. One script with 100 functions isn’t easy to collaborate on. A module lets you load multiple script files as one, but different people can work on each. 
  2. Loading of formatting and type extensions – these XML files can be loaded from scripts, but when they are loaded more than once things get untidy. Modules can load them along with the code.
  3. While we’re on code, modules allow the loading of a mixture of script files and DLLs in one go. Module DLLs don’t need to be registered as snap-ins do, so deployment is easier.
  4. Loading at all: an environment variable defines module paths, and you use import-module <name>. Educating people on dot sourcing was a pain (and part of the reason for having a single monolithic file).
  5. Discovery: finding all the functions in a script needed some creativity. Now you can just do get-command –module <name>.
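Points 4 and 5 boil down to just a couple of commands. As a quick sketch (the module name here is hypothetical):

```powershell
# Modules are found via the folders listed in $env:PSModulePath
Import-Module configurator            # load everything the module provides
Get-Command -Module configurator      # discover the functions it contains
Remove-Module configurator            # unload the lot again
```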

You can turn a script into a module simply by creating a manifest file, which is a text file with a .psd1 extension. At its simplest it looks like this:

@{ ModuleVersion     = "1.0.0"
    NestedModules     = "Helper.ps1" }

But there is no reason why there should only be one nested module. So to collaborate with different people owning different functions, you just have a long list in the manifest. Here’s the (only slightly edited) manifest for a project which I’m just about to publish.

@{ ModuleVersion     = "1.0.0"
    NestedModules     = "Firewall.ps1" , "Helper.ps1" , "Licensing.ps1",
                        "network.ps1" , "menu.ps1" , "Remote.ps1",
                        "WinConfig.ps1", "windowsUpdate.ps1", "WinFeatures.ps1"

    GUID              = "{75c6f959-23a1-4673-8ee9-e61e21ff8381}"
    Author            = "James O'Neill"
    CompanyName       = "Microsoft Corporation"
    Copyright         = "© Microsoft Corporation 2009. All rights reserved."
    PowerShellVersion = "2.0"
    CLRVersion        = "2.0"
    FormatsToProcess  = "Config.format.ps1xml"
}


If my manifest file is named Configurator.psd1, all I need to do is create a Configurator sub-folder in one of the folders pointed to by the environment variable PSModulePath, and then I can load it with import-module configurator. Of course different people can be working on Firewall.ps1 and Licensing.ps1 – collaboration problem solved. I can get rid of the functions if I want to with remove-module configurator. And if I reload them, the fact that I am loading a .format.ps1xml file for the second time isn't a problem, as it would be when loading it from a script. No need to dot source, and the functions are discoverable with get-command –module configurator.

As you can see there are quite a few things you can add to the manifest file over and above the basics – things like FormatsToProcess and TypesToProcess – so to make it easier to build the file there is a New-ModuleManifest cmdlet. There is plenty more to read about modules, but for starters look at this post of Oisin's on Module Manifests.
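For example, a call along these lines would build a manifest much like the one above (the path and values are just illustrative; the parameter names are the cmdlet's own, and as far as I can tell the cmdlet prompts for any fields you leave out):

```powershell
New-ModuleManifest -Path ".\configurator\configurator.psd1" `
    -ModuleVersion "1.0.0" `
    -NestedModules "Firewall.ps1","Helper.ps1" `
    -FormatsToProcess "Config.format.ps1xml" `
    -Author "James O'Neill"
```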

This post originally appeared on my technet blog.

How to activate Windows from a script (even remotely).

I have been working on some PowerShell recently to handle the initial setup of a new machine, and I wanted to add activation. If you do this from a command line you usually use the Software Licensing Manager script (slmgr.vbs), but this is just a wrapper around a couple of WMI objects which are documented on MSDN, so I thought I would have a try at calling them from PowerShell. Before you make use of the code below, please understand it has had only token testing and comes with absolutely no warranty whatsoever; you may find it a useful worked example, but you assume all responsibility for any damage that results to your system. If you're happy with that, read on.

So first, here is a function which could be written as one line to get the status of Windows licensing. This relies on the SoftwareLicensingProduct WMI object: the Windows OS will have something set in the PartialProductKey field, and the ApplicationID is a known GUID. Having fetched the right object(s) it outputs the name and the status for each – translating the status ID to text using a hash table.

$licenseStatus = @{0="Unlicensed"; 1="Licensed"; 2="OOBGrace"; 3="OOTGrace";
                   4="NonGenuineGrace"; 5="Notification"; 6="ExtendedGrace"}

Function Get-Registration
{ Param ($server=".")
  get-wmiObject -query "SELECT * FROM SoftwareLicensingProduct WHERE PartialProductKey <> null
                        AND ApplicationId='55c92734-d682-4d71-983e-d6ec3f16059f'
                        AND LicenseIsAddon=False" -Computername $server |
  foreach {"Product: {0} — Licence status: {1}" -f $_.name , $licenseStatus[[int]$_.LicenseStatus]}
}


On my Windows 7 machine this comes back with Product: Windows(R) 7, Ultimate edition — Licence status: Licensed

On one of my server machines the OS was in the "Notification" state, meaning it keeps popping up the notice that I might be the victim of counterfeiting (all Microsoft shareholders are… but that's not what it means – we found a large proportion of counterfeit Windows had been sold to people as genuine). So the next step was to write something to register the computer. To add a licence key takes 3 lines: get a WMI object, call its InstallProductKey method, and then call its RefreshLicenseStatus method. (Note for speakers of British English: in the API it is License with an S, even though we keep that for the verb and licence with a C for the noun.) To activate, we get a different object (technically there might be multiple objects) and call its Activate method. Refreshing the licensing status system-wide and then checking the LicenseStatus property for the object indicates what has happened. Easy stuff, so here's the function.

Function Register-Computer
{  [CmdletBinding(SupportsShouldProcess=$True)]
   param ([parameter()][ValidateScript({$_ -match "^\S{5}-\S{5}-\S{5}-\S{5}-\S{5}$"})][String]$Productkey ,
          [String]$Server="." )

   $objService = get-wmiObject -query "select * from SoftwareLicensingService" -computername $server
   if ($ProductKey) {
       if ($psCmdlet.shouldProcess($Server , "Install product key")) {
           $objService.InstallProductKey($ProductKey) | out-null
           $objService.RefreshLicenseStatus()         | out-null }
   }
   get-wmiObject -query "SELECT * FROM SoftwareLicensingProduct WHERE PartialProductKey <> null
                         AND ApplicationId='55c92734-d682-4d71-983e-d6ec3f16059f'
                         AND LicenseIsAddon=False" -Computername $server |
     foreach-object {
       if ($psCmdlet.shouldProcess($_.name , "Activate product")) {
           $_.Activate()                      | out-null
           $objService.RefreshLicenseStatus() | out-null
           if ($_.LicenseStatus -eq 1) {write-verbose "Product activated successfully."}
           else {write-error ("Activation failed, and the license state is '{0}'" -f
                              $licenseStatus[[int]$_.LicenseStatus])}
       }
     }
}

Things to note

  • I’ve taken advantage of PowerShell V2’s ability to include validation code as a part of the declaration of a parameter.
  • As mentioned before, it's really good to use the ShouldProcess feature of V2, so I've done that too.
  • Finally, since this is WMI it can be remoted to any computer. So the function takes a Server parameter to allow machines to be remotely activated.
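Putting the two functions together, a remote run looks something like this – the server name and product key below are placeholders, and –WhatIf lets you check what would happen before committing:

```powershell
Get-Registration -Server "myserver"
Register-Computer -Server "myserver" `
    -ProductKey "XXXXX-XXXXX-XXXXX-XXXXX-XXXXX" -WhatIf
# Happy with the -WhatIf output? Run again without it, adding -Verbose
# to see the "Product activated successfully." confirmation.
```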

A few minutes later windows detected the change and here is the result.



This post originally appeared on my technet blog.

July 7, 2009

Michael Jackson memorial concert

Filed under: Uncategorized — jamesone111 @ 8:22 am

I've had a "please blog this" email: most of these get deleted on sight, but because of the unusual circumstances I'm posting this one. I'm not going to add any commentary on the music, life and/or death of Michael Jackson: it's all been said.

July 7th, 10AM PT, 1PM ET [that is 6PM UK, 7PM Central Europe] – We are broadcasting the Michael Jackson memorial live in HD from the Staples Center in Los Angeles, using IIS Smooth Streaming and Silverlight, to the world. As you can imagine this has come together fast and we need to get the word out… http://inmusic.ca/news_and_features/Michael_Jackson is the link to the page.



This post originally appeared on my technet blog.

July 5, 2009

Tennis as a way of testing search.

Filed under: Uncategorized — jamesone111 @ 7:54 pm

Like a lot of people who wouldn't really call themselves tennis fans, I was watching the Wimbledon final with that very long last set. I wondered what the longest set at Wimbledon was. A quick tap into my favourite search engine pulled up this page on Wikipedia.


What impressed me was that someone was updating the games count as the match progressed. Of course, because I used Bing I already had the answer: it was 26–28 in a doubles match.


Click for full size version

One of the things I'm liking about Bing is the way it displays results with a little "hover for a précis" beside them. So I wondered what Google would get.


Hmmm. Looks like Google got the wrong answer. Not sufficient to overturn your search-engine preference overnight perhaps, but if you haven't yet taken a good look at Bing then you might be missing a trick.

This post originally appeared on my technet blog.

Blog at WordPress.com.