James O'Neill's Blog

February 25, 2010

Retirement Planning (for service packs)

Yesterday I wrote about end-of-life planning for OSes, so it makes sense to talk about the end of a service pack as retirement – it is, after all, the word used on the product lifecycle pages. Of course we don’t mean retirement in the go-and-live-by-the-seaside sense…



Special police squads — BLADE RUNNER UNITS — had orders to shoot to kill, upon detection, any trespassing Replicants.


This was not called execution. It was called retirement.


that sense. Service packs, like OSes (and replicants), get an end date set well in advance. Having explained OSes, I want to move on to service packs (and if you want to know about Replicants you’ll have to look elsewhere).


The rule for service packs is simple: two years after the release of a service pack we stop supporting the previous version. So although Windows Vista will be in mainstream support until 2012, and extended support until 2017, that doesn’t mean you can run the initial release, or Service Pack 1, and be supported until then. Let’s use Vista as a worked example – I explained yesterday:


Windows Vista [had] a General Availability date [of] Jan 2007. For Vista, five years after GA will be later than 2 years after Windows 7, so Vista goes from mainstream to extended support in or shortly after January 2012. We’ve set the date: April 10th 2012. The end of extended support will depend on when the next version of Windows ships, but it won’t be before April 11th 2017.


Service pack 1 for Vista became available in April 2008, and Service Pack 2 became available in April 2009.
So, the life of the original Release to Manufacturing (RTM) version of Windows Vista ends on April 14 2010.
In the same way the life of Vista SP1 should end in April 2011; in practice, because we don’t retire things on the exact anniversary, SP1 gets an extension until July 12 2011.


If you are on Vista you must have upgraded to SP1 or SP2 (or Windows 7) by April 14 if you want to continue being supported.


So here’s the summary for what is supported with Vista, and when


Jan ‘07 – April ‘08   Only the RTM release available

April ‘08 – April ‘09  RTM and Service Pack 1 supported

April ‘09 – April ‘10  RTM, Service Pack 1 and Service Pack 2 supported

April ‘10 – July ‘11   Service Pack 1 and Service Pack 2 supported

July ‘11 – April ‘12   Service Pack 2 only supported

April ‘12 – April ‘17  Extended support phase, on SP2 only


To simplify things, that assumes there is no Service pack 3 for Windows Vista, and that the successor to Windows 7 ships before April 11 2015.



Vista SP1 coincided with the release of Server 2008, and Windows XP Service Pack 3 came very shortly afterwards. The extra few days mean the anniversary for XP SP2 falls after the cut-off date for April retirement, so the end of life for XP SP2 is July 13th 2010 (the same day as the Windows 2000 Professional and Server editions). Mainstream support for Windows XP (all service packs) has ended; after July 13th, XP is extended support ONLY, on SP3 ONLY.


I should have included in yesterday’s post that July 13th 2010 also marks the end of mainstream support for Server 2003 (and Server 2003 R2); the RTM and SP1 versions are already retired. It would be very unusual to see a new service pack for something in extended support. If you still have 2003 servers, you need to decide what you will do about support / upgrades before July 13th.
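If you want a quick inventory of which service pack level your machines report before any of these dates bite, WMI will tell you. Here is a minimal sketch of my own (not from any official guidance) that works locally or against a remote machine name:

# Quick check of OS version and service pack level, locally or remotely.
# -ComputerName is optional; "." means the local machine.
Function Get-ServicePackLevel
{ Param ($ComputerName = ".")
  Get-WmiObject -Class Win32_OperatingSystem -ComputerName $ComputerName |
    ForEach-Object { "{0} : {1} (Service Pack {2})" -f $_.CSName, $_.Caption, $_.ServicePackMajorVersion }
}

# For example:
# Get-ServicePackLevel                            # the machine you are on
# Get-ServicePackLevel -ComputerName "Server01"   # a remote machine (name is illustrative)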


Server 2008 shipped as SP1 to sync up with Windows Vista, and SP2 for both came out on the same date, so there are no server service pack actions required until July 12 2011. I explained yesterday why I have sympathy with people who don’t plan, but if you are on Server 2008 SP1 don’t leave it till the last minute to choose between SP2 and upgrading to R2 – and then to implement your choice.


Update – Fixed a few typos. 

This post originally appeared on my technet blog.

February 24, 2010

End of life planning.

Filed under: Windows 7,Windows Server,Windows Vista,Windows XP — jamesone111 @ 4:57 pm

No. I’m not talking about sorting out the music for one’s funeral* …

I think every manager I have had in my 10 years at Microsoft has grumbled that I’m not great with planning – it’s a fair criticism and I try to work on it. When the subject comes up a quote from a book by William Gibson comes into my head: “I try to plan in your sense of the word, but that isn’t my basic mode, really. I improvise. It’s my greatest talent. I prefer situations to plans, you see…. Really, I’ve had to deal with givens.” The speaker is actually an artificial intelligence, but I think that is how a lot of IT people work: improvise, deal with the situation at hand, then deal with the next one. It may be what we prefer – but be it training plans or plans for rolling out new software, you’ve got to do it.

We do try to help on the software side, by being both transparent and predictable. The rule for core things (like desktop and server operating systems) is at least 10 years of support. (Embedded operating systems have a different support model which runs for longer).
Mainstream support runs for 5 years from release, OR until 2 years after the successor product releases, whichever is later. Extended support then runs for a further 5 years, or until 2 years after the second successor product releases, again whichever is later. After that, those who can’t move forwards but have deep pockets have the option of custom support. In order to be supported you have to be running a supported level of service pack, and I’ll cover that in a later post.

So let’s take a worked example.

  *  Windows 2000 professional’s General availability date was March 2000.

  *  The “n+1” release is Windows XP, which had a General availability date of December 2001.

  *  Two years after Windows XP would be December 2003, less than the 5 year minimum, so mainstream support for Windows 2000 runs to March 2005, when extended support begins. (In practice mainstream support ran on to June 2005 – products only go off the support list on particular days, so they live on until the next one after the anniversary.)

  *  The “n+2” release is Windows Vista, which had a General Availability date of Jan 2007.

  *  Two years after Vista would be Jan 2009, again less than the 5 year minimum, so extended support runs to June 2010. Again there is a few days’ extension.

So the cut off date for Windows 2000 professional is July 13th 2010. After that there will be custom support only for 2000 and if you are still running it you should understand that means we stop the routine distribution of security updates for it. 

As it happens the cut-off date for Windows 2000 Server mainstream support was 2 years after the release of Server 2003 – putting it in May 2005 – so 2000 Professional and Server sync’d up. The 2 year point after Server 2008 and the 5 years of extended support take it to the same time, June 2010. So the cut-off date for Windows 2000 Server is also July 13th 2010.

I like to think that no-one reading this blog would still be running Windows 2000, but I know a good many are still running Windows XP. So let’s carve the dates on XP’s tombstone:

5 years after XP’s GA date would be December 2006, but Vista had not shipped by then. So Mainstream support for XP ends two years after the GA date of Vista which takes us to Jan 2009 (In practice it was April 2009). Unless you have taken out a contract for extended support, you have only been getting security updates for XP since then.

5 Years after that is April 2014. Windows 7 had a GA date of October 2009, so 2 years on from there would be sooner. Extended support for XP ends on the later of the two dates, so April 2014.

For Vista, five years after GA will be later than 2 years after Windows 7, so Vista goes from mainstream to extended support in or shortly after January 2012. We’ve set the date: April 10th 2012. The end of extended support will depend on when the next version of Windows ships, but it won’t be before April 11th 2017. Both dates for Windows 7 depend on future versions of Windows, but won’t be sooner than January 13th 2015 and January 14th 2020. Put them in your diary now, with a reminder a long time in advance 🙂
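If you prefer the rule as code rather than prose, here is a rough PowerShell sketch of the calculation – my own illustration, not anything official. The real retirement date then rolls forward to the next scheduled retirement day after the anniversary, so always treat the lifecycle page as authoritative:

# Sketch of the rule: mainstream = later of (GA + 5 years) and (successor GA + 2 years);
# extended  = later of (end of mainstream + 5 years) and (second successor GA + 2 years).
Function Get-SupportDates
{ Param ([datetime]$GA, [datetime]$SuccessorGA, [datetime]$SecondSuccessorGA)
  $mainstreamEnd = ($GA.AddYears(5), $SuccessorGA.AddYears(2) | Sort-Object)[-1]
  $extendedEnd   = ($mainstreamEnd.AddYears(5), $SecondSuccessorGA.AddYears(2) | Sort-Object)[-1]
  New-Object PSObject -Property @{MainstreamEnds = $mainstreamEnd; ExtendedEnds = $extendedEnd}
}

# Windows XP as the worked example above: GA Dec 2001, Vista Jan 2007, Windows 7 Oct 2009
Get-SupportDates -GA "2001-12-01" -SuccessorGA "2007-01-01" -SecondSuccessorGA "2009-10-01"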

You can get all the dates from the Product lifecycle page


* Strange Angels by Laurie Anderson if you must know.


This post originally appeared on my technet blog.

February 8, 2010

Installing Windows from a phone

Arthur : “You mean you can see into my mind ?”
Marvin: “Yes.”
Arthur: “And … ?”
Marvin: “It amazes me how you manage to live in anything that small”

Looking back down the recent posts you might notice that this is the 8th in a row about my new phone (so it’s obviously made something of an impression), this one brings the series to a close.

I’ve said already that I bought a 16GB memory card for the new phone, which is a lot – I had 1GB before, so… what will I do with all that space? I’m not going to use it for video, and 16GB is room for something like 250 hours of MP3s or 500 hours of WMAs: I own roughly 200 albums, so it’s a fair bet they’d fit. Photos – well, maybe I’d keep a few hundred MB on the phone. In any event, I don’t want to fill the card completely. After a trip out with no card in my camera I keep an SD-USB card adapter on my key-ring so I always have both a USB stick and a memory card: currently this uses my old micro-SD card in a full-size SD adapter. If I need more than 1GB I can whip the card out of the phone, pop it in the adapter and keep shooting.

However, the phone has a mass storage device mode, so I thought to myself: why not copy the Windows installation files to it, and see if I can boot a machine off it and install Windows from the phone? That way one could avoid carrying a lot of setup disks.
Here’s how I got on.

This post originally appeared on my technet blog.

September 14, 2009

How to view RAW image files on Windows 7 (and Windows Vista).

Filed under: Photography,Windows Server,Windows Vista — jamesone111 @ 4:09 pm

My photography posts appear to be a bit like buses. I don’t make one for a while, then two come together …


Some while back I wrote a tale of two codecs, bemoaning the patchy support for RAW files. Basically we (Microsoft) don’t provide codecs for anything other than JPG, TIF, PNG and our Windows Media formats. Everything else is down to whoever is responsible for the format showing a bit of leadership. Pentax fell a bit short with the codec for their PEF format – no 64 bit support. Still, a 32 bit codec works in 32 bit apps – like Windows Live Photo Gallery – and if one of those previews the image and creates the thumbnail, it then shows up in Explorer. At least Pentax’s codec will install: they support Adobe’s DNG format as an alternative, and Adobe’s rather old beta codec won’t install on 64 bit Windows 7. I discovered Ardfry’s codec for DNG, which is pretty good, though not free.


Putting QuickTime player onto my rebuilt PC, I find that it has partial codec support for Windows – i.e. some MOV files can be played in Windows Media Player and show a thumbnail in Explorer, and some can’t (it appears the ones that can use H.264 video and the ones that can’t are CinePak or Sorenson). Before I had a chance to get the latest build from Ardfry, someone sent me a link to this page of codecs from Axel Rietschin Software Developments. I’ve only installed and tested the 64 bit PEF and DNG ones, but the initial impression is very good indeed. The only gripe is that there doesn’t seem to be a way for a codec to return the metadata from the picture but tell Windows “for this format the metadata is read-only” – with both Axel’s and Ardfry’s codecs you can enter new data, only to get an error when Windows tries to save it.


The full list of supported formats is as follows.


Adobe Digital Negative (*.dng  )
Canon Raw Image  (*.cr2, *.crw )
Fuji Raw Image (*.raf)
Hasselblad Raw Image (*.3pr, *.fff)
Kodak Raw Image (*.dcr, *.kdc )
Leica Raw Image (*.raw, *.rwl)
Minolta Raw Image (*.mrw)
Nikon Raw Image (*.nef, *.nrw )
Olympus Raw Image (*.orf)
Panasonic Raw Image (*.rw2)
Pentax Raw Image (*.pef)
Sony Raw Image (*.arw, *.sr2, *.srf)


A nice bonus is that these were created to support Fast Image Viewer, which I hadn’t come across before: this supports tethered shooting on Cameras with PTP support (like my new Pentax K7). I’m going to give this a try and I’ll hand over the small pile of pennies required if it works. Update there are different levels of PTP support, and the K7 doesn’t do what I need it to. Sigh.


This post originally appeared on my technet blog.

July 22, 2009

Release the Windows 7 !

Filed under: Beta Products,Windows 7,Windows Server,Windows Server 2008-R2 — jamesone111 @ 10:03 pm

It’s official. Windows 7 has released to manufacturing. http://www.microsoft.com/Presspass/press/2009/jul09/07-22Windows7RTMPR.mspx 

It’s official. Windows Server 2008 R2 has released to Manufacturing http://blogs.technet.com/windowsserver/archive/2009/07/22/windows-server-2008-r2-rtm.aspx

It’s official. Hyper-V server R2 has released to manufacturing http://blogs.technet.com/virtualization/archive/2009/07/22/windows-server-2008-r2-hyper-v-server-2008-r2-rtm.aspx 

When will you be able to get your hands on it? http://windowsteamblog.com/blogs/windows7/archive/2009/07/21/when-will-you-get-windows-7-rtm.aspx 

 

Woo hoo !

This post originally appeared on my technet blog.

How to activate Windows from a script (even remotely).

I have been working on some PowerShell recently to handle the initial setup of a new machine, and I wanted to add activation. If you do this from a command line it is usually done using the Software Licensing Manager script (slMgr.vbs), but this is just a wrapper around a couple of WMI objects which are documented on MSDN, so I thought I would have a try at calling them from PowerShell. Before you make use of the code below, please understand it has had only token testing and comes with absolutely no warranty whatsoever; you may find it a useful worked example, but you assume all responsibility for any damage that results to your system. If you’re happy with that, read on.  


So first, here is a function which could be written as one line to get the status of Windows licensing. This relies on the SoftwareLicensingProduct WMI object: the Windows OS will have something set in the partial product key field, and the ApplicationID is a known GUID. Having fetched the right object(s), it outputs the name and the status for each – translating the status ID to text using a hash table.

$licenseStatus=@{0="Unlicensed"; 1="Licensed"; 2="OOBGrace"; 3="OOTGrace";
                 4="NonGenuineGrace"; 5="Notification"; 6="ExtendedGrace"}

Function Get-Registration
{ Param ($server=".")
  get-wmiObject -query "SELECT * FROM SoftwareLicensingProduct WHERE PartialProductKey <> null
                        AND ApplicationId='55c92734-d682-4d71-983e-d6ec3f16059f'
                        AND LicenseIsAddon=False" -Computername $server |
    foreach {"Product: {0} — Licence status: {1}" -f $_.name , $licenseStatus[[int]$_.LicenseStatus] }
}

 


On my Windows 7 machine this comes back with Product: Windows(R) 7, Ultimate edition — Licence status: Licensed


On one of my server machines the OS was in the “Notification” state, meaning it keeps popping up the notice that I might be the victim of counterfeiting (all Microsoft shareholders are… but that’s not what it means. We found a large proportion of counterfeit Windows had been sold to people as genuine.) So the next step was to write something to register the computer. To add a licence key takes 3 lines – get a WMI object, call its InstallProductKey method, and then call its RefreshLicenseStatus method. (Note for speakers of British English: it is License with an S, even though we normally keep that for the verb and Licence with a C for the noun.) To activate, we get a different object (technically there might be multiple objects) and call its Activate method. Refreshing the licensing status system-wide and then checking the LicenseStatus property for the object indicates what has happened. Easy stuff, so here’s the function.

Function Register-Computer
{ [CmdletBinding(SupportsShouldProcess=$True)]
  param ([parameter()][ValidateScript({ $_ -match "^\S{5}-\S{5}-\S{5}-\S{5}-\S{5}$"})][String]$Productkey ,
         [String]$Server="." )

  $objService = get-wmiObject -query "select * from SoftwareLicensingService" -computername $server
  if ($ProductKey) {
      # $lStr_* variables are message strings defined elsewhere in my library
      If ($psCmdlet.shouldProcess($Server , $lStr_RegistrationSetKey)) {
          $objService.InstallProductKey($ProductKey) | out-null
          $objService.RefreshLicenseStatus()         | out-null
      }
  }
  get-wmiObject -query "SELECT * FROM SoftwareLicensingProduct WHERE PartialProductKey <> null
                        AND ApplicationId='55c92734-d682-4d71-983e-d6ec3f16059f'
                        AND LicenseIsAddon=False" -Computername $server |
    foreach-object {
        If ($psCmdlet.shouldProcess($_.name , "Activate product" )) {
            $_.Activate()                      | out-null
            $objService.RefreshLicenseStatus() | out-null
            $_.get()
            If   ($_.LicenseStatus -eq 1) {write-verbose "Product activated successfully."}
            Else {write-error ("Activation failed, and the license state is '{0}'" -f $licenseStatus[[int]$_.LicenseStatus] ) }
            If   (-not $_.LicenseIsAddon) { return }
        }
        else { write-Host ($lStr_RegistrationState -f $licenseStatus[[int]$_.LicenseStatus]) }
    }
}


Things to note



  • I’ve taken advantage of PowerShell V2’s ability to include validation code as part of the declaration of a parameter.
  • As I mentioned before, it’s really good to use the SHOULD PROCESS feature of V2, so I’ve done that too.
  • Finally, since this is WMI it can be remoted to any computer, so the function takes a Server parameter to allow machines to be remotely activated – there’s a quick usage sketch below.
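For what it’s worth, a typical call looks something like this (the server name and product key below are placeholders, not real values):

# Check the current state of a remote machine, then install a key and activate it.
# "Server01" and the key are hypothetical placeholders.
Get-Registration -Server "Server01"
Register-Computer -Server "Server01" -ProductKey "XXXXX-XXXXX-XXXXX-XXXXX-XXXXX" -Verbose

# -WhatIf works too, because the function supports ShouldProcess:
Register-Computer -Server "Server01" -ProductKey "XXXXX-XXXXX-XXXXX-XXXXX-XXXXX" -WhatIf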

A few minutes later Windows detected the change, and here is the result.




 


This post originally appeared on my technet blog.

June 25, 2009

How to: have nicer Active Directory management from PowerShell – without upgrading AD

One of the first books I read on PowerShell had a comment about using AD from PowerShell V1 which amounted to “It’s too hard, don’t bother – use VBScript instead”. I’d taken against the book in question (no names, no pack drill) – in fact it reminded me of something Dorothy Parker is supposed to have said*: "This is not a book to be cast aside lightly, it should be hurled with great force." When I was asked to contribute to the Windows Scripting Bible (out of stock at Amazon at the time of writing!) someone had put a chapter on AD into the outline, so I had to write one. This gives me enough expertise to say it can be done, and having written scripts in VBScript to work with AD, it is easier in PowerShell – but it is ugly and not done in true PowerShell style.

All that changed when we took the covers off the beta of Windows Server 2008 R2: it has PowerShell V2 with cmdlets for Active Directory. A quick scratch of the surface revealed these work with a new web service which is (you guessed it) only in R2. This quickly led to questions about whether it would be back-ported… and I had to answer “I know customers are asking for it, but I don’t know if it will happen”. There is now a post on the AD PowerShell blog announcing the beta of a version for Windows Server 2003 and 2008.  

(Quick tip of the hat to Jonathan who tweeted this)
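Once the web service piece is installed on a 2003 or 2008 domain controller and you have the AD module on a Windows 7 / R2 box, usage looks roughly like the sketch below. This is my own illustration rather than anything from the announcement, and the OU path and domain are made up:

# Load the Active Directory module (ships with RSAT / Server 2008 R2) and query a DC
# that is running the AD web service.
Import-Module ActiveDirectory

# Find users in a (made-up) OU whose accounts are disabled
Get-ADUser -Filter 'Enabled -eq $false' -SearchBase "OU=Staff,DC=contoso,DC=com" |
    Select-Object Name, SamAccountName

# The module also gives you an AD: drive, so you can move around the directory like a file system
Set-Location AD:\
Get-ChildItem "DC=contoso,DC=com" | Select-Object -First 10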

 

 


* If in doubt about attributing quotes which don’t sound like Shakespeare or the bible, Churchill, Mark Twain or Dorothy Parker are always good bets. 

This post originally appeared on my technet blog.

February 18, 2009

How to manage the Windows firewall settings with PowerShell

I mentioned recently that I’m writing a PowerShell configuration tool for the R2 editions of Hyper-V Server and Windows Server Core. One of the key parts of that is managing the firewall settings… Now, I don’t want to plug my book too much (especially as I only wrote the PowerShell part), but I had a mail from the publisher today saying copies ship from the warehouse this week, and this code appears in the book (ISBN 9780470386804, orderable through any good bookseller).

The process is pretty simple: everything firewall-related in Server 2008 / Vista / Server 2008 R2 / Windows 7 is managed through the HNetCfg.FwPolicy2 COM object. First I define some hash tables to convert codes to meaningful text, and a function to translate network profiles to names. So on my home network

$fw=New-object -comObject HNetCfg.FwPolicy2  ;  Convert-FWProfileType $fw.CurrentProfileTypes

returns “Private”


$FWprofileTypes = @{1GB="All";1="Domain"; 2="Private"; 4="Public"}
$FwAction       = @{1="Allow"; 0="Block"}
$FwProtocols    = @{1="ICMPv4";2="IGMP";6="TCP";17="UDP";41="IPv6";43="IPv6Route"; 44="IPv6Frag";
                   47="GRE"; 58="ICMPv6";59="IPv6NoNxt";60="IPv6Opts";112="VRRP"; 113="PGM";115="L2TP";
                   "ICMPv4"=1;"IGMP"=2;"TCP"=6;"UDP"=17;"IPv6"=41;"IPv6Route"=43;"IPv6Frag"=44;"GRE"=47;
                   "ICMPv6"=58;"IPv6NoNxt"=59;"IPv6Opts"=60;"VRRP"=112; "PGM"=113;"L2TP"=115}
$FWDirection    = @{1="Inbound"; 2="Outbound"; "Inbound"=1;"Outbound"=2}

 

Function Convert-FWProfileType
{Param ($ProfileCode)
 $FWprofileTypes.keys | foreach -begin   {[String[]]$descriptions = @()} `
                                -process {if ($profileCode -bAND $_) {$descriptions += $FWProfileTypes[$_]} } `
                                -end     {$descriptions}
}


The next step is to get the general configuration of the firewall; I think my Windows 7 machine is still on the defaults, and the result looks like this

Active Profiles(s) :Private 

Network Type Firewall Enabled Block All Inbound Default In Default Out
------------ ---------------- ----------------- ---------- -----------
Domain                   True             False Block      Allow     
Private                  True             False Block      Allow     
Public                   True             False Block      Allow     

The Code looks like this 


Function Get-FirewallConfig {
  $fw=New-object -comObject HNetCfg.FwPolicy2
  "Active Profiles(s) :" + (Convert-FWProfileType $fw.CurrentProfileTypes)
  @(1,2,4) | select @{Name="Network Type"     ;expression={$fwProfileTypes[$_]}},
                    @{Name="Firewall Enabled" ;expression={$fw.FireWallEnabled($_)}},
                    @{Name="Block All Inbound";expression={$fw.BlockAllInboundTraffic($_)}},
                    @{Name="Default In"       ;expression={$FwAction[$fw.DefaultInboundAction($_)]}},
                    @{Name="Default Out"      ;expression={$FwAction[$fw.DefaultOutboundAction($_)]}}|
             Format-Table -auto
}

Finally comes the code to get the firewall rules. One slight pain here is that the text is often returned as a pointer to a resource in a DLL, so it takes a little trial and error to find grouping information.
The other thing to note is that a change to a rule takes effect immediately, so you can enable a group of rules as easily as:

Get-FireWallRule -grouping "@FirewallAPI.dll,-29752" | foreach-object {$_.enabled = $true}

 

Function Get-FireWallRule
{Param ($Name, $Direction, $Enabled, $Protocol, $profile, $action, $grouping)
 $Rules=(New-object -comObject HNetCfg.FwPolicy2).rules
 If ($name)      {$rules = $rules | where-object {$_.name       -like $name}}
 If ($direction) {$rules = $rules | where-object {$_.direction   -eq  $direction}}
 If ($Enabled)   {$rules = $rules | where-object {$_.Enabled     -eq  $Enabled}}
 If ($protocol)  {$rules = $rules | where-object {$_.protocol    -eq  $protocol}}
 If ($profile)   {$rules = $rules | where-object {$_.Profiles   -bAND $profile}}
 If ($Action)    {$rules = $rules | where-object {$_.Action      -eq  $Action}}
 If ($Grouping)  {$rules = $rules | where-object {$_.Grouping   -Like $Grouping}}
 $rules
}

Since the rules aren’t the easiest thing to read, I usually pipe the output into Format-Table, for example:

Get-FireWallRule -enabled $true | sort direction,applicationName,name |
            format-table -wrap -autosize -property Name,
                         @{Label="Action"   ;expression={$Fwaction[$_.action]}},
                         @{Label="Direction";expression={$fwdirection[$_.direction]}},
                         @{Label="Protocol" ;expression={$FwProtocols[$_.protocol]}}, localPorts, applicationname

 

Last, but not least: if you want to create a rule from scratch you create a rule object with New-Object -comObject HNetCfg.FWRule, set its properties, and then pass it to the Add method of the policy object’s Rules collection. If I ever find time to finish the script it will probably have New-FirewallRule, but for now you need to write your own.
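As a rough sketch of that last point (not a finished New-FirewallRule – the rule name, group and port below are just examples), creating and adding a rule looks like this:

# Create a new inbound rule allowing TCP 8080, then add it to the live policy.
# The name, group and port are illustrative only; run elevated.
$rule = New-Object -comObject HNetCfg.FWRule
$rule.Name       = "Example - allow TCP 8080 inbound"
$rule.Grouping   = "My custom rules"
$rule.Protocol   = 6          # TCP, per the $FwProtocols table above (set before LocalPorts)
$rule.LocalPorts = "8080"
$rule.Direction  = 1          # Inbound, per $FWDirection
$rule.Action     = 1          # Allow, per $FwAction
$rule.Profiles   = 4          # Public profile only, per $FWprofileTypes
$rule.Enabled    = $true

(New-Object -comObject HNetCfg.FwPolicy2).Rules.Add($rule)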

This post originally appeared on my technet blog.

February 17, 2009

Two useful Hyper-V links

A short post by my standards

On the server core blog, Chuck has posted Top Issues for Microsoft Support for Windows Server 2008 Hyper-V , which makes an interesting read. If you do a lot of hyper-v you’ll probably stumble over one of these at some point.

On the Main Microsoft site we have Windows Server 2008 R2 & Microsoft Hyper-V Server 2008 R2 – Hyper-V Live Migration Overview & Architecture  which is exactly what it says.

This post originally appeared on my technet blog.

Support for Red Hat OSes on Microsoft Virtualization (and Vice Versa)

One of the questions which comes up on our internal distribution lists for Hyper-V is “when will such and such an OS be supported on Hyper-V”, and the somewhat frustrating response is usually of the form “We’re talking to OS vendors, but we can’t talk about contract negotiations while they are going on. As soon as we can say something we’ll do it in public”. We have to negotiate certification, support and so on, and even saying we’re talking (or not talking) to vendor X may impact what we’re doing with vendor Y. The OS which comes up most often in this context is Red Hat Enterprise Linux, and we’ve made some public announcements which are a step in this direction.

Here are key points from Red Hat’s Press Release

  • Red Hat will validate Windows Server guests to be supported on Red Hat Enterprise virtualization technologies.
  • Microsoft will validate Red Hat Enterprise Linux server guests to be supported on Windows Server Hyper-V and Microsoft Hyper-V Server.
  • Once each company completes testing, customers with valid support agreements will receive coordinated technical support for running Windows Server operating system virtualized on Red Hat Enterprise virtualization, and for running Red Hat Enterprise Linux virtualized on Windows Server Hyper-V and Microsoft Hyper-V Server

The last one is important because it means a customer with an issue can call one vendor, and if the problem appears to lie with the other vendor’s product it’s managed as one streamlined incident. Note that the work hasn’t been completed – the above is written in the future tense. According to Mike Neil’s blog post “Microsoft and Red Hat Cooperative Technical Support”, we will provide integration components for Red Hat on Hyper-V, and Red Hat will provide properly certified drivers for Windows on their virtualization stack.

Microsoft people would prefer customers only used Microsoft products, and Red Hat people would prefer customers only used Red Hat products – we sure aren’t going to stop competing. But the reality is customers use both: and both companies want their customers to have an excellent experience of their respective technologies, which means we have to cooperate as well. This is coopetition in action.

This post originally appeared on my technet blog.

February 13, 2009

A Job or two saved for my “PowerShell configurator”

Somewhere in the queue of things to post is the remainder of my PowerShell configurator for Windows Server 2008 R2 Core and Hyper-V Server R2. If you’re building a cluster, the PowerShell cmdlets for clustering make that a breeze. Of course a cluster often calls for iSCSI, and setting that up from the command line is tough, so I was going to look at doing it in PowerShell. Quick tip of the hat to Ben, who’s blogged that the iSCSI configuration UI is included in Hyper-V Server 2008 R2 – just run iSCSICpl.exe. And there is an MPIOCPL.exe for setting up Multipath IO (when it is enabled).

You can also run control.exe DateTime.cpl and control.exe intl.cpl to set time and international settings respectively. PowerShell V2 already has cmdlets Stop-Computer and Restart-Computer, plus Add-Computer (to a domain) and Rename-Computer, plus Test-WsMan and Set-WSManQuickConfig, so the number of things I need to implement is getting smaller…

This post originally appeared on my technet blog.

February 6, 2009

Virtualization road show

Earlier in the week we took the virtualization tour over the Irish Sea. Tuesday was Belfast – and with the snow, getting there was quite a challenge. I felt ill all day and didn’t think I’d delivered the content as well as I should have, but the feedback forms were really positive, better than I thought I deserved. Then it was south to Cork: I hadn’t been there before – though I want to go diving nearby – and was quite impressed with an airport which would grace a far bigger place, and with the hotel (free wifi, and a receptionist who takes a lost booking in her stride, are both guaranteed to impress). I did a better job, and again we had a really good audience; I don’t think I’ve ever had so many people from the audience thank me for the session on their way out. My Irish surname comes from many generations back, so I don’t have much of a connection with the island, but I’ve come away feeling positive from every trip I’ve made there, North or South, and I’ve volunteered to do events in either place again.

We’re getting to the end of the virtualization tour; we have dates in Scotland for March, which will probably be the last. Before that there are seats available in Northampton on 24th Feb. Northampton isn’t somewhere we’ve held events before, but it’s easy to get to. We keep sneaking new bits into each session and I’m now including demos of live migration in Server 2008 R2. Just follow the link to book your place.

This post originally appeared on my technet blog.

January 31, 2009

Checking and enabling Remote Desktop with PowerShell

A couple of posts back I mentioned that I was working on a configuration library for Server 2008 R2 Core and Hyper-V Server R2 and this includes checking and setting the configuration for remote desktop.

It turns out that this is controlled by just 2 registry entries – hence it is handled by the SCRegEdit script. One is fDenyTSConnections under ‘HKLM:\System\CurrentControlSet\Control\Terminal Server’ and the other is UserAuthentication under ‘HKLM:\System\CurrentControlSet\Control\Terminal Server\WinStations\RDP-Tcp’. If the values exist they appear as item properties in PowerShell and can be set; otherwise they have to be created. I’ve found the safest way is to try to set the value and trap the error which occurs if it doesn’t exist, then create it, specifying that it is a DWORD. So my function enables Remote Desktop UNLESS -Disable is specified, and -LowSecurity is a switch which tells it to drop the requirement for the stronger user authentication.

 

Function Set-RemoteDesktopConfig
{Param ([switch]$LowSecurity, [switch]$Disable)
 $TSPath  = 'HKLM:\System\CurrentControlSet\Control\Terminal Server'
 $RDPPath = 'HKLM:\System\CurrentControlSet\Control\Terminal Server\WinStations\RDP-Tcp'
 if ($Disable) {
       # Deny connections, and require network level authentication
       set-ItemProperty -Path $TSPath  -name "fDenyTSConnections" -Value 1 -erroraction silentlycontinue
       if (-not $?) {new-ItemProperty -Path $TSPath  -name "fDenyTSConnections" -Value 1 -PropertyType dword}
       set-ItemProperty -Path $RDPPath -name "UserAuthentication" -Value 1 -erroraction silentlycontinue
       if (-not $?) {new-ItemProperty -Path $RDPPath -name "UserAuthentication" -Value 1 -PropertyType dword}
 }
 else {
       # Allow connections ...
       set-ItemProperty -Path $TSPath  -name "fDenyTSConnections" -Value 0 -erroraction silentlycontinue
       if (-not $?) {new-ItemProperty -Path $TSPath  -name "fDenyTSConnections" -Value 0 -PropertyType dword}
       # ... and if -LowSecurity was specified, drop the requirement for network level authentication
       if ($LowSecurity) {
           set-ItemProperty -Path $RDPPath -name "UserAuthentication" -Value 0 -erroraction silentlycontinue
           if (-not $?) {new-ItemProperty -Path $RDPPath -name "UserAuthentication" -Value 0 -PropertyType dword}
       }
 }
}

Finding out what the settings are is even easier.

Function Get-RemoteDesktopConfig
{if     ((Get-ItemProperty -Path 'HKLM:\System\CurrentControlSet\Control\Terminal Server').fDenyTSConnections -eq 1)
         {"Connections not allowed"}
 elseif ((Get-ItemProperty -Path 'HKLM:\System\CurrentControlSet\Control\Terminal Server\WinStations\RDP-Tcp').UserAuthentication -eq 1)
         {"Only secure connections allowed"}
 else    {"All connections allowed"}
}

The next part of the configurator to share will be for checking and setting firewall rules.

This post originally appeared on my technet blog.

January 26, 2009

PowerShell and the smarter command line.

Filed under: Powershell,Virtualization,Windows Server,Windows Server 2008 — jamesone111 @ 2:49 pm

I mentioned I was doing some PowerShell work to manage configuration on the R2 versions of Windows Server 2008 Core and Hyper-V server (which now support PowerShell), and somebody in Redmond asked me if I knew  there were tools out there to do this job. …

I was thinking about how we find stuff, and something I wrote about taxonomy came to mind.

Store data in the data, not in the field name. Do not create a long list of properties with yes/no answers. Not only is this awkward for users, but the sequence “Relates to product A, Relates to product B” stores Yes and No as the data. A multi select box “Relates to products…” stores the information where you can search it.  (This was dealing with document properties in the first release of Sharepoint Portal Server.) 

What has this got to do with command lines? It’s about thinking about where the meaning is. I would say that it is true and self-evident that the commands we enter should reflect the task being carried out. But (pre-PowerShell at least) it doesn’t happen. For example: suppose you want to configure Automatic Updates on Server 2008. If you run the full version of Server, the option is on the front page of Server Manager. But if, instead, you run the Core version, what do you type at the command prompt? (HELP won’t give you the command.) You can trawl through all the .EXEs, .BATs, .CMDs, .VBSs, .WSFs and so on, but you won’t find anything with a name like AUCONFIG. If you’re lucky you’ve got the core guide from the Windows Server 2008 step-by-step Guides page. That will tell you that you need to run

cscript scregedit.wsf /AU 4

The part which identifies the task (set updates to automatic) isn’t the script name (SC Reg Edit) but the switches /AU and 4; if the task was to set updates to disabled, the switches would be /AU and 1. Traditional command line tools have a name which reflects what they ARE – the Server Core Registry Editor, ScRegEdit, in this case. These tools are overloaded, carrying out different tasks based on their switches (if you want a worse case, look at the Network Settings Shell – NetSh).

At the command prompt there is no way to discover the tasks you can carry out and their Command+Switch or Command+Switch+Value combinations; you have to resort to product documentation, training, or a helpful expert who already knows that /AC tells SCRegEdit you want to Allow Connections (via terminal services) but /AU sets Auto Updates (where else does 1 mean disabled?). By contrast PowerShell would have multiple commands for the different tasks – with names which reflect what they DO: “Get-UpdateConfig”, “Set-UpdateConfig”, “Get-RemoteDesktopConfig”, “Set-RemoteDesktopConfig”. The commands are easily discoverable, and having found you have Set-UpdateConfig, the tab key helps you discover switches like -Disabled, -InstallConfirmation, -DownloadConfirmation and -Automatic instead of the 1, 2, 3 and 4 used by ScRegEdit /AU.

It’s easy to see how the command line tools that we have grew up: ScRegEdit edits the registry on Server Core (hence the name), /AU sets the registry entries for Auto Update, and so on. But understanding it doesn’t remove the desire for PowerShell naming and discoverability. V2 introduces Rename-Computer and Add-Computer (if you ask “add computer to what?” the help will tell you it is to a domain or workgroup). These tasks were previously done by the NetDom program, with its switches. My aim is to add commands like “Get-UpdateConfig”, “Set-UpdateConfig”, “Get-RemoteDesktopConfig”, “Set-RemoteDesktopConfig”, “Get-IPConfig” and “Set-IPConfig” (and a few more); at their simplest these can be wrappers for SCRegEdit, NetDom, NetSh, Net and the others, but the ideal is to go straight to management objects instead.
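To make the idea concrete, here is the sort of thin wrapper I have in mind – a hypothetical sketch that just shells out to the existing scregedit.wsf rather than talking to the management objects directly, with the switch-to-number mapping as an assumption on my part:

# Hypothetical wrapper: give the scregedit.wsf /AU switch a discoverable, task-named front end.
Function Set-UpdateConfig
{ Param ([switch]$Automatic, [switch]$InstallConfirmation, [switch]$DownloadConfirmation, [switch]$Disabled)
  # 4 = fully automatic, 3 = download then confirm install, 2 = confirm before download, 1 = disabled
  if     ($Automatic)            { $au = 4 }
  elseif ($InstallConfirmation)  { $au = 3 }
  elseif ($DownloadConfirmation) { $au = 2 }
  else                           { $au = 1 }
  cscript //nologo "$env:windir\system32\scregedit.wsf" /AU $au
}

Function Get-UpdateConfig
{ cscript //nologo "$env:windir\system32\scregedit.wsf" /AU /v }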

So next I’ll look at a couple of the simpler ones.

This post originally appeared on my technet blog.

January 25, 2009

New build of my PowerShell library for Hyper-V on codeplex

Filed under: Powershell,Windows Server,Windows Server 2008 — jamesone111 @ 11:47 pm

Some naked statistics


1942 – the number of lines of code in HyperV.PS1


499 – the number of lines of formatting XML


14,381 – the number of words in the documentation file


2443 – the number of downloads of the last version


929 – the number of downloads for the versions before that


1.00 – the version number of the Library now on codeplex.


 


I’m calling this “released”. I daresay I will need to come back and do a service pack later, and the documentation must be regarded as “preliminary” but it’s done.


 


Update. A new stat. 200 – the number of downloads so far. Crumbs, that’s only a day and a half.

This post originally appeared on my technet blog.

PowerShell 2, Server Core R2 and Hyper-V server R2

Filed under: Powershell,Virtualization,Windows Server,Windows Server 2008 — jamesone111 @ 7:01 pm

With all the betas out at the moment I’ve been trying to get up to speed on Windows 7 (by using it), and on Windows Server 2008 R2 and Hyper-V Server R2 as well.


If you’re a regular reader you won’t be surprised to know that I’m excited by what’s coming in PowerShell. PowerShell V2 is better in a lot of regards, although I’m not getting the best out of it yet: quite aside from the graphical editor, the additional cmdlets, and the restored ability of Windows to drag and drop a file into a command window, it’s got better tab expansion. PowerShell has its own function you can customize for tab expansion, and there are two improvements in V2. The first is that tab expansion finds functions you have written as well as built-in cmdlets. Much typing and confusion saved there… but something I only noticed today is that it expands parameters: can’t remember if you named a parameter “-Path” or “-VHDPath”? Just type the – sign and hit [Tab] and it cycles through the list. It sounds like a feeble reason to upgrade, but the trips back to a script file to find a parameter add up to a fair old chunk of time saved.


The other thing which is big news is the ability to run PowerShell on Core and on Hyper-V Server. Unlike the full servers, PowerShell is not installed by default (I’ve no idea why it is on one and not the other). Of course you shouldn’t be going to the console of a Core / Hyper-V Server box to do admin, but the ability to “remote” PowerShell is a biggy. If PowerShell is installed AND remote management via WinRM is enabled, then you can run any PowerShell command on a box in your data centre from your desktop machine. From my point of view this is a great push for my PowerShell library for Hyper-V on Codeplex – incidentally an update has been waiting for me to put the finishing touches to it since before Christmas and should appear shortly.
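The shape of that, once WinRM is configured on the server, is roughly as follows (the machine name is made up):

# On the Core / Hyper-V Server box (one time): enable PowerShell remoting
Enable-PSRemoting -Force        # or Set-WSManQuickConfig, which is what the configurator menu drives

# Then from the desktop, run commands on the server
Invoke-Command -ComputerName "HVCore01" -ScriptBlock { Get-Process | Sort-Object CPU -Descending | Select-Object -First 5 }

# Or open an interactive session
Enter-PSSession -ComputerName "HVCore01"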


I’ve been looking at the Hyper-V configurator from R2

===============================================================================
                            Hyper-V Configuration

===============================================================================

 

1) Domain/Workgroup:                    Workgroup:  WORKGROUP

2) Computer Name:                       WIN-75FRHHINP4U

3) Network Settings

4) Add Local Administrator


5) Windows Update Settings:             Manual
6) Download and Install Updates
7) Remote Desktop:                      Enabled (all clients)
8) Failover Clustering Role             Enabled
9) Configure Remote Management


10) Regional and Language Options
11) Date and Time

12) Do not display this menu at login
13) Log Off User

14) Restart Server

15) Shut Down Server

16) Exit to Command Line

Some of the things it does are native PowerShell commands, for example V2 has commands for renaming a server, adding it to a domain, rebooting it and shutting it down

Rename-Computer -New "Core-27"

Add-Computer -DomainName "Contoso" -Reboot

Stop-Computer

Restart-Computer


are all pretty easy to understand – not many people I meet can give me the command lines for NetDom and ShutDown to do the same things. Three more commands cover the regional/language and date/time options and logging off:

control.exe "intl.cpl"

control.exe "timedate.cpl"

Logoff.exe


So what about the other options – Remote Desktop, networking, cluster installation, Windows Update (settings and download), adding local admins and configuring remote management? If NetDom and ShutDown aren’t easy to remember, NetSh, ScRegEdit, WinRM and the others are worse. Well, I’ve been coding them up, and a couple of hundred lines thrown together over the weekend do what it took a couple of thousand lines of VBScript to do. That’s not entirely a fair comparison, because the VBScripts which come with the OS are designed to be portable across languages, provide help and catch errors way beyond what I do in PowerShell.


I’ll break that code into a bunch of easy to digest posts over the next few days.

This post originally appeared on my technet blog.

January 16, 2009

Hyper-V licensing changes

Filed under: Virtualization,Windows Server,Windows Server 2008 — jamesone111 @ 8:00 am

A few days ago I wrote about our licensing for Hyper-V. The boys at VMware had picked up that we required Client Access Licences for clients of VMs running on Windows. The “if it runs on Windows it needs a CAL” philosophy has the virtue of simplicity, but it makes virtualization projects expensive if you haven’t already got Server 2008 CALs. If there was a degree of ambivalence in that post, it was because I like simplicity in licensing, but it just seemed wrong for us to ask for a CAL for a Linux client to access a Linux server.


Well… we’ve added to the IFs, ANDs or BUTs count in the product use rights. But I’m not expecting any complaints. Here’s the key part of a “customer ready” mail which we have been given to share with Volume Licence customers:


A number of trends, including consolidation and high availability, are driving more deployments and evaluations of Hyper-V. Based on feedback from our customers, we are updating our licensing policies to address these new scenarios enabled by virtualization.


Currently, if your physical server environment is running Windows Server 2003, matching version CALs are required for all users (i.e. Windows Server 2003 CALs). However, if you move your physical Windows Server 2003 Operating System Environments (OSE) to run as virtual machines hosted by Windows Server 2008 Hyper-V, Windows Server 2008 CALs are required. This is per the current use rights. With the change in our licensing policy, Windows Server 2008 CALs are no longer required if you are using Windows Server 2008 solely as a virtualization host. The only exception to this is if you are running Windows Server 2008 virtual machines, which would require Windows Server 2008 CALs.


If you would like more in-depth information on this change, please read the updated Volume Licensing Brief (note that for now these terms only cover customers with a Volume Licence).


Oh and a quick tip of the hat to Chris Wolf, who saw this coming.

This post originally appeared on my technet blog.

January 13, 2009

Fun and games with VHD files in the new OSes

Filed under: Beta Products,Windows 7,Windows Server — jamesone111 @ 9:53 pm

One of the new features for both Windows Server 2008 R2 and the Windows 7 client is support for Virtual Hard Disk files built into the OS. You can create fixed or dynamic disks and “attach” them – the tools for Hyper-V call this “mounting” a VHD, and early stuff on 7 seems to have called it “surfacing” a VHD. You need to click the image on the left to see it full size, but Disk 3 has a different coloured disk icon, and it contains the full image backup of my Vista hard disk.

DiskPart will do the job too. 

Create VDisk will set up a new VHD (help create vdisk will tell you the parameters).

Select VDisk file="<file name>" followed by Attach VDisk will bring the disk on-line.

Then you can partition it like any other disk, use it like any other disk and so on.
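If you want to script it rather than type into DiskPart interactively, you can feed the same commands in from PowerShell. A quick sketch – the file name and size are just examples, and you need an elevated prompt:

# Create a 20GB dynamic VHD, attach it, partition and format it, all via diskpart.
# The path and size are illustrative only.
@"
create vdisk file="C:\VHDs\demo.vhd" maximum=20000 type=expandable
select vdisk file="C:\VHDs\demo.vhd"
attach vdisk
create partition primary
format fs=ntfs label="VHD" quick
assign
"@ | diskpart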

 

Now… during installation you can press [Shift][F10] to pop up a command prompt, and if the installation runs on the 6.1 build of Windows PE, what do you think you might be able to do?

This post originally appeared on my technet blog.

January 5, 2009

Lies, Damn lies and licence interpretations.

From time to time people ask me who I write for, and I always say I write for myself in the hope that there are enough people out there like me to make a reasonable size audience. It always surprises me how many people inside Microsoft read this blog, not to mention the number of competitors who come here to read my impeccably researched and completely impartial comments (and in return I read their lies, twisted truths and false malicious implications. Ha. Ha.)


Someone pointed me to a post of Mike DiPetrillo’s from just before Christmas with the oh-so-charming title of “Microsoft lies to their customers again”. Mike’s beef is that people who work for Microsoft have said things to customers which contradict what we have posted in public. Unwilling to resist a good title, he’s chosen to make this ineptitude sound like some sort of corporately organized conspiracy… The odd thing is that he is complaining about something you’ll hear people from his company say. During 2008 people from VMware complained that Microsoft was playing dirty with licensing rules for virtualization – that VMware could not use the bundled instances of the operating system included with the Enterprise and Datacenter versions of Server 2003 R2 and Server 2008. Allowing customers to do less with your product if they also buy someone else’s product tends to have regulators beating your doors down. If customers get a certain number of bundled instances with a licence, that has to apply regardless of the virtualization technology. Indeed, I constantly have to explain to people that the reason you can’t use anything but virtualization on top of Windows Server Enterprise with the full complement of 4 VMs is that if you did, you’d have 5 working copies of Windows vs 4 with VMware and someone would cry foul. We put out a Licensing FAQ to try to make things clear. (I wish we lived in a world where the licence agreements could be so clear that no FAQs were needed, but legal documents and clarity rarely go together.)
However… Not everyone at Microsoft understands all the nuances of licensing, or government regulation. Every so often I see someone saying “VMware told my customer they could assign a server licence to a box and use that licence for windows VMs running on VMware. Where do I find something to fight that lie” and some kindly person has to point out it is no lie, and steer the poor chap to the FAQ.  If anyone out there meets Microsoft people who are still getting this wrong (and don’t have a better channel) mail me and I’ll gently set them straight.


[Update] The rest of this post has been overtaken by events – it is easier to remove it than to explain…

This post originally appeared on my technet blog.

December 10, 2008

Virtualization: user group and good stories

Filed under: Events,Virtualization,Windows Server,Windows Server 2008 — jamesone111 @ 11:05 am

Details of the next Microsoft Virtualisation* User Group meeting now up on www.mvug.co.uk!

Where:   Microsoft London (Cardinal Place)
When: Date & Time 29th January 2009 18:00 – 21:30
Who & What: Simon Cleland (Unisys) & Colin Power (Slough Borough Council)
Hyper-V RDP deployment at Slough Borough Council
Aaron Parker (TFL)
Application virtualisation – what is App-V?
Benefits of App-V & a look inside an enterprise implementation.
Justin Zarb (Microsoft)
Application virtualisation – in-depth look at App-V architecture

I presented at an event for Public Sector customers recently and the folks from Slough Borough Council were there. I thought they were a great example because so many of the things we talk about when we’re presenting about Hyper-V actually cropped up with their deployment.

We’ve got another great story from the Public sector at the other end of the British Isles – Perth and Kinross council reckon Hyper-V will save £100,000 in its first year.  

However the best virtualization story was one told by one of our partners on the recent unplugged tour. Virtualization reduces server count, and that’s great for electricity (cost and CO2), maintenance, capital expenditure and so on. But they had a customer who didn’t care about any of that. They found that the walls were cracking in their office, and the reason was the weight of servers in the room above. According to the structural engineer, they had overloaded the floor of the server room by a factor of 4 and there was a risk that it would collapse onto the office staff below. That’s the first story I’ve heard of virtualization being used to reduce weight. 

 

* Being British the MVUG  like using the Victorian affectation of spelling it with an S

This post originally appeared on my technet blog.
