James O'Neill's Blog

February 25, 2010

Retirement Planning (for service packs)

Yesterday I wrote about end-of-life planning for OSes, so it makes sense to talk about the end of a service pack as retirement – it is, after all, the word used on the product lifecycle pages. Of course we don't mean retirement in the go-and-live-by-the-seaside sense…



Special police squads – BLADE RUNNER UNITS – had orders to shoot to kill, upon detection, any trespassing Replicants.


This was not called execution. It was called retirement.


…but that sense. Service packs, like OSes (and replicants), get an end date set well in advance. Having explained OSes yesterday, I want to move on to service packs (if you want to know about Replicants you'll have to look elsewhere).


The rule for service packs is simple: two years after the release of a service pack, we stop supporting the previous version. So although Windows Vista will be in mainstream support until 2012, and extended support until 2017, that doesn't mean you can run the initial release, or Service Pack 1, and be supported until then. Let's use Vista as a worked example – I explained yesterday


Windows Vista [had] a General Availability date [of] Jan 2007. For Vista, five years after GA falls later than two years after Windows 7, so Vista goes from mainstream to extended support in or shortly after January 2012. We've set the date: April 10th 2012. The end of extended support will depend on when the next version of Windows ships, but it won't be before April 11th 2017.


Service Pack 1 for Vista became available in April 2008, and Service Pack 2 became available in April 2009.
So the life of the original Release to Manufacturing (RTM) version of Windows Vista ends on April 14 2010.
In the same way, the life of Vista SP1 should end in April 2011; in practice, because we don't retire things on the exact anniversary, SP1 gets an extension until July 12 2011.


If you are on Vista you must upgrade to SP1 or SP2 (or Windows 7) by April 14 if you want to continue being supported.


So here's the summary of what is supported with Vista, and when:


Jan '07 – April '08  Only RTM release available

April '08 – April '09  RTM and Service Pack 1 supported

April '09 – April '10  RTM, Service Pack 1 and Service Pack 2 supported

April '10 – July '11  Service Pack 1 and Service Pack 2 supported

July '11 – April '12  Service Pack 2 only supported

April '12 – April '17  Extended support phase, on SP2 only.


To simplify things, that assumes there is no Service Pack 3 for Windows Vista, and that the successor to Windows 7 ships before April 11 2015.
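The rule itself is easy to script. Here is a minimal sketch (the function name and the bare two-year arithmetic are mine; as the SP1 example shows, real retirement dates slip to the next scheduled retirement day, so treat the result as a lower bound):

```python
from datetime import date

def naive_retirement(next_sp_ship: date) -> date:
    """Support for a service pack level ends two years after the
    *next* service pack ships. Actual dates are moved forward to a
    scheduled retirement day, so this is only a lower bound."""
    return next_sp_ship.replace(year=next_sp_ship.year + 2)

# Vista SP1 shipped April 2008, so the RTM level retires around April 2010;
# SP2 shipped April 2009, so SP1 retires in (or just after) April 2011.
print(naive_retirement(date(2008, 4, 1)))  # 2010-04-01
print(naive_retirement(date(2009, 4, 1)))  # 2011-04-01
```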



Vista SP1 coincided with the release of Server 2008, and Windows XP Service Pack 3 came very shortly afterwards. Those extra few days mean the anniversary for XP SP2 falls after the cut-off for April retirement, so the end of life for XP SP2 is July 13th 2010 (the same day as the Windows 2000 Professional and Server editions). Mainstream support for Windows XP (all service packs) has already ended; after July 13, XP is in extended support only, on SP3 only.


I should have included in yesterday's post that July 13th 2010 also marks the end of mainstream support for Server 2003 (and Server 2003 R2); the RTM and SP1 versions are already retired. It would be very unusual to see a new service pack for something in extended support. If you still have 2003 servers, you need to decide what you will do about support / upgrades before July 13th.


Server 2008 shipped at the SP1 level to sync up with Windows Vista, and SP2 for both came out on the same date, so there are no server service pack actions required until July 12 2011. I explained yesterday why I have sympathy with people who don't plan, but if you are on Server 2008 SP1, don't leave it till the last minute to choose between SP2 and upgrading to R2, and then to implement your choice.


Update – Fixed a few typos. 

This post originally appeared on my technet blog.


February 8, 2010

Installing Windows from a phone

Arthur : “You mean you can see into my mind ?”
Marvin: “Yes.”
Arthur: “And … ?”
Marvin: “It amazes me how you manage to live in anything that small”

Looking back down the recent posts you might notice that this is the 8th in a row about my new phone (so it's obviously made something of an impression); this one brings the series to a close.

I've said already that I bought a 16GB memory card for the new phone, which is a lot – I had 1GB before – so… what will I do with all that space? I'm not going to use it for video, and 16GB is room for something like 250 hours of MP3s or 500 hours of WMAs: I own roughly 200 albums, so it's a fair bet they'd fit. Photos – well, maybe I'd keep a few hundred MB on the phone. In any event, I don't want to fill the card completely. After a trip out with no card in my camera, I now keep an SD-to-USB card adapter on my key-ring so I always have both a USB stick and a memory card: currently this holds my old micro-SD card in a full-size SD adapter. If I need more than 1GB I can whip the card out of the phone, pop it in the adapter and keep shooting.
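As a sanity check on those numbers, here is a back-of-the-envelope calculation (the bitrates are my assumptions: roughly 128kbps for MP3 and 64kbps for WMA):

```python
def hours_of_audio(card_bytes: float, kbps: float) -> float:
    """How many hours of audio fit on a card at a constant bitrate."""
    return card_bytes * 8 / (kbps * 1000) / 3600

CARD = 16e9  # a "16GB" card, in decimal bytes

print(round(hours_of_audio(CARD, 128)))  # ~278 hours of 128kbps MP3
print(round(hours_of_audio(CARD, 64)))   # ~556 hours of 64kbps WMA
```

Close enough to the "250 hours of MP3s or 500 hours of WMAs" figure above, with a little headroom left for photos.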

However, the phone has a mass-storage device mode, so I thought to myself: why not copy the Windows installation files to it, and see if I can boot a machine off it and install Windows from the phone? That way one could avoid carrying a lot of setup disks.
Here’s how I got on.

This post originally appeared on my technet blog.

November 26, 2009

How to deploy Windows: Windows deployment services.

Filed under: Windows 7,Windows Server 2008,Windows Server 2008-R2,Windows Vista — jamesone111 @ 12:23 pm

I saw something recently – it must have been in the discussion about Google's new bootable-browser "operating system" – which talked about it taking hours to install Windows. I didn't know whether to get cross or to laugh. Kicking around on YouTube is a video I made of putting Windows 7 on a netbook from a USB key (the technical quality of the video is very poor for the first couple of minutes; the installation starts in the third minute). It took me 25 minutes from powering on for the first time to getting my first web page up. It was quick because I installed from a USB flash device. It would be quicker still on a higher-spec machine, especially one with a fast hard disk.

Installing from USB is all very well if you can go to the machine(s) to do the installation(s). But if you have many machines to install, or you want to have users or other team members install at will, then Windows Deployment Services is a tool you really should get to know.  WDS was originally a separate download for Server 2003; then it got rolled into the product, so it is just an installable component in Server 2008 and 2008-R2. There are other add-ons which round out deployment capabilities, but there are 3 scenarios where WDS alone is all you need.

  1. Deploying the "vanilla" Windows image to machines. This can be Windows Vista, Windows Server 2008, Server 2008-R2 or Windows 7. I haven't checked on deploying Hyper-V Server; it may be a special case because the generic setup process may not create a line that's needed in the boot configuration database.
  2. Deploying a Windows image customized with an unattend.xml file – again the same version choices are available, but now if you want to install with particular options turned on or off you can do so (the Windows Automated Installation Kit helps with the creation of this file, among other things).
  3. Creating a "Gold Image" machine with applications pre-installed, and capturing that image and pushing it out to many different machines [there are a few applications which don't like this, so sometimes it is better to install those after deployment].

One thing which many people don't seem to realise is that since Vista arrived, one 32-bit image can cover all 32-bit machines, and one 64-bit image can be used on all 64-bit machines. Those images not only handle differences in hardware but can also be multi-lingual.

By itself WDS doesn't address installing combinations of applications and images, nor does it automate the process of moving a user's data off an old machine and onto a new one. I'll talk about some of these things in future posts: but if you are thinking about the skills you'll need to do a deployment of Windows 7 (for example), understanding WDS is a key first step; the next step is answering the question "What do I need that WDS doesn't give me?"

Because I have to deploy a lot of servers, I put together a video showing WDS being used to deploy Windows Server (Server Core is also the smallest OS and so the quickest to install as a demo). Because my servers are mostly virtualized, I have another video in the pipeline showing System Center Virtual Machine Manager doing deployments of built VMs.
You get an idea of the power of WDS, but the fact that the video is only 6 minutes long also gives you an idea of its simplicity.

This post originally appeared on my technet blog.

August 14, 2009

VMware – the economics of falling skies … and disk footprints.

Filed under: Virtualization,Windows Server 2008,Windows Server 2008-R2 — jamesone111 @ 4:36 pm

There's a phrase which has been going through my head recently: before coming to Microsoft I ran a small business; I thought our bank manager was OK, but one of my fellow directors – someone with greater experience in finance than I'll ever have – sank the guy with seven words: "I have a professional disregard for him." I think of "professional disregard" when hearing people talk about VMware. It's not that the people I'm meeting simply want to see another product – Hyper-V – displace VMware (well, those people would, wouldn't they?), but that nothing they see from VMware triggers those feelings of "professional regard" which you have for some companies – often your toughest competitor.

When you've had a sector to yourself for a while, having Microsoft show up is scary. Maybe that's why Paul Maritz was appointed to the top job at VMware. His rather sparse entry on Wikipedia says that Maritz was born in Zimbabwe in 1955 (the same year as Bill Gates and Ray Ozzie, not to mention Apple's Steve Jobs and Eric Schmidt – the man who made Novell the company it is today) and that in the 1990s he was often said to be the third-ranking executive at Microsoft (behind Gates and Steve Ballmer, born in early 1956). The late 90s was when people came to see us as "the nasty company". It's a role that VMware seems to be sliding into: even people I thought of as being aligned with VMware now seem inclined to kick them.

Since the beta of Hyper-V last year, I've been saying that the position is very like that of Novell in the mid 1990s. The first point of similarity is economics. Novell NetWare was an expensive product with the kind of market share where a certain kind of person talks of "monopoly". That's a pejorative word, as well as one with special meanings to economists and lawyers. It isn't automatically illegal, or even bad, to have a very large share (just as a very large proportion in parliament can make you either Nelson Mandela or Robert Mugabe). Market mechanisms which act to ensure "fair" outcomes rely on buyers being able to change to another seller (and vice versa – some say farmers are forced to sell to supermarkets on unfair terms); if one party is locked in, then terms can be dictated. Microsoft usually gets accused of giving too much to customers for too little money. Economists would say that if a product is overpriced, other players will step in – regulators wanting to reduce the amount customers get from Microsoft argue that they are preserving such players. Economists don't worry so much about that side, but point out that a new entrant needs more people to buy the product, which means lower prices, so a new entrant must expect to make money at a lower price: they would say that if Microsoft makes a serious entry into an existing market dominated by one product, that product is overpriced. Interestingly, I've seen the VMware side claim that Hyper-V, Xen and other competitors are not taking market share and VMware's position is as dominant as ever.

The second point of similarity is that when Windows NT went up against entrenched NetWare it was not our first entry into networking – I worked for RM where we OEM'd MS-NET (a.k.a. 3COM 3+ Open, IBM PC-LAN Program) and OS/2 LAN Manager (a.k.a. 3+Open). Though not bad products for their time – like Virtual Server – they did little to shift things away from the incumbent. The sky did not fall in on Novell when we launched NT, but that was when people stopped seeing NetWare as the only game in town [a third point of similarity]. Worse, new customers began to dismiss its differentiators as irrelevant, and that marks the beginning of the end.
Having used that analogy for a while, it's nice to see no less a person than a Gartner Vice President, David Cappuccio, envisaging a Novell-like future for VMware. In a piece entitled "Is the sky falling on VMware?", SearchServerVirtualization.com also quotes him as saying that "'good enough' always wins out in the long run". I hate "good enough" because so often it is used to mean "lowest common denominator"; I've kept the words of a Honda TV ad with me for several years.

Ever wondered what the most commonly used word in the world is?
"OK"
Man's favourite word is one which means all right, satisfactory, not bad.
So why invent the light bulb, when candles are OK ?
Why make lifts, if stairs are OK ?
Earth’s OK, Why go to the moon ?
Clearly, not everybody believes OK is OK.
We don’t.

Some people advance the idea that we don't need desktop apps because web apps are "good enough". Actually, for a great many purposes, they aren't. Why have a bulky laptop when a netbook is "good enough"? Actually, for many purposes it is not. Why pay for Windows if Linux is "free"… I think you get the pattern here. But it is our constant challenge to explain why one should have a new version of Windows or Office when the old version was "good enough". The answer – as any economist will tell you – is that when people choose to spend extra money, whatever differentiates one product from the other is relevant to them and outweighs the cost (monetary or otherwise, real or perceived): then "good enough" is redefined, and the old version is not good enough any more. If we don't persuade customers of that, we can't make them change. [Ditto people who opt for Apple: they'd be spectacularly ignorant not to know a Mac costs more, so unless they are acting perversely they must see differentiators, relevant to them, which justify both the financial cost and the cost of forgoing Windows' differentiators. Most people, of course, see no such thing.] One of the earliest business slogans to get imprinted on me was "quality is meeting the customer's needs": pointless gold-plating is not "quality". In that sense "good enough" wins out: not everything that one product offers over and above another is a meaningful improvement. The car that leaves you stranded at the roadside isn't meeting your needs however sophisticated its air conditioning, the camera you don't carry with you isn't meeting your needs even if it can shoot 6 frames a second, and the computer system which is down when you need it is (by definition) not meeting your needs. A product which meets more of your needs is worth more.

A supplier can charge more in a market with choices (VMware, Novell, Apple) only if they persuade enough people to accept that the differentiators in their products meet real needs and are worth a premium. In the end Novell didn't persuade enough; Apple have not persuaded a majority, but enough for a healthy business; and VMware? Who knows what "enough" is yet, never mind whether they will get that many. If people don't see the price as a premium but as a legacy of being able to overcharge when there was no choice, then it becomes the "VMware tax", as Zane Adam calls it in our video interview. He talked about mortgaging everything to pay for VMware: the product which costs more than you can afford doesn't meet your needs either, whatever features it may have.

I'll come back to cost another time – there's some great work which Matt has done which I want to borrow rather than plagiarize. It needs a long post, and I can already see lots of words scrolling up my screen, so I want to give the rest of this post to one of VMware's irrelevant feature claims: disk footprint. Disk space is laughably cheap these days, and in case you missed the announcement, Hyper-V Server now boots from flash – hence the video above. Before you run off to do this for yourself, check what set-ups are supported in production. And note it is only Hyper-V Server, not Windows Server or client versions of Windows. The steps are all on this blog already: see How to install an image onto a VHD file (I used a fixed size of 4GB), then boot from the VHD stored on a bootable USB stick. Simples.

I've never met a customer who cares about a small footprint: VMware want you to believe a tiny little piece of code must need less patching, give better uptime, and be more trustworthy than a whole OS – even a pared-down one like Windows Server Core or Hyper-V Server. Now Jeff, who writes on the virtualization team blog, finally decided he'd heard enough of this and that it was time to sink it once and for all. It's a great post (with follow-up). If you want to talk about patching and byte counts, argues Jeff, let's count bytes in patches over a representative period: Microsoft Hyper-V Server 2008 had 26 patches, not all of which required reboots, and many were delivered as combined updates. They totalled 82 MB. VMware ESXi 3.5 had 13 patches, totalling over 2.7 GB. That's not a misprint: 2700 MB against 82 (see, VMware sometimes does give you more). That's because VMware releases a whole new ESXi image every time they release a patch, so every ESXi patch requires a reboot. Could that be why VMotion (Live Migration, as now found in R2 of Hyper-V) seemed vital to them and merely important to us? When we didn't have it, it was the most relevant feature. Jeff goes to town on VMware software quality – including the "Update 2" debacle – but that wasn't the worst thing. The very worst thing that can happen on a virtualized platform is VMs breaking out of containment and running code on the host: since the host needs to access the VMs' memory for snapshots, saving and migration, a VM that can run code on the host can impact all the other VMs. So CVE-2009-1244 – "A critical vulnerability in the virtual machine display function allows guest operating system users to execute arbitrary code on the host OS" – is very alarming reading.
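To put Jeff's byte counts in proportion, a trivial bit of arithmetic (the totals are the ones quoted above; the ratio is my own framing):

```python
esxi_patch_mb   = 2700  # 13 ESXi 3.5 patches, each a full image replacement
hyperv_patch_mb = 82    # 26 Hyper-V Server 2008 patches, many combined

ratio = esxi_patch_mb / hyperv_patch_mb
print(f"ESXi shipped roughly {ratio:.0f}x the patch bytes")  # roughly 33x
```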

And that's the thing – how can one have regard for a competitor who doesn't meet the customer's needs on security or reliability, and who cites things like disk space to justify costing customers far, far more money?

This post originally appeared on my technet blog.

July 23, 2009

Oink flap – Microsoft releases software under GPL – oink flap

Mary-Jo has a post about the release of our Hyper-V drivers for Linux entitled "Pigs are flying low: Why Microsoft open-sourced its Linux drivers". It's one of many out there, but the title caught my eye: I thought I'd give a little of my perspective on this unusual release. News of it reached me through one of those "go ahead and share this" mails earlier this week, which began

NEWSFLASH: Microsoft contributes code to the Linux kernel under GPL v2.
Read it again – Microsoft contributes code to the Linux kernel under GPL v2!
Today is a day that will forever be remembered at Microsoft.

Well indeed… but hang on just a moment: we’re supposed to “hate” the GPL aren’t we ? And we’re not exactly big supporters of Linux … are we ? So what gives ? Let’s get the GPL thing out of the way first:

For as long as I can remember I've thought (and so has Microsoft) that whoever writes a book or a piece of software, or paints a picture, or takes a photo, should have the absolute right to decide its fate. [Though the responsibilities that come with a large share of an important market apply various IFs and BUTs to this principle.] Here in the UK that's what comes through in the Copyright, Designs and Patents Act, and I frequently find myself explaining to photographers that the act tilts things in their favour far more than they expect. Having created a work, you get the choice whether to sell it, give it away, publish the source code, whatever. The GPL breaks that principle by saying, in effect, "if you take what I have given away, and build something around it, you must give your work away too and force others to give their work away, ad infinitum"; it requires an author of a derivative work to surrender rights they would normally have. The GPL community would typically say: don't create derivative works based on theirs if you want those rights. Some in that community – it's hard to know how many, because they are its noisiest members – argue for a world where there is no concept of intellectual property (would they argue you could come into my garden and take the vegetables that stem from my physical work? Because they do argue that you can just take the product of my mental work). Others argue for very short protection under copyright and patent laws: ironically, a licence (including the GPL) only applies for the term of copyright; after that, others can incorporate a work into something which is treated as wholly their own. However, we should be clear that GPL and Open Source are not synonyms (Mary-Jo wasn't treating them as such in her title).
Open source is one perfectly valid way for people to distribute their works – we want open source developers to write for Windows, and as I like to point out, this little project here means I am an open source developer and proud of it. However, I don't interfere with the rights of others who re-use my code, because it goes out under the Microsoft Public Licence: some may think it ironic that it is the Microsoft licence which gives people freedom, while those who make the most noise about "free software" push a licence that constrains people.

What are we doing? We have released the Linux Integration Components for Hyper-V under a GPL v2 license, and the synthetic drivers have been submitted to the Linux kernel community for inclusion in upcoming versions of the Linux kernel. The code is being integrated into the Linux kernel tree via the Linux Driver Project, a team of Linux developers that develops and maintains drivers in the Linux kernel. We worked very closely with Greg Kroah-Hartman to integrate our Linux ICs into the Linux kernel. We will continue to develop the Integration Components, and as we do we will contribute the code to the drivers that are part of the kernel.
What is the result? The drivers will be available to anyone running an appropriate Linux kernel, and we hope that the various Linux distributions will make them available to their customers through their releases.
WHY? It's very simple. Every vendor would like their share of the market to come from customers who use only their technology; no interoperability would be needed: but in the real world, real customers run a mixture. Making the Linux side of those customers' lives unnecessarily awkward just makes them miserable without getting more sales for Microsoft. Regulators will say that if you make life tough enough, it will get you more sales, but interoperability here is not driven by some high-minded ideal – unless you count customer satisfaction, which to my way of thinking is just good business sense. Accepting that customers aren't exclusive makes it easier for them to put a greater proportion of their business your way. So: we are committed to making Hyper-V the virtualization platform of choice, and that means working to give a good experience with Linux workloads. We'd prefer that to happen all by itself, but it won't: we need to do work to ensure it happens. We haven't become fans of the GPL: everything I wrote above about the GPL still holds. Using it for one piece of software is the price of admission to the distributions we need to be in, in order to deliver that good experience. Well… so be it. Or put another way, the principle of helping customers to do more business with you trumps other principles.
Does this mean we are supporting all Linux distributions? Today we distribute integration components for SLES 10 SP2. Our next release will add support for SLES 11 and Red Hat Enterprise Linux (5.2 and 5.3). If you want to split hairs we don't "support" SLES or RHEL – but we have support arrangements with Red Hat and Novell to allow customers to be supported seamlessly. The reason for being pedantic about that point is that a customer's ability to open a support case with Microsoft over something written by someone else depends on those arrangements being in place. It's impossible to say which vendors we'll have agreements with in future (if we said who we were negotiating with it would have all kinds of knock-on effects, so those discussions aren't even disclosed inside the companies involved). Where we haven't arranged support with a vendor we can only give limited advice from first principles about their product, so outside of generic problems which would apply to any OS, customers will still need to work with the vendors of those distributions for support.

You can read the press release or watch the Channel 9 Video for more information.

This post originally appeared on my technet blog.

July 22, 2009

How to activate Windows from a script (even remotely).

I have been working on some PowerShell recently to handle the initial setup of a new machine, and I wanted to add activation. If you do this from a command line it usually means using the Software Licensing Manager script (slMgr.vbs), but this is just a wrapper around a couple of WMI objects which are documented on MSDN, so I thought I would have a try at calling them from PowerShell. Before you make use of the code below, please understand it has had only token testing and comes with absolutely no warranty whatsoever; you may find it a useful worked example but you assume all responsibility for any damage that results to your system. If you're happy with that, read on.


So first, here is a function – which could be written as one line – to get the status of Windows licensing. This relies on the SoftwareLicensingProduct WMI object: the Windows OS will have something set in the PartialProductKey field, and the ApplicationID is a known GUID. Having fetched the right object(s), it outputs the name and the status for each, translating the status ID to text using a hash table.

$licenseStatus = @{0="Unlicensed"; 1="Licensed"; 2="OOBGrace"; 3="OOTGrace";
                   4="NonGenuineGrace"; 5="Notification"; 6="ExtendedGrace"}

Function Get-Registration
{ Param ($Server=".")
  get-wmiObject -query "SELECT * FROM SoftwareLicensingProduct WHERE PartialProductKey <> null
                        AND ApplicationId='55c92734-d682-4d71-983e-d6ec3f16059f'
                        AND LicenseIsAddon=False" -Computername $Server |
  foreach { "Product: {0} — Licence status: {1}" -f $_.name, $licenseStatus[[int]$_.LicenseStatus] }
}

 


On my Windows 7 machine this comes back with Product: Windows(R) 7, Ultimate edition — Licence status: Licensed


On one of my server machines the OS was in the "Notification" state, meaning it keeps popping up the notice that I might be the victim of counterfeiting (all Microsoft shareholders are… but that's not what it means: we found a large proportion of counterfeit Windows had been sold to people as genuine). So the next step was to write something to register the computer. To add a licence key takes 3 lines: get a WMI object, call its InstallProductKey method, and then call its RefreshLicenseStatus method. (Note for speakers of British English: in the API it is License with an S throughout, even though we keep that spelling for the verb and Licence with a C for the noun.) To activate, we get a different object (technically there might be multiple objects) and call its Activate method. Refreshing the licensing status system-wide and then checking the LicenseStatus property of the object indicates what has happened. Easy stuff, so here's the function.

Function Register-Computer
{ [CmdletBinding(SupportsShouldProcess=$True)]
  param ( [parameter()][ValidateScript({$_ -match "^\S{5}-\S{5}-\S{5}-\S{5}-\S{5}$"})][String]$ProductKey,
          [String]$Server="." )

  $objService = get-wmiObject -query "SELECT * FROM SoftwareLicensingService" -computername $Server
  if ($ProductKey) {
      if ($psCmdlet.shouldProcess($Server, "Set product key")) {
          $objService.InstallProductKey($ProductKey) | out-null
          $objService.RefreshLicenseStatus()         | out-null
      }
  }
  get-wmiObject -query "SELECT * FROM SoftwareLicensingProduct WHERE PartialProductKey <> null
                        AND ApplicationId='55c92734-d682-4d71-983e-d6ec3f16059f'
                        AND LicenseIsAddon=False" -Computername $Server |
  foreach-object {
      if ($psCmdlet.shouldProcess($_.name, "Activate product")) {
          $_.Activate()                      | out-null
          $objService.RefreshLicenseStatus() | out-null
          $_.get()   # refresh the local copy of the WMI object
          if ($_.LicenseStatus -eq 1) { write-verbose "Product activated successfully." }
          else { write-error ("Activation failed, and the license state is '{0}'" -f $licenseStatus[[int]$_.LicenseStatus]) }
      }
      else { write-host ("Licence status: {0}" -f $licenseStatus[[int]$_.LicenseStatus]) }
  }
}


Things to note



  • I've taken advantage of PowerShell V2's ability to include validation code as part of the declaration of a parameter.
  • As I mentioned before, it's really good to use the ShouldProcess feature of V2, so I've done that too.
  • Finally, since this is WMI it can be remoted to any computer, so the function takes a -Server parameter to allow machines to be activated remotely.

A few minutes later Windows detected the change, and here is the result.




 


This post originally appeared on my technet blog.

June 25, 2009

How to: have nicer Active Directory management from PowerShell – without upgrading AD

One of the first books I read on PowerShell had a comment about using AD from PowerShell V1 which amounted to "It's too hard, don't bother; use VBScript instead". I'd taken against the book in question (no names, no pack drill) – in fact it reminded me of something Dorothy Parker is supposed to have said*: "This is not a book to be cast aside lightly, it should be hurled with great force." When I was asked to contribute to the Windows Scripting Bible (out of stock at Amazon at the time of writing!) someone had put a chapter on AD into the outline, so I had to write one. This gives me enough expertise to say it can be done – and, having written VBScript to work with AD, that it is easier in PowerShell – but it is ugly and not done in true PowerShell style.

All that changed when we took the covers off the beta of Windows Server 2008 R2: it has PowerShell V2 with cmdlets for Active Directory. A quick scratch of the surface revealed these work with a new web service which is (you guessed it) only in R2. This quickly led to questions about whether it would be back-ported… and I had to answer "I know customers are asking for it, but I don't know if it will happen". Now there is a post on the AD PowerShell blog announcing the beta of a version for Windows Server 2003 and 2008.

(A quick tip of the hat to Jonathan, who tweeted this.)

 

 


* If in doubt about attributing quotes which don't sound like Shakespeare or the Bible: Churchill, Mark Twain or Dorothy Parker are always good bets.

This post originally appeared on my technet blog.

May 10, 2009

Thoughts on uses of YouTube … and virtualization

Filed under: Virtualization,Windows Server 2008,Windows Server 2008-R2 — jamesone111 @ 7:49 pm

I’m taking a breather from re-recording the voice track for a Video on Live Migration in Hyper-V. When it’s done it will end up on YouTube.

Now YouTube is giving me pause: it is certainly the easiest way to put up videos so that people can find them. I'm viewing it as an experiment, because anecdotal evidence suggests that the audience I'm trying to reach (IT professionals) don't really think of it as a good source of content, and also because YouTube is all very well for bits of fun (like this one or these) but if you have a serious message isn't it a bit needy?

Consider the case, for example, that you have had things pretty much your own way for years, but now it seems everyone who hasn’t deserted you yet is flirting with the other side, the people who pay your wages are even beginning to question the money they pay you. This may sound like Gordon Brown’s attempts to woo the public – although derided by his colleagues* at least he has some more control over YouTube than press photos – but in fact I was referring to VMware.

There’s a video on YouTube which starts with a factual error about Technet and MSDN. First off, Technet and MSDN themselves didn’t fail; the failure was in the download site for Windows 7, Server 2008 R2 and Hyper-V Server R2. Demand for these greatly outstripped predictions – there just wasn’t enough hardware capacity, as explained here. I don’t have the stats on how many of the downloads were for Hyper-V Server R2 or people wanting to test Hyper-V on Windows Server 2008 R2, but we’d been averaging in excess of 100,000 downloads per month of the first release of Hyper-V. Live Migration was missing from that release (the main reason that customers chose VMware) but it’s in R2 – even with the free Hyper-V Server product (hence my video). These must be scary times at VMware…  but I digress.

Last year I ripped a hole in VMware’s dishonest pricing examples; this year someone thought it would be a wheeze to post footage of Hyper-V blue-screening on YouTube. He kept quiet about who he was, but it didn’t take long for Virtualization Review to reveal “the root cause of the VMware FUD: Scott Drummonds.” As they go on to say, “his job basically is to look at the competition and spread the word about VMware superiority. Unfortunately, Drummonds doesn’t identify himself on the Hyper-V crashing video. Why not? Cynics might say because the video would have less impact if they knew it came from Microsoft’s chief virtualization competitor”.

Drummonds confesses he made the post, and gives some blather about using two virtual disks. (In VMware IDE performance isn’t much good, so they run the test from SCSI disks; in Hyper-V IDE performance matches SCSI, yet they still wanted to run the test from SCSI disks – which conveys both a degree of ignorance of Hyper-V and a lack of scientific method: what effect does doubling the number of disks have on the validity of the tests?)

Jeff Woolsey demolishes the video – since it showed the STOP error on the blue screen he went digging, and found that from our 750,000 downloads of Hyper-V “we’ve had 3 reports of crashes under stress and with the same error code as seen in the video bugcheck (0x00020001). The solution in all three cases was to upgrade the server BIOS which solved the problem”. VMware have seen similar things, incidentally. I love a good demolition, so Jeff’s follow-up post makes good reading; in particular he points out that to have any merit a test has to be repeatable – there’s no published methodology, no statement of what is in the VMs being tested, what the hardware was, and so on. Jeff also points out that VMware prohibit publication of benchmarks unless they have approved the way in which they are carried out because, as they put it, “Benchmarking is a difficult process fraught with error and complexity at every turn. It’s important for those attempting to analyze performance of systems to understand what they’re doing to avoid drawing the wrong conclusions or allowing their readers to do so.”

As the question of disks made clear, they don’t understand what they are doing with Hyper-V – and anyone doing a serious test for publication would put in a call to Microsoft support and get a problem like this solved. Who says “A-ha! A problem with the competitor: let’s not try to fix it, just video it and put it on YouTube”? … well, Scott Drummonds, obviously. But you can decide for yourself whether VMware – at least Scott – were allowing their readers to draw the wrong conclusions or deliberately leading them astray.

Oh and Scott, if you’re reading: anyone with a good knowledge of testing Windows will tell you that SlMgr.vbs -rearm will stop that “You may be a victim of counterfeiting” message spoiling your videos.

 

* I had to go to the source of Hazel Blears’ comment “You tube if you want to” because it sounded like there was something missing: it’s obviously meant to echo the famous Thatcher quote “To those expecting a U-turn I say: you turn if you want to … the lady’s not for turning”. The full quote is actually “I’m not against new media. YouTube if you want to. But it’s no substitute for knocking on doors”.

 


Update. Fixed some typos and bad edits.

This post originally appeared on my technet blog.

February 18, 2009

How to manage the Windows firewall settings with PowerShell

I mentioned recently that I’m writing a PowerShell configuration tool for the R2 edition of Hyper-V Server and Windows Server Core. One of the key parts of that is managing the firewall settings… Now, I don’t want to plug my book too much (especially as I only wrote the PowerShell part) but I had a mail from the publisher today saying copies ship from the warehouse this week, and this code appears in the book (ISBN 9780470386804, orderable through any good bookseller).

The process is pretty simple. Everything firewall-related in Server 2008 / Vista / Server 2008 R2 / Windows 7 is managed through the HNetCfg.FwPolicy2 COM object. First I define some hash tables to convert codes to meaningful text, and a function to translate network profiles to names. So on my home network

$fw=New-object -comObject HNetCfg.FwPolicy2  ;  Convert-FWProfileType $fw.CurrentProfileTypes

returns “Private”


$FWprofileTypes= @{1GB="All";1="Domain"; 2="Private" ; 4="Public"}
$FwAction      =@{1="Allow"; 0="Block"}
$FwProtocols   =@{1="ICMPv4";2="IGMP";6="TCP";17="UDP";41="IPv6";43="IPv6Route"; 44="IPv6Frag";
                  47="GRE"; 58="ICMPv6";59="IPv6NoNxt";60="IPv6Opts";112="VRRP"; 113="PGM";115="L2TP";
                  "ICMPv4"=1;"IGMP"=2;"TCP"=6;"UDP"=17;"IPv6"=41;"IPv6Route"=43;"IPv6Frag"=44;"GRE"=47;
                  "ICMPv6"=58;"IPv6NoNxt"=59;"IPv6Opts"=60;"VRRP"=112; "PGM"=113;"L2TP"=115}
$FWDirection   =@{1="Inbound"; 2="Outbound"; "Inbound"=1;"Outbound"=2}

 

Function Convert-FWProfileType
{Param ($ProfileCode)
$FWprofileTypes.keys | foreach -begin {[String[]]$descriptions= @()} `
                               -process {if ($profileCode -bAND $_) {$descriptions += $FWProfileTypes[$_]} } `
                               -end {$descriptions}
}
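To see the bitmask decoding at work, here is a worked example of my own (it assumes the hash table and function above have been loaded; note that hash-table key order isn’t guaranteed, so the names may come back in either order):

```powershell
# 3 = 1 (Domain) -bor 2 (Private), so both names match the -bAND test
Convert-FWProfileType -ProfileCode 3    # returns "Domain" and "Private"

# 6 = 2 (Private) -bor 4 (Public)
Convert-FWProfileType -ProfileCode 6    # returns "Private" and "Public"
```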


The next step is to get the general configuration of the firewall; I think my Windows 7 machine is still on the defaults, and the result looks like this

Active Profile(s) : Private

Network Type Firewall Enabled Block All Inbound Default In Default Out
------------ ---------------- ----------------- ---------- -----------
Domain                   True             False Block      Allow     
Private                  True             False Block      Allow     
Public                   True             False Block      Allow     

The Code looks like this 


Function Get-FirewallConfig {
$fw=New-object -comObject HNetCfg.FwPolicy2
"Active Profile(s) : " + (Convert-FWProfileType $fw.CurrentProfileTypes)
@(1,2,4) | select @{Name="Network Type"     ;expression={$fwProfileTypes[$_]}},
                  @{Name="Firewall Enabled" ;expression={$fw.FireWallEnabled($_)}},
                  @{Name="Block All Inbound";expression={$fw.BlockAllInboundTraffic($_)}},
                  @{name="Default In"       ;expression={$FwAction[$fw.DefaultInboundAction($_)]}},
                  @{Name="Default Out"      ;expression={$FwAction[$fw.DefaultOutboundAction($_)]}}|
            Format-Table -auto
}

Finally comes the code to get the firewall rules. One slight pain here is that the text is often returned as a pointer to a resource in a DLL, so it takes a little trial and error to find grouping information.
The other thing to note is that a change to a rule takes effect immediately, so you can enable a group of rules as easily as:

Get-FireWallRule -grouping "@FirewallAPI.dll,-29752" | foreach-object {$_.enabled = $true}

 

Function Get-FireWallRule
{Param ($Name, $Direction, $Enabled, $Protocol, $profile, $action, $grouping)
$Rules=(New-object -comObject HNetCfg.FwPolicy2).rules
If ($name)      {$rules= $rules | where-object {$_.name      -like $name}}
If ($direction) {$rules= $rules | where-object {$_.direction -eq   $direction}}
If ($Enabled)   {$rules= $rules | where-object {$_.Enabled   -eq   $Enabled}}
If ($protocol)  {$rules= $rules | where-object {$_.protocol  -eq   $protocol}}
If ($profile)   {$rules= $rules | where-object {$_.Profiles  -bAND $profile}}
If ($Action)    {$rules= $rules | where-object {$_.Action    -eq   $Action}}
If ($Grouping)  {$rules= $rules | where-object {$_.Grouping  -Like $Grouping}}
$rules}

Since the rules aren’t the easiest thing to read, I usually pipe the output into Format-Table, for example

Get-FireWallRule -enabled $true | sort direction,applicationName,name |
            format-table -wrap -autosize -property Name, @{Label="Action"   ;expression={$FwAction[$_.action]}},
                         @{label="Direction";expression={$FWDirection[$_.direction]}},
                         @{Label="Protocol" ;expression={$FwProtocols[$_.protocol]}}, localPorts, applicationName

 

Last but not least, if you want to create a rule from scratch you create a rule object with New-Object -comObject HNetCfg.FWRule, set its properties, and then pass it to the Add method of the policy object’s Rules collection. If I ever find time to finish the script it will probably gain a New-FirewallRule function, but for now you need to write your own.
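As a sketch of what such a function might look like – this is my own hypothetical helper, not the finished script, and the parameter names are my own – the pattern is: build a rule object, set its properties, then add it to the policy’s Rules collection (which needs an elevated prompt):

```powershell
# Hypothetical New-FirewallRule sketch. The property names (Name, Direction,
# Protocol, LocalPorts, Action, Enabled) are those of the HNetCfg.FWRule object.
Function New-FirewallRule
{Param ($Name, $Direction=1, $Protocol=6, $LocalPorts, $Action=1, [switch]$Enabled)
 $rule           = New-Object -comObject HNetCfg.FWRule
 $rule.Name      = $Name
 $rule.Direction = $Direction         # 1 = Inbound, 2 = Outbound
 $rule.Protocol  = $Protocol          # 6 = TCP, 17 = UDP, etc.
 if ($LocalPorts) {$rule.LocalPorts = $LocalPorts}
 $rule.Action    = $Action            # 1 = Allow, 0 = Block
 $rule.Enabled   = [bool]$Enabled
 (New-Object -comObject HNetCfg.FwPolicy2).Rules.Add($rule)   # takes effect immediately
}

# e.g. allow inbound TCP on port 8080 (run elevated):
# New-FirewallRule -Name "Test web server" -LocalPorts "8080" -Enabled
```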

This post originally appeared on my technet blog.

February 17, 2009

Two useful Hyper-V links

A short post by my standards

On the Server Core blog, Chuck has posted Top Issues for Microsoft Support for Windows Server 2008 Hyper-V, which makes an interesting read. If you do a lot of Hyper-V you’ll probably stumble over one of these at some point.

On the Main Microsoft site we have Windows Server 2008 R2 & Microsoft Hyper-V Server 2008 R2 – Hyper-V Live Migration Overview & Architecture  which is exactly what it says.

This post originally appeared on my technet blog.

Support for Red Hat OSes on Microsoft Virtualization (and Vice Versa)

One of the questions which comes up on our internal distribution lists for Hyper-V is “when will such-and-such an OS be supported on Hyper-V?”, and the somewhat frustrating response is usually of the form “We’re talking to OS vendors, but we can’t talk about contract negotiations while they are going on. As soon as we can say something we’ll do it in public”. We have to negotiate certification, support and so on, and even saying we’re talking (or not talking) to vendor X may impact what we’re doing with vendor Y. The OS which comes up most often in this context is Red Hat Enterprise Linux, and we’ve now made some public announcements which are a step in this direction.

Here are key points from Red Hat’s Press Release

  • Red Hat will validate Windows Server guests to be supported on Red Hat Enterprise virtualization technologies.
  • Microsoft will validate Red Hat Enterprise Linux server guests to be supported on Windows Server Hyper-V and Microsoft Hyper-V Server.
  • Once each company completes testing, customers with valid support agreements will receive coordinated technical support for running Windows Server operating systems virtualized on Red Hat Enterprise virtualization, and for running Red Hat Enterprise Linux virtualized on Windows Server Hyper-V and Microsoft Hyper-V Server.

The last one is important because it means a customer with an issue can call one vendor and, if the problem appears to lie with the other vendor’s product, it’s managed as one streamlined incident. Note that the work hasn’t been completed – the above is written in the future tense. According to Mike Neil’s blog post “Microsoft and Red Hat Cooperative Technical Support”, we will provide integration components for Red Hat on Hyper-V, and Red Hat will provide properly certified drivers for Windows on their virtualization stack.

Microsoft people would prefer customers only used Microsoft products, and Red Hat people would prefer customers only used Red Hat products – we sure aren’t going to stop competing. But the reality is customers use both: and both companies want their customers to have an excellent experience of their respective technologies, which means we have to cooperate as well. This is co-opetition in action.

This post originally appeared on my technet blog.

February 6, 2009

Virtualization road show

Earlier in the week we took the virtualization tour over the Irish Sea. Tuesday was Belfast – and with the snow, getting there was quite a challenge. I felt ill all day and didn’t think I’d delivered the content as well as I should have, but the feedback forms were really positive, better than I thought I deserved. Then it was south to Cork: I hadn’t been there before – though I want to go diving nearby – and was quite impressed with an airport which would grace a far bigger place, and with the hotel (free wifi, and a receptionist who takes a lost booking in her stride, are both guaranteed to impress). I did a better job, and again we had a really good audience; I don’t think I’ve ever had so many people from the audience thank me for the session on their way out. My Irish surname comes from many generations back, so I don’t have much of a connection with the island, but I’ve come away feeling positive from every trip I’ve made there, North or South, and I’ve volunteered to do events in either place again.

We’re getting to the end of the virtualization tour: we have dates in Scotland for March, which will probably be the last. Before that there are seats available in Northampton on 24th Feb. Northampton isn’t somewhere we’ve held events before, but it’s easy to get to. We keep sneaking new bits into each session and I’m now including demos of live migration in Server 2008 R2. Just follow the link to book your place.

This post originally appeared on my technet blog.

January 31, 2009

Checking and enabling Remote Desktop with PowerShell

A couple of posts back I mentioned that I was working on a configuration library for Server 2008 R2 Core and Hyper-V Server R2 and this includes checking and setting the configuration for remote desktop.

It turns out that this is controlled from just two registry values – which is why it is managed by the SCRegEdit script. One is fDenyTSConnections under ‘HKLM:\System\CurrentControlSet\Control\Terminal Server’ and the other is UserAuthentication under ‘HKLM:\System\CurrentControlSet\Control\Terminal Server\WinStations\RDP-Tcp’. If a value exists it appears as an item property in PowerShell and can be set; otherwise it has to be created. I’ve found the safest way is to try to set the value, trap the error which occurs if it doesn’t exist, and then create it, specifying that it is a DWORD. So my function enables Remote Desktop UNLESS -Disable is specified, and the -LowSecurity switch tells it not to demand stronger user authentication.

 

Function Set-RemoteDesktopConfig
{Param ([switch]$LowSecurity, [switch]$Disable)
 if ($Disable) {
     set-ItemProperty -Path 'HKLM:\System\CurrentControlSet\Control\Terminal Server' `
                      -Name "fDenyTSConnections" -Value 1 -ErrorAction silentlycontinue
     if (-not $?) {new-ItemProperty -Path 'HKLM:\System\CurrentControlSet\Control\Terminal Server' `
                      -Name "fDenyTSConnections" -Value 1 -PropertyType dword}
     set-ItemProperty -Path 'HKLM:\System\CurrentControlSet\Control\Terminal Server\WinStations\RDP-Tcp' `
                      -Name "UserAuthentication" -Value 1 -ErrorAction silentlycontinue
     if (-not $?) {new-ItemProperty -Path 'HKLM:\System\CurrentControlSet\Control\Terminal Server\WinStations\RDP-Tcp' `
                      -Name "UserAuthentication" -Value 1 -PropertyType dword}
 }
 else {
     set-ItemProperty -Path 'HKLM:\System\CurrentControlSet\Control\Terminal Server' `
                      -Name "fDenyTSConnections" -Value 0 -ErrorAction silentlycontinue
     if (-not $?) {new-ItemProperty -Path 'HKLM:\System\CurrentControlSet\Control\Terminal Server' `
                      -Name "fDenyTSConnections" -Value 0 -PropertyType dword}
     if ($LowSecurity) {
         set-ItemProperty -Path 'HKLM:\System\CurrentControlSet\Control\Terminal Server\WinStations\RDP-Tcp' `
                          -Name "UserAuthentication" -Value 0 -ErrorAction silentlycontinue
         if (-not $?) {new-ItemProperty -Path 'HKLM:\System\CurrentControlSet\Control\Terminal Server\WinStations\RDP-Tcp' `
                          -Name "UserAuthentication" -Value 0 -PropertyType dword}
     }
 }
}

Finding out what the settings are is even easier.

Function Get-RemoteDesktopConfig
{if     ((Get-ItemProperty -Path 'HKLM:\System\CurrentControlSet\Control\Terminal Server').fDenyTSConnections -eq 1)
        {"Connections not allowed"}
 elseif ((Get-ItemProperty -Path 'HKLM:\System\CurrentControlSet\Control\Terminal Server\WinStations\RDP-Tcp').UserAuthentication -eq 1)
        {"Only Secure Connections allowed"}
 else   {"All Connections allowed"}
}
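For completeness, here is a quick usage sketch of the pair – my own example, not part of the script; both functions need an elevated prompt since they write to HKLM:

```powershell
# Enable Remote Desktop, leaving the NLA requirement as it was
Set-RemoteDesktopConfig
Get-RemoteDesktopConfig         # "Only Secure Connections allowed" if UserAuthentication is 1

# Also allow older clients that can't do Network Level Authentication
Set-RemoteDesktopConfig -LowSecurity
Get-RemoteDesktopConfig         # "All Connections allowed"

# Turn Remote Desktop off again
Set-RemoteDesktopConfig -Disable
Get-RemoteDesktopConfig         # "Connections not allowed"
```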

The next part of the configurator to share will be for checking and setting firewall rules.

This post originally appeared on my technet blog.

January 27, 2009

Managing Windows Update with PowerShell

I mentioned some of the server management and config tools I’ve been creating with Server 2008 R2 Core in mind, and I’m going to put the major bits here. I’ll post the whole script at the end of the set of posts, but for now here is the code to look after Windows Update.


First I created two hash tables to map the numbers used in the functions to some meaningful text

$AutoUpdateNotificationLevels= @{0="Not configured"; 1="Disabled"; 2="Notify before download";
                                 3="Notify before installation"; 4="Scheduled installation"}

$AutoUpdateDays=@{0="Every Day"; 1="Every Sunday"; 2="Every Monday"; 3="Every Tuesday"; 4="Every Wednesday";
                  5="Every Thursday"; 6="Every Friday"; 7="Every Saturday"}

Next, there is a COM object for everything relating to Auto Update. It has a Settings property which contains the notification level, the update day, the hour at which updates are fetched and whether recommended updates are included. Setting the properties is pretty easy, and there is a Save method to commit them. I’ve been lazy here: I haven’t got a hash table mapping names to numbers for the notification level, or multiple switches, so the user needs to know the notification-level numbers (or enter $AutoUpdateNotificationLevels at the prompt to see what is in the table) – I might fix that for the final script.

Function Set-WindowsUpdateConfig
{Param ($NotificationLevel, $Day, $Hour, $IncludeRecommended)
 $AUSettings = (New-Object -com "Microsoft.Update.AutoUpdate").Settings
 if ($NotificationLevel)  {$AUSettings.NotificationLevel        = $NotificationLevel}
 if ($Day)                {$AUSettings.ScheduledInstallationDay = $Day}
 if ($Hour)               {$AUSettings.ScheduledInstallationTime= $Hour}
 if ($IncludeRecommended) {$AUSettings.IncludeRecommendedUpdates= $IncludeRecommended}
 $AUSettings.Save()
}

To show what the settings are, I decode them and return a custom object with the decoded properties.

Function Get-WindowsUpdateConfig
{$AUSettings = (New-Object -com "Microsoft.Update.AutoUpdate").Settings
 $AUObj = New-Object -TypeName System.Object
 Add-Member -inputObject $AuObj -MemberType NoteProperty -Name "NotificationLevel"   `
            -Value $AutoUpdateNotificationLevels[$AUSettings.NotificationLevel]
 Add-Member -inputObject $AuObj -MemberType NoteProperty -Name "UpdateDays"          `
            -Value $AutoUpdateDays[$AUSettings.ScheduledInstallationDay]
 Add-Member -inputObject $AuObj -MemberType NoteProperty -Name "UpdateHour"          `
            -Value $AUSettings.ScheduledInstallationTime
 Add-Member -inputObject $AuObj -MemberType NoteProperty -Name "Recommended updates" `
            -Value $(if ($AUSettings.IncludeRecommendedUpdates) {"Included."} else {"Excluded."})
 $AuObj
}

Checking on MSDN I found there is another object, used in a sample script called WUA_SearchDownloadInstall, which does what its name says: it searches Windows Update for updates, downloads them and installs them. I added logic to my function to override the selection criteria, and to auto-restart if a restart is needed. Since it is sometimes useful to patch virtual machines, shut them down, then patch the host, reboot it and bring the VMs back again, I’ve also put in a shutdown-after-update switch.


The logic is simple enough: create a Session object, which has CreateUpdateSearcher, CreateUpdateDownloader and CreateUpdateInstaller methods. Then create a searcher and use it to get updates matching the default or specified criteria. If there are any updates, create a downloader object, hand it the list of updates found by the searcher and start a download. If the download completes successfully, filter out the successfully downloaded items and pass those into a newly created installer object. Run the installation process, and afterwards output a table showing the state of the updates. Finally, reboot if needed.

Function Add-WindowsUpdate
{Param ($Criteria="IsInstalled=0 and Type='Software'", [switch]$AutoRestart, [switch]$ShutdownAfterUpdate)
 $resultcode= @{0="Not Started"; 1="In Progress"; 2="Succeeded"; 3="Succeeded With Errors"; 4="Failed"; 5="Aborted"}
 $updateSession = new-object -com "Microsoft.Update.Session"
 write-progress -Activity "Updating" -Status "Checking available updates"
 $updates=$updateSession.CreateUpdateSearcher().Search($criteria).Updates
 if ($Updates.Count -eq 0)  {"There are no applicable updates."}
 else {
       $downloader = $updateSession.CreateUpdateDownloader()
       $downloader.Updates = $Updates
       write-progress -Activity "Updating" -Status "Downloading $($downloader.Updates.count) updates"
       $Result= $downloader.Download()
       if (($Result.Hresult -eq 0) -and (($result.resultCode -eq 2) -or ($result.resultCode -eq 3))) {
           $updatesToInstall = New-object -com "Microsoft.Update.UpdateColl"
           $Updates | where {$_.isdownloaded} | foreach-Object {$updatesToInstall.Add($_) | out-null}
           $installer = $updateSession.CreateUpdateInstaller()
           $installer.Updates = $updatesToInstall
           write-progress -Activity "Updating" -Status "Installing $($Installer.Updates.count) updates"
           $installationResult = $installer.Install()
           $Global:Counter = 0     # index into the per-update installation results
           $installer.updates | Format-Table -autosize -property Title,EulaAccepted,@{label="Result";
                                  expression={$ResultCode[$installationResult.GetUpdateResult($Global:Counter++).resultCode]}}
           if ($autoRestart -and $installationResult.rebootRequired) {shutdown.exe /t 0 /r}
           if ($ShutdownAfterUpdate) {shutdown.exe /t 0 /s}
       }
 }
}


So now I can run
Add-WindowsUpdate -Auto to download updates and reboot if needed,
Set-WindowsUpdateConfig -N 4 -I to schedule updates (including the merely recommended), and
Get-WindowsUpdateConfig to check the current settings.


So next up it’s Get-RemoteDesktopConfig and Set-RemoteDesktopConfig.

This post originally appeared on my technet blog.

January 26, 2009

PowerShell and the smarter command line.

Filed under: Powershell,Virtualization,Windows Server,Windows Server 2008 — jamesone111 @ 2:49 pm

I mentioned I was doing some PowerShell work to manage configuration on the R2 versions of Windows Server 2008 Core and Hyper-V server (which now support PowerShell), and somebody in Redmond asked me if I knew  there were tools out there to do this job. …

I was thinking about how we find stuff, and something I wrote about taxonomy came to mind.

Store data in the data, not in the field name. Do not create a long list of properties with yes/no answers. Not only is this awkward for users, but the sequence “Relates to product A, Relates to product B” stores Yes and No as the data. A multi select box “Relates to products…” stores the information where you can search it.  (This was dealing with document properties in the first release of Sharepoint Portal Server.) 

What has this got to do with command lines? It’s about thinking about where the meaning is. I would say it is true and self-evident that the commands we enter should reflect the task being carried out. But (pre-PowerShell at least) it doesn’t happen. For example: suppose you want to configure Automatic Updates on Server 2008. If you run the full version of Server, the option is on the front page of Server Manager. But if, instead, you run the Core version, what do you type at the command prompt? (HELP won’t give you the command.) You can trawl through all the .EXEs, .BATs, .CMDs, .VBSs, .WSFs and so on, but you won’t find anything with a name like AUCONFIG. If you’re lucky you’ve got the Core guide from the Windows Server 2008 step-by-step guides page. That will tell you that you need to run

cscript scregedit.wsf /AU 4

The part which identifies the task (set updates to automatic) isn’t the script name (SC Reg Edit) but the switches: /AU and 4. If the task was to set updates to disabled, the switches would be /AU and 1. Traditional command-line tools have a name which reflects what they ARE – the Server Core Registry Editor, ScRegEdit, in this case. These tools are overloaded, carrying out different tasks based on their switches (if you want a worse case, look at the Network Settings Shell – NetSh).

At the command prompt there is no way to discover the tasks you can carry out and their Command+Switch or Command+Switch+Value combinations; you have to resort to product documentation, training, or a helpful expert who already knows that /AC tells SCRegEdit you want to Allow Connections (via Terminal Services) but /AU sets Auto Updates (where else does 1 mean disabled?). By contrast, PowerShell would have multiple commands for the different tasks – with names which reflect what they DO: “Get-UpdateConfig”, “Set-UpdateConfig”, “Get-RemoteDesktopConfig”, “Set-RemoteDesktopConfig”. The commands are easily discoverable; and having found you have Set-UpdateConfig, the tab key helps you to discover switches like -Disabled, -InstallConfirmation, -DownloadConfirmation and -Automatic instead of the 1, 2, 3 and 4 used by ScRegEdit /AU.

It’s easy to see how the command-line tools that we have grew up: ScRegEdit edits the registry on Server Core (hence the name), /AU sets the registry entries for Auto Update, and so on. But understanding it doesn’t remove the desire for PowerShell naming and discoverability. V2 introduces Rename-Computer and Add-Computer (if you ask “add the computer to what?”, the help will tell you: to a domain or workgroup). These tasks were previously done by the NetDom program with its switches. My aim is to add commands like “Get-UpdateConfig”, “Set-UpdateConfig”, “Get-RemoteDesktopConfig”, “Set-RemoteDesktopConfig”, “Get-IPConfig” and “Set-IPConfig” (and a few more); at their simplest these can be wrappers for SCRegEdit, NetDom, NetSh, Net and the others, but the ideal is to go straight to management objects instead.
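At their simplest, such wrappers just translate friendly switches into the magic numbers. A hypothetical sketch (the function and switch names are my own; the /AU values 4, 1 and /v come from the Server Core documentation for scregedit.wsf):

```powershell
# Hypothetical wrapper turning ScRegEdit's /AU numbers into named switches.
Function Set-UpdateConfig
{Param ([switch]$Automatic, [switch]$Disabled)
 $scregedit = Join-Path $env:windir "System32\scregedit.wsf"
 if     ($Automatic) {cscript //nologo $scregedit /AU 4}    # 4 = install automatically
 elseif ($Disabled)  {cscript //nologo $scregedit /AU 1}    # 1 = disabled
 else                {cscript //nologo $scregedit /AU /v}   # /v shows the current value
}
```

Now the task is in the command name, and the tab key will find both the command and its switches.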

So next I’ll look at a couple of the simpler ones.

This post originally appeared on my technet blog.

January 25, 2009

New build of my PowerShell library for Hyper-V on codeplex

Filed under: Powershell,Windows Server,Windows Server 2008 — jamesone111 @ 11:47 pm

Some naked statistics


1942 – the number of lines of code in HyperV.PS1


499 – the number of lines of formatting XML


14,381 – the number of words in the documentation file


2443 – the number of downloads of the last version


929 – the number of downloads for the versions before that


1.00 – the version number of the Library now on codeplex.


 


I’m calling this “released”. I daresay I will need to come back and do a service pack later, and the documentation must be regarded as “preliminary” but it’s done.


 


Update. A new stat: 200 – the number of downloads so far. Crumbs, that’s only a day and a half.

This post originally appeared on my technet blog.

PowerShell 2, Server Core R2 and Hyper-V server R2

Filed under: Powershell,Virtualization,Windows Server,Windows Server 2008 — jamesone111 @ 7:01 pm

With all the betas out at the moment I’ve been trying to get up to speed on both Windows 7 (by using it) and Windows Server 2008 R2 , and Hyper-V server R2 as well.


If you’re a regular reader you won’t be surprised to know that I’m excited by what’s coming in PowerShell. PowerShell V2 is better in a lot of regards, although I’m not getting the best out of it yet. Quite aside from the graphical editor, the additional cmdlets, and the restored ability to drag and drop a file into a command window, it’s got better tab expansion. PowerShell has its own function you can customize for tab expansion, and there are two improvements in V2. The first is that tab expansion finds functions you have written as well as built-in cmdlets. Much typing and confusion saved there… but something I only noticed today is that it expands parameters: can’t remember if you named a parameter “-Path” or “-VHDPath”? Just type the - sign and hit [Tab] and it cycles through the list. It sounds like a feeble reason to upgrade, but trips back to a script file to find a parameter add up to a fair old chunk of time saved.


The other thing which is big news is the ability to run PowerShell on Core and on Hyper-V Server. Unlike the full server, PowerShell is not installed by default (I’ve no idea why it is on one and not the other). Of course you shouldn’t be going to the console of a Core / Hyper-V Server box to do admin, but the ability to “remote” PowerShell is a biggy. If PowerShell is installed AND remote management via WinRM is enabled, then you can run any PowerShell command on a box in your data centre from your desktop machine. From my point of view this is a great push for my PowerShell library for Hyper-V on Codeplex – incidentally, an update has been waiting for me to put the finishing touches to it since before Christmas and should appear shortly.
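For example, something like this (a sketch of my own: it assumes remoting has been enabled on the server – Enable-PSRemoting or winrm quickconfig – and that the server name and the path to the Hyper-V library are made up for illustration):

```powershell
# Run a command on a Core / Hyper-V Server box from a desktop machine.
# "Core-27" and C:\Scripts\HyperV.ps1 are placeholder names for this example.
Invoke-Command -ComputerName Core-27 -ScriptBlock {
    . C:\Scripts\HyperV.ps1        # load the Hyper-V library in the remote session
    Get-VM                         # list the virtual machines on that host
}
```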


I’ve been looking at the Hyper-V configurator from R2

===============================================================================
                            Hyper-V Configuration

===============================================================================

 

1) Domain/Workgroup:                    Workgroup:  WORKGROUP

2) Computer Name:                       WIN-75FRHHINP4U

3) Network Settings

4) Add Local Administrator


5) Windows Update Settings:             Manual
6) Download and Install Updates
7) Remote Desktop:                      Enabled (all clients)
8) Failover Clustering Role             Enabled
9) Configure Remote Management


10) Regional and Language Options
11) Date and Time

12) Do not display this menu at login
13) Log Off User

14) Restart Server

15) Shut Down Server

16) Exit to Command Line

Some of the things it does are native PowerShell commands, for example V2 has commands for renaming a server, adding it to a domain, rebooting it and shutting it down

Rename-Computer -New "Core-27"

Add-Computer -DomainName "Contoso" -Reboot

Stop-Computer

Restart-Computer


are all pretty easy to understand – not many people I meet can give me the command lines for NetDom and Shutdown to do the same things. Three more commands cover the regional/language and date/time options and logging off:

control.exe "intl.cpl"

control.exe "timedate.cpl"

logoff.exe
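For comparison, here is a hedged sketch of the pre-PowerShell equivalents of the rename/join/shutdown cmdlets above, using the stock NetDom and Shutdown tools (the machine and domain names are placeholders, and exact switches may vary between OS versions):

```powershell
# Rename the machine - the old-style equivalent of Rename-Computer
netdom renamecomputer $env:COMPUTERNAME /newname:Core-27

# Join a domain and reboot - equivalent of Add-Computer -DomainName ... -Reboot
netdom join $env:COMPUTERNAME /domain:Contoso /reboot

# Shut down or restart - equivalents of Stop-Computer / Restart-Computer
shutdown /s /t 0
shutdown /r /t 0
```

Which rather makes the point: the cmdlet names are guessable, the switch combinations are not.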


So what about the other options: Remote Desktop, networking, cluster installation, Windows Update (settings and download), adding local admins and configuring remote management? If NetDom and Shutdown aren’t easy to remember, NetSH, ScReg, WinRM and the others are worse. Well, I’ve been coding them up, and a couple of hundred lines thrown together over the weekend do what it took a couple of thousand lines of VBScript to do. That’s not entirely a fair comparison, because the VB scripts which come with the OS are designed to be portable across languages, provide help and catch errors way beyond what I do in PowerShell.
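For a flavour of the tools those PowerShell wrappers sit on top of – a minimal sketch using the stock Server Core utilities, assuming default paths; this is my illustration, not the author’s actual code:

```powershell
# Enable Remote Desktop (Scregedit.wsf is the stock Server Core helper script;
# /ar 0 allows remote connections)
cscript C:\Windows\System32\Scregedit.wsf /ar 0

# Enable remote management via WinRM (-q suppresses the confirmation prompt)
winrm quickconfig -q

# Open the firewall for remote administration with NetSH
netsh advfirewall firewall set rule group="remote administration" new enable=yes
```

Wrapping each of these in a friendly PowerShell function with sensible parameters is exactly the kind of job a couple of hundred lines covers.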


I’ll break that code into a bunch of easy to digest posts over the next few days.

This post originally appeared on my technet blog.

January 16, 2009

Hyper-V licensing changes

Filed under: Virtualization,Windows Server,Windows Server 2008 — jamesone111 @ 8:00 am

A few days ago I wrote about our licensing for Hyper-V. The boys at VMware had picked up that we required Client Access Licences for clients of VMs running on Windows. The “if it runs on Windows it needs a CAL” philosophy has the virtue of simplicity, but it makes virtualization projects expensive if you haven’t already got Server 2008 CALs. If there was a degree of ambivalence in that post, it was because I like simplicity in licensing, but it just seemed wrong for us to ask for a CAL for a Linux client to access a Linux server.


Well… we’ve added to the IFs, ANDs or BUTs count in the product use rights. But I’m not expecting any complaints. Here’s the key part of a “customer ready” mail which we have been given to share with Volume Licence customers:


A number of trends, including consolidation and high availability, are driving more deployments and evaluations of Hyper-V. Based on feedback from our customers, we are updating our licensing policies to address these new scenarios enabled by virtualization.


Currently, if your physical server environment is running Windows Server 2003, matching version CALs are required for all users (i.e. Windows Server 2003 CALs). However, if you move your physical Windows Server 2003 Operating System Environments (OSE) to run as virtual machines hosted by Windows Server 2008 Hyper-V, Windows Server 2008 CALs are required. This is per the current use rights. With the change in our licensing policy, Windows Server 2008 CALs are no longer required if you are using Windows Server 2008 solely as a virtualization host. The only exception to this is if you are running Windows Server 2008 virtual machines, which would require Windows Server 2008 CALs.


If you would like more in-depth information on this change, please read the updated Volume Licensing Brief (note that for now these terms apply only to customers covered by a Volume Licence).


Oh and a quick tip of the hat to Chris Wolf, who saw this coming.


January 8, 2009

Windows Server 2008 R2 and Windows 7 client Betas on Technet

Filed under: Beta Products,Windows Server 2008,Windows Vista — jamesone111 @ 3:33 am

I’m not exactly delighted to be blogging at 3:13 in the morning, but I’m watching the keynote from CES. In the last 45 minutes we’ve published a press release about Windows 7. Steve Ballmer said that “TechNet and MSDN subscribers can download it now”. I was watching the site propagate, and the downloads are there now. What do you mean you’re not a TechNet subscriber? Well, you’ll be able to get it in a few days.

I picked up from Mary Jo’s blog that there was a live FriendFeed for those watching. Mary Jo noticed that everything says “the beta”, not Beta 1 or some such. Someone else was thinking along the lines I outlined in my previous post, and suggested that shipping 7 on July 7th (7/7) was too good to miss. Touch in Windows 7 is going to get a lot of attention; I can see quite a few “lightweight Surface” type apps being built with that.

Bonus info: Halo Wars will be out in February and a new Halo 3 game (Halo 3 ODST) will be out before the end of the year.


January 5, 2009

Lies, damn lies and licence interpretations.

From time to time people ask me who I write for, and I always say I write for myself in the hope that there are enough people out there like me to make a reasonable size audience. It always surprises me how many people inside Microsoft read this blog, not to mention the number of competitors who come here to read my impeccably researched and completely impartial comments (and in return I read their lies, twisted truths and false malicious implications. Ha. Ha.)


Someone pointed me to a post of Mike DiPetrillo’s from just before Christmas with the so-charming title of “Microsoft lies to their customers again”. Mike’s beef is that people who work for Microsoft have said things to customers which contradict what we have posted in public. Unwilling to resist a good title, he’s chosen to make this ineptitude sound like some sort of corporately organized conspiracy… The odd thing is that he is complaining about something you’ll hear people from his own company say.

During 2008, people from VMware complained that Microsoft was playing dirty with licensing rules for virtualization – that VMware could not use the bundled instances of the operating system included with the Enterprise and Datacenter versions of Server 2003 R2 and Server 2008. Allowing customers to do less with your product if they also buy someone else’s product tends to have regulators beating your doors down: if customers get a certain number of bundled instances with a licence, that has to apply regardless of the virtualization technology. Indeed, I constantly have to explain to people that the reason you can’t use anything but virtualization on top of Windows Server Enterprise with the full complement of 4 VMs is that if you did, you’d have 5 working copies of Windows vs 4 with VMware, and someone would cry foul. We put out a Licensing FAQ to try to make things clear (I wish we lived in a world where licence agreements could be so clear that no FAQs were needed, but legal documents and clarity rarely go together).

However… not everyone at Microsoft understands all the nuances of licensing, or government regulation. Every so often I see someone saying “VMware told my customer they could assign a server licence to a box and use that licence for Windows VMs running on VMware. Where do I find something to fight that lie?” and some kindly person has to point out that it is no lie, and steer the poor chap to the FAQ. If anyone out there meets Microsoft people who are still getting this wrong (and doesn’t have a better channel), mail me and I’ll gently set them straight.


[Update] The rest of this post has been overtaken by events – it is easier to remove it than to explain…


