James O'Neill's Blog

December 23, 2008

Grrr. A Christmas present which is just going to have to wait.

Filed under: Uncategorized — jamesone111 @ 10:08 pm

I’m not a Christmas person. Among other things I’ve no imagination for Christmas presents, for myself or other people. Then I saw this…

Early Christmas Present from PowerShell Team: Community Technology Preview-3 (CTP3) of Windows PowerShell V2

Having watched and …er… “helped” my son with his new Lego recently, I was reminded that I said years ago that Lego was great training for working with computers – and playing with PowerShell is no different to playing with Lego. I doubt that the people I’m spending Christmas with would see it that way…

 

Whatever you’re playing with, whatever festival you mark at this time of year and whoever you’re spending it with, let me wish you a merry one.

This post originally appeared on my technet blog.

December 20, 2008

Borrowing from Windows Explorer in PowerShell part 3: using extended properties with zip files

Filed under: Powershell — jamesone111 @ 6:00 pm

So a couple of posts back I showed a little bit of PowerShell which could create a new Zip file and hook into Windows Explorer’s object model to add files to it.


There always used to be a joke about backup products … “Restore sold separately”. So having shown the code to add files, I need something to copy files out of a zip.

Function Copy-ZipContent
{Param ($zipFile, $Path)
    if (test-path $zipFile) {
        $shell      = new-object -com Shell.Application
        # NameSpace() wants a full path, so resolve whatever we were given
        $destFolder = $shell.NameSpace($Path)
        $destFolder.CopyHere($shell.NameSpace(((resolve-path $zipFile).path)).Items())
    }
}
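A quick usage sketch – the paths here are hypothetical, and the destination folder must already exist, since NameSpace() returns nothing for a folder that isn’t there:

Copy-ZipContent -zipFile 'C:\temp\archive.zip' -Path 'C:\temp\extracted'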

Of course now I want to see what is in the ZIP file. Since I’ve stuck to PowerShell naming with Add-ZipContent and Copy-ZipContent, this function should probably be named Get-ZipContent. It uses the same process as before – get a Shell.NameSpace object to represent the ZIP file and then use its GetDetailsOf() method, just as getting folder properties did in the second part of this series. Where a normal folder has 266 extended properties, a ZIP archive has only 9. So the function I ended up with looks like this.

Function Get-ZipContent
{Param ($zipFile=$(throw "You must specify a Zip File"), [Switch]$Raw)
    if (-not $ZipFile.EndsWith('.zip')) {$ZipFile += '.zip'}
    if (test-path $ZipFile) {
        $shell = new-object -com Shell.Application
        $NS    = $shell.NameSpace(((resolve-path $ZipFile).path))
        # For each file in the zip, build an object with one property for each
        # of the 9 columns (0..8) the zip namespace exposes
        $Files = $(foreach ($item in $NS.items()) {
                     0..8 | foreach -begin   {$ZipObj = New-Object -TypeName System.Object} `
                                    -process {Add-Member -inputObject $ZipObj -MemberType NoteProperty `
                                                  -Name ($NS.GetDetailsOf($null,$_)) -Value ($NS.GetDetailsOf($item,$_))} `
                                    -end     {$ZipObj} })
        if ($Raw) {$Files} else {$Files | format-table -autosize}
    }
}

This function uses a trick I’ve used more and more lately: a loop which creates an object at the begin stage, adds a property to it on each iteration, and emits the object at the end stage. In this case the loop runs once for each file, going through the numbers 0..8; the property name comes from GetDetailsOf with $null in place of the file, and the value from calling GetDetailsOf with the file. If the -Raw switch is specified it outputs the objects so something else can use them; normally I want something like this.

Name                 Type                      Compressed size Password protected Method   Size    Ratio Date modified    CRC-32
----                 ----                      --------------- ------------------ ------   ----    ----- -------------    ------
hyperv.format.ps1xml PS1XML File               1.60 KB         No                 Deflated 23.0 KB 94%   18/12/2008 22:49 CC973E54
hyperv.ps1           Windows PowerShell Script 26.5 KB         No                 Deflated 122 KB  79%   19/12/2008 00:07 D9877F44
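Stripped of the zip specifics, the begin/process/end trick is reusable anywhere you want to turn a list of property names into one object per item. Here’s a self-contained sketch using Get-Process – the three property names are just ones I picked for the illustration:

$props = "Name","Id","Handles"
Get-Process | Select-Object -first 3 | foreach {
    $proc = $_
    # One object per process; one pass through the inner loop per property
    $props | foreach -begin   {$obj = New-Object -TypeName System.Object} `
                     -process {Add-Member -inputObject $obj -MemberType NoteProperty -Name $_ -Value $proc.$_} `
                     -end     {$obj}
}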

It won’t take you very long looking at the code to see that I don’t recurse through directories stored in the zip file. I re-used some code I had for formatting something as a tree and put it into the final version, which is in the attached file along with the other ZIP tools (subject to the terms of use that cover this site – see the link at the foot of the page).
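For what it’s worth, the shape of that recursion is roughly the sketch below – folder items inside a zip namespace report IsFolder, and their GetFolder property is another namespace to walk. Get-ZipTree and its indent parameter are names I’m making up here, not what’s in the attachment:

Function Get-ZipTree
{Param ($NameSpace, $indent="")
    foreach ($item in $NameSpace.Items()) {
        "$indent$($item.Name)"
        # Folders inside the zip are themselves namespaces, so recurse into them
        if ($item.IsFolder) {Get-ZipTree -NameSpace $item.GetFolder -indent ($indent + "    ")}
    }
}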

This post originally appeared on my technet blog.

December 19, 2008

What will Windows 7 be like … magnificent? The Goldilocks question.

Filed under: Beta Products — jamesone111 @ 3:07 pm

It’s odd to think that I’ve been using Windows Vista as my main OS for more than 2 years. We’ve started taking the wrappers off “Windows 7” and “Server 2008 R2”, and at Tech-ed in Barcelona people were asking when they could get a beta, when the product would ship and so on. I have summed this up in internal discussions with “Those who know aren’t talking and those who talk don’t know”. The best answer we got for the beta was “It might be before Christmas, but then again it might be after”. Ah yes, put a Microsoft field person on the spot about dates and we become evasive enough to make the average politician’s answers look clear and direct. There are still a few days left for me to be proved wrong, but it looks like early in the new year. As for ship dates, we said when Vista shipped that it would have a 3 year life – taking it to the end of 2009 – and in the run-up to the launch of Server 2008 we used slides which indicated “R2” at the end of 2009. We haven’t gone public on any changes to that, and a beta at the start of the year would tie in nicely with it too.

I’m going to switch to 7 when the beta arrives, so that means the end of the road for Vista and me. I’ve genuinely liked the OS, but I’d need to have my head in the sand to think that everyone liked it as much as I have. So I’ve been thinking about what people will make of 7, and I’ve come up with a few examples.

“All Microsoft OSes are dreadful. It’s their natural state: XP was bad, I never used Vista but it was worse, 7 will be worse still.”  Thank you, professor. There’s precious little point in my talking to you. Next.

“Microsoft OSes were OK up to and including XP. Things went off the rails with Vista.”  Now there are two ways this can go.
Either “The only hope is for Microsoft to go back to XP, and refine that”. That just won’t happen. The Windows 5.x generation (Windows 2000, XP, and Server 2003) is behind us; we are in the 6.x generation (Vista, 2008, 2008 R2, and “Seven”). The handful of things which can’t be made to work on Vista won’t work on ANY future OS.
Or “As a point release of Vista, Seven will deliver what Vista should have been in the first place”.  I’m probably going to have to resist the urge to disagree with this. Vista was what it should have been for 2006. If we’d somehow produced 7 back then, it would have had all the problems which come from a generational shift. But when people are ready to move, it’s daft to beat them up for being slow to come to that decision.

“Vista was probably OK, but we didn’t move up to the 5.x generation in its first couple of years either. The 6.x generation is the same.”  This isn’t a good thing in an IT department: “We can’t deploy anything new” is a bad thing to say when the word “technology” is part of your title. But again, if you’re ready to move now, there’s no sense in asking “what took you so long”.

“We’ve deployed Vista.” This can go three ways.
Either “The change to 7 is small enough to make the step up really easy and we can see the benefits” – thank you, no need to talk to you, but would you like to talk to some of my other customers!
Or “The change to 7 is small enough that we can’t see the benefits, so we’re going to stick with Vista”
Or “We’ve been through the pain of one OS upgrade, we’re not doing another one any time soon”

And that is the Goldilocks question: “This jump is too big” – “This jump is too small” – and hopefully “This one is just right”. Of course the only way to find out is to try the porridge for yourself, which, hopefully, is where we’ll be in a few weeks.

This post originally appeared on my technet blog.

December 18, 2008

IE Security Patch

Filed under: Events,Security and Malware — jamesone111 @ 11:48 am

You may have seen in the news over the last few days that a vulnerability has come to light in IE which allows a carefully crafted web page to run arbitrary code on a PC. I don’t assess the technical side of vulnerabilities – some of the things written about how serious this one is verge on the hysterical, and some downplay it too far. There are two web casts scheduled to talk about it: Wednesday, December 17, 2008 1:00 P.M. Pacific Time / 9PM GMT and Thursday, December 18, 2008 11:00 A.M. Pacific Time / 7PM GMT, if you want to get chapter and verse.
In any event the fix is now on Windows Update. It’s serious enough to put a fix out without sticking to our normal schedule. Our biggest worry with every fix we post is that it gets reverse engineered, so get this one installed on any machine where you use IE to access the internet. On servers, where you don’t use a browser, or only use it for very limited browsing of trustworthy sites, there is less urgency.

I did read something from a recent customer survey, where a customer wrote that products should be 100% bug free. Realistically, bug-free code is like an error-free newspaper … a great aim, but something which doesn’t really happen. Some minor typos – spelling, punctuation or grammatical errors – can be left without anyone being concerned. Others change the meaning of what is said. Some errors of fact need a correction to be issued (patches), and some can land you in the libel courts. Something like the Nimda virus was the equivalent of a huge libel payout; this one seems to be more than a correction buried somewhere inside the paper and less than a £1M libel payout.

This post originally appeared on my technet blog.

December 10, 2008

Virtualization: user group and good stories

Filed under: Events,Virtualization,Windows Server,Windows Server 2008 — jamesone111 @ 11:05 am

Details of the next Microsoft Virtualisation* User Group meeting now up on www.mvug.co.uk!

Where:   Microsoft London (Cardinal Place)
When: 29th January 2009, 18:00 – 21:30
Who & What: Simon Cleland (Unisys) & Colin Power (Slough Borough Council)
Hyper-V RDP deployment at Slough Borough Council
Aaron Parker (TFL)
Application virtualisation – what is App-V?
Benefits of App-V & a look inside an enterprise implementation.
Justin Zarb (Microsoft)
Application virtualisation – in-depth look at App-V architecture

I presented at an event for Public Sector customers recently and the folks from Slough Borough Council were there. I thought they were a great example because so many of the things we talk about when we’re presenting on Hyper-V actually cropped up in their deployment.

We’ve got another great story from the Public sector at the other end of the British Isles – Perth and Kinross council reckon Hyper-V will save £100,000 in its first year.  

However, the best virtualization story was one told by one of our partners on the recent unplugged tour. Virtualization reduces server count, and that’s great for electricity (cost and CO2), maintenance, capital expenditure and so on. But they had a customer who didn’t care about that. They found the walls were cracking in their office, and the reason was the weight of servers in the room above. According to the structural engineer, they had overloaded the floor of the server room by a factor of 4 and there was a risk that it would collapse onto the office staff below. That’s the first story I’ve heard of virtualization being used to reduce weight.

 

* Being British, the MVUG like using the Victorian affectation of spelling it with an S

This post originally appeared on my technet blog.

December 9, 2008

Borrowing from Windows Explorer in PowerShell part 2: extended properties

Filed under: Powershell — jamesone111 @ 11:07 pm

When I started looking at what could be done using Explorer objects from PowerShell I “discovered” the extended properties. These vary between operating systems: before Windows 2000 the set was pretty rudimentary, XP and Server 2003 had quite an extensive set of properties, and Vista/Server 2008 extends the set still further to 266 items. You can send a list of the properties to the screen with this command.

[ps] > $objShell = New-Object -ComObject Shell.Application
[ps] > $objFolder = $objShell.namespace("C:\")
[ps] > 0..266 | foreach {"{0,3}:{1}" -f $_, $objFolder.getDetailsOf($Null, $_)}
  0:Name
  1:Size
  2:Type
...
264:Frame rate
265:Frame width
266:Total bitrate

It doesn’t matter which folder you choose at line 2. In line 3, I don’t like using format strings with -f in examples, but it’s the easiest way to do the output: {0,3} is ‘argument 0, formatted 3 wide’ – that’s the property number – and {1} is argument 1, the name; getDetailsOf with $null in place of a file returns the column name.
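If the -f operator is unfamiliar, one line shows what those placeholders do:

"{0,3}:{1}" -f 2, "Type"     # the 2 is padded to 3 characters, giving "  2:Type"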


I store this in a hash table in the example below.

function Get-ext
{param ($attributes, $Path=(pwd).path)
    $objShell  = New-Object -ComObject Shell.Application
    $objFolder = $objShell.namespace($Path)
    # Build a hash table mapping each column name to its number
    0..266 | foreach -begin {$Columns=@{}} -process {$Columns.add($objFolder.getDetailsOf($Null,$_), $_)}
    # For each file, build an object with one property per requested attribute
    foreach ($file in $objFolder.items()) {
        $attributes | foreach -begin   {$fileObj = New-Object -TypeName System.Object} `
                              -process {Add-Member -inputObject $fileObj -MemberType NoteProperty -Name $_ `
                                            -Value ($objFolder.GetDetailsOf($file, $Columns[$_]))} `
                              -end     {$fileObj}
    }
}


So the function takes a list of attributes as text. It puts all the attribute names and numbers into a hash table, and for each file it builds an object; the properties are built by saying “for each attribute passed, look up its number and get that attribute for the current file”.


I found that there is an extendedProperty() method on the file objects, but it doesn’t seem to work with all the property names; asking the Folder object to get the properties does. So now I can run a command like this to get the photographic information I want.

Get-ext "name","Title","Tags","f-stop","Exposure Time","ISO Speed" | ft *

This post originally appeared on my technet blog.

December 8, 2008

Borrowing from Windows Explorer in PowerShell part 1 – ZIP files

Filed under: Powershell — jamesone111 @ 1:54 pm

When I was at Tech-ed in Barcelona recently I met Corey Hynes, who looks after building a lot of the labs for these events and some of our internal training too. If you were at Tech-ed and saw Windows 2008 R2 on our stand there, you were seeing some of Corey’s work. He builds a lot of virtual machines and asked if I could add a feature to the codeplex library for Hyper-V, which will show up in the next version I post there. I was looking at the way he distributes his VMs and added a second thing to the library.

Without giving all of his trade secrets away, Corey builds VMs making the maximum use of differencing disks, so if he has 4 machines running the same OS he will have 1 parent and 4 differencing VHDs. This doesn’t give the best performance*, but these small scale environments don’t need it. Corey also makes a lot of use of snapshots to allow a VM to be rolled back – or rolled forward to the state at the end of a lab, again something you’d avoid in production to get the best performance.

His technique is to export each VM, remove the duplicated parent VHD, make a secondary copy of the description files (which are destroyed in the import process) and then compact the whole lot. If anything goes wrong with the VM on the target computer it can be deleted and re-imported just by unpacking the preserved description files. So I thought it would be a good idea to allow my Import-VM function to preserve the files; the line of code it needs is

if ($Preserve) {Add-ZipFile "$path\importFiles.zip" "$Path\config.xml","$path\virtual machines"}

Add-ZipFile ZipName FileNames is all very well as a function specification, but how do you write it? I’m told that for licensing reasons the only way to access ZIP files that Windows provides is via Explorer, so the technique is to create an empty ZIP file and then tell Explorer to add the files to it.

Here’s the code to make a new, empty, Zip file.

Function new-zip
{Param ($zipFile)
    if (-not $ZipFile.EndsWith('.zip')) {$ZipFile += '.zip'}
    # The header of an empty zip: "PK", bytes 5 and 6, then 18 zero bytes
    set-content $ZipFile ("PK" + [char]5 + [char]6 + ([string][char]0) * 18)
}
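If you want to see the header for yourself, a quick sketch (demo is just an example name) dumps the bytes of a freshly created zip:

new-zip demo
[System.IO.File]::ReadAllBytes((resolve-path demo.zip).path) | foreach {"{0:X2}" -f $_}
# Expect 50 4B 05 06 then a run of 00s - the End of Central Directory record
# for an empty archive (set-content may append a trailing newline as well)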

As you can see, a 22 character header marks a file as a ZIP file. The code below adds files to it.

Filter Add-ZIPfile
{Param ($zipFile=$(throw "You must specify a Zip File"), $files)
    # Accept files as a parameter or from the pipeline
    if ($files -eq $null)               {$files = $_}
    if (-not $ZipFile.EndsWith('.zip')) {$ZipFile += '.zip'}
    if (-not (test-path $ZipFile))      {new-zip $ZipFile}
    $ZipObj = (new-object -com shell.application).NameSpace(((resolve-path $ZipFile).path))
    $files | foreach {
        if     ($_ -is [String]) {$ZipObj.CopyHere((resolve-path $_).path)}
        elseif (($_ -is [System.IO.FileInfo]) -or ($_ -is [System.IO.DirectoryInfo])) {$ZipObj.CopyHere($_.fullname)}
        # CopyHere is asynchronous, so pause to let each copy finish
        start-sleep -seconds 2
    }
    $files = $null
}

The key thing is that the Shell.Application object has a NameSpace method which takes a path and returns a folder or zip file as a namespace, and the namespace has a CopyHere method. So the logic is: check for one or more files passed as a parameter or via the pipe; check that the zip file name ends with .zip and if it doesn’t, add the extension; if the file doesn’t exist, create it as an empty zip file; then get a namespace for it and call the CopyHere method for each file passed (if the file was given as a name, resolve it to a full name, and if it is an object, get the full name from the object).
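Either style of call works – these file names are only examples:

# Pass names as a parameter...
Add-ZIPfile backup 'config.xml','notes.txt'
# ...or pipe file objects in
dir *.ps1 | Add-ZIPfile scripts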

Easy …. Now this led me to explore the Shell.application object more, but I’ll make that another post.

 

* Update: Corey pointed out that by sharing a VHD with the common files on it, you maximize the benefit of any read cache. Differencing disks (including the ones used for snapshots) are extended when blocks on the parent change; that’s the slow bit. In a workload with few changes to the disk, a differencing disk can work out faster.

Update 2: Corey was way too polite to mention I’d misspelled his name! I’ve put that right.

This post originally appeared on my technet blog.

December 3, 2008

Two more HPC events.

Filed under: Events,High Performance Computing — jamesone111 @ 3:23 pm

NetEffect presents: iWARP Enabled Ethernet and Windows HPC Server 2008 Deliver Super Computer Performance

Where and When: Live Webcast, December 3, 2008, 9:00am Pacific / 5:00pm GMT, through Live Meeting Here! (yes, today!) An on-demand recording will be available for download and viewing after the live webcast.

There was a day when you needed an army of computer scientists to build a compute cluster that performed well on applications with a lot of fine-grained parallel processing. Not any more: NetworkDirect and iWARP give Windows and a standard Ethernet fabric application run times comparable to proprietary OSes and proprietary clustering fabrics, while maintaining the ease of use expected from Microsoft and Ethernet. See how iWARP transforms Ethernet into a high performance clustering fabric.

Target audience: Developers, Architects, Cluster administrators, HPC end users

 


 

Viglen HPC and Windows HPC Server 2008 – Leveraging Existing Expertise for Simple Cluster Deployment

Where and When: Live Webcast December 8, 2008 4:00pm GMT through Live Meeting Here. On-demand recording will be available for download and viewing after the live webcast.

As High Performance Computing use grows within the mainstream server arena, and larger and more complex HPC solutions are deployed, managing the HPC data centre has become a major consideration in any HPC deployment.
With products ranging from standard 1U pizza-box solutions and high capacity storage nodes to twin-motherboard 1U Intel and AMD nodes, coupled with best-of-breed compilers, management tools and operating systems, Viglen HPC solutions are designed to fit most HPC requirements. Learn how Viglen HPC nodes, coupled with Microsoft Windows HPC Server 2008, allow you to build, deploy and manage HPC clusters using the Windows expertise you already have in house. Join us today!

Target Audience: Cluster administrators, HPC architects, HPC end users

This post originally appeared on my technet blog.

December 2, 2008

PowerShell Verbs Vs Nouns

Filed under: Powershell,Real Time Collaboration — jamesone111 @ 11:54 am

The first big PowerShell project I worked on was to produce the scripts in the OCS Resource Kit. With OCS R2 announced, it won’t come as a great surprise that we’re working on the Reskit again, and I’ve gone back to my scripts. Boy oh boy, have I learnt some stuff in the last year.

  • PowerShell nouns are written in the singular (I had a mix of singular and plural)
  • Be consistent with nouns (don’t have “usage” in one place and “OCSPhoneRouteUsage” in another)
  • Avoid creating new verbs (I’d written LIST- when Get- would have done, and LINK- for Add)
  • Try to allow the user to pipe things into commands – accept either an object or a name with which to fetch the object
  • Allow wildcards in names where possible; allow people to use * even if the underlying queries prefer %

The list goes on. I had already produced a table of verbs vs. nouns for the first release, and I had this post of Jeffrey’s rattling round in my head. Rather than use his code I put my own together, using my new favourite PowerShell cmdlet, Select-String.

Function Get-VerbNounMatrix
{Param ($scriptName)
    # Find lines declaring a function or filter, and take the name after the first space
    $Functions = Select-String -Pattern "^function|^filter" -path $scriptName | foreach {$_.line.split(" ")[1]}
    # Split each name into its verb and noun halves
    $Verbs = $Functions | foreach {$_.split("-")[0]} | sort -unique
    $Nouns = $Functions | foreach {$_.split("-")[1]} | sort -unique
    # For each noun, build an object with a column per verb,
    # marked * if that verb-noun combination exists
    $(foreach ($n in $Nouns) {
        $Verbs | foreach -begin   {$Info = New-Object -TypeName System.Object
                                   Add-Member -inputObject $Info -MemberType NoteProperty -Name "Type" -Value $n} `
                         -process {Add-Member -inputObject $Info -MemberType NoteProperty -Name $_ `
                                       -Value $(if ($Functions -contains "$_-$n") {"*"} else {" "})} `
                         -end     {$Info}
    }) | export-csv -path $scriptName.toUpper().replace("PS1","CSV")
}

 

The $Functions= line gets all the lines in the specified file which start with either “function” or “filter”, splits them at the spaces, and takes the function-name part (after the first space).

The $Verbs= and $Nouns= lines give the function arrays of the two halves of each function name, and then all the work is done in two nested loops.
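For example, given a declaration line, the two splits pull out the parts like this:

"Function Get-OCSUser".split(" ")[1]    # -> Get-OCSUser  (the function name)
"Get-OCSUser".split("-")[0]             # -> Get          (the verb)
"Get-OCSUser".split("-")[1]             # -> OCSUser      (the noun)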

For each noun, it looks at each verb and creates an object with properties whose names match the verbs and whose values are set to a space or a *, depending on whether the verb-noun combination exists; each object also gets a Type property which is the noun. These objects are then sent out to a CSV file.

And here’s the result (with a little cleaning up – the first time I ran it I found a function which was still using the wrong noun; not only is this a great way of showing others quickly what you have, it shows you what you need to go back and fix).

Type Add Choose Export Get Import New Remove Update
ADUser     * *        
OCSADContainer       *        
OCS * Cert       *        
OCSEdgeFederationDenied       *   * *  
OCSEdgeFederationPartner     * * * * *  
OCSEdgeIMProvider       *   * * *
OCSEdgeInternalDomain       *   * *  
OCSEdgeInternalServer       *   * *  
OCSErrorEvent       *        
OCSGlobalUCSetting       *        
OCSInstalledService       *        
OCSLocationProfile   *   *   * *  
OCSMediationServer   *            
OCSMediationServerSetting       *        
OCSMeetingPolicy   *   *   * * *
OCSNormalizationRule   *   *   * * *
OCSPhoneRoute   *   *   * * *
OCSPhoneRouteUsage   *   *   * *  
OCSPICUserCount       *        
OCSPool   *   *        
OCSSchemaVersion       *        
OCSSIPDomain       *   * *  
OCSSipRoutingCert       *        
OCSTrustedService       *        
OCSUCPolicy   *   *   * * *
OCSUser     * * * * * *
OCSUserDetail       *        
OCSWarningEvent       *        
OCSWindowsService       *        

 

This post originally appeared on my technet blog.

December 1, 2008

Green IT and adding up the numbers

Filed under: General musings — jamesone111 @ 4:33 pm

I did the keynote for the virtualization unplugged tour recently, and I tried to draw several themes together in it: virtualization is good for making IT more dynamic (and what that means and why it is a good thing), and virtualization is good for saving money, space and carbon emissions. But I’ve been at pains to point out that some of the things that I see as “good IT” – that is, IT which lets people work in the best way – are part of being dynamic and part of being both green and cost effective. In particular that means using IT for flexible working.

I’m working at home today. I’ve managed to get stuff written thanks to the peace and quiet, and by staying home I’ve avoided putting 25 kilos of CO2 into the atmosphere. I make the point over and over that the environmental benefits from using IT to reduce travel are greater than those from reducing the power our computers use. When I said this at the last event, a gentleman from a company which makes thin client devices asked me about the numbers. He said his organization had recently published a case study for a company which was saving more than 1000 kilowatt-hours per desk per year by taking out their PCs (which were old and ran 24/7) and replacing them with thin clients from his company – which then ran their familiar Windows programs from the company’s data centre. The thin clients can save a shade over 100 watts compared with a normal PC, and a quick bit of mental arithmetic says the numbers add up: 100 watts running 24 hours a day is 2.4 kWh, or getting on for 900 kWh over a year. It’s unlikely that the old PCs could have run Vista, but using a policy on Vista to enforce sleep would have got 70% of that saving.

Different countries use different mixes of fossil and non-fossil fuels; according to the Carbon Trust, 1 kWh of UK grid electricity generates 537 grams of CO2, so this company was saving over half a tonne per year – per desk. It’s a great achievement and it would be churlish to knock it: carbon, and money, saved is saved. But I want to put it in perspective. Another site I checked quoted 300g/mile for air travel, and the average car on UK roads emits about 200g/km – make that 320g/mile. So 1000 kWh is the same as driving 1700 miles or flying 1800 – 3 round-trip flights to Edinburgh, or as far as Greenland on the way to Seattle, or about 25 round trips to the office in the car for me. Those trips are worth about 40 hours; my family gets some of that and the company gets some. Think of it as Microsoft getting two or three more days’ work, without taking more of my time…

Here’s another way to calculate the saving. That same site gives a figure of 2630 grams of CO2 per litre of diesel or 2320 grams per litre of petrol; assume you save the mileage equivalent of the electricity. That fuel’s worth about £240 at the pump – if the company pays for petrol, that money goes to the bottom line, but for a higher-rate tax payer who buys their own fuel it’s like getting a £400 pay rise without costing the company a bean.

Of course some jobs don’t lend themselves to flexible working, and some people get less benefit because they use public transport or drive short distances. But there are plenty of people who can save that much travel in a year. So let’s tot up the benefits of flexible working: the company gets more hours worked, and each hour worked is productive; the employee gets a better work/life balance*; and, depending on exact car arrangements, they share the benefit of lower mileage in fuel and running costs. AND the environment wins.

 

 

* While I was writing, my wife asked if I could collect our son from school one day this week. Since I have only a one-hour meeting that day, I’m going to make it a Live Meeting and work from home.

This post originally appeared on my technet blog.

iPhone Adverts.

Filed under: Mobility — jamesone111 @ 9:12 am

Everyone knows Apple’s advertising annoys people at Microsoft; that’s partly what it’s for. And I’ve held my peace about an iPhone advert which appears to feature a faked series of operations: if you watched the ad closely you could see that a .ZIP file is downloaded from a mail message and magically opens as a document, without ever being unzipped. Not having tried it I can’t be 100% certain that the iPhone doesn’t magically unzip files and open their contents, but I’m sceptical to say the least. So I was quite pleased to hear the Advertising Standards Authority have banned that ad – the issue being that it showed downloads at a speed no 3G network can deliver (albeit with a message that network performance varies).

Incidentally, I’m told that the launch sales of Samsung’s Omnia phone were better than the equivalent figures for the iPhone. The Omnia isn’t for me, but it’s one of 160 Windows Mobile devices – there’s a form factor for most people.

This post originally appeared on my technet blog.
