James O'Neill's Blog

March 22, 2010

How to make and share Panoramas more easily – Microsoft ICE + Photosynth

Filed under: Uncategorized — jamesone111 @ 12:10 pm

I mentioned in yesterday’s post that I was working on a post about scanned images – in fact it’s the same set of scanned images I was using in this earlier post. In that post I talked about putting panoramas into Silverlight Deep Zoom.

The teams behind Microsoft ICE* (the Image Composite Editor) and Photosynth have worked together so the newest version of ICE can upload its results to Photosynth. Photosynth is driven by Deep Zoom, so I don’t have to do any work to get a shared, zoomable panorama, with geotagging thrown in. This is not the only great new addition to ICE**: it also has support for mechanised capture devices which produce an ordered grid of images – or, as ICE terms it, a structured panorama. You can read more on the HDView blog and on the Photosynth blog (Matt mentions that the ability to see geotagged panoramas on the same map as geotagged synths isn’t there yet, but is coming – and I’ve got another post in the pipeline about geotagging photos).

I put together a little video to show just how easy the process is (if the embedded video doesn’t work properly, you can view it in its original location) – at 1280×720 resolution it doesn’t really do Photosynth’s viewer justice. You can click the version below and hit the button on the right for the full-screen view to see just why I like it so much. I have a couple more panoramas which I will rebuild and upload.



* Should I be proud of myself for not making any contrived “ICEmen” jokes?
** I could have said “cool new thing in ICE”.

This post originally appeared on my technet blog.

March 21, 2010

The 50th birthday that would have been

Filed under: Photography — jamesone111 @ 2:42 pm

If you believe in parallel universes then there are ones where Ayrton Senna is celebrating his 50th birthday today, having won 6, 7 or 8 world championships.

In this one, the 50th anniversary of his birth is marked rather more sombrely. It’s not quite 16 years since he died with 3 championships to his name, and last week, when his nephew flipped open the visor on a very similarly patterned helmet to reveal very similar-looking eyes, I felt like I had seen a ghost – and heard Martin Brundle articulate the same thought in his TV commentary. I said pretty much all I wanted to about Ayrton on the fifteenth anniversary of his death. But since I’ve posted recently about scanning pictures, and I have another post in draft about that, I thought I’d share a couple of successful scans.

In 1991 Senna looked like he was going to successfully defend the title he won in 1990 – he won the first 3 races before Nigel Mansell had even recorded a finish, and Mansell wasn’t even the leading Williams driver until the 7th race. The 8th was the British Grand Prix, and I was there. To cap a perfect day for Mansell and a partisan crowd, Senna ran out of fuel on the last lap, and Mansell, on his victory lap, stopped and gave his adversary a lift back to the pits. In this picture you can see how much more exposed the drivers’ heads were in those days – which was to be the death of Senna in another Williams, 3 years later. Riding back like that is forbidden now, and that too speaks of the attitudes to safety. But it says something to me about the nature of sport that drivers could fight for everything on the track, yet offer help and not be humiliated by accepting it.

The cheap film, second-rate lenses and my own technique limit how good the scan of the photo can be – but I hope the story explains why I treasure it. Far better from a technical point of view is this second picture – but sometimes the technical quality isn’t what matters.


To mark the anniversary, Autosport have a page, Ayrton Senna: A life in pictures – the pictures are arranged with the oldest at the bottom. You can see his first F1 test (in a Williams), another shot of the ride home with Mansell in the middle, and, on the left of the top row, quite a poor shot – you can’t tell what it is from the thumbnail. It’s the back of the car, and I generally don’t keep those. The caption reads “Ayrton Senna, Williams FW16 Renault; 1994 San Marino Grand Prix at Imola.” And then slowly it dawns that the wall and trees in the background mean the car is going into the corner named Tamburello, and there’s a big gap to the car behind, so this must be the sixth lap – a few seconds after this shot was taken, the right front wheel of the FW16 hit the wall a little further down than we can see in the shot, and parted company from the car. On another day, or in another universe, it would have passed harmlessly by; but it didn’t, and that picture – like the lift-home one – captures what we now know to be a decisive moment.

This post originally appeared on my technet blog.

March 18, 2010

Virtualization announcements today.

Filed under: Virtualization — jamesone111 @ 5:23 pm

In another window I am listening to the Desktop Virtualization Hour which I blogged about yesterday. A couple of hours ahead of the broadcast we posted the press release on PressPass, which contained the following details of what we are announcing today.

• New VDI promotions available for qualified customers to choose from today. Microsoft and Citrix Systems are offering the “Rescue for VMware VDI” promotion, which allows VMware View customers to trade in up to 500 licenses at no additional cost, and the “VDI Kick Start” promotion, which offers new customers a more than 50 percent discount off the estimated retail price. Eligibility and other details on the two promotions can be found at http://www.citrixandmicrosoft.com.

• Improved licensing model for virtual Windows desktop. Beginning July 1, 2010, Windows Client Software Assurance customers will no longer have to buy a separate license to access their Windows operating system in a VDI environment, as virtual desktop access rights now will be a Software Assurance benefit. [Note the new name VDA is what we used to call VECD.]

• New roaming use rights improve flexibility. Beginning July 1, 2010, Windows Client Software Assurance and new Virtual Desktop Access license customers will have the right to access their virtual Windows desktop and their Microsoft Office applications hosted on VDI technology on secondary, non-corporate network devices, such as home PCs and kiosks.

• Windows XP Mode no longer requires hardware virtualization technology. This change simplifies the experience by making virtualization more accessible to many more PCs for small and midsize businesses wanting to migrate to Windows 7 Professional or higher editions, while still running Windows XP-based productivity applications.

• Two new features coming in Windows Server 2008 R2 Service Pack 1. Microsoft Dynamic Memory will allow customers to adjust the memory of a guest virtual machine on demand to maximize server hardware use. Microsoft RemoteFX will enable users of virtual desktops and applications to receive a rich 3-D, multimedia experience while accessing information remotely. [Note the new name RemoteFX is the technology we acquired with the purchase of Calista.]

• New technology agreement with Citrix Systems. The companies will work together to enable the high-definition HDX technology in Citrix XenDesktop to enhance and extend the capabilities of the Microsoft RemoteFX platform.


Good stuff all round, but from a technical viewpoint it’s the new bits in SP1 which will get the attention. I’ll post a little more on what Dynamic Memory is and is not in the next day or two.

This post originally appeared on my technet blog.

March 17, 2010

Re-post : Desktop Virtualization Hour

Filed under: Events,Virtualization — jamesone111 @ 3:44 pm

About a month ago I mentioned that we have a “Desktop Virtualization Hour” planned for 4PM (GMT) tomorrow, March 18th. (That’s 9AM Seattle time, 5PM CET … you can work out the others, I’m sure.) More information and a downloadable meeting request are here.

I said then that I thought it might be “more than the average web cast”, and over the last couple of days I’ve had some information about announcements which are planned for the session. Obviously I am not going to say what they are until tomorrow, but if you want the news as it breaks – click the link above.

This post originally appeared on my technet blog.

March 16, 2010

Book Review: Windows PowerShell 2.0 Best Practices, Ed Wilson

Filed under: Powershell — jamesone111 @ 10:25 am

Last week I watched a TV program called Michelin Stars: The Madness of Perfection, which talked about the pressure chefs put themselves under – sometimes tragically so. The presenter was a restaurant critic, and in conversation with one of his fellow critics a problem came up – not an exclusively Michelin one – of restaurants where every plate was a supreme accomplishment and yet the critics didn’t want to eat there.

That’s quite a good analogy for how I look at Ed Wilson’s “Windows PowerShell 2.0 Best Practices”. It doesn’t deserve a ranting torrent directed against it, because there are plenty of people who will get a lot from it; but I’m not one of them, and it left me frustrated.

My first frustration is the title. A book on best practices should be a slim volume: “Always do this, try to do that, avoid so-and-so”. The list of things for PowerShell can’t be enormous, yet this book is big enough to stun an ox. At over 700 pages it’s the biggest book on PowerShell I’ve met so far. It is padded out with tables and lists which really don’t need to be there – quite early on I came upon a 3-page table listing all the properties of the Win32_Process management object, where it adds no value. A lot of the rest is tips and good ideas – useful stuff, but not helping me to tell good practice from bad. For example, two chapters – 65 pages – cover Active Directory using ADSI; PowerShell 2.0 introduced AD cmdlets, but these get the briefest of mentions. What I wanted was something to say either “You’re mad if you don’t ensure you can use the AD cmdlets” or “In these situations you’re better off using ADSI because …”.

The second frustration is with the likely reader: if you’re writing for people who know PowerShell and telling them about best practice, I think it’s inevitable that they will try to pick holes in the code examples. It would seem that Ed writes scripts to do a task rather than adding commands to the PowerShell environment – which is most of what I do – so he and I will have different approaches in places. Although he talks about the importance of making scripts readable, he will produce something with many short lines, where you can’t just understand a section on its own. It’s probably easiest to show an example. Given a need to read some data from a file and process it, the book shows something which is structured like this:

Function TestPath {...}
Function SetConnectionString {...}
Function ReadData {...}
Function CloseAdoConnection {...}
$FilePath = "C:\BestPractices\excel.xls"
TestPath $FilePath
SetConnectionString
ReadData
CloseAdoConnection

The problem with this is you have to start at the bottom, with the call to TestPath, and then go up to where the function is defined; then back to the bottom to see the next line is SetConnectionString, and then up to where that is defined. Since “up” takes you to a different page / screenful of code, it’s bad for readability. TestPath does a check and exits the script if the file isn’t found – here’s the code:

Function TestPath($FilePath) {
  If(Test-Path -path $FilePath) {
    if($verbose) { "$filePath found" }
  } #end if
  Else {
    Write-Host -foregroundcolor red "$filePath not found."
  } #end else
} #end TestPath

Notice that a “verbose” response doesn’t go to the verbose channel but becomes output of the function – which is a way to introduce some interesting bugs. I would get rid of the function and write:

$FilePath = "C:\BestPractices\excel.xls"
If   (Test-Path -path $FilePath)
     { Write-Verbose "$filePath found" }
Else { Write-Host -foregroundcolor red "$filePath not found." }

Notice there is a CloseAdoConnection, but no function to open a connection? That’s because SetConnectionString opens the connection (by calling NewAdoConnection): like the other functions it doesn’t return a result – the connection which it opens is left as a variable for ReadData (which doesn’t just read data in this example but creates AD objects) and CloseAdoConnection to use. Here’s what appears in the book:

Function SetConnectionString() {
 $strFileName = $FilePath
 $strSheetName = 'Excel$'
 $strProvider = "Provider=Microsoft.Jet.OLEDB.4.0"
 $strDataSource = "Data Source = $strFileName"
 $strExtend = "Extended Properties=Excel 8.0"
 $strQuery = "Select * from [$strSheetName]"
 NewAdoConnection
} #end SetConnectionString
Function NewAdoConnection() {
 $Script:objConn = New-Object System.Data.OleDb.OleDbConnection(`
   "$strProvider;$strDataSource;$strExtend")
 $objConn.Open()
 $sqlCommand = New-Object System.Data.OleDb.OleDbCommand($strQuery)
 $sqlCommand.Connection = $objConn
 $Script:DataReader = $sqlCommand.ExecuteReader()
} #end NewAdoConnection

This seems to go out of its way to avoid passing parameters and returning results: it means you can’t see, when SetConnectionString is called, that it relies on the $FilePath variable. The functions don’t follow proper naming conventions, meaning they can only really be used inside their script (not in a module). I’ve seen Ed protest that he is a PowerShell scripter now, but the way the functions are declared with the redundant () after them suggests his VB habits die hard – PowerShell does allow you to declare parameters in brackets, but the convention is to use the Param statement. The VB convention of putting str in front of string variable names is followed in some places and not others; but naming isn’t the thing I find horrible about variables and scopes here. SetConnectionString() sets a bunch of variables within its own scope, and then calls NewAdoConnection() – which inherits that scope and relies on those variables. If you decided to move the call to NewAdoConnection() out to sit between SetConnectionString and ReadData, it would fail. NewAdoConnection() doesn’t return a result but sets a variable for something else in the script to use, and it has to use the script scope to do it. The job could be done in 6 lines rather than 18:

$objConn = New-Object System.Data.OleDb.OleDbConnection(`
   "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + $strFileName + ";Extended Properties=Excel 8.0")
$objConn.Open()
$sqlCommand = New-Object System.Data.OleDb.OleDbCommand("Select * from [Excel$]")
$sqlCommand.Connection = $objConn
$DataReader = $sqlCommand.ExecuteReader()

There is a danger of getting into “PowerShell golf” – chasing the fewest [key]strokes to reach the target – which can produce something intricate but impossible to understand and maintain. But here I think the shorter version is clearer and easier. There’s a case to be made for keeping New-AdoDataReader as a function, but it should be reusable, and this isn’t. It should take the parts of the connection string as parameters and return a result, and this doesn’t.
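To make that concrete, here’s a minimal sketch of the kind of reusable function I mean. The function name follows the New-AdoDataReader suggestion above, but the parameter names and defaults are mine, not the book’s:

```powershell
# Sketch only – parameter names and defaults are illustrative, not the book's code.
Function New-AdoDataReader {
    Param(
        [Parameter(Mandatory=$true)][string]$Path,   # path to the Excel file
        [string]$SheetName = 'Excel$',
        [string]$Provider  = 'Microsoft.Jet.OLEDB.4.0',
        [string]$Extended  = 'Excel 8.0'
    )
    # Build the connection string from parameters, not from ambient variables
    $connString = "Provider=$Provider;Data Source=$Path;Extended Properties=$Extended"
    $conn = New-Object System.Data.OleDb.OleDbConnection($connString)
    $cmd  = New-Object System.Data.OleDb.OleDbCommand("Select * from [$SheetName]")
    $cmd.Connection = $conn
    $conn.Open()
    $cmd.ExecuteReader()   # returned to the caller, who closes the reader and connection
}
```

Called as `$reader = New-AdoDataReader -Path "C:\BestPractices\excel.xls"`, it needs no script-scope variables, and because everything it depends on arrives as a parameter it could move into a module unchanged.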

I don’t like the habit of never writing a constant, but always assigning it to a variable first and then using the variable (it’s even worse here, with $FilePath being copied into a new variable, $strFileName). Equally, you can write code which uses no variables at all, but at the price of making it incomprehensible. In this case, if you’re looking through the ReadData function trying to work out what it does, you have to find where another variable was set; and to find out how that was set, you have to find where other variables were set.

This “variablitis” crops up in many places. Ed has written that the constraints of the 75-character page width in a book can cause this, but I spotted this example:

$os = [environment]::osversion             
$os | Get-Member -MemberType property             

Writing it properly would only take 60 characters.

[environment]::osversion | Get-Member -MemberType property

Once I started seeing these things I kept tripping over them. For example, PowerShell supports for loops in the style of C, like this:

For($i = 0 ;$i -le 10; $i++)   {doSomethingWith  $i}

But people who work in PowerShell would more commonly write:

1..10 | ForEach {doSomethingWith  $_}

Again, one can argue this is “PowerShell golf”, but is it easier to read “1..10” or “set i to 0; while i is less than or equal to 10, increment i”? I choked when I saw this:

Function Get-MoreHelp {
 # .help Get-MoreHelp Get-Command Get-Process
 For($i = 0 ; $i -le $args.count ; $i++) {
  Get-Help $args[$i] -full | more
 } #end for
} #end Get-MoreHelp

There isn’t really any justification for writing 

For($i = 0 ;$i -le $args.count ; $i++)  { doSomethingWith $Array[$i]}

Instead of

$Array | foreach { doSomethingWith $_ }            

But the bigger issue for me is that using $args instead of named parameters is generally discouraged – and using it so that the user passes a list of items as separate un-named arguments, rather than the comma-separated list that all PowerShell’s built-in cmdlets take, well, it’s just wrong.
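As a sketch of the alternative (my code, not the book’s): a named parameter typed as a string array lets callers pass a comma-separated list, exactly as the built-in cmdlets do, and removes the for loop entirely:

```powershell
# Sketch only – a named -Name parameter in place of $args.
Function Get-MoreHelp {
    Param([string[]]$Name)
    # One pipeline replaces the indexed for loop
    $Name | ForEach-Object { Get-Help $_ -Full | more }
}

# Called like any built-in cmdlet:
# Get-MoreHelp -Name Get-Command, Get-Process
```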

Ultimately this was what stopped me getting value from the book – there were so many departures from accepted good practice that I lost faith in it as a book on best practice. If MS Press, or Ed himself, had used a title like “Improve your PowerShell” I would have had a lot less of a problem with it. There is good content, and lots of contributions from people with proven expertise. But you can’t blindly order copies for a team who work with PowerShell and tell them to use it as a style guide. On the other hand, if you’re in a bookshop and see it on the shelf, I’d recommend having a thumb through it (I look up something I’ve been trying to do and see what the explanation is like) – it might suit you better than it suited me.

Windows PowerShell 2.0 Best Practices by Ed Wilson is published by Microsoft Press, ISBN 978-0-7356-2646-1, list price £47.49.


This post originally appeared on my technet blog.

March 15, 2010

IE 8 is safest. Fact.

Filed under: Internet Explorer,Security and Malware,Virtualization — jamesone111 @ 1:11 pm

Every now and then a news story comes up which reminds us that, faced with people with bad intentions, even sensible people can fall into traps on-line. There was one such story last week where friends of the victim said she was “the sensible one” – if she wasn’t unusually gullible, it could happen to anyone. I wrote about Safer Internet Day recently, and it’s worth making another call to readers who are tech-savvy to explain to others who are less so just how careful we need to be about trusting people on-line. I got a well-constructed phishing mail last week, claiming to come from Amazon, which I would have fallen for if it had been sent to my home rather than my work account – it’s as well to be reminded sometimes that we’re not as smart as we like to think.

I’ve also been reading about a libel case. I avoid making legal commentary and won’t risk repeating a libel: the contested statement said that something had been advocated for which there was no evidence. I read a commentary which said, in effect, that in scientific disciplines, if your advocacy is not in dispute and someone says you have no evidence for it, you produce the evidence. Without evidence you have a belief, not a scientific fact. This idea came up again later in the week when I was talking to someone about VMware: you might have noticed there is a lack of virtualization benchmarks out in the world, and the reason is in VMware’s licence agreement (under 3.3):

You may use the Software to conduct internal performance testing and benchmarking studies, the results of which you (and not unauthorized third parties) may publish or publicly disseminate; provided that VMware has reviewed and approved of the methodology, assumptions and other parameters of the study

Testing, when done scientifically, involves publishing the methodology, assumptions and other parameters along with the test outcomes and the conclusions drawn. That way others can review the work to see if it is rigorous and reproducible. If someone else’s conclusions go against what you believe to be the case, you look to see if they are justified by the outcomes; then you move to the assumptions and parameters of the test and its methodology. You might even repeat the test to see if the outcomes are reproducible. If a test shows your product in a bad light then you might bring something else to the debate: “Sure, the competing product is slightly better at that measure, but ours is better at this measure”. What is one to think of a company which uses legal terms to stop people conducting their own tests and putting the results in the public domain for others to review?

After that conversation I saw a link to an article, IE 8 Leads in Malware Protection. NSS Labs have come out with their third test of web browser protection against socially engineered malware. The first one appeared in March of last year, and it looks set to be a regular twice-yearly thing. The first one pointed out that there was a big improvement between IE7 and IE8 (IE6 has no protection at all – if you are still working for one of the organizations that has it, I’d question what you’re doing there).
IE 8 does much better than its rivals: the top 4 have all improved since the last run of the tests. IE was up from 81% to 85%, Firefox from 27% to 29%, Safari from 21% to 29%, and Chrome from 7% to 17%.

Being pessimistically inclined, I look at the numbers the other way round: in the previous test we were letting 19 out of every 100 through; now it’s 15 – down by 21%. In the first test we were letting 31 of every 100 through, so 52% of what got through a year ago gets blocked today. Letting that many through means we can’t sit back and say the battle is won, but IE8 is the only browser which is winning against the criminals. Google, for example, have improved Chrome since last time, so it only lets through 83 out of every 100 malware URLs – that’s blocking 11% of the 93 it let through before from each 100. With every other browser the crooks are winning, which is nothing to gloat over – I hope to see a day when we’re all scoring well into the 90s.
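For anyone who wants to check the arithmetic, the “down by 21%”, “52%” and “11%” figures all come from comparing missed-malware rates before and after. A quick sketch (the function name is mine, the blocking percentages are the ones quoted above):

```powershell
# Given old and new blocking rates (%), what share of the previously-missed
# malware is now being caught?
Function Get-MissReduction($oldBlocked, $newBlocked) {
    $oldMissed = 100 - $oldBlocked
    $newMissed = 100 - $newBlocked
    [math]::Round(100 * ($oldMissed - $newMissed) / $oldMissed)
}

Get-MissReduction 81 85   # IE8 since the previous test: 21
Get-MissReduction 69 85   # IE8 since the first test:    52
Get-MissReduction  7 17   # Chrome since the first test: 11
```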

I haven’t mentioned Opera – which has been consistently last, and by some margin, slipping from 5% in the first test to 1% in the second, to less than 1% in the most recent. In a spirit of full scientific disclosure, I’ll say I think the famous description of Real Networks fits Opera. Unable to succeed against Safari or Chrome, and blown into the weeds by Firefox, Opera said its emaciated market share was because IE was supplied by default with Windows. Instead of producing a browser people might want, Opera followed the path trodden by Real Networks – complaining to the European Commissioner for the protection of lame ducks (sorry, for competition). The result was the browser election screen.

I’m not a fan of the browser election screen – not least because it is easily mistaken for malware. To see the fault, let me ask you, as a reader of an IT blog, which of the following would you choose?

  1. The powerful and easy-to-use Web browser. Try the only browser with Browser-A Turbo technology, and speed up your Internet connection.
  2. Browser-B . A fast new browser. Made for everyone
  3. Browser-C is the world’s most widely used browser, designed by Company-C with you in mind.
  4. Browser-D from Company-D, the world’s most innovative browser.
  5. Your online security is Browser E’s top priority. Browser-E is free, and made to help you get the most out of the web.

You might say (for example) “I want Firefox”, but which is Firefox in that list? You are probably more IT-savvy than the people the election screen is aimed at, and if you can’t choose from that information, how are they supposed to? You see, if you have done your testing and know a particular browser will meet your needs best, you go to it by name – you don’t need the screen. People who don’t know the pros and cons of the options before seeing the screen might just as well pick at random – which favours whoever has the least market share – which would be Opera.

The IE 8 Leads in Malware Protection article linked to a post of Opera’s complaining that the results of the first test were fixed: “Microsoft sponsored the report, so it must be fixed!” If we’d got NSS Labs to fix the results a year ago, would we have stipulated that Opera should be so far behind everyone else? Did we have a strategy to show Opera going from “dire failure” to “not even trying”? Or that IE8 should start at a satisfactory score and improve over several surveys while the others stayed static? But to return to my original point: the only evidence I’m aware of shows every other browser lets at least 4 times as much malware through as IE. The only response to anyone who disputes it is: let’s see your evidence to counter what NSS Labs found. Google have spent a fortune advertising Chrome: if Chrome really did let fewer than 5 out of 6 malware sites through, they’d get someone else to do a [reviewable] study which showed that.

And since we’re back at the question of evidence: if you are asked for advice on the election screen and you want to advocate the one which will help people stay safe from phishing attacks, I don’t think you have any evidence to recommend anything other than IE. But remember, it’s not a problem which can be solved by technology alone. Always question the motives of anything which wants to change the configuration of your computer.


This post originally appeared on my technet blog.

March 10, 2010

UK TechDays free events in London – including After Hours.


You may have seen that registration for UK TechDays events from 12th to 16th April is already open – but you probably won’t have seen this newly announced session, even if you are following @uktechdays on Twitter.

After Hours @ UK Tech Days 2010 – Wednesday 14th April, 7pm – 9pm. Vue Cinema, Fulham Broadway.

Reviving the critically acclaimed series of madcap hobbyist technology demonstrations, After Hours reappears at Tech Days 2010. After Hours is all about the fun stuff people are building at home with Microsoft technology, ranging from the useful “must-haves” no modern home should be without, to the bleeding edge of science fiction made real! Featured in this fun-filled two-hour instalment of entertaining projects are: home entertainment systems, XNA augmented reality, natural user interfaces, robotics, and virtual adventures in the real world with a home-brew holodeck!

Session 1: Home entertainment.

In this session we demonstrate the integration of e-home technologies to produce the ultimate in media entertainment systems and cyber home services.  We show you how to inspire your children to follow the ‘way of the coder’ by tapping into their Xbox 360 gaming time.

Session 2: Augmented reality.

2010 promises to be the year of the Natural User Interface. In this session we demonstrate and discuss the innovations under development at Microsoft, and take an adventure in the ultimate of geek fantasies – the XNA Holodeck.

Like all the other TechDays sessions, this one is FREE to attend – and if you hadn’t heard: UK Tech Days 2010 is a week-long series of events run by Microsoft and technical communities to celebrate and inspire developers, IT professionals and IT managers to get more from Microsoft technology. Our day events in London will cover the latest technology releases, including Microsoft Visual Studio 2010, Microsoft Office 2010, virtualisation, Silverlight, Microsoft Windows 7 and Microsoft SQL Server 2008 R2, plus events focusing on deployment and an IT Manager day. Oh, and did I say they were FREE?

IT Professional Week – Shepherds Bush

Monday, 12 April 2010   – Virtualization Summit – From the Desktop to the Datacentre

Designed to provide you with an understanding of the key products & technologies enabling seamless physical and virtual management, interoperable tools, and cost-savings & value.

Tuesday, 13 April 2010  – Office 2010 – Experience the Next Wave in Business Productivity

The event will cover how the improvements to Office, SharePoint, Exchange, Project and Visio will provide a practical platform that will allow IT professionals to not only solve problems and deliver business value, but also demonstrate this value to IT’s stakeholders. 

Wednesday, 14 April 2010 – Windows 7 and Windows Server 2008 R2 – Deployment made easy

This event will provide you with an understanding of the deployment tools, including the new Microsoft Deployment Toolkit 2010, Windows Deployment Services and the Application Compatibility Toolkit. We will also take you through the considerations for deploying Windows Server 2008 R2 and migrating your server roles.

Thursday, 15 April 2010 – SQL Server 2008 R2 – The Information Platform
Highlighting the new capabilities of the platform, as well as diving into specific topics, such as consolidating SQL Server databases, and tips and techniques for Performance Monitoring and Tuning as well as looking at our newly released Cloud platform SQL Azure.

Friday, 16 April 2010 (IT Managers) – Looking ahead, keeping the boss happy and raising the profile of IT
IT Managers have more and more responsibilities to drive and support the direction of the business. We’ll explore the various trends and technologies that can bring IT to the top table, from score-carding to data governance and cloud computing.

Developer Week – Fulham Broadway

Monday, 12 April 2010 (For Heads of Development and Software Architects) Microsoft Visual Studio 2010 Launch – A Path to Big Ideas

This launch event is aimed at development managers, heads of development and software architects who want to hear how Visual Studio 2010 can help build better applications whilst taking advantage of great integration with other key technologies.
NB – Day 2 will cover the technical in-depth sessions aimed at developers

Tuesday, 13 April 2010 – Getting started with Microsoft .NET Framework 4 and Microsoft Visual Studio 2010 – WAITLIST ONLY
Microsoft and industry experts will share their perspectives on the top new and useful features with core programming languages and in the framework and tooling, such as — ASP.NET MVC, Parallel Programming, Entity Framework 4, and the offerings around rich client and web development experiences.

Wednesday, 14 April 2010 – The Essential MIX
Join us for the Essential MIX as we continue exploring the art and science of creating great user experiences. Learn about the next generation ASP.NET & Silverlight platforms that make it a rich and reach world.

Thursday, 15 April 2010 – Best of Breed Client Applications on Microsoft Windows 7
Windows 7 adoption is happening at a startling pace. In this demo-driven day, we’ll look at the developer landscape around Windows 7 to get you up to speed on the operating system that your applications will run on through the new decade.

Friday, 16 April 2010 – Windows Phone Day – registration opening soon!
Join us for a practical day of detailed Windows Phone development sessions covering the new Windows Phone specification, application standards and services.

There will also be some “fringe” events; these won’t all be in London, and I’ll post about them separately (James in the Midlands, I’ve heard you :-) ).


This post originally appeared on my technet blog.

March 9, 2010

Cars, social media, phones, windows media and there’s no hiding with co-pilot.

Filed under: General musings,Mobility,Music and Media — jamesone111 @ 3:39 pm

As titles go that’s an odd one, but stay with me.

I’ve written before about my Citroen C6: before Christmas a warning message popped up saying something was wrong with the hydro-pneumatic suspension which gives the big Citroens their wonderful ride. A visit to the garage confirmed the problem was real – not a diagnostic issue – and lay with a part which rarely goes wrong, i.e. one no dealer keeps in stock. It would take a day or two to get the part, and by the time it was fitted I needed to be at Tech-Ed in Berlin. I expected the car to be ready when I got back, but it wasn’t. Having replaced the faulty part, it turned out it had failed because of a fault in the hydraulic pump: this is beyond rare – Citroen UK told me later that they’d only ever supplied one before, and that was after an accident – but I’m getting ahead of myself. The pump should have arrived before I got back from Tech-Ed, but there was no sign of it. We then began a sequence where every few days I would call the garage or they would call me, and I’d get the news that the pump had not arrived but would be there in a couple of days.

I’m not totally without patience, but after 3 weeks I was getting cross, started to tweet about it, and found Citroen UK on Twitter. So I posted things like Day 26 of my Citroen C6 being in the Garage. @Citroenuk promised to deliver the part today and they #Fail to. Now promising Thursday. It was partly to vent and partly to see if Citroen responded – if your organization is “doing social media” you really should know what you’re going to do if someone complains – we try to do this at Microsoft when it isn’t the “I hate Microsoft because they’re a big money making concern” variety.  Citroen UK’s Twitter account turned out to be someone from marketing who took enough ownership of the problem to get some information and make sure the right person saw it.  That was how I ended up talking to Brian (I’ll keep his last name out of it – I can just see people calling Citroen asking for him). If Brian was trained in customer care (rather than doing it by instinct) his teacher would have been pleased: he apologized (sincerely – not in an over the top way), saw the customer’s point of view “I know Caterpillar have a ‘parts anywhere in the world in 24 hours, or Cat pays’ promise. You should be able to get a part here in 24 days for your top of the range model”,  explained why it had gone wrong (the pumps showed as in stock but had been taken to be modified to the latest specification), committed to speeding the resolution of the problem and promised to follow up to talk about how Citroen could rebuild the relationship. I’ve had Citroens (7 of them) for 16 of the last 20 years, so I guess I qualify as a loyal customer they’d want to keep. 

USB box, in glove compartment, showing all 3 connections - click for a bigger version Brian had an unexpected spell off work so it was well into January by the time he got in touch and offered me a choice of accessories as compensation. I wanted to be able to plug a music player into the car – I’ve tried those little FM transmitters and found that on a decent-length journey they’re more trouble than they’re worth. The accessory catalogue had a “USB box” which plays MP3s. Some of the other options which Brian was willing to pay for were pretty pricey and would have felt like taking advantage, but this seemed OK. It took a while to get the kit and sort out a day to fit it, but it went in last week and I have to say it’s a pretty neat gadget. The handbook suggests it goes in a lot of cars – Peugeot and Citroen across the PSA group; it has a USB socket which is powered, so will charge my phone (I’ve twice blown the cigar lighter socket fuse with cheap adapters), plus a dedicated iPod socket – which will work with my wife’s nano – and a 3½mm jack plug for anything else. I tried playing a few MP3s I’d copied to a memory stick – and the first impression was nice sound quality: the integration with the built-in stereo isn’t perfect but is quite good enough.  But there was better news: it turns out the USB box plays pretty much any format, including WMA, WAV and even OGG. Most of my music is in WMA format and sync’d to my phone, so I just tell the phone to connect in storage mode by default, plug it in (even if locked) and the USB box reads the files and plays them.

Playing OGG isn’t quite the advantage it might be when Co-pilot is installed on the memory card, because Co-pilot uses OGG files for all its messages, and the USB box thinks it should play them – so the first thing it played was “Take 1st exit at roundabout”, “Take 2nd exit at roundabout” and so on. I set the files to hidden; interestingly, the file explorer on the phone ignores the hidden attribute, so I can’t blame the USB box for doing the same. It’s not an insurmountable problem: unlike its predecessors, this phone has enough main memory to allow me to move Co-pilot’s sound files off the storage card.  So that’s another plus for the phone.

And as far as the car is concerned, it’s one more thing to like about driving it. I’ve had another, minor problem since, which was quickly fixed, and thanks to Brian I’m back in the happy customers column.

This post originally appeared on my technet blog.

A FAT (32) lot of good that did me …

Filed under: General musings,Virtualization — jamesone111 @ 11:35 am

First rule of blogging. Don’t blog when angry.

I’ve been through a time-consuming process which could be called educational – in the sense of “Well! That taught me a lesson”. My drug regime has been mentioned before in my posts, and this is one of those times when the drugs don’t seem to be working – so let’s just say I was a shade cranky before I started, and now…


Up on YouTube I have a video showing Hyper-V Server R2 booting from a USB flash drive (which I described in this post – please note the recommendation to check supportability before going down this path yourself).

And I have a second video showing how I made my phone into a bootable USB device from which I could install Windows.

Why not, I thought, boot Hyper-V Server R2 from a phone – in fact, why stop at phones? I’ve had a good laugh at Will it Blend? so I was thinking of doing a “Will it boot?” series. Can I boot HVS from my camera? And so on.


Let’s stop for a second and think. What file systems do cameras, phones and MP3 players support? NTFS – er, no. They use FAT in most of its forms; new memory cards show up formatted as FAT32.
And what is the limitation of FAT32? A maximum file size of 4GB: not a problem for installing Windows, because WIM files are sized at less than 4GB to fit on DVD disks, but a bit of a challenge for VHD files, as 4GB is a shade small by today’s standards. In fact when I ran the setup for Hyper-V Server against my sub-4GB VHD it wouldn’t install. Undeterred – I have a customized Hyper-V Server R2 VHD which I use as a testing VM on a Server 2008 box; I’d pared this down before so it uses comfortably less than 3.5GB on a 6GB VHD. I attached that VHD as a second drive on another VM which has the Windows Automated Installation Kit installed, created a 3.5GB VHD and added that as a third drive, and fired up the VM. I used ImageX to make a WIM image of that disk, and then it was a question of partitioning my new VHD, activating the partition, formatting it, applying the image, making sure the VHD was bootable, and testing it in its own VM on Server 2008. It worked like a charm. Next I copied it to a “4GB” SD card – the card is 4,000,000,000 bytes, which is only about 3.7 true gigabytes (taking 1GB as 2^30 bytes). I switched my test VM on Server 2008 to use the VHD on the SD card and all was well. Then I went through the steps to make the card bootable. Abject failure. I tried lots of things, without success – to retain one’s optimism and avoid anger, these are classified as things eliminated rather than failures.
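The back-of-envelope arithmetic above is worth making explicit. This is just a sketch of my own – the names are mine, not from any Microsoft tool – showing why a VHD has to squeeze under both the FAT32 file-size cap and the card’s true capacity:

```python
# Rough numbers behind the FAT32 problem; illustrative only.

FAT32_MAX_FILE = 2**32 - 1            # FAT32 caps a single file at 4 GiB minus one byte
GIB = 2**30                           # one "true" gigabyte, as used in the post

card_marketing_bytes = 4_000_000_000  # a "4GB" SD card as the manufacturer counts it
card_gib = card_marketing_bytes / GIB
print(f"'4GB' card is really {card_gib:.2f} GiB")   # about 3.73 GiB

# The VHD must fit under BOTH limits; here the card itself is the tighter one.
vhd_budget = min(FAT32_MAX_FILE, card_marketing_bytes)
print(f"Largest VHD the card could hold: {vhd_budget / GIB:.2f} GiB")
```

Which is why a 3.5GB VHD fits but a 6GB one never could, quite apart from the NTFS requirement that finally sank the scheme.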

Slowly, a picture began to emerge. I tried testing the VM from the SD card on Server 2008 R2; first I attached the VHD to a VM…

Click for full size version

A file system limitation? Hmm. OK, let’s see if we can attach VHD files on the SD card using Windows 7’s Computer Management (or Server Manager on Server 2008 R2): go to Storage, then to Disk Management, right-click, choose “Attach VHD”, browse to the disk and…


I know that R2 removed the ability to use VHDs which had been compressed, and I think I probably did know that R2 also introduced a requirement to keep the VHD on NTFS.

There’s no reason why Windows can’t format an SD card as NTFS, and I can probably use my camera as a card reader for an NTFS formatted card; but the camera can’t save pictures to it. I’m sure I could partition the 16GB MicroSD card which I’m using in the phone so that there was a roughly 4GB active partition which could boot and 12GB left for camera / phone / whatever but I want to be able to reclaim the space at a moment’s notice if I need to put pictures on it – and such a scheme rules that out.


Angry at the time I’ve wasted ? No, no I’m calm, composed and working on other ideas for what I can do booting from off the wall devices.

Creating an image of me, in a pram, throwing toys from it is left as an exercise for the reader.

This post originally appeared on my technet blog.

March 8, 2010

Photographic resolution and scans.

Filed under: Photography — jamesone111 @ 10:22 am

I’ve heard it said that every time you use an equation you lose half the audience. I’m going to take that risk: in photography there are a lot of equations which come up in the form 1/x + 1/y = 1/z, and one of those is for recorded resolution: 1/Lens-Resolution + 1/Recording-Resolution = 1/Image-Resolution. It’s also a manifestation of the law of diminishing returns. But why do I think it is worth a post?

First: there is a limit on the smallest detail that a lens can resolve in an image: one test to get an indication of this is to look at patterns of parallel black and white lines and see how fine the lines can be before they blend into a grey mush.

Second: however many pixels you have, the digitization process can’t put in detail which wasn’t captured by the lens. There will always be some loss in the process (or, if you prefer, to record as much detail as the lens could resolve the sensor would need to have infinite resolution). Increasing the sensor resolution will reduce the loss, but each successive increase produces smaller and smaller benefits (and remember that if the number of lines the sensor can resolve doubles, the number of pixels quadruples).

That equation says if a lens can resolve x pairs of lines per unit of distance (it doesn’t matter if it’s lines per mm or lines per image width), and the sensor records ½x, the net resolution is 1/3x; if the sensor records x, net resolution is 1/2x; 2x line pairs at the sensor gives a net resolution of 2/3x; go up to 4x and the result is 4/5x; 8 times lens resolution at the sensor gives 8/9 of the lens detail in the output. You can see the progression – but I’ve just described a 16-fold increase in linear resolution, or a 256-fold increase in pixels – like going from a basic 240×320 pixel QVGA webcam to 20 megapixels, which (in 2010) is the realm of professional equipment – yet improving the recorded detail by a factor of less than 3. Of course that would only be true if the image being digitized were the same – the pro camera will have a lens which resolves more detail (thousands of lines over the image width, against hundreds for a webcam lens). Changing whichever component has lower resolution will have a bigger impact than changing the higher resolution one. There’s no point in making a webcam where the lens has many times the resolving power of the sensor, or mounting a lens on a pro camera with much less resolution than its sensor.
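The progression above falls straight out of the formula. Here’s a small sketch of my own (the function name is mine) that reproduces it exactly, using fractions so the 1/3, 2/3, 4/5, 8/9 values come out clean:

```python
# 1/net = 1/lens + 1/sensor, so net = lens * sensor / (lens + sensor)
from fractions import Fraction

def net_resolution(lens, sensor):
    """Combined resolution of a lens and a sensor, in the same units."""
    lens, sensor = Fraction(lens), Fraction(sensor)
    return lens * sensor / (lens + sensor)

# Call the lens resolution "x" = 1 and sweep the sensor from x/2 to 8x.
for mult in (Fraction(1, 2), 1, 2, 4, 8):
    net = net_resolution(1, mult)
    print(f"sensor = {mult}x  ->  net = {net}x")
```

Run it and the diminishing returns are plain: the sensor improves sixteen-fold while the net result creeps from 1/3x to 8/9x.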


I’ve known this for years: the upgrades to my digital SLR cameras have increased the pixels, but the images only show a fraction more detail – although it is easier to see with my best lenses. But I’ve recently been going over the problem with scanned images.
I made the picture above in 2003, printed it on roll paper using my A4 printer, and it has been on my wall ever since. But it was shot on film and transferred to CD at the lab – the JPG files are quite low resolution. The border makes up about 1/3 of the height and the actual picture is roughly 14cm / 5½” tall – and covered by 1100 pixels. (The border is a useful trick for making the aspect ratio of the picture a bit squarer so the print isn’t so long. Instead of being 7700 x 1100 – a 7:1 aspect ratio – it is 8000×1400, a 5.7:1 ratio.)  I’ve been thinking about doing a new version: the picture can be cropped less at the top & bottom, and I can revisit the ideal size of border; I can also fix a couple of other things – the sepia toning is excessive and there is a stitching error (look at the legs at the landward end of the pier). 
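The border trick is just aspect-ratio arithmetic, and it’s easy to check. A quick helper of my own (nothing official) reduces each frame size to its simplest ratio:

```python
# Verify the panorama aspect ratios quoted above.
from math import gcd

def aspect(width, height):
    """Return width:height in lowest terms, plus the ratio rounded to 1 d.p."""
    g = gcd(width, height)
    return (width // g, height // g, round(width / height, 1))

print(aspect(7700, 1100))  # the bare panorama: 7:1
print(aspect(8000, 1400))  # with the border added: about 5.7:1
```

So adding 150 pixels of border top and bottom (and a little at the ends) turns an unwieldy 7:1 strip into a squarer 5.7:1 print.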



The final result would go into Silverlight Deep Zoom, and I’m already looking at 13”/330mm rolls of paper for a new super-A3 size print. (The current print is 4 feet / 1.2M long. The exact size of a super-A3 print would depend on the cropping and the border – with a border like the original, the aspect ratio would be 3.8:1 – so the print wouldn’t be longer, just taller.) But I want to wring the maximum possible detail from the negatives. It’s not simply a question of smearing the same detail over more pixels – printing software can do that, so even the current image on bigger paper won’t look pixelated. I want something which rewards looking closer.  I can scan the prints which came back with the film, I have two negative scanners (one of which was the subject of the video I just made), and I can set my 14-megapixel camera up as a slide copier. Which will give the best results – is it the one with the most pixels? No. Using the camera gave the most pixels, but the results weren’t great. The question turns out to be much more complex than I expected, because no two digitisations produce the same range of tones, and they all have different levels of noise – noise can be processed out in software, but at the price of some detail. Subtle details can be lost through a lack of contrast rather than a lack of resolution, or swamped in the noise.

I spent some time trying to get examples of how each looked and gave it up as impractical – different details rendered better in different scans, and trying to find a single piece of the picture which shows both the good and bad from the different scans proved to be impossible – especially since the panorama software handles the overlapping sections differently in different sets of scans, so one might be comparing the fuzzy edge of a frame in one result with the sharp part of an overlapping frame in another (there was a flock of birds flying round the collapsed central ballroom and they appear – or don’t – depending on the whim of the software; the original used some 3rd-party software and I’ve had 3 versions of Microsoft software since). In short, the more time I spent trying to be objective the less conclusive the results became; the print which looks best isn’t necessarily the one with the most detail. As I said before, “Because my experience has been bad I don’t scan much, and because I don’t scan much I won’t spend the money to get a better experience.” So the key might be to stop wasting time scanning my own negatives and send them to a professional scanning service.

update: Fixed a bunch of typos.


This post originally appeared on my technet blog.

March 7, 2010

How to use old drivers with a new OS – more on XP mode

Filed under: Virtualization,Windows 7 — jamesone111 @ 5:45 pm

In a post a while back about Windows Image Acquisition (WIA) I wrote “I’ve got a bad track record choosing scanners” and described the most recent one I’ve bought as a “piece of junk”. Because my experience has been bad I don’t scan much, and because I don’t scan much I won’t spend the money to get a better experience. The scanner I have at home is an HP one, and after HP failed to produce Vista drivers for it I said I’d never spend my own money on HP kit again. Eventually they DID release Vista drivers (including 64-bit), and these support Windows 7. The trouble is that although they support WIA – rather than using HP’s rather crummy software for XP – they are what HP calls “Basic Feature” drivers. The video below shows what this means, and how I was able to get access to the other features by using that crummy software in XP mode.

[For some reason the embedded video doesn’t play in all browsers – here is the link to the video on YouTube]

This makes quite a good follow-up to a video I did for Edge when XP mode was still in beta, which showed how some 32-bit-only camera software (which would work with Vista or Windows 7 – but not in the 64-bit version I’m running) could be used in (32-bit) XP mode.


This post originally appeared on my technet blog.

March 5, 2010

Steve Ballmer talks about our cloud strategy

Filed under: Azure / Cloud Services — jamesone111 @ 3:19 pm

Yesterday Steve gave a talk at the University of Washington to discuss how cloud computing will change the way people and businesses use technology.

You can watch the speech on demand here; the video is about 85 minutes long, and if you want to get a snapshot of how we see things developing you can’t do much better. 

This post originally appeared on my technet blog.
