James O'Neill's Blog

May 31, 2018

More tricks with PowerShell and Excel

Filed under: Office,Powershell — jamesone111 @ 6:25 am

I’ve already written about Doug Finke’s ImportExcel module – for example, this post from last year covers

  • Basic exporting (use Where-Object to reduce the number of rows, Select-Object to remove columns that aren't needed)
  • Using -ClearSheet to remove old data, -AutoSize to get the column widths right, setting titles, freezing panes and applying filters, creating tables
  • Setting formats and conditional formats

    In this post I want to round up a few other things I commonly use.

    Any custom work after the export means asking Export-Excel to pass through the unsaved Excel package object, like this

    $xl = Get-WmiObject -Class win32_logicaldisk | select -Property DeviceId,VolumeName, Size,Freespace |
               Export-Excel -Path "$env:computerName.xlsx" -WorkSheetname Volumes -PassThru

    Then we can set about making modifications to the sheet. I can keep referring to it via the Excel package object, but it’s easier to use a variable. 
    $Sheet = $xl.Workbook.Worksheets["Volumes"]

    Then I can start applying formatting, or adding extra information to the file
    Set-Format -WorkSheet $sheet -Range "C:D" -NumberFormat "0,000"
    Set-Column -Worksheet $sheet -Column 5 -Heading "PercentageFree" -Value {"=D$row/C$row"} -NumberFormat "0%"

    I talked about Set-Column in another post. Sometimes, though, the data isn't a natural row or column, and the only way to do things is by "poking" individual cells, like this

        
    $sheet.Cells["G2"].value = "Collected on"
    $sheet.Cells["G3"].value = [datetime]::Today
    $sheet.Cells["G3"].Style.Numberformat.Format = "mm-dd-yy"
    $sheet.Cells.AutoFitColumns()
    Close-ExcelPackage $xl -Show

    Sharp-eyed readers will see that the date format appears to be "least-significant-in-the-middle", which is only used by one country – and not the one where I live. It turns out Excel tokenizes some formats – this MSDN page explains it, describing "number formats whose formatCode value is implied rather than explicitly saved in the file… [some] can be interpreted differently, depending on the UI language". In other words, if you write "mm-dd-yy" or "m/d/yy h:mm" it will be translated into the local date or date-time format. When Export-Excel encounters a date/time value it uses the second of these; and yes, the first one does use hyphens and the second does use slashes. My to-do list includes adding an argument completer for Set-Format so that it proposes these formats.
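    An explicit format string sidesteps the tokenizing, because it is stored literally in the file instead of being mapped to a locale-dependent built-in format. A hedged example – the format code here is my own choice, not something the module mandates:

    $sheet.Cells["G3"].Style.Numberformat.Format = "yyyy-mm-dd"   # a custom format, so it renders the same in every locale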

    Since the columns change their widths during these steps I only auto-size them when I’ve finished setting their data and formats. So now I have the first page in the audit workbook for my computer


    Of course there are times when we don't want a book per computer with each aspect on its own sheet, but a book for each aspect with a page per computer.
    If we want to copy a sheet from one workbook to another, we could read the data and write it back out like this

    Import-Excel -Path "$env:COMPUTERNAME.xlsx" -WorksheetName "volumes" |
         Export-Excel -Path "volumes.xlsx" -WorkSheetname $env:COMPUTERNAME

    but this strips off all the formatting and loses the formulas – however, the workbook object offers a better way: we can get the Excel package for an existing file with
    $xl1 = Open-ExcelPackage -path "$env:COMPUTERNAME.xlsx"

    and create a new file and get the Package object for it with 
    $xl2 = Export-Excel -Path "volumes.xlsx" -PassThru

    (if the file exists we can use Open-ExcelPackage). The worksheets collection has an Add method which allows you to specify an existing sheet as the basis of the new one, so we can call that, remove the default sheet that Export-Excel created, and close the files (saving and loading in Excel, or not, as required)

    $newSheet = $xl2.Workbook.Worksheets.Add($env:COMPUTERNAME, ($xl1.Workbook.Worksheets["Volumes"]))
    $xl2.Workbook.Worksheets.Delete("Sheet1")
    Close-ExcelPackage $xl2 -Show
    Close-ExcelPackage $xl1 -NoSave

    The new workbook looks the same (formatting has been preserved – although I have found it doesn't like conditional formatting) but the file name and sheet name have switched places.


    Recently I've found that I want the equivalent of selecting "Transpose" in Excel's Paste Special dialog – taking an object with many properties and, instead of exporting it so it runs over many columns, making a two-column list of property name and value.
    For example
    $x = Get-WmiObject win32_computersystem  | Select-Object -Property Caption,Domain,Manufacturer,
                                Model, TotalPhysicalMemory, NumberOfProcessors, NumberOfLogicalProcessors

    $x.psobject.Properties | Select-Object -Property name,value |
        Export-Excel -Path "$env:COMPUTERNAME.xlsx" -WorkSheetname General -NoHeader -AutoSize -Show


    When I do this in a real script I use the -PassThru switch and apply some formatting

    $ws    = $excel.Workbook.Worksheets["General"]
    $ws.Column(1).Width                     =  64
    $ws.Column(1).Style.VerticalAlignment   = "Center"
    $ws.Column(2).Width                     =  128
    $ws.Column(2).Style.HorizontalAlignment = "Left"
    $ws.Column(2).Style.WrapText            = $true

    Of course I could use Set-Format instead, but sometimes the natural way is to refer to .Cells[], .Row() or .Column().


    December 11, 2017

    Using the Import Excel module part 2: putting data into .XLSx files

    Filed under: Office,Powershell — jamesone111 @ 3:55 pm

    This is the third of a series of posts on Excel and PowerShell – the first, on getting parts of an Excel file out as images, wasn't particularly tied to the ImportExcel module, but the last one, this one and the next one are. I started with the Import command – which seemed logical given the name of the module; the Export command is more complicated, because we may want to control the layout and formatting of the data, add titles, include pivot tables and draw charts, so I have split it into two posts. At its simplest the command looks like this:

    Get-Process | Export-Excel -Path .\demo.xlsx -Show

    This gets a list of processes, and exports them to an Excel file; the -Show switch tells the command to try to open the file using Excel after saving it. I should be clear here that import and export don't need Excel to be installed – one of the main uses is to get things into Excel format, with all the extras like calculations, formatting and charts, on a computer where you don't want to install desktop apps – so -Show won't work in those environments. If no -WorksheetName parameter is given, the command will use "Sheet1".

    Each process object has 67 properties, and in the example above they would all become columns in the worksheet. We can make things more compact and efficient by using Select-Object in the command to filter down to just the things we need:

    Get-Process | Select-Object -Property Name,WS,CPU,Description,StartTime |
    Export-Excel -Path .\demo.xls -Show

    Failed exporting worksheet 'Sheet1' to 'demo.xls':
    Exception calling ".ctor" with "1" argument(s):
    "The process cannot access the file 'demo.xls' because it is being used by another process."

    This often happens when you look at the file, go back to change the command and forget to close it – we can either close the file from Excel, or use the -KillExcel switch in Export-Excel. From now on I'll use data from a variable

    $mydata = Get-Process | Select-Object -Property Name, WS, CPU, Description, Company, StartTime
    $mydata | Export-Excel -KillExcel -Path .\demo.xlsx -Show

    This works, but Export-Excel modifies the existing file and doesn't remove the old data. It takes the properties of the first item that is piped into it and makes them column headings, and writes each item as a row in the spreadsheet with those properties. (If different items have different properties, there is a function, Update-FirstObjectProperties, to ensure the first row has every property used in any row – there's a sketch of this below.) If we are re-writing an existing sheet, and the new data doesn't completely cover the old, we may be left with "ghost" data. To ensure this doesn't happen, we can use the -ClearSheet option

    $mydata | Export-Excel -KillExcel -Path .\demo.xlsx -ClearSheet -Show

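    As an aside, here is a minimal sketch of the Update-FirstObjectProperties scenario mentioned above – the objects and property names are invented for illustration:

    $mixed = @( [pscustomobject]@{Name='alpha'; Size=10},
                [pscustomobject]@{Name='beta';  Owner='james'} )
    # Without the fix-up, only Name and Size would become columns;
    # Update-FirstObjectProperties gives the first object every property used in any row
    $mixed | Update-FirstObjectProperties | Export-Excel -Path .\demo.xlsx -WorkSheetname Mixed -ClearSheet -Show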

    Sometimes you don't want to clear the sheet but to add to the end of it, and one of the first changes I gave Doug for the module was to support an -Append switch, swiftly followed by a change to make sure that the command wasn't trying to clear and append to the same sheet.

    We could make this a nicer spreadsheet – we could make the column headings look like headings, and even make them filters; we can also size the columns to fit…

    $mydata | Export-Excel -Path .\demo.xlsx -KillExcel -WorkSheetname "Processes" -ClearSheet `
                 -BoldTopRow -AutoSize -Title "My Processes" -TitleBold -TitleSize 20 -FreezePane 3 -AutoFilter -Show

    [screenshot]

    The screen shot above shows the headings are now in bold and the columns have been auto-sized to fit. A title has been added in bold, 20-point type; the panes have been frozen above row 3 (there are options for freezing the top row or the left column or both, as well as the option used here, -FreezePane row [column]); and filtering has been turned on.
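    For completeness, here is a hedged sketch of those freezing switches as I understand them from the module's help – check Get-Help Export-Excel if your version differs:

    $mydata | Export-Excel -Path .\demo.xlsx -ClearSheet -FreezeTopRow -Show             # freeze row 1
    $mydata | Export-Excel -Path .\demo.xlsx -ClearSheet -FreezeFirstColumn -Show        # freeze column A
    $mydata | Export-Excel -Path .\demo.xlsx -ClearSheet -FreezeTopRowFirstColumn -Show  # freeze both
    $mydata | Export-Excel -Path .\demo.xlsx -ClearSheet -FreezePane 3,2 -Show           # freeze above row 3, left of column B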

    Another way to present tabular data nicely is to use the -Table option

    $mydata | Export-Excel -Path .\demo.xlsx -KillExcel -WorkSheetname "Processes" -ClearSheet -BoldTopRow -AutoSize `
           -TableName table -TableStyle Medium6 -FreezeTopRow -Show


    "Medium6" is the default table style, but there are plenty of others to choose from, and IntelliSense will suggest them.

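    Since -TableStyle maps to EPPlus's table-style enumeration, one way to list every valid value is to interrogate the enum – a small sketch, assuming the module has loaded the EPPlus types:

    [Enum]::GetNames([OfficeOpenXml.Table.TableStyles])   # None, Custom, then the Light, Medium and Dark styles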

    Sometimes it is helpful NOT to show the sheet immediately, and one of the first things I wanted to add to the module was the ability to pass on an object representing the current state of the workbook to a further command, which makes the following possible:

    $xl = $mydata | Export-Excel -Path .\demo.xlsx -KillExcel -WorkSheetname "Processes" `
        -ClearSheet -AutoSize -AutoFilter -BoldTopRow -FreezeTopRow -PassThru

    $ws = $xl.Workbook.Worksheets["Processes"]

    Set-Format -WorkSheet $ws -Range "b:b" -NumberFormat "#,###"   -AutoFit
    Set-Format -WorkSheet $ws -Range "C:C" -NumberFormat "#,##0.00" -AutoFit
    Set-Format -WorkSheet $ws -Range "F:F" -NumberFormat "dd MMMM HH:mm:ss" -AutoFit

    The first line creates a spreadsheet much like the ones above, and passes on the Excel package object, which provides the reference to the workbook and in turn to the worksheets inside it.
    The example selects three columns from the worksheet and applies different formatting to each. The module even supports conditional formatting; for example we could add these lines into the sequence above

    Add-ConditionalFormatting -WorkSheet $ws -Range "c2:c1000" -DataBarColor Blue
    Add-ConditionalFormatting -WorkSheet $ws -Range "b2:B1000" -RuleType GreaterThan `
        -ConditionValue '104857600' -ForeGroundColor "Red" -Bold

    The first draws data bars so we can see at a glance what is using CPU time, and the second makes anything using over 100MB of memory stand out.

    Finally, a call to Export-Excel will normally apply changes to the workbook and save the file, but there don't need to be any changes – if you pass it a package object and don't specify -PassThru it will save your work. So "save and open in Excel" is done like this, once we have put the data in and formatted it the way we want.

    Export-Excel -ExcelPackage $xl -WorkSheetname "Processes" -Show


    In the next post I'll look at charts and pivots, and the quick way to get SQL data into Excel.

    December 5, 2017

    Using the Import-Excel module: Part 1 Importing

    Filed under: Office,Powershell — jamesone111 @ 9:15 am

    The EPPlus project provides .NET classes to read and write XLSx files without the need to use the Excel object model or even have Excel installed on the computer (XLSx files, like the other Office Open XML formats, are actually ZIP files containing XML files which describe different aspects of the document – they were designed to make this sort of thing easier than the "binary" formats which went before). Doug Finke, who is well known in the PowerShell community, used EPPlus to build a PowerShell module named ImportExcel, which is on GitHub and can be downloaded from the PowerShell Gallery (by running Install-Module ImportExcel on PowerShell 5, or on PS4 with the Package Management addition installed). As of version 4.0.4 his module contains some of my contributions. This post is to act as an introduction to the parts of the module that I contributed to; there are some additional scripts bundled into the module which do require Excel itself, but the core import/export functions do not. This gives a useful way to get data on a server into Excel format, or to provide users with a workbook to enter data in an easy-to-use way and process that data on the server – without needing to install Microsoft Office or translate to and from formats like .CSV.

    The Import-Excel command reads data from a worksheet in an XLSx file. By default, it assumes the data has headers and starts with the first header in cell A1 and the first row of data in row 2. It will skip columns which don't have a header, but will include empty rows. If no worksheet name is specified it will use the first one in the workbook, so at its simplest the command looks like:
    Import-Excel -Path .\demo.xlsx  

    It’s possible that the worksheet isn’t the first sheet in the workbook and/or has a title above the data, so we can specify the start point explicitly
    Import-Excel -Path .\demo.xlsx -WorkSheetname winners -StartRow 2  

    We can say the first row does not contain headers, and either have each property (column) named P1, P2, P3 etc. by using the -NoHeader switch, or specify header names with the -HeaderName parameter like this

    Import-Excel -Path .\demo.xlsx -StartRow 3 -HeaderName "Name","How Many"

    The module also provides a ConvertFrom-ExcelSheet command, which takes -Encoding and -Delimiter parameters and sends the data to Export-Csv with those parameters, and a ConvertFrom-ExcelToSQLInsert command, which turns each row into a SQL statement. The latter in turn uses a command, ConvertFrom-ExcelData, which calls Import-Excel and then runs a script block taking two parameters, PropertyNames and Record.
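    As a hedged sketch of how those two converters are called (the parameter names are my reading of the module's help, so treat them as assumptions):

    ConvertFrom-ExcelSheet -Path .\demo.xlsx -OutputPath .\csv -Encoding UTF8 -Delimiter ';'   # writes a CSV per sheet
    ConvertFrom-ExcelToSQLInsert -TableName "Winners" -Path .\demo.xlsx -WorkSheetname "winners"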

    Because this script block can do more than convert data, I added an alias, "Use-ExcelData", which is now part of the module and can be used like this
    Use-ExcelData -Path .\NewUsers.xlsx -HeaderRow 2 -scriptBlock $sb

    If I define the script block as below, each column becomes a parameter for the New-AdUser command which is run for each row

    $sb = {
      param($propertyNames, $record)
      $propertyNames | foreach-object -Begin {$h = @{} }  -Process {
          if ($null -ne $record.$_) {$h[$_] = $record.$_}
      } -end {New-AdUser @h -verbose}
    }

    The script block gets a list of property names and a row of data: it is called for each row, creates a hash table, adds an entry for each property, and finally splats the parameters into a command. It can be any command in the End block, provided that the column names in Excel match its parameters – I'm sure you can come up with your own use cases.
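    The End block can equally drive a different command – a hedged sketch with an invented NewGroups.xlsx whose column headings (Name, Description and so on) match the parameters of New-ADGroup:

    Use-ExcelData -Path .\NewGroups.xlsx -HeaderRow 2 -ScriptBlock {
      param($propertyNames, $record)
      $propertyNames | ForEach-Object -Begin {$h = @{} } -Process {
          if ($null -ne $record.$_) {$h[$_] = $record.$_}    # copy non-empty cells into the splatting table
      } -End {New-ADGroup @h -GroupScope Global}             # any command whose parameters match the columns
    }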

    November 25, 2017

    Getting parts of Excel files as images.

    Filed under: Office,Powershell — jamesone111 @ 7:54 pm

    I feel old when I realise it's more than two decades since I learnt about the object models in Word, Excel and even Microsoft Project, and how to control them from other applications. Although my preferred tool is now PowerShell rather than Access's version of Visual Basic, the idea that "it's all in there somewhere" means I'll go and do stuff inside Excel from time to time…

    One of the things I needed to do recently was to get performance data into a spreadsheet with charts – which the export part of Doug Finke’s ImportExcel module handles very nicely. But we had a request to display the charts on a web page without the need to open an Excel file, so it was time to have a look around in Excel’s [very hierarchical] object model.

    An Excel.Application contains
    …. Workbooks which contain
    …. …. Worksheets which contain
    …. …. …. Chartobjects each of which contains
    …. …. …. …. A Chart which has
    …. …. …. …. …. An Export Method

    It seems I can get what I need if I get an Excel application object, load the workbook, work through the sheets, find each chart, decide a name to save it as and call its export method. The PowerShell to do that looks like this

    $OutputType    = "JPG"
    $excelApp      = New-Object -ComObject "Excel.Application"
    $excelWorkBook = $excelApp.Workbooks.Open($path)
    foreach ($excelWorkSheet in $excelWorkBook.Worksheets) {
      foreach ($excelchart in $excelWorkSheet.ChartObjects([System.Type]::Missing)) {
        $excelApp.Goto($excelchart.TopLeftCell,$true)
        $imagePath = Join-Path -Path $Destination -ChildPath ($excelWorkSheet.Name +
                            "_" + ($excelchart.Chart.ChartTitle.Text + ".$OutputType"))
        $excelchart.Chart.Export($imagePath, $OutputType, $false)    
      }
    }
    $excelApp.Quit()

    A couple of things to note – the Export method can output a PNG, JPG or GIF file, and in the final version of this code $OutputType is passed as a parameter (like $Path and $Destination – I've got into the habit of capitalizing parameter names, and starting normal variables with lowercase letters). There's a slightly odd way of selecting 'all charts', and if the chart isn't selected before exporting it doesn't export properly.

    I sent Doug this, which he added to his module (along with some other additions I'd been meaning to send him for over a year!). Shortly afterwards he sent me a message:
    Hello again. Someone asked me about png files from Excel. They generate a sheet, do conditional formatting and then they want to save it as a png and send that instead of the xlsx…

    Back at Excel’s object model… there isn’t an Export method which applies to a range of cells or a whole worksheet – the SaveAs method doesn’t have the option to save a sheet (or part of one) as an image. Which left me asking “how would I do this manually?” I’d copy what I needed and paste it into something which can save it. From version 5 PowerShell has a Get-Clipboard cmdlet which can handle image data. (Earlier versions let you access the clipboard via the .net objects but images were painful). The Excel object model will allow a selection to be copied, so a single script can load the workbook, make a selection, copy it, receive it from the clipboard as an image and save the image.

    $Format = [system.Drawing.Imaging.ImageFormat]::Jpeg
    $xlApp  = New-Object -ComObject "Excel.Application"
    $xlWbk  = $xlApp.Workbooks.Open($Path)
    $xlWbk.Worksheets($WorkSheetname).Select()
    $xlWbk.ActiveSheet.Range($Range).Select() | Out-Null
    $xlApp.Selection.Copy() | Out-Null
    $image = Get-Clipboard -Format Image
    $image.Save($Destination, $Format)

    In practice $Path, $WorkSheetname, $Range, $Format and $Destination are all parameters, and the whole thing is wrapped in a function, Convert-XlRangeToImage.
    On exit, Excel puts up a warning that there is a lot of data in the clipboard; to stop that I copy a single cell before exiting.

    $xlWbk.ActiveSheet.Range("a1").Select() | Out-Null
    $xlApp.Selection.Copy() | Out-Null
    $xlApp.Quit()

    The Select and Copy methods return TRUE if they succeed so I send those to Null. The whole thing combines with Doug’s module like this

    $excelPackage = $myData | Export-Excel -Path $Path -WorkSheetname $workSheetname
    $workSheet    = $excelPackage.Workbook.Worksheets[$workSheetname]
    $range        = $workSheet.Dimension.Address
    #      << apply formatting >>
    Export-Excel -ExcelPackage $excelPackage -WorkSheetname $workSheetname
    Convert-XlRangeToImage -Path $Path -WorkSheetname $workSheetname -Range $range -Destination "$pwd\temp.png" -Show

    I sent the new function over to Doug, and starting with version 4.0.8 it's part of the downloadable module.

    December 15, 2010

    My review of Windows Phone 7 on the HTC trophy

    Filed under: Exchange,Mobility,Music and Media,Office,Windows Phone 7 — jamesone111 @ 7:51 pm

    I have already looked at the move from Windows Mobile 6.5 to Windows Phone 7 from the point of view of what's gone. Now I want to look at what's better.

    The Trophy is the thinnest phone I have ever had, at 12mm. Its frontal area is fractionally larger than the Touch Pro 2 I had before, but without the need to accommodate a keyboard, overall volume is down about 20% and weight down about 25%. By way of comparison it is 3 grams heavier than the iPhone 4, and a whisker bigger in every dimension. It's my sixth HTC device and the tidiest design; the finish feels nice in the hand, and the ergonomics are good – held in two hands, like a camera, the camera button is under the right index finger, as it should be. The camera has a 5MP sensor (I remain to be convinced that the lens justifies even 5MP) and an LED flash, so it is usable under more circumstances than its predecessors. The Touch Pro 2 had a second, front-facing camera for video chat, but I never used it and so it won't be missed.

    Holding the device in the left hand to work with the right puts the volume controls under my thumb, and (like the camera button and power button) they work when the phone is locked. When the phone is unlocked these buttons bring up a mini menu with music player controls and access to the ring/vibrate settings; if the phone locks with music playing, this menu appears when you tap the power switch – which is naturally under the index finger – so you can pause or skip tracks without needing to unlock the device.
    By contrast, Mobile 6.5 devices locked out all the buttons – even power – so the only way to silence one left on a desk to ring was to remove the battery. Now colleagues can turn a phone off or set it to silent – and add a photo to remind you not to leave it on your desk when you go to meetings. You can enable voice dialling while locked; on Mobile 6.5 you needed to add Voice Command, now a subset of it is built in.

    Music shows the change from previous Windows Mobile devices. Firstly, Microsoft's hardware spec demands a micro-USB connector (which is becoming the standard for all phones) with a standard 3.5mm headphone jack: that means the end of HTC's combination audio/mini-USB connector which required an adapter to connect headphones. Pulling out the headphone connector pauses music (instead of blaring out to a carriage full of tutting commuters). And secondly, the old devices tried to shoehorn Windows Media Player into a mobile device: sound quality was fine, but the experience was way behind that of a proper music player (which I never wanted to carry). The new phone is a Zune: Zune isn't the market leader, but people who have them really like them, and I can see why.

    Over the years I had grown used to the mobile mail/messaging, contacts and calendar applications being the round pegs of their Outlook equivalents hammered into the small square hole which is a mobile display. The same philosophy which got rid of Windows Media Player has seen these replaced with versions which work better on a phone and took zero time to learn.

    Volume, camera and power are mounted on the edge; on the front are three buttons which take longer to explain than to learn. "Back" and "Start" are the main navigation buttons, going one step at a time or straight to the Start screen, and "Search" is one which applications can use to start their own search (want to find a contact? Press Search in People – it's very intuitive); otherwise it launches Bing. The search destination in Internet Explorer is fixed by the carrier: that's out of character for the phone – you can remove the carrier's or phone maker's apps, but with Vodafone I'm stuck with Google in the all-new IE. IE supports multiple open pages and "pinch zoom": I'm still learning to tap accurately with a fingertip (the old touch screens worked with a fingernail or any stylus, the new ones don't – though there are capacitive styli available to stop me obsessively polishing finger-marks off), so zooming in on a link is good, and pinch zoom has come more naturally than using the on-screen keyboard. Zooming is smooth and rendering is snappy, which I attribute to having graphics acceleration as standard rather than to the faster processor. Competing phones have graphics acceleration, but introducing it piecemeal into Windows Mobile 6.5 (or earlier) wouldn't have worked: the "break with the past" means all apps can count on a certain level of graphics, accelerometers and so on – very little should be specific to one phone. Nowhere is the new hardware standard more obvious than in games.

    This is my first phone without built-in games; odd, as Microsoft positions it as consumer more than corporate, but it means that people will find their way to the Marketplace. Plenty on offer is free, and most paid games and apps have demo versions. The quality runs from truly great to truly lousy. Videos and music are downloaded in the same way as software; all three can be driven from the phone or the PC Zune software. I blogged early in the life of Vista that Windows had what it took to handle podcasts – it was just ugly and buried: finally Microsoft has decent podcast support through Zune. Microsoft are pushing Zune Pass – all-you-can-eat music streaming for £8.99 a month, which works from the Xbox, phone or PC – as well as traditional purchase and download.

    There are some new look-and-feel elements. Besides the Search button, when the built-in apps produce a long list – like songs or people – they break it up by letters: pressing a heading letter displays the whole alphabet as a "jump-to" list. To listen to "Zest" or phone "Zoe", a couple of taps saves a long scroll. It will be interesting to see how developers stick to the style – I compared two Sudoku games: one would look wrong on anything but WP7, and another looked like a port by people who hadn't seen a finished phone. Chunky tiles on the Start menu make simple targets to tap on (WordPress have copied it for their newly launched app). I already think of the icons interface we've grown used to as the Windows 3.1 Program Manager; the phone's Start menu is like what we've been used to since Windows 95/NT 4. The "Start" button jumps to the main part of the menu, and "all programs" is off to the right. I looked in Settings for "Uninstall" without success: tapping and holding anything brings up an action menu (think "right-click") – for a program this has options to uninstall, pin to the Start menu or write a review for Marketplace.

    There is a distinction between Marketplace apps and the built-in ones: the latter can run in the background (and update their tiles), while downloaded ones deactivate when they lose focus – although the phone locking doesn't count. Storage for apps is compartmentalized – so a rogue app can't do much damage – but the Trophy's storage device isn't removable (it is on some WP7 devices, but the phone does some weird magic with the file system, so the card won't work anywhere else). There is a hack to make Zune files visible from the PC, but it can't see any other "compartments". The Zune software will only sync music, videos and photos; contacts and appointments need to go via Exchange or Windows Live.

    OneNote syncs with Windows Live, which is great, but you can only get something from the other Office apps to your PC via SharePoint or by mailing it to yourself. The attach button inside mail only offers photos. I had a look at the developer tools, and there is no API for add-on apps to e-mail an attachment or to upload/sync to Windows Live and so on.

    These are among the things which I hope to see fixed in an update early in 2011. Paul Thurrott has a long list of what needs to be fixed in Windows Phone; here's my cut-down version of his list.

    • Add copy/cut/paste
    • Allow custom ringtones, notifications, and alarms
    • Support multitasking for third-party apps
    • Appear as a camera to photo importing applications, e.g. Windows Live Photo Gallery
    • Allow the Zune PC software to be extended to sync third-party file types
    • Allow third-party services to integrate with hubs: the Pictures hub should share with and see photo services, and Twitter should be able to add people to the People hub
    • Support all the policies in Exchange ActiveSync (EAS) instead of just a subset
    • Provide access to Microsoft services in all supported locales (voice search with TellMe is US-only)
    • Provide a service like MyPhone for Mobile 6.5 to deliver cloud-based backup
    • Provide Windows Live SkyDrive in all of Office Mobile
    • Provide developer APIs for all functions (speech, mail attachments, live sync)

    Most of the list can be summarized as (a) improve support of cloud services (or the services themselves), (b) make it easier to get stuff on and off the phone, and (c) allow developers to do more. The updates will come when Microsoft declares them ready, not when/if the device makers and carriers get round to it – and yes, that's another plus about saying goodbye to Windows Mobile 6.5.

    November 1, 2010

    Thinking about the cloud – part 2, Office 365

    Filed under: Azure / Cloud Services,Exchange,Office,Real Time Collaboration — jamesone111 @ 3:03 pm

    In my previous post I was talking in general terms about why BPOS was a sound idea. The recent announcement of Ray Ozzie’s retirement set people quoting his mantra “Three screens and a cloud” – the three screens being Computer, Mobile device, and TV.  The unwritten part of “Three screens” is recognising their diversity: people should interact with the best possible client – which means adapting to the specifics of each “screen”; it’s not “any browser and a cloud”: many phone apps do something which PCs do in the browser, they only exist because of the need to give a different experience on a different kind of screen. Instead of seeing a monolithic website (which in reality probably wasn’t monolithic) we see an app which consumes a service (probably the same service which was behind the web site).

    But there was more than publishing stuff using services instead of HTML pages; more even than the likes of Groove or Live Meeting, which used the cloud to enable new things. From Ozzie's vision, famously expressed in 2005, came a realization that services already used by business PCs and devices would increasingly be in the cloud, instead of on an organization's own servers. That was the cue to provide Exchange as a service, SharePoint as a service and so on. We've tried to make a distinction between "Software as a Service" – which in some people's minds is "any browser and a cloud" – and "Software PLUS Services", which covers a plethora of client software: from multi-player games on Xbox, to iTunes, to Outlook talking to an Exchange server. But when Office Outlook on a PC accesses Exchange Online, Exchange is software and it is provided as a service – it just isn't accessed using a browser. I haven't yet seen a successful way to make the distinction between the two kinds of "Software as a Service"; just understand it has different meanings depending on who is speaking.

    I don't know if it was planned, but it seemed fitting that we should announce the next generation of BPOS the day after Ray's announcement. I prefer the new name, Office 365. Mary Jo Foley posted something headed "This is not Office in the cloud", in which she says "this was not some out-of-the-blue change in Microsoft's business model. Microsoft is still pushing Office first and foremost as a PC-based software package." Which is spot on: if you need Office in a browser, Office Web Apps is there, but it is not a replacement. I wrote in the previous post about the challenges of providing SharePoint, Exchange and so on; it is not Office but the services behind Office which are in the cloud. The key points of Office 365 are these:

    • At its core are the latest versions of the server software (Lync replaces Office Communications Server and provides Live Meeting functionality, and both Exchange and SharePoint are updated). The FAQ page has a link to explain what happens to existing BPOS customers (and there are plenty of them – sending 167 million e-mails a day).
    • The ability to create a Public website (previously part of Office Live Small Business) has moved into Office 365 (Again the FAQ page explains what will happen to Office Live Small Business)
    • The update to SharePoint 2010 enables us to offer Office Web Apps – so documents can be viewed in high fidelity and edited from the browser.
    • Despite the presence of Office Web Apps, the main client will be Office on desktop computers: Office Professional Plus for the desktop is now available as a part of the package on the same monthly subscription basis
    • There is a-la-carte pricing for individual parts of the suite and bundles known as plans targeted at different market segments.

    I think the a-la-carte pricing option is a good thing – though some are bound to say "Microsoft are offering too many options". The plans are just the combinations of cloud services we think will be popular; services can be added to a plan or bought standalone – for example "kiosk" workers can get on the company e-mail system with Outlook Web Access from $2. We've announced that the plans will cost between $4 and $27 per month, that one of the enterprise plans closely mirrors the current BPOS at the same $10/user/month, and that there will be a $6 plan with the features we think small businesses will need. In the run-up to the launch I did see some details of different plans and options, but I haven't seen all of these in the announcements, and it is not impossible that they will be fine-tuned before the system goes fully live. When will that be? The launch has a beta programme (sign-up is at http://office365.microsoft.com); Mary-Jo said back in July that the plan was for a full launch in early 2011, which sounds about right – it's also necessarily vague, because a beta might reveal a lot of unexpected work to be done. If you want a more precise date, I always say in these cases that those who know won't talk, and those who talk don't know.

    We’ve positioned Office 365 as helping small businesses to think big and big business to act fast – the link gives examples which range from the Starwood hotel chain to a single independent restaurant – it’s worth taking time to work out what it might mean to the organization(s) you work in/with: the cloud might be right for you, it might not – but if it isn’t I’d want to be able to explain why not and not have people think an opportunity was being missed through inertia.

    This post originally appeared on my technet blog.

    October 19, 2010

    Thinking about the cloud (part 1).

    Filed under: Azure / Cloud Services,Exchange,Office,Real Time Collaboration — jamesone111 @ 5:49 pm

    I was telling someone recently that before I joined Microsoft I spent the late 1990s running a small training company. The number of employees varied, averaging out at a dozen or so. I delivered training, did the business management, helped win over customers, and I looked after the IT. It was like doing two or three jobs.

    I've been quite reticent about our "Business Productivity Online Service", partly because it takes a long and closely argued post to cover why, from an IT professional's point of view, getting rid of your servers isn't abdicating. (This is not going to be that post.) But as chance would have it, I was looking at BPOS again with my old job in my thoughts. B-POS sounds like it should be something… "point of sale", but it is Exchange, Communications Server and SharePoint provided as pay-monthly "cloud services".

    In the training company we ran all our own IT services, but there's no way I'd host my own web server today: the sense of using a hosting company was clear before I left for Microsoft. The launch of BPOS gave businesses a way to get hosted mail (Exchange), presence & IM (OCS) and collaboration & document management (SharePoint) for $10 US per month – or in round numbers £80 annually – per user. Comparing that with the cost of server hardware and software, and especially the time that in-house systems took up, if I were running that business today my head would say get rid of the servers. You can mix in-house and in-cloud servers, and users keep the same desktop software, which is crucial: you don't give up Outlook to move your mailboxes to the cloud.

    It needs a change of attitude to give up the server. If my head argued costs and figures, my heart might have come back with benefits like "you are master of your own destiny with the servers in-house". But are you? Back then we couldn't justify clustering our servers, so if hardware failed, work would stop until it was repaired. Paying for a service in a Microsoft datacentre means it runs on clustered hardware, which someone else maintains. Microsoft's datacentre is a bigger target for attack, but the sheer scale of the operation allows investment in tiers of defence. Small businesses tend not to worry about these things until something goes wrong, and you can always tell yourself that the risk is OK if you're getting a better service in-house. But the truth is you're probably not getting a better service. As a Microsoft employee I'm used to having access to my mail and calendar from anything that connects to the internet – laptop at home or on the move, any PC with web access, or synced to a phone. I doubt I would have set that up for the training company, but it's part of BPOS – even to the extent of supporting iPhones and BlackBerries. Getting rid of servers could not only save money but give users a better set of tools to use in their jobs – an easier thing to accept now that I don't run servers for a business.

    Now, if you've come across the idea of the hype cycle (see Wikipedia if not), I agree with Gartner that cloud technologies are somewhere near the "peak of inflated expectations" – in other words, people are talking up "the cloud" beyond its true capabilities, and if things follow a normal course there will be a "trough of disillusionment" before things find their true level. I don't buy into the idea that in the future scarcely any business will bother with keeping their own server, any more than they would generate their own electricity. Nor do I buy into the polar opposite – that very few organisations, and none with any sense, will keep critical services in the cloud; that idea seems just as implausible to me. So the truth must lie in between: the method of delivering services to users won't change from one foregone conclusion (the in-house server) to another foregone conclusion (the service in the cloud). Like so many things, it will be a question of businesses asking "does it make sense to do this in-house?", and I think IT professionals will want to avoid depending on that question being answered one way.

    This post originally appeared on my technet blog.

    March 10, 2010

    UK techdays Free events in London – including after hours.


    You may have seen that registration for UK TechDays events from 12th to 16th April is already open – but you probably won't have seen this newly announced session, even if you are following @uktechdays on Twitter

    After Hours @ UK Tech Days 2010 – Wednesday 14th April, 7pm – 9pm. Vue Cinema, Fulham Broadway.

    Reviving the critically acclaimed series of madcap hobbyist technology demonstrations, After Hours reappears at Tech Days 2010. After Hours is all about the fun stuff people are building at home with Microsoft technology, ranging from the useful 'must-haves' no modern home should be without, to the bleeding edge of science fiction made real! Featuring in this fun-filled two-hour installment of entertaining projects are: home entertainment systems, XNA augmented reality, natural user interfaces, robotics and virtual adventures in the real world with a home-brew holodeck!

    Session 1: Home entertainment.

    In this session we demonstrate the integration of e-home technologies to produce the ultimate in media entertainment systems and cyber home services.  We show you how to inspire your children to follow the ‘way of the coder’ by tapping into their Xbox 360 gaming time.

    Session 2: Augmented reality.

    2010 promises to be the year of the Natural User Interface. In this session we demonstrate and discuss the innovations under development at Microsoft, and take an adventure in the ultimate of geek fantasies – the XNA Holodeck.

    Like all other TechDays sessions this one is FREE to attend – and if you hadn't heard: UK Tech Days 2010 is a week-long series of events run by Microsoft and technical communities to celebrate and inspire developers, IT professionals and IT managers to get more from Microsoft technology. Our day events in London will cover the latest technology releases, including Microsoft Visual Studio 2010, Microsoft Office 2010, virtualisation, Silverlight, Microsoft Windows 7 and Microsoft SQL Server 2008 R2, plus events focusing on deployment and an IT Manager day. Oh, and did I say they were FREE?

    IT Professional Week – Shepherds Bush

    Monday, 12 April 2010 – Virtualization Summit – From the Desktop to the Datacentre

    Designed to provide you with an understanding of the key products & technologies enabling seamless physical and virtual management, interoperable tools, and cost-savings & value.

    Tuesday, 13 April 2010 – Office 2010 – Experience the Next Wave in Business Productivity

    The event will cover how the improvements to Office, SharePoint, Exchange, Project and Visio will provide a practical platform that will allow IT professionals to not only solve problems and deliver business value, but also demonstrate this value to IT’s stakeholders. 

    Wednesday, 14 April 2010 – Windows 7 and Windows Server 2008 R2 – Deployment Made Easy

    This event will provide you with an understanding of these tools, including the new Microsoft Deployment Toolkit 2010, Windows Deployment Services and the Application Compatibility Toolkit. We will also take you through the considerations for deploying Windows Server 2008 R2 and migrating your server roles.

    Thursday, 15 April 2010 – SQL Server 2008 R2 – The Information Platform
    Highlighting the new capabilities of the platform, diving into specific topics such as consolidating SQL Server databases and tips and techniques for performance monitoring and tuning, as well as looking at our newly released cloud platform, SQL Azure.

    Friday, 16 April 2010 (IT Managers) – Looking ahead, keeping the boss happy and raising the profile of IT
    IT Managers have more and more responsibilities to drive and support the direction of the business. We’ll explore the various trends and technologies that can bring IT to the top table, from score-carding to data governance and cloud computing.

    Developer Week – Fulham Broadway

    Monday, 12 April 2010 (For Heads of Development and Software Architects) – Microsoft Visual Studio 2010 Launch – A Path to Big Ideas

    This launch event is aimed at development managers, heads of development and software architects who want to hear how Visual Studio 2010 can help build better applications whilst taking advantage of great integration with other key technologies.
    NB – Day 2 will cover the technical in-depth sessions aimed at developers

    Tuesday, 13 April 2010 – Getting started with Microsoft .NET Framework 4 and Microsoft Visual Studio 2010 – WAITLIST ONLY
    Microsoft and industry experts will share their perspectives on the top new and useful features in the core programming languages, the framework and the tooling – such as ASP.NET MVC, Parallel Programming, Entity Framework 4, and the offerings around rich client and web development experiences.

    Wednesday, 14 April 2010 – The Essential MIX
    Join us for the Essential MIX as we continue exploring the art and science of creating great user experiences. Learn about the next generation ASP.NET & Silverlight platforms that make it a rich and reach world.

    Thursday, 15 April 2010 – Best of Breed Client Applications on Microsoft Windows 7
    Windows 7 adoption is happening at a startling pace. In this demo-driven day, we'll look at the developer landscape around Windows 7 to get you up to speed on the operating system that your applications will run on through the new decade.

    Friday, 16 April 2010 – Registration opening soon! – Windows Phone Day
    Join us for a practical day of detailed Windows Phone development sessions covering the new Windows Phone specification, application standards and services

    There will also be some "fringe" events; these won't all be in London and I'll post about them separately (James in the Midlands, I've heard you :-) )

     

    This post originally appeared on my technet blog.

    November 17, 2009

    Making word clouds (Part 2: how to use it, and clouds from Twitter).

    Filed under: Music and Media,Office,Powershell — jamesone111 @ 2:04 pm

    Attached to this post is a ZIP file containing Twitter.ps1, the PowerShell script I use to get information from Twitter; since the word cloud work grew out of that, it has ended up in the same file. It also contains noise.dat, the list of noise words, which you can customize.


    If you want to use it you will need to have PowerShell V2 installed – if you are on Windows 7 or Server 2008 R2 you have it already; otherwise look up KB968929, from which you can download WinRM and PowerShell 2.0 for anything back to Windows XP. The code has been tried on the beta of Office 2010 and on Office 2007; it should work with PowerPoint from earlier versions but hasn't been tested.


    The first thing you need to do is to load it. Any version of Windows from Vista onwards flags files which have been downloaded from the internet, and PowerShell can be a bit fussy about those. I suggest that when you have downloaded the ZIP file you right-click it, go to Properties and click Unblock on the General page before you extract the files. Once you have extracted the files, start PowerShell and enter two commands
    CD [folder where you extracted the files]  


    . .\twitter.ps1


    note that is dot, space, dot, backslash, twitter.ps1: it won't work without the dots.
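    Incidentally, on later versions of PowerShell (3.0 and up) the unblocking can be scripted rather than done from the Properties dialog – a small sketch, assuming the files were extracted to a WordCloud folder:

    Get-ChildItem -Path .\WordCloud -Recurse | Unblock-File   # clears the "downloaded from the internet" flag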


    I have included a sample file, Macbeth.txt, to get you started; it is the text of… the Scottish play. So you can now type the command


    Format-wordCloud Macbeth.txt


    PowerPoint should start in the background and the script will put together your first word cloud. The text for this will be all horizontal, all default colours and fonts, all words and no phrases. The biggest text will be 80 point and the smallest 16 – if your example turns out like the first one of mine, you can see that we might want to change the -MaxFont and -MinFont settings or the -HowMany parameter to fill the space better. When it finishes, the function gives a fill percentage – that is, the total space occupied by the words as a proportion of the slide area. Mine came out at just under 50%, and experience tells me not to expect more than 75%, so I might increase the font size to -MaxFont 100 -MinFont 20; as there are plenty of words, I don't want to fill the space with more words.


    It's not bad for a first attempt, but it has my, our, your, his, me, him, us and no too prominently; these can be taken out with the -ExtraNoiseWords parameter, like this:


    Format-wordCloud macbeth.txt -ExtraNoiseWords my, our, your, his, me, him, us, no


    We can introduce some colours – if you enter this command it will show you what the colour selections are


    $RGBSet


    The colour numbers are Red + 256 × Green + 65536 × Blue, so the default set is 4 black, 1 red, 1 green, 1 blue.
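    That formula makes it easy to compute values for a custom set – a hedged sketch (the colours and weighting here are my own choices, not the script's defaults):

    $red      = 255 + 256*0   + 65536*0     # 255
    $green    = 0   + 256*255 + 65536*0     # 65280
    $blue     = 0   + 256*0   + 65536*255   # 16711680
    $myRgbSet = @(0, 0, 0, $red, $green, $blue)   # weighted towards black, like the default set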
    In addition we can make about 25% of the words appear vertical, and use a font which looks right for Shakespeare


    Format-wordCloud macbeth.txt -ExtraNoiseWords my, our, your, his, me, him -RandomVertical 0.25 -RgbSet $rgbset -fontName "Blackadder ITC"


    The final thing to try might be to look for something on Twitter. It takes several seconds to run a Twitter search for the last 1500 posts (that's the -Deep switch), so it is better to store the result in case you want to run with a different set of parameters. Let's see what is showing up in the F1 world today: first get the tweets, then put just their titles into the cloud.


    $searchResults = Get-TwitterSearch "F1" -Deep


    $searchResults | Foreach {$_.title} | Format-wordCloud -phrases -RandomVertical 0.25 -RgbSet $rgbset -uniqueLines -maxfont 60


    The -uniqueLines switch is there for something which I have mentioned before – the tendency of people to retweet an identical post many times. You can spot this happening when a long phrase becomes very prominent, which is often the case if a couple of news stories dominate a search; even with this in, you can see a few stories are repeated in slightly different forms.


    I can't show everything here. Obviously you can do a lot once the slide is created in PowerPoint: one favourite trick is to select all and set the animation for every bit of text to Appear 0.1 seconds after previous. I tweaked the colours and layout for the F1 tweets in PowerPoint as well. I haven't shown -RandomTwist (the value is the maximum angle of twist in degrees), but that needs a lot more fiddling after the layout is done. Nor have I shown -RandomBold or -RandomItalic, which work just like -RandomVertical (phrases are always in italic). No two layouts which use any of the random parameters will be identical, and sometimes it is worth running the format again with the -UseExisting switch (and no filename, -Text or piped data) to see if a second one looks better than the first.


    You can export $words to a CSV file with $words | Export-Csv -Path MyWords.csv, and modify it in Excel or use it as a template for your own text. If you do, you'll notice there is a URL column – you can assign links to the text if you want to. Once you have the text you want as a CSV file, you can re-import it with $words = Import-Csv MyWords.csv and run Format-wordCloud with the -UseExisting switch
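    Put together, the round trip looks like this – Export-Csv and Import-Csv are standard PowerShell, while -useExisting is the script's own switch described above:

    $words | Export-Csv -Path MyWords.csv -NoTypeInformation   # edit in Excel, or use as a template
    $words = Import-Csv -Path MyWords.csv
    Format-wordCloud -useExisting                              # lay out the re-imported words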


    As you can see there's lots to play with. PowerShell seems quite happy to process very large amounts of text – I got the text of War and Peace, and although it took a while to process the words, it worked just fine. So try your own text and combinations of settings. But let me stress the disclaimer that covers everything here – it is provided "AS IS", with no warranties, and confers no rights.


    This post originally appeared on my technet blog.

    Making Word clouds (Part 1: how it works).

    Filed under: Desktop Productivity,Music and Media,Office,Powershell — jamesone111 @ 9:41 am

    I’ve been playing with word clouds on and off for the last couple of months, and finally I’ve decided the time has come to share what I have been doing. 


    Word clouds turn up in all sorts of places, and I wanted to produce something which could take any text, be customized, and let me edit the final version. The last requirement was key, because anything which produces a bitmap graphic at the end is not going to be easy to edit. I've seen it done with HTML tables, but they are hard to redesign (you can't move words around easily). So it needed to be something like Visio or PowerPoint, or WMF, which can produce a drawing containing text. Eventually I settled on PowerPoint. Although I'm using the beta of Office 2010, it relies on an object model for PowerPoint which hasn't changed for several versions. And, since I only seem to program in PowerShell these days, I wrote it in PowerShell. This gives me an easy way of taking any text – like tweets from Twitter – and pushing it into a cloud. So I wrote my longest single PowerShell function yet to do the job.


    [word cloud image built from tweets about TechEd]



    1. If Not already connected to PowerPoint, get connected. Start a new, blank, slide

    2. Get a list of "noise words" from a file (I used a copy of Noise.dat, which is part of Windows Search, as a starting point) and merge that list with any passed via the -ExtraNoiseWords parameter.

    3. Take text from a file (specified by the -FileName parameter), a PowerShell variable or expression (specified by the -Text parameter) or from the pipeline in PowerShell, and produce a "clean" set of words by:

      1. Removing anything which is not a space, letter, digit or apostrophe from the text.

      2. Removing 's at the end of words, and converting "_" to space.

      3. Splitting the text at spaces.

      4. Removing "words" which are either URLs or numbers.

    4. Count the occurrences of the words, and determine the "cut-off" frequency which words must meet to get into the final cloud (a -HowMany parameter sets the number of words; if this is the default value of 150 and the 150th non-noise word occurs 10 times, accept all words with 10 occurrences, even if that gives 160 non-noise words – there's a sketch of this logic after the list)

    5. if the –phrases switch is specified:

      1. Find phrases which contain any of the words which meet the cut-off frequency.

      2. Ignore those phrases which don’t make the cut-off frequency.

      3. Repeat the process looking for longer phrases which contain the phrases which were just found. Keep repeating until no phrases are found which meet the cut-off frequency.

      4. Add the phrases to the list of found words and reduce the count of their constituent words.

    6. Remove noise words, two-word phrases where one word is a noise word, and words which do not reach the cut-off frequency; sort the list of words by frequency and then by number of letters

    7. Store the words in a global variable ($words) so that the function can be re-run with the ‑useExisting switch. $words can be reviewed or exported and re-imported later.

    8. If the -NoPlot switch is specified, stop, leaving the words and phrases found and their counts in $words.

    9. Set additional properties on the word:
      Set the font size for the word, scaled between the values set by the -MinFont and -MaxFont parameters (these default to 16 and 80 point respectively)
      Set the margins to the value specified in the -Margin parameter – PowerPoint uses quite generous margins by default, but the script defaults to 0.
      If -RandomVertical and/or -RandomBold and/or -RandomItalic values are specified, generate a random number for each, and if it is less than the specified value, set the corresponding text attribute to true
      If -RandomTwist is specified, set the twist-angle attribute to a random amount up to the value of RandomTwist
      If multiple RGB colours have been provided using the -RgbSet parameter, select one at random. If not, the default PowerPoint colour will be used – normally black.
      If the -FontName parameter has been provided and is a single name, set the word to use it; if multiple fonts have been specified, select one at random. If no font is specified, the default PowerPoint font will be used.

    10. Place the first (most common) word in a PowerPoint shape (rectangle) at the centre of the slide, and store the positions of its corners as properties of the word

    11. Place each remaining word in its own shape at the top left corner of the slide, setting its properties as already defined. Get its size from PowerPoint, then try to place it around the boundaries of each existing shape, stopping when the placement won't overlap with any of the other placed shapes. (The starting point for this method was something I read by Chris Done – it was here, but his pages on word clouds only show up in search engine caches now.) Note that the more shapes which have been placed, the longer each new shape will take to place. Store the positions of the newly placed shape's corners as properties for use placing future shapes.

    12. Stop when either the number of words which cannot be placed exceeds the value of –maxFailsToPlace (3 by default), or all the words have been placed successfully.
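
    Here’s a minimal sketch, in PowerShell, of the cleaning and counting in steps 3 and 4 – my own illustration rather than the real script (only the –HowMany parameter name comes from the list above; the file name and variable names are made up):

        # Steps 3 & 4 sketched: clean the text, split it into words,
        # count them, and work out the cut-off frequency.
        $text  = (Get-Content .\speech.txt) -join " "            # hypothetical input file
        $text  = $text -replace "[^ \w']", " "                   # keep spaces, letters, digits and '
        $text  = ($text -replace "'s\b", "") -replace "_", " "   # drop possessives; "_" becomes space
        $words = $text -split "\s+" |
            Where-Object {$_ -and $_ -notmatch "^https?://" -and $_ -notmatch "^\d+$"}
        $counts  = @($words | Group-Object | Sort-Object Count -Descending)
        $HowMany = 150
        $cutOff  = $counts[([math]::Min($HowMany, $counts.Count)) - 1].Count
        $counts | Where-Object {$_.Count -ge $cutOff}            # every word making the cut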

    In part 2 I’ll include the PowerShell code: the example above was from the tweets about TechEd, and I’ll show some more examples with the command lines which were used. As you can see from the above, there are 20 or so parameters to explain.
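
    Until part 2 arrives, here’s a rough sketch of the font-size scaling from step 9 – again my own guess at the method, with only the –minFont and –maxFont names (and their 16 and 80 point defaults) taken from the list above:

        # Linear scaling between -minFont and -maxFont (one plausible approach).
        # $counts holds the Name/Count groups produced by the previous sketch.
        $minFont  = 16; $maxFont = 80
        $maxCount = ($counts | Measure-Object Count -Maximum).Maximum
        $minCount = ($counts | Measure-Object Count -Minimum).Minimum
        foreach ($word in $counts) {
            $scale = ($word.Count - $minCount) / [math]::Max(1, $maxCount - $minCount)
            $size  = [math]::Round($minFont + $scale * ($maxFont - $minFont))
            Add-Member -InputObject $word -MemberType NoteProperty -Name FontSize -Value $size
        }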


    Update: Thanks to Ian for letting me know that Chris’s page is missing in action; the italicized part of point 11 has been changed accordingly.



    This post originally appeared on my technet blog.

    January 26, 2009

    Camera-phones One Note and OCR.

    Filed under: General musings,Mobility,Office — jamesone111 @ 1:43 pm

    Everyone uses different bits of Office. There’s a core piece that everyone uses, and then we all have our personal 10%. I like the OCR feature of OneNote. For example, on the way to the BETT show a few days back I saw an advert on the tube that’s a grander variation on “How do you pronounce Ghoti?” *

    If GH can stand for P in Hiccough
    If OUGH can stand for O in Dough
    If PHTH can stand for T in Phthisis
    If EIGH can stand for A in Neighbour
    If TTE can stand for T in Gazette
    If EAU can stand for O in Plateau
    Then the way you spell POTATO is…

    GHOUGHPHTHEIGHTTEEAU

    Isn’t it?

    Only The Times brings you the UK’s first national spelling championship for schools.
    Join in at
    Timesonline.co.uk/spellingbee

    I just grabbed a snap with my phone (the handles for standing passengers make a great camera rest to keep shake down), and when I hooked up to my PC I dropped the picture into OneNote: OneNote does OCR on pictures and offers a “copy text” option on the right-click menu. I’m finding myself using this more and more – for slides, with a variety of cameras, and even for screen grabs of on-line presentations. I’ve noticed that some phones now do recognition of business cards on the phone itself, and I wonder how long it will be before the whole thing can be done in the phone, without needing the PC to do the OCR part.

     

    * Ghoti is pronounced “Fish”: Gh as in enough, o as in women, and ti as in station.

    This post originally appeared on my technet blog.

    November 5, 2008

    Slides and scripts from Open XML / PowerShell reporting Session in Barcelona

    Filed under: Office,Powershell,Windows Server — jamesone111 @ 2:36 pm

    I’m going to spend the next two or three posts explaining what I did in this session for those who weren’t there; but for those who were, I promised that the slides and scripts would be downloadable.


    You can get the zip from my SkyDrive.

    This post originally appeared on my technet blog.

    September 8, 2008

    Net maps – PowerShell and Excel

    Filed under: Office,Powershell — jamesone111 @ 2:46 pm

    Over the weekend I picked up a link from Jeffrey, which took me to this post by Doug. My post from Friday had already got me thinking about visualization, so I went off on the link that Doug provided: a nice little snippet of PowerShell which piqued my interest, and a link to another post of his explaining that Microsoft Research had released .NetMap – linking to the download on CodePlex. “I’ve got to try this” I thought. After a couple of hiccups with the installation I had it working and was playing with the Excel test harness they built for it. To give an idea of what it can do, it imports information about mails you’ve had – who links to whom – and I had a fun hour playing with that last night. Here’s the result. I’ve blurred the names except my own, but the items in the centre are distribution lists I belong to; the huge starburst on the bottom left is one of the main activists on the virtualization discussion lists. I’ll dream up some other uses for this in due course, but I’d love to see what other people come up with.


    Update: I wrote a little bit of PowerShell to map out links between WMI objects.

    Function Get-Related {
        param($class, $maxLevels)
        # Keep a global list of classes already processed, so we don't loop for ever
        if ($global:done -notcontains $class) {
            $global:done += $class
            $classes = ([wmiclass]"\\.\root\cimv2:$class").GetRelatedClasses() |
                           Where-Object {$_.name -like "Win32*"}
            # Emit Source/Target pairs - the shape Show-NetMap expects
            $classes | Select-Object -Property @{Name="Source"; Expression={$class}},
                                               @{Name="Target"; Expression={$_.name}}
            if ($maxLevels -gt 0) {$classes | ForEach-Object {Get-Related $_.name ($maxLevels - 1)}}
        }
    }

    PS > [string[]]$done = @()
    PS > Get-Related "Win32_ComputerSystem" 2 | Show-NetMap

    The result is the picture attached below – click for the full-size view.

    This post originally appeared on my technet blog.

    August 12, 2008

    PowerShell from Excel (oh oh, VMware again)

    Filed under: Office,Powershell — jamesone111 @ 5:34 pm

    VMware are a competitor, and when things go wrong for them I’ll point it out. (And to answer Nick, a regular commenter: no, that’s not FUD. Saying “VMware had a failure here, so you can extrapolate from that to unspecified future failures” would be spreading Fear, Uncertainty and Doubt. I might lapse into that now and then, but I aspire not to – and if I do, I expect to get a tough time.)

    But competitors do good things sometimes, and I’ve praised them for their use of PowerShell before now. Using PowerShell was good for them (it’s much easier to develop a PowerShell snap-in than a raft of command-line tools), and it’s good for customers (no need to learn special-purpose tools).

    Now they’ve done another clever thing with PowerShell: not rocket science, but a great application of sense. Sometimes it makes sense to have your user interface in an Office application like Excel (with the business logic implemented in macros), but sometimes this gives rise to tasks which are best carried out from PowerShell. How do you knit together PowerShell scripts and Excel macros (or any other scripting language)? Provided a language can write files and invoke other programs, you can do it. Someone pointed out this page on the VI Toolkit blog; it says

    If an ESX host you want to manage doesn’t appear in VirtualCenter, you need to add it. This is a bit trickier than reconnecting, since there’s no inventory in VirtualCenter to tell you the IP addresses of all the hosts you use, and you also need to know a host’s password in order to add it. This is another case where entering things in a spreadsheet can really speed things up.

    Not only is there a video of the spreadsheet in use, but it’s available for download as well. So I had a look, and (like so many good ideas) the code is remarkably simple. Start by opening a file:

        Handle = FreeFile

    Open "script.ps1" For Output Access Write As #Handle

    Then use Print # to output the lines of script to it:

        Print #Handle, "add-pssnapin VMware.VimAutomation.Core" & vbCrLf

    And when you’ve Print #-ed the whole script, close the file and run it:

        Close #Handle

    Call Shell("powershell -command .\script.ps1", vbNormalFocus)

    Not exactly rocket science, but smart use of the right tool for the job. I can’t find a name on the blog post, but whoever you are, a tip of the hat is due to you.
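
    For what it’s worth, the generated script.ps1 presumably ends up looking something like this – a guess at its shape rather than the actual download, with placeholder names throughout (Connect-VIServer, Get-Datacenter and Add-VMHost are standard cmdlets in VMware’s snap-in, but I haven’t checked exactly what the spreadsheet writes):

        # Hypothetical contents of the generated script.ps1 - the server name,
        # host name and password below are placeholders.
        Add-PSSnapin VMware.VimAutomation.Core
        Connect-VIServer -Server vc.example.com
        Add-VMHost -Name esx01.example.com -Location (Get-Datacenter | Select-Object -First 1) `
                   -User root -Password 'placeholder' -Force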

     


    This post originally appeared on my technet blog.

    June 15, 2008

    Open-XML. This is what it’s all about.

    Filed under: How to,Office,Powershell — jamesone111 @ 11:46 pm

    I have been saying for ages that most IT professionals really don’t give two hoots about Open-XML. It’s a file format. Who cares?

    My first chart – click for the full-size version

    The old file format had been around for ages, and was pretty opaque, so hardly anyone dug into it. The new format is XML (which is good), and rich (also good), but complicated (not so good). It has joined the ranks of formats approved by ISO, which might matter to those governments trying to follow de jure standards (and ignoring de facto ones… X.400 vs SMTP, anyone?). But to IT professionals in the commercial sector using and supporting Office, does any of this matter? Don’t they see file formats as “black boxes”? They might say that in theory it’s lovely that file internals are now XML (and standards-body-controlled XML at that) instead of something proprietary, but in practice they still need to deal with a change in file formats. Will they take advantage of the changes? And some will wonder whether manipulation through the Office object model, which we’ve been doing since Word 6, wasn’t enough.

    Time to Reconsider. 

    This week we released the Open-XML SDK. It allows developers to work much more quickly with Open-XML. That’s obviously a Good Thing, because it will bring more things to market which can work with the format. Still, to the kind of IT pro I’m thinking about – the kind who I think reads this blog – a new SDK isn’t exactly a reason to crack open the champagne. OK, we can manipulate the files without having the applications present, and developers who slave away in C# might produce stuff which IT pros want… but what can I do with this RIGHT NOW?

    How about charting your data-centre activity in Excel? I can almost feel the interest, but it’s tempered with a “but that needs a bucketful of code… doesn’t it?”. At the risk of repeating myself:

    Time to Reconsider. 

    Eric White – a fellow evangelist, although I’ll confess not one I could pick out in an ID parade – has posted the C# code for a PowerShell snap-in to CodePlex. This shows there may be a bucketful of code, but you don’t have to be the one who writes it. You can view what’s on CodePlex as two things: one is a demo of what can be done with the Open-XML SDK; the other is a bunch of PowerShell cmdlets which are useful in their own right. Now, those readers who have seen some of my PowerShell might still feel cautious – there may still be some nasties here.

    Time to Reconsider. 

    How’s this for a command line to get running processes into a spreadsheet?

       Get-Process | Export-OpenXMLSpreadSheet -OutputPath Process.xlsx  

    That’s it. Not nasty, is it? Cynics might think “I could use Export-CSV and open that in Excel”. But we’re just getting started. How about a graph? These two lines of PowerShell get the total CPU time used by running processes, then take the 10 heaviest processes and spit out their ProcessName and %CPU – the proportion of CPU they’ve used (the method is a shade simplistic, but bear with me):

       Get-Process | ForEach-Object -Begin {$TotalCPU = 0} -Process {$TotalCPU += $_.CPU}
       Get-Process | Sort-Object CPU -Descending |
           Select-Object -First 10 -Property ProcessName, @{name="%CPU"; expression={100*$_.CPU/$TotalCPU}}

    You’ve already seen that we could pipe that into Export-OpenXMLSpreadSheet, but I want a bar chart of the CPU used by each process – what would that need? The video Eric has on his blog gives the answer: the extra switches needed by Export-OpenXMLSpreadSheet would be

       -chart -ChartType Bar -ColumnsToChart %CPU -HeaderColumn ProcessName
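
    Putting those pieces together – my own assembly of the commands above rather than Eric’s exact demo, and I’ve quoted "%CPU" to be on the safe side when it’s used as a parameter value:

        Get-Process | ForEach-Object -Begin {$TotalCPU = 0} -Process {$TotalCPU += $_.CPU}
        Get-Process | Sort-Object CPU -Descending |
            Select-Object -First 10 -Property ProcessName, @{name="%CPU"; expression={100*$_.CPU/$TotalCPU}} |
            Export-OpenXMLSpreadSheet -OutputPath Process.xlsx `
                -chart -ChartType Bar -ColumnsToChart "%CPU" -HeaderColumn ProcessName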

    I thought I’d have a go. I had the Express version of C# on my demo server, but no Office. I downloaded the SDK and the stuff from CodePlex, copied what I’d seen Eric do in the video and – hey presto – I had a compiled version, set up for 32-bit PowerShell only. This is not the first snap-in to register as 32-bit only, and it won’t be the last; fortunately I know how to fix that, and the next post will explain it. Within 30 minutes I’d produced the chart you see here, and opened it using Office in a VM on the same box.

    In my previous role as a consultant I would have loved tools like this: giving the client reports that were easy to understand and looked great, from Exchange, System Center Virtual Machine Manager, HPC, you name it… The PowerShell Community Extensions could even mail them as an attachment.

    Eric’s video shows how multiple documents can be given a common style, and gives a bunch more detail. Seriously, if you didn’t think you could learn anything from an 8-minute video today, it might be time to reconsider.

     

    This post originally appeared on my technet blog.

    April 2, 2008

    It’s official, Office Open XML is an ISO standard

    Filed under: Office — jamesone111 @ 11:46 am

    ISO have posted the official news to their web site under the less-than-inspiring title “ISO/IEC DIS 29500, Information technology – Office Open XML file formats, has received the necessary number of votes for approval as an ISO/IEC International Standard.”

    I’ve had some pretty robust things to say about certain competitors who used approval by standards bodies as a substitute for producing a good product, and the temptation to go “Nah nah na-na nah” – or something equally childish – is pretty strong. But that would be to ignore a couple of important things: the first is that Microsoft has handed over responsibility for the file formats to a standards body; the second is that customers get both a choice of file formats and the comfort that Microsoft (and others) won’t go changing the standard on a whim.

    Brian Jones has a very well-thought-out post on the subject if you want to read more.


    This post originally appeared on my technet blog.

    January 8, 2008

    Search: A quick round-up

    Filed under: Events,Office,Virtualization,Windows 2003 Server — jamesone111 @ 12:48 pm

    I’ve been meaning to have a play with Search Server since Viral demoed it on the last roadshow. We have a downloadable VHD for it with a very good walk-through – I can say that honestly, because I’ve just spent a couple of hours walking through it.

    Of course, since I’m doing all my virtualization on Server 2008 and Hyper-V now, I thought I’d try it out there – and rather annoyingly the VHD was built with an out-of-date set of Virtual Server extensions which won’t uninstall under Hyper-V. So I downloaded Virtual PC 2007, which warned me that it is not supported on Server 2008 (and running virtualization on top of virtualization is a silly thing to do), but it works all the same. I fired up the VM, removed the extensions, shut it down, booted back in Hyper-V, added the integration components and off it went. I’d run Search Server with more than the 1GB specified, but even with that it runs OK.

    Now, I’ve been playing with Microsoft search technologies since we introduced Index Server for IIS 2.0 back in about 1996, and at one stage of my life I was a SharePoint Portal Server guru. Back in 1999, when I was first interviewing for a job at Microsoft, I brought in some of the work I had done with digital dashboards – stuff that evolved into Web Parts. Doing the walk-through for Search Server Express, two things leapt out at me: the first was how much easier it is now than I remember; the other is how simple, but how clever, federated search is. Some while back I mentioned OpenSearch, and it seems to me to be a clever technology based on two simple ideas: (a) provide an XML document to tell people how to query your site, and (b) return the results as XML to make it easy to consume them in something else (using the RSS XML schema makes this doubly easy). Several people are keeping lists of services which support OpenSearch; OpenSearch.org has a list of such lists.
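
    Consuming one of those feeds from PowerShell takes only a couple of lines – a sketch, with a made-up URL (a real service advertises its query template in its OpenSearch description document):

        # Query a hypothetical OpenSearch-enabled site and list the results.
        $url = "http://search.example.com/results?q=powershell&format=rss"
        [xml]$feed = (New-Object System.Net.WebClient).DownloadString($url)
        $feed.rss.channel.item | Select-Object title, link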

    According to Sharon’s blog we’ve booked her to do a series of workshops on search, which will cover Search Server. Unlike the walk-through, I haven’t seen the content for these, but knowing it’s Sharon doing them, I feel safe recommending those too. Someone is bound to ask her the perennial question “Can I index a database?” – answer: “What constitutes a match? What is the URL you’re going to click, and what will that open? But you can create web pages to crawl the data and return records.” I don’t know if she’s going to cover things like adding PDF support, or whether Viral will have that on his blog sometime soon. I guess if he’s going to blog about virtualization, I’ll have to blog about configuring iFilters.

    Finally, this morning I got a mail saying we have announced an offer to acquire the Norwegian search company FAST. I don’t know anything about them or how their product will fit in with our other technologies.

     

    This post originally appeared on my technet blog.

    December 14, 2007

    MY reason for having Office 2007 SP1

    Filed under: Office — jamesone111 @ 3:15 pm

    I’ve blogged about this twice already, but now it’s on my system I have one bug-fix to report.

    I read everything in Outlook. I don’t go to blog sites to see if they have new posts; I subscribe to RSS feeds, which get brought into folders in Outlook. In fact, if something doesn’t come to me via Outlook, it probably won’t get read. Web forums? Forget it.

    I have my feeds offline, searchable, and accessible from anywhere (although only updated when my PC is online). For reasons I never got to the bottom of, Outlook dated all the posts on some feeds 1st Jan 2007 or 31st Dec 2006.

    That behaviour is fixed in SP1 – though I’ve not seen it recorded anywhere.

    I’ve re-created some of the affected feeds to force them to re-date the last few dozen posts, and life’s better already.


    This post originally appeared on my technet blog.

    December 13, 2007

    More on office SP1 (pass the tin hat)

    Filed under: Office — jamesone111 @ 11:44 am

    It was an interesting journey to work. Between the fog, ice, roadworks and traffic police who seem to have become suicidal of late, I was listening to my e-mail through Outlook Voice Access. The time really has come to buy myself a Jawbone headset: OVA is perfect when I use the mobile handset (illegal in the car), but using my Jabra it can take six attempts to get words recognized – “Next” sounds like “Help”, “Delete” sounds like “Repeat”, and “e-mail” is indecipherable. Reading works well, though, and the mechanical voice read me this:

    Microsoft announced publicly this week that Office 2007 SP1 is available on Microsoft Update for interactive, user-initiated installation, but will not be pushed for automatic installation for a few months. However soon after the release on Tuesday, customers began to report that the service pack had been automatically installed on some of their systems. 

    After thoroughly investigating the reports, we determined that the Office service pack was operating as expected on Microsoft Update but that some customers were confused about the expected behaviour of interactive and automatic updates on Windows Vista due to changes from the Windows XP functionality. We also confirmed that Office 2007 SP1 will automatically install on systems running a beta version of Windows Vista SP1 since those systems use a different Microsoft Update server for their updates and that server always pushes out all Important and Recommended updates to keep the beta products up to date.

     Further public communications on this issue will be posted on the Microsoft Update blog at http://blogs.technet.com/mu/.

    The Microsoft update blog has a post with more detail. Darren also has more on his blog.


    This post originally appeared on my technet blog.

    December 12, 2007

    Office 2007 Service pack now out

    Filed under: Office — jamesone111 @ 11:30 am

    The title says it all, really. Details of what is in the service pack are in KB article 936982 and the download is available here. I’ve been saying I’d rebuild my PC once we have the releases of Office 2007 SP1 and Vista SP1, so I haven’t been running this one in beta and haven’t got experience to share yet.

     

     


    This post originally appeared on my technet blog.
