On the move!

05 December 2014

After 7 years blogging at http://aidangarnish.net using the BlogEngine.net platform I have decided to try something new. I will now be blogging at www.collaborationnotcompetition.com using the Ghost blogging platform.

All the existing SharePoint content still gets a reasonable number of hits (it is remarkable how many people are still interested in the STSAdm commands for deploying a wsp!), so rather than attempting the almost impossible task of moving it without breaking existing links I will be leaving it in place. If I produce any SharePoint posts in future I will most likely put them here, but for now new content is likely to be slim pickings on this blog. Fear not though, as there is already a scintillating tale of the Internet of Things and home automation over on the new blog, so go and take a look.

Are developers too demanding?

20 February 2014

The write-up of a recent Digital Leaders meeting has raised some interesting debate about the best way to develop the North East digital sector, but the comment that struck a real chord with me was the following:

...someone "who runs an IT recruitment agency said they often had problems filling job vacancies as some local developers were being too demanding"

This is similar to other comments I have heard recently from other recruiters. The feeling seems to be that because we are in an area with a relatively low cost of living, local companies can't be expected to pay competitive salaries. Unfortunately for those companies, they are competing for developers with some large local private and public sector organisations who have deep pockets and can come close to offering internationally competitive rates. The other often overlooked factor is that developers can work remotely. As a result there are people (including myself) who live in the region but work with companies based in other countries. This means local companies aren't just competing with other local companies; they are potentially competing with every company, everywhere.

So, if local SMEs can't compete on salary what are their options? They could outsource to other countries where the cost of living is even lower, but that probably isn't sustainable forever as costs of living slowly start to harmonise. There is also the added challenge of language barriers and time zones, which can be overcome but take time and effort, which in turn drives up cost.

Another alternative would be to stop writing shitty job adverts! Or rather, write job ads that present your company as a place a developer wants to work. Most job ads are an uninspiring paragraph of buzz phrases ("demonstrate a good level of commercial experience", "client facing personality", "willingness to learn new technologies", "the opportunity to work with the latest technologies"), immediately followed by a bullet list of technologies that are one or more iterations out of date. What these ads fail to recognise is that any good developer wants to know that whatever they will be working on is something they can feel proud of and get passionate about. Companies should spend more time communicating what gets them excited and why what they are doing is important. If they can't, or if that feels like too much effort, then why would anyone worth their salt choose to work with them?

As developers we also need to take responsibility for encouraging the types of companies we want to work for. We either need to get out into the world and find interesting, engaging companies to work with or we need to help build those companies ourselves. One organisation that is doing just that, and much more, in the north east is Ignite - check out their Campus North Kickstarter campaign to get involved.

What do you think, are developers too demanding? What can local SMEs do to attract developers? @aidangarnish

**UPDATE** Turns out the quote from the recruiter above has been somewhat misrepresented. To put the record straight:

"Agh! It was taken out of context! What I actually said was developers demand more in a good way, they demand more than the same old job specs, prospects and salaries that have been put forward for the 7 yrs I've worked in the North East! Employers need to take responsibility for the talent drain and offer something worth staying for!" - Laura Sharpe

Great news! Sounds like we can all look forward to more engaging job ads and work in future.

 

Support Spanish City Regeneration

27 October 2013

No, not Granada or Seville, the Spanish City in Whitley Bay, a grade II listed building on the north east coast just down the road from Newcastle!

North Tyneside council are currently in the process of working to regenerate the iconic building after years of neglect and need your support to convince the Heritage Lottery Fund that some of their cash is worth spending on this project.

After visiting Spanish City yesterday on a rare open day I decided to put together a website to make it easier to find the online form that enables people to express their support for the scheme (I still haven't found a link to it from the council's own website!).

So, all you need to do is visit the Spanish City site to find out more and to pledge your support by filling in an extremely short form.

If you also fancy sharing the link and encouraging your friends to add their support too, well, the more the merrier. The majestic dome thanks you for your support!

SQLServerSpatial.dll and Azure Cloud Services

16 August 2013

There is a great article here by Alistair Aitchison detailing how to get the SQL Server spatial types working with Azure roles. Even with the aid of that article it took a little bit of trial and error to get it right.

The main issue I found was that when adding SQLServerSpatial.dll from C:\Windows\System32, the Add Existing Item dialogue was picking up the SQLServerSpatial.dll file from the SysWOW64 folder.

You can tell the difference by the file size: the SysWOW64 file is around 230k whilst the System32 file is over 400k. Make sure you add the larger file. I eventually had to copy the file to another location and add it from there to get the correct file added.

I also found that I didn't need to add the msvcp and msvcr dlls - this is probably because the standard VM running an Azure Cloud Service now comes with these available.
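
If you want a quick way to confirm that the correct native library ended up in the deployed package, forcing the spatial types to load will fail fast if the wrong copy was shipped. A minimal sketch, assuming the project references Microsoft.SqlServer.Types (the coordinates are just placeholders):

using System;
using Microsoft.SqlServer.Types;

public static class SpatialSmokeTest
{
    // Creating a SqlGeography point forces the native SqlServerSpatial.dll to load,
    // so this throws straight away if the 32-bit (SysWOW64) copy was packaged with the role.
    public static void Run()
    {
        SqlGeography point = SqlGeography.Point(54.97, -1.61, 4326);
        Console.WriteLine(point.STAsText().ToSqlString().Value); // POINT (-1.61 54.97)
    }
}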

Hope this saves someone the few hours it took me to get this working!

Configuring Azure AD as an Identity Provider in ACS

19 July 2013

There have been some big improvements recently in the ease of configuring applications to authenticate using Azure AD. It is now possible to manage configuration of your applications through the Azure portal as part of managing Azure AD.

There are lots of tutorials on how to set up Azure AD to work with your apps.

There is also lots of information on using ACS to work with Google, Facebook, Yahoo, Microsoft accounts and on premise AD FS 2.0.

Where there is a slight gap is the scenario where you want to authenticate your users using Azure AD through ACS. The app I am building allows users to register using a Microsoft account or a Google account, but we also want to add Azure AD to allow organisations to take advantage of single sign-on using their own organisational AD credentials.

I am starting from a position where my web app is already configured to use ACS and is happily authenticating users with Microsoft and Google accounts.

To also include Azure AD in the identity provider mix is a three step process:

1. Configure ACS

In ACS select Identity Providers and click Add

Leave the default selection of WS-Federation identity provider and click Next

Enter a display name and then get the URL for your Azure AD WS-Federation metadata. This can be found at https://login.windows.net/[myTenantName].onmicrosoft.com/FederationMetadata/2007-06/FederationMetadata.xml, replacing [myTenantName] with your tenant name.

Enter some login link text - this is what will be displayed when your user is selecting the IP they want to use.

Select the relying party applications you want to make the Azure AD IP available for and then hit save.

In the rule group for your relying party application you will need to add a new rule to pass through claims from Azure AD (or do whatever transformations are appropriate)

2. Configure your application in Azure AD

This step is only really necessary if you want to make the app available to external users or you want to enable your app to read or write directory data. If you only require straight authentication this step could be skipped.

Log in to the Azure portal and, in the Applications tab of your Azure AD directory, click Add.

Follow the wizard and fill in the fields as relevant for your app

3. Provision a service principal in the directory tenant for the ACS namespace

After completing the first two steps I was getting the following error when logging into my app using Azure AD as the IP:

HTTP Error Code:400

Message:ACS50000: There was an error issuing a token.

Inner Message:ACS50001: Relying party with identifier 'https://[mynamespace].accesscontrol.windows.net/' was not found

The solution is to provision a service principal in the AD tenant for your ACS namespace. This is the bit that took me some time to figure out as it looks like it is still something that can only be done using PowerShell. Hat tip to Ross Dargan for suggesting this could be the issue.

For a full explanation see Vittorio Bertocci's post, but the crucial bit you need is this (remember to replace the URLs with your ACS namespace):

Connect-MsolService
Import-Module MSOnlineExtended -Force
$replyUrl = New-MsolServicePrincipalAddresses -Address "https://lefederateur.accesscontrol.windows.net/"
New-MsolServicePrincipal -ServicePrincipalNames @("https://lefederateur.accesscontrol.windows.net/") -DisplayName "LeFederateur ACS Namespace" -Addresses $replyUrl

Once those steps are complete you should be able to start up your app and select Azure AD as the IP option and sign in using your Azure AD account.
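
If you want to check exactly which claims ACS is passing through from Azure AD (and that the pass-through rule added in step 1 is doing what you expect), dumping the claims on the current principal is a quick sanity check. A rough sketch, assuming a .NET 4.5 relying party using System.Security.Claims:

using System;
using System.Security.Claims;

public static class ClaimsDump
{
    // Write out every claim on the authenticated principal - handy for verifying
    // what ACS has issued after an Azure AD sign-in.
    public static void WriteClaims()
    {
        ClaimsPrincipal principal = ClaimsPrincipal.Current;
        foreach (Claim claim in principal.Claims)
        {
            Console.WriteLine("{0}: {1}", claim.Type, claim.Value);
        }
    }
}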

Logging Elmah Exceptions To Azure Storage

13 June 2013

I wanted to store the Elmah errors from a web application using Windows Azure Table Storage and came across this post by Dina Berry which outlines the steps required very nicely.

However, it was written in 2011 and the Windows Azure Table Storage client has moved on since then so some of the syntax is out of date.

Below is the code updated for the current Windows Azure Storage client (2.0) as of 13/06/2013.

I wanted to use the Azure Storage connection string that I already have set up in the Azure portal rather than exposing the username and password in the Elmah web.config settings, so this code no longer uses the IDictionary config and instead reads the connection string via ConfigurationManager.

using Elmah;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;
using Microsoft.WindowsAzure.Storage.Table.DataServices;
using System;
using System.Collections;
using System.Collections.Generic;
using System.Configuration;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace TIROne.ElmahAzureStorage
{
    public class WindowsAzureErrorLogs : ErrorLog
    {
         /// <summary>
        /// Table Name To Use In Windows Azure Storage
        /// </summary>
        private readonly string tableName = "ElmahExceptions";

        /// <summary>
        /// Cloud Table Client To Use When Accessing Windows Azure Storage
        /// </summary>
        private readonly CloudTableClient cloudTableClient;

        /// <summary>
        /// Initialize a new instance of the WindowsAzureErrorLogs class.
        /// </summary>
        /// <param name="config"></param>
        public WindowsAzureErrorLogs(IDictionary config)
        {
            // Fail fast if the storage connection string entry is missing or empty
            if (ConfigurationManager.ConnectionStrings["StorageConnectionString"] == null ||
                string.IsNullOrWhiteSpace(ConfigurationManager.ConnectionStrings["StorageConnectionString"].ConnectionString))
            {
                throw new Elmah.ApplicationException("Connection string is missing for the Windows Azure error log.");
            }

            CloudStorageAccount cloudStorageAccount = CloudStorageAccount.Parse(ConfigurationManager.ConnectionStrings["StorageConnectionString"].ConnectionString);
            this.cloudTableClient = cloudStorageAccount.CreateCloudTableClient();
    
        }

        /// <summary>
        /// Log an Error to Windows Azure Storage
        /// </summary>
        /// <param name="error">Error to log</param>
        /// <returns>Identifier (Guid) of the logged error</returns>
        public override string Log(Error error)
        {
            ErrorEntity entity = new ErrorEntity(error.Time, Guid.NewGuid())
            {
                HostName = error.HostName,
                Type = error.Type,
                ErrorXml = ErrorXml.EncodeString(error),
                Message = error.Message,
                StatusCode = error.StatusCode,
                User = error.User,
                Source = error.Source
            };

            CloudTable table = this.cloudTableClient.GetTableReference(this.tableName);
            table.CreateIfNotExists();

            TableOperation insertOperation = TableOperation.Insert(entity);

            if (entity.RowKey != null && entity.PartitionKey != null)
            {
                TableResult result = table.Execute(insertOperation);
            }

            return entity.Id.ToString();
        }

        /// <summary>
        /// Get a Error From Windows Azure Storage
        /// </summary>
        /// <param name="id">Error Identifier (Guid)</param>
        /// <returns>Error Fetched (or Null If Not Found)</returns>
        public override ErrorLogEntry GetError(string id)
        {
             TableServiceContext tableServiceContext = this.cloudTableClient.GetTableServiceContext();

            var query = from entity in tableServiceContext.CreateQuery<ErrorEntity>(this.tableName).AsTableServiceQuery(tableServiceContext)
                        where ErrorEntity.GetRowKey(Guid.Parse(id)) == entity.RowKey
                        select entity;

            ErrorEntity errorEntity = query.FirstOrDefault();
            if (errorEntity == null)
            {
                return null;
            }

            return new ErrorLogEntry(this, id, ErrorXml.DecodeString(errorEntity.ErrorXml));

        }

        /// <summary>
        /// Get a Page of Errors From Windows Azure Storage (Newest First)
        /// </summary>
        /// <param name="pageIndex">Zero-based page index</param>
        /// <param name="pageSize">Number of errors per page</param>
        /// <param name="errorEntryList">List to populate with ErrorLogEntry instances</param>
        /// <returns>Total number of errors stored</returns>
        public override int GetErrors(int pageIndex, int pageSize, System.Collections.IList errorEntryList)
        {
            if (pageIndex < 0)
                throw new ArgumentOutOfRangeException("pageIndex", pageIndex, null);

            if (pageSize < 0)
                throw new ArgumentOutOfRangeException("pageSize", pageSize, null);

            TableServiceContext tableServiceContext = this.cloudTableClient.GetTableServiceContext();

            // WWB: Server Side Call To Get All Data
            ErrorEntity[] serverSideQuery = tableServiceContext.CreateQuery<ErrorEntity>(this.tableName).AsTableServiceQuery(tableServiceContext).Execute().ToArray();

            // WWB: Sorted in Descending Order So Newest Errors Come First
            var sorted = serverSideQuery.OrderByDescending(entity => entity.TimeUtc);

            // WWB: Trim To Just a Page From The End
            ErrorEntity[] page = sorted.Skip(pageIndex * pageSize).Take(pageSize).ToArray();

            // WWB: Convert To ErrorLogEntry classes From Windows Azure Table Entities
            IEnumerable<ErrorLogEntry> errorLogEntries = page.Select(errorEntity => new ErrorLogEntry(this, errorEntity.Id.ToString(), ErrorXml.DecodeString(errorEntity.ErrorXml)));

            // WWB: Stuff them into the class we were passed
            foreach (var errorLogEntry in errorLogEntries)
            {
                errorEntryList.Add(errorLogEntry);
            };

            return serverSideQuery.Length;
        }
    }
}
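
The listing above references an ErrorEntity class that isn't included in the snippet. Below is a minimal sketch that satisfies those references; the property names follow the usage above, but the TableEntity base class and the partition/row key scheme are my assumptions rather than anything from the original post (the original project may shape the entity differently, particularly as the query methods use the older DataServices layer).

using System;
using Microsoft.WindowsAzure.Storage.Table;

namespace TIROne.ElmahAzureStorage
{
    // Minimal sketch of the table entity used by WindowsAzureErrorLogs above.
    // Deriving from TableEntity provides PartitionKey/RowKey/Timestamp and lets
    // TableOperation.Insert accept it; the key scheme here is an assumption.
    public class ErrorEntity : TableEntity
    {
        public ErrorEntity() { }

        public ErrorEntity(DateTime timeUtc, Guid id)
        {
            this.Id = id;
            this.TimeUtc = timeUtc;
            this.PartitionKey = timeUtc.ToString("yyyyMMdd"); // group errors by day
            this.RowKey = GetRowKey(id);
        }

        // Row key derived from the error id so GetError can look an entry up again
        public static string GetRowKey(Guid id)
        {
            return id.ToString("N");
        }

        public Guid Id { get; set; }
        public DateTime TimeUtc { get; set; }
        public string HostName { get; set; }
        public string Type { get; set; }
        public string ErrorXml { get; set; }
        public string Message { get; set; }
        public int StatusCode { get; set; }
        public string User { get; set; }
        public string Source { get; set; }
    }
}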

 

Get XML Node InnerText from SPFile

16 May 2013

A method to get the value from an XML node stored in an InfoPath XML document in SharePoint:

using System.Collections.Generic;
using System.IO;
using System.Xml;
using System.Xml.XPath;
using Microsoft.SharePoint;

public static string GetValueFromSPListItemXml(SPListItem item, string node)
{
    // Read the InfoPath XML document from the list item's file
    SPFile file = item.File;
    byte[] xmlFile = file.OpenBinary();

    XmlDocument xml = new XmlDocument();
    using (MemoryStream ms = new MemoryStream(xmlFile))
    {
        xml.Load(ms);
    }

    // Pick up the "my" namespace declared on the InfoPath document element
    XPathNavigator xPathNav = xml.CreateNavigator();
    xPathNav.MoveToFollowing(XPathNodeType.Element);
    IDictionary<string, string> namespaceDictionary = xPathNav.GetNamespacesInScope(XmlNamespaceScope.All);

    XmlNamespaceManager nsmgr = new XmlNamespaceManager(xml.NameTable);
    nsmgr.AddNamespace("my", namespaceDictionary["my"]);

    // Evaluate the supplied XPath expression and return the node's inner text
    XmlNode root = xml.DocumentElement;
    return root.SelectSingleNode(node, nsmgr).InnerText;
}

 

Example use:

GetValueFromSPListItemXml(item, "//my:Details//my:Title")

CV Once

19 March 2013

CV Once is the answer to your job hunting woes!

When applying for jobs it isn't unusual to send copies of your CV out to 10 or more companies and recruitment agents. This is usually done by emailing a static document like a PDF or Word file.

The downside of this approach is that those 10+ copies of your CV become stale and out-of-date very quickly as you gain new skills and experience.

This is where CV Once comes to the rescue by providing an online version of your CV that companies can access from their own systems so that they always have the most recent version of your CV.

Simply login with your Google or Microsoft account and create your CV. A link is then generated for you that you can give to companies and recruitment agents so that they can keep their systems up to date with the most recent CV you have to offer.

You can even download your own JSON or XML version of your CV to host on your own website or use the API call to display your CV on your own web page.

A new blog theme

06 February 2013

I have been thinking about giving my blog a bit of a freshen up for a while and this is the result. It is still a work in progress (the about page still needs a proper rework), but the aim is to remove any unnecessary clutter and leave only the content. The SharePoint banner ads are gone and so is the post and category navigation. I have also removed comments and pingbacks.

The vast majority of visitors to my blog arrive via a search engine, so internal navigation is largely unused; I may relent and add the search box and some navigation somewhere on my about page. Very few people comment, despite the blog getting a respectable amount of traffic for a site of its size (~5000 visits per month), and the comments that are left are 50% spam and 50% a simple "thanks for the info", so comments won't be missed. If anyone wants to get in touch with me directly they can find my email address in the footer. What is left is simply the information visitors were looking for when they typed their search terms into Google/Bing....hopefully!

I would ask what people think of this approach (is it too severe?) but as comments are now turned off that would be pointless...if you care that much you can always email me...

Getting a SharePoint 2013 App Submitted to the Office Store

13 December 2012

I recently had my first SharePoint 2013 app accepted to the Office Store and thought it would be worth sharing some of the lessons I have learned over the last 3 and a bit months whilst trying to get it through the validation process.

The app I submitted is a CSV Uploader that I had previously developed as a full trust wsp solution that used server side code to upload a selected CSV file into a SharePoint custom list. To try and get a better understanding of the new SharePoint 2013 app model I decided to redevelop this functionality using client side code in a SharePoint hosted app. For more information on the three types of app (SharePoint Hosted; Auto Hosted and Provider Hosted) take a look at this Apps for SharePoint overview.

In the end it took me quite a while to get the app submitted to the store despite the initial development process being relatively quick. There were a few reasons for this that were mostly my own daft fault. However, there were some initial teething problems with the process that meant my submission disappeared down a rabbit hole for a few weeks early on, and my Office 365 preview developer site undergoing maintenance for several weeks didn't help either.

So...these are my top tips for a smooth SharePoint app submission process. Of course submitting to the Office Store is entirely optional; you could always just distribute the app directly to your sites/clients and avoid Microsoft's validation rules, but where is the fun in that?

1. Read the validation guidelines very carefully. The two pages you want to look at are Validation policies for the apps submitted to the Office Store (version 1.2) and Validation policies for apps FAQ

  • The main thing I missed was that the app has to work in IE8/9 as well as IE10. Since I was originally using the HTML 5 File API to read the CSV file this caused my app to be rejected as IE8/9 does not support File API.
  • Completing the version number correctly on the submission forms was another brief stumbling block. The version number you enter must match the version number in the AppManifest.xml of your solution.
  • Make sure you include the SupportedLocales tag in your AppManifest.xml - Locale support information is required for all apps in the store.

2. Test in Firefox and Chrome as well as IE8/9/10

  • This might sound obvious but it is easy to assume that once you have your app working in a couple of these browsers your testing is done. This cost me a couple of failed submissions highlighting small things that were due to browser inconsistencies. What didn't help was that one of the Microsoft examples includes code to populate a dropdown list with SharePoint list names but the code to add items to the dropdown list did not work in Firefox despite working fine in IE and Chrome! The example is here and I have submitted a comment to flag the issue.
  • It is also worth checking that any of the newer HTML5 features that your app relies on will function in all browsers supported by SharePoint (IE8/9/10; latest release of Chrome, Firefox, Safari) using the handy Can I Use website. E.g. I originally started out using File API but had to switch to using Filepicker.io to support IE8/9

3. Make sure that the test steps you submit to the validation team are crystal clear

  • This probably caused me the most head scratching! The test I asked the validation team to carry out was to try and upload a CSV file with a header row of "Title, Description" to a custom list with a "Title" and a "Description" column. Time and again they came back with an error about the "Description" field not being found but it was working fine on my machine, damn it! I eventually figured out that they had added a site column called "Description" to their custom list but the site column had a static name of "kpidescription" which was causing this error. Once that was figured out and the validation team created a column that had a static name of "Description" the tests passed and all was good.
  • The main learning point here is that if your test steps can be misinterpreted it is far more likely that you will have problems. Trying to debug an issue raised by a remote testing team is very frustrating so it is up to you to keep your tests as clear and as unambiguous as possible.

It has been a long and at times frustrating but ultimately satisfying experience getting my first app approved and I am confident that my future submissions will be made much faster by following the advice above.

Finally, I would like to thank a few people who helped along the way. Thanks to Jeremy Thake who nudged the right people when my submission got lost in the process for three weeks. Huge thanks go to Jes Brown of the Office Store Dev Communications team for all his help, communication and general hand holding through the process. He really went above and beyond in the assistance he provided, even going so far as making some sightseeing suggestions while I was in Seattle! Last but not least, thanks to the validation team for the excellent and detailed feedback that ultimately helped my app to pass validation and hit the store!