
14 March 2016

10 tips for future-proofing GDS against the mistakes it made before

The UK Government Digital Service (GDS) has just had a reboot.

However, will it deliver value for money and meet its objectives?
Will the Budget 2016 changes be implemented efficiently and effectively?
Read on to find out more.


First of all, some background and context: where the Government Digital Service (GDS) came from and why it builds services in an agile way.

In the beginning, government ran waterfall projects. The seven stages of a waterfall project are usually as follows:

System Requirement, Software Requirement, Analysis, Design, Coding, Testing, Operation
Waterfall, the original model

However, after a large number of high-profile, expensive IT failures, the seven stages often looked rather more like this. It wasn't always this bad, but sometimes it certainly felt like it!

Perfect Plan, Wild Enthusiasm, Total Confusion, Death March, Search for the Guilty, Persecution of the Innocent, Promotion of the Incompetent
Waterfall, when it goes wrong

Add in the hugely expensive and lengthy procurement process, the massive documentation few people actually read, and the long, complex contracts and change control process. All in, this is not a culture which embraces flexibility, speed, quality for the citizen or value for government. Projects were frequently late and significantly over budget. By the time the product was delivered, the requirements were out of date. This was a culture dominated by meetings and paperwork rather than working software and value for money.

There was a groundswell of support for agile following the Agile Manifesto in 2001, and government sought to embrace the agile movement to minimise the likelihood of more big IT failures. This led to agile values being part of the 2010 transformation of Directgov into GDS, the Government Digital Service. Agile means following the values of the manifesto, which are valuing:

  • Individuals and interactions over processes and tools, 
  • Working software over comprehensive documentation, 
  • Customer collaboration over contract negotiation and 
  • Responding to change over following a plan.

Although there is value in the items on the right, there is more value in the items on the left.

GDS - The 2010 launch

Shortly after the Conservative/Liberal Democrat coalition took office in May 2010, Martha Lane Fox was commissioned to review the then Directgov service, resulting in the report Revolution Not Evolution. Within 6 months, alphagov had launched (May 2011); 9 months later (Feb 2012) the beta was launched, and the full service went live in October 2012. It was great to meet up then with my former colleagues and commemorate the previous 8 years. The past had achieved progress, but better progress was yet to come.

GDS - 2012 - 2016

This is what agile in government feels like sometimes.

Rocket powered snail on skateboard
Lipstick on a pig gets a new look

The government is trying really hard to be agile but is often hampered by the slow waterfall behaviours of ministers, politics and some third-party suppliers. It is a slow body (government) trying to move quickly, with the strap-on rocket as the proverbial silver bullet. This is not only potentially dangerous; when you get there, it's still a snail. This is doing agile rather than being agile.

If Henry Ford had asked his customers what they wanted, they would probably have said faster horses. It took one visionary, not a democratic consensus, to think of the mass-market car. If Steve Jobs had asked his customers what they wanted, I doubt they would have come up with the iPod (ridiculed at launch) or the iPad (who would have thought of that?). Doing things right combines understanding the market and end users with strategic vision and the ability to launch in a relevant time-scale.

What went well at GDS?

Before I look at what didn't go so well, it's important to reflect on what did. Overall, GDS has made a great start. We have:

  • A far more usable website than the one which preceded it
  • Much easier to use services
  • Cost savings
  • A standard to aim for  
  • A better way of working.

What didn't go so well and what needs to be fixed

However, GDS has often been busy singing its own praises without reflecting openly on where things didn't work, so that the wider sector could benefit from that knowledge. In agile, you deploy early, then inspect and adapt. You fail fast, and when things do go wrong you move forward without having expended a fortune finding out what doesn't work. You hold retrospectives, which give an open opportunity to reflect on what works and what doesn't; these are absolutely not a blame game, but an opportunity to learn from previous work in a productive and constructive way. The bad news is sadly not very apparent on the GDS blog, which does seem rather self-congratulatory; instead, the press have been left to pick it up. Here's a selection:

  1. Users shun new digital service and full digitisation plans were withdrawn.
  2. Up to £180m in fines per year due to a botched exemplar service (and again here). Perhaps in future an exemplar would be defined after launch, dependent on what people think of it? How do you know it's an exemplar before you build it and people use it?
  3. All the great expertise built up in GDS will not be deployed to help local government, despite this being a good idea and a local government remit being announced in March 2015.
  4. Turf wars where GDS stepped in as the "police" to fail an already successful site. Following a standard should not come at the expense of creating user dissatisfaction.
  5. Failing to meet the standards. All services must publish performance statistics. However, all the data stops in September 2014. Where is the more recent data?
  6. GDS puts services live that don't pass. GDS prefers to call a fail a "not-pass". Here's the data for all live services, including those that not-passed (failed) and went live anyway.
  7. Some services have had an embarrassingly low number of users. Dozens of services are listed on the dashboard at fewer than 100 users per year. Is this really a good use of money?
  8. £1.3bn needs to be spent on improving HMRC's digital IT. How do you even begin to spend that much money in an agile project? If this is the Minimum Viable Product, what is the full spend?
  9. Universal Jobmatch is the most used service but still looks like a 1995 horror story, with usability to match, and jobseekers are compelled to use it rather than better-designed competitor sites. If I were looking at user needs, I'd have tackled this one early on. It's said to be necessary to track job applications - but what if applicants actually get jobs without applying? What if the job is only on LinkedIn? Why is the most used service still so appalling to use, and why do I need a Government Gateway ID? This has been designed for civil servant needs, not end users.
  10. Sometimes it's so bad that even the BBC were calling for the site to be rolled back to 2009.
After all these issues, it has now been announced that GDS admits it's not as good as it could be. It took 4 years to find this out? That sounds like waterfall deluxe, not agile. Agile is about failing fast, not finding out 4 years and a £450m budget later. Look at the site for a moment: it cost £58m in the last year. Sure, it's saved money, but at a cost of £58m a year?

Retrospective lesson: no one's going to be perfect, but as part of being agile you should be open and accepting about your problems early on rather than having the press do the job for you. 

Bored computer user
Stock photo models want more from GDS too!

GDS - The 2016 reboot

On 9th March 2016, a reboot of GDS was announced. Ironically, the slides are also available in PDF format, something we really should be doing without in the mobile age.

The aims are now:

  1. provide coherent services that are easy to discover and use
  2. make government participative, open and accountable
  3. help government communicate with authority and trust
  4. make great digital and user-centred publishing easy
  5. make government content easy to re-use and build on

10 top tips to take the reboot even further

  1. Optimise government too. GDS assumes people want easy-to-use services. It isn't actually about services, and it isn't about easy to use: the ideal service is one I don't need to use at all. GDS needs to work with government, feed back where government itself can be simplified, and make a simplified government a strategic objective. Legislation is too complex. Services are too complex. Simplifying only at the digital level misses the point and fails to feed back to government to fix the root cause of the complexity.
  2. Optimise the speed of delivery. There's nothing in the objectives about being lean and fast, or about time to market. At the rate GDS is migrating the old Directgov estate, the technology will have moved on so fast it will have overtaken what GDS is really trying to do. In the 1960s we filled in paper forms and sent them in. In the 1990s we filled in forms, or phoned a contact centre who filled them in for us. In the 2000s we filled in online forms. In the 2010s we fill in user-friendly online forms. We already have the technology to say to our mobiles "hey Google, tell me my tax due to HMRC". Automated voice recognition: available now. Automated ID verification: available now. No forms, no website; a computer in the background just making life easy for me. "Hey Google, tell me how much money I owe HMRC and pay it to them so it arrives on time". Isn't that a better future than a website-only focus? Digital is a platform; websites are just a service on that platform for when a written interaction is necessary. GDS themselves used to put an alpha live; now you need to complete an alpha and an internal beta. This means you're not learning early in service about what large numbers of users need, or even whether the service will be usable for them - farmers probably don't want a digital service if they don't have access to broadband, for instance.
  3. Ensure there is a clear demand and that this influences priority. How about users voting for which service they need most, and GDS implementing it? An open backlog where we can see the pipeline of work? GDS has made a start but, ironically for an organisation oriented towards user needs, not really in a way that makes sense to end users.
  4. Ensure there is genuine value for money. Don't spend money on services people don't use or which aren't cost effective.
  5. Think about the citizen first. The new GDS aims look like they were written more for government's benefit. Customer-centric organisations put the customer at the centre of all they do; when you do this, government benefits too.
  6. Fix the worst bits first. The pain points for Universal Jobmatch, such as the Government Gateway ID and people not having a memorable login, need to be addressed, especially as this is the main service for citizens.
  7. Use the best tools to help people. If you want more people using website services, provide video tutorials and then let people try things out in a safe environment to practise. That way fewer people will need to call contact centres for help.
  8. Only relevant content. Please personalise. I live in Scotland, so I should be able to tell you this and not get content which is for England only. If I'm a Welsh speaker, prioritise content in the Welsh language as well.
  9. Truly embrace agile. Stop doing big projects with no flexibility and public dates which are infeasible. Start small, prove the concept and then grow, so that there are no more Universal Credit-type failures. Figure out the minimum viable product and think like a start-up.
  10. Learning is two-way. Up to now, GDS has been seen as the "police" and the centre of excellence. This is great, but as time passes this should become more federated. Departments such as HMRC and DWP are setting up their own centres of excellence, and as they are closest to the citizens they serve, they might find out things that would benefit not only them but GDS and wider government. Command and control is not an agile concept; GDS should collaborate with and learn from government departments, and be prepared to listen and adapt.

Bringing it all together into a strategy

  1. Develop a strategic vision. What is the GDS of the future? What's the vision for how citizens, visitors and businesses will interact with government - how about as simply and easily as possible? Where's the GDS Start with Why?
  2. Develop a target operating model. Realise that people would prefer to interact with government less, and optimise towards this. 
  3. Ensure Digital by Default makes sense. Realise that Digital is only a channel and might not always be the most appropriate, easiest or most cost effective. For the nature of the service, the most appropriate channel should be chosen rather than assuming digital by default. 
  4. Join up government. Government as a platform needs to extend to local services and join up. The UK government created the Council Tax, so why doesn't the UK government build a platform for collecting it, rather than 300+ councils building 300+ solutions? How does that make any sense at all? When I change my address, I want to tell all government services at the same time - I don't care whether they are provided by central government or not. Think about government as a platform across all of government, not just central government.
  5. Use standards appropriately. Realise that one size fits all doesn't always work. Standards and processes should only be used to make things better, not worse.
  6. Think beyond the service. Allow the user to connect with the relevant content which matters to them with the least possible effort. This includes personalisation, geo-targeting and filters.
  7. Simplify for the citizen. Do the complex behind the scenes work necessary to shield users from complexity. Users want simplicity. The renewal of the Car Tax disc - great! Let's have more clever thinking like this. Join up the legislative process and government thinking with what user needs are telling you so that there is optimisation all the way from top to bottom.
  8. Measure, improve and optimise. Keep evolving quicker. Decrease your cycle time so that we don't have to wait 4 years for a review. Measure your service deployment time from discovery to launch and work out how to do it quicker and cheaper for the same quality.
  9. Look for big wins not just incremental improvements. Look for the 10x improvements which Google embodies. Start there and think about applying this to government to enable huge efficiency savings.
  10. Keep up the great work! It's not all bad really!

Four things you can do

  1. Please feel free to share this article on social media  using the links below
  2. Please comment on the GDS blog if you want to feed back direct to GDS
  3. Please respond to the survey.
  4. Comment at the end of this article.


About the author

Craig Cockburn has worked across the public sector as a freelance Digital Consultant, including Directgov, The Department for Work and Pensions, The Scottish Government, The Public Prosecution Service, The Department for Business, HM Revenue and Customs, The Scottish Tourist Board, The CIO Council and Southwark Council, and gives his views here on whether the GDS reboot will be a success, both for government and, more importantly, for the citizen. Finally, if you feel I can help you with related Digital Transformation work, please feel free to contact me via LinkedIn or by email.

Many thanks,


08 March 2016

Search 3.0, transformative big data and the road ahead



You may be wondering about the significance of the three Scottish flags in the image. I took this picture a few weeks ago. I'm in Edinburgh, there are three flags, and this article is about Search 3.0; ideally, I'd like Silicon Glen (the Scottish IT sector) to take this idea forward, rather than Silicon Valley being the home of the best search engine. So my aim is for a Search 3.0 search engine to be based in Scotland. However, I'm open to ideas. First of all, some history to explain what Search 3.0 is and what transformative big data is all about.

Search 1.0

Although I have been on the Internet since the 1980s, my first experience of the web was in 1993, when I was studying for my Masters in Large Software Systems Development at Napier University and wrote a research paper on cooperative working. I had email at home from 1988 and used Usenet a great deal, so I was an early adopter of the web: I downloaded Mosaic and used early search engines such as Yahoo, Excite and Lycos when they first came out. I was particularly interested in AltaVista when it launched in 1995, as it had the biggest search index at the time and was built by my former employer, Digital. I had floated the idea of a web browser to them in 1989, but that was rather ahead of its time. The early search engines were interesting, and their job was a lot easier than now as there were so few sites; however, as the web grew, the unstructured web needed some order so that relevant results came to the fore.

Search 2.0

Search 2.0 came about when the founders of Google realised that a ranking of pages would help produce more relevant results. Their January 1998 paper on search is available. The basis for this was that the human element of embedding links in pages could be used to deduce that the pages being linked to were more important, because people had chosen to link to them. In effect, the human act of adding links allowed a computer algorithm to assign a rank to the pages and produce results which people found more valuable. It was also (somewhat) hard to spam the search index, as it required the manual effort of changing links. Trusted sites on .edu and similar domains also scored higher. Search 2.0 has evolved since then, with ever better and more sophisticated algorithms trying to make more sense of the data out there on the web to produce even more useful results. Despite 17 years of Google, search is still pretty poor.
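The link-counting idea can be sketched in a few lines. This is a toy power-iteration version of the ranking idea from that 1998 paper; the example graph, damping factor and iteration count below are illustrative assumptions, not values from the paper.

```python
# Minimal PageRank-style power iteration over a tiny, invented link graph.

def pagerank(links, d=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - d) / n for p in pages}
        for p, outs in links.items():
            if not outs:  # dangling page: share its rank with everyone
                for q in pages:
                    new[q] += d * rank[p] / n
            else:
                for q in outs:
                    new[q] += d * rank[p] / len(outs)
        rank = new
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
# "c" is linked to by both "a" and "b", so it ends up ranked highest
```

The point of the sketch is the feedback loop: a page's rank depends on the ranks of the pages linking to it, which is exactly the "human element of adding links" being turned into an algorithmic score.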
Try these difficult searches:
  1. You are flying soon. Your airline allows you to take a 2nd bag 40cm x 30cm x 15cm. Your task is to buy a bag that fits. As a secondary task, find a site that allows you to search by bag size.
  2. You are travelling alone, searching for a hotel room in London. You require a double bed, ensuite and breakfast. You want a 3-star hotel or better. That in itself is quite hard because, by stating one adult, you sometimes get twin rooms returned. Hotelscombined and others, when you rank the results by price, give you hostels. However, try combining this search to add "within 10 minutes' walk of a London tube station on the Piccadilly line or the number 133 bus" or some other transport requirement, and you're stuck. A bit tricky if you're disabled and want accommodation near a bus route without having to change buses, or a tube station accessible by wheelchair.
  3. You need to be at a meeting. Find a travel planner site which allows a portion of your journey by public transport to be swapped with a taxi provided it saves you time and doesn't cost more than £15.
  4. You are a single mother returning to work. You seek a part-time job that allows you to balance childcare and work, from 9am-3pm Monday to Friday or from home. Your challenge is to find the website that allows you to search for this.
  5. You're looking to move house. The smallest bedroom must be at least 6ft by 8ft. Find the matching houses; you would prefer the house to be within 5 minutes' walk of a bus stop.
  6. Tell me the flights which, allowing for connections at the other end get me to my meeting in London on time. In London you have a choice of 5 airports all with different prices and onward travel times. Let me know the total journey cost too (including by public transport)
  7. You have forgotten a friend's birthday but know what would be the ideal present. Find all the local shops open now, within a 30-minute drive, which have it in stock.
  8. Find me all the used cars in the UK which comfortably take three adults in the back seat for a long journey. 
  9. Find me all the events on in my area. Surprisingly there isn't a predominant global website which does this. 
  10. Find me a job, such that duplicate postings by multiple agencies for the same position are eliminated. Also, show on the job advert the time expected to complete the job application process as I favour jobs without application forms. Let me know the commute time to the job.
(The above list is not meant to be exhaustive; I welcome additions for things that we would find useful but which can't be searched for via a primary search engine such as Bing, Google, DuckDuckGo or Wolfram Alpha.)
There are lots of data-driven searches for products and services that are simply impossible on the current web. There are three reasons for this.
1. The data is not published at all, because it is not gathered in the first place. A bit like in 1993 when I was campaigning for more smoke-free areas in pubs: the first step was to get pub guides to survey pubs so that we had a current state of the market, some data to work with, and actual pubs to speak to about how smoke-free areas affected them.
2. The data is gathered but sits in a database somewhere that you have to query via an intermediate website. Such sites usually charge you to list there, unlike Google, which is free. This is the likes of Autotrader, Zoopla, etc.
3. The data is published but is not structured in any useful way; instead you get a page of content, and somewhere on that page is the info you need, which you have to scour for manually. Such as Amazon listing the size of luggage on the listings page but not giving me a filter to search for luggage under a certain size. We could attempt to solve this problem by applying AI and a deep knowledge of human language to interpret each page, but that is a hard job to do error-free and extremely hard to do for all the world's languages. As a Gaelic speaker, I support minority languages, and I wouldn't want their speakers to be sidelined. Data, ideas and feelings are our universal language; speech is only an interpretation of these.
So here is where clever Google algorithms run out of steam: the lack of quality data. So, on to Search 3.0 and transformative data.
What we've seen is that the old business model of a newspaper listing advertisements hasn't really changed much for the internet age. eBay, Zoopla, Autotrader: they sit at the same position in the sales cycle as a newspaper used to, selling adverts and making money from the advertiser based on their readership. What's changed in 20 years?

Search 3.0

This idea isn't new, but I have been promoting it and winning attention for it, just not financial backing. In 2000 I entered the Scottish Enterprise "Who wants to be an entrepreneur" competition with an early version of the idea, and it was recommended for a feasibility study by Ian Ritchie, leading Scottish entrepreneur and TED speaker. I also submitted it to a computer magazine, which awarded it one of the top e-commerce ideas in the UK in Feb 2000. The issue then was funding, due to the dotcom crash: great idea, no funding climate. I suggested it to a crowdsourcing site in 2006, where it was called "the next Google". I blogged about it in 2008 and did a Google hangout with Google's Product Management Director for Search in 2013. Still no traction. Becoming rather fed up with the huge mountain to climb to get funding, I feel rather like Queen being told that Bohemian Rhapsody "had no hope of ever being played on radio" - it is the most played song in Radio 1's history. Even Steve Jobs was ridiculed when the iPod launched: the product that paved the way to the world's most valuable company. Laugh away now. Sometimes the critics get it wrong.
To counter that, I'm putting some of the idea out there, because back in the early 90s on the Internet that's exactly what people used to do. For free. I did it with the UK Internet List in 1992 and the first online guide to Scotland in 1994, and Tim Berners-Lee did it with the web in 1991. Why do this? To advance the Internet. To encourage debate. To drive forward standards. To recognise that this is the first time in the history of the planet when we have had a global free platform on which to converse and exchange ideas, and to make that a better place for future generations. This only happens once in a planet's history, and we are lucky to live in that time. It would be great if we got it right for future generations.
Why not? 

Search 3.0 - Layer 1. Data enrichment

I listed a few examples above of searches I've found frustrating. However, this could just be me: I don't know what you find frustrating about the web, what you are looking for that you can't find, and what you would like to change. Google probably has an idea, because it can track sessions, and long sessions of repeated searches on similar subjects might be a good indicator that the data is poor. But to open this up democratically, I suggest the following approach.
I'll begin by assuming the refinement of search is based around improving the quality of related high-volume e-commerce searches. The reason is that if you approach the idea from a VC perspective, this is where you might build the greatest economic value first. You needn't follow this approach if you're being altruistic.
Step 1: Identify the top search categories you want to specialise in initially, e.g. hotel rooms, job listings, rooms to let, restaurants. Cross-reference these search terms against the first page of results in Google for each of the world's top cities. Record just the domains returned (including from adverts, because the adverts are ranked for relevance). Store this in a database, ranked by city size if you want a priority order. You now have a list of the top second-level search engines by product and locality. You no longer need Google.
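A minimal sketch of the Step 1 database, category x city mapped to ranked result domains. The table layout and the example rows below are assumptions for illustration; in practice the rows would be recorded from the first page of search results (including adverts) for each category/city pair.

```python
# Sketch of the Step 1 store using the standard library's sqlite3.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE top_sites (
    category TEXT, city TEXT, domain TEXT, position INTEGER)""")

# Invented example rows; real ones would come from recorded result pages.
rows = [
    ("hotel rooms", "London", "hotelscombined.com", 1),
    ("hotel rooms", "London", "booking.example", 2),
    ("job listings", "London", "jobserve.com", 1),
]
conn.executemany("INSERT INTO top_sites VALUES (?, ?, ?, ?)", rows)

# The second-level search engines for a given product and locality:
domains = [d for (d,) in conn.execute(
    "SELECT domain FROM top_sites WHERE category=? AND city=? ORDER BY position",
    ("hotel rooms", "London"))]
```

Once seeded, this table is all that is needed to answer "which sites matter for hotel rooms in London?" without going back to Google.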
For each of the queries used to discover top categories and locations, the same query is sent off to dmoz (the Open Directory Project). Here it is the categories of the results which are returned that are relevant, as opposed to the actual pages. So for a query on travel and London, the top category returned from dmoz would be: Regional: Europe: United Kingdom: England: London: Travel and Tourism
Now you can correlate the Google results by category using the two result sets above. Furthermore, as the dmoz directory is hierarchical, you can build up a hierarchy of websites to allow users to refine their search results. You now have a hierarchical, product- and geography-driven database which references the top websites in each product category and geography. It's still only a list of websites though; no products or services yet.
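The dmoz-style category paths can be folded into a simple tree to support that drill-down. The path format follows the example quoted above; the tree shape and the `_sites` key are illustrative assumptions.

```python
# Sketch: turning "A: B: C"-style category paths into a drill-down hierarchy.

def insert_path(tree, path, domain):
    parts = [p.strip() for p in path.split(":")]
    node = tree
    for part in parts:
        node = node.setdefault(part, {})
    node.setdefault("_sites", []).append(domain)
    return tree

tree = {}
insert_path(tree,
            "Regional: Europe: United Kingdom: England: London: Travel and Tourism",
            "hotelscombined.com")

# Users can now refine step by step down the hierarchy:
london = tree["Regional"]["Europe"]["United Kingdom"]["England"]["London"]
```

Each node holds its child categories plus a `_sites` list, so refining a search is just walking one level deeper in the tree.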
Step 2: Since we have no data at present for more sophisticated searching, the user is presented with the option to refine the results by keywords against the results returned. Something like: "To refine the listings on the webpages below, please indicate what is important to you."
In the example above, the user could specify grading, price, address etc.
The next user who comes along with a similar search sees the keywords the first user added, votes for them and/or adds their own. Over time, the keywords entered by users would be shown in each category of search results in order of decreasing popularity. This seeding of the database would occur during the product alpha/beta stages so that there was already a dataset at formal launch. The site is learning, in a Web 2.0 sense, which criteria actually matter to people: a popularity contest for "how would you like to extend the search capability?", something most websites never ask - they just give you options and it's take it or leave it.
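A minimal sketch of this keyword popularity contest, with suggesting and voting treated as the same counting operation; the class and method names are invented for illustration.

```python
# Sketch of per-category refinement keywords ranked by user votes.
from collections import Counter

class RefinementKeywords:
    def __init__(self):
        self.votes = {}  # category -> Counter of keyword votes

    def suggest(self, category, keyword):
        self.votes.setdefault(category, Counter())[keyword] += 1

    vote = suggest  # voting for an existing keyword is the same operation

    def top(self, category, n=5):
        # keywords shown in decreasing order of popularity
        return [kw for kw, _ in self.votes.get(category, Counter()).most_common(n)]

kb = RefinementKeywords()
kb.suggest("hotel rooms", "price")
kb.vote("hotel rooms", "price")
kb.suggest("hotel rooms", "grading")
```

The `top()` list is exactly what a later visitor to the "hotel rooms" category would be shown as suggested refinements.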
Step 3: You have a database of accommodation websites categorised in a directory from Step 1, and a list of how you'd like to search them from Step 2. Next, you send out a search engine spider to these sites, like Google does, and index them. Websites are usually built from templates in content management systems, and even complex sites might only have around 20 unique templates. So once you have figured out the templates, the data on them is usually just repeating patterns.
Some might call this site scraping, but it's no different from what a search engine does when it indexes content. You should respect the rules in the robots.txt file and behave in a considerate way when indexing other people's content.
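Checking robots.txt before spidering can be done with the Python standard library. The robots.txt content, bot name and URLs below are made up; a real spider would fetch the live file with `set_url()` and `read()` rather than parsing a string.

```python
# Respecting robots.txt with the standard library parser.
from urllib import robotparser

rp = robotparser.RobotFileParser()
# Parse an example file directly so the sketch works offline;
# a real spider would use rp.set_url(".../robots.txt"); rp.read()
rp.parse("""
User-agent: *
Disallow: /private/
""".splitlines())

allowed = rp.can_fetch("Search3Bot", "https://example.com/hotels/london")
blocked = rp.can_fetch("Search3Bot", "https://example.com/private/admin")
```

Running `can_fetch` before every request is the minimum courtesy; a considerate spider would also rate-limit itself per host.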
We're not dealing with a vast number of websites here, and they are based on repeating patterns, so joining them up is not so hard.
You open up a programmable interface to the site for markup editors, possibly in return for a share of the advertising revenue generated from those listings. The markup editors would use a tool such as Xray to examine the contents of pages on a site and see where the relevant info that people want to search on occurs. This is effectively an open API for a site-scraping tool. Sites such as mysupermarket demonstrate that site scraping works; rather than running into copyright issues, all that is happening is an intelligent parsing of the site, rather like a search engine robot. There is nothing particularly revolutionary here: besides mysupermarket for groceries, the same concept has been applied by Workhound for jobs and Globrix for housing. However, these sites are all narrow vertical markets, limited by geography, and they do not interact with users to extend their search capabilities.
Sites could ban this parsing if they wanted via the standard robots.txt, and instructions on how to do this would be available for site owners. The site editors, guided by the top search terms from Step 2, then indicate where the relevant content is on the page. For instance, if you were parsing an accommodation site, the price information might appear after the text "total stay from"; if you were parsing job listings, the salary on Jobserve is next to the bold "Rate" keyword, and so on. Although this is a non-trivial job, the existence of working site scrapers shows it can be done, and XML/RSS feeds from a site provide additional scope to help with the parsing. The spider would only be sent to sites with a certain minimum (1,000?) number of pages (as seen by Google) to ensure that only content-rich sites were indexed. The volume of pages returned also gives you a good data set with which to teach the parsing technique.
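A sketch of this template-driven extraction: each site gets a rule naming the field and the label it follows. The label strings come from the examples above ("total stay from", "Rate"); the page snippets, rule table and regexes are invented for illustration.

```python
# Sketch: per-site extraction rules mapping a label to a captured value.
import re

# site -> (field name, regex capturing the value after that site's label)
RULES = {
    "hotel_site": ("price", re.compile(r"total stay from\s*£?([\d.]+)")),
    "jobserve":   ("rate",  re.compile(r"Rate\s*:?\s*£?([\d,]+)")),
}

def extract(site, page_text):
    field, pattern = RULES[site]
    m = pattern.search(page_text)
    return {field: m.group(1)} if m else {}

hotel = extract("hotel_site", "Deluxe double, total stay from £240.00 for 2 nights")
job = extract("jobserve", "Contract role. Rate: £450 per day")
```

The editor's job is only to supply a `RULES` entry per template, which is why twenty-odd templates per site keeps the effort manageable.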
Step 4: Once the top sites have been parsed in this way, the parsed information can be used to drive subsequent searches. Supposing the price info had been parsed, the price keyword would show as bold in the search results, indicating that data was available; the user could then refine further on that option. So in this example we have built a search engine that allows the user to search for hotels by price across all the top relevant accommodation search engines. Exactly the same pattern could be used to write a search engine for jobs, real estate or electronic goods for sale, ultimately arriving at a search engine that is like eBay in terms of refining listings down to the level the user wants, e.g. mobile phones with wifi, a 5-megapixel camera, and so on. The difference, however, is that eBay charges for a listing, whereas this is a general search engine that points off to the original site, allowing the product to be listed for nothing. You might call this search for the professional consumer who knows exactly what they want.
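Once a field has been parsed, refinement is a straightforward filter over the extracted records. The listing data here is invented, and only listings where the field was successfully parsed can match, which mirrors the "bold keyword means data available" behaviour described above.

```python
# Sketch: refining search results on a parsed field such as price.
listings = [
    {"site": "a.example", "title": "Hotel A", "price": 85.0},
    {"site": "b.example", "title": "Hotel B", "price": 140.0},
    {"site": "c.example", "title": "Hotel C"},  # price not parsed for this site
]

def refine(results, field, max_value):
    # only listings where the field was successfully parsed can match
    return [r for r in results if field in r and r[field] <= max_value]

cheap = refine(listings, "price", 100.0)
```

Each result still carries its source `site`, so the engine can point users back to the original listing rather than hosting it.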
Step 5: Complete world domination! (only kidding)
Having targeted the big sites for useful listings and built a really useful product and service search engine, categorised by product type and location and searchable by top keywords, we now get to the bit where the Internet as we've known it can really change massively. VCs interested in buzzwords might call this disruption. I suppose it is, because you no longer need to pay to advertise.
Until now, if you had a specialised product or service then in order to get it really noticed you had to submit it (usually at cost) to a specialised search site. If you have a property to sell in Edinburgh, you put it on If you have a job, you add it to, etc. However, just as Google can index individual sites and list them, the same should be true for products and services. It shouldn’t be necessary to list them on some other site, you should be able to list the products and services effectively on your own site and have them searchable for free, just as Google indexes simple web pages for free. Why not? 
How is this achieved? Having followed steps 1 to 4 above, let us assume that we want to allow people to list a job without having to pay to do it on a job search site. Jobs come from agencies and employers, so in the search category listing for Jobs in UK derived at stage 1, you publish a "get listed here" guide. The guide would refer to the top parsed search terms (derived at stage 4) and the format these need to appear in on the webpage for them to be successfully parsed. So for a job listing you could require that there are bold fields "Min salary:" and "Max salary:" and that next to these the salary information is stored (alternatively this info could come through in the site's RSS feed). Thus any site can be added provided it can be easily parsed. What is especially exciting is that the search terms are of course driven by users, so there is scope here to go well beyond the searchable terms on existing sites. For instance, users might want to search for jobs that are accessible by public transport, yet no job search site offers this. Disabled people might want to search for jobs that they can access from a level entrance (an option already available for tourist accommodation searches). Part-time mums might want to search for jobs by specific working hours, and so on. Asking users how to improve search is a unique feature of this site. By specifying the enhanced template for listing against new criteria, sites would have an incentive to provide this information to make their listings more relevant and searchable.
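A minimal sketch of parsing the proposed "Min salary:" / "Max salary:" template; the HTML fragment, the exact label format and the function name are assumptions based on the guide described above, not an existing standard:

```python
import re

def parse_salary(page_html: str) -> dict:
    """Pull the template's bold salary labels out of a job listing page."""
    fields = {}
    for label in ("Min salary", "Max salary"):
        # Match e.g. "<b>Min salary:</b> £30,000" per the hypothetical guide.
        m = re.search(rf"<b>{label}:</b>\s*£?([\d,]+)", page_html)
        if m:
            fields[label] = int(m.group(1).replace(",", ""))
    return fields

page = "<p><b>Min salary:</b> £30,000 <b>Max salary:</b> £40,000</p>"
print(parse_salary(page))  # {'Min salary': 30000, 'Max salary': 40000}
```

Any site that follows the published template becomes searchable by salary without paying a listing fee.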
How do users generate this data? Since the data format is open source, the tools would be freely available and could take the form of web apps, WordPress plugins, CMS extensions and so on. They would be updated in real time to deal with updates to the agreed schema.
Where is the value? With open data, there is opportunity for competition - sites can bring the data together in new and interesting ways as we've seen where this has happened with government data. There would be competition for the data in terms of who built the best sites around it. There would be entrepreneurship in taking the data forward, rather than the world of jobseeking where the schema hasn't moved in over 20 years. There would be integration of the data with existing apps to make them more useful. There are lots of opportunities.
The net result is a search engine with the power of eBay's searches, the breadth of Google, the profitability of PlentyOfFish or Jobserve (85% profit) scaled up, and the usefulness of Amazon, driving and expanding search according to user preference. With more openness there is also more decentralisation, and less need for expensive, high-end centralised data centres, which the consumer usually ends up paying for in some form.
Besides products and services, there is also my data. I want to control what I share and with whom. As with driving a search, the more I share the more relevant information I might get in terms of services, but ultimately it's my choice, rather than a matter of how many mandatory fields a business has put up.
This isn't intended to be a polished article. It's open to inspection, adaptation and improvement. As the Internet's data should be.
Original article at please also feel free to comment there.

07 March 2016

Dear Recruiting Stereotyper, please stop

Your profile indicates you have been contracting recently, therefore you will only be interested in contract work then? 
This post is aimed at those people who don't recognise that people go through lifestyle changes, and that there are times when the balance of priorities between the attractiveness of contracting versus permanent work shifts in response to family and personal needs.
I worked in a permanent job for 5 years until I was laid off. I worked in another permanent job for 3 years until the company hit financial difficulties. Then I worked in another permanent job for over 6 years until I was laid off. Permanent work was fine, although career progression was limited due to lack of growth in the organisations I was working for, often going from one hiring freeze to another.
During the last job, when I was told my job was likely to be no longer required I was given more than 6 months notice of the redundancy and during this time was given support to look for new work outside the organisation. That was in 2006 and despite my work being extended I had still not found any local work after more than 7 months of looking. It actually took 12 months before I found any work lasting more than a few weeks.
Sadly, and to the significant detriment of family life, I had to travel 400 miles to find work and leave my family behind. This practice continued on and off for 6 years. Those 6 years were incredibly hard work at a huge personal cost. I travelled across the UK to Newcastle, London and Norwich, and worked in Northern Ireland and the Republic of Ireland. I had commutes which were regularly 6+ hours from my home base, and I spent 6 years living in hotels 4 nights a week. I did 12-hour days and a working week which started at 4am on a Monday to catch a flight and ended at midnight on a Friday. Welcome to the lifestyle of contracting. Probably not ideal if you enjoy spending time with your family, as I did.
I had started self-employment in 2001 on a part-time basis. I enjoyed it. However, as a lifestyle, if you do not have a large local client base it comes at quite a social price. And perhaps in later life, if your health is not great, it's not necessarily the best choice either.
There are lots of prejudices in recruitment. I got a lot of it in 2006, along the lines of "you have no experience in banking, therefore you can't apply for a job in that sector", which contributed to the problem. I don't mean to be harsh on the banking sector, but was it the experienced people or the inexperienced people who got it into a mess?
I travelled to where the work was in order to get the experience and rather than trailing my family around Scotland, Northern Ireland, Ireland and England it made sense for me to take contracts because permanent jobs would have been far too disruptive, expensive, stressful and unsettling for them. I put my family first and myself second. 
I'm happy to be in contracting. I'm also happy to be permanent if the right opportunity comes my way. I would tend to be fussier about permanent jobs though, as I see that as a far longer-term commitment. I'm also rather disappointed in permanent salaries, which certainly in Edinburgh appear to have stood still for 10 years, whereas contract rates have moved with the market. People offering permanent jobs need to accept that people can choose whether to be permanent or contract, and there are pros and cons to each approach. It isn't as simple as contracting forever or permanent forever. If contracting gets attacked by HMRC, people will move into permanent work. If permanent salaries become uncompetitive, or the career opportunities aren't there, or there is no local work, people will turn to contracting. This is not only why recruiters shouldn't be prejudiced, but also why there should be a flexible and balanced workforce incorporating both sides, rather than a dwindling number of contractors as a result of the government being contractor-unfriendly. Contractors provide flexibility and specialist skills. In response to increasingly rapid changes in the market, how can you scale a team quickly from scratch if you have to wait 3 months for permanent people to hand in their notices from their current jobs?
So dear recruiter, don't look at my CV and be prejudiced. The idea of my having been a contractor for the last few years doesn't automatically extend all the way to retirement. People change, lifestyles change and needs change. If I apply for a permanent job it's because the job is of interest to me. If it wasn't of interest, I wouldn't bother. So why are you asking me if it's right for me? I've already made that judgement thanks. I was a single father for a year and if you are applying stereotypes then think of the woman who has a high flying career as a contractor then has a family and wants stability and being based in one place. Would you be questioning her change of lifestyle or is it none of your business really? If a woman takes a career break for a family do you think she might do the same again for child #2 and exclude her based on past career lifestyle?
I actually want a career, stability, continuity, benefits and being in a place long enough to make a long-standing difference and make friends over a period of years. I have been able to do this to some extent as a contractor, as I've maintained contacts between contracts and I run into the same people regularly and they recommend me for work, but it's harder going. If I had had the choice in 2006, with a young family, I wouldn't have gone into contracting at all. But I didn't have the choice: there were no permanent jobs after a year of looking.
Please bear in mind that when I apply for a job it's because I find the job interesting and relevant for a variety of reasons and that it fits in with my career and lifestyle going forward. What I did in the past is just that, we can't change the past but we can change the future. 
So when you replied, as you did this morning, "As it says on the advert Craig, they are permanent positions which I guess would not be of interest to you as your CV/profile looks very much like that of a contractor", I would ask you to please respect my necessary lifestyle decisions in the past and my choices for the future, rather than your prejudices.
Original article at please also feel free to comment there

04 March 2016

The internet of things and the future of search

The old search

Back in 2008 I wrote about how Yahoo could take on Google. 7 years have passed and there has been great innovation in search, yet my idea is still valid.
We still use intermediary search sites such as Skyscanner rather than Google Flights. Job boards dominate rather than jobseekers connecting directly with recruiters. The intermediary sites, whether it is eBay connecting buyers and sellers, connecting hotels with travellers or Autotrader connecting car sellers with purchasers, all still dominate. Whether on your mobile or on the web, there are many of these intermediary platforms of varying quality, offering you a wide range of apps or websites that download the same data multiple times. In the online job space (which I had an early hand in back in the 1980s), I have LinkedIn, Jobsite, Monster, Jobserve, Reed and so on, yet many of them advertise the same jobs. My phone has an increasing number of job apps on it, none of them talking to one another, so a job I have responded to on one platform still comes up in searches elsewhere. In the accommodation space, should I use,,, kayak, trivago or expedia? The list goes on, yet it's mostly the same hotels. Airbnb has innovated by generating new opportunities to stay, but it's still the same old model of an intermediary platform connecting vendors and purchasers. There's nothing in principle to stop the same vendors listing on multiple sites. A decent search engine could eliminate the middleman and produce a consistent, joined-up search experience: a vendor would simply self-publish, the search engines would index it, the public would find it, and the intermediary sites and proprietary searches would diminish.

The new connectivity

We talk about the Internet of Things and more "things" being connected and getting online. I'm glad to see this, having written the UK's first guide to getting online back in 1992. It's a natural evolution from companies and universities, to individuals and PCs, to mobile phones, to other devices. However, these devices will need to communicate meaningful information in a meaningful way, otherwise we will get lost in the noise. We talk about intelligent fridges that self-order. I think such things are a waste of time. I might want to eat different food one week; perhaps I'm bored with the same stuff, perhaps there is a special occasion, or I have friends over. So I will always need an ad-hoc ordering mechanism that I can plug into to replenish my fridge, freezer, larder and anything else I care to order from a supermarket. Why should I need a special device in my fridge, another in my freezer and another in my larder to track all this, when instead I could have a voice-activated ordering system embedded in my kitchen wall, which I can ask to put milk, bread and pasta on the shopping list just by speaking to it, and which integrates with recipes to order what I need? What value does an "intelligent" fridge really add here, other than maybe letting me query it remotely from the office in case I need to buy milk on the way home? We also need to manage the data around the Internet of Things, as some of the things are household-related rather than related to me as an individual. The association between data generators and people is not 1-1, and this causes issues for privacy, personalisation and advertising.

The new intelligence

Having studied AI briefly at Edinburgh University and written my Computer Science thesis on character recognition, I have a bit of a background in this area, but I certainly don't claim to be an expert. Sadly there aren't many jobs in the field. When I studied AI it was all about making sense of complex real-world situations such as recognising objects, reading handwriting, and so-called expert systems. We can simplify this and say that for a machine to be intelligent it needs both a body of knowledge (including access to knowledge via various means) and to know what to do with that knowledge. Wikipedia by itself is knowledge, but a database by itself is hardly intelligent. A clever person who suffers memory loss would similarly struggle: the deduction and reasoning are perhaps still there, but the memories for the algorithm to draw on are no longer accessible. Intelligence, whether in machines or people, is the ability to take facts and apply a method in order to use the knowledge, usually in a beneficial way. More fundamentally, it's data plus a complex algorithm. Terms such as the "knowledge graph" refer to the building blocks of such an algorithm, and Google's purchase of Metaweb was an important step in modelling objects (entities) and the relationships between them to gain a better understanding of the world.

Data evolution

In order to have the Internet of Things work properly we need an agreed schema so that devices can share data in a meaningful way. It isn't going to be much use if an intelligent fridge tries to speak to my grocery provider in a way that varies depending on what model of fridge I have and which grocery provider I use for my shopping. We are only going to make sense of IoT when there are common schemas and common APIs, and when those schemas can be adapted dynamically as enhancements to them are agreed.
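A sketch of what an agreed, versioned message format might look like; the "grocery-item/v1" schema name and its fields are invented here to illustrate the idea, not any real standard:

```python
import json

# Hypothetical shared schema: any fridge and any grocer agree on this shape.
GROCERY_ITEM_V1_REQUIRED = ("product", "quantity", "unit")

def make_order_item(product: str, quantity: int, unit: str) -> str:
    """Build a schema-tagged order message any compliant grocer could parse."""
    item = {"schema": "grocery-item/v1", "product": product,
            "quantity": quantity, "unit": unit}
    missing = [f for f in GROCERY_ITEM_V1_REQUIRED if item.get(f) is None]
    if missing:
        raise ValueError(f"schema violation, missing: {missing}")
    return json.dumps(item)

print(make_order_item("milk", 2, "litre"))
```

Because the message carries its own schema version, the format can evolve ("grocery-item/v2") without breaking devices that still speak v1.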
So this brings us back to the web and the semantic web. Despite 20 or more years of search engines, we still don't have a product-based search engine. I can't ask Google to find me all the jobs in an area by extracting data from job boards. I can't ask Bing to find me all the events near me, because there is no schema for event publishing that allows them to be found and classified. I am still stuck in the land of the intermediary search, whether it is Skyscanner for flights, Autotrader for cars for sale, Zoopla for properties for sale or rent, tourism sites for some event information, and so on. We need a web which is open, so that data can be published not just to the Internet of Things but to the web in general. If I am publishing an event, why can't I just put it on my own website, list it for free and have it indexable as an event in a search engine? Why should I be paying an intermediary listing site to do something that should be free?
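One existing vocabulary that points in this direction is's Event type. A sketch of self-publishing an event on your own site as JSON-LD (the helper function is mine; the @type, startDate and location terms are real vocabulary):

```python
import json

def event_jsonld(name: str, start_date: str, locality: str) -> str:
    """Emit a self-published event listing as JSON-LD using terms."""
    return json.dumps({
        "@context": "",
        "@type": "Event",
        "name": name,
        "startDate": start_date,
        "location": {
            "@type": "Place",
            "address": {"@type": "PostalAddress",
                        "addressLocality": locality},
        },
    })

# Embed the output in a <script type="application/ld+json"> tag on any page.
print(event_jsonld("Village book fair", "2016-05-01", "Edinburgh"))
```

A search engine crawling the page can then classify it as an event with a date and a place, with no intermediary listing site involved.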
I did hint at this in 2008, but things have moved slowly in the world of structured search. Perhaps the Internet of Things will bring about the long-overdue change which makes search really useful. We can then use this data foundation as the basis for evolving algorithms, so that the internet can begin to be intelligent. Intelligence is, after all, data and an algorithm.
Original article at please feel free to also comment there.
