Buyer Beware?

A recent advert for an interim role listed among the required skills "extensive mergers and acquisitions (M&A) experience", but on further discussion the need really seemed to be focused on systems integration.  Clearly the ability to plan a sensible integration of two or more corporate IT landscapes into a coherent strategic whole is critical in an M&A context, including re-confirming that previous decisions on delivery models still meet the needs of the organisation.  However, there is equal, potentially greater, value to be gained from engaging the CIO before the deal is struck rather than simply to handle the consequences.  If the acquisition is not a technology company it can be very easy to overlook hidden value or costs within its IT assets; a relatively trite example being that buying a company today with only Microsoft Windows XP deployed on very aged computers will very quickly present a potentially material investment cost.

In a previous role I was taken along to a newly acquired company in Germany by my regional CEO on the Monday morning we took possession.  The employees joining the company were there bright and early in the very smart office, but there was no IT kit on any desk because the detail of the deal had failed to actually acquire those assets; hard to believe, but a true story.  It was this incident that made it very easy for me to insist on the IT team being engaged with any acquisition deal right from the start and being part of the final buy decision review process.

If anyone needs convincing of the criticality of IT and its leader to successful M&A, point them at this excellent McKinsey article from 2011, Understanding the strategic value of IT in M&A.  If M&A is all about finding synergies, then the McKinsey statistic in this article that over 50% of those synergies tend to relate to IT probably wins any debate on why CIOs should be actively engaged from the twinkle-in-the-eye stage through to the fully-integrated-with-no-seams-showing outcome.   This argument holds equally true if you are part of the divesting team, and a compelling articulation of the strength of the corporate technology and how it is designed to enable rapid, low-cost integration could well help close the deal.

I do struggle to understand why the CIO role in M&A is open to question, but at a recent industry event I found a number of CIOs who felt excluded from the decision process and very concerned at the potential consequences they would inherit and have to resolve.  The common pitfall, it seemed to me listening to the debate, was to express the imperative in technology terms rather than using business language: describing the concerns in terms of business outcomes that would be thwarted, as well as the clear attention-grabber of how much getting the IT assessment and integration plan wrong could cost.  I’m not sure it can really be that simple, but then again perhaps it might just be that straightforward?

A Rose By Any Other Name?

Over the summer there has been an increasing number of references in the IT press to a new technology-centric role, the Chief Digital Officer (CDO).  I’ve come across a few heated debates on Chief Information Officer (CIO) and Chief Technology Officer (CTO) forums where people carrying either (or both) of those role titles discuss whether they should actually become a CDO instead, or whether it is a new name for the Chief Marketing Officer.  It certainly risks creating a large number of “chiefs”, but is there a meaningful distinction being signalled by the emerging new role title?  In an interesting article on the CIO website, “Chief Digital Officer – here to stay or flash in the pan?”, one CDO argues that CIOs and CTOs “don’t focus on the core business” and tend to “look at technology for technology’s sake”, which would certainly raise the hackles of people in those roles!  Gartner predict that by 2015 some 25% of companies will have a CDO in post; there is even now a Chief Digital Officer Club.

What I think is going on with the CDO title is that it signals a focus on the external market and on how your company builds and leverages its digital assets for competitive advantage.  There have been people carrying the titles of CIO and CTO in many companies who have had that external focus and been divorced from the internal IT operations and service delivery.  However, I think the use of the word “digital” recognises that this new role also reaches into traditional Chief Marketing Officer territory, and declares that remit legitimate.  There has been much debate over recent years about the growing overlap of the IT and Marketing landscapes in the era of cloud computing and social media, with a clear convergence point in the focus on embracing the ever-evolving and growing digital world.  Indeed, this focus on the digital world and economy was recently highlighted by a global survey report from McKinsey entitled Bullish On Digital, which is well worth a read.


Ultimately what is important is that there is someone in a company ensuring that it is optimally positioned to create competitive advantage from technology and equipped to compete in the digital world, whatever that means for its specific business sector.  This seems to me to be an evolution of the old debate about how to ensure a strong focus on strategic competitive advantage from technology as well as on gaining the scale and cost benefits available from technological operational excellence.  I certainly held CIO- and CTO-titled roles where that strategic, market-facing aspect was my core objective from the CEO, sometimes including operational IT delivery but, increasingly over recent years, excluding it.

CDO is another perspective on the debates of recent years that you might summarise as “what type of CIO are you then” (on which I have mused previously here) and whether the “T” in CTO is for technology or transformation.  Clearly what is really key is the value the role holder, regardless of title, brings to their company and how they can help it to maximise the value gained from technology in the digital age.

If a new title can help deliver on that promise as the traditional IT and Marketing landscapes converge, then fantastic. As long as you deliver, you can probably pick any name you fancy; well, to a degree, as I suspect the days of deciding to be called the Chief Wizard are probably gone.

Image via Shutterstock (152010875)

Innovating – keeping it relevant and real

Over recent months I have had ample time to catch up on my “to be read” collection of interesting articles and magazines.  I will confess to having consumed a large number of publications related to cycling that could probably see me labelled obsessive, particularly as since October I’ve not been fit enough to ride my bike!  However, among all the cycling material consumed, I read an article in MIT Sloan Management Review (Fall 2012 edition, volume 54) that really struck a chord with me.

The article was titled The Benefits of Combining Data With Empathy (the majority of it is behind a pay-wall, I’m afraid).  It argues that to succeed in the future, companies will increasingly need to find an approach that, whilst optimised in terms of business processes and technology enablement, is also able to foster emotional connections with their client base and, as part of that relationship building, to use data empathetically.  The authors, Ritu Agarwal and Peter Weill, make a strong argument for what they term “softscaling”.  This is an approach with three key strands:

  • creating emotional connections to customers, employees and business partners,
  • achieving operational excellence in terms of both cost and outcome, and
  • combining timely analysis of data with a clear understanding of the context to arrive at optimised empathetic decisions.

The authors argue that you need to excel at all three aspects of “softscaling” to reap the rewards.  The paper is based on research conducted in India, with five major companies studied in detail.  What particularly registered with me at the time was the human-centric nature of the approach they are advocating.

The article was given resonance when I read the annual IBM Next 5 In 5 forecast of the five innovations that will most impact our world in the next five years.  This year IBM is talking about the innovations that will underpin the next evolution of computing, an era which IBM describe as “the era of cognitive systems”.  They believe that “this new generation of machines will learn, adapt, sense and begin to experience the world as it really is… predictions focus on one element of the new era, the ability of computers to mimic the human senses – in their own way to see, smell, touch, taste and hear.”  The promise of these “cognitive systems” is that by operating from a human-centric perspective they will help us “see through complexity, keep up with the speed of information, make more informed decisions, improve our health and standard of living, enrich our lives and break down all kinds of barriers – including geographic distance, cost and inaccessibility”.  Bold and exciting statements that the paper from IBM then continues to explore in detail in a very clear and engaging manner.  I urge you to take the time to read the IBM material; I think it is their most compelling annual forecast so far.


It is not just the technology sector that is seeing major potential in bringing digital intelligence to the physical world.  Recently there was an interesting article in The New York Times on General Electric’s plans entitled Looking To Industry For The Next Digital Disruption.  The article outlines how General Electric will have invested $1 billion by 2015 in a new software centre to leverage what many call “The Internet of Things” and which GE term the “industrial Internet”.  GE believe that they have a $150 billion opportunity to drive efficiency into their operations by enabling their “industrial Internet”.  To reach that goal GE are embedding sensors on everything from “a gas turbine or a hospital bed” and investing heavily in the software needed to gather and make sense of the vast streams of data created.   It certainly brings a strong sense of concept becoming reality when a leading industrial company outside the technology sector invests this materially.

Ensuring that innovations are viable and can deliver value in the real world is always a key concern.  A recent article in the McKinsey Quarterly, entitled Battle-Test Your Innovation Strategy, looked at this necessity.  Checking that your “great new idea” is as good as you believe it to be could save a lot of wasted time and money, and ensure that if you are going to fail, you fail early.  McKinsey looked at how some companies are using war games to assess their product and service innovations, simulating the competitive real world and how its many variables might react to a given startlingly brilliant new idea.  This approach is an interesting variant of the scenario planning technique that many companies use at the strategic level but don’t always take to sufficient granularity as the idea evolves into a proposed new product or service. I found the McKinsey article a thought-provoking read, and it certainly hammered home the importance of ensuring your beloved innovation is sufficiently road-tested at each stage of its evolution.

If the future is here today, what will shape tomorrow?

“How will the digital world change the nature of work by 2020?” was the question posed for a recent webcast I joined.  The debate was interesting, at times heated, and very wide-ranging. At the risk of irritating everyone else in the debate, I suggested that the question first needed rewording.  I think it should have read more along the lines of “How has the digital world already changed the nature of work today, and where will it have reached by 2020?”

The future of work has become an increasingly high-profile topic over recent months.  The approach taken in the various articles and posts has varied, with some focusing on the technology solutions whilst others have taken a more holistic societal perspective.  Regardless, it is clear that we have already seen fundamental changes enabled by technology arrive in the workplace, and that over the coming years we will see what is fringe activity today become mainstream tomorrow.  Indeed, we have already seen multiple waves of workplace change impact an increasing number of companies, quite apart from the debate you could have around social business.  An excellent read on this topic is a recently published book called “The Digital Workplace” by Paul Miller.  Paul has also launched an organisation called the Digital Workplace Forum, essentially a diverse community of global corporations collaborating to understand the current baseline, share the art of the possible today and jointly evolve the future opportunities.

Earlier this year I chaired a two-day consultation event entitled “The Future of Work”.  The consultation was organised by St George’s House and was one of their series of such events on social and ethical themes.  The consultations are held in unique surroundings at Windsor Castle and operate to an ethos that sees highly diverse groups of people rapidly bond into collaborative and effective discussion groups.  This particular group was diverse in every respect, bringing together representatives of the public and private sectors, and the debate was stimulating, wide-ranging and at times a challenge for me to keep on topic.  You can find the report, entitled simply “The Future Of Work”, on the St George’s House website.   I think a fair overall summation would be that everyone in the group recognised that the workplace was rapidly changing, that the pace of change was only going to accelerate, and that there were a large number of interconnected driving and enabling technology factors.

In the months since the event I have increasingly read about a technology that we did not discuss in much detail during the consultation: 3D printing.  The potential impact of 3D printing on the future of industry is huge.  It is a technique for rapid prototyping and manufacturing, also sometimes known as additive manufacturing.  It involves creating a solid object by layering a material such as liquid plastic to a specified pattern, i.e. “printing”.  The pattern is created as a detailed 3D file and can be visualised prior to initiating the manufacture. Examples of what can be created range from hearing aids to replica models to prosthetic limbs to components for jet engines to dresses through to chocolate confectionery.  The implications of being able to create such complex, diverse and customised output are huge, and some are talking about 3D printing as the third industrial revolution.  A recent great piece on this theme can be found in The Economist article entitled “A Third Industrial Revolution”.  I also really like an article in Wired magazine entitled “Future Of Stuff: Vending Machines That Prints in 3D”.
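
The layer-by-layer idea at the heart of additive manufacturing can be illustrated with a toy calculation. Real slicing software works from an arbitrary 3D model file; the sketch below is purely an illustration of the concept (not any real toolchain), computing the circular cross-section each layer would trace for a solid sphere:

```python
import math

def slice_sphere(radius: float, layer_height: float) -> list[float]:
    """Compute the cross-section radius of each printed layer for a
    solid sphere.  A sphere sliced at height z gives a circle of radius
    sqrt(r^2 - z^2); a real slicer does the same job for arbitrary
    model geometry."""
    layers = []
    z = -radius + layer_height / 2  # centre height of the first layer
    while z < radius:
        layers.append(math.sqrt(radius ** 2 - z ** 2))
        z += layer_height
    return layers

# A 10mm-radius sphere printed in 2mm layers takes 10 passes of the head.
layers = slice_sphere(radius=10.0, layer_height=2.0)
print(len(layers), "layers; widest cross-section radius:", round(max(layers), 2))
```

The same decomposition explains why such diverse objects are possible: once a shape can be expressed as a stack of 2D outlines, the printer only ever needs to solve the much simpler problem of depositing one layer at a time.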

The more I read about 3D printing, the more it brings home that the drivers of change in the workplace of tomorrow are not restricted to the digital technologies that immediately spring to mind.  I think we can regard many of the technology solutions often described as innovative as increasingly established and effectively old news, where arguably it is all about implementing solutions that the early adopters have already proven.  What is exciting is seeing clever, innovative ideas evolve through the R&D stage, survive the hype stage and become implemented new solutions delivering business and/or societal value. Clearly, identifying the next groundbreaking initiative early (presuming that, like me, you lack what is required to be the inventor!) and understanding how to derive value from it is the key.

Getting personal with the cloud

If there’s one thing the IT industry is spectacularly good at, it is producing buzzwords. Marketing executives – even management gurus – look enviously over their shoulders at our industry’s propensity to churn out a seemingly inexhaustible supply of new acronyms and expressions.  We overuse them in PowerPoint, extolling the virtues of the latest X and how it will mean Y to Z and to all of Z’s customers.  Meanwhile our audiences wearily roll their eyes upwards at each new piece of jargon!

So, after an endless litany of Private Clouds, Public Clouds and Hybrid Clouds, does anybody have the energy for Personal Clouds?  And when we learn it is rooted in consumer IT – itself the most crowded territory for industry jargon (think ‘Mobile’, ‘Post PC’, ‘BYOD’, ‘User Experience’ – it never stops) – we’re reaching for the off switch.  Why should we care?  Perhaps more to the point, why should I risk antagonising you by writing a blog on the subject?

I could start by explaining that the idea of the Personal Cloud is gaining traction across the IT industry – Gartner, for example, were predicting last month that the Personal Cloud would replace the PC by 2014 – or that a cursory search of Google Trends shows the term first appearing in web searches as recently as June 2011 and growing rapidly ever since.  But hype, of course, is no justification of something’s worth in itself. Worse, it is so often accompanied by the array of contradictory definitions that seems to meet every new piece of IT terminology.  The important thing is to look at what is actually happening out there.  Because whatever words we want to use, whatever charts we want to draw, an important development is taking place.

For me there are two parts to this.

One is that we now have an unprecedented range of consumer utilities at our disposal to enable our – for want of a better phrase – personal productivity. All the things that you need to do in your daily life – communicate, write, find things out, calculate, plan and schedule, collaborate and share – are enabled by software.  And these days you are quite likely to go online for your software because, let’s face it, apps are as cheap as chips and very often they are free.  When consumed in this way, the set of utilities starts to resemble a virtual space which exists somewhere ‘out there’. This is where the term Personal Cloud may start to seem relevant.  Moreover, this is perhaps the first truly consumerised set of software with real consumer product DNA. It is pure B2C, whereas MS Office and its ilk have their heritage in B2B – even when they have been sold to the C.

Second is to consider this in the context of mobile devices. It is fair to say that if you use a PC you are probably happy to use a workspace that is fixed and licensed to that machine. Traditionally, that has been provided for you by your company. More than likely you have created a similar environment on a home PC – maybe the software was cheaper than the corporate version but nonetheless what you bought came in a cardboard box wrapped in cellophane.  Its code is now firmly attached to the hard disk – as is the information you have created with it.  Mobile changes everything.  You probably don’t need me to argue that with a mobile device, online, consumer software makes the most sense. But here’s the thing. The real value of Personal Cloud is not about your first mobile device, it’s about your second, and your third.  As you add more devices – a smartphone here, a media tablet there – so it becomes more beneficial to you that your software and personal information are virtualised and accessible.   Dropbox and Apple’s iCloud are enjoying huge popularity as people realise how much easier it is to have a consistent experience across their devices.  Of course, you have also come to realise you need – and expect – the same experience across all of your computers – home and work.
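
The mechanics behind that multi-device consistency can be sketched very simply. This is not how Dropbox or iCloud actually work internally – it is just a minimal illustration of the core idea: compare content fingerprints between the ‘cloud’ copy and each device, and transfer only what differs:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Content hash used to detect whether two copies differ."""
    return hashlib.sha256(data).hexdigest()

def out_of_date(cloud: dict[str, bytes], device: dict[str, bytes]) -> set[str]:
    """Names the device is missing entirely, or holds a stale copy of."""
    return {
        name
        for name, data in cloud.items()
        if name not in device or fingerprint(device[name]) != fingerprint(data)
    }

cloud = {"notes.txt": b"draft v2", "photo.jpg": b"pixels"}
phone = {"notes.txt": b"draft v1"}  # stale copy, and no photo yet
print(sorted(out_of_date(cloud, phone)))
```

Every extra device you add reuses the same comparison against the single cloud copy, which is why the value of the approach grows with your second and third device rather than your first.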

Lurking behind all this, like a troublesome and unwelcome party guest, is a profound implication for the way that businesses deliver end user computing to their employees.  Because now you’ve got your personal devices synced, isn’t it time you also synced your work stuff?  And if you already have a virtual workspace, which by the way you can access at work, why would you need your employer to provide you with an alternative, possibly inferior one?  And would you use it?

There is already strong impetus in the enterprise for Bring Your Own Device (BYOD) and no doubt you will be familiar with the arguments.  The use of mobile in the workplace is a disruptive force and is being viewed by the enterprise, albeit with suspicion, as mostly harmless.  But the argument for Personal Cloud is slightly different. Devices are as varied as they are disposable.  Their useful life expectancy is falling. No one device will define what’s personal to us.  It will be our own personal experience – the set of information and applications that we use – that will become the footprint that defines us and persists with us.  This is what Personal Cloud has the potential to deliver.

Personal Cloud is therefore likely to overtake mobile as the number one headache for CIOs.  Consumer technology has a Trojan Horse feel about it.  It sits outside the enterprise walls, gathering a lot of attention, as suspicious IT functions ready themselves to accept the seemingly harmless gift.  But as we all know, it wasn’t a big hollow wooden horse that did for the Trojans.  It was its payload of Greek warriors, led by Odysseus, who crept out in the dead of night and opened up all the city gates to break a ten year deadlock.  Likewise, Personal Cloud will be carried into the enterprise on mobile devices.  It will change the way enterprises deliver end-user computing for good.

A big data reality check

A few weeks ago I had the pleasure of chairing the Fujitsu North America Technology Forum 2012. The focus of the event was extremely topical: “From Sensor Networks to Human Networks: Turning Big Data into Actionable Wisdom”. Alongside the excellent presentations there were specific topic breakout sessions, as well as a technology hall showcasing 20 new opportunities alongside innovative solutions from Fujitsu’s research and development programmes working on “leveraging ‘big data’ to transform business, society and our daily lives”. That’s certainly a big vision statement!

The event attracted around 400 attendees, which managed to combine significant scale with a nice feeling of intimacy.  What struck me most about the day was the high level of interest and the wide range of perspectives represented and explored.  Oh, and yes, I also learned the value of your main keynote speaker being someone as experienced and relaxed as Gordon Bell when the microphone failed just as he got into his stride; it was great to see a professional handle that blip without a flicker of concern or missing a beat!

This is not the first time we’ve mentioned the big data topic on this blog (and you can read more on our Technology Perspectives site), but the over-riding message I took from my many discussions during the event was that people seem fairly comfortable with the concept but are very much focused on how to extract “actionable wisdom”.  In that context, the presentation from Michael Chui of McKinsey Global Institute is definitely worth some reading time as a great summary of where value might be drawn across the industry spectrum.  There is also more detail in a research paper he published in 2011 entitled “Big Data: The next frontier for innovation, competition and productivity”.

I was also extremely interested in one of the breakout sessions focusing on the healthcare sector, in which Fujitsu Laboratories of America (FLA) spoke of partnering with a technology subsidiary of a healthcare provider, Springfield Clinic.  This joint development around remote patient monitoring and reporting caught my attention, and I was able to discuss it in more detail outside the session with Jim Hewitt, CEO of Jardogs and CIO of Springfield Clinic.  I was excited to hear that some technology I’d seen during its earlier research incarnation, a remote sensing platform, had been integrated with Jardogs’ FollowMyHealth Universal Health Record (UHR).  The combination enables anytime, anywhere collection and transmission of selected health data types, immediately usable within the patient’s UHR.  There is definitely a buzz to be had from seeing something you encountered at a very early conceptual stage becoming real and moving to pilot deployment.

Finally what gave me most food for thought was the keynote presentation by Gordon Bell and his MyLifeBits initiative; the digital storage of every aspect of a life.  I am still mulling over the questions his material raised for me and deciding what conclusions I reach.  It certainly made a term like “digital universe” have wider connotations and more personal resonance for me than it did before he started speaking.

If you’d like to learn more, follow these links to: more images; copies of the presentation materials; and details of this and previous FLA events.

Interactions between the physical and digital worlds

Working for an Information Technology company presents me with a view of life in which the digital economy is a must and an integral part of today’s society. Yet, while that may be true in some parts of the world and within certain demographics, it’s not a statement that everyone would recognise.

But technology is increasing its impact on our world every day, and lots of very inventive people are finding ways that the digital world can support the physical world, even in very poor and under-developed regions.

The question for me in this blog post is not the consumerisation of IT as an IT professional may see it, but at what point someone who has very little in the physical world is compelled to join the digital world because it makes a significant difference to their daily life.

An excellent example is the Reuters Market Light service, provided to Indian farmers since 2007 to deliver commodity prices, crop and weather data via SMS. Often a community shares a handset, but individuals have their own SIMs. The service has grown as subscribers share the information with their community to decide where best to send their produce to get the best price, and some now even use mobile phones to control irrigation.

In more developed parts of the world, what we want can be very different but still critical to our day-to-day needs, and the response to a need can arrive very fast indeed – such as using a cell phone to measure exposure to radiation in response to specific events like last year’s earthquake and tsunami in Japan.

Mobile technology is not just useful for responding to disasters. For example, the experience of going for a health check at a labyrinth of a hospital and wondering how long one is going to have to wait led to the creation of a patient guidance system that should take some of the stress out of the visit – and there are many other examples of mobile applications allowing us to take care of ourselves and improve our physical well-being.

This tells us that the digital world can be a significant force for good in the physical world, that the needs of the developing world are very different from those of the developed world and, using ITU numbers, that a third of the developed world is still not connected (two thirds in the developing world).

So, we can all “do our bit” by digging out the 2G mobile handsets that we last used about five years ago, now collecting dust in a drawer, and sending them to our favourite charities, who may use them to help people in other parts of the world (for example, the Indian farmers) and allow greater participation in the digital society.

A back catalogue of ideas

When we look into the works of some of our great inventors, such as Leonardo da Vinci, Charles Babbage and Nikola Tesla, or even authors like Isaac Asimov, we have to ask: “have we thought of everything we will ever need?”

This particular train of thought was sparked as I read a BBC article about Charles Babbage’s difference engine. The machine’s memory would be equivalent to around 675 bytes, roughly two-thirds that of Sinclair’s ZX81, released in 1981. A later proposal by Babbage called for 20KB of storage. The machine’s clock speed would work out at around 7Hz, compared to the ZX81’s 3.2MHz – and this was all designed circa 1835. The fastest computers of today deliver 10 petaflops of computational performance – so time moves on, but I have to ask: have all of our ideas been realised?
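
For a sense of scale, the figures above are easy to sanity-check with a little arithmetic (assuming the ZX81’s standard 1KB of RAM):

```python
# Back-of-the-envelope comparison of Babbage's difference engine (c. 1835)
# with Sinclair's ZX81 (1981), using the figures quoted above.

babbage_memory = 675      # bytes, equivalent storage of the engine
zx81_memory = 1024        # bytes, the ZX81's standard 1KB of RAM

babbage_clock = 7         # Hz, roughly 7 operations per second
zx81_clock = 3_200_000    # Hz, i.e. 3.2MHz

memory_fraction = babbage_memory / zx81_memory   # roughly two-thirds
clock_multiple = zx81_clock / babbage_clock      # nearly half a million

print(f"Engine memory is {memory_fraction:.0%} of a ZX81's")
print(f"A ZX81 clocks about {clock_multiple:,.0f} times faster")
```

In other words, a hobbyist machine of 1981 was within sight of Babbage’s storage but nearly half a million times faster – a gap opened almost entirely by materials and manufacturing rather than by new ideas.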

Assuming not raises another question: where is the “back catalogue” of ideas that we’ve not been able to deliver on yet? And are we just waiting for the materials science to catch up? Every day I read of something new appearing; usually, though, it’s smaller, faster or cheaper rather than brand new, and in achieving those attributes it becomes more consumable and available to a wider population. The humble mobile phone is one example – with wireless telephony invented by a Kentucky farmer called Nathan B. Stubblefield as long ago as 1907!

Another example that comes to mind is wireless power: Nikola Tesla demonstrated this circa 1896, and only now is it close to reality, with technology demonstrators that we would recognise such as contact charging mats and wireless monitors.

So, as materials science brings many of these crazy ideas into the realm of possibility, it would be great to see companies returning to the dusty archives in patent offices (or company intellectual property offices) and reviewing what they have, to see if old ideas are now possible. On the flip side, we have to avoid squabbles over patents (the “Nortel patent wars” are just one example) – it’s far preferable that something tangible is realised rather than argued over in a court of law.

Really it’s not just about the way we will do things but how we already do things today. Just because we worked out how to crack a nut with a sledgehammer doesn’t mean we shouldn’t go back and examine the original “nut cracker” invention to see if materials science allows us to accomplish the task more effectively and with greater benefit to society.

I’ll leave you with two questions:

  1. Which is the greatest invention that we’ve realised so far: the wheel; paper and the written word; the Internet; or something else?
  2. What idea that you have heard of is still to be realised in our world today? (“Beam me up, Scotty”)

Technology Perspectives – updated for 2012 and beyond

In general, we don’t directly promote Fujitsu products and services on this blog, but we do try to highlight our thought leadership and, late in 2010, we highlighted Fujitsu’s “Technology Perspectives” microsite, which provides Fujitsu opinion on the future of technology, as well as the societal impact of these evolving capabilities.

As we all know, technology doesn’t stand still – mobile technologies, cloud services, business use of social media, and explosive data growth are driving major change in 2012 and beyond – so the Technology Perspectives site has recently undergone a refresh to reflect key forces shaping the industry and how Fujitsu is responding.

Technology Perspectives breaks these down into twelve distinct business and technology trends and considers how they are shaping the future. The trends support and reinforce Fujitsu’s vision and strategy, and complementary Fujitsu innovations are highlighted throughout. The new content will be enhanced and updated on a regular basis with new formats planned to complement the existing online and downloadable PDF formats.

I encourage you to take a look at Technology Perspectives – and please give us your feedback – we’d love to know what you think.

Futurology: art, science or nonsense?

Recently I was asked to present to a group of MBA students on my view of the future and how technology will shape our world by 2015 through to 2020 and beyond. I decided to deliver the session under the title “Futurology – Science, Art or Nonsense?”.

At this time of year it is tempting to wrap up the events of the year with a forecast of what the future will bring. You may be pleased to know that I am going to resist that temptation!

This is primarily because, early in 2012, we will be refreshing the Fujitsu view of the trends shaping our world and the potential outcomes, Technology Perspectives, so I’ll hold fire for now – although I do commend the current material to you as we will evolve our views not completely re-invent them!

Even so, I couldn’t resist re-reading my blog post from December 2010 and musing on how much of what I talked about was still relevant. The post was primarily about the concept of consumerisation of IT and my sense then that it was not restricted to being the generational trait that in 2010 many of us had linked to “Generation Y”. Twelve months on, I think it is clear that the expectation that our corporate workplace will have the same 21st-century technology capabilities as the consumer arena has moved into the mainstream. The most frequent topic on which I was asked to give an opinion in 2011 was “Bring Your Own” technology (BYO), in its many variants, and its consequences for the corporate IT landscape. Indeed, at the point where I moved from the CIO position in Fujitsu UK and Ireland to my current role, the two topics dominating my CIO barometer of demand were requests for BYO solutions and for support of Android-based smartphones and tablets within our own BYO initiative.

If you remember, with the help of my HR colleagues I was able to have the data set anonymised and then analysed by age group. In May, when demand on these topics started to register in the monthly statistics, there was a clear Generation Y skew; however, by September the total figures for Android support were equally split between Generation Y and Generation X (c.45% each of volume), yet the BYO demand remained Generation Y dominated (60% of volume). I’m not going to ponder the demographic angle in this post, but what I will say is that in a company of around 12,000 employees, over the period I had over 1,000 requests for BYO and over 2,500 requests for Android smartphone or tablet support (not necessarily all unique, i.e. people could have requested both). This level of interest mirrored what we saw in the marketplace and in the requests for opinion from CIOs across our client base.

So, whilst I am sidestepping listing forecasts for 2012, I can say that the most common topics I have been asked to talk about over recent months are “Big Data” and “Smart Cities/Infrastructure” (the Intelligent Society). I no longer have the CIO Barometer to give me data points, but I am willing to assert that in 12 months we may well be reflecting on a year that saw those concepts become pervasive, with examples of business value derived from them easy to list.

It seems appropriate, in a year which saw the passing of Steve Jobs, to end my last blog post of 2011 with one of my favourite Apple-related quotes. The final line from Apple’s famous Think Different campaign was:

“The people who are crazy enough to think they can change the world are the ones who do.”

Clearly 2012 is going to be a challenging year on so many levels for us all, but alongside the challenges there are plenty of opportunities too. Have a restful festive period and return refreshed for what lies ahead.