Earlier this week my eye was caught by an email entitled “Eight New Tech Job Titles”. One of the most unusual job titles I have come across in my career is “Chief Monster”, which was used for a while by Jeff Taylor, the founder of monster.com. Although I believe that role title was unique, I do tend to keep an eye on the role titles emerging in the corporate world, and so of course I clicked on the link. There was nothing particularly startling in the eight titles listed, but the “Chief Analytics / Data / Science Officer” entry did catch my eye, not least because I had recently encountered, for the first time, someone who carried the title “Chief Data Scientist”. The chap, who worked for a large US retailer, was presenting on a webcast about “big data”, on which he had some very interesting views. I would share the link to the webcast but it is behind a subscription paywall.
However, it is perhaps worth first defining the term “big data”. It is generally accepted as referring to the exponential growth and availability of structured and unstructured data, a key dynamic of the digital age. People typically refine that broad definition by referencing concepts I believe were first articulated by Doug Laney in 2001, who defined “big data” in terms of three characteristics: volume, velocity and variety. I recently came across an excellent IBM infographic entitled “The Four Vs of Big Data”, in which they add a fourth, veracity. The infographic is an excellent summary and it would be foolish of me to try to restate it here. I did register within it the arresting statement that IBM believe 4.4 million new IT jobs will be created in the big data field by 2015; note to self, can this old dog learn some new tricks to reinvent himself?
So you can imagine that on coming across my first Chief Data Scientist I had a number of questions to pose to him. I’m sure you will all have seen the various statistics about the exponential generation of data, whether in the social media context, such as 30 billion pieces of content shared on Facebook every month, or facts like the 6 billion mobile phones in use, the forecast of 18.9 billion network connections by 2016, or the estimated 2.3 trillion gigabytes of data created each day. Sadly I didn’t get to pose any of my (in my view!) insightful questions; however, others admirably stepped into the breach.
The first question put to him came from a CIO musing on the number of “challenging” data warehouse projects in her past (snap!), and focused on his approach to handling the complexity implicit in the big data arena. The data arrives at speed from multiple sources, both structured and unstructured, and to be of value the data sets must be processed (linked, matched and cleansed at a minimum) before you can start to meaningfully connect and correlate relationships, turning data into information and information into insight. I liked his initial answer: “Frankly, if it was at all easy no one would be interested in paying me to hold a role with such a fancy job title!”. I liked his second point even more: “It is easy to get over-excited about the neat new analytic tools, how much processing power you need or whether you can leverage cloud-based analytic engines. What is absolutely critical is domain knowledge; you have to understand the business context in which the data is created and in which it is being interpreted to create business insight and, ultimately, competitive advantage.” However, driven by the questions being posed, he did then proceed to talk at length about technology tools, at which point I will confess to losing interest quite quickly.
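To make the “link, match and cleanse” idea a little more concrete, here is a minimal sketch in Python. It assumes two hypothetical customer feeds with invented field names; this is purely an illustration of the pattern (normalise the raw fields, derive a match key, link records that share it), not anyone’s production pipeline.

```python
def cleanse(record):
    """Normalise raw fields so records from different sources compare."""
    return {
        "name": record["name"].strip().lower(),
        "postcode": record["postcode"].replace(" ", "").upper(),
        "spend": float(record.get("spend", 0)),
    }

def match_key(record):
    """Derive a simple match key from the cleansed fields."""
    return (record["name"], record["postcode"])

def link(source_a, source_b):
    """Link records sharing a match key, combining their spend."""
    linked = {}
    for raw in source_a + source_b:
        rec = cleanse(raw)
        key = match_key(rec)
        if key in linked:
            linked[key]["spend"] += rec["spend"]
        else:
            linked[key] = rec
    return list(linked.values())

# Two hypothetical feeds for the same customer, formatted differently.
crm = [{"name": " Jane Smith ", "postcode": "ab1 2cd", "spend": "120.50"}]
web = [{"name": "jane smith", "postcode": "AB1 2CD", "spend": "30.00"}]

print(link(crm, web))  # one linked record with the combined spend of 150.5
```

Even in this toy form you can see his point about domain knowledge: deciding which fields constitute a valid match, and what “clean” means for each one, is a business judgement, not a technology one.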
Of course, what was extremely familiar was the message about needing to use the power of the technology to create a context for the data, whilst taking due note of the implications of the “four Vs” so well articulated by the IBM infographic. This is the core message that CIOs and their teams hear all the time. It is those that internalise and act upon it that typically become the success stories, with the technology function seen as an innovation engine for the business, enabling competitive advantage. To get a sense of the size of the prize around big data, and some case studies on success stories, I recommend reading the report “Big Data In Big Companies” by Thomas Davenport and Jill Dyche. To reflect on the gestation time for trends in technology to become deployed innovations in the business world, I suggest reading the 2011 McKinsey report “Big Data: The Next Frontier For Innovation, Competition & Productivity”. I remember reading this report in late 2011 when preparing a presentation on the Internet of Things and pondering whether it would be more than hype by 2015; I think we can declare yes at this point in time.
“So what is the right team to mobilise to deliver a technology-enabled business transformation then?” This was a question posed in an email I received shortly after my last contribution to the BVEX site. Just to be irritating, I answered the question with a question of my own: “Thanks for reading and posing the question, but could you not use the comment feature on the site?!”. A critical part of the answer lies within the wording of the question, specifically “business transformation”. My starting point in mobilising any transformation initiative is to understand how, and how actively, the business will engage, and to confirm that it has a clear view of the benefits to be obtained and how they will be measured.
Once I have those parameters defined I can start to look at the balance of skills the team needs to be successful. As we will all appreciate, the enabling technology must be deployed effectively to provide a solid base before we can drive the organisational and/or individual behavioural changes required to use it. However, I am extremely wary of transformation programme leads who are fans of technology, or even worse fanatics. My best results have been achieved when the programme lead views the technology as simply a tool and maintains a dispassionate perspective, much as most of us would regard the choice between different types of pen. They just need to understand the technology sufficiently to be able to lead those in the team for whom it is their specialist skill.
My primary focus for the transformation lead is to find someone able to communicate the vision underlying the intent and to make the business change meaningful to those delivering it, engaging with it or being impacted by it. Once you catch people’s imagination with the vision, they will commit and provide the persistence that is so often needed to achieve success, as there are always, repeat always, bumps in the road in any programme with a significant technological dimension.
Personalising the transformation and visibly living the values set is critical and, in terms of business engagement, if you can have your CEO provide that role model then you have materially de-risked your programme. You want to build a cadre of committed individuals driving towards the desired outcomes and impact on the business. A key success factor enabling that peer group pressure is well-defined measurement: tracking the right metrics both in terms of the delivery of the programme and in terms of the business benefits derived as you embed the business change. Clearly there will be people, within both your programme team and the wider business, who need a good deal of persuasion. It is vital to have a strong focus on the key enablers of organisational change, as well as strategies to handle the resisters, including those that are hidden or passive. I recently found an excellent exploration of this area by McKinsey entitled “Tapping The Power Of Hidden Influencers”, which is well worth a read.
My key argument is that you absolutely need to mobilise a team that can deliver the enabling technology to scope, budget, deadline and quality. However, to derive the business benefits from that enabling technology you need more than “just” those qualities; you need a team equipped to drive the organisational and behavioural change, moving engagement into commitment and then into enactment. The tendency of technology-enabled business transformation programmes to fail to deliver the business benefits, even when they succeed in delivering the technology dimension, highlights the multifaceted team that success requires. Even if you do mobilise the optimal multi-skilled team, you must have answered an even more fundamental question: are the leaders within that business environment committed to the change and ready to lead from the front as compelling role models? So, in short, even before you start thinking about the optimal skills mix and mobilising the team, make sure you have verified that those commissioning the transformation understand clearly what the journey will entail and are able to holistically articulate the destination.
This article was first posted on the Business Value Exchange.
Over recent months I have had ample time to catch up on my “to be read” collection of interesting articles and magazines. I will confess to having consumed a number of cycling publications large enough to see me labelled obsessive, particularly as I have not been fit enough to ride my bike since October! However, among all the cycling material consumed, I read an article in MIT Sloan Management Review (Fall 2012 edition, volume 54) that really struck a chord with me.
The article was titled “The Benefits of Combining Data With Empathy” (the majority of it is behind a paywall, I’m afraid). It argues that to succeed in the future, companies will increasingly need an approach that, whilst optimised in terms of business processes and technology enablement, is also able to foster emotional connections with their client base and, as part of that relationship building, to use data empathetically. The authors, Ritu Agarwal and Peter Weill, make a strong argument for what they term “softscaling”. This is an approach with three key strands:
- creating emotional connections to customers, employees and business partners,
- achieving operational excellence in terms of both cost and outcome, and
- combining timely analysis of data with a clear understanding of the context to arrive at optimised empathetic decisions.
The authors argue that you need to excel at all three aspects of “softscaling” to reap the rewards. The paper is based on research conducted in India, including detailed studies of five major companies. What particularly registered with me at the time was the human-centric nature of the approach they advocate.
The article gained further resonance when I read the annual IBM “Next 5 In 5” forecast of the five innovations that will most impact our world in the next five years. This year IBM is talking about the innovations that will underpin the next evolution of computing, an era which IBM describes as “the era of cognitive systems”. They believe that “this new generation of machines will learn, adapt, sense and begin to experience the world as it really is… predictions focus on one element of the new era, the ability of computers to mimic the human senses – in their own way to see, smell, touch, taste and hear.” The promise of these “cognitive systems” is that by operating from a human-centric perspective they will help us “see through complexity, keep up with the speed of information, make more informed decisions, improve our health and standard of living, enrich our lives and break down all kinds of barriers – including geographic distance, cost and inaccessibility”. These are bold and exciting statements that the IBM paper then explores in detail in a very clear and engaging manner. I urge you to take the time to read the IBM material; I think it is their most compelling annual forecast so far.
It is not just the technology sector that sees major potential in bringing digital intelligence to the physical world. Recently there was an interesting article in The New York Times on General Electric’s plans, entitled “Looking To Industry For The Next Digital Disruption”. The article outlines how General Electric will have invested $1 billion by 2015 in a new software centre to leverage what many call “The Internet of Things” and which GE terms the “industrial Internet”. GE believe they have a $150 billion opportunity to drive efficiency into their operations by enabling their “industrial Internet”. To reach that goal GE are embedding sensors on everything, from “a gas turbine or a hospital bed”, and investing heavily in the software needed to gather and make sense of the vast streams of data created. It certainly brings a strong sense of concept becoming reality when a leading industrial company outside the technology sector is investing this materially.
Ensuring that innovations are viable and can deliver value in the real world is always a key concern. A recent article in the McKinsey Quarterly, entitled “Battle-Test Your Innovation Strategy”, looked at this necessity. Verifying that your “great new idea” is as good as you believe it to be can save a lot of wasted time and money, by ensuring that if you are going to fail, you fail early. McKinsey looked at how some companies are using war games to assess their product and service innovations, simulating the competitive real world and how its many variables might react to a given startlingly brilliant new idea. This approach is an interesting variant of the scenario planning technique that many companies use at the strategic level but do not always take to sufficient granularity as the idea evolves into a proposed new product or service. I found the McKinsey article a thought-provoking read, and it certainly hammered home the importance of ensuring your beloved innovation is sufficiently road-tested at each stage of its evolution.