Mind+Machine. Marc Vollenweider. John Wiley & Sons Limited. ISBN: 9781119302971
me to delve into the psychology of those involved at all levels of mind+machine analytics.

– Marc Vollenweider

      PART I

      THE TOP 12 FALLACIES ABOUT MIND+MACHINE

      The number of incredible opportunities with great potential for mind+machine is large and growing. Many companies have already begun successfully leveraging this potential, building whole digital business models around smart minds and effective machines. Despite the potential for remarkable return on investment (ROI), there are pitfalls – particularly if you fall into the trap of believing some of the common wisdom in analytics, which is exposed as a set of fallacies on closer examination.

      Some vendors might not agree with the view that current approaches have serious limitations, but the world of analytics is showing some clear and indisputable symptoms that all is not well. To ensure you can approach mind+machine successfully, I want to arm you with insights into the traps and falsehoods you will very likely encounter.

      First, let's make sure we all know what successful analytics means: the delivery of the right insight to the right decision makers at the right time and in the right format. Anything else means a lessened impact – which is an unsatisfactory experience for all involved.

      The simplest analogy is to food service. Success in a restaurant means the food is tasty, presented appropriately, and delivered to the table on time. It's not enough to have a great chef if the food doesn't reach the table promptly. And the most efficient service won't save the business if the food is poor quality or served with the wrong utensils.

      The impact of analytics on a business should be clear and strong. However, many organizations struggle, spending millions or even tens of millions on their analytics infrastructure yet failing to receive high-quality insights in a usable form when they are needed – and thus failing to get the right return on their investments. Why is that?

      Analytics serves the fundamental desire to support decisions with facts and data. In the minds of many managers, it's a case of the more, the better. And there is certainly no issue with finding data! The rapid expansion in the availability of relatively inexpensive computing power and storage has been matched by the unprecedented proliferation of information sources. There is a temptation to see more data combined with more computing power as the sole solution to all analytics problems. But the human element must not be underestimated.

      I vividly remember my first year at McKinsey Zurich. It was 1990, and one of my first projects was a strategy study in the weaving machines market. I was really lucky, discovering around 40 useful data points and some good qualitative descriptions in the 160-page analyst report procured by our very competent library team. We also conducted 15 qualitative interviews and found another useful source.

      By today's standards, the report amounted to a combined study-relevant data volume of 2 to 3 kilobytes. We used this information to create a small but robust model in Lotus 1-2-3 on a standard laptop. Those insights proved accurate: in 2000, I came across the market estimates again and found that we had been only about 5% off.

      Granted, this may have been luck, but my point is that deriving valuable insight – finding the “so what?” – required thought, not just the mass of data and raw computing power that many see as the right way to do analytics. Fallacies like this and the ones I outline in this part of the book are holding analytics back from achieving its full potential.

      FALLACY #1

      BIG DATA SOLVES EVERYTHING

From Google to start-up analytics firms, many companies have successfully implemented business models around the opportunities offered by big data. The growing list of analytics use cases includes media streaming, business-to-consumer (B2C) marketing, risk and compliance in financial services, surveillance and security in the private sector, social media monitoring, and preventive maintenance strategies (Figure I.1). However, throwing big data at every analytics use case isn't always the way to generate the best ROI.

Figure I.1 Areas of Big Data Impact

      Before we explore the big data fallacy in detail, we need to define analytics use case, a term you'll encounter a lot in this book. Here is a proposed definition:

      “An analytics use case is the end-to-end analytics support solution applied once or repeatedly to a single business issue faced by an end user or homogeneous group of end users who need to make decisions, take actions, or deliver a product or service on time based on the insights delivered.”

      What are the implications of this definition? First and foremost, use cases are really about the end users and their needs, not about data scientists, informaticians, or analytics vendors. Second, the definition does not specify the data as small or big, qualitative or quantitative, static or dynamic – the type, origin, and size of the data input sets are open. Whether humans or machines or a combination thereof deliver the solution is also not defined. However, it is specific on the need for timely insights and on the end-to-end character of the solution, which means the complete workflow from data creation to delivery of the insights to the decision maker.

      Now, getting back to big data: the list of big data use cases has grown significantly over the past decade and will continue to grow. With the advent of social media and the Internet of Things, we are faced with a vast number of information sources, with more to come. Continuous data streams are becoming increasingly prevalent. As companies offering big data tools spring up like mushrooms, people are dreaming up an increasing number of analytics possibilities.

      One of the issues with talking about big data, or indeed small data, is the lack of a singular understanding of what the term means. It's good hype in action: an attractive name with a fuzzy definition. I found no fewer than 12 different definitions of big data while researching this book! I'm certainly not going to list all of them, but I can help you understand them by sorting them into two buckets: the geek's concept and the anthropologist's view.

      Broadly speaking, tech geeks define big data in terms of volume; velocity (speed); variety (types including text, voice, and video); structure (which can mean structured, such as tables and charts, or unstructured, such as user comments from social media channels); variability over time; and veracity (i.e., the level of quality assurance). There are two fundamental problems with this definition. First, nobody has laid down any commonly accepted limits for what counts as big or small – unsurprisingly, since any such threshold is a rapidly moving target. Second, there is no clear "so what?" from this definition. Why do all of these factors matter to the end user when they are all so variable?

      That brings us to the anthropologist's view, which focuses on the objective. Wikipedia provides an elegant definition that expresses the ambiguity, associated activities, and ultimate objective:

      Big data is a term for data sets that are so large or complex that traditional data processing applications are inadequate. Challenges include analysis, capture, data curation, search, sharing, storage, transfer, visualization, querying, updating and information privacy. The term often refers simply to the use of predictive analytics or certain other advanced methods to extract value from data, and seldom to a particular size of data set. Accuracy in big data may lead to more confident decision making, and better decisions can result in greater operational efficiency, cost reduction and reduced risk.

      High-ROI use cases for big data existed before the current hype. Examples are B2C marketing analytics and advertising, risk analytics, and fraud detection. They've been proven in the market