the tools that has gained prominence is robotic process automation (RPA). Robots, or Bots for short, can be used to automate routine processing steps that were previously performed by humans. RPA is most appropriate for highly routinized or transactional processes, or for the routinized portions of more complex processes. The benefits of RPA can be measured in three ways:

      1 The cost of the software licenses required to maintain a Bot is often, though not always, less than the cost of the employees it can replace.

      2 Given that Bots, by definition, structure processes that were previously unstructured and manually performed by operators, they can lead to increased control and process stability.

      3 When appropriately configured, Bots can perform processes at speeds unrivaled by humans. Work that previously required a full day to complete (or many equivalent workdays, in cases where an entire team performed the task in the legacy environment) can be accomplished in minutes, or even seconds.

      Today, readers may be interacting with Bots without even being aware of it. Individuals may engage Chatter Bots (Chat Bots) on any number of platforms. Chat Bots are NLP-intensive applications programmed to conduct human-like online chat conversations. Customers interact with them in much the same way as they would if a live operator were on the other end of the line. The software recognizes social and conversational cues like “Hello” and responds in kind (“Hello, Reader!”). It also performs a classification to determine what information is being requested and what operations are required to respond most appropriately. Chat Bots are heavily used by support teams and can multiply the bandwidth of existing staff. Of course, there are limitations. The Bots must be explicitly programmed to respond to written cues, meaning that each response must have been explicitly provided for as the program was developed. The Turing Test was developed to assess the ability of a machine to interact in a way that is indistinguishable from a human. On this scale, many Chat Bots today fail to convincingly resemble humans, but they can be used to great advantage when requests are highly predictable and standardized.
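      To make that limitation concrete, the sketch below is a simplified, purely illustrative rule-based Chat Bot, not drawn from any particular platform: every cue, intent, and reply is hypothetical and must be provided for explicitly by the developer, and anything not anticipated falls through to a fallback response.

```python
# A minimal, illustrative rule-based Chat Bot: every cue and its response must
# be provided for explicitly by the developer. Cues and replies are hypothetical.

RULES = {
    "hello": "Hello, Reader!",
    "opening hours": "We are open 9am-5pm, Monday to Friday.",
    "reset password": "I can help with that. Please confirm your username.",
}

FALLBACK = "Sorry, I did not understand. Let me connect you to a colleague."

def classify_and_respond(message: str) -> str:
    """Match the message against known cues; fall back if nothing matches."""
    text = message.lower()
    for cue, response in RULES.items():
        if cue in text:
            return response
    return FALLBACK

if __name__ == "__main__":
    print(classify_and_respond("Hello there"))                  # -> "Hello, Reader!"
    print(classify_and_respond("What are your opening hours?")) # -> hours reply
    print(classify_and_respond("Can you forecast FX rates?"))   # -> fallback
```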

      Complex software installations can be performed by a Bot, given that the number and order of steps are discrete, finite, and well understood. Reconciliations, which occupy many professionals in accounting, finance, and operations, can readily be performed by Bots. Activities that require merging data from multiple systems, departments, or processing outputs can also benefit from a Bot, though in many cases ready-made ETL tools are better suited to the task.
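      As an illustration of the kind of reconciliation a Bot (or an ETL tool) might perform, the sketch below uses the pandas library to compare two hypothetical extracts, a ledger and a custodian file, and isolate the breaks; the data and column names are assumptions made for the example.

```python
# Illustrative reconciliation sketch using pandas; the data and column names are
# hypothetical. A Bot or ETL tool would load real system extracts in their place.
import pandas as pd

ledger = pd.DataFrame(
    {"trade_id": [101, 102, 103], "amount": [250.0, 400.0, 90.0]}
)
custodian = pd.DataFrame(
    {"trade_id": [101, 102, 104], "amount": [250.0, 410.0, 75.0]}
)

# Outer-join the two sources so that missing and mismatched items both surface.
merged = ledger.merge(
    custodian, on="trade_id", how="outer",
    suffixes=("_ledger", "_custodian"), indicator=True
)

# A break is a trade missing from either side, or one where the amounts disagree.
breaks = merged[
    (merged["_merge"] != "both")
    | (merged["amount_ledger"] != merged["amount_custodian"])
]

print(breaks)  # 102 (amount mismatch); 103 and 104 (one-sided items)
```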

      Another way to deal with complex processes is to break them down into a series of the most basic discrete steps. Bots can then be deployed to rapidly execute the simpler, viable steps in the processing chain, leaving the more complicated steps in the value chain to an operator (who now enjoys a bit more time to perform them). A much-repeated folly is to promise stakeholders that an entire process can be automated from A to Z. Very often it is simply a matter of time until a time-consuming challenge is encountered that endangers the delivery as a whole, or at least perceptions of the delivery. In reality, virtually all automation projects leave behind a residual manual tail of work that remains unautomated.
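      The sketch below illustrates this decomposition pattern in simplified form: a process is expressed as a list of discrete steps, the routine ones are executed automatically, and the judgment-heavy ones are routed to an operator's queue as the residual manual tail. The step names are hypothetical.

```python
# A simplified sketch of decomposing a process into discrete steps and
# automating only the routine ones; the step names are hypothetical.

def download_statement(ctx):
    # Routine, rule-based step: a good candidate for a Bot.
    ctx["statement"] = "raw statement contents"
    return ctx

def post_standard_entries(ctx):
    # Also routine: entries derived mechanically from the statement.
    ctx["entries_posted"] = True
    return ctx

# (name, function, automatable?) -- judgment-heavy steps carry no function here.
STEPS = [
    ("download_statement", download_statement, True),
    ("post_standard_entries", post_standard_entries, True),
    ("investigate_exceptions", None, False),   # residual manual tail
]

def run_process():
    ctx = {"manual_queue": []}
    for name, step, automatable in STEPS:
        if automatable:
            ctx = step(ctx)                    # executed by the Bot
        else:
            ctx["manual_queue"].append(name)   # routed to an operator
    return ctx

print(run_process()["manual_queue"])  # -> ['investigate_exceptions']
```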

      At the time of this writing, the robotics platforms of four companies dominate, although this is a moving target. The industry-leading platforms are Automation Anywhere, Blue Prism, UiPath, and NICE. These names are less important than gaining an appreciation for the underlying technology and the use cases to which it is best applied. For now, remember that Bots are best deployed for stable, repetitive processes that exhibit very little variance.

      Machine Learning

      Machine learning (ML) is the subset of artificial intelligence (AI) focused on the study of computer algorithms that improve automatically through experience. Machine learning algorithms build a mathematical model from sample data observations, known as training data, in order to make decisions or predictions without being explicitly programmed to do so. Above, we introduced RPA, which relies on very regimented coding of specific operations that depend on explicit variables. With machine learning, by contrast, a set of samples is analyzed to understand the relationships among inputs and to determine how outcomes are derived from them. The more training data that is pumped through the model, the better the algorithm should get at predicting the “right” answer. Machine learning algorithms are used in a wide variety of applications, such as email filtering and computer vision, where it is difficult or infeasible to develop conventional algorithms or explicit code to perform the needed tasks.
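      As a toy-scale illustration of learning from training data rather than from explicit rules, the sketch below uses the scikit-learn library to fit a simple spam filter on a handful of labeled example messages; the messages and labels are invented for the example, and a real filter would require far more training data.

```python
# Toy machine-learning sketch with scikit-learn: the model learns to separate
# spam from legitimate mail from labeled examples, not from hand-written rules.
# The training messages and labels are invented for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train_messages = [
    "win a free prize now", "claim your free reward", "cheap loans approved",
    "meeting moved to 3pm", "please review the attached reconciliation",
    "quarterly close timetable attached",
]
train_labels = ["spam", "spam", "spam", "ham", "ham", "ham"]

# Convert text to word counts, then fit a naive Bayes classifier on the samples.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(train_messages, train_labels)

# The model predicts labels for messages it has never seen verbatim.
print(model.predict(["free prize waiting for you", "reconciliation review at 3pm"]))
# Expected on this toy data: ['spam' 'ham']
```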