object that can be controlled and shared from a single information space.

      The development of artificial intelligence and the falling cost of the Internet of Things have taken the technology to a new level. Digital twins began to receive «clean» big data about the behaviour of real objects, and it became possible to predict equipment failures long before accidents occur. Although the latter claim is quite controversial, this area is developing actively.

      As a result, the digital twin is a synergy of 3D technologies, including augmented and virtual reality, artificial intelligence and the Internet of Things. It is a synthesis of several technologies and basic sciences.

      Digital twins themselves can be divided into four levels; a small sketch after the list illustrates this hierarchy.

      • The twin of an individual assembly unit simulates the most critical component: a specific bearing, motor brushes, a stator winding or a pump motor. In general, the part with the greatest risk of failure.

      • The twin of a unit simulates the operation of the entire unit, for example a gas turbine unit or the whole pump.

      • The production system twin simulates several assets linked together: a production line or an entire plant.

      • The process twin is no longer about «hardware» but about process modelling, for example when implementing MES or APS systems. We’ll talk about them in the next chapter.
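      To make these four levels more tangible, here is a minimal sketch in Python of how they could be represented as a nested hierarchy. All class and attribute names are illustrative assumptions for this book, not part of any particular digital twin platform.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class ComponentTwin:
    """Level 1: twin of a single critical assembly unit (e.g. a bearing)."""
    name: str
    failure_risk: float  # estimated probability of failure, from 0 to 1


@dataclass
class UnitTwin:
    """Level 2: twin of a whole unit (e.g. a gas turbine or a pump)."""
    name: str
    components: List[ComponentTwin] = field(default_factory=list)


@dataclass
class ProductionSystemTwin:
    """Level 3: several linked assets (a production line or a whole plant)."""
    name: str
    units: List[UnitTwin] = field(default_factory=list)


@dataclass
class ProcessTwin:
    """Level 4: models the process itself (e.g. scheduling in MES/APS)."""
    name: str
    systems: List[ProductionSystemTwin] = field(default_factory=list)


# Start from the riskiest component and grow the hierarchy around it.
bearing = ComponentTwin("drive-end bearing", failure_risk=0.3)
pump = UnitTwin("feed pump", components=[bearing])
line = ProductionSystemTwin("bottling line", units=[pump])
process = ProcessTwin("production scheduling", systems=[line])
```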

      What problems can digital twin technology solve?

      • It becomes possible to reduce the number of design changes and the associated costs at the stage of designing the equipment or plant, which significantly reduces costs at the remaining stages of the life cycle. It also helps avoid critical errors that can no longer be corrected at the operation stage.

      The sooner an error is detected, the cheaper it is to fix it

      In addition to the rising cost, there is less and less room for correcting errors as time goes on

      • By collecting, visualizing and analyzing data, it is possible to take preventive measures before serious accidents and equipment damage occur.

      • Optimize maintenance costs while increasing overall reliability. The ability to predict failures makes it possible to repair equipment based on its actual condition rather than on the «calendar». There is no need to keep a large stock of spare equipment, that is, to freeze working capital.

      Using digital twins in combination with big data and neural networks is the way from reporting and monitoring to predictive analytics and accident prevention systems

      • Build the most efficient operating modes and minimize production costs. The longer data is accumulated and the deeper the analytics, the more effective the optimization will be.

      It is very important not to confuse the types of forecasting. Lately, while working with the market of various IT solutions, I constantly see confusion between the concepts of predictive analytics and machine detection of anomalies in equipment operation. That is, vendors offering machine detection of deviations talk about introducing a new, predictive approach to organizing maintenance.

      On the one hand, neural networks really do work in both cases. With machine anomaly detection, the neural network also finds deviations, which makes it possible to carry out maintenance before a serious failure and to replace only the worn-out element.

      However, let’s take a closer look at the definition of predictive analytics.

      Predictive (or forecasting) analysis is a prediction based on historical data.

      So, it is the ability to predict equipment failures before an anomaly occurs, when operational parameters are still normal but trends towards deviation are already developing.

      To take a very everyday analogy, anomaly detection is when your blood pressure has already changed and you are warned about it before you get a headache or heart problems. Predictive analytics is when everything is still normal, but you have changed your diet, your sleep quality or something else, and the processes in your body will subsequently lead to increased pressure.

      As a result, the main differences are the depth of analysis, the skills required and the prediction horizon. Anomaly detection is a short-term prediction that helps to avoid a crisis. It does not require studying historical data over a long period, for example several years.

      A full-fledged predictive analysis is a long-term prediction. You get more time to make decisions and work out measures: plan the purchase of new equipment or spare parts, call in a repair team at a lower price, or change the operating mode of the equipment to prevent deviations.
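      The difference between the two approaches can be sketched in a few lines of Python. This is a deliberately simplified illustration (the sensor values, the limit and the linear trend model are my assumptions, not a ready-made method): anomaly detection reacts to readings that are already outside the norm, while predictive analytics fits a trend to historical data and estimates how much time is left before the limit is reached.

```python
import statistics


def detect_anomaly(recent_readings, limit):
    """Anomaly detection: a short-term check of the latest readings
    against the allowed operating limit."""
    return max(recent_readings) > limit


def days_until_limit(history, limit):
    """Predictive analytics: fit a simple linear trend to historical
    readings (one per day) and estimate when the limit will be crossed."""
    days = list(range(len(history)))
    mean_x, mean_y = statistics.mean(days), statistics.mean(history)
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(days, history)) \
        / sum((x - mean_x) ** 2 for x in days)
    if slope <= 0:
        return None  # no upward trend, nothing to predict
    return (limit - history[-1]) / slope  # days left at the current trend


# Bearing vibration level, mm/s, one averaged reading per day.
history = [2.0, 2.1, 2.1, 2.3, 2.4, 2.6, 2.7, 2.9]
LIMIT = 4.5

print(detect_anomaly(history[-3:], LIMIT))   # False: still within the norm
print(days_until_limit(history, LIMIT))      # ~12 days until the limit
```

      In this toy example the anomaly detector stays silent because nothing has gone out of range yet, while the trend already gives roughly two weeks of warning, which is exactly the extra decision-making time described above.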

      That is my view, although there may be alternative opinions, especially from marketers. The most important constraint I see at the moment is the complexity and cost of the technology. Creating mathematical models is long and expensive, and the risk of error is high. It requires combining technical knowledge about the object, practical experience, skills in modelling and visualization, and compliance with standards on real objects. Not every technical solution is justified, and not every company has all these competencies.

      So I think it is useful for industry to start with accident analysis, to identify the critical components of assets and to model them first. That is, to use an approach from the theory of constraints.

      This will, first, minimize the risk of errors. Second, it allows you to enter this field at a lower cost and to obtain results you can build on in the future. Third, it helps accumulate expertise in working with data, making decisions based on it and gradually «complicating» the models. Having your own data competence is one of the key conditions for successful digitalization.

      It is worth remembering that for now this is a new technology. According to the Gartner hype cycle, it still has to pass through the «trough of disillusionment». Later, when digital competencies become more common and neural networks more widespread, we will use digital twins to the full.

      Clouds, online analytics and remote control

      The concept of digital transformation involves the active use of clouds, online analytics, and remote-control capabilities.

      The National Institute of Standards and Technology (NIST) identifies the following characteristics of cloud computing (a small sketch after the list shows how they look from the consumer’s side):

      – on-demand self-service – the consumer determines his own needs: access speed, hardware performance and availability, and the amount of memory required;

      – access to resources from any device connected to the network – it does not matter which computer or smartphone the user logs on from, as long as it is connected to the Internet;

      – resource pooling – the supplier assembles the hardware to balance quickly between consumers; the consumer specifies what they need, while the distribution across specific machines is handled by the supplier;

      – flexibility – the consumer can change the range and scope of the required services at any time without unnecessary communication or negotiation with the supplier;

      – automatic metering of service consumption.
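      To show how these characteristics look in practice, here is a hypothetical Python sketch. The CloudProvider class and its methods are invented purely for illustration and do not correspond to any real provider’s API.

```python
from dataclasses import dataclass


@dataclass
class ResourceRequest:
    """On-demand self-service: the consumer states what they need,
    not which physical machines will provide it (resource pooling)."""
    cpu_cores: int
    memory_gb: int
    storage_gb: int


class CloudProvider:
    """Hypothetical provider used only to illustrate the NIST characteristics."""

    def __init__(self):
        self.allocated = None
        self.hours_used = 0.0

    def provision(self, request: ResourceRequest) -> None:
        # Access from any device: the request can come from anything online.
        self.allocated = request

    def rescale(self, request: ResourceRequest) -> None:
        # Flexibility: change the scope of services at any time.
        self.allocated = request

    def bill(self, price_per_core_hour: float) -> float:
        # Automatic metering: consumption is measured and billed automatically.
        return self.allocated.cpu_cores * self.hours_used * price_per_core_hour


provider = CloudProvider()
provider.provision(ResourceRequest(cpu_cores=4, memory_gb=16, storage_gb=100))
provider.rescale(ResourceRequest(cpu_cores=8, memory_gb=32, storage_gb=100))
provider.hours_used = 10.0
print(provider.bill(price_per_core_hour=0.05))  # 8 cores * 10 h * 0.05 = 4.0
```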

      However, what are the benefits of the cloud for business?

      – The ability not to «freeze» resources by investing in fixed assets and future expenses (repair, upgrade and modernization). This simplifies accounting and tax work and allows resources to be directed to development. The key point: you can increase the number of digital tools without constantly purchasing server hardware and storage systems.

      – Savings on the payroll fund (salaries plus taxes for expensive specialists