Introduction
The great promises that AI brings must be met with an equally great human responsibility.
The digital world is characterized by its immediacy, its density of information and its omnipresence, in contrast to the concrete world of things. Now, with the multiplication of means of connection, the decrease in technology costs, and new capacities for data collection and algorithmic processing, we can give expression to elements of our environment that until now remained silent. We are witnessing the multifaceted development of new information and communication technologies (NICTs), illustrated by the emergence of technologies associated with Big Data; connected objects; algorithms; nanotechnology, biotechnology, information technology, and cognitive science (NBIC); blockchain; artificial intelligence (AI); virtual and augmented reality; and even quantum computing. AI is developing at an extremely rapid pace, and we should expect significant changes in our society as AI systems become embedded in many aspects of our lives.
This multifaceted digital phenomenon is bringing different universes together by adding the speed, intelligence, and ubiquity of digital technology to the objects associated with these NICTs. Major developments related to AI in healthcare, autonomous vehicles, cybersecurity, education, and home and service robots are improving the quality and comfort of our lives every day. AI is now fundamental to addressing many of the major challenges facing humanity, such as climate change, global health and well-being, natural resource development, and reliable and sustainable legal and democratic systems. This technology is, therefore, changing the way we live, consume, function and work. This is illustrated by a break with the past in each person's relationship with those around them. These new interactions force us to rethink every human activity. This is the beginning of a silent but very real revolution happening right before our eyes: a new era of change and disruption in which survival inevitably requires reactivity, adaptability, creativity and, therefore, innovation.
Consequently, this technoscientific context is conducive to the development of an increasingly important international cultural and intellectual movement, namely transhumanism, whose objective is to improve the physical and mental characteristics of human beings by relying on biotechnologies and other emerging technologies. This current of thought considers that certain states of the human condition, such as illness, disability, pain, aging and death, are not inevitable and can be corrected or even eliminated.
Thus, technological revolutions have enabled a change of scale in the exploitation of digital data, particularly in the field of genetics. Data can now be produced in large quantities, with ever-increasing precision, and preserved indefinitely. Advances in computer science have made it possible, through the creation of specific programs, for databases to be interoperable, thus allowing the fusion of data from many and varied sources. To this, we can add the development of new ways of accessing data, in particular through the multiplication of data sources of all kinds. Crowdsourcing is becoming one of the new mechanisms allowing easy, real-time access to digital data for the development of research (Khare et al. 2015).
ALGORITHMIC PROCESSING.–
Algorithmic processing executes a finite and unambiguous sequence of operations or instructions to solve a problem or obtain a result. Algorithms are found today in many applications, such as the operation of computers, cryptography, information routing, task planning and the optimal use of resources, image processing, word processing and so on. An algorithm is a general method for solving a class of problems. It is said to be correct when, for each instance of the problem, it terminates and produces the correct output, i.e. it solves the problem.
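As a minimal illustration of these properties, Euclid's algorithm for the greatest common divisor is a finite, unambiguous sequence of steps that terminates and is correct for every instance. The sketch below, in Python, is our own illustration and not drawn from the text:

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: a finite, unambiguous sequence of
    operations that ends by producing the correct result."""
    while b != 0:
        # Each step strictly decreases b, which guarantees termination.
        a, b = b, a % b
    return a

print(gcd(48, 18))  # 6
```

Correctness here can be argued for every instance: the loop invariant is that gcd(a, b) never changes, and the loop must end because b shrinks at each step.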
BIG DATA.–
Big Data, or megadata, sometimes referred to as massive data, refers to data sets that become so large that they are difficult to exploit with traditional database or information management tools. The term Big Data also refers to a new discipline at the crossroads of several sectors such as technology, statistics, databases and business (marketing, finance, health, human resources, etc.). This phenomenon can be defined according to seven characteristics, the 7Vs: volume, variety, velocity, veracity, visualization, variability and value.
BLOCKCHAIN.–
A blockchain is a chain of computer blocks protected against any modification, each block containing the identifier (hash) of its predecessor. The blockchain records a set of data such as a date, a cryptographic signature associated with the sender and a whole set of other specific elements. All these exchanges can be traced, consulted and downloaded free of charge on the Internet by anyone who wishes to check, in real time, the validity and non-falsification of the database. The major advantage of this device is the ability to store a proof of information with each transaction, in order to prove later, at any moment, the existence and content of this original information at a given time. Its mission is, therefore, to create trust by making a digital asset or database auditable through a shared protocol.
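The chaining mechanism described above, where each block stores the hash of its predecessor so that any modification becomes detectable, can be sketched in a few lines of Python. This is a simplified illustration only (real blockchains add consensus, timestamps and signatures), and all names are our own:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Hash the block's canonical JSON form (sort_keys makes it deterministic).
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def new_block(chain: list, data: str) -> dict:
    # Each block records a payload and the identifier (hash) of its predecessor.
    prev = block_hash(chain[-1]) if chain else "0" * 64
    return {"index": len(chain), "data": data, "prev_hash": prev}

def is_valid(chain: list) -> bool:
    # The chain is auditable: every stored predecessor hash must still match.
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
chain.append(new_block(chain, "genesis"))
chain.append(new_block(chain, "payment: A -> B"))
print(is_valid(chain))       # True
chain[0]["data"] = "tampered"
print(is_valid(chain))       # False: any modification breaks the link
```

Because block 1 stores the hash of block 0's original content, altering block 0 after the fact invalidates the whole chain, which is exactly the proof-of-non-falsification property described above.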
CROWDSOURCING.–
A practice that consists of appealing to the general public or consumers to propose and create elements of the marketing policy (brand choice, slogan creation, video creation, product ideation/co-creation, etc.) or even to carry out marketing services. Within the framework of crowdsourcing, professional or amateur contributors can then be rewarded, remunerated or sometimes merely recognized when their creations are chosen by the advertiser, or sometimes simply for their participation effort. Crowdsourcing has developed especially with the Internet, which makes it easy to solicit consumers or freelancers through specialized platforms.
AI appears as an essential evolution in the processing of digital information. It represents the part of computing dedicated to the automation of intelligent behaviors. This approach is the search for ways to endow computer systems with intellectual capacities comparable to those of human beings. AI must be capable of learning, adapting and changing its behavior.
The idea of building autonomous machines probably dates back to Greek antiquity, with the automatons built by Hephaestus reported notably in the Iliad (Marcinkowski and Wilgaux 2004). For Brian Krzanich, President and CEO of Intel (the world’s leading microprocessor manufacturer), AI is not only the next tidal wave in computing, but also the next major turning point in the history of humankind. It does not behave like a classic computer program: it is trained rather than programmed. It is clear that the public debate around AI has mixed fantasy, science fiction and long-term futurology, often losing sight of even its basic definitions.
Thus, the concept of AI is to develop computer programs capable of performing tasks normally carried out by humans, tasks requiring learning, memory organization and reasoning. The objective is to give machines notions of rationality and reasoning, as well as perception functions (e.g. visual) to control a robot in an unknown environment. Its popularity is associated with new techniques, such as deep learning, which gives a program the ability to learn to represent the world thanks to a network of virtual neurons, each performing an elementary calculation, in a way similar to our brain.
DEEP LEARNING.–
This algorithmic approach has been used for more than 20 years, in the form of neural networks, in particular to “learn”. A neuron is a simple function that takes different inputs, computes a result and sends it on to different outputs. These neurons are structured and organized in layers: the first layer works on almost raw data, and the last layer generates the result. The more layers there are, the greater the learning and performance capacity. A classic example is the recognition of handwritten characters.
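A minimal sketch of such a layered network is given below, assuming a sigmoid activation and randomly initialized weights. The layer sizes (4 inputs, a hidden layer of 3 neurons, 2 outputs) and all names are illustrative assumptions, not taken from the text; a trained network for handwriting recognition would learn its weights from example data rather than drawing them at random:

```python
import math
import random

def neuron(inputs, weights, bias):
    # A neuron: a simple function that combines its inputs into one result.
    s = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-s))  # sigmoid activation

def layer(inputs, weight_matrix, biases):
    # A layer applies each of its neurons to the same inputs.
    return [neuron(inputs, w, b) for w, b in zip(weight_matrix, biases)]

def rand_layer(n_in, n_out):
    # Random weights stand in for what training would normally learn.
    return ([[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_out)],
            [random.uniform(-1, 1) for _ in range(n_out)])

random.seed(0)
w1, b1 = rand_layer(4, 3)   # first layer: works on almost raw data
w2, b2 = rand_layer(3, 2)   # last layer: generates the result

x = [0.2, 0.7, 0.1, 0.9]    # almost raw input data
hidden = layer(x, w1, b1)   # intermediate representation
output = layer(hidden, w2, b2)
print(len(output))          # 2
```

Stacking more such layers between the input and the output is what the text means by greater depth bringing greater learning capacity: each additional layer can build a richer representation from the previous one.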