In particular, we consider AI‐based algorithms for traffic classification, traffic routing, congestion control, resource management, fault management, Quality of Service (QoS) and Quality of Experience (QoE) management, network security, ML for caching in small cell networks, Q‐learning‐based joint channel and power level selection in heterogeneous cellular networks, stochastic non‐cooperative games, multi‐agent Q‐learning, ML for self‐organizing cellular networks, learning in self‐configuration, RL for SON coordination, the SON function model, RL‐based caching, the system model and optimality conditions, big data analytics in wireless networks, the evolution of analytics, data‐driven network optimization, GNNs, network virtualization, GNN‐based dynamic resource management, deep reinforcement learning (DRL) for multi‐operator network slicing, game equilibria by DRL, deep Q‐learning for latency‐limited network virtualization, DRL for dynamic VNF migration, the multi‐armed bandit estimator (MBE), and network representation learning.
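To give a flavor of the Q‐learning‐based channel and power level selection discussed above, the following is a minimal sketch of tabular Q‐learning with epsilon‐greedy exploration on a toy single‐state model. The action space, reward table, and hyperparameters here are hypothetical illustrations, not the formulation used in the chapter.

```python
import numpy as np

rng = np.random.default_rng(0)

channels, power_levels = 3, 2            # hypothetical action space
n_actions = channels * power_levels
# Hypothetical mean SINR-like reward for each (channel, power) pair.
mean_reward = np.array([0.2, 0.5, 0.9, 0.4, 0.1, 0.6])

q = np.zeros(n_actions)                  # Q-table (single state)
alpha, epsilon = 0.1, 0.2                # learning rate, exploration rate

for _ in range(5000):
    # Epsilon-greedy action selection.
    if rng.random() < epsilon:
        a = int(rng.integers(n_actions))
    else:
        a = int(np.argmax(q))
    r = mean_reward[a] + 0.05 * rng.normal()   # noisy reward observation
    q[a] += alpha * (r - q[a])                 # Q-update (no next state here)

best = int(np.argmax(q))
print("best (channel, power):", divmod(best, power_levels))
```

In the full heterogeneous‐network setting of the chapter, the Q‐table would be indexed by state as well as action, and the update would include a discounted estimate of the next state's value.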
Chapter 8 (Fundamentals of Quantum Communications): During the last few years, the research community has turned its attention to quantum computing [32–36] with the objective of combining it with classical communications in order to achieve certain performance targets, such as throughput, round‐trip delay, and reliability, at low computational complexity. As we will discuss in more detail in this chapter, numerous optimization problems in wireless communications systems may be solved with a reduced number of cost function evaluations (CFEs) by employing quantum algorithms. Although we do not attempt to cover the problems of quantum computer design itself, in this chapter we will discuss the basics of QC technology in order to better understand how this technology can enable significant improvements in the design and optimization of communication networks. These fundamentals include discussions of the qubit system, the algebraic representation of quantum states, entanglement, the geometrical (2D, 3D) representation of quantum states, quantum logic gates, tensor computing, the Hadamard operator H, and the Pauli and Toffoli gates.
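The algebraic representation and tensor computing mentioned above can be sketched in a few lines: states as complex vectors, gates as unitary matrices, and multi‐qubit states built with the Kronecker product. This is a standard textbook illustration, not code from the chapter.

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)   # |0>
ket1 = np.array([0, 1], dtype=complex)   # |1>

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard
X = np.array([[0, 1], [1, 0]], dtype=complex)                # Pauli-X

plus = H @ ket0                      # equal superposition (|0> + |1>)/sqrt(2)
print(np.abs(plus) ** 2)             # measurement probabilities: [0.5, 0.5]

# Tensor computing: a two-qubit state via the Kronecker product, and an
# entangled Bell state produced by a Hadamard followed by a CNOT.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
bell = CNOT @ np.kron(plus, ket0)    # (|00> + |11>)/sqrt(2)
print(np.round(np.abs(bell) ** 2, 3))  # only |00> and |11> are observed
```

The Bell state's outcome probabilities (0.5 on |00> and |11>, zero elsewhere) cannot be factored into independent single‐qubit distributions, which is precisely the entanglement the chapter introduces.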
Chapter 9 (Quantum Channel Information Theory): Quantum information processing exploits the quantum nature of information. It offers fundamentally new solutions in the field of computer science and extends the possibilities to a level that cannot be imagined in classical communication systems. For quantum communication channels, many new capacity definitions have been developed in analogy with their classical counterparts. A quantum channel can be used to transmit classical information or to deliver quantum information, such as quantum entanglement. In this chapter, we review the properties of the quantum communication channel, the various capacity measures, and the fundamental differences between classical and quantum channels [37–43]. Specifically, we will discuss the privacy and performance gains of quantum channels, the quantum channel map and its formal model, quantum channel capacity, the classical capacities of a quantum channel, the quantum capacity of a quantum channel, and capacities and practical implementations of quantum channels.
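As a concrete instance of a quantum channel map, the following sketch applies the depolarizing channel to a density matrix in its Kraus (operator‐sum) form. The depolarizing channel is chosen here purely as a familiar textbook example; the chapter treats channel maps and their capacities in general.

```python
import numpy as np

# Pauli operators.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def depolarize(rho, p):
    """Operator-sum form: rho -> (1-p) rho + (p/3)(X rho X + Y rho Y + Z rho Z)."""
    return (1 - p) * rho + (p / 3) * (X @ rho @ X + Y @ rho @ Y + Z @ rho @ Z)

rho = np.array([[1, 0], [0, 0]], dtype=complex)   # pure input state |0><0|
out = depolarize(rho, 0.3)
print(np.round(out.real, 3))                      # diagonal mixture
print("trace:", np.trace(out).real)               # trace is preserved
```

The output is a mixed state (here diag(0.8, 0.2)) with unit trace: the map is completely positive and trace‐preserving, the defining property of the formal channel model reviewed in the chapter.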
Chapter 10 (Quantum Error Correction): The challenge in creating quantum error correction codes lies in finding commuting sets of stabilizers that enable errors to be detected without disturbing the encoded information. Finding such sets is nontrivial, and special code constructions are required to find stabilizers with the desired properties. We will start this chapter by discussing how a code can be constructed by concatenating two smaller codes. Other constructions include methods for repurposing classical codes to obtain commuting stabilizer checks [44–47]. Here, we will outline a construction known as the surface code [48, 49]. The realization of a surface code logical qubit is a key goal for many quantum computing hardware efforts [50–54]. Surface codes belong to a broader family of so‐called topological codes [55]. Within this framework, in this chapter we will discuss stabilizer codes, surface codes, the rotated lattice, fault‐tolerant gates, fault tolerance, the theoretical framework, classical error correction, and the theory of quantum error correction, in addition to some auxiliary material on binary fields, discrete vector spaces, and noise physics.
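The classical error‐correction background the chapter builds on can be illustrated with the three‐bit repetition code: its two parity checks locate a single bit flip without revealing the encoded value, which is exactly the role commuting stabilizer measurements play in the quantum case. This is a classical analogy only, not the surface‐code construction itself.

```python
import numpy as np

def encode(bit):
    """Three-bit repetition code: b -> (b, b, b)."""
    return np.array([bit, bit, bit])

def syndrome(word):
    # Two parity checks, s1 = b0 XOR b1 and s2 = b1 XOR b2. Their pattern
    # identifies which single bit flipped, without exposing the data bit.
    return (int(word[0] ^ word[1]), int(word[1] ^ word[2]))

def decode(word):
    return int(np.sum(word) >= 2)   # majority vote

word = encode(1)
word[2] ^= 1                        # inject a single bit-flip error
print("syndrome:", syndrome(word))  # (0, 1): only the second check fires
print("decoded:", decode(word))     # the original bit is recovered
```

In the stabilizer formalism, the same idea is lifted to commuting multi‐qubit Pauli operators, and both bit‐flip (X) and phase‐flip (Z) errors must be checked simultaneously.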
Chapter 11 (Quantum Search Algorithms): The appetite for faster, more reliable, greener, and more secure communications continues to grow. The state‐of‐the‐art methods conceived for achieving the performance targets of the associated processes may be accompanied by an increase in computational complexity. Alternatively, degraded performance may have to be accepted due to the lack of jointly optimized system components. In this chapter, we investigate the employment of quantum computing for solving problems in wireless communication systems. By exploiting the inherent parallelism of quantum computing, quantum algorithms may be invoked for approaching the optimal performance of classical wireless processes, while requiring a reduced number of CFEs. Having discussed the basics of quantum computing using linear algebra in Chapter 8, here we present the operation of the major quantum algorithms that have been proposed in the literature for improving wireless communications systems. Furthermore, in the following chapters, we will investigate a number of optimization problems encountered in both the physical and network layers of wireless communications, while comparing their classical and quantum‐assisted solutions. More specifically, in this chapter we will discuss the following quantum search algorithms (QSAs) for wireless communications: the Deutsch algorithm, the Deutsch–Jozsa algorithm, Simon's algorithm, Shor's algorithm, the quantum phase estimation algorithm, Grover's QSA, the Boyer–Brassard–Høyer–Tapp QSA, the Dürr–Høyer QSA, the quantum counting algorithm, the quantum heuristic algorithm, the quantum genetic algorithm, the Harrow–Hassidim–Lloyd algorithm, the quantum mean algorithm, and the quantum‐weighted sum algorithm.
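The core of Grover's QSA can be sketched as a statevector simulation. For a database of N = 4 items (two qubits), a single Grover iteration, an oracle phase flip followed by inversion about the mean, finds the marked item with certainty; the marked index below is an arbitrary choice for illustration.

```python
import numpy as np

N, marked = 4, 2
state = np.full(N, 1 / np.sqrt(N))          # uniform superposition (Hadamards)

oracle = np.eye(N)
oracle[marked, marked] = -1                 # phase-flip the marked item

s = np.full((N, 1), 1 / np.sqrt(N))
diffusion = 2 * (s @ s.T) - np.eye(N)       # inversion about the mean

state = diffusion @ (oracle @ state)        # one Grover iteration
probs = np.abs(state) ** 2
print(np.round(probs, 3))                   # all probability on the marked item
```

For general N, roughly (pi/4) * sqrt(N) iterations are needed, which is the quadratic reduction in CFEs that the chapter exploits for wireless optimization problems.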
Chapter 12 (Quantum Machine Learning): In this chapter, we provide a brief description of quantum machine learning (QML) and its correlation with AI. We will see how the quantum counterpart of ML can be much faster and more efficient than classical ML. Training the machine to learn from the algorithms implemented to handle data is the core of ML. This field of computer science and statistics employs AI and computational statistics. Classical ML, through its deep learning subsets (supervised and unsupervised), helps to classify images, recognize patterns and speech, handle big data, and much more. Thus, classical ML has received a lot of attention and investment from industry. Nowadays, due to the huge quantities of data with which we deal every day, new approaches are needed to automatically manage, organize, and classify these data. Classical ML, which is a flexible and adaptable procedure, can recognize patterns efficiently, but some of these problems cannot be solved efficiently by its algorithms. Companies engaged in big database management are aware of these limitations and are very interested in new approaches to overcome them; this interest in implementing such techniques on quantum computers is what paves the way for QML. QML [56–59] aims to implement ML algorithms in quantum systems by using quantum properties such as superposition and entanglement to solve these problems efficiently. This gives QML an edge over classical ML techniques in terms of speed and data handling. In QML, we develop quantum algorithms that realize the operations of classical algorithms on a quantum computer. Thus, data can be classified, sorted, and analyzed using the quantum algorithms of supervised and unsupervised learning methods. These methods are in turn implemented through models such as quantum neural networks or support vector machines.
This is the point where we merge the algorithms discussed in Parts I and II of this book. In particular, we will discuss QML algorithms, quantum neural network preliminaries, quantum classifiers with ML: near‐term solutions, the circuit‐centric quantum classifier, training, gradients of parameterized quantum gates, classification with quantum neural networks, representation learning, the quantum decision tree classifier, and the model of the classifier, in addition to some auxiliary material on the matrix exponential.
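The "gradients of parameterized quantum gates" topic above can be sketched with the parameter‐shift rule on a one‐qubit circuit RY(theta)|0>, taking a Pauli‐Z expectation as a stand‐in for a classifier output. The circuit and the role assigned to the expectation value are illustrative assumptions, not the chapter's specific model.

```python
import numpy as np

def expectation(theta):
    # RY(theta)|0> = cos(theta/2)|0> + sin(theta/2)|1>
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    Z = np.diag([1.0, -1.0])
    return psi @ Z @ psi                 # <psi|Z|psi> = cos(theta)

theta = 0.7
# Parameter-shift rule: the exact gradient from two shifted circuit
# evaluations, with no finite-difference approximation error.
grad = 0.5 * (expectation(theta + np.pi / 2) - expectation(theta - np.pi / 2))
print(grad, -np.sin(theta))              # the two values agree
```

Because the shifted evaluations are themselves just circuit runs, the same rule lets a classical optimizer train circuit‐centric classifiers on quantum hardware, where analytic derivatives of the state are not directly accessible.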