Table of Contents
Cover
Preface
1 Introduction
   1.1 Data Science: Statistics, Probability, Calculus … Python (or Perl) and Linux
   1.2 Informatics and Data Analytics
   1.3 FSA-Based Signal Acquisition and Bioinformatics
   1.4 Feature Extraction and Language Analytics
   1.5 Feature Extraction and Gene Structure Identification
   1.6 Theoretical Foundations for Learning
   1.7 Classification and Clustering
   1.8 Search
   1.9 Stochastic Sequential Analysis (SSA) Protocol (Deep Learning Without NNs)
   1.10 Deep Learning Using Neural Nets
   1.11 Mathematical Specifics and Computational Implementations
2 Probabilistic Reasoning and Bioinformatics
   2.1 Python Shell Scripting
   2.2 Counting, the Enumeration Problem, and Statistics
   2.3 From Counts to Frequencies to Probabilities
   2.4 Identifying Emergent/Convergent Statistics and Anomalous Statistics
   2.5 Statistics, Conditional Probability, and Bayes' Rule
   2.6 Emergent Distributions and Series
   2.7 Exercises
3 Information Entropy and Statistical Measures
   3.1 Shannon Entropy, Relative Entropy, Maxent, Mutual Information
   3.2 Codon Discovery from Mutual Information Anomaly
   3.3 ORF Discovery from Long-Tail Distribution Anomaly
   3.4 Sequential Processes and Markov Models
   3.5 Exercises
4 Ad Hoc, Ab Initio, and Bootstrap Signal Acquisition Methods
   4.1 Signal Acquisition, or Scanning, at Linear Order Time-Complexity
   4.2 Genome Analytics: The Gene-Finder
   4.3 Objective Performance Evaluation: Sensitivity and Specificity
   4.4 Signal Analytics: The Time-Domain Finite State Automaton (tFSA)
   4.5 Signal Statistics (Fast): Mean, Variance, and Boxcar Filter
   4.6 Signal Spectrum: Nyquist Criterion, Gabor Limit, Power Spectrum
   4.7 Exercises
5 Text Analytics
   5.1 Words
   5.2 Phrases – Short (Three Words)
   5.3 Phrases – Long (A Line or Sentence)
   5.4 Exercises
6 Analysis of Sequential Data Using HMMs
   6.1 Hidden Markov Models (HMMs)
   6.2 Graphical Models for Markov Models and Hidden Markov Models
   6.3 Standard HMM Weaknesses and Their GHMM Fixes
   6.4 Generalized HMMs (GHMMs – “Gems”): Minor Viterbi Variants
   6.5 HMM Implementation for Viterbi (in C and Perl)
   6.6 Exercises
7 Generalized HMMs (GHMMs)
   7.1 GHMMs: Maximal Clique for Viterbi and Baum–Welch
   7.2 GHMMs: Full Duration Model
   7.3 GHMMs: Linear Memory Baum–Welch Algorithm
   7.4 GHMMs: Distributable Viterbi and Baum–Welch Algorithms
   7.5 Martingales and the Feasibility of Statistical Learning (further details in Appendix)
   7.6 Exercises
8 Neuromanifolds and the Uniqueness of Relative Entropy
   8.1 Overview
   8.2 Review of Differential Geometry [206, 207]
   8.3 Amari’s Dually Flat Formulation [113–115]
   8.4 Neuromanifolds [113–115]
   8.5 Exercises
9