Contents

Preface

Chapter 1: Introduction to Natural Language Processing
    What is Natural Language Processing?
    Tasks of Natural Language Processing
    The traditional approach to Natural Language Processing
        Understanding the traditional approach
            Example - generating football game summaries
        Drawbacks of the traditional approach
    The deep learning approach to Natural Language Processing
        History of deep learning
        The current state of deep learning and NLP
        Understanding a simple deep model - a Fully-Connected Neural Network
    The roadmap - beyond this chapter
    Introduction to the technical tools
        Description of the tools
        Installing Python and scikit-learn
        Installing Jupyter Notebook
        Installing TensorFlow
    Summary

Chapter 2: Understanding TensorFlow
    What is TensorFlow?
        Getting started with TensorFlow
        TensorFlow client in detail
        TensorFlow architecture - what happens when you execute the client?
        Cafe Le TensorFlow - understanding TensorFlow with an analogy
    Inputs, variables, outputs, and operations
        Defining inputs in TensorFlow
            Feeding data with Python code
            Preloading and storing data as tensors
            Building an input pipeline
        Defining variables in TensorFlow
        Defining TensorFlow outputs
        Defining TensorFlow operations
            Comparison operations
            Mathematical operations
            Scatter and gather operations
            Neural network-related operations
        Reusing variables with scoping
    Implementing our first neural network
        Preparing the data
        Defining the TensorFlow graph
        Running the neural network
    Summary

Chapter 3: Word2vec - Learning Word Embeddings
    What is a word representation or meaning?
    Classical approaches to learning word representation
        WordNet - using an external lexical knowledge base for learning word representations
            Tour of WordNet
            Problems with WordNet
        One-hot encoded representation
        The TF-IDF method
        Co-occurrence matrix
    Word2vec - a neural network-based approach to learning word representation
        Exercise: is queen = king - he + she?
        Designing a loss function for learning word embeddings
    The skip-gram algorithm
        From raw text to structured data
        Learning the word embeddings with a neural network
        Formulating a practical loss function
        Efficiently approximating the loss function
        Implementing skip-gram with TensorFlow
    The Continuous Bag-of-Words algorithm
        Implementing CBOW in TensorFlow
    Summary

Chapter 4: Advanced Word2vec
    The original skip-gram algorithm
        Implementing the original skip-gram algorithm
    ……

Chapter 5: Sentence Classification with Convolutional Neural Networks
Chapter 6: Recurrent Neural Networks
Chapter 7: Long Short-Term Memory Networks
Chapter 8: Applications of LSTM - Generating Text
Chapter 9: Applications of LSTM - Image Caption Generation
Chapter 10: Sequence-to-Sequence Learning - Neural Machine Translation
Chapter 11: Current Trends and the Future of Natural Language Processing
Appendix: Mathematical Foundations and Advanced TensorFlow
Other Books You May Enjoy
Index