Hands-on Question Answering Systems with BERT


Book Description

Get hands-on knowledge of how BERT (Bidirectional Encoder Representations from Transformers) can be used to develop question answering (QA) systems by using natural language processing (NLP) and deep learning. The book begins with an overview of the technology landscape behind BERT. It takes you through the basics of NLP, including natural language understanding with tokenization, stemming, lemmatization, and bag of words. Next, you'll look at neural networks for NLP, starting with variants such as recurrent neural networks, encoders and decoders, bi-directional encoders and decoders, and transformer models. Along the way, you'll cover word embeddings and their types along with the basics of BERT. After this solid foundation, you'll be ready to take a deep dive into BERT algorithms such as masked language modeling and next sentence prediction. You'll see different BERT variations, followed by a hands-on example of a question answering system. Hands-on Question Answering Systems with BERT is a good starting point for developers and data scientists who want to develop and design NLP systems using BERT. It provides step-by-step guidance for using BERT.

What You Will Learn
- Examine the fundamentals of word embeddings
- Apply neural networks and BERT for various NLP tasks
- Develop a question-answering system from scratch
- Train question-answering systems for your own data

Who This Book Is For
AI and machine learning developers and natural language processing developers.
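The NLP basics the book opens with, tokenization and the bag-of-words representation, can be sketched in a few lines of plain Python. This is a simplified illustration under our own assumptions, not code from the book:

```python
from collections import Counter
import re

def tokenize(text):
    # Lowercase and split on non-alphanumeric runs (a deliberately crude tokenizer)
    return [t for t in re.split(r"\W+", text.lower()) if t]

def bag_of_words(texts):
    # Build a shared vocabulary, then count term occurrences per document
    vocab = sorted({tok for t in texts for tok in tokenize(t)})
    vectors = []
    for t in texts:
        counts = Counter(tokenize(t))
        vectors.append([counts[w] for w in vocab])
    return vocab, vectors

vocab, vecs = bag_of_words(["BERT answers questions", "Questions need answers"])
print(vocab)  # shared vocabulary, alphabetically sorted
print(vecs)   # one count vector per document
```

Real systems replace the regex split with subword tokenization (e.g. WordPiece, which BERT uses), but the counting idea is the same.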




Proceedings of Third Doctoral Symposium on Computational Intelligence


Book Description

This book features high-quality research papers presented at the Third Doctoral Symposium on Computational Intelligence (DoSCI 2022), organized by the Institute of Engineering and Technology (IET), AKTU, Lucknow, India, on March 5, 2022. This book discusses topics such as computational intelligence, artificial intelligence, deep learning, evolutionary algorithms, swarm intelligence, fuzzy sets and vague sets, rough set theoretic approaches, quantum inspired computational intelligence, hybrid computational intelligence, machine learning, computer vision, soft computing, distributed computing, parallel and grid computing, cloud computing, high performance computing, biomedical computing, and decision support and decision making.




Getting Started with Google BERT


Book Description

Kickstart your NLP journey by exploring BERT and its variants such as ALBERT, RoBERTa, DistilBERT, VideoBERT, and more with Hugging Face's transformers library.

Key Features
- Explore the encoder and decoder of the transformer model
- Become well-versed with BERT along with ALBERT, RoBERTa, and DistilBERT
- Discover how to pre-train and fine-tune BERT models for several NLP tasks

BERT (Bidirectional Encoder Representations from Transformers) has revolutionized the world of natural language processing (NLP) with promising results. This book is an introductory guide that will help you get to grips with Google's BERT architecture. With a detailed explanation of the transformer architecture, this book will help you understand how the transformer's encoder and decoder work. You'll explore the BERT architecture by learning how the BERT model is pre-trained and how to use pre-trained BERT for downstream tasks by fine-tuning it for NLP tasks such as sentiment analysis and text summarization with the Hugging Face transformers library. As you advance, you'll learn about different variants of BERT such as ALBERT, RoBERTa, and ELECTRA, and look at SpanBERT, which is used for NLP tasks like question answering. You'll also cover simpler and faster BERT variants based on knowledge distillation, such as DistilBERT and TinyBERT. The book takes you through MBERT, XLM, and XLM-R in detail and then introduces you to Sentence-BERT, which is used for obtaining sentence representations. Finally, you'll discover domain-specific BERT models such as BioBERT and ClinicalBERT, and an interesting variant called VideoBERT. By the end of this BERT book, you'll be well-versed with using BERT and its variants for performing practical NLP tasks.
What You Will Learn
- Understand the transformer model from the ground up
- Find out how BERT works and pre-train it using masked language model (MLM) and next sentence prediction (NSP) tasks
- Get hands-on with BERT by learning to generate contextual word and sentence embeddings
- Fine-tune BERT for downstream tasks
- Get to grips with ALBERT, RoBERTa, ELECTRA, and SpanBERT models
- Get the hang of the BERT models based on knowledge distillation
- Understand cross-lingual models such as XLM and XLM-R
- Explore Sentence-BERT, VideoBERT, and BART

Who This Book Is For
This book is for NLP professionals and data scientists looking to simplify NLP tasks to enable efficient language understanding using BERT. A basic understanding of NLP concepts and deep learning is required to get the best out of this book.
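The masked language model (MLM) pre-training task mentioned above follows a well-known recipe from the original BERT paper: roughly 15% of input tokens are selected for prediction, and of those, 80% are replaced with a [MASK] token, 10% with a random token, and 10% left unchanged. A minimal sketch of that masking step (illustrative only, not code from the book):

```python
import random

def mask_tokens(tokens, vocab, mask_rate=0.15, rng=None):
    """Apply BERT-style MLM masking. Returns (masked_tokens, labels);
    labels are None at positions the model is not asked to predict."""
    rng = rng or random.Random()
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_rate:
            labels.append(tok)                    # model must recover the original token
            r = rng.random()
            if r < 0.8:
                masked.append("[MASK]")           # 80%: replace with the mask token
            elif r < 0.9:
                masked.append(rng.choice(vocab))  # 10%: replace with a random token
            else:
                masked.append(tok)                # 10%: keep the token unchanged
        else:
            masked.append(tok)
            labels.append(None)
    return masked, labels

tokens = "the cat sat on the mat".split()
masked, labels = mask_tokens(tokens, vocab=tokens, rng=random.Random(0))
```

In practice this operates on WordPiece subword IDs inside a tensor pipeline, but the selection logic is the same.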










Research Anthology on Implementing Sentiment Analysis Across Multiple Disciplines


Book Description

The rise of internet and social media usage over the past two decades has given many industries and fields a very useful tool. With much of the world's population writing their opinions on various products and services in public online forums, industries can collect this data through various computational tools and methods. These tools and methods, however, are still being perfected in both collection and implementation. Sentiment analysis can be used across many industries and for many purposes, potentially improving business performance and even society. The Research Anthology on Implementing Sentiment Analysis Across Multiple Disciplines discusses the tools, methodologies, applications, and implementation of sentiment analysis across various disciplines and industries such as the pharmaceutical industry, government, and the tourism industry. It further presents emerging technologies and developments within the field of sentiment analysis and opinion mining. Covering topics such as electronic word of mouth (eWOM), public security, and user similarity, this major reference work is a comprehensive resource for computer scientists, IT professionals, AI scientists, business leaders and managers, marketers, advertising agencies, public administrators, government officials, university administrators, libraries, students and faculty of higher education, researchers, and academicians.
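As a concrete illustration of the simplest computational tool in this space, lexicon-based sentiment analysis scores text by summing word polarities. The tiny lexicon and scoring rule below are invented for illustration; production systems use large curated lexicons or trained models:

```python
# Toy polarity lexicon (hypothetical); real systems use large curated resources
LEXICON = {"great": 1, "useful": 1, "love": 1, "bad": -1, "poor": -1, "hate": -1}

def sentiment_score(text):
    # Sum word polarities; a positive total suggests positive sentiment
    words = text.lower().split()
    return sum(LEXICON.get(w, 0) for w in words)

score = sentiment_score("love this great product")  # two positive hits
```

This naive approach misses negation and context ("not great" scores positive), which is exactly the gap the machine learning methods surveyed in the anthology aim to close.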




Digital Libraries for Open Knowledge


Book Description

This book constitutes the proceedings of the 24th International Conference on Theory and Practice of Digital Libraries, TPDL 2020, held in Lyon, France, in August 2020.* The 14 full papers and 4 short papers presented were carefully reviewed and selected from 53 submissions. TPDL 2020 aimed to facilitate connections and convergence between diverse research communities, such as Digital Humanities and Information Sciences, that could benefit from the ecosystems offered by digital libraries and repositories. The papers cover a wide range of topics: knowledge graphs and linked data; quality assurance in digital libraries; ontology design; user requirements and behavior; research data management and discovery; and digital cultural heritage. * The conference was held virtually due to the COVID-19 pandemic.




Recent Advances in Information and Communication Technology 2021


Book Description

This book contains the proceedings of the 17th International Conference on Computing and Information Technology (IC2IT2021), held during May 13–14, 2021, in Bangkok, Thailand. The research contributions cover machine learning, natural language processing, image processing, intelligent systems and algorithms, and network and cloud computing. Together they point to major research directions for emerging information technology and innovation, reflecting digital disruption around the world.




Machine Learning for Text


Book Description

This second edition textbook covers a coherently organized framework for text analytics, which integrates material drawn from the intersecting topics of information retrieval, machine learning, and natural language processing. Particular importance is placed on deep learning methods. The chapters of this book span three broad categories:

1. Basic algorithms: Chapters 1 through 7 discuss the classical algorithms for text analytics such as preprocessing, similarity computation, topic modeling, matrix factorization, clustering, classification, regression, and ensemble analysis.
2. Domain-sensitive learning and information retrieval: Chapters 8 and 9 discuss learning models in heterogeneous settings such as a combination of text with multimedia or Web links. The problem of information retrieval and Web search is also discussed in the context of its relationship with ranking and machine learning methods.
3. Natural language processing: Chapters 10 through 16 discuss various sequence-centric and natural language applications, such as feature engineering, neural language models, deep learning, transformers, pre-trained language models, text summarization, information extraction, knowledge graphs, question answering, opinion mining, text segmentation, and event detection.

Compared to the first edition, this second edition (which targets mostly advanced-level students majoring in computer science and math) has substantially more material on deep learning and natural language processing. Significant focus is placed on topics like transformers, pre-trained language models, knowledge graphs, and question answering.
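The similarity computation covered in the "basic algorithms" chapters typically means comparing documents as term-count vectors, most often via cosine similarity. A minimal sketch, assuming a bag-of-words representation over a fixed vocabulary (illustrative only, not code from the book):

```python
import math

def cosine_similarity(u, v):
    # Cosine of the angle between two term-count vectors:
    # dot(u, v) / (|u| * |v|), with 0.0 for an all-zero vector
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    if norm_u == 0 or norm_v == 0:
        return 0.0
    return dot / (norm_u * norm_v)

# Two documents as counts over the vocabulary ["cat", "dog", "fish"]
doc1 = [2, 1, 0]
doc2 = [1, 1, 0]
sim = cosine_similarity(doc1, doc2)
```

Because it normalizes by vector length, cosine similarity compares documents by term proportions rather than raw length, which is why it is the default choice for text.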