
Hands-on Question Answering Systems with BERT

46,99 €

In stock; delivery: available immediately


Hands-on Question Answering Systems with BERT, Apress
Applications in Neural Networks and Natural Language Processing
By Navin Sabharwal and Amit Agrawal, available in digital form in the heise Shop

Product information: "Hands-on Question Answering Systems with BERT"

Get hands-on knowledge of how BERT (Bidirectional Encoder Representations from Transformers) can be used to develop question answering (QA) systems by using natural language processing (NLP) and deep learning.

The book begins with an overview of the technology landscape behind BERT. It takes you through the basics of NLP, including natural language understanding with tokenization, stemming, lemmatization, and bag of words. Next, you'll look at neural networks for NLP, starting with variants such as recurrent neural networks, encoders and decoders, bidirectional encoders and decoders, and transformer models. Along the way, you'll cover word embeddings and their types, along with the basics of BERT.
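To make the preprocessing steps mentioned above concrete, here is a minimal pure-Python sketch of tokenization and bag of words; this is an illustrative toy, not the book's code, and real systems would typically use a library such as NLTK or spaCy:

```python
import re
from collections import Counter

def tokenize(text):
    # A deliberately naive tokenizer: lowercase, then split on
    # anything that is not a letter or digit.
    return re.findall(r"[a-z0-9]+", text.lower())

def bag_of_words(text):
    # A bag of words discards word order and keeps only token counts.
    return Counter(tokenize(text))

bow = bag_of_words("BERT encodes text; text becomes vectors.")
print(bow)  # 'text' appears twice, every other token once
```

Bag-of-words representations like this are exactly what the later embedding chapters improve on: they capture frequency but not meaning or order.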

After this solid foundation, you’ll be ready to take a deep dive into BERT algorithms such as masked language models and next sentence prediction. You’ll see different BERT variations followed by a hands-on example of a question answering system.
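The masked language modeling objective mentioned above can be sketched in a few lines of pure Python: a fraction of input tokens is replaced with a [MASK] symbol, and the model is trained to recover the originals. The 15% masking rate follows the original BERT paper; this toy sketch performs only the masking step, not the prediction:

```python
import random

MASK = "[MASK]"

def mask_tokens(tokens, mask_prob=0.15, seed=42):
    """Replace roughly mask_prob of the tokens with [MASK].

    Returns the masked sequence and a dict mapping each masked
    position to the original token the model must predict there.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    masked, labels = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            masked.append(MASK)
            labels[i] = tok  # training target at this position
        else:
            masked.append(tok)
    return masked, labels

tokens = "the quick brown fox jumps over the lazy dog".split()
masked, labels = mask_tokens(tokens)
```

Next sentence prediction works analogously as a binary classification: given a sentence pair, predict whether the second sentence actually followed the first in the corpus.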

Hands-on Question Answering Systems with BERT is a good starting point for developers and data scientists who want to develop and design NLP systems using BERT. It provides step-by-step guidance for using BERT.
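Extractive QA systems of the kind the book builds typically have BERT predict a start score and an end score for every token in the context, then select the highest-scoring valid span as the answer. A minimal sketch of that span-selection step, with made-up scores standing in for the model's logits (this is an assumption-laden illustration, not the book's implementation):

```python
def best_span(start_scores, end_scores, max_len=15):
    # Pick (i, j) with i <= j < i + max_len that maximizes
    # start_scores[i] + end_scores[j], i.e. the best answer span.
    best, best_score = (0, 0), float("-inf")
    for i, s in enumerate(start_scores):
        for j in range(i, min(i + max_len, len(end_scores))):
            score = s + end_scores[j]
            if score > best_score:
                best_score, best = score, (i, j)
    return best

# Toy per-token scores for a 5-token context; in a real system
# these would come from a BERT QA head over (question, context).
start = [0.1, 2.0, 0.3, 0.2, 0.1]
end   = [0.1, 0.2, 0.5, 1.8, 0.2]
print(best_span(start, end))  # -> (1, 3)
```

The `max_len` cap mirrors the common practice of bounding answer length so the search stays cheap and spans stay plausible.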

WHAT YOU WILL LEARN

* Examine the fundamentals of word embeddings
* Apply neural networks and BERT for various NLP tasks
* Develop a question-answering system from scratch
* Train question-answering systems for your own data

WHO THIS BOOK IS FOR

AI and machine learning developers and natural language processing developers.

Navin Sabharwal is the chief architect for HCL DryICE Autonomics. He is an innovator, thought leader, author, and consultant in the areas of AI, machine learning, cloud computing, big data analytics, and software product development. He is responsible for IP development and service delivery in the areas of AI and machine learning, automation, AIOps, and the public clouds GCP, AWS, and Microsoft Azure. Navin has authored more than 15 books in the areas of cloud computing, cognitive virtual agents, IBM Watson, GCP, containers, and microservices.

Amit Agrawal is a senior data scientist and researcher delivering solutions in the fields of AI and machine learning. He is responsible for designing end-to-end solutions and architecture for enterprise products. He has also authored and reviewed books in the area of cognitive virtual assistants.

Chapter 1: Introduction to Natural Language Processing

Chapter Goal: To introduce the basics of natural language processing

1.1 What is natural language processing

1.2 What is natural language understanding

1.3 Natural language processing tasks

1.3.1 Tokenization

1.3.2 Stemming and lemmatization

1.3.3 Bag of words

1.3.4 Word / Sentence vectorization

Chapter 2: Introduction to Word Embeddings

Chapter Goal: To introduce the basics of word embeddings

2.1 What are word embeddings

2.2 Different methods of word embeddings

2.2.1 Word2vec

2.2.2 GloVe

2.2.3 ELMo

2.2.4 Universal sentence encoders

2.2.5 BERT

2.3 Bidirectional Encoder Representations from Transformers (BERT)

2.3.1 BERT-base

2.3.2 BERT-large

Chapter 3: BERT Algorithms Explained

Chapter Goal: Details on BERT model algorithms

3.1 Masked language model

3.2 Next sentence prediction (NSP)

3.3 Text classification using BERT

3.4 Various types of BERT-based models

3.4.1 ALBERT

3.4.2 RoBERTa

3.4.3 DistilBERT

Chapter 4: BERT Model Applications - Question Answering System

Chapter Goal: Details on question answering system

4.1 Introduction

4.2 Types of QA systems

4.3 QA system design using BERT

4.4 DrQA system

4.5 DeepPavlov QA system

Chapter 5: BERT Model Applications - Other tasks

Chapter Goal: Details on NLP tasks performed by BERT.

5.1 Introduction

5.2 Other NLP tasks

5.2.1 Sentiment analysis

5.2.2 Named entity recognition

5.2.3 Tag generation

5.2.4 Classification

5.2.5 Text summarization

5.2.6 Language translation

Chapter 6: Future of BERT models

Chapter Goal: To introduce new advances in NLP using BERT

6.1 BERT: future capabilities

Article details

Publisher:
Apress
Authors:
Amit Agrawal, Navin Sabharwal
Item number:
9781484266649
Published:
12.01.21