Generative AI and LLMs: Architecture and Data Preparation

What you’ll learn

  1. Differentiate between generative AI architectures and models, such as RNNs, Transformers, VAEs, GANs, and Diffusion Models.
  2. Describe how LLMs, such as GPT, BERT, BART, and T5, are used in language processing.
  3. Implement tokenization to preprocess raw textual data using NLP libraries such as NLTK, spaCy, BertTokenizer, and XLNetTokenizer (see the tokenization sketch after this list).
  4. Create an NLP data loader using PyTorch to perform tokenization, numericalization, and padding of text data (see the data loader sketch after this list).
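
Objective 3 covers tokenization with libraries such as NLTK and BertTokenizer. Below is a minimal sketch, assuming the Hugging Face transformers package is installed; the sample sentence and the "bert-base-uncased" checkpoint are illustrative choices, not part of the course materials.

```python
# Minimal tokenization sketch: rule-based word tokens (NLTK) vs. subword tokens (BERT).
import nltk
from nltk.tokenize import word_tokenize
from transformers import BertTokenizer

nltk.download("punkt", quiet=True)  # word_tokenize needs the Punkt model

text = "Generative AI models learn the structure of language."

# Rule-based word tokenization with NLTK
nltk_tokens = word_tokenize(text)
print(nltk_tokens)

# Subword (WordPiece) tokenization with a pretrained BERT tokenizer
bert_tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert_tokens = bert_tokenizer.tokenize(text)
print(bert_tokens)

# Numericalization: map tokens to vocabulary ids
token_ids = bert_tokenizer.convert_tokens_to_ids(bert_tokens)
print(token_ids)
```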
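Objective 4 covers building an NLP data loader in PyTorch. The sketch below is one possible arrangement, not the course's reference solution: the toy corpus, the build_vocab helper, and the batch size are assumptions. Tokenization and numericalization happen in the Dataset, while padding is handled in a custom collate_fn so every sequence in a batch has the same length.

```python
import torch
from torch.utils.data import Dataset, DataLoader
from torch.nn.utils.rnn import pad_sequence

corpus = [
    "transformers power modern language models",
    "tokenization turns text into ids",
    "padding aligns sequences in a batch",
]

def build_vocab(sentences):
    # Hypothetical helper: assign an integer id to every unique token,
    # reserving 0 for <pad> and 1 for <unk>.
    vocab = {"<pad>": 0, "<unk>": 1}
    for sentence in sentences:
        for token in sentence.lower().split():
            vocab.setdefault(token, len(vocab))
    return vocab

vocab = build_vocab(corpus)

class TextDataset(Dataset):
    def __init__(self, sentences, vocab):
        self.sentences = sentences
        self.vocab = vocab

    def __len__(self):
        return len(self.sentences)

    def __getitem__(self, idx):
        # Tokenize (simple whitespace split) and numericalize one sentence
        tokens = self.sentences[idx].lower().split()
        ids = [self.vocab.get(tok, self.vocab["<unk>"]) for tok in tokens]
        return torch.tensor(ids, dtype=torch.long)

def collate_batch(batch):
    # Pad variable-length sequences to the longest sequence in the batch
    return pad_sequence(batch, batch_first=True, padding_value=vocab["<pad>"])

loader = DataLoader(TextDataset(corpus, vocab), batch_size=2, collate_fn=collate_batch)

for batch in loader:
    print(batch.shape)
    print(batch)
```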
Price: Free
Language: English
Duration: 5 Hours
Certificate: No
Course Pace: Self-Paced
Course Level: Advanced
Course Category: Generative AI
Course Instructor: IBM