Generative AI and LLMs: Architecture and Data Preparation
What you’ll learn
- Differentiate between generative AI architectures and models, such as RNNs, Transformers, VAEs, GANs, and Diffusion Models.
- Describe how LLMs, such as GPT, BERT, BART, and T5, are used in language processing.
- Implement tokenization to preprocess raw text using NLP libraries and tokenizers such as NLTK, spaCy, and Hugging Face's BertTokenizer and XLNetTokenizer (see the tokenization sketch after this list).
- Create an NLP data loader with PyTorch that performs tokenization, numericalization, and padding of text data (see the data loader sketch after this list).
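
To give a flavor of the tokenization objective, here is a minimal sketch that tokenizes one sentence with NLTK, spaCy, and Hugging Face's BertTokenizer. It assumes the nltk, spacy, and transformers packages are installed and that the pretrained bert-base-uncased checkpoint can be downloaded; it is an illustrative example, not the course's lab code.

```python
# Minimal tokenization sketch (illustrative; assumes nltk, spacy, and
# transformers are installed and a network connection is available).
import nltk
import spacy
from transformers import BertTokenizer

text = "Generative AI models learn to generate new data."

# NLTK word tokenization (the punkt tokenizer data must be downloaded once;
# very recent NLTK versions may use the "punkt_tab" resource instead).
nltk.download("punkt", quiet=True)
nltk_tokens = nltk.word_tokenize(text)

# spaCy tokenization with a blank English pipeline (no model download needed).
nlp = spacy.blank("en")
spacy_tokens = [token.text for token in nlp(text)]

# WordPiece subword tokenization with BertTokenizer from Hugging Face.
bert_tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert_tokens = bert_tokenizer.tokenize(text)

# XLNetTokenizer follows the same pattern, e.g.:
# from transformers import XLNetTokenizer
# XLNetTokenizer.from_pretrained("xlnet-base-cased").tokenize(text)

print(nltk_tokens)
print(spacy_tokens)
print(bert_tokens)
```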
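The data loader objective can be pictured with the following sketch, which uses only core PyTorch: a simple whitespace tokenizer stands in for the library tokenizers above, a plain dictionary handles numericalization, and pad_sequence pads each batch. The corpus, TextDataset, and build_vocab names are hypothetical and chosen for illustration, not taken from the course materials.

```python
# Minimal PyTorch data loader sketch (illustrative names, toy corpus).
import torch
from torch.utils.data import Dataset, DataLoader
from torch.nn.utils.rnn import pad_sequence

corpus = [
    "generative models create new data",
    "transformers power modern language models",
    "tokenization splits text into units",
]

def build_vocab(texts):
    # Numericalization table: token string -> integer id.
    vocab = {"<pad>": 0, "<unk>": 1}
    for text in texts:
        for token in text.lower().split():
            vocab.setdefault(token, len(vocab))
    return vocab

class TextDataset(Dataset):
    def __init__(self, texts, vocab):
        self.texts = texts
        self.vocab = vocab

    def __len__(self):
        return len(self.texts)

    def __getitem__(self, idx):
        tokens = self.texts[idx].lower().split()            # tokenization
        ids = [self.vocab.get(t, self.vocab["<unk>"])       # numericalization
               for t in tokens]
        return torch.tensor(ids, dtype=torch.long)

def collate(batch):
    # Padding: pad every sequence in the batch to the longest one.
    return pad_sequence(batch, batch_first=True, padding_value=0)

vocab = build_vocab(corpus)
loader = DataLoader(TextDataset(corpus, vocab), batch_size=2,
                    shuffle=True, collate_fn=collate)

for batch in loader:
    print(batch.shape)  # padded (batch, seq_len) tensor of token ids
```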
Course details
- Price: Free
- Language: English
- Duration: 5 hours
- Certificate: No
- Course Pace: Self-paced
- Course Level: Advanced
- Course Category: Generative AI
- Course Instructor: IBM