This course provides you with a fundamental understanding of the latest language models built on deep learning architectures.
You will start with traditional language models, learn about word embeddings and attention, and then move on to transformer models. By the end of this course, you will be able to build efficient language models and evaluate how well they perform.
This course will prepare you for more advanced studies in deep learning and for intermediate roles and opportunities in AI research.
Program details: Starts | Duration | Format | Time Investment | Fee (interest-free EMI available) | Registration closes
After successfully completing the program, you will be:
This course is ideal for learners and practitioners who have a fundamental understanding of, and hands-on practice with, basic AI and ML concepts, including neural networks. You should have strong foundations in statistics, computer science, and mathematics.
Knowledge of beginner- and intermediate-level AI, as exemplified by the topics in AI: Basics
Prior experience with high-level machine learning libraries such as Keras
Bi-weekly lessons with labs and quizzes
Ten or more hours per week of interaction with an experienced and accomplished mentor
Complex problems that challenge you to apply what you learned
A 10–12 week Capstone Project with one of our partner companies or faculty
Traditional language modeling
Embeddings
RNNs and LSTMs
Language Modeling (predict the next word) with RNNs (see the sketch after this list)
Attention and Seq2Seq models
Transformer Models
BERT, ELMo, and GPT
Project Week
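To give a concrete flavour of the hands-on labs, here is a minimal sketch of the kind of next-word prediction the "Language Modeling (predict the next word) with RNNs" module builds toward: an embedding layer feeding an LSTM that outputs a probability distribution over the vocabulary, written in Keras (the library named in the prerequisites). The tiny corpus, context length, and hyperparameters are placeholder assumptions chosen only to keep the example self-contained; this is not actual course material.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

# Tiny placeholder corpus; real coursework would use a much larger dataset.
corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# Word-level vocabulary and integer encoding.
vocab = sorted(set(corpus))
word_to_id = {w: i for i, w in enumerate(vocab)}
id_to_word = {i: w for w, i in word_to_id.items()}
ids = np.array([word_to_id[w] for w in corpus])

# Build fixed-length (context -> next word) training pairs with a sliding window.
context_len = 3
X = np.stack([ids[i : i + context_len] for i in range(len(ids) - context_len)])
y = ids[context_len:]

# Embedding -> LSTM -> softmax over the vocabulary: a classic RNN language model.
model = tf.keras.Sequential([
    layers.Embedding(len(vocab), 16),
    layers.LSTM(32),
    layers.Dense(len(vocab), activation="softmax"),
])
model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
model.fit(X, y, epochs=300, verbose=0)

# Predict the most likely next word after a three-word prompt.
prompt = np.array([[word_to_id[w] for w in ["dog", "sat", "on"]]])
probs = model.predict(prompt, verbose=0)[0]
print(id_to_word[int(np.argmax(probs))])  # most likely continuation, e.g. "the"
```

The same pipeline scales by swapping in a real corpus and a larger model; later modules in the syllabus replace the recurrent layer with attention and transformer architectures.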
Our programs are priced for access and are affordable for most people around the world who seek the best training in sought-after areas like AI and Data Science. For those who still need financing, EMI payments are available.
A small number of Univ.AI Scholarships are available for the best candidates.