
Foundational Math for Generative AI: Understanding LLMs and Transformers through Practical Applications

Course outline

Unlock the mysteries behind the models powering today’s most advanced AI applications. In this course, instructor Axel Sirota takes you beyond just using large language models (LLMs) like BERT or GPT and highlights the mathematical foundations of generative AI. Explore the challenge of sentiment analysis with simple recurrent neural networks (RNNs) and progressively evolve your approach as you gain a deep understanding of attention mechanisms and transformer-based models. Through intuitive explanations and hands-on coding exercises, Axel outlines why attention revolutionized natural language processing, and how transformers reshaped the field by eliminating the need for RNNs altogether. Along the way, get tips on fine-tuning pretrained models, applying cutting-edge techniques like low-rank adaptation (LoRA), and leveraging your newly acquired skills to build smarter, more efficient models and innovate in the fast-evolving world of AI.
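
For readers curious about the low-rank adaptation (LoRA) technique mentioned above, here is a minimal, illustrative sketch in plain PyTorch: the pretrained weight is frozen and only a small low-rank update B·A is trained. The class name, rank, and scaling factor are assumptions for illustration, not the course's implementation.

```python
# Minimal sketch of the LoRA idea: freeze a pretrained linear layer and learn a
# low-rank update B @ A, so only r * (d_in + d_out) parameters are trained.
# Illustrative toy layer, not the course's code.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        self.base.weight.requires_grad_(False)            # frozen pretrained weight
        if self.base.bias is not None:
            self.base.bias.requires_grad_(False)
        self.A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, r))  # update starts at zero
        self.scale = alpha / r

    def forward(self, x):
        # Frozen path plus scaled low-rank adapter path.
        return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)

layer = LoRALinear(nn.Linear(768, 768), r=8)
out = layer(torch.randn(2, 10, 768))   # (batch, seq_len, hidden)
print(out.shape)                        # torch.Size([2, 10, 768])
```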


01 - Introduction
  • 01 - Intro to foundational math for generative AI
  • 02 - Getting the most out of this course
  • 03 - Version check

02 - 1. Introduction to Math for GenAI and Attention Basics
  • 01 - Why LLMs and attention matter
  • 02 - RNNs and the context bottleneck problem
  • 03 - Demo: Building a simple RNN model for sentiment analysis
  • 04 - Introduction to attention: Bahdanau’s solution
  • 05 - Demo: Adding attention to an RNN model
  • 06 - Solution: Implement Bahdanau's attention
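
As a companion to the attention demos in this chapter, the sketch below shows Bahdanau-style additive attention over RNN encoder states in PyTorch. The layer names, hidden size, and the choice of the last encoder state as the query are illustrative assumptions, not the course's exact code.

```python
# Hedged sketch of Bahdanau (additive) attention: score = v^T tanh(W_q q + W_k h_i),
# weights = softmax(scores), context = weighted sum of encoder states.
import torch
import torch.nn as nn

class BahdanauAttention(nn.Module):
    def __init__(self, hidden: int):
        super().__init__()
        self.W_query = nn.Linear(hidden, hidden)   # projects the query state
        self.W_keys = nn.Linear(hidden, hidden)    # projects each encoder state
        self.v = nn.Linear(hidden, 1)              # reduces to a scalar score

    def forward(self, query, encoder_states):
        # query: (batch, hidden); encoder_states: (batch, seq_len, hidden)
        scores = self.v(torch.tanh(self.W_query(query).unsqueeze(1)
                                   + self.W_keys(encoder_states)))   # (batch, seq_len, 1)
        weights = torch.softmax(scores, dim=1)                       # attention over time steps
        context = (weights * encoder_states).sum(dim=1)              # (batch, hidden)
        return context, weights.squeeze(-1)

attn = BahdanauAttention(hidden=64)
states = torch.randn(2, 12, 64)          # encoder outputs for a 12-token input
ctx, w = attn(states[:, -1], states)     # use the last state as the query
print(ctx.shape, w.shape)                # torch.Size([2, 64]) torch.Size([2, 12])
```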

03 - 2. Transformers: Removing RNNs for More Efficient Models
  • 01 - From RNNs to transformers
  • 02 - Understanding self-attention in transformers
  • 03 - Multi-head attention and positional encoding
  • 04 - Building a transformer model for sentiment analysis
  • 05 - Solution: Build a two-layer transformer encoder
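
To make the self-attention and encoder lessons in this chapter concrete, the sketch below implements the scaled dot-product attention formula softmax(QKᵀ/√d_k)V and a tiny two-layer transformer encoder classifier in PyTorch. Vocabulary size, model width, head count, and mean pooling are assumptions, not the course's settings.

```python
# Hedged sketch: scaled dot-product self-attention plus a two-layer encoder classifier.
import math
import torch
import torch.nn as nn

def self_attention(x, W_q, W_k, W_v):
    # x: (batch, seq_len, d_model). Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    Q, K, V = x @ W_q, x @ W_k, x @ W_v
    scores = Q @ K.transpose(-2, -1) / math.sqrt(K.size(-1))
    return torch.softmax(scores, dim=-1) @ V

x = torch.randn(2, 5, 16)
Wq, Wk, Wv = (torch.randn(16, 16) for _ in range(3))
print(self_attention(x, Wq, Wk, Wv).shape)   # torch.Size([2, 5, 16])

class TinyTransformerClassifier(nn.Module):
    def __init__(self, vocab_size=10_000, d_model=128, n_heads=4, n_layers=2, n_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.pos = nn.Parameter(torch.zeros(1, 512, d_model))   # learned positional encoding
        layer = nn.TransformerEncoderLayer(d_model, n_heads, dim_feedforward=256,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)  # two encoder layers
        self.head = nn.Linear(d_model, n_classes)                         # sentiment logits

    def forward(self, token_ids):
        h = self.embed(token_ids) + self.pos[:, : token_ids.size(1)]
        h = self.encoder(h)
        return self.head(h.mean(dim=1))    # mean-pool over tokens, then classify

model = TinyTransformerClassifier()
logits = model(torch.randint(0, 10_000, (2, 20)))   # two sequences of 20 token ids
print(logits.shape)                                  # torch.Size([2, 2])
```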

04 - 3. Deep Dive into LLMs and Model Fine-Tuning
  • 01 - The three types of LLMs
  • 02 - Special decoder-only models
  • 03 - Explaining encoder-only models like BERT
  • 04 - Fine-tuning DistilBERT for sentiment analysis
  • 05 - Attention masks in transformers
  • 06 - Solution: Detect irony and climate stance in TweetEval
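
For the fine-tuning and attention-mask lessons in this chapter, here is a minimal sketch of a single fine-tuning step for DistilBERT on a toy two-example sentiment batch, using the Hugging Face transformers API. The checkpoint name, labels, and learning rate are illustrative assumptions; the course's setup (and its TweetEval tasks) will differ.

```python
# Hedged sketch of one fine-tuning step for DistilBERT on binary sentiment labels.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2)   # 2 classes: negative / positive

texts = ["I loved this course.", "This was a waste of time."]
labels = torch.tensor([1, 0])

# Padding produces an attention_mask so padded positions are ignored by attention.
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
print(batch["attention_mask"])   # 1 = real token, 0 = padding

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
outputs = model(**batch, labels=labels)   # forward pass returns loss and logits
outputs.loss.backward()                   # backprop through the head and encoder
optimizer.step()
optimizer.zero_grad()
print(outputs.logits.shape)               # torch.Size([2, 2])
```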

05 - Conclusion
  • 01 - Course summary and next steps

Price: 139,000 Toman
Instructor: Axel Sirota
ID: 45553
Size: 405 MB
Duration: 179 minutes
Release date: Dey 14, 1404 (Iranian calendar)
