
Python for Machine Learning & Deep Learning in One Semester

Course Syllabus

Practice-oriented explanations, working through more than 80 projects with NumPy, scikit-learn, Pandas, Matplotlib, and PyTorch.


1. Introduction and Course Material
  • 1. Introduction to the Course
  • 2.1 Course Material.zip
  • 2. Course Material.html

2. Introduction to Machine Learning and Deep Learning
  • 1. Introduction to the Section
  • 2. What is Intelligence
  • 3. Machine Learning
  • 4. Supervised Machine Learning
  • 5. Unsupervised Machine Learning
  • 6. Deep Learning

3. Introduction to Google Colab
  • 1. Introduction to the Section
  • 2. Importing Dataset in Google Colab
  • 3. Importing and Displaying Image in Google Colab
  • 4. Importing more datasets
  • 5. Uploading Course Material on your Google Drive
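
For reference, the import workflow these lessons demonstrate boils down to two Colab helpers. A minimal sketch that runs only inside a Colab runtime, where the google.colab package is preinstalled; the CSV path at the end is a hypothetical example:

    from google.colab import drive, files

    uploaded = files.upload()        # opens a file picker and uploads into /content
    drive.mount('/content/drive')    # makes your Google Drive visible to the notebook

    import pandas as pd
    # Hypothetical path; adjust to wherever you uploaded the course material:
    # df = pd.read_csv('/content/drive/MyDrive/your_dataset.csv')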

4. Python Crash Course
  • 1. Introduction to the Section
  • 2. Arithmetic With Python
  • 3. Comparison and Logical Operations
  • 4. Conditional Statements
  • 5. Dealing With Numpy Arrays-Part01
  • 6. Dealing With Numpy Arrays-Part02
  • 7. Dealing With Numpy Arrays-Part03
  • 8. Plotting and Visualization-Part01
  • 9. Plotting and Visualization-Part02
  • 10. Plotting and Visualization-Part03
  • 11. Plotting and Visualization-Part04
  • 12. Lists in Python
  • 13. For Loops-Part01
  • 14. For Loops-Part02
  • 15. Strings
  • 16. Print Formatting With Strings
  • 17. Dictionaries-Part01
  • 18. Dictionaries-Part02
  • 19. Functions in Python-Part01
  • 20. Functions in Python-Part02
  • 21. Pandas-Part01
  • 22. Pandas-Part02
  • 23. Pandas-Part03
  • 24. Pandas-Part04
  • 25. Seaborn-Part01
  • 26. Seaborn-Part02
  • 27. Seaborn-Part03
  • 28. Tuples
  • 29. Classes in Python
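
Several of the building blocks above, condensed into one illustrative snippet (NumPy arithmetic, dictionaries, for loops, print formatting, and a pandas DataFrame); the values are made up:

    import numpy as np
    import pandas as pd

    a = np.array([1, 2, 3])
    print(a * 2, a.mean())                  # vectorized arithmetic on a NumPy array

    prices = {"apple": 3, "pear": 5}        # a dictionary
    for name, value in prices.items():      # a for loop over key-value pairs
        print(f"{name} costs {value}")      # print formatting with f-strings

    df = pd.DataFrame({"x": [1, 2, 3], "y": [2.0, 4.1, 6.2]})  # a pandas DataFrame
    print(df.describe())                    # summary statistics per column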

5. Data Preprocessing
  • 1. Introduction to the Section
  • 2. Need for Data Preprocessing
  • 3. Data Normalization and Min-Max Scaling
  • 4. Project01-Data Normalization and Min-Max Scaling-Part01
  • 5. Project01-Data Normalization and Min-Max Scaling-Part02
  • 6. Data Standardization
  • 7. Project02-Data Standardization
  • 8. Project03-Dealing With Missing Values
  • 9. Project04-Dealing With Categorical Features
  • 10. Project05-Feature Engineering
  • 11. Project06-Feature Engineering by Window Method
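
A minimal sketch of the two scaling schemes above, assuming scikit-learn's MinMaxScaler and StandardScaler and a made-up feature matrix:

    import numpy as np
    from sklearn.preprocessing import MinMaxScaler, StandardScaler

    X = np.array([[1.0, 200.0], [2.0, 300.0], [3.0, 500.0]])

    # Min-max scaling maps each feature to [0, 1]: (x - min) / (max - min)
    print(MinMaxScaler().fit_transform(X))

    # Standardization centers each feature and scales to unit variance:
    # (x - mean) / std
    print(StandardScaler().fit_transform(X))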

6. Supervised Machine Learning
  • 1. Supervised Machine Learning

7. Regression Analysis
  • 1. Introduction to the Section
  • 2. Origin of Regression
  • 3. Definition of Regression
  • 4. Requirements from Regression
  • 5. Simple Linear Regression
  • 6. Multiple Linear Regression
  • 7. Target and Predicted Values
  • 8. Loss Function
  • 9. Regression With Least Squares Method
  • 10. Least Squares Method With Numerical Example
  • 11. Evaluation Metrics for Regression
  • 12. Project01-Simple Regression-Part01
  • 13. Project01-Simple Regression-Part02
  • 14. Project01-Simple Regression-Part03
  • 15. Project02-Multiple Regression-Part01
  • 16. Project02-Multiple Regression-Part02
  • 17. Project02-Multiple Regression-Part03
  • 18. Project03-Another Multiple Regression
  • 19. Regression by Gradient Descent
  • 20. Project04-Simple Regression With Gradient Descent
  • 21. Project05-Multiple Regression With Gradient Descent
  • 22. Polynomial Regression
  • 23. Project06-Polynomial Regression
  • 24. Cross-validation
  • 25. Project07-Cross-validation
  • 26. Underfitting and Overfitting (Bias-Variance Tradeoff)
  • 27. Concept of Regularization
  • 28. Ridge Regression or L2 Regularization
  • 29. Lasso Regression or L1 Regularization
  • 30. Comparing Ridge and Lasso Regression
  • 31. Elastic Net Regularization
  • 32. Project08-Regularizations
  • 33. Grid Search Cross-validation
  • 34. Project09-Grid Search Cross-validation
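
As a compact reference for this section, a sketch that fits ordinary least squares next to Ridge (L2) and Lasso (L1) regularization with cross-validation, assuming scikit-learn's estimators and synthetic data rather than the projects' datasets:

    import numpy as np
    from sklearn.linear_model import LinearRegression, Ridge, Lasso
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))
    y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=200)

    # Compare plain least squares with L2 (Ridge) and L1 (Lasso) penalties
    for model in (LinearRegression(), Ridge(alpha=1.0), Lasso(alpha=0.01)):
        r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
        print(type(model).__name__, round(r2, 3))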

8. Logistic Regression
  • 1. Introduction to the Section
  • 2. Fundamentals of Logistic Regression
  • 3. Limitations of Regression Models
  • 4. Transforming Linear Regression into Logistic Regression
  • 5. Project01-Getting Class Probabilities-Part01
  • 6. Project01-Getting Class Probabilities-Part02
  • 7. Loss Function
  • 8. Model Evaluation-Confusion Matrix
  • 9. Accuracy, Precision, Recall and F1-Score
  • 10. ROC Curves and Area Under ROC
  • 11. Project02-Evaluating Logistic Regression Model
  • 12. Project03-Cross-validation With Logistic Regression Model
  • 13. Project04-Multiclass Classification
  • 14. Project05-Classification With Challenging Dataset-Part01
  • 15. Project05-Classification With Challenging Dataset-Part02
  • 16. Project05-Classification With Challenging Dataset-Part03
  • 17. Grid Search Cross-validation With Logistic Regression
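
A minimal sketch of the evaluation workflow above, assuming scikit-learn's LogisticRegression and its built-in breast-cancer dataset (the projects' own datasets may differ):

    from sklearn.datasets import load_breast_cancer
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import classification_report, confusion_matrix
    from sklearn.model_selection import train_test_split

    X, y = load_breast_cancer(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    clf = LogisticRegression(max_iter=5000).fit(X_tr, y_tr)
    pred = clf.predict(X_te)
    print(confusion_matrix(y_te, pred))        # the confusion matrix from lesson 8
    print(classification_report(y_te, pred))   # accuracy, precision, recall, F1-score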

9. K-Nearest Neighbors (KNN)
  • 1. Introduction to the Section
  • 2. Intuition Behind KNN
  • 3. Steps of KNN Algorithm
  • 4. Numerical Example on KNN Algorithm
  • 5. Project01-KNN Algorithm-Part01
  • 6. Project01-KNN Algorithm-Part02
  • 7. Finding Optimal Value of K
  • 8. Project02-Implementing KNN
  • 9. Project03-Implementing KNN
  • 10. Project04-Implementing KNN
  • 11. Advantages and disadvantages of KNN
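
The "optimal value of K" idea above in sketch form, assuming scikit-learn's KNeighborsClassifier on the built-in Iris data:

    from sklearn.datasets import load_iris
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier

    X, y = load_iris(return_X_y=True)
    # Scan K and keep the neighborhood size with the best cross-validated accuracy
    for k in (1, 3, 5, 7, 9, 11):
        acc = cross_val_score(KNeighborsClassifier(n_neighbors=k), X, y, cv=5).mean()
        print(f"K={k}: {acc:.3f}")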

10. Bayes Theorem and Naive Bayes Classifier
  • 1. Introduction to the Section
  • 2. Fundamentals of Probability
  • 3. Conditional Probability and Bayes Theorem
  • 4. Numerical Example on Bayes Theorem
  • 5. Naive Bayes Classification
  • 6. Comparing Naive Bayes Classification With Logistic Regression
  • 7. Project01-Naive Bayes as a Probabilistic Classifier
  • 8. Project02-Comparing Naive Bayes and Logistic Regression
  • 9. Project03-Multiclass Classification With Naive Bayes Classifier
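
A sketch of Naive Bayes as a probabilistic classifier, assuming scikit-learn's GaussianNB on the built-in Iris data:

    from sklearn.datasets import load_iris
    from sklearn.naive_bayes import GaussianNB

    X, y = load_iris(return_X_y=True)
    nb = GaussianNB().fit(X, y)
    # Naive Bayes exposes per-class posterior probabilities, not just labels
    print(nb.predict_proba(X[:2]).round(3))
    print(nb.predict(X[:2]))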

11. Support Vector Machines (SVM)
  • 1. Introduction to the Section
  • 2. Basic Concept of SVM
  • 3. Maths of SVM
  • 4. Hard and Soft Margin Classifier
  • 5. Decision rules of SVM
  • 6. Kernel trick in SVM
  • 7. Project01-Understanding SVM-Part01
  • 8. Project01-Understanding SVM-Part02
  • 9. Project02-Multiclass Classification With SVM
  • 10. Project03-Grid Search CV-Part01
  • 11. Project03-Grid Search CV-Part02
  • 12. Project04-Breast Cancer Classification with SVM
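
A minimal sketch combining the kernel trick with grid search cross-validation, assuming scikit-learn's SVC and its built-in breast-cancer dataset (whether Project04 uses this exact dataset is not stated):

    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import GridSearchCV
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X, y = load_breast_cancer(return_X_y=True)
    pipe = make_pipeline(StandardScaler(), SVC())   # SVMs need scaled features
    grid = {"svc__C": [0.1, 1, 10], "svc__kernel": ["linear", "rbf"]}
    search = GridSearchCV(pipe, grid, cv=5).fit(X, y)
    print(search.best_params_, round(search.best_score_, 3))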

12. Decision Tree
  • 1. Introduction to the Section
  • 2. Concept of Decision Tree
  • 3. Important Terms Related to Decision Trees
  • 4. Entropy - An Information Gain Criterion
  • 5. Numerical Example on Entropy-Part01
  • 6. Numerical Example on Entropy-Part02
  • 7. Gini Impurity - An Information Criterion
  • 8. Numerical Example on Gini Impurity
  • 9. Project01-Decision Tree Implementation
  • 10. Project02-Breast Cancer Classification With Decision Tree
  • 11. Project03-Grid Search CV with Decision Tree
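
The entropy and Gini lessons reduce to two small formulas; a worked sketch in NumPy, in the style of the numerical examples above:

    import numpy as np

    def entropy(p):               # information gain criterion, in bits
        p = np.asarray(p, dtype=float)
        p = p[p > 0]              # 0 * log(0) is treated as 0
        return -(p * np.log2(p)).sum()

    def gini(p):                  # Gini impurity criterion
        p = np.asarray(p, dtype=float)
        return 1.0 - (p ** 2).sum()

    # A node holding 8 samples of one class and 4 of the other:
    print(round(entropy([8 / 12, 4 / 12]), 3))   # 0.918
    print(round(gini([8 / 12, 4 / 12]), 3))      # 0.444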

13. Random Forest
  • 1. Introduction to the Section
  • 2. Why Random Forest
  • 3. Working of Random Forest
  • 4. Hyperparameters of Random Forest
  • 5. Bootstrap sampling and OOB Error
  • 6. Project01-Random Forest-Part01
  • 7. Project01-Random Forest-Part02
  • 8. Project02-Random Forest-Part01
  • 9. Project02-Random Forest-Part02
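
A sketch of bootstrap sampling and the OOB error in practice, assuming scikit-learn's RandomForestClassifier:

    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier

    X, y = load_breast_cancer(return_X_y=True)
    # Each tree trains on a bootstrap sample; the rows it never saw
    # form its out-of-bag (OOB) set, giving a free validation estimate
    rf = RandomForestClassifier(n_estimators=200, oob_score=True,
                                random_state=0).fit(X, y)
    print(round(rf.oob_score_, 3))   # accuracy estimated without a test split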

14. Boosting Methods in Machine Learning
  • 1. Introduction to the Section
  • 2. AdaBoost (Adaptive Boosting)
  • 3. Numerical Example on AdaBoost
  • 4. Project01-AdaBoost Classifier
  • 5. Project02-AdaBoost Classifier
  • 6. Gradient Boosting
  • 7. Numerical Example on Gradient Boosting
  • 8. Project03-Gradient Boosting
  • 9. Project04-Gradient Boosting
  • 10. Extreme Gradient Boosting (XGBoost)
  • 11. Project05-XGBoost-Part01
  • 12. Project05-XGBoost-Part02
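
A sketch contrasting the two boosting flavors, assuming scikit-learn's AdaBoostClassifier and GradientBoostingClassifier on synthetic data; XGBoost lives in the separate xgboost package but follows the same fit/predict pattern:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier, GradientBoostingClassifier
    from sklearn.model_selection import cross_val_score

    X, y = make_classification(n_samples=500, random_state=0)
    # AdaBoost reweights misclassified examples; gradient boosting
    # fits each new tree to the residual errors of the ensemble so far
    for model in (AdaBoostClassifier(n_estimators=100, random_state=0),
                  GradientBoostingClassifier(n_estimators=100, random_state=0)):
        acc = cross_val_score(model, X, y, cv=5).mean()
        print(type(model).__name__, round(acc, 3))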

15. Deep Learning
  • 1. Deep Learning

16. Introduction to Neural Networks and Deep Learning
  • 1. Introduction to the Section
  • 2. The Perceptron
  • 3. Features, Weights and Activation Function
  • 4. Learning of Neural Network
  • 5. Rise of Deep Learning

17. Activation Functions
  • 1. Introduction to the Section
  • 2. Classification by Perceptron-Part01
  • 3. Classification by Perceptron-Part02
  • 4. Need for Activation Functions
  • 5. Adding Activation Function to Neural Network
  • 6. Sigmoid as Activation Function
  • 7. Hyperbolic Tangent Function
  • 8. ReLU and Leaky ReLU Function
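
The four activation functions above, written out in NumPy so their squashing behavior is visible:

    import numpy as np

    def sigmoid(x):                      # squashes to (0, 1)
        return 1.0 / (1.0 + np.exp(-x))

    def relu(x):                         # zero for negatives, identity otherwise
        return np.maximum(0.0, x)

    def leaky_relu(x, alpha=0.01):       # small slope instead of a hard zero
        return np.where(x > 0, x, alpha * x)

    x = np.linspace(-3.0, 3.0, 7)
    print(sigmoid(x).round(3))
    print(np.tanh(x).round(3))           # hyperbolic tangent squashes to (-1, 1)
    print(relu(x), leaky_relu(x))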

18. Loss Functions
  • 1. Introduction to the Section
  • 2. MSE Loss Function
  • 3. Cross Entropy Loss Function
  • 4. Softmax Function
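
A minimal PyTorch sketch of the three ideas in this section; note that nn.CrossEntropyLoss applies softmax internally, so it takes raw logits:

    import torch
    import torch.nn as nn

    logits = torch.tensor([[2.0, 0.5, -1.0]])     # raw network outputs, 3 classes
    target = torch.tensor([0])                    # the true class index

    print(torch.softmax(logits, dim=1))           # softmax turns logits into probabilities
    print(nn.CrossEntropyLoss()(logits, target))  # cross-entropy on the raw logits

    # MSE compares continuous predictions against continuous targets
    print(nn.MSELoss()(torch.tensor([2.5]), torch.tensor([3.0])))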

19. Backpropagation
  • 1. Introduction to the Section
  • 2. Forward Propagation
  • 3. Backward Propagation-Part01
  • 4. Backward Propagation-Part02
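
The forward/backward pair in miniature, using PyTorch's autograd on a one-variable function:

    import torch

    # Forward propagation builds a computation graph...
    x = torch.tensor(2.0, requires_grad=True)
    y = x ** 2 + 3 * x            # y = x^2 + 3x

    # ...and backward propagation walks it with the chain rule
    y.backward()
    print(x.grad)                 # dy/dx = 2x + 3, so tensor(7.) at x = 2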

20. Neural Networks for Regression Analysis
  • 1. Introduction to the Section
  • 2. Project01-Neural Network for Simple Regression-Part01
  • 3. Project01-Neural Network for Simple Regression-Part02
  • 4. Project02-Neural Network for Multiple Regression
  • 5. Creating Neural Network Using Python Class

21. Neural Networks for Classification
  • 1. Introduction to the Section
  • 2. Epoch, Batch size and Iteration
  • 3. Project00-Tensor Dataset and Data Loader
  • 4. Code Preparation for Iris Dataset
  • 5. Project01-Neural Network for Iris Data Classification
  • 6. Code Preparation for MNIST Dataset
  • 7. Project02-Neural Network for MNIST Data Classification-Part01
  • 8. Project02-Neural Network for MNIST Data Classification-Part02
  • 9. Save and Load Trained Model
  • 10. Code Preparation for Custom Images
  • 11. Project03-Neural Networks for Custom Images
  • 12. Code Preparation for Human Action Recognition
  • 13. Project04-Neural Network for Human Action Recognition
  • 14. Project05-Neural Network for Feature Engineered Dataset
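
A condensed sketch of the workflow these projects walk through (TensorDataset, DataLoader, epochs and mini-batch iterations, saving the trained model), assuming a small network on the Iris data:

    import torch
    import torch.nn as nn
    from torch.utils.data import TensorDataset, DataLoader
    from sklearn.datasets import load_iris

    X, y = load_iris(return_X_y=True)
    ds = TensorDataset(torch.tensor(X, dtype=torch.float32),
                       torch.tensor(y, dtype=torch.long))
    loader = DataLoader(ds, batch_size=16, shuffle=True)

    model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 3))
    opt = torch.optim.Adam(model.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()

    for epoch in range(50):           # one epoch = one full pass over the data
        for xb, yb in loader:         # one iteration = one mini-batch
            opt.zero_grad()
            loss = loss_fn(model(xb), yb)
            loss.backward()
            opt.step()

    # torch.save(model.state_dict(), "iris_net.pt") would persist the weights
    # ("iris_net.pt" is a hypothetical filename)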

22. Dropout Regularization and Batch Normalization
  • 1. Introduction to the Section
  • 2. Dropout Regularization
  • 3. Introducing Dataset for Dropout Regularization
  • 4. Project01-Dropout Regularization
  • 5. Project02-Dropout Regularization
  • 6. Batch Normalization
  • 7. Project03-Batch Normalization
  • 8. Project04-Batch Normalization
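
Where the two techniques sit in a PyTorch model, as a sketch; the layer sizes here are arbitrary:

    import torch.nn as nn

    net = nn.Sequential(
        nn.Linear(20, 64),
        nn.BatchNorm1d(64),    # normalizes activations across each mini-batch
        nn.ReLU(),
        nn.Dropout(p=0.5),     # randomly zeroes half the units while training
        nn.Linear(64, 2),
    )
    net.train()   # training mode: dropout active, batch-norm statistics updated
    net.eval()    # inference mode: dropout off, batch-norm uses running statistics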

23. Convolutional Neural Network (CNN)
  • 1. Introduction to the Section
  • 2. CNN Architecture and main operations
  • 3. 2D Convolution
  • 4. Shape of Feature Map after Convolution
  • 5. Average and Maximum Pooling
  • 6. Pooling to Classification
  • 7. Project01-CNN on MNIST-Part01
  • 8. Project01-CNN on MNIST-Part02
  • 9. An Efficient Lazy Linear Layer
  • 10. Project02-CNN on Custom Images
  • 11. Transfer Learning
  • 12. Project03-Transfer Learning With ResNet-18
  • 13. Project04-Transfer Learning With VGG-16
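
A shape-level sketch of the architecture this section builds, including the lazy linear layer mentioned above (PyTorch's nn.LazyLinear infers its input size on the first forward pass):

    import torch
    import torch.nn as nn

    # Conv -> ReLU -> MaxPool twice, then flatten into a classifier head
    cnn = nn.Sequential(
        nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Flatten(),
        nn.LazyLinear(10),           # 10 classes, e.g. the MNIST digits
    )
    x = torch.randn(8, 1, 28, 28)    # a batch of 8 grayscale 28x28 images
    print(cnn(x).shape)              # torch.Size([8, 10])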

24. Recurrent Neural Networks (RNN)
  • 1. Introduction to the Section
  • 2. Why We Need RNNs
  • 3. Sequential data
  • 4. ANN to RNN
  • 5. Backpropagation Through Time
  • 6. Long Short-Term Memory (LSTM)
  • 7. LSTM Gates
  • 8. Project01-LSTM Shapes
  • 9. Project02-LSTM Basics
  • 10. Batch size, Sequence length and Feature dimension
  • 11. Project03-Interpolation and Extrapolation With LSTM
  • 12. Project04-Data Classification With LSTM
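
A sketch of the LSTM shape conventions discussed above, assuming PyTorch's nn.LSTM with batch_first=True:

    import torch
    import torch.nn as nn

    # batch_first=True makes inputs (batch size, sequence length, feature dimension)
    lstm = nn.LSTM(input_size=4, hidden_size=8, num_layers=1, batch_first=True)
    x = torch.randn(2, 5, 4)          # batch=2, seq_len=5, features=4
    out, (h, c) = lstm(x)
    print(out.shape)                  # torch.Size([2, 5, 8]): outputs at every step
    print(h.shape, c.shape)           # torch.Size([1, 2, 8]): final hidden/cell state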

25. Autoencoders
  • 1. Introduction to the Section
  • 2. Architecture of Autoencoder
  • 3. Applications of Autoencoders
  • 4. Project01-Image Denoising using Autoencoder
  • 5. Project02-Occlusion Removing Using Autoencoder
  • 6. Project03-Autoencoder as an Image Classifier
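
The encoder-decoder architecture in sketch form; the 784-dimensional input assumes flattened 28x28 images, and the layer sizes are illustrative:

    import torch
    import torch.nn as nn

    # The encoder compresses 784-D inputs to a 32-D code; the decoder reconstructs
    class AutoEncoder(nn.Module):
        def __init__(self):
            super().__init__()
            self.encoder = nn.Sequential(nn.Linear(784, 128), nn.ReLU(),
                                         nn.Linear(128, 32))
            self.decoder = nn.Sequential(nn.Linear(32, 128), nn.ReLU(),
                                         nn.Linear(128, 784))

        def forward(self, x):
            return self.decoder(self.encoder(x))

    x = torch.randn(4, 784)
    print(AutoEncoder()(x).shape)     # torch.Size([4, 784]): same shape back out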

26. Generative Adversarial Networks (GANs)
  • 1. Introduction to the Section
  • 2. Discriminative and Generative Models
  • 3. Training of GAN
  • 4. Project01-GAN Implementation

27. Unsupervised Machine Learning
  • 1. Unsupervised Machine Learning

28. K-Means Clustering
  • 1. Introduction to the Section
  • 2. Steps of K-Means Clustering
  • 3. Numerical Example - K-Means Clustering in 1D
  • 4. Numerical Example - K-Means Clustering in 2D
  • 5. Objective Function of K-Means Clustering
  • 6. Selecting Optimal Number of Clusters (Elbow Method)
  • 7. Evaluation Metrics for K-Means Clustering
  • 8. Project01-K-Means Clustering-Part01
  • 9. Project01-K-Means Clustering-Part02
  • 10. Project01-K-Means Clustering-Part03
  • 11. Project02-K-Means Clustering
  • 12. Project03-K-Means Clustering
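
The elbow method in sketch form, assuming scikit-learn's KMeans on synthetic blobs; inertia (the within-cluster sum of squares that K-Means minimizes) drops sharply until the true cluster count, then flattens:

    from sklearn.cluster import KMeans
    from sklearn.datasets import make_blobs

    X, _ = make_blobs(n_samples=300, centers=4, random_state=0)
    # Inertia keeps falling with K, but the drop levels off past K = 4 here
    for k in range(1, 8):
        km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
        print(k, round(km.inertia_, 1))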

29. Hierarchical Clustering
  • 1. Introduction to the Section
  • 2. Hierarchical Clustering Algorithm
  • 3. Hierarchical Clustering in 1D
  • 4. Dendrograms-Selecting Optimal Clusters-Part01
  • 5. Dendrograms-Selecting Optimal Clusters-Part02
  • 6. Hierarchical Clustering Using d-max criterion
  • 7. Hierarchical Clustering in 2D
  • 8. Evaluation Metrics for Hierarchical Clustering
  • 9. Project01-Hierarchical Clustering-Part01
  • 10. Project01-Hierarchical Clustering-Part02
  • 11. Project02-Hierarchical Clustering
  • 12. Project03-Hierarchical Clustering
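
A minimal sketch pairing scipy's linkage (the merge history a dendrogram draws) with scikit-learn's AgglomerativeClustering, using a silhouette score as the evaluation metric:

    from scipy.cluster.hierarchy import linkage
    from sklearn.cluster import AgglomerativeClustering
    from sklearn.datasets import make_blobs
    from sklearn.metrics import silhouette_score

    X, _ = make_blobs(n_samples=200, centers=3, random_state=0)
    Z = linkage(X, method="ward")    # merge history; scipy's dendrogram() can plot it
    labels = AgglomerativeClustering(n_clusters=3).fit_predict(X)
    print(round(silhouette_score(X, labels), 3))   # closer to 1 is better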

30. Density-Based Spatial Clustering of Applications With Noise (DBSCAN)
  • 1. Introduction to the Section
  • 2. Definition of DBSCAN
  • 3. Step by step DBSCAN
  • 4. Comparing DBSCAN with K-Means Clustering
  • 5. Project01-Part01
  • 6. Project01-Part02
  • 7. Parameters of DBSCAN
  • 8. Project02-DBSCAN
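
A sketch of the two DBSCAN parameters in action, assuming scikit-learn's DBSCAN on the two-moons toy data that K-Means handles poorly:

    from sklearn.cluster import DBSCAN
    from sklearn.datasets import make_moons

    X, _ = make_moons(n_samples=300, noise=0.05, random_state=0)
    # eps is the neighborhood radius, min_samples the density threshold;
    # the label -1 marks points DBSCAN considers noise
    labels = DBSCAN(eps=0.2, min_samples=5).fit_predict(X)
    print(set(labels))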

31. Gaussian Mixture Model (GMM) Clustering
  • 1. Introduction to the Section
  • 2. Definition of GMM Clustering
  • 3. Limitations of K-Means Clustering
  • 4. Project01-GMM Clustering
  • 5. Project02-GMM Clustering
  • 6. Project03-GMM Clustering
  • 7. Binomial Distribution
  • 8. Expectation Maximization (EM) Algorithm
  • 9. Expectation Maximization (EM) Algorithm ( Numerical Example )

32. Principal Component Analysis (PCA)
  • 1. Introduction to the Section
  • 2. Key Concepts of PCA
  • 3. Need for PCA
  • 4. PCA Algorithm With Numerical Example
  • 5. Project01-PCA
  • 6. Project02-PCA
  • 7. Project03-PCA
  • 8. Project04-PCA
  • 9. Project05-PCA
  • 10. Project06-PCA
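
A minimal sketch of the PCA workflow, assuming scikit-learn's PCA on the built-in Iris data:

    from sklearn.datasets import load_iris
    from sklearn.decomposition import PCA

    X, _ = load_iris(return_X_y=True)
    pca = PCA(n_components=2).fit(X)
    X2 = pca.transform(X)                  # project 4-D data onto 2 components
    print(X2.shape)                        # (150, 2)
    print(pca.explained_variance_ratio_)   # variance captured per component
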
  • 139,000 تومان
    بیش از یک محصول به صورت دانلودی میخواهید؟ محصول را به سبد خرید اضافه کنید.
    افزودن به سبد خرید
    خرید دانلودی فوری

    در این روش نیاز به افزودن محصول به سبد خرید و تکمیل اطلاعات نیست و شما پس از وارد کردن ایمیل خود و طی کردن مراحل پرداخت لینک های دریافت محصولات را در ایمیل خود دریافت خواهید کرد.

    ایمیل شما:
    تولید کننده:
    مدرس:
    Course ID: 20772
    Size: 17347 MB
    Duration: 2810 minutes
    Release date: 15 Mehr 1402 (October 7, 2023)