Farin Company specialized website

Build an AWS Machine Learning Pipeline for Object Detection

Course Curriculum

Use AWS Step Functions + SageMaker to Build a Scalable, Production-Ready Machine Learning Pipeline for Plastic Detection


1. What we are Building
  • 1. Let's look at our End Project

  • 2. Getting Started with AWS and Getting our Dataset
  • 1. Source Code for the Course.html
  • 2. Setting up IAM User
  • 3. Clarification about AWS S3
  • 4. Getting Data for our Project
  • 5. Getting the Dataset Part 1
  • 6. Getting the Dataset Part 2
  • 7. Getting the Dataset Part 3
  • 8. Getting the Dataset Part 4

  • 3. Setting up AWS SageMaker
  • 1. Create SageMaker Domain
  • 2. Create SageMaker Studio Notebook
  • 3. Learning how to Stop and Start SageMaker Notebooks
  • 4. Restarting our SageMaker Studio Notebook Kernel
  • 5. Upload and Extract Data in SageMaker
  • 6. Deleting Unused Files

  • 4. Exploratory Data Analysis
  • 1. Loading and Understanding our Data
  • 2. Counting total Images and getting Image ids
  • 3. Getting Classname Identifier
  • 4. Looking at Random Samples from our Dataframe
  • 5. Understanding Annotations
  • 6. Visualise Random Images Part 1
  • 7. Visualise Random Images Part 2
  • 8. Matplotlib difference between plt.show() and plt.imshow().html
  • 9. Visualising Multiple Images at Once
  • 10. Correcting our Function.html
  • 11. Visualising Bounding Boxes Part 1
  • 12. Visualising Bounding Boxes Part 2 (Theory Lesson)
  • 13. Visualising Random Images with Bounding Boxes Part 1
  • 14. Wrong Print Statement.html
  • 15. Visualising Random Images with Bounding Boxes Part 2
  • 16. Read this Lesson if you have issues with Data Visualization.html
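The visualisation lessons above combine matplotlib's `imshow` with `Rectangle` patches to draw bounding boxes (lesson 8 covers the `plt.show()` vs `plt.imshow()` distinction). A minimal sketch of that pattern — the image array, box coordinates, and function name are placeholders, not the course's exact code:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display required
import matplotlib.pyplot as plt
from matplotlib.patches import Rectangle

def draw_boxes(image, boxes):
    """Draw pixel-coordinate (xmin, ymin, xmax, ymax) boxes on an image."""
    fig, ax = plt.subplots()
    ax.imshow(image)  # imshow renders the array; plt.show() only displays the figure
    for xmin, ymin, xmax, ymax in boxes:
        ax.add_patch(Rectangle((xmin, ymin), xmax - xmin, ymax - ymin,
                               fill=False, edgecolor="red", linewidth=2))
    return fig, ax

# Placeholder 100x100 black image with one box
fig, ax = draw_boxes(np.zeros((100, 100, 3)), [(10, 20, 60, 80)])
print(len(ax.patches))  # → 1
```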

  • 5. Cleaning and Splitting our Data
  • 1. Clean our Train and Validation Dataframes
  • 2. Split Dataframe into Test and Train
  • 3. Get Image IDs
  • 4. Splitting IDs Theory Lesson
  • 5. Explanation Regarding Next video.html
  • 6. Moving Images to Appropriate Folders
  • 7. Count how many Train and Test Images we have
  • 8. Verifying that our Images have been moved Properly Part 1
  • 9. Verifying that our Images have been moved Properly Part 2
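The split lessons above divide the annotation dataframe by image ID rather than by row, so all bounding boxes belonging to one image end up in the same split. A sketch of that idea — column name, function name, and the toy dataframe are assumptions, not the course's code:

```python
import random
import pandas as pd

def split_by_image_id(df, id_col="image_id", test_frac=0.2, seed=42):
    """Split annotation rows by image so every box of an image lands in one split."""
    ids = sorted(df[id_col].unique())
    rng = random.Random(seed)          # fixed seed for a reproducible split
    rng.shuffle(ids)
    test_ids = set(ids[:int(len(ids) * test_frac)])
    test_df = df[df[id_col].isin(test_ids)]
    train_df = df[~df[id_col].isin(test_ids)]
    return train_df, test_df

# Toy dataframe: 4 images, one with multiple boxes
df = pd.DataFrame({"image_id": ["a", "a", "b", "c", "d"],
                   "label":    [0,   1,   0,   1,   0]})
train_df, test_df = split_by_image_id(df, test_frac=0.25)
```

Splitting on unique IDs first, then filtering rows, is what prevents boxes of a single image from leaking across train and test.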

  • 6. Data Engineering
  • 1. Using MXNet
  • 2.1 RecordIO Reading.html
  • 2. Additional Info regarding RecordIO format.html
  • 3. Using MXNet RecordIO
  • 4. Correction Regarding Label width.html
  • 5. Preparing Dataframes to RecordIO format Part 1
  • 6.1 Getting df into correct format Part 2.mov
  • 6. Preparing Dataframes to RecordIO format Part 2
  • 7. Moving Images To Correct Directory
  • 8. Explanation Regarding the Previous Video.html
  • 9. Verifying that all Images have been Moved Properly
  • 10. Read Before Proceeding to the next Lecture.html
  • 11. Creating Production .lst files (Optional)
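The `.lst` files above index images for MXNet's RecordIO tooling. For object detection, each tab-separated line is, as I understand the im2rec format (verify against the MXNet docs — lesson 4 above is a correction about exactly this label width): index, header count (2), label width (5), then `class, xmin, ymin, xmax, ymax` per box with coordinates normalized to [0, 1], and finally the relative image path. A hypothetical helper:

```python
def make_lst_line(index, boxes, image_path, label_width=5):
    """Build one object-detection .lst line:
    idx \t header_cnt \t label_width \t per-box fields... \t image_path
    boxes: list of (class_id, xmin, ymin, xmax, ymax), coords normalized to [0, 1]."""
    fields = [str(index), "2", str(label_width)]
    for class_id, xmin, ymin, xmax, ymax in boxes:
        fields += [f"{class_id:.4f}", f"{xmin:.4f}", f"{ymin:.4f}",
                   f"{xmax:.4f}", f"{ymax:.4f}"]
    fields.append(image_path)
    return "\t".join(fields)

line = make_lst_line(0, [(1, 0.1, 0.2, 0.5, 0.6)], "train/img_0001.jpg")
```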

  • 7. Data Augmentation
  • 1. Data Augmentation Theory
  • 2. Augmenting a Random Image
  • 3. Moving Images to new Folder structure
  • 4. Visualising Random Augmented Images Part 1
  • 5. Visualising Random Augmented Images Part 2
  • 6. Read this Lesson if you have issues visualising your images.html
  • 7. Creating Data Augmentation Function Part 1
  • 8. Creating Data Augmentation Function Part 2
  • 9. Checking Image Counts Before running the Function
  • 10. Correctional Video regarding our Function
  • 11. Augmenting Test Dataset and Creating test .lst Files
  • 12. Augmenting Train Dataset and Creating .lst File Part 1
  • 13. Augmenting Train Dataset and Creating .lst File Part 2
  • 14. Verifying that Data Augmentation has Worked
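Augmenting detection data means transforming the bounding boxes along with the pixels. For a horizontal flip with [0, 1]-normalized coordinates, the x-values mirror around 1 while the y-values are untouched — a small sketch of just that box arithmetic (the course's augmentation function covers more transforms):

```python
def hflip_box(xmin, ymin, xmax, ymax):
    """Horizontally flip a [0, 1]-normalized box: x mirrors, y is unchanged.
    Note the min/max swap: the old right edge becomes the new left edge."""
    return 1.0 - xmax, ymin, 1.0 - xmin, ymax

# A box hugging the left edge ends up hugging the right edge
print(hflip_box(0.1, 0.2, 0.3, 0.4))  # ≈ (0.7, 0.2, 0.9, 0.4)
```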

  • 8. Setting up and Creating our Training Job
  • 1. Increasing Service Quotas
  • 2. Installing dependencies and Packages
  • 3. Creating our RecordIO Files
  • 4. Uploading our RecordIO data to our S3 bucket
  • 5. Downloading Object Detection Algorithm from AWS ECR
  • 6. Setting up our Estimator Object
  • 7. Setting up Hyperparameters
  • 8. Additional Information for Hyperparameter Tuning in AWS.html
  • 9. Setting up Hyperparameter Ranges
  • 10. Setting up Hyperparameter Tuner
  • 11. Additional Information about mAP (Mean Average Precision).html
  • 12. Starting the Training Job Part 1
  • 13. Starting the Training Job Part 2
  • 14. More on mAP Scores.html
  • 15. Monitoring the Training Job
  • 16. Looking at our Finished Hyperparameter Tuning Job
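The estimator and tuner lessons above configure AWS's built-in Object Detection (SSD) algorithm. The hyperparameter names below follow that algorithm's documentation as I recall it (verify against the current AWS docs); every value is a placeholder, and the tuner's objective metric is `validation:mAP`:

```python
# Static hyperparameters passed to the Estimator (placeholder values)
static_hyperparameters = {
    "num_classes": 1,              # single class: plastic
    "num_training_samples": 1000,  # must match your train .rec file
    "base_network": "resnet-50",
    "image_shape": 512,
    "epochs": 30,
    "mini_batch_size": 16,
    "optimizer": "sgd",
}

# Ranges the HyperparameterTuner searches over, maximizing validation:mAP
tunable_ranges = {
    "learning_rate": (1e-4, 1e-2),
    "momentum": (0.8, 0.99),
}
```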

  • 9. Analysing Training Job Results
  • 1. Deploying our Model in a Notebook
  • 2. Creating Visualization Function for Inferences
  • 3. Testing our Endpoint Part 1
  • 4. Testing our Endpoint Part 2
  • 5. Testing our Endpoint with Random Images from the Internet
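Testing the endpoint means parsing its JSON response. The built-in algorithm returns, per the AWS docs as I recall them, `{"prediction": [[class_index, score, xmin, ymin, xmax, ymax], ...]}` with [0, 1]-normalized coordinates; low-score boxes are filtered by a confidence threshold before visualisation. A hedged sketch (function name and sample body are mine):

```python
import json

def parse_detections(response_body, threshold=0.5):
    """Keep only predictions whose confidence score meets the threshold."""
    preds = json.loads(response_body)["prediction"]
    return [p for p in preds if p[1] >= threshold]

# Fabricated sample response: one confident box, one noise box
body = json.dumps({"prediction": [[0.0, 0.92, 0.1, 0.1, 0.4, 0.5],
                                  [0.0, 0.12, 0.6, 0.6, 0.7, 0.7]]})
kept = parse_detections(body, threshold=0.5)
```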

  • 10. Setting up Batch Transformation
  • 1. Setting up Batch Transformation Job locally first
  • 2. Starting our Batch Transformation Job
  • 3. Analysing our Batch Transformation Job
  • 4. Visualising Batch Transformation Results
  • 5. Look at this lesson if you have trouble with the Visualisations.html

  • 11. Setting Up Our Machine Learning Pipeline
  • 1. Read this Before Watching the Next Lesson.html
  • 2. Setting up AWS Step Function
  • 3. Verify that CloudFormation has worked
  • 4. Configure Batch Transform Lambda Part 1
  • 5. Configure Batch Transform Lambda Part 2
  • 6. Create Check Batch Transform Job Lambda
  • 7. Fixing Typos and Syntax Errors
  • 8. JSON output Format
  • 9. Creating Cleaning Batch output Lambda Function Part 1
  • 10. Creating Cleaning Batch output Lambda Function Part 2
  • 11. Configuring our Step Function Part 1
  • 12. Configuring our Step Function Part 2
  • 13. Configuring our Step Function Part 3
  • 14. Upload Test Data to S3
  • 15. Testing our Step Function
  • 16. Fixing Errors
  • 17. Testing our Step Function with the Corrections
  • 18. Verifying that our Step Function Ran Successfully
  • 19. Downloading our JSON file from S3
  • 20. Using Event Bridge to set up Cron Job for our Machine Learning Pipeline
  • 21. Verify that the Cron Job works
  • 22. Verifying that our Pipeline Ran Successfully
  • 23. Setting up Production Notebook
  • 24. Extending Our Machine Learning Pipeline
  • 25. Coding our Process Job Notebook Part 1
  • 26. Coding our Process Job Notebook Part 2
  • 27. Coding our Process Job Notebook Part 3
  • 28. Coding our Process Job Notebook Part 4
  • 29. Verifying that the Images have been Saved Properly.html
  • 30. Productionizing our Notebook Part 1
  • 31.1 Link to the Trust Policy.html
  • 31. Productionizing our Notebook Part 2
  • 32. Verify that the Entire Machine Learning Pipeline works
  • 33. Delete Unused Items from SageMaker EFS
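The "Check Batch Transform Job" Lambda above polls the job and tells the Step Function which branch to take. The real Lambda would call boto3's `describe_transform_job`, whose `TransformJobStatus` is one of `InProgress`, `Completed`, `Failed`, `Stopping`, `Stopped` (per the SageMaker API reference). Here is only the branching logic, with hypothetical state names standing in for the course's actual states:

```python
def next_step(transform_job_status):
    """Map a SageMaker TransformJobStatus to the Step Functions branch to take."""
    if transform_job_status == "Completed":
        return "CleanOutput"   # hypothetical success state (clean batch output)
    if transform_job_status in ("Failed", "Stopped"):
        return "FailPipeline"  # hypothetical failure state
    return "WaitAndRetry"      # still running: loop back through a Wait state
```

Keeping this decision pure (status string in, state name out) makes the Lambda trivially unit-testable without AWS calls.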

  • 12. Creating our Web Application
  • 1. Clone the Web Application from GitHub
  • 2. Set up MongoDB
  • 3. Connect to MongoDB and get AWS Credentials
  • 4. Configuring the Env File
  • 5. Install Node modules
  • 6.1 Article About Next.js proxy server.html
  • 6. MERN app Walkthrough Part 1
  • 7. MERN app Walkthrough Part 2
  • 8. MERN app Walkthrough Part 3
  • 9. Output Images Explanation.html
  • 10. MERN app Walkthrough Part 4
  • 11. MERN app Walkthrough Part 5

  • 13. Outro
  • 1. Clean Up Resources
  • 2. Congratulations
    Price: 45,900 Toman
    Producer:
    Instructor:
    ID: 10590
    Size: 7519 MB
    Duration: 978 minutes
    Release date: 8 Ordibehesht 1402 (April 28, 2023)