Welcome to the Nanodegree program

Introduction to Software Engineering

In this lesson, you’ll write production-level code and practice object-oriented programming, which you can integrate into machine learning projects.

Software Engineering Practices Pt I

Software Engineering Practices Pt II

OOP

Portfolio Exercise: Upload a Package to PyPI

Cloud Computing

Introduction to Deployment

Learn how to deploy machine learning models to a production environment using Amazon SageMaker.

Building a Model using SageMaker

Implement and use a model

04. Hyperparameter Tuning

Updating a Model

Project: Deploying a Sentiment Analysis Model

Population Segmentation

Apply machine learning techniques to solve real-world tasks; explore data and deploy both built-in and custom-made Amazon SageMaker models.

Payment Fraud Detection

Interview Segment: SageMaker

Deploying Custom Models

Time-Series Forecasting

Project

Introduction to NLP

Learn Natural Language Processing, one of the fields with the most real-world applications of Deep Learning.

Implementation of RNN & LSTM

Sentiment Prediction RNN

Convolutional Neural Networks

Transfer Learning

Weight Initialization

Autoencoders

Job Search

Find your dream job with continuous learning and constant effort.

Refine Your Entry-Level Resume

Craft Your Cover Letter

Optimize Your GitHub Profile

Develop Your Personal Brand

01. Hyperparameter Tuning

In this lesson, we are going to look at how we can improve our models using one of SageMaker’s features. In particular, we are going to explore how to use SageMaker to perform hyperparameter tuning.

In many machine learning models there are some parameters that must be specified by the model creator and cannot be determined directly from the data itself. The usual approach to finding the best values is to train many models with different hyperparameters and then choose the one that performs best.
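The brute-force approach described above can be sketched in a few lines of plain Python. Everything here is a hypothetical stand-in: the "validation error" is a toy function of two made-up hyperparameters, standing in for actually training and evaluating a model for each candidate.

```python
import itertools

def validation_error(learning_rate, num_layers):
    # Hypothetical stand-in for "train a model with these
    # hyperparameters and measure its error on a validation set".
    return (learning_rate - 0.1) ** 2 + (num_layers - 3) ** 2 * 0.01

# Candidate values specified by the model creator.
learning_rates = [0.001, 0.01, 0.1, 1.0]
layer_counts = [1, 2, 3, 4]

# "Train" one model per combination and keep the best one.
best_params, best_error = None, float("inf")
for lr, layers in itertools.product(learning_rates, layer_counts):
    err = validation_error(lr, layers)
    if err < best_error:
        best_params, best_error = (lr, layers), err

print(best_params)  # the combination with the lowest validation error
```

This exhaustive grid search gets expensive quickly, which is why an automated, smarter search is attractive.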

SageMaker provides an automated way of doing this, and it does so intelligently, using Bayesian optimization. We specify ranges for our hyperparameters; SageMaker then explores different choices within those ranges, improving the performance of our model over time.
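As a sketch of how this job configuration looks with the SageMaker Python SDK: the estimator `xgb`, the metric name, the range bounds, and the S3 channels below are all illustrative assumptions, not values from this lesson. This fragment only configures a tuning job; launching it requires an AWS account and a configured estimator.

```python
from sagemaker.tuner import (
    HyperparameterTuner,
    ContinuousParameter,
    IntegerParameter,
)

# Ranges for SageMaker to explore. The keys must match hyperparameter
# names the estimator understands (these XGBoost-style names are
# illustrative).
hyperparameter_ranges = {
    "eta": ContinuousParameter(0.05, 0.5),
    "max_depth": IntegerParameter(3, 12),
}

# `xgb` is assumed to be an already-configured SageMaker Estimator.
tuner = HyperparameterTuner(
    estimator=xgb,
    objective_metric_name="validation:rmse",
    objective_type="Minimize",
    hyperparameter_ranges=hyperparameter_ranges,
    max_jobs=9,           # total training jobs to run
    max_parallel_jobs=3,  # how many run at once
)

# Launch the tuning job against (hypothetical) S3 input channels:
# tuner.fit({"train": s3_train_data, "validation": s3_val_data})
```

Because SageMaker uses the results of earlier training jobs to pick the next hyperparameter values, running fewer jobs in parallel lets it make better-informed choices, at the cost of wall-clock time.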

In addition to learning how to use hyperparameter tuning, we will look at Amazon’s CloudWatch service. For our purposes, CloudWatch provides a user interface through which we can examine various logs generated during training. This can be especially useful when diagnosing errors.
