
Hyperparameters

A video discussing hyperparameters in deep learning, including learning rate and batch size.

What are hyperparameters?

As you may know if you’ve studied machine learning before, hyperparameters are the settings of a learning algorithm that are fixed before training and do not change during the learning process, unlike model parameters (such as network weights), which are learned from the data.

Important hyperparameters that are commonly tuned when training deep learning models include:

  • learning rate – the step size used when updating the model’s weights
  • batch size – the number of training examples processed before each weight update.

We will talk more about how to adjust these hyperparameters in the video.
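
To make this concrete, below is a minimal sketch in Python using Keras. It is not taken from the course material: the model, the toy data and the specific values (learning rate 0.001, batch size 32) are illustrative assumptions. It simply shows where these hyperparameters are fixed before training begins, while the model’s weights are what change during training.

  # Minimal illustrative sketch (not from the course): learning rate and
  # batch size are chosen before training and stay fixed while fit() runs.
  import numpy as np
  import tensorflow as tf

  # Toy data: 1000 samples with 20 features, binary labels (illustrative only).
  X = np.random.rand(1000, 20).astype("float32")
  y = np.random.randint(0, 2, size=(1000, 1)).astype("float32")

  # A small fully connected network for binary classification.
  model = tf.keras.Sequential([
      tf.keras.Input(shape=(20,)),
      tf.keras.layers.Dense(32, activation="relu"),
      tf.keras.layers.Dense(1, activation="sigmoid"),
  ])

  # Hyperparameters, set prior to training:
  learning_rate = 1e-3   # step size for each weight update
  batch_size = 32        # number of samples processed per weight update

  model.compile(
      optimizer=tf.keras.optimizers.Adam(learning_rate=learning_rate),
      loss="binary_crossentropy",
      metrics=["accuracy"],
  )

  # The weights (model parameters) are updated during fit();
  # the hyperparameters above are not.
  model.fit(X, y, batch_size=batch_size, epochs=5, verbose=0)

Trying a few different values for these hyperparameters (for example, learning rates of 0.01, 0.001 and 0.0001) and comparing the resulting training curves is a common, simple way to tune them.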

This article is from the free online course Deep Learning for Bioscientists.
