
Week 3 summary

A summary of week three of the course Deep Learning for Bioscientists, covering the training loop, hyperparameters, datasets and dataloaders.
Well done – we are now more than halfway through the course!

By now, we hope you have a basic working understanding of what convolutional neural networks are, how they are trained, and the types of data that are used in them.

This week we have covered:

  • the training loop, including:
    • network output and loss functions
    • how to make a training loop in PyTorch (see the first sketch after this list)
  • adjusting hyperparameters such as learning rate and batch size
  • datasets and dataloaders – software objects that help with managing your data and streamlining performance (see the second sketch after this list).
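
As a quick recap of the training loop, here is a minimal sketch in PyTorch. It uses a small stand-in network and random data rather than anything from the course, and the layer sizes, learning rate, batch size and number of epochs are illustrative values only:

```python
import torch
from torch import nn

# A small stand-in network (layer sizes are illustrative, not from the course)
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 3))

loss_fn = nn.CrossEntropyLoss()                           # loss for a classification task
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)  # learning rate is a hyperparameter

# Random stand-in data: 64 samples, 10 features each, 3 classes
inputs = torch.randn(64, 10)
targets = torch.randint(0, 3, (64,))

batch_size = 16                                           # batch size is another hyperparameter
for epoch in range(5):
    for start in range(0, len(inputs), batch_size):
        x = inputs[start:start + batch_size]
        y = targets[start:start + batch_size]

        optimizer.zero_grad()        # clear gradients from the previous step
        outputs = model(x)           # forward pass: the network output
        loss = loss_fn(outputs, y)   # compare the output with the target labels
        loss.backward()              # backward pass: compute gradients
        optimizer.step()             # update the weights
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```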
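
And here is a minimal sketch of a custom Dataset wrapped in a DataLoader, again using random stand-in data; in a real project, __getitem__ would typically load and transform an image from disk:

```python
import torch
from torch.utils.data import Dataset, DataLoader

class ToyDataset(Dataset):
    """A toy dataset holding random in-memory tensors (for illustration only)."""

    def __init__(self, n_samples=100):
        self.features = torch.randn(n_samples, 10)
        self.labels = torch.randint(0, 3, (n_samples,))

    def __len__(self):
        return len(self.labels)                          # number of samples

    def __getitem__(self, idx):
        return self.features[idx], self.labels[idx]      # one (input, label) pair

dataset = ToyDataset()

# The DataLoader handles batching, shuffling and (optionally) parallel loading
loader = DataLoader(dataset, batch_size=16, shuffle=True, num_workers=0)

for x_batch, y_batch in loader:
    print(x_batch.shape, y_batch.shape)                  # torch.Size([16, 10]) torch.Size([16])
    break
```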

Next week, we will put together all we have learned so far and build a working image classifier using a publicly available image dataset depicting different flower species. After this, we will see how to adapt the network to perform a regression task rather than a classification task.
