Basic_Neural_Network

Simple Multi-Layer Perceptron (MLP) used for teaching BSc/MSc Data Science students. The material was delivered across two lab sessions (two hours each) and introduces students to the basics of neural networks.

Important aspects included:

  • Demonstrating the importance of vectorized code, achieved by comparing plain Python 'for loops' against NumPy's in-built functionality. See the 'Vectorization Teaching Version' Notebook and the timing sketch below.

  • Demonstrating the importance of activation functions. See the 'Activation Functions Teaching Version' Notebook and the activation function sketch below.

  • Walking through the Python implementation of a Multi-Layer Perceptron (MLP). The Notebook also allows students to investigate the impact of changing network parameters, such as the number of neurons in each layer and the alpha backpropagation parameter (learning rate). See the 'MLP Walkthrough Teaching Version' Notebook and the minimal MLP sketch below.

The 'wine' and 'iris' datasets are available at: https://archive.ics.uci.edu/ml/datasets.php
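
The vectorization point can be illustrated with a small timing comparison like the sketch below. It is not taken from the notebook; the array size, weight values, and timing approach are illustrative only.

```python
import time
import numpy as np

# Illustrative problem size (not from the notebook)
n = 1_000_000
x = np.random.rand(n)
w = np.random.rand(n)

# Loop-based dot product: one Python-level multiply/add per element
start = time.perf_counter()
total_loop = 0.0
for i in range(n):
    total_loop += x[i] * w[i]
loop_time = time.perf_counter() - start

# Vectorized dot product: the same sum computed inside NumPy's compiled code
start = time.perf_counter()
total_numpy = np.dot(x, w)
numpy_time = time.perf_counter() - start

print(f"loop:  {loop_time:.4f} s")
print(f"numpy: {numpy_time:.4f} s")
```

On a typical machine the NumPy version runs dramatically faster, which is the point the notebook makes for neural network code: every forward and backward pass is built from exactly these kinds of array operations.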
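
The role of activation functions can be sketched as below. The exact functions used in the notebook are not restated here; sigmoid and ReLU are shown as common examples.

```python
import numpy as np

def sigmoid(z):
    # Squashes any real input into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    # Passes positive inputs through unchanged, zeroes out negatives
    return np.maximum(0.0, z)

z = np.linspace(-5.0, 5.0, 11)
print("sigmoid:", np.round(sigmoid(z), 3))
print("relu:   ", relu(z))
```

Without a non-linear activation between layers, a stack of layers collapses into a single linear transformation, so the network cannot learn non-linear decision boundaries.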
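
Below is a minimal one-hidden-layer MLP, offered only as an assumed sketch of the ideas the walkthrough covers (the repository's actual implementation may differ). The `hidden_neurons` and `alpha` arguments correspond to the parameters students are invited to vary.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_mlp(X, y, hidden_neurons=8, alpha=0.1, epochs=1000, seed=0):
    """Train a one-hidden-layer MLP with sigmoid activations using
    full-batch gradient descent on a squared-error loss."""
    rng = np.random.default_rng(seed)

    # Small random initial weights for both layers
    W1 = rng.normal(scale=0.1, size=(X.shape[1], hidden_neurons))
    W2 = rng.normal(scale=0.1, size=(hidden_neurons, y.shape[1]))

    for _ in range(epochs):
        # Forward pass
        hidden = sigmoid(X @ W1)
        output = sigmoid(hidden @ W2)

        # Backward pass: error signal propagated through the sigmoid derivatives
        output_delta = (output - y) * output * (1.0 - output)
        hidden_delta = (output_delta @ W2.T) * hidden * (1.0 - hidden)

        # Gradient descent update, scaled by the learning rate alpha
        W2 -= alpha * hidden.T @ output_delta
        W1 -= alpha * X.T @ hidden_delta

    return W1, W2

# Tiny illustrative run on XOR (hypothetical data, not the wine/iris sets)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
W1, W2 = train_mlp(X, y, hidden_neurons=4, alpha=0.5, epochs=10000)
print(np.round(sigmoid(sigmoid(X @ W1) @ W2), 2))
```

Increasing `hidden_neurons` gives the network more capacity, while `alpha` controls how large each weight update is; values that are too large can make training unstable, and values that are too small make it slow.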
