Implements a neural network with a single hidden layer in NumPy to predict bike rentals.
View the results of my trained network:
This Jupyter notebook runs my neural network (with results embedded).
Here is my code for this neural net.
I wrote:
- the backprop algorithm and
- the forward prop algorithm (see the sketch after this list),

and tuned the training parameters:
- the number of epochs,
- the number of hidden nodes,
- the number of output nodes, and
- the learning rate.
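For reference, here is a minimal NumPy sketch of the forward and backward passes for a single-hidden-layer regression network like this one, assuming a sigmoid hidden layer and a linear output unit; the function names and weight shapes are illustrative, not the project's exact code.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative weight matrices:
#   w_ih: (n_input, n_hidden), w_ho: (n_hidden, n_output)

def forward_pass(X, w_ih, w_ho):
    """Forward prop: sigmoid hidden layer, linear output (regression)."""
    hidden = sigmoid(X @ w_ih)        # (batch, n_hidden)
    output = hidden @ w_ho            # (batch, n_output)
    return hidden, output

def backprop(X, y, hidden, output, w_ho):
    """Backprop: accumulate the weight-step terms over the whole batch."""
    error = y - output                               # output error (linear output)
    hidden_error = error @ w_ho.T                    # error propagated to hidden layer
    hidden_grad = hidden_error * hidden * (1.0 - hidden)  # sigmoid derivative
    delta_w_ho = hidden.T @ error                    # (n_hidden, n_output)
    delta_w_ih = X.T @ hidden_grad                   # (n_input, n_hidden)
    return delta_w_ih, delta_w_ho
```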
Udacity Grader's review of my project.
(Note: I have since updated my code to divide by the batch size in the update_weights function,
and moved the multiplication by the learning rate into that same update_weights function.)
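Roughly, that updated convention looks like the following sketch, which applies the accumulated steps from the backprop sketch above (parameter names are mine, not necessarily the project's):

```python
def update_weights(w_ih, w_ho, delta_w_ih, delta_w_ho, lr, batch_size):
    # Average the accumulated steps over the batch and apply the learning rate here,
    # rather than scaling by lr inside the backprop step.
    w_ih += lr * delta_w_ih / batch_size
    w_ho += lr * delta_w_ho / batch_size
    return w_ih, w_ho
```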
This started as a copy of Your First Neural Network directory from the course repo.
This repository contains material related to Udacity's Deep Learning Nanodegree Foundation program.
It consists of a bunch of tutorial notebooks for various deep learning topics. In most cases, the notebooks lead you through implementing models such as convolutional networks, recurrent networks, and GANs. There are other topics covered such as weight initialization and batch normalization.
There are also notebooks used as projects for the Nanodegree program. In the program itself, the projects are reviewed by Udacity experts, but they are available here as well.
- Sentiment Analysis with Numpy: Andrew Trask leads you through building a sentiment analysis model, predicting if some text is positive or negative.
- Intro to TensorFlow: Start building neural networks with TensorFlow.
- Weight Initialization: Explore how initializing network weights affects performance.
- Autoencoders: Build models for image compression and denoising, using feed-forward and convolution networks in TensorFlow.
- Transfer Learning (ConvNet): In practice, most people don't train their own large networks on huge datasets, but use pretrained networks such as VGGnet. Here you'll use VGGnet to classify images of flowers without training a network on the images themselves.
- Intro to Recurrent Networks (Character-wise RNN): Recurrent neural networks are able to use information about the sequence of data, such as the sequence of characters in text.
- Embeddings (Word2Vec): Implement the Word2Vec model to find semantic representations of words for use in natural language processing.
- Sentiment Analysis RNN: Implement a recurrent neural network that can predict if a text sample is positive or negative.
- Tensorboard: Use TensorBoard to visualize the network graph, as well as how parameters change through training.
- Reinforcement Learning (Q-Learning): Implement a deep Q-learning network to play a simple game from OpenAI Gym.
- Sequence to sequence: Implement a sequence-to-sequence recurrent network.
- Batch normalization: Learn how to improve training rates and network stability with batch normalizations.
- Generative Adversarial Network on MNIST: Train a simple generative adversarial network on the MNIST dataset.
- Deep Convolutional GAN (DCGAN): Implement a DCGAN to generate new images based on the Street View House Numbers (SVHN) dataset.
- Intro to TFLearn: A couple introductions to a high-level library for building neural networks.
- Your First Neural Network: Implement a neural network in Numpy to predict bike rentals.
- Image classification: Build a convolutional neural network with TensorFlow to classify CIFAR-10 images.
- Text Generation: Train a recurrent neural network on scripts from The Simpsons (copyright Fox) to generate new scripts.
- Machine Translation: Train a sequence-to-sequence network for English-to-French translation (on a simple dataset).
- Face Generation: Use a DCGAN on the CelebA dataset to generate images of novel and realistic human faces.
Each directory has a requirements.txt file describing the minimal dependencies required to run the notebooks in that directory. To install these dependencies with pip, you can issue pip3 install -r requirements.txt.
You can find Conda environment files for the Deep Learning program in the environments folder. Note that environment files are platform-dependent. Versions with tensorflow-gpu are labeled in the filename with "GPU".