Simple Feedforward Neural Network Implementation

This challenge asks you to implement a basic feedforward neural network from scratch in Python. Neural networks are fundamental to machine learning, enabling systems to learn complex patterns from data. Successfully completing this challenge will give you a solid understanding of the core components and operations within a neural network.

Problem Description

You are tasked with creating a simple feedforward neural network with one hidden layer. The network should be able to take an input vector, process it through the hidden layer, and produce an output vector. The activation function for both layers should be the sigmoid function. The network should include methods for initialization, forward propagation, and backpropagation to update weights and biases.

What needs to be achieved:

  • Implement a NeuralNetwork class with the following methods:

    • __init__(self, input_size, hidden_size, output_size): Initializes the network with the specified input, hidden, and output sizes. Weights and biases should be initialized randomly (e.g., using a normal distribution with mean 0 and standard deviation 1).
    • forward(self, input_data): Performs forward propagation through the network, returning the output vector.
    • backward(self, input_data, target_data, learning_rate): Performs backpropagation to calculate gradients and update weights and biases based on the provided input data, target data, and learning rate.
    • predict(self, input_data): Performs forward propagation and returns the predicted output.
  • The network should use the sigmoid activation function.
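The API above can be sketched as follows. This is one possible shape of a solution, not the reference implementation: it assumes 1-D NumPy vectors for inputs and targets and a squared-error loss (the problem statement does not fix the loss function), and it caches activations in `forward` for reuse in `backward`.

```python
import numpy as np

class NeuralNetwork:
    """One-hidden-layer feedforward network with sigmoid activations."""

    def __init__(self, input_size, hidden_size, output_size):
        # Weights and biases drawn from N(0, 1), as the problem suggests.
        self.w1 = np.random.randn(input_size, hidden_size)
        self.b1 = np.random.randn(hidden_size)
        self.w2 = np.random.randn(hidden_size, output_size)
        self.b2 = np.random.randn(output_size)

    @staticmethod
    def _sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def forward(self, input_data):
        # Cache activations so backward() can reuse them.
        self.a1 = self._sigmoid(input_data @ self.w1 + self.b1)
        self.a2 = self._sigmoid(self.a1 @ self.w2 + self.b2)
        return self.a2

    def backward(self, input_data, target_data, learning_rate):
        output = self.forward(input_data)
        # Squared-error loss; sigmoid'(z) = a * (1 - a).
        delta2 = (output - target_data) * output * (1.0 - output)
        delta1 = (delta2 @ self.w2.T) * self.a1 * (1.0 - self.a1)
        self.w2 -= learning_rate * np.outer(self.a1, delta2)
        self.b2 -= learning_rate * delta2
        self.w1 -= learning_rate * np.outer(input_data, delta1)
        self.b1 -= learning_rate * delta1

    def predict(self, input_data):
        return self.forward(input_data)
```

Note that `predict` here is just `forward` without gradient bookkeeping; in this simple design the two can share code.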

Key Requirements:

  • Random weight and bias initialization.
  • Sigmoid activation function.
  • Forward propagation calculation.
  • Backpropagation calculation of gradients.
  • Weight and bias updates using the calculated gradients.

Expected Behavior:

The network should learn to approximate a simple function (e.g., XOR) given sufficient training iterations and a suitable learning rate. The backward method should update the weights and biases so as to reduce the error between the network's output and the target output. The predict method should return the network's output for a given input.

Edge Cases to Consider:

  • Input data with zero values.
  • Learning rates that are too high (causing instability) or too low (causing slow convergence).
  • Incorrect dimensions of input and target data.
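One way to guard against the third edge case is to validate shapes before doing any matrix math, so failures are explicit rather than cryptic broadcasting errors. `validate_shapes` below is a hypothetical helper name, not part of the required API, and it assumes 1-D inputs and targets.

```python
import numpy as np

def validate_shapes(input_data, target_data, input_size, output_size):
    """Convert to float arrays and fail fast on mismatched dimensions."""
    x = np.asarray(input_data, dtype=float)
    t = np.asarray(target_data, dtype=float)
    if x.shape != (input_size,):
        raise ValueError(f"expected input shape ({input_size},), got {x.shape}")
    if t.shape != (output_size,):
        raise ValueError(f"expected target shape ({output_size},), got {t.shape}")
    return x, t
```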

Examples

Example 1:

Input: input_size = 2, hidden_size = 3, output_size = 1
Output: A NeuralNetwork object with randomly initialized weights and biases.
Explanation: The constructor should create a network with the specified dimensions and initialize the weights and biases randomly.

Example 2:

Input: input_data = [0.5, 0.2], target_data = [0.8], learning_rate = 0.1
Output: The weights and biases of the network are updated after the backward pass.
Explanation: The backward method should calculate the gradients and update the weights and biases based on the input, target, and learning rate.

Example 3: (XOR training)

Input: A series of input/target pairs representing the XOR function, a learning rate, and a number of epochs.
Output: The network's weights and biases converge to values that allow it to accurately predict the XOR function.
Explanation: Repeatedly calling forward and backward with XOR data should train the network to approximate the XOR function.

Constraints

  • input_size, hidden_size, and output_size must be positive integers.
  • Input data and target data should be NumPy arrays.
  • The learning rate should be a positive floating-point number.
  • The sigmoid function should be implemented correctly.
  • The backpropagation algorithm should be implemented accurately.
  • The code should be reasonably efficient (avoid unnecessary computations).

Notes

  • You can use NumPy for numerical operations.
  • Consider using a sigmoid function defined as 1 / (1 + np.exp(-x)).
  • The backpropagation algorithm involves calculating the error gradient for each layer and updating the weights and biases accordingly.
  • Start with a small dataset and a simple function to test your implementation before moving on to more complex scenarios.
  • Debugging can be challenging; use print statements or a debugger to inspect the values of weights, biases, and activations during forward and backward propagation.
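A caveat on the suggested sigmoid formula: for large negative x, np.exp(-x) overflows and NumPy emits a RuntimeWarning (the result is still correct, since the overflowed value saturates to 0). If you want to avoid the warning, one numerically equivalent variant splits on the sign of x:

```python
import numpy as np

def sigmoid(x):
    """Overflow-free sigmoid: exp() is only ever called on non-positive values."""
    x = np.asarray(x, dtype=float)
    out = np.empty_like(x)
    pos = x >= 0
    neg = ~pos
    out[pos] = 1.0 / (1.0 + np.exp(-x[pos]))
    ex = np.exp(x[neg])          # x < 0 here, so exp(x) cannot overflow
    out[neg] = ex / (1.0 + ex)
    return out
```

The plain formula from the note above is fine for this challenge; this variant only matters if your pre-activations grow large in magnitude.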