
My Neural Network

This project implements neural networks from scratch, providing a flexible framework for building and training custom neural network architectures.

Overview

This repository contains a from-scratch implementation of neural networks in Python. The goal of this project is to provide a deep dive into the inner workings of neural networks, including forward and backward propagation, activation functions, and optimization techniques. It is a useful resource for understanding how neural networks work under the hood, as well as a solid starting point for building more advanced neural network models.
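
To make the forward and backward passes concrete, here is a minimal sketch of a fully connected layer with a plain gradient descent update, in the spirit of a from-scratch framework like this one. The class name FCLayer and the forward/backward signatures are illustrative assumptions, not necessarily the API exposed by this repository.

```python
import numpy as np

class FCLayer:
    """Illustrative fully connected layer: y = x @ W + b (not the repository's actual class)."""

    def __init__(self, input_size, output_size):
        # Small random weights and zero biases (illustrative initialization).
        self.weights = np.random.randn(input_size, output_size) * 0.1
        self.bias = np.zeros((1, output_size))

    def forward(self, x):
        self.input = x  # cache the input for the backward pass
        return x @ self.weights + self.bias

    def backward(self, output_grad, learning_rate):
        # Gradients of the loss with respect to the weights, bias, and input.
        weights_grad = self.input.T @ output_grad
        input_grad = output_grad @ self.weights.T
        # Plain gradient descent update.
        self.weights -= learning_rate * weights_grad
        self.bias -= learning_rate * output_grad.sum(axis=0, keepdims=True)
        return input_grad
```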

Features

  • Fully Customizable Architectures: Build your own neural network architectures by modifying layers and activation functions.
  • Backpropagation and Optimization: Implement backpropagation for training, with the ability to customize optimization methods such as gradient descent.
  • Support for Various Activation Functions: Includes support for common activation functions like ReLU, Sigmoid, and Tanh.
  • Training and Evaluation: Tools to train models and evaluate their performance on test data (see the end-to-end sketch after this list).
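
As an end-to-end illustration of the training and evaluation workflow, the sketch below trains a tiny two-layer network on the classic XOR problem using plain NumPy. It deliberately avoids assuming the repository's class names and shows only the underlying mechanics (forward pass, mean squared error, backpropagation, gradient descent); the hidden size, learning rate, and epoch count are arbitrary choices.

```python
import numpy as np

# XOR dataset: inputs and target outputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Two dense layers, 2 -> 3 -> 1, each followed by a tanh activation.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(scale=0.5, size=(2, 3)), np.zeros((1, 3))
W2, b2 = rng.normal(scale=0.5, size=(3, 1)), np.zeros((1, 1))
lr = 0.1

for epoch in range(10000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    out = np.tanh(h @ W2 + b2)

    # Gradient of the mean squared error with respect to the output.
    grad_out = 2 * (out - y) / len(X)

    # Backward pass: chain rule through tanh and the dense layers.
    d2 = grad_out * (1 - out ** 2)
    dW2, db2 = h.T @ d2, d2.sum(axis=0, keepdims=True)
    d1 = (d2 @ W2.T) * (1 - h ** 2)
    dW1, db1 = X.T @ d1, d1.sum(axis=0, keepdims=True)

    # Gradient descent updates.
    W2 -= lr * dW2
    b2 -= lr * db2
    W1 -= lr * dW1
    b1 -= lr * db1

# Evaluation: predictions should be close to [[0], [1], [1], [0]].
print(np.round(np.tanh(np.tanh(X @ W1 + b1) @ W2 + b2), 2))
```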

Requirements

  • Python 3.x
  • NumPy

Brain

This code is based on the work of omaraflak.
