# minigrad

minigrad is a toy implementation of an autograd engine, inspired by Andrej Karpathy's micrograd project, with an API similar to that of PyTorch. It supports both CPU and GPU operations using NumPy and CuPy, respectively.

My goal with minigrad is to explore and learn the internals of deep learning frameworks; minigrad is not optimized for speed or efficiency. It currently provides basic tensor operations, backpropagation, neural network layers, optimizers, and loss functions.

## Features

- **Tensor Operations**: Supports basic tensor operations with automatic differentiation.
- **CPU and GPU Support**: Uses NumPy for CPU operations and CuPy for GPU operations.
- **Neural Networks**: Includes simple layers like `Linear` and activations like `ReLU`.
- **Optimizers**: Implements basic optimizers like `SGD`.
- **Loss Functions**: Provides loss functions like `MSELoss`.

## Installation

To install minigrad:

```bash
pip install minigrad-python
```
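
GPU support relies on CuPy, which is distributed as CUDA-version-specific wheels. If installing minigrad does not pull in a CuPy build for your system, a matching wheel can be installed separately. The command below is a general CuPy installation note for CUDA 12, not a confirmed requirement of minigrad's packaging:

```bash
pip install cupy-cuda12x
```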

## Usage

### Basic Usage

```python
import minigrad as mg

# Create two tensors on the GPU (use device='cpu' for NumPy-backed tensors)
a = mg.Tensor([1, 2, 3], device='gpu')
b = mg.Tensor([2, 3, 4], device='gpu')

# Build a small computation graph: elementwise add, then reduce to a scalar
c = a + b
d = c.sum()

# Backpropagate from the scalar output to populate .grad on the leaf tensors
d.backward()
print(a.grad)
print(b.grad)
```
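
Since `d` is the sum of the elements of `a + b`, the gradient of `d` with respect to each element of `a` and `b` is 1, so both `print` calls should show a gradient of all ones (the exact printed representation depends on the backend array type).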

### Training a Neural Network

- A very simple example of training a neural network can be found in `examples/simple_nn.py`; a rough sketch of what such a script might look like follows below.
- TODO: Add more examples.
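
The following is a minimal, illustrative sketch of a training loop. It assumes a PyTorch-like layout in which `Linear`, `ReLU`, and `MSELoss` live under `mg.nn` and `SGD` under `mg.optim`; the actual module paths, constructor signatures, and parameter-access API in minigrad may differ, so treat `examples/simple_nn.py` as the authoritative reference.

```python
import minigrad as mg

# Assumed module paths: the README only confirms that Linear, ReLU,
# SGD, and MSELoss exist, not where they are exposed.
layers = [mg.nn.Linear(3, 8), mg.nn.ReLU(), mg.nn.Linear(8, 1)]
loss_fn = mg.nn.MSELoss()
params = [p for layer in layers for p in layer.parameters()]  # assumed .parameters() API
optimizer = mg.optim.SGD(params, lr=0.01)

x = mg.Tensor([[0.5, -1.2, 3.0]], device='cpu')  # one training sample
y = mg.Tensor([[1.0]], device='cpu')             # its target

for step in range(100):
    # Forward pass: feed the input through each layer in sequence
    out = x
    for layer in layers:
        out = layer(out)

    loss = loss_fn(out, y)

    # Backward pass and parameter update, PyTorch-style
    optimizer.zero_grad()  # assumed to reset accumulated gradients
    loss.backward()
    optimizer.step()
```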

## Run Tests

To run the unit tests and benchmarks for the project:

```bash
pytest
```
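
Standard pytest flags apply; for example, `pytest -v` prints each test name as it runs.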