NullOsama/RNN-Units

Folders and files

NameName
Last commit message
Last commit date

Latest commit

 

History

26 Commits
 
 
 
 
 
 
 
 
 
 
 
 

Repository files navigation

RNN-Units

Low level implementation of basic and LSTM Recurrent units

Using NumPy only, I have implemented forward propagation for a standard RNN cell and an LSTM cell.
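The repository's own code may differ in detail; the following is a minimal sketch of the two forward steps described above, using the Deep Learning Specialization notation (the parameter names `Wax`, `Waa`, `Wya`, `Wf`, `Wi`, `Wc`, `Wo` and the column-stacked `[a<t-1>; x<t>]` layout are assumptions, not taken from the repo):

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the first axis (one column per example)
    e = np.exp(z - z.max(axis=0, keepdims=True))
    return e / e.sum(axis=0, keepdims=True)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def rnn_cell_forward(xt, a_prev, Wax, Waa, Wya, ba, by):
    """One time step of a basic RNN:
    a<t> = tanh(Waa a<t-1> + Wax x<t> + ba),  y<t> = softmax(Wya a<t> + by)."""
    a_next = np.tanh(Waa @ a_prev + Wax @ xt + ba)
    yt = softmax(Wya @ a_next + by)
    return a_next, yt

def lstm_cell_forward(xt, a_prev, c_prev, Wf, bf, Wi, bi, Wc, bc, Wo, bo):
    """One time step of an LSTM; each gate acts on the stacked [a<t-1>; x<t>]."""
    concat = np.vstack([a_prev, xt])
    ft = sigmoid(Wf @ concat + bf)    # forget gate
    it = sigmoid(Wi @ concat + bi)    # update (input) gate
    cct = np.tanh(Wc @ concat + bc)   # candidate cell state
    c_next = ft * c_prev + it * cct   # new cell state
    ot = sigmoid(Wo @ concat + bo)    # output gate
    a_next = ot * np.tanh(c_next)     # new hidden state
    return a_next, c_next
```

With hidden size `n_a`, input size `n_x`, and batch size `m`, both cells map `(n_a, m)` hidden states to `(n_a, m)` hidden states; the LSTM gate matrices have shape `(n_a, n_a + n_x)` because they act on the concatenated state and input.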

Note: Math explanations and discussions are inspired by the Deep Learning Specialization (taught by Andrew Ng) on Coursera.

Working on deriving backpropagation...
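Since the backward pass is still being derived, here is one hedged sketch of the single-step gradient for the basic RNN cell; the cache tuple layout and function name are hypothetical, and only the chain rule through `a<t> = tanh(Waa a<t-1> + Wax x<t> + ba)` is used:

```python
import numpy as np

def rnn_cell_backward(da_next, cache):
    """Gradients for one basic-RNN step, given dL/da<t> in da_next.
    cache = (a_next, a_prev, xt, Waa, Wax), values saved by the forward step."""
    a_next, a_prev, xt, Waa, Wax = cache
    # d tanh(u)/du = 1 - tanh(u)^2, and a_next = tanh(u)
    dtanh = (1.0 - a_next ** 2) * da_next
    dWax = dtanh @ xt.T                         # gradient w.r.t. input weights
    dWaa = dtanh @ a_prev.T                     # gradient w.r.t. recurrent weights
    dba = dtanh.sum(axis=1, keepdims=True)      # bias gradient, summed over the batch
    dxt = Wax.T @ dtanh                         # gradient flowing to the input
    da_prev = Waa.T @ dtanh                     # gradient flowing to a<t-1>
    return dxt, da_prev, dWax, dWaa, dba
```

Backpropagation through time then accumulates `dWax`, `dWaa`, and `dba` across steps, adding `da_prev` from step t to the loss gradient arriving at step t-1.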

Contributing

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.

Please make sure to update tests as appropriate.
