Commit 1c2d507

Update docs.

francesco-innocenti committed Nov 27, 2024
1 parent 91073ff commit 1c2d507
Showing 4 changed files with 17 additions and 9 deletions.
6 changes: 6 additions & 0 deletions docs/api/Gradients.md
@@ -1,5 +1,11 @@
# Gradients

!!! note
    There are two similar functions to compute the activity gradient:
    `jpc.neg_activity_grad` and `jpc.compute_activity_grad`. The first is used
    by `jpc.solve_inference` as gradient flow, while the second is for
    compatibility with discrete optax optimisers such as gradient descent.
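To make the distinction concrete, here is a hedged, plain-Python sketch on a toy scalar energy (the `energy` and `neg_activity_grad` names and the single-activity setup below are illustrative stand-ins, not JPC's actual signatures): gradient flow integrates `-dE/dz` as an ODE in time, while a discrete optimiser takes optax-style steps on `dE/dz`.

```python
# Toy setting: one latent activity z squeezed between a top-down prediction mu
# and a clamped observation y. All names here are hypothetical stand-ins.
mu, y = 2.0, 4.0

def energy(z):
    # PC-style sum of squared prediction errors around z
    return 0.5 * (z - mu) ** 2 + 0.5 * (y - z) ** 2

def neg_activity_grad(z):
    # -dE/dz, the quantity driving the inference dynamics dz/dt = -dE/dz
    return -((z - mu) - (y - z))

# Gradient flow: integrate the ODE with a crude Euler scheme
# (jpc.solve_inference delegates this to a Diffrax ODE solver instead).
z, dt = 0.0, 0.1
for _ in range(200):
    z = z + dt * neg_activity_grad(z)

# Discrete optimiser: plain gradient descent on E, the form that optax-style
# optimisers expect (they consume dE/dz, hence the sign flip).
z_gd, lr = 0.0, 0.1
for _ in range(200):
    z_gd = z_gd - lr * (-neg_activity_grad(z_gd))

# Both settle at the energy minimum z* = (mu + y) / 2 = 3.0
```

With a fixed step size the two updates coincide; the practical difference is that an ODE solver can adapt its step size and integrate to a tolerance, while discrete optimisers bring machinery such as momentum.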

::: jpc.neg_activity_grad

---
4 changes: 4 additions & 0 deletions docs/api/Initialisation.md
@@ -1,5 +1,9 @@
# Initialisation

!!! info
    JPC provides 3 standard ways of initialising the activities: a feedforward
    pass, randomly, or using an amortised network.
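As a hedged, plain-Python illustration of the three strategies (toy scalar layers; none of the names below are JPC's actual API):

```python
import random

def layer(w, x):
    # toy "layer": scale every unit by a scalar weight
    return [w * xi for xi in x]

weights = [0.5, 2.0]      # one scalar weight per layer, for illustration
x_input = [1.0, -1.0]

# (1) Feedforward pass: each activity starts at its layer's prediction, so
#     every prediction error is exactly zero before inference begins.
activities_ffwd = []
h = x_input
for w in weights:
    h = layer(w, h)
    activities_ffwd.append(h)

# (2) Random: activities drawn from a chosen distribution.
random.seed(0)
activities_rand = [[random.gauss(0.0, 1.0) for _ in x_input] for _ in weights]

# (3) Amortised: a separate network (here an untrained, hypothetical one-liner)
#     maps the data to activity guesses in a single pass.
amortiser = lambda x: [0.1 * xi for xi in x]
activities_amort = [amortiser(x_input) for _ in weights]
```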

::: jpc.init_activities_with_ffwd

---
11 changes: 5 additions & 6 deletions docs/basic_usage.md
@@ -1,11 +1,10 @@
!!! info
    JPC provides two types of API depending on the use case:

    * a simple, high-level API that allows you to train and test models with
      predictive coding in a few lines of code, and
    * a more advanced API offering greater flexibility as well as additional
      features.

# Basic usage

At a high level, JPC provides a single convenience function `jpc.make_pc_step`
to update the parameters of a neural network with PC.
```py
...
```

@@ -49,7 +48,7 @@ sense that it's split into callable layers (see the
that the `input` is actually not needed for unsupervised training. In fact,
`jpc.make_pc_step` can be used for classification and generation tasks, for
supervised as well as unsupervised training (again see the [example notebooks
](https://thebuckleylab.github.io/jpc/examples/discriminative_pc/)).

Under the hood, `jpc.make_pc_step` uses [Diffrax
](https://github.com/patrick-kidger/diffrax) to solve the activity (inference)
@@ -59,8 +58,8 @@
variety of metrics such as loss, accuracy, and energies. See the [docs
](https://thebuckleylab.github.io/jpc/api/Training/#jpc.make_pc_step) for more
details.
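The two phases inside a PC parameter step can be sketched in plain Python on a one-weight toy model (everything below, including `pc_step`, is a hypothetical stand-in for what `jpc.make_pc_step` does with a real JAX network and a Diffrax solver):

```python
def pc_step(w, x, y, n_infer=100, dt=0.1, lr=0.01):
    # Energy for one hidden activity z between prediction w*x and label y:
    # E = 0.5*(z - w*x)**2 + 0.5*(y - z)**2

    # Phase 1: inference -- relax z on E with the weight w frozen.
    z = w * x                          # feedforward initialisation
    for _ in range(n_infer):
        dE_dz = (z - w * x) - (y - z)
        z -= dt * dE_dz

    # Phase 2: learning -- one gradient step on E w.r.t. w, with z frozen.
    dE_dw = -(z - w * x) * x
    w -= lr * dE_dw
    return w, z

# Repeated steps drive the weight towards the supervised solution w* = y / x.
w = 0.0
for _ in range(2000):
    w, z = pc_step(w, x=1.0, y=2.0)
# w ends close to 2.0
```

Note that `dt` plays the role of the inference step size and `lr` that of the weight-optimiser step size; in the real function both are configurable and the inference loop is replaced by an ODE solve.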

A similar convenience function `jpc.make_hpc_step` is provided for updating the
parameters of a hybrid PCN ([Tschantz et al., 2023
](https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1011280)).
```py
import jax.random as jr
...
```
5 changes: 2 additions & 3 deletions docs/index.md
@@ -130,18 +130,17 @@ If you found this library useful in your work, please cite (arXiv link):
```
...
```
Also consider starring the project [on GitHub](https://github.com/thebuckleylab/jpc)! ⭐️

## 🙏 Acknowledgements
We are grateful to Patrick Kidger for early advice on how to use Diffrax.

## See also: other PC libraries
JAX-based:

* [pcx](https://github.com/liukidar/pcx)
* [pyhgf](https://github.com/ComputationalPsychiatry/pyhgf)

PyTorch-based:

* [Torch2PC](https://github.com/RobertRosenbaum/Torch2PC)
* [pypc](https://github.com/infer-actively/pypc)
* [pybrid](https://github.com/alec-tschantz/pybrid)
