activation intrinsics for neural networks #860

Open · wants to merge 40 commits into fortran-lang:master from jalvesz:activations

Changes from 1 commit of 40 commits
2ff7029
start working on activations module
jalvesz Aug 13, 2024
7d1c6ad
softmax for ranks from 1 to 4
jalvesz Aug 15, 2024
c1303e7
move activations to specialfunctions, add specs
jalvesz Aug 17, 2024
f22756a
fix float constant definition
jalvesz Aug 17, 2024
b1a4180
fix float constant definition
jalvesz Aug 17, 2024
90b8de3
fix float constant definition
jalvesz Aug 17, 2024
b7c8c81
Merge branch 'fortran-lang:master' into activations
jalvesz Aug 19, 2024
1b3bf4f
update src CMakeLists
jalvesz Aug 19, 2024
f4ad250
add tests for activations
jalvesz Aug 19, 2024
9d7eb7c
add tests for sigmoid and gelu
jalvesz Aug 20, 2024
5727921
missing module procedure
jalvesz Aug 20, 2024
2ed7626
missing interface and change of kind definition for elemental module …
jalvesz Aug 20, 2024
f1acf1e
add SiLU activation
jalvesz Aug 21, 2024
230bea9
Merge branch 'fortran-lang:master' into activations
jalvesz Aug 21, 2024
dd7125d
Merge branch 'fortran-lang:master' into activations
jalvesz Sep 15, 2024
b137b36
Merge branch 'fortran-lang:master' into activations
jalvesz Sep 18, 2024
bc2bf5a
Merge branch 'fortran-lang:master' into activations
jalvesz Sep 24, 2024
5c47bf0
add any rank support for softmax and logsoftmax
jalvesz Sep 29, 2024
8f0cd69
Merge branch 'fortran-lang:master' into activations
jalvesz Oct 26, 2024
1a2245a
Merge branch 'activations' of https://github.com/jalvesz/stdlib into …
jalvesz Oct 26, 2024
5d0419e
homogenize arguments
jalvesz Oct 30, 2024
21851d0
add selu activation
jalvesz Dec 21, 2024
ef6e3e6
Merge branch 'activations' of https://github.com/jalvesz/stdlib into …
jalvesz Dec 22, 2024
1914e78
Add SELU documentation
jalvesz Dec 22, 2024
4c1afde
add tests
jalvesz Dec 23, 2024
bccbdd4
examples
jalvesz Dec 23, 2024
9b4ed49
fix relu example
jalvesz Dec 23, 2024
564c99c
fix tests
jalvesz Dec 23, 2024
9e9b28b
improve specs
jalvesz Dec 23, 2024
14af3f9
examples bugfix
jalvesz Dec 23, 2024
3789518
replace ifs with merge
jalvesz Dec 24, 2024
b36b143
Merge branch 'fortran-lang:master' into activations
jalvesz Dec 24, 2024
eedfad7
Merge branch 'fortran-lang:master' into activations
jalvesz Dec 26, 2024
2cba1ee
Merge branch 'fortran-lang:master' into activations
jalvesz Jan 2, 2025
8dc0654
Merge branch 'fortran-lang:master' into activations
jalvesz Jan 3, 2025
9e0f026
Merge branch 'activations' of https://github.com/jalvesz/stdlib into …
jalvesz Jan 3, 2025
cdde132
Merge branch 'fortran-lang:master' into activations
jalvesz Jan 5, 2025
4363271
Merge branch 'fortran-lang:master' into activations
jalvesz Jan 17, 2025
f06ab3b
Merge branch 'activations' of https://github.com/jalvesz/stdlib into …
jalvesz Jan 18, 2025
e483325
add leaky relu activation
jalvesz Jan 18, 2025
examples
jalvesz committed Dec 23, 2024
commit bccbdd41124c8276b0385ac4fd89f3a7e0664b06
30 changes: 30 additions & 0 deletions doc/specs/stdlib_specialfunctions_activations.md
@@ -33,6 +33,11 @@ Elemental function

The function returns a value with the same type and kind as the input argument.

### Example
```fortran
{!example/specialfunctions_activations/example_gaussian.f90!}
```

## `Gaussian_grad` - Gradient of the Gaussian function

### Status
@@ -94,6 +99,11 @@ Elemental function

The function returns a value with the same type and kind as the input argument.

### Example
```fortran
{!example/specialfunctions_activations/example_elu.f90!}
```

## `Elu_grad` - Gradient of the Exponential Linear Unit function

### Status
@@ -155,6 +165,11 @@ Elemental function

The function returns a value with the same type and kind as the input argument.

### Example
```fortran
{!example/specialfunctions_activations/example_relu.f90!}
```

## `Relu_grad` - Gradient of the Rectified Linear Unit function

### Status
@@ -215,6 +230,11 @@ Elemental function

The function returns a value with the same type and kind as the input argument.

### Example
```fortran
{!example/specialfunctions_activations/example_gelu.f90!}
```

## `Gelu_grad` - Gradient of the Gaussian Error Linear Unit function

### Status
@@ -335,6 +355,11 @@ Elemental function

The function returns a value with the same type and kind as the input argument.

### Example
```fortran
{!example/specialfunctions_activations/example_selu.f90!}
```

## `selu_grad` - Gradient of the Scaled Exponential Linear Unit function

### Status
@@ -449,6 +474,11 @@ Elemental function

The function returns a value with the same type and kind as the input argument.

### Example
```fortran
{!example/specialfunctions_activations/example_silu.f90!}
```

## `Silu_grad` - Gradient of the Sigmoid Linear Unit function

### Status
6 changes: 6 additions & 0 deletions example/specialfunctions_activations/CMakeLists.txt
@@ -0,0 +1,6 @@
ADD_EXAMPLE(elu)
ADD_EXAMPLE(gaussian)
ADD_EXAMPLE(gelu)
ADD_EXAMPLE(relu)
ADD_EXAMPLE(selu)
ADD_EXAMPLE(silu)
13 changes: 13 additions & 0 deletions example/specialfunctions_activations/example_elu.f90
@@ -0,0 +1,13 @@
program example_elu
    use stdlib_kinds, only: sp
    use stdlib_math, only: linspace
    use stdlib_specialfunctions, only: elu
    implicit none

    integer, parameter :: n = 10
    real(sp) :: x(n), y(n)

    x = linspace(-2._sp, 2._sp, n)
    y = elu( x, 1._sp )
end program example_elu


13 changes: 13 additions & 0 deletions example/specialfunctions_activations/example_gaussian.f90
@@ -0,0 +1,13 @@
program example_gaussian
    use stdlib_kinds, only: sp
    use stdlib_math, only: linspace
    use stdlib_specialfunctions, only: gaussian
    implicit none

    integer, parameter :: n = 10
    real(sp) :: x(n), y(n)

    x = linspace(-2._sp, 2._sp, n)
    y = gaussian( x )
end program example_gaussian

13 changes: 13 additions & 0 deletions example/specialfunctions_activations/example_gelu.f90
@@ -0,0 +1,13 @@
program example_gelu
    use stdlib_kinds, only: sp
    use stdlib_math, only: linspace
    use stdlib_specialfunctions, only: gelu
    implicit none

    integer, parameter :: n = 10
    real(sp) :: x(n), y(n)

    x = linspace(-2._sp, 2._sp, n)
    y = gelu( x )
end program example_gelu

13 changes: 13 additions & 0 deletions example/specialfunctions_activations/example_relu.f90
@@ -0,0 +1,13 @@
program example_relu
    use stdlib_kinds, only: sp
    use stdlib_math, only: linspace
    use stdlib_specialfunctions, only: relu
    implicit none

    integer, parameter :: n = 10
    real(sp) :: x(n), y(n)

    x = linspace(-2._sp, 2._sp, n)
    y = relu( x )
end program example_relu

13 changes: 13 additions & 0 deletions example/specialfunctions_activations/example_selu.f90
@@ -0,0 +1,13 @@
program example_selu
    use stdlib_kinds, only: sp
    use stdlib_math, only: linspace
    use stdlib_specialfunctions, only: selu
    implicit none

    integer, parameter :: n = 10
    real(sp) :: x(n), y(n)

    x = linspace(-2._sp, 2._sp, n)
    y = selu( x )
end program example_selu

13 changes: 13 additions & 0 deletions example/specialfunctions_activations/example_silu.f90
@@ -0,0 +1,13 @@
program example_silu
    use stdlib_kinds, only: sp
    use stdlib_math, only: linspace
    use stdlib_specialfunctions, only: silu
    implicit none

    integer, parameter :: n = 10
    real(sp) :: x(n), y(n)

    x = linspace(-2._sp, 2._sp, n)
    y = silu( x )
end program example_silu
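For readers without a Fortran toolchain, the activations these examples exercise can be sketched in Python using the standard textbook definitions. This is a hedged reference only, not the stdlib implementation from this PR: the function names mirror the stdlib interfaces, but the SELU constants are the usual Klambauer et al. values, which the Fortran code may round differently.

```python
import math

# Textbook definitions of the activations exercised by the examples above.
# These are reference sketches; the Fortran stdlib versions are elemental
# and handle real kinds (sp/dp) explicitly.

def gaussian(x):
    return math.exp(-x * x)

def elu(x, a=1.0):
    # x for positive inputs, a*(exp(x)-1) otherwise
    return x if x > 0 else a * (math.exp(x) - 1.0)

def relu(x):
    return max(0.0, x)

def gelu(x):
    # Exact (erf-based) GELU
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def selu(x, scale=1.0507009873554805, alpha=1.6732632423543772):
    # Scaled ELU with the self-normalizing constants
    return scale * (x if x > 0 else alpha * (math.exp(x) - 1.0))

def silu(x):
    # x * sigmoid(x)
    return x / (1.0 + math.exp(-x))

if __name__ == "__main__":
    for f in (gaussian, elu, relu, gelu, selu, silu):
        print(f.__name__, f(0.5))
```

The Fortran examples map each function elementwise over a `linspace` array; the Python sketch does the same per scalar, matching the elemental semantics.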
