From ccb2f1be7fdce80956f8d239f5304460219ca97c Mon Sep 17 00:00:00 2001
From: Pavol Mulinka
Date: Thu, 29 Dec 2022 23:15:11 +0100
Subject: [PATCH 01/24] tiny changes related to JOSS review
---
README.md | 22 ++++++++++++++--------
1 file changed, 14 insertions(+), 8 deletions(-)
diff --git a/README.md b/README.md
index e88ac18f..1bdff1d9 100644
--- a/README.md
+++ b/README.md
@@ -28,10 +28,15 @@ text and images using Wide and Deep models in Pytorch
The content of this document is organized as follows:
-1. [introduction](#introduction)
-2. [The deeptabular component](#the-deeptabular-component)
-3. [installation](#installation)
-4. [quick start (tl;dr)](#quick-start)
+- [pytorch-widedeep](#pytorch-widedeep)
+ - [Introduction](#introduction)
+ - [The ``deeptabular`` component](#the-deeptabular-component)
+ - [Installation](#installation)
+ - [Developer Install](#developer-install)
+ - [Quick start](#quick-start)
+ - [Testing](#testing)
+ - [How to Contribute](#how-to-contribute)
+ - [Acknowledgments](#acknowledgments)
### Introduction
@@ -75,9 +80,10 @@ without a ``deephead`` component can be formulated as:
-Where *'W'* are the weight matrices applied to the wide model and to the final
-activations of the deep models, *'a'* are these final activations, and
-φ(x) are the cross product transformations of the original features *'x'*.
+Where σ is the sigmoid function, *'W'* are the weight matrices applied to the wide model and to the final
+activations of the deep models, *'a'* are these final activations,
+φ(x) are the cross product transformations of the original features *'x'*,
+and *'b'* is the bias term.
In case you are wondering what are *"cross product transformations"*, here is
a quote taken directly from the paper: *"For binary features, a cross-product
transformation (e.g., “AND(gender=female, language=en)”) is 1 if and only if
@@ -296,7 +302,7 @@ pytest tests
### How to Contribute
-Check [CONTRIBUTING](https://github.com/jrzaurin/pytorch-widedeep/CONTRIBUTING.MD) page.
+Check the [CONTRIBUTING](https://github.com/jrzaurin/pytorch-widedeep/blob/master/CONTRIBUTING.MD) page.
### Acknowledgments
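For orientation, the paragraph reworded in the hunk above refers to the prediction equation of a Wide and Deep model. A sketch in the notation of the original Wide and Deep paper (the README's own rendering may split the deep term into one summand per deep component) is:

$$
P(Y=1 \mid \mathbf{x}) = \sigma\left( \mathbf{w}_{wide}^{T}\,[\mathbf{x}, \phi(\mathbf{x})] + \mathbf{w}_{deep}^{T}\,a^{(l_f)} + b \right)
$$

where $\sigma$ is the sigmoid function, $\phi(\mathbf{x})$ are the cross product transformations of the original features $\mathbf{x}$, $a^{(l_f)}$ are the final activations of the deep component, and $b$ is the bias term.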
From 0c2955a07a7013199e33592b4cd9e9b125fe76fb Mon Sep 17 00:00:00 2001
From: Pavol Mulinka
Date: Wed, 4 Jan 2023 16:48:16 +0100
Subject: [PATCH 02/24] adjusted contribution.md
---
CONTRIBUTING.MD | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/CONTRIBUTING.MD b/CONTRIBUTING.MD
index 0b12c8da..213bd81f 100644
--- a/CONTRIBUTING.MD
+++ b/CONTRIBUTING.MD
@@ -1,6 +1,6 @@
Pytorch-widedeep is being developed and used by many active community members. Your help is very valuable to make it better for everyone.
-- **[TBA]** Check for the [Roadmap](https://github.com/jrzaurin/pytorch-widedeep/projects/1) or [Open an issue](https://github.com/microsoft/jrzaurin/pytorch-widedeep/issues) to report problems or recommend new features and submit a draft pull requests, which will be changed to pull request after intial review
+- Check the [Roadmap](https://github.com/jrzaurin/pytorch-widedeep/projects/1) or [Open an issue](https://github.com/jrzaurin/pytorch-widedeep/issues) to report problems or recommend new features, and submit a draft pull request, which will be changed to a pull request after initial review
- Contribute to the [tests](https://github.com/jrzaurin/pytorch-widedeep/tree/master/tests) to make it more reliable.
- Contribute to the [documentation](https://github.com/jrzaurin/pytorch-widedeep/tree/master/docs) to make it clearer for everyone.
- Contribute to the [examples](https://github.com/jrzaurin/pytorch-widedeep/tree/master/examples) to share your experience with other users.
From f075e2c3156a69000be38bad6390d32eec6f2ddc Mon Sep 17 00:00:00 2001
From: Pavol Mulinka
Date: Wed, 4 Jan 2023 16:51:22 +0100
Subject: [PATCH 03/24] update contributing.md mkdocs
---
mkdocs/sources/contributing.md | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/mkdocs/sources/contributing.md b/mkdocs/sources/contributing.md
index 0b12c8da..213bd81f 100644
--- a/mkdocs/sources/contributing.md
+++ b/mkdocs/sources/contributing.md
@@ -1,6 +1,6 @@
Pytorch-widedeep is being developed and used by many active community members. Your help is very valuable to make it better for everyone.
-- **[TBA]** Check for the [Roadmap](https://github.com/jrzaurin/pytorch-widedeep/projects/1) or [Open an issue](https://github.com/microsoft/jrzaurin/pytorch-widedeep/issues) to report problems or recommend new features and submit a draft pull requests, which will be changed to pull request after intial review
+- Check the [Roadmap](https://github.com/jrzaurin/pytorch-widedeep/projects/1) or [Open an issue](https://github.com/jrzaurin/pytorch-widedeep/issues) to report problems or recommend new features, and submit a draft pull request, which will be changed to a pull request after initial review
- Contribute to the [tests](https://github.com/jrzaurin/pytorch-widedeep/tree/master/tests) to make it more reliable.
- Contribute to the [documentation](https://github.com/jrzaurin/pytorch-widedeep/tree/master/docs) to make it clearer for everyone.
- Contribute to the [examples](https://github.com/jrzaurin/pytorch-widedeep/tree/master/examples) to share your experience with other users.
From 2dcd688691885020455c67fdad02b59f61548870 Mon Sep 17 00:00:00 2001
From: Pavol Mulinka
Date: Sun, 8 Jan 2023 18:13:28 +0100
Subject: [PATCH 04/24] fixed torchmetrics test
---
tests/test_metrics/test_torchmetrics.py | 16 ++++++++--------
1 file changed, 8 insertions(+), 8 deletions(-)
diff --git a/tests/test_metrics/test_torchmetrics.py b/tests/test_metrics/test_torchmetrics.py
index 59d77069..00033a26 100644
--- a/tests/test_metrics/test_torchmetrics.py
+++ b/tests/test_metrics/test_torchmetrics.py
@@ -30,10 +30,10 @@ def f2_score_bin(y_true, y_pred):
@pytest.mark.parametrize(
"metric_name, sklearn_metric, torch_metric",
[
- ("Accuracy", accuracy_score, Accuracy()),
- ("Precision", precision_score, Precision()),
- ("Recall", recall_score, Recall()),
- ("F1Score", f1_score, F1Score()),
+ ("Accuracy", accuracy_score, Accuracy(task="binary")),
+ ("Precision", precision_score, Precision(task="binary")),
+ ("Recall", recall_score, Recall(task="binary")),
+ ("F1Score", f1_score, F1Score(task="binary")),
("FBetaScore", f2_score_bin, FBetaScore(beta=2)),
],
)
@@ -77,10 +77,10 @@ def f2_score_multi(y_true, y_pred, average):
@pytest.mark.parametrize(
"metric_name, sklearn_metric, torch_metric",
[
- ("Accuracy", accuracy_score, Accuracy(num_classes=3, average="micro")),
- ("Precision", precision_score, Precision(num_classes=3, average="macro")),
- ("Recall", recall_score, Recall(num_classes=3, average="macro")),
- ("F1Score", f1_score, F1Score(num_classes=3, average="macro")),
+ ("Accuracy", accuracy_score, Accuracy(task="multiclass", num_classes=3, average="micro")),
+ ("Precision", precision_score, Precision(task="multiclass", num_classes=3, average="macro")),
+ ("Recall", recall_score, Recall(task="multiclass", num_classes=3, average="macro")),
+ ("F1Score", f1_score, F1Score(task="multiclass", num_classes=3, average="macro")),
(
"FBetaScore",
f2_score_multi,
From 69a7731eb57f770e76bdfad2d0217166fbd35f47 Mon Sep 17 00:00:00 2001
From: Pavol Mulinka
Date: Sun, 8 Jan 2023 18:20:12 +0100
Subject: [PATCH 05/24] black styling fix
---
tests/test_metrics/test_torchmetrics.py | 28 +++++++++++++++++++------
1 file changed, 22 insertions(+), 6 deletions(-)
diff --git a/tests/test_metrics/test_torchmetrics.py b/tests/test_metrics/test_torchmetrics.py
index 00033a26..bb97ac8b 100644
--- a/tests/test_metrics/test_torchmetrics.py
+++ b/tests/test_metrics/test_torchmetrics.py
@@ -34,7 +34,7 @@ def f2_score_bin(y_true, y_pred):
("Precision", precision_score, Precision(task="binary")),
("Recall", recall_score, Recall(task="binary")),
("F1Score", f1_score, F1Score(task="binary")),
- ("FBetaScore", f2_score_bin, FBetaScore(beta=2)),
+ ("FBetaScore", f2_score_bin, FBetaScore(task="binary", beta=2)),
],
)
def test_binary_metrics(metric_name, sklearn_metric, torch_metric):
@@ -77,14 +77,30 @@ def f2_score_multi(y_true, y_pred, average):
@pytest.mark.parametrize(
"metric_name, sklearn_metric, torch_metric",
[
- ("Accuracy", accuracy_score, Accuracy(task="multiclass", num_classes=3, average="micro")),
- ("Precision", precision_score, Precision(task="multiclass", num_classes=3, average="macro")),
- ("Recall", recall_score, Recall(task="multiclass", num_classes=3, average="macro")),
- ("F1Score", f1_score, F1Score(task="multiclass", num_classes=3, average="macro")),
+ (
+ "Accuracy",
+ accuracy_score,
+ Accuracy(task="multiclass", num_classes=3, average="micro"),
+ ),
+ (
+ "Precision",
+ precision_score,
+ Precision(task="multiclass", num_classes=3, average="macro"),
+ ),
+ (
+ "Recall",
+ recall_score,
+ Recall(task="multiclass", num_classes=3, average="macro"),
+ ),
+ (
+ "F1Score",
+ f1_score,
+ F1Score(task="multiclass", num_classes=3, average="macro"),
+ ),
(
"FBetaScore",
f2_score_multi,
- FBetaScore(beta=3, num_classes=3, average="macro"),
+ FBetaScore(task="multiclass", beta=3, num_classes=3, average="macro"),
),
],
)
From 8e05d01f76503686920dd702a908a00e072ed0bf Mon Sep 17 00:00:00 2001
From: Pavol Mulinka
Date: Sun, 8 Jan 2023 18:37:33 +0100
Subject: [PATCH 06/24] fixed torchmetrics
---
tests/test_metrics/test_torchmetrics.py | 8 ++++----
1 file changed, 4 insertions(+), 4 deletions(-)
diff --git a/tests/test_metrics/test_torchmetrics.py b/tests/test_metrics/test_torchmetrics.py
index bb97ac8b..f6b58c40 100644
--- a/tests/test_metrics/test_torchmetrics.py
+++ b/tests/test_metrics/test_torchmetrics.py
@@ -34,14 +34,14 @@ def f2_score_bin(y_true, y_pred):
("Precision", precision_score, Precision(task="binary")),
("Recall", recall_score, Recall(task="binary")),
("F1Score", f1_score, F1Score(task="binary")),
- ("FBetaScore", f2_score_bin, FBetaScore(task="binary", beta=2)),
+ ("FBetaScore", f2_score_bin, FBetaScore(task="binary", beta=2.0)),
],
)
def test_binary_metrics(metric_name, sklearn_metric, torch_metric):
sk_res = sklearn_metric(y_true_bin_np, y_pred_bin_np.round())
wd_metric = MultipleMetrics(metrics=[torch_metric])
wd_logs = wd_metric(y_pred_bin_pt, y_true_bin_pt)
- wd_res = wd_logs[metric_name]
+ wd_res = wd_logs[f"Binary{metric_name}"]
if wd_res.size != 1:
wd_res = wd_res[1]
assert np.isclose(sk_res, wd_res)
@@ -100,7 +100,7 @@ def f2_score_multi(y_true, y_pred, average):
(
"FBetaScore",
f2_score_multi,
- FBetaScore(task="multiclass", beta=3, num_classes=3, average="macro"),
+ FBetaScore(task="multiclass", beta=3.0, num_classes=3, average="macro"),
),
],
)
@@ -114,6 +114,6 @@ def test_muticlass_metrics(metric_name, sklearn_metric, torch_metric):
wd_metric = MultipleMetrics(metrics=[torch_metric])
wd_logs = wd_metric(y_pred_multi_pt, y_true_multi_pt)
- wd_res = wd_logs[metric_name]
+ wd_res = wd_logs[f"Multiclass{metric_name}"]
assert np.isclose(sk_res, wd_res, atol=0.01)
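The test changes in the three patches above track the torchmetrics 0.11 classification API, where a `task` argument is required, `beta` is expected to be a float, and the public classes act as factories that return task-specific subclasses. A minimal sketch of that behaviour (assuming torchmetrics >= 0.11), which is presumably why the keys looked up in `wd_logs` gain a `Binary` prefix:

import torch
from torchmetrics import Accuracy, FBetaScore

acc = Accuracy(task="binary")                # the factory returns a BinaryAccuracy instance
print(type(acc).__name__)                    # "BinaryAccuracy", matching the key used in wd_logs
fbeta = FBetaScore(task="binary", beta=2.0)  # beta appears to be validated as a float, hence 2.0

preds = torch.tensor([0.2, 0.8, 0.6, 0.1])   # probabilities, thresholded at 0.5
target = torch.tensor([0, 1, 1, 0])
print(acc(preds, target))                    # tensor(1.)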
From 2bc274236601faa36cd829416a7b6d609131f641 Mon Sep 17 00:00:00 2001
From: Pavol Mulinka
Date: Thu, 29 Dec 2022 23:15:11 +0100
Subject: [PATCH 07/24] tiny changes related to JOSS review
---
README.md | 22 ++++++++++++++--------
1 file changed, 14 insertions(+), 8 deletions(-)
diff --git a/README.md b/README.md
index e88ac18f..1bdff1d9 100644
--- a/README.md
+++ b/README.md
@@ -28,10 +28,15 @@ text and images using Wide and Deep models in Pytorch
The content of this document is organized as follows:
-1. [introduction](#introduction)
-2. [The deeptabular component](#the-deeptabular-component)
-3. [installation](#installation)
-4. [quick start (tl;dr)](#quick-start)
+- [pytorch-widedeep](#pytorch-widedeep)
+ - [Introduction](#introduction)
+ - [The ``deeptabular`` component](#the-deeptabular-component)
+ - [Installation](#installation)
+ - [Developer Install](#developer-install)
+ - [Quick start](#quick-start)
+ - [Testing](#testing)
+ - [How to Contribute](#how-to-contribute)
+ - [Acknowledgments](#acknowledgments)
### Introduction
@@ -75,9 +80,10 @@ without a ``deephead`` component can be formulated as:
-Where *'W'* are the weight matrices applied to the wide model and to the final
-activations of the deep models, *'a'* are these final activations, and
-φ(x) are the cross product transformations of the original features *'x'*.
+Where σ is the sigmoid function, *'W'* are the weight matrices applied to the wide model and to the final
+activations of the deep models, *'a'* are these final activations,
+φ(x) are the cross product transformations of the original features *'x'*,
+and *'b'* is the bias term.
In case you are wondering what are *"cross product transformations"*, here is
a quote taken directly from the paper: *"For binary features, a cross-product
transformation (e.g., “AND(gender=female, language=en)”) is 1 if and only if
@@ -296,7 +302,7 @@ pytest tests
### How to Contribute
-Check [CONTRIBUTING](https://github.com/jrzaurin/pytorch-widedeep/CONTRIBUTING.MD) page.
+Check the [CONTRIBUTING](https://github.com/jrzaurin/pytorch-widedeep/blob/master/CONTRIBUTING.MD) page.
### Acknowledgments
From 631fb9af19ba1004cd3f124621d975ca2a244da9 Mon Sep 17 00:00:00 2001
From: Pavol Mulinka
Date: Wed, 4 Jan 2023 16:48:16 +0100
Subject: [PATCH 08/24] adjusted contribution.md
---
CONTRIBUTING.MD | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/CONTRIBUTING.MD b/CONTRIBUTING.MD
index 0b12c8da..213bd81f 100644
--- a/CONTRIBUTING.MD
+++ b/CONTRIBUTING.MD
@@ -1,6 +1,6 @@
Pytorch-widedeep is being developed and used by many active community members. Your help is very valuable to make it better for everyone.
-- **[TBA]** Check for the [Roadmap](https://github.com/jrzaurin/pytorch-widedeep/projects/1) or [Open an issue](https://github.com/microsoft/jrzaurin/pytorch-widedeep/issues) to report problems or recommend new features and submit a draft pull requests, which will be changed to pull request after intial review
+- Check the [Roadmap](https://github.com/jrzaurin/pytorch-widedeep/projects/1) or [Open an issue](https://github.com/jrzaurin/pytorch-widedeep/issues) to report problems or recommend new features, and submit a draft pull request, which will be changed to a pull request after initial review
- Contribute to the [tests](https://github.com/jrzaurin/pytorch-widedeep/tree/master/tests) to make it more reliable.
- Contribute to the [documentation](https://github.com/jrzaurin/pytorch-widedeep/tree/master/docs) to make it clearer for everyone.
- Contribute to the [examples](https://github.com/jrzaurin/pytorch-widedeep/tree/master/examples) to share your experience with other users.
From fbdcef8a3e26a91f87f03c7c2006a8181495c90b Mon Sep 17 00:00:00 2001
From: Pavol Mulinka
Date: Wed, 4 Jan 2023 16:51:22 +0100
Subject: [PATCH 09/24] update contributing.md mkdocs
---
mkdocs/sources/contributing.md | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/mkdocs/sources/contributing.md b/mkdocs/sources/contributing.md
index 0b12c8da..213bd81f 100644
--- a/mkdocs/sources/contributing.md
+++ b/mkdocs/sources/contributing.md
@@ -1,6 +1,6 @@
Pytorch-widedeep is being developed and used by many active community members. Your help is very valuable to make it better for everyone.
-- **[TBA]** Check for the [Roadmap](https://github.com/jrzaurin/pytorch-widedeep/projects/1) or [Open an issue](https://github.com/microsoft/jrzaurin/pytorch-widedeep/issues) to report problems or recommend new features and submit a draft pull requests, which will be changed to pull request after intial review
+- Check the [Roadmap](https://github.com/jrzaurin/pytorch-widedeep/projects/1) or [Open an issue](https://github.com/jrzaurin/pytorch-widedeep/issues) to report problems or recommend new features, and submit a draft pull request, which will be changed to a pull request after initial review
- Contribute to the [tests](https://github.com/jrzaurin/pytorch-widedeep/tree/master/tests) to make it more reliable.
- Contribute to the [documentation](https://github.com/jrzaurin/pytorch-widedeep/tree/master/docs) to make it clearer for everyone.
- Contribute to the [examples](https://github.com/jrzaurin/pytorch-widedeep/tree/master/examples) to share your experience with other users.
From c3e05635c90e242dfb7f623b8cc764090237c396 Mon Sep 17 00:00:00 2001
From: Pavol Mulinka
Date: Sun, 8 Jan 2023 18:13:28 +0100
Subject: [PATCH 10/24] fixed torchmetrics test
---
tests/test_metrics/test_torchmetrics.py | 14 +++++++++-----
1 file changed, 9 insertions(+), 5 deletions(-)
diff --git a/tests/test_metrics/test_torchmetrics.py b/tests/test_metrics/test_torchmetrics.py
index 3213229a..4a8f6b69 100644
--- a/tests/test_metrics/test_torchmetrics.py
+++ b/tests/test_metrics/test_torchmetrics.py
@@ -30,11 +30,11 @@ def f2_score_bin(y_true, y_pred):
@pytest.mark.parametrize(
"metric_name, sklearn_metric, torch_metric",
[
- ("BinaryAccuracy", accuracy_score, Accuracy(task="binary")),
- ("BinaryPrecision", precision_score, Precision(task="binary")),
- ("BinaryRecall", recall_score, Recall(task="binary")),
- ("BinaryF1Score", f1_score, F1Score(task="binary")),
- ("BinaryFBetaScore", f2_score_bin, FBetaScore(task="binary", beta=2.0)),
+ ("Accuracy", accuracy_score, Accuracy(task="binary")),
+ ("Precision", precision_score, Precision(task="binary")),
+ ("Recall", recall_score, Recall(task="binary")),
+ ("F1Score", f1_score, F1Score(task="binary")),
+ ("FBetaScore", f2_score_bin, FBetaScore(beta=2)),
],
)
def test_binary_metrics(metric_name, sklearn_metric, torch_metric):
@@ -77,6 +77,10 @@ def f2_score_multi(y_true, y_pred, average):
@pytest.mark.parametrize(
"metric_name, sklearn_metric, torch_metric",
[
+ ("Accuracy", accuracy_score, Accuracy(task="multiclass", num_classes=3, average="micro")),
+ ("Precision", precision_score, Precision(task="multiclass", num_classes=3, average="macro")),
+ ("Recall", recall_score, Recall(task="multiclass", num_classes=3, average="macro")),
+ ("F1Score", f1_score, F1Score(task="multiclass", num_classes=3, average="macro")),
(
"MulticlassAccuracy",
accuracy_score,
From ae81dab3a8d27d7f2c2609f764544c4b0dca2daf Mon Sep 17 00:00:00 2001
From: Pavol Mulinka
Date: Sun, 8 Jan 2023 18:20:12 +0100
Subject: [PATCH 11/24] black styling fix
---
tests/test_metrics/test_torchmetrics.py | 28 +++++++++++++++++++------
1 file changed, 22 insertions(+), 6 deletions(-)
diff --git a/tests/test_metrics/test_torchmetrics.py b/tests/test_metrics/test_torchmetrics.py
index 4a8f6b69..309ce3eb 100644
--- a/tests/test_metrics/test_torchmetrics.py
+++ b/tests/test_metrics/test_torchmetrics.py
@@ -34,7 +34,7 @@ def f2_score_bin(y_true, y_pred):
("Precision", precision_score, Precision(task="binary")),
("Recall", recall_score, Recall(task="binary")),
("F1Score", f1_score, F1Score(task="binary")),
- ("FBetaScore", f2_score_bin, FBetaScore(beta=2)),
+ ("FBetaScore", f2_score_bin, FBetaScore(task="binary", beta=2)),
],
)
def test_binary_metrics(metric_name, sklearn_metric, torch_metric):
@@ -77,10 +77,26 @@ def f2_score_multi(y_true, y_pred, average):
@pytest.mark.parametrize(
"metric_name, sklearn_metric, torch_metric",
[
- ("Accuracy", accuracy_score, Accuracy(task="multiclass", num_classes=3, average="micro")),
- ("Precision", precision_score, Precision(task="multiclass", num_classes=3, average="macro")),
- ("Recall", recall_score, Recall(task="multiclass", num_classes=3, average="macro")),
- ("F1Score", f1_score, F1Score(task="multiclass", num_classes=3, average="macro")),
+ (
+ "Accuracy",
+ accuracy_score,
+ Accuracy(task="multiclass", num_classes=3, average="micro"),
+ ),
+ (
+ "Precision",
+ precision_score,
+ Precision(task="multiclass", num_classes=3, average="macro"),
+ ),
+ (
+ "Recall",
+ recall_score,
+ Recall(task="multiclass", num_classes=3, average="macro"),
+ ),
+ (
+ "F1Score",
+ f1_score,
+ F1Score(task="multiclass", num_classes=3, average="macro"),
+ ),
(
"MulticlassAccuracy",
accuracy_score,
@@ -104,7 +120,7 @@ def f2_score_multi(y_true, y_pred, average):
(
"MulticlassFBetaScore",
f2_score_multi,
- FBetaScore(beta=3.0, task="multiclass", num_classes=3, average="macro"),
+ FBetaScore(task="multiclass", beta=3, num_classes=3, average="macro"),
),
],
)
From 92b4cd97af24930636873270e5951291bc2706eb Mon Sep 17 00:00:00 2001
From: Pavol Mulinka
Date: Sun, 8 Jan 2023 18:37:33 +0100
Subject: [PATCH 12/24] fixed torchmetrics
---
tests/test_metrics/test_torchmetrics.py | 8 ++++----
1 file changed, 4 insertions(+), 4 deletions(-)
diff --git a/tests/test_metrics/test_torchmetrics.py b/tests/test_metrics/test_torchmetrics.py
index 309ce3eb..545cd488 100644
--- a/tests/test_metrics/test_torchmetrics.py
+++ b/tests/test_metrics/test_torchmetrics.py
@@ -34,14 +34,14 @@ def f2_score_bin(y_true, y_pred):
("Precision", precision_score, Precision(task="binary")),
("Recall", recall_score, Recall(task="binary")),
("F1Score", f1_score, F1Score(task="binary")),
- ("FBetaScore", f2_score_bin, FBetaScore(task="binary", beta=2)),
+ ("FBetaScore", f2_score_bin, FBetaScore(task="binary", beta=2.0)),
],
)
def test_binary_metrics(metric_name, sklearn_metric, torch_metric):
sk_res = sklearn_metric(y_true_bin_np, y_pred_bin_np.round())
wd_metric = MultipleMetrics(metrics=[torch_metric])
wd_logs = wd_metric(y_pred_bin_pt, y_true_bin_pt)
- wd_res = wd_logs[metric_name]
+ wd_res = wd_logs[f"Binary{metric_name}"]
if wd_res.size != 1:
wd_res = wd_res[1]
assert np.isclose(sk_res, wd_res)
@@ -120,7 +120,7 @@ def f2_score_multi(y_true, y_pred, average):
(
"MulticlassFBetaScore",
f2_score_multi,
- FBetaScore(task="multiclass", beta=3, num_classes=3, average="macro"),
+ FBetaScore(task="multiclass", beta=3.0, num_classes=3, average="macro"),
),
],
)
@@ -134,6 +134,6 @@ def test_muticlass_metrics(metric_name, sklearn_metric, torch_metric):
wd_metric = MultipleMetrics(metrics=[torch_metric])
wd_logs = wd_metric(y_pred_multi_pt, y_true_multi_pt)
- wd_res = wd_logs[metric_name]
+ wd_res = wd_logs[f"Multiclass{metric_name}"]
assert np.isclose(sk_res, wd_res, atol=0.01)
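As the assertions above show, `MultipleMetrics` simply evaluates every metric it is given on a (predictions, targets) pair and returns a dictionary of logs whose keys appear to be the metric class names, hence `BinaryAccuracy` rather than `Accuracy` with torchmetrics >= 0.11. A hedged usage sketch (the exact key and value types are an assumption based on these tests):

import torch
from torchmetrics import Accuracy, Precision
from pytorch_widedeep.metrics import MultipleMetrics

metrics = MultipleMetrics(metrics=[Accuracy(task="binary"), Precision(task="binary")])
preds = torch.tensor([0.2, 0.8, 0.6, 0.1])
target = torch.tensor([0, 1, 1, 0])
logs = metrics(preds, target)
print(logs)  # roughly {"BinaryAccuracy": 1.0, "BinaryPrecision": 1.0}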
From 9466866cdb8c2308b3d2ba901f697acf02e5557c Mon Sep 17 00:00:00 2001
From: Pavol Mulinka
Date: Thu, 29 Dec 2022 23:15:11 +0100
Subject: [PATCH 13/24] tiny changes related to JOSS review
---
README.md | 22 ++++++++++++++--------
1 file changed, 14 insertions(+), 8 deletions(-)
diff --git a/README.md b/README.md
index 5e7eca22..6b48b4cd 100644
--- a/README.md
+++ b/README.md
@@ -28,10 +28,15 @@ text and images using Wide and Deep models in Pytorch
The content of this document is organized as follows:
-1. [introduction](#introduction)
-2. [The deeptabular component](#the-deeptabular-component)
-3. [installation](#installation)
-4. [quick start (tl;dr)](#quick-start)
+- [pytorch-widedeep](#pytorch-widedeep)
+ - [Introduction](#introduction)
+ - [The ``deeptabular`` component](#the-deeptabular-component)
+ - [Installation](#installation)
+ - [Developer Install](#developer-install)
+ - [Quick start](#quick-start)
+ - [Testing](#testing)
+ - [How to Contribute](#how-to-contribute)
+ - [Acknowledgments](#acknowledgments)
### Introduction
@@ -75,9 +80,10 @@ without a ``deephead`` component can be formulated as:
-Where *'W'* are the weight matrices applied to the wide model and to the final
-activations of the deep models, *'a'* are these final activations, and
-φ(x) are the cross product transformations of the original features *'x'*.
+Where σ is the sigmoid function, *'W'* are the weight matrices applied to the wide model and to the final
+activations of the deep models, *'a'* are these final activations,
+φ(x) are the cross product transformations of the original features *'x'*,
+and *'b'* is the bias term.
In case you are wondering what are *"cross product transformations"*, here is
a quote taken directly from the paper: *"For binary features, a cross-product
transformation (e.g., “AND(gender=female, language=en)”) is 1 if and only if
@@ -296,7 +302,7 @@ pytest tests
### How to Contribute
-Check [CONTRIBUTING](https://github.com/jrzaurin/pytorch-widedeep/CONTRIBUTING.MD) page.
+Check the [CONTRIBUTING](https://github.com/jrzaurin/pytorch-widedeep/blob/master/CONTRIBUTING.MD) page.
### Acknowledgments
From 9f7fbc243d1b47c119fd2d31abf0da3182f90867 Mon Sep 17 00:00:00 2001
From: Pavol Mulinka
Date: Wed, 4 Jan 2023 16:48:16 +0100
Subject: [PATCH 14/24] adjusted contribution.md
---
CONTRIBUTING.MD | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/CONTRIBUTING.MD b/CONTRIBUTING.MD
index 0b12c8da..213bd81f 100644
--- a/CONTRIBUTING.MD
+++ b/CONTRIBUTING.MD
@@ -1,6 +1,6 @@
Pytorch-widedeep is being developed and used by many active community members. Your help is very valuable to make it better for everyone.
-- **[TBA]** Check for the [Roadmap](https://github.com/jrzaurin/pytorch-widedeep/projects/1) or [Open an issue](https://github.com/microsoft/jrzaurin/pytorch-widedeep/issues) to report problems or recommend new features and submit a draft pull requests, which will be changed to pull request after intial review
+- Check the [Roadmap](https://github.com/jrzaurin/pytorch-widedeep/projects/1) or [Open an issue](https://github.com/jrzaurin/pytorch-widedeep/issues) to report problems or recommend new features, and submit a draft pull request, which will be changed to a pull request after initial review
- Contribute to the [tests](https://github.com/jrzaurin/pytorch-widedeep/tree/master/tests) to make it more reliable.
- Contribute to the [documentation](https://github.com/jrzaurin/pytorch-widedeep/tree/master/docs) to make it clearer for everyone.
- Contribute to the [examples](https://github.com/jrzaurin/pytorch-widedeep/tree/master/examples) to share your experience with other users.
From 42ac4f46f15383a811c76ac61dafe366fe13d51a Mon Sep 17 00:00:00 2001
From: Pavol Mulinka
Date: Wed, 4 Jan 2023 16:51:22 +0100
Subject: [PATCH 15/24] update contributing.md mkdocs
---
mkdocs/sources/contributing.md | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/mkdocs/sources/contributing.md b/mkdocs/sources/contributing.md
index 0b12c8da..213bd81f 100644
--- a/mkdocs/sources/contributing.md
+++ b/mkdocs/sources/contributing.md
@@ -1,6 +1,6 @@
Pytorch-widedeep is being developed and used by many active community members. Your help is very valuable to make it better for everyone.
-- **[TBA]** Check for the [Roadmap](https://github.com/jrzaurin/pytorch-widedeep/projects/1) or [Open an issue](https://github.com/microsoft/jrzaurin/pytorch-widedeep/issues) to report problems or recommend new features and submit a draft pull requests, which will be changed to pull request after intial review
+- Check the [Roadmap](https://github.com/jrzaurin/pytorch-widedeep/projects/1) or [Open an issue](https://github.com/jrzaurin/pytorch-widedeep/issues) to report problems or recommend new features, and submit a draft pull request, which will be changed to a pull request after initial review
- Contribute to the [tests](https://github.com/jrzaurin/pytorch-widedeep/tree/master/tests) to make it more reliable.
- Contribute to the [documentation](https://github.com/jrzaurin/pytorch-widedeep/tree/master/docs) to make it clearer for everyone.
- Contribute to the [examples](https://github.com/jrzaurin/pytorch-widedeep/tree/master/examples) to share your experience with other users.
From a601fbc283d5621337f955aed3439a37b7be7133 Mon Sep 17 00:00:00 2001
From: Pavol Mulinka
Date: Sun, 8 Jan 2023 18:13:28 +0100
Subject: [PATCH 16/24] fixed torchmetrics test
---
tests/test_metrics/test_torchmetrics.py | 14 +++++++++-----
1 file changed, 9 insertions(+), 5 deletions(-)
diff --git a/tests/test_metrics/test_torchmetrics.py b/tests/test_metrics/test_torchmetrics.py
index 3213229a..4a8f6b69 100644
--- a/tests/test_metrics/test_torchmetrics.py
+++ b/tests/test_metrics/test_torchmetrics.py
@@ -30,11 +30,11 @@ def f2_score_bin(y_true, y_pred):
@pytest.mark.parametrize(
"metric_name, sklearn_metric, torch_metric",
[
- ("BinaryAccuracy", accuracy_score, Accuracy(task="binary")),
- ("BinaryPrecision", precision_score, Precision(task="binary")),
- ("BinaryRecall", recall_score, Recall(task="binary")),
- ("BinaryF1Score", f1_score, F1Score(task="binary")),
- ("BinaryFBetaScore", f2_score_bin, FBetaScore(task="binary", beta=2.0)),
+ ("Accuracy", accuracy_score, Accuracy(task="binary")),
+ ("Precision", precision_score, Precision(task="binary")),
+ ("Recall", recall_score, Recall(task="binary")),
+ ("F1Score", f1_score, F1Score(task="binary")),
+ ("FBetaScore", f2_score_bin, FBetaScore(beta=2)),
],
)
def test_binary_metrics(metric_name, sklearn_metric, torch_metric):
@@ -77,6 +77,10 @@ def f2_score_multi(y_true, y_pred, average):
@pytest.mark.parametrize(
"metric_name, sklearn_metric, torch_metric",
[
+ ("Accuracy", accuracy_score, Accuracy(task="multiclass", num_classes=3, average="micro")),
+ ("Precision", precision_score, Precision(task="multiclass", num_classes=3, average="macro")),
+ ("Recall", recall_score, Recall(task="multiclass", num_classes=3, average="macro")),
+ ("F1Score", f1_score, F1Score(task="multiclass", num_classes=3, average="macro")),
(
"MulticlassAccuracy",
accuracy_score,
From c4581337185ad0c53b12be38402f23cb2ce653c5 Mon Sep 17 00:00:00 2001
From: Pavol Mulinka
Date: Sun, 8 Jan 2023 18:20:12 +0100
Subject: [PATCH 17/24] black styling fix
---
tests/test_metrics/test_torchmetrics.py | 28 +++++++++++++++++++------
1 file changed, 22 insertions(+), 6 deletions(-)
diff --git a/tests/test_metrics/test_torchmetrics.py b/tests/test_metrics/test_torchmetrics.py
index 4a8f6b69..309ce3eb 100644
--- a/tests/test_metrics/test_torchmetrics.py
+++ b/tests/test_metrics/test_torchmetrics.py
@@ -34,7 +34,7 @@ def f2_score_bin(y_true, y_pred):
("Precision", precision_score, Precision(task="binary")),
("Recall", recall_score, Recall(task="binary")),
("F1Score", f1_score, F1Score(task="binary")),
- ("FBetaScore", f2_score_bin, FBetaScore(beta=2)),
+ ("FBetaScore", f2_score_bin, FBetaScore(task="binary", beta=2)),
],
)
def test_binary_metrics(metric_name, sklearn_metric, torch_metric):
@@ -77,10 +77,26 @@ def f2_score_multi(y_true, y_pred, average):
@pytest.mark.parametrize(
"metric_name, sklearn_metric, torch_metric",
[
- ("Accuracy", accuracy_score, Accuracy(task="multiclass", num_classes=3, average="micro")),
- ("Precision", precision_score, Precision(task="multiclass", num_classes=3, average="macro")),
- ("Recall", recall_score, Recall(task="multiclass", num_classes=3, average="macro")),
- ("F1Score", f1_score, F1Score(task="multiclass", num_classes=3, average="macro")),
+ (
+ "Accuracy",
+ accuracy_score,
+ Accuracy(task="multiclass", num_classes=3, average="micro"),
+ ),
+ (
+ "Precision",
+ precision_score,
+ Precision(task="multiclass", num_classes=3, average="macro"),
+ ),
+ (
+ "Recall",
+ recall_score,
+ Recall(task="multiclass", num_classes=3, average="macro"),
+ ),
+ (
+ "F1Score",
+ f1_score,
+ F1Score(task="multiclass", num_classes=3, average="macro"),
+ ),
(
"MulticlassAccuracy",
accuracy_score,
@@ -104,7 +120,7 @@ def f2_score_multi(y_true, y_pred, average):
(
"MulticlassFBetaScore",
f2_score_multi,
- FBetaScore(beta=3.0, task="multiclass", num_classes=3, average="macro"),
+ FBetaScore(task="multiclass", beta=3, num_classes=3, average="macro"),
),
],
)
From 7f9677400ad5cc37625a31ae9f43b0b7e9dc0c01 Mon Sep 17 00:00:00 2001
From: Pavol Mulinka
Date: Sun, 8 Jan 2023 18:37:33 +0100
Subject: [PATCH 18/24] fixed torchmetrics
---
tests/test_metrics/test_torchmetrics.py | 8 ++++----
1 file changed, 4 insertions(+), 4 deletions(-)
diff --git a/tests/test_metrics/test_torchmetrics.py b/tests/test_metrics/test_torchmetrics.py
index 309ce3eb..545cd488 100644
--- a/tests/test_metrics/test_torchmetrics.py
+++ b/tests/test_metrics/test_torchmetrics.py
@@ -34,14 +34,14 @@ def f2_score_bin(y_true, y_pred):
("Precision", precision_score, Precision(task="binary")),
("Recall", recall_score, Recall(task="binary")),
("F1Score", f1_score, F1Score(task="binary")),
- ("FBetaScore", f2_score_bin, FBetaScore(task="binary", beta=2)),
+ ("FBetaScore", f2_score_bin, FBetaScore(task="binary", beta=2.0)),
],
)
def test_binary_metrics(metric_name, sklearn_metric, torch_metric):
sk_res = sklearn_metric(y_true_bin_np, y_pred_bin_np.round())
wd_metric = MultipleMetrics(metrics=[torch_metric])
wd_logs = wd_metric(y_pred_bin_pt, y_true_bin_pt)
- wd_res = wd_logs[metric_name]
+ wd_res = wd_logs[f"Binary{metric_name}"]
if wd_res.size != 1:
wd_res = wd_res[1]
assert np.isclose(sk_res, wd_res)
@@ -120,7 +120,7 @@ def f2_score_multi(y_true, y_pred, average):
(
"MulticlassFBetaScore",
f2_score_multi,
- FBetaScore(task="multiclass", beta=3, num_classes=3, average="macro"),
+ FBetaScore(task="multiclass", beta=3.0, num_classes=3, average="macro"),
),
],
)
@@ -134,6 +134,6 @@ def test_muticlass_metrics(metric_name, sklearn_metric, torch_metric):
wd_metric = MultipleMetrics(metrics=[torch_metric])
wd_logs = wd_metric(y_pred_multi_pt, y_true_multi_pt)
- wd_res = wd_logs[metric_name]
+ wd_res = wd_logs[f"Multiclass{metric_name}"]
assert np.isclose(sk_res, wd_res, atol=0.01)
From 2a6ceaaac11297593e86ad1f06b27efed9efa2cc Mon Sep 17 00:00:00 2001
From: Pavol Mulinka
Date: Sun, 8 Jan 2023 18:13:28 +0100
Subject: [PATCH 19/24] fixed torchmetrics test
---
tests/test_metrics/test_torchmetrics.py | 6 +++++-
1 file changed, 5 insertions(+), 1 deletion(-)
diff --git a/tests/test_metrics/test_torchmetrics.py b/tests/test_metrics/test_torchmetrics.py
index 545cd488..063d699f 100644
--- a/tests/test_metrics/test_torchmetrics.py
+++ b/tests/test_metrics/test_torchmetrics.py
@@ -34,7 +34,7 @@ def f2_score_bin(y_true, y_pred):
("Precision", precision_score, Precision(task="binary")),
("Recall", recall_score, Recall(task="binary")),
("F1Score", f1_score, F1Score(task="binary")),
- ("FBetaScore", f2_score_bin, FBetaScore(task="binary", beta=2.0)),
+ ("FBetaScore", f2_score_bin, FBetaScore(beta=2)),
],
)
def test_binary_metrics(metric_name, sklearn_metric, torch_metric):
@@ -77,6 +77,10 @@ def f2_score_multi(y_true, y_pred, average):
@pytest.mark.parametrize(
"metric_name, sklearn_metric, torch_metric",
[
+ ("Accuracy", accuracy_score, Accuracy(task="multiclass", num_classes=3, average="micro")),
+ ("Precision", precision_score, Precision(task="multiclass", num_classes=3, average="macro")),
+ ("Recall", recall_score, Recall(task="multiclass", num_classes=3, average="macro")),
+ ("F1Score", f1_score, F1Score(task="multiclass", num_classes=3, average="macro")),
(
"Accuracy",
accuracy_score,
From f79bc0f4989fc8f82e111ffbe7e05df70539ac94 Mon Sep 17 00:00:00 2001
From: Pavol Mulinka
Date: Sun, 8 Jan 2023 18:20:12 +0100
Subject: [PATCH 20/24] black styling fix
---
tests/test_metrics/test_torchmetrics.py | 28 +++++++++++++++++++------
1 file changed, 22 insertions(+), 6 deletions(-)
diff --git a/tests/test_metrics/test_torchmetrics.py b/tests/test_metrics/test_torchmetrics.py
index 063d699f..b235795c 100644
--- a/tests/test_metrics/test_torchmetrics.py
+++ b/tests/test_metrics/test_torchmetrics.py
@@ -34,7 +34,7 @@ def f2_score_bin(y_true, y_pred):
("Precision", precision_score, Precision(task="binary")),
("Recall", recall_score, Recall(task="binary")),
("F1Score", f1_score, F1Score(task="binary")),
- ("FBetaScore", f2_score_bin, FBetaScore(beta=2)),
+ ("FBetaScore", f2_score_bin, FBetaScore(task="binary", beta=2)),
],
)
def test_binary_metrics(metric_name, sklearn_metric, torch_metric):
@@ -77,10 +77,26 @@ def f2_score_multi(y_true, y_pred, average):
@pytest.mark.parametrize(
"metric_name, sklearn_metric, torch_metric",
[
- ("Accuracy", accuracy_score, Accuracy(task="multiclass", num_classes=3, average="micro")),
- ("Precision", precision_score, Precision(task="multiclass", num_classes=3, average="macro")),
- ("Recall", recall_score, Recall(task="multiclass", num_classes=3, average="macro")),
- ("F1Score", f1_score, F1Score(task="multiclass", num_classes=3, average="macro")),
+ (
+ "Accuracy",
+ accuracy_score,
+ Accuracy(task="multiclass", num_classes=3, average="micro"),
+ ),
+ (
+ "Precision",
+ precision_score,
+ Precision(task="multiclass", num_classes=3, average="macro"),
+ ),
+ (
+ "Recall",
+ recall_score,
+ Recall(task="multiclass", num_classes=3, average="macro"),
+ ),
+ (
+ "F1Score",
+ f1_score,
+ F1Score(task="multiclass", num_classes=3, average="macro"),
+ ),
(
"Accuracy",
accuracy_score,
@@ -124,7 +140,7 @@ def f2_score_multi(y_true, y_pred, average):
(
"MulticlassFBetaScore",
f2_score_multi,
- FBetaScore(task="multiclass", beta=3.0, num_classes=3, average="macro"),
+ FBetaScore(task="multiclass", beta=3, num_classes=3, average="macro"),
),
],
)
From b5666896585aa9869a3ce8d95cd91c0121c1f09d Mon Sep 17 00:00:00 2001
From: Pavol Mulinka
Date: Sun, 8 Jan 2023 18:37:33 +0100
Subject: [PATCH 21/24] fixed torchmetrics
---
tests/test_metrics/test_torchmetrics.py | 4 ++--
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/tests/test_metrics/test_torchmetrics.py b/tests/test_metrics/test_torchmetrics.py
index b235795c..8909a22a 100644
--- a/tests/test_metrics/test_torchmetrics.py
+++ b/tests/test_metrics/test_torchmetrics.py
@@ -34,7 +34,7 @@ def f2_score_bin(y_true, y_pred):
("Precision", precision_score, Precision(task="binary")),
("Recall", recall_score, Recall(task="binary")),
("F1Score", f1_score, F1Score(task="binary")),
- ("FBetaScore", f2_score_bin, FBetaScore(task="binary", beta=2)),
+ ("FBetaScore", f2_score_bin, FBetaScore(task="binary", beta=2.0)),
],
)
def test_binary_metrics(metric_name, sklearn_metric, torch_metric):
@@ -140,7 +140,7 @@ def f2_score_multi(y_true, y_pred, average):
(
"MulticlassFBetaScore",
f2_score_multi,
- FBetaScore(task="multiclass", beta=3, num_classes=3, average="macro"),
+ FBetaScore(task="multiclass", beta=3.0, num_classes=3, average="macro"),
),
],
)
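The repeated `beta=2` to `beta=2.0` fix above reflects an assumption that recent torchmetrics releases validate `beta` with an `isinstance(beta, float)` check, so passing a Python int raises a `ValueError`. A minimal sketch of that assumption:

from torchmetrics import FBetaScore

FBetaScore(task="binary", beta=2.0)  # accepted
# FBetaScore(task="binary", beta=2)  # assumed to raise ValueError: beta must be a positive float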
From 2967a4e0799a5b85570783d8eb8eaba3cf725a06 Mon Sep 17 00:00:00 2001
From: Pavol Mulinka
Date: Mon, 9 Jan 2023 11:50:18 +0100
Subject: [PATCH 22/24] again torchmetrics
---
tests/test_metrics/test_torchmetrics.py | 56 ++++---------------------
1 file changed, 8 insertions(+), 48 deletions(-)
diff --git a/tests/test_metrics/test_torchmetrics.py b/tests/test_metrics/test_torchmetrics.py
index 8909a22a..3213229a 100644
--- a/tests/test_metrics/test_torchmetrics.py
+++ b/tests/test_metrics/test_torchmetrics.py
@@ -30,18 +30,18 @@ def f2_score_bin(y_true, y_pred):
@pytest.mark.parametrize(
"metric_name, sklearn_metric, torch_metric",
[
- ("Accuracy", accuracy_score, Accuracy(task="binary")),
- ("Precision", precision_score, Precision(task="binary")),
- ("Recall", recall_score, Recall(task="binary")),
- ("F1Score", f1_score, F1Score(task="binary")),
- ("FBetaScore", f2_score_bin, FBetaScore(task="binary", beta=2.0)),
+ ("BinaryAccuracy", accuracy_score, Accuracy(task="binary")),
+ ("BinaryPrecision", precision_score, Precision(task="binary")),
+ ("BinaryRecall", recall_score, Recall(task="binary")),
+ ("BinaryF1Score", f1_score, F1Score(task="binary")),
+ ("BinaryFBetaScore", f2_score_bin, FBetaScore(task="binary", beta=2.0)),
],
)
def test_binary_metrics(metric_name, sklearn_metric, torch_metric):
sk_res = sklearn_metric(y_true_bin_np, y_pred_bin_np.round())
wd_metric = MultipleMetrics(metrics=[torch_metric])
wd_logs = wd_metric(y_pred_bin_pt, y_true_bin_pt)
- wd_res = wd_logs[f"Binary{metric_name}"]
+ wd_res = wd_logs[metric_name]
if wd_res.size != 1:
wd_res = wd_res[1]
assert np.isclose(sk_res, wd_res)
@@ -77,46 +77,6 @@ def f2_score_multi(y_true, y_pred, average):
@pytest.mark.parametrize(
"metric_name, sklearn_metric, torch_metric",
[
- (
- "Accuracy",
- accuracy_score,
- Accuracy(task="multiclass", num_classes=3, average="micro"),
- ),
- (
- "Precision",
- precision_score,
- Precision(task="multiclass", num_classes=3, average="macro"),
- ),
- (
- "Recall",
- recall_score,
- Recall(task="multiclass", num_classes=3, average="macro"),
- ),
- (
- "F1Score",
- f1_score,
- F1Score(task="multiclass", num_classes=3, average="macro"),
- ),
- (
- "Accuracy",
- accuracy_score,
- Accuracy(task="multiclass", num_classes=3, average="micro"),
- ),
- (
- "Precision",
- precision_score,
- Precision(task="multiclass", num_classes=3, average="macro"),
- ),
- (
- "Recall",
- recall_score,
- Recall(task="multiclass", num_classes=3, average="macro"),
- ),
- (
- "F1Score",
- f1_score,
- F1Score(task="multiclass", num_classes=3, average="macro"),
- ),
(
"MulticlassAccuracy",
accuracy_score,
@@ -140,7 +100,7 @@ def f2_score_multi(y_true, y_pred, average):
(
"MulticlassFBetaScore",
f2_score_multi,
- FBetaScore(task="multiclass", beta=3.0, num_classes=3, average="macro"),
+ FBetaScore(beta=3.0, task="multiclass", num_classes=3, average="macro"),
),
],
)
@@ -154,6 +114,6 @@ def test_muticlass_metrics(metric_name, sklearn_metric, torch_metric):
wd_metric = MultipleMetrics(metrics=[torch_metric])
wd_logs = wd_metric(y_pred_multi_pt, y_true_multi_pt)
- wd_res = wd_logs[f"Multiclass{metric_name}"]
+ wd_res = wd_logs[metric_name]
assert np.isclose(sk_res, wd_res, atol=0.01)
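For completeness, the multiclass counterpart exercised by the final form of the tests: with torchmetrics >= 0.11 the factory again returns a task-specific subclass, predictions are per-class probabilities of shape (n_samples, num_classes), and the parametrized names such as `MulticlassF1Score` match the class name. A minimal sketch under those assumptions:

import torch
from torchmetrics import F1Score

f1 = F1Score(task="multiclass", num_classes=3, average="macro")
print(type(f1).__name__)   # "MulticlassF1Score"

preds = torch.tensor([[0.7, 0.2, 0.1],
                      [0.1, 0.8, 0.1],
                      [0.2, 0.2, 0.6]])
target = torch.tensor([0, 1, 2])
print(f1(preds, target))   # tensor(1.)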
From 83f92be6847b1d02947c24d589c238fde10e2fbc Mon Sep 17 00:00:00 2001
From: Pavol Mulinka
Date: Mon, 9 Jan 2023 13:24:11 +0100
Subject: [PATCH 23/24] added roadmap link
---
CONTRIBUTING.MD | 2 +-
mkdocs/sources/contributing.md | 2 +-
mkdocs/sources/index.md | 11 ++++++-----
3 files changed, 8 insertions(+), 7 deletions(-)
diff --git a/CONTRIBUTING.MD b/CONTRIBUTING.MD
index 213bd81f..5aee05c7 100644
--- a/CONTRIBUTING.MD
+++ b/CONTRIBUTING.MD
@@ -1,6 +1,6 @@
Pytorch-widedeep is being developed and used by many active community members. Your help is very valuable to make it better for everyone.
-- Check the [Roadmap](https://github.com/jrzaurin/pytorch-widedeep/projects/1) or [Open an issue](https://github.com/jrzaurin/pytorch-widedeep/issues) to report problems or recommend new features, and submit a draft pull request, which will be changed to a pull request after initial review
+- Check the [Roadmap](https://github.com/users/jrzaurin/projects/3) or [Open an issue](https://github.com/jrzaurin/pytorch-widedeep/issues) to report problems or recommend new features, and submit a draft pull request, which will be changed to a pull request after initial review
- Contribute to the [tests](https://github.com/jrzaurin/pytorch-widedeep/tree/master/tests) to make it more reliable.
- Contribute to the [documentation](https://github.com/jrzaurin/pytorch-widedeep/tree/master/docs) to make it clearer for everyone.
- Contribute to the [examples](https://github.com/jrzaurin/pytorch-widedeep/tree/master/examples) to share your experience with other users.
diff --git a/mkdocs/sources/contributing.md b/mkdocs/sources/contributing.md
index 213bd81f..5aee05c7 100644
--- a/mkdocs/sources/contributing.md
+++ b/mkdocs/sources/contributing.md
@@ -1,6 +1,6 @@
Pytorch-widedeep is being developed and used by many active community members. Your help is very valuable to make it better for everyone.
-- Check the [Roadmap](https://github.com/jrzaurin/pytorch-widedeep/projects/1) or [Open an issue](https://github.com/jrzaurin/pytorch-widedeep/issues) to report problems or recommend new features, and submit a draft pull request, which will be changed to a pull request after initial review
+- Check the [Roadmap](https://github.com/users/jrzaurin/projects/3) or [Open an issue](https://github.com/jrzaurin/pytorch-widedeep/issues) to report problems or recommend new features, and submit a draft pull request, which will be changed to a pull request after initial review
- Contribute to the [tests](https://github.com/jrzaurin/pytorch-widedeep/tree/master/tests) to make it more reliable.
- Contribute to the [documentation](https://github.com/jrzaurin/pytorch-widedeep/tree/master/docs) to make it clearer for everyone.
- Contribute to the [examples](https://github.com/jrzaurin/pytorch-widedeep/tree/master/examples) to share your experience with other users.
diff --git a/mkdocs/sources/index.md b/mkdocs/sources/index.md
index 2ed7d534..6d6135b8 100644
--- a/mkdocs/sources/index.md
+++ b/mkdocs/sources/index.md
@@ -29,9 +29,9 @@ Pytorch
The content of this document is organized as follows:
-- [pytorch-widedeep](#pytorch-widedeep)
+- [**pytorch-widedeep**](#pytorch-widedeep)
- [Introduction](#introduction)
- - [The deeptabular component](#the-deeptabular-component)
+ - [The ``deeptabular`` component](#the-deeptabular-component)
- [Acknowledgments](#acknowledgments)
### Introduction
@@ -75,9 +75,10 @@ $$
-Where $W$ are the weight matrices applied to the wide model and to the final
-activations of the deep models, $a$ are these final activations, and
-$\phi(x)$ are the cross product transformations of the original features $x$.
+Where $\sigma$ is the sigmoid function, $W$ are the weight matrices applied to the wide model and to the final
+activations of the deep models, $a$ are these final activations,
+$\phi(x)$ are the cross product transformations of the original features $x$,
+and $b$ is the bias term.
In case you are wondering what are *"cross product transformations"*, here is
a quote taken directly from the paper: *"For binary features, a cross-product
transformation (e.g., “AND(gender=female, language=en)”) is 1 if and only if
From d35bcf9437a5e43f714cf8d718604bca4f6225ae Mon Sep 17 00:00:00 2001
From: Pavol Mulinka
Date: Mon, 9 Jan 2023 15:58:58 +0100
Subject: [PATCH 24/24] Empty-Commit