Evaluation Metrics to Consider #904

Open
BradKML opened this issue Mar 31, 2022 · 1 comment


BradKML commented Mar 31, 2022

Describe the workflow you want to enable

There are evaluation metrics beyond the F1 score and accuracy for checking classification strength.
These include pair-counting metrics (which check whether pairs of samples grouped together in the ground truth are also grouped together by the model) and information-based metrics (which compare the structure of the model's labeling to that of the ground truth). A quick illustration of both families is sketched below.
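For reference, scikit-learn already ships implementations of both families; here is a minimal sketch (the labelings below are made up purely for illustration) comparing a ground-truth grouping against a model's grouping:

```python
from sklearn.metrics import adjusted_rand_score, adjusted_mutual_info_score

labels_true = [0, 0, 1, 1, 2, 2]  # ground-truth grouping (illustrative)
labels_pred = [0, 0, 1, 2, 2, 2]  # model's grouping (illustrative)

# Pair-counting family: agreement on whether pairs of samples
# share a group, adjusted for chance.
print(adjusted_rand_score(labels_true, labels_pred))

# Information-based family: mutual information between the two
# labelings, adjusted for chance.
print(adjusted_mutual_info_score(labels_true, labels_pred))
```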

Describe your proposed solution

Adaptation and expansion of external validation metrics, for possible use in ensembles.
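To illustrate what a self-contained pair-counting metric could look like, here is a hypothetical sketch of the plain (unadjusted) Rand index; the `rand_index` name and the example labels are assumptions for illustration, not an existing mlxtend API:

```python
from itertools import combinations

def rand_index(labels_true, labels_pred):
    """Fraction of sample pairs on which the two labelings agree:
    either both put the pair in the same group, or both in different groups."""
    agree = total = 0
    for (t1, p1), (t2, p2) in combinations(zip(labels_true, labels_pred), 2):
        total += 1
        if (t1 == t2) == (p1 == p2):
            agree += 1
    return agree / total

print(rand_index([0, 0, 1, 1], [0, 0, 1, 2]))  # 5 of 6 pairs agree -> 0.833...
```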

Describe alternatives you've considered, if relevant

rasbt (Owner) commented Mar 31, 2022

I am all for adding more things that are not widely available elsewhere and are useful to others.
