Skip to content
New issue

Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.

By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.

Already on GitHub? Sign in to your account

Scarliles/honesty #69

Draft
wants to merge 85 commits into
base: submodulev3
Choose a base branch
from
Draft
Show file tree
Hide file tree
Changes from 62 commits
Commits
Show all changes
85 commits
Select commit Hold shift + click to select a range
8c09f7f
init split condition injection
SamuelCarliles3 Feb 16, 2024
ecfc9b1
wip
SamuelCarliles3 Feb 16, 2024
0c3d5c0
wip
SamuelCarliles3 Feb 16, 2024
5fd12a2
wip
SamuelCarliles3 Feb 20, 2024
b593ee0
injection progress
SamuelCarliles3 Feb 27, 2024
180fac3
injection progress
SamuelCarliles3 Feb 27, 2024
c207c3e
split injection refactoring
SamuelCarliles3 Feb 27, 2024
7cc71c1
added condition parameter passthrough prototype
SamuelCarliles3 Feb 29, 2024
2470d49
some tidying
SamuelCarliles3 Feb 29, 2024
ee3399f
more tidying
SamuelCarliles3 Feb 29, 2024
a079e4f
splitter injection refactoring
SamuelCarliles3 Mar 10, 2024
5397b66
cython injection due diligence, converted min_sample and monotonic_cs…
SamuelCarliles3 Mar 15, 2024
44f1d57
tree tests pass huzzah!
SamuelCarliles3 Mar 18, 2024
4f19d53
added some splitconditions to header
SamuelCarliles3 Mar 18, 2024
cb71be0
commented out some sample code that was substantially increasing peak…
SamuelCarliles3 Mar 21, 2024
e34be5c
added vector resize
SamuelCarliles3 Apr 9, 2024
aac802e
wip
SamuelCarliles3 Apr 10, 2024
c12f2fd
Merge branch 'submodulev3' into scarliles/splitter-injection-redux
SamuelCarliles3 Apr 15, 2024
a7f5e92
settling injection memory management for now
SamuelCarliles3 Apr 15, 2024
7a70a0b
added regression forest benchmark
SamuelCarliles3 Apr 22, 2024
d9ad68a
Merge pull request #2 from ssec-jhu/scarliles/regression-benchmark
SamuelCarliles3 Apr 22, 2024
893d588
ran black for linting check
SamuelCarliles3 Apr 23, 2024
548493c
Merge branch 'submodulev3' of github.com:ssec-jhu/scikit-learn into s…
SamuelCarliles3 Apr 23, 2024
e4b53ff
Merge branch 'submodulev3' into scarliles/regression-benchmark
SamuelCarliles3 Apr 23, 2024
089d901
Merge branch 'neurodata:submodulev3' into submodulev3
SamuelCarliles3 Apr 24, 2024
3ba5f74
Merge branch 'submodulev3' of github.com:ssec-jhu/scikit-learn into s…
SamuelCarliles3 Apr 24, 2024
cf285c1
Merge branch 'scarliles/splitter-injection-redux' into scarliles/regr…
SamuelCarliles3 Apr 24, 2024
ffc6328
Merge pull request #3 from ssec-jhu/scarliles/regression-benchmark
SamuelCarliles3 Apr 24, 2024
87c90fd
initial pass at refactoring DepthFirstTreeBuilder.build
SamuelCarliles3 May 23, 2024
51da586
some renaming to make closure pattern more obvious
SamuelCarliles3 May 28, 2024
6c117a2
added SplitRecordFactory
SamuelCarliles3 May 28, 2024
c7b675b
Merge branch 'scarliles/update-node-refactor2' into scarliles/update-…
SamuelCarliles3 May 28, 2024
9e7b131
SplitRecordFactory progress
SamuelCarliles3 May 28, 2024
a017669
build loop refactor
SamuelCarliles3 May 29, 2024
4325b0a
add_or_update tweak
SamuelCarliles3 May 29, 2024
78c3a1b
reverted to back out build body refactor
SamuelCarliles3 May 30, 2024
b8cc636
refactor baby step
SamuelCarliles3 May 30, 2024
f225658
update node refactor more baby steps
SamuelCarliles3 May 30, 2024
bc17634
wip
SamuelCarliles3 Jun 14, 2024
c949182
added EventBroker class
SamuelCarliles3 Jun 16, 2024
247c4fc
added initial event firing to node_split_best
SamuelCarliles3 Jun 17, 2024
71da148
removed some old commented out code
SamuelCarliles3 Jun 17, 2024
a1fa950
honesty wip
SamuelCarliles3 Jun 30, 2024
ff0dfed
honesty wip
SamuelCarliles3 Jun 30, 2024
db4c947
honesty wip
SamuelCarliles3 Jul 1, 2024
2e87134
honesty wip
SamuelCarliles3 Jul 1, 2024
03c95d9
honesty wip
SamuelCarliles3 Jul 1, 2024
69fc530
honesty wip
SamuelCarliles3 Jul 3, 2024
61dfd0f
honesty wip
SamuelCarliles3 Jul 5, 2024
29a52be
Merge remote-tracking branch 'neurodata/submodulev3' into submodulev3
SamuelCarliles3 Jul 5, 2024
cf52ff5
broke sort functions, partitioners out of _splitter.pyx
SamuelCarliles3 Jul 5, 2024
8e433a6
refactored partitioner
SamuelCarliles3 Jul 6, 2024
09a8ec5
fixed some unintended commented out lines in SparsePartitioner
SamuelCarliles3 Jul 6, 2024
6bb7a33
Merge branch 'scarliles/defuse-partitioner' into scarliles/honesty
SamuelCarliles3 Jul 8, 2024
a2030a8
importing _honest_tree from treeple
SamuelCarliles3 Jul 10, 2024
64688e5
honesty wip
SamuelCarliles3 Jul 18, 2024
febf5e9
honesty wip
SamuelCarliles3 Jul 22, 2024
5e7d07d
honesty wip
SamuelCarliles3 Jul 31, 2024
2c4e992
honesty wip
SamuelCarliles3 Aug 1, 2024
2346e4d
honesty wip
SamuelCarliles3 Aug 4, 2024
551fcf1
honesty wip
SamuelCarliles3 Aug 4, 2024
f1fb747
honesty wip
SamuelCarliles3 Aug 4, 2024
2f2d15a
honest partition testing wip
SamuelCarliles3 Aug 9, 2024
cd79492
honest leaf validity test working
SamuelCarliles3 Aug 10, 2024
53cf65c
honest prediction wip
SamuelCarliles3 Aug 22, 2024
a9e065b
honest prediction wip
SamuelCarliles3 Aug 24, 2024
80c391d
honest prediction passing tests
SamuelCarliles3 Aug 24, 2024
9b5651e
hacked in working honest predict_proba, progress on honest regression
SamuelCarliles3 Aug 30, 2024
cbb23ee
first draft honest forest passing tests
SamuelCarliles3 Sep 3, 2024
c565d65
honesty wip
SamuelCarliles3 Sep 5, 2024
2316e4c
treeple-compatibility tweaks
SamuelCarliles3 Sep 8, 2024
71cacf3
might testing wip
SamuelCarliles3 Sep 18, 2024
6ea50cc
honest forest fixes, honest tree tests
SamuelCarliles3 Nov 6, 2024
492ddad
honest forest test added
SamuelCarliles3 Nov 6, 2024
92156cf
documented method and reasoning for Partitioner "defusing"
SamuelCarliles3 Dec 2, 2024
5291fb1
documented event broker
SamuelCarliles3 Dec 5, 2024
f655401
commented changes to splitter
SamuelCarliles3 Dec 6, 2024
877a822
commented changes to tree
SamuelCarliles3 Dec 6, 2024
3b16b8f
commented honesty module
SamuelCarliles3 Dec 6, 2024
5af6c0b
commented honest tree
SamuelCarliles3 Dec 9, 2024
d75a79b
commented classes.py
SamuelCarliles3 Dec 9, 2024
bdb4ee1
fixed dependency in honest tree tests
SamuelCarliles3 Dec 17, 2024
bd1dd04
Merge branch 'neurodata:submodulev3' into submodulev3
SamuelCarliles3 Dec 19, 2024
35432ee
merged back from submodulev3, overrode Partitioner and Splitter changes
SamuelCarliles3 Dec 31, 2024
7059bf7
commented out some flaky tests in tree which now fail. correct covera…
SamuelCarliles3 Dec 31, 2024
File filter

Filter by extension

Filter by extension

Conversations
Failed to load comments.
Loading
Jump to
Jump to file
Failed to load files.
Loading
Diff view
Diff view
45 changes: 44 additions & 1 deletion asv_benchmarks/benchmarks/ensemble.py
Original file line number Diff line number Diff line change
Expand Up @@ -2,15 +2,58 @@
GradientBoostingClassifier,
HistGradientBoostingClassifier,
RandomForestClassifier,
RandomForestRegressor,
)

from .common import Benchmark, Estimator, Predictor
from .datasets import (
_20newsgroups_highdim_dataset,
_20newsgroups_lowdim_dataset,
_synth_classification_dataset,
_synth_regression_dataset,
_synth_regression_sparse_dataset,
)
from .utils import make_gen_classif_scorers
from .utils import make_gen_classif_scorers, make_gen_reg_scorers


class RandomForestRegressorBenchmark(Predictor, Estimator, Benchmark):
"""
Benchmarks for RandomForestRegressor.
"""

param_names = ["representation", "n_jobs"]
params = (["dense", "sparse"], Benchmark.n_jobs_vals)

def setup_cache(self):
super().setup_cache()

def make_data(self, params):
representation, n_jobs = params

if representation == "sparse":
data = _synth_regression_sparse_dataset()
else:
data = _synth_regression_dataset()

return data

def make_estimator(self, params):
representation, n_jobs = params

n_estimators = 500 if Benchmark.data_size == "large" else 100

estimator = RandomForestRegressor(
n_estimators=n_estimators,
min_samples_split=10,
max_features="log2",
n_jobs=n_jobs,
random_state=0,
)

return estimator

def make_scorers(self):
make_gen_reg_scorers(self)


class RandomForestClassifierBenchmark(Predictor, Estimator, Benchmark):
Expand Down
77 changes: 73 additions & 4 deletions sklearn/tree/_classes.py
Original file line number Diff line number Diff line change
Expand Up @@ -88,6 +88,34 @@
# =============================================================================


class BuildTreeArgs:
def __init__(
self,
X,
y,
sample_weight,
missing_values_in_feature_mask,
min_samples_leaf,
min_weight_leaf,
max_leaf_nodes,
min_samples_split,
max_depth,
random_state,
classes
):
self.X = X
self.y = y
self.sample_weight = sample_weight
self.missing_values_in_feature_mask = missing_values_in_feature_mask
self.min_samples_leaf = min_samples_leaf
self.min_weight_leaf = min_weight_leaf
self.max_leaf_nodes = max_leaf_nodes
self.min_samples_split = min_samples_split
self.max_depth = max_depth
self.random_state = random_state
self.classes = classes


class BaseDecisionTree(MultiOutputMixin, BaseEstimator, metaclass=ABCMeta):
"""Base class for decision trees.

Expand Down Expand Up @@ -155,6 +183,10 @@ def __init__(
self.ccp_alpha = ccp_alpha
self.store_leaf_values = store_leaf_values
self.monotonic_cst = monotonic_cst
self.presplit_conditions = None
self.postsplit_conditions = None
self.splitter_listeners = None
self.tree_build_listeners = None

def get_depth(self):
"""Return the depth of the decision tree.
Expand Down Expand Up @@ -228,7 +260,7 @@ def _compute_missing_values_in_feature_mask(self, X, estimator_name=None):
missing_values_in_feature_mask = _any_isnan_axis0(X)
return missing_values_in_feature_mask

def _fit(
def _prep_data(
self,
X,
y,
Expand Down Expand Up @@ -405,8 +437,7 @@ def _fit(
min_weight_leaf = self.min_weight_fraction_leaf * np.sum(sample_weight)
self.min_weight_leaf_ = min_weight_leaf

# build the actual tree now with the parameters
self = self._build_tree(
return BuildTreeArgs(
X=X,
y=y,
sample_weight=sample_weight,
Expand All @@ -417,9 +448,42 @@ def _fit(
min_samples_split=min_samples_split,
max_depth=max_depth,
random_state=random_state,
classes=classes
)


def _fit(
self,
X,
y,
sample_weight=None,
check_input=True,
missing_values_in_feature_mask=None,
classes=None,
):
bta = self._prep_data(
X=X,
y=y,
sample_weight=sample_weight,
check_input=check_input,
missing_values_in_feature_mask=missing_values_in_feature_mask,
classes=classes
)

# build the actual tree now with the parameters
return self._build_tree(
X=bta.X,
y=bta.y,
sample_weight=bta.sample_weight,
missing_values_in_feature_mask=bta.missing_values_in_feature_mask,
min_samples_leaf=bta.min_samples_leaf,
min_weight_leaf=bta.min_weight_leaf,
max_leaf_nodes=bta.max_leaf_nodes,
min_samples_split=bta.min_samples_split,
max_depth=bta.max_depth,
random_state=bta.random_state,
)

return self

def _build_tree(
self,
Expand Down Expand Up @@ -523,6 +587,9 @@ def _build_tree(
min_weight_leaf,
random_state,
monotonic_cst,
presplit_conditions=self.presplit_conditions,
postsplit_conditions=self.postsplit_conditions,
listeners=self.splitter_listeners
)

if is_classifier(self):
Expand All @@ -545,6 +612,7 @@ def _build_tree(
max_depth,
self.min_impurity_decrease,
self.store_leaf_values,
listeners = self.tree_build_listeners
)
else:
builder = BestFirstTreeBuilder(
Expand All @@ -556,6 +624,7 @@ def _build_tree(
max_leaf_nodes,
self.min_impurity_decrease,
self.store_leaf_values,
listeners = self.tree_build_listeners
)
builder.build(self.tree_, X, y, sample_weight, missing_values_in_feature_mask)

Expand Down
32 changes: 32 additions & 0 deletions sklearn/tree/_events.pxd
adam2392 marked this conversation as resolved.
Show resolved Hide resolved
Original file line number Diff line number Diff line change
@@ -0,0 +1,32 @@
# Authors: Samuel Carliles <scarlil1@jhu.edu>
#
# License: BSD 3 clause

# See _events.pyx for details.

from libcpp.vector cimport vector
from ..utils._typedefs cimport float32_t, float64_t, intp_t, int32_t, uint32_t

ctypedef int EventType
ctypedef void* EventHandlerEnv
ctypedef void* EventData
ctypedef bint (*EventHandlerFunction)(
EventType event_type,
EventHandlerEnv handler_env,
EventData event_data
) noexcept nogil

cdef struct EventHandlerClosure:
EventHandlerFunction f
EventHandlerEnv e

cdef class EventHandler:
cdef public int[:] event_types
cdef EventHandlerClosure c

cdef class NullHandler(EventHandler):
pass

cdef class EventBroker:
cdef vector[vector[EventHandlerClosure]] listeners # listeners acts as a map from EventType to corresponding event handlers
cdef bint fire_event(self, EventType event_type, EventData event_data) noexcept nogil
61 changes: 61 additions & 0 deletions sklearn/tree/_events.pyx
Original file line number Diff line number Diff line change
@@ -0,0 +1,61 @@

# Authors: Samuel Carliles <scarlil1@jhu.edu>
#
# License: BSD 3 clause


cdef class EventBroker:
def __cinit__(self, listeners: [EventHandler], event_types: [EventType]):
"""
Parameters:
- listeners ([EventHandler])
- event_types ([EventType]): a list of EventTypes that may be fired by this EventBroker

Notes:
- Don't mix event types in a single EventBroker instance,
i.e. don't use the same EventBroker for brokering NodeSplitEvent that you use
for brokering TreeBuildEvent, etc
"""
self.listeners.resize(max(event_types) + 1)

if(listeners is None):
for e in range(max(event_types) + 1):
self.listeners[e].resize(0)
else:
self.add_listeners(listeners, event_types)

def add_listeners(self, listeners: [EventHandler], event_types: [EventType]):
cdef int e, i, j, offset, mx, ct
cdef list l

# listeners is a vector of vectors which we index using EventType,
# so if event_types contains any EventType for which we don't already have a vector,
# its integer value will be larger than our current size + 1
mx = max(event_types)
offset = self.listeners.size()
if mx > offset + 1:
self.listeners.resize(mx + 1)

if(listeners is not None):
for e in event_types:
# find indices for all listeners to event type e
l = [j for j, _l in enumerate(listeners) if e in (<EventHandler>_l).event_types]
offset = self.listeners[e].size()
ct = len(l)
self.listeners[e].resize(offset + ct)
for i in range(ct):
j = l[i]
self.listeners[e][offset + i] = (<EventHandler>listeners[j]).c

cdef bint fire_event(self, EventType event_type, EventData event_data) noexcept nogil:
cdef bint result = True

#with gil:
# print(f"firing event {event_type}")
# print(f"listeners.size = {self.listeners.size()}")

if event_type < self.listeners.size():
for l in self.listeners[event_type]:
result = result and l.f(event_type, l.e, event_data)

return result
Loading