Getting the values of leaf nodes in a DecisionTreeRegressor



I have been trying to analyze a DecisionTreeRegressor that I trained in sklearn. I found http://scikit-learn.org/stable/auto_examples/tree/plot_unveil_tree_structure.html useful for determining the attribute each branch of the tree splits on, in particular the following snippet:

import numpy as np

# estimator is the fitted DecisionTreeRegressor
n_nodes = estimator.tree_.node_count
children_left = estimator.tree_.children_left
children_right = estimator.tree_.children_right
feature = estimator.tree_.feature
threshold = estimator.tree_.threshold

# The tree structure can be traversed to compute various properties such
# as the depth of each node and whether or not it is a leaf.
node_depth = np.zeros(shape=n_nodes, dtype=np.int64)
is_leaves = np.zeros(shape=n_nodes, dtype=bool)
stack = [(0, -1)]  # seed is the root node id and its parent depth
while len(stack) > 0:
    node_id, parent_depth = stack.pop()
    node_depth[node_id] = parent_depth + 1

    # If we have a test node
    if (children_left[node_id] != children_right[node_id]):
        stack.append((children_left[node_id], parent_depth + 1))
        stack.append((children_right[node_id], parent_depth + 1))
    else:
        is_leaves[node_id] = True

print("The binary tree structure has %s nodes and has "
      "the following tree structure:"
      % n_nodes)
for i in range(n_nodes):
    if is_leaves[i]:
        print("%snode=%s leaf node." % (node_depth[i] * "\t", i))
    else:
        print("%snode=%s test node: go to node %s if X[:, %s] <= %s else to "
              "node %s."
              % (node_depth[i] * "\t",
                 i,
                 children_left[i],
                 feature[i],
                 threshold[i],
                 children_right[i],
                 ))

However, this doesn't tell me the value at each leaf node. For example, if the above prints something that looks like this:

The binary tree structure has 7 nodes and has the following tree structure:
node=0 test node: go to node 1 if X[:, 2] <= 1.00764083862 else to node 4.
node=1 test node: go to node 2 if X[:, 2] <= 0.974808812141 else to node 3.
node=2 leaf node.
node=3 leaf node.
node=4 test node: go to node 5 if X[:, 0] <= -2.90554761887 else to node 6.
node=5 leaf node.
node=6 leaf node.

how would I find out the value represented by, say, node 2?

What you are looking for is the estimator.tree_.value attribute.

Let's first build a reproducible example, since the example you link to from the docs is for classification rather than regression:

import numpy as np
from sklearn.tree import DecisionTreeRegressor
# dummy data
rng = np.random.RandomState(1)
X = np.sort(5 * rng.rand(80, 1), axis=0)
y = np.sin(X).ravel()
y[::5] += 3 * (0.5 - rng.rand(16))
estimator = DecisionTreeRegressor(max_depth=3)
estimator.fit(X, y)

After that, using your code verbatim, we get:

The binary tree structure has 15 nodes and has the following tree structure: 
node=0 test node: go to node 1 if X[:, 0] <= 3.13275051117 else to node 8. 
node=1 test node: go to node 2 if X[:, 0] <= 0.513901114464 else to node 5. 
node=2 test node: go to node 3 if X[:, 0] <= 0.0460066311061 else to node 4. 
node=3 leaf node. 
node=4 leaf node. 
node=5 test node: go to node 6 if X[:, 0] <= 2.02933192253 else to node 7. 
node=6 leaf node. 
node=7 leaf node. 
node=8 test node: go to node 9 if X[:, 0] <= 3.85022854805 else to node 12. 
node=9 test node: go to node 10 if X[:, 0] <= 3.42930102348 else to node 11. 
node=10 leaf node. 
node=11 leaf node. 
node=12 test node: go to node 13 if X[:, 0] <= 4.68025827408 else to node 14. 
node=13 leaf node. 
node=14 leaf node.

Now, estimator.tree_.value contains the values of all the tree nodes (15 here):

len(estimator.tree_.value)
# 15
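
Its shape also shows how those values are laid out; for a single-output regressor it should be (n_nodes, n_outputs, 1), i.e. here:

estimator.tree_.value.shape
# expected: (15, 1, 1)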

So, for example, to get the value of node #3, we ask for:

estimator.tree_.value[3]
# array([[-1.1493464]])
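
If you want the traversal from your code to print the leaf values directly instead of just "leaf node.", a minimal sketch (reusing the n_nodes and is_leaves arrays built above) could look like this:

values = estimator.tree_.value

for i in range(n_nodes):
    if is_leaves[i]:
        # For a single-output regressor, values[i] has shape (1, 1),
        # so [0, 0] extracts the plain float stored at this leaf.
        print("node=%s leaf node with value %s" % (i, values[i][0, 0]))

For the tree fitted above this prints one line per leaf, e.g. node=3 leaf node with value -1.1493464.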

For a detailed explanation of the contents of value (including non-terminal nodes), please see my answers to the questions below, plus the short sanity check sketched after this list:

  1. interpreting the graph visualization output of a decision tree regression (for regression), and

  2. what does scikit-learn DecisionTreeClassifier.tree_.value do? (for classification).
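
For a regressor fitted with the default squared-error criterion, each entry of value is simply the mean target of the training samples that reach that node, so the root node's entry should match the plain mean of y. A minimal sanity-check sketch, assuming the estimator and y from the reproducible example above:

import numpy as np

# Value stored at the root node (node 0); for a single-output regressor
# value[0] has shape (1, 1), hence the [0, 0] indexing.
root_value = estimator.tree_.value[0][0, 0]

# With the default squared-error criterion this should equal the mean of all
# training targets, up to floating-point precision.
print(root_value, np.mean(y))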
