Information gain (decision tree)
id:
information-gain-decision-tree-206-1927725
title:
Information gain (decision tree)
text:
In information theory and machine learning, information gain is a synonym for Kullback–Leibler divergence: the amount of information gained about a random variable or signal from observing another random variable. However, in the context of decision trees, the term is sometimes used synonymously with mutual information, which is the conditional expected value of the Kullback–Leibler divergence of the univariate probability distribution of one variable from the conditional distribution of this variable given the other one.
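In decision-tree learning this quantity is typically computed as the reduction in Shannon entropy from a candidate split: the entropy of the parent node minus the size-weighted entropies of the child nodes. A minimal sketch of that computation (the function names and example labels are illustrative, not from the source):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (in bits) of a sequence of class labels."""
    total = len(labels)
    return -sum((count / total) * math.log2(count / total)
                for count in Counter(labels).values())

def information_gain(parent, children):
    """Information gain of a split: H(parent) minus the
    size-weighted average entropy of the child partitions."""
    n = len(parent)
    weighted_child_entropy = sum(len(ch) / n * entropy(ch) for ch in children)
    return entropy(parent) - weighted_child_entropy

# A perfectly mixed parent split into two pure children
# gains the maximum possible 1 bit of information.
parent = ['a', 'a', 'b', 'b']
gain = information_gain(parent, [['a', 'a'], ['b', 'b']])
```

A split that leaves both children as mixed as the parent (e.g. `[['a', 'b'], ['a', 'b']]`) yields a gain of zero, which is why such splits are never chosen by a greedy tree builder.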
brand slug:
wiki
category slug:
encyclopedia
description:
Gain from observing another random variable
original url:
https://en.wikipedia.org/wiki/Information_gain_(decision_tree)
date created:
2005-08-21T21:44:17Z
date modified:
2024-09-10T17:14:39Z
main entity:
{"identifier":"Q17083041","url":"https://www.wikidata.org/entity/Q17083041"}
image:
fields total:
13
integrity:
15