
Word: Relative entropy
Definition

Relative entropy

Chinese Encyclopedia

Relative entropy (Kullback–Leibler divergence)


Relative entropy is also known as the Kullback–Leibler divergence (KL divergence, abbreviated KLD), information divergence, or information gain.


The KL divergence is an asymmetric measure of the difference between two probability distributions P and Q. It measures the average number of extra bits required to encode samples from P when a code based on Q is used. Typically, P represents the true distribution of the data, while Q represents a theoretical distribution, a model distribution, or an approximation of P.
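For reference, the standard definition for two discrete distributions P and Q over the same outcomes is

D_{\mathrm{KL}}(P \parallel Q) = \sum_{x} P(x) \, \log_2 \frac{P(x)}{Q(x)},

with the logarithm taken base 2 when the divergence is measured in bits.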

English Encyclopedia

Kullback–Leibler divergence (Relative entropy)

[Figure: Illustration of the Kullback–Leibler (KL) divergence for two Gaussian distributions; the typical asymmetry of the KL divergence is clearly visible.]

In probability theory and information theory, the Kullback–Leibler divergence (also information divergence, information gain, relative entropy, KLIC, or KL divergence) is a measure of the difference between two probability distributions P and Q. It is not symmetric in P and Q. In applications, P typically represents the "true" distribution of data, observations, or a precisely calculated theoretical distribution, while Q typically represents a theory, model, description, or approximation of P.
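As a small illustrative sketch in Python (the function name kl_divergence and the example distributions are arbitrary choices, not part of this entry), the divergence between two discrete distributions can be computed directly from the definition:

import math

def kl_divergence(p, q):
    """D_KL(P || Q) in bits, for discrete distributions given as
    probability lists over the same outcomes."""
    total = 0.0
    for pi, qi in zip(p, q):
        if pi > 0:                 # terms with P(x) = 0 contribute nothing
            if qi == 0:            # Q gives zero probability where P does not
                return math.inf    # the divergence is infinite in that case
            total += pi * math.log2(pi / qi)
    return total

# P plays the role of the "true" distribution, Q that of a model of it.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # ~0.036 bits
print(kl_divergence(q, p))  # ~0.037 bits: the two directions differ (asymmetry)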
