Companion code for the classic book Pattern Recognition and Machine Learning: Ch01-demo
% demos for ch01
clear;
k = 10; % range of discrete values: x, y take values in {1,...,k}
n = 100; % number of samples
x = ceil(k*rand(1,n));
y = ceil(k*rand(1,n));
%% Entropy H(x), H(y)
Hx = entropy(x);
Hy = entropy(y);
%% Joint entropy H(x,y)
Hxy = jointEntropy(x,y);
%% Conditional entropy H(x|y)
Hx_y = condEntropy(x,y);
%% Mutual information I(x,y)
Ixy = mutInfo(x,y);
%% Relative entropy (KL divergence) KL(p(x)||p(y))
Dxy = relatEntropy(x,y);
%% Normalized mutual information I_n(x,y)
nIxy = nmi(x,y);
%% Normalized variation information I_v(x,y)
vIxy = nvi(x,y);
%% H(x|y) = H(x,y)-H(y)
isequalf(Hx_y,Hxy-Hy)
%% I(x,y) = H(x)-H(x|y)
isequalf(Ixy,Hx-Hx_y)
%% I(x,y) = H(x)+H(y)-H(x,y)
isequalf(Ixy,Hx+Hy-Hxy)
%% I_n(x,y) = I(x,y)/sqrt(H(x)*H(y))
isequalf(nIxy,Ixy/sqrt(Hx*Hy))
%% I_v(x,y) = (1-I(x,y)/H(x,y))
isequalf(vIxy,1-Ixy/Hxy)
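The functions used above (entropy, jointEntropy, condEntropy, mutInfo, isequalf) come from the book's companion MATLAB toolbox. Assuming they compute plug-in (maximum-likelihood) estimates from the sample vectors, the quantities and the identities the demo checks can be reproduced in a short NumPy sketch; the Python function names here are illustrative, not the toolbox's:

```python
import numpy as np

def entropy(x):
    """Empirical entropy (in nats) of a discrete sample vector."""
    _, counts = np.unique(x, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def joint_entropy(x, y):
    """Empirical joint entropy: each (x_i, y_i) pair is one symbol."""
    _, counts = np.unique(np.stack([x, y], axis=1), axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def cond_entropy(x, y):
    """H(x|y) = sum_v p(y=v) * H(x | y=v), computed directly."""
    return sum((y == v).mean() * entropy(x[y == v]) for v in np.unique(y))

def mut_info(x, y):
    """I(x,y) = sum_{x,y} p(x,y) log[ p(x,y) / (p(x) p(y)) ]."""
    _, xi = np.unique(x, return_inverse=True)
    _, yi = np.unique(y, return_inverse=True)
    pxy = np.zeros((xi.max() + 1, yi.max() + 1))
    np.add.at(pxy, (xi, yi), 1.0)
    pxy /= pxy.sum()
    prod = pxy.sum(axis=1, keepdims=True) @ pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log(pxy[nz] / prod[nz]))

# Same setup as the MATLAB demo: n samples uniform on {1,...,k}.
rng = np.random.default_rng(0)
k, n = 10, 100
x = rng.integers(1, k + 1, size=n)
y = rng.integers(1, k + 1, size=n)

Hx, Hy, Hxy = entropy(x), entropy(y), joint_entropy(x, y)
# The identities hold exactly for the empirical estimates:
assert np.isclose(cond_entropy(x, y), Hxy - Hy)             # H(x|y) = H(x,y)-H(y)
assert np.isclose(mut_info(x, y), Hx - cond_entropy(x, y))  # I = H(x)-H(x|y)
assert np.isclose(mut_info(x, y), Hx + Hy - Hxy)            # I = H(x)+H(y)-H(x,y)
```

Because conditional entropy and mutual information are computed independently here (from the conditional and joint distributions, not via the identities themselves), the assertions are a genuine numerical check, mirroring what the MATLAB `isequalf` comparisons verify.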