


WOLPAW_ENTROPY calculates the mutual information according to [1]
B = wolpaw_entropy(acc,M)
B = log2(M) + acc*log2(acc) + (1-acc)*log2((1-acc)/(M-1))
acc  accuracy in the range [0..1] (1 = 100%)
M    number of classes
B    mutual information in bits per trial
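
For illustration, a minimal Octave/MATLAB sketch of this computation follows. Only the signature B = wolpaw_entropy(acc,M) and the formula come from this help text; the function body, the vectorized handling of acc, and the explicit treatment of the acc = 0 and acc = 1 limits (where the raw formula evaluates to 0*log2(0) = NaN) are assumptions of the sketch.

function B = wolpaw_entropy(acc, M)
% Sketch only: Wolpaw's mutual information in bits per trial.
% acc may be a scalar or vector of accuracies in [0..1]; M is the number of classes.
  B = log2(M) + acc .* log2(acc) + (1 - acc) .* log2((1 - acc) ./ (M - 1));
  % The raw formula yields NaN at the boundaries; substitute the analytic limits.
  B(acc == 1) = log2(M);                 % perfect accuracy gives the full log2(M) bits
  B(acc == 0) = log2(M) - log2(M - 1);   % limit of the formula as acc -> 0
end

% Example: 80% accuracy in a 4-class problem
% wolpaw_entropy(0.8, 4)   % approx. 0.96 bits per trial

Note that at chance level (acc = 1/M) the formula gives B = 0 bits.
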
Reference(s):
[1] J.R. Wolpaw, N. Birbaumer, W.J. Heetderks, D.J. McFarland, P. Hunter Peckham,
G. Schalk, E. Donchin, L.A. Quatrano, C.J. Robinson and T.M. Vaughan.
Brain-Computer Interface Technology: A Review of the First International Meeting.
IEEE Transactions on Rehabilitation Engineering 8(2) (Jun 2000) 164-173.