It is common for a simple concept to have many different names, but terms used in machine learning are especially prone to this.
ROC curve! "Receiver operating characteristic" sounds like something terribly new, but it is the same thing as a P-P plot in statistics.
(HEP people frequently draw it mirrored.)
What about TPR? That's "true positive rate" in machine learning, but HEP people say "signal efficiency", or just "s".
And the survival function in statistics means essentially the same thing, though it is rarely applied to comparing distributions.
Log-likelihood in statistics was renamed logloss in machine learning. Those who train neural networks usually call it the cross-entropy loss, while scikit-learn developers call it binomial deviance. And I am still not sure I know all the names.
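A quick numpy check (toy labels and probabilities of my own choosing) that these really are the same number: the "negative log-likelihood" form and the "cross-entropy" form of the binary loss agree term by term.

```python
import numpy as np

y = np.array([0, 0, 1, 1, 1])             # binary labels
p = np.array([0.1, 0.4, 0.35, 0.8, 0.9])  # predicted P(y = 1)

# "Log-likelihood" view: each observation contributes log P(observed label);
# logloss is the negated mean.
neg_log_lik = -np.mean(np.where(y == 1, np.log(p), np.log(1 - p)))

# "Cross-entropy" view: -[y log p + (1 - y) log(1 - p)], averaged.
cross_entropy = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

# Same quantity: logloss == cross-entropy == negative mean log-likelihood.
assert np.isclose(neg_log_lik, cross_entropy)
```

This is also what `sklearn.metrics.log_loss` computes (up to its default probability clipping); binomial deviance in scikit-learn's gradient boosting is the same expression up to a constant factor.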
At first such things just drive you crazy; later you stop noticing the difference. But it turns into a real problem much later, because the same things are discussed in different terms, and you do not know which one to use when searching the internet.