What is sum entropy? Is it the same as joint entropy?

math

#1

Hi there,

I am sorry this question is not directly related to ImageJ or Fiji. Can anyone explain a bit about

sum entropy? I couldn’t find anything about sum entropy, only a few web pages on joint entropy. I came across the term in a reference thesis related to my work, but I really don’t get it…

Thank you very much…!


#2

Good day,

I guess it refers to the disjunctive probability (sum of probabilities) of two processes. The context in the text you refer to should make this clear.

Regards

Herbie


#3

Thank you very much for the response. I don’t get how to relate it to the image. I mean, what does P(i) refer to the probability of?


#4

Please try to understand how entropy is defined.

You’ve asked this before and you’ve received several constructive answers.

Entropy of images is computed from their gray-value histograms, which are empirical approximations of probability density functions. That’s where probabilities come into play: P(i) is then simply the relative frequency of gray value i in the image.
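To make this concrete, here is a minimal sketch (plain Python with NumPy rather than an ImageJ macro; the 8-bit grayscale array `img` below is just stand-in data) of entropy computed from the normalized gray-value histogram:

```python
import numpy as np

def image_entropy(img):
    """Shannon entropy of an 8-bit grayscale image, in bits.

    The normalized histogram serves as an empirical estimate of the
    probability P(i) of observing gray value i.
    """
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    p = hist / hist.sum()      # P(i): relative frequency of gray value i
    p = p[p > 0]               # skip empty bins (0 * log 0 = 0 by convention)
    return -np.sum(p * np.log2(p))

# Random pixel data standing in for a real image
img = np.random.randint(0, 256, size=(128, 128), dtype=np.uint8)
print(image_entropy(img))      # close to 8 bits for uniformly random data
```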

HTH

Herbie

ADDITION:

What is sum entropy? Is it the same as joint entropy?

I don’t think so.
Joint probability is the conjunctive probability of two processes (the product of their probabilities when the processes are independent).
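For what it’s worth, texture-analysis references (Haralick-style features, which such a thesis may well be using) usually define sum entropy over the distribution of gray-value sums i + j taken from a co-occurrence matrix, whereas joint entropy is taken over the full joint distribution P(i, j), so the two are generally different numbers. A minimal sketch of that distinction in Python, assuming a normalized co-occurrence matrix `P` is already available (the toy matrix below is made up):

```python
import numpy as np

def joint_entropy(P):
    """Entropy of the joint distribution P(i, j), e.g. a normalized GLCM."""
    p = P[P > 0]
    return -np.sum(p * np.log2(p))

def sum_entropy(P):
    """Entropy of the distribution of gray-value sums k = i + j,
    i.e. p_sum(k) = sum of P(i, j) over all pairs with i + j = k."""
    n = P.shape[0]
    k = np.add.outer(np.arange(n), np.arange(n))   # k[i, j] = i + j
    p_sum = np.bincount(k.ravel(), weights=P.ravel(), minlength=2 * n - 1)
    p_sum = p_sum[p_sum > 0]
    return -np.sum(p_sum * np.log2(p_sum))

# Toy example: a small normalized co-occurrence matrix (sums to 1)
P = np.array([[0.2, 0.1, 0.0],
              [0.1, 0.3, 0.1],
              [0.0, 0.1, 0.1]])
print(joint_entropy(P), sum_entropy(P))   # generally two different values
```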