Numpy histogram density does not sum to 1

During a Computational Vision lab, while comparing histograms, I stumbled upon a peculiar behavior. The pairwise kernel matrix of the histograms (which is just a fancy name for the matrix holding the correlation of each histogram with every other) did not have ones on its diagonal. This means that a histogram was not fully correlated with itself, which is weird.

The comparison metric I was using is the simple histogram intersection kernel, defined as

    \[K_{hi}(x, y) = \sum_{m=1}^{d} \min(x_m, y_m)\]
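Here is a minimal NumPy sketch (with made-up toy data and bin settings) that reproduces the behavior. The key detail is that `np.histogram` with `density=True` normalizes the histogram so that its integral over the bin range is 1, not the sum of its bin values; since the intersection of a histogram with itself is just its own sum, the diagonal of the kernel matrix ends up at `x.sum()` instead of 1:

    import numpy as np

    rng = np.random.default_rng(0)
    samples = rng.normal(size=(3, 1000))  # three toy sets of samples

    # density=True normalizes the histogram so that its *integral* is 1:
    # sum(hist * bin_width) == 1. The raw bin values need not sum to 1.
    hists = []
    for row in samples:
        hist, edges = np.histogram(row, bins=20, range=(-4, 4), density=True)
        hists.append(hist)
    hists = np.array(hists)

    bin_width = np.diff(edges)
    print(hists[0].sum())                # 2.5 here (= 1 / bin width), not 1
    print((hists[0] * bin_width).sum())  # 1.0, the integral

    def hist_intersection(x, y):
        """Histogram intersection kernel: sum of element-wise minima."""
        return np.minimum(x, y).sum()

    K = np.array([[hist_intersection(x, y) for y in hists] for x in hists])
    print(np.diag(K))  # not ones: K(x, x) == x.sum(), which is not 1

    # Renormalizing each histogram so its values sum to 1 restores ones
    # on the diagonal of the kernel matrix.
    hists /= hists.sum(axis=1, keepdims=True)
    K = np.array([[hist_intersection(x, y) for y in hists] for x in hists])
    print(np.diag(K))  # all ones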
