For two discrete distributions P and Q, the cross entropy is defined as

$$H(P, Q) = -\sum_{x \in \mathcal{X}} P(x) \log Q(x) = H(P) + D_{\mathrm{KL}}(P \parallel Q)$$
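The decomposition into entropy plus KL divergence can be checked numerically. The sketch below (function names are illustrative, not from the source) computes all three quantities in nats for two small distributions over the same support:

```python
import math

def cross_entropy(p, q):
    """H(P, Q) = -sum_x P(x) * log Q(x), in nats."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def entropy(p):
    """H(P) = -sum_x P(x) * log P(x), in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """D_KL(P || Q) = sum_x P(x) * log(P(x) / Q(x))."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

P = [0.5, 0.3, 0.2]
Q = [0.4, 0.4, 0.2]

# Verify H(P, Q) = H(P) + D_KL(P || Q) up to floating-point error.
print(cross_entropy(P, Q))
print(entropy(P) + kl_divergence(P, Q))
```

Note that the terms with P(x) = 0 are skipped, following the convention 0 log 0 = 0.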