Both information theory and statistical mechanics make rather cavalier use of a simple continuous version of the discrete entropy. Treatments often gloss over a number of subtleties in the definition of such a quantity, and this can lead to confusion. A proper continuous version of the discrete entropy is not easy to construct and may not exist. The differential entropy that is commonly bandied about is actually a discrete entropy in disguise, and it carries an implicit coarse-graining scale.
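The hidden coarse-graining scale can be seen numerically. In a sketch of my own (not from the article), quantizing a continuous density into bins of width delta gives a discrete entropy close to the differential entropy minus log2(delta), so shrinking the bins inflates the entropy without bound:

```python
import numpy as np

def binned_entropy_bits(pdf, lo, hi, delta):
    """Discrete Shannon entropy (bits) of pdf quantized into bins of width delta."""
    edges = np.arange(lo, hi + delta, delta)
    centers = (edges[:-1] + edges[1:]) / 2
    p = pdf(centers) * delta          # probability mass per bin (midpoint rule)
    p = p[p > 0]
    p = p / p.sum()                   # renormalize away the truncated tails
    return -np.sum(p * np.log2(p))

# Standard normal: differential entropy is (1/2) log2(2*pi*e) ~ 2.047 bits.
gauss = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
h_diff = 0.5 * np.log2(2 * np.pi * np.e)

for delta in (0.1, 0.01):
    H = binned_entropy_bits(gauss, -10.0, 10.0, delta)
    # H tracks h_diff - log2(delta): it diverges as delta -> 0.
    print(f"delta={delta}: H={H:.3f} bits, h - log2(delta) = {h_diff - np.log2(delta):.3f}")
```

The discrete entropy never converges as delta shrinks; only the combination H + log2(delta) does, and that combination is the differential entropy.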
In this article, I review discrete entropy and probability densities, carefully analyze the continuous limit and the issues it raises, and touch on several possible approaches. An enumeration of various axiomatic formulations is also provided. The piece is pedagogical and does not contain original research, though I offer a couple of my own thoughts on possible means of generalizing entropy.
While an acquaintance with probability and entropy is assumed, the discussion is fairly self-contained and should be accessible to a broad audience.