
A measure of the amount of uncertainty in a probability distribution or a system subject to constraints. The term originated in thermodynamics, but has been used in a wide variety of contexts, notably in information theory and as the basis for entropy-maximizing models of spatial interaction.
The concepts of macrostate and microstate are central to entropy analysis (note that some writers use the term mesostate where macrostate is employed here). Consider the distribution of 100 people among 10 regions. A microstate is a particular allocation of individuals to regions: individual B to region 6, individual K to region 4, and so on. A macrostate is an aggregate frequency distribution of people across regions. Several different microstates may correspond to, or give rise to, the same macrostate: different individuals go to different regions, but the frequency distributions are the same. Entropy measures the relationship between a macrostate and the possible microstates that correspond to it. At one extreme, one macrostate (all 100 people in one region) has only one associated microstate, whereas the macrostate with ten people in each region corresponds to a large number of different microstates. The number of microstates corresponding to a macrostate is denoted here by W, and finding the entropy measure is a combinatorial calculation, given by:
W = N! / (n1! n2! ... n10!)
the factorial of the total number of individuals N, divided by the product of the factorials for each ni (the number in each region). An alternative entropy measure, used in information theory, is the statistic:
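The combinatorial calculation above can be illustrated directly. The sketch below (Python; the helper name `microstates` is ours, not from the source) computes W for the two extreme macrostates in the 100-people, 10-region example:

```python
from math import factorial

def microstates(counts):
    """Number of microstates W = N! / (n1! n2! ... nk!) for a
    macrostate given by the list of regional counts ni."""
    N = sum(counts)
    W = factorial(N)
    for n in counts:
        W //= factorial(n)  # exact integer division: each ni! divides N!
    return W

# All 100 people in one region: only one microstate.
print(microstates([100] + [0] * 9))   # 1

# Ten people in each of ten regions: a vast number of microstates.
print(microstates([10] * 10))
```

The second call illustrates why the uniform macrostate is the "most likely": it can arise in enormously many more ways than the concentrated one.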
H = -SUMi pi log pi
where pi is the probability (or proportion) in a given region. H is directly related to log W: by Stirling's approximation, log W ≈ NH for large N. The entropy statistics W and H measure the uncertainty of a macrostate with regard to its microstates. Minimum entropy (H = 0) occurs when one pi equals unity and the rest are zero; there is complete certainty because there is only one microstate. H is at a maximum when all the pi are equal (maximum uncertainty; all microstates equally likely). W and H can then be used to assess either the expected entropy of distributions and allocations, or the actual entropy of empirical patterns.
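The minimum and maximum cases, and the connection between H and log W, can be checked numerically. A minimal sketch (Python; the function name `entropy` and the use of natural logarithms are our assumptions):

```python
from math import log, factorial

def entropy(p):
    """Shannon entropy H = -sum(pi log pi), with the convention 0 log 0 = 0."""
    return sum(-pi * log(pi) for pi in p if pi > 0)

# Minimum entropy: everyone in one region, complete certainty.
print(entropy([1.0] + [0.0] * 9))   # 0.0

# Maximum entropy: equal proportions across 10 regions, H = log 10.
print(entropy([0.1] * 10))          # ~2.3026

# Stirling's approximation links the two statistics: log W ~ N * H.
counts = [10] * 10
N = sum(counts)
log_W = log(factorial(N)) - sum(log(factorial(n)) for n in counts)
print(log_W, N * entropy([n / N for n in counts]))
```

For the 100-person example the two quantities agree only to leading order (N here is not very large), but the approximation improves as N grows.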
In geography, the information theory approach has used H to assess and compare entropy levels for settlement patterns and for trends in population and employment distributions. The entropy-maximizing approach uses entropy as the basis for finding the most likely macrostate of a system subject to constraints. (LWH)
Suggested Reading Thomas, R.W. and Huggett, R.J. 1980: Modelling in geography: a mathematical approach. New York: Harper and Row, 152-66 and 197-200. Wilson, A.G. and Bennett, R.J. 1986: Mathematical methods in human geography and planning. Chichester: John Wiley.
