This introductory chapter provides an overview of Benford's law. Benford's law, also known as the First-Digit Law or the Significant-Digit Law, is the empirical gem of statistical folklore which holds that in many naturally occurring tables of numerical data, the significant digits are not uniformly distributed, as might be expected, but instead follow a particular logarithmic distribution. In its most common formulation, the special case of the first significant (i.e., first non-zero) decimal digit, Benford's law asserts that the leading digit is not equally likely to be any of the nine possible digits 1, 2, …, 9: it is 1 more than 30 percent of the time and 9 less than 5 percent of the time, with the probabilities decreasing monotonically in between. The remainder of the chapter covers the history of Benford's law, the empirical evidence for it, early explanations, and the mathematical framework underlying it.
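The logarithmic distribution referred to above can be sketched concretely. In the standard base-10 formulation, the probability that the first significant digit equals d is log10(1 + 1/d); the short Python sketch below (an illustration, not part of the chapter's own development) tabulates these probabilities and confirms the claims that digit 1 occurs more than 30 percent of the time and digit 9 less than 5 percent:

```python
import math

# Benford's law for the first significant digit (base 10):
# P(d) = log10(1 + 1/d), for d = 1, 2, ..., 9.
benford = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

for d, p in benford.items():
    print(f"digit {d}: {p:.4f}")

# The nine probabilities decrease monotonically and sum to 1.
print(f"total: {sum(benford.values()):.4f}")
```

Running this shows P(1) ≈ 0.3010 and P(9) ≈ 0.0458, matching the "more than 30 percent" and "less than 5 percent" figures in the text.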