Thirty thousand years ago, humans kept track of numerical quantities by carving slashes on fragments of bone. It took approximately 25,000 more years for the first iconic written numerals to emerge among human cultures (e.g., Sumerian cuneiform). Today, children acquire the meanings of verbal counting words, Arabic numerals, written number words, and the procedures of basic arithmetic operations, such as addition and subtraction, in just six years (between ages 2 and 8). What cognitive abilities enabled our ancestors to record tallies in the first place? And what cognitive abilities allow children to rapidly acquire formal mathematical knowledge that took our ancestors many millennia to invent? Current research aims to discover the origins and organization of numerical knowledge in humans using clues from child development, the structure of the human brain, and animal cognition.
This review traces the origins of numerical processing from “primitive” quantitative abilities to math intelligence quotient (IQ). “Primitive” quantitative abilities are those that many animals use to estimate the value of an object or event, for instance, its distance, length, duration, number, amplitude, saturation, or luminance.
Department of Brain and Cognitive Sciences, University of Rochester, Rochester, NY 14627. E-mail: firstname.lastname@example.org.