Note: This is a longform essay that I entered in Brown’s Archaeology for the People competition back in September. On the one hand, I did not win, which means I am, sadly, neither rich nor famous. On the other hand, I now get to share it on the blog. All’s well that ends well.
I remember the first time I touched a child’s bones. Crouched on the side of a sun-drenched Portuguese hill, I knelt in front of a slope of sediment from which human remains sprouted like calcified weeds. The site of Bolores is just outside of Torres Vedras, a small, sleepy town nestled in the gentle hills of the Portuguese Centro, some 40 kilometers north of Lisbon. Between four and five thousand years ago, people gravitated to this small rock shelter perched halfway up a hill, carving out the soft rock of the overhang to bury their dead in a communal grave. Incredibly, the site was used for over a millennium, with generation after generation interring a few select individuals in the gentle curve of the hillside that had been gradually chiseled out of the limestone overhang. Centuries of use led many of the skeletons to become commingled; as space was dug out for each new interment, old bones became dislodged, overturned and upended, mixed with the bones of other people. Each burial was covered with sediment, and then the process was repeated, over tens and even hundreds of years.
The site was discovered when a local farmer noticed bones eroding out of the ridgeline, and so we were carefully undoing one thousand years’ worth of ritual, lifting out a handful of earth for each handful scattered in the past, levering up the limestone slabs that had been used as platforms to support the dead, and removing, bone by bone, every individual who had been carefully laid to rest in the rock shelter. We had been digging for a few weeks, and I was learning how to navigate a site peppered with extremely fragmentary human remains: walk barefoot, dig slowly, and keep your eyes peeled for teeth.
Most archaeologists could easily equip themselves with the leavings from a typical construction site – large shovels, pick-axes, trowels, rebar, and flagging tape – but the fragile nature of the human remains preserved at Bolores required a different approach. We worked with a tool-kit pilfered from a variety of other professions; our packs were filled with the cheap bamboo skewers favored by shish-taouk vendors, exquisite long-handled coffee spoons more appropriate to Portuguese cafés, finely-tipped artists’ paint-brushes, and brightly colored thumb tacks normally used to hold up elementary school posters. Excavating a prehistoric and commingled burial is a delicate affair, and so every day I would hunker down in front of a space barely 20 centimeters across, dislodging recalcitrant clumps of dried sediment with the skewer, lifting tiny spoonfuls of dirt out of crevices, lightly brushing away sand to reveal the contours of smooth patches of bone, and marking the location of finds with the thumb tacks.
On this particular afternoon I was hunched over a jumble of broken bones, a cornucopia of arm bone heads, vertebral bodies, and fractured thighbones. While gently brushing around a robust first metacarpal – the large bone just underneath the base of your thumb – I uncovered four tiny shafts. I called our resident bioarchaeologist, Anna, over for a look. “Are these animal? They’re so small that they can’t be human, right?” She crouched down for a minute with her brush, nose only inches away from the bones. “No, no, they’re human. Subadult phalanges. Maybe 2-4 years, based on the size?” She headed back to her unit and I rocked slowly back onto my heels. Even though I had worked on many other archaeological projects, this felt different. The remnants of the child’s hand were nestled up against the adult thumb, as if arranged to give them protection or comfort. It was impossible to tell whether the positioning was deliberate or merely the result of centuries of sedimentary admixture, but the potential purposefulness of the arrangement struck me. Thousands of years ago someone had lost a young child, carried them up to this isolated, wind-swept ridge kilometers away from the nearest settlement, and laid them to rest in the company of their ancestors.
As an archaeologist, it becomes easy to disassociate oneself from the past, to grow inured to the incessant, intriguing pressure of generations of ancient lives. The material traces of prehistoric existence metamorphose into data points on a spreadsheet, converting hours of artisanal effort, decades of use-wear, and flashes of personality into banal column entries with a single click of a mouse. Staring at the child’s fingers, stained reddish-brown by the surrounding earth, this degree of disassociation suddenly seemed impossible. For the first time I was viscerally aware of just how close I was to the past, confronted with the knowledge that these hands had been part of a person. The fragmentary cranial bones I was having so much trouble extracting, the femoral head that was slowly disintegrating and impossible to preserve – these were no longer simply objects, tiresome tasks to be checked off a late afternoon excavation to-do list. Everything in my immediate field of vision had once been part of a person, and had been deliberately interred in this cave because of other people, responding, emotionally, physically and socially, to their death.
The beginnings of bioarchaeology
I am not alone in my obsession with bioarchaeology’s unique ability to put you in immediate, tangible contact with ancient people. Bioarchaeology, the study of human remains from archaeological sites, is a specialty that has grown rapidly within archaeology over the past fifty years. At the turn of the 20th century, when many practitioners of the discipline resembled the mustachioed protagonists of H. Rider Haggard novels, human remains were widely considered to be an inconvenience. Skeletons proved an impediment to accessing the elaborately decorated ceramics, golden vessels, obsidian blades, and other artifacts that excavators so eagerly sought. As a result, human remains were treated with about as much dignity as the plough zone, the layer of modern agricultural activity archaeologists must remove before excavating a site.
Stories abound of early teams tunneling through burials, destroying carefully arranged ritual interments for the sole purpose of retrieving exotic or elaborate grave goods – a practice we now recognize as looting. Given this professional history, it is little wonder that NAGPRA, the Native American Graves Protection and Repatriation Act, was put into place in 1990. This legislation stipulates that archaeologists in North America are legally obligated to return any excavated “cultural items” – a category which includes human remains – that are identified with or related to descendants living in modern Native American communities. Due in no small part to archaeology’s abysmal history of disrespectful treatment of indigenous human skeletons, early calls for repatriation of Native American remains poured in to national and local museums and state archaeologists. Thanks to the efforts of both active indigenous communities and a new generation of archaeologists supportive of NAGPRA and Native American rights, thousands of human remains that were carelessly excavated in the past century have since been reburied.
It was not until the second half of the 20th century that the traditional idea of “human bone as refuse” began to change. During the mid-1900s, pioneering skeletal biology research started to provide new ways to garner information from human remains. Anatomists were beginning to realize the utility of curating human skeletons for study, amassing large-scale, comparative collections of individuals with known life-histories in order to chart the effects of age, illness and trauma on the human skeleton. Robert J. Terry, a professor of Anatomy at Washington University in St. Louis, was an early pioneer in this movement. He focused on skeletons collected from the university medical school’s anatomy cadavers, giving abandoned bodies that had become “property of the state” greater public utility than unceremonious cremation. Terry’s project quickly grew in scientific scope and professional scale. Researchers diligently documented biographical information like sex, age, and cause of death, collecting data on a broad cross-section of contemporary society. By the 1950s, the project began to comply with new legislation that mandated familial legal consent for cadaver donation. In 1967, the pioneering collection was moved to its new home in the Smithsonian’s National Museum of Natural History, where it serves to this day as an unmatched repository for skeletal research.
Terry was not alone in understanding the utility of such large-scale skeletal samples. In the first three decades of the twentieth century, the anatomist T. Wingate Todd added thousands of human skeletons to the prodigious Hamann-Todd Collection, now housed at the Cleveland Museum of Natural History. Such collections were not mere cabinets of curiosities. Todd is famous within osteology, the branch of science devoted to the study of bones, for developing methods for estimating the age of an individual using the hip bone. Todd focused on the pubic symphyseal face of the pelvis, the region ten centimeters below the navel where the arching struts at the front of the hips interlock. He examined this area on hundreds of different skeletons, comparing individuals of different sexes, ancestries, and ages to map out age-related changes in the appearance of the bone’s surface. Other researchers drew on growing anatomical collections to refine the methods developed in Todd’s foundational studies, working with diverse skeletal samples collected from the Korean War and Los Angeles morgues, in order to study these same changes in different segments of the population. Over the course of fifty years, anatomists and physical anthropologists made groundbreaking strides in developing standards for accurate age estimation.
A fully fleshed prehistory
In many ways, assessing the age of a human skeleton is akin to the process of estimating the age of a living person. When meeting a new acquaintance, we use a number of small cues to make an educated guess as to how old they are, subconsciously observing the number and depth of their wrinkles, the thickness of their hair, and the straightness of their spine. The visual cues that osteologists use to estimate the age of a skeleton are more arcane, but work in a similar fashion. Consider the auricular surface, the area where the back of the hip connects with the sacrum. If you focus on this anatomical region, the gentle billows on the surface of the bone gradually degrade into an irregular roughened expanse as an individual ages. The transformation from a smoothly undulating surface to a pitted swathe of pockmarked bone is as distinct and immediately recognizable to a bioarchaeologist as the difference between the dewy features of an adolescent girl and the wizened countenance of her grandmother standing beside her.
Evaluating biological sex relies upon similar kinds of pattern-recognition. Nine times out of ten, if you are given five seconds to decide whether the stranger in a photograph is male or female, you’ll make the correct choice. Bioarchaeologists use some of the same guidelines – the prominence of the chin, the robusticity of neck muscle attachments, the smoothness of the brow – to produce a qualitative assessment of the sex of an individual. However, just as physical appearance can be misleading in living humans – as in the case of more androgynous individuals like the model Andrej Pejic, who has famously walked runways as both a man and a woman – features of the bony cranium can also be deceptive. Strong-jawed, robust-browed females and small-chinned, slender-necked males are found in all populations, which is one reason that assessments of sex and age are explicitly defined as estimates rather than identifications. We make occasional mistakes in our everyday assessments of individual biology – your new co-worker, who you could have sworn was over forty, is only thirty-one; the petite, long-haired child who scampered past you at the park is male rather than female – so it is unsurprising that osteologists can be similarly misled by subtle changes in the architecture of the skeleton. For this reason, bioarchaeologists prefer to use the pelvis to assess sex, because the evolutionary pressures of childbirth have placed stronger constraints on formal differences between male and female structures in this region. Still, there are no absolutes, as wide-hipped males and slender-hipped females do make up part of the inherent variability of human populations.
While skeletons thus occasionally provide ambiguous clues as to an individual’s sex, one area in which human bone provides an extremely faithful record of the past is when it comes to health and activity. As is so famously encapsulated in Wolff’s law (the osteological equivalent of Newton’s laws of motion, or Einstein’s E=mc²), living bone adapts and adjusts to the stresses it is subjected to. As with muscles, greater use leads to stronger, more robust bones, while disuse and inactivity cause elements to wither and waste away. If a thighbone is enormous and has colossal, imposing ridges for muscle attachments, it is likely that the individual it belonged to was extremely active during life. If, on the other hand, the bone is delicate and gracile, with relatively smooth surfaces, its owner was likely not as vigorous or active. Osteologists interested in differences in activity level and habitual patterns of movement have begun to quantify such differences. By measuring variability in the size and patterning of these musculoskeletal stress markers – well-known anatomical regions where muscle tendons cleave to bone – bioarchaeologists have quantified differences in male and female daily activities in prehistoric Hudson Bay populations, charted changes in habitual subsistence tasks during the transition from foraging to farming in the Levant, and isolated the effects of the Spanish conquest on the labor patterns of indigenous populations in the American Southwest.
In addition to registering the activity of soft tissues like muscles and tendons, foundational work in paleopathology, or the study of ancient disease, demonstrated that skeletons also record some of the more painful aspects of daily life in prehistory. Starting in the 1960s, dedicated physical anthropologists like Don Ortner began to map out the precise links between specific diseases and their effects on human bone, creating reference manuals that allowed osteologists to make posthumous diagnoses of illnesses suffered during life. Treponemal infections like syphilis and yaws, for instance, famously decorate the cranial vault with radiating lesions called stellate scars, so named for their passing resemblance to stars. Tuberculosis weaves its insidious web in the spine, boring destructive, gaping holes into vertebral bodies – the lytic lesions of Pott’s disease. Leprosy can voraciously attack the skeleton of the upper face, causing the bony nasal aperture to widen while eating away at the bone of the upper jaw, forcing the bone to furiously attempt to rebuild itself in the face of this bacterial onslaught. Specific nutritional deficiencies can also be diagnosed – rickets, a demineralization disorder caused by a paucity of Vitamin D, calcium, or phosphate, is immediately apparent in the sickening curvature of the long bones of the arms and legs, elements so arched they appear to have been pulled and twisted like taffy at a fairground.
Even the most mundane daily activities take their osteological toll and leave archaeologically visible traces. Over time, the impact of every jolt, every rolled ankle, every basket lifted and grain processed, becomes inscribed in one’s bones. Bone breakage, if it occurs before death, results in the formation of a callus, a rounded protuberance of bony “band-aid” that surrounds the affected site, slowly knitting the bone back together. In other instances, particularly when fractured elements are not splinted together properly, bones heal incorrectly, coming together at an incorrect angle in a condition called malunion. Many prehistoric populations were also affected by osteoarthritis, the bone-on-bone rubbing that occurs as older or more active individuals begin to wear out the cartilaginous padding that protects the articular surfaces of bones. Joint surfaces, like the area where your upper arm meets your shoulder, or your thigh meets your hip, become flattened and porous, covered with a highly visible gloss called eburnation that is the distinct result of bone wearing upon bone for many years. The polished sheen, its surface marked by porous pinpricks and lipped by curled spicules of grasping bone, suggests that even the smallest movement would have resulted in a great deal of pain as the distal end of one element scraped over the proximal end of another. Thus, though the results of paleopathological research are often troubling, they provide evidence of what daily life was like in the era before modern medicine: infectious disease, nutritional deficiencies and heavy labor loads were common problems in the ancient world, and trailblazing paleopathologists were responsible for demonstrating the extent and scope of such ailments.
During the 1970s, a dedicated group of young archaeologists began weaving these insights from anatomy and physical anthropology into archaeological practice. Estimating the age of an individual by examining the morphology of their pelvis or the closure of their cranial sutures, reconstructing the demography of an ancient population by investigating dental wear, and recreating deadly trauma by analyzing the edges of fractured bone: such fantastic analytic feats were now possible. Instead of being the refuse of an excavation, human skeletons now became the goal of many archaeological projects, and intensive, long-term research in varied parts of the world began to illuminate aspects of daily life that had previously been inaccessible. Large-scale projects in Western Europe, the American Midwest and North Africa opened a new door into the past. This new generation of bioarchaeologists turned ancient skeletons into individuals, rather than mere boxes of jumbled bones. They collected data on age, sex, health, diet and trauma, fleshing out a portrait of prehistory that had previously been based largely on traces of material culture – artifacts like pottery, stone tools and architecture. By focusing on the individuals who produced and used these objects, bioarchaeologists began to bring actual people back to the forefront of our understanding of the past.
Piecing together a fragmentary past
Pioneering studies of early skeletal material focused on samples of well-preserved, relatively complete individual burials. Unfortunately, while entire skeletons can provide faithful records of individual lives, it is rare that bodies are preserved intact unless they are relatively recent inhumations. The archetype of the diligent archaeologist carefully brushing dirt away from an intact cranium and slowly revealing the recognizable and articulated silhouette of a complete body is somewhat misleading. Unless the excavation is focused on historical remains, human bones are rarely so neatly packaged. In most prehistoric excavations, the grunt work of bioarchaeology lies in the retrieval of fragments large enough to analyze; bone preserves well, but it is by no means impervious to the onslaught of time and taphonomy (the effects of weathering, erosion, and movement on bones). In sharp contrast to the complete skeletons prepared and analyzed by Terry and Todd, the majority of archaeologically retrieved skeletal material is often dishearteningly paltry – I have heard the term “cornflakes” used to describe the infinitesimally small, weathered fragments of crania and long bones that bioarchaeologists are tasked with analyzing.
That is why teeth are particularly valuable – their incredibly dense enamel makes them hardy survivors in the archaeological record, and they can withstand taphonomic conditions that destroy much larger portions of human bones. This resistance to destruction is particularly important, as one of the first goals of a bioarchaeologist is to figure out how many people are represented at a given site. This number is referred to as the Minimum Number of Individuals, or MNI, so called because it is calculated by figuring out the lowest number of people it would take to produce the bones represented in a skeletal assemblage. For example, if an osteologist establishes that a particular burial contained five separate left thighbones, then the remains of at least five people must have been interred in the grave. However, when burials are fragmentary, commingled or partially excavated, establishing the precise number of bones and the side of the body they come from is a tricky business. The sheer weight of surrounding rock and sediment, the insidious seepage of erosive water, and the buckling and twisting of the ground’s freeze-thaw cycles over the course of millennia all work to fracture human bones into minuscule puzzle pieces. Because bioarchaeological analysis occurs after excavation, these delicate flakes of fragmentary bone must be prised out of the earth and transported to a laboratory, further subjecting them to damage and distortion. This is why, particularly when graves are ancient, bones are fragmentary, and individuals are inextricably mixed together, human teeth are invaluable.
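The MNI logic described above – count each skeletal element by side, and let the most-repeated one set the floor – is simple enough to sketch in a few lines of code. This is a deliberately simplified illustration, not a real bioarchaeological tool: the element names and counts are invented, and actual analyses also weigh age, size, and left-right pairing before settling on a number.

```python
from collections import Counter

def minimum_number_of_individuals(elements):
    """Estimate MNI for a commingled assemblage.

    `elements` is a list of (element, side) pairs, one per identified bone.
    The most frequently repeated pair sets the minimum number of people
    needed to produce the assemblage.
    """
    counts = Counter(elements)  # e.g. ("femur", "left") -> 5
    return max(counts.values()) if counts else 0

# Hypothetical assemblage: five left femora, three right femora, four crania.
assemblage = (
    [("femur", "left")] * 5
    + [("femur", "right")] * 3
    + [("cranium", "n/a")] * 4
)
print(minimum_number_of_individuals(assemblage))  # -> 5
```

Five left femora can only come from five different people, so the estimate is 5 – even though the assemblage could, in principle, represent far more individuals, which is exactly why the figure is labeled a minimum.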
Unlike a thighbone or a cranium, a tooth is often able to weather the onslaught of the thousands of years of site formation processes with equanimity. The choice between analyzing thousands of fragments of postcranial bone that may refit, but may not, and the largely complete human teeth that accompany them, is an obvious one. Unfortunately, the utility of human teeth is matched only by their ability to evade archaeologists’ eyes and fingers. At Bolores, a site containing the jumbled up bones of over thirty men, women, and children, anyone who found themselves at a loss for work was cheerfully relegated to the abandoned tires that decorated the edge of the site. Perched atop the baking rubber, fieldworkers would hunker down and screen for teeth, combing through small circular sieves for bright, polished flashes of enamel. Hunting for loose dentition is a mind-numbingly tedious process. Sorting through chunks of sediment and minute fragments of bone for loose teeth is the osteological equivalent of combing through a bowl of dry rice to hunt for large grains of kosher salt. However, the analytical pay-off is worth it, for human teeth encode precious information about the age, health and diet of prehistoric individuals.
To begin, teeth tell us that ancient populations were subject to severe dental problems as a result of their diets. Before the 1980s it was widely assumed that the development of agriculture represented a quantum leap forward in human well-being, with intrepid newfound farmers providing their communities with increased food security, better living conditions, and the opportunity to settle down and focus on the arduous process of civilization building. However, early bioarchaeological investigations revealed this to be an overly simplistic portrait, one that glossed over some of the heavy costs of humanity’s increasing reliance on domesticated crops.
One deleterious effect of the transition to agriculture in many parts of the world was a rise in the frequency of caries, or cavities, the pits bored into the surface of a tooth crown by bacterial infection or decay. As people domesticated key grains like wheat and maize, they began to inadvertently consume greater and greater amounts of starchy sugar – gulping it down in fermented drinks like the sweet maize-based chicha of Peru, consuming filling, sticky porridges of domesticated seeds in the Midwest, and tearing into hearty loaves of hand-ground wheat bread in the Old World. As a result, even a cursory examination of the teeth of any early horticultural community reveals shudder-inducing examples of dental disease. Working with the skeletons of early farmers in both North America and Europe, I’ve witnessed enormous, obviously painful caries that obliterate entire tooth crowns. Cavities aren’t the worst of it. Many adult skeletons also exhibit high walls of hardened plaque that completely imprison individual teeth. In the worst cases, jaw bones can be marked by insidiously smooth-walled open channels – abscesses so caustic that the infection ate through the bone itself. In the days before dentistry, these kinds of problems were more than mere inconveniences. Broken Hill, an early human fossil dated to 200-300,000 years before present, preserves some of the earliest evidence of just how serious these sorts of dental problems could be. This Zambian specimen of Homo heidelbergensis is well known because of its extensive caries and dental abscesses, insults so severe that they may have contributed to his death.
Just as teeth can tell us when people were getting too much of a certain food, they can also tell us when people weren’t getting enough. Another global pattern in the transition from hunting and gathering to agriculture is an increase in the number of linear enamel hypoplasias. Hypoplasias are defects in the enamel surface of a tooth that result from periods of growth interruption due to illness or malnutrition. These indented lines or pits are visible to the naked eye, but are easy to miss unless you are a diligent observer. Enter any bioarchaeology laboratory and you will find at least one person holding incisors up to a strong light, scanning them with the aid of a hand-loupe, and blinking like a confused owl when you walk into the room and interrupt them. The reason that osteologists devote so much effort to poring over the surfaces of canines, hunting for even the smallest defect on the smooth enamel surface, is because hypoplasias have the ability to tell us what an individual’s early experience of life was like.
Everyone vaguely remembers the childhood experience of losing their baby teeth. Sitting at your school desk with a furrowed brow, poking your tongue against the increasingly pliable back of an incisor or canine, you likely fell into one of two groups: dreading the painful pull of the inevitable slammed door, or eager to get the interminable process over with. The reason, of course, that we lose our baby teeth is that they are pushed aside to make room for the new permanent dentition erupting beneath them. Additionally, because permanent teeth develop during our early childhood, they provide a faithful record of any severe stress experienced during that time period. Just as you are capable of prioritizing key needs in moments of crisis – deciding to finish a project at work rather than go to the gym, for example – your body also prioritizes the functions it deems non-negotiable in times of crisis. Maintaining brain function? Indispensable. Adding an extra layer of enamel to a developing tooth when you barely have enough calories to keep yourself alive? Your body says you can do without it.
What bioarchaeologists found when they examined the teeth of early farmers was that many of them had experienced very serious episodes of nutritional stress during childhood. Far more farmers, in fact, preserved these sorts of defects than did earlier hunter-gatherers, suggesting that periods of semi-starvation were quite common in early agricultural communities. Part of this pattern is likely related to the unpredictability of crop yields and the annual risk that crops will fail. Hunter-gatherers who see signs of a bad season coming can pick up and move on, but farmers are tied to land they have invested in. When a bad year comes, agriculturalists simply have to batten down the hatches and make do, and the osteological record suggests that tightening one’s belt and waiting for spring was one of the only available solutions for many early farming communities. An important caveat, however, is that all of the adult individuals exhibiting these marks on their teeth lived through these periods of extreme nutritional stress. Since their adult dentition had erupted before death, the skeletons preserved in the archaeological record represent the survivors of such periods of hardship, rather than the individuals who succumbed to starvation. A key question follows: what happened to the individuals who did not make it? How many children simply could not handle the terrible burden of slow starvation, and how were they treated by early agricultural communities?
Understanding prehistoric identities
At Bolores, one of the overarching excavation goals was to figure out how many children had been buried at the site relative to adults. Understanding the age composition of the collective burial would provide information about both demography and social practice, illustrating which age groups were subject to high mortality, and untangling how they were treated at death. Because human bone at the site was relatively fragmentary, the easiest way to address this question was by examining all of the teeth recovered from the dig.
Besides allowing bioarchaeologists to estimate how many people were buried at a specific site, teeth also provide an indication as to how old they were. Both adult and baby teeth exhibit relatively regular patterns in development and their sequence of eruption. Accordingly, if a site contains entire handfuls of adult third molars (also known as wisdom teeth) but no baby teeth, it is likely that the burial did not contain children. Perhaps children or infants were being buried elsewhere because of their age – in many societies, ‘personhood’ is not attained until later childhood, after specific cultural and developmental rites of passage like weaning or menarche have occurred. The Romans, in particular, were notorious for burying infants in spaces distinct from the burial grounds of adolescents and adults – tucking them into walls, nestling them beneath barns and interring them under kitchen hearths – all because their souls were believed to be too nascent and transitional to withstand full formal burial. In contrast, historic cemeteries in North America underscore the social and cultural importance of children, their costly tombstones and inclusion in graveyards demonstrating that they were considered to be social beings on par with adults. The proportion of infants and children buried in a particular area thus gives clues to not only the demographic pressures faced by prehistoric populations – whether death was more likely to occur just after weaning, for instance – but also reveals the ways in which ancient peoples structured their societies and how they conceived of personhood itself.
Preliminary osteological analysis of the Bolores remains revealed a curious pattern. By examining the number and development of baby teeth and the wear on fully formed permanent dentition, bioarchaeologists established that over half of the individuals buried at the site were children or adolescents, individuals who were under eighteen years old when they died. While this strikingly high number of young individuals could reflect the actuality of past demography, in which childhood mortality was high, it could also be related to the project’s rigorous excavation methods. Digging slowly, fine-screening for loose teeth, and diligently plotting the location of even the smallest bones allowed members of the team to virtually recreate the site, associating bones and teeth to produce an accurate estimate of how many young people had been buried there. However, whether a reflection of past demographic reality or an artifact of careful excavation methods, the high number of children at the site did reveal a very important facet of Late Prehistoric life in early Iberia.
Later prehistory in Spain and Portugal, circa 5,000 to 7,000 years ago, was a time of significant social change. The archaeological record attests to the increasing volume of communication and degree of interconnectedness between communities scattered throughout the Mediterranean and North Africa. Exquisite artifacts like delicate hairpins, carefully carved rabbit figurines, and hilts for intricate quartz daggers were crafted from pearl-sheened ivory and verdantly gleaming variscite stone, highlighting the increase in long-distance trade in exotic materials. People were slowly beginning to congregate on the landscape in larger groups, building bigger communities that would have required more sophisticated strategies for dealing with the daily pressures of living a mere hand’s breadth away from family, friends, neighbors, and potential enemies. In other places and at other times, these kinds of pressures led to the development of rigid schemes of social ranking, where specific social groups grasped the reins of opportunity and held firmly to power for centuries – Game of Thrones for the prehistoric set. However, research at Bolores and other sites like it points to the existence of an alternate strategy. Despite the burgeoning size of towns and villages, and the increasing interconnections between vast swathes of the Old World, these Neolithic and Copper Age peoples still buried all members of their community together, in large mass graves that were often demarcated by megaliths or tucked away into natural landscape features like caves. Bioarchaeological analysis indicates that such inhumations provide an honest cross-section of society, with everyone from fragile toddlers to hale and hearty septuagenarians folded into the sedimentary embrace of the sun-baked soil.
A few hundred years later, during the subsequent Bronze Age, this approach would change drastically. Mortuary rituals began to underscore the individual and their accomplishments, rather than the collectivity of the community from whence they came. Instead of interment in a cave filled to the brim with ancestors, people were now buried beneath their own houses, with their own gender-specific sets of grave goods. Women were laid to rest accompanied by elaborate hair combs and metal jewelry, while men were interred with swords and other weapons. By unpacking the nature and number of individuals who comprised these burial populations, bioarchaeologists have illuminated the gradual shift in social focus from a long-standing, egalitarian community of which everyone was a part, to a more particular emphasis on the nuclear family and individual accomplishments—a way of life much more reminiscent of our own modern lives.
Crouched in the hot white Iberian sunlight on the steep slope of that hillside excavation, I had little understanding of the wealth of information human skeletal remains were able to provide about the past. However, that initial spark of communion, and sense of deeper connection to the people who had inhabited this rugged landscape five thousand years before my arrival, was an experience that fundamentally altered my approach to the past. Perhaps, as Robert Frost so eloquently contended, “there’s something the dead are keeping back.” Fortunately, through the diligent translation of the information inscribed in human skeletons into concrete knowledge about life in the ancient world, bioarchaeology provides one means of gently encouraging the dead to relinquish some of the secrets they are so tightly holding onto.
Image Credits: Photos from Bolores, 2010 and 2012 field seasons.