Introduction

Carnivory in early Homo is a contentious academic issue. For decades, archaeologists engaged in the hunting-scavenging debate have produced heuristically heterogeneous interpretations of the same taphonomic evidence from the early Pleistocene archaeofaunal record (see summaries of the predominant positions in1,2,3,4,5,6,7). Based on uncontroversial evidence, early hominin meat acquisition and consumption is coeval with the onset of encephalization in human evolution, and its adoption triggered dietary changes that selected for modified postcranial anatomies adapted to lifestyles different from those of earlier Pliocene hominins8,9,10,11,12,13,14,15. The emergence of hominin meat-eating is also coeval with the earliest stone tool use16,17. This has been used to argue, controversially, that such a dietary change played a crucial role in early hominin cognitive evolution8,12,18. Behaviorally, meat acquisition strategies have also been controversial because they shape our perception of how complex human behaviors emerged1,19,20,21.

Some researchers have argued that early Pleistocene hominins were mostly kleptoparasitic when they entered the carnivore guild, both in Africa22,23,24,25,26 and in their initial colonization of southern Europe27,28,29,30. The difference between the hypotheses posited for the two regions is that in Africa, hominins were argued to have passively accessed mostly defleshed carcasses abandoned by modern and extinct felids22,23,31, whereas in the Iberian Peninsula, hominins were argued to have had access to large amounts of flesh abandoned mainly by sabertooth felids (namely, Megantereon and Homotherium), purportedly unable to deflesh their prey efficiently27,28,29,30. In contrast, it has also been argued that repeated access to bulk flesh in Africa could only have been feasible through confrontational scavenging32,33, a combination of scavenging of large animals and hunting of smaller game4, or a combination of both strategies with a predominance of hunting1,2,34,35.

Experimentally replicated models of bone surface modifications (BSM) on early Pleistocene archaeofaunas, when BSM are considered simultaneously in a multivariate format, statistically suggest that anthropogenic sites from this period are most similar to experiments replicating primary access to fleshed carcasses36,37,38,39,40. Among BSM, cut marks imparted on fossil bones are the most important traces of hominin carcass butchery. Although some gross anatomical patterning in cut mark distribution was previously shown35,41, cut mark analyses have remained under scrutiny because they have been argued to be subject to equifinal scenarios and to stochasticity22,42,43,44. Equifinality of cut mark patterns has been shown to be an artifact of method5,45. Regarding the widely held assumption that cut marks are random accidents on bones, it has also been shown that butchery creates channeling that strongly constrains where cut marks appear clustered and scattered, whether carcasses are complete or incomplete46. Therefore, cut mark anatomical distribution largely reflects butchery behaviors, and these depend on whether carcasses were acquired fully fleshed or partially or substantially consumed by other carnivore commensals. This makes it possible to approach hominin strategies of carcass acquisition through how complete carcasses were upon their exploitation. Here, kleptoparasitism is understood as a strategy in which the bulk of food is obtained by stealing it from other predators47,48. This hypothesis is tested in the present study for early Pleistocene hominins.

Objective analytical procedures to understand BSM anatomical patterning, based on the 3D distribution of marks on bones, have recently been developed46. This makes classification of any given archaeofaunal assemblage within a set of referential frameworks an objective process. Here, we expand this methodology in conjunction with Deep Learning (DL) analytical methods and state-of-the-art time series classification algorithms to determine whether early Homo had primary access to bulk flesh or secondary access to carcasses initially consumed by other predators. The former would probably have triggered important behavioral novelties in human evolution, such as intentional food-sharing and eusociality. The latter (especially if carcasses were mostly defleshed) would most likely not have required such behavioral and socio-reproductive modifications relative to the socio-reproductive strategies of humans' closest primate relatives.

In order to study early Pleistocene hominin carcass acquisition strategies, we analyze cut mark anatomical distribution patterns at the anthropogenic site of FLK Zinj and at two of the most recently discovered and best preserved early Pleistocene anthropogenic sites in Africa: DS and PTK (Bed I, Olduvai Gorge, Tanzania) (Fig. 1), dated to 1.84 Ma (see description in Methods). We compare these assemblages to wide-variance experimental sets reproducing primary and secondary access to carcasses in various stages of defleshing (including carcasses obtained at lion kills). Carcasses in both scenarios were butchered with the aid of stone tools. Deer and sheep carcasses were used for the butchery experiments; warthog, zebra and wildebeest carcasses were obtained from lion kills and were also processed with stone tools. The analogs comprise two augmented datasets: one using within-sample combinations (the combined baseline sample) and another creating new cases from a latent space using Generative Adversarial Networks (GAN). The goal of these two data sets was to increase within-sample variance so that opposing scenarios could be more reliably identified when contrasted with archaeological samples. Two types of approaches were adopted: one targeted cut mark intensity (frequency of cut marks and their anatomical distribution, using raw data), and the other focused on anatomical patterning (using only the relative anatomical distribution of cut marks). These datasets were analyzed using longitudinal time-series classification machine learning algorithms. Six different algorithms were used in order to assess whether their results converged. Additionally, ensemble learning was used for classification, to reinforce the classification obtained by the time-series algorithms.
The combination of the quantitative methods used in both cases provides unambiguous evidence of the carnivorous behavior of our ancestors (for a detailed description of the methods, see the Supplementary Information).

Figure 1

(Upper half) Location of Olduvai Gorge in Tanzania and images of the PTK (left), DS (center) and FLK Zinj (right) sites during excavation. (Lower half) Several examples of cut-marked humerus and radius shafts from DS. (a) Cranio-lateral side of a radius midshaft fragment, (b) caudal side of a radius midshaft fragment, (c) medial side of a humerus midshaft fragment, (d) lateral side of a radius proximal shaft fragment, (e) medial side of a humerus midshaft fragment. The map in A was made using Google Earth Pro 7.3.3 (https://www.google.com/earth/versions/).

Results

The combined baseline sample

Time Series algorithms and ensemble learning

Four of the six algorithms applied to the raw data yielded perfect (100%) classifications of the testing sets of the primary and secondary access experimental subsamples (Table 1), with the others following closely (> 80%). Three algorithms applied to the relative experimental data also produced perfect (100%) classification of the testing set. In both cases, the feature extraction classifiers excelled at classifying the testing sets, especially the WEASEL algorithms (Table 1). Markov Transition Fields also achieved complete classification (using a Random Forest classifier). The high accuracy of most algorithms, and the convergence of several of them in correctly classifying the entire testing set, produced a reliable multi-testing framework within which the subsequent classification of the Olduvai sites was warranted.
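The Markov Transition Field encoding mentioned above can be illustrated with a minimal numpy sketch (function and parameter names are hypothetical; the study presumably relied on a library implementation, and the Random Forest applied to the resulting images is omitted here):

```python
import numpy as np

def markov_transition_field(series, n_bins=8):
    """Encode a 1-D series as a Markov Transition Field image.

    Each point is assigned a quantile bin; a first-order Markov
    transition matrix W is estimated from consecutive bins, and the
    MTF image is M[i, j] = W[bin(x_i), bin(x_j)].
    """
    x = np.asarray(series, dtype=float)
    # Quantile-based discretization into n_bins states
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    states = np.digitize(x, edges)
    # Estimate the transition matrix from consecutive states
    W = np.zeros((n_bins, n_bins))
    for s, t in zip(states[:-1], states[1:]):
        W[s, t] += 1
    row_sums = W.sum(axis=1, keepdims=True)
    W = np.divide(W, row_sums, out=np.zeros_like(W), where=row_sums > 0)
    # Spread transition probabilities over all pairs of time points
    return W[np.ix_(states, states)]

# Hypothetical 50-bin cut-mark series stand-in
mtf = markov_transition_field(np.sin(np.linspace(0, 6, 50)), n_bins=4)
print(mtf.shape)  # (50, 50)
```

The resulting image is what an image classifier (here, a Random Forest over flattened pixels) would consume.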

Table 1 Time Series algorithms used on the combined experimental data set and on the archaeological samples and their results on the raw data set and on the relative data set.

Once the raw data set was tested, the algorithms were applied to the relative data set, which resulted from the percentage transformation of the raw data set. This transformation sought to analyze anatomical patterning irrespective of the number of cut marks documented. The total number of marks in each assemblage was rescaled so that each bin in the series contained the percentage of marks relative to the complete assemblage. The raw data set represents the patterning-and-intensity (i.e., frequency of clustered marks) analog, and the relative data set represents the anatomical-patterning analog alone. Thus, assemblages with widely different numbers of cut marks but similar anatomical locations could be identified. Intensity depends on the interplay of so many variables (with a high degree of randomization) that it can be misleading. Here, we emphasize the relative anatomical distribution of cut marks, because it is more directly indicative of butchery behaviors and of the type of access to carcasses. Using this relative data set, the Time Series algorithms also succeeded in differentiating bone assemblages modeling primary and secondary access to carcasses. When the archaeological assemblages are compared to the experimental analogs, most algorithms classify FLK Zinj, DS and PTK as “primary access”, both in cut mark intensity (raw data) and in anatomical patterning (relative data). The Markov Transition Field algorithm classified them as “primary” with high probability: 0.91–0.91–0.92, respectively, on the raw data, and 0.91–0.93–0.87 using the relative data set.
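The percentage transformation described above can be sketched as follows (a minimal illustration using hypothetical toy counts, not the study's actual data):

```python
import numpy as np

def to_relative(raw_series):
    """Convert raw cut-mark counts per anatomical bin into percentages.

    Assemblages with very different total mark counts but similar
    anatomical placement then yield near-identical profiles.
    """
    raw = np.asarray(raw_series, dtype=float)
    total = raw.sum()
    return raw * 100.0 / total if total > 0 else raw

# Two hypothetical assemblages: same patterning, different intensity
a = to_relative([10, 30, 40, 20])
b = to_relative([1, 3, 4, 2])
print(np.allclose(a, b))  # True
```

The raw series feed the intensity analyses; the percentage series feed the patterning analyses.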

This clearly divergent classification of the two experimental assemblages is also reflected in distance cladograms of each sample per group (Fig. 2). Both the complete carcass butchery and the Tarangire lion butchery sets appear clearly separated in most of the individual samples, which shows that both the anatomical patterning and the frequencies of cut marks differ between the two experimental scenarios. Channeling (i.e., the constricted occurrence of marks) in certain loci, due to butchery ergonomics and to muscle insertions shared by fleshed and partially or totally defleshed carcasses46, results in a portion of marks clustering in similar areas; however, bulk defleshing additionally marks areas that bear no flesh scraps when carcasses are obtained opportunistically from felid kills. Bulk defleshing also generates more cut marks, as a result of anatomically more extensive meat removal. This creates a clear diagnosis of primary and secondary access scenarios against which the archaeological record can be tested.

Figure 2

Circular dendrograms showing the association of samples in each experimental data set (using the baseline sample), based on similarity/dissimilarity distances (average method). (A) Cluster based on the raw data. (B) Cluster based on the relative data set. Each color shows a different group of samples selected by the algorithm (25 groups). Notice, when using raw data (A), how all the lion samples cluster within the same group, whereas the human butchery sample displays a far wider variety and comprises most of the remaining cluster groups. This indicates much more randomness in cut mark intensity when humans butcher complete carcasses, and far less when they process lion-consumed carcasses, because the limited available resources restrict the anatomical portions where tools are used. When using relative data (B), the pattern is reversed. Notice how the human samples, which were far more variable in number of cut marks (intensity), show much more homogeneous patterning in their anatomical distribution. In this case, the lion samples are much more variable, because they reflect higher anatomical variability in the surviving scraps of flesh after lion consumption. Anatomical patterning, thus, is an information-rich way of understanding how much meat hominins extracted from carcasses.
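The similarity/dissimilarity distances underlying such dendrograms can be sketched as a pairwise Euclidean distance matrix over relative cut-mark profiles (the profiles below are hypothetical toy values; the published dendrograms then apply average-linkage hierarchical clustering to such a matrix, e.g. via `scipy.cluster.hierarchy.linkage(..., method='average')`):

```python
import numpy as np

def pairwise_distances(profiles):
    """Euclidean distance matrix between cut-mark profiles (rows)."""
    X = np.asarray(profiles, dtype=float)
    diff = X[:, None, :] - X[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

profiles = np.array([
    [50.0, 30.0, 20.0],  # fully fleshed carcass butchery (hypothetical)
    [48.0, 31.0, 21.0],  # another fleshed-carcass sample (hypothetical)
    [10.0, 20.0, 70.0],  # lion-consumed carcass (hypothetical)
])
D = pairwise_distances(profiles)
print(D.shape)  # (3, 3)
```

With these toy profiles, the two fleshed-carcass samples sit much closer to each other than to the lion-consumed sample, which is the separation the dendrograms visualize.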

Ensemble learning (EL) applied to the relative data sets yielded similar results. A stacked model was used, based on a single layer of DL base learners (neural networks with multiple hidden layers) and a DL meta-learner. This resulted in perfect classification (100%) of the testing set. Four base learners reached a classification accuracy of 100%, and one of 97%, on the experimental samples (F1-score = 1.00). DS, FLK Zinj and PTK were classified as “primary access”. Additionally, a majority-vote ensemble model was used with the same base-learner composition (testing set classified with 100% accuracy, F1-score = 1.00), and DS, FLK Zinj and PTK were classified as “primary” with probabilities of 99%, 94% and 97.9%, respectively.
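The majority-vote step can be sketched as follows (a minimal illustration with hypothetical base-learner outputs; the study's base learners were deep neural networks, omitted here):

```python
import numpy as np

def majority_vote(base_predictions):
    """Hard majority vote over base-learner class predictions.

    base_predictions: (n_learners, n_samples) array of class labels.
    Returns the winning label per sample plus the vote share, reported
    here as the ensemble's 'probability' for that class.
    """
    preds = np.asarray(base_predictions)
    n_learners, n_samples = preds.shape
    labels, shares = [], []
    for j in range(n_samples):
        vals, counts = np.unique(preds[:, j], return_counts=True)
        k = counts.argmax()
        labels.append(vals[k])
        shares.append(counts[k] / n_learners)
    return np.array(labels), np.array(shares)

# Five hypothetical base learners; 0 = secondary access, 1 = primary
preds = np.array([
    [1, 1, 0],
    [1, 0, 0],
    [1, 1, 0],
    [1, 1, 1],
    [0, 1, 0],
])
labels, share = majority_vote(preds)
print(labels)  # [1 1 0]
```

In a stacked model, by contrast, the base-learner outputs would instead be fed as features to a trained meta-learner rather than counted as votes.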

The GAN-augmented sample

The data augmentation carried out with GAN resulted in 150 new cases (i.e., artificial carcasses) for each experimental group (primary and secondary access). Time Series algorithms classified DS and PTK mostly as “primary”, but with a decrease in the accuracy on the testing sets and in the classification probabilities of the archaeological sets, which indicated wider diversity and variance within the augmented data compared to the baseline data set (Table 2). Despite this artificial increase in within-sample variability, the algorithms reinforced the classification of FLK Zinj, DS and PTK as similar to the experimental data set replicating butchery of complete carcasses. Some algorithms showed that one of the sites (PTK) had similarities with the “secondary” data set when considering the raw data. In contrast, when using anatomical patterning, most algorithms coincide in classifying PTK as “primary”. This stresses the importance of patterning (i.e., where marks occur) over intensity (i.e., how many marks are documented) in determining butchery behaviors, given the higher stochasticity of the latter.
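The adversarial augmentation idea can be illustrated with a deliberately minimal toy GAN, here reduced to a linear generator and discriminator over a single feature (everything below — the data distribution, learning rates, step counts — is a hypothetical sketch; the study presumably used multilayer networks over full cut-mark series):

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda t: 1.0 / (1.0 + np.exp(-t))

# Hypothetical 1-D stand-in for one cut-mark feature:
# "real" samples drawn from the experimental distribution.
real_data = rng.normal(4.0, 1.25, size=2000)

# Generator G(z) = a*z + b, discriminator D(x) = sigmoid(w*x + c)
a, b = 1.0, 0.0          # generator parameters
w, c = 0.1, 0.0          # discriminator parameters
lr, batch = 0.01, 64

for step in range(2000):
    # --- discriminator update: push D(real) up and D(fake) down ---
    xr = rng.choice(real_data, batch)
    z = rng.normal(size=batch)
    xf = a * z + b
    dr, df = sigmoid(w * xr + c), sigmoid(w * xf + c)
    w += lr * np.mean((1 - dr) * xr - df * xf)
    c += lr * np.mean((1 - dr) - df)
    # --- generator update: push D(fake) up ---
    z = rng.normal(size=batch)
    xf = a * z + b
    df = sigmoid(w * xf + c)
    a += lr * np.mean((1 - df) * w * z)
    b += lr * np.mean((1 - df) * w)

# Draw synthetic cases from the trained generator's latent space
augmented = a * rng.normal(size=150) + b
print(augmented.shape)  # (150,)
```

The same principle, scaled up, yields the 150 artificial carcasses per group: the generator learns to map latent noise onto cases the discriminator cannot tell apart from the experimental ones.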

Table 2 Time Series algorithms used on the GAN-augmented experimental data set and on the archaeological samples and their results on the raw data set and on the relative data set.

When applied to the GAN-augmented relative data set, the ensemble learning models yielded results similar to those for the combined baseline sample (see above). The stacked DL model yielded 71% accuracy (F1 = 0.70) on the testing set. This was substantially lower than for the original combined carcass sample, which was the basis for the GAN data augmentation, suggesting that the augmentation had increased variance. Despite this, both PTK and DS were classified as “primary” assemblages. The majority-voting ensemble model, in contrast, yielded complete accuracy in the classification of the testing set (100% accuracy, F1-score = 1.00). This EL model classified DS (probability = 0.93), FLK Zinj (probability = 0.79) and PTK (probability = 0.668) as “primary” assemblages.

Despite the occasional discrepancy of some algorithms in the classification of PTK, probably affected by preservation-related issues (bias in long bone portion representation), it must be stressed that all the tests and analyses applied underscore that the three sites, and especially DS and FLK Zinj, can only be confidently classified within the “primary” access experimental data set. This indicates butchery of complete or fairly complete carcasses. Access to complete carcasses in African savannas, if done regularly, can only be achieved through dynamic (i.e., predatory) strategies, since most carcasses of animals that die naturally are completely defleshed within a matter of hours by the intervention of scavengers31,49,50. This underscores the carnivorous role of hominins in African early Pleistocene savanna biomes and falsifies the hypothesis of regular kleptoparasitism among early Pleistocene hominins.

Discussion

The results of the present study show that by the early Pleistocene, hominins were already integrated into the carnivore guild. Their regular access to fleshed carcasses invalidates hypotheses positing a kleptoparasitic role for these ancestors. Like any other predator, hominins would have exploited opportunities presented by carcasses found at other carnivores' kills51; however, we argue that such strategies constituted a minor element in their carcass-acquisition behaviors. Carcasses obtained from felid kills would have registered the typical taphonomic signatures (i.e., bone modifications) resulting from prior felid carcass consumption. In the past several years, with an already extensive archaeological record, such traces have been actively sought, and only one unambiguous instance of felid-hominin interaction has been found, precisely at the DS site (under review). Likewise, the first application of DL computer vision methods to the determination of agency in carnivore bone modification in a fossil assemblage showed that, the aforementioned exception aside, all tooth pits from the DS archaeofaunal assemblage were caused by hyenas (under review). This eliminates the possibility of felids being the primary providers of carcasses for hominins and reinforces the results of the present study showing that the butchery pattern documented is typical of the processing of complete (i.e., fully fleshed) carcasses. Carnivore intervention seems mostly restricted to post-depositional damage by durophagous fissipeds. This means that both small and medium-sized animals, weighing up to 350 kg, at the two sites analyzed provided large surpluses of flesh for hominin groups.

These results underscore the role of meat in the early Pleistocene hominin diet and in their socio-reproductive behavior. Primary access to the animals present at these sites (also attested by eventration cut marks on rib fragments) would have required the collective participation of at least several individuals, under a behavioral framework that implied intentional cooperation and the expectation of resource sharing. Meat-eating would have increased diet quality, thus releasing Homo dentition from the selective pressure of coping with the high strains caused by wide-range vegetarian omnivory. These changes in dentition could also be explained as selection for their use in the preprocessing of meat52,53. The reduction in tooth size documented for the first time in this phase of human evolution seems to coincide with an increase in the neurocranium, perhaps as coevolving factors. This supports the impact of this dietary change on the allocation of basal metabolic energy and the evolution of the human brain10,13,18,54. In addition, it is precisely at this time that we also have evidence of a major skeletal modification in some early Homo, with the emergence of H. erectus55,56. The modern human bauplan (long body, shortened forelimb and expanded hindlimb, with a barrel-shaped thorax) also emerges at about this time. This probably reflects the anatomical impact of the dietary change towards carnivory8. If hominins were predators, this would have required substantial behavioral changes resulting in major anatomical modifications compared to the previous Pliocene and pene-contemporaneous early Pleistocene hominin stocks.

The presence of a nuchal ligament in H. erectus is suggestive of stabilization of the head with respect to the trunk, probably to counter the shock-wave effect of the heel strike during running57,58. The fact that other cursorial mammals have nuchal ligaments suggests that H. erectus was also a runner. The co-occurrence of a nuchal ligament with the earliest evidence of long legs in a larger body size (which, although also selected for by walking, are essential for running) supports the interpretation that this hominin taxon engaged in endurance running. Additional evidence for the stability needed during running comes from the long Achilles tendon, the podal plantar arch, the short forefoot and, especially, the enlarged semicircular canals of the ear57,58. An expanded gluteus maximus, which stabilizes the trunk, as inferred from the larger sacroiliac trough in H. erectus, would also have been essential for proper biomechanical adaptation to running57,58. Bramble and Lieberman (2004) showed that the decoupling of the head and shoulder muscles would also have enabled efficient swinging of the arms while keeping the head fixed and stable during running. Chimpanzees have their heads and shoulders connected by three muscles, which have been modified in modern humans; only one of these muscles (the trapezius) still connects the shoulder and head in humans. Two independent tubular structures (a long neck and a long waist) also enhanced trunk rotation during arm swinging, countering the angular momentum caused by the spinal rotation triggered by leg swinging. The first hominin in which all these features, clearly fundamental for running, are documented is H. erectus. If running was positively selected for the acquisition of animal food, then meat-eating and the evolution of human anatomy were interdependent, co-evolving factors.

The predatory role of these hominins could also have left an anatomical imprint on the evolution of their arms. Humans are the primates best equipped to store and release at the shoulder the energy needed for efficient throwing; no other primate can throw with as much control or with as much speed, strength and accuracy59. Although the capability of H. erectus to use their arms efficiently during throwing was questioned on the basis of their inferred narrow shoulders, low humeral torsion and anterior position of the scapula60,61, it has been shown that the H. erectus claviculo-humeral ratio falls within the range of variation of modern humans59. In addition, no evidence exists in adult H. erectus specimens that the low humeral torsion documented in the subadult KNM-WT 15000 individual is maintained ontogenetically. Roach and Richmond (2015) have also demonstrated that there is no relationship between clavicle length (i.e., shoulder width) and throwing performance. Biomechanical studies show that fast and strong throwing in modern humans is enabled by three factors: a tall and mobile waist, humeral torsion and a laterally oriented gleno-humeral joint62. H. erectus is the first hominin in which these three factors can be anatomically documented. However, the humeral torsion of this taxon is lower than in modern humans, and it is not known how this could have affected this hominin's throwing skills. If the anatomical indications are correct in pointing to biomechanical throwing efficiency in H. erectus, the ensuing inference is that such a skill was probably selected for hunting. The evolutionary co-occurrence of these anatomical shifts with the taphonomic evidence, in the archaeological record associated with this taxon, that hominins were consuming meat provides compelling support for the interpretation that behavior and anatomy co-evolved through positive selection of predation and meat consumption.

This dependence on meat could also have triggered important changes in early hominin physiology through adaptation to the regular consumption of animal protein and fat. There is evidence that modern human physiology, which makes our species highly dependent on a regular intake of cobalamin, may also have its origins in the early Pleistocene63. Choline, an essential nutrient that plays a crucial role in gene expression (as a methyl-group donor, via S-adenosylmethionine) and in brain and liver function, is also most abundant in meat and animal products, with very few plants containing any substantial amount of it64,65. Humans are also more dependent on this essential nutrient than other primates, and failure to meet minimum doses leads to serious pathological conditions. Genomic analysis of Taenia (tapeworm) indicates that human-host infection could have started by 1.7 Ma66,67, further suggesting that by that time hominins were facultative carnivores. Pathogens (viruses, bacteria, prions) associated with meat consumption limit other primates' carnivory, suggesting that humans evolved meat-specific genes allowing a more effective buffer against pathogens and meat-related pathologies (e.g., hypercholesterolemia, vascular diseases)68,69, as well as hosting a different microbiome more apt for digesting animal fat and protein70,71,72.

The adoption of carnivory by early Homo is also relevant from an ecological perspective. Most modern African carnivores can be described as no more than occasional kleptoparasites, with the exception of vultures and brown hyenas; the latter survive in moderately to weakly competitive ecosystems and complement their diet with the hunting of small animals and wide-range foraging for insects and plants. Even the more kleptoparasitic organisms do not base the bulk of their food acquisition on this strategy, since it is risky and frequently incurs additional costs compared to self-foraging73. Evolutionary Stable Strategy modeling shows that kleptoparasitism is an inefficient strategy unless used as a complement74. Opportunistic strategies are ecologically very informative, because they reflect intensity in resource competition, high competitor biomass and limited herbivore biomass75. The higher the degree of trophic competition, the more frequent kleptoparasitic behaviors are; this impacts predator group size and, indirectly, prey size75,76. Predators affected by opportunistic competitors may buffer this behavior by expanding group size, switching to larger prey and being more successful at defending kills75. This shows that kleptoparasitic behaviors are successful only in the short term and cannot be a stable way of acquiring resources74, especially in African savannas, where carnivore diversity and a high degree of dynamic trophic competition render them adaptive for only a small number of specialists, such as vultures or brown hyenas. Hominins would have lacked the ability to monitor savanna habitats for miles from the air, and their locomotor adaptation (prioritizing endurance over speed) would have prevented them from being efficient scavengers like brown hyenas, which may cover more than 30–40 km in each foraging bout77. Thus, the carnivoran impact of hominins on savanna ecosystems must be interpreted on different grounds.

Given that all the taphonomic evidence from early Pleistocene anthropogenic sites suggests that hominins may have successfully hunted small and medium-sized animals, their ecological impact affected first and foremost the predatory guild. Analytical studies of carnivore functional richness and evenness across the Pliocene and Pleistocene in East Africa show that there has been a loss of functional richness of > 99% from the Pliocene until today78. Climatic and environmental information does not correlate with carnivore extinctions, especially after 2 Ma78. A thorough analysis of the past four million years shows, in contrast, a strong correlation between decreasing carnivore richness and hominin brain expansion, suggesting that an increase in cognitive skills may have enabled hominins to take over ecological niches occupied by other carnivore taxa during the Quaternary79. This supports the idea that “anthropogenic influence on biodiversity started millions of years earlier than currently assumed”79. If meat-eating allowed for a high-quality diet that boosted hominin demography, expanding hominin densities across landscapes could have pressured several competing carnivores and placed them at a selective disadvantage. Such a demographic increase is also supported by the much larger size of Acheulian sites after 1.7 Ma and the conspicuous evidence of megafaunal exploitation (not necessarily through hunting) by hominins after this date, probably suggesting bigger group sizes36,38,39, given the ecological correlation between exploited carcass size and the number of carnivore commensals80,81,82,83,84. Body-size ecology (including total carnivore mass and pack size) determines the targeting of specific prey sizes in mammalian predators80,81,82,83,84.
The increase in hominin body size and anatomical robustness across the early and middle Pleistocene indicates selection for physical strength, probably reflecting either predatory strategies that required force or an exaptation in this direction.

Although it has recently been argued that no evidence exists for an anthropogenic impact on African Pleistocene mammal faunas85, the arguments against such an impact rest on assumptions as anecdotal as some of the arguments in its favor. The focus on megaherbivores and their higher representation in the past, for example, introduces bias and needs to be properly justified. Megaherbivores' ecological niches are preferentially situated in certain alluvial habitats, which favor their preservation because of the fast sedimentation processes operating in such environments. Most Plio-Pleistocene paleontological and archaeological localities represent portions of alluvial habitats. The greater representation of megaherbivores in the past has certainly been affected by this and may well be just a taphonomic artefact distorting our perception of megafaunal paleoecological diversity and biomass. Climate-based interpretations overlook the bias introduced by taphonomy, as do arguments using taxon richness as evidence of ecological transitions. Faith et al.85 correctly argue that carnivoran richness and extinction rates may alternatively reflect sampling intensity (i.e., the number of sites per period). They also counter-argue that grassland expansion could be responsible for such extinctions. However, it has been shown that sampling biases (i.e., the number of sites and fossiliferous localities per period and collection intensity) can account for specific taxic diversity where climatic forcing cannot, especially for the 1.9 Ma period86. No climatic explanation can currently account for mammal evolution without taphonomic calibration86; without it, true climatic evolutionary signals cannot be supported.
In their own explanation, Faith et al.85 detect a decreasing “trend” in carnivore taxonomic diversity after 1.9 Ma (and especially after 1.5 Ma), which they interpret as ecological, but which is also related to decreasing sampling intensity, given the smaller number of localities and their smaller areal extents at the end of the early Pleistocene in Africa.

Faith et al.85 also reject, for the sake of their argument, the possibility that hominins were targeting meat consumption in the Pliocene and early Pleistocene, claiming that the early archaeological record merely documents bone marrow exploitation and hominin kleptoparasitic behaviors, against all taphonomic evidence currently available1,2. These authors additionally assume (without support) that small-carnivore extinctions should precede large-carnivore extinctions if hominins were a competitive factor. This assumption is not ecologically justified and goes against the zooarchaeological evidence showing that, starting at 2.6 Ma, hominins preferentially targeted medium-sized taxa, which affects large carnivores far more than small ones. An argument in favor of hominin impact can be found in Faith et al.'s own data, which clearly show an abrupt decline of large carnivore taxa at 1.8 Ma, precisely the age at which our analysis documents strong carnivoran adaptations by hominins consuming prey typical of larger carnivores. In addition, it is the megapredators that show a higher degree of specialization on meat, whereas mesopredators tend to be more generalist87, rendering the former more susceptible to competition.

By focusing on all taxa, including viverrids and mustelids, Faith et al. also miss the point that it is the large carnivore trends that are most important for assessing the impact of hominins on the predatory guild. Additionally, Faith et al. put some faith in an intrinsic association of large carnivores and megaherbivores, when there is a strong record of similarly sized carnivores being preferentially adapted to the consumption of medium-sized (100–400 kg) prey, and even smaller animals88. No evidence exists that could support a taxon-wide predator–prey dependence between Homotherium or Pachycrocuta and megaherbivores. In most of the geographic record of Homotherium, for example, the taphonomic evidence of its involvement with megaherbivores is rather limited. It should be emphasized that in Africa, these large carnivores disappeared well before their potential megafaunal prey did.

FLK Zinj, DS and PTK are probably the best preserved of the early Pleistocene anthropogenic sites in Africa spanning the first million years of the archaeological record. They occur as stratigraphically discrete horizontal concentrations of hominin-processed faunal remains. The Kanjera site (Kenya) has also been used as an example of possibly predatory carnivory by early hominins4; however, the nature of this deposit presents greater potential time-averaging problems that may have impacted its integrity and resolution. Its structure as a thicker deposit indicates much more time for a potential diversity of biotic and abiotic agents to have intervened and created a palimpsest, from which disentangling the hominin contribution is challenging. In fact, the reported frequency of anthropogenic bone surface modifications (BSM) is marginal and substantially lower than those reported in experiments and in archaeological assemblages where the accumulating agency is solely or mostly hominin. It would be interesting to apply the same methods described here to this assemblage.

The virtually identical patterning in the relative distribution of cut mark clustering on long bones at the three archaeological sites analyzed here (Fig. 3) indicates that: (a) such patterning has a behavioral, non-stochastic nature, and (b) it matches more closely the experiments reproducing primary butchery of complete carcasses. The main differences seem to be related to less secondary limb dismembering occurring in the archaeological assemblages. This may have interesting implications for the interpretation of the social behavior of carcass consumption by early hominins, which also seems to have been spatially more restricted than that observed among modern human foragers. The relevance of the patterning discovered at the three sites cannot be overemphasized. These are the only taphonomically supported, fully anthropogenic sites in the African early Pleistocene prior to 1.5 Ma1,2. This means that if further behavioral variation existed, it must be uncovered at new sites. It also means that the pattern unveiled reflects a socio-structural behavior in the adaptation of these early humans. The evidence of primary access and bulk defleshing by hominins at these three sites precedes similar evidence documented at some sites after 1.5 Ma2. Archaeofaunal assemblages from Olduvai (BK) and Peninj (ST4) (Tanzania), and Swartkrans (Member 3) (South Africa) have yielded abundant taphonomic data indicating that hominins regularly enjoyed early access to prime portions of ungulate carcasses90,91,92.

Figure 3

Averaged relative distribution of cut marks in the experimental assemblages (baseline combined sample) replicating primary and secondary access to carcasses (upper) and in the FLK Zinj, DS and PTK archaeological data sets (lower). Colored vertical bands show bone portions where cut marks are non-existent in the “secondary access” experimental data set, and which would therefore indicate access to fully fleshed carcasses. Notice how cut marks on the DS and PTK stylopodial fragments cluster preferentially in these bone portions, suggesting bulk defleshing. Notice also that cut mark patterning in zeugopods is more ambiguous than in stylopods. Three areas of cut mark clustering are documented in the archaeological samples conforming to the primary access experiments (A–C): (A) indicates patterning in cut mark clustering on the proximal half of humeri and distal femora; (B) indicates patterning on the proximal half of femora; (C) shows that the cut mark pattern documented in the three archaeological samples is virtually identical to the primary access experiments, with clustering on the proximal epiphysis and metadiaphysis and almost no cut marks on the mid-shaft. The position of the tibia differs from that of the other long bones because it was programmed (scanned) in Ikhnos in the opposite orientation. The X axis shows the longitudinal dimensions of the complete series of the four long bones placed sequentially.

The data analyzed and discussed here show that if a kleptoparasitic phase existed in human evolution, it must be sought prior to 2 Ma, or even earlier in the Pliocene89. Until more evidence is gathered, we will not know whether hominin carnivory was a cladogenetic or an anagenetic event2. Nevertheless, its impact on the evolution of human socio-reproductive behaviors, physiology and anatomy is undeniable.