Introduction

There is broad agreement that archaeology can help build socio-ecological resilience through studies of very long-term human-environment interactions (Hudson et al., 2012; Boivin and Crowther, 2021). However, site or regional case studies of such interactions often remain inconclusive because of the fragmentary nature of the archaeological record and the use of different scales of analysis (Degroot et al., 2021). Even in contemporary contexts, there is frequent disagreement over the role of environmental versus socio-political factors (cf. Kelley et al., 2015; de Châtel, 2014), and archaeology’s ability to look at long-term sequences can lead to controversy over how to frame crises (McAnany and Yoffee, 2010). The concept of the Anthropocene has shifted archaeological research towards broader, more ‘global’ frames for anthropogenic impacts. Although many scholars date the onset of the Anthropocene to the Industrial Revolution or the post-1945 ‘Great Acceleration’, there is growing interest in understanding earlier human impacts on the earth system. Research on what is sometimes termed the ‘Palaeoanthropocene’ has investigated the role of fire, agriculture, trade, urbanisation and other human impacts on pre-industrial ecosystems (Smith and Zeder, 2013; Foley et al., 2013; Ellis et al., 2013; Boivin et al., 2016). While there is increasing consensus that such impacts were more important than previously realised, geographical variation over the Palaeoanthropocene remains poorly understood. Given that many technological advances—including pottery, agriculture and metalworking—seem to have originated in a few locations and diffused outwards (Bellwood, 2005; Jordan and Zvelebil, 2009; Shennan, 2018; Roberts et al., 2009), we might expect temporal and geographical lags in resulting anthropogenic landscape changes. Previous research has also suggested that population size and socio-political complexity affect environmental impacts (Mather et al., 1998; Mather and Needle, 2000; Atkinson et al., 2016; Sheehan et al., 2018) and it can be proposed that these factors might further generate geographical variation in the Palaeoanthropocene.

Here, we investigate geographic differences in the Eurasian Palaeoanthropocene using the Japanese Islands as a case study. We analyse claims that pre-industrial anthropogenic impacts in Japan were significantly reduced by the late arrival of agriculture, the importance of wet-rice farming, a reliance on seafood as a primary source of dietary protein, and cultural ideologies of environmental stewardship. The Japanese literature analysed here often builds its arguments around the proposed ‘uniqueness’ of Japan as compared especially to Europe and for that reason we include comparative context with other Eurasian societies. Future research needs to develop these comparisons in more detail and some suggestions in this respect are made in the Conclusions. We follow the standard periodisation of Japanese history in defining the premodern or pre-industrial period as that prior to the Meiji Restoration of 1868.

Claims for Japanese environmental exceptionalism

While there is currently no synthetic analysis of anthropogenic impacts in premodern Japan, numerous studies in both environmental history and archaeology have concluded that the prehistoric archipelago formed a highly engineered environment with humans as a ‘keystone species’ (Totman, 2004, 2014; Crawford, 1997; Bleed and Matsui, 2010; Aikens and Lee, 2014). Neolithic Jōmon populations, often characterised by high levels of sedentism, had extensive impacts on forests (Noshiro et al., 2016; Kitagawa et al., 2018). Long before the introduction of cereal farming in the first millennium BC, many plants were managed and cultivated (Crawford, 1997; Nishida, 1983; Sato et al., 2003; Nakayama, 2010). Wild boar were transported to islands beyond their natural habitat from at least the Initial Jōmon (ca. 9500–5500 BC) and perhaps as early as the Late Pleistocene in the Ryukyu Islands (Price and Hongo, 2020; Kawamura et al., 2016). After 1000 BC, barley, rice, wheat, and broomcorn and foxtail millet were introduced into the Japanese archipelago (Robbeets et al., 2021). Of these cereals, wet rice, recognised as a source of methane, had the greatest environmental impact (Fuller et al., 2011). State formation, urbanisation and growing commercialisation further affected forest clearance and land-use (Kawahata et al., 2014; Sasaki and Takahara, 2011).

Given this evidence for anthropogenic impacts in pre-industrial Japan, it may seem obdurate to raise arguments in favour of low-impact exceptionalism for the archipelago. However, such arguments are widespread in Japan where they have developed considerable traction within broader environmental discourse (Prohl, 2000; Rots, 2015, 2017; Reitan, 2017; Lindström, 2017, 2019; Hudson, 2021a). Ideas about the exceptionalism of human-environment relations have a long history in Japanese thought (Thomas, 1999). The belief that, despite the social and economic ravages of modernity, Japan has somehow managed to retain an authentic ‘harmony’ with nature has been an attractive narrative, suggesting an ability to maintain a degree of cultural independence and authenticity in the face of modernisation and Westernisation. While not discussed here, the intellectual roots of such ideas are diverse and include the Buddhist-Heideggerian philosophy of the Kyoto School, Tetsurō Watsuji’s writings on climate and culture, and the atavistic ‘civilisation theory’ of philosopher Takeshi Umehara (Morris-Suzuki, 1993; Prohl, 2000; Kalland, 2002; Reitan, 2017; Hudson, 2021a).

Many of the often nationalistic assertions made in these diverse writings can be quickly dismissed, but four claims are worth considering at more length because they suggest that certain structural conditions of Japan’s history and geography might have served to reduce anthropogenic impacts. These claims are that pre-industrial anthropogenic impacts in Japan were significantly reduced by: (1) the late arrival of agriculture, (2) an emphasis on wet-rice farming limited to alluvial plains, (3) a reliance on seafood rather than domesticated animals as a primary source of dietary protein, and (4) deep cultural ideologies of environmental stewardship. These claims (referenced below) are evaluated here against available archaeological and historical records from pre-industrial Japan. In several cases, relevant empirical evidence is poor or unavailable. Data on forest area, for example, are only available from 1850, although it is possible to make rough estimates back to the seventeenth century (Saito, 2009). After the introduction of cereal agriculture in the Yayoi period (ca. 1000 BC–AD 250), the zooarchaeological record of animal use in premodern Japan—key to any evaluation of claim 3—becomes very sparse. The dramatic decline in shell middens from the Yayoi (Hudson, 2019, 2021b) means that faunal remains were less commonly preserved. At the same time, archaeologists have tended to assume a priori that the consumption of meat was limited in the early historic period. Archaeological sites where animal bones are well preserved have often led to major re-evaluations of diet and foodways (e.g., Uchiyama, 1992). In cases where data are unavailable, we attempt to extrapolate likely trends based on theoretical understandings of human-environment relations and our knowledge of relevant historical sequences from Japan as compared with the rest of Eurasia.

In presenting the four claims of Japanese environmental exceptionalism below, we often cite a number of public intellectuals, particularly Takeshi Umehara (1925–2019) and Yoshinori Yasuda (b. 1946), who have been especially influential in promoting these claims through their positions as directors of research institutes and museums, board members of important organisations (including the national broadcaster NHK, the Japan Broadcasting Corporation), their receipt of various awards for their work, and their visibility in the Japanese media. Umehara played a particularly influential role as first director-general of the International Research Center for Japanese Studies (Nichibunken), and as an advisor to Yasuhiro Nakasone, prime minister from 1982 to 1987, with whom he co-authored two books. This does not mean that the views of these figures are universally accepted in the scholarly community; it does, however, indicate their influence beyond the boundaries of academia. Umehara and Yasuda have been particularly successful in generating popular appeal for their ideas. Claims about Japanese environmental exceptionalism have even made their way into national legislation and environmental policy (see Supplementary Information). A number of critiques of Umehara and Yasuda’s ideas about Japanese nature were presented at the time of the establishment of the International Research Center for Japanese Studies (Niiro, 1986; Buruma, 1987; Reader, 1990). Japanese archaeology and history have nevertheless largely ignored their writings, which are usually dismissed as environmental determinism. This lack of debate over past human-environment relations in Japan has allowed the incorporation of spurious ideas into public discourse and policy (Hudson, 2018).

Claim 1: late arrival of agriculture

The first claim is that due to the late arrival of farming in the Japanese Islands, its negative impacts have been reduced relative to regions with longer histories of agriculture. It is proposed, furthermore, that this late arrival meant that pre-agricultural traditions—assumed to be less destructive of nature—have retained a strong influence. Umehara (1999: p. 47) makes this argument most explicitly: ‘In Japan, where the introduction of rice cultivation was comparatively late, its blending with the native forest culture and the hunting and gathering culture created an amalgamation unique to Japan.’ This claim is unusual in Japanese historiography where most scholars believe that farming was a key stage in the development of civilisation. Although, as far as we are aware, Umehara is the only writer who has regarded the late arrival of agriculture in Japan as a positive trend, recent research on the variable resilience of early farming systems (e.g., Stevens and Fuller, 2012; Shennan et al., 2013; Downey et al., 2016; Bogaard et al., 2017; Hudson, 2020; Hudson and Robbeets, 2020; An et al., 2021)—an approach anticipated for medieval Japan by Farris (1985)—makes it worth reconsidering the long-term viability of agriculture in the Japanese archipelago.

On the one hand, the Japanese Islands certainly possess a long history of hunting and gathering as an important subsistence activity, which continued until 1000 BC in north Kyushu, the medieval era in Okinawa, and the late nineteenth century in Hokkaido (Hudson, 2014; Takamiya et al., 2015; Jarosz et al., 2022). However, this does not mean that hunter-gatherer societies in Japan formed a type of ‘ecotopia’ as imagined, among others, by Okamura (2018) and Yasuda (2007). As noted above, there is extensive evidence for the management of anthropogenic environments, and Jōmon economic adaptations were not uniform or stable, contrary to the claims of Okamura (2018: p. 1). The Jōmon was a Eurasian Neolithic society characterised by sedentism, pottery use, polished stone axes for forest clearance, and the management of plants and animals. Umehara’s comment about the ‘blending’ of rice farming with the ‘native’ forest or hunter-gatherer culture is also contradicted by recent genomic research, which concludes that after a massive migration into the archipelago in the Yayoi and Kofun (AD 250–700) periods, the modern Japanese can trace only about 10% of their genetic ancestry back to the Jōmon (Cooke et al., 2021; Robbeets et al., 2021; Wang et al., 2021), a figure lower than that estimated for the hunter-gatherer ancestry of some European countries (cf. Haak et al., 2015).

How long agriculture has been practised in a particular region has to be weighed against the intensity and maturity of the farming system. This problem is ignored by Umehara and colleagues, who claim that rice cultivation began very early in the Yangtze valley and continued essentially unchanged thereafter (Yasuda, 2009a). Such an understanding is contradicted by recent research showing that plant domestication was a much slower process than previously realised, that early farming societies were often subject to ‘boom and bust’ cycles, and that Bronze Age and later processes of proto-globalisation completely transformed Eurasian agriculture. In China, despite earlier experiments with plant cultivation, morphological domestication and dietary dependence on rice did not develop until at least the fifth millennium BC (Fuller et al., 2009). Research in Europe has suggested that ‘boom and bust’ cycles of early agriculture may, in part, have been caused by endogenous factors unrelated to climate change, with epidemic disease potentially contributing to other socio-ecological instabilities (Shennan et al., 2013; Downey et al., 2016; Scott, 2017; Rascovan et al., 2019). In East Asia, most work has focused on the role of the climatic cooling known as the 4.2k event (Wu and Liu, 2004; Wang et al., 2005; Liu and Feng, 2012; Constantine et al., 2019; An et al., 2021), although the possible role of disease has also been discussed (Farris, 1985; Hosner et al., 2016; Hudson and Robbeets, 2020). Wet-rice agriculture underwent significant evolution over time, and the Bronze Age rice cultivation that reached Japan was highly intensive as compared to earlier forms of Neolithic farming in China (Ellis et al., 2013). Across Eurasia, the period between 2500 and 1500 BC saw a dynamic expansion and exchange of crops and farming systems (Liu et al., 2019; Spengler, 2019). It was this episode of early Eurasian ‘food globalisation’ which led to the arrival of full-scale agriculture in Japan. In short, while Japan was one of the last places in Eurasia to adopt cereal agriculture, the farming system which arrived in the archipelago in the Bronze Age involved highly intensive patterns of land-use.

Claim 2: the sustainability of rice

The second claim of Japanese exceptionalism with regard to the Palaeoanthropocene relates to the type of farming found in Japan. Wet rice is extolled as a ‘form of agriculture [which] made use only of the plains, leaving the mountains and hills mostly untouched. This is different from the practice of a wheat-growing culture, which also cultivates the slopes’ (Umehara, 1999: p. 41; see also Iwatsuki, 2008). Berglund (2008: p. 62) regards restricted agrarian expansion into uplands as one defining feature of Japan’s ‘rice-fish-seafood culture’. Such claims have been influential because ambiguously dated ‘traditional’ rice farming has been advocated by many Japanese institutions, in both national and international environmental policy, as an example of ‘wise use’ and harmonious coexistence with nature (Berglund et al., 2014; cf. Cwiertka, 2006; Cwiertka and Yasuhara, 2020; Lindström, 2017, 2019). Claim 2 is best evaluated by considering the role of rice, alluvial plain land-use, and deforestation as separate issues.

Rice in pre-industrial Japan

Rice was the most important tax cereal for the premodern state in Japan; while it thus had great social significance, it was not the cereal primarily consumed by most Japanese. Umehara (1990, 2007: p. 179) insists that ‘the farming that eventually spread to the [Japanese] archipelago was not the wheat-based culture of the Yellow River, but the rice agriculture of the Yangzi.’ Yet research has shown a mixture of rice, millet and wild nuts and fruits in the Neolithic economy of the middle Yangtze (Nasu et al., 2012; Hosoya, 2011). Given that farming systems with low redundancy (e.g., those with only one main crop) tend to expand more slowly (Fuller and Lucas, 2017), agriculture seems to have spread to Japan as part of a diverse suite of crops, combining rice, wheat, barley and the millets (Nasu and Momohara, 2016; Robbeets et al., 2021; Hudson, 2022), a pattern also known from Bronze Age northeast China and Korea (Lee et al., 2007; Kwak et al., 2017; Liu et al., 2019). Agriculture in Japan became more diverse over time as new practices and plants were added. By the mid-sixteenth century, Jorge Álvares, a Portuguese merchant, described an annual cycle of three crops in Kyushu: ‘In November they sow wheat, barley, turnips, radishes and other vegetables such as beet which they eat; in March they sow Indian corn, maize, mangoes, chick-peas, beans, artichokes, cucumbers and melons; in July they sow rice, yams, garlic and onions’ (Cooper, 1965: pp. 6–7). A similar three-crop rotation was described by a Korean visitor to Settsu (modern Osaka-Kobe) in 1420 (Farris, 2006: p. 131).

Since premodern Japan’s taxation system was based on rice, historical documents focus on that cereal, making it difficult to calculate the proportion of agricultural production devoted to other crops. Non-rice crops were converted into corresponding value in rice, which functioned as a de facto currency. Totman (2014: p. 53) speculates that as little as one-third of crop production was of wet rice for most of Japan’s pre-industrial history. Rice was consumed by the aristocracy but perhaps provided only one tenth of the calories of the poorest farmers who ate millets, barley, wheat, beans and vegetables (von Verschuer, 2016: p. 257). In early modern Edo (Tokyo), over-consumption of polished white rice led to the spread of beriberi, which was countered by the consumption of vitamin B1-rich millets (Maruyama et al., 2018). As late as 1881, a European resident of Japan wrote that, ‘Rice is not the universal staple, although the inhabitants of the great cities mainly depend on it; there are many districts where it is almost unknown, and in the districts even where it is grown, it is a luxury—a dainty dish for festive occasions’ (Pfoundes, 1881: p. 229). Both rice and other cereals were widely traded. Sixteenth-century European visitors recorded that wheat flour was shipped from Japan to the Philippines (Cooper, 2001: p. 106).

Alluvial plain use

Since flat land comprises only around 15 per cent of Japan’s total surface area, with the rest of the archipelago covered in steep, mountainous terrain, geography placed limits on the expansion of wet-rice farming. Many early rice paddies were concentrated in small valleys, and it was only after about AD 1300 that large alluvial plains began to be developed for this purpose, a process aided by the introduction of Champa rice (Oryza sativa indica var. spontanea or perennis) from south China (Farris, 2006: pp. 132–133). Reclamation of marshy flood plains increased in the early modern Tokugawa period (1600–1868). Rice agriculture on alluvial plains required extensive levelling and dyking, engineering works that created huge anthropogenic wetlands (Fuller and Qin, 2009; Brown, 2015). As Imamura (1996: p. 5) has commented, ‘Although wet-rice fields appear to be natural features, they are in many ways more akin to factories’. The resilience of these artificial paddy field systems was not necessarily high and, like the Chinese, the Japanese became locked into a path dependency on expensive works to maintain them (Elvin, 1993). In areas unsuited to wet rice, such as mountains and small islands, there was an increase in slash-and-burn cultivation of buckwheat, foxtail millet and barnyard millet as population increased in the Tokugawa period (Kitō, 2012: p. 72). As noted in the previous section, no early agricultural systems were sustainable in the sense of remaining stable and unchanging from their inception: all such systems underwent cycles of expansion and contraction. A particularly important factor affecting the sustainability of rice agriculture in premodern Japan was its heavy dependence on labour inputs, which were sensitive to disruptions in local population densities resulting from epidemic disease, war and natural disasters—all frequent occurrences in the Japanese Islands (Farris, 1985; Kitō, 2000).

Rice and deforestation

Umehara (1996: p. 3) proposed that rice is much ‘gentler’ (yasashii) on nature than wheat because ‘water is the most important thing for rice civilisation’ and thus rice farmers have to protect forests in order to preserve water resources. The obvious point that all agriculture requires water is apparently forgotten, not least by Yasuda (2008: p. 53) who makes the curious statement that ‘water is not so important in European style land-use.’ Recent archaeological research has begun to clarify the sophistication of water management in Neolithic West Asia and Europe (Gebel, 2004; Garfinkel et al., 2006; Vostrovská et al., 2021). The underlying claim for Japan appears to be that the intensification of rice agriculture was able to support higher population densities without increasing forest clearance. As mentioned, empirical data on forest cover in Japan are only available from the mid-nineteenth century. Historians have estimated rates of deforestation back to the seventeenth century (Totman, 1989, 1995; Saito, 2009). For earlier periods, it is possible to use pollen data and to extrapolate trends from studies of land-use and trade in forest products. While it is therefore not possible to test Umehara’s claims about rice and deforestation for earlier periods, we can nevertheless make some inferences against known trends.

It is important to begin by considering what is meant by ‘forest cover’. Modern Japan has a high proportion of woodland relative to total land area. In the late twentieth century, for example, 67 per cent of Japan was forested as compared to 27 per cent of France; when calculated as woodland per capita, however, France had a higher figure than Japan during the same period (Saito, 2009). According to Japan’s Forestry Agency (https://www.rinya.maff.go.jp/j/keikaku/genkyou/h29/1.html), 41 per cent of forests in Japan were classified as artificial plantations in 2017. Kaplan and colleagues (2009) argue that forest cover on land usable for agriculture is a better measure of historical rates of deforestation than measures which include all land. By this method, rates of forest cover in pre-industrial Japan would likely be very low. However, deforestation in Japan was not limited to lowland areas suitable for wet-rice farming. Steep hillsides were used for slash-and-burn (swidden) cultivation, especially in western Japan. The government estimated that swidden accounted for 15 per cent of arable acreage in 1936 (von Verschuer, 2016: p. 130). Peasants also cleared forested hills ‘to create pastures for feeding livestock and for collecting grasses to mix with animal excrement and use as fertiliser’ (Saito, 2009: p. 385). In the thirteenth century, it was said that peasants near Kyoto had gathered so much grass and leaves for manuring that the hills around their village had lost all their vegetation (von Verschuer, 2016: p. 22). Estimates of the amount of ‘green manure’ used in premodern agriculture suggest that this custom could have had significant impacts on local deforestation. For example, a text from the early seventeenth century noted that one man required fifty-two days per year to cut enough vegetation to fertilise one hectare, and in 1686 two hundred pack-horse loads of green manure were used for the same area (von Verschuer, 2016: p. 24).
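
The per-capita comparison cited above can be checked with a rough back-of-the-envelope calculation; the land areas, forest shares and populations used below are approximate figures for around 2000, included purely for illustration:

% Illustration only; approximate figures for ca. 2000 (not taken from the sources cited in the text):
% Japan: ~378,000 km² land area, ~67% forested, ~126 million people
% France: ~549,000 km² land area, ~27% forested, ~59 million people (1 km² = 100 ha)
\[
\text{Japan: } \frac{0.67 \times 378{,}000 \times 100\ \text{ha}}{126 \times 10^{6}} \approx 0.20\ \text{ha per person}, \qquad
\text{France: } \frac{0.27 \times 549{,}000 \times 100\ \text{ha}}{59 \times 10^{6}} \approx 0.25\ \text{ha per person}
\]

On these assumed figures, France ends up with slightly more woodland per person despite its far lower percentage of forest cover, which is the pattern Saito (2009) describes: percentage cover and per-capita cover can rank countries differently.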

Landscape archaeology shows that clearance of both floodplain and hillside forests was associated with the introduction of agriculture in Japan in the first millennium BC (Okada, 2015). Mining and logging for construction timber also had major impacts on forests (Totman, 2014). The Nihon shoki records that in 676, indiscriminate burning and cutting of grass or firewood on mountains and plains in the Kinai region was forbidden (Aston, 1972: II, p. 332). Also in the seventh century, Suruga province (modern Shizuoka) was ordered to build a ship for an invasion of Korea (Aston, 1972: II, p. 269); the construction of a ship at a location so distant from the Korean peninsula may imply that forests were already suffering from depletion in the parts of western Japan controlled by the Yamato state. By the medieval period there was nevertheless an extensive international trade in lumber from Japan to China (von Verschuer, 2016: pp. 69–70). In an analysis of early modern Japan, Saito found some evidence that population increase did not always lead to greater deforestation. Rather than explaining this by the nature of rice cultivation or by conservation policies or cultural beliefs, however, Saito (2009: p. 403) argues that ‘market linkages worked to make regenerative forestry commercially viable’, showing the need to consider broader social factors.

Claim 3: domesticated animals versus marine foods

The third claim of environmental exceptionalism concerns pre-industrial Japan’s supposed scarcity of domesticated animals and corresponding reliance on marine foods (Jones, 1981: pp. 13–14; Sahara, 1992; Imamura, 1996; Berglund, 2008). Umehara (1999: pp. 41–42) argued that ‘the form of agriculture introduced into Japan did not involve the raising of animals on pasture land, which could have also meant cutting down the trees on the hills and mountains.’ Yasuda (2006) has emphasised the role of fishing in Japanese history while claiming that all domesticated animals cause deforestation and environmental damage. Berglund (2008: p. 56) takes up these ideas, insisting that ‘Animal husbandry has never been important in Japan. By tradition the Japanese diet is based on rice and seafood.’ Since the use of marine resources is one theoretical means of reducing pressure on land-use (Kaplan et al., 2009: p. 3018), this claim is worth evaluating against the evidence from pre-industrial Japan.

Domesticated animals

Empirical data on the numbers of domesticated animals in premodern Japan are scarce. While animal bones are often preserved in Neolithic shell middens, such middens become less common after the Bronze Age (Hudson, 2019, 2021b). As the zooarchaeological record is poor, scholars emphasise the importance of historical documents (Nishimoto, 2008), yet archaeology shows that what people write about animals is not necessarily the same as their actual consumption habits (Pluskowski, 2010). New analytical techniques hold some potential to expand our understanding of animal usage in Japan. For example, because environmental lead pollution increased in the Tokugawa period, high lead concentrations in bone are a way of distinguishing domesticated from wild animals in early modern sites (Eda et al., 2014). More broadly, from the Neolithic onwards all Eurasian societies used domesticated animals in a complex mosaic of adaptations depending on local environmental and social conditions; there was certainly no black-and-white dichotomy between ‘animal’ and ‘plant’ civilisations as proposed by Yasuda (2015, 2016). In Europe, the Neolithic saw high diversity as different combinations of crops and animals were employed to adapt to the new environments encountered across the region (Ethier et al., 2017; Ivanova et al., 2018; Cubas et al., 2020). Even in historic times meat consumption was variable across Europe (Braudel, 1981: p. 105, pp. 194–199; McNeill, 1992: pp. 127–129; Chandezon, 2015). As with forestry, the role of commercialisation needs to be considered, and an increase in meat and milk consumption in Britain from the seventeenth century has been linked with the growing commercialisation of the economy (Francks, 2019). Some of the clearest zooarchaeological evidence for meat consumption in pre-industrial Japan also comes from commercialised contexts in Edo (Uchiyama, 1992).

Pigs and chickens appear to have been the only domesticated animals associated with the first introduction of cereal agriculture into the Japanese archipelago. Zooarchaeological evidence suggests that by the early centuries of the first millennium AD, pigs were undergoing greater mixing with wild boar (Anezaki, 2007). Following a classification developed by White (2011), Sand (2021) argues that suids in Japan were kept as ‘woodland pigs’ that foraged for food in forests; this contrasted with China where, because of higher population densities, ‘house pigs’ lived mainly on household refuse. Under Chinese influence, a ‘house pig’ pattern developed in Okinawa in the early modern era (Sand, 2021).

Chickens appear around 400 BC but are rare in archaeological sites until the medieval period (Hudson, 2019). Horses and cattle became common from the fifth century AD but, following a pattern found in China and Korea, may have been less widely used in Japan for agricultural activities than in Europe (Pomeranz, 2000: pp. 32–34). Extensive grasslands were, however, maintained to raise horses for military use. Though probably rare in most of Japan until the nineteenth century, goats were known in Nagasaki and Okinawa from the medieval period (Thiede, 1998; Kreiner, 1996: p. 21). That the Japanese raised sheep is mentioned by Luis de Guzman in his 1601 Historia de las missiones and, in the seventeenth century, the Japanese traded goatskins to Korea (Lach and van Kley, 1993: pp. 1829, 1787).

While dead animals and their flesh could be considered a cause of religious defilement, a practical distinction was made between cattle and horses, which were generally protected as draught animals, and wild game, which damaged crops and was therefore killed and eaten on a regular basis (Shimizu, 2010). In 1546, Álvares noted that in Kagoshima, ‘There are no pigs, goats or sheep, and only a very few stringy hens’; by contrast, he explained that the Japanese ‘hunt and eat deer, rabbits, pheasants, quails, doves and other birds’ (Cooper, 1965: pp. 6–7). This hunting of wild animals struck European observers because hunting in Europe was more restricted by social class (Pluskowski, 2018). In Japan, hunted meat was sometimes consumed clandestinely under medicinal pretexts and given euphemisms such as yamakujira (‘mountain whale’).

While further zooarchaeological research is needed, these observations may support a relatively low consumption of domesticated animals, but they do not prove a causal link between lack of domesticated animals and high levels of forest cover. Yasuda has been the most vociferous proponent of this argument, claiming that all domesticated animals destroy forests but reserving particular vitriol for sheep and goats (Yasuda, 2006: p. 107, 2016: p. 80). However, he frequently exaggerates evidence for ecological damage from ovicaprids, claiming, for example, that, ‘After the Anglo-Saxons [sic] brought sheep, goats and cattle to the American continent in 1620, 80% of America’s forests were lost within a mere three hundred years’ (Yasuda, 2010: p. 177). Ovicaprids were rare in early colonial North America where cattle and hogs dominate faunal assemblages in British and Spanish sites along the Atlantic coast (Reitz and Honerkamp, 1983). In northern Europe, with some exceptions such as Scotland, Neolithic pastoral economies were dominated by cattle; sheep and goats were never a primary cause of deforestation (Schulting, 2013; Meiggs, 1982; McNeill, 1992: pp. 72–74). Large flocks of sheep only become common in certain parts of Europe in the Bronze Age when they were used to produce wool (Sabatini and Bergerbrant, 2020). Of course, many upland areas around the Mediterranean have been subjected to deforestation and the browsing of ovicaprids has been part of that phenomenon. However, the impacts of pastoralism have not been continuous or irreversible (Walsh et al., 2007). In one well-studied example, upland pastoralism in northwest Greece historically served to increase the sustainability of lowland agriculture (van der Leeuw, 1998; Tainter, 2006). The assumption that ecological damage is only caused by domesticated and not by wild animals is contradicted by eighteenth-century Hachinohe in northern Honshu, where cash-crop farming of soybeans led to the so-called ‘wild boar famine’ of 1749 (Walker, 2001).

Fishing

A corollary to the proposed minor role for the consumption of domesticated animals in Japan has been the claim of reliance on fishing as an alternative source of protein. The concept of a rice-cultivating/fishing economy shared between Japan and southern China had been emphasised by ethnologists in the mid-twentieth century but was critiqued by archaeologists from the 1980s (cf. Hudson, 2021b). Although few details are provided, there seem to be two major assumptions in Yasuda’s (2006, 2009b) re-crafting of this proposal: that fish consumption in Japan has always been high and that fishing was linked closely with agriculture. Statistical data show that seafood consumption has been high in Japan in the post-war period. According to the UN Food and Agriculture Organization, in 2009 the average Japanese person ate 56 kg of fish. In 2015, by contrast, average seafood consumption in the EU was 25 kg (European Union, 2017). However, no comparable data exist for pre-industrial Japan, for which we can only extrapolate broad trends from the archaeological and historical records. Although human societies in the Japanese Islands have a long history of exploiting marine resources, significant changes occurred over time (Hudson, in press). High levels of seafood consumption in Japan since the 1950s have been underpinned by industrial offshore fisheries. These fisheries developed in the late nineteenth century, but prior to the Second World War they were aimed not primarily at the domestic market but at crab, salmon, tuna and other resources for export (Tsutsui, 2013). The Tokugawa period saw a substantial growth in inshore fisheries as a response to the crisis of sustainability of the eighteenth century (Totman, 1993: p. 34). This fish was not used only for direct human consumption: fishmeal was processed into fertiliser and used to boost production of commercial crops like cotton and tobacco, while whale oil was made into insect repellent (Totman, 1993: pp. 272–274, 2014: pp. 180–182). It is not even clear that pre-industrial seafood consumption in Japan was higher than in Europe (allowing for regional variation within both). A major expansion in long-distance offshore fisheries began in Europe by the eleventh century. This medieval globalisation of fishing suggests a heavy reliance on seafood in Europe, probably driven by urbanism and population growth and the Christian rule of avoiding meat on certain—increasingly numerous—fast days (Barrett et al., 2004). By contrast, the fact that Japan’s pre-industrial fisheries were limited to inshore catches suggests a smaller market than in Europe, though it may also imply that freshwater fish were under less pressure from human impacts in Japan than in Europe (cf. Barrett, 2018).

Yasuda’s claim of a rice-cultivating/fishing complex finds little support in the archaeological record. Some Yayoi sites have evidence for small-scale net fishing and for catching octopus in ceramic pots. Carp aquaculture in rice paddy fields was introduced from China at this time (Nakajima et al., 2019). While these activities no doubt provided important supplements to the diet, fishing in post-Bronze Age Japan developed apart from farming through more specialist groups (Hudson, 2021b, in press). The extent to which commercial offshore fisheries developed in premodern Japan, as compared to Europe, awaits further research.

Another important area for future work is stable isotope analyses of human bone, which provide a way to test the claim that pre-industrial Japanese obtained their protein primarily from marine foods. Currently available isotopic studies from premodern Japan do not show universally higher rates of marine food consumption as compared to Europe. Regional variation is high in both cases, but \(\delta^{15}\)N values—often a proxy for seafood consumption—from medieval York are, for example, higher than those from early modern Japan (Maruyama et al., 2018; Tsutaya et al., 2016a; Müldner and Richards, 2007). Moreover, the \(\delta^{15}\)N values of rice and vegetables from early modern Japan may have been inflated due to the widespread use of dried fish as fertiliser (Tsutaya et al., 2016b).
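
For readers unfamiliar with the notation, \(\delta^{15}\)N expresses the ratio of \(^{15}\)N to \(^{14}\)N in a sample relative to the atmospheric nitrogen (AIR) standard, in parts per mil; the standard definition is:

\[
\delta^{15}\mathrm{N} = \left( \frac{(^{15}\mathrm{N}/^{14}\mathrm{N})_{\text{sample}}}{(^{15}\mathrm{N}/^{14}\mathrm{N})_{\text{AIR}}} - 1 \right) \times 1000\text{‰}
\]

Because \(^{15}\)N is enriched by roughly 3–4‰ at each trophic level, the long food chains of marine ecosystems tend to produce elevated \(\delta^{15}\)N in the bone collagen of heavy fish consumers, which is why the measure serves as a proxy for seafood intake; as noted above, however, manuring crops with fish fertiliser can inflate plant values and complicate the signal.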

Claim 4: cultural ideologies of stewardship

The final claim of environmental exceptionalism considered here relates to mentalités and the proposal that the late arrival of agriculture explains the continuing importance of pre-agricultural world views. Umehara (1970, 1991, 1996) argued that supposedly pro-environmentalist Shinto and animist beliefs in modern Japan reflect the tradition of a hunter-gatherer ‘forest civilisation’, and further claimed that those beliefs have made Japanese Buddhism more animistic in its teachings (see also Nagasawa, 2008). There are three major questions here. First, have Japanese religions such as Shinto remained unchanged from the Jōmon period until the present day? Second, have those religions fostered pro-environmentalist behaviours throughout Japanese history? Third, can the proposed pro-environmentalist behaviours be linked with religious practices?

The idea that ‘traditional’ mentalités have persisted in modern Japan is a common assumption made by both Japanese and non-Japanese observers. The political philosopher Antonio Negri writes that ‘Japan’s powerful cultural traditions that manage to co-exist with super-modernity have the potential to solve [the] conundrum’ of ‘finding a new way to co-exist with nature’ (Yoneyama, 2019: p. 205). However, the valorisation of Shinto as Japan’s nature-friendly ‘indigenous religion’ goes against the consensus of research in religious history, which has shown that Shinto underwent huge changes over time (Kuroda, 1981; Teeuwen, 2012). Since Kuroda’s seminal work, scholars have debated the extent to which Shinto was a modern invention or the product of a more complex historical process of interaction with other religious currents—notably Buddhism and Confucianism. Kuroda argued that Shinto was not an independent tradition until the modern period and was, in essence, a product of modern nationalism. Others, while considering that Kuroda overstated the modernity of Shinto, and arguing that a tradition centred around kami (spirit) veneration had an independent existence before modern times, nonetheless reiterate the point that to view Shinto as a self-standing indigenous religion with an unchanging nature is historically incorrect (Breen and Teeuwen, 2000). After Kuroda, it is hard to find any serious scholarship that has continued to adhere to the idea of Shinto as an indigenous, unchanging religion.

The second question about environmentalist attitudes draws on a contentious literature proposing that ‘Oriental’ and Indigenous religions have fostered greater environmental stewardship than the monotheistic beliefs of the Judaeo-Christian tradition (cf. White, 1967; Reader, 1990; Kalland, 2002; Elverskog, 2014). However, there is no empirical evidence that religious traditions have had any influence on cumulative anthropogenic impacts over the long term. Understandings of sustainability and environmentalism found in contemporary societies are not directly applicable to the past (Harkin and Lewis, 2007). Historical analyses have shown that attitudes to nature have changed over time in Japan and elsewhere (Morris-Suzuki, 1991; Thomas, 1999; Stolz, 2014; Thomas, 1983). The evidence summarised in this paper shows that premodern Japan was characterised by significant anthropogenic impacts despite a religious history different from that of the rest of Eurasia. The substantial increase in environmental damage in Japan beginning with the industrialisation of the late nineteenth century in fact coincided with Shinto becoming the official state ideology (Hardacre, 1989, 2017; Shimazono, 2009), yet few historians have argued that Shinto was a cause of those environmental transformations. Iwatsuki (2008: p. 8) attempts to explain this apparent contradiction by commenting that ‘It is a pity to note that Japanese Shintoism was under a strong influence of militarism after the Meiji Restoration, in parallel with the promotion of the advanced Western civilisation.’ Views of Shinto through an ‘environmentalist paradigm’ only entered the mainstream from the 1980s as part of attempts to revitalise that religious tradition and its political role (Rots, 2017).

It is possible to extract ‘green’ fragments from Japan’s religious tradition but these are usually discussed without consideration of historical context. Evidence as to how such ideas might have actually affected the use of natural resources in Japan is lacking. Within the Japanese religious tradition, itself characterised by a variety of beliefs and practices, there is a diverse discourse over relations between humans and the natural world. While there were various taboos connected with the taking of life, alternative positions existed (Grumbach, 2005; Nakamura, 2010). The Kokonchomonshū, a text completed in 1254, contains a story about a man from the Tōdaiji (Buddhist) temple who bought clams only to piously return them to the sea. That night, he was frightened by a dream in which the clams came back to complain that they had lost their chance for reincarnation. This story is regarded as support for the idea that the human consumption of animals was beneficial for their karma (Nakazawa, 2009). Grumbach (2005) provides a detailed analysis of the role of medieval venison offerings at the Suwa (Shinto) shrine. These examples illustrate religious debates over the use of animals in premodern Japan, but whether or not such discourse affected the actual use of clams, venison or other resources is entirely another question.

Another common device to claim a religious basis for Japanese environmental exceptionalism is the selective use of negative examples from the Christian West, framed in such a way as to imply that similar behaviour did not occur in Japan. Thus, for instance, Ishi et al. (2001: p. 127) blame deforestation in medieval Europe on the use of stained-glass windows in churches, claiming (without citations) that 1 m² of stained glass required 50 m² of forest. Such comments suggest that the construction of religious buildings in Japan did not also require extensive use of timber, but it has been estimated that in the eighth century the construction of the Tōdaiji temple in Nara would have required some 900 ha of top-grade timber (Totman, 2014). Heavy metal pollution around Tōdaiji was also high (Kawahata et al., 2014). After Japan’s imperial capital was moved to Kyoto at the end of the eighth century, it remained there until 1868, in part due to the increasing costs of finding wood for further construction (Totman, 2014: p. 86). In short, although traces of earlier myths and traditional ecological knowledge relating to forests may remain—or have been re-imagined—in Japan, concepts like the ‘civilisation of the forest’ are reductionist and ahistorical and can be placed within the trope of the invention of tradition, which has been so pervasive in modern Japan (Vlastos, 1998).

The third question relates to the situation of religion in contemporary society. While various understandings of ‘secularism’ exist, by any criteria modern Japan is a highly secularised society in which support for religious institutions is in decline (Reader, 2012). At the same time, religion remains important in legal terms, with a constitutional guarantee of religious freedom and protection from state interference. The state is not allowed to privilege, fund or support religious institutions, which are located strictly in the private sphere. However, ever since the inauguration of Japan’s post-war constitution there have been recurrent attempts to revise this situation, especially by portraying Shinto as an indigenous cultural phenomenon centred on customary practices and harmony with nature, and as an integral part of Japanese national identity and heritage (Prohl, 2000; Mullins, 2012; Rots, 2017). Those who portray Shinto in this way claim that it is an intrinsic element in the fabric of Japanese life, rather than a ‘religion’, and as such should be supported and given privileges by the state. The environmental concepts discussed in this paper, and the claims made by their proponents, have provided some of the supposedly ‘intellectual’ underpinnings for this position, while positing Shinto as an environmentally friendly tradition. These claims have been articulated by a powerful lobby of conservative political figures including former Prime Minister Shinzō Abe and the influential pressure group Nippon Kaigi, and also form part of a wider nationalist desire to revise the constitution and to remove Article Nine, its ‘peace clause’ (Guthmann, 2017). While these political manipulations tell us little about environmental sustainability in premodern Japan, they are a good example of how the environmental inventions and speculations we have discussed can give sustenance to attempts to redefine religious traditions and reorient society. This complex background to the relationship between Shinto and the environment has meant that the issues connected to Claim 4 have so far been discussed within the limited framework of ‘Japanese tradition’. The broader literature looking at how religion can promote sustainability (e.g., Haluza-DeLay, 2014; Hulme, 2017; Jenkins and Chapple, 2011) has been ignored and, at the same time, that literature has rarely considered Japan except within a narrow, stereotypical framework.

Conclusions

None of the claims for Japanese environmental exceptionalism investigated here were supported by the archaeological and historical records (Table 1). The Palaeoanthropocene in the Japanese Islands was characterised by extensive anthropogenic impacts, which appear comparable to those in other regions of Eurasia and probably reflect global processes.

Table 1 Results of this study.

The claims analysed in this paper have usually been presented in the Japanese literature in a simplistic or sometimes nationalistic way not conducive to increasing understanding of long-term processes in environmental history. Nevertheless, several topics discussed here warrant further research and archaeology can play an important role in generating relevant datasets and analyses using new techniques (cf. Fernandes et al., 2021). Any claims of Japanese environmental exceptionalism need to be considered in the context of mainstream understandings of anthropogenic change. The climate, topography and location of the Japanese archipelago suggest high levels of forest cover would exist there under natural conditions without invoking a special role for religious traditions (cf. Rolett and Diamond, 2004). As noted, population density and socio-political complexity have been proposed as important predictors of anthropogenic impacts. Since both prehistoric and pre-industrial Japan were characterised by high levels of population density and socio-political complexity, these are likely to be significant predictors of anthropogenic impacts in the archipelago, although the role of epidemic disease in depressing population growth until the Middle Ages needs to be considered (Farris, 1985). Finally, while the claims discussed here have invariably treated ‘Japan’ as a single cultural unit, there is a need to consider possible geographic variation in anthropogenic impacts across the archipelago.

Our main concern in this study has been with historical issues, but the policy implications of our analysis also merit further consideration. As noted in the Introduction and the Supplementary Information, ideas about Japanese environmental exceptionalism have found their way not just into national policy but also into Japan’s international contributions to United Nations biodiversity policies and nominations for UNESCO heritage status. The basic assumption shared in these policy documents is that since Japan’s traditional culture was in ‘harmony’ with nature, there is a need not only to celebrate and preserve those traditional systems but also to attempt to transmit them as best practice for other countries. Our results show that the factors often cited as explanations for Japan’s traditional ‘harmony with nature’ are not supported by the historical record. This does not mean that Japan has nothing to contribute to debates on sustainability. Rather, there is a need to shift discourse away from unfounded ideas about ‘harmony’ and to consider the extent to which Japanese societies were resilient to socio-ecological change over the long term.