In this issue of Hypertension Research, Kamezaki et al.1 explore seasonal differences in the diagnosis of the metabolic syndrome and the role of insulin resistance in this relationship. The authors found that significantly more subjects met the criteria for the metabolic syndrome in winter than in summer, and that insulin resistance and increased blood pressure were the components that primarily drove the results. It appears that many subjects already met some but not all of the diagnostic criteria for the metabolic syndrome, such as having abdominal obesity and dyslipidemia, and the seasonal exacerbation of insulin resistance and/or blood pressure was sufficient to make them meet the full diagnostic criteria. The results from this study add to the evidence that metabolic parameters vary on a seasonal basis and lend support to the theory of the seasonal expression of the thrifty genotype in humans. According to this theory, the thrifty genotype evolved to be expressed during seasons of high food availability to facilitate the deposition of fat reserves through insulin resistance and other metabolic changes, in preparation for later seasons of relative food scarcity.2 In the extreme case of hibernators, tremendous amounts of weight are gained in the late summer and fall, mainly through the accumulation of triglyceride stores in adipose tissue. The fat-storing phase of the circannual cycle is characterized by hyperinsulinemia and insulin insensitivity in the autumn compared with summer, spring and winter. These changes are transitory, though, and reverse themselves once hibernation begins. The stored fat, including large abdominal depots, provides the primary metabolic fuel for the entire winter hibernation season.
Although humans do not hibernate, it is likely that we share at least some of the genes and pathways involved in orchestrating seasonal weight changes.3 If human metabolic changes follow a similar pattern, then we would expect the highest levels of insulin resistance in the late summer and fall months; unfortunately, the current study examined metabolic differences only between summer and winter.

The central biological clock, or suprachiasmatic nucleus (SCN), is likely to play a part in seasonal variations in metabolism. Circadian gene variants have been found to be associated with hypertension and high fasting blood glucose in humans.4 The SCN evolved to synchronize activity and consumption to the circadian and circannual cycles using the autonomic nervous system. The SCN requires repeated environmental cues from light exposure, sleep, activity and nutrient intake to entrain autonomic rhythms to the external environment. Reliable environmental cues of the impending seasons for our hunter–gatherer ancestors were the length of the daily photoperiod, which changed in a precise and predictable fashion throughout the year, and the intake of seasonally available nutrients. As the distance from the equator increases, the seasonal variations in light exposure, temperature and food availability become more dramatic. Our hunter–gatherer ancestors had access to fruits only on a seasonal basis, fruits that often ripened to have a higher glycemic index in the autumn. The winter diet was likely to have comprised more wild game, and the carbohydrates that would have been available would have had a lower glycemic index. The sleep durations of our hunter–gatherer ancestors were likely to have been shorter in summer months owing to the longer photoperiod, and experimental sleep restriction has been shown to result in compromised insulin sensitivity,5 increased appetite,6 raised blood pressure,7 and increased total and LDL cholesterol levels.8 The environments in which our circadian and metabolic regulatory systems evolved to be adaptive are quite different from those encountered in modern society. Today we have dramatic alterations in the timing and duration of external and internal zeitgebers that can send inappropriate signals to the SCN regarding the current season.
We can be exposed to artificial light around the clock, stay awake for long hours, feed on sweets and simple sugars year-round, and rely upon modern conveniences to reduce physical activity. Insulin resistance that was adaptive on a seasonal basis for our ancestors could, today, with our dramatically different lifestyles, result in year-round fat deposition and overtaxation of our beta cells, contributing toward the cluster of symptoms that characterizes the metabolic syndrome.

Kamezaki et al.1 point out that the seasonal variations in metabolic parameters could be partially attributable to depressed mood and its concomitant symptoms. The diagnostic criteria for a major depressive episode include symptoms that directly affect body weight, such as changes in appetite, fatigue, insomnia/hypersomnia and anhedonia. Seasonal affective disorder (SAD) is a type of depressive episode that occurs at a regular time of year, typically beginning in the fall when light exposure dwindles. The characteristic symptoms of SAD include carbohydrate cravings, increased appetite, hypersomnia, decreased energy, psychomotor retardation and anhedonia. The symptoms of SAD have been hypothesized to have evolved to facilitate weight gain and the conservation of energy to increase the likelihood of surviving winter months when food is less available.9 Increased appetite, and specifically carbohydrate cravings, would increase the motivation to forage for, and consume, the remaining seasonally available supplies of fruits and carbohydrates. Hypersomnia, fatigue, psychomotor retardation and anhedonia would function to reduce physical activity and decrease caloric expenditure. In a study conducted by Rintamaki et al.,10 seasonal increases in body weight, waist circumference and the diagnosis of the metabolic syndrome were found to be significantly associated with seasonal changes in mood and behavior. They found that individuals with a global score on the Seasonal Pattern Assessment Questionnaire commensurate with having the winter blues had a 56% greater risk of having the metabolic syndrome. Unfortunately, the present study did not have measures of depressive symptoms; therefore, the authors were not able to explore whether mood was associated with changes in metabolic parameters.

The finding of seasonal variation in the metabolic syndrome has implications for the interpretation of results from epidemiological studies that estimate the prevalence of the metabolic syndrome. The question arises whether prevalence estimates should be adjusted based upon the season in which the study was conducted. Another issue is whether the increase in prevalence in the fall and winter months simply represents a temporary perturbation that returns to previous levels in the spring and summer. On an individual level, treatment providers should be aware that patients whose glucose and blood pressure levels are in borderline ranges in the spring and summer could cross diagnostic thresholds in the fall and winter.

The results from this study by Kamezaki et al.1 should provide the impetus for future research exploring seasonal influences on metabolism. The current study included only two time points, summer and winter of one year, and only one population in one geographic locale. Ideally, future studies would include repeated measures of exposures, outcomes, and potential mediating and interacting variables over each of the four seasons for a period of years in a variety of populations. In addition to the metabolic measures, the variables of interest would include light exposure, latitude, temperature, sleep duration, dietary intake, appetite, physical activity, race and depressive symptoms. Having these measures would allow many interesting lines of inquiry. Would the metabolic measures rise in the fall and winter, and then return to their prior spring and summer levels? Even if they decreased in spring and summer, they could still be higher than in the previous year, inching closer each year toward the diagnosis of diabetes, hypertension, abdominal obesity, dyslipidemia or the metabolic syndrome. A useful way to generate hypotheses could be to compare the seasonal exposures that would have been expected for our hunter–gatherer ancestors with our current, previously impossible, industrial-age seasonal exposures. For instance, we would expect that a hunter–gatherer living in Greenland in the wintertime would have eaten fewer carbohydrates, been exposed to less light, slept longer and had to endure cold temperatures. The fitness to endure these seasonal environmental challenges would have evolved over centuries in the peoples who inhabited these lands. Today, a Greenlander could easily gorge on oranges, bask in artificial light and get very little sleep, sending signals to the SCN about the season that are quite inconsistent with the dwindling daylight and cold temperatures outside.
Would year-round exposure, over many years, to lifestyles that were previously experienced only on a seasonal basis prove to be problematic for our metabolic regulatory systems?