The Evolution of Seasonal Affective Disorder


The most important difference between Seasonal Affective Disorder (SAD) and other forms of depression is the atypical way in which it manifests. Both manic depression and major depressive disorder would have been evolutionarily disadvantageous for our ancestors, while SAD would have been beneficial in numerous ways. Today, however, SAD has become an evolutionary throwback in modern society and remains a problem for millions worldwide.
Seasonal Affective Disorder and its more prevalent, milder form, sub-syndromal Seasonal Affective Disorder (S-SAD), strike annually from late autumn to early spring, lasting the duration of the winter. Some symptoms of SAD and S-SAD occur throughout the general population, but in varying degrees (Murray, 2003). Characterized by lethargy, hypersomnia, weight gain, and cravings for carbohydrates and high-fat foods, SAD severely impairs functioning in those diagnosed (Smith, 2005). To be diagnosed with SAD, one must have recurring symptoms for at least two consecutive years, and those symptoms must not be attributable to seasonal stressors or psychosocial factors (Rawana, 2006). S-SAD is distinguished from SAD only by the lesser severity of its symptoms. It is not correct to attribute SAD and S-SAD simply to the weather or the "holiday blues," because they persist for months rather than the few weeks around the holidays (Partonen, 2001); moreover, holidays are a relatively recent invention with no evolutionary significance. Although both SAD and S-SAD were advantageous for the proto-human, in modern society they amount to a regression to our more primitive history.
Without the aid of artificial lighting, the proto-human was confined to the natural hours of sunlight (the photoperiod), unable to switch on a bedside lamp or a kitchen light to continue working after dark. His photoperiod was therefore considerably shorter during the winter months and correspondingly longer in the summer. The same is true today, yet artificial lighting now extends our hours of wakefulness. Photoperiod is central to the etiology of SAD according to the Phase Shift Hypothesis, which proposes that the hours of daylight influence our circadian rhythm, the internal clock responsible for hormone and neurotransmitter regulation as well as the sleep-wake cycle (Lewy et al., 2006). A phase shift interferes with the normal sleep cycle and can set it completely awry; one can be either phase delayed or, less commonly, phase advanced. Melatonin is secreted during the night and continues to be secreted through the later winter dawn (Partonen, 2001). Because of the later sunrise and the extended melatonin secretion, the circadian rhythm is delayed; conversely, the earlier sunset can advance the phase through earlier secretion of the hormone. In either case, the SAD individual becomes hypersomnic and vegetative.
Photoperiod is influenced not only by the seasons but also by latitude: the higher the latitude, the more extreme the seasonal differences in photoperiod. Yet despite evidence that latitude has some effect on SAD, age and sex have a greater impact than latitude and photoperiod alone (Hedge, 1996). Women of childbearing age are the most affected by SAD because of the seasons' impact on female hormones, while others between the ages of 20 and 30 are also vulnerable (Partonen, 2001). Evolutionary theory offers two explanations for this common age bracket: the drastically shorter lifespan of the proto-human, or the fact that these were the prime reproductive years, when it would have been advantageous to conserve resources through the winter in preparation for spring breeding.
This winter vegetative period was vital for the proto-human. Without decreasing his activity level, he risked burning through his energy at a time when food was scarcest. Weight loss and mortality rise sharply during food scarcity, and the proto-human's solution would have been to switch food sources. Faced with famine, a primate will typically change his diet to include plentiful but nutritionally poor foods (Brockman, 2005). By consuming energy-dense foods such as carbohydrates and fats, he could avoid death and pass on his genes to subsequent generations.
SAD sufferers often crave carbohydrates, chocolate, and other fatty foods during the winter months (Smith, 2005), and on average an individual with SAD will gain anywhere from nine to thirty pounds (roughly 4 to 14 kg) (Smyth, 1990). This winter weight gain was originally fat storage for the potentially lean months ahead. By craving low-quality but highly available foodstuffs, those with SAD are merely mirroring their proto-ancestors, whose eating habits ensured survival through the barren winters and improved their fertility for the spring.
Types of depression other than SAD would have been impediments to survival. Unlike SAD, whose symptoms include vegetation and weight gain, major depressive disorder (MDD) is characterized by weight loss and insomnia (Rawana, 2006). While weight loss can reduce fertility, insomnia certainly reduces productivity. MDD also follows no seasonal pattern and could strike at the most inconvenient time, such as gathering or breeding season. A proto-human who was clinically depressed at such a time would have compromised his own survival as well as the survival of his genes.
The same can be said of manic depression, which is distinguished from major depressive disorder by the individual's severe highs and lows. This disorder likewise follows no seasonal pattern and can strike at almost any moment. During the "lows," the individual has all the symptoms of major depressive disorder (weight loss, hopelessness, insomnia); during the "highs," he is overconfident, euphoric, and highly energetic. Both extremes are evolutionarily disadvantageous. An overconfident "high" invites poor judgment, and for the proto-human, good judgment was the difference between eating and being eaten; the "lows" would have been detrimental to survival in much the same way as major depression.
However, if these types of depression have a genetic link, then this compromised survival might have been evolution's way of reducing the frequency of "depressed" genes: proto-humans who were depressed risked their survival and consequently passed on less genetic material. Because SAD was evolutionarily beneficial, the genes linked to the disorder were carried down through the generations, which might explain why the population as a whole experiences some lowering of mood and increase in sleep during the winter months (Eagles, 2003). To be affected by the seasons is normal; SAD and S-SAD simply lie at the extremes of that norm.
SAD was evolutionarily beneficial in the past but remains a medical nuisance in modern society. Without it, far fewer of our ancestors might have survived the winters, yet it is now seen as a regression toward a primitive past. Many resign themselves to blaming the weight gain, sleepiness, and carbohydrate cravings on Christmas and Thanksgiving rather than appreciating their evolutionary origins.
Brockman, D. K., & van Schaik, C. P. (2005). Seasonality in primates. Cambridge, UK: Cambridge University Press.
Eagles, J. M. (2003). Seasonal affective disorder. British Journal of Psychiatry, 182, 174-176.
Hedge, A. L., & Woodson, H. (1996). Prevalence of seasonal changes in mood and behavior during the winter months in central Texas. Psychiatry Research, 62, 265-271.
Lewy, A. J., Lefler, B. J., Emens, J. S., & Bauer, V. K. (2006). The circadian basis of winter depression. Retrieved from PsycINFO database, 13 January 2007.
Murray, G., Allan, N. B., & Trinder, J. (2003). Seasonality and circadian phase delay: Prospective evidence that winter lowering of mood is associated with the shift toward eveningness. Journal of Affective Disorders, 76, 15-22.
Partonen, T., & Magnusson, A. (2001). Seasonal affective disorder: Practice and research. Oxford: Oxford University Press.
Rawana, J. S. (2006). Integrating the cognitive-specifying and dual vulnerability hypotheses: Implications for vegetative and cognitive/affective differences in seasonal, non-seasonal, and sub-syndromal seasonal depression. Retrieved from PsycINFO database, 12 January 2007.
Smith, K. D. (2005). Seasonal changes in mood and behavior among children and adolescents. Retrieved from PsycINFO database, 12 January 2007.
Smyth, A. (1990). Seasonal affective disorder. London: HarperCollins.
