
Scientific Reports


Does Early-Life Starvation Influence Age-Specific Mortality? Evidence from the Ukraine Famine of 1933

Research Article
Department of Therapy and Geriatrics, D.F. Chebotarev Institute of Gerontology, Kiev, Ukraine
*Corresponding author: Dr. Alexander M. Vaiserman
Department of Therapy and Geriatrics
D.F. Chebotarev Institute of Gerontology
Vyshgorodskaya St. 67
Kiev 04114, Ukraine
E-mail: vaiserman@ukrpost.net
 
Received July 17, 2012; Published August 23, 2012
 
Citation: Vaiserman AM, Pisaruk AV, Zabuga OG, Voitenko VP (2012) Does Early-Life Starvation Influence Age-Specific Mortality? Evidence from the Ukraine Famine of 1933. 1: 258. doi:10.4172/scientificreports.258
 
Copyright: © 2012 Vaiserman AM, et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
 
Abstract
 
A link between famine in early life and the risk of chronic diseases has been established repeatedly. Contradictory evidence exists, however, regarding the effects of early-life famine on age-specific mortality. Studies around the world have found both positive and negative associations between early-life exposure to famine and later-life survival. To examine whether exposure to the Ukraine famine of 1933 in early life has a long-term effect on current adult mortality, age-specific mortality rates in the cohorts born before (1931/1932), during (1933) and after (1934/1935) the peak of the famine were modeled using a Gompertz function. We failed to find any solid evidence that starvation during the Ukraine famine of 1933 significantly affected age-specific mortality rates in cohorts born during the famine. We suggest that selection effects and debilitation effects were roughly equal in magnitude but opposite in direction in the famine-exposed populations studied, resulting in a leveling of the overall effect of the famine.
 
Keywords
 
Age-specific mortality, Early-life exposure, Gompertz function, Ukraine famine of 1933
 
Introduction
 
The effects of calorie restriction (CR; the reduced intake of a nutritious diet) starting in adult life have been widely studied in animal and human models [1]. However, the effects of malnutrition during prenatal or early postnatal development on adult survival and longevity have been examined in only a few studies. Susan E. Ozanne and associates have reported in a series of papers that changes in nutrition during fetal or early postnatal life are sufficient to have marked effects on lifespan in rats and mice [2-4]. Offspring born to normally fed dams but suckled by protein-restricted dams grew slowly during lactation and exhibited significantly longer lifespans when fed ad libitum on standard chow. Conversely, offspring born to protein-restricted dams but suckled by normally fed dams were smaller at birth, showed rapid catch-up growth and had reduced longevity when fed ad libitum on standard chow.
 
For human beings, CR is defined as the deliberate reduction of calorie intake to a level up to 30% below the standard calorie intake, which, for a 70 kg male, is 2,500 calories per day (i.e., down to about 1,750 calories per day). Experimental studies of the long-term effects of exposure to CR during human development are not feasible, both for ethical reasons and because of the prolonged follow-up required. It is therefore important that observational studies, including natural experiments and cross-country studies in suitable populations, can be realized. A natural experiment is 'the naturally occurring circumstances in which subsets of the population have different levels of exposure to a supposed causal factor, in a situation resembling an actual experiment where human subjects would be randomly allocated to groups' [5]. Natural experiments provide an opportunity to reexamine important scientific questions concerning the link between early-life conditions and adult morbidity and mortality. Famine has several features that make it well suited for use as a natural experiment. A causal link between famine in early life and an increased risk of age-associated chronic diseases has been established in a number of studies [6]. Contradictory evidence exists, however, regarding the effects of famine in early life on age-specific mortality rates and life expectancy.
 
Exposure to the Famine in Early Life and Age-Specific Mortality: Evidence around the World
 
Several studies around the world have found both positive and negative associations between early-life exposure to famine and survival in later life. An analysis of the Dutch potato famine of 1846-47 found higher late-life mortality for cohorts born during the famine [7]. After exposure to the famine at birth, men and women lost on average 4 and 2.5 years of life after age 50, respectively. Lower social classes were more affected by early-life exposure to the potato famine than higher social classes.
 
Other studies of the long-term effects of early-life famine on mortality in later life, however, did not find any differences in adult mortality for cohorts born during famine. In a study [8] investigating the effect of prenatal exposure to the Dutch famine of 1944-45 on survival among 2,254 people born in Amsterdam, mortality up to age 50 was highest among those born before the famine (15.2%) and among those exposed to famine in late gestation (14.6%); it was lower among those exposed in mid-gestation (11.2%) or early gestation (11.5%), and lowest among those conceived after the famine (7.2%). These differences were driven by effects on mortality in the first year after birth and were mainly related to nutrition and infections. There was, however, no effect of exposure to famine on mortality after the age of 18. A subsequent study carried out on the same sample [9] could not demonstrate an effect of prenatal exposure to famine on adult mortality up to the age of 57 years. Recently, Song (2009) [10], using retrospective individual mortality records of three cohorts of newborns (1954-58, 1959-62, and 1963-67) in China, examined the effect of being conceived or born during the 1959-1961 Great Leap Forward Famine on postnatal mortality. The results show strong evidence of a short-term (period) effect of the famine, caused directly by starvation during the famine period. After controlling for period mortality fluctuation, however, the famine-born cohort does not show higher mortality than either the pre-famine or the post-famine cohort. In subsequent research aimed at identifying the long-term effects of the 1959-61 Great Leap Forward Famine, Song (2010) [11] determined cohort mortality differences up to age 22 in three cohorts of newborns (1956-58, 1959-61, and 1962-64). In this study, the mortality level of the non-famine cohort caught up to and exceeded that of the famine cohort between ages 11 and 12. The study by Kannisto et al. [12] likewise failed to find any long-term consequences of the Great Finnish Famine of 1866-68 on old-age mortality. Remarkably, in that study survival from birth to age 17 was significantly lower in cohorts born before and during the famine than in cohorts born after the famine; at subsequent ages, including old age, mortality was practically identical in the famine-born cohorts and in the five cohorts born before and after the crisis. Similarly, in an analysis of the sustained effects of the 1974-75 famine on cohort mortality in a rural area of Bangladesh, Razzaque et al. [13] found that mortality in the famine-born cohort was higher during the first and second years of life, while in the famine-conceived cohort it was higher during the first year and lower during the second compared with the non-famine cohort. No significant differences in mortality by cohort were observed between the ages of 2 and 5 years.
 
Data indicating that malnutrition and stress in early life do not increase mortality over the lifetime have also been found in non-human species, such as the blue-footed booby [14]. In these birds, sibling conflict obliges younger brood members to grow up suffering aggressive subordination, food deprivation and elevated stress hormones. A study of 7,927 individuals from two-fledgling and singleton broods from 20 cohorts found no significant evidence of a higher rate of mortality or a lower rate of recruitment in younger fledglings than in elder fledglings or singletons at any age over the 20-year lifespan.
 
To date, no study of the long-term health and mortality outcomes of early-life exposure to the Ukraine famine of 1933 has been carried out. To determine whether exposure to this famine in early life has a long-term effect on current adult mortality, we examined data on age-specific mortality rates in birth cohorts exposed and unexposed to the famine in early life.
 
Ukraine Famine of 1933: Historical Background
 
The Ukrainian famine of 1933 represents a well-defined period of severe malnutrition across the entire population, including pregnant women, lactating mothers, and infants and young children. This famine ('Holodomor') was caused by the Soviet government's forced collectivization of agriculture in the early 1930s. Although famine caused by collectivization affected the majority of the grain-producing regions of the Soviet Union, the especially strict policy was largely limited to Ukraine. From November 1932, peasants in Ukraine were required to return "extra" grain they had previously earned for meeting their targets. State police and Communist Party brigades were sent into these regions to root out any food they could find. In January 1933, the Ukrainian borders were sealed by troops to prevent Ukrainian peasants from fleeing to other republics. By the end of February 1933, approximately 190,000 Ukrainian peasants had been caught trying to flee Ukraine and were forced to return to their villages to starve [15,16]. Urban workers were supplied through a rationing system (and could therefore occasionally assist their starving relatives in the countryside), but rations were gradually cut, and by the spring of 1933 urban residents also faced starvation. Demographic archives recording population mortality reveal that the famine peaked in March-August 1933 [17]; after the grain collection of 1933, the famine began to subside.
 
The reasons for the famine are a subject of intense academic and political debate [15,16]. Some historians suggest that the famine was a consequence of economic problems associated with the radical economic changes implemented during the period of Soviet industrialization. Other scholars, however, have suggested that the Soviet authorities used the famine to prevent the spread of Ukrainian nationalism, and that the famine may therefore fall under the legal definition of genocide.
 
The exact number of deaths from the Great Famine of 1933 is hard to determine, because the Soviet government deliberately obscured its existence and refused to publish any statistics. American and European observers have made estimates ranging from 1 to 10 million deaths in the year 1933, the peak of the famine [18]. The average of these estimates is 5.5 million, and Dalrymple [18] concluded that this number is probably a reasonable estimate. In the early 1990s, the first estimates of the total demographic losses due to the famine were published. However, none of these estimates allows the total population losses to be partitioned into the parts attributable to birth deficits, migration flows and crisis over-mortality. In the early 2000s, Vallin and coauthors undertook reconstitution studies of the different factors responsible for the huge demographic fluctuations that struck Soviet Ukraine, and estimated the annual changes in Ukrainian mortality rates by sex and age for the years 1926 to 1965 [19]. Using sophisticated demographic tools, with forward projection of expected growth from the 1926 census and backward projection from the 1939 census, the authors estimated the number of direct deaths in 1933 at 2.7 million [19].
 
 
Data and Methods
 
Reliable demographic data on age-specific mortality in Ukraine were taken from the period life tables (1x1) of the Human Mortality Database. Mortality rates at ages 30 to 70 were compared in the cohorts born before (1931/1932), during (1933) and after (1934/1935) the peak of the famine. The selection of 1933 as the peak-of-famine year was based on the monthly mortality recorded in Ukraine in 1932-1933, from which it is evident that the main peak of the famine occurred during March to August 1933, when the number of deaths was about 5-10 times higher than before and after the famine (Figure 1).
 
Figure 1: Total monthly mortality in Ukraine (all regions combined) in 1932-1933. Modified and reproduced from Kulchytsky (2007).
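For readers who wish to reproduce this kind of cohort extraction, a minimal sketch in Python with pandas follows. It assumes a whitespace-delimited, HMD-style Mx_1x1 file with Year, Age, Female, Male and Total columns; the file name, header offset and helper function are hypothetical. Cohort death rates are read along the diagonal of the period table, where calendar year = birth year + age:

    import pandas as pd

    def cohort_mortality(mx_file, birth_year, ages=range(30, 71), sex="Total"):
        # Read an HMD-style period Mx_1x1 table (assumed: two header lines
        # before the column names; "." marks missing values).
        df = pd.read_csv(mx_file, skiprows=2, sep=r"\s+", na_values=".")
        df = df[df["Age"] != "110+"]  # drop the open age interval
        df["Age"] = df["Age"].astype(int)
        rates = {}
        for age in ages:
            # Walk the cohort diagonal: the cohort born in birth_year
            # is aged `age` in calendar year birth_year + age.
            row = df[(df["Year"] == birth_year + age) & (df["Age"] == age)]
            if not row.empty:
                rates[age] = float(row[sex].iloc[0])
        return pd.Series(rates, name=f"cohort {birth_year}")

    # Hypothetical usage: death rates at ages 30-70 for the 1933 birth cohort
    # m1933 = cohort_mortality("Mx_1x1.txt", birth_year=1933)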
 
The unexposed 1935 cohort was used as the reference group in these comparisons. A Gompertz function [20] was used to model the age-specific mortality rates in the famine and non-famine cohorts. The Gompertz model assumes that the rate of age-specific mortality increases exponentially over the lifespan. The Gompertz formula for the age pattern of the force of mortality is as follows [21]:
 
μ(x) = R0 exp(αx)
 
where μ(x) is the mortality rate at age x, and R0 and α are constants.
 
By expressing the equation in logarithmic form, we have:
 
ln μ(x) = ln(R0) + αx
 
where α is the rate of increase (a constant) in the force (rate) of mortality at age x, and ln(R0) is the age-independent mortality rate coefficient. The Gompertz parameters α and b [= ln(R0)] were calculated using linear regression. The Gompertz slope (α) is taken as a measure of the rate of aging, and the Gompertz intercept (b) as a measure of the baseline mortality rate (referred to as frailty).
 
It has been shown that the Gompertz function provides a good fit to age-specific mortality data from age 30 to 70 [21]. Therefore, we used mortality data for this age range only.
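As a concrete illustration of this fitting procedure, the sketch below (in Python with NumPy; the input rates here are synthetic, not the cohort data analyzed in this study) estimates the slope α and intercept b = ln(R0) by ordinary least squares on log mortality rates:

    import numpy as np

    def fit_gompertz(ages, mortality_rates):
        # Fit mu(x) = R0 * exp(alpha * x) by linear regression on the
        # log scale: ln mu(x) = ln(R0) + alpha * x.
        x = np.asarray(ages, dtype=float)
        y = np.log(np.asarray(mortality_rates, dtype=float))
        alpha, b = np.polyfit(x, y, deg=1)  # returns (slope, intercept)
        return alpha, b

    # Synthetic Gompertz-like death rates for ages 30-70 (illustration only)
    ages = np.arange(30, 71)
    rates = 0.001 * np.exp(0.08 * (ages - 30))
    alpha, b = fit_gompertz(ages, rates)
    print(f"Gompertz slope (aging rate) alpha = {alpha:.4f}")
    print(f"Gompertz intercept (frailty) b = {b:.4f}")

On the log scale the model is a straight line, so both parameters fall directly out of a first-degree polynomial fit, exactly as in the regression described above.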
 
Results
 
Our study found no increase in adult mortality rates for the cohorts conceived or born during the peak of the Ukraine famine of 1933 compared with the reference 1935 cohort, which was unexposed to the famine (Figure 2, Table 1). As can be seen from Figure 2, the patterns of age-specific mortality are quite similar in all cohorts studied. Thus, exposure to famine throughout prenatal life and early childhood was not associated with an increased rate of aging. There are only two exceptions: the Gompertz intercept b (frailty) was slightly decreased in the male cohort born in 1932 and slightly increased in the female cohort born in 1931 (Table 1). Generally, then, we failed to find any solid evidence that starvation during the Ukraine famine of 1933 significantly affected age-specific mortality rates in the cohorts conceived or born during the famine.
 
Figure 2: Mortality rate trajectories (Gompertz curves) at ages 30 to 70, plotted on a log scale against age, for the cohorts born before, during and after the peak of the Ukraine famine of 1933.
 
Table 1: Gompertz parameters for mortality trajectories in the cohorts born before, during and after the peak of the Ukraine famine of 1933.
 
Discussion
 
Similar results were obtained in most of the studies cited in the Introduction. In most cases, famine-exposed cohorts continued to show higher mortality than non-famine cohorts for some time after the famine ended; the cohort differences in mortality then either disappeared or were even reversed. Such a mortality pattern is commonly termed a "mortality crossover" and is usually considered evidence for both a debilitation effect and a selection effect [22]. The debilitation effect refers to the possibility that adverse conditions (malnutrition, disease, stress, etc.) experienced early in life may permanently impair the health of famine survivors and thus leave an imprint on their mortality risks at all subsequent ages; the selection effect, on the other hand, refers to the probability that famine survivors tend to be unusually well endowed with genetic or congenital traits that reduce mortality risk later in life [23]. The debilitation and selection effects work at different conceptual levels and through different causal mechanisms.
 
Debilitation is the result of biomedical processes at the individual level that can be identified under controlled experimental conditions, and extensive animal studies have successfully identified some types of debilitation effect [24]. The difficulty with measuring debilitation effects in human subjects is mainly ethical and legal: there is no justification for deliberately subjecting pregnant women and newborn babies to harmful conditions such as severe nutritional deprivation and psychological stress for extended periods for research purposes. This is why famine is considered a good opportunity to identify the debilitation effect of early-life exposure to malnutrition in human subjects. The selection effect does not actually increase or decrease an individual's mortality risk in the usual "treatment-effect" sense, because it is not the result of processes operating at the individual level. Instead, it is a cohort-level phenomenon, a statistical artifact produced by an "unfair" comparison in mortality level between a complete cohort (the non-famine cohort) and a positively selected subset of a cohort (the famine cohort) consisting only of the genetically strong and well-endowed individuals [10]. Thus, mortality selection during the famine could affect the frailty distribution in the population. Selection by famine could operate on each link of the chain from conception to death: conceptions may be reduced, and fetal, infant and adult survival may also be affected [25].
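The cohort-level character of the selection effect can be made concrete with a small simulation. The sketch below is a hypothetical illustration, not the analysis performed in this study: individuals are given gamma-distributed frailty, the famine kills preferentially at high frailty, and the surviving subset ends up with lower average frailty, and hence lower subsequent mortality, than the complete cohort, even though no individual's risk was reduced:

    import numpy as np

    rng = np.random.default_rng(0)

    # Individual frailty: a multiplicative factor on the baseline hazard,
    # gamma-distributed with mean 1.0 (shape * scale = 4.0 * 0.25)
    n = 100_000
    frailty = rng.gamma(shape=4.0, scale=0.25, size=n)

    # Famine mortality: the probability of dying rises with frailty
    p_death = 1 - np.exp(-0.5 * frailty)
    survivors = frailty[rng.random(n) > p_death]

    print(f"mean frailty, complete cohort:  {frailty.mean():.3f}")    # ~1.000
    print(f"mean frailty, famine survivors: {survivors.mean():.3f}")  # ~0.889
    # The survivors look "stronger" as a group: a statistical artifact of
    # selection on the frailty distribution, not a change in individual risk.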
 
We assume that the similarity of the cohort patterns of age-specific mortality found in our study can also be attributed to the combined effects of debilitation and selection. It can be suggested that in the male cohort born in 1932, where the Gompertz parameter b (frailty) was slightly decreased, the selection effect prevailed, while in the female cohort born in 1931, where the parameter b was slightly increased, the debilitation effect prevailed. This is predictable because, starting in the womb, females have a survival advantage over males [26], so they can stay alive even under severely debilitating conditions.
 
Our study has one limitation. The Human Mortality Database contains data for the whole population of Ukraine only; no regional data sources are available. As a result, we were forced to analyze together the eastern Ukrainian area, which was exposed to the famine of 1933, and the western Ukrainian area, which was unexposed because that part of Ukraine was under Polish rule until 1939. This problem, however, is unlikely to strongly influence the results, because the population of Western Ukraine represents only up to 20% of the whole population of Ukraine [27]. This limitation can be fully addressed only by future studies that expand and further develop the research in this area.
 
 
References