
Malnutrition screening in the elderly

Dylan Harris, MB MRCP1
Nadim Haboubi, MD FRCP2

1Department of Geriatric Medicine, Nevill Hall Hospital, Abergavenny, South Wales
2Department of Medicine, Nevill Hall Hospital, Abergavenny, South Wales NP7 7EG

Correspondence to: Dr Dylan Harris

Malnutrition is a state in which a deficiency, excess or imbalance of energy, protein and other
nutrients causes adverse effects on body form, function and clinical outcome.1 To justify
screening for this state in the elderly, four criteria must be satisfied: malnutrition must be a
frequent cause of ill-health in this population; it must have a negative effect on outcomes; a
simple, reliable, valid and acceptable screening test must be available to detect those who are
malnourished or at risk of malnutrition; and there must be benefit from nutritional
intervention in those identified by screening. In this review we consider whether these
conditions are satisfied by tests of various kinds.
Is malnutrition a frequent cause of ill-health in the elderly?

The above question is difficult to answer in the absence of universally accepted criteria to
define malnutrition.2-7
In the older population, undernutrition rather than overnutrition is the main cause for concern,
since its relation to morbidity and mortality is stronger than that of obesity.8 The prevalence
of malnutrition increases with escalating frailty and physical dependence.1 The complex
biological process of ageing is accompanied by many socioeconomic factors that also impact
on nutritional status.9 Anorexia and weight loss are common in the elderly and the
physiological decrease in appetite and food intake that accompanies normal ageing can be
augmented by acute and chronic disease.10 Contributing factors are altered smell/taste, poor
dental health and age-related achlorhydria, in addition a decrease in physical activity leads to
reduction of lean body mass and accumulation of body fat.4,10,11 Also important are social
factors such as poverty and isolation, psychological factors such as depression and dementia,
and medical factors such as poor visual acuity and prescribed medication.1,4,10,12 Many of
these are reversible or responsive to treatment.10 In 2002 the Royal College of Physicians
highlighted the over-65s as a nutritionally vulnerable group, with 12% of those living in the
community at high or medium risk of malnutrition. The prevalence was reckoned at 20%
among those in residential accommodation and up to 40% in those admitted to hospital.13 The
College identified nutritional screening as an integral part of clinical practice.
The economic cost of preventable malnutrition to the National Health Service has been
estimated at £260 million a year.14
Does malnutrition have a negative effect on outcomes?

There is ample evidence of the adverse consequences of malnutrition on physical and
psychosocial outcomes,13,15-17 and seemingly these are independent of underlying disease and
disability.18 Malnourished older people are at increased risk of falls, lengthy hospital stays
and rehabilitation, institutionalization, postoperative complications, infections, pressure
ulcers, poor wound healing, impaired muscle and respiratory function and death.1,7,19
Is a suitable screening test available?

Nutritional screening, in its various forms, looks for characteristics associated with nutritional
problems so that the individuals identified can undergo full nutritional assessment and, where
necessary, intervention.20 The process, as well as being quick and simple, needs to be
acceptable to patients and healthcare workers. Furthermore, it must have good sensitivity for
detecting treatable malnutrition, even if the specificity is lower.21,22 Ideally, a single
nutritional marker would be consistently abnormal in patients with protein-energy malnutrition
(high sensitivity), consistently normal in patients without protein-energy malnutrition (high
specificity), nutrition-specific (unaffected by non-nutritional factors), and made normal by
nutritional repletion.

Body mass index (BMI)

Body mass index (weight[kg]/height[m2]) predicts disease risk both in those termed
underweight and in those who are obese.6 The World Health Organization categorizes
underweight as BMI <18.5, normal 18.5-24.9, overweight 25-29.9, obese 30-39.9 and
extreme obesity ≥40. However, BMI may be unreliable in the presence of confounding
factors such as oedema or ascites, and may not identify significant unintentional weight loss
if used as a single assessment.17,25,26 Furthermore, reliable measurement of height can be
difficult in the elderly because of vertebral compression, loss of muscle tone and postural
changes.17,27 For this reason, some screening tools use alternatives such as measurement of
ulna length.8
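The BMI formula and the WHO bands above translate directly into a lookup; a minimal sketch for illustration (function names are ours, not from any screening tool):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height squared (m^2)."""
    return weight_kg / height_m ** 2

def who_category(bmi_value: float) -> str:
    """Classify a BMI value using the WHO cut-offs quoted in the text."""
    if bmi_value < 18.5:
        return "underweight"
    if bmi_value < 25:
        return "normal"
    if bmi_value < 30:
        return "overweight"
    if bmi_value < 40:
        return "obese"
    return "extreme obesity"

# Example: 50 kg at 1.70 m gives a BMI of about 17.3 (underweight)
print(who_category(bmi(50, 1.70)))
```

Note that such a calculation inherits all the caveats above: it is only as reliable as the height measurement, which is exactly what is hard to obtain in this population.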


Anthropometric measurements

Skinfold thickness can be measured with standardized callipers but requires a skilled
technique. Several different sites can be used: subscapular, supra-iliac, biceps, triceps, thigh,
calf. The distribution of skinfold thickness varies with ageing and between sexes and between
ethnic groups.23
Use of arm circumference depends on the assumption that the mass of the muscle group is
proportional to its protein content and also reflects total body muscle mass.23 Mid-upper arm
circumference is a helpful indicator of malnutrition applicable in ill patients (normal >23 cm in
males, >22 cm in females).28
Anthropometric indices are simple and inexpensive to obtain,29 but have to be interpreted in
the light of age, gender and ethnicity.27 Furthermore, some are unreliable in conditions that
cause limb oedema.
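Reading the mid-upper arm circumference cut-offs quoted above as >23 cm in males and >22 cm in females (the comparison symbols are garbled in this copy), a minimal check might look like the following sketch (names are ours):

```python
def muac_below_normal(muac_cm: float, male: bool) -> bool:
    """Flag a mid-upper arm circumference at or below the quoted
    normal cut-off: >23 cm for males, >22 cm for females."""
    cutoff_cm = 23.0 if male else 22.0
    return muac_cm <= cutoff_cm
```

As the text notes, such a flag still needs interpreting against age, gender and ethnicity, and is unreliable with limb oedema.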

Biochemical markers
Serum proteins synthesized by the liver have been used as markers of nutrition: albumin,
transferrin, retinol-binding protein and thyroxine-binding prealbumin.6 Of these, serum
albumin has been most widely adopted because it predicts mortality and other outcomes (for
example, perioperative complications) in older people. Nutritional state, however, is not the
only factor affecting these proteins, others being inflammation and infection. This limits their
usefulness, especially in the acutely ill.5,6,16 In addition, the long half-life of albumin means
that serum albumin does not respond to short-term changes in protein and energy intake.16
Transferrin is a more sensitive indicator of early protein-energy malnutrition but is
unreliable in conditions including pregnancy, iron deficiency, hypoxaemia, chronic infection
and hepatic disease.16 A low total lymphocyte count signifies a poor prognosis and is
independent of low serum albumin.6,11 Malnutrition contributes to age-related immune
dysregulation, including decreased lymphocyte proliferation.10 A low total cholesterol has
also been correlated with risk of malnutrition3 and assessment of vitamin and trace element
status is also important (including thiamine, riboflavin, pyridoxine, calcium, vitamin D,
B12, folate and ferritin).
No biochemical marker on its own offers a satisfactory screening test. Their main value is in
more detailed assessment (particularly risk stratification of patients identified by screening)
and for monitoring.18

Malnutrition screening tools

In view of the limitations of individual methods, over fifty combinations have been tried,
with different criteria, scoring systems, intended users, and acceptability.8 Below we discuss
two that have been well validated.
The Malnutrition Universal Screening Tool (MUST) derives a score classifying malnutrition
risk as low, medium or high on the basis of three components: BMI, history of unexplained
weight loss and acute illness effect.8 MUST was developed primarily for use in the
community (where it predicts admission rates and need for general practitioner visits) but has
also been shown to have high predictive validity in the hospital environment (length of
hospital stay, mortality in elderly wards, discharge destination in orthopaedic patients).8,21
Stratton et al.30 compared it with various other validated screening tools and found it as good
as and faster than most (3-5 minutes).
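The three MUST components combine into a single risk band. The sketch below uses the scoring weights as commonly published for MUST (BMI, percentage unplanned weight loss over 3-6 months, acute disease effect); it is illustrative only, the function name is ours, and the BAPEN documentation should be checked before any real use:

```python
def must_risk(bmi: float, pct_weight_loss: float,
              acutely_ill_no_intake: bool) -> str:
    """Return the MUST risk band from its three components.

    Commonly published weights: BMI >20 = 0, 18.5-20 = 1, <18.5 = 2;
    unplanned weight loss <5% = 0, 5-10% = 1, >10% = 2;
    acute illness with no nutritional intake for >5 days = 2.
    """
    score = 0
    if bmi < 18.5:
        score += 2
    elif bmi <= 20:
        score += 1
    if pct_weight_loss > 10:
        score += 2
    elif pct_weight_loss >= 5:
        score += 1
    if acutely_ill_no_intake:
        score += 2
    # Total: 0 = low risk, 1 = medium risk, >=2 = high risk
    return "low" if score == 0 else "medium" if score == 1 else "high"
```

The tool's alternative measures (such as ulna length, mentioned earlier) exist precisely because the BMI input here is often the hardest component to obtain reliably.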

The Mini Nutritional Assessment (MNA) was developed to evaluate the risk of malnutrition in
the elderly in home-care programmes, nursing homes and hospitals. In theory it should be
better at identifying frail elderly patients at risk of undernutrition since it encompasses
physical and mental aspects of health;20,21,31 moreover, it detects risk of malnutrition at a time
when albumin levels and BMI are still normal.32 The score for screening is derived from six
components: reduced food intake in the preceding three months; weight loss during the
preceding three months; mobility; psychological stress or acute disease in the preceding three
months; neuropsychological problems; body mass index.20
The MNA has predictive validity for adverse health outcome, social functioning, mortality
and rate of visits to the general practitioner as well as length of hospital stay, likelihood of
discharge to a nursing home and mortality.20,21 A score of 11 or more on the screening
component of the MNA offers strong evidence that malnutrition is absent.33 The MNA has
also shown itself practical and reliable.19,21,34
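The six-component screening score can be sketched as a checklist sum. This is an illustrative outline only: the item weightings below follow the standard MNA short form (maximum 14 points), the cut-off follows the figure quoted in this review (published versions of the form commonly treat 12 or more as normal), and the official MNA form should be consulted for the exact wording and scoring of each item.

```python
# Maximum points per screening item, per the standard MNA short form
MNA_MAX = {
    "food_intake_decline": 2,
    "weight_loss": 3,
    "mobility": 2,
    "acute_stress_or_disease": 2,
    "neuropsychological": 2,
    "bmi": 3,
}

def mna_screen(scores: dict) -> str:
    """Sum the six screening items (max 14) and apply the cut-off
    quoted in this review: 11 or more suggests malnutrition is absent."""
    for item, value in scores.items():
        if not 0 <= value <= MNA_MAX[item]:
            raise ValueError(f"{item} score out of range")
    total = sum(scores.values())
    return "normal" if total >= 11 else "possible malnutrition"
```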
Does nutritional intervention benefit those identified by screening?

The above key question is made difficult to answer by the wide range of interventions,
differences between populations studied and diversity of outcome measures.14,35,36 Most trials
have used primary nutritional outcome measures such as weight change and dietary intake
rather than mortality, morbidity and functional outcomes.14 On existing evidence, however,
dietary supplementation does appear beneficial in terms of weight gain, arm muscle
circumference, length of hospital stay and mortality.14,35-37 Weight gain is an outcome
measure of particular importance since it correlates with improvements in immune function,
muscle function and functional independence.36 In meta-analyses of the benefits of
supplementation in elderly people at risk from malnutrition, a significantly reduced mortality
is seen in the following groups: patients defined as undernourished; supplements of more
than 400 kcal per day; age over 75; supplements for more than 35 days; those acutely unwell;
patients in hospital or nursing homes.7
Large multicentre randomized controlled trials are needed to assess the benefits of nutritional
intervention in clearly defined patient groups.14,36 Evidence has been offered that nutritional
support is cost-effective, particularly in terms of hospital and nursing care.38
Conclusion

The importance of nutrition is now specified in documents such as the UK National Service
Framework for older people but there is no consensus on methods of detection.
Anthropometry and biochemical markers have drawbacks, and the choice falls on screening
tools employing combinations that can be applied without specific skills or training.
In a particular hospital or community, there is much to be said for use of a single such tool,
and one that attracts wide support is MUST, endorsed by the British Dietetic Association,

the British Association for Parenteral and Enteral Nutrition, the Royal College of Nursing,
the Registered Nursing Homes Association and the Royal College of Physicians.


Serum Prealbumin: Is It a Marker of Nutritional Status or of Risk of Malnutrition?

1. Alan Shenkin
+ Author Affiliations
1. Division of Clinical Chemistry, University of Liverpool, Liverpool, United Kingdom,
Fax 44-151-7065813, E-mail
Hospitalized patients who are undernourished are more likely to develop clinical
complications and have relatively poor outcomes, with increased length of stay (LOS) (1) and
higher mortality compared with well-nourished patients. Provision of adequate nutritional
support reduces the complication rate and improves outcome (1). Considerable efforts have
therefore been made to identify patients at risk of malnutrition, with a view to early provision
of nutritional support. A full nutritional assessment is a complex process, involving detailed
assessment of nutritional intake, changes in body composition, signs or symptoms of
nutritional deficiency or excess, and laboratory tests, and it should include not only protein-energy status but also vitamins and essential trace elements. Because of this complexity,
rapid screening tests have been sought to identify patients who may already be malnourished
or are at risk of malnutrition, who can then undergo a more detailed nutritional assessment.
The screening tools with the most validation for protein-energy malnutrition include body
mass index (weight/height2) in conjunction with recent changes in weight and a simple
assessment of illness severity (2). In many patients, however, obtaining an accurate
measurement of current and previous weight to allow calculation of rate of weight loss may
not be possible, so clinicians have sought a rapid, reliable laboratory method, usually
involving plasma proteins, to obtain comparable information.
Serum albumin is of virtually no value in assessment or monitoring of nutritional status (3)
but is mentioned here because, surprisingly, there still remain some clinicians who use it as
part of their nutritional assessment. The main factor affecting plasma albumin concentration
in patients is the rate of transcapillary escape into the interstitial fluid. This transcapillary
escape of albumin is markedly increased in disease [as part of the systemic inflammatory
response syndrome (SIRS)], leading to decreased plasma albumin concentrations (4). It is
inevitable that postoperative patients and patients with severe infection will have low plasma
albumin concentrations. The more severe the disease, the lower the albumin, and therefore
the lower the albumin, the worse the prognosis.
Prealbumin, also known as transthyretin, has a half-life in plasma of 2 days, much shorter
than that of albumin. Prealbumin is therefore more sensitive to changes in protein-energy

status than albumin, and its concentration closely reflects recent dietary intake rather than
overall nutritional status (5). Because of this short half-life, however, the concentration of
prealbumin falls rapidly as a result of the fall in its synthetic rate when there is a
reprioritization of synthesis toward acute-phase proteins such as C-reactive protein (CRP),
fibrinogen, or α1-acid glycoprotein. Moreover, prealbumin concentration in plasma, like that
of albumin, is affected by changes in transcapillary escape. Hence, interpretation of plasma
prealbumin is difficult in patients with infections, inflammation, or recent trauma (4). Despite
this difficulty, interest in prealbumin as a potential marker of nutritional status in certain
groups of patients led to the First International Congress on Transthyretin in Health and
Disease in 2002 (6).
Some studies have screened patients on the basis of their prealbumin on admission, with
values <100 mg/L being regarded as indicating severe risk of protein-energy malnutrition,
100-170 mg/L moderate risk, and >170 mg/L no risk. This type of classification, however,
may often reflect severity of illness and the magnitude of the SIRS rather than nutritional
status. When screening protocols that use prealbumin have been compared with a 2-stage
process involving a screening questionnaire followed by an assessment by a professional
dietitian, the prealbumin protocols identified many more patients considered to be
malnourished (7)(8). The authors have tended to interpret this finding as showing the
increased sensitivity of prealbumin in detecting malnutrition, rather than the lack of
specificity of this test.
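The admission cut-offs quoted above amount to a three-band lookup; a minimal sketch (band labels mirror the text, and, as the text stresses, a low value may reflect SIRS and illness severity rather than nutritional status):

```python
def prealbumin_risk_band(prealbumin_mg_per_l: float) -> str:
    """Band an admission prealbumin (mg/L) using the cut-offs quoted
    in the text: <100 severe, 100-170 moderate, >170 no risk.

    Caveat from the text: this may track severity of illness and the
    magnitude of the SIRS rather than protein-energy status.
    """
    if prealbumin_mg_per_l < 100:
        return "severe risk"
    if prealbumin_mg_per_l <= 170:
        return "moderate risk"
    return "no risk"
```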
Nonetheless, these results do suggest a place for prealbumin measurement soon after
admission. In this issue of Clinical Chemistry, Devoto et al. (9) report their investigation of
the concordance of prealbumin measurement, made on day 3 after admission, with a Detailed
Nutritional Assessment (DNA) as a reference method to detect protein-energy malnutrition.
Intriguingly, they found excellent correlation of prealbumin with the DNA, in patients with
and without increased CRP (>5 mg/L). Devoto et al. (9) interpret this correlation as
indicating that prealbumin is a good screening tool for malnutrition, in both the presence and
absence of SIRS. Closer examination raises some concerns, however. First, the DNA score is
not affected only by nutritional status: it contains variables affected both by nutritional
status and by inflammation. Low albumin and low cholesterol, both of which are influenced
by SIRS, may account for up to 50% of DNA scores classified as malnourished. Thus, in
the group with increased CRP, it is not surprising that there is good concordance between
prealbumin and DNA. Similarly, because the DNA does contain true nutritional indicators
such as low nutritional intake or weight loss that lead to low prealbumin, it can be expected
that in patients without increased CRP, there would also be good concordance between DNA
and prealbumin concentration. Moreover, nearly half the patients in their study either had
undergone trauma or had an infection, so their CRP was probably stabilizing or decreasing on
treatment during the 3 days before prealbumin was measured. As noted below, an intake of as
little as 66% of the nutritional requirement could be associated with an increase in
prealbumin in such patients, and hence for many patients with an inadequate intake, the
prealbumin was already rising.
So, one-off measurements of prealbumin are of limited use in screening for malnutrition. A
better interpretation of the nutritional component could probably be achieved from 2
measurements, 3 to 5 days apart, to assess the trend both in prealbumin and in CRP.
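The suggested two-sample approach can be illustrated as a rule of thumb. This is purely a sketch of the reasoning in the text, not a validated clinical rule; function and label names are ours:

```python
def interpret_paired_trend(prealb_first: float, prealb_second: float,
                           crp_first: float, crp_second: float) -> str:
    """Read paired prealbumin and CRP trends (both mg/L), sampled
    3-5 days apart, following the reasoning in the text. Illustrative
    only; not a validated clinical rule.
    """
    prealb_rising = prealb_second > prealb_first
    crp_falling = crp_second < crp_first
    if prealb_rising and crp_falling:
        # The rise may reflect resolving inflammation, nutrition, or both
        return "improving; nutritional component uncertain"
    if prealb_rising:
        return "rising despite stable or rising CRP: likely nutritional improvement"
    if crp_falling:
        return "flat or falling while CRP falls: intake likely inadequate"
    return "ongoing inflammation: uninterpretable as a nutrition marker"
```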
What about prealbumin in monitoring adequacy of nutritional intake? In seriously ill patients,
very low prealbumin concentrations are typical and are inversely related to CRP (10).

Therefore, an increase in prealbumin in response to feeding might reasonably be interpreted
as a sign of either improvement of metabolic status or improvement of nutritional status. In
some studies, changes in prealbumin concentration have correlated with changes in nitrogen
balance (11). It has been suggested, but without experimental proof, that a weekly increase of
>40 mg/L in prealbumin concentrations reflects a switch to anabolism (12). An important
observation was that patients in intensive care units (ICUs) receiving an approximately
adequate nutritional intake showed a rise in prealbumin concentrations of 40 mg/L during 1
week, whereas a control group receiving an inadequate intake still showed an increase, but it
was somewhat smaller, at 20 mg/L, while CRP concentrations were decreasing substantially
(13). In one ICU study, a loss of total body protein was observed along with an increase in
prealbumin and a decrease in CRP (10). Although it is possible that a patient with a daily
deficit of 200-400 kcal will still show improvement in visceral protein concentrations
associated with a reduction in SIRS, nutritional intake with a deficit of this magnitude would
not support muscle protein anabolism, which is a key objective of adequate nutritional
support (14).
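The suggested (and, as the text notes, experimentally unproven) weekly threshold can be written as a one-line check; illustrative only:

```python
def weekly_rise_suggests_anabolism(prealb_week_ago_mg_l: float,
                                   prealb_now_mg_l: float) -> bool:
    """Apply the suggested, unproven rule that a weekly rise of
    >40 mg/L in prealbumin reflects a switch to anabolism."""
    return (prealb_now_mg_l - prealb_week_ago_mg_l) > 40
```

On this rule, the ICU control group above (a 20 mg/L rise on an inadequate intake) would not qualify, while the adequately fed group (a 40 mg/L rise) sits exactly at the threshold.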
Interpretation of nutritional status data may differ for patients in ICUs compared with more
typical patients requiring nutritional support. Nonetheless, Raguso et al. (14) reached the
interesting conclusion that in the early acute phase, an increase in prealbumin indicates that at
least 65% of protein-energy needs have been met. More importantly, they also concluded that
there had been a reprioritization of hepatic protein synthesis away from acute-phase proteins.
At a later stage, the prealbumin concentration may be a more accurate measure to assess the
adequacy of nutritional intake.
Given the complexity of this issue, it is not surprising that there are conflicting reports in the
literature regarding whether increasing prealbumin concentrations are associated with a
reduced hospital LOS. For example, in one study, ICU LOS was not affected by nutritional
therapy that led to a rise in prealbumin (13), whereas in a small study of medical and surgical
patients, Mears (15) found that when patients were classified as malnourished based on a
local protocol, those patients who were randomized to receive nutritional supplementation
had greater improvement in prealbumin, along with a significant reduction of 13 days in
LOS compared with those receiving standard care. Moreover, in a landmark study performed
more than 20 years ago, Bastow et al. (16) showed that overnight enteral tube feeding in
severely malnourished patients with femoral neck fracture led to a highly significant
reduction of rehabilitation time in hospital, associated with an increase in prealbumin. These
studies therefore confirm that in non-ICU patients, increasing prealbumin concentrations are
likely to be associated with an improvement in outcome, but the key question as to how any
change in prealbumin (taken together with CRP) can be used to inform changes in nutritional
therapy in an individual patient will require careful further studies.
Although prealbumin has been considered a potential nutritional marker for some time, it is
only in recent years, with improvement in analytical methods and recognition of the
importance of nutrition in patient outcome, that inclusion of prealbumin as part of inpatient
screening or monitoring profiles has been considered. As for all laboratory tests, an
understanding of the reasons for a low result will allow more appropriate action to be taken.
An important concept is the difference between patients who are classified as malnourished
and those at risk of malnutrition. Patients who are malnourished are already in a situation in
which they are likely to develop the complications of malnutrition and therefore require
active nutritional support without further delay, whereas patients at risk of malnutrition may
become malnourished over the next few days unless their disease process improves so that

metabolic demand decreases and oral nutritional intake increases, or active steps are taken to
ensure that their nutritional intake meets their ongoing metabolic demands.
A low prealbumin concentration can therefore be regarded primarily as a signal identifying
the at-risk patient who requires careful assessment and monitoring and for whom nutritional
support may be needed as part of the treatment plan. Nutritional assessment and monitoring
protocols should be developed in all hospitals treating patients with acute or chronic illness,
and these protocols should include assessment of adequacy of nutritional intake and possibly
serial measurements of plasma prealbumin and CRP concentrations.

J Am Diet Assoc. 2004 Aug;104(8):1258-64.

Hepatic proteins and nutrition assessment.

Fuhrman MP, Charney P, Mueller CM.

Coram Health Care, St. Louis, MO, USA.

Serum hepatic protein (albumin, transferrin, and prealbumin) levels have historically been
linked in clinical practice to nutritional status. This paradigm can be traced to two
conventional categories of malnutrition: kwashiorkor and marasmus. Explanations for both of
these conditions evolved before knowledge of the inflammatory processes of acute and
chronic illness were known. Substantial literature on the inflammatory process and its effects
on hepatic protein metabolism has replaced previous reports suggesting that nutritional status
and protein intake are the significant correlates with serum hepatic protein levels. Compelling
evidence suggests that serum hepatic protein levels correlate with morbidity and mortality.
Thus, serum hepatic protein levels are useful indicators of severity of illness. They help
identify those who are the most likely to develop malnutrition, even if well nourished prior to
trauma or the onset of illness. Furthermore, hepatic protein levels do not accurately measure
nutritional repletion. Low serum levels indicate that a patient is very ill and probably requires
aggressive and closely monitored medical nutrition therapy.

Surrogate Nutrition Markers, Malnutrition,

and Adequacy of Nutrition Support
1. David S. Seres, MD, CNSP
1. Departments of Medicine and Surgery, Beth Israel Medical Center and Albert
Einstein College of Medicine, New York, New York
1. Correspondence: David S. Seres, MD, CNSP, 170 Second Avenue, Apartment 2D,
New York, NY 10003. Electronic mail may be sent to

Surrogate nutrition markers are used to assess adequacy of nourishment and to define
malnutrition despite evidence that fails to link nourishment, surrogate markers, and outcomes.
Markers such as serum levels of albumin, prealbumin, transferrin, and IGF-1 and delayed
hypersensitivity and total lymphocyte count may be valid to help stratify risk. However, it is
not appropriate to consider these as markers of adequacy of nourishment in the sick patient.

Lab Markers of Malnutrition

Aka: Lab Markers of Malnutrition, Malnutrition Labs
1. Indications
1. Unintentional Weight Loss
2. Suspected Protein and Calorie Malnutrition
3. Hospitalized patients with malnutrition risk factors
1. No food intake within the last 5 days
2. Nutrient losses (e.g. Protein-losing Enteropathy)
3. Serum Albumin <3.2 g/dl
4. Chronic debilitating condition
1. Alcoholism
2. Cancer
3. Diabetes Mellitus
4. Renal disease
5. Advanced age
2. Pathophysiology: Protein-Calorie Malnutrition Consequence
1. Malnutrition alters metabolic function
1. Insulin levels
2. Growth Hormone levels
3. Cortisol levels
4. Decreases hepatic function
5. Diminishes mineral stores
2. Malnutrition predisposes to poor clinical outcomes
1. Associated with higher rate of mortality

2. Associated with prolonged hospitalization

3. Associated with slower clinical improvement
3. Lab Indicators of Malnutrition in Adults
1. Serum Prealbumin <15 mg/dl
1. Best marker for malnutrition
2. See Prealbumin for interpretation and monitoring
2. Serum Albumin <3.4 g/dl
3. Serum Transferrin <200 mg/dl
4. Total Lymphocyte Count <1500/mm3
5. Total Cholesterol <160 mg/dl
4. References
1. Beck (2002) Am Fam Physician 65(8):1575-8
2. Grazewood (1998) J Fam Pract 47(1): 19-25
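The laboratory thresholds listed in section 3 above can be gathered into a single screening check. This is a sketch with our own parameter names; note the mixed units (prealbumin and transferrin in mg/dl, albumin conventionally in g/dl, lymphocytes per mm3, cholesterol in mg/dl), and recall the caveat from the preceding articles that these markers track illness severity as much as nutrition:

```python
# Cut-offs from the list above; a value below its cut-off is flagged
LAB_CUTOFFS = {
    "prealbumin_mg_dl": 15,
    "albumin_g_dl": 3.4,
    "transferrin_mg_dl": 200,
    "lymphocytes_per_mm3": 1500,
    "cholesterol_mg_dl": 160,
}

def malnutrition_lab_flags(labs: dict) -> list:
    """Return the names of any supplied labs below their cut-off.
    Missing labs are simply skipped, not treated as abnormal."""
    return [name for name, cutoff in LAB_CUTOFFS.items()
            if name in labs and labs[name] < cutoff]
```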