Hospital Based Nutrition Support: A Review of the Latest Evidence

Published Date: 2017-09-30
DOI: 10.4172/2472-1921.100057

Christopher J. Tignanelli1* and Jill Cherry-Bukowiec2

1Department of Surgery, University of Minnesota, Minneapolis, MN, USA

2Department of Surgery, University of Michigan, Ann Arbor, MI, USA

*Corresponding Author:
Christopher J. Tignanelli
Department of Surgery
University of Minnesota
420 Delaware St SE, MMC 195
Minneapolis, MN, USA
Tel: (612) 626-1968
Fax: (612) 626-0439
E-mail: ctignane@umn.edu

Received Date: September 08, 2017; Accepted Date: September 25, 2017; Published Date: September 30, 2017

Citation: Tignanelli CJ, Cherry-Bukowiec J (2017) Hospital Based Nutrition Support: A Review of the Latest Evidence. J Clin Nutr Diet 3:22. doi: 10.4172/2472-1921.100057


Abstract

Meeting appropriate nutritional demands in the inpatient setting is a fundamental aspect of optimal patient care. Optimizing nutrition delivery and preventing malnutrition can have a significant positive effect on clinical outcomes and costs of care. Despite extensive research, many questions remain regarding the delivery of nutrients to hospitalized patients, especially the critically ill. Substantial advances have been made over the past decade, and landmark studies have put an end to several long-standing controversies, such as the broad utilization of immunonutrition. However, many questions remain unanswered; for example, how do we objectively define malnutrition? Cutting edge research in the areas of morphomics and metabolomics is raising new questions and is poised to transform how we answer today's. In this review, we summarize the historical pedagogy underlying nutritional practice alongside contemporary evidence supporting current practice guidelines. Furthermore, we identify and explore key barriers preventing the rapid identification and treatment of malnutrition. We introduce two emerging technologies at the forefront of nutritional research that may eventually overcome current barriers. Finally, we discuss key populations at especially high risk for the development of malnutrition.

Keywords

Hospitalized patients; Nutrition support; Malnutrition; Wound healing

Introduction

The importance of appropriate nutritional therapy in hospitalized patients was brought to light in a 1974 publication by Charles Butterworth, "The skeleton in the hospital closet." In this article, he encouraged increased physician attention to what he termed "iatrogenic malnutrition" [1]. More than four decades after this publication, appropriate attention to nutritional management remains lacking. For example, a 2001 study of Medicare patients at risk for pressure ulcers showed that 76% were malnourished, yet only 34% of patients at risk for pressure ulcers received a nutritional consultation [2]. This neglect of appropriate nutritional care extends beyond the inpatient setting to medical and graduate medical education, where nutrition remains underrepresented in training programs [3,4].

The appropriate nutrition of hospitalized patients has a direct effect on patient outcomes. Creating systems aimed at identifying patients at significant nutritional risk and monitoring adherence to evidence-based nutritional care practices has the potential to significantly improve outcomes and reduce costs. Patients who develop malnutrition represent a missed opportunity to deliver the critical nutrients needed and to prevent the ramifications that come with a malnourished state. Despite increased calls for better nutritional screening, 1 in 3 patients who present to a hospital already meet the criteria for malnutrition [5]. Malnutrition has significant clinical and economic ramifications. For example, surgical patients who are merely at risk for malnutrition have a two-fold increased risk of post-operative complications, significantly longer length of stay (LOS), increased mortality, and higher costs [6]. A 2013 review on malnutrition demonstrated that hospital malnutrition is associated with increased rates of pressure ulcers, impaired wound healing, infections (Clostridium difficile, surgical site infections (SSI), pneumonia, mediastinitis, catheter associated urinary tract infection (CAUTI)), falls, LOS, readmissions, costs, and mortality [5-8]. However, malnutrition is preventable and treatable, with multiple meta-analyses demonstrating that optimal delivery of appropriate nutrition is associated with significantly reduced mortality [9,10]. This review summarizes key barriers preventing the early recognition and treatment of malnourished patients, as well as the controversies and practice guidelines surrounding current clinical practice.

Defining Malnutrition

Malnutrition is a major public health issue associated with substantial medical and economic implications. Hospitalized patients are at significant risk for malnutrition, with an estimated at-risk prevalence ranging between 13% and 78% [11]. Unfortunately, great controversy surrounds the formal definition of malnutrition, with multiple societies proposing contrasting definitions. Furthermore, this lack of a consistent objective measure for malnutrition introduces bias into diagnosis, putting many patients at risk for misdiagnosis. This lack of an objective definition is likely because many nutritional parameters may be a mere reflection of the severity of patient illness rather than of nutritional status. It also severely limits the direct comparison of nutritional studies. A formal definition provided by a joint consensus statement from the Academy of Nutrition and Dietetics and the American Society for Parenteral and Enteral Nutrition (ASPEN) requires at least 2 of the following 6 characteristics for a diagnosis of malnutrition: insufficient energy intake, weight loss, loss of muscle mass, loss of subcutaneous fat, fluid accumulation, and diminished functional status as measured by hand grip strength [12]. As previously stated, the subjective nature of this definition leaves many patients at risk for malnutrition that is underdiagnosed and undertreated. Many have attempted to define malnutrition in terms of objective measures such as weight, body mass index (BMI), mid arm circumference, albumin, prealbumin, retinol binding protein, and total lymphocyte count. Unfortunately, these measures alone are not optimal, and are impacted by fluid shifts, inflammation, and many other factors not directly related to nutrition status. There remains a need for an evidence-based and objective definition of malnutrition.
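For illustration only, the consensus rule can be expressed as a simple "2 of 6" check. This is a minimal sketch, not a validated clinical tool: the field names are our own shorthand for the consensus characteristics, and severity grading (non-severe versus severe malnutrition) is deliberately omitted.

```python
# Sketch of the AND/ASPEN "at least 2 of 6 characteristics" rule.
# Characteristic labels paraphrase the consensus statement cited above;
# the data structure and boolean handling are illustrative assumptions.

ASPEN_CHARACTERISTICS = (
    "insufficient_energy_intake",
    "weight_loss",
    "loss_of_muscle_mass",
    "loss_of_subcutaneous_fat",
    "fluid_accumulation",
    "diminished_grip_strength",
)

def meets_aspen_malnutrition_criteria(findings: dict) -> bool:
    """Return True if at least 2 of the 6 consensus characteristics are documented."""
    present = sum(bool(findings.get(c, False)) for c in ASPEN_CHARACTERISTICS)
    return present >= 2

# Example: documented weight loss plus reduced grip strength screens positive.
print(meets_aspen_malnutrition_criteria(
    {"weight_loss": True, "diminished_grip_strength": True}))  # True
```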

While an objective definition is lacking, the risk factors for malnutrition are better defined and fall into two sets. Patient-specific risk factors include advanced age, poor functional status, specific disease processes (cancer, alcoholism, gastrointestinal disease, and surgery), and treatments such as mechanical ventilation [11]. In contrast, organizational factors are a major and preventable source of malnutrition risk. These factors include failure to recognize malnutrition, lack of nutritional screening, lack of training, confusion regarding responsibility, and failure to record height and weight. Patient-specific risk factors are less likely to be modifiable, as the majority are present on admission or due to severity of disease; organizational risk factors, however, are potentially high-yield targets for nutritional intervention. Quality improvement efforts should focus on reducing the institutional barriers that enable the underdiagnosis and undertreatment of malnutrition.

Indicators of Nutritional Status

While it is important to be aware of the risk factors for malnutrition, the ability to objectively assess a patient's nutritional state is perhaps even more important. Historically, this has been done through the measurement of albumin, with hypoalbuminemia defined as lower than 3.5 g/dL. However, as previously mentioned, this and other "objective" measures of malnutrition have significant clinical flaws. Despite these flaws, pre-operative albumin evaluation is so ubiquitous that it is routinely ordered prior to 75% of all elective operations, and it has been shown to be one of the strongest pre-operative predictors of post-operative morbidity [13]. However, albumin alone is a poor marker of nutritional status for hospitalized patients. Volume status, renal and liver disease, and enteropathies are just a few of the disease processes that can alter albumin levels. Another limitation of albumin is its long half-life (20 days), which diminishes its reliability for assessing short-term nutritional changes. Another indicator routinely ordered is prealbumin. One key advantage of prealbumin is that its half-life is only 2 days, allowing short-term trends in a patient's nutritional status to be followed (Table 1). Some evidence suggests that improving prealbumin levels may be a prognostic marker for certain subsets of patients (i.e., traumatic brain injury and renal injury); however, caution is recommended if using prealbumin to guide nutritional therapy, as many disease processes affect its level (i.e., steroid use, alcoholism, inflammatory states, and micronutrient deficiencies) [14,15]. Unfortunately, in critically ill patients, where objective markers of nutritional status are needed most, prealbumin suffers from its role as an acute phase reactant, and thus is affected by many of the same disease states that affect albumin levels. Furthermore, renal disease has been shown to increase prealbumin levels acutely. While these values can serve as potential markers of nutritional status, it is important to point out that no studies have shown that correcting these values improves outcomes in critically ill patients. Routine monitoring of acute phase reactants and attempts to correct abnormalities are thus not recommended [16,17].

Parameter | Normal | At risk for malnutrition | Half-life
Albumin | 3.5–5.0 g/dL | Mild: 3.0–3.4 g/dL; Moderate: 2.4–2.9 g/dL; Severe: <2.4 g/dL | 20 days
Prealbumin | 16–40 mg/dL | Mild: 10–15 mg/dL; Moderate: 5–9 mg/dL; Severe: <5 mg/dL | 2 days
C-reactive protein (CRP) | <0.8 mg/dL | – | 19 hours
Transferrin | 200–400 mg/dL | Mild: 150–200 mg/dL; Moderate: 100–149 mg/dL; Severe: <100 mg/dL | 9 days
Nitrogen balance | ±4 g/day | – | –
Triceps skin fold thickness (1) | Male: 11–12.5 mm; Female: 15–16.5 mm | Male: <6.1 mm; Female: <11.6 mm | –
Body mass index (BMI) | 18.5–24.9 kg/m² | <18.5 or >24.9 | –
Calf circumference (2) | 31–33 cm | <31 cm | –
Mid upper arm circumference (MUAC) (3) | Male: 25–29 cm; Female: 23.5–28.5 cm | Male: <24.7 cm; Female: <23.5 cm | –
Waist to hip circumference ratio (WHR) (3) | Male: <0.90; Female: <0.85 | Male: >0.90; Female: >0.85 | –
Mid arm muscle circumference (MMC) (4) | Male: 23–25 cm; Female: 20–23 cm | Male: <21.1 cm; Female: <19.2 cm | –

Table 1: Chemical and anthropometric indicators of nutritional status.

C-reactive protein (CRP) is another acute phase reactant that has been theorized to help gauge nutritional status (Table 1). Given the effect of inflammation on the utility of prealbumin and albumin for nutritional assessment, the addition of CRP measurement has been suggested to differentiate inflammatory states from malnutrition. Normally, CRP levels are inversely correlated with prealbumin levels in patients with inflammation [18]. However, in patients with weight loss and malnutrition, CRP and albumin both remain low [19]. While data are limited, one study evaluated the routine utilization of CRP, albumin, and prealbumin in burn patients and recommended the following framework: in patients with low albumin, low prealbumin, and a high CRP, inflammation is the likely culprit; when all three remain low, malnutrition is the likely culprit [20]. This approach likely oversimplifies the problem, as these acute phase markers peak at different times during the inflammatory response, limiting their clinical usefulness. A recent study evaluating the relationship between prealbumin and CRP in critically ill patients receiving enteral feeds noted no difference in prealbumin levels between underfed and adequately fed patients and noted that inflammation appeared to be the main driver of rising prealbumin levels in the critically ill [16]. Transferrin is another acute phase reactant, with a half-life intermediate between those of albumin and prealbumin. Unfortunately, this marker is also subject to fluctuation from a multitude of other factors, and its routine assessment is not recommended.

Another indicator of nutritional status that is less frequently utilized today is nitrogen balance, a measure of net protein gain or loss (Table 1). A balance within ±4 g/day is considered equilibrium. To understand the limitations of this method, one must understand how it is calculated: nitrogen intake (grams of protein divided by 6.25) minus nitrogen losses, estimated as the 24-hour urine urea nitrogen plus a constant of approximately 4 g/day. This constant is derived from estimated non-urinary urea nitrogen and gastrointestinal losses. There are significant limitations and confounding factors that can greatly skew the accuracy of this method (i.e., diarrhea). Furthermore, no large randomized trials evaluating the use of nitrogen balance to guide protein supplementation have shown a correlation with clinical outcomes [21]. Given this, the routine measurement of nitrogen balance has fallen out of favor and is no longer recommended.
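As a hedged illustration of the arithmetic (assuming the commonly used conversion of 6.25 g of protein per gram of nitrogen and the ~4 g/day constant described above; the function and variable names are ours):

```python
def nitrogen_balance(protein_intake_g_per_day: float, uun_g_per_day: float,
                     insensible_losses_g_per_day: float = 4.0) -> float:
    """Estimate nitrogen balance (g/day).

    Nitrogen in  = protein delivered divided by 6.25 (protein is ~16% nitrogen).
    Nitrogen out = 24-hour urine urea nitrogen (UUN) plus a ~4 g/day constant
                   for non-urea urinary, stool, and skin losses, as described above.
    """
    nitrogen_in = protein_intake_g_per_day / 6.25
    nitrogen_out = uun_g_per_day + insensible_losses_g_per_day
    return nitrogen_in - nitrogen_out

# Example: 100 g protein/day with a 24-hour UUN of 14 g/day
# -> 100/6.25 - (14 + 4) = -2 g/day, i.e., within the +/- 4 g/day equilibrium range.
print(nitrogen_balance(100, 14))  # -2.0
```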

Multiple anthropometric parameters exist to gauge nutritional status. Triceps skin fold thickness, BMI, calf circumference, mid upper arm circumference, waist to hip ratio, and mid arm muscle circumference have all been recommended (Table 1). Multiple studies have evaluated how well these parameters correlate with nutritional status. In one cross-sectional study of 109 elderly patients, arm circumference was the best predictor of nutritional status; this was followed by BMI, triceps skin fold thickness, and finally mid-arm circumference [22]. However, outside of BMI, these methods are not frequently utilized to assess the nutritional status of hospitalized patients, as they all suffer from significant limitations. Many of these techniques lack reliability. For example, studies have shown that arm circumference suffers from significant inter-observer variation depending on where measurements are taken and how taut the measuring tape is pulled [23]. Another well-known limitation of BMI is its inability to account for a patient's muscularity. Thus, while many objective markers of nutritional status, and thus of malnutrition, have been proposed, they are not without issues. The search for a holy grail of objective nutritional assessment continues, and we recommend against the routine use of the above elements as independent measures of nutritional status or of the degree of malnutrition.

Screening for Nutritional Risk and Assessment of Malnutrition Using Scoring Systems

With a single objective measure for malnutrition lacking, there has been much enthusiasm for nutritional scoring systems, used either to screen for nutritional risk or to assess a patient's degree of malnutrition. Scoring systems have the benefit of aggregating multiple nutritional indicators to better screen for nutritional risk and assess the degree of malnutrition. Examples of scoring systems used to screen for nutritional risk include the Nutritional Risk Screening (NRS 2002), the Malnutrition Screening Tool (MST), the Malnutrition Universal Screening Tool (MUST), and the NUTRIC score. Other systems, such as the Subjective Global Assessment (SGA), categorize the degree (i.e., moderate, severe) of malnutrition present [6]. These scoring systems allow for the rapid identification of patients at significant nutritional risk, or the categorization of the degree of malnutrition, permitting more aggressive efforts to improve nutritional status and combat malnutrition.

The two screening systems most frequently utilized in the inpatient setting are the NRS 2002 and the NUTRIC score (Table 2). The NRS 2002 was developed to identify patients who are most likely to benefit from nutritional support [24]. It was validated retrospectively against data from 128 clinical trials and showed that patients who fulfilled the criteria were more likely to have a positive clinical outcome with nutritional support than patients who did not meet these criteria. This method has also been evaluated prospectively in 212 patients: those with an NRS 2002 score greater than two received nutritional intervention, resulting in an increase in net calories, a reduction in the severity of complications, reduced LOS for patients with complications, and reduced LOS related to nutritional support [25].

Feature | NUTRIC | NRS 2002
Steps | Single scoring system | 4-question screen followed by a scoring system if positive
Components | Age; APACHE II; SOFA; comorbidities; days from hospital to ICU admission; IL-6* | Weight loss >5%; BMI; current food intake compared with prior week; disease severity (i.e., severe PNA, head injury); ± age >70
Score range | 0–10* | 0–7
Interpretation | <6 = low malnutrition risk; ≥6 = at risk | <3 = no risk; ≥3 = at risk

Table 2: Comparison of NRS 2002 and NUTRIC nutritional risk screening systems.

The NUTRIC score is the most frequently utilized scoring system in the critical care setting. It was developed from 597 ICU admissions (excluding elective surgery and overdoses) (Table 2) [26]. One advantage of this system over the NRS 2002 is that it also takes disease severity into account by utilizing the patient's APACHE II and Sequential Organ Failure Assessment (SOFA) scores. The NUTRIC score correlates well with mortality and duration of mechanical ventilation. More importantly, in patients at high nutritional risk, defined as a NUTRIC score greater than five, mortality is directly correlated with the percentage of calories received: patients receiving near 100% of their caloric needs had the lowest mortality compared with underfed patients. Without an objective gauge of malnutrition, it is our practice to routinely utilize the NUTRIC score to gauge nutritional risk in our critically ill patients.
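As an illustrative sketch only, the NUTRIC calculation can be expressed as follows. The point assignments reflect our reading of the original Heyland et al. publication and should be verified against the primary source; the function itself is not a validated clinical tool.

```python
def nutric_score(age, apache_ii, sofa, num_comorbidities, days_hosp_to_icu, il6=None):
    """Return (score, high_risk) for the NUTRIC score (illustrative sketch).

    Without IL-6 the score ranges 0-9 (high risk assumed at >= 5);
    with IL-6 it ranges 0-10 (high risk >= 6, matching Table 2).
    """
    score = 0
    # Age
    if age >= 75:
        score += 2
    elif age >= 50:
        score += 1
    # APACHE II
    if apache_ii >= 28:
        score += 3
    elif apache_ii >= 20:
        score += 2
    elif apache_ii >= 15:
        score += 1
    # SOFA
    if sofa >= 10:
        score += 2
    elif sofa >= 6:
        score += 1
    # Number of comorbidities
    if num_comorbidities >= 2:
        score += 1
    # Days from hospital admission to ICU admission
    if days_hosp_to_icu >= 1:
        score += 1
    # Optional IL-6 component
    if il6 is not None and il6 >= 400:
        score += 1
    high_risk = score >= (6 if il6 is not None else 5)
    return score, high_risk

# Example: 68-year-old, APACHE II 22, SOFA 7, 2 comorbidities, ICU admission on hospital day 2
print(nutric_score(68, 22, 7, 2, 2))  # (6, True) without IL-6
```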

Analytic morphomics

Advances in analytic morphomics and metabolomics offer novel insights into the identification of the malnourished patient. Promising studies involving analytic morphomics allow the quantification of body tissue composition from images such as computerized tomography (CT) scans, thus allowing a quantitative measure of sarcopenia and malnutrition. Given the limitations of existing objective measures of nutritional status, the development of this technology is welcome. One area where this technology has already proven beneficial is routine preoperative risk assessment before elective surgery. A common parameter used in analytic morphomics is the total psoas area (TPA) measured at the L4 vertebral level. This measure is then compared with gender-specific norms as a surrogate marker for sarcopenia [27,28]. In one study specifically evaluating TPA in the elderly, greater TPA was associated with better mobility, less cognitive impairment, and greater ability to perform activities of daily living [29]. Furthermore, morphomic elements such as TPA have been shown to be independent predictors of morbidity and mortality in patients undergoing many types of surgery, including liver transplantation, colectomy for cancer, and bowel resection for Crohn's disease [30-32].

Metabolomics

Another cutting-edge field of nutrition research is metabolomics, the study of a patient's individual metabolic profile as determined from blood, urine, and stool samples. While this technology has not yet been used clinically in the inpatient arena, the reader should be aware of its presence in the outpatient setting. Current research in this field analyzes patients' biochemical pathways to identify changes in response to dietary patterns and disease processes. Another goal of this field is to identify nutritional biomarkers that can later be used to guide comparative nutritional research [33]. In one study, researchers identified low leptin levels as a prognostic biomarker for mortality in patients with severe acute malnutrition [34]. Eventually, the definition of malnutrition may include metabolomic and morphomic elements; however, these fields are still in their infancy. Until then, clinicians must rely on the scoring systems and subjective criteria currently available.

Methods to calculate nutritional needs

Even if one could identify every patient at risk for, or with, malnutrition, controversy exists over the optimal number of calories patients should receive. To deliver the appropriate amount of nutrition to hospitalized patients, one must be able to determine caloric requirements. For many patients, equations can adequately estimate energy expenditure. However, predictive equation estimates tend to be inaccurate in critically ill patient populations [35]. This leaves the majority of critically ill patients at significant risk of underfeeding and overfeeding. One population of critically ill patients that can benefit from the routine use of indirect calorimetry (IC) is those requiring mechanical ventilation, in whom IC allows the direct measurement of resting energy expenditure. In this method, oxygen consumption and carbon dioxide production are measured to calculate energy expenditure. Though considered the gold standard, IC is not available in many ICUs, as it requires expensive equipment and highly trained personnel. Additionally, randomized data regarding the effect of IC on clinical outcomes are limited. The TICACOS trial, a single-center pilot study of 130 mechanically ventilated patients, identified a trend toward reduced mortality in patients whose nutritional requirements were determined by IC (32.3% vs. 47.7%, p=0.058), despite significantly longer ventilator and ICU LOS [36]. Given the lack of definitive benefit and the high resource utilization required for IC, many practitioners resort to estimating energy requirements using specific caloric goals or predictive equations. While a detailed review of each of these methods is outside the scope of this paper, commonly used approaches are a specific calorie goal of 25-30 kcal/kg/day, the Harris-Benedict equation (HBE), or, in critically ill patients, the Ireton-Jones equation. If predictive equations are used to estimate a patient's energy expenditure, it is important to add a 10-50% stress factor to account for the increased metabolic demands of the patient's disease state. Even so, we and others have shown that these equations are wildly inaccurate, putting patients at significant risk of over- and underfeeding [36]. We recommend the routine use of IC in mechanically ventilated critically ill patients. In patients who cannot receive IC, we resort to specific calorie goal calculations to determine nutritional needs, with frequent reassessment.
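The arithmetic behind the weight-based and Harris-Benedict estimates can be sketched as follows. The coefficients are the classic Harris-Benedict values, the stress-factor handling is illustrative, and, as emphasized above, any equation-based estimate should be interpreted with caution.

```python
def weight_based_goal(weight_kg, kcal_per_kg=25):
    """Specific-calorie goal: commonly 25-30 kcal/kg/day, as described above."""
    return kcal_per_kg * weight_kg

def harris_benedict(weight_kg, height_cm, age_yr, sex, stress_factor=1.2):
    """Resting energy estimate via the classic Harris-Benedict equation,
    multiplied by a stress factor (1.1-1.5, i.e., the 10-50% adjustment
    described above) to approximate illness-related demands."""
    if sex == "male":
        bmr = 66.47 + 13.75 * weight_kg + 5.003 * height_cm - 6.755 * age_yr
    else:
        bmr = 655.1 + 9.563 * weight_kg + 1.850 * height_cm - 4.676 * age_yr
    return bmr * stress_factor

# Example: 70 kg, 175 cm, 60-year-old man with a 20% stress factor
print(round(weight_based_goal(70)))                       # 1750 kcal/day
print(round(harris_benedict(70, 175, 60, "male", 1.2)))   # ~1799 kcal/day
```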

Methods to deliver nutrition and considerations

The optimal route of feeding is less controversial: in patients who can safely tolerate oral intake (per os, PO), this is always the preferred route. In patients who cannot tolerate PO due to mechanical ventilation, aspiration risk, or other disease states, nasogastric or nasoduodenal feeding tubes may be necessary. Studies comparing initial nasogastric versus nasoduodenal feeding have not found any advantage with nasoduodenal feeding in most patients. Gastric feeding is associated with significantly quicker initiation of feeds and attainment of the target feeding rate, without any increase in complications, LOS, or ventilator days [37,38]. It is important to point out that patients at high risk for aspiration, or who are intolerant of gastric feedings, should be fed via the nasoduodenal route; in this specific population there is an association with reduced episodes of aspiration and pneumonia [39,40].

Enteral formulations

Multiple enteral formulations exist for the nourishment of the malnourished or at-risk patient. Despite much interest over the last decade in various specialty formulations, most studies have failed to identify a benefit. Therefore, it is our practice to utilize polymeric formulations for the majority of our patients. A review of our practices can be found in Table 3. We provide the reader with a brief review of the various formulations developed over the past few decades and the key studies that have shaped our current practice patterns. There are six main classes of enteral formulations: elemental, semi-elemental, polymeric, disease-specific, immune modulating/enhancing, and anti-inflammatory. Elemental formulations can be thought of as pre-digested nutrition and contain individual amino acids, glucose molecules, and typically low-fat preparations. Semi-elemental formulations contain amino acid chains (peptides), simple sugars, and fat in the form of medium-chain triglycerides (MCT) and long-chain triglycerides (LCT). Polymeric formulations are the least processed and contain whole proteins, complex carbohydrates, and long-chain fatty acids. Potential benefits of elemental formulations include better absorption and improved tolerance in patients with malabsorption syndromes or pancreatitis. The major downside of elemental formulations is a near four- to eight-fold increase in cost. Studies comparing these formulations in patients at risk for malabsorption have failed to show any benefit with elemental formulations despite their theoretical advantages [41-43]. Even in patients with documented short bowel syndrome, the data are conflicting as to whether there is any benefit [44-46]. Similarly, in patients with active Crohn's disease, outside of one study that showed improved remission rates with elemental formulations, no other studies have found a benefit of elemental over non-elemental formulations [47]. One population in which we do routinely utilize elemental or semi-elemental formulations is patients with pancreatitis. While enteral feeding was once contraindicated in this setting, current evidence supports feeding patients with acute pancreatitis without ileus via the enteral route, as this is associated with reduced infectious complications and improved outcomes [39,40,48-50]. The rationale for using an elemental or semi-elemental formulation is the relative lack of the pancreatic enzymes needed to break down polymeric formulations. Data supporting or challenging this practice are currently lacking.

Population | Recommended enteral formulation
General patients | Standard polymeric formula
Critically ill patients (ICU): |
MICU | Standard polymeric formula
Perioperative SICU | May consider immune modulating EN
Pulmonary failure | Standard energy dense polymeric formula
AKI | Standard energy dense polymeric formula
AKI with CRRT | Standard energy dense polymeric formula + protein 2.5 g/kg/day
Hepatic failure | Standard polymeric formula (use IBW)
Severe sepsis | Standard polymeric formula
Burn | Standard polymeric formula + protein 2 g/kg/day
TBI | Standard polymeric formula; may consider immune modulating EN
Obese (BMI >35) | Standard polymeric formula (dose at 60% of actual BW)
Trauma | Standard polymeric formula; may consider immune modulating EN
Open abdomen | Standard polymeric formula + protein 15 g/liter of abdominal losses
Moderate/severe pancreatitis | Standard elemental, semi-elemental, or polymeric formula
Table 3: Guidelines for enteral formula selection in specific patient populations not tolerating PO.
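The protein adjustments in Table 3 lend themselves to a simple calculation. The sketch below encodes only the population-specific additions listed in the table; the 1.5 g/kg/day baseline and the function name are our own assumptions for illustration, and institutional protocols and clinical judgment take precedence.

```python
def protein_goal_g_per_day(weight_kg, population, abdominal_losses_l_per_day=0.0,
                           baseline_g_per_kg=1.5):
    """Illustrative daily protein target (g/day) using the additions in Table 3.

    The 1.5 g/kg/day baseline is an assumption for this sketch, not a value
    taken from Table 3.
    """
    if population == "aki_with_crrt":
        return 2.5 * weight_kg   # Table 3: protein 2.5 g/kg/day
    if population == "burn":
        return 2.0 * weight_kg   # Table 3: protein 2 g/kg/day
    if population == "open_abdomen":
        # Table 3: add 15 g protein per liter of abdominal fluid losses
        return baseline_g_per_kg * weight_kg + 15.0 * abdominal_losses_l_per_day
    return baseline_g_per_kg * weight_kg

# Examples: 80 kg patient on CRRT -> 200 g/day;
# 70 kg patient with an open abdomen losing 2 L/day -> 105 + 30 = 135 g/day
print(protein_goal_g_per_day(80, "aki_with_crrt"))                                # 200.0
print(protein_goal_g_per_day(70, "open_abdomen", abdominal_losses_l_per_day=2))   # 135.0
```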

Disease-specific or specialty formulations were developed for specific disease states, for example diabetes. These formulations can also be organ-specific, for example for pulmonary or renal failure. As a specific example, in patients with pulmonary failure it has been theorized that nutritional support with a high-fat, low-carbohydrate diet can reduce the respiratory quotient. Initial evidence supported this theory and even showed reduced ventilator time in a small sample of patients [51]. In 2003, a larger randomized controlled trial (RCT) attempted to clarify this and found no benefit with a disease-specific formulation for patients with pulmonary failure [52]. This is just one example of the failure of disease-specific formulations to gain traction. Given the lack of robust data supporting disease-specific formulations, the most recent ASPEN guidelines recommend against their routine use [39,40]. Other formulations include immune modulating/enhancing and anti-inflammatory preparations. Four specific immunonutrients that have been heavily studied are glutamine, arginine, omega-3 fatty acids, and omega-6 fatty acids. There has been much controversy surrounding which populations would benefit most from these formulations. The meta-analysis of 22 RCTs by Daren Heyland underlying the 2001 US summit on immune-enhancing enteral therapy recommends immune-enhancing formulations for the following patient populations: patients undergoing elective gastrointestinal surgery with moderate to severe malnutrition, and patients with blunt or penetrating torso trauma with an injury severity score (ISS) ≥ 18 or an abdominal trauma index ≥ 20 [53].

Glutamine: The theory behind the use of glutamine and arginine is that during periods of stress a patient's endogenous supply of these amino acids is severely depleted, making them conditionally "essential" amino acids [54]. Previously held philosophy recommended the routine use of glutamine in all critically ill patients on this logic; however, recent data have shown that glutamine should not be routinely given to all critically ill patients, as it may cause harm [53-58]. The landmark METAPLUS and REDOXS studies largely ended the search for a role for routine glutamine supplementation in critically ill patients [59,60]. The REDOXS study was a blinded 2×2 factorial trial of 1223 critically ill patients randomized to receive combined enteral and parenteral glutamine, antioxidants, both, or placebo. The authors identified a trend toward increased mortality at 28 days (the primary endpoint) for patients who received glutamine (32.4% vs. 27.2%; p=0.05, with significance defined as p=0.044 for this study based on interim analysis). Furthermore, patients who received glutamine had significantly higher in-hospital and 6-month mortality (p=0.02) [59]. This study ended the controversy over routine glutamine supplementation in critically ill patients and also suggested that such supplementation could be harmful. Of note, the REDOXS study did not specifically evaluate glutamine supplementation in hypoglutaminemic patients. The question remains whether there are subsets of patients (i.e., burn and TBI) in whom glutamine supplementation may be beneficial, as preliminary research in TBI and burn patients has suggested improved outcomes [61-63].

Arginine: The majority of studies of arginine supplementation in hospitalized patients have focused on the critically ill, where it is believed arginine can offer the most benefit. One group specifically worth mentioning is the post-MI population, in which arginine supplementation was associated with increased mortality in the VINTAGE MI study [64]. In the critically ill patient, the role of immunomodulation with arginine remains controversial. Five large meta-analyses have evaluated the role of these formulations in the critically ill; all have shown a reduction in infection rates, with some showing reduced LOS and ventilator days [65-69]. Patients with sepsis represent a unique consideration for arginine supplementation.

Current ASPEN 2016 guidelines do not recommend the routine use of arginine for critically ill patients with sepsis. In this population, the concern with arginine supplementation stems from an increase in nitric oxide (NO) production, which can exacerbate hypotension; however, studies have failed to definitively show a causal link between arginine supplementation, increased NO, and hypotension or mortality [70]. Further studies are needed to define the populations that would benefit most from arginine supplementation. For the present, arginine supplementation in sepsis should be utilized at the provider's discretion, understanding that any potential benefits have not been fully elucidated; however, supplementation via the enteral route at doses of less than 30 grams/day has not been associated with adverse events.

Omega-3 versus omega-6 fatty acids: Two fatty acid classes, omega-3 (anti-inflammatory) and omega-6 (pro-inflammatory), deserve mention (Figure 1). Western diets are historically high in omega-6 fatty acids, whereas sources of omega-3 such as flax seed, fish oil (FO), and canola are less common. The role of omega-3 supplementation in curbing the inflammatory response in patients with sepsis and acute respiratory distress syndrome (ARDS) has been studied in multiple RCTs and reviewed by Martin et al. [71]. Despite initial studies suggesting a benefit with omega-3 fatty acid supplementation, recent evidence from two RCTs disputes these findings. The OMEGA trial evaluated the role of omega-3 fatty acid supplementation in acute lung injury (ALI) and was stopped early due to lack of efficacy [72]. Another phase II trial evaluated FO (high in omega-3) supplementation in patients with ALI and failed to show a benefit in physiologic or clinical endpoints [73]. The results of these studies are summarized in Table 4. Given the conflicting data, the most recent ASPEN guidelines do not make a recommendation regarding the use of anti-inflammatory formulations for patients with ARDS/ALI [39,40].

Study | Singer et al., 2006 | Gadek et al., 1999 | Pontes-Arruda et al., 2006 | Rice et al., 2011 (OMEGA trial) | Stapleton et al., 2011
Sample size (n) | 100 | 146 | 165 | 272 | 90
Population | ALI | ARDS | Sepsis (severe or shock) | ALI | ALI
Improved ICU LOS | No | Yes | Yes | No | No
Reduced ventilator days | Yes | Yes | Yes | No | No
Improved mortality | No | No | Yes | No | No
Reduced organ failure | N/A | Yes | Yes | No | No

Table 4: Summary of randomized controlled trials comparing anti-inflammatory enteral regimens.

Figure 1: Metabolic pathways of Omega-3 and Omega-6.

When to initiate enteral nutrition: To best combat malnutrition, enteral nutrition should be initiated as soon as clinically feasible. Historically, patients were kept nothing per os (NPO) after elective gastrointestinal surgery until the return of flatus; this practice has since been abandoned, as evidence has overwhelmingly supported the safety of early feeding. Furthermore, early feeding as part of enhanced recovery pathways has shown significant reductions in hospital LOS, readmission rates, and complication rates [74-76]. The same applies to critically ill patients and those at high nutritional risk: early (within 48 hours) enteral nutrition is recommended if feasible. This has been highlighted in multiple meta-analyses, which in aggregate have shown reduced mortality, reduced infections, and reduced LOS [39,40,77,78]. Despite the potential for underfeeding via the enteral route, enteral nutrition is recommended over parenteral nutrition, as it is associated with reduced infectious complications and ICU LOS. While it is ideal to progress to goal feeds as soon as possible, the EDEN study failed to identify any benefit of full feeding over trophic feeding (400 kcal/day) in patients with ALI [79]. This result is likely due to increased endogenous glucose production during the first week of critical illness. One potential method to reduce underfeeding is the use of a daily volume-based goal (i.e., the PEP uP protocol) rather than a daily caloric-based goal. In the initial studies of the PEP uP protocol, patients in the caloric goal group received approximately 58.8% of their daily caloric needs, whereas patients on the PEP uP protocol received approximately 83.2% of their daily caloric needs [80,81]. We believe the optimal method to combat malnutrition is the early and aggressive feeding of at-risk patients via the enteral route. In the face of underfeeding via the enteral route, we consider transitioning from calorie-based to volume-based feeding targets.
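A minimal sketch of the volume-based idea follows. This is not the published PEP uP protocol itself, which contains additional safety rules; it simply illustrates computing the rate needed to deliver the remaining 24-hour volume in the hours left, rather than holding a fixed hourly rate after interruptions.

```python
def catch_up_rate_ml_per_hr(daily_volume_goal_ml, volume_delivered_ml,
                            hours_remaining, max_rate_ml_per_hr=150):
    """Hourly rate needed to deliver the remaining daily volume, capped at a
    maximum rate (the 150 mL/h cap is an illustrative assumption, not a
    protocol value)."""
    if hours_remaining <= 0:
        return 0.0
    remaining_ml = max(daily_volume_goal_ml - volume_delivered_ml, 0.0)
    return min(remaining_ml / hours_remaining, max_rate_ml_per_hr)

# Example: 1800 mL daily goal, 600 mL delivered, feeds held for a procedure,
# 10 hours left in the day -> 120 mL/h instead of the original 75 mL/h.
print(catch_up_rate_ml_per_hr(1800, 600, 10))  # 120.0
```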

Parenteral nutrition

Finally, in patients who do not tolerate enteral feeds and are at risk for malnutrition, parenteral nutrition (PN) should be considered, especially if patients are anticipated to require nutritional support for over 7 days. Parenteral nutrition has a few drawbacks. One of the main complications of long-term PN use is the development of liver disease (steatosis and cholestasis). The primary component of PN that appears to drive this is the lipid emulsion [82]. Unfortunately, no emulsion is ideal, and thus multiple formulations exist. Soybean oil has been the formulation of choice since the 1970s; however, alternatives have recently been developed, such as MCT, olive oil (OO), safflower oil, and FO. While an in-depth review of each agent is outside the scope of this paper, an important point for the reader to understand is that these agents have differing inflammatory profiles [83-85]. For example, soy-based emulsions are more pro-inflammatory than olive oil-based emulsions, which are in turn more pro-inflammatory than FO-based emulsions [82].

Chronic PN should not be discontinued except in the face of associated bacteremia. For non-malnourished patients, we recommend beginning PN after seven days in patients who fail enteral nutrition. The key landmark trial supporting this recommendation is the EPaNIC trial, which randomized ICU patients with a nutritional risk assessment score >2 to early (day 2) versus late (day 8) PN. The late PN group showed a reduced ICU LOS, the primary endpoint, of 3 vs. 4 days (p=0.02), and early PN was associated with an increased infection rate (26.2% vs. 22.8%, p=0.008) [86]. Numerous studies have supported this recommendation [39,40]. In malnourished patients, PN should be initiated as soon as possible; multiple meta-analyses have shown reduced complications and reduced mortality in this population with early PN [86-89].

Special populations at high risk for malnutrition

There are a few specific populations at especially high risk for malnutrition: patients with burns, end-stage renal disease, liver failure, or pancreatitis, critically ill patients requiring vasopressors, and bariatric surgery patients. Specific nutritional considerations are needed in each of these populations to prevent malnutrition.

Burns

Burn patients are at exceptionally high nutritional risk due to the hypermetabolic state associated with burns and the need for frequent conscious sedation requiring interruptions in enteral feeding. We recommend beginning enteral nutrition in these patients within 36 hours if they are hemodynamically stable. Because gastric feeding is associated with frequent interruptions for procedures and conscious sedation for dressing changes, we have developed a protocol utilizing continuous post-pyloric feeding without interruptions. We have found this to be safe, with no episodes of procedure-related aspiration or complications [89]. Burn patients' energy requirements are frequently underestimated, by as much as 100%, when equations such as the HBE are used. A common method of calculating caloric needs for burn patients is to utilize 25 kcal/kg/day plus a burn factor of 40 kcal per percent of burn surface area per day [90]. Due to higher protein requirements in burn patients, we recommend protein supplementation on the order of 1.5-2 g/kg/day, compared with normal requirements of 0.8 g/kg/day. Burn patients can remain hypermetabolic for up to 1 year post injury [90]. Two main drugs have been utilized to restore metabolic homeostasis in these patients: propranolol and oxandrolone. While the majority of studies have been performed in the pediatric population, results have been extrapolated to adult burn patients [91,92]. Our threshold for utilizing these medications is a burn of greater than 20% total body surface area.
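To make the burn arithmetic above concrete, a worked example follows, using the 25 kcal/kg/day plus 40 kcal/%TBSA/day rule and the 1.5-2 g/kg/day protein range cited above; the mid-range protein default is our own choice for illustration.

```python
def burn_caloric_goal_kcal_per_day(weight_kg, tbsa_percent):
    """Estimate described above: 25 kcal/kg/day + 40 kcal per %TBSA per day."""
    return 25 * weight_kg + 40 * tbsa_percent

def burn_protein_goal_g_per_day(weight_kg, g_per_kg=1.75):
    """Protein goal within the 1.5-2 g/kg/day range recommended above."""
    return g_per_kg * weight_kg

# Example: 70 kg patient with a 30% TBSA burn
print(burn_caloric_goal_kcal_per_day(70, 30))   # 2950 kcal/day
print(burn_protein_goal_g_per_day(70))          # 122.5 g/day
```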

End Stage Renal Disease and Continuous Renal Replacement Therapy [ESRD and CRRT]

Critically ill patients requiring CRRT have significantly higher protein needs due to losses across the hemofilter, which can be as high as 24% [93]. Utilizing an escalating protein regimen, Scheinkestel et al. identified that patients requiring CRRT required at least 2 g/kg/day of protein in order to maintain a positive nitrogen balance [94]. A positive nitrogen balance was significantly associated with lower in-hospital mortality (p=0.03), and for every 1 g/day increase in nitrogen balance, they observed a 21% increase in survival (p=0.03) [94]. When patients transition off CRRT to intermittent hemodialysis, the protein dose must be reduced accordingly.

Liver failure

Patients with liver failure and ascites are at risk of overfeeding due to errors in weight-based calculations from excess ascitic fluid. Given this, any equation-based calculations should utilize a dry or usual body weight. An additional consideration is to reduce copper and manganese supplementation for patients with hyperbilirubinemia. Historically, protein supplementation was restricted, as it was thought to exacerbate hepatic encephalopathy; however, it is now known that this is not the case and that decreasing protein supplementation can actually worsen encephalopathy. Finally, predictive equations are often inaccurate in this population, and use of IC is therefore strongly recommended [95].

Pancreatitis

Historically, patients with pancreatitis were kept NPO until the pancreas "cooled off." This practice has been challenged, and the standard of care is now to provide enteral nutrition to these patients, as multiple RCTs and meta-analyses have shown reduced infections, hospital LOS, multiple organ failure, and mortality with enteral feeding [96,97]. The degree of disease severity should guide the route of enteral feeding: patients with mild disease can be fed by mouth (PO), while patients with more severe disease may require nasogastric or nasojejunal feedings. If patients are intolerant of enteral feeding after one week, PN should be started. Another area of controversy has been the role of probiotics in pancreatitis. The largest multicenter trial attempting to answer this question demonstrated increased mortality and multiple organ failure in patients treated with pro- and prebiotics versus prebiotics alone, although the probiotics were administered in high doses directly into the duodenum [98]. A meta-analysis comparing probiotics with placebo, which included the previously mentioned study, showed a reduced infection rate and hospital LOS in the probiotic group [99]. At the current time, probiotics are not recommended for use in patients with pancreatitis, but these findings raise the question of whether a different dose, administration protocol, or bacterial strain of probiotic may be worth investigating.

Hemodynamically unstable patients

In patients without contraindications to enteral feeds, such as compromised mesenteric vascular supply, bowel obstruction, or bowel discontinuity, consideration should be given to enteral feeding once the patient is adequately resuscitated and requires decreasing levels of vasoactive agents for hemodynamic support. The concern with feeding these patients stems from a reduction in splanchnic blood flow and a 0.3-3.8% rate of non-occlusive bowel necrosis [100]. Unfortunately, the data for or against feeding through these agents are limited to case reports and case series. One large series of 70 patients after cardiac surgery reported that early enteral nutrition actually increased cardiac output and splanchnic blood flow in patients on dopamine and norepinephrine [101,102]. One large multicenter study evaluated early enteral feeding in 1174 septic patients requiring vasoactive agents, comparing patients who received enteral nutrition within 48 hours with those who did not. The group receiving early enteral nutrition had lower ICU and in-hospital mortality, 22.5% vs. 28.3% (p=0.03) and 34% vs. 44% (p<0.001), respectively. The in-hospital mortality result remained significant (p=0.01) in propensity-matched patients. Surprisingly, these results were even stronger among the sickest patients on multiple vasopressors [103]. Another study retrospectively examined outcomes in 259 patients receiving concomitant enteral nutrition and vasopressors [104]. Enteral nutrition was tolerated by 75% of patients, and bowel ischemia was noted in only 0.9% of patients. On multivariate analysis, a norepinephrine-equivalent dose of ≤ 12.5 mcg/min was associated with enteral nutrition tolerance. Of note, dopamine and vasopressin were least well tolerated (p=0.18 and p=0.0027, respectively), whereas patients receiving phenylephrine were more likely to tolerate enteral nutrition (p=0.0023). Due to the lack of randomized data to guide management, practitioners should exercise caution in this population.

Bariatric surgery

As the prevalence of bariatric surgery increases, so will the importance of early recognition of the specific nutritional deficiencies that may follow. Patients undergoing malabsorptive procedures (i.e., Roux-en-Y gastric bypass) are at significantly higher risk of developing deficiencies than those undergoing restrictive procedures (i.e., gastric banding and sleeve gastrectomy) [105,106]. These patients are at particularly high risk for the development of anemia; the two micronutrient deficiencies most likely to contribute to this disorder are vitamin B12 and folate. These patients are also at risk of developing deficiencies of thiamine (B1) and of the trace minerals iron (Fe), selenium (Se), zinc (Zn), and copper (Cu) [107]. Secondary to reduced fat digestion, they are particularly prone to deficiencies of the fat-soluble vitamins (A, D, E, and K). Unlike water-soluble vitamin deficiencies, which manifest quickly after surgery, these deficiencies develop later, in proportion to the degree of fat malabsorption [105].

Over and Underfeeding

In order to prevent and treat malnutrition, the optimal delivery of calories is needed. Extremes of caloric delivery, overfeeding and underfeeding, are both associated with significantly worse clinical outcomes. The risks of overfeeding have been known since the 1990s. Overfeeding is typically associated with the use of PN, as previously mentioned. Chwals defined overfeeding as providing energy in excess of that needed for metabolic homeostasis [108]. Excess caloric supplementation has been associated with multiple metabolic derangements. While overfeeding has not been directly associated with increased mortality, it is associated with gastric distension, vomiting, diarrhea, azotemia (from increased exogenous protein), hyperammonemia, hypertonic dehydration, hyperglycemia (less of an issue today due to improved glucose control), hyperlipidemia, hypertriglyceridemia, hypercapnia, and refeeding syndrome [108-110]. Overfeeding can potentially be prevented through the routine use of IC when possible [111]. On the other end of the spectrum, underfeeding can lead to large cumulative negative energy balances, which are associated with increased infectious complications, poor wound healing, prolonged LOS, and increased mortality. An early yet pivotal study reported 76% mortality among critically ill surgical patients who accumulated a 10,000 kcal negative caloric balance during their hospital stay [112]. Techniques to prevent underfeeding include utilizing the PEP uP protocol, as previously discussed, and using IC to identify appropriate caloric needs. It is imperative that patients at risk for malnutrition, or who already have malnutrition, receive aggressive and frequent re-evaluation of caloric needs to prevent these dangerous caloric extremes.

Refeeding

In one population of severely malnourished patients, the reintroduction of feeds must be handled with particular care. Refeeding syndrome, a true nutritional emergency, was originally identified following World War II [113]. During starvation, the body becomes depleted of many electrolytes, including phosphate and potassium; however, because the losses are predominantly intracellular, serum concentrations of these electrolytes remain deceptively stable. Upon reintroduction of nutrition, insulin levels increase, resulting in intracellular transport of potassium, phosphate, and magnesium and causing profound hypophosphatemia (the hallmark of refeeding syndrome), which can have severe physiologic effects. To prevent refeeding syndrome, practitioners must be able to recognize at-risk patients. Once such patients are recognized, nutritional therapy should proceed slowly and electrolytes should be checked frequently. Specific at-risk populations include patients with anorexia nervosa, alcoholism, cancer, recent surgery, uncontrolled diabetes mellitus (DM), or chronic malnutrition, the elderly with comorbidities, and those on diuretics or long-term antacid therapy [114]. If refeeding syndrome is diagnosed, feeds should be cut in half, and potentially held, until electrolyte repletion can be achieved and maintained. If feedings need to be held, they should be restarted slowly after adequate electrolyte replenishment is achieved.

Conclusion

Despite significant advances in the optimal delivery of appropriate nutrition, large multicenter trials, and exciting cutting-edge research, malnutrition continues to go unrecognized, undertreated, and undertaught in many hospitals and training programs across the country. While much is known about the optimal delivery of nutrition, significant knowledge gaps remain. Current research should focus on developing a uniform definition of malnutrition, standardizing malnutrition screening protocols, and identifying the optimal method to determine caloric needs. Additionally, special attention needs to be paid to specific at-risk populations for the prevention and treatment of malnutrition.

References
