Two biochemical indicators are currently recommended for determining whether vitamin A deficiency (VAD) is a public health problem: serum retinol and serum retinol-binding protein (RBP). After consideration of 40 data sets and the original rationale for previously proposed cut-offs, a cut-off for serum retinol concentration was proposed at <0.70 µmol/L (20 µg/dL) in ≥15% of the sampled population. This cut-off should be applied to a representative group of preschool-age children (6-71 mo). Because measurement of low serum retinol concentrations requires high precision, analysis should be done by HPLC. For serum RBP, a cut-off cannot be reliably specified because the available data are too few and too variable. However, because serum RBP concentration correlates well with serum retinol concentration, it can be used to determine whether VAD is a public health problem in populations for which the relationship between serum concentrations of retinol and RBP has been established. Further effort to establish a reliable cut-off for RBP is warranted, because RBP analysis, in particular radial immunodiffusion (RID), is relatively simple and inexpensive. Whereas HPLC and RID analyses must be done in a laboratory, methods are being developed for assessing serum retinol and RBP under more remote conditions.
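The two forms of the cut-off quoted above (0.70 µmol/L and 20 µg/dL) can be checked against each other with the molar mass of retinol (C20H30O, ~286.45 g/mol). A minimal sketch of the conversion; the function name is illustrative, not from any standard library:

```python
# Sanity check that the two forms of the serum retinol cut-off agree.
# Retinol (C20H30O) has a molar mass of ~286.45 g/mol, so umol/L
# converts to ug/dL by multiplying by 286.45 (ug per umol) and
# dividing by 10 (dL per L).

RETINOL_MOLAR_MASS = 286.45  # g/mol, equivalently ug/umol


def umol_per_l_to_ug_per_dl(umol_per_l: float) -> float:
    """Convert a serum retinol concentration from umol/L to ug/dL."""
    return umol_per_l * RETINOL_MOLAR_MASS / 10.0


cutoff_ug_dl = umol_per_l_to_ug_per_dl(0.70)
print(f"0.70 umol/L = {cutoff_ug_dl:.1f} ug/dL")  # ~20 ug/dL
```

The exact result, 20.05 µg/dL, rounds to the 20 µg/dL figure used in the abstract.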
In developed countries, food fortification has proven an effective and low-cost way to increase the micronutrient supply and reduce the consequences of micronutrient deficiencies. It has rarely been used in the developing world, but general conclusions can be drawn. The biological efficacy, but not the program effectiveness, of fortifying oil and hydrogenated oil products as well as cereal flours and meals with vitamin A has been shown. Sugar has been fortified with vitamin A in Central American countries for years, and both biological efficacy and program effectiveness are well established. Efficacy of fortifying monosodium glutamate with vitamin A was demonstrated, but a program has not been established. Fortification with vitamin A in the developing world should satisfy certain elements for success: a) a suitable food matrix is required (a food regularly consumed, produced by a few centralized factories, without sensory changes compared with the nonfortified equivalent, and in which the nutrient remains bioavailable and in a sufficient amount); b) fortified foods should provide at least 15% of the recommended daily intake for the target group (e.g., individuals consuming the lowest amount of the fortified food); c) voluntary fortification of processed foods should be regulated to prevent excessive consumption of vitamin A; d) neighboring countries should harmonize technical standards to facilitate compliance and minimize conflicts over global trade laws; e) a practical monitoring system should be instituted; f) social marketing activities should be permanent and aimed at industry, government, and consumers; and g) food fortification should be combined with other strategies (e.g., supplementation) to reach those not adequately covered by fortification alone. Infants and small children, whose dietary habits differ from those of adults, require special attention. Fortification of food commodities is an attractive and economical way to prevent and control vitamin A deficiency.
Effective food fortification might make supplementation of postpartum women and older children unnecessary.
There is great interest in replacing 24-h urinary Na excretion with easier methods of assessing dietary Na intake. However, whether these alternative methods are reliable remains uncertain. More research, including the use of appropriate study designs and statistical testing, is required to determine their usefulness.
Neural tube defects (NTDs) remain an important cause of perinatal mortality and infantile paralysis worldwide. Mandatory fortification of flour with folic acid has proved to be one of the most successful public health interventions in reducing the prevalence of NTD-affected pregnancies. Most developing countries have few, if any, common sources of folic acid, unlike many developed countries, where folic acid is available from ready-to-eat cereals and supplements. Expanding the number of developed and developing countries with folic acid flour fortification has tremendous potential to safely eliminate most folic acid-preventable NTDs.
Background: Food fortification is one approach for addressing anemia, but information on program effectiveness is limited.
Objective: We evaluated the impact of Costa Rica's fortification program on anemia in women aged 15–45 y and children aged 1–7 y.
Design: Reduced iron, an ineffective fortificant, was replaced by ferrous fumarate in wheat flour in 2002, and ferrous bisglycinate was added to maize flour in 1999 and to liquid and powdered milk in 2001. We used a one-group pretest-posttest design and national survey data from 1996 (baseline; 910 women, 965 children) and 2008–2009 (endline; 863 women, 403 children) to assess changes in iron deficiency (children only) and anemia. Data were also available for sentinel sites (1 urban, 1 rural) for 1999–2000 (405 women, 404 children) and 2008–2009 (474 women, 195 children), including 24-h recall data in children. Monitoring of fortification levels was routine.
Results: Foods were fortified as mandated. Fortification provided about one-half the estimated average requirement for iron in children, mostly and equally through wheat flour and milk. Anemia was reduced in children and women in national and sentinel site comparisons. At the national level, anemia declined in children from 19.3% (95% CI: 16.8%, 21.8%) to 4.0% (95% CI: 2.1%, 5.9%) and in women from 18.4% (95% CI: 15.8%, 20.9%) to 10.2% (95% CI: 8.2%, 12.2%). In children, iron deficiency declined from 26.9% (95% CI: 21.1%, 32.7%) to 6.8% (95% CI: 4.2%, 9.3%), and iron deficiency anemia, which was 6.2% (95% CI: 3.0%, 9.3%) at baseline, could no longer be detected at the endline.
Conclusions: A plausible impact pathway suggests that fortification improved iron status and reduced anemia. Although unlikely in the Costa Rican context, other explanations cannot be excluded in a pre/post comparison.
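The confidence intervals reported in the abstract are consistent with a standard normal-approximation (Wald) interval for a proportion, p ± 1.96·√(p(1−p)/n). A minimal sketch reproducing the baseline interval for anemia in children (19.3% of the 965 children sampled); the exact method the authors used is an assumption, though the numbers match:

```python
# Reproduce a reported 95% CI for a prevalence estimate using a
# Wald (normal-approximation) interval: p +/- z * sqrt(p*(1-p)/n).
import math


def wald_ci(p: float, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wald confidence interval for a proportion p observed in n subjects."""
    half_width = z * math.sqrt(p * (1 - p) / n)
    return p - half_width, p + half_width


# Baseline anemia in children: 19.3% of 965 children surveyed in 1996.
lo, hi = wald_ci(0.193, 965)
print(f"({lo * 100:.1f}%, {hi * 100:.1f}%)")  # (16.8%, 21.8%)
```

This matches the reported interval of (16.8%, 21.8%), supporting the reading that the CIs describe sampling uncertainty in the survey proportions.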
Fortification is the process of adding nutrients or non-nutrient bioactive components to edible products (e.g., food, food constituents, or supplements). Fortification can be used to correct or prevent widespread nutrient intake shortfalls and associated deficiencies, to balance the total nutrient profile of a diet, to restore nutrients lost in processing, or to appeal to consumers looking to supplement their diet. Food fortification can be considered a public health strategy to enhance the nutrient intake of a population. Over the past century, fortification has been effective at reducing the risk of nutrient deficiency diseases such as beriberi, goiter, pellagra, and rickets. However, the world today is very different from when fortification emerged in the 1920s. Although early fortification programs were designed to eliminate deficiency diseases, current fortification programs are based on low dietary intakes rather than a diagnosable condition. Moving forward, we must be diligent in our approach to achieving effective and responsible fortification practices and policies, including responsible marketing of fortified products. Fortification must be applied prudently, its effects monitored diligently, and the public informed effectively about its benefits through consumer education efforts. Clear lines of authority for establishing fortification guidelines should be developed and should take into account changing population demographics, changes in the food supply, and advances in technology. This article summarizes a symposium presented at the ASN Scientific Sessions and Annual Meeting at Experimental Biology 2014 on current issues involving fortification, focusing primarily on the United States and Canada, and recommendations for the development of responsible fortification practices to ensure their safety and effectiveness.
Anemia affects over 800 million women and children globally. Defined as a limited or insufficient functional red blood cell supply in peripheral blood, anemia causes a reduced oxygen supply to tissues and can have serious health consequences for women and children. Hemoglobin (Hb) concentration is most commonly measured for anemia diagnosis. Methods to measure Hb are usually invasive (requiring a blood sample); however, advances in diagnostic and clinical chemistry over the past decade have led to the development of new noninvasive methods. Accurate diagnosis at the individual level is important to identify individuals who require treatment. At the population level, anemia prevalence estimates are often the impetus for national nutrition policies or programs. Thus, it is essential that methods for Hb measurement are sensitive, specific, accurate, and reproducible. The objective of our narrative review is to describe the basic principles, advantages, limitations, and quality control issues related to methods of Hb measurement in clinical and field settings. We also discuss other biomarkers and tests that can help to determine the severity and underlying causes of anemia. In conclusion, there are many established and emerging methods to measure Hb concentration, each with its own advantages, limitations, and factors to consider before use.