854 results for: based inspection and conditional monitoring
Abstract:
High-resolution quantitative computed tomography (HRQCT)-based analysis of spinal bone density and microstructure, finite element analysis (FEA), and DXA were used to investigate the vertebral bone status of men with glucocorticoid-induced osteoporosis (GIO). DXA of L1–L3 and total hip, QCT of L1–L3, and HRQCT of T12 were available for 73 men (54.6±14.0 years) with GIO. Prevalent vertebral fracture status was evaluated on radiographs using a semi-quantitative (SQ) score (normal=0 to severe fracture=3), and the spinal deformity index (SDI) score (sum of SQ scores of T4 to L4 vertebrae). Thirty-one (42.4%) subjects had prevalent vertebral fractures. Cortical BMD (Ct.BMD) and thickness (Ct.Th), trabecular BMD (Tb.BMD), apparent trabecular bone volume fraction (app.BV/TV), and apparent trabecular separation (app.Tb.Sp) were analyzed by HRQCT. Stiffness and strength of T12 were computed by HRQCT-based nonlinear FEA for axial compression, anterior bending, and axial torsion. In logistic regressions adjusted for age, glucocorticoid dose, and osteoporosis treatment, Tb.BMD was most closely associated with vertebral fracture status (standardized odds ratio [sOR]: Tb.BMD T12: 4.05 [95% CI: 1.8–9.0], Tb.BMD L1–L3: 3.95 [1.8–8.9]). Among FEA variables, strength divided by cross-sectional area for axial compression showed the most significant association with spine fracture status (2.56 [1.29–5.07]). SDI was best predicted by a microstructural model using Ct.Th and app.Tb.Sp (r2=0.57, p<0.001). Spinal and hip DXA measurements did not show significant associations with fracture status or severity. In this cross-sectional study of males with GIO, QCT- and HRQCT-based measurements and FEA variables were superior to DXA in discriminating between patients of differing prevalent vertebral fracture status. A microstructural model combining aspects of cortical and trabecular bone reflected fracture severity most accurately.
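The standardized odds ratios (sOR) above can be read as odds ratios per one-standard-deviation change in a predictor. A minimal sketch of that rescaling, with purely illustrative numbers (not the study's fitted coefficients):

```python
import math

def standardized_or(beta, sd):
    """Odds ratio per 1-SD increase of the predictor: exp(beta * SD).

    beta: fitted logistic regression coefficient (per unit of the predictor)
    sd:   standard deviation of the predictor in the sample
    """
    return math.exp(beta * sd)

# Hypothetical values only: beta = 0.014 per mg/cm^3, SD = 100 mg/cm^3
print(round(standardized_or(0.014, 100.0), 2))  # exp(1.4) ~ 4.06
```

This is why an sOR of ~4 can arise from a numerically tiny per-unit coefficient; the rescaling makes predictors with different units comparable.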
Abstract:
Mapping ecosystem services (ES) and their trade-offs is a key requirement for informed decision making in land use planning and management of natural resources that aims to increase the sustainability of landscapes. Negotiating the purposes of landscapes and the services they should provide is difficult, as an increasing number of stakeholders, active at different levels and with a variety of interests, are present in any one particular landscape. Traditionally, land cover data is the basis for mapping and spatial monitoring of ecosystem services. In complex landscapes, however, it is questionable whether land cover per se, as a spatial base unit, is suitable for monitoring and management at the meso-scale. Often the characteristics of a landscape are defined by the prevalence, composition, and specific spatial and temporal patterns of different land cover types. The spatial delineation of shifting cultivation agriculture is a prominent example of a land use system whose different land use intensities require alternative methodologies going beyond the common remote sensing approach of pixel-based land cover analysis, owing to the spatial and temporal dynamics of rotating cultivated and fallow fields. Against this background we advocate that adopting a landscape perspective on spatial planning and decision making offers new space for negotiation and collaboration, taking into account the needs of local resource users and of the global community. For this purpose we introduce landscape mosaics, defined as a new spatial unit describing generalized land use types. Landscape mosaics have allowed us to chart different land use systems and land use intensities, and permitted us to delineate changes in these land use systems based on changes in external claims on these landscapes.
The underlying idea behind the landscape mosaics is to use land cover data, typically derived from remote sensing, and to analyse and classify spatial patterns of this land cover data using a moving window approach. We developed the landscape mosaics approach in tropical, forest-dominated landscapes, particularly shifting cultivation areas, and present examples of our work from northern Laos, eastern Madagascar, and Yunnan Province in China.
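The moving window idea can be sketched as below; the window size, cover-type labels, and dominance rule are illustrative assumptions, not the authors' published algorithm:

```python
from collections import Counter

def window_mosaic(grid, r, c, radius=1):
    """Return the dominant cover type and its share within the window
    centred on cell (r, c), clipped at the grid edges."""
    rows, cols = len(grid), len(grid[0])
    cells = [
        grid[i][j]
        for i in range(max(0, r - radius), min(rows, r + radius + 1))
        for j in range(max(0, c - radius), min(cols, c + radius + 1))
    ]
    cover, n = Counter(cells).most_common(1)[0]
    return cover, n / len(cells)

# Toy land cover grid (labels are hypothetical)
grid = [
    ["forest", "forest", "crop"],
    ["forest", "fallow", "crop"],
    ["forest", "fallow", "fallow"],
]
print(window_mosaic(grid, 1, 1))  # dominant type and its share in the 3x3 window
```

A mosaic classification would then bin each cell by such window-level composition (e.g. "forest-dominated with crop inclusions") rather than by its own pixel label, which is what lets rotating cultivated and fallow fields fall into one land use unit.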
Abstract:
We have developed a novel way to assess the mutagenicity of environmentally important metal carcinogens, such as nickel, by creating a positive selection system based upon the conditional expression of a retroviral transforming gene. The target gene is the v-mos gene in MuSVts110, a murine retrovirus possessing a growth-temperature-dependent defect in expression of the transforming gene due to viral RNA splicing. In normal rat kidney cells infected with MuSVts110 (6m2 cells), splicing of the MuSVts110 RNA to form the mRNA from which the transforming protein, p85gag-mos, is translated is growth-temperature dependent, occurring at 33°C and below but not at 39°C and above. This splicing "defect" is mediated by cis-acting viral sequences. Nickel chloride treatment of 6m2 cells, followed by growth at 39°C, allowed the selection of "revertant" cells which constitutively express p85gag-mos due to stable changes in the viral RNA splicing phenotype, suggesting that nickel, a carcinogen whose mutagenicity has not been well established, can induce mutations in mammalian genes. We also show, by direct sequencing of PCR-amplified integrated MuSVts110 DNA from a 6m2 nickel-revertant cell line, that the nickel-induced mutation affecting the splicing phenotype is a cis-acting 70-base duplication of a region of the viral DNA surrounding the 3′ splice site. These findings provide the first example of the molecular basis for a nickel-induced DNA lesion and establish the mutagenicity of this potent carcinogen.
Abstract:
Recently it has been proposed that evaluating the effects of pollutants on aquatic organisms can provide an early warning system of potential environmental and human health risks (NRC 1991). Unfortunately, few methods are available to aquatic biologists for assessing the effects of pollutants on aquatic animal community health. The primary goal of this research was to develop and evaluate the feasibility of such a method. Specifically, the primary objective of this study was to develop a prototype rapid bioassessment technique, similar to the Index of Biotic Integrity (IBI), for the upper Texas and northwestern Gulf of Mexico coastal tributaries. The IBI consists of a series of "metrics", each describing a specific attribute of the aquatic community. Each metric is given a score, and the scores are summed to derive a total assessment of the "health" of the aquatic community. This IBI procedure may provide an additional assessment tool for professionals in water quality management. The experimental design consisted primarily of compiling previously collected data from monitoring conducted by the Texas Natural Resource Conservation Commission (TNRCC) at five bayous, classified according to potential for anthropogenic impact and salinity regime. Standardized hydrological, chemical, and biological monitoring had been conducted in each of these watersheds. Candidate metrics for inclusion in the estuarine IBI were identified and evaluated through correlation analysis, cluster analysis, stepwise and normal discriminant analysis, and evaluation of cumulative distribution frequencies. Scores for each included metric were determined based on exceedances of specific percentiles. Individual scores were summed, and a total IBI score and rank for the community computed. Results of these analyses yielded the proposed metrics and rankings listed in this report.
Based on the results of this study, incorporation of an estuarine IBI method as a water quality assessment tool is warranted. Adopted metrics were correlated with seasonal trends, and less so with the salinity gradients observed during the study (0–25 ppt). Further refinement of this method is needed using a larger, more inclusive data set that includes additional habitat types, salinity ranges, and temporal variation.
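The score-and-sum structure of an IBI can be sketched as follows; the 1/3/5 scoring rule, metric names, and percentile cut-offs below are hypothetical placeholders, not the metrics adopted in this study:

```python
def score_metric(value, p25, p75):
    """Score a metric 1/3/5 against assumed reference-site percentiles."""
    if value >= p75:
        return 5
    if value >= p25:
        return 3
    return 1

def total_ibi(metrics, reference_percentiles):
    """Sum per-metric scores into a total IBI for one site."""
    return sum(
        score_metric(metrics[name], *reference_percentiles[name])
        for name in metrics
    )

# Hypothetical site values and reference (p25, p75) cut-offs
site = {"species_richness": 18, "trophic_diversity": 2.1, "abundance_index": 0.6}
refs = {"species_richness": (10, 20), "trophic_diversity": (1.5, 2.5), "abundance_index": (0.3, 0.8)}
print(total_ibi(site, refs))  # higher totals rank the community as healthier
```

The study's actual metrics and percentile exceedance thresholds would replace these placeholders; the ranking step then bins the total score into categories.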
Abstract:
The discovery and characterization of oncofetal proteins have led to significant advances in early cancer diagnosis and therapeutic monitoring of patients undergoing cancer chemotherapy. These tumor-associated antigens are presently measured by sensitive, specific immunoassay techniques based on the detection of minute amounts of labeled antigen or antibody incorporated into immune complexes, which must be isolated from free antigen and antibody. Since there are several disadvantages with using radioisotopes, the most common immunolabel, one major objective was to prepare covalently coupled enzyme-antibody conjugates and evaluate their use as a practical alternative to radiolabeled immune reagents. An improved technique for the production of enzyme-antibody conjugates was developed that involves oxidizing the carbohydrate moieties on a glycoprotein enzyme, then introducing antibody in the presence of polyethylene glycol (PEG). Covalent enzyme-antibody conjugates involving alkaline phosphatase and amyloglucosidase were produced and characterized. In order to increase the sensitivity of detecting the amyloglucosidase-antibody conjugate, an enzyme cycling assay was developed that measures glucose, the product of maltose cleavage by amyloglucosidase, in the picomole range. The increased sensitivity obtained by combined usage of the amyloglucosidase-antibody conjugate and enzyme cycling assay was then compared to that of conventional enzyme immunoassay (EIA). For immune complex isolation, polystyrene tubes and protein A-bearing Staphylococcus aureus were evaluated as solid phase matrices, upon which antibodies can be immobilized. A sandwich-type EIA, using antibody-coated S. aureus, was developed that measures human albumin (HSA) in the nanogram range.
The assay, using an alkaline phosphatase-anti-HSA conjugate, was applied to the determination of HSA in human urine and evaluated extensively for its clinical applicability. Finally, in view of the clinical significance of alpha-fetoprotein (AFP) as an oncofetal antigen and the difficulty with its purification for use as an immunogen and assay standard, a chemical purification protocol was developed that resulted in a high yield of immunochemically pure AFP.
Abstract:
BACKGROUND The number of older adults in the global population is increasing. This demographic shift leads to an increasing prevalence of age-associated disorders, such as Alzheimer's disease and other types of dementia. With the progression of the disease, the risk for institutional care increases, which contrasts with the desire of most patients to stay in their home environment. Despite doctors' and caregivers' awareness of the patient's cognitive status, they are often uncertain about its consequences on activities of daily living (ADL). To provide effective care, they need to know how patients cope with ADL, in particular the estimation of risks associated with cognitive decline. The occurrence, performance, and duration of different ADL are important indicators of functional ability. The patient's ability to cope with these activities is traditionally assessed with questionnaires, which have disadvantages (e.g., lack of reliability and sensitivity). Several groups have proposed sensor-based systems to recognize and quantify these activities in the patient's home. Combined with Web technology, these systems can inform caregivers about their patients in real time (e.g., via smartphone). OBJECTIVE We hypothesize that a non-intrusive system, which uses no body-mounted sensors, video-based imaging, or microphone recordings, would be better suited for use in dementia patients. Since it does not require the patient's attention and compliance, such a system might be well accepted by patients. We present a passive, Web-based, non-intrusive, assistive technology system that recognizes and classifies ADL. METHODS The components of this novel assistive technology system were wireless sensors distributed in every room of the participant's home and a central computer unit (CCU). The environmental data were acquired for 20 days (per participant) and then stored and processed on the CCU. In consultation with medical experts, eight ADL were classified.
RESULTS In this study, 10 healthy participants (6 women, 4 men; mean age 48.8 years; SD 20.0 years; age range 28-79 years) were included. For explorative purposes, one female Alzheimer patient (Montreal Cognitive Assessment score=23, Timed Up and Go=19.8 seconds, Trail Making Test A=84.3 seconds, Trail Making Test B=146 seconds) was measured in parallel with the healthy subjects. In total, 1317 ADL were performed by the participants, 1211 ADL were classified correctly, and 106 ADL were missed. This led to an overall sensitivity of 91.27% and a specificity of 92.52%. Each subject performed an average of 134.8 ADL (SD 75). CONCLUSIONS The non-intrusive wireless sensor system can acquire environmental data essential for the classification of activities of daily living. By analyzing retrieved data, it is possible to distinguish and assign data patterns to subjects' specific activities and to identify eight different activities in daily living. The Web-based technology allows the system to improve care and provides valuable information about the patient in real-time.
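The reported sensitivity and specificity follow the standard true-positive-rate and true-negative-rate definitions; a minimal sketch using the counts the abstract does give (the negative counts needed for specificity are not reported, so those arguments are illustrative):

```python
def sensitivity(tp, fn):
    """True-positive rate: correctly recognized events / all actual events."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True-negative rate: correctly rejected non-events / all non-events."""
    return tn / (tn + fp)

# The abstract reports 1211 of 1317 performed ADL classified correctly
# and 106 missed; recomputing from those counts alone:
print(round(100 * sensitivity(1211, 106), 2))
```

Note that 1211/1317 does not reproduce the reported 91.27% exactly, which suggests the study's bookkeeping involved counts beyond the three figures quoted in the abstract.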
Abstract:
This study focuses on relations between 7- and 9-year-old children's and adults' metacognitive monitoring and control processes. In addition to explicit confidence judgments (CJs), data on participants' control behavior during learning and recall, as well as implicit CJs, were collected with an eye-tracking device (Tobii 1750). Results revealed developmental progression in the accuracy of both implicit and explicit monitoring across age groups. In addition, the efficiency of learning and recall strategies increases with age, as older participants allocate more fixation time to critical information and less time to peripheral or potentially interfering information. Correlational analyses of recall performance, metacognitive monitoring, and control indicate significant interrelations among all of these measures, with varying patterns of correlations within age groups. Results are discussed in regard to the intricate relationship between monitoring and recall and their relation to performance.
Abstract:
BACKGROUND The cost-effectiveness of routine viral load (VL) monitoring of HIV-infected patients on antiretroviral therapy (ART) depends on various factors that differ between settings and across time. Low-cost point-of-care (POC) tests for VL are in development and may make routine VL monitoring affordable in resource-limited settings. We developed a software tool to study the cost-effectiveness of switching to second-line ART with different monitoring strategies, and focused on POC-VL monitoring. METHODS We used a mathematical model to simulate cohorts of patients from start of ART until death. We modeled 13 strategies (no 2nd-line, clinical, CD4 (with or without targeted VL), POC-VL, and laboratory-based VL monitoring, with different frequencies). We included a scenario with identical failure rates across strategies, and one in which routine VL monitoring reduces the risk of failure. We compared lifetime costs and averted disability-adjusted life-years (DALYs). We calculated incremental cost-effectiveness ratios (ICER). We developed an Excel tool to update the results of the model for varying unit costs and cohort characteristics, and conducted several sensitivity analyses varying the input costs. RESULTS Introducing 2nd-line ART had an ICER of US$1651-1766/DALY averted. Compared with clinical monitoring, the ICER of CD4 monitoring was US$1896-US$5488/DALY averted and VL monitoring US$951-US$5813/DALY averted. We found no difference between POC- and laboratory-based VL monitoring, except for the highest measurement frequency (every 6 months), where laboratory-based testing was more effective. Targeted VL monitoring was on the cost-effectiveness frontier only if the difference between 1st- and 2nd-line costs remained large, and if we assumed that routine VL monitoring does not prevent failure. CONCLUSION Compared with the less expensive strategies, the cost-effectiveness of routine VL monitoring essentially depends on the cost of 2nd-line ART. 
Our Excel tool is useful for determining optimal monitoring strategies for specific settings, with specific sex- and age-distributions and unit costs.
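The ICERs quoted above follow the standard incremental cost-effectiveness definition; a minimal sketch with hypothetical cost and DALY figures (the model's actual lifetime estimates are not reproduced here):

```python
def icer(cost_new, cost_ref, dalys_averted_new, dalys_averted_ref):
    """Incremental cost-effectiveness ratio: extra cost per extra DALY averted
    when moving from a reference strategy to a new one."""
    return (cost_new - cost_ref) / (dalys_averted_new - dalys_averted_ref)

# Hypothetical figures: a second-line strategy vs. no second-line ART
print(round(icer(12000.0, 9000.0, 6.0, 4.2)))  # US$ per DALY averted
```

A strategy sits on the cost-effectiveness frontier when no other strategy averts more DALYs at a lower ICER; that is why the frontier position of targeted VL monitoring shifts as the 1st-line/2nd-line cost gap changes.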
Abstract:
To date, most documentation of forensically relevant medical findings has been limited to traditional 2D photography, 2D conventional radiographs, sketches, and verbal description. The classic documentation in forensic science still has limitations, especially when 3D documentation is necessary. The goal of this paper is to demonstrate new 3D real-data-based geometric technology approaches. This paper presents approaches to 3D geometric documentation of injuries on the body surface and of internal injuries, in both living and deceased cases. Using modern imaging methods such as photogrammetry, optical surface scanning, and radiological CT/MRI scanning in combination, it could be demonstrated that a real, full 3D-data-based individual documentation of the body surface and internal structures is possible in a non-invasive and non-destructive manner. Using the data merging/fusing and animation possibilities, it is possible to answer reconstructive questions about the dynamic development of patterned injuries (morphologic imprints) and to evaluate whether they can be matched or linked to suspected injury-causing instruments. For the first time, to our knowledge, the method of optical and radiological 3D scanning was used to document the forensically relevant injuries of a human body in combination with vehicle damage. By this complementary documentation approach, individual forensic real-data-based analyses and animations were possible, linking body injuries to vehicle deformations or damage. These data allow conclusions to be drawn for automobile accident research, optimization of vehicle safety (pedestrian and passenger), and further development of crash dummies. Real 3D-data-based documentation opens a new horizon for scientific reconstruction and animation, bringing added value and a real quality improvement in forensic science.
Abstract:
This chapter aims to overcome the gap existing between case study research, which typically provides qualitative and process-based insights, and national or global inventories that typically offer spatially explicit and quantitative analysis of broader patterns, and thus to present adequate evidence for policymaking regarding large-scale land acquisitions. Therefore, the chapter links spatial patterns of land acquisitions to underlying implementation processes of land allocation. Methodologically linking the described patterns and processes proved difficult, but we have identified indicators that could be added to inventories and monitoring systems to make linkage possible. Combining complementary approaches in this way may help to determine where policy space exists for more sustainable governance of land acquisitions, both geographically and with regard to processes of agrarian transitions. Our spatial analysis revealed two general patterns: (i) relatively large forestry-related acquisitions that target forested landscapes and often interfere with semi-subsistence farming systems; and (ii) smaller agriculture-related acquisitions that often target existing cropland and also interfere with semi-subsistence systems. Furthermore, our meta-analysis of land acquisition implementation processes shows that authoritarian, top-down processes dominate. Initially, the demands of powerful regional and domestic investors tend to override socio-ecological variables, local actors’ interests, and land governance mechanisms. As available land grows scarce, however, and local actors gain experience dealing with land acquisitions, it appears that land investments begin to fail or give way to more inclusive, bottom-up investment models.
Abstract:
Soils are fundamental to ensuring water, energy and food security. Within the context of sustainable food production, it is important to share knowledge on existing and emerging technologies that support land and soil monitoring. Technologies such as remote sensing, mobile soil testing, and digital soil mapping have the potential to identify degraded and non- or little-responsive soils, and may also provide a basis for programmes targeting the protection and rehabilitation of soils. In the absence of such information, crop production assessments are often not based on the spatio-temporal variability in soil characteristics. In addition, uncertainties in soil information systems are notable and build up when predictions are used for monitoring soil properties or for biophysical modelling. Consequently, interpretations of model-based results have to be made cautiously. As such, they provide a scientific, but not always manageable, basis for farmers and/or policymakers. In general, the key incentives for stakeholders to aim for sustainable management of soils and more resilient food systems are complex at the farm as well as higher levels. The same is true of the drivers of soil degradation. The decision-making process aimed at sustainable soil management, be that at farm or higher level, also involves other goals and objectives valued by stakeholders, e.g. land governance, improved environmental quality, and climate change adaptation and mitigation. In this dialogue session we will share ideas on recent developments in the discourse on soils, their functions, and the role of soil and land information in enhancing food system resilience.
Abstract:
Environmental quality monitoring of water resources is challenged with providing the basis for safeguarding the environment against adverse biological effects of anthropogenic chemical contamination from diffuse and point sources. While current regulatory efforts focus on monitoring and assessing a few legacy chemicals, many more anthropogenic chemicals can be detected simultaneously in our aquatic resources. However, exposure to chemical mixtures does not necessarily translate into adverse biological effects, nor does it clearly show whether mitigation measures are needed. Thus, the question of which mixtures are present, and which have associated combined effects, becomes central for defining adequate monitoring and assessment strategies. Here we describe the vision of the international, EU-funded project SOLUTIONS, in which three routes are explored to link the occurrence of chemical mixtures at specific sites to the assessment of adverse biological combination effects. First of all, multi-residue target and non-target screening techniques covering a broader range of anticipated chemicals co-occurring in the environment are being developed. By improving sensitivity and detection limits for known bioactive compounds of concern, new analytical chemistry data for multiple components can be obtained and used to characterise priority mixtures. This information on chemical occurrence will be used to predict mixture toxicity and to derive combined effect estimates suitable for advancing environmental quality standards. Secondly, bioanalytical tools will be explored to provide aggregate bioactivity measures integrating all components that produce common (adverse) outcomes, even for mixtures of varying compositions.
The ambition is to provide comprehensive arrays of effect-based tools and trait-based field observations that link multiple chemical exposures to various environmental protection goals more directly and to provide improved in situ observations for impact assessment of mixtures. Thirdly, effect-directed analysis (EDA) will be applied to identify major drivers of mixture toxicity. Refinements of EDA include the use of statistical approaches with monitoring information for guidance of experimental EDA studies. These three approaches will be explored using case studies at the Danube and Rhine river basins as well as rivers of the Iberian Peninsula. The synthesis of findings will be organised to provide guidance for future solution-oriented environmental monitoring and explore more systematic ways to assess mixture exposures and combination effects in future water quality monitoring.
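A common default model for predicting mixture toxicity from component data, relevant to the first route above, is concentration addition; a minimal sketch (the abstract does not specify SOLUTIONS' exact combined-effect equations):

```python
def ec_mix(fractions, ec_components):
    """Effect concentration of a mixture under concentration addition:
    EC_mix = 1 / sum(p_i / EC_i), where p_i is the fraction of component i
    in the mixture and EC_i its individual effect concentration."""
    assert abs(sum(fractions) - 1.0) < 1e-9, "fractions must sum to 1"
    return 1.0 / sum(p / ec for p, ec in zip(fractions, ec_components))

# Equal-fraction binary mixture of components with EC50 of 2 and 8 (arbitrary units)
print(ec_mix([0.5, 0.5], [2.0, 8.0]))
```

Under this model the mixture is dominated by the more potent component (the result lies closer to 2 than to 8), which is why screening that improves detection limits for the most bioactive compounds matters most for combined-effect estimates.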
Abstract:
Research on lifestyle physical activity interventions suggests that they help individuals meet the new recommendations for physical activity made by the Centers for Disease Control and Prevention (CDC) and the American College of Sports Medicine (ACSM). The purpose of this research was to describe the rates of adherence to two lifestyle physical activity intervention arms and to examine the association between adherence and outcome variables, using data from Project PRIME, a lifestyle physical activity intervention based on the transtheoretical model and conducted by the Cooper Institute for Aerobics Research, Dallas, Texas. Participants were 250 sedentary healthy adults, aged 35 to 70 years, primarily non-Hispanic White, and in the contemplation and preparation stages of readiness to change. They were randomized to a group condition (PRIME G) or a mail- and telephone-delivered condition (PRIME C). Adherence measures included attending class (PRIME G), completing a monthly telephone call with a health educator (PRIME C), and completing homework assignments and self-monitoring minutes of moderate- to vigorous physical activity (both groups). In the first results paper, adherence over time and between conditions was examined: group attendance, completion of the monthly telephone call, and homework completion decreased over time, and participants in PRIME G were more likely to complete homework than those in PRIME C. Paper 2 aimed to determine whether the adherence measures predicted achievement of the CDC/ACSM physical activity guideline. In separate models for the two conditions, a latent variable measuring adherence was found to predict achievement of the guideline. Paper 3 examined the association between adherence measures and the transtheoretical model's processes of change within each condition.
For both conditions, participants who completed at least two thirds of the homework assignments improved their use of the processes of change more than those who completed less than that amount. These results suggest that encouraging adherence to a lifestyle physical activity intervention, at least among already motivated volunteers, may increase the likelihood of beneficial changes in the outcomes.
Abstract:
Background. Diabetes places a significant burden on the health care system. Reduction in blood glucose levels (HbA1c) reduces the risk of complications; however, little is known about the impact of disease management programs on medical costs for patients with diabetes. In 2001, economic costs associated with diabetes totaled $100 billion, and indirect costs totaled $54 billion. Objective. To compare outcomes of nurse case management by treatment algorithms with conventional primary care for glycemic control and cardiovascular risk factors in type 2 diabetic patients in a low-income Mexican American community-based setting, and to compare the cost effectiveness of the two programs. Patient compliance was also assessed. Research design and methods. An observational group comparison to evaluate a treatment intervention for type 2 diabetes management was implemented at three outpatient health facilities in San Antonio, Texas. All eligible type 2 diabetic patients attending the clinics during 1994–1996 became part of the study. Data were obtained from the study database, medical records, hospital accounting, and pharmacy cost lists, and entered into a computerized database. Three groups were compared: a Community Clinic Nurse Case Manager (CC-TA) following treatment algorithms, a University Clinic Nurse Case Manager (UC-TA) following treatment algorithms, and Primary Care Physicians (PCP) following conventional care practices at a Family Practice Clinic. The algorithms provided a disease management model, specifically for hyperglycemia, dyslipidemia, hypertension, and microalbuminuria, that progressively moved the patient toward ideal goals through adjustments in medication, self-monitoring of blood glucose, meal planning, and reinforcement of diet and exercise. Cost effectiveness of hemoglobin A1c final endpoints was compared. Results. There were 358 patients analyzed: 106 patients in CC-TA, 170 patients in UC-TA, and 82 patients in PCP groups.
Change in hemoglobin A1c (HbA1c) was the primary outcome measured. HbA1c results were presented at baseline, 6, and 12 months for CC-TA (10.4%, 7.1%, 7.3%), UC-TA (10.5%, 7.1%, 7.2%), and PCP (10.0%, 8.5%, 8.7%). Mean patient compliance was 81%. Levels of cost effectiveness were significantly different between clinics. Conclusion. Nurse case management with treatment algorithms significantly improved glycemic control in patients with type 2 diabetes, and was more cost effective.
Abstract:
Background. The United Nations' Millennium Development Goal (MDG) 4 aims for a two-thirds reduction in death rates for children under the age of five by 2015. The greatest risk of death is in the first week of life, yet most of these deaths can be prevented by such simple interventions as improved hygiene, exclusive breastfeeding, and thermal care. Deaths in the first month of life make up 28% of all deaths of children under five in Nigeria, a statistic that has remained unchanged despite various child health policies. This paper addresses the challenges of reducing the neonatal mortality rate in Nigeria by examining the literature on the efficacy of home-based newborn care interventions and policies that have been implemented successfully in India. Methods. I compared similarities and differences between India and Nigeria using qualitative descriptions and available quantitative data on various health indicators. The analysis included identifying policy-related factors and community approaches contributing to India's newborn survival rates. Databases and reference lists of articles were searched for randomized controlled trials of community health worker interventions shown to reduce neonatal mortality rates. Results. While it appears that Nigeria spends more money than India on health per capita ($136 vs. $132, respectively) and as percent of GDP (5.8% vs. 4.2%, respectively), it still lags behind India in its neonatal, infant, and under-five mortality rates (40 vs. 32 deaths/1000 live births, 88 vs. 48 deaths/1000 live births, and 143 vs. 63 deaths/1000 live births, respectively). Both countries have comparably low numbers of healthcare providers. Unlike their counterparts in Nigeria, Indian community health workers receive training on how to deliver postnatal care in the home setting and are monetarily compensated. Gender-related power differences still play a role in the societal structure of both countries.
A search of randomized controlled trials of home-based newborn care strategies yielded three relevant articles. Community health workers trained to educate mothers and provide a preventive package of interventions involving clean cord care, thermal care, breastfeeding promotion, and danger sign recognition during multiple postnatal visits in rural India, Bangladesh, and Pakistan reduced neonatal mortality rates by 54%, 34%, and 15–20%, respectively. Conclusion. Access to advanced technology is not necessary to reduce neonatal mortality rates in resource-limited countries. To address the urgency of neonatal mortality, countries with weak health systems need to start at the community level and invest in cost-effective, evidence-based newborn care interventions that utilize available human resources. While more randomized controlled studies are urgently needed, the current available evidence of models of postnatal care provision demonstrates that home-based care and health education provided by community health workers can reduce neonatal mortality rates in the immediate future.