73 results for early warning indicators
in SciELO Saúde Pública - SP
Abstract:
Risk factor surveillance is a complementary tool to morbidity and mortality surveillance that improves the likelihood that public health interventions are implemented in a timely fashion. The aim of this study was to identify population-level predictors of malaria outbreaks in endemic municipalities of Colombia, with the goal of developing an early warning system for malaria outbreaks. We conducted a multiple-group, exploratory, ecological study at the municipal level. Each of the 290 municipalities with endemic malaria that we studied was classified according to the presence or absence of outbreaks. Variables were measured from historical registries, and logistic regression was used to analyse the data. Altitude above sea level [odds ratio (OR) 3.65, 95% confidence interval (CI) 1.34-9.98], variability in rainfall (OR 1.85, 95% CI 1.40-2.44) and the proportion of inhabitants over 45 years of age (OR 0.17, 95% CI 0.08-0.38) were factors associated with malaria outbreaks in Colombian municipalities. The results suggest that environmental and demographic factors could have a significant ability to predict malaria outbreaks at the municipal level in Colombia. To advance the development of an early warning system, it will be necessary to adjust and standardise the collection of the required data and to evaluate the accuracy of the forecast models.
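As a sketch of the kind of municipal-level model this abstract describes, the snippet below fits a logistic regression of outbreak occurrence on the three reported predictors and reports odds ratios. The file and column names are illustrative assumptions, not the study's actual dataset.

```python
# Minimal sketch of a municipal-level outbreak model; the file and column
# names are illustrative assumptions, not the study's data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("municipalities.csv")  # hypothetical: one row per municipality

# outbreak: 1/0; altitude: m above sea level; rain_var: rainfall variability;
# prop_over45: proportion of inhabitants over 45 years of age
model = smf.logit("outbreak ~ altitude + rain_var + prop_over45", data=df).fit()

# Odds ratios with 95% confidence intervals, the form reported in the abstract
or_table = np.exp(model.conf_int())
or_table["OR"] = np.exp(model.params)
print(or_table)
```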
Abstract:
Wheat (Triticum aestivum L.) blast, caused by Pyricularia grisea, is a new disease in Brazil and no resistant cultivars are available. Interactions between temperature and wetness duration have been used in many early warning systems. Hence, growth chamber experiments were carried out to assess the effect of different temperatures (10, 15, 20, 25, 30 and 35°C) and durations of spike wetness (0, 5, 10, 15, 20, 25, 30, 35 and 40 hours) on the intensity of blast in cultivar BR23. Each temperature constituted an experiment, with the durations of wetness as treatments. The highest blast intensity was observed at 30°C and increased as the duration of the wetting period increased, while the lowest occurred at 25°C with 10 hours of spike wetness. Regardless of the temperature, no symptoms occurred when the wetting period was less than 10 hours, but at 25°C with a 40 h wetting period blast intensity exceeded 85%. The variation in blast intensity as a function of temperature is explained by a generalized beta model, and as a function of the duration of spike wetness by the Gompertz model. Disease intensity was modeled as a function of both temperature and duration of spike wetness, and the resulting equation provided a precise description of the response of P. grisea to these variables. This model was used to construct tables that can be used to predict the intensity of wheat blast from the temperatures and durations of wheat spike wetness observed in the field.
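The abstract gives the model's form (a generalized beta response to temperature multiplied by a Gompertz response to wetness duration) but not its fitted coefficients, so the sketch below uses placeholder parameters purely to illustrate how such a prediction table could be computed.

```python
# Illustrative encoding of the combined model form described above; the
# parameter values are placeholders, not the authors' fitted coefficients.
import numpy as np

def blast_intensity(temp_c, wet_h, tmin=10.0, tmax=35.0,
                    a=0.003, b=1.5, c=0.8, B=8.0, k=0.12):
    """Predicted blast intensity (0-1) at temperature temp_c (deg C) and
    spike-wetness duration wet_h (hours)."""
    t = np.clip(temp_c, tmin, tmax)
    beta_part = a * (t - tmin) ** b * (tmax - t) ** c            # temperature response
    gompertz_part = np.exp(-B * np.exp(-k * np.asarray(wet_h)))  # wetness response
    return np.clip(beta_part * gompertz_part, 0.0, 1.0)

# Tabulate predictions over field-relevant ranges, as the authors' tables do
for temp in (15, 20, 25, 30):
    row = [round(float(blast_intensity(temp, wet)), 2) for wet in (10, 20, 30, 40)]
    print(temp, row)
```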
Abstract:
Bank failures affect owners, employees, and customers, possibly causing large-scale economic distress. Thus, banks must evaluate operational risks and develop early warning systems. This study investigates bank failures in the Organization for Economic Co-operation and Development, the North American Free Trade Agreement (NAFTA) area, the Association of Southeast Asian Nations, the European Union, newly industrialized countries, the G20, and the G8. We use financial ratios to analyze and explore the appropriateness of prediction models. Results show that capital ratios, interest income relative to interest expenses, non-interest income relative to non-interest expenses, return on equity, and provisions for loan losses have significantly negative correlations with bank failure. However, loan ratios, non-performing loans, and fixed assets all have significantly positive correlations with bank failure. In addition, the logistic model provides the best prediction accuracy for banks from NAFTA countries.
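A hedged sketch of the ratio-based classification this abstract compares: fitting a logistic model on financial ratios and checking its prediction accuracy. The feature names, data file, and train/test split are assumptions for illustration only.

```python
# Sketch of a ratio-based failure classifier and an accuracy check; the
# feature names and data file are illustrative assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

banks = pd.read_csv("bank_ratios.csv")  # hypothetical: one row per bank-year
features = ["capital_ratio", "int_income_to_expense", "nonint_income_to_expense",
            "roe", "loan_loss_provision", "loan_ratio", "npl_ratio", "fixed_assets"]

X_train, X_test, y_train, y_test = train_test_split(
    banks[features], banks["failed"], test_size=0.3, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("prediction accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```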
Abstract:
National malaria control programmes have the responsibility to develop a policy for malaria disease management based on a set of defined criteria, such as efficacy, side effects, costs and compliance. These will fluctuate over time, and national guidelines will require periodic re-assessment and revision. Changing a drug policy is a major undertaking that can take several years before being fully operational. The standard methods on which a decision can be based are the in vivo and the in vitro tests. The latter allow a quantitative measurement of the drug response and the assessment of several drugs at once. However, in terms of drug policy change, their results might be difficult to interpret, although they may be used as an early warning system for 2nd- or 3rd-line drugs. The new WHO 14-day in vivo test mainly addresses the problem of treatment failure and of changes in haematological parameters in sick children. It gives valuable information on whether a drug still 'works'. None of these methods is well suited for large-scale studies. Molecular methods based on the detection of mutations in the parasite molecules targeted by antimalarial drugs could be attractive tools for surveillance. However, their relationship with in vivo test results needs to be established.
Abstract:
Countries could use the monitoring of drug resistance in malaria parasites as an effective early warning system to develop the timely response mechanisms that are required to avert the further spread of malaria. Drug resistance surveillance is essential in areas where no drug resistance has been reported, especially if neighbouring countries have previously reported resistance. Here, we present the results of a four-year surveillance program based on the sequencing of the pfcrt gene of Plasmodium falciparum populations from endemic areas of Honduras. All isolates were susceptible to chloroquine, as revealed by the pfcrt “CVMNK” genotype in codons 72-76.
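A small sketch of the genotype call reported above: translating pfcrt codons 72-76 and checking for the chloroquine-susceptible CVMNK haplotype. The input sequence is invented, and the fragment is assumed to start exactly at codon 72.

```python
# Sketch of calling the pfcrt 72-76 haplotype from a sequenced fragment.
# Assumes the fragment starts exactly at codon 72; the sequence is invented.
from Bio.Seq import Seq  # Biopython

def pfcrt_haplotype(codons_72_76_dna: str) -> str:
    aa = str(Seq(codons_72_76_dna).translate())
    if aa == "CVMNK":
        return f"{aa}: chloroquine-susceptible (wild type)"
    return f"{aa}: non-wild-type haplotype, possible resistance marker"

# Hypothetical wild-type fragment covering codons 72-76
print(pfcrt_haplotype("TGTGTAATGAATAAA"))  # -> CVMNK: chloroquine-susceptible (wild type)
```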
Abstract:
Acute cerebral hemorrhage (ACH) is an important clinical problem that is often monitored and studied with expensive devices such as computed tomography, magnetic resonance imaging, and positron emission tomography. These devices are not readily available in economically underdeveloped regions of the world, emergency departments, and emergency zones. We have developed a less expensive tool for non-contact monitoring of ACH. The system measures the magnetic induction phase shift (MIPS) between the electromagnetic signals on two coils. ACH was induced in six experimental rabbits and edema in four control rabbits by stereotactic methods, and their intracranial pressure and heart rate were monitored for 1 h. Signals were continuously monitored for up to 1 h at an excitation frequency of 10.7 MHz. Autologous blood was administered to the experimental group, and saline to the control group (1 to 3 mL), by injecting 1 mL every 5 min. The results showed a significant increase in MIPS as a function of the injected volume, while the heart rate remained stable. In the experimental (ACH) group, there was a statistically significant positive correlation between intracranial pressure and MIPS. The change in MIPS was greater in the ACH group than in the control group. This high-sensitivity system could detect a 1-mL change in blood volume, and the MIPS was significantly related to intracranial pressure. These observations suggest that the method could be valuable for detecting early warning signs in emergency medicine and critical care units.
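A sketch of the core MIPS measurement: estimating the phase difference between the excitation-coil and detection-coil signals at the 10.7 MHz drive frequency. The simulated signals and sampling rate below are stand-ins for real coil data.

```python
# Phase-shift estimation between two coil signals at the drive frequency.
# The signals and sampling rate below are simulated stand-ins.
import numpy as np

FS = 42.8e6   # sampling rate (Hz), assumed: 4x the excitation frequency
F0 = 10.7e6   # excitation frequency from the abstract (Hz)
N = 4096
t = np.arange(N) / FS

tx = np.sin(2 * np.pi * F0 * t)                 # excitation coil
rx = 0.8 * np.sin(2 * np.pi * F0 * t + 0.02)    # detection coil, 0.02 rad shift

def phase_at(signal, freq, fs):
    """Phase (rad) of `signal` at `freq`, read from the nearest DFT bin."""
    spectrum = np.fft.rfft(signal)
    k = int(round(freq * len(signal) / fs))
    return np.angle(spectrum[k])

mips = phase_at(rx, F0, FS) - phase_at(tx, F0, FS)
print(f"phase shift: {mips:.4f} rad")  # ~0.02; tracks induced blood-volume change
```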
Abstract:
OBJECTIVE: To identify areas with clusters of infants exposed to HIV during pregnancy and their association with indicators of primary care coverage and socioeconomic condition. METHODS: Ecological study in which the unit of analysis was primary care coverage areas in the city of Porto Alegre, Southern Brazil, in 2003. Geographic Information System and spatial analysis tools were used to describe indicators of primary care coverage areas and socioeconomic condition, and to estimate the prevalence of liveborn infants exposed to HIV during pregnancy and delivery. Data were obtained from Brazilian national databases. The association between the different indicators was assessed using Spearman's nonparametric test. RESULTS: An association was found between HIV infection and high birth rates (r=0.22, p<0.01) and lack of prenatal care (r=0.15, p<0.05). The highest HIV infection rates were seen in areas with poor socioeconomic conditions and difficult access to health services (r=0.28, p<0.01). The association found between a higher rate of prenatal care among HIV-infected women and adequate immunization coverage (r=0.35, p<0.01) indicates that early detection of HIV infection is effective in areas with better primary care services. CONCLUSIONS: Urban poverty is a strong determinant of mother-to-child HIV transmission, but this trend can be countered with health surveillance at the primary care level.
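A minimal sketch of the area-level correlation test named above (Spearman's nonparametric test); the indicator and file names are illustrative assumptions.

```python
# Sketch of the area-level Spearman correlation; names are illustrative.
import pandas as pd
from scipy.stats import spearmanr

areas = pd.read_csv("coverage_areas.csv")  # hypothetical: one row per coverage area
rho, p = spearmanr(areas["hiv_exposure_prevalence"], areas["lack_of_prenatal_care"])
print(f"Spearman r = {rho:.2f}, p = {p:.3f}")
```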
Abstract:
Silent transmission of Mycobacterium leprae, as evidenced by stable leprosy incidence rates in various countries, remains a health challenge despite the implementation of multidrug therapy worldwide. Therefore, the development of tools for the early diagnosis of M. leprae infection should be emphasised in leprosy research. As part of the continuing effort to identify antigens with diagnostic potential, unique M. leprae peptides derived from predicted virulence-associated proteins (group IV.A) were identified using advanced genome pattern programs and bioinformatics. Based on human leukocyte antigen (HLA)-binding motifs, we selected 21 peptides predicted to be promiscuous HLA class I-restricted T-cell epitopes and eight peptides predicted to be HLA class II-restricted T-cell epitopes for field testing in Brazil, Ethiopia and Nepal. High levels of interferon (IFN)-γ were induced when peripheral blood mononuclear cells (PBMCs) from tuberculoid/borderline tuberculoid leprosy patients in Brazil and Ethiopia were stimulated with the ML2055 p35 peptide. PBMCs isolated from healthy endemic controls living in areas with high leprosy prevalence (EChigh) in Ethiopia also responded to the ML2055 p35 peptide. The Brazilian EChigh group recognised the ML1358 p20 and ML1358 p24 peptides. None of the peptides was recognised by PBMCs from healthy controls living in non-endemic regions. In Nepal, mixtures of these peptides induced the production of IFN-γ by the PBMCs of leprosy patients and EChigh controls. Therefore, the M. leprae virulence-associated peptides identified in this study may be useful for detecting exposure to M. leprae in populations with differing HLA polymorphisms.
Abstract:
This hybrid study, combining technological production and methodological research, aimed to establish associations between the data and information that are part of a Computerized Nursing Process according to ICNP® Version 1.0 and indicators of patient safety and quality of care. Based on the guidelines of the Agency for Healthcare Research and Quality and the American Association of Critical-Care Nurses for the expansion of warning systems, five warning systems were developed: potential for iatrogenic pneumothorax, potential for care-related infections, potential for suture dehiscence in patients after abdominal or pelvic surgery, potential for loss of vascular access, and potential for endotracheal extubation. The warning systems are a continuous computerized resource covering situations essential to patient safety, and they offer a way to stimulate clinical reasoning and support the clinical decision making of intensive care nurses.
Abstract:
Among the production factors, water and fertilizer are the elements that most restrict cashew production. Applying the right amounts of these factors is essential to crop yield. This research aimed to determine the best factor-product ratio and to analyze technical and economic indicators of the productivity response of the cashew clone BRS 189 (Anacardium occidentale) to the production factors water and potassium. The experiment was conducted from May 2009 to December 2009 in an experimental area of 56.0 m x 112.0 m in the Curu-Pentecoste irrigated perimeter, located in the municipality of Pentecoste, Ceará, Brazil. The production factors water (W) and potassium (K) were the independent variables and productivity (Y) the dependent variable. Ten statistical models that have proven satisfactory for obtaining production functions were tested. The marginal rate of substitution was obtained as the ratio of the marginal physical product of potassium to the marginal physical product of water. The model best suited to the conditions of the experiment was the quadratic polynomial without intercept and interaction. Considering that the price of water was R$ 0.10 mm-1, the price of potassium R$ 2.19 kg-1 and the price of cashew R$ 0.60 kg-1, the amounts of water and K2O needed to obtain the maximum net income were 6,349.1 L plant-1 of water and 128.7 g plant-1 year-1 of K2O, respectively. Substituting these values into the production function, the maximum net income was achieved with a yield of 7,496.8 kg ha-1 of cashew.
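As a sketch of the profit-maximization step described above, the snippet below sets the first-order conditions of net income to zero under a quadratic production function without intercept or interaction. The coefficients a-d are placeholders (the abstract does not report them); the prices come from the abstract.

```python
# Profit maximization under the quadratic production function (no intercept,
# no interaction). Coefficients a, b, c, d are placeholders; prices are from
# the abstract (R$ 0.10 mm-1 water, R$ 2.19 kg-1 K2O, R$ 0.60 kg-1 cashew).
import sympy as sp

W, K = sp.symbols("W K", positive=True)   # water, potassium
a, b, c, d = 2.0, 40.0, -0.00015, -0.15   # placeholder coefficients
p_y, p_w, p_k = 0.60, 0.10, 2.19          # prices from the abstract

Y = a * W + b * K + c * W**2 + d * K**2   # production function
income = p_y * Y - p_w * W - p_k * K      # net income

# First-order conditions: marginal value product of each factor = its price
sol = sp.solve([sp.diff(income, W), sp.diff(income, K)], [W, K])
print("optimal W, K:", sol)
print("maximum net income:", income.subs(sol))
```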
Abstract:
In early lactation, dairy cattle suffer metabolic alterations caused by negative energy balance, which predisposes them to fatty liver and ketosis. The aim of this study was to evaluate the metabolic condition of high-yielding dairy cows subjected to three treatments for preventing severe lipomobilization and ketosis in early lactation. Fifty-four multiparous Holstein cows yielding >30 L/day were divided into four groups: control (CN = no treatment), glucose precursor (PG = propylene glycol), hepatic protector (Mp = Mercepton®), and energy supplement with salts of linolenic and linoleic fatty acids (Mg-E = Megalac-E®). Treatments were administered randomly from the moment of calving until 8 weeks postpartum. Blood samples were collected on days 1, 7, 14, 21, 28, 35, 42 and 49 postpartum. Body condition score (BCS) was evaluated at the same time points, and milk yield was recorded at the 2nd, 4th, 5th, 6th, 7th, and 8th weeks of lactation. Concentrations of non-esterified fatty acids (NEFA), albumin, AST, ß-hydroxybutyrate (BHBA), cholesterol, glucose, total protein, urea and triglycerides were analyzed in the blood samples. Cut-off points for subclinical ketosis were defined as BHBA >1.4 mmol/L and NEFA >0.7 mmol/L. The overall occurrence of subclinical ketosis during the period was 24%. An ascending curve of cholesterol and glucose was observed from the 1st to the 8th week of lactation, while no clear tendency was observed for BHBA and NEFA, although differences among treatments were detected (p<0.05). BCS decreased from a mean of 3.85 at the 1st week to 2.53 at the 8th week of lactation (p=0.001). Milk yield was higher in the Mg-E group than in the other treatment groups (p<0.05). Compared with the CN group, the Mp and PG treatments did not show significant differences in blood biochemistry or milk yield. Cows receiving PG and Mg-E showed higher values of BHBA and NEFA (p<0.05), indicating accentuated lipomobilization. Supplementation with Mg-E also resulted in significantly higher concentrations of cholesterol, BHBA, urea and AST, and lower glycemia. This profile may be explained by the higher milk yield observed with this treatment. Treatments with PG and Mp did not improve milk yield compared with control cows, but showed no metabolic evidence of ketosis, fat mobilization or fatty liver. These results suggest that treatment with Mg-E improves milk production but induces a stronger negative energy balance, leading to moderate lipomobilization and ketone body production and increasing the risk of fatty liver.
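A tiny sketch of the subclinical-ketosis screen implied by the cut-offs above. The abstract does not state whether both thresholds must be exceeded, so this sketch flags a cow when either one is; the sample values are invented.

```python
# Subclinical-ketosis screen based on the cut-offs above. The abstract does
# not say whether both thresholds must be exceeded; this flags either one.
def subclinical_ketosis(bhba_mmol_l: float, nefa_mmol_l: float) -> bool:
    return bhba_mmol_l > 1.4 or nefa_mmol_l > 0.7

cows = [("A12", 1.6, 0.5), ("B07", 0.9, 0.4), ("C33", 1.1, 0.9)]  # invented values
for cow_id, bhba, nefa in cows:
    print(cow_id, "flagged" if subclinical_ketosis(bhba, nefa) else "ok")
```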
Abstract:
Slow-release and organic fertilizers are promising alternatives to conventional fertilizers, as both reduce losses by leaching and volatilization and problems of toxicity and/or salinity to plants. The objective of this work was to evaluate the effect of different rates of the organic fertilizer Humato-Macota® compared with the slow-release fertilizer Osmocote® on the growth and nitrogen content in the dry matter of Rangpur lime. A field experiment was conducted in a completely randomized factorial design with an additional treatment (4 x 4 + 1). The first factor consisted of four Humato-Macota® rates (0, 1, 2, and 3%) applied to the substrate; the second factor consisted of the same Humato-Macota® concentrations applied as fortnightly foliar sprays; the additional treatment consisted of the application of 5 kg m-3 Osmocote® 18-05-09. Means of all growth characteristics (plant height, total dry matter, root/shoot ratio and leaf area) and the potential quantum yield of photosystem II (Fv/Fm) were higher when plants were fertilized with the slow-release fertilizer. The organic fertilizer applied alone did not meet the N requirement of Rangpur lime.
Abstract:
The performance, carcass traits and finishing costs of Suffolk lambs were evaluated in three systems: (1) lambs weaned at 22 kg of body weight (BW) and supplemented with concentrate on pasture until slaughter; (2) lambs weaned at 22 kg BW and fed in a feedlot until slaughter; (3) lambs maintained under controlled nursing after 22 kg BW and creep-fed in a feedlot until slaughter. Average daily gain (ADG) was 224 g/d for lambs weaned and supplemented with concentrate on pasture, 386 g/d for lambs weaned in the feedlot and 481 g/d for lambs under controlled nursing. Empty body weight and visceral fat deposition were highest in lambs from the feedlot systems. Carcass weights and carcass yields were highest for lambs under controlled nursing. Total finishing costs were highest for controlled nursing and lowest for the system with weaning in the feedlot. A high-concentrate diet associated with controlled nursing in the feedlot allowed lambs to reach their growth potential and produce carcasses with higher weights, yields and fat content. After weaning, lambs in the feedlot fed a high-concentrate diet had higher weight gain than lambs supplemented with concentrate on pasture; carcasses produced in these two systems presented the same characteristics. The system with weaning in the feedlot showed the lowest cost per kg of carcass.
Abstract:
OBJECTIVE: To examine the relationship between growth patterns in early childhood and the onset of menarche before age 12. METHODS: The study included 2,083 women from a birth cohort study conducted in the city of Pelotas, Southern Brazil, starting in 1982. Anthropometric, behavioral, and pregnancy-related variables were collected through home interviews. Statistical analyses were performed using Pearson's chi-square test and the chi-square test for linear trend. A multivariable analysis was carried out using Poisson regression based on a hierarchical model. RESULTS: The mean age at menarche was 12.4 years and the prevalence of menarche before age 12 was 24.3%. Higher weight-for-age, height-for-age, and weight-for-height z-scores at 19.4 and 43.1 months of age were associated with linear trends of increased prevalence and relative risk of menarche before age 12. Girls who experienced rapid growth in weight-for-age z-score from birth to 19.4 months of age, or in weight-for-age or height-for-age z-scores from 19.4 to 43.1 months of age, also showed a higher risk of menarche before age 12. The risk was higher when rapid growth in weight-for-age z-score occurred during both age intervals, and the highest risk was found among those in the first tertile of Williams' curve at birth. Rapid growth in weight-for-height z-score was not associated with menarche before age 12. CONCLUSIONS: Menarche is affected by nutritional status and growth patterns during early childhood. Preventing overweight and obesity during early childhood and maintaining a "normal" growth pattern seem crucial for the prevention of health conditions during adulthood.
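A hedged sketch of the multivariable step named above. Poisson regression with robust standard errors is a common way to estimate prevalence ratios for a binary outcome such as early menarche; the variable and file names are illustrative, and the hierarchical entry of variables is omitted.

```python
# Poisson regression for prevalence ratios of early menarche; variable and
# file names are illustrative, and hierarchical variable entry is omitted.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

girls = pd.read_csv("pelotas_1982_cohort.csv")  # hypothetical cohort extract
model = smf.glm("early_menarche ~ waz_43m + haz_43m + rapid_growth_0_19m",
                data=girls, family=sm.families.Poisson()).fit(cov_type="HC0")
print(np.exp(model.params))  # exponentiated coefficients ~ prevalence ratios
```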
Abstract:
OBJECTIVE: To evaluate growth parameters in infants born to HIV-1-infected mothers. METHODS: The study was a longitudinal evaluation of the z-scores for weight-for-age (WAZ), weight-for-length (WLZ) and length-for-age (LAZ) data collected from a cohort. A total of 97 uninfected and 33 HIV-infected infants born to HIV-1-infected mothers in Belo Horizonte, Southeastern Brazil, between 1995 and 2003 were studied. The average follow-up period was 15.8 months (range: 6.8 to 18.0 months) for the infected children and 14.3 months (range: 6.3 to 18.6 months) for the uninfected children. A mixed-effects linear regression model was fitted using restricted maximum likelihood. RESULTS: A decrease over time in the WAZ, LAZ and WLZ was observed among the infected infants. At six months of age, the mean differences in the WAZ, LAZ and WLZ between the HIV-infected and uninfected infants were 1.02, 0.59, and 0.63 standard deviations, respectively. At 12 months, the mean differences were 1.15, 1.01, and 0.87 standard deviations, respectively. CONCLUSIONS: The early and progressive deterioration of the HIV-infected infants' anthropometric indicators demonstrates the importance of early identification of HIV-infected infants at nutritional risk and of continuous assessment of nutritional interventions for these infants.
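A minimal sketch of the growth-curve model described above: a mixed-effects linear regression of z-scores over age, fitted by restricted maximum likelihood. The file, variable, and grouping names are illustrative assumptions.

```python
# Mixed-effects model of WAZ over age, fitted by REML; names are illustrative.
import pandas as pd
import statsmodels.formula.api as smf

visits = pd.read_csv("growth_visits.csv")  # hypothetical: one row per visit
model = smf.mixedlm("waz ~ age_months * hiv_infected", data=visits,
                    groups=visits["infant_id"]).fit(reml=True)
print(model.summary())
```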