961 results for Soil-water Characteristic Curve

Relevance:

100.00%

Publisher:

Abstract:

Data are reported on the background and performance of the K6 screening scale for serious mental illness (SMI) in the World Health Organization (WHO) World Mental Health (WMH) surveys. The K6 is a six-item scale developed to provide a brief valid screen for Diagnostic and Statistical Manual of Mental Disorders 4th edition (DSM-IV) SMI based on the criteria in the US ADAMHA Reorganization Act. Although methodological studies have documented good K6 validity in a number of countries, optimal scoring rules have never been proposed. Such rules are presented here based on analysis of K6 data in nationally or regionally representative WMH surveys in 14 countries (combined N = 41,770 respondents). Twelve-month prevalence of DSM-IV SMI was assessed with the fully-structured WHO Composite International Diagnostic Interview. Nested logistic regression analysis was used to generate estimates of the predicted probability of SMI for each respondent from K6 scores, taking into consideration the possibility of variable concordance as a function of respondent age, gender, education, and country. Concordance, assessed by calculating the area under the receiver operating characteristic curve, was generally substantial (median 0.83; range 0.76-0.89; inter-quartile range 0.81-0.85). Based on this result, optimal scaling rules are presented for use by investigators working with the K6 scale in the countries studied. Copyright (c) 2010 John Wiley & Sons, Ltd.
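The concordance statistic reported here (area under the ROC curve) equals the probability that a randomly chosen SMI case has a higher K6 score than a randomly chosen non-case, with ties counted as half. A minimal pure-Python sketch of that rank-based computation, using invented K6 totals rather than the WMH survey data:

```python
def roc_auc(scores_pos, scores_neg):
    """AUC as the Mann-Whitney probability that a positive case
    outranks a negative one (ties count as half a win)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical K6 totals (range 0-24) for SMI cases and non-cases
cases = [18, 21, 15, 12, 20]
non_cases = [5, 9, 3, 12, 7, 10]
print(round(roc_auc(cases, non_cases), 3))  # prints 0.983
```

The O(n*m) pairwise loop is for illustration only; `sklearn.metrics.roc_auc_score` computes the same quantity efficiently on real data.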

Abstract:

This study was designed to examine the use of the QuantiFERON-TB Gold assay as an aid in the diagnosis of active pulmonary tuberculosis (TB) in Brazilian patients. Based on receiver operating characteristic curve analysis, the cutoff was adjusted to >= 0.20 IU/ml, which increased sensitivity to 86% with 100% specificity. All TB patients with negative sputum smear microscopy and negative culture results were positive by this test.
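Adjusting a quantitative assay cutoff trades sensitivity against specificity. As an illustration only (invented IFN-gamma responses, not the study data), lowering the positivity threshold can recover true positives without adding false positives when the control values sit well below both thresholds:

```python
def sens_spec(values_diseased, values_healthy, cutoff):
    """Sensitivity and specificity when 'value >= cutoff' is a positive call."""
    tp = sum(v >= cutoff for v in values_diseased)
    tn = sum(v < cutoff for v in values_healthy)
    return tp / len(values_diseased), tn / len(values_healthy)

# Hypothetical IFN-gamma responses (IU/ml)
tb = [0.22, 1.4, 0.31, 3.0, 0.05, 0.9, 2.1]
controls = [0.01, 0.10, 0.15, 0.04, 0.19, 0.08]
print(sens_spec(tb, controls, 0.35))  # manufacturer-style 0.35 IU/ml cutoff
print(sens_spec(tb, controls, 0.20))  # lowered 0.20 IU/ml cutoff
```

With these made-up values, the lower cutoff raises sensitivity from 4/7 to 6/7 while specificity stays at 1.0, mirroring the trade-off described above.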

Abstract:

Objective. The purpose of this study was to evaluate the diagnostic usefulness of ulnar nerve sonography in leprosy neuropathy with electrophysiologic correlation. Methods. Twenty-one consecutive patients with leprosy (12 men and 9 women; mean age +/- SD, 47.7 +/- 17.2 years) and 20 control participants (14 men and 6 women; mean age, 46.5 +/- 16.2 years) were evaluated with sonography. Leprosy diagnosis was established on the basis of clinical, bacteriologic, and histopathologic criteria. The reference standard for ulnar neuropathy in this study was clinical symptoms in patients with proven leprosy. The sonographic cross-sectional areas (CSAs) of the ulnar nerve in 3 different regions were obtained. Statistical analyses included Student t tests and receiver operating characteristic curve analysis. Results. The CSAs of the ulnar nerve were significantly larger in the leprosy group than in the control group for all regions (P < .01). Sonographic abnormalities in leprosy nerves included focal thickening (90.5%), hypoechoic areas (81%), loss of the fascicular pattern (33.3%), and focal hyperechoic areas (4.7%). Receiver operating characteristic curve analysis showed that a maximum CSA cutoff value of 9.8 mm² was the best discriminator (sensitivity, 0.91; specificity, 0.90). Three patients with normal electrophysiologic findings had abnormal sonographic findings. Two patients had normal sonographic findings, of whom 1 had abnormal electrophysiologic findings and the other refused electrophysiologic testing. Conclusions. Sonography and electrophysiology were complementary for identifying ulnar neuropathy in patients with leprosy, with clinical symptoms as the reference standard. This reinforces the role of sonography in the investigation of leprosy ulnar neuropathy.
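A common way to pick the "best discriminator" cutoff from a ROC analysis is to maximize Youden's J (sensitivity + specificity - 1) over candidate thresholds. A hedged sketch with hypothetical cross-sectional areas, not the study measurements:

```python
def best_cutoff(pos, neg):
    """Scan candidate cutoffs and return the one maximizing Youden's
    J = sensitivity + specificity - 1 (rule: value >= cutoff is positive)."""
    best = (None, -1.0)
    for c in sorted(set(pos) | set(neg)):
        sens = sum(v >= c for v in pos) / len(pos)
        spec = sum(v < c for v in neg) / len(neg)
        j = sens + spec - 1
        if j > best[1]:
            best = (c, j)
    return best

# Hypothetical maximum ulnar-nerve CSAs (mm^2)
leprosy = [10.2, 12.5, 9.9, 14.0, 11.1]
controls = [6.3, 7.8, 8.5, 9.0, 7.1]
cutoff, j = best_cutoff(leprosy, controls)
print(cutoff, j)
```

With these invented, perfectly separated values the scan lands on 9.9 mm² with J = 1.0; on real overlapping data J is below 1 and the chosen cutoff balances the two error rates.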

Abstract:

Methods: We assessed the outcome of 56 patients with Chagas' cardiomyopathy (31 men; mean age, 55 years; mean left ventricular ejection fraction [LVEF], 42%) presenting with either sustained ventricular tachycardia (VT) or nonsustained VT (NSVT), before therapy with an implantable cardioverter-defibrillator was available at our center. Results: Over a mean follow-up of 38 +/- 16 months (range, 1-61 months), 16 patients (29%) died, 11 due to sudden cardiac death (SCD) and five from progressive heart failure. Survivors and nonsurvivors had comparable baseline characteristics, except for a lower LVEF (46 +/- 7% vs 31 +/- 9%, P < 0.001) and a higher New York Heart Association class (P = 0.003) in those who died during follow-up. Receiver operating characteristic curve analysis showed that an LVEF cutoff value of 38% had the best accuracy for predicting all-cause mortality and an LVEF cutoff value of 40% had the best accuracy for predicting SCD. In multivariate Cox regression analysis, LVEF < 40% was the only predictor of all-cause mortality (hazard ratio [HR] 12.22, 95% confidence interval [CI] 3.46-43.17, P = 0.0001) and SCD (HR 6.58, 95% CI 1.74-24.88, P = 0.005). Conclusions: Patients with Chagas' cardiomyopathy presenting with either sustained VT or NSVT run a major risk of mortality when they have concomitant moderate or severe LV systolic dysfunction. (PACE 2011; 54-62).
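Survival comparisons like the one behind this LVEF cutoff usually start from Kaplan-Meier estimates before Cox modeling. A minimal sketch of the estimator, using invented follow-up times for a hypothetical low-LVEF stratum (not the study data):

```python
from collections import Counter

def km_curve(times, events):
    """Kaplan-Meier estimate: at each event time t, survival is
    multiplied by (1 - deaths_at_t / number_at_risk); censored
    subjects (event=0) leave the risk set without contributing an event."""
    deaths = Counter(t for t, e in zip(times, events) if e)
    leaving = Counter(times)
    at_risk = len(times)
    s, curve = 1.0, []
    for t in sorted(leaving):
        if deaths[t]:
            s *= 1 - deaths[t] / at_risk
            curve.append((t, round(s, 3)))
        at_risk -= leaving[t]
    return curve

# Invented follow-up (months) for a hypothetical low-LVEF stratum; event=1 is death
times = [6, 12, 12, 20, 30, 38, 45, 50]
events = [1, 1, 0, 1, 0, 1, 0, 0]
print(km_curve(times, events))
```

The Cox model then compares such curves across strata; libraries like `lifelines` provide both estimators with proper confidence intervals.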

Abstract:

Context Novel therapies have improved the remission rate in chronic inflammatory disorders including juvenile idiopathic arthritis (JIA). Therefore, strategies of tapering therapy and reliable parameters for detecting subclinical inflammation have now become challenging questions. Objectives To analyze whether longer methotrexate treatment during remission of JIA prevents flares after withdrawal of medication and whether specific biomarkers identify patients at risk for flares. Design, Setting, and Patients Prospective, open, multicenter, medication-withdrawal randomized clinical trial including 364 patients (median age, 11.0 years) with JIA recruited in 61 centers from 29 countries between February 2005 and June 2006. Patients were included at first confirmation of clinical remission while continuing medication. At the time of therapy withdrawal, levels of the phagocyte activation marker myeloid-related proteins 8 and 14 heterocomplex (MRP8/14) were determined. Intervention Patients were randomly assigned to continue with methotrexate therapy for either 6 months (group 1 [n = 183]) or 12 months (group 2 [n = 181]) after induction of disease remission. Main Outcome Measures Primary outcome was relapse rate in the 2 treatment groups; secondary outcome was time to relapse. In a prespecified cohort analysis, the prognostic accuracy of MRP8/14 concentrations for the risk of flares was assessed. Results Intention-to-treat analysis of the primary outcome revealed relapse within 24 months after the inclusion into the study in 98 of 183 patients (relapse rate, 56.7%) in group 1 and 94 of 181 (55.6%) in group 2. The odds ratio for group 1 vs group 2 was 1.02 (95% CI, 0.82-1.27; P=.86). The median relapse-free interval after inclusion was 21.0 months in group 1 and 23.0 months in group 2. The hazard ratio for group 1 vs group 2 was 1.07 (95% CI, 0.82-1.41; P=.61). Median follow-up duration after inclusion was 34.2 and 34.3 months in groups 1 and 2, respectively. 
Levels of MRP8/14 during remission were significantly higher in patients who subsequently developed flares (median, 715 [IQR, 320-1110] ng/mL) compared with patients maintaining stable remission (400 [IQR, 220-800] ng/mL; P=.003). Low MRP8/14 levels indicated a low risk of flares within the next 3 months following the biomarker test (area under the receiver operating characteristic curve, 0.76; 95% CI, 0.62-0.90). Conclusions In patients with JIA in remission, a 12-month vs 6-month withdrawal of methotrexate did not reduce the relapse rate. Higher MRP8/14 concentrations were associated with risk of relapse after discontinuing methotrexate.

Abstract:

Objective: To investigate the accuracy of the Brazilian version of the Addenbrooke's Cognitive Examination-Revised (ACE-R) in the diagnosis of mild Alzheimer disease (AD). Background: The ACE-R is an accurate and brief cognitive battery for the detection of mild dementia, especially for the discrimination between AD and frontotemporal dementia. Methods: The battery was administered to 31 patients with mild AD and 62 age- and education-matched cognitively healthy controls. Both groups were selected using the Dementia Rating Scale and were submitted to the ACE-R. Depression was ruled out in both groups with the Cornell Scale for Depression in Dementia. The performance of patients and controls on the ACE-R was compared, and receiver operating characteristic curve analysis was undertaken to ascertain the accuracy of the instrument for the diagnosis of mild AD. Results: The mean ACE-R scores were 63.10 +/- 10.22 points for patients with AD and 83.63 +/- 7.90 points for controls. The cutoff score < 78 yielded high diagnostic accuracy (area under the receiver operating characteristic curve = 0.947), with 100% sensitivity, 82.26% specificity, 73.8% positive predictive value, and 100% negative predictive value. Conclusions: The Brazilian version of the ACE-R displayed high diagnostic accuracy for the identification of mild AD in the studied sample.
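The predictive values follow directly from the 2x2 table implied by the reported counts: 31 AD patients all below the cutoff (100% sensitivity, so 31 TP and 0 FN), and 82.26% specificity over 62 controls (51 TN, 11 FP). A quick sketch that reproduces the published figures:

```python
def diagnostic_metrics(tp, fn, fp, tn):
    """Sensitivity, specificity, PPV, and NPV from a 2x2 diagnostic table."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Counts implied by the reported ACE-R figures
m = diagnostic_metrics(tp=31, fn=0, fp=11, tn=51)
print({k: round(v, 4) for k, v in m.items()})
```

This returns sensitivity 1.0, specificity 0.8226, PPV 0.7381 (31/42), and NPV 1.0, matching the abstract; note that PPV and NPV depend on the case/control ratio of the sample, unlike sensitivity and specificity.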

Abstract:

A series of experiments was conducted in drought-prone northeast Thailand to examine the magnitude of yield responses of diverse genotypes to drought stress environments and to identify traits that may confer drought resistance on rainfed lowland rice. One hundred and twenty-eight genotypes were grown under non-stress conditions and four different types of drought stress. Under severe drought, the maintenance of plant water potential (PWP) played a significant role in determining final grain yield. Because of their smaller plant size (lower total dry matter at anthesis), genotypes that extracted less soil water during the early stages of the drought period tended to maintain higher PWP and had a higher fertile panicle percentage, filled grain percentage, and final grain yield than other genotypes. PWP was correlated with delay in flowering (r = -0.387), indicating that the latter could be used as a measure of water potential under stress. Genotypes with well-developed root systems extracted water too rapidly and experienced severe water stress at flowering. RPR, which showed a smaller coefficient of variation, was more useful than root mass density in identifying genotypes with a large root system. Under less severe, prolonged drought conditions, genotypes that could achieve higher plant dry matter at anthesis were desirable: they had less delay in flowering, higher grain yield, and a higher drought response index, indicating the importance of the ability to grow during the prolonged stress period. Other shoot characters (osmotic potential, leaf temperature, leaf rolling, leaf death) had little effect on grain yield under the different drought conditions, a result associated with a lack of genetic variation and the difficulty of estimating these trait values precisely. Under mild stress conditions (yield loss less than 50%), there was no significant relationship between the measured drought characters and grain yield. Under these mild drought conditions, yield is determined more by yield potential and phenotype than by drought resistance mechanisms per se. (C) 2002 Elsevier Science B.V. All rights reserved.
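Trait-yield relationships like the r = -0.387 between PWP and flowering delay are plain Pearson coefficients. A self-contained sketch with invented water-potential and flowering-delay data (not the trial measurements):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical plant water potential (MPa) vs delay in flowering (days):
# lower (more negative) PWP going with longer delay gives a negative r
pwp = [-1.2, -1.8, -0.9, -2.3, -1.5]
delay = [8, 14, 5, 20, 11]
print(round(pearson_r(pwp, delay), 3))
```

These toy values are nearly collinear, so r comes out close to -1; field data with measurement noise and genotype effects yields the weaker correlations reported above.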

Abstract:

An evaluation of the performance of the APACHE III (Acute Physiology and Chronic Health Evaluation) ICU (intensive care unit) and hospital mortality models at the Princess Alexandra Hospital, Brisbane is reported. Demographic, diagnostic, physiological, laboratory, admission, and discharge data for 5,681 consecutive eligible admissions (1 January 1995 to 1 January 2000) were collected prospectively at the Princess Alexandra Hospital, a metropolitan Australian tertiary referral medical/surgical adult ICU. ROC (receiver operating characteristic) curve areas for the APACHE III ICU mortality and hospital mortality models demonstrated excellent discrimination. Observed ICU mortality (9.1%) was significantly overestimated by the APACHE III model adjusted for hospital characteristics (10.1%), but did not differ significantly from the prediction of the generic APACHE III model (8.6%). In contrast, observed hospital mortality (14.8%) agreed well with the prediction of the APACHE III model adjusted for hospital characteristics (14.6%), but was significantly underestimated by the unadjusted APACHE III model (13.2%). Calibration curves and goodness-of-fit analysis using Hosmer-Lemeshow statistics demonstrated that calibration was good for the unadjusted APACHE III ICU mortality model and for the APACHE III hospital mortality model adjusted for hospital characteristics. Post hoc analysis revealed a declining annual SMR (standardized mortality ratio) during the study period. This trend was present in each of the non-surgical, emergency, and elective surgical diagnostic groups, and the change was temporally related to increased specialist staffing levels. This study demonstrates that the APACHE III model performs well on independent assessment in an Australian hospital. Changes observed in the annual SMR using such a validated model support a hypothesis of improved survival outcomes from 1995 to 1999.
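The SMR tracked here is observed deaths divided by the deaths the validated model predicts (the sum of per-patient predicted probabilities); values below 1 suggest better-than-predicted survival. A toy sketch with made-up APACHE-style risks, not hospital data:

```python
def smr(observed_deaths, predicted_risks):
    """Standardized mortality ratio: observed deaths over the expected
    count, i.e. the sum of model-predicted death probabilities."""
    expected = sum(predicted_risks)
    return observed_deaths / expected

# Hypothetical predicted hospital-mortality risks for 10 admissions
risks = [0.05, 0.10, 0.30, 0.02, 0.60, 0.15, 0.08, 0.40, 0.20, 0.10]
print(round(smr(3, risks), 2))  # 3 observed deaths against 2.0 expected
```

A declining annual SMR, as described above, means observed deaths are falling relative to what case-mix-adjusted predictions expect, which is why it is used as a quality signal only after the model's calibration has been checked.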

Abstract:

In a 2-yr multiple-site field study conducted in western Nebraska during 1999 and 2000, optimum dryland corn (Zea mays L.) population varied from less than 1.7 to more than 5.6 plants m⁻², depending largely on available water resources. The objective of this study was to use a modeling approach to investigate corn population recommendations under a wide range of seasonal variation. A corn growth simulation model (APSIM-maize) was coupled to long-term sequences of historical climatic data from western Nebraska to provide probabilistic estimates of dryland yield for a range of corn populations. Simulated populations ranged from 2 to 5 plants m⁻². Simulations began with one of three levels of available soil water at planting: 80, 160, or 240 mm in the surface 1.5 m of a loam soil. Gross margins were maximized at 3 plants m⁻² when starting available water was 160 or 240 mm, and the expected probability of a financial loss at this population fell from about 10% at 160 mm to 0% at 240 mm. When starting available water was 80 mm, average gross margins were less than $15 ha⁻¹ and the risk of financial loss exceeded 40%. Median yields were greatest when starting available soil water was 240 mm; however, perhaps the greater benefit of additional soil water at planting was the reduction in the risk of a financial loss. Dryland corn growers in western Nebraska are advised to use a population of 3 plants m⁻² as a base recommendation.
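The recommendation logic above reduces to choosing the population with the best expected gross margin across simulated seasons, while reporting the downside risk. A minimal sketch with invented margins (not APSIM output):

```python
def best_population(margins_by_pop):
    """Pick the plant population maximizing the mean simulated gross
    margin, and report the fraction of seasons with a loss at that choice."""
    def mean(xs):
        return sum(xs) / len(xs)
    pop = max(margins_by_pop, key=lambda p: mean(margins_by_pop[p]))
    seasons = margins_by_pop[pop]
    loss_risk = sum(m < 0 for m in seasons) / len(seasons)
    return pop, loss_risk

# Hypothetical gross margins ($/ha) over simulated seasons, keyed by
# population (plants per square meter)
margins = {
    2: [40, 90, 10, 120, 60],
    3: [80, 150, -20, 200, 110],
    4: [60, 180, -90, 220, 70],
    5: [30, 190, -150, 230, 40],
}
print(best_population(margins))
```

With these invented numbers the densest stands have the highest upside but also the deepest drought-year losses, so the mean-margin criterion settles on an intermediate population, echoing the 3 plants m⁻² base recommendation.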

Abstract:

The establishment and early growth of forest species in the field are strongly affected by soil water availability and by planting season. This study therefore examined the impact of water deficit on the growth of seedlings of two clones of the hybrid Eucalyptus grandis x Eucalyptus urophylla, both subjected to four levels of water deficit in two planting seasons. The study was carried out in the experimental area of the Núcleo de Estudos e Difusão de Tecnologia em Florestas, Recursos Hídricos e Agricultura Sustentável (NEDTEC) of the Centro de Ciências Agrárias of the Universidade Federal do Espírito Santo (CCA-UFES), located in the municipality of Jerônimo Monteiro. The work was conducted in two distinct seasons, the first from February 9 to June 9, 2009, and the second from July 11 to November 7, 2009, so that observations were made under different conditions of radiation regime, air vapor pressure deficit, temperature, relative humidity, and wind speed. The experimental design was completely randomized in a 2 x 4 split-plot arrangement, with the four water deficit levels allocated to the main plots and the two seasons to the subplots, with three replicates. The water management treatments were: Deficit 0 (D0), no deficit; Deficit 1 (D1), irrigation withheld from day 30 of the experiment until the end; Deficit 2 (D2), irrigation withheld from day 30, suspended for 60 days, and then resumed for a further 30 days; Deficit 3 (D3), irrigation withheld from day 60 until the end of the experiment. The experimental data were subjected to analysis of variance and, when significant, means were compared by Tukey's test at 5% probability for each clone studied. 
This work made it possible to evaluate the impact of the different water deficits on early plant growth in two seasons of the year and to assess the increment in plant development during treatment application, with average samples taken from each treatment every 30 days. The variables measured in both experiments were total plant height, root-collar diameter, number of leaves, leaf area, leaf dry matter, stem-and-branch dry matter, root dry matter, and total dry matter. Climatic variables were recorded throughout the experimental period in both seasons in order to characterize the weather conditions in each one. For both clones, in general, the water deficits reduced the morphological variables studied, and the experimental season was the factor that most influenced the reduction in plant growth: Season 1 gave the best results, while Season 2 was the most detrimental to plant development, significantly reducing all morphological variables under all water deficits, including D0.


Abstract:

Soil penetration resistance has been used to characterize soil compaction, and several authors have attempted to relate the cone index (CI) to bulk density. The CI has become an increasingly important source of information for decisions in agricultural, livestock, and forestry activities, which requires better knowledge of the behavior of penetrometers and penetrographs. This study aimed to verify, under controlled laboratory conditions, the influence of soil water content and cone penetration rate on the cone index obtained when bulk density varies. The soil was compacted by compression in a universal press, using a cylinder specially designed to produce the test specimens. Bulk density and water content were determined from samples taken from the test specimens. The CI values obtained ranged from 0.258 to 4.776 MPa, measured at 4 water contents and 7 bulk densities with 3 penetration rates. It was concluded that the determination of CI is strongly influenced by soil water content, whereas the range of penetration rates used in this study was not sufficient to influence CI determination. Moreover, a decrease in soil water content may increase the sensitivity of the cone index for detecting variation in bulk density.

Abstract:

Light and soil water availability may limit carbon uptake by trees in tropical rainforests. The objective of this work was to determine how the photosynthetic traits of juvenile trees respond to variations in rainfall seasonality, leaf nutrient content, and opening of the forest canopy. The correlation between leaf nutrient content and the annual growth rate of saplings was also assessed. In a terra firme rainforest of the central Amazon, leaf nutrient content and gas exchange parameters were measured in five sapling tree species in the dry and rainy seasons of 2008. Sapling growth was measured in 2008 and 2009. Rainfall seasonality led to variations in soil water content but did not affect leaf gas exchange parameters. Subtle changes in canopy opening affected CO2-saturated photosynthesis (A_pot, p = 0.04). Although A_pot was affected by leaf nutrient content (in the order P > Mg > Ca > N > K), the relative growth rate of saplings correlated solely with leaf P content (r = 0.52, p = 0.003). At present, the reduction in soil water content during the dry season does not seem to be strong enough to affect the photosynthesis of saplings in central Amazonia. This study shows that leaf P content is positively correlated with sapling growth in the central Amazon; therefore, the positive effect of atmospheric CO2 fertilization on long-term tree growth will depend on the ability of trees to absorb additional amounts of P.

Abstract:

Master's dissertation in Diagnostic and Cardiovascular Intervention Technology. Area of specialization: Cardiovascular Intervention.

Abstract:

OBJECTIVE: To identify potential prognostic factors for neonatal mortality among newborns referred to intensive care units. METHODS: A live-birth cohort study was carried out in Goiânia, Central Brazil, from November 1999 to October 2000. Linked birth and infant death certificates were used to ascertain the cohort of live-born infants, and an additional active surveillance system for neonatal mortality was implemented. Exposure variables were collected from birth and death certificates. The outcomes were survival (n=713) and death (n=162) in all intensive care units during the study period. Cox's proportional hazards model was applied, and a receiver operating characteristic curve was used to compare the performance of the statistically significant variables in the multivariable model. Mortality rates adjusted for birth weight and 5-min Apgar score were calculated for each intensive care unit. RESULTS: Low birth weight and 5-min Apgar score remained independently associated with death. A birth weight cutoff of 2,500 g had an accuracy of 0.71 (95% CI: 0.65-0.77) for predicting neonatal death (sensitivity, 72.2%). A wide variation in mortality rates was found among intensive care units (9.5-48.1%), and two of them retained significantly high mortality rates even after adjustment for birth weight and 5-min Apgar score. CONCLUSIONS: This study corroborates birth weight as a sensitive screening variable in surveillance programs for neonatal death and supports targeting intensive care units with high mortality rates for preventive actions and interventions during the delivery period.