961 results for Soil-water Characteristic Curve


Relevance: 100.00%

Abstract:

The solubilities and dissolution rates of three gypsum sources (analytical grade (AG), phosphogypsum (PG) and mined gypsum (MG)), with six MG size fractions (> 2.0, 1.0-2.0, 0.5-1.0, 0.25-0.5, 0.125-0.25, and < 0.125 mm), were investigated in triple deionised water (TDI) and seawater to examine their suitability for bauxite residue amelioration. Gypsum solubility was greater in seawater (3.8 g/L) than in TDI (2.9 g/L) due to the ionic strength effect, with dissolution in both TDI and seawater following first-order kinetics. Dissolution rate constants varied with gypsum source (AG > PG > MG) due to differences in reactivity and surface area, with 1:20 gypsum:solution suspensions reaching saturation within 15 s (AG) to 30 min (MG > 2.0 mm). The ability of bauxite residue to adsorb Ca from solution was also examined. The quantity of the total solution Ca adsorbed was found to be small (5%). These low rates of solution Ca adsorption, combined with the comparatively rapid dissolution rates, preclude the application of gypsum to the residue sand/seawater slurry as a method for residue amelioration. Instead, direct field application to the residue would ensure more efficient gypsum use. In addition, the formation of a sparingly soluble CaCO3 coating around the gypsum particles after mixing in a highly alkaline seawater/supernatant liquor (SNL) solution greatly reduced the rate of gypsum dissolution.
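
The abstract reports that dissolution in both solutions followed first-order kinetics toward the solubility limit. As a minimal sketch of that kind of analysis (not the authors' code), the snippet below fits C(t) = C_s(1 - e^(-kt)) to an invented concentration-time series; the data points, initial guesses and units are illustrative assumptions.

```python
# Minimal sketch: fitting first-order dissolution kinetics,
# C(t) = C_s * (1 - exp(-k * t)), to illustrative (not measured) data.
import numpy as np
from scipy.optimize import curve_fit

def first_order(t, c_sat, k):
    """Concentration approaching saturation c_sat with rate constant k."""
    return c_sat * (1.0 - np.exp(-k * t))

# Hypothetical time (s) and dissolved gypsum concentration (g/L) readings
t_obs = np.array([5, 15, 30, 60, 120, 300, 600, 1800], dtype=float)
c_obs = np.array([1.1, 2.0, 2.5, 2.8, 2.9, 2.9, 2.9, 2.9])

(c_sat, k), _ = curve_fit(first_order, t_obs, c_obs, p0=[3.0, 0.05])
print(f"fitted solubility ~{c_sat:.2f} g/L, rate constant ~{k:.4f} 1/s")
```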

Relevance: 100.00%

Abstract:

To simulate cropping systems, crop models must not only give reliable predictions of yield across a wide range of environmental conditions, they must also quantify water and nutrient use well, so that the status of the soil at maturity is a good representation of the starting conditions for the next cropping sequence. To assess their suitability for this task, a range of crop models currently used in Australia was tested. The models differed in their design objectives, complexity and structure and were (i) tested on diverse, independent data sets from a wide range of environments and (ii) further evaluated at the component level with one detailed data set from a semi-arid environment. All models were coded into the cropping systems shell APSIM, which provides a common soil water and nitrogen balance. Crop development was input, so differences between simulations were caused entirely by differences in simulating crop growth. Under nitrogen non-limiting conditions, between 73 and 85% of the observed kernel yield variation across environments was explained by the models. This ranged from 51 to 77% under varying nitrogen supply. Water and nitrogen effects on leaf area index were predicted poorly by all models, resulting in erroneous predictions of dry matter accumulation and water use. When measured light interception was used as input, most models improved in their prediction of dry matter and yield. This test highlighted a range of compensating errors in all modelling approaches. The time course and final amount of water extraction were simulated well by two models, while others left up to 25% of potentially available soil water in the profile. Kernel nitrogen percentage was predicted poorly by all models due to its sensitivity to small dry matter changes. Yield and dry matter could be estimated adequately for a range of environmental conditions using the general concepts of radiation use efficiency and transpiration efficiency. However, leaf area and kernel nitrogen dynamics need to be improved to achieve better estimates of water and nitrogen use if such models are to be used to evaluate cropping systems. (C) 1998 Elsevier Science B.V.
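
Yield and dry matter were estimated using the general concepts of radiation use efficiency and transpiration efficiency. The sketch below illustrates that idea only: daily dry matter gain taken as the minimum of a light-limited and a water-limited estimate. It is not APSIM code, and the coefficients (rue, te_coeff) and the VPD scaling of transpiration efficiency are illustrative assumptions.

```python
# Sketch of the radiation-use-efficiency / transpiration-efficiency concept
# described in the abstract; coefficients are illustrative, not APSIM values.
def daily_dry_matter(intercepted_par_mj, transpiration_mm, vpd_kpa,
                     rue=1.4, te_coeff=9.0):
    """Return daily biomass increment (g/m2) as the minimum of the
    light-limited and water-limited estimates.

    rue      : radiation use efficiency (g biomass per MJ intercepted PAR)
    te_coeff : transpiration efficiency coefficient (illustrative), scaled
               here by vapour pressure deficit (VPD, kPa)
    """
    light_limited = rue * intercepted_par_mj
    water_limited = (te_coeff / vpd_kpa) * transpiration_mm
    return min(light_limited, water_limited)

# Example day: 8 MJ/m2 intercepted PAR, 4 mm transpiration, 1.8 kPa VPD
print(daily_dry_matter(8.0, 4.0, 1.8))
```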

Relevance: 100.00%

Abstract:

Previous work has identified several shortcomings in the ability of four spring wheat models and one barley model to simulate crop processes and resource utilization. This can have important implications when such models are used within systems models, where the final soil water and nitrogen conditions of one crop define the starting conditions of the following crop. In an attempt to overcome these limitations and to reconcile a range of modelling approaches, existing model components that worked demonstrably well were combined with new components for aspects where existing capabilities were inadequate. This resulted in the Integrated Wheat Model (I_WHEAT), which was developed as a module of the cropping systems model APSIM. To increase the predictive capability of the model, process detail was reduced, where possible, by replacing groups of processes with conservative, biologically meaningful parameters. I_WHEAT does not contain a soil water or soil nitrogen balance; these are present as other modules of APSIM. In I_WHEAT, yield is simulated using a linear increase in harvest index, whereby nitrogen or water limitations can lead to early termination of grain filling and hence cessation of the harvest index increase. Dry matter increase is calculated either from the amount of intercepted radiation and radiation conversion efficiency or from the amount of water transpired and transpiration efficiency, depending on the most limiting resource. Leaf area and tiller formation are calculated from thermal time and a cultivar-specific phyllochron interval. Nitrogen limitation first reduces leaf area and then affects radiation conversion efficiency as it becomes more severe. Water or nitrogen limitations result in reduced leaf expansion, accelerated leaf senescence or tiller death. This reduces the radiation load on the crop canopy (i.e. demand for water) and can make nitrogen available for translocation to other organs. Sensitive feedbacks between light interception and dry matter accumulation are avoided by having environmental effects act directly on leaf area development, rather than via biomass production. This makes the model more stable across environments without losing the interactions between the different external influences. When comparing model output with models tested previously using data from a wide range of agro-climatic conditions, yield and biomass predictions were equal to the best of those models, but improvements could be demonstrated in simulating leaf area dynamics in response to water and nitrogen supply, kernel nitrogen content, and total water and nitrogen use. I_WHEAT does not require calibration for any of the environments tested. Further model improvement should concentrate on improving phenology simulations, a more thorough derivation of coefficients to describe leaf area development and a better quantification of some processes related to nitrogen dynamics. (C) 1998 Elsevier Science B.V.
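
I_WHEAT simulates yield through a linear increase in harvest index that can be terminated early by water or nitrogen stress. The snippet below is a minimal illustration of that logic, not the I_WHEAT implementation; the rate, cap and example numbers are assumed.

```python
# Sketch of the linear harvest-index approach described for I_WHEAT:
# yield = biomass * HI, with HI increasing linearly during grain filling
# and the increase stopping early under water or nitrogen limitation.
# Values are illustrative assumptions, not I_WHEAT parameters.
def grain_yield(biomass_g_m2, days_since_anthesis, stress_terminated_day=None,
                hi_rate_per_day=0.018, hi_max=0.50):
    effective_days = days_since_anthesis
    if stress_terminated_day is not None:
        effective_days = min(effective_days, stress_terminated_day)
    harvest_index = min(hi_rate_per_day * effective_days, hi_max)
    return biomass_g_m2 * harvest_index

print(grain_yield(1200, 35))                             # unstressed
print(grain_yield(1200, 35, stress_terminated_day=20))   # early termination
```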

Relevance: 100.00%

Abstract:

The physical nonequilibrium of solute concentration resulting from preferential flow of soil water has often led to models where the soil is partitioned into two regions: preferential flow paths, where solute transport occurs mainly by advection, and the remaining region, where significant solute transport occurs through diffusive exchange with the flow paths. These two-region models commonly ignore concentration gradients within the regions. Our objective was to develop a simple model to assess the influence of concentration gradients on solute transport and to compare model results with experiments conducted on structured materials. The model calculates the distribution of solutes in a single spherical aggregate surrounded by preferential flow paths and subjected to alternating boundary conditions representing either an exchange of solutes between the two regions (a wet period) or no exchange but redistribution of solutes within the aggregate (a dry period). The key parameter in the model is the aggregate radius, which defines the diffusive time scales. We conducted intermittent leaching experiments on a column of packed porous spheres and on a large (300 mm long by 216 mm diameter) undisturbed field soil core to test the validity of the model and its application to field soils. Alternating wet and dry periods enhanced leaching by up to 20% for this soil, which was consistent with the model's prediction, given a fitted equivalent aggregate radius of 1.8 cm. If similar results are obtained for other soils, use of alternating wet and dry periods could improve management of solutes, for example in salinity control and in soil remediation.
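
The key parameter of the single-aggregate model is the radius, which sets the diffusive time scale R^2/D. As a rough illustration (not the authors' model, which alternates wet and dry boundary conditions), the snippet below evaluates Crank's series solution for diffusion out of a uniformly loaded sphere with zero surface concentration; the diffusion coefficient is an assumed value, while the 1.8 cm radius is the equivalent radius fitted in the study.

```python
# Sketch of the diffusive time scale idea behind the single-aggregate model:
# fraction of solute remaining in a sphere of radius R (uniform initial
# concentration, zero concentration at the surface) after time t.
# D below is an illustrative value, not a fitted one from the study.
import math

def fraction_remaining(t_s, radius_m, diff_m2_s, n_terms=200):
    """Crank's series solution for diffusion out of a sphere."""
    tau = diff_m2_s * t_s / radius_m**2      # dimensionless time
    s = sum(math.exp(-(n * math.pi)**2 * tau) / n**2
            for n in range(1, n_terms + 1))
    return 6.0 / math.pi**2 * s

R = 0.018            # 1.8 cm equivalent aggregate radius, as fitted in the study
D = 2e-10            # assumed effective diffusion coefficient (m2/s)
for hours in (1, 6, 24, 72):
    leached = 1.0 - fraction_remaining(hours * 3600, R, D)
    print(f"{hours:>3} h: fraction leached ~{leached:.3f}")
```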

Relevance: 100.00%

Abstract:

The sensitivity and specificity of four self-report measures of disordered sleep - the Sleep Impairment Index (SII), the Sleep Disorders Questionnaire (SDQ), the Dysfunctional Beliefs and Attitudes About Sleep Scale (DBAS) and the Sleep-Wake Activity Inventory (SWAI) - were compared in subjects with insomnia and normal sleep. Nineteen young adult subjects met DSM-IV criteria for primary insomnia and another 19 were normal control subjects. Discriminatory characteristics of each measure were assessed using receiver operating characteristic curve analyses. Discriminatory power was maximised for each measure to produce cut-scores applicable to the identification of individuals with insomnia. The DBAS, SII and SDQ psychiatric DIMS subscale were found to correlate with one another, and each discriminated well between the two groups. The SWAI nocturnal sleep subscale was not found to be an accurate discriminator. The results suggest that the measures differ in their ability to detect insomnia, and offer guidelines for the optimal use of test scores to identify young adults suspected of insomnia.
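
Cut-scores were derived from receiver operating characteristic analyses. A minimal sketch of that procedure is shown below, using invented questionnaire scores and Youden's J statistic to pick the threshold; the study's actual scores, and the exact criterion it used to maximise discriminatory power, are not given in the abstract.

```python
# Sketch of deriving a cut-score from an ROC analysis, as done for each
# questionnaire. The scores below are invented for illustration.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

y_true = np.array([1] * 8 + [0] * 8)                    # 1 = insomnia, 0 = control
scores = np.array([18, 22, 15, 19, 25, 17, 21, 14,      # hypothetical questionnaire scores
                    7,  9, 12,  8,  5, 11, 10, 13])

fpr, tpr, thresholds = roc_curve(y_true, scores)
youden_j = tpr - fpr                                     # sensitivity + specificity - 1
best = np.argmax(youden_j)
print("AUC:", roc_auc_score(y_true, scores))
print("cut-score maximising sensitivity + specificity:", thresholds[best])
```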

Relevance: 100.00%

Abstract:

Methods: Stepwise regression of annual data was applied to model incidence, calculated from 91 cases, using lagged variables: antecedent precipitation, air temperature, soil water storage, absolute and relative air humidity, and the Southern Oscillation Index (SOI). Results: Multiple regression analysis resulted in a model that explains 49% of the incidence variance, taking into account the absolute air humidity in the year of exposure and the soil water storage and SOI of the previous 2 years. Conclusions: The correlations may reflect enhanced fungal growth after an increase in soil water storage in the longer term and greater spore release with an increase in absolute air humidity in the short term.
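
The analysis regresses annual incidence on climate covariates lagged by up to two years. The sketch below shows one way to build such lagged predictors and fit a multiple linear regression; the variable names, the simulated data and the omission of the stepwise selection step are all assumptions for illustration.

```python
# Sketch of building lagged climate predictors and fitting the kind of
# multiple regression described (annual incidence vs. lagged covariates).
# The DataFrame columns and data are assumptions for illustration.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
years = pd.DataFrame({
    "incidence": rng.poisson(5, 30),
    "abs_humidity": rng.normal(12, 2, 30),
    "soil_water": rng.normal(150, 30, 30),
    "soi": rng.normal(0, 8, 30),
})

# Lag soil water storage and SOI by one and two years as candidate predictors
for col in ("soil_water", "soi"):
    for lag in (1, 2):
        years[f"{col}_lag{lag}"] = years[col].shift(lag)

data = years.dropna()
X = sm.add_constant(data[["abs_humidity", "soil_water_lag1",
                          "soil_water_lag2", "soi_lag1", "soi_lag2"]])
model = sm.OLS(data["incidence"], X).fit()
print(model.rsquared)      # share of incidence variance explained
```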

Relevance: 100.00%

Abstract:

Objectives: To describe current practice for the discontinuation of continuous renal replacement therapy in a multinational setting and to identify variables associated with successful discontinuation. The approach used to discontinue continuous renal replacement therapy may affect patient outcomes, yet there is a lack of information on how and under what conditions continuous renal replacement therapy is discontinued. Design: Post hoc analysis of a prospective observational study. Setting: Fifty-four intensive care units in 23 countries. Patients: Five hundred twenty-nine patients (52.6%) who survived initial therapy among 1006 patients treated with continuous renal replacement therapy. Interventions: None. Measurements and Main Results: Three hundred thirteen patients were removed successfully from continuous renal replacement therapy and did not require any renal replacement therapy for at least 7 days; they were classified as the "success" group, and the rest (216 patients) were classified as the "repeat-RRT" (renal replacement therapy) group. Patients in the "success" group had lower hospital mortality (28.5% vs. 42.7%, p < .0001) than patients in the "repeat-RRT" group. They also had lower creatinine and urea concentrations and a higher urine output at the time of stopping continuous renal replacement therapy. Multivariate logistic regression analysis for successful discontinuation of continuous renal replacement therapy identified urine output (during the 24 hrs before stopping continuous renal replacement therapy: odds ratio, 1.078 per 100 mL/day increase) and creatinine (odds ratio, 0.996 per µmol/L increase) as significant predictors of successful cessation. The area under the receiver operating characteristic curve to predict successful discontinuation of continuous renal replacement therapy was 0.808 for urine output and 0.635 for creatinine. The predictive ability of urine output was negatively affected by the use of diuretics (area under the receiver operating characteristic curve, 0.671 with diuretics and 0.845 without diuretics). Conclusions: We report on the current practice of discontinuing continuous renal replacement therapy in a multinational setting. Urine output at the time of stopping continuous renal replacement therapy was the most important predictor of successful discontinuation, especially when it occurred without the administration of diuretics. (Crit Care Med 2009; 37:2576-2582)
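
The abstract reports odds ratios per 100 mL/day of urine output and per µmol/L of creatinine from a multivariable logistic regression. The snippet below sketches how such per-unit odds ratios are obtained from fitted coefficients, using simulated data rather than the study cohort.

```python
# Sketch of the kind of multivariable logistic regression reported: success of
# discontinuation modelled from urine output and creatinine, with odds ratios
# expressed per 100 mL/day and per micromol/L. Data are simulated, not study data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 400
urine = rng.gamma(2.0, 600, n)                  # mL/day in the 24 h before stopping
creat = rng.normal(250, 80, n)                  # micromol/L
logit = -1.0 + 0.0008 * urine - 0.004 * creat   # assumed data-generating model
success = rng.random(n) < 1 / (1 + np.exp(-logit))

X = sm.add_constant(pd.DataFrame({"urine": urine, "creat": creat}))
fit = sm.Logit(success.astype(int), X).fit(disp=0)

or_urine_per_100 = np.exp(fit.params["urine"] * 100)    # OR per 100 mL/day increase
or_creat_per_1 = np.exp(fit.params["creat"])            # OR per micromol/L increase
print(or_urine_per_100, or_creat_per_1)
```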

Relevance: 100.00%

Abstract:

Background: The accuracy of multidetector computed tomographic (CT) angiography involving 64 detectors has not been well established. Methods: We conducted a multicenter study to examine the accuracy of 64-row, 0.5-mm multidetector CT angiography as compared with conventional coronary angiography in patients with suspected coronary artery disease. Nine centers enrolled patients who underwent calcium scoring and multidetector CT angiography before conventional coronary angiography. In 291 patients with calcium scores of 600 or less, segments 1.5 mm or more in diameter were analyzed by means of CT and conventional angiography at independent core laboratories. Stenoses of 50% or more were considered obstructive. The area under the receiver-operating-characteristic curve (AUC) was used to evaluate diagnostic accuracy relative to that of conventional angiography and subsequent revascularization status, whereas disease severity was assessed with the use of the modified Duke Coronary Artery Disease Index. Results: A total of 56% of patients had obstructive coronary artery disease. The patient-based diagnostic accuracy of quantitative CT angiography for detecting or ruling out stenoses of 50% or more according to conventional angiography revealed an AUC of 0.93 (95% confidence interval [CI], 0.90 to 0.96), with a sensitivity of 85% (95% CI, 79 to 90), a specificity of 90% (95% CI, 83 to 94), a positive predictive value of 91% (95% CI, 86 to 95), and a negative predictive value of 83% (95% CI, 75 to 89). CT angiography was similar to conventional angiography in its ability to identify patients who subsequently underwent revascularization: the AUC was 0.84 (95% CI, 0.79 to 0.88) for multidetector CT angiography and 0.82 (95% CI, 0.77 to 0.86) for conventional angiography. A per-vessel analysis of 866 vessels yielded an AUC of 0.91 (95% CI, 0.88 to 0.93). Disease severity ascertained by CT and conventional angiography was well correlated (r=0.81; 95% CI, 0.76 to 0.84). Two patients had important reactions to contrast medium after CT angiography. Conclusions: Multidetector CT angiography accurately identifies the presence and severity of obstructive coronary artery disease and subsequent revascularization in symptomatic patients. The negative and positive predictive values indicate that multidetector CT angiography cannot replace conventional coronary angiography at present. (ClinicalTrials.gov number, NCT00738218.).
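
The reported sensitivity, specificity and predictive values all derive from the 2x2 cross-classification of CT angiography against conventional angiography at the 50% stenosis threshold. The sketch below computes those quantities from an illustrative table; the cell counts are invented, not the CORE-64 data.

```python
# Sketch of the patient-level diagnostic metrics quoted (sensitivity,
# specificity, positive and negative predictive value) computed from a
# 2x2 table of CT angiography vs. conventional angiography results.
# Cell counts are illustrative, not the CORE-64 data.
def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

print(diagnostic_metrics(tp=138, fp=13, fn=25, tn=115))
```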

Relevance: 100.00%

Abstract:

Purpose: To evaluate the influence of cross-sectional arc calcification on the diagnostic accuracy of computed tomographic (CT) angiography compared with conventional coronary angiography for the detection of obstructive coronary artery disease (CAD). Materials and Methods: Institutional Review Board approval and written informed consent were obtained from all centers and participants for this HIPAA-compliant study. Overall, 4511 segments from 371 symptomatic patients (279 men, 92 women; median age, 61 years [interquartile range, 53-67 years]) with clinical suspicion of CAD from the CORE-64 multicenter study were included in the analysis. Two independent blinded observers evaluated the percentage of diameter stenosis and the circumferential extent of calcium (arc calcium). The accuracy of quantitative multidetector CT angiography in depicting substantial (>50%) stenoses was assessed by using quantitative coronary angiography (QCA). Cross-sectional arc calcium was rated on a segment level as follows: noncalcified or mild (<90 degrees), moderate (90-180 degrees), or severe (>180 degrees) calcification. Univariable and multivariable logistic regression, receiver operating characteristic curve, and clustering methods were used for statistical analyses. Results: A total of 1099 segments had mild calcification, 503 had moderate calcification, 338 had severe calcification, and 2571 segments were noncalcified. Calcified segments were highly associated (P < .001) with disagreement between CT angiography and QCA in multivariable analysis after controlling for sex, age, heart rate, and image quality. The prevalence of CAD was 5.4% in noncalcified segments, 15.0% in mildly calcified segments, 27.0% in moderately calcified segments, and 43.0% in severely calcified segments. A significant difference was found in the areas under the receiver operating characteristic curves (noncalcified: 0.86, mildly calcified: 0.85, moderately calcified: 0.82, severely calcified: 0.81; P < .05). Conclusion: In a symptomatic patient population, segment-based coronary artery calcification significantly decreased agreement between multidetector CT angiography and QCA for the detection of a coronary stenosis of at least 50%.
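
The core analysis stratifies segments by the circumferential extent of calcium and compares diagnostic performance within each stratum. The sketch below reproduces that idea on simulated segment data; the category boundaries follow the abstract, but the prevalences, noise model and resulting AUC values are illustrative.

```python
# Sketch of stratifying segments by circumferential arc calcium and comparing
# the area under the ROC curve within each stratum. Segment data are simulated.
import numpy as np
from sklearn.metrics import roc_auc_score

def calcium_category(arc_degrees):
    """Categories used in the abstract: <90, 90-180, >180 degrees."""
    if arc_degrees == 0:
        return "noncalcified"
    if arc_degrees < 90:
        return "mild"
    if arc_degrees <= 180:
        return "moderate"
    return "severe"

rng = np.random.default_rng(2)
n = 2000
arc = rng.choice([0, 45, 120, 220], size=n, p=[0.57, 0.24, 0.11, 0.08])
stenosis_true = rng.random(n) < 0.15                      # QCA stenosis >= 50%
ct_percent = np.clip(stenosis_true * 55 + rng.normal(0, 20, n) + arc * 0.05, 0, 100)

for cat in ("noncalcified", "mild", "moderate", "severe"):
    mask = np.array([calcium_category(a) == cat for a in arc])
    if stenosis_true[mask].any() and (~stenosis_true[mask]).any():
        print(cat, round(roc_auc_score(stenosis_true[mask], ct_percent[mask]), 2))
```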

Relevance: 100.00%

Abstract:

A glasshouse trial, in which maize (Zea mays L. cv. Pioneer 3270) was grown in 35 north-eastern Australian soils of low magnesium (Mg) status, was undertaken to study the response to applied Mg. Of the soils studied, 20 were strongly acidic (pH(1:5 soil:water) <5.4), and in these soils the response to Mg was studied in both the presence and absence of lime. Magnesium application significantly (P < 0.05) increased dry matter yield in 10 soils, all of which were strongly acidic. However, significant Mg responses were recorded in 6 soils in the presence of lime, indicating that, in many situations, liming strategies may need to include consideration of Mg nutrition. Critical soil test values for 90% relative yield were 0.21 cmol(+)/kg of exchangeable Mg or 7% Mg saturation, whilst the critical (90% yield) plant tissue Mg concentration (whole shoots) was 0.15%.
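
The critical exchangeable Mg value corresponds to 90% relative yield on a fitted soil-test calibration curve. The snippet below sketches one common way to derive such a value, fitting a Mitscherlich-type response and solving for the 90% point; the response function and the data points are assumptions, not the trial data.

```python
# Sketch of deriving a critical soil-test value for 90% relative yield by
# fitting a Mitscherlich-type response to exchangeable Mg. The data points
# are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

def mitscherlich(x, c):
    """Relative yield (0-1) as a function of exchangeable Mg (cmol+/kg)."""
    return 1.0 - np.exp(-c * x)

exch_mg = np.array([0.05, 0.10, 0.15, 0.20, 0.30, 0.45, 0.60])
rel_yield = np.array([0.45, 0.66, 0.80, 0.88, 0.95, 0.98, 0.99])

(c,), _ = curve_fit(mitscherlich, exch_mg, rel_yield, p0=[10.0])
critical_mg = -np.log(1 - 0.90) / c      # exchangeable Mg giving 90% relative yield
print(round(critical_mg, 2), "cmol(+)/kg")
```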

Relevance: 100.00%

Abstract:

A risk score model was developed based on a population of 1,224 individuals from the general population, without known diabetes and aged 35 years or more, drawn from an urban Brazilian population sample, in order to select individuals who should be screened with subsequent testing and to improve the efficacy of public health assurance. External validation was performed in a second, independent population from a different city, ascertained through a similar epidemiological protocol. The risk score was developed by multiple logistic regression, and model performance and cutoff values were derived from a receiver operating characteristic curve. The model's capacity to predict fasting blood glucose levels was tested by analyzing data from a 5-year follow-up protocol conducted in the general population. Items independently and significantly associated with diabetes were age, BMI and known hypertension. Sensitivity, specificity and the proportion of further testing necessary for the best cutoff value were 75.9%, 66.9% and 37.2%, respectively. External validation confirmed the model's adequacy (AUC equal to 0.72). Finally, the model score was also capable of predicting fasting blood glucose progression in non-diabetic individuals over a 5-year follow-up period. In conclusion, this simple diabetes risk score was able to identify individuals with an increased likelihood of having diabetes, and it can be used to stratify subpopulations in which performing subsequent tests is necessary and probably cost-effective.
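
The screening logic is: fit a logistic model on age, BMI and known hypertension, then choose a cutoff on the predicted risk that balances sensitivity, specificity and the proportion of people sent for confirmatory testing. The sketch below illustrates that workflow on simulated data; the coefficients, prevalence and Youden-style cutoff rule are assumptions rather than the published model.

```python
# Sketch of turning a logistic model for age, BMI and known hypertension into
# a screening rule with a cut-off chosen from the ROC curve. Data and
# coefficients are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(3)
n = 1200
age = rng.normal(52, 12, n)
bmi = rng.normal(27, 5, n)
hypertension = rng.random(n) < 0.3
true_logit = -9 + 0.05 * age + 0.12 * bmi + 0.8 * hypertension   # assumed
diabetes = rng.random(n) < 1 / (1 + np.exp(-true_logit))

X = np.column_stack([age, bmi, hypertension])
model = LogisticRegression(max_iter=1000).fit(X, diabetes)

risk = model.predict_proba(X)[:, 1]
fpr, tpr, thr = roc_curve(diabetes, risk)
best = np.argmax(tpr - fpr)
print("AUC:", round(roc_auc_score(diabetes, risk), 2),
      "| screen if predicted risk >", round(thr[best], 3),
      "| proportion sent for testing:", round((risk > thr[best]).mean(), 2))
```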

Relevance: 100.00%

Abstract:

The Direct Assessment of Functional Status-Revised (DAFS-R) is an instrument developed to objectively measure the functional capacities required for independent living. The objective of this study was to translate and culturally adapt the DAFS-R for Brazilian Portuguese (DAFS-BR) and to evaluate its reliability and validity. The DAFS-BR was administered to 89 older adults previously classified as normal controls or as having mild cognitive impairment (MCI) or Alzheimer's disease (AD). The results indicated good internal consistency (Cronbach's alpha = 0.78) in the total sample. The DAFS-BR showed high interobserver reliability (0.996; p < .001) as well as test-retest stability over a 1-week interval (0.995; p < .001). The correlation between the DAFS-BR total score and the Informant Questionnaire on Cognitive Decline in the Elderly (IQCODE) was moderate and significant (r = -.65, p < .001) in the total sample, whereas it did not reach statistical significance within each diagnostic group. Receiver operating characteristic curve analyses suggested that the DAFS-BR has good sensitivity and specificity to identify MCI and AD. The results suggest that the DAFS-BR can document degrees of severity of functional impairment among Brazilian older adults.
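
Internal consistency was summarised with Cronbach's alpha. The snippet below shows the standard computation from a subjects-by-items score matrix; the simulated matrix stands in for DAFS-BR item scores and is not study data.

```python
# Sketch of the internal-consistency statistic reported (Cronbach's alpha)
# computed from a subjects-by-items score matrix. The matrix is simulated.
import numpy as np

def cronbach_alpha(item_scores):
    """item_scores: 2-D array, rows = subjects, columns = items."""
    item_scores = np.asarray(item_scores, dtype=float)
    k = item_scores.shape[1]
    item_vars = item_scores.var(axis=0, ddof=1)
    total_var = item_scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(4)
latent = rng.normal(0, 1, (60, 1))
items = latent + rng.normal(0, 0.8, (60, 7))   # 7 correlated, DAFS-like subscales
print(round(cronbach_alpha(items), 2))
```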

Relevance: 100.00%

Abstract:

Objective: Although suicide is a leading cause of death worldwide, clinicians and researchers lack a data-driven method to assess the risk of suicide attempts. This study reports the results of an analysis of a large cross-national epidemiologic survey database that estimates the 12-month prevalence of suicidal behaviors, identifies risk factors for suicide attempts, and combines these factors to create a risk index for 12-month suicide attempts separately for developed and developing countries. Method: Data come from the World Health Organization (WHO) World Mental Health (WMH) Surveys (conducted 2001-2007), in which 108,705 adults from 21 countries were interviewed using the WHO Composite International Diagnostic Interview. The survey assessed suicidal behaviors and potential risk factors across multiple domains, including socio-demographic characteristics, parent psychopathology, childhood adversities, DSM-IV disorders, and history of suicidal behavior. Results: Twelve-month prevalence estimates of suicide ideation, plans, and attempts are 2.0%, 0.6%, and 0.3%, respectively, for developed countries and 2.1%, 0.7%, and 0.4%, respectively, for developing countries. Risk factors for suicidal behaviors in both developed and developing countries include female sex, younger age, lower education and income, unmarried status, unemployment, parent psychopathology, childhood adversities, and presence of diverse 12-month DSM-IV mental disorders. Combining risk factors from multiple domains produced risk indices that accurately predicted 12-month suicide attempts in both developed and developing countries (area under the receiver operating characteristic curve = 0.74-0.80). Conclusions: Suicidal behaviors occur at similar rates in both developed and developing countries. Risk indices assessing multiple domains can predict suicide attempts with fairly good accuracy and may be useful in aiding clinicians in the prediction of these behaviors. J Clin Psychiatry 2010;71(12):1617-1628 (C) Copyright 2010 Physicians Postgraduate Press, Inc.
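
The risk indices combine factors from several domains into a single score whose discrimination is summarised by the area under the ROC curve. As a loose illustration only (the WMH indices were built from survey-specific models, not a simple count), the sketch below scores individuals by the number of binary risk factors present and evaluates the AUC on simulated outcomes.

```python
# Sketch of a count-based risk index built from binary risk factors across
# domains, with discrimination checked by AUC. Factors and data are simulated.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
n = 5000
factors = {                              # True = risk factor present
    "female": rng.random(n) < 0.5,
    "younger_age": rng.random(n) < 0.4,
    "low_income": rng.random(n) < 0.3,
    "mental_disorder_12m": rng.random(n) < 0.2,
    "childhood_adversity": rng.random(n) < 0.25,
}
risk_index = sum(f.astype(int) for f in factors.values())

# Simulated outcome whose probability rises with the index
attempt = rng.random(n) < 0.001 * np.exp(0.9 * risk_index)
print("AUC of the count-based index:", round(roc_auc_score(attempt, risk_index), 2))
```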

Relevance: 100.00%

Abstract:

Purpose: The inorganic apparent strong ion difference (SIDai) improves recognition of chloride-associated acidosis in dysnatremic patients. We investigated whether the difference between sodium and chloride (Na+ - Cl-) or the ratio between chloride and sodium (Cl-/Na+) could be used as SIDai surrogates in mixed and dysnatremic patients. Patients and Methods: Two arterial blood samples were collected from 128 patients. A physicochemical analytical approach was used. Correlation, agreement, accuracy, sensitivity, and specificity were measured to examine whether Na+ - Cl- and Cl-/Na+ could be used instead of SIDai in the diagnosis of acidosis. Results: Na+ - Cl- and Cl-/Na+ were well correlated with SIDai (R = 0.987, P < 0.001 and R = 0.959, P < 0.001, respectively). Bias between Na+ - Cl- and SIDai was high (6.384, with limits of agreement of 4.463-8.305 mEq/L). Accuracy values for the identification of SIDai acidosis (<38.9 mEq/L) were 0.989 (95% confidence interval [CI], 0.980-0.998) for Na+ - Cl- and 0.974 (95% CI, 0.959-0.989) for Cl-/Na+. Receiver operating characteristic curve analysis showed that values revealing SIDai acidosis were less than 32.5 mEq/L for Na+ - Cl- and more than 0.764 for Cl-/Na+, with sensitivities of 94.0% and 92.0% and specificities of 97.0% and 90.0%, respectively. Na+ - Cl- was a reliable SIDai surrogate in dysnatremic patients. Conclusions: Na+ - Cl- and Cl-/Na+ are good tools to disclose SIDai acidosis. In patients with dysnatremia, Na+ - Cl- is an accurate tool to diagnose SIDai acidosis. (C) 2010 Elsevier Inc. All rights reserved.
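
For context, the quantities being compared can be written out directly: SIDai from the measured strong ions, and the two surrogates from sodium and chloride alone. The snippet below uses a commonly cited definition of SIDai (Na+ + K+ + Ca2+ + Mg2+ - Cl-, all in mEq/L), which is an assumption since the abstract does not spell out the ion set, together with the thresholds quoted in the abstract.

```python
# Sketch of the physicochemical quantities discussed: inorganic apparent
# strong ion difference (SIDai) and its two candidate surrogates.
# SIDai formula is a commonly used Stewart-approach definition (assumed here);
# thresholds are the ones quoted in the abstract. Values are illustrative.
def sid_ai(na, k, ca, mg, cl):
    """Inorganic apparent strong ion difference in mEq/L."""
    return (na + k + ca + mg) - cl

def surrogates(na, cl):
    return {"na_minus_cl": na - cl, "cl_to_na": cl / na}

sample = dict(na=138.0, k=4.1, ca=2.3, mg=1.5, cl=108.0)   # illustrative mEq/L values
sid = sid_ai(**sample)
flags = surrogates(sample["na"], sample["cl"])

print("SIDai =", sid, "-> acidosis" if sid < 38.9 else "-> no SIDai acidosis")
print("Na-Cl =", flags["na_minus_cl"], "(acidosis suggested if < 32.5)")
print("Cl/Na =", round(flags["cl_to_na"], 3), "(acidosis suggested if > 0.764)")
```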