912 results for Simulated annealing algorithms


Relevance:

20.00%

Publisher:

Abstract:

The cropping system influences the interception of water by plants, water storage in depressions on the soil surface, water infiltration into the soil and runoff. The aim of this study was to quantify hydrological processes under no-tillage cropping systems at the edge of a slope, in 2009 and 2010, in a Humic Dystrudept soil, with the following treatments: corn, soybeans, and common beans grown alone; and intercropped corn and common bean. Treatments received four simulated rainfall tests at different times, with a planned intensity of 64 mm h-1 and a duration of 90 min. The first test was applied 18 days after sowing, and the others at 39, 75 and 120 days after the first test. The timing of the simulated rainfall and the stage of the crop cycle affected soil water content prior to the rain, the time runoff began and its peak flow and, thus, the surface hydrological processes. Runoff depth and the depth of water intercepted by the crop + soil infiltration + soil surface storage were affected by the cropping systems and by the rainfall applied at different times. The corn crop was the most effective treatment for controlling runoff, with a water loss ratio of 0.38, equivalent to 75% of the ratio exhibited by common bean (0.51), the least effective of the treatments. Total water loss by runoff decreased linearly with an increase in the time runoff began, regardless of treatment; however, gravimetric soil water content increased linearly from the beginning to the end of the rainfall.

Relevance:

20.00%

Publisher:

Abstract:

Surface roughness of the soil is created by mechanical tillage and is also influenced by the kind and amount of plant residue, among other factors. Its persistence over time depends mainly on the fundamental characteristics of the rain and on the soil type. However, few studies have evaluated these factors in Latossolos (Oxisols). In this study, we evaluated the effect of soil tillage and of the amount of plant residue on the surface roughness of an Oxisol under simulated rain. Treatments consisted of combinations of the tillage systems no-tillage (NT), conventional tillage (CT), and minimum tillage (MT) with plant residue rates of 0, 1, and 2 Mg ha-1 of oats (Avena strigosa Schreb) and 0, 3, and 6 Mg ha-1 of maize (Zea mays L.). Seven simulated rains were applied to each experimental plot, with an intensity of 60±2 mm h-1 and a duration of 1 h, at weekly intervals. The random roughness index ranged from 2.94 to 17.71 mm in oats and from 5.91 to 20.37 mm in maize, showing that CT and MT are effective in increasing soil surface roughness. Tillage operations carried out with the chisel plow and the leveling disk harrow were more effective in increasing soil roughness than those carried out with the heavy disk harrow and leveling disk harrow. The soil surface roughness index decreases exponentially with the increase in the rainfall volume applied under conditions of no tillage without soil cover, conventional tillage, and minimum tillage. Oat and maize crop residue present on the soil surface is effective in maintaining soil surface roughness under no-tillage.

Relevance:

20.00%

Publisher:

Abstract:

The description of the fate of fertilizer-derived nitrogen (N) in agricultural systems is an essential tool for improving management practices that maximize nutrient use by crops and minimize losses. Soil erosion causes loss of nutrients such as N, with negative effects on surface and ground water quality, in addition to losses in agricultural productivity through soil depletion. Studies correlating the percentage of fertilizer-derived N (FDN) with soil erosion rates and the factors involved in this process are scarce. We quantified the losses of soil and fertilizer-derived N by water erosion under conventional tillage and no tillage at different rainfall intensities, identifying the factors that increase these losses. The experiment was carried out on plots (3.5 × 11 m) with two treatments and three replications, under simulated rainfall. The treatments consisted of soil with and without tillage. Three successive rainfalls were applied at intervals of 24 h, at intensities of 30 mm/h, 30 mm/h and 70 mm/h. The applied N fertilizer was isotopically labeled (15N) and incorporated into the soil in a line perpendicular to the plot length. The absence of tillage resulted in higher soil losses and higher total nitrogen (TN) losses in the erosion induced by the rainfalls. The FDN losses followed a different pattern: FDN contributions were highest from tilled plots, even when soil and TN losses were lowest, i.e., the smaller the amount of eroded sediment, the greater the percentage of FDN associated with it. Rainfall intensity did not affect FDN loss, and in both treatments losses were greatest after the less intense rainfalls.

Relevance:

20.00%

Publisher:

Abstract:

PURPOSE: To objectively characterize different heart tissues from the functional and viability images provided by composite-strain-encoding (C-SENC) MRI. MATERIALS AND METHODS: C-SENC is a new MRI technique for simultaneously acquiring cardiac functional and viability images. In this work, an unsupervised multi-stage fuzzy clustering method is proposed to identify different heart tissues in C-SENC images. The method is based on the sequential application of the fuzzy c-means (FCM) and iterative self-organizing data (ISODATA) clustering algorithms. The proposed method is tested on simulated heart images and on images from nine patients with and without myocardial infarction (MI). The resulting clustered images are compared with MRI delayed-enhancement (DE) viability images for determining MI, and Bland-Altman analysis is conducted between the two methods. RESULTS: Normal myocardium, infarcted myocardium, and blood are correctly identified using the proposed method. The clustered images correctly identified 90 +/- 4% of the pixels defined as infarct in the DE images. In addition, 89 +/- 5% of the pixels defined as infarct in the clustered images were also defined as infarct in the DE images. The Bland-Altman results show no bias between the two methods in identifying MI. CONCLUSION: The proposed technique allows different heart tissues to be identified objectively, which would be potentially important for clinical decision-making in patients with MI.
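
As a rough sketch of the first clustering stage only, the following code implements standard fuzzy c-means on per-pixel feature vectors with NumPy. It is not the authors' implementation, the subsequent ISODATA split/merge stage is omitted, and the three-cluster setting (blood, normal and infarcted myocardium), fuzziness exponent m = 2 and random feature data are assumptions made purely for illustration.

```python
import numpy as np

def fuzzy_c_means(x, n_clusters=3, m=2.0, n_iter=100, tol=1e-5, seed=0):
    """Standard fuzzy c-means on feature vectors x of shape (n_samples, n_features)."""
    rng = np.random.default_rng(seed)
    u = rng.random((n_clusters, len(x)))
    u /= u.sum(axis=0)                      # memberships sum to 1 per sample
    for _ in range(n_iter):
        um = u ** m
        centers = um @ x / um.sum(axis=1, keepdims=True)
        # squared distances between every center and every sample
        d2 = ((x[None, :, :] - centers[:, None, :]) ** 2).sum(axis=2) + 1e-12
        u_new = 1.0 / (d2 ** (1.0 / (m - 1)))
        u_new /= u_new.sum(axis=0)
        if np.abs(u_new - u).max() < tol:
            u = u_new
            break
        u = u_new
    return centers, u

# Hypothetical usage: cluster per-pixel features (placeholder data) into 3 tissue classes.
pixels = np.random.rand(5000, 2)            # stand-in for real C-SENC image features
centers, memberships = fuzzy_c_means(pixels, n_clusters=3)
labels = memberships.argmax(axis=0)         # hard labels from fuzzy memberships
```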

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: Analysis of DNA sequence polymorphisms can provide valuable information on the evolutionary forces shaping nucleotide variation and gives insight into the functional significance of genomic regions. The ongoing genome projects will radically improve our ability to detect specific genomic regions shaped by natural selection. Currently available methods and software, however, are unsatisfactory for such genome-wide analysis. RESULTS: We have developed methods for the analysis of DNA sequence polymorphisms at the genome-wide scale. These methods, which have been tested on coalescent-simulated data and on actual data files from mouse and human, have been implemented in the VariScan software package version 2.0, which now also includes a graphical user interface. The main features of this software are: i) exhaustive population-genetic analyses, including those based on coalescent theory; ii) analyses adapted to the shallow data generated by high-throughput genome projects; iii) use of genome annotations to conduct comprehensive analyses separately for different functional regions; iv) identification of relevant genomic regions by sliding-window and wavelet-multiresolution approaches; v) visualization of the results integrated with current genome annotations in commonly available genome browsers. CONCLUSION: VariScan is a powerful and flexible software suite for the analysis of DNA polymorphisms. The current version implements new algorithms, methods, and capabilities, providing an important tool for exhaustive exploratory analysis of genome-wide DNA polymorphism data.
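
VariScan's internal algorithms are not reproduced here; as a hedged illustration of the kind of sliding-window statistic it reports, the sketch below estimates nucleotide diversity (pi, the average pairwise difference per site) in fixed windows over a toy alignment. The window size, step and gap-handling rule are illustrative assumptions.

```python
from itertools import combinations

def window_pi(alignment, window=100, step=25):
    """Nucleotide diversity (pi) in sliding windows over equal-length sequences."""
    n_sites = len(alignment[0])
    # per-site average number of pairwise differences (gaps ignored)
    site_pi = []
    for j in range(n_sites):
        column = [seq[j] for seq in alignment]
        pairs = list(combinations(column, 2))
        diffs = sum(1 for a, b in pairs if a != b and a != '-' and b != '-')
        site_pi.append(diffs / len(pairs))
    # average pi within each window, reported at the window midpoint
    results = []
    for start in range(0, n_sites - window + 1, step):
        mid = start + window // 2
        results.append((mid, sum(site_pi[start:start + window]) / window))
    return results

# Toy usage with three short haplotypes
aln = ["ACGTACGTAC", "ACGTACGAAC", "ACGAACGTAC"]
print(window_pi(aln, window=5, step=2))
```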

Relevance:

20.00%

Publisher:

Abstract:

For the last two decades, supertree reconstruction has been an active field of research and has seen the development of a large number of major algorithms. Because of the growing popularity of supertree methods, it has become necessary to evaluate the performance of these algorithms to determine which are the best options (especially with regard to the widely used supermatrix approach). In this study, seven of the most commonly used supertree methods are investigated using a large empirical data set (in terms of number of taxa and molecular markers) from the worldwide flowering plant family Sapindaceae. Supertree methods were evaluated using several criteria: similarity of the supertrees to the input trees, similarity between the supertrees and the total evidence tree, level of resolution of the supertree, and computational time required by the algorithm. Additional analyses were conducted on a reduced data set to test whether performance was affected by the heuristic searches rather than by the algorithms themselves. Based on our results, two main groups of supertree methods were identified: the matrix representation with parsimony (MRP), MinFlip, and MinCut methods performed well according to our criteria, whereas the average consensus, split fit, and most similar supertree methods showed poorer performance or at least did not behave in the same way as the total evidence tree. Results for the super distance matrix, the most recent approach tested here, were promising, with at least one derived method performing as well as MRP, MinFlip, and MinCut. The output of each method improved only slightly when applied to the reduced data set, suggesting correct behavior of the heuristic searches and a relatively low sensitivity of the algorithms to data set size and missing data. The results also showed that MRP analyses can reach a high level of quality even with a simple heuristic search strategy, with the exception of MRP with the Purvis coding scheme and reversible parsimony. The future of supertrees lies in the implementation of a standardized heuristic search for all methods and in increased computing power to handle large data sets. The latter would be particularly useful for promising approaches such as the maximum quartet fit method, which still requires substantial computing power.
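
To make the best-performing approach more concrete, here is a hedged sketch of the matrix-representation step of MRP: every clade of every input tree becomes a binary character, taxa absent from that source tree are coded as missing ('?'), and the resulting matrix would then be analysed with an external parsimony program. The tree representation used here (lists of clades given as taxon sets) is an assumption of this sketch, not the authors' data format.

```python
def mrp_matrix(input_trees, all_taxa):
    """Build a matrix-representation-with-parsimony (MRP) character matrix.

    input_trees: list of trees, each given as (taxa_in_tree, list_of_clades),
                 where every clade is a set of taxon names.
    Returns a dict: taxon -> string of 0/1/? characters.
    """
    matrix = {t: [] for t in all_taxa}
    for taxa_in_tree, clades in input_trees:
        for clade in clades:
            for taxon in all_taxa:
                if taxon not in taxa_in_tree:
                    matrix[taxon].append('?')   # taxon missing from this source tree
                elif taxon in clade:
                    matrix[taxon].append('1')   # taxon inside the clade
                else:
                    matrix[taxon].append('0')   # taxon in the tree but outside the clade
    return {t: ''.join(chars) for t, chars in matrix.items()}

# Toy usage: two small input trees over overlapping taxon sets
trees = [
    ({'A', 'B', 'C', 'D'}, [{'A', 'B'}, {'A', 'B', 'C'}]),
    ({'B', 'C', 'D', 'E'}, [{'D', 'E'}, {'C', 'D', 'E'}]),
]
for taxon, row in mrp_matrix(trees, ['A', 'B', 'C', 'D', 'E']).items():
    print(taxon, row)
```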

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE: This research explored medical students' use and perception of technical language in a practical training setting to enhance skills in breaking bad news in oncology. METHODS: Terms potentially confusing to laypeople were selected from 108 videotaped interviews conducted in an undergraduate Communication Skills Training. A subset of these terms was included in a questionnaire completed by students (N=111) with the aim of gaining insight into their perceptions of different speech registers and of patient understanding. Excerpts of interviews were analyzed qualitatively to investigate students' communication strategies with respect to these technical terms. RESULTS: Fewer than half of the terms were clarified. Students checked for simulated patients' understanding of the terms palliative and metastasis/to metastasize in 22-23% of the interviews. The term ambulatory was spontaneously explained in 75% of the interviews, hepatic and metastasis/to metastasize in 22-24%. Most provided explanations were in plain language; metastasis/to metastasize and ganglion/ganglionic were among terms most frequently explained in technical language. CONCLUSION: A significant number of terms potentially unfamiliar and confusing to patients remained unclarified in training interviews conducted by senior medical students, even when they perceived the terms as technical. PRACTICE IMPLICATIONS: This exploration may offer important insights for improving future physicians' skills.

Relevance:

20.00%

Publisher:

Abstract:

Intensity-modulated radiotherapy (IMRT) treatment plan verification by comparison with measured data requires access to the linear accelerator and is time consuming. In this paper, we propose a method for monitor unit (MU) calculation and plan comparison for step-and-shoot IMRT based on the Monte Carlo code EGSnrc/BEAMnrc. The beamlets of an IMRT treatment plan are individually simulated using Monte Carlo and converted into absorbed dose to water per MU. The dose of the whole treatment can then be expressed as a linear matrix equation of the MUs and the dose per MU of every beamlet. Because the absorbed dose and the MU values must be positive, this equation is solved for the MU values using a non-negative least-squares (NNLS) optimization algorithm. The Monte Carlo plan is formed by multiplying the Monte Carlo absorbed dose to water per MU by the Monte Carlo/NNLS MUs. Treatment plans for several localizations calculated with a commercial treatment planning system (TPS) are compared with the proposed method for validation. The Monte Carlo/NNLS MUs are close to those calculated by the TPS and lead to a treatment dose distribution that is clinically equivalent to the one calculated by the TPS. This procedure can be used for IMRT QA, and further development could allow the technique to be applied to other radiotherapy techniques such as tomotherapy or volumetric modulated arc therapy.
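
The key numerical step described above is a non-negative least-squares fit of the beamlet MUs. The toy sketch below solves it with scipy.optimize.nnls on a random dose-per-MU matrix; the actual Monte Carlo beamlet doses and TPS reference dose are of course placeholders here.

```python
import numpy as np
from scipy.optimize import nnls

# D[i, j]: absorbed dose to water per MU delivered by beamlet j at voxel i
# (in the paper these columns come from EGSnrc/BEAMnrc simulations; random here).
rng = np.random.default_rng(1)
n_voxels, n_beamlets = 200, 12
D = rng.random((n_voxels, n_beamlets))

# Reference dose distribution, standing in for the dose exported from the TPS.
mu_true = rng.uniform(5, 50, n_beamlets)
d_tps = D @ mu_true

# Solve  min ||D * mu - d_tps||  subject to mu >= 0
mu_fit, residual = nnls(D, d_tps)

# The "Monte Carlo plan" is then the per-MU dose matrix times the fitted MUs.
d_mc = D @ mu_fit
print(np.allclose(mu_fit, mu_true, rtol=1e-6), residual)
```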

Relevance:

20.00%

Publisher:

Abstract:

Summary: Photosynthesis and Rubisco kinetics of spring wheat and meadow fescue under simulated climate change, i.e., elevated carbon dioxide concentration and elevated temperature.

Relevance:

20.00%

Publisher:

Abstract:

Regulatory gene networks contain generic modules, like those involving feedback loops, which are essential for the regulation of many biological functions (Guido et al. in Nature 439:856-860, 2006). We consider a class of self-regulated genes which are the building blocks of many regulatory gene networks, and study the steady-state distribution of the associated Gillespie algorithm by providing efficient numerical algorithms. We also study a regulatory gene network of interest in gene therapy, using mean-field models with time delays. Convergence of the related time-nonhomogeneous Markov chain is established for a class of linear catalytic networks with feedback loops.
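
As a minimal sketch of the kind of self-regulated gene considered, the following Gillespie simulation uses a negatively autoregulated protein: production follows a repressive Hill function of the current copy number and degradation is first order. All rate constants are illustrative assumptions, not values from the paper.

```python
import numpy as np

def gillespie_self_regulated(k_max=10.0, K=20.0, n_hill=2.0, gamma=0.1,
                             x0=0, t_end=500.0, seed=0):
    """Stochastic simulation of a negatively self-regulated gene.

    Reactions:  0 -> X  at rate k_max / (1 + (x/K)**n_hill)   (repressed production)
                X -> 0  at rate gamma * x                      (degradation)
    """
    rng = np.random.default_rng(seed)
    t, x = 0.0, x0
    times, counts = [t], [x]
    while t < t_end:
        a_prod = k_max / (1.0 + (x / K) ** n_hill)
        a_deg = gamma * x
        a_total = a_prod + a_deg
        t += rng.exponential(1.0 / a_total)          # time to the next reaction
        if rng.random() < a_prod / a_total:          # choose which reaction fires
            x += 1
        else:
            x -= 1
        times.append(t)
        counts.append(x)
    return np.array(times), np.array(counts)

times, counts = gillespie_self_regulated()
print("mean copy number over the run:", counts.mean())
```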

Relevance:

20.00%

Publisher:

Abstract:

Cross-hole radar tomography is a useful tool for mapping shallow subsurface electrical properties, viz. dielectric permittivity and electrical conductivity. Common practice is to invert cross-hole radar data with ray-based tomographic algorithms using first-arrival traveltimes and first-cycle amplitudes. However, the resolution of conventional ray-based inversion schemes for cross-hole ground-penetrating radar (GPR) is limited because only a fraction of the information contained in the radar data is used. The resolution can be improved significantly by using a full-waveform inversion that considers the entire waveform, or significant parts thereof. A recently developed 2D time-domain vectorial full-waveform cross-hole radar inversion code has been modified in the present study to allow optimized acquisition setups that significantly reduce acquisition time and computational costs. This is achieved by minimizing the number of transmitter points and maximizing the number of receiver positions. The improved algorithm was employed to invert cross-hole GPR data acquired within a gravel aquifer (4-10 m depth) in the Thur valley, Switzerland. The simulated traces of the final model obtained by the full-waveform inversion fit the observed traces very well in the lower part of the section and reasonably well in the upper part. Compared with the ray-based inversion, the full-waveform inversion yields images with significantly higher resolution. Borehole logs were acquired on either side, 2.5 m away from the cross-hole plane. There is good correspondence between the conductivity tomograms and the natural gamma logs at the boundary between the gravel layer and the underlying lacustrine clay deposits. Using existing petrophysical models, the inversion results and neutron-neutron logs are converted to porosity. Without any additional calibration, the porosities obtained from the neutron-neutron logs and from the permittivity results are very close, and similar vertical variations can be observed. In both cases, the full-waveform inversion provides additional information about the subsurface. Due to the presence of the water table and the associated refracted/reflected waves, the upper traces are not well fitted, and the upper 2 m of the permittivity and conductivity tomograms are not reliably reconstructed because the unsaturated zone is not incorporated into the inversion domain.
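
The petrophysical model used in the study is not named in the abstract; purely as an assumed illustration, the sketch below converts relative permittivity to porosity with the complex refractive index model (CRIM) for a fully water-saturated medium, using assumed end-member permittivities.

```python
import numpy as np

def crim_porosity(eps_bulk, eps_water=81.0, eps_solid=5.0):
    """Porosity from bulk relative permittivity, CRIM model, full water saturation.

    CRIM (saturated):  sqrt(eps_bulk) = (1 - phi)*sqrt(eps_solid) + phi*sqrt(eps_water)
    =>                 phi = (sqrt(eps_bulk) - sqrt(eps_solid)) / (sqrt(eps_water) - sqrt(eps_solid))
    """
    eps_bulk = np.asarray(eps_bulk, dtype=float)
    return (np.sqrt(eps_bulk) - np.sqrt(eps_solid)) / (np.sqrt(eps_water) - np.sqrt(eps_solid))

# Example: permittivity values of the order expected in a saturated gravel aquifer
print(crim_porosity([18.0, 22.0, 26.0]))
```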

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes a very fast method for blindly approximating a nonlinear mapping that transforms a sum of random variables. The estimation is surprisingly good even when the basic assumption is not satisfied. We use the method to provide a good initialization for inverting post-nonlinear mixtures and Wiener systems. Experiments show that the algorithm's speed is strongly improved and its asymptotic performance is preserved, with a very low extra computational cost.
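
The abstract does not spell out the procedure; one common way to exploit the "sum of random variables" property is to assume the unobserved linear mixture is nearly Gaussian (central limit theorem) and to estimate the inverse nonlinearity by mapping the empirical distribution of the observed signal onto a standard normal. The sketch below implements that rank-based Gaussianization and should be read as an illustrative assumption, not the authors' exact algorithm.

```python
import numpy as np
from scipy.stats import norm, rankdata

def gaussianize(y):
    """Map observations to an approximately standard-normal variable via their ranks.

    If y = f(s) with f monotonic and s approximately Gaussian (e.g. a sum of many
    independent terms), this recovers s up to an affine transformation.
    """
    ranks = rankdata(y)                       # 1 .. N
    u = ranks / (len(y) + 1.0)                # empirical CDF values in (0, 1)
    return norm.ppf(u)                        # inverse Gaussian CDF

# Toy post-nonlinear observation: cubic distortion of a sum of uniform variables
rng = np.random.default_rng(0)
s = rng.uniform(-1, 1, size=(5000, 8)).sum(axis=1)   # roughly Gaussian by the CLT
y = s ** 3                                            # monotone nonlinearity
s_hat = gaussianize(y)
print(np.corrcoef(s, s_hat)[0, 1])                    # close to 1 if the inversion works
```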

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we present a comprehensive study of different Independent Component Analysis (ICA) algorithms for the calculation of the coherency and sharpness of electroencephalogram (EEG) signals, in order to investigate the possibility of early detection of Alzheimer's disease (AD). We found that ICA algorithms can help with artifact rejection and noise reduction, improving the discriminative power of features in high-frequency bands (especially in the high alpha and beta ranges). In addition to comparing different ICA algorithms, the optimum number of selected components is investigated, in order to support decisions in future work.
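
The specific ICA variants compared in the paper are not listed in the abstract; as one hedged example of how such a decomposition is typically run, the sketch below applies scikit-learn's FastICA to a multichannel signal matrix, drops a component, and projects back to the channels. The channel count, sampling rate, number of components and the choice of component to reject are placeholder assumptions.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Placeholder multichannel "EEG": 19 channels, 10 s at 256 Hz (real data would be loaded instead).
rng = np.random.default_rng(0)
n_channels, fs, duration = 19, 256, 10
eeg = rng.standard_normal((n_channels, fs * duration))

# FastICA expects samples as rows, so transpose to (n_samples, n_channels).
ica = FastICA(n_components=10, random_state=0)
sources = ica.fit_transform(eeg.T)           # estimated independent components
mixing = ica.mixing_                         # column i maps component i back to the channels

# Artifact rejection in this style of pipeline amounts to zeroing chosen components
# and projecting back; component 0 is dropped here purely for illustration.
keep = np.ones(sources.shape[1], dtype=bool)
keep[0] = False
cleaned = (sources[:, keep] @ mixing[:, keep].T + ica.mean_).T
print(cleaned.shape)
```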

Relevance:

20.00%

Publisher:

Abstract:

In this paper we present a quantitative comparison of different independent component analysis (ICA) algorithms in order to investigate their potential use in preprocessing electroencephalogram (EEG) data (for example, noise reduction and feature extraction) for the early detection of Alzheimer's disease (AD), or for discrimination between AD (or mild cognitive impairment, MCI) patients and age-matched control subjects.
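
For the feature-extraction side mentioned above, a common step after ICA-based cleaning is to summarise each channel by its relative power in standard EEG bands; the sketch below does this with Welch's method from SciPy. The band limits and sampling rate are conventional values assumed for illustration, not taken from the paper.

```python
import numpy as np
from scipy.signal import welch

BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}   # Hz, conventional limits

def relative_band_powers(signal, fs=256):
    """Relative power of each EEG band for a single-channel signal."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)           # 2-second segments
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = psd[mask].sum() / psd.sum()              # simple discrete approximation
    return powers

# Placeholder single channel: noise plus a 10 Hz (alpha-band) oscillation
fs = 256
t = np.arange(0, 10, 1 / fs)
channel = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.default_rng(0).standard_normal(t.size)
print(relative_band_powers(channel, fs=fs))
```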