940 results for Computerized simulation.
Abstract:
Summary: Effects of elevated CO₂ and temperature on the phenological development and yield potential of spring wheat
Abstract:
The objective of this work was to adapt the CROPGRO model, which is part of the DSSAT system, for simulating cowpea (Vigna unguiculata) growth and development under the soil and climate conditions of the Baixo Parnaíba region, Piauí State, Brazil. In CROPGRO, only the input parameters that define crop species, cultivar, and ecotype were changed in order to characterize the cowpea crop. Soil and climate files were created for the site considered. Field experiments without water deficit were used to calibrate the model. In these experiments, dry matter (DM), leaf area index (LAI), yield components and grain yield of cowpea (cv. BR 14 Mulato) were evaluated. The results showed a good fit for the DM and LAI estimates: the mean values of R² and mean absolute error (MAE) were, respectively, 0.95 and 264.9 kg ha⁻¹ for DM, and 0.97 and 0.22 for LAI. The difference between observed and simulated values of plant phenology varied from 0 to 3 days. The model also performed well in simulating yield components, except for 100-grain weight, for which the error ranged from 20.9% to 34.3%. Considering the mean crop yield over the two years, the model presented an error of 5.6%.
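The goodness-of-fit statistics reported above (R² and mean absolute error) are standard model-evaluation measures. The sketch below shows how they are typically computed from paired observed and simulated values; the arrays are hypothetical placeholders, not data from the CROPGRO calibration.

```python
import numpy as np

def r_squared(obs, sim):
    """Coefficient of determination between observed and simulated values."""
    ss_res = np.sum((obs - sim) ** 2)
    ss_tot = np.sum((obs - np.mean(obs)) ** 2)
    return 1.0 - ss_res / ss_tot

def mean_absolute_error(obs, sim):
    """Mean absolute error, in the same units as the inputs (e.g. kg ha^-1)."""
    return np.mean(np.abs(obs - sim))

# Hypothetical dry-matter values (kg ha^-1), for illustration only.
observed = np.array([1200.0, 2500.0, 3900.0, 5200.0])
simulated = np.array([1150.0, 2620.0, 3750.0, 5400.0])

print(f"R^2 = {r_squared(observed, simulated):.2f}")
print(f"MAE = {mean_absolute_error(observed, simulated):.1f} kg ha^-1")
```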
Abstract:
In this paper, we present a computer simulation study of the ion binding process at an ionizable surface, using a semi-grand canonical Monte Carlo method that models the surface as a discrete distribution of charged and neutral functional groups in equilibrium with explicit ions described within the primitive model. The parameters of the simulation model were tuned and checked by comparison with experimental titrations of carboxylated latex particles in the presence of different ionic strengths of monovalent ions. The titration of these particles was analysed by calculating degree-of-dissociation versus pH curves for the latex functional groups at different background salt concentrations. As the charge of the titrated surface changes during the simulation, a procedure to maintain the electroneutrality of the system is required. Here, two approaches are used, the choice depending on the ion selected to maintain electroneutrality: the counterion or the coion procedure. We compare and discuss the differences between the two procedures. The simulations also provided a microscopic description of the electrostatic double layer (EDL) structure as a function of pH and ionic strength. The results allow us to quantify the effect of the size of the background salt ions and of the surface functional groups on the degree of dissociation. The non-homogeneous structure of the EDL was revealed by plotting the counterion density profiles around charged and neutral surface functional groups.
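As a rough illustration of the titration moves involved, the sketch below shows a single Metropolis acceptance step for toggling one surface group between its protonated and dissociated states; in the counterion procedure a counterion would also be inserted or deleted alongside the move to keep the box electroneutral. The acceptance expression, pKa handling and energy input are simplified assumptions, not the published semi-grand canonical scheme.

```python
import math
import random

def attempt_titration(site_charged, pH, pKa, beta_dE_elec):
    """Attempt to toggle one surface group between COOH and COO-.

    beta_dE_elec: electrostatic energy change of the trial move in units of kT,
    computed elsewhere from the explicit-ion configuration (an input here).
    Returns the (possibly updated) charge state of the site.
    """
    if site_charged:
        # Protonation trial: favoured at low pH.
        acc = math.exp(-beta_dE_elec) * 10.0 ** (pKa - pH)
    else:
        # Deprotonation trial: favoured at high pH; in the counterion procedure
        # a counterion would be inserted here to keep the system electroneutral.
        acc = math.exp(-beta_dE_elec) * 10.0 ** (pH - pKa)
    if random.random() < min(1.0, acc):
        return not site_charged
    return site_charged
```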
Abstract:
AIMS: This study was performed to compare the sensitivity of ultrasonography, computerized tomography during arterial portography, delayed computerized tomography, and magnetic resonance imaging in detecting focal liver lesions. Forty-three patients with primary or secondary malignant liver lesions were studied prior to surgical intervention. METHODS: The results of the imaging studies were compared with intraoperative examination of the liver, intraoperative ultrasonography, and pathology results (29 patients). In the non-operated group (14 patients), we compared the number of lesions detected by each technique. RESULTS: One hundred and forty-six lesions were detected. In operated patients, sensitivity was 84% for computerized tomography during arterial portography, 61.3% for the delayed scan, 63.3% for magnetic resonance imaging, and 51% for ultrasonography. In patients who did not undergo surgery, magnetic resonance imaging was more sensitive in detecting lesions. CONCLUSIONS: In both the operated and non-operated patient series, CT during arterial portography had the highest sensitivity, but magnetic resonance imaging gave the most consistent overall results.
Abstract:
A simulation model of the effects of hormone replacement therapy (HRT) on hip fractures and their consequences is based on a population of 100,000 post-menopausal women. This cohort is confronted with literature-derived probabilities of cancers (endometrial or breast, which are contra-indications to HRT), hip fracture, disability requiring nursing-home or home care, and death. Administration of HRT for life prevents 55.5% of hip fractures, 22.6% of years with home care and 4.4% of years in nursing homes. If HRT is administered for 15 years, these figures are 15.5%, 10% and 2.2%, respectively. A slight gain in life expectancy is observed for both durations of HRT. The net financial loss in the simulated population is 222 million Swiss francs (cost/benefit ratio 1.25) for lifelong administration of HRT, and 153 million Swiss francs (cost/benefit ratio 1.42) if HRT is administered for 15 years.
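A minimal sketch of this kind of cohort state-transition model is given below: 100,000 women are followed year by year, with hip-fracture risk reduced while on HRT. All transition probabilities and the relative risk are illustrative assumptions, not the values used in the published model.

```python
import numpy as np

def simulate_cohort(n_women=100_000, years=30, on_hrt=True,
                    p_fracture=0.005, hrt_relative_risk=0.5, p_death=0.02,
                    seed=1):
    """Count hip fractures in a simple yearly-cycle cohort simulation."""
    rng = np.random.default_rng(seed)
    alive, fractures = n_women, 0
    p_frac = p_fracture * (hrt_relative_risk if on_hrt else 1.0)
    for _ in range(years):
        fractures += rng.binomial(alive, p_frac)   # fractures this year
        alive -= rng.binomial(alive, p_death)      # deaths this year
    return fractures

base = simulate_cohort(on_hrt=False)
hrt = simulate_cohort(on_hrt=True)
print(f"fractures prevented: {base - hrt} ({(base - hrt) / base:.1%})")
```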
Abstract:
BACKGROUND: Exposure to combination antiretroviral therapy (cART) can lead to important metabolic changes and increased risk of coronary heart disease (CHD). Computerized clinical decision support systems have been advocated to improve the management of patients at risk for CHD, but it is unclear whether such systems reduce patients' risk for CHD. METHODS: We conducted a cluster trial within the Swiss HIV Cohort Study (SHCS) of HIV-infected patients, aged 18 years or older, not pregnant and receiving cART for >3 months. We randomized 165 physicians to either guidelines for CHD risk factor management alone or guidelines plus CHD risk profiles. Risk profiles included the Framingham risk score, CHD drug prescriptions and CHD events based on biannual assessments, and were continuously updated by the SHCS data centre and integrated into patient charts by study nurses. Outcome measures were total cholesterol, systolic and diastolic blood pressure and the Framingham risk score. RESULTS: A total of 3,266 patients (80% of those eligible) had a final assessment of the primary outcome at least 12 months after the start of the trial. Mean (95% confidence interval) differences for patients whose physicians received CHD risk profiles and guidelines, rather than guidelines alone, were total cholesterol -0.02 mmol/l (-0.09 to 0.06), systolic blood pressure -0.4 mmHg (-1.6 to 0.8), diastolic blood pressure -0.4 mmHg (-1.5 to 0.7) and 10-year Framingham risk score -0.2% (-0.5 to 0.1). CONCLUSIONS: Systematic, computerized, routine provision of CHD risk profiles in addition to guidelines does not significantly improve risk factors for CHD in patients on cART.
Abstract:
Abstract: Asthma prevalence in children and adolescents in Spain is 10-17%. It is the most common chronic illness during childhood. Prevalence has been increasing over the last 40 years, and there is considerable evidence that, among other factors, continued exposure to cigarette smoke results in asthma in children. No statistical or simulation model exists to forecast the evolution of childhood asthma in Europe. Such a model needs to incorporate the main risk factors that can be managed by medical authorities, such as tobacco (OR = 1.44), in order to establish how they affect the present generation of children. A simulation model for childhood asthma using conditional probability and discrete event simulation was developed and validated by simulating realistic scenarios. The parameters used for the model (input data) were those found in the literature, especially those related to the incidence of smoking in Spain. We also used data from a panel of experts at the Hospital del Mar (Barcelona) related to the actual evolution of the disease and asthma phenotypes. The results of the simulation established a threshold of a 15-20% smoking population for a reduction in the prevalence of asthma. This is still far from the current level in Spain, where 24% of people smoke. We conclude that more effort must be made to combat smoking and other childhood asthma risk factors in order to significantly reduce the number of cases. Once completed, this simulation methodology can realistically be used to forecast the evolution of childhood asthma as a function of variation in different risk factors.
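The conditional-probability mechanism described above can be illustrated with a short Monte Carlo sketch in which the baseline probability of developing asthma is modified by the smoking-exposure odds ratio. Only OR = 1.44 is taken from the abstract; the baseline prevalence, cohort size and exposure shares are placeholder assumptions.

```python
import numpy as np

def apply_odds_ratio(p_base, odds_ratio):
    """Convert a baseline probability and an odds ratio into the exposed-group probability."""
    odds = p_base / (1.0 - p_base) * odds_ratio
    return odds / (1.0 + odds)

def simulated_prevalence(n_children, smoking_share, p_base=0.12, or_smoke=1.44, seed=0):
    """Draw exposure and asthma outcomes for a synthetic cohort of children."""
    rng = np.random.default_rng(seed)
    exposed = rng.random(n_children) < smoking_share
    p = np.where(exposed, apply_odds_ratio(p_base, or_smoke), p_base)
    return (rng.random(n_children) < p).mean()

for share in (0.15, 0.20, 0.24):   # candidate smoking-population shares
    print(f"smoking share {share:.0%}: simulated prevalence {simulated_prevalence(1_000_000, share):.3f}")
```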
Abstract:
This report presents an overview of where the computerized highway information system stands today and of its status as a planning and programming tool for state highway agencies. A computerized highway information system is simply a computer-linked system that can be used by many divisions of a transportation agency to obtain information to meet data-reporting, analysis, or other informational needs. The description of the highway information system covers current use and status, applications, organization and system development, and benefits and problems.
Abstract:
The Center for Transportation Research and Education (CTRE) used the traffic simulation model CORSIM to assess proposed capacity and safety improvement strategies for the U.S. 61 corridor through Burlington, Iowa. The comparison between the base and alternative models allows for evaluation of traffic flow performance under existing conditions as well as under other design scenarios. The models also provide visualization of performance for interpretation by technical staff, public policy makers, and the public. The objectives of this project were to evaluate the use of traffic simulation models for future use by the Iowa Department of Transportation (DOT) and to develop procedures for employing simulation modeling in the analysis of alternative designs. This report presents both the findings of the U.S. 61 evaluation and an overview of model development procedures. The first part of the report covers the simulation modeling development procedures; the simulation analysis is illustrated through the Burlington U.S. 61 corridor case study. Part I is not intended to be a user manual but rather a set of introductory guidelines for traffic simulation modeling. Part II of the report evaluates the proposed improvement concepts in a side-by-side comparison of the base and alternative models.
Abstract:
Simulated-annealing-based conditional simulations provide a flexible means of quantitatively integrating diverse types of subsurface data. Although such techniques are increasingly used in hydrocarbon reservoir characterization studies, their potential in environmental, engineering and hydrological investigations is still largely unexploited. Here, we introduce a novel simulated annealing (SA) algorithm geared towards the integration of high-resolution geophysical and hydrological data which, compared to more conventional approaches, provides significant advancements in the way that large-scale structural information in the geophysical data is accounted for. Model perturbations in the annealing procedure are made by drawing from a probability distribution for the target parameter conditioned to the geophysical data. This is the only place where geophysical information is utilized in our algorithm, in marked contrast to other approaches where model perturbations are made by swapping values in the simulation grid and agreement with soft data is enforced through a correlation coefficient constraint. Another major feature of our algorithm is the way in which available geostatistical information is utilized. Instead of constraining realizations to match a parametric target covariance model over a wide range of spatial lags, we constrain the realizations only at the smaller lags where the available geophysical data cannot provide enough information. Thus we allow the larger-scale subsurface features resolved by the geophysical data to exert much greater control on the output realizations. Further, since the only component of the SA objective function required in our approach is a covariance constraint at small lags, our method has improved convergence and computational efficiency over more traditional methods. Here, we present the results of applying our algorithm to the integration of porosity log and tomographic crosshole georadar data to generate stochastic realizations of the local-scale porosity structure. Our procedure is first tested on a synthetic data set and then applied to data collected at the Boise Hydrogeophysical Research Site.
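A schematic one-dimensional version of this annealing loop is sketched below: cells of the realization are redrawn from a stand-in distribution meant to represent the geophysics-conditioned proposal, and the objective penalizes covariance mismatch only at small lags. The target covariance, proposal distribution and cooling schedule are illustrative assumptions, not the published configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_cells, small_lags = 200, range(1, 6)
target_cov = {h: 0.01 * np.exp(-h / 3.0) for h in small_lags}  # assumed covariance model

def lag_covariance(x, h):
    """Sample covariance of the realization at lag h."""
    return np.cov(x[:-h], x[h:])[0, 1]

def objective(x):
    """Covariance mismatch restricted to the small lags only."""
    return sum((lag_covariance(x, h) - target_cov[h]) ** 2 for h in small_lags)

def draw_conditional(i):
    """Stand-in for drawing porosity conditioned to the geophysical data at cell i."""
    return rng.normal(0.25, 0.1)

x = rng.normal(0.25, 0.1, n_cells)   # initial realization
temperature, cooling = 1e-4, 0.999
for _ in range(5_000):
    i = rng.integers(n_cells)
    trial = x.copy()
    trial[i] = draw_conditional(i)   # perturbation drawn from the conditional distribution
    d_obj = objective(trial) - objective(x)
    if d_obj < 0 or rng.random() < np.exp(-d_obj / temperature):
        x = trial                    # accept the perturbation
    temperature *= cooling
```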
Abstract:
When decommissioning a nuclear facility it is important to be able to estimate the activity levels of potentially radioactive samples and compare them with the clearance values defined by regulatory authorities. This paper presents a method of calibrating a clearance box monitor based on practical experimental measurements and Monte Carlo simulations. Adjusting the simulation against experimental data obtained with a simple point source permits the computation of absolute calibration factors for more complex geometries, with an accuracy of slightly more than 20%. The uncertainty of the calibration factor can be improved to about 10% when the simulation is used relatively, in direct comparison with a measurement performed in the same geometry but with another nuclide. The simulation can also be used to validate the experimental calibration procedure when the sample is supposed to be homogeneous but the calibration factor is derived from a plate phantom. For more realistic geometries, such as a small gravel dumpster, Monte Carlo simulation shows that the calibration factor obtained with a larger homogeneous phantom is correct to within about 20%, provided that sample density is taken as the influencing parameter. Finally, the simulation can be used to estimate the effect of a contamination hotspot. The research supporting this paper shows that, if the sample is assumed to be homogeneously contaminated, activity could be largely underestimated in the event of a centrally located hotspot and overestimated for a peripherally located hotspot. This demonstrates the usefulness of being able to complement experimental methods with Monte Carlo simulations in order to estimate calibration factors that cannot be measured directly because of a lack of available material or of specific geometries.
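The "relative" use of the simulation mentioned above amounts to scaling a measured calibration factor for a reference nuclide by the ratio of simulated detection efficiencies in the same geometry. The sketch below writes this out; all numerical values are placeholders, not measured or simulated results.

```python
def relative_calibration_factor(cf_measured_ref, eff_sim_ref, eff_sim_target):
    """Calibration factor (activity per count rate) for a target nuclide, derived
    relatively from a measured reference factor and simulated efficiencies."""
    return cf_measured_ref * eff_sim_ref / eff_sim_target

cf_cs137 = 2.4e3        # Bq per (count/s), measured with a Cs-137 source (assumed)
eff_sim_cs137 = 4.1e-3  # simulated detection efficiency for Cs-137 (assumed)
eff_sim_co60 = 6.8e-3   # simulated detection efficiency for Co-60 (assumed)

print(relative_calibration_factor(cf_cs137, eff_sim_cs137, eff_sim_co60))
```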
Abstract:
In this paper, a hybrid simulation-based algorithm is proposed for the Stochastic Flow Shop Problem. The main idea of the methodology is to transform the stochastic problem into a deterministic problem and then apply simulation to the latter. To achieve this goal, we rely on Monte Carlo simulation and an adapted version of a deterministic heuristic. This approach aims to provide flexibility and simplicity, since it is not constrained by any prior assumptions and relies on well-tested heuristics.
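A minimal sketch of the hybrid idea follows: the stochastic processing times are replaced by their expected values, a permutation is built with a simple deterministic rule (a stand-in for the adapted heuristic, not the authors' actual one), and the makespan distribution of that schedule is then estimated by Monte Carlo sampling. All data are invented.

```python
import numpy as np

rng = np.random.default_rng(42)
n_jobs, n_machines = 6, 3
mean_times = rng.uniform(2.0, 10.0, size=(n_jobs, n_machines))  # expected processing times

def makespan(times, order):
    """Completion time of the last job on the last machine for a permutation schedule."""
    completion = np.zeros((len(order), times.shape[1]))
    for pos, job in enumerate(order):
        for m in range(times.shape[1]):
            prev_job = completion[pos - 1, m] if pos > 0 else 0.0
            prev_machine = completion[pos, m - 1] if m > 0 else 0.0
            completion[pos, m] = max(prev_job, prev_machine) + times[job, m]
    return completion[-1, -1]

# Deterministic step: order jobs on the expected-value problem
# (here: longest expected total processing time first).
order = list(np.argsort(-mean_times.sum(axis=1)))

# Simulation step: sample stochastic processing times around the means.
samples = [makespan(rng.lognormal(np.log(mean_times), 0.2), order)
           for _ in range(2_000)]
print(f"expected makespan ≈ {np.mean(samples):.1f} ± {np.std(samples):.1f}")
```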
Abstract:
The objective of this study was to improve the simulation of node number in soybean cultivars with determinate stem habit. A nonlinear model considering two approaches for inputting daily air temperature data (daily mean temperature, and daily minimum/maximum air temperatures) was used. Data on node number on the main stem of ten soybean cultivars were collected in a three-year field experiment (from 2004/2005 to 2006/2007) at Santa Maria, RS, Brazil. Node number was simulated using the Soydev model, which has a nonlinear temperature response function [f(T)]. The f(T) was calculated using two methods: using the daily mean air temperature, calculated as the arithmetic average of the daily minimum and maximum air temperatures (Soydev tmean); and calculating one f(T) using the minimum air temperature and another using the maximum air temperature and then averaging the two f(T)s (Soydev tmm). Root mean square error (RMSE) and deviations (simulated minus observed) were used as statistics to evaluate the performance of the two versions of Soydev. Simulations of node number in soybean were better with the Soydev tmm version, with an RMSE of 0.5 to 1.4 nodes. Node number can be simulated for several soybean cultivars using only one set of model coefficients, with an RMSE of 0.8 to 2.4 nodes.
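The two ways of feeding temperature into the nonlinear response function f(T) compared above can be written out as follows. The beta-function form and the cardinal temperatures (Tmin = 7 °C, Topt = 30 °C, Tmax = 45 °C) are illustrative assumptions, not the calibrated Soydev coefficients.

```python
def f_temp(t, t_min=7.0, t_opt=30.0, t_max=45.0):
    """Beta-type temperature response: 0 at the cardinal limits, 1 at t_opt."""
    if t <= t_min or t >= t_max:
        return 0.0
    exponent = (t_opt - t_min) / (t_max - t_opt)
    return ((t_max - t) / (t_max - t_opt)) * ((t - t_min) / (t_opt - t_min)) ** exponent

t_low, t_high = 14.0, 28.0                       # daily minimum / maximum (°C)
f_tmean = f_temp((t_low + t_high) / 2.0)         # Soydev tmean approach
f_tmm = (f_temp(t_low) + f_temp(t_high)) / 2.0   # Soydev tmm approach
print(round(f_tmean, 3), round(f_tmm, 3))
```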