Abstract:
A version of the Agricultural Production Systems Simulator (APSIM) capable of simulating the key agronomic aspects of intercropping maize between legume shrub hedgerows was described and parameterised in the first paper of this series (Nelson et al., this issue). In this paper, APSIM is used to simulate maize yields and soil erosion from traditional open-field farming and hedgerow intercropping in the Philippine uplands. Two variants of open-field farming were simulated using APSIM, continuous and fallow, for comparison with intercropping maize between leguminous shrub hedgerows. Continuous open-field maize farming was predicted to be unsustainable in the long term, while fallow open-field farming was predicted to slow productivity decline by spreading the effect of erosion over a larger cropping area. Hedgerow intercropping was predicted to reduce erosion by maintaining soil surface cover during periods of intense rainfall, contributing to sustainable production of maize in the long term. In the third paper in this series, Nelson et al. (this issue) use cost-benefit analysis to compare the economic viability of hedgerow intercropping relative to traditional open-field farming of maize in relatively inaccessible upland areas. (C) 1998 Elsevier Science Ltd. All rights reserved.
Abstract:
The use of computational fluid dynamics simulations for calibrating a flush air data system is described. In particular, the flush air data system of the HYFLEX hypersonic vehicle is used as a case study. The HYFLEX air data system consists of nine pressure ports located flush with the vehicle nose surface, connected to onboard pressure transducers. After appropriate processing, surface pressure measurements can be converted into useful air data parameters. The processing algorithm requires an accurate pressure model, which relates air data parameters to the measured pressures. In the past, such pressure models have been calibrated using combinations of flight data, ground-based experimental results, and numerical simulation. We perform a calibration of the HYFLEX flush air data system using computational fluid dynamics simulations exclusively. The simulations are used to build an empirical pressure model that accurately describes the HYFLEX nose pressure distribution over a range of flight conditions. We believe that computational fluid dynamics provides a quick and inexpensive way to calibrate the air data system and is applicable to a broad range of flight conditions. When tested with HYFLEX flight data, the calibrated system is found to work well. It predicts vehicle angle of attack and angle of sideslip to accuracy levels that generally satisfy flight control requirements. Dynamic pressure is predicted to within the resolution of the onboard inertial measurement unit. We find that wind-tunnel experiments and flight data are not necessary to accurately calibrate the HYFLEX flush air data system for hypersonic flight.
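The inversion step described above, recovering air data parameters from a calibrated pressure model, can be sketched in miniature. This is a generic illustration, not the HYFLEX algorithm: the five-port pitch-plane layout, the modified-Newtonian-style model form, and all numerical values are invented for the example.

```python
import numpy as np

# Hypothetical port layout: incidence angle of each flush port (radians)
# in the pitch plane, relative to the nose axis. Values are illustrative.
port_angles = np.deg2rad([-40.0, -20.0, 0.0, 20.0, 40.0])

def model_pressures(alpha, q, p_inf):
    """Modified-Newtonian-style model: p = p_inf + q * cos^2(theta - alpha)."""
    return p_inf + q * np.cos(port_angles - alpha) ** 2

# Stand-in "CFD" calibration data: synthetic port pressures at a known state.
true_alpha, q, p_inf = np.deg2rad(5.0), 12e3, 1.2e3
measured = model_pressures(true_alpha, q, p_inf)

def estimate_alpha(p):
    """Invert the pressure model for angle of attack: grid-search alpha,
    fitting q and p_inf linearly (least squares) at each candidate."""
    best, best_res = 0.0, np.inf
    for a in np.deg2rad(np.linspace(-15.0, 15.0, 601)):
        basis = np.column_stack([np.cos(port_angles - a) ** 2,
                                 np.ones_like(port_angles)])
        coef = np.linalg.lstsq(basis, p, rcond=None)[0]
        r = np.sum((basis @ coef - p) ** 2)
        if r < best_res:
            best, best_res = a, r
    return best

print(np.rad2deg(estimate_alpha(measured)))  # recovers the true angle of attack
```

The same structure carries over when the model is tabulated from CFD solutions rather than given in closed form: the grid search is simply replaced by interpolation in the calibration table.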
Abstract:
A space-marching code for the simulation and optimization of inviscid supersonic flow in three dimensions is described. The flow in a scramjet module with a relatively complex three-dimensional geometry is examined and wall-pressure estimates are compared with experimental data. Given that viscous effects are not presently included, the comparison is reasonable. The thermodynamic compromise of adding heat in a diverging combustor is also examined. The code is then used to optimize the shape of a thrust surface for a simpler (box-section) scramjet module in the presence of uniform and nonuniform heat distributions. The optimum two-dimensional profiles for the thrust surface are obtained via a perturbation procedure that requires about 30-50 flow solutions. It is found that the final shapes are fairly insensitive to the details of the heat distribution.
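A perturbation procedure of the kind described, nudging each design variable, re-running the solver, and keeping improvements, can be sketched generically. The objective below is a stand-in quadratic, not a flow solution, and the step sizes and budget are illustrative; the evaluation counter mirrors the 30-50 flow solutions quoted above.

```python
def perturbation_search(objective, x0, step=0.25, shrink=0.5, budget=50):
    """Coordinate-perturbation search: try +/-step on each design variable,
    accept any improvement, and halve the step when a full sweep stalls.
    Returns the best point, its objective value, and evaluations used."""
    x = list(x0)
    best = objective(x)
    evals = 1
    while evals < budget and step > 1e-6:
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                if evals >= budget:
                    break
                trial = list(x)
                trial[i] += delta
                f = objective(trial)
                evals += 1
                if f < best:
                    x, best, improved = trial, f, True
                    break
        if not improved:
            step *= shrink
    return x, best, evals

# Stand-in objective: distance from a known optimum "profile" (1.0, -0.5).
x_opt, f_opt, n = perturbation_search(
    lambda v: (v[0] - 1.0) ** 2 + (v[1] + 0.5) ** 2, x0=[0.0, 0.0])
```

In the paper's setting each `objective` call is a full space-marching flow solution, which is why keeping the evaluation budget small matters.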
Abstract:
Two experimental studies were conducted to examine whether the stress-buffering effects of behavioral control on work task responses varied as a function of procedural information. Study 1 manipulated low and high levels of task demands, behavioral control, and procedural information for 128 introductory psychology students completing an in-basket activity. ANOVA procedures revealed a significant three-way interaction among these variables in the prediction of subjective task performance and task satisfaction. It was found that procedural information buffered the negative effects of task demands on ratings of performance and satisfaction only under conditions of low behavioral control. This pattern of results suggests that procedural information may have a compensatory effect when the work environment is characterized by a combination of high task demands and low behavioral control. Study 2 (N = 256) utilized simple and complex versions of the in-basket activity to examine the extent to which the interactive relationship among task demands, behavioral control, and procedural information varied as a function of task complexity. There was further support for the stress-buffering role of procedural information on work task responses under conditions of low behavioral control. This effect was, however, only present when the in-basket activity was characterized by high task complexity, suggesting that the interactive relationship among these variables may depend on the type of tasks performed at work. Copyright (C) 1999 John Wiley & Sons, Ltd.
Abstract:
RWMODEL II simulates the Rescorla-Wagner model of Pavlovian conditioning. It is written in Delphi and runs under Windows 3.1 and Windows 95. The program was designed for novice and expert users and can be employed in teaching, as well as in research. It is user friendly and requires a minimal level of computer literacy but is sufficiently flexible to permit a wide range of simulations. It allows the display of empirical data, against which predictions from the model can be validated.
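The model that RWMODEL II simulates reduces to a one-line update rule. A minimal sketch for a single conditioned stimulus, using the standard parameter names alpha, beta and lambda (the values here are illustrative, not defaults of the program):

```python
def rescorla_wagner(n_trials, alpha=0.3, beta=1.0, lam=1.0):
    """Associative strength V across reinforced trials.
    On each trial the Rescorla-Wagner rule applies:
        delta-V = alpha * beta * (lambda - V)
    so learning is largest when the outcome is most surprising."""
    v, history = 0.0, []
    for _ in range(n_trials):
        v += alpha * beta * (lam - v)
        history.append(v)
    return history

curve = rescorla_wagner(10)  # negatively accelerated curve approaching lambda
```

With several stimuli present, the same rule uses the summed associative strength of all stimuli in place of V, which is what produces the model's signature predictions such as blocking.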
The Las Campanas/AAT rich cluster survey - I. Precision and reliability of the photometric catalogue
Abstract:
The Las Campanas Observatory and Anglo-Australian Telescope Rich Cluster Survey (LARCS) is a panoramic imaging and spectroscopic survey of an X-ray luminosity-selected sample of 21 clusters of galaxies at 0.07 < z < 0.16. Charge-coupled device (CCD) imaging was obtained in B and R of typically 2 degrees wide regions centred on the 21 clusters, and the galaxy sample selected from the imaging is being used for an on-going spectroscopic survey of the clusters with the 2dF spectrograph on the Anglo-Australian Telescope. This paper presents the reduction of the imaging data and the photometric analysis used in the survey. Based on an overlapping area of 12.3 deg(2), we compare the CCD-based LARCS catalogue with the photographic-based APM galaxy catalogue used as the input to the 2dF Galaxy Redshift Survey (2dFGRS), down to the completeness limit of the 2dFGRS/APM catalogue, b(J) = 19.45. This comparison confirms the reliability of the photometry across our mosaics and between the clusters in our survey. This comparison also provides useful information concerning the properties of the 2dFGRS/APM. The stellar contamination in the 2dFGRS/APM galaxy catalogue is confirmed as around 5-10 per cent, as originally estimated. However, using the superior sensitivity and spatial resolution of the LARCS survey, evidence is found for four distinct populations of galaxies that are systematically omitted from the 2dFGRS/APM catalogue. The characteristics of the 'missing' galaxy populations are described, reasons for their absence are examined, and the impact they will have on conclusions drawn from the 2dF Galaxy Redshift Survey is discussed.
Abstract:
Models of population dynamics are commonly used to predict risks in ecology, particularly risks of population decline. There is often considerable uncertainty associated with these predictions. However, alternatives to predictions based on population models have not been assessed. We used simulation models of hypothetical species to generate the kinds of data that might typically be available to ecologists and then invited other researchers to predict risks of population declines using these data. The accuracy of the predictions was assessed by comparison with the forecasts of the original model. The researchers used either population models or subjective judgement to make their predictions. Predictions made using models were only slightly more accurate than subjective judgements of risk. However, predictions using models tended to be unbiased, while subjective judgements were biased towards over-estimation. Psychology literature suggests that the bias of subjective judgements is likely to vary somewhat unpredictably among people, depending on their stake in the outcome. This will make subjective predictions more uncertain and less transparent than those based on models. (C) 2004 Elsevier SAS. All rights reserved.
Abstract:
Numerical methods are used to simulate double-diffusion driven convective pore-fluid flow and rock alteration in three-dimensional fluid-saturated geological fault zones. The double diffusion is caused by the combination of a positive upward temperature gradient and a positive downward salinity concentration gradient within a fault zone that is assumed to be more permeable than its surrounding rocks. To ensure that the numerical solutions are physically meaningful, the numerical method used in this study is validated against a benchmark problem for which the analytical solution for the critical Rayleigh number of the system is available. The theoretical value of the critical Rayleigh number of a three-dimensional fluid-saturated geological fault zone system can be used to judge whether or not double-diffusion driven convective pore-fluid flow can take place within the system. After the possibility of triggering the double-diffusion driven convective pore-fluid flow is theoretically validated for the numerical model, the corresponding numerical solutions for the convective flow and temperature are directly coupled with a geochemical system. Through the numerical simulation of the coupled system of convective fluid flow, heat transfer, mass transport and chemical reactions, we have investigated the effect of the double-diffusion driven convective pore-fluid flow on rock alteration, which is the direct consequence of mineral redistribution through dissolution, transport and precipitation, within the three-dimensional fluid-saturated geological fault zone system. (c) 2005 Elsevier B.V. All rights reserved.
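The onset criterion mentioned above, comparing a system's Rayleigh number with its critical value, can be illustrated with the standard porous-medium thermal definition. The property values below are generic illustrations rather than the paper's parameters, and the critical value 4*pi^2 applies to the idealized horizontal porous layer heated from below, not to a real fault-zone geometry.

```python
import math

def porous_rayleigh(k, H, dT, beta=2.07e-4, rho=1000.0, g=9.81,
                    mu=1.0e-3, kappa=1.0e-6):
    """Porous-medium thermal Rayleigh number:
        Ra = rho * g * beta * dT * k * H / (mu * kappa)
    k: permeability (m^2), H: layer thickness (m), dT: temperature
    difference (K); the remaining fluid/thermal properties are generic."""
    return rho * g * beta * dT * k * H / (mu * kappa)

# A permeable fault zone (illustrative numbers):
Ra = porous_rayleigh(k=1e-13, H=5000.0, dT=150.0)
Ra_crit = 4 * math.pi ** 2  # critical value for the idealized benchmark layer
print(Ra, Ra > Ra_crit)     # convection is possible only if Ra exceeds Ra_crit
```

This is the same logic the paper applies, except that its critical Rayleigh number comes from the analytical solution for the three-dimensional fault-zone benchmark rather than the textbook layer.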
Abstract:
A comprehensive probabilistic model for simulating dendrite morphology and investigating dendritic growth kinetics during solidification has been developed, based on a modified cellular automaton (mCA) for microscopic modeling of nucleation, growth of crystals and solute diffusion. The mCA model numerically calculates solute redistribution in both the solid and liquid phases, the curvature of dendrite tips and the growth anisotropy. The model takes account of thermal, curvature and solute diffusion effects, and can therefore simulate microstructure formation on the scale of the dendrite tip length. The model was then applied to simulate dendritic solidification of an Al-7%Si alloy. Both directional and equiaxed dendritic growth were simulated to investigate the effects of growth anisotropy and cooling rate on dendrite morphology. Furthermore, the competitive growth and selection of dendritic crystals have also been investigated.
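The cellular-automaton core of such models can be reduced to a toy sketch in which liquid cells adjacent to solid are captured with some probability per step. This omits everything that makes the mCA model quantitative (solute redistribution, tip curvature, anisotropy functions, coupling to the thermal field); it only shows the automaton mechanics, and all parameters are invented.

```python
import random

def grow(steps, size=41, p_capture=0.4, seed=1):
    """Toy stochastic cellular automaton for equiaxed growth from one
    nucleus: each step, a liquid cell next to a solid cell solidifies
    with probability p_capture. Returns the set of solid cells."""
    random.seed(seed)
    solid = {(size // 2, size // 2)}  # single nucleus at the grid centre
    for _ in range(steps):
        frontier = set()
        for (i, j) in solid:
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < size and 0 <= nj < size and (ni, nj) not in solid:
                    if random.random() < p_capture:
                        frontier.add((ni, nj))
        solid |= frontier  # capture the whole frontier at once
    return solid

print(len(grow(10)))
```

With `p_capture=1.0` the automaton grows a deterministic diamond (the four-neighbour "crystallographic" artifact that mCA models correct for); with `p_capture < 1` the envelope becomes stochastic and roughened.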
Abstract:
Objective To assess the validity and the reliability of the Portuguese version of the Delirium Rating Scale-Revised-98 (DRS-R-98). Methods The scale was translated into Portuguese and back-translated into English. After assessing its face validity, five diagnostic groups (n = 64; delirium, depression, dementia, schizophrenia and others) were evaluated by two independent researchers blinded to the diagnosis. Diagnosis and severity of delirium as measured by the DRS-R-98 were compared to clinical diagnosis, Mini-Mental State Exam, Confusion Assessment Method, and Clinical Global Impressions scale (CGI). Results Mean and median DRS-R-98 total scores significantly distinguished delirium from the other groups (p < 0.001). Inter-rater reliability (ICC between 0.9 and 1) and internal consistency (alpha = 0.91) were very high. DRS-R-98 severity scores correlated highly with the CGI. Mean DRS-R-98 severity scores during delirium differed significantly (p < 0.01) from the post-treatment values. The area under the curve established by ROC analysis was 0.99 and, using the cut-off value of 20, the scale showed sensitivity and specificity of 92.6% and 94.6%, respectively. Conclusion The Portuguese version of the DRS-R-98 is a valid and reliable measure of delirium that distinguishes delirium from other disorders and is sensitive to change in delirium severity, which may be of great value for longitudinal studies. Copyright (c) 2007 John Wiley & Sons, Ltd.
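The reported operating point (cut-off of 20 on the total score) follows the usual confusion-matrix definitions, which a short sketch makes explicit. The scores and diagnoses below are invented illustration data, not the study's sample.

```python
def sens_spec(scores, labels, cutoff):
    """Sensitivity and specificity of flagging delirium when the total
    score is >= cutoff. labels: 1 = clinical diagnosis of delirium,
    0 = any other diagnostic group."""
    tp = sum(1 for s, y in zip(scores, labels) if y == 1 and s >= cutoff)
    fn = sum(1 for s, y in zip(scores, labels) if y == 1 and s < cutoff)
    tn = sum(1 for s, y in zip(scores, labels) if y == 0 and s < cutoff)
    fp = sum(1 for s, y in zip(scores, labels) if y == 0 and s >= cutoff)
    return tp / (tp + fn), tn / (tn + fp)  # sensitivity, specificity

# Invented DRS-R-98 total scores and diagnoses for illustration:
sens, spec = sens_spec([25, 22, 18, 10, 21, 5], [1, 1, 1, 0, 0, 0], cutoff=20)
```

Sweeping `cutoff` over the score range and plotting sensitivity against (1 - specificity) yields the ROC curve whose area the study reports as 0.99.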
Abstract:
The technical reliability (i.e., interinstrument and interoperator reliability) of three SEAC swept-frequency bioimpedance monitors was assessed for both errors of measurement and associated analyses. In addition, intraoperator and intrainstrument variability was evaluated for repeat measures over a 4-hour period. The measured impedance values from a range of resistance-capacitance circuits were accurate to within 3% of theoretical values over a range of 50-800 ohms. Similarly, phase was measured over the range 1 degrees-19 degrees with a maximum deviation of 1.3 degrees from the theoretical value. The extrapolated impedance at zero frequency was equally well determined (+/-3%). However, the accuracy of the extrapolated value at infinite frequency was decreased, particularly at impedances below 50 ohms (approaching the lower limit of the measurement range of the instrument). The interinstrument and interoperator variations for whole-body measurements, recorded on human volunteers, showed biases of less than +/-1% for measured impedance values and less than 3% for phase. The variation in the extrapolated values of impedance at zero and infinite frequencies included variations due to operator choice of the analysis parameters but was still less than +/-0.5%. (C) 1997 Wiley-Liss, Inc.
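The zero- and infinite-frequency values being extrapolated are the limiting impedances of the standard resistance-capacitance test circuit: an "extracellular" resistance in parallel with a series "intracellular" resistance and capacitor. A sketch with illustrative component values (not the circuits used in the study):

```python
import cmath, math

def z_circuit(f, r_e=680.0, r_i=270.0, c=3.3e-9):
    """Complex impedance of the standard BIA test circuit at frequency f:
    R_e in parallel with the series branch (R_i + capacitor C).
    Component values are illustrative."""
    z_branch = r_i + 1 / (2j * math.pi * f * c)
    return (r_e * z_branch) / (r_e + z_branch)

# Limits the monitors extrapolate towards:
z0 = 680.0                              # f -> 0: capacitor blocks, Z -> R_e
zinf = 680.0 * 270.0 / (680.0 + 270.0)  # f -> inf: Z -> R_e parallel R_i
print(abs(z_circuit(50.0)), abs(z_circuit(5e7)))
```

Because real instruments sweep only a finite band, Z(0) and Z(inf) are obtained by fitting and extrapolating measurements like these, which is why the infinite-frequency value is the less accurately determined of the two.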
Abstract:
The St. Lawrence Island polynya (SLIP) is a commonly occurring winter phenomenon in the Bering Sea, in which dense saline water produced during new ice formation is thought to flow northward through the Bering Strait to help maintain the Arctic Ocean halocline. Winter darkness and inclement weather conditions have made continuous in situ and remote observation of this polynya difficult. However, imagery acquired from the European Space Agency ERS-1 Synthetic Aperture Radar (SAR) has allowed observation of the St. Lawrence Island polynya using both the imagery and derived ice displacement products. With the development of ARCSyM, a high-resolution regional model of the Arctic atmosphere/sea ice system, simulation of the SLIP in a climate model is now possible. Intercomparisons between remotely sensed products and simulations can lead to additional insight into the SLIP formation process. Low-resolution SAR, SSM/I and AVHRR infrared imagery for the St. Lawrence Island region are compared with the results of a model simulation for the period of 24-27 February 1992. The imagery illustrates a polynya event (polynya opening). With the northerly winds strong and consistent over several days, the coupled model captures the SLIP event with moderate accuracy. However, the introduction of a stability-dependent atmosphere-ice drag coefficient, which allows feedbacks between atmospheric stability, open water, and air-ice drag, produces a more accurate simulation of the SLIP in comparison to satellite imagery. Model experiments show that the polynya event is forced primarily by changes in atmospheric circulation followed by persistent favorable conditions. Ocean surface currents are found to have a small but positive impact on the simulation, which is enhanced when wind forcing is weak or variable.
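A stability-dependent drag coefficient of the kind credited above can be illustrated with a Louis-type bulk correction: the neutral coefficient is enhanced when the surface layer is unstable (as over newly opened water in a polynya) and damped when it is stable (as over consolidated ice). The functional form and constants below are a generic sketch, not ARCSyM's parameterization.

```python
def air_ice_drag(ri_b, cdn=1.3e-3):
    """Bulk drag coefficient as a function of the bulk Richardson number
    ri_b. ri_b < 0 (unstable surface layer): exchange is enhanced;
    ri_b > 0 (stable): exchange is suppressed. Neutral value cdn and the
    stability functions are illustrative."""
    if ri_b < 0:
        return cdn * (1.0 - 16.0 * ri_b) ** 0.5
    return cdn / (1.0 + 10.0 * ri_b) ** 2

# Open polynya water (unstable) versus consolidated ice cover (stable):
cd_open, cd_ice = air_ice_drag(-0.5), air_ice_drag(0.2)
```

The feedback described in the abstract follows directly: opening water destabilizes the surface layer, which raises the drag, which strengthens the wind-driven ice divergence that opened the water.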