993 results for applied game


Relevance:

20.00%

Publisher:

Abstract:

Stream-water flows and in-stream nitrate and ammonium concentrations in a small (36.7 ha) Atlantic Forest catchment were simulated using the Integrated Nitrogen in CAtchments (INCA) model version 1.9.4. The catchment, at Cunha, is in the Serra do Mar State Park, SE Brazil and is nearly pristine because the nearest major conurbations, São Paulo and Rio, are some 450 km distant. However, intensive farming may increase nitrogen (N) deposition and there are growing pressures for urbanisation. The mean-monthly discharges and NO3-N concentration dynamics were simulated adequately for the calibration and validation periods with (simulated) loss rates of 6.55 kg ha⁻¹ yr⁻¹ for NO3-N and 3.85 kg ha⁻¹ yr⁻¹ for NH4-N. To investigate the effects of elevated levels of N deposition in the future, various scenarios for atmospheric deposition were simulated; the highest value corresponded to that in a highly polluted area of Atlantic Forest in São Paulo City. It was found that doubling the atmospheric deposition generated a 25% increase in the N leaching rate, while at levels approaching the highly polluted São Paulo deposition rate, five times higher than the current rate, leaching increased by 240%, which would create highly eutrophic conditions, detrimental to downstream water quality. The results indicate that the INCA model can be useful for estimating N concentrations and fluxes for different atmospheric deposition rates and hydrological conditions.
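
The scenario arithmetic reported above can be made concrete with a short sketch. This is illustration only: the assumption that the reported percentage increases apply to the combined NO3-N plus NH4-N loss is ours, not the paper's, and the numbers are taken directly from the abstract.

```python
# Illustrative arithmetic only: scales the simulated baseline N loss
# (6.55 kg/ha/yr NO3-N + 3.85 kg/ha/yr NH4-N) by the reported scenario
# increases (+25% at 2x deposition, +240% at ~5x deposition). Applying the
# percentages to the combined loss is our assumption, not the paper's.
BASELINE_NO3 = 6.55   # kg/ha/yr, simulated NO3-N loss
BASELINE_NH4 = 3.85   # kg/ha/yr, simulated NH4-N loss

def scenario_loss(percent_increase):
    """Total N loss (kg/ha/yr) after applying a reported percentage increase."""
    baseline = BASELINE_NO3 + BASELINE_NH4
    return baseline * (1 + percent_increase / 100)

print(scenario_loss(25))    # 2x deposition scenario
print(scenario_loss(240))   # ~5x (Sao Paulo City) scenario
```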

Relevance:

20.00%

Publisher:

Abstract:

Soil contamination by arsenic (As) presents a hazard in many countries and there is a need for techniques to minimize As uptake by plants. A proposed in situ remediation method was tested by growing lettuce (Lactuca sativa L. cv. Kermit) in a greenhouse pot experiment on soil that contained 577 mg As kg⁻¹, taken from a former As smelter site. All combinations of iron (Fe) oxides, at concentrations of 0.00, 0.22, 0.54, and 1.09% (w/w), and lime, at concentrations of 0.00, 0.27, 0.68, and 1.36% (w/w), were tested in a factorial design. To create the treatments, field-moist soil, commercial-grade FeSO4, and ground agricultural lime were mixed and stored for one week, allowing Fe oxides to precipitate. Iron oxides gave highly significant (P < 0.001) reductions in lettuce As concentrations, down to 11% of the lettuce As concentration for untreated soil. For the Fe oxides and lime treatment combinations where soil pH was maintained nearly constant, the lettuce As concentration declined in an exponential relationship with increasing FeSO4 application rate and lettuce yield was almost unchanged. Iron oxides applied at a concentration of 1.09% did not give significantly lower lettuce As concentrations than the 0.54% treatment. Simultaneous addition of lime with FeSO4 was essential. Ferrous sulfate with insufficient lime lowered soil pH and caused mobilization of Al, Ba, Co, Cr, Cu, Fe, K, Mg, Mn, Na, Ni, Pb, Sr, and Zn. At the highest Fe oxide to lime ratios, Mn toxicity caused severe yield loss.
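
The exponential decline reported above can be sketched as C(r) = C0·exp(−k·r). The abstract reports lettuce As falling to 11% of the untreated value; the assumption that this minimum occurs at the highest rate tested (1.09% w/w) is ours, made only to give the decay constant an illustrative value.

```python
import math

# Sketch of the exponential decline: C(r) = C0 * exp(-k * r), where r is
# the Fe oxide application rate (% w/w). Fitting k so that the reported 11%
# residual occurs at the highest rate tested (1.09% w/w) is our assumption.
R_MAX = 1.09                      # % w/w Fe oxides, highest rate tested
K = -math.log(0.11) / R_MAX       # decay constant under that assumption

def relative_lettuce_as(rate):
    """Lettuce As concentration as a fraction of the untreated control."""
    return math.exp(-K * rate)
```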

Relevance:

20.00%

Publisher:

Abstract:

A computer game was used to study psychophysiological reactions to emotion-relevant events. Two dimensions proposed by Scherer (1984a, 1984b) in his appraisal theory, the intrinsic pleasantness and goal conduciveness of game events, were studied in a factorial design. The relative level at which a player performed at the moment of an event was also taken into account. A total of 33 participants played the game while cardiac activity, skin conductance, skin temperature, and muscle activity as well as emotion self-reports were assessed. The self-reports indicate that game events altered levels of pride, joy, anger, and surprise. Goal conduciveness had little effect on muscle activity but was associated with significant autonomic effects, including changes to interbeat interval, pulse transit time, skin conductance, and finger temperature. The manipulation of intrinsic pleasantness had little impact on physiological responses. The results show the utility of attempting to manipulate emotion-constituent appraisals and measure their peripheral physiological signatures.

Relevance:

20.00%

Publisher:

Abstract:

The chess endgame is increasingly being seen through the lens of, and therefore effectively defined by, a data ‘model’ of itself. It is vital that such models are clearly faithful to the reality they purport to represent. This paper examines that issue and systems engineering responses to it, using the chess endgame as the exemplar scenario. A structured survey has been carried out of the intrinsic challenges and complexity of creating endgame data by reviewing the past pattern of errors during work in progress, surfacing in publications and occurring after the data was generated. Specific measures are proposed to counter observed classes of error-risk, including a preliminary survey of techniques for using state-of-the-art verification tools to generate endgame tables (EGTs) that are correct by construction. The approach may be applied generically beyond the game domain.
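
The verification idea can be illustrated on a toy analogue rather than chess itself: a subtraction game whose win/loss table is built by retrograde-style iteration and then independently checked, entry by entry, against its successors. This is a sketch of the consistency-checking principle, not of any chess EGT format.

```python
# Toy analogue of endgame-table verification, not chess: positions are pile
# sizes in a subtraction game (remove 1 or 2; a player with no move loses).
# solve() builds the win/loss table; verify() independently checks every
# entry against its successors -- the kind of post-hoc consistency check
# discussed for EGTs.
MOVES = (1, 2)

def successors(n):
    return [n - m for m in MOVES if n - m >= 0]

def solve(n_max):
    table = {0: "loss"}                      # no legal move: loss
    for n in range(1, n_max + 1):
        table[n] = ("win" if any(table[s] == "loss" for s in successors(n))
                    else "loss")
    return table

def verify(table):
    """Check every entry is consistent with the values of its successors."""
    for n, value in table.items():
        succ = successors(n)
        if not succ:
            assert value == "loss"
        elif value == "win":
            assert any(table[s] == "loss" for s in succ)
        else:
            assert all(table[s] == "win" for s in succ)
    return True
```

In this game the losing positions are exactly the multiples of 3, which gives an independent ground truth against which both `solve` and `verify` can be cross-checked.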

Relevance:

20.00%

Publisher:

Abstract:

The 1999 Kasparov-World game for the first time enabled anyone to join a team playing against a World Chess Champion via the web. It included a surprise in the opening, complex middle-game strategy and a deep ending. As the game headed for its mysterious finale, the World Team requested a KQQKQQ endgame table and was provided with two by the authors. This paper describes their work, compares the methods used, examines the issues raised and summarises the concepts involved for the benefit of future workers in the endgame field. It also notes the contribution of this endgame to chess itself.

Relevance:

20.00%

Publisher:

Abstract:

Sensitivity, specificity, and reproducibility are vital to interpret neuroscientific results from functional magnetic resonance imaging (fMRI) experiments. Here we examine the scan–rescan reliability of the percent signal change (PSC) and parameters estimated using Dynamic Causal Modeling (DCM) in scans taken in the same scan session, less than 5 min apart. We find fair to good reliability of PSC in regions that are involved with the task, and fair to excellent reliability with DCM. Also, the DCM analysis uncovers group differences that were not present in the analysis of PSC, which implies that DCM may be more sensitive to the nuances of signal changes in fMRI data.
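
The abstract does not name its reliability statistic; the intraclass correlation ICC(3,1) sketched below is a standard choice for scan-rescan designs, and Cicchetti's conventional bands supply "fair"/"good"/"excellent" labels of the kind used above. Treat this as an illustrative sketch, not the paper's exact method.

```python
# ICC(3,1): two-way mixed, consistency ICC for two sessions per subject.
# Assumed here as a plausible scan-rescan reliability measure; the paper's
# own statistic may differ.
def icc_3_1(scan1, scan2):
    n, k = len(scan1), 2
    data = list(zip(scan1, scan2))
    grand = sum(scan1 + scan2) / (n * k)
    subj_means = [sum(row) / k for row in data]
    sess_means = [sum(scan1) / n, sum(scan2) / n]
    ss_subj = k * sum((m - grand) ** 2 for m in subj_means)
    ss_sess = n * sum((m - grand) ** 2 for m in sess_means)
    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ss_err = ss_total - ss_subj - ss_sess
    bms = ss_subj / (n - 1)                  # between-subjects mean square
    ems = ss_err / ((n - 1) * (k - 1))       # error mean square
    return (bms - ems) / (bms + (k - 1) * ems)

def label(icc):
    """Cicchetti's bands: <0.40 poor, 0.40-0.59 fair, 0.60-0.74 good."""
    return ("poor" if icc < 0.40 else "fair" if icc < 0.60
            else "good" if icc < 0.75 else "excellent")
```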

Relevance:

20.00%

Publisher:

Abstract:

This article explores how data envelopment analysis (DEA), along with a smoothed bootstrap method, can be used in applied analysis to obtain more reliable efficiency rankings for farms. The main focus is the smoothed homogeneous bootstrap procedure introduced by Simar and Wilson (1998) to implement statistical inference for the original efficiency point estimates. Two main model specifications, constant and variable returns to scale, are investigated along with various choices regarding data aggregation. The coefficient of separation (CoS), a statistic that indicates the degree of statistical differentiation within the sample, is used to demonstrate the findings. The CoS suggests a substantive dependency of the results on the methodology and assumptions employed. Accordingly, some observations are made on how to conduct DEA in order to get more reliable efficiency rankings, depending on the purpose for which they are to be used. In addition, attention is drawn to the ability of the SLICE MODEL, implemented in GAMS, to enable researchers to overcome the computational burdens of conducting DEA (with bootstrapping).
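
A minimal illustration of the DEA idea: in the single-input, single-output, constant-returns case, CCR efficiency reduces to each farm's output/input ratio divided by the best ratio in the sample. Real DEA (and the GAMS SLICE MODEL mentioned above) solves a linear programme per farm, and the naive percentile bootstrap below is not the smoothed Simar-Wilson procedure; both simplifications are ours, kept only to make the inference step concrete.

```python
import random

# Single-input, single-output, constant-returns sketch: CCR efficiency
# collapses to the output/input ratio over the sample's best ratio. The
# naive percentile bootstrap is a placeholder, NOT the smoothed
# Simar-Wilson procedure used in the article.
def dea_efficiency(inputs, outputs):
    ratios = [y / x for x, y in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

def bootstrap_ci(inputs, outputs, unit, reps=2000, seed=1):
    """Naive percentile CI for one unit's efficiency score."""
    rng = random.Random(seed)
    n = len(inputs)
    scores = []
    for _ in range(reps):
        idx = [rng.randrange(n) for _ in range(n)] + [unit]  # keep the unit in
        eff = dea_efficiency([inputs[i] for i in idx],
                             [outputs[i] for i in idx])
        scores.append(eff[-1])
    scores.sort()
    return scores[int(0.025 * len(scores))], scores[int(0.975 * len(scores))]
```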

Relevance:

20.00%

Publisher:

Abstract:

This article illustrates the usefulness of applying bootstrap procedures to total factor productivity Malmquist indices, derived with data envelopment analysis (DEA), for a sample of 250 Polish farms during 1996-2000. The confidence intervals constructed as in Simar and Wilson suggest that the common portrayal of productivity decline in Polish agriculture may be misleading. However, a cluster analysis based on bootstrap confidence intervals reveals that important policy conclusions can be drawn regarding productivity enhancement.
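
The Malmquist index itself can be sketched in the same simplified setting: with one input, one output and constant returns, the output distance function is just a farm's output/input ratio over the best ratio in the reference sample, and the index is the geometric mean of the two period-relative ratios. With multiple inputs and outputs these distances come from DEA linear programmes, as in the article.

```python
import math

# Single-input, single-output, constant-returns sketch of the Malmquist
# TFP index. dist(x, y, sample) is the output distance function in this
# simple case; the full DEA version replaces it with an LP per farm.
def dist(x, y, sample):
    best = max(yy / xx for xx, yy in sample)
    return (y / x) / best

def malmquist(farm_t, farm_t1, sample_t, sample_t1):
    """Geometric mean of the period-t and period-t+1 Malmquist ratios."""
    d_t_t = dist(*farm_t, sample_t)        # farm at t vs frontier at t
    d_t_t1 = dist(*farm_t1, sample_t)      # farm at t+1 vs frontier at t
    d_t1_t = dist(*farm_t, sample_t1)      # farm at t vs frontier at t+1
    d_t1_t1 = dist(*farm_t1, sample_t1)    # farm at t+1 vs frontier at t+1
    return math.sqrt((d_t_t1 / d_t_t) * (d_t1_t1 / d_t1_t))
```

A value above 1 indicates productivity growth between the two periods; below 1, decline.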

Relevance:

20.00%

Publisher:

Abstract:

The games-against-nature approach to the analysis of uncertainty in decision-making relies on the assumption that the behaviour of a decision-maker can be explained by concepts such as maximin, minimax regret, or a similarly defined criterion. In reality, however, these criteria represent a spectrum, and the actual behaviour of a decision-maker is most likely to embody a mixture of such idealisations. This paper proposes that in a game-theoretic approach to decision-making under uncertainty, a more realistic representation of a decision-maker's behaviour can be achieved by synthesising games-against-nature with goal programming into a single framework. The proposed formulation is illustrated using a well-known example from the literature on mathematical programming models for agricultural decision-making. © 2005 Elsevier Inc. All rights reserved.
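
The two named criteria can be sketched over a payoff matrix (rows = actions, columns = states of nature). The numbers below are made up; the point is that the criteria can select different actions, which is precisely the spectrum of behaviour the goal-programming synthesis is meant to capture.

```python
# Maximin and minimax regret over a payoff matrix (rows = actions,
# columns = states of nature). Payoff values are invented for illustration.
def maximin(payoffs):
    """Index of the action whose worst-case payoff is best."""
    return max(range(len(payoffs)), key=lambda a: min(payoffs[a]))

def minimax_regret(payoffs):
    """Index of the action minimising the maximum regret across states."""
    n_states = len(payoffs[0])
    col_best = [max(row[s] for row in payoffs) for s in range(n_states)]
    regret = [[col_best[s] - row[s] for s in range(n_states)]
              for row in payoffs]
    return min(range(len(payoffs)), key=lambda a: max(regret[a]))

payoffs = [[3, 3],   # action 0: safe
           [0, 6],   # action 1: risky
           [2, 5]]   # action 2: intermediate
```

Here maximin picks the safe action while minimax regret picks the intermediate one, so no single criterion describes both choices.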

Relevance:

20.00%

Publisher:

Abstract:

We consider the comparison of two formulations in terms of average bioequivalence using the 2 × 2 cross-over design. In a bioequivalence study, the primary outcome is a pharmacokinetic measure, such as the area under the plasma concentration by time curve, which is usually assumed to have a lognormal distribution. The criterion typically used for claiming bioequivalence is that the 90% confidence interval for the ratio of the means should lie within the interval (0.80, 1.25), or equivalently the 90% confidence interval for the differences in the means on the natural log scale should be within the interval (-0.2231, 0.2231). We compare the gold standard method for calculation of the sample size based on the non-central t distribution with those based on the central t and normal distributions. In practice, the differences between the various approaches are likely to be small. Further approximations to the power function are sometimes used to simplify the calculations. These approximations should be used with caution, because the sample size required for a desirable level of power might be under- or overestimated compared to the gold standard method. However, in some situations the approximate methods produce very similar sample sizes to the gold standard method. Copyright © 2005 John Wiley & Sons, Ltd.
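
One of the approximate methods compared above can be sketched directly: the normal-approximation sample size for a 2 × 2 cross-over, under the additional assumption that the true ratio is exactly 1. The gold-standard calculation based on the non-central t distribution will give a slightly larger n; here sigma_w is the within-subject SD on the natural log scale.

```python
import math
from statistics import NormalDist

# Normal-approximation sample size for average bioequivalence in a 2x2
# cross-over, assuming the true ratio of means is exactly 1. This is one
# of the approximate methods the paper compares, not the gold standard.
def total_sample_size(sigma_w, power=0.90, alpha=0.05):
    theta = math.log(1.25)                       # bioequivalence margin
    z_alpha = NormalDist().inv_cdf(1 - alpha)    # size of each one-sided test
    z_power = NormalDist().inv_cdf((1 + power) / 2)
    n = 2 * sigma_w ** 2 * (z_alpha + z_power) ** 2 / theta ** 2
    n = math.ceil(n)
    return n + n % 2                             # round up to two equal sequences

print(total_sample_size(0.3))   # within-subject SD 0.3, i.e. CV around 30%
```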

Relevance:

20.00%

Publisher:

Abstract:

The International Citicoline Trial in acUte Stroke is a sequential phase III study of the use of the drug citicoline in the treatment of acute ischaemic stroke, which was initiated in 2006 in 56 treatment centres. The primary objective of the trial is to demonstrate improved recovery of patients randomized to citicoline relative to those randomized to placebo after 12 weeks of follow-up. The primary analysis will take the form of a global test combining the dichotomized results of assessments on three well-established scales: the Barthel Index, the modified Rankin scale and the National Institutes of Health Stroke Scale. This approach was previously used in the analysis of the influential National Institute of Neurological Disorders and Stroke trial of recombinant tissue plasminogen activator in stroke. The purpose of this paper is to describe how this trial was designed, and in particular how its simultaneous objectives were addressed: taking into account three assessment scales, performing a series of interim analyses, and conducting treatment allocation and adjusting the analyses to account for prognostic factors, including more than 50 treatment centres. Copyright © 2008 John Wiley & Sons, Ltd.
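
The dichotomization step feeding the global test can be sketched as follows. The cutoffs below (Barthel Index ≥ 95, modified Rankin ≤ 1, NIHSS ≤ 1) are NINDS-style favourable-outcome definitions assumed here for illustration; the trial protocol's exact cutoffs may differ, and the global test itself (which combines the three binary outcomes across treatment arms) is not reproduced.

```python
# Per-scale dichotomization for the three-scale global outcome. The
# cutoffs are assumed (NINDS-style), not taken from the trial protocol.
CUTOFFS = {"barthel": 95, "rankin": 1, "nihss": 1}

def favourable(barthel, rankin, nihss):
    """Per-scale favourable-outcome indicators for one patient at 12 weeks."""
    return {"barthel": int(barthel >= CUTOFFS["barthel"]),
            "rankin": int(rankin <= CUTOFFS["rankin"]),
            "nihss": int(nihss <= CUTOFFS["nihss"])}
```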