93 results for applied game
in CentAUR: Central Archive, University of Reading - UK
Abstract:
The aim of this paper is essentially twofold: first, to describe the use of spherical nonparametric estimators for determining statistical diagnostic fields from ensembles of feature tracks on a global domain, and second, to report the application of these techniques to data derived from a modern general circulation model. New spherical kernel functions are introduced that are more efficiently computed than the traditional exponential kernels. The data-driven techniques of cross-validation, to determine the amount of smoothing objectively, and adaptive smoothing, to vary the smoothing locally, are also considered. Also introduced are techniques for combining seasonal statistical distributions to produce longer-term statistical distributions. Although all calculations are performed globally, only the results for Northern Hemisphere winter (December, January, February) and Southern Hemisphere winter (June, July, August) cyclonic activity are presented, discussed, and compared with previous studies. Overall, the results for the two hemispheric winters are in good agreement with previous model-based and observational studies.
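The cross-validation idea mentioned above can be illustrated in miniature. The paper works with spherical kernels on a global domain; the sketch below is only a simplified one-dimensional planar analogue, using a Gaussian kernel and leave-one-out likelihood cross-validation to choose the amount of smoothing objectively. All names and the candidate bandwidth grid are illustrative, not taken from the paper.

```python
import numpy as np

def loo_log_likelihood(data, h):
    """Leave-one-out log-likelihood of a Gaussian kernel density
    estimate with bandwidth h (the cross-validation score)."""
    n = len(data)
    diffs = data[:, None] - data[None, :]            # pairwise differences
    k = np.exp(-0.5 * (diffs / h) ** 2) / (h * np.sqrt(2 * np.pi))
    np.fill_diagonal(k, 0.0)                         # exclude each point's own kernel
    loo_density = k.sum(axis=1) / (n - 1)
    return np.log(loo_density).sum()

def select_bandwidth(data, candidates):
    """Pick the bandwidth that maximises the leave-one-out likelihood."""
    scores = [loo_log_likelihood(data, h) for h in candidates]
    return candidates[int(np.argmax(scores))]

rng = np.random.default_rng(0)
sample = rng.normal(0.0, 1.0, size=200)
h_best = select_bandwidth(sample, np.linspace(0.1, 2.0, 20))
```

Adaptive smoothing, as described in the abstract, would then replace the single global `h` with a bandwidth that varies locally with the data density.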
Abstract:
The Turing Test, originally configured for a human to distinguish between an unseen man and an unseen woman through a text-based conversational measure of gender, is the ultimate test for thinking; so Alan Turing conceived it when he replaced the woman with a machine. He asserted that once a machine deceived a human judge into believing it was the human, that machine should be attributed with intelligence. But is the Turing Test nothing more than a mindless game? We present results from recent Loebner Prizes, a platform for the Turing Test, and find that the machines in the contest appeared conversationally worse rather than better from 2004 to 2006, showing a downward trend in the highest scores awarded to them by human judges. Thus the machines are not thinking in the way an intelligent human would.
Abstract:
Soil contamination by arsenic (As) presents a hazard in many countries and there is a need for techniques to minimize As uptake by plants. A proposed in situ remediation method was tested by growing lettuce (Lactuca sativa L. cv. Kermit) in a greenhouse pot experiment on soil that contained 577 mg As kg(-1), taken from a former As smelter site. All combinations of iron (Fe) oxides, at concentrations of 0.00, 0.22, 0.54, and 1.09% (w/w), and lime, at concentrations of 0.00, 0.27, 0.68, and 1.36% (w/w), were tested in a factorial design. To create the treatments, field-moist soil, commercial-grade FeSO4, and ground agricultural lime were mixed and stored for one week, allowing Fe oxides to precipitate. Iron oxides gave highly significant (P < 0.001) reductions in lettuce As concentrations, down to 11% of the lettuce As concentration for untreated soil. For the Fe oxides and lime treatment combinations where soil pH was maintained nearly constant, the lettuce As concentration declined in an exponential relationship with increasing FeSO4 application rate and lettuce yield was almost unchanged. Iron oxides applied at a concentration of 1.09% did not give significantly lower lettuce As concentrations than the 0.54% treatment. Simultaneous addition of lime with FeSO4 was essential. Ferrous sulfate with insufficient lime lowered soil pH and caused mobilization of Al, Ba, Co, Cr, Cu, Fe, K, Mg, Mn, Na, Ni, Pb, Sr, and Zn. At the highest Fe oxide to lime ratios, Mn toxicity caused severe yield loss.
Abstract:
A pot experiment was conducted to test the hypothesis that decomposition of organic matter in sewage sludge, and the consequent formation of dissolved organic compounds (DOC), would lead to an increase in the bioavailability of heavy metals. Two Brown Earth soils, one with a clayey loam texture (CL) and the other a loamy sand (LS), were mixed with sewage sludge at rates equivalent to 0, 10 and 50 t dry sludge ha(-1) and the pots were sown with ryegrass (Lolium perenne L.). The organic matter content and heavy metal availability, assessed by soil extraction with 0.05 M CaCl2, were monitored over a residual period of two years after addition of the sludge, and plant uptake over one year. It was found that the concentrations of Cd and Ni in both the ryegrass and the soil extracts increased slightly but significantly during the first year. In most cases this increase was most evident at the higher sludge application rate (50 t ha(-1)). In the second year, however, metal availability reached a plateau. Zinc concentrations in the ryegrass did not increase, but those in the CaCl2 extracts did during the first year. In contrast, organic matter content decreased rapidly in the first months of the first year and much more slowly in the second (total decrease of 16%). The concentrations of DOC increased significantly in the more organic-rich CL soil over the two years. The pattern followed by the decomposition of organic matter with time and the production of DOC may provide at least a partial explanation for the trend towards increased metal availability.
Abstract:
Stream-water flows and in-stream nitrate and ammonium concentrations in a small (36.7 ha) Atlantic Forest catchment were simulated using the Integrated Nitrogen in CAtchments (INCA) model version 1.9.4. The catchment, at Cunha, is in the Serra do Mar State Park, SE Brazil, and is nearly pristine because the nearest major conurbations, Sao Paulo and Rio, are some 450 km distant. However, intensive farming may increase nitrogen (N) deposition and there are growing pressures for urbanisation. The mean-monthly discharges and NO3-N concentration dynamics were simulated adequately for the calibration and validation periods, with simulated loss rates of 6.55 kg.ha(-1) yr(-1) for NO3-N and 3.85 kg.ha(-1) yr(-1) for NH4-N. To investigate the effects of elevated levels of N deposition in the future, various scenarios for atmospheric deposition were simulated; the highest value corresponded to that in a highly polluted area of Atlantic Forest in Sao Paulo City. It was found that doubling the atmospheric deposition generated a 25% increase in the N leaching rate, while at levels approaching the highly polluted Sao Paulo deposition rate, five times higher than the current rate, leaching increased by 240%, which would create highly eutrophic conditions, detrimental to downstream water quality. The results indicate that the INCA model can be useful for estimating N concentrations and fluxes under different atmospheric deposition rates and hydrological conditions.
Abstract:
A computer game was used to study psychophysiological reactions to emotion-relevant events. Two dimensions proposed by Scherer (1984a, 1984b) in his appraisal theory, the intrinsic pleasantness and goal conduciveness of game events, were studied in a factorial design. The relative level at which a player performed at the moment of an event was also taken into account. A total of 33 participants played the game while cardiac activity, skin conductance, skin temperature, and muscle activity as well as emotion self-reports were assessed. The self-reports indicate that game events altered levels of pride, joy, anger, and surprise. Goal conduciveness had little effect on muscle activity but was associated with significant autonomic effects, including changes to interbeat interval, pulse transit time, skin conductance, and finger temperature. The manipulation of intrinsic pleasantness had little impact on physiological responses. The results show the utility of attempting to manipulate emotion-constituent appraisals and measure their peripheral physiological signatures.
Abstract:
The chess endgame is increasingly being seen through the lens of, and therefore effectively defined by, a data ‘model’ of itself. It is vital that such models are clearly faithful to the reality they purport to represent. This paper examines that issue and systems engineering responses to it, using the chess endgame as the exemplar scenario. A structured survey has been carried out of the intrinsic challenges and complexity of creating endgame data by reviewing the past pattern of errors during work in progress, surfacing in publications and occurring after the data was generated. Specific measures are proposed to counter observed classes of error-risk, including a preliminary survey of techniques for using state-of-the-art verification tools to generate EGTs that are correct by construction. The approach may be applied generically beyond the game domain.
Abstract:
The 1999 Kasparov-World game for the first time enabled anyone to join a team playing against a World Chess Champion via the web. It included a surprise in the opening, complex middle-game strategy and a deep ending. As the game headed for its mysterious finale, the World Team requested a KQQKQQ endgame table and was provided with two by the authors. This paper describes their work, compares the methods used, examines the issues raised and summarises the concepts involved for the benefit of future workers in the endgame field. It also notes the contribution of this endgame to chess itself.
Abstract:
Sensitivity, specificity, and reproducibility are vital to interpret neuroscientific results from functional magnetic resonance imaging (fMRI) experiments. Here we examine the scan–rescan reliability of the percent signal change (PSC) and parameters estimated using Dynamic Causal Modeling (DCM) in scans taken in the same scan session, less than 5 min apart. We find fair to good reliability of PSC in regions that are involved with the task, and fair to excellent reliability with DCM. Also, the DCM analysis uncovers group differences that were not present in the analysis of PSC, which implies that DCM may be more sensitive to the nuances of signal changes in fMRI data.
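The two quantities compared above can be sketched concretely. Percent signal change is conventionally the task-minus-baseline mean as a percentage of the baseline mean, and scan-rescan reliability is typically summarised with an intraclass correlation coefficient. The abstract does not specify which ICC variant was used; the sketch below implements the consistency form ICC(3,1) as one plausible choice, with illustrative inputs.

```python
import numpy as np

def percent_signal_change(timeseries, task_idx, baseline_idx):
    """PSC of task volumes relative to the baseline mean."""
    base = timeseries[baseline_idx].mean()
    return 100.0 * (timeseries[task_idx].mean() - base) / base

def icc_3_1(scan1, scan2):
    """Two-way mixed, consistency ICC(3,1) for scan-rescan agreement,
    computed from the standard ANOVA mean squares."""
    x = np.column_stack([scan1, scan2])          # subjects x 2 sessions
    n, k = x.shape
    subj_means = x.mean(axis=1)
    sess_means = x.mean(axis=0)
    grand = x.mean()
    ms_subj = k * ((subj_means - grand) ** 2).sum() / (n - 1)
    ms_err = (((x - subj_means[:, None] - sess_means[None, :] + grand) ** 2).sum()
              / ((n - 1) * (k - 1)))
    return (ms_subj - ms_err) / (ms_subj + (k - 1) * ms_err)

# illustrative: baseline volumes average 10, task volumes average 12
ts = np.array([10.0, 10.0, 12.0, 12.0])
psc = percent_signal_change(ts, [2, 3], [0, 1])   # 100 * (12 - 10) / 10 = 20.0
```

The "fair to good" and "fair to excellent" labels in the abstract correspond to conventional ICC bands (roughly 0.4-0.59, 0.6-0.74 and 0.75+).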
Abstract:
This article explores how data envelopment analysis (DEA), along with a smoothed bootstrap method, can be used in applied analysis to obtain more reliable efficiency rankings for farms. The main focus is the smoothed homogeneous bootstrap procedure introduced by Simar and Wilson (1998) to implement statistical inference for the original efficiency point estimates. Two main model specifications, constant and variable returns to scale, are investigated along with various choices regarding data aggregation. The coefficient of separation (CoS), a statistic that indicates the degree of statistical differentiation within the sample, is used to demonstrate the findings. The CoS suggests a substantive dependency of the results on the methodology and assumptions employed. Accordingly, some observations are made on how to conduct DEA in order to get more reliable efficiency rankings, depending on the purpose for which they are to be used. In addition, attention is drawn to the ability of the SLICE MODEL, implemented in GAMS, to enable researchers to overcome the computational burdens of conducting DEA (with bootstrapping).
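The full Simar and Wilson (1998) procedure resamples DEA efficiency scores in a way that respects their boundary at 1 and is considerably more involved than can be shown here; the sketch below illustrates only the generic smoothed-bootstrap idea it builds on — resample with replacement, perturb each draw with kernel noise of bandwidth `h`, and read off percentile confidence limits. The function name, bandwidth, and score values are all illustrative.

```python
import numpy as np

def smoothed_bootstrap_ci(scores, h=0.02, n_boot=2000, alpha=0.05, seed=0):
    """Percentile confidence interval for the mean score from a
    smoothed bootstrap: resample, then add Gaussian kernel noise."""
    rng = np.random.default_rng(seed)
    n = len(scores)
    boot_means = np.empty(n_boot)
    for b in range(n_boot):
        resample = rng.choice(scores, size=n, replace=True)
        boot_means[b] = (resample + rng.normal(0.0, h, size=n)).mean()
    return tuple(np.quantile(boot_means, [alpha / 2, 1 - alpha / 2]))

# illustrative DEA-style efficiency scores in (0, 1]
scores = np.array([0.62, 0.71, 0.55, 0.93, 1.00, 0.48, 0.77, 0.81, 0.66, 0.59])
lo, hi = smoothed_bootstrap_ci(scores)
```

In the article's setting, the resampled quantity is each farm's DEA efficiency score rather than a mean, and the resulting intervals feed the coefficient-of-separation comparison.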
Abstract:
This article illustrates the usefulness of applying bootstrap procedures to total factor productivity Malmquist indices, derived with data envelopment analysis (DEA), for a sample of 250 Polish farms during 1996-2000. The confidence intervals constructed as in Simar and Wilson suggest that the common portrayal of productivity decline in Polish agriculture may be misleading. However, a cluster analysis based on bootstrap confidence intervals reveals that important policy conclusions can be drawn regarding productivity enhancement.
Abstract:
The games-against-nature approach to the analysis of uncertainty in decision-making relies on the assumption that the behaviour of a decision-maker can be explained by concepts such as maximin, minimax regret, or a similarly defined criterion. In reality, however, these criteria represent a spectrum, and the actual behaviour of a decision-maker is most likely to embody a mixture of such idealisations. This paper proposes that, in a game-theoretic approach to decision-making under uncertainty, a more realistic representation of a decision-maker's behaviour can be achieved by synthesising games-against-nature with goal programming into a single framework. The proposed formulation is illustrated using a well-known example from the literature on mathematical programming models for agricultural decision-making.
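The criteria named above are easy to state on a payoff matrix whose rows are the decision-maker's actions and whose columns are states of nature. The sketch below, with a made-up payoff matrix, shows that maximin and minimax regret can recommend different actions — the "spectrum" the abstract refers to, which the proposed goal-programming synthesis is meant to bridge.

```python
import numpy as np

def maximin(payoffs):
    """Choose the action whose worst-case payoff is largest."""
    return int(np.argmax(payoffs.min(axis=1)))

def minimax_regret(payoffs):
    """Choose the action minimising the maximum regret, where regret
    is the shortfall from the best payoff in each state of nature."""
    regret = payoffs.max(axis=0) - payoffs      # column best minus each entry
    return int(np.argmin(regret.max(axis=1)))

# rows = actions, columns = states of nature (illustrative values)
payoffs = np.array([
    [3.0, 3.0, 3.0],   # safe action: same payoff in every state
    [8.0, 2.0, 1.0],   # risky action: great in state 0, poor otherwise
    [5.0, 4.0, 0.0],
])
```

Here maximin picks the safe first action (worst case 3), while minimax regret picks the risky second action (maximum regret 2 versus 5 for the safe one), illustrating how the two idealisations diverge.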