808 results for nature of performance
Abstract:
The effects of background English and Welsh speech on memory for visually-presented English words were contrasted amongst monolingual English speakers and bilingual Welsh-English speakers. Equivalent disruption to the English language task was observed amongst Welsh-speaking bilinguals from both English and Welsh speech, but English-speaking monolinguals displayed less disruption from the Welsh speech. An effect of the meaning of the background speech was therefore apparent amongst bilinguals even when the focal memory task was presented in a different language from the distracting speech. A second experiment tested only English-speaking monolinguals, using English as background speech, but varied the demands of the focal task. Participants were asked either to count the number of vowels in words visually presented for future recall, or to rate them for pleasantness, before subsequently being asked to recall the words. Greater disruption to recall was observed from meaningful background speech when participants initially rated the words for pleasantness than when they initially counted the vowels within the words. These results show that background speech is automatically analyzed for meaning, but whether the meaning of the background speech causes distraction is critically dependent upon the nature of the focal task. The data underscore the need to consider not only the nature of office noise, but also the demands and content of the work task when assessing the effects of office noise on work performance.
Abstract:
Many different performance measures have been developed to evaluate field predictions in meteorology. However, a researcher or practitioner encountering a new or unfamiliar measure may have difficulty in interpreting its results, which may lead them to avoid new measures and to rely on those that are familiar. In the context of evaluating forecasts of extreme events for hydrological applications, this article aims to promote the use of a range of performance measures. Several types of performance measure are introduced in order to demonstrate a six-step approach to tackling a new measure. Using the example of European Centre for Medium-Range Weather Forecasts (ECMWF) ensemble precipitation predictions for the Danube floods of July and August 2002, it is shown how to apply new performance measures with this approach and how to choose between different performance measures based on their suitability for the task at hand. Copyright © 2008 Royal Meteorological Society
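As a hedged illustration of the kind of measure such a six-step approach might be applied to, the sketch below computes a Brier score for a threshold-exceedance probability derived from a small, made-up ensemble of precipitation totals. It is not taken from the article; the ensemble values, threshold and observed outcomes are all assumptions.

```python
# Minimal sketch (assumed, not from the paper): Brier score for an ensemble
# forecast of a threshold exceedance, e.g. precipitation above a flood-relevant amount.
import numpy as np

def brier_score(prob_forecast, observed_event):
    """Mean squared difference between forecast probability and the 0/1 outcome."""
    p = np.asarray(prob_forecast, float)
    o = np.asarray(observed_event, float)
    return np.mean((p - o) ** 2)

# Probability of exceeding the threshold = fraction of ensemble members above it
ensemble = np.array([[5.0, 12.0, 30.0, 8.0],    # member precip totals, day 1 (mm)
                     [40.0, 55.0, 35.0, 60.0],  # day 2
                     [2.0, 1.0, 4.0, 0.0]])     # day 3
threshold = 25.0                                 # hypothetical flood-relevant amount
p_exceed = (ensemble > threshold).mean(axis=1)
observed = np.array([0, 1, 0])                   # did the exceedance occur? (toy data)
print(f"Brier score = {brier_score(p_exceed, observed):.3f}")
```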
Abstract:
The Chartered Institution of Building Services Engineers (CIBSE) produced a technical memorandum (TM36) presenting research on the impact of future climate on building energy use and thermal comfort. One climate projection for each of four CO2 emissions scenarios was used in TM36, thus providing a deterministic outlook. As part of the UK Climate Impacts Programme (UKCIP), probabilistic climate projections are being studied in relation to building energy simulation techniques. Including uncertainty in climate projections is considered an important advance in climate impacts modelling and is included in the latest UKCIP data (UKCP09). Incorporating the stochastic nature of these new climate projections into building energy modelling requires a significant increase in data handling and careful statistical interpretation of the results to provide meaningful conclusions. This paper compares the results from building energy simulations when applying deterministic and probabilistic climate data. This is based on two case study buildings: (i) a mixed-mode office building with exposed thermal mass and (ii) a mechanically ventilated, light-weight office building. Building (i) represents an energy efficient building design that provides passive and active measures to maintain thermal comfort. Building (ii) relies entirely on mechanical means for heating and cooling, with its light-weight construction raising concern over increased cooling loads in a warmer climate. Devising an effective probabilistic approach highlighted greater uncertainty in predicting building performance, depending on the type of building modelled and the performance factors under consideration. Results indicate that the range of calculated quantities depends not only on the building type but also, strongly, on the performance parameters of interest. Uncertainty is likely to be particularly marked with regard to thermal comfort in naturally ventilated buildings.
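The following is a minimal, hypothetical sketch of the data-handling pattern the abstract alludes to: an ensemble of probabilistic climate inputs is pushed through a stand-in building performance function and the spread of the resulting metric is summarised. The toy model, parameter values and chosen metric are assumptions, not the paper's method.

```python
# Hedged sketch: propagating probabilistic climate projections through a
# (stand-in) building energy model and summarising the spread of a metric.
import numpy as np

def annual_overheating_hours(summer_mean_temp):
    """Toy stand-in for a building simulation: hours above a comfort threshold."""
    return max(0.0, 120.0 * (summer_mean_temp - 19.0))

rng = np.random.default_rng(42)
# e.g. 100 equally plausible projections of summer mean temperature (deg C) - invented
projections = rng.normal(loc=21.5, scale=1.2, size=100)

hours = np.array([annual_overheating_hours(t) for t in projections])
print(f"median = {np.median(hours):.0f} h, "
      f"5th-95th percentile = {np.percentile(hours, 5):.0f}-{np.percentile(hours, 95):.0f} h")
```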
Abstract:
Objective: This work investigates the nature of the comprehension impairment in Wernicke’s aphasia, by examining the relationship between deficits in auditory processing of fundamental, non-verbal acoustic stimuli and auditory comprehension. Wernicke’s aphasia, a condition resulting in severely disrupted auditory comprehension, primarily occurs following a cerebrovascular accident (CVA) to the left temporo-parietal cortex. Whilst damage to posterior superior temporal areas is associated with auditory linguistic comprehension impairments, functional imaging indicates that these areas may not be specific to speech processing but part of a network for generic auditory analysis. Methods: We examined analysis of basic acoustic stimuli in Wernicke’s aphasia participants (n = 10) using auditory stimuli reflective of theories of cortical auditory processing and of speech cues. Auditory spectral, temporal and spectro-temporal analysis was assessed using pure tone frequency discrimination, frequency modulation (FM) detection and the detection of dynamic modulation (DM) in “moving ripple” stimuli. All tasks used criterion-free, adaptive measures of threshold to ensure reliable results at the individual level. Results: Participants with Wernicke’s aphasia showed normal frequency discrimination but significant impairments in FM and DM detection, relative to age- and hearing-matched controls at the group level (n = 10). At the individual level, there was considerable variation in performance, and thresholds for both frequency and dynamic modulation detection correlated significantly with auditory comprehension abilities in the Wernicke’s aphasia participants. Conclusion: These results demonstrate the co-occurrence of a deficit in fundamental auditory processing of temporal and spectro-temporal non-verbal stimuli in Wernicke’s aphasia, which may have a causal contribution to the auditory language comprehension impairment. Results are discussed in the context of traditional neuropsychology and current models of cortical auditory processing.
Abstract:
This paper reports on the progress made by a group of fourteen 11-year-old children who had originally been identified as precocious readers before they started primary school at the age of 5 years. The data enable comparisons to be made with the performance of the children when they were younger, so that a six-year longitudinal analysis can be made. The children who began school as precocious readers continued to make progress in reading accuracy, rate and comprehension, thereby maintaining their superior performance relative to a comparison group. However, their progress appeared to follow the same developmental trajectory as that of the comparison group. Measures of phonological awareness showed that there are long-term, stable individual differences which correlated with all measures of reading. The children who were reading precociously early showed significantly higher levels of phonological awareness than the comparison children. In addition, they showed the same levels of performance on this task as a further group of high-achieving young adults. A positive effect of being able to read at a precociously early age was identified in the significantly higher levels of receptive vocabulary found amongst these children. The analyses indicated that gains in receptive vocabulary resulted from reading performance rather than the other way round.
Abstract:
Soluble reactive phosphorus (SRP) plays a key role in eutrophication, a global problem decreasing habitat quality and in-stream biodiversity. Mitigation strategies are required to prevent SRP fluxes from exceeding critical levels, and must be robust in the face of potential changes in climate, land use and a myriad of other influences. To establish the longevity of these strategies it is therefore crucial to consider the sensitivity of catchments to multiple future stressors. This study evaluates how the water quality and hydrology of a major river system in the UK (the River Thames) respond to alterations in climate, land use and water resource allocations, and investigates how these changes impact the relative performance of management strategies over an 80-year period. In the River Thames, the relative contributions of SRP from diffuse and point sources vary seasonally. Diffuse sources of SRP from agriculture dominate during periods of high runoff, and point sources during low flow periods. SRP concentrations rose under any future scenario which increased either (a) surface runoff or (b) the area of cultivated land. Under these conditions, SRP was sourced from agriculture, and the most effective single mitigation measures were those which addressed diffuse SRP sources. Conversely, where future scenarios reduced flow, e.g. during winters of reservoir construction, the significance of point source inputs increased, and mitigation measures addressing these issues became more effective. In catchments with multiple point and diffuse sources of SRP, an effective, all-encompassing mitigation approach is difficult to achieve with a single strategy. In order to attain maximum efficiency, multiple strategies might therefore be employed at different times and locations, to target the variable nature of the dominant SRP sources and pathways.
Abstract:
In this paper we consider the structure of dynamically evolving networks modelling information and activity moving across a large set of vertices. We adopt the communicability concept, which generalizes that of centrality defined for static networks. We define the primary network structure within the whole as comprising the most influential vertices (both as senders and receivers of dynamically sequenced activity). We present a methodology based on successive vertex knockouts, up to a very small fraction of the whole primary network, that can characterize the nature of the primary network as being either relatively robust and lattice-like (with redundancies built in) or relatively fragile and tree-like (with sensitivities and few redundancies). We apply these ideas to the analysis of evolving networks derived from fMRI scans of resting human brains. We show that the estimation of performance parameters via the structure tests of the corresponding primary networks is subject to less variability than that observed across a very large population of such scans. Hence the differences within the population are significant.
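A minimal sketch of the general idea, assuming the Estrada-style communicability (the matrix exponential of the adjacency matrix) as the influence measure and a simple greedy knockout loop; the authors' dynamic, time-ordered communicability and the fMRI-derived networks are not reproduced here, and the toy graph is made up.

```python
# Hedged sketch: static communicability via exp(A) plus successive vertex knockouts.
import numpy as np
from scipy.linalg import expm

def communicability(A):
    """Return the communicability matrix exp(A) for adjacency matrix A."""
    return expm(A)

def knockout_sequence(A, n_remove):
    """Repeatedly remove the vertex with the largest total communicability."""
    A = A.astype(float).copy()
    alive = list(range(A.shape[0]))
    removed = []
    for _ in range(n_remove):
        C = communicability(A)
        scores = C.sum(axis=1)          # row sums as a send/receive influence proxy
        k = int(np.argmax(scores))      # most influential remaining vertex
        removed.append(alive[k])
        A = np.delete(np.delete(A, k, axis=0), k, axis=1)
        alive.pop(k)
    return removed

# Toy undirected example (invented)
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 0],
              [0, 1, 0, 0]])
print(knockout_sequence(A, 2))
```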
Abstract:
Geomagnetic activity has long been known to exhibit approximately 27 day periodicity, resulting from solar wind structures repeating each solar rotation. Thus a very simple near-Earth solar wind forecast is 27 day persistence, wherein the near-Earth solar wind conditions today are assumed to be identical to those 27 days previously. Effective use of such a persistence model as a forecast tool, however, requires the performance and uncertainty to be fully characterized. The first half of this study determines which solar wind parameters can be reliably forecast by persistence and how the forecast skill varies with the solar cycle. The second half of the study shows how persistence can provide a useful benchmark for more sophisticated forecast schemes, namely physics-based numerical models. Point-by-point assessment methods, such as correlation and mean-square error, find persistence skill comparable to numerical models during solar minimum, despite the 27 day lead time of persistence forecasts, versus 2–5 days for numerical schemes. At solar maximum, however, the dynamic nature of the corona means 27 day persistence is no longer a good approximation and skill scores suggest persistence is out-performed by numerical models for almost all solar wind parameters. But point-by-point assessment techniques are not always a reliable indicator of usefulness as a forecast tool. An event-based assessment method, which focusses on key solar wind structures, finds persistence to be the most valuable forecast throughout the solar cycle. This reiterates the fact that the means of assessing the “best” forecast model must be specifically tailored to its intended use.
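A hedged sketch of the benchmark idea: shift a daily series by 27 days to form the persistence forecast, then score it point-by-point with correlation and a mean-square-error skill score against a climatological reference. The synthetic series and the choice of climatology as the reference are assumptions, not the paper's exact setup.

```python
# Minimal sketch: 27-day persistence forecast with correlation and MSE skill score.
import numpy as np

def persistence_forecast(series, lag_days=27):
    """Forecast each day as the value lag_days earlier; returns (forecast, observed)."""
    return series[:-lag_days], series[lag_days:]

rng = np.random.default_rng(0)
t = np.arange(365 * 3)
# Toy solar wind proxy with a 27-day recurrence plus noise (invented data)
obs = np.sin(2 * np.pi * t / 27.0) + 0.5 * rng.standard_normal(t.size)

fcst, truth = persistence_forecast(obs)
corr = np.corrcoef(fcst, truth)[0, 1]
mse = np.mean((fcst - truth) ** 2)
mse_clim = np.mean((truth.mean() - truth) ** 2)   # climatological benchmark
skill = 1.0 - mse / mse_clim                      # MSE skill score
print(f"correlation = {corr:.2f}, MSE skill = {skill:.2f}")
```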
Abstract:
Medium-range flood forecasting activities, driven by various meteorological forecasts ranging from high-resolution deterministic forecasts to low spatial resolution ensemble prediction systems, share a major challenge in the appropriateness and design of performance measures. In this paper, possible limitations of some traditional hydrological and meteorological prediction quality and verification measures are identified. Some simple modifications are applied in order to circumvent the problem of autocorrelation dominating river discharge time-series, and in order to create a benchmark model enabling decision makers to evaluate both the forecast quality and the model quality. Although the performance period is quite short, the advantage of a simple cost-loss function as a measure of forecast quality can be demonstrated.
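The sketch below illustrates one simple form of cost-loss evaluation, assuming a static cost-loss ratio and binary warnings; the event series, warning series and the C and L values are invented for illustration and are not the paper's data or exact formulation.

```python
# Hedged sketch: static cost-loss evaluation of a binary flood warning.
# A user pays cost C to protect whenever a warning is issued, and suffers
# loss L whenever an unprotected event occurs.
import numpy as np

def mean_expense(warn, event, C, L):
    """Average expense per time step under a given warning series."""
    warn, event = np.asarray(warn, bool), np.asarray(event, bool)
    return np.mean(warn * C + (~warn & event) * L)

event = np.array([0, 0, 1, 0, 1, 0, 0, 1, 0, 0], bool)   # observed exceedances (toy)
warn  = np.array([0, 1, 1, 0, 1, 0, 0, 1, 0, 0], bool)   # issued warnings (toy)

C, L = 1.0, 10.0
expense_forecast = mean_expense(warn, event, C, L)
expense_never    = mean_expense(np.zeros_like(event), event, C, L)  # never protect
expense_always   = mean_expense(np.ones_like(event), event, C, L)   # always protect
expense_perfect  = mean_expense(event, event, C, L)                 # perfect warnings
baseline = min(expense_never, expense_always)                        # climatological best
value = (baseline - expense_forecast) / (baseline - expense_perfect)
print(f"relative economic value: {value:.2f}")
```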
Abstract:
Vintage-based vector autoregressive models of a single macroeconomic variable are shown to be a useful vehicle for obtaining forecasts of different maturities of future and past observations, including estimates of post-revision values. The forecasting performance of models which include information on annual revisions is superior to that of models which only include the first two data releases. However, the empirical results indicate that a model which reflects the seasonal nature of data releases more closely does not offer much improvement over an unrestricted vintage-based model which includes three rounds of annual revisions.
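A rough, assumed illustration of what a vintage-based VAR might look like: the vector observed at each date stacks successive releases of the same variable, and a standard VAR is fitted to that vector. The synthetic data, lag order and three-release layout are assumptions, not the paper's specification.

```python
# Hedged sketch: a small vintage-based VAR on synthetic data.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(1)
T = 120
truth = np.zeros(T)
for t in range(1, T):                               # latent "final" growth series (toy)
    truth[t] = 0.6 * truth[t - 1] + rng.standard_normal()

vintages = pd.DataFrame({
    "release_1": truth + 0.3 * rng.standard_normal(T),  # noisy first estimate
    "release_2": truth + 0.2 * rng.standard_normal(T),  # after first annual revision
    "release_3": truth + 0.1 * rng.standard_normal(T),  # after second annual revision
})

model = VAR(vintages)
fit = model.fit(maxlags=2)
print(fit.forecast(vintages.values[-2:], steps=4))   # joint forecasts of all releases
```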
Abstract:
As the integration of vertical axis wind turbines in the built environment is a promising alternative to horizontal axis wind turbines, a 2D computational investigation of an augmented wind turbine is proposed and analysed. In the initial CFD analysis, three parameters are carefully investigated: mesh resolution, turbulence model and time step size. It appears that the mesh resolution and the turbulence model affect result accuracy, while the time step size examined, given the unsteady nature of the flow, has a small impact on the numerical results. In the CFD validation of the open rotor against secondary data, the numerical results are in good agreement in terms of shape, although a discrepancy factor of 2 between numerical and experimental data is observed. Subsequently, the introduction of an omnidirectional stator around the wind turbine increases the power and torque coefficients by around 30–35% when compared to the open case, but attention needs to be given to the orientation of the stator blades for optimum performance. It is found that the power and torque coefficients of the augmented wind turbine are independent of the incident wind speed considered.
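For reference, a minimal sketch of the non-dimensionalisation behind the reported power and torque coefficients, using the conventional definitions Cp = P / (0.5 ρ A V³) and Ct = T / (0.5 ρ A V² R); all numerical values are invented and the reference area and radius conventions are assumptions, not taken from the paper.

```python
# Hedged sketch: power and torque coefficients from simulated turbine output.
def power_coefficient(P, rho, A, V):
    """Cp = P / (0.5 * rho * A * V^3)."""
    return P / (0.5 * rho * A * V ** 3)

def torque_coefficient(T, rho, A, V, R):
    """Ct = T / (0.5 * rho * A * V^2 * R)."""
    return T / (0.5 * rho * A * V ** 2 * R)

rho = 1.225          # air density, kg/m^3
A = 2.0              # reference swept area, m^2 (invented)
V = 8.0              # incident wind speed, m/s (invented)
R = 0.5              # rotor radius, m (invented)
P, T = 150.0, 12.0   # simulated power (W) and torque (N*m) (invented)

print(f"Cp = {power_coefficient(P, rho, A, V):.3f}")
print(f"Ct = {torque_coefficient(T, rho, A, V, R):.3f}")
```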
Abstract:
Whilst common sense knowledge has been well researched in terms of intelligence and (in particular) artificial intelligence, specific, factual knowledge also plays a critical part in practice. When it comes to testing for intelligence, testing for factual knowledge is, in everyday life, frequently used as a front-line tool. This paper presents new results which were the outcome of a series of practical Turing tests held on 23rd June 2012 at Bletchley Park, England. The focus of this paper is on the employment of specific knowledge testing by interrogators. Of interest are prejudiced assumptions made by interrogators as to what they believe should be widely known, and subsequently the conclusions drawn if an entity does or does not appear to know a particular fact known to the interrogator. The paper is not at all about the performance of machines or hidden humans, but rather about the strategies, based on such assumptions, that Turing test interrogators employ. Full, unedited transcripts from the tests are shown for the reader as working examples. As a result, it might be possible to draw critical conclusions with regard to the nature of human concepts of intelligence, in terms of the role played by specific, factual knowledge in our understanding of intelligence, whether this is exhibited by a human or a machine. This is specifically intended as a position paper, firstly by claiming that practicalising Turing's test is a useful exercise throwing light on how we humans think, and secondly by taking a potentially controversial stance, because some interrogators adopt a solipsistic questioning style towards hidden entities, taking the view that an entity is a thinking, intelligent human if it thinks like them and knows what they know. The paper is aimed at opening discussion with regard to the different aspects considered.
Abstract:
The assessment of chess players is an increasingly attractive opportunity and an unfortunate necessity. The chess community needs to limit potential reputational damage by inhibiting cheating and unjustified accusations of cheating: there has been a recent rise in both. A number of counter-intuitive discoveries have been made by benchmarking the intrinsic merit of players’ moves: these call for further investigation. Is Capablanca actually, objectively the most accurate World Champion? Has ELO rating inflation not taken place? Stimulated by FIDE/ACP, we revisit the fundamentals of the subject to advance a framework suitable for improved standards of computational experiment and more precise results. Other domains look to chess as the demonstrator of good practice, including the rating of professionals making high-value decisions under pressure, personnel evaluation by Multichoice Assessment and the organization of crowd-sourcing in citizen science projects. The ‘3P’ themes of performance, prediction and profiling pervade all these domains.
Abstract:
Studies of climate change impacts on the terrestrial biosphere have been completed without recognition of the integrated nature of the biosphere. Improved assessment of the impacts of climate change on food and water security requires the development and use of models not only representing each component but also their interactions. To meet this requirement, the Joint UK Land Environment Simulator (JULES) land surface model has been modified to include a generic parametrisation of annual crops. The new model, JULES-crop, is described and evaluated at global and site levels for four globally important crops: wheat, soybean, maize and rice. JULES-crop demonstrates skill in simulating the inter-annual variations of yield for maize and soybean at the global and country levels, and for wheat for major spring wheat producing countries. The impact of the new parametrisation, compared to the standard configuration, on the simulation of surface heat fluxes is largely an alteration of the partitioning between latent and sensible heat fluxes during the later part of the growing season. Further evaluation at the site level shows that the model captures the seasonality of leaf area index, gross primary production and canopy height better than the standard JULES. However, this does not lead to an improvement in the simulation of sensible and latent heat fluxes. The performance of JULES-crop from both an Earth system and a crop yield model perspective is encouraging. However, more effort is needed to develop the parametrisation of the model for specific applications. Key future model developments identified include the introduction of processes such as irrigation and nitrogen limitation, which will enable better representation of the spatial variability in yield.
Abstract:
The intellectual societies known as Academies played a vital role in the development of culture and scholarly debate throughout Italy between 1525 and 1700. They were fundamental in establishing the intellectual networks later defined as the ‘République des Lettres’, and in the dissemination of ideas in early modern Europe through print, manuscript, oral debate and performance. This volume surveys the social and cultural role of the Academies, challenging received ideas and incorporating recent archival findings on individuals, networks and texts. Ranging over Academies in both major and smaller or peripheral centres, these collected studies explore the interrelationships of Academies with other cultural forums. Individual essays examine the fluid nature of Academies and their changing relationships to the political authorities; their role in the promotion of literature, the visual arts and theatre; and the diverse membership recorded for many Academies, which included scientists, writers, printers, artists, political and religious thinkers, and, unusually, a number of talented women. Contributions by established international scholars, together with studies by younger scholars active in this developing field of research, map out new perspectives on the dynamic place of the Academies in early modern Italy. The publication results from the research collaboration ‘The Italian Academies 1525-1700: the first intellectual networks of early modern Europe’, funded by the Arts and Humanities Research Council, and is edited by the senior investigators.