17 results for usable leftovers
in CentAUR: Central Archive University of Reading - UK
Abstract:
The equations of Milsom are evaluated, giving the ground range and group delay of radio waves propagated via the horizontally stratified model ionosphere proposed by Bradley and Dudeney. Expressions for the ground range which allow for the effects of the underlying E- and F1-regions are used to evaluate the basic maximum usable frequency or M-factors for single F-layer hops. An algorithm for the rapid calculation of the M-factor at a given range is developed, and shown to be accurate to within 5%. The results reveal that the M(3000)F2-factor scaled from vertical-incidence ionograms using the standard URSI procedure can be up to 7.5% in error. A simple addition to the algorithm effects a correction to ionogram values to make these accurate to 0.5%.
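For orientation, a minimal sketch of the role the M-factor plays in converting a vertical-incidence critical frequency into a basic MUF for a given range; the actual Milsom and Bradley-Dudeney expressions are not reproduced here, and the numerical values and correction term below are hypothetical placeholders.

```python
# Illustrative sketch only: the relation between an M-factor and the basic MUF.
# The Bradley-Dudeney model ionosphere and Milsom's range/delay equations are not
# reproduced; 'm_scaled' and 'correction' below are hypothetical placeholder values.

def basic_muf(foF2_mhz: float, m_factor: float) -> float:
    """Basic maximum usable frequency for one hop: MUF = M-factor * foF2."""
    return m_factor * foF2_mhz

# Example: an ionogram-scaled M(3000)F2 of 3.2 with foF2 = 7 MHz.
m_scaled = 3.2
correction = -0.05           # placeholder for an ionogram-value correction
m_corrected = m_scaled + correction

print(basic_muf(7.0, m_scaled))     # uncorrected MUF estimate (MHz)
print(basic_muf(7.0, m_corrected))  # corrected MUF estimate (MHz)
```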
Abstract:
The existence of endgame databases challenges us to extract higher-grade information and knowledge from their basic data content. Chess players, for example, would like simple and usable endgame theories, if such a holy grail exists; endgame experts would like to provide such insights and to be inspired by computers to do so. Here, we investigate the use of artificial neural networks (NNs) to mine these databases and we report on a first use of NNs on the KPK endgame. The results encourage us to suggest further work on chess applications of neural networks and other data-mining techniques.
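A minimal sketch, not the paper's method, of how a small feed-forward network might be trained to classify KPK (king and pawn versus king) positions as win or draw; the position encoding, the network shape and the random placeholder labels are assumptions (in practice the labels would come from the endgame database itself).

```python
# Sketch: train a small neural network on encoded KPK positions.
# The feature encoding and the random labels below are placeholders.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical encoding: (white king, black king, pawn) squares, each as
# file/rank in 0..7, giving six input features per position.
n_positions = 5000
X = rng.integers(0, 8, size=(n_positions, 6)).astype(float)
y = rng.integers(0, 2, size=n_positions)   # placeholder win/draw labels

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

net = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
net.fit(X_train, y_train)
print("held-out accuracy:", net.score(X_test, y_test))
```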
Abstract:
In recent years there has been a growing debate over whether or not standards should be produced for user system interfaces. Those in favor of standardization argue that standards in this area will result in more usable systems, while those against argue that standardization is neither practical nor desirable. The present paper reviews both sides of this debate in relation to expert systems. It argues that in many areas guidelines are more appropriate than standards for user interface design.
Abstract:
This paper describes and analyses the experience of designing, installing and evaluating a farmer-usable touch-screen information kiosk on cattle health in a veterinary institution in Pondicherry. The contents of the kiosk were prepared based on identified demands for information on cattle health, arrived at through various stakeholder meetings. Information on the cattle diseases and conditions affecting the livelihoods of the poor was provided through graphics, text and audio back-up, keeping in mind the needs of landless and illiterate poor cattle owners. A methodology for kiosk evaluation based on the feedback obtained from the kiosk facilitator, critical group reflection and individual users was formulated. The formative evaluation reveals the potential strength of this ICT in transferring information to cattle owners in a service delivery centre. Such information is vital in preventing diseases and helps cattle owners to present and treat their animals at an early stage of disease. This in turn helps prevent direct and indirect losses to the cattle owners. The study reveals how an information kiosk installed at a government institution as a freely accessible source of information for all farmers, irrespective of their class and caste, can help in the transfer of information among poor cattle owners, provided periodic updating, interactivity and communication variability are taken care of. Being in the veterinary centre, the kiosk helps stimulate dialogue and facilitates demand for services based on the information provided by the kiosk screens.
Abstract:
In the BiodiversityWorld project we are building a GRID to support scientific biodiversity-related research. The requirements associated with such a GRID are somewhat different from other GRIDs, and this has influenced the architecture that we have developed. In this paper we outline these requirements, most notably the need to interoperate over a diverse set of legacy databases and applications in an environment that supports effective resource discovery and use of these resources in complex workflows. Our architecture provides an invocation model that is usable over a wide range of resource types and underlying GRID middleware. However, there is a trade-off between the flexibility provided by our architecture and its performance. We discuss how this affects the inclusion of computationally intensive applications and applications that are highly interactive; we also consider the broader issue of interoperation with other GRIDs.
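As an illustration of the kind of uniform invocation model described (not the BiodiversityWorld API itself), the sketch below wraps two hypothetical resources behind a common interface so a workflow can call either interchangeably; all class and method names are invented for illustration.

```python
# Illustrative sketch only: a uniform invocation interface over heterogeneous
# resources (e.g. a legacy database and an analysis application), so a workflow
# engine can call them interchangeably regardless of the underlying middleware.
from abc import ABC, abstractmethod
from typing import Any, Dict


class Resource(ABC):
    """Common invocation model for any wrapped resource."""

    @abstractmethod
    def invoke(self, operation: str, args: Dict[str, Any]) -> Any: ...


class LegacyDatabase(Resource):
    def invoke(self, operation: str, args: Dict[str, Any]) -> Any:
        # A real wrapper would translate this call into the database's native query API.
        return f"database result for {operation}({args})"


class AnalysisTool(Resource):
    def invoke(self, operation: str, args: Dict[str, Any]) -> Any:
        # A real wrapper would launch the application on the underlying Grid middleware.
        return f"analysis output for {operation}({args})"


# A workflow step sees only the common interface, whatever the resource type.
for resource in (LegacyDatabase(), AnalysisTool()):
    print(resource.invoke("search", {"taxon": "Quercus"}))
```

The extra layer of indirection is what makes the trade-off mentioned in the abstract visible: every call passes through the generic interface, which aids interoperability but adds overhead for computationally intensive or highly interactive applications.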
Abstract:
Different types of mental activity are utilised as an input in Brain-Computer Interface (BCI) systems. One such activity type is based on Event-Related Potentials (ERPs). The characteristics of ERPs are not visible in single trials, thus averaging over a number of trials is necessary before the signals become usable. An improvement in ERP-based BCI operation and system usability could be obtained if the use of single-trial ERP data was possible. The method of Independent Component Analysis (ICA) can be utilised to separate single-trial recordings of ERP data into components that correspond to ERP characteristics, background electroencephalogram (EEG) activity and other components of non-cerebral origin. The choice of specific components and their use to reconstruct “denoised” single-trial data could improve the signal quality, thus allowing the successful use of single-trial data without the need for averaging. This paper assesses single-trial ERP signals reconstructed using a selection of estimated components from the application of ICA on the raw ERP data. Signal improvement is measured using Contrast-To-Noise measures. It was found that such analysis improves the signal quality in all single trials.
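A minimal sketch of the general ICA-denoising approach, under assumed details rather than the paper's exact pipeline: decompose a single-trial epoch into independent components, keep a subset, and reconstruct a "denoised" trial. The synthetic epoch and the component-selection rule below are placeholders.

```python
# Sketch: ICA decomposition and reconstruction of one synthetic ERP epoch.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_channels, n_samples = 8, 512
t = np.linspace(0, 1, n_samples)

# Placeholder single-trial epoch: a P300-like bump plus background noise.
erp = np.exp(-((t - 0.3) ** 2) / 0.002)
epoch = 0.5 * np.outer(rng.random(n_channels), erp) \
        + rng.normal(0, 1, (n_channels, n_samples))

ica = FastICA(n_components=n_channels, random_state=0)
sources = ica.fit_transform(epoch.T)          # shape (samples, components)

# Placeholder selection rule: keep the two components most correlated with
# the ERP template; a real pipeline would use a principled criterion.
scores = np.abs(np.corrcoef(np.vstack([sources.T, erp]))[-1, :-1])
keep = scores.argsort()[-2:]

selected = np.zeros_like(sources)
selected[:, keep] = sources[:, keep]
denoised = ica.inverse_transform(selected).T  # reconstructed (channels, samples)
print(denoised.shape)
```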
Abstract:
Archaeological excavations alongside the river Wandle in Wallington produced evidence of the environmental history and human exploitation of the area. The recovery of a large assemblage of struck flint provided information on the nature of the prehistoric activities represented, while a detailed environmental archaeological programme permitted both an examination of the local sediment successions and an opportunity to reconstruct the environmental history of the site. The site revealed a complex sedimentary sequence deposited in riverine conditions, commencing during the early Holocene (from c 10,000 years before present) and continuing through the late Holocene (the last c 3000 years). Large flint nodules were washed by the river onto the site, where they were procured and worked by Mesolithic and Bronze Age communities. Potentially usable nodules had been tested, and suitable pieces completely reduced, while the majority of useful flakes and blades had been removed for use elsewhere. Small numbers of retouched pieces, such as scrapers and piercers, indicate that domestic activities took place nearby. By the Saxon period the site had begun to stabilise, although it remained marshy and probably peripheral to habitation. Two pits from this period were excavated, one of which contained an antler pick. A small quantity of cereal grain also suggests that cultivated land lay in the vicinity of the site. During the 19th century a mill race was dug across the site, redirecting water from the river Wandle, which resulted in episodic flooding.
Abstract:
A simple four-dimensional assimilation technique, called Newtonian relaxation, has been applied to the Hamburg climate model (ECHAM) to enable comparison of model output with observations for short periods of time. The prognostic model variables vorticity, divergence, temperature, and surface pressure have been relaxed toward European Center for Medium-Range Weather Forecasts (ECMWF) global meteorological analyses. Several experiments have been carried out in which the values of the relaxation coefficients were varied to find out which values are most usable for our purpose. To be able to use the method for validation of model physics or chemistry, good agreement of the model-simulated mass and wind fields is required. In addition, the model physics should not be disturbed too strongly by the relaxation forcing itself. Both aspects have been investigated. Good agreement with basic observed quantities, like wind, temperature, and pressure, is obtained for most simulations in the extratropics. Derived variables, like precipitation and evaporation, have been compared with ECMWF forecasts and observations. Agreement for these variables is weaker than for the basic observed quantities. Nevertheless, considerable improvement is obtained relative to a control run without assimilation. Differences between the tropics and extratropics are smaller than for the basic observed quantities. Results also show that precipitation and evaporation are affected by a sort of continuous spin-up introduced by the relaxation: the bias (ECMWF minus ECHAM) increases with increasing relaxation forcing. In agreement with this result, we found that with increasing relaxation forcing the vertical exchange of tracers by turbulent boundary-layer mixing and, to a lesser extent, by convection is reduced.
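A toy sketch of Newtonian relaxation (nudging) on a single scalar variable, not the ECHAM implementation: the model state is relaxed toward an analysis value with a relaxation coefficient G, and the strength of the forcing is varied much as in the experiments described; all numerical values are illustrative.

```python
# Sketch: one prognostic variable x relaxed toward an analysis value x_an
# with relaxation coefficient G (1/s).  Larger G pulls the model more
# strongly toward the analyses but disturbs the free dynamics more.
def step(x, x_an, model_tendency, G, dt):
    """One time step: free model tendency plus the relaxation term G*(x_an - x)."""
    return x + dt * (model_tendency(x) + G * (x_an - x))

model_tendency = lambda x: -1e-6 * (x - 283.0)   # placeholder free dynamics
x0, x_an, dt = 280.0, 285.0, 600.0               # e.g. temperature (K), analysis, 10-min step

for G in (0.0, 1e-5, 1e-4):                      # no nudging, weak, strong
    x = x0
    for _ in range(144):                         # one day of 10-minute steps
        x = step(x, x_an, model_tendency, G, dt)
    print(f"G={G:g} /s: x after one day = {x:.2f} K")
```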
Abstract:
Although Richard Hooker’s private attitudes were clericalist and authoritarian, his constitutional theory subordinated clergymen to laymen and monarchy to parliamentary statute. This article explains why his political ideas were nonetheless appropriate to his presumed religious purposes. It notes a very intimate connection between his teleological conception of a law and his hostility towards conventional high Calvinist ideas about predestination. The most significant anomaly within his broadly Aristotelian world-view was his belief that politics is nothing but a means to cope with sin. This too can be linked to his religious ends, but it creates an ambiguity that made his doctrines usable by Locke.
Abstract:
Particle filters are fully non-linear data assimilation techniques that aim to represent the probability distribution of the model state given the observations (the posterior) by a number of particles. In high-dimensional geophysical applications the number of particles required by the sequential importance resampling (SIR) particle filter to capture the high-probability region of the posterior is too large for the filter to be usable. However, particle filters can be formulated using proposal densities, which gives greater freedom in how particles are sampled and allows a much smaller number of particles. Here a particle filter is presented which uses the proposal density to ensure that all particles end up in the high-probability region of the posterior probability density function. This gives rise to the possibility of non-linear data assimilation in high-dimensional systems. The particle filter formulation is compared to the optimal proposal density particle filter and the implicit particle filter, both of which also utilise a proposal density. We show that when observations are available every time step, both schemes will be degenerate when the number of independent observations is large, unlike the new scheme. The sensitivity of the new scheme to its parameter values is explored theoretically and demonstrated using the Lorenz (1963) model.
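For comparison with the scheme described, a minimal sketch of a standard SIR particle filter applied to the Lorenz (1963) model; the proposal-density filter introduced in the paper is not reproduced here, and the observation interval and noise levels are assumptions.

```python
# Sketch: sequential importance resampling (SIR) particle filter on Lorenz 63.
import numpy as np

rng = np.random.default_rng(0)

def lorenz63_step(x, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz 63 model for an ensemble x of shape (N, 3)."""
    dx = np.empty_like(x)
    dx[:, 0] = sigma * (x[:, 1] - x[:, 0])
    dx[:, 1] = x[:, 0] * (rho - x[:, 2]) - x[:, 1]
    dx[:, 2] = x[:, 0] * x[:, 1] - beta * x[:, 2]
    return x + dt * dx

n_particles, obs_err, model_err = 500, 1.0, 0.1
truth = np.array([[1.0, 1.0, 1.0]])
particles = truth + rng.normal(0, 1, (n_particles, 3))

for k in range(200):
    truth = lorenz63_step(truth)
    particles = lorenz63_step(particles) + rng.normal(0, model_err, particles.shape)

    if (k + 1) % 20 == 0:                          # observe the x-component every 20 steps
        y = truth[0, 0] + rng.normal(0, obs_err)
        # importance weights from the observation likelihood
        w = np.exp(-0.5 * ((particles[:, 0] - y) / obs_err) ** 2) + 1e-300
        w /= w.sum()
        # resample (SIR): draw particles in proportion to their weights
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]

print("truth:", truth[0])
print("posterior mean:", particles.mean(axis=0))
```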
Abstract:
Purpose – Today marketers operate in globalised markets, planning new ways to engage with domestic and foreign customers alike. While there is a greater need to understand these two customer groups, few studies examine the impact of customer engagement tactics on the two customer groups, focusing on their perceptual differences. Even less attention is given to customer engagement tactics in a cross-cultural framework. In this research, the authors investigate customers in China and the UK, aiming to compare their perceptual differences on the impact of multiple customer engagement tactics.
Design/methodology/approach – Using a quantitative approach with 286 usable responses from China and the UK, obtained through a combination of a person-administered survey and a computer-based survey with a screening process, the authors test a series of hypotheses to distinguish cross-cultural differences.
Findings – Findings show that the collectivists (Chinese customers) perceive customer engagement tactics differently than the individualists (UK customers). The Chinese customers are more sensitive to price and reputation, whereas the UK customers respond more strongly to service, communication and customisation. Chinese customers’ concern with extensive price and reputation comparisons may be explained by their awareness of face (status), increased self-expression and equality.
Practical implications – The findings challenge the conventional practice of using similar customer engagement tactics for a specific marketplace with little concern for multiple cultural backgrounds. The paper proposes strategies for marketers facing challenges in this globalised context.
Originality/value – The study makes several contributions to the literature. First, it shows the effects of culture on customers’ perceptual differences. Second, it provides more information to clarify customers’ different reactions towards customer engagement tactics, highlighted by concerns towards face and status. Third, it provides empirical evidence to support the use of multiple customer engagement tactics in cross-cultural studies.
Abstract:
There is a critical need for screening and diagnostic tools (SDTs) for autism spectrum conditions (ASC) in regional languages in South Asia. To address this, we translated four widely used SDTs (the Social Communication Disorder Checklist, Autism Spectrum Quotient, Social Communication Questionnaire, and Autism Diagnostic Observation Schedule) into Bengali and Hindi, two main regional languages (∼360 million speakers), and tested their usability in children with and without ASC. We found a significant difference in scores between children with ASC (n = 45 in Bengali, n = 40 in Hindi) and typically developing children (n = 43 in Bengali, n = 42 in Hindi) on all SDTs. These results demonstrate that these SDTs are usable in South Asia and constitute an important resource for epidemiological research and clinical diagnosis in the region.
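Purely as an illustration of comparing group scores on a screening tool (the abstract does not state the study's statistical method), a small sketch using a non-parametric two-sample test on placeholder data; the scores below are random, not the study's data.

```python
# Sketch: non-parametric comparison of screening-tool scores between two groups.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
scores_asc = rng.normal(loc=22, scale=5, size=45)   # placeholder ASC-group scores
scores_td = rng.normal(loc=12, scale=5, size=43)    # placeholder control-group scores

stat, p = mannwhitneyu(scores_asc, scores_td, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p:.3g}")
```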