74 results for Input and outputs
in CentAUR: Central Archive University of Reading - UK
Abstract:
Motivation: There is a frequent need to apply a large range of local or remote prediction and annotation tools to one or more sequences. We have created a tool able to dispatch one or more sequences to assorted services by defining a consistent XML format for data and annotations. Results: By analyzing annotation tools, we have determined that annotations can be described using one or more of six forms of data: numeric or textual annotation of residues, of domains (residue ranges) or of whole sequences. With this in mind, XML DTDs have been designed to store the input and output of any server. Plug-in wrappers for a number of services have been written, and these are called from a master script. The resulting APATML is then formatted for display in HTML; alternatively, further tools may be written to perform post-analysis.
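The abstract does not reproduce the APATML DTDs, but the six annotation forms it describes (numeric or textual annotation of residues, of domains, or of whole sequences) map naturally onto a small XML vocabulary. The following Python sketch, with purely hypothetical element and attribute names, illustrates how one such record might be assembled:

    # Hypothetical sketch of an APATML-style annotation record.
    # Element and attribute names are illustrative only; the real DTDs are not shown in the abstract.
    import xml.etree.ElementTree as ET

    def build_annotation(seq_id, residue_scores, domain_label, whole_seq_note):
        root = ET.Element("annotation", attrib={"sequence": seq_id})

        # Numeric annotation of individual residues (e.g. a per-residue score).
        residues = ET.SubElement(root, "residueAnnotation", attrib={"type": "numeric"})
        for pos, score in residue_scores:
            ET.SubElement(residues, "residue", attrib={"pos": str(pos), "value": str(score)})

        # Textual annotation of a residue range (domain).
        domains = ET.SubElement(root, "domainAnnotation", attrib={"type": "textual"})
        dom = ET.SubElement(domains, "domain", attrib={"start": "10", "end": "55"})
        dom.text = domain_label

        # Textual annotation of the whole sequence.
        whole = ET.SubElement(root, "sequenceAnnotation", attrib={"type": "textual"})
        whole.text = whole_seq_note

        return ET.tostring(root, encoding="unicode")

    print(build_annotation("P12345", [(1, 0.8), (2, 0.3)],
                           "putative kinase domain", "predicted membrane protein"))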
Abstract:
In immediate recall tasks, visual recency is substantially enhanced when output interference is low (Cowan, Saults, Elliott, & Moreno, 2002; Craik, 1969), whereas auditory recency remains high even under conditions of high output interference. This auditory advantage has been interpreted in terms of auditory resistance to output interference (e.g., Neath & Surprenant, 2003). In this study the auditory-visual difference at low output interference re-emerged when ceiling effects were accounted for, but only with spoken output. With written responding, the auditory advantage remained significantly larger with high than with low output interference. These new data suggest that both superior auditory encoding and modality-specific output interference contribute to the classic auditory-visual modality effect.
Abstract:
Wild bird feeding is popular in domestic gardens across the world. Nevertheless, there is surprisingly little empirical information on certain aspects of the activity and no year-round quantitative records of the amounts and nature of the different foods provided in individual gardens. We sought to characterise garden bird feeding in a large UK urban area in two ways. First, we conducted face-to-face questionnaires with a representative cross-section of residents. Just over half fed birds, the majority doing so year round and at least weekly. Second, a two-year study recorded all foodstuffs put out by households on every provisioning occasion. A median of 628 kcal/garden/day was given. Provisioning level was not significantly influenced by weather or season. Comparisons between the data sets revealed significantly less frequent feeding amongst these ‘keen’ feeders than the face-to-face questionnaire respondents, suggesting that one-off questionnaires may overestimate provisioning frequency. Assuming 100% uptake, the median provisioning level equates to sufficient supplementary resources across the UK to support 196 million individuals of a hypothetical average garden-feeding bird species (based on 10 common UK garden-feeding birds’ energy requirements). Taking the lowest provisioning level recorded (101 kcal/day) as a conservative measure, 31 million of these average individuals could theoretically be supported.
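The national extrapolation rests on a simple proportionality; as a hedged sketch (the symbols below are introduced here for illustration and do not appear in the abstract):

    N = (G × P) / E

where N is the number of "average" garden-feeding birds that could be supported, G the number of provisioning gardens (100% uptake assumed), P the energy provided per garden per day (median 628 kcal, minimum 101 kcal) and E the daily energy requirement of the hypothetical average species.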
Abstract:
A multivariate fit to the variation in global mean surface air temperature anomaly over the past half century is presented. The fit procedure allows for the effect of response time on the waveform, amplitude and lag of each radiative forcing input, and each is allowed to have its own time constant. It is shown that the contribution of solar variability to the temperature trend since 1987 is small and downward; the best estimate is -1.3% and the 2σ confidence level sets the uncertainty range at -0.7 to -1.9%. The result is the same if one quantifies the solar variation using galactic cosmic ray fluxes (for which the analysis can be extended back to 1953) or the most accurate total solar irradiance data composite. The rise in the global mean air surface temperatures is predominantly associated with a linear increase that represents the combined effects of changes in anthropogenic well-mixed greenhouse gases and aerosols, although, in recent decades, there is also a considerable contribution by a relative lack of major volcanic eruptions. The best estimate is that the anthropogenic factors contribute 75% of the rise since 1987, with an uncertainty range (set by the 2σ confidence level using an AR(1) noise model) of 49–160%; thus, the uncertainty is large, but we can state that at least half of the temperature trend comes from the linear term and that this term could explain the entire rise. The results are consistent with the Intergovernmental Panel on Climate Change (IPCC) estimates of the changes in radiative forcing (given for 1961–1995) and are here combined with those estimates to find the response times, equilibrium climate sensitivities and pertinent heat capacities (i.e. the depth into the oceans to which a given radiative forcing variation penetrates) of the quasi-periodic (decadal-scale) input forcing variations. As shown by previous studies, the decadal-scale variations do not penetrate as deeply into the oceans as the longer term drifts and have shorter response times. Hence, conclusions about the response to century-scale forcing changes (and hence the associated equilibrium climate sensitivity and the temperature rise commitment) cannot be made from studies of the response to shorter period forcing changes.
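The abstract does not give the exact parameterisation, but a generic form of such a multivariate fit, in which each radiative forcing F_i(t) is convolved with an exponential response of amplitude A_i and time constant τ_i, with a separate linear term ct for the anthropogenic drift and a residual ε(t), can be sketched (as an assumption about the general form, not the paper's own equation) as:

    \Delta T(t) \;=\; \sum_i A_i \int_0^{\infty} F_i(t-t')\,\frac{e^{-t'/\tau_i}}{\tau_i}\,dt' \;+\; c\,t \;+\; \varepsilon(t)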
Abstract:
Context-aware multimodal interactive systems aim to adapt to the needs and behavioural patterns of users and offer a way forward for enhancing the efficacy and quality of experience (QoE) in human-computer interaction. The various modalities that contribute to such systems each provide a specific uni-modal response that is integratively presented as a multi-modal interface, capable of interpreting multi-modal user input and responding to it appropriately through dynamically adapted multi-modal interactive flow management. This paper presents an initial background study in the context of the first phase of a PhD research programme in the area of optimisation of data fusion techniques to serve multimodal interactive systems, their applications and requirements.
Abstract:
The main inputs, outputs and transfers of potassium (K) in soils and swards under typical south-west England conditions were determined during 1999/00 and 2000/01 to establish soil and field gate K budgets under different fertilizer nitrogen (N) (0 and 280 kg ha⁻¹ yr⁻¹) and drainage (undrained and drained) treatments. Plots receiving fertilizer N also received farmyard manure (FYM). Potassium soil budgets ranged, on average for the two years, from -5 (+N, drained) to +9 (no N and undrained) kg K ha⁻¹ yr⁻¹, and field gate budgets from +23 (+N, drained) to +89 (+N, undrained). The main input to and output from the soil K budgets were fertilizer application (65%) and plant uptake (93%), respectively. Animals had a minor effect on K export but a major impact on K recycling. Nitrogen fertilizer application and drainage increased K uptake by the grass and, with it, the efficiency of K use. They also depleted easily available soil K, which could be associated with smaller K losses by leaching.
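Nutrient budgets of this kind are conventionally defined as the difference between inputs and outputs; a hedged sketch (the exact terms included in the paper's budgets are not listed in the abstract) is:

    B_{\text{soil}} \;=\; (K_{\text{fertilizer}} + K_{\text{FYM}} + K_{\text{deposition}} + K_{\text{excreta}}) - (K_{\text{plant uptake}} + K_{\text{leaching}}), \qquad B_{\text{field gate}} \;=\; K_{\text{imported}} - K_{\text{exported}}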
Abstract:
A wide variety of exposure models are currently employed for health risk assessments. Individual models have been developed to meet the chemical exposure assessment needs of Government, industry and academia. These existing exposure models can be broadly categorised according to the following types of exposure source: environmental, dietary, consumer product, occupational, and aggregate and cumulative. Aggregate exposure models consider multiple exposure pathways, while cumulative models consider multiple chemicals. In this paper each of these basic types of exposure model is briefly described, along with any inherent strengths or weaknesses, with the UK as a case study. Examples are given of specific exposure models that are currently used, or that have the potential for future use, and key differences in modelling approaches adopted are discussed. The use of exposure models is currently fragmentary in nature. Specific organisations with exposure assessment responsibilities tend to use a limited range of models. The modelling techniques adopted in current exposure models have evolved along distinct lines for the various types of source. In fact, different organisations may be using different models for very similar exposure assessment situations. This lack of consistency between exposure modelling practices can make understanding the exposure assessment process more complex, can lead to inconsistency between organisations in how critical modelling issues are addressed (e.g. variability and uncertainty), and has the potential to communicate mixed messages to the general public. Further work should be conducted to integrate the various approaches and models, where possible and where regulatory remits allow, to achieve a coherent and consistent exposure modelling process. We recommend the development of an overall framework for exposure and risk assessment with common approaches and methodology, a screening tool for exposure assessment, collection of better input data, probabilistic modelling, validation of model input and output, and a closer working relationship between scientists, policy makers and staff from different Government departments. A much increased effort is required in the UK to address these issues. The result will be a more robust, transparent, valid and more comparable exposure and risk assessment process. (C) 2006 Elsevier Ltd. All rights reserved.
Abstract:
The paper reports the findings of a study designed to consider the impact of the adoption of Bt cotton on markets, businesses, and institutional arrangements in India. Given that evidence to date suggests that widespread adoption of Bt cotton by farmers is likely to increase production, this study aims to assess possible implications for markets (access to inputs, prices of inputs and outputs, etc.) and local industries and to identify potential winners and losers. The results suggest that there are impacts on the cotton industry following from the release of Bt hybrids, and so far the impacts are most noticeable "upstream" (i.e., the input suppliers), where companies are rapidly moving away from the sale of bollworm insecticide and attempting to sell Bt seeds. Seed companies are looking for partnerships with Monsanto, the owner of the Bt gene. One reason that companies are keen to move away from insecticide is so they can avoid the need for credit supply to their customers. Seed purchase is not normally through credit, whereas insecticide purchase is. Issues for companies "downstream" (gins, textile manufacturers) relate more to the better quality of Bt cotton and the need for adequate segregation of Bt and non-Bt cotton.
Abstract:
The use of special units for logarithmic ratio quantities is reviewed. The neper is used with a natural logarithm (logarithm to the base e) to express the logarithm of the amplitude ratio of two pure sinusoidal signals, particularly in the context of linear systems where it is desired to represent the gain or loss in amplitude of a single-frequency signal between the input and output. The bel, and its more commonly used submultiple, the decibel, are used with a decadic logarithm (logarithm to the base 10) to measure the ratio of two power-like quantities, such as a mean square signal or a mean square sound pressure in acoustics. Thus two distinctly different quantities are involved. In this review we define the quantities first, without reference to the units, as is standard practice in any system of quantities and units. We show that two different definitions of the quantity power level, or logarithmic power ratio, are possible. We show that this leads to two different interpretations for the meaning and numerical values of the units bel and decibel. We review the question of which of these alternative definitions is actually used, or is used by implication, by workers in the field. Finally, we discuss the relative advantages of the alternative definitions.
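For reference, the conventional relations that these units express can be sketched as follows (the paper's two alternative definitions of power level are not reproduced in the abstract, so only the standard forms are shown here):

    L_A = \ln\frac{A_1}{A_2}\ \text{Np} = 20\,\lg\frac{A_1}{A_2}\ \text{dB} \quad\text{(amplitude, or field, quantities)}, \qquad
    L_P = \tfrac{1}{2}\ln\frac{P_1}{P_2}\ \text{Np} = 10\,\lg\frac{P_1}{P_2}\ \text{dB} \quad\text{(power-like quantities)}

so that, under this convention, 1 Np = (20/ln 10) dB ≈ 8.686 dB.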
Abstract:
This paper reviews the economic framework for the delivery of livestock services to the poor. It is argued that the demand for livestock products is likely to increase rapidly and the ability of the poor to participate in the opportunities presented by this growth is linked critically to the availability of good service support, both on the input and output side. Governments therefore have a responsibility to supply the necessary public goods (including the institutions and legal frameworks), and the market infrastructure for facilitating the emergence of efficient markets for livestock services. The paper further argues that the dynamics of public policy in developing countries are much more complex than the simple application of economic logic. It is the larger political economy that often dictates policy choices. It is therefore important to integrate political economy and governance issues into the economic debate on livestock service delivery. The paper also reviews the context in which the markets for livestock services will need to function. Different countries are facing very different sets of issues, and the identification of possible interventions in livestock service markets would require careful field research and analysis. In this context, the paper suggests the elements of a research agenda for the next few years.
Abstract:
The relationships between wheat protein quality and baking properties of 20 flour samples were studied for two breadmaking processes: a hearth bread test and the Chorleywood Bread Process (CBP). The strain hardening index obtained from dough inflation measurements, the proportion of unextractable polymeric protein, and mixing properties were among the variables found to be good indicators of protein quality and suitable for predicting potential baking quality of wheat flours. By partial least squares regression, flour and dough test variables were able to account for 71-93% of the variation in crumb texture, form ratio and volume of hearth loaves made using optimal mixing and fixed proving times. These protein quality variables were, however, not related to the volume of loaves produced by the CBP using mixing to constant work input and proving to constant height. On the other hand, variation in crumb texture of CBP loaves (54-55%) could be explained by protein quality. The results underline that the choice of baking procedure and loaf characteristics is vital in assessing the protein quality of flours. (C) 2003 Elsevier Ltd. All rights reserved.
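As a generic illustration of the statistical method named above (partial least squares regression relating flour and dough test variables to a loaf characteristic), and not of the paper's own data or variables, a minimal sketch might look like this:

    # Generic PLS regression sketch: predictors X (e.g. strain hardening index, %UPP,
    # mixing properties) against one response y (e.g. hearth loaf volume).
    # Synthetic data only; not the paper's measurements.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(20, 6))                                   # 20 flours, 6 test variables
    y = X @ rng.normal(size=6) + rng.normal(scale=0.5, size=20)    # synthetic response

    pls = PLSRegression(n_components=2)
    pls.fit(X, y)
    print("Proportion of variation explained (R^2):", pls.score(X, y))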
Abstract:
A look is taken here at how the use of implant technology is rapidly diminishing the effects of certain neural illnesses and distinctly increasing the range of abilities of those affected. An indication is given of a number of problem areas in which such technology has already had a profound effect, a key element being the need for a clear interface linking the human brain directly with a computer. In order to assess the possible opportunities, both human and animal studies are reported on. The main thrust of the paper is, however, a discussion of neural implant experimentation linking the human nervous system bi-directionally with the internet. With this in place, neural signals were transmitted to various technological devices to directly control them, in some cases via the internet, and feedback to the brain was obtained from sources such as the fingertips of a robot hand, ultrasonic (extra) sensory input and neural signals directly from another human's nervous system. Consideration is given to the prospects for neural implant technology in the future, both in the short term as a therapeutic device and in the long term as a form of enhancement, including the realistic potential for thought communication, potentially opening up commercial opportunities. Clearly though, an individual whose brain is part human, part machine can have abilities that far surpass those with a human brain alone. Will such an individual exhibit different moral and ethical values from those of a human? If so, what effects might this have on society?
Abstract:
A quasi-optical deembedding technique for characterizing waveguides is demonstrated using wide-band time-resolved terahertz spectroscopy. A transfer function representation is adopted for the description of the signal in the input and output ports of the waveguides. The time-domain responses were discretized and the waveguide transfer function was obtained through a parametric approach in the z-domain, after describing the system with an AutoRegressive with eXogenous input (ARX) model as well as with a state-space model. Prior to the identification procedure, filtering was performed in the wavelet domain to minimize both the signal distortion and the noise propagating in the ARX and subspace models. The optimal filtering procedure used in the wavelet domain for the recorded time-domain signatures is described in detail. The effect of filtering prior to the identification procedures is elucidated with the aid of pole-zero diagrams. Models derived from measurements of terahertz transients in a precision WR-8 waveguide adjustable short are presented.
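A minimal sketch of discrete-time ARX identification by least squares, as a generic illustration of the parametric approach mentioned above (model orders, data and noise level here are arbitrary and not taken from the paper):

    # ARX model: y[k] + a1*y[k-1] + ... + a_na*y[k-na] = b1*u[k-1] + ... + b_nb*u[k-nb] + e[k]
    # Generic least-squares identification sketch on synthetic input/output records.
    import numpy as np

    def fit_arx(u, y, na=2, nb=2):
        """Estimate ARX coefficients (a, b) by least squares from input u and output y."""
        n = max(na, nb)
        rows, targets = [], []
        for k in range(n, len(y)):
            # Regressor: past outputs (negated) and past inputs, most recent first.
            rows.append(np.concatenate([-y[k - na:k][::-1], u[k - nb:k][::-1]]))
            targets.append(y[k])
        theta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
        return theta[:na], theta[na:]

    # Synthetic system: y[k] = 0.5*y[k-1] - 0.2*y[k-2] + 1.0*u[k-1] + 0.3*u[k-2] + noise
    rng = np.random.default_rng(1)
    u = rng.normal(size=500)
    y = np.zeros(500)
    for k in range(2, 500):
        y[k] = 0.5*y[k-1] - 0.2*y[k-2] + 1.0*u[k-1] + 0.3*u[k-2] + 0.01*rng.normal()

    a, b = fit_arx(u, y)
    print("a:", a, "b:", b)   # expected close to [-0.5, 0.2] and [1.0, 0.3]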
Abstract:
A look is taken here at how the use of implant technology is rapidly diminishing the effects of certain neural illnesses and distinctly increasing the range of abilities of those affected. An indication is given of a number of problem areas in which such technology has already had a profound effect, a key element being the need for a clear interface linking the human brain directly with a computer. In order to assess the possible opportunities, both human and animal studies are reported on. The main thrust of the paper is, however, a discussion of neural implant experimentation linking the human nervous system bi-directionally with the internet. With this in place, neural signals were transmitted to various technological devices to directly control them, in some cases via the internet, and feedback to the brain was obtained from, for example, the fingertips of a robot hand, ultrasonic (extra) sensory input and neural signals directly from another human's nervous system. Consideration is given to the prospects for neural implant technology in the future, both in the short term as a therapeutic device and in the long term as a form of enhancement, including the realistic potential for thought communication, potentially opening up commercial opportunities. Clearly though, an individual whose brain is part human, part machine can have abilities that far surpass those with a human brain alone. Will such an individual exhibit different moral and ethical values from those of a human? If so, what effects might this have on society? (C) 2008 Elsevier B.V. All rights reserved.