924 results for Input and outputs
Abstract:
The statistical analysis of compositional data should be carried out using logratios of parts, which are difficult to handle correctly in standard statistical packages. For this reason a freeware package, named CoDaPack, was created. This software implements most of the basic statistical methods suitable for compositional data. In this paper we describe the new version of the package, now called CoDaPack3D. It is developed in Visual Basic for Applications (associated with Excel©), Visual Basic and OpenGL, and it is oriented towards users with minimal computing knowledge, with the aim of being simple and easy to use. This new version includes new graphical output in 2D and 3D; these outputs can be zoomed and, in 3D, rotated. A customization menu is included and outputs can be saved in JPEG format. This new version also includes interactive help, and all dialog windows have been improved to facilitate use. To use CoDaPack one opens Excel© and enters the data in a standard spreadsheet, organized as a matrix where Excel© rows correspond to observations and columns to parts. The user executes macros that return numerical or graphical results. There are two kinds of numerical results, new variables and descriptive statistics, and both appear on the same sheet. Graphical output appears in independent windows. The present version has 8 menus, with a total of 38 submenus which, after some dialogue, directly call the corresponding macro. The dialogues ask the user to input the variables and further parameters needed, as well as where to put the results. The web site http://ima.udg.es/CoDaPack hosts this freeware package; only Microsoft Excel© under Microsoft Windows© is required to run the software. Keywords: compositional data analysis, software
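To make the logratio idea concrete, here is a minimal Python sketch (not part of CoDaPack itself, which runs as Excel© macros) of the centered logratio (clr) transform commonly used in compositional data analysis:

```python
# Minimal sketch of the centered logratio (clr) transform that underlies
# logratio analysis of compositional data (illustration only).
import numpy as np

def clr(composition):
    """Centered logratio transform of a composition (parts of a whole).

    Each part is divided by the geometric mean of all parts before taking
    logs, so the transformed values live in ordinary real space where
    standard statistical methods apply.
    """
    x = np.asarray(composition, dtype=float)
    g = np.exp(np.mean(np.log(x)))  # geometric mean of the parts
    return np.log(x / g)

# Example: a 3-part composition summing to 1 (e.g. sand/silt/clay fractions)
print(clr([0.6, 0.3, 0.1]))
```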
Abstract:
Context-aware multimodal interactive systems aim to adapt to the needs and behavioural patterns of users and offer a way forward for enhancing the efficacy and quality of experience (QoE) in human-computer interaction. The various modalities that contribute to such systems each provide a specific uni-modal response that is integratively presented as a multi-modal interface capable of interpreting multi-modal user input and responding to it appropriately through dynamically adapted multi-modal interactive flow management. This paper presents an initial background study in the context of the first phase of a PhD research programme in the area of optimisation of data fusion techniques to serve multimodal interactive systems, their applications and requirements.
Abstract:
The main inputs, outputs and transfers of potassium (K) in soils and swards under typical south-west England conditions were determined during 1999/00 and 2000/01 to establish soil and field gate K budgets under different fertilizer nitrogen (N) (0 and 280 kg ha⁻¹ yr⁻¹) and drainage (undrained and drained) treatments. Plots receiving fertilizer N also received farmyard manure (FYM). Soil K budgets ranged, on average for the two years, from -5 (+N, drained) to +9 (no N, undrained) kg K ha⁻¹ yr⁻¹, and field gate budgets from +23 (+N, drained) to +89 (+N, undrained). The main input to the soil K budgets was fertilizer application (65%) and the main output was plant uptake (93%). Animals had a minor effect on K export but a major impact on K recycling. Nitrogen fertilizer application and drainage increased K uptake by the grass and, with it, the efficiency of K use. They also depleted easily available soil K, which could be associated with smaller K losses by leaching.
Abstract:
A wide variety of exposure models are currently employed for health risk assessments. Individual models have been developed to meet the chemical exposure assessment needs of Government, industry and academia. These existing exposure models can be broadly categorised according to the following types of exposure source: environmental, dietary, consumer product, occupational, and aggregate and cumulative. Aggregate exposure models consider multiple exposure pathways, while cumulative models consider multiple chemicals. In this paper each of these basic types of exposure model is briefly described, along with its inherent strengths and weaknesses, with the UK as a case study. Examples are given of specific exposure models that are currently used, or that have the potential for future use, and key differences in the modelling approaches adopted are discussed. The use of exposure models is currently fragmentary in nature: specific organisations with exposure assessment responsibilities tend to use a limited range of models, and the modelling techniques adopted in current exposure models have evolved along distinct lines for the various types of source. In fact, different organisations may be using different models for very similar exposure assessment situations. This lack of consistency between exposure modelling practices can make the exposure assessment process harder to understand, can lead to inconsistency between organisations in how critical modelling issues are addressed (e.g. variability and uncertainty), and has the potential to communicate mixed messages to the general public. Further work should be conducted to integrate the various approaches and models, where possible and where regulatory remits allow, to achieve a coherent and consistent exposure modelling process. We recommend the development of an overall framework for exposure and risk assessment with common approaches and methodology, a screening tool for exposure assessment, collection of better input data, probabilistic modelling, validation of model input and output, and a closer working relationship between scientists, policy makers and staff from different Government departments. A much increased effort is required in the UK to address these issues. The result will be a more robust, transparent, valid and more comparable exposure and risk assessment process.
Abstract:
The paper reports the findings of a study designed to consider the impact of the adoption of Bt cotton on markets, businesses, and institutional arrangements in India. Given that evidence to date suggests that widespread adoption of Bt cotton by farmers is likely to increase production, this study aims to assess possible implications for markets (access to inputs, prices of inputs and outputs, etc.) and local industries, and to identify potential winners and losers. The results suggest that the release of Bt hybrids has had impacts on the cotton industry, and so far these are most noticeable "upstream" (i.e., among input suppliers), where companies are rapidly moving away from the sale of bollworm insecticide and attempting to sell Bt seeds. Seed companies are looking for partnerships with Monsanto, the owner of the Bt gene. One reason companies are keen to move away from insecticide is that it lets them avoid supplying credit to their customers: seed is not normally purchased on credit, whereas insecticide is. Issues for companies "downstream" (gins, textile manufacturers) relate more to the better quality of Bt cotton and the need for adequate segregation of Bt and non-Bt cotton.
Abstract:
The use of special units for logarithmic ratio quantities is reviewed. The neper is used with a natural logarithm (logarithm to the base e) to express the logarithm of the amplitude ratio of two pure sinusoidal signals, particularly in the context of linear systems where it is desired to represent the gain or loss in amplitude of a single-frequency signal between the input and output. The bel, and its more commonly used submultiple, the decibel, are used with a decadic logarithm (logarithm to the base 10) to measure the ratio of two power-like quantities, such as a mean square signal or a mean square sound pressure in acoustics. Thus two distinctly different quantities are involved. In this review we define the quantities first, without reference to the units, as is standard practice in any system of quantities and units. We show that two different definitions of the quantity power level, or logarithmic power ratio, are possible. We show that this leads to two different interpretations for the meaning and numerical values of the units bel and decibel. We review the question of which of these alternative definitions is actually used, or is used by implication, by workers in the field. Finally, we discuss the relative advantages of the alternative definitions.
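For reference, the conventional definitions under discussion can be written compactly; note that the paper's point is that alternative definitions of the power level are possible, so the following shows only the conventional choice:

```latex
% Neper: natural logarithm of an amplitude (field-quantity) ratio
L = \ln\!\left(\frac{A_1}{A_2}\right)\ \mathrm{Np}

% Decibel: decadic logarithm of a ratio of power-like quantities
L = 10\,\log_{10}\!\left(\frac{P_1}{P_2}\right)\ \mathrm{dB}

% With P \propto A^2, the two units are related by
1\ \mathrm{Np} = \frac{20}{\ln 10}\ \mathrm{dB} \approx 8.686\ \mathrm{dB}
```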
Abstract:
This paper reviews the economic framework for the delivery of livestock services to the poor. It is argued that the demand for livestock products is likely to increase rapidly and the ability of the poor to participate in the opportunities presented by this growth is linked critically to the availability of good service support, both on the input and output side. Governments therefore have a responsibility to supply the necessary public goods (including the institutions and legal frameworks), and the market infrastructure for facilitating the emergence of efficient markets for livestock services. The paper further argues that the dynamics of public policy in developing countries are much more complex than the simple application of economic logic. It is the larger political economy that often dictates policy choices. It is therefore important to integrate political economy and governance issues into the economic debate on livestock service delivery. The paper also reviews the context in which the markets for livestock services will need to function. Different countries are facing very different sets of issues, and the identification of possible interventions in livestock service markets would require careful field research and analysis. In this context, the paper suggests the elements of a research agenda for the next few years.
Abstract:
The relationships between wheat protein quality and the baking properties of 20 flour samples were studied for two breadmaking processes: a hearth bread test and the Chorleywood Bread Process (CBP). The strain hardening index obtained from dough inflation measurements, the proportion of unextractable polymeric protein, and mixing properties were among the variables found to be good indicators of protein quality and suitable for predicting the potential baking quality of wheat flours. By partial least squares regression, flour and dough test variables were able to account for 71-93% of the variation in crumb texture, form ratio and volume of hearth loaves made using optimal mixing and fixed proving times. These protein quality variables were, however, not related to the volume of loaves produced by the CBP using mixing to constant work input and proving to constant height. On the other hand, part of the variation in crumb texture of CBP loaves (54-55%) could be explained by protein quality. The results underline that the choice of baking procedure and loaf characteristics is vital in assessing the protein quality of flours.
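As a hedged illustration of the partial least squares approach mentioned above (the data and variable roles below are synthetic placeholders, not the study's measurements):

```python
# Sketch of PLS regression relating dough-test variables to a baking
# outcome; all numbers here are synthetic placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
# 20 flour samples x 3 hypothetical predictors (e.g. strain hardening
# index, % unextractable polymeric protein, a mixing property).
X = rng.normal(size=(20, 3))
# Hypothetical response (e.g. loaf volume) with some noise.
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.3, size=20)

pls = PLSRegression(n_components=2)
pls.fit(X, y)
print("R^2:", pls.score(X, y))  # proportion of response variance explained
```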
Abstract:
A quasi-optical de-embedding technique for characterizing waveguides is demonstrated using wide-band time-resolved terahertz spectroscopy. A transfer function representation is adopted to describe the signal at the input and output ports of the waveguides. The time-domain responses were discretized and the waveguide transfer function was obtained through a parametric approach in the z-domain, after describing the system with an AutoRegressive with eXogenous input (ARX) model as well as with a state-space model. Prior to the identification procedure, filtering was performed in the wavelet domain to minimize both signal distortion and the noise propagating into the ARX and subspace models. The optimal filtering procedure used in the wavelet domain for the recorded time-domain signatures is described in detail. The effect of filtering prior to the identification procedures is elucidated with the aid of pole-zero diagrams. Models derived from measurements of terahertz transients in a precision WR-8 waveguide adjustable short are presented.
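The ARX step lends itself to a compact illustration. The following Python sketch (orders and data are assumptions, not the paper's settings) fits a one-step-ahead ARX predictor by ordinary least squares:

```python
# Minimal ARX identification sketch: y[k] is regressed on past outputs
# and past inputs, and the coefficients are found by least squares.
import numpy as np

def fit_arx(u, y, na=2, nb=2):
    n = max(na, nb)
    rows = [
        [y[k - i] for i in range(1, na + 1)] +
        [u[k - i] for i in range(1, nb + 1)]
        for k in range(n, len(y))
    ]
    theta, *_ = np.linalg.lstsq(np.array(rows), y[n:], rcond=None)
    return theta  # predictor coefficients: past y's first, then past u's

# Synthetic input/output record from a known system, to check the estimate
rng = np.random.default_rng(1)
u = rng.normal(size=200)
y = np.zeros(200)
for k in range(2, 200):
    y[k] = 0.6 * y[k - 1] - 0.2 * y[k - 2] + 0.5 * u[k - 1] + 0.1 * u[k - 2]
print(fit_arx(u, y))  # approx [0.6, -0.2, 0.5, 0.1]
```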
Abstract:
A look is taken here at how the use of implant technology is rapidly diminishing the effects of certain neural illnesses and distinctly increasing the range of abilities of those affected. An indication is given of a number of problem areas in which such technology has already had a profound effect, a key element being the need for a clear interface linking the human brain directly with a computer. In order to assess the possible opportunities, both human and animal studies are reported on. The main thrust of the paper is, however, a discussion of neural implant experimentation linking the human nervous system bi-directionally with the internet. With this in place, neural signals were transmitted to various technological devices to directly control them, in some cases via the internet, and feedback to the brain was obtained from, for example, the fingertips of a robot hand, ultrasonic (extra) sensory input, and neural signals directly from another human's nervous system. Consideration is given to the prospects for neural implant technology in the future, both in the short term as a therapeutic device and in the long term as a form of enhancement, including the realistic potential for thought communication, potentially opening up commercial opportunities. Clearly though, an individual whose brain is part human, part machine can have abilities that far surpass those with a human brain alone. Will such an individual exhibit different moral and ethical values from those of a human? If so, what effects might this have on society?
Abstract:
A novel Neuropredictive Teleoperation (NPT) scheme is presented. The design results from two key ideas: the exploitation of the measured or estimated neural input to the human arm, or its electromyograph (EMG), as the system input, and the employment of a predictor of the arm movement, based on this neural signal and an arm model, to compensate for time delays in the system. Although a multitude of such models, as well as measuring devices for the neural signals and the EMG, have been proposed, current telemanipulator research has considered only highly simplified arm models. In the present design, the bilateral constraint that the master and slave are simultaneously compliant to each other's state (equal positions and forces) is abandoned, thus obtaining a simple-to-analyze succession of only locally controlled modules, and robustness to time delays of up to 500 ms. The proposed design was inspired by well-established physiological evidence that the brain, rather than controlling the movement on-line, programs the arm with an action plan of a complete movement, which is then executed largely in open loop, regulated only by local reflex loops. As a model of the human arm, the well-established Stark model is employed, whose mathematical representation is modified to make it suitable for an engineering application. The proposed scheme is, however, valid for any arm model. BIBO-stability and passivity results for a variety of local control laws are reported. Simulation results and comparisons with traditional designs also highlight the advantages of the proposed design.
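The delay-compensation idea can be sketched as follows; this is only an illustration, with a generic second-order arm model standing in for the Stark model and all parameters hypothetical:

```python
# Illustrative delay compensation: run the arm model forward over the
# delay horizon, driven by the latest neural-drive estimate.
import numpy as np

dt = 0.001                    # 1 ms integration step
delay_steps = 500             # compensate a 500 ms delay

wn, zeta = 8.0, 0.7           # hypothetical arm dynamics parameters

def arm_step(state, drive):
    """One Euler step of x'' + 2*zeta*wn*x' + wn^2*x = wn^2*drive."""
    x, v = state
    a = wn**2 * (drive - x) - 2.0 * zeta * wn * v
    return np.array([x + dt * v, v + dt * a])

def predict_ahead(state, drive, steps):
    """Predict the arm state one delay interval ahead, holding the
    neural drive constant over the prediction horizon."""
    s = np.asarray(state, dtype=float)
    for _ in range(steps):
        s = arm_step(s, drive)
    return s

print(predict_ahead([0.0, 0.0], 1.0, delay_steps))  # predicted [pos, vel]
```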
Abstract:
Purpose – The purpose of this paper is to investigate the effect of choices of model structure and scale in development viability appraisal. The paper addresses two questions concerning the application of development appraisal techniques to viability modelling within the UK planning system. The first relates to the extent to which, given intrinsic input uncertainty, the choice of model structure significantly affects model outputs. The second concerns the extent to which, given intrinsic input uncertainty, the level of model complexity significantly affects model outputs. Design/methodology/approach – Monte Carlo simulation procedures are applied to a hypothetical development scheme in order to measure the effects of model aggregation and structure on model output variance. Findings – It is concluded that, given the particular scheme modelled and unavoidably subjective assumptions of input variance, simple and simplistic models may produce outputs similar to those of more robust and disaggregated models. Evidence is found of equifinality in the outputs of a simple, aggregated model of development viability relative to more complex, disaggregated models. Originality/value – Development viability appraisal has become increasingly important in the planning system. Consequently, the theory, application and outputs of development appraisal are under intense scrutiny from a wide range of users. However, there has been very little published evaluation of viability models. This paper contributes to the limited literature in this area.
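A minimal sketch of the Monte Carlo procedure described above (the residual valuation structure and every figure below are hypothetical placeholders, not the paper's scheme):

```python
# Monte Carlo propagation of input uncertainty through a simple
# residual land value model; all inputs are synthetic assumptions.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

gdv = rng.normal(10_000_000, 1_000_000, n)   # gross development value
build = rng.normal(6_000_000, 600_000, n)    # construction cost
fees = 0.10 * build                          # professional fees
profit = 0.20 * gdv                          # developer's return

rlv = gdv - build - fees - profit            # residual land value

print("mean RLV:", round(rlv.mean()))
print("std RLV: ", round(rlv.std()))         # output variance from input variance
```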
Abstract:
The hydrological balance of the Black Sea is governed by riverine input and by the exchange with the Mediterranean Sea. A speleothem record from a cave in northern Turkey that tracks the isotopic signature of Black Sea surface water suggests an open connection to the Mediterranean Sea in at least twelve periods in the past 670,000 years.
Abstract:
An important constraint on how hemodynamic neuroimaging signals such as fMRI can be interpreted in terms of the underlying evoked activity is an understanding of the neurovascular coupling mechanisms that actually generate hemodynamic responses. The predominant view at present is that the hemodynamic response correlates most strongly with synaptic input and subsequent neural processing rather than with spiking output. It is still not clear whether input or processing is more important in the generation of hemodynamic responses. To investigate this, we measured the hemodynamic and neural responses to electrical whisker pad stimuli in rat whisker barrel somatosensory cortex both before and after local cortical injections of the GABAA agonist muscimol. Muscimol would not be expected to affect the thalamocortical input into the cortex but would inhibit subsequent intra-cortical processing. Before muscimol infusion, whisker stimuli elicited neural and accompanying hemodynamic responses consistent with those reported previously. Following infusion of muscimol, although the temporal profile of the neural response to each pulse of the stimulus train was similar, the average response was reduced in magnitude by ∼79% compared to that elicited pre-infusion. The whisker-evoked hemodynamic responses were reduced by a commensurate magnitude, suggesting that, although the neurovascular coupling relationships were similar for synaptic input and for cortical processing, the magnitude of the overall response is dominated by processing rather than by the thalamocortical input alone.