60 results for Dynamic Input-Output Balance


Relevance:

100.00%

Publisher:

Abstract:

The typical performance of low-density parity-check (LDPC) codes over a general binary-input output-symmetric memoryless channel is investigated using methods of statistical mechanics. The relationship between the free energy in the statistical-mechanics approach and the mutual information used in the information-theory literature is established within a general framework; Gallager and MacKay-Neal codes are studied as specific examples of LDPC codes. It is shown that basic properties of these codes known for particular channels, including their potential to saturate Shannon's bound, hold for general symmetric channels. The binary-input additive-white-Gaussian-noise channel and the binary-input Laplace channel are considered as specific channel models.
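
As a small illustrative sketch (not taken from the paper), the mutual information of a binary-input additive-white-Gaussian-noise channel with equiprobable inputs can be estimated by Monte Carlo; the noise level and sample count below are arbitrary choices.

```python
import numpy as np

def biawgn_mutual_information(sigma, n_samples=200_000, seed=0):
    """Monte Carlo estimate of I(X;Y) for equiprobable BPSK inputs over AWGN."""
    rng = np.random.default_rng(seed)
    x = rng.choice([-1.0, 1.0], size=n_samples)      # binary input
    y = x + sigma * rng.normal(size=n_samples)       # channel output

    def pdf(y, mean):                                 # Gaussian likelihood p(y | x = mean)
        return np.exp(-(y - mean) ** 2 / (2 * sigma ** 2)) / np.sqrt(2 * np.pi * sigma ** 2)

    p_y_given_x = pdf(y, x)
    p_y = 0.5 * (pdf(y, 1.0) + pdf(y, -1.0))          # marginal over the two inputs
    return np.mean(np.log2(p_y_given_x / p_y))        # I(X;Y) in bits per channel use

print(biawgn_mutual_information(sigma=0.8))           # approaches 1 bit as sigma -> 0
```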

Relevance:

100.00%

Publisher:

Abstract:

Mobile and wearable computers present input/output problems due to limited screen space and interaction techniques. When mobile, users typically focus their visual attention on navigating their environment, making visually demanding interface designs hard to operate. This paper presents two multimodal interaction techniques designed to overcome these problems and allow truly mobile, 'eyes-free' device use. The first is a 3D audio radial pie menu that uses head gestures for selecting items. An evaluation of a range of different audio designs showed that egocentric sounds reduced task completion time and perceived annoyance, and allowed users to walk closer to their preferred walking speed. The second is a sonically enhanced 2D gesture recognition system for use on a belt-mounted PDA. An evaluation of the system with and without audio feedback showed users' gestures were more accurate when dynamically guided by audio feedback. These novel interaction techniques demonstrate effective alternatives to visual-centric interface designs on mobile devices.
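
As a toy illustration of the first technique's selection step (the mapping below is an assumption, not the paper's implementation), a head-yaw angle can be quantised onto one of N radial pie-menu items.

```python
def pie_menu_item(head_yaw_deg: float, n_items: int = 8) -> int:
    """Map a head-yaw angle (degrees, 0 = straight ahead) to a radial pie-menu slot.

    Slots are centred every 360/n_items degrees; this is an illustrative mapping only.
    """
    slot_width = 360.0 / n_items
    yaw = head_yaw_deg % 360.0
    return int((yaw + slot_width / 2) // slot_width) % n_items

# e.g. with 8 items, a 95-degree head turn selects item 2
print(pie_menu_item(95.0))
```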

Relevance:

100.00%

Publisher:

Abstract:

We show theoretically and experimentally a mechanism behind the emergence of wide or bimodal protein distributions in biochemical networks with nonlinear input-output characteristics (the dose-response curve) and variability in protein abundance. Large cell-to-cell variation in the nonlinear dose-response characteristics can be beneficial in facilitating two distinct groups of response levels as opposed to a graded response. Under the circumstances that we quantify mathematically, the two distinct responses can coexist within a cellular population, leading to the emergence of a bimodal protein distribution. Using flow cytometry, we demonstrate the appearance of wide distributions in the hypoxia-inducible factor-mediated response network in HCT116 cells. With the help of our theoretical framework, we perform a novel calculation of the magnitude of cell-to-cell heterogeneity in the dose-response obtained experimentally. © 2014 The Author(s). Published by the Royal Society. All rights reserved.
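
A minimal sketch of the mechanism, assuming a steep Hill-type dose-response curve and log-normal cell-to-cell variability in its half-activation constant (both modelling choices are illustrative, not taken from the paper): at a single intermediate dose the population splits into low and high responders, giving a bimodal rather than graded distribution.

```python
import numpy as np

def hill_response(dose, k_half, n_hill, y_max=1.0):
    """Steep (switch-like) dose-response curve; the Hill form is an illustrative choice."""
    return y_max * dose ** n_hill / (k_half ** n_hill + dose ** n_hill)

rng = np.random.default_rng(1)
n_cells = 50_000
dose = 1.0                                           # one intermediate input level
# cell-to-cell variability in the half-activation constant (log-normal, assumed)
k_half = rng.lognormal(mean=0.0, sigma=0.8, size=n_cells)
response = hill_response(dose, k_half, n_hill=8)

# With a steep curve and wide spread in k_half, cells cluster near 0 and near 1,
# so the population histogram is bimodal rather than graded.
hist, edges = np.histogram(response, bins=20, range=(0.0, 1.0))
print(hist)
```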

Relevance:

100.00%

Publisher:

Abstract:

We study a small circuit of coupled nonlinear elements to investigate general features of signal transmission through networks. The small circuit itself is perceived as a building block for larger networks. Individual dynamics and coupling are motivated by neuronal systems: we consider two types of dynamical modes for an individual element, regular spiking and chattering. Each individual element can receive excitatory and/or inhibitory inputs and is subjected to different feedback types (excitatory and inhibitory; forward and recurrent). Both deterministic and stochastic simulations are carried out to study the input-output relationships of these networks. Major results for regular spiking elements include frequency locking, spike rate amplification for strong synaptic coupling, and inhibition-induced spike rate control, which can be interpreted as an output frequency rectification. For chattering elements, spike rate amplification for low frequencies and silencing for large frequencies are characteristic.
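
The sketch below illustrates the kind of input-output relationship studied, using a leaky integrate-and-fire element as a simplified stand-in for the regular-spiking elements in the abstract; all parameter values are assumptions.

```python
import numpy as np

def lif_output_rate(input_current, sim_time=2.0, dt=1e-4,
                    tau=0.02, v_thresh=1.0, v_reset=0.0):
    """Output spike rate of a leaky integrate-and-fire element for a constant input.

    A simplified stand-in for the spiking elements in the abstract; the parameters
    here are illustrative, not taken from the paper.
    """
    v, spikes = 0.0, 0
    for _ in range(int(sim_time / dt)):
        v += dt * (-v / tau + input_current)      # leaky integration of the input
        if v >= v_thresh:                         # threshold crossing -> spike and reset
            v = v_reset
            spikes += 1
    return spikes / sim_time

# Input-output curve: the output rate stays at zero below a threshold current, then grows.
for current in (30.0, 60.0, 90.0, 120.0):
    print(current, lif_output_rate(current))
```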

Relevance:

50.00%

Publisher:

Abstract:

The fundamental failure of current approaches to ontology learning is to view it as a single pipeline with one or more specific inputs and a single static output. In this paper, we present a novel approach to ontology learning which takes an iterative view of knowledge acquisition for ontologies. Our approach is founded on three open-ended resources: a set of texts, a set of learning patterns and a set of ontological triples, and the system seeks to maintain these in equilibrium. As events occur which disturb this equilibrium, actions are triggered to re-establish a balance between the resources. We present a gold-standard-based evaluation of the final output of the system, the intermediate output showing the iterative process, and a comparison of performance using different seed inputs. The results are comparable to existing performance in the literature.
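
A skeletal sketch of the equilibrium-seeking loop described above; the event kinds and helper functions are hypothetical placeholders, not the system's actual interface.

```python
# Three open-ended resources kept in balance by reacting to events.
texts, patterns, triples = set(), set(), set()

def apply_patterns_to_texts():
    """Hypothetical step: run the current learning patterns over the texts to propose triples."""
    return set()          # placeholder for extracted (subject, relation, object) triples

def induce_patterns_from_triples():
    """Hypothetical step: generalise new lexical patterns from accepted triples."""
    return set()          # placeholder for induced patterns

def handle_event(event):
    """React to a disturbance (e.g. a new text) and restore balance between the resources."""
    if event["kind"] == "new_text":
        texts.add(event["payload"])
        triples.update(apply_patterns_to_texts())
        patterns.update(induce_patterns_from_triples())
    elif event["kind"] == "new_seed_triple":
        triples.add(event["payload"])
        patterns.update(induce_patterns_from_triples())

handle_event({"kind": "new_text", "payload": "some corpus document"})
```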

Relevance:

40.00%

Publisher:

Abstract:

This preliminary report describes work carried out as part of work package 1.2 of the MUCM research project. The report is split into two parts: the first part (Sections 1 and 2) summarises the state of the art in emulation of computer models, while the second presents some initial work on the emulation of dynamic models. In the first part, we describe the basics of emulation, introduce the notation and put together the key results for the emulation of models with single and multiple outputs, with or without the use of a mean function. In the second part, we present preliminary results on the chaotic Lorenz 63 model. We look at emulation of a single time step, and repeated application of the emulator for sequential prediction. After some design considerations, the emulator is compared with the exact simulator on a number of runs to assess its performance. Several general issues related to emulating dynamic models are raised and discussed. Current work on the larger Lorenz 96 model (40 variables) is presented in the context of dimension reduction, with results to be provided in a follow-up report. The notation used in this report is summarised in the appendix.
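
A minimal sketch of the dynamic-emulation idea from the second part: build a Gaussian-process emulator of the single-time-step map of Lorenz 63 and apply it repeatedly for sequential prediction. scikit-learn's GP regressor and the design choices below are used purely for illustration; they are not the emulation machinery of the report.

```python
import numpy as np
from scipy.integrate import solve_ivp
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def lorenz63(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

def step(state, dt=0.01):
    """The exact simulator for one time step (the map the emulator approximates)."""
    return solve_ivp(lorenz63, (0.0, dt), state, rtol=1e-8, atol=1e-8).y[:, -1]

# Training design: states along a trajectory, paired with their one-step-ahead images.
states = [np.array([1.0, 1.0, 1.0])]
for _ in range(300):
    states.append(step(states[-1]))
states = np.array(states)
X, Y = states[:-1], states[1:]

gp = GaussianProcessRegressor(kernel=RBF(length_scale=5.0),
                              alpha=1e-6, normalize_y=True).fit(X, Y)

# Sequential prediction: feed the emulator's own output back in as the next input.
state_em, state_true = states[-1].copy(), states[-1].copy()
for k in range(20):
    state_em = gp.predict(state_em.reshape(1, -1))[0]
    state_true = step(state_true)
    print(k, np.linalg.norm(state_em - state_true))   # divergence over repeated application
```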

Relevance:

40.00%

Publisher:

Abstract:

The main theme of research of this project concerns the study of neural networks to control uncertain and non-linear control systems. This involves the control of continuous-time, discrete-time, hybrid and stochastic systems with input, state or output constraints while ensuring good performance. A great part of this project is devoted to opening frontiers between several mathematical and engineering approaches in order to tackle complex but very common non-linear control problems. The objectives are: 1. To design and develop procedures for neural-network-enhanced self-tuning adaptive non-linear control systems; 2. To design, as a general procedure, a neural network generalised minimum variance self-tuning controller for non-linear dynamic plants (integration of neural network mapping with generalised minimum variance self-tuning controller strategies); 3. To develop a software package to evaluate control system performance using Matlab, Simulink and the Neural Network toolbox. An adaptive control algorithm utilising a recurrent network as a model of a partially unknown non-linear plant with unmeasurable state is proposed. It appears that structured recurrent neural networks can provide conveniently parameterised dynamic models for many non-linear systems for use in adaptive control. Properties of static neural networks, which enabled successful design of stable adaptive control in the state feedback case, are also identified. A survey of the existing results is presented which puts them in a systematic framework, showing their relation to classical self-tuning adaptive control and the application of neural control to SISO/MIMO systems. Simulation results demonstrate that the self-tuning design methods may be practically applicable to a reasonably large class of unknown linear and non-linear dynamic control systems.
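
A simplified sketch of neural-network-based self-tuning control in the spirit of objective 2: a linear-in-parameters radial-basis-function network is identified online by recursive least squares and then used to choose the next control input. The benchmark plant, network structure and all constants are assumptions; the project's recurrent-network and generalised-minimum-variance schemes are not reproduced here.

```python
import numpy as np

def plant(y, u):
    """Unknown non-linear plant (used only to generate data; assumed for illustration)."""
    return 0.6 * y + u / (1.0 + y ** 2)

# Radial-basis-function network model, linear in its output weights
centres = np.array([(yc, uc) for yc in np.linspace(-2.0, 2.0, 5)
                             for uc in np.linspace(-2.0, 2.0, 5)])

def features(y, u, width=1.0):
    d2 = np.sum((centres - np.array([y, u])) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * width ** 2))

theta = np.zeros(len(centres))        # network output weights, identified online
P = 100.0 * np.eye(len(centres))      # recursive-least-squares covariance

def rls_update(phi, target, lam=0.99):
    """Self-tuning step: refine the network weights from the latest observation."""
    global theta, P
    gain = P @ phi / (lam + phi @ P @ phi)
    theta = theta + gain * (target - phi @ theta)
    P = (P - np.outer(gain, phi @ P)) / lam

rng = np.random.default_rng(0)
y = 0.0
for _ in range(300):                  # identification phase: random excitation
    u = rng.uniform(-2.0, 2.0)
    y_next = plant(y, u)
    rls_update(features(y, u), y_next)
    y = y_next

reference, u_grid = 1.0, np.linspace(-2.0, 2.0, 201)
for _ in range(30):                   # control phase: one-step-ahead model-based choice
    preds = np.array([features(y, u) @ theta for u in u_grid])
    u = u_grid[np.argmin((preds - reference) ** 2)]
    y_next = plant(y, u)
    rls_update(features(y, u), y_next)   # keep tuning while controlling
    y = y_next
print("output:", round(y, 3), "reference:", reference)
```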

Relevance:

40.00%

Publisher:

Abstract:

Predicting future need for water resources has traditionally been, at best, a crude mixture of art and science. This has prevented the evaluation of water need from being carried out in either a consistent or a comprehensive manner. This inconsistent and somewhat arbitrary approach to water resources planning led to well-publicised premature developments in the 1970s and 1980s, but privatisation of the Water Industry, including creation of the Office of Water Services and the National Rivers Authority in 1989, turned the tide of resource planning to the point where funding of schemes and their justification by the Regulators could no longer be assumed. Furthermore, considerable areas of uncertainty were beginning to enter the debate and complicate the assessment. It was also no longer appropriate to consider that contingencies would continue to lie solely on the demand side of the equation. An inability to calculate the balance between supply and demand may mean an inability to meet standards of service or, arguably worse, an excessive provision of water resources and excessive costs to customers. The United Kingdom Water Industry Research Limited (UKWIR) Headroom project in 1998 provided a simple methodology for the calculation of planning margins. This methodology, although well received, was not, however, accepted by the Regulators as a tool sufficient to promote resource development. This thesis begins by considering the history of water resource planning in the UK, moving on to discuss events following privatisation of the water industry post-1989. The mid section of the research forms the bulk of original work and provides a scoping exercise which reveals a catalogue of uncertainties prevalent within the supply-demand balance. Each of these uncertainties is considered in terms of materiality, scope, and whether it can be quantified within a risk analysis package. Many of the areas of uncertainty identified would merit further research. A workable, yet robust, methodology for evaluating the balance between water resources and water demands by using a spreadsheet-based risk analysis package is presented. The technique involves statistical sampling and simulation such that samples are taken from input distributions on both the supply and demand side of the equation and the imbalance between supply and demand is calculated in the form of an output distribution. The percentiles of the output distribution represent different standards of service to the customer. The model allows dependencies between distributions to be considered, for improved uncertainties to be assessed and for the impact of uncertain solutions to any imbalance to be calculated directly. The method is considered a significant leap forward in the field of water resource planning.
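
A minimal sketch of the risk-analysis technique: sample input distributions on both the supply and demand sides, form the output distribution of the imbalance, and read its percentiles as standards of service. All distributions and figures below are illustrative assumptions, not values from the thesis.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000                                       # Monte Carlo samples

# Illustrative input distributions (all figures are assumptions, in Ml/d)
supply = (rng.normal(510.0, 15.0, n)              # deployable-output uncertainty
          - rng.triangular(5.0, 10.0, 20.0, n))   # outage allowance
demand = (rng.normal(480.0, 12.0, n)              # base-year demand uncertainty
          + rng.normal(8.0, 4.0, n))              # demand-growth uncertainty

imbalance = supply - demand                       # output distribution of the balance

# Percentiles of the output distribution read as levels of service:
# e.g. the 5th percentile is the surplus (or deficit) exceeded in 95% of outcomes.
for p in (5, 25, 50, 75, 95):
    print(f"{p:2d}th percentile headroom: {np.percentile(imbalance, p):6.1f} Ml/d")
print("probability of deficit:", np.mean(imbalance < 0.0))
```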

Relevance:

40.00%

Publisher:

Abstract:

With business incubators deemed a potent infrastructural element for entrepreneurship development, business incubation management practice and performance have received widespread attention. However, despite this surge of interest, scholars have questioned the extent to which business incubation delivers added value. Thus, there is a growing awareness among researchers, practitioners and policy makers of the need for more rigorous evaluation of business incubation output performance. Aligned to this is an increasing demand for benchmarking business incubation input/process performance and highlighting best practice. This paper offers a business incubation assessment framework, which considers input/process and output performance domains with relevant indicators. This tool adds value on different levels. It has been developed in collaboration with practitioners and industry experts and would therefore be relevant and useful to business incubation managers. Once a large enough database of completed questionnaires has been populated on an online platform managed by a coordinating mechanism, such as a business incubation membership association, business incubator managers can reflect on their practices by using this assessment framework to learn their relative position vis-à-vis their peers in each domain. This will enable them to align with best practice in this field. Beyond implications for business incubation management practice, this performance assessment framework would also be useful to researchers and policy makers concerned with business incubation management practice and impact. Future large-scale research could test for construct validity and reliability. Also, discriminant analysis could help link input and process indicators with output measures.

Relevance:

30.00%

Publisher:

Abstract:

We investigate the dependence of Bayesian error bars on the distribution of data in input space. For generalized linear regression models we derive an upper bound on the error bars which shows that, in the neighbourhood of the data points, the error bars are substantially reduced from their prior values. For regions of high data density we also show that the contribution to the output variance due to the uncertainty in the weights can exhibit an approximate inverse proportionality to the probability density. Empirical results support these conclusions.
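
A small numerical sketch of the effect described, assuming a generalised linear regression model with Gaussian basis functions and a Gaussian weight prior (the specific settings are illustrative): near the data, the weight-uncertainty contribution to the output variance falls well below its prior value.

```python
import numpy as np

rng = np.random.default_rng(3)
centres = np.linspace(-5.0, 5.0, 20)

def design(x, width=0.7):
    """Gaussian basis functions phi_j(x)."""
    return np.exp(-(x[:, None] - centres[None, :]) ** 2 / (2 * width ** 2))

alpha, beta = 1.0, 25.0                  # prior precision on weights, noise precision
x_train = rng.normal(0.0, 1.0, 40)       # data concentrated around x = 0
Phi = design(x_train)

S_inv = alpha * np.eye(len(centres)) + beta * Phi.T @ Phi
S = np.linalg.inv(S_inv)                 # posterior weight covariance

x_test = np.linspace(-5.0, 5.0, 11)
Phi_t = design(x_test)
post_var = np.einsum('ij,jk,ik->i', Phi_t, S, Phi_t)                      # phi^T S phi
prior_var = np.einsum('ij,jk,ik->i', Phi_t, np.eye(len(centres)) / alpha, Phi_t)

# In the neighbourhood of the data (around x = 0) the posterior contribution is
# strongly reduced from its prior value; far from the data it stays close to it.
for x, v, pv in zip(x_test, post_var, prior_var):
    print(f"x = {x:5.1f}   posterior var = {v:7.4f}   prior var = {pv:7.4f}")
```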

Relevance:

30.00%

Publisher:

Abstract:

Gaussian processes provide natural non-parametric prior distributions over regression functions. In this paper we consider regression problems where there is noise on the output, and the variance of the noise depends on the inputs. If we assume that the noise is a smooth function of the inputs, then it is natural to model the noise variance using a second Gaussian process, in addition to the Gaussian process governing the noise-free output value. We show that prior uncertainty about the parameters controlling both processes can be handled and that the posterior distribution of the noise rate can be sampled from using Markov chain Monte Carlo methods. Our results on a synthetic data set give a posterior noise variance that approximates the true variance well.
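
A generative sketch of the model structure, with assumed RBF kernels and hyperparameters: one Gaussian process for the noise-free function and a second for the log noise variance. The MCMC inference described in the paper is not shown.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 200)

def rbf_cov(x, lengthscale, variance):
    d = x[:, None] - x[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

# Illustrative kernel choices for the two processes (assumptions, not the paper's settings)
K_f = rbf_cov(x, lengthscale=1.5, variance=1.0) + 1e-8 * np.eye(len(x))
K_g = rbf_cov(x, lengthscale=3.0, variance=1.0) + 1e-8 * np.eye(len(x))

f = rng.multivariate_normal(np.zeros(len(x)), K_f)                 # noise-free function
g = rng.multivariate_normal(np.log(0.05) * np.ones(len(x)), K_g)   # log noise variance
y = f + rng.normal(size=len(x)) * np.exp(0.5 * g)                  # heteroscedastic observations

# The noise level varies smoothly with the input, as the model assumes.
print(y[:5])
```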

Relevance:

30.00%

Publisher:

Abstract:

The need for low bit-rate speech coding is the result of growing demand on the available radio bandwidth for mobile communications, both for military purposes and for the public sector. To meet this growing demand it is required that the available bandwidth be utilized in the most economic way to accommodate more services. Two low bit-rate speech coders have been built and tested in this project. The two coders combine predictive coding with delta modulation, a property which enables them to meet the low bit-rate and good speech-quality requirements simultaneously. To enhance their efficiency, the predictor coefficients and the quantizer step size are updated periodically in each coder. This enables the coders to keep up with changes in the characteristics of the speech signal over time and with changes in the dynamic range of the speech waveform. However, the two coders differ in the method of updating their predictor coefficients. One updates the coefficients once every one hundred sampling periods and extracts the coefficients from the input speech samples. This is known in this project as the Forward Adaptive Coder. Since the coefficients are extracted from the input speech samples, they must be transmitted to the receiver to reconstruct the transmitted speech samples, thus adding to the transmission bit rate. The other updates its coefficients every sampling period, based on information from the output data. This coder is known as the Backward Adaptive Coder. Results of subjective tests showed both coders to be reasonably robust to quantization noise. Both were graded quite good, with the Forward Adaptive Coder performing slightly better, but with a slightly higher transmission bit rate for the same speech quality, than its Backward counterpart. The coders yielded acceptable speech quality at 9.6 kbit/s for the Forward Adaptive Coder and 8 kbit/s for the Backward Adaptive Coder.
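
An illustrative sketch of the delta-modulation core with an adaptive step size, a simplified first-order scheme rather than the forward- or backward-adaptive predictive coders built in the project; all constants and the toy test signal are assumptions.

```python
import numpy as np

def adm_encode_decode(x, step0=0.02, grow=1.5, shrink=0.66,
                      step_min=1e-3, step_max=0.5):
    """Adaptive delta modulation: one bit per sample plus a step size that adapts
    to the signal's dynamic range."""
    step, estimate, prev_bit = step0, 0.0, 0
    bits, reconstruction = [], []
    for sample in x:
        bit = 1 if sample >= estimate else -1
        # step-size adaptation: expand on consecutive equal bits (slope overload),
        # contract on alternation (granular noise)
        step = np.clip(step * (grow if bit == prev_bit else shrink), step_min, step_max)
        estimate += bit * step            # first-order prediction of the next sample
        bits.append(bit)
        reconstruction.append(estimate)
        prev_bit = bit
    return np.array(bits), np.array(reconstruction)

t = np.linspace(0.0, 0.05, 400)                          # 8 kHz-equivalent toy signal
speechlike = 0.6 * np.sin(2 * np.pi * 300 * t) * np.exp(-20 * t)
bits, recon = adm_encode_decode(speechlike)
snr = 10 * np.log10(np.sum(speechlike ** 2) / np.sum((speechlike - recon) ** 2))
print("reconstruction SNR (dB):", round(snr, 1))
```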

Relevance:

30.00%

Publisher:

Abstract:

This thesis describes the design and implementation of a new dynamic simulator called DASP. It is a computer program package written in standard Fortran 77 for the dynamic analysis and simulation of chemical plants. Its main uses include the investigation of a plant's response to disturbances, the determination of the optimal ranges and sensitivities of controller settings, and the simulation of the startup and shutdown of chemical plants. The design and structure of the program and a number of features incorporated into it combine to make DASP an effective tool for dynamic simulation. It is an equation-oriented dynamic simulator, but the model equations describing the user's problem are generated from an in-built model equation library. A combination of the structuring of the model subroutines, the concept of a unit module, and the use of the connection matrix of the problem given by the user has been exploited to achieve this objective. The Executive program has a structure similar to that of a CSSL-type simulator. DASP solves a system of differential equations coupled to nonlinear algebraic equations using an advanced mixed-equation solver. The strategy used in formulating the model equations makes it possible to obtain the steady-state solution of the problem using the same model equations. DASP can handle state and time events in an efficient way, including the modification of the flowsheet. DASP is highly portable, as has been demonstrated by running it on a number of computers with only trivial modifications. The program runs on a microcomputer with 640 kBytes of memory. It is a semi-interactive program, with the bulk of the input data given in pre-prepared data files and communication with the user via an interactive terminal. Using the features built into the package, the user can view or modify the values of any input data, variables and parameters in the model, and modify the structure of the flowsheet of the problem during a simulation session. The program has been demonstrated and verified using a number of example problems.
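
As a toy illustration of the kind of mixed differential-algebraic system an equation-oriented simulator such as DASP solves (the tank-and-valve example and the solution approach below are assumptions, not DASP's own solver):

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

AREA, CV, F_IN = 2.0, 1.2, 1.0           # tank area, valve coefficient, inlet flow

def outflow(h):
    """Algebraic equation 0 = F_out - CV*sqrt(h), solved numerically for F_out."""
    return brentq(lambda f: f - CV * np.sqrt(max(h, 0.0)), -1.0, 100.0)

def tank_rhs(t, state):
    (h,) = state
    f_out = outflow(h)                    # algebraic variable resolved at each step
    return [(F_IN - f_out) / AREA]        # differential equation for the liquid level

sol = solve_ivp(tank_rhs, (0.0, 50.0), [0.1], max_step=0.5)
print("steady-state level:", round(sol.y[0, -1], 3),
      "expected:", round((F_IN / CV) ** 2, 3))
```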

Relevance:

30.00%

Publisher:

Abstract:

Studies have shown that the brand "owner" is very influential in positioning the brand, and that when the brand "owner" ceases his or her active role the brand will be perceived differently by consumers. Heider's Balance Theory (HBT), a cognitive psychological theory, studies the triadic relationships between two persons and an entity and predicts that when a person's original perception of the relationship is disturbed, the person restructures it to reach a new balanced perception. Consequently, this research was undertaken to: conceptualize the brand owner's impact on consumers' brand perception; test the applicability of both the static and dynamic predictions of Heider's Balance Theory to the brand owner-consumer-brand relation (OCB); construct and test a model of the brand owner-consumer-brand relation; and examine whether personality has an influence on OCB. A discovery-oriented approach was taken to understand the selected market segment, the ready-to-wear and diffusion lines of international designer labels. A Chinese Brand Personality Scale, a fashion proneness scale, and hedonic and utilitarian shopping scales were developed and validated. 51 customers were surveyed. Both the traditional and extended methods used with the Balance Theory were employed in this study. Responses to a liked brand were used to test and develop the model, while those for a disliked brand were used for testing and confirmation. A "what if" experimental approach was employed to test the applicability of the dynamic HBT predictions in the OCB Model. The hypothesized OCB Model has been tested and validated. Consumers have been found to have separate views of the brand and the brand owner, and their responses to contrasting ethical and non-ethical news about the brand owner differ. Personality has been found to have an influence, and two personality-adapted models have been tested and validated. The actual results go beyond the predictions of the Balance Theory. A dominant triple-positive balance mode, a dominant negative balance mode, and a mode of extreme antipathy have been found. It has been found that not all balanced modes are good for the brand. Contrary to Heider's findings, liking alone may not necessarily lead to a unit relation in the OCB Model.
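
Heider's balance rule itself is simple to state computationally: a sentiment triad is balanced when the product of its three relation signs is positive. The sketch below applies this to the owner-consumer-brand (OCB) triad; the example sentiments are hypothetical.

```python
def triad_is_balanced(owner_consumer: int, consumer_brand: int, owner_brand: int) -> bool:
    """Heider's balance rule: a triad is balanced when the product of the three
    relation signs (+1 = positive, -1 = negative) is positive."""
    return owner_consumer * consumer_brand * owner_brand > 0

# A consumer who dislikes the owner (-1) but likes the brand (+1), while the owner is
# positively linked to the brand (+1), sits in an imbalanced triad and is predicted
# to restructure one of the relations.
print(triad_is_balanced(-1, +1, +1))   # False -> imbalanced
```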

Relevance:

30.00%

Publisher:

Abstract:

This thesis describes the research carried out on the development of a novel hardwired tactile sensing system tailored to the application of a next generation of surgical robotic and clinical devices, namely a steerable endoscope with tactile feedback and a surface plate for patient posture and balance. Two case studies are examined. The first is a one-dimensional sensor for the steerable endoscope, retrieving shape and 'touch' information. The second is a two-dimensional surface which interprets the three-dimensional motion of a contacting moving load. This research can be used to retrieve information from a distributive tactile sensing surface of a different configuration, and can interpret both dynamic and static disturbances. This novel approach to sensing has the potential to discriminate contact and palpation in minimally invasive surgery (MIS) tools, and posture and balance in patients. The hardwired technology uses an embedded system based on Field Programmable Gate Arrays (FPGAs) as the platform to perform the sensory signal processing in real time. High-speed, robust operation is an advantage of this system, leading to versatile applications involving dynamic real-time interpretation as described in this research. In this research the sensory signal processing uses neural networks to derive information from input patterns from the contacting surface. Three neural network architectures, namely single, multiple and cascaded, were introduced in an attempt to find the optimum solution for discrimination of the contacting outputs. These architectures were modelled and implemented on the FPGA. With the recent introduction of modern digital design flows and synthesis tools that essentially take a high-level sensory processing behaviour specification for a design, fast prototyping of the neural network functions can be achieved easily. This thesis outlines the challenges of implementing these networks and verifying their performance.
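
As an illustrative sketch of the distributive-sensing principle (the beam model, sensor layout and network below are assumptions, not the surface or FPGA architecture built in the thesis): a small neural network learns to map a handful of deflection readings to the position of a contacting load.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
L, sensor_x = 1.0, np.array([0.2, 0.4, 0.6, 0.8])    # span and sensor positions (assumed)

def beam_deflection(load_pos, x):
    """Deflection of a simply supported beam under a unit point load at load_pos
    (standard beam formula, with the stiffness constant absorbed into the load)."""
    a, b = load_pos, L - load_pos
    y = np.where(x <= a,
                 b * x * (L ** 2 - b ** 2 - x ** 2),
                 a * (L - x) * (2 * L * x - a ** 2 - x ** 2))
    return y / (6 * L)

# Synthetic training data: sensor readings for loads at random positions
load_positions = rng.uniform(0.1, 0.9, 2000)
readings = 1000.0 * np.array([beam_deflection(p, sensor_x) for p in load_positions])
readings += rng.normal(0.0, 0.1, readings.shape)       # sensor noise (arbitrary scale)

net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
net.fit(readings, load_positions)

test = 1000.0 * beam_deflection(0.35, sensor_x).reshape(1, -1)
print("estimated load position:", net.predict(test)[0])   # should be near 0.35
```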