43 results for accommodate

in Aston University Research Archive


Relevance: 10.00%

Abstract:

Neural networks are usually curved statistical models. They do not have finite-dimensional sufficient statistics, so on-line learning on the model itself inevitably loses information. In this paper we propose a new scheme for training curved models, inspired by the ideas of ancillary statistics and adaptive critics. At each point estimate, an auxiliary flat model (exponential family) is built to locally accommodate both the usual statistic (tangent to the model) and an ancillary statistic (normal to the model). The auxiliary model plays a role in determining credit assignment analogous to that played by an adaptive critic in solving temporal problems. The method is illustrated with the Cauchy model, and the algorithm is proved to be asymptotically efficient.
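
The paper's auxiliary-model scheme is not reproduced here, but the baseline it improves on can be sketched: plain on-line maximum-likelihood (stochastic score ascent) for the Cauchy location model, which keeps only the current point estimate and so discards information at every step. The step-size schedule and function names below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def cauchy_score(x, theta):
    # d/d(theta) of the log-likelihood of a unit-scale Cauchy
    d = x - theta
    return 2.0 * d / (1.0 + d * d)

def online_cauchy_mle(samples, theta0=0.0, lr=0.5):
    # Plain on-line score ascent with a 1/t Robbins-Monro step size;
    # the curved model retains only theta, so information is lost each step.
    theta = theta0
    for t, x in enumerate(samples, start=1):
        theta += (lr / t) * cauchy_score(x, theta)
    return theta

rng = np.random.default_rng(0)
samples = 3.0 + rng.standard_cauchy(10_000)   # true location 3.0
print(online_cauchy_mle(samples))             # should approach 3.0
```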

Relevance: 10.00%

Abstract:

Received wisdom has it that positive polarity items such as someone are incompatible with negation (?*Someone didn't come). Yet negative contexts are attested with such items not only in their specific indefinite reading (e.g. There's someone who didn't come), but also in their non-specific reading (It isn't the case that someone came). It is the non-specific reading of the indefinite quelqu'un as subject of a negative verb phrase that is analysed in the present paper. On the basis of a corpus of attested cases, it demonstrates that polemic contrast is the crucial condition for the interpretation under consideration. Because quelqu'un is included within a presupposed proposition that is rejected as a whole by negation, negative contexts can accommodate an item which does not normally yield the interpretations negation does. Interpretation is thus presented as a process of mutual adjustment between the contextual readings allowed for by items, readings which can be modalised by discursive values.

Relevance: 10.00%

Abstract:

We investigate the performance of error-correcting codes, where the code word comprises products of K bits selected from the original message, and decoding is carried out utilizing a connectivity tensor with C connections per index. Shannon's bound for the channel capacity is recovered for large K and zero temperature when the code rate K/C is finite. Close-to-optimal error-correcting capability is obtained for finite K and C. We examine the finite-temperature case to assess the use of simulated annealing for decoding and extend the analysis to accommodate other types of noisy channels.
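
As a rough illustration of the encoding step described above, the following sketch maps message bits to ±1 spins and forms each codeword component as a product of K randomly chosen spins; the random connectivity and the particular values of K and C are illustrative assumptions.

```python
import numpy as np

def encode(message_bits, K, num_checks, rng):
    # Map bits 0/1 to spins +1/-1, then take products of K spins per check.
    spins = 1 - 2 * np.asarray(message_bits)
    idx = np.array([rng.choice(len(spins), size=K, replace=False)
                    for _ in range(num_checks)])
    return spins[idx].prod(axis=1), idx

rng = np.random.default_rng(1)
N, K, C = 100, 3, 6
message = rng.integers(0, 2, size=N)
# C connections per message bit on average implies N*C/K checks
codeword, idx = encode(message, K, N * C // K, rng)
```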

Relevance: 10.00%

Abstract:

We investigate the performance of parity check codes using the mapping onto spin glasses proposed by Sourlas. We study codes where each parity check comprises products of K bits selected from the original digital message, with exactly C parity checks per message bit. We show, using the replica method, that these codes saturate Shannon's coding bound for K→∞ when the code rate K/C is finite. We then examine the finite-temperature case to assess the use of simulated annealing methods for decoding, study the performance of the finite-K case, and extend the analysis to accommodate different types of noisy channels. The analogy between statistical physics methods and decoding by belief propagation is also discussed.
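
The simulated-annealing decoder assessed here can be sketched as follows: the received (possibly corrupted) check values define a spin-glass energy, and Metropolis updates at a falling temperature search for the minimising spin configuration. The linear cooling schedule and step count are illustrative assumptions, and `idx` is the check connectivity from the encoding sketch above.

```python
import numpy as np

def anneal_decode(received, idx, N, steps=20_000, T0=2.0, seed=2):
    # Minimise E(s) = -sum_c J_c * prod_{i in c} s_i, where J_c is the
    # received +/-1 value of check c and idx[c] lists its K bit indices.
    rng = np.random.default_rng(seed)
    s = rng.choice([-1, 1], size=N)
    # checks touching each bit, for cheap local energy updates
    touching = [np.where((idx == i).any(axis=1))[0] for i in range(N)]
    for t in range(steps):
        T = T0 * (1 - t / steps) + 1e-3          # linear cooling schedule
        i = rng.integers(N)
        # flipping s_i negates every check product it appears in
        local = sum(received[c] * s[idx[c]].prod() for c in touching[i])
        dE = 2.0 * local                          # energy change of the flip
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            s[i] = -s[i]
    return s
```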

Relevance: 10.00%

Abstract:

We have recently developed a principled approach to interactive non-linear hierarchical visualization [8] based on the Generative Topographic Mapping (GTM). Hierarchical plots are needed when a single visualization plot is not sufficient (e.g. when dealing with large quantities of data). In this paper we extend our system by giving the user a choice of initializing the child plots of the current plot in either interactive or automatic mode. In the interactive mode the user interactively selects "regions of interest" as in [8], whereas in the automatic mode an unsupervised minimum message length (MML)-driven construction of a mixture of GTMs is used. The latter is particularly useful when the plots are covered with dense clusters of highly overlapping data projections, making it difficult to use the interactive mode. Such a situation often arises when visualizing large data sets. We illustrate our approach on a data set of 2300 18-dimensional points and mention an extension of our system to accommodate discrete data types.
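
The MML-driven construction of a mixture of GTMs is not reproduced here; as a hedged stand-in, the sketch below selects the size of a plain Gaussian mixture by minimising BIC, a crude proxy for an MML-style code-length criterion. The 18-dimensional synthetic data merely echoes the dimensionality mentioned above.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def select_mixture(X, max_components=10, seed=0):
    # Fit mixtures of increasing size; keep the lowest-scoring model
    # (BIC here stands in for the MML code-length criterion).
    best, best_score = None, np.inf
    for k in range(1, max_components + 1):
        gm = GaussianMixture(n_components=k, random_state=seed).fit(X)
        score = gm.bic(X)
        if score < best_score:
            best, best_score = gm, score
    return best

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 0.5, size=(200, 18)) for m in (0.0, 3.0, 6.0)])
print(select_mixture(X).n_components)   # typically recovers 3 components
```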

Relevance: 10.00%

Abstract:

Recent discussion of the knowledge-based economy draws increasing attention to the role that the creation and management of knowledge play in economic development. Development of human capital, the principal mechanism for knowledge creation and management, becomes a central issue for policy-makers and practitioners at the regional as well as the national level. Facing competition both within and across nations, regional policy-makers view human capital development as a key to strengthening the position of their economies in the global market. Against this background, the aim of this study is to go some way towards answering the question of whether, and how, investment in education and vocational training at the regional level provides these territorial units with comparative advantages.

The study reviews the literature in economics and economic geography on economic growth (Chapter 2). In the growth-model literature, human capital has gained increased recognition as a key production factor alongside physical capital and labour. Although they leave technical progress as an exogenous factor, neoclassical Solow-Swan models have improved their estimates through the inclusion of human capital. In contrast, endogenous growth models place investment in research at centre stage in accounting for technical progress. As a result, they often focus upon research workers, who embody high-order human capital, as a key variable in their framework. An open question is how human capital facilitates economic growth: is it the level of its stock or its accumulation that influences the rate of growth? In addition, these economic models are criticised in the economic geography literature for their failure to consider spatial aspects of economic development, and particularly for their lack of attention to tacit knowledge and the urban environments that facilitate the exchange of such knowledge.

Our empirical analysis of European regions (Chapter 3) shows that investment by individuals in human capital formation has distinct patterns. Regions with a higher level of investment in tertiary education tend to have a larger concentration of information and communication technology (ICT) sectors (including the provision of ICT services and the manufacture of ICT devices and equipment) and research functions. Not surprisingly, regions with major metropolitan areas where higher education institutions are located show a high enrolment rate for tertiary education, suggesting a possible link to the demand from high-order corporate functions located there. Furthermore, the rate of human capital development (at the level of vocational upper secondary education) appears to be significantly associated with the level of entrepreneurship in emerging industries such as ICT-related services and ICT manufacturing, whereas no such association is found with traditional manufacturing industries. In general, a high level of investment by individuals in tertiary education is found in those regions that accommodate high-tech industries and high-order corporate functions such as research and development (R&D). These functions are supported by the urban infrastructure and public science base, which facilitate the exchange of tacit knowledge. Such regions also enjoy a low unemployment rate. However, the existing stock of human and physical capital in regions with a high level of urban infrastructure does not lead to a high rate of economic growth.

Our empirical analysis demonstrates that the rate of economic growth is determined by the accumulation of human and physical capital, not by the level of their existing stocks. We found no significant effects of scale that would favour regions with a larger stock of human capital. The primary policy implication of our study is that, in order to facilitate economic growth, education and training need to supply human capital at a faster pace than simply replenishing it as it disappears from the labour market. Given the significant impact of high-order human capital (such as business R&D staff in our case study), as well as the increasingly fast pace of technological change that makes human capital obsolete, a concerted effort needs to be made to facilitate its continuous development.

Relevance: 10.00%

Abstract:

The Global Partnership for Effective Diabetes Management was established in 2004 to provide practical guidance on improving glycaemic control for people with type 2 diabetes. Those recommendations have been updated to take account of recent trials assessing the effects of intensive glucose control. We continue to emphasise the importance of early and sustained glycaemic control, aiming for HbA1c 6.5-7% wherever safe and appropriate. Individualisation of targets and of the management process is strongly encouraged, to accommodate patient circumstances and to avoid hypoglycaemia. Prompt introduction of combinations of agents is suggested when monotherapy is inadequate. Treatments will preferably address the underlying pathophysiology of type 2 diabetes and integrate within a wider programme of care which also aims to reduce modifiable cardiovascular risk factors and better equip patients in the self-management of their condition.

Relevance: 10.00%

Abstract:

Knowledge management (KM) is a developing field that focuses on harnessing knowledge for use by a person or community. However, most KM research focuses on improving decision-making capacity in business communities, neglecting applications in wider society and in non-decision-making activities. This paper explores the potential of KM for rural communities, specifically those that want to preserve their social history and collective memories (what we call heritage) to enrich the lives of others. In KM terms, this is a task of accumulating and recording knowledge to enable its retention for future use. We report a case study of Cardrona, a valley of approximately 120 people in New Zealand's South Island. Realising that time would erode knowledge of their community, a small, motivated group of residents initiated a KM programme to create a legacy for a wider community including younger generations, tourists and scholars. This paper applies KM principles to rural communities that want to harness their collective knowledge for wider societal gain, and develops a framework to accommodate them. As a result, we call for a wider conceptualisation of KM that includes motives for managing knowledge beyond decision making, to accommodate community KM (cKM).

Relevance: 10.00%

Abstract:

The need for low bit-rate speech coding is the result of growing demand on the available radio bandwidth for mobile communications, both for military purposes and for the public sector. To meet this growing demand, the available bandwidth must be utilized in the most economic way to accommodate more services. Two low bit-rate speech coders have been built and tested in this project. The two coders combine predictive coding with delta modulation, a property which enables them to achieve simultaneously the low bit-rate and good speech quality requirements. To enhance their efficiency, the predictor coefficients and the quantizer step size are updated periodically in each coder. This enables the coders to keep up with changes in the characteristics of the speech signal over time and with changes in the dynamic range of the speech waveform. However, the two coders differ in the method of updating their predictor coefficients. One updates the coefficients once every one hundred sampling periods, extracting the coefficients from the input speech samples; it is referred to in this project as the Forward Adaptive Coder. Since the coefficients are extracted from the input speech samples, they must be transmitted to the receiver to reconstruct the transmitted speech, thus adding to the transmission bit rate. The other updates its coefficients every sampling period, based on information in the output data; this coder is known as the Backward Adaptive Coder. Results of subjective tests showed both coders to be reasonably robust to quantization noise. Both were graded quite good, with the Forward Adaptive Coder performing slightly better, but at a slightly higher transmission bit rate for the same speech quality, than its Backward counterpart. The coders yielded acceptable speech quality at 9.6 kbit/s for the Forward Adaptive Coder and 8 kbit/s for the Backward Adaptive Coder.
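
The abstract does not give the coder equations, but the backward-adaptive idea can be sketched with a simple adaptive delta modulator: one bit per sample, with the step size adapted from the bit stream alone, so that the decoder can mirror the adaptation without any transmitted side information. The adaptation constants below are illustrative assumptions.

```python
import numpy as np

def adm_encode(x, step0=0.1, grow=1.5, shrink=0.66):
    # One bit per sample: the step grows on repeated bits (slope overload)
    # and shrinks on alternation (granular noise).
    bits, est, step, prev = [], 0.0, step0, 0
    for sample in x:
        b = 1 if sample >= est else -1
        step *= grow if b == prev else shrink
        est += b * step
        bits.append(b)
        prev = b
    return bits

def adm_decode(bits, step0=0.1, grow=1.5, shrink=0.66):
    # The decoder mirrors the encoder's step adaptation exactly, so no
    # predictor parameters need to be transmitted (the backward idea).
    out, est, step, prev = [], 0.0, step0, 0
    for b in bits:
        step *= grow if b == prev else shrink
        est += b * step
        out.append(est)
        prev = b
    return np.array(out)

t = np.linspace(0.0, 1.0, 8000)           # 1 s at an 8 kHz sample rate
reconstructed = adm_decode(adm_encode(np.sin(2 * np.pi * 5 * t)))
```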

Relevance: 10.00%

Abstract:

Pulse compression techniques originated in radar. The present work is concerned with the utilization of these techniques in general, and the linear FM (LFM) technique in particular, for communications. It introduces these techniques from an optimum communications viewpoint and outlines their capabilities. It also considers the candidacy of the class of LFM signals for digital data transmission, and the LFM spectrum. Work related to the utilization of LFM signals for digital data transmission has been mostly experimental and mainly concerned with employing two rectangular LFM pulses (or chirps) with reversed slopes to convey the bits 1 and 0 in an incoherent mode. No systematic theory for LFM signal design and system performance has been available. Accordingly, the present work establishes such a theory, taking into account coherent and noncoherent single-link and multiplex signalling modes. Some new results concerning the slope-reversal chirp pair are obtained. The LFM technique combines the typical capabilities of pulse compression with relative ease of implementation. However, these merits are often hampered by the difficulty of handling the LFM spectrum, which cannot generally be expressed in closed form. The common practice is to obtain a plot of this spectrum with a digital computer for every single set of LFM pulse parameters. Moreover, reported work has been justly confined to the spectrum of an ideally rectangular chirp pulse with no rise or fall times. Accordingly, the present work comprises a systematic study of the LFM spectrum which takes the rise and fall times of the chirp pulse into account and can accommodate any LFM pulse with any parameters. It formulates rather simple and accurate prediction criteria concerning the behaviour of this spectrum in the different frequency regions. These criteria would facilitate the handling of the LFM technique in theory and practice.
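
A minimal numerical sketch of the spectrum computation mentioned above: an LFM pulse with finite rise and fall times (a raised-cosine edge is assumed here) is generated and its magnitude spectrum obtained by FFT. All pulse parameters are illustrative.

```python
import numpy as np

def lfm_pulse(T, B, fs, rise):
    # LFM (chirp) pulse of duration T sweeping bandwidth B, with
    # raised-cosine rise/fall edges of length `rise` seconds.
    t = np.arange(0, T, 1 / fs)
    phase = np.pi * (B / T) * t**2            # linear frequency sweep 0..B
    env = np.ones_like(t)
    n = int(rise * fs)
    edge = 0.5 * (1 - np.cos(np.pi * np.arange(n) / n))
    env[:n], env[-n:] = edge, edge[::-1]      # smooth rise and fall
    return env * np.cos(phase)

fs = 1e6
pulse = lfm_pulse(T=1e-3, B=100e3, fs=fs, rise=50e-6)
spectrum = np.abs(np.fft.rfft(pulse))         # magnitude spectrum
freqs = np.fft.rfftfreq(pulse.size, 1 / fs)
```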

Relevance: 10.00%

Abstract:

The thesis describes an investigation into methods for the specification, design and implementation of computer control systems for flexible manufacturing machines comprising multiple, independent, electromechanically driven mechanisms. An analysis is made of the elements of conventional mechanically coupled machines in order that the operational functions of these elements may be identified. This analysis is used to define the scope of the requirements necessary to specify the format, function and operation of a flexible, independently driven mechanism (IDM) machine. A discussion of how this type of machine can accommodate modern manufacturing needs for high speed and flexibility is presented. A sequential method of capturing requirements for such machines is detailed, based on a hierarchical partitioning of machine requirements from product to independent drive mechanism. A classification of mechanisms using notations including data flow diagrams and Petri nets is described, which supports capture and allows validation of requirements. A generic design for a modular IDM machine controller is derived, based upon the hierarchy of control identified in these machines. A two-mechanism experimental machine is detailed, which is used to demonstrate the application of the specification, design and implementation techniques. A prototype computer controller and a fully flexible implementation for the IDM machine, based on Petri-net models described using the concurrent programming language Occam, are detailed. The ability of this modular computer controller to support flexible, safe and fault-tolerant operation of the two intermittent-motion, discretely synchronised independent drive mechanisms is presented. The application of the machine development methodology to industrial projects is also established.
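
As a toy illustration of the Petri-net notation used for the controller models, the sketch below encodes the basic firing rule: a transition is enabled only when all of its input places hold tokens, which is how synchronisation points between independent drive mechanisms can be expressed. Place and transition names are hypothetical.

```python
def enabled(marking, transition):
    # A transition is enabled when every input place holds a token.
    return all(marking[p] > 0 for p in transition["in"])

def fire(marking, transition):
    # Consume one token from each input place, add one to each output.
    for p in transition["in"]:
        marking[p] -= 1
    for p in transition["out"]:
        marking[p] += 1

# Two drive mechanisms that must reach a synchronisation point together
marking = {"m1_ready": 1, "m2_ready": 1, "cycle_running": 0}
sync = {"in": ["m1_ready", "m2_ready"], "out": ["cycle_running"]}
if enabled(marking, sync):
    fire(marking, sync)
```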

Relevance: 10.00%

Abstract:

The literature on the evaporation of drops of pure liquids, drops containing solids, and droplet sprays has been critically reviewed. An experimental study was undertaken on the drying of suspended drops of pure water and of aqueous sodium sulphate decahydrate with concentrations varying from 5 to 54.1 wt.%. Individual drops were suspended from a glass filament balance in a 26 mm I.D. vertical wind tunnel, designed and constructed to supply hot de-humidified air, to simulate conditions encountered in commercial spray driers. A novel thin-film thermocouple was developed to facilitate the simultaneous measurement of drop weight and core temperature. The heat conduction through the thermocouple was reduced because of its unique design, using essentially a single 50 μm diameter nickel wire. For pure water drops, the Nusselt number was found to be a function of the Reynolds, Prandtl and Transfer numbers over the temperature range 19 to 79°C:

Nu = 2 + 0.19 (1/B)^0.24 Re^0.5 Pr^0.33

Two distinct periods were observed during the drying of aqueous sodium sulphate decahydrate. The first period was characterised by evaporation from a free liquid surface, whilst drying in the second period was controlled by the crust resistance. Fracturing of the crust occurred randomly but was more frequent at higher concentrations and temperatures. A model was proposed for the drying of slurry drops, based on a receding evaporation interface. The model was solved numerically for the variation of core temperature, drop weight and crust thickness as a function of time. Experimental results were in excellent agreement with the model predictions, although at higher temperatures modifications to the model had to be made to accommodate the unusual behaviour of sodium sulphate slurries, i.e. the formation of hydrates.
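
The reported correlation can be evaluated directly; the sketch below uses the symbols as given in the abstract (B is the transfer number), with the input values purely illustrative.

```python
def nusselt(Re, Pr, B):
    # Correlation reported for pure water drops over 19-79 degC:
    # Nu = 2 + 0.19 * (1/B)**0.24 * Re**0.5 * Pr**0.33
    return 2.0 + 0.19 * (1.0 / B) ** 0.24 * Re ** 0.5 * Pr ** 0.33

print(nusselt(Re=100.0, Pr=0.7, B=0.05))   # illustrative values only
```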

Relevance: 10.00%

Abstract:

The work described in this thesis is concerned with mechanisms of contact lens lubrication. There are three major driving forces in contact lens design and development: cost, convenience, and comfort. Lubrication, as reflected in the coefficient of friction, is becoming recognised as one of the major factors affecting the comfort of the current generation of contact lenses, which have benefited from several decades of design and production improvements. This work started with the study of the in-eye release of soluble macromolecules from a contact lens matrix. The vehicle for the study was the family of CIBA Vision Focus® DAILIES® daily disposable contact lenses, which is based on polyvinyl alcohol (PVA). The effective release of linear soluble PVA from DAILIES at the surface of the lens was shown to be beneficial in terms of patient comfort. There was a need to develop a novel characterisation technique in order to study these effects at surfaces; this led to the development of a novel tribological technique, which allowed the friction coefficients of different types of contact lenses to be measured reproducibly at genuinely low values. The tribometer needed to accommodate the following features: (a) an approximation to eyelid load, (b) both new and ex-vivo lenses, (c) variations in substrate, and (d) different ocular lubricants (including tears). The tribometer and measuring technique developed in this way were used to examine the surface friction and lubrication mechanisms of two different types of contact lenses: daily disposables and silicone hydrogels. The results from the tribometer, in terms of both the mean friction coefficient and the friction profiles obtained, allowed the various mechanisms for surface enhancement now seen in the daily disposable contact lens sector to be evaluated. The three major methods used are: release of soluble macromolecules (such as PVA) from the lens matrix, irreversible surface binding of a macromolecule (such as polyvinyl pyrrolidone) by charge transfer, and simple polymer adsorption (e.g. Pluronic) at the lens surface. The tribological technique was also used to examine trends in the development of silicone hydrogel contact lenses. The focus in the design of silicone hydrogels has now shifted from oxygen permeability to the improvement of surface properties. Presently, tribological studies represent the most effective in vitro method of surface evaluation in relation to in-eye comfort.

Relevance: 10.00%

Abstract:

It is well known that contrast detection thresholds improve with the size of a grating-type stimulus, but it is thought that the benefit of size is abolished for contrast discriminations well above threshold (e.g., Legge, G. E., & Foley, J. M. (1980)). Here we challenge the generality of this view. We performed contrast detection and contrast discrimination for circular patches of sine-wave grating as a function of stimulus size. We confirm that sensitivity improves with approximately the fourth root of stimulus area at detection threshold (a log-log slope of -0.25), but find individual differences (IDs) for the suprathreshold discrimination task. For several observers, performance was largely unaffected by area, but for others performance first improved (by as much as a log-log slope of -0.5) and then reached a plateau. We replicated these different results several times on the same observers. All of these results were described in the context of a recent gain-control model of area summation (Meese, T. S. (2004)), extended to accommodate the multiple stimulus sizes used here. In this model, (i) excitation increased with the fourth root of stimulus area for all observers, and (ii) IDs in the discrimination data were described by IDs in the relation between suppression and area. This means that empirical summation in the contrast discrimination task can be attributed to growth in suppression with stimulus size that does not keep pace with the growth in excitation. © 2005 ARVO.
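
The "fourth-root" summation claim corresponds to a straight line of slope -0.25 when log threshold is plotted against log area; the sketch below fits that slope to synthetic thresholds constructed to obey the rule, purely for illustration.

```python
import numpy as np

# Synthetic detection thresholds obeying threshold ~ area^(-0.25)
areas = np.array([1.0, 2.0, 4.0, 8.0, 16.0])          # arbitrary units
thresholds = 0.02 * areas ** -0.25

# Slope of the log-log summation function; -0.25 = fourth-root summation
slope = np.polyfit(np.log10(areas), np.log10(thresholds), 1)[0]
print(round(slope, 2))                                 # -0.25
```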

Relevance: 10.00%

Abstract:

The global market has become increasingly dynamic, unpredictable and customer-driven. This has led to rising rates of new product introduction and turbulent demand patterns across product mixes. As a result, manufacturing enterprises face mounting challenges to be agile and responsive in coping with market changes, so as to achieve the competitiveness of producing and delivering products to the market in a timely and cost-effective manner. This paper introduces a currency-based iterative agent bidding mechanism to integrate, effectively and cost-efficiently, the activities associated with production planning and control, so as to achieve an optimised process plan and schedule. The aim is to enhance the agility of manufacturing systems to accommodate dynamic changes in the market and in production. The iterative bidding mechanism is executed using currency-like metrics: each operation to be performed is assigned a virtual currency value, and agents bid for the operation if they would make a virtual profit based on this value. These currency values are optimised iteratively, and the bidding is repeated with each new set of values, with the aim of obtaining better and better production plans, leading to near-optimality. A genetic algorithm is proposed to optimise the currency values at each iteration. The implementation of the mechanism and the results of test case simulations are also discussed. © 2012 Elsevier Ltd. All rights reserved.
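
A toy version of the bidding loop may help fix ideas: each operation carries a virtual currency value, agents bid only when they would make a virtual profit, and a simple mutate-and-keep-best search (standing in for the paper's genetic algorithm) perturbs the currency values between rounds. The agent costs, the least-loaded tie-break and the makespan fitness are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_ops, n_agents = 8, 3
cost = rng.uniform(1.0, 5.0, size=(n_agents, n_ops))   # processing costs

def run_bidding(values):
    # Each operation goes to the least-loaded agent among those whose
    # virtual profit (currency value minus own cost) is positive.
    load = np.zeros(n_agents)
    for op in range(n_ops):
        bidders = np.where(values[op] - cost[:, op] > 0)[0]
        if bidders.size == 0:
            return np.inf                  # nobody bids: plan infeasible
        winner = bidders[np.argmin(load[bidders])]
        load[winner] += cost[winner, op]
    return load.max()                      # makespan as plan fitness

# Mutate-and-keep-best search over currency values (GA stand-in)
values = np.full(n_ops, 6.0)
best = run_bidding(values)
for _ in range(500):
    trial = values + rng.normal(0.0, 0.3, size=n_ops)
    fitness = run_bidding(trial)
    if fitness < best:
        values, best = trial, fitness
print(best)
```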