43 results for parallel processing systems


Relevance: 80.00%

Abstract:

It is generally assumed when using Bayesian inference methods for neural networks that the input data contains no noise. For real-world (errors-in-variables) problems this is clearly an unsafe assumption. This paper presents a Bayesian neural network framework which accounts for input noise provided that a model of the noise process exists. In the limit where the noise process is small and symmetric it is shown, using the Laplace approximation, that this method adds an extra term to the usual Bayesian error bar which depends on the variance of the input noise process. Further, by treating the true (noiseless) input as a hidden variable, and sampling this jointly with the network’s weights, using a Markov chain Monte Carlo method, it is demonstrated that it is possible to infer the regression over the noiseless input. This leads to the possibility of training an accurate model of a system using less accurate, or more uncertain, data. This is demonstrated on both a synthetic noisy sine-wave problem and a real problem of inferring the forward model for a satellite radar backscatter system used to predict sea surface wind vectors.
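The extra error-bar term described above can be illustrated numerically. The sketch below is an assumption-laden caricature, not the paper's code: it takes a scalar-output model `f`, a baseline Bayesian variance, and isotropic input noise, and adds the first-order correction g^T Sigma g, where g is the gradient of the output with respect to the input (all names are invented for illustration).

```python
import numpy as np

def predictive_variance(f, x, base_var, input_noise_var, eps=1e-5):
    """Augment a Bayesian error bar with an input-noise term.

    Assumes small, symmetric input noise: to first order the output
    variance gains a term g^T Sigma_x g, where g is the gradient of
    the scalar model output w.r.t. the input and Sigma_x is taken
    here to be isotropic (input_noise_var * identity).
    """
    x = np.asarray(x, dtype=float)
    # Central finite-difference gradient of the model output.
    g = np.array([
        (f(x + eps * e) - f(x - eps * e)) / (2 * eps)
        for e in np.eye(len(x))
    ])
    # Usual Bayesian error bar plus the input-noise contribution.
    return base_var + g @ (input_noise_var * np.eye(len(x))) @ g
```

For a linear model the correction is exact; for a nonlinear network it holds only in the small-noise limit the abstract describes.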

Relevance: 80.00%

Abstract:

Diffusion processes are a family of continuous-time continuous-state stochastic processes that are in general only partially observed. The joint estimation of the forcing parameters and the system noise (volatility) in these dynamical systems is a crucial but non-trivial task, especially when the system is nonlinear and multimodal. We propose a variational treatment of diffusion processes, which allows us to compute type II maximum likelihood estimates of the parameters by simple gradient techniques and which is computationally less demanding than most MCMC approaches. We also show how a cheap estimate of the posterior over the parameters can be constructed based on the variational free energy.
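The "simple gradient techniques" can be illustrated in a far simpler, non-variational setting: gradient ascent on the Euler-Maruyama pseudo-likelihood of a densely observed Ornstein-Uhlenbeck diffusion. Everything below (the process, constants, and learning rate) is an invented toy, standing in for the paper's variational free energy:

```python
import numpy as np

# Simulate an Ornstein-Uhlenbeck diffusion dX = -theta*X dt + sigma dW,
# a stand-in for the partially observed diffusions in the paper.
rng = np.random.default_rng(0)
dt, sigma, true_theta = 0.01, 0.5, 2.0
x = np.zeros(5000)
for t in range(1, len(x)):
    x[t] = x[t-1] - true_theta * x[t-1] * dt + sigma * np.sqrt(dt) * rng.normal()

# With dense observations each transition is approximately Gaussian
# with mean x - theta*x*dt and variance sigma^2*dt, so the gradient
# of the pseudo-log-likelihood in theta is available in closed form.
theta, lr = 0.5, 0.01
dx, xp = np.diff(x), x[:-1]
for _ in range(100):
    grad = np.sum((dx + theta * xp * dt) * (-xp * dt)) / (sigma**2 * dt)
    theta += lr * grad
```

The ascent converges to the closed-form pseudo-likelihood maximiser; the paper's contribution is doing this for the much harder nonlinear, sparsely observed case.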

Relevance: 80.00%

Abstract:

Satellite information, in combination with conventional point source measurements, can be a valuable source of information. This thesis is devoted to the spatial estimation of areal rainfall over a region using both the measurements from a dense and sparse network of rain-gauges and images from the meteorological satellites. A primary concern is to study the effects of such satellite assisted rainfall estimates on the performance of rainfall-runoff models. Low-cost image processing systems and peripherals are used to process and manipulate the data. Both secondary and primary satellite images were used for analysis. The secondary data was obtained from the in-house satellite receiver and the primary data was obtained from an outside source. Ground truth data was obtained from the local Water Authority. A number of algorithms are presented that combine the satellite and conventional data sources to produce areal rainfall estimates and the results are compared with some of the more traditional methodologies. The results indicate that the satellite cloud information is valuable in the assessment of the spatial distribution of areal rainfall, for both half-hourly and daily estimates of rainfall. It is also demonstrated how the performance of the simple multiple regression rainfall-runoff model is improved when satellite cloud information is used as a separate input in addition to rainfall estimates from conventional means. The use of low-cost equipment, from image processing systems to satellite imagery, makes it possible for developing countries to introduce such systems in areas where the benefits are greatest.
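The effect of adding satellite cloud information as a separate regression input can be sketched with synthetic data (the series and coefficients below are invented, not the thesis's data; for nested least-squares fits on the same data, adding a regressor can never lower the in-sample R^2):

```python
import numpy as np

# Illustrative synthetic series: gauge rainfall, a satellite cloud
# index correlated with rainfall, and runoff driven by both.
rng = np.random.default_rng(1)
rain = rng.gamma(2.0, 2.0, 200)                # gauge-based areal rainfall
cloud = 0.6 * rain + rng.normal(0, 1, 200)     # satellite cloud index
runoff = 0.8 * rain + 0.3 * cloud + rng.normal(0, 0.5, 200)

def fit_r2(X, y):
    """Ordinary least squares with intercept; returns in-sample R^2."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

r2_rain = fit_r2(rain[:, None], runoff)                    # rainfall only
r2_both = fit_r2(np.column_stack([rain, cloud]), runoff)   # + cloud input
```

A proper evaluation, as in the thesis, would of course compare out-of-sample runoff predictions rather than in-sample fit.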

Relevance: 80.00%

Abstract:

Neuroimaging studies of cortical activation during image transformation tasks have shown that mental rotation may rely on brain regions similar to those underlying visual perceptual mechanisms. The V5 complex, which is specialised for visual motion, is one region that has been implicated. We used functional magnetic resonance imaging (fMRI) to investigate rotational and linear transformation of stimuli. Areas of significant brain activation were identified for each of the primary mental transformation tasks in contrast to its own perceptual reference task which was cognitively matched in all respects except for the variable of interest. Analysis of group data for perception of rotational and linear motion showed activation in areas corresponding to V5 as defined in earlier studies. Both rotational and linear mental transformations activated Brodmann Area (BA) 19 but did not activate V5. An area within the inferior temporal gyrus, representing an inferior satellite area of V5, was activated by both the rotational perception and rotational transformation tasks, but showed no activation in response to linear motion perception or transformation. The findings demonstrate the extent to which neural substrates for image transformation and perception overlap and are distinct as well as revealing functional specialisation within perception and transformation processing systems.

Relevance: 80.00%

Abstract:

Educational institutions are under pressure to provide high quality education to large numbers of students very efficiently. The efficiency target combined with the large numbers generally militates against providing students with a great deal of personal or small group tutorial contact with academic staff. As a result of this, students often develop their learning criteria as a group activity, being guided by comparisons one with another rather than the formal assessments made of their submitted work. IT systems and the World Wide Web are increasingly employed to amplify the resources of academic departments although their emphasis tends to be on course administration rather than learning support. The ready availability of information on the World Wide Web and the ease with which it may be incorporated into essays can lead students to develop a limited view of learning as the process of finding, editing and linking information. This paper examines a module design strategy for tackling these issues, based on developments in modules where practical knowledge is a significant element of the learning objectives. Attempts to make effective use of IT support in these modules will be reviewed as a contribution to the development of an IT for learning strategy currently being undertaken in the author’s Institution.

Relevance: 40.00%

Abstract:

Very large spatially-referenced datasets, for example, those derived from satellite-based sensors which sample across the globe or large monitoring networks of individual sensors, are becoming increasingly common and more widely available for use in environmental decision making. In large or dense sensor networks, huge quantities of data can be collected over small time periods. In many applications the generation of maps, or predictions at specific locations, from the data in (near) real-time is crucial. Geostatistical operations such as interpolation are vital in this map-generation process and in emergency situations, the resulting predictions need to be available almost instantly, so that decision makers can make informed decisions and define risk and evacuation zones. It is also helpful when analysing data in less time critical applications, for example when interacting directly with the data for exploratory analysis, that the algorithms are responsive within a reasonable time frame. Performing geostatistical analysis on such large spatial datasets can present a number of problems, particularly in the case where maximum likelihood methods are used. Although the storage requirements only scale linearly with the number of observations in the dataset, the computational requirements in terms of memory and speed scale quadratically and cubically respectively. Most modern commodity hardware has at least two processor cores if not more. Other mechanisms for allowing parallel computation, such as Grid-based systems, are also becoming increasingly common. However, currently there seems to be little interest in exploiting this extra processing power within the context of geostatistics. In this paper we review the existing parallel approaches for geostatistics.
By recognising that different natural parallelisms exist and can be exploited depending on whether the dataset is sparsely or densely sampled with respect to the range of variation, we introduce two contrasting novel implementations of parallel algorithms based on approximating the data likelihood, extending the methods of Vecchia [1988] and Tresp [2000]. Using parallel maximum likelihood variogram estimation and parallel prediction algorithms we show that computational time can be significantly reduced. We demonstrate this with both sparsely sampled data and densely sampled data on a variety of architectures ranging from the common dual core processor, found in many modern desktop computers, to large multi-node supercomputers. To highlight the strengths and weaknesses of the different methods we employ synthetic data sets and go on to show how the methods allow maximum likelihood based inference on the exhaustive Walker Lake data set.
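One natural parallelism in likelihood approximation can be caricatured as follows. This toy treats data blocks as fully independent, which is a cruder cut than the Vecchia [1988]-style conditioning the paper extends, and farms the per-block Gaussian likelihoods out to a worker pool (function names, the covariance model and block layout are all invented for illustration):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def gauss_loglik(y, C):
    """Exact zero-mean Gaussian log-likelihood; O(n^3) in block size."""
    _, logdet = np.linalg.slogdet(C)
    return -0.5 * (logdet + y @ np.linalg.solve(C, y)
                   + len(y) * np.log(2 * np.pi))

def parallel_block_loglik(y, C, blocks, workers=2):
    """Approximate the joint log-likelihood as a sum of independent
    block log-likelihoods, each evaluated on a separate worker."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        jobs = [pool.submit(gauss_loglik, y[b], C[np.ix_(b, b)])
                for b in blocks]
        return sum(j.result() for j in jobs)

# Toy data: exponential covariance on a 1-D transect of 200 locations.
s = np.linspace(0, 10, 200)
C = np.exp(-np.abs(s[:, None] - s[None, :]))
L = np.linalg.cholesky(C + 1e-9 * np.eye(len(s)))
y = L @ np.random.default_rng(4).normal(size=len(s))

blocks = [np.arange(0, 100), np.arange(100, 200)]
approx = parallel_block_loglik(y, C, blocks)   # embarrassingly parallel
exact = gauss_loglik(y, C)                     # O(n^3) in the full n
```

Each block costs O(m^3) for block size m instead of O(n^3) for the full dataset, which is where both the speed-up and the approximation error come from.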

Relevance: 40.00%

Abstract:

Queueing theory is an effective tool in the analysis of computer communication systems. Many results in queueing analysis have been derived in the form of Laplace and z-transform expressions. Accurate inversion of these transforms is very important in the study of computer systems, but the inversion is very often difficult. In this thesis, methods for solving some of these queueing problems, by use of digital signal processing techniques, are presented. The z-transform of the queue length distribution for the M^j/G^Y/1 system is derived. Two numerical methods for the inversion of the transform, together with the standard numerical technique for solving transforms with multiple queue-state dependence, are presented. Bilinear and Poisson transform sequences are presented as useful ways of representing continuous-time functions in numerical computations.
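A standard DSP route to this kind of numerical inversion is to sample the transform at roots of unity and apply a discrete Fourier transform. The generating function below, a geometric (M/M/1-style) queue-length distribution, is only a convenient test case with a known answer, not the thesis's model:

```python
import numpy as np

def invert_pgf(P, N):
    """Recover p_0..p_{N-1} from a probability generating function
    P(z) = sum_n p_n z^n by sampling it at N roots of unity and
    taking a DFT. Aliasing folds any tail mass beyond index N back
    into the result, so N must cover essentially all of the mass.
    """
    z = np.exp(2j * np.pi * np.arange(N) / N)
    samples = np.array([P(zk) for zk in z])
    # P(z_k) = sum_n p_n e^{2*pi*i*k*n/N}, so a forward DFT divided
    # by N recovers p_n (up to the aliasing noted above).
    return np.fft.fft(samples).real / N

# Geometric queue-length distribution with load rho: p_n = (1-rho)*rho^n.
rho = 0.5
p = invert_pgf(lambda z: (1 - rho) / (1 - rho * z), 32)
```

The same sampling idea extends to Laplace transforms via contour discretisation, though the error control there is more delicate.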

Relevance: 40.00%

Abstract:

Various monoacrylic compounds containing a hindered phenol function (e.g. 3,5-di-tert.-butyl-4-hydroxy benzyl alcohol, DBBA, and vinyl-3-[3',5'-di-tert.-butyl-4-hydroxy phenyl] propionate, VDBP), and a benzophenone function (2-hydroxy-4-[beta hydroxy ethoxy] benzophenone, HAEB) were synthesised and used as reactive antioxidants (AO's) for polypropylene (PP). These compounds were reacted with PP melt in the presence of a low concentration of a free radical generator such as a peroxide (reactive processing) to produce bound-antioxidant concentrates. The binding reaction of these AO's onto PP was found to be low and this was shown to be mainly due to competing reactions such as homopolymerisation of the antioxidant. At high concentrations of peroxide, higher binding efficiency resulted, but this was accompanied by melt degradation of the polymer. In a special reactive processing procedure, a di- or a trifunctional reactant (referred to as a coagent), e.g. tri-methylol propane tri-acrylate, Tris, and divinyl benzene, DVB, were used with the antioxidant and this led to an enhanced efficiency of the grafting reaction of antioxidants on the polymer in the melt. The evidence suggests that this is due to copolymerisation of the antioxidants with the coagent as well as grafting of the copolymers onto the polymer backbone. Although the 'bound' AO's containing a UV stabilising function showed a lower overall stabilisation effect than the unbound analogues before extraction, they were still much more effective when subjected to exhaustive solvent extraction. Furthermore, a very effective synergistic stabilising activity was achieved when two reactive AO's containing thermal and UV stabilising functions, e.g. DBBA and HAEB, were reactively processed with PP in the presence of a coagent. The stabilising effectiveness of such a synergist was much higher than that of the unbound analogues both before and after extraction.
Analysis using the GPC technique of concentrates containing bound-DBBA processed in the presence of Tris coagent showed higher molecular weight (Mn), compared to that of a polymer processed without the coagent, but was still lower than that of the control processed PP with no additives. This indicates that Tris coagent may inhibit further melt degradation of the polymer. Model reactions of DBBA in a liquid hydrocarbon (decalin) and analysis of the products using FTIR and NMR spectroscopy showed the formation of grafted DBBA onto decalin molecules as well as homopolymerisation of the AO. In the presence of Tris coagent, copolymerisation of DBBA with the Tris inevitably occurred, which was followed by grafting of the copolymer onto the decalin. FTIR and NMR results of the polymer concentrates containing bound-DBBA processed with and without Tris showed behaviour similar to the above model reactions. This evidence supports the effect of Tris in enhancing the efficiency of the reaction of DBBA in the polymer melt. Reactive processing of HAEB in polymer melts exhibited crosslink formation in the early stages of the reaction; however, in the final stage, the crosslinked structure was 'broken down' or rearranged to give an almost gel-free polymer with high antioxidant binding efficiency.

Relevance: 40.00%

Abstract:

The present scarcity of operational knowledge-based systems (KBS) has been attributed, in part, to an inadequate consideration shown to user interface design during development. From a human factors perspective the problem has stemmed from an overall lack of user-centred design principles. Consequently the integration of human factors principles and techniques is seen as a necessary and important precursor to ensuring the implementation of KBS which are useful to, and usable by, the end-users for whom they are intended. Focussing upon KBS work taking place within commercial and industrial environments, this research set out to assess both the extent to which human factors support was presently being utilised within development, and the future path for human factors integration. The assessment consisted of interviews conducted with a number of commercial and industrial organisations involved in KBS development; and a set of three detailed case studies of individual KBS projects. Two of the studies were carried out within a collaborative Alvey project, involving the Interdisciplinary Higher Degrees Scheme (IHD) at the University of Aston in Birmingham, BIS Applied Systems Ltd (BIS), and the British Steel Corporation. This project, which had provided the initial basis and funding for the research, was concerned with the application of KBS to the design of commercial data processing (DP) systems. The third study stemmed from involvement on a KBS project being carried out by the Technology Division of the Trustee Savings Bank Group plc. The preliminary research highlighted poor human factors integration. In particular, there was a lack of early consideration of end-user requirements definition and user-centred evaluation. Instead, concentration was given to the construction of the knowledge base and prototype evaluation with the expert(s).
In response to this identified problem, a set of methods was developed that was aimed at encouraging developers to consider user interface requirements early on in a project. These methods were then applied in the two further projects, and their uptake within the overall development process was monitored. Experience from the two studies demonstrated that early consideration of user interface requirements was both feasible, and instructive for guiding future development work. In particular, it was shown that a user interface prototype could be used as a basis for capturing requirements at the functional (task) level, and at the interface dialogue level. Extrapolating from this experience, a KBS life-cycle model is proposed which incorporates user interface design (and within that, user evaluation) as a largely parallel, rather than subsequent, activity to knowledge base construction. Further to this, there is a discussion of several key elements which can be seen as inhibiting the integration of human factors within KBS development. These elements stem from characteristics of present KBS development practice; from constraints within the commercial and industrial development environments; and from the state of existing human factors support.

Relevance: 40.00%

Abstract:

Improving bit error rates in optical communication systems is a difficult and important problem. The error correction must take place at high speed and be extremely accurate. We show the feasibility of using hardware implementable machine learning techniques. This may enable some error correction at the speed required.
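As an illustration only (the channel model, parameters, and the least-squares-trained linear rule below are all assumptions, far simpler than whatever hardware-implementable techniques the paper evaluates), a detector learned from data can beat naive thresholding on a channel with intersymbol interference:

```python
import numpy as np

# Toy channel: each received sample mixes the current symbol with its
# predecessor (intersymbol interference) plus Gaussian noise.
rng = np.random.default_rng(2)
bits = rng.integers(0, 2, 4000)
sym = 2.0 * bits - 1.0                                   # +/-1 symbols
rx = sym + 0.4 * np.roll(sym, 1) + rng.normal(0, 0.3, len(sym))

# A two-tap linear detector trained by least squares: simple enough to
# be hardware-friendly, and learned from data rather than hand-designed.
X = np.column_stack([rx, np.roll(rx, 1)])
w, *_ = np.linalg.lstsq(X, sym, rcond=None)

hard_ber = np.mean((rx > 0) != bits)         # naive threshold detector
learned_ber = np.mean((X @ w > 0) != bits)   # learned two-tap detector
```

The two multiplies and one comparison per bit here hint at why such learned rules can be feasible at line rate, which is the speed constraint the abstract emphasises.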

Relevance: 40.00%

Abstract:

Multi-agent systems are complex systems comprised of multiple intelligent agents that act either independently or in cooperation with one another. Agent-based modelling is a method for studying complex systems like economies, societies, ecologies etc. Due to their complexity, very often mathematical analysis is limited in its ability to analyse such systems. In this case, agent-based modelling offers a practical, constructive method of analysis. The objective of this book is to shed light on some emergent properties of multi-agent systems. The authors focus their investigation on the effect of knowledge exchange on the convergence of complex, multi-agent systems.
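The convergence effect the authors study can be caricatured with a gossip-style model (entirely illustrative, not the book's model): agents hold scalar "knowledge" values, and repeated pairwise exchange drives the population toward consensus.

```python
import numpy as np

rng = np.random.default_rng(3)
knowledge = rng.uniform(0.0, 1.0, 50)   # 50 agents, scalar knowledge
initial_spread = np.ptp(knowledge)      # peak-to-peak disagreement

for _ in range(5000):
    # Two distinct agents meet and pool (average) their knowledge.
    i, j = rng.choice(len(knowledge), size=2, replace=False)
    knowledge[i] = knowledge[j] = 0.5 * (knowledge[i] + knowledge[j])

final_spread = np.ptp(knowledge)
```

Each exchange can only shrink the population's dispersion, so the spread decays geometrically in expectation; richer knowledge-exchange rules change the rate and the limit, which is the kind of emergent property the book investigates.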
