12 results for model complexity

in Aston University Research Archive


Relevance: 60.00%

Abstract:

We propose a Bayesian framework for regression problems, covering areas that are usually dealt with by function approximation. An online learning algorithm is derived which solves regression problems with a Kalman filter. Its solution always improves with increasing model complexity, without the risk of over-fitting. In the infinite-dimension limit it approaches the true Bayesian posterior. The issues of prior selection and over-fitting are also discussed, showing that some commonly held beliefs are misleading. The practical implementation is summarised. Simulations using 13 popular publicly available data sets are used to demonstrate the method and highlight important issues concerning the choice of priors.
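
As a hedged illustration of the kind of algorithm the abstract describes (not the paper's exact derivation), the sketch below implements online Bayesian regression over a fixed set of basis functions, where each observation is absorbed with a Kalman-filter-style update for a static weight vector. The class name, basis choice and parameter values are illustrative assumptions.

```python
import numpy as np

def rbf_features(x, centres, width=1.0):
    """Fixed radial basis functions (the feature choice is illustrative)."""
    return np.exp(-0.5 * ((x - centres) / width) ** 2)

class OnlineBayesRegression:
    """Online Bayesian regression over fixed basis functions.

    Each update is a Kalman-filter step for a static weight vector, so the
    posterior is exact for this model and only sharpens as data arrive.
    """

    def __init__(self, n_features, prior_var=10.0, noise_var=0.1):
        self.m = np.zeros(n_features)              # posterior mean of the weights
        self.P = prior_var * np.eye(n_features)    # posterior covariance
        self.noise_var = noise_var                 # observation noise variance

    def update(self, phi, y):
        s = phi @ self.P @ phi + self.noise_var    # predictive variance of y
        k = self.P @ phi / s                       # Kalman gain
        self.m = self.m + k * (y - phi @ self.m)   # mean update
        self.P = self.P - np.outer(k, phi @ self.P)  # covariance update

    def predict(self, phi):
        return phi @ self.m, phi @ self.P @ phi + self.noise_var

# Toy usage: learn y = sin(x) from a stream of noisy observations.
rng = np.random.default_rng(0)
centres = np.linspace(0.0, 2.0 * np.pi, 20)
model = OnlineBayesRegression(n_features=len(centres))
for _ in range(200):
    x = rng.uniform(0.0, 2.0 * np.pi)
    y = np.sin(x) + 0.1 * rng.standard_normal()
    model.update(rbf_features(x, centres), y)
```

Because the weights are treated as a static state, each update only sharpens the posterior, which is in keeping with the abstract's claim that adding basis functions (higher model complexity) does not cause over-fitting in this Bayesian setting.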

Relevance: 60.00%

Abstract:

This thesis describes the Generative Topographic Mapping (GTM) --- a non-linear latent variable model intended for modelling continuous, intrinsically low-dimensional probability distributions embedded in high-dimensional spaces. It can be seen as a non-linear form of principal component analysis or factor analysis. It also provides a principled alternative to the self-organizing map --- a widely established neural network model for unsupervised learning --- resolving many of its associated theoretical problems. An important potential application of the GTM is visualization of high-dimensional data. Since the GTM is non-linear, the relationship between data and its visual representation may be far from trivial, but a better understanding of this relationship can be gained by computing the so-called magnification factor. In essence, the magnification factor relates the distances between data points, as they appear when visualized, to the actual distances between those data points. There are two principal limitations of the basic GTM model. The computational effort required will grow exponentially with the intrinsic dimensionality of the density model; however, if the intended application is visualization, this will typically not be a problem. The other limitation is the inherent structure of the GTM, which makes it most suitable for modelling moderately curved probability distributions of approximately rectangular shape. When the target distribution is very different from that, the aim of maintaining an 'interpretable' structure, suitable for visualizing data, may come into conflict with the aim of providing a good density model. The fact that the GTM is a probabilistic model means that results from probability theory and statistics can be used to address problems such as model complexity. Furthermore, this framework provides solid ground for extending the GTM to wider contexts than that of this thesis.
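
As a hedged illustration only: the GTM models data with a constrained mixture of Gaussians whose centres are the images of a regular latent grid under a smooth mapping. The sketch below shows just the responsibility (E-step) computation for such a mixture; the mapping, training loop and magnification-factor calculation are omitted, and the names and array shapes are assumptions rather than the thesis's implementation.

```python
import numpy as np

def gtm_responsibilities(X, Y, beta):
    """E-step responsibilities of a GTM-style constrained Gaussian mixture.

    X    : (N, D) data points
    Y    : (K, D) images in data space of the K latent grid points,
           e.g. Y = Phi @ W for a basis matrix Phi and weight matrix W
    beta : inverse variance of the isotropic Gaussian noise model
    """
    # Squared Euclidean distance between every data point and every centre.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)   # (N, K)
    log_p = -0.5 * beta * d2                                   # up to a constant
    log_p -= log_p.max(axis=1, keepdims=True)                  # numerical stability
    R = np.exp(log_p)
    return R / R.sum(axis=1, keepdims=True)                    # rows sum to 1
```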

Relevance: 40.00%

Abstract:

This article develops a relational model of institutional work and complexity. This model advances current institutional debates on institutional complexity and institutional work in three ways. First, it provides a relational and dynamic perspective on institutional complexity by explaining how constellations of logics - and their degree of internal contradiction - are constructed rather than given. Second, it refines our current understanding of agency, intentionality and effort in institutional work by demonstrating how different dimensions of agency interact dynamically in the institutional work of reconstructing institutional complexity. Third, it situates institutional work in the everyday practice of individuals coping with the institutional complexities of their work. In doing so, it reconnects the construction of institutionally complex settings to the actions and interactions of the individuals who inhabit them. © The Author(s) 2013.

Relevance: 30.00%

Abstract:

This thesis is concerned with the inventory control of items that can be considered independent of one another. The decisions of when to order and in what quantity are the controllable, or independent, variables in the cost expressions which are minimised. The four systems considered are referred to as (Q, R), (nQ, R, T), (M, T) and (M, R, T). With (Q, R), a fixed quantity Q is ordered each time the order cover (i.e. stock in hand plus on order) equals or falls below R, the re-order level. With the other three systems, reviews are made only at intervals of T. With (nQ, R, T), an order for nQ is placed if on review the order cover is less than or equal to R, where n, an integer, is chosen at the time so that the new order cover just exceeds R. In (M, T), each order increases the order cover to M. Finally, in (M, R, T), when on review the order cover does not exceed R, enough is ordered to increase it to M. The (Q, R) system is examined at several levels of complexity, so that the theoretical savings in inventory costs obtained with more exact models can be compared with the increases in computational costs. Since the exact model was preferable for the (Q, R) system, only exact models were derived for the other three systems. Several methods of optimization were tried, but most were found inappropriate for the exact models because of non-convergence; however, one method did work for each of the exact models. Demand is considered continuous and, with one exception, the distribution assumed is the normal distribution truncated so that demand is never less than zero. Shortages are assumed to result in backorders, not lost sales. However, the shortage cost is a function of three items, one of which, the backorder cost, may be a linear, quadratic or exponential function of the length of time of a backorder, with or without a period of grace. Lead times are assumed constant or gamma distributed. Lastly, the actual quantity supplied is allowed to follow a distribution. All the sets of equations were programmed for a KDF 9 computer and the computed performances of the four inventory control procedures are compared under each assumption.
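
For illustration, the four ordering rules can be written down directly from the descriptions above. The sketch below is a minimal, hedged rendering of those decision rules with illustrative parameter names; it is not the thesis's cost-optimising models.

```python
import math

def order_qr(cover, Q, R):
    """(Q, R): continuous review; order a fixed quantity Q whenever the
    order cover (stock in hand plus on order) is at or below R."""
    return Q if cover <= R else 0

def order_nqrt(cover, Q, R):
    """(nQ, R, T): periodic review; order the smallest integer multiple of Q
    that lifts the order cover just above R."""
    if cover > R:
        return 0
    n = math.floor((R - cover) / Q) + 1
    return n * Q

def order_mt(cover, M):
    """(M, T): periodic review; always order up to the level M."""
    return max(M - cover, 0)

def order_mrt(cover, M, R):
    """(M, R, T): periodic review; order up to M only when cover is at or below R."""
    return M - cover if cover <= R else 0
```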

Relevance: 30.00%

Abstract:

The soil-plant-moisture subsystem is an important component of the hydrological cycle. Over the last 20 or so years a number of computer models of varying complexity have represented this subsystem with differing degrees of success. The aim of the present work has been to improve and extend an existing model. The new model is less site-specific, thus allowing for the simulation of a wide range of soil types and profiles. Several processes not included in the original model are simulated by the inclusion of new algorithms, including macropore flow, hysteresis and plant growth. Changes have also been made to the infiltration, water uptake and water flow algorithms. Using field data from various sources, regression equations have been derived which relate parameters in the suction-conductivity-moisture content relationships to easily measured soil properties such as particle-size distribution data. Independent tests have been performed on laboratory data produced by Hedges (1989). The parameters found by regression for the suction relationships were then used in equations describing the infiltration and macropore processes. An extensive literature review produced a new model for calculating plant growth from actual transpiration, which was itself partly determined by the root densities and leaf area indices derived by the plant growth model. The new infiltration model uses intensity/duration curves to disaggregate daily rainfall inputs into hourly amounts. The final model has been calibrated and tested against field data, and its performance compared to that of the original model. Simulations have also been carried out to investigate the effects of various parameters on infiltration, macropore flow, actual transpiration and plant growth. Qualitative comparisons have been made between these results and data given in the literature.

Relevance: 30.00%

Abstract:

A main unsolved problem in the RNA World scenario for the origin of life is how a template-dependent RNA polymerase ribozyme emerged from short RNA oligomers obtained by random polymerization on mineral surfaces. A number of computational studies have shown that the structural repertoire yielded by that process is dominated by topologically simple structures, notably hairpin-like ones. A fraction of these could display RNA ligase activity and catalyze the assembly of larger, eventually functional RNA molecules retaining their previous modular structure: molecular complexity increases but template replication is absent. This allows us to build up a stepwise model of ligation-based, modular evolution that could pave the way to the emergence of a ribozyme with RNA replicase activity, a step at which information-driven Darwinian evolution would be triggered. Copyright © 2009 RNA Society.

Relevance: 30.00%

Abstract:

Current models of word production assume that words are stored as linear sequences of phonemes which are structured into syllables only at the moment of production, on the grounds that syllable structure is always recoverable from the sequence of phonemes. In contrast, we present theoretical and empirical evidence that syllable structure is lexically represented. Storing syllable structure would have the advantage of making representations more stable and resistant to damage; on the other hand, re-syllabifications affect only a minimal part of phonological representations and occur only in some languages and depending on speech register. Evidence for these claims comes from analyses of aphasic errors which not only respect phonotactic constraints, but also avoid transformations which move the syllabic structure of the word further away from the original structure, even when equating for segmental complexity. This is true across tasks, types of errors, and, crucially, types of patients. The same syllabic effects are shown by apraxic patients and by phonological patients who have more central difficulties in retrieving phonological representations. If syllable structure were only computed after phoneme retrieval, it would have no way to influence the errors of phonological patients. Our results have implications for psycholinguistic and computational models of language as well as for clinical and educational practices.
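
To illustrate the premise that syllable structure can be recovered from a flat phoneme string, the sketch below applies onset maximisation over a toy phoneme inventory. The inventory and the set of legal onsets are invented for illustration and are not taken from the article.

```python
VOWELS = {"a", "e", "i", "o", "u"}   # toy inventory; illustrative only
LEGAL_ONSETS = {"", "p", "t", "k", "b", "d", "g", "s", "m", "n", "l", "r",
                "pl", "pr", "tr", "kl", "kr", "bl", "br", "dr", "gl", "gr",
                "sp", "st", "sk", "spr", "str"}

def syllabify(phonemes):
    """Recover syllable structure from a flat phoneme string using
    onset maximisation: each vowel takes the longest legal onset to its left."""
    nuclei = [i for i, p in enumerate(phonemes) if p in VOWELS]
    boundaries = [0]
    for prev, nuc in zip(nuclei, nuclei[1:]):
        start = prev + 1
        cluster = phonemes[start:nuc]           # consonants between two nuclei
        for split in range(len(cluster) + 1):   # give the next syllable the longest legal onset
            if cluster[split:] in LEGAL_ONSETS:
                boundaries.append(start + split)
                break
    boundaries.append(len(phonemes))
    return [phonemes[a:b] for a, b in zip(boundaries, boundaries[1:])]

print(syllabify("kontrast"))   # -> ['kon', 'trast']
```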

Relevance: 30.00%

Abstract:

Purpose – The purpose of this research is to develop a holistic approach to maximize the customer service level while minimizing the logistics cost by using an integrated multiple criteria decision making (MCDM) method for the contemporary transshipment problem. Unlike the prevalent optimization techniques, this paper proposes an integrated approach which considers both quantitative and qualitative factors in order to maximize the benefits of service deliverers and customers under uncertain environments. Design/methodology/approach – This paper proposes a fuzzy-based integer linear programming model, based on the existing literature and validated with an example case. The model integrates the developed fuzzy modification of the analytic hierarchy process (FAHP) and solves the multi-criteria transshipment problem. Findings – This paper provides several novel insights into how to transform a company from a cost-based model to a service-dominated model by using an integrated MCDM method. It suggests that the contemporary customer-driven supply chain maintains and increases its competitiveness in two ways: optimizing the cost and providing the best service simultaneously. Research limitations/implications – This research used one illustrative industry case to exemplify the developed method. Considering the generalization of the research findings and the complexity of the transshipment service network, more cases across multiple industries are necessary to further validate the research output. Practical implications – The paper includes implications for the evaluation and selection of transshipment service suppliers, the construction of an optimal transshipment network, and the management of that network. Originality/value – The major advantages of this generic approach are that both quantitative and qualitative factors under a fuzzy environment are considered simultaneously, and that the viewpoints of both service deliverers and customers are taken into account. Therefore, it is believed to be useful and applicable for transshipment service network design.
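
As a hedged sketch of only the quantitative core — a transshipment network flow optimisation without the fuzzy AHP weighting the paper describes — a minimal linear program can be set up as below. The network, costs, supplies and demands are invented for illustration.

```python
from scipy.optimize import linprog

# Arcs: S->H, S->C1, H->C1, H->C2, with per-unit shipping costs.
cost = [2, 5, 1, 2]

# Flow conservation: for each node, outflow - inflow = net supply.
A_eq = [
    [ 1,  1,  0,  0],   # supplier S (supply 10)
    [-1,  0,  1,  1],   # transshipment hub H (net 0)
    [ 0, -1, -1,  0],   # customer C1 (demand 4)
    [ 0,  0,  0, -1],   # customer C2 (demand 6)
]
b_eq = [10, 0, -4, -6]

res = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
print(res.x, res.fun)   # optimal arc flows and total logistics cost
```

In the paper's integrated approach, qualitative criteria scored through the fuzzy AHP would additionally shape the objective or constraints; that coupling is not shown in this sketch.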

Relevance: 30.00%

Abstract:

In this letter, a nonlinear semi-analytical model (NSAM) for the simulation of few-mode fiber transmission is proposed. The NSAM considers the mode mixing arising from the Kerr effect and waveguide imperfections. An analytical explanation of the model is presented, as well as simulation results for transmission over a two-mode fiber (TMF) of 112 Gb/s coherently detected polarization-multiplexed quadrature phase-shift keying modulation. The simulations show that by transmitting over only one of the two modes of a TMF, long-haul transmission can be realized without an increase in receiver complexity. For a 6000-km transmission link, a small modal dispersion penalty is observed in the linear domain, while a significant increase of the nonlinear threshold is observed due to the large core of the TMF. © 2006 IEEE.

Relevance: 30.00%

Abstract:

Modern advances in technology have led to more complex manufacturing processes whose success centres on the ability to control them with a very high level of accuracy. Plant complexity inevitably leads to poor models that exhibit a high degree of parametric or functional uncertainty. The situation becomes even more complex if the plant to be controlled is characterised by a multivalued function, or if it exhibits a number of modes of behaviour during its operation. Since an intelligent controller is expected to operate and guarantee the best performance where complexity and uncertainty coexist and interact, control engineers and theorists have recently developed new control techniques, under the framework of intelligent control, to enhance controller performance for more complex and uncertain plants. These techniques are based on incorporating model uncertainty, and the newly developed control algorithms have been shown to give more accurate control results under uncertain conditions. In this paper, we survey some approaches that appear promising for enhancing the performance of intelligent control systems in the face of higher levels of complexity and uncertainty.

Relevance: 30.00%

Abstract:

The presence of high phase noise in addition to additive white Gaussian noise in coherent optical systems affects the performance of forward error correction (FEC) schemes. In this paper, we propose a simple scheme for such systems, using block interleavers and binary Bose–Chaudhuri–Hocquenghem (BCH) codes. The block interleavers are specifically optimized for differential quadrature phase shift keying modulation. We propose a method for selecting BCH codes that, together with the interleavers, achieve a target post-FEC bit error rate (BER). This combination of interleavers and BCH codes has very low implementation complexity. In addition, our approach is straightforward, requiring only short pre-FEC simulations to parameterize a model, based on which we select codes analytically. We aim to correct a pre-FEC BER of around (Formula presented.). We evaluate the accuracy of our approach using numerical simulations. For a target post-FEC BER of (Formula presented.), codes selected using our method result in BERs around 3(Formula presented.) target and achieve the target with around 0.2 dB extra signal-to-noise ratio.
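
As a hedged illustration of the analytical code-selection step: once the interleaver has made residual errors approximately independent, the post-FEC BER of a t-error-correcting code of length n can be estimated from the pre-FEC BER with the standard hard-decision approximation sketched below. The specific BCH parameters in the example are illustrative and are not taken from the paper.

```python
from math import comb

def post_fec_ber(n, t, p):
    """Approximate output BER of a t-error-correcting code of length n under
    hard-decision decoding, assuming interleaving has made channel errors
    independent with pre-FEC BER p. Uses the standard approximation that an
    uncorrectable word (more than t errors) is passed on with its errors intact."""
    total = 0.0
    for i in range(t + 1, n + 1):
        # Order the product so huge binomial coefficients only ever
        # multiply already-small floats.
        term = (i * p**i) * (1 - p)**(n - i) * comb(n, i)
        total += term
        if term == 0.0:          # remaining terms are negligible
            break
    return total / n

# Example: BCH(1023, 913), which corrects t = 11 errors (illustrative choice).
print(post_fec_ber(n=1023, t=11, p=1e-3))
```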

Relevance: 30.00%

Abstract:

The semantic model developed in this research was a response to the difficulty a group of mathematics learners had with conventional mathematical language and their interpretation of mathematical constructs. In order to develop the model, ideas from linguistics, psycholinguistics, cognitive psychology, formal languages and natural language processing were investigated. This investigation led to the identification of four main processes: the parsing process, syntactic processing, semantic processing and conceptual processing. The model showed the complex interdependency between these four processes and provided a theoretical framework in which the behaviour of the mathematics learner could be analysed. The model was then extended to include the use of technological artefacts in the learning process. To facilitate this aspect of the research, the theory of instrumentation was incorporated into the semantic model. The conclusion of this research was that although the cognitive processes were interdependent, they could develop at different rates until mastery of a topic was achieved. It also found that introducing a technological artefact into the learning environment added another layer of complexity, both in terms of the learning process and the underlying relationship between the four cognitive processes.