234 results for Generalisation


Relevance: 10.00%

Abstract:

The scaling problems which afflict attempts to optimise neural networks (NNs) with genetic algorithms (GAs) are disclosed. A novel GA-NN hybrid is introduced, based on the bumptree, a little-used connectionist model. As well as being computationally efficient, the bumptree is shown to be more amenable to genetic coding than other NN models. A hierarchical genetic coding scheme is developed for the bumptree and shown to have low redundancy, as well as being complete and closed with respect to the search space. When applied to optimising bumptree architectures for classification problems, the GA discovers bumptrees which significantly out-perform those constructed using a standard algorithm. The fields of artificial life, control and robotics are identified as likely application areas for the evolutionary optimisation of NNs. An artificial life case study is presented and discussed. Experiments are reported which show that the GA-bumptree is able to learn simulated pole balancing and car parking tasks using only limited environmental feedback. A simple modification of the fitness function allows the GA-bumptree to learn mappings which are multi-modal, such as robot arm inverse kinematics. The dynamics of the 'geographic speciation' selection model used by the GA-bumptree are investigated empirically and the convergence profile is introduced as an analytical tool. The relationships between the rate of genetic convergence and the phenomena of speciation, genetic drift and punctuated equilibrium are discussed. The importance of genetic linkage to GA design is discussed and two new recombination operators are introduced. The first, linkage mapped crossover (LMX), is shown to be a generalisation of existing crossover operators. LMX provides a new framework for incorporating prior knowledge into GAs. Its adaptive form, ALMX, is shown to be able to infer linkage relationships automatically during genetic search.
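As background to the recombination operators mentioned above, the following is a minimal sketch of the one-point crossover that LMX generalises, together with an illustrative linkage-map reordering step; the reordering scheme here is our own assumption, not the thesis's actual LMX definition.

```python
import random

def one_point_crossover(parent_a, parent_b, rng=random):
    """Standard one-point crossover: swap the gene tails at a random cut point."""
    assert len(parent_a) == len(parent_b)
    cut = rng.randrange(1, len(parent_a))
    return parent_a[:cut] + parent_b[cut:], parent_b[:cut] + parent_a[cut:]

def linkage_reordered_crossover(parent_a, parent_b, linkage_order, rng=random):
    """Illustrative only: reorder loci by a linkage map so that tightly linked
    genes sit adjacently before cutting, then restore the original ordering.
    The actual LMX operator in the thesis may differ."""
    ra = [parent_a[i] for i in linkage_order]
    rb = [parent_b[i] for i in linkage_order]
    ca, cb = one_point_crossover(ra, rb, rng)
    inverse = {locus: pos for pos, locus in enumerate(linkage_order)}
    undo = lambda child: [child[inverse[j]] for j in range(len(child))]
    return undo(ca), undo(cb)
```

Both operators preserve the pair of parental genes at every locus; the linkage map only changes which loci are likely to be inherited together.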

Relevance: 10.00%

Abstract:

A number of researchers have investigated the application of neural networks to visual recognition, with much of the emphasis placed on exploiting the network's ability to generalise. However, despite the benefits of such an approach, it is not at all obvious how networks can be developed which are capable of recognising objects subject to changes in rotation, translation and viewpoint. In this study, we suggest that a possible solution to this problem can be found by studying aspects of visual psychology and, in particular, perceptual organisation. For example, it appears that grouping together lines based upon perceptually significant features can facilitate viewpoint-independent recognition. The work presented here identifies simple grouping measures based on parallelism and connectivity and shows how it is possible to train multi-layer perceptrons (MLPs) to detect and determine the perceptual significance of any group presented. In this way, it is shown how MLPs which are trained via backpropagation to perform individual grouping tasks can be brought together into a novel, large-scale network capable of determining the perceptual significance of the whole input pattern. Finally, the applicability of such significance values for recognition is investigated, and results indicate that both the MLP and the Kohonen Feature Map can be trained to recognise simple shapes described in terms of perceptual significances. This study has also provided an opportunity to investigate aspects of the backpropagation algorithm, particularly the ability to generalise. In this study we report the results of various generalisation tests. In applying the backpropagation algorithm to certain problems, we found that there was a deficiency in performance with the standard learning algorithm. An improvement in performance could, however, be obtained when suitable modifications were made to the algorithm. The modifications and consequent results are reported here.
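Grouping measures of the kind described, parallelism and connectivity between line segments, can be sketched with elementary geometry. The scoring functions below are illustrative assumptions of ours, not the measures the thesis actually feeds to its MLPs.

```python
import math

def parallelism(seg_a, seg_b):
    """Score in [0, 1]: 1 when two segments ((x1, y1), (x2, y2)) are parallel.
    Uses the absolute cosine of the angle between their direction vectors."""
    def direction(seg):
        (x1, y1), (x2, y2) = seg
        return x2 - x1, y2 - y1
    (ax, ay), (bx, by) = direction(seg_a), direction(seg_b)
    dot = ax * bx + ay * by
    norm = math.hypot(ax, ay) * math.hypot(bx, by)
    return abs(dot) / norm

def connectivity(seg_a, seg_b):
    """Score in (0, 1]: 1 when a pair of endpoints coincide; decays with the
    smallest endpoint-to-endpoint distance."""
    gap = min(math.dist(p, q) for p in seg_a for q in seg_b)
    return 1.0 / (1.0 + gap)
```

In the thesis, networks are trained to judge the perceptual significance of such groups; here the scores simply make the two cues concrete.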

Relevance: 10.00%

Abstract:

There has been a resurgence of interest in the neural networks field in recent years, provoked in part by the discovery of the properties of multi-layer networks. This interest has in turn raised questions about the possibility of making neural network behaviour more adaptive by automating some of the processes involved. Prior to these particular questions, the process of determining the parameters and network architecture required to solve a given problem had been a time-consuming activity. A number of researchers have attempted to address these issues by automating these processes, concentrating in particular on the dynamic selection of an appropriate network architecture. The work presented here specifically explores the area of automatic architecture selection; it focuses upon the design and implementation of a dynamic algorithm based on the Back-Propagation learning algorithm. The algorithm constructs a single hidden layer as the learning process proceeds, using individual pattern error as the basis of unit insertion. This algorithm is applied to several problems of differing type and complexity and is found to produce near-minimal architectures that are shown to have a high level of generalisation ability.
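The control flow of a constructive algorithm driven by individual pattern error can be sketched as follows. This toy version grows a layer of Gaussian units rather than the thesis's Back-Propagation-based units, so it illustrates the insertion criterion only, not the actual algorithm.

```python
import math

def insert_units_on_error(patterns, targets, tol=0.1, max_units=20):
    """Grow a single hidden layer: repeatedly evaluate the per-pattern error
    and insert a unit centred on the worst-fit pattern until every pattern's
    error falls within tolerance. Illustrative scheme only."""
    units = []  # list of (centre, weight)

    def predict(x):
        return sum(w * math.exp(-(x - c) ** 2 / 0.02) for c, w in units)

    for _ in range(max_units):
        errors = [t - predict(x) for x, t in zip(patterns, targets)]
        worst = max(range(len(errors)), key=lambda i: abs(errors[i]))
        if abs(errors[worst]) <= tol:
            break  # all individual pattern errors within tolerance
        # new unit centred on the worst-fit pattern, weighted by its residual
        units.append((patterns[worst], errors[worst]))
    return units, predict
```

Because units are only inserted where the network currently fails, the resulting architecture stays near minimal for the training set, which mirrors the near-minimal architectures reported in the abstract.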

Relevance: 10.00%

Abstract:

The aims of this study were to investigate the beliefs concerning the philosophy of science held by practising science teachers and to relate those beliefs to their pupils' understanding of the philosophy of science. Three philosophies of science, differing in the way they relate experimental work to other parts of the scientific enterprise, are described. By the use of questionnaire techniques, teachers of four extreme types were identified. These are: the H type or hypothetico-deductivist teacher, who sees experiments as potential falsifiers of hypotheses or of logical deductions from them; the I type or inductivist teacher, who regards experiments mainly as a way of increasing the range of observations available for recording before patterns are noted and inductive generalisation is carried out; the V type or verificationist teacher, who expects experiments to provide proof and to demonstrate the truth or accuracy of scientific statements; and the O type, who has no discernible philosophical beliefs about the nature of science or its methodology. Following interviews of selected teachers to check their responses to the questionnaire and to determine their normal teaching methods, an experiment was organised in which parallel groups were given H, I and V type teaching in the normal school situation during most of one academic year. Using pre-test and post-test scores on a specially developed test of pupil understanding of the philosophy of science, it was shown that pupils were positively affected by their teacher's implied philosophy of science. There was also some indication that V type teaching improved marks obtained in school science examinations, but appeared to discourage the more able from continuing the study of science. Effects were also noted on vocabulary used by pupils to describe scientists and their activities.

Relevance: 10.00%

Abstract:

Background: The controversy surrounding the non-uniqueness of predictive gene lists (PGLs), small selected subsets of genes drawn from the very large number of potential candidates available in DNA microarray experiments, is now widely acknowledged [1]. Many of these studies have focused on constructing discriminative semi-parametric models and as such are also subject to the issue of random correlations of sparse model selection in high-dimensional spaces. In this work we outline a different approach based around an unsupervised, patient-specific, nonlinear topographic projection of predictive gene lists. Methods: We construct nonlinear topographic projection maps based on inter-patient gene-list relative dissimilarities. The Neuroscale, Stochastic Neighbor Embedding (SNE) and Locally Linear Embedding (LLE) techniques have been used to construct two-dimensional projective visualisation plots of 70-dimensional PGLs per patient. Classifiers are also constructed to identify the prognosis indicator of each patient using the resulting projections, and we investigate whether, a posteriori, the two prognosis groups are separable on the evidence of the gene lists. A literature-proposed predictive gene list for breast cancer is benchmarked against a separate gene list using the above methods. Generalisation ability is investigated by using the mapping capability of Neuroscale to visualise the follow-up study, but based on the projections derived from the original dataset. Results: The results indicate that small subsets of patient-specific PGLs have insufficient prognostic dissimilarity to permit a distinction between the two prognosis groups. Uncertainty and diversity across multiple gene expressions prevent unambiguous or even confident patient grouping. Comparative projections across different PGLs provide similar results.
Conclusion: The random correlation to an arbitrary outcome induced by small subset selection from very high-dimensional, interrelated gene expression profiles leads to an outcome with associated uncertainty. This continuum and uncertainty preclude any attempts at constructing discriminative classifiers. However, a patient's gene expression profile could possibly be used in treatment planning, based on knowledge of other patients' responses. We conclude that many of the patients involved in such medical studies are intrinsically unclassifiable on the basis of the provided PGL evidence. This additional category of 'unclassifiable' should be accommodated within medical decision support systems if serious errors and unnecessary adjuvant therapy are to be avoided.
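The starting point of the methods above is a matrix of inter-patient gene-list dissimilarities, which the projection techniques then embed in two dimensions. A minimal sketch of that first step, assuming plain Euclidean dissimilarity over per-patient expression vectors (the paper's actual dissimilarity measure may differ):

```python
import math

def dissimilarity_matrix(profiles):
    """Pairwise Euclidean dissimilarities between patients' predictive-gene-list
    expression vectors. A matrix like this is the input that techniques such as
    Neuroscale, SNE or LLE project into a two-dimensional visualisation."""
    n = len(profiles)
    d = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            d[i][j] = d[j][i] = math.dist(profiles[i], profiles[j])
    return d
```

The matrix is symmetric with a zero diagonal; in the study each profile would be a 70-dimensional PGL expression vector rather than the toy vectors shown here.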

Relevance: 10.00%

Abstract:

Gately [1974] recently introduced the concept of an individual player's “propensity to disrupt” a payoff vector in a three-person characteristic function game. As a generalisation of this concept we propose the “disruption nucleolus” of an n-person game. The properties and computational possibilities of this concept are analogous to those of the nucleolus itself. Two numerical examples are given.
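Gately's propensity to disrupt is commonly stated as follows (the notation is ours; the paper's disruption nucleolus then lexicographically minimises the maximal such propensity over all players):

```latex
% Propensity of player i to disrupt the payoff vector x in the game (N, v):
% the ratio of what the other players would lose to what player i would
% lose if i withdrew and the coalition N \setminus \{i\} formed alone.
\[
  d_i(x) \;=\; \frac{\sum_{j \neq i} x_j \;-\; v(N \setminus \{i\})}
               {x_i \;-\; v(\{i\})}
\]
```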

Relevance: 10.00%

Abstract:

This thesis addresses the question of how business schools established as public-private partnerships (PPPs) within a regional university in the English-speaking Caribbean survived for over twenty-one years and achieved legitimacy in their environment. The aim of the study was to examine how public and private sector actors contributed to the evolution of the PPPs. A social network perspective provided a broad relational focus from which to explore the phenomenon and engage disciplinary and middle-range theories to develop explanations. Legitimacy theory provided an appropriate performance dimension from which to assess PPP success. An embedded multiple-case research design, with three case sites analysed at three levels, including the country and university environment, the PPP as a firm and the subgroup level, constituted the methodological framing of the research process. The analysis techniques included four methods but relied primarily on discourse and social network analysis of interview data from 40 respondents across the three sites. A staged analysis of the evolution of the firm provided the ‘time and effects’ antecedents which formed the basis for sense-making to arrive at explanations of the public-private relationship-influenced change. A conceptual model guided the study, and explanations from the cross-case analysis were used to refine the process model and develop a dynamic framework and set of theoretical propositions that would underpin explanations of PPP success and legitimacy in matched contexts through analytical generalisation. The study found that PPP success was based on different models of collaboration and partner resource contribution that arose from a confluence of variables including the development of shared purpose, private voluntary control in corporate governance mechanisms and boundary-spanning leadership.
The study contributes a contextual theory that explains how PPPs work and a research agenda of ‘corporate governance as inspiration’ from a sociological perspective of ‘liquid modernity’. Recommendations for policy and management practice were developed.

Relevance: 10.00%

Abstract:

Purpose: The purpose of this paper is to explicate the role of institutional entrepreneurs who use accounting technology to accomplish change within a privatised telecommunications company. Design/methodology: The case study method is adopted. The authors draw on a recent extension to institutional theory that gives greater emphasis to agency, including concepts such as embeddedness, institutional entrepreneurs and institutional contradiction. Findings: As part of the consequences of new public management reforms, we illustrate how institutional entrepreneurs de-established an older state-run, bureaucratic and engineering-based routine and replaced it with a business- and accounting-based routine. Eventually, new accounting routines were reproduced and taken for granted by telecommunications management and employees. Research limitations/implications: As this study is limited to a single case study, no generalisation except to theory can be made. There are implications for privatisation of state sector organisations both locally and internationally. Originality/value: The paper makes a contribution by elaborating the role of institutional entrepreneurs as agents of change towards privatisation and how accounting was used as a technology of change.

Relevance: 10.00%

Abstract:

Research indicates that associative and strategic deficits mediate age-related deficits in memory, whereas simple associative processes are independent of strategic processing and strategic processes mediate resistance to interference. The present study showed age-related deficits in a contingency learning task, although older participants' resistance to interference was not disproportionately affected. Recognition memory predicted discrimination, whereas general cognitive ability predicted resistance to interference, suggesting a differentiation between associative and strategic processes in learning and memory, and age declines in associative processes. Older participants' generalisation of associative strength from existing to novel stimulus-response associations was consistent with elemental learning theories, whereas configural models predicted younger participants' responses. This is consistent with associative deficits and reliance on item-level representations in memory during later life. © 2011 Psychology Press Ltd.
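The elemental/configural contrast above can be made concrete: an elemental model sums the learned strengths of a compound's elements, so training on cues A and B alone generalises to the novel compound AB, whereas a purely configural model treats AB as a new, untrained configuration. A minimal illustration with invented weights, not the study's actual model:

```python
def elemental_strength(weights, compound):
    """Elemental prediction: sum of each element's learned associative strength."""
    return sum(weights.get(element, 0.0) for element in compound)

def configural_strength(weights, compound):
    """Configural prediction: the compound is treated as a unique configuration,
    so an untrained compound carries no associative strength."""
    return weights.get(frozenset(compound), 0.0)

# Train on the single cues A and B, then probe the novel compound AB.
elemental = {"A": 0.5, "B": 0.25}
configural = {frozenset("A"): 0.5, frozenset("B"): 0.25}
```

On this account, the older participants' responses to novel compounds tracked the elemental sum, while the younger participants' responses were better predicted by configural models.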

Relevance: 10.00%

Abstract:

The paper studies a generalisation of the dynamic Leontief input-output model. The standard dynamic Leontief model is extended with a balance equation for renewable resources. The renewable stocks increase through natural regeneration and decrease through the exploitation of primary natural resources. In this study the controllability of this extended model is examined, taking consumption as the control parameter. Assuming balanced growth for both consumption and production, we investigate the exhaustion of renewable resources in dependence on the balanced growth rate and on the rate of natural regeneration. In doing so, classic results from control theory and on eigenvalue problems in linear algebra are applied.
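The standard dynamic Leontief balance, together with one plausible form of the renewable-resource extension, can be written as follows; the resource equation and the symbols R_t, g and D are our illustrative notation, not necessarily the paper's.

```latex
% Standard dynamic Leontief balance: output x_t covers intermediate use,
% investment in capacity expansion, and final consumption c_t.
\[
  x_t \;=\; A x_t \;+\; B\,(x_{t+1} - x_t) \;+\; c_t
\]
% Assumed renewable-resource balance: the stock R_t grows at the natural
% regeneration rate g and is depleted by production through an extraction
% matrix D.
\[
  R_{t+1} \;=\; (1 + g)\,R_t \;-\; D\,x_t
\]
```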

Relevance: 10.00%

Abstract:

Proof critics are a technology from the proof planning paradigm. They examine failed proof attempts in order to extract information which can be used to generate a patch which will allow the proof to go through. We consider the proof of the "whisky problem", a challenge problem from the domain of temporal logic. The proof requires a generalisation of the original conjecture, and we examine two proof critics which can be used to create this generalisation. Using these critics we believe we have produced the first automatic proofs of this challenge problem. We use this example to motivate a comparison of the two critics and propose that there is a place for specialist critics as well as powerful general critics. In particular we advocate the development of critics that do not use meta-variables.

Relevance: 10.00%

Abstract:

Coinduction is a proof rule. It is the dual of induction. It allows reasoning about non-well-founded structures such as lazy lists or streams and is of particular use for reasoning about equivalences. A central difficulty in the automation of coinductive proof is the choice of a relation (called a bisimulation). We present an automation of coinductive theorem proving. This automation is based on the idea of proof planning. Proof planning constructs the higher-level steps in a proof, using knowledge of the general structure of a family of proofs and exploiting this knowledge to control the proof search. Part of proof planning involves the use of failure information to modify the plan by the use of a proof critic which exploits the information gained from the failed proof attempt. Our approach to the problem was to develop a strategy that makes an initial simple guess at a bisimulation and then uses generalisation techniques, motivated by a critic, to refine this guess, so that a larger class of coinductive problems can be automatically verified. The implementation of this strategy has focused on the use of coinduction to prove the equivalence of programs in a small lazy functional language which is similar to Haskell. We have developed a proof plan for coinduction and a critic associated with this proof plan. These have been implemented in CoClam, an extended version of Clam, with encouraging results. The planner has been successfully tested on a number of theorems.
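The kind of stream equivalence such proofs establish can be illustrated with Python generators, using the classic example `map f (iterate f x) = iterate f (f x)`. Checking finite prefixes, as below, is of course only evidence of bisimilarity, not the coinductive proof itself.

```python
from itertools import islice

def iterate(f, x):
    """Lazy stream x, f(x), f(f(x)), ... (Haskell's iterate)."""
    while True:
        yield x
        x = f(x)

def fmap(f, stream):
    """Lazy map over a stream (Haskell's map)."""
    for x in stream:
        yield f(x)

def prefix(stream, n):
    """Force the first n elements of a lazy stream."""
    return list(islice(stream, n))

# The two streams are bisimilar, so every finite prefix agrees.
double = lambda n: 2 * n
lhs = fmap(double, iterate(double, 1))   # map f (iterate f x)
rhs = iterate(double, double(1))         # iterate f (f x)
```

A coinductive proof instead exhibits a bisimulation relating the two streams and shows it is closed under taking tails, which covers all prefixes at once.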

Relevance: 10.00%

Abstract:

The topic of this thesis is the application of distributive laws between comonads to the theory of cyclic homology. The work herein is based on the three papers 'Cyclic homology arising from adjunctions', 'Factorisations of distributive laws', and 'Hochschild homology, lax codescent, and duplicial structure', to which the current author has contributed. Explicitly, our main aims are: 1) To study how the cyclic homology of associative algebras and of Hopf algebras in the original sense of Connes and Moscovici arises from a distributive law, and to clarify the role of different notions of bimonad in this generalisation. 2) To extend the procedure of twisting the cyclic homology of a unital associative algebra to any duplicial object defined by a distributive law. 3) To study the universality of Bohm and Stefan's approach to constructing duplicial objects, which we do in terms of a 2-categorical generalisation of Hochschild (co)homology. 4) To characterise those categories whose nerve admits a duplicial structure.

Relevance: 10.00%

Abstract:

Doctorate in Economics