949 results for thesis coding
Abstract:
Fault tree analysis is used as a tool within hazard and operability (Hazop) studies. The present study proposes a new methodology for obtaining the exact TOP event probability of coherent fault trees. The technique uses a top-down approach similar to that of FATRAM. This new Fault Tree Disjoint Reduction Algorithm resolves all the intermediate events in the tree except OR gates with basic event inputs, so that a near-minimal cut set expression is obtained. Then Bennetts' disjoint technique is applied and the remaining OR gates are resolved. The technique has been found to be appropriate as an alternative to Monte Carlo simulation methods when rare events are encountered and exact results are needed. The algorithm has been developed in FORTRAN 77 on the Perq workstation as an addition to the Aston Hazop package. The Perq graphical environment enabled a friendly user interface to be created. The total package takes as its input cause and symptom equations using Lihou's form of coding and produces both drawings of fault trees and the Boolean sum-of-products expression into which reliability data can be substituted directly.
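The idea of computing an exact TOP event probability from cut sets can be illustrated with a minimal sketch. This uses plain inclusion-exclusion over independent basic events rather than the disjoint-products algorithm the thesis develops, and the event names and probabilities are invented for illustration:

```python
from itertools import combinations

def top_event_probability(cut_sets, p):
    """Exact TOP probability of a coherent fault tree from its minimal
    cut sets, via inclusion-exclusion over independent basic events.
    cut_sets: list of sets of basic-event names; p: event -> probability."""
    total = 0.0
    for k in range(1, len(cut_sets) + 1):
        for combo in combinations(cut_sets, k):
            events = set().union(*combo)  # union of the chosen cut sets
            term = 1.0
            for e in events:
                term *= p[e]
            total += (-1) ** (k + 1) * term
    return total

# Toy tree: TOP occurs if (A and B) or (A and C).
p = {"A": 0.1, "B": 0.2, "C": 0.3}
print(top_event_probability([{"A", "B"}, {"A", "C"}], p))
# 0.02 + 0.03 - 0.006 = 0.044
```

Inclusion-exclusion grows exponentially in the number of cut sets, which is precisely why disjoint-products techniques such as the one described above are attractive for larger trees.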
Abstract:
The scaling problems which afflict attempts to optimise neural networks (NNs) with genetic algorithms (GAs) are disclosed. A novel GA-NN hybrid is introduced, based on the bumptree, a little-used connectionist model. As well as being computationally efficient, the bumptree is shown to be more amenable to genetic coding than other NN models. A hierarchical genetic coding scheme is developed for the bumptree and shown to have low redundancy, as well as being complete and closed with respect to the search space. When applied to optimising bumptree architectures for classification problems, the GA discovers bumptrees which significantly out-perform those constructed using a standard algorithm. The fields of artificial life, control and robotics are identified as likely application areas for the evolutionary optimisation of NNs. An artificial life case-study is presented and discussed. Experiments are reported which show that the GA-bumptree is able to learn simulated pole balancing and car parking tasks using only limited environmental feedback. A simple modification of the fitness function allows the GA-bumptree to learn mappings which are multi-modal, such as robot arm inverse kinematics. The dynamics of the 'geographic speciation' selection model used by the GA-bumptree are investigated empirically and the convergence profile is introduced as an analytical tool. The relationships between the rate of genetic convergence and the phenomena of speciation, genetic drift and punctuated equilibrium are discussed. The importance of genetic linkage to GA design is discussed and two new recombination operators are introduced. The first, linkage mapped crossover (LMX), is shown to be a generalisation of existing crossover operators. LMX provides a new framework for incorporating prior knowledge into GAs. Its adaptive form, ALMX, is shown to be able to infer linkage relationships automatically during genetic search.
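For readers unfamiliar with the recombination operators LMX is said to generalise, classic single-point crossover can be sketched in a few lines. This is an illustrative sketch only, not the LMX operator itself, and the genomes are invented:

```python
import random

def one_point_crossover(parent_a, parent_b, rng):
    """Classic single-point crossover on fixed-length genomes: pick a
    cut point and swap the tails of the two parents."""
    point = rng.randrange(1, len(parent_a))  # cut between gene positions
    child_a = parent_a[:point] + parent_b[point:]
    child_b = parent_b[:point] + parent_a[point:]
    return child_a, child_b

rng = random.Random(0)
a, b = [0] * 8, [1] * 8
c1, c2 = one_point_crossover(a, b, rng)
# Each gene position is inherited from exactly one parent in each child.
print(c1, c2)
```

Operators of this kind treat adjacent genes as implicitly linked; a linkage-aware operator such as LMX instead makes the linkage between genes an explicit, and in ALMX an adaptive, part of the search.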
Abstract:
This thesis explores the processes of team innovation. It utilises two studies, an organisationally based pilot and an experimental study, to examine and identify aspects of teams' behaviours that are important for successful innovative outcomes. The pilot study, based in two automotive manufacturers, involved the collection of team members' experiences through semi-structured interviews, and identified a number of factors that affected teams' innovative performance. These included: the application of ideative and dissemination processes; the importance of good team relationships, especially those of a more informal nature, in facilitating information and ideative processes; the role of external linkages in enhancing the quality and radicality of innovations; and the potential attenuation of innovative ideas by time deadlines. This study revealed a number of key team behaviours that may be important in successful innovation outcomes. These included: goal setting, idea generation and development, external contact, task and personal information exchange, leadership, positive feedback and resource deployment. These behaviours formed the basis of a coding system used in the second part of the research. Building on the results from the field-based research, an experimental study was undertaken to examine the behavioural differences between three groups of sixteen teams undertaking an innovative task to produce an anti-drugs poster. They were randomly assigned to one of three innovation category conditions suggested by King and Anderson (1990): emergent, imported and imposed. These conditions determined the teams' level of access to additional information on previously successful campaigns and the degree of freedom they had regarding the design of the poster. In addition, a further experimental condition was imposed on half of the teams per category, which involved a formal time deadline for task completion.
The teams were video-taped for the duration of their innovation and their behaviours analysed and coded in five main aspects: ideation, external focus, goal setting, interpersonal, directive and resource-related activities. A panel of experts, utilising five scales developed from West and Anderson's (1996) innovation outcome measures, assessed the teams' outputs. ANOVAs and repeated-measures ANOVAs were deployed to identify whether there were significant differences between the different conditions. The results indicated that there were some behavioural differences between the categories and that behavioural changes occurred over the duration of the task. The results, however, revealed a complex picture and suggested limited support for three distinctive innovation categories. There were many differences in behaviours, but rarely between more than two of the categories. A main finding was the impact that different levels of constraint had in changing teams' focus of attention. For example, emergent teams were found to use both their own team and external resources, whilst those who could import information about other successful campaigns were likely to concentrate outside the team and pay limited attention to the internal resources available within the team. In contrast, those operating under task constraints, with aspects of the task imposed onto them, were more likely to attend to internal team resources and pay limited attention to the external world. As indicated by the earlier field study, time deadlines did significantly change teams' behaviour, reducing ideative and information exchange behaviours. The results show an important behavioural progression related to innovative teams. This progression involved the teams' openness initially to external sources, and then to the intra-team environment. Premature closure on the final idea before the mid-point was found to have a detrimental impact on teams' innovation.
Ideative behaviour per se was not significant for innovation outcome, instead the development of intra-team support and trust emerged as crucial. Analysis of variance revealed some limited differentiation between the behaviours of teams operating under the aforementioned three innovation categories. There were also distinct detrimental differences in the behaviour of those operating under a time deadline. Overall, the study identified the complex interrelationships of team behaviours and outcomes, and between teams and their context.
Abstract:
Xerox Customer Engagement activity is informed by the "Go To Market" strategy, and "Intelligent Coverage" sales philosophy. The realisation of this philosophy necessitates a sophisticated level of Market Understanding, and the effective integration of the direct channels of Customer Engagement. Sophisticated Market Understanding requires the mapping and coding of the entire UK market at the DMU (Decision Making Unit) level, which in turn enables the creation of tailored coverage prescriptions. Effective Channel Integration is made possible by the organisation of Customer Engagement work according to a single, process defined structure: the Selling Process. Organising by process facilitates the discipline of Task Substitution, which leads logically to creation of Hybrid Selling models. Productive Customer Engagement requires Selling Process specialisation by industry sector, customer segment and product group. The research shows that Xerox's Market Database (MDB) plays a central role in delivering the Go To Market strategic aims. It is a tool for knowledge based selling, enables productive SFA (Sales Force Automation) and, in sum, is critical to the efficient and effective deployment of Customer Engagement resources. Intelligent Coverage is not possible without the MDB. Analysis of the case evidence has resulted in the definition of 60 idiographic statements. These statements are about how Xerox organise and manage three direct channels of Customer Engagement: Face to Face, Telebusiness and Ebusiness. Xerox is shown to employ a process-oriented, IT-enabled, holistic approach to Customer Engagement productivity. The significance of the research is that it represents a detailed (perhaps unequalled) level of rich description of the interplay between IT and a holistic, process-oriented management philosophy.
Abstract:
This research project focused upon the design strategies adopted by expert and novice designers. It was based upon a desire to compare the design problem solving strategies of novices, in this case key stage three pupils studying technology within the United Kingdom National Curriculum, with designers who could be considered to have developed expertise. The findings helped to provide insights into potential teaching strategies to suit novice designers. Verbal protocols were made as samples of expert and novice designers solved a design problem and talked aloud as they worked. The verbalisations were recorded on video tape. The protocols were transcribed and segmented, with each segment being assigned to a predetermined coding system which represented a model of design problem solving. The results of the encoding were analysed and consideration was also given to the general design strategy and heuristics used by the expert and novice designers. The drawings and models produced during the generation of the protocols were also analysed and considered. A number of significant differences between the problem solving strategies adopted by the expert and novice designers were identified. First of all, differences were observed in the way expert and novice designers used the problem statement and solution validation during the process. Differences were also identified in the way holistic solutions were generated near the start of the process, and also in the cycles of exploration and the processes of integration. The way design and technological knowledge was used provided further insights into the differences between experts and novices, as did the role of drawing and modelling during the process. In more general terms, differences were identified in the heuristics and overall design strategies adopted by the expert and novice designers. The above findings provided a basis for discussing teaching strategies appropriate for novice designers.
Finally, opportunities for future research were discussed.
Abstract:
An increasing number of organisational researchers have turned to social capital theory in an attempt to better understand the impetus for knowledge sharing at the individual and organisational level. This thesis extends that research by investigating the impact of social capital on knowledge sharing at the group-level in the organisational project context. The objective of the thesis is to investigate the importance of social capital in fostering tacit knowledge sharing among the team members of a project. The analytical focus is on the Nahapiet and Ghoshal framework of social capital but also includes elements of other scholars' work. In brief, social capital is defined as an asset that is embedded in the network of relationships possessed by an individual or social unit. It is argued that the main dimensions of social capital that are of relevance to knowledge sharing are structural, cognitive, and relational because these, among other things, foster the exchange and combination of knowledge and resources among the team members. Empirically, the study is based on the grounded theory method. Data were collected from five projects in large, medium, and small ICT companies in Malaysia. Underpinned by the constant comparative method, data were derived from 55 interviews and observations. The data were analysed using open, axial, and selective coding. The analysis also involved counting the frequency of occurrence of the codes generated by grounded theory, to find the important items and categories under the social capital dimensions and knowledge sharing, and to further explain sub-groups within the data. The analysis shows that the most important dimension for tacit knowledge sharing is structural capital. Most importantly, the findings also suggest that structural capital is a prerequisite of cognitive capital and relational capital at the group-level in an organisational project.
It was also found that in a project context, relational capital is hard to realise because it requires time and frequent interactions among the team members. The findings from quantitative analysis show that frequent meetings and interactions, relationships, positions, shared visions, shared objectives, and collaboration are among the factors that foster the sharing of tacit knowledge among the team members. In conclusion, the present study adds to the existing literature on social capital in two main ways. Firstly, it distinguishes the dimensions of social capital and identifies structural capital as the most important dimension of social capital and a prerequisite of cognitive and relational capital in a project context. Secondly, it identifies the causal sequence in the dimensions of social capital, suggesting avenues for further theoretical and empirical work in this emerging area of inquiry.
Abstract:
Randomisation of DNA using conventional methodology requires an excess of genes to be cloned, since with the randomised codons NNN or NNG/T, 64 or 32 genes respectively must be cloned to encode the 20 amino acids. Thus, as the number of randomised codons increases, the number of genes required to encode a full set of proteins increases exponentially. Various methods have been developed that address the problems associated with the excess of genes that occurs due to the degeneracy of the genetic code. These range from chemical methodologies to biological methods. These all involve the replacement, insertion or deletion of codon(s) rather than individual nucleotides. The biological methods are, however, limited to random insertion/deletion or replacement. Recent work by Hughes et al. (2003) randomised three binding residues of a zinc finger gene. The drawback with this is the fact that consecutive codons cannot undergo saturation mutagenesis. This thesis describes the development of a method of saturation mutagenesis that can be used to randomise any number of consecutive codons in a DNA strand. The method makes use of “MAX” oligonucleotides coding for each of the 20 amino acids that are ligated to a conserved sequence of DNA using T4 DNA ligase. The “MAX” oligonucleotides were synthesised in such a way, with an MlyI restriction site, that restriction of the oligonucleotides occurred after the three nucleotides coding for the amino acids. This use of the MlyI site and the restrict, purify, ligate and amplify method allows the insertion of “MAX” codons at any position in the DNA. This methodology reduces the number of clones that are required to produce a representative library and has been demonstrated to be effective for up to 7 amino acid positions.
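The exponential growth in library size described above follows directly from the codon counts, and can be made concrete with a short sketch (the scheme names and counts come from the abstract; the printed comparison is illustrative):

```python
# Number of distinct gene sequences needed to cover every amino-acid
# combination at n randomised codon positions, for three schemes.
def library_size(n_positions, codons_per_position):
    return codons_per_position ** n_positions

for n in (1, 3, 7):
    nnn = library_size(n, 64)   # NNN: all 64 codons per position
    nnk = library_size(n, 32)   # NNG/T: 32 codons per position
    mx = library_size(n, 20)    # "MAX": one codon per amino acid
    print(f"{n} codon(s): NNN={nnn}  NNG/T={nnk}  MAX={mx}")
```

At 7 positions the "MAX" scheme requires 20^7 (1.28 billion) sequences against 64^7 (about 4.4 trillion) for NNN, which is the reduction in clones the methodology exploits.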
Abstract:
This thesis was focused on theoretical models of synchronization applied to cortical dynamics as measured by magnetoencephalography (MEG). Dynamical systems theory was used both in identifying relevant variables for brain coordination and in devising methods for their quantification. We presented a method for studying interactions of linear and chaotic neuronal sources using MEG beamforming techniques. We showed that such sources can be accurately reconstructed in terms of their location, temporal dynamics and possible interactions. Synchronization in low-dimensional nonlinear systems was studied to explore specific correlates of functional integration and segregation. In the case of interacting dissimilar systems, relevant coordination phenomena involved generalized and phase synchronization, which were often intermittent. Spatially-extended systems were then studied. For locally-coupled dissimilar systems, as in the case of cortical columns, clustering behaviour occurred. Synchronized clusters emerged at different frequencies and their boundaries were marked through oscillation death. The macroscopic mean field revealed sharp spectral peaks at the frequencies of the clusters and broader spectral drops at their boundaries. These results question existing models of Event Related Synchronization and Desynchronization. We re-examined the concept of the steady-state evoked response following an AM stimulus. We showed that very little variability in the AM following response could be accounted for by system noise. We presented a methodology for detecting local and global nonlinear interactions from MEG data in order to account for residual variability. We found cross-hemispheric nonlinear interactions of ongoing cortical rhythms concurrent with the stimulus and interactions of these rhythms with the following AM responses.
Finally, we hypothesized that holistic spatial stimuli would be accompanied by the emergence of clusters in primary visual cortex, resulting in frequency-specific MEG oscillations. Indeed, we found different frequency distributions in induced gamma oscillations for different spatial stimuli, which was suggestive of temporal coding of these spatial stimuli. Further, we addressed the bursting character of these oscillations, which was suggestive of intermittent nonlinear dynamics. However, we did not observe the characteristic -3/2 power-law scaling in the distribution of interburst intervals. Further, this distribution was only seldom significantly different to the one obtained in surrogate data, where nonlinear structure was destroyed. In conclusion, the work presented in this thesis suggests that advances in dynamical systems theory, in conjunction with developments in magnetoencephalography, may facilitate a mapping between levels of description in the brain. This may potentially represent a major advancement in neuroscience.
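Phase synchronization of the kind discussed above can be illustrated with a toy model of two coupled phase oscillators. This is a generic Kuramoto sketch with invented frequencies and coupling, not the MEG methodology of the thesis:

```python
import math

def phase_locking_value(w1, w2, K, dt=0.001, steps=20000):
    """Euler-integrate two coupled Kuramoto phase oscillators and return
    the phase-locking value (PLV) of their phase difference:
    near 1 = phases locked, near 0 = phases drifting apart."""
    th1, th2 = 0.0, 1.0
    cos_sum = sin_sum = 0.0
    for _ in range(steps):
        d1 = w1 + K * math.sin(th2 - th1)
        d2 = w2 + K * math.sin(th1 - th2)
        th1 += dt * d1
        th2 += dt * d2
        diff = th1 - th2
        cos_sum += math.cos(diff)
        sin_sum += math.sin(diff)
    return math.hypot(cos_sum, sin_sum) / steps

# Strong coupling locks oscillators with mismatched natural frequencies;
# with no coupling the phase difference drifts and the PLV collapses.
print(phase_locking_value(10.0, 11.0, K=5.0))  # high, close to 1
print(phase_locking_value(10.0, 11.0, K=0.0))  # low
```

Locking occurs here because the frequency mismatch (1 rad/s) is well inside the locking range of the coupling (|Δω| < 2K); the same PLV statistic is a standard way of quantifying phase synchronization between measured signals.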
Abstract:
The overall aim of this study was to examine experimentally the effects of noise upon short-term memory tasks in the hope of shedding further light upon the apparently inconsistent results of previous research in the area. Seven experiments are presented. The first chapter of the thesis comprised a comprehensive review of the literature on noise and human performance, while in the second chapter some theoretical questions concerning the effects of noise were considered in more detail, followed by a more detailed examination of the effects of noise upon memory. Chapter 3 described an experiment which examined the effects of noise on attention allocation in short-term memory as a function of list length. The results provided only weak evidence of increased selectivity in noise. In further chapters noise effects were investigated in conjunction with various parameters of short-term memory tasks, e.g. the retention interval and presentation rate. The results suggested that noise effects were significantly affected by the length of the retention interval but not by the rate of presentation. Later chapters examined the possibility of differential noise effects on the mode of recall (recall v. recognition) and the type of presentation (sequential v. simultaneous), as well as an investigation of the effect of varying the point of introduction of the noise and the importance of individual differences in noise research. The results of this study were consistent with the hypothesis that noise at presentation facilitates phonemic coding. However, noise during recall appeared to affect the retrieval strategy adopted by the subject.
Abstract:
This thesis attempts a psychological investigation of hemispheric functioning in developmental dyslexia. Previous work using neuropsychological methods with developmental dyslexics is reviewed, and original work is presented both of a conventional psychometric nature and also utilising a new means of intervention. At the inception of inquiry into dyslexia, comparisons were drawn between developmental dyslexia and acquired alexia, promoting a model of brain damage as the common cause. Subsequent investigators found developmental dyslexics to be neurologically intact, and so an alternative hypothesis was offered, namely that language is abnormally localized (not in the left hemisphere). Research in the last decade, using the advanced techniques of modern neuropsychology, has indicated that developmental dyslexics are probably left hemisphere dominant for language. The development of a new type of pharmaceutical preparation (that appears to have a left hemisphere effect) offers an opportunity to test the experimental hypothesis. This hypothesis propounds that most dyslexics are left hemisphere language dominant, but some of these language related operations are dysfunctioning. The methods utilised are those of psychological assessment of cognitive function, both in a traditional psychometric situation, and with a new form of intervention (Piracetam). The information resulting from intervention will be judged on its therapeutic validity and contribution to the understanding of hemispheric functioning in dyslexics. The experimental studies using conventional psychometric evaluation revealed a dyslexic profile of poor sequencing and name coding ability, with adequate spatial and verbal reasoning skills. Neuropsychological information would tend to suggest that this profile was indicative of adequate right hemisphere abilities and deficits in some left hemisphere abilities.
When an intervention agent (Piracetam) was used with young adult dyslexics there were improvements in both the rate of acquisition and the conservation of verbal learning. An experimental study with dyslexic children revealed that Piracetam appeared to improve reading, writing and sequencing, but did not influence spatial abilities. This would seem to accord with other recent findings that developmental dyslexics may have left hemisphere language localisation, although some of these language related abilities are dysfunctioning.
Abstract:
We present and evaluate a novel idea for scalable lossy colour image coding with Matching Pursuit (MP) performed in a transform domain. The idea is to exploit correlations in RGB colour space between image subbands after wavelet transformation rather than in the spatial domain. We propose a simple quantisation and coding scheme of colour MP decomposition based on Run Length Encoding (RLE) which can achieve comparable performance to JPEG 2000 even though the latter utilises careful data modelling at the coding stage. Thus, the obtained image representation has the potential to outperform JPEG 2000 with a more sophisticated coding algorithm.
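Run Length Encoding itself can be sketched in a few lines. This is a generic symbol-level RLE codec for illustration, not the quantisation-specific scheme proposed above:

```python
def rle_encode(values):
    """Collapse runs of repeated symbols into [symbol, count] pairs."""
    pairs = []
    for v in values:
        if pairs and pairs[-1][0] == v:
            pairs[-1][1] += 1          # extend the current run
        else:
            pairs.append([v, 1])       # start a new run
    return pairs

def rle_decode(pairs):
    """Expand [symbol, count] pairs back into the original sequence."""
    return [v for v, n in pairs for _ in range(n)]

# Quantised MP coefficients are typically sparse, i.e. dominated by
# zero runs, which is what makes RLE effective on them.
data = [0, 0, 0, 5, 5, 0, 0, 0, 0, 7]
encoded = rle_encode(data)
print(encoded)                         # [[0, 3], [5, 2], [0, 4], [7, 1]]
assert rle_decode(encoded) == data     # lossless round trip
```

A production coder would follow the run-length pass with an entropy coder, which is the kind of "more sophisticated coding algorithm" the abstract identifies as the route to outperforming JPEG 2000.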
Abstract:
This research was originally undertaken to aid the Jamaican government and the World Bank in making funding decisions relative to the improvement of road systems and traffic control in Jamaica. An investigation of the frequency and causes of road accidents and an evaluation of their impact on the Jamaican economy were carried out, and a model system which might be applied was developed. It is believed that the importance of road accident economic and manpower losses to the survival of developing countries, such as Jamaica, cannot be overemphasized. It is suggested that the World Bank, in cooperation with national governments, has a role to play in alleviating this serious problem. Data was collected from such organizations as the Jamaica Ministry of Construction, the Police Department, the World Bank, and the World Health Organization. A variety of methodologies were utilized to organize this data in useful and understandable forms. The most important conclusion of this research is that solvable problems in road systems and in traffic control result in the unnecessary loss of useful citizens, in both developed and developing countries. However, a lack of information and understanding regarding the impact of high rates of road accident death and injury on the national economy and stability of a country results in an apparent lack of concern. Having little internal expertise in the field of road accident prevention, developing countries usually hire consultants to help them address this problem. In the case of Jamaica, this practice has resulted in distrust and hard feelings between the Jamaican authorities and major organizations involved in the field. Jamaican officials have found the recommendations of most experts contracted to study traffic safety confusing.
The attempts of foreign consultants to utilize a technological approach (the use of coding systems and computers), methods which do not appear cost-effective for Jamaica, have resulted in the expenditure of limited funds on studies which offer no feasible approach to the problem. This funding limitation, which hampers research and road improvement, could be alleviated by such organizations as the World Bank. It was found that the causes of high accident rates are many. Formulation of a plan to address this serious problem must take into account the current failure to appreciate the impact of a high level of road accidents on national economy and stability, the inability to find a feasible approach to the problem, and inadequate funding. Such a plan is discussed in detail in the main text of this research.
Abstract:
The orientations of lines and edges are important in defining the structure of the visual environment, and observers can detect differences in line orientation within the first few hundred milliseconds of scene viewing. The present work is a psychophysical investigation of the mechanisms of early visual orientation-processing. In experiments with briefly presented displays of line elements, observers indicated whether all the elements were uniformly oriented or whether a uniquely oriented target was present among uniformly oriented nontargets. The minimum difference between nontarget and target orientations that was required for effective target-detection (the orientation increment threshold) varied little with the number of elements and their spatial density, but the percentage of correct responses in detection of a large orientation-difference increased with increasing element density. The differing variations with element density of thresholds and percent-correct scores may indicate the operation of more than one mechanism in early visual orientation-processing. Reducing element length caused threshold to increase with increasing number of elements, showing that the effectiveness of rapid, spatially parallel orientation-processing depends on element length. Orientational anisotropy in line-target detection has been reported previously: a coarse periodic variation and some finer variations in orientation increment threshold with nontarget orientation have been found. In the present work, the prominence of the coarse variation in relation to finer variations decreased with increasing effective viewing duration, as if the operation of coarse orientation-processing mechanisms precedes the operation of finer ones. Orientational anisotropy was prominent even when observers lay horizontally and viewed displays by looking upwards through a black cylinder that excluded all possible visual references for orientation.
So, gravitational and visual cues are not essential to the definition of an orientational reference frame for early vision, and such a reference can be well defined by retinocentric neural coding, awareness of body-axis orientation, or both.
Abstract:
This thesis proposes that despite many experimental studies of thinking, and the development of models of thinking, such as Bruner's (1966) enactive, iconic and symbolic developmental modes, the imagery and inner verbal strategies used by children need further investigation to establish a coherent, theoretical basis from which to create experimental curricula for direct improvement of those strategies. Five hundred and twenty-three first, second and third year comprehensive school children were tested on 'recall' imagery, using a modified Betts Imagery Test, and on a test of dual-coding processes (Paivio, 1971, p.179), the P/W Visual/Verbal Questionnaire, measuring 'applied imagery' and inner verbalising. Three lines of investigation were pursued: 1. An investigation of (a) hypothetical representational strategy differences between boys and girls, and (b) the extent to which strategies change with increasing age. 2. The second and third year children's use of representational processes was taken separately and compared with performance measures of perception, field independence, creativity, self-sufficiency and self-concept. 3. The second and third year children were categorised into four dual-coding strategy groups: a. High Visual/High Verbal; b. Low Visual/High Verbal; c. High Visual/Low Verbal; d. Low Visual/Low Verbal. These groups were compared on the same performance measures. The main result indicates that: 1. A hierarchy of dual-coding strategy use can be identified that is significantly related (.01, Binomial Test) to success or failure in the performance measures: the High Visual/High Verbal group registering the highest scores, the Low Visual/High Verbal and High Visual/Low Verbal groups registering intermediate scores, and the Low Visual/Low Verbal group registering the lowest scores on the performance measures. Subsidiary results indicate that: 2.
Boys' use of visual strategies declines, and of verbal strategies increases, with age; girls' recall imagery strategy increases with age. Educational implications from the main result are discussed, the establishment of experimental curricula proposed, and further research suggested.
Abstract:
The need for an adequate information system for the Highways Departments in the United Kingdom was recognised by the report of a committee presented to the Minister of Transport in 1970 (the Marshall Report). This research aims to present a comprehensive information system on a sound theoretical basis which should enable the different levels of management to execute their work adequately. The suggested system presented in this research covers the different functions of the Highways Department, and presents a suggested solution for problems which may occur during the planning and controlling of work in the different locations of the Highways Department. The information system consists of: 1. A coding system covering the cost units, cost centres and cost elements. 2. Cost accounting records for the cost units and cost centres. 3. A budgeting and budgetary control system covering the different planning methods and procedures which are required for preparing the capital expenditure budget, the improvement and maintenance operation flexible budgets and programme of work, the plant budget, the administration budget, and the purchasing budget. 4. A reporting system which ensures that the different levels of management receive relevant and timely information. 5. The flow of documents, which covers the relationship between the prime documents, the cost accounting records, budgets and reports, and their relation to the different sections and offices within the department. A comprehensive set of cost unit, cost centre and cost element codes, together with a number of examples demonstrating the results of the survey and examples of the application and procedures of the suggested information system, is illustrated separately in the appendices. The emphasis is on the information required for internal control by management personnel within the County Council.
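The kind of composite coding system described in point 1 can be sketched as follows. The three-segment layout, segment names and example values here are entirely hypothetical, since the abstract does not give the actual code format:

```python
# Hypothetical composite cost code of the form "UNIT-CENTRE-ELEMENT",
# illustrating how one code can carry cost unit, cost centre and cost
# element together for accounting and reporting purposes.
def parse_cost_code(code):
    unit, centre, element = code.split("-")
    return {"cost_unit": unit, "cost_centre": centre, "cost_element": element}

print(parse_cost_code("RD12-MAINT-LAB"))
# {'cost_unit': 'RD12', 'cost_centre': 'MAINT', 'cost_element': 'LAB'}
```

Encoding all three dimensions in one key is what lets a single ledger entry feed cost-unit records, cost-centre records and budget reports without re-keying.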