923 results for n-way data analysis


Relevance: 90.00%

Abstract:

This paper describes how the statistical technique of cluster analysis and the machine learning technique of rule induction can be combined to explore a database. The ways in which such an approach alleviates the problems associated with other techniques for data analysis are discussed. We report the results of experiments carried out on a database from the medical diagnosis domain. Finally we describe the future developments which we plan to carry out to build on our current work.
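The abstract gives no implementation details, but the combination it describes — clustering a database and then inducing rules that characterise the clusters — can be sketched in a few lines. The sketch below is an illustrative assumption, not the paper's method: a plain k-means pass followed by a single-feature threshold ("stump") rule induced for one cluster.

```python
import numpy as np

def kmeans(X, k, n_iter=50, seed=0):
    """Plain k-means: return a cluster label for each row of X."""
    rng = np.random.default_rng(seed)
    centres = X[rng.choice(len(X), size=k, replace=False)].copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        # assign each point to its nearest centre, then recompute centres
        dist = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
        labels = dist.argmin(1)
        for j in range(k):
            if (labels == j).any():
                centres[j] = X[labels == j].mean(0)
    return labels

def stump_rule(X, member):
    """Induce the single-feature threshold rule that best predicts
    `member` (boolean cluster membership).
    Returns (feature index, threshold, accuracy)."""
    best = (0, 0.0, 0.0)
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            for side in (X[:, f] <= t, X[:, f] > t):
                acc = (side == member).mean()
                if acc > best[2]:
                    best = (f, float(t), float(acc))
    return best
```

Run on two synthetic blobs separated along the first feature, `stump_rule` recovers a human-readable description of a cluster ("feature 0 below t"), which is the kind of cluster/rule interplay the paper exploits for database exploration.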

Relevance: 90.00%

Abstract:

Principal component analysis (PCA) is a ubiquitous technique for data analysis and processing, but one which is not based upon a probability model. In this paper we demonstrate how the principal axes of a set of observed data vectors may be determined through maximum-likelihood estimation of parameters in a latent variable model closely related to factor analysis. We consider the properties of the associated likelihood function, giving an EM algorithm for estimating the principal subspace iteratively, and discuss the advantages conveyed by the definition of a probability density function for PCA.
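The EM algorithm the abstract refers to is the probabilistic-PCA procedure of Tipping and Bishop, whose standard updates are W ← SW(σ²I + M⁻¹WᵀSW)⁻¹ and σ² ← tr(S − SWM⁻¹Wᵀ_new)/d with M = WᵀW + σ²I. A minimal sketch follows; the initialisation and iteration count are assumptions:

```python
import numpy as np

def ppca_em(X, q, n_iter=200, seed=0):
    """EM for probabilistic PCA.

    X: (n, d) data matrix; q: latent dimension.
    Returns W (d, q) spanning the principal subspace and the
    noise variance sigma2.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Xc = X - X.mean(axis=0)          # centre the data
    S = Xc.T @ Xc / n                # sample covariance
    W = rng.standard_normal((d, q))
    sigma2 = 1.0
    for _ in range(n_iter):
        M = W.T @ W + sigma2 * np.eye(q)
        Minv = np.linalg.inv(M)
        SW = S @ W
        # M-step updates for W and sigma^2
        W_new = SW @ np.linalg.inv(sigma2 * np.eye(q) + Minv @ W.T @ SW)
        sigma2 = np.trace(S - SW @ Minv @ W_new.T) / d
        W = W_new
    return W, sigma2
```

At convergence the columns of W span the same subspace as the leading q eigenvectors of S, and σ² estimates the variance lost in the discarded directions — exactly the probabilistic reading of PCA the paper advocates.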

Relevance: 90.00%

Abstract:

This thesis presents an investigation of synchronisation and causality, motivated by problems in computational neuroscience. The thesis addresses both theoretical and practical signal-processing issues regarding the estimation of interdependence from a set of multivariate data generated by a complex underlying dynamical system. The topic is driven by a series of problems in neuroscience, which form the principal background to the material in this work. The underlying system is the human brain, and the data are generated by modern electromagnetic neuroimaging methods. In this thesis, the brain's underlying functional mechanisms are described using the recent mathematical formalism of dynamical systems on complex networks. This is justified principally on the grounds of the complex hierarchical and multiscale nature of the brain, and it offers new methods of analysis for modelling its emergent phenomena. A fundamental approach to studying neural activity is to investigate the connectivity pattern developed by the brain's complex network. Three types of connectivity are important to study: 1) anatomical connectivity, referring to the physical links forming the topology of the brain network; 2) effective connectivity, concerned with the way neural elements communicate with each other across the brain's anatomical structure, through phenomena of synchronisation and information transfer; 3) functional connectivity, an epistemic concept which refers to the interdependence between data measured from the brain network. The main contribution of this thesis is to present, apply and discuss novel algorithms for functional connectivity, designed to extract different specific aspects of interaction between the underlying generators of the data. Firstly, a univariate statistic is developed to allow indirect assessment of synchronisation in the local network from a single time series.
This approach is useful for inferring the coupling in a local cortical area as observed by a single measurement electrode. Secondly, different existing methods of phase synchronisation are considered from the perspective of experimental data analysis and inference of coupling from observed data. These methods are designed to address the estimation of medium- to long-range connectivity, and their differences are particularly relevant in the context of volume conduction, which is known to produce spurious detections of connectivity. Finally, an asymmetric temporal metric is introduced in order to detect the direction of the coupling between different regions of the brain. The method developed in this thesis is based on a machine-learning extension of the well-known concept of Granger causality. The discussion is developed alongside examples of synthetic and real experimental data. The synthetic data are simulations of complex dynamical systems intended to mimic the behaviour of simple cortical neural assemblies; they are used to test the techniques developed in this thesis. The real datasets illustrate the problem of brain connectivity in the case of important neurological disorders such as epilepsy and Parkinson's disease. The methods of functional connectivity in this thesis are applied to intracranial EEG recordings in order to extract features which characterise the underlying spatiotemporal dynamics before, during and after an epileptic seizure, and to predict seizure location and onset prior to conventional electrographic signs. The methodology is also applied to an MEG dataset containing healthy, Parkinson's and dementia subjects, with the aim of distinguishing pathological from physiological patterns of connectivity.
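The thesis's directionality metric is a machine-learning extension of Granger causality; the plain linear form it extends can be sketched as follows. This is a generic illustration, not the method developed in the thesis: y is said to Granger-cause x when including past values of y reduces the residual variance of an autoregressive prediction of x.

```python
import numpy as np

def granger_gain(x, y, p=2):
    """Bivariate linear Granger statistic with model order p:
    log ratio of residual variances when predicting x from its own past
    versus its own past plus the past of y.
    Positive values suggest y helps predict x (y -> x)."""
    n = len(x)
    target = x[p:]
    own = np.column_stack([x[p - k:n - k] for k in range(1, p + 1)])
    other = np.column_stack([y[p - k:n - k] for k in range(1, p + 1)])

    def resid_var(A):
        # OLS fit with intercept; return mean squared residual
        A = np.column_stack([np.ones(len(A)), A])
        beta, *_ = np.linalg.lstsq(A, target, rcond=None)
        r = target - A @ beta
        return r @ r / len(r)

    return np.log(resid_var(own) / resid_var(np.column_stack([own, other])))
```

On a simulated pair where x is driven by the past of y, the statistic is large in the driving direction and near zero in the reverse direction, which is the asymmetry the thesis's temporal metric is built to detect.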

Relevance: 90.00%

Abstract:

This paper examines the source country determinants of FDI into Japan. The paper highlights certain methodological and theoretical weaknesses in the previous literature and offers some explanations for hitherto ambiguous results. Specifically, the paper highlights the importance of panel data analysis, and the identification of fixed effects in the analysis rather than simply pooling the data. Indeed, we argue that many of the results reported elsewhere are a feature of this mis-specification. To this end, pooled, fixed effects and random effects estimates are compared. The results suggest that FDI into Japan is inversely related to trade flows, such that trade and FDI are substitutes. Moreover, the results also suggest that FDI increases with home country political and economic stability. The paper also shows that previously reported results, regarding the importance of exchange rates, relative borrowing costs and labour costs in explaining FDI flows, are sensitive to the econometric specification and estimation approach. The paper also discusses the importance of these results within a policy context. In recent years Japan has sought to attract FDI, though many firms still complain of barriers to inward investment penetration in Japan. The results show that cultural and geographic distance are only of marginal importance in explaining FDI, and that the results are consistent with the market-seeking explanation of FDI. As such, the attitude to risk in the source country is strongly related to the size of FDI flows to Japan. © 2007 The Authors Journal compilation © 2007 Blackwell Publishing Ltd.
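The paper's econometric point — that pooled estimates can be an artefact of ignoring fixed effects — can be illustrated with a minimal within-estimator sketch. The synthetic data and single regressor below are assumptions for illustration, not the paper's FDI panel:

```python
import numpy as np

def pooled_ols(y, x):
    """Pooled OLS slope with a common intercept, ignoring group structure."""
    A = np.column_stack([np.ones(len(x)), x])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta[1]

def fixed_effects(y, x, groups):
    """Within (fixed-effects) estimator: demeaning y and x inside each
    group sweeps out the group-specific intercepts before estimating
    the slope."""
    yd = y.astype(float).copy()
    xd = x.astype(float).copy()
    for g in np.unique(groups):
        m = groups == g
        yd[m] -= yd[m].mean()
        xd[m] -= xd[m].mean()
    beta, *_ = np.linalg.lstsq(xd[:, None], yd, rcond=None)
    return beta[0]
```

When the group intercepts are correlated with the regressor — as source-country effects plausibly are with trade flows — pooling flips the sign of a true negative slope, while the within estimator recovers it; this is the mis-specification the paper warns against.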

Relevance: 90.00%

Abstract:

The present work describes the development of a proton-induced X-ray emission (PIXE) analysis system, especially designed and built for routine quantitative multi-elemental analysis of large numbers of samples. The historical and general development of the analytical technique and the physical processes involved are discussed. The philosophy, design, constructional details and evaluation of a versatile vacuum chamber, an automatic multi-sample changer, an on-demand beam-pulsing system and an ion-beam current-monitoring facility are described. The system calibration using thin standard foils of Si, P, S, Cl, K, Ca, Ti, V, Fe, Cu, Ga, Ge, Rb, Y and Mo was undertaken at proton beam energies of 1 to 3 MeV in steps of 0.5 MeV and compared with theoretical calculations. An independent calibration check using bovine liver Standard Reference Material was performed. The minimum detectable limits have been determined experimentally at detector positions of 90° and 135° with respect to the incident beam for the above range of proton energies, as a function of atomic number Z. The system has detection limits of typically 10⁻⁷ to 10⁻⁹ g for elements with atomic numbers from 14 upwards, together with facilities for data analysis and calculations of the areal density of thin foils using Rutherford backscattering data. Amniotic fluid samples supplied by the South Sefton Health Authority were successfully analysed for their low baseline elemental concentrations. In conclusion, the findings of this work are discussed, with suggestions for further work.

Relevance: 90.00%

Abstract:

This article examines the negotiation of face in post observation feedback conferences on an initial teacher training programme. The conferences were held in groups with one trainer and up to four trainees and followed a set of generic norms. These norms include the right to offer advice and to criticise, speech acts which are often considered to be face threatening in more normal contexts. However, as the data analysis shows, participants also interact in ways that challenge the generic norms, some of which might be considered more conventionally face attacking. The article argues that face should be analysed at the level of interaction (Haugh and Bargiela-Chiappini, 2010) and that situated and contextual detail is relevant to its analysis. It suggests that linguistic ethnography, which 'marries' (Wetherell, 2007) linguistics and ethnography, provides a useful theoretical framework for doing so. To this end the study draws on real-life talk-in-interaction (from transcribed recordings), the participants' perspectives (from focus groups and interviews) and situated detail (from fieldnotes) to produce a contextualised and nuanced analysis. © 2011 Elsevier B.V.

Relevance: 90.00%

Abstract:

The goal of this study is to determine if various measures of contraction rate are regionally patterned in written Standard American English. In order to answer this question, this study employs a corpus-based approach to data collection and a statistical approach to data analysis. Based on a spatial autocorrelation analysis of the values of eleven measures of contraction across a 25 million word corpus of letters to the editor representing the language of 200 cities from across the contiguous United States, two primary regional patterns were identified: easterners tend to produce relatively few standard contractions (not contraction, verb contraction) compared to westerners, and northeasterners tend to produce relatively few non-standard contractions (to contraction, non-standard not contraction) compared to southeasterners. These findings demonstrate that regional linguistic variation exists in written Standard American English and that regional linguistic variation is more common than is generally assumed.

Relevance: 90.00%

Abstract:

This thesis describes the development of an inexpensive and efficient clustering technique for multivariate data analysis. The technique starts from a multivariate data matrix and ends with a graphical representation of the data and a pattern-recognition discriminant function. The technique also produces a distances frequency distribution that might be useful in detecting clustering in the data, or for the estimation of parameters useful in discriminating between the different populations in the data. The technique can also be used in feature selection. It is essentially a means of discovering data structure by revealing the component parts of the data. The thesis offers three distinct contributions to cluster analysis and pattern recognition techniques. The first contribution is the introduction of a transformation function into the technique of nonlinear mapping. The second contribution is the use of the distances frequency distribution instead of the distances time-sequence in nonlinear mapping. The third contribution is the formulation of a new generalised and normalised error function, together with its optimal step-size formula for gradient-method minimisation. The thesis consists of five chapters. The first chapter is the introduction. The second chapter describes multidimensional scaling as the origin of the nonlinear mapping technique. The third chapter describes the first development step in the technique of nonlinear mapping, namely the introduction of the transformation function. The fourth chapter describes the second development step, the use of the distances frequency distribution instead of the distances time-sequence; the chapter also includes the formulation of the new generalised and normalised error function. Finally, the fifth chapter, the conclusion, evaluates all the developments and proposes a new program for cluster analysis and pattern recognition integrating all the new features.
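The nonlinear mapping technique the thesis develops has its origin in multidimensional scaling. The classical (Torgerson) form of MDS can be sketched as below; this is only the textbook starting point — the thesis's transformation function and distances-frequency-distribution extensions are not reproduced here:

```python
import numpy as np

def classical_mds(X, k=2):
    """Classical (Torgerson) MDS: embed the rows of X in k dimensions so
    that pairwise Euclidean distances are reproduced as well as possible."""
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # squared distances
    n = len(X)
    J = np.eye(n) - np.ones((n, n)) / n                   # centring matrix
    B = -0.5 * J @ D2 @ J                                 # double-centred Gram matrix
    w, V = np.linalg.eigh(B)
    order = np.argsort(w)[::-1][:k]                       # top-k eigenpairs
    return V[:, order] * np.sqrt(np.maximum(w[order], 0.0))
```

For data whose intrinsic dimension is at most k, the embedding reproduces the pairwise distances exactly; nonlinear mapping generalises this by iteratively minimising a distance-mismatch error instead of solving an eigenproblem.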

Relevance: 90.00%

Abstract:

The role of the production system as a key determinant of the competitive performance of business operations has long been the subject of industrial organization research, even predating the explicit conceptualisation of manufacturing strategy in the literature. Particular emergent production issues, such as the globalisation of production, global supply chain management, the management of integrated manufacturing and a growing e-business environment, are expected to critically influence overall competitive performance and therefore the strategic success of the organization. More than ever, there is a critical need to configure and improve the production system and operations competence in a strategic way, so as to contribute to the long-term competitiveness of the organization. In order to operate competitively and profitably, manufacturing companies, no matter how well managed, need a long-term 'strategic direction' for the development of operations competence, in order to consistently produce more market value at less cost and move towards a leadership position. As to long-term competitiveness, it is more important to establish a dynamic 'strategic perspective' for continuous operational improvement in pursuit of this direction, together with ongoing reviews of the direction in relation to the overall operating context. However, it is also clear that the existing paradigm of manufacturing strategy development is incapable of adequately responding to the increasing complexity and variety of contemporary business operations. This is reflected in the fact that many manufacturing companies are finding that the methodologies advocated in the existing paradigm for developing manufacturing strategy have very limited scale and scope for contextual contingency in empirical application.
More importantly, there has also emerged a deficiency in the multidimensional and integrative profile from a theoretical perspective when operationalising the underlying concept of strategic manufacturing management established in the literature. The point of departure for this study was a recognition of such contextual and unitary limitations in the existing paradigm of manufacturing strategy development when applied to contemporary industrial organizations in general, and Chinese State Owned Enterprises (SOEs) in particular. As China gradually becomes integrated into the world economy, the relevance of Western management theory and its paradigm becomes a practical matter as much as a theoretical issue. Since China markedly differs from Western countries in terms of culture, society, and political and economic systems, it presents promising grounds to test and refine existing management theories and paradigms with greater contextual contingency and wider theoretical perspective. Under China's ongoing programmes of SOE reform, there has been an increased recognition that strategy development is the very essence of the management task for managers of manufacturing companies in the same way as it is for their counterparts in Western economies. However, the Western paradigm often displays a rather naive and unitary perspective of the nature of strategic management decision-making, one which largely overlooks context-embedded factors and social/political influences on the development of manufacturing strategy. This thesis studies the successful experiences of developing manufacturing strategy from five high-performing large-scale SOEs within China’s petrochemical industry. China’s petrochemical industry constitutes a basic heavy industrial sector, which has always been a strategic focus for reform and development by the Chinese government. 
Using a confirmation approach, the study has focused on exploring and conceptualising the empirical paradigm of manufacturing strategy development as practised by management: that is, examining the 'empirical specifics' and surfacing the 'managerial perceptions' of content configuration, context of consideration, and process organization in developing a manufacturing strategy in practice. The research investigation adopts a qualitative exploratory case-study methodology with a semi-structured front-end research design. Data collection follows a longitudinal and multiple-case design and triangulates case evidence from sources including qualitative interviews, direct observation, and a search of documentation and archival records. Data analysis follows an investigative progression from within-case preliminary interpretation of facts to a cross-case search for patterns through theoretical comparison and analytical generalization. The underlying conceptions in both the manufacturing strategy literature and related studies in business strategy were used to develop the theoretical framework and analytical templates applied during data collection and analysis. The thesis makes both empirical and theoretical contributions to our understanding of the contemporary management paradigm of manufacturing strategy development. First, it provides a valuable contextual contingency of the subject, using the business setting of China's SOEs in the petrochemical industry. This has been unpacked into empirical configurations developed for its context of consideration, its content and its process respectively. Of special note, a lean paradigm of business operations and production management discovered at the case companies has significant implications as an emerging alternative for high-volume, capital-intensive state manufacturing in China.
Second, it provides a multidimensional and integrative theoretical profile of the subject, based upon managerial perspectives conceptualised at the case companies when operationalising manufacturing strategy. This has been unpacked into conceptual frameworks developed for its context of consideration, its content constructs, and its process patterns respectively. Notably, a synergies perspective on the operating context, competitive priorities and competence development of business operations and production management has significant implications for implementing a lean manufacturing paradigm. In so doing, the thesis establishes a theoretical platform for the future refinement and development of context-specific methodologies for developing manufacturing strategy.

Relevance: 90.00%

Abstract:

A notable feature of the recent commercialisation of biotechnology has been the success of 200 or so new firms, established in America since 1976, in exploiting specialised market niches. A key factor in their formation has been the ready availability of venture capital funding. These firms have been instrumental in establishing America's lead in exploiting biotechnology. It is this example which Britain has attempted to emulate as part of its strategy for developing its own biotechnology capabilities. This thesis investigated some aspects of the relationship between biotechnology and venture capital, concentrating on the determinants of the venture capitalist's investment decision. Following an extensive literature survey, two hypothetical business proposals were used to find what venture capitalists themselves consider to be the key elements of this decision. It was found that venture capitalists invest in people, not products, and businesses, not industries. It was concluded that venture capital-backed small firms should, therefore, be seen as an adjunct to the development of biotechnology in Britain, rather than as a substitute for a co-ordinated, co-operative strategy involving Government, the financial institutions, industry and academia. This is chiefly because the small size of the UK's domestic market means that many potentially important innovations in biotechnology may continue to be lost, since the short term identification of market opportunities for biotechnology products will dictate that they are insupportable in Britain alone. In addition, the data analysis highlighted some interesting methodological issues concerning the investigation of investment decision making. These related especially to shortcomings in the use of scoresheets and questionnaires in research in this area. The conclusion here was that future research should concentrate on the reasons why an individual reaches an investment decision. 
It is argued that only in this way can the nature of the evaluation procedures employed by venture capitalists be properly understood.

Relevance: 90.00%

Abstract:

This book is aimed primarily at microbiologists who are undertaking research and who require a basic knowledge of statistics to analyse their experimental data. Computer software employing a wide range of data analysis methods is widely available to experimental scientists. The availability of this software, however, makes it essential that investigators understand the basic principles of statistics. Statistical analysis of data can be complex with many different methods of approach, each of which applies in a particular experimental circumstance. Hence, it is possible to apply an incorrect statistical method to data and to draw the wrong conclusions from an experiment. The purpose of this book, which has its origin in a series of articles published in the Society for Applied Microbiology journal ‘The Microbiologist’, is an attempt to present the basic logic of statistics as clearly as possible and therefore, to dispel some of the myths that often surround the subject. The 28 ‘Statnotes’ deal with various topics that are likely to be encountered, including the nature of variables, the comparison of means of two or more groups, non-parametric statistics, analysis of variance, correlating variables, and more complex methods such as multiple linear regression and principal components analysis. In each case, the relevant statistical method is illustrated with examples drawn from experiments in microbiological research. The text incorporates a glossary of the most commonly used statistical terms and there are two appendices designed to aid the investigator in the selection of the most appropriate test.

Relevance: 90.00%

Abstract:

University students encounter difficulties with academic English because of its vocabulary, phraseology, and variability, and also because academic English differs in many respects from general English, the language which they have experienced before starting their university studies. Although students have been provided with many dictionaries that contain some helpful information on words used in academic English, these dictionaries remain focused on the uses of words in general English. There is therefore a gap in the dictionary market for a dictionary for university students, and this thesis provides a proposal for such a dictionary (called the Dictionary of Academic English; DOAE) in the form of a model which depicts how the dictionary should be designed, compiled, and offered to students. The model draws on state-of-the-art techniques in lexicography, dictionary-use research, and corpus linguistics. The model demanded the creation of a completely new corpus of academic language (Corpus of Academic Journal Articles; CAJA). The main advantages of the corpus are its large size (83.5 million words) and balance. Having access to a large corpus of academic language was essential for a corpus-driven approach to data analysis. A good corpus balance in terms of domains enabled a detailed domain-labelling of senses, patterns, collocates, etc. in the dictionary database, which was then used to tailor the output according to the needs of different types of student. The model proposes an online dictionary that is designed as an online dictionary from the outset. The proposed dictionary is revolutionary in the way it addresses the needs of different types of student. It presents students with a dynamic dictionary whose contents can be customised according to the user's native language, subject of study, variant spelling preferences, and/or visual preferences (e.g. black and white).

Relevance: 90.00%

Abstract:

Despite abundant literature on human behaviour in the face of danger, much remains to be discovered. Some descriptive models of behaviour in the face of danger are reviewed in order to identify areas where documentation is lacking. It is argued that little is known about the recognition and assessment of danger, and yet these are important aspects of cognitive processes. Speculative arguments about hazard assessment are reviewed and tested against the results of previous studies. Once the hypotheses are formulated, the reasons for retaining the repertory grid as the main research instrument are outlined, and the choice of data analysis techniques is described. Whilst all samples used repertory grids, the rating scales differed between samples; therefore, an analysis is performed of the way in which the rating scales were used in the various samples, and of some reasons why the scales were used differently. Then, individual grids are examined and compared between respondents within each sample; consensus grids are also discussed. The major results from all samples are then contrasted and compared. It was hypothesized that hazard assessment would encompass three main dimensions, i.e. 'controllability', 'severity of consequences' and 'likelihood of occurrence', and that they would emerge in that order. The results suggest that these dimensions are but facets of two broader dimensions, labelled 'scope of human intervention' and 'dangerousness'. It seems that these two dimensions encompass a number of more specific dimensions, some of which can be further fragmented. Thus, hazard assessment appears to be a more complex process, about which much remains to be discovered. Some of the ways in which further discovery might proceed are discussed.

Relevance: 90.00%

Abstract:

Post-disaster housing reconstruction projects face several challenges. Resources and material supplies are often scarce; several different types of organization are involved; and projects must be completed as quickly as possible to foster recovery. Within this context, the chapter aims to increase understanding of relief supply chain design in reconstruction. In addition, the chapter introduces a community-based, beneficiary perspective to relief supply chains by evaluating the implications of local components for supply chain design in reconstruction. This is achieved by means of secondary data analysis based on the evaluation reports of two major housing reconstruction projects that took place in Europe in the last decade. A comparative analysis of the organizational designs of these projects highlights the ways in which users can be involved. The performance of reconstruction supply chains seems to depend to a large extent on the way beneficiaries are integrated in supply chain design, which impacts positively on the effectiveness of reconstruction supply chains.