48 results for Dynamic data analysis
Abstract:
This paper examines the source country determinants of FDI into Japan. The paper highlights certain methodological and theoretical weaknesses in the previous literature and offers some explanations for hitherto ambiguous results. Specifically, the paper highlights the importance of panel data analysis, and the identification of fixed effects in the analysis rather than simply pooling the data. Indeed, we argue that many of the results reported elsewhere are a feature of this mis-specification. To this end, pooled, fixed effects and random effects estimates are compared. The results suggest that FDI into Japan is inversely related to trade flows, such that trade and FDI are substitutes. Moreover, the results also suggest that FDI increases with home country political and economic stability. The paper also shows that previously reported results, regarding the importance of exchange rates, relative borrowing costs and labour costs in explaining FDI flows, are sensitive to the econometric specification and estimation approach. The paper also discusses the importance of these results within a policy context. In recent years Japan has sought to attract FDI, though many firms still complain of barriers to inward investment penetration in Japan. The results show that cultural and geographic distance are only of marginal importance in explaining FDI, and that the results are consistent with the market-seeking explanation of FDI. As such, the attitude to risk in the source country is strongly related to the size of FDI flows to Japan. © 2007 The Authors Journal compilation © 2007 Blackwell Publishing Ltd.
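The paper's methodological point, that pooling panel data can mask country-level heterogeneity and bias the estimates, can be illustrated with a small simulation. This is a generic sketch with invented variables, not the paper's data or model:

```python
import numpy as np

rng = np.random.default_rng(0)
n_countries, n_years, beta = 20, 10, 1.5

# Simulate a panel where country effects are correlated with the regressor —
# the situation in which pooled OLS is biased but fixed effects is not.
alpha = rng.normal(size=n_countries)                 # country fixed effects
x = alpha[:, None] + rng.normal(size=(n_countries, n_years))
y = alpha[:, None] + beta * x + rng.normal(scale=0.5, size=(n_countries, n_years))

def ols_slope(x, y):
    x, y = x.ravel(), y.ravel()
    return np.cov(x, y, bias=True)[0, 1] / np.var(x)

pooled = ols_slope(x, y)                             # ignores country effects
fe = ols_slope(x - x.mean(axis=1, keepdims=True),    # within transformation
               y - y.mean(axis=1, keepdims=True))
print(f"pooled={pooled:.2f}  fixed-effects={fe:.2f}  true beta={beta}")
```

The pooled slope absorbs the correlation between the country effects and the regressor, while the within (fixed effects) transformation removes it, which is exactly why the two specifications can tell different stories about the same data.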
Abstract:
Analysis of variance (ANOVA) is the most efficient method available for the analysis of experimental data. Analysis of variance is a method of considerable complexity and subtlety, with many different variations, each of which applies in a particular experimental context. Hence, it is possible to apply the wrong type of ANOVA to data and, therefore, to draw an erroneous conclusion from an experiment. This article reviews the types of ANOVA most likely to arise in clinical experiments in optometry including the one-way ANOVA ('fixed' and 'random effect' models), two-way ANOVA in randomised blocks, three-way ANOVA, and factorial experimental designs (including the varieties known as 'split-plot' and 'repeated measures'). For each ANOVA, the appropriate experimental design is described, a statistical model is formulated, and the advantages and limitations of each type of design discussed. In addition, the problems of non-conformity to the statistical model and determination of the number of replications are considered. © 2002 The College of Optometrists.
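As a concrete illustration of the simplest design reviewed here, the one-way fixed-effect ANOVA F statistic can be computed directly from its sums of squares. The data below are invented purely for illustration:

```python
import numpy as np

# Three hypothetical treatment groups of four replicates each
groups = [np.array([1.2, 1.4, 1.1, 1.3]),
          np.array([1.8, 1.9, 1.7, 2.0]),
          np.array([1.5, 1.4, 1.6, 1.5])]

k = len(groups)
n = sum(len(g) for g in groups)
grand_mean = np.concatenate(groups).mean()

# Partition the total variation into between-groups and within-groups parts
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)

# F is the ratio of the two mean squares
f_stat = (ss_between / (k - 1)) / (ss_within / (n - k))
print(f"F({k - 1}, {n - k}) = {f_stat:.2f}")
```

Choosing the wrong model (e.g. treating a repeated-measures design as if the twelve observations were independent) changes which sums of squares and degrees of freedom are appropriate, which is the source of the erroneous conclusions the article warns about.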
Abstract:
Most current 3D landscape visualisation systems either use bespoke hardware solutions or offer a limited amount of interaction and detail when used in real-time mode. We are developing a modular, data-driven 3D visualisation system that can be readily customised to specific requirements. By utilising the latest software engineering methods and bringing a dynamic data-driven approach to geo-spatial data visualisation, we will deliver an unparalleled level of customisation in near-photorealistic, real-time 3D landscape visualisation. In this paper we present the system framework and describe how it employs data-driven techniques. In particular, we discuss how data-driven approaches are applied to the spatio-temporal management aspect of the application framework, and describe the advantages these convey.
Abstract:
The present work describes the development of a proton-induced X-ray emission (PIXE) analysis system, especially designed and built for routine quantitative multi-elemental analysis of a large number of samples. The historical and general development of the analytical technique and the physical processes involved are discussed. The philosophy, design, constructional details and evaluation of a versatile vacuum chamber, an automatic multi-sample changer, an on-demand beam pulsing system and an ion beam current monitoring facility are described. The system calibration using thin standard foils of Si, P, S, Cl, K, Ca, Ti, V, Fe, Cu, Ga, Ge, Rb, Y and Mo was undertaken at proton beam energies of 1 to 3 MeV in steps of 0.5 MeV and compared with theoretical calculations. An independent calibration check using bovine liver Standard Reference Material was performed. The minimum detectable limits have been experimentally determined at detector positions of 90° and 135° with respect to the incident beam for the above range of proton energies as a function of atomic number Z. The system has detection limits of typically 10⁻⁷ to 10⁻⁹ g for elements 14
Abstract:
This article examines the negotiation of face in post observation feedback conferences on an initial teacher training programme. The conferences were held in groups with one trainer and up to four trainees and followed a set of generic norms. These norms include the right to offer advice and to criticise, speech acts which are often considered to be face threatening in more normal contexts. However, as the data analysis shows, participants also interact in ways that challenge the generic norms, some of which might be considered more conventionally face attacking. The article argues that face should be analysed at the level of interaction (Haugh and Bargiela-Chiappini, 2010) and that situated and contextual detail is relevant to its analysis. It suggests that linguistic ethnography, which 'marries' (Wetherell, 2007) linguistics and ethnography, provides a useful theoretical framework for doing so. To this end the study draws on real-life talk-in-interaction (from transcribed recordings), the participants' perspectives (from focus groups and interviews) and situated detail (from fieldnotes) to produce a contextualised and nuanced analysis. © 2011 Elsevier B.V.
Abstract:
The goal of this study is to determine if various measures of contraction rate are regionally patterned in written Standard American English. In order to answer this question, this study employs a corpus-based approach to data collection and a statistical approach to data analysis. Based on a spatial autocorrelation analysis of the values of eleven measures of contraction across a 25 million word corpus of letters to the editor representing the language of 200 cities from across the contiguous United States, two primary regional patterns were identified: easterners tend to produce relatively few standard contractions (not contraction, verb contraction) compared to westerners, and northeasterners tend to produce relatively few non-standard contractions (to contraction, non-standard not contraction) compared to southeasterners. These findings demonstrate that regional linguistic variation exists in written Standard American English and that regional linguistic variation is more common than is generally assumed.
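Spatial autocorrelation of the kind analysed here is commonly quantified with statistics such as Moran's I, which compares spatially weighted covariance against total variance. A minimal sketch follows, with invented city coordinates and contraction rates rather than the study's corpus data:

```python
import numpy as np

# Hypothetical contraction rates for five cities: two nearby groups with
# similar values, so positive spatial autocorrelation is expected.
rates = np.array([0.82, 0.79, 0.75, 0.55, 0.51])
coords = np.array([[0, 0], [1, 0], [0, 1], [5, 5], [6, 5]], dtype=float)

# Inverse-distance spatial weights, zero on the diagonal
d = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
w = np.where(d > 0, 1.0 / np.where(d > 0, d, 1.0), 0.0)

z = rates - rates.mean()
n = len(rates)
# Moran's I: spatially weighted cross-products over total variance
moran_i = (n / w.sum()) * (z @ w @ z) / (z @ z)
print(f"Moran's I = {moran_i:.3f}")
```

A value well above the null expectation of -1/(n-1) indicates that nearby cities share similar rates, which is the kind of regional patterning the study tests for across its 200 cities.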
Abstract:
This thesis describes the development of an inexpensive and efficient clustering technique for multivariate data analysis. The technique starts from a multivariate data matrix and ends with a graphical representation of the data and a pattern recognition discriminant function. The technique also produces a distances frequency distribution that may be useful in detecting clustering in the data or in estimating parameters useful in discriminating between the different populations in the data. The technique can also be used in feature selection. It is essentially a means of discovering data structure by revealing the component parts of the data. The thesis offers three distinct contributions to cluster analysis and pattern recognition techniques. The first contribution is the introduction of a transformation function into the technique of nonlinear mapping. The second contribution is the use of the distances frequency distribution instead of the distances time-sequence in nonlinear mapping. The third contribution is the formulation of a new generalised and normalised error function, together with its optimal step size formula, for gradient method minimisation. The thesis consists of five chapters. The first chapter is the introduction. The second chapter describes multidimensional scaling as the origin of the nonlinear mapping technique. The third chapter describes the first development step in the technique of nonlinear mapping: the introduction of the "transformation function". The fourth chapter describes the second development step of the nonlinear mapping technique: the use of the distances frequency distribution instead of the distances time-sequence. The chapter also includes the formulation of the new generalised and normalised error function. Finally, the fifth chapter, the conclusion, evaluates all developments and proposes a new program for cluster analysis and pattern recognition integrating all the new features.
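The family of nonlinear mapping techniques the thesis builds on can be sketched as gradient descent on a normalised stress (error) function between original and mapped inter-point distances. The following is a generic Sammon-style sketch on invented data, not the thesis's own algorithm or error function:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy multivariate data: two separated clusters in 4-D
X = np.vstack([rng.normal(0, 0.3, (10, 4)), rng.normal(3, 0.3, (10, 4))])
D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)   # original distances
C = (D ** 2).sum()                                     # normalising constant

def stress(Y):
    d = np.linalg.norm(Y[:, None] - Y[None, :], axis=-1)
    return ((D - d) ** 2).sum() / C

# Random 2-D starting configuration, refined by gradient descent
Y = rng.normal(size=(len(X), 2))
s0 = stress(Y)
lr = 5.0
for _ in range(500):
    d = np.linalg.norm(Y[:, None] - Y[None, :], axis=-1)
    np.fill_diagonal(d, 1.0)                  # avoid division by zero
    coeff = (d - D) / d                       # per-pair error factor
    np.fill_diagonal(coeff, 0.0)
    # d(stress)/dY_i = (4/C) * sum_j coeff_ij * (Y_i - Y_j)
    grad = 4 * (coeff[:, :, None] * (Y[:, None] - Y[None, :])).sum(axis=1) / C
    Y -= lr * grad
print(f"stress: {s0:.3f} -> {stress(Y):.3f}")
```

The mapped 2-D configuration can then be plotted for visual cluster detection, which is the "graphical representation" stage the abstract refers to.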
Abstract:
The role of the production system as a key determinant of the competitive performance of business operations has long been the subject of industrial organization research, even predating the explicit conceptualisation of manufacturing strategy in the literature. Particular emergent production issues such as the globalisation of production, global supply chain management, management of integrated manufacturing and a growing e-business environment are expected to critically influence the overall competitive performance and therefore the strategic success of the organization. More than ever, there is a critical need to configure and improve production systems and operations competence in a strategic way so as to contribute to the long-term competitiveness of the organization. In order to operate competitively and profitably, manufacturing companies, no matter how well managed, need a long-term 'strategic direction' for the development of operations competence in order to consistently produce more market value at less cost and so move towards a leadership position. For long-term competitiveness, it is more important to establish a dynamic 'strategic perspective' for continuous operational improvements in pursuit of this direction, together with ongoing reviews of the direction in relation to the overall operating context. However, it is also clear that the existing paradigm of manufacturing strategy development is incapable of adequately responding to the increasing complexity and variety of contemporary business operations. This is reflected in the fact that many manufacturing companies are finding that the methodologies advocated in the existing paradigm for developing manufacturing strategy have very limited scale and scope for contextual contingency in empirical application.
More importantly, there has also emerged a deficiency in the multidimensional and integrative profile from a theoretical perspective when operationalising the underlying concept of strategic manufacturing management established in the literature. The point of departure for this study was a recognition of such contextual and unitary limitations in the existing paradigm of manufacturing strategy development when applied to contemporary industrial organizations in general, and Chinese State Owned Enterprises (SOEs) in particular. As China gradually becomes integrated into the world economy, the relevance of Western management theory and its paradigm becomes a practical matter as much as a theoretical issue. Since China markedly differs from Western countries in terms of culture, society, and political and economic systems, it presents promising grounds to test and refine existing management theories and paradigms with greater contextual contingency and wider theoretical perspective. Under China's ongoing programmes of SOE reform, there has been an increased recognition that strategy development is the very essence of the management task for managers of manufacturing companies in the same way as it is for their counterparts in Western economies. However, the Western paradigm often displays a rather naive and unitary perspective of the nature of strategic management decision-making, one which largely overlooks context-embedded factors and social/political influences on the development of manufacturing strategy. This thesis studies the successful experiences of developing manufacturing strategy from five high-performing large-scale SOEs within China’s petrochemical industry. China’s petrochemical industry constitutes a basic heavy industrial sector, which has always been a strategic focus for reform and development by the Chinese government. 
Using a confirmation approach, the study has focused on exploring and conceptualising the empirical paradigm of manufacturing strategy development as practised by management. That is, it examines the 'empirical specifics' and surfaces the 'managerial perceptions' of content configuration, context of consideration, and process organization in developing a manufacturing strategy in practice. The research investigation adopts a qualitative exploratory case study methodology with a semi-structured front-end research design. Data collection follows a longitudinal and multiple-case design and triangulates case evidence from sources including qualitative interviews, direct observation, and a search of documentation and archival records. Data analysis follows an investigative progression from a within-case preliminary interpretation of facts to a cross-case search for patterns through theoretical comparison and analytical generalization. The underlying conceptions in both the manufacturing strategy literature and related studies in business strategy were used to develop the theoretical framework and analytical templates applied during data collection and analysis. The thesis makes both empirical and theoretical contributions to our understanding of the contemporary management paradigm of manufacturing strategy development. First, it provides a valuable contextual contingency of the 'subject' using the business setting of China's SOEs in the petrochemical industry. This has been unpacked into empirical configurations developed for its context of consideration, its content and its process respectively. Of special note, a lean paradigm of business operations and production management discovered at the case companies has significant implications as an emerging alternative for high-volume capital-intensive state manufacturing in China.
Second, it provides a multidimensional and integrative theoretical profile of the 'subject' based upon managerial perspectives conceptualised at the case companies when operationalising manufacturing strategy. This has been unpacked into conceptual frameworks developed for its context of consideration, its content constructs, and its process patterns respectively. Notably, a synergies perspective on the operating context, competitive priorities and competence development of business operations and production management has significant implications for implementing a lean manufacturing paradigm. In so doing, the thesis establishes a theoretical platform for future refinement and development of context-specific methodologies for developing manufacturing strategy.
Abstract:
This book is aimed primarily at microbiologists who are undertaking research and who require a basic knowledge of statistics to analyse their experimental data. Computer software employing a wide range of data analysis methods is widely available to experimental scientists. The availability of this software, however, makes it essential that investigators understand the basic principles of statistics. Statistical analysis of data can be complex, with many different methods of approach, each of which applies in a particular experimental circumstance. Hence, it is possible to apply an incorrect statistical method to data and to draw the wrong conclusions from an experiment. The purpose of this book, which has its origin in a series of articles published in the Society for Applied Microbiology journal ‘The Microbiologist’, is to present the basic logic of statistics as clearly as possible and thereby dispel some of the myths that often surround the subject. The 28 ‘Statnotes’ deal with various topics that are likely to be encountered, including the nature of variables, the comparison of means of two or more groups, non-parametric statistics, analysis of variance, correlating variables, and more complex methods such as multiple linear regression and principal components analysis. In each case, the relevant statistical method is illustrated with examples drawn from experiments in microbiological research. The text incorporates a glossary of the most commonly used statistical terms, and there are two appendices designed to aid the investigator in the selection of the most appropriate test.
Abstract:
University students encounter difficulties with academic English because of its vocabulary, phraseology, and variability, and also because academic English differs in many respects from general English, the language they have experienced before starting their university studies. Although students have access to many dictionaries that contain some helpful information on words used in academic English, these dictionaries remain focused on the uses of words in general English. There is therefore a gap in the dictionary market, and this thesis provides a proposal for a dictionary for university students (called the Dictionary of Academic English; DOAE) in the form of a model which depicts how the dictionary should be designed, compiled, and offered to students. The model draws on state-of-the-art techniques in lexicography, dictionary-use research, and corpus linguistics. The model demanded the creation of a completely new corpus of academic language (Corpus of Academic Journal Articles; CAJA). The main advantages of the corpus are its large size (83.5 million words) and its balance. Access to a large corpus of academic language was essential for a corpus-driven approach to data analysis, while a good corpus balance in terms of domains enabled detailed domain-labelling of senses, patterns, collocates, etc. in the dictionary database, which was then used to tailor the output according to the needs of different types of student. The model proposes a dictionary conceived as an online dictionary from the outset. The proposed dictionary is revolutionary in the way it addresses the needs of different types of student: it presents students with a dynamic dictionary whose contents can be customised according to the user's native language, subject of study, variant spelling preferences, and/or visual preferences (e.g. black and white).
Abstract:
This article provides a unique contribution to the debates about archived qualitative data by drawing on two uses of the same data - British Migrants in Spain: the Extent and Nature of Social Integration, 2003-2005 - by Jones (2009) and Oliver and O'Reilly (2010), both of which utilise Bourdieu's concepts analytically and produce broadly similar findings. We argue that whilst the insights and experiences of those researchers directly involved in data collection are important resources for developing contextual knowledge used in data analysis, other kinds of critical distance can also facilitate credible data use. We therefore challenge the assumption that the idiosyncratic relationship between context, reflexivity and interpretation limits the future use of data. Moreover, regardless of the complex genealogy of the data itself, given the number of contingencies shaping the qualitative research process and thus the potential for partial or inaccurate interpretation, contextual familiarity need not be privileged over other aspects of qualitative praxis such as sustained theoretical insight, sociological imagination and methodological rigour. © Sociological Research Online, 1996-2012.
Abstract:
INTRODUCTION: Bipolar disorder requires long-term treatment, but non-adherence is a common problem. Antipsychotic long-acting injections (LAIs) have been suggested to improve adherence, but none is licensed in the UK for bipolar disorder. However, the use of second-generation antipsychotic (SGA) LAIs in bipolar disorder is not uncommon, although there is a lack of systematic reviews in this area. This study aims to systematically review the safety and efficacy of SGA LAIs in the maintenance treatment of bipolar disorder. METHODS AND ANALYSIS: The protocol is based on Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) and will include only randomised controlled trials comparing SGA LAIs in bipolar disorder. PubMed, EMBASE, CINAHL, Cochrane Library (CENTRAL), PsychINFO, LiLACS, and http://www.clinicaltrials.gov will be searched, with no language restriction, from 2000 to January 2016, as the first SGA LAIs came to the market after 2000. Manufacturers of SGA LAIs will also be contacted. The primary efficacy outcome is relapse rate, delayed time to relapse, or reduction in hospitalisation; the primary safety outcomes are drop-out rates, all-cause discontinuation and discontinuation due to adverse events. Qualitative reporting of evidence will be based on the 21 items listed in the standards for reporting qualitative research (SRQR), focusing on study quality (assessed using the Jadad score, allocation concealment and data analysis), risk of bias and effect size. Publication bias will be assessed using funnel plots. If sufficient data are available, meta-analysis will be performed with the primary effect size being relative risk presented with its 95% CI. Sensitivity analysis, conditional on the number of studies and sample size, will be carried out for manic versus depressive symptoms and monotherapy versus adjunctive therapy.
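For the planned effect size, a relative risk and its 95% CI are conventionally computed on the log scale. The counts below are invented, purely to show the arithmetic:

```python
import math

# Hypothetical 2x2 outcome counts for one trial (not from any real study):
a, n1 = 12, 100   # relapses / randomised, SGA LAI arm
c, n2 = 24, 100   # relapses / randomised, comparator arm

rr = (a / n1) / (c / n2)
# Standard error of log(RR), per the usual large-sample formula
se_log_rr = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n2)
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

In a meta-analysis, each trial's log(RR) would then be pooled with inverse-variance weights before back-transforming to the RR scale.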
Abstract:
This thesis examines the ways Indonesian politicians exploit the rhetorical power of metaphors in Indonesian political discourse. The research applies Conceptual Metaphor Theory, Metaphorical Frame Analysis and Critical Discourse Analysis to textual and oral data. The corpus comprises 150 political news articles from two newspapers (Harian Kompas and Harian Waspada, 2010-2011 editions), 30 recordings of two television news and talk-show programmes (TV-One and Metro-TV), and 20 interviews with four legislators, two educated persons and two laymen. For this study, a corpus of written bahasa Indonesia was also compiled, comprising 150 texts of approximately 439,472 tokens. The data analysis shows the potential power of metaphors in relation to how politicians communicate the results of their thinking, reasoning and meaning-making through language and discourse, and its social consequences. The data analysis first revealed 1155 metaphors. These metaphors were then classified into the categories of conventional metaphor, cognitive function of metaphor, metaphorical mapping and metaphor variation. The degree of conventionality of the metaphors is established on the basis of the number of expressions in each group of metaphors. Secondly, the analysis revealed that metaphor variation is influenced by the broader Indonesian cultural context and the natural and physical environment, through dimensions such as the social, the regional, style and the individual. The mapping system of the metaphors is unidirectional. Thirdly, the data show that metaphoric thought pervades political discourse in relation to its uses as: (1) a felicitous tool for the rhetoric of political leaders, (2) part of the meaning-making that keeps discourse contexts alive and active, and (3) a measure of the degree to which metaphor and discourse shape the conceptual structures of politicians' rhetoric.
Fourthly, the analysis revealed that Indonesian political discourse attempts to create both distance and solidarity towards general and specific social categories, accomplished via metaphorical and frame references to conceptualisations of us/them. The analysis shows that metaphor and frame are excellent indicators of the us/them categories, which work dialectically in the discourse. The acts of categorisation via metaphors and frames at both the textual and the conceptual level activate asymmetrical concepts and contribute to social and political hierarchical constructs, e.g. WEAKNESS vs. POWER, STUDENT vs. TEACHER, GHOST vs. CHOSEN WARRIOR, and so on. The analysis underscores the dynamic nature of categories by documenting metaphorical transfers between categories such as ENEMY, DISEASE, BUSINESS and MYSTERIOUS OBJECT on the one hand and CORRUPTION, LAW, POLITICS and CASE on the other. These metaphorical transfers show that politicians try to dictate how they categorise each other in order to mobilise audiences to act on behalf of their ideologies and to create distance and solidarity.
Abstract:
Failure to detect patients at risk of attempting suicide can result in tragic consequences. Identifying risks earlier and more accurately helps prevent serious incidents occurring and is the objective of the GRiST clinical decision support system (CDSS). One of the problems it faces is high variability in the type and quantity of data submitted for patients, who are assessed in multiple contexts along the care pathway. Although GRiST identifies up to 138 patient cues to collect, only about half of them are relevant for any one patient and their roles may not be for risk evaluation but more for risk management. This paper explores the data collection behaviour of clinicians using GRiST to see whether it can elucidate which variables are important for risk evaluations and when. The GRiST CDSS is based on a cognitive model of human expertise manifested by a sophisticated hierarchical knowledge structure or tree. This structure is used by the GRiST interface to provide top-down controlled access to the patient data. Our research explores relationships between the answers given to these higher-level 'branch' questions to see whether they can help direct assessors to the most important data, depending on the patient profile and assessment context. The outcome is a model for dynamic data collection driven by the knowledge hierarchy. It has potential for improving other clinical decision support systems operating in domains with high dimensional data that are only partially collected and in a variety of combinations.
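The idea of hierarchy-driven collection, where answering a higher-level branch question determines whether its child cues are asked at all, can be sketched in a few lines. All questions, answers and structure below are invented for illustration and are not the actual GRiST knowledge tree:

```python
def collect(node, answer_fn, collected):
    """Depth-first collection: descend into a branch's child cues only if
    the branch's gating question is answered positively."""
    answer = answer_fn(node["question"])
    collected[node["question"]] = answer
    if answer and "children" in node:
        for child in node["children"]:
            collect(child, answer_fn, collected)
    return collected

# A tiny hypothetical fragment of a risk-assessment hierarchy
tree = {
    "question": "any current suicidal thoughts?",
    "children": [
        {"question": "is there a specific plan?"},
        {"question": "access to means?"},
    ],
}

# Simulated assessor: a positive branch answer opens up the leaf cues
answers = {"any current suicidal thoughts?": True,
           "is there a specific plan?": False,
           "access to means?": False}
result = collect(tree, answers.get, {})
print(result)
```

A negative branch answer would stop the descent immediately, so only a patient-relevant subset of the full cue set is ever collected, which mirrors the observation that only about half of the 138 cues apply to any one patient.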
Abstract:
Significance: Oxidized phospholipids are now well recognized as markers of biological oxidative stress and as bioactive molecules with both pro-inflammatory and anti-inflammatory effects. While analytical methods continue to be developed for studies of generic lipid oxidation, mass spectrometry (MS) has underpinned the advances in knowledge of specific oxidized phospholipids by allowing their identification and characterization, and is responsible for the expansion of oxidative lipidomics. Recent Advances: Studies of oxidized phospholipids in biological samples, both from animal models and clinical samples, have been facilitated by recent improvements in MS, especially targeted routines that depend on the fragmentation pattern of the parent molecular ion, and by improved resolution and mass accuracy. MS can be used to identify selectively individual compounds or groups of compounds with common features, which greatly improves the sensitivity and specificity of detection. Application of these methods has enabled important advances in understanding the mechanisms of inflammatory diseases such as atherosclerosis, steatohepatitis, leprosy and cystic fibrosis, and offers potential for developing biomarkers of molecular aspects of the diseases. Critical Issues and Future Directions: The future of this field will depend on the development of improved MS technologies, such as ion mobility, novel enrichment methods, and databases and software for data analysis, owing to the very large amount of data generated in these experiments. MS imaging of oxidized phospholipids in tissues is an additional exciting emerging direction that can be expected to advance understanding of physiology and disease.