65 results for Data-driven Methods
Abstract:
This book is aimed primarily at microbiologists who are undertaking research and who require a basic knowledge of statistics to analyse their experimental data. Computer software employing a wide range of data analysis methods is widely available to experimental scientists. The availability of this software, however, makes it essential that investigators understand the basic principles of statistics. Statistical analysis of data can be complex, with many different methods of approach, each of which applies in a particular experimental circumstance. Hence, it is possible to apply an incorrect statistical method to data and to draw the wrong conclusions from an experiment. The purpose of this book, which has its origin in a series of articles published in the Society for Applied Microbiology journal ‘The Microbiologist’, is to present the basic logic of statistics as clearly as possible and thereby to dispel some of the myths that often surround the subject. The 28 ‘Statnotes’ deal with various topics that are likely to be encountered, including the nature of variables, the comparison of means of two or more groups, non-parametric statistics, analysis of variance, correlating variables, and more complex methods such as multiple linear regression and principal components analysis. In each case, the relevant statistical method is illustrated with examples drawn from experiments in microbiological research. The text incorporates a glossary of the most commonly used statistical terms, and there are two appendices designed to aid the investigator in the selection of the most appropriate test.
Abstract:
Excessive consumption of dietary fat is acknowledged to be a widespread problem linked to a range of medical conditions. Despite this, little is known about the specific sensory appeal held by fats, and no previous published research exists concerning human perception of non-textural taste qualities in fats. This research aimed to address whether a taste component can be found in sensory perception of pure fats. It also examined whether individual differences existed in human taste responses to fat, using both aggregated data analysis methods and multidimensional scaling. Results indicated that individuals were able both to detect the primary taste qualities of sweet, salty, sour and bitter in pure processed oils and to reliably ascribe their own individually generated taste labels, suggesting that a taste component may be present in human responses to fat. Individual variation appeared to exist, both in the perception of given taste qualities and in perceived intensity and preferences. A number of factors were examined in relation to such individual differences in taste perception, including age, gender, genetic sensitivity to 6-n-propylthiouracil, body mass, dietary preferences and intake, dieting behaviours and restraint. Results revealed that, to varying extents, gender, age, sensitivity to 6-n-propylthiouracil, dietary preferences, habitual dietary intake and restraint all appeared to be related to individual variation in taste responses to fat. However, in general, these differences appeared to exist in the form of differing preferences and levels of intensity with which taste qualities detected in fat were perceived, as opposed to the perception of specific taste qualities being associated with given traits or states. Equally, each of these factors appeared to exert only a limited influence upon variation in sensory responses, and thus the potential for using taste responses to fats as a marker for issues such as over-consumption, obesity or eating disorder is at present limited.
Abstract:
In analysing manufacturing systems, for either design or operational reasons, failure to account for the potentially significant dynamics could produce invalid results. There are many analysis techniques that can be used; however, simulation is unique in its ability to assess detailed, dynamic behaviour. The use of simulation to analyse manufacturing systems would therefore seem appropriate, if not essential. Many simulation software products are available, but their ease of use and scope of application vary greatly. This is illustrated at one extreme by simulators, which offer rapid but limited application, and at the other by simulation languages, which are extremely flexible but tedious to code. Given that a typical manufacturing engineer does not possess in-depth programming and simulation skills, the use of simulators over simulation languages would seem the more appropriate choice. Whilst simulators offer ease of use, their limited functionality may preclude their use in many applications. The construction of current simulators makes it difficult to amend or extend the functionality of the system to meet new challenges. Some simulators could even become obsolete as users demand modelling functionality that reflects the latest manufacturing system design and operation concepts. This thesis examines the deficiencies in current simulation tools and considers whether they can be overcome by the application of object-oriented principles. Object-oriented techniques have gained in popularity in recent years and are seen as having the potential to overcome many of the problems traditionally associated with software construction. There are a number of key concepts that are exploited in the work described in this thesis: the use of object-oriented techniques to act as a framework for abstracting engineering concepts into a simulation tool, and the ability to reuse and extend object-oriented software. It is argued that current object-oriented simulation tools are deficient and that, in designing such tools, object-oriented techniques should be used not just for the creation of individual simulation objects but for the creation of the complete software. This results in the ability to construct an easy-to-use simulator that is not limited by its initial functionality. The thesis presents the design of an object-oriented, data-driven simulator which can be freely extended. Discussion and work are focused on discrete parts manufacture. The system developed retains the ease of use typical of data-driven simulators, whilst removing any limitation on its potential range of applications. Reference is given to additions made to the simulator by other developers not involved in the original software development. Particular emphasis is put on the requirements of the manufacturing engineer and the need for the engineer to carry out dynamic evaluations.
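As an illustration of the two ideas emphasised above (a model described by data rather than code, and behaviour extended by subclassing rather than by changing the simulator core), the following Python sketch is a minimal stand-in; the thesis's own simulator and object model are not reproduced here, and all class and field names are hypothetical.

import heapq

class Machine:
    """Base processing resource; subclassing adds new behaviour without
    touching the simulator core (the extensibility argument above)."""
    def __init__(self, name, cycle_time):
        self.name, self.cycle_time = name, cycle_time

    def process(self, part, now, schedule):
        schedule(now + self.cycle_time, f"{self.name} finished {part}")

def run(model_data, parts, horizon=100.0):
    """Data-driven: the shop floor is described by plain data, not code."""
    machines = [Machine(**m) for m in model_data]
    events, log = [], []

    def schedule(time, note):
        heapq.heappush(events, (time, note))

    for i, part in enumerate(parts):
        machines[i % len(machines)].process(part, 0.0, schedule)
    while events and events[0][0] <= horizon:
        time, note = heapq.heappop(events)
        log.append((time, note))
    return log

model = [{"name": "lathe", "cycle_time": 4.0}, {"name": "mill", "cycle_time": 6.0}]
print(run(model, ["part-1", "part-2", "part-3"]))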
Abstract:
Many tests of financial contagion require a definition of the dates separating calm from crisis periods. We propose to use a battery of break search procedures for individual time series to objectively identify potential break dates in relationships between countries. Applied to the biggest European stock markets and combined with two well established tests for financial contagion, this approach results in break dates which correctly identify the timing of changes in cross-country transmission mechanisms. Application of break search procedures breathes new life into the established contagion tests, allowing for an objective, data-driven timing of crisis periods.
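As a rough illustration of data-driven break dating, the sketch below (Python, NumPy only) locates a single break in the mean of a series by least squares; the paper itself applies a battery of formal break search procedures to cross-country relationships, which this toy example does not reproduce.

import numpy as np

def single_break_date(x, trim=10):
    """Least-squares search for one break in the mean: pick the split point
    that minimises the total within-segment sum of squared deviations."""
    x = np.asarray(x, dtype=float)
    best_t, best_ssr = None, np.inf
    for t in range(trim, len(x) - trim):
        left, right = x[:t], x[t:]
        ssr = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if ssr < best_ssr:
            best_t, best_ssr = t, ssr
    return best_t

# Simulated 'calm then crisis' series: a jump in cross-market comovement
rng = np.random.default_rng(0)
series = np.concatenate([rng.normal(0.3, 0.1, 500), rng.normal(0.7, 0.1, 250)])
print(single_break_date(series))  # close to observation 500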
Abstract:
This paper demonstrates that the conventional approach of using official liberalisation dates as the only existing breakdates could lead to inaccurate conclusions as to the effect of the underlying liberalisation policies. It also proposes an alternative paradigm for obtaining more robust estimates of volatility changes around official liberalisation dates and/or other important market events. By focusing on five East Asian emerging markets, all of which liberalised their financial markets in the late, and by using recent advances in the econometrics of structural change, it shows that (i) the detected breakdates in the volatility of stock market returns can be dramatically different to official liberalisation dates and (ii) the use of official liberalisation dates as breakdates can readily entail inaccurate inference. In contrast, the use of data-driven techniques for the detection of multiple structural changes leads to a richer and, inevitably, more accurate pattern of volatility evolution than that obtained by focusing on official liberalisation dates.
Abstract:
This paper investigates whether the non-normality typically observed in daily stock-market returns could arise because of the joint existence of breaks and GARCH effects. It proposes a data-driven procedure to credibly identify the number and timing of breaks and applies it to the benchmark stock-market indices of 27 OECD countries. The findings suggest that a substantial element of the observed deviations from normality might indeed be due to the co-existence of breaks and GARCH effects. However, the presence of structural changes is found to be the primary reason for the non-normality, rather than the GARCH effects. Also, there is still some remaining excess kurtosis that is unlikely to be linked to the specification of the conditional volatility or the presence of breaks. Finally, an interesting side result implies that GARCH models have limited capacity in forecasting stock-market volatility.
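A small, self-contained illustration of the central point (that segment-wise changes in volatility can masquerade as non-normality) is sketched below in Python with SciPy. The break dates are assumed known here, whereas the paper identifies them with its data-driven procedure, and no GARCH model is fitted in this toy example.

import numpy as np
from scipy.stats import jarque_bera, kurtosis

rng = np.random.default_rng(1)
# Simulated daily returns with a variance break half-way through the sample
returns = np.concatenate([rng.normal(0, 0.8, 1500), rng.normal(0, 2.0, 1500)])
breaks = [0, 1500, len(returns)]  # break dates assumed known for the illustration

# Standardise the series segment by segment, then re-test for normality
standardised = np.concatenate(
    [(returns[a:b] - returns[a:b].mean()) / returns[a:b].std()
     for a, b in zip(breaks[:-1], breaks[1:])])

jb_raw, _ = jarque_bera(returns)
jb_seg, _ = jarque_bera(standardised)
print(f"raw series:  JB = {jb_raw:.1f}, excess kurtosis = {kurtosis(returns):.2f}")
print(f"per segment: JB = {jb_seg:.1f}, excess kurtosis = {kurtosis(standardised):.2f}")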
Abstract:
Failure to detect or account for structural changes in economic modelling can lead to misleading policy inferences, which can be perilous, especially for the more fragile economies of developing countries. Using three potential monetary policy instruments (Money Base, M0, and Reserve Money) for 13 member-states of the CFA Franc zone over the period 1989:11-2002:09, we investigate the magnitude of information extracted by employing data-driven techniques when analyzing breaks in time-series, rather than the simplifying practice of imposing policy implementation dates as break dates. The paper also tests Granger's (1980) aggregation theory and highlights some policy implications of the results.
Abstract:
This article focuses on the deviations from normality of stock returns before and after a financial liberalisation reform, and shows the extent to which inference based on statistical measures of stock market efficiency can be affected by not controlling for breaks. Drawing from recent advances in the econometrics of structural change, it compares the distribution of the returns of five East Asian emerging markets when breaks in the mean and variance are either (i) imposed using certain official liberalisation dates or (ii) detected non-parametrically using a data-driven procedure. The results suggest that measuring deviations from normality of stock returns with no provision for potentially existing breaks incorporates substantial bias. This is likely to severely affect any inference based on the corresponding descriptive or test statistics.
Abstract:
This paper investigates the environmental sustainability and competitiveness perceptions of small farmers in a region in northern Brazil. The main data collection instruments included a survey questionnaire and an analysis of the region's strategic plan. In total, ninety-nine goat and sheep breeding farmers were surveyed. Data analysis methods included descriptive statistics, cluster analysis, and chi-squared tests. The main results relate to the impact of education, land size, and location on the farmers' perceptions of competitiveness and environmental issues. Farmers with longer periods of education have higher perception scores about business competitiveness and environmental sustainability than those with less formal education. Farmers who are working larger land areas also have higher scores than those with smaller farms. Lastly, location can yield factors that impact on farmers' perceptions. In our study, farmers located in Angicos and Lajes had higher perception scores than Pedro Avelino and Afonso Bezerra, despite the geographical proximity of these municipalities. On the other hand, three other profile variables did not impact on farmers' perceptions, namely: family income, dairy production volume, and associative condition. The authors believe the results and insights can be extended to livestock farming in other developing countries and contribute generally to fostering effective sustainable development policies, mainly in the agribusiness sector.
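By way of illustration, the kind of chi-squared test mentioned above can be run on a contingency table of a profile variable against perception score bands; the Python sketch below uses SciPy and entirely made-up counts, not the survey data.

import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical counts: rows = education level (shorter, longer),
# columns = perception score band (low, medium, high)
table = np.array([[22, 18, 9],
                  [10, 19, 21]])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-squared = {chi2:.2f}, dof = {dof}, p = {p:.4f}")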
Abstract:
This thesis is about the discretionary role of the line manager in inspiring the work engagement of staff and their resulting innovative behaviour, examined through the lens of Social Exchange Theory (Blau, 1964) and the Job Demands-Resources theory (Bakker, Demerouti, Nachreiner & Schaufeli, 2001). The study is focused on a large British Public Sector organisation undergoing a major organisational shift in the way in which it operates as part of the public sector. It is often claimed that people do not leave organisations; they leave line managers (Kozlowski & Doherty, 1989). Despite the knowledge in the literature concerning the importance of the line manager in organisations (Purcell, 2003), the engagement literature in particular gives little consideration to such a fundamental figure in organisational life. Further, the understanding of the black box of managerial discretion and its relationship to employee- and organisation-related outcomes would benefit from greater exploration (Purcell, 2003; Gerhart, 2005; Scott et al., 2009). The purpose of this research is to address these gaps in relation to the innovative behaviour of employees in the public sector – an area that is not typically associated with the public sector (Bhatta, 2003; McGuire, Stoner & Mylona, 2008; Hughes, Moore & Kataria, 2011). The study is a CASE Award PhD thesis, requiring academic and practical elements to the research. The study is of one case organisation, focusing on one service characterised by a high level of adoption of Strategic Human Resource Management activities and operating in a rather unique manner for the public sector, having private sector competition for work. The study involved a mixed methods approach to data collection. Preliminary focus groups with 45 participants were conducted, followed by an ethnographic period of five months embedded in the service conducting interviews and observations. This culminated in a quantitative survey delivered within the wider directorate to approximately 500 staff members. The study used aspects of the Grounded Theory (Glaser & Strauss, 1967) approach to analyse the data and developed results that highlight the importance of the line manager, in an area characterised by SHRM and organisational change, for engaging employees and encouraging innovative behaviour. This survey was completed on behalf of the organisation and its findings are presented in appendix 1, in order to keep the focus of the PhD on theory development. Implications for theory and practice are discussed alongside the core findings. Line managers’ discretion surrounding the provision of job resources (in particular trust, autonomy, and the implementation and interpretation of combined bundles of SHRM policies and procedures) influenced the exchange process by which employees responded with work engagement and innovative behaviour. Limitations of the research are those commonly attributed to cross-sectional data collection methods, together with those surrounding the generalisability of the qualitative findings outside of the contextual factors characterising the service area. Suggestions for future research involve addressing these limitations and further exploring the discretionary role, with a view to extending our understanding of line manager discretion.
Abstract:
MOTIVATION: G protein-coupled receptors (GPCRs) play an important role in many physiological systems by transducing an extracellular signal into an intracellular response. Over 50% of all marketed drugs are targeted towards a GPCR. There is considerable interest in developing an algorithm that could effectively predict the function of a GPCR from its primary sequence. Such an algorithm is useful not only in identifying novel GPCR sequences but in characterizing the interrelationships between known GPCRs. RESULTS: An alignment-free approach to GPCR classification has been developed using techniques drawn from data mining and proteochemometrics. A dataset of over 8000 sequences was constructed to train the algorithm. This represents one of the largest GPCR datasets currently available. A predictive algorithm was developed based upon the simplest reasonable numerical representation of the protein's physicochemical properties. A selective top-down approach was developed, which used a hierarchical classifier to assign sequences to subdivisions within the GPCR hierarchy. The predictive performance of the algorithm was assessed against several standard data mining classifiers and further validated against Support Vector Machine-based GPCR prediction servers. The selective top-down approach achieves significantly higher accuracy than standard data mining methods in almost all cases.
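The selective top-down idea can be sketched in a few lines: one classifier assigns the family, and a family-specific classifier then assigns the subfamily. The Python example below uses scikit-learn and a plain amino-acid composition vector as a stand-in for the physicochemical representation used in the paper; the class name and features are illustrative only.

from collections import Counter
import numpy as np
from sklearn.linear_model import LogisticRegression

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def composition(seq):
    """Alignment-free representation: amino-acid composition of the sequence."""
    counts = Counter(seq)
    return np.array([counts.get(a, 0) / max(len(seq), 1) for a in AMINO_ACIDS])

class TopDownClassifier:
    """Level 1 predicts the GPCR family; level 2 uses a classifier trained
    only on that family's sequences to predict the subfamily."""
    def fit(self, seqs, families, subfamilies):
        X = np.array([composition(s) for s in seqs])
        self.family_clf = LogisticRegression(max_iter=1000).fit(X, families)
        self.sub_clfs = {}
        for fam in set(families):
            idx = [i for i, f in enumerate(families) if f == fam]
            labels = [subfamilies[i] for i in idx]
            if len(set(labels)) > 1:
                self.sub_clfs[fam] = LogisticRegression(max_iter=1000).fit(X[idx], labels)
        return self

    def predict(self, seq):
        x = composition(seq).reshape(1, -1)
        fam = self.family_clf.predict(x)[0]
        sub = self.sub_clfs[fam].predict(x)[0] if fam in self.sub_clfs else None
        return fam, sub

Restricting each second-level classifier to a single family keeps every individual decision small, which is the essence of the top-down scheme sketched here.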
Abstract:
Adopting a grounded theory methodology, the study describes how an event and pressure impact upon a process of deinstitutionalization and institutional change. Three case studies were theoretically sampled in relation to each other. They yielded mainly qualitative data from methods that included interviews, observations, participant observations, and document reviews. Each case consisted of a boundaried cluster of small enterprises that were not industry specific and were geographically dispersed. Overall findings describe how an event, i.e. a stimulus, causes disruption, which in turn may cause pressure. Pressure is then translated as a tension within the institutional environment, which is characterized by opposing forces that encourage institutional breakdown and institutional maintenance. Several contributions are made: Deinstitutionalization as a process is inextricable from the formation of institutions – both are needed to make sense of institutional change on a conceptual level but are also inseparable experientially in the field; stimuli are conceptually different to pressures; the historical basis of a stimulus may impact on whether pressure and institutional change occurs; pressure exists in a more dynamic capacity rather than only as a catalyst; institutional breakdown is a non-linear irregular process; ethical and survival pressures as new types were identified; institutional current, as an underpinning mechanism, influences how the tension between institutional breakdown and maintenance plays out.
Abstract:
This paper presents a novel intonation modelling approach and demonstrates its applicability using the Standard Yorùbá language. Our approach is motivated by the theory that abstract and realised forms of intonation and other dimensions of prosody should be modelled within a modular and unified framework. In our model, this framework is implemented using the Relational Tree (R-Tree) technique. The R-Tree is a sophisticated data structure for representing a multi-dimensional waveform in the form of a tree. Our R-Tree for an utterance is generated in two steps. First, the abstract structure of the waveform, called the Skeletal Tree (S-Tree), is generated using tone phonological rules for the target language. Second, the numerical values of the perceptually significant peaks and valleys on the S-Tree are computed using a fuzzy logic-based model. The resulting points are then joined by applying interpolation techniques. The actual intonation contour is synthesised using the Pitch Synchronous Overlap and Add (PSOLA) technique in the Praat software. We performed both quantitative and qualitative evaluations of our model. The preliminary results suggest that, although the model does not predict the numerical speech data as accurately as contemporary data-driven approaches, it produces synthetic speech with comparable intelligibility and naturalness. Furthermore, our model is easy to implement, interpret and adapt to other tone languages.
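The final step of joining the peaks and valleys is easy to picture with a toy example: given a handful of (time, F0) anchor points such as the fuzzy-logic stage might produce, interpolation onto a dense time grid yields a continuous contour. The values in the Python sketch below are invented, and the real system hands the contour to PSOLA resynthesis in Praat rather than printing it.

import numpy as np

# Hypothetical perceptually significant peaks and valleys: (time in s, F0 in Hz)
anchor_times = np.array([0.00, 0.18, 0.35, 0.52, 0.80])
anchor_f0 = np.array([120.0, 180.0, 110.0, 160.0, 100.0])

# Dense 10 ms time grid; linear interpolation joins the anchors into a contour
t = np.arange(0.0, anchor_times[-1] + 0.01, 0.01)
contour = np.interp(t, anchor_times, anchor_f0)
print(np.round(contour[:6], 1))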
Abstract:
Listening is typically the first language skill to develop in first language (L1) users and has been recognized as a basic and fundamental tool for communication. Despite the importance of listening, aural abilities are often taken for granted, and many people overlook their dependency on listening and the complexities that combine to enable this multi-faceted skill. When second language (L2) students are learning their new language, listening is crucial, as it provides access to oral input and facilitates social interaction. Yet L2 students find listening challenging, and L2 teachers often lack sufficient pedagogy to help learners develop listening abilities that they can use in and beyond the classroom. In an effort to provide a pedagogic alternative to more traditional and limited L2 listening instruction, this thesis investigated the viability of listening strategy instruction (LSI) over three semesters at a private university in Japan through a qualitative action research (AR) intervention. An LSI program was planned and implemented with six classes over the course of three AR phases. Two teachers used the LSI with 121 learners throughout the project. Following each AR phase, student and teacher perceptions of the methodology were investigated via questionnaires and interviews, which were primary data collection methods. Secondary research methods (class observations, pre/post-semester test scores, and a research journal) supplemented the primary methods. Data were analyzed and triangulated for emerging themes related to participants’ perceptions of LSI and the viability thereof. These data showed consistent positive perceptions of LSI on the parts of both learners and teachers, although some aspects of LSI required additional refinement. This project provided insights on LSI specific to the university context in Japan and also produced principles for LSI program planning and implementation that can inform the broader L2 education community.
Abstract:
This paper draws on contributions to and discussions at a recent MRC HSRC-sponsored workshop 'Researching users' experiences of health care: the case of cancer'. We focus on the methodological and ethical challenges that currently face researchers who use self-report methods to investigate experiences of cancer and cancer care. These challenges relate to: the theoretical and conceptual underpinnings of research; participation rates and participant profiles; data collection methods (the retrospective nature of accounts, description and measurement, and data collection as intervention); social desirability considerations; relationship considerations; the experiences of contributing to research; and the synthesis and presentation of findings. We suggest that methodological research to tackle these challenges should be integrated into substantive research projects to promote the development of a strong knowledge base about experiences of cancer and cancer care.