970 results for Univariate Analysis box-jenkins methodology
Abstract:
Thesis (Master's)--University of Washington, 2016-06
Abstract:
Objectives: To evaluate our experience with total pharyngolaryngectomy in the treatment of hypopharyngeal squamous cell carcinoma. Study Design: Retrospective analysis of consecutively treated patients in an academic otolaryngology, head and neck department. Methods: One hundred eighty patients who underwent total pharyngolaryngectomy for hypopharyngeal carcinoma were included in this study. Patients with a history of previous head and neck cancer were excluded. Clinicopathologic parameters were recorded, and survival was calculated using the Kaplan-Meier method. Results: One hundred sixty-two (90%) of the patients were male, and the mean age was 62 years. The majority (91%) of patients had advanced overall clinical stage disease (stage 3 or 4). Thirty-one (17.8%) and 43 (24%) patients developed locoregional and metastatic disease recurrence, respectively. The 2- and 5-year disease-specific survival rates were 72% and 52%, respectively. Advanced nodal stage, perineural invasion, lymphovascular invasion, and positive margins were predictors of poor survival on univariate analysis, and lymphovascular invasion was an independent prognostic factor on multivariate analysis. Conclusion: Surgery with postoperative radiotherapy remains the treatment against which other modalities should be compared for advanced-stage hypopharyngeal squamous cell carcinoma.
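The Kaplan-Meier method used above can be sketched in a few lines. This is a generic product-limit estimator, not the authors' code, and the sample times and event flags below are invented for illustration:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimates for (time, event) pairs.
    events: 1 = event observed (death/recurrence), 0 = censored."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []  # (time, S(t)) at each observed event time
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = removed = 0
        # Group all subjects tied at time t
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            removed += 1
            i += 1
        if deaths:
            surv *= 1 - deaths / n_at_risk  # step down at event times only
            curve.append((t, surv))
        n_at_risk -= removed  # censored subjects leave the risk set here
    return curve

# Invented toy data: events at t=1, 2, 3; one censoring at t=2
curve = kaplan_meier([1, 2, 2, 3], [1, 1, 0, 1])
```

The estimate steps down only at observed event times; censored subjects simply shrink the risk set for later steps.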
Abstract:
This study aimed to investigate the acute effects of mild Traumatic Brain Injury (mTBI) on the performance of a finger-tapping and word-repetition dual task in order to determine working memory impairment in mTBI. Sixty-four (50 male, 14 female) right-handed cases of mTBI and 26 (18 male and 8 female) right-handed cases of orthopaedic injuries were tested within 24 hours of injury. Patients with mTBI completed fewer correct taps in 10 seconds than patients with orthopaedic injuries, and female mTBI cases repeated fewer words. The size of the dual-task decrement did not vary between groups. When added to a test battery including the Rapid Screen of Concussion (RSC; Comerford, Geffen, May, Medland, & Geffen, 2002) and the Digit Symbol Substitution Test (DSST), finger-tapping speed accounted for 1% of between-groups variance and did not improve classification rates of male participants. While the addition of tapping rate did not improve the sensitivity and specificity of the RSC and DSST to mTBI in males, univariate analysis of motor performance in females indicated that dual-task performance might be diagnostic. An increase in the female sample size is warranted. These results confirm the view that there is a generalized slowing of processing ability following mTBI.
Abstract:
First, this study examined genetic and environmental sources of variation in performance on a standardised test of academic achievement, the Queensland Core Skills Test (QCST) (Queensland Studies Authority, 2003a). Second, it assessed the genetic correlation among the QCST score and Verbal and Performance IQ measures using the Multidimensional Aptitude Battery (MAB) [Jackson, D. N. (1984). Multidimensional Aptitude Battery manual. Port Huron, MI: Research Psychologist Press, Inc.]. Participants were 256 monozygotic twin pairs and 326 dizygotic twin pairs aged from 15 to 18 years (mean 17 years +/- 0.4 [SD]) when achievement was tested, and from 15 to 22 years (mean 16 years +/- 0.4 [SD]) when IQ was tested. Univariate analysis indicated a heritability for the QCST of 0.72. Adjusting this estimate for truncate selection (downward adjustment) and positive phenotypic assortative mating (upward adjustment) suggested a heritability of 0.76. The phenotypic (0.81) and genetic (0.91) correlations between the QCST and Verbal IQ (VIQ) were significantly stronger than the phenotypic (0.57) and genetic (0.64) correlations between the QCST and Performance IQ (PIQ). The findings suggest that individual variation in QCST performance is largely due to genetic factors and that common environmental effects may be substantially accounted for by phenotypic assortative mating. Covariance between academic achievement on the QCST and psychometric IQ (particularly VIQ) is to a large extent due to common genetic influences.
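Univariate twin heritability estimates of this kind are usually fit with structural-equation models; a minimal back-of-the-envelope stand-in is Falconer's ACE decomposition from monozygotic/dizygotic twin correlations. The correlations below are invented for illustration (chosen so that A comes out near the 0.72 heritability reported), not values from the study:

```python
def falconer_ace(r_mz, r_dz):
    """Falconer's ACE decomposition from twin-pair correlations.
    A: additive genetic variance (heritability), C: shared environment,
    E: unique environment plus measurement error."""
    a2 = 2 * (r_mz - r_dz)  # MZ twins share ~2x the additive genes of DZ twins
    c2 = 2 * r_dz - r_mz    # shared-environment component
    e2 = 1 - r_mz           # whatever even MZ twins do not share
    return a2, c2, e2

# Hypothetical twin correlations, chosen to illustrate h^2 of about 0.72:
a2, c2, e2 = falconer_ace(r_mz=0.86, r_dz=0.50)
```

The three components sum to 1 by construction, which is why the full ACE model is fit by maximum likelihood when standard errors are needed.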
Abstract:
This research studies the establishment of a business incubator in the municipality of Santana de Parnaíba in 2005. Two questions structured the investigation: Which economic and social factors or indicators, as provided for in the National Program to Support the Establishment of Business Incubators, proved (or did not prove) decisive in the choice of the type of incubator established in the municipality? Did the Santana de Parnaíba incubator establish itself as an articulated space for economic and social development? These guiding questions determined the general objective of the study and its specific objectives, namely: to investigate whether the SEBRAE Technical and Economic Feasibility Study was applied; to survey the socioeconomic profile of the local entrepreneur; and to identify the profile of the graduated companies in terms of job creation. To meet these objectives, a qualitative study of an exploratory and descriptive character was chosen. The participant-observation strategy was pursued through consultation of field diaries and of primary and secondary sources, including the reading and consultation of documents and records of the establishment held in the FIESP archives. As part of the data-collection strategy, a semi-structured questionnaire was administered to the research universe: the entrepreneurs and the executive coordinator of the FIESP Incubator Program. The information was analyzed using the content-analysis technique, following the case-study methodology. The guiding thread was the observation that the remarkable evolution of technologies fostered by the capitalist system, the advance of production processes, and the productivity gains of large corporations have occurred, and continue to occur, at a pace markedly faster than the training and qualification of the workforce in developing countries, which increases unemployment and informality.
The recent American crisis of 2008 provides an opportunity to better understand the concept of business incubators and other solidarity-based enterprises. The research results indicate that all the entrepreneurs received marketing and finance information. Support from specialized consultancies, participation in trade fairs, and the development of a business plan during incubation met the expectations of the widespread-technology ventures. The research indicates that, for the technology-based ventures, the incubator did not deliver effective support for their demands for new processes and new products. As a result, the three technology-based companies installed in the incubator fell short of the local job-creation targets. The results indicate that, despite the significant number of normative documents aimed at preventing failures of public policies supporting incubators and entrepreneurship, no evidence could be found of articulation between economic and social development, nor of a Technical and Economic Feasibility Study preceding the establishment of the incubator in the municipality studied.
Abstract:
This study investigates plagiarism detection, with an application in forensic contexts. Two types of data were collected for the purposes of this study. Data in the form of written texts were obtained from two Portuguese universities and from a Portuguese newspaper. These data are analysed linguistically to identify instances of verbatim, morpho-syntactical, lexical and discursive overlap. Data in the form of a survey were obtained from two higher education institutions in Portugal and another two in the United Kingdom. These data are analysed using a 2 x 2 between-groups univariate analysis of variance (ANOVA) to reveal cross-cultural divergences in the perceptions of plagiarism. The study discusses the legal and social circumstances that may contribute to adopting a punitive approach to plagiarism or, conversely, to rejecting punishment. The research adopts a critical approach to plagiarism detection. On the one hand, it describes the linguistic strategies adopted by plagiarists when borrowing from other sources; on the other hand, it discusses the relationship between these instances of plagiarism and the context in which they appear. A focus of this study is whether plagiarism involves an intention to deceive and, in this case, whether forensic linguistic evidence can provide clues to this intentionality. It also evaluates current computational approaches to plagiarism detection and identifies strategies that these systems fail to detect. Specifically, a method is proposed to detect translingual plagiarism. The findings indicate that, although cross-cultural aspects influence the different perceptions of plagiarism, a distinction needs to be made between intentional and unintentional plagiarism. The linguistic analysis demonstrates that linguistic elements can contribute to finding clues to the plagiarist's intentionality.
Furthermore, the findings show that translingual plagiarism can be detected by using the method proposed, and that plagiarism detection software can be improved using existing computer tools.
Abstract:
This paper reports an investigation of local sustainable production in Sweden aimed at exploring the factors contributing to the survival and competitiveness of manufacturing. Eight companies were studied on two occasions 30 years apart: in 1980 and 2010. To provide a valid longitudinal perspective, a common format for data collection was used. As a framework for data collection and analysis, the DRAMA methodology was employed (Bennett and Forrester, 1990). A number of results concerning the long-term competitiveness and sustainability of manufacturing companies are reported in detail.
Abstract:
2000 Mathematics Subject Classification: 62J12, 62P10.
Abstract:
Mainstream gentrification research predominantly examines the experiences and motivations of middle-class gentrifier groups, while overlooking the experiences of non-gentrifying groups, including the impact of in situ local processes on gentrification itself. In this paper, I discuss gentrification, neighbourhood belonging and the spatial distribution of class in Istanbul by examining patterns of belonging of both gentrifiers and non-gentrifying groups in the historic neighbourhoods of the Golden Horn/Halic. I use multiple correspondence analysis (MCA), a methodology rarely used in gentrification research, to explore the social and symbolic borders between these two groups. I show how gentrification leads to spatial clustering by creating exclusionary practices and eroding social cohesion, and illuminate divisions that are inscribed into the physical space of the neighbourhood.
Abstract:
Feature selection is important in the medical field for many reasons. However, selecting important variables is a difficult task in the presence of censoring, a unique feature of survival data analysis. This paper proposes an approach to deal with the censoring problem in endovascular aortic repair survival data through Bayesian networks, merged and embedded with a hybrid feature-selection process that combines Cox's univariate analysis with machine learning approaches, such as ensembles of artificial neural networks, to select the most relevant predictive variables. The proposed algorithm was compared with common survival variable-selection approaches such as the least absolute shrinkage and selection operator (LASSO) and Akaike information criterion (AIC) methods. The results showed that it was capable of dealing with high censoring in the datasets. Moreover, ensemble classifiers increased the area under the ROC curves of the two datasets, collected separately from two centers located in the United Kingdom. Furthermore, ensembles constructed with center 1 data enhanced the concordance index of center 2 predictions compared to the model built with a single network. Although the final reduced model using the neural networks and their ensembles is larger than those of the other methods, it outperformed the others in both concordance index and sensitivity for center 2 prediction. This indicates that the reduced model is more powerful for cross-center prediction.
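The hybrid idea above, a univariate screen followed by a multivariate learner, can be illustrated generically. The sketch below ranks features by absolute Pearson correlation with the outcome and keeps the top k; this is a stand-in for the per-variable Cox regressions used in the paper, and the data are invented toy numbers (constant columns are not handled in this sketch):

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation between two equal-length numeric sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / sqrt(vx * vy)

def univariate_screen(X, y, k):
    """Keep the k features most correlated (in absolute value) with y.
    X is a list of rows; returns the selected column indices, sorted."""
    scores = []
    for j in range(len(X[0])):
        col = [row[j] for row in X]
        scores.append((abs(pearson(col, y)), j))
    scores.sort(reverse=True)  # strongest univariate signal first
    return sorted(j for _, j in scores[:k])

# Invented toy data: column 0 tracks the outcome exactly, column 1 is noise
X = [[1, 5, 0], [2, 3, 1], [3, 6, 0], [4, 4, 1]]
y = [1, 2, 3, 4]
selected = univariate_screen(X, y, k=2)
```

In the paper's setting the score would come from a univariate Cox model per variable, and the reduced feature set would then feed the neural-network ensemble.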
Abstract:
Purpose: Considering the UK's limited capacity for waste disposal (particularly for hazardous/radiological waste), there is a growing focus on waste avoidance and minimisation to lower the volumes of waste being sent to disposal. The hazardous nature of some waste can complicate its management and reduction. To address this problem, there was a need for a decision-making methodology to support managers in the nuclear industry as they identify ways to reduce the production of avoidable hazardous waste. The methodology we developed is called Waste And Sourcematter Analysis (WASAN), a methodology that begins the thought process at the pre-waste-creation stage (i.e. Avoid). Design/methodology/approach: The methodology analyses the source of waste, the production of waste inside the facility, the knock-on effects from up/downstream facilities on waste production, and the down-selection of waste minimisation actions/options. WASAN has been applied to case studies with licensees, and this paper reports on one such case study: the management of plastic bags in the Enriched Uranium Residues Recovery Plant (EURRP) at Springfields (UK), where it was used to analyse the generation of radioactive plastic bag waste. Findings: Plastic bags are used in EURRP as a strategy to contain hazards. Double bagging of materials led to the proliferation of these bags as a waste. The paper reports on the philosophy behind WASAN, the application of the methodology to this problem, the results, and views from managers in EURRP. Originality/value: This paper presents WASAN as a novel methodology for analysing the minimisation of avoidable hazardous waste. This addresses an issue that is important to many industries, e.g. where legislation enforces waste minimisation, where waste disposal costs encourage waste avoidance, or where plant design can reduce waste.
The paper forms part of the HSE Nuclear Installations Inspectorate's desire to work towards greater openness and transparency in its work and the development of its thinking. © Crown Copyright 2011.
Abstract:
In this paper, a program for research is outlined. First, the concept of responsive information systems is defined, and the notions of capacity planning and software performance engineering are clarified. Second, the purpose of the proposed capacity-planning methodology, its interface to information systems analysis and development methodologies (e.g. SSADM), and the advantages of a knowledge-based approach are discussed. The interfaces to CASE tools, more precisely to data dictionaries or repositories (IRDS), are examined in the context of a particular systems analysis and design methodology (e.g. SSADM).
Abstract:
The purpose of the study was to measure gains in the development of elementary education teachers' reading expertise, to determine whether there was a differential gain in reading expertise, and, last, to examine their perceptions of acquiring reading expertise. This research is needed in the field of teacher education, specifically in the field of reading. A quasi-experimental design with a comparison group, using a pretest-posttest mixed-method repeated-measures approach, was utilized. Quantitative data analysis measured the development of reading expertise of elementary preservice teachers compared to early childhood preservice teachers and was used to examine the differential gains in reading expertise. A multivariate analysis of variance (MANOVA) was conducted on pre- and posttest responses to a Protocol of Questions. Further analysis was conducted on five variables (miscue analysis, fluency analysis, data analysis, inquiry orientation and intelligent action) using a univariate analysis of variance (ANOVA). A one-way ANOVA was carried out on the gain scores of the low and middle groups of elementary education preservice teachers. Qualitative data analysis as suggested by Merriam (1989) and Miles and Huberman (1994) was used to determine whether the elementary education preservice teachers perceived that they had acquired the expertise to teach reading. Elementary education preservice teachers who participated in a supervised clinical practicum made significant gains in their development of reading expertise, as compared to early childhood preservice teachers, who did not make significant gains. Elementary education preservice teachers who were in the low and middle third levels of expertise at pretest demonstrated significant gains in reading expertise. Last, elementary education preservice teachers perceived that they had acquired the expertise to teach reading.
The study concluded that reading expertise can be developed in elementary education preservice teachers through participation in a supervised clinical practicum. The findings support the idea that preservice teachers who will be teaching reading to elementary students would benefit from a supervised clinical practicum.
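The one-way ANOVA on gain scores used in the study above can be sketched generically; the group scores below are invented toy data, not the study's measurements:

```python
from statistics import mean

def one_way_anova_f(*groups):
    """F statistic for a one-way between-groups ANOVA."""
    all_scores = [x for g in groups for x in g]
    grand = mean(all_scores)
    group_means = [mean(g) for g in groups]
    # Variance explained by group membership vs. variance within groups
    ss_between = sum(len(g) * (m - grand) ** 2
                     for g, m in zip(groups, group_means))
    ss_within = sum((x - m) ** 2
                    for g, m in zip(groups, group_means) for x in g)
    df_between = len(groups) - 1
    df_within = len(all_scores) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Invented gain scores for two hypothetical groups
f_stat = one_way_anova_f([1, 2, 3], [2, 3, 4])
```

The resulting F statistic would then be compared against the F distribution with (k - 1, n - k) degrees of freedom to obtain a p-value.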
Abstract:
In human society, people encounter various deontic conflicts every day. Deontic decisions are those that include moral, ethical, and normative aspects. Here, the concern is with deontic conflicts: decisions where all the alternatives lead to the violation of some norms. People think critically about these kinds of decisions, but just what they think about is not always clear. People use certain estimating factors/criteria to balance the tradeoffs when they encounter deontic conflicts. It is unclear what subjective factors people use to make a deontic decision. An elicitation approach called the Open Factor Conjoint System is proposed, which applies an online elicitation methodology combining two well-known research methodologies: repertory grid and conjoint analysis. This new methodology is extended into a web-based application. It seeks to elicit additional relevant (subjective) factors from people that affect deontic decisions. The relative importance and utility values are used for the development of a decision model to predict people's decisions. Fundamentally, this methodology was developed and intended to be applicable to a wide range of elicitation applications with minimal experimenter bias. Compared with traditional methods, this online survey method reduces the limitations of time and space in data collection, and the methodology can be applied in many fields. Two possible applications are addressed: robotic vehicles and the choice of medical treatment. In addition, this method can be applied to many research-related disciplines in cross-cultural research, owing to its online delivery and global reach.
Abstract:
Bankruptcy prediction has been a fruitful area of research. Univariate analysis and discriminant analysis were the first methodologies used. While they perform relatively well at correctly classifying bankrupt and nonbankrupt firms, their predictive ability has come into question over time. Univariate analysis lacks the big picture that financial distress entails. Multivariate discriminant analysis requires stringent assumptions that are violated when dealing with accounting ratios and market variables. This has led to the use of more complex models such as neural networks. While the accuracy of the predictions has improved with the use of more technical models, an important point is still missing. Accounting ratios are the usual discriminating variables used in bankruptcy prediction. However, accounting ratios are backward-looking variables; at best, they are a current snapshot of the firm. Market variables are forward-looking variables: they are determined by discounting future outcomes. Microstructure variables, such as the bid-ask spread, also contain important information. Insiders are privy to more information than the retail investor, so if any financial distress is looming, the insiders should know before the general public. Therefore, any model of bankruptcy prediction should include market and microstructure variables. That is the focus of this dissertation. The traditional models and the newer, more technical models were tested and compared to the previous literature by employing accounting ratios, market variables, and microstructure variables. Our findings suggest that the more technical models are preferable, and that a mix of accounting and market variables is best at correctly classifying and predicting bankrupt firms. Based on the results, the multi-layer perceptron appears to be the most accurate model.
The set of best discriminating variables includes price, the standard deviation of price, the bid-ask spread, net income to sales, working capital to total assets, and current liabilities to total assets.
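As a minimal illustration of the classifier family discussed above, here is a single-layer perceptron trained on a toy, linearly separable two-feature dataset. It is a sketch of the general approach only, not the dissertation's multi-layer model, and both the features and labels are invented:

```python
def predict(w, b, x):
    """Threshold activation: classify x as 1 (e.g. bankrupt) or 0 (solvent)."""
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b > 0 else 0

def train_perceptron(X, y, epochs=25, lr=0.1):
    """Perceptron learning rule on labelled rows X (labels y in {0, 1})."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            err = yi - predict(w, b, xi)
            if err:  # nudge the boundary toward each misclassified point
                w = [wj + lr * err * xj for wj, xj in zip(w, xi)]
                b += lr * err
    return w, b

# Invented toy data: only firms with both "risk flags" set are labelled 1
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 0, 0, 1]
w, b = train_perceptron(X, y)
```

A multi-layer perceptron adds hidden layers and nonlinear activations on top of this update idea, which is what lets it separate classes that no single linear boundary can.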