957 results for Multivariate statistical methods


Relevance:

90.00%

Publisher:

Abstract:

The figure of the health and safety coordinator in the construction sector first appeared in our legislation through the incorporation of the European Directives (in our case Royal Decree 1627/97 on minimum health and safety regulations in construction works), and is viewed differently across European Union countries with regard to how coordinators are hired and their role in the construction industry. Coordinating health and safety is also a management process that requires competencies based not only on technical or professional training but also, given the work environment, on strategies and tools related to experience and personal skills. Through research that drew on expert opinion, we identified the competencies a health and safety coordinator needs in order to improve safety in the works they coordinate. The conclusions of the analyses, performed using appropriate statistical methods (comparison of means and multivariate analysis techniques), will make it possible to design training programmes and to ensure that the health and safety coordinators selected have the competencies required to carry out their duties.
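
As a rough illustration of the mean-comparison step mentioned in this abstract, the sketch below runs a Welch t-test and a one-way ANOVA on invented competency ratings from two hypothetical expert groups; the group names, rating scale and values are assumptions for demonstration, not the study's data.

```python
# Hypothetical sketch: comparing mean competency ratings between two expert groups.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Ratings (1-5 scale) of one competency by two hypothetical expert groups (placeholders)
architects = rng.integers(3, 6, size=30)
site_managers = rng.integers(2, 5, size=30)

t, p_t = stats.ttest_ind(architects, site_managers, equal_var=False)  # Welch's t-test
f, p_f = stats.f_oneway(architects, site_managers)                    # one-way ANOVA
print(f"Welch t-test: t={t:.2f}, p={p_t:.3f}")
print(f"one-way ANOVA: F={f:.2f}, p={p_f:.3f}")
```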

Relevance:

90.00%

Publisher:

Abstract:

The analysis of interdependence between time series has become an important field of research in recent years, mainly as a result of advances in the characterization of dynamical systems from the signals they produce, the introduction of concepts such as generalized and phase synchronization, and the application of information theory to time series analysis. In neurophysiology, different analytical tools stemming from these concepts have been added to the ‘traditional’ set of linear methods, which includes cross-correlation and the coherency function in the time and frequency domains, respectively, as well as more elaborate tools such as Granger causality. This increase in the number of approaches for assessing the existence of functional (FC) or effective connectivity (EC) between two (or among many) neural networks, along with the mathematical complexity of the corresponding time series analysis tools, makes it desirable to arrange them into a unified, easy-to-use software package. The goal is to allow neuroscientists, neurophysiologists and researchers from related fields to easily access and make use of these analysis methods from a single integrated toolbox. Here we present HERMES (http://hermes.ctb.upm.es), a toolbox for the Matlab® environment (The MathWorks, Inc.), designed to study functional and effective brain connectivity from neurophysiological data such as multivariate EEG and/or MEG records. It also includes visualization tools and statistical methods to address the problem of multiple comparisons. We believe this toolbox will be very helpful to researchers working in the emerging field of brain connectivity analysis.
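
HERMES itself is a Matlab toolbox, so the Python sketch below is not its API; it only illustrates two of the 'traditional' linear measures mentioned above, zero-lag cross-correlation and magnitude-squared coherence, on synthetic signals whose sampling rate and composition are arbitrary assumptions.

```python
# Illustrative sketch (not HERMES): linear interdependence between two synthetic "channels".
import numpy as np
from scipy.signal import coherence

fs = 250.0                                   # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
common = np.sin(2 * np.pi * 10 * t)          # shared 10 Hz rhythm
x = common + 0.5 * np.random.randn(t.size)
y = np.roll(common, 10) + 0.5 * np.random.randn(t.size)   # delayed, noisy copy

r = np.corrcoef(x, y)[0, 1]                  # zero-lag cross-correlation (normalised)
f, Cxy = coherence(x, y, fs=fs, nperseg=512) # magnitude-squared coherence (Welch's method)
print(f"zero-lag correlation: {r:.2f}")
print(f"coherence near 10 Hz: {Cxy[np.argmin(np.abs(f - 10))]:.2f}")
```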

Relevance:

90.00%

Publisher:

Abstract:

Mulch materials of different origins have been introduced into the agricultural sector in recent years as alternatives to standard polyethylene because of its environmental impact. This study aimed to evaluate the multivariate response of mulch materials over three consecutive years in a processing tomato (Solanum lycopersicon L.) crop in Central Spain. Two biodegradable plastic mulches (BD1, BD2), one oxo-biodegradable material (OB), two types of paper (PP1, PP2), and one barley straw cover (BS) were compared against two control treatments (standard black polyethylene [PE] and manual weed control [MW]). A total of 17 variables relating to yield, fruit quality, and weed control were investigated. Several multivariate statistical techniques were applied, including principal component analysis, cluster analysis, and discriminant analysis. A group of mulch materials comprising OB and BD2 was found to be comparable to black polyethylene for all the variables considered. The weed control variables were found to be an important source of discrimination. The two paper mulches tested never shared the same treatment group membership: PP2 showed a multivariate response more similar to the biodegradable plastics, while PP1 was more similar to BS and MW. Based on our multivariate approach, the materials OB and BD2 can be used as effective, more environmentally friendly alternatives to polyethylene mulches.
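
A minimal sketch of the kind of workflow described (principal component analysis followed by hierarchical clustering) is given below; the treatment labels follow the abstract, but the 17-variable data matrix, the scaling choice and the number of clusters are placeholders, not the trial's measurements or settings.

```python
# Hedged sketch: PCA plus hierarchical clustering on a treatments-by-variables matrix.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

treatments = ["PE", "MW", "BD1", "BD2", "OB", "PP1", "PP2", "BS"]
rng = np.random.default_rng(1)
X = rng.normal(size=(len(treatments), 17))        # 17 yield/quality/weed variables (placeholder)

Z = StandardScaler().fit_transform(X)             # standardise the variables
scores = PCA(n_components=2).fit_transform(Z)     # project treatments onto the first two PCs

groups = fcluster(linkage(Z, method="ward"), t=3, criterion="maxclust")  # assumed 3 clusters
for name, pc, g in zip(treatments, scores, groups):
    print(f"{name}: PC1={pc[0]:+.2f} PC2={pc[1]:+.2f} cluster={g}")
```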

Relevance:

90.00%

Publisher:

Abstract:

The purpose of this study was to compare a number of state-of-the-art methods in airborne laser scanning (ALS) remote sensing with regard to their capacity to describe tree size inequality and other indicators related to forest structure. The indicators chosen were based on the analysis of the Lorenz curve: Gini coefficient (GC), Lorenz asymmetry (LA), and the proportions of basal area (BALM) and stem density (NSLM) stocked above the mean quadratic diameter. Each method belonged to one of these estimation strategies: (A) estimating indicators directly; (B) estimating the whole Lorenz curve; or (C) estimating a complete tree list. Across these strategies, the most popular statistical methods for the area-based approach (ABA) were used: regression, random forest (RF), and nearest neighbour imputation. The latter included distance metrics based on either RF (NN-RF) or most similar neighbour (MSN). In the case of tree list estimation, methods based on individual tree detection (ITD) and semi-ITD, both combined with MSN imputation, were also studied. The most accurate method was direct estimation by best subset regression, which obtained the lowest cross-validated coefficients of variation of the root mean squared error, CV(RMSE), for most indicators: GC (16.80%), LA (8.76%), BALM (8.80%) and NSLM (14.60%). Similar figures [CV(RMSE) 16.09%, 10.49%, 10.93% and 14.07%, respectively] were obtained by MSN imputation of tree lists by ABA, a method that also showed a number of additional advantages, such as better distributing the residual variance along the predictive range. In light of our results, ITD approaches may be clearly inferior to ABA with regard to describing the structural properties related to tree size inequality in forested areas.
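
For readers unfamiliar with these Lorenz-curve indicators, the sketch below computes GC, LA, BALM and NSLM from a small invented tree list (field-style data rather than ALS estimates); the diameters are placeholders, and the Gini estimator and Lorenz asymmetry formula follow common definitions (the latter after Damgaard & Weiner, 2000), which are assumptions about the exact variants used in the paper.

```python
# Hedged sketch: Lorenz-curve indicators of tree size inequality from a tree list.
import numpy as np

dbh = np.array([8.1, 10.4, 12.0, 15.3, 18.7, 22.5, 25.0, 31.2])   # diameters in cm (placeholder)
ba = np.pi * (dbh / 200.0) ** 2                                    # basal area per tree (m2)

x = np.sort(ba)
n = x.size
gc = np.sum((2 * np.arange(1, n + 1) - n - 1) * x) / ((n - 1) * x.sum())   # one common Gini estimator

mu = x.mean()
m = int(np.sum(x < mu))                                            # trees smaller than the mean
delta = (mu - x[m - 1]) / (x[m] - x[m - 1])
la = (m + delta) / n + (x[:m].sum() + delta * x[m]) / x.sum()      # Lorenz asymmetry (Damgaard & Weiner)

qmd = np.sqrt(np.mean(dbh ** 2))                                   # quadratic mean diameter
balm = ba[dbh > qmd].sum() / ba.sum()                              # share of basal area above the QMD
nslm = np.mean(dbh > qmd)                                          # share of stems above the QMD
print(f"GC={gc:.3f}  LA={la:.3f}  BALM={balm:.3f}  NSLM={nslm:.3f}")
```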

Relevance:

90.00%

Publisher:

Abstract:

This study aims to apply the principles of Value Investing to twenty-four mining companies and, based on a fundamental analysis, to derive a rating for each company. For this purpose, a multivariate statistical study was carried out to compare the correlations between each fundamental ratio and the company's stock performance over one, three and five years. MATLAB and EXCEL were used to process the data, applying a Pearson correlation matrix and a cross-pairs scatter (dispersion) study. The analysis showed that the Value Investing methodology can be applied to mining companies with positive results, although the fit of the correlations suggests using longer time series and a larger number of companies to make the test of these hypotheses more reliable. From the analyses performed, it follows that good fundamentals have a notable influence on stock revalorization at three and five years, and that the fit improves as this horizon lengthens.
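
The sketch below reproduces the correlation step in Python rather than MATLAB/Excel: a Pearson correlation matrix between fundamental ratios and subsequent returns, plus a pair-wise scatter matrix. The column names and values are placeholders, not the twenty-four companies' actual ratios.

```python
# Hedged sketch: Pearson correlations between fundamental ratios and stock returns.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
df = pd.DataFrame({
    "pe_ratio":         rng.uniform(5, 30, 24),     # price/earnings (placeholder)
    "debt_to_equity":   rng.uniform(0.1, 2.5, 24),
    "return_on_equity": rng.uniform(-0.1, 0.4, 24),
    "return_3y":        rng.normal(0.2, 0.3, 24),   # 3-year stock revalorization
    "return_5y":        rng.normal(0.4, 0.5, 24),
})

corr = df.corr(method="pearson")                    # Pearson correlation matrix
print(corr.loc[["pe_ratio", "debt_to_equity", "return_on_equity"], ["return_3y", "return_5y"]])

# Pair-wise scatter of all columns (cross-pairs dispersion study); requires matplotlib
axes = pd.plotting.scatter_matrix(df, figsize=(8, 8))
```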

Relevance:

90.00%

Publisher:

Abstract:

Ketamine is a widely used drug, and its misuse has been associated with serious consequences for human health. Although the pharmacological properties of this agent at therapeutic doses are well known, there are few studies on the side effects induced by non-therapeutic doses, including effects on anxiety and aggression. In this context, animal models are an important step in investigating and elucidating the mechanism of action at the behavioural level. The zebrafish (Danio rerio) is a new, interesting and promising model organism, since it shows high physiological, genetic and neurochemical similarity to humans, well-defined behavioural responses and rapid absorption of compounds of interest in aqueous media, in addition to offering a number of advantages over mammalian models, such as low-cost maintenance that is practical and feasible in small spaces. Accordingly, behavioural assays need to be combined with robust and fast statistical analyses, such as ANOVA and multivariate methods, and with sensitive, precise and fast analytical methods for determining compounds of interest in biological matrices obtained from the animal. The objectives of the present work were to investigate the effects of ketamine on anxiety and aggression in adult zebrafish using the Light/Dark Test and the Mirror Test together with univariate (ANOVA) and multivariate (PCA, HCA and SIMCA) statistical methods, and to develop an analytical method for determining ketamine in a biological matrix from the animal using liquid-liquid extraction and gas chromatography coupled with a nitrogen-phosphorus detector (GC-NPD). The behavioural results indicated that ketamine produced a significant dose-dependent effect in adult zebrafish on latency to the light area, the number of crossings between areas and the time spent exploring the light area. The SIMCA and PCA analyses showed greater similarity between the control group and the treatment groups exposed to the lower doses (5 and 20 mg L-1), and between the groups exposed to doses of 40 and 60 mg L-1. In the PCA, two principal components accounted for 88.74% of all the information in the system, with 62.59% of the cumulative information described by the first principal component. The HCA and SIMCA classifications followed a logical progression in the distribution of samples by class. The higher ketamine doses induced a more homogeneous distribution of samples, whereas the lower doses and the control resulted in more dispersed distributions. In the Mirror Test, ketamine did not induce significant effects on the animals' behaviour. These results suggest that ketamine modulates anxiety-like behaviour without inducing aggression. Validation of the chromatographic method indicated an extraction with recovery values between 33.65% and 70.89%. The calibration curve was linear with an R2 above 0.99. The limit of detection (LOD) was 1 ng and the limit of quantification (LOQ) was 5 ng. The accuracy of the chromatographic method remained between -24.83% and -1.258%, intra-assay precision between 2.67% and 14.5%, and inter-assay precision between 1.93% and 13.9%.
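
As an illustration of the univariate analysis mentioned (ANOVA across dose groups), the sketch below compares latency to the light area among hypothetical ketamine dose groups; the dose levels follow the abstract, but the latencies, variability and group sizes are invented.

```python
# Hedged sketch: one-way ANOVA of latency to the light area across ketamine dose groups.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# dose (mg/L) -> assumed mean latency (s); all values are placeholders, not the study's data
mean_latency = {0: 60, 5: 70, 20: 85, 40: 120, 60: 140}
groups = [rng.normal(mu, 15, size=10) for mu in mean_latency.values()]

f, p = stats.f_oneway(*groups)
print(f"one-way ANOVA on latency across dose groups: F={f:.2f}, p={p:.4f}")
```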

Relevance:

90.00%

Publisher:

Abstract:

This work presents multi-element geochemical results for stream sediments in the state of São Paulo, obtained through the institutional project of the Geological Survey of Brazil entitled "Levantamento Geoquímico de Baixa Densidade no Brasil" (Low-Density Geochemical Survey of Brazil). Analytical data from 1,422 stream sediment samples, obtained by ICP-MS (Inductively Coupled Plasma Mass Spectrometry) for 32 chemical elements (Al, Ba, Be, Ca, Ce, Co, Cr, Cs, Cu, Fe, Ga, Hf, K, La, Mg, Mn, Mo, Nb, Ni, P, Pb, Rb, Sc, Sn, Sr, Th, Ti, U, V, Y, Zn and Zr), were processed and examined using univariate and multivariate statistical analysis. The univariate treatment of the data provided geochemical background values for the 32 elements across the entire state of São Paulo. Georeferenced analysis of the single-element geochemical distributions revealed the geological compartmentalization of the area. The two main geological provinces of the state of São Paulo, the Paraná Basin and the Crystalline Complex, stand out clearly in most of the geochemical distributions. Larger geological units, such as the Serra Geral Formation and the Bauru Group, were also clearly highlighted. Other geochemical features indicated possibly contaminated areas and unmapped geological units. The application of multivariate statistical methods to the geochemical data with 24 variables (Al, Ba, Ce, Co, Cr, Cs, Cu, Fe, Ga, La, Mn, Nb, Ni, Pb, Rb, Sc, Sr, Th, Ti, U, V, Y, Zn and Zr) made it possible to define the main geochemical signatures and associations present throughout the state and to correlate them with the main lithological domains. Q-mode cluster analysis yielded eight groups of geochemically correlated samples which, when georeferenced, reproduced the main geological compartments of the state: the Crystalline Complex, the Itararé and Passa Dois Groups, the Serra Geral Formation, and the Bauru and Caiuá Groups. Multigroup discriminant analysis statistically confirmed the classification of the groups formed by cluster analysis and identified the main discriminant variables: Fe, Co, Sc, V and Cu. Principal component analysis, combined with factor analysis using varimax rotation, provided the main multivariate factors and their respective elemental associations. Georeferencing the multivariate factor scores delimited the areas where the elemental associations occur and yielded multivariate maps for the entire state. Finally, it is concluded that the statistical methods applied are indispensable for the treatment, presentation and interpretation of geochemical data. Moreover, based on an integrated view of the results obtained, this work recommends: (1) carrying out low-density geochemical surveys throughout the country as a priority, since they are highly effective in defining regional backgrounds and delimiting geochemical provinces of metallogenetic and environmental interest; and (2) carrying out continuous geological mapping at an appropriate scale (larger than 1:100,000) in areas that point to the possible existence of units not shown on current geological maps.
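
A hedged sketch of two of the multivariate steps described, Q-mode hierarchical clustering of samples and factor analysis with varimax rotation, is given below using a placeholder samples-by-elements matrix; the sample count, transformation, number of clusters and number of factors are assumptions, not the study's settings.

```python
# Hedged sketch: Q-mode clustering and varimax factor analysis of a geochemical matrix.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import FactorAnalysis
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(4)
X = rng.lognormal(mean=1.0, sigma=0.5, size=(200, 24))   # placeholder element concentrations
Z = StandardScaler().fit_transform(np.log10(X))          # log-transform and standardise

# Q-mode clustering: group samples by their multi-element signature (assumed 8 groups)
clusters = fcluster(linkage(Z, method="ward"), t=8, criterion="maxclust")

# Factor analysis with varimax rotation: elemental associations and mappable factor scores
fa = FactorAnalysis(n_components=4, rotation="varimax").fit(Z)
loadings = fa.components_.T                              # elements x factors
scores = fa.transform(Z)                                 # factor scores per sample
print(clusters[:10], loadings.shape, scores.shape)
```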

Relevance:

90.00%

Publisher:

Abstract:

Hypertrophic cardiomyopathy (HCM) is a cardiovascular disease in which the heart muscle is partially thickened and blood flow is obstructed, potentially fatally. It is one of the leading causes of sudden cardiac death in young people. Electrocardiography (ECG) and echocardiography (Echo) are the standard tests for identifying HCM and other cardiac abnormalities. The American Heart Association has recommended using a pre-participation questionnaire for young athletes instead of ECG or Echo tests because of the cost and time involved in having an expert cardiologist interpret the results of these tests. Initially we set out to develop a classifier for automated prediction of young athletes' heart conditions based on the answers to the questionnaire. Classification results and further in-depth analysis using computational and statistical methods indicated significant shortcomings of the questionnaire in predicting cardiac abnormalities. Automated methods for analyzing ECG signals can help reduce cost and save time in the pre-participation screening process by detecting HCM and other cardiac abnormalities. Therefore, the main goal of this dissertation work is to identify HCM through computational analysis of 12-lead ECG. ECG signals recorded on one or two leads have been analyzed in the past to classify individual heartbeats into different types of arrhythmia, as annotated primarily in the MIT-BIH database. In contrast, we classify complete sequences of 12-lead ECGs to assign patients to two groups: HCM vs. non-HCM. The challenges we address include missing ECG waves in one or more leads and the dimensionality of a large feature set, for which we propose imputation and feature-selection methods. We develop heartbeat classifiers using Random Forests and Support Vector Machines, and propose a method to classify full 12-lead ECGs based on the proportion of heartbeats classified as HCM. The results from our experiments show that the classifiers developed using our methods perform well in identifying HCM. The two contributions of this thesis are thus the use of computational and statistical methods to uncover shortcomings in a current screening procedure and the development of methods to identify HCM through computational analysis of 12-lead ECG signals.
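
The sketch below illustrates the two-stage idea described, a heartbeat-level Random Forest and a patient-level rule based on the proportion of beats labelled HCM; the feature matrix, feature count and decision threshold are placeholders, not the dissertation's actual pipeline, and the real work would precede this with feature extraction, imputation and feature selection.

```python
# Hedged sketch: heartbeat-level classification followed by patient-level aggregation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(5)
n_beats, n_features = 2000, 40                      # pooled training beats (placeholder sizes)
X = rng.normal(size=(n_beats, n_features))          # per-beat features from 12 leads (placeholder)
y = rng.integers(0, 2, size=n_beats)                # 1 = HCM beat, 0 = non-HCM beat

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

def classify_patient(beat_features, threshold=0.5):
    """Label a full 12-lead ECG as HCM if enough of its beats are classified as HCM."""
    beat_labels = clf.predict(beat_features)
    return int(beat_labels.mean() >= threshold)      # threshold is an assumed parameter

record = rng.normal(size=(120, n_features))          # one patient's beats (placeholder)
print("patient label:", classify_patient(record))
```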

Relevance:

90.00%

Publisher:

Abstract:

Federal Highway Administration, Office of Safety and Traffic Operations, Washington, D.C.

Relevance:

90.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance:

90.00%

Publisher:

Abstract:

This study has three main objectives. First, it develops a generalization of the commonly used EKS method for multilateral price comparisons. It is shown that the EKS system can be generalized so that weights can be attached to each of the link comparisons used in the EKS computations; these weights can account for differing levels of reliability of the underlying binary comparisons. Second, various reliability measures and corresponding weighting schemes are presented and their merits discussed. Finally, these new methods are applied to an international data set of manufacturing prices from the ICOP project. Although the weighted EKS method is theoretically superior, its empirical impact appears to be generally small compared to the unweighted EKS, and this impact is larger when the method is applied at lower levels of aggregation. The results also indicate the importance of using sector-specific PPPs in assessing relative levels of manufacturing productivity.
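
One way to read the weighted generalization is as a weighted least-squares fit of transitive log PPPs to the binary log PPPs, with the unweighted EKS as the equal-weights case. The sketch below implements that reading on an invented three-country example; the binary PPPs and reliability weights are placeholders, not the ICOP data, and the exact weighting scheme in the paper may differ.

```python
# Hedged sketch: weighted EKS-type multilateral PPPs via weighted least squares on log binary PPPs.
import numpy as np

countries = ["A", "B", "C"]
# log of binary PPPs: log_ppp[j, k] = ln PPP of country k relative to country j (placeholder values)
log_ppp = np.log(np.array([
    [1.00, 1.20, 0.85],
    [0.84, 1.00, 0.70],
    [1.18, 1.42, 1.00],
]))
w = np.array([                      # reliability weight of each binary link (placeholder)
    [0.0, 1.0, 0.5],
    [1.0, 0.0, 0.8],
    [0.5, 0.8, 0.0],
])

# Solve min sum_{j,k} w[j,k] * (log_ppp[j,k] - (x[k] - x[j]))^2, fixing x[0] = 0 (country A as base)
n = len(countries)
rows, rhs, weights = [], [], []
for j in range(n):
    for k in range(n):
        if j == k:
            continue
        row = np.zeros(n)
        row[k], row[j] = 1.0, -1.0
        rows.append(row)
        rhs.append(log_ppp[j, k])
        weights.append(w[j, k])
A, b, sw = np.array(rows), np.array(rhs), np.sqrt(np.array(weights))
A, b = A[:, 1:] * sw[:, None], b * sw            # drop the base country's column and apply weights
x = np.concatenate([[0.0], np.linalg.lstsq(A, b, rcond=None)[0]])
ppp = np.exp(x)                                  # multilateral PPPs relative to country A
print(dict(zip(countries, ppp.round(3))))
```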

Relevance:

90.00%

Publisher:

Abstract:

Introduction. Potentially modifiable physiological variables may influence stroke prognosis, but whether they do so independently of non-modifiable factors remains unclear. Methods. Admission physiological measures (blood pressure, heart rate, temperature and blood glucose) and other, non-modifiable factors were recorded from patients presenting within 48 hours of stroke. These variables were compared with the outcomes of death and of death or dependency at 30 days in multivariate statistical models. Results. In the 186 patients included in the study, age, atrial fibrillation and the National Institutes of Health Stroke Scale score were identified as non-modifiable factors independently associated with death and with death or dependency. After adjusting for these factors, none of the physiological variables was independently associated with death, while only diastolic blood pressure (DBP) >= 90 mmHg was associated with death or dependency at 30 days (p = 0.02). Conclusions. Except for elevated DBP, we found no independent associations between admission physiology and outcome at 30 days in an unselected stroke cohort. Future studies should look for associations in subgroups, or analyse serial changes in physiology during the early post-stroke period.
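
A minimal sketch of the type of adjusted model described, logistic regression of 30-day death or dependency on an admission physiology indicator plus the non-modifiable factors reported, is shown below; the data frame is simulated and the variable names and coding are assumptions, not the study's dataset.

```python
# Hedged sketch: adjusted logistic regression for 30-day death or dependency (simulated data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 186
df = pd.DataFrame({
    "death_or_dependency": rng.integers(0, 2, n),
    "age": rng.normal(72, 10, n),
    "afib": rng.integers(0, 2, n),           # atrial fibrillation (0/1)
    "nihss": rng.integers(0, 25, n),         # stroke severity score
    "dbp_ge_90": rng.integers(0, 2, n),      # diastolic BP >= 90 mmHg at admission (0/1)
})

model = smf.logit("death_or_dependency ~ age + afib + nihss + dbp_ge_90", data=df).fit(disp=0)
print(np.exp(model.params).round(2))         # odds ratios
print(model.pvalues.round(3))
```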

Relevance:

90.00%

Publisher:

Abstract:

A combination of uni- and multiplex PCR assays targeting 58 virulence genes (VGs) associated with Escherichia coli strains causing intestinal and extraintestinal disease in humans and other mammals was used to analyze the VG repertoire of 23 commensal E. coli isolates from healthy pigs and 52 clinical isolates associated with porcine neonatal diarrhea (ND) and postweaning diarrhea (PWD). The relationship between the presence and absence of VGs was interrogated using three statistical methods. According to the generalized linear model, 17 of 58 VGs were found to be significant (P < 0.05) in distinguishing between commensal and clinical isolates. Nine of the 17 genes (iha, hlyA, aidA, east1, aah, fimH, iroN(E. coli), traT, and saa) have not been previously identified as important VGs in clinical porcine isolates in Australia. The remaining eight VGs code for fimbriae (F4, F5, F18, and F41) and toxins (STa, STb, LT, and Stx2), normally associated with porcine enterotoxigenic E. coli. Agglomerative hierarchical algorithm analysis grouped E. coli strains into subclusters based primarily on their serogroup. Multivariate analyses of clonal relationships based on the 17 VGs were collapsed into two-dimensional space by principal coordinate analysis. PWD clones were distributed in two quadrants, separated from ND and commensal clones, which tended to cluster within one quadrant. Clonal subclusters within quadrants were highly correlated with serogroups. These methods of analysis provide different perspectives in our attempts to understand how commensal and clinical porcine enterotoxigenic E. coli strains have evolved and are engaged in the dynamic process of losing or acquiring VGs within the pig population.
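
The ordination step (principal coordinate analysis) can be sketched as classical multidimensional scaling of a Jaccard distance matrix computed from the virulence-gene presence/absence data; in the sketch below the binary matrix is a random placeholder, not the study's isolates, and the choice of the Jaccard metric is an assumption.

```python
# Hedged sketch: principal coordinate analysis (classical MDS) of a binary virulence-gene matrix.
import numpy as np
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(7)
vg = rng.integers(0, 2, size=(75, 17))          # 75 isolates x 17 discriminating VGs (placeholder)

D = squareform(pdist(vg, metric="jaccard"))     # pairwise Jaccard distances between isolates
n = D.shape[0]
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (D ** 2) @ J                     # double-centred squared-distance matrix
eigval, eigvec = np.linalg.eigh(B)
order = np.argsort(eigval)[::-1]
coords = eigvec[:, order[:2]] * np.sqrt(np.maximum(eigval[order[:2]], 0))  # first two PCo axes
print(coords[:5])                               # 2-D coordinates of the first five isolates
```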