866 results for INDEPENDENT COMPONENT ANALYSIS (ICA)


Relevance: 100.00%

Abstract:

Controlled fires in forest areas are frequently used in most Mediterranean countries as a preventive technique to avoid severe wildfires in the summer season. In Portugal, this method of managing the available fuel mass is also used and has proved beneficial, as annual statistical reports confirm that the decrease in wildfire occurrence is directly related to the practice of controlled fire. However, prescribed fire can have serious side effects on some forest soil properties. This work shows the changes that occurred in some forest soil properties after a prescribed fire. The experiments were carried out on the soil cover of a natural site of andalusitic schist in Gramelas, Caminha, Portugal, that had not been burned for four years. Composite soil samples were collected from five plots at three different layers (0-3 cm, 3-6 cm and 6-18 cm) during a three-year monitoring period after the prescribed burning. Principal Component Analysis was used to reach the presented conclusions.

Relevance: 100.00%

Abstract:

In the current context of serious climate change, where the increased frequency of some extreme events can raise the number of periods prone to high-intensity forest fires, the National Forest Authority regularly implements, in several Portuguese forest areas, a set of measures to control the amount of available fuel mass (PNDFCI, 2008). In the present work we present a preliminary analysis of the consequences of prescribed fire measures used to control fuel mass, in terms of soil recovery, in particular its water retention capacity, organic matter content, pH and iron content. This work is part of a larger study (Meira-Castro, 2009(a); Meira-Castro, 2009(b)). Given the established practice for data collection, organized in multidimensional matrices of n columns (variables under analysis) by p rows (areas sampled at different depths), and considering the quantitative nature of the data in this study, we chose a methodological approach based on multivariate statistical analysis, in particular Principal Component Analysis (PCA) (Góis, 2004). The experiments were carried out on the soil cover of a natural site of andalusitic schist in Gramelas, Caminha, NW Portugal, which had remained free of prescribed burning for four years and was submitted to prescribed fire in March 2008. Soil samples were collected from five different plots at six different time periods. The adopted methodological option allowed us to identify the most relevant relational structures within the n variables, within the p samples, and in the two sets simultaneously (Garcia-Pereira, 1990). Consequently, and in addition to the traditional outputs produced by PCA, we analyzed the influence of both sampling depth and geomorphological environment on the behavior of all variables involved.
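
As a rough illustration of this kind of analysis, the sketch below runs a PCA on a standardized samples-by-variables matrix; the variable names and values are hypothetical placeholders, not the study's measurements.

```python
# Minimal PCA sketch for a p-samples x n-variables soil matrix.
# Variable names and values are hypothetical, not the study's data.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
data = pd.DataFrame(
    rng.lognormal(mean=1.0, sigma=0.4, size=(30, 4)),
    columns=["water_retention", "organic_matter", "pH", "iron"],
)

# Standardize so every variable weighs equally (correlation-matrix PCA).
pca = PCA(n_components=2)
scores = pca.fit_transform(StandardScaler().fit_transform(data))   # sample coordinates
loadings = pd.DataFrame(pca.components_.T, index=data.columns, columns=["PC1", "PC2"])

print(np.round(pca.explained_variance_ratio_, 2))
print(loadings.round(2))
```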

Relevance: 100.00%

Abstract:

This study aims to optimize the water quality monitoring of a polluted watercourse (Leça River, Portugal) through principal component analysis (PCA) and cluster analysis (CA). These statistical methodologies were applied to physicochemical, bacteriological and ecotoxicological data (obtained with the marine bacterium Vibrio fischeri and the green alga Chlorella vulgaris) from water samples collected at seven monitoring sites during five monthly campaigns (February, May, June, August, and September 2006). The results of some variables were assigned to water quality classes according to national guidelines. Chemical and bacteriological quality data led to the classification of the Leça River water quality as “bad” or “very bad”. PCA and CA identified monitoring sites with similar pollution patterns, distinguishing site 1 (located in the upstream stretch of the river) from all the sampling sites downstream. Ecotoxicity results corroborated this classification, revealing differences in space and time. The present study includes not only physical, chemical and bacteriological but also ecotoxicological parameters, which opens new perspectives in river water characterization. Moreover, the application of PCA and CA is very useful for optimizing water quality monitoring networks, defining the minimum number of sites and their location. These tools can thus support appropriate management decisions.
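
A minimal sketch of how cluster analysis can group monitoring sites on standardized water-quality variables is given below; the site labels, number of variables and values are invented, not the campaign data.

```python
# Hedged sketch: hierarchical cluster analysis (CA) of monitoring sites on
# standardized water-quality variables; sites and values are placeholders.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
sites = [f"site_{i}" for i in range(1, 8)]            # 7 monitoring sites
X = rng.normal(size=(7, 6))                           # 6 hypothetical quality variables

Z = linkage(StandardScaler().fit_transform(X), method="ward")
groups = fcluster(Z, t=2, criterion="maxclust")       # split the sites into two groups
print(dict(zip(sites, groups)))
```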

Relevance: 100.00%

Abstract:

The main objective of this study was to perform a descriptive analysis of crime evolution in Portugal between 1995 and 2013, focusing on spatial patterns of crime evolution in the Portuguese NUTS III regions. The most important crime types were included in the analysis. The aim was to uncover the relationship between local patterns and global crime evolution, and to identify which regions contributed to the global evolution of specific crime types and how they contributed. Many statistical reports and scientific papers have analysed particular crime types, but no global spatio-temporal analysis was found. Principal Component Analysis and the multidimensional descriptive data analysis technique STATIS formed the basis of the analysis. The results show that strong spatial and temporal crime patterns exist. It was possible to describe global crime evolution patterns and to define crime evolution patterns in the NUTS III regions. It was possible to define three to four groups of crimes, each group showing similar spatial crime dynamics.
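
The sketch below outlines the STATIS "compromise" construction for K tables sharing the same rows (for instance, yearly regions-by-crime-types tables): the tables are weighted by the first eigenvector of their inter-table similarity matrix and the weighted sum is analysed like a PCA. All data here are random placeholders.

```python
# Hedged sketch of the STATIS "compromise": K tables with the same rows are
# weighted by the leading eigenvector of their inter-table similarity (RV-type)
# matrix, and the weighted sum is analysed like a PCA. Data are placeholders.
import numpy as np

rng = np.random.default_rng(2)
K, n_regions, n_crimes = 5, 25, 6
tables = [rng.normal(size=(n_regions, n_crimes)) for _ in range(K)]   # e.g. one table per year

# Centred cross-product (configuration) matrix of each table, unit Frobenius norm.
configs = []
for X in tables:
    Xc = X - X.mean(axis=0)
    W = Xc @ Xc.T
    configs.append(W / np.linalg.norm(W))

# Inter-table similarities and table weights from the leading eigenvector.
C = np.array([[np.sum(Wi * Wj) for Wj in configs] for Wi in configs])
weights = np.abs(np.linalg.eigh(C)[1][:, -1])
weights /= weights.sum()

# Compromise configuration and its first two principal axes (compromise scores).
compromise = sum(w * W for w, W in zip(weights, configs))
evals, evecs = np.linalg.eigh(compromise)
top = np.argsort(evals)[::-1][:2]
scores = evecs[:, top] * np.sqrt(np.maximum(evals[top], 0.0))
print("table weights:", np.round(weights, 3))
print("compromise scores shape:", scores.shape)
```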

Relevance: 100.00%

Abstract:

Objectives: Every year about 1.3 million people die worldwide as a result of road traffic accidents, and more than 20 million people suffer minor or serious injuries resulting in temporary or permanent disability. Road accidents are therefore considered a serious public health problem, with high costs for societies, affecting the health of populations and the economy of each country. This study aimed to describe and characterize drivers of light vehicles residing in mainland Portugal, covering sociodemographic characteristics, driving experience, and questions concerning attitudes, opinions and behaviours. It also sought to analyse the association between self-reported opinions, attitudes and behaviours and the occurrence of a road accident in the last three years, in order to build a final model predicting the risk of suffering a road accident. Methods: A cross-sectional analytical observational study was carried out, based on a questionnaire translated into Portuguese and originating from the European project SARTRE 4. The target population comprised all drivers of light vehicles holding a driving licence and residing in mainland Portugal, based on a sample of the same size as that defined in the European SARTRE 4 study (600 light-vehicle drivers). From the 52 available questions, potentially independent and complementary variables for the opinions, attitudes and behaviours components were selected by principal component analysis (PCA). In addition to the usual descriptive measures, binary logistic regression was used to analyse associations and to obtain a model for estimating the probability of suffering a road accident as a function of the selected variables concerning self-reported opinions, attitudes and behaviours. Results: Of the 612 drivers surveyed, 62.7% (383) reported not having suffered any road accident in the last three years, while 37.3% (228) reported having been involved in at least one road accident with material damage or injuries in the same period. In general, the typical driver who reported having suffered an accident in the last three years is a man over 65 years of age, with primary education, widowed and without children, not employed and living in an urban area. Drivers living in a suburban area had a 5.368 times higher risk of suffering a road accident than drivers living in a rural area (95% CI: 2.344-12.297; p<0.001). Drivers who had been breathalysed only once while driving in the last three years had a 3.009 times higher risk of suffering a road accident than drivers who had never been checked by the police (95% CI: 1.949-4.647; p<0.001). Drivers who reported very frequently stopping to sleep when they feel tired while driving have an 81% lower probability of suffering a road accident than drivers who never do so (95% CI: 0.058-0.620; p=0.006). Drivers who, when tired, rarely drink a coffee or energy drink have a 4.829 times higher risk of suffering a road accident than drivers who reported always doing so (95% CI: 1.807-12.903; p=0.002). Conclusions: The results obtained for the behavioural factors are in line with most of the risk factors associated with road accidents reported in the literature. Nevertheless, new associations were identified between the risk of suffering an accident and self-reported opinions and attitudes, which could be further explored in studies with larger populations. This work reinforces the urgent need for new intervention strategies, particularly regarding the behavioural component, targeted at risk groups, while maintaining the existing ones.
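
A minimal sketch of the kind of binary logistic regression used to obtain such odds ratios is shown below; the predictor names loosely mirror the text but the data are simulated, so the numbers bear no relation to the study's estimates.

```python
# Hedged sketch of a binary logistic regression with odds ratios and 95% CIs;
# predictors are illustrative and the data are simulated, not the survey data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 600
df = pd.DataFrame({
    "suburban": rng.integers(0, 2, n),        # lives in a suburban area (vs. rural)
    "checked_once": rng.integers(0, 2, n),    # breathalysed once in the last 3 years
    "stops_to_sleep": rng.integers(0, 2, n),  # very frequently stops to sleep when tired
})
logit = -1.0 + 1.2 * df["suburban"] + 0.8 * df["checked_once"] - 1.2 * df["stops_to_sleep"]
df["accident"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X = sm.add_constant(df[["suburban", "checked_once", "stops_to_sleep"]])
fit = sm.Logit(df["accident"], X).fit(disp=0)

ci = np.exp(fit.conf_int())                   # exponentiate to get OR confidence limits
ci.columns = ["2.5%", "97.5%"]
print(pd.DataFrame({"OR": np.exp(fit.params)}).join(ci).round(3))
```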

Relevance: 100.00%

Abstract:

Introduction: Coordination is a strategy chosen by the central nervous system to control movements and maintain stability during gait. Coordinated multi-joint movements require a complex interaction between nervous outputs, biomechanical constraints, and proprioception. Quantitatively understanding and modeling gait coordination still remains a challenge. Surgeons lack a way to model and appreciate the coordination of patients before and after surgery of the lower limbs. Patients alter their gait patterns and their kinematic synergies when they walk faster or slower than normal speed, in order to maintain their stability and minimize the energy cost of locomotion. The goal of this study was to provide a dynamical-system approach to quantitatively describe human gait coordination and apply it to patients before and after total knee arthroplasty. Methods: A new method of quantitative analysis of interjoint coordination during gait was designed, providing a general model to capture the whole dynamics and show the kinematic synergies at various walking speeds. The proposed model imposed a relationship among lower-limb joint angles (hips and knees) to parameterize the dynamics of locomotion of each individual. An integration of different analysis tools, such as harmonic analysis, Principal Component Analysis, and artificial neural networks, helped overcome the high dimensionality, temporal dependence, and non-linear relationships of the gait patterns. Ten subjects were studied using an ambulatory gait device (Physilog®). Each participant was asked to perform two 30 m walking trials at 3 different speeds and to complete an EQ-5D questionnaire, a WOMAC and a Knee Society Score. Lower-limb rotations were measured by four miniature angular rate sensors mounted on each shank and thigh. The outcomes of the eight patients undergoing total knee arthroplasty, recorded pre-operatively and post-operatively at 6 weeks, 3 months, 6 months and 1 year, were compared to those of 2 age-matched healthy subjects. Results: The new method provided coordination scores at various walking speeds, ranging between 0 and 10. It determined the overall coordination of the lower limbs as well as the contribution of each joint to the total coordination. The differences between the pre-operative and post-operative coordination values were correlated with the improvements of the subjective outcome scores. Although the study group was small, the results showed a new way to objectively quantify the gait coordination of patients undergoing total knee arthroplasty, using only portable body-fixed sensors. Conclusion: A new method for objective gait coordination analysis has been developed, with very encouraging results regarding the objective outcome of lower limb surgery.
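
As a toy illustration of extracting kinematic synergies with PCA, the sketch below decomposes synthetic hip and knee angle traces over one gait cycle; it is not the paper's full harmonic-analysis/PCA/neural-network pipeline.

```python
# Toy sketch: PCA "synergies" from lower-limb joint angles over one gait cycle.
# The angle traces are synthetic sinusoids, not measured Physilog data.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
t = np.linspace(0.0, 2.0 * np.pi, 200)                  # one normalised gait cycle
angles = np.column_stack([
    30 * np.sin(t),                                     # right hip flexion (deg)
    30 * np.sin(t + np.pi),                             # left hip flexion
    60 * np.clip(np.sin(2 * t), 0, None),               # right knee flexion
    60 * np.clip(np.sin(2 * t + np.pi), 0, None),       # left knee flexion
]) + rng.normal(0.0, 1.0, (200, 4))

pca = PCA().fit(angles)
# Share of joint-angle variance captured by the first two synergies.
print(np.round(pca.explained_variance_ratio_[:2], 3))
```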

Relevance: 100.00%

Abstract:

The CATCH Kids Club (CKC) is an after-school intervention that has attempted to address the growing obesity and physical inactivity concerns publicized in the current literature. Using a Self-Determination Theory (SDT; Deci & Ryan, 1985) perspective, this study's main research objective was to assess, while controlling for gender and age, whether there were significant differences between the treatment (CKC program participants) and control (non-CKC) groups in their perceptions of need satisfaction, intrinsic motivation and optimal challenge after four months and after eight months of participation. For this study, data were collected from 79 participants with a mean age of 9.3, using the Situational Affective State Questionnaire (SASQ; Mandigo et al., 2008). In order to determine the common factors present in the data, a principal component analysis was conducted. The analysis resulted in an appropriate three-factor solution, with 14 items loading onto the three factors identified as autonomy, competence and intrinsic motivation. Initially, a multivariate analysis of covariance (MANCOVA) was conducted and found no significant differences or effects (p > 0.05). To further assess the differences between groups, six analyses of covariance (ANCOVAs) were conducted, which also found no significant differences (p > 0.025). These findings suggest that the CKC program is able to maintain the self-determined motivational experiences of its participants, and does not thwart need satisfaction or self-determined motivation through its programming. However, the literature suggests that the CKC program and other PA interventions could be further improved by fostering participants' self-determined motivational experiences, which can lead to the persistence of healthy PA behaviours (Kilpatrick, Hebert & Jacobsen, 2002).
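
A small sketch of an ANCOVA-style comparison, adjusting a questionnaire scale for age and gender, is given below with simulated data and hypothetical variable names; it only illustrates the type of model, not the study's analysis.

```python
# Hedged ANCOVA sketch: compare treatment and control groups on one scale while
# controlling for age and gender; data and effect sizes are simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(5)
n = 79
df = pd.DataFrame({
    "group": rng.choice(["CKC", "control"], n),
    "gender": rng.choice(["F", "M"], n),
    "age": rng.normal(9.3, 1.0, n),
})
df["intrinsic_motivation"] = 3.5 + 0.1 * (df["group"] == "CKC") + rng.normal(0, 0.5, n)

model = ols("intrinsic_motivation ~ C(group) + C(gender) + age", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))   # Type II ANCOVA-style table
```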

Relevance: 100.00%

Abstract:

Atrial fibrillation is the most common cardiac rhythm disorder in humans. It often leads to serious complications such as heart failure and stroke. A neurogenic mechanism of atrial fibrillation has been demonstrated. The induction of tachyarrhythmia by mediastinal nerve stimulation has been proposed as a model for studying neurogenic atrial fibrillation. In this thesis, we studied the activity of intrinsic cardiac neurons and their interactions within the ganglionated plexi of the right atrium in a canine model of neurogenic atrial fibrillation. This activity was recorded with a multichannel microelectrode array inserted into the right atrial ganglionated plexus. Neuronal activity was recorded continuously over a period of nearly 4 hours, covering various vascular interventions (occlusion of the aorta, of the inferior vena cava, and then of the left anterior descending coronary artery), mechanical stimuli (touching the atrium or the ventricle), electrical stimuli (stimulation of the vagus nerve or of the stellate ganglia), and induced episodes of atrial fibrillation. Neuronal identification and classification were performed using principal component analysis and cluster analysis in the Spike2 software. A new method based on principal component analysis is proposed to cancel the atrial activity superimposed on the neuronal signal and thus increase the accuracy of neuronal response identification and classification. Based on the neuronal response, we defined neuron subtypes (afferent, efferent and local-circuit neurons). Their activity in relation to different stressors allowed us to provide a more detailed description of the intrinsic cardiac nervous system. The majority of the recorded neurons reacted to episodes of atrial fibrillation by becoming more active. This hyperactivity of intrinsic cardiac neurons suggests that controlling this activity could help prevent neurogenic atrial fibrillation. Since low-intensity stimulation of the vagus nerve weakens intrinsic cardiac neuronal activity (in particular for afferent and convergent local-circuit neurons), we examined whether this intervention could be applied as a therapy for atrial fibrillation. Our results show that right vagus nerve stimulation was able to attenuate atrial fibrillation in 12 of 16 cases, despite an unfavourable pro-arrhythmic effect in 1 of 16 cases. The protective action decreased over time and became ineffective about 40 minutes after the 3-minute vagus nerve stimulation.
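
The sketch below shows one generic way to realise PCA-based cancellation of a superimposed deflection: learn the artifact's dominant direction from artifact-only epochs and project it out of each recorded epoch. It is an assumption-laden illustration with synthetic signals, not the thesis's exact algorithm.

```python
# Hedged sketch of PCA-based artifact cancellation: estimate the dominant shape of
# the superimposed (atrial) deflection from artifact-only epochs, then project it
# out of every recorded epoch. Generic illustration with synthetic data.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)
n_epochs, n_samples = 200, 64
atrial_wave = np.sin(np.linspace(0, 2 * np.pi, n_samples))           # synthetic atrial deflection

spikes = np.zeros((n_epochs, n_samples))
spikes[:, 30:34] = rng.normal(5.0, 1.0, (n_epochs, 4))               # synthetic neuronal spikes
epochs = spikes + rng.normal(0.8, 0.2, (n_epochs, 1)) * atrial_wave  # spikes + artifact

artifact_only = rng.normal(1.0, 0.2, (100, 1)) * atrial_wave         # epochs with artifact only
v = PCA(n_components=1).fit(artifact_only).components_[0]            # unit-norm artifact direction

cleaned = epochs - np.outer(epochs @ v, v)                           # remove the artifact component
print("residual artifact amplitude:", np.abs(cleaned @ v).mean().round(3))
```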

Relevance: 100.00%

Abstract:

This paper presents a new paradigm for signal reconstruction and superresolution, Correlation Kernel Analysis (CKA), which is based on the selection of a sparse set of bases from a large dictionary of class-specific basis functions. The basis functions that we use are the correlation functions of the class of signals we are analyzing. To choose the appropriate features from this large dictionary, we use Support Vector Machine (SVM) regression and compare it to traditional Principal Component Analysis (PCA) for the tasks of signal reconstruction, superresolution, and compression. The testbed we use in this paper is a set of images of pedestrians. This paper also presents results of experiments in which we use a dictionary of multiscale basis functions and then use Basis Pursuit De-Noising to obtain a sparse, multiscale approximation of a signal. The results are analyzed and we conclude that 1) when used with a sparse representation technique, the correlation function is an effective kernel for image reconstruction and superresolution, 2) for image compression, PCA and SVM have different tradeoffs, depending on the particular metric used to evaluate the results, 3) in sparse representation techniques, L_1 is not a good proxy for the true measure of sparsity, L_0, and 4) the L_epsilon norm may be a better error metric for image reconstruction and compression than the L_2 norm, though the exact psychophysical metric should take into account higher-order structure in images.
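
For intuition, the sketch below contrasts a dense PCA reconstruction with a sparse reconstruction over a redundant dictionary using an L1 penalty (Lasso, in the spirit of Basis Pursuit De-Noising); the signals and dictionary are random, not the paper's correlation-function kernels.

```python
# Hedged sketch: dense PCA reconstruction vs. sparse (L1-penalised) reconstruction
# over a redundant random dictionary; not the paper's CKA kernels or image data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Lasso

rng = np.random.default_rng(7)
n_signals, dim = 300, 50
signals = rng.normal(size=(n_signals, dim)) @ rng.normal(size=(dim, dim)) * 0.1

pca = PCA(n_components=10).fit(signals)
dense_recon = pca.inverse_transform(pca.transform(signals))      # 10 dense coefficients per signal

dictionary = rng.normal(size=(dim, 200))                          # redundant dictionary, 200 atoms
dictionary /= np.linalg.norm(dictionary, axis=0)
target = signals[0]
lasso = Lasso(alpha=0.01, max_iter=10000).fit(dictionary, target)  # L1-sparse coefficients
sparse_recon = dictionary @ lasso.coef_

print("non-zero atoms:", np.count_nonzero(lasso.coef_))
print("PCA rel. error:", np.linalg.norm(target - dense_recon[0]) / np.linalg.norm(target))
print("sparse rel. error:", np.linalg.norm(target - sparse_recon) / np.linalg.norm(target))
```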

Relevance: 100.00%

Abstract:

The first discussion of compositional data analysis is attributable to Karl Pearson, in 1897. However, notwithstanding the recent developments on the algebraic structure of the simplex, more than twenty years after Aitchison’s idea of log-transformations of closed data, the scientific literature is still full of statistical treatments of this type of data using traditional methodologies. This is particularly true in environmental geochemistry where, besides the problem of closure, the spatial structure (dependence) of the data has to be considered. In this work we propose the use of log-contrast values, obtained by a simplicial principal component analysis, as indicators of given environmental conditions. The investigation of the log-contrast frequency distributions allows pointing out the statistical laws able to generate the values and to govern their variability. The changes, if compared, for example, with the mean values of the random variables assumed as models, or with other reference parameters, allow defining monitors to be used to assess the extent of possible environmental contamination. A case study on running and ground waters from the Chiavenna Valley (Northern Italy), using Na+, K+, Ca2+, Mg2+, HCO3-, SO42- and Cl- concentrations, will be illustrated.
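
A minimal sketch of such a simplicial PCA (closure, centred log-ratio transform, then ordinary PCA, whose axes are log-contrasts) is shown below; the ion names follow the abstract but the concentrations are random placeholders.

```python
# Hedged sketch of a "simplicial" PCA: close the compositions, apply a centred
# log-ratio (clr) transform, then run ordinary PCA. Concentrations are random.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA

rng = np.random.default_rng(8)
ions = ["Na", "K", "Ca", "Mg", "HCO3", "SO4", "Cl"]
conc = pd.DataFrame(rng.lognormal(0.0, 0.5, size=(40, len(ions))), columns=ions)

comp = conc.div(conc.sum(axis=1), axis=0)                   # closure to constant sum
clr = np.log(comp).sub(np.log(comp).mean(axis=1), axis=0)   # centred log-ratio transform

pca = PCA(n_components=2).fit(clr)
logcontrasts = pd.DataFrame(pca.components_.T, index=ions, columns=["LC1", "LC2"])
print(logcontrasts.round(2))    # each column's coefficients sum to ~0, i.e. a log-contrast
```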

Relevance: 100.00%

Abstract:

In an earlier investigation (Burger et al., 2000) five sediment cores near the Rodrigues Triple Junction in the Indian Ocean were studied applying classical statistical methods (fuzzy c-means clustering, linear mixing model, principal component analysis) for the extraction of endmembers and the evaluation of the spatial and temporal variation of geochemical signals. Three main factors of sedimentation were expected by the marine geologists: a volcano-genetic, a hydrothermal and an ultra-basic factor. The display of fuzzy membership values and/or factor scores versus depth provided consistent results for two factors only; the ultra-basic component could not be identified. The reason for this may be that only traditional statistical methods were applied, i.e. the untransformed components were used, with the cosine-theta coefficient as similarity measure. During the last decade considerable progress in compositional data analysis was made and many case studies were published using new tools for exploratory analysis of these data. Therefore it makes sense to check whether the application of suitable data transformations, reduction of the D-part simplex to two or three factors and visual interpretation of the factor scores would lead to a revision of the earlier results and to answers to open questions. In this paper we follow the lines of the paper of R. Tolosana-Delgado et al. (2005), starting with a problem-oriented interpretation of the biplot scattergram, extracting compositional factors, ilr-transforming the components and visualizing the factor scores in a spatial context: the compositional factors will be plotted versus depth (time) of the core samples in order to facilitate the identification of the expected sources of the sedimentary process. Key words: compositional data analysis, biplot, deep-sea sediments
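
The sketch below shows one common way to compute ilr (pivot) coordinates of closed compositions and to relate the resulting factors to core depth; the compositions and depths are simulated stand-ins, not the core data.

```python
# Hedged sketch: isometric log-ratio (ilr) pivot coordinates of closed compositions,
# followed by PCA of the ilr scores and a simple check of their trend with depth.
# Compositions and depths are simulated placeholders.
import numpy as np
from sklearn.decomposition import PCA

def ilr_pivot(x):
    """Pivot ilr coordinates of closed compositions x (rows = samples, D parts)."""
    x = np.asarray(x, dtype=float)
    n, D = x.shape
    z = np.empty((n, D - 1))
    for j in range(D - 1):
        gmean_rest = np.exp(np.mean(np.log(x[:, j + 1:]), axis=1))
        z[:, j] = np.sqrt((D - j - 1) / (D - j)) * np.log(x[:, j] / gmean_rest)
    return z

rng = np.random.default_rng(9)
parts = rng.dirichlet(alpha=[4, 3, 2, 1], size=60)        # 60 core samples, 4 components
depth = np.linspace(0, 6, 60)                             # depth (m), placeholder

scores = PCA(n_components=2).fit_transform(ilr_pivot(parts))
print(np.corrcoef(depth, scores[:, 0])[0, 1].round(2))    # trend of factor 1 with depth
```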

Relevance: 100.00%

Abstract:

In order to obtain a high-resolution Pleistocene stratigraphy, eleven continuously cored boreholes, 100 to 220 m deep, were drilled in the northern part of the Po Plain by Regione Lombardia in the last five years. Quantitative provenance analysis (QPA; Weltje and von Eynatten, 2004) of the Pleistocene sands was carried out using multivariate statistical analysis (principal component analysis, PCA, and similarity analysis) on an integrated data set, including high-resolution bulk petrography and heavy-mineral analyses of the Pleistocene sands and of sands from 250 major and minor modern rivers draining the southern flank of the Alps from west to east (Garzanti et al., 2004, 2006). Prior to the onset of the major Alpine glaciations, metamorphic and quartzofeldspathic detritus from the Western and Central Alps was carried from the axial belt to the Po basin by a trunk river running longitudinally, parallel to the South-Alpine belt (Vezzoli and Garzanti, 2008). This scenario rapidly changed during marine isotope stage 22 (0.87 Ma), with the onset of the first major Pleistocene glaciation in the Alps (Muttoni et al., 2003). PCA and similarity analysis of the core samples show that the longitudinal trunk river was at this time shifted southward by the rapid southward and westward progradation of transverse alluvial river systems fed from the Central and Southern Alps. Sediments were transported southward by braided river systems, and glacial sediments carried by Alpine valley glaciers invaded the alluvial plain. Key words: detrital modes; modern sands; provenance; Principal Component Analysis; similarity; Canberra distance; palaeodrainage
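
A small sketch of a similarity analysis with the Canberra distance, matching hypothetical core-sample compositions to modern-river end members, is given below; the grain categories and percentages are invented.

```python
# Hedged sketch: similarity analysis between core-sample and modern-river sand
# compositions using the Canberra distance; categories and values are invented.
import numpy as np
from scipy.spatial.distance import cdist

rng = np.random.default_rng(10)
minerals = ["quartz", "feldspar", "lithics", "heavy_minerals"]
core_samples = rng.dirichlet([5, 3, 2, 1], size=8) * 100    # 8 core samples, % of grains
river_sands = rng.dirichlet([5, 3, 2, 1], size=5) * 100     # 5 modern-river end members

D = cdist(core_samples, river_sands, metric="canberra")
closest = D.argmin(axis=1)                                  # most similar river per core sample
print(closest)
```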

Relevance: 100.00%

Abstract:

Many multivariate methods that are apparently distinct can be linked by introducing one or more parameters in their definition. Methods that can be linked in this way are correspondence analysis, unweighted or weighted logratio analysis (the latter also known as "spectral mapping"), nonsymmetric correspondence analysis, principal component analysis (with and without logarithmic transformation of the data) and multidimensional scaling. In this presentation I will show how several of these methods, which are frequently used in compositional data analysis, may be linked through parametrizations such as power transformations, linear transformations and convex linear combinations. Since the methods of interest here all lead to visual maps of data, a "movie" can be made in which the linking parameter is allowed to vary in small steps: the results are recalculated "frame by frame" and one can see the smooth change from one method to another. Several of these "movies" will be shown, giving a deeper insight into the similarities and differences between these methods.
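
A hedged illustration of the linking-parameter idea: a Box-Cox-style power transform (x^a − 1)/a tends to log(x) as a → 0, so the leading PCA axis of power-transformed data slides smoothly towards that of a log-based analysis. This is a generic sketch, not the presentation's exact parametrization.

```python
# Generic sketch: as the power parameter a -> 0, PCA of (x**a - 1)/a converges to
# PCA of log(x), illustrating a smooth "movie" between two analyses.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(11)
X = rng.lognormal(0.0, 0.6, size=(100, 5))

def first_axis(data):
    return PCA(n_components=1).fit(data).components_[0]

axis_log = first_axis(np.log(X))
for a in [1.0, 0.5, 0.1, 0.01]:
    axis_a = first_axis((X ** a - 1.0) / a)
    # Absolute cosine similarity between leading axes: approaches 1 as a -> 0.
    print(a, round(abs(float(axis_a @ axis_log)), 3))
```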

Relevance: 100.00%

Abstract:

Locality to other nodes on a peer-to-peer overlay network can be established by means of a set of landmarks shared among the participating nodes. Each node independently collects a set of latency measures to the landmark nodes, which are used as a multi-dimensional feature vector. Each peer node uses the feature vector to generate a unique scalar index which is correlated to its topological locality. A popular dimensionality reduction technique is the space-filling Hilbert’s curve, as it possesses good locality-preserving properties. However, there exists little comparison between Hilbert’s curve and other techniques for dimensionality reduction. This work carries out a quantitative analysis of their properties. Linear and non-linear techniques for scaling the landmark vectors to a single dimension are investigated. Hilbert’s curve, Sammon’s mapping and Principal Component Analysis have been used to generate a 1d space with locality-preserving properties. This work provides empirical evidence to support the use of Hilbert’s curve in the context of locality preservation when generating peer identifiers by means of landmark vector analysis. A comparative analysis is carried out with an artificial 2d network model and with a realistic network topology model with a typical power-law distribution of node connectivity in the Internet. Nearest-neighbour analysis confirms Hilbert’s curve to be very effective in both artificial and realistic network topologies. Nevertheless, the results in the realistic network model show that there is scope for improvement and that better techniques to preserve locality information are required.
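
The sketch below reduces synthetic landmark-latency vectors to a scalar index with the first principal component (one of the alternatives compared above) and measures how often 1d neighbours coincide with true nearest neighbours; a Hilbert-curve index would replace the PCA step but requires a space-filling-curve library, so it is omitted here.

```python
# Hedged sketch: map each node's landmark-latency vector to a scalar index via PCA
# and check nearest-neighbour preservation; network and latencies are synthetic.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(12)
coords = rng.uniform(0, 1, size=(200, 2))                   # synthetic node positions
landmarks = rng.uniform(0, 1, size=(6, 2))                  # 6 shared landmark nodes
latency = np.linalg.norm(coords[:, None, :] - landmarks[None, :, :], axis=2)  # latency ~ distance

index_1d = PCA(n_components=1).fit_transform(latency).ravel()   # scalar identifier per node

# Fraction of nodes whose nearest neighbour in 1d is also their true nearest neighbour.
big = np.eye(len(coords)) * 1e9                             # mask self-distances
true_nn = np.argsort(np.linalg.norm(coords[:, None] - coords[None, :], axis=2) + big, axis=1)[:, 0]
oned_nn = np.argsort(np.abs(index_1d[:, None] - index_1d[None, :]) + big, axis=1)[:, 0]
print("nearest-neighbour agreement:", np.mean(true_nn == oned_nn).round(2))
```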

Relevance: 100.00%

Abstract:

This study clarifies the taxonomic status of Anemone coronaria and segregates the species and A. coronaria infraspecific variants using morphological and morphometric analyses. Principal component analysis of the coronaria group was performed on 25 quantitative and qualitative characters, and morphometric analysis of the A. coronaria infraspecific variants was performed on 21 quantitative and qualitative characters. The results showed that the A. coronaria group clustered into four major groups: A. coronaria L., A. biflora DC., A. bucharica (Regel) Juz. ex Komarov, and a final group including A. eranthioides Regel and A. tschernjaewii Regel. The data on the A. coronaria infraspecific variants clustered into six groups: A. coronaria L. var. coronaria L., var. cyanea Ard., var. albiflora Rouy & Fouc., var. parviflora Regel, var. ventreana Ard., and var. rissoana Ard. © 2007 The Linnean Society of London