911 results for Movement Data Analysis


Relevance:

100.00%

Publisher:

Abstract:

This study determined the rate of and indications for revision between cemented, uncemented, hybrid and resurfacing groups from National Joint Registry (NJR, 6th edition) data. Data validity was determined by interrogating for episodes of misclassification. We identified 6,034 (2.7%) misclassified episodes, containing 97 (4.3%) revisions. Kaplan-Meier revision rates at 3 years were 0.9% for cemented, 1.9% for uncemented, 1.2% for hybrid and 3.0% for resurfacing arthroplasties (significant difference across all groups, p<0.001, with an identical pattern in patients <55 years). Regression analysis indicated that both prosthesis group and age significantly influenced failure (p<0.001). Revision for pain, aseptic loosening and malalignment was highest in uncemented and resurfacing arthroplasty. Revision for dislocation was highest in uncemented hips (significant difference between groups, p<0.001). Feedback on data misclassification has been given to the NJR for future analyses. © 2012 Wichtig Editore.
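As an illustration of how 3-year revision rates like those above are derived, here is a minimal Kaplan-Meier sketch in Python; the follow-up times and revision/censoring flags below are hypothetical examples, not NJR records.

```python
# Minimal Kaplan-Meier estimator for right-censored follow-up data.
# The cohort below is hypothetical: a revision at 1 and at 2 years,
# censoring (still unrevised at last follow-up) elsewhere.

def kaplan_meier(times, events):
    """Return (time, survival) pairs.

    times  : follow-up time for each hip, in years
    events : 1 if the hip was revised at that time, 0 if censored
    """
    at_risk = len(times)
    survival = 1.0
    curve = []
    for t in sorted(set(times)):
        revisions = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        if revisions:
            survival *= 1.0 - revisions / at_risk  # step down at each event time
            curve.append((t, survival))
        at_risk -= sum(1 for ti in times if ti == t)  # leave the risk set
    return curve

times  = [1.0, 1.5, 2.0, 2.5, 3.0, 3.0]
events = [1,   0,   1,   0,   0,   0]
print(kaplan_meier(times, events))   # revision rate at t = 1 - survival
```

The revision rate reported in the abstract is simply one minus the Kaplan-Meier survival estimate at 3 years.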

Structural health monitoring (SHM) refers to the procedures used to assess the condition of structures so that their performance can be monitored and any damage detected early. Early detection of damage and appropriate retrofitting help prevent structural failure, save money spent on maintenance or replacement, and ensure the structure operates safely and efficiently throughout its intended life. Although visual inspection and other techniques, such as vibration-based methods, are available for SHM of structures such as bridges, the acoustic emission (AE) technique is an attractive option and is increasingly used. AE waves are high-frequency stress waves generated by the rapid release of energy from localised sources within a material, such as crack initiation and growth. The AE technique involves recording these waves with sensors attached to the surface and then analysing the signals to extract information about the nature of the source. High sensitivity to crack growth, the ability to locate sources, its passive nature (no external energy supply is needed; the energy from the damage source itself is used) and the possibility of real-time monitoring (detecting a crack as it occurs or grows) are among its attractive features. Despite these advantages, challenges remain in using the AE technique for monitoring applications, especially in the analysis of recorded AE data, as large volumes of data are usually generated during monitoring. The need for effective data analysis can be linked to three main aims of monitoring: (a) accurately locating the source of damage; (b) identifying and discriminating signals from different sources of acoustic emission; and (c) quantifying the level of damage of an AE source for severity assessment. In the AE technique, the location of the emission source is usually calculated from the arrival times and velocities of the AE signals recorded by a number of sensors.
Complications arise, however, because AE waves can travel through a structure in a number of different modes with different velocities and frequencies. Hence, to locate a source accurately it is necessary to identify the modes recorded by the sensors. This study proposed and tested the use of time-frequency analysis tools, such as the short-time Fourier transform, to identify the modes, and the use of the velocities of these modes to achieve very accurate results. Further, this study explored the possibility of reducing the number of sensors needed for data capture by using the velocities of the modes captured by a single sensor for source localisation. A major problem in the practical use of the AE technique is the presence of AE sources other than cracks, such as rubbing and impacts between different components of a structure. These spurious AE signals often mask the signals from crack activity; hence, discriminating signals to identify their sources is very important. This work developed a model that uses signal processing tools such as cross-correlation, magnitude-squared coherence and the energy distribution across frequency bands, as well as modal analysis (comparing the amplitudes of identified modes), to accurately differentiate signals from different simulated AE sources. Quantification tools to assess the severity of damage sources are highly desirable in practical applications. Although various damage quantification methods have been proposed for the AE technique, not all have achieved universal approval or proven suitable for every situation. The b-value analysis, which studies the distribution of the amplitudes of AE signals, and its modified form (known as improved b-value analysis) were investigated for their suitability for damage quantification in ductile materials such as steel.
These were found to give encouraging results for laboratory data, extending the possibility of their use on real-life structures. By addressing these primary issues, this thesis has helped improve the effectiveness of the AE technique for structural health monitoring of civil infrastructure such as bridges.
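As a sketch of the arrival-time localisation described above, the following toy example places an AE source on a line between two sensors, assuming a single wave mode of known velocity. The sensor spacing, time difference and the ~5300 m/s plate-wave velocity are illustrative values, not results from the thesis.

```python
# Linear (1-D) AE source localisation from arrival times.
# Sensors sit at x = 0 and x = L; a source at x emits a wave that
# reaches sensor 1 at t1 = x/v and sensor 2 at t2 = (L - x)/v, so
# delta_t = t1 - t2 = (2x - L)/v  =>  x = (L + v * delta_t) / 2.

def locate_source(delta_t, sensor_spacing, velocity):
    """Distance of the AE source from sensor 1, in metres.

    delta_t : t1 - t2 in seconds (negative if sensor 1 hears the wave first)
    """
    return (sensor_spacing + velocity * delta_t) / 2.0

# Hypothetical values: 0.5 m spacing, ~5300 m/s mode velocity,
# sensor 1 triggered 20 microseconds before sensor 2.
x = locate_source(delta_t=-2e-5, sensor_spacing=0.5, velocity=5300.0)
print(f"source at {x:.3f} m from sensor 1")
```

This is exactly why mode identification matters: plugging in the velocity of the wrong mode shifts the computed source position.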

Background: Previous studies have shown that fundamental movement skills (FMS) and physical activity are related. Specifically, earlier studies have demonstrated that the ability to perform a variety of FMS increases the likelihood of children participating in a range of physical activities throughout their lives. To date, however, there have been no studies focused on the development of, or the relationship between, these variables through junior high school (that is, between the ages of 13 and 15). Such studies might provide important insights into the relationships between FMS and physical activity during adolescence, and suggest ways to design more effective physical education programmes for adolescents. Purpose: The main purposes of the study were: (1) to investigate the development of the students' self-reported physical activity and FMS from Grade 7 to Grade 9; (2) to analyse the associations between the students' FMS and self-reported physical activity through junior high school; and (3) to analyse whether there are gender differences in research tasks one and two. Participants and setting: The participants in the study were 152 Finnish students, aged 13 and enrolled in Grade 7 at the commencement of the study. The sample included 66 girls and 86 boys drawn from three junior high schools in Central Finland. Research design and data collection: Both the FMS tests and the questionnaires on self-reported physical activity were completed annually over a three-year period: in August (when the participants were in Grade 7), in January (Grade 8) and in May (Grade 9). Data analysis: Repeated-measures multivariate analyses of variance (MANOVAs) were used to analyse the interaction between gender and time (three measurement points) in FMS test sumscores and self-reported physical activity scores.
The relationships between self-reported physical activity scores and fundamental movement skill sumscores through junior high school were analysed using structural equation modelling (SEM) with LISREL 8.80 software. Findings: The MANOVA for self-reported physical activity demonstrated that both genders' physical activity decreased through junior high school. The MANOVA for the FMS revealed that the boys' FMS sumscore increased whereas the girls' skills decreased through junior high school. The SEM and squared multiple correlations revealed that FMS in Grades 7 and 8, together with physical activity in Grade 9, explained FMS in Grade 9; the proportion explained was 69% for the girls and 55% for the boys. Additionally, physical activity measured in Grade 7 and FMS measured in Grade 9 explained physical activity in Grade 9; the proportion explained was 12% for the girls and 29% for the boys. In the boys' group, three additional paths were found: FMS in Grade 7 explained physical activity in Grade 9, physical activity in Grade 7 explained FMS in Grade 8, and physical activity in Grade 7 explained physical activity in Grade 8. Conclusions: The study suggests that FMS and physical activity are correlated, and that supporting and encouraging them together increases the likelihood of healthy lifelong outcomes. It can therefore be concluded that an FMS curriculum in school-based physical education is a plausible way to promote good lifelong outcomes. Earlier studies support the view that school physical education plays an important role in developing students' FMS and is well placed to counteract the typical decline of physical activity in adolescence. These findings are particularly important for adolescent girls, the group showing the greatest decline in physical activity during adolescence.

This paper describes a safety data recording and analysis system that has been developed to capture safety occurrences, including precursors, using high-definition forward-facing video from train cabs and data from other train-borne systems. The paper describes the data processing model and how events detected through data analysis are related to an underlying socio-technical model of accident causation. The integrated approach to safety data recording and analysis ensures that systemic factors that condition, influence or potentially contribute to an occurrence are captured both for safety occurrences and for precursor events, providing a rich tapestry of antecedent causal factors that can significantly improve learning about accident causation. This can ultimately benefit railways through the development of targeted and more effective countermeasures, better risk models and more effective use and prioritization of safety funds. Level crossing occurrences are a key focus in this paper, with data analysis scenarios describing causal factors around near-miss occurrences. The paper concludes with a discussion of how the system can also be applied to other types of railway safety occurrences.

Monitoring the environment with acoustic sensors is an effective method for understanding changes in ecosystems. Through extensive monitoring, large-scale, ecologically relevant datasets can be produced that can inform environmental policy. The collection of acoustic sensor data is a solved problem; the current challenge is the management and analysis of raw audio data to produce useful datasets for ecologists. This paper presents the applied research we use to analyze big acoustic datasets. Its core contribution is the presentation of practical large-scale acoustic data analysis methodologies. We describe details of the data workflows we use to provide both citizen scientists and researchers with practical access to large volumes of ecoacoustic data. Finally, we propose a work-in-progress large-scale architecture for analysis, driven by a hybrid cloud-and-local production-grade website.

Focus groups are a popular qualitative research method for information systems researchers. However, compared with the abundance of research articles and handbooks on planning and conducting focus groups, there is surprisingly little research on how to analyse focus group data. Moreover, the few articles that specifically address focus group analysis are all in fields other than information systems and offer little specific guidance for information systems researchers. Further, even the studies that exist in other fields do not provide a systematic and integrated procedure for analysing both focus group ‘content’ and ‘interaction’ data. As the focus group is a valuable method for answering the research questions of many IS studies (in business, government and society contexts), we believe that more attention should be paid to this method in IS research. This paper offers a systematic and integrated procedure for qualitative focus group data analysis in information systems research.

Background: The evaluation of hand function is an essential element of clinical practice. The usual assessments focus on the ability to perform activities of daily living. The inclusion of instruments to measure kinematic variables provides a new approach to assessment, and inertial sensors adapted to the hand could be used as a complementary instrument to the traditional assessment. Material: Clinimetric assessments (Upper Limb Functional Index, QuickDASH), anthropometric variables (height and weight) and dynamometry (palm pressure) were recorded. Functional analysis was performed with the AcceleGlove system on the right hand and a computer system. The glove has six acceleration sensors, one on each finger and one on the back of the palm. Method: Analytic, cross-sectional design. Ten healthy subjects performed six tasks on an evaluation table (tripod pinch, lateral pinch, tip pinch, extension grip, spherical grip and power grip). Each task was performed and measured three times, and the second repetition was analysed for the results section. A Matlab script was created for the analysis of each movement and for phase detection based on the module (magnitude) of the acceleration vector. Results: The module of the acceleration vector offers useful information about hand function. Analysis of the data obtained during the performance of functional gestures allows five different phases to be identified within the movement, three static and two dynamic, and each module-vector profile was associated with one task. Conclusion: Module-vector variables could be used for the analysis of the different tasks performed by the hand, and inertial sensors could be used as a complement to traditional assessment systems.
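The study's analysis was done in Matlab; purely as an illustration of the "module vector" idea, here is a Python sketch that computes the acceleration magnitude per sample and splits a gesture into static and dynamic phases with a simple threshold. The sample values, rest level and threshold are hypothetical, not the study's parameters.

```python
import math

def module_vector(samples):
    """samples: list of (ax, ay, az) tuples in g -> list of magnitudes |a|."""
    return [math.sqrt(ax * ax + ay * ay + az * az) for ax, ay, az in samples]

def label_phases(magnitudes, rest=1.0, tol=0.15):
    """Label each sample 'static' when |a| stays near the resting level
    (gravity only) and 'dynamic' when it deviates, i.e. the hand moves."""
    return ['static' if abs(m - rest) <= tol else 'dynamic' for m in magnitudes]

# Hypothetical palm-sensor samples: at rest, mid-grasp, at rest again.
samples = [(0.0, 0.0, 1.0), (0.4, 0.3, 1.1), (0.0, 0.0, 0.98)]
mags = module_vector(samples)
print(label_phases(mags))
```

Applying the same rule per finger sensor would give the per-task phase breakdown the abstract describes.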

Increasingly large-scale applications are generating unprecedented amounts of data. However, the growing gap between computation and I/O capacity on high-end computing (HEC) machines makes data analysis a severe bottleneck. Instead of moving data from its source to the output storage, in-situ analytics processes output data while simulations are running. However, in-situ data analysis incurs much more contention for computing resources with the simulation, and such contention can severely degrade simulation performance on HEC platforms. Since different data processing strategies have different impacts on performance and cost, there is a consequent need for flexibility in the placement of data analytics. In this paper, we explore and analyze several potential data-analytics placement strategies along the I/O path. To find the best strategy for reducing data movement in a given situation, we propose a flexible data analytics (FlexAnalytics) framework. Based on this framework, a FlexAnalytics prototype system was developed for analytics placement. The FlexAnalytics system enhances the scalability and flexibility of the current I/O stack on HEC platforms and is useful for data pre-processing, runtime data analysis and visualization, and large-scale data transfer. Two use cases – scientific data compression and remote visualization – were applied in the study to verify the performance of FlexAnalytics. Experimental results demonstrate that the FlexAnalytics framework increases data transfer bandwidth and improves end-to-end application transfer performance.
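The placement trade-off such a framework navigates can be caricatured with a back-of-envelope model: compress output in situ (stealing compute from the simulation) only when the CPU time spent compressing is repaid by the smaller transfer. This is not the FlexAnalytics algorithm, just a sketch with hypothetical sizes, bandwidths and compression costs.

```python
# Toy cost model for analytics placement along the I/O path.

def transfer_time(size_gb, bw_gbps, compress=False, ratio=0.25, cpu_s=0.0):
    """End-to-end time (s) to move one output step.

    ratio : compressed size / raw size; cpu_s : time spent compressing.
    """
    if compress:
        return cpu_s + (size_gb * ratio) / bw_gbps
    return size_gb / bw_gbps

def best_placement(size_gb, bw_gbps, ratio, cpu_s):
    raw = transfer_time(size_gb, bw_gbps)
    insitu = transfer_time(size_gb, bw_gbps, compress=True, ratio=ratio, cpu_s=cpu_s)
    return 'in-situ compression' if insitu < raw else 'raw transfer'

# Slow link: paying 10 s of CPU to shrink 100 GB to 25 GB wins.
print(best_placement(size_gb=100, bw_gbps=1.0, ratio=0.25, cpu_s=10.0))
# Fast link: shipping the raw data is already cheaper.
print(best_placement(size_gb=100, bw_gbps=100.0, ratio=0.25, cpu_s=10.0))
```

The point of a flexible framework is precisely that the winning side of this inequality changes with the platform and workload.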

A combined data matrix consisting of high-performance liquid chromatography–diode array detector (HPLC–DAD) and inductively coupled plasma–mass spectrometry (ICP-MS) measurements of samples from the plant roots of Cortex Moutan (CM) produced much better classification and prediction results than either of the individual data sets. The HPLC peaks (organic components) of the CM samples and the ICP-MS measurements (trace metal elements) were investigated using principal component analysis (PCA) and linear discriminant analysis (LDA); essentially qualitative results suggested that discrimination of the CM samples from three different provinces was possible, with the combined matrix producing the best results. Three further methods – K-nearest neighbor (KNN), back-propagation artificial neural network (BP-ANN) and least-squares support vector machines (LS-SVM) – were applied for the classification and prediction of the samples. Again, the combined data matrix analyzed by the KNN method produced the best results (100% correct on the prediction set). Additionally, multiple linear regression (MLR) was used to explore relationships between the organic constituents and the metal elements of the CM samples; the extracted linear regression equations showed that the essential metals, as well as some metallic pollutants, were related to the organic compounds on the basis of their concentrations.
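The data-fusion step described above amounts to concatenating the two feature blocks per sample before classification. Here is a minimal 1-nearest-neighbour sketch of that idea; the peak areas, element concentrations and province labels are hypothetical placeholders, not the paper's data.

```python
import math

def one_nn(train, labels, query):
    """Classify query by the label of its nearest training vector."""
    dists = [math.dist(x, query) for x in train]
    return labels[dists.index(min(dists))]

# Per sample: two HPLC peak areas fused with one ICP-MS concentration.
hplc  = [[1.0, 0.2], [0.9, 0.3], [0.1, 1.1], [0.2, 1.0]]
icpms = [[5.0], [4.8], [9.0], [9.2]]
fused = [h + m for h, m in zip(hplc, icpms)]          # combined data matrix
labels = ['Anhui', 'Anhui', 'Shandong', 'Shandong']   # hypothetical provinces

print(one_nn(fused, labels, [0.98, 0.22, 4.95]))
```

In practice the blocks would be scaled before fusion so that neither the chromatographic nor the elemental features dominate the distance.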

Drawing on multimodal texts produced by an Indigenous school community in Australia, I apply critical race theory and multimodal analysis (Jewitt, 2011) to decolonise digital heritage practices for Indigenous students. This study focuses on the particular ways in which students’ counter-narratives about race were embedded in multimodal and digital design in the development of a digital cultural heritage (Giaccardi, 2012). Data analysis involved applying multimodal analysis to the students’ Gamis, following social semiotic categories and principles theorised by Kress and Bezemer (2008), and Jewitt (2006, 2011). This includes attending to the following semiotic elements: visual design, movement and gesture, gaze, and recorded speech, and their interrelationships. The analysis also draws on critical race theory to interpret the students’ representations of race. In particular, the multimodal texts were analysed as a site for students’ views of Indigenous oppression in relation to the colonial powers and ownership of the land in Australian history (Ladson-Billings, 2009). Pedagogies that explore counter-narratives of cultural heritage in the official curriculum can encourage students to reframe their own racial identity, while challenging dominant white, historical narratives of colonial conquest, race, and power (Gutierrez, 2008). The children’s multimodal “Gami” videos, created with the iPad application, Tellagami, enabled the students to imagine hybrid, digital social identities and perspectives of Australian history that were tied to their Indigenous cultural heritage (Kamberelis, 2001).

We consider rank regression for clustered data analysis and investigate the induced smoothing method for obtaining the asymptotic covariance matrices of the parameter estimators. We prove that the induced estimating functions are asymptotically unbiased and that the resulting estimators are strongly consistent and asymptotically normal. The induced smoothing approach provides an effective way of obtaining asymptotic covariance matrices for between- and within-cluster estimators and for a combined estimator that accounts for within-cluster correlations. We also carry out extensive simulation studies to assess the performance of different estimators. The proposed methodology is substantially faster in computation and more stable in numerical results than existing methods. We apply the proposed methodology to a dataset from a randomized clinical trial.
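The core trick of induced smoothing can be sketched in a few lines: the discontinuous indicator I(r > 0) inside the rank estimating function is replaced by the normal CDF Φ(r/h), which is differentiable in the parameters and so admits standard covariance calculations. The bandwidth h below is an illustrative choice, not one from the paper.

```python
import math

def smoothed_indicator(r, h=0.1):
    """Phi(r / h): a smooth, differentiable surrogate for I(r > 0)."""
    return 0.5 * (1.0 + math.erf(r / (h * math.sqrt(2.0))))

# Far from zero the surrogate agrees with the hard indicator;
# near zero it transitions smoothly instead of jumping.
for r in (-1.0, -0.05, 0.0, 0.05, 1.0):
    print(r, round(smoothed_indicator(r), 4))
```

As h shrinks, Φ(r/h) approaches the hard indicator, which is why the smoothed and unsmoothed estimating functions share the same asymptotics.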

Variety selection in perennial pasture crops involves identifying best varieties from data collected from multiple harvest times in field trials. For accurate selection, the statistical methods for analysing such data need to account for the spatial and temporal correlation typically present. This paper provides an approach for analysing multi-harvest data from variety selection trials in which there may be a large number of harvest times. Methods are presented for modelling the variety by harvest effects while accounting for the spatial and temporal correlation between observations. These methods provide an improvement in model fit compared to separate analyses for each harvest, and provide insight into variety by harvest interactions. The approach is illustrated using two traits from a lucerne variety selection trial. The proposed method provides variety predictions allowing for the natural sources of variation and correlation in multi-harvest data.

The mapping and geospatial analysis of benthic environments are multidisciplinary tasks that have become more accessible in recent years because of advances in technology and cost reductions in survey systems. The complex relationships that exist among physical, biological, and chemical seafloor components require advanced, integrated analysis techniques to enable scientists and others to visualize patterns and, in so doing, allow inferences to be made about benthic processes. Effective mapping, analysis, and visualization of marine habitats are particularly important because the subtidal seafloor environment is not readily viewed directly by eye. Research in benthic environments relies heavily, therefore, on remote sensing techniques to collect effective data. Because many benthic scientists are not mapping professionals, they may not adequately consider the links between data collection, data analysis, and data visualization. Projects often start with clear goals, but may be hampered by the technical details and skills required for maintaining data quality through the entire process from collection through analysis and presentation. The lack of technical understanding of the entire data handling process can represent a significant impediment to success. While many benthic mapping efforts have detailed their methodology as it relates to the overall scientific goals of a project, only a few published papers and reports focus on the analysis and visualization components (Paton et al. 1997, Weihe et al. 1999, Basu and Saxena 1999, Bruce et al. 1997). In particular, the benthic mapping literature often briefly describes data collection and analysis methods, but fails to provide sufficiently detailed explanation of particular analysis techniques or display methodologies so that others can employ them. 
In general, such techniques are in large part guided by the data acquisition methods, which can include both aerial and water-based remote sensing methods to map the seafloor without physical disturbance, as well as physical sampling methodologies (e.g., grab or core sampling). The terms benthic mapping and benthic habitat mapping are often used synonymously to describe seafloor mapping conducted for the purpose of benthic habitat identification. There is a subtle yet important difference, however, between general benthic mapping and benthic habitat mapping. The distinction is important because it dictates the sequential analysis and visualization techniques that are employed following data collection. In this paper general seafloor mapping for identification of regional geologic features and morphology is defined as benthic mapping. Benthic habitat mapping incorporates the regional scale geologic information but also includes higher resolution surveys and analysis of biological communities to identify the biological habitats. In addition, this paper adopts the definition of habitats established by Kostylev et al. (2001) as a “spatially defined area where the physical, chemical, and biological environment is distinctly different from the surrounding environment.”

Identifying the differential expression of genes in psoriatic and healthy skin by microarray data analysis is a key approach to understanding the pathogenesis of psoriasis. Analysing more than one dataset to identify commonly upregulated genes reduces the likelihood of false positives and narrows down the possible signature genes. Genes controlling the critical balance between T helper 17 and regulatory T cells are of special interest in psoriasis. Our objective was to identify genes consistently upregulated in lesional skin across three published microarray datasets. We carried out a reanalysis of gene expression data extracted from three experiments on samples from psoriatic and nonlesional skin using the same stringency threshold and software, and further compared the expression levels of 92 genes related to the T helper 17 and regulatory T cell signaling pathways. We found 73 probe sets, representing 57 genes, commonly upregulated in lesional skin in all datasets. These included 26 probe sets representing 20 genes with no previous link to the etiopathogenesis of psoriasis. These genes may represent novel therapeutic targets but will certainly need more rigorous experimental testing to be validated. Our analysis also identified 12 of the 92 genes known to be related to the T helper 17 and regulatory T cell signaling pathways as differentially expressed in the lesional skin samples.
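The "commonly upregulated" step above reduces to a set intersection across datasets once each dataset has been thresholded. A sketch, with hypothetical placeholder gene symbols rather than the study's findings:

```python
# Intersect per-dataset lists of upregulated genes to find the
# signature shared by all datasets. Gene symbols are illustrative.

def common_upregulated(*datasets):
    """Genes passing the upregulation threshold in every dataset."""
    common = set(datasets[0])
    for genes in datasets[1:]:
        common &= set(genes)
    return sorted(common)

gse_a = {'IL17A', 'DEFB4A', 'S100A7', 'KRT16'}
gse_b = {'IL17A', 'S100A7', 'KRT16', 'CCL20'}
gse_c = {'S100A7', 'KRT16', 'IL23A', 'IL17A'}
print(common_upregulated(gse_a, gse_b, gse_c))
```

Using the same stringency threshold and software on all three datasets, as the abstract stresses, is what makes this intersection meaningful.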

Retrospective clinical datasets are often characterized by a relatively small sample size and many missing data. In this case, a common way of handling the missingness is to discard from the analysis patients with missing covariates, further reducing the sample size. Alternatively, if the mechanism that generated the missingness allows, incomplete data can be imputed on the basis of the observed data, avoiding the reduction of the sample size and allowing complete-data methods to be applied subsequently. Moreover, methodologies for data imputation may depend on the particular purpose and may achieve better results by considering specific characteristics of the domain. The problem of missing data treatment is studied here in the context of survival tree analysis for the estimation of a prognostic patient stratification. Survival tree methods usually address this problem by using surrogate splits, that is, splitting rules that use other variables to yield results similar to the original ones. Instead, our methodology consists in modeling the dependencies among the clinical variables with a Bayesian network, which is then used to perform data imputation, thus allowing the survival tree to be applied to the completed dataset. The Bayesian network is learned directly from the incomplete data using a structural expectation–maximization (EM) procedure in which the maximization step is performed with an exact anytime method, so that the only source of approximation is the EM formulation itself. On both simulated and real data, our proposed methodology usually outperformed several existing methods for data imputation, and the imputation so obtained improved the stratification estimated by the survival tree (especially with respect to using surrogate splits).
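As a toy stand-in for the model-based imputation described above (not the Bayesian-network/structural-EM procedure itself), the following sketch fills in a missing binary covariate with its conditional mode given an observed covariate, estimated from the complete cases. The records are hypothetical.

```python
# Conditional-mode imputation from complete cases: a minimal example of
# using dependencies between variables to fill in missing values.

def impute(records):
    """records: list of dicts with keys 'x' (always observed, 0/1) and
    'y' (0/1 or None). Missing 'y' is replaced by the most frequent y
    among complete cases with the same x."""
    complete = [r for r in records if r['y'] is not None]
    mode = {}
    for x in (0, 1):
        ys = [r['y'] for r in complete if r['x'] == x]
        mode[x] = max(set(ys), key=ys.count) if ys else 0
    return [dict(r, y=mode[r['x']] if r['y'] is None else r['y'])
            for r in records]

data = [{'x': 0, 'y': 0}, {'x': 0, 'y': 0}, {'x': 1, 'y': 1},
        {'x': 1, 'y': 1}, {'x': 1, 'y': 0}, {'x': 0, 'y': None},
        {'x': 1, 'y': None}]
print(impute(data))
```

A Bayesian network generalizes this idea: each missing value is imputed from its full conditional distribution given all observed variables, rather than from a single covariate.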