913 results for Data-driven analysis
Abstract:
The integration of quantitative data from movement-analysis technologies is reshaping the analysis of athletes' performance and the mitigation of injuries such as anterior cruciate ligament (ACL) rupture. Most movement assessments are performed in laboratory environments. Recent progress offers the chance to shift the paradigm towards a more ecological approach, with sport-specific elements and a closer examination of "real" movement patterns associated with performance and ACL injury risk. The present PhD thesis investigated on-field motion patterns related to performance and injury prevention in young football players. Its objectives were: (I) to validate wearable inertial sensor technology against in-lab measures of highly dynamic movements; (II) to compare in-laboratory and on-field agility movement tasks and inspect the effect of the football-specific environment; (III) to challenge wearable sensor technology in the on-field assessment of movement patterns that are dangerous with respect to ACL rupture; (IV) to present an overview of technologies that could shape present and future assessment of ACL injury risk in daily practice. The validity of wearables for assessing highly dynamic movements was confirmed. Relevant differences emerged between movements performed in a laboratory setting and on the football pitch, supporting the inclusion of an ecological-dynamics approach in preventive protocols. The on-field analysis of football-specific movement tasks demonstrated good reliability of wearable sensors and revealed residual dangerous patterns in previously injured players. A tool to inspect at-risk movement patterns on the field through objective measurements was presented, and the overview discussed how potential alternatives to wearable inertial sensors embrace artificial intelligence and closer collaboration between clinical and technical expertise.
The present thesis was meant to contribute to setting the basis for data-driven prevention protocols. A deeper comprehension of injury-related principles and counteractions will contribute to preserving athletes’ careers and health over time.
Abstract:
We report on a new analysis of neutrino oscillations in MINOS using the complete set of accelerator and atmospheric data. The analysis combines the ν_μ disappearance and ν_e appearance data using the three-flavor formalism. We measure |Δm²_32| = [2.28–2.46] × 10⁻³ eV² (68% C.L.) and sin²θ_23 = 0.35–0.65 (90% C.L.) in the normal hierarchy, and |Δm²_32| = [2.32–2.53] × 10⁻³ eV² (68% C.L.) and sin²θ_23 = 0.34–0.67 (90% C.L.) in the inverted hierarchy. The data also constrain δ_CP, the θ_23 octant degeneracy and the mass hierarchy; we disfavor 36% (11%) of this three-parameter space at 68% (90%) C.L.
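For context, in the two-flavour approximation that dominates ν_μ disappearance, the measured parameters enter the standard survival probability (L in km, E in GeV, Δm² in eV²):

```latex
P(\nu_\mu \to \nu_\mu) \approx 1 - \sin^2 2\theta_{23}\,
  \sin^2\!\left(\frac{1.27\,\Delta m^2_{32}\, L}{E}\right)
```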
Abstract:
The second edition of An Introduction to Efficiency and Productivity Analysis is designed to be a general introduction for those who wish to study efficiency and productivity analysis. The book provides an accessible, well-written introduction to the four principal methods involved: econometric estimation of average response models, index numbers, data envelopment analysis (DEA), and stochastic frontier analysis (SFA). For each method, a detailed introduction to the basic concepts is presented, numerical examples are provided, and some of the more important extensions to the basic methods are discussed. Of special interest is the systematic use of detailed empirical applications using real-world data throughout the book. In recent years, a number of excellent advanced-level books on performance measurement have been published. This book, however, is the first systematic survey of performance measurement with the express purpose of introducing the field to a wide audience of students, researchers, and practitioners. Indeed, the 2nd Edition maintains its uniqueness: (1) it is a well-written introduction to the field; (2) it outlines, discusses and compares the four principal methods for efficiency and productivity analysis in a well-motivated presentation; (3) it provides detailed advice on computer programs that can be used to implement these performance measurement methods. The book contains computer instructions and output listings for the SHAZAM, LIMDEP, TFPIP, DEAP and FRONTIER computer programs. More extensive listings of data and computer instruction files are available on the book's website: (www.uq.edu.au/economics/cepa/crob2005).
Abstract:
This paper measured variations in the performance of small municipalities in the State of Sao Paulo, Brazil, regarding technical efficiency in the use of public funds for primary health care actions, in relation to the funding profile, in a scenario of fiscal federalism. Technical efficiency is one of the parameters for evaluating public-sector performance and was measured by means of Data Envelopment Analysis (DEA). Correlation analysis of the DEA scores was used to verify possible associations between technical efficiency and the funding profile of health care expenses. The results showed that 6.41% of the municipalities were considered efficient. They also showed that the municipalities' level of dependence on intergovernmental general-purpose grants and on national health-funding specific-purpose grants is negatively correlated with the efficiency scores.
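The DEA scores used here come from solving one small linear program per municipality (decision-making unit). As an illustration only, a minimal input-oriented, constant-returns-to-scale (CCR) envelopment model can be sketched with SciPy; the data and names below are hypothetical, not taken from the study:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR efficiency score for each DMU.

    X: (n, m) array of inputs, Y: (n, s) array of outputs.
    For DMU o, solve: min theta  s.t.  sum_j lam_j x_j <= theta * x_o,
    sum_j lam_j y_j >= y_o, lam >= 0."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # decision variables: [theta, lambda_1, ..., lambda_n]
        c = np.zeros(n + 1)
        c[0] = 1.0                      # minimise theta
        A_ub = np.zeros((m + s, n + 1))
        b_ub = np.zeros(m + s)
        A_ub[:m, 0] = -X[o]             # inputs: X.T @ lam - theta*x_o <= 0
        A_ub[:m, 1:] = X.T
        A_ub[m:, 1:] = -Y.T             # outputs: -Y.T @ lam <= -y_o
        b_ub[m:] = -Y[o]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(0, None)] * (n + 1), method="highs")
        scores.append(res.x[0])
    return np.array(scores)

# toy example: two DMUs with one input and one output
scores = dea_ccr_input(np.array([[2.0], [4.0]]), np.array([[2.0], [2.0]]))
print(scores)  # first DMU efficient (1.0), second inefficient (0.5)
```

An efficient municipality receives a score of 1; a score below 1 indicates the proportion to which its inputs could be radially contracted while still producing its outputs.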
Abstract:
Simultaneous acquisition of electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) aims to disentangle the description of brain processes by exploiting the advantages of each technique. Most studies in this field focus on exploring the relationships between fMRI signals and the power spectrum at specific frequency bands (alpha, beta, etc.). On the other hand, brain mapping of EEG signals (e.g., interictal spikes in epileptic patients) usually assumes a haemodynamic response function for a parametric analysis applying the GLM, as a rough approximation. The integration of the information provided by the high spatial resolution of MR images and the high temporal resolution of EEG may be improved by relating them through transfer functions, which allow the identification of neurally driven areas without strong assumptions about haemodynamic response shapes or the homogeneity of brain haemodynamics. The difference in sampling rates is the first obstacle to a full integration of EEG and fMRI information. Moreover, no parametric specification of a function representing the commonalities of both signals is established. In this study, we introduce a new data-driven method for estimating the transfer function from the EEG signal to the fMRI signal at the EEG sampling rate. This approach avoids subsampling the EEG to the fMRI time resolution and naturally provides a test of the EEG's predictive power over BOLD signal fluctuations within a well-established statistical framework. We illustrate this concept in resting-state (eyes closed) and visual simultaneous fMRI-EEG experiments. The results point out that it is possible to predict the BOLD fluctuations in the occipital cortex by using EEG measurements. (C) 2010 Elsevier Inc. All rights reserved.
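At its core, estimating a transfer function from one signal to another can be illustrated with an ordinary least-squares fit of a finite impulse response. The sketch below is a toy version that ignores the sampling-rate mismatch and the statistical testing framework that the study's method addresses; function and variable names are my own:

```python
import numpy as np

def estimate_fir(x, y, n_taps):
    """Least-squares estimate of an FIR filter h such that y ~= x * h.

    Builds a design matrix whose columns are delayed copies of x and
    solves the normal equations with numpy's lstsq."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    X = np.column_stack([np.concatenate([np.zeros(k), x[:len(x) - k]])
                         for k in range(n_taps)])
    h, *_ = np.linalg.lstsq(X, y, rcond=None)
    return h

# toy check: output built from a known 2-tap filter [1.0, 0.5]
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.0, 2.5, 4.0, 5.5, 7.0])   # y[t] = x[t] + 0.5*x[t-1]
print(estimate_fir(x, y, 2))              # recovers [1.0, 0.5]
```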
Abstract:
Resting-state functional magnetic resonance imaging (fMRI) reveals a distinct network of correlated brain function representing a default mode state of the human brain. The underlying structural basis of this functional connectivity pattern is still widely unexplored. We combined fractional anisotropy measures of fiber tract integrity derived from diffusion tensor imaging (DTI) and resting-state fMRI data obtained at 3 Tesla from 20 healthy elderly subjects (56 to 83 years of age) to determine the white matter microstructure underlying default mode connectivity. We hypothesized that the functional connectivity between the posterior cingulate and hippocampus in resting-state fMRI data would be associated with the white matter microstructure in the cingulate bundle and in fiber tracts connecting the posterior cingulate gyrus with the lateral temporal lobes, medial temporal lobes, and precuneus. This was demonstrated at the p < 0.001 level using a voxel-based multivariate analysis of covariance (MANCOVA) approach. In addition, we used a data-driven technique of joint independent component analysis (ICA) that uncovers spatial patterns linked across modalities. It revealed a pattern of white matter tracts, including the cingulate bundle and associated fiber tracts, resembling the findings from the hypothesis-driven analysis, and this pattern was linked to the pattern of default mode network (DMN) connectivity in the resting-state fMRI data. Our findings support the notion that the functional connectivity between the posterior cingulate and hippocampus, and the functional connectivity across the entire DMN, is based on distinct patterns of anatomical connectivity within the cerebral white matter. (C) 2009 Elsevier Inc. All rights reserved.
Abstract:
Objective: Although suicide is a leading cause of death worldwide, clinicians and researchers lack a data-driven method to assess the risk of suicide attempts. This study reports the results of an analysis of a large cross-national epidemiologic survey database that estimates the 12-month prevalence of suicidal behaviors, identifies risk factors for suicide attempts, and combines these factors to create a risk index for 12-month suicide attempts separately for developed and developing countries. Method: Data come from the World Health Organization (WHO) World Mental Health (WMH) Surveys (conducted 2001-2007), in which 108,705 adults from 21 countries were interviewed using the WHO Composite International Diagnostic Interview. The survey assessed suicidal behaviors and potential risk factors across multiple domains, including socio-demographic characteristics, parent psychopathology, childhood adversities, DSM-IV disorders, and history of suicidal behavior. Results: Twelve-month prevalence estimates of suicide ideation, plans, and attempts are 2.0%, 0.6%, and 0.3%, respectively, for developed countries and 2.1%, 0.7%, and 0.4%, respectively, for developing countries. Risk factors for suicidal behaviors in both developed and developing countries include female sex, younger age, lower education and income, unmarried status, unemployment, parent psychopathology, childhood adversities, and presence of diverse 12-month DSM-IV mental disorders. Combining risk factors from multiple domains produced risk indices that accurately predicted 12-month suicide attempts in both developed and developing countries (area under the receiver operating characteristic curve = 0.74-0.80). Conclusions: Suicidal behaviors occur at similar rates in both developed and developing countries. Risk indices assessing multiple domains can predict suicide attempts with fairly good accuracy and may be useful in aiding clinicians in the prediction of these behaviors. 
J Clin Psychiatry 2010;71(12):1617-1628 (C) Copyright 2010 Physicians Postgraduate Press, Inc.
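The reported discrimination (area under the ROC curve of 0.74-0.80) can be reproduced in spirit with a rank-based AUC and a simple additive risk index. The sketch below uses invented toy data and a plain count of binary risk factors; it is not the WMH survey data nor the authors' index construction:

```python
import numpy as np

def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney rank identity:
    the fraction of (case, non-case) pairs that the score orders
    correctly, counting ties as half."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels)
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

# hypothetical respondents: columns are binary risk factors
# (e.g. parent psychopathology, childhood adversity, 12-month disorder)
factors = np.array([[1, 1, 1],
                    [1, 1, 0],
                    [0, 1, 0],
                    [0, 0, 1],
                    [0, 0, 0],
                    [1, 0, 0]])
attempt = np.array([1, 1, 0, 0, 0, 0])  # 12-month attempt indicator
risk_index = factors.sum(axis=1)        # simple additive risk index
print(auc(risk_index, attempt))         # → 1.0 (perfect on this toy data)
```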
Abstract:
The effect of the number of samples and of the selection of data for analysis on the calculation of surface motor unit potential (SMUP) size in the statistical method of motor unit number estimates (MUNE) was determined in 10 normal subjects and 10 subjects with amyotrophic lateral sclerosis (ALS). We recorded 500 sequential compound muscle action potentials (CMAPs) at three different stable stimulus intensities (10–50% of maximal CMAP). Estimated mean SMUP sizes were calculated using Poisson statistical assumptions from the variance of the 500 sequential CMAPs obtained at each stimulus intensity. The results with the full 500 data points were compared with smaller subsets (50–80% of the points) drawn from the same data set. The effect of restricting analysis to data between 5–20% of the CMAP, and to standard deviation limits, was also assessed. No differences in mean SMUP size were found with stimulus intensity or with the use of different ranges of data. Consistency improved with greater sample numbers. Restricting the data to within 5% of CMAP size gave both increased consistency and a reduced mean SMUP size in many subjects, but excluded valid responses present at that stimulus intensity. These changes were more prominent in ALS patients, in whom the presence of isolated SMUP responses was a striking difference from normal subjects. Noise, spurious data, and large SMUPs limited the Poisson assumptions. When these factors are considered, consistent statistical MUNE can be calculated from a continuous sequence of data points. A window of 2 to 2.5 SD, or of 10%, is a reasonable way of limiting data for analysis. Muscle Nerve 27: 320–331, 2003
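As a rough sketch of the Poisson reasoning behind statistical MUNE (not the authors' exact procedure), the mean SMUP size can be estimated from the variance-to-mean ratio of a run of CMAP amplitudes recorded at one fixed stimulus intensity. The baseline handling below (subtracting the smallest response in the run) is a simplifying assumption of this sketch:

```python
import statistics

def poisson_smup(cmap_amplitudes):
    """Estimate mean SMUP size from CMAP amplitudes at a fixed intensity.

    If the number of units firing is Poisson-distributed, the response
    increments have variance = mean * (unit size), so the mean SMUP size
    is variance / mean of the increments above the smallest response."""
    baseline = min(cmap_amplitudes)
    increments = [a - baseline for a in cmap_amplitudes]
    return statistics.pvariance(increments) / statistics.mean(increments)

def mune(max_cmap, mean_smup):
    """Motor unit number estimate = maximal CMAP / mean SMUP size."""
    return max_cmap / mean_smup

# toy run of four CMAP amplitudes (arbitrary units)
smup = poisson_smup([0.0, 1.0, 1.0, 2.0])
print(smup, mune(10.0, smup))  # → 0.5 20.0
```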
Abstract:
Benchmarking is an important tool for organisations to improve their productivity, product quality, process efficiency or services. Through benchmarking, organisations can compare their performance with competitors and identify their strengths and weaknesses. This study carries out a benchmarking analysis of the main Iberian sea ports, with a special focus on the efficiency of their container terminals. To this end, data envelopment analysis (DEA) is used, since several researchers consider it the most effective method to quantify a set of key performance indicators. In order to obtain a more reliable diagnostic tool, DEA is used together with data mining to compare the sea ports' container terminal operational data for 2007. Taking into account that sea ports are global logistics networks, performance evaluation is essential for effective decision making aimed at improving their efficiency and, therefore, their competitiveness.
Abstract:
In studies assessing the effect of a given exposure variable on a specific outcome of interest, the mistaken impression may arise that the exposure variable is producing the outcome when, in fact, the observed effect is due to an existing confounder. However, quantitative techniques are rarely used to determine the potential influence of unmeasured confounders. Sensitivity analysis is a statistical technique that allows the impact of an unmeasured confounding variable on the association of interest to be measured quantitatively. The purpose of this study was to make two sensitivity analysis methods available in the literature, developed by Rosenbaum and Greenland, easy to apply using an electronic spreadsheet. This should make it easier for researchers to include this quantitative tool in the set of procedures commonly used in the result-validation stage.
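One external-adjustment formula in this family (a Greenland-style correction for a single binary unmeasured confounder) is simple enough to sketch. The parameter names are generic, and the formula assumes the confounder-disease association is the same among exposed and unexposed:

```python
def adjusted_rr(rr_observed, rr_cd, p1, p0):
    """Externally adjust an observed risk ratio for a binary
    unmeasured confounder.

    rr_cd : assumed confounder-disease risk ratio
    p1/p0 : assumed confounder prevalence among exposed / unexposed
    The observed RR is divided by the bias factor it would carry if
    the confounder had these characteristics."""
    bias = (p1 * (rr_cd - 1) + 1) / (p0 * (rr_cd - 1) + 1)
    return rr_observed / bias

# example: an observed RR of 2.0 under a strong, unevenly distributed
# confounder (RR_CD = 3, prevalence 50% vs 20%) shrinks to 1.4
print(adjusted_rr(2.0, 3.0, 0.5, 0.2))
```

Re-running the function over a grid of (rr_cd, p1, p0) values is exactly the kind of spreadsheet-style sensitivity table the study describes.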
Abstract:
Final Master's project submitted to obtain the degree of Master in Mechanical Engineering - Maintenance and Production branch
Abstract:
Dissertation presented to obtain the degree of Doctor in Environmental Engineering from the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia
Abstract:
Dissertation presented to obtain the Ph.D. degree in Bioinformatics
Abstract:
Polysaccharides are gaining increasing attention as potentially environmentally friendly and sustainable building blocks in many fields of the (bio)chemical industry. The microbial production of polysaccharides is envisioned as a promising path, since higher biomass growth rates are possible and therefore higher productivities may be achieved compared to vegetable or animal polysaccharide sources. This Ph.D. thesis focuses on the modeling and optimization of the production of a particular microbial polysaccharide: the extracellular polysaccharides (EPS) produced by the bacterial strain Enterobacter A47. Enterobacter A47 was found to be a metabolically versatile organism in terms of its adaptability to complex media, notably capable of achieving high growth rates in media containing glycerol byproduct from the biodiesel industry. However, the industrial implementation of this production process is still hampered by a largely unoptimized process. Kinetic rates in the bioreactor depend heavily on operational parameters such as temperature, pH, stirring and aeration rate. The increase of culture broth viscosity is a common feature of this culture and has a major impact on overall performance. This fact complicates the mathematical modeling of the process, limiting the possibility to understand, control and optimize productivity. In order to tackle this difficulty, data-driven mathematical methodologies such as artificial neural networks can be employed to incorporate additional process data and complement the known mathematical description of the fermentation kinetics. In this Ph.D. thesis, we adopted such a hybrid modeling framework, which enabled the incorporation of temperature, pH and viscosity effects on the fermentation kinetics in order to improve the dynamic modeling and optimization of the process.
A model-based optimization method was implemented that enabled the design of optimal bioreactor control strategies in the sense of EPS productivity maximization. It is also critical to understand EPS synthesis at the level of the bacterial metabolism, since the production of EPS is a tightly regulated process. Methods of pathway analysis provide a means to unravel the fundamental pathways and their controls in bioprocesses. In the present Ph.D. thesis, a novel methodology called Principal Elementary Mode Analysis (PEMA) was developed and implemented that enabled the identification of which cellular fluxes are activated under different conditions of temperature and pH. It is shown that differences in these two parameters affect the chemical composition of the EPS; hence they are critical for the regulation of product synthesis. In future studies, the knowledge provided by PEMA could foster the development of metabolically meaningful control strategies that target the EPS sugar content and other product quality parameters.
Abstract:
Earthworks tasks aim at levelling the ground surface of a target construction area and precede any kind of structural construction (e.g., road and railway construction). Earthworks comprise sequential tasks, such as excavation, transportation, spreading and compaction, and rely strongly on heavy mechanical equipment and repetitive processes. In this context, it is essential to optimize the usage of all available resources under two key criteria: the cost and duration of the earthwork project. In this paper, we present an integrated system that uses two artificial intelligence techniques: data mining and evolutionary multi-objective optimization. The former is used to build data-driven models capable of providing realistic estimates of resource productivity, while the latter is used to optimize resource allocation considering the two main earthwork objectives (duration and cost). Experiments held using real-world data from a construction site have shown that the proposed system is competitive when compared with current manual earthwork design.
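With two objectives (duration and cost), candidate resource allocations are compared by Pareto dominance rather than a single score: a solution is kept only if no other solution is at least as good on both objectives and strictly better on one. A minimal sketch of extracting the non-dominated front, with toy data and both objectives minimised:

```python
def pareto_front(points):
    """Return the non-dominated (duration, cost) pairs.

    A point p is dominated when some other point q is no worse in both
    objectives and strictly better in at least one."""
    front = []
    for p in points:
        dominated = any(
            q[0] <= p[0] and q[1] <= p[1] and (q[0] < p[0] or q[1] < p[1])
            for q in points)
        if not dominated:
            front.append(p)
    return front

# toy (duration, cost) candidates: (3, 3) is dominated by (2, 2)
plans = [(1, 5), (2, 2), (3, 3), (5, 1)]
print(pareto_front(plans))  # → [(1, 5), (2, 2), (5, 1)]
```

Evolutionary multi-objective algorithms such as the one used in the paper evolve a population towards this front; the filter above is only the dominance test at the heart of that comparison.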