922 results for functional data analysis


Relevance: 90.00%

Abstract:

BACKGROUND: High intercoder reliability (ICR) is required in qualitative content analysis for assuring quality when more than one coder is involved in data analysis. The literature is short of standardized procedures for ICR assessment in qualitative content analysis. OBJECTIVE: To illustrate how ICR assessment can be used to improve codings in qualitative content analysis. METHODS: Key steps of the procedure are presented, drawing on data from a qualitative study on patients' perspectives on low back pain. RESULTS: First, a coding scheme was developed using a comprehensive inductive and deductive approach. Second, 10 transcripts were coded independently by two researchers, and ICR was calculated. The resulting kappa value of .67 can be regarded as satisfactory to solid. Moreover, varying agreement rates helped to identify problems in the coding scheme: low agreement rates, for instance, indicated that the respective codes were defined too broadly and needed clarification. In a third step, the results of the analysis were used to improve the coding scheme, leading to consistent and high-quality results. DISCUSSION: The quantitative approach of ICR assessment is a viable instrument for quality assurance in qualitative content analysis. Kappa values and close inspection of agreement rates help to estimate and increase the quality of codings. This approach facilitates good practice in coding and enhances the credibility of the analysis, especially when large samples are interviewed, different coders are involved, and quantitative results are presented.
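
As a hedged illustration of the kappa computation described in this abstract (not the authors' own analysis code), the sketch below computes Cohen's kappa for two coders from hypothetical segment-level codings; the category labels and arrays are invented for demonstration.

```python
import numpy as np

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders' nominal codings of the same segments."""
    coder_a = np.asarray(coder_a)
    coder_b = np.asarray(coder_b)
    categories = np.union1d(coder_a, coder_b)
    # Observed agreement: proportion of segments coded identically.
    p_o = np.mean(coder_a == coder_b)
    # Expected agreement under independence, from each coder's marginal rates.
    p_e = sum(np.mean(coder_a == c) * np.mean(coder_b == c) for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codings of 12 transcript segments by two researchers.
coder_1 = ["pain", "coping", "pain", "work", "coping", "pain",
           "work", "pain", "coping", "work", "pain", "coping"]
coder_2 = ["pain", "coping", "work", "work", "coping", "pain",
           "work", "pain", "pain", "work", "pain", "coping"]
print(f"kappa = {cohens_kappa(coder_1, coder_2):.2f}")
```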

Relevance: 90.00%

Abstract:

OBJECTIVE: To evaluate fracture patterns in bicondylar tibial plateau fractures and their impact on treatment strategy. DESIGN: Prospective data analysis with documentation of initial injury and treatment strategy, computed tomography scans, conventional x-rays, long-term evaluation of radiographs, and functional assessments. SETTING: Level 1 regional trauma center. PATIENTS: Prospective data acquisition of 14 consecutive patients (10 male and 4 female) with a bicondylar tibial plateau fracture (AO Type C). INTERVENTION: Application of a stepwise reconstruction strategy for the tibial plateau, starting with reduction and fixation of the posteromedial split fragment using a 3.5 buttress plate, followed by reduction and grafting of the lateral compartment and lateral fixation with a 3.5 plate at 90 degrees to the medial fixation device. MAIN OUTCOME MEASUREMENTS: All patients were evaluated with full-length standing films, standardized x-rays, the Lysholm score for functional assessment, and patient self-appraisal. RESULTS: Most complex bicondylar fractures follow a regular pattern in that the medial compartment is split in a mediolateral direction with a posteromedial main fragment, combined with varying amounts of multifragmentary lateral compartment depression. The technique introduced allows for accurate and stable reduction and fixation of this fracture type. The final Lysholm knee score averaged 83.5 points (range: 64.5-92). CONCLUSIONS: Complex bicondylar tibial plateau fractures follow a regular pattern, which is not represented in existing 2-dimensional fracture classifications. A 2-incision technique starting with the reduction of the posteromedial edge results in accurate fracture reduction with low complication rates and excellent knee function.

Relevance: 90.00%

Abstract:

Exposimeters are increasingly applied in bioelectromagnetic research to determine personal radiofrequency electromagnetic field (RF-EMF) exposure. The main advantages of exposimeter measurements are their convenient handling for study participants and the large amount of personal exposure data, which can be obtained for several RF-EMF sources. However, the large proportion of measurements below the detection limit is a challenge for data analysis. With the robust ROS (regression on order statistics) method, summary statistics can be calculated by fitting an assumed distribution to the observed data. We used a preliminary sample of 109 weekly exposimeter measurements from the QUALIFEX study to compare summary statistics computed by robust ROS with a naïve approach, where values below the detection limit were replaced by the value of the detection limit. For the total RF-EMF exposure, differences between the naïve approach and the robust ROS were moderate for the 90th percentile and the arithmetic mean. However, exposure contributions from minor RF-EMF sources were considerably overestimated with the naïve approach. This results in an underestimation of the exposure range in the population, which may bias the evaluation of potential exposure-response associations. We conclude from our analyses that summary statistics of exposimeter data calculated by robust ROS are more reliable and more informative than estimates based on a naïve approach. Nevertheless, estimates of source-specific medians or even lower percentiles depend on the assumed data distribution and should be considered with caution.
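
As a rough, hedged sketch of the idea behind regression on order statistics for the simple case of a single detection limit (not the exact robust ROS implementation used in the study), the code below fits a lognormal model to the detected values via a regression of log-values on normal quantiles, imputes the censored observations from that fit, and compares the resulting mean and 90th percentile with the naïve substitution approach; all numbers are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical weekly exposure values (arbitrary units) and a single detection limit.
true_values = rng.lognormal(mean=-1.0, sigma=0.8, size=109)
detection_limit = 0.3
detected = np.sort(true_values[true_values >= detection_limit])
n_total = true_values.size
n_censored = n_total - detected.size

# Naive approach: replace non-detects with the value of the detection limit.
naive = np.concatenate([np.full(n_censored, detection_limit), detected])

# ROS-style approach: regress log(detected) on normal quantiles of plotting
# positions, then impute the censored (lowest) ranks from the fitted line.
ranks = np.arange(1, n_total + 1)
plotting_pos = ranks / (n_total + 1)
z = stats.norm.ppf(plotting_pos)
slope, intercept, *_ = stats.linregress(z[n_censored:], np.log(detected))
imputed = np.exp(intercept + slope * z[:n_censored])
ros = np.concatenate([imputed, detected])

for name, x in [("naive", naive), ("ROS", ros)]:
    print(f"{name:5s} mean={x.mean():.3f}  p90={np.percentile(x, 90):.3f}")
```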

Relevance: 90.00%

Abstract:

Turrialba is one of the largest and most active stratovolcanoes in the Central Cordillera of Costa Rica and an excellent target for validation of satellite data using ground-based measurements due to its high elevation, relative ease of access, and persistent elevated SO2 degassing. The Ozone Monitoring Instrument (OMI) aboard the Aura satellite makes daily global observations of atmospheric trace gases and is used in this investigation to obtain volcanic SO2 retrievals in the Turrialba volcanic plume. We present and evaluate the relative accuracy of two OMI SO2 data analysis procedures, the automatic Band Residual Index (BRI) technique and the manual Normalized Cloud-mass (NCM) method. We find a linear correlation and good quantitative agreement between SO2 burdens derived from the BRI and NCM techniques, with an improved correlation when wet season data are excluded. We also present the first comparisons between volcanic SO2 emission rates obtained from ground-based mini-DOAS measurements at Turrialba and three new OMI SO2 data analysis techniques: the MODIS smoke estimation, OMI SO2 lifetime, and OMI SO2 transect techniques. A robust validation of OMI SO2 retrievals was made, with both qualitative and quantitative agreement under specific atmospheric conditions, proving the utility of satellite measurements for estimating accurate SO2 emission rates and monitoring passively degassing volcanoes.
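
As a hedged illustration of how two retrieval techniques can be compared quantitatively (a generic linear-fit sketch, not the BRI or NCM algorithms themselves, with invented burden values), the snippet below computes the slope, intercept, and correlation between two hypothetical sets of daily SO2 burdens.

```python
import numpy as np
from scipy import stats

# Hypothetical daily SO2 burdens (tonnes) from two retrieval techniques.
bri_burden = np.array([120.0, 340.0, 210.0, 95.0, 410.0, 180.0, 260.0, 150.0])
ncm_burden = np.array([110.0, 360.0, 230.0, 90.0, 430.0, 170.0, 280.0, 140.0])

fit = stats.linregress(bri_burden, ncm_burden)
print(f"slope={fit.slope:.2f}  intercept={fit.intercept:.1f}  r={fit.rvalue:.3f}")
```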

Relevance: 90.00%

Abstract:

Principal Component Analysis (PCA) is a popular method for dimension reduction that can be used in many fields, including data compression, image processing, and exploratory data analysis. However, traditional PCA has several drawbacks: it is not efficient for dealing with high-dimensional data and cannot compute sufficiently accurate principal components when a relatively large portion of the data is missing. In this report, we propose to use the EM-PCA method for dimension reduction of power system measurements with missing data, and provide a comparative study of the traditional PCA and EM-PCA methods. Our extensive experimental results show that EM-PCA is more effective and more accurate than traditional PCA for dimension reduction of power system measurement data when a large portion of the data set is missing.
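
As a minimal, hedged sketch of an EM-style approach to PCA with missing entries (an iterative impute-and-refit scheme with an assumed fixed number of components; the report's actual implementation may differ), the code below alternates between reconstructing the data from the leading principal components and refilling the missing entries with that reconstruction.

```python
import numpy as np

def em_pca(X, n_components, n_iter=200, tol=1e-8):
    """Iterative (EM-style) PCA with missing data: impute, fit, reconstruct, repeat."""
    X = np.array(X, dtype=float)
    missing = np.isnan(X)
    # Initial guess: fill missing entries with column means.
    X_filled = np.where(missing, np.nanmean(X, axis=0), X)
    for _ in range(n_iter):
        mu = X_filled.mean(axis=0)
        U, s, Vt = np.linalg.svd(X_filled - mu, full_matrices=False)
        # Rank-k reconstruction from the leading principal components.
        recon = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components] + mu
        # Refill only the missing entries; observed values are kept as-is.
        new_filled = np.where(missing, recon, X)
        if np.max(np.abs(new_filled - X_filled)) < tol:
            X_filled = new_filled
            break
        X_filled = new_filled
    return Vt[:n_components], X_filled

# Hypothetical measurement matrix (rows = time samples, columns = sensors) with gaps.
rng = np.random.default_rng(1)
data = rng.normal(size=(50, 8)) @ rng.normal(size=(8, 8))
data[rng.random(data.shape) < 0.2] = np.nan   # knock out ~20% of entries
components, completed = em_pca(data, n_components=3)
print(components.shape, np.isnan(completed).any())
```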

Relevance: 90.00%

Abstract:

Dr. Rossi discusses the common errors that are made when fitting statistical models to data. He focuses on the planning, data analysis, and interpretation phases of a statistical analysis and highlights the errors that researchers commonly make in each of these phases. The implications of these common errors are discussed, along with methods that can be used to prevent them from occurring. A prescription for carrying out a correct statistical analysis is also presented.

Relevance: 90.00%

Abstract:

Cluster randomized trials (CRTs) use clusters, usually defined as collections of individuals sharing some common characteristic, as the unit of randomization. Common examples of clusters include entire dental practices, hospitals, schools, school classes, villages, and towns. Additionally, several measurements (repeated measurements) taken on the same individual at different time points can also be considered a cluster. In dentistry, CRTs are applicable because patients may be treated as clusters containing several individual teeth. CRTs require certain methodological procedures during sample size calculation, randomization, data analysis, and reporting, which are often ignored in dental research publications. In general, because of the similarity of observations within a cluster, each individual within a cluster provides less information than an individual in a non-clustered trial. Therefore, clustered designs require larger sample sizes than non-clustered randomized designs, as well as special statistical analyses that account for the fact that observations within clusters are correlated. The purpose of this article is to highlight, with relevant examples, the important methodological characteristics of cluster randomized designs as they may be applied in orthodontics and to explain the problems that may arise if clustered observations are erroneously treated and analysed as independent (non-clustered).
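
As a hedged numeric illustration of the sample size inflation mentioned above (using the standard design effect formula 1 + (m - 1)·ICC, with an invented cluster size and intraclass correlation rather than figures from the article), the snippet below scales an individually randomized sample size by the design effect.

```python
import math

def inflated_sample_size(n_individual, cluster_size, icc):
    """Scale an individually randomized sample size by the design effect."""
    design_effect = 1 + (cluster_size - 1) * icc
    return math.ceil(n_individual * design_effect), design_effect

# Hypothetical example: 200 patients needed under individual randomization,
# clusters of 20 patients per practice, intraclass correlation of 0.05.
n_clustered, deff = inflated_sample_size(200, 20, 0.05)
print(f"design effect = {deff:.2f}, required clustered sample size = {n_clustered}")
```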

Relevance: 90.00%

Abstract:

Highland cattle with congenital crop ears have notches of variable size on the tips of both ears. In some cases, cartilage deformation can be seen, and occasionally the external ears are shortened. We collected 40 cases and 80 controls across Switzerland. Pedigree data analysis confirmed a monogenic autosomal dominant mode of inheritance with variable expressivity. All affected animals could be traced back to a single common ancestor. A genome-wide association study was performed, and the causative mutation was mapped to a 4 Mb interval on bovine chromosome 6. The H6 family homeobox 1 (HMX1) gene was selected as a positional and functional candidate gene. By whole genome re-sequencing of an affected Highland animal, we detected six non-synonymous coding sequence variants and two variants in an ultra-conserved element at the HMX1 locus with respect to the reference genome. Of these eight variants, only a non-coding 76 bp genomic duplication (g.106720058_106720133dup) located in the conserved region was perfectly associated with crop ears. The identified copy number variation probably results in HMX1 misregulation and a possible gain of function. Our findings confirm the role of HMX1 during the development of the external ear. As it is sometimes difficult to phenotypically diagnose Highland cattle with slight ear notches, genetic testing can now be used to improve selection against this undesired trait.
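
As a hedged sketch of the kind of single-marker case-control association test that underlies a GWAS (a generic chi-square test on invented genotype counts, not the study's actual genome-wide analysis), the code below tests whether genotype frequencies at one marker differ between affected and unaffected animals.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical genotype counts at one SNP (columns: AA, AB, BB)
# for 40 crop-eared cases and 80 controls.
counts = np.array([
    [ 2, 10, 28],   # cases
    [45, 30,  5],   # controls
])
chi2, p_value, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p_value:.2e}")
```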

Relevance: 90.00%

Abstract:

This paper presents an overview of the Mobile Data Challenge (MDC), a large-scale research initiative aimed at generating innovations around smartphone-based research, as well as community-based evaluation of mobile data analysis methodologies. First, we review the Lausanne Data Collection Campaign (LDCC), an initiative to collect a unique longitudinal smartphone dataset for the MDC. Then, we introduce the Open and Dedicated Tracks of the MDC, describe the specific datasets used in each of them, discuss the key design and implementation aspects introduced in order to generate privacy-preserving and scientifically relevant mobile data resources for wider use by the research community, and summarize the main research trends found among the 100+ challenge submissions. We conclude by discussing the main lessons learned from the participation of several hundred researchers worldwide in the MDC Tracks.

Relevance: 90.00%

Abstract:

Brain tumor is one of the most aggressive types of cancer in humans, with an estimated median survival time of 12 months and only 4% of patients surviving more than 5 years after disease diagnosis. Until recently, brain tumor prognosis has been based only on clinical information such as tumor grade and patient age, but there are reports indicating that molecular profiling of gliomas can reveal subgroups of patients with distinct survival rates. We hypothesize that coupling molecular profiling of brain tumors with clinical information might improve predictions of patient survival time and, consequently, better guide future treatment decisions. In order to evaluate this hypothesis, the general goal of this research is to build models for survival prediction of glioma patients using DNA molecular profiles (U133 Affymetrix gene expression microarrays) along with clinical information. First, a predictive Random Forest model is built for binary outcomes (i.e., short- vs. long-term survival) and a small subset of genes whose expression values can be used to predict survival time is selected. Next, a new statistical methodology is developed for predicting time-to-death outcomes using Bayesian ensemble trees. Because of the large heterogeneity observed within the prognostic classes obtained by the Random Forest model, prediction can be improved by relating time-to-death directly to the gene expression profile. We propose a Bayesian ensemble model for survival prediction that is appropriate for high-dimensional data such as gene expression data. Our approach is based on the ensemble "sum-of-trees" model, which is flexible enough to incorporate additive and interaction effects between genes. We specify a fully Bayesian hierarchical approach and illustrate our methodology for the CPH, Weibull, and AFT survival models. We overcome the lack of conjugacy using a latent variable formulation to model the covariate effects, which decreases computation time for model fitting. In addition, our proposed models provide a model-free way to select important predictive prognostic markers based on controlling false discovery rates. We compare the performance of our methods with baseline reference survival methods and apply our methodology to an unpublished data set of brain tumor survival times and gene expression data, selecting genes potentially related to the development of the disease under study. A closing discussion compares the results obtained by the Random Forest and Bayesian ensemble methods from biological/clinical perspectives and highlights the statistical advantages and disadvantages of the new methodology in the context of DNA microarray data analysis.
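
As a hedged, minimal sketch of the first modeling step described above (a Random Forest classifier for short- versus long-term survival; the feature matrix, labels, and parameters here are invented placeholders rather than the study's data or settings), the code below fits and evaluates such a model and ranks genes by importance.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)

# Hypothetical expression matrix: 120 patients x 500 probe sets,
# with labels 1 = long-term survivor, 0 = short-term survivor.
X = rng.normal(size=(120, 500))
y = (X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=120) > 0).astype(int)

model = RandomForestClassifier(n_estimators=500, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"cross-validated AUC: {scores.mean():.2f}")

# Rank genes by importance to pick a small candidate subset.
model.fit(X, y)
top_genes = np.argsort(model.feature_importances_)[::-1][:10]
print("top candidate probe indices:", top_genes)
```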

Relevance: 90.00%

Abstract:

OBJECTIVES To identify the timing of significant arch dimensional increases during orthodontic alignment involving round and rectangular nickel-titanium (NiTi) wires and rectangular stainless steel (SS) wires. A secondary aim was to compare the timing of changes occurring with conventional and self-ligating fixed appliance systems. METHODS In this non-primary publication, additional data from a multicenter randomised trial initially involving 96 patients, aged 16 years and above, were analysed. The main pre-specified outcome measures were the magnitude and timing of changes in maxillary intercanine, interpremolar, and intermolar dimensions. Each participant underwent alignment with a standard Damon (Ormco, Orange, CA) wire sequence for a minimum of 34 weeks. Blinding of clinicians and patients was not possible; however, outcome assessors and data analysts were kept blind to the appliance type during data analysis. RESULTS Complete data were obtained from 71 subjects. Significant arch dimensional changes were observed relatively early in treatment. In particular, changes in maxillary inter-first and second premolar dimensions occurred after alignment with a 0.014-in. NiTi wire (P<0.05). No statistically significant differences were found between rectangular NiTi and working SS wires for any transverse dimension (P>0.05). Bracket type had no significant effect on the timing of the transverse dimensional changes. CONCLUSIONS Arch dimensional changes were found to occur relatively early in treatment, irrespective of the appliance type. Nickel-titanium wires may have a more profound effect on transverse dimensions than previously believed. CLINICAL SIGNIFICANCE On the basis of this research, orthodontic expansion may occur relatively early in treatment, and nickel-titanium wires may have a more profound effect on transverse dimensions than previously believed.

Relevance: 90.00%

Abstract:

PRINCIPLES Over a million people worldwide die each year from road traffic injuries and more than 10 million sustain permanent disabilities. Many of these victims are pedestrians. The present retrospective study analyzes the severity and mortality of injuries suffered by adult pedestrians, depending on whether they used a zebra crosswalk. METHODS Our retrospective data analysis covered adult patients admitted to our emergency department (ED) between 1 January 2000 and 31 December 2012 after being hit by a vehicle while crossing the road as a pedestrian. Patients were identified by using a string search term. Medical, police and ambulance records were reviewed for data extraction. RESULTS A total of 347 patients were eligible for study inclusion. Two hundred and three (203; 58.5%) patients were on a zebra crosswalk and 144 (41.5%) were not. The mean ISS (Injury Severity Score) was 12.1 (SD 14.7, range 1-75). The vehicles were faster in non-zebra crosswalk accidents (47.7 km/h versus 41.4 km/h, p<0.027). The mean ISS was higher in patients with non-zebra crosswalk accidents: 14.4 (SD 16.5, range 1-75) versus 10.5 (SD 13.14, range 1-75) (p<0.019). Zebra crosswalk accidents were associated with a lower risk of severe injury (OR 0.61, 95% CI 0.38-0.98, p<0.042). Accidents involving a truck were associated with an increased risk of severe injury (OR 3.53, 95% CI 1.21-10.26, p<0.02). CONCLUSION Accidents on zebra crosswalks are more common than those not on zebra crosswalks. The injury severity in non-zebra crosswalk accidents is significantly higher than in zebra crosswalk accidents. Accidents involving large vehicles are associated with an increased risk of severe injury. Further prospective studies are needed, with detailed assessment of motor vehicle types and speeds.
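
As a hedged illustration of how an odds ratio and its 95% confidence interval of the kind reported above can be computed (from an invented 2x2 table of severe versus non-severe injury by crosswalk use, not the study's actual counts), see the sketch below.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table [[a, b], [c, d]]."""
    odds_ratio = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(odds_ratio) - z * se_log_or)
    upper = math.exp(math.log(odds_ratio) + z * se_log_or)
    return odds_ratio, lower, upper

# Hypothetical counts: severe / non-severe injuries on and off zebra crosswalks.
#                 severe  non-severe
# zebra             60       143
# non-zebra         62        82
or_est, lower, upper = odds_ratio_ci(60, 143, 62, 82)
print(f"OR = {or_est:.2f} (95% CI {lower:.2f}-{upper:.2f})")
```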

Relevance: 90.00%

Abstract:

The susceptibility of humans to the variant Creutzfeldt-Jakob disease is greatly influenced by polymorphisms within the human prion protein gene (PRNP). Similar genetic differences exist in sheep, in which PRNP polymorphisms modify the susceptibility to scrapie. However, the known coding polymorphisms within the bovine PRNP gene have little or no effect on bovine spongiform encephalopathy (BSE) susceptibility in cattle. We have recently found a tentative association between PRNP promoter polymorphisms and BSE susceptibility in German cattle (Sander, P., Hamann, H., Pfeiffer, I., Wemheuer, W., Brenig, B., Groschup, M., Ziegler, U., Distl, O., and Leeb, T. (2004) Neurogenetics 5, 19-25). A plausible hypothesis explaining this observation could be that the bovine PRNP promoter polymorphisms cause changes in PRNP expression that might be responsible for differences in BSE incubation time and/or BSE susceptibility. To test this hypothesis, we performed a functional promoter analysis of the different bovine PRNP promoter alleles by reporter gene assays in vitro and by measuring PRNP mRNA levels in calves with different PRNP genotypes in vivo. Two variable sites, a 23-bp insertion/deletion (indel) polymorphism containing a RP58-binding site and a 12-bp indel polymorphism containing an SP1-binding site, were investigated. Band shift assays indicated differences in transcription factor binding to the different alleles at the two polymorphisms. Reporter gene assays demonstrated an interaction between the two postulated transcription factors and lower expression levels of the ins/ins allele compared with the del/del allele. The in vivo data revealed substantial individual variation of PRNP expression in different tissues. In intestinal lymph nodes, expression levels differed between the different PRNP genotypes.

Relevance: 90.00%

Abstract:

Noble gas analysis in early solar system materials, which can provide valuable information about early solar system processes and timescales, is very challenging because of extremely low noble gas concentrations (ppt). We therefore developed a new compact (33 cm length, 7.2 cm diameter, 1.3 L internal volume) time-of-flight (TOF) noble gas mass spectrometer for high sensitivity, which we call the Edel Gas Time-of-Flight (EGT) mass spectrometer. The instrument uses electron impact ionization coupled to an ion trap, which allows us to ionize and measure all noble gas isotopes, and a reflectron set-up that improves the mass resolution and provides additional focusing. Detection is via MCPs, and the signals are processed either via ADC or TDC systems. The objective of this work is to characterize the newly developed TOF mass spectrometer for noble gas analysis of presolar grains in meteorites. Chapter 1 briefly introduces the basic idea and importance of the instrument. The physics relevant to the time-of-flight mass spectrometry technique is discussed in Chapter 2, and Chapter 3 presents the oxidation technique for presolar nanodiamond grains using copper oxide. Chapter 4 presents the details of the EGT data analysis software, and Chapters 5 and 6 explain the details of the EGT design and operation. Finally, the performance results are presented and discussed in Chapter 7; the whole work is summarized in Chapter 8, together with an outlook on future work.
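
As a hedged aside on the physics referred to for Chapter 2 (the standard ideal field-free drift relation t = L * sqrt(m / (2 * q * U)), not a model of the EGT's actual reflectron geometry), the snippet below estimates flight times for a few noble gas ions under assumed, hypothetical instrument parameters.

```python
import math

AMU = 1.66053906660e-27     # kg per atomic mass unit
E_CHARGE = 1.602176634e-19  # elementary charge in C

def flight_time(mass_amu, charge_state, drift_length_m, accel_voltage_v):
    """Ideal field-free drift time: t = L * sqrt(m / (2 * q * U))."""
    m = mass_amu * AMU
    q = charge_state * E_CHARGE
    return drift_length_m * math.sqrt(m / (2 * q * accel_voltage_v))

# Assumed (hypothetical) parameters: 0.33 m effective drift length, 1 kV acceleration.
for name, mass in [("4He+", 4.0026), ("20Ne+", 19.992), ("40Ar+", 39.962), ("132Xe+", 131.904)]:
    t_us = flight_time(mass, 1, 0.33, 1000.0) * 1e6
    print(f"{name:6s} t ≈ {t_us:.2f} µs")
```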