44 results for Secondary Data Analysis


Relevance:

100.00%

Publisher:

Abstract:

Cloud computing provides a promising solution to the genomics data deluge caused by the advent of next-generation sequencing (NGS) technology. Thanks to the concepts of “resources-on-demand” and “pay-as-you-go”, scientists with no or limited infrastructure can access scalable and cost-effective computational resources. However, the large size of NGS data causes significant data transfer latency from the client’s site to the cloud, which is a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, in which the NGS data are processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks in which the NGS sequences can be processed independently of one another. We also provide the elastream package, which supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both the time and the cost of computation.
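The independence of the sequences is what makes the scheme work: each read can be analysed the moment it arrives, so transfer and computation overlap. A minimal sketch of the idea, in which the read source and the per-read task (GC content) are hypothetical stand-ins and not elastream's actual API:

```python
def incoming_reads():
    """Stand-in for NGS reads arriving over the network during transfer."""
    yield from ["ACGTACGT", "GGGCCCAA", "ATATATAT"]

def gc_content(read):
    """Per-read analysis task; reads are processed independently."""
    return (read.count("G") + read.count("C")) / len(read)

# Each read is analysed as soon as it lands, instead of waiting for the
# whole dataset to finish uploading, so transfer latency hides behind compute.
results = [gc_content(read) for read in incoming_reads()]
```

In a real deployment the generator would wrap the network transfer itself; the analysis loop is unchanged.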

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVES To identify the timing of significant arch dimensional increases during orthodontic alignment involving round and rectangular nickel-titanium (NiTi) wires and rectangular stainless steel (SS) wires. A secondary aim was to compare the timing of changes occurring with conventional and self-ligating fixed appliance systems. METHODS In this non-primary publication, additional data from a multicenter randomised trial initially involving 96 patients, aged 16 years and above, were analysed. The main pre-specified outcome measures were the magnitude and timing of changes in maxillary intercanine, interpremolar, and intermolar dimensions. Each participant underwent alignment with a standard Damon (Ormco, Orange, CA) wire sequence for a minimum of 34 weeks. Blinding of clinicians and patients was not possible; however, outcome assessors and data analysts were kept blind to the appliance type during data analysis. RESULTS Complete data were obtained from 71 subjects. Significant arch dimensional changes were observed relatively early in treatment. In particular, changes in maxillary inter-first and second premolar dimensions occurred after alignment with a 0.014-in. NiTi wire (P<0.05). No statistically significant differences were found between rectangular NiTi and working SS wires for any transverse dimension (P>0.05). Bracket type had no significant effect on the timing of the transverse dimensional changes. CONCLUSIONS Arch dimensional changes were found to occur relatively early in treatment, irrespective of the appliance type. Nickel-titanium wires may have a more profound effect on transverse dimensions than previously believed. CLINICAL SIGNIFICANCE On the basis of this research, orthodontic expansion may occur relatively early in treatment, and nickel-titanium wires may have a more profound effect on transverse dimensions than previously believed.

Relevance:

90.00%

Publisher:

Abstract:

PURPOSE: Tumor stage and nuclear grade are the most important prognostic parameters of clear cell renal cell carcinoma (ccRCC). The progression risk of ccRCC remains difficult to predict, particularly for tumors with organ-confined stage and intermediate differentiation grade. Elucidating molecular pathways deregulated in ccRCC may point to novel prognostic parameters that facilitate planning of therapeutic approaches. EXPERIMENTAL DESIGN: Using tissue microarrays, expression patterns of 15 different proteins were evaluated in over 800 ccRCC patients to analyze pathways reported to be physiologically controlled by the tumor suppressors von Hippel-Lindau protein and phosphatase and tensin homologue (PTEN). Tumor staging and grading were improved by performing variable selection using Cox regression and a recursive bootstrap elimination scheme. RESULTS: Patients with pT2 and pT3 tumors that were p27 and CAIX positive had a better outcome than those with all remaining marker combinations. A prolonged survival among patients with intermediate-grade (grade 2) tumors correlated with both nuclear p27 and cytoplasmic PTEN expression, as well as with inactive, nonphosphorylated ribosomal protein S6. By applying graphical log-linear modeling to the over 700 ccRCC cases for which the molecular parameters were available, only a weak conditional dependence was found between the expression of p27, PTEN, CAIX, and p-S6, suggesting that the dysregulation of several independent pathways is crucial for tumor progression. CONCLUSIONS: The use of recursive bootstrap elimination and graphical log-linear modeling for comprehensive tissue microarray (TMA) data analysis allows the unraveling of complex molecular contexts and may improve predictive evaluations for patients with advanced renal cancer.
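The recursive bootstrap elimination step can be pictured as repeatedly scoring each marker on bootstrap resamples and discarding the weakest marker each round. The sketch below uses a simple absolute-covariance score as a hypothetical stand-in for fitting a Cox model (the actual analysis used Cox regression on survival data); the marker values and outcome are invented:

```python
import random

def score(marker_values, outcome):
    """Hypothetical stand-in for the strength of a Cox regression term."""
    n = len(outcome)
    mean_m = sum(marker_values) / n
    mean_o = sum(outcome) / n
    return abs(sum((m - mean_m) * (o - mean_o)
                   for m, o in zip(marker_values, outcome)))

def bootstrap_eliminate(data, outcome, n_boot=200, keep=2, seed=0):
    """Recursively drop the marker with the lowest total score over resamples."""
    rng = random.Random(seed)
    markers = list(data)
    n = len(outcome)
    while len(markers) > keep:
        totals = {m: 0.0 for m in markers}
        for _ in range(n_boot):
            idx = [rng.randrange(n) for _ in range(n)]  # bootstrap resample
            for m in markers:
                totals[m] += score([data[m][i] for i in idx],
                                   [outcome[i] for i in idx])
        markers.remove(min(markers, key=totals.get))
    return markers

# toy data: "p27" tracks the outcome perfectly, the other markers do not
outcome = [1, 1, 1, 1, 0, 0, 0, 0]
data = {"p27":   [1, 1, 1, 1, 0, 0, 0, 0],
        "PTEN":  [1, 1, 0, 0, 1, 1, 0, 0],
        "noise": [1, 0, 1, 0, 1, 0, 1, 0]}
selected = bootstrap_eliminate(data, outcome, keep=1)
```

Bootstrapping the selection makes the retained marker set less sensitive to any single patient's data, which is the point of the scheme.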

Relevance:

90.00%

Publisher:

Abstract:

In this article, we will link neuroimaging, data analysis, and intervention methods in an important psychiatric condition: auditory verbal hallucinations (AVH). The clinical and phenomenological background as well as neurophysiological findings will be covered and discussed with respect to noninvasive brain stimulation. Additionally, methods of noninvasive brain stimulation will be presented as ways to intervene with AVH. Finally, preliminary conclusions and possible future perspectives will be proposed.

Relevance:

90.00%

Publisher:

Abstract:

BACKGROUND: High intercoder reliability (ICR) is required in qualitative content analysis for assuring quality when more than one coder is involved in data analysis. The literature is short of standardized ICR procedures for qualitative content analysis. OBJECTIVE: To illustrate how ICR assessment can be used to improve codings in qualitative content analysis. METHODS: Key steps of the procedure are presented, drawing on data from a qualitative study on patients' perspectives on low back pain. RESULTS: First, a coding scheme was developed using a comprehensive inductive and deductive approach. Second, 10 transcripts were coded independently by two researchers, and ICR was calculated. The resulting kappa value of .67 can be regarded as satisfactory to solid. Moreover, varying agreement rates helped to identify problems in the coding scheme. Low agreement rates, for instance, indicated that the respective codes were defined too broadly and needed clarification. In a third step, the results of the analysis were used to improve the coding scheme, leading to consistent and high-quality results. DISCUSSION: The quantitative approach of ICR assessment is a viable instrument for quality assurance in qualitative content analysis. Kappa values and close inspection of agreement rates help to estimate and increase the quality of codings. This approach facilitates good practice in coding and enhances the credibility of the analysis, especially when large samples are interviewed, different coders are involved, and quantitative results are presented.
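Cohen's kappa, the agreement statistic reported above, corrects the raw agreement rate for the agreement expected by chance. A minimal sketch for two coders, where the codes and transcript segments are invented for illustration:

```python
def cohens_kappa(coder_a, coder_b):
    """Agreement between two coders, corrected for chance agreement."""
    n = len(coder_a)
    labels = set(coder_a) | set(coder_b)
    # observed proportion of segments both coders coded identically
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # agreement expected if each coder assigned codes at their own base rates
    expected = sum((coder_a.count(c) / n) * (coder_b.count(c) / n)
                   for c in labels)
    return (observed - expected) / (1 - expected)

# invented codes assigned to the same six transcript segments
coder_a = ["pain", "coping", "pain", "work", "coping", "pain"]
coder_b = ["pain", "coping", "work", "work", "coping", "pain"]
kappa = cohens_kappa(coder_a, coder_b)
```

Inspecting per-code agreement alongside the overall kappa, as the abstract describes, is what points to codes that are defined too broadly.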

Relevance:

90.00%

Publisher:

Abstract:

Exposimeters are increasingly applied in bioelectromagnetic research to determine personal radiofrequency electromagnetic field (RF-EMF) exposure. The main advantages of exposimeter measurements are their convenient handling for study participants and the large amount of personal exposure data, which can be obtained for several RF-EMF sources. However, the large proportion of measurements below the detection limit is a challenge for data analysis. With the robust ROS (regression on order statistics) method, summary statistics can be calculated by fitting an assumed distribution to the observed data. We used a preliminary sample of 109 weekly exposimeter measurements from the QUALIFEX study to compare summary statistics computed by robust ROS with a naïve approach, where values below the detection limit were replaced by the value of the detection limit. For the total RF-EMF exposure, differences between the naïve approach and the robust ROS were moderate for the 90th percentile and the arithmetic mean. However, exposure contributions from minor RF-EMF sources were considerably overestimated with the naïve approach. This results in an underestimation of the exposure range in the population, which may bias the evaluation of potential exposure-response associations. We conclude from our analyses that summary statistics of exposimeter data calculated by robust ROS are more reliable and more informative than estimates based on a naïve approach. Nevertheless, estimates of source-specific medians or even lower percentiles depend on the assumed data distribution and should be considered with caution.
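The idea behind regression on order statistics can be sketched as follows: detected values are assumed lognormal, a line is fitted to their log-values against normal scores, and nondetects are imputed from that line at their own plotting positions. This bare-bones version handles a single detection limit, unlike the full robust ROS method used in the study; the measurement values and the limit are invented:

```python
import math
from statistics import NormalDist, mean

def simple_ros(values, limit):
    """Impute values below a single detection limit, assuming lognormal data."""
    detected = sorted(v for v in values if v >= limit)
    n = len(values)
    n_cens = n - len(detected)
    # plotting positions and normal scores for the detected (upper) part
    pp = [(n_cens + i + 0.5) / n for i in range(len(detected))]
    z = [NormalDist().inv_cdf(p) for p in pp]
    y = [math.log(v) for v in detected]
    # least-squares fit of log-values against normal scores
    zbar, ybar = mean(z), mean(y)
    slope = (sum((a - zbar) * (b - ybar) for a, b in zip(z, y))
             / sum((a - zbar) ** 2 for a in z))
    intercept = ybar - slope * zbar
    # read each nondetect off the fitted line at its own plotting position
    imputed = [math.exp(intercept + slope * NormalDist().inv_cdf((i + 0.5) / n))
               for i in range(n_cens)]
    return imputed + detected

# invented weekly exposure values; 0.05 and 0.1 sit below the limit of 0.2
completed = simple_ros([0.05, 0.1, 0.2, 0.5, 1.0, 2.0], limit=0.2)
```

Summary statistics computed from the completed sample avoid the upward bias of the naïve substitution-at-the-limit approach.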

Relevance:

90.00%

Publisher:

Abstract:

Cluster randomized trials (CRTs) use clusters as the unit of randomization; a cluster is usually defined as a collection of individuals sharing some common characteristic. Common examples of clusters include entire dental practices, hospitals, schools, school classes, villages, and towns. Additionally, several measurements (repeated measurements) taken on the same individual at different time points are also considered to be clusters. In dentistry, CRTs are applicable as patients may be treated as clusters containing several individual teeth. CRTs require certain methodological procedures during sample size calculation, randomization, data analysis, and reporting, which are often ignored in dental research publications. In general, due to the similarity of the observations within clusters, each individual within a cluster provides less information than an individual in a non-clustered trial. Therefore, clustered designs require larger sample sizes than non-clustered randomized designs, and special statistical analyses that account for the fact that observations within clusters are correlated. The purpose of this article is to highlight, with relevant examples, the important methodological characteristics of cluster randomized designs as they may be applied in orthodontics, and to explain the problems that may arise if clustered observations are erroneously treated and analysed as independent (non-clustered).
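The sample-size inflation for clustered designs is commonly summarized by the design effect, 1 + (m − 1) × ICC, where m is the average cluster size (e.g. teeth per patient) and ICC is the intra-cluster correlation. A small sketch with invented figures:

```python
def design_effect(cluster_size, icc):
    """Variance inflation factor for a cluster randomized design."""
    return 1 + (cluster_size - 1) * icc

def clustered_sample_size(n_individual, cluster_size, icc):
    """Observations needed once clustering is accounted for (round up in use)."""
    return n_individual * design_effect(cluster_size, icc)

# e.g. 200 observations needed in a non-clustered trial, clusters of
# 10 teeth per patient, and an intra-cluster correlation of 0.05
n_needed = clustered_sample_size(200, 10, 0.05)  # about 290
```

Even a modest ICC inflates the required sample noticeably once clusters are large, which is why ignoring clustering understates the needed sample size.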

Relevance:

90.00%

Publisher:

Abstract:

This paper presents an overview of the Mobile Data Challenge (MDC), a large-scale research initiative aimed at generating innovations around smartphone-based research, as well as community-based evaluation of mobile data analysis methodologies. First, we review the Lausanne Data Collection Campaign (LDCC), an initiative to collect a unique longitudinal smartphone dataset for the MDC. Then, we introduce the Open and Dedicated Tracks of the MDC, describe the specific datasets used in each of them, discuss the key design and implementation aspects introduced in order to generate privacy-preserving and scientifically relevant mobile data resources for wider use by the research community, and summarize the main research trends found among the 100+ challenge submissions. We conclude by discussing the main lessons learned from the participation of several hundred researchers worldwide in the MDC Tracks.

Relevance:

90.00%

Publisher:

Abstract:

PRINCIPLES Over a million people worldwide die each year from road traffic injuries and more than 10 million sustain permanent disabilities. Many of these victims are pedestrians. The present retrospective study analyzes the severity and mortality of injuries suffered by adult pedestrians, depending on whether they used a zebra crosswalk. METHODS Our retrospective data analysis covered adult patients admitted to our emergency department (ED) between 1 January 2000 and 31 December 2012 after being hit by a vehicle while crossing the road as a pedestrian. Patients were identified by using a string search term. Medical, police and ambulance records were reviewed for data extraction. RESULTS A total of 347 patients were eligible for study inclusion. Two hundred and three (58.5%) patients were on a zebra crosswalk and 144 (41.5%) were not. The mean ISS (Injury Severity Score) was 12.1 (SD 14.7, range 1-75). The vehicles were faster in non-zebra crosswalk accidents (47.7 km/h versus 41.4 km/h, p<0.027). The mean ISS was higher in patients with non-zebra crosswalk accidents: 14.4 (SD 16.5, range 1-75) versus 10.5 (SD 13.1, range 1-75) (p<0.019). Zebra crosswalk accidents were associated with less risk of severe injury (OR 0.61, 95% CI 0.38-0.98, p<0.042). Accidents involving a truck were associated with increased risk of severe injury (OR 3.53, 95% CI 1.21-10.26, p<0.02). CONCLUSION Accidents on zebra crosswalks are more common than those not on zebra crosswalks. The injury severity of non-zebra crosswalk accidents is significantly higher than that of zebra crosswalk accidents. Accidents involving large vehicles are associated with increased risk of severe injury. Further prospective studies are needed, with detailed assessment of motor vehicle types and speed.
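Odds ratios of the kind reported above come from a 2x2 table of exposure against outcome. A minimal sketch of the calculation with a standard log-scale Wald 95% CI; the counts below are invented and are not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and Wald 95% CI from a 2x2 table:
    a, b = severe / non-severe injuries in the exposed group,
    c, d = severe / non-severe injuries in the unexposed group."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# invented counts for illustration
or_, lo, hi = odds_ratio_ci(20, 40, 30, 30)
```

An OR below 1 with a CI excluding 1, as for the zebra-crosswalk result above, indicates a protective association.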

Relevance:

90.00%

Publisher:

Abstract:

Noble gas analysis in early solar system materials, which can provide valuable information about early solar system processes and timescales, is very challenging because of extremely low noble gas concentrations (ppt). We therefore developed a new, compact (33 cm length, 7.2 cm diameter, 1.3 L internal volume) time-of-flight (TOF) noble gas mass spectrometer for high sensitivity, which we call the Edel Gas Time-of-Flight (EGT) mass spectrometer. The instrument uses electron impact ionization coupled to an ion trap, which allows us to ionize and measure all noble gas isotopes. A reflectron set-up improves the mass resolution and also enables some extra focusing. Detection is via MCPs, and the signals are processed either via ADC or TDC systems. The objective of this work is to understand the newly developed TOF mass spectrometer for noble gas analysis in presolar grains of meteorites. Chapter 1 briefly introduces the basic idea and importance of the instrument. The physics relevant to the time-of-flight mass spectrometry technique is discussed in Chapter 2, and Chapter 3 presents the oxidation technique for presolar nanodiamonds using copper oxide. Chapter 4 presents the details of the EGT data analysis software. Chapters 5 and 6 explain the details of the EGT design and operation. Finally, the performance results are presented and discussed in Chapter 7; the whole work is summarized in Chapter 8, together with an outlook on future work.
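For background, the time-of-flight principle the instrument relies on is simple: an ion accelerated through a potential U acquires kinetic energy qU, so its flight time over a drift length L scales with sqrt(m/q). A small illustration; the drift length and voltage below are invented and not the EGT's actual geometry:

```python
import math

AMU = 1.66053906660e-27     # atomic mass unit in kg
E_CHARGE = 1.602176634e-19  # elementary charge in C

def flight_time(mass_amu, charge_e, drift_length_m, accel_voltage_v):
    """Drift time of an ion after acceleration: t = L / sqrt(2*q*U/m)."""
    speed = math.sqrt(2 * charge_e * E_CHARGE * accel_voltage_v
                      / (mass_amu * AMU))
    return drift_length_m / speed

# heavier isotopes arrive later: 36Ar versus 40Ar over a 0.33 m drift path
t36 = flight_time(36, 1, 0.33, 1000)
t40 = flight_time(40, 1, 0.33, 1000)
```

Because flight time grows with sqrt(m/q), arrival times map directly to mass-to-charge ratios, which is what the ADC/TDC systems record.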

Relevance:

90.00%

Publisher:

Abstract:

The ability to determine what activity of daily living a person performs is of interest in many application domains. It is possible to determine the physical and cognitive capabilities of the elderly by inferring what activities they perform in their houses. Our primary aim was to establish a proof of concept that a wireless sensor system can monitor and record physical activity and that these data can be modeled to predict activities of daily living. The secondary aim was to determine the optimal placement of the sensor boxes for detecting activities in a room. A wireless sensor system was set up in a laboratory kitchen. The ten healthy participants were asked to make tea following a defined sequence of tasks. Data were collected from the eight wireless sensor boxes placed in specific places in the test kitchen and analyzed to detect the sequences of tasks performed by the participants. These task sequences were used to train and test a Markov model. Data analysis focused on the reliability of the system and the integrity of the collected data. The sequences of tasks were successfully recognized for all subjects, and the averaged patterns of task sequences across subjects were highly correlated. Analysis of the collected data indicates that sensors placed in different locations are capable of recognizing activities, with the movement detection sensor contributing the most to the detection of tasks. The central top of the room, with no obstruction of view, was considered to be the best location to record data for activity detection. Wireless sensor systems show much promise as easily deployable tools to monitor and recognize activities of daily living.
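The Markov-model idea above can be sketched as learning transition probabilities between tasks from observed sequences and then scoring a new sequence by the product of its transition probabilities. The task names and training sequences below are invented, not the study's sensor data:

```python
from collections import defaultdict

def train_transitions(sequences):
    """Estimate first-order Markov transition probabilities from sequences."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for prev, nxt in zip(seq, seq[1:]):
            counts[prev][nxt] += 1
    model = {}
    for state, nxts in counts.items():
        total = sum(nxts.values())
        model[state] = {t: c / total for t, c in nxts.items()}
    return model

def sequence_probability(model, seq):
    """Probability of a task sequence under the learned transition model."""
    p = 1.0
    for prev, nxt in zip(seq, seq[1:]):
        p *= model.get(prev, {}).get(nxt, 0.0)
    return p

# invented tea-making sequences observed during training
training = [["kettle", "cup", "teabag", "pour"],
            ["kettle", "cup", "teabag", "pour"],
            ["kettle", "teabag", "cup", "pour"]]
model = train_transitions(training)
p_common = sequence_probability(model, ["kettle", "cup", "teabag", "pour"])
p_rare = sequence_probability(model, ["kettle", "teabag", "cup", "pour"])
```

A candidate activity is then recognized by picking the task sequence (or activity model) that assigns the observed sensor events the highest probability.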

Relevance:

90.00%

Publisher:

Abstract:

BACKGROUND Endodontic treatment involves removal of the dental pulp and its replacement by a root canal filling. Restoration of root-filled teeth can be challenging due to structural differences between vital and non-vital root-filled teeth. Direct restoration involves placement of a restorative material, e.g. amalgam or composite, directly into the tooth. Indirect restorations consist of cast metal or ceramic (porcelain) crowns. The choice of restoration depends on the amount of remaining tooth, and may influence durability and cost. The decision to use a post and core in addition to the crown is clinician driven. The comparative clinical performance of crowns or conventional fillings used to restore root-filled teeth is unknown. This review updates the original, which was published in 2012. OBJECTIVES To assess the effects of restoration of endodontically treated teeth (with or without post and core) by crowns versus conventional filling materials. SEARCH METHODS We searched the following databases: the Cochrane Oral Health Group's Trials Register, CENTRAL, MEDLINE via OVID, EMBASE via OVID, CINAHL via EBSCO, and LILACS via BIREME. We also searched the reference lists of articles and ongoing trials registries. There were no restrictions regarding language or date of publication. The search is up-to-date as of 26 March 2015. SELECTION CRITERIA Randomised controlled trials (RCTs) or quasi-randomised controlled trials in participants with permanent teeth that have undergone endodontic treatment. Single full coverage crowns compared with any type of filling materials for direct restoration or indirect partial restorations (e.g. inlays and onlays). Comparisons considered the type of post and core used (cast or prefabricated post), if any. DATA COLLECTION AND ANALYSIS Two review authors independently extracted data from the included trial and assessed its risk of bias.
We carried out data analysis using the 'treatment as allocated' patient population, expressing estimates of intervention effect for dichotomous data as risk ratios, with 95% confidence intervals (CI). MAIN RESULTS We included one trial, which was judged to be at high risk of performance, detection and attrition bias. The 117 participants, each with a root-filled premolar tooth restored with a carbon fibre post, were randomised to either a full coverage metal-ceramic crown or a direct adhesive composite restoration. None experienced a catastrophic failure (i.e. one in which the restoration cannot be repaired), although only 104 teeth were included in the final, three-year assessment. There was no clear difference between the crown and composite-only groups for non-catastrophic failures of the restoration (1/54 versus 3/53; RR 0.33; 95% CI 0.04 to 3.05) or failures of the post (2/54 versus 1/53; RR 1.96; 95% CI 0.18 to 21.01) at three years. The quality of the evidence for these outcomes is very low. There was no evidence available for any of our secondary outcomes: patient satisfaction and quality of life, incidence or recurrence of caries, periodontal health status, and costs. AUTHORS' CONCLUSIONS There is insufficient evidence to assess the effects of crowns compared to conventional fillings for the restoration of root-filled teeth. Until more evidence becomes available, clinicians should continue to base decisions about how to restore root-filled teeth on their own clinical experience, whilst taking into consideration the individual circumstances and preferences of their patients.
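The risk ratios quoted above follow directly from the event counts in each arm, e.g. 1/54 failures versus 3/53. A minimal sketch of the calculation with a standard log-scale Wald 95% CI (the CI formula is the usual log-RR approximation, assumed here rather than taken from the review's methods):

```python
import math

def risk_ratio_ci(events_a, n_a, events_b, n_b, z=1.96):
    """Risk ratio of arm A versus arm B with a log-scale Wald 95% CI."""
    rr = (events_a / n_a) / (events_b / n_b)
    # standard error of log(RR)
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# non-catastrophic restoration failures: 1/54 with a crown versus
# 3/53 with composite only, as reported above
rr, lo, hi = risk_ratio_ci(1, 54, 3, 53)
```

With so few events, the interval spans 1 by a wide margin, which is why the review finds no clear difference between the groups.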

Relevance:

90.00%

Publisher:

Abstract:

On the orbiter of the Rosetta spacecraft, the Cometary Secondary Ion Mass Analyser (COSIMA) will provide new in situ insights into the chemical composition of cometary grains all along 67P/Churyumov–Gerasimenko's (67P/CG) journey, nominally until the end of December 2015. The aim of this paper is to present the pre-calibration that has already been performed, as well as the different methods that have been developed to facilitate the interpretation of the COSIMA mass spectra, especially of their organic content. The first step was to establish a mass spectra library, in positive and negative ion mode, of targeted molecules and to determine the specific features of each compound and chemical family analyzed. As the exact nature of the refractory cometary organic matter is currently unknown, this library is obviously not exhaustive. Therefore, this library has also been the starting point for the search for indicators that highlight the presence of compounds containing a specific atom or structure. These indicators correspond to the intensity ratios of specific peaks in the mass spectrum. They have allowed us to identify samples containing nitrogen atoms, aliphatic chains, or polyaromatic hydrocarbons. From these indicators, a preliminary calibration line, from which the N/C ratio could be derived, has also been established. The search for specific mass differences could also be helpful to identify peaks related to quasi-molecular ions in an unknown mass spectrum. The Bayesian Positive Source Separation (BPSS) technique will also be very helpful for data analysis. This work is the starting point for the analysis of the cometary refractory organic matter. Nevertheless, calibration work will continue in order to reach the best possible interpretation of the COSIMA observations.
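An indicator of the kind described, i.e. an intensity ratio of specific peaks, can be sketched generically as below; the spectrum and peak masses are invented placeholders, not COSIMA calibration values:

```python
def indicator(spectrum, peak_a, peak_b):
    """Intensity ratio of two peaks; 0.0 if the reference peak is absent."""
    if spectrum.get(peak_b, 0) == 0:
        return 0.0
    return spectrum.get(peak_a, 0) / spectrum[peak_b]

# invented example spectrum: nominal mass -> intensity (not COSIMA data)
spectrum = {26: 40.0, 27: 120.0, 43: 310.0}
ratio = indicator(spectrum, 26, 27)
```

Thresholds on such ratios, calibrated against the reference library, are what flag samples containing, say, nitrogen-bearing or aromatic compounds.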

Relevance:

90.00%

Publisher:

Abstract:

Sedimentary sequences in ancient or long-lived lakes can reach several thousands of meters in thickness and often provide an unrivalled perspective of the lake's regional climatic, environmental, and biological history. Over the last few years, deep-drilling projects in ancient lakes have become increasingly multi- and interdisciplinary, as, among others, seismological, sedimentological, biogeochemical, climatic, environmental, paleontological, and evolutionary information can be obtained from sediment cores. However, these multi- and interdisciplinary projects pose several challenges. The scientists involved typically approach problems from different scientific perspectives and backgrounds, and setting up the program requires clear communication and the alignment of interests. One of the most challenging tasks, besides the actual drilling operation, is to link diverse datasets with varying resolution, data quality, and age uncertainties to answer interdisciplinary questions synthetically and coherently. These problems are especially relevant when secondary data, i.e., datasets obtained independently of the drilling operation, are incorporated in analyses. Nonetheless, the inclusion of secondary information, such as isotopic data from fossils found in outcrops or genetic data from extant species, may help to achieve synthetic answers. Recent technological and methodological advances in paleolimnology are likely to increase the possibilities of integrating secondary information. Some of the new approaches have started to revolutionize scientific drilling in ancient lakes, but at the same time, they also add a new layer of complexity to the generation and analysis of sediment-core data. The enhanced opportunities presented by new scientific approaches to study the paleolimnological history of these lakes, therefore, come at the expense of higher logistic, communication, and analytical efforts.
Here we review types of data that can be obtained in ancient lake drilling projects and the analytical approaches that can be applied to empirically and statistically link diverse datasets to create an integrative perspective on geological and biological data. In doing so, we highlight strengths and potential weaknesses of new methods and analyses, and provide recommendations for future interdisciplinary deep-drilling projects.