901 results for Techniques of data analysis


Relevance: 100.00%

Abstract:

Objectives: To examine the extent of multiplicity of data in trial reports and to assess the impact of multiplicity on meta-analysis results. Design: Empirical study on a cohort of Cochrane systematic reviews. Data sources: All Cochrane systematic reviews published from issue 3 in 2006 to issue 2 in 2007 that presented a result as a standardised mean difference (SMD). We retrieved the trial reports contributing to the first SMD result in each review and downloaded the review protocols. We used these SMDs to identify a specific index outcome for each meta-analysis from its protocol. Review methods: Reviews were eligible if the SMD results were based on two to ten randomised trials and if the protocols described the outcome. We excluded reviews that presented only results of subgroup analyses. Based on the review protocols and index outcomes, two observers independently extracted the data necessary to calculate SMDs from the original trial reports for any intervention group, time point, or outcome measure compatible with the protocol. From the extracted data, we used Monte Carlo simulations to calculate all possible SMDs for every meta-analysis. Results: We identified 19 eligible meta-analyses (including 83 trials). Published review protocols often lacked information about which data to choose. Twenty-four (29%) trials reported data for multiple intervention groups, 30 (36%) reported data for multiple time points, and 29 (35%) reported the index outcome measured on multiple scales. In 18 meta-analyses, we found multiplicity of data in at least one trial report; the median difference between the smallest and largest SMD results within a meta-analysis was 0.40 standard deviation units (range 0.04 to 0.91). Conclusions: Multiplicity of data can affect the findings of systematic reviews and meta-analyses. To reduce the risk of bias, reviews and meta-analyses should comply with prespecified protocols that clearly identify time points, intervention groups, and scales of interest.
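
To make the multiplicity problem concrete, the minimal Python sketch below computes a standardised mean difference (Hedges' g) and enumerates every SMD a single trial could contribute when several intervention groups and time points are compatible with the protocol; the function and the toy numbers are purely illustrative and are not data from the review.

```python
# Minimal sketch: compute a standardised mean difference (Hedges' g) and
# enumerate every SMD that could arise when a trial offers several compatible
# data choices (intervention groups, time points). All numbers are invented.
from itertools import product
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardised mean difference with small-sample correction."""
    s_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled
    correction = 1 - 3 / (4 * (n1 + n2) - 9)   # Hedges' correction
    return d * correction

# One trial, two compatible intervention-group summaries x two time points:
treatment_options = [
    (12.1, 4.0, 30),   # (mean, sd, n) group A at 8 weeks
    (11.4, 3.8, 30),   # group A at 12 weeks
]
control_options = [
    (14.0, 4.2, 28),   # control at 8 weeks
    (13.2, 4.1, 28),   # control at 12 weeks
]

all_smds = [hedges_g(*t, *c) for t, c in product(treatment_options, control_options)]
print(f"possible SMDs: {sorted(round(x, 2) for x in all_smds)}")
print(f"spread between smallest and largest: {max(all_smds) - min(all_smds):.2f} SD units")
```

Repeating such an exhaustive enumeration over every trial in a meta-analysis, as the Monte Carlo simulations above do at scale, shows how far a pooled result can move depending on which compatible data are chosen.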

Relevance: 100.00%

Abstract:

In orthodontics, multiple site observations within patients or multiple observations collected at consecutive time points are often encountered. Clustered designs require larger sample sizes than individually randomized trials and special statistical analyses that account for the fact that observations within clusters are correlated. The purpose of this study was to assess to what degree clustering effects are considered during design and data analysis in the three major orthodontic journals. The contents of the most recent 24 issues of the American Journal of Orthodontics and Dentofacial Orthopedics (AJODO), Angle Orthodontist (AO), and European Journal of Orthodontics (EJO), working backwards from December 2010, were hand searched. Articles with clustering effects were identified, and it was recorded whether the authors accounted for them. Additionally, information was collected on involvement of a statistician, single- or multicenter status, number of authors in the publication, geographical area, and statistical significance. Of the 1584 articles, after exclusions, 1062 were assessed for clustering effects, of which 250 (23.5 per cent) were considered to have clustering effects in the design (kappa = 0.92, 95 per cent CI: 0.67 to 0.99 for inter-rater agreement). Of the studies with clustering effects, only 63 (25.2 per cent) indicated accounting for clustering effects. There was evidence that studies published in the AO had higher odds of accounting for clustering effects (AO versus AJODO: odds ratio (OR) = 2.17, 95 per cent CI: 1.06 to 4.43, P = 0.03; EJO versus AJODO: OR = 1.90, 95 per cent CI: 0.84 to 4.24, non-significant; EJO versus AO: OR = 1.15, 95 per cent CI: 0.57 to 2.33, non-significant). The results of this study indicate that only about a quarter of the studies with clustering effects account for this in the statistical data analysis.
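
For readers unfamiliar with the reported comparisons, the sketch below shows how an odds ratio with a Wald 95% confidence interval of the kind quoted above (e.g. AO versus AJODO) is typically computed from a 2 x 2 table; the counts are invented and are not the study's data.

```python
# Illustrative only: odds ratio and Wald 95% CI from a 2x2 table of
# "accounted for clustering" versus "did not", by journal. Counts are made up.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a/b: events/non-events in group 1; c/d: events/non-events in group 2."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# hypothetical counts: journal 1 accounted 30/70, journal 2 accounted 15/85
or_, lo, hi = odds_ratio_ci(30, 70, 15, 85)
print(f"OR = {or_:.2f}, 95% CI: {lo:.2f} to {hi:.2f}")
```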

Relevance: 100.00%

Abstract:

Erosion of dentine causes mineral dissolution, while the organic compounds remain at the surface; a determination of tissue loss is therefore complicated. Established quantitative methods for the evaluation of enamel have also been used for dentine, but the suitability of these techniques for this purpose has not been systematically determined. Therefore, this study aimed to compare longitudinal microradiography (LMR), contacting (cPM) and non-contacting profilometry (ncPM), and analysis of dissolved calcium (Ca analysis) in the erosion solution. Results are discussed in the light of the histology of dentine erosion. Erosion was performed with 0.05 M citric acid (pH 2.5) for 30, 60, 90 or 120 min, and erosive loss was determined by each method. LMR, cPM and ncPM were performed before and after collagenase digestion of the demineralised organic surface layer, with an emphasis on moisture control. Scanning electron microscopy was performed on randomly selected specimens. All measurements were converted into micrometres. Profilometry was not suitable to adequately quantify mineral loss prior to collagenase digestion: after 120 min of erosion, values of 5.4 ± 1.9 µm (ncPM) and 27.8 ± 4.6 µm (cPM) were determined, whereas Ca analysis revealed a mineral loss of 55.4 ± 11.5 µm. The values for profilometry after matrix digestion were 43.0 ± 5.5 µm (ncPM) and 46.9 ± 6.2 µm (cPM). Relative and proportional biases were detected for all method comparisons. The mineral loss values were below the detection limit of LMR. The study revealed gross differences between methods, particularly when demineralised organic surface tissue was present. These results indicate that the choice of method is critical and depends on the parameter under study.
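
As a hedged illustration of the bias analysis mentioned above, the following sketch runs a Bland-Altman style comparison of two methods and checks for proportional bias by regressing the paired differences on the paired means; the numbers are simulated and only loosely inspired by the reported magnitudes.

```python
# Sketch of a paired method comparison: Bland-Altman differences against
# means, with proportional bias checked by regressing differences on means.
# Values are invented; the method labels are placeholders.
import numpy as np

rng = np.random.default_rng(0)
loss_ncpm = rng.normal(43.0, 5.5, 20)             # e.g. ncPM after digestion (µm)
loss_ca = loss_ncpm * 1.2 + rng.normal(0, 3, 20)  # e.g. Ca analysis (µm)

diff = loss_ca - loss_ncpm
mean = (loss_ca + loss_ncpm) / 2

bias = diff.mean()                                    # relative (constant) bias
loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))
slope, intercept = np.polyfit(mean, diff, 1)          # slope != 0 -> proportional bias

print(f"bias={bias:.1f} µm, LoA=({loa[0]:.1f}, {loa[1]:.1f}) µm, slope={slope:.2f}")
```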

Relevance: 100.00%

Abstract:

Conjoint analysis (CA) is one of the most important methods for preference elicitation. In this paper we investigate the intellectual structure of the conjoint analytical research community. Analyses based on single papers provide a method-based overview of the streams of conjoint research. By applying novel bibliometric techniques to this field, we complement the findings of existing reviews. We use co-citation and factor analysis of the most cited articles in the SSCI to identify the most important articles and research streams. Seven research streams are revealed and visualized by means of multidimensional scaling. Tables and graphics show the disciplinary affiliations of contributors to CA, the internal structure of the classes, and the links between them.
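
A minimal sketch of the bibliometric workflow, under simplifying assumptions: build a co-citation matrix from reference lists, turn it into a dissimilarity matrix, and project it with multidimensional scaling. The article identifiers are placeholders and the similarity-to-distance transform is deliberately crude.

```python
# Rough sketch: co-citation counts from reference lists, then a 2-D MDS map.
from itertools import combinations
import numpy as np
from sklearn.manifold import MDS

# each citing paper contributes the set of earlier articles it cites (placeholders)
reference_lists = [
    {"articleA", "articleB", "articleC"},
    {"articleA", "articleC", "articleD"},
    {"articleB", "articleD", "articleE"},
]

articles = sorted(set().union(*reference_lists))
idx = {a: i for i, a in enumerate(articles)}
cocite = np.zeros((len(articles), len(articles)))
for refs in reference_lists:
    for a, b in combinations(sorted(refs), 2):
        cocite[idx[a], idx[b]] += 1
        cocite[idx[b], idx[a]] += 1

# crude similarity-to-distance transform, then multidimensional scaling
dist = cocite.max() - cocite
np.fill_diagonal(dist, 0)
coords = MDS(n_components=2, dissimilarity="precomputed", random_state=0).fit_transform(dist)
for name, (x, y) in zip(articles, coords):
    print(f"{name}: ({x:.2f}, {y:.2f})")
```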

Relevance: 100.00%

Abstract:

ONTOLOGIES AND METHODS FOR INTEROPERABILITY OF ENGINEERING ANALYSIS MODELS (EAMS) IN AN E-DESIGN ENVIRONMENT. September 2007. Neelima Kanuri, B.S., Birla Institute of Technology and Sciences, Pilani, India; M.S., University of Massachusetts Amherst. Directed by: Professor Ian Grosse.

Interoperability is the ability of two or more systems to exchange and reuse information efficiently. This thesis presents new techniques for interoperating engineering tools using ontologies as the basis for representing, visualizing, reasoning about, and securely exchanging abstract engineering knowledge between software systems. The specific engineering domain that is the primary focus of this report is the modeling knowledge associated with the development of engineering analysis models (EAMs). This abstract modeling knowledge has been used to support the integration of analysis and optimization tools in iSIGHT-FD, a commercial engineering environment. ANSYS, a commercial FEA tool, has been wrapped as an analysis service available inside iSIGHT-FD. An engineering analysis modeling (EAM) ontology has been developed and instantiated to form a knowledge base for representing analysis modeling knowledge; the instances of the knowledge base are the analysis models of real-world applications. To illustrate how abstract modeling knowledge can be exploited for useful purposes, a cantilever I-beam design optimization problem has been used as a proof-of-concept test bed. Two distinct finite element models of the I-beam are available to analyze a given beam design: a beam-element finite element model with potentially lower accuracy but significantly reduced computational cost, and a high-fidelity, high-cost shell-element finite element model. The goal is to obtain an optimized I-beam design at minimum computational expense. An intelligent knowledge-based (KB) tool was developed and implemented in FiPER. This tool reasons about the modeling knowledge to intelligently shift between the beam and shell element models during an optimization process, selecting the best analysis model for a given optimization design state. In addition to improved interoperability and design optimization, methods are developed and presented that demonstrate the ability to operate on ontological knowledge bases to perform important engineering tasks. One such method is automatic technical report generation, which converts the modeling knowledge associated with an analysis model into a flat technical report. The second is a secure knowledge sharing method, which allocates permissions to portions of knowledge to control knowledge access and sharing. Acting together, the two methods enable recipient-specific, fine-grained, controlled knowledge viewing and sharing in an engineering workflow integration environment such as iSIGHT-FD. They play an efficient role in reducing the large-scale inefficiencies in current product design and development cycles that stem from poor knowledge sharing and reuse between people and software engineering tools. This work is a significant advance in both the understanding and the application of knowledge integration in a distributed engineering design framework.
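
As a rough, purely hypothetical illustration of the model-switching idea described above (not the FiPER/iSIGHT-FD implementation), the sketch below selects between a cheap beam-element model and an expensive shell-element model based on a simple rule over the design state; all names, thresholds, and cost numbers are invented.

```python
# Hypothetical sketch of rule-based selection between a cheap beam model and
# a costly shell model during optimization. Names, thresholds, and numbers
# are invented; this is not the FiPER/iSIGHT-FD tool.
from dataclasses import dataclass

@dataclass
class AnalysisModel:
    name: str
    relative_cost: float       # arbitrary cost units per evaluation
    valid_when_slender: bool   # beam theory assumed adequate for slender beams

BEAM = AnalysisModel("beam_element", relative_cost=1.0, valid_when_slender=True)
SHELL = AnalysisModel("shell_element", relative_cost=25.0, valid_when_slender=False)

def select_model(length_mm: float, depth_mm: float, near_optimum: bool) -> AnalysisModel:
    """Prefer the cheap beam model while it is trusted; switch near the optimum."""
    slender = length_mm / depth_mm >= 10.0      # crude slenderness heuristic
    if slender and not near_optimum:
        return BEAM
    return SHELL

print(select_model(length_mm=1000, depth_mm=80, near_optimum=False).name)  # beam_element
print(select_model(length_mm=1000, depth_mm=80, near_optimum=True).name)   # shell_element
```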

Relevance: 100.00%

Abstract:

Rock magnetic, biochemical and inorganic records of the sediment cores PG1351 and Lz1024 from Lake El'gygytgyn, Chukotka Peninsula, Far East Russian Arctic, were subjected to a hierarchical agglomerative cluster analysis in order to refine and extend the pattern of climate modes defined by Melles et al. (2007). Cluster analysis of the data from both cores yielded similar results, differentiating clearly between the four climate modes warm, peak warm, cold and dry, and cold and moist. In addition, two transitional phases were identified, representing the early stages of a cold phase and slightly colder conditions during a warm phase. The statistical approach can thus be used to resolve in greater detail gradual changes in the sedimentary units, which serve as an indicator of the oxygen available in the hypolimnion. Based on the cluster analyses of core Lz1024, the published succession of climate modes in core PG1351, covering the last 250 ka, was modified and extended back to 350 ka. Comparison of the extended Lake El'gygytgyn parameter records of magnetic susceptibility (κLF), total organic carbon content (TOC) and the chemical index of alteration (CIA; Minyuk et al., 2007) with the marine oxygen isotope (δ18O) stack LR04 (Lisiecki and Raymo, 2005) and the summer insolation at 67.5° N revealed that all stages back to marine isotope stage (MIS) 10, and most of the substages, are clearly reflected in the pattern derived from the cluster analysis.
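
A minimal sketch of the clustering step, assuming standardised multi-proxy measurements per sample; the proxy names and values are placeholders rather than the Lake El'gygytgyn records, and Ward's criterion is only one of several linkage choices.

```python
# Minimal sketch of hierarchical agglomerative clustering on multi-proxy
# sediment data. Proxy values are placeholders, not the published records.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

# rows = sediment samples, columns = proxies (e.g. susceptibility, TOC, CIA)
proxies = np.array([
    [120.0, 0.4, 58.0],
    [115.0, 0.5, 57.0],
    [ 30.0, 2.1, 72.0],
    [ 28.0, 2.4, 70.0],
    [ 65.0, 1.0, 64.0],
])

z = zscore(proxies, axis=0)              # standardise so proxies are comparable
tree = linkage(z, method="ward")         # agglomerative clustering, Ward's criterion
labels = fcluster(tree, t=2, criterion="maxclust")
print(labels)                            # cluster assignment per sample
```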

Relevance: 100.00%

Abstract:

While the use of thromboelastometry analysis (ROTEM®) in the evaluation of haemostasis is rapidly increasing, important validity parameters of the testing remain inadequately examined. We aimed to study systematically the consistency of thromboelastometry parameters within individual tests with respect to measurements on different analysers, between different channels of the same analyser, between morning and afternoon measurements (circadian variation), and between measurements taken four weeks apart. Citrated whole blood samples from 40 healthy volunteers were analysed with two analysers in parallel, and EXTEM, INTEM, FIBTEM, HEPTEM and APTEM tests were conducted. A Bland-Altman comparison was performed, and homogeneity of variances was tested using the Pitman test. P-value ranges were used to classify the level of homogeneity (p < 0.15: low homogeneity; p = 0.15 to 0.5: intermediate homogeneity; p > 0.5: high homogeneity). Less than half of all comparisons showed high homogeneity of variances (p > 0.5), and in about a fifth of comparisons the data distributions were heterogeneous (p < 0.15). There was no clear pattern of homogeneity. On average, comparisons of MCF, ML and LI30 measurements tended to be better, but none of the tests assessed outperformed another. In conclusion, this systematic investigation reveals large differences in the results of some thromboelastometry parameters and a lack of consistency. Clinicians and scientists should take these inconsistencies into account and focus on parameters with higher homogeneity, such as MCF.
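
The sketch below illustrates, on invented data, the two analysis steps named above for one parameter measured on two analysers: a Bland-Altman bias and limits-of-agreement calculation, and the Pitman-Morgan test of equal variances for paired measurements.

```python
# Sketch: Bland-Altman bias/limits of agreement plus the Pitman-Morgan test
# of equal variances for paired data. Values are simulated, not study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
analyser_1 = rng.normal(60, 8, 40)               # e.g. one MCF-like parameter, device 1
analyser_2 = analyser_1 + rng.normal(0, 3, 40)   # same samples on device 2

# Bland-Altman
diff = analyser_1 - analyser_2
bias = diff.mean()
loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))

# Pitman-Morgan: paired variances are equal iff corr(x - y, x + y) = 0
r, p = stats.pearsonr(analyser_1 - analyser_2, analyser_1 + analyser_2)
print(f"bias={bias:.2f}, LoA=({loa[0]:.2f}, {loa[1]:.2f}), Pitman-Morgan p={p:.3f}")
```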

Relevance: 100.00%

Abstract:

Most statistical analysis, in both theory and practice, is concerned with static models: models with a proposed set of parameters whose values are fixed across observational units. Static models implicitly assume that the quantified relationships remain the same across the design space of the data. While this is reasonable under many circumstances, it can be a dangerous assumption when dealing with sequentially ordered data. The mere passage of time always brings fresh considerations, and the interrelationships among parameters, or subsets of parameters, may need to be continually revised. When data are gathered sequentially, dynamic interim monitoring may be useful as new subject-specific parameters are introduced with each new observational unit. Sequential imputation via dynamic hierarchical models is an efficient strategy for handling missing data and analyzing longitudinal studies. Dynamic conditional independence models offer a flexible framework that exploits the Bayesian updating scheme for capturing the evolution of both population and individual effects over time. While static models often describe aggregate information well, they often do not reflect conflicts in the information at the individual level. Dynamic models prove advantageous over static models in capturing both individual and aggregate trends. Computations for such models can be carried out via the Gibbs sampler. An application using small-sample, repeated-measures, normally distributed growth curve data is presented.
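
As a highly simplified, hedged illustration of the computational machinery mentioned above, the sketch below runs a Gibbs sampler for a plain normal hierarchical model with known variances; the dissertation's dynamic conditional independence models are considerably richer, so this only shows the alternating conditional updates.

```python
# Simplified Gibbs sampler for a normal hierarchical model: each subject's
# mean is drawn around a population mean. Priors, data, and variances are
# illustrative; the actual dynamic models in the dissertation are richer.
import numpy as np

rng = np.random.default_rng(2)
data = [rng.normal(mu_i, 1.0, 5) for mu_i in (3.0, 4.0, 5.0)]  # 3 subjects x 5 obs

n_iter, tau2, sigma2 = 2000, 4.0, 1.0        # known variances, for simplicity
mu_pop = 0.0
mu_subj = np.zeros(len(data))
draws = []

for _ in range(n_iter):
    # update each subject mean given its data and the population mean
    for i, y in enumerate(data):
        prec = len(y) / sigma2 + 1 / tau2
        mean = (y.sum() / sigma2 + mu_pop / tau2) / prec
        mu_subj[i] = rng.normal(mean, np.sqrt(1 / prec))
    # update the population mean given the subject means (flat prior)
    mu_pop = rng.normal(mu_subj.mean(), np.sqrt(tau2 / len(mu_subj)))
    draws.append(mu_pop)

print(f"posterior mean of population effect: {np.mean(draws[500:]):.2f}")
```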

Relevance: 100.00%

Abstract:

An easily implemented extension of the standard response method of tidal analysis is outlined. The modification improves the extraction of both the steady and the tidal components from problematic time series by calculating tidal response weights uncontaminated by missing or anomalous data. Examples of time series containing data gaps and anomalous events are analyzed to demonstrate the applicability and advantage of the proposed method.
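
The following sketch is not the authors' response-method extension; it only illustrates the underlying idea of estimating a steady level and tidal weights by least squares while simply skipping missing samples, here with two harmonic constituents fitted to a gappy synthetic series.

```python
# Illustration of least-squares estimation of a steady level and tidal
# weights on a series containing a gap, using two harmonic constituents.
import numpy as np

t = np.arange(0.0, 30 * 24, 1.0)                      # hourly samples, 30 days
omega = 2 * np.pi / np.array([12.42, 12.00])          # M2 and S2 periods (hours)
true = 1.0 + 0.5 * np.cos(omega[0] * t) + 0.2 * np.cos(omega[1] * t + 0.3)
series = true + np.random.default_rng(3).normal(0, 0.05, t.size)
series[200:260] = np.nan                              # a data gap

good = ~np.isnan(series)                              # use only valid samples
design = np.column_stack([np.ones(t.size)] +
                         [f(w * t) for w in omega for f in (np.cos, np.sin)])
coef, *_ = np.linalg.lstsq(design[good], series[good], rcond=None)
steady, amplitudes = coef[0], np.hypot(coef[1::2], coef[2::2])
print(f"steady level: {steady:.2f}, constituent amplitudes: {amplitudes.round(2)}")
```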

Relevance: 100.00%

Abstract:

A workshop providing an introduction to Bayesian data analysis and hypothesis testing using R, JAGS and the BayesFactor package.
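
The workshop itself uses R, JAGS and BayesFactor; as a rough, language-neutral taste of Bayesian hypothesis testing, the Python sketch below approximates a Bayes factor for a one-sample test of a zero mean via the BIC approximation (Wagenmakers, 2007), on made-up data.

```python
# Rough BIC-based approximation to a Bayes factor for H1 (mean free) versus
# H0 (mean fixed at zero), assuming Gaussian likelihoods. Data are invented.
import numpy as np

x = np.array([0.4, 1.1, -0.2, 0.9, 1.3, 0.5, 0.8, 1.0])
n = x.size

rss0 = np.sum(x ** 2)                            # residuals under H0 (mean = 0)
rss1 = np.sum((x - x.mean()) ** 2)               # residuals under H1 (mean free)
bic0 = n * np.log(rss0 / n) + 1 * np.log(n)      # one free parameter: variance
bic1 = n * np.log(rss1 / n) + 2 * np.log(n)      # mean and variance

bf10 = np.exp((bic0 - bic1) / 2)                 # evidence for H1 over H0
print(f"approximate BF10 = {bf10:.2f}")
```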

Relevance: 100.00%

Abstract:

Climate adaptation policies increasingly incorporate sustainability principles into their design and implementation. Since successful adaptation by means of adaptive capacity is recognized as depending on progress toward sustainable development, policy design is increasingly characterized by the inclusion of state and non-state actors (horizontal actor integration), cross-sectoral collaboration, and inter-generational planning perspectives. Comparing four case studies in Swiss mountain regions, three located in the Upper Rhone region and one in western Switzerland, we investigate how sustainability is put into practice. We argue that collaboration networks and sustainability perceptions matter when assessing the implementation of sustainability in local climate change adaptation. In other words, we suggest that adaptation is successful where sustainability perceptions translate into cross-sectoral integration and collaboration on the ground. Data on perceptions and network relations are gathered through surveys and analysed via cluster and social network analysis.
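
As an illustrative sketch only, the code below builds a small collaboration network, computes simple cohesion measures, and clusters actors on hypothetical sustainability perception scores; the actors, ties, and scores are invented and do not correspond to the Swiss case studies.

```python
# Toy social network and perception clustering; all actors and data invented.
import networkx as nx
from scipy.cluster.hierarchy import linkage, fcluster

# collaboration ties between hypothetical state and non-state actors
G = nx.Graph()
G.add_edges_from([
    ("canton_env_office", "tourism_board"),
    ("canton_env_office", "forestry_assoc"),
    ("tourism_board", "ski_resort"),
    ("forestry_assoc", "farmers_union"),
    ("ski_resort", "farmers_union"),
])

print("density:", round(nx.density(G), 2))
print("betweenness:", {n: round(v, 2) for n, v in nx.betweenness_centrality(G).items()})

# cluster actors on survey-based sustainability perception scores (1-5 scale)
actors = list(G.nodes)
perceptions = [[4.5, 4.0], [3.0, 2.5], [4.0, 4.5], [2.0, 2.5], [3.5, 3.0]]
labels = fcluster(linkage(perceptions, method="average"), t=2, criterion="maxclust")
print(dict(zip(actors, labels)))
```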

Relevance: 100.00%

Abstract:

This report presents a basic analysis of the data collected on agroclimatology, erosion, and soil and water conservation at Afdeyu Station in the central highlands of Eritrea between 1984 and 2007. Datasets and graphs include rainfall, air and soil surface temperatures, soil loss, surface runoff, river discharge, and land use, including cropping patterns of the measured catchment.

Relevance: 100.00%

Abstract:

Project justification is regarded as one of the major methodological deficits in Data Warehousing practice. The special nature of Data Warehousing benefits and the large share of infrastructure-related activities are cited as reasons for applying inappropriate methods, performing incomplete evaluations, or even omitting justification entirely. In this paper, the economic justification of Data Warehousing projects is analyzed, and first results from a large academia-industry collaboration project in the field of non-technical issues of Data Warehousing are presented. As conceptual foundations, the role of the Data Warehouse system in corporate application architectures is analyzed and the specific properties of Data Warehousing projects are discussed. Based on an analysis of the applicability of traditional approaches to the economic justification of IT projects, basic steps and responsibilities for the justification of Data Warehousing projects are derived.

Relevance: 100.00%

Abstract:

Microarray technology is a high-throughput method for genotyping and gene expression profiling. Limited sensitivity and specificity are among the essential problems of this technology. Most existing methods of microarray data analysis have an apparent limitation: they deal only with the numerical part of microarray data and make little use of gene sequence information. Because it is the gene sequences that precisely define the physical objects being measured by a microarray, it is natural to make the gene sequences an essential part of the data analysis. This dissertation focused on the development of free energy models to integrate sequence information into microarray data analysis. The models were used to characterize the mechanism of hybridization on microarrays and to enhance the sensitivity and specificity of microarray measurements. Cross-hybridization is a major obstacle to the sensitivity and specificity of microarray measurements. In this dissertation, we evaluated the scope of the cross-hybridization problem on short-oligo microarrays. The results showed that cross-hybridization on arrays is mostly caused by oligo fragments with a run of 10 to 16 nucleotides complementary to the probes. Furthermore, a free-energy based model was proposed to quantify the amount of cross-hybridization signal on each probe. This model treats cross-hybridization as an integral effect of the interactions between a probe and various off-target oligo fragments. Using public spike-in datasets, the model showed high accuracy in predicting the cross-hybridization signals on those probes whose intended targets are absent from the sample. Several prospective models were proposed to improve the positional-dependent nearest-neighbor (PDNN) model for better quantification of gene expression and cross-hybridization. The problem addressed in this dissertation is fundamental to microarray technology. We expect that this study will help us to understand the detailed mechanism that determines sensitivity and specificity on microarrays. Consequently, this research will have a wide impact on how microarrays are designed and how the data are interpreted.
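
A simplified sketch of the positional-dependent nearest-neighbor idea: a probe's binding free energy is modelled as a position-weighted sum of stacking energies over adjacent base pairs. The energy table, weighting profile, and probe sequence below are placeholders, not the fitted PDNN parameters.

```python
# Position-weighted nearest-neighbor free energy score for a probe sequence.
# The dinucleotide energies and weight profile are placeholder numbers.
import numpy as np

NN_ENERGY = {"AA": -1.0, "AC": -1.4, "AG": -1.3, "AT": -0.9,
             "CA": -1.5, "CC": -1.8, "CG": -2.2, "CT": -1.3,
             "GA": -1.3, "GC": -2.2, "GG": -1.8, "GT": -1.4,
             "TA": -0.6, "TC": -1.3, "TG": -1.5, "TT": -1.0}

def weighted_free_energy(probe: str) -> float:
    """Sum nearest-neighbor energies, weighting the middle of the probe more."""
    n = len(probe) - 1
    pos = np.arange(n)
    weights = 1.0 - np.abs(pos - n / 2) / n      # crude bell-shaped profile
    energies = np.array([NN_ENERGY[probe[i:i + 2]] for i in range(n)])
    return float(np.sum(weights * energies))

probe = "ATCGGCTAAGCTTACGGATCGGATA"               # hypothetical 25-mer probe
print(f"weighted free energy score: {weighted_free_energy(probe):.2f}")
```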