922 results for "Compositional data analysis - roots in geosciences"
Abstract:
Background: The recent development of semi-automated techniques for staining and analyzing flow cytometry samples has presented new challenges. Quality control and quality assessment are critical when developing new high-throughput technologies and their associated information services. Our experience suggests that significant bottlenecks remain in the development of high-throughput flow cytometry methods for data analysis and display. In particular, data quality control and quality assessment are crucial steps in processing and analyzing high-throughput flow cytometry data. Methods: We propose a variety of graphical exploratory data analytic tools for exploring ungated flow cytometry data. We have implemented a number of specialized functions and methods in the Bioconductor package rflowcyt. We demonstrate the use of these approaches by investigating two independent sets of high-throughput flow cytometry data. Results: We found that graphical representations can reveal substantial non-biological differences in samples. Empirical cumulative distribution function (ECDF) plots and summary scatterplots were especially useful for the rapid identification of problems not identified by manual review. Conclusions: Graphical exploratory data analytic tools are a quick and useful means of assessing data quality. We propose that the described visualizations be used as quality assessment tools and, where possible, for quality control.
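The ECDF comparison described in this abstract can be sketched in a few lines of pure Python. This is an illustrative toy, not the rflowcyt implementation: the fluorescence values are invented, and the two-sample comparison shown here is the Kolmogorov-Smirnov distance between ECDFs.

```python
# Toy sketch of ECDF-based quality assessment (sample values invented).

def ecdf(values):
    """Return sorted values and their empirical cumulative probabilities."""
    xs = sorted(values)
    n = len(xs)
    return xs, [(i + 1) / n for i in range(n)]

def ks_distance(a, b):
    """Largest vertical gap between two ECDFs (Kolmogorov-Smirnov statistic)."""
    xs = sorted(set(a) | set(b))
    def cdf(sample, x):
        return sum(v <= x for v in sample) / len(sample)
    return max(abs(cdf(a, x) - cdf(b, x)) for x in xs)

sample_ok = [1.0, 1.2, 1.1, 0.9, 1.3]
sample_shifted = [2.0, 2.2, 2.1, 1.9, 2.3]   # e.g. a staining artefact

print(ks_distance(sample_ok, sample_ok))       # 0.0
print(ks_distance(sample_ok, sample_shifted))  # 1.0 - the shifted sample is flagged
```

A large gap between a sample's ECDF and those of its plate-mates is exactly the kind of non-biological difference the visual inspection described above is meant to catch.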
Abstract:
STUDY DESIGN: Ex vivo in vitro study evaluating a novel intervertebral disc/endplate culture system. OBJECTIVES: To establish a whole-organ intervertebral disc culture model for the study of disc degeneration in vitro, including the characterization of basic cell and organ function. SUMMARY OF BACKGROUND DATA: With current in vivo models for the study of disc and endplate degeneration, it remains difficult to investigate the complex disc metabolism and signaling cascades. In contrast, more controlled but simplified in vitro systems using isolated cells or disc fragments are difficult to culture due to the unconstrained conditions, with often-observed cell death or cell dedifferentiation. Therefore, there is a demand for a controlled culture model with preserved cell function that offers the possibility to investigate disc and endplate pathologies in a structurally intact organ. METHODS: Naturally constrained intervertebral disc/endplate units from rabbits were cultured in multi-well plates. Cell viability, metabolic activity, matrix composition, and matrix gene expression profile were monitored using the Live/Dead cell viability test (Invitrogen, Basel, Switzerland), tetrazolium salt reduction (WST-8), proteoglycan and deoxyribonucleic acid quantification assays, and quantitative polymerase chain reaction. RESULTS: Viability and organ integrity were preserved for at least 4 weeks, while proteoglycan and deoxyribonucleic acid content decreased slightly, and matrix genes exhibited a degenerative profile with up-regulation of type I collagen and suppression of collagen type II and aggrecan genes. Additionally, cell metabolic activity was reduced to one third of the initial value. CONCLUSIONS: Naturally constrained intervertebral rabbit discs could be cultured for several weeks without losing cell viability. Structural integrity and matrix composition were retained. 
However, the organ responded to the artificial environment with a degenerative gene expression pattern and decreased metabolic rate. Therefore, the described system serves as a promising in vitro model to study disc degeneration in a whole organ.
Abstract:
A time series is a sequence of observations made over time. Examples in public health include daily ozone concentrations, weekly admissions to an emergency department or annual expenditures on health care in the United States. Time series models are used to describe the dependence of the response at each time on predictor variables including covariates and possibly previous values in the series. Time series methods are necessary to account for the correlation among repeated responses over time. This paper gives an overview of time series ideas and methods used in public health research.
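The correlation among repeated responses that time-series methods must account for can be illustrated with a lag-k autocorrelation computed by hand (the "weekly admissions" numbers below are invented for illustration):

```python
# Illustrative sketch: lag-k autocorrelation of a toy series.

def autocorr(series, lag):
    """Sample autocorrelation at the given lag."""
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series)
    cov = sum((series[t] - mean) * (series[t + lag] - mean)
              for t in range(n - lag))
    return cov / var

# A trending toy "weekly admissions" series is strongly autocorrelated,
# so treating its observations as independent would be misleading.
admissions = [10, 12, 13, 15, 16, 18, 19, 21]
print(round(autocorr(admissions, 1), 2))  # 0.59
```

A lag-1 autocorrelation this far from zero is the signal that ordinary independent-observations methods would understate uncertainty here.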
Abstract:
BACKGROUND: High intercoder reliability (ICR) is required in qualitative content analysis for assuring quality when more than one coder is involved in data analysis. The literature lacks standardized procedures for ICR assessment in qualitative content analysis. OBJECTIVE: To illustrate how ICR assessment can be used to improve coding in qualitative content analysis. METHODS: Key steps of the procedure are presented, drawing on data from a qualitative study on patients' perspectives on low back pain. RESULTS: First, a coding scheme was developed using a comprehensive inductive and deductive approach. Second, 10 transcripts were coded independently by two researchers, and ICR was calculated. The resulting kappa value of .67 can be regarded as satisfactory to solid. Moreover, varying agreement rates helped to identify problems in the coding scheme. Low agreement rates, for instance, indicated that the respective codes were defined too broadly and needed clarification. In a third step, the results of the analysis were used to improve the coding scheme, leading to consistent and high-quality results. DISCUSSION: The quantitative approach of ICR assessment is a viable instrument for quality assurance in qualitative content analysis. Kappa values and close inspection of agreement rates help to estimate and increase the quality of coding. This approach facilitates good practice in coding and enhances the credibility of the analysis, especially when large samples are interviewed, different coders are involved, and quantitative results are presented.
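Cohen's kappa, the ICR statistic used in this abstract, corrects raw agreement for agreement expected by chance. A minimal sketch (the code labels below are invented, not from the low back pain study):

```python
# Minimal Cohen's kappa for two coders over the same set of segments.

def cohens_kappa(coder1, coder2):
    """Chance-corrected agreement: (observed - expected) / (1 - expected)."""
    n = len(coder1)
    observed = sum(a == b for a, b in zip(coder1, coder2)) / n
    categories = set(coder1) | set(coder2)
    # Expected agreement under independence of the two coders' marginals.
    expected = sum((coder1.count(c) / n) * (coder2.count(c) / n)
                   for c in categories)
    return (observed - expected) / (1 - expected)

c1 = ["pain", "coping", "pain", "work", "coping", "pain"]
c2 = ["pain", "coping", "work", "work", "coping", "pain"]
print(round(cohens_kappa(c1, c2), 2))  # 0.75
```

Per-code agreement rates can be inspected the same way, which is how the study localizes codes that are defined too broadly.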
Abstract:
Cluster randomized trials (CRTs) use clusters as the unit of randomization; a cluster is usually defined as a collection of individuals sharing some common characteristic. Common examples of clusters include entire dental practices, hospitals, schools, school classes, villages, and towns. Repeated measurements taken on the same individual at different time points can also be treated as a cluster. In dentistry, CRTs are applicable because patients may be treated as clusters containing several individual teeth. CRTs require certain methodological procedures during sample size calculation, randomization, data analysis, and reporting, which are often ignored in dental research publications. In general, because of the similarity of observations within clusters, each individual within a cluster provides less information than an individual in a non-clustered trial. Therefore, clustered designs require larger sample sizes than non-clustered randomized designs, as well as special statistical analyses that account for the correlation of observations within clusters. The purpose of this article is to highlight, with relevant examples, the important methodological characteristics of cluster randomized designs as they may be applied in orthodontics, and to explain the problems that may arise if clustered observations are erroneously treated and analysed as independent (non-clustered).
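The sample-size inflation described here is usually expressed through the design effect, DEFF = 1 + (m - 1) x ICC, where m is the cluster size and ICC the intracluster correlation. A short sketch (the numbers are invented for illustration):

```python
import math

def design_effect(cluster_size, icc):
    """Variance inflation factor: DEFF = 1 + (m - 1) * ICC."""
    return 1 + (cluster_size - 1) * icc

def inflated_sample_size(n_independent, cluster_size, icc):
    """Observations needed under clustering to match an independent design."""
    return math.ceil(n_independent * design_effect(cluster_size, icc))

# If 200 independent teeth would suffice, then with 4 teeth per patient and
# an intracluster correlation of 0.25, DEFF = 1.75 and 350 teeth are needed.
print(inflated_sample_size(200, 4, 0.25))  # 350
```

Analysing the 350 clustered teeth as if they were 350 independent observations would, conversely, produce spuriously narrow confidence intervals.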
Abstract:
Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS) technology. Based on the concepts of “resources-on-demand” and “pay-as-you-go”, scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client’s site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, where the NGS data is processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks, where the NGS sequences can be processed independently from one another. We also provide the elastream package that supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation.
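The core streaming idea, processing sequences while the data is still being transferred, can be sketched with a hypothetical chunk reassembler. The simplified two-line record format and the function name below are my own; this does not show elastream's actual interface:

```python
# Hypothetical sketch: yield complete (header, sequence) records from a stream
# of arbitrarily split network chunks, so analysis can start before the
# transfer finishes. Records are a simplified two-line "header\nsequence" form.

def stream_records(chunks):
    buffer = ""
    for chunk in chunks:
        buffer += chunk
        # Keep the trailing partial line in the buffer.
        *lines, buffer = buffer.split("\n")
        for header, seq in zip(lines[::2], lines[1::2]):
            yield header, seq
        # Carry over an unpaired header line, if any.
        if len(lines) % 2:
            buffer = lines[-1] + "\n" + buffer

# Record "@r2" arrives split across two chunks but is reassembled correctly.
records = list(stream_records(["@r1\nACGT\n@r2\nTT", "GG\n@r3\nCCC\n"]))
print(records)  # [('@r1', 'ACGT'), ('@r2', 'TTGG'), ('@r3', 'CCC')]
```

Because each record is independent, a downstream analysis task can consume this generator immediately, which is what hides the transfer latency behind computation.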
Abstract:
Background information: During the late 1970s and the early 1980s, West Germany witnessed a reversal of gender differences in educational attainment, as females began to outperform males. Purpose: The main objective was to analyse which processes were behind the reversal of gender differences in educational attainment after 1945. The theoretical reflections and empirical evidence presented for the US context by DiPrete and Buchmann (Gender-specific trends in the value of education and the emerging gender gap in college completion, Demography 43: 1–24, 2006) and Buchmann, DiPrete, and McDaniel (Gender inequalities in education, Annual Review of Sociology 34: 319–37, 2008) are considered and applied to the West German context. It is suggested that the reversal of gender differences is a consequence of changes in female educational decisions, which are mainly related to labour market opportunities and not, as sometimes assumed, a consequence of a 'boys' crisis'. Sample: Several databases, such as the German General Social Survey, the German Socio-economic Panel and the German Life History Study, are employed for the longitudinal analysis of the educational and occupational careers of birth cohorts born in the twentieth century. Design and methods: Changing patterns of eligibility for university studies are analysed for successive birth cohorts and gender. Binary logistic regressions are employed for the statistical modelling of the individuals' achievement, educational decisions and likelihood of social mobility, reporting average marginal effects (AME). Results: The empirical results suggest that women's better school achievement, being constant across cohorts, does not explain the reversal of gender differences in higher education attainment; rather, the increased benefits of higher education explain women's changing decisions regarding the transition to higher education.
Conclusions: The outperformance of females over males in higher education might have been initialised by several social changes, including the expansion of public employment, the growing demand for highly qualified female workers in welfare and service areas, the increasing returns to women's education and training, and the improved opportunities for combining family and work outside the home. The historical data show that, in terms of (married) women's increased labour market opportunities and female life-cycle labour force participation, the rising rates of women's enrolment in higher education were, among other reasons, partly explained by their growing access to service class positions across birth cohorts and by the rise of their educational returns in terms of wages and long-term employment.
Abstract:
Sequence analysis and optimal matching are useful heuristic tools for the descriptive analysis of heterogeneous individual pathways such as educational careers, job sequences or patterns of family formation. However, to date it remains unclear how to handle the inevitable problems caused by missing values with regard to such analysis. Multiple Imputation (MI) offers a possible solution for this problem but it has not been tested in the context of sequence analysis. Against this background, we contribute to the literature by assessing the potential of MI in the context of sequence analyses using an empirical example. Methodologically, we draw upon the work of Brendan Halpin and extend it to additional types of missing value patterns. Our empirical case is a sequence analysis of panel data with substantial attrition that examines the typical patterns and the persistence of sex segregation in school-to-work transitions in Switzerland. The preliminary results indicate that MI is a valuable methodology for handling missing values due to panel mortality in the context of sequence analysis. MI is especially useful in facilitating a sound interpretation of the resulting sequence types.
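Optimal matching, the distance underlying this kind of sequence analysis, is an edit distance over status sequences: the cost of turning one career trajectory into another via substitutions and insertions/deletions. A hedged sketch (the state labels and costs below are invented, not taken from the Swiss data):

```python
# Sketch of optimal matching between two status sequences (toy states/costs).

def om_distance(seq_a, seq_b, indel=1.0, sub=2.0):
    """Edit distance with insertion/deletion and substitution costs."""
    n, m = len(seq_a), len(seq_b)
    d = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = i * indel
    for j in range(1, m + 1):
        d[0][j] = j * indel
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 0.0 if seq_a[i - 1] == seq_b[j - 1] else sub
            d[i][j] = min(d[i - 1][j] + indel,      # delete from seq_a
                          d[i][j - 1] + indel,      # insert into seq_a
                          d[i - 1][j - 1] + cost)   # match or substitute
    return d[n][m]

# Two school-to-work trajectories: E = education, A = apprenticeship, W = work.
print(om_distance("EEAW", "EEWW"))  # 2.0 (one substitution)
```

A missing panel wave punches a hole in such a sequence, which is why multiply imputing the missing states before computing pairwise distances, as proposed above, matters for the resulting cluster typology.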
Abstract:
At sub-arc depths, serpentinites release volatiles and several fluid-mobile trace elements found in arc magmas. Constraining element uptake in these rocks and defining the trace element composition of fluids released upon serpentinite dehydration can improve our understanding of mass transfer across subduction zones and to volcanic arcs. The eclogite-facies garnet metaperidotite and chlorite harzburgite bodies embedded in paragneiss of the subduction melange from Cima di Gagnone derive from serpentinized peridotite protoliths and are unique examples of ultramafic rocks that experienced subduction metasomatism and devolatilization. In these rocks, metamorphic olivine and garnet trap polyphase inclusions representing the fluid released during high-pressure breakdown of antigorite and chlorite. Combining major element mapping and laser-ablation ICP-MS bulk inclusion analysis, we characterize the mineral content of the polyphase inclusions and quantify the fluid composition. Silicates, Cl-bearing phases, sulphides, carbonates, and oxides document post-entrapment mineral growth in the inclusions starting immediately after fluid entrapment. Compositional data reveal the presence of two different fluid types. The first (type A) records a fluid prominently enriched in fluid-mobile elements, with Cl, Cs, Pb, As and Sb concentrations up to 10³ times primitive mantle (PM) and ~10² PM for Tl and Ba, while Rb, B, Sr, Li and U concentrations are of the order of 10¹ PM, and alkalis are ~2 PM. The second fluid (type B) has considerably lower fluid-mobile element enrichments, but its enrichment patterns are comparable to the type A fluid. Our data reveal multistage fluid uptake in these peridotite bodies, including selective element enrichment during seafloor alteration, followed by fluid-rock interaction along with subduction metamorphism in the plate-interface melange.
Here, infiltration of sediment-equilibrated fluid produced significant enrichment of the serpentinites in As, Sb, B and Pb, an enriched trace element pattern that was then transferred to the fluid released at greater depth upon serpentine dehydration (the type A fluid). The type B fluid hosted by garnet may record the composition of the chlorite-breakdown fluid released at even greater depth. The Gagnone case study demonstrates that serpentinized peridotites acquire water and fluid-mobile elements during ocean-floor hydration and through exchange with sediment-equilibrated fluids in the early subduction stages. Subsequent antigorite devolatilization at sub-arc depths delivers aqueous fluids to the mantle wedge that can be prominently enriched in sediment-derived components, potentially triggering arc magmatism without the need for concomitant dehydration/melting of metasediments or altered oceanic crust.
Abstract:
Pigments, proteins and enzyme activity related to chlorophyll catabolism were analysed in senescing leaves of wild-type (WT) Lolium temulentum and compared with those of an introgression line carrying a mutant gene from stay-green (SG) Festuca pratensis. During senescence of WT leaves, chlorophylls a and b were continuously catabolised to colourless products and no other derivatives were observed, whereas in SG leaves there was an accumulation of dephytylated and oxidised catabolites including chlorophyllide a, phaeophorbide a and 13²-OH-chlorophyllide a. Dephytylated products were absent from SG leaf tissue senescing under a light-dark cycle. Retention of pigments in SG was accompanied by significant stabilisation of light-harvesting chlorophyll proteins compared with WT, but soluble proteins such as Rubisco were degraded during senescence at a similar rate in the two genotypes. The activity of phaeophorbide a oxygenase measured in SG tissue at 3 d was less than 12% of that in WT tissue at the same time point during senescence, and of the same order as that in young pre-senescent WT leaves, indicating that the metabolic lesion in SG concerns a deficiency at the ring-opening step of the catabolic pathway. In senescent L. temulentum tissue, two terminal chlorophyll catabolites were identified with chromatographic characteristics suggesting that they may represent hitherto undescribed catabolite structures. These data are discussed in relation to current understanding of the genetic and metabolic control of chlorophyll catabolism in leaf senescence.
Abstract:
OBJECTIVES The purpose of the study was to provide empirical evidence about the reporting of methodology to address missing outcome data and the acknowledgement of their impact in Cochrane systematic reviews in the mental health field. METHODS Systematic reviews published in the Cochrane Database of Systematic Reviews after January 1, 2009 by three Cochrane Review Groups relating to mental health were included. RESULTS One hundred ninety systematic reviews were considered. Missing outcome data were present in at least one included study in 175 systematic reviews. Of these 175 systematic reviews, 147 (84%) accounted for missing outcome data by considering a relevant primary or secondary outcome (e.g., dropout). The implications of missing outcome data were reported in only 61 (35%) systematic reviews, primarily in the discussion section, by commenting on the amount of missing outcome data. One hundred forty eligible meta-analyses with missing data were scrutinized. Seventy-nine (56%) of them included studies with a total dropout rate between 10% and 30%. One hundred nine (78%) meta-analyses reported having performed intention-to-treat analysis by including trials with imputed outcome data. Sensitivity analysis for incomplete outcome data was implemented in fewer than 20% of the meta-analyses. CONCLUSIONS Reporting of the techniques for handling missing outcome data, and of their implications for the findings of the systematic reviews, is suboptimal.
Abstract:
Missing outcome data are common in clinical trials: despite a well-designed study protocol, some randomized participants may leave the trial early, providing only some or none of their data, or may be excluded after randomization. Premature discontinuation causes loss of information, potentially resulting in attrition bias and problems in the interpretation of trial findings. The causes of information loss in a trial, known as mechanisms of missingness, may influence the credibility of the trial results. Analysis of trials with missing outcome data should ideally be handled with intention-to-treat (ITT) rather than per-protocol (PP) analysis. However, true ITT analysis requires appropriate assumptions and imputation of missing data. Using a worked example from a published dental study, we highlight the key issues associated with missing outcome data in clinical trials, describe the most recognized approaches to handling missing outcome data, and explain the principles of ITT and PP analysis.
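The difference between PP and ITT analysis can be made concrete with a toy arm of binary outcomes. The numbers are invented, and the ITT variant shown uses one deliberately simple imputation rule (count every dropout as a failure), one of several approaches a real analysis would compare:

```python
# Illustrative sketch: per-protocol vs. a worst-case-imputation ITT analysis.

def per_protocol_rate(outcomes):
    """Success rate among participants with observed outcomes only."""
    observed = [o for o in outcomes if o is not None]
    return sum(observed) / len(observed)

def itt_rate_worst_case(outcomes):
    """ITT with every missing outcome (None) imputed as failure (0)."""
    imputed = [0 if o is None else o for o in outcomes]
    return sum(imputed) / len(imputed)

# 10 randomized patients, 2 dropouts (None); 1 = success, 0 = failure.
arm = [1, 1, 0, 1, None, 1, 0, 1, None, 1]
print(per_protocol_rate(arm))    # 0.75 (6/8, dropouts excluded)
print(itt_rate_worst_case(arm))  # 0.6  (6/10, dropouts count as failures)
```

The gap between the two estimates grows with the dropout rate, which is why the missingness mechanism, and not just the analysis label, determines how credible each figure is.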