942 results for "Climatic data simulation"


Relevance: 30.00%

Publisher:

Abstract:

The difficulty of detecting differential gene expression in microarray data has existed for many years. Several correction procedures address the multiple comparison problem, including the Bonferroni and Sidak single-step p-value adjustments and Holm's step-down method, which control the family-wise error rate, and Benjamini and Hochberg's procedure, which controls the false discovery rate (FDR). Each multiple comparison technique has its advantages and weaknesses. We studied each method through simulation and applied all of them to real exploratory DNA microarray data used to detect molecular signatures in papillary thyroid cancer (PTC) patients. According to our simulation studies, the Benjamini-Hochberg step-up FDR-controlling procedure performed best among these methods, and we discovered 1277 potential biomarkers among 54675 probe sets after applying it to the PTC microarray data.
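The Benjamini-Hochberg step-up procedure named above is simple to sketch: sort the m p-values, find the largest rank k with p_(k) <= (k/m)·q, and reject the k hypotheses with the smallest p-values. A minimal pure-Python illustration (the p-values below are hypothetical, not the PTC data):

```python
def benjamini_hochberg(pvalues, q=0.05):
    """Return a boolean list marking which hypotheses are rejected
    at FDR level q by the Benjamini-Hochberg step-up procedure."""
    m = len(pvalues)
    # Sort p-values while remembering their original positions.
    order = sorted(range(m), key=lambda i: pvalues[i])
    # Find the largest rank k with p_(k) <= (k/m) * q.
    k_max = 0
    for rank, idx in enumerate(order, start=1):
        if pvalues[idx] <= rank / m * q:
            k_max = rank
    # Reject the k_max hypotheses with the smallest p-values.
    reject = [False] * m
    for rank, idx in enumerate(order, start=1):
        if rank <= k_max:
            reject[idx] = True
    return reject

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.5, 0.9]
print(benjamini_hochberg(pvals, q=0.05))  # rejects only the two smallest
```

Note the step-up character: a p-value above its own threshold can still be rejected if a larger rank passes, which is what makes the procedure less conservative than Bonferroni.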

Objectives. This paper assesses the effect of regression model misspecification on statistical power in a variety of situations.

Methods and results. The effect of misspecification in regression can be approximated by evaluating the correlation between the correctly specified and the misspecified form of the predictor of the outcome variable (Harris 2010). Three misspecified models (linear, categorical, and fractional polynomial) were considered. In the first section, the mathematical method of calculating the correlation between correct and misspecified models with simple mathematical forms is derived and demonstrated. In the second section, data from the National Health and Nutrition Examination Survey (NHANES 2007-2008) were used to examine such correlations. Our study shows that, compared with linear or categorical models, fractional polynomial models, with their higher correlations, provided a better approximation of the true relationship, as illustrated by LOESS regression. In the third section, we present simulation studies demonstrating that misspecification in regression can produce marked decreases in power with small sample sizes. However, the categorical model had the greatest power, ranging from 0.877 to 0.936 depending on sample size and outcome variable. The power of the fractional polynomial model was close to that of the linear model, ranging from 0.69 to 0.83, and appeared to be affected by the additional degrees of freedom of this model.

Conclusion. Correlations between alternative model specifications provide a good approximation of the effect of misspecification on statistical power when the sample size is large. When model specifications have known simple mathematical forms, such correlations can be calculated analytically. Public health data from NHANES 2007-2008 were used to demonstrate situations in which the correct specification is unknown or complex. Simulation of power for misspecified models confirmed the results based on correlation methods and also illustrated the effect of model degrees of freedom on power.
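The link between specification correlation and power can be seen in a small simulation, sketched here under assumed parameters (not the paper's models or data): when the fitted predictor is uncorrelated with the true quadratic term, power collapses to roughly the nominal level, while the correctly specified model retains high power.

```python
import math
import random

def fit_power(transform, beta=0.5, n=50, reps=400, seed=11):
    """Fraction of simulated datasets in which the fitted slope is significant.
    Assumed true model: y = beta * x**2 + N(0,1) noise, x ~ Uniform(-2, 2),
    so a linear fit in x is uncorrelated with the true quadratic term."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(reps):
        x = [rng.uniform(-2.0, 2.0) for _ in range(n)]
        y = [beta * xi * xi + rng.gauss(0.0, 1.0) for xi in x]
        t = [transform(xi) for xi in x]
        # Simple least-squares slope and its standard error.
        mt, my = sum(t) / n, sum(y) / n
        sxx = sum((ti - mt) ** 2 for ti in t)
        slope = sum((ti - mt) * (yi - my) for ti, yi in zip(t, y)) / sxx
        resid = [yi - my - slope * (ti - mt) for ti, yi in zip(t, y)]
        se = math.sqrt(sum(r * r for r in resid) / (n - 2) / sxx)
        if abs(slope / se) > 1.959964:  # approximate two-sided 5% test
            hits += 1
    return hits / reps

p_misspec = fit_power(lambda x: x)      # misspecified linear predictor
p_correct = fit_power(lambda x: x * x)  # correctly specified quadratic term
print(p_misspec, p_correct)
```

With a less extreme design (e.g. x restricted to positive values, where x and x**2 are highly correlated), the power loss from the linear fit would be correspondingly small, which is the correlation-based approximation the paper describes.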

This study proposes a novel statistical method that jointly models multiple outcomes and the missing-data process using item response theory. The method follows the "intent-to-treat" principle in clinical trials and accounts for the correlation between the outcomes and the missing-data process, making it well suited to studies of chronic mental disorders. The simulation study demonstrated that when the true model is the proposed model with moderate or strong correlation, ignoring the within-subject correlation may lead to overestimation of the treatment effect and a type I error rate above the specified level. Even when the within-subject correlation is small, the proposed model performs as well as the naïve response model. Thus, the proposed model is robust across correlation settings when the data are generated by the proposed model.

Mixture modeling is commonly used to model categorical latent variables that represent subpopulations whose membership is unknown but can be inferred from the data. In recent years, finite mixture models have been applied to time-to-event data. However, the commonly used survival mixture model assumes that while the effects of the covariates on failure times differ across latent classes, the covariate distribution is homogeneous. The aim of this dissertation is to develop a method for examining time-to-event data in the presence of unobserved heterogeneity within a mixture-modeling framework. A joint model is developed that incorporates the latent survival trajectory along with the observed information for the joint analysis of a time-to-event variable, its discrete and continuous covariates, and a latent class variable. It is assumed that both the effects of covariates on survival times and the distribution of the covariates vary across latent classes. The unobservable survival trajectories are identified by estimating the probability that a subject belongs to a particular class given the observed information. We applied this method to a Hodgkin lymphoma study with long-term follow-up and observed four distinct latent classes in terms of long-term survival and distributions of prognostic factors. Results from simulation studies and from the Hodgkin lymphoma study demonstrated the superiority of the joint model over the conventional survival model. This flexible inference method provides more accurate estimation and accommodates unobservable heterogeneity among individuals while taking interactions between covariates into consideration.

The phase I clinical trial is the "first in human" study in medical research, examining the toxicity of a new agent. It determines the maximum tolerable dose (MTD) of the agent, i.e., the highest dose at which toxicity is still acceptable. Several phase I designs have been proposed over the past 30 years. The well-known standard method, the so-called 3+3 design, is widely accepted by clinicians because it is the easiest to implement and requires no statistical calculation. The continual reassessment method (CRM), a Bayesian design, has risen in popularity over the last two decades, and several variants of it have been suggested in the statistical literature. Rolling six is a newer method, introduced in pediatric oncology in 2008, which claims to shorten trial duration compared with the 3+3 design. The goal of the present research was to simulate clinical trials and compare these phase I designs. A patient population was created by discrete event simulation (DES); patient characteristics were generated from distributions with parameters derived from a review of historical phase I clinical trial data. Patients were then selected and enrolled in simulated trials, each using the 3+3 design, the rolling six, or a CRM design. Five dose-toxicity scenarios were used to compare the performance of the designs, with one thousand trials simulated per design per scenario. The results showed that the rolling six design was not superior to the 3+3 design in trial duration: the time to trial completion was comparable between the two, and both were shorter than the two CRM designs. Both CRMs were superior to the 3+3 design and the rolling six in the accuracy of MTD estimation. The 3+3 design and rolling six tended to assign more patients to undesirably low dose levels, while toxicities were slightly greater with the CRMs.
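A simplified version of the 3+3 rules used in such simulations can be sketched as follows. This is an escalation-only sketch (no dose de-escalation), and the dose-toxicity scenario is hypothetical, not one of the five scenarios from the study:

```python
import random

def run_3plus3(tox_probs, seed=0):
    """One simulated 3+3 trial. Enrol 3 patients per dose; on exactly 1
    dose-limiting toxicity (DLT), expand the cohort to 6; stop once >= 2
    DLTs occur at a dose. Returns the MTD index, or -1 if even the lowest
    dose is too toxic. `tox_probs` gives the assumed per-dose DLT risk."""
    rng = random.Random(seed)
    mtd = -1
    for dose, p in enumerate(tox_probs):
        dlts = sum(rng.random() < p for _ in range(3))
        if dlts == 1:
            dlts += sum(rng.random() < p for _ in range(3))  # expand to 6
        if dlts >= 2:
            return mtd  # toxicity limit exceeded; MTD is the dose below
        mtd = dose      # dose tolerated; escalate
    return mtd

# Hypothetical dose-toxicity scenario; repeat over many seeds as in the study.
scenario = [0.05, 0.10, 0.25, 0.60, 0.80]
estimates = [run_3plus3(scenario, seed=s) for s in range(1000)]
print("modal MTD estimate:", max(set(estimates), key=estimates.count))
```

A CRM comparison would replace these fixed rules with a Bayesian update of an assumed dose-toxicity curve after each cohort; the study's comparison then tallies MTD accuracy, duration, and toxicities across 1000 such trials per design.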

Multi-center clinical trials are very common in the development of new drugs and devices. One concern in such trials is the effect on the overall result of individual investigational sites that enroll small numbers of patients: can the presence of small centers cause an ineffective treatment to appear effective when the treatment-by-center interaction is not statistically significant? In this research, simulations are used to study the effect that centers enrolling few patients may have on the analysis of clinical trial data. A multi-center clinical trial with 20 sites is simulated to investigate the effect of a new treatment in comparison to placebo. Twelve of the 20 investigational sites are considered small, each enrolling fewer than four patients per treatment group. Three trials are simulated, with sample sizes of 100, 170, and 300. The simulated data are generated with various characteristics, one in which the treatment should be considered effective and another in which it is not; qualitative interactions are also produced within the small sites to further investigate the effect of small centers under various conditions. Standard analysis-of-variance methods and the "sometimes-pool" testing procedure are applied to the simulated data. One model includes treatment effect, center effect, and treatment-by-center interaction; another includes the treatment effect alone. These analyses are used to determine the power to detect treatment-by-center interactions and the probability of type I error. We find it is difficult to detect treatment-by-center interactions when only a few investigational sites enrolling a limited number of patients participate in the interaction. However, we find no increased risk of type I error in these situations: in a pooled analysis, when the treatment is not effective, the probability of finding a significant treatment effect in the absence of a significant treatment-by-center interaction is well within the standard limits of type I error.
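The type I error finding can be illustrated with a small simulation, sketched here under assumed parameters (centre sizes mirror the 12-small/8-larger split, but the effect sizes and variances are illustrative, not the study's): with no true treatment effect, a pooled two-sample comparison rejects at or below the nominal 5% rate.

```python
import math
import random

def sample_var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def pooled_type1_rate(n_trials=300, seed=7):
    """Rejection rate of a pooled treatment comparison when the treatment
    has no effect: 12 small centres (2 patients/arm), 8 larger centres
    (10 patients/arm), with a random centre-specific baseline shared by
    both arms within a centre (illustrative values)."""
    rng = random.Random(seed)
    sizes = [2] * 12 + [10] * 8
    hits = 0
    for _ in range(n_trials):
        treat, ctrl = [], []
        for size in sizes:
            base = rng.gauss(0.0, 0.5)  # centre baseline, identical in both arms
            treat += [base + rng.gauss(0.0, 1.0) for _ in range(size)]
            ctrl += [base + rng.gauss(0.0, 1.0) for _ in range(size)]
        n = len(treat)
        diff = sum(treat) / n - sum(ctrl) / n
        se = math.sqrt((sample_var(treat) + sample_var(ctrl)) / n)
        if abs(diff / se) > 1.96:
            hits += 1
    return hits / n_trials

print(pooled_type1_rate())
```

Because the centre baselines cancel in the treatment-control difference but inflate the pooled variance estimate, the naive pooled test here is, if anything, conservative, consistent with the "no increased risk of type I error" finding.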

The purpose of this research is to develop a new statistical method to determine the minimum set of rows (R) in an R x C contingency table of discrete data that explains the dependence of observations. The statistical power of the method will be determined empirically by computer simulation to judge its efficiency against presently existing methods. The method will be applied to data on DNA fragment length variation at six VNTR loci in over 72 populations from five major human racial groups (a total sample size of over 15,000 individuals, each sample having at least 50 individuals). DNA fragment lengths grouped in bins will form the basis of studying inter-population DNA variation within the racial groups. Where such variation is significant, the method will provide a rigorous re-binning procedure for forensic computation of DNA profile frequencies that takes intra-racial DNA variation among populations into account.
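Dependence in such an R x C table is conventionally assessed with the Pearson chi-square statistic, on which any row-selection procedure would build. A self-contained sketch, using a hypothetical 3 x 2 table of binned fragment-length counts in two populations (not the VNTR data):

```python
def chi_square_independence(table):
    """Pearson chi-square statistic and degrees of freedom for an
    R x C contingency table of counts (test of independence)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (observed - expected) ** 2 / expected
    df = (len(table) - 1) * (len(table[0]) - 1)
    return stat, df

# Hypothetical counts: 3 fragment-length bins (rows) x 2 populations (columns).
table = [[50, 30], [20, 40], [30, 30]]
stat, df = chi_square_independence(table)
print(round(stat, 2), df)  # 11.67 2
```

Comparing `stat` against the chi-square distribution with `df` degrees of freedom gives the p-value; the proposed method would then search for the smallest subset of rows that accounts for a significant statistic.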

In this dissertation, we propose a continuous-time Markov chain model to examine longitudinal data whose outcome variable has three categories. The advantage of this model is that it permits a different number of measurements for each subject, and the duration between two consecutive measurement time points can be irregular. Using the maximum likelihood principle, we can estimate the transition probability between two time points; by using the information provided by the independent variables, the model can also estimate subject-specific transition probabilities. The Monte Carlo simulation method will be used to investigate the goodness of fit of the model compared with that of other models, and a public health example will be used to demonstrate the application of the method.
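The key quantity in such a continuous-time Markov chain is the transition probability matrix P(t) = exp(Qt) for a generator matrix Q whose rows sum to zero, which is what allows irregular gaps between measurements. A minimal sketch (the generator values below are hypothetical; in the proposed model they would be estimated by maximum likelihood, possibly as functions of covariates):

```python
def mat_mul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transition_matrix(Q, t, terms=60):
    """P(t) = exp(Q t) via a truncated Taylor series; adequate when the
    entries of Q*t are small, as in this toy example."""
    n = len(Q)
    P = [[float(i == j) for j in range(n)] for i in range(n)]  # identity
    term = [row[:] for row in P]
    Qt = [[q * t for q in row] for row in Q]
    for k in range(1, terms):
        term = [[v / k for v in row] for row in mat_mul(term, Qt)]  # (Qt)^k / k!
        P = [[P[i][j] + term[i][j] for j in range(n)] for i in range(n)]
    return P

# Hypothetical generator for a three-category outcome; rows sum to zero,
# and the third state is absorbing (e.g. dropout or death).
Q = [[-0.30, 0.20, 0.10],
     [0.05, -0.15, 0.10],
     [0.00, 0.00, 0.00]]
P = transition_matrix(Q, t=2.0)
print([round(p, 3) for p in P[0]])  # probabilities of moving out of state 1
```

Each row of P(t) is a probability distribution over the three states after an elapsed time t, so subjects measured at different intervals simply use different values of t.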

For a reliable simulation of the time- and space-dependent CO2 redistribution between ocean and atmosphere, an appropriate time-dependent simulation of particle-dynamics processes is essential but had not been carried out before. The major difficulties were the lack of suitable modules for particle dynamics and early diagenesis (needed to close the carbon and nutrient budgets) in ocean general circulation models, and the limited understanding of biogeochemical processes such as the partial dissolution of calcareous particles in oversaturated water. The main target of ORFOIS was to fill this gap in our knowledge and prediction capability, and this goal was achieved step by step. First, comprehensive databases of existing observations relevant to the three major types of biogenic particles, organic carbon (POC), calcium carbonate (CaCO3), and biogenic silica (BSi, or opal), as well as to refractory particles of terrestrial origin, were collated and made publicly available.

Sediments of Lake Donggi Cona on the northeastern Tibetan Plateau were studied to infer changes in the lacustrine depositional environment related to climatic and non-climatic changes during the last 19 kyr. The lake today fills a 30 x 8 km, 95-m-deep tectonic basin associated with the Kunlun Fault. The study was conducted on a sediment-core transect through the lake basin in order to gain a complete picture of spatiotemporal environmental change. The recovered sediments are partly finely laminated and are composed of calcareous muds with variable amounts of carbonate micrite, organic matter, detrital silt, and clay. On the basis of sedimentological, geochemical, and mineralogical data, up to five lithological units (LU) can be distinguished that document distinct stages in the development of the lake system. The onset of the lowermost LU, with lacustrine muds above basal sands, indicates that the lake level was at least 39 m below the present level and started to rise after 19 ka, possibly in response to regional deglaciation. At this time, the lacustrine environment was characterized by detrital sediment influx and the deposition of siliciclastic sediment. In two sediment cores, upward grain-size coarsening documents a lake-level fall after 13 cal ka BP, possibly associated with the late-glacial Younger Dryas stadial. From 11.5 to 4.3 cal ka BP, grain-size fining in sediment cores from the profundal coring sites and the onset of lacustrine deposition at a littoral core site (2 m water depth) in a recent marginal bay of Donggi Cona document lake-level rise during the early to mid-Holocene to at least the modern level. In addition, high biological productivity and pronounced precipitation of carbonate micrite are consistent with warm and moist climate conditions related to an enhanced influence of the summer monsoon. At 4.3 cal ka BP the lake shifted from an aragonite- to a calcite-dominated system, indicating a change towards a fully open hydrological lake system. The younger clay-rich sediments are moreover non-laminated and lack diagenetic sulphides, pointing to fully ventilated conditions and the prevailing absence of lake stratification. This turning point in lake history could imply either a threshold response to insolation-forced climate cooling or a response to a non-climatic trigger, such as an erosional event or a tectonic pulse inducing a strong earthquake; our data do not allow us to decide between these possibilities.

The development of widespread anoxic conditions in the deep oceans is evidenced by the accumulation and preservation of organic-carbon-rich sediments, but its precise cause remains controversial. The two most popular hypotheses involve (1) circulation-induced increased stratification resulting in reduced oxygenation of deep waters, or (2) enhanced productivity in the surface ocean, increasing the rain of organic matter and overwhelming the oxic remineralization potential of the deep ocean. In the periodic development of deep-water anoxia in the Pliocene-Pleistocene Mediterranean Sea, increased riverine runoff has been implicated both as a source of nutrients that fuel enhanced photic-zone productivity and as a source of a less dense freshwater cap leading to reduced circulation, basin-wide stagnation, and deep-water oxygen starvation. Monsoon-driven increases in Nile River discharge and increased regional precipitation due to enhanced westerly activity, two mechanisms that represent fundamentally different climatic driving forces, have both been suggested as causes of the altered freshwater balance. Here we present data that confirm a distinctive neodymium (Nd) isotope signature for the Nile River relative to the Eastern Mediterranean, providing a new tracer of enhanced Nile outflow into the Mediterranean in the past. We further present Nd isotope data for planktonic foraminifera that suggest a clear increase in Nile discharge during the central intense period of two recent anoxic events. Our data also suggest, however, that other regional freshwater sources were more important at the beginning and end of the anoxic events. Taken at face value, the data appear to imply a temporal link between peaks in Nile discharge and enhanced westerly activity.

Understanding how the environment influences patterns of diversity is vital for effective conservation management, especially in a changing global climate. While assemblage structure and species richness patterns are often correlated with current environmental factors, historical influences may also be considerable, especially for taxa with poor dispersal abilities. Mountain-top regions throughout tropical rainforests can act as important refugia for taxa with low dispersal capacities, such as flightless ground beetles (Carabidae), an ecologically significant predatory group. We surveyed flightless ground beetles along elevational gradients in five subregions within the Australian Wet Tropics World Heritage Area to investigate (1) whether their diversity and composition are elevationally stratified and, if so, (2) which environmental factors (other than elevation per se) are associated with these patterns. Generalised linear models and model-averaging techniques were used to relate patterns of diversity to environmental factors. Unlike most taxonomic groups, flightless ground beetles increased in species richness and abundance with elevation. Additionally, each subregion contained distinct assemblages with a high level of regional endemism. Species richness was most strongly positively associated with historical climatic conditions, and negatively associated with the severity of recent disturbance (treefalls) and with current climatic conditions. Assemblage composition was associated with latitude and with current and historical climatic conditions. Our results suggest that distributional patterns of flightless ground beetles are associated not only with factors that change with elevation (current climatic conditions) but also with factors that are independent of elevation (recent disturbance and historical climatic conditions). Variation in historical vegetation stability explained both species richness and assemblage composition patterns, probably reflecting the significance of upland refugia at a geographic time scale. These findings are important for conservation management, as upland habitats are under threat from climate change.

Simple glaciological conditions at Dome C in East Antarctica have made possible a more detailed and accurate interpretation of an ice core, to 950 m depth and spanning some 32,000 yr, than was obtained from earlier ice cores. Dated events in a comparable marine core have enabled the reduction in accumulation rate during the last ice age to be estimated. Climatic events recorded in the ice core indicate that the warmest Holocene period in the Southern Hemisphere occurred at an earlier date than in the Northern Hemisphere.

Variations in the poleward-directed Atlantic heat transfer were investigated over the past 135 ka, with special emphasis on the last and present interglacial climate development (Eemian and Holocene). Both interglacials exhibited very similar climatic oscillations during the preceding glacial terminations (deglacial TI and TII). Like TI, TII shows pronounced cold-warm-cold changes akin to events such as H1, the Bølling/Allerød, and the Younger Dryas. But unlike in TI, the cold events in TII were associated with intermittent southerly invasions of an Atlantic faunal component, which underscores a quite different water-mass evolution in the Nordic Seas. Within the Eemian interglaciation proper, peak warming intervals were antiphased between the Nordic Seas and the North Atlantic. Moreover, inferred temperatures for the Nordic Seas were generally colder in the Eemian than in the Holocene, and vice versa for the North Atlantic. A reduced intensity of Atlantic Ocean heat transfer to the Arctic therefore characterized the Eemian, requiring a reassessment of the actual role of the ocean-atmosphere system in interglacial, but also glacial, climate changes.

Laminated lake sediments from the Dead Sea basin provide high-resolution records of climatic variability in the eastern Mediterranean region, which is especially sensitive to changing climatic conditions. In this study, we aim at a detailed reconstruction of climatic fluctuations and related changes in the frequency of flood and dust deposition events at ca. 3300 and especially at 2800 cal. yr BP from high-resolution sediment records of the Dead Sea basin. A ca. 4-m-thick, mostly varved sediment section from the western margin of the Dead Sea (DSEn, Ein Gedi profile) was analysed and correlated to the new International Continental Scientific Drilling Program (ICDP) Dead Sea Deep Drilling Project core 5017-1 from the deep basin. To detect even single event layers, we applied a multi-proxy approach of high-resolution microscopic thin-section analyses, micro-X-ray fluorescence (µ-XRF) element scanning, and magnetic susceptibility measurements, supported by grain-size data and palynological analyses. Based on radiocarbon and varve dating, two pronounced dry periods were detected at ~3500-3300 and ~3000-2400 cal. yr BP, which are expressed differently in the sediment records. In the shallow-water core (DSEn), the older dry period is characterised by a thick sand deposit, whereas the sedimentological change at 2800 cal. yr BP is less pronounced and characterised mainly by an enhanced frequency of coarse detrital layers interpreted as erosion events. In the 5017-1 deep-basin core, both dry periods are marked by halite deposits. The onset of the younger dry period coincides with the Homeric Grand Solar Minimum at ca. 2800 cal. yr BP. Our results suggest that during this period the Dead Sea region experienced an overall dry climate, superimposed by an increased occurrence of flash floods caused by a change in synoptic weather patterns.