942 results for Climatic data simulation
Abstract:
For the first time, we present a multi-proxy data set for the Russian Altai, consisting of Siberian larch tree-ring width (TRW), latewood density (MXD), and δ13C and δ18O in cellulose chronologies for the period 1779–2007, together with cell wall thickness (CWT) for 1900–2008. All of these parameters agree well with one another in their high-frequency variability, while the low-frequency climate information shows systematic differences. Correlation analysis with temperature and precipitation data from the nearest weather station and with gridded data revealed that annual TRW, MXD, CWT, and δ13C data contain a strong summer temperature signal, whereas δ18O in cellulose carries a mixed summer and winter temperature and precipitation signal. Temperature and precipitation reconstructions from the Belukha ice core and Teletskoe lake sediments were used to assess the correspondence among the independent proxies. Low-frequency patterns in the TRW and δ13C chronologies are consistent with temperature reconstructions from the nearby Belukha ice core and Teletskoe lake sediments, showing a pronounced warming trend in the last century; their combination could therefore be used for a regional temperature reconstruction. The long-term δ18O trend agrees with the precipitation reconstruction from the Teletskoe lake sediment, indicating more humid conditions during the twentieth century, so these two proxies could be combined for a precipitation reconstruction.
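As a rough, hypothetical illustration of the kind of proxy-climate correlation screening described above (the series below are synthetic stand-ins, not the Altai data):

# Correlate synthetic annual proxy chronologies with a synthetic summer temperature series.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
years = np.arange(1940, 2008)
summer_temp = 12 + 0.02 * (years - years[0]) + rng.normal(0, 0.8, years.size)

proxies = pd.DataFrame({
    "TRW":  0.6 * summer_temp + rng.normal(0, 0.7, years.size),
    "MXD":  0.7 * summer_temp + rng.normal(0, 0.6, years.size),
    "d13C": 0.5 * summer_temp + rng.normal(0, 0.9, years.size),
}, index=years)

# Pearson correlation of each chronology with the instrumental target series.
for name, series in proxies.items():
    r = np.corrcoef(series, summer_temp)[0, 1]
    print(f"{name}: r = {r:.2f}")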
Abstract:
BACKGROUND: After bovine spongiform encephalopathy (BSE) emerged in European cattle in 1986, a fundamental question was whether the agent had also become established in the small ruminant population. In Switzerland, transmissible spongiform encephalopathies (TSEs) in small ruminants have been monitored since 1990. While a BSE infection could be excluded in the most recent TSE cases, techniques to discriminate scrapie from BSE had not been available at the time of diagnosis for the historical cases, so their status remained unclear. We therefore applied state-of-the-art techniques to retrospectively classify these animals and to re-analyze the affected flocks for secondary cases. These results were the basis for models simulating the course of TSEs over a period of 70 years. The aim was to arrive at a statistically based overall assessment of the TSE situation in the domestic small ruminant population in Switzerland. RESULTS: In total, 16 TSE cases have been identified in small ruminants in Switzerland since 1981, of which eight were atypical scrapie and six were classical scrapie. In two animals, retrospective analysis did not allow any further classification because appropriate tissue samples were lacking. We found no evidence of infection with the BSE agent in the cases under investigation, and no secondary cases were identified in any of the affected flocks. A Bayesian prevalence calculation yielded most likely estimates of one case of BSE, five cases of classical scrapie, and 21 cases of atypical scrapie per 100,000 small ruminants. According to our models, none of these TSEs is expected to cause a broader epidemic in Switzerland. In a closed population they are expected to fade out over the next decades or, in the case of a sporadic origin, to remain at a very low level. CONCLUSIONS: In summary, these data indicate that, despite a significant BSE epidemic in cattle, there is no evidence that BSE became established in the small ruminant population in Switzerland. Classical and atypical scrapie both occur at a very low level and are not expected to escalate into an epidemic. In this situation, the extent of TSE surveillance in small ruminants requires re-evaluation based on cost-benefit analysis.
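A minimal sketch of a Bayesian prevalence calculation of the kind mentioned above, assuming a simple Beta-Binomial model with illustrative counts (not the Swiss surveillance figures):

# Posterior for a prevalence per 100,000 under a uniform Beta(1, 1) prior.
from scipy import stats

cases = 6            # observed cases of one TSE type (illustrative)
tested = 280_000     # number of small ruminants examined (illustrative)

# With a uniform prior, the posterior is Beta(cases + 1, tested - cases + 1).
posterior = stats.beta(cases + 1, tested - cases + 1)
mode = cases / tested                      # posterior mode ("most likely" estimate)
lo, hi = posterior.ppf([0.025, 0.975])     # 95% credible interval

print(f"most likely prevalence: {mode * 100_000:.1f} per 100,000")
print(f"95% credible interval: {lo * 100_000:.1f} - {hi * 100_000:.1f} per 100,000")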
Abstract:
Computer simulation is now used in many fields, particularly in laboratories, where it serves to explore data that are sometimes experimentally inaccessible. In less developed countries, where up-to-date laboratories for practical lessons in chemistry are lacking, especially in secondary schools and some institutions of higher learning, simulation may allow learners to carry out experiments such as titrations without laboratory materials and equipment. Computer simulations may also help teachers better explain the realities of practical lessons, given that computers have become very accessible and inexpensive compared with the acquisition of laboratory materials and equipment. This work aims to produce a virtual laboratory that permits the simulation of an acid-base titration and an oxidation-reduction titration using synthetic images. To this end, an appropriate numerical method was used to derive flowcharts, which were then transcribed into source code in a programming language to produce the software.
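A minimal sketch of how such a titration can be simulated numerically, here for a strong acid titrated with a strong base; the concentrations and volumes are arbitrary illustrative values, not taken from the software described:

# pH curve for titrating HCl with NaOH, neglecting activity corrections.
import numpy as np

c_acid, v_acid = 0.10, 25.0      # mol/L HCl, mL of analyte
c_base = 0.10                    # mol/L NaOH titrant

for v_base in np.arange(0.0, 50.1, 5.0):         # mL of titrant added
    n_h = c_acid * v_acid - c_base * v_base       # mmol of excess H+ (negative = excess OH-)
    v_tot = v_acid + v_base
    if n_h > 0:
        ph = -np.log10(n_h / v_tot)               # excess strong acid
    elif n_h < 0:
        ph = 14 + np.log10(-n_h / v_tot)          # excess strong base
    else:
        ph = 7.0                                  # equivalence point
    print(f"{v_base:5.1f} mL -> pH {ph:5.2f}")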
Abstract:
The Simulation Automation Framework for Experiments (SAFE) streamlines the design and execution of experiments with the ns-3 network simulator. SAFE ensures that best practices are followed throughout the workflow of a network simulation study, guaranteeing that results are both credible and reproducible by third parties. Data analysis is a crucial part of this workflow, and one where mistakes are often made: even in highly regarded venues, scientific graphics in numerous network simulation publications fail to include titles, units, legends, and confidence intervals. After studying the literature on network simulation methodology and information graphics visualization, I developed a visualization component for SAFE to help users avoid these errors in their scientific workflow. The new component supports interactive visualization through a web-based interface and the generation of high-quality static plots that can be included in publications. The overarching goal of my contribution is to help users create graphics that follow best practices in visualization and thereby convey the right information about simulation results.
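A short sketch of a static plot that includes the elements the abstract says are often missing (a title, labeled axes with units, a legend, and confidence intervals); the throughput numbers are invented for illustration and this is not SAFE's actual API:

# Publication-style plot with title, units, legend, and a confidence band.
import numpy as np
import matplotlib.pyplot as plt

load = np.linspace(0, 100, 11)                    # offered load (Mb/s)
mean_tput = 0.8 * load / (1 + load / 80)          # mean throughput across replications
ci = 1.96 * 2.5 * np.ones_like(load)              # 95% confidence half-width

fig, ax = plt.subplots()
ax.plot(load, mean_tput, label="mean of 30 replications")
ax.fill_between(load, mean_tput - ci, mean_tput + ci, alpha=0.3,
                label="95% confidence interval")
ax.set_title("Simulated throughput vs. offered load")
ax.set_xlabel("Offered load (Mb/s)")
ax.set_ylabel("Throughput (Mb/s)")
ax.legend()
fig.savefig("throughput.png", dpi=300)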
Abstract:
This research investigated the effects of adding increasing levels of carbon dioxide to the combustion of methane with air. Using an atmospheric-pressure, swirl-stabilized dump combustor, emissions data and flame stability limits were measured and analyzed.
Abstract:
This study summarises all the accessible data on old German chemical weapons dumped in the Baltic Sea. Mr. Goncharov formulated a concept for evaluating the ecological impact of chemical warfare agents (CWA) on the marine environment and constructed a simulation model adapted to the specific hydrological conditions and hydrobiological characteristics of the Bornholm Deep. The mathematical model he created describes the spreading of contaminants by currents and turbulence in the near-bottom boundary layer. Parameters of CWA discharge through corrosion of canisters were given for various kinds of bottom sediments, with allowance for current velocity. He developed a method for integral estimation and a computer simulation model and completed a forecast for the CWA "Mustard", which showed that under normal hydrometeorological conditions there are local toxic plumes drifting along the bottom for distances of up to several kilometres. With storm winds, the toxic plumes from separate canisters merge and lengthen and can reach fishery areas near Bornholm Island. When salt water from the North Sea flows in, the length of the toxic zones can increase to 100 kilometres or more, and toxic water masses can spread into the northern Baltic. On this basis, Mr. Goncharov drew up recommendations to reduce the dangers to human ecology and proposed the creation of a special system for forecasting and remote sensing of the environmental conditions at CWA burial sites.
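A rough sketch of the kind of near-bottom transport calculation described, using a one-dimensional explicit advection-diffusion scheme; the velocity, diffusivity, and release rate are illustrative and not calibrated to the Bornholm Deep:

# 1-D advection-diffusion of a contaminant released from a point source.
import numpy as np

nx, dx = 400, 25.0              # 10 km domain at 25 m spacing
u, k = 0.05, 1.0                # current velocity (m/s), eddy diffusivity (m^2/s)
dt = 0.4 * dx**2 / (2.0 * k)    # time step within the explicit stability limits

c = np.zeros(nx)                # contaminant concentration (arbitrary units)
source_cell, release = 5, 1e-3  # release location and amount added per step

for _ in range(int(5 * 24 * 3600 / dt)):                             # ~5 days
    c[source_cell] += release
    adv = -u * dt / dx * (c - np.roll(c, 1))                         # upwind advection
    dif = k * dt / dx**2 * (np.roll(c, 1) - 2 * c + np.roll(c, -1))  # diffusion
    c = c + adv + dif
    c[0] = c[-1] = 0.0          # open boundaries: material leaves the domain

extent_km = np.count_nonzero(c > 0.01 * c.max()) * dx / 1000.0
print(f"plume extent above 1% of peak concentration: {extent_km:.2f} km")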
Abstract:
Ice core data from Antarctica provide detailed insights into the characteristics of past climate, atmospheric circulation, and changes in the aerosol load of the atmosphere. We present high-resolution records of soluble calcium (Ca2+), non-sea-salt soluble calcium (nssCa2+), and particulate mineral dust aerosol from the East Antarctic Plateau at a depth resolution of 1 cm, spanning the past 800,000 years. Although all three parameters are largely dust-derived, the ratio of nssCa2+ to particulate dust depends on the particulate dust concentration itself. We used principal component analysis to extract the joint climatic signal and produce a common high-resolution record of dust flux. This new record is used to identify Antarctic warming events during the past eight glacial periods. The phasing of dust flux and CO2 changes during glacial-interglacial transitions reveals that iron fertilization of the Southern Ocean during the past nine glacial terminations was not the dominant factor in the deglacial rise of CO2 concentrations. Rapid changes in dust flux during glacial terminations and Antarctic warming events point to a rapid response of the southern westerly wind belt, in the region of the southern South American dust sources, to changing climate conditions. The clear lead of these dust changes over the temperature rise suggests that an atmospheric reorganization occurred in the Southern Hemisphere before the Southern Ocean warmed significantly.
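A minimal sketch of extracting a joint signal from several dust-related records with principal component analysis; the three series below are synthetic stand-ins for the ice core records:

# First principal component of three correlated, standardized records.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
n = 1000
common = np.cumsum(rng.normal(size=n))            # shared "dust flux" signal
records = np.column_stack([
    1.0 * common + rng.normal(0, 0.5, n),         # stand-in for soluble Ca2+
    0.8 * common + rng.normal(0, 0.5, n),         # stand-in for nssCa2+
    1.2 * common + rng.normal(0, 0.8, n),         # stand-in for particulate dust
])

# Standardize each record, then take PC1 as the common signal they share.
z = (records - records.mean(axis=0)) / records.std(axis=0)
pca = PCA(n_components=1).fit(z)
pc1 = pca.transform(z)[:, 0]
print("variance explained by PC1:", round(pca.explained_variance_ratio_[0], 3))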
Abstract:
Human HeLa cells expressing mouse connexin30 were used to study the electrical properties of gap junction channel substates. Experiments were performed on cell pairs using a dual voltage-clamp method. Single-channel currents revealed discrete levels attributable to a main state, a residual state, and five substates interposed, suggesting the operation of six subgates provided by the six connexins of a gap junction hemichannel. Substate conductances, gamma(j,substate), were unevenly distributed between the main-state and the residual-state conductance (gamma(j,main state) = 141 pS, gamma(j,residual state) = 21 pS). Activation of the first subgate reduced the channel conductance by approximately 30%, and activation of subsequent subgates resulted in conductance decrements of 10-15% each. Current transitions between the states were fast (<2 ms). Substate events were usually demarcated by transitions from and back to the main state; transitions among substates were rare. Hence, subgates are recruited simultaneously rather than sequentially. The incidence of substate events was larger at larger gradients of V(j). The frequency and duration of substate events increased with increasing number of synchronously activated subgates. Our mathematical model, which describes the operation of gap junction channels, was expanded to include channel substates. Based on the established V(j)-sensitivity of gamma(j,main state) and gamma(j,residual state), the simulation yielded unique functions gamma(j,substate) = f(V(j)) for each substate. Hence, the spacing of subconductance levels between the channel main state and residual state was uneven and characteristic for each V(j).
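A small numeric sketch of the substate conductance ladder implied by the values quoted above, reading each decrement as a percentage of the main-state conductance (one possible interpretation; the exact per-subgate percentages are illustrative):

# Stepwise conductance levels from the main state toward the residual state.
g_main, g_residual = 141.0, 21.0                # pS, values quoted in the abstract
decrements = [0.30, 0.13, 0.13, 0.13, 0.13]     # fraction of g_main removed per subgate (assumed)

levels = [g_main]
for d in decrements:
    levels.append(levels[-1] - d * g_main)
levels.append(g_residual)                        # all six subgates active -> residual state

print("conductance ladder (pS):", [round(g, 1) for g in levels])
# -> roughly 141, 98.7, 80.4, 62.0, 43.7, 25.4, 21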
Abstract:
Focusing of four hemoglobins with concurrent electrophoretic mobilization was studied by computer simulation. A dynamic electrophoresis simulator was first used to provide a detailed description of focusing in a 100-carrier component, pH 6-8 gradient using phosphoric acid as anolyte and NaOH as catholyte. These results are compared to an identical simulation except that the catholyte contained both NaOH and NaCl. A stationary, steady-state distribution of carrier components and hemoglobins is produced in the first configuration. In the second, the chloride ion migrates into and through the separation space. It is shown that even under these conditions of chloride ion flux a pH gradient forms. All amphoteric species acquire a slight positive charge upon focusing and the whole pattern is mobilized towards the cathode. The cathodic gradient end is stable whereas the anodic end is gradually degrading due to the continuous accumulation of chloride. The data illustrate that the mobilization is a cationic isotachophoretic process with the sodium ion being the leading cation. The peak height of the hemoglobin zones decreases somewhat upon mobilization, but the zones retain a relatively sharp profile, thus facilitating detection. The electropherograms that would be produced by whole column imaging and by a single detector placed at different locations along the focusing column are presented and show that focusing can be commenced with NaCl present in the catholyte at the beginning of the experiment. However, this may require detector placement on the cathodic side of the catholyte/sample mixture interface.
Abstract:
In biostatistical applications, interest often focuses on the estimation of the distribution of the time T between two consecutive events. If the initial event time is observed and the subsequent event time is only known to be larger or smaller than an observed monitoring time C, then the data are described by the well-known singly censored current status model, also known as interval-censored data, case I. We extend this current status model by allowing the presence of a time-dependent covariate process that is partly observed, and by allowing C to depend on T through the observed part of this process. Because of the high dimension of the covariate process, no globally efficient estimators exist with good practical performance at moderate sample sizes. We follow the approach of Robins and Rotnitzky (1992) by modeling the censoring variable, given the time variable and the covariate process (i.e., the missingness process), under the restriction that it satisfies coarsening at random. We propose a generalization of the simple current status estimator of the distribution of T, and of smooth functionals of that distribution, based on an estimate of the missingness process; in this estimator the covariates enter only through the estimate of the missingness process. Owing to the coarsening-at-random assumption, the estimator has the interesting property that estimating the missingness process more nonparametrically improves its efficiency. We show that by local estimation of an optimal model, or of an optimal function of the covariates, for the missingness process, the generalized current status estimator for smooth functionals becomes locally efficient: it is efficient if the right model or covariate is consistently estimated, and it remains consistent and asymptotically normal in general. Estimation of the optimal model requires estimation of the conditional distribution of T given the covariates. Any prior knowledge of this conditional distribution can be used at this stage without any risk of losing root-n consistency. We also propose locally efficient one-step estimators. Finally, we show some simulation results.
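A hedged sketch of the simple current status estimator referred to above, without covariates: the NPMLE of F(t) = P(T <= t) from monitoring times C and indicators Delta = 1{T <= C} is the isotonic regression of Delta on C. The simulated data are purely illustrative:

# Simple current status estimator via isotonic regression (pool-adjacent-violators).
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(2)
n = 500
T = rng.exponential(2.0, n)        # unobserved event times
C = rng.uniform(0.0, 6.0, n)       # monitoring times
delta = (T <= C).astype(float)     # all that is observed besides C

order = np.argsort(C)
F_hat = IsotonicRegression(y_min=0.0, y_max=1.0).fit_transform(C[order], delta[order])

# Compare the estimate with the true exponential CDF at a few time points.
for t in (1.0, 2.0, 4.0):
    idx = np.searchsorted(C[order], t)
    print(f"t={t}: F_hat={F_hat[max(idx - 1, 0)]:.2f}, true={1 - np.exp(-t / 2):.2f}")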
Abstract:
Estimation for bivariate right-censored data is a problem that has been studied extensively over the past 15 years. In this paper we propose a new class of estimators for the bivariate survival function based on locally efficient estimation. We introduce the locally efficient estimator for bivariate right-censored data, present an asymptotic theorem, report the results of simulation studies, and perform a brief data analysis illustrating the use of the locally efficient estimator.
Abstract:
Investigators interested in whether a disease aggregates in families often collect case-control family data, which consist of disease status and covariate information for families selected via case or control probands. Here, we focus on the use of case-control family data to investigate the relative contributions to the disease of additive genetic effects (A), shared family environment (C), and unique environment (E). To this end, we describe an ACE model for binary family data and then introduce an approach to fitting the model to case-control family data. The structural equation model, which has been described previously, combines a general-family extension of the classic ACE twin model with a (possibly covariate-specific) liability-threshold model for binary outcomes. Our likelihood-based approach to fitting involves conditioning on the proband’s disease status, as well as setting the prevalence equal to a pre-specified value that can be estimated from the data themselves if necessary. Simulation experiments suggest that our approach yields approximately unbiased estimates of the A, C, and E variance components, provided that certain commonly made assumptions hold. These assumptions include the usual assumptions for the classic ACE and liability-threshold models; assumptions about shared family environment for relative pairs; and assumptions about the case-control family sampling, including single ascertainment. When our approach is used to fit the ACE model to Austrian case-control family data on depression, the resulting estimate of heritability is very similar to those from previous analyses of twin data.
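An illustrative simulation of binary outcomes under a liability-threshold ACE model for sibling pairs (not the authors' fitting procedure); the variance components and prevalence are made-up values, not estimates from the Austrian depression data:

# Simulate sibling pairs with A, C, E liability components and a threshold.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
va, vc, ve = 0.5, 0.2, 0.3          # A, C, E variance components (sum to 1)
prevalence = 0.10
threshold = norm.ppf(1 - prevalence)

n_pairs = 100_000
# Additive genetic liability is correlated 0.5 between full siblings,
# shared environment is identical, unique environment is independent.
g_shared = rng.normal(size=n_pairs)
g1 = np.sqrt(0.5) * g_shared + np.sqrt(0.5) * rng.normal(size=n_pairs)
g2 = np.sqrt(0.5) * g_shared + np.sqrt(0.5) * rng.normal(size=n_pairs)
c = rng.normal(size=n_pairs)
liab1 = np.sqrt(va) * g1 + np.sqrt(vc) * c + np.sqrt(ve) * rng.normal(size=n_pairs)
liab2 = np.sqrt(va) * g2 + np.sqrt(vc) * c + np.sqrt(ve) * rng.normal(size=n_pairs)

y1, y2 = liab1 > threshold, liab2 > threshold
print("simulated prevalence:", round(y1.mean(), 3))
print("risk given an affected sibling:", round(y2[y1].mean(), 3))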
Abstract:
Generalized linear mixed models with semiparametric random effects are useful in a wide variety of Bayesian applications. When the random effects arise from a mixture of Dirichlet process (MDP) model, normal base measures and Gibbs sampling procedures based on the Pólya urn scheme are often used to simulate posterior draws. These algorithms are applicable in the conjugate case when (for a normal base measure) the likelihood is normal. In the non-conjugate case, the algorithms proposed by MacEachern and Müller (1998) and Neal (2000) are often applied to generate posterior samples. Some common problems associated with simulation algorithms for non-conjugate MDP models include convergence and mixing difficulties. This paper proposes an algorithm based on the Pólya urn scheme that extends the Gibbs sampling algorithms to non-conjugate models with normal base measures and exponential family likelihoods. The algorithm proceeds by making Laplace approximations to the likelihood function, thereby reducing the procedure to that of conjugate normal MDP models. To ensure the validity of the stationary distribution in the non-conjugate case, the proposals are accepted or rejected by a Metropolis-Hastings step. In the special case where the data are normally distributed, the algorithm is identical to the Gibbs sampler.
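A simplified sketch of the Laplace step described above, approximating a non-normal (here Poisson) likelihood in a single random effect by a normal density centred at the mode with variance from the curvature; the counts and model are illustrative only and do not reproduce the full sampler:

# Laplace approximation of a Poisson log-link likelihood in a random effect b.
import numpy as np
from scipy.optimize import minimize_scalar

y = np.array([3, 5, 2, 4])                      # counts for one cluster (made up)

def neg_loglik(b):
    # Poisson likelihood with log link: rate = exp(b)
    return -np.sum(y * b - np.exp(b))

b_hat = minimize_scalar(neg_loglik).x           # mode of the likelihood in b
curvature = y.size * np.exp(b_hat)              # second derivative of -log-likelihood at the mode
sigma2 = 1.0 / curvature                        # variance of the normal approximation

print(f"Laplace approximation: b is roughly N({b_hat:.3f}, {sigma2:.3f})")
# In the full algorithm this normal pseudo-likelihood plays the role of a conjugate
# normal likelihood, and a Metropolis-Hastings step corrects the approximation error.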
Abstract:
Genomic alterations have been linked to the development and progression of cancer. The technique of Comparative Genomic Hybridization (CGH) yields data consisting of fluorescence intensity ratios of test and reference DNA samples. The intensity ratios provide information about DNA copy number. Practical issues such as the contamination of tumor cells in tissue specimens and normalization errors necessitate the use of statistics for learning about genomic alterations from array CGH data. As increasing amounts of array CGH data become available, there is a growing need for automated algorithms for characterizing genomic profiles. Specifically, there is a need for algorithms that can identify copy-number gains and losses based on statistical considerations, rather than merely detect trends in the data. We adopt a Bayesian approach, relying on a hidden Markov model to account for the inherent dependence in the intensity ratios. Posterior inferences are made about gains and losses in copy number. Localized amplifications (associated with oncogene mutations) and deletions (associated with mutations of tumor suppressors) are identified using posterior probabilities. Global trends such as extended regions of altered copy number are detected. Since the posterior distribution is analytically intractable, we implement a Metropolis-within-Gibbs algorithm for efficient simulation-based inference. Publicly available data on pancreatic adenocarcinoma, glioblastoma multiforme and breast cancer are analyzed, and comparisons are made with some widely used algorithms to illustrate the reliability and success of the technique.
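A toy sketch of a three-state hidden Markov model (loss / neutral / gain) for log2 intensity ratios, decoded here with the Viterbi algorithm; the parameters and data are invented, and this does not reproduce the paper's Bayesian Metropolis-within-Gibbs inference:

# Viterbi decoding of copy-number states from simulated log2 ratios.
import numpy as np
from scipy.stats import norm

means = np.array([-0.6, 0.0, 0.6])            # loss, neutral, gain (log2 ratio)
sd = 0.2
log_trans = np.log(np.array([[0.90, 0.09, 0.01],
                             [0.05, 0.90, 0.05],
                             [0.01, 0.09, 0.90]]))
log_init = np.log(np.array([0.1, 0.8, 0.1]))

rng = np.random.default_rng(4)
true_states = np.repeat([1, 2, 1, 0, 1], 40)   # a gain segment and a loss segment
obs = rng.normal(means[true_states], sd)

# Viterbi recursion in log space.
log_emit = norm.logpdf(obs[:, None], means[None, :], sd)
delta = log_init + log_emit[0]
back = np.zeros((obs.size, 3), dtype=int)
for t in range(1, obs.size):
    scores = delta[:, None] + log_trans        # scores[i, j]: best path ending in i, moving to j
    back[t] = scores.argmax(axis=0)
    delta = scores.max(axis=0) + log_emit[t]

path = np.zeros(obs.size, dtype=int)
path[-1] = delta.argmax()
for t in range(obs.size - 2, -1, -1):
    path[t] = back[t + 1, path[t + 1]]

print("decoding accuracy:", (path == true_states).mean())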