922 results for Data replication processes


Relevance: 30.00%

Abstract:

Magmatic volatiles play a crucial role in volcanism, from magma production at depth to generation of seismic phenomena to control of eruption style. Accordingly, many models of volcano dynamics rely heavily on behavior of such volatiles. Yet measurements of emission rates of volcanic gases have historically been limited, which has restricted model verification to processes on the order of days or longer. UV cameras are a recent advancement in the field of remote sensing of volcanic SO2 emissions. They offer enhanced temporal and spatial resolution over previous measurement techniques, but need development before they can be widely adopted and achieve the promise of integration with other geophysical datasets. Large datasets require a means by which to quickly and efficiently use imagery to calculate emission rates. We present a suite of programs designed to semi-automatically determine emission rates of SO2 from series of UV images. Extraction of high temporal resolution SO2 emission rates via this software facilitates comparison of gas data to geophysical data for the purposes of evaluating models of volcanic activity and has already proven useful at several volcanoes. Integrated UV camera and seismic measurements recorded in January 2009 at Fuego volcano, Guatemala, provide new insight into the system’s shallow conduit processes. High temporal resolution SO2 data reveal patterns of SO2 emission rate relative to explosions and seismic tremor that indicate tremor and degassing share a common source process. Progressive decreases in emission rate appear to represent inhibition of gas loss from magma as a result of rheological stiffening in the upper conduit. Measurements of emission rate from two closely-spaced vents, made possible by the high spatial resolution of the camera, help constrain this model. UV camera measurements at Kilauea volcano, Hawaii, in May of 2010 captured two occurrences of lava filling and draining within the summit vent. 
High lava stands were accompanied by diminished SO2 emission rates, decreased seismic and infrasonic tremor, minor deflation, and slowed lava lake surface velocity. Incorporating the UV camera data into this multi-parameter dataset supports shallow gas accumulation as the likely cause of such events.
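
The core calculation such software performs — converting an image transect of SO2 column densities into an emission rate — can be sketched as follows. This is a minimal illustration with hypothetical values, not the actual program described above:

```python
import numpy as np

def so2_emission_rate(column_densities, pixel_size_m, plume_speed_ms):
    """Integrate SO2 column densities (kg/m^2) along a transect
    perpendicular to plume transport and scale by plume speed.

    Returns the emission rate in kg/s.
    """
    # Integrated column amount along the transect (kg/m)
    integrated = np.sum(column_densities) * pixel_size_m
    # Flux through the transect (kg/s)
    return integrated * plume_speed_ms

# Hypothetical values: uniform column density of 2e-3 kg/m^2 over a
# 100-pixel transect, 1 m pixels, 5 m/s plume speed
rate = so2_emission_rate(np.full(100, 2e-3), 1.0, 5.0)
```

In practice the plume speed itself is often derived from the image sequence (e.g., by cross-correlating successive transects), which is one reason high temporal resolution matters.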

Relevance: 30.00%

Abstract:

The developmental processes and functions of an organism are controlled by its genes and the proteins derived from those genes. The identification of key genes and the reconstruction of gene networks can provide a model to help us understand the regulatory mechanisms for the initiation and progression of biological processes or functional abnormalities (e.g. diseases) in living organisms. In this dissertation, I have developed statistical methods to identify the genes and transcription factors (TFs) involved in biological processes, constructed their regulatory networks, and evaluated existing association methods to find robust methods for coexpression analyses. Two kinds of data sets were used for this work: genotype data and gene expression microarray data. On the basis of these data sets, the dissertation has two major parts, comprising six chapters. The first part deals with developing association methods for rare variants using genotype data (chapters 4 and 5). The second part deals with developing and/or evaluating statistical methods to identify genes and TFs involved in biological processes, and with constructing their regulatory networks using gene expression data (chapters 2, 3, and 6). For the first part, I have developed two methods to find the groupwise association of rare variants with given diseases or traits. The first method is based on kernel machine learning and can be applied to both quantitative and qualitative traits. Simulation results showed that the proposed method has improved power over the existing weighted sum (WS) method in most settings. The second method uses multiple phenotypes to select a few top significant genes. It then finds the association of each gene with each phenotype while controlling for population stratification by adjusting the data for ancestry using principal components. This method was applied to GAW 17 data and was able to find several disease risk genes.
For the second part, I have worked on three problems. The first problem involved the evaluation of eight gene association methods. A comprehensive comparison of these methods, with further analysis, clearly demonstrates their distinct and common performance characteristics. For the second problem, an algorithm named the bottom-up graphical Gaussian model was developed to identify the TFs that regulate pathway genes and to reconstruct their hierarchical regulatory networks. This algorithm produced highly significant results, and it is the first report to produce such hierarchical networks for these pathways. The third problem dealt with developing another algorithm, called the top-down graphical Gaussian model, that identifies the network governed by a specific TF. The network produced by this algorithm proved to be of very high accuracy.
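
As a point of reference for the rare-variant work described above, a simple groupwise collapsing approach (Madsen-Browning-style weighted burden scores, related in spirit to the weighted sum method used for comparison) can be sketched as follows. The toy data and weighting details are illustrative assumptions, not the dissertation's kernel machine method:

```python
import numpy as np

def weighted_burden_scores(genotypes):
    """Collapse rare variants in a gene region into one score per subject.

    genotypes: (n_subjects, n_variants) array of minor-allele counts (0/1/2).
    Variants are weighted by 1/sqrt(n*q*(1-q)), where q is a smoothed
    minor-allele frequency estimate, so rarer variants contribute more
    (Madsen-Browning-style weighting).
    """
    g = np.asarray(genotypes, dtype=float)
    n = g.shape[0]
    q = (g.sum(axis=0) + 1) / (2 * n + 2)   # smoothed allele frequency
    w = 1.0 / np.sqrt(n * q * (1 - q))
    return g @ w

# Hypothetical toy data: 4 subjects, 3 rare variants
g = np.array([[0, 1, 0],
              [0, 0, 0],
              [1, 0, 2],
              [0, 0, 0]])
scores = weighted_burden_scores(g)
```

The resulting per-subject scores would then be tested for association with the trait (e.g., by regression), which is where the power comparisons between methods come in.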

Relevance: 30.00%

Abstract:

The exsolution of volatiles from magma exerts an important control on volcanic eruption styles. The nucleation, growth, and connectivity of bubbles during magma ascent provide the driving force behind eruptions, and the rate, volume, and ease of gas exsolution can affect eruptive activity. Volcanic plumes are the observable consequence of this magmatic degassing, and remote sensing techniques allow us to quantify changes in gas exsolution. However, until recently the methods used to measure volcanic plumes could not detect rapid changes in degassing on the scale of standard geophysical observations. The advent of the UV camera now makes high-sample-rate gas measurements possible. This type of dataset can then be compared with other volcanic observations to provide an in-depth picture of degassing mechanisms in the shallow conduit. The goals of this research are to develop a robust methodology for UV camera field measurements of volcanic plumes and to use these data in conjunction with seismoacoustic records to illuminate degassing processes. Field and laboratory experiments were conducted to determine the effects of imaging conditions, vignetting, exposure time, calibration technique, and filter usage on UV camera sulfur dioxide measurements. Using the best practices determined from these studies, a field campaign was undertaken at Volcán de Pacaya, Guatemala. Coincident plume sulfur dioxide measurements, acoustic recordings, and seismic observations were collected and analyzed jointly. The results provide insight into the small explosive features, variations in degassing rate, and plumbing system of this complex volcanic system. This research provides useful information for determining volcanic hazard at Pacaya and demonstrates the potential of the UV camera in multiparameter studies.

Relevance: 30.00%

Abstract:

Measuring shallow seismic sources provides a way to reveal processes that cannot be directly observed, but the correct interpretation and value of these signals depend on the ability to distinguish source from propagation effects. Furthermore, seismic signals produced by a resonating source can look almost identical to those produced by impulsive sources but modified along the path. These two phenomena can be distinguished by examining the wavefield with small-aperture arrays or by recording seismicity near the source when possible. We examine source and path effects in two different environments: Bering Glacier, Alaska, and Villarrica Volcano, Chile. Using three 3-element seismic arrays near the terminus of the Bering Glacier, we identified and located both terminus calving and iceberg breakup events. We show that automated array analysis provided a robust way to locate icequake events using P waves. This analysis also showed that arrivals within the long-period codas were incoherent within the small-aperture arrays, demonstrating that these codas, previously attributed to crack resonance, were in fact the result of a complicated path rather than a source effect. At Villarrica Volcano, seismometers deployed from near the vent to ~10 km away revealed that a several-cycle long-period source signal recorded at the vent appeared elongated in the far field. We used data collected from the stations nearest the vent to invert for the repetitive seismic source and found that it corresponded to a shallow force within the lava lake oriented N75°E and dipping 7° from horizontal. We also used this repetitive signal to search the data for additional seismic and infrasonic properties, including seismic-acoustic delay times, volcano acoustic-seismic ratios and energies, event frequency, and real-time seismic amplitude measurements.
These calculations revealed lava lake level and activity fluctuations consistent with lava lake level changes inferred from the persistent infrasonic tremor.
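
The coherence argument used above to separate source from path effects can be illustrated with a simple zero-lag correlation measure across array elements. This is a generic sketch with synthetic signals, not the automated analysis used in the study:

```python
import numpy as np

def pairwise_coherence(traces):
    """Mean zero-lag cross-correlation coefficient across all station
    pairs of a small-aperture array. Values near 1 indicate a coherent
    wavefield (a source effect); low values suggest path complexity.

    traces: (n_stations, n_samples) array of time-aligned seismograms.
    """
    x = np.asarray(traces, dtype=float)
    x = x - x.mean(axis=1, keepdims=True)
    x = x / np.linalg.norm(x, axis=1, keepdims=True)
    c = x @ x.T                          # pairwise correlation matrix
    iu = np.triu_indices(len(x), k=1)    # upper triangle: distinct pairs
    return c[iu].mean()

# Synthetic example: identical signal at 3 stations vs. independent noise
t = np.linspace(0, 1, 500)
sig = np.sin(2 * np.pi * 5 * t)
coherent = pairwise_coherence(np.vstack([sig, sig, sig]))
rng = np.random.default_rng(0)
incoherent = pairwise_coherence(rng.standard_normal((3, 500)))
```

A coda that correlates poorly between closely spaced stations, as in the second case, is hard to attribute to a common resonating source.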

Relevance: 30.00%

Abstract:

The use of remote sensing methods to assess landscape-scale ecological change is rapidly becoming a dominant force in the natural sciences. Powerful and robust non-parametric statistical methods are also actively being developed to complement the unique characteristics of remotely sensed data. The focus of this research is to utilize these powerful, robust remote sensing and statistical approaches to shed light on woody plant encroachment into native grasslands--a troubling ecological phenomenon occurring throughout the world. Specifically, this research investigates western juniper encroachment within the sage-steppe ecosystem of the western USA. Western juniper trees are native to the intermountain west and are ecologically important, providing structural diversity and habitat for many species. However, after nearly 150 years of post-European settlement changes to this threatened ecosystem, natural ecological processes such as fire regimes no longer limit the range of western juniper to rocky refugia and other areas protected from the short fire return intervals historically common to the region. Consequently, sage-steppe communities with high juniper densities exhibit negative impacts such as reduced structural diversity, degraded wildlife habitat, and ultimately the loss of biodiversity. Much of today's sage-steppe ecosystem is transitioning to juniper woodlands, and the majority of western juniper woodlands have not yet reached their full potential in either range or density. The first section of this research investigates the biophysical drivers responsible for the juniper expansion patterns observed in the sage-steppe ecosystem. The second section is a comprehensive accuracy assessment of classification methods used to identify juniper tree cover from multispectral 1 m spatial resolution aerial imagery.
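
Accuracy assessments of the kind described in the second section typically rest on a confusion matrix, overall accuracy, and Cohen's kappa. A minimal sketch with hypothetical labels (0 = juniper, 1 = other cover):

```python
import numpy as np

def accuracy_assessment(reference, predicted, n_classes):
    """Standard thematic-map accuracy metrics from reference and
    predicted class labels: confusion matrix, overall accuracy,
    and Cohen's kappa."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for r, p in zip(reference, predicted):
        cm[r, p] += 1
    n = cm.sum()
    po = np.trace(cm) / n                          # observed agreement
    pe = (cm.sum(axis=0) @ cm.sum(axis=1)) / n**2  # chance agreement
    kappa = (po - pe) / (1 - pe)
    return cm, po, kappa

# Hypothetical reference vs. classified labels for 8 pixels
ref  = [0, 0, 0, 1, 1, 1, 1, 1]
pred = [0, 0, 1, 1, 1, 1, 0, 1]
cm, overall, kappa = accuracy_assessment(ref, pred, 2)
```

Kappa discounts agreement expected by chance, which matters when one class (here, non-juniper cover) dominates the scene.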

Relevance: 30.00%

Abstract:

High-density spatial and temporal sampling of EEG data enhances the quality of results of electrophysiological experiments. Because EEG sources typically produce widespread electric fields (see Chapter 3) and operate at frequencies well below the sampling rate, increasing the number of electrodes and time samples will not necessarily increase the number of observed processes, but will mainly increase the accuracy of the representation of these processes. This is notably the case when inverse solutions are computed. As a consequence, increasing the sampling in space and time increases the redundancy of the data (in space because electrodes are correlated due to volume conduction, and in time because neighboring time points are correlated), while the degrees of freedom of the data change only little. This has to be taken into account when statistical inferences are to be made from the data. However, in many ERP studies the intrinsic correlation structure of the data has been disregarded. Often, some electrodes or groups of electrodes are selected a priori as the analysis entity and considered as repeated (within-subject) measures that are analyzed using standard univariate statistics. The increased spatial resolution obtained with more electrodes is thus poorly represented by the resulting statistics. In addition, the assumptions made (e.g. in terms of what constitutes a repeated measure) are not supported by what we know about the properties of EEG data. From the point of view of physics (see Chapter 3), the natural "atomic" analysis entity of EEG and ERP data is the scalp electric field.
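
The redundancy argument can be illustrated numerically: when many electrodes record a few underlying sources, the eigenvalue spectrum of the channel covariance collapses onto a handful of components. A sketch with synthetic data — the participation-ratio estimator and the 3-source setup are illustrative assumptions, not taken from the text:

```python
import numpy as np

def effective_dimensionality(data):
    """Participation-ratio estimate of the number of effective spatial
    components in multichannel data: (sum l_i)^2 / sum(l_i^2) over the
    eigenvalues l_i of the channel covariance matrix."""
    x = np.asarray(data, dtype=float)
    x = x - x.mean(axis=1, keepdims=True)
    lam = np.linalg.eigvalsh(np.cov(x))
    lam = np.clip(lam, 0, None)          # guard against tiny negatives
    return lam.sum() ** 2 / (lam ** 2).sum()

# Synthetic demo: 32 "electrodes" driven by only 3 underlying sources
rng = np.random.default_rng(1)
sources = rng.standard_normal((3, 1000))
mixing = rng.standard_normal((32, 3))    # instantaneous volume conduction
eeg = mixing @ sources + 0.05 * rng.standard_normal((32, 1000))
dim = effective_dimensionality(eeg)      # far below 32 channels
```

Despite 32 channels, the effective dimensionality stays near the number of sources — exactly the mismatch between channel count and degrees of freedom described above.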

Relevance: 30.00%

Abstract:

Monte Carlo simulation was used to evaluate properties of a simple Bayesian MCMC analysis of the random effects model for single-group Cormack-Jolly-Seber capture-recapture data. The MCMC method is applied to the model via a logit link, so the parameters p and S are on a logit scale, where logit(S) is assumed to have, and is generated from, a normal distribution with mean μ and variance σ². Marginal prior distributions on logit(p) and μ were independent normal with mean zero and standard deviation 1.75 for logit(p) and 100 for μ, hence minimally informative. The marginal prior distribution on σ² was placed on τ² = 1/σ² as a gamma distribution with α = β = 0.001. The study design has 432 points spread over 5 factors: occasions (t), new releases per occasion (u), p, μ, and σ. At each design point 100 independent trials were completed (hence 43,200 trials in total), each with sample size n = 10,000 from the parameter posterior distribution. At 128 of these design points, comparisons are made to previously reported results from a method-of-moments procedure. We looked at properties of point and interval inference on μ and σ based on the posterior mean, median, and mode and the equal-tailed 95% credibility interval. Bayesian inference did very well for the parameter μ, but under the conditions used here, MCMC inference performance for σ was mixed: poor for sparse data (i.e., only 7 occasions) or σ = 0, but good when there were sufficient data and σ was not small.
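
The data-generating side of the random effects model can be sketched as follows; the design-point values below are hypothetical, and only the prior/model structure follows the description above:

```python
import numpy as np

def expit(x):
    """Inverse logit: maps the real line to (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def simulate_survival(t, mu, sigma, seed=0):
    """Draw per-interval survival probabilities under the random-effects
    model: logit(S_i) ~ Normal(mu, sigma^2) for the t-1 intervals between
    t capture occasions. (In the Bayesian fit, the precision tau^2 = 1/sigma^2
    would receive a Gamma(0.001, 0.001) prior, as described above.)"""
    rng = np.random.default_rng(seed)
    logit_s = rng.normal(mu, sigma, size=t - 1)
    return expit(logit_s)

# Hypothetical design point: 7 occasions, mean survival 0.8 on the
# probability scale, sigma = 0.5 on the logit scale
S = simulate_survival(7, np.log(0.8 / 0.2), 0.5)
```

Simulated survival probabilities like these, combined with a capture probability p, generate the capture histories whose posterior is then sampled by MCMC at each design point.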

Relevance: 30.00%

Abstract:

Complementary to automatic extraction processes, Virtual Reality technologies provide an adequate framework for integrating human perception into the exploration of large data sets. In such a multisensory system, thanks to intuitive interactions, users can take advantage of all of their perceptual abilities in the exploration task. In this context, haptic perception, coupled with visual rendering, has been investigated for the last two decades, with significant achievements. In this paper, we present a survey of the exploitation of haptic feedback in the exploration of large data sets. For each haptic technique introduced, we describe its principles and its effectiveness.

Relevance: 30.00%

Abstract:

Dynamic changes in ERP topographies can be conveniently analyzed by means of microstates, the so-called "atoms of thought," which represent brief periods of quasi-stable synchronized network activation. Comparing temporal microstate features such as onset, offset, or duration between groups and conditions therefore allows a precise assessment of the timing of cognitive processes. So far, this has been achieved by assigning the individual time-varying ERP maps to spatially defined microstate templates obtained from clustering the grand mean data into predetermined numbers of topographies (microstate prototypes). Features obtained from these individual assignments were then statistically compared. The problem with this approach is that individual noise dilutes the match between individual topographies and templates, leading to lower statistical power. We therefore propose a randomization-based procedure that works without assigning grand-mean microstate prototypes to individual data. In addition, we propose a new criterion to select the optimal number of microstate prototypes based on cross-validation across subjects. After a formal introduction, the method is applied to a sample data set from an N400 experiment and to simulated data with varying signal-to-noise ratios, and the results are compared to existing methods. In a first comparison with previously employed statistical procedures, the new method showed increased robustness to noise and higher sensitivity for more subtle effects of microstate timing. We conclude that the proposed method is well suited for the assessment of timing differences in cognitive processes. The increased statistical power allows identifying more subtle effects, which is particularly important in small or rare patient populations.
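
A generic two-sample randomization test on a timing feature (e.g., microstate onset) illustrates the basic machinery; this sketch is not the specific prototype-free procedure proposed in the paper, and the onset values are hypothetical:

```python
import numpy as np

def permutation_test(a, b, n_perm=10000, seed=0):
    """Two-sample randomization test on the difference of means.
    Returns a two-sided Monte Carlo p-value."""
    rng = np.random.default_rng(seed)
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    observed = abs(a.mean() - b.mean())
    pooled = np.concatenate([a, b])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # randomly reassign condition labels
        diff = abs(pooled[:len(a)].mean() - pooled[len(a):].mean())
        count += diff >= observed
    # +1 correction keeps the p-value strictly positive
    return (count + 1) / (n_perm + 1)

# Hypothetical microstate onsets (ms) in two conditions
p = permutation_test([120, 130, 125, 118], [150, 160, 148, 155], n_perm=2000)
```

Because the null distribution is built from the data itself, no distributional assumptions about the timing feature are needed — the core appeal of randomization statistics in this setting.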

Relevance: 30.00%

Abstract:

OBJECTIVES Although the use of an adjudication committee (AC) for outcomes is recommended in randomized controlled trials, there are limited data on the process of adjudication. We therefore aimed to assess whether the reporting of the adjudication process in venous thromboembolism (VTE) trials meets existing quality standards and which characteristics of trials influence the use of an AC. STUDY DESIGN AND SETTING We systematically searched MEDLINE and the Cochrane Library from January 1, 2003, to June 1, 2012, for randomized controlled trials on VTE. We abstracted information about the characteristics and quality of trials and the reporting of adjudication processes. We used a stepwise backward logistic regression model to identify trial characteristics independently associated with the use of an AC. RESULTS We included 161 trials. Of these, 68.9% (111 of 161) reported the use of an AC. Overall, 99.1% (110 of 111) of trials with an AC used independent or blinded ACs, 14.4% (16 of 111) reported how the adjudication decision was reached within the AC, and 4.5% (5 of 111) reported whether the reliability of adjudication was assessed. In multivariate analyses, multicenter trials [odds ratio (OR), 8.6; 95% confidence interval (CI): 2.7, 27.8], use of a data safety-monitoring board (OR, 3.7; 95% CI: 1.2, 11.6), and VTE as the primary outcome (OR, 5.7; 95% CI: 1.7, 19.4) were associated with the use of an AC. Trials without random allocation concealment (OR, 0.3; 95% CI: 0.1, 0.8) and open-label trials (OR, 0.3; 95% CI: 0.1, 1.0) were less likely to report an AC. CONCLUSION Recommended processes of adjudication are underreported and lack standardization in VTE-related clinical trials. The use of an AC varies substantially by trial characteristics.
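
Odds ratios like those reported above follow from logistic-regression coefficients via OR = exp(β), with a 95% CI of exp(β ± 1.96·SE). A sketch using a hypothetical coefficient and standard error chosen to land near the scale of the multicenter-trial estimate:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a 95% confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical coefficient: beta = 2.15, SE = 0.60
or_, lo, hi = odds_ratio_ci(2.15, 0.60)
# -> roughly OR 8.6, 95% CI (2.6, 27.8)
```

The wide intervals in the abstract reflect the modest number of trials in each subgroup: the CI is symmetric on the log-odds scale but strongly skewed once exponentiated.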

Relevance: 30.00%

Abstract:

Molybdenum isotopes are increasingly widely applied in Earth Sciences. They are primarily used to investigate the oxygenation of Earth's ocean and atmosphere. However, more and more fields of application are being developed, such as magmatic and hydrothermal processes, planetary sciences or the tracking of environmental pollution. Here, we present a proposal for a unifying presentation of Mo isotope ratios in studies of mass-dependent isotope fractionation. We suggest that the δ98/95Mo of the NIST SRM 3134 be defined as +0.25‰. The rationale is that the vast majority of published data are presented relative to reference materials that are similar, but not identical, and that are all slightly lighter than NIST SRM 3134. Our proposed data presentation allows a direct first-order comparison of almost all old data with future work while referring to an international measurement standard. In particular, canonical δ98/95Mo values such as +2.3‰ for seawater and −0.7‰ for marine Fe–Mn precipitates can be kept for discussion. As recent publications show that the ocean molybdenum isotope signature is homogeneous, the IAPSO ocean water standard or any other open ocean water sample is suggested as a secondary measurement standard, with a defined δ98/95Mo value of +2.34 ± 0.10‰ (2s).
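
Re-expressing published δ values on a new reference scale, as the proposal above requires, uses the exact delta-scale conversion; the numbers below are hypothetical:

```python
def rescale_delta(delta_sample_vs_old, delta_old_vs_new):
    """Re-express a delta value measured against one reference material
    on the scale of another, using the exact (non-approximate) relation
    d(A/C) = d(A/B) + d(B/C) + d(A/B)*d(B/C)/1000, all values in per mil."""
    a, b = delta_sample_vs_old, delta_old_vs_new
    return a + b + a * b / 1000.0

# Hypothetical case: a sample measured at +2.00 permil against an in-house
# standard that itself sits at +0.34 permil on the target scale
converted = rescale_delta(2.00, 0.34)
```

For offsets of a few tenths of a per mil, the cross term contributes less than 0.001‰, which is why a simple additive shift is an adequate first-order comparison.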

Relevance: 30.00%

Abstract:

Traditionally, critical swimming speed has been defined as the speed at which a fish can no longer propel itself forward and is exhausted. To gain a better understanding of the metabolic processes at work during a Ucrit swim test, and that lead to fatigue, we developed a method using in vivo ³¹P-NMR spectroscopy in combination with a Brett-type swim tunnel. Our data showed that a metabolic transition point is reached when the fish change from using steady-state aerobic metabolism to non-steady-state anaerobic metabolism, as indicated by a significant increase in inorganic phosphate levels from 0.3 ± 0.3 to 9.5 ± 3.4 μmol g⁻¹ and a drop in intracellular pH from 7.48 ± 0.03 to 6.81 ± 0.05 in muscle. This coincides with the point at which the fish change gait from subcarangiform swimming to kick-and-glide bursts. As the number of kicks increased, so too did the Pi concentration, and the intracellular pH dropped; both changes were maximal at Ucrit. A significant drop in the Gibbs free energy change of ATP hydrolysis, from −55.6 ± 1.4 to −49.8 ± 0.7 kJ mol⁻¹, is argued to have been involved in fatigue. This confirms earlier findings that the traditional definition of Ucrit, unlike other critical points that are typically marked by a transition from aerobic to anaerobic metabolism, is the point of complete exhaustion of both aerobic and anaerobic resources.
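
The Gibbs free energy calculation follows ΔG = ΔG°′ + RT ln([ADP][Pi]/[ATP]). A sketch with hypothetical metabolite concentrations, showing that a rise in Pi makes ΔG less negative — the direction of the change reported above:

```python
import math

R = 8.314e-3  # gas constant, kJ mol^-1 K^-1

def atp_delta_g(dg0_prime, adp, pi, atp, temp_k=298.15):
    """Gibbs free energy change of ATP hydrolysis (kJ/mol) from the
    standard transformed value and free metabolite concentrations (mol/L):
    dG = dG0' + R*T*ln([ADP][Pi]/[ATP])."""
    return dg0_prime + R * temp_k * math.log(adp * pi / atp)

# Hypothetical muscle concentrations (mol/L); only the rise in Pi differs
rest     = atp_delta_g(-30.5, adp=20e-6, pi=0.3e-3, atp=8e-3)
fatigued = atp_delta_g(-30.5, adp=20e-6, pi=9.5e-3, atp=8e-3)
```

The concentrations and ΔG°′ here are illustrative, not the study's values; the point is that raising Pi roughly thirtyfold shrinks the magnitude of ΔG by several kJ/mol, the mechanism invoked for fatigue.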

Relevance: 30.00%

Abstract:

The importance of long-term historical information derived from paleoecological studies has long been recognized as a fundamental aspect of effective conservation. However, there remains some uncertainty regarding the extent to which paleoecology can inform on specific issues of high conservation priority, at the scale for which conservation policy decisions often take place. Here we review to what extent the past occurrence of three fundamental aspects of forest conservation can be assessed using paleoecological data, with a focus on northern Europe. These aspects are (1) tree species composition, (2) old/large trees and coarse woody debris, and (3) natural disturbances. We begin by evaluating the types of relevant historical information available from contemporary forests, then evaluate common paleoecological techniques, namely dendrochronology, pollen, macrofossil, charcoal, and fossil insect and wood analyses. We conclude that whereas contemporary forests can be used to estimate historical, natural occurrences of several of the aspects addressed here (e.g. old/large trees), paleoecological techniques are capable of providing much greater temporal depth, as well as robust quantitative data for tree species composition and fire disturbance, qualitative insights regarding old/large trees and woody debris, but limited indications of past windstorms and insect outbreaks. We also find that studies of fossil wood and paleoentomology are perhaps the most underutilized sources of information. Not only can paleoentomology provide species specific information, but it also enables the reconstruction of former environmental conditions otherwise unavailable. Despite the potential, the majority of conservation-relevant paleoecological studies primarily focus on describing historical forest conditions in broad terms and for large spatial scales, addressing former climate, land-use, and landscape developments, often in the absence of a specific conservation context. 
In contrast, relatively few studies address the most pressing conservation issues in northern Europe, which often require data on the presence or quantities of dead wood, large trees, or specific tree species at the scale of the stand or reserve. Furthermore, even fewer examples exist of detailed paleoecological data being used for conservation planning, or for setting operative restorative baseline conditions at local scales. If ecologists and conservation biologists are to benefit to the full extent possible from the ever-advancing techniques developed by the paleoecological sciences, further integration of these disciplines is desirable.

Relevance: 30.00%

Abstract:

Productive Epstein‐Barr virus (EBV) replication characterizes hairy leukoplakia, an oral epithelial lesion typically occurring in individuals infected with human immunodeficiency virus (HIV). Serial tongue biopsy specimens were obtained from HIV‐infected subjects before, during, and after valacyclovir treatment. EBV replication was detected by Southern hybridization to linear terminal EBV genome fragments, reverse‐transcriptase polymerase chain reaction amplification of EBV replicative gene transcripts, immunohistochemical detection of EBV replicative protein, and in situ hybridization to EBV DNA. EBV replication was detected in both hairy leukoplakia and normal tongue tissues. Valacyclovir treatment completely abrogated EBV replication in vivo, resulting in resolution of hairy leukoplakia when it was present. EBV replication returned in normal tongue epithelial cells after valacyclovir treatment. These data suggest that normal oral epithelium supports persistent EBV infection in individuals infected with HIV and that productive EBV replication is necessary but not sufficient for the pathogenesis of oral hairy leukoplakia.

Relevance: 30.00%

Abstract:

The causes of the glacial cycle remain unknown, although the primary driver is changes in atmospheric CO₂, likely controlled by the biological pump and biogeochemical cycles. The two most important regions of the ocean for exchange of CO₂ with the atmosphere are the equatorial Pacific and the Southern Ocean (SO), the former a net source and the latter a net sink under present conditions. The equatorial Pacific has been shown to be a Si(OH)₄-limited ecosystem, a consequence of the low source Si(OH)₄ concentrations in upwelled water that has its origin in the SO. This teleconnection for nutrients between the two regions suggests an oscillatory relationship that may influence or control glacial cycles. Opal mass accumulation rate (MAR) data and δ¹⁵N measurements in equatorial cores are interpreted with predictions from a one-dimensional Si(OH)₄-limited ecosystem model (CoSINE) for the equatorial Pacific. The results suggest that equatorial Pacific surface CO₂ processes are in opposite phase to that of the global atmosphere, providing a negative feedback to the glacial cycle. This negative feedback is implemented through the effect of the SO on the equatorial Si(OH)₄ supply. An alternative hypothesis, that the whole ocean becomes Si(OH)₄ poor during cooling periods, is suggested by low opal MAR in cores from both equatorial and Antarctic regions, perhaps as a result of low river input. Terminations in this scenario would result from blooms of coccolithophorids triggered by low Si(OH)₄ concentrations.