898 results for MULTIPLE TIME FORMALISM


Relevance: 30.00%

Abstract:

Mendelian models can predict who carries an inherited deleterious mutation of known disease genes based on family history. For example, the BRCAPRO model is commonly used to identify families who carry mutations of BRCA1 and BRCA2, based on familial breast and ovarian cancers. These models incorporate the age of diagnosis of diseases in relatives and current age or age of death. We develop a rigorous foundation for handling multiple diseases with censoring. We prove that any disease unrelated to mutations can be excluded from the model, unless it is sufficiently common and dependent on a mutation-related disease time. Furthermore, if a family member has a disease with higher probability density among mutation carriers, but the model does not account for it, then the carrier probability is deflated. However, even if a family only has diseases the model accounts for, if the model excludes a mutation-related disease, then the carrier probability will be inflated. In light of these results, we extend BRCAPRO to account for surviving all non-breast/ovary cancers as a single outcome. The extension also enables BRCAPRO to extract more useful information from male relatives. Using 1500 families from the Cancer Genetics Network, accounting for surviving other cancers improves BRCAPRO's concordance index from 0.758 to 0.762 (p = 0.046), improves its positive predictive value from 35% to 39% (p < 10^-6) without impacting its negative predictive value, and improves its overall calibration, although calibration slightly worsens for those with carrier probability < 10%.
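The carrier-probability logic this abstract describes can be illustrated with a minimal Bayes-rule sketch. The prior, the single-relative setup and all numbers below are hypothetical simplifications for illustration only; BRCAPRO itself works with allele frequencies and age-specific penetrance over whole pedigrees.

```python
# Minimal Bayes-rule sketch of a Mendelian carrier-probability model.
# Hypothetical numbers; real models such as BRCAPRO use mutation-allele
# frequencies and age-specific penetrance across the full pedigree.

def carrier_probability(prior, density_carrier, density_noncarrier):
    """P(carrier | observed phenotype) for a single individual.

    prior              -- P(carrier) before seeing the phenotype
    density_carrier    -- probability (density) of the observed disease
                          history given carrier status
    density_noncarrier -- the same quantity given non-carrier status
    """
    num = prior * density_carrier
    den = num + (1.0 - prior) * density_noncarrier
    return num / den

# A phenotype that is more likely among carriers raises the posterior ...
p_high = carrier_probability(0.01, 0.30, 0.05)
# ... and a model that ignores it (treating both densities as equal)
# deflates the carrier probability back to the prior, mirroring the
# deflation result stated in the abstract.
p_ignored = carrier_probability(0.01, 0.05, 0.05)
print(round(p_high, 4), round(p_ignored, 4))  # prints 0.0571 0.01
```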

Relevance: 30.00%

Abstract:

Visualization and exploratory analysis are an important part of any data analysis and are made more challenging when the data are voluminous and high-dimensional. One such example is environmental monitoring data, which are often collected over time and at multiple locations, resulting in a geographically indexed multivariate time series. Financial data, although not necessarily containing a geographic component, present another source of high-volume multivariate time series data. We present the mvtsplot function, which provides a method for visualizing multivariate time series data. We outline the basic design concepts and provide some examples of its usage by applying it to a database of ambient air pollution measurements in the United States and to a hypothetical portfolio of stocks.
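The core idea behind this kind of display can be sketched in a few lines: standardize each series, discretize it into ordered categories (e.g. low/medium/high), and stack the series as rows of a color-coded image. The Python below is an illustrative re-implementation of that idea, not the R mvtsplot code.

```python
# Sketch of an mvtsplot-style display pipeline: each series is reduced to
# quantile-based integer levels so many series can be shown as one image.
import numpy as np

def discretize_series(x, n_levels=3):
    """Map one time series to integer levels 0..n_levels-1 by quantile."""
    cuts = np.quantile(x, np.linspace(0, 1, n_levels + 1)[1:-1])
    return np.searchsorted(cuts, x, side="right")

def mvts_image(series):
    """Stack several discretized series (rows) into one image matrix."""
    return np.vstack([discretize_series(s) for s in series])

rng = np.random.default_rng(0)
img = mvts_image([rng.normal(size=50) for _ in range(4)])
# img can now be rendered with e.g. matplotlib.pyplot.imshow(img)
print(img.shape)
```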

Relevance: 30.00%

Abstract:

Numerous time series studies have provided strong evidence of an association between increased levels of ambient air pollution and increased levels of hospital admissions, typically at 0, 1, or 2 days after an air pollution episode. An important research aim is to extend existing statistical models so that a more detailed understanding of the time course of hospitalization after exposure to air pollution can be obtained. Information about this time course, combined with prior knowledge about biological mechanisms, could provide the basis for hypotheses concerning the mechanism by which air pollution causes disease. Previous studies have identified two important methodological questions: (1) How can we estimate the shape of the distributed lag between increased air pollution exposure and increased mortality or morbidity? and (2) How should we estimate the cumulative population health risk from short-term exposure to air pollution? Distributed lag models are appropriate tools for estimating air pollution health effects that may be spread over several days. However, estimation for distributed lag models in air pollution and health applications is hampered by the substantial noise in the data and the inherently weak signal that is the target of investigation. We introduce a hierarchical Bayesian distributed lag model that incorporates prior information about the time course of pollution effects and combines information across multiple locations. The model has a connection to penalized spline smoothing using a special type of penalty matrix. We apply the model to estimating the distributed lag between exposure to particulate matter air pollution and hospitalization for cardiovascular and respiratory disease using data from a large United States air pollution and hospitalization database of Medicare enrollees in 94 counties covering the years 1999-2002.
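A single-site, non-Bayesian sketch of a distributed lag fit helps make the connection to penalized smoothing concrete: regress the outcome on lagged exposures and penalize the roughness of the lag curve with a second-difference penalty matrix. The data and penalty weight below are invented; the paper's actual model is Bayesian and hierarchical across locations.

```python
# Penalized distributed-lag sketch: ridge regression on lagged exposures
# with a second-difference roughness penalty on the lag coefficients.
import numpy as np

def lag_matrix(x, max_lag):
    """Column j holds x lagged by j days (rows with missing lags dropped)."""
    n = len(x)
    return np.column_stack([x[max_lag - j : n - j] for j in range(max_lag + 1)])

def penalized_dlm(x, y, max_lag, lam=10.0):
    """Estimate lag coefficients under a second-difference penalty."""
    X = lag_matrix(x, max_lag)
    yy = y[max_lag:]
    k = max_lag + 1
    D = np.diff(np.eye(k), n=2, axis=0)      # second-difference penalty matrix
    return np.linalg.solve(X.T @ X + lam * D.T @ D, X.T @ yy)

rng = np.random.default_rng(1)
x = rng.normal(size=300)                     # synthetic daily exposure
true = np.array([0.5, 0.3, 0.1, 0.0])        # effect that dies out over lags
y = lag_matrix(x, 3) @ true + 0.1 * rng.normal(size=297)
y = np.concatenate([np.zeros(3), y])         # pad so x and y align
beta = penalized_dlm(x, y, 3)
```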

Relevance: 30.00%

Abstract:

Quantifying the health effects associated with simultaneous exposure to many air pollutants is now a research priority of the US EPA. Bayesian hierarchical models (BHM) have been extensively used in multisite time series studies of air pollution and health to estimate health effects of a single pollutant adjusted for potential confounding of other pollutants and other time-varying factors. However, when the scientific goal is to estimate the impacts of many pollutants jointly, a straightforward application of BHM is challenged by the need to specify a random-effect distribution on a high-dimensional vector of nuisance parameters, which often do not have an easy interpretation. In this paper we introduce a new BHM formulation, which we call "reduced BHM", aimed at analyzing clustered data sets in the presence of a large number of random effects that are not of primary scientific interest. At the first stage of the reduced BHM, we calculate the integrated likelihood of the parameter of interest (e.g. excess number of deaths attributed to simultaneous exposure to high levels of many pollutants). At the second stage, we specify a flexible random-effect distribution directly on the parameter of interest. The reduced BHM overcomes many of the challenges in the specification and implementation of full BHM in the context of a large number of nuisance parameters. In simulation studies we show that the reduced BHM performs comparably to the full BHM in many scenarios, and even performs better in some cases. Methods are applied to estimate location-specific and overall relative risks of cardiovascular hospital admissions associated with simultaneous exposure to elevated levels of particulate matter and ozone in 51 US counties during the period 1999-2005.
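The two-stage structure described here can be illustrated with a hedged sketch: stage 1 reduces each location to an estimate of the parameter of interest and its variance, and stage 2 places a random-effect distribution on that parameter alone. The DerSimonian-Laird moment estimator below is a standard stand-in for the paper's integrated-likelihood machinery, and every number is invented.

```python
# Two-stage sketch in the spirit of a "reduced BHM": per-location summaries
# in, overall effect and between-location variance out.
import numpy as np

def dersimonian_laird(est, var):
    """Combine stage-1 summaries with a random-effect on the parameter."""
    est, var = np.asarray(est, float), np.asarray(var, float)
    w = 1.0 / var
    fixed = np.sum(w * est) / np.sum(w)
    q = np.sum(w * (est - fixed) ** 2)            # heterogeneity statistic
    tau2 = max(0.0, (q - (len(est) - 1)) /
               (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    w_re = 1.0 / (var + tau2)                     # random-effects weights
    overall = np.sum(w_re * est) / np.sum(w_re)
    return overall, tau2

# Toy county-specific log relative risks and their variances.
ests = [0.8, 1.1, 1.4, 0.9, 1.3]
vars_ = [0.04, 0.05, 0.03, 0.06, 0.04]
overall, tau2 = dersimonian_laird(ests, vars_)
```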

Relevance: 30.00%

Abstract:

OBJECTIVES: In patients with a clinically isolated syndrome (CIS), the time interval to convert to clinically definite multiple sclerosis (CDMS) is highly variable. Individual and geographical prognostic factors remain to be determined. Whether anti-myelin antibodies may predict the risk of conversion to CDMS in Swiss CIS patients of the canton Berne was the subject of the study. METHODS: Anti-myelin oligodendrocyte glycoprotein and anti-myelin basic protein antibodies were determined prospectively in patients admitted to our department. RESULTS: After a mean follow-up of 12 months, none of nine antibody-negative, but 22 of 30 antibody-positive patients had progressed to CDMS. Beta-Interferon treatment delayed the time to conversion from a mean of 7.4 to 10.9 months. CONCLUSIONS: In a Swiss cohort, antibody-negative CIS patients have a favorable short-term prognosis, and antibody-positive patients benefit from early treatment.

Relevance: 30.00%

Abstract:

In the immature brain, hydrogen peroxide accumulates after excitotoxic hypoxia-ischemia and is neurotoxic. Immature hippocampal neurons were exposed to N-methyl-D-aspartate (NMDA), a glutamate agonist, and hydrogen peroxide (H2O2), and the effects of free radical scavenging and transition metal chelation on neurotoxicity were studied. alpha-Phenyl-N-tert-butylnitrone (PBN), a known superoxide scavenger, attenuated both H2O2- and NMDA-mediated toxicity. Treatment with desferrioxamine (DFX), an iron chelator, at the time of exposure to H2O2 was ineffective, but pretreatment was protective. DFX also protected against NMDA toxicity. TPEN, a metal chelator with higher affinities for a broad spectrum of transition metal ions, also protected against H2O2 toxicity but was ineffective against NMDA-induced toxicity. These data suggest that during exposure to free radicals and glutamate agonists, the presence of iron and other free metal ions contributes to neuronal cell death. In the immature nervous system this neuronal injury can be attenuated by free radical scavengers and metal chelators.

Relevance: 30.00%

Abstract:

OBJECTIVE: To test the feasibility of and interactions among three software-driven critical care protocols. DESIGN: Prospective cohort study. SETTING: Intensive care units in six European and American university hospitals. PATIENTS: 174 cardiac surgery and 41 septic patients. INTERVENTIONS: Application of software-driven protocols for cardiovascular management, sedation, and weaning during the first 7 days of intensive care. MEASUREMENTS AND RESULTS: All protocols were used simultaneously in 85% of the cardiac surgery and 44% of the septic patients, and any one of the protocols was used for 73% and 44% of the study duration, respectively. Protocol use was discontinued in 12% of patients by the treating clinician and in 6% for technical/administrative reasons. The number of protocol steps per unit of time was similar in the two diagnostic groups (n.s. for all protocols). Initial hemodynamic stability (a protocol target) was achieved in 26+/-18 min (mean+/-SD) in cardiac surgery and in 24+/-18 min in septic patients. Sedation targets were reached in 2.4+/-0.2 h in cardiac surgery and in 3.6+/-0.2 h in septic patients. The weaning protocol was started in 164 (94%; 154 extubated) cardiac surgery and in 25 (60%; 9 extubated) septic patients. The median time from starting weaning to extubation (a protocol target) was 89 min (interquartile range 44-154 min) for the cardiac surgery patients and 96 min (56-205 min) for the septic patients. CONCLUSIONS: Multiple software-driven treatment protocols can be simultaneously applied with high acceptance and rapid achievement of primary treatment goals. Time to reach these primary goals may provide a performance indicator.

Relevance: 30.00%

Abstract:

One of the original ocean-bottom time-lapse seismic studies was performed at the Teal South oil field in the Gulf of Mexico during the late 1990s. This work reexamines some aspects of previous work using modern analysis techniques to provide improved quantitative interpretations. Using three-dimensional volume visualization of legacy data and the two phases of post-production time-lapse data, I provide additional insight into the fluid migration pathways and the pressure communication between different reservoirs, separated by faults. This work supports a conclusion from previous studies that production from one reservoir caused regional pressure decline that in turn resulted in liberation of gas from multiple surrounding unproduced reservoirs. I also provide an explanation for unusual time-lapse changes in amplitude-versus-offset (AVO) data related to the compaction of the producing reservoir which, in turn, changed an isotropic medium to an anisotropic medium. In the first part of this work, I examine regional changes in seismic response due to the production of oil and gas from one reservoir. The previous studies primarily used two post-production ocean-bottom surveys (Phase I and Phase II), and not the legacy streamer data, due to the unavailability of legacy prestack data and very different acquisition parameters. In order to incorporate the legacy data in the present study, all three poststack data sets were cross-equalized and examined using instantaneous amplitude and energy volumes. This approach appears quite effective and helps to suppress changes unrelated to production while emphasizing those large-amplitude changes that are related to production in this noisy (by current standards) suite of data. I examine the multiple data sets first by using the instantaneous amplitude and energy attributes, and then also examine specific apparent time-lapse changes through direct comparisons of seismic traces.
In so doing, I identify time-delays that, when corrected for, indicate water encroachment at the base of the producing reservoir. I also identify specific sites of leakage from various unproduced reservoirs, the result of regional pressure blowdown as explained in previous studies; those earlier studies, however, were unable to identify direct evidence of fluid movement. Of particular interest is the identification of one site where oil apparently leaked from one reservoir into a “new” reservoir that did not originally contain oil, but was ideally suited as a trap for fluids leaking from the neighboring spill-point. With continued pressure drop, oil in the new reservoir increased as more oil entered into the reservoir and expanded, liberating gas from solution. Because of the limited volume available for oil and gas in that temporary trap, oil and gas also escaped from it into the surrounding formation. I also note that some of the reservoirs demonstrate time-lapse changes only in the “gas cap” and not in the oil zone, even though gas must be coming out of solution everywhere in the reservoir. This is explained by interplay between pore-fluid modulus reduction by gas saturation decrease and dry-frame modulus increase by frame stiffening. In the second part of this work, I examine various rock-physics models in an attempt to quantitatively account for frame-stiffening that results from reduced pore-fluid pressure in the producing reservoir, searching for a model that would predict the unusual AVO features observed in the time-lapse prestack and stacked data at Teal South. While several rock-physics models are successful at predicting the time-lapse response for initial production, most fail to match the observations for continued production between Phase I and Phase II. 
Because the reservoir was initially overpressured and unconsolidated, reservoir compaction was likely significant, and is probably accomplished largely by uniaxial strain in the vertical direction; this implies that an anisotropic model may be required. Using Walton’s model for anisotropic unconsolidated sand, I successfully model the time-lapse changes for all phases of production. This observation may be of interest for application to other unconsolidated overpressured reservoirs under production.

Relevance: 30.00%

Abstract:

This thesis develops high-performance real-time signal processing modules for direction of arrival (DOA) estimation for localization systems. It proposes highly parallel algorithms for performing subspace decomposition and polynomial rooting, which are otherwise traditionally implemented using sequential algorithms. The proposed algorithms address the emerging need for real-time localization for a wide range of applications. As the antenna array size increases, the complexity of signal processing algorithms increases, making it increasingly difficult to satisfy the real-time constraints. This thesis addresses real-time implementation by proposing parallel algorithms that maintain considerable improvement over traditional algorithms, especially for systems with a larger number of antenna array elements. Singular value decomposition (SVD) and polynomial rooting are two computationally complex steps and act as the bottleneck to achieving real-time performance. The proposed algorithms are suitable for implementation on field-programmable gate arrays (FPGAs), single instruction multiple data (SIMD) hardware or application-specific integrated circuits (ASICs), which offer a large number of processing elements that can be exploited for parallel processing. The designs proposed in this thesis are modular, easily expandable and easy to implement. Firstly, this thesis proposes a fast-converging SVD algorithm. The proposed method reduces the number of iterations it takes to converge to the correct singular values, thus achieving closer to real-time performance. A general algorithm and a modular system design are provided, making it easy for designers to replicate and extend the design to larger matrix sizes. Moreover, the method is highly parallel, which can be exploited on the various hardware platforms mentioned earlier. A fixed-point implementation of the proposed SVD algorithm is presented.
The FPGA design is pipelined to the maximum extent to increase the maximum achievable frequency of operation. The system was developed with the objective of achieving high throughput. Various modern cores available in FPGAs were used to maximize performance, and these modules are described in detail. Finally, a parallel polynomial rooting technique based on Newton's method, applicable exclusively to root-MUSIC polynomials, is proposed. Unique characteristics of the root-MUSIC polynomial's complex dynamics were exploited to derive this polynomial rooting method. The technique exhibits parallelism and converges to the desired roots within a fixed number of iterations, making it suitable for polynomial rooting of large-degree polynomials. We believe this is the first time that the complex dynamics of the root-MUSIC polynomial have been analyzed to propose an algorithm. In all, the thesis addresses two major bottlenecks in a direction of arrival estimation system by providing simple, high-throughput, parallel algorithms.
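As a rough illustration of the rooting step, the sketch below runs vectorized Newton iterations from several complex starting points at once; each point updates independently, which is what makes this style of rooting naturally parallel. The toy polynomial and starting points are invented, not taken from the thesis (root-MUSIC polynomials instead have conjugate-reciprocal root pairs about the unit circle).

```python
# Vectorized Newton iteration for polynomial rooting: every starting point
# iterates independently, so the updates map directly onto parallel hardware.
import numpy as np

def newton_roots(coeffs, starts, iters=50):
    """Run Newton iterations from several complex starting points at once."""
    p = np.poly1d(coeffs)
    dp = p.deriv()
    z = np.asarray(starts, dtype=complex)
    for _ in range(iters):
        z = z - p(z) / dp(z)          # one Newton step for all points at once
    return z

# Toy quartic with roots at +/-0.5 and +/-2j.
coeffs = np.poly([0.5, -0.5, 2j, -2j])
starts = [0.45 + 0.05j, -0.45 - 0.05j, 0.1 + 1.8j, -0.1 - 1.8j]
roots = newton_roots(coeffs, starts)
```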

Relevance: 30.00%

Abstract:

Hypothesis: Early recognition of coagulopathy may improve the care of patients with multiple injuries. Rapid thrombelastography (RapidTEG) is a new variant of thrombelastography (TEG) in which coagulation is initiated by the addition of protein tissue factor. The kinetics of coagulation and the measurement times were compared for two variants of TEG: RapidTEG and conventional TEG, in which coagulation was initiated with kaolin. The measurements were performed on blood samples from 20 patients with multiple injuries. The RapidTEG results were also compared with conventional measurements of blood coagulation. The mean time for the RapidTEG test was 19.2 +/- 3.1 minutes (mean +/- SD), compared with 29.9 +/- 4.3 minutes for kaolin TEG and 34.1 +/- 14.5 minutes for conventional coagulation tests. Measured from admission of the patients to the resuscitation bay until the results were available, the mean time for RapidTEG was 30.8 +/- 5.72 minutes, compared with 41.5 +/- 5.66 minutes for kaolin TEG and 64.9 +/- 18.8 minutes for conventional coagulation tests. There were significant correlations between the RapidTEG results and those from kaolin TEG and conventional coagulation tests. RapidTEG is the most rapid available test for providing reliable information on coagulopathy in patients with multiple injuries. This has implications for improving patient care.

Relevance: 30.00%

Abstract:

Eight premature infants ventilated for hyaline membrane disease and enrolled in the OSIRIS surfactant trial were studied. Lung mechanics, gas exchange [PaCO2, arterial/alveolar PO2 ratio (a/A ratio)], and ventilator settings were determined 20 minutes before and 20 minutes after the end of Exosurf instillation, and subsequently at 12-24 hour intervals. Respiratory system compliance (Crs) and resistance (Rrs) were measured by means of the single-breath occlusion method. After surfactant instillation there were no significant immediate changes in PaCO2 (36 vs. 37 mmHg), a/A ratio (0.23 vs. 0.20), Crs (0.32 vs. 0.31 mL/cmH2O/kg), and Rrs (0.11 vs. 0.16 cmH2O/mL/s) (pooled data of 18 measurement pairs). During the clinical course, mean a/A ratio improved significantly each time, from 0.17 (time 0) to 0.29 (time 12-13 hours), to 0.39 (time 24-36 hours) and to 0.60 (time 48-61 hours), although mean airway pressure was reduced substantially. Mean Crs increased significantly from 0.28 mL/cmH2O/kg (time 0) to 0.38 (time 12-13 hours), to 0.37 (time 24-38 hours), and to 0.52 (time 48-61 hours), whereas mean Rrs increased from 0.10 cmH2O/mL/s (time 0) to 0.11 (time 12-13 hours), to 0.13 (time 24-36 hours) and to (time 48-61 hours), with no overall significance. A highly significant correlation was found between Crs and a/A ratio (r = 0.698, P < 0.001). We conclude that Exosurf does not induce immediate changes in oxygenation as does the instillation of (modified) natural surfactant preparations. However, after 12 and 24 hours of treatment, oxygenation and Crs improve significantly.

Relevance: 30.00%

Abstract:

A combinatorial protocol (CP) is introduced here and interfaced with multiple linear regression (MLR) for variable selection. The efficiency of CP-MLR rests primarily on restricting the entry of correlated variables at the model-development stage. It has been used for the analysis of the Selwood et al. data set [16], and the resulting models are compared with those reported from the GFA [8] and MUSEUM [9] approaches. For this data set, CP-MLR identified three highly independent models (27, 28 and 31) with Q2 values in the range 0.518-0.632. These models are also divergent and unique. Although the present study does not share any models with the GFA [8] and MUSEUM [9] results, several descriptors are common to all these studies, including the present one. A simulation is also carried out on the same data set to explain model formation in CP-MLR. The results demonstrate that the proposed method should be able to offer solutions for data sets with 50 to 60 descriptors in a reasonable time frame. By carefully selecting the inter-parameter correlation cutoff values in CP-MLR, one can identify divergent models and handle data sets larger than the present one without excessive computer time.
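The "restriction of entry of correlated variables" can be illustrated with a simple sketch: a greedy forward MLR selection that rejects any candidate descriptor whose correlation with an already-selected descriptor exceeds a cutoff. This greedy loop is a stand-in for the combinatorial protocol itself, and the data are synthetic.

```python
# Greedy sketch of the CP-MLR idea: descriptors enter a multiple linear
# regression one at a time, filtered by an inter-descriptor correlation cutoff.
import numpy as np

def cp_mlr_select(X, y, r_cutoff=0.5, max_vars=3):
    """Forward selection of descriptor columns with a correlation filter."""
    n, p = X.shape
    selected = []
    for _ in range(max_vars):
        best, best_r2 = None, -np.inf
        for j in range(p):
            if j in selected:
                continue
            # restriction of entry: skip descriptors correlated with the model
            if any(abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) > r_cutoff
                   for k in selected):
                continue
            A = np.column_stack([np.ones(n), X[:, selected + [j]]])
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            resid = y - A @ beta
            r2 = 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
            if r2 > best_r2:
                best, best_r2 = j, r2
        if best is None:
            break
        selected.append(best)
    return selected

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 8))
X[:, 1] = X[:, 0] + 0.05 * rng.normal(size=100)   # near-duplicate descriptor
y = 2 * X[:, 0] + X[:, 3] + 0.1 * rng.normal(size=100)
chosen = cp_mlr_select(X, y)
```

The near-duplicate column 1 is blocked once column 0 (or 1) enters, so the selected model never contains both, mirroring how the cutoff keeps the resulting models divergent.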

Relevance: 30.00%

Abstract:

In order to predict which ecosystem functions are most at risk from biodiversity loss, meta-analyses have generalised results from biodiversity experiments over different sites and ecosystem types. In contrast, comparing the strength of biodiversity effects across a large number of ecosystem processes measured in a single experiment permits more direct comparisons. Here, we present an analysis of 418 separate measures of 38 ecosystem processes. Overall, 45% of processes were significantly affected by plant species richness, suggesting that, while diversity affects a large number of processes, not all respond to biodiversity. We therefore compared the strength of plant diversity effects between different categories of ecosystem processes, grouping processes according to the year of measurement, their biogeochemical cycle, trophic level and compartment (above- or belowground) and according to whether they were measures of biodiversity or other ecosystem processes, biotic or abiotic and static or dynamic. Overall, and for several individual processes, we found that biodiversity effects became stronger over time. Measures of the carbon cycle were also affected more strongly by plant species richness than were the measures associated with the nitrogen cycle. Further, we found greater plant species richness effects on measures of biodiversity than on other processes. The differential effects of plant diversity on the various types of ecosystem processes indicate that future research and political effort should shift from a general debate about whether biodiversity loss impairs ecosystem functions to focussing on the specific functions of interest and ways to preserve them individually or in combination.

Relevance: 30.00%

Abstract:

Enzootic pneumonia (EP) of pigs, caused by Mycoplasma hyopneumoniae, has been a notifiable disease in Switzerland since May 2003. The diagnosis of EP has been based on multiple methods, including clinical, bacteriological and epidemiological findings as well as pathological examination of lungs (mosaic diagnosis). With the recent development of a real-time PCR (rtPCR) assay with two target sequences, a new detection method for M. hyopneumoniae became available. This assay was tested for its applicability to nasal swab material from live animals. Pigs from 74 herds (on average 10 pigs per herd) were tested. Using the mosaic diagnosis, 22 herds were classified as EP positive and 52 as EP negative. From the 730 collected swab samples we were able to demonstrate that the rtPCR test was 100% specific. In cases of cough, the sensitivity of the rtPCR at herd level is 100%. At the single-animal level and in herds without cough, the sensitivity was lower. In such cases, only a positive result would be proof of an infection with M. hyopneumoniae. Our study shows that the rtPCR on nasal swabs from live pigs allows a fast and accurate diagnosis in cases of suspected EP.
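The herd-level sensitivity and specificity figures quoted above follow from a standard 2x2 comparison of rtPCR results against the mosaic-diagnosis reference. The sketch below shows that calculation; the herd-level counts are illustrative reconstructions consistent with the reported 22 positive and 52 negative herds, not the study's raw data.

```python
# Sensitivity/specificity from paired (test result, reference result) labels.

def sens_spec(results):
    """results: list of (test_positive, truly_positive) booleans per herd."""
    tp = sum(t and d for t, d in results)
    tn = sum((not t) and (not d) for t, d in results)
    fp = sum(t and (not d) for t, d in results)
    fn = sum((not t) and d for t, d in results)
    sensitivity = tp / (tp + fn) if tp + fn else float("nan")
    specificity = tn / (tn + fp) if tn + fp else float("nan")
    return sensitivity, specificity

# Toy herd-level data: all 22 EP-positive herds detected and no false
# positives among the 52 negative herds reproduces 100% for both measures.
herds = [(True, True)] * 22 + [(False, False)] * 52
print(sens_spec(herds))  # prints (1.0, 1.0)
```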

Relevance: 30.00%

Abstract:

This paper addresses the problem of service development based on GSM handset signaling. The aim is to achieve this goal without the participation of the users, which requires the use of a passive GSM receiver on the uplink. Since no tool for GSM uplink capturing was available, we developed a new method that can synchronize to multiple mobile devices by simply overhearing traffic between them and the network. Our work includes the implementation of modules for signal recovery, message reconstruction and parsing. The method has been validated against a benchmark solution on the GSM downlink and independently evaluated on uplink channels. Initial evaluations show a success rate of up to 99% in message decoding, which is a very promising result. Moreover, we conducted measurements that reveal insights into the impact of signal power on capturing performance and investigate possible reactive measures.