969 results for Interval Data


Relevance: 30.00%

Abstract:

Since the end of the 19th century, geodesy has contributed greatly to the knowledge of regional tectonics and fault movement through its ability to measure, at sub-centimetre precision, the relative positions of points on the Earth's surface. Nowadays, the systematic analysis of geodetic measurements in actively deforming regions therefore represents one of the most important tools in the study of crustal deformation over different temporal scales [e.g., Dixon, 1991]. This dissertation focuses on motion that can be observed geodetically with classical terrestrial position measurements, particularly triangulation and leveling observations. The work is divided into two sections: an overview of the principal methods for estimating long-term accumulation of elastic strain from terrestrial observations, and an overview of the principal methods for rigorously inverting surface coseismic deformation fields for source geometry, with tests on synthetic deformation data sets and applications in two tectonically active regions of the Italian peninsula. For the analysis of long-term accumulation of elastic strain, triangulation data were available from a geodetic network across the Messina Straits area (southern Italy) for the period 1971-2004. From the resulting angle changes, the shear strain rates as well as the orientation of the principal axes of the strain rate tensor were estimated. The computed average annual shear strain rates for 1971-2004 are γ̇1 = 113.89 ± 54.96 nanostrain/yr and γ̇2 = -23.38 ± 48.71 nanostrain/yr, with the orientation of the most extensional strain (θ) at N140.80° ± 19.55°E. These results suggest that the first-order strain field of the area is dominated by extension perpendicular to the trend of the Straits, supporting the hypothesis that the Messina Straits could represent an area of active concentrated deformation.
The orientation of θ agrees well with GPS deformation estimates calculated over a shorter time interval, is consistent with previous preliminary GPS estimates [D'Agostino and Selvaggi, 2004; Serpelloni et al., 2005], and is similar to the direction of the slip vector of the 1908 (MW 7.1) earthquake [e.g., Boschi et al., 1989; Valensise and Pantosti, 1992; Pino et al., 2000; Amoruso et al., 2002]. Thus, the measured strain rate can be attributed to active extension across the Messina Straits, corresponding to a relative extension rate between <1 mm/yr and ~2 mm/yr within the portion of the Straits covered by the triangulation network. These results are consistent with the hypothesis that the Messina Straits is an important active geological boundary between the Sicilian and Calabrian domains, and they support previous preliminary GPS-based estimates of strain rates across the Straits, which show that the active deformation is distributed over a larger area. Finally, preliminary dislocation modelling has shown that, although the current geodetic measurements do not resolve the geometry of the dislocation models, they do resolve the rate of interseismic strain accumulation across the Messina Straits and give useful information about the locking depth of the shear zone. Geodetic data, triangulation and leveling measurements of the 1976 Friuli (NE Italy) earthquake, were available for the inversion of coseismic source parameters. From the observed angle and elevation changes, the source parameters of the seismic sequence were estimated in a joint inversion using an algorithm called "simulated annealing". The computed optimal uniform-slip elastic dislocation model consists of a 30° north-dipping shallow (depth 1.30 ± 0.75 km) fault plane with an azimuth of 273°, accommodating reverse dextral slip of about 1.8 m.
The hypocentral location and inferred fault plane of the main event are consistent with the activation of Periadriatic overthrusts or related thrust faults such as the Gemona-Kobarid thrust. The geodetic data set therefore excludes the source solutions of Aoudia et al. [2000], Peruzza et al. [2002] and Poli et al. [2002], which consider the Susans-Tricesimo thrust as the source of the May 6 event. The best-fit source model is more consistent with the solution of Pondrelli et al. [2001], which proposed the activation of other thrusts located further north of the Susans-Tricesimo thrust, probably on Periadriatic-related thrust faults. The main characteristics of the leveling and triangulation data are fit by the optimal single-fault model; that is, these results are consistent with a first-order rupture process characterized by progressive rupture of a single fault system. A single uniform-slip fault model, however, does not reproduce some minor complexities of the observations, and some residual signals remain unmodelled by the optimal single-fault-plane solution. In particular, the single-fault-plane model does not reproduce some minor features of the leveling deformation field along route 36 south of the main uplift peak, so a second fault seems necessary to reproduce these residual signals. By assuming movement along a mapped thrust located south of the inferred optimal single-plane solution, the residual signal has been successfully modelled. In summary, the inversion results presented in this thesis are consistent with the activation of Periadriatic-related thrusts for the main events of the sequence, and with a minor role for the southward thrust systems of the middle Tagliamento plain.
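The joint inversion described above uses simulated annealing to search the fault-parameter space. The following minimal sketch shows the generic algorithm on a toy two-parameter misfit function; the objective, step size, and cooling schedule are illustrative stand-ins, not the settings used in the thesis.

```python
import math
import random

def simulated_annealing(objective, x0, step=0.5, t0=1.0, cooling=0.995,
                        n_iter=5000, seed=1):
    """Minimise `objective` by simulated annealing (toy sketch).

    A candidate is drawn near the current model; downhill moves are always
    accepted, uphill moves with probability exp(-delta / T), and the
    temperature T decays geometrically each iteration.
    """
    rng = random.Random(seed)
    x, fx = list(x0), objective(x0)
    best, fbest = list(x), fx
    t = t0
    for _ in range(n_iter):
        cand = [xi + rng.uniform(-step, step) for xi in x]
        fc = objective(cand)
        # accept improvements outright, worse moves with Boltzmann probability
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
        t *= cooling
    return best, fbest

# Toy misfit with a minimum at (2, -1); in the thesis the objective would be
# the residual between observed and model-predicted angle and elevation
# changes over the fault parameters (dip, depth, azimuth, slip).
obj = lambda p: (p[0] - 2.0) ** 2 + (p[1] + 1.0) ** 2
best, fbest = simulated_annealing(obj, [0.0, 0.0])
```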

Relevance: 30.00%

Abstract:

Data Distribution Management (DDM) is a component of the High Level Architecture standard. Its task is to detect overlaps between update and subscription extents efficiently. This thesis discusses the need for a framework and the reasons it was implemented. Testing algorithms under fair conditions, libraries that ease the implementation of new algorithms, and automation of the build phase were the fundamental motivations for starting work on the framework. The driving motivation was that, in surveying the scientific literature on DDM and its algorithms, it emerged that every paper generated its own ad hoc data for testing. A further goal of the framework is therefore to compare the algorithms on a consistent data set. The framework was tested on the cloud to obtain a more reliable comparison between runs by different users. Two of the most widely used services were considered: Amazon AWS EC2 and Google App Engine. The advantages and disadvantages of each are presented, along with the reasons for choosing Google App Engine. Four algorithms were developed: Brute Force, Binary Partition, Improved Sort, and Interval Tree Matching. Tests were run on execution time and peak memory usage. The results show that Interval Tree Matching and Improved Sort are the most efficient. All tests were performed on the sequential versions of the algorithms, so a further reduction in execution time is still possible for the Interval Tree Matching algorithm.
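For context, the Brute Force baseline mentioned above amounts to an all-pairs overlap test between update and subscription extents. The sketch below assumes one-dimensional extents represented as (lo, hi) pairs; the multidimensional HLA case tests each dimension in the same way.

```python
def brute_force_matching(updates, subscriptions):
    """All-pairs overlap test between update and subscription extents.

    Extents are (lo, hi) intervals on one dimension; two extents overlap
    when neither ends before the other begins. O(n * m) comparisons --
    the baseline the other DDM algorithms improve on.
    """
    return {(i, j)
            for i, (ulo, uhi) in enumerate(updates)
            for j, (slo, shi) in enumerate(subscriptions)
            if ulo <= shi and slo <= uhi}

updates = [(0, 4), (6, 9)]
subscriptions = [(3, 7), (10, 12)]
# update 0 overlaps subscription 0 on [3, 4]; update 1 overlaps it on [6, 7]
matches = brute_force_matching(updates, subscriptions)
```

Improved Sort and Interval Tree Matching reach the same match set with fewer comparisons by exploiting the ordering of the interval endpoints.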

Relevance: 30.00%

Abstract:

BACKGROUND: Atrial fibrillation (AF) is a significant risk factor for cardiovascular (CV) mortality. This study aims to evaluate the prognostic implication of AF in patients with peripheral arterial disease (PAD). METHODS: The International Reduction of Atherothrombosis for Continued Health (REACH) Registry included 23,542 outpatients in Europe with established coronary artery disease, cerebrovascular disease (CVD), PAD and/or ≥3 risk factors. Of these, 3753 patients had symptomatic PAD. CV risk factors were determined at baseline. The study end point was a combination of cardiac death, non-fatal myocardial infarction (MI) and stroke (CV events) during 2 years of follow-up. Cox regression analysis adjusted for age, gender and other risk factors (i.e., congestive heart failure, coronary artery revascularisation, coronary artery bypass grafting (CABG), MI, hypertension, stroke, current smoking and diabetes) was used. RESULTS: Of 3753 PAD patients, 392 (10%) were known to have AF. Patients with AF were older and had a higher prevalence of CVD, diabetes and hypertension. Long-term CV mortality occurred in 5.6% of patients with AF and in 1.6% of those without AF (p<0.001). Multivariable analyses showed that AF was an independent predictor of late CV events (hazard ratio (HR): 1.5; 95% confidence interval (CI): 1.09-2.0). CONCLUSION: AF is common in European patients with symptomatic PAD and is independently associated with a worse 2-year CV outcome.
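The hazard ratio and confidence interval quoted above come from a Cox model; the back-transformation from a log-hazard coefficient and its standard error to an HR with 95% CI is standard. The sketch below uses illustrative numbers chosen to land near HR 1.5 with a CI of roughly 1.1-2.0; they are not the actual REACH model output.

```python
import math

def hazard_ratio_ci(beta, se, z=1.96):
    """Back-transform a Cox log-hazard coefficient `beta` and its standard
    error `se` into a hazard ratio with a 95% confidence interval.
    """
    return (math.exp(beta),            # point estimate
            math.exp(beta - z * se),   # lower 95% limit
            math.exp(beta + z * se))   # upper 95% limit

# Illustrative values: beta = log(1.5) and an SE that widens the CI
hr, lo, hi = hazard_ratio_ci(beta=0.405, se=0.155)
```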

Relevance: 30.00%

Abstract:

Background There is an ongoing debate as to whether combined antiretroviral treatment (cART) during pregnancy is an independent risk factor for prematurity in HIV-1-infected women. Objective The aim of the study was to examine (1) crude effects of different ART regimens on prematurity, (2) the association between duration of cART and duration of pregnancy, and (3) the role of possibly confounding risk factors for prematurity. Method We analysed data from 1180 pregnancies prospectively collected by the Swiss Mother and Child HIV Cohort Study (MoCHiV) and the Swiss HIV Cohort Study (SHCS). Results Odds ratios for prematurity in women receiving mono/dual therapy and cART were 1.8 [95% confidence interval (CI) 0.85–3.6] and 2.5 (95% CI 1.4–4.3) compared with women not receiving ART during pregnancy (P=0.004). In a subgroup of 365 pregnancies with comprehensive information on maternal clinical, demographic and lifestyle characteristics, there was no indication that maternal viral load, age, ethnicity or history of injecting drug use affected prematurity rates associated with the use of cART. Duration of cART before delivery was also not associated with duration of pregnancy. Conclusion Our study indicates that confounding by maternal risk factors or duration of cART exposure is not a likely explanation for the effects of ART on prematurity in HIV-1-infected women.

Relevance: 30.00%

Abstract:

When estimating the effect of treatment on HIV using data from observational studies, standard methods may produce biased estimates due to the presence of time-dependent confounders. Such confounding can be present when a covariate, affected by past exposure, is a predictor of both the future exposure and the outcome. One example is the CD4 cell count, which is a marker of disease progression in HIV patients but also a marker for treatment initiation, and is itself influenced by treatment. Fitting a marginal structural model (MSM) using inverse probability weights is one way to adjust appropriately for this type of confounding. In this paper we study a simple and intuitive approach to estimating similar treatment effects, using observational data to mimic several randomized controlled trials. Each 'trial' is constructed from the individuals starting treatment in a certain time interval. An overall effect estimate for all such trials is found using composite likelihood inference. The method offers an alternative to the use of inverse probability of treatment weights, which is unstable in certain situations. The estimated parameter is not identical to that of an MSM; it is conditional on covariate values at the start of each mimicked trial. This allows the study of questions that are not easily addressed by fitting an MSM. The analysis can be performed as a stratified weighted Cox analysis on the joint data set of all the constructed trials, where each trial is one stratum. The model is applied to data from the Swiss HIV cohort study.
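The trial-construction step described above can be sketched as follows, assuming each patient record carries a hypothetical `treat_start` field (time of treatment initiation, or None if never treated): for each calendar window, patients initiating treatment inside the window form the treated arm, and patients still untreated by the end of the window form the comparison arm.

```python
def build_mimicked_trials(patients, intervals):
    """Construct one mimicked 'trial' per calendar interval (sketch).

    For the window [a, b), the treated arm is everyone starting treatment
    in that window; the comparison arm is everyone not starting before b.
    One patient can appear, in different roles, in several trials, which
    is why the pooled Cox analysis is stratified by trial.
    """
    trials = []
    for a, b in intervals:
        treated = [p for p in patients
                   if p["treat_start"] is not None and a <= p["treat_start"] < b]
        control = [p for p in patients
                   if p["treat_start"] is None or p["treat_start"] >= b]
        trials.append({"window": (a, b), "treated": treated, "control": control})
    return trials

patients = [
    {"id": 1, "treat_start": 2},     # starts in the first window
    {"id": 2, "treat_start": 7},     # starts in the second window
    {"id": 3, "treat_start": None},  # never treated: control in both trials
]
trials = build_mimicked_trials(patients, [(0, 5), (5, 10)])
```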

Relevance: 30.00%

Abstract:

Standard methods for the estimation of the postmortem interval (PMI, time since death), based on the cooling of the corpse, are limited to about 48 h after death. As an alternative, noninvasive postmortem observation of alterations of brain metabolites by means of (1)H MRS has been suggested for estimation of the PMI at room temperature, so far without including the effect of other ambient temperatures. In order to study the temperature effect, localized (1)H MRS was used to follow brain decomposition in a sheep brain model at four different temperatures between 4 and 26°C, with repeated measurements up to 2100 h postmortem. The simultaneous determination of 25 different biochemical compounds at each measurement allowed the time courses of concentration changes to be followed. A sudden and almost simultaneous change of the concentrations of seven compounds was observed after a time span that decreased exponentially from 700 h at 4°C to 30 h at 26°C ambient temperature. As this most probably represents the onset of highly variable bacterial decomposition, and thus defines the upper limit for a reliable PMI estimation, data were analyzed only up to this start of bacterial decomposition. Thirteen compounds showed unequivocal, reproducible concentration changes during this period, while eight showed a linear increase with a slope that was unambiguously related to ambient temperature. Therefore, a single analytical function with PMI and temperature as variables can describe the time courses of the metabolite concentrations. Using the inverse of this function, metabolite concentrations determined from a single MR spectrum can be used, together with the known ambient temperature, to calculate the PMI of a corpse. It is concluded that the effect of ambient temperature can be reliably included in the PMI determination by (1)H MRS.
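The inversion step described in the final sentences can be sketched for a single metabolite with a linear time course: if c(t) = c0 + k(T)·t, then PMI = (c - c0) / k(T). The initial concentration c0 and the temperature-dependent slope below are hypothetical calibration values, not the fitted parameters of the sheep-brain study.

```python
def estimate_pmi(concentration, c0, slope_at_temp):
    """Invert a linear metabolite model c(t) = c0 + k(T) * t for the
    postmortem interval t (sketch with hypothetical calibration values).
    """
    if slope_at_temp == 0:
        raise ValueError("slope must be nonzero to invert the model")
    return (concentration - c0) / slope_at_temp

# E.g. a metabolite rising 0.05 mM per hour at the known ambient
# temperature, measured at 3.5 mM from an initial 1.0 mM -> PMI of 50 h
pmi_hours = estimate_pmi(concentration=3.5, c0=1.0, slope_at_temp=0.05)
```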

Relevance: 30.00%

Abstract:

Background Identifying modifiable factors that increase women's vulnerability to HIV is a critical step in developing effective female-initiated prevention interventions. The primary objective of this study was to pool individual participant data from prospective longitudinal studies to investigate the association between intravaginal practices and acquisition of HIV infection among women in sub-Saharan Africa. Secondary objectives were to investigate associations between intravaginal practices and disrupted vaginal flora, and between disrupted vaginal flora and HIV acquisition. Methods and Findings We conducted a meta-analysis of individual participant data from 13 prospective cohort studies involving 14,874 women, of whom 791 acquired HIV infection during 21,218 woman-years of follow-up. Data were pooled using random-effects meta-analysis. The level of between-study heterogeneity was low in all analyses (I² values 0.0%–16.1%). Intravaginal use of cloth or paper (pooled adjusted hazard ratio [aHR] 1.47, 95% confidence interval [CI] 1.18–1.83), insertion of products to dry or tighten the vagina (aHR 1.31, 95% CI 1.00–1.71), and intravaginal cleaning with soap (aHR 1.24, 95% CI 1.01–1.53) remained associated with HIV acquisition after controlling for age, marital status, and number of sex partners in the past 3 months. Intravaginal cleaning with soap was also associated with the development of intermediate vaginal flora and bacterial vaginosis in women with normal vaginal flora at baseline (pooled adjusted odds ratio [OR] 1.24, 95% CI 1.04–1.47). Use of cloth or paper was not associated with the development of disrupted vaginal flora. Intermediate vaginal flora and bacterial vaginosis were each associated with HIV acquisition in multivariable models when measured at baseline (aHR 1.54 and 1.69, p<0.001) or at the visit before the estimated date of HIV infection (aHR 1.41 and 1.53, p<0.001), respectively.
Conclusions This study provides evidence to suggest that some intravaginal practices increase the risk of HIV acquisition, but a direct causal pathway linking intravaginal cleaning with soap, disruption of vaginal flora, and HIV acquisition has not yet been demonstrated. More consistency in the definition and measurement of specific intravaginal practices is warranted so that the effects of specific intravaginal practices and products can be further elucidated.
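The pooling of per-study hazard ratios described above is typically done on the log scale. A minimal DerSimonian-Laird random-effects sketch, with illustrative inputs rather than the study data, looks like this:

```python
import math

def dersimonian_laird(log_effects, variances):
    """Random-effects pooling of per-study log effect sizes
    (DerSimonian-Laird moment estimator, sketch)."""
    w = [1.0 / v for v in variances]
    sw = sum(w)
    fixed = sum(wi * yi for wi, yi in zip(w, log_effects)) / sw
    # Cochran's Q measures excess dispersion around the fixed-effect mean
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_effects))
    df = len(log_effects) - 1
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - df) / c)  # between-study variance, truncated at 0
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, log_effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se, tau2

# Three hypothetical studies reporting log hazard ratios and their variances
pooled, se, tau2 = dersimonian_laird([0.30, 0.45, 0.38], [0.02, 0.03, 0.025])
hr = math.exp(pooled)  # back-transform to the hazard-ratio scale
```

With this homogeneous toy input, Q falls below its degrees of freedom, τ² is truncated to zero, and the random-effects estimate coincides with the fixed-effect one.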

Relevance: 30.00%

Abstract:

In orthodontics, multiple site observations within patients or multiple observations collected at consecutive time points are often encountered. Clustered designs require larger sample sizes than individually randomized trials and special statistical analyses that account for the fact that observations within clusters are correlated. The purpose of this study was to assess to what degree clustering effects are considered during design and data analysis in the three major orthodontic journals. The contents of the most recent 24 issues of the American Journal of Orthodontics and Dentofacial Orthopedics (AJODO), Angle Orthodontist (AO), and European Journal of Orthodontics (EJO), from December 2010 backwards, were hand searched. Articles with clustering effects, and whether the authors accounted for them, were identified. Additionally, information was collected on: involvement of a statistician, single- or multicenter study, number of authors in the publication, geographical area, and statistical significance. From the 1584 articles, after exclusions, 1062 were assessed for clustering effects, of which 250 (23.5 per cent) were considered to have clustering effects in the design (kappa = 0.92, 95 per cent CI: 0.67-0.99 for inter-rater agreement). Of the studies with clustering effects, only 63 (25.2 per cent) indicated accounting for clustering effects. There was evidence that the studies published in the AO have higher odds of accounting for clustering effects [AO versus AJODO: odds ratio (OR) = 2.17, 95 per cent confidence interval (CI): 1.06-4.43, P = 0.03; EJO versus AJODO: OR = 1.90, 95 per cent CI: 0.84-4.24, non-significant; EJO versus AO: OR = 1.15, 95 per cent CI: 0.57-2.33, non-significant]. The results of this study indicate that only about a quarter of the studies with clustering effects account for this in the statistical data analysis.
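The inter-rater agreement reported above (kappa = 0.92) is Cohen's kappa: observed agreement corrected for the agreement expected by chance. A minimal sketch on hypothetical ratings:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical labels (sketch)."""
    n = len(rater_a)
    # proportion of items on which the raters agree
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # agreement expected by chance from each rater's marginal frequencies
    cats = set(rater_a) | set(rater_b)
    expected = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
                   for c in cats)
    return (observed - expected) / (1 - expected)

# Two hypothetical raters classifying 10 articles as having
# clustering effects (1) or not (0)
a = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]
b = [1, 1, 0, 0, 1, 0, 1, 0, 0, 0]
kappa = cohens_kappa(a, b)
```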

Relevance: 30.00%

Abstract:

Objective To examine the associations between pet keeping in early childhood and asthma and allergies in children aged 6–10 years. Design Pooled analysis of individual participant data from 11 prospective European birth cohorts that recruited a total of over 22,000 children in the 1990s. Exposure definition Ownership of only cats, dogs, birds, rodents, or cats/dogs combined during the first 2 years of life. Outcome definition Current asthma (primary outcome), allergic asthma, allergic rhinitis and allergic sensitization at 6–10 years of age. Data synthesis Three-step approach: (i) common definition of outcome and exposure variables across cohorts; (ii) calculation of adjusted effect estimates for each cohort; (iii) pooling of effect estimates using random-effects meta-analysis models. Results We found no association between furry or feathered pet keeping early in life and asthma at school age. For example, the odds ratio for asthma comparing cat ownership with "no pets" (10 studies, 11,489 participants) was 1.00 (95% confidence interval 0.78 to 1.28) (I² = 9%; p = 0.36). The odds ratio for asthma comparing dog ownership with "no pets" (9 studies, 11,433 participants) was 0.77 (0.58 to 1.03) (I² = 0%, p = 0.89). Owning both cat(s) and dog(s) compared to "no pets" resulted in an odds ratio of 1.04 (0.59 to 1.84) (I² = 33%, p = 0.18). Similarly, for allergic asthma and allergic rhinitis we did not find associations with any type of pet ownership early in life. However, we found some evidence for an association between ownership of furry pets during the first 2 years of life and a reduced likelihood of becoming sensitized to aero-allergens. Conclusions Pet ownership in early life did not appear to either increase or reduce the risk of asthma or allergic rhinitis symptoms in children aged 6–10. Advice from health care practitioners to avoid or to specifically acquire pets for primary prevention of asthma or allergic rhinitis in children should not be given.
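The I² values quoted above quantify between-study heterogeneity as the proportion of Cochran's Q exceeding its degrees of freedom. A minimal sketch with hypothetical per-cohort estimates:

```python
def i_squared(effects, variances):
    """Cochran's Q and the I^2 heterogeneity statistic (sketch)."""
    w = [1.0 / v for v in variances]
    mean = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - mean) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    # I^2 is the share of Q beyond chance expectation, floored at 0
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return q, i2

# Hypothetical per-cohort log odds ratios and their variances;
# near-identical effects should yield I^2 = 0 (low heterogeneity)
q, i2 = i_squared([0.0, 0.05, -0.03, 0.10], [0.04, 0.05, 0.06, 0.05])
```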

Relevance: 30.00%

Abstract:

The objective of this study was to determine the optimal time interval for a repeated Chlamydia trachomatis (chlamydia) test.

Relevance: 30.00%

Abstract:

Franches-Montagnes is the only native horse breed in Switzerland, so special efforts should be made to ensure its survival. The objectives of this study were to characterize the structure of this population as well as its genetic variability using pedigree data, conformation traits and molecular markers. The analyses focused on clarifying whether this population is composed of a heavy-type and a light-type subpopulation. Extended pedigree records of 3-year-old stallions (n = 68) and mares (n = 108) were available. Evaluations of body conformation traits as well as pedigree data and molecular markers did not support the two-subpopulation hypothesis. The generation interval ranged from 7.8 to 9.3 years. The complete generation equivalent was high (>12). The number of effective ancestors varied between 18.9 and 20.1, with 50% of the genetic variability attributed to seven of them. The genetic contribution of Warmblood horses ranged from 36% to 42% and that of Coldblood horses from 4% to 6%. The average inbreeding coefficient reached 6%. The inbreeding effective population size was 114.5 when based on the average increase of the inbreeding coefficient per year since 1910. Our results suggest that bottleneck situations occurred because of the selection of a small number of sire lines. Promotion of planned matings between less related parents is recommended in order to avoid a reduction of genetic diversity.
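The inbreeding effective population size reported above follows from the rate of inbreeding via Wright's relation Ne = 1/(2ΔF), where ΔF is the average increase of the inbreeding coefficient per generation. The rate below is a hypothetical value chosen to land near the reported figure, not one taken from the pedigree analysis.

```python
def effective_population_size(delta_f_per_generation):
    """Inbreeding effective population size, Ne = 1 / (2 * dF), where dF
    is the per-generation increase of the inbreeding coefficient (sketch).
    """
    return 1.0 / (2.0 * delta_f_per_generation)

# A hypothetical inbreeding increase of ~0.44% per generation
# gives Ne close to the 114.5 reported for the breed
ne = effective_population_size(0.004367)
```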

Relevance: 30.00%

Abstract:

In biostatistical applications, interest often focuses on the estimation of the distribution of the time T between two consecutive events. If the initial event time is observed and the subsequent event time is only known to be larger or smaller than an observed monitoring time C, then the data are described by the well-known singly-censored current status model, also known as interval-censored data, case I. We extend this current status model by allowing the presence of a time-dependent process, which is partly observed, and by allowing C to depend on T through the observed part of this time-dependent process. Because of the high dimension of the covariate process, no globally efficient estimators exist with good practical performance at moderate sample sizes. We follow the approach of Robins and Rotnitzky (1992) by modeling the censoring variable, given the time variable and the covariate process (i.e., the missingness process), under the restriction that it satisfies coarsening at random. We propose a generalization of the simple current status estimator of the distribution of T and of smooth functionals of the distribution of T, based on an estimate of the missingness process. In this estimator the covariates enter only through the estimate of the missingness process. Due to the coarsening-at-random assumption, the estimator has the interesting property that estimating the missingness process more nonparametrically improves its efficiency. We show that, by local estimation of an optimal model or optimal function of the covariates for the missingness process, the generalized current status estimator for smooth functionals becomes locally efficient, meaning that it is efficient if the right model or covariate is consistently estimated, and consistent and asymptotically normal in general. Estimation of the optimal model requires estimation of the conditional distribution of T given the covariates. Any (prior) knowledge of this conditional distribution can be used at this stage without any risk of losing root-n consistency. We also propose locally efficient one-step estimators. Finally, we show some simulation results.
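The simple current status estimator that the paper generalizes is the NPMLE of F(t) = P(T ≤ t): each subject contributes a monitoring time C and the indicator 1{T ≤ C}, and sorting by C and applying the pool-adjacent-violators algorithm to the indicators gives the monotone MLE of F at the observed monitoring times. A minimal sketch (unit weights, illustrative data):

```python
def current_status_npmle(monitoring_times, indicators):
    """NPMLE of F(t) = P(T <= t) from current status data (sketch).

    Isotonic regression of the censoring indicators on the sorted
    monitoring times, via the pool-adjacent-violators algorithm.
    """
    order = sorted(range(len(monitoring_times)),
                   key=lambda i: monitoring_times[i])
    d = [indicators[i] for i in order]
    # PAVA: merge adjacent blocks that violate monotonicity
    blocks = [[di, 1] for di in d]  # each block holds [mean, count]
    i = 0
    while i < len(blocks) - 1:
        if blocks[i][0] > blocks[i + 1][0] + 1e-12:
            m1, n1 = blocks[i]
            m2, n2 = blocks[i + 1]
            blocks[i] = [(m1 * n1 + m2 * n2) / (n1 + n2), n1 + n2]
            del blocks[i + 1]
            i = max(i - 1, 0)  # a merge can create a new violation upstream
        else:
            i += 1
    fitted = []
    for m, n in blocks:
        fitted.extend([m] * n)
    return [monitoring_times[i] for i in order], fitted

# Five subjects monitored at times 1..5; 1 means T had already occurred
times, fhat = current_status_npmle([1, 2, 3, 4, 5], [0, 1, 0, 1, 1])
```

The violating pair at times 2 and 3 is pooled to 0.5, yielding a monotone step estimate of F.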

Relevance: 30.00%

Abstract:

OBJECTIVE: To describe the electronic medical databases used in antiretroviral therapy (ART) programmes in lower-income countries and assess the measures such programmes employ to maintain and improve data quality and reduce the loss of patients to follow-up. METHODS: In 15 countries of Africa, South America and Asia, a survey was conducted from December 2006 to February 2007 on the use of electronic medical record systems in ART programmes. Patients enrolled in the sites at the time of the survey but not seen during the previous 12 months were considered lost to follow-up. The quality of the data was assessed by computing the percentage of missing key variables (age, sex, clinical stage of HIV infection, CD4+ lymphocyte count and year of ART initiation). Associations between site characteristics (such as number of staff members dedicated to data management), measures to reduce loss to follow-up (such as the presence of staff dedicated to tracing patients) and data quality and loss to follow-up were analysed using multivariate logit models. FINDINGS: Twenty-one sites that together provided ART to 50 060 patients were included (median number of patients per site: 1000; interquartile range, IQR: 72-19 320). Eighteen sites (86%) used an electronic database for medical record-keeping; 15 (83%) such sites relied on software intended for personal or small business use. The median percentage of missing data for key variables per site was 10.9% (IQR: 2.0-18.9%) and declined with training in data management (odds ratio, OR: 0.58; 95% confidence interval, CI: 0.37-0.90) and weekly hours spent by a clerk on the database per 100 patients on ART (OR: 0.95; 95% CI: 0.90-0.99). About 10 weekly hours per 100 patients on ART were required to reduce missing data for key variables to below 10%. The median percentage of patients lost to follow-up 1 year after starting ART was 8.5% (IQR: 4.2-19.7%). 
Strategies to reduce loss to follow-up included outreach teams, community-based organizations and checking death registry data. Implementation of all three strategies substantially reduced losses to follow-up (OR: 0.17; 95% CI: 0.15-0.20). CONCLUSION: The quality of the data collected and the retention of patients in ART treatment programmes are unsatisfactory for many sites involved in the scale-up of ART in resource-limited settings, mainly because of insufficient staff trained to manage data and trace patients lost to follow-up.

Relevance: 30.00%

Abstract:

BACKGROUND AND OBJECTIVES: Combination antiretroviral therapy (cART) is changing, and this may affect the type and occurrence of side effects. We examined the frequency of lipodystrophy (LD) and weight changes in relation to the use of specific drugs in the Swiss HIV Cohort Study (SHCS). METHODS: In the SHCS, patients are followed twice a year and scored by the treating physician as having 'fat accumulation', 'fat loss', or neither. Treatments, and reasons for change thereof, are recorded. Our study sample included all patients treated with cART between 2003 and 2006 and, in addition, all patients who started cART between 2000 and 2003. RESULTS: From 2003 to 2006, the percentage of patients taking stavudine, didanosine and nelfinavir decreased, the percentage taking lopinavir, nevirapine and efavirenz remained stable, and the percentage taking atazanavir and tenofovir increased by 18.7 and 22.2%, respectively. In life-table Kaplan-Meier analysis, patients starting cART in 2003-2006 were less likely to develop LD than those starting cART from 2000 to 2002 (P<0.02). LD was quoted as the reason for treatment change or discontinuation for 4% of patients on cART in 2003, and for 1% of patients treated in 2006 (P for trend <0.001). In univariate and multivariate regression analysis, patients with a weight gain of ≥5 kg were more likely to take lopinavir or atazanavir than patients without such a weight gain [odds ratio (OR) 2, 95% confidence interval (CI) 1.3-2.9, and OR 1.7, 95% CI 1.3-2.1, respectively]. CONCLUSIONS: LD has become less frequent in the SHCS from 2000 to 2006. A weight gain of more than 5 kg was associated with the use of atazanavir and lopinavir.
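The life-table comparison mentioned above rests on the Kaplan-Meier product-limit estimator: at each observed event time the survival curve is multiplied by the fraction of at-risk patients who remain event-free. A minimal sketch (no tie handling) on hypothetical follow-up data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate (sketch, distinct event times).

    `times` are follow-up times; `events` is 1 for an observed event
    (here, onset of lipodystrophy) and 0 for censoring. Returns the
    (time, survival) step points at each event time.
    """
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv, curve = 1.0, []
    for i in order:
        if events[i] == 1:
            surv *= (at_risk - 1) / at_risk  # survive this event time
            curve.append((times[i], surv))
        at_risk -= 1  # both events and censorings leave the risk set
    return curve

# Five hypothetical patients: events at times 2 and 5, censoring at 3, 4, 6
curve = kaplan_meier([2, 3, 4, 5, 6], [1, 0, 0, 1, 0])
```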

Relevance: 30.00%

Abstract:

Monte Carlo simulation was used to evaluate properties of a simple Bayesian MCMC analysis of the random effects model for single-group Cormack-Jolly-Seber capture-recapture data. The MCMC method is applied to the model via a logit link, so the parameters p and S are on a logit scale, where logit(S) is assumed to have, and is generated from, a normal distribution with mean μ and variance σ². Marginal prior distributions on logit(p) and μ were independent normal with mean zero and standard deviation 1.75 for logit(p) and 100 for μ; hence minimally informative. The marginal prior distribution on σ² was placed on τ² = 1/σ² as a gamma distribution with α = β = 0.001. The study design has 432 points spread over 5 factors: occasions (t), new releases per occasion (u), p, μ, and σ. At each design point 100 independent trials were completed (hence 43,200 trials in total), each with sample size n = 10,000 from the parameter posterior distribution. At 128 of these design points, comparisons are made to previously reported results from a method-of-moments procedure. We looked at properties of point and interval inference on μ and σ based on the posterior mean, median, and mode and the equal-tailed 95% credibility interval. Bayesian inference did very well for the parameter μ, but under the conditions used here MCMC inference performance for σ was mixed: poor for sparse data (i.e., only 7 occasions) or σ = 0, but good when there were sufficient data and σ was not small.
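One ingredient of the MCMC analysis above is an update for μ, the mean of the normal random effect on the logit scale. The sketch below isolates that step as a random-walk Metropolis sampler under strong simplifications (σ treated as known, simulated logit-scale data, a N(0, 100²) prior on μ as in the study); it is illustrative only, not the full CJS model.

```python
import math
import random

def metropolis_mu(y, sigma, prior_sd=100.0, n=10000, step=0.5, seed=7):
    """Random-walk Metropolis for the mean mu of a normal random effect
    on the logit scale (minimal sketch, sigma known).
    """
    rng = random.Random(seed)

    def log_post(mu):
        # normal log-likelihood plus the minimally informative normal prior
        ll = -sum((yi - mu) ** 2 for yi in y) / (2 * sigma ** 2)
        return ll - mu ** 2 / (2 * prior_sd ** 2)

    mu, draws = 0.0, []
    lp = log_post(mu)
    for _ in range(n):
        cand = mu + rng.gauss(0.0, step)
        lp_cand = log_post(cand)
        if math.log(rng.random()) < lp_cand - lp:  # accept/reject
            mu, lp = cand, lp_cand
        draws.append(mu)
    return draws

# Hypothetical logit-survival values centred near 0.8, sigma known
random.seed(1)
y = [random.gauss(0.8, 0.3) for _ in range(50)]
draws = metropolis_mu(y, sigma=0.3)
post_mean = sum(draws[2000:]) / len(draws[2000:])  # discard burn-in
```

With a near-flat prior the posterior mean of μ lands close to the sample mean of the simulated logit values, mirroring the good performance for μ reported above.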