982 results for incident duration modelling
Abstract:
This paper deals with the estimation and testing of conditional duration models by looking at the density and baseline hazard rate functions. More precisely, we focus on the distance between the parametric density (or hazard rate) function implied by the duration process and its non-parametric estimate. Asymptotic justification is derived using the functional delta method for fixed and gamma kernels, whereas finite-sample properties are investigated through Monte Carlo simulations. Finally, we show the practical usefulness of such testing procedures by carrying out an empirical assessment of whether autoregressive conditional duration models are appropriate tools for modelling price durations of stocks traded on the New York Stock Exchange.
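The distance-based idea in this abstract can be illustrated with a toy version: fit a parametric duration density by maximum likelihood, estimate the same density non-parametrically with a kernel, and integrate the squared difference. This is only a minimal sketch (exponential fit, Gaussian kernel, crude Riemann sum); the paper's statistic additionally uses gamma kernels to handle the boundary at zero, and every name and number here is illustrative rather than taken from the paper.

```python
import math
import random

def l2_distance_param_vs_kde(data, grid, bandwidth):
    """Integrated squared distance between an exponential density fitted by
    maximum likelihood and a Gaussian kernel density estimate, evaluated on
    an equally spaced grid (crude Riemann-sum approximation)."""
    lam = 1.0 / (sum(data) / len(data))  # MLE of the exponential rate

    def kde(x):
        # Gaussian kernel density estimate at point x
        return sum(
            math.exp(-0.5 * ((x - d) / bandwidth) ** 2)
            / (bandwidth * math.sqrt(2 * math.pi))
            for d in data
        ) / len(data)

    dx = grid[1] - grid[0]
    return sum((lam * math.exp(-lam * x) - kde(x)) ** 2 for x in grid) * dx

# Simulated durations from a known exponential, then the test statistic:
random.seed(0)
durations = [random.expovariate(2.0) for _ in range(300)]
grid = [i * 0.02 for i in range(1, 150)]
stat = l2_distance_param_vs_kde(durations, grid, bandwidth=0.1)
```

In the paper, large values of such a distance lead to rejection of the parametric duration model; the asymptotic distribution is what the functional delta method supplies.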
Abstract:
Modelling post-release survival probabilities of reintroduced birds can help inform 'soft-release' strategies for avian reintroductions that use captive-bred individuals. We used post-release radiotelemetry data to estimate the survival probabilities of reintroduced captive-bred Red-billed Curassow Crax blumenbachii, a globally threatened Cracid endemic to the Brazilian Atlantic Rainforest. Between August 2006 and December 2008, 46 radiotagged Curassows from the Crax Brazil breeding centre were reintroduced to the Guapiacu Ecological Reserve (REGUA), Rio de Janeiro state, Brazil, in seven different cohorts. Reintroduced birds were most vulnerable during the first 12 months post-release, with mortality due to natural predation, domestic dogs and hunting. Annual post-release survival probability was high (75%) compared with published estimates for other Galliform species. However, when considering survival in all birds transported to REGUA (some birds died before release or were retained in captivity) and not only post-release survival, phi in this study was closer to estimates for other species (60%). The duration of the pre-release acclimatization period within the soft-release enclosure and the size of the released cohorts both positively influenced post-release survival of reintroduced Curassows. Our results are relevant to future Cracid reintroductions and highlight the importance of utilizing post-release monitoring data for evidence-based improvements to soft-release strategies that can significantly enhance the post-release survival of captive-bred birds.
Abstract:
Heart diseases are the leading cause of death worldwide, both for men and women. However, the ionic mechanisms underlying many cardiac arrhythmias and genetic disorders are not completely understood, leading to limited efficacy of currently available therapies and leaving many open questions for cardiac electrophysiologists. On the other hand, the availability of experimental data is still a major issue in this field: most experiments are performed in vitro and/or using animal models (e.g. rabbit, dog and mouse), even when the final aim is to better understand the electrical behaviour of the in vivo human heart in either physiological or pathological conditions. Computational modelling constitutes a primary tool in cardiac electrophysiology: in silico simulations, based on the available experimental data, may help to understand the electrical properties of the heart and the ionic mechanisms underlying a specific phenomenon. Once validated, mathematical models can be used for making predictions and testing hypotheses, thus suggesting potential therapeutic targets. This PhD thesis aims to apply computational cardiac modelling of the human single-cell action potential (AP) to three clinical scenarios, in order to gain new insights into the ionic mechanisms involved in the electrophysiological changes observed in vitro and/or in vivo. The first context is blood electrolyte variations, which may occur in patients due to different pathologies and/or therapies. In particular, we focused on extracellular Ca2+ and its effect on the AP duration (APD). The second context is haemodialysis (HD) therapy: in addition to blood electrolyte variations, patients undergo many other changes during HD, e.g. in heart rate, cell volume, pH, and sympatho-vagal balance. The third context is human hypertrophic cardiomyopathy (HCM), a genetic disorder characterised by an increased arrhythmic risk and still lacking a specific pharmacological treatment.
Abstract:
Excess adiposity is associated with increased risks of developing adult malignancies. To inform public health policy and guide further research, the incident cancer burden attributable to excess body mass index (BMI ≥ 25 kg/m²) across 30 European countries was estimated. Population attributable risks (PARs) were calculated using European- and gender-specific risk estimates from a published meta-analysis and gender-specific mean BMI estimates from the World Health Organization Global Infobase. Country-specific numbers of new cancers were derived from Globocan 2002. A ten-year lag period between risk exposure and cancer incidence was assumed, and 95% confidence intervals (CI) were estimated in Monte Carlo simulations. In 2002, there were 2,171,351 new all-cancer diagnoses in the 30 countries of Europe. Estimated PARs were 2.5% (95% CI 1.5-3.6%) in men and 4.1% (2.3-5.9%) in women. These collectively corresponded to 70,288 (95% CI 40,069-100,668) new cases. Sensitivity analyses revealed that estimates were most influenced by the assumed shape of the BMI distribution in the population and by the cancer-specific risk estimates. In a scenario analysis of a plausible contemporary (2008) population, the estimated PARs increased to 3.2% (2.1-4.3%) and 8.6% (5.6-11.5%), respectively, in men and women. Endometrial, post-menopausal breast and colorectal cancers accounted for 65% of these cancers. This analysis quantifies the burden of incident cancers attributable to excess BMI in Europe. The estimates reported here provide a baseline for future modelling and underline the need for research into interventions to control weight in the context of endometrial, breast and colorectal cancer.
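The PAR arithmetic described above can be sketched with Levin's formula for a binary exposure, plus a Monte Carlo interval obtained by sampling log(RR) from an assumed normal distribution. All inputs below are made-up illustrations, not the paper's estimates.

```python
import math
import random

def par(prevalence, rr):
    """Levin's population attributable risk for a binary exposure."""
    excess = prevalence * (rr - 1.0)
    return excess / (1.0 + excess)

def attributable_cases(new_cases, prevalence, rr):
    """Number of incident cases attributable to the exposure."""
    return new_cases * par(prevalence, rr)

# Illustrative inputs: 40% exposure prevalence (BMI >= 25), relative
# risk 1.5 for some cancer, 10,000 incident cases in the population.
point = attributable_cases(10_000, 0.40, 1.5)

# Monte Carlo interval: sample log(RR) from a normal distribution with an
# assumed standard error of 0.1, then take the 2.5th/97.5th percentiles.
random.seed(1)
draws = sorted(
    attributable_cases(10_000, 0.40, math.exp(random.gauss(math.log(1.5), 0.1)))
    for _ in range(10_000)
)
ci = (draws[249], draws[9749])
```

The real analysis additionally propagated uncertainty in the BMI distribution and used gender- and cancer-specific risks, but the per-country arithmetic is of this shape.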
Abstract:
Background Pelvic inflammatory disease (PID) results from the ascending spread of microorganisms from the vagina and endocervix to the upper genital tract. PID can lead to infertility, ectopic pregnancy and chronic pelvic pain. The timing of development of PID after the sexually transmitted bacterial infection Chlamydia trachomatis (chlamydia) might affect the impact of screening interventions, but is currently unknown. This study investigates three hypothetical processes for the timing of progression: at the start, at the end, or throughout the duration of chlamydia infection. Methods We developed a compartmental model that describes the trial structure of a published randomised controlled trial (RCT) and allows each of the three processes to be examined using the same model structure. The RCT estimated the effect of a single chlamydia screening test on the cumulative incidence of PID up to one year later. The fraction of chlamydia-infected women who progress to PID was obtained for each hypothetical process by the maximum likelihood method using the results of the RCT. Results The predicted cumulative incidence of PID cases from all causes after one year depends on the fraction of chlamydia-infected women that progresses to PID and on the type of progression. Progression at a constant rate from a chlamydia infection to PID, or at the end of the infection, was compatible with the findings of the RCT. The corresponding estimated fraction of chlamydia-infected women that develops PID is 10% (95% confidence interval 7-13%) in both processes. Conclusions The findings of this study suggest that clinical PID can occur throughout the course of a chlamydia infection, leaving a window of opportunity for screening to prevent PID.
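For the "progression throughout the duration of infection" hypothesis, the progressing fraction follows from a simple competing-risks argument: if progression and clearance both act at constant (exponential) rates, the fraction of infections reaching PID is the progression rate divided by the sum of the two rates. A minimal sketch, with illustrative rates rather than the trial's estimates:

```python
def pid_fraction_constant_rate(progression_rate, clearance_rate):
    """Fraction of chlamydia infections that progress to PID when
    progression competes at a constant rate with natural clearance,
    both assumed exponential: p / (p + r)."""
    return progression_rate / (progression_rate + clearance_rate)

# Illustrative rates (per year): mean infection duration of 1 year
# (clearance_rate = 1.0) and a progression rate chosen so that roughly
# 10% of infections reach PID, matching the order of magnitude in the
# abstract but not its fitted values.
frac = pid_fraction_constant_rate(1.0 / 9.0, 1.0)
```

Under this reading, screening any time during the infection can still avert the PID cases that have not yet occurred, which is the "window of opportunity" the conclusion refers to.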
Abstract:
Background New HIV infections in men who have sex with men (MSM) have increased in Switzerland since 2000 despite combination antiretroviral therapy (cART). The objectives of this mathematical modelling study were: to describe the dynamics of the HIV epidemic in MSM in Switzerland using national data; to explore the effects of hypothetical prevention scenarios; and to conduct a multivariate sensitivity analysis. Methodology/Principal Findings The model describes HIV transmission, progression and the effects of cART using differential equations. The model was fitted to Swiss HIV and AIDS surveillance data and twelve unknown parameters were estimated. Predicted numbers of diagnosed HIV infections and AIDS cases fitted the observed data well. By the end of 2010, an estimated 13.5% (95% CI 12.5, 14.6%) of all HIV-infected MSM were undiagnosed and accounted for 81.8% (95% CI 81.1, 82.4%) of new HIV infections. The transmission rate was at its lowest from 1995–1999, with a nadir of 46 incident HIV infections in 1999, but increased from 2000. The estimated number of new infections continued to increase to more than 250 in 2010, although the reproduction number was still below the epidemic threshold. Prevention scenarios included temporary reductions in risk behaviour, annual test and treat, and reduction in risk behaviour to levels observed earlier in the epidemic. These led to predicted reductions in new infections from 2 to 26% by 2020. Parameters related to disease progression and relative infectiousness at different HIV stages had the greatest influence on estimates of the net transmission rate. Conclusions/Significance The model outputs suggest that the increase in HIV transmission amongst MSM in Switzerland is the result of continuing risky sexual behaviour, particularly by those unaware of their infection status. Long term reductions in the incidence of HIV infection in MSM in Switzerland will require increased and sustained uptake of effective interventions.
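The kind of compartmental ODE model described can be sketched in a deliberately minimal form: susceptible, undiagnosed-infected and diagnosed-infected compartments, with diagnosed men transmitting at a reduced rate. This is not the study's fitted twelve-parameter model; compartments, rates and all numeric values below are illustrative assumptions only.

```python
def euler_step(s, i_u, i_d, dt, beta, rel_inf_diag, diag_rate, mu):
    """One Euler step of a minimal susceptible / undiagnosed-infected /
    diagnosed-infected model. Diagnosed men transmit at a reduced rate
    (rel_inf_diag < 1), mirroring the idea that undiagnosed infection
    drives transmission."""
    n = s + i_u + i_d
    foi = beta * (i_u + rel_inf_diag * i_d) / n  # force of infection
    new_inf = foi * s
    ds = -new_inf * dt
    di_u = (new_inf - diag_rate * i_u - mu * i_u) * dt
    di_d = (diag_rate * i_u - mu * i_d) * dt
    return s + ds, i_u + di_u, i_d + di_d

# Simulate 20 years with illustrative parameters (per-year rates).
state = (50_000.0, 500.0, 2_000.0)  # (susceptible, undiagnosed, diagnosed)
for _ in range(int(20 / 0.01)):
    state = euler_step(*state, dt=0.01, beta=0.3, rel_inf_diag=0.1,
                       diag_rate=0.5, mu=0.02)
```

The actual study fitted such a system to surveillance data to estimate unknown rates; prevention scenarios correspond to changing parameters such as beta or diag_rate from a given year onward.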
Abstract:
Background Serologic testing algorithms for recent HIV seroconversion (STARHS) provide important information for HIV surveillance. We have previously demonstrated that a patient's antibody reaction pattern in a confirmatory line immunoassay (INNO-LIA™ HIV I/II Score) provides information on the duration of infection, which is unaffected by clinical, immunological and viral variables. In this report we set out to determine the diagnostic performance of Inno-Lia algorithms for identifying incident infections in patients with known duration of infection, and evaluated the algorithms in annual cohorts of HIV notifications. Methods Diagnostic sensitivity was determined in 527 treatment-naive patients infected for up to 12 months. Specificity was determined in 740 patients infected for longer than 12 months. Plasma was tested by Inno-Lia and classified as either incident (≤12 months) or older infection by 26 different algorithms. Incident infection rates (IIR) were calculated based on the diagnostic sensitivity and specificity of each algorithm and the rule that the total number of incident results is the sum of true-incident and false-incident results, both of which can be calculated from the pre-determined sensitivity and specificity. Results The 10 best algorithms had a mean raw sensitivity of 59.4% and a mean specificity of 95.1%. Adjustment for overrepresentation of patients in the first quarter year of infection further reduced the sensitivity. In the preferred model, the mean adjusted sensitivity was 37.4%. Application of the 10 best algorithms to four annual cohorts of HIV-1 notifications totalling 2,595 patients yielded a mean IIR of 0.35 in 2005/6 (baseline) and of 0.45, 0.42 and 0.35 in 2008, 2009 and 2010, respectively. The increase between baseline and 2008 and the ensuing decreases were highly significant. Other adjustment models yielded different absolute IIR, although the relative changes between the cohorts were identical for all models.
Conclusions The method can be used for comparing IIR in annual cohorts of HIV notifications. The use of several different algorithms in combination, each with its own sensitivity and specificity to detect incident infection, is advisable as this reduces the impact of individual imperfections stemming primarily from relatively low sensitivities and sampling bias.
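The stated rule, that observed incident results are the sum of true-incident and false-incident results, leads to a standard misclassification correction (a Rogan-Gladen-style estimator). A hedged sketch, with an invented observed fraction but sensitivity and specificity matching the abstract's reported means:

```python
def adjusted_iir(observed_incident_fraction, sensitivity, specificity):
    """Solve observed = sens * true + (1 - spec) * (1 - true) for the true
    incident fraction: the observed incident results are the sum of
    true-incident results (detected with the given sensitivity) and
    false-incident results (older infections misclassified at rate
    1 - specificity)."""
    false_positive_rate = 1.0 - specificity
    return (observed_incident_fraction - false_positive_rate) / (
        sensitivity - false_positive_rate
    )

# Sensitivity/specificity from the abstract's mean algorithm performance
# (59.4%, 95.1%); the observed fraction 0.25 is invented for illustration.
iir = adjusted_iir(0.25, 0.594, 0.951)
```

Note that an observed fraction below 1 - specificity would yield a negative estimate, which is one reason the authors combine several algorithms rather than relying on any single one.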
Abstract:
Background Tests for recent infections (TRIs) are important for HIV surveillance. We have shown that a patient's antibody pattern in a confirmatory line immunoassay (Inno-Lia) also yields information on time since infection. We have published algorithms which, with a certain sensitivity and specificity, distinguish between incident (≤12 months) and older infection. In order to use these algorithms like other TRIs, i.e. based on their windows, we now determined their window periods. Methods We classified Inno-Lia results of 527 treatment-naïve patients with HIV-1 infection of ≤12 months' duration according to incidence by 25 algorithms. The time after which all infections were ruled older, i.e. the algorithm's window, was determined by linear regression of the proportion ruled incident as a function of time since infection. Window-based incident infection rates (IIR) were determined utilizing the relationship 'Prevalence = Incidence × Duration' in four annual cohorts of HIV-1 notifications. Results were compared to performance-based IIR also derived from Inno-Lia results, but utilizing the relationship 'incident = true incident + false incident', and to the IIR derived from the BED incidence assay. Results Window periods varied between 45.8 and 130.1 days and correlated well with the algorithms' diagnostic sensitivity (R² = 0.962; P<0.0001). Among the 25 algorithms, the mean window-based IIR among the 748 notifications of 2005/06 was 0.457, compared to 0.453 obtained for performance-based IIR with a model not correcting for selection bias. Evaluation of BED results using a window of 153 days yielded an IIR of 0.669. Window-based and performance-based IIR increased by 22.4% and 30.6%, respectively, in 2008, while 2009 and 2010 showed a return to baseline for both methods.
Conclusions IIR estimations by window- and performance-based evaluations of Inno-Lia algorithm results were similar and can be used together to assess IIR changes between annual HIV notification cohorts.
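The window-based calculation follows directly from 'Prevalence = Incidence × Duration': divide the prevalence of test-recent results by the window period expressed in years. A minimal sketch, with an invented count of recent results alongside the abstract's cohort size and longest window:

```python
def window_based_incidence(n_recent, n_total, window_days):
    """Incidence from 'Prevalence = Incidence x Duration': the prevalence
    of test-recent results divided by the mean window period in years."""
    prevalence_recent = n_recent / n_total
    return prevalence_recent / (window_days / 365.25)

# Illustrative: 100 of 748 notifications classified recent by an algorithm
# with a 130-day window (the recent count is made up; 748 and ~130 days
# match the abstract's cohort size and window range).
iir = window_based_incidence(100, 748, 130.0)
```

A longer window means each true incident case stays "recent" longer, so the same incidence produces a higher recent prevalence; dividing by the window length undoes this.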
Abstract:
BACKGROUND Pathogenic bacteria are often carried asymptomatically in the nasopharynx. Bacterial carriage can be reduced by vaccination and has been used as an alternative endpoint to clinical disease in randomised controlled trials (RCTs). Vaccine efficacy (VE) is usually calculated as 1 minus a measure of effect. Estimates of vaccine efficacy from cross-sectional carriage data collected in RCTs are usually based on prevalence odds ratios (PORs) and prevalence ratios (PRs), but it is unclear when these should be measured. METHODS We developed dynamic compartmental transmission models simulating RCTs of a vaccine against a carried pathogen to investigate how VE can best be estimated from cross-sectional carriage data, at which time carriage should optimally be assessed, and to which factors this timing is most sensitive. In the models, vaccine could change carriage acquisition and clearance rates (leaky vaccine); values for these effects were explicitly defined (f_acq, 1/f_dur). POR and PR were calculated from model outputs. Models differed in infection source: other participants or external sources unaffected by the trial. Simulations using multiple vaccine doses were compared to empirical data. RESULTS The combined VE against acquisition and duration calculated using POR (VEˆ_acq.dur = (1 − POR) × 100) best estimates the true VE (VE_acq.dur = (1 − f_acq × f_dur) × 100) for leaky vaccines in most scenarios. The mean duration of carriage was the most important factor determining the time until VEˆ_acq.dur first approximates VE_acq.dur: if the mean duration of carriage is 1-1.5 months, up to 4 months are needed; if the mean duration is 2-3 months, up to 8 months are needed. Minor differences were seen between models with different infection sources. In RCTs with shorter intervals between vaccine doses, it takes longer after the last dose until VEˆ_acq.dur approximates VE_acq.dur.
CONCLUSION The timing of sample collection should be considered when interpreting vaccine efficacy against bacterial carriage measured in RCTs.
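The two VE quantities compared in the results can be written out directly. A minimal sketch, with invented carriage prevalences and vaccine effect sizes (f_acq, f_dur are not values from the study):

```python
def vaccine_efficacy_from_por(p_vacc, p_ctrl):
    """VE-hat_acq.dur = (1 - POR) x 100, where POR is the prevalence odds
    ratio of carriage in the vaccine arm vs the control arm."""
    por = (p_vacc / (1.0 - p_vacc)) / (p_ctrl / (1.0 - p_ctrl))
    return (1.0 - por) * 100.0

def true_combined_ve(f_acq, f_dur):
    """VE_acq.dur = (1 - f_acq x f_dur) x 100 for a leaky vaccine whose
    defined effects scale acquisition by f_acq and duration by f_dur."""
    return (1.0 - f_acq * f_dur) * 100.0

# Illustrative numbers: 20% carriage prevalence in vaccinees vs 35% in
# controls, and a vaccine that halves acquisition (f_acq = 0.5) while
# shortening carriage duration to 80% (f_dur = 0.8).
ve_hat = vaccine_efficacy_from_por(0.20, 0.35)
ve_true = true_combined_ve(0.5, 0.8)
```

The study's point is that ve_hat only approximates ve_true once carriage prevalences have reached their post-vaccination equilibrium, which takes several mean carriage durations.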
Abstract:
OBJECTIVE The aim of this study was to explore the risk of incident gout in patients with type 2 diabetes mellitus (T2DM) in association with diabetes duration, diabetes severity and antidiabetic drug treatment. METHODS We conducted a case-control study in patients with T2DM using the UK-based Clinical Practice Research Datalink (CPRD). We identified case patients aged ≥18 years with an incident diagnosis of gout between 1990 and 2012. We matched one gout-free control patient to each case patient. We used conditional logistic regression analysis to calculate adjusted ORs (adj. ORs) with 95% CIs and adjusted our analyses for important potential confounders. RESULTS The study encompassed 7536 T2DM cases with a first-time diagnosis of gout. Compared to a diabetes duration of <1 year, longer diabetes duration (1-3, 3-6, 7-9 and ≥10 years) was associated with decreased adj. ORs of 0.91 (95% CI 0.79 to 1.04), 0.76 (95% CI 0.67 to 0.86), 0.70 (95% CI 0.61 to 0.86), and 0.58 (95% CI 0.51 to 0.66), respectively. Compared to a reference A1C level of <7%, the risk estimates for increasing A1C levels (7.0-7.9, 8.0-8.9 and ≥9%) steadily decreased, with adj. ORs of 0.79 (95% CI 0.72 to 0.86), 0.63 (95% CI 0.55 to 0.72), and 0.46 (95% CI 0.40 to 0.53), respectively. Use of insulin, metformin or sulfonylureas was not associated with an altered risk of incident gout. CONCLUSIONS Increased A1C levels, but not use of antidiabetic drugs, were associated with a decreased risk of incident gout among patients with T2DM.
Abstract:
Investigating preferential flow, including macropore flow, is crucial to predicting and preventing point sources of contamination in soil, for example in the vicinity of pumping wells. With a view to advancing groundwater protection, this study aimed (i) to quantify the strength of macropore flow in four representative natural grassland soils on the Swiss plateau, and (ii) to define the parameters that significantly control macropore flow in grassland soil. For each soil type we selected three measurement points on which three successive irrigation experiments were carried out, resulting in a total of 36 irrigations. The strength of macropore flow, parameterized as the cumulated water volume flowing from macropores at a depth of 1 m in response to an irrigation of 60 mm h−1 intensity and 1 h duration, was simulated using the dual-permeability MACRO model. The model calibration was based on the key soil parameters and fine measurements of water content at different depths. Modelling results indicate strong macropore flow in all investigated soil types except gleysols. The volume of water that flowed from macropores and was hence expected to reach groundwater varied between 81% and 94% in brown soils, 59% and 67% in para-brown soils, 43% and 56% in acid brown soils, and 22% and 35% in gleysols. These results show that spreading pesticides and herbicides in pumping well protection zones poses a high risk of contamination and must be strictly prohibited. We also found that organic carbon content was not correlated with the strength of macropore flow, probably due to its very weak variation in our study, while saturated water content showed a negative correlation with macropore flow. The correlation between saturated hydraulic conductivity (Ks) and macropore flow was negative as well, but weak.
Macropore flow appears to be controlled by the interaction between the bulk density of the uppermost topsoil layer (0–0.10 m) and the macroporosity of the soil below. This interaction also affects the variations in Ks and saturated water content. Further investigations are needed to better understand the combined effect of all these processes including the exchange between micropore and macropore domains.
Abstract:
This paper is concerned with the analysis of zero-inflated count data when time of exposure varies. It proposes a modified zero-inflated count data model where the probability of an extra zero is derived from an underlying duration model with Weibull hazard rate. The new model is compared to the standard Poisson model with logit zero inflation in an application to the effect of treatment with thiotepa on the number of new bladder tumors.
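The modification described, replacing the usual logit zero-inflation with a Weibull duration model, can be sketched as follows. Here the extra-zero probability is taken to be the Weibull survival function evaluated at the exposure time (one plausible reading of the abstract), and all parameter values are illustrative rather than estimates from the bladder-tumour data.

```python
import math

def weibull_survival(t, scale, shape):
    """S(t) = exp(-(t/scale)^shape): probability the underlying Weibull
    duration has not ended by exposure time t, used here as the
    probability of an extra (structural) zero."""
    return math.exp(-((t / scale) ** shape))

def zip_weibull_pmf(y, t, rate, scale, shape):
    """Zero-inflated Poisson with exposure-dependent zero inflation: the
    extra-zero probability comes from a Weibull duration model rather
    than a logit link, and the Poisson mean scales with exposure time."""
    pi = weibull_survival(t, scale, shape)
    mu = rate * t
    pois = math.exp(-mu) * mu**y / math.factorial(y)
    return pi + (1.0 - pi) * pois if y == 0 else (1.0 - pi) * pois

# Probability of observing zero tumours over 2 time units of exposure,
# with invented rate/scale/shape parameters:
p0 = zip_weibull_pmf(0, 2.0, 0.5, 3.0, 1.2)
```

Because the zero-inflation probability now shrinks as exposure time grows, subjects observed longer are automatically less likely to be structural zeros, which is the feature the logit formulation lacks.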
Abstract:
OBJECTIVE This study aims to assess the odds of developing incident gout in association with the use of postmenopausal estrogen-progestogen therapy, according to type, timing, duration, and route of administration of estrogen-progestogen therapy. METHODS We conducted a retrospective population-based case-control analysis using the United Kingdom-based Clinical Practice Research Datalink. We identified women (aged 45 y or older) who had a first-time diagnosis of gout recorded between 1990 and 2010. We matched one female control with each case on age, general practice, calendar time, and years of active history in the database. We used multivariate conditional logistic regression to calculate odds ratios (ORs) with 95% CIs (adjusted for confounders). RESULTS The adjusted OR for gout with current use of oral formulations of opposed estrogens (estrogen-progestogen) was 0.69 (95% CI, 0.56-0.86) compared with never use. Current use was associated with a decreased OR for gout in women without renal failure (adjusted OR, 0.71; 95% CI, 0.57-0.87) and hypertension (adjusted OR, 0.62; 95% CI, 0.44-0.87) compared with never use. Tibolone was associated with a decreased OR for gout (adjusted OR, 0.77; 95% CI, 0.63-0.95) compared with never use. Estrogens alone did not alter the OR for gout. CONCLUSIONS Current use of oral opposed estrogens, but not unopposed estrogens, is associated with a decreased OR for incident gout in women without renal failure and is more pronounced in women with hypertension. Use of tibolone is associated with a decreased OR for incident gout. The decreased OR for gout may be related to the progestogen component rather than the estrogen component.
Abstract:
Dialysis patients are at high risk for hepatitis B infection, which is a serious but preventable disease. Prevention strategies include administration of the hepatitis B vaccine. Dialysis patients have been noted to have a poor immune response to the vaccine and to lose immunity more rapidly. The long-term immunogenicity of the hepatitis B vaccine has not been well defined in pediatric dialysis patients, especially if administered during infancy as a routine childhood immunization. Purpose: The aim of this study was to determine the median duration of hepatitis B immunity and to study the effect of vaccination timing and other cofactors on the duration of hepatitis B immunity in pediatric dialysis patients. Methods: Duration of hepatitis B immunity was determined by Kaplan-Meier survival analysis. Comparison of stratified survival analyses was performed using log-rank tests. Multivariate analysis by Cox regression was used to estimate hazard ratios for the effect of timing of vaccine administration and other covariates on the duration of hepatitis B immunity. Results: 193 patients (163 incident patients) had complete data available for analysis. Mean age was 11.2±5.8 years and mean ESRD duration was 59.3±97.8 months. Kaplan-Meier analysis showed that the total median overall duration of immunity (since the time of the primary vaccine series) was 112.7 months (95% CI: 96.6, 124.4), whereas the median overall duration of immunity for incident patients was 106.3 months (95% CI: 93.93, 124.44). Incident patients had a median on-dialysis duration of hepatitis B immunity of 37.1 months (95% CI: 24.16, 72.26). Multivariate adjusted analysis showed a significant difference between patients based on the timing of hepatitis B vaccine administration (p<0.001). Patients immunized after the start of dialysis had a hazard ratio of 6.13 (2.87, 13.08) for loss of hepatitis B immunity compared with patients immunized as infants (p<0.001). Conclusion: This study confirms that patients immunized after dialysis onset have a shorter overall duration of hepatitis B immunity, as measured by hepatitis B antibody titers, and that after the start of dialysis, protective antibody titer levels in pediatric dialysis patients wane rapidly compared with healthy children.
Abstract:
The existence of an association between leukemia and electromagnetic fields (EMF) is still controversial. The results of epidemiologic studies of leukemia in occupational groups with exposure to EMF are inconsistent; weak associations have been seen in a few studies. EMF assessment is lacking in precision: reported dose-response relationships have been based on qualitative levels of exposure to EMF without regard to duration of employment or EMF intensity on the jobs. Furthermore, potential confounding factors in the associations were often not well controlled. The current study is an analysis of data collected from an incident case-control study. The primary objective was to test the hypothesis that occupational exposure to EMF is associated with leukemia, including total leukemia (TL), myelogenous leukemia (MYELOG) and acute non-lymphoid leukemia (ANLL). Potential confounding factors (occupational exposure to benzene, age, smoking, alcohol consumption, and previous medical radiation exposures) were controlled in multivariate logistic regression models. Dose-response relationships were estimated by cumulative occupational exposure to EMF, taking into account duration of employment and EMF intensity on the jobs. In order to overcome weaknesses of most previous studies, special efforts were made to improve the precision of EMF assessment. Two definitions of EMF were used, and discrepancies between the results under the two definitions were observed. These differences raise the question of whether workers in jobs with low EMF exposure should be considered non-exposed in future studies. In addition, the current study suggests using lifetime cumulative EMF exposure estimates to determine dose-response relationships. The analyses of the current study suggest an association between ANLL and employment in selected jobs with high EMF exposure. The existence of an association between the three types of leukemia and broader categories of occupational EMF exposure is still undetermined. If an association does exist between occupational EMF exposure and leukemia, the results of the current study suggest that EMF might only be a potential factor in the promotion of leukemia, but not its initiation.