936 results for PROPORTIONAL HAZARD AND ACCELERATED FAILURE MODELS
Abstract:
INTRODUCTION: Two subcutaneous injections of adalimumab in severe acute sciatica significantly reduced the number of back operations in a short-term randomised controlled clinical trial. OBJECTIVE: To determine in a 3-year follow-up study whether the short-term benefit of adalimumab in sciatica is sustained over a longer period of time. METHODS: The primary outcome of this analysis was incident discectomy. Three years after randomisation, information on surgery could be retrieved in 56/61 patients (92%). A multivariate Cox proportional hazards model, adjusted for potential confounders, was used to determine factors predisposing to surgery. RESULTS: Twenty-three (41%) patients had back surgery within 3 years, 8/29 (28%) in the adalimumab group and 15/27 (56%) in the placebo group, p=0.04. Adalimumab injections reduced the need for back surgery by 61% (HR=0.39, 95% CI 0.17 to 0.92). In a multivariate model, treatment with a tumour necrosis factor-α antagonist remained the strongest protective factor (HR=0.17, p=0.002). Other significant predictors of surgery were a good correlation between symptoms and MRI findings (HR=11.6, p=0.04), baseline intensity of leg pain (HR=1.3, p=0.06), intensity of back pain (HR=1.4, p=0.03) and duration of sickness leave (HR=1.01 per day, p=0.03). CONCLUSION: A short course of adalimumab in patients with severe acute sciatica significantly reduces the need for back surgery.
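For readers unfamiliar with the statistical machinery, the sketch below shows how a multivariate Cox proportional hazards model of time to surgery could be fitted with the Python lifelines package. The data and column names (duration, surgery, adalimumab, leg_pain_baseline) are purely illustrative assumptions, not the trial data or the authors' code.

```python
# Minimal sketch: multivariate Cox proportional hazards model for time to back surgery.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
adalimumab = rng.integers(0, 2, n)
leg_pain = rng.uniform(3, 9, n)
# Simulated hazard: treatment protective, higher baseline leg pain harmful (illustrative only)
rate = 0.02 * np.exp(-0.9 * adalimumab + 0.25 * (leg_pain - 6))
time_to_surgery = rng.exponential(1 / rate)
follow_up = 36.0  # months of follow-up

df = pd.DataFrame({
    "duration": np.minimum(time_to_surgery, follow_up),
    "surgery": (time_to_surgery <= follow_up).astype(int),  # 1 = discectomy occurred
    "adalimumab": adalimumab,
    "leg_pain_baseline": leg_pain,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="surgery")
cph.print_summary()  # hazard ratios (exp(coef)) with 95% CIs and p-values
```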
Abstract:
The objective of this work is to present a multitechnique approach to define the geometry, the kinematics, and the failure mechanism of a retrogressive large landslide (upper part of the La Valette landslide, South French Alps) by combining airborne and terrestrial laser scanning data with ground-based seismic tomography data. The advantage of combining different methods is to constrain the geometrical and failure mechanism models by integrating different sources of information. Because of a high point density at the ground surface (4.1 points m⁻²), a small laser footprint (0.09 m) and an accurate three-dimensional positioning (0.07 m), airborne laser scanning data are well suited to analysing morphological structures at the surface. Seismic tomography surveys (P-wave and S-wave velocities) may highlight the presence of low-seismic-velocity zones that characterize the presence of dense fracture networks in the subsurface. The surface displacements measured from the terrestrial laser scanning data over a period of 2 years (May 2008 to May 2010) allow one to quantify the landslide activity in the direct vicinity of the identified discontinuities. A substantial subsidence of the crown area, with an average subsidence rate of 3.07 m year⁻¹, is determined. The displacement directions indicate that the retrogression is controlled structurally by the preexisting discontinuities. A conceptual structural model is proposed to explain the failure mechanism and the retrogressive evolution of the main scarp. Uphill, the crown area is affected by planar sliding included in a deeper wedge failure system constrained by two preexisting fractures. Downhill, the landslide body acts as a buttress for the upper part. Consequently, the progression of the landslide body downhill allows the development of dip-slope failures, and coherent blocks start sliding along planar discontinuities. The volume of the failed mass in the crown area is estimated at 500,000 m³ with the sloping local base level method.
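The failed volume is estimated in the paper with the sloping local base level (SLBL) method. Purely as a simpler illustration of estimating a displaced volume and a mean subsidence rate from two gridded elevation models, here is a hypothetical numpy sketch; this is plain DEM differencing, not the SLBL method, and all numbers are made up.

```python
# Toy illustration: volume and subsidence rate from differencing two DEM grids.
import numpy as np

cell_size = 1.0  # grid resolution in metres (illustrative)
rng = np.random.default_rng(1)

# Two hypothetical DEMs of the crown area, before and after two years of retrogression
dem_2008 = np.full((100, 100), 2100.0)
dem_2010 = dem_2008 - rng.uniform(0.0, 6.0, (100, 100))

thickness = dem_2008 - dem_2010            # elevation loss per cell (m)
volume = thickness.sum() * cell_size ** 2  # displaced volume (m^3)
mean_rate = thickness.mean() / 2.0         # mean subsidence rate over 2 years (m/year)

print(f"displaced volume ~ {volume:,.0f} m^3")
print(f"mean subsidence rate ~ {mean_rate:.2f} m/year")
```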
Abstract:
Network airlines have been increasingly focusing their operations on hub airports through the exploitation of connecting traffic, allowing them to take advantage of economies of traffic density, which are unequivocal in the airline industry. Less attention has been devoted to airlines' decisions on point-to-point thin routes, which could be served using different aircraft technologies and different business models. This paper examines, both theoretically and empirically, the impact on airlines' networks of the two major innovations in the airline industry in the last two decades: the regional jet technology and the low-cost business model. We show that, under certain circumstances, direct services on point-to-point thin routes can be viable and thus airlines may be interested in deviating passengers out of the hub.
Abstract:
Nonlinear Noisy Leaky Integrate and Fire (NNLIF) models for networks of neurons can be written as Fokker-Planck-Kolmogorov equations on the probability density of neurons, the main parameters in the model being the connectivity of the network and the noise. We analyse several aspects of the NNLIF model: the number of steady states, a priori estimates, blow-up issues and convergence toward equilibrium in the linear case. In particular, for excitatory networks, blow-up always occurs for initial data concentrated close to the firing potential. These results show how critical the balance between noise and excitatory/inhibitory interactions is with respect to the connectivity parameter.
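For orientation, a commonly used Fokker-Planck formulation of the NNLIF system is reproduced below. The notation (density p(v,t), connectivity b, noise a(N), reset potential V_R, firing threshold V_F) follows the standard presentation of such models and is an assumption here, not a quotation of this particular paper.

```latex
\[
\frac{\partial p}{\partial t}(v,t)
+ \frac{\partial}{\partial v}\Big[\big(-v + b\,N(t)\big)\,p(v,t)\Big]
- a\big(N(t)\big)\,\frac{\partial^2 p}{\partial v^2}(v,t)
= \delta(v - V_R)\,N(t), \qquad v \le V_F,
\]
where $p(v,t)$ is the probability density of neurons at membrane potential $v$,
$b$ is the connectivity parameter (excitatory if $b>0$, inhibitory if $b<0$),
$a(N) = a_0 + a_1 N$ models the noise, and the firing rate
\[
N(t) = -\,a\big(N(t)\big)\,\frac{\partial p}{\partial v}(V_F, t) \;\ge 0
\]
couples the equation nonlinearly, neurons being reset at $V_R$ after firing at the threshold $V_F$.
```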
Abstract:
Network airlines have been increasingly focusing their operations on hub airports through the exploitation of connecting traffic, allowing them to take advantage of economies of traffic density, which are unequivocal in the airline industry. Less attention has been devoted to airlines' decisions on point-to-point thin routes, which could be served using different aircraft technologies and different business models. This paper examines, both theoretically and empirically, the impact on airlines' networks of the two major innovations in the airline industry in the last two decades: the regional jet technology and the low-cost business model. We show that, under certain circumstances, direct services on point-to-point thin routes can be viable and thus airlines may be interested in deviating passengers out of the hub. Keywords: regional jet technology; low-cost business model; point-to-point network; hub-and-spoke network. JEL Classification Numbers: L13; L2; L93
Abstract:
Background: The purpose of the work reported here is to test reliable molecular profiles using routinely processed formalin-fixed paraffin-embedded (FFPE) tissues from participants of the clinical trial BIG 1-98 with a median follow-up of 60 months. Methods: RNA from fresh frozen (FF) and FFPE tumor samples of 82 patients was used for quality control, and independent FFPE tissues of 342 postmenopausal participants of BIG 1-98 with ER-positive cancer were analyzed by measuring prospectively selected genes and computing scores representing the functions of the estrogen receptor (eight genes, ER_8), the progesterone receptor (five genes, PGR_5), Her2 (two genes, HER2_2), and proliferation (ten genes, PRO_10) by quantitative reverse transcription PCR (qRT-PCR) on TaqMan Low Density Arrays. Molecular scores were computed for each category, and the ER_8, PGR_5, HER2_2, and PRO_10 scores were combined into a RISK_25 score. Results: Pearson correlation coefficients between FF- and FFPE-derived scores were at least 0.94, and high concordance was observed between molecular scores and immunohistochemical data. The HER2_2, PGR_5, PRO_10 and RISK_25 scores were significant predictors of disease-free survival (DFS) in univariate Cox proportional hazards regression. PRO_10 and RISK_25 scores predicted DFS in patients with histological grade II breast cancer and in lymph node positive disease. The PRO_10 and PGR_5 scores were independent predictors of DFS in multivariate Cox regression models incorporating clinical risk indicators; PRO_10 outperformed the Ki-67 labeling index in multivariate Cox proportional hazards analyses. Conclusions: Scores representing the endocrine responsiveness and proliferation status of breast cancers were developed from gene expression analyses based on RNA derived from FFPE tissues. The validation of the molecular scores with tumor samples of participants of the BIG 1-98 trial demonstrates that such scores can serve as independent prognostic factors to estimate disease-free survival (DFS) in postmenopausal patients with estrogen receptor positive breast cancer.
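As an illustration of turning small gene panels into module scores and checking FF versus FFPE agreement, here is a hedged Python sketch. The gene lists, the simple mean-based scoring, and the simulated data are assumptions for demonstration only and do not reproduce the study's scoring algorithm.

```python
# Illustrative sketch: per-module expression scores and FF vs FFPE Pearson correlation.
import numpy as np
import pandas as pd
from scipy.stats import pearsonr

# Hypothetical gene subsets for two of the modules (illustrative choices only)
modules = {
    "ER_8":   ["ESR1", "GREB1", "TFF1", "SCUBE2"],
    "PRO_10": ["MKI67", "AURKA", "BIRC5", "CCNB1"],
}

rng = np.random.default_rng(42)
genes = sorted({g for gene_list in modules.values() for g in gene_list})
ff = pd.DataFrame(rng.normal(0.0, 1.0, (82, len(genes))), columns=genes)  # fresh frozen
ffpe = ff + rng.normal(0.0, 0.2, ff.shape)                                # noisier FFPE copy

def module_scores(expr: pd.DataFrame) -> pd.DataFrame:
    """Score each module as the mean normalized expression of its genes."""
    return pd.DataFrame({name: expr[gene_list].mean(axis=1)
                         for name, gene_list in modules.items()})

scores_ff, scores_ffpe = module_scores(ff), module_scores(ffpe)
for name in modules:
    r, _ = pearsonr(scores_ff[name], scores_ffpe[name])
    print(f"{name}: Pearson r between FF- and FFPE-derived scores = {r:.2f}")
```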
Abstract:
OBJECTIVES: Reassessment of ongoing antibiotic therapy is an important step towards appropriate use of antibiotics. This study was conducted to evaluate the impact of a short questionnaire designed to encourage reassessment of intravenous antibiotic therapy after 3 days. PATIENTS AND METHODS: Patients hospitalized on the surgical and medical wards of a university hospital and treated with an intravenous antibiotic for 3-4 days were randomly allocated to either an intervention or control group. The intervention consisted of mailing to the physician in charge of the patient a three-item questionnaire referring to possible adaptation of the antibiotic therapy. The primary outcome was the time elapsed from randomization until a first modification of the initial intravenous antibiotic therapy. It was compared between the two groups using Cox proportional hazards modelling. RESULTS: One hundred and twenty-six eligible patients were randomized to the intervention group and 125 to the control group. Time to modification of intravenous antibiotic therapy was 14% shorter in the intervention group (adjusted hazard ratio for modification 1.28, 95% CI 0.99-1.67, P = 0.06). It was significantly shorter in the intervention group compared with a similar group of 151 patients observed during a 2 month period preceding the study (adjusted hazard ratio 1.17, 95% CI 1.03-1.32, P = 0.02). CONCLUSION: The results suggest that a short questionnaire, easily adaptable to automation, has the potential to foster reassessment of antibiotic therapy.
Abstract:
BACKGROUND: We sought to improve upon previously published statistical modeling strategies for binary classification of dyslipidemia for general population screening purposes based on the waist-to-hip circumference ratio and body mass index anthropometric measurements. METHODS: Study subjects were participants in WHO-MONICA population-based surveys conducted in two Swiss regions. Outcome variables were based on the total serum cholesterol to high density lipoprotein cholesterol ratio. The other potential predictor variables were gender, age, current cigarette smoking, and hypertension. The models investigated were: (i) linear regression; (ii) logistic classification; (iii) regression trees; (iv) classification trees (iii and iv are collectively known as "CART"). Binary classification performance of the region-specific models was externally validated by classifying the subjects from the other region. RESULTS: Waist-to-hip circumference ratio and body mass index remained modest predictors of dyslipidemia. Correct classification rates for all models were 60-80%, with marked gender differences. Gender-specific models provided only small gains in classification. The external validations provided assurance about the stability of the models. CONCLUSIONS: There were no striking differences between either the algebraic (i, ii) vs. non-algebraic (iii, iv), or the regression (i, iii) vs. classification (ii, iv) modeling approaches. The anticipated advantages of CART over simple additive linear and logistic models were smaller than expected in this particular application with a relatively small set of predictor variables. CART models may be more useful when considering main effects and interactions among larger sets of predictor variables.
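A minimal sketch of the comparison idea, assuming synthetic data and hypothetical variable names: train a logistic classifier and a classification tree on one region's subjects and validate each externally on the other region, as described above.

```python
# Sketch: logistic classification vs classification tree with cross-region external validation.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def simulate_region(n):
    """Synthetic stand-in for one region's survey participants (illustrative only)."""
    df = pd.DataFrame({
        "whr": rng.normal(0.90, 0.08, n),   # waist-to-hip circumference ratio
        "bmi": rng.normal(26.0, 4.0, n),
        "age": rng.integers(25, 75, n),
        "male": rng.integers(0, 2, n),
        "smoker": rng.integers(0, 2, n),
    })
    logit = -12.0 + 8.0 * df["whr"] + 0.12 * df["bmi"] + 0.02 * df["age"]
    df["dyslipidemia"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)
    return df

region_a, region_b = simulate_region(1500), simulate_region(1500)
features = ["whr", "bmi", "age", "male", "smoker"]

for name, model in [("logistic classification", LogisticRegression(max_iter=1000)),
                    ("classification tree", DecisionTreeClassifier(max_depth=4))]:
    model.fit(region_a[features], region_a["dyslipidemia"])  # build on region A
    acc = accuracy_score(region_b["dyslipidemia"],
                         model.predict(region_b[features]))  # external validation on B
    print(f"{name}: correct classification rate on the other region = {acc:.2f}")
```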
Abstract:
Debris flow hazard modelling at medium (regional) scale has been the subject of various studies in recent years. In this study, hazard zonation was carried out, incorporating information about debris flow initiation probability (spatial and temporal) and the delimitation of the potential runout areas. Debris flow hazard zonation was carried out in the area of the Consortium of Mountain Municipalities of Valtellina di Tirano (Central Alps, Italy). The complexity of the phenomenon, the scale of the study, the variability of local conditioning factors, and the lack of data limited the use of process-based models for the runout zone delimitation. Firstly, a map of hazard initiation probabilities was prepared for the study area, based on the available susceptibility zoning information and on the analysis of two sets of aerial photographs for the temporal probability estimation. Afterwards, the hazard initiation map was used as one of the inputs for an empirical GIS-based model (Flow-R), developed at the University of Lausanne (Switzerland). Estimation of the debris flow magnitude was not attempted, as the main aim of the analysis was to prepare a debris flow hazard map at medium scale. A digital elevation model, with a 10 m resolution, was used together with land use, geology and debris flow hazard initiation maps as inputs of the Flow-R model to restrict potential areas within each hazard initiation probability class to locations where debris flows are most likely to initiate. Afterwards, runout areas were calculated using multiple flow direction and energy-based algorithms. Maximum probable runout zones were calibrated using documented past events and aerial photographs. Finally, two debris flow hazard maps were prepared. The first simply delimits five hazard zones, while the second incorporates information about debris flow spreading direction probabilities, showing the areas more likely to be affected by future debris flows. Limitations of the modelling arise mainly from the models applied and the analysis scale, which neglect local controlling factors of debris flow hazard. The presented approach to debris flow hazard analysis, associating automatic detection of the source areas with a simple assessment of the debris flow spreading, provided results suitable for subsequent hazard and risk studies. However, for the validation and transferability of the parameters and results to other study areas, more testing is needed.
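Flow-R itself combines multiple flow direction and energy-based spreading algorithms on the DEM. Purely as a toy illustration of the multiple-flow-direction idea (flow from a cell split among its lower neighbours in proportion to slope), a possible Python sketch is given below; it is not the Flow-R implementation, and the DEM, exponent and resolution are made-up values.

```python
# Toy multiple-flow-direction step on a small synthetic DEM (illustrative only).
import numpy as np

def mfd_weights(dem, row, col, cell_size=10.0, exponent=1.1):
    """Share of flow sent from one cell to each of its 8 neighbours,
    proportional to (downhill slope) ** exponent; uphill neighbours get nothing."""
    weights = np.zeros((3, 3))
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            r, c = row + dr, col + dc
            if not (0 <= r < dem.shape[0] and 0 <= c < dem.shape[1]):
                continue
            distance = cell_size * np.hypot(dr, dc)
            slope = (dem[row, col] - dem[r, c]) / distance
            if slope > 0:  # only spread downhill
                weights[dr + 1, dc + 1] = slope ** exponent
    total = weights.sum()
    return weights / total if total > 0 else weights

# 10 m resolution toy DEM sloping down toward the south-east
dem = np.add.outer(np.arange(5, 0, -1), np.arange(5, 0, -1)) * 10.0
print(mfd_weights(dem, row=1, col=1))  # fractions of flow sent to each neighbour
```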
Abstract:
Machado-Joseph disease, or spinocerebellar ataxia type 3, the most common dominantly inherited spinocerebellar ataxia, results from translation of the polyglutamine-expanded and aggregation-prone ataxin 3 protein. Clinical manifestations include cerebellar ataxia and pyramidal signs, and there is no therapy to delay disease progression. Beclin 1, an autophagy-related protein and essential gene for cell survival, is decreased in several neurodegenerative disorders. This study aimed at evaluating whether lentiviral-mediated beclin 1 overexpression would rescue motor and neuropathological impairments when administered to pre- and post-symptomatic lentiviral-based and transgenic mouse models of Machado-Joseph disease. Beclin 1 overexpression mediated significant improvements in motor coordination, balance and gait, with beclin 1-treated mice balancing for longer periods on the Rotarod and presenting longer and narrower footprints. Furthermore, in agreement with the improvements observed in motor function, beclin 1 overexpression prevented neuronal dysfunction and neurodegeneration, decreasing the formation of polyglutamine-expanded aggregates and preserving Purkinje cell arborization and immunoreactivity for neuronal markers. These data show that overexpression of beclin 1 in the mouse cerebellum is able to rescue motor deficits and hinder their progression when administered at pre- and post-symptomatic stages of the disease.
Abstract:
Background: Atazanavir boosted with ritonavir (ATV/r) and efavirenz (EFV) are both recommended as first-line therapies for HIV-infected patients. We compared the two therapies for virologic efficacy and immune recovery. Methods: We included all treatment-naïve patients in the Swiss HIV Cohort Study starting therapy after May 2003 with either ATV/r or EFV and a backbone of tenofovir and either emtricitabine or lamivudine. We used Cox models to assess time to virologic failure and repeated measures models to assess the change in CD4 cell counts over time. All models were fit as marginal structural models using both point of treatment and censoring weights. Intent-to-treat and various as-treated analyses were carried out: in the latter, patients were censored at their last recorded measurement if they changed therapy or if they were no longer adherent to therapy. Results: Patients starting EFV (n = 1,097) and ATV/r (n = 384) were followed for a median of 35 and 37 months, respectively. During follow-up, 51% of patients on EFV and 33% of patients on ATV/r remained adherent and made no change to their first-line therapy. Although intent-to-treat analyses suggest virologic failure was more likely with ATV/r, there was no evidence for this disadvantage in patients who adhered to first-line therapy. Patients starting ATV/r had a greater increase in CD4 cell count during the first year of therapy, but this advantage disappeared after one year. Conclusions: In this observational study, there was no good evidence of any intrinsic advantage for one therapy over the other, consistent with earlier clinical trials. Differences between therapies may arise in a clinical setting because of differences in adherence to therapy.
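Below is a compressed sketch of the inverse-probability-weighting idea behind marginal structural models, using synthetic data, hypothetical variable names, and only treatment (not censoring) weights; it is not the cohort analysis itself.

```python
# Sketch: stabilized inverse-probability-of-treatment weights, then a weighted Cox model.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 1500
cd4_baseline = rng.normal(350, 120, n)
# Treatment assignment depends on baseline CD4 (confounding by indication, illustrative)
atv_r = (rng.random(n) < 1 / (1 + np.exp(-(350 - cd4_baseline) / 200))).astype(int)
rate = 0.01 * np.exp(0.1 * atv_r - 0.002 * (cd4_baseline - 350))
time = rng.exponential(1 / rate)
df = pd.DataFrame({
    "months": np.minimum(time, 48.0),
    "virologic_failure": (time <= 48.0).astype(int),
    "atv_r": atv_r,
    "cd4_baseline": cd4_baseline,
})

# Stabilized inverse-probability-of-treatment weights from a propensity model
ps = LogisticRegression(max_iter=1000).fit(df[["cd4_baseline"]], df["atv_r"]) \
        .predict_proba(df[["cd4_baseline"]])[:, 1]
p_marginal = df["atv_r"].mean()
df["iptw"] = np.where(df["atv_r"] == 1, p_marginal / ps, (1 - p_marginal) / (1 - ps))

# Weighted ("marginal structural") Cox model with a robust variance estimate
cph = CoxPHFitter()
cph.fit(df[["months", "virologic_failure", "atv_r", "iptw"]],
        duration_col="months", event_col="virologic_failure",
        weights_col="iptw", robust=True)
cph.print_summary()
```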
Abstract:
Aims: To evaluate whether the Ki-67 labelling index (LI) has independent prognostic value for survival of patients with bladder urothelial tumours graded according to the 2004 World Health Organisation classification. Methods: Ki-67 LI was evaluated in 164 cases using the grid counting method. Non-invasive (stage Ta) tumours were: papilloma (n = 5), papillary urothelial neoplasia of low malignant potential (PUNLMP; n = 26), and low grade (LG; n = 34) or high grade (HG; n = 15) papillary urothelial carcinoma. Early invasive (stage T1) tumours were: LG (n = 58) and HG (n = 26) carcinoma. Statistical analysis included Fisher and χ² tests, and mean comparisons by ANOVA and t test. Univariate and multivariate survival analyses were performed according to the Kaplan–Meier method with log rank test and Cox's proportional hazards method. Results: Mean Ki-67 LI increased from papilloma to PUNLMP, LG, and HG in stage Ta (p<0.0001) and from LG to HG in stage T1 (p = 0.013) tumours. High tumour proliferation (>13%) was related to greater tumour size (p = 0.036), recurrence (p = 0.036), progression (p = 0.035), survival (p = 0.054), and high p53 accumulation (p = 0.015). Ki-67 LI and tumour size were independent predictors of disease-free survival (DFS), but only Ki-67 LI was related to progression-free survival (PFS). Cancer-specific overall survival (OS) was related to Ki-67 LI, tumour size, and p27kip1 downregulation. Ki-67 LI was the main independent predictor of DFS (p = 0.0005), PFS (p = 0.0162), and cancer-specific OS (p = 0.0195). Conclusion: Tumour proliferation measured by Ki-67 LI is related to tumour recurrence and stage progression, and is an independent predictor of DFS, PFS, and cancer-specific OS in TaT1 bladder urothelial cell carcinoma.
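A small sketch, assuming synthetic data, of the univariate survival analysis described here: Kaplan–Meier curves for high versus low Ki-67 labelling index (13% cut-off) compared with a log-rank test, using the Python lifelines package.

```python
# Sketch: Kaplan-Meier estimates and log-rank test for high vs low Ki-67 LI (synthetic data).
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(7)
n = 164
ki67 = rng.uniform(0, 40, n)                 # labelling index in %
high = ki67 > 13
rate = np.where(high, 0.03, 0.01)            # higher recurrence hazard if LI > 13% (illustrative)
time = rng.exponential(1 / rate)
months = np.minimum(time, 120.0)
recurred = (time <= 120.0).astype(int)

kmf = KaplanMeierFitter()
for label, mask in [("Ki-67 > 13%", high), ("Ki-67 <= 13%", ~high)]:
    kmf.fit(months[mask], event_observed=recurred[mask], label=label)
    print(label, "median DFS (months):", kmf.median_survival_time_)

result = logrank_test(months[high], months[~high],
                      event_observed_A=recurred[high], event_observed_B=recurred[~high])
print("log-rank p-value:", result.p_value)
```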
Abstract:
Introduction: Two subcutaneous injections of adalimumab in severe acute sciatica have demonstrated a significant benefit on the number of back surgeries in a short-term randomized controlled clinical trial [1]. This 3-year follow-up study aimed to determine whether the short-term benefit was sustained over a longer period of time. Methods: Information on surgery was retrieved in 56/61 patients (93%). We used a Cox proportional hazards model to determine factors predisposing to surgery. Results: Twenty-three (41%) patients had back surgery within 3 years, 8/29 (28%) in the adalimumab group and 15/27 (56%) in the placebo group, p = 0.038. Adalimumab injections reduced the need for back surgery by 61% (hazard ratio (HR) 0.39, 95% CI 0.17-0.92). In a multivariate model, treatment with a TNF-α antagonist remained the strongest protective factor (HR = 0.17, p = 0.002). Other significant predictors of surgery were a good correlation between symptoms and MRI findings (HR = 11.6, p = 0.04), baseline intensity of leg pain (HR = 1.3, p = 0.06), intensity of back pain (HR = 1.4, p = 0.03) and duration of sickness leave (HR = 1.01 per day, p = 0.03). Conclusion: A short course of adalimumab in patients with severe acute sciatica significantly reduces the need for back surgery.
Abstract:
BACKGROUND: Socio-economic inequalities in mortality are observed at the country level in both North America and Europe. The purpose of this work is to investigate the contribution of specific risk factors to social inequalities in cause-specific mortality using a large multi-country cohort of Europeans. METHODS: A total of 3,456,689 person-years of follow-up of the European Prospective Investigation into Cancer and Nutrition (EPIC) was analysed. Educational level of subjects coming from 9 European countries was recorded as a proxy for socio-economic status (SES). Cox proportional hazards models with a step-wise inclusion of explanatory variables were used to explore the association between SES and mortality; a Relative Index of Inequality (RII) was calculated as a measure of relative inequality. RESULTS: Total mortality among men with the highest education level is reduced by 43% compared to men with the lowest (HR 0.57, 95% C.I. 0.52-0.61); among women by 29% (HR 0.71, 95% C.I. 0.64-0.78). The risk reduction was attenuated by 7% in men and 3% in women by the introduction of smoking, and to a lesser extent by the introduction of body mass index (2% in men and 3% in women) and of additional explanatory variables, namely alcohol consumption, leisure physical activity and fruit and vegetable intake (3% in men and 5% in women). Social inequalities were highly statistically significant for all causes of death examined in men. In women, social inequalities were less strong, but statistically significant for all causes of death except cancer-related mortality and injuries. DISCUSSION: In this European study, substantial social inequalities in mortality among European men and women, which cannot be fully explained by known common risk factors for chronic diseases, are reported.
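One common way to obtain a relative index of inequality (RII) is to map the ordered education categories to the midpoints of their cumulative population shares (ridit scores) and use that score as the exposure in a Cox model; the hazard ratio for a change of the score from 0 to 1 is then read as the RII. The sketch below uses synthetic data and this generic recipe, which may differ in detail from the EPIC analysis; coding conventions for which end of the scale is 0 also vary.

```python
# Sketch: ridit scoring of education and an RII from a Cox model (synthetic data).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 5000
# Ordered education categories, 0 = lowest, 2 = highest (synthetic proportions)
education = rng.choice([0, 1, 2], size=n, p=[0.3, 0.5, 0.2])

# Ridit score: cumulative population share up to the midpoint of each category,
# oriented so that ~0 corresponds to the highest and ~1 to the lowest education.
props = pd.Series(education).value_counts(normalize=True).sort_index(ascending=False)
cum = props.cumsum() - props / 2
ridit = pd.Series(education).map(cum)

rate = 0.005 * np.exp(0.5 * ridit)          # higher mortality with lower education (illustrative)
time = rng.exponential(1 / rate)
df = pd.DataFrame({
    "years": np.minimum(time, 15.0),
    "died": (time <= 15.0).astype(int),
    "ridit": ridit,
})

cph = CoxPHFitter().fit(df, duration_col="years", event_col="died")
rii = np.exp(cph.params_["ridit"])          # HR comparing ridit = 1 with ridit = 0
print(f"Relative Index of Inequality ~ {rii:.2f}")
```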
Abstract:
Quantitative or algorithmic trading is the automation of investment decisions obeying a fixed or dynamic set of rules to determine trading orders. It has increasingly made its way into the biggest financial markets, accounting for up to 70% of the trading volume of venues such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, due to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject in which publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration and basic models for price and spread dynamics necessary for building quantitative strategies. We also contrast these models with real market data sampled at one-minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped in the so-called technical models and the latter in so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast into a well-defined scientific predictor if the signals generated by them pass the test of being Markov times. That is, we can tell whether the signal has occurred or not by examining the information up to the current time; or, more technically, whether the event is F_t-measurable. On the other hand, the concept of pairs trading, or market-neutral strategy, is fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, through a co-integration framework, to stochastic differential equations such as the well-known mean-reverting Ornstein-Uhlenbeck process and its variations. A model for forecasting any economic or financial magnitude could be properly defined with scientific rigor but could also lack any economic value and be considered useless from a practical point of view. This is why this project could not be complete without a backtesting of the mentioned strategies. Conducting a useful and realistic backtesting is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. This is why we put emphasis on the calibration process of the strategies' parameters to adapt to the given market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and that calibration must be done at a high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtesting with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numeric computations and graphics used and shown in this project were implemented in MATLAB from scratch as a part of this thesis. No other mathematical or statistical software was used.
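As an example of the pairs-trading calibration discussed above, the sketch below simulates a mean-reverting Ornstein-Uhlenbeck spread and recovers its parameters from an AR(1) regression. The thesis used MATLAB and real Bloomberg/Yahoo! data, whereas this is a synthetic Python illustration with made-up parameter values.

```python
# Sketch: calibrating a mean-reverting Ornstein-Uhlenbeck spread via an AR(1) regression.
# Exact discretization: X_{t+dt} = X_t * e^{-theta dt} + mu (1 - e^{-theta dt}) + noise.
import numpy as np

rng = np.random.default_rng(0)
dt = 1 / (252 * 390)                       # one-minute steps (252 days x 390 minutes)
theta_true, mu_true, sigma_true = 15.0, 0.0, 0.5

# Simulate a spread series
n = 20_000
x = np.empty(n)
x[0] = 0.2
for t in range(n - 1):
    x[t + 1] = (x[t] * np.exp(-theta_true * dt)
                + mu_true * (1 - np.exp(-theta_true * dt))
                + sigma_true * np.sqrt((1 - np.exp(-2 * theta_true * dt)) / (2 * theta_true))
                * rng.standard_normal())

# AR(1) least-squares fit: x[t+1] = a + b * x[t] + eps, then map back to OU parameters
b, a = np.polyfit(x[:-1], x[1:], 1)
theta_hat = -np.log(b) / dt
mu_hat = a / (1 - b)
resid = x[1:] - (a + b * x[:-1])
sigma_hat = resid.std() * np.sqrt(2 * theta_hat / (1 - b ** 2))
print(f"theta ~ {theta_hat:.1f}, mu ~ {mu_hat:.3f}, sigma ~ {sigma_hat:.2f}")
```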