38 results for perdurability over time


Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND AND AIMS Limited data from large cohorts are available on switching between tumor necrosis factor (TNF) antagonists (infliximab, adalimumab, certolizumab pegol) over time. We aimed to evaluate the prevalence of switching from one TNF antagonist to another and to identify associated risk factors. METHODS Data from the Swiss Inflammatory Bowel Diseases Cohort Study (SIBDCS) were analyzed. RESULTS Of 1731 patients included in the SIBDCS (956 with Crohn's disease [CD] and 775 with ulcerative colitis [UC]), 347 CD patients (36.3%) and 129 UC patients (16.6%) were treated with at least one TNF antagonist. A total of 53/347 (15.3%) CD patients (median disease duration 9 years) and 20/129 (15.5%) UC patients (median disease duration 7 years) needed to switch to a second and/or third TNF antagonist. Median treatment duration was longest for the first TNF antagonist used (CD 25 months; UC 14 months), followed by the second (CD 13 months; UC 4 months) and third TNF antagonist (CD 11 months; UC 15 months). Primary nonresponse, loss of response and side effects were the major reasons to stop and/or switch TNF antagonist therapy. A low body mass index, a short diagnostic delay and extraintestinal manifestations at inclusion were identified as risk factors for switching from the first TNF antagonist within 24 months of its use in CD patients. CONCLUSION Switching of the TNF antagonist over time is a common issue. The median treatment duration with a specific TNF antagonist diminishes as the number of TNF antagonists used increases.
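The switching prevalence and per-line median treatment durations reported above are simple tabulations. The sketch below shows, on an invented long-format table (hypothetical column names patient, diagnosis, line, months), how such figures could be derived; it is an illustration, not the SIBDCS analysis code.

```python
import pandas as pd

# One row per TNF-antagonist course: patient id, diagnosis (CD/UC),
# line of therapy (1 = first TNF antagonist, 2 = second, ...), months on drug.
courses = pd.DataFrame({
    "patient":   [1, 1, 2, 3, 3, 3, 4],
    "diagnosis": ["CD", "CD", "CD", "UC", "UC", "UC", "UC"],
    "line":      [1, 2, 1, 1, 2, 3, 1],
    "months":    [25, 13, 40, 14, 4, 15, 30],
})

# Proportion of treated patients who ever needed a second (or later) TNF antagonist.
switched = courses.groupby(["diagnosis", "patient"])["line"].max().gt(1)
print(switched.groupby("diagnosis").mean())

# Median treatment duration by diagnosis and line of therapy.
print(courses.groupby(["diagnosis", "line"])["months"].median())
```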

Relevance: 100.00%

Publisher:

Abstract:

The present research is concerned with the direction of influence between objective and subjective career success. We conducted a prospective longitudinal study with five waves of measurement that covered a time span of 10 years. Participants were professionals working in different occupational fields (N=1,336). We modelled the changes in objective success (income, hierarchical position), in other-referent subjective success (subjective success as compared to a reference group), and in self-referent subjective success (job satisfaction) by means of latent growth curve analysis. Objective success influenced both the initial level and the growth of other-referent subjective success, but it had no influence on job satisfaction. Most importantly, both measures of subjective success, in both their initial levels and their changes, had strong influences on the growth of objective success. We conclude that the ‘objective success influences subjective success’ relationship is smaller than might be expected, whereas the ‘subjective success influences objective success’ relationship is larger than might be expected.
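The study fits latent growth curve models; a linear growth curve for a single outcome is equivalent to a mixed-effects model with a random intercept and a random time slope, so a rough, simplified stand-in can be sketched with statsmodels. The file career_success_long.csv and its columns (person, years, income, baseline_satisfaction) are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Long format: one row per person and measurement wave (five waves over ten years).
df = pd.read_csv("career_success_long.csv")  # person, years, income, baseline_satisfaction

# Random intercept and random slope for time per person; baseline job satisfaction
# may shift both the level (main effect) and the growth (interaction) of income.
model = smf.mixedlm(
    "income ~ years * baseline_satisfaction",
    data=df,
    groups=df["person"],
    re_formula="~years",
)
result = model.fit()
print(result.summary())
```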

Relevance: 100.00%

Publisher:

Abstract:

OBJECTIVES Gender-specific data on the outcome of combination antiretroviral therapy (cART) are a subject of controversy. We aimed to compare treatment responses between genders in a setting of equal access to cART over a 14-year period. METHODS Analyses included treatment-naïve participants in the Swiss HIV Cohort Study starting cART between 1998 and 2011 and were restricted to patients infected by heterosexual contacts or injecting drug use, excluding men who have sex with men. RESULTS A total of 3925 patients (1984 men and 1941 women) were included in the analysis. Women were younger and had higher CD4 cell counts and lower HIV RNA at baseline than men. Women were less likely to achieve virological suppression < 50 HIV-1 RNA copies/mL at 1 year (75.2% versus 78.1% of men; P = 0.029) and at 2 years (77.5% versus 81.1%, respectively; P = 0.008), whereas no difference between sexes was observed at 5 years (81.3% versus 80.5%, respectively; P = 0.635). The probability of virological suppression increased in both genders over time (test for trend, P < 0.001). The median increase in CD4 cell count at 1, 2 and 5 years was generally higher in women during the whole study period, but it gradually improved over time in both sexes (P < 0.001). Women also were more likely to switch or stop treatment during the first year of cART, and stops were only partly driven by pregnancy. In multivariate analysis, after adjustment for sociodemographic factors, HIV-related factors, cART and calendar period, female gender was no longer associated with lower odds of virological suppression. CONCLUSIONS Gender inequalities in the response to cART are mainly explained by the different prevalence of socioeconomic characteristics in women compared with men.
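As an illustration of the adjustment step described in the conclusions (hypothetical file and column names, not the actual SHCS analysis), a crude and an adjusted logistic regression for virological suppression at one year could look like this:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("shcs_cart.csv")  # suppressed_1y (0/1), female (0/1), age, baseline_cd4,
                                   # baseline_rna, education, period (calendar period of cART start)

# Crude gender effect on suppression at 1 year.
crude = smf.logit("suppressed_1y ~ female", data=df).fit()

# Adjusted for sociodemographic and HIV-related factors and calendar period.
adjusted = smf.logit(
    "suppressed_1y ~ female + age + baseline_cd4 + baseline_rna + C(education) + C(period)",
    data=df,
).fit()

# If the adjusted odds ratio for 'female' moves toward 1, the crude gender gap is
# explained by the covariates, mirroring the conclusion above.
print(np.exp(crude.params["female"]), np.exp(adjusted.params["female"]))
```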

Relevance: 90.00%

Publisher:

Abstract:

To define the feasibility of utilizing T2* mapping for assessment of early cartilage degeneration prior to surgery in patients with symptomatic femoroacetabular impingement (FAI), we compared cartilage of the hip joint in patients with FAI and healthy volunteers using T2* mapping at 3.0 Tesla over time.

Relevance: 90.00%

Publisher:

Abstract:

Objective: We compare the prognostic strength of the lymph node ratio (LNR), positive lymph nodes (+LNs) and collected lymph nodes (LNcoll) using a time-dependent analysis in colorectal cancer patients stratified by mismatch repair (MMR) status. Method: 580 stage III-IV patients were included. Multivariable Cox regression analysis and time-dependent receiver operating characteristic (tROC) curve analysis were performed. The area under the curve (AUC) over time was compared for the three features. Results were validated on a second cohort of 105 stage III-IV patients. Results: The AUC for the LNR was 0.71, outperforming +LNs and LNcoll by 10–15% in both MMR-proficient and MMR-deficient cancers. LNR and +LNs were both significant (p<0.0001) in multivariable analysis, but the effect was considerably stronger for the LNR [LNR: HR=5.18 (95% CI: 3.5–7.6); +LNs: HR=1.06 (95% CI: 1.04–1.08)]. Similar results were obtained for patients with >12 LNcoll. An optimal cut-off score for the LNR of 0.231 was validated on the second cohort (p<0.001). Conclusion: The LNR outperforms +LNs and LNcoll even in patients with >12 LNcoll. Its clinical value is not confounded by MMR status. A cut-off score of 0.231 may best stratify patients into prognostic subgroups and could be a basis for the future prospective analysis of the LNR.
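The two analysis steps named in the Methods (multivariable Cox regression and time-dependent ROC curves) can be sketched as follows. The file crc_stage3_4.csv, its columns, and the time grid are hypothetical, and the time-dependent AUC here uses scikit-survival's cumulative/dynamic AUC as a stand-in for the paper's tROC implementation.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from sksurv.metrics import cumulative_dynamic_auc
from sksurv.util import Surv

df = pd.read_csv("crc_stage3_4.csv")  # time (months), event (0/1), lnr, pos_ln, ln_coll

# Multivariable Cox model: hazard ratios for the LNR and the number of positive nodes.
cph = CoxPHFitter()
cph.fit(df[["time", "event", "lnr", "pos_ln"]], duration_col="time", event_col="event")
cph.print_summary()

# Time-dependent AUC of each candidate marker over a grid of follow-up times.
y = Surv.from_arrays(event=df["event"].astype(bool), time=df["time"])
times = np.arange(12, 61, 12)  # 1 to 5 years, in months
for marker in ["lnr", "pos_ln", "ln_coll"]:
    _, mean_auc = cumulative_dynamic_auc(y, y, df[marker].values, times)
    print(marker, round(mean_auc, 2))
```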

Relevance: 90.00%

Publisher:

Abstract:

Neural dynamic processes correlated over several time scales are found in vivo, in stimulus-evoked as well as spontaneous activity, and are thought to affect the way sensory stimulation is processed. Despite their potential computational consequences, a systematic description of the presence of multiple time scales in single cortical neurons is lacking. In this study, we injected fast-spiking and pyramidal (PYR) neurons in vitro with long-lasting episodes of step-like and noisy, in-vivo-like current. Several processes shaped the time course of the instantaneous spike frequency, which could be reduced to a small number (1-4) of phenomenological mechanisms, either reducing (adapting) or increasing (facilitating) the neuron's firing rate over time. The different adaptation/facilitation processes cover a wide range of time scales, ranging from initial adaptation (<10 ms, PYR neurons only), to fast adaptation (<300 ms), early facilitation (0.5-1 s, PYR only), and slow (or late) adaptation (order of seconds). These processes are characterized by broad distributions of their magnitudes and time constants across cells, showing that multiple time scales are at play in cortical neurons, even in response to stationary stimuli and in the presence of input fluctuations. These processes might be part of a cascade of processes responsible for the power-law behavior of adaptation observed in several preparations, and may have far-reaching computational consequences that have recently been described.
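The phenomenological description above (a small number of adapting or facilitating processes with distinct time constants) amounts to fitting the instantaneous firing rate with a sum of exponentials. The toy sketch below, with invented parameter values and synthetic data, illustrates such a fit; it is not the authors' fitting procedure.

```python
import numpy as np
from scipy.optimize import curve_fit

def rate_model(t, r_inf, a_fast, tau_fast, a_slow, tau_slow):
    """Steady-state rate plus one fast and one slow exponential component.
    Positive amplitudes decay toward r_inf (adaptation); negative ones rise toward it (facilitation)."""
    return r_inf + a_fast * np.exp(-t / tau_fast) + a_slow * np.exp(-t / tau_slow)

# Synthetic instantaneous firing rate (Hz), sampled every 10 ms over 10 s of step current.
t = np.arange(0.0, 10.0, 0.01)
rate = rate_model(t, 12.0, 20.0, 0.2, 5.0, 3.0) + np.random.normal(0.0, 0.5, t.size)

# Recover the amplitudes and time constants from the noisy trace.
params, _ = curve_fit(rate_model, t, rate, p0=[10, 10, 0.1, 5, 2])
print(dict(zip(["r_inf", "a_fast", "tau_fast", "a_slow", "tau_slow"], params)))
```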

Relevance: 90.00%

Publisher:

Abstract:

OBJECTIVE: To determine whether the virulence of HIV-1 has been changing since its introduction into Switzerland. DESIGN: A prospective cohort study of HIV-1 infected individuals with well-characterized pre-therapy disease history. METHODS: To minimize the effect of recently imported viruses and ethnicity-associated host factors, the analysis was restricted to the white, north-west-European majority population of the cohort. Virulence was characterized by the decline slope of the CD4 cell count (n = 817 patients), the decline slope of the CD4:CD8 ratio (n = 815 patients) and the viral setpoint (n = 549 patients) in untreated patients with sufficient data points. Linear regression models were used to detect correlations between the date of diagnosis (ranging between 1984 and 2003) and the virulence markers, controlling for gender, exposure category, age and CD4 cell count at entry. RESULTS: We found no correlation between any of the virulence markers and the date of diagnosis. Inspection of short-term trends confirmed that virulence has fluctuated around a stable level over time. CONCLUSIONS: The lack of long-term time trends in the virulence markers indicates that HIV-1 is not evolving towards increasing or decreasing virulence at a perceptible rate. Both highly virulent and attenuated strains have apparently been unable to spread at the population level. This result suggests that either the evolution of virulence may be slow or inhibited due to evolutionary constraints, or HIV-1 may have already evolved to optimal virulence in the human host.
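The trend analysis described in the Methods (linear regression of each virulence marker on the date of diagnosis, adjusted for the listed covariates) could be sketched as follows; the file and column names are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("shcs_virulence.csv")  # cd4_slope (cells/µL per year), diagnosis_year,
                                        # sex, exposure_group, age, cd4_at_entry

model = smf.ols(
    "cd4_slope ~ diagnosis_year + C(sex) + C(exposure_group) + age + cd4_at_entry",
    data=df,
).fit()

# A diagnosis_year coefficient indistinguishable from zero corresponds to the
# reported absence of a long-term trend in this virulence marker.
print(model.summary())
```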

Relevance: 90.00%

Publisher:

Abstract:

It has been established that successful pancreas transplantation in Type 1 (insulin-dependent) diabetic patients results in normal but exaggerated phasic glucose-induced insulin secretion, normal intravenous glucose disappearance rates, improved glucose recovery from insulin-induced hypoglycaemia, improved glucagon secretion during insulin-induced hypoglycaemia, but no alterations in pancreatic polypeptide responses to hypoglycaemia. However, previous reports have not segregated the data in terms of the length of time following successful transplantation and very little prospective data collected over time in individual patients has been published. This article reports that in general there are no significant differences in the level of improvement when comparing responses as early as three months post-operatively up to as long as two years post-operatively when examining the data cross-sectionally in patients who have successfully maintained their allografts. Moreover, this remarkable constancy in pancreatic islet function is also seen in a smaller group of patients who have been examined prospectively at various intervals post-operatively. It is concluded that successful pancreas transplantation results in remarkable improvements in Alpha and Beta cell but not PP cell function that are maintained for at least one to two years.

Relevance: 90.00%

Publisher:

Abstract:

OBJECTIVES: This study examined the course of low-back pain over 52 weeks following current pain at baseline. Initial beliefs about the inevitability of the pain's negative consequences and fear avoidance beliefs were examined as potential risk factors for persistent low-back pain. METHODS: On a weekly basis over a period of one year, 264 participants reported both the intensity and frequency of their low-back pain and the degree to which it impaired their work performance. In a multilevel regression analysis, predictor variables included initial low-back pain intensity, age, gender, body mass index, anxiety/depression, participation in sport, heavy workload, time (1-52 weeks), and scores on the "back beliefs" and "fear-avoidance beliefs" questionnaires. RESULTS: The group mean values for both the intensity and frequency of weekly low-back pain, and the impairment of work performance due to such pain showed a recovery within the first 12 weeks. In a multilevel regression of 9497 weekly measurements, greater weekly low-back pain and impairment were predicted by higher levels of work-related fear avoidance beliefs. A significant interaction between time and the scores on both the work-related fear-avoidance and back beliefs questionnaires indicated faster recovery and pain relief over time in those who reported less fear-avoidance and fewer negative beliefs. CONCLUSIONS: Negative beliefs about the inevitability of adverse consequences of low-back pain and work-related, fear-avoidance beliefs are independent risk factors for poor recovery from low-back pain.
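The multilevel regression described above (weekly pain measurements nested within participants, with time-by-belief interactions capturing differential recovery) can be sketched as a mixed-effects model; the file lbp_weekly.csv and its column names are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

weekly = pd.read_csv("lbp_weekly.csv")  # participant, week (1-52), pain, fear_avoidance_work,
                                        # back_beliefs, baseline_pain, age, bmi

# Random intercept per participant; the week:belief interactions test whether recovery
# over time differs by fear-avoidance and back beliefs, as in the abstract.
model = smf.mixedlm(
    "pain ~ week * fear_avoidance_work + week * back_beliefs + baseline_pain + age + bmi",
    data=weekly,
    groups=weekly["participant"],
)
print(model.fit().summary())
```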

Relevance: 90.00%

Publisher:

Abstract:

BACKGROUND & AIMS Development of strictures is a major concern for patients with eosinophilic esophagitis (EoE). At diagnosis, EoE can present with an inflammatory phenotype (characterized by whitish exudates, furrows, and edema), a stricturing phenotype (characterized by rings and stenosis), or a combination of these. Little is known about progression of stricture formation; we evaluated stricture development over time in the absence of treatment and investigated risk factors for stricture formation. METHODS We performed a retrospective study using the Swiss EoE Database, collecting data on 200 patients with symptomatic EoE (153 men; mean age at diagnosis, 39 ± 15 years old). Stricture severity was graded based on the degree of difficulty associated with passing of the standard adult endoscope. RESULTS The median delay in diagnosis of EoE was 6 years (interquartile range, 2-12 years). With increasing duration of delay in diagnosis, the prevalence of fibrotic features of EoE, based on endoscopy, increased from 46.5% (diagnostic delay, 0-2 years) to 87.5% (diagnostic delay, >20 years; P = .020). Similarly, the prevalence of esophageal strictures increased with duration of diagnostic delay, from 17.2% (diagnostic delay, 0-2 years) to 70.8% (diagnostic delay, >20 years; P < .001). Diagnostic delay was the only risk factor for strictures at the time of EoE diagnosis (odds ratio = 1.08; 95% confidence interval: 1.040-1.122; P < .001). CONCLUSIONS The prevalence of esophageal strictures correlates with the duration of untreated disease. These findings indicate the need to minimize delay in diagnosis of EoE.
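The reported odds ratio of 1.08 per year of diagnostic delay comes from a logistic regression; a minimal sketch of that kind of model (hypothetical file and column names) is shown below.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

eoe = pd.read_csv("swiss_eoe.csv")  # stricture (0/1), delay_years, age_dx, sex

fit = smf.logit("stricture ~ delay_years + age_dx + C(sex)", data=eoe).fit()

# Per-year odds ratio for diagnostic delay with its 95% confidence interval.
or_per_year = np.exp(fit.params["delay_years"])
ci_low, ci_high = np.exp(fit.conf_int().loc["delay_years"])
print(f"OR per year of delay: {or_per_year:.2f} ({ci_low:.2f}-{ci_high:.2f})")
```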

Relevance: 90.00%

Publisher:

Abstract:

OBJECTIVES We sought to analyze the time course of atrial fibrillation (AF) episodes before and after circular plus linear left atrial ablation and the percentage of patients with complete freedom from AF after ablation by using serial seven-day electrocardiograms (ECGs). BACKGROUND The curative treatment of AF targets the pathophysiological cornerstones of AF (i.e., the initiating triggers and/or the perpetuation of AF). The pathophysiological complexity of both may not result in an "all-or-nothing" response but may modify the number and duration of AF episodes. METHODS In patients with highly symptomatic AF, circular plus linear ablation lesions were placed around the left and right pulmonary veins, between the two circles, and from the left circle to the mitral annulus using the electroanatomic mapping system. Repeated continuous 7-day ECGs recorded before and after catheter ablation were used for rhythm follow-up. RESULTS In 100 patients with paroxysmal (n = 80) and persistent (n = 20) AF, the relative duration of time spent in AF decreased significantly over time (35 +/- 37% before ablation, 26 +/- 41% directly after ablation, and 10 +/- 22% after 12 months). Freedom from AF increased stepwise in patients with paroxysmal AF and after 12 months reached 88% or 74%, depending on whether the 24-h ECG or the 7-day ECG was used. Complete pulmonary vein isolation was demonstrated in <20% of the circular lesions. CONCLUSIONS The results obtained in patients with AF treated with circular plus linear left atrial lesions strongly indicate that substrate modification is the main underlying pathophysiologic mechanism and that it results in a delayed cure instead of an immediate cure.
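The primary rhythm endpoint, the relative time spent in AF during a 7-day ECG, is a simple proportion of recording time; the illustration below uses invented episode durations.

```python
# Hypothetical AF episodes (hours) detected in one 7-day recording.
af_episodes_hours = [5.5, 12.0, 0.75, 26.0]
recording_hours = 7 * 24

time_in_af = sum(af_episodes_hours) / recording_hours
print(f"Relative time in AF: {100 * time_in_af:.1f}%")  # ~26% of the recording in this example
```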

Relevance: 90.00%

Publisher:

Abstract:

Background Non-AIDS-defining cancers (NADC) are an important cause of morbidity and mortality in HIV-positive individuals. Using data from a large international cohort of HIV-positive individuals, we described the incidence of NADC from 2004 to 2010, as well as subsequent mortality and its predictors. Methods Individuals were followed from 1st January 2004 or enrolment in the study, until the earliest of a new NADC, 1st February 2010, death, or six months after the patient's last visit. Incidence rates were estimated for each year of follow-up, overall and stratified by gender, age and mode of HIV acquisition. Cumulative risk of mortality following NADC diagnosis was summarised using Kaplan-Meier methods, with follow-up for these analyses from the date of NADC diagnosis until the patient's death, 1st February 2010 or 6 months after the patient's last visit. Factors associated with mortality following NADC diagnosis were identified using multivariable Cox proportional hazards regression. Results Over 176,775 person-years (PY), 880 (2.1%) patients developed a new NADC (incidence: 4.98/1000 PY [95% confidence interval 4.65, 5.31]). Over a third of these patients (327, 37.2%) had died by 1st February 2010. Time trends for lung cancer, anal cancer and Hodgkin's lymphoma were broadly consistent. Kaplan-Meier cumulative mortality estimates at 1, 3 and 5 years after NADC diagnosis were 28.2% [95% CI 25.1-31.2], 42.0% [38.2-45.8] and 47.3% [42.4-52.2], respectively. Significant predictors of poorer survival after diagnosis of NADC were lung cancer (compared to other cancer types), male gender, non-white ethnicity, and smoking status. Later year of diagnosis and higher CD4 count at NADC diagnosis were associated with improved survival. The incidence of NADC remained stable over the period 2004–2010 in this large observational cohort. Conclusions The prognosis after diagnosis of NADC, in particular lung cancer and disseminated cancer, is poor but has improved somewhat over time. Modifiable risk factors, such as smoking and low CD4 counts, were associated with mortality following a diagnosis of NADC.
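The incidence rate per 1000 person-years and the Kaplan-Meier cumulative mortality after NADC diagnosis are standard descriptive quantities; the sketch below (hypothetical file and column names, lifelines for the Kaplan-Meier step) shows how they could be computed.

```python
import pandas as pd
from lifelines import KaplanMeierFitter

cohort = pd.read_csv("nadc_cohort.csv")  # person_years, new_nadc (0/1),
                                         # time_from_nadc_years, died (0/1)

# Crude NADC incidence rate per 1000 person-years of follow-up.
rate = 1000 * cohort["new_nadc"].sum() / cohort["person_years"].sum()
print(f"Incidence: {rate:.2f} per 1000 PY")

# Kaplan-Meier cumulative mortality after NADC diagnosis (patients with an NADC only).
nadc = cohort[cohort["new_nadc"] == 1]
kmf = KaplanMeierFitter()
kmf.fit(nadc["time_from_nadc_years"], event_observed=nadc["died"])
print(1 - kmf.survival_function_at_times([1, 3, 5]))  # cumulative mortality at 1, 3 and 5 years
```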