912 results for "failure of treatment"
Abstract:
AIM To compare dentoskeletal and soft tissue treatment effects of two alternative Class II division 1 treatment modalities (maxillary first permanent molar extraction versus Herbst appliance). METHODS One hundred fifty-four Class II division 1 patients were included who had been treated either with extraction of the upper first molars and a lightwire multibracket (MB) appliance (n = 79; 38 girls, 41 boys) or without extractions by means of a Herbst-MB appliance (n = 75; 35 girls, 40 boys). The groups were matched on age and sex. The average age at the start of treatment was 12.7 years for the extraction group and 13.0 years for the Herbst group. Pretreatment (T1) and posttreatment (T2) lateral cephalograms were retrospectively analyzed using a standard cephalometric analysis and the sagittal occlusal analysis according to Pancherz. RESULTS The SNA decrease was 1.10° (p = 0.001) more pronounced in the extraction group, and the SNB angle increased 1.49° more in the Herbst group (p < 0.001). In the extraction group, a decrease in SNB angle (0.49°) was observed. The soft tissue profile convexity (N-Sn-Pog) decreased in both groups, 0.78° more pronounced (n.s.) in the Herbst group. The nasolabial angle increased significantly more (+2.33°, p = 0.025) in the extraction group. The mechanism of overjet correction in the extraction group was predominantly dental (65% dental and 35% skeletal changes), while in the Herbst group it was predominantly skeletal (58% skeletal and 42% dental changes) in origin. CONCLUSION Both treatment methods were successful and led to a correction of the Class II division 1 malocclusion. Whereas upper first molar extraction treatment can be expected to produce more dental and maxillary effects, Herbst treatment produces predominantly skeletal and mandibular effects.
Abstract:
The aim of this work was to investigate the published evidence comparing self-perception and professional diagnosis of orthodontic treatment need. A search of the Cochrane Library, MEDLINE, and Scopus databases and the archives of two orthodontic journals was carried out from January 1966 to August 2011 by the two authors using Medical Subject Heading terms. Studies that investigated solely either self-perception of orthodontic need by laypersons or assessment of orthodontic need by professionals were excluded from the data analysis. The methodological soundness of each study and the aggregate level of evidence were evaluated according to predetermined criteria. A moderate level of evidence, the highest grade attained, was assigned to 9.1 per cent of the 22 studies finally included in the data analysis. The overall evidence level provided by the evaluated publications was rated as limited. However, the existing body of evidence indicated a highly variable association between self-perception of orthodontic treatment need and the orthodontist's assessment. Future controlled studies with well-defined samples and a common assessment methodology will further clarify the relationship between perception of treatment need by laypersons and orthodontists and enhance international comparison and the development of health care strategies.
Abstract:
BACKGROUND The sympathetic nervous system (SNS) is an important regulator of cardiovascular function. Activation of the SNS plays an important role in the pathophysiology and prognosis of cardiovascular diseases such as heart failure, acute coronary syndromes, arrhythmia, and possibly hypertension. Vasodilators such as adenosine and sodium nitroprusside are known to activate the SNS via baroreflex mechanisms. Because vasodilators are widely used in the treatment of patients with cardiovascular diseases, the aim of the present study was to assess the influence of clinically used dosages of isosorbide dinitrate and captopril on sympathetic nerve activity at rest and during stimulatory maneuvers. METHODS AND RESULTS Twenty-eight healthy volunteers were included in this double-blind placebo-controlled study, and muscle sympathetic nerve activity (MSA; with microelectrodes in the peroneal nerve), blood pressure, heart rate, and neurohumoral parameters were measured before and 90 minutes after the oral administration of 40 mg isosorbide dinitrate or 6.25 mg captopril. Furthermore, a 3-minute mental stress test and a cold pressor test were performed before and 90 minutes after drug administration. Resting MSA did not change after captopril but decreased compared with placebo (P < .05 versus placebo), whereas isosorbide dinitrate led to a marked increase in MSA (P < .05). Systolic blood pressure was reduced by isosorbide dinitrate (P < .05), whereas captopril decreased diastolic blood pressure (P < .05). The increases in MSA, blood pressure, and heart rate during mental stress were comparable before and after drug administration regardless of the medication. During the cold pressor test, MSA and systolic and diastolic blood pressures increased to the same degree independent of treatment, but after isosorbide dinitrate, the increase in MSA seemed to be less pronounced. Heart rate did not change during cold stimulation.
Plasma renin activity increased after captopril and isosorbide dinitrate (P < .05), whereas placebo had no effect. Endothelin-1 increased after placebo and isosorbide dinitrate (P < .05) but not after captopril. CONCLUSIONS Thus, captopril suppressed MSA despite lowering of diastolic blood pressure but allowed normal adaptation of the SNS during mental or physical stress. In contrast, the nitrate strongly activated the SNS under baseline conditions. These findings demonstrate that vasodilators differentially interact with the SNS, which could be of importance in therapeutic strategies for the treatment of patients with cardiovascular diseases.
Abstract:
BACKGROUND HIV treatment recommendations are updated as clinical trials are published. Whether recommendations drive clinicians to change antiretroviral therapy in well-controlled patients is unexplored. METHODS We selected patients with undetectable viral loads (VLs) on nonrecommended regimens containing double-boosted protease inhibitors (DBPIs), triple-nucleoside reverse transcriptase inhibitors (NRTIs), or didanosine (ddI) plus stavudine (d4T) at publication of the 2006 International AIDS Society recommendations. We compared demographic and clinical characteristics with those of control patients with undetectable VL not on these regimens and examined clinical outcome and reasons for treatment modification. RESULTS At inclusion, 104 patients were in the DBPI group, 436 in the triple-NRTI group, and 19 in the ddI/d4T group. By 2010, 28 (29%), 204 (52%), and 1 (5%) patients, respectively, remained on DBPIs, triple-NRTIs, and ddI plus d4T. 'Physician decision,' excluding toxicity/virological failure, drove 30% of treatment changes. Predictors of recommendation nonobservance included female sex [adjusted odds ratio (aOR) 2.69, 95% confidence interval (CI) 1 to 7.26; P = 0.01] for DBPIs, and undetectable VL (aOR 3.53, 95% CI 1.6 to 7.8; P = 0.002) and lack of cardiovascular events (aOR 2.93, 95% CI 1.23 to 6.97; P = 0.02) for triple-NRTIs. All patients on DBPIs with documented diabetes or a cardiovascular event changed treatment. Recommendation observance resulted in lower cholesterol values in the DBPI group (P = 0.06), and more patients having undetectable VL (P = 0.02) in the triple-NRTI group. CONCLUSION The physician's decision is the main factor driving change from nonrecommended to recommended regimens, whereas virological suppression is associated with not switching. Positive clinical outcomes observed postswitch underline the importance of observing recommendations, even in well-controlled patients.
Abstract:
OBJECTIVES Mortality in patients starting antiretroviral therapy (ART) is higher in Malawi and Zambia than in South Africa. We examined whether different monitoring of ART (viral load [VL] in South Africa and CD4 count in Malawi and Zambia) could explain this mortality difference. DESIGN Mathematical modelling study based on data from ART programmes. METHODS We used a stochastic simulation model to study the effect of VL monitoring on mortality over 5 years. In baseline scenario A all parameters were identical between strategies except for more timely and complete detection of treatment failure with VL monitoring. Additional scenarios introduced delays in switching to second-line ART (scenario B) or higher virologic failure rates (due to worse adherence) when monitoring was based on CD4 counts only (scenario C). Results are presented as relative risks (RR) with 95% prediction intervals and percent of observed mortality difference explained. RESULTS RRs comparing VL with CD4 cell count monitoring were 0.94 (0.74-1.03) in scenario A, 0.94 (0.77-1.02) with delayed switching (scenario B) and 0.80 (0.44-1.07) when assuming a 3-times higher rate of failure (scenario C). The observed mortality at 3 years was 10.9% in Malawi and Zambia and 8.6% in South Africa (absolute difference 2.3%). The percentage of the mortality difference explained by VL monitoring ranged from 4% (scenario A) to 32% (scenarios B and C combined, assuming a 3-times higher failure rate). Eleven percent was explained by non-HIV related mortality. CONCLUSIONS VL monitoring reduces mortality moderately when assuming improved adherence and decreased failure rates.
Abstract:
BACKGROUND Monitoring of HIV viral load in patients on combination antiretroviral therapy (ART) is not generally available in resource-limited settings. We examined the cost-effectiveness of qualitative point-of-care viral load tests (POC-VL) in sub-Saharan Africa. DESIGN Mathematical model based on longitudinal data from the Gugulethu and Khayelitsha township ART programmes in Cape Town, South Africa. METHODS Cohorts of patients on ART monitored by POC-VL, CD4 cell count or clinically were simulated. Scenario A considered the more accurate detection of treatment failure with POC-VL only, and scenario B also considered the effect on HIV transmission. Scenario C further assumed that the risk of virologic failure is halved with POC-VL due to improved adherence. We estimated the change in costs per quality-adjusted life-year gained (incremental cost-effectiveness ratios, ICERs) of POC-VL compared with CD4 and clinical monitoring. RESULTS POC-VL tests with detection limits less than 1000 copies/ml increased costs due to unnecessary switches to second-line ART, without improving survival. Assuming POC-VL unit costs between US$5 and US$20 and detection limits between 1000 and 10,000 copies/ml, the ICER of POC-VL was US$4,010-US$9,230 compared with clinical and US$5,960-US$25,540 compared with CD4 cell count monitoring. In scenario B, the corresponding ICERs were US$2,450-US$5,830 and US$2,230-US$10,380. In scenario C, the ICER ranged between US$960 and US$2,500 compared with clinical monitoring and between cost-saving and US$2,460 compared with CD4 monitoring. CONCLUSION The cost-effectiveness of POC-VL for monitoring ART is improved by a higher detection limit, by taking the reduction in new HIV infections into account, and by assuming that failure of first-line ART is reduced due to targeted adherence counselling.
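The ICER comparison reported above follows the standard definition: the difference in costs between two strategies divided by the difference in quality-adjusted life-years (QALYs). A minimal sketch with hypothetical cost and QALY figures (not taken from the study) illustrates the calculation:

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained
    when switching from the old to the new strategy."""
    delta_cost = cost_new - cost_old
    delta_qaly = qaly_new - qaly_old
    if delta_qaly <= 0:
        raise ValueError("new strategy must gain QALYs for a meaningful ICER")
    return delta_cost / delta_qaly

# Hypothetical figures: the new monitoring strategy costs US$500 more
# per patient and gains 0.1 QALYs relative to the comparator.
print(icer(cost_new=2500, cost_old=2000, qaly_new=4.1, qaly_old=4.0))  # ≈ 5000 US$ per QALY
```

A strategy that both saves money and gains QALYs (a negative numerator with a positive denominator) is "cost-saving", as in the abstract's scenario C comparison against CD4 monitoring.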
Abstract:
Glucocorticoids are often applied in neonatology and perinatology to fight the problems of respiratory distress and chronic lung disease. There are, however, many controversies regarding the adverse side effects and long-term clinical benefits of this therapeutic approach. In rats, glucocorticoids are known to seriously impair the formation of alveoli when applied during the first two postnatal weeks, even at very low dosage. The current study investigates short-term and long-term glucocorticoid effects on the rat lung by means of morphologic and morphometric observations at the light and electron microscopic levels. Application of a high-dosage protocol for only a few days resulted in a marked acceleration of lung development, with precocious microvascular maturation resulting in single-capillary-network septa in the first 4 postnatal days. By postnatal d 10, the lung morphologic phenotype showed a regression in maturational state, with an increased number of septa with a double capillary layer, followed by an exceptional second round of the alveolarization process. As a result of this process, there was an almost complete recovery of the parenchymal lung structure by postnatal d 36, and by d 60, there were virtually no qualitative or quantitative differences between experimental and control rats. These findings indicate that both dosage and duration of glucocorticoid therapy in the early postnatal period are critical with respect to lung development and maturation and that a careful therapeutic strategy can minimize late sequelae of treatment.
Abstract:
BACKGROUND There is debate over using tenofovir or zidovudine alongside lamivudine in second-line antiretroviral therapy (ART) following stavudine failure. We analyzed outcomes in cohorts from South Africa, Zambia and Zimbabwe. METHODS Patients aged ≥16 years who switched from a first-line regimen including stavudine to a ritonavir-boosted lopinavir-based second-line regimen with lamivudine or emtricitabine and zidovudine or tenofovir in seven ART programs in southern Africa were included. We estimated the causal effect of receiving tenofovir or zidovudine on mortality and virological failure using Cox proportional hazards marginal structural models, whose parameters were estimated using inverse probability of treatment weights. Baseline characteristics were age, sex, calendar year and country; CD4 cell count, creatinine and hemoglobin levels were included as time-dependent confounders. RESULTS 1,256 patients on second-line ART, including 958 on tenofovir, were analyzed. Patients on tenofovir were more likely to have switched to second-line ART in recent years, had spent more time on first-line ART (33 vs. 24 months) and had lower CD4 cell counts (172 vs. 341 cells/μl) at initiation of second-line ART. The adjusted hazard ratio comparing tenofovir with zidovudine was 1.00 (95% confidence interval 0.59-1.68) for virological failure and 1.40 (0.57-3.41) for death. CONCLUSIONS We did not find any difference in treatment outcomes between patients on tenofovir or zidovudine; however, the precision of our estimates was limited. There is an urgent need for randomized trials to inform second-line ART strategies in resource-limited settings.
Abstract:
Available treatments for congestive heart failure have not yet achieved the desired improvement in quality of life and prognosis for patients with end-stage cardiac disease. Some treatment resources, such as cardiac transplantation, are accessible only to a selected group of patients. In the last decade, interest in the role of electromechanical disturbances has grown and has motivated special interest in the use of the pacemaker as a tool for the treatment of congestive heart failure. During this period there has been important progress in this kind of treatment, and multicenter studies have now demonstrated hemodynamic improvement in patients treated with this method. Patient selection for this kind of treatment should be careful, although it is now possible to identify which patients may benefit from this device in the treatment of congestive heart failure.
Abstract:
The success of combination antiretroviral therapy is limited by the evolutionary escape dynamics of HIV-1. We used Isotonic Conjunctive Bayesian Networks (I-CBNs), a class of probabilistic graphical models, to describe this process. We employed partial order constraints among viral resistance mutations, which give rise to a limited set of mutational pathways, and we modeled phenotypic drug resistance as monotonically increasing along any escape pathway. Using this model, the individualized genetic barrier (IGB) to each drug is derived as the probability of the virus not acquiring additional mutations that confer resistance. Drug-specific IGBs were combined to obtain the IGB to an entire regimen, which quantifies the virus's genetic potential for developing drug resistance under combination therapy. The IGB was tested as a predictor of therapeutic outcome using between 2,185 and 2,631 treatment change episodes of subtype B-infected patients from the Swiss HIV Cohort Study Database, a large observational cohort. Using logistic regression, significant univariate predictors included most of the 18 drugs and single-drug IGBs, the IGB to the entire regimen, the expert rules-based genotypic susceptibility score (GSS), several individual mutations, and the peak viral load before treatment change. In the multivariate analysis, the only genotype-derived variables that remained significantly associated with virological success were GSS and, with 10-fold stronger association, IGB to regimen. When predicting suppression of viral load below 400 cps/ml, IGB outperformed GSS and also improved GSS-containing predictors significantly, but the difference was not significant for suppression below 50 cps/ml. Thus, the IGB to regimen is a novel data-derived predictor of treatment outcome that has potential to improve the interpretation of genotypic drug resistance tests.
Abstract:
Alveolar echinococcosis (AE) in humans is a parasitic disease characterized by severe damage to the liver and occasionally other organs. AE is caused by infection with the metacestode (larval) stage of the fox tapeworm Echinococcus multilocularis, which usually infects small rodents as natural intermediate hosts. Conventionally, human AE is treated chemotherapeutically with mebendazole or albendazole. There is, however, still a need for improved chemotherapeutic options. Primary in vivo studies on drugs of interest are commonly performed in small laboratory animals such as mice and Mongolian jirds, and in most cases a secondary infection model is used, whereby E. multilocularis metacestodes are directly injected into the peritoneal cavity or into the liver. Disadvantages of this methodological approach include the risk of injury to organs during the inoculation and, most notably, a limitation in the macroscopic (visible) assessment of treatment efficacy. Thus, in order to monitor the efficacy of chemotherapeutic treatment, animals have to be euthanized and the parasite tissue dissected. In the present study, mice were infected with E. multilocularis metacestodes through the subcutaneous route and were then subjected to chemotherapy employing albendazole. Serological responses to infection were comparatively assessed in mice infected by the conventional intraperitoneal route. We demonstrate that the subcutaneous infection model for secondary AE facilitates the assessment of the progress of infection and drug treatment in the live animal.
Abstract:
There may be a relationship between the incidence of vasomotor and arthralgia/myalgia symptoms and treatment outcomes for postmenopausal breast cancer patients with endocrine-responsive disease who received adjuvant letrozole or tamoxifen. Data on patients randomized into the monotherapy arms of the BIG 1-98 clinical trial who did not have either vasomotor or arthralgia/myalgia/carpal tunnel (AMC) symptoms reported at baseline, started protocol treatment, and were alive and disease-free at the 3-month landmark (n = 4,798) and at the 12-month landmark (n = 4,682) were used for this report. Cohorts of patients with vasomotor symptoms, AMC symptoms, neither, or both were defined at both 3 and 12 months from randomization. Landmark analyses were performed for disease-free survival (DFS) and for breast cancer free interval (BCFI), using regression analysis to estimate hazard ratios (HR) and 95 % confidence intervals (CI). Median follow-up was 7.0 years. Reporting of AMC symptoms was associated with better outcome for both the 3- and 12-month landmark analyses [e.g., 12-month landmark, HR (95 % CI) for DFS = 0.65 (0.49–0.87), and for BCFI = 0.70 (0.49–0.99)]. By contrast, reporting of vasomotor symptoms was less clearly associated with DFS [12-month DFS HR (95 % CI) = 0.82 (0.70–0.96)] and BCFI [12-month BCFI HR (95 % CI) = 0.97 (0.80–1.18)]. Interaction tests indicated no effect of treatment group on associations between symptoms and outcomes. While reporting of AMC symptoms was clearly associated with better DFS and BCFI, the association between vasomotor symptoms and outcome was less clear, especially with respect to breast cancer-related events.
Abstract:
Immune responses against intestinal microbiota contribute to the pathogenesis of inflammatory bowel diseases (IBD) and involve CD4(+) T cells, which are activated by major histocompatibility complex class II (MHCII) molecules on antigen-presenting cells (APCs). However, it is largely unexplored how inflammation-induced MHCII expression by intestinal epithelial cells (IEC) affects CD4(+) T cell-mediated immunity or tolerance induction in vivo. Here, we investigated how epithelial MHCII expression is induced and how a deficiency in inducible epithelial MHCII expression alters susceptibility to colitis and the outcome of colon-specific immune responses. Colitis was induced in mice that lacked inducible expression of MHCII molecules on all nonhematopoietic cells, or specifically on IECs, by continuous infection with Helicobacter hepaticus and administration of interleukin (IL)-10 receptor-blocking antibodies (anti-IL10R mAb). To assess the role of interferon (IFN)-γ in inducing epithelial MHCII expression, the T cell adoptive transfer model of colitis was used. Abrogation of MHCII expression by nonhematopoietic cells or IECs induces colitis associated with increased colonic frequencies of innate immune cells and expression of proinflammatory cytokines. CD4(+) T-helper type (Th)1 cells - but not group 3 innate lymphoid cells (ILCs) or Th17 cells - are elevated, resulting in an unfavourably altered ratio between CD4(+) T cells and forkhead box P3 (FoxP3)(+) regulatory T (Treg) cells. IFN-γ produced mainly by CD4(+) T cells is required to upregulate MHCII expression by IECs. These results suggest that, in addition to its proinflammatory roles, IFN-γ exerts a critical anti-inflammatory function in the intestine which protects against colitis by inducing MHCII expression on IECs. This may explain the failure of anti-IFN-γ treatment to induce remission in IBD patients, despite the association of elevated IFN-γ and IBD.
Abstract:
Background: To detect attention deficit hyperactivity disorder (ADHD) in treatment seeking substance use disorders (SUD) patients, a valid screening instrument is needed. Objectives: To test the performance of the Adult ADHD Self-Report Scale V1.1 (ASRS) for adult ADHD in an international sample of treatment seeking SUD patients for DSM-IV-TR; for the proposed DSM-5 criteria; in different subpopulations, at intake and 1–2 weeks after intake; using different scoring algorithms; and with different externalizing disorders as external criteria (including adult ADHD, bipolar disorder, antisocial and borderline personality disorder). Methods: In 1138 treatment seeking SUD subjects, ASRS performance was determined using diagnoses based on the Conners' Adult ADHD Diagnostic Interview for DSM-IV (CAADID) as the gold standard. Results: The prevalence of adult ADHD was 13.0% (95% CI: 11.0–15.0%). The overall positive predictive value (PPV) of the ASRS was 0.26 (95% CI: 0.22–0.30), the negative predictive value (NPV) was 0.97 (95% CI: 0.96–0.98). The sensitivity (0.84, 95% CI: 0.76–0.88) and specificity (0.66, 95% CI: 0.63–0.69) measured at admission were similar to the sensitivity (0.88, 95% CI: 0.83–0.93) and specificity (0.67, 95% CI: 0.64–0.70) measured 2 weeks after admission. Sensitivity was similar, but specificity was significantly better in patients with alcohol compared to (illicit) drugs as the primary substance of abuse (0.76 vs. 0.56). The ASRS was not a good screener for externalizing disorders other than ADHD. Conclusions: The ASRS is a sensitive screener for identifying possible ADHD cases, with very few missed cases among those screening negative in this population.
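The predictive values reported above follow directly from sensitivity, specificity, and prevalence via Bayes' rule; a minimal sketch using the abstract's at-admission figures shows how the low PPV is driven largely by the 13% prevalence rather than by poor test accuracy:

```python
def ppv_npv(sens, spec, prev):
    """Positive and negative predictive values from sensitivity,
    specificity, and prevalence (Bayes' rule), per person screened."""
    tp = sens * prev               # true positives
    fp = (1 - spec) * (1 - prev)   # false positives
    tn = spec * (1 - prev)         # true negatives
    fn = (1 - sens) * prev         # false negatives
    return tp / (tp + fp), tn / (tn + fn)

# Figures from the abstract: sensitivity 0.84, specificity 0.66, prevalence 13%.
ppv, npv = ppv_npv(sens=0.84, spec=0.66, prev=0.13)
print(round(ppv, 2), round(npv, 2))  # ≈ 0.27 and 0.97, consistent with the reported 0.26 and 0.97
```

The small gap between the computed PPV (≈0.27) and the reported 0.26 is expected, since the published values come from the raw counts rather than from the rounded summary statistics used here.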
Abstract:
While glucocorticoid (GC) administration appears to be beneficial during the acute phase of treatment of neonates at risk of developing chronic lung disease, it is still not clear whether steroid application has an adverse long-term effect on lung maturation. Thus, the goal of the present work was to analyze GC effects on pulmonary structure in a rat model in which dosage and timing of drug administration were adapted to the therapeutic situation in human neonatology. The animals received daily a maximum of 0.1 mg dexamethasone phosphate per kilogram body weight during the first 4 postnatal days. Investigations were performed at the light microscopic level by means of a digital image analysis system. While there were no differences in lung architecture between experimental animals and controls on day 4, the earliest time point of observation, we found a widening of airspaces with a concomitant decrease in alveolar surface area density, representing a loss of parenchymal complexity, on days 10 and 21 in treated rats. On days 36 and 60, however, no alterations in the pulmonary parenchyma could be detected in experimental animals. We conclude from these findings that the GC-induced initial inhibition of development (days 10 and 21) was completely reversed, so that a normal parenchymal architecture and a normal alveolar surface area density were found in adult rats (days 36 and 60). From the results obtained using the described regimen of GC administration, which mimics more closely the steroid treatment in human neonatology, we conclude that the observed short-term adverse effects on lung development are fully compensated by adulthood.