813 results for Medications
Abstract:
OBJECTIVE: To determine the characteristics of asthma (A) and allergic rhinitis (AR) among asthma patients in primary care practice. RESEARCH DESIGN AND METHODS: Primary care physicians, pulmonologists, and allergologists were asked to recruit consecutive asthma patients with or without allergic rhinitis from their daily practice. Cross-sectional data on symptoms, severity, treatment and impact on quality of life of A and AR were recorded and examined using descriptive statistics. Patients with and without AR were then compared. RESULTS: 1244 asthma patients were included by 211 physicians. Asthma was controlled in 19%, partially controlled in 27% and not controlled in 54%. Asthma treatment was generally based on inhaled corticosteroids (ICS) with or without long-acting beta-2 agonists (78%). A leukotriene receptor antagonist (LTRA) was used by 46% of the patients. Overall, 950 (76%) asthma patients had AR (A + AR) and 294 (24%) did not (A - AR). Compared to patients with A - AR, A + AR patients were generally younger (mean age +/- standard deviation: 42 +/- 16 vs. 50 +/- 19 years, p < 0.001) and fewer used ICS (75% vs. 88%, p < 0.001). LTRA usage was similar in both groups (46% vs. 48%). Asthma was uncontrolled in 53% of A + AR and 57% of A - AR patients. Allergic rhinitis was treated with a mean of 1.9 specific AR medications: antihistamines (77%), nasal steroids (66%) and/or vasoconstrictors (38%), and/or LTRA (42%). Rhinorrhoea, nasal obstruction, or nasal itching were the most frequently reported AR symptoms, and the greatest reported degree of impairment was in daily activities/sports (55%). CONCLUSIONS: Allergic rhinitis was more common among younger asthma patients and increased the burden of symptoms and the need for additional medication, but was associated with improved asthma control. However, most asthma patients remained suboptimally controlled regardless of concomitant AR.
Abstract:
OBJECTIVE: Multiple studies have shown that microvascular decompression (MVD) is the treatment of choice in cases of medically refractory trigeminal neuralgia (TN). In the elderly, however, the surgical risks related to MVD are assumed to be unacceptably high, and various alternative therapies have been proposed. We evaluated the outcomes of MVD in patients older than 65 years and compared them with the outcomes in a matched group of younger patients. The focus was on procedure-related morbidity and long-term outcome. METHODS: This was a retrospective study of 112 patients with TN operated on consecutively over 22 years. The main outcome measures were immediate and long-term postoperative pain relief and neurological status, especially function of the trigeminal, facial, and cochlear nerves, as well as surgical complications. A questionnaire was used to assess long-term outcome: pain relief, duration of the pain-free period, need for pain medications, time to recurrence, pain severity, and need for additional treatment. RESULTS: The mean age was 70.35 years. The second and third branches of the trigeminal nerve were most frequently affected (37.3%). The mean follow-up period was 90 months (range, 48-295 months). Seventy-five percent of the patients were completely pain free, 11% were never pain free, and 14% experienced recurrences. No statistically significant differences existed in outcome between the younger and older patient groups. Postoperative morbidity included trigeminal hypesthesia in 6.25%, hypacusis in 5.4%, and complete hearing loss, vertigo, and partial facial nerve palsy in 0.89% each. Cerebrospinal fluid leak and meningitis occurred in 1 patient each. There were no deaths in either group. CONCLUSION: MVD for TN is a safe procedure even in the elderly. The risk of serious morbidity or mortality is similar to that in younger patients. Furthermore, no significant differences in short- and long-term outcome were found. Thus, MVD is the treatment of choice in patients with medically refractory TN, unless their general condition prohibits it.
Abstract:
INTRODUCTION: We studied the intra-individual and inter-individual variability of two online sedation monitors, BIS and Entropy, in volunteers under sedation. METHODS: Ten healthy volunteers were sedated in a stepwise manner with doses of either midazolam and remifentanil or dexmedetomidine and remifentanil. One week later the procedure was repeated with the remaining drug combination. The doses were adjusted to achieve three different sedation levels (Ramsay scores 2, 3 and 4) and controlled by a computer-driven drug-delivery system to maintain stable plasma concentrations of the drugs. At each level of sedation, BIS and Entropy (response entropy and state entropy) values were recorded for 20 minutes. Baseline recordings were obtained before the sedative medications were administered. RESULTS: Both inter-individual and intra-individual variability increased as the sedation level deepened. Entropy values showed greater variability than BIS values, and the variability was greater during dexmedetomidine/remifentanil sedation than during midazolam/remifentanil sedation. CONCLUSIONS: The large intra-individual and inter-individual variability of BIS and Entropy values in sedated volunteers makes the determination of sedation levels by processed electroencephalogram (EEG) variables impossible. Reports in the literature that draw conclusions based on processed EEG variables obtained from sedated intensive care unit (ICU) patients may be inaccurate due to this variability. TRIAL REGISTRATION: ClinicalTrials.gov No. NCT00641563.
Abstract:
Previous somatic pain experience (priming), psychobiographic imprinting (pain proneness), and stress (action proneness) are key to an enhanced centralised pain response. This centralised pain response clinically manifests itself in pain sensitization and chronification. The therapeutic approach to chronic centralised pain disorders is multimodal. The overarching aim of the various interventions of a multimodal treatment program is to activate anti-nociceptive areas of the cerebral matrix involved in pain processing. The lists of medications targeting neuropathic and somatoform pain disorder show considerable overlap. Psychotherapy helps patients with central pain sensitization to improve pain control, emotional regulation and pain behaviour.
Abstract:
Complementary and alternative medicine (CAM) is popular in Germany. In a consecutive survey, inpatients of the departments of cardiology (CL), gastroenterology (GE), oncology (OL) and psychosomatics (PS) of the University Hospital Freiburg (FUH) were questioned about their experiences with CAM and their need for a CAM consultation. Exclusion criteria were inability to understand the questions or a Karnofsky Index < 30%. Four hundred thirty-five patients were included. Three hundred and fifty patients, 100 each in the departments of CL, GE and OL, and 50 in PS, answered the questionnaires; 85 patients (20%) refused. Among the 350 patients, 26% had previously visited a CAM physician and 19% had visited a CAM therapist (Heilpraktiker). Information about CAM was obtained mainly from television, radio and family members. Frequently used therapies for the current disease were physical training (21%), diet (19%), massage (19%), vitamins/trace elements (19%), herbs (13%), acupuncture (10%) and homeopathy (7%). PS patients had the highest frequency of CAM use, followed by GE, OL and CL patients. On a four-point rating scale, high effectiveness (≥70%) for the current disease was reported by CL patients for physical exercise and massage, by GE patients for herbal treatment, and by OL patients for diet. Physical exercise, diet, massage and herbal treatment generally received better ratings than homeopathy, acupuncture and vitamins. 65% would welcome a CAM center and 53% asked for a consultation about CAM at FUH. Interest in a CAM consultation was strongest among OL and GE patients (58%), lower among PS patients (52%) and lowest among patients with cardiovascular diseases (43%). Twenty-five percent believed that CAM could help them cope better with their disease. Predictors of a positive attitude towards CAM were young age, aversion to chemical medications (Spearman correlation r = 0.22), the desire to participate in therapeutic decisions (r = 0.29), the motivation to change lifestyle if recommended (r = 0.31) and the desire for a holistic treatment (r = 0.37).
Abstract:
BACKGROUND In 2007, leading international experts in the field of inflammatory bowel disease (IBD) recommended intravenous (IV) iron supplements over oral (PO) ones because of superior effectiveness and better tolerance. We aimed to determine the percentage of patients with IBD undergoing iron therapy and to assess the dynamics of iron prescription habits (IV versus PO). METHODS We analyzed anonymized data on patients with Crohn's disease and ulcerative colitis extracted from the Helsana database. Helsana is a Swiss health insurance company providing coverage for 18% of the Swiss population (1.2 million individuals). RESULTS In total, 629 patients with Crohn's disease (61% female) and 398 patients with ulcerative colitis (57% female) were identified; mean observation time was 31.8 months for Crohn's disease and 31.0 months for ulcerative colitis patients. Of all patients with IBD, 27.1% were prescribed iron (21.1% in males; 31.1% in females). Patients treated with steroids, immunomodulators, and/or anti-tumor necrosis factor drugs were more frequently treated with iron supplements when compared with those not treated with any medications (35.0% versus 20.9%, odds ratio, 1.94; P < 0.001). The frequency of IV iron prescriptions increased significantly from 2006 to 2009 for both genders (males: from 2.6% to 10.1%, odds ratio = 3.84, P < 0.001; females: from 5.3% to 12.1%, odds ratio = 2.26, P = 0.002), whereas the percentage of PO iron prescriptions did not change. CONCLUSIONS Twenty-seven percent of patients with IBD were treated with iron supplements. Iron supplements administered IV were prescribed more frequently over time. These prescription habits are consistent with the implementation of guidelines on the management of iron deficiency in IBD.
Abstract:
BACKGROUND The options for secondary prevention of cryptogenic embolism in patients with patent foramen ovale are administration of antithrombotic medications or percutaneous closure of the patent foramen ovale. We investigated whether closure is superior to medical therapy. METHODS We performed a multicenter, superiority trial in 29 centers in Europe, Canada, Brazil, and Australia in which the assessors of end points were unaware of the study-group assignments. Patients with a patent foramen ovale and ischemic stroke, transient ischemic attack (TIA), or a peripheral thromboembolic event were randomly assigned to undergo closure of the patent foramen ovale with the Amplatzer PFO Occluder or to receive medical therapy. The primary end point was a composite of death, nonfatal stroke, TIA, or peripheral embolism. Analysis was performed on data for the intention-to-treat population. RESULTS The mean duration of follow-up was 4.1 years in the closure group and 4.0 years in the medical-therapy group. The primary end point occurred in 7 of the 204 patients (3.4%) in the closure group and in 11 of the 210 patients (5.2%) in the medical-therapy group (hazard ratio for closure vs. medical therapy, 0.63; 95% confidence interval [CI], 0.24 to 1.62; P=0.34). Nonfatal stroke occurred in 1 patient (0.5%) in the closure group and 5 patients (2.4%) in the medical-therapy group (hazard ratio, 0.20; 95% CI, 0.02 to 1.72; P=0.14), and TIA occurred in 5 patients (2.5%) and 7 patients (3.3%), respectively (hazard ratio, 0.71; 95% CI, 0.23 to 2.24; P=0.56). CONCLUSIONS Closure of a patent foramen ovale for secondary prevention of cryptogenic embolism did not result in a significant reduction in the risk of recurrent embolic events or death as compared with medical therapy. (Funded by St. Jude Medical; ClinicalTrials.gov number, NCT00166257.).
Abstract:
OBJECTIVES Dentine hypersensitivity (DH) manifests as a transient but arresting oral pain. The incidence is thought to be rising, particularly in young adults, due to increased consumption of healthy, yet erosive, diets. This study aimed to assess the prevalence of DH, and the relative importance of its risk factors, in 18- to 35-year-old Europeans. METHODS In 2011, 3187 adults were enrolled from general dental practices in France, Spain, Italy, the United Kingdom, Finland, Latvia and Estonia. DH was clinically evaluated by cold-air tooth stimulation with patient pain rating (yes/no), accompanied by an investigator pain rating (Schiff 0-3). Erosive toothwear (BEWE index 0-3) and gingival recession (mm) were recorded. Patients completed a questionnaire regarding the nature of their DH, erosive dietary intake and toothbrushing habits. RESULTS 41.9% of patients reported pain on tooth stimulation and 56.8% scored ≥1 on the Schiff scale for at least one tooth. Clinically elicited sensitivity was closely related to Schiff score and, to a lesser degree, to questionnaire-reported sensitivity (26.8%), possibly reflecting the transient nature of the pain alongside good coping mechanisms. Significant associations were found between clinically elicited DH and erosive toothwear and gingival recession. The questionnaire showed marked associations between DH and risk factors including heartburn/acid reflux, vomiting, sleeping medications, energy drinks, smoking and acidic dietary intake. CONCLUSION Overall, the prevalence of DH was high compared with many published findings, with a strong, progressive relationship between DH and erosive toothwear, which is important to recognise for patient preventive therapies and clinical management of DH pain.
Abstract:
BACKGROUND The use of combination antiretroviral therapy (cART) comprising three antiretroviral medications from at least two classes of drugs is the current standard treatment for HIV infection in adults and children. Current World Health Organization (WHO) guidelines for antiretroviral therapy recommend early treatment, regardless of immunological thresholds or clinical condition, for all infants (less than one year of age) and children under the age of two years. For children aged two to five years, current WHO guidelines recommend (based on low-quality evidence) that clinical and immunological thresholds be used to identify those who need to start cART (advanced clinical stage, or CD4 counts ≤ 750 cells/mm(3), or per cent CD4 ≤ 25%). This Cochrane review summarises the currently available evidence on the optimal time for treatment initiation in children aged two to five years, with the goal of informing the revision of the WHO 2013 recommendations on when to initiate cART in children. OBJECTIVES To assess the evidence for the optimal time to initiate cART in treatment-naive, HIV-infected children aged 2 to 5 years. SEARCH METHODS We searched the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, EMBASE, the AEGIS conference database, specific relevant conferences, www.clinicaltrials.gov, the World Health Organization International Clinical Trials Registry Platform and reference lists of articles. The date of the most recent search was 30 September 2012. SELECTION CRITERIA Randomised controlled trials (RCTs) that compared immediate with deferred initiation of cART, and prospective cohort studies which followed children from enrolment to the start of cART and on cART.
DATA COLLECTION AND ANALYSIS Two review authors considered studies for inclusion in the review, assessed the risk of bias, and extracted data on the primary outcome of death from all causes and several secondary outcomes, including the incidence of CDC category C and B clinical events and per cent CD4 cells (CD4%) at study end. For RCTs we calculated relative risks (RR) or mean differences with 95% confidence intervals (95% CI). For cohort data, we extracted relative risks with 95% CI from adjusted analyses. We combined results from RCTs using a random effects model and examined statistical heterogeneity. MAIN RESULTS Two RCTs in HIV-positive children aged 1 to 12 years were identified. One trial was the pilot study for the larger second trial; both compared initiation of cART regardless of clinical-immunological conditions with initiation deferred until per cent CD4 dropped to <15%. The two trials were conducted in Thailand, and in Thailand and Cambodia, respectively. Unpublished analyses of the 122 children enrolled at ages 2 to 5 years were included in this review. There was one death in the immediate cART group and no deaths in the deferred group (RR 2.9; 95% CI 0.12 to 68.9). In the subgroup analysis of children aged 24 to 59 months, there was one CDC C event in each group (RR 0.96; 95% CI 0.06 to 14.87) and 8 and 11 CDC B events in the immediate and deferred groups, respectively (RR 0.95; 95% CI 0.24 to 3.73). In this subgroup, the mean difference in CD4 per cent at study end was 5.9% (95% CI 2.7 to 9.1). One cohort study from South Africa, which compared the effect of delaying cART for up to 60 days in 573 HIV-positive children starting tuberculosis treatment (median age 3.5 years), was also included. The adjusted hazard ratio for the effect on mortality of delaying ART for more than 60 days was 1.32 (95% CI 0.55 to 3.16).
AUTHORS' CONCLUSIONS This systematic review shows that there is insufficient evidence from clinical trials in support of either early or CD4-guided initiation of ART in HIV-infected children aged 2 to 5 years. Programmatic issues such as the retention in care of children in ART programmes in resource-limited settings will need to be considered when formulating WHO 2013 recommendations.
Abstract:
INTRODUCTION Current literature suggesting a higher bleeding risk during combination therapy compared to oral anticoagulation alone is primarily based on retrospective studies or specific populations. We aimed to prospectively evaluate whether unselected medical patients on oral anticoagulation have an increased risk of bleeding when on concomitant antiplatelet therapy. MATERIAL AND METHODS We prospectively studied consecutive adult medical patients who were discharged on oral anticoagulants between 01/2008 and 03/2009 from a Swiss university hospital. The primary outcome was the time to a first major bleed on oral anticoagulation within 12 months, adjusted for age, international normalized ratio target, number of medications, and history of myocardial infarction and major bleeding. RESULTS Among the 515 included anticoagulated patients, the incidence rate of a first major bleed was 8.2 per 100 patient-years. Overall, 161 patients (31.3%) were on both anticoagulant and antiplatelet therapy, and these patients had a similar incidence rate of major bleeding compared to patients on oral anticoagulation alone (7.6 vs. 8.4 per 100 patient-years, P=0.81). In a multivariate analysis, the association of concomitant antiplatelet therapy with the risk of major bleeding was not statistically significant (hazard ratio 0.89, 95% confidence interval, 0.37-2.10). CONCLUSIONS The risk of bleeding in patients receiving oral anticoagulants combined with antiplatelet therapy was similar to patients receiving oral anticoagulants alone, suggesting that the incremental bleeding risk of combination therapy might not be clinically significant.
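The bleeding rates above are expressed per 100 patient-years, i.e. events divided by the accumulated follow-up time of all patients. A minimal sketch of that conversion (the event and follow-up counts below are invented for illustration; the abstract reports only the resulting rate):

```python
def incidence_rate_per_100_py(events: int, person_years: float) -> float:
    """Incidence rate expressed per 100 patient-years of follow-up."""
    return events / person_years * 100

# Invented example: 35 first major bleeds observed over 427 patient-years
rate = incidence_rate_per_100_py(35, 427)
print(round(rate, 1))  # 8.2, i.e. about 8.2 first major bleeds per 100 patient-years
```

Because follow-up is censored at the first event (or at 12 months), each patient contributes only the time until that point to the denominator.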
Abstract:
BACKGROUND Subclinical thyroid dysfunction has been implicated as a risk factor for cognitive decline in old age, but results are inconsistent. We investigated the association between subclinical thyroid dysfunction and cognitive decline in the PROspective Study of Pravastatin in the Elderly at Risk (PROSPER). METHODS Prospective longitudinal study of men and women aged 70-82 years with pre-existing vascular disease or more than one risk factor for developing it (N = 5,154). Participants taking antithyroid medications, thyroid hormone supplementation and/or amiodarone were excluded. Thyroid function was measured at baseline: subclinical hyper- and hypothyroidism were defined as thyroid-stimulating hormone (TSH) <0.45 mU/L or >4.50 mU/L, respectively, with normal levels of free thyroxine (FT4). Cognitive performance was tested at baseline and at four subsequent time points during a mean follow-up of 3 years, using five neuropsychological performance tests. RESULTS Subclinical hyperthyroidism and hypothyroidism were found in 65 and 161 participants, respectively. We found no consistent association of subclinical hyper- or hypothyroidism with altered performance on the individual cognitive tests compared with euthyroid participants. Similarly, there was no association with the rate of cognitive decline during follow-up. CONCLUSION We found no consistent evidence that subclinical hyper- or hypothyroidism contributes to cognitive impairment or decline in old age. Although our data do not support treating subclinical thyroid dysfunction to prevent cognitive dysfunction in later life, only large randomized controlled trials can provide definitive evidence.
Abstract:
OBJECTIVE: Anaemia in rheumatoid arthritis (RA) is prototypical of the chronic disease type and is often neglected in clinical practice. We studied anaemia in relation to disease activity, medications and radiographic progression. METHODS: Data were collected between 1996 and 2007 over a mean follow-up of 2.2 years. Anaemia was defined according to WHO criteria (♀: haemoglobin <12 g/dl; ♂: haemoglobin <13 g/dl) or alternative criteria. Anaemia prevalence was studied in relation to disease parameters and pharmacological therapy. Radiographic progression was analysed in 9731 radiograph sets from 2681 patients in crude longitudinal regression models and after adjusting for potential confounding factors, including the disease activity score based on the 28-joint count for tender and swollen joints and the erythrocyte sedimentation rate (DAS28-ESR) or the clinical disease activity index (CDAI), synthetic antirheumatic drugs and anti-tumour necrosis factor (TNF) therapy. RESULTS: Anaemia prevalence decreased from more than 24% in the years before 2001 to 15% in 2007. Erosions progressed significantly faster in patients with anaemia (p<0.001). Adjusted models showed these effects to be independent of clinical disease activity and other indicators of disease severity. Radiographic damage progression rates increased with the severity of anaemia, suggesting a 'dose-response effect'. The effect of anaemia on damage progression was maintained in subgroups of patients treated with TNF blockade or corticosteroids, and without non-selective nonsteroidal anti-inflammatory drugs (NSAIDs). CONCLUSIONS: Anaemia in RA appears to capture disease processes that remain unmeasured by established disease activity measures in patients with or without TNF blockade, and may help to identify patients with more rapid erosive disease.
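The DAS28-ESR used as an adjustment covariate above combines tender and swollen joint counts (28 joints each), the erythrocyte sedimentation rate, and a patient global-health rating. A minimal sketch using the commonly published form of the formula (the input values below are invented for illustration and are not from the study):

```python
import math

def das28_esr(tender28: int, swollen28: int, esr: float, global_health: float) -> float:
    """DAS28-ESR in its commonly published form: tender/swollen joint counts
    out of 28, ESR in mm/h, patient global health on a 0-100 mm visual
    analogue scale."""
    return (0.56 * math.sqrt(tender28)
            + 0.28 * math.sqrt(swollen28)
            + 0.70 * math.log(esr)
            + 0.014 * global_health)

# Invented example: 4 tender joints, 5 swollen joints, ESR 30 mm/h, global health 50 mm
score = das28_esr(4, 5, 30, 50)
print(round(score, 2))  # 4.83
```

Scores are conventionally read against fixed cut-offs (remission below 2.6, high disease activity above 5.1), so the example patient would fall in the moderate range.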
Abstract:
Background Persons infected with human immunodeficiency virus (HIV) have increased rates of coronary artery disease (CAD). The relative contribution of genetic background, HIV-related factors, antiretroviral medications, and traditional risk factors to CAD has not been fully evaluated in the setting of HIV infection. Methods In the general population, 23 common single-nucleotide polymorphisms (SNPs) were shown to be associated with CAD through genome-wide association analysis. Using the Metabochip, we genotyped 1875 HIV-positive, white individuals enrolled in 24 HIV observational studies, including 571 participants with a first CAD event during the 9-year study period and 1304 controls matched on sex and cohort. Results A genetic risk score built from 23 CAD-associated SNPs contributed significantly to CAD (P = 2.9 × 10⁻⁴). In the final multivariable model, participants with an unfavorable genetic background (top genetic score quartile) had a CAD odds ratio (OR) of 1.47 (95% confidence interval [CI], 1.05–2.04). This effect was similar to hypertension (OR = 1.36; 95% CI, 1.06–1.73), hypercholesterolemia (OR = 1.51; 95% CI, 1.16–1.96), diabetes (OR = 1.66; 95% CI, 1.10–2.49), ≥1 year lopinavir exposure (OR = 1.36; 95% CI, 1.06–1.73), and current abacavir treatment (OR = 1.56; 95% CI, 1.17–2.07). The effect of the genetic risk score was additive to the effect of nongenetic CAD risk factors, and did not change after adjustment for family history of CAD. Conclusions In the setting of HIV infection, the effect of an unfavorable genetic background was similar to traditional CAD risk factors and certain adverse antiretroviral exposures. Genetic testing may provide prognostic information complementary to family history of CAD.
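A genetic risk score of the kind described is typically built by summing the number of risk alleles a participant carries across the selected SNPs, optionally weighted by each variant's published effect size. A minimal sketch of that construction (the genotype counts and weights below are invented, not those used in the study):

```python
def genetic_risk_score(allele_counts, weights=None):
    """Genetic risk score: sum of risk-allele counts (0, 1 or 2 per SNP),
    optionally weighted by per-SNP effect sizes (e.g. log odds ratios)."""
    if weights is None:
        weights = [1.0] * len(allele_counts)  # unweighted score
    if len(weights) != len(allele_counts):
        raise ValueError("one weight per SNP required")
    return sum(w * c for w, c in zip(weights, allele_counts))

# Invented example: risk-allele counts at 5 of the 23 CAD-associated SNPs
counts = [2, 1, 0, 1, 2]
print(genetic_risk_score(counts))  # 6.0 (unweighted score)
```

In the study, participants were then ranked by score and the top quartile compared with the rest; that stratification step is not shown here.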
Abstract:
Numerous naturalistic, experimental, and mechanistic studies strongly support the notion that-as part of fight-or-flight response-hemostatic responses to acute psychosocial stress result in net hypercoagulability, which would protect a healthy organism from bleeding in case of injury. Sociodemographic factors, mental states, and comorbidities are important modulators of the acute prothrombotic stress response. In patients with atherosclerosis, exaggerated and prolonged stress-hypercoagulability might accelerate coronary thrombus growth following plaque rupture. Against a background risk from acquired prothrombotic conditions and inherited thrombophilia, acute stress also might trigger venous thromboembolic events. Chronic stressors such as job strain, dementia caregiving, and posttraumatic stress disorder as well as psychological distress from depressive and anxiety symptoms elicit a chronic low-grade hypercoagulable state that is no longer viewed as physiological but might impair vascular health. Through activation of the sympathetic nervous system, higher order cognitive processes and corticolimbic brain areas shape the acute prothrombotic stress response. Hypothalamic-pituitary-adrenal axis and autonomic dysfunction, including vagal withdrawal, are important regulators of hemostatic activity with longer lasting stress. Randomized placebo-controlled trials suggest that several cardiovascular drugs attenuate the acute prothrombotic stress response. Behavioral interventions and psychotropic medications might mitigate chronic low-grade hypercoagulability in stressed individuals, but further studies are clearly needed. Restoring normal hemostatic function with biobehavioral interventions bears the potential to ultimately decrease the risk of thrombotic diseases.
Abstract:
Lautropia mirabilis, a pleomorphic, motile, gram-negative coccus, has been isolated from the oral cavities of 32 of 60 (53.3%) children infected with human immunodeficiency virus (HIV) and 3 of 25 (12.0%) HIV-uninfected controls; the association of L. mirabilis isolation with HIV infection is significant (P < 0.001). All children in the study, both HIV-infected children and controls, were born to HIV-infected mothers. The presence of this bacterium was not associated with clinical disease in these children. The HIV-infected children with L. mirabilis did not differ from the HIV-infected children without L. mirabilis in immunological status, clinical status, or systemic medications. The role of HIV infection itself or concomitant factors in the establishment of L. mirabilis in the oral cavity remains to be elucidated.