864 results for: Hyperbaric oxygen, Optimal protocol, Chronic wound, Mathematical modelling, Diabetes
Abstract:
This phase II trial investigated rituximab and cladribine in chronic lymphocytic leukemia. Four induction cycles, comprising cladribine (0.1 mg/kg/day, days 1-5, cycles 1-4) and rituximab (375 mg/m², day 1, cycles 2-4), were given every 28 days. Stem cell mobilization (rituximab 375 mg/m² days 1 and 8; cyclophosphamide 4 g/m² day 2; and granulocyte colony-stimulating factor 10 µg/kg/day from day 4) was performed in responders. Of 42 patients, nine achieved complete remission (CR), 15 very good partial remission, and two nodular partial remission (overall response rate 62%). Stem cell mobilization and harvesting (≥2 × 10⁶ stem cells/kg body weight) were successful in 12 of 20 patients. Rituximab infusion-related adverse events were moderate. The main grade 3/4 adverse events during induction were neutropenia and lymphocytopenia. Rituximab plus cladribine was effective; however, the CR rate was modest and stem cell harvest was impaired in a large number of responding patients.
Abstract:
In Switzerland, around 30,000 patients suffer from chronic skin wounds. Appropriate topical wound care, together with treatment of the underlying causes, enables many of these patients to heal and avoids secondary disease such as infection. The final goal of wound care is stable re-epithelialisation. Based on experience with chronic leg ulcers, mainly in our outpatient wound centre, we give a survey of the wound dressings we currently use and discuss their wound-phase-adapted application. Furthermore, we address the two tissue-engineering products reimbursed in Switzerland, Apligraf and EpiDex, as well as the biological matrix product Oasis. The crucial question of which treatment options our health regulatory and insurance systems will offer wound patients in the future remains open to debate.
Abstract:
Background Chronic localized pain syndromes, especially chronic low back pain (CLBP), are common reasons for consultation in general practice. In some cases, chronic localized pain syndromes can appear in combination with chronic widespread pain (CWP). Numerous studies have shown a strong association between CWP and several physical and psychological factors. These studies are population-based and cross-sectional, and therefore do not allow chronology to be assessed. There are very few prospective studies exploring predictors of the onset of CWP, and their main focus is identifying risk factors for CWP incidence. Until now there have been no studies focusing on protective factors that keep patients from developing CWP. Our aim is to perform a cross-sectional study on the epidemiology of CLBP and CWP in general practice and to look for distinctive features regarding resources such as resilience, self-efficacy and coping strategies. A subsequent cohort study is designed to identify risk and protective factors for pain generalization (development of CWP) in primary care CLBP patients. Methods/Design Fifty-nine general practitioners will consecutively recruit, during a 5-month period, all patients consulting their family doctor because of chronic low back pain (pain lasting for at least 3 months). Patients are asked to fill out a questionnaire on pain anamnesis, pain perception, co-morbidities, therapy course, medication, sociodemographic data and psychosomatic symptoms. We assess resilience, coping resources, stress management and self-efficacy as potential protective factors against pain generalization. Furthermore, we assess risk factors for pain generalization such as anxiety, depression, trauma and critical life events. During a twelve-month follow-up period, a cohort of CLBP patients without CWP will be screened on a regular basis (every 3 months) for pain generalization (outcome: incident CWP).
Discussion This cohort study will be the largest study to prospectively analyze predictors of the transition from CLBP to CWP in a primary care setting. In contrast to the typically researched risk factors, which increase the probability of pain generalization, this study also focuses intensively on protective factors, which decrease the probability of pain generalization.
Abstract:
BACKGROUND: In women with chronic anovulation, the choice of the FSH starting dose and the modality of subsequent dose adjustments are critical in controlling the risk of overstimulation. The aim of this prospective randomized study was to assess the efficacy and safety of a decremental FSH dose regimen applied once the leading follicle was 10-13 mm in diameter in women treated for WHO Group II anovulation according to a chronic low-dose (CLD; 75 IU FSH for 14 days, with 37.5 IU increments) step-up protocol. METHODS: Two hundred and nine subfertile women were treated with recombinant human FSH (r-hFSH; Gonal-f) for ovulation induction according to a CLD step-up regimen. When the leading follicle reached a diameter of 10-13 mm, 158 participants were randomized by means of a computer-generated list to receive either the same FSH dose required to achieve the threshold for follicular development (CLD regimen) or half of this FSH dose (sequential (SQ) regimen). hCG was administered only if no more than three follicles ≥16 mm in diameter were present and/or serum estradiol (E₂) values were <1200 pg/ml. The primary outcome measure was the number of follicles ≥16 mm at the time of hCG administration. RESULTS: Clinical characteristics and ovarian parameters at the time of randomization were similar in the two groups. Both the CLD and SQ protocols achieved similar follicular growth in terms of the total number of follicles and of medium-sized or mature follicles (≥16 mm: 1.5 ± 0.9 versus 1.4 ± 0.7, respectively). Furthermore, serum E₂ levels were equivalent in the two groups at the time of hCG administration (441 ± 360 versus 425 ± 480 pg/ml for the CLD and SQ protocols, respectively). The rate of mono-follicular development was identical, as was the percentage of patients who ovulated and achieved pregnancy.
CONCLUSIONS: The results show that the CLD step-up regimen for FSH administration is efficacious and safe for promoting mono-follicular ovulation in women with WHO Group II anovulation. This study confirms that maintaining the same FSH starting dose for 14 days before increasing the dose in a step-up regimen is critical to adequately control the risk of over-response. Strict application of the CLD regimen should be recommended in women with WHO Group II anovulation.
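The CLD step-up and SQ (decremental) dosing logic described above can be reduced to a simple decision rule. The sketch below is an illustrative simplification under assumed conventions (one dose update per visit, 14-day initial step, 37.5 IU increments); it is not the trial's clinical protocol:

```python
def next_dose(current_dose, days_on_dose, lead_follicle_mm, arm="CLD"):
    """Return the next daily FSH dose in IU.

    CLD arm: keep the dose that achieved the follicular threshold.
    SQ arm:  halve it once the leading follicle reaches 10-13 mm.
    """
    if lead_follicle_mm >= 10:               # threshold reached
        return current_dose if arm == "CLD" else current_dose / 2
    if days_on_dose >= 14:                   # no response yet: step up
        return current_dose + 37.5
    return current_dose                      # keep waiting

print(next_dose(75, 14, 8))            # → 112.5 (step up after 14 days)
print(next_dose(112.5, 7, 11, "SQ"))   # → 56.25 (halve at threshold)
```

In this toy form, the two trial arms differ in a single branch: whether the threshold dose is maintained or halved once the leading follicle reaches 10 mm.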
Abstract:
Therapy has improved the survival of heart failure (HF) patients. However, many patients progress to advanced chronic HF (ACHF). We propose a practical clinical definition and describe the characteristics of this condition. Patients who are generally recognised as having ACHF often exhibit the following characteristics: 1) severe symptoms (NYHA class III to IV); 2) episodes with clinical signs of fluid retention and/or peripheral hypoperfusion; 3) objective evidence of severe cardiac dysfunction, shown by at least one of the following: left ventricular ejection fraction <30%, pseudonormal or restrictive mitral inflow pattern on Doppler echocardiography, high left and/or right ventricular filling pressures, or elevated B-type natriuretic peptides; 4) severe impairment of functional capacity, demonstrated by inability to exercise, a 6-minute walk test distance <300 m, or a peak oxygen uptake <12-14 ml/kg/min; 5) history of >1 HF hospitalisation in the past 6 months; 6) presence of all the previous features despite optimal therapy. This definition identifies a group of patients with compromised quality of life, poor prognosis, and a high risk of clinical events. These patients deserve effective therapeutic options and should be potential targets for future clinical research initiatives.
Abstract:
BACKGROUND: Low back pain (LBP) is by far the most prevalent and costly musculoskeletal problem in our society today. Following the recommendations of the Multinational Musculoskeletal Inception Cohort Study (MMICS) Statement, our study aims to define outcome assessment tools for patients with acute LBP, the time point at which chronic LBP becomes manifest, and patient characteristics that increase the risk of chronicity. METHODS: Patients with acute LBP will be recruited from clinics of general practitioners (GPs) in New Zealand (NZ) and Switzerland (CH). They will be assessed by postal survey at baseline and at 3, 6 and 12 weeks and 6 months of follow-up. The primary outcome will be disability as measured by the Oswestry Disability Index (ODI); key secondary endpoints will be general health as measured by the acute SF-12 and pain as measured on the Visual Analogue Scale (VAS). A subgroup analysis of different assessment instruments and baseline characteristics will be performed using multiple linear regression models. This study aims to examine: 1. which biomedical, psychological, social, and occupational outcome assessment tools are identifiers for the transition from acute to chronic LBP, and at which time point this transition becomes manifest; 2. which psychosocial and occupational baseline characteristics, such as work status and period of work absenteeism, influence the course from acute to chronic LBP; 3. differences in outcome assessment tools and baseline characteristics of patients in NZ compared with CH. DISCUSSION: This study will develop a screening tool for patients with acute LBP, to be used in GP clinics, to assess the risk of developing chronic LBP. In addition, biomedical, psychological, social, and occupational patient characteristics that influence the course from acute to chronic LBP will be identified. Furthermore, an appropriate time point for follow-up will be determined to detect this transition.
The generalizability of our findings will be enhanced by the international perspective of this study. TRIAL REGISTRATION: ACTRN12608000520336.
Abstract:
BACKGROUND: There is little evidence on differences across health care systems in the choice and outcome of treatment for chronic low back pain (CLBP), with spinal surgery and conservative treatment as the main options. At least six randomised controlled trials comparing these two options have been performed; they show conflicting results without clear-cut evidence for the superior effectiveness of either intervention, and they could not address whether the treatment effect varied across patient subgroups. Cost-utility analyses display inconsistent results when comparing surgical and conservative treatment of CLBP. Due to its higher feasibility, we chose to conduct a prospective observational cohort study. METHODS: This study aims to examine whether: 1. differences across health care systems result in different outcomes of surgical and conservative treatment of CLBP; 2. patient characteristics (work-related and psychological factors, etc.) and co-interventions (physiotherapy, cognitive behavioural therapy, return-to-work programs, etc.) modify the outcome of treatment for CLBP; 3. cost-utility, in terms of quality-adjusted life years, differs between surgical and conservative treatment of CLBP. This study will recruit 1000 patients from orthopaedic spine units, rehabilitation centres, and pain clinics in Switzerland and New Zealand. Effectiveness will be measured by the Oswestry Disability Index (ODI) at baseline and after six months. The change in ODI will be the primary endpoint of this study. Multiple linear regression models will be used, with the change in ODI from baseline to six months as the dependent variable and the type of health care system, type of treatment, patient characteristics, and co-interventions as independent variables. Interactions will be incorporated between type of treatment and the different co-interventions and patient characteristics. Cost-utility will be measured with an index based on the EQ-5D in combination with cost data.
CONCLUSION: This study will provide evidence as to whether differences across health care systems in the outcome of treatment of CLBP exist. It will classify patients with CLBP into different clinical subgroups and help to identify specific target groups that might benefit from specific surgical or conservative interventions. Furthermore, cost-utility differences will be identified for different groups of patients with CLBP. The main results of this study should be replicated in future studies on CLBP.
Abstract:
IMPORTANCE International guidelines advocate a 7- to 14-day course of systemic glucocorticoid therapy in acute exacerbations of chronic obstructive pulmonary disease (COPD). However, the optimal dose and duration are unknown. OBJECTIVE To investigate whether short-term (5 days) systemic glucocorticoid treatment in patients with COPD exacerbation is noninferior to conventional (14 days) treatment in clinical outcome and whether it decreases exposure to steroids. DESIGN, SETTING, AND PATIENTS REDUCE (Reduction in the Use of Corticosteroids in Exacerbated COPD), a randomized, noninferiority, multicenter trial in 5 Swiss teaching hospitals, enrolling 314 patients presenting to the emergency department with acute COPD exacerbation, past or present smokers (≥20 pack-years) without a history of asthma, from March 2006 through February 2011. INTERVENTIONS Treatment with 40 mg of prednisone daily for either 5 or 14 days in a placebo-controlled, double-blind fashion. The predefined noninferiority criterion was an absolute increase in exacerbations of at most 15%, translating to a critical hazard ratio of 1.515 for a reference event rate of 50%. MAIN OUTCOME AND MEASURE Time to next exacerbation within 180 days. RESULTS Of 314 randomized patients, 289 (92%) of whom were admitted to the hospital, 311 were included in the intention-to-treat analysis and 296 in the per-protocol analysis. Hazard ratios for the short-term vs conventional treatment group were 0.95 (90% CI, 0.70 to 1.29; P = .006 for noninferiority) in the intention-to-treat analysis and 0.93 (90% CI, 0.68 to 1.26; P = .005 for noninferiority) in the per-protocol analysis, meeting our noninferiority criterion. In the short-term group, 56 patients (35.9%) reached the primary end point, versus 57 (36.8%) in the conventional group.
Estimated reexacerbation rates within 180 days were 37.2% (95% CI, 29.5% to 44.9%) in the short-term group and 38.4% (95% CI, 30.6% to 46.3%) in the conventional group, a difference of -1.2% (95% CI, -12.2% to 9.8%). Among patients with a reexacerbation, the median time to event was 43.5 days (interquartile range [IQR], 13 to 118) in the short-term group and 29 days (IQR, 16 to 85) in the conventional group. There was no difference between groups in time to death, in the combined end point of exacerbation, death, or both, or in recovery of lung function. In the conventional group, the mean cumulative prednisone dose was significantly higher (793 mg [95% CI, 710 to 876 mg] vs 379 mg [95% CI, 311 to 446 mg], P < .001), but treatment-associated adverse reactions, including hyperglycemia and hypertension, did not occur more frequently. CONCLUSIONS AND RELEVANCE In patients presenting to the emergency department with acute exacerbations of COPD, 5-day treatment with systemic glucocorticoids was noninferior to 14-day treatment with regard to reexacerbation within 6 months of follow-up but significantly reduced glucocorticoid exposure. These findings support the use of a 5-day glucocorticoid treatment in acute exacerbations of COPD. TRIAL REGISTRATION isrctn.org Identifier: ISRCTN19646069.
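The trial's translation of the 15% absolute noninferiority margin into a critical hazard ratio of 1.515 can be reproduced from the proportional-hazards relation between event-free fractions, S1 = S0^HR. This is a standard derivation; the abstract itself does not spell out the formula:

```python
import math

def critical_hazard_ratio(p0, delta):
    """Critical HR when the noninferiority margin is an absolute
    increase `delta` over a reference event rate `p0`, assuming
    proportional hazards (S1 = S0 ** HR)."""
    return math.log(1 - (p0 + delta)) / math.log(1 - p0)

print(round(critical_hazard_ratio(0.50, 0.15), 3))  # → 1.515
```

With a 50% reference rate, a 65% event rate in the short-term arm corresponds to HR = ln(0.35)/ln(0.50) ≈ 1.515, which is the boundary the one-sided noninferiority test was run against.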
Abstract:
The β2-adrenergic receptor (β2AR) regulates smooth muscle relaxation in the vasculature and airways. Long- and short-acting β-agonists (LABAs/SABAs) are widely used in the treatment of chronic obstructive pulmonary disease (COPD) and asthma. Despite their widespread clinical use, we do not understand well the dominant β2AR regulatory pathways that are stimulated during therapy and bring about tachyphylaxis, the loss of drug effect. An understanding of how the β2AR responds to various β-agonists is therefore crucial to their rational use. Towards that end we have developed deterministic models that explore the mechanism of drug-induced β2AR regulation. These mathematical models can be classified into three classes: (i) six quantitative models of SABA-induced G protein-coupled receptor kinase (GRK)-mediated β2AR regulation; (ii) three phenomenological models of salmeterol (a LABA)-induced GRK-mediated β2AR regulation; and (iii) one semi-quantitative, unified model of SABA-induced GRK-, protein kinase A (PKA)-, and phosphodiesterase (PDE)-mediated regulation of β2AR signalling. The various models were constrained with all or some of the following experimental data: (i) GRK-mediated β2AR phosphorylation in response to various LABAs/SABAs; (ii) dephosphorylation of the GRK site on the β2AR; (iii) β2AR internalisation; (iv) β2AR recycling; (v) β2AR desensitisation; (vi) β2AR resensitisation; (vii) PKA-mediated β2AR phosphorylation in response to a SABA; and (viii) LABA/SABA-induced cAMP profiles ± PDE inhibitors. The models of GRK-mediated β2AR regulation show that plasma membrane dephosphorylation and recycling of the phosphorylated β2AR are required to reconcile the models with the measured dephosphorylation kinetics.
We further used a consensus model to predict the consequences of rapid pulsatile agonist stimulation and found that, although resensitisation was rapid, the β2AR system retained a memory of prior stimuli and desensitised much more rapidly and strongly in response to subsequent stimuli. This could explain tachyphylaxis of SABAs over repeated use in rescue therapy of asthma patients. The LABA models show that the long action of salmeterol can be explained by the decreased stability of the arrestin/β2AR/salmeterol complex. This could explain the long action of β-agonists used in maintenance therapy of asthma patients. Our consensus model of PKA/PDE/GRK-mediated β2AR regulation is being used to identify the dominant β2AR desensitisation pathways under different therapeutic regimens in human airway cells. In summary, our models represent a significant advance towards understanding agonist-specific β2AR regulation that will aid in a more rational use of β2AR agonists in the treatment of asthma.
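The memory effect under pulsatile stimulation can be illustrated with a deliberately minimal two-state ODE, in which the active receptor fraction desensitises while agonist is present and slowly resensitises once it is withdrawn. This is a toy sketch with arbitrary rate constants, not one of the ten models summarised above:

```python
def simulate(pulses, k_des=0.5, k_res=0.05, dt=0.01):
    """Forward-Euler integration of
        dR/dt = -k_des * R       (while agonist is applied)
               + k_res * (1 - R) (resensitisation, always active).
    `pulses` is a list of (t_on, t_off) agonist windows in minutes.
    Returns the (t, R) trajectory."""
    R, t, trace = 1.0, 0.0, []
    t_end = max(off for _, off in pulses) + 30.0
    while t < t_end:
        agonist = any(on <= t < off for on, off in pulses)
        dR = (-k_des * R if agonist else 0.0) + k_res * (1.0 - R)
        R += dR * dt
        t += dt
        trace.append((t, R))
    return trace

# two identical 5-minute pulses, 15 minutes apart: the second pulse
# starts from an incompletely resensitised pool and ends lower
trace = simulate([(0.0, 5.0), (20.0, 25.0)])
```

With these constants the active fraction at the end of the second pulse is lower than at the end of the first, qualitatively reproducing the stronger desensitisation to repeated stimuli described above.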
Abstract:
BACKGROUND The effectiveness and durability of endovascular revascularization therapies for chronic critical limb ischemia (CLI) are challenged by the extensive burden of infrapopliteal arterial disease and by lesion-related characteristics (e.g., severe calcification, chronic total occlusions), which frequently result in poor clinical outcomes. While infrapopliteal vessel patency directly affects pain relief and wound healing, sustained patency and extravascular care both contribute to the ultimate "patient-centric" outcomes of functional limb preservation, mobility and quality of life (QoL). METHODS/DESIGN IN.PACT DEEP is a 2:1 randomized controlled trial designed to assess the efficacy and safety of infrapopliteal arterial revascularization with the IN.PACT Amphirion™ paclitaxel drug-eluting balloon (IA-DEB) versus standard balloon angioplasty (PTA) in patients with Rutherford class 4-5-6 CLI. DISCUSSION This multicenter trial has enrolled 358 patients at 13 European centers, with independent angiographic core lab adjudication of the primary efficacy endpoint of target lesion late luminal loss (LLL) and clinically driven target lesion revascularization (TLR) in major-amputation-free surviving patients through 12 months. An independent wound core lab will evaluate all ischemic wounds to assess the extent of healing and time to healing at 1, 6, and 12 months. A QoL questionnaire including a pain scale will assess changes from baseline scores through 12 months. A Clinical Events Committee and a Data Safety Monitoring Board will adjudicate the composite primary safety endpoint of all-cause death, major amputation, and clinically driven TLR at 6 months, adjudicate other trial endpoints, and supervise patient safety throughout the study. All patients will be followed for 5 years. A literature review of the current status of endovascular treatment of CLI with drug-eluting balloons and standard PTA is presented, and the rationale and design of the IN.PACT DEEP trial are discussed.
IN.PACT DEEP is a milestone: a prospective, randomized, robust, independent core lab-adjudicated CLI trial that will evaluate the role of a new infrapopliteal revascularization technology, the IA-DEB, compared with PTA. It will assess the overall impact on infrapopliteal artery patency, limb salvage, wound healing, pain control, QoL, and patient mobility. The 1-year results of the adjudicated co-primary and secondary endpoints will be available in 2014. TRIAL REGISTRATION NCT00941733
Abstract:
BACKGROUND Surgical site infections are the most common hospital-acquired infections among surgical patients. The administration of surgical antimicrobial prophylaxis reduces the risk of surgical site infections. The optimal timing of this procedure is still a matter of debate. While most studies suggest that prophylaxis should be given as close to the incision time as possible, others conclude that this may be too late for optimal prevention of surgical site infections. A large observational study suggests that surgical antimicrobial prophylaxis should be administered 74 to 30 minutes before surgery. The aim of this article is to report the design and protocol of a randomized controlled trial investigating the optimal timing of surgical antimicrobial prophylaxis. METHODS/DESIGN In this bi-center randomized controlled trial conducted at two tertiary referral centers in Switzerland, we plan to include 5,000 patients undergoing general, oncologic, vascular and orthopedic trauma procedures. Patients are randomized in a 1:1 ratio into two groups: one receiving surgical antimicrobial prophylaxis in the anesthesia room (75 to 30 minutes before incision) and the other receiving it in the operating room (less than 30 minutes before incision). We expect a significantly lower rate of surgical site infections when surgical antimicrobial prophylaxis is administered more than 30 minutes before the scheduled incision. The primary outcome is the occurrence of surgical site infections during a 30-day follow-up period (one year if an implant is in place). Assuming a 5% surgical site infection risk with administration of surgical antimicrobial prophylaxis in the operating room, the planned sample size has 80% power to detect a relative risk reduction for surgical site infections of 33% when administering surgical antimicrobial prophylaxis in the anesthesia room (with a two-sided type I error of 5%). We expect the study to be completed within three years.
DISCUSSION The results of this randomized controlled trial will have an important impact on current international guidelines for infection control strategies in the hospital. Moreover, the results are of significant interest for patient safety and healthcare economics. TRIAL REGISTRATION This trial is registered on ClinicalTrials.gov under the identifier NCT01790529.
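The planned enrolment of 5,000 is broadly consistent with a standard two-proportion sample-size calculation. The sketch below uses the normal approximation; the protocol's exact method and any dropout allowance are assumptions here:

```python
from math import sqrt
from statistics import NormalDist

def n_per_group(p1, p2, alpha=0.05, power=0.80):
    """Per-group sample size for a two-sided two-proportion z-test
    (normal approximation with pooled variance under H0)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    num = (z_a * sqrt(2 * p_bar * (1 - p_bar))
           + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return num / (p1 - p2) ** 2

# 5% baseline SSI risk, 33% relative risk reduction
n = n_per_group(0.05, 0.05 * (1 - 0.33))
# roughly 2,300 per group (~4,600 total), near the planned 5,000
```

Detecting a 33% relative reduction from a 5% baseline is a small absolute difference (about 1.7 percentage points), which is why such a large trial is needed.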
Abstract:
AIMS Skeletal muscle wasting affects 20% of patients with chronic heart failure and has serious implications for their activities of daily living. Assessment of muscle wasting is technically challenging. C-terminal agrin fragment (CAF), a breakdown product of the synaptically located protein agrin, has shown early promise as a biomarker of muscle wasting. We sought to investigate the diagnostic properties of CAF for muscle wasting among patients with heart failure. METHODS AND RESULTS We assessed serum CAF levels in 196 patients who participated in the Studies Investigating Co-morbidities Aggravating Heart Failure (SICA-HF). Muscle wasting was identified using dual-energy X-ray absorptiometry (DEXA) in 38 patients (19.4%). Patients with muscle wasting demonstrated higher CAF values than those without (125.1 ± 59.5 pmol/L vs. 103.8 ± 42.9 pmol/L, P = 0.01). Using receiver operating characteristic (ROC) analysis, we calculated the optimal CAF cutoff to identify patients with muscle wasting as >87.5 pmol/L, which had a sensitivity of 78.9% and a specificity of 43.7%. The area under the ROC curve was 0.63 (95% confidence interval 0.56-0.70). Using simple regression, we found that serum CAF was associated with handgrip strength (R = -0.17, P = 0.03), quadriceps strength (R = -0.31, P < 0.0001), peak oxygen consumption (R = -0.5, P < 0.0001), 6-min walk distance (R = -0.32, P < 0.0001), and gait speed (R = -0.2, P = 0.001), as well as with parameters of kidney and liver function and of iron metabolism and storage. CONCLUSION CAF shows good sensitivity for the detection of skeletal muscle wasting in patients with heart failure. Its assessment may be useful to identify patients who should undergo additional testing, such as detailed body composition analysis. As no other biomarker is currently available, further investigation is warranted.
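The cutoff-selection step behind a reported threshold like >87.5 pmol/L typically picks the point on the ROC curve maximising Youden's J = sensitivity + specificity - 1. A sketch on toy data (the values below are invented for illustration, not SICA-HF data, and the abstract does not state which criterion was used):

```python
def sens_spec(values, labels, cutoff):
    """Sensitivity and specificity for the rule value > cutoff => positive.
    labels: 1 = condition present, 0 = absent."""
    tp = sum(v > cutoff for v, y in zip(values, labels) if y == 1)
    fn = sum(v <= cutoff for v, y in zip(values, labels) if y == 1)
    tn = sum(v <= cutoff for v, y in zip(values, labels) if y == 0)
    fp = sum(v > cutoff for v, y in zip(values, labels) if y == 0)
    return tp / (tp + fn), tn / (tn + fp)

def best_cutoff(values, labels):
    """Candidate cutoff maximising Youden's J (sens + spec - 1)."""
    return max(sorted(set(values)),
               key=lambda c: sum(sens_spec(values, labels, c)))

# toy CAF-like values (pmol/L); 1 = muscle wasting present
vals = [60, 70, 80, 85, 90, 95, 100, 110, 120, 130, 140, 150]
labs = [0,  0,  0,  0,  1,  0,  0,   1,   1,   0,   1,   1]
c = best_cutoff(vals, labs)
sens, spec = sens_spec(vals, labs, c)
```

On this toy set the best cutoff trades a little sensitivity for much better specificity, the same kind of trade-off reflected in the reported 78.9%/43.7% operating point.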
Abstract:
This paper is concerned with the low-dimensional structure of optimal streaks in the Blasius boundary layer. Optimal streaks are well known to exhibit an approximate self-similarity: the streamwise velocity, rescaled with its maximum, remains almost independent of both the spanwise wavenumber and the streamwise coordinate. However, the reason for this self-similar behavior has remained unexplained, and unexploited. After revisiting the structure of the streaks near the leading-edge singularity, two additional approximately self-similar relations involving the velocity components and their wall-normal derivatives are identified. Based on these properties, we derive a low-dimensional model with two degrees of freedom. Comparison with results obtained from the linearized boundary-layer equations shows that this model is consistent and provides good approximations.
Abstract:
Background. The present paper describes a component of a large population cost-effectiveness study that aimed to identify the averted burden and economic efficiency of current and optimal treatment for the major mental disorders. This paper reports the findings for the anxiety disorders (panic disorder/agoraphobia, social phobia, generalized anxiety disorder, post-traumatic stress disorder and obsessive-compulsive disorder). Method. Outcome was calculated as averted 'years lived with disability' (YLD), a population summary measure of disability burden. Costs were the direct health care costs in 1997-8 Australian dollars. The cost per YLD averted (efficiency) was calculated for those already in contact with the health system for a mental health problem (current care) and for a hypothetical optimal care package of evidence-based treatment for this same group. Data sources included the Australian National Survey of Mental Health and Well-being and published treatment effects and unit costs. Results. Current coverage was around 40% for most disorders, with the exception of social phobia at 21%. Receipt of interventions consistent with evidence-based care ranged from 32% of those in contact with services for social phobia to 64% for post-traumatic stress disorder. The cost of this care was estimated at $400 million, resulting in a cost per YLD averted ranging from $7,761 for generalized anxiety disorder to $34,389 for panic/agoraphobia. Under optimal care, costs remained similar but health gains increased substantially, reducing the cost per YLD to under $20,000 for all disorders. Conclusions. Evidence-based care for anxiety disorders would produce greater population health gain at a cost similar to that of current care, resulting in a substantial increase in the cost-effectiveness of treatment.
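The efficiency measure used throughout is a simple ratio: dollars spent per YLD averted. A sketch with hypothetical figures (not taken from the study):

```python
def cost_per_yld(total_cost_aud, yld_averted):
    """Cost-effectiveness ratio: dollars spent per
    year lived with disability (YLD) averted."""
    return total_cost_aud / yld_averted

# hypothetical: A$40M of care averting 5,154 YLD
print(round(cost_per_yld(40_000_000, 5_154)))  # → 7761
```

Read the other way round, a disorder's reported cost per YLD and its share of the $400 million budget together imply how many YLD its treatment averted.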