309 results for Enzyme replacement therapy
Abstract:
Purpose Developments in anti-osteoporosis medications (AOMs) have led to changes in guidelines and policy, which, along with media and marketing strategies, have had an impact on the prescribing of AOMs. The aim was to examine patterns of AOM dispensing in older women (aged 76–81 years at baseline) from 2002 to 2010. Methods Administrative claims data were used to describe AOM dispensing in 4649 participants (born in 1921–1926 and still alive in 2011) in the Australian Longitudinal Study on Women's Health. The patterns were interpreted in the context of changes in guidelines, indications for subsidy, publications (scholarly and general media), and marketing activities. Results Total use of AOMs increased from 134 defined daily doses per 1000 women per day (DDD/1000/day) in 2002 to 216 DDD/1000/day in 2007 but then decreased to 184 DDD/1000/day in 2010. Alendronate was the most commonly dispensed AOM but declined from 2007, while use of risedronate (from 2002), strontium ranelate (from 2007) and zoledronic acid (from 2008) increased. Etidronate and hormone replacement therapy (HRT) prescriptions gradually decreased over time. The decline in alendronate dispensing coincided with increased use of other bisphosphonates and with publicity about potential adverse effects of bisphosphonates, despite a relaxation of the indications for bone density testing and for AOM subsidy. Conclusions Overall dispensing of AOMs rose from 2002 to a peak in 2007 and thereafter declined, despite an increase in therapeutic options and improved subsidised access. The recent decline in overall AOM dispensing appears to be explained largely by negative publicity rather than by specific changes in guidelines and policy.
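Utilisation rates of this kind are conventionally expressed as defined daily doses per 1000 population per day. As a minimal sketch of how such a rate can be derived from claims data (column names and counts below are hypothetical, not the study's actual data or analysis code):

```python
import pandas as pd

# Hypothetical dispensing records: one row per dispensed pack, with the
# number of defined daily doses (DDDs) it contains and the dispensing year.
claims = pd.DataFrame({
    "year": [2002, 2002, 2007, 2007, 2010],
    "ddd":  [28,   30,   30,   90,   30],   # DDDs per dispensed pack
})

cohort_size = 4649       # women contributing person-time in each year
days_per_year = 365.25

# DDD/1000/day = total DDDs dispensed / (persons * days) * 1000
rate = (claims.groupby("year")["ddd"].sum()
        / (cohort_size * days_per_year) * 1000)
print(rate.round(2))     # utilisation rate by calendar year
```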
Abstract:
Patients presenting for total knee replacement (TKR) who take warfarin for medical reasons often require higher levels of anticoagulation peri-operatively than is needed for primary thromboprophylaxis, and may require bridging therapy with heparin. We performed a retrospective case-control study of 149 consecutive primary knee arthroplasty patients to investigate whether anticoagulation affected short-term outcomes. Specific outcome measures showed significant increases in prolonged wound drainage (26.8% of cases vs 7.3% of controls, p<0.001); superficial infection (16.8% vs 3.3%, p<0.001); deep infection (6.0% vs 0%, p<0.001); return to theatre for washout (4.7% vs 0.7%, p=0.004); and revision (4.7% vs 0.3%, p=0.001). Management of patients on long-term warfarin therapy following TKR is particularly challenging, as the surgeon must balance the risk of thromboembolism against post-operative complications for each individual patient in order to optimise outcomes.
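Differences in proportions between case and control groups with small event counts are typically tested with Fisher's exact test. A minimal sketch, assuming hypothetical counts rather than the study's actual data:

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table for one outcome (e.g. deep infection), laid out
# as [[events, non-events] in cases, [events, non-events] in controls].
# These counts are illustrative only, not taken from the study.
table = [[9, 140],    # warfarin (case) group
         [0, 300]]    # control group

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
```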
Abstract:
The epidermal growth factor receptor (EGFR) is part of a family of plasma membrane receptor tyrosine kinases that control many important cellular functions, from growth and proliferation to cell death. Cyclooxygenase (COX)-2 is an enzyme that catalyses the conversion of arachidonic acid to prostaglandins and thromboxane. It is induced by various inflammatory stimuli, including the pro-inflammatory cytokines interleukin (IL)-1β, tumour necrosis factor (TNF)-α and IL-2. Both EGFR and COX-2 are over-expressed in non-small cell lung cancer (NSCLC) and have been implicated in the early stages of tumourigenesis. This paper considers their roles in the development and progression of lung cancer and their potential interactions, and reviews recent progress in cancer therapies directed toward these targets. An increasing body of evidence suggests that selective inhibitors of both EGFR and COX-2 are potential therapeutic agents for the treatment of NSCLC in the adjuvant, metastatic and chemopreventive settings.
Abstract:
Introduction Vascular access devices (VADs), such as peripheral or central venous catheters, are vital across all medical and surgical specialties. To allow therapy or haemodynamic monitoring, VADs frequently require administration sets (AS) composed of infusion tubing, fluid containers, pressure-monitoring transducers and/or burettes. While VADs are replaced only when necessary, AS are routinely replaced every 3–4 days in the belief that this reduces infectious complications. Strong evidence supports AS use up to 4 days, but there is less evidence for AS use beyond 4 days. AS replacement twice weekly increases hospital costs and workload. Methods and analysis This is a pragmatic, multicentre, randomised controlled trial (RCT) of equivalence design comparing AS replacement at 4 (control) versus 7 (experimental) days. Randomisation is stratified by site and device, centrally allocated and concealed until enrolment. A total of 6554 adult/paediatric patients with a central venous catheter, peripherally inserted central catheter or peripheral arterial catheter will be enrolled over 4 years. The primary outcome is VAD-related bloodstream infection (BSI); secondary outcomes are VAD colonisation, AS colonisation, all-cause BSI, all-cause mortality, number of AS per patient, VAD time in situ and costs. Relative incidence rates of VAD-BSI per 100 devices and hazard rates per 1000 device-days (with 95% CIs) will summarise the impact of 7-day relative to 4-day AS use and test equivalence. Kaplan-Meier survival curves (with the log-rank Mantel-Cox test) will compare VAD-BSI over time. Appropriate parametric or non-parametric techniques will be used to compare secondary end points. p values of <0.05 will be considered significant.
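As a minimal sketch of how the headline summary rates (per 100 devices and per 1000 device-days) are derived, with hypothetical event counts and person-time standing in for the as-yet-uncollected trial data:

```python
# Hypothetical summary counts, purely illustrative -- not trial results.
arms = {
    "4-day AS": {"bsi": 12, "devices": 3277, "device_days": 26000},
    "7-day AS": {"bsi": 13, "devices": 3277, "device_days": 26500},
}

for name, a in arms.items():
    per_100_devices = a["bsi"] / a["devices"] * 100
    per_1000_days = a["bsi"] / a["device_days"] * 1000
    print(f"{name}: {per_100_devices:.2f} BSI/100 devices, "
          f"{per_1000_days:.2f} BSI/1000 device-days")

# Equivalence is then assessed by checking that the CI around the rate
# difference (or ratio) lies within a pre-specified equivalence margin.
```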
Abstract:
Background Centers for Disease Control and Prevention (CDC) guidelines recommend replacement of peripheral intravenous (IV) catheters every 72 to 96 hours. Routine replacement is thought to reduce the risk of phlebitis and bacteraemia. However, catheter insertion is an unpleasant experience for patients, and replacement may be unnecessary if the catheter remains functional and there are no signs of inflammation. The costs associated with routine replacement may be considerable. Objectives To assess the effects of removing peripheral IV catheters when clinically indicated, compared with removing and re-siting the catheter routinely.
Abstract:
The shortage of donor hearts for patients with end-stage heart failure has accelerated the development of ventricular assist devices (VADs) that act as a replacement heart. Pulsatile, axial and centrifugal mechanical devices have been proposed. Recent clinical developments indicate that centrifugal devices are not only beneficial as a bridge to transplantation, but may also aid myocardial recovery. A recent study has shown that patients who received a VAD had extended lives and improved quality of life compared with recipients of drug therapy. Unfortunately, 25% of these patients develop right heart failure syndrome, sepsis and multi-organ failure, and it was reported that 17% of patients initially receiving a left ventricular assist device (LVAD) later required a right ventricular assist device (RVAD). Hence, current research focuses on the development of a bi-ventricular assist device (BVAD). Current BVAD technology is either too bulky or requires implanting two independently operating pumps. The latter requires a separate controller for each pump, leading to the potential complication of uneven flow dynamics and to a large body-space requirement. This paper presents the combination of the LVAD and RVAD into one complete device with double impellers to augment the function of both the left and right cardiac chambers. The proposed device has two counter-rotating impellers, eliminating the need for body muscles and the tubing/heart connection to restrain the pump, and two separate chambers, each with an independently rotating impeller for the left and right sides. A known problem with centrifugal impellers is fluid stagnation underneath the impeller, which leads to thrombosis and blood clots. This paper presents the design, construction and location of a washout hole to prevent thrombus formation in a BVAD centrifugal pump. Computational fluid dynamics (CFD) results are used to illustrate the advantages of our design concept in preventing thrombus formation and hemolysis.
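Hemolysis in CFD studies of blood pumps is often estimated by post-processing shear stress and exposure time along streamlines with a power-law blood damage correlation. A minimal sketch using widely cited Giersiepen-type constants (whether this paper uses this exact correlation is an assumption):

```python
# Illustrative power-law blood damage index commonly used to post-process
# CFD results for hemolysis (Giersiepen-type correlation). The constants
# are the widely cited literature values, assumed here for illustration.
def damage_index(shear_stress_pa: float, exposure_time_s: float) -> float:
    """Estimated fraction of hemoglobin released, dHb/Hb (in %)."""
    return 3.62e-5 * shear_stress_pa**2.416 * exposure_time_s**0.785

# Example: fluid parcels along streamlines near the impeller tip
# (hypothetical shear stress / exposure time pairs).
for tau, t in [(50, 0.10), (150, 0.05), (400, 0.01)]:
    print(f"tau={tau:>3} Pa, t={t:.2f} s -> "
          f"dHb/Hb = {damage_index(tau, t):.4f} %")
```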
Abstract:
Chronic wounds are a significant socioeconomic problem for governments worldwide. Approximately 15% of people who suffer from diabetes will experience a lower-limb ulcer at some stage of their lives, and 24% of these wounds will ultimately result in amputation of the lower limb. Hyperbaric oxygen therapy (HBOT) has been shown to aid the healing of chronic wounds; however, the causal reasons for the improved healing remain unclear, and hence current HBOT protocols remain empirical. Here we develop a three-species mathematical model of wound healing that is used to simulate the application of HBOT in the treatment of wounds. Based on our modelling, we predict that intermittent HBOT will assist chronic wound healing while normobaric oxygen is ineffective in treating such wounds. Furthermore, treatment should continue until healing is complete, and HBOT will not stimulate healing under all circumstances, leading us to conclude that finding the right protocol for an individual patient is crucial if HBOT is to be effective. We provide constraints, which depend on the model parameters, for the range of HBOT protocols that will stimulate healing. More specifically, we predict that patients with a poor arterial supply of oxygen, high consumption of oxygen by the wound tissue, chronically hypoxic wounds, and/or a dysfunctional endothelial cell response to oxygen are at risk of nonresponsiveness to HBOT. This work helps to highlight which patients are most likely to respond well to HBOT (for example, those with a good arterial supply), and thus has the potential to improve both the success rate and the cost-effectiveness of this therapy.
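The abstract does not state the model equations, but as a generic illustration of how a three-species wound-healing model with intermittent HBOT forcing can be simulated (the species, functional forms and parameters below are illustrative assumptions, not the authors' model):

```python
import numpy as np
from scipy.integrate import solve_ivp

def hbot_on(t, period=1.0, duty=0.08):
    """Intermittent HBOT: oxygen boost for a short fraction of each day."""
    return (t % period) < duty

def rhs(t, y, w_supply=1.0, w_hbot=8.0):
    # Generic three-species system: oxygen (w), vasculature (n), tissue (c).
    w, n, c = y
    supply = w_supply + (w_hbot if hbot_on(t) else 0.0)
    dw = supply + 2.0 * n - 1.5 * w - 0.5 * w * c   # delivery minus consumption
    dn = 0.8 * n * w * (1 - n) - 0.3 * n            # oxygen-dependent angiogenesis
    dc = 0.6 * n * (1 - c)                          # repair driven by vasculature
    return [dw, dn, dc]

# Small max_step so the integrator resolves the on/off HBOT switching.
sol = solve_ivp(rhs, (0, 60), [0.2, 0.05, 0.0], max_step=0.01)
print(f"tissue repair after 60 days: {sol.y[2, -1]:.2f}")
```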