99 results for Programs for improvement
Abstract:
BACKGROUND: Autofluorescence bronchoscopy (AFB) is a highly sensitive tool for the detection of early bronchial cancers. However, its specificity remains limited, primarily due to false positive results induced by hyperplasia, metaplasia and inflammation. We have investigated the potential of blue-violet backscattered light to eliminate false positive results during AFB in a clinical pilot study. METHODS: The diagnostic autofluorescence endoscopy (DAFE) system was equipped with a variable band pass filter in the imaging detection path. The backscattering properties of normal and abnormal bronchial mucosae were assessed by computing the contrast between the two tissue types for blue-violet wavelengths ranging between 410 and 490 nm in 12 patients undergoing routine DAFE examination. In a second study including 6 patients, we used a variable long pass (LP) filter to determine the spectral design of the emission filter dedicated to the detection of this blue-violet light with the DAFE system. RESULTS: (Pre-)neoplastic mucosa showed a clear wavelength dependence of the backscattering properties of blue-violet light, while the reflectivity of normal, metaplastic and hyperplastic autofluorescence-positive mucosa was wavelength independent. CONCLUSIONS: Our results showed that the detection of blue-violet light has the potential to reduce the number of false positive results in AFB. In addition, we determined the spectral design of the emission filter dedicated to the detection of this blue-violet light with the DAFE system.
Abstract:
Early revascularization of pancreatic islet cells after transplantation is crucial for engraftment, and it has been suggested that vascular endothelial growth factor-A (VEGF-A) plays a significant role in this process. Although VEGF gene therapy can improve angiogenesis, uncontrolled VEGF secretion can lead to vascular tumor formation. Here we have explored the role of temporal VEGF expression, controlled by a tetracycline (TC)-regulated promoter, on revascularization and engraftment of genetically modified beta cells following transplantation. To this end, we modified the CDM3D beta cell line using a lentiviral vector to promote secretion of VEGF-A either in a TC-regulated (TET cells) or a constitutive (PGK cells) manner. VEGF secretion, angiogenesis, cell proliferation, and stimulated insulin secretion were assessed in vitro. VEGF secretion was increased in TET and PGK cells, and VEGF delivery resulted in angiogenesis, whereas addition of TC inhibited these processes. Insulin secretion by the three cell types was similar. We used a syngeneic mouse model of transplantation to assess the effects of this controlled VEGF expression in vivo. Time to normoglycemia, intraperitoneal glucose tolerance test, graft vascular density, and cellular mass were evaluated. Increased expression of VEGF resulted in significantly better revascularization and engraftment after transplantation when compared to control cells. In vivo, there was a significant increase in vascular density in grafted TET and PGK cells versus control cells. Moreover, the time for diabetic mice to return to normoglycemia and the stimulated plasma glucose clearance were also significantly accelerated in mice transplanted with TET and PGK cells when compared to control cells. VEGF was only needed during the first 2-3 weeks after transplantation; when removed, normoglycemia and graft vascularization were maintained. TC-treated mice grafted with TC-treated cells failed to restore normoglycemia. 
This approach allowed us to switch off VEGF secretion when the desired effects had been achieved. TC-regulated temporal expression of VEGF using a gene therapy approach presents a novel way to improve early revascularization and engraftment after islet cell transplantation.
Abstract:
Inhibitory control, a core component of executive functions, refers to our ability to suppress intended or ongoing cognitive or motor processes. Mostly based on Go/NoGo paradigms, a considerable amount of literature reports that inhibitory control of responses to "NoGo" stimuli is mediated by top-down mechanisms manifesting ∼200 ms after stimulus onset within frontoparietal networks. However, whether inhibitory functions in humans can be trained and the supporting neurophysiological mechanisms remain unresolved. We addressed these issues by contrasting auditory evoked potentials (AEPs) to left-lateralized "Go" and right NoGo stimuli recorded at the beginning versus the end of 30 min of active auditory spatial Go/NoGo training, as well as during passive listening of the same stimuli before versus after the training session, generating two separate 2 × 2 within-subject designs. Training improved Go/NoGo proficiency. Response times to Go stimuli decreased. During active training, AEPs to NoGo, but not Go, stimuli modulated topographically with training 61-104 ms after stimulus onset, indicative of changes in the underlying brain network. Source estimations revealed that this modulation followed from decreased activity within left parietal cortices, which in turn predicted the extent of behavioral improvement. During passive listening, in contrast, effects were limited to topographic modulations of AEPs in response to Go stimuli over the 31-81 ms interval, mediated by decreased right anterior temporoparietal activity. We discuss our results in terms of the development of an automatic and bottom-up form of inhibitory control with training and a differential effect of Go/NoGo training during active executive control versus passive listening conditions.
Abstract:
Background and objective: Optimal care of diabetic patients (DPs) decreases the risk of complications. Close blood glucose monitoring can improve patient outcomes and shorten hospital stay. The objective of this pilot study was to evaluate the treatment of hospitalized DPs according to the current standards, including their diabetic treatment and drugs to prevent diabetes-related complications [= guardian drugs: angiotensin converting enzyme inhibitors (ACEI) or angiotensin II receptor blockers (ARB), antiplatelet drugs, statins]. Guidelines of the American Diabetes Association (ADA) [1] were used as reference as they were the most recent and exhaustive for hospital care. Design: Observational pilot study: analysis of the medical records of all DPs seen by the clinical pharmacists during medical rounds in different hospital units. An assessment was made by assigning points for fulfilling the different criteria according to ADA and then by dividing the total by the maximum achievable points (scale 0-1; 1 = all criteria fulfilled). Setting: Different Internal Medicine and Geriatric Units of the (multi-site) Hôpital du Valais. Main outcome measures: - Completeness of diabetes-related information: type of diabetes, medical history, weight, albuminuria status, renal function, blood pressure, (recent) lipid profile. - Management of blood glucose: HbA1c, glycemic control, plan for treating hyper-/hypoglycaemia. - Presence of guardian drugs if indicated. Results: Medical records of 42 patients in 10 different units were analysed (18 women, 24 men, mean age 75.4 ± 11 years). 41 had type 2 diabetes. - Completeness of diabetes-related information: 0.8 ± 0.1. Information often missing: insulin-dependence (43%) and lipid profile (86%). - Management of blood glucose: 0.5 ± 0.2.
15 patients had a suboptimal glycemic balance (target glycaemia 7.2-11.2 mmol/l, with values >11.2 or <3.8 mmol/l, or HbA1c >7%), and 10 patients had a deregulated balance (more than 10 values >11.2 mmol/l or <3.8 mmol/l, and even values >15 mmol/l). - Presence of guardian drugs if indicated: ACEI/ARB: 19 of 23 patients (82.6%), statin: 16 of 40 patients (40%), antiplatelet drug: 16 of 39 patients (41%). Conclusions: Blood glucose control was insufficient in many DPs, and prescription of statins and antiplatelet drugs was often missing. If confirmed by a larger study, these two points need to be optimised. As it is not always possible and appropriate to make those changes during hospital stay, a further project should assess and optimise diabetes care across both inpatient and outpatient settings.
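The 0-1 assessment described above (points awarded per fulfilled criterion, divided by the maximum achievable points) can be sketched as follows. The criterion names and the sample record are illustrative placeholders, not the actual ADA grid.

```python
# Minimal sketch of the 0-1 completeness score used in the study above:
# one point per documented criterion, divided by the maximum achievable.
# Criteria and record contents are illustrative, not the real ADA grid.

def completeness_score(record: dict, criteria: list) -> float:
    """Fraction of criteria documented in a patient record (0-1 scale)."""
    fulfilled = sum(1 for c in criteria if record.get(c) is not None)
    return fulfilled / len(criteria)

criteria = ["diabetes_type", "weight", "albuminuria", "renal_function",
            "blood_pressure", "lipid_profile"]
record = {"diabetes_type": 2, "weight": 78, "albuminuria": "negative",
          "renal_function": 65, "blood_pressure": "130/80",
          "lipid_profile": None}  # lipid profile missing, as in 86% of records

print(round(completeness_score(record, criteria), 2))  # → 0.83
```

Averaging such per-record scores over all 42 records yields the reported 0.8 ± 0.1 for completeness of diabetes-related information.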
Abstract:
When the US Preventive Services Task Force (USPSTF) in 2009 recommended against universal breast cancer screening with mammography in women aged 40 to 49 years, some scientists, radiologists, politicians, and patients strongly objected. The controversy has been called the "mammography wars." The latest chapter in these wars comes from the Swiss Medical Board, which is mandated by the Conference of Health Ministers of the Swiss Cantons, the Swiss Medical Association, and the Swiss Academy of Medical Sciences to conduct health technology assessments. In a February 2014 report, the Swiss Medical Board stated that new systematic mammography screening programs should not be introduced, irrespective of the age of the women, and that existing programs should be discontinued. The board's main argument was that the absolute reduction in breast cancer mortality was low and that the adverse consequences of the screening were substantial. The absolute risk reduction in breast cancer mortality was estimated by the board at 0.16% for women screened during 6.2 years and followed up over 13 years, based on the results of a recent Cochrane Review. The adverse consequences include false-positive test results, overdiagnosis and overtreatment of patients, and high costs, including the expense of follow-up testing and procedures.
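To make the board's headline figure concrete: an absolute risk reduction (ARR) of 0.16% corresponds, by the standard conversion NNS = 1/ARR, to roughly 625 women needing to be screened to prevent one breast cancer death. A minimal sketch of this arithmetic, using only the figure quoted above:

```python
# Standard epidemiological conversion from absolute risk reduction (ARR)
# to number needed to screen (NNS = 1 / ARR), applied to the 0.16%
# figure cited from the Swiss Medical Board report above.
arr = 0.0016          # 0.16% absolute reduction in breast cancer mortality
nns = 1 / arr         # women screened per breast cancer death averted
print(round(nns))     # → 625
```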
Abstract:
While the morphological and electrophysiological changes underlying diabetic peripheral neuropathy (DPN) are relatively well described, the involved molecular mechanisms remain poorly understood. In this study, we investigated whether phenotypic changes associated with early DPN are correlated with transcriptional alterations in the neuronal (dorsal root ganglia [DRG]) or the glial (endoneurium) compartments of the peripheral nerve. We used Ins2(Akita/+) mice to study transcriptional changes underlying the onset of DPN in type 1 diabetes mellitus (DM). Weight, blood glucose and motor nerve conduction velocity (MNCV) were measured in Ins2(Akita/+) and control mice during the first three months of life in order to determine the onset of DPN. Based on this phenotypic characterization, we performed gene expression profiling using sciatic nerve endoneurium and DRG isolated from pre-symptomatic and early symptomatic Ins2(Akita/+) mice and sex-matched littermate controls. Our phenotypic analysis of Ins2(Akita/+) mice revealed that DPN, as measured by reduced MNCV, is detectable in affected animals already one week after the onset of hyperglycemia. Surprisingly, the onset of DPN was not associated with any major persistent changes in gene expression profiles in either sciatic nerve endoneurium or DRG. Our data thus demonstrated that the transcriptional programs in both endoneurial and neuronal compartments of the peripheral nerve are relatively resistant to the onset of hyperglycemia and hypoinsulinemia, suggesting that either minor transcriptional alterations or changes at the proteomic level are responsible for the functional deficits associated with the onset of DPN in type 1 DM.
Abstract:
OBJECTIVE: The objective of this study was to analyse the use of lights and siren (L&S) during transport to the hospital by the prehospital severity status of the patient and the time saved by the time of day of the mission. METHODS: We searched the Public Health Services data of a Swiss state from 1 January 2010 to 31 December 2010. All primary patient transports within the state were included (24 718). The data collected included the use of L&S, patient demographics, the time and duration of transport, the type of mission (trauma vs. nontrauma) and the severity of the condition according to the National Advisory Committee for Aeronautics (NACA) score assigned by the paramedics and/or emergency physician. We excluded 212 transports because of missing data. RESULTS: A total of 24 506 ambulance transports met the inclusion criteria. L&S were used 4066 times, or in 16.6% of all missions. Of these, 40% were graded NACA less than 4. Overall, the mean total transport time to return to the hospital was 11.09 min (confidence interval 10.84-11.34) with L&S and 12.84 min (confidence interval 12.72-12.96) without. The difference was 1.75 min (105 s; P<0.001). For night-time runs alone, the mean time saved using L&S was 0.17 min (10.2 s; P=0.27). CONCLUSION: At present, the use of L&S seems questionable given the severity status or NACA score of transported patients. Our results should prompt the implementation of more specific regulations for L&S use during transport to the hospital, taking into consideration certain physiological criteria of the victim as well as the time of day of transport.
Abstract:
The frequent lack of microbiological documentation of infection by blood cultures (BC) has a major impact on clinical management of febrile neutropenic patients, especially in cases of unexplained persistent fever. We assessed the diagnostic utility of the LightCycler SeptiFast test (SF), a multiplex blood PCR, in febrile neutropenia. Blood for BC and SF was drawn at the onset of fever and every 3 days of persistent fever. SF results were compared with those of BC, clinical documentation of infection, and standard clinical, radiological, and microbiological criteria for invasive fungal infections (IFI). A total of 141 febrile neutropenic episodes in 86 hematological patients were studied: 44 (31%) microbiologically and 49 (35%) clinically documented infections and 48 (34%) unexplained fevers. At the onset of fever, BC detected 44 microorganisms in 35/141 (25%) episodes. Together, BC and SF identified 78 microorganisms in 61/141 (43%) episodes (P = 0.002 versus BC or SF alone): 12 were detected by BC and SF, 32 by BC only, and 34 by SF only. In 19/52 (37%) episodes of persistent fever, SF detected 28 new microorganisms (7 Gram-positive bacterial species, 15 Gram-negative bacterial species, and 6 fungal species [89% with a clinically documented site of infection]) whereas BC detected only 4 pathogens (8%) (P = 0.001). While BC did not detect fungi, SF identified 5 Candida spp. and 1 Aspergillus sp. in 5/7 probable or possible cases of IFI. Using SeptiFast PCR combined with blood cultures improves microbiological documentation in febrile neutropenia, especially when fever persists and invasive fungal infection is suspected. Technical adjustments may enhance the efficiency of this new molecular tool in this specific setting.
Abstract:
Background: Canakinumab, a fully human anti-IL-1β antibody, has been shown to control inflammation in gouty arthritis. This study evaluated changes in health-related quality of life (HRQoL) in patients treated with canakinumab or triamcinolone acetonide (TA). Methods: In an 8-wk, dose-ranging, active-controlled, single-blind study, patients (≥18 to ≤80 years) with an acute gouty arthritis flare, refractory to or with contraindications to NSAIDs and/or colchicine, were randomized to canakinumab 10, 25, 50, 90 or 150 mg s.c. or TA 40 mg i.m. HRQoL was assessed using patient-reported outcomes evaluating the PCS, MCS and subscale scores of the SF-36 (acute version 2) and functional disability (HAQ-DI). Results: In the canakinumab 150 mg group, the most severe impairment at baseline was reported for physical functioning and bodily pain, with levels of 41.5 and 36.0, respectively, which improved within 7 days to 80.0 and 72.2 (mean increases of 39.0 and 35.6) and at 8 wks improved to 86.1 and 86.6 (mean increases of 44.6 and 50.6); these were higher than levels seen in the general US population. The TA group showed less improvement within 7 days (mean increases of 23.3 and 21.3 for physical function and bodily pain). Functional disability scores, measured by the HAQ-DI, decreased in both treatment groups (Table 1). Conclusions: Gouty arthritis patients treated with canakinumab showed a rapid improvement in physical and mental well-being based on SF-36 scores. In contrast to the TA group, patients treated with canakinumab showed improvement within 7 days in physical function and bodily pain, approaching levels of the general population. Disclosure statement: U.A., A.F., V.M., D.R., P.S. and K.S. are employees and shareholders of Novartis Pharma AG. A.P. has received research support from Novartis Pharma AG. N.S. has received research support and consultancy fees from Novartis Pharmaceuticals Corporation, has served on advisory boards for Novartis, Takeda, Savient, URL Pharma and EnzymeRx, and is/has been a member of a speakers' bureau for Takeda. A.S. has received consultation fees from Novartis Pharma AG, Abbott, Bristol-Myers Squibb, Essex, Pfizer, MSD, Roche, UCB and Wyeth. All other authors have declared no conflicts of interest.
Abstract:
Reports of a significant decrease in inpatient hospital mortality and morbidity with efficient insulin therapy have demonstrated the need for good glycaemic control in patients hospitalised in acute care. However, hospital management of patients with hyperglycaemia faces numerous difficulties: errors often occur when prescribing insulin, and management skills are insufficient. Our goal is to change medical and nursing practices in order to evolve towards efficient and safe management of the hospitalised patient. The model we lay out in this article is based upon observation of the therapeutic support of patients with a chronic condition, using a systemic management approach.
Abstract:
Introduction: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on measurement of blood concentrations. Maintaining concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance. In the last decades, computer programs have been developed to assist clinicians in this task. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities. Method: A literature and Internet search was performed to identify software. All programs were tested on a common personal computer. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them. Results: 12 software tools were identified, tested and ranked, providing a comprehensive review of the available software's characteristics. The number of drugs handled varies widely, and 8 programs offer the user the ability to add their own drug models. 10 computer programs are able to compute Bayesian dosage adaptation based on a blood concentration (a posteriori adjustment), while 9 are also able to suggest an a priori dosage regimen (prior to any blood concentration measurement) based on individual patient covariates such as age, gender and weight. Among those applying Bayesian analysis, one uses a non-parametric approach. The top 2 software tools emerging from this benchmark are MwPharm and TCIWorks. Other programs evaluated also have good potential but are less sophisticated (e.g. in terms of storage or report generation) or less user-friendly. Conclusion: Whereas 2 integrated programs are at the top of the ranked list, such complex tools would possibly not fit all institutions, and each software tool must be regarded with respect to the individual needs of hospitals or clinicians. Interest in computing tools to support therapeutic monitoring is still growing. Although developers have put effort into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, capacity of data storage and report generation.
Abstract:
Objectives: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on blood concentration measurements. Maintaining concentrations within a target range requires pharmacokinetic (PK) and clinical capabilities. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities. Methods: The literature and the Internet were searched to identify software. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them. Results: 12 software tools were identified, tested and ranked, providing a comprehensive review of the available software characteristics. The number of drugs handled varies from 2 to more than 180, and integration of different population types is available for some programs. Furthermore, 8 programs offer the ability to add new drug models based on population PK data. 10 computer tools incorporate Bayesian computation to predict the dosage regimen (individual parameters are calculated based on population PK models). All of them are able to compute Bayesian a posteriori dosage adaptation based on a blood concentration, while 9 are also able to suggest an a priori dosage regimen based only on individual patient covariates. Among those applying Bayesian analysis, MM-USC*PACK uses a non-parametric approach. The top 2 programs emerging from this benchmark are MwPharm and TCIWorks.
Other programs evaluated also have good potential but are less sophisticated or less user-friendly. Conclusions: Whereas 2 software packages are ranked at the top of the list, such complex tools would possibly not fit all institutions, and each program must be regarded with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast for routine activities, including for non-experienced users. Although interest in TDM tools is growing and efforts have been put into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, capability of data storage and automated report generation.
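The weighted scoring grid described in both benchmarks (per-criterion scores combined with weighting factors reflecting relative importance) can be sketched as follows. The criterion names echo the abstract, but the weights and scores are hypothetical placeholders, since the actual grid values are not reproduced here.

```python
# Sketch of a weighted scoring grid for benchmarking TDM software:
# each criterion score (0-1) is multiplied by a weight reflecting its
# relative importance, and the sum is normalised by the total weight.
# Weights and scores below are hypothetical placeholders.

def weighted_score(scores: dict, weights: dict) -> float:
    """Weighted mean of per-criterion scores, each on a 0-1 scale."""
    total_weight = sum(weights.values())
    return sum(scores[c] * w for c, w in weights.items()) / total_weight

weights = {"pharmacokinetic_relevance": 3, "user_friendliness": 2,
           "computing_aspects": 2, "interfacing": 1, "storage": 1}
scores = {"pharmacokinetic_relevance": 0.9, "user_friendliness": 0.6,
          "computing_aspects": 0.8, "interfacing": 0.4, "storage": 0.5}

print(round(weighted_score(scores, weights), 2))  # → 0.71
```

Ranking the 12 tools then amounts to sorting them by this normalised weighted score, with 1.0 meaning every criterion is fully met.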
Abstract:
OBJECTIVES: To examine whether percutaneous alcohol septal ablation affects coronary flow reserve (CFR) in patients with hypertrophic cardiomyopathy (HCM). METHODS: CFR was measured immediately before and after septal ablation in patients with symptomatic obstructive HCM. CFR was also obtained in normal subjects (NL) for comparison. RESULTS: Patients with HCM (n = 11), compared with NL (n = 22), had a lower mean (SD) baseline CFR (1.96 (0.5) vs 3.0 (0.7), p<0.001), a lower coronary resistance (1.04 (0.45) vs 3.0 (2.6), p = 0.002), a higher coronary diastolic/systolic velocity ratio (DSVR; 5.1 (3.0) vs 1.8 (0.5), p = 0.04) and a lower hyperaemic coronary flow per left ventricular (LV) mass (0.73 (0.4) vs 1.1 (0.6) ml/min/g, p = 0.007). Septal ablation in the HCM group (n = 7) reduced the outflow tract gradient but not the left atrial or LV diastolic pressures. Ablation resulted in immediate normalisation of CFR (to 3.1 (1), p = 0.01) and DSVR (to 1.9 (0.8), p = 0.09) and an increase in coronary resistance (to 1.91 (0.6), p = 0.02). This was probably related to an improvement in the systolic coronary flow. CONCLUSIONS: This study demonstrates that successful septal ablation in patients with symptomatic HCM results in immediate improvement in CFR, which is reduced in HCM partly because of the increased systolic contraction load.