11 results for International clinical immersion
at Université de Lausanne, Switzerland
Abstract:
PURPOSE: Investigation of the incidence and distribution of congenital structural cardiac malformations among the offspring of mothers with type 1 diabetes and of the influence of periconceptional glycemic control. METHODS: Multicenter retrospective clinical study, literature review, and meta-analysis. The incidence and pattern of congenital heart disease in the authors' own study population and in the literature on the offspring of type 1 diabetic mothers were compared with the incidence and spectrum of the various cardiovascular defects in the offspring of nondiabetic mothers as registered by EUROCAT Northern Netherlands. In addition, medical records were reviewed for HbA(1c) values during the 1st trimester. RESULTS: The distribution of congenital heart anomalies in the authors' own diabetic study population was consistent with the distribution reported in the literature. This distribution differed considerably from that in the nondiabetic population. Approximately half the cardiovascular defects were conotruncal anomalies. The authors' study demonstrated a remarkable increase in the likelihood of visceral heterotaxia and variants of single ventricle among these patients. As expected, elevated HbA(1c) values during the 1st trimester were associated with fetal cardiovascular defects in the offspring. CONCLUSION: This study shows an increased likelihood of specific heart anomalies, namely transposition of the great arteries, persistent truncus arteriosus, visceral heterotaxia, and single ventricle, among the offspring of diabetic mothers. This suggests a profound teratogenic effect at a very early stage of cardiogenesis. The study emphasizes the frequency with which the offspring of diabetes-complicated pregnancies suffer from complex forms of congenital heart disease. Pregnancies with poor 1st-trimester glycemic control are more prone to the presence of fetal heart disease.
Abstract:
We sought to provide a contemporary picture of the presentation, etiology, and outcome of infective endocarditis (IE) in a large patient cohort from multiple locations worldwide. Prospective cohort study of 2781 adults with definite IE who were admitted to 58 hospitals in 25 countries from June 1, 2000, through September 1, 2005. The median age of the cohort was 57.9 (interquartile range, 43.2-71.8) years, and 72.1% had native valve IE. Most patients (77.0%) presented early in the disease (<30 days) with few of the classic clinical hallmarks of IE. Recent health care exposure was found in one-quarter of patients. Staphylococcus aureus was the most common pathogen (31.2%). The mitral (41.1%) and aortic (37.6%) valves were infected most commonly. The following complications were common: stroke (16.9%), embolization other than stroke (22.6%), heart failure (32.3%), and intracardiac abscess (14.4%). Surgical therapy was common (48.2%), and in-hospital mortality remained high (17.7%). Prosthetic valve involvement (odds ratio, 1.47; 95% confidence interval, 1.13-1.90), increasing age (1.30; 1.17-1.46 per 10-year interval), pulmonary edema (1.79; 1.39-2.30), S aureus infection (1.54; 1.14-2.08), coagulase-negative staphylococcal infection (1.50; 1.07-2.10), mitral valve vegetation (1.34; 1.06-1.68), and paravalvular complications (2.25; 1.64-3.09) were associated with an increased risk of in-hospital death, whereas viridans streptococcal infection (0.52; 0.33-0.81) and surgery (0.61; 0.44-0.83) were associated with a decreased risk. In the early 21st century, IE is more often an acute disease, characterized by a high rate of S aureus infection. Mortality remains relatively high.
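The adjusted associations above are reported as odds ratios with 95% confidence intervals. As a purely illustrative aside, the Python sketch below shows how such an odds ratio and Wald confidence interval are derived from a fitted logistic-regression coefficient; the coefficient and standard error are hypothetical values chosen only to roughly reproduce the prosthetic-valve figure quoted above, and this is not the study's actual model or code.

```python
# Illustrative sketch only: deriving an odds ratio and Wald 95% CI from a
# logistic-regression coefficient. beta and se are hypothetical values.
import math

beta = 0.385   # hypothetical log-odds coefficient for a risk factor
se = 0.134     # hypothetical standard error of that coefficient

odds_ratio = math.exp(beta)
ci_low = math.exp(beta - 1.96 * se)
ci_high = math.exp(beta + 1.96 * se)

print(f"OR = {odds_ratio:.2f} (95% CI, {ci_low:.2f}-{ci_high:.2f})")
```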
Abstract:
Molecular monitoring of BCR/ABL transcripts by real-time quantitative reverse transcription PCR (qRT-PCR) is an essential technique for the clinical management of patients with BCR/ABL-positive CML and ALL. Although quantitative BCR/ABL assays are performed in hundreds of laboratories worldwide, results among these laboratories cannot be reliably compared because of heterogeneity in test methods, data analysis, and reporting, and the lack of quantitative standards. Recent efforts toward standardization have been limited in scope. Aliquots of RNA were sent to clinical test centers worldwide to evaluate methods and reporting for e1a2, b2a2, and b3a2 transcript levels using their own qRT-PCR assays. Total RNA was isolated from tissue culture cells that expressed each of the different BCR/ABL transcripts. Serial log dilutions, ranging from 10^0 to 10^-5, were prepared in RNA isolated from HL60 cells. Laboratories performed 5 independent qRT-PCR reactions for each sample type at each dilution. In addition, 15 qRT-PCR reactions of the 10^-3 b3a2 RNA dilution were run to assess reproducibility within and between laboratories. Participants were asked to run the samples following their standard protocols and to report cycle threshold (Ct), quantitative values for BCR/ABL and housekeeping genes, and ratios of BCR/ABL to housekeeping genes for each sample RNA. Thirty-seven (n=37) participants submitted qRT-PCR results for analysis (36, 37, and 34 labs generated data for b2a2, b3a2, and e1a2, respectively). The limit of detection for this study was defined as the lowest dilution at which a Ct value could be detected for all 5 replicates. For b2a2, 15, 16, 4, and 1 lab(s) showed a limit of detection at the 10^-5, 10^-4, 10^-3, and 10^-2 dilutions, respectively. For b3a2, 20, 13, and 4 labs showed a limit of detection at the 10^-5, 10^-4, and 10^-3 dilutions, respectively. For e1a2, 10, 21, 2, and 1 lab(s) showed a limit of detection at the 10^-5, 10^-4, 10^-3, and 10^-2 dilutions, respectively. Log %BCR/ABL ratio values provided a method for comparing results between the different laboratories for each BCR/ABL dilution series. Linear regression analysis revealed concordance among the majority of participant data over the 10^-1 to 10^-4 dilutions. The overall slope values showed comparable results among the majority of b2a2 (mean=0.939; median=0.9627; range, 0.399-1.1872), b3a2 (mean=0.925; median=0.922; range, 0.625-1.140), and e1a2 (mean=0.897; median=0.909; range, 0.5174-1.138) laboratory results (Figs. 1-3). Thirty-four (n=34) of the 37 laboratories reported Ct values for all 15 replicates, and only those with a complete data set were included in the inter-lab calculations. Eleven laboratories either did not report their copy number data or used other reporting units such as nanograms or cell numbers; therefore, only 26 laboratories were included in the overall analysis of copy numbers. The median copy number was 348.4, with a range from 15.6 to 547,000 copies (approximately a 4.5-log difference); the median intra-lab %CV was 19.2%, with a range from 4.2% to 82.6%. While our international performance evaluation using serially diluted RNA samples has reinforced the fact that heterogeneity exists among clinical laboratories, it has also demonstrated that performance within a laboratory is, overall, very consistent. Accordingly, the availability of defined BCR/ABL RNAs may facilitate the validation of all phases of quantitative BCR/ABL analysis and may be extremely useful as a tool for monitoring assay performance.
Ongoing analyses of these materials, along with the development of additional control materials, may solidify consensus around their application in routine laboratory testing and their possible integration into worldwide efforts to standardize quantitative BCR/ABL testing.
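As an illustration of the quantities this evaluation compares, the following Python sketch computes a mean log %BCR/ABL ratio per dilution, the slope of that ratio against the log dilution, and an intra-lab %CV of replicate measurements. All numbers and variable names are hypothetical; this is not the participating laboratories' analysis code, only a minimal rendering of the calculations described above.

```python
# Illustrative sketch only: summarizing a qRT-PCR dilution series as described
# above -- mean log %BCR/ABL ratios regressed against log dilution, plus the
# intra-lab %CV of replicate measurements. All values are hypothetical.
import numpy as np

# Hypothetical replicate BCR/ABL-to-housekeeping-gene ratios at each dilution
# (10^-1 .. 10^-4), five replicates per dilution.
ratios = {
    -1: [0.095, 0.102, 0.089, 0.110, 0.098],
    -2: [0.0101, 0.0093, 0.0110, 0.0088, 0.0105],
    -3: [0.00092, 0.00104, 0.00097, 0.00088, 0.00101],
    -4: [0.000095, 0.000088, 0.000110, 0.000101, 0.000090],
}
dilution_exponents = sorted(ratios, reverse=True)  # [-1, -2, -3, -4]

# Mean log %BCR/ABL ratio at each dilution.
log_pct = [np.log10(np.mean(ratios[d]) * 100) for d in dilution_exponents]

# Slope of log %ratio vs. log dilution over 10^-1 to 10^-4; a value near 1.0
# indicates the expected log-linear response.
slope, intercept = np.polyfit(dilution_exponents, log_pct, 1)

# Intra-lab %CV of the replicates at the 10^-3 dilution, analogous to the
# 15-replicate reproducibility samples in the study.
rep = np.array(ratios[-3])
cv_pct = 100 * rep.std(ddof=1) / rep.mean()

print(f"slope = {slope:.3f}, intra-lab %CV at 10^-3 = {cv_pct:.1f}%")
```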
Abstract:
The 2009 International Society of Urological Pathology consensus conference in Boston made recommendations regarding the standardization of pathology reporting of radical prostatectomy specimens. Issues relating to the substaging of pT2 prostate cancers according to the TNM 2002/2010 system, the reporting of tumor size/volume, and the zonal location of prostate cancers were coordinated by working group 2. A survey circulated before the consensus conference demonstrated that 74% of the 157 participants considered pT2 substaging of prostate cancer to be of clinical and/or academic relevance. The survey also revealed considerable variation in the frequency of reporting of pT2b substage prostate cancer, which was likely a consequence of the variable methodologies used to distinguish pT2a from pT2b tumors. An overview of the literature indicates that current pT2 substaging criteria lack clinical relevance, and the majority (65.5%) of conference attendees wished to discontinue pT2 substaging. Therefore, the consensus was that reporting of pT2 substages should, at present, be optional. Several studies have shown that prostate cancer volume is significantly correlated with other clinicopathological features, including Gleason score and extraprostatic extension of tumor; however, most studies fail to demonstrate prognostic significance on multivariate analysis. Consensus was reached with regard to the reporting of some quantitative measure of the volume of tumor in a prostatectomy specimen, without prescribing a specific methodology. Incorporation of the zonal and/or anterior location of the dominant/index tumor in the pathology report was accepted by most participants, but a formal definition of the identifying features of the dominant/index tumor remained undecided.
Abstract:
PURPOSE: The Cancer Vaccine Consortium of the Cancer Research Institute (CVC-CRI) conducted a multicenter HLA-peptide multimer proficiency panel (MPP) with a group of 27 laboratories to assess the performance of the assay. EXPERIMENTAL DESIGN: Participants used commercially available HLA-peptide multimers and a well-characterized common source of peripheral blood mononuclear cells (PBMC). The frequency of CD8+ T cells specific for two HLA-A2-restricted model antigens was measured by flow cytometry. The panel design allowed participants to use their preferred staining reagents and locally established protocols for cell labeling, data acquisition, and analysis. RESULTS: We observed significant differences in both the performance characteristics of the assay and the reported frequencies of specific T cells across laboratories. These results emphasize the need to identify the critical variables responsible for the observed variability to allow harmonization of the technique across institutions. CONCLUSIONS: Three key recommendations emerged that would likely reduce assay variability and thus move toward harmonization of this assay: (1) use more than two colors for the staining, (2) collect at least 100,000 CD8+ T cells, and (3) use a background control sample to appropriately set the analytical gates. We also provide more insight into the limitations of the assay and identify additional protocol steps that potentially impact the quality of the data generated and therefore should serve as primary targets for systematic analysis in future panels. Finally, we propose initial guidelines for harmonizing assay performance, which include the introduction of standard operating protocols to allow for adequate training of technical staff and auditing of test analysis procedures.
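For context on the quantity being compared across laboratories, the sketch below illustrates, in Python, how a reported antigen-specific CD8+ T-cell frequency might be computed from gated event counts, including subtraction of a background control as in recommendation (3) and a check against the 100,000-event threshold in recommendation (2). The event counts and helper function are hypothetical and are not drawn from the proficiency panel's protocol.

```python
# Illustrative sketch only: computing an antigen-specific CD8+ T-cell
# frequency from gated flow-cytometry event counts, with subtraction of a
# background (irrelevant-multimer) control. All counts are hypothetical.

def multimer_frequency(multimer_pos: int, cd8_total: int) -> float:
    """Percentage of CD8+ events that are multimer-positive."""
    return 100.0 * multimer_pos / cd8_total

# Hypothetical acquisitions: antigen-specific stain and background control.
cd8_events_test = 150_000
test_freq = multimer_frequency(multimer_pos=220, cd8_total=cd8_events_test)
bkg_freq = multimer_frequency(multimer_pos=18, cd8_total=140_000)

# Background-subtracted frequency, floored at zero.
specific_freq = max(test_freq - bkg_freq, 0.0)

# Recommendation (2) above: acquire at least 100,000 CD8+ T cells.
enough_events = cd8_events_test >= 100_000

print(f"specific CD8+ frequency = {specific_freq:.3f}% "
      f"(>=100,000 CD8 events acquired: {enough_events})")
```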
Abstract:
OBJECTIVE: To provide an update to the original Surviving Sepsis Campaign clinical management guidelines, "Surviving Sepsis Campaign Guidelines for Management of Severe Sepsis and Septic Shock," published in 2004. DESIGN: Modified Delphi method with a consensus conference of 55 international experts, several subsequent meetings of subgroups and key individuals, teleconferences, and electronic-based discussion among subgroups and among the entire committee. This process was conducted independently of any industry funding. METHODS: We used the Grades of Recommendation, Assessment, Development and Evaluation (GRADE) system to guide assessment of quality of evidence from high (A) to very low (D) and to determine the strength of recommendations. A strong recommendation (1) indicates that an intervention's desirable effects clearly outweigh its undesirable effects (risk, burden, cost) or clearly do not. Weak recommendations (2) indicate that the tradeoff between desirable and undesirable effects is less clear. The grade of strong or weak is considered of greater clinical importance than a difference in letter level of quality of evidence. In areas without complete agreement, a formal process of resolution was developed and applied. Recommendations are grouped into those directly targeting severe sepsis, recommendations targeting general care of the critically ill patient that are considered high priority in severe sepsis, and pediatric considerations. RESULTS: Key recommendations, listed by category, include early goal-directed resuscitation of the septic patient during the first 6 hrs after recognition (1C); blood cultures before antibiotic therapy (1C); imaging studies performed promptly to confirm potential source of infection (1C); administration of broad-spectrum antibiotic therapy within 1 hr of diagnosis of septic shock (1B) and severe sepsis without septic shock (1D); reassessment of antibiotic therapy with microbiology and clinical data to narrow coverage, when appropriate (1C); a usual 7-10 days of antibiotic therapy guided by clinical response (1D); source control with attention to the balance of risks and benefits of the chosen method (1C); administration of either crystalloid or colloid fluid resuscitation (1B); fluid challenge to restore mean circulating filling pressure (1C); reduction in rate of fluid administration with rising filling pressures and no improvement in tissue perfusion (1D); vasopressor preference for norepinephrine or dopamine to maintain an initial target of mean arterial pressure ≥ 65 mm Hg (1C); dobutamine inotropic therapy when cardiac output remains low despite fluid resuscitation and combined inotropic/vasopressor therapy (1C); stress-dose steroid therapy given only in septic shock after blood pressure is identified to be poorly responsive to fluid and vasopressor therapy (2C); recombinant activated protein C in patients with severe sepsis and clinical assessment of high risk for death (2B except 2C for postoperative patients).
In the absence of tissue hypoperfusion, coronary artery disease, or acute hemorrhage, target a hemoglobin of 7-9 g/dL (1B); a low tidal volume (1B) and limitation of inspiratory plateau pressure strategy (1C) for acute lung injury (ALI)/acute respiratory distress syndrome (ARDS); application of at least a minimal amount of positive end-expiratory pressure in acute lung injury (1C); head of bed elevation in mechanically ventilated patients unless contraindicated (1B); avoiding routine use of pulmonary artery catheters in ALI/ARDS (1A); to decrease days of mechanical ventilation and ICU length of stay, a conservative fluid strategy for patients with established ALI/ARDS who are not in shock (1C); protocols for weaning and sedation/analgesia (1B); using either intermittent bolus sedation or continuous infusion sedation with daily interruptions or lightening (1B); avoidance of neuromuscular blockers, if at all possible (1B); institution of glycemic control (1B), targeting a blood glucose < 150 mg/dL after initial stabilization (2C); equivalency of continuous venovenous hemofiltration or intermittent hemodialysis (2B); prophylaxis for deep vein thrombosis (1A); use of stress ulcer prophylaxis to prevent upper gastrointestinal bleeding using H2 blockers (1A) or proton pump inhibitors (1B); and consideration of limitation of support where appropriate (1D). Recommendations specific to pediatric severe sepsis include greater use of physical examination therapeutic end points (2C); dopamine as the first drug of choice for hypotension (2C); steroids only in children with suspected or proven adrenal insufficiency (2C); and a recommendation against the use of recombinant activated protein C in children (1B). CONCLUSIONS: There was strong agreement among a large cohort of international experts regarding many level 1 recommendations for the best current care of patients with severe sepsis. Evidence-based recommendations regarding the acute management of sepsis and septic shock are the first step toward improved outcomes for this important group of critically ill patients.
Abstract:
OBJECTIVE: Study of the uptake of new medical technologies provides useful information on the transfer of published evidence into usual practice. We conducted an audit of selected hospitals in three countries (Canada, France, and Switzerland) to identify clinical predictors of low-molecular-weight (LMW) heparin use and outpatient treatment, and to compare the pace of uptake of these new therapeutic approaches across hospitals. DESIGN: Historical review of medical records. SETTING AND PARTICIPANTS: We reviewed the medical records of 3043 patients diagnosed with deep vein thrombosis (DVT) in five Canadian, two French, and two Swiss teaching hospitals from 1994 to 1998. MEASURES: We explored independent clinical variables associated with LMW heparin use and outpatient treatment, and determined crude and adjusted rates of LMW heparin use and outpatient treatment across hospitals. RESULTS: For the years studied, the overall rates of LMW heparin use and outpatient treatment in the study sample were 34.1% and 15.8%, respectively, with higher rates of use in later years. Many comorbidities were negatively associated with outpatient treatment, and risk-adjusted rates of use of these new approaches varied significantly across hospitals. CONCLUSION: There has been a relatively rapid uptake of LMW heparins and outpatient treatment for DVT in their early years of availability, but the pace of uptake has varied considerably across hospitals and countries.
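To make the distinction between crude and risk-adjusted rates concrete, here is a hedged Python sketch, using pandas and statsmodels on simulated, hypothetical data, that computes crude hospital-level rates of LMW heparin use and then risk-adjusted rates via a patient-level logistic model with marginal standardization. It illustrates the general approach only and is not the study's analysis code; all column names and values are invented.

```python
# Illustrative sketch only: crude vs. risk-adjusted hospital rates of LMW
# heparin use on simulated, hypothetical data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 600
df = pd.DataFrame({
    "hospital": rng.choice(["A", "B", "C"], size=n),
    "age": rng.integers(30, 90, size=n),
    "cancer": rng.integers(0, 2, size=n),   # hypothetical comorbidity flag
})
# Hypothetical outcome: LMW heparin prescribed (1) or not (0).
logit_p = -1.5 + 0.8 * (df["hospital"] == "B") - 0.5 * df["cancer"] + 0.01 * df["age"]
df["lmwh"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Crude rates by hospital.
crude = df.groupby("hospital")["lmwh"].mean()

# Risk-adjusted rates: fit a patient-level logistic model, then predict for
# the whole cohort as if every patient attended each hospital in turn
# (marginal standardization).
model = smf.logit("lmwh ~ C(hospital) + age + cancer", data=df).fit(disp=False)
adjusted = {h: model.predict(df.assign(hospital=h)).mean() for h in ["A", "B", "C"]}

print("crude rates:\n", crude.round(3))
print("risk-adjusted rates:", {h: round(v, 3) for h, v in adjusted.items()})
```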
Abstract:
Invasive fungal diseases (IFDs) have become major causes of morbidity and mortality among highly immunocompromised patients. Authoritative consensus criteria to diagnose IFD have been useful in establishing eligibility criteria for antifungal trials. There is an important need for generation of consensus definitions of outcomes of IFD that will form a standard for evaluating treatment success and failure in clinical trials. Therefore, an expert international panel consisting of the Mycoses Study Group and the European Organization for Research and Treatment of Cancer was convened to propose guidelines for assessing treatment responses in clinical trials of IFDs and for defining study outcomes. Major fungal diseases that are discussed include invasive disease due to Candida species, Aspergillus species and other molds, Cryptococcus neoformans, Histoplasma capsulatum, and Coccidioides immitis. We also discuss potential pitfalls in assessing outcome, such as conflicting clinical, radiological, and/or mycological data and gaps in knowledge.
Abstract:
OBJECTIVE: Participation, an indicator of screening programme acceptance and effectiveness, varies widely in clinical trials and population-based colorectal cancer (CRC) screening programmes. We aimed to assess whether CRC screening participation rates can be compared across organized guaiac fecal occult blood test (G-FOBT)/fecal immunochemical test (FIT)-based programmes, and what factors influence these rates. METHODS: Programme representatives from countries participating in the International Cancer Screening Network were surveyed to describe their G-FOBT/FIT-based CRC screening programmes, how screening participation is defined and measured, and to provide participation data for their most recent completed screening round. RESULTS: Information was obtained from 15 programmes in 12 countries. Programmes varied in size, reach, maturity, target age groups, exclusions, type of test kit, method of providing test kits, and use and frequency of reminders. Coverage by invitation ranged from 30% to 100%, coverage by the screening programme from 7% to 67.7%, the overall uptake/participation rate from 7% to 67.7%, and first-invitation participation from 7% to 64.3%. Participation rates generally increased with age and were higher among women than men and for subsequent compared with first-invitation participation. CONCLUSION: Comparisons among CRC screening programmes should be made cautiously, given differences in organization, target populations, and interpretation of indicators. More meaningful comparisons are possible if rates are calculated across a uniform age range, by gender, and separately for people invited for the first time vs. those invited previously.
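As a simple illustration of the conclusion's recommendation, the Python sketch below computes participation rates (screened divided by invited) over a uniform age range, stratified by gender and by first vs. subsequent invitation. The data frame and its column names are hypothetical and are not ICSN data.

```python
# Illustrative sketch only: participation rate = screened / invited, over a
# uniform age range, stratified by gender and first vs. subsequent invitation.
# The invitation records below are hypothetical.
import pandas as pd

invitations = pd.DataFrame({
    "age": [52, 61, 58, 66, 70, 55, 63, 69],
    "gender": ["F", "M", "F", "M", "F", "M", "F", "M"],
    "first_invitation": [True, False, True, False, True, True, False, False],
    "returned_kit": [1, 0, 1, 1, 0, 1, 1, 0],
})

# Restrict to a uniform age range (e.g. 60-69) before comparing programmes.
eligible = invitations[invitations["age"].between(60, 69)]

participation = (
    eligible.groupby(["gender", "first_invitation"])["returned_kit"]
    .agg(invited="count", screened="sum")
    .assign(rate_pct=lambda t: 100 * t["screened"] / t["invited"])
)
print(participation)
```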
Abstract:
BACKGROUND: The impact of early valve surgery (EVS) on the outcome of Staphylococcus aureus (SA) prosthetic valve infective endocarditis (PVIE) is unresolved. The objective of this study was to evaluate the association between EVS, performed within the first 60 days of hospitalization, and the outcome of SA PVIE within the International Collaboration on Endocarditis-Prospective Cohort Study. METHODS: Participants were enrolled between June 2000 and December 2006. Cox proportional hazards modeling that included surgery as a time-dependent covariate and propensity adjustment for the likelihood of receiving cardiac surgery was used to evaluate the association between EVS and 1-year all-cause mortality in patients with definite left-sided S. aureus PVIE and no history of injection drug use. RESULTS: EVS was performed in 74 of the 168 (44.3%) patients. One-year mortality was significantly higher among patients with S. aureus PVIE than among patients with non-S. aureus PVIE (48.2% vs 32.9%; P = .003). Staphylococcus aureus PVIE patients who underwent EVS had a significantly lower 1-year mortality rate (33.8% vs 59.1%; P = .001). In multivariate, propensity-adjusted models, however, EVS was not associated with 1-year mortality (risk ratio, 0.67 [95% confidence interval, .39-1.15]; P = .15). CONCLUSIONS: In this prospective, multinational cohort of patients with S. aureus PVIE, EVS was not associated with reduced 1-year mortality. The decision to pursue EVS should be individualized for each patient, based upon infection-specific characteristics rather than solely upon the microbiology of the infection causing PVIE.
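The methods describe a Cox model with surgery as a time-dependent covariate plus a propensity adjustment. The sketch below shows, on simulated hypothetical data with the lifelines library, how such a model can be set up in long (start-stop) format with the propensity score included as an adjustment covariate; it is a generic illustration of the technique, not the ICE-PCS analysis code.

```python
# Illustrative sketch only: Cox proportional hazards with early valve surgery
# (EVS) as a time-dependent covariate and the propensity score for receiving
# surgery as an adjustment covariate, fitted with lifelines on simulated,
# hypothetical long-format (start-stop) data.
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(1)
rows = []
for pid in range(150):
    propensity = rng.uniform(0.1, 0.9)            # hypothetical propensity score
    had_surgery = rng.random() < propensity
    surgery_day = int(rng.integers(5, 60)) if had_surgery else None
    follow_up = int(rng.integers(60, 365))        # days of follow-up
    death = int(rng.random() < 0.3)               # 1-year all-cause mortality flag
    if surgery_day is not None:
        # Before surgery: evs = 0 and no event; after surgery: evs = 1.
        rows.append((pid, 0, surgery_day, 0, propensity, 0))
        rows.append((pid, surgery_day, follow_up, 1, propensity, death))
    else:
        rows.append((pid, 0, follow_up, 0, propensity, death))

long_df = pd.DataFrame(
    rows, columns=["id", "start", "stop", "evs", "propensity", "death"]
)

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", start_col="start", stop_col="stop",
        event_col="death")
ctv.print_summary()  # hazard ratio for `evs`, adjusted for `propensity`
```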
Abstract:
BACKGROUND: To assess the differences across continental regions in the stroke imaging obtained for making acute revascularization therapy decisions, and to identify obstacles to participating in randomized trials involving multimodal imaging. METHODS: The STroke Imaging Repository (STIR) and Virtual International Stroke Trials Archive (VISTA)-Imaging groups circulated an online survey through their websites, through the websites of national professional societies from multiple countries, and through email distribution lists from STIR and the above-mentioned societies. RESULTS: We received responses from 223 centers (2 from Africa, 38 from Asia, 10 from Australia, 101 from Europe, 4 from the Middle East, 55 from North America, and 13 from South America). In combination, the sites surveyed administered acute revascularization therapy to a total of 25,326 acute stroke patients in 2012. Seventy-three percent of these patients received intravenous (i.v.) tissue plasminogen activator (tPA), and 27% received endovascular therapy. Vascular imaging was routinely obtained at 79% (152/193) of sites for endovascular therapy decisions, and also as part of standard i.v. tPA treatment decisions at 46% (92/198) of sites. The modality, availability, and use of acute vascular and perfusion imaging before revascularization varied substantially between geographical areas. The main obstacles to participating in randomized trials involving multimodal imaging were insufficient research support and staff (50%, 79/158) and infrequent use of multimodal imaging (27%, 43/158). CONCLUSION: There were significant variations among sites and geographical areas in the stroke imaging work-up used to make decisions for both intravenous and endovascular revascularization. Clinical trials using advanced imaging as a selection tool for acute revascularization therapy should address the need for additional resources and technical support, and should take into consideration the lack of routine use of such techniques when planning trials.