290 results for Productive use of knowledge
at Université de Lausanne, Switzerland
Abstract:
The growing need for organs and the scarcity of donors have resulted in an increased use of extended criteria donors. We report a case in which the recipient of a cardiac graft was used as an organ donor. Death of the recipient occurred 9 days after transplantation and was attributed to presumed cerebral hemorrhage, which was diagnosed post mortem as invasive aspergillosis of the brain. One recipient of a kidney transplant lost the graft due to infection with Aspergillus fumigatus, whereas prompt initiation of therapy successfully prevented disseminated aspergillosis in the other recipients. Despite the pressure to extend the use of organs by lowering the acceptance criteria, organs should only be accepted if the cause of death of the donor is unequivocally explained.
Abstract:
Fifty-three patients with histologically proven carcinoma were injected with highly purified [131I]-labeled goat antibodies, or fragments of antibodies, against carcinoembryonic antigen (CEA). Each patient was tested by external photoscanning 4, 24, 36 and 48 h after injection. In 22 patients (16 of 38 injected with intact antibodies, 5 of 13 with F(ab')2 fragments and 1 of 2 with Fab' fragments), an increased concentration of 131I radioactivity corresponding to the previously known tumor location was detected by photoscanning 36-48 h after injection. Blood pool and secreted radioactivity were determined in all patients by injecting, 15 min before scanning, [99mTc]-labeled normal serum albumin and free 99mTcO4-. The computerized subtraction of 99mTc from 131I radioactivity enhanced the definition of tumor localization in the 22 positive patients. However, in spite of the computerized subtraction, interpretation of the scans remained doubtful for 12 patients and was entirely negative for 19 additional patients. In order to provide a more objective evaluation of the specificity of tumor localization by antibodies, 14 patients scheduled for tumor resection were injected simultaneously with [131I]-labeled antibodies or fragments and with [125I]-labeled normal goat IgG or fragments. After surgery, the radioactivity of the two isotopes present in tumor and in adjacent normal tissues was measured in a dual-channel scintillation counter. The results showed that the antibodies or their fragments were 2-4 times more concentrated in the tumor than in normal tissues. In addition, it was shown that the injected antibodies formed immune complexes with circulating CEA and that the amount of immune complexes detectable in serum was roughly proportional to the level of circulating CEA.
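As a rough illustration of the subtraction step described above, here is a minimal numpy sketch on synthetic count arrays; the normalization over a tumor-free reference region is an assumption, since the abstract does not detail the exact scaling procedure.

```python
import numpy as np

# Synthetic dual-isotope subtraction: scale the 99mTc blood-pool image to the
# 131I image over a tumor-free reference region, then subtract pixelwise so
# that the residual reflects specific antibody uptake.
rng = np.random.default_rng(0)
i131 = rng.poisson(100, size=(64, 64)).astype(float)   # [131I]-antibody counts
tc99m = rng.poisson(400, size=(64, 64)).astype(float)  # [99mTc]-albumin blood pool
i131[30:34, 30:34] += 80  # synthetic "tumor" excess in the 131I channel only

ref = np.s_[0:16, 0:16]  # assumed tumor-free reference region
scale = i131[ref].sum() / tc99m[ref].sum()
subtracted = i131 - scale * tc99m  # residual image highlighting specific uptake

print(np.unravel_index(subtracted.argmax(), subtracted.shape))  # lands in the synthetic tumor
```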
Abstract:
All patients who underwent coronarography in Switzerland during 1984 were surveyed. This retrospective study used existing data from the 13 centers practicing this diagnostic procedure. 4921 coronarographies were carried out in 1984, in 4359 patients. In terms of population-based rates, the national figures are 77 procedures/100,000 residents and 68 patients/100,000 residents. Female rates are one fourth of the male rates (27/100,000 versus 112/100,000). For both sexes, the highest utilization rates are in the 60-64 age group. Swiss figures are relatively low when compared with other developed countries. However, patterns of utilization differ markedly within the country: according to the Canton of residence of the patient, the utilization rates (standardized for age and sex) vary from 8/100,000 to 160/100,000. There is a distinct gradient from south-west to north-east, which closely corresponds to the distribution of centers practicing the procedure. More intriguing is the fact that cardiovascular mortality shows an inverse geographical gradient, with the highest mortality in Cantons having the lowest rate of coronarography. Various reasons for the observed variations are discussed, in relation to differences in the supply of diagnostic and therapeutic equipment, but also in relation to various patterns of demand linked to differential morbidity rates and/or differential patterns of clinical decision-making.
Abstract:
In traditional criminal investigation, uncertainties are often dealt with using a combination of common sense, practical considerations and experience, but rarely with tailored statistical models. For example, in some countries, in order to search for a given profile in the national DNA database, it must contain allelic information for six or more of the ten SGM Plus loci for a simple trace. If the profile does not have this amount of information, it cannot be searched in the national DNA database (NDNAD). This requirement (of a result at six or more loci) is not based on a statistical approach, but rather on the feeling that six or more would be sufficient. A statistical approach, however, could be more rigorous and objective, and would take into consideration factors such as the probability of adventitious matches relative to the actual database size and/or the investigator's requirements in a sensible way. Therefore, this research was undertaken to establish scientific foundations for the use of partial SGM Plus profiles (or similar) in investigation.
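To illustrate the kind of statistical consideration the authors call for, here is a minimal sketch of how the probability of an adventitious match scales with database size; the match probabilities and database size used are illustrative assumptions, not values from the study.

```python
# Illustrative sketch: probability of at least one adventitious match when
# searching a DNA profile against a database of n unrelated profiles.
# The per-comparison random match probability p is a hypothetical input; real
# values depend on the loci typed and the allele frequencies used.

def p_adventitious(p: float, n: int) -> float:
    """Probability of >= 1 chance match among n independent comparisons."""
    return 1.0 - (1.0 - p) ** n

database_size = 5_000_000  # hypothetical NDNAD-scale database

# A full SGM Plus profile has a far smaller match probability than a partial
# one; both figures below are assumed purely for illustration.
for label, p in [("full profile (assumed p=1e-13)", 1e-13),
                 ("6-locus partial (assumed p=1e-8)", 1e-8)]:
    print(f"{label}: P(adventitious match) = {p_adventitious(p, database_size):.4f}")
```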
Abstract:
INTRODUCTION: We examined the positive and negative subjective feelings associated with initial tobacco and cannabis use, as well as the role of these experiences in regular use. Additionally, we investigated the effect of the first substance experienced on initial subjective experiences and later regular use. METHODS: Baseline data from a representative sample of young Swiss men were obtained from an ongoing Cohort Study on Substance Use Risk Factors, which includes 2,321 lifetime tobacco and cannabis users. We assessed the age of first tobacco and cannabis use, along with the subjective experiences associated with initial use. Additionally, subjective experiences related to regular use of both substances were analyzed. RESULTS: The initial subjective experiences were divided into positive and negative for each substance, and we found that the feelings associated with first use of tobacco and cannabis were similar. Moreover, participants who used cannabis before tobacco reported fewer negative experiences associated with first tobacco use, whereas participants who initially used tobacco reported more negative experiences related to first cannabis use. We also found that positive experiences encouraged regular use and that negative experiences were a stronger deterrent to regular use of cannabis than of tobacco. CONCLUSIONS: Taken together, these results indicate that similar subjective experiences were associated with the first use of tobacco and cannabis. In addition, the use of cannabis before tobacco, which occurred in only a minority of users, had the potential to enhance the effects of initial tobacco use.
Abstract:
This paper presents a short history of the appraisal of laser scanner technologies in the geosciences, used for imaging relief with high-resolution digital elevation models (HRDEMs) or 3D models. A general overview of light detection and ranging (LIDAR) techniques applied to landslides is given, followed by a review of different applications of LIDAR to landslides, rockfalls and debris flows. These applications are classified as: (1) detection and characterization of mass movements; (2) hazard assessment and susceptibility mapping; (3) modelling; (4) monitoring. This review emphasizes how LIDAR-derived HRDEMs can be used to investigate any type of landslide. It is clear that such HRDEMs are not yet a common tool in landslide investigations, but this technique has opened new domains of application that still have to be developed.
Abstract:
Introduction/objectives: Multipatient use of a single-patient CBSD occurred in an outpatient clinic during 4 to 16 months before its notification. We looked for transmission of blood-borne pathogens among exposed patients. Methods: Exposed patients underwent serology testing for HBV, HCV and HIV. Patients with isolated anti-HBc received one dose of hepatitis B vaccine to look for a memory immune response. Possible transmissions were investigated by mapping visits and sequencing of the viral genome if needed. Results: Of 280 exposed patients, 9 had died without suspicion of blood-borne infection, 3 could not be tested, and 5 declined investigations. Among the 263 (93%) tested patients, 218 (83%) had negative results. We confirmed a known history of HCV infection in 6 patients (1 coinfected by HIV), and also identified resolved HBV infection in 37 patients, of whom 18 were already known. 2 patients were found to have a previously unknown HCV infection. According to the time elapsed from the closest previous visit of an HCV-infected potential source patient, we could rule out nosocomial transmission in one case (14 weeks) but not in the other (1 day). In the latter, however, transmission was deemed very unlikely by 2 reference centers based on the sequences of the E1 and HVR1 regions of the virus. Conclusion: We did not identify any transmission of blood-borne pathogens in 263 patients exposed to a single-patient CBSD, despite the presence of potential source cases. Change of needle and disinfection of the device between patients may have contributed to this outcome. Although we cannot exclude transmission of HBV, previous acquisition in endemic countries is a more likely explanation in this multi-national population.
Abstract:
Critically ill patients depend on artificial nutrition for the maintenance of their metabolic functions and lean body mass, as well as for limiting underfeeding-related complications. Current guidelines recommend enteral nutrition (EN), possibly within the first 48 hours, as the best way to provide nutrients and prevent infections. EN may be difficult to realize or may be contraindicated in some patients, such as those with problems of anatomic intestinal continuity or splanchnic ischemia. A series of contradictory trials regarding the best route and timing of feeding has left the medical community with great uncertainty about the place of parenteral nutrition (PN) in critically ill patients. Many of the deleterious effects attributed to PN result from inadequate indications or from overfeeding. The latter is due firstly to the easier delivery of nutrients by PN compared with EN, which increases the risk of overfeeding, and secondly to the use of approximate energy targets, generally based on predictive equations: these equations are static and inaccurate in about 70% of patients. Such high uncertainty about requirements compromises attempts at conducting nutrition trials without indirect calorimetry support, because the results cannot be trusted; indeed, underfeeding and overfeeding are equally deleterious. An individualized therapy is required. A pragmatic approach to feeding is proposed: first, attempt EN whenever and as early as possible; then, use indirect calorimetry if available, and monitor delivery and response to feeding; finally, consider combining EN with PN in case of insufficient EN from day 4 onwards.
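As an illustration of the static predictive equations criticized here, the sketch below implements the Harris-Benedict estimate of basal energy expenditure; the abstract does not name a specific equation, so Harris-Benedict is shown only as a representative, widely used example.

```python
# Harris-Benedict basal energy expenditure (kcal/day), a classic example of a
# static predictive equation: the same inputs always yield the same target,
# regardless of the patient's actual, dynamic metabolic state.

def harris_benedict(sex: str, weight_kg: float, height_cm: float, age_y: float) -> float:
    if sex == "male":
        return 66.47 + 13.75 * weight_kg + 5.00 * height_cm - 6.76 * age_y
    return 655.10 + 9.56 * weight_kg + 1.85 * height_cm - 4.68 * age_y

# A hypothetical 60-year-old, 80 kg, 175 cm male ICU patient:
print(round(harris_benedict("male", 80, 175, 60)))  # ~1636 kcal/day, fixed whatever his state
```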
Abstract:
Animal models of infective endocarditis (IE) induced by high-grade bacteremia revealed the pathogenic roles of Staphylococcus aureus surface adhesins and platelet aggregation in the infection process. In humans, however, S. aureus IE possibly occurs through repeated bouts of low-grade bacteremia from a colonized site or intravenous device. Here we used a rat model of IE induced by continuous low-grade bacteremia to explore further the contributions of S. aureus virulence factors to the initiation of IE. Rats with aortic vegetations were inoculated by continuous intravenous infusion (0.0017 ml/min over 10 h) with 10^6 CFU of Lactococcus lactis pIL253 or a recombinant L. lactis strain expressing an individual S. aureus surface protein (ClfA, FnbpA, BCD, or SdrE) conferring a particular adhesive or platelet aggregation property. Vegetation infection was assessed 24 h later. Plasma was collected at 0, 2, and 6 h postinoculation to quantify the expression of tumor necrosis factor (TNF), interleukin 1α (IL-1α), IL-1β, IL-6, and IL-10. The percentage of vegetation infection relative to that with strain pIL253 (11%) increased when binding to fibrinogen was conferred on L. lactis (ClfA strain) (52%; P = 0.007) and increased further with adhesion to fibronectin (FnbpA strain) (75%; P < 0.001). Expression of fibronectin binding alone was not sufficient to induce IE (BCD strain) (10% infection). Platelet aggregation increased the risk of vegetation infection (SdrE strain) (30%). Conferring adhesion to fibrinogen and fibronectin favored IL-1β and IL-6 production. Our results, with a model of IE induced by low-grade bacteremia resembling human disease, extend the essential role of fibrinogen binding in the initiation of S. aureus IE. Triggering of platelet aggregation or an inflammatory response may contribute to or promote the development of IE.
Abstract:
INTRODUCTION: Timely diagnosis of invasive candidiasis (IC) remains difficult, as the clinical presentation is not specific and blood cultures lack sensitivity and need a long incubation time. Thus, non-culture-based methods for diagnosing IC have been developed. Mannan antigen (Mn) and anti-mannan antibodies (A-Mn) are present in patients with IC. On behalf of the Third European Conference on Infections in Leukemia, the performance of these tests was analysed and reviewed. METHODS: The literature was searched for studies using the commercially available sandwich enzyme-linked immunosorbent assays (Platelia™, Bio-Rad Laboratories, Marnes-la-Coquette, France) for detecting Mn and A-Mn in serum. The target condition of this review was IC defined according to 2008 European Organization for Research and Treatment of Cancer/Mycoses Study Group criteria. Sensitivity, specificity and diagnostic odds ratios (DOR) were calculated for Mn, A-Mn and combined Mn/A-Mn testing. RESULTS: Overall, 14 studies that comprised 453 patients and 767 controls were reviewed. The patient populations included in the studies were mainly haematological and cancer cases in seven studies and mainly intensive care unit and surgery cases in the other seven studies. All studies but one were retrospective in design. Mn sensitivity was 58% (95% confidence interval [CI], 53-62); specificity, 93% (95% CI, 91-94) and DOR, 18 (95% CI 12-28). A-Mn sensitivity was 59% (95% CI, 54-65); specificity, 83% (95% CI, 79-97) and DOR, 12 (95% CI 7-21). Combined Mn/A-Mn sensitivity was 83% (95% CI, 79-87); specificity, 86% (95% CI, 82-90) and DOR, 58 (95% CI 27-122). Significant heterogeneity of the studies was detected. The sensitivity of both Mn and A-Mn varied for different Candida species, and it was highest for C. albicans, followed by C. glabrata and C. tropicalis. In 73% of 45 patients with candidemia, at least one of the serological tests was positive before the culture results, with a mean time advantage of 6 days for Mn and 7 days for A-Mn. In 21 patients with hepatosplenic IC, 18 (86%) had Mn or A-Mn positive test results at a median of 16 days before radiological detection of liver or spleen lesions. CONCLUSIONS: Mn and A-Mn are useful for diagnosis of IC. The performance of combined Mn/A-Mn testing is superior to either Mn or A-Mn testing alone.
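For readers unfamiliar with the diagnostic odds ratio, the standard definition can be checked against the pooled point estimates above; note that the review's pooled DORs are meta-analytic estimates and need not coincide exactly with this identity (the Mn figures happen to agree closely).

```python
# Diagnostic odds ratio from sensitivity and specificity:
#   DOR = (sens / (1 - sens)) / ((1 - spec) / spec)
# i.e. the odds of a positive test in the diseased divided by the odds of a
# positive test in the non-diseased.

def dor(sens: float, spec: float) -> float:
    return (sens / (1.0 - sens)) / ((1.0 - spec) / spec)

print(round(dor(0.58, 0.93)))  # ~18, matching the reported pooled Mn estimate
```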
Abstract:
Context: Understanding the process through which adolescents and young adults first try legal and illegal substances is crucial for the development of tailored prevention and treatment programs. However, patterns of first substance use can be very complex when multiple substances are considered, requiring reduction into a small number of meaningful categories. Data: We used data from a survey on adolescent and young adult health conducted in 2002 in Switzerland. Answers from 2212 subjects aged 19 and 20 were included. The first consumption ever of 10 substances (tobacco, cannabis, medicine to get high, sniff (volatile substances and inhalants), ecstasy, GHB, LSD, cocaine, methadone, and heroin) was considered, for a grand total of 516 different patterns. Methods: In a first step, automatic clustering was used to decrease the number of patterns to 50. Then, two groups of substance use experts, one of three social field workers and one of three toxicologists and health professionals, were asked to reduce them to a maximum of 10 meaningful categories. Results: The classifications obtained through our methodology are of practical interest, revealing associations invisible to purely automatic algorithms. The article includes a detailed analysis of both final classifications and a discussion of the advantages and limitations of our approach.
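A minimal sketch of the automatic first step is given below; the abstract does not name the clustering algorithm, so agglomerative clustering with a Jaccard distance over synthetic binary first-use patterns is shown purely as one plausible realization.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Reduce distinct binary first-use patterns (one column per substance) to 50
# clusters for expert review. Data are synthetic; the paper's actual
# clustering algorithm and distance are not specified in the abstract.
rng = np.random.default_rng(42)
subjects = rng.integers(0, 2, size=(2212, 10)).astype(bool)  # subjects x 10 substances
patterns = np.unique(subjects, axis=0)                       # distinct use patterns

dist = pdist(patterns, metric="jaccard")             # pairwise pattern distances
tree = linkage(dist, method="average")               # agglomerative clustering
labels = fcluster(tree, t=50, criterion="maxclust")  # cut the tree into 50 groups

print(patterns.shape[0], "patterns ->", len(np.unique(labels)), "candidate categories")
```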
Abstract:
Point-of-care (POC) tests offer potentially substantial benefits for the management of infectious diseases, mainly by shortening the time to result and by making the test available at the bedside or at remote care centres. Commercial POC tests are already widely available for the diagnosis of bacterial and viral infections and for parasitic diseases, including malaria. Infectious diseases specialists and clinical microbiologists should be aware of the indications and limitations of each rapid test, so that they can use them appropriately and correctly interpret their results. The clinical applications and performance of the most relevant and commonly used POC tests are reviewed. Some of these tests exhibit insufficient sensitivity, and should therefore be coupled to confirmatory tests when the results are negative (e.g. Streptococcus pyogenes rapid antigen detection test), whereas the results of others need to be confirmed when positive (e.g. malaria). New molecular-based tests exhibit better sensitivity and specificity than former immunochromatographic assays (e.g. Streptococcus agalactiae detection). In the coming years, further evolution of POC tests may lead to new diagnostic approaches, such as panel testing, targeting not just a single pathogen, but all possible agents suspected in a specific clinical setting. To reach this goal, the development of serology-based and/or molecular-based microarrays/multiplexed tests will be needed. The availability of modern technology and new microfluidic devices will provide clinical microbiologists with the opportunity to be back at the bedside, proposing a large variety of POC tests that will allow quicker diagnosis and improved patient care.
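The recommendation to confirm negative results of low-sensitivity tests follows directly from Bayes' theorem; the sketch below works through a hypothetical example (all figures are illustrative assumptions, not values from the review).

```python
# Why negative rapid-test results may need culture confirmation: with
# imperfect sensitivity, the post-test probability of disease after a
# negative result can remain clinically relevant.

def post_test_negative(prev: float, sens: float, spec: float) -> float:
    """P(disease | negative test), by Bayes' theorem."""
    fn = prev * (1.0 - sens)        # false negatives
    tn = (1.0 - prev) * spec        # true negatives
    return fn / (fn + tn)

# Hypothetical S. pyogenes rapid antigen test in a high-prevalence setting
# (assumed prevalence 30%, sensitivity 85%, specificity 95%):
print(f"{post_test_negative(prev=0.30, sens=0.85, spec=0.95):.3f}")
# ~0.063: about 6% of negatives would still be infected -> confirm by culture.
```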
Abstract:
The level of information provided by ink evidence to the criminal and civil justice system is limited. The limitations arise from the weakness of the interpretative framework currently used, as proposed in the ASTM standards 1422-05 and 1789-04 on ink analysis. It is proposed to use the likelihood ratio from Bayes' theorem to interpret ink evidence. Unfortunately, when considering the analytical practices defined in the ASTM standards on ink analysis, it appears that current ink analytical practices do not allow for the level of reproducibility and accuracy required by a probabilistic framework. Such a framework relies on the evaluation of the statistics of ink characteristics using an ink reference database and on the objective measurement of similarities between ink samples. A complete research programme was designed to (a) develop a standard methodology for analysing ink samples in a more reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in a forensic context. This report focuses on the first of these three stages. A calibration process, based on a standard dye ladder, is proposed to improve the reproducibility of ink analysis by HPTLC when inks are analysed at different times and/or by different examiners. The impact of this process on the variability between repetitive analyses of ink samples under various conditions is studied. The results show significant improvements in the reproducibility of ink analysis compared with traditional calibration methods.
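One common way to operationalize the likelihood-ratio framework advocated here is a score-based LR, with the score densities estimated from a reference database; the sketch below is such a realization on synthetic similarity scores, not the authors' method.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Score-based likelihood ratio: LR = f(score | same ink) / f(score | different inks).
# The abstract calls for a Bayesian likelihood-ratio framework but does not
# prescribe this estimator; kernel density estimation over similarity scores
# from a reference ink database is one common realization. Scores are synthetic.
rng = np.random.default_rng(1)
same_ink_scores = rng.normal(0.90, 0.05, 500)   # within-source similarities
diff_ink_scores = rng.normal(0.60, 0.15, 5000)  # between-source similarities

f_same = gaussian_kde(same_ink_scores)
f_diff = gaussian_kde(diff_ink_scores)

questioned_score = 0.88  # similarity between questioned and reference ink
lr = f_same(questioned_score)[0] / f_diff(questioned_score)[0]
print(f"LR = {lr:.1f}")  # LR > 1 supports the same-source proposition
```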