Abstract:
The research reported in this series of articles aimed (1) to automate the searching of questioned ink specimens in ink reference collections and (2) to evaluate the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples are analysed in an accurate and reproducible way and compared in an objective and automated way; the latter requirement stems from the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited to different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, i.e. high-performance thin layer chromatography, despite its reputation for lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model.
It is therefore possible to move away from the traditional subjective approach, which is based entirely on experts' opinions and is usually not very informative. While there is room for improvement, this report demonstrates the significant gains over the traditional subjective approach for the searching of ink specimens in ink databases and the interpretation of their evidential value.
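The probabilistic assignment of evidential value described above is usually framed as a likelihood ratio. The sketch below is a minimal illustration, not the authors' actual model: it compares the probability of an observed ink-similarity score under the same-source and different-source propositions, assuming hypothetical Gaussian score distributions.

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of a normal distribution at x."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def likelihood_ratio(score, same_mu, same_sigma, diff_mu, diff_sigma):
    """LR = P(score | same ink source) / P(score | different ink sources)."""
    return gaussian_pdf(score, same_mu, same_sigma) / gaussian_pdf(score, diff_mu, diff_sigma)

# Hypothetical distributions of a similarity score in [0, 1]:
# same-ink comparisons cluster near 0.9, different-ink comparisons near 0.5.
lr = likelihood_ratio(0.85, same_mu=0.9, same_sigma=0.05, diff_mu=0.5, diff_sigma=0.15)
```

A ratio well above 1 supports the proposition that the questioned and reference inks share a common source; the distributions here are invented for illustration, whereas in practice they would be estimated from large numbers of database comparisons.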
Abstract:
Purpose: Cross-sectional imaging techniques have opened new perspectives in forensic medicine. The involvement of radiographers and the training of "forensic radiographers" improve the quality of radiological examinations and facilitate the implementation of techniques such as sample collection and post-mortem angiography. Methods and Materials: Over a period of three months, five radiographers with clinical experience underwent special training to learn procedures dedicated to forensic imaging. These procedures involved: (I) acquisition of MDCT data; (II) sample collection for toxicological or histological analyses by performing CT-guided biopsies and liquid sampling; (III) post-mortem angiography; and (IV) post-processing of all acquired data. To perform post-mortem angiography, the radiographers were in charge of preparing the perfusion device and the investigated body. To this end, cannulas were inserted into the femoral vessels and connected to the machine. For angiography, the radiographers had to synchronize the perfusion with the CT acquisitions. Results: All five radiographers acquired the new skills required to become "forensic radiographers". They were able to perform post-mortem MDCT, sample collection, post-mortem angiography and post-processing of the acquired data by themselves. Most problems were observed in the preparation of the body for post-mortem angiography. Conclusion: Our experience shows that radiographers are able to perform high-quality examinations after a short period of training. Their collaboration is well accepted by the forensic team, and given the increasing number of radiological examinations in forensic departments, it would make little sense to exclude radiographers from the forensic-radiological team.
Abstract:
Objectives: To determine the prevalence of dementia and the proportion of undiagnosed dementia in elderly patients admitted to postacute care, and to identify patient characteristics associated with undiagnosed dementia. Design: Cross-sectional study. Setting: Academic postacute rehabilitation facility in Lausanne, Switzerland. Participants: Patients (N = 1764) aged 70 years and older. Measurements: Data on socio-demographic, medical, functional, and affective status were collected upon admission. Data on cognitive performance (Mini-Mental State Exam [MMSE]) and cognition-related discharge diagnoses were abstracted through a structured review of discharge summaries. Results: Overall, 24.1% (425/1764) of patients had a diagnosis of dementia, most frequently secondary to Alzheimer's disease (260/425, 61.2%). Among dementia cases, 70.8% (301/425) were newly diagnosed during the postacute stay. This proportion was lower among patients referred from internal medicine than from orthopedic/surgery services (65.8% versus 74.8%, P = .042). Compared to patients with already diagnosed dementia, those newly diagnosed were older, lived alone more frequently, and had better functional status and MMSE scores at admission (all P < .05). In multivariate analysis, previously undetected dementia remained associated with older age (OR = 2.4 for age 85 years and older, 95% CI 1.5-4.0, P = .001) and normal MMSE at admission (OR = 5.9, 95% CI 2.7-12.7, P < .001). Conclusion: Dementia was present in almost a fourth of elderly patients referred to postacute care, but was diagnosed in less than a third of them before admission. The oldest old patients appear especially at risk for underrecognition. These results emphasize the high diagnostic yield of systematic cognitive assessment in the postacute care setting to improve these patients' management and quality of life.
Abstract:
OBJECTIVES: This study aimed to characterize myocardial infarction after percutaneous coronary intervention (PCI) based on cardiac marker elevation, as recommended by the new universal definition, and on the detection of late gadolinium enhancement (LGE) by cardiovascular magnetic resonance (CMR). It also assessed whether baseline inflammatory biomarkers are higher in patients developing myocardial injury. BACKGROUND: Cardiovascular magnetic resonance accurately assesses infarct size. Baseline C-reactive protein (CRP) and neopterin predict prognosis after stent implantation. METHODS: Consecutive patients with baseline troponin (Tn) I within normal limits and no LGE in the target vessel underwent baseline and post-PCI CMR. Tn-I was measured until 24 h after PCI. Serum high-sensitivity CRP and neopterin were assessed before coronary angiography. RESULTS: Of 45 patients, aged 64 (53 to 72) years, 33% developed LGE with an infarct size of 0.83 g (interquartile range: 0.32 to 1.30 g). A Tn-I elevation above the 99th percentile upper reference limit (i.e., myocardial necrosis) (median Tn-I: 0.51 μg/l, interquartile range: 0.16 to 1.23) and a Tn-I elevation > 3× the upper reference limit (i.e., type 4a myocardial infarction [MI]) occurred in 58% and 47% of patients, respectively. LGE was undetectable in 42% and 43% of patients with periprocedural myocardial necrosis and type 4a MI, respectively. Agreement between LGE and type 4a MI was moderate (kappa = 0.45). The levels of CRP and neopterin did not significantly differ between patients with or without myocardial injury, whether detected by CMR or according to the new definition (p = NS). CONCLUSIONS: This study reports the lack of substantial agreement between the new universal definition and CMR for the diagnosis of small-size periprocedural myocardial damage after complex PCI. Baseline levels of CRP and neopterin were not predictive of the development of periprocedural myocardial damage.
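The moderate agreement reported above (kappa = 0.45) refers to Cohen's kappa, which corrects the observed agreement between two binary classifications (here, LGE-positive versus type 4a MI-positive) for agreement expected by chance. A minimal sketch of the computation from a 2×2 contingency table follows; the counts used are hypothetical, not the study's data.

```python
def cohens_kappa(a_pos_b_pos, a_pos_b_neg, a_neg_b_pos, a_neg_b_neg):
    """Cohen's kappa for agreement between two binary raters/classifications."""
    n = a_pos_b_pos + a_pos_b_neg + a_neg_b_pos + a_neg_b_neg
    p_observed = (a_pos_b_pos + a_neg_b_neg) / n          # raw agreement
    p_a = (a_pos_b_pos + a_pos_b_neg) / n                 # marginal of rater A
    p_b = (a_pos_b_pos + a_neg_b_pos) / n                 # marginal of rater B
    p_expected = p_a * p_b + (1 - p_a) * (1 - p_b)        # chance agreement
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical counts: both positive, A-only, B-only, both negative.
kappa = cohens_kappa(10, 3, 4, 28)
```

Kappa equals 1 for perfect agreement and 0 when agreement is no better than chance, which is why values near 0.45 are conventionally read as only moderate agreement.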
Abstract:
3 Summary
3.1 English
The pharmaceutical industry has faced several challenges in recent years, and the optimization of its drug discovery pipeline is believed to be the only viable solution. High-throughput techniques contribute actively to this optimization, especially when complemented by computational approaches that aim to rationalize the enormous amount of information they can produce. In silico techniques, such as virtual screening and rational drug design, are now routinely used to guide drug discovery. Both rely heavily on the prediction of the molecular interaction (docking) occurring between drug-like molecules and a therapeutically relevant target. Several software packages are available to this end, but despite the very promising picture drawn in most benchmarks, they still hold several hidden weaknesses. As pointed out in several recent reviews, the docking problem is far from solved, and there is now a need for methods able to identify binding modes with high accuracy, which is essential to reliably compute the binding free energy of the ligand. This quantity is directly linked to the ligand's affinity and can be related to its biological activity. Accurate docking algorithms are thus critical for both the discovery and the rational optimization of new drugs. In this thesis, a new docking program aiming at this goal is presented: EADock. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with a sophisticated management of diversity. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins.
The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure, and, in contrast to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root mean square deviation (RMSD) from the crystal structure. This validation illustrates the efficiency of our sampling heuristic, as correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and to 92% when all clusters present in the last generation are taken into account. Most failures in this benchmark could be explained by the presence of crystal contacts in the experimental structure. EADock has been used to understand molecular interactions involved in the regulation of the Na,K-ATPase and in the activation of the nuclear hormone peroxisome proliferator-activated receptor α (PPARα). It also helped to understand the action of common pollutants (phthalates) on PPARγ, and the impact of biotransformations of the anticancer drug Imatinib (Gleevec®) on its binding mode to the Bcr-Abl tyrosine kinase. Finally, a fragment-based rational drug design approach using EADock was developed, which led to the successful design of new peptidic ligands for the α5β1 integrin and for human PPARα. In both cases, the designed peptides presented activities comparable to those of well-established ligands such as the anticancer drug Cilengitide and Wy14,643, respectively.
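The success criterion used throughout this validation, an RMSD below 2 Å between the predicted and crystallographic binding modes, can be sketched as follows. This is a minimal illustration with made-up coordinates, assuming both poses are already expressed in the same receptor reference frame (so no superposition step is needed).

```python
import math

def rmsd(coords_a, coords_b):
    """Root mean square deviation between two equally sized lists of (x, y, z) points."""
    assert len(coords_a) == len(coords_b)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(sq / len(coords_a))

# Hypothetical ligand atom positions (Å) in the crystal and in a docked pose.
crystal = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (1.5, 1.5, 0.0)]
docked  = [(0.1, 0.0, 0.0), (1.4, 0.1, 0.0), (1.6, 1.4, 0.0)]
success = rmsd(crystal, docked) < 2.0  # the 2 Å success criterion
```

In the benchmark described above, a complex counts toward the 68% figure when the best-ranked cluster satisfies this criterion against the crystal structure.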
Abstract:
Although prosthetic joint infection (PJI) is a rare event after arthroplasty, it represents a significant complication that is associated with high morbidity, the need for complex treatment, and substantial healthcare costs. An accurate and rapid diagnosis of PJI is crucial for treatment success. Current diagnostic methods in PJI are insufficient, with 10-30% false-negative cultures. Consequently, there is a need for research and development into new methods aimed at improving diagnostic accuracy and speed of detection. In this article, we review available conventional diagnostic methods for the diagnosis of PJI (laboratory markers, histopathology, synovial fluid and periprosthetic tissue cultures), new diagnostic methods (sonication of implants, specific and multiplex PCR, mass spectrometry) and innovative techniques under development (new laboratory markers, microcalorimetry, electrical methods, reverse transcription [RT]-PCR, fluorescence in situ hybridization [FISH], biofilm microscopy, microarray identification, and serological tests). The results of highly sensitive diagnostic techniques with unknown specificity should be interpreted with caution. The organism identified by a new method may represent a real pathogen that went unrecognized by conventional diagnostic methods, or contamination during specimen sampling, transportation, or processing. For accurate interpretation, additional studies are needed that evaluate the long-term outcome (usually >2 years) with or without antimicrobial treatment. It is expected that new rapid, accurate, and fully automatic diagnostic tests will be developed soon.
Abstract:
Access to new biological sources is a key element of natural product research. A particularly large number of biologically active molecules have been found to originate from microorganisms. Very recently, the use of fungal co-culture to activate the silent genes involved in metabolite biosynthesis was found to be a successful method for the induction of new compounds. However, the detection and identification of the induced metabolites in the confrontation zone where fungi interact remain very challenging. To tackle this issue, a high-throughput UHPLC-TOF-MS-based metabolomic approach has been developed for the screening of fungal co-cultures in solid media at the Petri dish level. The metabolites that were overexpressed because of fungal interactions were highlighted by comparing the LC-MS data obtained from the co-cultures and their corresponding mono-cultures. This comparison was achieved by subjecting automatically generated peak lists to statistical treatments. This strategy has been applied to more than 600 co-culture experiments that mainly involved fungal strains from the genus Fusarium, although experiments were also completed with a selection of several other filamentous fungi. The strategy was found to provide satisfactory repeatability and was used to detect the biomarkers of fungal induction in a large panel of filamentous fungi. This study demonstrates that co-culture results in consistent induction of potentially new metabolites.
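The core idea of comparing automatically generated peak lists can be illustrated with a toy differential-intensity filter. This is only a sketch: the actual workflow relied on statistical treatment of UHPLC-TOF-MS peak lists, and the feature names, intensities and fold-change cutoff below are hypothetical.

```python
def induced_peaks(coculture, monocultures, fold=5.0):
    """Flag m/z features whose co-culture intensity exceeds `fold` times the
    highest intensity observed for that feature in any mono-culture."""
    flagged = []
    for mz, intensity in coculture.items():
        baseline = max(m.get(mz, 0.0) for m in monocultures)
        # The floor of 1.0 avoids flagging noise when a feature is absent everywhere.
        if intensity > fold * max(baseline, 1.0):
            flagged.append(mz)
    return flagged

# Hypothetical peak lists: feature "415.2" appears strongly induced in co-culture.
co_culture = {"301.1": 50.0, "415.2": 900.0}
mono_cultures = [{"301.1": 40.0}, {"301.1": 45.0, "415.2": 10.0}]
candidates = induced_peaks(co_culture, mono_cultures)
```

A feature present at high intensity in the co-culture but near-absent in every mono-culture is a candidate induced metabolite from the confrontation zone, to be confirmed by further statistical and structural analysis.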
Abstract:
The screening of testosterone (T) misuse for doping control is based on the urinary steroid profile, including T, its precursors and its metabolites. Modifications of individual levels and of the ratios between those metabolites are indicators of T misuse. In the context of screening analysis, the most discriminant criterion known to date is the ratio of T glucuronide (TG) to epitestosterone glucuronide (EG) (TG/EG). Following the World Anti-Doping Agency (WADA) recommendations, there is suspicion of T misuse when the ratio reaches 4 or more. While this marker remains very sensitive and specific, it suffers from large inter-individual variability, with an important influence of enzyme polymorphisms. Moreover, the use of low doses or topical administration forms makes the screening of endogenous steroids difficult, as the detection window no longer matches typical doping practices. As reference limits are estimated on the basis of population studies, which encompass inter-individual and inter-ethnic variability, new strategies including individual threshold monitoring and alternative biomarkers have been proposed to detect T misuse. The purpose of this study was to evaluate the potential of ultra-high pressure liquid chromatography (UHPLC) coupled with a new-generation high-resolution quadrupole time-of-flight mass spectrometer (QTOF-MS) to investigate steroid metabolism after transdermal and oral T administration. An approach was developed to quantify 12 targeted urinary steroids as direct glucuro- and sulfo-conjugated metabolites, allowing the conservation of phase II metabolism information reflecting genetic and environmental influences. The UHPLC-QTOF-MS(E) platform was applied to clinical study samples from 19 healthy male volunteers with different genotypes for the UGT2B17 enzyme, which is responsible for the glucuroconjugation of T.
Based on reference population ranges, none of the traditional markers of T misuse could detect doping after topical administration of T, while the detection window was short after oral testosterone undecanoate (TU) ingestion. The detection ability of the 12 targeted steroids was thus evaluated using individual thresholds following both transdermal and oral administration. Other relevant biomarkers and minor metabolites were studied for complementary information to the steroid profile, including sulfoconjugated analytes and hydroxy forms of glucuroconjugated metabolites. While sulfoconjugated steroids may provide helpful screening information for individuals with a homozygous UGT2B17 deletion, hydroxy-glucuroconjugated analytes could extend the detection window of oral TU doping.
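The two screening strategies discussed here, the WADA population threshold of TG/EG ≥ 4 and subject-based individual thresholds, can be sketched as follows. The individual-threshold rule below (mean plus three standard deviations of an athlete's own baseline ratios) is a simplified illustration of the longitudinal idea, not the adaptive Bayesian model actually used in steroid-profile monitoring.

```python
def flag_t_misuse(tg, eg, threshold=4.0):
    """Population screening: suspicion of T misuse when TG/EG reaches the threshold."""
    if eg == 0:
        return True  # an undetectable EG concentration is itself flagged for follow-up
    return tg / eg >= threshold

def individual_threshold(baseline_ratios, k=3.0):
    """Illustrative subject-based threshold: mean + k standard deviations of the
    athlete's own baseline TG/EG values (sample standard deviation)."""
    n = len(baseline_ratios)
    mean = sum(baseline_ratios) / n
    var = sum((r - mean) ** 2 for r in baseline_ratios) / (n - 1)
    return mean + k * var ** 0.5

# Hypothetical baseline profile of one athlete with naturally low TG/EG.
thr = individual_threshold([1.0, 1.1, 0.9, 1.0])
```

For an athlete whose baseline ratios sit near 1, the individual threshold (about 1.2 here) is far stricter than the population criterion of 4, which is the motivation for individual monitoring in subjects such as UGT2B17 deletion carriers.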
Abstract:
Three standard radiation qualities (RQA 3, RQA 5 and RQA 9) and two screens, Kodak Lanex Regular and Insight Skeletal, were used to compare the imaging performance and dose requirements of the new Kodak Hyper Speed G and the current Kodak T-MAT G/RA medical x-ray films. The noise equivalent quanta (NEQ) and detective quantum efficiencies (DQE) of the four screen-film combinations were measured at three gross optical densities and compared with the characteristics of the Kodak CR 9000 system with GP (general purpose) and HR (high resolution) phosphor plates. The new Hyper Speed G film has double the intrinsic sensitivity of the T-MAT G/RA film and a higher contrast in the high optical density range for comparable exposure latitude. By providing both high sensitivity and high spatial resolution, the new film significantly improves the compromise between dose and image quality. As expected, the new film has a higher noise level and a lower signal-to-noise ratio than the standard film, although in the high frequency range this is compensated for by a better resolution, giving better DQE results, especially at high optical density. Both screen-film systems outperform the phosphor plates in terms of MTF and DQE for standard imaging conditions (Regular screen at RQA 5 and RQA 9 beam qualities). At low energy (RQA 3), the CR system has a low-frequency DQE comparable to that of the screen-film systems when used with a fine screen at low and middle optical densities, and a superior low-frequency DQE at high optical density.
Abstract:
The determination of sediment storage is a critical parameter in sediment budget analyses. However, in many sediment budget studies the quantification of the magnitude and time-scale of sediment storage is still the weakest part and often relies on crude estimations only, especially in large drainage basins (>100 km²). We present a new approach to storage quantification in a meso-scale alpine catchment of the Swiss Alps (Turtmann Valley, 110 km²). The quantification of depositional volumes was performed by combining geophysical surveys and geographic information system (GIS) modelling techniques. Mean thickness values for each landform type calculated from these data were used to estimate the sediment volume in the hanging valleys and on the trough slopes. The sediment volume of the remaining subsystems was determined by modelling an assumed parabolic bedrock surface using digital elevation model (DEM) data. A total sediment volume of 781.3×10⁶ to 1005.7×10⁶ m³ is deposited in the Turtmann Valley. Over 60% of this volume is stored in the 13 hanging valleys. Moraine landforms contain over 60% of the deposits in the hanging valleys, followed by sediment stored on slopes (20%) and rock glaciers (15%). For the first time, a detailed quantification of different storage types was achieved in a catchment of this size. The sediment volumes have been used to calculate mean denudation rates for the different processes, ranging from 0.1 to 2.6 mm/a based on a time span of 10 ka. As the quantification approach includes a number of assumptions and various sources of error, the values given represent the order of magnitude of sediment storage that has to be expected in a catchment of this size.
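The parabolic-bedrock assumption used for the remaining subsystems lends itself to a short worked example. The area enclosed between a horizontal valley floor and a parabolic bedrock cross-section is (2/3) × width × maximum depth, so a segment volume follows directly. The numbers below are purely illustrative; in the study, widths and depths were derived from DEM data and geophysical surveys.

```python
def parabolic_fill_volume(width_m, max_depth_m, length_m):
    """Sediment volume for a valley segment whose bedrock cross-section is assumed
    parabolic: the area between the flat valley floor and the parabola is
    (2/3) * width * max_depth, multiplied by the segment length."""
    cross_section_area = (2.0 / 3.0) * width_m * max_depth_m
    return cross_section_area * length_m

# Illustrative segment: 300 m wide floor, 30 m maximum fill depth, 1 km long.
segment_volume = parabolic_fill_volume(300.0, 30.0, 1000.0)
```

Summing such segment volumes along the valley axis, and dividing total volumes by the contributing area and the post-glacial time span (here 10 ka), is how storage estimates translate into mean denudation rates.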
Abstract:
Mendelian cardiomyopathies and arrhythmias are characterized by important genetic heterogeneity, rendering Sanger sequencing very laborious and expensive. As a proof of concept, we explored multiplex targeted high-throughput sequencing (HTS) as a fast and cost-efficient diagnostic method for individuals suffering from Mendelian cardiac disorders. We designed a DNA capture assay including all exons from 130 genes involved in cardiovascular Mendelian disorders and analysed four samples simultaneously by multiplexing. Two patients had familial hypertrophic cardiomyopathy (HCM) and two patients suffered from long QT syndrome (LQTS). In patient 1 with HCM, we identified two known pathogenic missense variants in the two most frequently mutated sarcomeric genes, MYH7 and MYBPC3. In patient 2 with HCM, a known acceptor splice-site variant in MYBPC3 was found. In patient 3 with LQTS, two missense variants in the genes SCN5A and KCNQ1 were identified. Finally, in patient 4 with LQTS, a known missense variant was found in MYBPC3, a gene usually mutated in patients with cardiomyopathy. Our results showed that multiplex targeted HTS works as an efficient and cost-effective tool for the molecular diagnosis of heterogeneous disorders in clinical practice and offers new insights into the pathogenesis of these complex diseases.
Abstract:
In this paper, mixed spectral-structural kernel machines are proposed for the classification of very high resolution images. The simultaneous use of multispectral and structural features (computed using morphological filters) allows a significant increase in the classification accuracy of remote sensing images. Subsequently, weighted-summation kernel support vector machines are proposed and applied in order to take into account the multiscale nature of the scene considered. Such classifiers use the Mercer property of kernel matrices to compute a new kernel matrix accounting simultaneously for two scale parameters. Tests on a Zurich QuickBird image show the relevance of the proposed method: using the mixed spectral-structural features, the classification accuracy increases by about 5%, achieving a Kappa index of 0.97. The proposed multikernel approach provides an overall accuracy of 98.90% with a related Kappa index of 0.985.
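The weighted-summation idea rests on a simple fact: by the Mercer property, a convex combination of positive semi-definite kernels is itself a valid kernel. The sketch below illustrates this with two Gaussian RBF kernels at different scales; the feature split (four hypothetical spectral bands followed by structural features), the bandwidths and the weight are all invented for illustration, not taken from the paper.

```python
import math

def rbf(x, y, sigma):
    """Gaussian RBF kernel between two feature vectors."""
    sq = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-sq / (2 * sigma ** 2))

def weighted_sum_kernel(x, y, sigma_spectral, sigma_structural, mu=0.5):
    """Convex combination of a spectral and a structural RBF kernel.
    By the Mercer property, mu*K1 + (1-mu)*K2 with 0 <= mu <= 1 is again a
    valid (positive semi-definite) kernel usable in an SVM."""
    n_spec = 4  # hypothetical: first 4 entries are spectral bands, the rest structural
    k_spec = rbf(x[:n_spec], y[:n_spec], sigma_spectral)
    k_struct = rbf(x[n_spec:], y[n_spec:], sigma_structural)
    return mu * k_spec + (1 - mu) * k_struct

# Two hypothetical pixel feature vectors (4 spectral + 2 morphological features).
x = [0.2, 0.4, 0.1, 0.9, 3.0, 1.5]
y = [0.3, 0.4, 0.0, 0.8, 2.0, 1.0]
k_self = weighted_sum_kernel(x, x, 1.0, 2.0)
k_xy = weighted_sum_kernel(x, y, 1.0, 2.0)
```

Because each component kernel has its own bandwidth, the combined kernel accounts simultaneously for two scale parameters, which is the mechanism the abstract describes for handling the multiscale nature of the scene.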
Abstract:
Type 1 diabetes (T1D) is rarely a component of primary immune dysregulation disorders. We report two cases in which T1D was associated with thrombocytopenia. The first patient, a 13-year-old boy, presented with immune thrombocytopenia (ITP), thyroiditis, and, 3 wk later, T1D. Because of severe thrombocytopenia resistant to immunoglobulins, high-dose steroids, and cyclosporine treatment, anti-CD20 (cluster of differentiation 20) therapy was introduced, with consequent normalization of thrombocytes and weaning off of steroids. Three and 5 months after anti-CD20 therapy, levothyroxine and insulin therapy, respectively, were stopped. Ten months after stopping insulin treatment, normal C-peptide and hemoglobin A1c (HbA1c) levels and markedly reduced anti-glutamic acid decarboxylase (GAD) antibodies were measured. A second anti-CD20 trial for relapse of ITP was initiated 2 yr after the first trial. Anti-GAD antibody levels decreased again, but HbA1c stayed elevated and glucose monitoring showed elevated postprandial glycemia, demanding insulin therapy. To our knowledge, this is the first case in which insulin treatment could be interrupted for 28 months after anti-CD20 treatment. In patient two, thrombocytopenia followed a diagnosis of T1D made 6 yr previously. Treatment with anti-CD20 led to normalization of thrombocytes, but no effect on T1D was observed. Concerning the origin of the boys' conditions, several primary immune dysregulation disorders were considered. Thrombocytopenia associated with T1D is unusual and could represent a new entity. The diabetes manifestation in patient one was probably triggered by corticosteroid treatment; regardless, anti-CD20 therapy appeared to be efficacious early in the course of T1D, but not long after the initial diagnosis of T1D, as shown for patient two.
Abstract:
A new fast MR-venography approach using a high-resolution True-FISP imaging sequence was investigated in 20 patients suffering from 23 deep vein thromboses. Diagnosis was proven by x-ray venography, CT or ultrasound examination. The presented technique allowed clear thrombus visualization with high contrast to the surrounding blood pool, even in calf veins. Acquisition time was less than 10 minutes for imaging the pelvis and the legs. No contrast medium was needed. The presented high-resolution True-FISP MR-venography is a promising, non-invasive, fast approach for the detection of deep venous thrombosis.
Abstract:
There has been much concern regarding the role of dietary fructose in the development of metabolic diseases. This concern arises from the continuous increase in fructose (and total added caloric sweetener) consumption in recent decades, and from the increased use of high-fructose corn syrup (HFCS) as a sweetener. A large body of evidence shows that a high-fructose diet leads to the development of obesity, diabetes, and dyslipidemia in rodents. In humans, fructose has long been known to increase plasma triglyceride concentrations. In addition, when ingested in large amounts as part of a hypercaloric diet, it can cause hepatic insulin resistance, increased total and visceral fat mass, and accumulation of ectopic fat in the liver and skeletal muscle. These early effects may be instrumental in causing, in the long run, the development of the metabolic syndrome. There is, however, only limited evidence that fructose per se, when consumed in moderate amounts, has deleterious effects. Several effects of a high-fructose diet in humans can be observed with high-fat or high-glucose diets as well, suggesting that excess caloric intake may be the main factor involved in the development of the metabolic syndrome. The major source of fructose in our diet is sweetened beverages (along with other products to which caloric sweeteners have been added). The progressive replacement of sucrose by HFCS is, however, unlikely to be directly involved in the epidemic of metabolic disease, because HFCS appears to have essentially the same metabolic effects as sucrose. Consumption of sweetened beverages is, however, clearly associated with excess calorie intake and an increased risk of diabetes and cardiovascular diseases through an increase in body weight. This has led to recommendations to limit the daily intake of sugar calories.