199 results for Robust methods
Abstract:
Flow cytometry (FCM) is emerging as an important tool in environmental microbiology. Although FCM applications have to date largely been restricted to certain specialized fields of microbiology, such as the bacterial cell cycle and marine phytoplankton communities, technical advances in instrumentation and methodology are leading to its increased popularity and extending its range of applications. Here we focus on a number of recent FCM developments important for addressing questions in environmental microbiology. These include (i) the study of microbial physiology under environmentally relevant conditions, (ii) new methods to identify active microbial populations and to isolate previously uncultured microorganisms, and (iii) the development of high-throughput autofluorescence bioreporter assays.
Abstract:
Elucidating the molecular and neural basis of complex social behaviors such as communal living, division of labor and warfare requires model organisms that exhibit these multi-faceted behavioral phenotypes. Social insects, such as ants, bees, wasps and termites, are attractive models to address this problem, with rich ecological and ethological foundations. However, their atypical systems of reproduction have hindered application of classical genetic approaches. In this review, we discuss how recent advances in social insect genomics, transcriptomics, and functional manipulations have enhanced our ability to observe and perturb gene expression, physiology and behavior in these species. Such developments begin to provide an integrated view of the molecular and cellular underpinnings of complex social behavior.
Abstract:
Introduction: Oseltamivir phosphate (OP), the prodrug of oseltamivir carboxylate (OC; the active metabolite), has been marketed for 10 years for the treatment of seasonal influenza. It has recently received renewed attention because of the threat of avian flu H5N1 in 2006-7 and the 2009-10 A/H1N1 pandemic. However, relatively few studies have been published on the clinical pharmacokinetics of OP and OC. The disposition of OC and the dosage adaptation of OP in specific populations, such as young children or patients undergoing renal replacement therapy, have also received little attention. An analytical method was thus developed to assess OP and OC plasma concentrations in patients receiving OP who present with comorbidities or require intensive care. Methods: A high-performance liquid chromatography-tandem mass spectrometry (HPLC-MS/MS) method was developed that quantifies OP and OC within 6 min from a 100-µL aliquot of plasma. The extraction procedure combined protein precipitation with acetonitrile, followed by dilution of the supernatant in a suitable buffered solvent. After reverse-phase chromatographic separation, quantification was performed by electrospray ionization-triple quadrupole mass spectrometry. Deuterated isotopic analogues of OP and OC were used as internal standards. Results: The method is sensitive (lower limit of quantification: 5 ng/mL for OP and OC), accurate (intra-/inter-assay bias for OP and OC: 8.5%/5.5% and 3.7%/0.7%, respectively) and precise (intra-/inter-assay CV: 5.2%/6.5% and 6.3%/9.2%, respectively) over the clinically relevant concentration range (upper limit of quantification: 5000 ng/mL). Of importance, and consistent with previous reports, OP was found to be unstable ex vivo in plasma collected on standard anticoagulants (i.e. EDTA, heparin or citrate). This poor stability of OP was circumvented by collecting blood samples in commercial fluoride/oxalate tubes.
Conclusions: This new simple, rapid and robust HPLC-MS/MS assay for quantification of OP and OC plasma concentrations offers an efficient tool for concentration monitoring of OC. OC exposure can probably be controlled with sufficient accuracy by careful dosage adjustment according to patient characteristics (e.g. renal clearance), so the usefulness of systematic therapeutic drug monitoring appears questionable. However, pharmacokinetic studies are still needed to extend this knowledge to particular subgroups of patients or dosage regimens.
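The accuracy (bias) and precision (CV%) figures quoted in this abstract are standard replicate statistics from quality-control samples. As an illustration of how such validation metrics are computed (the QC replicate values below are hypothetical, not data from the study):

```python
# Bias and coefficient of variation (CV%) from replicate QC measurements,
# as used in bioanalytical method validation. All values are hypothetical.

def bias_percent(measured, nominal):
    """Mean relative deviation of measured replicates from the nominal value."""
    mean = sum(measured) / len(measured)
    return 100.0 * (mean - nominal) / nominal

def cv_percent(measured):
    """Coefficient of variation: sample SD as a percentage of the mean."""
    n = len(measured)
    mean = sum(measured) / n
    sd = (sum((x - mean) ** 2 for x in measured) / (n - 1)) ** 0.5
    return 100.0 * sd / mean

# Five hypothetical replicates of a 100 ng/mL QC sample
qc = [104.0, 98.5, 101.2, 99.8, 102.5]
bias = bias_percent(qc, 100.0)  # mean deviation from nominal, in %
cv = cv_percent(qc)             # intra-assay imprecision, in %
```

A method is typically accepted when bias and CV stay within predefined limits (often ±15%) across the calibrated range.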
Abstract:
Many definitions and debates exist about the core characteristics of the social and solidarity economy (SSE) and its actors. Among others, legal form, profit orientation, geographical scope, and size as criteria for identifying SSE actors often reveal dissent among SSE scholars. Instead of using a dichotomous, either-in-or-out definition of SSE actors, this paper presents an assessment tool that takes multiple dimensions into account to offer a more comprehensive and nuanced view of the field. We first define the core dimensions of the assessment tool by synthesizing the multiple indicators found in the literature. We then empirically test these dimensions and their interrelatedness and seek to identify potential clusters of actors. Finally, we discuss the practical implications of our model.
Abstract:
Recently, kernel-based machine learning methods have gained great popularity in many data analysis and data mining fields: pattern recognition, biocomputing, speech and vision, engineering, remote sensing, etc. This paper describes the use of kernel methods for processing large datasets from environmental monitoring networks. Several typical problems of the environmental sciences, and their solutions provided by kernel-based methods, are considered: classification of categorical data (soil type classification), mapping of continuous environmental and pollution information (pollution of soil by radionuclides), and mapping with auxiliary information (climatic data from the Aral Sea region). Promising developments, such as automatic emergency hot-spot detection and monitoring network optimization, are discussed as well.
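The continuous-mapping task mentioned above can be illustrated with kernel ridge regression, one standard kernel-based regressor (chosen here for brevity, not necessarily the exact method used in the paper). The sketch below fits a Gaussian-kernel model to a synthetic "pollution" surface sampled at random monitoring stations:

```python
import math
import random

def rbf_kernel(a, b, gamma=1.0):
    """Gaussian (RBF) kernel between two coordinate vectors."""
    return math.exp(-gamma * sum((x - y) ** 2 for x, y in zip(a, b)))

def solve(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def kernel_ridge_fit(X, y, gamma=1.0, alpha=0.1):
    """Fit kernel ridge regression: solve (K + alpha*I) coef = y."""
    n = len(X)
    K = [[rbf_kernel(X[i], X[j], gamma) + (alpha if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    return solve(K, y)

def kernel_ridge_predict(X_train, coef, x, gamma=1.0):
    """Predict at x as a kernel-weighted sum over the training stations."""
    return sum(c * rbf_kernel(xi, x, gamma) for c, xi in zip(coef, X_train))

# Synthetic smooth "pollution" surface sampled at 40 random stations
random.seed(0)
X = [(random.uniform(0, 1), random.uniform(0, 1)) for _ in range(40)]
y = [math.sin(3 * px) + math.cos(3 * py) for px, py in X]
coef = kernel_ridge_fit(X, y, gamma=5.0, alpha=1e-3)
pred = kernel_ridge_predict(X, coef, (0.5, 0.5), gamma=5.0)
true = math.sin(1.5) + math.cos(1.5)
```

The kernel bandwidth `gamma` and regularization `alpha` are illustrative; in practice they would be tuned by cross-validation against the monitoring data.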
Abstract:
This paper presents 3-D brain tissue classification schemes using three recent promising energy minimization methods for Markov random fields: graph cuts, loopy belief propagation and tree-reweighted message passing. The classification is performed using the well-known finite Gaussian mixture Markov random field model. Results from the above methods are compared with the widely used iterated conditional modes algorithm. The evaluation is performed on a dataset containing simulated T1-weighted MR brain volumes with varying noise and intensity non-uniformities. The comparisons are performed in terms of energies as well as against ground truth segmentations, using various quantitative metrics.
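Of the algorithms compared, iterated conditional modes (ICM) is the simplest to sketch: each pixel is repeatedly reassigned to the label minimizing its local energy under a Gaussian likelihood and a Potts smoothness prior. Below is a minimal pure-Python two-class version on a synthetic 2-D image (a toy stand-in for MR data; the class means, noise level and smoothing weight `beta` are illustrative assumptions):

```python
import random

def icm_segment(img, means, beta=1.0, sweeps=5):
    """Iterated conditional modes on a 2-class Gaussian mixture MRF.
    Local energy: (value - mean[label])^2 / 2 + beta * (# disagreeing 4-neighbours)."""
    h, w = len(img), len(img[0])
    # Initialise with the per-pixel maximum-likelihood label (nearest class mean)
    lab = [[min((0, 1), key=lambda k: (img[r][c] - means[k]) ** 2)
            for c in range(w)] for r in range(h)]
    for _ in range(sweeps):
        for r in range(h):
            for c in range(w):
                best, best_e = lab[r][c], float("inf")
                for k in (0, 1):
                    e = (img[r][c] - means[k]) ** 2 / 2.0
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        rr, cc = r + dr, c + dc
                        if 0 <= rr < h and 0 <= cc < w and lab[rr][cc] != k:
                            e += beta  # Potts penalty for disagreeing neighbour
                    if e < best_e:
                        best, best_e = k, e
                lab[r][c] = best  # greedy in-place update (standard ICM)
    return lab

# Synthetic ground truth: left half class 0 (mean 0), right half class 1 (mean 1)
random.seed(1)
truth = [[0 if c < 8 else 1 for c in range(16)] for _ in range(16)]
noisy = [[truth[r][c] + random.gauss(0, 0.6) for c in range(16)] for r in range(16)]
seg = icm_segment(noisy, means=(0.0, 1.0), beta=0.7)
acc = sum(seg[r][c] == truth[r][c] for r in range(16) for c in range(16)) / 256
```

ICM only finds a local energy minimum, which is exactly why the paper benchmarks it against graph cuts, loopy belief propagation and tree-reweighted message passing.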
Abstract:
Nonlinear regression problems can often be reduced to linearity by transforming the response variable (e.g., using the Box-Cox family of transformations). The classic estimates of the parameter defining the transformation as well as of the regression coefficients are based on the maximum likelihood criterion, assuming homoscedastic normal errors for the transformed response. These estimates are nonrobust in the presence of outliers and can be inconsistent when the errors are nonnormal or heteroscedastic. This article proposes new robust estimates that are consistent and asymptotically normal for any unimodal and homoscedastic error distribution. For this purpose, a robust version of conditional expectation is introduced for which the prediction mean squared error is replaced with an M scale. This concept is then used to develop a nonparametric criterion to estimate the transformation parameter as well as the regression coefficients. A finite sample estimate of this criterion based on a robust version of smearing is also proposed. Monte Carlo experiments show that the new estimates compare favorably with respect to the available competitors.
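For reference, the classic (non-robust) maximum-likelihood estimate of the Box-Cox parameter that the article improves upon can be sketched as a profile-likelihood grid search. The data below are synthetic and the grid resolution is an illustrative choice:

```python
import math
import random

def boxcox(y, lam):
    """Box-Cox transform: (y^lam - 1)/lam, or log(y) as lam -> 0."""
    if abs(lam) < 1e-12:
        return [math.log(v) for v in y]
    return [(v ** lam - 1.0) / lam for v in y]

def profile_loglik(y, lam):
    """Profile log-likelihood of lambda under homoscedastic normal errors."""
    z = boxcox(y, lam)
    n = len(z)
    m = sum(z) / n
    var = sum((v - m) ** 2 for v in z) / n
    # -n/2 * log(sigma^2) plus the Jacobian term (lam - 1) * sum(log y)
    return -0.5 * n * math.log(var) + (lam - 1.0) * sum(math.log(v) for v in y)

def boxcox_mle(y, grid=None):
    """Grid-search maximum-likelihood estimate of lambda (the classic, non-robust estimator)."""
    if grid is None:
        grid = [i / 100.0 for i in range(-200, 201)]
    return max(grid, key=lambda lam: profile_loglik(y, lam))

# Synthetic data with true lambda = 0.5: invert the transform on normal z
random.seed(2)
true_lam = 0.5
z = [random.gauss(5.0, 1.5) for _ in range(500)]
y = [(1.0 + true_lam * v) ** (1.0 / true_lam) for v in z]
lam_hat = boxcox_mle(y)
```

This MLE is exactly the estimator the article shows to be non-robust: a few outliers in `y` can drag `lam_hat` far from the true value, motivating the M-scale-based criterion.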
Abstract:
Aims: A rapid and simple HPLC-MS method was developed for the simultaneous determination of antidementia drugs, including donepezil, galantamine, rivastigmine and its major metabolite NAP 226-90, and memantine, for therapeutic drug monitoring (TDM). In the elderly population treated with antidementia drugs, the presence of several comorbidities, drug interactions resulting from polypharmacy, and variations in drug metabolism and elimination are possible factors leading to the observed high interindividual variability in plasma levels. Although evidence for the benefit of TDM for antidementia drugs still remains to be demonstrated, an individually adapted dosage through TDM might contribute to minimizing the risk of adverse reactions and increasing the probability of an effective therapeutic response. Methods: A solid-phase extraction procedure with a mixed-mode cation exchange sorbent was used to isolate the drugs from 0.5 mL of plasma. The compounds were analyzed on a reverse-phase column with a gradient elution consisting of an ammonium acetate buffer at pH 9.3 and acetonitrile, and detected by mass spectrometry in single ion monitoring mode. Isotope-labeled internal standards were used for quantification where possible. The validated method was used to measure the plasma levels of antidementia drugs in 300 patients treated with these drugs. Results: The method was validated according to international validation standards, including assessment of trueness (-8 to 11%), imprecision (repeatability: 1-5%; intermediate imprecision: 2-9%), selectivity and matrix effect variability (less than 6%). Furthermore, short- and long-term stability of the analytes in plasma was ascertained. The method proved to be robust over the calibrated ranges of 1-300 ng/mL for rivastigmine and memantine and 2-300 ng/mL for donepezil, galantamine and NAP 226-90. We recently published a full description of the method (1).
We found a high interindividual variability in plasma levels of these drugs in a study population of 300 patients. The plasma level measurements, together with some preliminary clinical and pharmacogenetic results, will be presented. Conclusion: A simple LC-MS method was developed for plasma level determination of antidementia drugs and was successfully used in a clinical study with 300 patients.
Abstract:
BACKGROUND: Community-based diabetes screening programs can help sensitize the population and identify new cases. However, the impact of such programs is rarely assessed in high-income countries, where concurrent health information and screening opportunities are commonplace. INTERVENTION AND METHODS: A 2-week screening and awareness campaign was organized as part of a new diabetes program in the canton of Vaud (population 697,000) in Switzerland. Screening was performed without appointment in 190 of the 244 pharmacies in the canton at the subsidized cost of 10 Swiss francs per participant. Screening included questions on risk behaviors, measurement of body mass index, blood pressure, blood cholesterol, random blood glucose (RBG), and A1c if RBG was ≥7.0 mmol/L. A mass media campaign promoting physical activity and a healthy diet was channeled through several media, e.g., 165 spots on radio, billboards in 250 public places, flyers in 360 public transport vehicles, and a dozen articles in several newspapers. A telephone survey of a representative sample of the population of the canton was performed after the campaign to evaluate the program. RESULTS: A total of 4222 participants (0.76% of all persons aged ≥18 years) underwent the screening program (median age: 53 years, 63% female). Among participants not treated for diabetes, 3.7% had RBG ≥7.8 mmol/L and 1.8% had both RBG ≥7.0 mmol/L and A1c ≥6.5%. Untreated blood pressure ≥140/90 mmHg and/or untreated cholesterol ≥5.2 mmol/L were found in 50.5% of participants. One or several treated or untreated modifiable risk factors were found in 78% of participants. The telephone survey showed that 53% of all adults in the canton had been sensitized by the campaign. Excluding fees paid by the participants, the program incurred a cost of CHF 330,600.
CONCLUSION: The community-based screening program had low efficiency for detecting new cases of diabetes, but it identified large numbers of persons with other elevated cardiovascular risk factors. Our findings suggest the convenience of A1c for mass screening of diabetes, the usefulness of extending diabetes screening to other cardiovascular risk factors, and the importance of a robust background communication campaign.
Abstract:
Bacteria are generally difficult specimens to prepare for conventional resin section electron microscopy, and mycobacteria, with their thick and complex cell envelope layers, are especially prone to artefacts. Here we made a systematic comparison of different methods for preparing Mycobacterium smegmatis for thin section electron microscopy analysis. These methods were: (1) conventional preparation with fixatives and epoxy resins at ambient temperature; (2) Tokuyasu cryo-sectioning of chemically fixed bacteria; (3) rapid freezing followed by freeze substitution and embedding in epoxy resin at room temperature; (4) rapid freezing and freeze substitution combined with Lowicryl HM20 embedding and ultraviolet (UV) polymerization at low temperature; and (5) CEMOVIS, or cryo-electron microscopy of vitreous sections. The best preservation of bacteria was obtained with CEMOVIS, as expected, especially with respect to the preservation of the cell envelope and lipid bodies. By comparison with CEMOVIS, both the conventional and Tokuyasu methods produced different, undesirable artefacts. The two freeze-substitution protocols showed variable preservation of the cell envelope but acceptable preservation of the cytoplasm and bacterial DNA, though not of lipid bodies. In conclusion, although CEMOVIS must be considered the 'gold standard' among sectioning methods for electron microscopy, because it avoids solvents and stains, optimally prepared freeze substitution also offers some advantages for ultrastructural analysis of bacteria.
Abstract:
The question of where retroviral DNA becomes integrated in chromosomes is important for (i) understanding the mechanisms of viral growth, (ii) devising new anti-retroviral therapies, (iii) understanding how genomes evolve, and (iv) developing safer methods for gene therapy. With the completion of genome sequences for many organisms, it has become possible to study integration targeting by cloning and sequencing large numbers of host-virus DNA junctions, then mapping the host DNA segments back onto the genomic sequence. This allows statistical analysis of the distribution of integration sites relative to the myriad types of genomic features that are also being mapped onto the sequence scaffold. Here we present methods for recovering and analyzing integration site sequences.
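The statistical step described here, comparing where mapped integration sites fall relative to annotated genomic features, can be sketched as interval membership counting against a random baseline. Everything below (genome size, gene coordinates, site positions) is hypothetical illustration, not data from the chapter:

```python
import bisect
import random

def fraction_in_features(sites, features):
    """Fraction of site positions falling inside any feature interval.
    `features` must be a sorted, non-overlapping list of (start, end) tuples."""
    starts = [s for s, _ in features]
    hits = 0
    for pos in sites:
        i = bisect.bisect_right(starts, pos) - 1  # rightmost interval starting <= pos
        if i >= 0 and features[i][0] <= pos < features[i][1]:
            hits += 1
    return hits / len(sites)

# Hypothetical 1-Mb "genome" in which genes cover 40% of the sequence
genes = [(i * 100_000, i * 100_000 + 40_000) for i in range(10)]

# Random sites give the chance expectation against which observed
# integration sites would be compared for targeting bias
rng = random.Random(3)
random_sites = [rng.randrange(1_000_000) for _ in range(5000)]
baseline = fraction_in_features(random_sites, genes)  # close to 0.40 by chance
```

An observed in-gene fraction well above `baseline` would indicate preferential integration into genes, the kind of targeting bias such analyses test for.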
Abstract:
BACKGROUND: Profiling sperm DNA present on vaginal swabs taken from rape victims often contributes to identifying and incarcerating rapists. Large amounts of the victim's epithelial cells contaminate the sperm present on swabs, however, and complicate this process. The standard method for obtaining relatively pure sperm DNA from a vaginal swab is to digest the epithelial cells with Proteinase K in order to solubilize the victim's DNA, and to then physically separate the soluble DNA from the intact sperm by pelleting the sperm, removing the victim's fraction, and repeatedly washing the sperm pellet. An alternative approach that does not require washing steps is to digest with Proteinase K, pellet the sperm, remove the victim's fraction, and then digest the residual victim's DNA with a nuclease. METHODS: The nuclease approach has been commercialized in a product, the Erase Sperm Isolation Kit (PTC Labs, Columbia, MO, USA), and five crime laboratories have tested it on semen-spiked female buccal swabs in a direct comparison with their standard methods. Comparisons have also been performed on timed post-coital vaginal swabs and evidence collected from sexual assault cases. RESULTS: For the semen-spiked buccal swabs, Erase outperformed the standard methods in all five laboratories and in most cases was able to provide a clean male profile from buccal swabs spiked with only 1,500 sperm. The vaginal swabs taken after consensual sex and the evidence collected from rape victims showed a similar pattern of Erase providing superior profiles. CONCLUSIONS: In all samples tested, STR profiles of the male DNA fractions obtained with Erase were as good as or better than those obtained using the standard methods.
Abstract:
BACKGROUND: Low-molecular-weight heparin (LMWH) appears to be safe and effective for treating pulmonary embolism (PE), but its cost-effectiveness has not been assessed. METHODS: We built a Markov state-transition model to evaluate the medical and economic outcomes of a 6-day course of fixed-dose LMWH or adjusted-dose unfractionated heparin (UFH) in a hypothetical cohort of 60-year-old patients with acute submassive PE. Probabilities for clinical outcomes were obtained from a meta-analysis of clinical trials. Cost estimates were derived from Medicare reimbursement data and other sources. The base-case analysis used an inpatient setting, whereas secondary analyses examined early discharge and outpatient treatment with LMWH. Using a societal perspective, strategies were compared based on lifetime costs, quality-adjusted life-years (QALYs), and the incremental cost-effectiveness ratio. RESULTS: Inpatient treatment costs were higher for LMWH than for UFH ($13,001 vs $12,780), but LMWH yielded a greater number of QALYs than did UFH (7.677 vs 7.493). The incremental cost of $221 and the corresponding incremental effectiveness of 0.184 QALYs resulted in an incremental cost-effectiveness ratio of $1,209/QALY. Our results were highly robust in sensitivity analyses. LMWH became cost-saving if the daily pharmacy costs for LMWH were <$51, if ≥8% of patients were eligible for early discharge, or if ≥5% of patients could be treated entirely as outpatients. CONCLUSION: For inpatient treatment of PE, the use of LMWH is cost-effective compared to UFH. Early discharge or outpatient treatment in suitable patients with PE would lead to substantial cost savings.
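The headline comparison reduces to the incremental cost-effectiveness ratio (ICER): extra cost divided by extra QALYs. Recomputing it from the rounded figures reported in the abstract:

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Rounded figures reported in the abstract: LMWH vs UFH
ratio = icer(13001, 12780, 7.677, 7.493)
# $221 / 0.184 QALYs comes to roughly $1,201/QALY from these rounded
# inputs; the paper's $1,209/QALY presumably reflects unrounded model outputs.
```

Either way, the ratio sits far below conventional willingness-to-pay thresholds, which is what supports the cost-effectiveness conclusion.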
Abstract:
To be diagnostically useful, structural MRI must reliably distinguish Alzheimer's disease (AD) from normal aging in individual scans. Recent advances in statistical learning theory have led to the application of support vector machines to MRI for detection of a variety of disease states. The aims of this study were to assess how successfully support vector machines assigned individual diagnoses and to determine whether datasets combined from multiple scanners and different centres could be used to obtain effective classification of scans. We used linear support vector machines to classify the grey matter segment of T1-weighted MR scans from pathologically proven AD patients and cognitively normal elderly individuals obtained from two centres with different scanning equipment. Because the clinical diagnosis of mild AD is difficult, we also tested the ability of support vector machines to differentiate control scans from those of patients without post-mortem confirmation. Finally, we sought to use these methods to differentiate scans of patients with AD from those of patients with frontotemporal lobar degeneration. Up to 96% of pathologically verified AD patients were correctly classified using whole brain images. Data from different centres were successfully combined, achieving results comparable to those of the separate analyses. Importantly, data from one centre could be used to train a support vector machine to accurately differentiate AD and normal ageing scans obtained from another centre with different subjects and different scanner equipment. Patients with mild, clinically probable AD and age/sex-matched controls were correctly separated in 89% of cases, which is compatible with published diagnosis rates in the best clinical centres. The method also correctly assigned 89% of patients with a post-mortem confirmed diagnosis of either AD or frontotemporal lobar degeneration to their respective group.
Our study leads to three conclusions. Firstly, support vector machines successfully separate patients with AD from healthy aging subjects. Secondly, they perform well in the differential diagnosis of two different forms of dementia. Thirdly, the method is robust and can be generalized across different centres. This suggests an important role for computer-based diagnostic image analysis in clinical practice.
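A linear support vector machine of the kind used in this study can be sketched with the Pegasos sub-gradient method (a generic linear-SVM trainer, not the authors' pipeline). The two-feature synthetic data below merely stand in for grey-matter features, with a "second centre" simulated by a different noise level, echoing the cross-centre generalization test:

```python
import random

def train_linear_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Linear SVM (no bias term) trained with the Pegasos sub-gradient method.
    Labels must be +1/-1; minimizes lam/2 * ||w||^2 + mean hinge loss."""
    rng = random.Random(seed)
    d = len(X[0])
    w = [0.0] * d
    t = 0
    for _ in range(epochs):
        for i in rng.sample(range(len(X)), len(X)):  # shuffled passes
            t += 1
            eta = 1.0 / (lam * t)  # Pegasos step-size schedule
            margin = y[i] * sum(wj * xj for wj, xj in zip(w, X[i]))
            w = [(1.0 - eta * lam) * wj for wj in w]  # regularizer shrink
            if margin < 1.0:  # hinge-loss sub-gradient step on violators
                w = [wj + eta * y[i] * xj for wj, xj in zip(w, X[i])]
    return w

def predict(w, x):
    """Sign of the decision function."""
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) >= 0 else -1

# Train on "centre A" data, test on "centre B" data with noisier features
rng = random.Random(42)
def sample(n, mu, sd):
    return [[rng.gauss(mu[0], sd), rng.gauss(mu[1], sd)] for _ in range(n)]

train_X = sample(50, (1.0, 1.0), 0.4) + sample(50, (-1.0, -1.0), 0.4)
train_y = [1] * 50 + [-1] * 50
test_X = sample(30, (1.0, 1.0), 0.6) + sample(30, (-1.0, -1.0), 0.6)
test_y = [1] * 30 + [-1] * 30
w = train_linear_svm(train_X, train_y)
acc = sum(predict(w, x) == t for x, t in zip(test_X, test_y)) / len(test_y)
```

Real grey-matter segments have hundreds of thousands of voxel features rather than two, but the linear decision rule and the train-on-one-centre, test-on-another protocol are the same.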