897 results for Simulation-based methods
Abstract:
Lung stereology has a long and successful tradition. From mice to men, the application of new stereological methods at several levels (alveoli, parenchymal cells, organelles, proteins) has led to new insights into normal lung architecture, parenchymal remodelling in emphysema-like pathology, alveolar type II cell hyperplasia and hypertrophy, intracellular surfactant alterations, and the distribution of surfactant proteins. The Euler number of the network of alveolar openings, estimated using physical disectors at the light microscopic level, is an unbiased and direct estimate of alveolar number. Surfactant-producing alveolar type II cells can be counted and sampled for local size estimation with physical disectors at a high-magnification light microscopic level. The number of their surfactant storage organelles, the lamellar bodies, can be estimated using physical disectors at the EM level. By immunoelectron microscopy, surfactant protein distribution can be analysed with the relative labelling index. Together with the well-established classical stereological methods, these design-based methods now allow for a complete quantitative phenotype analysis in lung development and disease, including the structural characterization of gene-manipulated mice, at the light and electron microscopic level.
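To make the counting principle concrete, the sketch below shows how physical-disector counts convert into a total number estimate. It is a generic, simplified illustration with hypothetical values, not the exact protocol of the work summarized above.

```python
# Minimal sketch of design-based number estimation with a physical
# disector (generic illustration; all numeric values are hypothetical).
# An object, e.g. an alveolar type II cell, is counted (Q-) when it
# appears in the reference section but not in the look-up section.

def disector_number_estimate(q_minus_counts, frame_area_um2,
                             disector_height_um, reference_volume_um3):
    """Total number = numerical density N_V times reference volume.

    N_V = sum(Q-) / (n_disectors * a(frame) * h), where a(frame) is the
    counting-frame area and h is the disector height.
    """
    total_q_minus = sum(q_minus_counts)
    sampling_volume = len(q_minus_counts) * frame_area_um2 * disector_height_um
    return (total_q_minus / sampling_volume) * reference_volume_um3

# Hypothetical example: 40 disector pairs, 10,000 um^2 counting frames,
# 5 um disector height, 1e12 um^3 reference volume.
counts = [2, 0, 1, 3] * 10
print(disector_number_estimate(counts, 1e4, 5.0, 1e12))
```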
Abstract:
Professor Sir David R. Cox (DRC) is widely acknowledged as among the most important scientists of the second half of the twentieth century. He inherited the mantle of statistical science from Pearson and Fisher, advanced their ideas, and translated statistical theory into practice so as to forever change the application of statistics in many fields, but especially biology and medicine. The logistic and proportional hazards models he substantially developed are arguably among the most influential biostatistical methods in current practice. This paper looks forward over the period from DRC's 80th to 90th birthdays, to speculate about the future of biostatistics, drawing lessons from DRC's contributions along the way. We consider "Cox's model" (CM) of biostatistics, an approach to statistical science that: formulates scientific questions or quantities in terms of parameters gamma in probability models f(y; gamma) that represent, in a parsimonious fashion, the underlying scientific mechanisms (Cox, 1997); partitions the parameters gamma = (theta, eta) into a subset of interest theta and other "nuisance parameters" eta necessary to complete the probability distribution (Cox and Hinkley, 1974); develops methods of inference about the scientific quantities that depend as little as possible upon the nuisance parameters (Barndorff-Nielsen and Cox, 1989); and thinks critically about the appropriate conditional distribution on which to base inferences. We briefly review exciting biomedical and public health challenges that are capable of driving statistical developments in the next decade. We discuss the statistical models and model-based inferences central to the CM approach, contrasting them with computationally-intensive strategies for prediction and inference advocated by Breiman and others (e.g. Breiman, 2001) and with more traditional design-based methods of inference (Fisher, 1935). We discuss the hierarchical (multi-level) model as an example of the future challenges and opportunities for model-based inference. We then consider the role of conditional inference, a second key element of the CM. Recent examples from genetics are used to illustrate these ideas. Finally, the paper examines causal inference and statistical computing, two other topics we believe will be central to biostatistics research and practice in the coming decade. Throughout the paper, we attempt to indicate how DRC's work and the "Cox Model" have set a standard of excellence to which all can aspire in the future.
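As a concrete instance of inference that "depends as little as possible upon the nuisance parameters", the standard textbook form of Cox's partial likelihood is reproduced below: the baseline hazard plays the role of the nuisance eta and drops out entirely. This is background material, not a formula taken from the paper itself.

```latex
% Hazard for an individual with covariate vector x_i:
%   \lambda(t; x_i) = \lambda_0(t)\exp(x_i^\top \beta),
% where beta is the parameter of interest (theta) and the baseline
% hazard lambda_0(t) is an infinite-dimensional nuisance (eta).
% Cox's partial likelihood eliminates lambda_0(t):
L(\beta) = \prod_{j=1}^{D}
  \frac{\exp\!\left(x_{(j)}^{\top}\beta\right)}
       {\sum_{i \in R(t_{(j)})} \exp\!\left(x_{i}^{\top}\beta\right)}
% with the product over the D observed event times t_(j), and R(t_(j))
% the risk set of individuals still under observation just before t_(j).
```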
Abstract:
Genomic alterations have been linked to the development and progression of cancer. The technique of Comparative Genomic Hybridization (CGH) yields data consisting of fluorescence intensity ratios of test and reference DNA samples. The intensity ratios provide information about the number of copies in DNA. Practical issues such as the contamination of tumor cells in tissue specimens and normalization errors necessitate the use of statistics for learning about the genomic alterations from array-CGH data. As increasing amounts of array CGH data become available, there is a growing need for automated algorithms for characterizing genomic profiles. Specifically, there is a need for algorithms that can identify gains and losses in the number of copies based on statistical considerations, rather than merely detect trends in the data. We adopt a Bayesian approach, relying on the hidden Markov model to account for the inherent dependence in the intensity ratios. Posterior inferences are made about gains and losses in copy number. Localized amplifications (associated with oncogene mutations) and deletions (associated with mutations of tumor suppressors) are identified using posterior probabilities. Global trends such as extended regions of altered copy number are detected. Since the posterior distribution is analytically intractable, we implement a Metropolis-within-Gibbs algorithm for efficient simulation-based inference. Publicly available data on pancreatic adenocarcinoma, glioblastoma multiforme and breast cancer are analyzed, and comparisons are made with some widely-used algorithms to illustrate the reliability and success of the technique.
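As an illustration of the hidden-Markov layer underlying such copy-number inference, the sketch below computes posterior state probabilities with a forward-backward pass for fixed, hypothetical parameters. The paper itself samples all unknowns with a Metropolis-within-Gibbs algorithm, which is not reproduced here.

```python
import numpy as np

# Minimal sketch of HMM-based copy-number smoothing (illustrative;
# states are loss / neutral / gain, emissions are Gaussian log2 ratios,
# and all numeric values below are hypothetical).

def forward_backward(log_ratios, means, sd, trans, init):
    n = len(log_ratios)
    # Gaussian emission likelihoods for each probe (rows) and state (cols)
    em = np.exp(-0.5 * ((log_ratios[:, None] - means) / sd) ** 2) / sd
    alpha = np.zeros((n, len(means)))
    beta = np.ones_like(alpha)
    alpha[0] = init * em[0]
    alpha[0] /= alpha[0].sum()
    for t in range(1, n):                    # scaled forward pass
        alpha[t] = (alpha[t - 1] @ trans) * em[t]
        alpha[t] /= alpha[t].sum()
    for t in range(n - 2, -1, -1):           # scaled backward pass
        beta[t] = trans @ (em[t + 1] * beta[t + 1])
        beta[t] /= beta[t].sum()
    post = alpha * beta                      # posterior state probabilities
    return post / post.sum(axis=1, keepdims=True)

means = np.array([-0.6, 0.0, 0.6])           # loss / neutral / gain
trans = np.array([[0.98, 0.01, 0.01],        # sticky transitions model
                  [0.01, 0.98, 0.01],        # the dependence along the
                  [0.01, 0.01, 0.98]])       # genome
data = np.concatenate([np.random.normal(0.0, 0.2, 50),
                       np.random.normal(0.6, 0.2, 20)])
post = forward_backward(data, means, 0.2, trans, np.ones(3) / 3)
print("P(gain) on last probes:", post[-5:, 2].round(2))
```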
Abstract:
BACKGROUND: In contrast to RIA, recently available ELISAs provide the potential for fully automated analysis of adiponectin. To date, studies reporting on the diagnostic characteristics of ELISAs and investigating the relationship between ELISA- and RIA-based methods are rare. METHODS: We therefore established and evaluated a fully automated platform (BEP 2000; Dade-Behring, Switzerland) for determination of adiponectin levels in serum by two different ELISA methods (competitive human adiponectin ELISA; high-sensitivity human adiponectin sandwich ELISA; both Biovendor, Czech Republic). Further, as a reference method, we also employed a human adiponectin RIA (Linco Research, USA). Samples from 150 patients routinely presenting to our cardiology unit were tested. RESULTS: ELISA measurements could be accomplished in less than 3 h, whereas the RIA required 24 h. The ELISAs were evaluated for precision, analytical sensitivity and specificity, linearity on dilution and spiking recovery. In the investigated patients, type 2 diabetes, higher age and male gender were significantly associated with lower serum adiponectin concentrations. Correlations between the ELISA methods and the RIA were strong (competitive ELISA, r=0.82; sandwich ELISA, r=0.92; both p<0.001). However, Deming regression and Bland-Altman analysis indicated a lack of agreement among the three methods, preventing direct comparison of results. The equations of the regression lines are: competitive ELISA = 1.48 x RIA - 0.88; high-sensitivity sandwich ELISA = 0.77 x RIA + 1.01. CONCLUSIONS: Fully automated measurement of adiponectin by ELISA is feasible and substantially more rapid than RIA. The investigated ELISA test systems appear to exhibit analytical characteristics allowing for clinical application. In addition, there is a strong correlation between the ELISA methods and the RIA. These findings might promote a more widespread use of adiponectin measurements in clinical research.
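A minimal sketch of the two agreement analyses named above, Deming regression and Bland-Altman limits of agreement, is given below. The data are synthetic (generated from the reported competitive-ELISA regression equation), and the error-variance ratio is assumed to be 1, i.e. orthogonal regression.

```python
import numpy as np

# Illustrative method-comparison helpers; not the paper's software.

def deming(x, y, lam=1.0):
    """Deming regression slope and intercept.

    lam is the assumed ratio of the y-error variance to the x-error
    variance (assumed 1.0 here, since the true ratio is unknown).
    """
    mx, my = x.mean(), y.mean()
    sxx = ((x - mx) ** 2).mean()
    syy = ((y - my) ** 2).mean()
    sxy = ((x - mx) * (y - my)).mean()
    slope = (syy - lam * sxx +
             np.sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2)) / (2 * sxy)
    return slope, my - slope * mx

def bland_altman(x, y):
    """Mean difference (bias) and 95% limits of agreement."""
    d = y - x
    bias, spread = d.mean(), 1.96 * d.std(ddof=1)
    return bias, bias - spread, bias + spread

rng = np.random.default_rng(0)
ria = rng.uniform(2, 20, 150)                        # synthetic RIA values
elisa = 1.48 * ria - 0.88 + rng.normal(0, 1, 150)    # per reported equation
print("Deming slope/intercept:", deming(ria, elisa))
print("Bland-Altman bias, lower, upper:", bland_altman(ria, elisa))
```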
Abstract:
The goals of the present study were to model the population kinetics of in vivo influx and efflux processes of grepafloxacin at the serum-cerebrospinal fluid (CSF) barrier and to propose a simulation-based approach to optimizing the design of dose-finding trials in the meningitis rabbit model. Twenty-nine rabbits with pneumococcal meningitis receiving grepafloxacin at 15 mg/kg of body weight (intravenous administration at 0 h), 30 mg/kg (at 0 h), or 50 mg/kg twice (at 0 and 4 h) were studied. A three-compartment population pharmacokinetic model was fit to the data with the program NONMEM (Nonlinear Mixed Effects Modeling). Passive diffusion clearance (CL(diff)) and active efflux clearance (CL(active)) are transfer kinetic modeling parameters. Influx clearance is assumed to be equal to CL(diff), and efflux clearance is the sum of CL(diff), CL(active), and bulk flow clearance (CL(bulk)). The average influx clearance for the population was 0.0055 ml/min (interindividual variability, 17%). Passive diffusion clearance was greater in rabbits receiving grepafloxacin at 15 mg/kg than in those treated with higher doses (0.0088 versus 0.0034 ml/min). Assuming a CL(bulk) of 0.01 ml/min, CL(active) was estimated to be 0.017 ml/min (11%), and clearance by total efflux was estimated to be 0.032 ml/min. The population kinetic model makes it possible not only to quantify in vivo efflux and influx mechanisms at the serum-CSF barrier but also to analyze the effects of different dose regimens on transfer kinetic parameters in the rabbit meningitis model. The modeling-based approach also provides a tool for simulating and predicting various outcomes of interest to researchers, which has great potential value in designing dose-finding trials.
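The sketch below illustrates the stated clearance structure (influx = CL(diff); efflux = CL(diff) + CL(active) + CL(bulk)) in a reduced serum-to-CSF transfer model. The mono-exponential serum kinetics and the CSF volume are assumed values for illustration; this is not the three-compartment NONMEM model itself.

```python
import numpy as np
from scipy.integrate import odeint

# Illustrative two-compartment reduction of serum-to-CSF drug transfer.
CL_diff, CL_active, CL_bulk = 0.0055, 0.017, 0.01   # ml/min (from the text)
CL_in = CL_diff                                     # influx clearance
CL_out = CL_diff + CL_active + CL_bulk              # total efflux clearance
V_csf = 2.0                                         # ml, assumed CSF volume
k_serum = 0.005                                     # 1/min, assumed serum decay

def csf_model(c_csf, t, c0_serum):
    # Assumed mono-exponential serum concentration driving CSF influx
    c_serum = c0_serum * np.exp(-k_serum * t)
    return (CL_in * c_serum - CL_out * c_csf) / V_csf

t = np.linspace(0, 480, 481)                        # minutes
c_csf = odeint(csf_model, 0.0, t, args=(10.0,))     # 10 mg/L initial serum
print("peak CSF concentration (mg/L):", c_csf.max().round(3))
```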
Abstract:
This book will serve as a foundation for a variety of useful applications of graph theory to computer vision, pattern recognition, and related areas. It covers a representative set of novel graph-theoretic methods for complex computer vision and pattern recognition tasks. The first part of the book presents the application of graph theory to low-level processing of digital images, such as a new method for partitioning a given image into a hierarchy of homogeneous areas using graph pyramids, and a study of the relationship between graph theory and digital topology. Part II presents graph-theoretic learning algorithms for high-level computer vision and pattern recognition applications, including a survey of graph-based methodologies for pattern recognition and computer vision, a presentation of a series of computationally efficient algorithms for testing graph isomorphism and related graph matching tasks in pattern recognition, and a new graph distance measure to be used for solving graph matching problems. Finally, Part III provides detailed descriptions of several applications of graph-based methods to real-world pattern recognition tasks. It includes a critical review of the main graph-based and structural methods for fingerprint classification, a new method to visualize time series of graphs, and potential applications in computer network monitoring and abnormal event detection.
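As a small illustration of the graph-matching tasks covered in Part II, the sketch below tests graph isomorphism and computes a graph edit distance, using networkx built-ins rather than the book's own algorithms.

```python
import networkx as nx

# Two classic graph-matching tasks: exact isomorphism testing and an
# edit-distance-based graph distance (illustrative example graphs).

g1 = nx.cycle_graph(5)                 # 5-node cycle
g2 = nx.relabel_nodes(g1, {i: chr(97 + i) for i in range(5)})
g3 = nx.path_graph(5)                  # 5-node path: not isomorphic to g1

print(nx.is_isomorphic(g1, g2))        # True: same structure, new labels
print(nx.is_isomorphic(g1, g3))        # False

# Graph edit distance: minimum number of node/edge insertions, deletions
# and substitutions turning one graph into the other; here 1.0, since
# deleting one edge of the cycle yields the path.
print(nx.graph_edit_distance(g1, g3))
```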
Abstract:
BACKGROUND: We aimed to assess the value of a structured clinical assessment and genetic testing for refining the diagnosis of abacavir hypersensitivity reactions (ABC-HSRs) in a routine clinical setting. METHODS: We performed a diagnostic reassessment using a structured patient chart review in individuals who had stopped ABC because of suspected HSR. Two HIV physicians blinded to the human leukocyte antigen (HLA) typing results independently classified these individuals on a scale between 3 (ABC-HSR highly likely) and -3 (ABC-HSR highly unlikely). Scoring was based on symptoms, onset of symptoms and comedication use. Patients were classified as clinically likely (mean score ≥ 2), uncertain (mean score ≥ -1 and ≤ 1) and unlikely (mean score ≤ -2). HLA typing was performed using sequence-based methods. RESULTS: From 131 reassessed individuals, 27 (21%) were classified as likely, 43 (33%) as unlikely and 61 (47%) as uncertain ABC-HSR. Of the 131 individuals with suspected ABC-HSR, 31% were HLA-B*5701-positive compared with 1% of 140 ABC-tolerant controls (P < 0.001). HLA-B*5701 carriage rate was higher in individuals with likely ABC-HSR compared with those with uncertain or unlikely ABC-HSR (78%, 30% and 5%, respectively, P < 0.001). Only six (7%) HLA-B*5701-negative individuals were classified as likely HSR after reassessment. CONCLUSIONS: HLA-B*5701 carriage is highly predictive of clinically diagnosed ABC-HSR. The high proportion of HLA-B*5701-negative individuals with minor symptoms among individuals with suspected HSR indicates overdiagnosis of ABC-HSR in the era preceding genetic screening. A structured clinical assessment and genetic testing could reduce the rate of inappropriate ABC discontinuation and identify individuals at high risk for ABC-HSR.
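The classification rule can be read as a simple threshold function of the two physicians' mean score, as sketched below. How means falling between the stated bands (exactly ±1.5) were handled is not specified in the abstract and is flagged in the sketch.

```python
# Minimal sketch of the chart-review classification rule described
# above: two blinded physicians each score a case from -3 to 3 and the
# mean score is mapped to a category.

def classify_hsr(score_a: int, score_b: int) -> str:
    mean = (score_a + score_b) / 2
    if mean >= 2:
        return "likely ABC-HSR"
    if -1 <= mean <= 1:
        return "uncertain"
    if mean <= -2:
        return "unlikely ABC-HSR"
    # Means of exactly +/-1.5 fall outside the stated bands; the
    # abstract does not say how such cases were resolved.
    return "unclassified"

print(classify_hsr(3, 2))    # likely ABC-HSR
print(classify_hsr(1, -1))   # uncertain
print(classify_hsr(-3, -2))  # unlikely ABC-HSR
```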
Abstract:
Decentralised controls offer advantages for both the implementation and the operation of controls for continuous conveyors. Such concepts are mainly based on RFID. Owing to the reduced outlay for hardware and software, however, the plant behaviour cannot be determined as accurately as in centrally controlled systems. This article describes a simulation-based method by which the performance of these two control concepts can easily be evaluated in order to determine the suitability of the decentralised concept.
Abstract:
For in-plant transport, and especially for material supply in production, tugger trains (Routenzüge) consisting of a towing vehicle and up to five trailers are increasingly being used. To ensure good manoeuvrability and a small footprint, tugger-train trailers should track the towing vehicle's path as faithfully as possible. In this paper, tracking fidelity is examined for two commercially available tugger-train trailers, one with two unsteered wheels and one with four steered wheels. To this end, a quality criterion is first defined that quantifies the maximum tracking deviation. In addition, test scenarios are proposed so that the different chassis and steering concepts can be compared. Using an analytical model developed for this purpose, the tracking deviations during steady-state circular travel are calculated and presented for the two selected concepts. Additionally, a multi-body simulation, which allows deeper physical modelling and the investigation of more complex driving manoeuvres, is carried out and compared with the analytical results. It can be shown that, in addition to the steering kinematics, further parameters such as the tyre slip behaviour influence tracking fidelity.
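A minimal kinematic sketch of the steady-state circular-travel case is given below. It assumes ideal single-axle trailers coupled at the axle centre and no tyre slip (a simplification, since the paper shows tyre slip is not negligible), with assumed dimensions.

```python
import math

# Illustrative off-tracking of a chain of kinematic trailers during
# steady-state circular travel; not the paper's analytical model.

def offtracking(r_tow, drawbar_lengths):
    """Axle-path radii of a chain of ideal kinematic trailers.

    In steady-state circular motion each trailer's axle runs on
    r_i = sqrt(r_{i-1}^2 - L_i^2), so the train tracks inward.
    """
    radii, r = [], r_tow
    for L in drawbar_lengths:
        r = math.sqrt(r * r - L * L)
        radii.append(r)
    return radii

r_tow = 5.0                             # m, towing-vehicle radius (assumed)
radii = offtracking(r_tow, [1.2] * 5)   # five trailers, 1.2 m drawbars
for i, r in enumerate(radii, 1):
    print(f"trailer {i}: radius {r:.2f} m, off-tracking {r_tow - r:.2f} m")
```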
Abstract:
Tugger trains have gained greatly in importance for in-plant material transport in recent years. Important properties are manoeuvrability and tracking fidelity, since they largely determine the floor-space requirement. This paper investigates the tracking behaviour of tugger trains that differ in their chassis and steering concepts, and presents a new steering kinematics. To compare the tracking fidelity of the different concepts objectively, driving manoeuvres and a quality criterion are first defined so that tracking deviations can be quantitatively described and compared. With an analytical model presented in this paper, important statements about the tracking deviations can already be made for steady-state circular travel. In addition, simulations are carried out that allow deeper physical modelling and the investigation of more complex driving manoeuvres. It is also shown that the type of driving manoeuvre influences the tracking deviation. Chassis and steering concepts that show very good tracking behaviour during steady-state circular travel, and have hitherto been described as track-faithful, exhibit in some cases considerable tracking deviations when entering or exiting a curve. Building on these findings, a new steering concept is presented that is distinguished in particular by a very simple design and high tracking fidelity.
Abstract:
During the last decade, medical education in the German-speaking world has been striving to become more practice-oriented. This is currently being achieved in many schools through the implementation of simulation-based instruction in Skills Labs. Simulators are thus an essential part of this type of medical training, and their acquisition and operation by a Skills Lab require a large outlay of resources. Therefore, the Practical Skills Committee of the Medical Education Society (GMA) introduced a new project, which aims to improve the flow of information between the Skills Labs and enable a transparent assessment of the simulators via an online database (the Simulator Network).
Abstract:
CONTRIBUTION OF ECTODOMAIN MUTATIONS IN EPIDERMAL GROWTH FACTOR RECEPTOR TO SIGNALING IN GLIOBLASTOMA MULTIFORME. Marta Rojas, M.S. Supervisory Professor: Oliver Bögler, Ph.D. The Cancer Genome Atlas (TCGA) has conducted a comprehensive analysis of a large tumor cohort and has cataloged genetic alterations involving primary sequence variations and copy number aberrations of genes involved in key signaling pathways in glioblastoma (GBM). This dataset revealed missense ectodomain point mutations in epidermal growth factor receptor (EGFR), but the biological and clinical significance of these mutations is not well defined in the context of gliomas. In our study, we focused on understanding and defining the molecular mechanisms underlying the functions of EGFR ectodomain mutants. Using proteomic approaches to broadly analyze cell signaling, including antibody array and mass spectrometry-based methods, we found a differential spectrum of tyrosine phosphorylation across the EGFR ectodomain mutations that enabled us to stratify them into three main groups that correlate with either wild-type EGFR or the long-studied mutant EGFRvIII. Interestingly, one mutant shared characteristics of both groups, suggesting a continuum of behaviors along which different mutants fall. Surprisingly, no substantial differences were seen in the activation of classical downstream signaling pathways, such as the Akt and S6 pathways, between these classes of mutants. Importantly, we demonstrated that ectodomain mutations lead to differential tumor growth capabilities both in vitro (anchorage-independent colony formation) and in vivo (xenografts). Our data from the biological characterization allowed us to categorize the mutants into three main groups: the first group, typified by EGFRvIII, comprises mutations with a more aggressive phenotype, including R108K and A289T; a second group, characterized by a less aggressive phenotype, is exemplified by wild-type EGFR and the T263P mutation; and a third group, which shares characteristics of both, is exemplified by the mutation A289D. In addition, we treated cells overexpressing the mutants with various agents employed in the clinic, including temozolomide, cisplatin and Tarceva. We found that cells overexpressing the mutants generally displayed resistance to the treatments. Our findings yield insights that help with the molecular characterization of these mutants. In addition, our results from the drug studies might be valuable in explaining differential responses to specific treatments in GBM patients.
Abstract:
Vector control is the mainstay of malaria control programmes. Successful vector control relies profoundly on accurate information about the target mosquito populations in order to choose the most appropriate intervention for a given mosquito species and to monitor its impact. An impediment to identifying mosquito species is the existence of morphologically identical sibling species that play different roles in the transmission of pathogens and parasites. Currently, PCR diagnostics are used to distinguish between sibling species. PCR-based methods are, however, expensive and time-consuming, and their development requires a priori DNA sequence information. Here, we evaluated an inexpensive molecular proteomics approach for Anopheles species: matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). MALDI-TOF MS is a well-developed protein-profiling tool for the identification of microorganisms but has so far received little attention as a diagnostic tool in entomology. We measured MS spectra from specimens of 32 laboratory colonies and 2 field populations representing 12 Anopheles species, including the A. gambiae species complex. An important step in the study was the advancement and implementation of a bioinformatics approach improving the resolution over previously applied cluster analysis. Borrowing tools for linear discriminant analysis from genomics, MALDI-TOF MS accurately identified taxonomically closely related mosquito species, including the separation between the M and S molecular forms of A. gambiae sensu stricto. The approach also classifies specimens from different laboratory colonies; it therefore also appears very promising for colony authentication as part of quality assurance in laboratory studies. Besides being exceptionally accurate and robust, MALDI-TOF MS has several advantages over other typing methods, including simple sample preparation and short processing time. As the method does not require DNA sequence information, data can also be reviewed at any later stage for diagnostic or functional patterns without the need for re-designing and re-processing biological material.
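The sketch below illustrates the genomics-style linear discriminant analysis step on spectrum-like feature vectors. The features are synthetic stand-ins for binned MALDI-TOF peak intensities, not real spectra or the study's actual pipeline.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Illustrative species classification from MS-profile-like features.
rng = np.random.default_rng(1)
n_per_species, n_features = 30, 200      # hypothetical spectrum bins

# Two synthetic "species" whose mean peak profiles differ slightly
profile_a = rng.normal(0, 1, n_features)
profile_b = profile_a + rng.normal(0, 0.3, n_features)
X = np.vstack([rng.normal(profile_a, 1.0, (n_per_species, n_features)),
               rng.normal(profile_b, 1.0, (n_per_species, n_features))])
y = np.array([0] * n_per_species + [1] * n_per_species)

lda = LinearDiscriminantAnalysis()       # projects onto class-separating axes
scores = cross_val_score(lda, X, y, cv=5)
print("cross-validated accuracy:", scores.mean().round(2))
```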
Abstract:
Multiple sclerosis (MS) is a chronic disease with an inflammatory and neurodegenerative pathology. Axonal loss and neurodegeneration occurs early in the disease course and may lead to irreversible neurological impairment. Changes in brain volume, observed from the earliest stage of MS and proceeding throughout the disease course, may be an accurate measure of neurodegeneration and tissue damage. There are a number of magnetic resonance imaging-based methods for determining global or regional brain volume, including cross-sectional (e.g. brain parenchymal fraction) and longitudinal techniques (e.g. SIENA [Structural Image Evaluation using Normalization of Atrophy]). Although these methods are sensitive and reproducible, caution must be exercised when interpreting brain volume data, as numerous factors (e.g. pseudoatrophy) may have a confounding effect on measurements, especially in a disease with complex pathological substrates such as MS. Brain volume loss has been correlated with disability progression and cognitive impairment in MS, with the loss of grey matter volume more closely correlated with clinical measures than loss of white matter volume. Preventing brain volume loss may therefore have important clinical implications affecting treatment decisions, with several clinical trials now demonstrating an effect of disease-modifying treatments (DMTs) on reducing brain volume loss. In clinical practice, it may therefore be important to consider the potential impact of a therapy on reducing the rate of brain volume loss. This article reviews the measurement of brain volume in clinical trials and practice, the effect of DMTs on brain volume change across trials and the clinical relevance of brain volume loss in MS.
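As a concrete example of a cross-sectional measure, the brain parenchymal fraction can be computed from segmentation volumes as sketched below; the volumes shown are hypothetical.

```python
# Minimal sketch of the brain parenchymal fraction (BPF) mentioned
# above (illustrative; volumes would come from an MRI segmentation
# tool, and the numbers here are hypothetical).

def brain_parenchymal_fraction(gm_ml, wm_ml, csf_ml):
    """BPF = parenchymal volume / total intracranial volume."""
    parenchyma = gm_ml + wm_ml
    return parenchyma / (parenchyma + csf_ml)

# Hypothetical segmentation volumes in millilitres
print(round(brain_parenchymal_fraction(gm_ml=620, wm_ml=510, csf_ml=320), 3))
# A declining BPF over serial scans suggests parenchymal volume loss,
# subject to the confounds (e.g. pseudoatrophy) noted above.
```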
Abstract:
In patients diagnosed with pharmaco-resistant epilepsy, cerebral areas responsible for seizure generation can be defined by performing implantation of intracranial electrodes. The identification of the epileptogenic zone (EZ) is based on visual inspection of the intracranial electroencephalogram (IEEG) performed by highly qualified neurophysiologists. New computer-based quantitative EEG analyses have been developed in collaboration with the signal analysis community to expedite EZ detection. The aim of the present report is to compare different signal analysis approaches developed in four different European laboratories working in close collaboration with four European Epilepsy Centers. Computer-based signal analysis methods were retrospectively applied to IEEG recordings performed in four patients undergoing pre-surgical exploration for pharmaco-resistant epilepsy. The four methods elaborated by the different teams to identify the EZ are based on frequency analysis, on nonlinear signal analysis, on connectivity measures, or on statistical parametric mapping of epileptogenicity indices. All methods converge on the identification of the EZ in patients who present with fast activity at seizure onset. When traditional visual inspection was not successful in detecting the EZ on IEEG, the different signal analysis methods produced highly discordant results. Quantitative analysis of IEEG recordings complements clinical evaluation by contributing to the study of epileptogenic networks during seizures. We demonstrate that the sensitivity of different computer-based methods in detecting the EZ, relative to visual EEG inspection, depends on the specific seizure pattern.
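As an illustration of the frequency-analysis family of methods, the sketch below computes a simple fast-to-slow band-power ratio per channel on synthetic signals. It is a generic energy-ratio index in the spirit of epileptogenicity measures, not any specific team's published method, and the band edges are assumptions.

```python
import numpy as np
from scipy.signal import welch

# Illustrative frequency-analysis marker for EZ screening: channels
# with prominent fast activity at seizure onset score a high ratio.

def fast_slow_ratio(signal, fs, slow=(3.5, 12.4), fast=(12.4, 97.0)):
    f, pxx = welch(signal, fs=fs, nperseg=fs)        # power spectral density
    slow_power = pxx[(f >= slow[0]) & (f < slow[1])].sum()
    fast_power = pxx[(f >= fast[0]) & (f < fast[1])].sum()
    return fast_power / slow_power

fs = 512
t = np.arange(0, 4, 1 / fs)                          # 4 s of data at 512 Hz
background = np.sin(2 * np.pi * 8 * t)               # alpha-dominated channel
onset = background + 0.8 * np.sin(2 * np.pi * 60 * t)  # added fast activity
noise = np.random.default_rng(2).normal(0, 0.1, t.size)
print("background channel:", round(fast_slow_ratio(background + noise, fs), 2))
print("seizure-onset channel:", round(fast_slow_ratio(onset + noise, fs), 2))
```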