31 results for Mathematical techniques


Relevance:

20.00%

Publisher:

Abstract:

This thesis, which consists of an introduction and four peer-reviewed original publications, studies the problems of haplotype inference (haplotyping) and local alignment significance. The problems studied here belong to the broad area of bioinformatics and computational biology. The presented solutions are computationally fast and accurate, which makes them practical in high-throughput sequence data analysis. Haplotype inference is a computational problem where the goal is to estimate haplotypes from a sample of genotypes as accurately as possible. This problem is important because the direct measurement of haplotypes is difficult, whereas genotypes are easier to quantify. Haplotypes are key players when studying, for example, the genetic causes of diseases. In this thesis, three methods are presented for the haplotype inference problem: HaploParser, HIT, and BACH. HaploParser is based on a combinatorial mosaic model and hierarchical parsing that together mimic recombinations and point mutations in a biologically plausible way. In this mosaic model, the current population is assumed to have evolved from a small founder population; thus, the haplotypes of the current population are recombinations of the (implicit) founder haplotypes with some point mutations. HIT (Haplotype Inference Technique) uses a hidden Markov model for haplotypes, and efficient algorithms are presented for learning this model from genotype data. The model structure of HIT is analogous to the mosaic model of HaploParser with founder haplotypes; it can therefore be seen as a probabilistic model of recombinations and point mutations. BACH (Bayesian Context-based Haplotyping) utilizes a context tree weighting algorithm to efficiently sum over all variable-length Markov chains in order to evaluate the posterior probability of a haplotype configuration. Algorithms are presented that find haplotype configurations with high posterior probability. BACH is the most accurate method presented in this thesis and has performance comparable to the best available software for haplotype inference. Local alignment significance is a computational problem where one is interested in whether the local similarities between two sequences arise because the sequences are related or merely by chance. Sequence similarity is measured by the best local alignment score, from which a p-value is computed. This p-value is the probability of picking two sequences from the null model that have an equally good or better best local alignment score. Local alignment significance is used routinely, for example, in homology searches. In this thesis, a general framework is sketched that allows one to compute a tight upper bound for the p-value of a local pairwise alignment score. Unlike previous methods, the presented framework is not affected by so-called edge effects and can handle gaps (deletions and insertions) without troublesome sampling and curve fitting.
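To make the p-value definition above concrete, here is a minimal sketch (hypothetical, not the framework developed in the thesis): it scores two sequences with a basic Smith-Waterman local alignment and estimates the p-value empirically by drawing sequence pairs from a uniform i.i.d. null model.

```python
import random

def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
    """Best local alignment score of sequences a and b (linear gap penalty)."""
    cols = len(b) + 1
    prev = [0] * cols
    best = 0
    for i in range(1, len(a) + 1):
        curr = [0] * cols
        for j in range(1, cols):
            s = match if a[i - 1] == b[j - 1] else mismatch
            curr[j] = max(0, prev[j - 1] + s, prev[j] + gap, curr[j - 1] + gap)
            best = max(best, curr[j])
        prev = curr
    return best

def empirical_p_value(a, b, n_samples=1000, alphabet="ACGT", seed=0):
    """P(null pair scores >= observed score), estimated by Monte Carlo sampling."""
    rng = random.Random(seed)
    observed = smith_waterman(a, b)
    hits = 0
    for _ in range(n_samples):
        x = "".join(rng.choice(alphabet) for _ in range(len(a)))
        y = "".join(rng.choice(alphabet) for _ in range(len(b)))
        if smith_waterman(x, y) >= observed:
            hits += 1
    return (hits + 1) / (n_samples + 1)   # add-one correction avoids a zero estimate

print(empirical_p_value("ACGTGCA", "ACGTTCA", n_samples=200))
```

The framework described in the abstract instead bounds this probability analytically, which is precisely what removes the need for sampling and curve fitting.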

Relevance:

20.00%

Publisher:

Abstract:

Analyzing statistical dependencies is a fundamental problem in all empirical science. Dependencies help us understand causes and effects, create new scientific theories, and invent cures to problems. Nowadays, large amounts of data are available, but efficient computational tools for analyzing the data are missing. In this research, we develop efficient algorithms for a commonly occurring search problem - searching for the statistically most significant dependency rules in binary data. We consider dependency rules of the form X->A or X->not A, where X is a set of positive-valued attributes and A is a single attribute. Such rules describe which factors either increase or decrease the probability of the consequent A. A classical example is genetic and environmental factors, which can either cause or prevent a disease. The emphasis in this research is that the discovered dependencies should be genuine, i.e. they should also hold in future data. This is an important distinction from traditional association rules, which - in spite of their name and a similar appearance to dependency rules - do not necessarily represent statistical dependencies at all, or represent only spurious connections that occur by chance. Therefore, the principal objective is to search for rules with statistical significance measures. Another important objective is to search only for non-redundant rules, which express the real causes of dependence without any occasional extra factors. The extra factors do not add any new information on the dependence; they can only blur it and make it less accurate in future data. The problem is computationally very demanding, because the number of all possible rules increases exponentially with the number of attributes. In addition, neither statistical dependency nor statistical significance is a monotonic property, which means that traditional pruning techniques do not work. As a solution, we first derive the mathematical basis for pruning the search space with any well-behaving statistical significance measure. The mathematical theory is complemented by a new algorithmic invention, which enables an efficient search without any heuristic restrictions. The resulting algorithm can be used to search for both positive and negative dependencies with any commonly used statistical measure, such as Fisher's exact test, the chi-squared measure, mutual information, and z scores. According to our experiments, the algorithm scales well, especially with Fisher's exact test. It can easily handle even the densest data sets with 10000-20000 attributes. Still, the results are globally optimal, which is a remarkable improvement over the existing solutions. In practice, this means that the user does not have to worry about whether the dependencies hold in future data or whether the data still contains better, undiscovered dependencies.
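As a minimal illustration of how a single candidate rule X->A can be scored with Fisher's exact test (one of the measures named above), the sketch below builds the 2x2 contingency table of X versus A from binary data. The function, column indices, and data are hypothetical, and the pruned search over rules itself is not shown.

```python
import numpy as np
from scipy.stats import fisher_exact

def rule_p_value(data, x_cols, a_col):
    """One-sided Fisher's exact test for the dependency rule X -> A on 0/1 data.

    data  : (n_rows, n_attrs) array of binary values
    x_cols: column indices forming the antecedent set X
    a_col : column index of the consequent A
    """
    x_true = data[:, x_cols].all(axis=1)         # rows where every attribute in X is 1
    a_true = data[:, a_col].astype(bool)
    table = [
        [int(np.sum(x_true & a_true)),  int(np.sum(x_true & ~a_true))],
        [int(np.sum(~x_true & a_true)), int(np.sum(~x_true & ~a_true))],
    ]
    _, p = fisher_exact(table, alternative="greater")   # does X increase P(A)?
    return p

rng = np.random.default_rng(0)
data = rng.integers(0, 2, size=(500, 5))          # toy binary data set
print(rule_p_value(data, x_cols=[0, 2], a_col=4))
```

A one-sided test in the "greater" direction scores rules of the form X->A; testing the opposite direction corresponds to rules of the form X->not A.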

Relevance:

20.00%

Publisher:

Abstract:

Continuous epidural analgesia (CEA) and continuous spinal postoperative analgesia (CSPA) provided by a mixture of local anaesthetic and opioid are widely used for postoperative pain relief. With the introduction of so-called microcatheters, for example, CSPA found its way particularly into orthopaedic surgery. These techniques, however, may be associated with dose-dependent side-effects such as hypotension, weakness in the legs, and nausea and vomiting. At times, they may fail to offer sufficient analgesia, e.g. because of a misplaced catheter. The correct position of an epidural catheter might be confirmed by the supposedly easy and reliable epidural stimulation test (EST). The aims of this thesis were to determine (a) whether the efficacy, tolerability, and reliability of CEA might be improved by adding the α2-adrenergic agonists adrenaline and clonidine to CEA, and by the repeated use of EST during CEA; and (b) the feasibility of CSPA given through a microcatheter after vascular surgery. Studies I-IV were double-blinded, randomized, and controlled trials; Study V was of a diagnostic, prospective nature. Patients underwent arterial bypass surgery of the legs (I, n=50; IV, n=46), total knee arthroplasty (II, n=70; III, n=72), and abdominal surgery or thoracotomy (V, n=30). Postoperative lumbar CEA consisted of regular mixtures of ropivacaine and fentanyl, either alone or with adrenaline (2 µg/ml in Study I and 4 µg/ml in Study II) or clonidine (2 µg/ml in Study III). CSPA (IV) was given through a microcatheter (28G) and contained either ropivacaine (max. 2 mg/h) or a mixture of ropivacaine (max. 1 mg/h) and morphine (max. 8 µg/h). Epidural catheter tip position (V) was evaluated both by EST, at the moment of catheter placement and several times during CEA, and by epidurography as the reference diagnostic test. CEA and CSPA were administered for 24 or 48 h. Study parameters included pain scores assessed with a visual analogue scale, requirements of rescue pain medication, vital signs, and side-effects. Adrenaline (I and II) had no beneficial influence on the efficacy or tolerability of CEA; the total amounts of epidurally infused drugs were actually higher in the adrenaline group in Study II (p=0.02, RM ANOVA). Clonidine (III) augmented pain relief while lowering the amounts of epidurally infused drugs (p=0.01, RM ANOVA) and reducing the need for rescue oxycodone given i.m. (p=0.027, MW-U; median difference 3 mg, 95% CI 0-7 mg). Clonidine did not contribute to sedation, and its influence on haemodynamics was minimal. CSPA (IV) provided satisfactory pain relief with only limited blockade of the legs (no inter-group differences). EST (V) was often associated with technical problems and difficulties of interpretation; for example, it failed to identify the four patients whose catheters were already outside the spinal canal at the time of catheter placement. As adjuvants to lumbar CEA, clonidine only slightly improved pain relief, while adrenaline did not provide any benefit. The role of EST applied at the time of epidural catheter placement or repeatedly during CEA remains open. The microcatheter CSPA technique appeared effective and reliable, but needs to be compared to routine CEA after peripheral arterial bypass surgery.
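The between-group comparison of rescue oxycodone consumption quoted above used the Mann-Whitney U test; a minimal sketch of such a comparison follows, with invented placeholder doses rather than the study data.

```python
from scipy.stats import mannwhitneyu

# Hypothetical per-patient rescue oxycodone doses (mg); not the study data.
clonidine_group = [0, 3, 3, 6, 0, 3, 9, 3, 0, 6]
control_group   = [6, 9, 3, 6, 12, 6, 9, 3, 6, 9]

stat, p = mannwhitneyu(clonidine_group, control_group, alternative="two-sided")
print(f"U = {stat}, p = {p:.3f}")
```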

Relevance:

20.00%

Publisher:

Abstract:

Technological development of fast multi-sectional, helical computed tomography (CT) scanners has made computed tomography perfusion (CTp) and angiography (CTA) feasible in the evaluation of acute ischemic stroke. This study focuses on new multidetector computed tomography techniques, namely whole-brain and first-pass CT perfusion plus CTA of the carotid arteries. Whole-brain CTp data are acquired during slow infusion of contrast material to achieve a constant contrast concentration in the cerebral vasculature. From these data, quantitative maps of perfused cerebral blood volume (pCBV) are constructed. The probability curve of cerebral infarction as a function of normalized pCBV was determined in patients with acute ischemic stroke. Normalized pCBV, expressed as a percentage of contralateral normal brain pCBV, was determined in the infarction core and in regions just inside and outside the boundary between infarcted and noninfarcted brain. The corresponding probabilities of infarction were 0.99, 0.96, and 0.11, R² was 0.73, and the differences in perfusion between the core and the inner and outer bands were highly significant. Thus, a probability-of-infarction curve can help predict the likelihood of infarction as a function of percentage normalized pCBV. First-pass CT perfusion is based on continuous cine imaging over a selected brain area during a bolus injection of contrast. During its first passage, the contrast material compartmentalizes in the intravascular space, resulting in transient tissue enhancement. Functional maps of cerebral blood flow (CBF), cerebral blood volume (CBV), and mean transit time (MTT) are then constructed. We compared the effects of three different iodine concentrations (300, 350, or 400 mg/mL) on the peak enhancement of normal brain tissue, artery, and vein, stratified by region-of-interest (ROI) location, in 102 patients within 3 hours of stroke onset. Monotonically increasing peak opacification was evident at all ROI locations, suggesting that CTp evaluation of patients with acute stroke is best performed with the highest available concentration of contrast agent. In another study, we investigated whether lesion volumes on CBV, CBF, and MTT maps within 3 hours of stroke onset predict final infarct volume, and whether all of these parameters are needed for triage to intravenous recombinant tissue plasminogen activator (IV-rtPA). The effect of IV-rtPA on the affected brain was also investigated by measuring the salvaged tissue volume in patients receiving IV-rtPA and in controls. CBV lesion volume did not necessarily represent dead tissue. MTT lesion volume alone can serve to identify the upper size limit of the abnormally perfused brain, and patients receiving IV-rtPA salvaged more brain than did controls. Carotid CTA was compared with carotid digital subtraction angiography (DSA) in the grading of stenosis in patients with stroke symptoms. In CTA, the grade of stenosis was determined by means of axial source and maximum intensity projection (MIP) images as well as a semiautomatic vessel analysis. CTA provides an adequate, less invasive alternative to conventional DSA, although it tends to underestimate clinically relevant grades of stenosis.
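The probability-of-infarction curve described above maps normalized pCBV to a probability that tissue is infarcted. A minimal sketch of fitting such a curve with logistic regression is shown below; the data are synthetic placeholders, and the exact model used in the study is not specified here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic example: normalized pCBV (% of contralateral brain) and infarct status (1 = infarcted).
rng = np.random.default_rng(1)
pcbv = rng.uniform(20, 120, size=300)
p_true = 1.0 / (1.0 + np.exp(0.15 * (pcbv - 60)))   # lower pCBV -> higher infarct probability
infarct = rng.binomial(1, p_true)

model = LogisticRegression().fit(pcbv.reshape(-1, 1), infarct)
for v in (40, 60, 80):
    prob = model.predict_proba([[v]])[0, 1]
    print(f"normalized pCBV {v:3d}% -> estimated infarct probability {prob:.2f}")
```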

Relevance:

20.00%

Publisher:

Abstract:

Conventional invasive coronary angiography is the clinical gold standard for the detection of coronary artery stenoses. Noninvasive multidetector computed tomography (MDCT) in combination with retrospective ECG gating has recently been shown to permit visualization of the coronary artery lumen and detection of coronary artery stenoses. Single photon emission computed tomography (SPECT) perfusion imaging has been considered the reference method for the evaluation of nonviable myocardium, but magnetic resonance imaging (MRI) can accurately depict structure, function, effusion, and myocardial viability, with an overall capacity unmatched by any other single imaging modality. Magnetocardiography (MCG) provides noninvasive information about myocardial excitation propagation and repolarization without the use of electrodes; this evolving technique may be considered the magnetic equivalent of electrocardiography. The aim of the present series of studies was to evaluate changes in the myocardium caused by coronary artery disease, as assessed with SPECT and MRI; to examine the capability of multidetector computed tomography coronary angiography (MDCT-CA) to detect significant stenoses in the coronary arteries; and to examine the capability of MCG to assess remote myocardial infarctions. Our study showed that in severe, progressing coronary artery disease, laser treatment does not improve global left ventricular function or myocardial perfusion, but it does preserve systolic wall thickening in fixed defects (scar) and prevents ischemic myocardial regions from turning into scar. The MCG repolarization variables are informative in remote myocardial infarction and may perform as well as the conventional QRS criteria in the detection of healed myocardial infarction. These ST-T abnormalities are more pronounced in patients with Q-wave infarction than in patients with non-Q-wave infarctions. MDCT-CA had a sensitivity of 82%, a specificity of 94%, a positive predictive value of 79%, and a negative predictive value of 95% for stenoses over 50% in the main coronary arteries, as compared with conventional coronary angiography, in patients with known coronary artery disease. Left ventricular wall dysfunction, perfusion defects, and infarctions were detected in 50-78% of sectors assigned to calcifications or stenoses, but also in sectors supplied by normally perfused coronary arteries. In patients with severe aortic stenosis, MDCT showed low sensitivity (63%) in detecting obstructive coronary artery disease; massive calcifications complicated correct assessment of the coronary artery lumen.
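The diagnostic accuracy figures quoted for MDCT-CA follow from a 2x2 comparison against the reference standard (conventional coronary angiography). A minimal sketch is given below; the counts are hypothetical, chosen only to yield values of the same order as those quoted, and are not the study data.

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, and NPV from a 2x2 table against the reference standard."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv":         tp / (tp + fp),
        "npv":         tn / (tn + fn),
    }

# Hypothetical segment counts: stenosis > 50% on reference angiography vs. MDCT-CA finding.
print(diagnostic_accuracy(tp=45, fp=12, fn=10, tn=190))
```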

Relevance:

20.00%

Publisher:

Abstract:

Close to one half of LHC events are expected to be due to elastic or inelastic diffractive scattering. Still, predictions based on extrapolations of experimental data at lower energies differ by large factors in estimating the relative rates of the diffractive event categories at LHC energies. By identifying diffractive events, detailed studies of proton structure can be carried out. The combined forward physics objects (rapidity gaps, forward multiplicity, and transverse energy flows) can be used to classify proton-proton collisions efficiently. Data samples recorded by the forward detectors will, with a simple extension, allow first estimates of the single diffractive (SD), double diffractive (DD), central diffractive (CD), and non-diffractive (ND) cross sections. This approach, which uses the measurement of inelastic activity in the forward and central detector systems, is complementary to the detection and measurement of leading beam-like protons. In this investigation, three different multivariate analysis approaches are assessed for classifying forward physics processes at the LHC. It is shown that with gene expression programming, neural networks, and support vector machines, diffraction can be efficiently identified within a large sample of simulated proton-proton scattering events. The event characteristics are visualized using the self-organizing map algorithm.
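One of the three classifiers assessed, the support vector machine, can be set up along the following lines with scikit-learn; the features and the synthetic events below are placeholders standing in for the simulated proton-proton samples, not the actual analysis.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Synthetic stand-in for simulated pp events: forward multiplicity, transverse
# energy flow, and largest rapidity gap; labels 0 = non-diffractive, 1 = diffractive.
rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([
    rng.poisson(lam=np.where(rng.random(n) < 0.5, 5, 20)),  # forward multiplicity
    rng.gamma(shape=2.0, scale=3.0, size=n),                 # transverse energy flow
    rng.exponential(scale=1.5, size=n),                      # largest rapidity gap
])
y = (X[:, 2] > 2.0).astype(int)   # toy labelling rule standing in for the true event class

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.2f}")
```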

Relevance:

20.00%

Publisher:

Abstract:

The nutritional quality of the product, as well as other quality attributes such as microbiological and sensory quality, are essential factors in the baby food industry, and therefore alternative sterilization methods to conventional heating processes are of great interest in this food sector. This report gives an overview of different sterilization techniques for baby food. The report is part of the work done in work package 3, "QACCP Analysis Processing: Quality-driven distribution and processing chain analysis", in the Core Organic ERANET project Quality analysis of critical control points within the whole food chain and their impact on food quality, safety and health (QACCP). The overall objective of the project is to optimise organic production and processing in order to improve food safety as well as nutritional quality and to increase health-promoting aspects in consumer products. The approach is a chain analysis that addresses the link between farm and fork and backwards from fork to farm. The objective is to improve product-related quality management in farming (towards testing food authenticity) and processing (towards food authenticity and sustainable processes). The articles in this volume do not necessarily reflect the Core Organic ERANET's views and in no way anticipate the Core Organic ERANET's future policy in this area. The contents of the articles in this volume are the sole responsibility of the authors. The information contained herein, including any expression of opinion and any projection or forecast, has been obtained from sources believed by the authors to be reliable but is not guaranteed as to accuracy or completeness. The information is supplied without obligation and on the understanding that any person who acts upon it or otherwise changes his/her position in reliance thereon does so entirely at his/her own risk. The writers gratefully acknowledge the financial support from the Core Organic funding bodies: the Ministry of Agriculture and Forestry, Finland; the Swiss Federal Office for Agriculture, Switzerland; and the Federal Ministry of Consumer Protection, Food and Agriculture, Germany.

Relevance:

20.00%

Publisher:

Abstract:

The study investigates whether there is an association between different combinations of emphasis on generic strategies (product differentiation and cost efficiency) and the perceived usefulness of management accounting techniques. Previous research has found that cost leadership is associated with traditional accounting techniques and product differentiation with a variety of modern management accounting approaches. The present study focuses on the possible existence of a strategy that mixes these generic strategies. The empirical results suggest that (a) there is no difference in attitudes towards the usefulness of traditional management accounting techniques between companies that adhere to a single strategy and those that adhere to a mixed strategy; (b) there is no difference in attitudes towards modern and traditional techniques between companies that adhere to a single strategy, whether this is product differentiation or cost efficiency; and (c) companies that favour a mixed strategy seem to have a more positive attitude towards modern techniques than companies adhering to a single strategy.