35 results for Requirements elicitation techniques


Relevance: 20.00%

Abstract:

This thesis, which consists of an introduction and four peer-reviewed original publications, studies the problems of haplotype inference (haplotyping) and local alignment significance. Both problems belong to the broad area of bioinformatics and computational biology. The presented solutions are computationally fast and accurate, which makes them practical in high-throughput sequence data analysis.

Haplotype inference is a computational problem where the goal is to estimate haplotypes from a sample of genotypes as accurately as possible. The problem is important because haplotypes are difficult to measure directly, whereas genotypes are easier to quantify. Haplotypes are the key players when studying, for example, the genetic causes of diseases. In this thesis, three methods are presented for the haplotype inference problem: HaploParser, HIT, and BACH. HaploParser is based on a combinatorial mosaic model and hierarchical parsing that together mimic recombinations and point mutations in a biologically plausible way. In this mosaic model, the current population is assumed to have evolved from a small founder population; thus, the haplotypes of the current population are recombinations of the (implicit) founder haplotypes with some point mutations. HIT (Haplotype Inference Technique) uses a hidden Markov model for haplotypes, and efficient algorithms are presented to learn this model from genotype data. The model structure of HIT is analogous to the mosaic model of HaploParser with founder haplotypes; it can therefore be seen as a probabilistic model of recombinations and point mutations. BACH (Bayesian Context-based Haplotyping) uses a context tree weighting algorithm to efficiently sum over all variable-length Markov chains when evaluating the posterior probability of a haplotype configuration, and algorithms are presented that find haplotype configurations with high posterior probability. BACH is the most accurate method presented in this thesis, with performance comparable to the best available haplotype inference software.

Local alignment significance is a computational problem where one asks whether the local similarities between two sequences are due to the sequences being related, or arise just by chance. Similarity of sequences is measured by their best local alignment score, from which a p-value is computed: the probability that two sequences drawn from the null model have an equally good or better best local alignment score. Local alignment significance is used routinely, for example, in homology searches. In this thesis, a general framework is sketched that allows one to compute a tight upper bound for the p-value of a local pairwise alignment score. Unlike previous methods, the presented framework is not affected by so-called edge effects and can handle gaps (deletions and insertions) without troublesome sampling and curve fitting.
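
To make the p-value definition concrete, here is a minimal Monte Carlo sketch (assuming an i.i.d. uniform DNA null model and unit match/mismatch and gap penalties; the thesis's framework instead derives a tight analytical upper bound rather than sampling):

    import random

    def smith_waterman(a, b, match=1, mismatch=-1, gap=-2):
        """Best local alignment score of strings a and b (textbook O(mn) DP)."""
        prev, best = [0] * (len(b) + 1), 0
        for i in range(1, len(a) + 1):
            cur = [0] * (len(b) + 1)
            for j in range(1, len(b) + 1):
                s = match if a[i - 1] == b[j - 1] else mismatch
                cur[j] = max(0, prev[j - 1] + s, prev[j] + gap, cur[j - 1] + gap)
                best = max(best, cur[j])
            prev = cur
        return best

    def local_alignment_pvalue(score, m, n, trials=1000, alphabet="ACGT"):
        """Estimate P(best local alignment score >= score) for a null-model pair."""
        hits = sum(
            smith_waterman(
                "".join(random.choices(alphabet, k=m)),
                "".join(random.choices(alphabet, k=n)),
            ) >= score
            for _ in range(trials)
        )
        return hits / trials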

Relevance: 20.00%

Abstract:

The mobile phone has, as a device, taken the world by storm in the past decade: from only 136 million phones globally in 1996, it is now estimated that by the end of 2008 roughly half of the world's population will own a mobile phone. Over the years, the capabilities of both the phones and the networks have increased tremendously, reaching the point where the devices are better called miniature computers than simply mobile phones. The mobile industry is currently undertaking several initiatives to develop new generations of mobile network technologies, technologies that to a large extent focus on offering ever-increasing data rates. This thesis seeks to answer the question of whether the future mobile networks in development and the future mobile services are in sync: taking a forward-looking timeframe of five to eight years, will there be services that need the high-performance new networks being planned? The question is especially pertinent in light of the slower-than-expected takeoff of 3G data services. Current and future mobile services are analyzed from two viewpoints: first, the gradual, evolutionary development of the services, and second, the identification of potential revolutionary new mobile services. With information on both current and future mobile networks and services, a mapping of network capabilities to service requirements is performed to identify which services will work in which networks. Based on the analysis, it is far from certain that the new mobile networks, especially those planned for deployment after HSPA, will be needed as soon as current roadmaps suggest. True service-based demand for the "beyond HSPA" technologies may lie many years in the future, or may indeed never materialize, thanks to the increasing deployment of local-area wireless broadband technologies.

Relevance: 20.00%

Abstract:

The usual task in music information retrieval (MIR) is to find occurrences of a monophonic query pattern within a music database, which can contain both monophonic and polyphonic content. The so-called query-by-humming systems are a famous instance of content-based MIR: the user's hummed query is converted into symbolic form to perform search operations in a similarly encoded database. The symbolic representation (e.g., textual, MIDI, or vector data) is typically a quantized and simplified version of the sampled audio data, yielding faster search algorithms and space requirements that can be met in real-life situations. In this thesis, we investigate geometric approaches to MIR. We first study some musicological properties often needed in MIR algorithms, and then review the literature on traditional (e.g., string-matching-based) MIR algorithms and on novel techniques based on geometry. We also introduce some concepts from digital image processing, namely mathematical morphology, which we use to develop and implement four algorithms for geometric music retrieval. The symbolic representation in the case of our algorithms is a binary 2-D image. We apply various morphological pre- and post-processing operations to the query and database images to perform template matching / pattern recognition on the images. The algorithms are essentially extensions of the classic image correlation and hit-or-miss transformation techniques widely used in template matching applications. They aim to become a future extension to the retrieval engine of C-BRAHMS, a research project of the Department of Computer Science at the University of Helsinki.
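
To make the morphological matching idea concrete, here is a minimal sketch (not the thesis's four algorithms) in which the score is a binary piano-roll image and binary erosion by the query pattern marks every position where all query notes are present; scipy's binary_hit_or_miss could additionally require silence elsewhere. The note data are invented placeholders:

    import numpy as np
    from scipy import ndimage

    # Piano-roll style binary image: rows = MIDI pitch, columns = time steps.
    db = np.zeros((128, 64), dtype=bool)
    for time, pitch in [(0, 60), (1, 62), (2, 64), (10, 60), (11, 62), (12, 64)]:
        db[pitch, time] = True  # placeholder notes: two ascending figures

    # Query: a three-note ascending figure as a small binary mask
    # (pitch offsets 0, +2, +4 at successive time steps).
    query = np.zeros((5, 3), dtype=bool)
    query[0, 0] = query[2, 1] = query[4, 2] = True

    # Erosion by the query marks each placement where all query notes occur
    # in the database image; extra notes around them are tolerated.
    hits = ndimage.binary_erosion(db, structure=query)
    print(np.argwhere(hits))  # centers of the matching placements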

Relevance: 20.00%

Abstract:

Requirements engineering is an important phase in software development in which the customer's needs and expectations are transformed into a software requirements specification. The requirements specification can be considered an agreement between the customer and the developer in which both parties agree on the expected system features and behaviour. However, requirements engineers must deal with a variety of issues that complicate the requirements process; the communication gap between the customer and the developers is among the typical reasons for unsatisfactory requirements. In this thesis we study how the use case technique could be used in requirements engineering to bridge the communication gap between the customer and the development team. We also discuss how use case descriptions can be used as a basis for acceptance test cases.
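
As a loose illustration of the use-case-to-acceptance-test idea (a sketch under assumed details, not an example from the thesis), each step of a use case's main success scenario can become one step of an executable acceptance test. The LibraryApp system below is a hypothetical stand-in, stubbed out so the test actually runs:

    import unittest

    class LibraryApp:
        """Minimal in-memory stand-in for the system under test (illustrative only)."""
        def __init__(self):
            self.members, self.copies, self.loans = set(), set(), {}
        def register_member(self, name):
            self.members.add(name)
        def add_copy(self, isbn):
            self.copies.add(isbn)
        def login(self, name):
            assert name in self.members
            return Session(self, name)

    class Session:
        def __init__(self, app, name):
            self.app, self.name = app, name
        def borrow(self, isbn):
            assert isbn in self.app.copies
            self.app.loans.setdefault(self.name, []).append(isbn)
            return type("Receipt", (), {"due_in_days": 28})()
        def current_loans(self):
            return self.app.loans.get(self.name, [])

    class BorrowBookAcceptanceTest(unittest.TestCase):
        """Acceptance test derived step by step from the use case 'Borrow a book'."""
        def setUp(self):
            self.app = LibraryApp()
            self.app.register_member("anna")
            self.app.add_copy("978-0132350884")
        def test_main_success_scenario(self):
            session = self.app.login("anna")             # Step 1: member identifies herself
            receipt = session.borrow("978-0132350884")   # Step 2: member borrows a copy
            self.assertEqual(receipt.due_in_days, 28)    # Step 3: loan recorded with due date
            self.assertIn("978-0132350884", session.current_loans())

    if __name__ == "__main__":
        unittest.main()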

Relevance: 20.00%

Abstract:

Despite much research on forest biodiversity in Fennoscandia, the exact mechanisms of species declines in dead-wood-dependent fungi are still poorly understood. In particular, there is only limited information on why certain fungal species have responded negatively to habitat loss and fragmentation while others have not. Understanding the mechanisms behind species declines would be essential for designing and developing ecologically effective and scientifically informed conservation measures, and management practices that would promote biodiversity in production forests.

In this thesis I study the ecology of polypores and their responses to forest management, with a particular focus on why some species have declined more than others. The data considered in the thesis comprise altogether 98,318 dead-wood objects, with 43,085 observations of 174 fungal species; 1,964 of these observations represent 58 red-listed species. The data were collected from 496 sites, including woodland key habitats, clear-cuts with retention trees, mature managed forests, and natural or natural-like forests in southern Finland and Russian Karelia.

I show that the most relevant way of measuring resource availability can differ greatly between species that seemingly share the same resources; it is thus critical to measure the availability of resources in a way that takes into account the ecological requirements of each species. The results show that connectivity at the local, landscape, and regional scales is important especially for the highly specialized species, many of which are also red-listed. Habitat loss and fragmentation affect not only species diversity but also the relative abundances of the species and, consequently, species interactions and fungal successional pathways. Changes in species distributions and abundances are likely to affect the food chains in which wood-inhabiting fungi are involved, and thus the functioning of the whole forest ecosystem.

The findings of my thesis highlight the importance of protecting well-connected, large, high-quality forest areas to maintain forest biodiversity. Small habitat patches distributed across the landscape are likely to contribute only marginally to the protection of red-listed species, especially if habitat quality is not substantially higher than in ordinary managed forest, as is the case with woodland key habitats. Key habitats might supplement the forest protection network if they were delineated as larger areas and if harvesting of individual trees within them was prohibited. Taking the landscape perspective into account in the design and development of conservation measures is critical when striving to halt the decline of forest biodiversity in an ecologically effective manner.

Relevance: 20.00%

Abstract:

The technological development of fast multisection, helical computed tomography (CT) scanners has made CT perfusion (CTp) and CT angiography (CTA) feasible in the evaluation of acute ischemic stroke. This study focuses on new multidetector computed tomography techniques, namely whole-brain and first-pass CT perfusion, plus CTA of the carotid arteries.

Whole-brain CTp data are acquired during slow infusion of contrast material to achieve a constant contrast concentration in the cerebral vasculature. From these data, quantitative maps of perfused cerebral blood volume (pCBV) are constructed. The probability of cerebral infarction as a function of normalized pCBV was determined in patients with acute ischemic stroke. Normalized pCBV, expressed as a percentage of contralateral normal brain pCBV, was determined in the infarction core and in regions just inside and outside the boundary between infarcted and noninfarcted brain. The corresponding probabilities of infarction were 0.99, 0.96, and 0.11, R² was 0.73, and the differences in perfusion between the core and the inner and outer bands were highly significant. A probability-of-infarction curve can thus help predict the likelihood of infarction as a function of percentage normalized pCBV.

First-pass CT perfusion is based on continuous cine imaging over a selected brain area during a bolus injection of contrast. During its first passage, the contrast material compartmentalizes in the intravascular space, resulting in transient tissue enhancement; functional maps of cerebral blood flow (CBF), cerebral blood volume (CBV), and mean transit time (MTT) are then constructed. We compared the effects of three different iodine concentrations (300, 350, or 400 mg/mL) on peak enhancement of normal brain tissue, artery, and vein, stratified by region-of-interest (ROI) location, in 102 patients within 3 hours of stroke onset. Monotonically increasing peak opacification was evident at all ROI locations, suggesting that CTp evaluation of patients with acute stroke is best performed with the highest available concentration of contrast agent. In another study we investigated whether lesion volumes on CBV, CBF, and MTT maps within 3 hours of stroke onset predict final infarct volume, and whether all these parameters are needed for triage to intravenous recombinant tissue plasminogen activator (IV-rtPA). The effect of IV-rtPA on the affected brain was also investigated by measuring the volume of salvaged tissue in patients receiving IV-rtPA and in controls. CBV lesion volume did not necessarily represent dead tissue. MTT lesion volume alone can serve to identify the upper size limit of the abnormally perfused brain, and patients receiving IV-rtPA salvaged more brain than did controls.

Carotid CTA was compared with carotid DSA in grading stenosis in patients with stroke symptoms. In CTA, the grade of stenosis was determined by means of axial source and maximum intensity projection (MIP) images as well as a semiautomatic vessel analysis. CTA provides an adequate, less invasive alternative to conventional DSA, although it tends to underestimate clinically relevant grades of stenosis.
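
As a rough sketch of how such a probability-of-infarction curve can be obtained (assuming a logistic shape; the observations below are invented placeholders, not data from the study, and the fitting procedure is ordinary logistic regression rather than the study's actual method):

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Invented placeholder observations: normalized pCBV (% of contralateral)
    # against infarction outcome (1 = infarcted on follow-up, 0 = spared).
    pcbv = np.array([10, 20, 30, 40, 50, 60, 70, 80, 90, 100], dtype=float).reshape(-1, 1)
    infarcted = np.array([1, 1, 1, 1, 1, 0, 1, 0, 0, 0])

    model = LogisticRegression().fit(pcbv, infarcted)
    # Probability of infarction at a few pCBV levels on the fitted curve.
    for x in (30.0, 55.0, 80.0):
        p = model.predict_proba([[x]])[0, 1]
        print(f"P(infarction | pCBV = {x:.0f}%) = {p:.2f}")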

Relevance: 20.00%

Abstract:

Pediatric renal transplantation (TX) has evolved greatly during the past few decades, and today TX is considered the standard of care for children with end-stage renal disease. In Finland, 191 children had received renal transplants by October 2007, and 42% of them have already reached adulthood. Improvements in the treatment of end-stage renal disease, surgical techniques, intensive care medicine, and immunosuppressive therapy have paved the way to the current highly successful outcomes of pediatric transplantation. In children, the transplanted graft should last for decades, and normal growth and development should be guaranteed. These objectives set considerable requirements on optimizing and fine-tuning the post-operative therapy: careful optimization of immunosuppressive therapy is crucial in protecting the graft against rejection, but also in protecting the patient against adverse effects of the medication.

The present study reports the results of a retrospective investigation into individualized dosing of immunosuppressive medication, based on pharmacokinetic profiles, therapeutic drug monitoring, graft function and histology studies, and glucocorticoid biological activity determinations. Subgroups of a total of 178 patients who received renal transplants in 1988–2006 were included in the study. The mean age at TX was 6.5 years, and approximately 26% of the patients were <2 years of age. The most common diagnosis leading to renal TX was congenital nephrosis of the Finnish type (NPHS1). Pediatric patients in Finland receive standard triple immunosuppression consisting of cyclosporine A (CsA), methylprednisolone (MP), and azathioprine (AZA) after renal TX. Optimal dosing of these agents is important to prevent rejections and preserve graft function on the one hand, and to avoid the potentially serious adverse effects on the other.

CsA has a narrow therapeutic window and individually variable pharmacokinetics, so therapeutic monitoring of CsA is mandatory. Traditionally, CsA monitoring has been based on pre-dose trough levels (C0), but recent pharmacokinetic and clinical studies have revealed that the immunosuppressive effect may be related to diurnal CsA exposure and to the blood CsA concentration 0-4 hours after dosing; the two-hour post-dose concentration (C2) has proved a reliable surrogate marker of CsA exposure. Individual starting doses of CsA were analyzed in 65 patients. A recommended dose based on a pre-TX pharmacokinetic study was calculated for each patient by the pre-TX protocol. The predicted dose was clearly higher in the youngest children than in the older ones (22.9±10.4 and 10.5±5.1 mg/kg/d in patients <2 and >8 years of age, respectively). The actually administered oral doses of CsA were collected for three weeks after TX and compared with the pharmacokinetically predicted dose. After TX, the dosing of CsA was adjusted according to clinical parameters and the blood CsA trough concentration. The pharmacokinetically predicted dose and patient age were the two significant parameters explaining post-TX doses of CsA; accordingly, young children received significantly higher oral doses of CsA than older ones. The correlation with the actually administered doses after TX was best in those patients whose predicted dose was clearly higher or lower (> ±25%) than the average in their age group. Because of the great individual variation in pharmacokinetics, standardized dosing of CsA (based on body mass or surface area) may not be adequate; pre-TX profiles are helpful in determining suitable initial CsA doses.

CsA monitoring based on trough and C2 concentrations was analyzed in 47 patients who received renal transplants in 2001–2006. C0, C2, and experienced acute rejections were recorded during the post-TX hospitalization, and again three months after TX when the first protocol core biopsy was obtained. The patients who remained rejection-free had slightly higher C2 concentrations, especially very early after TX; after the first two weeks, the trough level was also higher in the rejection-free patients than in those with acute rejections. Three months after TX, the trough level was higher in patients with normal histology than in those with rejection changes in the routine biopsy. Monitoring of both the trough level and C2 may thus be warranted, to guarantee a sufficient peak concentration and baseline immunosuppression on the one hand and to avoid over-exposure on the other.

Controlling rejection in the early months after transplantation is crucial, as it may contribute to the development of long-term allograft nephropathy. Recently, it has become evident that immunoactivation fulfilling the histological criteria of acute rejection is possible in a well-functioning graft with no clinical signs or laboratory perturbations. The influence of treating subclinical rejection, diagnosed in the 3-month protocol biopsy, on graft function and histology 18 months after TX was analyzed in 22 patients and compared with 35 historical control patients. The incidence of subclinical rejection at three months was 43%, and the patients received a standard rejection treatment (a course of increased MP) and/or increased baseline immunosuppression, depending on the severity of rejection and graft function. Glomerular filtration rate (GFR) at 18 months was significantly better in the patients who were screened and treated for subclinical rejection than in the historical patients (86.7±22.5 vs. 67.9±31.9 ml/min/1.73 m², respectively). The improvement was most remarkable in the youngest (<2 years) age group (94.1±11.0 vs. 67.9±26.8 ml/min/1.73 m²). Histological findings of chronic allograft nephropathy were also more common in the historical patients in the 18-month protocol biopsy.

All pediatric renal TX patients receive MP as part of the baseline immunosuppression. Although the maintenance dose of MP is very low in the majority of patients, the well-known steroid-related adverse effects are not uncommon. A previous study in Finnish pediatric TX patients showed that steroid exposure, measured as the area under the concentration-time curve (AUC), rather than the dose, correlates with the adverse effects. In the present study, MP AUC was measured in sixteen stable maintenance patients, and a correlation was found with excess weight gain during the 12 months after TX as well as with height deficit. A novel bioassay measuring the activation of the glucocorticoid-receptor-dependent transcription cascade was also employed to assess the biological effect of MP. Glucocorticoid bioactivity was found to be related to the adverse effects, although the relationship was not as apparent as that with serum MP concentration.

The findings of this study support individualized monitoring and adjustment of immunosuppression based on pharmacokinetics, graft function, and histology. Pharmacokinetic profiles are helpful in estimating drug exposure and thus in identifying patients who might be at risk of excessive or insufficient immunosuppression. Individualized doses and monitoring of blood concentrations should definitely be employed with CsA, but possibly also with steroids. As an alternative to complete steroid withdrawal, individualized dosing based on drug-exposure monitoring might help in avoiding the adverse effects. Early screening and treatment of subclinical immunoactivation is beneficial, as it improves the prospects of good long-term graft function.
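
Drug exposure expressed as an AUC is computed from a sampled concentration-time profile; here is a minimal sketch using the trapezoidal rule, with an invented placeholder MP profile (the times and concentrations are not data from the study):

    import numpy as np

    # Hypothetical methylprednisolone concentration-time profile after an oral dose.
    t = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 6.0, 8.0, 12.0])      # hours post-dose
    c = np.array([0.0, 45.0, 80.0, 60.0, 30.0, 15.0, 8.0, 2.0])  # ng/mL

    # Exposure over the sampling window, by the trapezoidal rule.
    auc = np.trapz(c, t)
    print(f"AUC(0-12 h) = {auc:.0f} ng*h/mL")

    # Single-point surrogates of the kind used for CsA in the study:
    # trough (C0) and the two-hour post-dose concentration (C2).
    c0, c2 = c[0], c[t.tolist().index(2.0)]
    print(f"C0 = {c0} ng/mL, C2 = {c2} ng/mL")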

Relevance: 20.00%

Abstract:

Conventional invasive coronary angiography is the clinical gold standard for detecting coronary artery stenoses. Noninvasive multidetector computed tomography (MDCT) in combination with retrospective ECG gating has recently been shown to permit visualization of the coronary artery lumen and detection of coronary artery stenoses. Single photon emission computed tomography (SPECT) perfusion imaging has been considered the reference method for evaluating nonviable myocardium, but magnetic resonance imaging (MRI) can accurately depict structure, function, effusion, and myocardial viability, with an overall capacity unmatched by any other single imaging modality. Magnetocardiography (MCG) noninvasively provides information about myocardial excitation propagation and repolarization without the use of electrodes; this evolving technique may be considered the magnetic equivalent of electrocardiography.

The aim of the present series of studies was to evaluate changes in the myocardium caused by coronary artery disease as assessed with SPECT and MRI, to examine the capability of multidetector computed tomography coronary angiography (MDCT-CA) to detect significant stenoses in the coronary arteries, and to assess remote myocardial infarctions with MCG. Our study showed that in severe, progressing coronary artery disease, laser treatment does not improve global left ventricular function or myocardial perfusion, but it does preserve systolic wall thickening in fixed defects (scar) and prevents ischemic myocardial regions from turning into scar. The MCG repolarization variables are informative in remote myocardial infarction and may perform as well as the conventional QRS criteria in detecting healed myocardial infarction; these ST-T abnormalities are more pronounced in patients with Q-wave infarctions than in patients with non-Q-wave infarctions.

MDCT-CA had a sensitivity of 82%, a specificity of 94%, a positive predictive value of 79%, and a negative predictive value of 95% for stenoses over 50% in the main coronary arteries, as compared with conventional coronary angiography in patients with known coronary artery disease. Left ventricular wall dysfunction, perfusion defects, and infarctions were detected in 50-78% of sectors assigned to calcifications or stenoses, but also in sectors supplied by normally perfused coronary arteries. Our study showed a low sensitivity (63%) of MDCT in detecting obstructive coronary artery disease in patients with severe aortic stenosis; massive calcifications complicated correct assessment of the coronary artery lumen.
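
For reference, the reported predictive values follow from sensitivity, specificity, and disease prevalence via Bayes' rule. In the sketch below, the prevalence of roughly 22% is our back-calculated assumption (not a figure from the study), chosen because it reproduces the reported PPV of about 0.79 and NPV of about 0.95:

    def predictive_values(sens, spec, prev):
        """Positive and negative predictive values from sensitivity,
        specificity, and disease prevalence (Bayes' rule)."""
        ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
        npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
        return ppv, npv

    # Reported MDCT-CA operating point; prevalence 0.22 is an assumption.
    ppv, npv = predictive_values(sens=0.82, spec=0.94, prev=0.22)
    print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")  # -> PPV = 0.79, NPV = 0.95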

Relevance: 20.00%

Abstract:

Active mobilization after repair of a flexor tendon injury of the finger has been found to lead to a better functional outcome than the dynamic mobilization currently in general use. The problem with active mobilization is an increased risk of repair failure, because current suture techniques are not strong enough. The strength of tendon repairs has been improved by developing multistrand suture techniques in which several parallel core sutures are placed in the tendon; their clinical use is limited, however, by the complicated and time-consuming surgical technique. Nonabsorbable suture materials are generally used in flexor tendon repair of the hand, as the bioabsorbable sutures currently in use lose their strength too quickly relative to tendon healing. The tensile-strength half-life and tissue properties of bioabsorbable lactide stereocopolymer (PLDLA) 96/4 suture have previously been found suitable for flexor tendon repair.

The aim of this study was to develop a flexor tendon repair method for the hand that withstands immediate active mobilization and is simple to perform, using bioabsorbable PLDLA 96/4 material. The biomechanical properties of five commonly used flexor tendon sutures were analyzed in static tensile testing to determine how three structural properties of the core suture (the number of strands, the suture caliber, and the suture configuration) affect the failure and strength of the repair. Visible gapping of the repairs was found to begin when the peripheral suture failed, at the yield point of the force-displacement curve. Increasing the number of core suture strands improved the holding capacity of the suture in the tendon and increased the yield force of the repair; using a thicker (stronger) suture or changing the suture configuration, by contrast, did not affect the yield force.

On the basis of these results, the possibility of increasing the holding capacity of the suture in the tendon was investigated with a simple multistrand suture in which the core suture was made with a three-strand polyester suture, either conventional or with a tape-like structure. The tape-like structure significantly increased the holding capacity of the suture in the tendon, improving both the yield force and the maximum force; the strength of the repair exceeded the load imposed on a tendon repair by active mobilization.

The suitability of PLDLA 96/4 suture for flexor tendon repair was assessed by studying its biomechanical and knot-holding properties in static tensile testing, in comparison with the braided polyester suture (Ticron®) most commonly used in tendon repair. PLDLA suture was found to be well suited to flexor tendon repair, as it stretches less than polyester suture and its knot security is better. In the final phase, the durability of a tendon repair made with a three-strand, tape-like repair device manufactured from PLDLA 96/4 suture was studied in static tensile testing and in cyclic loading, which simulates the repetitive loading of mobilization better than static testing does. The strength of the PLDLA repair exceeded the level required by active mobilization in both static and cyclic loading.

A flat, tape-like suture material has not previously been studied or used in flexor tendon repair of the hand. In this study, the tape-like structure of the suture material significantly improved the strength of the tendon repair, which is thought to result from the increased contact area between the tendon and the suture material, preventing the suture from cutting through the tendon. The strength of a tendon repair made with a tape-like, three-strand suture of bioabsorbable PLDLA material reached the level required by active mobilization. In addition, the new method is easy to use and avoids the problems associated with the technical execution of traditional multistrand sutures.

Relevance: 20.00%

Abstract:

Close to one half of LHC events are expected to be due to elastic or inelastic diffractive scattering. Still, predictions based on extrapolations of experimental data at lower energies differ by large factors in estimating the relative rates of the diffractive event categories at LHC energies. By identifying diffractive events, detailed studies of proton structure can be carried out. The combined forward physics objects (rapidity gaps, forward multiplicity, and transverse energy flow) can be used to classify proton-proton collisions efficiently. Data samples recorded by the forward detectors will, with a simple extension, allow first estimates of the single diffractive (SD), double diffractive (DD), central diffractive (CD), and non-diffractive (ND) cross sections. This approach, which uses the measurement of inelastic activity in the forward and central detector systems, is complementary to the detection and measurement of leading beam-like protons. In this investigation, three different multivariate analysis approaches are assessed for classifying forward physics processes at the LHC. It is shown that with gene expression programming, neural networks, and support vector machines, diffraction can be efficiently identified within a large sample of simulated proton-proton scattering events. The event characteristics are visualized using the self-organizing map algorithm.
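
A minimal sketch of the support-vector-machine approach on toy stand-ins for the forward observables (largest rapidity gap, forward multiplicity, forward transverse energy); the simulated distributions below are invented placeholders, not the thesis's Monte Carlo samples:

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    n = 2000

    # Toy features per event: (largest rapidity gap, forward multiplicity, forward E_T).
    # Diffractive events (label 1) tend to have larger gaps and quieter forward regions.
    diffractive = np.column_stack(
        [rng.normal(4.0, 1.0, n), rng.poisson(3, n), rng.gamma(2.0, 2.0, n)]
    )
    nondiffractive = np.column_stack(
        [rng.normal(1.0, 0.8, n), rng.poisson(12, n), rng.gamma(5.0, 3.0, n)]
    )
    X = np.vstack([diffractive, nondiffractive])
    y = np.array([1] * n + [0] * n)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    clf.fit(X_tr, y_tr)
    print(f"diffractive vs non-diffractive accuracy: {clf.score(X_te, y_te):.2f}")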