22 results for Difficult-to-Measure Nuclides

in Helda - Digital Repository of the University of Helsinki


Relevance:

100.00%

Publisher:

Abstract:

Type 1 diabetes (T1D) is considered to be an autoimmune disease. The cause of T1D is the destruction of insulin-producing β-cells in the pancreatic islets. The autoimmune nature of T1D is characterized by the presence of autoreactive T-cells and autoantibodies against β-cell molecules. Insulin is the only β-cell-specific autoantigen associated with T1D, but insulin autoantibodies (IAAs) are difficult to measure with proper sensitivity. T-cell assays for detection of autoreactive T-cells, such as insulin-specific T-cells, have also proven to be difficult to perform. The genetic risk of T1D is associated with the HLA gene region, but environmental factors also play an important role. The most studied environmental risk factors of T1D are enteroviruses and cow's milk, which both affect the immune system through the gut. One hypothesis is that the insulin-specific immune response develops against bovine insulin in cow's milk during early infancy and later spreads to include human insulin. The aims of this study were to determine whether the separation of immunoglobulin (Ig)G from plasma would improve the sensitivity of the IAA assay and how insulin treatment affects the cellular immune response to insulin in newly diagnosed patients. Furthermore, the effect of insulin concentration in mother's breast milk on the development of antibodies to dietary insulin in the child was examined. Small intestinal biopsies were also obtained from children with T1D to characterize any immunological changes associated with T1D in the gut. The isolation of the IgG fraction from the plasma of T1D patients negative for plasma IAA led to detectable IAA levels that exceeded those in the control children. Thus the isolation of IgG may improve the sensitivity of the IAA assay. The effect of insulin treatment on insulin-specific T-cells was studied by culturing peripheral blood mononuclear cells with insulin. Insulin stimulation induced higher expression of regulatory T-cell markers, such as Foxp3, in patients treated with insulin than in patients examined before initiating insulin treatment. This finding suggests that insulin treatment in patients with T1D stimulates regulatory T-cells in vivo, and this may partly explain the difficulties in measuring autoantigen-specific T-cell responses in recently diagnosed patients. The stimulation of regulatory T-cells by insulin treatment may also explain the remission period often seen after initiating insulin treatment. In the third study we showed that the insulin concentration in mother's breast milk correlates inversely with the levels of bovine insulin-specific antibodies in those infants who were exposed to cow's milk proteins in their diet, suggesting that human insulin in breast milk induces tolerance to dietary bovine insulin. However, in infants who later developed T1D-associated autoantibodies, the insulin concentration in their mother's breast milk was increased. This finding may indicate that in children prone to β-cell autoimmunity, breast milk insulin does not promote tolerance to insulin. In the small intestinal biopsies the expression of several immunological markers was quantified with RT-PCR. Of these markers, the expression of the interleukin (IL)-18 cytokine was significantly increased in the gut of patients with T1D compared with children with celiac disease or control children. The increased IL-18 expression lends further support to the hypothesis that the gut immune system is involved in the pathogenesis of T1D.

Relevance:

100.00%

Publisher:

Abstract:

It is essential to have a thorough understanding of the sources and sinks of oxidized nitrogen (NOy) in the atmosphere, since it strongly influences tropospheric chemistry and the eutrophication of ecosystems. One unknown component in the balance of gaseous oxidized nitrogen is vegetation. Plants absorb nitrogenous species from the air via the stomata, but it is not clear whether plants can also emit them at low ambient concentrations. The possible emissions are small and difficult to measure. The aim of this thesis was to analyse an observation made in southern Finland at the SMEAR II station: solar ultraviolet radiation (UV) induced NOy emissions in chambers measuring the gas exchange of Scots pine (Pinus sylvestris L.) shoots. Both measuring and modelling approaches were used in the study. The measurements were performed under non-controlled field conditions at low ambient NOy concentrations. The chamber blank, i.e. artefact NOy emissions from the chamber walls, was dependent on the UV irradiance and increased with time after renewing the Teflon film on chamber surfaces. The contribution of each pine shoot to the total NOy emissions in the chambers was determined by testing whether the emissions decrease when the shoots are removed from their chambers. Emissions did decrease, but only when the chamber interior was exposed to UV radiation. It was concluded that the pine shoots also emit NOy. The possible effects of transpiration on the chamber blank are discussed in the summary part of the thesis, based on previously unpublished data. The possible processes underlying the UV-induced NOy emissions were reviewed. Surface reactions were more likely than metabolic processes. Photolysis of nitrate deposited on the needles may have generated the NOy emissions; the measurements supported this hypothesis. In that case, the emissions would apparently consist mainly of nitrogen dioxide (NO2), nitric oxide (NO) and nitrous acid (HONO). Within studies on the NOy exchange of plants, the gases most frequently studied are NO2 and NO (=NOx). In the present work, the implications of the emissions for the NOx exchange of pine were analysed with a model including both NOy emissions and NOy absorption. The model suggested that if the emissions exist, pines can act as an NOx source rather than a sink, even under relatively high ambient concentrations.
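
The source-versus-sink conclusion follows from a simple balance between a UV-driven surface emission and a concentration-dependent uptake. The sketch below illustrates that balance and the resulting compensation point; it is only a schematic toy model with made-up parameter values, not the model used in the thesis.

```python
def net_nox_flux(uv: float, c_ambient: float,
                 emission_per_uv: float = 0.05,    # hypothetical emission factor
                 uptake_conductance: float = 0.01  # hypothetical uptake conductance
                 ) -> float:
    """Net shoot-scale NOx flux (arbitrary units): positive = emission, negative = uptake.

    Emission is assumed proportional to UV irradiance (surface photolysis of
    deposited nitrate); uptake is assumed proportional to the ambient concentration
    (stomatal and surface absorption). All parameter values are illustrative only.
    """
    emission = emission_per_uv * uv
    uptake = uptake_conductance * c_ambient
    return emission - uptake

def compensation_point(uv: float, emission_per_uv: float = 0.05,
                       uptake_conductance: float = 0.01) -> float:
    """Ambient concentration at which emission and uptake balance."""
    return emission_per_uv * uv / uptake_conductance

# Under bright UV the shoot stays a net source up to a fairly high ambient concentration.
print(net_nox_flux(uv=10.0, c_ambient=30.0))   # positive -> source
print(compensation_point(uv=10.0))             # ambient level where the flux changes sign
```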

Relevance:

100.00%

Publisher:

Abstract:

The era of large sequencing projects has enabled unprecedented possibilities of investigating more complex aspects of living organisms. Among the high-throughput technologies based on the genomic sequences, DNA microarrays are widely used for many purposes, including the measurement of the relative quantity of messenger RNAs. However, the reliability of microarrays has been strongly doubted, as robust analysis of the complex microarray output data was developed only after the technology had already spread in the community. An objective of this study was to increase the performance of microarrays, measured by the successful validation of the results by independent techniques. To this end, emphasis was given to the possibility of selecting candidate genes with remarkable biological significance within a specific experimental design. Along with literature evidence, re-annotation of the probes and model-based normalization algorithms were found to be beneficial when analyzing Affymetrix GeneChip data. Typically, the analysis of microarrays aims at selecting genes whose expression is significantly different in different conditions and then grouping them into functional categories, enabling a biological interpretation of the results. Another approach investigates the global differences in the expression of functionally related groups of genes. Here, this technique has been effective in discovering patterns related to temporal changes during infection of human cells. Another aspect explored in this thesis is the possibility of combining independent gene expression data for creating a catalog of genes that are selectively expressed in healthy human tissues. Not all the genes present in human cells are active; some, involved in basic activities (named housekeeping genes), are expressed ubiquitously. Other genes (named tissue-selective genes) provide more specific functions, and they are expressed preferably in certain cell types or tissues. Defining the tissue-selective genes is also important, as these genes can cause diseases with phenotypes in the tissues where they are expressed. The hypothesis that gene expression could be used as a measure of the relatedness of tissues was also confirmed. Microarray experiments provide long lists of candidate genes that are often difficult to interpret and prioritize. Extending the power of microarray results is possible by inferring the relationships of genes under certain conditions. Gene transcription is constantly regulated by the coordinated binding of proteins, named transcription factors, to specific portions of its promoter sequence. In this study, the analysis of promoters from groups of candidate genes was utilized for predicting gene networks and highlighting modules of transcription factors playing a central role in the regulation of their transcription. Specific modules were found to regulate the expression of genes selectively expressed in the hippocampus, an area of the brain with a central role in Major Depressive Disorder. Similarly, gene networks derived from microarray results have elucidated aspects of the development of the mesencephalon, another region of the brain involved in Parkinson's Disease.
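
To make the gene-selection step concrete, here is a minimal, generic sketch of picking differentially expressed genes with a per-gene t-test and Benjamini-Hochberg false discovery rate control. It illustrates the general approach only, not the probe re-annotation and model-based normalization pipeline described above, and all data in it are simulated.

```python
import numpy as np
from scipy.stats import ttest_ind

def differentially_expressed(expr_a: np.ndarray, expr_b: np.ndarray,
                             fdr: float = 0.05) -> np.ndarray:
    """Indices of genes differentially expressed between two conditions.

    expr_a and expr_b are genes x samples matrices of (log-scale) expression
    values. A per-gene two-sample t-test is followed by Benjamini-Hochberg
    correction; indices of genes passing the FDR threshold are returned.
    """
    _, pvals = ttest_ind(expr_a, expr_b, axis=1)
    m = len(pvals)
    order = np.argsort(pvals)
    ranked = pvals[order]
    # Benjamini-Hochberg step-up rule: largest rank k with p_(k) <= (k/m) * fdr.
    passed = ranked <= (np.arange(1, m + 1) / m) * fdr
    if not passed.any():
        return np.array([], dtype=int)
    k = np.nonzero(passed)[0].max()
    return np.sort(order[:k + 1])

# Toy example: 100 genes, 4 samples per condition, with the first 5 genes shifted.
rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0, size=(100, 4))
b = rng.normal(0.0, 1.0, size=(100, 4))
b[:5] += 3.0                      # simulated up-regulation in condition B
print(differentially_expressed(a, b))
```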

Relevance:

100.00%

Publisher:

Abstract:

IgA nephropathy (IgAN) is the most common primary glomerulonephritis. In one third of the patients the disease progresses, and they eventually need renal replacement therapy. IgAN is in most cases a slowly progressing disease; the prediction of progression has been difficult, and the results of studies have been conflicting. Henoch-Schönlein nephritis (HSN) is rare in adults, and prediction of the outcome is even more difficult than in IgAN. This study was conducted to evaluate the clinical and histopathological features and predictors of the outcome of IgAN and HSN diagnosed in one centre (313 IgAN patients and 38 HSN patients), and especially in patients with normal renal function at the time of renal biopsy. The study also aimed to evaluate whether there is a difference in the progression rates in four countries (259 patients from Finland, 112 from the UK, 121 from Australia and 274 from Canada), and if so, whether this can be explained by differences in renal biopsy policy. The third aim was to measure the urinary excretion of the cytokines interleukin-1β (IL-1β) and interleukin-1 receptor antagonist (IL-1ra) in patients with IgAN and HSN, and the correlation of the excretion of these substances with histopathological damage and clinical factors. A large proportion of the patients diagnosed in Helsinki as having IgAN had normal renal function (161/313 patients). Four factors (hypertension, higher amounts of urinary erythrocytes, severe arteriolosclerosis and a higher glomerular score) that independently predicted progression in logistic regression analysis were identified in mild disease. There was geographic variability in renal survival in patients with IgAN. When age, level of renal function, proteinuria and blood pressure were taken into account, the variability was found to relate mostly to lead-time bias and renal biopsy indications. Proteinuria of more than 0.4 g/24 h was the only factor that was significantly related to the progression of HSN. Hypertension and the level of renal function were found to be factors predicting outcome in patients with normal renal function at the time of diagnosis. In IgAN patients, IL-1ra excretion into urine was found to be decreased compared with HSN patients and healthy controls. Patients with a high IL-1ra/IL-1β ratio had milder histopathological changes in renal biopsy than patients with a low or normal IL-1ra/IL-1β ratio. It was also found that the excretion of IL-1β, and especially of IL-1ra, was significantly higher in women. In conclusion, it was shown that factors associated with outcome can reliably be identified even in mild cases of IgAN. Predicting the outcome in adult HSN, however, remains difficult.

Relevance:

100.00%

Publisher:

Abstract:

The number of drug substances in formulation development in the pharmaceutical industry is increasing. Some of these are amorphous drugs with a glass transition below ambient temperature, and thus they are usually difficult to formulate and handle. One reason for this is their reduced viscosity, related to the stickiness of the drug, which makes them complicated to handle in unit operations. Thus, the aim of this thesis was to develop a new processing method for a sticky amorphous model material. Furthermore, the model materials were characterised before and after formulation, using several characterisation methods, to understand more precisely the prerequisites for the physical stability of the amorphous state against crystallisation. The model materials used were monoclinic paracetamol and citric acid anhydrate. Amorphous materials were prepared by melt quenching or by ethanol evaporation. The melt blends were found to have slightly higher viscosity than the ethanol-evaporated materials. However, the melt-produced materials crystallised more easily upon consecutive shearing than the ethanol-evaporated materials. The only material that did not crystallise during shearing, regardless of the preparation method, was a 50/50 (w/w, %) blend, and it was physically stable for at least two years in dry conditions. Shearing at varying temperatures was established to measure the physical stability of amorphous materials under processing and storage conditions. The actual physical stability of the blends was better than that of the pure amorphous materials at ambient temperature. Molecular mobility was not related to the physical stability of the amorphous blends, observed as crystallisation. The molecular mobility of the 50/50 blend derived from the spectral linewidth as a function of temperature using solid-state NMR correlated better with the molecular mobility derived from a rheometer than with that derived from differential scanning calorimetry data. Based on the results obtained, the effects of molecular interactions, the thermodynamic driving force and the miscibility of the blends are discussed as the key factors stabilising the blends. The stickiness was found to be affected by the glass transition and viscosity. Ultrasound extrusion and cutting were successfully tested to increase the processability of the sticky material. Furthermore, it was found to be possible to process the physically stable 50/50 blend in a supercooled liquid state instead of a glassy state. The method was not found to accelerate crystallisation. This may open up new possibilities to process amorphous materials that are otherwise impossible to manufacture into solid dosage forms.
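
The link between glass transition, viscosity and stickiness can be illustrated with the Williams-Landel-Ferry (WLF) relation for supercooled liquids. The sketch below uses the common "universal" WLF constants to show how steeply viscosity drops a few tens of degrees above Tg; it is an illustrative estimate only, not the rheometer-based analysis performed in the thesis.

```python
def wlf_viscosity(t_celsius: float, tg_celsius: float,
                  eta_tg: float = 1e12,      # Pa*s, typical viscosity near Tg
                  c1: float = 17.44, c2: float = 51.6) -> float:
    """Estimate viscosity above Tg with the Williams-Landel-Ferry equation.

    log10(eta(T) / eta(Tg)) = -c1 * (T - Tg) / (c2 + (T - Tg)), using the
    'universal' WLF constants. The values are rough, illustrative estimates only.
    """
    dt = t_celsius - tg_celsius
    return eta_tg * 10 ** (-c1 * dt / (c2 + dt))

# A blend with Tg near room temperature becomes orders of magnitude less viscous
# (hence stickier, but also easier to extrude) a few tens of degrees above Tg.
for t in (25.0, 40.0, 60.0):
    print(t, f"{wlf_viscosity(t, tg_celsius=25.0):.2e}")
```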

Relevance:

100.00%

Publisher:

Abstract:

This thesis presents an interdisciplinary analysis of how models and simulations function in the production of scientific knowledge. The work is informed by three scholarly traditions: studies on models and simulations in the philosophy of science, so-called micro-sociological laboratory studies within science and technology studies, and cultural-historical activity theory. Methodologically, I adopt a naturalist epistemology and combine philosophical analysis with a qualitative, empirical case study of infectious-disease modelling. The study maintains a dual perspective throughout the analysis: it specifies the modelling practices and examines the models as objects of research. The research questions addressed in this study are: 1) How are models constructed and what functions do they have in the production of scientific knowledge? 2) What is interdisciplinarity in model construction? 3) How do models become a general research tool and why is this process problematic? The core argument is that mediating models as investigative instruments (cf. Morgan and Morrison 1999) take questions as a starting point, and hence their construction is intentionally guided. This argument applies the interrogative model of inquiry (e.g., Sintonen 2005; Hintikka 1981), which conceives of all knowledge acquisition as a process of seeking answers to questions. The first question addresses simulation models as Artificial Nature, which is manipulated in order to answer the questions that initiated the model building. This account develops further the "epistemology of simulation" (cf. Winsberg 2003) by showing the interrelatedness of researchers and their objects in the process of modelling. The second question clarifies why interdisciplinary research collaboration is demanding and difficult to maintain. The nature of the impediments to disciplinary interaction is examined by introducing the idea of object-oriented interdisciplinarity, which provides an analytical framework for studying changes in the degree of interdisciplinarity, the tools and research practices developed to support the collaboration, and the mode of collaboration in relation to the historically mutable object of research. As my interest is in the models as interdisciplinary objects, the third research problem seeks to answer the question of how we might characterise these objects, what is typical of them, and what kinds of changes happen in the process of modelling. Here I examine the tension between specified, question-oriented models and more general models, and suggest that the specified models form a group of their own. I call these Tailor-made models, in contrast to the process of building a simulation platform that aims at generalisability and utility for health policy. This tension also underlines the challenge of applying research results (or methods and tools) to discuss and solve problems in decision-making processes.

Relevance:

100.00%

Publisher:

Abstract:

Multiple sclerosis (MS) is a chronic, inflammatory disease of the central nervous system, characterized especially by myelin and axon damage. Cognitive impairment in MS is common but difficult to detect without a neuropsychological examination. Valid and reliable methods are needed in clinical practice and research to detect deficits, follow their natural evolution, and verify treatment effects. The Paced Auditory Serial Addition Test (PASAT) is a measure of sustained and divided attention, working memory, and information processing speed, and it is widely used in the neuropsychological evaluation of MS patients. Additionally, the PASAT is the sole cognitive measure in an assessment tool primarily designed for MS clinical trials, the Multiple Sclerosis Functional Composite (MSFC). The aims of the present study were to determine a) the frequency, characteristics, and evolution of cognitive impairment among relapsing-remitting MS patients, and b) the validity and reliability of the PASAT in measuring cognitive performance in MS patients. The subjects were 45 relapsing-remitting MS patients from the Department of Neurology, Seinäjoki Central Hospital, and 48 healthy controls. Both groups underwent comprehensive neuropsychological assessments, including the PASAT, twice in a one-year follow-up, and additionally a sample of 10 patients and controls was evaluated with the PASAT in serial assessments five times in one month. The frequency of cognitive dysfunction among relapsing-remitting MS patients in the present study was 42%. Impairments were characterized especially by slowed information processing speed and memory deficits. During the one-year follow-up, cognitive performance was relatively stable among MS patients at the group level. However, the practice effects in cognitive tests were less pronounced among MS patients than among healthy controls. At the individual level, the spectrum of MS patients' cognitive deficits was wide with regard to their characteristics, severity, and evolution. The PASAT was moderately accurate in detecting MS-associated cognitive impairment, and 69% of patients were correctly classified as cognitively impaired or unimpaired when a comprehensive neuropsychological assessment was used as the "gold standard". Self-reported nervousness and poor arithmetical skills seemed to explain misclassifications. MS-related fatigue was objectively demonstrated as fading performance towards the end of the test. Despite the observed practice effect, the reliability of the PASAT was excellent, and it was sensitive to the cognitive decline taking place during the follow-up in a subgroup of patients. The PASAT can be recommended for use in the neuropsychological assessment of MS patients. The test is fairly sensitive but less specific; consequently, the reasons for low scores have to be carefully identified before interpreting them as clinically significant.
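
For reference, the accuracy figures quoted above come from comparing PASAT classifications against the comprehensive assessment used as the gold standard. A minimal sketch of the underlying arithmetic follows; the confusion-matrix counts are purely illustrative (chosen only to reproduce an overall accuracy near 69%), not the study's data.

```python
def classification_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Accuracy, sensitivity and specificity from a 2x2 confusion matrix.

    tp: impaired on both the PASAT and the gold standard
    fp: impaired on the PASAT only
    fn: impaired on the gold standard only
    tn: unimpaired on both
    """
    total = tp + fp + fn + tn
    return {
        "accuracy": (tp + tn) / total,     # share of correctly classified patients
        "sensitivity": tp / (tp + fn),     # impaired patients detected by the PASAT
        "specificity": tn / (tn + fp),     # unimpaired patients correctly cleared
    }

# Illustrative counts only (45 patients in total, matching the study's sample size).
print(classification_metrics(tp=14, fp=9, fn=5, tn=17))
```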

Relevance:

100.00%

Publisher:

Abstract:

Vascular intimal hyperplasia is a major complication following angioplasty. The hallmark features of this disorder are the accumulation of dedifferentiated smooth muscle cells (SMCs) on the luminal side of the injured artery, cellular proliferation, migration, and synthesis of extracellular matrix. This finally results in intimal hyperplasia, which is currently considered an untreatable condition. According to current knowledge, a major part of neointimal cells derive from circulating precursor cells. This has outdated the traditional in vitro cell culture methods of studying neointimal cell migration and proliferation using cultured medial SMCs. Somatostatin and some of its analogs with different selectivity for the five somatostatin receptors (sst1 through sst5) have been shown to have vasculoprotective properties in animal studies. However, clinical trials using analogs selective for sst2/sst3/sst5 to prevent restenosis after percutaneous transluminal coronary angioplasty (PTCA) have failed to show any major benefits. Sirolimus is a cell cycle inhibitor that has been suggested to act synergistically with the protein-tyrosine kinase inhibitor imatinib to inhibit intimal hyperplasia in the rat already at well-tolerated submaximal oral doses. The mechanisms behind this synergy and its long-term efficacy are not known. The aim of this study was to set up an ex vivo vascular explant culture model to measure neointimal cell activity without excluding the participation of circulating progenitor cells. Furthermore, two novel potential vasculoprotective treatment strategies were evaluated in detail in rat models of intimal hyperplasia and in the ex vivo explant model: sst1/sst4-selective somatostatin receptor analogs and combination treatment with sirolimus and imatinib. This study shows how whole-vessel explants can be used to study the kinetics of neointimal cells and their progenitors, and to evaluate the anti-migratory and anti-proliferative properties of potential vasculoprotective compounds. It also shows that the influx of neointimal progenitor cells occurs already during the first days after vascular injury, that the contribution of cell migration to the injury response is more important than that of cell proliferation, and that the adventitia actively contributes to vascular repair. The vasculoprotective effect of somatostatin is mediated preferentially through sst4, and through inhibition of cell migration rather than of proliferation, which may explain why sst2/sst3/sst5-selective analogs have failed in clinical trials. Furthermore, a brief early oral treatment with the combination of sirolimus and imatinib at submaximal doses results in long-term synergistic suppression of intimal hyperplasia. The synergy is a result of inhibition of post-operative thrombocytosis and leukocytosis, inhibition of neointimal cell migration to the injury site, and maintenance of cell integrity by inhibition of apoptosis and SMC dedifferentiation. In conclusion, the influx of progenitor cells already during the first days after injury and the high neointimal cell migratory activity underline the importance of early therapeutic intervention with anti-migratory compounds to prevent neointimal hyperplasia. Sst4-selective analogs and combination therapy with sirolimus and imatinib represent promising approaches for the development of such vasculoprotective therapies.

Relevance:

100.00%

Publisher:

Abstract:

This study addresses three important issues in tree bucking optimization in the context of cut-to-length harvesting. (1) Would the fit between the log demand and log output distributions be better if the price and/or demand matrices controlling the bucking decisions on modern cut-to-length harvesters were adjusted to the unique conditions of each individual stand? (2) In what ways can we generate stand and product specific price and demand matrices? (3) What alternatives do we have to measure the fit between the log demand and log output distributions, and what would be an ideal goodness-of-fit measure? Three iterative search systems were developed for seeking stand-specific price and demand matrix sets: (1) A fuzzy logic control system for calibrating the price matrix of one log product for one stand at a time (the stand-level one-product approach); (2) a genetic algorithm system for adjusting the price matrices of one log product in parallel for several stands (the forest-level one-product approach); and (3) a genetic algorithm system for dividing the overall demand matrix of each of several log products into stand-specific sub-demands simultaneously for several stands and products (the forest-level multi-product approach). The stem material used for testing the performance of the stand-specific price and demand matrices against that of the reference matrices comprised 9 155 Norway spruce (Picea abies (L.) Karst.) sawlog stems gathered by harvesters from 15 mature spruce-dominated stands in southern Finland. The reference price and demand matrices were either direct copies or slightly modified versions of those used by two Finnish sawmilling companies. Two types of stand-specific bucking matrices were compiled for each log product. One was from the harvester-collected stem profiles and the other was from the pre-harvest inventory data. Four goodness-of-fit measures were analyzed for their appropriateness in determining the similarity between the log demand and log output distributions: (1) the apportionment degree (index), (2) the chi-square statistic, (3) the Laspeyres quantity index, and (4) the price-weighted apportionment degree. The study confirmed that any improvement in the fit between the log demand and log output distributions can only be realized at the expense of log volumes produced. Stand-level pre-control of price matrices was found to be advantageous, provided the control is done with perfect stem data. Forest-level pre-control of price matrices resulted in no improvement in the cumulative apportionment degree. Cutting stands under the control of stand-specific demand matrices yielded a better total fit between the demand and output matrices at the forest level than was obtained by cutting each stand with non-stand-specific reference matrices. The theoretical and experimental analyses suggest that none of the three alternative goodness-of-fit measures clearly outperforms the traditional apportionment degree measure. Keywords: harvesting, tree bucking optimization, simulation, fuzzy control, genetic algorithms, goodness-of-fit
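
As an illustration of the goodness-of-fit measures, the sketch below computes an apportionment degree and a chi-square-style distance between a demand matrix and an output matrix, assuming the conventional definition of the apportionment degree as one minus half the summed absolute differences of the proportional distributions; the function names and the example matrices are hypothetical.

```python
import numpy as np

def apportionment_degree(demand: np.ndarray, output: np.ndarray) -> float:
    """Apportionment degree between a log demand matrix and a log output matrix.

    Both matrices are first normalised to proportions, so the measure compares
    distributions rather than absolute volumes. A value of 1.0 means a perfect fit.
    """
    d = demand / demand.sum()
    o = output / output.sum()
    return 1.0 - 0.5 * np.abs(d - o).sum()

def chi_square_fit(demand: np.ndarray, output: np.ndarray) -> float:
    """Chi-square-style distance between the proportional distributions."""
    d = demand / demand.sum()
    o = output / output.sum()
    mask = d > 0                     # avoid division by zero in empty demand cells
    return float((((o[mask] - d[mask]) ** 2) / d[mask]).sum())

# Hypothetical 3 x 2 matrices: rows = log lengths, columns = top diameter classes.
demand = np.array([[30.0, 10.0], [25.0, 15.0], [15.0, 5.0]])   # demanded shares
output = np.array([[28.0, 12.0], [20.0, 18.0], [17.0, 5.0]])   # bucked output volumes

print(apportionment_degree(demand, output))   # close to 1.0 -> good fit
print(chi_square_fit(demand, output))         # close to 0.0 -> good fit
```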

Relevance:

100.00%

Publisher:

Abstract:

The safety of food has become an issue of increasing interest to consumers and the media. It has also become a source of concern, as the amount of information on the risks related to food safety continues to expand. Today, risk and safety are permanent elements within the concept of food quality. Safety, in particular, is the attribute that consumers find very difficult to assess. The literature in this study consists of three main themes: traceability; consumer behaviour related to both quality and safety issues and perception of risk; and valuation methods. The empirical scope of the study was restricted to beef, because the beef labelling system enables reliable tracing of the origin of beef, as well as attributes related to safety, environmental friendliness and animal welfare. The purpose of this study was to examine what kind of information flows are required to ensure quality and safety in the food chain for beef, and who should produce that information. Studying consumers' willingness to pay makes it possible to determine whether consumers consider the quantity of information available on the safety and quality of beef sufficient. One of the main findings of this study was that the majority of Finnish consumers (73%) regard increased quality information as beneficial. These benefits were assessed using the contingent valuation method. The results showed that those who were willing to pay for increased information on the quality and safety of beef would accept an average price increase of 24% per kilogram. The results showed that certain risk factors impact consumer willingness to pay. If the respondents considered genetic modification of food or foodborne zoonotic diseases as harmful or extremely harmful risk factors in food, they were more likely to be willing to pay for quality information. The results produced by the models thus confirmed the premise that certain food-related risks affect willingness to pay for beef quality information. The results also showed that safety-related quality cues are significant to the consumers. Above all, consumers would like to receive information on the control of zoonotic diseases that are contagious to humans. Similarly, other process-control related information ranked high among the top responses. Information on any potential genetic modification was also considered important, even though genetic modification was not regarded as a high risk factor.
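
The link between perceived risk factors and willingness to pay is typically estimated with a discrete-choice model. The sketch below fits a binary logit to simulated survey responses in that spirit; the variable names, coefficients and data are illustrative assumptions, not the study's data or its exact contingent valuation model.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500

# Simulated respondents: perceived harmfulness of GM food and of zoonotic diseases
# (1-5 scales) plus household income (arbitrary units).
gm_risk = rng.integers(1, 6, n)
zoonosis_risk = rng.integers(1, 6, n)
income = rng.normal(3.0, 1.0, n)

# Simulated "yes, I would pay more for quality information" responses: the
# probability rises with perceived risk, mimicking the reported direction of effect.
logit = -2.0 + 0.5 * gm_risk + 0.4 * zoonosis_risk + 0.1 * income
willing = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

# Binary logit of willingness to pay on the perceived-risk covariates.
X = sm.add_constant(np.column_stack([gm_risk, zoonosis_risk, income]))
model = sm.Logit(willing.astype(float), X).fit(disp=False)
print(model.summary(xname=["const", "gm_risk", "zoonosis_risk", "income"]))
```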

Relevance:

100.00%

Publisher:

Abstract:

The distribution of mammals and their morphological and ecological traits are affected by both short- and long-term environmental changes, especially variations in climate and vegetation. This work studied the adaptation of mammals to climate changes in Eurasia over the last 24 million years. The study focused particularly on the last two million years, during which the climate changed strongly and human activity began to become significant. For this reason, it is often difficult to distinguish which of these two factors caused the extinction of a mammal species or its disappearance from an area. The material consisted of an extensive body of Russian-language literature, whose data, being untranslated, had previously remained outside Western research. The work also used the NOW database, which contains the find localities of fossil mammals and their ages.

Relevance:

100.00%

Publisher:

Abstract:

Topic detection and tracking (TDT) is an area of information retrieval research whose focus revolves around news events. The problems TDT deals with relate to segmenting news text into cohesive stories, detecting something new and previously unreported, tracking the development of a previously reported event, and grouping together news that discuss the same event. The performance of traditional information retrieval techniques based on full-text similarity has remained inadequate for online production systems. It has been difficult to make the distinction between same and similar events. In this work, we explore ways of representing and comparing news documents in order to detect new events and track their development. First, however, we put forward a conceptual analysis of the notions of topic and event. The purpose is to clarify the terminology and align it with the process of news-making and the tradition of story-telling. Second, we present a framework for document similarity that is based on semantic classes, i.e., groups of words with similar meaning. We adopt people, organizations, and locations as semantic classes in addition to general terms. As each semantic class can be assigned its own similarity measure, document similarity can make use of ontologies, e.g., geographical taxonomies. The documents are compared class-wise, and the outcome is a weighted combination of the class-wise similarities. Third, we incorporate temporal information into document similarity. We formalize the natural language temporal expressions occurring in the text, and use them to anchor the rest of the terms onto the time-line. When comparing documents for event-based similarity, we look not only at matching terms, but also at how near their anchors are on the time-line. Fourth, we experiment with an adaptive variant of the semantic class similarity system. The news reflect changes in the real world, and in order to keep up, the system has to change its behavior based on the contents of the news stream. We put forward two strategies for rebuilding the topic representations and report experiment results. We run experiments with three annotated TDT corpora. The use of semantic classes increased the effectiveness of topic tracking by 10-30% depending on the experimental setup. The gain in spotting new events remained lower, around 3-4%. Anchoring the text to a time-line based on the temporal expressions gave a further 10% increase in the effectiveness of topic tracking. The gains in detecting new events, again, remained smaller. The adaptive systems did not improve the tracking results.
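
A minimal sketch of the class-wise comparison idea follows: each semantic class gets its own similarity score, and document similarity is a weighted combination of them. The class names, weights and the use of cosine similarity for every class are illustrative simplifications, not the exact formulation of the thesis.

```python
from collections import Counter
from math import sqrt

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two term-frequency vectors."""
    shared = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in shared)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def document_similarity(doc1: dict, doc2: dict, weights: dict) -> float:
    """Weighted combination of class-wise similarities.

    Each document is a mapping from semantic class to a Counter of its terms; each
    class could in principle use its own measure (e.g. an ontology-based one for
    locations), but plain cosine is used here for every class.
    """
    total = sum(weights.values())
    return sum(w * cosine(doc1.get(c, Counter()), doc2.get(c, Counter()))
               for c, w in weights.items()) / total

# Illustrative documents split into semantic classes.
d1 = {"terms": Counter({"earthquake": 3, "rescue": 1}),
      "locations": Counter({"Izmit": 2, "Turkey": 1}),
      "persons": Counter()}
d2 = {"terms": Counter({"earthquake": 2, "aftershock": 1}),
      "locations": Counter({"Turkey": 2}),
      "persons": Counter()}

weights = {"terms": 1.0, "locations": 1.5, "persons": 1.0}  # hypothetical class weights
print(document_similarity(d1, d2, weights))
```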

Relevance:

100.00%

Publisher:

Abstract:

Despite much research on forest biodiversity in Fennoscandia, the exact mechanisms of species declines in dead-wood dependent fungi are still poorly understood. In particular, there is only limited information on why certain fungal species have responded negatively to habitat loss and fragmentation, while others have not. Understanding the mechanisms behind species declines would be essential for the design and development of ecologically effective and scientifically informed conservation measures, and management practices that would promote biodiversity in production forests. In this thesis I study the ecology of polypores and their responses to forest management, with a particular focus on why some species have declined more than others. The data considered in the thesis comprise altogether 98,318 dead-wood objects, with 43,085 observations of 174 fungal species. Out of these, 1,964 observations represent 58 red-listed species. The data were collected from 496 sites, including woodland key habitats, clear-cuts with retention trees, mature managed forests, and natural or natural-like forests in southern Finland and Russian Karelia. I show that the most relevant way of measuring resource availability can differ to a great extent between species seemingly sharing the same resources. It is thus critical to measure the availability of resources in a way that takes into account the ecological requirements of the species. The results show that connectivity at the local, landscape and regional scales is important especially for the highly specialized species, many of which are also red-listed. Habitat loss and fragmentation affect not only species diversity but also the relative abundances of the species and, consequently, species interactions and fungal successional pathways. Changes in species distributions and abundances are likely to affect the food chains in which wood-inhabiting fungi are involved, and thus the functioning of the whole forest ecosystem. The findings of my thesis highlight the importance of protecting well-connected, large and high-quality forest areas to maintain forest biodiversity. Small habitat patches distributed across the landscape are likely to contribute only marginally to protection of red-listed species, especially if habitat quality is not substantially higher than in ordinary managed forest, as is the case with woodland key habitats. Key habitats might supplement the forest protection network if they were delineated larger and if harvesting of individual trees was prohibited in them. Taking the landscape perspective into account in the design and development of conservation measures is critical while striving to halt the decline of forest biodiversity in an ecologically effective manner.
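
For readers unfamiliar with the term, "connectivity" here refers to distance-weighted measures of how much suitable habitat surrounds a focal site. The sketch below shows one common metapopulation-style formulation; the formula and parameter values are a generic textbook choice, not necessarily the measure used in the thesis.

```python
import math

def connectivity(focal: tuple, sources: list, alpha: float = 1.0) -> float:
    """Distance-weighted connectivity of a focal site.

    S_i = sum_j exp(-alpha * d_ij) * A_j, where d_ij is the distance (km) from the
    focal site to source patch j and A_j is the amount of suitable dead wood (or
    occupied habitat) in that patch. A larger alpha means a shorter dispersal range.
    """
    xi, yi = focal
    total = 0.0
    for (xj, yj, amount) in sources:
        d = math.hypot(xi - xj, yi - yj)
        total += math.exp(-alpha * d) * amount
    return total

# Focal site at the origin; three source patches as (x_km, y_km, habitat_amount).
patches = [(0.5, 0.0, 10.0), (2.0, 1.0, 50.0), (10.0, 0.0, 100.0)]
print(connectivity((0.0, 0.0), patches, alpha=1.0))
```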

Relevance:

100.00%

Publisher:

Abstract:

Growth is a fundamental aspect of the life cycle of all organisms. Body size varies widely in most animal groups, such as mammals. Moreover, the growth of a multicellular organism is not a uniform enlargement of size; different body parts and organs grow to their characteristic sizes at different times. Currently very little is known about the molecular mechanisms governing this organ-specific growth. The genome sequencing projects have provided complete genomic DNA sequences of several species over the past decade. The amount of genomic sequence information, including sequence variants within species, is constantly increasing. Based on the universal genetic code, we can make sense of this sequence information as far as it codes for proteins. However, less is known about the molecular mechanisms that control the expression of genes, and about the variations in gene expression that underlie many pathological states in humans. This is caused in part by the lack of information about the second genetic code, which consists of the binding specificities of transcription factors and the combinatorial code by which transcription factor binding sites are assembled to form tissue-specific and/or ligand-regulated enhancer elements. This thesis presents a high-throughput assay for the identification of transcription factor binding specificities, which were then used to measure the DNA binding profiles of transcription factors involved in growth control. We developed the ‘enhancer element locator’, a computational tool that can be used to predict functional enhancer elements. A genome-wide prediction of human and mouse enhancer elements generated a large database of enhancer elements. This database can be used to identify target genes of signaling pathways and to predict activated transcription factors based on changes in gene expression. Predictions validated in transgenic mouse embryos revealed the presence of multiple tissue-specific enhancers in the mouse c- and N-Myc genes, which has implications for organ-specific growth control and the tumor type specificity of oncogenes. Furthermore, we were able to map a single-nucleotide variation that carries susceptibility to colorectal cancer to an enhancer element, and propose a mechanism by which this SNP might be involved in the generation of colorectal cancer.
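
Enhancer prediction of this kind ultimately rests on locating transcription factor binding sites with position weight matrices. The sketch below shows generic PWM scoring and scanning only; it is not the enhancer element locator algorithm itself, and the motif counts and sequence are made up.

```python
import math

BACKGROUND = {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25}

def log_odds_pwm(counts: list[dict], pseudocount: float = 0.5) -> list[dict]:
    """Turn per-position base counts into a log-odds position weight matrix."""
    pwm = []
    for pos in counts:
        total = sum(pos.values()) + 4 * pseudocount
        pwm.append({b: math.log2((pos.get(b, 0) + pseudocount) / total / BACKGROUND[b])
                    for b in "ACGT"})
    return pwm

def scan(sequence: str, pwm: list[dict], threshold: float) -> list[tuple[int, float]]:
    """Return (position, score) for all windows scoring above the threshold."""
    w = len(pwm)
    hits = []
    for i in range(len(sequence) - w + 1):
        score = sum(pwm[j][sequence[i + j]] for j in range(w))
        if score >= threshold:
            hits.append((i, score))
    return hits

# Hypothetical binding-site counts for a short motif and a toy promoter sequence.
counts = [{"A": 8, "C": 1, "G": 1, "T": 0},
          {"A": 0, "C": 9, "G": 1, "T": 0},
          {"A": 9, "C": 0, "G": 0, "T": 1}]
pwm = log_odds_pwm(counts)
print(scan("TTACAGGACATTT", pwm, threshold=3.0))
```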