14 results for supplementary elements
in Helda - Digital Repository of University of Helsinki
Abstract:
This thesis presents methods for locating and analyzing cis-regulatory DNA elements involved in the regulation of gene expression in multicellular organisms. The regulation of gene expression is carried out by the combined effort of several transcription factor proteins collectively binding the DNA at the cis-regulatory elements. Only sparse knowledge of the 'genetic code' of these elements exists today. An automatic tool for the discovery of putative cis-regulatory elements could aid their experimental analysis, which would result in a more detailed view of cis-regulatory element structure and function. We have developed a computational model for the evolutionary conservation of cis-regulatory elements. The elements are modeled as evolutionarily conserved clusters of sequence-specific transcription factor binding sites. We give an efficient dynamic programming algorithm that locates the putative cis-regulatory elements and scores them according to the conservation model. A notable proportion of the high-scoring DNA sequences show transcriptional enhancer activity in transgenic mouse embryos. The conservation model includes four parameters whose optimal values are estimated with simulated annealing. With good parameter values the model discriminates well between DNA sequences with evolutionarily conserved cis-regulatory elements and DNA sequences that have evolved neutrally. On further inquiry, the set of highest-scoring putative cis-regulatory elements was found to be sensitive to small variations in the parameter values. The statistical significance of the putative cis-regulatory elements is estimated with the Two Component Extreme Value Distribution. The p-values grade the conservation of the cis-regulatory elements above the neutral expectation. The parameter values for the distribution are estimated by simulating neutral DNA evolution. The conservation of the transcription factor binding sites can be used in the upstream analysis of regulatory interactions. This approach may provide mechanistic insight into transcription-level data from, e.g., microarray experiments. Here we give a method to predict shared transcriptional regulators for a set of co-expressed genes. The EEL (Enhancer Element Locator) software implements the method for locating putative cis-regulatory elements. The software facilitates both interactive use and distributed batch processing. We have used it to analyze the non-coding regions around all human genes with respect to the orthologous regions in various other species, including mouse. The data from these genome-wide analyses are stored in a relational database, which is used in the publicly available web services for upstream analysis and visualization of the putative cis-regulatory elements in the human genome.
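As an illustration of the statistical-significance step described above, the following is a minimal sketch, not the thesis's actual implementation: background scores from simulated neutral evolution are fitted with an extreme value distribution and an observed element score is graded against it. The use of a single Gumbel component (rather than the two-component distribution named in the abstract), the synthetic background scores, and all names are illustrative assumptions.

```python
# Illustrative sketch only: grade an observed element score against a background
# of scores from simulated neutral evolution, using a single Gumbel (extreme value)
# component instead of the thesis's two-component distribution. Data are synthetic.
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(0)

# Placeholder for "simulate neutral DNA evolution and re-score the alignments":
# here we simply draw synthetic background scores.
neutral_scores = rng.gumbel(loc=10.0, scale=2.0, size=5000)

# Fit an extreme value distribution to the neutral background.
loc, scale = gumbel_r.fit(neutral_scores)

def element_p_value(score: float) -> float:
    """P(neutral score >= observed score) under the fitted distribution."""
    return gumbel_r.sf(score, loc=loc, scale=scale)

print(element_p_value(20.0))  # small p-value -> conservation above the neutral expectation
```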
Abstract:
Volatility is central to options pricing and risk management. It reflects the uncertainty of investors and the inherent instability of the economy. Time series methods are among the most widely applied scientific methods for analyzing and predicting volatility. Very frequently sampled data contain much valuable information about the different elements of volatility and may ultimately reveal the reasons for time-varying volatility. The use of such ultra-high-frequency data is common to all three essays of the dissertation. The dissertation belongs to the field of financial econometrics. The first essay uses wavelet methods to study the time-varying behavior of scaling laws and long memory in the five-minute volatility series of Nokia on the Helsinki Stock Exchange around the burst of the IT bubble. The essay is motivated by earlier findings which suggest that different scaling laws may apply to intraday time scales and to larger time scales, implying that the so-called annualized volatility depends on the data sampling frequency. The empirical results confirm the appearance of time-varying long memory and different scaling laws that, for a significant part, can be attributed to investor irrationality and to an intraday volatility periodicity called the New York effect. The findings have potentially important consequences for options pricing and risk management, which commonly assume constant memory and scaling. The second essay investigates modelling the duration between trades in stock markets. Durations convey information about investor intentions and provide an alternative view of volatility. Generalizations of standard autoregressive conditional duration (ACD) models are developed to meet needs observed in previous applications of the standard models. According to the empirical results, based on data on actively traded stocks on the New York Stock Exchange and the Helsinki Stock Exchange, the proposed generalization clearly outperforms the standard models and also performs well in comparison to another recently proposed alternative to the standard models. The distribution used to derive the generalization may also prove valuable in other areas of risk management. The third essay studies empirically the effect of decimalization on volatility and market microstructure noise. Decimalization refers to the change from fractional pricing to decimal pricing, and it was carried out on the New York Stock Exchange in January 2001. The methods used here are more accurate than in earlier studies and put more weight on market microstructure. The main result is that decimalization decreased observed volatility by reducing noise variance, especially for the highly active stocks. The results are useful for risk management and market mechanism design.
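As a hedged illustration of the duration-modelling idea in the second essay, the sketch below simulates a standard ACD(1,1) process with exponential innovations. The parameter values and function name are illustrative assumptions; the essay's generalized ACD models are not reproduced here.

```python
# Illustrative sketch only: simulate a standard ACD(1,1) duration process,
#   psi_i = omega + alpha * x_{i-1} + beta * psi_{i-1},   x_i = psi_i * eps_i,
# with exponential innovations eps_i. Parameters are hypothetical.
import numpy as np

def simulate_acd(omega=0.1, alpha=0.1, beta=0.8, n=10_000, seed=0):
    rng = np.random.default_rng(seed)
    psi = np.empty(n)   # conditional expected durations
    x = np.empty(n)     # observed durations between trades
    psi[0] = omega / (1.0 - alpha - beta)   # unconditional mean duration
    x[0] = psi[0] * rng.exponential()
    for i in range(1, n):
        psi[i] = omega + alpha * x[i - 1] + beta * psi[i - 1]
        x[i] = psi[i] * rng.exponential()
    return x, psi

durations, _ = simulate_acd()
print(durations.mean())  # close to omega / (1 - alpha - beta) = 1.0
```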
Abstract:
Wild salmon stocks in the northern Baltic rivers became endangered in the second half of the 20th century, mainly due to recruitment overfishing. As a result, supplementary stocking was widely practised, and supplementation of the Tornionjoki salmon stock took place over a 25-year period until 2002. The stock has been closely monitored by electrofishing, smolt trapping, mark-recapture studies, catch samples and catch surveys. Background information on hatchery-reared stocked juveniles was also collected for this study. Bayesian statistics was applied to the data, as this approach offers the possibility of bringing prior information into the analysis, an advanced ability to incorporate uncertainty, and probabilities for a multitude of hypotheses. Substantial divergences between reared and wild Tornionjoki salmon were identified in both demographic and phenological characteristics. The divergences tended to be larger the longer the time spent in the hatchery and the more favourable the hatchery conditions were for fast growth. Differences in environment likely induced most of the divergences, but selection of brood fish might have resulted in genotypic divergence in the maturation age of reared salmon. Survival of stocked 1-year-old juveniles to smolt varied from about 10% to about 25%. Stocking on the lower reach of the river seemed to decrease survival, and the negative effect of stocking volume on survival raises concern about possible similar effects on the extant wild population. Post-smolt survival of wild Tornionjoki smolts was on average two times higher than that of smolts stocked as parr and 2.5 times higher than that of stocked smolts. Smolts of the different groups showed synchronous variation and similar long-term survival trends. Both groups of reared salmon were more vulnerable to offshore driftnet and coastal trapnet fishing than wild salmon. Average survival from smolt to spawner of wild salmon was 2.8 times higher than that of salmon stocked as parr and 3.3 times higher than that of salmon stocked as smolts. Wild salmon and salmon stocked as parr were found to have similar lifetime survival rates, while stocked smolts had a lifetime survival rate over 4 times higher than the two other groups. If eggs are collected from wild brood fish, stocking parr would therefore not be a sensible option. Stocking smolts instead would create a net benefit in terms of the number of spawners, but this strategy has serious drawbacks and risks associated with the larger phenotypic and demographic divergences from wild salmon. Supplementation was shown not to be the key factor behind the recovery of the Tornionjoki and other northern Baltic salmon stocks. Instead, a combination of restrictions in the sea fishery and the simultaneous occurrence of favourable natural conditions for survival were the main reasons for the revival in the 1990s. This study questions the effectiveness of supplementation as a conservation management tool. The benefits of supplementation seem at best limited. Relatively high occurrences of reared fish in catches may generate false optimism concerning the effects of supplementation. Supplementation may lead to genetic risks due to problems in brood fish collection and to artificial rearing with relaxed natural selection and domestication. Appropriate management of fisheries is the main alternative to supplementation, without which all other efforts for the long-term maintenance of a healthy fish resource fail.
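As a minimal illustration of the Bayesian approach mentioned above (not the study's actual mark-recapture models), the sketch below updates a Beta prior on a juvenile-to-smolt survival probability with hypothetical release and survival counts and reports posterior summaries. All numbers and names are assumptions for illustration only.

```python
# Illustrative sketch only: Beta-Binomial updating for a survival probability.
# Prior, release and survivor counts are hypothetical; the study's Bayesian
# mark-recapture models are far more elaborate.
from scipy.stats import beta

prior_a, prior_b = 2.0, 8.0      # weakly informative prior centred near 0.2
released, survived = 1000, 180   # hypothetical stocked juveniles and survivors to smolt

post = beta(prior_a + survived, prior_b + released - survived)
print(post.mean())        # posterior mean survival
print(post.interval(0.95))  # 95% credible interval
print(post.sf(0.25))      # posterior probability that survival exceeds 25%
```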
Abstract:
Background
How new forms arise in nature has engaged evolutionary biologists since Darwin's seminal treatise on the origin of species. Transposable elements (TEs) may be among the most important internal sources for intraspecific variability. Thus, we aimed to explore the temporal dynamics of several TEs in individual genotypes from a small, marginal population of Aegilops speltoides. A diploid cross-pollinated grass species, it is a wild relative of the various wheat species known for their large genome sizes contributed by an extraordinary number of TEs, particularly long terminal repeat (LTR) retrotransposons. The population is characterized by high heteromorphy and possesses a wide spectrum of chromosomal abnormalities including supernumerary chromosomes, heterozygosity for translocations, and variability in the chromosomal position or number of 45S and 5S ribosomal DNA (rDNA) sites. We propose that variability on the morphological and chromosomal levels may be linked to variability at the molecular level and particularly in TE proliferation.
Results
Significant temporal fluctuation in the copy number of TEs was detected when processes that take place in small, marginal populations were simulated. It is known that under critical external conditions, outcrossing plants very often shift to self-pollination. Thus, three morphologically different genotypes with chromosomal aberrations were taken from a wild population of Ae. speltoides, and the dynamics of the TE complex were traced through three rounds of selfing. It was discovered that: (i) various families of TEs vary tremendously in copy number between individuals from the same population and the selfed progenies; (ii) the fluctuations in copy number are TE-family specific; (iii) there is a great difference in TE copy number expansion or contraction between gametophytes and sporophytes; and (iv) a small percentage of TEs that increase in copy number can actually insert at novel locations and could serve as bona fide mutagens.
Conclusions
We hypothesize that TE dynamics could promote or intensify morphological and karyotypical changes, some of which may be potentially important for the process of microevolution, and allow species with plastic genomes to survive as new forms or even species in times of rapid climatic change.
Abstract:
This research is connected with an education development project for the four-year officer education program at the National Defence University. In this curriculum, physics was studied in two alternative course plans, namely scientific and general. Observations connected to the latter, e.g. student feedback and learning outcomes, indicated that action was needed to support the course. The reform work focused on the production of aligned course-related instructional material. The learning material project produced a customized textbook set for the students of the general basic physics course. The research adapts phases that are typical of Design-Based Research (DBR). The research analyses the feature requirements for a physics textbook aimed at a specific sector and the frames supporting instructional material development, and summarizes the experiences gained in the learning material project when the selected frames were applied. The quality of instructional material is an essential part of qualified teaching. The goal of instructional material customization is to increase the product's customer-centric nature and to enhance its function as a support medium for the learning process. Textbooks are still one of the core elements of physics teaching. The idea of a textbook will remain, but its form and appearance may change according to the prevailing technology. The work deals with substance-related frames (the demands of a physics textbook according to the PER viewpoint, quality thinking in educational material development), frames of university pedagogy, and instructional material production processes. A wide knowledge and understanding of different frames are useful in development work if they are utilized to aid inspiration without limiting new reasoning and new kinds of models. Applying customization even in the use of frames supports creative and situation-aware design and diminishes the gap between theory and practice. Generally, physics teachers produce their own supplementary instructional material. Even though customization thinking is not unknown, the threshold to produce an entire textbook might be high. Even though the observations here are from the general physics course at the NDU, the research also gives tools for development in other discipline-related educational contexts. This research is an example of instructional material development work, together with the questions it uncovers, and presents thoughts on when textbook customization is rewarding. At the same time, the research aims to further creative customization thinking in instruction and development. Key words: Physics textbook, PER (Physics Education Research), Instructional quality, Customization, Creativity
Abstract:
Lullabies in Kvevlax. Linguistic structures and constructions. The study is a linguistic analysis of the constructions that shape the texts of lullabies in Kvevlax in Ostrobothnia, Finland. The empirical goal is to identify linguistic constructions in traditional lullabies that make use of the dialect of the region. The theoretical goal is to test the usability of Construction Grammar (CxG) in analyses of this type of material, and to further develop the formal description of Construction Grammar in such a way as to make it possible to analyze all kinds of linguistically complex texts. The material that I collected in the 1960s comprises approximately 600 lullabies and concomitant interviews with the singers on the use of lullabies. In 1991 I collected additional material in Kvevlax. The number of informants is close to 250. Supplementary material covering the Swedish-language regions of Finland was compiled from the archives of the Society of Swedish Literature in Finland. The first part of the study is mainly based on traditional grammar and gives general information about the language and the structures used in the lullabies. In the detailed analysis of the Kvevlax lullabies in the latter part of the study, I use a version of Construction Grammar intended for the linguistic analysis of usage-based texts. The analysis focuses on the most salient constructions in the lullabies. The study shows that Construction Grammar as a method has more general applicability than traditional linguistic methods. The study identifies important constructions, including elements typical of this genre, that structure the text in different variants of the same lullabies. In addition, CxG made it possible to study pragmatic aspects of the interactional, cultural and contextual language that is used in communication with small children. The constructions found in lullabies are also used in language in general. In addition to giving detailed linguistic descriptions of the texts, Construction Grammar can also explain the multidimensionality of language and the variations in the texts. The use of CxG made it possible to show that the variations are not random but follow prototypical linguistic patterns, i.e. constructions. Constructions are thus found to be linguistic resources with built-in variation potential.
Abstract:
Most of the existing research within the business network approach is based on companies that operate on different levels within the same value chain, as a buyer and a supplier. Intercompetitor cooperation, i.e. cooperation between companies occupying the same level within different value chains, has not been studied to the same extent. Moreover, scholars within the business network approach have usually described industrial relationships as long-term, consisting of mutual commitment and trust. Industrial relationships are not static but dynamic, and they contain situations of both harmony and conflict. There is consequently a need for more research concerning both intercompetitor cooperation and conflicts. The purpose of this study is to develop our theoretical and empirical understanding of the nature of conflicts in intercompetitor cooperation from a business network perspective. The focus of the study lies on the issue and intensity of conflict. The issue of a conflict can be divided into cause and topic, while the intensity comprises the importance and outcome of a conflict. The empirical part of the study is based on two case studies of groups of cooperating competitors from two different industries. The applied research method is interviews. According to the findings of this study, the causes of conflicts in intercompetitor cooperation can be divided into three groups: focus, awareness and capacity. Topics of conflict can be related to domain, delivery, advertising or cooperation. Moreover, the findings show that conflict situations may be grouped as not important, important or very important. Some conflicts may also be of varying importance, meaning that their importance varies from one point in time to another. Based on the findings of the study, the outcome or status of a conflict can be analyzed on both a concrete and a general level. The findings also indicate that several conflicts are partly hidden, which means that only one or some of the involved actors perceive the conflict. Furthermore, several conflict situations can be related to external network actors.
Abstract:
The K-shell diagram (Kα1,2 and Kβ1,3) and hypersatellite (HS) (Kʰα1,2) spectra of Y, Zr, Mo, and Pd have been measured with high energy resolution using photoexcitation by 90 keV synchrotron radiation. Comparison of the measured and ab initio calculated HS spectra demonstrates the importance of quantum electrodynamical (QED) effects for the HS spectra. Phenomenological fits of the measured spectra by Voigt functions yield accurate values for the shift of the HS lines from the diagram lines, the splitting of the HS lines, and their intensity ratio. Good agreement with theory was found for all quantities except the intensity ratio, which is dominated by the intermediacy of the coupling of the angular momenta. The observed deviations imply that our current understanding of the variation of the coupling scheme from LS to jj across the periodic table may require some revision.
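As a hedged illustration of the phenomenological fitting step, the sketch below fits a single Voigt profile to a synthetic emission line with scipy. The line position, widths, amplitude, and data are hypothetical; the actual analysis fits the full diagram and hypersatellite spectra with several such components.

```python
# Illustrative sketch only: fit one Voigt profile to a synthetic emission line.
# All line parameters and the "measured" data are hypothetical.
import numpy as np
from scipy.special import voigt_profile
from scipy.optimize import curve_fit

def voigt_line(e, amplitude, center, sigma, gamma):
    # Voigt profile: convolution of a Gaussian (sigma) and a Lorentzian (gamma)
    return amplitude * voigt_profile(e - center, sigma, gamma)

rng = np.random.default_rng(0)
energy = np.linspace(14.8, 15.2, 400)                 # keV grid, hypothetical
true = voigt_line(energy, 1.0, 15.0, 0.01, 0.02)
counts = true + rng.normal(0.0, 0.01, energy.size)    # noisy synthetic "measurement"

popt, _ = curve_fit(voigt_line, energy, counts, p0=[1.0, 15.0, 0.01, 0.02])
print(popt)   # recovered amplitude, line position, Gaussian and Lorentzian widths
```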