14 results for DURATIONS
in Helda - Digital Repository of the University of Helsinki
Abstract:
In cardiac myocytes (heart muscle cells), coupling of the electrical signal known as the action potential to contraction of the heart depends crucially on calcium-induced calcium release (CICR) in a microdomain known as the dyad. During CICR, the peak number of free calcium ions (Ca) present in the dyad is small, typically estimated to lie in the range 1-100. Since free Ca ions mediate CICR, noise in Ca signaling due to the small number of free calcium ions influences excitation-contraction (EC) coupling gain. Noise in Ca signaling is only one noise type influencing cardiac myocytes; for example, the ion channels playing a central role in action potential propagation are stochastic machines, each of which gates more or less randomly, producing gating noise in membrane currents. How various noise sources influence the macroscopic properties of a myocyte, and how noise is attenuated or exploited, are largely open questions. In this thesis, the impact of noise on CICR, EC coupling and, more generally, the macroscopic properties of a cardiac myocyte is investigated at multiple levels of detail using mathematical models. Complementing the investigation of the impact of noise on CICR, computationally efficient yet spatially detailed models of CICR are developed. The results of this thesis show that (1) gating noise due to the high-activity mode of the L-type calcium channels playing a major role in CICR may induce early afterdepolarizations associated with polymorphic tachycardia, a frequent precursor to sudden cardiac death in heart failure patients; (2) an increased level of voltage noise typically increases action potential duration and skews the distribution of action potential durations toward long durations in cardiac myocytes; and (3) while a small number of Ca ions mediates CICR, EC coupling is robust against this noise source, partly due to the shape of the ryanodine receptor protein structures present in the cardiac dyad.
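The gating noise described in this abstract can be illustrated with a minimal sketch (this is not the thesis's actual model; all rates and parameters below are illustrative assumptions): a population of independent two-state channels, each switching randomly between closed and open, produces a fluctuating open fraction whose noise shrinks as the channel count grows.

```python
import random

def simulate_open_fraction(n_channels, alpha, beta, dt, n_steps, seed=0):
    """Simulate n_channels independent two-state (closed/open) channels.

    alpha: opening rate (1/ms), beta: closing rate (1/ms), dt: step (ms).
    Returns the time series of the fraction of channels that are open.
    """
    rng = random.Random(seed)
    open_state = [False] * n_channels
    p_open = alpha * dt    # P(closed -> open) in one step
    p_close = beta * dt    # P(open -> closed) in one step
    trace = []
    for _ in range(n_steps):
        for i in range(n_channels):
            if open_state[i]:
                if rng.random() < p_close:
                    open_state[i] = False
            elif rng.random() < p_open:
                open_state[i] = True
        trace.append(sum(open_state) / n_channels)
    return trace
```

With alpha = beta the steady-state open probability is alpha / (alpha + beta) = 0.5, and the standard deviation of the open fraction scales roughly as 1/sqrt(N), which is why a dyad containing only a handful of free ions and channels is intrinsically noisy.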
Abstract:
This thesis aims at finding the role of deposit insurance schemes and the central bank (CB) in keeping the banking system safe. The thesis also studies the factors associated with long-lasting banking crises. The first essay analyzes the effect of using an explicit deposit insurance scheme (EDIS), instead of an implicit deposit insurance scheme (IDIS), on banking crises. The panel data for the period 1980-2003 include all countries for which data on EDIS or IDIS exist. 70% of the countries in the sample are less developed countries (LDCs), and about 55% of the countries adopting EDIS also come from the LDCs. The major finding is that the use of EDIS increases the crisis probability at a high level of significance. This probability is greater if the EDIS is inefficiently designed, allowing greater scope for moral hazard; specifically, it is greater if the EDIS provides higher coverage to deposits and if it is less powerful from the legal point of view. The study also finds that the less developed a country is in its capacity to handle an EDIS, the higher the chance of a banking crisis. Once the underdevelopment of the economy handling the EDIS is controlled for, the EDIS by itself is no longer a significant factor in banking crises. The second essay aims at determining whether a country's powerful CB can lessen the instability of the banking sector by minimizing the likelihood of a banking crisis. The data used include indicators of the CB's autonomy for a set of countries over the period 1980-89. The study finds that, in aggregate, a more powerful CB lessens the probability of a banking crisis. When the CB's authority is disentangled with respect to its responsibilities, the study finds that a longer tenure of the CB's chief executive officer and greater power of the CB in setting the interest rate on government loans are necessary for reducing the probability of a banking crisis.
The study also finds that the probability of crisis falls further if an autonomous CB can perform its duties in a country with a stronger law-and-order tradition. The costs of long-lasting banking crises are high because both depositors and investors lose confidence in the banking system. For a rapid recovery from a crisis, the government very often undertakes one or more crisis resolution policy (CRP) measures. The third essay examines the CRP measures and other explanatory variables correlated with the durations of banking crises. The major finding is that the CRP measure allowing regulatory forbearance to keep insolvent banks operative and the public debt relief program are, respectively, strongly and weakly significant in increasing the durations of crises. Some other explanatory variables, found by previous studies to be related to the probability of crises occurring, are also correlated with the durations of crises.
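Crisis-probability analyses of this kind are typically estimated with panel logit or probit models. As a hedged illustration only (the variable names, coefficient values, and data below are invented for the sketch, not taken from the essays), a minimal logistic regression on synthetic country-year data shows the estimation mechanics behind such a finding:

```python
import math
import random

def fit_logit(xs, ys, lr=1.0, n_iter=500):
    """Fit P(y = 1 | x) = sigmoid(b0 + b1 * x) by gradient ascent on the
    average log-likelihood; a toy stand-in for the panel logit models
    used in banking-crisis studies."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(n_iter):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p
            g1 += (y - p) * x
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

def synthetic_crisis_data(n, b0=-2.0, b1=1.0, seed=42):
    """Hypothetical country-year data: x = 1 if an EDIS is in place, and a
    crisis occurs with probability sigmoid(b0 + b1 * x)."""
    rng = random.Random(seed)
    xs = [i % 2 for i in range(n)]
    ys = [1 if rng.random() < 1.0 / (1.0 + math.exp(-(b0 + b1 * x))) else 0
          for x in xs]
    return xs, ys
```

In the actual essays the specification includes many controls (notably the level of development), and the EDIS effect weakens once those controls enter; the sketch shows only how the sign and size of a single indicator's coefficient would be recovered.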
Abstract:
Volatility is central to options pricing and risk management. It reflects the uncertainty of investors and the inherent instability of the economy. Time series methods are among the most widely applied scientific methods for analyzing and predicting volatility. Very frequently sampled data contain much valuable information about the different elements of volatility and may ultimately reveal the reasons for time-varying volatility. The use of such ultra-high-frequency data is common to all three essays of the dissertation, which belongs to the field of financial econometrics. The first essay uses wavelet methods to study the time-varying behavior of scaling laws and long memory in the five-minute volatility series of Nokia on the Helsinki Stock Exchange around the burst of the IT bubble. The essay is motivated by earlier findings which suggest that different scaling laws may apply to intraday time scales and to larger time scales, implying that the so-called annualized volatility depends on the data sampling frequency. The empirical results confirm the appearance of time-varying long memory and different scaling laws that, to a significant extent, can be attributed to investor irrationality and to an intraday volatility periodicity called the New York effect. The findings have potentially important consequences for options pricing and risk management, which commonly assume constant memory and scaling. The second essay investigates modelling the duration between trades in stock markets. Durations convey information about investor intentions and provide an alternative view of volatility. Generalizations of standard autoregressive conditional duration (ACD) models are developed to meet needs observed in previous applications of the standard models.
According to the empirical results, based on data on actively traded stocks on the New York Stock Exchange and the Helsinki Stock Exchange, the proposed generalization clearly outperforms the standard models and also performs well in comparison with another recently proposed alternative. The distribution used to derive the generalization may also prove valuable in other areas of risk management. The third essay studies empirically the effect of decimalization on volatility and market microstructure noise. Decimalization refers to the change from fractional to decimal pricing, carried out on the New York Stock Exchange in January 2001. The methods used here are more accurate than those in earlier studies and put more weight on market microstructure. The main result is that decimalization decreased observed volatility by reducing noise variance, especially for the most actively traded stocks. The results are helpful for risk management and market mechanism design.
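The interaction of sampling frequency and microstructure noise discussed above can be sketched with a toy simulation (this is not the dissertation's methodology, and all parameter values are illustrative assumptions): an efficient log-price follows a random walk, the observed price adds i.i.d. microstructure noise, and realized variance computed from finely sampled returns is inflated by that noise while sparser sampling attenuates it.

```python
import math
import random

def realized_variance(prices, step):
    """Sum of squared log-returns, sampling every `step`-th observation."""
    sampled = prices[::step]
    return sum((math.log(b) - math.log(a)) ** 2
               for a, b in zip(sampled, sampled[1:]))

def simulate_observed_prices(n_ticks, sigma=0.0005, noise_std=0.001, seed=7):
    """Efficient log-price random walk plus i.i.d. observation noise,
    a stylized stand-in for bid-ask bounce and price discreteness."""
    rng = random.Random(seed)
    log_p = 0.0
    prices = []
    for _ in range(n_ticks):
        log_p += rng.gauss(0.0, sigma)                      # efficient price
        prices.append(math.exp(log_p + rng.gauss(0.0, noise_std)))  # observed
    return prices
```

Sampled tick by tick, realized variance is roughly the integrated variance plus 2n times the noise variance; sampling every 50th tick shrinks the noise term by a factor of 50. The same logic underlies the decimalization result: shrinking the noise variance itself lowers observed volatility.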
Abstract:
A vast amount of public services and goods is contracted through procurement auctions, so it is very important to design these auctions optimally. Typically, we are interested in two objectives. The first is efficiency: the contract should be awarded to the bidder that values it the most, which in the procurement setting means the bidder with the lowest cost of providing a service of a given quality. The second is to maximize public revenue, that is, to minimize the costs of procurement. Both goals are important from the welfare point of view. In this thesis, I analyze field data from procurement auctions and show how empirical analysis can be used to help design auctions that maximize public revenue. In particular, I concentrate on how competition, i.e., the number of bidders, should be taken into account in auction design. In the first chapter, the main policy question is whether the auctioneer should spend resources to induce more competition. The information paradigm is essential in analyzing the effects of competition. We speak of a private values information paradigm when the bidders know their valuations exactly; in a common values information paradigm, the information about the value of the object is dispersed among the bidders. With private values, more competition always increases public revenue, but with common values the effect of competition is ambiguous. I study the effects of competition in the City of Helsinki bus transit market by conducting tests for common values, and I extend an existing test by allowing bidder asymmetry. The information paradigm appears to be one of common values: the bus companies with garages close to the contracted routes are influenced more by the common value elements than those whose garages are further away.
Therefore, attracting more bidders does not necessarily lower procurement costs, and the City should not implement costly policies to induce more competition. In the second chapter, I ask how the auctioneer can increase its revenue by changing contract characteristics such as contract sizes and durations. I find that the City of Helsinki should shorten the contract duration in the bus transit auctions, because that would decrease the importance of the common value components and cheaply increase entry, which would then have a more beneficial impact on public revenue. Typically, cartels decrease public revenue in a significant way. In the third chapter, I propose a new statistical method for detecting collusion and compare it with an existing test. I argue that my test, unlike the existing one, is robust to unobserved heterogeneity. I apply both methods to procurement auctions for snow removal at schools in Helsinki. According to these tests, the bidding behavior of two of the bidders is consistent with a contract allocation scheme.
Abstract:
Dispersal is a highly important life history trait, and in fragmented landscapes the long-term persistence of populations depends on it. The evolution of dispersal is shaped by costs and benefits that may differ between landscapes, resulting in differences in the strength and direction of natural selection on dispersal in fragmented landscapes. Dispersal has been shown to be a nonrandom process associated with traits such as flight ability in insects. This thesis examines genetic and physiological traits affecting dispersal in the Glanville fritillary butterfly (Melitaea cinxia). Flight metabolic rate is a repeatable trait representing flight ability. Unlike in many vertebrates, resting metabolic rate cannot be used as a surrogate of maximum metabolic rate, as no strong correlation between the two was found in the Glanville fritillary. Resting and flight metabolic rates are affected by environmental variables, most notably temperature, but only flight metabolic rate has a strong genetic component. Molecular variation in the much-studied candidate locus phosphoglucose isomerase (Pgi), which encodes the glycolytic enzyme PGI, affects carbohydrate metabolism in flight. This effect is temperature dependent: at low to moderate temperatures, individuals heterozygous at the single nucleotide polymorphism (SNP) AA111 have a higher flight metabolic rate than the common homozygous genotype, whereas at high temperatures the situation is reversed. This finding suggests that variation in enzyme properties is indeed translated into organismal performance. High-resolution data on individual female Glanville fritillaries moving freely in the field were recorded using harmonic radar. There was a strong positive correlation between flight metabolic rate and dispersal rate; flight metabolic rate explained one third of the observed variation in the one-hour movement distance.
A fine-scaled analysis of mobility showed that mobility peaked at intermediate ambient temperatures, but the two common Pgi genotypes differed in their reaction norms to temperature. As with flight metabolic rate, heterozygotes at SNP AA111 were the most active genotype at low to moderate temperatures. The results show that molecular variation is associated with variation in dispersal rate through the link of flight physiology, under the influence of environmental conditions. The evolutionary pressures on dispersal differ between males and females. The effect of flight metabolic rate on dispersal was examined in both sexes under field and laboratory conditions. The relationship of flight metabolic rate to dispersal rate in the field and to flight duration in the laboratory was found to differ between the two sexes: in females the relationship was positive, but in males the longest distances and flight durations were recorded for individuals with low flight metabolic rate. These findings may reflect male investment in mate locating. Instead of dispersing, males with a high flight metabolic rate may establish territories and follow a perching strategy when locating females, and hence move less at the landscape level. Males with a low metabolic rate may be forced to disperse due to low competitive success, or may show adaptations to an alternative strategy: patrolling. In the light of life history trade-offs and the rate-of-living theory, a high metabolic rate may carry a cost in the form of a shortened lifespan. Experiments relating flight metabolic rate to longevity showed a clear correlation in the opposite direction: a high flight metabolic rate was associated with a long lifespan. This suggests that individuals with a high metabolic rate do not pay an extra physiological cost for their high flight capacity; rather, there are positive correlations between different measures of fitness. These results highlight the importance of condition.
Abstract:
Many languages exploit suprasegmental devices in signaling word meaning. Tone languages exploit fundamental frequency, whereas quantity languages rely on segmental durations to distinguish otherwise similar words. Traditionally, duration and tone have been taken as mutually exclusive. However, some evidence suggests that, in addition to durational cues, phonological quantity is associated with and co-signaled by changes in fundamental frequency in quantity languages such as Finnish, Estonian, and Serbo-Croat. The results from the present experiment show that the structure of disyllabic word stems in Finnish is indeed signaled tonally, and that the phonological length of the stressed syllable is further tonally distinguished within the disyllabic sequence. The results further indicate that the observed association of tone and duration in perception is systematically exploited in speech production in Finnish.
Abstract:
The purpose of this thesis is to examine the role of trade durations in price discovery. One motivation for using trade durations in the study of price discovery is that durations are robust to many microstructure effects that bias the measurement of returns volatility. Another is that it is difficult to think of economic variables that are genuinely useful in determining the source of volatility at arbitrarily high frequencies. The dissertation contains three essays. In the first essay, the role of trade durations in price discovery is examined with respect to the volatility pattern of stock returns. The theory of volatility is associated with the theory of the information content of trade, dear to market microstructure theory. The first essay documents that volatility per transaction is related to the intensity of trade, and that there is a strong relationship between the stochastic process of trade durations and trading variables. In the second essay, the role of trade durations is examined with respect to the quantification of risk due to a trading volume of a certain size. The theory of volume is intrinsically associated with the stock volatility pattern. The essay documents that volatility increases, in general, when traders choose to trade with large transactions. In the third essay, the role of trade durations is examined with respect to the information content of a trade, which is associated with the theory of the rate of price revisions in the market. The essay documents that short durations are associated with information; thus, traders are compensated for responding quickly to information.
Abstract:
This paper investigates the clustering pattern in the Finnish stock market. Using trading volume and time as factors capturing the clustering pattern, the Keim and Madhavan (1996) and Engle and Russell (1998) models provide the framework for the analysis. The descriptive and parametric analyses provide evidence that an important determinant of the famous U-shaped pattern in the market is the rate of information arrival, as measured by large trading volumes and durations at the market open and close. Specifically: (1) the larger the trading volume, the greater the impact on prices both in the short and the long run, so prices differ across quantities; (2) large trading volume is a non-linear function of price changes in the long run; (3) arrival times are positively autocorrelated, indicating a clustering pattern; and (4) information arrivals, as approximated by durations, are negatively related to trading flow.
Abstract:
This paper investigates the persistence pattern in the Helsinki Exchanges, analyzed using both a time and a price approach. It is hypothesized that arrival times are related to movements in prices. Thus, the arrival times are defined as durations and formulated as an autoregressive conditional duration (ACD) model as in Engle and Russell (1998), while the prices are defined as price changes and formulated as a GARCH process including duration measures. The research question follows from market microstructure predictions about price intensities, defined as the time between price changes. Microstructure theory states that long transaction durations may be associated with both no news and bad news; accordingly, short durations would be related to high volatility and long durations to low volatility. As a result, the spread will tend to be larger during intense trading. The main findings of this study are that (1) arrival times are positively autocorrelated and (2) long durations are associated with low volatility in the market.
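The Engle and Russell (1998) ACD model referred to above gives the conditional expected duration a GARCH-like recursion. A minimal simulation sketch of the ACD(1,1) case (the parameter values are illustrative assumptions, not estimates from this paper):

```python
import random

def simulate_acd(n, omega=0.1, alpha=0.1, beta=0.8, seed=1):
    """Simulate n durations from an ACD(1,1) model:
        x_i   = psi_i * eps_i,   eps_i ~ Exponential(1),
        psi_i = omega + alpha * x_{i-1} + beta * psi_{i-1}.
    """
    rng = random.Random(seed)
    psi = omega / (1.0 - alpha - beta)  # start at the unconditional mean
    x = psi
    durations = []
    for _ in range(n):
        psi = omega + alpha * x + beta * psi  # conditional expected duration
        x = psi * rng.expovariate(1.0)        # realized duration
        durations.append(x)
    return durations
```

With alpha + beta < 1 the process is stationary with unconditional mean omega / (1 - alpha - beta), and a large beta produces the positively autocorrelated, clustered durations that the paper documents.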
Abstract:
Continuing urbanization is a crucial driver of land transformation, with widespread impacts on virtually all ecosystems. Terrestrial ecosystems, including disturbed ones, depend on soils, which provide a multitude of ecosystem services. As soils are always directly and/or indirectly impacted by land transformation, land cover change causes soil change. Knowledge of ecosystem properties and functions in soils is growing in importance as humans continue to concentrate in already densely populated areas. Urban soils often have impaired functioning due to the various disturbances resulting from human activity, and innovative solutions are needed to restore the missing ecosystem services and quality of life in these urban environments. For instance, the ecosystem services of the urban green infrastructure may be substantially improved through knowledge of its functional properties. In the research forming this thesis, the impacts of four plant species (Picea abies, Calluna vulgaris, Lotus corniculatus and Holcus lanatus) on belowground biota and regulatory ecosystem services were investigated in two different urban soil types. The retention of inorganic nitrogen and phosphorus in the plant-soil system, the decomposition of plant litter, primary production, and the degradation of polycyclic aromatic hydrocarbons (PAHs) were examined in the field and under laboratory conditions. The main objective of the research was to determine whether the different plant species (representing traits with varying litter decomposability) would give rise to dissimilar urban belowground communities with differing ecological functions. Microbial activity, as well as the abundance of nematodes and the biomass of enchytraeid worms, was highest below the legume L. corniculatus. L. corniculatus and the grass H. lanatus, producing labile or intermediate-quality litter, enhanced the proportion of bacteria in the soil rhizosphere, while the recalcitrant litter-producing shrub C. vulgaris and the conifer P. abies stimulated the growth of fungi. The loss of nitrogen from the plant-soil system was small for H. lanatus and for the combination of C. vulgaris + P. abies, irrespective of their energy channel composition. These presumably nitrogen-conservative plant species effectively diminished leaching losses from the plant-soil systems with all the plant traits present. The laboratory experiment revealed a difference in N allocation between the plant traits: C. vulgaris and P. abies sequestered significantly more N in aboveground shoots than L. corniculatus and H. lanatus. Plant rhizosphere effects were less clear for phosphorus retention, litter decomposition and the degradation of PAH compounds. This may be due to the relatively short experimental durations, as the maturation of the plant-soil system is likely to take considerably longer. The empirical studies of this thesis demonstrated that soil communities rapidly reflect changes in plant coverage, with consequences for the functionality of soils. The energy channel composition of soils can be manipulated through plants, a conclusion also supported by the results of the separate meta-analysis conducted in this thesis. However, further research is needed to understand the linkages between biological community properties and ecosystem services in strongly human-modified systems.
Abstract:
The growing interest in higher-throughput sequencing over the last decade has led to the development of new sequencing applications. This thesis concentrates on optimizing DNA library preparation for the Illumina Genome Analyzer II sequencer. The library preparation steps that were optimized include fragmentation, PCR purification and quantification. DNA fragmentation was performed with focused sonication at different concentrations and durations. Two column-based PCR purification methods, a gel matrix method and a magnetic bead-based method were compared, and quantitative PCR and gel electrophoresis on a chip were compared for DNA quantification. The magnetic bead purification was found to be the most efficient and flexible purification method. The fragmentation protocol was changed to produce longer fragments, compatible with longer sequencing reads. Quantitative PCR correlates better with the cluster number and should thus be considered the default quantification method for sequencing. As a result of this study, more data have been acquired from sequencing at lower cost, and troubleshooting has become easier as quality control steps have been added to the protocol. New sequencing instruments and applications will create a demand for further optimization in the future.
Abstract:
We have developed CowLog, which is open-source software for recording behaviors from digital video and is easy to use and modify. CowLog tracks the time code from digital video files. The program is suitable for coding any digital video, but the authors have used it in animal research. The program has two main windows: a coding window, which is a graphical user interface used for choosing video files and defining output files that also has buttons for scoring behaviors, and a video window, which displays the video used for coding. The windows can be used in separate displays. The user types the key codes for the predefined behavioral categories, and CowLog transcribes their timing from the video time code to a data file. CowLog comes with an additional feature, an R package called Animal, for elementary analyses of the data files. With the analysis package, the user can calculate the frequencies, bout durations, and total durations of the coded behaviors and produce summary plots from the data.
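The kind of summary the Animal package computes — frequencies, bout durations, and total durations of coded behaviors — can be illustrated with a short Python sketch (this illustrates the idea only, not the actual R package, and the event-log format below is an assumption: each behavior is taken to continue until the next coded event):

```python
def summarize_behaviors(events, end_time):
    """Given a time-coded event log [(time, behavior), ...], sorted by time,
    where each behavior continues until the next event, compute per-behavior
    bout frequency, total duration, and mean bout duration."""
    next_times = [t for t, _ in events[1:]] + [end_time]
    stats = {}
    for (t0, behavior), t1 in zip(events, next_times):
        s = stats.setdefault(behavior, {"bouts": 0, "total": 0.0})
        s["bouts"] += 1          # one more bout of this behavior
        s["total"] += t1 - t0    # bout lasts until the next event
    for s in stats.values():
        s["mean_bout"] = s["total"] / s["bouts"]
    return stats
```

For example, for the log [(0.0, "lying"), (12.5, "standing"), (20.0, "lying"), (35.0, "eating")] observed until time 60.0, "lying" has two bouts totaling 27.5 time units, a mean bout duration of 13.75.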
Abstract:
Research on reading has been successful in revealing how attention guides eye movements when people read single sentences or text paragraphs under simplified and strictly controlled experimental conditions. However, less is known about reading processes in more naturalistic and applied settings, such as reading Web pages. This thesis investigates online reading processes by recording participants' eye movements. The thesis consists of four experimental studies that examine how the location of stimuli presented outside the currently fixated region (Studies I and III), text format (Study II), animation and abrupt onset of online advertisements (Study III), and the phase of an online information search task (Study IV) affect written language processing. Furthermore, the studies investigate how the goal of the reading task affects attention allocation during reading, by comparing reading for comprehension with free browsing and by varying the difficulty of an information search task. The results show that text format affects the reading process: vertical text (one word per line) is read at a slower rate than standard horizontal text, and mean fixation durations are longer for vertical than for horizontal text. Furthermore, animated online ads and abrupt ad onsets capture online readers' attention, direct their gaze toward the ads, and disrupt the reading process. Compared with a reading-for-comprehension task, online ads are attended to more in a free browsing task. Moreover, in both tasks abrupt ad onsets result in rather immediate fixations toward the ads, an effect that is enhanced when the ad is presented in the proximity of the text being read. In addition, reading processes vary as Web users proceed through online information search tasks, for example when searching for a specific keyword, looking for an answer to a question, or trying to find the subjectively most interesting topic.
A scanning type of behavior is typical at the beginning of the tasks, after which participants tend to switch to a more careful reading state, before finishing the tasks in states referred to as decision states. Furthermore, the results provided evidence that left-to-right readers extract more parafoveal information to the right of the fixated word than to the left, suggesting that learning biases attentional orienting toward the reading direction.