7 results for Individual-based model
in Helda - Digital Repository of the University of Helsinki
Abstract:
The thesis concentrates on two questions: the translation of metaphors in literary texts, and the use of semiotic models and tools in translation studies. The aim of the thesis is to present a semiotic, text-based model designed to ease the translation of metaphors and to analyze translated metaphors. In the translation of metaphors I will concentrate on the central problem of metaphor translation: in addition to its denotation and connotation, a single metaphor may contain numerous culture- or genre-specific meanings. How can a translator ensure the translation of all meanings relevant to the text as a whole? I will approach the question from two directions. Umberto Eco's holistic text analysis model provides an opportunity to concentrate on the problematic nature of metaphor translation at the level of the text as a specific entity, while George Lakoff and Mark Johnson's metaphor research makes it possible to approach the question at the level of individual metaphors. On the semiotic side, the model utilizes Eero Tarasti's existential semiotics, supported by Algirdas Greimas' actant model and Yuri Lotman's theory of cultural semiotics. In the model introduced in the thesis, individual texts are deconstructed through Eco's model into elements. The textual roles and features of these elements are distilled further through Tarasti's model into their coexistent meaning levels. The prioritization and analysis of these meaning levels provide an opportunity to consider the contents and significance of specific metaphors in relation to the needs of the text as a whole. As example texts, I will use Motörhead's hard rock classic Iron Horse/Born to Lose and its Finnish translation Rauta-airot by Viikate. I will use the introduced model to analyze the metaphors in the source and target texts, and to consider the transfer of culture-specific elements across language and cultural borders. In addition, I will use the analysis process to examine the validity of the model introduced in the thesis.
Abstract:
Many species inhabit fragmented landscapes, resulting either from anthropogenic or from natural processes. The ecological and evolutionary dynamics of spatially structured populations are affected by a complex interplay between endogenous and exogenous factors. The metapopulation approach, which simplifies the landscape to a discrete set of patches of breeding habitat surrounded by unsuitable matrix, has become a widely applied paradigm for the study of species inhabiting highly fragmented landscapes. In this thesis, I focus on the construction of biologically realistic models and their parameterization with empirical data, with the general objective of understanding how the interactions between individuals and their spatially structured environment affect ecological and evolutionary processes in fragmented landscapes. I study two hierarchically structured model systems: the Glanville fritillary butterfly in the Åland Islands, and a system of two interacting aphid species in the Tvärminne archipelago, both located in south-western Finland. The interesting and challenging feature of both study systems is that the population dynamics occur over multiple spatial scales that are linked by various processes. My main emphasis is on the development of mathematical and statistical methodologies. For the Glanville fritillary case study, I first build a Bayesian framework for the estimation of death rates and capture probabilities from mark-recapture data, with the novelty of accounting for variation among individuals in capture probabilities and survival. I then characterize the dispersal phase of the butterflies by deriving a mathematical approximation of a diffusion-based movement model applied to a network of patches. I use the movement model as a building block to construct an individual-based evolutionary model for the Glanville fritillary butterfly metapopulation. I parameterize the evolutionary model using a pattern-oriented approach, and use it to study how the landscape structure affects the evolution of dispersal. For the aphid case study, I develop a Bayesian model of hierarchical multi-scale metapopulation dynamics, in which the observed extinction and colonization rates are decomposed into intrinsic rates operating specifically at each spatial scale. In summary, I show how analytical approaches, hierarchical Bayesian methods and individual-based simulations can be used individually or in combination to tackle complex problems from many different viewpoints. In particular, hierarchical Bayesian methods provide a useful tool for decomposing ecological complexity into more tractable components.
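To make the mark-recapture component concrete, the following minimal Python sketch simulates capture histories with individual heterogeneity in survival and capture probability, the kind of among-individual variation the thesis's Bayesian framework estimates. The simulation design and all parameter values are hypothetical illustrations, not the thesis model.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_capture_histories(n_ind=200, n_occasions=10,
                               mean_phi=0.85, mean_p=0.4, sd=0.1):
    """Simulate mark-recapture histories with individual heterogeneity.

    Each individual gets its own survival probability (phi) and
    capture probability (p), drawn around the population means.
    """
    phi = np.clip(rng.normal(mean_phi, sd, n_ind), 0.01, 0.99)
    p = np.clip(rng.normal(mean_p, sd, n_ind), 0.01, 0.99)

    alive = np.ones(n_ind, dtype=bool)
    histories = np.zeros((n_ind, n_occasions), dtype=int)
    for t in range(n_occasions):
        # Capture is possible only while the individual is alive
        histories[alive, t] = rng.random(alive.sum()) < p[alive]
        # Survival to the next occasion; dead individuals stay dead
        alive &= rng.random(n_ind) < phi
    return histories

h = simulate_capture_histories()
print("mean captures per individual:", h.sum(axis=1).mean())
```

A hierarchical Bayesian fit would place priors on the population means and the among-individual spread and recover them from data of exactly this form.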
Abstract:
Large carnivore populations are currently recovering from past extirpation efforts and expanding back into their original habitats. At the same time, human activities have left very few wilderness areas with habitat suitable and extensive enough to maintain populations of large carnivores without human contact. Consequently, the long-term future of large carnivores depends on their successful integration into landscapes where humans live. Thus, understanding their behaviour and interaction with surrounding habitats is of utmost importance in the development of management strategies for large carnivores. This applies also to brown bears (Ursus arctos), which were almost exterminated from Scandinavia and Finland at the turn of the century but are now expanding their range, with current population estimates of approximately 2600 bears in Scandinavia and 840 in Finland. This thesis focuses on the large-scale habitat use and population dynamics of brown bears in Scandinavia, with the objective of developing modelling approaches that support the management of bear populations. Habitat analysis shows that bear home ranges occur mainly in forested areas with a low level of human influence relative to surrounding areas. Habitat modelling based on these findings allows identification and quantification of the potentially suitable areas for bears in Scandinavia. Additionally, this thesis presents novel improvements to home range estimation that enable realistic estimates of the effective area required for bears to establish a home range. This is achieved by fitting the model to radio-tracking data to establish the amount of temporal autocorrelation and the proportion of time spent in different habitat types. Together these form a basis for the landscape-level management of the expanding population. Successful management of bears also requires assessment of the consequences of harvest on population viability. An individual-based simulation model, accounting for sexually selected infanticide, was used to investigate the possibility of increasing the harvest under different hunting strategies, such as trophy harvest of males. The results indicated that the population can sustain twice the current harvest rate. However, harvest should be changed gradually while carefully monitoring population growth, as some effects of increased harvest may manifest themselves only after a time delay. The results and methodological improvements in this thesis can be applied to the Finnish bear population and to other large carnivores. They provide grounds for the further development of spatially realistic, management-oriented models of brown bear dynamics that can make projections of the future distribution of bears while accounting for the development of human activities.
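As a rough illustration of the kind of question the harvest simulations address, the sketch below projects a hunted population in which male harvest depresses recruitment with a one-year lag, a crude stand-in for sexually selected infanticide. It is not the thesis's individual-based model, and every parameter value in it is hypothetical.

```python
def project_population(n0=840, r=0.15, harvest=0.06,
                       male_share=0.5, ssi_cost=3.0, years=30):
    """Toy projection of a harvested bear population.

    Sexually selected infanticide (SSI) is caricatured: recruitment
    drops in proportion to last year's male harvest rate, because
    immigrant males may kill cubs sired by removed residents.
    """
    n = float(n0)
    prev_male_harvest = 0.0
    for _ in range(years):
        # SSI acts with a lag: cubs lost now reflect last season's harvest
        recruitment = r * max(0.0, 1.0 - ssi_cost * prev_male_harvest)
        n *= 1.0 + recruitment - harvest
        prev_male_harvest = harvest * male_share
    return n

for h in (0.06, 0.12):
    print(f"harvest {h:.0%}: population after 30 years = "
          f"{project_population(harvest=h):.0f}")
```

With these made-up parameters the doubled harvest is barely sustainable, and the recruitment cost appears only a year after the harvest changes, echoing the time-delay caveat in the abstract.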
Abstract:
Frictions are factors that hinder the trading of securities in financial markets. Typical frictions include limited market depth, transaction costs, lack of infinite divisibility of securities, and taxes. Conventional models used in mathematical finance often gloss over these issues, which affect almost all financial markets, by arguing that the impact of frictions is negligible and, consequently, that frictionless models are valid approximations. This dissertation consists of three research papers related to the study of the validity of such approximations in two distinct modeling problems. Models of price dynamics based on diffusion processes, i.e., continuous strong Markov processes, are widely used in the frictionless scenario. The first paper establishes that diffusion models can indeed be understood as approximations of price dynamics in markets with frictions. This is achieved by introducing an agent-based model of a financial market where finitely many agents trade a financial security, the price of which evolves according to the price impacts generated by trades. It is shown that, if the number of agents is large, then under certain assumptions the price process of the security, which is a pure-jump process, can be approximated by a one-dimensional diffusion process. In a slightly extended model, in which agents may exhibit herd behavior, the approximating diffusion model turns out to be a stochastic volatility model. Finally, it is shown that when the agents' tendency to herd is strong, logarithmic returns in the approximating stochastic volatility model are heavy-tailed. The remaining papers are related to no-arbitrage criteria and superhedging in continuous-time option pricing models under small-transaction-cost asymptotics. Guasoni, Rásonyi, and Schachermayer have recently shown that, in such a setting, a financial security admits no arbitrage opportunities, and there exist no feasible superhedging strategies for European call and put options written on it, as long as its price process is continuous and has the so-called conditional full support (CFS) property. Motivated by this result, the two papers establish CFS for certain stochastic integrals and for a subclass of Brownian semistationary processes. As a consequence, a wide range of possibly non-Markovian local and stochastic volatility models have the CFS property.
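The jump-to-diffusion approximation in the first paper can be caricatured in a few lines: many small, frequent price impacts from individual trades aggregate into a nearly Brownian log-price path. The sketch below is a plain symmetric random walk with the per-trade impact scaled as 1/sqrt(number of trades); the actual model and its convergence argument are considerably more involved.

```python
import numpy as np

rng = np.random.default_rng(7)

def pure_jump_price(n_trades=100_000, sigma=0.2, p0=100.0):
    """Pure-jump log-price path driven by random buy/sell orders.

    Each trade moves the log-price by +/- jump (its price impact).
    Scaling the jump size as sigma / sqrt(n_trades) keeps the total
    volatility O(1), so the path looks diffusive for large n_trades.
    """
    jump = sigma / np.sqrt(n_trades)
    signs = rng.choice([-1.0, 1.0], size=n_trades)
    return p0 * np.exp(np.cumsum(jump * signs))

path = pure_jump_price()
print("terminal price:", round(path[-1], 2))
```

Making sigma itself a slowly varying random process (agents herding) would turn the diffusion limit into a stochastic volatility model, in the spirit of the extended model described above.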
Abstract:
Molecular motors are proteins that convert chemical energy into mechanical work. The viral packaging ATPase P4 is a hexameric molecular motor that translocates RNA into preformed viral capsids. P4 belongs to the ubiquitous class of hexameric helicases. Although its structure is known, the mechanism of RNA translocation remains elusive. Here we present a detailed kinetic study of nucleotide binding, hydrolysis, and product release by P4. We propose a stochastic-sequential cooperative model to describe the coordination of ATP hydrolysis within the hexamer. In this model, the apparent cooperativity is a result of hydrolysis stimulation by ATP and RNA binding to neighboring subunits rather than of cooperative nucleotide binding. Simultaneous interaction of neighboring subunits with RNA makes the otherwise random hydrolysis sequential and processive. Further, we use hydrogen/deuterium exchange detected by high-resolution mass spectrometry to visualize P4 conformational dynamics during the catalytic cycle. Concerted changes in exchange kinetics reveal a cooperative unit that dynamically links ATP binding sites and the central RNA binding channel. The cooperative unit is compatible with the structure-based model in which translocation is effected by conformational changes of a limited protein region. Deuterium labeling also discloses the transition state associated with RNA loading, which proceeds via opening of the hexameric ring. Hydrogen/deuterium exchange is further used to delineate the interactions of the P4 hexamer with the viral procapsid. P4 associates with the procapsid via its C-terminal face. These interactions stabilize subunit interfaces within the hexamer. The conformation of the virus-bound hexamer is more stable than that of the hexamer in solution, which is prone to spontaneous ring openings. We propose that this stabilization within the viral capsid increases packaging processivity and confers selectivity during RNA loading. Finally, we use single-molecule techniques to characterize P4 translocation along RNA. While the P4 hexamer encloses RNA topologically within the central channel, it diffuses randomly along the RNA. In the presence of ATP, unidirectional net movement is discernible in addition to the stochastic motion. The diffusion is hindered by activation energy barriers that depend on the nucleotide binding state. The results suggest that P4 employs an electrostatic clutch instead of cycling through stable, discrete RNA binding states during translocation. Conformational changes coupled to ATP hydrolysis modify the electrostatic potential inside the central channel, which in turn biases RNA motion in one direction. Implications of the P4 model for other hexameric molecular motors are discussed.
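One way to see how nearest-neighbour coupling can make intrinsically random hydrolysis look sequential is a small Gillespie simulation of a six-subunit ring. The coupling rule below (a subunit fires faster once its clockwise neighbour has fired) is a deliberate simplification of the proposed model, which couples hydrolysis rates to ATP and RNA binding at neighbouring subunits; all rate constants are arbitrary, not fitted values.

```python
import numpy as np

rng = np.random.default_rng(3)

def hydrolysis_order(k_basal=1.0, stim=20.0):
    """Gillespie simulation of one hydrolysis round in a hexamer.

    Six subunits start ATP-bound. A subunit hydrolyses at rate
    k_basal, boosted by `stim` once its clockwise neighbour has
    already fired. The first event is random, but strong coupling
    makes the rest proceed nearly sequentially around the ring.
    """
    bound = np.ones(6, dtype=bool)
    order = []
    while bound.any():
        idx = np.arange(6)
        neighbour_fired = ~bound[(idx + 1) % 6]
        rates = bound * k_basal * np.where(neighbour_fired, stim, 1.0)
        i = rng.choice(6, p=rates / rates.sum())
        bound[i] = False
        order.append(int(i))
    return order

print("firing order:", hydrolysis_order())
```

Running this repeatedly shows a randomly chosen starting subunit followed by a near-deterministic sweep around the ring: apparent sequentiality without cooperative binding.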
Abstract:
This study examines the properties of Generalised Regression (GREG) estimators for domain class frequencies and proportions. The family of GREG estimators forms the class of design-based, model-assisted estimators. All GREG estimators utilise auxiliary information via modelling. The classic GREG estimator with a linear fixed-effects assisting model (GREG-lin) is one example. When estimating class frequencies, however, the study variable is binary or polytomous, so logistic-type assisting models (e.g. logistic or probit models) should be preferred over the linear one. Nevertheless, GREG estimators other than GREG-lin are rarely used, and knowledge about their properties is limited. This study examines the properties of L-GREG estimators, which are GREG estimators with fixed-effects logistic-type models. Three research questions are addressed. First, I study whether and when L-GREG estimators are more accurate than GREG-lin. Theoretical results and Monte Carlo experiments, covering both equal- and unequal-probability sampling designs and a wide variety of model formulations, show that in standard situations the difference between L-GREG and GREG-lin is small. But in the case of a strong assisting model, two interesting situations arise: if the domain sample size is reasonably large, L-GREG is more accurate than GREG-lin; and if the domain sample size is very small, estimation of the assisting model parameters may be inaccurate, resulting in bias for L-GREG. Second, I study variance estimation for the L-GREG estimators. The standard variance estimator (S) for all GREG estimators resembles the Sen-Yates-Grundy variance estimator, but it is a double sum of prediction errors, not of the observed values of the study variable. Monte Carlo experiments show that S underestimates the variance of L-GREG, especially if the domain sample size is small or if the assisting model is strong. Third, since the standard variance estimator S often fails for the L-GREG estimators, I propose a new augmented variance estimator (A). The difference between S and the new estimator A is that the latter takes into account the difference between the sample-fit model and the census-fit model. In Monte Carlo experiments, the new estimator A outperformed the standard estimator S in terms of bias, root mean square error and coverage rate. Thus the new estimator provides a good alternative to the standard estimator.
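To fix ideas, here is a minimal Python sketch of the two estimators being compared: a GREG estimator of a total is the population sum of assisting-model predictions plus the Horvitz-Thompson-weighted sum of sample residuals, and the only difference between GREG-lin and L-GREG is whether that assisting model is linear or logistic. The population, sampling design, and model below are hypothetical illustrations.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_logistic(x, y, iters=25):
    """Newton-Raphson fit of logit(p) = b0 + b1*x (unweighted, SRS case)."""
    X = np.column_stack([np.ones_like(x), x])
    b = np.zeros(2)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ b))
        H = (X * (p * (1 - p))[:, None]).T @ X
        b += np.linalg.solve(H, X.T @ (y - p))
    return b

def greg_total(y_s, yhat_s, yhat_U, pi_s):
    """GREG estimator of a total: population sum of fitted values
    plus Horvitz-Thompson-weighted sample residuals."""
    return yhat_U.sum() + ((y_s - yhat_s) / pi_s).sum()

# Hypothetical population: class membership depends on one covariate x
N, n = 10_000, 500
x = rng.normal(size=N)
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-(0.5 + 1.5 * x))))

s = rng.choice(N, n, replace=False)     # simple random sample
pi = np.full(n, n / N)                  # inclusion probabilities

# GREG-lin: linear assisting model
b0, b1 = np.polyfit(x[s], y[s], 1)[::-1]
yhat_lin = b0 + b1 * x
# L-GREG: logistic assisting model
bl = fit_logistic(x[s], y[s])
yhat_log = 1.0 / (1.0 + np.exp(-(bl[0] + bl[1] * x)))

print("true class frequency:", y.sum())
print("GREG-lin:", round(greg_total(y[s], yhat_lin[s], yhat_lin, pi)))
print("L-GREG:  ", round(greg_total(y[s], yhat_log[s], yhat_log, pi)))
```

The residual correction makes both estimators approximately design-unbiased, which is why, as the abstract notes, the two usually differ little except when the assisting model is strong.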
Abstract:
Detecting Earnings Management Using Neural Networks. In trying to balance relevant and reliable accounting data, generally accepted accounting principles (GAAP) allow, to some extent, company management to use their judgment and to make subjective assessments when preparing financial statements. The opportunistic use of this discretion in financial reporting is called earnings management. A considerable number of methods have been suggested for detecting accrual-based earnings management. A majority of these methods are based on linear regression. The problem with using linear regression is that a linear relationship between the dependent variable and the independent variables must be assumed. However, previous research has shown that the relationship between accruals and some of the explanatory variables, such as company performance, is non-linear. An alternative to linear regression that can handle non-linear relationships is neural networks. The type of neural network used in this study is the feed-forward back-propagation neural network. Three neural network-based models are compared with four commonly used linear regression-based earnings management detection models. All seven models are based on the earnings management detection model presented by Jones (1991). The performance of the models is assessed in three steps. First, a random data set of companies is used. Second, the discretionary accruals from the random data set are ranked according to six different variables, and the discretionary accruals in the highest and lowest quartiles for these six variables are compared. Third, a data set containing simulated earnings management is used; both expense and revenue manipulation, ranging between -5% and 5% of lagged total assets, are simulated. Furthermore, two neural network-based models and two linear regression-based models are applied to a data set containing financial statement data from 110 failed companies. Overall, the results show that the linear regression-based models, except for the model using a piecewise linear approach, produce biased estimates of discretionary accruals. The neural network-based model with the original Jones model variables and the neural network-based model augmented with ROA as an independent variable, however, perform well in all three steps. Especially in the second step, where the highest and lowest quartiles of ranked discretionary accruals are examined, the neural network-based model augmented with ROA as an independent variable outperforms the other models.
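The modelling idea can be sketched briefly: estimate "normal" accruals from Jones (1991)-style regressors with a small feed-forward network, and treat the residual as discretionary accruals. The sketch below uses scikit-learn's MLPRegressor on synthetic firm-year data; the variable definitions, scaling, data-generating process, and network architecture are illustrative assumptions, not the study's specification.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# Synthetic firm-years (hypothetical Jones-style regressors plus ROA)
n = 2000
inv_assets = rng.uniform(0.001, 0.01, n)   # 1 / lagged total assets
d_rev = rng.normal(0, 0.1, n)              # change in revenues, scaled
ppe = rng.uniform(0.1, 0.8, n)             # gross PPE, scaled
roa = rng.normal(0.05, 0.1, n)             # performance variable
X = np.column_stack([inv_assets, d_rev, ppe, roa])

# Normal accruals with a deliberately non-linear performance effect,
# plus noise; no earnings management is simulated in this sketch
accruals = 0.5 * d_rev - 0.1 * ppe + 0.3 * roa**2 + rng.normal(0, 0.02, n)

# Jones-style model with a feed-forward network in place of linear
# regression: discretionary accruals are the out-of-model residuals
net = MLPRegressor(hidden_layer_sizes=(8,), solver="lbfgs",
                   max_iter=2000, random_state=0)
net.fit(X, accruals)
discretionary = accruals - net.predict(X)
print("mean |discretionary accruals|:", np.abs(discretionary).mean().round(4))
```

Because the network can fit the roa**2 term that a linear specification misses, its residuals are less contaminated by performance, which is the core argument for the neural network-based detection models.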