52 results for idiosyncratic volatility


Relevance:

10.00%

Publisher:

Abstract:

This study examines both theoretically and empirically how well the theories of Norman Holland, David Bleich, Wolfgang Iser and Stanley Fish can explain readers' interpretations of literary texts. The theoretical analysis concentrates on their views on language from the point of view of Wittgenstein's Philosophical Investigations. This analysis shows that many of the assumptions related to language in these theories are problematic. The empirical data show that readers often form very similar interpretations. Thus the study challenges the common assumption that literary interpretations tend to be idiosyncratic. The empirical data consist of freely worded written answers to questions on three short stories. The interpretations were made by 27 Finnish university students. Some of the questions addressed issues that were discussed in large parts of the texts, while others referred to issues that were mentioned only in passing or merely implied. The short stories were "The Witch à la Mode" by D. H. Lawrence, "Rain in the Heart" by Peter Taylor and "The Hitchhiking Game" by Milan Kundera. According to Fish, readers create both the formal features of a text and their interpretation of it according to an interpretive strategy. People who agree form an interpretive community. However, a typical answer usually contains ideas repeated by several readers as well as observations not mentioned by anyone else. Therefore it is very difficult to determine which readers belong to the same interpretive community. Moreover, readers with opposing opinions often seem to pay attention to the same textual features and even acknowledge the possibility of an opposing interpretation; therefore they do not seem to create the formal features of the text in different ways. Iser suggests that an interpretation emerges from the interaction between the text and the reader when the reader determines the implications of the text and in this way fills the "gaps" in the text.
Iser believes that the text guides the reader, but since he also believes that meaning lies on a level beyond words, he cannot explain how the text directs the reader. The similarity of the interpretations, and the fact that agreement is strongest on issues that are discussed broadly in the text, do, however, support his assumption that readers are guided by the text. In Bleich's view, all interpretations have personal motives and each person has an idiosyncratic language system. The situation in which a person learns a word determines the most important meaning it has for that person. In order to uncover the personal etymologies of words, Bleich asks his readers to associate freely on the basis of a text and note down all the personal memories and feelings that the reading experience evokes. Bleich's theory of the idiosyncratic language system seems to rely on a misconceived notion of the role that ostensive definitions have in language use. The readers' responses show that spontaneous associations to personal life seem to colour the readers' interpretations, but such instances are rather rare. According to Holland, an interpretation reflects the reader's identity theme. Language use is regulated by shared rules, but everyone follows the rules in his or her own way. Words mean different things to different people. The problem with this view is that if there is any basis for language use, it seems to be the shared way of following linguistic rules. Wittgenstein suggests that our understanding of words is related to the shared ways of using words and to our understanding of human behaviour. This view seems to give better grounds for understanding similarities and differences in literary interpretations than the theories of Holland, Bleich, Fish and Iser.

Relevance:

10.00%

Publisher:

Abstract:

Titled "An Essay on Antimetaphoric Resistance", the dissertation investigates what is here being called "Counter-figures": a term which has in this context a certain variety of applications. Any other-than-image or other-than-figure, anything that cannot be exhausted by figuration (and that is, more or less, anything at all, except perhaps the reproducible images and figures themselves) can be considered "counter-figurative" with regard to the formation of images and figures, ideas and schemas, "any graven image, or any likeness of any thing". Singularity and radical alterity, as well as temporality and its peculiar mode of uniqueness are key issues here, and an ethical dimension is implied by, or intertwined with, the aesthetic. In terms borrowed from Paul Celan's "Meridian" speech, poetry may "allow the most idiosyncratic quality of the Other, its time, to participate in the dialogue". This connection between singularity, alterity and temporality is one of the reasons why Celan so strongly objects to the application of the traditional concept of metaphor to poetry. As Celan says, "carrying over [übertragen]" by metaphor may imply an unwillingness to "bear with [mittragen]" and to "endure [ertragen]" the poem. The thesis is divided into two main parts. The first consists of five distinct prolegomena which all address the mentioned variety of applications of the term "counter-figures", and especially the rejection or critique of either metaphor (by Aristotle, for instance) or the concept of metaphor (defined by Aristotle, and sometimes deemed "anti-poetic" by both theorists and poets). Even if we restrict ourselves to the traditional rhetorico-poetical terms, we may see how, for instance, metonymy can be a counter-figure for metaphor, allegory for symbol, and irony for any single trope or for any piece of discourse at all. 
The limits of figurality may indeed be located at these points of intersection between different types of tropes or figures, and even between figures or tropes and the "non-figurative trope" or "pseudo-figure" called catachresis. The second part, following on from the open-ended prolegomena, concentrates on Paul Celan's poetry and poetics. According to Celan, true poetry is "essentially anti-metaphoric". I argue that inasmuch as we are willing to pay attention to the "will" of the poetic images themselves (the tropes and metaphors in a poem) to be "carried ad absurdum", as Celan invites us to do, we may find alternative ways of reading poetry and approaching its "secret of the encounter", precisely when the traditional rhetorical instruments, and especially the notion of metaphor, become inapplicable or suspicious — and even where they still seem to impose themselves.

Relevance:

10.00%

Publisher:

Abstract:

The dissertation considers the birth of modernist and avant-gardist authorship as a reaction against mass society and mass culture. Radical avant-gardism is studied as figurative violence done to the human form. The main argument claims that avant-gardist authorship is an act of masculine autogenesis. This act demands that the human form be worked into an elementary state of disarticulation and then re-formed on the model of the artist's own psychophysical and idiosyncratic vision and experience. This work is connected to concrete mass: mass of pigment, charcoal, film, or flesh. The mass of the figure is worked to create a likeness in the nervous system of the spectator. The act of violence against the human figure is intended to shock the spectator. This shock is also a state of emotional and perceptual massification. I use the theatrical image as a heuristic tool for performance analysis, connecting figure and spectator into a larger image constituted by relationships of mimesis, in which the figure presents the likeness of the spectator and the spectator the likeness of the figure. Likeness is considered as both gestural (socially mimetic) and sensuous (kinesthetically mimetic). Through this kind of construction one can describe and contextualize the process of violent autogenesis, using particular images as case studies. The avant-gardist author is the author of the theatrical image, not of a particular figure, and through the act of massification the nervous system of the spectator also becomes part of this image. This is the most radical form and ideology of avant-gardist and modernist authorship, or the imagerial will to power.
I construct a model of the gestural-mimetic performer to explicate the nature of the violence done to the human form in specific works: Mann's novella Death in Venice, Schiele's and Artaud's self-portraits, Francis Bacon's paintings, Beckett's short play Not I, Orlan's surgical performance Omnipresence, Cindy Sherman's Untitled Film Stills, Diamanda Galás's recording Vena Cava, and Hitchcock's Psycho. Mass psychology constructed a phobic picture of the human form's plasticity and of its capacity to be shaped by influences coming both from inside and outside: childhood, atavistic organic memories, the urban field of nervous impulses, the unconscious, the capitalist (image) market, and democratic mass politics. Violence is then antimimetic and antitheatrical, a paradoxical situation, considering that mass media and mass audiences created an enormous fascination among artistic elites with the possibilities of theatrical and hypnotic influence. The problem was how to use the theatrical image without the author coming under its influence. In this work one possible answer is provided: by destroying the gestural-mimetic performer, by eliminating representations of mimetic body techniques from the performer of the human figure (a painted, photographed, filmed or acted figure, audiovisual or vocal). This work I call the chirurgical operation, which also indicates a co-optation of medical portraiture and of medico-cultural diagnoses of the human form. The destruction of the autonomy of the performer was a process parallel to the construction of the new mass-media audience as passive, plastic, and feminine. The process created an image of a new kind of autotelic masculine author-hero, freed from the human form in its bourgeois, aristocratic, classical and popular versions.

Relevance:

10.00%

Publisher:

Abstract:

This thesis concentrates on the bioavailability of organic soil contaminants in the context of bioremediation of soil contaminated with volatile or non-volatile hydrophobic pollutants. Bioavailability and biodegradation were studied from four viewpoints: (i) improvement of the bioavailability and biodegradation of volatile hydrocarbons in contained bioremediation systems at laboratory and pilot scale; (ii) improvement of the bioavailability of non-volatile, hydrophobic compounds in such systems; (iii) biodegradation of a non-volatile hydrophobic compound in soil organic matter in microcosms; (iv) bioavailability of nitrogen in an open, full-scale bioremediation system. It was demonstrated that the volatility of organic compounds can be controlled by amending the soil with adsorbents. The sorbed hydrocarbons were shown to be available to the soil microbiota. As a result, biodegradation of the volatile hydrocarbons was greatly favored at the expense of volatilization. PAH compounds were shown to be mobilized, and their bioavailability improved, by a hydrophobic, non-toxic additive, vegetable oil. The increased bioavailability of the PAHs was recorded as an increased toxicity of the soil. In spite of the increased bioavailability, biodegradation of the PAHs decreased. In microcosms simulating boreal forest organic surface soil, a PAH compound (pyrene) was shown to be removed from the soil biologically. Hydrophobicity of the substrate therefore does not necessarily mean low availability and biodegradation in organic soil. Finally, the thesis demonstrated that an unsuitable source of nitrogen, or its overdose, resulted in wasteful spending of this nutrient and even in harmful effects on soil microbes. Such events may inhibit rather than promote the bioremediation process in soil.

Relevance:

10.00%

Publisher:

Abstract:

Frictions are factors that hinder the trading of securities in financial markets. Typical frictions include limited market depth, transaction costs, lack of infinite divisibility of securities, and taxes. Conventional models used in mathematical finance often gloss over these issues, which affect almost all financial markets, by arguing that the impact of frictions is negligible and, consequently, that frictionless models are valid approximations. This dissertation consists of three research papers related to the study of the validity of such approximations in two distinct modeling problems. Models of price dynamics based on diffusion processes, i.e., continuous strong Markov processes, are widely used in the frictionless scenario. The first paper establishes that diffusion models can indeed be understood as approximations of price dynamics in markets with frictions. This is achieved by introducing an agent-based model of a financial market where finitely many agents trade a financial security, the price of which evolves according to the price impacts generated by trades. It is shown that, if the number of agents is large, then under certain assumptions the price process of the security, which is a pure-jump process, can be approximated by a one-dimensional diffusion process. In a slightly extended model, in which agents may exhibit herd behavior, the approximating diffusion model turns out to be a stochastic volatility model. Finally, it is shown that when the agents' tendency to herd is strong, logarithmic returns in the approximating stochastic volatility model are heavy-tailed. The remaining papers are related to no-arbitrage criteria and superhedging in continuous-time option pricing models under small-transaction-cost asymptotics.
Guasoni, Rásonyi, and Schachermayer have recently shown that, in such a setting, any financial security admits no arbitrage opportunities and there exist no feasible superhedging strategies for European call and put options written on it, as long as its price process is continuous and has the so-called conditional full support (CFS) property. Motivated by this result, CFS is established for certain stochastic integrals and a subclass of Brownian semistationary processes in the two papers. As a consequence, a wide range of possibly non-Markovian local and stochastic volatility models have the CFS property.
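The agent-based construction summarized above can be illustrated with a toy simulation. This is a minimal sketch under assumed dynamics (a linear price impact, an optional herding probability; all names and parameter values are hypothetical), not the paper's actual model: the log-price jumps with the net order flow of the agents, and individual jumps shrink as the number of agents grows, which is the intuition behind the diffusion approximation.

```python
import math
import random

def simulate_prices(n_agents=200, n_steps=1000, impact=0.01,
                    herd=0.0, seed=1):
    """Pure-jump price path driven by the net order flow of n_agents traders.

    Each step, every agent buys (+1) or sells (-1); with probability
    `herd` an agent copies the sign of the previous net flow instead of
    acting at random. The log-price moves by impact * flow / n_agents,
    so jump sizes shrink as n_agents grows.
    """
    rng = random.Random(seed)
    log_price = 0.0
    prev_sign = 1
    path = [1.0]
    for _ in range(n_steps):
        flow = 0
        for _ in range(n_agents):
            if rng.random() < herd:
                flow += prev_sign          # herding: follow the crowd
            else:
                flow += rng.choice((-1, 1))  # independent noise trade
        prev_sign = 1 if flow >= 0 else -1
        log_price += impact * flow / n_agents
        path.append(math.exp(log_price))
    return path
```

With `herd=0.0` the net flow is a sum of independent signs and returns are close to Gaussian; raising `herd` correlates trades and fattens the return tails, loosely mirroring the heavy-tail result in the strong-herding regime.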

Relevance:

10.00%

Publisher:

Abstract:

This thesis addresses the modeling of financial time series, especially stock market returns and daily price ranges. Data of this kind can be approached with so-called multiplicative error models (MEM). These models nest several well-known time series models, such as GARCH, ACD and CARR models, and are able to capture many well-established features of financial time series, including volatility clustering and leptokurtosis. In contrast to these phenomena, different kinds of asymmetries have received relatively little attention in the existing literature. In this thesis asymmetries arise from various sources. They are observed in both conditional and unconditional distributions, for variables with non-negative values and for variables with values on the real line. In the multivariate context, asymmetries can be observed in the marginal distributions as well as in the relationships between the variables modeled. New methods for all these cases are proposed. Chapter 2 considers GARCH models and the modeling of the returns of two stock market indices. The chapter introduces the so-called generalized hyperbolic (GH) GARCH model to account for asymmetries in both the conditional and the unconditional distribution. In particular, two special cases of the GARCH-GH model which describe the data most accurately are proposed. They are found to improve the fit of the model when compared to symmetric GARCH models. The advantages of accounting for asymmetries are also observed through Value-at-Risk applications. Both theoretical and empirical contributions are provided in Chapter 3 of the thesis. In this chapter the so-called mixture conditional autoregressive range (MCARR) model is introduced, examined and applied to daily price ranges of the Hang Seng Index. The conditions for the strict and weak stationarity of the model, as well as an expression for the autocorrelation function, are obtained by writing the MCARR model as a first-order autoregressive process with random coefficients.
The chapter also introduces the inverse gamma (IG) distribution to CARR models. The advantages of the CARR-IG and MCARR-IG specifications over conventional CARR models are found in the empirical application both in- and out-of-sample. Chapter 4 discusses the simultaneous modeling of absolute returns and daily price ranges. In this part of the thesis a vector multiplicative error model (VMEM) with an asymmetric Gumbel copula is found to provide substantial benefits over the existing VMEM models based on elliptical copulas. The proposed specification is able to capture the highly asymmetric dependence of the modeled variables, thereby improving the performance of the model considerably. The economic significance of the results is established by examining the information content of the derived volatility forecasts.
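The MEM family described above shares one recursion: a non-negative variable is the product of a conditional mean, which evolves GARCH-style, and a positive innovation with unit mean. The sketch below simulates a generic MEM with exponential innovations; it is an illustrative toy, not the thesis's GARCH-GH or MCARR-IG specification, and all parameter values are assumptions.

```python
import random

def simulate_mem(n, omega=0.1, alpha=0.2, beta=0.7, seed=0):
    """Simulate a multiplicative error model x_t = mu_t * eps_t.

    mu_t = omega + alpha * x_{t-1} + beta * mu_{t-1} is the GARCH-like
    conditional-mean recursion shared by ACD and CARR models; eps_t is
    a positive, mean-one innovation (exponential here for simplicity).
    Requires alpha + beta < 1 for a finite unconditional mean.
    """
    rng = random.Random(seed)
    mu = omega / (1.0 - alpha - beta)   # start at the unconditional mean
    xs = []
    for _ in range(n):
        eps = rng.expovariate(1.0)      # positive innovation, E[eps] = 1
        x = mu * eps
        xs.append(x)
        mu = omega + alpha * x + beta * mu
    return xs
```

Replacing the exponential innovation with a unit-mean inverse gamma draw would turn this toy into the CARR-IG flavor mentioned in the abstract; the recursion itself stays unchanged.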

Relevance:

10.00%

Publisher:

Abstract:

Functional transition theory: administration, legal order and institutions in Russia

This dissertation examines some of the salient characteristics of Russia that are deemed to have impeded the growth of its economy, and of investments in particular. These characteristics are the volatility of the administrative and legal systems, corruption, and the perceived irrationality and difference of the operating environment in comparison with European conditions. The dissertation is one of the first studies on Russia to approach the subject from the perspective of comprehensive social-scientific theories. The study is based on structural-functionalist theory, which is widely used in the social sciences. Adopting a sufficiently ambitious theoretical framework provides a systematic and logical explanation of those characteristics of Russian institutions and ways of operating, such as corruption, that are commonly perceived as inexplicable. The approach adopted in the dissertation sheds light on the history of Russia's development and provides a comparative view of other societies in transition. Furthermore, it suggests recommendations as to how the structures of Russian society could be comprehensively strengthened.

Relevance:

10.00%

Publisher:

Abstract:

It is widely accepted that the global climate is heating up due to human activities, such as the burning of fossil fuels. We therefore find ourselves forced to decide what measures, if any, need to be taken to decrease our warming effect on the planet before any irrevocable damage occurs. Research is being conducted in a variety of fields to better understand all the relevant processes governing Earth's climate, and to assess the relative roles of anthropogenic and biogenic emissions into the atmosphere. One of the least well quantified problems is the impact of small aerosol particles (of both anthropogenic and biogenic origin) on climate, through their reflection of solar radiation and their ability to act as condensation nuclei for cloud droplets. In this thesis, the compounds driving the biogenic formation of new particles in the atmosphere have been examined through detailed measurements. As directly measuring the composition of these newly formed particles is extremely difficult, the approach was to study their characteristics indirectly by measuring the hygroscopicity (water uptake) and volatility (evaporation) of particles between 10 and 50 nm. To study the first steps of the formation process in the sub-3 nm range, the nucleation of gaseous precursors into small clusters, the chemical composition of ambient naturally charged ions was measured. The ion measurements were performed with a newly developed mass spectrometer, which was first characterized in the laboratory before being deployed at a boreal forest measurement site. It was also successfully compared to similar, lower-resolution instruments. The ambient measurements showed that sulfuric acid clusters dominate the negative ion spectrum during new particle formation events. Sulfuric acid/ammonia clusters were detected in ambient air for the first time in this work.
Even though sulfuric acid is believed to be the most important gas-phase precursor driving the initial cluster formation, measurements of the hygroscopicity and volatility of growing 10-50 nm particles in Hyytiälä showed an increasing role of organic vapors at a variety of oxidation levels. This work has provided additional insights into the compounds participating both in the initial formation and in the subsequent growth of new atmospheric aerosol particles. It will hopefully prove an important step in understanding atmospheric gas-to-particle conversion, which, by influencing cloud properties, can have important climate impacts. All available knowledge needs to be constantly updated, summarized, and brought to the attention of our decision-makers. Only by increasing our understanding of all the relevant processes can we build reliable models to predict the long-term effects of decisions made today.

Relevance:

10.00%

Publisher:

Abstract:

Aerosols impact the planet and our daily lives through various effects, perhaps most notably those related to their climatic and health-related consequences. While there are several primary particle sources, secondary new particle formation from precursor vapors is also known to be a frequent, global phenomenon. Nevertheless, the formation mechanism of new particles, as well as the vapors participating in the process, remain a mystery. This thesis consists of studies on new particle formation specifically from the point of view of numerical modeling. A dependence of the formation rate of 3 nm particles on the sulphuric acid concentration raised to a power of 1-2 has been observed. This suggests that the nucleation mechanism is of first or second order with respect to the sulphuric acid concentration, in other words a mechanism based on activation or on the kinetic collision of clusters. However, model studies have had difficulties in replicating the small exponents observed in nature. The work done in this thesis indicates that the exponents may be lowered by the participation of a co-condensing (and potentially nucleating) low-volatility organic vapor, or by increasing the assumed size of the critical clusters. On the other hand, the new and more accurate method presented here for determining the exponent indicates high diurnal variability. Additionally, these studies included several semi-empirical nucleation rate parameterizations as well as a detailed investigation of the analysis used to determine the apparent particle formation rate. Because oceans cover a large proportion of the Earth's surface, they could prove to be climatically significant sources of secondary particles. In the absence of marine observation data, new particle formation events in a coastal region were parameterized and studied. Since the formation mechanism is believed to be similar, the new parameterization was applied in a marine scenario.
The work showed that marine CCN production is feasible in the presence of additional vapors contributing to particle growth. Finally, a new method to estimate the concentrations of condensing organic vapors was developed. The algorithm uses a Markov chain Monte Carlo method to determine the required combination of vapor concentrations by comparing a measured particle size distribution with one produced by an aerosol dynamics process model. The evaluation indicated excellent agreement with model data, and initial results with field data appear sound as well.
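The power-law dependence of the formation rate on sulphuric acid concentration discussed above, J = k * [H2SO4]^n, is commonly estimated as the slope of a log-log regression. The sketch below shows that procedure on noiseless synthetic data generated with n = 2 (the kinetic case); the rate constant and concentration range are invented for illustration and do not come from the thesis.

```python
import math

def fit_exponent(conc, rate):
    """Least-squares slope of log(rate) against log(conc).

    For a power law J = k * C**n, the slope of the log-log regression
    recovers the exponent n (activation mechanism: n ~ 1; kinetic
    cluster collision: n ~ 2).
    """
    xs = [math.log(c) for c in conc]
    ys = [math.log(j) for j in rate]
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

# Synthetic check: data generated with n = 2 should give a slope of ~2.
conc = [10 ** (6 + 0.1 * i) for i in range(20)]   # hypothetical [H2SO4] values
rate = [1e-14 * c ** 2 for c in conc]             # exact power law, n = 2
```

On real, noisy ambient data the fitted slope scatters around the true exponent, which is one reason the apparent exponent shows the diurnal variability mentioned above.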