14 results for generalization

in Helda - Digital Repository of the University of Helsinki


Relevance: 20.00%

Abstract:

This work investigates the role of narrative literature in late-20th-century and contemporary Anglo-American moral philosophy. It aims to show that the trend of reading narrative literature for purposes of moral philosophy, from the 1970s and early '80s to the present day, is part of a larger movement in Anglo-American moral philosophy, and to present a view of its significance for moral philosophy overall. Chapter 1 provides some preliminaries concerning the view of narrative literature on which my discussion builds. In chapter 2 I give an outline of how narrative literature is considered in contemporary Anglo-American moral philosophy, and connect this use to the broad trend of neo-Aristotelian ethics in this context. In chapter 3 I connect the use of literature to the idea of the non-generalizability of moral perception and judgment, which is central to the neo-Aristotelian trend, as well as to a range of moral particularisms and anti-theoretical positions of late-20th-century and contemporary ethics. The joint task of chapters 2 and 3 is to situate the trend of reading narrative literature for the purposes of moral philosophy in the present context of moral philosophy. In the following two chapters, 4 and 5, I move on from the particularizing power of narrative literature, which is emphasized by neo-Aristotelians and particularists alike, to a broader understanding of the intellectual potential of narrative literature. In chapter 4 I argue that narrative literature has its own forms of generalization, which enrich our understanding of the workings of ethical generalizations in philosophy. In chapter 5 I discuss Iris Murdoch's and Martha Nussbaum's respective ways of combining ethical generality and particularity in a philosophical framework where both systematic moral theory and narrative literature are taken seriously. In chapter 6 I analyse the controversy between contemporary anti-theoretical conceptions of ethics and Nussbaum's refutation of these.
I present my suggestion for how the significance of the ethics/literature discussion for moral philosophy can be understood if one wants to overcome the limitations of both Nussbaum's theory-centred, equilibrium-seeking perspective and the anti-theorists' repudiation of theory. I call my position the 'inclusive approach'.

Relevance: 10.00%

Abstract:

ADHD (attention deficit hyperactivity disorder) is a developmental neurobiological disability. In adults, the prevalence of ADHD has been estimated at about 4%. In addition to difficulties with attention, problems in executive functioning are typical, and psychiatric comorbidities are common. The most extensively studied treatments are pharmacological. There is also evidence of the usefulness of cognitive-behavioural therapy (CBT) in the treatment of adults with ADHD. There are some preliminary results on the effectiveness of cognitive training and hypnosis in children, but no scientific proof in adults. This dissertation is based on two intervention studies. In the first study, the usefulness of a new group CBT (n = 29) and the maintenance of the symptom reduction over a six-month follow-up were studied. In the second study, the usefulness of short hypnotherapy (n = 9), short individual CBT (n = 10) and computerized cognitive training (n = 9) was examined by comparing the groups with each other and with a control group (n = 10). Participation in the group CBT and the participants' satisfaction were good. There were no changes in self-reports during a three-month waiting period. After the rehabilitation, the symptoms decreased, and participants showing symptom reduction during rehabilitation maintained their benefit through the 6-month follow-up period. In a combined ADHD symptom score based on self-reports, seven participants in the hypnotherapy group, six in the CBT group, two in the cognitive training group and two controls improved. Using independent evaluations, improvement was found in six of the hypnotherapy, seven of the CBT, two of the cognitive training and three of the control participants. There was no treatment-related improvement in cognitive performance. Thus, some encouraging improvement was seen in the hypnotherapy and CBT groups.
In the cognitive training group, there was improvement in the trained tasks but no generalization of the improvement. The results support earlier findings on the usefulness of CBT in the treatment of adults with ADHD. Hypnotherapy also appears to be a useful form of rehabilitation. More research is needed to evaluate the usefulness of cognitive training, and these promising results warrant further studies with more participants, longer treatment durations, and different measures of cognitive functioning and quality of life. In addition to medication, it is important to arrange psychosocial interventions for adults with ADHD.

Relevance: 10.00%

Abstract:

The formulation of a quantum theory of gravitation has been a goal of theoretical physicists since the birth of quantum mechanics. Applying quantum mechanics to high-energy phenomena in the framework of general relativity leads to an operational noncommutativity of the space-time coordinates. Noncommutative space-time geometries are also encountered in certain low-energy limits of open string theories. A theory of gravitation on noncommutative space-time could be compatible with quantum mechanics: it could enable a description of the presumably nonlocal physics of processes at extremely short distances and high energies, while yielding a theory that agrees with general relativity at long distances. In this work I consider gravitation as a gauge theory of the Poincaré symmetry and attempt to generalize this view to noncommutative space-times. I first present the central role of Poincaré symmetry in relativistic physics and show how the classical theory of gravitation is derived as a gauge theory of Poincaré symmetry in commutative space-time. I continue by introducing noncommutative space-time and the formulation of quantum field theory on it. Because of the local nature of gauge symmetries, I examine carefully the formulation of gauge field theories in noncommutative space-time. Particular attention is paid to the twisted Poincaré symmetry of these theories, a new type of quantum symmetry possessed by noncommutative space-time. I then consider the problems of formulating a noncommutative theory of gravitation and the solutions proposed in the literature, and explain how all approaches so far fail to formulate covariance under general coordinate transformations, the cornerstone of general relativity.
Finally, I study the possibility of generalizing the twisted Poincaré symmetry to a local gauge symmetry, in the hope of obtaining a noncommutative gauge theory of gravitation. I show that such a generalization cannot be achieved by twisting the Poincaré symmetry with a covariant twist element. Consequently, future research on noncommutative gravitation and twisted Poincaré symmetry should focus on other approaches.
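As a brief illustration of the objects discussed above (conventions vary between references), the canonical noncommutative space-time and the twist element behind the twisted Poincaré symmetry are commonly written as

\[
[\hat{x}^{\mu}, \hat{x}^{\nu}] = i\,\theta^{\mu\nu},
\qquad
\mathcal{F} = \exp\!\Big(\tfrac{i}{2}\,\theta^{\mu\nu}\, P_{\mu} \otimes P_{\nu}\Big),
\]

where $\theta^{\mu\nu}$ is a constant antisymmetric matrix and $P_{\mu}$ are the translation generators. The twist $\mathcal{F}$ deforms the coproduct of the Poincaré algebra, $\Delta(Y) \mapsto \mathcal{F}\,\Delta(Y)\,\mathcal{F}^{-1}$, while leaving the algebra relations themselves intact.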

Relevance: 10.00%

Abstract:

The object of this dissertation is to study globally defined bounded p-harmonic functions on Cartan-Hadamard manifolds and Gromov hyperbolic metric measure spaces. Such functions are constructed by solving the so-called Dirichlet problem at infinity: the problem of finding a p-harmonic function on the space that extends continuously to the boundary at infinity and attains given boundary values there. The dissertation consists of an overview and three published research articles. In the first article the Dirichlet problem at infinity is considered for more general A-harmonic functions on Cartan-Hadamard manifolds. In the special case of two dimensions the Dirichlet problem at infinity is solved by assuming only that the sectional curvature has a certain upper bound, and a sharpness result is proved for this upper bound. In the second article the Dirichlet problem at infinity is solved for p-harmonic functions on Cartan-Hadamard manifolds under the assumption that, outside a compact set, the sectional curvature is bounded from above and from below by functions that depend on the distance to a fixed point. The curvature bounds allow examples of quadratic decay and examples of exponential growth. In the final article a generalization of the Dirichlet problem at infinity for p-harmonic functions is considered on Gromov hyperbolic metric measure spaces. Existence and uniqueness results are proved, and Cartan-Hadamard manifolds are considered as an application.
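For reference, a function $u$ is p-harmonic when it is a (weak) solution of the p-Laplace equation

\[
\Delta_{p} u \;=\; \operatorname{div}\!\big( \lvert \nabla u \rvert^{\,p-2}\, \nabla u \big) \;=\; 0,
\qquad 1 < p < \infty,
\]

and the Dirichlet problem at infinity asks, for a given continuous $f$ on the boundary at infinity $\partial_{\infty} M$, for a p-harmonic $u$ on $M$ that is continuous up to $\partial_{\infty} M$ with $u|_{\partial_{\infty} M} = f$. The case $p = 2$ recovers ordinary harmonic functions.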

Relevance: 10.00%

Abstract:

In this Thesis, we develop theory and methods for computational data analysis. The problems in data analysis are approached from three perspectives: statistical learning theory, the Bayesian framework, and the information-theoretic minimum description length (MDL) principle. Contributions in statistical learning theory address the possibility of generalization to unseen cases, and regression analysis with partially observed data with an application to mobile device positioning. In the second part of the Thesis, we discuss so called Bayesian network classifiers, and show that they are closely related to logistic regression models. In the final part, we apply the MDL principle to tracing the history of old manuscripts, and to noise reduction in digital signals.
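The stated connection between Bayesian network classifiers and logistic regression can be illustrated in a much-simplified special case (a sketch with made-up numbers, not the Thesis's construction): for a one-dimensional Gaussian naive Bayes classifier with class-shared variance, the posterior log-odds is an affine function of the input, i.e. exactly the functional form fitted by logistic regression.

```python
import numpy as np

# Hypothetical class-conditional Gaussians with shared variance;
# all parameter values are made up for illustration.
mu0, mu1, var, prior1 = 0.0, 2.0, 1.0, 0.5

def nb_log_odds(x):
    # log P(y=1|x) - log P(y=0|x), computed directly from the generative model
    ll1 = -(x - mu1) ** 2 / (2 * var)
    ll0 = -(x - mu0) ** 2 / (2 * var)
    return ll1 - ll0 + np.log(prior1 / (1 - prior1))

def logistic_form(x):
    # the same log-odds rearranged as w*x + b: a logistic regression model
    w = (mu1 - mu0) / var
    b = (mu0 ** 2 - mu1 ** 2) / (2 * var) + np.log(prior1 / (1 - prior1))
    return w * x + b

# the two expressions agree everywhere on a test grid
xs = np.linspace(-3.0, 5.0, 9)
assert np.allclose(nb_log_odds(xs), logistic_form(xs))
```

The general relationship studied in the Thesis is richer than this toy case, but the sketch shows why the two model families can coincide as discriminative functions.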

Relevance: 10.00%

Abstract:

Place identification refers to the process of analyzing sensor data in order to detect places, i.e., spatial areas that are linked with activities and associated with meanings. Place information can be used, e.g., to provide awareness cues in applications that support social interactions, to provide personalized and location-sensitive information to the user, and to support mobile user studies by providing cues about the situations the study participant has encountered. Regularities in human movement patterns make it possible to detect personally meaningful places by analyzing the location traces of a user. This thesis focuses on providing system-level support for place identification, as well as on algorithmic issues related to the place identification process. The move from location to place requires interactions between location sensing technologies (e.g., GPS or GSM positioning), algorithms that identify places from location data, and applications and services that utilize place information. These interactions can be facilitated using a mobile platform, i.e., an application or framework that runs on a mobile phone. For the purposes of this thesis, mobile platforms automate data capture and processing and provide means for disseminating data to applications and other system components. The first contribution of the thesis is BeTelGeuse, a freely available, open source mobile platform that supports multiple runtime environments. The actual place identification process can be understood as a data analysis task where the goal is to analyze (location) measurements and to identify areas that are meaningful to the user. The second contribution of the thesis is the Dirichlet Process Clustering (DPCluster) algorithm, a novel place identification algorithm. The performance of the DPCluster algorithm is evaluated using twelve different datasets that have been collected by different users, at different locations, and over different periods of time.
As part of the evaluation we compare the DPCluster algorithm against other state-of-the-art place identification algorithms. The results indicate that the DPCluster algorithm provides improved generalization performance under spatial and temporal variations in location measurements.
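As a toy illustration of the place-identification step, the following greedy "leader" clustering groups location fixes into candidate places. This is emphatically not the DPCluster algorithm; the radius parameter and coordinates are made up, and real place identification must handle noise, time, and visit frequency.

```python
import math

def leader_cluster(points, radius):
    """Greedy 'leader' clustering: assign each fix to the nearest existing
    cluster centre within `radius`, else open a new cluster. Centres are
    updated as running means of their members."""
    centres, members = [], []
    for p in points:
        best, best_d = None, float("inf")
        for i, c in enumerate(centres):
            d = math.dist(p, c)
            if d < best_d:
                best, best_d = i, d
        if best is not None and best_d <= radius:
            members[best].append(p)
            n = len(members[best])
            # incremental update of the cluster centre
            centres[best] = tuple((c * (n - 1) + v) / n
                                  for c, v in zip(centres[best], p))
        else:
            centres.append(p)
            members.append([p])
    return centres, members
```

Run on a handful of fixes around two locations, e.g. `leader_cluster([(0, 0), (1, 1), (100, 100), (101, 99), (0.5, 0)], 10)`, the sketch yields two clusters, mirroring the intuition that repeated visits to nearby coordinates form a "place".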

Relevance: 10.00%

Abstract:

Volatility is central to options pricing and risk management. It reflects the uncertainty of investors and the inherent instability of the economy. Time series methods are among the most widely applied scientific methods for analyzing and predicting volatility. Very frequently sampled data contain much valuable information about the different elements of volatility and may ultimately reveal the reasons for time-varying volatility. The use of such ultra-high-frequency data is common to all three essays of the dissertation, which belongs to the field of financial econometrics. The first essay uses wavelet methods to study the time-varying behavior of scaling laws and long memory in the five-minute volatility series of Nokia on the Helsinki Stock Exchange around the burst of the IT bubble. The essay is motivated by earlier findings which suggest that different scaling laws may apply to intraday time-scales and to larger time-scales, implying that the so-called annualized volatility depends on the data sampling frequency. The empirical results confirm the appearance of time-varying long memory and of different scaling laws that, for a significant part, can be attributed to investor irrationality and to an intraday volatility periodicity called the New York effect. The findings have potentially important consequences for options pricing and risk management, which commonly assume constant memory and scaling. The second essay investigates modelling the duration between trades in stock markets. Durations convey information about investor intentions and provide an alternative view of volatility. Generalizations of standard autoregressive conditional duration (ACD) models are developed to meet needs observed in previous applications of the standard models.
According to the empirical results, based on data for actively traded stocks on the New York Stock Exchange and the Helsinki Stock Exchange, the proposed generalization clearly outperforms the standard models and also performs well in comparison to another recently proposed alternative. The distribution used to derive the generalization may also prove valuable in other areas of risk management. The third essay studies empirically the effect of decimalization on volatility and market microstructure noise. Decimalization refers to the change from fractional pricing to decimal pricing, carried out on the New York Stock Exchange in January 2001. The methods used here are more accurate than those of earlier studies and put more weight on market microstructure. The main result is that decimalization decreased observed volatility by reducing noise variance, especially for the highly active stocks. The results are of use in risk management and in the design of market mechanisms.
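The standard ACD(1,1) model that the second essay generalizes can be sketched as follows. Parameter values here are illustrative only; the essay's generalizations modify, among other things, the innovation distribution.

```python
import random

def simulate_acd(n, omega=0.1, alpha=0.1, beta=0.8, seed=42):
    """Simulate the standard ACD(1,1) model of trade durations:
        x_i = psi_i * eps_i,
        psi_i = omega + alpha * x_{i-1} + beta * psi_{i-1},
    with unit-exponential innovations eps_i. The conditional mean duration
    psi_i plays the role that conditional variance plays in a GARCH model."""
    rng = random.Random(seed)
    psi = omega / (1 - alpha - beta)   # unconditional mean duration
    x = psi
    durations = []
    for _ in range(n):
        psi = omega + alpha * x + beta * psi
        x = psi * rng.expovariate(1.0)
        durations.append(x)
    return durations

# with these parameters the unconditional mean duration is
# omega / (1 - alpha - beta) = 1.0, so a long sample mean should be near 1
ds = simulate_acd(20000)
```

Clustering of short durations in the simulated series corresponds to bursts of trading activity, which is why durations offer an alternative view of volatility.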

Relevance: 10.00%

Abstract:

Habitat fragmentation is currently affecting many species throughout the world. As a consequence, an increasing number of species are structured as metapopulations, i.e. as local populations connected by dispersal. While excellent studies of metapopulations have accumulated over the past 20 years, the focus has recently shifted from single species to studies of multiple species. This has created the concept of metacommunities, where local communities are connected by the dispersal of one or several of their member species. To understand this higher level of organisation, we need to address not only the properties of single species but also establish the importance of interspecific interactions. However, studies of metacommunities are so far heavily biased towards laboratory-based systems, and empirical data from natural systems are urgently needed. My thesis focuses on a metacommunity of insect herbivores on the pedunculate oak Quercus robur, a tree species known for its high diversity of host-specific insects. Taking advantage of the amenability of this system to both observational and experimental studies, I quantify and compare the importance of local and regional factors in structuring herbivore communities. Most importantly, I contrast the impact of direct and indirect competition, host plant genotype and local adaptation (i.e. local factors) with that of regional processes (as reflected by the spatial context of the local community). As a key approach, I use general theory to generate testable hypotheses, controlled experiments to establish causal relations, and observational data to validate the role played by the pinpointed processes in nature. As the central outcome of my thesis, I am able to relegate local forces to a secondary role in structuring oak-based insect communities.
While controlled experiments show that direct competition does occur among both conspecifics and heterospecifics, that indirect interactions can be mediated by both the host plant and the parasitoids, and that host plant genotype may affect local adaptation, the size of these effects is much smaller than that of spatial context. Hence, I conclude that dispersal between habitat patches plays a prime role in structuring the insect community, and that the distribution and abundance of the target species can only be understood in a spatial framework. By extension, I suggest that the majority of herbivore communities are dependent on the spatial structure of their landscape and urge fellow ecologists working on other herbivore systems to either support or refute my generalization.

Relevance: 10.00%

Abstract:

The question at issue in this dissertation is the epistemic role played by ecological generalizations and models. I investigate and analyze such properties of generalizations as lawlikeness, invariance, and stability, and I ask which of these properties are relevant in the context of scientific explanations. I claim that there are generalizable and reliable causal explanations in ecology, given by generalizations that are invariant and stable. An invariant generalization continues to hold or be valid under a special change, called an intervention, that changes the value of its variables. Whether a generalization remains invariant under interventions is the criterion that determines whether it is explanatory; a generalization can be invariant and explanatory regardless of its lawlike status. Stability concerns a different kind of generality: the extent to which a generalization continues to hold across possible background conditions. The more stable a generalization, the less dependent it is on background conditions to remain true. Although it is invariance rather than stability that furnishes us with explanatory generalizations, stability has an important function in the context of explanations: it furnishes the extrapolability and reliability of scientific explanations. I also discuss non-empirical investigations of models that I call robustness and sensitivity analyses. I call sensitivity analyses those investigations in which one model is studied with regard to its stability conditions by making changes and variations to the values of the model's parameters. As a general definition of robustness analyses I propose investigations of variations in the modeling assumptions of different models of the same phenomenon, in which the focus is on whether they produce similar or convergent results.
Robustness and sensitivity analyses are powerful tools for studying the conditions and assumptions under which models break down, and they are especially powerful in pointing out the reasons why: they show which conditions or assumptions the results of models depend on. Keywords: ecology, generalizations, invariance, lawlikeness, philosophy of science, robustness, explanation, models, stability
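The notion of sensitivity analysis described above can be made concrete with a toy model (the model and all numbers are mine, purely illustrative): varying one parameter of a discrete logistic growth model exposes both a range where the result is insensitive to the variation and a range where the model breaks down.

```python
def trajectory(r, K=100.0, n0=5.0, steps=200):
    """Iterate the discrete logistic map N' = N + r*N*(1 - N/K)
    and return the final population size."""
    n = n0
    for _ in range(steps):
        n = n + r * n * (1 - n / K)
    return n

# sensitivity analysis over the growth rate r:
# for modest r the outcome is robust -- every run settles at N* = K ...
finals = [trajectory(r) for r in (0.1, 0.5, 1.0)]

# ... but for large r the same model no longer settles (oscillation/chaos),
# which is exactly the kind of breakdown a sensitivity analysis exposes
chaotic = trajectory(2.8)
```

The first sweep illustrates stability of a result across parameter variations; the second illustrates locating the conditions under which the model's behavior changes qualitatively.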

Relevance: 10.00%

Abstract:

Modeling and forecasting of implied volatility (IV) is important to both practitioners and academics, especially in trading, pricing, hedging, and risk management, all of which require accurate volatility estimates. The task has become challenging since the 1987 stock market crash, as implied volatilities (IVs) recovered from stock index options display two patterns, the volatility smirk (skew) and the volatility term structure, which together form a rich implied volatility surface (IVS). This implies that the assumptions behind the Black-Scholes (1973) model do not hold empirically, as asset prices are influenced by many underlying risk factors. This thesis, consisting of four essays, models and forecasts implied volatility in the presence of these empirical regularities of options markets. The first essay models the dynamics of the IVS by extending the Dumas, Fleming and Whaley (DFW) (1998) framework: using moneyness in the implied forward price and OTM put-call options on the FTSE100 index, a nonlinear optimization is used to estimate different models and thereby produce rich, smooth IVSs. The constant-volatility model fails to explain the variation in the rich IVS. It is then found that three factors can explain about 69-88% of the variance in the IVS: on average, 56% is explained by the level factor, 15% by the term-structure factor, and an additional 7% by the jump-fear factor. The second essay proposes a quantile regression model of the contemporaneous asymmetric return-volatility relationship, a generalization of the Hibbert et al. (2008) model. The results show a strong negative asymmetric return-volatility relationship at various quantiles of the IV distributions, monotonically increasing from the median quantile to the uppermost quantile (i.e., 95%); OLS therefore underestimates this relationship at upper quantiles.
Additionally, the asymmetric relationship is more pronounced with the smirk (skew)-adjusted volatility index measure than with the old volatility index measure. The volatility indices rank in terms of asymmetric volatility as follows: VIX, VSTOXX, VDAX, and VXN. The third essay examines the information content of the new-VDAX volatility index for forecasting daily Value-at-Risk (VaR) estimates and compares its VaR forecasts with those of Filtered Historical Simulation and RiskMetrics. All daily VaR models are backtested over 1992-2009 using unconditional coverage, independence, conditional coverage, and quadratic-score tests. The VDAX is found to subsume almost all information required for daily VaR forecasts for a portfolio on the DAX30 index; implied-VaR models outperform all other VaR models. The fourth essay models the risk factors driving swaption IVs. Three factors are found to explain 94-97% of the variation in each of the EUR, USD, and GBP swaption IVs. There are significant linkages across factors, and bi-directional causality is at work between the factors implied by EUR and USD swaption IVs; these factors respond to each other's shocks, whereas, surprisingly, GBP does not affect them. Finally, calibration results for the string market model show that it can efficiently reproduce (or forecast) the volatility surface of each swaptions market.
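Factor decompositions of the kind reported above are typically obtained by principal component analysis of the de-meaned IV panel. The sketch below uses simulated data with a built-in level factor and skew factor (not the essays' option data), and recovers the variance shares from the eigenvalues of the sample covariance matrix.

```python
import numpy as np

# Simulated IV panel: T days, K moneyness grid points. The "true" drivers
# are a level factor and a skew (slope-in-moneyness) factor, plus noise.
rng = np.random.default_rng(0)
T, K = 500, 8
m = np.linspace(-1.0, 1.0, K)                 # log-moneyness grid
level = 0.20 + 0.05 * rng.standard_normal(T)  # level factor
slope = 0.02 * rng.standard_normal(T)         # skew factor
ivs = (level[:, None]
       + slope[:, None] * m
       + 0.002 * rng.standard_normal((T, K)))  # idiosyncratic noise

# PCA via the eigendecomposition of the sample covariance matrix
x = ivs - ivs.mean(axis=0)                    # de-mean each strike column
cov = x.T @ x / (T - 1)
eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]
explained = eigvals / eigvals.sum()           # variance share per factor
```

By construction, the first two principal components should absorb nearly all of the variance here; with real option data the shares are lower (e.g. the 69-88% reported above), since the surface has more structure than two factors.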

Relevance: 10.00%

Abstract:

In this thesis, the possibility of extending Dirac's quantization condition for magnetic monopoles to noncommutative space-time is investigated. The three publications on which this thesis is based are all directly linked to this investigation. Noncommutative solitons have been found within certain noncommutative field theories, but it is not known whether they possess only topological charge or also magnetic charge. This is because the noncommutative topological charge need not coincide with the noncommutative magnetic charge, although the two are equivalent in the commutative context. The aim of this work is to begin to fill this gap in knowledge. The method of investigation is perturbative and leaves open the question of whether a nonperturbative source for the magnetic monopole can be constructed, although some aspects of such a generalization are indicated. The main result is that while the noncommutative Aharonov-Bohm effect can be formulated in a gauge-invariant way, the quantization condition of Dirac is not satisfied in the case of a perturbative source for the point-like magnetic monopole.
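For reference, the commutative quantization condition of Dirac relates the electric charge $e$ and the magnetic charge $g$ (written here in Gaussian units; conventions differ by factors of $2\pi$):

\[
\frac{e\,g}{\hbar c} \;=\; \frac{n}{2}, \qquad n \in \mathbb{Z},
\]

so the existence of even a single monopole would force electric charge to be quantized. The thesis asks whether a condition of this kind survives on noncommutative space-time.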

Relevance: 10.00%

Abstract:

The aim of this dissertation is to model economic variables with a mixture autoregressive (MAR) model. The MAR model is a generalization of the linear autoregressive (AR) model: it consists of K linear autoregressive components, and at any given point in time one of these components is randomly selected to generate a new observation for the time series. The mixture probability can be constant over time or a direct function of some observable variable. Many economic time series have properties that cannot be described by linear and stationary time series models; a nonlinear autoregressive model such as the MAR model can be a plausible alternative for such series. In this dissertation the MAR model is used to model stock market bubbles and the relationship between inflation and the interest rate. In the case of the inflation rate, we arrive at a MAR model in which the inflation process is less mean-reverting under high inflation than under normal inflation, while the interest rate moves one-for-one with expected inflation. We use data from the Livingston survey as a proxy for inflation expectations and find that survey inflation expectations are not perfectly rational: according to our results, information stickiness plays an important role in expectation formation, and survey participants have a tendency to underestimate inflation. A MAR model is also used to model stock market bubbles and crashes. This model has two regimes: a bubble regime and an error-correction regime. In the error-correction regime the price depends on a fundamental factor, the price-dividend ratio, while in the bubble regime the price is independent of fundamentals. In this model a stock market crash is usually caused by a regime switch from the bubble regime to the error-correction regime. According to our empirical results, bubbles are related to low inflation.
Our model also implies that bubbles influence the investment return distribution in both the short and the long run.
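A minimal two-component MAR simulation shows the mechanism: each new observation is drawn from one of K = 2 AR(1) components chosen at random. The regime probabilities and AR coefficients below are illustrative only, not the dissertation's estimates.

```python
import random

def simulate_mar(n, seed=1):
    """Simulate a two-component mixture autoregressive (MAR) process:
    with probability 0.9 the next value comes from a strongly mean-reverting
    AR(1) component, and with probability 0.1 from a near-unit-root AR(1)
    component (loosely, an 'error-correction' vs a 'bubble'-like regime)."""
    rng = random.Random(seed)
    y, out = 0.0, []
    for _ in range(n):
        if rng.random() < 0.9:
            y = 0.50 * y + rng.gauss(0.0, 1.0)   # mean-reverting component
        else:
            y = 0.99 * y + rng.gauss(0.0, 1.0)   # highly persistent component
        out.append(y)
    return out

ys = simulate_mar(5000)
```

Occasional runs of the persistent component produce the fat tails and episodic drift that a single linear AR model cannot capture, which is the motivation for the mixture structure.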

Relevance: 10.00%

Abstract:

This study examines the diaconia work of the Finnish Evangelical Lutheran Church from the standpoint of its clients. The role of diaconia work has grown since the recession of the early 1990s, when it established itself as one of the actors alongside other social organizations. Previous studies have described the changing role of diaconal work, especially from the standpoint of diaconia workers and their co-operators. This research goes beyond the everyday practices of diaconia work to examine the relations of ruling that determine those practices. The theoretical and methodological framework arises from the thinking of Dorothy E. Smith, the creator of institutional ethnography, whose origins lie in feminism, Marxism, phenomenology, ethnomethodology, and symbolic interactionism, although it does not represent any single school. Unlike objectivity-based traditional sociology, institutional ethnography takes its starting point in everyday life and in people's subjective experience of it. Everyday life is only a starting point, however: it is used to examine the relations of ruling hidden in everyday experiences, linking people and organizations, and it is at the level of these relations of ruling that generalization takes place. The research task is to examine the meanings of diaconia work that are embedded in its clients' experiences. The task is investigated through two questions: how diaconia work among its clients takes shape, and what kinds of relations of ruling exist in diaconia work. The meanings of diaconia work emerge through an examination of the relations of ruling, which reveals new facets of diaconal work compared with previous studies. Two kinds of data were collected for the study: a questionnaire and ethnographic fieldwork. The first data set was collected from diaconal workers using the questionnaire; it gives background information on the diaconia work process from the standpoint of the clients. The ethnographic study had two phases.
The first ethnographic material was collected from one local parish by observing, interviewing clients and diaconal workers, and gathering documents; 36 customer appointments were observed and 29 interviews conducted. The second ethnographic material was included as part of the analysis, in which the relations of ruling present in people's experiences were collected from the transcribed data. Close reading and narrative analysis are used as methods of analysis. The analysis has three phases: first, the experiences are identified through close reading; next, some of the institutional processes that shape those experiences and are relevant to the research are selected; and at the third stage, those processes are investigated in order to describe analytically how they determine people's experience. The analysis produces another narrative about diaconia work, which provides tools for examining diaconal work from a new perspective. Through the analysis it is possible to see diaconia as an exchange relation, in which the exchange takes place between a client and a diaconia worker, but also more broadly with other actors, such as social workers, shop clerks, or other parishioners. This exchange relation is examined from the perspective of the power embedded in clients' experiences. The analysis reveals that the most important relations of ruling are humiliation and randomness in the exchange relation of diaconia work; the valuing of spirituality above bodily being; and the replacement of official social work. The results give a map of the relations of ruling of diaconia work, which provides tools for looking at the meanings of diaconia work for its clients. The hidden element of humiliation in the exchange relation breaks the current picture of diaconia work, and the ethos of holistic encounter and empathic practice appears in another light when spirituality is preferred to bodily being.
Nevertheless, diaconia appears to be a place of respectful encounter, especially in situations where public-sector actors are retreating from their responsibilities or clients are in a life crisis. The collapse of welfare state structures imposes on diaconia work tasks that have not previously belonged to it. At the local level, clients find in diaconia workers partners who advocate for them in the welfare system; actions to influence wider societal structures are not achieved for lack of resources. Awareness of the oppressive practices of diaconia work, and their critical review, are the keys to the development of diaconia work, since such practices exist even in holistic and respectful diaconia work. While the research provides new information for the development of diaconia work, it also opens up new aspects for developing other kinds of social work by emphasizing the importance of taking people's experiences seriously. Keywords: diaconia work, institutional ethnography, Dorothy E. Smith, experience, customer, relations of ruling.