27 results for Literature and phenomenology
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
This work addresses the use of literature as a medium for teaching English as a second language. First, it outlines the current situation, in which there is a clear separation between language and literature as distinct subjects, and reviews the different methodologies that have used literature as a language-learning tool throughout history. Second, the work describes the development and implementation of a complete teaching unit for second-year baccalaureate students that takes literature as its starting point. The work aims to show that the use of literature exerts a key motivational power over students and provides a context that gives meaning and richness to language teaching. Finally, the work gathers the opinions of English teachers in Catalonia on this subject through a questionnaire that 66 teachers affiliated with APAC (Associació de Professors d'Anglès de Catalunya) answered voluntarily.
Abstract:
The aim of this article is to analyze accurately the role played by two classical references, Venus and Oedipus, in Tennessee Williams' Suddenly Last Summer, in accordance with the usual nature of studies on the Classical Tradition (Greek and Roman), focusing in this case on the relationship between literature and mythology. It is thanks to Venus and Oedipus that the playwright succeeds in showing the magnitude of men's and women's tragedy, which from his point of view is simply that they have failed either to see kindness in the face of God or to feel his loving and fatherly providence.
Abstract:
In Spain, a significant number of individuals die from atherosclerotic disease of the coronary and carotid arteries without having classic risk factors or prodromal symptoms. The diagonal ear lobe crease (DELC) has been characterized in the medical literature as a surrogate marker that can identify high-risk patients with occult atherosclerosis. This topic, however, has not been examined in either the medical or the dental literature emanating from Spain. The majority of clinical, angiographic and postmortem reports support the premise that DELC is a valuable extravascular physical sign able to distinguish some patients at risk of succumbing to atherosclerosis of the coronary arteries. A minority of studies, however, have failed to support this hypothesis. More recently, reports using B-mode ultrasound have also linked DELC to atherosclerosis of the carotid artery, and another report has related DELC to the presence of calcified carotid artery atheromas on panoramic radiographs. DELC is readily visible during head and neck cancer screening examinations. In conjunction with the patient's medical history, vital signs, and panoramic radiograph, the DELC may assist in atherosclerotic risk assessment.
Abstract:
The most common types of orofacial pain originate at the dental or periodontal level or in the musculoskeletal structures. However, the patient may present pain in this region even though its source is located elsewhere in the body. One possible source of heterotopic pain is cardiac. Objectives: To report two cases of orofacial pain of cardiac origin and review the clinical cases described in the literature. Study Design: Description of clinical cases and review of clinical cases. Results and conclusions: Nine cases of atypical pain of cardiac origin are recorded, comprising 5 females and 4 males. In craniofacial structures, pain of cardiac origin is usually bilateral. At the craniofacial level, the most frequently described locations are the throat and the jaw. Pain of cardiac origin is considered atypical because of its location, although roughly 10% of cases of cardiac ischemia manifest primarily in craniofacial structures. Finally, pain of odontogenic origin must be differentiated from pain of non-odontogenic origin (muscular, psychogenic, neuronal, cardiac, sinus and neurovascular pain) in order to avoid diagnostic errors, as well as unnecessary treatments, in dental practice.
Abstract:
This paper surveys the recent literature on convergence across countries and regions. I discuss the main convergence and divergence mechanisms identified in the literature and develop a simple model that illustrates their implications for income dynamics. I then review the existing empirical evidence and discuss its theoretical implications. Early optimism concerning the ability of a human capital-augmented neoclassical model to explain productivity differences across economies has been questioned on the basis of more recent contributions that make use of panel data techniques and obtain theoretically implausible results. Some recent research in this area tries to reconcile these findings with sensible theoretical models by exploring the role of alternative convergence mechanisms and the possible shortcomings of panel data techniques for convergence analysis.
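For orientation, here is a hedged sketch (not taken from the paper itself) of the cross-economy growth regression that commonly frames this convergence literature, where a negative β indicates convergence:

```latex
\frac{1}{T}\,\log\frac{y_{i,T}}{y_{i,0}} \;=\; \alpha \;+\; \beta \,\log y_{i,0} \;+\; \gamma' X_i \;+\; \varepsilon_i ,
```

where y_{i,t} is per-capita income of economy i and X_i collects controls such as human capital; the panel data techniques mentioned above replace X_i with economy-specific effects, which is the setting in which the theoretically implausible estimates arise.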
Abstract:
Labour market reforms very often face opposition from employed workers because they normally reduce these workers' wages. Product market regulations, likewise, are regularly biased towards benefiting firms too much. As a result, many frictions remain in both labour and product markets that hinder the optimal functioning of the economy. These issues have recently received a lot of attention in the economics literature, and scholars have been looking for politically viable reforms in both markets. However, despite its potential importance, virtually no research has been done on the interaction between reforms in product and labour markets. We find that when reforms are combined, opposition to them decreases considerably. This is because complementarities exist and the gains in total welfare can be more evenly distributed over the interest groups. Moreover, the interaction of reforms offers a way out of the so-called 'sclerosis' effect.
Abstract:
The paper sets out a one-sector growth model with a neoclassical production function in land and a capital-labour aggregate. Capital accumulates through capitalist saving, the labour supply is infinitely elastic at a subsistence wage, and all factors may experience factor-augmenting technical progress. The main result is that, if the elasticity of substitution between land and the capital-labour aggregate is less than one and if the rate of capital-augmenting technical progress is strictly positive, then the rate of profit will fall to zero. The surprise is that this result holds regardless of the rate of land-augmenting technical progress; that is, no amount of technical advance in agriculture can stop the fall in the rate of profit. The paper also discusses the relation of this result to the classical and Marxist literature and sets out the path of the relative price of land.
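As a hedged illustration of the kind of technology the abstract describes (the CES form and the factor-augmenting notation below are assumptions; the paper's exact specification is not given in the abstract):

```latex
Y \;=\; \Big[\, a\,(B\,T)^{\frac{\sigma-1}{\sigma}} \;+\; (1-a)\,Z^{\frac{\sigma-1}{\sigma}} \Big]^{\frac{\sigma}{\sigma-1}},
\qquad Z = Z(A_K K,\, A_L L), \qquad 0 < \sigma < 1 ,
```

where T is land, Z the capital-labour aggregate, and B, A_K, A_L factor-augmenting coefficients. In this notation the abstract's result reads: if σ < 1 and A_K grows at a strictly positive rate, the profit rate falls to zero however fast B grows.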
Abstract:
These notes present a detailed review of the existing results in dissipative kinetic theory that make use of the contraction properties of two main families of probability metrics: optimal mass transport and Fourier-based metrics. The first part of the notes is devoted to a self-consistent summary and presentation of the properties of both probability metrics, including new aspects of the relationships between them and other metrics in wide use in probability theory. These results are of independent interest, with potential use in other contexts in Partial Differential Equations and Probability Theory. The second part of the notes presents the asymptotic behavior of Inelastic Maxwell Models differently from the existing literature and shows a new example of application: particle bath heating. We show how, starting from the contraction properties in probability metrics, one can deduce existence, uniqueness and asymptotic stability in classical spaces. A global strategy with this aim is set up and applied to two dissipative models.
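For orientation, here are the standard definitions of the two families of metrics named above (a hedged sketch; the notation may differ from that of the notes): the quadratic optimal-mass-transport (Wasserstein) distance and the Fourier-based (Toscani) metrics,

```latex
W_2(f,g)^2 \;=\; \inf_{\pi \in \Pi(f,g)} \int |x-y|^2 \, d\pi(x,y),
\qquad
d_s(f,g) \;=\; \sup_{\xi \neq 0} \frac{|\hat{f}(\xi) - \hat{g}(\xi)|}{|\xi|^{s}} ,
```

where Π(f,g) is the set of probability measures on the product space with marginals f and g, and \hat{f} denotes the Fourier transform of f.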
Abstract:
Emissions distribution is a focal variable for the design of future international agreements to tackle global warming. This paper specifically analyses the future path of emissions distribution and its determinants under different scenarios. Our analysis draws on tools typically applied in the income distribution literature, which have recently been applied to the analysis of CO2 emissions distribution; the new methodological element is that our study is driven by simulations run with a popular regionalised optimal growth climate change model over the 1995-2105 period. We find that the architecture of environmental policies, the implementation of flexible mechanisms and income concentration are key determinants of emissions distribution over time. In particular, we find a robust positive relationship between the inequality measures.
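A minimal sketch, assuming population-weighted per-capita emissions data, of the kind of income-distribution tool the abstract refers to (the Gini index is used here for illustration; the paper's own toolset is not specified in the abstract):

```python
import numpy as np

def gini(values, weights=None):
    """Weighted Gini index (0 = perfect equality, 1 = maximal inequality)."""
    values = np.asarray(values, dtype=float)
    weights = np.ones_like(values) if weights is None else np.asarray(weights, dtype=float)
    order = np.argsort(values)
    v, w = values[order], weights[order]
    cw = np.concatenate(([0.0], np.cumsum(w) / w.sum()))            # cumulative population share
    cv = np.concatenate(([0.0], np.cumsum(v * w) / (v * w).sum()))  # cumulative emissions share (Lorenz curve)
    # Gini = 1 - 2 * area under the Lorenz curve (trapezoidal rule)
    lorenz_area = np.sum((cv[1:] + cv[:-1]) / 2.0 * np.diff(cw))
    return 1.0 - 2.0 * lorenz_area

# Hypothetical per-capita CO2 emissions (tonnes) and populations for three regions
print(gini([2.1, 9.5, 16.0], weights=[1.2e9, 0.5e9, 0.3e9]))
```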
Abstract:
Assuming the role of debt management is to provide hedging against fiscal shocks, we consider three questions: i) what indicators can be used to assess the performance of debt management? ii) how well have historical debt management policies performed? and iii) how is that performance affected by variations in debt issuance? We consider these questions using OECD data on the market value of government debt between 1970 and 2000. Motivated both by the optimal taxation literature and by broad considerations of debt stability, we propose a range of performance indicators for debt management. We evaluate these using Monte Carlo analysis and find that those based on the relative persistence of debt perform best. Calculating these measures for OECD data provides only limited evidence that debt management has helped insulate policy against unexpected fiscal shocks. We also find that the degree of fiscal insurance achieved is not well connected to cross-country variations in debt issuance patterns. Given the limited volatility observed in the yield curve, the relatively small dispersion of debt management practices across countries makes little difference to the realised degree of fiscal insurance.
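As a hedged sketch of one plausible persistence-based indicator of the sort the abstract mentions (the authors' exact construction is not reproduced here; this AR(1) estimate is an assumption for illustration):

```python
import numpy as np

def debt_persistence(debt):
    """OLS AR(1) coefficient of a debt series; values near 1 indicate highly
    persistent debt, i.e. fiscal shocks are not being insured away."""
    debt = np.asarray(debt, dtype=float)
    x, y = debt[:-1], debt[1:]
    x_c = x - x.mean()
    return float(np.dot(x_c, y - y.mean()) / np.dot(x_c, x_c))

# Hypothetical market-value debt/GDP observations
print(debt_persistence([0.40, 0.43, 0.47, 0.46, 0.50, 0.52]))
```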
Abstract:
Is there a link between decentralized governance and conflict prevention? This article tries to answer that question by presenting the state of the art at the intersection of the two concepts. Given that social conflict is inevitable, and given the appearance of new threats and types of violence, as well as new demands for security centred on people (human security), our societies should focus on promoting peaceful change. Through an extensive analysis of the existing literature and the study of several cases, this paper suggests that decentralized governance can contribute to these efforts by transforming conflicts and by creating power-sharing and inclusion incentives for minority groups. Despite the complexity of assessing its impact on conflict prevention, it can be contended that decentralized governance may have very positive effects in reducing the causes that bring about conflicts, owing to its ability to foster the creation of war/violence preventors. More specifically, this paper argues that decentralization can have a positive impact on the so-called triggers and accelerators (short- and medium-term causes).
Abstract:
In image segmentation, clustering algorithms are very popular because they are intuitive and some of them are easy to implement. For instance, k-means is one of the most used in the literature, and many authors successfully compare their new proposals with the results achieved by k-means. However, it is well known that clustering-based image segmentation has many problems. For instance, the number of regions in the image has to be known a priori, and different initial seed placements (initial clusters) can produce different segmentation results. Most of these algorithms can be slightly improved by considering the image coordinates as features in the clustering process (to take spatial region information into account). In this paper we propose a significant improvement of clustering algorithms for image segmentation. The method is qualitatively and quantitatively evaluated over a set of synthetic and real images, and compared with classical clustering approaches. Results demonstrate the validity of the new approach.
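A minimal sketch of the baseline the abstract improves upon, using scikit-learn's KMeans (an assumption; the paper's own method goes beyond this): pixel coordinates are appended to the colour features so spatial information enters the clustering.

```python
import numpy as np
from sklearn.cluster import KMeans

def kmeans_segment(image, n_regions, spatial_weight=1.0):
    """Segment an RGB image (H x W x 3, floats in [0, 1]) into n_regions labels.
    Normalised (row, col) coordinates are appended as extra features so that
    spatially close pixels tend to fall in the same cluster."""
    h, w, _ = image.shape
    rows, cols = np.mgrid[0:h, 0:w]
    coords = np.stack([rows / h, cols / w], axis=-1) * spatial_weight
    features = np.concatenate([image, coords], axis=-1).reshape(-1, 5)
    labels = KMeans(n_clusters=n_regions, n_init=10).fit_predict(features)
    return labels.reshape(h, w)

# Hypothetical usage: the number of regions must still be fixed a priori
segmentation = kmeans_segment(np.random.rand(64, 64, 3), n_regions=4)
```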
Abstract:
In the context of digital business ecosystems, small organizations cooperate with one another in order to achieve common goals or offer new services to expand their markets. There are different approaches to these cooperation models, such as virtual enterprises, virtual organizations and dynamic electronic institutions, whose lifecycles have a dissolution phase in common. However, this phase has not been studied in depth in the current literature and lacks formalization. In this paper a first approach to achieving and managing the dissolution phase is proposed, together with a CBR (case-based reasoning) process to support it in a multi-agent system.
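As a hedged illustration of the CBR support the abstract mentions, here is a minimal retrieve step over hypothetical dissolution cases; all names, features and outcomes below are assumptions for illustration, not the paper's model:

```python
import math

# Hypothetical past dissolution cases: feature vector -> dissolution outcome
cases = [
    ({"members": 5, "goal_met": 1.0, "assets": 0.2}, "distribute_assets_equally"),
    ({"members": 12, "goal_met": 0.3, "assets": 0.8}, "negotiate_asset_transfer"),
]

def retrieve(query, cases):
    """Return the stored outcome of the most similar past case (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((a[k] - b[k]) ** 2 for k in a))
    return min(cases, key=lambda c: dist(query, c[0]))[1]

print(retrieve({"members": 6, "goal_met": 0.9, "assets": 0.3}, cases))
```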
Abstract:
Quantitative or algorithmic trading is the automation of investment decisions obeying a fixed or dynamic set of rules to determine trading orders. It has increasingly made its way up to 70% of the trading volume of some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, owing to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject where publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration, and basic models for price and spread dynamics necessary for building quantitative strategies. We also contrast these models with real market data at minutely sampling frequency from the Dow Jones Industrial Average (DJIA).

Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped in the so-called technical models and the latter in so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast as well-defined scientific predictors if the signal they generate passes the test of being a Markov time. That is, we can tell whether the signal has occurred or not by examining the information up to the current time; or, more technically, if the event is F_t-measurable. The concept of pairs trading, or market-neutral strategy, is on the other hand fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, to a co-integration framework, to stochastic differential equations such as the well-known mean-reverting Ornstein-Uhlenbeck equation and its variations.

A model for forecasting any economic or financial magnitude could be properly defined with scientific rigor yet lack any economic value and be considered useless from a practical point of view. This is why this project could not be complete without a backtesting of the mentioned strategies. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. This is the reason why we emphasize the calibration of the strategies' parameters to adapt to the given market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and calibration must be done at high sampling frequency to constantly track the current market situation.

As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numeric computations and graphics used and shown in this project were implemented in MATLAB from scratch as a part of this thesis. No other mathematical or statistical software was used.
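As a hedged sketch of the market-neutral idea described above (the thesis's implementation was in MATLAB; this Python version, with illustrative window and thresholds, is an assumption rather than the project's calibrated model): a rolling z-score signal applied to a spread simulated with discretised Ornstein-Uhlenbeck dynamics.

```python
import numpy as np

def pairs_trading_signal(spread, window=60, entry_z=2.0, exit_z=0.5):
    """Rolling z-score signal for a mean-reverting spread (pairs trading).
    +1 = long the spread, -1 = short, 0 = flat. Thresholds are illustrative."""
    spread = np.asarray(spread, dtype=float)
    position, signal = 0, np.zeros(len(spread))
    for t in range(window, len(spread)):
        win = spread[t - window:t]
        z = (spread[t] - win.mean()) / win.std()
        if position == 0 and abs(z) > entry_z:
            position = -int(np.sign(z))  # bet on reversion towards the mean
        elif position != 0 and abs(z) < exit_z:
            position = 0                 # spread has reverted; close the trade
        signal[t] = position
    return signal

# Hypothetical spread following dX = theta*(mu - X) dt + sigma dW (Euler scheme)
rng = np.random.default_rng(0)
x, theta, mu, sigma, dt = [0.0], 2.0, 0.0, 0.3, 1 / 252
for _ in range(1000):
    x.append(x[-1] + theta * (mu - x[-1]) * dt + sigma * np.sqrt(dt) * rng.normal())
print(pairs_trading_signal(x)[-5:])
```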