399 results for Fruitful


Relevance:

10.00%

Publisher:

Abstract:

In daily life, rich experiences evolve in every environmental and social interaction. Because experience has a strong impact on how people behave, scholars in different fields are interested in understanding what constitutes an experience. Yet even as interest in conscious experience increases, there is no consensus on how such experience should be studied. Whatever approach is taken, the subjective and psychologically multidimensional nature of experience should be respected. This study endeavours to understand and evaluate conscious experiences. First, I introduce a theoretical approach to psychologically based and content-oriented experience. In the experiential cycle presented here, classical psychology and orienting-environmental content are connected. This generic approach is applicable to any human-environment interaction. Here I apply the approach to entertainment virtual environments (VEs) such as digital games and develop a framework with the potential for studying experiences in VEs. The development of the methodological framework included subjective and objective data from experiences in the Cave Automatic Virtual Environment (CAVE) and with numerous digital games (N=2,414). The final framework consisted of fifteen factor-analytically formed subcomponents of the sense of presence, involvement and flow. Together, these show the multidimensional experiential profile of VEs. The results present general experiential laws of VEs and show that the interface of a VE is related to (physical) presence, which psychologically means attention, perception and the cognitively evaluated realness and spatiality of the VE. The narrative of the VE elicits (social) presence and involvement and affects emotional outcomes. Psychologically, these outcomes are related to social cognition, motivation and emotion. The mechanics of a VE affect the cognitive evaluations and emotional outcomes related to flow.
In addition, at the very least, user background, prior experience and use context affect the experiential variation. VEs are part of many people's lives, and many different outcomes are related to them, such as enjoyment, learning and addiction, depending on who is making the evaluation. This makes VEs societally important and psychologically fruitful to study. The approach and framework presented here contribute to our understanding of experiences in general and VEs in particular. The research can provide VE developers with a state-of-the-art method (www.eveqgp.fi) that can be utilized whenever new product and service concepts are designed, prototyped and tested.
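The factor-analytic extraction of experiential subcomponents described above can be sketched numerically. This is a minimal illustration, not the study's actual analysis: the data are simulated, the two latent factors and their loadings are invented stand-ins, and a simple eigen-decomposition with the Kaiser criterion takes the place of a full factor-analytic pipeline.

```python
# Hypothetical sketch: recovering latent experiential factors from
# questionnaire items, in the spirit of the factor analysis described
# in the abstract.  All data, loadings and factor names are simulated.
import numpy as np

rng = np.random.default_rng(0)

# 200 simulated respondents answering 6 items driven by 2 latent
# factors (stand-ins for subcomponents such as "presence" and "flow").
latent = rng.normal(size=(200, 2))
loadings = np.array([[0.9, 0.0], [0.8, 0.1], [0.7, 0.0],
                     [0.0, 0.9], [0.1, 0.8], [0.0, 0.7]])
items = latent @ loadings.T + 0.3 * rng.normal(size=(200, 6))

# Principal-axis step: eigen-decompose the item correlation matrix and
# retain components with eigenvalue > 1 (Kaiser criterion).
corr = np.corrcoef(items, rowvar=False)
eigvals = np.linalg.eigvalsh(corr)
n_factors = int(np.sum(eigvals > 1.0))
print(n_factors)
```

With strong within-factor loadings and modest noise, the criterion recovers the two simulated factors; the study's fifteen subcomponents arise from the same logic applied to far richer questionnaire data.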

Relevance:

10.00%

Publisher:

Abstract:

The dissertation examines the foreign policies of the United States through the prism of science and technology. The focal point of scrutiny is the policy establishing the International Institute for Applied Systems Analysis (IIASA) and the development of the multilateral part of bridge building in American foreign policy during the 1960s and early 1970s. After a long and arduous negotiation process, the institute was finally established by twelve national member organizations from the following countries: Bulgaria, Canada, Czechoslovakia, the Federal Republic of Germany (FRG), France, the German Democratic Republic (GDR), Great Britain, Italy, Japan, Poland, the Soviet Union and the United States; a few years later Sweden, Finland and the Netherlands also joined. The stated goal of the institute was to bring together researchers from East and West to solve pertinent problems caused by the modernization process experienced in the industrialized world. It originated from President Lyndon B. Johnson's bridge-building policies, launched in 1964, and was set in a well-contested and crowded domain of other international organizations of environmental and social planning. Since the distinct need for yet another organization was not evident, the process of negotiations in this multinational environment illuminates the foreign policy ambitions of the United States on the road to Cold War détente. The study places this project within its political era and juxtaposes it with other international organizations, especially the OECD, the ECE and NATO. Conventionally, Lyndon Johnson's bridge-building policies have been seen as a means for the United States to normalize its international relations bilaterally with individual East European countries, and the multilateral dimension of the policy has been ignored.
This is why IIASA's establishment process in this multilateral environment brings forth new information on US foreign policy goals, the means used to achieve them, and US relations with other advanced industrialized societies before the era of détente, during the 1960s and early 1970s. Furthermore, the substance of the institute, applied systems analysis, illuminates the differences between European and American methodological thinking in social planning. Systems analysis is closely associated with (American) science and technology policies of the 1960s, especially in its military-administrative applications; analyzing it within the foreign policy environment of the United States therefore proved particularly fruitful. In the 1960s the institutional structures of the European continent were faltering, and the growing tendencies of integration were in flux. One example of this was the long, drawn-out process of British membership in the EEC; another was de Gaulle's withdrawal from NATO's military-political cooperation. On the other hand, economic cooperation in Europe between East and West, and especially with the Soviet Union, was expanding rapidly. This American initiative to form a new institutional actor has to be seen in that structural context, showing that bridge building was needed not only to the East but also to the West. The narrative amounts to an analysis of how the United States managed both cooperation and conflict in its hegemonic aspirations in the emerging modern world, and how it used its special relationship with the United Kingdom to achieve its goals. The research is based on the archives of the United States, Great Britain, Sweden, Finland, and IIASA. The primary sources have been complemented with both contemporary and present-day research literature, periodicals, and interviews.

Relevance:

10.00%

Publisher:

Abstract:

This dissertation examines how Finnish-speaking children learn Swedish in an immersion kindergarten where the Canadian immersion method is used. Within the framework of conversation analysis, this study explores how second language learning is situated in interaction and evidenced in the participants' verbal and non-verbal behavior. The database consists of 40 hours of videotaped data collected in naturally occurring situations in a group of 15 four-year-old children during the first two years of their immersion. Due to the immersion method, all the children share the same L1, in this case Finnish, and the teachers understand Finnish. However, they speak only Swedish to the children in all situations, and Swedish is learned in interaction without formal teaching. The aim of the study is to discover how the children's second language competence gradually increases as they participate in interaction with the Swedish-speaking teachers. The study also sheds light on the methodological question of how second language learning can be analyzed with the method of conversation analysis. The focus is on showing how the second language is learned in interaction, especially on how learning is achieved collaboratively. In this study, the emerging second language competence is explored by investigating how the children show understanding of the teachers' non-verbal and verbal actions during the first and the second semester of the immersion. The children's use of Swedish is analyzed by investigating how they recycle lexical items, and later even syntactic structures, from the teachers' Swedish turns. The results show that the teachers' actions are largely understood by the children even at the beginning of the immersion. The analyses of the children's responsive turns reveal that they at first interpret the teachers' turns on the basis of non-verbal cues.
Especially at the beginning of the immersion, the participants orient to the progress of interaction rather than to problems in understanding. Even in situations where the next actions show that the children do not understand what is said, they tend to display understanding rather than non-understanding. This behavior changes, however, as the children's competence in their second language increases. In the second semester, the children both show understanding of the teachers' verbal turns and display their non-understanding by initiating repair when they do not understand. Understanding of the teachers' verbal turns, including their syntactic structure, is manifested in the ways the children tie their turns to the teachers' turns. Recycling, on the other hand, proves to be the way the children start to speak the second language. In this study, the children's common L1 is shown to be an important resource in interaction. It allows the children to participate in their individual ways and to share their experiences both with each other and with the teachers. It also enables them to co-construct conversations that lead to collaborative learning. Moreover, the uninhibited use of L1 proves to be an important analytic tool that makes the immersion data especially fruitful for conversation analytic research on second language learning, since the children's interpretations of the second language are in evidence even when they do not speak the second language.

Relevance:

10.00%

Publisher:

Abstract:

The goal of this dissertation was to study whether it is possible and meaningful to apply Ludwig Wittgenstein's distinction between saying (Sagen) and showing (Zeigen) to ethically oriented literary criticism. The following questions were used as the primary guidelines: 1. Is it possible, in the context of literary criticism, to put into practice Wittgenstein's ethical conceptions, which are quite theoretical and metaphysical by nature? 2. If so, what practical literary devices do authors use if they want to demonstrate their ethical values within the frame of a fictional work? 3. Does philosophy offer useful ethical concepts that open up new and interesting readings of fiction? The philosophical background of Wittgenstein's distinction is clarified in chapter I. This clarification is based on his main works, Tractatus logico-philosophicus and Philosophische Untersuchungen, the published correspondence between Wittgenstein and Paul Engelmann, and selected Wittgenstein research and papers. Analyzing ethics and its expression in Georg Trakl's poetry further elucidates Wittgenstein's concept of showing. The concept that a literary work is an act of its author was used as a starting point. The presumption was that analyzing this act of the author would reveal how ethical values can be demonstrated in literature. Categorizing the author's act at different levels of literary expression provides the structure of this study. In chapters IV-XIII, literary devices useful for demonstrating ethics are examined and explained using examples from the works of Joseph Conrad, Charles Dickens, Nikolay Leskov, Ludwig Uhland, Eino Leino, Pentti Haanpää and Maria Jotuni. The concepts and views of researchers and writers such as Mikhail Bakhtin, Peter Juhl, E. D. Hirsch, Peter Lamarque and Stein Haugom Olsen are used. The concepts outlined in previous chapters are then applied in three case studies: Aeschylus's Oresteia trilogy, J. L. Runeberg's poem Sven Dufva and Sofi Oksanen's novel Puhdistus (Purge). On the whole, Wittgenstein's idea that ethical values can be demonstrated (shown) by means of literature is revealed as a fruitful point of departure for a more exact ethical reading, offering a new perspective on literary works.

Relevance:

10.00%

Publisher:

Abstract:

The triphenylphosphine deoxygenation of the polyperoxides poly(styrene peroxide), poly(methyl methacrylate peroxide), and poly(alpha-methylstyrene peroxide) proceeds via phosphorane intermediates, which in the presence of moisture hydrolyze to give the respective diols. At higher temperatures and under dry conditions the phosphorane decomposes into an epoxide and triphenylphosphine oxide. The reaction has been studied by ¹H, ¹³C, and ³¹P NMR spectroscopy. The results obtained are consistent with a concerted insertion of the biphile, triphenylphosphine, into the peroxy bond; this reaction pathway appears to be new as far as the chemistry of polyperoxides is concerned. Though the aim of this investigation was to test the selective deoxygenation of polyperoxides by triphenylphosphine as a method of preparing polyethers, it turned out to be a fruitful method for the synthesis of stereospecific diols. (C) 1997 John Wiley & Sons, Inc.

Relevance:

10.00%

Publisher:

Abstract:

We review the current status of various aspects of biopolymer translocation through nanopores and the challenges and opportunities it offers. Much of the interest generated by nanopores arises from their potential application to cheap and fast third-generation genome sequencing. Although the ultimate goal of single-nucleotide identification has not yet been reached, great advances have been made both from a fundamental and an applied point of view, particularly in controlling the translocation time, fabricating various kinds of synthetic pores or genetically engineering protein nanopores with tailored properties, and in devising methods (used separately or in combination) aimed at discriminating nucleotides based either on ionic or transverse electron currents, optical readout signatures, or on the capabilities of the cellular machinery. Recently, exciting new applications have emerged, for the detection of specific proteins and toxins (stochastic biosensors), and for the study of protein folding pathways and binding constants of protein-protein and protein-DNA complexes. The combined use of nanopores and advanced micromanipulation techniques involving optical/magnetic tweezers with high spatial resolution offers unique opportunities for improving the basic understanding of the physical behavior of biomolecules in confined geometries, with implications for the control of crucial biological processes such as protein import and protein denaturation. We highlight the key works in these areas along with future prospects. Finally, we review theoretical and simulation studies aimed at improving fundamental understanding of the complex microscopic mechanisms involved in the translocation process. Such understanding is a prerequisite to the fruitful application of nanopore technology in high-throughput devices for molecular biomedical diagnostics.

Relevance:

10.00%

Publisher:

Abstract:

Super-resolution imaging techniques are of paramount interest for applications in bioimaging and fluorescence microscopy. Recent advances in bioimaging demand application-tailored point spread functions. Here, we present some approaches for generating application-tailored point spread functions along with fast imaging capabilities. Aperture engineering techniques provide interesting solutions for obtaining desired system point spread functions. Specially designed spatial filters, realized by optical masks, are outlined both in a single-lens and a 4Pi configuration. Applications include depth imaging, multifocal imaging, and super-resolution imaging. Such an approach is suitable for fruitful integration with most existing state-of-the-art imaging microscopy modalities.

Relevance:

10.00%

Publisher:

Abstract:

In today's educational world, pedagogical currents have sunk into what we might call "the forgetting of the teacher", since the teacher becomes unintelligible when there is no truth to communicate. Education then becomes the student's construction of a world of subjective and shifting meanings, which the teacher merely stimulates. Against this we find the teaching of St. Thomas Aquinas, Doctor Humanitatis, who reminds us that the teacher's words, verba doctoris, communicate a truth already known. Only on the basis of this principle is it possible to restore to the educational word its proper fruitfulness, so necessary for the perfection of man and of society. This article reflects on the place of the true word in the education of human beings, showing that human life tends not only to express itself in the word, but to be fruitful in the educational word.

Relevance:

10.00%

Publisher:

Abstract:

Carrying out research that examines the agenda-setting function of the media involves a series of decisions. Will the research project examine the first level of the agenda (with themes or issues as the unit of analysis) or the second level (with attributes as the unit of analysis)? What will the object of study be? What kinds of attributes will be included in the analysis? What content will be compared? This study outlines some possible applications of agenda-setting theory in an international context, which could serve as a guide for future research. The merging of international news coverage and agenda-setting research appears to be a fruitful area for researchers.

Relevance:

10.00%

Publisher:

Abstract:

Interdisciplinary dialogue is the very soul of bioethics and its proper method. This article aims to contribute to that dialogue. To that end, it takes up the controversial issues of the past (the relationship between science and ethics), while pointing out the present opportunity for inquiry and encounter between the two branches of knowledge. The author proposes the field of human life as the setting for that dialogue. Bioethics, with its timeliness, its complexity and its transdisciplinary methodology, emerges as the possibility of a respectful and orderly dialogue, one called to bear much fruit.

Relevance:

10.00%

Publisher:

Abstract:

This article seeks to situate the category of nuptiality within the hermeneutical circle between the anthropological and the Trinitarian, in order to recover its speculative and reconfiguring power for contemporary theology, philosophy and mysticism alike. To that end, it appeals to the force of the theologal lives of John of the Cross, Edith Stein and Christophe Lebreton. On the basis of their testimonies, the article posits a fruitful relationship between theological aesthetics and Trinitarian ontology, which yields an updated characterization of nuptiality comprising eight distinctive notes that the text develops.

Relevance:

10.00%

Publisher:

Abstract:

Setting total allowable catches (TACs) is an endogenous process in which different agents and institutions, often with conflicting interests and opportunistic behaviour, try to influence policy-makers. Such policy-makers, far from being the benevolent social planners many would wish them to be, may also pursue self-interest when making final decisions. Although restricted knowledge of stock abundance and population dynamics, and weakness in enforcement, have effects, these other factors may explain why TAC management has failed to guarantee sustainable exploitation of fish resources. Rejecting the exogeneity of the TAC and drawing on fruitful debates in economic policy (the rules vs. discretion debate, and that surrounding the independence of central banks), this study analyses two institutional developments as potential mechanisms to confront misconceptions about TACs: long-term harvest control rules, and a central bank of fish.
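The long-term harvest control rule advocated above can be made concrete with a small sketch. The hockey-stick shape is a standard form in fisheries management, but the reference points and target exploitation rate below are illustrative assumptions, not values from the paper.

```python
# Hypothetical hockey-stick harvest control rule mapping estimated
# stock biomass (tonnes) to a TAC.  Reference points are invented
# for illustration only.

def tac_from_rule(biomass, b_lim=20_000.0, b_trigger=50_000.0,
                  f_target=0.25):
    """Return the TAC implied by the rule.

    Below b_lim the fishery is closed; between b_lim and b_trigger
    the exploitation rate scales up linearly; above b_trigger the
    full target rate f_target applies.
    """
    if biomass <= b_lim:
        return 0.0
    if biomass >= b_trigger:
        return f_target * biomass
    scale = (biomass - b_lim) / (b_trigger - b_lim)
    return f_target * scale * biomass

print(tac_from_rule(10_000))   # fishery closed below B_lim -> 0.0
print(tac_from_rule(80_000))   # full target rate -> 20000.0
```

Committing in advance to a mechanical rule of this kind is precisely the "rules vs. discretion" move the abstract borrows from monetary policy: once the rule is fixed, agents lose the incentive to lobby over each year's TAC.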

Relevance:

10.00%

Publisher:

Abstract:

In this thesis we propose a new approach to deduction methods for temporal logic. Our proposal is based on an inductive definition of eventualities that is different from the usual one. On the basis of this non-customary inductive definition for eventualities, we first provide dual systems of tableaux and sequents for Propositional Linear-time Temporal Logic (PLTL). Then, we adapt the deductive approach introduced by means of these dual tableau and sequent systems to the resolution framework and we present a clausal temporal resolution method for PLTL. Finally, we make use of this new clausal temporal resolution method for establishing logical foundations for declarative temporal logic programming languages. The key element in the deduction systems for temporal logic is to deal with eventualities and hidden invariants that may prevent the fulfillment of eventualities. Different ways of addressing this issue can be found in the works on deduction systems for temporal logic. Traditional tableau systems for temporal logic generate an auxiliary graph in a first pass. Then, in a second pass, unsatisfiable nodes are pruned. In particular, the second pass must check whether the eventualities are fulfilled. The one-pass tableau calculus introduced by S. Schwendimann requires an additional handling of information in order to detect cyclic branches that contain unfulfilled eventualities. Regarding traditional sequent calculi for temporal logic, the issue of eventualities and hidden invariants is tackled by making use of inference rules (mainly invariant-based or infinitary rules) that complicate their automation.
A remarkable consequence of using either a two-pass approach based on auxiliary graphs or a one-pass approach that requires an additional handling of information in the tableau framework, and either invariant-based rules or infinitary rules in the sequent framework, is that temporal logic fails to carry out the classical correspondence between tableaux and sequents. In this thesis, we first provide a one-pass tableau method TTM that instead of a graph obtains a cyclic tree to decide whether a set of PLTL-formulas is satisfiable. In TTM, tableaux are classical-like. For unsatisfiable sets of formulas, TTM produces tableaux whose leaves contain a formula and its negation. In the case of satisfiable sets of formulas, TTM builds tableaux where each fully expanded open branch characterizes a collection of models for the set of formulas in the root. The tableau method TTM is complete and yields a decision procedure for PLTL. This tableau method is directly associated to a one-sided sequent calculus called TTC. Since TTM is free from all the structural rules that hinder the mechanization of deduction, e.g. weakening and contraction, the resulting sequent calculus TTC is also free from this kind of structural rules. In particular, TTC is free of any kind of cut, including invariant-based cut. From the deduction system TTC, we obtain a two-sided sequent calculus GTC that preserves all these good freeness properties and is finitary, sound and complete for PLTL. Therefore, we show that the classical correspondence between tableaux and sequent calculi can be extended to temporal logic. The most fruitful approach in the literature on resolution methods for temporal logic, which was started with the seminal paper of M. Fisher, deals with PLTL and requires the generation of invariants for performing resolution on eventualities. In this thesis, we present a new approach to resolution for PLTL.
The main novelty of our approach is that we do not generate invariants for performing resolution on eventualities. Our method is based on the dual methods of tableaux and sequents for PLTL mentioned above. Our resolution method involves translation into a clausal normal form that is a direct extension of classical CNF. We first show that any PLTL-formula can be transformed into this clausal normal form. Then, we present our temporal resolution method, called TRS-resolution, that extends classical propositional resolution. Finally, we prove that TRS-resolution is sound and complete. In fact, it finishes for any input formula deciding its satisfiability, hence it gives rise to a new decision procedure for PLTL. In the field of temporal logic programming, the declarative proposals that provide a completeness result do not allow eventualities, whereas the proposals that follow the imperative future approach either restrict the use of eventualities or deal with them by calculating an upper bound based on the small model property for PLTL. In the latter, when the length of a derivation reaches the upper bound, the derivation is given up and backtracking is used to try another possible derivation. In this thesis we present a declarative propositional temporal logic programming language, called TeDiLog, that is a combination of the temporal and disjunctive paradigms in Logic Programming. We establish the logical foundations of our proposal by formally defining operational and logical semantics for TeDiLog and by proving their equivalence. Since TeDiLog is, syntactically, a sublanguage of PLTL, the logical semantics of TeDiLog is supported by PLTL logical consequence. The operational semantics of TeDiLog is based on TRS-resolution. TeDiLog allows both eventualities and always-formulas to occur in clause heads and also in clause bodies. 
To the best of our knowledge, TeDiLog is the first declarative temporal logic programming language that achieves this high degree of expressiveness. Since the tableau method presented in this thesis is able to detect that the fulfillment of an eventuality is prevented by a hidden invariant without checking for it by means of an extra process, since our finitary sequent calculi do not include invariant-based rules and since our resolution method dispenses with invariant generation, we say that our deduction methods are invariant-free.
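The fixpoint unfolding of eventualities that underlies all of the deduction systems described above can be illustrated with a tiny sketch. The tuple encoding of formulas and traces is an assumption made for this illustration, not the representation used in the thesis.

```python
# Minimal sketch of the standard fixpoint unfolding of an eventuality,
#   F p  ==  p  or  X (F p),
# and of its evaluation over a finite trace.  The formula/trace
# encodings here are illustrative assumptions.

def unfold(formula):
    """One-step unfolding: ('F', p) -> ('or', p, ('X', ('F', p)))."""
    op, sub = formula
    assert op == 'F'
    return ('or', sub, ('X', ('F', sub)))

def holds_F(trace, prop):
    """F prop holds at the start of a finite trace iff prop holds now
    or F prop holds in the remaining suffix (the unfolding above)."""
    if not trace:
        return False
    return prop in trace[0] or holds_F(trace[1:], prop)

print(unfold(('F', 'p')))
print(holds_F([set(), {'q'}, {'p'}], 'p'))  # True: p eventually holds
print(holds_F([set(), {'q'}], 'p'))         # False on this trace
```

The difficulty the thesis addresses is visible even in this toy: a tableau may keep choosing the X-branch of the disjunction forever, postponing p indefinitely, and a hidden invariant can force exactly that. Detecting such postponement without auxiliary graphs, extra bookkeeping, or invariant generation is what makes the proposed methods invariant-free.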

Relevance:

10.00%

Publisher:

Abstract:

After 20 annual meetings, it is worth looking back to see how it all started. While there has been very little collaboration on research projects between member institutes under the auspices of WEFTA, co-operation in more neutral areas of common interest was developed at an early stage. The area which has proved very fruitful is methodology. It was agreed that probably the best way to make progress was to arrange meetings at each laboratory in turn, where experienced, practising scientists could describe in detail how they carried out analyses. In this way, difficulties could be demonstrated or uncovered, and the accuracy, precision, efficiency and cost of the methods used in different laboratories could be compared.

Relevance:

10.00%

Publisher:

Abstract:

The main theme running through these three chapters is that economic agents are often forced to respond to events that are not a direct result of their own actions or of other agents' actions. The optimal response to these shocks will necessarily depend on the agents' understanding of how these shocks arise. The economic environment in the first two chapters is analogous to the classic chain-store game. In this setting, the addition of unintended trembles by the agents creates an environment better suited to reputation building. The third chapter considers competitive equilibrium price dynamics in an overlapping generations environment when there are supply and demand shocks.

The first chapter is a game theoretic investigation of a reputation building game. A sequential equilibrium model, called the "error prone agents" model, is developed. In this model, agents believe that all actions are potentially subjected to an error process. Inclusion of this belief into the equilibrium calculation provides for a richer class of reputation building possibilities than when perfect implementation is assumed.

In the second chapter, maximum likelihood estimation is employed to test the consistency of this new model and other models against data from experiments run by other researchers, experiments that served as the basis for prominent papers in this field. The alternative models considered are essentially modifications to the standard sequential equilibrium. While some models perform quite well, in that the nature of the modification seems to explain deviations from the sequential equilibrium, the degree to which these modifications must be applied shows no consistency across different experimental designs.

The third chapter is a study of price dynamics in an overlapping generations model. It establishes the existence of a unique perfect-foresight competitive equilibrium price path in a pure exchange economy with a finite time horizon when there are arbitrarily many shocks to supply or demand. One main reason for the interest in this equilibrium is that overlapping generations environments are very fruitful for the study of price dynamics, especially in experimental settings. The perfect foresight assumption is an important place to start when examining these environments because it will produce the ex post socially efficient allocation of goods. This characteristic makes this a natural baseline to which other models of price dynamics could be compared.