54 results for Causal loops
Abstract:
Autism is a childhood-onset developmental disorder characterized by deficits in reciprocal social interaction, verbal and non-verbal communication, and dependence on routines and rituals. It belongs to a spectrum of disorders (autism spectrum disorders, ASDs) which share core symptoms but show considerable variation in severity. The whole spectrum affects 0.6-0.7% of children worldwide, imposing a substantial public health burden and causing suffering to the affected families. Despite having a very high heritability, ASDs have shown exceptional genetic heterogeneity, which has complicated the identification of risk variants and left the etiology largely unknown. However, recent studies suggest that rare, family-specific factors contribute significantly to the genetic basis of ASDs. In this study, we investigated the role of DISC1 (Disrupted-in-schizophrenia-1) in ASDs, and identified association with markers and haplotypes previously associated with psychiatric phenotypes. We identified four polymorphic micro-RNA target sites in the 3′ UTR of DISC1, and showed that hsa-miR-559 regulates DISC1 expression in vitro in an allele-specific manner. We also analyzed an extended autism pedigree with genealogical roots in Central Finland reaching back to the 17th century. To take advantage of the characteristics of population isolates that benefit gene mapping, and of the reduced genetic heterogeneity observed in distantly related individuals, we performed a microsatellite-based genome-wide screen for linkage and linkage disequilibrium in this pedigree. We identified a putative autism susceptibility locus on chromosome 19p13.3 and obtained further support for previously reported loci at 1q23 and 15q11-q13. To follow up these findings, we extended our study sample from the same sub-isolate and initiated a genome-wide analysis of homozygosity and allelic sharing using high-density SNP markers.
We identified a small number of haplotypes shared by different subsets of the genealogically connected cases, along with convergent biological pathways from SNP and gene expression data, which highlighted axon guidance molecules in the pathogenesis of ASDs. In conclusion, the results obtained in this thesis show that multiple distinct genetic variants are responsible for the ASD phenotype even within single pedigrees from an isolated population. We suggest that targeted resequencing of the shared haplotypes, linkage regions, and other susceptibility loci is essential to identify the causal variants. We also report a possible micro-RNA mediated regulatory mechanism, which might partially explain the wide-ranging neurobiological effects of the DISC1 gene.
Abstract:
In this thesis I examine the U.S. foreign policy discussion that followed the war between Russia and Georgia in August 2008. In the politically charged setting that preceded the presidential elections, the subject of the debate was not only Washington's response to the crisis in the Caucasus but, more generally, the direction of U.S. foreign policy after the presidency of George W. Bush. As of November 2010, the reasons for and consequences of the Russia-Georgia war continue to be contested. My thesis demonstrates that there were already a number of different stories about the conflict immediately after the outbreak of hostilities. I argue that among these stories one can discern a “neoconservative narrative” that described the war as a confrontation between the East and the West and considered it a test of Washington’s global leadership. I draw on the theory of securitization, particularly on a framework introduced by Holger Stritzel. Accordingly, I consider statements about the conflict as “threat texts” and analyze them on the basis of the existing discursive context, the performative force of the threat texts and the positional power of the actors presenting them. My thesis suggests that a notion of narrativity can complement Stritzel’s securitization framework and take it further. Threat texts are established as narratives by attaching causal connections, meaning and actorship to the discourse. By focusing on this process I want to shed light on the relationship between text and context, capture the time dimension of a speech act articulation and help to explain how some interpretations of the conflict are privileged while others are marginalized. I develop the theoretical discussion through an empirical analysis of the neoconservative narrative. Drawing on Stritzel’s framework, I argue that the internal logic of the narrative, which was presented as self-evident, can be analyzed in its historicity.
Asking what was perceived to be at stake in the conflict, how the narrative was formed and what purposes it served also reveals the possibility of alternative explanations. My main source material consists of transcripts of think tank seminars organized in Washington, D.C. in August 2008. In addition, I draw on the foreign policy discussion in the mainstream media.
Abstract:
Common scab is one of the most important soil-borne diseases of potato (Solanum tuberosum L.) in many potato production areas. It is caused by a number of Streptomyces species; in Finland the causal agents are Streptomyces scabies (Thaxter) Lambert & Loria and S. turgidiscabies Takeuchi. The scab-causing Streptomyces spp. are well-adapted, successful plant pathogens that also survive in soil as saprophytes. Control of these pathogens has proved difficult. Most of the methods used to manage potato common scab are aimed at controlling S. scabies, the most common of the scab-causing pathogens. The studies in this thesis investigated S. scabies and S. turgidiscabies as causal organisms of common scab and explored new approaches for control of common scab that would be effective against both species. S. scabies and S. turgidiscabies are known to co-occur in the same fields and in the same tuber lesions in Finland. The present study showed that both pathogens cause similar symptoms on potato tubers, and that the types of symptoms varied depending on cultivar rather than pathogen species. Pathogenic strains of S. turgidiscabies were antagonistic to S. scabies in vitro, indicating that these two species may be competing for the same ecological niche. In addition, strains of S. turgidiscabies were highly virulent in potato and tolerated lower pH than those of S. scabies. Taken together, these results suggest that S. turgidiscabies has become a major problem in potato production in Finland. The bacterial phytotoxins, thaxtomins, are produced by the scab-causing Streptomyces spp. and are essential for the induction of scab symptoms. In this study, thaxtomins were produced in vitro, and four thaxtomin compounds were isolated and characterized. All four thaxtomins induced similar symptoms of reduced root and shoot growth, root swelling or necrosis on micro-propagated potato seedlings.
The main phytotoxin, thaxtomin A, was used as a selective agent in an in vitro bioassay to screen F1 potato progeny from a single cross. Tolerance to thaxtomin A in vitro and scab resistance in the field were correlated, indicating that the in vitro bioassay could be used in the early stages of a resistance breeding program to discard scab-susceptible genotypes and elevate the overall levels of common scab resistance in potato breeding populations. The potential for biological control of S. scabies and S. turgidiscabies was studied using a non-pathogenic Streptomyces strain (346) isolated from a scab lesion and an S. griseoviridis strain (K61) from a commercially available biocontrol product. Both strains showed antagonistic activity against S. scabies and S. turgidiscabies in vitro and suppressed the development of common scab disease caused by S. turgidiscabies in the glasshouse. Furthermore, strain 346 reduced the incidence of S. turgidiscabies in scab lesions on potato tubers in the field. These results demonstrated for the first time the potential for biological control of S. turgidiscabies in the glasshouse and under field conditions, and this approach may be applied to enhance control of common scab in the future.
Abstract:
In this thesis I examine one commonly used class of methods for the analytic approximation of cellular automata, the so-called local cluster approximations. This class subsumes the well-known mean-field and pair approximations, as well as higher-order generalizations of these. While a straightforward method known as Bayesian extension exists for constructing cluster approximations of arbitrary order on one-dimensional lattices (and in certain other cases), for higher-dimensional systems the construction of approximations beyond the pair level becomes more complicated due to the presence of loops. In this thesis I describe the one-dimensional construction as well as a number of approximations suggested for higher-dimensional lattices, comparing them against a number of consistency criteria that such approximations could be expected to satisfy. I also outline a general variational principle for constructing consistent cluster approximations of arbitrary order with minimal bias, and show that the one-dimensional construction indeed satisfies this principle. Finally, I apply this variational principle to derive a novel consistent expression for symmetric three-cell cluster frequencies as estimated from pair frequencies, and use this expression to construct a quantitatively improved pair approximation of the well-known lattice contact process on a hexagonal lattice.
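The lowest-order member of this class, the one-site mean-field approximation, can be illustrated for the contact process mentioned above. The sketch below is not code from the thesis; it simply integrates the standard mean-field rate equation dρ/dt = λρ(1 − ρ) − ρ, whose nontrivial fixed point is ρ* = 1 − 1/λ for λ > 1:

```python
def mean_field_density(lam, rho0=0.2, dt=0.01, steps=200_000):
    """Integrate the mean-field (one-site cluster) equation for the
    contact process, d(rho)/dt = lam * rho * (1 - rho) - rho,
    by forward Euler; returns the long-time density of occupied sites."""
    rho = rho0
    for _ in range(steps):
        rho += dt * (lam * rho * (1 - rho) - rho)
    return rho
```

For lam = 2 this converges to rho* = 1 − 1/2 = 0.5, and for lam below 1 the density decays to zero. Higher-order cluster approximations such as the pair approximation refine this by tracking nearest-neighbour pair frequencies rather than single-site densities, which is where the loop-related complications on higher-dimensional lattices arise.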
Abstract:
The question at issue in this dissertation is the epistemic role played by ecological generalizations and models. I investigate and analyze such properties of generalizations as lawlikeness, invariance, and stability, and I ask which of these properties are relevant in the context of scientific explanations. I claim that there are generalizable and reliable causal explanations in ecology, provided by generalizations that are invariant and stable. An invariant generalization continues to hold or be valid under a special change, called an intervention, that changes the value of its variables. Whether a generalization remains invariant under such interventions is the criterion that determines whether it is explanatory. A generalization can be invariant and explanatory regardless of its lawlike status. Stability concerns a different kind of generality, namely the extent to which a generalization continues to hold across possible background conditions. The more stable a generalization, the less dependent it is on background conditions to remain true. Although it is invariance rather than stability that furnishes us with explanatory generalizations, stability has an important function in this context of explanations: it furnishes scientific explanations with extrapolability and reliability. I also discuss non-empirical investigations of models that I call robustness and sensitivity analyses. I call sensitivity analyses those investigations in which one model is studied with regard to its stability conditions by making changes and variations to the values of the model's parameters. As a general definition of robustness analysis I propose the investigation of variations in the modeling assumptions of different models of the same phenomenon, focusing on whether they produce similar or convergent results.
Robustness and sensitivity analyses are powerful tools for studying the conditions and assumptions under which models break down, and they are especially powerful in pointing out why they do so. They show which conditions or assumptions the results of models depend on.
Key words: ecology, generalizations, invariance, lawlikeness, philosophy of science, robustness, explanation, models, stability
Abstract:
This study investigates the role of social media as a form of organizational knowledge sharing. Social media is investigated in terms of the Web 2.0 technologies that organizations provide their employees as tools of internal communication. The study is anchored in the theoretical understanding of social media as technologies which enable both knowledge collection and knowledge donation, and it investigates the factors influencing employees’ use of social media in their working environment. The study presents the multidisciplinary research tradition concerning knowledge sharing, and social media is analyzed especially in relation to internal communication and knowledge sharing. Based on previous studies, it is assumed that personal, organizational, and technological factors influence employees’ use of social media in their working environment. The research is a case study focusing on the employees of the Finnish company Wärtsilä, an eligible case organization given that it has put several Web 2.0 tools into use in its intranet. The research is based on quantitative methods: in total, 343 responses were obtained through an online survey available in Wärtsilä’s intranet. The associations between the variables are analyzed with correlations, and the causal relationships between the assumed factors and the use of social media are then tested with multiple linear regression analysis. The analysis demonstrates that personal, organizational and technological factors influence the respondents’ use of social media. The strongest predictors are the benefits that respondents expect to receive from using social media and respondents’ experience in using Web 2.0 in their private lives. Organizational factors, such as managers’ and colleagues’ activeness and organizational guidelines for using social media, also form a causal relationship with the use of social media.
In addition, respondents’ understanding of their responsibilities affects their use of social media: the more social media is considered a part of individual responsibilities, the more frequently it is used. Finally, technological factors must be recognized: the more user-friendly the social media tools are considered and the better the respondents’ technical skills, the more frequently social media is used in the working environment. The central references in relation to knowledge sharing include Chun Wei Choo’s (2006) The Knowing Organization, Ikujiro Nonaka and Hirotaka Takeuchi’s (1995) The Knowledge-Creating Company and Linda Argote’s (1999) Organizational Learning.
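The quantitative setup described above, an online survey analyzed first with correlations and then with multiple linear regression, can be sketched in outline. The code below is an illustrative reconstruction with simulated data, not the study's dataset; the three predictor names are stand-ins for the survey scales mentioned in the abstract:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 343  # number of survey responses reported in the study

# simulated stand-ins for standardized survey scales (hypothetical names)
benefits = rng.normal(size=n)      # expected benefits of using social media
experience = rng.normal(size=n)    # private-life Web 2.0 experience
mgr_activity = rng.normal(size=n)  # managers' and colleagues' activeness

# simulated outcome: frequency of social media use in the workplace
use = (1.0 + 0.5 * benefits + 0.4 * experience + 0.2 * mgr_activity
       + rng.normal(scale=0.5, size=n))

# multiple linear regression via ordinary least squares
X = np.column_stack([np.ones(n), benefits, experience, mgr_activity])
coef, *_ = np.linalg.lstsq(X, use, rcond=None)
```

With simulated data the fitted coefficients recover the generating values (intercept 1.0, slopes 0.5, 0.4, 0.2) up to sampling noise; in the study itself, the sign and magnitude of such coefficients are what identify the strong predictors.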
Abstract:
A visual world eye-tracking study investigated the activation and persistence of implicit causality information in spoken language comprehension. We showed that people infer the implicit causality of verbs as soon as they encounter such verbs in discourse, as is predicted by proponents of the immediate focusing account (Greene & McKoon, 1995; Koornneef & Van Berkum, 2006; Van Berkum, Koornneef, Otten, & Nieuwland, 2007). Interestingly, we observed activation of implicit causality information even before people encountered the causal conjunction. However, while implicit causality information was persistent as the discourse unfolded, it did not have a privileged role as a focusing cue at the ambiguous pronoun when people were resolving its antecedent. Instead, our study indicated that implicit causality does not affect all referents to the same extent; rather, it interacts with other cues in the discourse, especially when one of the referents is already prominently in focus.
Abstract:
Causation is still poorly understood in strategy research, and confusion prevails around key concepts such as competitive advantage. In this paper, we define epistemological conditions that help to dispel some of this confusion and to provide a basis for more developed approaches. In particular, we argue that a counterfactual approach – one that builds on a systematic analysis of ‘what-if’ questions – can advance our understanding of key causal mechanisms in strategy research. We offer two concrete methodologies – counterfactual history and causal modeling – as useful solutions. We also show that these methodologies open up new avenues in research on competitive advantage. Counterfactual history can add to our understanding of the context-specific construction of resource-based competitive advantage and path dependence, and causal modeling can help to reconceptualize the relationships between resources and performance. In particular, resource properties can be regarded as mediating mechanisms in these causal relationships.
Abstract:
In Somalia the central government collapsed in 1991, and state failure has since become a widespread phenomenon and one of the greatest political and humanitarian problems facing the world in this century. Thus, the main objective of this research is to answer the following question: what went wrong? Most of the existing literature on the political economy of conflict starts from the assumption that the state in Africa is predatory by nature. Unlike these studies, the present research, although it uses predation theory, starts from the social contract approach to defining the state. Rather than contemplating the actions and policies of the rulers alone, this approach allows us to deliberately bring the role of the society – as citizens – and other players into the analysis. In Chapter 1, after introducing the study, a simple principal-agent model is developed to check the logical consistency of the argument and to make the identification of causal mechanisms easier. I also identify three main actors in the process of state failure in Somalia: the Somali state, Somali society and the superpowers. In Chapter 2, so as to understand the incentives, preferences and constraints of each player in the state failure game, I analyse in some depth the evolution and structure of three central informal institutions: the identity-based patronage system of leadership, political tribalism, and the Cold War. These three institutions are considered the rules of the game in the Somali state failure. Chapter 3 summarises the successive civilian governments’ achievements and failures (1960-69) concerning the main national goals, national unification and socio-economic development. Chapter 4 shows that the military regime, although it assumed power through extralegal means, served to some extent the developmental interests of the citizens in the first five years of its rule.
Chapter 5 traces the process, and the factors involved, by which the military regime transformed itself from an agent for the developmental interests of the society into a predatory state that not only undermined the interests of the society but also destroyed the state itself. Chapter 6 addresses the process of disintegration of the post-colonial state of Somalia, showing how the regime’s merciless reactions to political ventures by power-seeking opposition leaders shattered the entire country and wrecked the state institutions. Chapter 7 concludes the study by summarising the main findings: due to the incentive structures generated by the informal institutions, the formal state institutions fell apart.
Abstract:
On the one hand, this thesis attempts to develop and empirically test an ethically defensible theorization of the relationship between human resource management (HRM) and competitive advantage. The specific empirical evidence indicates that at least part of HRM's causal influence on employee performance may operate indirectly, through a social architecture and then through psychological empowerment. However, the evidence concerning a potential influence of HRM on organizational performance in particular seems to call into question some of the rhetoric within the HRM research community. On the other hand, the thesis tries to explicate and defend a certain attitude towards the philosophically oriented debates within organization science. This involves suggestions as to how we should understand meaning, reference, truth, justification and knowledge. On this understanding, it is not fruitful to see either the problems or the solutions to the problems of empirical social science as fundamentally philosophical ones. It is argued that the notorious problems of social science, exemplified in this thesis by research on HRM, can be seen as related to dynamic complexity in combination with both the ethical and pragmatic difficulty of “laboratory-like experiments”. Solutions … can only be sought by informed trials and errors depending on the perceived familiarity with the object(s) of research. The odds are against anybody who hopes for clearly adequate social scientific answers to more complex questions. Social science is in particular unlikely to arrive at largely accepted knowledge of the kind “if we do this, then that will happen”, or even “if we do this, then that is likely to happen”. One of the problems probably facing most social scientific research communities is to specify and agree upon the “this” and the “that” and to provide convincing evidence of how they are (causally) related.
On most more complex questions the role of social science seems largely to remain that of contributing to a (critical) conversation, rather than to arrive at more generally accepted knowledge. This is ultimately what is both argued and, in a sense, demonstrated using research on the relationship between HRM and organizational performance as an example.
Abstract:
The recession that hit the Finnish economy at the beginning of the 1990s has been regarded as unusually severe. Organisations’ failure to survive the recession has been researched in its various aspects. However, the reasons why and how the organisations that survived did so have been explored to a somewhat lesser extent. This study concerns organisations that survived rather than those that failed, as studying successful experiences is acknowledged as an important source for learning how to counteract future failure. The thesis examines four knowledge-intensive organisations, with the focus on the managerial and social aspects of their crisis handling processes. The study deals with managers’ and co-workers’ stories about organisational attempts to survive, rather than seeking to identify causal relationships. Drawing upon a narrative approach and a social constructionist perspective, the crisis handling processes are treated as reconstructions and rationalisations of what happened. A primary assumption of this thesis is that we make sense of experiences in retrospect, and the aim is to describe the handling of crisis situations and the hardships related to economic difficulties by focusing on the interviewees’ explanations of how those difficulties were dealt with. The stories are about taking control despite the threats induced by an extremely severe economic recession, about remaining active, about how the managers and their co-workers dealt with the uncertainty they experienced, and about how the organisations subsequently survived. The analysis also interrogates issues such as trust, authenticity, legitimacy, identity and nostalgia in crisis contexts.
Abstract:
Cosmopolitan ideals have been on the philosophical agenda for several millennia, but the end of the Cold War started a new discussion on state sovereignty, global democracy, the role of international law and global institutions. The Westphalian state system, in practice since the 17th century, is transforming, and the democracy deficit needs new solutions. An impetus has been the fact that in the present world no international body representing global citizens exists. In this Master’s thesis, the possibility of establishing a world parliament is examined. In a case analysis, 17 models of a world parliament, drawn from two journals, a volume of essays and two other publications, are discussed. Based on general observations, the models are divided into four thematic groups. The models are analyzed with an emphasis on feasible and probable elements. Further, a new scenario with a time frame of thirty years is proposed, based on the methodology of normative futures studies and taking special interest in causal relationships and the actions leading to change. The scenario presents three gradual steps that each need to be realized before a sustainable world parliament is established. The theoretical framework is based on social constructivism, and changes in international and multi-level governance are examined through the concepts of globalization, democracy and sovereignty. A feasible, desirable and credible world parliament would be constituted gradually by implementing electoral, democratic and legal measures, with members initially drawn from exclusively democratic states, parliamentarians, non-governmental organizations and other groups. The parliament should be located outside the United Nations context, since a new body avoids the problem of inefficiency currently prevailing in the UN. The main objectives of the world parliament are to safeguard peace and international law and to offer legal advice in cases where international law has been violated.
A feasible world parliament would be advisory in the beginning but granted legislative powers in the future. The number of members in the world parliament could also be extended, following the example of the EU enlargement process.
Abstract:
This thesis is composed of an introductory chapter and four applications, each constituting a chapter of its own. The common element underlying the chapters is the econometric methodology: the applications rely mostly on leading econometric techniques for the estimation of causal effects. The first chapter introduces the econometric techniques employed in the remaining chapters. Chapter 2 studies the effects of shocking news on student performance, exploiting the fact that the school shooting in Kauhajoki in 2008 coincided with the matriculation examination period of that fall. It shows that the performance of men declined due to the news of the school shooting; for women, no similar pattern was observed. Chapter 3 studies the effects of the minimum wage on employment by employing the original Card and Krueger (1994; CK) and Neumark and Wascher (2000; NW) data together with the changes-in-changes (CIC) estimator. As its main result, it shows that the employment effect of an increase in the minimum wage is positive for small fast-food restaurants and negative for big fast-food restaurants. The controversial positive employment effect reported by CK is thus overturned for big fast-food restaurants, and the NW data, in contrast to their original results, are shown to provide support for the positive employment effect. Chapter 4 employs the state-specific U.S. data on traffic fatalities (collected by Cohen and Einav [2003; CE]) to re-evaluate the effects of seat belt laws on traffic fatalities using the CIC estimator. It confirms the CE results that, on average, the implementation of a mandatory seat belt law results in an increase in the seat belt usage rate and a decrease in the total fatality rate. In contrast to CE, it also finds evidence of compensating-behavior theory, observed especially in states along the U.S. border.
Chapter 5 studies life cycle consumption in Finland, with special interest in the baby boomers and older households. It shows that the baby boomers smooth their consumption over the life cycle more than other generations do. It also shows that older households smoothed their life cycle consumption more as a result of the recession of the 1990s, compared to young households.
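A minimal sketch of the changes-in-changes estimator used in Chapters 3 and 4 (Athey and Imbens, 2006) is given below; the implementation and the simulated data are illustrative, not taken from the thesis. The estimator builds the counterfactual period-1 outcome for the treated group by mapping each treated period-0 outcome through the control group's period-0 empirical CDF and then through the control group's period-1 quantile function:

```python
import numpy as np

def cic_effect(y00, y01, y10, y11):
    """Changes-in-changes point estimate.

    y00, y01: control-group outcomes in periods 0 and 1
    y10, y11: treated-group outcomes in periods 0 and 1
    """
    y00_sorted = np.sort(y00)
    # rank of each treated period-0 outcome in the control period-0 CDF
    p = np.searchsorted(y00_sorted, y10, side="right") / len(y00)
    # map those ranks through the control group's period-1 quantiles
    counterfactual = np.quantile(y01, np.clip(p, 0.0, 1.0))
    return float(np.mean(y11) - np.mean(counterfactual))
```

When the control distribution shifts only additively between periods, this reduces to the ordinary difference-in-differences estimate; the CIC estimator additionally allows the time trend to differ across the outcome distribution, which is what makes the small-restaurant and big-restaurant effects separable.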
Abstract:
Floating in the air that surrounds us are a number of small particles, invisible to the human eye. The mixture of air and particles, liquid or solid, is called an aerosol. Aerosols have significant effects on air quality, visibility and health, and on the Earth's climate, and their effect on the Earth's climate is the least understood of these. They can scatter the incoming radiation from the Sun, or they can act as seeds onto which cloud droplets form. Aerosol particles are created directly, by human activity or by natural processes such as breaking ocean waves or sandstorms. They can also be created indirectly, when vapors or very small particles emitted into the atmosphere combine to form small particles that later grow to climatically or health-relevant sizes. The mechanisms through which those particles are formed are still under scientific discussion, even though this knowledge is crucial for making air quality or climate predictions, and for understanding how aerosols will influence and be influenced by the climate's feedback loops. One of the proposed mechanisms responsible for new particle formation is ion-induced nucleation. This mechanism is based on the idea that newly formed particles are ultimately formed around an electric charge. The amount of available charge in the atmosphere varies depending on radon concentrations in the soil and in the air, as well as on incoming ionizing radiation from outer space. In this thesis, ion-induced nucleation is investigated through long-term measurements in two different environments: the background site of Hyytiälä and the urban site of Helsinki. The main conclusion of this thesis is that ion-induced nucleation generally plays a minor role in new particle formation. The fraction of particles formed varies from day to day and from place to place. The relative importance of ion-induced nucleation, i.e. the fraction of particles formed through ion-induced nucleation, is larger in cleaner areas, where the absolute number of particles formed is smaller. Moreover, ion-induced nucleation contributes a larger fraction of particles on warmer days, when the sulfuric acid and water vapor saturation ratios are lower. This analysis will help in understanding the feedbacks associated with climate change.