9 results for Fuzzy rules

in Helda - Digital Repository of the University of Helsinki


Relevance: 20.00%

Publisher:

Abstract:

This study highlights the formation of an artifact designed to mediate exploratory collaboration. The data for the study were collected during a Finnish adaptation of the thinking together approach. The aim of the approach is to teach pupils how to engage in an educationally beneficial form of joint discussion, namely exploratory talk. At the heart of the approach lies a set of conversational ground rules intended to promote the use of exploratory talk. The theoretical framework of the study is based on a sociocultural perspective on learning. A central argument in this framework is that physical and psychological tools play a crucial role in human action and learning: with the help of tools, humans can escape the direct stimuli of the outside world and learn to control their own behaviour. During the implementation of the approach, the classroom community negotiates a set of six rules, which this study conceptualizes as an artifact that mediates exploratory collaboration. Prior research on the thinking together approach has not examined the formation of the rules in depth, which gives ample reason to conduct this study. The specific research questions were: What kind of negotiation trajectories did the ground rules form during the intervention? What meanings were negotiated for the ground rules during the intervention? The methodological framework of the study is based on discourse analysis, specified by adapting the social construction of intertextuality to analyze the meanings negotiated for the created rules. The study has two units of analysis: the thematic episode and the negotiation trajectory. A thematic episode is a stretch of talk-in-interaction in which the participants talk about a certain ground rule or a theme relating to it. A negotiation trajectory is a chronological representation of the negotiation process of a certain ground rule during the intervention and is constructed of thematic episodes.
Thematic episodes were analyzed with the adapted intertextuality analysis, and a contrastive analysis was done on the trajectories. Lastly, the meanings negotiated for the created rules were compared to the guidelines provided by the approach. The main result of the study is the observation that the meanings of the created rules were more aligned with the ground rules of cumulative talk than with those of exploratory talk. Although meanings relating to exploratory talk were also negotiated, they were clearly not the dominant form. In addition, the study observed that the trajectories of the rules were not identical: despite shared dimensions (symmetry, composition, continuity and explicitness), none of the trajectories had exactly the same features as the others.

Relevance: 20.00%

Publisher:

Abstract:

The study focused on the different ways in which forest-related rights can be devolved to the local level under the current legal frameworks in Laos, Nepal, Vietnam, Kenya, Mozambique and Tanzania. The eleven case studies represented the main ways in which forest-related rights can be devolved to communities or households in these countries. The objectives of the study were to 1) analyse the contents and extent of the forest-related rights that can be devolved to the local level, 2) develop an empirical typology that represents the main types of devolution, and 3) compare the cases against a theoretical ideal type to assess in what way and to what extent the cases are similar to or differ from the theoretical construct. Fuzzy set theory, Qualitative Comparative Analysis and ideal type analysis were used in analysing the case studies and in developing the empirical typology. The theoretical framework, which guided data collection and analysis, was based on institutional economics and theories of property rights, common pool resources and collective action. On the basis of theoretical and empirical knowledge, the most important attributes of rights were defined as use rights, management rights, exclusion rights, transfer rights, and the duration and security of the rights. The ideal type was defined as one in which local actors have been devolved comprehensive use rights, extensive management rights, rights to exclude others from the resource and rights to transfer these rights; in addition, the rights are secure and held perpetually. The ideal type was used to structure the analysis and as a yardstick against which the cases were analysed. The contents, extent and duration of the devolved rights varied greatly. In general, the results show that devolution has mainly meant the transfer of use rights to the local level and has not substantially changed overall state control over forest resources.
In most cases the right holders participate in, or have only a limited role in, decision making regarding the harvesting and management of the resource. There was a clear tendency to devolve the rights to enforce rules and to monitor resource use and condition more extensively than the powers to decide on the management and development of the resource. The empirical typology differentiated between five types of devolution, characterised by the devolution of 1) restricted use and control rights, 2) extensive use rights but restricted control rights, 3) extensive rights, 4) insecure, short-term use and restricted control rights, and 5) insecure extensive rights. Overall, the case studies' conformity to the ideal type was very low: only two cases were similar to the ideal type, while all the others differed considerably from it. Restricted management rights were the most common reason for low conformity (eight cases). In three cases, the short term of the rights, restricted transfer rights, restricted use rights or restricted exclusion rights were the reason, or one of the reasons, for low conformity. In two cases the rights were not secure.
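The fuzzy-set step of such an analysis can be sketched in a few lines. Everything below is a hypothetical illustration: the five attributes follow the abstract, but the membership scores and the min-based conformity measure are assumptions, not the study's actual calibration.

```python
# Illustrative fuzzy-set ideal-type comparison (hypothetical scores).
# The ideal type: comprehensive, secure rights on every attribute.
IDEAL = {"use": 1.0, "management": 1.0, "exclusion": 1.0,
         "transfer": 1.0, "security": 1.0}

def conformity(case, ideal=IDEAL):
    """Fuzzy conformity to the ideal type: a case belongs to the ideal
    only as much as its weakest attribute (fuzzy 'min' intersection)."""
    return min(case[attr] for attr in ideal)

# Two hypothetical cases: one with extensive rights, one with the
# restricted management rights that the study found most common.
case_extensive = {"use": 0.9, "management": 0.8, "exclusion": 0.9,
                  "transfer": 0.7, "security": 0.8}
case_restricted = {"use": 0.8, "management": 0.2, "exclusion": 0.6,
                   "transfer": 0.4, "security": 0.7}

print(conformity(case_extensive))   # 0.7
print(conformity(case_restricted))  # 0.2
```

Taking the minimum follows the standard fuzzy-set reading of conformity: a single severely restricted attribute is enough to pull a case far from the ideal, which mirrors the finding that restricted management rights alone accounted for most of the low conformity.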

Relevance: 20.00%

Publisher:

Abstract:

Analyzing statistical dependencies is a fundamental problem in all empirical science. Dependencies help us understand causes and effects, create new scientific theories, and invent cures to problems. Nowadays, large amounts of data are available, but efficient computational tools for analyzing the data are missing. In this research, we develop efficient algorithms for a commonly occurring search problem: searching for the statistically most significant dependency rules in binary data. We consider dependency rules of the form X->A or X->not A, where X is a set of positive-valued attributes and A is a single attribute. Such rules describe which factors either increase or decrease the probability of the consequent A. A classic example is genetic and environmental factors, which can either cause or prevent a disease. The emphasis in this research is that the discovered dependencies should be genuine, i.e. they should also hold in future data. This is an important distinction from traditional association rules, which, in spite of their name and a similar appearance to dependency rules, do not necessarily represent statistical dependencies at all, or represent only spurious connections that occur by chance. Therefore, the principal objective is to search for rules with statistical significance measures. Another important objective is to search for only non-redundant rules, which express the real causes of the dependence without any incidental extra factors. The extra factors add no new information on the dependence; they can only blur it and make it less accurate in future data. The problem is computationally very demanding, because the number of possible rules increases exponentially with the number of attributes. In addition, neither statistical dependency nor statistical significance is a monotonic property, which means that traditional pruning techniques do not work.
As a solution, we first derive the mathematical basis for pruning the search space with any well-behaving statistical significance measure. The mathematical theory is complemented by a new algorithmic invention that enables an efficient search without any heuristic restrictions. The resulting algorithm can be used to search for both positive and negative dependencies with any commonly used statistical measure, such as Fisher's exact test, the chi-squared measure, mutual information, and z scores. According to our experiments, the algorithm scales well, especially with Fisher's exact test, and can easily handle even the densest data sets with 10,000-20,000 attributes. Still, the results are globally optimal, which is a remarkable improvement over existing solutions. In practice, this means that the user does not have to worry about whether the dependencies hold in future data or whether the data still contains better, undiscovered dependencies.
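The scoring step can be illustrated with a minimal sketch: rank candidate rules X -> A by a one-sided Fisher's exact test on their 2x2 contingency tables. Note that the naive exhaustive enumeration below stands in for the thesis's pruned search (the pruning is precisely its contribution), and the toy records are invented.

```python
from itertools import combinations
from math import comb

def fisher_one_sided(a, b, c, d):
    """One-sided Fisher's exact test: probability of observing at least
    `a` co-occurrences in the 2x2 table [[a, b], [c, d]] under independence."""
    n, row1, col1 = a + b + c + d, a + b, a + c
    return sum(comb(col1, k) * comb(n - col1, row1 - k)
               for k in range(a, min(row1, col1) + 1)) / comb(n, row1)

def search_rules(records, attrs, target, max_size=2):
    """Exhaustively score rules X -> target over binary records and rank
    them by p-value (smaller = more significant)."""
    scored = []
    for size in range(1, max_size + 1):
        for X in combinations(attrs, size):
            a = sum(1 for r in records if all(r[x] for x in X) and r[target])
            b = sum(1 for r in records if all(r[x] for x in X) and not r[target])
            c = sum(1 for r in records if not all(r[x] for x in X) and r[target])
            d = len(records) - a - b - c
            scored.append((X, fisher_one_sided(a, b, c, d)))
    return sorted(scored, key=lambda t: t[1])

# Toy data: attribute "g" perfectly predicts the disease, "e" does not.
records = [
    {"g": 1, "e": 1, "disease": 1}, {"g": 1, "e": 0, "disease": 1},
    {"g": 1, "e": 1, "disease": 1}, {"g": 0, "e": 0, "disease": 0},
    {"g": 0, "e": 1, "disease": 0}, {"g": 0, "e": 0, "disease": 0},
]
ranked = search_rules(records, ["g", "e"], "disease", max_size=1)
print(ranked[0])  # (('g',), 0.05): the genuine dependency ranks first
```

The exponential blow-up the abstract mentions is visible in `search_rules`: with n attributes and `max_size=n`, the loop visits every subset, which is why significance-based pruning of this space is the hard part.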

Relevance: 20.00%

Publisher:

Abstract:

Economic and Monetary Union can be characterised as a complicated set of legislation and institutions governing monetary and fiscal responsibilities. The measures of fiscal responsibility are guided by the Stability and Growth Pact, which sets rules for fiscal policy and makes discretionary fiscal policy virtually impossible. To analyse the effects of the fiscal and monetary policy mix, we modified the New Keynesian framework to allow for supply effects of fiscal policy. We show that defining a supply-side channel for fiscal policy using an endogenous output gap changes the stabilising properties of monetary policy rules. The stability conditions are affected by fiscal policy, so that the dichotomy between active (passive) monetary policy and passive (active) fiscal policy as stabilising regimes does not hold, and it is possible to have an active monetary - active fiscal policy regime consistent with dynamic stability of the economy. We show that, if we take supply-side effects into account, we get more persistent inflation and output reactions. We also show that the dichotomy does not hold for a variety of fiscal policy rules based on government debt and the budget deficit, using the tax smoothing hypothesis and formulating the tax rules as difference equations. The debt rule combined with active monetary policy results in indeterminacy, while the deficit rule produces a determinate solution with active monetary policy, even when fiscal policy is also active. Combining the fiscal requirements in a single rule results in cyclical responses to shocks; the amplitude of the cycle is larger with more weight on debt than on deficit. Combining optimised monetary policy with fiscal policy rules means that, under discretionary monetary policy, the fiscal policy regime affects the size of the inflation bias. We also show that commitment to an optimal monetary policy not only corrects the inflation bias but also increases the persistence of output reactions. With fiscal policy rules based on the deficit, we can retain the tax smoothing hypothesis even in a sticky-price model.
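As a stylized illustration of "formulating the tax rules as difference equations", the sketch below simulates a simple debt-based tax rule. The accounting identity, parameter values and convergence condition are textbook assumptions for illustration only, not the thesis's New Keynesian model.

```python
def simulate_debt_rule(periods=50, r=0.02, g=0.20, phi=0.05,
                       b_target=0.60, tau_bar=0.212, b0=1.0):
    """Debt accumulation b_t = (1 + r) * b_{t-1} + g - tau_t combined
    with the debt-based tax rule tau_t = tau_bar + phi * (b_{t-1} - b_target).
    Deviations from the steady state shrink by the factor (1 + r - phi)
    each period, so the rule stabilises debt whenever phi > r."""
    b, path = b0, [b0]
    for _ in range(periods):
        tau = tau_bar + phi * (b - b_target)   # tax responds to inherited debt
        b = (1 + r) * b + g - tau              # next period's debt
        path.append(b)
    return path

path = simulate_debt_rule()
print(path[0], round(path[-1], 3))  # debt converges from 1.0 toward 0.6
```

Setting `phi=0.0` (taxes ignore debt) flips the factor to 1 + r > 1 and the debt path diverges, which is the deterministic analogue of the stability/indeterminacy distinctions the abstract discusses.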

Relevance: 20.00%

Publisher:

Abstract:

The plasma membrane adopts a myriad of shapes to carry out essential cellular processes such as nutrient uptake, immunological defence mechanisms and cell migration. Therefore, the details of how different plasma membrane structures are made and remodelled are of the utmost importance. Bending the plasma membrane into different shapes requires a substantial amount of force, which can be provided by the actin cytoskeleton; however, the molecules that regulate the interplay between the actin cytoskeleton and the plasma membrane have remained elusive. Recent findings have placed new types of effectors at sites of plasma membrane remodelling, including BAR proteins, which can directly bind and deform the plasma membrane into different shapes. In addition to their membrane-bending abilities, BAR proteins also harbour protein domains that intimately link them to the actin cytoskeleton. The ancient BAR domain fold has evolved into at least three structurally and functionally different sub-groups: the BAR, F-BAR and I-BAR domains. This thesis describes the discovery and functional characterization of the Inverse-BAR (I-BAR) domains. Using synthetic model membranes, we have shown that I-BAR domains bind and deform membranes into tubular structures through a binding surface composed of positively charged amino acids. Importantly, the membrane-binding surface of I-BAR domains displays an inverse geometry to that of the BAR and F-BAR domains, and these structural differences explain why I-BAR domains induce cell protrusions whereas BAR and most F-BAR domains induce cell invaginations. In addition, our results indicate that the binding of I-BAR domains to membranes can alter the spatial organization of phosphoinositides within membranes. Intriguingly, we also found that some I-BAR domains can insert helical motifs into the membrane bilayer, which has important consequences for their membrane binding and bending functions. In mammals there are five I-BAR domain containing proteins.
Cell biological studies on ABBA revealed that it is highly expressed in radial glial cells during the development of the central nervous system and plays an important role in the extension process of radial glia-like C6R cells by regulating lamellipodial dynamics through its I-BAR domain. To reveal the role of these proteins in animals, we analyzed MIM knockout mice and found that MIM is required for proper renal function in adult mice. MIM-deficient mice displayed a severe urine concentration defect due to defective intercellular junctions of the kidney epithelia. Consistently, MIM localized to adherens junctions in cultured kidney epithelial cells, where it promoted actin assembly through its I-BAR and WH2 domains. In summary, this thesis describes the mechanism by which I-BAR proteins deform membranes and provides information about the biological role of these proteins, which to our knowledge are the first proteins shown to directly deform the plasma membrane to make cell protrusions.

Relevance: 20.00%

Publisher:

Abstract:

This thesis studies the interest-rate policy of the ECB by estimating monetary policy rules using real-time data and central bank forecasts. The aim of the estimations is to characterize a decade of common monetary policy and to examine how different models perform at this task. The estimated rules include contemporaneous Taylor rules, forward-looking Taylor rules, nonlinear rules and forecast-based rules. The nonlinear models allow for the possibility of zone-like preferences and an asymmetric response to key variables. The models therefore encompass the most popular sub-group of simple models used for policy analysis as well as the more unusual nonlinear approach. In addition to the empirical work, this thesis also contains a more general discussion of monetary policy rules, mostly from a New Keynesian perspective. This discussion includes an overview of notable related studies, optimal policy, policy gradualism and several other related subjects. The regressions are estimated with either least squares or the generalized method of moments, depending on the requirements of the estimation. The estimations use data from both the Euro Area Real-Time Database and the central bank forecasts published in ECB Monthly Bulletins; these sources represent some of the best data available for this kind of analysis. The main results are that forward-looking behavior appears highly prevalent, but that standard forward-looking Taylor rules offer only ambivalent results with regard to inflation. Nonlinear models are shown to work, but do not have a strong rationale over a simpler linear formulation. However, the forecasts appear to be highly useful in characterizing policy and may offer the most accurate depiction of a predominantly forward-looking central bank. In particular, the inflation response appears much stronger, while the output response becomes highly forward-looking as well.
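A contemporaneous Taylor rule of the kind estimated here can be illustrated with a least-squares sketch on synthetic data. The coefficients (1.5 on inflation, 0.5 on the output gap) and the noise level are illustrative textbook values, not ECB estimates.

```python
import random

def ols(y, X):
    """Ordinary least squares via the normal equations (X'X) b = X'y,
    solved with Gaussian elimination; fine for tiny problems."""
    k = len(X[0])
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    Xty = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    A = [XtX[i][:] + [Xty[i]] for i in range(k)]
    for col in range(k):                       # forward elimination
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k + 1):
                A[r][c] -= f * A[col][c]
    b = [0.0] * k
    for r in range(k - 1, -1, -1):             # back substitution
        b[r] = (A[r][k] - sum(A[r][c] * b[c] for c in range(r + 1, k))) / A[r][r]
    return b

# Synthetic data generated from the rule
#   i_t = 1.0 + 1.5 * inflation_t + 0.5 * output_gap_t + noise.
random.seed(0)
design, y = [], []
for _ in range(500):
    pi = random.gauss(2.0, 1.0)
    gap = random.gauss(0.0, 1.0)
    design.append([1.0, pi, gap])
    y.append(1.0 + 1.5 * pi + 0.5 * gap + random.gauss(0.0, 0.1))

alpha, beta, gamma = ols(y, design)
# beta > 1 satisfies the Taylor principle, i.e. "active" monetary policy.
print(f"alpha={alpha:.2f}, beta={beta:.2f}, gamma={gamma:.2f}")
```

Forward-looking versions replace the contemporaneous regressors with expected future values, which is why the thesis turns to the generalized method of moments and to the published forecasts rather than plain least squares.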

Relevance: 20.00%

Publisher:

Abstract:

Hypertexts are digital texts characterized by interactive hyperlinking and a fragmented textual organization. Increasingly prominent since the early 1990s, hypertexts have become a common text type both on the Internet and in a variety of other digital contexts. Although studied widely in disciplines like hypertext theory and media studies, formal linguistic approaches to hypertext remain relatively rare. This study examines coherence negotiation in hypertext, with particular reference to hypertext fiction. Coherence, or the quality of making sense, is a fundamental property of textness. Proceeding from the premise that coherence is a subjectively evaluated property rather than an objective quality arising directly from textual cues, the study focuses on the processes through which readers interact with hyperlinks and negotiate continuity between hypertextual fragments. The study begins with a typological discussion of textuality and an overview of the historical and technological precedents of modern hypertexts. Then, making use of text linguistic, discourse analytical, pragmatic, and narratological approaches to textual coherence, the study takes established models developed for analyzing and describing conventional texts and examines their applicability to hypertext. Primary data derived from a collection of hyperfictions is used throughout to illustrate the mechanisms in practice. Hypertextual coherence negotiation is shown to require the ability to operate cognitively between local and global coherence by processing lexical cohesion, discourse topical continuities, inferences and implications, and shifting cognitive frames. The main conclusion of the study is that the style of reading required by hypertextuality fosters a new paradigm of coherence.
Defined as fuzzy coherence, this new approach to textual sensemaking is predicated on an acceptance of the coherence challenges readers experience when the act of reading comes to involve repeated encounters with referentially imprecise hyperlinks and discourse topical shifts. A practical application of fuzzy coherence is shown to be at work in the way coherence is actively manipulated in hypertext narratives.