13 results for allocation rules
in Helda - Digital Repository of the University of Helsinki
Abstract:
This thesis is a collection of three essays on Bangladeshi microcredit. One essay examines the effect of microcredit on the cost of crime; the other two analyze the functioning mechanisms of microcredit programs, namely credit allocation rules and credit recovery policy. Essay 1 studies the demand for microcredit and its allocation rules. Microcredit is claimed to be the most effective means of supplying credit to the poorest of the poor in rural Bangladesh, yet this claim has not been examined among the households that actually demand microcredit. The results of this essay show that educated households are more likely to demand microcredit and that demand does not differ by sex. The results also show that microcredit programs follow different credit allocation rules for male and female applicants; for both sexes, education is an essential characteristic that credit programs consider in allocating credit. Essay 2 establishes a link between microcredit and the incidence of rural crime in Bangladesh. The basic hypothesis is that the joint group liability of microcredit programs gives group members an incentive to protect each other from criminal gangs in order to safeguard their own economic interests. The key finding of this essay is that the average cost of crime for non-borrowers is higher than that for borrowers; in particular, a 10% increase in credit reduces the cost of crime by 4.2%. The third essay analyzes the reasons for the high repayment rates of Bangladeshi microcredit programs. The existing literature argues that credit applicants are able to screen out high-risk applicants at the group formation stage using their superior local information, and that, owing to the joint liability mechanism of the programs, group members monitor each other's economic activities to ensure minimal misuse of credit. These arguments rest on the assumption that once the credit is provided, credit programs have no further role in ensuring that repayments are honored by the group. In contrast, using survey data, this essay documents that credit programs additionally apply organizational pressure, such as humiliating and harassing the non-payer, to recover unpaid installments. The results also show that the group mechanisms do not have a significant effect on recovering defaulted dues.
Abstract:
Texts in the work of a city department: a study of the language and context of benefit decisions. This dissertation examines documents granting or denying access to municipal services. The data consist of decisions on transport services made by the Social Services Department of the City of Helsinki. The circumstances surrounding official texts, their language and their production are studied through textual analysis and interviews. The dissertation describes the textual features of these decisions and seeks to explain them. Also explored are the topics and methods of genre studies, especially the relationship between text and context. Although the approach is linguistic, the dissertation also touches on research in social work and administrative decision making, and contributes to the more general discussion on the language and duties of public administration. My key premise is that a text is more than a mere psycholinguistic phenomenon: it is also a physical object and the result of certain production processes. This dissertation thus not only describes genre-specific features, but also sheds light on the work that generates the texts examined. Textual analysis and analyses of discursive practices are linked through an analysis of intertextuality: written decisions are compared with other application documents, such as expert statements and the applications themselves. The study shows that decisions are texts governed by strict rules and written with modest resources. Textwork is organised as hierarchical mass production. The officials who write decisions rely on standard phrases extracted from a computer system, which allows them to produce texts of uniform quality approved by the department's legal experts. Using a computer system in text production does not, however, serve all the needs of the writers, and this leads to many problems in the texts themselves. Intertextual analysis indicates that medical argumentation weighs most heavily in the application process, although a social appraisal should be carried out when deciding on applications for transport services. The texts reflect a hierarchy in which a physician ranks above the applicant, and the department's own expert physician ranks above the applicant's physician. My analysis also highlights good, but less obvious, practices. The social workers and secretaries who write decisions must balance conflicting demands. They use delicate linguistic means to adjust the standard phrases to individual cases and employ subtle strategies of politeness. The dissertation suggests that the customer contact staff who write official texts should be allowed to make better use of their professional competence. A more general concern is that legislation and new management strategies require more and more documentation, yet textwork is only rarely taken into account in the allocation of resources. Keywords: (critical) text analysis, genre analysis, administration, social work, administrative language, texts, genres, context, intertextuality, discursive practices
Abstract:
This study highlights the formation of an artifact designed to mediate exploratory collaboration. The data for this study were collected during a Finnish adaptation of the thinking together approach. The aim of the approach is to teach pupils how to engage in an educationally beneficial form of joint discussion, namely exploratory talk. At the heart of the approach lies a set of conversational ground rules aimed at promoting the use of exploratory talk. The theoretical framework of the study is based on a sociocultural perspective on learning. A central argument in the framework is that physical and psychological tools play a crucial role in human action and learning: with the help of tools, humans can escape the direct stimulus of the outside world and learn to control themselves. During the implementation of the approach, the classroom community negotiates a set of six rules, which this study conceptualizes as an artifact that mediates exploratory collaboration. Prior research on the thinking together approach has not examined the formation of the rules in depth, which gives ample reason to conduct this study. The specific research questions were: What kind of negotiation trajectories did the ground rules form during the intervention? What meanings were negotiated for the ground rules during the intervention? The methodological framework of the study is based on discourse analysis, specified by adapting the social construction of intertextuality to analyze the meanings negotiated for the created rules. The study has two units of analysis: the thematic episode and the negotiation trajectory. A thematic episode is a stretch of talk-in-interaction in which the participants talk about a certain ground rule or a theme relating to it. A negotiation trajectory is a chronological representation of the negotiation process of a certain ground rule during the intervention, and is constructed of thematic episodes. Thematic episodes were analyzed with the adapted intertextuality analysis, and a contrastive analysis was done on the trajectories. Lastly, the meanings negotiated for the created rules were compared to the guidelines provided by the approach. The main result of the study is the observation that the meanings of the created rules were more aligned with the ground rules of cumulative talk than with those of exploratory talk. Although meanings relating to exploratory talk were also negotiated, they were clearly not the dominant form. In addition, the study observed that the trajectories of the rules were non-identical: despite sharing connecting dimensions (symmetry, composition, continuity and explicitness), none of the trajectories shared exactly the same features as the others.
Abstract:
This thesis utilises an evidence-based approach to critically evaluate and summarize effectiveness research on physiotherapy, physiotherapy-related motor-based interventions and orthotic devices in children and adolescents with cerebral palsy (CP). It aims to assess the methodological challenges of the systematic reviews and trials, to evaluate the effectiveness of interventions in current use, and to make suggestions for future trials. Methods: Systematic reviews were searched from computerized bibliographic databases up to August 2007 for physiotherapy and physiotherapy-related interventions, and up to May 2003 for orthotic devices. Two reviewers independently identified, selected, and assessed the quality of the reviews using the Overview Quality Assessment Questionnaire complemented with decision rules. From a sample of 14 randomized controlled trials (RCTs) published between January 1990 and June 2003 we analysed the methods of sampling, recruitment, and comparability of groups; defined the components of a complex intervention; identified outcome measures based on the International Classification of Functioning, Disability and Health (ICF); analysed the clinical interpretation of score changes; and analysed trial reporting using a modified 33-item CONSORT (Consolidated Standards of Reporting Trials) checklist. The effectiveness of physiotherapy and physiotherapy-related interventions in children with diagnosed CP was evaluated in a systematic review of randomised controlled trials searched from computerized databases from January 1990 up to February 2007. Two reviewers independently assessed the methodological quality, extracted the data, classified the outcomes using the ICF, and rated the level of evidence according to van Tulder et al. (2003). Results: We identified 21 reviews on physiotherapy and physiotherapy-related interventions and five on orthotic devices. These reviews summarized 23 and 5 randomised controlled trials and 104 and 27 observational studies, respectively. Only six reviews were of high quality. These found some evidence supporting strength training, constraint-induced movement therapy or hippotherapy, and insufficient evidence on comprehensive interventions. Based on the original studies included in the reviews on orthotic devices, we found some short-term effects of lower limb casting on passive range of movement, and of ankle-foot orthoses on equinus walk. Long-term effects of lower limb orthoses have not been studied. Evidence on upper limb casting or orthoses is conflicting. In the sample of 14 RCTs, most trials used simple randomisation complemented with matching or stratification, but only three specified concealed allocation. Numerous studies provided sufficient details on the components of a complex intervention, but the overlap of outcome measures across studies was poor and the clinical interpretation of observed score changes was mostly missing. Almost half (48%) of the applicable CONSORT-based items (range 28-32) were reported adequately. Most reporting inadequacies concerned outcome measures, sample size determination, details of the sequence generation, allocation concealment and implementation of the randomization, success of assessor blinding, recruitment and follow-up dates, intention-to-treat analysis, precision of the effect size, co-interventions, and adverse events. The systematic review identified 22 trials in eight intervention categories. Four trials were of high quality.
Moderate evidence of effectiveness was established for upper extremity treatments on attained goals, active supination and developmental status, and for constraint-induced therapy on the amount and quality of hand use and new emerging behaviours. Moderate evidence of ineffectiveness was found for strength training's effect on walking speed and stride length. Conflicting evidence was found for strength training's effect on gross motor function. For the other intervention categories the evidence was limited due to the low methodological quality and the statistically non-significant results of the studies. Conclusions: The high-quality reviews provide both supportive and insufficient evidence on some physiotherapy interventions. The poor quality of most reviews calls for caution, although most reviews drew no conclusions on effectiveness because of the poor quality of the primary studies. A considerable number of RCTs of good to fair methodological and reporting quality indicate that informative and well-reported RCTs on complex interventions in children and adolescents with CP are feasible. Nevertheless, methodological improvement is needed in certain areas of trial design and performance, and trial authors are encouraged to follow the CONSORT criteria. Based on the RCTs, we established moderate evidence for some effectiveness of upper extremity training. Due to limitations in methodological quality and variations in population, interventions and outcomes, mostly limited evidence on the effectiveness of most physiotherapy interventions is available to guide clinical practice. Well-designed trials are needed, especially for focused physiotherapy interventions.
Abstract:
Analyzing statistical dependencies is a fundamental problem in all empirical science. Dependencies help us understand causes and effects, create new scientific theories, and invent cures to problems. Nowadays large amounts of data are available, but efficient computational tools for analyzing the data are missing. In this research we develop efficient algorithms for a commonly occurring search problem: searching for the statistically most significant dependency rules in binary data. We consider dependency rules of the form X->A or X->not A, where X is a set of positive-valued attributes and A is a single attribute. Such rules describe which factors either increase or decrease the probability of the consequent A. A classical example is genetic and environmental factors, which can either cause or prevent a disease. The emphasis in this research is that the discovered dependencies should be genuine, i.e. they should also hold in future data. This is an important distinction from traditional association rules, which, in spite of their name and a similar appearance to dependency rules, do not necessarily represent statistical dependencies at all, or represent only spurious connections that occur by chance. Therefore, the principal objective is to search for rules using statistical significance measures. Another important objective is to search only for non-redundant rules, which express the real causes of the dependence without any occasional extra factors. The extra factors do not add any new information on the dependence, but can only blur it and make it less accurate in future data. The problem is computationally very demanding, because the number of all possible rules increases exponentially with the number of attributes. In addition, neither statistical dependency nor statistical significance is a monotonic property, which means that traditional pruning techniques do not work. As a solution, we first derive the mathematical basis for pruning the search space with any well-behaving statistical significance measure. The mathematical theory is complemented by a new algorithmic invention, which enables an efficient search without any heuristic restrictions. The resulting algorithm can be used to search for both positive and negative dependencies with any commonly used statistical measure, such as Fisher's exact test, the chi-squared measure, mutual information, or z-scores. According to our experiments, the algorithm scales well, especially with Fisher's exact test; it can easily handle even the densest data sets with 10,000-20,000 attributes. Still, the results are globally optimal, which is a remarkable improvement over existing solutions. In practice, this means that the user does not have to worry whether the dependencies hold in future data or whether the data still contain better, but undiscovered, dependencies.
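As a purely illustrative companion to the abstract above, the sketch below scores a single candidate rule X -> A with Fisher's exact test on a 2x2 contingency table. It is not the thesis's search algorithm or pruning machinery; the DataFrame, column names and data are hypothetical.

```python
# Minimal sketch (not the thesis's algorithm): significance of one dependency rule
# X -> A via Fisher's exact test. Column names and data are hypothetical.
import pandas as pd
from scipy.stats import fisher_exact

def rule_p_value(df, antecedent, consequent):
    """One-sided Fisher p-value for the positive dependency rule: antecedent -> consequent."""
    x = df[antecedent].all(axis=1)              # rows where every attribute in X is 1
    a = df[consequent].astype(bool)             # rows where the consequent A is 1
    table = [[int((x & a).sum()),  int((x & ~a).sum())],
             [int((~x & a).sum()), int((~x & ~a).sum())]]
    _, p = fisher_exact(table, alternative="greater")
    return p

# Hypothetical usage: two binary risk factors and a disease indicator.
data = pd.DataFrame({"factor_1": [1, 1, 0, 0, 1, 0, 1, 0],
                     "factor_2": [1, 0, 1, 0, 1, 0, 1, 0],
                     "disease":  [1, 1, 0, 0, 1, 0, 1, 0]})
print(rule_p_value(data, ["factor_1", "factor_2"], "disease"))
```

An exhaustive search as described in the abstract would evaluate such a measure over exponentially many candidate antecedent sets, which is why the pruning theory is the core contribution; this snippet only shows how one rule is scored.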
Abstract:
Economic and Monetary Union can be characterised as a complicated set of legislation and institutions governing monetary and fiscal responsibilities. Fiscal responsibility is to be guided by the Stability and Growth Pact, which sets rules for fiscal policy and makes discretionary fiscal policy virtually impossible. To analyse the effects of the fiscal and monetary policy mix, we modify the New Keynesian framework to allow for supply effects of fiscal policy. We show that defining a supply-side channel for fiscal policy through an endogenous output gap changes the stabilising properties of monetary policy rules. The stability conditions are affected by fiscal policy, so that the dichotomy between active (passive) monetary policy and passive (active) fiscal policy as stabilising regimes does not hold, and an active monetary - active fiscal policy regime can be consistent with dynamical stability of the economy. We show that taking supply-side effects into account yields more persistent inflation and output reactions. We also show that the dichotomy does not hold for a variety of fiscal policy rules based on government debt and the budget deficit, using the tax smoothing hypothesis and formulating the tax rules as difference equations. The debt rule with active monetary policy results in indeterminacy, while the deficit rule produces a determinate solution with active monetary policy, even when fiscal policy is also active. Combining the fiscal requirements in a single rule results in cyclical responses to shocks; the amplitude of the cycle is larger with more weight on debt than on deficit. Combining optimised monetary policy with fiscal policy rules means that, under discretionary monetary policy, the fiscal policy regime affects the size of the inflation bias. We also show that commitment to an optimal monetary policy not only corrects the inflation bias but also increases the persistence of output reactions. With fiscal policy rules based on the deficit, the tax smoothing hypothesis can be retained even in a sticky-price model.
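To make the policy-rule vocabulary above concrete, the sketch below shows a contemporaneous Taylor rule and a simple tax rule that feeds back on lagged government debt. It is an illustration only, with hypothetical parameter values; it is not the thesis's New Keynesian model, and the thesis's richer debt, deficit and combined rules are not reproduced here.

```python
# Illustrative sketch with hypothetical parameters, not the thesis's model:
# a contemporaneous Taylor rule plus a debt-feedback tax rule (a difference equation).
import numpy as np

phi_pi, phi_y = 1.5, 0.5        # "active" monetary policy: inflation response above one
gamma_b, r = 0.05, 0.03         # tax response to lagged debt; real interest rate

def taylor_rate(inflation, output_gap, r_star=0.02, pi_star=0.02):
    """Nominal policy rate implied by a contemporaneous Taylor rule."""
    return r_star + inflation + phi_pi * (inflation - pi_star) + phi_y * output_gap

T = 60
debt = np.empty(T)
debt[0] = 0.6                                   # initial debt-to-GDP ratio
for t in range(1, T):
    tax = gamma_b * debt[t - 1]                 # fiscal rule: taxes react to lagged debt
    debt[t] = (1 + r) * debt[t - 1] - tax       # government budget constraint
# Here debt[t] = (1 + r - gamma_b) * debt[t-1], so for small gamma_b the debt path
# is stable exactly when the tax response exceeds the interest rate (gamma_b > r).
print(round(taylor_rate(0.03, 0.01), 4), round(debt[-1], 3))
```

In the thesis the corresponding rules interact with the full sticky-price model, which is what generates the indeterminacy, determinacy and cyclical results summarized above; this toy recursion only illustrates the form of such feedback rules.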
Abstract:
The plasma membrane adopts a myriad of different shapes to carry out essential cellular processes such as nutrient uptake, immunological defence mechanisms and cell migration. Therefore, the details of how different plasma membrane structures are made and remodelled are of the utmost importance. Bending the plasma membrane into different shapes requires a substantial amount of force, which can be provided by the actin cytoskeleton; however, the molecules that regulate the interplay between the actin cytoskeleton and the plasma membrane have remained elusive. Recent findings have placed new types of effectors at sites of plasma membrane remodelling, including BAR proteins, which can directly bind and deform the plasma membrane into different shapes. In addition to their membrane-bending abilities, BAR proteins also harbor protein domains that intimately link them to the actin cytoskeleton. The ancient BAR domain fold has evolved into at least three structurally and functionally different sub-groups: the BAR, F-BAR and I-BAR domains. This thesis work describes the discovery and functional characterization of the Inverse-BAR (I-BAR) domains. Using synthetic model membranes, we have shown that I-BAR domains bind and deform membranes into tubular structures through a binding surface composed of positively charged amino acids. Importantly, the membrane-binding surface of I-BAR domains displays an inverse geometry to that of the BAR and F-BAR domains, and these structural differences explain why I-BAR domains induce cell protrusions whereas BAR and most F-BAR domains induce cell invaginations. In addition, our results indicate that the binding of I-BAR domains to membranes can alter the spatial organization of phosphoinositides within membranes. Intriguingly, we also found that some I-BAR domains can insert helical motifs into the membrane bilayer, which has important consequences for their membrane binding/bending functions. In mammals there are five I-BAR domain containing proteins. Cell biological studies on ABBA revealed that it is highly expressed in radial glial cells during the development of the central nervous system and plays an important role in the extension process of radial glia-like C6R cells by regulating lamellipodial dynamics through its I-BAR domain. To reveal the role of these proteins in the context of whole animals, we analyzed MIM knockout mice and found that MIM is required for proper renal function in adult mice. MIM-deficient mice displayed a severe urine concentration defect due to defective intercellular junctions of the kidney epithelia. Consistently, MIM localized to adherens junctions in cultured kidney epithelial cells, where it promoted actin assembly through its I-BAR and WH2 domains. In summary, this thesis describes the mechanism by which I-BAR proteins deform membranes and provides information about the biological role of these proteins, which to our knowledge are the first proteins shown to directly deform the plasma membrane to make cell protrusions.
Abstract:
Social groups are common across animal species. The reasons for grouping are straightforward when all individuals gain directly from cooperating. However, the situation becomes more complex when helping entails costs to the personal reproduction of individuals. Kin selection theory has offered a fruitful framework for explaining such cooperation by stating that individuals may spread their genes not only through their own reproduction, but also by helping related individuals reproduce. However, kin selection theory also implicitly predicts conflicts when groups consist of non-clonal individuals, i.e. when relatedness is less than one. Then individual interests are not perfectly aligned, and each individual is predicted to favour the propagation of its own genome over others'. Social insects provide a solid system in which to study the interplay between cooperation and conflict. Breeding systems in social insects range from solitary breeding to eusocial colonies displaying a complete division of reproduction between the fertile queen and the sterile worker caste. Within colonies, additional variation is provided by the presence of several reproductive individuals. In many species the queen mates multiply, which causes the colony to consist of half-sib instead of full-sib offspring. Furthermore, in many species colonies contain multiple breeding queens, which further dilutes relatedness between colony members. Evolutionary biology is thus faced with the challenge of explaining why such variation in social structure exists, and what its consequences are at the individual and population level. The main part of this thesis takes on this challenge by investigating the dynamics of socially polymorphic ant colonies. The first four chapters investigate the causes and consequences of different social structures, using a combination of field studies, genetic analyses and laboratory experiments. The thesis ends with a theoretical chapter focusing on different social interactions (altruism and spite) and the evolution of harming traits. The main results of the thesis show that social polymorphism has the potential to affect the behaviour and traits of both individuals and colonies. For example, we found that genetic polymorphism may increase the phenotypic variation between individuals in colonies, and that socially polymorphic colonies may show different life history patterns. We also show that colony cohesion may be enhanced even in multiple-queen colonies through patterns of unequal reproduction between queens. However, the thesis also demonstrates that spatial and temporal variation between both populations and environments may affect individual and colony traits, to the degree that results obtained in one place or at one time may not be applicable in other situations. This opens up potential further areas of research to explain these differences.
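For reference, the kin-selection argument invoked above is conventionally summarized by Hamilton's rule; the inequality below is a standard textbook statement added here for illustration, not a quotation from the thesis.

```latex
% Hamilton's rule: an allele for helping is favoured when the benefit b to the
% recipient, weighted by relatedness r, exceeds the cost c to the helper.
\[
  r\,b > c
\]
% With non-clonal group members (r < 1), interests are only partly aligned,
% which underlies the within-colony conflicts discussed in the abstract.
```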
Abstract:
This thesis studies the interest-rate policy of the ECB by estimating monetary policy rules using real-time data and central bank forecasts. The aim of the estimations is to characterize a decade of common monetary policy and to examine how different models perform at this task. The estimated rules include contemporaneous Taylor rules, forward-looking Taylor rules, nonlinear rules and forecast-based rules. The nonlinear models allow for the possibility of zone-like preferences and an asymmetric response to key variables. The models therefore encompass the most popular sub-group of simple models used for policy analysis as well as the more unusual nonlinear approach. In addition to the empirical work, this thesis also contains a more general discussion of monetary policy rules, mostly from a New Keynesian perspective. This discussion includes an overview of some notable related studies, optimal policy, policy gradualism and several other related subjects. The regression estimations are performed with either least squares or the generalized method of moments, depending on the requirements of the estimation. The estimations use data from both the Euro Area Real-Time Database and the central bank forecasts published in ECB Monthly Bulletins. These data sources represent some of the best data available for this kind of analysis. The main results of this thesis are that forward-looking behavior appears highly prevalent, but that standard forward-looking Taylor rules offer only ambivalent results with regard to inflation. Nonlinear models are shown to work, but do not have a strong rationale over a simpler linear formulation. The forecasts, however, appear to be highly useful in characterizing policy and may offer the most accurate depiction of a predominantly forward-looking central bank. In particular, the inflation response appears much stronger, while the output response becomes highly forward-looking as well.
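As a purely illustrative sketch of the kind of rule estimation described above, the code below fits a forecast-based Taylor rule with interest-rate smoothing by least squares on simulated data. The specification, series names and numbers are hypothetical; they are not the thesis's exact models or the ECB real-time data.

```python
# Minimal sketch: estimating a forecast-based Taylor rule with smoothing by OLS.
# All series and parameters are hypothetical (simulated), for illustration only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 120                                              # hypothetical monthly sample
df = pd.DataFrame({
    "pi_forecast": rng.normal(2.0, 0.5, n),          # inflation forecast (%)
    "output_gap":  rng.normal(0.0, 1.0, n),          # real-time output gap (%)
})
rate = np.zeros(n)
for t in range(1, n):                                # simulate a smoothed policy rate
    target = 1.0 + 1.5 * df.pi_forecast[t] + 0.5 * df.output_gap[t]
    rate[t] = 0.8 * rate[t - 1] + 0.2 * target + rng.normal(0.0, 0.1)
df["rate"] = rate
df["rate_lag"] = df["rate"].shift(1)

sample = df.dropna()
X = sm.add_constant(sample[["pi_forecast", "output_gap", "rate_lag"]])
fit = sm.OLS(sample["rate"], X).fit()
print(fit.params)   # long-run inflation response = coef(pi_forecast) / (1 - coef(rate_lag))
```

Forward-looking specifications with expected future variables typically require instrumental-variable estimation (the generalized method of moments mentioned in the abstract); the simple OLS fit above only illustrates the shape of the regression.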