982 results for Home rule
Abstract:
This thesis consists of an introduction to the optimal use of taxes and government expenditure and three chapters analysing these themes in more depth. Chapter 2 analyses to what extent a given amount of subsidies affects the labour supply of parents. The municipal supplement to the Finnish home care allowance provides exogenous variation in the labour supply decision of a parent. This kind of subsidy, tied to staying at home instead of working, is found to have a fairly large effect on the labour supply decisions of parents. Chapter 3 studies theoretically when it is optimal to provide private goods publicly. In the setup of the model, the government sets income taxes optimally and provides a private good if it is beneficial to do so. The analysis results in an optimal provision rule according to which the good should be provided when it lowers the threshold for participation in the labour force. Chapter 4 investigates what happened to prices and demand when the value added tax on hairdressing services in Finland was cut from 22 per cent to 8 per cent. The pass-through to prices was about half of full pass-through, and no clear indication of increased demand for the services or of an improved employment situation in the sector is found.
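As a rough illustration of the pass-through arithmetic (a back-of-the-envelope sketch, not the thesis's estimation method): under full pass-through, the tax-inclusive price scales with (1 + VAT), so the cut from 22 to 8 per cent would imply a price drop of about 11.5 per cent, and half pass-through about 5.7 per cent.

```python
# Back-of-the-envelope pass-through arithmetic for the Finnish VAT cut.
# The rates come from the abstract; the calculation is illustrative only.
old_vat, new_vat = 0.22, 0.08

# Under full pass-through, consumer prices scale with (1 + VAT).
full_drop = 1 - (1 + new_vat) / (1 + old_vat)  # ~11.5% price decrease
half_drop = full_drop / 2                      # ~5.7%, about half of full

print(f"full pass-through price drop: {full_drop:.1%}")
print(f"half pass-through price drop: {half_drop:.1%}")
```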
Abstract:
A two-stage iterative algorithm for selecting a subset of a training set of samples for use in a condensed nearest neighbor (CNN) decision rule is introduced. The proposed method uses the concept of mutual nearest neighborhood for selecting samples close to the decision line. The efficacy of the algorithm is brought out by means of an example.
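For orientation, a minimal sketch of the classical condensing step (Hart's CNN procedure) that the two-stage algorithm builds on; the mutual-nearest-neighborhood selection of samples near the decision line is the paper's contribution and is not reproduced here. Array shapes and NumPy usage are assumptions for illustration.

```python
import numpy as np

def condense(X, y, seed=0):
    """Hart's condensed nearest neighbour: keep a subset of (X, y) that
    classifies the full training set correctly with 1-NN. Illustrative
    sketch only; not the paper's mutual-nearest-neighborhood method."""
    order = np.random.default_rng(seed).permutation(len(X))
    keep = [int(order[0])]              # start from one random sample
    changed = True
    while changed:
        changed = False
        for i in order:
            # classify sample i by its nearest neighbour in the kept subset
            d = np.linalg.norm(X[keep] - X[i], axis=1)
            if y[keep[int(np.argmin(d))]] != y[i]:
                keep.append(int(i))     # misclassified: add it to the subset
                changed = True
    return np.array(keep)
```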
Abstract:
The aim of this thesis was to examine the understanding of community in George Lindbeck's The Nature of Doctrine. Intrinsic to this question was also examining how Lindbeck understands the relation between the text and the world, which both meet in a Christian community. Thirdly, this study also aimed at understanding what the persuasiveness of this understanding depends on. The method applied for this task was systematic analysis. The study was conducted by first providing an orientation into the non-theological substance of the ND, which was assumed useful with respect to the aim of this study. The study then went on to explore Lindbeck in his own context of postliberal theology in order to see how the ND was received. It also attempted to provide a picture of how the ND relates to Lindbeck as a theologian. The third chapter was a descriptive analysis of the cultural-linguistic perspective, which is understood as being directly proportional to his understanding of community. The fourth chapter was an analysis of how the cultural-linguistic perspective sees the relation between the text and the world. When religion is understood from a cultural-linguistic perspective, it presents itself as a cultural-linguistic entity, which Lindbeck understands as a comprehensive interpretive scheme that structures human experience and one's understanding of oneself and the world in which one lives. When one exists in this entity, it is the entity that shapes the subjectivities of all those who are at home in it, which makes participation in the life of a cultural-linguistic entity a condition for understanding it. Religion is above all an external word that moulds and shapes our religious existence and experience. Understanding faith, then, as coming from hearing is something that correlates with the cultural-linguistic depiction of reality. Religion informs us of a religious reality; it does not originate in any way from ourselves. This externality, linked to the axiomatic nature of religion, is also something that distinguishes Lindbeck sharply from liberalist tendencies, which understand religion as ultimately expressing the prereflective depths of the inner self. Language is the central analogy for understanding the medium in which one moves when inhabiting a cultural-linguistic system, because language is the transmitting medium in which the cultural-linguistic system is embodied. The realism entailed in Lindbeck's understanding of a community is that we are fundamentally on the receiving end when it comes to our identities, whether cultural or religious. We always witness to something. Its persuasiveness rests on the fact that we never exist in an unpersuaded reality. The language of Christ is a self-sustaining and irreducible cultural-linguistic entity, which is ontologically founded upon Christ. It transmits the reality of a new being. The basic relation to the world for a Christian is that of witnessing salvation in Christ: witnessing Christ as the home of hearing the message of salvation, which is the God-willed way. Following this logic, the relation of the world and the text is one of relating to the world from the text, i.e. in Christ, through the word (text), for the world, because it assumes its logic from the way Christ ontologically relates to us.
Abstract:
The paper examines the suitability of the generalized delta rule in training artificial neural networks (ANN) for damage identification in structures. Several multilayer perceptron architectures are investigated for a typical bridge truss structure with simulated damage states generated randomly. The training samples have been generated in terms of measurable structural parameters (displacements and strains) at suitably selected locations in the structure. Issues related to the performance of the network with reference to hidden layers and hidden neurons are examined. Some heuristics are proposed for the design of neural networks for damage identification in structures. These are further supported by an investigation conducted on five other bridge truss configurations.
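For reference, a minimal sketch of the generalized delta rule (backpropagation) for a one-hidden-layer perceptron; the sigmoid activation, squared-error loss, and learning rate are illustrative assumptions, not values from the paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_step(x, t, W1, W2, lr=0.1):
    """One generalized-delta-rule update for a one-hidden-layer MLP.
    x: input vector, t: target vector; biases omitted for brevity."""
    h = sigmoid(W1 @ x)                            # hidden activations
    y = sigmoid(W2 @ h)                            # network output
    delta_out = (y - t) * y * (1 - y)              # output-layer delta
    delta_hid = (W2.T @ delta_out) * h * (1 - h)   # backpropagated delta
    W2 -= lr * np.outer(delta_out, h)              # gradient-descent updates
    W1 -= lr * np.outer(delta_hid, x)
    return W1, W2
```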
Abstract:
To establish itself within the host, Mycobacterium tuberculosis (Mtb) has evolved various means of attacking the host system. One such crucial strategy is the exploitation of the iron resources of the host. Obtaining and maintaining the required concentration of iron becomes a matter of contest between the host and the pathogen, both trying to achieve this through complex molecular networks. The extent of complexity makes it important to obtain a systems perspective of the interplay between the host and the pathogen with respect to iron homeostasis. We have reconstructed a systems model comprising 92 components and 85 protein-protein or protein-metabolite interactions, which have been captured as a set of 194 rules. Apart from the interactions, these rules also account for protein synthesis and decay, RBC circulation, and bacterial production and death rates. We have used a rule-based modelling approach, Kappa, to simulate the system separately under infection and non-infection conditions. Various perturbations, including knock-outs and dual perturbations, were also carried out to monitor the behavioral change of important proteins and metabolites. From this, key components as well as the required controlling factors in the model that are critical for maintaining iron homeostasis were identified. The model is able to re-establish the importance of the iron-dependent regulator (ideR) in Mtb and transferrin (Tf) in the host. Perturbations in which iron storage is increased appear to enhance nutritional immunity, and the analysis indicates how they can be harmful for the host. Instead, decreasing the rate of iron uptake by Tf may prove to be helpful. Simulation and perturbation studies help in identifying Tf as a possible drug target. Regulating the mycobactin (myB) concentration was also identified as a possible strategy to control bacterial growth. The simulations thus provide significant insight into iron homeostasis and also into identifying possible drug targets for tuberculosis.
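The model itself is written in Kappa; as a language-agnostic illustration of what rule-based stochastic simulation does, here is a minimal Gillespie-style loop over two hypothetical iron-binding rules (species names and rates are invented for the sketch, not taken from the model).

```python
import math
import random

# Two hypothetical rules (names and rates are illustrative only):
#   Tf  + Fe -> Tf.Fe   (host transferrin sequesters free iron)
#   myB + Fe -> myB.Fe  (mycobactin captures free iron for Mtb)
state = {"Tf": 100, "Fe": 50, "myB": 20, "Tf.Fe": 0, "myB.Fe": 0}
rules = [
    (("Tf", "Fe"), ("Tf.Fe",), 0.01),
    (("myB", "Fe"), ("myB.Fe",), 0.05),
]

t, t_end = 0.0, 10.0
while t < t_end:
    # propensity of each rule = rate * product of its reactant counts
    props = [rate * math.prod(state[r] for r in lhs) for lhs, _, rate in rules]
    total = sum(props)
    if total == 0:
        break                                     # nothing left to fire
    t += random.expovariate(total)                # time to the next event
    lhs, rhs, _ = random.choices(rules, weights=props)[0]
    for r in lhs:
        state[r] -= 1                             # consume reactants
    for p in rhs:
        state[p] += 1                             # produce products
print(state)
```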
Abstract:
Mining association rules from a large collection of databases is based on two main tasks: one is the generation of large itemsets, and the other is finding associations between the discovered large itemsets. Existing formalisms for association rules are based on a single transaction database, which is not sufficient to describe association rules in a multiple-database environment. In this paper, we give a general characterization of association rules and also give a framework for knowledge-based mining of multiple databases for association rules.
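For orientation, a minimal sketch of the two tasks in the single-database setting (naive Apriori-style itemset counting, then rule extraction); the toy transactions and thresholds are illustrative, and the paper's multiple-database framework is not reproduced here.

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Naive large-itemset generation by exhaustive counting (illustration)."""
    items = sorted({i for t in transactions for i in t})
    freq = {}
    for size in range(1, len(items) + 1):
        found = False
        for cand in combinations(items, size):
            support = sum(set(cand) <= t for t in transactions) / len(transactions)
            if support >= min_support:
                freq[cand] = support
                found = True
        if not found:
            break   # Apriori property: no larger itemset can be frequent
    return freq

def rules(freq, min_conf):
    """Extract rules A -> B with confidence support(A u B) / support(A)."""
    for itemset, sup in freq.items():
        for k in range(1, len(itemset)):
            for lhs in combinations(itemset, k):
                if lhs in freq and sup / freq[lhs] >= min_conf:
                    rhs = tuple(i for i in itemset if i not in lhs)
                    yield lhs, rhs, sup / freq[lhs]

tx = [{"milk", "bread"}, {"milk", "eggs"}, {"milk", "bread", "eggs"}]
print(list(rules(frequent_itemsets(tx, 0.5), 0.8)))
```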
Abstract:
Theoretical and computational frameworks for synaptic plasticity and learning have a long and cherished history, with few parallels within the well-established literature for plasticity of voltage-gated ion channels. In this study, we derive rules for plasticity in the hyperpolarization-activated cyclic nucleotide-gated (HCN) channels, and assess the synergy between synaptic and HCN channel plasticity in establishing stability during synaptic learning. To do this, we employ a conductance-based model for the hippocampal pyramidal neuron, and incorporate synaptic plasticity through the well-established Bienenstock-Cooper-Munro (BCM)-like rule for synaptic plasticity, wherein the direction and strength of the plasticity is dependent on the concentration of calcium influx. Under this framework, we derive a rule for HCN channel plasticity to establish homeostasis in synaptically-driven firing rate, and incorporate such plasticity into our model. In demonstrating that this rule for HCN channel plasticity helps maintain firing rate homeostasis after bidirectional synaptic plasticity, we observe a linear relationship between synaptic plasticity and HCN channel plasticity for maintaining firing rate homeostasis. Motivated by this linear relationship, we derive a calcium-dependent rule for HCN-channel plasticity, and demonstrate that firing rate homeostasis is maintained in the face of synaptic plasticity when moderate and high levels of cytosolic calcium influx induced depression and potentiation of the HCN-channel conductance, respectively. Additionally, we show that such synergy between synaptic and HCN-channel plasticity enhances the stability of synaptic learning through metaplasticity in the BCM-like synaptic plasticity profile. Finally, we demonstrate that the synergistic interaction between synaptic and HCN-channel plasticity preserves robustness of information transfer across the neuron under a rate-coding schema. Our results establish specific physiological roles for experimentally observed plasticity in HCN channels accompanying synaptic plasticity in hippocampal neurons, and uncover potential links between HCN-channel plasticity and calcium influx, dynamic gain control and stable synaptic learning.
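A schematic sketch of the calcium-dependent logic described above: a BCM-like synaptic update whose direction depends on calcium relative to a sliding threshold, paired with an HCN-conductance update in which moderate calcium induces depression and high calcium induces potentiation of g_h. All functional forms, thresholds, and rates are assumptions for illustration, not the paper's equations.

```python
def bcm_update(w, ca, theta, lr_w=0.01):
    """BCM-like rule: depression for calcium below threshold theta,
    potentiation above it (illustrative functional form)."""
    return w + lr_w * ca * (ca - theta)

def hcn_update(g_h, ca, ca_mid=0.5, ca_high=1.0, lr_g=0.005):
    """Schematic HCN-conductance rule (assumed form): moderate calcium
    depresses g_h, high calcium potentiates it, low calcium leaves it."""
    if ca >= ca_high:
        return g_h + lr_g      # high influx -> potentiation of g_h
    if ca >= ca_mid:
        return g_h - lr_g      # moderate influx -> depression of g_h
    return g_h

w, g_h, theta = 1.0, 0.1, 0.6
for ca in (0.2, 0.7, 1.2):                  # sample calcium influx levels
    w = bcm_update(w, ca, theta)
    g_h = hcn_update(g_h, ca)
    theta = 0.9 * theta + 0.1 * ca ** 2     # sliding BCM threshold
print(w, g_h, theta)
```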
Abstract:
In the underlay mode of cognitive radio, secondary users are allowed to transmit when the primary is transmitting, but under tight interference constraints that protect the primary. However, these constraints limit the secondary system performance. Antenna selection (AS)-based multiple antenna techniques, which exploit spatial diversity with less hardware, help improve secondary system performance. We develop a novel and optimal transmit AS rule that minimizes the symbol error probability (SEP) of an average interference-constrained multiple-input single-output secondary system that operates in the underlay mode. We show that the optimal rule is a non-linear function of the power gains of the channels from the secondary transmit antenna to the primary receiver and to the secondary receive antenna. We also propose a simpler, tractable variant of the optimal rule that performs as well as the optimal rule. We then analyze its SEP with L transmit antennas, and extensively benchmark it against several heuristic selection rules proposed in the literature. We also enhance these rules in order to provide a fair comparison, and derive new expressions for their SEPs. The results bring out new inter-relationships between the various rules, and show that the optimal rule can significantly reduce the SEP.
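The abstract does not state the functional form of the optimal rule, but structurally transmit AS reduces to evaluating a per-antenna metric of the two power gains and picking the best antenna. The metric below is a hypothetical placeholder (penalize interference to the primary, reward the secondary link), not the paper's derived rule.

```python
import numpy as np

def select_antenna(g_sp, g_ss, metric):
    """Generic transmit antenna selection: score each of the L antennas by a
    metric of (gain to primary receiver, gain to secondary receiver) and
    return the index with the lowest score (an SEP proxy)."""
    return int(np.argmin([metric(p, s) for p, s in zip(g_sp, g_ss)]))

# Hypothetical metric for illustration only; the paper derives the actual
# optimal non-linear rule for the average-interference-constrained system.
metric = lambda p, s: p / (1.0 + s)

rng = np.random.default_rng(0)
L = 4
g_sp = rng.exponential(1.0, L)   # Rayleigh fading -> exponential power gains
g_ss = rng.exponential(1.0, L)
print("selected antenna:", select_antenna(g_sp, g_ss, metric))
```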
Abstract:
In many systems, nucleation of a stable solid may occur in the presence of other (often more than one) metastable phases. These may be polymorphic solids or even liquid phases. Sometimes, the metastable phase might have a lower free energy minimum than the liquid but higher than the stable-solid-phase minimum and have characteristics in between the parent liquid and the globally stable solid phase. In such cases, nucleation of the solid phase from the melt may be facilitated by the metastable phase because the latter can "wet" the interface between the parent and the daughter phases, even though there may be no signature of the existence of the metastable phase in the thermodynamic properties of the parent liquid and the stable solid phase. Straightforward application of classical nucleation theory (CNT) is flawed here as it overestimates the nucleation barrier because surface tension is overestimated (by neglecting the metastable phases of intermediate order) while the thermodynamic free energy gap between daughter and parent phases remains unchanged. In this work, we discuss a density functional theory (DFT)-based statistical mechanical approach to explore and quantify such facilitation. We construct a simple order-parameter-dependent free energy surface that we then use in DFT to calculate (i) the order parameter profile, (ii) the overall nucleation free energy barrier, and (iii) the surface tension between the parent liquid and the metastable solid and also between the parent liquid and stable solid phases. The theory indeed finds that the nucleation free energy barrier can decrease significantly in the presence of wetting. This approach can provide a microscopic explanation of the Ostwald step rule and the well-known phenomenon of "disappearing polymorphs" that depends on temperature and other thermodynamic conditions. Theory reveals a diverse scenario for phase transformation kinetics, some of which may be explored via modern nanoscopic synthetic methods.
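For context, the standard classical nucleation theory expression (a textbook formula, not derived in the abstract) makes the overestimation argument concrete: the barrier grows with the cube of the surface tension, so neglecting an interface-wetting metastable phase, which lowers the effective surface tension, inflates the predicted barrier while the bulk driving force is unchanged.

```latex
% CNT barrier for a spherical nucleus:
%   \gamma     : parent--nucleus surface tension
%   \Delta G_v : bulk free-energy difference per unit volume (driving force)
\Delta G^{*} = \frac{16 \pi \, \gamma^{3}}{3 \, (\Delta G_v)^{2}}
```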
Abstract:
The influence of the flow rule on the bearing capacity of strip foundations placed on sand was investigated using a new kinematic approach to upper-bound limit analysis. The method of stress characteristics was first used to find the failure mechanism and to compute the stress field using the Mohr-Coulomb yield criterion. Once the failure mechanism had been established, the kinematics of the plastic deformation was established, based on the requirements of the upper-bound limit theorem. Both associated and nonassociated plastic flows were considered, and the bearing capacity was obtained by equating the rate of external plastic work to the rate of internal energy dissipation for both smooth and rough base foundations. The results obtained from the analysis were compared with those available in the literature. (C) 2014 American Society of Civil Engineers.
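For reference, the two standard ingredients named above (textbook definitions, not equations reproduced from the paper): the Mohr-Coulomb yield criterion and the upper-bound work balance from which the bearing capacity follows.

```latex
% Mohr--Coulomb yield criterion (c: cohesion, \phi: friction angle,
% \sigma_n: normal stress, \tau: shear stress on the slip surface):
\tau = c + \sigma_n \tan\phi
% Upper-bound theorem: the limit load q_u on a footing of width B moving
% with velocity v_0 satisfies (external work rate = internal dissipation):
q_u \, B \, v_0 = D_{\mathrm{int}}
```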
Abstract:
The correlation clustering problem is a fundamental problem in both theory and practice, and it involves identifying clusters of objects in a data set based on their similarity. A traditional modeling of this question as a graph-theoretic problem involves associating vertices with data points and indicating similarity by adjacency. Clusters then correspond to cliques in the graph. The resulting optimization problem, Cluster Editing (and several variants), is very well studied algorithmically. In many situations, however, translating clusters to cliques can be somewhat restrictive. A more flexible notion would be that of a structure where the vertices are mutually "not too far apart", without necessarily being adjacent. One such generalization is realized by structures called s-clubs, which are graphs of diameter at most s. In this work, we study the question of finding a set of at most k edges whose removal leaves us with a graph whose components are s-clubs. Recently, it has been shown that unless the Exponential Time Hypothesis (ETH) fails, Cluster Editing (whose components are 1-clubs) does not admit a sub-exponential time algorithm [STACS 2013]. That is, there is no algorithm solving the problem in time 2^{o(k)} n^{O(1)}. However, surprisingly, they show that when the number of cliques in the output graph is restricted to d, then the problem can be solved in time O(2^{O(√(dk))} + m + n). We show that this sub-exponential time algorithm for a fixed number of cliques is the exception rather than the rule. Our first result shows that, assuming the ETH, there is no algorithm solving the s-Club Cluster Edge Deletion problem in time 2^{o(k)} n^{O(1)}. We show, further, that even the problem of deleting edges to obtain a graph with d s-clubs cannot be solved in time 2^{o(k)} n^{O(1)} for any fixed s, d >= 2. This is a radical contrast to the situation established for cliques, where sub-exponential algorithms are known.
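To make the object of study concrete, a small verifier for the target structure (an illustrative sketch, not the paper's algorithm): after deleting edges, every connected component must be an s-club, i.e. have diameter at most s.

```python
from collections import deque

def components_are_s_clubs(adj, s):
    """Check that every connected component of the graph has diameter <= s.
    adj maps each vertex to the set of its neighbours. Illustrative only."""
    def eccentricity(src):
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        return max(dist.values())
    # a component's diameter is the largest eccentricity of its vertices
    return all(eccentricity(v) <= s for v in adj)

# A path on 4 vertices has diameter 3: it is a 3-club but not a 2-club.
path = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
print(components_are_s_clubs(path, 3), components_are_s_clubs(path, 2))  # True False
```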