855 results for Fuzzy rules
Abstract:
Professor Carol Nicolls suggests that academics and postgraduate students in education should be consciously positioned to influence current and future policy. Through their research they could contribute to the foundations of the evidence base for good policy in education, and through their research and scholarship be seen, where necessary, to publicly challenge the status quo in policy, both now and in the future. All of this makes it more likely that we achieve the very best outcomes for all in education in Australia.
Abstract:
Based on the conclusions drawn in the bijective transformation between possibility and probability, a method is proposed to estimate the fuzzy membership function for pattern recognition purposes. A rational function approximation to the probability density function is obtained from the histogram of a finite (and sometimes very small) number of samples. This function is normalized such that the highest ordinate is one. The parameters representing the rational function are used for classifying the pattern samples based on a max-min decision rule. The method is illustrated with examples.
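The idea can be sketched in Python. This is a minimal illustration with hypothetical names, using a plain histogram in place of the paper's rational-function fit: the estimated density is rescaled so its highest ordinate is one, and a sample is assigned to the class with the largest membership grade (the "max" side of a max-min decision rule).

```python
def membership_from_samples(samples, bins=8):
    """Estimate a fuzzy membership function from a (possibly small)
    sample set: build a histogram and rescale the counts so that the
    tallest bin has ordinate 1 (the possibility normalization)."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for x in samples:
        i = min(int((x - lo) / width), bins - 1)
        counts[i] += 1
    peak = max(counts)
    grades = [c / peak for c in counts]  # highest ordinate becomes 1

    def mu(x):
        if x < lo or x > hi:
            return 0.0
        i = min(int((x - lo) / width), bins - 1)
        return grades[i]

    return mu

def classify(x, class_mus):
    """Pick the class whose membership grade at x is largest."""
    return max(class_mus, key=lambda c: class_mus[c](x))
```

A usage sketch: estimate one membership function per class from its training samples, then classify a new observation by comparing the grades.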
Abstract:
This study highlights the formation of an artifact designed to mediate exploratory collaboration. The data for this study were collected during a Finnish adaptation of the thinking together approach. The aim of the approach is to teach pupils how to engage in an educationally beneficial form of joint discussion, namely exploratory talk. At the heart of the approach lies a set of conversational ground rules aimed at promoting the use of exploratory talk. The theoretical framework of the study is based on a sociocultural perspective on learning. A central argument in the framework is that physical and psychological tools play a crucial role in human action and learning. With the help of tools, humans can escape the direct stimulus of the outside world and learn to control themselves. During the implementation of the approach, the classroom community negotiates a set of six rules, which this study conceptualizes as an artifact that mediates exploratory collaboration. Prior research on the thinking together approach has not extensively examined the formation of the rules, which gives ample reason to conduct this study. The specific research questions were: What kind of negotiation trajectories did the ground rules form during the intervention? What meanings were negotiated for the ground rules during the intervention? The methodological framework of the study is based on discourse analysis, which has been specified by adapting the social construction of intertextuality to analyze the meanings negotiated for the created rules. The study has two units of analysis: the thematic episode and the negotiation trajectory. A thematic episode is a stretch of talk-in-interaction in which the participants talk about a certain ground rule or a theme relating to it. A negotiation trajectory is a chronological representation of the negotiation process of a certain ground rule during the intervention and is constructed of thematic episodes.
Thematic episodes were analyzed with the adapted intertextuality analysis, and a contrastive analysis was done on the trajectories. Lastly, the meanings negotiated for the created rules were compared to the guidelines provided by the approach. The main result of the study is the observation that the meanings of the created rules were more aligned with the ground rules of cumulative talk than with those of exploratory talk. Although meanings relating to exploratory talk were also negotiated, they were clearly not the dominant form. In addition, the study observed that the trajectories of the rules were non-identical: despite connecting dimensions (symmetry, composition, continuity and explicitness), none of the trajectories shared exactly the same features as the others.
Abstract:
In this volume, the recommended rules for nomenclature and gene symbolization in barley are reprinted. The current lists of new and revised barley genetic stock descriptions are presented by BGS number order and by locus symbol in alphabetical order.
Abstract:
The magnetic moment μ_B of a baryon B with quark content (aab) is written as μ_B = 4e_a(1 + δ_B)eħ/(2cM_B), where e_a is the charge of the quark of flavor type a. The experimental values of δ_B have a simple pattern and a natural explanation within QCD. Using the ratio method, the QCD sum rules are analyzed and the values of δ_B are computed. We find good agreement with data (≈10%) for the nucleons and the Σ multiplet, while for the cascade the agreement is not as good. In our analysis we have incorporated additional terms in the operator-product expansion as compared to previous authors. We also clarify some points of disagreement between the previous authors. External-field-induced correlations describing the magnetic properties of the vacuum are estimated from the baryon magnetic-moment sum rules themselves as well as by independent spectral representations, and the results are contrasted.
Abstract:
The study focused on the different ways that forest-related rights can be devolved to the local level according to the current legal frameworks in Laos, Nepal, Vietnam, Kenya, Mozambique and Tanzania. The eleven case studies represented the main ways in which forest-related rights can be devolved to communities or households in these countries. The objectives of this study were to 1) analyse the contents and extent of forest-related rights that can be devolved to the local level, 2) develop an empirical typology that represents the main types of devolution, and 3) compare the cases against a theoretical ideal type to assess in what way and to what extent the cases are similar to or differ from the theoretical construct. Fuzzy set theory, Qualitative Comparative Analysis and ideal type analysis were used in analysing the case studies and in developing an empirical typology. The theoretical framework, which guided data collection and analyses, was based on institutional economics and theories on property rights, common pool resources and collective action. On the basis of the theoretical and empirical knowledge, the most important attributes of rights were defined as use rights, management rights, exclusion rights, transfer rights and the duration and security of the rights. The ideal type was defined as one where local actors have been devolved comprehensive use rights, extensive management rights, rights to exclude others from the resource and rights to transfer these rights. In addition, the rights are to be secure and held perpetually. The ideal type was used to structure the analysis and as a tool against which the cases were analysed. The contents, extent and duration of the devolved rights varied greatly. In general, the results show that devolution has mainly meant the transfer of use rights to the local level, and has not really changed the overall state control over forest resources. 
In most cases the right holders participate in, or have only a limited role in, the decision making regarding the harvesting and management of the resource. There was a clear tendency to devolve the rights to enforce rules and to monitor resource use and condition more extensively than the powers to decide on the management and development of the resource. The empirical typology of the cases differentiated between five types of devolution, characterised by the devolution of 1) restricted use and control rights, 2) extensive use rights but restricted control rights, 3) extensive rights, 4) insecure, short-term use and restricted control rights, and 5) insecure extensive rights. Overall, the case studies' conformity to the ideal type was very low: only two cases were similar to the ideal type, and all other cases differed considerably from it. Restricted management rights were the most common reason for low conformity to the ideal type (eight cases). In three cases, the short term of the rights, restricted transfer rights, restricted use rights or restricted exclusion rights were the reason, or one of the reasons, for the low conformity to the ideal type. In two cases the rights were not secure.
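The ideal-type comparison can be illustrated with a small fuzzy-set sketch. The attribute names follow the rights bundle described above; the membership scores and the 0.5 crossover threshold are invented for illustration, not taken from the study.

```python
# Rights attributes of the ideal type (fully devolved = membership 1.0)
IDEAL = ("use", "management", "exclusion", "transfer", "security")

def conformity(case):
    """Fuzzy-set conformity to the ideal type: the intersection (min)
    of the case's memberships across all five rights attributes."""
    return min(case[a] for a in IDEAL)

def in_ideal_type(case, threshold=0.5):
    """A case belongs to the ideal type if its conformity exceeds
    the crossover threshold."""
    return conformity(case) > threshold

# Hypothetical cases: one with extensive rights, one with restricted
# management and transfer rights (the most common pattern reported).
case_extensive = {"use": 0.9, "management": 0.8, "exclusion": 0.7,
                  "transfer": 0.6, "security": 0.8}
case_restricted = {"use": 0.8, "management": 0.2, "exclusion": 0.6,
                   "transfer": 0.1, "security": 0.7}
```

The min operator mirrors the finding that a single restricted attribute (e.g. management rights) is enough to pull a case away from the ideal type.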
Abstract:
Uncertainties associated with the structural model and measured vibration data may lead to unreliable damage detection. In this paper, we show that geometric and measurement uncertainty cause considerable problems in damage assessment, which can be alleviated by using a fuzzy logic-based approach for damage detection. Curvature damage factors (CDFs) of a tapered cantilever beam are used as damage indicators. Monte Carlo simulation (MCS) is used to study the changes in the damage indicator due to uncertainty in the geometric properties of the beam. Variations in these CDF measures due to randomness in the structural parameters, further contaminated with measurement noise, are used for developing and testing a fuzzy logic system (FLS). Results show that the method correctly identifies both single and multiple damages in the structure. For example, the FLS detects damage with an average accuracy of about 95 percent in a beam having geometric uncertainty of 1 percent COV and measurement noise of 10 percent in the single-damage scenario. For the multiple-damage case, the FLS identifies damages in the beam with an average accuracy of about 94 percent in the presence of the above-mentioned uncertainties. The paper brings together the disparate areas of probabilistic analysis and fuzzy logic to address uncertainty in structural damage detection.
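The fuzzy classification step, with measurement noise injected by Monte Carlo simulation, can be sketched as follows. The triangular membership functions, damage labels and noise model are illustrative assumptions, not the paper's calibrated FLS.

```python
import random

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical fuzzy sets over a normalized curvature-damage-factor value
DAMAGE_SETS = {
    "undamaged": lambda x: tri(x, -0.2, 0.0, 0.2),
    "moderate":  lambda x: tri(x, 0.1, 0.4, 0.7),
    "severe":    lambda x: tri(x, 0.5, 1.0, 1.5),
}

def classify_cdf(x):
    """Assign the damage class with the largest membership grade."""
    return max(DAMAGE_SETS, key=lambda k: DAMAGE_SETS[k](x))

def monte_carlo_accuracy(true_cdf, true_label, noise_sd, trials=2000, seed=1):
    """Fraction of noisy measurements still mapped to the true class."""
    rng = random.Random(seed)
    hits = sum(classify_cdf(true_cdf + rng.gauss(0, noise_sd)) == true_label
               for _ in range(trials))
    return hits / trials
```

Sweeping `noise_sd` in `monte_carlo_accuracy` reproduces the kind of accuracy-versus-uncertainty study the abstract describes, on a toy scale.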
Abstract:
It is shown that the effect of adsorption of inert molecules on electrode reaction rates is completely accounted for by introducing into the rate equation adsorption-induced changes in both the effective electrode area and the electrostatic potential at the reaction site, together with an additional term for the noncoulombic interaction between the reactant and the adsorbate. The electrostatic potential at the reaction site due to the adsorbed layer is calculated using a model of discretely distributed molecules adsorbed on the electrode in parallel orientation, with an allowance for thermal agitation. The resulting expression, which is valid in the limiting case of low coverages, is used to predict the types of molecular surfactants that are most likely to be useful for accelerating and inhibiting electrode reactions.
Abstract:
Analyzing statistical dependencies is a fundamental problem in all empirical science. Dependencies help us understand causes and effects, create new scientific theories, and invent cures to problems. Nowadays, large amounts of data are available, but efficient computational tools for analyzing the data are missing. In this research, we develop efficient algorithms for a commonly occurring search problem - searching for the statistically most significant dependency rules in binary data. We consider dependency rules of the form X->A or X->not A, where X is a set of positive-valued attributes and A is a single attribute. Such rules describe which factors either increase or decrease the probability of the consequent A. Classical examples are genetic and environmental factors, which can either cause or prevent a disease. The emphasis in this research is that the discovered dependencies should be genuine - i.e. they should also hold in future data. This is an important distinction from traditional association rules, which - in spite of their name and a similar appearance to dependency rules - do not necessarily represent statistical dependencies at all, or represent only spurious connections that occur by chance. Therefore, the principal objective is to search for rules with statistical significance measures. Another important objective is to search only for non-redundant rules, which express the real causes of the dependence without any occasional extra factors. The extra factors do not add any new information about the dependence; they can only blur it and make it less accurate in future data. The problem is computationally very demanding, because the number of all possible rules increases exponentially with the number of attributes. In addition, neither statistical dependency nor statistical significance is a monotonic property, which means that the traditional pruning techniques do not work.
As a solution, we first derive the mathematical basis for pruning the search space with any well-behaving statistical significance measure. The mathematical theory is complemented by a new algorithmic invention, which enables an efficient search without any heuristic restrictions. The resulting algorithm can be used to search for both positive and negative dependencies with any commonly used statistical measure, such as Fisher's exact test, the chi-squared measure, mutual information, and z scores. According to our experiments, the algorithm scales well, especially with Fisher's exact test. It can easily handle even the densest data sets with 10,000-20,000 attributes. Still, the results are globally optimal, which is a remarkable improvement over the existing solutions. In practice, this means that the user does not have to worry whether the dependencies hold in future data or whether the data still contains better, but undiscovered, dependencies.
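For illustration, the significance of a single candidate rule X->A can be scored with a one-sided Fisher's exact test computed directly from the 2x2 contingency counts. This is a stdlib sketch of the test itself, not the thesis's pruning algorithm; the variable names are assumptions.

```python
from math import comb

def fisher_one_sided(n, n_x, n_a, n_xa):
    """One-sided Fisher's exact test p-value for the positive
    dependency X -> A: the probability of observing at least n_xa
    rows containing both X and A, given n rows of which n_x contain
    X and n_a contain A (hypergeometric upper tail)."""
    total = comb(n, n_a)
    p = 0.0
    for k in range(n_xa, min(n_x, n_a) + 1):
        # ways to place k of the A-rows under X and the rest outside X
        p += comb(n_x, k) * comb(n - n_x, n_a - k) / total
    return p
```

A small p-value means the co-occurrence of X and A is unlikely under independence; a search over rules would rank candidates by this measure while pruning with bounds on the best achievable p-value.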
Abstract:
Uncertainty plays an important role in water quality management problems. The major sources of uncertainty in a water quality management problem are the random nature of hydrologic variables and the imprecision (fuzziness) associated with the goals of the dischargers and pollution control agencies (PCA). Many Waste Load Allocation (WLA) problems are solved by considering these two sources of uncertainty. Apart from randomness and fuzziness, missing data in the time series of a hydrologic variable may result in additional uncertainty due to partial ignorance. These uncertainties render the input parameters imprecise in water quality decision making. In this paper, an Imprecise Fuzzy Waste Load Allocation Model (IFWLAM) is developed for water quality management of a river system subject to uncertainty arising from partial ignorance. In a WLA problem, both randomness and imprecision can be addressed simultaneously by the fuzzy risk of low water quality. A methodology is developed for the computation of imprecise fuzzy risk of low water quality when the parameters are characterized by uncertainty due to partial ignorance. A Monte Carlo simulation is performed to evaluate the imprecise fuzzy risk of low water quality by considering the input variables as imprecise. Fuzzy multiobjective optimization is used to formulate the multiobjective model. The model is based on a fuzzy multiobjective optimization problem with max-min as the operator, which usually does not yield a unique solution but gives multiple solutions. Two optimization models are developed to capture all the decision alternatives or multiple solutions. The objective of the two optimization models is to obtain a range of fractional removal levels for the dischargers such that the resultant fuzzy risk will be within acceptable limits. Specifying a range for the fractional removal levels enhances flexibility in decision making.
The methodology is demonstrated with a case study of the Tunga-Bhadra river system in India.
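The Monte Carlo evaluation of fuzzy risk can be sketched as follows. Fuzzy risk is taken here as the expected membership of the fuzzy event "low water quality"; the dissolved-oxygen fuzzy set, its breakpoints and the normal DO model are illustrative assumptions, not the model's calibration.

```python
import random

def mu_low_quality(do_mg_l, desirable=6.0, minimum=3.0):
    """Membership of 'low water quality' for a dissolved-oxygen (DO)
    level: 1 below the minimum permissible level, 0 above the
    desirable level, linear in between (illustrative fuzzy set)."""
    if do_mg_l <= minimum:
        return 1.0
    if do_mg_l >= desirable:
        return 0.0
    return (desirable - do_mg_l) / (desirable - minimum)

def fuzzy_risk(mean_do, sd_do, trials=20000, seed=7):
    """Monte Carlo estimate of fuzzy risk = E[membership of the fuzzy
    event 'low water quality'] under a normal DO model (a stand-in
    for the hydrologic simulation)."""
    rng = random.Random(seed)
    return sum(mu_low_quality(rng.gauss(mean_do, sd_do))
               for _ in range(trials)) / trials

def imprecise_fuzzy_risk(mean_lo, mean_hi, sd_do):
    """Partial ignorance: the DO mean is known only as an interval,
    so the risk becomes a [lower, upper] bound pair."""
    return (fuzzy_risk(mean_hi, sd_do), fuzzy_risk(mean_lo, sd_do))
```

The interval output mirrors the paper's idea that under partial ignorance the fuzzy risk itself is imprecise, and decisions must keep the whole range within acceptable limits.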
Abstract:
Economic and Monetary Union can be characterised as a complicated set of legislation and institutions governing monetary and fiscal responsibilities. The measures of fiscal responsibility are to be guided by the Stability and Growth Pact, which sets rules for fiscal policy and makes a discretionary fiscal policy virtually impossible. To analyse the effects of the fiscal and monetary policy mix, we modified the New Keynesian framework to allow for supply effects of fiscal policy. We show that defining a supply-side channel for fiscal policy using an endogenous output gap changes the stabilising properties of monetary policy rules. The stability conditions are affected by fiscal policy, so that the dichotomy between active (passive) monetary policy and passive (active) fiscal policy as stabilising regimes does not hold, and it is possible to have an active monetary - active fiscal policy regime consistent with dynamic stability of the economy. We show that, if we take supply-side effects into account, we get more persistent inflation and output reactions. We also show that the dichotomy does not hold for a variety of different fiscal policy rules based on government debt and the budget deficit, using the tax smoothing hypothesis and formulating the tax rules as difference equations. The debt rule with active monetary policy results in indeterminacy, while the deficit rule produces a determinate solution with active monetary policy, even with active fiscal policy. Combining the fiscal requirements in a single rule results in cyclical responses to shocks; the amplitude of the cycle is larger with more weight on debt than on deficit. Combining optimised monetary policy with fiscal policy rules means that, under a discretionary monetary policy, the fiscal policy regime affects the size of the inflation bias. We also show that commitment to an optimal monetary policy not only corrects the inflation bias but also increases the persistence of output reactions.
With fiscal policy rules based on the deficit, we can retain the tax smoothing hypothesis even in a sticky-price model.
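The debt- and deficit-based tax rules formulated as difference equations might take forms like the following. The notation is assumed for illustration and is not reproduced from the study: tau_t is the tax rate, b_{t-1} lagged government debt, d_t the budget deficit, and phi_b, phi_d feedback coefficients.

```latex
% Tax smoothing makes the tax rate follow a difference equation;
% the fiscal rules add feedback from debt or from the deficit:
\tau_t = \tau_{t-1} + \phi_b \, b_{t-1}                    % debt rule
\tau_t = \tau_{t-1} + \phi_d \, d_t                        % deficit rule
\tau_t = \tau_{t-1} + \phi_b \, b_{t-1} + \phi_d \, d_t    % combined rule
```

In this reading, the combined rule's two feedback terms are what generate the cyclical responses, with a larger cycle amplitude as phi_b grows relative to phi_d.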
Abstract:
The plasma membrane adopts a myriad of different shapes to carry out essential cellular processes such as nutrient uptake, immunological defence mechanisms and cell migration. Therefore, the details of how different plasma membrane structures are made and remodelled are of the utmost importance. Bending the plasma membrane into different shapes requires a substantial amount of force, which can be provided by the actin cytoskeleton; however, the molecules that regulate the interplay between the actin cytoskeleton and the plasma membrane have remained elusive. Recent findings have placed new types of effectors at sites of plasma membrane remodelling, including BAR proteins, which can directly bind and deform the plasma membrane into different shapes. In addition to their membrane-bending abilities, BAR proteins also harbor protein domains that intimately link them to the actin cytoskeleton. The ancient BAR domain fold has evolved into at least three structurally and functionally different sub-groups: the BAR, F-BAR and I-BAR domains. This thesis work describes the discovery and functional characterization of the Inverse-BAR (I-BAR) domains. Using synthetic model membranes, we have shown that I-BAR domains bind and deform membranes into tubular structures through a binding surface composed of positively charged amino acids. Importantly, the membrane-binding surface of I-BAR domains displays an inverse geometry to that of the BAR and F-BAR domains, and these structural differences explain why I-BAR domains induce cell protrusions whereas BAR and most F-BAR domains induce cell invaginations. In addition, our results indicate that the binding of I-BAR domains to membranes can alter the spatial organization of phosphoinositides within membranes. Intriguingly, we also found that some I-BAR domains can insert helical motifs into the membrane bilayer, which has important consequences for their membrane binding/bending functions. In mammals there are five I-BAR domain containing proteins.
Cell biological studies on ABBA revealed that it is highly expressed in radial glial cells during the development of the central nervous system and plays an important role in the extension process of radial glia-like C6R cells by regulating lamellipodial dynamics through its I-BAR domain. To reveal the role of these proteins in the context of animals, we analyzed MIM knockout mice and found that MIM is required for proper renal function in adult mice. MIM-deficient mice displayed a severe urine concentration defect due to defective intercellular junctions of the kidney epithelia. Consistently, MIM localized to adherens junctions in cultured kidney epithelial cells, where it promoted actin assembly through its I-BAR and WH2 domains. In summary, this thesis describes the mechanism by which I-BAR proteins deform membranes and provides information about the biological role of these proteins, which to our knowledge are the first proteins shown to directly deform the plasma membrane to make cell protrusions.
Abstract:
Hockey’s budget announcement of two major tax integrity measures was flagged before the budget was handed down, but even that came as no surprise. Integrity, or the lack thereof, in our tax system is a hot topic and an easy target for a Treasurer looking to sell a federal budget. The first of the proposed changes is to our GST regime. No-one likes hearing that they will be paying more tax, but the charging of GST on supplies of digital products and services in Australia by an off-shore supplier will at least make sense to the general public. With the inherent unfairness in the current system and a revenue-raising prediction of A$350 million over the next four years, most are likely to accept the logic of such a measure. The second of the proposed changes is a set of new laws to be included in Australia’s general anti-avoidance provision. The new laws, which will apply from 1 January 2016, are aimed at multinational companies engaged in aggressive tax practices. The proposed anti-avoidance law is designed to stop multinationals that artificially avoid a taxable presence in Australia. It is difficult to see how this strategy of addressing specific behaviour through what is considered a general provision will work, and it is these changes that are already causing confusion.
Abstract:
The Turnbull Government announced yet another measure aimed at addressing tax base erosion and profit shifting, placing additional requirements on new foreign investment under the existing national interest test. In the last 12 months Australia has seen various reforms within the tax system. However, this latest initiative is a shift as it links Australia’s tax regime with its foreign investment regime. It sends a broader signal to the market that Australia will look beyond the collection of tax revenues to a consideration of national interest.
Assessing taxpayer response to legislative changes: A case study of ‘in-house’ fringe benefits rules
Abstract:
On 22 October 2012, the Australian Federal Government announced the removal of the $1,000 in-house fringe benefits concession when used as part of a salary packaging arrangement. At the time of the announcement, the Federal Government predicted that the removal of the concession would contribute additional tax revenue of $445 million over the following four years as well as an increase of GST payments to the States and Territories. However, anecdotal evidence at the same time indicated that the Australian employer response was to immediately stop providing employees with such in-house fringe benefits via salary sacrificing arrangements. Data presented in this article, collected from a combination of interviews with tax managers of four Australian entities as well as a review of the published archival data, confirms that the abolition of the $1,000 in-house fringe benefits concession was perceived as a negative change, whereby employees were considered the ‘big losers’ despite assertions by the Federal Government to the contrary. Using a conceptual map of tax rule change developed by Oats and Sadler, this article seeks to understand the reasons for this fringe benefits tax change and taxpayer response. In particular, the economic and political factors, and the responses of the relevant taxpayers (employers) are explored. Drawing on behavioural economic concepts, the actions, attitudes and response of employers to the rule change are also examined. The research findings suggest that the decision by Australian employers to cease providing the in-house fringe benefits as part of a salary-packaging arrangement after the legislative amendment was impacted by more than simple rational behaviour.