999 results for succession decision
Abstract:
Decision trees are powerful tools for classification in data mining tasks that involve different types of attributes. Numeric data sets are usually first converted to categorical types and then classified using information gain concepts. Information gain is a popular and useful measure that indicates whether splitting on a given attribute yields any benefit in terms of information content. However, this process is computationally intensive for large data sets, and popular decision tree algorithms such as ID3 cannot handle numeric data sets at all. This paper proposes statistical variance as an alternative to information gain, together with the statistical mean as the split point, for attributes in completely numerical data sets. The new algorithm has been shown to be competitive with its information-gain counterpart C4.5 and with many existing decision tree algorithms on the standard UCI benchmark datasets, using the ANOVA test from statistics. The specific advantages of the proposed algorithm are that it avoids the computational overhead of information gain computation for large data sets with many attributes, and that it avoids the time-consuming conversion of huge numeric data sets to categorical data. In summary, huge numeric datasets can be submitted directly to this algorithm without any attribute mappings or information gain computations. The approach also blends the two closely related fields of statistics and data mining.
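To make the proposed criterion concrete, here is a minimal Python sketch of a mean-split scored by variance reduction, in the spirit of the abstract. Reading "variance" as the variance of a numeric class column, and all function names, are assumptions for illustration, not the paper's own code.

import numpy as np

def variance_reduction(x, y):
    """Score a mean-split of numeric attribute x against a numeric class column y."""
    threshold = x.mean()                       # split at the statistical mean
    left, right = y[x <= threshold], y[x > threshold]
    if len(left) == 0 or len(right) == 0:      # degenerate split
        return 0.0, threshold
    child_var = (len(left) * left.var() + len(right) * right.var()) / len(y)
    return y.var() - child_var, threshold      # variance removed by the split

def best_split(X, y):
    """Pick the attribute whose mean-split removes the most variance."""
    scored = [variance_reduction(X[:, j], y) for j in range(X.shape[1])]
    j = int(np.argmax([gain for gain, _ in scored]))
    return j, scored[j][1]                     # (attribute index, threshold)

No threshold scan and no entropy/log computation is needed, which is exactly the saving over information gain that the abstract claims.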
Abstract:
This paper highlights the prediction of Learning Disabilities (LD) in school-age children using two classification methods, Support Vector Machine (SVM) and Decision Tree (DT), with an emphasis on applications of data mining. About 10% of children enrolled in school have a learning disability. Predicting learning disabilities in school-age children is a complicated task because a disability tends to be identified in elementary school, where there is no single sign by which it can be recognized. Using either of the two classification methods, SVM or DT, we can predict LD in a child easily and accurately. We can also determine the merits and demerits of the two classifiers, so that the better one can be selected for use in this field. In this study, the Sequential Minimal Optimization (SMO) algorithm is used to perform SVM, and the J48 algorithm is used to construct decision trees.
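The comparison described above can be sketched in Python with scikit-learn standing in for Weka: SVC trains with an SMO-type solver (libsvm), and DecisionTreeClassifier is a CART-based analogue of J48/C4.5. The LD checklist data is not public, so a synthetic set stands in here; this is a sketch of the protocol, not a reproduction of the study.

from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the LD checklist attributes and labels.
X, y = make_classification(n_samples=500, n_features=16, random_state=0)

for name, clf in [("SVM (SMO-type solver)", SVC(kernel="rbf")),
                  ("Decision tree (C4.5-like)", DecisionTreeClassifier(max_depth=5))]:
    scores = cross_val_score(clf, X, y, cv=10)   # 10-fold cross-validation
    print(f"{name}: mean accuracy = {scores.mean():.3f}")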
Abstract:
In many real-world contexts, individuals find themselves in situations where they have to decide between behaviour that serves a collective purpose and behaviour that satisfies their private interests, ignoring the collective. In some cases the underlying social dilemma (Dawes, 1980) is solved and we observe collective action (Olson, 1965). In others, social mobilisation is unsuccessful. The central topic of social dilemma research is the identification and understanding of the mechanisms which lead to the observed cooperation and thereby resolve the social dilemma. It is the purpose of this thesis to contribute to this research field for the case of public good dilemmas. To do so, existing work relevant to this problem domain is reviewed and a set of mandatory requirements is derived which guides the theory and method development of the thesis. In particular, the thesis focuses on dynamic processes of social mobilisation which can foster or inhibit collective action. The basic understanding is that the success or failure of the required process of social mobilisation is determined by the heterogeneous individual preferences of the members of a providing group, the social structure in which the acting individuals are embedded, and the embedding of the individuals in economic, political, biophysical, or other external contexts. To account for these aspects and for the dynamics involved, the methodical approach of the thesis is computer simulation, in particular agent-based modelling and simulation of social systems. Particularly conducive are agent models which ground the simulation of human behaviour in suitable psychological theories of action. The thesis develops the action theory HAPPenInGS (Heterogeneous Agents Providing Public Goods) and demonstrates its embedding into different agent-based simulations. The thesis substantiates the particular added value of this methodical approach: starting out from a theory of individual behaviour, the emergence of collective patterns of behaviour becomes observable in simulations. In addition, the underlying collective dynamics may be scrutinised and assessed by scenario analysis. The results of such experiments reveal insights into processes of social mobilisation which go beyond classical empirical approaches and, in particular, yield policy recommendations on promising intervention measures.
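The kind of mobilisation dynamics studied here can be illustrated with a deliberately tiny threshold model in Python (Granovetter-style conditional cooperation), far simpler than HAPPenInGS; all parameters are arbitrary and chosen only to show how collective action can take off or stall purely from the mix of heterogeneous preferences.

import random

random.seed(1)
N, ROUNDS = 100, 30
thresholds = [random.random() for _ in range(N)]   # heterogeneous preferences
contributing = [t < 0.05 for t in thresholds]      # a few unconditional starters

for _ in range(ROUNDS):
    share = sum(contributing) / N                  # observed mobilisation level
    contributing = [t <= share for t in thresholds]  # join once threshold is met

print(f"final share of contributors: {sum(contributing) / N:.2f}")

Depending on the draw of thresholds, the same rule yields either a cascade towards full provision or an early standstill, which is the emergence-of-collective-patterns effect the thesis exploits in its scenario analyses.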
Abstract:
Accurate data on natural conditions and agricultural systems, with good spatial resolution, are a key factor in tackling food insecurity in developing countries. A broad variety of approaches exists for obtaining precise data and information about agriculture. One system, developed especially for smallholder agriculture in East Africa, is the Farm Management Handbook of Kenya. It was first published in 1982/83 and fully revised in 2012, and now comprises 7 volumes. The handbooks contain detailed information on climate, soils, suitable crops and soil care based on the scientific research results of the last 30 years. This density of facts makes the extraction of all necessary information time-consuming. In this study we analyse the user needs and the necessary components of a decision support system for smallholder farming in Kenya based on a geographical information system (GIS). Required data sources were identified, as well as essential functions of the system. We analysed the results of our survey conducted in 2012 and early 2013 among agricultural officers. The monitoring of user needs and the problem of the non-adaptability of an agricultural information system at the level of extension officers in Kenya are the central objectives. The outcomes of the survey suggest the establishment of a decision support tool based on already available open-source GIS components. The system should include functionality to show general information for a specific location and should provide precise recommendations about suitable crops and management options to support agricultural guidance at farm level.
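The core location query such a tool needs could look like the following Python sketch built on open-source GIS components (GeoPandas/Shapely). The layer file and column names are invented for illustration; the actual system and its data model are not specified in the abstract.

import geopandas as gpd
from shapely.geometry import Point

# Hypothetical agro-ecological zone layer with stored crop recommendations.
zones = gpd.read_file("kenya_agro_ecological_zones.shp")   # assumed layer
farm = Point(37.07, -0.42)                                 # example lon, lat

hit = zones[zones.contains(farm)]                          # point-in-polygon lookup
if not hit.empty:
    print(hit.iloc[0][["zone_name", "suitable_crops"]])    # assumed columns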
Abstract:
The traditional task of a central bank is to preserve price stability and, in doing so, not to impair the real economy more than necessary. To meet this challenge, it is of great relevance whether inflation is driven only by inflation expectations and the current output gap or whether it is, in addition, influenced by past inflation. In the former case, as described by the New Keynesian Phillips curve, the central bank can immediately and simultaneously achieve price stability and equilibrium output, the so-called ‘divine coincidence’ (Blanchard and Galí 2007). In the latter case, the achievement of price stability is costly in terms of output and must be pursued over several periods. Similarly, it is important to distinguish this latter case, which describes ‘intrinsic’ inflation persistence, from that of ‘extrinsic’ inflation persistence, where the sluggishness of inflation is not a ‘structural’ feature of the economy but merely ‘inherited’ from the sluggishness of the other driving forces, inflation expectations and output. ‘Extrinsic’ inflation persistence is usually considered the less challenging case, as policy-makers are supposed to fight the persistence in the driving forces, especially by reducing the stickiness of inflation expectations through a credible monetary policy, in order to re-establish the ‘divine coincidence’. The scope of this dissertation is to contribute to the vast literature and ongoing discussion on inflation persistence. Chapter 1 describes the policy consequences of inflation persistence and summarizes the empirical and theoretical literature. Chapter 2 compares two models of staggered price setting, one with a fixed two-period duration and the other with a stochastic duration of prices. I show that in an economy with a timeless optimizing central bank, the model with two-period alternating price setting leads (for most parameter values) to more persistent inflation than the model with stochastic price duration. This result amends earlier work by Kiley (2002), who found that the model with stochastic price duration generates more persistent inflation in response to an exogenous monetary shock. Chapter 3 extends the two-period alternating price-setting model to the case of 3- and 4-period price durations. This results in a more complex Phillips curve with a negative impact of past inflation on current inflation. As simulations show, this multi-period Phillips curve generates too low a degree of autocorrelation and turning points of inflation that occur too early, and it is outperformed by a simple Hybrid Phillips curve. Chapter 4 starts from the critique by Driscoll and Holden (2003) of the relative real-wage model of Fuhrer and Moore (1995). Taking seriously the critique that Fuhrer and Moore’s model collapses to a much simpler one without intrinsic inflation persistence if one takes their arguments literally, I extend the model by a term for inequality aversion. This model extension is not only in line with experimental evidence but results in a Hybrid Phillips curve with inflation persistence that is observationally equivalent to that presented by Fuhrer and Moore (1995). In chapter 5, I present a model that makes it possible to study the relationship between fairness attitudes and time preference (impatience). In the model, two individuals take decisions in two subsequent periods. In period 1, both individuals are endowed with resources and are able to donate a share of their resources to the other individual.
In period 2, the two individuals may join in common production after having bargained over the split of its output. The size of the production output depends on the relative shares of resources at the end of period 1, as the human capital of the individuals, which is built by means of their resources, cannot be fully substituted for one another. It may therefore be rational for a well-endowed individual in period 1 to act in a seemingly ‘fair’ manner and to donate some of its own resources to its poorer counterpart. This decision also depends on the individuals’ impatience, which is induced by the small but positive probability that production will not be possible in period 2. As a general result, the individuals in the model economy are more likely to behave in a ‘fair’ manner, i.e., to donate resources to the other individual, the lower their own impatience and the higher the productivity of the other individual. As the (seemingly) ‘fair’ behavior is modelled as an endogenous outcome and is related to the aspect of time preference, the presented framework may help to further integrate behavioral economics and macroeconomics.
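For reference, the distinction at the heart of chapters 1 to 4 can be made explicit in the two standard textbook Phillips-curve forms, stated here as an illustration rather than as the dissertation's own specification:

\pi_t = \beta \, \mathbb{E}_t \pi_{t+1} + \kappa x_t \qquad \text{(purely forward-looking New Keynesian Phillips curve)}

\pi_t = \gamma_f \, \mathbb{E}_t \pi_{t+1} + \gamma_b \, \pi_{t-1} + \kappa x_t \qquad \text{(Hybrid Phillips curve)}

where \pi_t is inflation, x_t the output gap, and \gamma_b > 0 captures intrinsic persistence; the ‘divine coincidence’ logic applies only in the first case, i.e. when \gamma_b = 0.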
Abstract:
This paper describes a new statistical, model-based approach to building a contact state observer. The observer uses measurements of the contact force and position, together with prior information about the task encoded in a graph, to determine the current location of the robot in the task configuration space. Each node represents what the measurements will look like in a small region of configuration space by storing a predictive, statistical measurement model. This approach assumes that the measurements are statistically block-independent conditioned on knowledge of the model, which is a fairly good model of the actual process. Arcs in the graph represent possible transitions between models. Beam Viterbi search is used to match the measurement history against possible paths through the model graph in order to estimate the most likely path for the robot. The resulting approach provides a new decision process that can be used as an observer for event-driven manipulation programming. The decision procedure is significantly more robust than simple threshold decisions because the measurement history is used to make decisions. The approach can be used to enhance the capabilities of autonomous assembly machines and in quality control applications.
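A minimal Python sketch of beam Viterbi search over a model graph follows. The scalar Gaussian measurement models and the two-node "free"/"contact" graph are toy stand-ins for the paper's force/position models, chosen only to show the mechanics of pruned path matching.

import math

def log_lik(z, node):
    mu, sigma = node                       # Gaussian measurement model
    return -0.5 * ((z - mu) / sigma) ** 2 - math.log(sigma)

def beam_viterbi(measurements, nodes, arcs, beam=3):
    """nodes: {name: (mu, sigma)}; arcs: {name: [successor names]}."""
    paths = {n: log_lik(measurements[0], nodes[n]) for n in nodes}
    best = {n: [n] for n in nodes}
    for z in measurements[1:]:
        scored, hist = {}, {}
        for n, score in paths.items():     # extend each surviving hypothesis
            for m in arcs[n]:
                s = score + log_lik(z, nodes[m])
                if s > scored.get(m, -math.inf):
                    scored[m], hist[m] = s, best[n] + [m]
        keep = sorted(scored, key=scored.get, reverse=True)[:beam]  # beam pruning
        paths = {m: scored[m] for m in keep}
        best = {m: hist[m] for m in keep}
    top = max(paths, key=paths.get)
    return best[top], paths[top]

nodes = {"free": (0.0, 1.0), "contact": (5.0, 1.0)}
arcs = {"free": ["free", "contact"], "contact": ["contact"]}
path, score = beam_viterbi([0.1, 0.3, 4.8, 5.2], nodes, arcs)
print(path)   # ['free', 'free', 'contact', 'contact']

Because the whole measurement history scores each path, a single noisy force reading does not flip the contact-state estimate the way a per-sample threshold would.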
Abstract:
This paper sets out to identify the initial positions of the different decision makers who intervene in a group decision-making process with a reduced number of actors, and to establish possible consensus paths between these actors. As methodological support, it employs one of the most widely known multicriteria decision techniques, namely the Analytic Hierarchy Process (AHP). Assuming that the judgements elicited by the decision makers follow the so-called multiplicative model (Crawford and Williams, 1985; Altuzarra et al., 1997; Laininen and Hämäläinen, 2003) with log-normal errors and unknown variance, a Bayesian approach is used to estimate the relative priorities of the alternatives being compared. These priorities, estimated by way of the median of the posterior distribution and normalised in a distributive manner (the priorities add up to one), are a clear example of compositional data and will be used in the search for consensus between the actors involved in the resolution of the problem through the use of Multidimensional Scaling tools.
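For concreteness, the multiplicative judgement model referenced above is usually written, for a pairwise comparison r_{ij} of alternatives i and j with underlying priorities w, as

r_{ij} = \frac{w_i}{w_j}\, e_{ij}, \qquad \log e_{ij} \sim N(0, \sigma^2),

so that \log r_{ij} = \log w_i - \log w_j + \varepsilon_{ij} is linear in the log-priorities. The Bayesian step described in the abstract then summarises the posterior of the normalised priorities by its median.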
Abstract:
This paper presents a procedure that allows us to determine the preference structures (PS) associated with each of the different groups of actors that can be identified in a group decision-making problem with a large number of individuals. To that end, it makes use of the Analytic Hierarchy Process (AHP) (Saaty, 1980) as the technique for solving discrete multicriteria decision-making problems. This technique permits the resolution of multicriteria, multi-environment and multi-actor problems in which subjective aspects and uncertainty have been incorporated into the model, constructing ratio scales corresponding to the priorities of the elements being compared, normalised in a distributive manner (Σwi = 1). On the basis of the individuals’ priorities, we identify different clusters of decision makers and, for each of these, the associated preference structure, using tools analogous to those of Multidimensional Scaling. The resulting PS will be employed to extract knowledge for subsequent negotiation processes and, should it be necessary, to determine the relative importance of the alternatives being compared using any one of the existing procedures.
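The two steps the abstract chains together can be sketched in Python: (1) row geometric means of each individual's pairwise-comparison matrix give priorities that sum to one; (2) decision makers are then grouped on those priority vectors. The 2-means clustering here is a simple stand-in for the paper's own grouping procedure, and the judgement matrices are invented examples.

import numpy as np
from scipy.cluster.vq import kmeans2

def priorities(A):
    w = np.prod(A, axis=1) ** (1.0 / A.shape[0])   # row geometric means
    return w / w.sum()                             # distributive normalisation

judgements = [np.array([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]]),
              np.array([[1, 1/4, 1/2], [4, 1, 3], [2, 1/3, 1]]),
              np.array([[1, 2, 4], [1/2, 1, 3], [1/4, 1/3, 1]])]
W = np.array([priorities(A) for A in judgements])  # one row per decision maker

centroids, labels = kmeans2(W, 2, minit="++", seed=0)
print(labels)   # cluster membership of each decision maker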
Abstract:
A new method for the automated selection of colour features is described. The algorithm consists of two stages of processing. In the first, a complete set of colour features is calculated for every object of interest in an image. In the second stage, each object is mapped into several n-dimensional feature spaces in order to select the feature set with the smallest number of variables able to discriminate among the remaining objects. The discrimination power of each concrete subset of features is evaluated by means of decision trees composed of linear discriminant functions. This method can provide valuable help in outdoor scene analysis, where no colour space has been demonstrated to be the most suitable. Experimental results on recognizing objects in outdoor scenes are reported.
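The second stage can be sketched in Python as a smallest-first subset search. A single linear discriminant classifier stands in here for the paper's decision trees of linear discriminant functions, and the accuracy threshold is an arbitrary illustration, not the paper's criterion.

from itertools import combinations
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

def smallest_discriminating_subset(X, y, threshold=0.95):
    """Return the first (hence smallest) feature subset that separates the objects."""
    n_features = X.shape[1]
    for size in range(1, n_features + 1):          # try smallest subsets first
        for subset in combinations(range(n_features), size):
            acc = cross_val_score(LinearDiscriminantAnalysis(),
                                  X[:, list(subset)], y, cv=5).mean()
            if acc >= threshold:
                return subset, acc
    return tuple(range(n_features)), None          # no subset met the threshold

Exhaustive enumeration is only feasible for the modest feature counts implied by colour descriptors; a greedy forward search would be the natural fallback for larger sets.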
Abstract:
Abstract taken from the publication
Abstract:
Abstract taken from the publication
Abstract:
Exercises and solutions in LaTeX
Abstract:
Exercises and solutions in PDF
Abstract:
KFC, the fast-food restaurant chain in the UK, planned to launch coffee products through campaigns. There are two main reasons for this decision. The first is that KFC had tried to promote coffee products with its KFC A.M. breakfast plan, which ultimately failed. The second is that KFC needs new points of interest to extend its business: its financial condition has been steady, but with no breakthrough growth. It has been shown that there is enormous potential in the “fast-drink” market in the UK, and after the success of the KFC “Krushems” series it is reasonable for the company to launch coffee products. However, KFC also faces many challenges in winning this market. Compared with its main competitor, McDonald’s, KFC has far fewer restaurants. Moreover, KFC has a brand limitation in that it focuses more on families than on single urban customers. The dominant competitors are another challenge KFC needs to manage. To sum up, KFC has to overcome these challenges to become a bigger player in the UK coffee market.