995 results for Statistical decision


Relevance: 20.00%

Abstract:

We present the theoretical foundations for the multiple rendezvous problem, which involves the design of local control strategies that enable groups of visibility-limited mobile agents to split into subgroups, exhibit simultaneous taxis behavior towards, and eventually rendezvous at, multiple unknown locations of interest. The theoretical results are proved under a restricted set of assumptions. The algorithm used to solve the problem is based on a glowworm swarm optimization (GSO) technique, developed earlier, that finds multiple optima of multimodal objective functions. The significant difference between our work and most earlier approaches to agreement problems is the agents' use of a virtual local-decision domain to compute their movements. The range of the virtual domain is adaptive in nature and is bounded above by the maximum sensor/visibility range of the agent. We introduce a new decision-domain update rule that enhances the rate of convergence by a factor of approximately two. Illustrative simulations support the algorithmic correctness and theoretical findings of the paper.
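
The adaptive decision-domain update at the heart of this approach can be sketched as follows. This is a minimal illustration of the standard GSO update rule; the parameter names (`beta` for the gain, `n_t` for the desired neighbour count) and their values are assumptions, and the enhanced update rule introduced in the paper is not reproduced here.

```python
def update_decision_domain(r_d, r_s, num_neighbors, n_t=5, beta=0.08):
    """Adaptive local-decision-domain update in the standard GSO form.

    The domain radius r_d grows when the agent currently sees fewer than
    n_t higher-luciferin neighbors and shrinks when it sees more, but it
    is always bounded above by the sensor/visibility range r_s.
    Parameter values here are illustrative, not taken from the paper.
    """
    return min(r_s, max(0.0, r_d + beta * (n_t - num_neighbors)))
```

Each agent would apply this update at every iteration before choosing a neighbour to move toward, so the neighbourhood it reasons over never exceeds what its sensors can actually see.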

Relevance: 20.00%

Abstract:

The unique characteristics of marketspace, in combination with the fast-growing number of consumers interested in e-commerce, have created new research areas of interest to both marketing and consumer behaviour researchers. Consumer behaviour researchers interested in the decision-making processes of consumers have two new sets of questions to answer. The first set of questions concerns how useful theories developed for the marketplace are in a marketspace context. Cyber auctions, Internet communities, and the possibilities for consumers to establish dialogues not only with companies but also with other consumers make marketspace unique. The effects of these distinctive characteristics on the behaviour of consumers have not been systematically analysed and therefore constitute the second set of questions that have to be studied. Most companies feel that they have to be online even though the effects of being on the Net are not unambiguously positive. The relevance of the relationship marketing paradigm in a marketspace context has to be studied. The relationship-enhancement effects of websites from the customers' point of view are therefore emphasized in this research paper. Representatives of the Net generation were analysed, and the results show that companies should develop marketspace strategies, since Net presence has a value-adding effect on consumers. The results indicate that the decision-making processes of consumers are also changing as a result of the progress of marketspace.

Relevance: 20.00%

Abstract:

The study scrutinizes the dynamics of the Finnish higher education political system. Dynamics is understood as the regularity of interaction between actors, and by actors is meant the central institutions in the system. The theoretical framework of the study draws on earlier research in political science and higher education political studies. The theoretical model for analysis is built on agenda-setting theories. The model separates two dimensions of dynamics, namely the political situation and the political possibilities. A political situation can be either favourable or contradictory to change. If the institutional framework within the higher education system is not compatible with the external factors of the system, the political situation is contradictory to change. To change the situation into a favourable one, one needs either to change the institutional structure or to wait for the external factors to change. The political possibilities, in turn, can be either settled or politicized. Politicization means that new possibilities for action are found; settled possibilities refer to routine actions performed according to old practices. The research tasks based on the theoretical model are: 1. to empirically analyse the political situation and the possibilities from the actors' point of view; 2. to theoretically construct and empirically test a model for the analysis of dynamics in Finnish higher education politics. The research material consists of 25 thematic interviews conducted in 2008 with key persons in the higher education political system, together with documents from different actors since the 1980s and statistical data. The material is analysed in four phases. In the first phase the emphasis is on understanding the interviewees' and actors' points of view. In the second phase the different types of research material are related to each other. In the third phase the findings are related to the theoretical model, which is constructed over the course of the analysis. In the fourth phase the interpretation is tested. The research distinguishes three historical periods in the Finnish higher education system and focuses on the last one, the era of the complex system beginning in the 1980s and 1990s. Based on the interviews, four policy threads are identified and analysed in their historical context. Each of the policy threads represents one of the four possible dynamics identified in the theoretical model. The research policy thread functions according to a reform dynamics: a coalition of innovation politics is able to use the politicized possibilities owing to the political situation created by the conception of the national innovation system. The regional policy thread is caught in a gridlock dynamics: the combination of a political system based on provincial representation, a regional higher education institutional framework, and outside pressure to streamline the higher education structure created a contradictory political situation, because of which the politicized possibilities in the so-called "regional development plan" have little effect. In the international policy thread, a consensual change dynamics is found: through changes in the institutional framework, the higher education political system is moulded into a favourable situation, but the possibilities are settled, and a pragmatic national gaze prevails. A dynamics of friction is found in the governance policy thread: a political situation in which political-strategic and budgetary decision-making are separated is not favourable to change, and as governance policy functions according to settled possibilities, the situation seems unchangeable. There are five central findings. First, the dynamics differ depending on the policy thread under scrutiny. Second, the settled possibilities in a policy thread seemed to influence the other threads the most. Third, the dynamics are closely related to changes external to the higher education political system, to the changing positions of the actors in the different policy threads, and to the unexpected nature of the dynamics. Fourth, it is fruitful to analyse the dynamics with the theoretical model. Fifth, though only hypothetically and thus left for further research, it seems that Finnish higher education politics is reactive and weak at politicization.

Relevance: 20.00%

Abstract:

Design of speaker identification schemes for a small number of speakers (around 10) with a high degree of accuracy in a controlled environment is a practical proposition today. When the number of speakers is large (say 50–100), many of these schemes cannot be directly extended, as both recognition error and computation time increase monotonically with population size. The feature selection problem is also complex for such schemes. Though there were earlier attempts to rank-order features based on statistical distance measures, it has been observed only recently that the two best independent measurements are not necessarily the best combination of two for pattern classification. We propose here a systematic approach to the problem using a decision tree, or hierarchical classifier, with the following objectives: (1) design of the optimal decision policy at each node of the tree given the tree structure, i.e., the tree skeleton and the features to be used at each node; (2) determination of the optimal feature measurements and decision policy given only the tree skeleton. The applicability of optimization procedures such as dynamic programming to the design of such trees is studied. The experimental results deal with the design of a 50-speaker identification scheme based on this approach.
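
The kind of hierarchical classifier described, in which each node applies its own feature subset and decision policy, might be sketched as below. The nearest-class-mean rule and the data layout are illustrative assumptions only; in the paper the per-node policies are obtained by optimization (e.g. dynamic programming), not fixed a priori.

```python
import numpy as np

class Node:
    """One node of a hierarchical (tree) classifier.

    Each internal node owns its own feature subset and routes a sample
    down one branch; leaves hold a single speaker identity. All names
    and the nearest-mean rule are illustrative, not the paper's design.
    """
    def __init__(self, features, class_means, children=None, label=None):
        self.features = features        # feature indices used at this node
        self.class_means = class_means  # {branch: mean vector over features}
        self.children = children or {}  # branch -> child Node
        self.label = label              # set at leaves only

    def classify(self, x):
        if self.label is not None:
            return self.label
        # Nearest-mean decision restricted to this node's feature subset
        sub = x[self.features]
        branch = min(self.class_means,
                     key=lambda b: np.linalg.norm(sub - self.class_means[b]))
        return self.children[branch].classify(x)
```

Because each node only measures its own feature subset, the cost of a classification is the cost along one root-to-leaf path rather than over the full feature set, which is what makes the tree attractive for large speaker populations.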

Relevance: 20.00%

Abstract:

Artificial neural networks (ANNs) have shown great promise in modeling circuit parameters for computer-aided design applications. Leakage currents, which depend on process parameters, supply voltage, and temperature, can be modeled accurately with ANNs. However, the complex nature of the ANN model, with the standard sigmoidal activation functions, does not allow analytical expressions for its mean and variance. We propose the use of a new activation function that allows us to derive an analytical expression for the mean and a semi-analytical expression for the variance of the ANN-based leakage model. To the best of our knowledge, this is the first result in this direction. Our neural network model also includes voltage and temperature as input parameters, thereby enabling voltage- and temperature-aware statistical leakage analysis (SLA). All existing SLA frameworks are closely tied to the exponential polynomial leakage model and hence fail to work with sophisticated ANN models. In this paper, we also set up an SLA framework that can work efficiently with these ANN models. Results show that the cumulative distribution function of the leakage current of ISCAS'85 circuits can be predicted accurately, with the errors in mean and standard deviation, compared to Monte Carlo-based simulations, being less than 1% and 2% respectively across a range of voltage and temperature values.
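
To see why the choice of activation matters, consider a toy version of the problem: when the process parameters are Gaussian and the first layer is linear, the pre-activation z is Gaussian, and an exponential activation admits a closed-form mean via the lognormal identity E[exp(z)] = exp(mu + sigma^2/2), whereas a sigmoid has no such closed form. The weights and variation magnitudes below are made up for illustration; this is not the specific activation function proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Gaussian pre-activation z = w.x + b with Gaussian process-parameter inputs x
w = np.array([0.3, -0.2, 0.5])            # illustrative first-layer weights
b = 0.1
mu_x = np.zeros(3)
sigma_x = np.array([0.05, 0.08, 0.04])    # illustrative parameter variation

# The pre-activation is Gaussian: mean and variance propagate linearly
mu_z = w @ mu_x + b
var_z = np.sum((w * sigma_x) ** 2)

# For an exp() activation the output mean is available in closed form:
# E[exp(z)] = exp(mu_z + var_z / 2)  (lognormal mean identity)
analytic_mean = np.exp(mu_z + var_z / 2)

# Monte Carlo check of the closed-form mean
x = rng.normal(mu_x, sigma_x, size=(200_000, 3))
mc_mean = np.exp(x @ w + b).mean()
```

The same propagation logic is what an analytical SLA framework exploits: once each neuron's output moments are available in closed form, no Monte Carlo sampling is needed to get the leakage statistics.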

Relevance: 20.00%

Abstract:

In this paper, we propose a novel and efficient algorithm for modelling sub-65 nm clock interconnect networks in the presence of process variation. We develop a method for delay analysis of interconnects considering the impact of Gaussian metal process variations. The resistance and capacitance of a distributed RC line are expressed as correlated Gaussian random variables, which are then used to compute the standard deviation of the delay probability density function (PDF) at all nodes in the interconnect network. The main objective is to obtain the delay PDF at a lower computational cost. The approach converges in probability distribution, though not in the mean of the delay. We validate our approach against SPICE-based Monte Carlo simulations; the proposed method entails significantly lower computational cost.
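
A brute-force Monte Carlo baseline for this problem can be sketched as follows: the line's total R and C are drawn as correlated Gaussians and pushed through the Elmore approximation T ≈ RC/2 for a distributed line. The numeric values, the correlation, and the use of the Elmore metric are all illustrative assumptions; the paper's contribution is precisely to obtain the delay spread without this sampling cost.

```python
import numpy as np

rng = np.random.default_rng(1)

# Total resistance (ohm) and capacitance (pF) of the line as correlated
# Gaussians; all numeric values are illustrative, not taken from the paper.
mu_R, mu_C = 100.0, 0.2
sg_R, sg_C = 10.0, 0.02          # 10% standard deviations
rho = 0.6                        # R-C correlation from shared geometry variation
cov = [[sg_R**2,       rho*sg_R*sg_C],
       [rho*sg_R*sg_C, sg_C**2      ]]

samples = rng.multivariate_normal([mu_R, mu_C], cov, size=100_000)
R, C = samples[:, 0], samples[:, 1]

# Elmore delay of a distributed RC line: T ~= R*C/2 (ohm * pF = ps)
delay = R * C / 2.0

# For jointly Gaussian R and C, E[R*C] = mu_R*mu_C + rho*sg_R*sg_C exactly,
# so the Monte Carlo mean has a closed-form check
analytic_mean_delay = (mu_R * mu_C + rho * sg_R * sg_C) / 2.0
mc_mean_delay = delay.mean()
mc_std_delay = delay.std()
```

Note how the correlation term shifts the mean delay away from mu_R*mu_C/2; this is the kind of effect a variation-aware delay model has to capture.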

Relevance: 20.00%

Abstract:

The factors affecting non-industrial, private forest landowners' (hereafter referred to by the acronym NIPF) strategic decisions in management planning are studied. A genetic algorithm is used to induce a set of rules predicting the potential cut of the landowners' chosen timber management strategies. The rules are based on variables describing the characteristics of the landowners and their forest holdings. The predictive ability of the genetic algorithm is compared to linear regression analysis using identical data sets. The data are cross-validated seven times, applying both genetic algorithm and regression analyses, in order to examine the data sensitivity and robustness of the generated models. The optimal rule set derived from the genetic algorithm analyses included the following variables: mean initial volume, the landowner's positive price expectations for the next eight years, the landowner being classified as a farmer, and a preference for the recreational use of the forest property. When tested with previously unseen test data, the optimal rule set resulted in a relative root mean square error of 0.40. In the regression analyses, the optimal regression equation consisted of the following variables: mean initial volume, proportion of forestry income, intention to cut extensively in the future, and positive price expectations for the next two years. The R2 of the optimal regression equation was 0.34, and the relative root mean square error obtained from the test data was 0.38. In both models, mean initial volume and positive stumpage price expectations entered as significant predictors of the potential cut of the preferred timber management strategy. When tested with the complete data set of 201 observations, both the optimal rule set and the optimal regression model achieved the same level of accuracy.
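
The accuracy figure used throughout, relative root mean square error, can be computed as below. One common normalization (RMSE divided by the mean of the observed values) is assumed here; the thesis may define the normalization slightly differently.

```python
import numpy as np

def relative_rmse(y_true, y_pred):
    """Root mean square error normalized by the mean observed value.

    This normalization is an assumption; other definitions divide by
    the observed range or standard deviation instead.
    """
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return rmse / y_true.mean()
```

Because the measure is dimensionless, it allows the rule-set and regression models to be compared directly even though they produce predictions on the same physical scale by different mechanisms.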

Relevance: 20.00%

Abstract:

Modern sample surveys started to spread after statisticians at the U.S. Bureau of the Census developed, in the 1940s, a sampling design for the Current Population Survey (CPS). A significant factor was also that digital computers became available to statisticians. In the beginning of the 1950s, the theory was documented in textbooks on survey sampling. This thesis is about the development of statistical inference for sample surveys. The idea of statistical inference was first enunciated by the French scientist P. S. Laplace. In 1781, he published a plan for a partial investigation in which he determined the sample size needed to reach the desired accuracy in estimation. The plan was based on Laplace's Principle of Inverse Probability and on his derivation of the Central Limit Theorem, which were published in a memoir in 1774, one of the origins of statistical inference. Laplace's inference model was based on Bernoulli trials and binomial probabilities. He assumed that populations were changing constantly, which was depicted by assuming a priori distributions for the parameters. Laplace's inference model dominated statistical thinking for a century. Sample selection in Laplace's investigations was purposive. In 1894, at the International Statistical Institute meeting, the Norwegian Anders Kiaer presented the idea of the Representative Method for drawing samples. Its idea was that the sample should be a miniature of the population, and it is still prevailing. The virtues of random sampling were known, but practical problems of sample selection and data collection hindered its use. Arthur Bowley realized the potential of Kiaer's method and, in the beginning of the 20th century, carried out several surveys in the UK. He also developed the theory of statistical inference for finite populations, based on Laplace's inference model. R. A. Fisher's contributions in the 1920s constitute a watershed in statistical science. He revolutionized the theory of statistics. In addition, he introduced a new statistical inference model which is still the prevailing paradigm. The essential idea is to draw samples repeatedly from the same population, with the assumption that population parameters are constants. Fisher's theory did not include a priori probabilities. Jerzy Neyman adopted Fisher's inference model and applied it to finite populations, with the difference that Neyman's inference model does not include any assumptions about the distributions of the study variables. Applying Fisher's fiducial argument, he developed the theory of confidence intervals. Neyman's last contribution to survey sampling presented a theory for double sampling. This gave statisticians at the U.S. Census Bureau the central idea for developing the complex survey design for the CPS. An important criterion was to have a method in which the costs of data collection were acceptable and which provided approximately equal interviewer workloads, besides sufficient accuracy in estimation.

Relevance: 20.00%

Abstract:

A systematic structure analysis of the correlation functions of statistical quantum optics is carried out. From a suitably defined auxiliary two-point function we are able to identify the excited modes in the wave field. The relative simplicity of the higher-order correlation functions emerges as a byproduct, and the conditions under which these are made pure are derived. These results depend in a crucial manner on the notion of coherence indices and of unimodular coherence indices. A new class of approximate expressions for the density operator of a statistical wave field is worked out, based on discrete characteristic sets. These are even more economical than the diagonal coherent-state representations. An appreciation of the subtleties of quantum theory is thereby obtained. Certain implications for the physics of light beams are cited.

Relevance: 20.00%

Abstract:

The absorption produced by the audience in concert halls is considered a random variable. Beranek's proposal [L. L. Beranek, Music, Acoustics and Architecture (Wiley, New York, 1962), p. 543] that audience absorption is proportional to the area the audience occupies, and not to its number, is subjected to a statistical hypothesis test. A two-variable linear regression model of the absorption, with audience area and residual area as regressor variables, is postulated for concert halls without added absorptive materials. Since Beranek's contention amounts to the statement that audience absorption is independent of the seating density, the test of the hypothesis lies in categorizing halls by seating density and examining for significant differences among the slopes of the regression planes of the different categories. Such a test shows that Beranek's hypothesis can be accepted. It is also shown that the audience area is a better predictor of the absorption than the audience number. The absorption coefficients and their 95% confidence limits are given for the audience and residual areas. A critique of the regression model is presented.
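
The testing logic can be sketched with synthetic data: fit the two-regressor model separately to halls in each seating-density category and compare the audience-area slopes. The data layout, hall counts, noise level, and coefficient values below are all invented for illustration; the actual test in the paper compares the slopes of the regression planes with a formal significance test, not the crude tolerance check used here.

```python
import numpy as np

rng = np.random.default_rng(2)

def fit_absorption(audience_area, residual_area, absorption):
    """Least-squares fit of A = b1*S_audience + b2*S_residual (no intercept),
    mirroring the two-regressor model described above."""
    X = np.column_stack([audience_area, residual_area])
    coef, *_ = np.linalg.lstsq(X, absorption, rcond=None)
    return coef  # (b1, b2): absorption coefficients of the two areas

# Synthetic halls in two seating-density categories with identical true
# coefficients, so the fitted slopes should agree across categories --
# the situation in which Beranek's hypothesis would be accepted.
b1_true, b2_true = 0.85, 0.10    # illustrative absorption coefficients
fitted = {}
for density in ("high", "low"):
    S_a = rng.uniform(500, 2000, size=40)    # audience area, m^2
    S_r = rng.uniform(1000, 4000, size=40)   # residual area, m^2
    A = b1_true * S_a + b2_true * S_r + rng.normal(0, 20, size=40)
    fitted[density] = fit_absorption(S_a, S_r, A)
```

If audience absorption depended on seating density, the audience-area slope b1 would differ systematically between the two categories, which is exactly what the hypothesis test looks for.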

Relevance: 20.00%

Abstract:

The dissertation examines the role of the EU courts in new governance. New governance has raised unprecedented interest in the EU in recent years. This is manifested in a plethora of instruments and actors at various levels that challenge more traditional forms of command-and-control regulation. New governance and political experimentation more generally is thought to sap the ability of the EU judiciary to monitor and review these experiments. The exclusion of the courts is then seen to add to the legitimacy problem of new governance. The starting point of this dissertation is the observation that the marginalised role of the courts is based on theoretical and empirical assumptions which invite scrutiny. The theoretical framework of the dissertation is deliberative democracy and democratic experimentalism. The analysis of deliberative democracy is sustained by an attempt to apply theoretical concepts to three distinctive examples of governance in the EU. These are the EU Sustainable Development Strategy, the European Chemicals Agency, and the Common Implementation Strategy for the Water Framework Directive. The case studies show numerous disincentives and barriers to judicial review. Among these are questions of the role of courts in shaping governance frameworks, the reviewability of science-based measures, the standing of individuals before the courts, and the justiciability of soft law. The dissertation analyses the conditions of judicial review in each governance environment and proposes improvements. From a more theoretical standpoint it could be said that each case study presents a governance regime which builds on legislation that lays out major (guide)lines but leaves details to be filled out at a later stage. Specification of detailed standards takes place through collaborative networks comprising members from national administrations, NGOs, and the Commission. 
Viewed this way, deliberative problem-solving is needed to bring people together to clarify, elaborate, and revise largely abstract and general norms in order to resolve concrete and specific problems and to make law applicable and enforceable. The dissertation draws attention to the potential of the peer review embedded in these frameworks and to its profound consequences for judicial accountability structures. It is argued that without this kind of ongoing and dynamic peer review of accountability in governance frameworks, judicial review of new governance is difficult and in some cases impossible. This claim has implications for how we understand the concept of soft law, the role of the courts, participation rights, and the legitimacy of governance measures more generally. The experimentalist architecture of judicial decision-making relies upon a wide variety of actors to provide the conditions for legitimate and efficient review.

Relevance: 20.00%

Abstract:

Production scheduling in a flexible manufacturing system (FMS) is a real-time combinatorial optimization problem that has been proved to be NP-complete. Solving this problem needs on-line monitoring of plan execution and requires real-time decision-making in selecting alternative routings, assigning required resources, and rescheduling when failures occur in the system. Expert systems provide a natural framework for solving this kind of NP-complete problem. In this paper, an expert system with a novel parallel heuristic approach is implemented for automatic short-term dynamic scheduling of an FMS. The principal features of the expert system presented in this paper include easy rescheduling, on-line plan execution, load balancing, an on-line garbage-collection process, and the use of advanced knowledge representation schemes. Its effectiveness is demonstrated with two examples.
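
The flavour of rule-based rescheduling described here can be sketched with a single dispatch rule: on a failure, route the affected job to an operational machine that can perform its operation, balancing load by queue length. The data layout, field names, and the specific rule are illustrative assumptions, not the paper's knowledge representation.

```python
def reschedule(job, machines):
    """Illustrative rule-based dispatch in the spirit of an expert-system
    scheduler: first exclude failed machines (alternative routing), then
    apply a load-balancing rule among the machines that can process the
    job's operation. The data layout here is an assumption.
    """
    candidates = [m for m in machines
                  if m["up"] and job["operation"] in m["capabilities"]]
    if not candidates:
        return None  # no alternative routing available; defer the job
    # Load-balancing rule: the machine with the shortest queue wins
    return min(candidates, key=lambda m: len(m["queue"]))["name"]
```

A real expert system would chain many such rules (rescheduling, resource assignment, garbage collection of stale plans) and fire them against a continuously updated model of the shop floor, rather than evaluate one rule in isolation.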