915 results for Approximate Sum Rule
Abstract:
The classic work of Richardson and Gaunt [1] has provided an effective means of extrapolating the limiting result in an approximate analysis. From the authors' work on "Bounds for eigenvalues" [2-4], an interesting alternative method has emerged for assessing monotonically convergent approximate solutions by generating close bounds. While further investigation is needed to put this work on a sound theoretical foundation, we intend this letter to announce a possibility that has been confirmed by an exhaustive set of examples.
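A minimal sketch of the Richardson–Gaunt extrapolation idea (the "deferred approach to the limit"): two approximations computed at step sizes h and h/2, whose leading error is of a known order p, are combined so that the leading error term cancels. The target quantity (a central-difference derivative) and all names below are purely illustrative, not taken from the letter.

```python
# Richardson extrapolation sketch: combine approximations at h and h/2
# to cancel the leading O(h^p) error term.

import math

def central_diff(f, x, h):
    """O(h^2) central-difference approximation to f'(x)."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

def richardson(approx, h, p=2):
    """Extrapolate approx(h) and approx(h/2), assuming error ~ C*h^p."""
    a_h = approx(h)
    a_h2 = approx(h / 2.0)
    return (2**p * a_h2 - a_h) / (2**p - 1)

if __name__ == "__main__":
    f, x, h = math.sin, 1.0, 0.1
    crude = central_diff(f, x, h)
    refined = richardson(lambda s: central_diff(f, x, s), h, p=2)
    exact = math.cos(x)
    print(f"plain h = {h}:  error {abs(crude - exact):.2e}")
    print(f"extrapolated:  error {abs(refined - exact):.2e}")
```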
Abstract:
We trace the evolution of the representation of management in cropping and grazing systems models, from fixed annual schedules of identical actions in single paddocks toward flexible scripts of rules. Attempts to define higher-level organizing concepts in management policies, and to analyse them to identify optimal plans, have focussed on questions relating to grazing management owing to its inherent complexity. “Rule templates” assist the re-use of complex management scripts by bundling commonly-used collections of rules with an interface through which key parameters can be input by a simulation builder. Standard issues relating to parameter estimation and uncertainty apply to management sub-models and need to be addressed. Techniques for embodying farmers' expectations and plans for the future within modelling analyses need to be further developed, especially better linking planning- and rule-based approaches to farm management and analysing the ways that managers can learn.
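As a rough illustration of the "rule template" idea described above, i.e. a reusable bundle of management rules exposing only a few key parameters to the simulation builder, the following sketch uses hypothetical class, rule and parameter names; it is not drawn from any particular cropping or grazing model.

```python
# Illustrative "rule template": a reusable bundle of management rules whose
# key parameters are exposed to the simulation builder. All names
# (RotationalGrazingTemplate, move_threshold_kg_ha, etc.) are hypothetical.

from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Rule:
    name: str
    condition: Callable[[dict], bool]   # tests the current system state
    action: str                         # action issued when the condition holds

@dataclass
class RotationalGrazingTemplate:
    """Bundle of grazing rules parameterised by two thresholds."""
    move_threshold_kg_ha: float = 1200.0   # move stock below this biomass
    rest_days: int = 30                    # minimum spell before regrazing
    rules: List[Rule] = field(default_factory=list)

    def build(self) -> List[Rule]:
        self.rules = [
            Rule("move_stock",
                 lambda s: s["biomass_kg_ha"] < self.move_threshold_kg_ha,
                 "move herd to next paddock"),
            Rule("end_rest",
                 lambda s: s["days_rested"] >= self.rest_days,
                 "return paddock to grazing rotation"),
        ]
        return self.rules

# The simulation builder only supplies the parameters:
template = RotationalGrazingTemplate(move_threshold_kg_ha=1000.0, rest_days=45)
state = {"biomass_kg_ha": 950.0, "days_rested": 20}
for rule in template.build():
    if rule.condition(state):
        print(f"{rule.name}: {rule.action}")
```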
Abstract:
The Minimum Description Length (MDL) principle is a general, well-founded theoretical formalization of statistical modeling. The most important notion of MDL is the stochastic complexity, which can be interpreted as the shortest description length of a given sample of data relative to a model class. The exact definition of the stochastic complexity has gone through several evolutionary steps. The latest instantiation is based on the so-called Normalized Maximum Likelihood (NML) distribution, which has been shown to possess several important theoretical properties. However, applications of this modern version of MDL have been quite rare because of computational complexity problems: for discrete data, the definition of NML involves an exponential sum, and in the case of continuous data, a multi-dimensional integral that is usually infeasible to evaluate or even approximate accurately. In this doctoral dissertation, we present mathematical techniques for computing NML efficiently for some model families involving discrete data. We also show how these techniques can be used to apply MDL in two practical applications: histogram density estimation and clustering of multi-dimensional data.
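For reference, the NML distribution referred to above has the standard form below for a discrete data set x^n and model class M with maximum-likelihood estimator θ̂; the normalizing sum over all data sets of size n is the exponential sum responsible for the computational difficulty, and the stochastic complexity is its negative logarithm.

```latex
P_{\mathrm{NML}}(x^n \mid \mathcal{M}) =
  \frac{P\bigl(x^n \mid \hat{\theta}(x^n), \mathcal{M}\bigr)}
       {\sum_{y^n} P\bigl(y^n \mid \hat{\theta}(y^n), \mathcal{M}\bigr)},
\qquad
\mathrm{SC}(x^n \mid \mathcal{M}) = -\log P_{\mathrm{NML}}(x^n \mid \mathcal{M}).
```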
Abstract:
Major infrastructure and construction (MIC) projects are those with significant traffic or environmental impact, of strategic and regional significance and high sensitivity. The decision-making process for schemes of this type is becoming ever more complicated, especially with the increasing number of stakeholders involved and their growing tendency to defend their own varied interests. Failing to address and meet the concerns and expectations of stakeholders may result in project failure. Avoiding this necessitates a systematic participatory approach to facilitate decision-making. Though numerous decision models have been established in previous studies (e.g. ELECTRE methods, the analytic hierarchy process and the analytic network process), their applicability in the decision process during stakeholder participation in contemporary MIC projects is still uncertain. To resolve this, the decision rule approach is employed for modeling multi-stakeholder, multi-objective project decisions. Through this, the result is obtained naturally according to the “rules” accepted by all stakeholders involved. In this sense, consensus is more likely to be achieved, since the process is more convincing and the result is easier for all concerned to accept. Appropriate “rules”, comprehensive enough to address multiple objectives while straightforward enough to be understood by multiple stakeholders, are set for resolving conflict and facilitating consensus during the project decision process. The West Kowloon Cultural District (WKCD) project is used as a demonstration case, and a focus group meeting is conducted to confirm the validity of the model established. The results indicate that the model is objective, reliable and practical enough to cope with real-world problems. Finally, a suggested agenda for future research is provided.
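A toy sketch (not the paper's model) of how a decision-rule approach can aggregate multi-stakeholder, multi-objective judgments: each stakeholder contributes a rule it is prepared to accept, and an option survives only if it satisfies every agreed rule. The options, criteria and thresholds below are invented for illustration.

```python
# Toy illustration of a decision-rule approach for a multi-stakeholder,
# multi-objective choice. Options, criteria and thresholds are invented;
# the WKCD case in the paper uses its own agreed rules.

options = {
    "Scheme A": {"cost_bn": 2.1, "traffic_impact": 0.30, "public_support": 0.72},
    "Scheme B": {"cost_bn": 1.6, "traffic_impact": 0.55, "public_support": 0.61},
    "Scheme C": {"cost_bn": 2.8, "traffic_impact": 0.20, "public_support": 0.80},
}

# Each stakeholder contributes one rule it is willing to accept.
rules = {
    "Government (budget)": lambda o: o["cost_bn"] <= 2.5,
    "Residents (traffic)": lambda o: o["traffic_impact"] <= 0.40,
    "Public (support)":    lambda o: o["public_support"] >= 0.65,
}

for name, attrs in options.items():
    failed = [who for who, rule in rules.items() if not rule(attrs)]
    verdict = "acceptable to all" if not failed else f"rejected by: {', '.join(failed)}"
    print(f"{name}: {verdict}")
```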
Abstract:
The impurity profile for the second oxidation, used in MOST fabrication, has been obtained by Margalit et al. [1]. The disadvantage of this technique is that the accuracy of their solution is directly dependent on the computer time. In this article, an analytical solution is presented using the approximation of linearizing the second oxidation procedure.
Abstract:
In cases where rotation of the secondary principal stress axes along the light path exists, it is always possible to determine two directions along which plane-polarized light entering the model emerges as plane-polarized light from the model. Further, the net retardation for any light path is different from the integrated retardation for the light path neglecting the effect of rotation.
Abstract:
The motion of a bore over a sloping beach, earlier considered numerically by Keller, Levine & Whitham (1960), is studied by an approximate analytic technique. This technique is an extension of Whitham's (1958) approach for the propagation of shocks into a non-uniform medium. It gives the entire flow behind the bore and is shown to be equivalent to the theory of modulated simple waves of Varley, Venkataraman & Cumberbatch (1971).
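As a reminder of the ingredient named above, Whitham's (1958) characteristic rule applies the exact relation holding on forward characteristics of the shallow-water equations (written here for undisturbed depth h_0(x), surface elevation η and wave speed c) to the flow immediately behind the bore; combined with the bore jump conditions, this yields an ordinary differential equation for the bore strength along the beach. This is a standard statement of the rule, not a reproduction of the paper's analysis.

```latex
d(u + 2c) = g\,\frac{dh_0}{dx}\,dt
\quad\text{along}\quad \frac{dx}{dt} = u + c,
\qquad c = \sqrt{g\,(h_0 + \eta)} .
```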
Abstract:
Motivated by a problem from fluid mechanics, we consider a generalization of the standard curve shortening flow problem for a closed embedded plane curve such that the area enclosed by the curve is forced to decrease at a prescribed rate. Using formal asymptotic and numerical techniques, we derive possible extinction shapes as the curve contracts to a point, dependent on the rate of decreasing area; we find there is a wider class of extinction shapes than for standard curve shortening, for which initially simple closed curves are always asymptotically circular. We also provide numerical evidence that self-intersection is possible for non-convex initial conditions, distinguishing between pinch-off and coalescence of the curve interior.
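One natural way to write such an area-constrained flow, consistent with the description above but not necessarily the authors' exact formulation, is to add a spatially constant term λ(t) to the curvature-driven inward normal velocity and choose it so that the enclosed area decreases at the prescribed rate R(t):

```latex
V = \kappa + \lambda(t), \qquad
\frac{dA}{dt} = -\oint_{\Gamma(t)} V \, ds
             = -2\pi - \lambda(t)\,L(t) = -R(t)
\quad\Longrightarrow\quad
\lambda(t) = \frac{R(t) - 2\pi}{L(t)},
```

where κ is the curvature, L(t) the curve length, and ∮κ ds = 2π for a simple closed curve; λ ≡ 0 recovers standard curve shortening with dA/dt = -2π.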
Abstract:
In this paper, the pattern classification problem in tool wear monitoring is solved using nature-inspired techniques, namely Genetic Programming (GP) and Ant-Miner (AM). The main advantage of GP and AM is their ability to learn the underlying data relationships and express them in the form of a mathematical equation or simple rules. The knowledge extracted from the training data set using GP and AM takes the form of a Genetic Programming Classifier Expression (GPCE) and rules, respectively. The GPCE and the AM-extracted rules are then applied to the data in the testing/validation set to obtain the classification accuracy. A major attraction of GP-evolved GPCEs and AM-based classification is the possibility of obtaining expert-system-like rules that can subsequently be applied directly by users in their own applications. The performance of data classification using GP and AM is as good as the classification accuracy obtained in the earlier study.
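As a rough illustration (not the study's evolved expressions), a GPCE is typically a single arithmetic expression over the monitoring features whose sign assigns the class, while Ant-Miner yields ordered IF-THEN rules; the feature names, coefficients and thresholds below are hypothetical.

```python
# Illustration of how a GP-evolved classifier expression (GPCE) and
# Ant-Miner style rules are applied to new data. The expression, features
# and thresholds are invented; the actual GPCE is evolved from training data.

def gpce(features: dict) -> int:
    """Hypothetical evolved expression: its sign decides worn vs. healthy."""
    value = (0.8 * features["vibration_rms"] + 1.5 * features["feed_force"]
             - 0.6 * features["spindle_speed"] / 1000.0 - 1.0)
    return 1 if value > 0 else 0        # 1 = worn tool, 0 = healthy tool

def ant_miner_rules(features: dict) -> int:
    """Hypothetical discovered IF-THEN rules, applied in order."""
    if features["feed_force"] > 1.2 and features["vibration_rms"] > 0.9:
        return 1
    if features["vibration_rms"] < 0.5:
        return 0
    return 0                            # default class

sample = {"vibration_rms": 1.1, "feed_force": 1.4, "spindle_speed": 1800.0}
print("GPCE prediction:     ", gpce(sample))
print("Ant-Miner prediction:", ant_miner_rules(sample))
```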
Abstract:
We derive a very general expression for the survival probability and the first passage time distribution of a particle executing Brownian motion in full phase space with an absorbing boundary condition at a point in position space, which is valid irrespective of the statistical nature of the dynamics. The expression, together with Jensen's inequality, naturally leads to a lower bound on the actual survival probability and an approximate first passage time distribution. These are expressed in terms of the position-position, velocity-velocity, and position-velocity variances. Knowledge of these variances enables one to compute a lower bound on the survival probability and consequently the first passage distribution function. As examples, we compute these for a Gaussian Markovian process and, for non-Markovian processes, with an exponentially decaying friction kernel and also with a power-law friction kernel. Our analysis shows that the survival probability decays exponentially at long times, irrespective of the nature of the dynamics, with an exponent equal to the transition state rate constant.
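For orientation, the two general ingredients named above are the relation between the survival probability S(t) and the first-passage-time density F(t), and Jensen's inequality for a convex function φ; the paper's specific variance-based bound is not reproduced here.

```latex
F(t) = -\frac{dS(t)}{dt},
\qquad
\bigl\langle \varphi(X) \bigr\rangle \;\ge\; \varphi\bigl(\langle X \rangle\bigr)
\quad \text{for convex } \varphi,
```

so any lower bound on S(t) expressed through the position and velocity variances immediately induces an approximate first-passage distribution via the first relation.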
Abstract:
With the development of wearable and mobile computing technology, more and more people are using sleep-tracking tools to collect personal sleep data on a daily basis, aiming to understand and improve their sleep. While sleep quality is influenced by many factors in a person’s lifestyle context, such as exercise, diet and steps walked, existing tools simply visualize sleep data per se on a dashboard rather than analyse those data in combination with contextual factors. Hence many people find it difficult to make sense of their sleep data. In this paper, we present a cloud-based intelligent computing system named SleepExplorer that incorporates sleep domain knowledge and association rule mining for automated analysis of personal sleep data in light of contextual factors. Experiments show that the same contextual factors can play a distinct role in the sleep of different people, and that SleepExplorer can help users discover the factors most relevant to their personal sleep.
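A minimal sketch of the kind of association-rule mining described above, run on hypothetical daily records that pair contextual factors with a sleep-quality label; the item names and the support and confidence thresholds are invented, and SleepExplorer's actual pipeline is not detailed in this abstract.

```python
# Minimal association-rule sketch over hypothetical daily records pairing
# contextual factors with a sleep-quality label. Thresholds are illustrative.

from itertools import combinations

days = [
    {"exercised", "steps>8000", "good_sleep"},
    {"exercised", "late_caffeine", "poor_sleep"},
    {"steps>8000", "good_sleep"},
    {"late_caffeine", "poor_sleep"},
    {"exercised", "steps>8000", "good_sleep"},
]

MIN_SUPPORT, MIN_CONFIDENCE = 0.4, 0.7
n = len(days)

def support(itemset):
    """Fraction of days containing every item in the itemset."""
    return sum(itemset <= day for day in days) / n

# Mine rules of the form {context factor(s)} -> {sleep outcome}.
outcomes = {"good_sleep", "poor_sleep"}
factors = set().union(*days) - outcomes

for size in (1, 2):
    for antecedent in combinations(sorted(factors), size):
        a = set(antecedent)
        if support(a) < MIN_SUPPORT:
            continue
        for outcome in outcomes:
            rule_support = support(a | {outcome})
            confidence = rule_support / support(a)
            if rule_support >= MIN_SUPPORT and confidence >= MIN_CONFIDENCE:
                print(f"{sorted(a)} -> {outcome}  "
                      f"(support={rule_support:.2f}, confidence={confidence:.2f})")
```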
Abstract:
The two-year trial of the Queensland minimum passing distance (MPD) road rule began on 7 April 2014. The rule requires motor vehicles to give cyclists a minimum lateral passing distance of one metre when overtaking in a speed zone of 60 km/h or less, and 1.5 metres when the speed limit is greater than 60 km/h. This document summarises the evaluation of the effectiveness of the new rule in terms of its: 1. practical implementation; 2. impact on road users’ attitudes and perceptions; and 3. road safety benefits. The Centre for Accident Research and Road Safety – Queensland (CARRS-Q) developed the evaluation framework (Haworth, Schramm, Kiata-Holland, Vallmuur, Watson & Debnath, 2014) for the Queensland Department of Transport and Main Roads (TMR) and was later commissioned to undertake the evaluation. The evaluation included the following components:
• Review of correspondence received by TMR;
• Interviews and focus groups with Queensland Police Service (QPS) officers;
• Road user survey;
• Observational study; and
• Crash, injury and infringement data analysis.