914 results for Bayes Rule
Abstract:
Train-pedestrian collisions are the most likely to result in severe injuries and fatalities when compared with other types of rail crossing accidents. However, there is currently scant research examining the origins of pedestrians’ rule breaking at level crossings. This study therefore examined the origins of pedestrians’ rule-breaking behaviour at crossings, with particular emphasis on the factors associated with making errors versus committing deliberate violations. A total of 636 individuals volunteered to participate in the study and completed either an online or paper version of the questionnaire. Quantitative analysis of the data revealed that knowledge of crossing rules was high, although up to 18% of level crossing users were either unsure or did not know (in some circumstances) when it was legal to cross at a level crossing. Furthermore, 156 participants (24.52%) reported having intentionally violated the rules at level crossings, and 3.46% (n = 22) of the sample had previously made a mistake at a crossing. Among rule violators, males (particularly minors) were more likely to report breaking rules, and the most frequent occurrence was after the train had passed rather than before it arrived. Regression analysis revealed that males who frequently use pedestrian crossings and report higher sensation-seeking traits are the most likely to break the rules. This research provides evidence that pedestrians are more likely to deliberately violate rules (rather than make errors) at crossings, and it identifies high-risk groups. The paper further discusses the study findings in relation to the development of countermeasures and provides direction for future research efforts in this area.
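The abstract does not report the regression specification. As a hedged sketch of the kind of analysis described (a binary logistic regression of self-reported violation on gender, crossing frequency and sensation seeking), with entirely hypothetical variable names and simulated data:

```python
# Hedged sketch: a logistic regression of the kind the abstract alludes to.
# Variable names and the simulated data are hypothetical, not from the study.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 636  # same sample size as the study; the data itself is simulated
df = pd.DataFrame({
    "male": rng.integers(0, 2, n),              # 1 = male
    "crossing_freq": rng.poisson(5, n),         # crossings per week
    "sensation_seeking": rng.normal(0, 1, n),   # standardised scale score
})
# Simulate an outcome loosely consistent with the reported pattern.
linpred = -2.0 + 0.8 * df.male + 0.1 * df.crossing_freq + 0.6 * df.sensation_seeking
df["violated"] = (rng.random(n) < 1 / (1 + np.exp(-linpred))).astype(int)

model = smf.logit("violated ~ male + crossing_freq + sensation_seeking", data=df).fit()
print(np.exp(model.params))  # odds ratios for each predictor
```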
Abstract:
A new transdimensional Sequential Monte Carlo (SMC) algorithm called SMCVB is proposed. In an SMC approach, a weighted sample of particles is generated from a sequence of probability distributions that ‘converge’ to the target distribution of interest, in this case a Bayesian posterior distribution. The approach uses variational Bayes to propose new particles at each iteration of the SMCVB algorithm in order to target the posterior more efficiently. The variational-Bayes-generated proposals are not limited to a fixed dimension, so the weighted particle sets that arise can have varying dimensions, which also allows an appropriate dimension for the model to be estimated. This novel algorithm is outlined within the context of finite mixture model estimation, where it provides a less computationally demanding alternative to using reversible jump Markov chain Monte Carlo kernels within an SMC approach. We illustrate these ideas in a simulated data analysis and in applications.
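The SMCVB algorithm itself is not reproduced here. As a hedged toy sketch of the general idea (propose fresh particles from a cheap parametric approximation and correct them by importance weighting), consider a fixed-dimension Gaussian-mean example; the moment-matched Gaussian proposal below is a crude stand-in for a genuine variational-Bayes fit:

```python
# Toy sketch of SMC with proposals drawn from a fitted approximation.
# This is NOT the SMCVB algorithm: it is fixed-dimension, and a
# moment-matched Gaussian stands in for a variational-Bayes fit.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
data = rng.normal(2.0, 1.0, size=50)          # observations, known unit variance

def log_post(theta):                          # N(0, 10^2) prior on the mean
    return (stats.norm.logpdf(theta, 0, 10)
            + stats.norm.logpdf(data[:, None], theta, 1).sum(axis=0))

n_particles, n_steps = 500, 5
theta = rng.normal(0, 10, n_particles)        # initialise from the prior
logw = np.zeros(n_particles)
betas = np.linspace(0.2, 1.0, n_steps)        # tempering schedule

for beta in betas:
    # "Variational" step: fit a Gaussian to the current weighted particles.
    w = np.exp(logw - logw.max()); w /= w.sum()
    mu = np.average(theta, weights=w)
    sd = np.sqrt(np.cov(theta, aweights=w))
    # Propose fresh particles from the fit, then importance-weight them
    # against the tempered posterior.
    theta = rng.normal(mu, sd, n_particles)
    logw = beta * log_post(theta) - stats.norm.logpdf(theta, mu, sd)

w = np.exp(logw - logw.max()); w /= w.sum()
print("posterior mean estimate:", np.average(theta, weights=w))
```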
Abstract:
We propose an architecture for a rule-based online management system (RuleOMS). Typically, stakeholders in many domains maintain databases of their core business information and have to take decisions or create reports according to guidelines, policies or regulations. To address this issue we propose the integration of databases, in particular relational databases, with a logic reasoner and rule engine. We argue that defeasible logic is an appropriate formalism to model rules, in particular when the rules are meant to model regulations. The resulting RuleOMS provides an efficient and flexible solution to the problem at hand using defeasible inference. A case study of an online child care management system is used to illustrate the proposed architecture.
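To give the flavour of defeasible inference (a defeasible rule can be overridden by a conflicting rule of higher priority), here is a minimal hypothetical sketch; the rules are invented for illustration, and a real defeasible logic engine additionally handles strict rules, defeaters and rule chaining:

```python
# Minimal sketch of defeasible inference: a defeasible rule fires only if no
# conflicting rule of strictly higher priority also fires. Rules are invented.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    condition: Callable[[dict], bool]
    conclusion: tuple[str, bool]      # (literal, truth value)
    priority: int

rules = [
    Rule("r1: under-threes require a 1:4 staff ratio",
         lambda f: f["age"] < 3, ("ratio_1_to_4", True), priority=1),
    Rule("r2: ... unless an exemption applies",
         lambda f: f["exemption"], ("ratio_1_to_4", False), priority=2),
]

def conclude(facts: dict, literal: str):
    fired = [r for r in rules if r.conclusion[0] == literal and r.condition(facts)]
    if not fired:
        return None                   # no applicable rule: undecided
    # The highest-priority applicable rule defeats the others.
    return max(fired, key=lambda r: r.priority).conclusion[1]

print(conclude({"age": 2, "exemption": False}, "ratio_1_to_4"))  # True
print(conclude({"age": 2, "exemption": True}, "ratio_1_to_4"))   # False
```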
Abstract:
Background: Recently there have been efforts to derive safe, efficient processes to rule out acute coronary syndrome (ACS) in emergency department (ED) chest pain patients. We aimed to prospectively validate an ACS assessment pathway (the 2-Hour Accelerated Diagnostic Protocol to Assess Patients with Chest Pain Symptoms Using Contemporary Troponins as the Only Biomarker (ADAPT) pathway) under pragmatic ED working conditions. Methods: This prospective cohort study included patients with atraumatic chest pain in whom ACS was suspected but who did not have clear evidence of ischaemia on ECG. Thrombolysis in myocardial infarction (TIMI) score and troponin (TnI Ultra) were measured at ED presentation, 2 h later and according to current national recommendations. The primary outcome of interest was the occurrence of major adverse cardiac events (MACE), including prevalent myocardial infarction (MI), at 30 days in the group who had a TIMI score of 0 and had presentation and 2-h TnI assays <99th percentile. Results: Eight hundred and forty patients were studied, of whom 177 (21%) had a TIMI score of 0. There were no MIs, MACE or revascularizations in the per-protocol and intention-to-treat 2-h troponin groups (0%, 95% confidence interval (CI) 0% to 4.5% and 0%, 95% CI 0% to 3.8%, respectively). The negative predictive value (NPV) was 100% (95% CI 95.5% to 100%) and 100% (95% CI 96.2% to 100%), respectively. Conclusions: A 2-h accelerated rule-out process for ED chest pain patients using electrocardiography, a TIMI score of 0 and a contemporary sensitive troponin assay accurately identifies a group at very low risk of 30-day MI or MACE.
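Zero-event intervals like the reported "0%, 95% CI 0% to 4.5%" are typically exact (Clopper-Pearson) binomial intervals. The abstract does not state its CI method or the group denominators, but a 4.5% upper bound is what an exact interval gives for zero events in roughly 80 patients, so the sketch below uses n = 80 as a back-derived assumption:

```python
# Hedged sketch: exact (Clopper-Pearson) 95% CI for an observed proportion.
# The abstract does not state its CI method or denominators; n = 80 is an
# assumption, chosen because 0/80 reproduces an upper bound near 4.5%.
from scipy.stats import beta

def clopper_pearson(x: int, n: int, alpha: float = 0.05):
    lo = 0.0 if x == 0 else beta.ppf(alpha / 2, x, n - x + 1)
    hi = 1.0 if x == n else beta.ppf(1 - alpha / 2, x + 1, n - x)
    return lo, hi

lo, hi = clopper_pearson(0, 80)       # zero MACE events in 80 patients
print(f"event rate 95% CI: {lo:.1%} to {hi:.1%}")   # 0.0% to 4.5%
# The NPV interval is the complement: 95.5% to 100%.
```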
The new Vancouver Chest Pain Rule using troponin as the only biomarker: An external validation study
Abstract:
Objectives: To externally evaluate the accuracy of the new Vancouver Chest Pain Rule and to assess its diagnostic accuracy using either sensitive or highly sensitive troponin assays. Methods: Prospectively collected data from 2 emergency departments (EDs) in Australia and New Zealand were analysed. Based on the new Vancouver Chest Pain Rule, low-risk patients were identified using electrocardiogram results, cardiac history, nitrate use, age, pain characteristics and troponin results at 2 hours after presentation. The primary outcome was 30-day diagnosis of acute coronary syndrome (ACS), including acute myocardial infarction and unstable angina. Sensitivity, specificity, positive predictive value and negative predictive value were calculated to assess the accuracy of the new Vancouver Chest Pain Rule using either sensitive or highly sensitive troponin assay results. Results: Of the 1635 patients, 20.4% had an ACS diagnosis at 30 days. Using the highly sensitive troponin assay, 212 (13.0%) patients were eligible for early discharge, of whom 3 (1.4%) were diagnosed with ACS. Sensitivity was 99.1% (95% CI 97.4-99.7), specificity was 16.1% (95% CI 14.2-18.2), the positive predictive value was 23.3% (95% CI 21.1-25.5) and the negative predictive value was 98.6% (95% CI 95.9-99.5). The diagnostic accuracy of the rule was similar using the sensitive troponin assay. Conclusions: The new Vancouver Chest Pain Rule should be used to identify low-risk patients presenting to EDs with symptoms of possible ACS, and will reduce the proportion of patients requiring lengthy assessment; however, we recommend further outpatient investigation for coronary artery disease in patients identified as low risk.
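The reported accuracy figures can be back-derived into a 2×2 table; the counts below are inferred from the abstract's percentages (1635 patients, 20.4% prevalence, 212 early discharges, 3 of them with ACS), not copied from the paper, and they reproduce all four reported metrics:

```python
# 2x2 counts back-derived from the abstract's figures; an inference from the
# reported percentages, not the paper's own table.
acs = round(1635 * 0.204)     # 334 patients with 30-day ACS
fn = 3                        # classified low-risk but diagnosed with ACS
tn = 212 - fn                 # 209 correctly identified low-risk
tp = acs - fn                 # 331 ACS patients not classified low-risk
fp = 1635 - acs - tn          # 1092 non-ACS patients kept for workup

sensitivity = tp / (tp + fn)  # 0.991
specificity = tn / (tn + fp)  # 0.161
ppv = tp / (tp + fp)          # 0.233
npv = tn / (tn + fn)          # 0.986
print(f"sens {sensitivity:.1%}, spec {specificity:.1%}, "
      f"PPV {ppv:.1%}, NPV {npv:.1%}")
```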
Abstract:
This paper describes the design and implementation of a high-level query language called Generalized Query-By-Rule (GQBR), which supports retrieval, insertion, deletion and update operations. The language, based on the formalism of database logic, enables users to access each database in a distributed heterogeneous environment without having to learn all the different data manipulation languages. The compiler has been implemented in Pascal on a DEC 1090 system.
Abstract:
An efficient geometrical design rule checker is proposed, based on operations on quadtrees that represent VLSI mask layouts. The time complexity of the design rule checker is O(N), where N is the number of polygons in the mask. A pseudo-Pascal description is provided of all the important algorithms for geometrical design rule verification.
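The paper's algorithms are given in pseudo-Pascal and are not reproduced here. Purely to illustrate why quadtrees help (a spacing check only needs to compare geometrically nearby shapes, avoiding all-pairs comparison), here is a minimal Python quadtree sketch, not the paper's data structure:

```python
# Minimal point-region quadtree sketch (not the paper's data structure):
# rectangles live in leaves indexed by their centres, so a spacing check
# only visits nodes near the query window instead of all N polygons.
CAPACITY = 2

class QuadTree:
    def __init__(self, x, y, w, h):
        self.x, self.y, self.w, self.h = x, y, w, h   # this node's region
        self.rects, self.children = [], None

    def insert(self, r):                              # r = (x, y, w, h)
        if self.children:
            self._child_for(r).insert(r)
            return
        self.rects.append(r)
        if len(self.rects) > CAPACITY and self.w > 1:
            self._split()

    def _split(self):
        hw, hh = self.w / 2, self.h / 2
        self.children = [QuadTree(self.x + dx, self.y + dy, hw, hh)
                         for dx in (0, hw) for dy in (0, hh)]
        for r in self.rects:
            self._child_for(r).insert(r)
        self.rects = []

    def _child_for(self, r):
        cx, cy = r[0] + r[2] / 2, r[1] + r[3] / 2     # rectangle centre
        for c in self.children:
            if c.x <= cx < c.x + c.w and c.y <= cy < c.y + c.h:
                return c
        return self.children[-1]                      # boundary fallback

    def query(self, qx, qy, qw, qh, out):
        # Skip whole subtrees whose region misses the query window.
        if qx > self.x + self.w or qx + qw < self.x or \
           qy > self.y + self.h or qy + qh < self.y:
            return out
        out.extend(self.rects)
        for c in self.children or []:
            c.query(qx, qy, qw, qh, out)
        return out

tree = QuadTree(0, 0, 1024, 1024)
for rect in [(10, 10, 5, 5), (18, 10, 5, 5), (600, 600, 5, 5), (30, 30, 5, 5)]:
    tree.insert(rect)
d = 4                                  # minimum spacing rule
x, y, w, h = 10, 10, 5, 5              # check this rectangle's neighbourhood
print(tree.query(x - d, y - d, w + 2 * d, h + 2 * d, []))
```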
Abstract:
The rule of law is understood to be a core aspect in achieving a stable economy and an ordered society. Without the elements inherent in this principle, the possibilities of anarchy, unfairness and uncertainty are amplified, which in turn can result in an economy with dramatic fluctuations. In this regard, commentators do not always agree that the rule of law is strictly adhered to in the international legal context. Therefore, this paper will explore one aspect of international regulation and consider whether the UNCITRAL Model Law on Cross-border Insolvency (1997) (‘Model Law’) and its associated Guide to Enactment and Interpretation (2013) contribute to the promotion of the key elements of the rule of law.
Abstract:
We trace the evolution of the representation of management in cropping and grazing systems models, from fixed annual schedules of identical actions in single paddocks toward flexible scripts of rules. Attempts to define higher-level organizing concepts in management policies, and to analyse them to identify optimal plans, have focussed on questions relating to grazing management owing to its inherent complexity. “Rule templates” assist the re-use of complex management scripts by bundling commonly-used collections of rules with an interface through which key parameters can be input by a simulation builder. Standard issues relating to parameter estimation and uncertainty apply to management sub-models and need to be addressed. Techniques for embodying farmers' expectations and plans for the future within modelling analyses need to be further developed, especially better linking planning- and rule-based approaches to farm management and analysing the ways that managers can learn.
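As a hedged illustration of the "rule template" idea (a reusable bundle of management rules exposing a few key parameters to the simulation builder), here is a hypothetical sketch; the template name, parameters and state variables are invented, not taken from any particular systems model:

```python
# Hypothetical sketch of a "rule template": a parameterised bundle of
# management rules that a simulation builder instantiates with key values.
from dataclasses import dataclass

@dataclass
class RotationalGrazingTemplate:
    """Move stock out when pasture drops below a residual; rest until recovery."""
    min_residual_kg_ha: float = 1500.0   # key parameters exposed to the builder
    recovery_kg_ha: float = 2200.0

    def rules(self):
        # Each rule is (condition over simulation state, action name).
        return [
            (lambda s: s["biomass"] < self.min_residual_kg_ha, "move_stock_out"),
            (lambda s: s["biomass"] > self.recovery_kg_ha
                       and not s["stocked"], "move_stock_in"),
        ]

template = RotationalGrazingTemplate(min_residual_kg_ha=1200.0)
state = {"biomass": 1100.0, "stocked": True}
actions = [name for cond, name in template.rules() if cond(state)]
print(actions)   # ['move_stock_out']
```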
Abstract:
Whether a statistician wants to complement a probability model for observed data with a prior distribution and carry out fully probabilistic inference, or to base the inference on the likelihood function alone, may be a fundamental question in theory, but in practice it may well be of less importance if the likelihood contains much more information than the prior. Maximum likelihood inference can be justified as a Gaussian approximation at the posterior mode, using flat priors. However, in situations where parametric assumptions in standard statistical models would be too rigid, more flexible model formulation, combined with fully probabilistic inference, can be achieved using hierarchical Bayesian parametrization. This work includes five articles, all of which apply probability modeling to problems involving incomplete observation. Three of the papers apply maximum likelihood estimation and two hierarchical Bayesian modeling. Because maximum likelihood may be presented as a special case of Bayesian inference, but not the other way round, in the introductory part of this work we present a framework for probability-based inference using only Bayesian concepts. We also re-derive some results presented in the original articles using the toolbox developed herein, to show that they are also justifiable under this more general framework. The assumption of exchangeability and de Finetti's representation theorem are applied repeatedly to justify the use of standard parametric probability models with conditionally independent likelihood contributions, and it is argued that the same reasoning applies under sampling from a finite population. The main emphasis is on probability-based inference under incomplete observation due to study design, illustrated using a generic two-phase cohort sampling design as an example. The alternative approaches presented for the analysis of such a design are full likelihood, which utilizes all observed information, and conditional likelihood, which is restricted to a completely observed set, conditioning on the rule that generated that set. Conditional likelihood inference is also applied to a joint analysis of prevalence and incidence data, a situation subject to both left censoring and left truncation. Other topics covered are model uncertainty and causal inference using posterior predictive distributions. We formulate a non-parametric monotonic regression model for one or more covariates and a Bayesian estimation procedure, and apply the model in the context of optimal sequential treatment regimes, demonstrating that inference based on posterior predictive distributions is feasible also in this case.
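For reference, the representation theorem invoked above states that the joint distribution of an infinitely exchangeable sequence is a mixture of i.i.d. models. In its standard parametric formulation (quoted from the general literature, not from this thesis):

```latex
% de Finetti's representation theorem (standard parametric form): for an
% infinitely exchangeable sequence X_1, X_2, \dots there exists a mixing
% distribution Q over parameters \theta such that
p(x_1, \dots, x_n) \;=\; \int \prod_{i=1}^{n} p(x_i \mid \theta) \, \mathrm{d}Q(\theta),
% i.e. the observations are conditionally i.i.d. given \theta, which is what
% licenses standard parametric likelihoods combined with a prior Q on \theta.
```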
Abstract:
Major infrastructure and construction (MIC) projects are those with significant traffic or environmental impact, of strategic and regional significance, and of high sensitivity. The decision-making process for schemes of this type is becoming ever more complicated, especially with the increasing number of stakeholders involved and their growing tendency to defend their own varied interests. Failing to address and meet the concerns and expectations of stakeholders may result in project failure; avoiding this necessitates a systematic participatory approach to facilitate decision-making. Though numerous decision models have been established in previous studies (e.g. ELECTRE methods, the analytic hierarchy process and the analytic network process), their applicability in the decision process during stakeholder participation in contemporary MIC projects is still uncertain. To resolve this, the decision rule approach is employed for modeling multi-stakeholder, multi-objective project decisions. Through this, the result is obtained naturally according to the “rules” accepted by every stakeholder involved. In this sense, consensus is more likely to be achieved, since the process is more convincing and the result is easier for all concerned to accept. Appropriate “rules”, comprehensive enough to address multiple objectives while straightforward enough to be understood by multiple stakeholders, are set for resolving conflict and facilitating consensus during the project decision process. The West Kowloon Cultural District (WKCD) project is used as a demonstration case, and a focus group meeting was conducted to confirm the validity of the model established. The results indicate that the model is objective, reliable and practical enough to cope with real-world problems. Finally, a suggested agenda for future research is provided.
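The abstract does not specify the rules adopted for the WKCD case. As a purely hypothetical sketch, one simple family of decision rules is conjunctive screening, where an option survives only if it meets every stakeholder's threshold on every objective:

```python
# Hypothetical sketch of a conjunctive decision rule for multi-stakeholder,
# multi-objective screening; the stakeholders, objectives, scores and
# thresholds below are invented, not from the WKCD case study.
options = {
    "scheme_A": {"traffic": 0.7, "environment": 0.5, "cost": 0.8},
    "scheme_B": {"traffic": 0.9, "environment": 0.8, "cost": 0.6},
}
# Each stakeholder states a minimum acceptable score per objective.
thresholds = {
    "residents":  {"traffic": 0.6, "environment": 0.7},
    "government": {"cost": 0.5},
}

def acceptable(scores: dict) -> bool:
    """An option passes only if it clears every stakeholder's thresholds."""
    return all(scores[obj] >= t
               for mins in thresholds.values()
               for obj, t in mins.items())

print([name for name, s in options.items() if acceptable(s)])  # ['scheme_B']
```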
Abstract:
Motivated by a problem from fluid mechanics, we consider a generalization of the standard curve shortening flow problem for a closed embedded plane curve in which the area enclosed by the curve is forced to decrease at a prescribed rate. Using formal asymptotic and numerical techniques, we derive possible extinction shapes as the curve contracts to a point, depending on the rate of area decrease; we find there is a wider class of extinction shapes than for standard curve shortening, for which initially simple closed curves are always asymptotically circular. We also provide numerical evidence that self-intersection is possible for non-convex initial conditions, distinguishing between pinch-off and coalescence of the curve interior.
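The abstract does not give the flow law. One common way to write such an area-constrained variant (an assumed form, not necessarily the authors' formulation) adds a time-dependent Lagrange-type term to the normal velocity:

```latex
% Standard curve shortening flow moves the curve with inward normal velocity
% equal to its curvature \kappa; by Gauss--Bonnet the enclosed area obeys
\frac{\mathrm{d}A}{\mathrm{d}t} = -\oint_{\gamma} \kappa \,\mathrm{d}s = -2\pi .
% An assumed area-constrained variant (not necessarily the paper's) takes
% normal velocity v = \kappa + \lambda(t), giving
\frac{\mathrm{d}A}{\mathrm{d}t} = -\oint_{\gamma} v \,\mathrm{d}s
  = -2\pi - \lambda(t)\, L(t),
% so, given the curve length L(t), \lambda(t) can be chosen to force
% \mathrm{d}A/\mathrm{d}t to equal any prescribed rate.
```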
Abstract:
In this paper, the pattern classification problem in tool wear monitoring is solved using nature-inspired techniques, namely Genetic Programming (GP) and Ant-Miner (AM). The main advantage of GP and AM is their ability to learn the underlying data relationships and express them in the form of a mathematical equation or simple rules. The knowledge extracted from the training data set using GP and AM takes the form of a Genetic Programming Classifier Expression (GPCE) and rules, respectively. The GPCE and the AM-extracted rules are then applied to the data in the testing/validation set to obtain the classification accuracy. A major attraction of GP-evolved GPCEs and AM-based classification is the possibility of obtaining expert-system-like rules that users can subsequently apply directly in their own applications. The classification performance of GP and AM is as good as the classification accuracy obtained in the earlier study.
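As a hedged illustration of what a GP classifier expression looks like in use (the expression, feature names and threshold below are invented examples, not the paper's GPCE):

```python
# Hypothetical sketch of applying a GP-evolved classifier expression (GPCE):
# GP searches over expression trees, and the evolved expression is then used
# as a simple decision rule. The expression and features are invented.
import math

def gpce(force: float, vibration: float, ae_rms: float) -> float:
    # A GPCE is just an arithmetic expression over sensor features,
    # e.g. one that GP might plausibly evolve for tool-wear data:
    return 0.8 * force + math.log1p(vibration) * ae_rms - 5.0

def classify(sample: dict) -> str:
    # Sign-based decision rule: positive GPCE output => worn tool.
    value = gpce(sample["force"], sample["vibration"], sample["ae_rms"])
    return "worn" if value > 0 else "sharp"

test_set = [
    {"force": 9.0, "vibration": 2.0, "ae_rms": 1.5},   # invented samples
    {"force": 3.0, "vibration": 0.5, "ae_rms": 0.4},
]
print([classify(s) for s in test_set])   # ['worn', 'sharp']
```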