916 results for Approximate Sum Rule


Relevance: 20.00%

Abstract:

Expert systems have become increasingly popular because of their commercial importance. A rule-based system is a special type of expert system that consists of a set of 'if-then' rules and can be applied as a decision support system in many areas, such as healthcare, transportation and security. Rule-based systems can be constructed from both expert knowledge and data. This paper introduces the theory of rule-based systems, in particular the categorization and construction of such systems from a conceptual point of view, and describes rule-based systems for classification tasks in detail.
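
To make the 'if-then' structure concrete, here is a minimal sketch of a rule-based classifier in Python; the attributes, thresholds and class labels are purely illustrative and are not taken from the paper.

```python
# Minimal sketch of a rule-based classifier: each rule pairs a set of
# attribute tests (the "if" part) with a class label (the "then" part).
# Attribute names and thresholds are illustrative only.

rules = [
    ({"temperature": lambda v: v > 38.0, "cough": lambda v: v is True}, "flu"),
    ({"temperature": lambda v: v <= 38.0, "cough": lambda v: v is True}, "cold"),
]
default_class = "healthy"  # returned when no rule fires

def classify(instance, rules, default):
    """Return the class of the first rule whose conditions all hold."""
    for conditions, label in rules:
        if all(test(instance[attr]) for attr, test in conditions.items()):
            return label
    return default

print(classify({"temperature": 39.2, "cough": True}, rules, default_class))  # -> "flu"
```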

Relevance: 20.00%

Abstract:

The local speeds of object contours vary systematically with the cosine of the angle between the normal component of the local velocity and the global object motion direction. An array of Gabor elements whose speed changes with local spatial orientation in accordance with this pattern can appear to move as a single surface. The apparent direction of motion of plaids and Gabor arrays has variously been proposed to result from feature tracking, vector addition and vector averaging, in addition to the geometrically correct global velocity indicated by the intersection of constraints (IOC) solution. Here a new combination rule, the harmonic vector average (HVA), is introduced, together with a new algorithm for computing the IOC solution. The vector sum can be discounted as an integration strategy because its magnitude increases with the number of elements. The vector average over local vectors that vary in direction always underestimates the true global speed. The HVA, however, provides the correct global speed and direction for a sample of local velocities that is unbiased with respect to the global motion direction, as is the case for a simple closed contour. Over biased samples the HVA provides an aggregate velocity estimate that can still be combined through an IOC computation to give an accurate estimate of the global velocity, which is not true of the vector average. Psychophysical results for type II Gabor arrays show that perceived direction and speed fall close to the IOC solution for arrays with a wide range of orientations, but the IOC prediction fails as the mean orientation shifts away from the global motion direction and the orientation range narrows; in this case perceived velocity generally defaults to the HVA.
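
The abstract does not spell out the combination rules; the sketch below assumes the HVA is obtained by inverting each local velocity vector (v → v/|v|²), averaging, and inverting back, and uses a generic least-squares solution of the normal-velocity constraints for the IOC (not the new IOC algorithm the authors introduce). For an unbiased sample of contour normals it illustrates the claims above: the vector average underestimates the global speed, while the HVA and the IOC recover it.

```python
import numpy as np

# Assumed global object velocity for the demo: speed 2 along +x.
V = np.array([2.0, 0.0])

# Local normal directions sampled symmetrically about the motion direction
# (an "unbiased" sample); each element carries the normal component of V.
thetas = np.deg2rad(np.linspace(-60, 60, 9))
normals = np.column_stack([np.cos(thetas), np.sin(thetas)])
speeds = normals @ V                                  # local normal speeds |V|cos(theta)
local_v = speeds[:, None] * normals                   # local normal velocity vectors

# Vector average: underestimates the true global speed.
vector_average = local_v.mean(axis=0)

# Harmonic vector average (assumed form): invert each vector in the unit
# circle, take the arithmetic mean, then invert back.
inverted = local_v / np.sum(local_v**2, axis=1, keepdims=True)
mean_inv = inverted.mean(axis=0)
hva = mean_inv / np.sum(mean_inv**2)

# IOC via least squares on the constraints n_i . v = s_i (a generic solver,
# not the algorithm proposed in the paper).
ioc, *_ = np.linalg.lstsq(normals, speeds, rcond=None)

print("vector average:", vector_average)   # shorter than V
print("HVA:          ", hva)               # ~ [2, 0]
print("IOC:          ", ioc)               # ~ [2, 0]
```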

Relevance: 20.00%

Abstract:

According to dual-system accounts of English past-tense processing, regular forms are decomposed into their stem and affix (played = play + -ed) based on an implicit linguistic rule, whereas irregular forms (kept) are retrieved directly from the mental lexicon. In second language (L2) processing research, it has been suggested that L2 learners do not have rule-based decomposition abilities, so they process regular past-tense forms similarly to irregular ones (Silva & Clahsen 2008), without applying the morphological rule. The present study investigates morphological processing of regular and irregular verbs in Greek-English L2 learners and native English speakers. In a masked-priming experiment with regular and irregular prime-target verb pairs (played-play / kept-keep), native speakers showed priming effects for regular pairs, compared to unrelated pairs, indicating decomposition; conversely, L2 learners showed inhibitory effects. At the same time, both groups revealed priming effects for irregular pairs. We discuss these findings in the light of available theories of L2 morphological processing.

Relevance: 20.00%

Abstract:

Monte Carlo algorithms often aim to draw from a distribution π by simulating a Markov chain with transition kernel P such that π is invariant under P. However, there are many situations in which it is impractical or impossible to draw from the transition kernel P. This is the case, for instance, with massive datasets, where it is prohibitively expensive to calculate the likelihood, and for intractable likelihood models arising from, for example, Gibbs random fields, such as those found in spatial statistics and network analysis. A natural approach in these cases is to replace P by an approximation P̂. Using theory on the stability of Markov chains, we explore a variety of situations in which it is possible to quantify how 'close' the chain given by the transition kernel P̂ is to the chain given by P. We apply these results to several examples from spatial statistics and network analysis.
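
As a rough illustration of replacing P by an approximation P̂ (here a generic subsampled-likelihood scheme chosen for illustration, not a method taken from the paper), a random-walk Metropolis kernel for a Gaussian mean can be run with the exact log-likelihood and with a cheap estimate built from a random subsample of a large synthetic dataset:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=1.5, scale=1.0, size=50_000)   # synthetic "massive" dataset

def log_post(theta, y):
    """Unnormalised log posterior: flat prior, Gaussian likelihood with unit variance."""
    return -0.5 * np.sum((y - theta) ** 2)

def metropolis(y, n_iter=2000, step=0.05, subsample=None):
    """Random-walk Metropolis. If `subsample` is set, each acceptance step uses a
    random subset of the data scaled up to the full size -- an approximate kernel P-hat."""
    theta, chain = 0.0, []
    for _ in range(n_iter):
        if subsample is None:
            batch, scale = y, 1.0
        else:
            batch = rng.choice(y, size=subsample, replace=False)
            scale = len(y) / subsample
        prop = theta + step * rng.normal()
        log_alpha = scale * (log_post(prop, batch) - log_post(theta, batch))
        if np.log(rng.uniform()) < log_alpha:
            theta = prop
        chain.append(theta)
    return np.array(chain)

exact = metropolis(data)                       # chain under the exact kernel P
approx = metropolis(data, subsample=1_000)     # chain under the approximation P-hat
print(exact[-500:].mean(), approx[-500:].mean())
```

The approximate chain no longer has π exactly invariant, which is precisely the situation in which one wants to quantify how close the two chains are.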

Relevance: 20.00%

Abstract:

Confidence in projections of global-mean sea level rise (GMSLR) depends on an ability to account for GMSLR during the twentieth century. There are contributions from ocean thermal expansion, mass loss from glaciers and ice sheets, groundwater extraction, and reservoir impoundment. Progress has been made toward solving the “enigma” of twentieth-century GMSLR, which is that the observed GMSLR has previously been found to exceed the sum of estimated contributions, especially for the earlier decades. The authors propose the following: thermal expansion simulated by climate models may previously have been underestimated because volcanic forcing was not included in their control state; the rate of glacier mass loss was larger than previously estimated and was not smaller in the first half than in the second half of the century; the Greenland ice sheet could have made a positive contribution throughout the century; and groundwater depletion and reservoir impoundment, which are of opposite sign, may have been approximately equal in magnitude. It is possible to reconstruct the time series of GMSLR from the quantified contributions, apart from a constant residual term, which is small enough to be explained as a long-term contribution from the Antarctic ice sheet. The reconstructions account for the observation that the rate of GMSLR was not much larger during the last 50 years than during the twentieth century as a whole, despite the increasing anthropogenic forcing. Semiempirical methods for projecting GMSLR depend on the existence of a relationship between global climate change and the rate of GMSLR, but the implication of the authors' closure of the budget is that such a relationship is weak or absent during the twentieth century.

Relevance: 20.00%

Abstract:

This chapter considers the possible use in armed conflict of low-yield (also known as tactical) nuclear weapons. The Legality of the Threat or Use of Nuclear Weapons Advisory Opinion maintained that it is a cardinal principle that a State must never make civilians the object of attack and must consequently never use weapons that are incapable of distinguishing between civilian and military targets. Since international humanitarian law applies equally to any use of nuclear weapons, it is argued that there is no use of nuclear weapons that could avoid civilian casualties, particularly in view of the long-term health and environmental effects of such weaponry.

Relevance: 20.00%

Abstract:

Advances in hardware and software technologies make it possible to capture streaming data. The area of Data Stream Mining (DSM) is concerned with the analysis of these vast amounts of data as they are generated in real time. Data stream classification is one of the most important DSM techniques, allowing previously unseen data instances to be classified. Unlike traditional classifiers for static data, data stream classifiers need to adapt to concept changes (concept drift) in the stream in real time in order to reflect the most recent concept in the data as accurately as possible. A recent addition to the data stream classifier toolbox is eRules, which induces and updates a set of expressive rules that can easily be interpreted by humans. However, like most rule-based data stream classifiers, eRules exhibits poor computational performance when confronted with continuous attributes. In this work, we propose an approach for dealing with continuous data effectively and accurately in rule-based classifiers by using the Gaussian distribution as a heuristic for building rule terms on continuous attributes. We show, on the example of eRules, that incorporating our method for continuous attributes indeed speeds up the real-time rule induction process while maintaining a similar level of accuracy compared with the original eRules classifier. We term this new version of eRules, which incorporates our approach, G-eRules.
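
The abstract does not give the exact term-construction procedure; as a simplified sketch of the Gaussian heuristic, one can fit a Gaussian to the values of a continuous attribute within a class and turn it into an interval rule term around the class mean (the width factor k is an illustrative choice, not a value from eRules or G-eRules):

```python
import numpy as np

def gaussian_rule_term(values, labels, target_class, k=1.0):
    """Fit a Gaussian to the attribute values of the target class and return an
    interval term (low, high) = mean +/- k * std for a rule predicting that class.
    A simplified illustration of using the Gaussian distribution as a heuristic
    for rule terms on continuous attributes."""
    v = values[labels == target_class]
    mu, sigma = v.mean(), v.std(ddof=1)
    return mu - k * sigma, mu + k * sigma

# Synthetic example: one continuous attribute, two classes.
rng = np.random.default_rng(1)
values = np.concatenate([rng.normal(2.0, 0.5, 200), rng.normal(5.0, 0.7, 200)])
labels = np.array(["A"] * 200 + ["B"] * 200)

low, high = gaussian_rule_term(values, labels, "A")
print(f"IF attribute in [{low:.2f}, {high:.2f}] THEN class = A")
```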

Relevance: 20.00%

Abstract:

The most popular endgame tables (EGTs) documenting DTM (Depth to Mate) in chess endgames are those of Eugene Nalimov, but these do not recognise the FIDE 50-move rule (50mr). This paper marks the creation by the first author of EGTs for sub-6-man (s6m) chess and beyond which give DTM as affected by the ply count (pc). The results are put into the context of previous work recognising the 50mr and are compared with the original unmoderated DTM results. The work is also notable for being the first EGT-generation work to use the functional programming language Haskell.
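
As a deliberately crude illustration of how a DTM value interacts with the ply count pc under the 50-move rule (ignoring that captures and pawn moves reset the counter, and unrelated to the EGT construction described in the paper):

```python
def mate_claimable_before_draw(dtm_plies: int, pc: int, budget: int = 100) -> bool:
    """Crude check: can a forced mate needing `dtm_plies` further plies be delivered
    before the 50-move rule (100 plies without a pawn move or capture) allows a draw
    claim, given `pc` plies already on the counter?
    Simplification: assumes no zeroing (capture or pawn) moves occur in the line."""
    return pc + dtm_plies <= budget

print(mate_claimable_before_draw(dtm_plies=30, pc=60))   # True: 90 plies total
print(mate_claimable_before_draw(dtm_plies=50, pc=60))   # False: the defender can claim a draw
```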

Relevance: 20.00%

Abstract:

We report on the assembly of tumor necrosis factor receptor 1 (TNF-R1) prior to ligand activation and its ligand-induced reorganization at the cell membrane. We apply single-molecule localization microscopy to obtain quantitative information on receptor cluster sizes and copy numbers. Our data suggest a dimeric pre-assembly of TNF-R1, as well as receptor reorganization toward higher oligomeric states with stable populations comprising three to six TNF-R1. Our experimental results serve directly as input parameters for computational modeling of the ligand-receptor interaction. Simulations corroborate the experimental finding of higher-order oligomeric states. This work is a first demonstration of how quantitative super-resolution and advanced microscopy can be used for systems biology approaches at the single-molecule and single-cell level.


Relevance: 20.00%

Abstract:

This thesis draws on the work of Franz Neumann, a critical theorist associated with the early Frankfurt School, to evaluate liberal arguments about political legitimacy and to develop an original account of the justification for the liberal state.

Relevance: 20.00%

Abstract:

This paper investigates the feasibility of using approximate Bayesian computation (ABC) to calibrate and evaluate complex individual-based models (IBMs). As ABC evolves, various versions are emerging, but here we explore only the most accessible version, rejection-ABC. Rejection-ABC involves running models a large number of times, with parameters drawn randomly from their prior distributions, and then retaining the simulations closest to the observations. Although well established in some fields, whether ABC will work with ecological IBMs is still uncertain. Rejection-ABC was applied to an existing 14-parameter earthworm energy budget IBM for which the available data consist of body mass growth and cocoon production in four experiments. ABC was able to narrow the posterior distributions of seven parameters, estimating credible intervals for each. ABC's accepted values produced slightly better fits than the literature values did. The accuracy of the analysis was assessed using cross-validation and coverage, currently the best available tests. Of the seven parameters that were not narrowed, ABC revealed that three were correlated with other parameters, while the remaining four were found not to be estimable given the data available. It is often desirable to compare models to see whether all component modules are necessary. Here we used ABC model selection to compare the full model with a simplified version from which the earthworm's movement and much of the energy budget were removed. We were able to show that inclusion of the energy budget is necessary for a good fit to the data. We show how our methodology can inform future modelling cycles, and briefly discuss how more advanced versions of ABC may be applicable to IBMs. We conclude that ABC has the potential to represent uncertainty in model structure, parameters and predictions, and to embed the often complex process of optimizing an IBM's structure and parameters within an established statistical framework, thereby making the process more transparent and objective.
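
As a generic sketch of rejection-ABC as described above, with a toy one-parameter growth model standing in for the 14-parameter earthworm IBM and purely illustrative names and numbers: parameters are drawn from the prior, the model is simulated, and the runs closest to the observations are retained as an approximate posterior sample.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate(growth_rate, n_steps=20):
    """Toy stand-in for an individual-based model: noisy exponential body-mass growth."""
    mass = np.empty(n_steps)
    mass[0] = 1.0
    for t in range(1, n_steps):
        mass[t] = mass[t - 1] * (1 + growth_rate + 0.01 * rng.normal())
    return mass

observed = simulate(0.08)                                # stands in for the experimental data

# Rejection-ABC: sample from the prior, simulate, keep the closest runs.
n_sims, keep = 10_000, 100
prior_draws = rng.uniform(0.0, 0.2, size=n_sims)         # uniform prior on the growth rate
distances = np.array([np.linalg.norm(simulate(g) - observed) for g in prior_draws])
accepted = prior_draws[np.argsort(distances)[:keep]]      # approximate posterior sample

print("posterior mean:", accepted.mean())
print("95% credible interval:", np.percentile(accepted, [2.5, 97.5]))
```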