922 results for "Sum rule"
Abstract:
Expert systems have become increasingly popular because of their commercial importance. A rule-based system is a special type of expert system that consists of a set of 'if-then' rules and can be applied as a decision support system in many areas, such as healthcare, transportation and security. Rule-based systems can be constructed from both expert knowledge and data. This paper aims to introduce the theory of rule-based systems, in particular the categorization and construction of such systems from a conceptual point of view. It also introduces rule-based systems for classification tasks in detail.
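To make the 'if-then' structure described above concrete, here is a minimal sketch of a rule-based classifier. The rules, attribute names and the `classify` helper are illustrative assumptions, not taken from the paper.

```python
# A minimal sketch of a rule-based classifier: each rule pairs a set of
# attribute tests (the "if" part) with a class label (the "then" part).
# All names and rules here are hypothetical examples.

rules = [
    ({"temperature": lambda t: t > 38.0, "cough": lambda c: c}, "flu"),
    ({"temperature": lambda t: t <= 38.0}, "healthy"),
]

def classify(instance, rule_list, default="unknown"):
    """Return the label of the first rule whose tests all pass."""
    for conditions, label in rule_list:
        if all(test(instance[attr]) for attr, test in conditions.items()):
            return label  # the first matching rule fires
    return default

print(classify({"temperature": 39.2, "cough": True}, rules))  # -> flu
```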
Abstract:
The local speeds of object contours vary systematically with the cosine of the angle between the normal component of the local velocity and the global object motion direction. An array of Gabor elements whose speed changes with local spatial orientation in accordance with this pattern can appear to move as a single surface. The apparent direction of motion of plaids and Gabor arrays has variously been proposed to result from feature tracking, vector addition and vector averaging, in addition to the geometrically correct global velocity as indicated by the intersection of constraints (IOC) solution. Here a new combination rule, the harmonic vector average (HVA), is introduced, together with a new algorithm for computing the IOC solution. The vector sum can be discounted as an integration strategy because its magnitude grows with the number of elements. The vector average over local vectors that vary in direction always underestimates the true global speed. The HVA, however, provides the correct global speed and direction for a sample of local velocities that is unbiased with respect to the global motion direction, as is the case for a simple closed contour. The HVA over biased samples provides an aggregate velocity estimate that can still be combined through an IOC computation to give an accurate estimate of the global velocity, which is not true of the vector average. Psychophysical results for type II Gabor arrays show that perceived direction and speed fall close to the IOC solution for arrays with a wide range of orientations, but the IOC prediction fails as the mean orientation shifts away from the global motion direction and the orientation range narrows. In these cases perceived velocity generally defaults to the HVA.
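A sketch of one common formalization of the HVA: invert each local velocity vector's length (keeping its direction), take the ordinary vector average of the inverted vectors, then invert the result back. The sampling parameters below are illustrative assumptions; this is a sketch of the rule as described above, not the authors' code.

```python
import numpy as np

def invert(v):
    """Invert a 2-D vector's length while keeping its direction: v -> v / |v|^2."""
    return v / np.dot(v, v)

def harmonic_vector_average(vectors):
    """HVA sketch: invert each local velocity, take the ordinary vector
    average, then invert the result back."""
    inverted = np.array([invert(v) for v in vectors])
    return invert(inverted.mean(axis=0))

# On a rigidly translating contour, local normal speeds follow
# s_i = S * cos(theta_i), where theta_i is the angle between the local
# normal and the global motion direction (here +x) and S is the global speed.
S = 2.0                                   # assumed true global speed
thetas = np.linspace(-1.2, 1.2, 9)        # unbiased sample of normal directions
local_vs = np.array([[S * np.cos(t) * np.cos(t),
                      S * np.cos(t) * np.sin(t)] for t in thetas])

print(harmonic_vector_average(local_vs))  # ~[2.0, 0.0]: recovers global velocity
print(local_vs.mean(axis=0))              # plain vector average underestimates S
```

For this unbiased sample each inverted vector has x-component exactly 1/S, so their average inverts back to the true global velocity, while the plain vector average scales S by the mean of cos²(theta) and so falls short.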
Abstract:
According to dual-system accounts of English past-tense processing, regular forms are decomposed into their stem and affix (played = play + ed) based on an implicit linguistic rule, whereas irregular forms (kept) are retrieved directly from the mental lexicon. In second language (L2) processing research, it has been suggested that L2 learners lack rule-based decomposing abilities, so they process regular past-tense forms similarly to irregular ones (Silva & Clahsen 2008), without applying the morphological rule. The present study investigates morphological processing of regular and irregular verbs in Greek-English L2 learners and native English speakers. In a masked-priming experiment with regular and irregular prime-target verb pairs (played-play / kept-keep), native speakers showed priming effects for regular pairs relative to unrelated pairs, indicating decomposition; conversely, L2 learners showed inhibitory effects. At the same time, both groups showed priming effects for irregular pairs. We discuss these findings in the light of available theories on L2 morphological processing.
Abstract:
Confidence in projections of global-mean sea level rise (GMSLR) depends on an ability to account for GMSLR during the twentieth century. There are contributions from ocean thermal expansion, mass loss from glaciers and ice sheets, groundwater extraction, and reservoir impoundment. Progress has been made toward solving the “enigma” of twentieth-century GMSLR, which is that the observed GMSLR has previously been found to exceed the sum of estimated contributions, especially for the earlier decades. The authors propose the following: thermal expansion simulated by climate models may previously have been underestimated because volcanic forcing was not included in their control state; the rate of glacier mass loss was larger than previously estimated and was no smaller in the first half of the century than in the second; the Greenland ice sheet could have made a positive contribution throughout the century; and groundwater depletion and reservoir impoundment, which are of opposite sign, may have been approximately equal in magnitude. It is possible to reconstruct the time series of GMSLR from the quantified contributions, apart from a constant residual term that is small enough to be explained as a long-term contribution from the Antarctic ice sheet. The reconstructions account for the observation that the rate of GMSLR was not much larger during the last 50 years than during the twentieth century as a whole, despite the increasing anthropogenic forcing. Semiempirical methods for projecting GMSLR depend on the existence of a relationship between global climate change and the rate of GMSLR, but the implication of the authors' closure of the budget is that such a relationship is weak or absent during the twentieth century.
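One way to write the budget closure described above as an equation (the notation is introduced here for illustration and is not the authors' own):

$$\frac{dG}{dt} \approx \dot{E}_{\text{thermal}} + \dot{M}_{\text{glaciers}} + \dot{M}_{\text{Greenland}} + \underbrace{\left(\dot{M}_{\text{groundwater}} + \dot{M}_{\text{reservoirs}}\right)}_{\approx\,0\ \text{(opposite signs, similar magnitude)}} + r,$$

where $G$ is global-mean sea level and $r$ is the small constant residual rate attributable to a long-term Antarctic ice sheet contribution.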
Abstract:
This chapter considers the possible use in armed conflict of low-yield (also known as tactical) nuclear weapons. The Legality of the Threat or Use of Nuclear Weapons Advisory Opinion maintained that it is a cardinal principle that a State must never make civilians an object of attack and must consequently never use weapons that are incapable of distinguishing between civilian and military targets. Since international humanitarian law applies equally to any use of nuclear weapons, it is argued that there is no use of nuclear weapons that could avoid civilian casualties, particularly in view of the long-term health and environmental effects of such weaponry.
Abstract:
Advances in hardware and software technologies make it possible to capture streaming data. The area of Data Stream Mining (DSM) is concerned with the analysis of these vast amounts of data as they are generated in real time. Data stream classification is one of the most important DSM techniques, allowing previously unseen data instances to be classified. Unlike traditional classifiers for static data, data stream classifiers need to adapt to concept changes (concept drift) in the stream in real time in order to reflect the most recent concept in the data as accurately as possible. A recent addition to the data stream classifier toolbox is eRules, which induces and updates a set of expressive rules that can easily be interpreted by humans. However, like most rule-based data stream classifiers, eRules exhibits poor computational performance when confronted with continuous attributes. In this work, we propose an approach to deal with continuous data effectively and accurately in rule-based classifiers by using the Gaussian distribution as a heuristic for building rule terms on continuous attributes. We show, on the example of eRules, that incorporating our method for continuous attributes indeed speeds up the real-time rule induction process while maintaining a similar level of accuracy compared with the original eRules classifier. We have termed this new version of eRules, combined with our approach, G-eRules.
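A minimal sketch of the kind of Gaussian heuristic described above, assuming the idea is to fit a per-class normal distribution to a continuous attribute and derive an interval rule term covering the region where the class is dense. The function name, the 95% coverage choice and the data are illustrative assumptions, not the G-eRules implementation.

```python
import numpy as np
from scipy.stats import norm

def gaussian_rule_term(values, labels, target_class, coverage=0.95):
    """Fit a Gaussian to the target class's values for one continuous
    attribute and return an interval rule term covering the densest
    `coverage` fraction of that fit. Illustrative only."""
    v = values[labels == target_class]
    mu, sigma = v.mean(), v.std(ddof=1)
    tail = (1.0 - coverage) / 2.0
    lo, hi = norm.ppf([tail, 1.0 - tail], loc=mu, scale=sigma)
    return lo, hi  # rule term: IF lo <= attribute <= hi THEN target_class

# Hypothetical stream snapshot: one continuous attribute, two classes.
rng = np.random.default_rng(0)
values = np.concatenate([rng.normal(5.0, 1.0, 200), rng.normal(9.0, 1.5, 200)])
labels = np.array(["a"] * 200 + ["b"] * 200)
print(gaussian_rule_term(values, labels, "a"))  # interval around the class-"a" peak
```

The appeal of such a heuristic is that a mean and standard deviation can be maintained incrementally per class, so candidate rule terms come from two statistics rather than from sorting or discretizing every observed value.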
Abstract:
The most popular endgame tables (EGTs) documenting ‘DTM’ (depth to mate) in chess endgames are those of Eugene Nalimov, but these do not recognise the FIDE 50-move rule (‘50mr’). This paper marks the creation by the first author of EGTs for sub-6-man (s6m) chess and beyond which give DTM as affected by the ply count, pc. The results are put into the context of previous work recognising the 50mr and are compared with the original, unmoderated DTM results. The work is also notable for being the first EGT-generation work to use the functional programming language Haskell.
Abstract:
We report on the assembly of tumor necrosis factor receptor 1 (TNF-R1) prior to ligand activation and its ligand-induced reorganization at the cell membrane. We apply single-molecule localization microscopy to obtain quantitative information on receptor cluster sizes and copy numbers. Our data suggest a dimeric pre-assembly of TNF-R1, as well as receptor reorganization toward higher oligomeric states with stable populations comprising three to six TNF-R1. Our experimental results directly serve as input parameters for computational modeling of the ligand-receptor interaction. Simulations corroborate the experimental finding of higher-order oligomeric states. This work is a first demonstration of how quantitative, super-resolution and advanced microscopy can be used for systems biology approaches at the single-molecule and single-cell level.
Abstract:
This thesis draws on the work of Franz Neumann, a critical theorist associated with the early Frankfurt School, to evaluate liberal arguments about political legitimacy and to develop an original account of the justification for the liberal state.
Abstract:
Formal conceptions of the rule of law are popular among contemporary legal philosophers. Nonetheless, the coherence of accounts of the rule of law committed to these conceptions is sometimes fractured by elements harking back to substantive conceptions of the rule of law. I suggest that this may be because, at its origins, the ideal of the rule of law was substantive through and through. I also argue that those origins are older than is generally supposed. Most authors trace the ideas of the rule of law and natural law back to classical Greece, but I show that they are already recognisable, and intertwined, as far back as Homer. Because the founding moment of the tradition of western intellectual reflection on the rule of law placed concerns about substantive justice at the centre of the rule of law ideal, it may be hard for this ideal to shrug off its substantive content entirely. Shrugging it off may also be undesirable, given the rhetorical power of appeals to the rule of law. The rule of law means something quite radical in Homer; this meaning may provide a source of normative inspiration for contemporary reflection on the rule of law.