79 results for node removal rule
Abstract:
According to dual-system accounts of English past-tense processing, regular forms are decomposed into their stem and affix (played = play + ed) based on an implicit linguistic rule, whereas irregular forms (kept) are retrieved directly from the mental lexicon. In second language (L2) processing research, it has been suggested that L2 learners do not have rule-based decomposing abilities, so they process regular past-tense forms similarly to irregular ones (Silva & Clahsen 2008), without applying the morphological rule. The present study investigates morphological processing of regular and irregular verbs in Greek-English L2 learners and native English speakers. In a masked-priming experiment with regular and irregular prime-target verb pairs (played-play / kept-keep), native speakers showed priming effects for regular pairs, compared to unrelated pairs, indicating decomposition; conversely, L2 learners showed inhibitory effects. At the same time, both groups revealed priming effects for irregular pairs. We discuss these findings in the light of available theories on L2 morphological processing.
Abstract:
This chapter considers the possible use in armed conflict of low-yield (also known as tactical) nuclear weapons. The Legality of the Threat or Use of Nuclear Weapons Advisory Opinion maintained that it is a cardinal principle that a State must never make civilians an object of attack and must consequently never use weapons that are incapable of distinguishing between civilian and military targets. As international humanitarian law applies equally to any use of nuclear weapons, it is argued that there is no use of nuclear weapons that could spare civilian casualties, particularly in view of the long-term health and environmental effects of such weaponry.
Abstract:
Advances in hardware and software technologies make it possible to capture streaming data. The area of Data Stream Mining (DSM) is concerned with the analysis of these vast amounts of data as they are generated in real time. Data stream classification is one of the most important DSM techniques, allowing previously unseen data instances to be classified. Unlike traditional classifiers for static data, data stream classifiers need to adapt to concept changes (concept drift) in the stream in real time in order to reflect the most recent concept in the data as accurately as possible. A recent addition to the data stream classifier toolbox is eRules, which induces and updates a set of expressive rules that can easily be interpreted by humans. However, like most rule-based data stream classifiers, eRules exhibits poor computational performance when confronted with continuous attributes. In this work, we propose an approach to deal with continuous data effectively and accurately in rule-based classifiers by using the Gaussian distribution as a heuristic for building rule terms on continuous attributes. Using eRules as an example, we show that incorporating our method for continuous attributes indeed speeds up the real-time rule induction process while maintaining a level of accuracy similar to that of the original eRules classifier. We have termed this new version of eRules, incorporating our approach, G-eRules.
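The core idea of the Gaussian heuristic described in this abstract can be illustrated with a minimal sketch (in Python, not the authors' implementation; function names and the choice of k are assumptions): instead of testing every candidate split point on a continuous attribute, fit a normal distribution to the attribute values of the target class and derive a single interval rule term around the mean.

```python
# Minimal sketch of a Gaussian-derived rule term on a continuous
# attribute. Assumes the class's attribute values are roughly normal.
from statistics import mean, stdev

def gaussian_rule_term(values, k=2.0):
    """Return a (low, high) interval; for k=2 it covers ~95% of the
    values under a normality assumption."""
    mu, sigma = mean(values), stdev(values)
    return (mu - k * sigma, mu + k * sigma)

def term_covers(term, x):
    """True if attribute value x satisfies the interval rule term."""
    low, high = term
    return low <= x <= high

# Toy usage with hypothetical attribute values of one class:
class_values = [1.4, 1.3, 1.5, 1.4, 1.6, 1.2, 1.5]
term = gaussian_rule_term(class_values)
print(all(term_covers(term, v) for v in class_values))  # prints True
```

A single fitted interval like this replaces the exhaustive evaluation of split points, which is where the reported speed-up in rule induction comes from.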
Abstract:
The most popular endgame tables (EGTs) documenting Depth to Mate ('DTM') in chess endgames are those of Eugene Nalimov, but these do not recognise the FIDE 50-move rule ('50mr'). This paper marks the first author's creation of EGTs for sub-6-man (s6m) chess and beyond which give DTM as affected by the ply count (pc). The results are put into the context of previous work recognising the 50mr and are compared with the original, unmoderated DTM results. The work is also notable for being the first EGT generation work to use the functional programming language Haskell.
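The interaction between DTM and the ply count can be sketched as follows (a simplified Python illustration, not the authors' Haskell generator; the function name and the assumption that no zeroing moves occur on the winning line are mine). Under the 50-move rule a draw is claimable after 100 plies without a capture or pawn move, so a nominal mate in d plies is only secure if it fits within the remaining ply budget.

```python
# Illustrative moderation of a DTM value by the ply count (pc).
# Simplifying assumption: no captures or pawn moves reset pc on the
# winning line, so the budget is simply 100 - pc plies.
def dtm_with_50mr(dtm_plies, pc, budget=100):
    """Return DTM if the mate fits in the remaining ply budget,
    otherwise None (the nominal win is drawn under the 50-move rule)."""
    if dtm_plies is None:          # position is not a win at all
        return None
    remaining = budget - pc
    return dtm_plies if dtm_plies <= remaining else None

print(dtm_with_50mr(30, 60))   # 30: mate in 30 plies fits in 40 remaining
print(dtm_with_50mr(30, 80))   # None: only 20 plies left, the win is drawn
```

This is why the moderated EGTs must be indexed by pc as well as by position: the same position can be a win or a draw depending on how many plies have already elapsed.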
Abstract:
We report on the assembly of tumor necrosis factor receptor 1 (TNF-R1) prior to ligand activation and its ligand-induced reorganization at the cell membrane. We apply single-molecule localization microscopy to obtain quantitative information on receptor cluster sizes and copy numbers. Our data suggest a dimeric pre-assembly of TNF-R1, as well as receptor reorganization toward higher oligomeric states with stable populations comprising three to six TNF-R1. Our experimental results directly serve as input parameters for computational modeling of the ligand-receptor interaction. Simulations corroborate the experimental finding of higher-order oligomeric states. This work is a first demonstration of how quantitative super-resolution and advanced microscopy can be used for systems biology approaches at the single-molecule and single-cell level.
Abstract:
This thesis draws on the work of Franz Neumann, a critical theorist associated with the early Frankfurt School, to evaluate liberal arguments about political legitimacy and to develop an original account of the justification for the liberal state.
Abstract:
Contamination of the electroencephalogram (EEG) by artifacts greatly reduces the quality of the recorded signals. There is a need for automated artifact removal methods. However, such methods are rarely evaluated against one another via rigorous criteria, with results often presented based upon visual inspection alone. This work presents a comparative study of automatic methods for removing blink, electrocardiographic, and electromyographic artifacts from the EEG. Three methods are considered: wavelet-, blind source separation (BSS)-, and multivariate singular spectrum analysis (MSSA)-based correction. These are applied to data sets containing mixtures of artifacts. Metrics are devised to measure the performance of each method. The BSS method is seen to be the best approach for artifacts of high signal-to-noise ratio (SNR). By contrast, MSSA performs well at low SNRs, but at the expense of a large number of false positive corrections.
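One common performance metric in the spirit of those the abstract mentions is the SNR of a corrected signal relative to a clean reference. A minimal sketch (the function name and this exact metric definition are illustrative, not taken from the paper):

```python
# SNR in decibels of a corrected signal against a clean reference.
# The "noise" is the residual left after artifact correction.
import math

def snr_db(clean, corrected):
    """Return 10*log10(signal power / residual power), in dB."""
    n = len(clean)
    signal_power = sum(c * c for c in clean) / n
    noise_power = sum((c - r) ** 2 for c, r in zip(clean, corrected)) / n
    if noise_power == 0:
        return float("inf")       # perfect correction
    return 10 * math.log10(signal_power / noise_power)

clean = [1.0, -1.0, 1.0, -1.0]
noisy = [1.1, -0.9, 1.0, -1.0]
print(round(snr_db(clean, noisy), 1))  # prints 23.0
```

Metrics of this kind require simulated or semi-simulated data (where the clean signal is known), which is one reason rigorous comparisons of artifact removal methods are rarer than visual-inspection reports.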
Abstract:
A fully automated and online artifact removal method for the electroencephalogram (EEG) is developed for use in brain-computer interfacing. The method (FORCe) is based upon a novel combination of wavelet decomposition, independent component analysis, and thresholding. FORCe is able to operate on a small channel set during online EEG acquisition and does not require additional signals (e.g. electrooculogram signals). Evaluation of FORCe is performed offline on EEG recorded from 13 BCI participants with cerebral palsy (CP) and online with three healthy participants. The method outperforms the state-of-the-art automated artifact removal methods Lagged auto-mutual information clustering (LAMIC) and Fully automated statistical thresholding (FASTER), and is able to remove a wide range of artifact types including blink, electromyogram (EMG), and electrooculogram (EOG) artifacts.
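FORCe's exact pipeline is not reproduced here, but its thresholding ingredient can be sketched with the standard soft-thresholding operation used in wavelet-based artifact removal: coefficients whose magnitude exceeds a threshold (typically dominated by high-amplitude artifacts such as blinks) are shrunk toward zero. The threshold value and function name below are illustrative.

```python
# Soft thresholding of wavelet coefficients: a standard building block
# of wavelet-based artifact removal (not FORCe's specific rule).
def soft_threshold(coeffs, t):
    """Shrink each coefficient toward zero by t; zero out small ones."""
    out = []
    for c in coeffs:
        if c > t:
            out.append(c - t)
        elif c < -t:
            out.append(c + t)
        else:
            out.append(0.0)
    return out

print(soft_threshold([5.0, -0.2, 0.1, -3.0], 1.0))
# prints [4.0, 0.0, 0.0, -2.0]
```

After thresholding, the signal is reconstructed from the modified coefficients; combining this with independent component analysis lets a method target artifact-dominated components specifically rather than the whole recording.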
Abstract:
Understanding the factors that drive successful re-creation and restoration of lowland heaths is crucially important for achieving the long-term conservation of this threatened habitat type. In this study we investigated the changes in soil chemistry, plant community, and interactions between Calluna vulgaris and symbiotic ericoid mycorrhizas (ERM) that occurred when improved pasture was subjected to one of three treatments: (i) acidification with elemental sulphur; (ii) acidification with ferrous sulphate; (iii) removal of the topsoil. We found that the soil stripping treatment produced the greatest reduction in available phosphate but did not decrease soil pH. Conversely, acidification with elemental sulphur decreased pH but increased availability of phosphate and potentially toxic cations. The elemental sulphur treatment produced plant communities that most closely resembled those on surrounding heaths and acid grasslands. The most important driver was low pH and the concomitant increased availability of potentially toxic cations. Plant community development was found to be only weakly related to levels of available soil phosphate, particularly at low pH. The elemental sulphur treatment also produced the best germination and growth of C. vulgaris over 4–5 years. However, this treatment was found to inhibit the development of symbiotic relationships between C. vulgaris and ERM. This may affect the long-term persistence of re-created vegetation and its interactions with other components of heathland communities.
Abstract:
Formal conceptions of the rule of law are popular among contemporary legal philosophers. Nonetheless, the coherence of accounts of the rule of law committed to these conceptions is sometimes fractured by elements harkening back to substantive conceptions of the rule of law. I suggest that this may be because at its origins the ideal of the rule of law was substantive through and through. I also argue that those origins are older than is generally supposed. Most authors tend to trace the ideas of the rule of law and natural law back to classical Greece, but I show that they are already recognisable and intertwined as far back as Homer. Because the founding moment of the tradition of western intellectual reflection on the rule of law placed concerns about substantive justice at the centre of the rule of law ideal, it may be hard for this ideal to entirely shrug off its substantive content. It may be undesirable, too, given the rhetorical power of appeals to the rule of law. The rule of law means something quite radical in Homer; this meaning may provide a source of normative inspiration for contemporary reflections about the rule of law.