931 results for Rule-based techniques
Abstract:
Mitochondrial DNA (mtDNA) mutations are an important cause of genetic disease and have been proposed to play a role in the ageing process. Quantification of total mtDNA mutation load in ageing tissues is difficult as mutational events are rare in a background of wild-type molecules, and detection of individual mutated molecules is beyond the sensitivity of most sequencing-based techniques. The methods currently most commonly used to document the incidence of mtDNA point mutations in ageing include post-PCR cloning, single-molecule PCR and the random mutation capture assay. The mtDNA mutation load obtained by these different techniques varies by orders of magnitude, but direct comparison of the three techniques on the same ageing human tissue has not been performed. We assess the procedures and practicalities involved in each of these three assays and discuss the results obtained by investigation of mutation loads in colonic mucosal biopsies from ten human subjects.
Abstract:
A connection between a fuzzy neural network model and the mixture of experts network (MEN) modelling approach is established. Based on this linkage, two new neuro-fuzzy MEN construction algorithms are proposed to overcome the curse of dimensionality that is inherent in the majority of associative memory networks and/or other rule-based systems. The first construction algorithm employs a function selection manager module in an MEN system. The second construction algorithm is based on a new parallel learning algorithm in which each model rule is trained independently, for which the parameter convergence property of the new learning method is established. As with the first approach, an expert selection criterion is utilised in this algorithm. These two construction methods are equivalent in their effectiveness in overcoming the curse of dimensionality by reducing the dimensionality of the regression vector, but the latter has the additional computational advantage of parallel processing. The proposed algorithms are analysed for effectiveness, followed by numerical examples to illustrate their efficacy for some difficult data-based modelling problems.
Abstract:
This paper presents recent developments to a vision-based traffic surveillance system which relies extensively on the use of geometrical and scene context. Firstly, a highly parametrised 3-D model is reported, able to adopt the shape of a wide variety of different classes of vehicle (e.g. cars, vans, buses, etc.), and its subsequent specialisation to a generic car class which accounts for commonly encountered types of car (including saloon, hatchback and estate cars). Sample data collected from video images, by means of an interactive tool, have been subjected to principal component analysis (PCA) to define a deformable model having 6 degrees of freedom. Secondly, a new pose refinement technique using “active” models is described, able to recover both the pose of a rigid object and the structure of a deformable model; its performance is assessed in comparison with previously reported “passive” model-based techniques in the context of traffic surveillance. The new method is more stable and requires fewer iterations, especially when the number of free parameters increases, but shows somewhat poorer convergence. Typical applications for this work include robot surveillance and navigation tasks.
Abstract:
Traditionally, the representation of competencies has been very difficult using computer-based techniques. This paper introduces competencies, how they are represented, the related concept of competency frameworks, and the difficulties in using traditional ontology techniques to formalise them. A “vaguely” formalised framework, developed within the EU project TRACE, is presented. The framework can be used to represent different competencies and competency frameworks. Through a case study using an example from the IT sector, it is shown how these can be used by individuals and organisations to specify their individual competency needs. Furthermore, it is described how these representations are used for comparisons between different specifications, applying ontologies and ontology toolsets. The end result is a comparison that is not binary but ternary, providing “definite matches”, possible / partial matches, and “no matches” using a “traffic light” analogy.
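The ternary “traffic light” comparison described in this abstract can be sketched as follows. This is a minimal illustration only; the competency names and numeric levels are invented, not taken from the TRACE framework.

```python
# Hypothetical sketch of a ternary "traffic light" competency comparison:
# each required competency is matched as definite, partial, or none,
# rather than a binary yes/no.

def match_competency(required_level, offered_level):
    """Classify one competency match using the traffic-light analogy."""
    if offered_level >= required_level:
        return "green"   # definite match
    if offered_level > 0:
        return "amber"   # possible / partial match
    return "red"         # no match

def compare(required, offered):
    """Compare a required-competency specification against a profile."""
    return {name: match_competency(level, offered.get(name, 0))
            for name, level in required.items()}

result = compare({"java": 3, "sql": 2, "uml": 1},
                 {"java": 3, "sql": 1})
# -> {"java": "green", "sql": "amber", "uml": "red"}
```

The point of the three-valued result is that a partial match still carries useful information for an individual or organisation, which a binary comparison would discard.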
Abstract:
Dual-system models suggest that English past tense morphology involves two processing routes: rule application for regular verbs and memory retrieval for irregular verbs (Pinker, 1999). In second language (L2) processing research, Ullman (2001a) suggested that both verb types are retrieved from memory, but more recently Clahsen and Felser (2006) and Ullman (2004) argued that past tense rule application can be automatised with experience by L2 learners. To address this controversy, we tested highly proficient Greek-English learners with naturalistic or classroom L2 exposure, compared to native English speakers, in a self-paced reading task involving past tense forms embedded in plausible sentences. Our results suggest that, irrespective of the type of exposure, proficient L2 learners with extended L2 exposure apply rule-based processing.
Abstract:
This paper presents novel observer-based techniques for the estimation of flow demands in gas networks, from sparse pressure telemetry. A completely observable model is explored, constructed by incorporating difference equations that assume the flow demands are steady. Since the flow demands usually vary slowly with time, this is a reasonable approximation. Two techniques for constructing robust observers are employed: robust eigenstructure assignment and singular value assignment. These techniques help to reduce the effects of the system approximation. Modelling error may be further reduced by making use of known profiles for the flow demands. The theory is extended to deal successfully with the problem of measurement bias. The pressure measurements available are subject to constant biases which degrade the flow demand estimates, and such biases need to be estimated. This is achieved by constructing a further model variation that incorporates the biases into an augmented state vector, but now includes information about the flow demand profiles in a new form.
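The bias-augmentation idea in this abstract can be sketched with a scalar toy system: fold the constant measurement bias into an augmented state vector and let the observer estimate it alongside the state. The dynamics, gains and bias value below are invented for illustration; a real gas-network model has many states and sparse measurements.

```python
# Minimal sketch of a bias-augmented observer, reduced to one state.
# Plant: x[k+1] = a * x[k]; measurement: y[k] = x[k] + b, with b a
# constant bias modelled as b[k+1] = b[k] in the augmented state.

a = 0.9                  # invented plant dynamics
L_gain = (0.5, 0.2)      # invented observer gains for state and bias

def observer_step(x_hat, b_hat, y):
    """One update of the observer on the augmented state [x, b]."""
    innovation = y - (x_hat + b_hat)          # predicted measurement is x + b
    x_hat = a * x_hat + L_gain[0] * innovation
    b_hat = b_hat + L_gain[1] * innovation    # bias treated as constant
    return x_hat, b_hat

# Simulate: true state decays from 1.0; the sensor carries a 0.3 bias.
x, bias = 1.0, 0.3
x_hat, b_hat = 0.0, 0.0
for _ in range(500):
    y = x + bias                              # biased measurement
    x_hat, b_hat = observer_step(x_hat, b_hat, y)
    x = a * x
# x_hat and b_hat converge to the true state and bias respectively.
```

Without the augmented bias state, the constant offset would be absorbed into the state estimate and degrade it; estimating the bias explicitly is what the model variation in the paper achieves at network scale.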
Abstract:
This project is concerned with the way that illustrations, photographs, diagrams and graphs, and typographic elements interact to convey ideas on the book page. A framework for graphic description is proposed to elucidate this graphic language of ‘complex texts’. The model is built up from three main areas of study, with reference to a corpus of contemporary children’s science books. First, a historical survey puts the subjects for study in context. Then a multidisciplinary discussion of graphic communication provides a theoretical underpinning for the model; this leads to various proposals, such as the central importance of ratios and relationships among parts in creating meaning in graphic communication. Lastly, a series of trials in description contributes to the structure of the model itself. At the heart of the framework is an organising principle that integrates descriptive models from the fields of design, literary criticism, art history, and linguistics, among others, as well as novel categories designed specifically for book design. Broadly, design features are described in terms of elemental component parts (micro-level), larger groupings of these (macro-level), and finally in terms of overarching, ‘whole book’ qualities (meta-level). Various features of book design emerge at different levels; for instance, the presence of nested discursive structures, a form of graphic recursion in editorial design, is proposed at the macro-level. Across these three levels are the intersecting categories of ‘rule’ and ‘context’, offering different perspectives with which to describe graphic characteristics. Context-based features are contingent on social and cultural environment, the reader’s previous knowledge, and the actual conditions of reading; rule-based features relate to the systematic or codified aspects of graphic language.
The model aims to be a frame of reference for graphic description, of use in different forms of qualitative or quantitative research and as a heuristic tool in practice and teaching.
Abstract:
Individual differences in cognitive style can be characterized along two dimensions: ‘systemizing’ (S, the drive to analyze or build ‘rule-based’ systems) and ‘empathizing’ (E, the drive to identify another's mental state and respond to this with an appropriate emotion). Discrepancies between these two dimensions in one direction (S > E) or the other (E > S) are associated with sex differences in cognition: on average more males show an S > E cognitive style, while on average more females show an E > S profile. The neurobiological basis of these different profiles remains unknown. Since individuals may be typical or atypical for their sex, it is important to move away from the study of sex differences and towards the study of differences in cognitive style. Using structural magnetic resonance imaging we examined how neuroanatomy varies as a function of the discrepancy between E and S in 88 adult males from the general population. Selecting just males allows us to study discrepant E-S profiles in a pure way, unconfounded by other factors related to sex and gender. An increasing S > E profile was associated with increased gray matter volume in cingulate and dorsal medial prefrontal areas which have been implicated in processes related to cognitive control, monitoring, error detection, and probabilistic inference. An increasing E > S profile was associated with larger hypothalamic and ventral basal ganglia regions which have been implicated in neuroendocrine control, motivation and reward. These results suggest an underlying neuroanatomical basis linked to the discrepancy between these two important dimensions of individual differences in cognitive style.
Abstract:
A new record of sea surface temperature (SST) for climate applications is described. This record provides independent corroboration of global variations estimated from SST measurements made in situ. Infrared imagery from Along-Track Scanning Radiometers (ATSRs) is used to create a 20 year time series of SST at 0.1° latitude-longitude resolution, in the ATSR Reprocessing for Climate (ARC) project. A very high degree of independence of in situ measurements is achieved via physics-based techniques. Skin SST and SST estimated for 20 cm depth are provided, with grid cell uncertainty estimates. Comparison with in situ data sets establishes that ARC SSTs generally have a bias of order 0.1 K or smaller. The precision of the ARC SSTs is 0.14 K during 2003 to 2009, from three-way error analysis. Over the period 1994 to 2010, ARC SSTs are stable, with better than 95% confidence, to within 0.005 K yr−1 (demonstrated for tropical regions). The data set appears useful for cleanly quantifying interannual variability in SST and major SST anomalies. The ARC SST global anomaly time series is compared to the in situ-based Hadley Centre SST data set version 3 (HadSST3). Within known uncertainties in bias adjustments applied to in situ measurements, the independent ARC record and HadSST3 present the same variations in global marine temperature since 1996. Since the in situ observing system evolved significantly in its mix of measurement platforms and techniques over this period, ARC SSTs provide an important corroboration that HadSST3 accurately represents recent variability and change in this essential climate variable.
Abstract:
Numerical Weather Prediction (NWP) fields are used to assist the detection of cloud in satellite imagery. Simulated observations based on NWP are used within a framework based on Bayes' theorem to calculate a physically-based probability of each pixel within an imaged scene being clear or cloudy. Different thresholds can be set on the probabilities to create application-specific cloud masks. Here, this is done over both land and ocean using night-time (infrared) imagery. We use a validation dataset of difficult cloud detection targets for the Spinning Enhanced Visible and Infrared Imager (SEVIRI), achieving true skill scores of 87% and 48% for ocean and land, respectively, using the Bayesian technique, compared with 74% and 39%, respectively, for the threshold-based techniques associated with the validation dataset.
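The per-pixel Bayesian calculation described in this abstract can be sketched as follows. This is a minimal two-class illustration only; the prior and likelihood values are hypothetical, whereas the operational scheme derives them from NWP-simulated observations.

```python
# Minimal sketch of per-pixel Bayesian cloud masking (hypothetical numbers,
# not the operational SEVIRI algorithm).

def posterior_clear(prior_clear, likelihood_clear, likelihood_cloudy):
    """P(clear | observation) from Bayes' theorem for a two-class problem."""
    evidence = (likelihood_clear * prior_clear
                + likelihood_cloudy * (1.0 - prior_clear))
    return likelihood_clear * prior_clear / evidence

def cloud_mask(p_clear, threshold=0.5):
    """Application-specific mask: True flags the pixel as cloudy."""
    return p_clear < threshold

p = posterior_clear(prior_clear=0.7, likelihood_clear=0.9, likelihood_cloudy=0.2)
# p ≈ 0.913: the observation strongly favours a clear pixel here.
```

Varying the `threshold` argument is what produces the application-specific masks mentioned above: a conservative application might flag a pixel as cloudy whenever the clear probability falls below, say, 0.9 rather than 0.5.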
Abstract:
Numerical Weather Prediction (NWP) fields are used to assist the detection of cloud in satellite imagery. Simulated observations based on NWP are used within a framework based on Bayes' theorem to calculate a physically-based probability of each pixel within an imaged scene being clear or cloudy. Different thresholds can be set on the probabilities to create application-specific cloud masks. Here, the technique is shown to be suitable for daytime applications over land and sea, using visible and near-infrared imagery, in addition to thermal infrared. We use a validation dataset of difficult cloud detection targets for the Spinning Enhanced Visible and Infrared Imager (SEVIRI), achieving true skill scores of 89% and 73% for ocean and land, respectively, using the Bayesian technique, compared with 90% and 70%, respectively, for the threshold-based techniques associated with the validation dataset.
Abstract:
There are a range of studies based in the low carbon arena which use various ‘futures’-based techniques as ways of exploring uncertainties. These techniques range from ‘scenarios’ and ‘roadmaps’ through to ‘transitions’ and ‘pathways’, as well as ‘vision’-based techniques. The overall aim of the paper is therefore to compare and contrast these techniques to develop a simple working typology, with the further objective of identifying the implications of this analysis for RETROFIT 2050. Using recent examples of city-based and energy-based studies throughout, the paper compares and contrasts these techniques and finds that the distinctions between them have often been blurred in the field of low carbon. Visions, for example, have been used in both transition theory and futures/Foresight methods, and scenarios have also been used in transition-based studies as well as futures/Foresight studies. Moreover, Foresight techniques that capture expert knowledge and map existing knowledge to develop a set of scenarios and roadmaps, which can in turn inform the development of transitions and pathways, can not only help to overcome any ‘disconnections’ that may exist between the social and technical lenses through which such future trajectories are mapped, but also promote a strong ‘co-evolutionary’ content.
Abstract:
Recently, Corpus Linguistics has become a popular research tool in the field of German as a Foreign Language. However, little attention has been paid to the teaching and learning potential that corpora and corpus-based teaching offer. This paper seeks to demonstrate some of the ways in which corpus-based techniques can be used for teaching purposes, even by those who have little experience in Corpus Linguistics. The focus will be on teaching and learning German for Academic Purposes in German Studies abroad.
Abstract:
Trypanosoma cruzi and Trypanosoma rangeli are human-infective blood parasites, largely restricted to Central and South America. They also infect a wide range of wild and domestic mammals and are transmitted by numerous species of triatomine bugs. There are significant overlaps in the host and geographical ranges of both species. The two species consist of a number of distinct phylogenetic lineages. A range of PCR-based techniques have been developed to differentiate between these species and to assign their isolates into lineages. However, the existence of at least six and five lineages within T. cruzi and T. rangeli, respectively, makes identification of the full range of isolates difficult and time-consuming. Here we have applied fluorescent fragment length barcoding (FFLB) to the problem of identifying and genotyping T. cruzi, T. rangeli and other South American trypanosomes. This technique discriminates species on the basis of length polymorphism of regions of the rDNA locus. FFLB was able to differentiate many trypanosome species known from South American mammals: T. cruzi cruzi, T. cruzi marinkellei, T. dionisii-like, T. evansi, T. lewisi, T. rangeli, T. theileri and T. vivax. Furthermore, all five T. rangeli lineages and many T. cruzi lineages could be identified, except the hybrid lineages TcV and TcVI, which could not be distinguished from lineages III and II respectively. This method also allowed identification of mixed infections of T. cruzi and T. rangeli lineages in naturally infected triatomine bugs. The ability of FFLB to genotype multiple lineages of T. cruzi and T. rangeli, together with other trypanosome species, using the same primer sets is an advantage over other currently available techniques. Overall, these results demonstrate that FFLB is a useful method for species diagnosis, genotyping and understanding the epidemiology of American trypanosomes.