948 results for "local sequence alignment problem"


Relevance: 30.00%

Abstract:

Edges are key points of information in visual scenes. One important class of models supposes that edges correspond to the steepest parts of the luminance profile, implying that they can be found as peaks and troughs in the response of a gradient (1st derivative) filter, or as zero-crossings in the 2nd derivative (ZCs). We tested those ideas using a stimulus that has no local peaks of gradient and no ZCs, at any scale. The stimulus profile is analogous to the Mach ramp, but it is the luminance gradient (not the absolute luminance) that increases as a linear ramp between two plateaux; the luminance profile is a blurred triangle-wave. For all image-blurs tested, observers marked edges at or close to the corner points in the gradient profile, even though these were not gradient maxima. These Mach edges correspond to peaks and troughs in the 3rd derivative. Thus Mach edges are inconsistent with many standard edge-detection schemes, but are nicely predicted by a recent model that finds edge points with a 2-stage sequence of 1st then 2nd derivative operators, each followed by a half-wave rectifier.
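
The final sentence above describes a concrete signal-processing pipeline, so a minimal numerical sketch may help make it concrete. The Python below is an illustration only, not the authors' implementation; the stimulus construction and the simple peak picker are assumptions. It applies a 1st-derivative stage and a 2nd-derivative stage, each followed by a half-wave rectifier, to a 1D luminance profile whose gradient ramps linearly between two plateaux.

import numpy as np

def half_wave_rectify(x):
    # Keep positive responses only; negative values are set to zero.
    return np.maximum(x, 0.0)

def mach_edge_response(luminance, dx=1.0):
    # Stage 1: 1st derivative followed by a half-wave rectifier.
    stage1 = half_wave_rectify(np.gradient(luminance, dx))
    # Stage 2: 2nd derivative of the rectified signal, again rectified.
    return half_wave_rectify(np.gradient(np.gradient(stage1, dx), dx))

def local_peaks(r):
    # Illustrative peak picker: samples strictly greater than both neighbours.
    return np.where((r[1:-1] > r[:-2]) & (r[1:-1] > r[2:]))[0] + 1

# Luminance gradient: plateau, linear ramp, plateau (no blur applied here);
# the luminance itself is the cumulative sum of this gradient profile.
grad = np.concatenate([np.full(300, 0.2), np.linspace(0.2, 1.0, 400), np.full(300, 1.0)])
luminance = np.cumsum(grad)
edge_positions = local_peaks(mach_edge_response(luminance))
# Edges of the opposite polarity would be found by repeating with negated signals.

The surviving peak lies at the corner point where the gradient ramp begins, that is, at a peak of the 3rd derivative of the luminance profile, in line with the Mach edges reported above.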

Relevance: 30.00%

Abstract:

This article examines the adoption, by the New Labour government, of a mixed communities approach to the renewal of disadvantaged neighbourhoods in England. It argues that while there are continuities with previous policy, the new approach represents a more neoliberal policy turn in three respects: its identification of concentrated poverty as the problem; its faith in market-led regeneration; and its alignment with a new urban policy agenda in which cities are gentrified and remodelled as sites for capital accumulation through entrepreneurial local governance. The article then draws on evidence from the early phases of the evaluation of the mixed community demonstration projects to explore how the new policy approach is playing out at a local level, where it is layered upon existing policies, politics and institutional relationships. Tensions between neighbourhood and strategic interests, community and capital are evident as the local projects attempt neighbourhood transformation, while seeking to protect the rights and interests of existing residents. Extensive community consultation efforts run parallel with emergent governance structures, in which local state and capital interests combine and communities may effectively be disempowered. Policies and structures are still evolving and it is not yet entirely clear how these tensions will be resolved, especially in the light of a collapsing housing market, increased poverty and demand for affordable housing, and a shortage of private investment.

Relevance: 30.00%

Abstract:

This research is concerned with the development of distributed real-time systems, in which software is used for the control of concurrent physical processes. These distributed control systems are required to periodically coordinate the operation of several autonomous physical processes, with the property of an atomic action. The implementation of this coordination must be fault-tolerant if the integrity of the system is to be maintained in the presence of processor or communication failures. Commit protocols have been widely used to provide this type of atomicity and ensure consistency in distributed computer systems. The objective of this research is the development of a class of robust commit protocols, applicable to the coordination of distributed real-time control systems. Extended forms of the standard two-phase commit protocol, which provide fault-tolerant and real-time behaviour, were developed. Petri nets are used for the design of the distributed controllers, and to embed the commit protocol models within these controller designs. This composition of controller and protocol model allows the analysis of the complete system in a unified manner. A common problem for Petri net based techniques is state space explosion; a modular approach to both design and analysis helps to cope with this problem. Although extensions to Petri nets that allow module construction exist, the modularisation is generally restricted to the specification, and analysis must be performed on the (flat) detailed net. The Petri net designs for the type of distributed systems considered in this research are both large and complex. The top-down, bottom-up and hybrid synthesis techniques that are used to model large systems in Petri nets are considered. A hybrid approach to Petri net design for a restricted class of communicating processes is developed. Designs produced using this hybrid approach are modular and allow re-use of verified modules. In order to use this form of modular analysis, it is necessary to project an equivalent but reduced behaviour onto the modules used. These projections conceal events local to modules that are not essential for the purpose of analysis. To generate the external behaviour, each firing sequence of the subnet is replaced by an atomic transition internal to the module, and the firing of these transitions transforms the input and output markings of the module. Thus local events are concealed through the projection of the external behaviour of modules. This hybrid design approach preserves properties of interest, such as boundedness and liveness, while the systematic concealment of local events keeps the state space manageable. The approach presented in this research is particularly suited to distributed systems, as the underlying communication model is used as the basis for the interconnection of modules in the design procedure. This hybrid approach is applied to the Petri net based design and analysis of distributed controllers for two industrial applications that incorporate the robust, real-time commit protocols developed. Temporal Petri nets, which combine Petri nets and temporal logic, are used to capture and verify causal and temporal aspects of the designs in a unified manner.
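
The protocols in this thesis extend the standard two-phase commit protocol; for orientation, the sketch below shows the basic coordinator and participant roles of plain 2PC in Python. The in-process Participant class and the absence of logging and timeouts are simplifying assumptions, and the sketch does not reflect the fault-tolerant, real-time extensions or the Petri net models described above.

from enum import Enum

class Vote(Enum):
    COMMIT = "commit"
    ABORT = "abort"

class Participant:
    # Hypothetical in-process stand-in for a remote resource manager.
    def __init__(self, name, will_commit=True):
        self.name, self.will_commit, self.state = name, will_commit, "initial"

    def prepare(self):
        # Phase 1 vote; a real participant would also force-write its vote to stable storage.
        self.state = "prepared"
        return Vote.COMMIT if self.will_commit else Vote.ABORT

    def commit(self):
        self.state = "committed"

    def abort(self):
        self.state = "aborted"

def two_phase_commit(participants):
    # Phase 1 (voting): every participant must vote COMMIT, otherwise abort globally.
    votes = [p.prepare() for p in participants]
    decision = Vote.COMMIT if all(v is Vote.COMMIT for v in votes) else Vote.ABORT
    # Phase 2 (decision): broadcast the outcome so all participants act atomically.
    for p in participants:
        p.commit() if decision is Vote.COMMIT else p.abort()
    return decision

print(two_phase_commit([Participant("controller-A"), Participant("controller-B", will_commit=False)]))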

Relevance: 30.00%

Abstract:

We conducted a detailed study of a case of linguistic talent in the context of autism spectrum disorder, specifically Asperger syndrome. I.A. displays language strengths at the level of morphology and syntax. Yet, despite this grammar advantage, processing of figurative language and inferencing based on context present a problem for him. The morphology advantage for I.A. is consistent with the weak central coherence (WCC) account of autism. On this account, the presence of a local processing bias is evident in the ways in which autistic individuals solve common problems, such as assessing similarities between objects and finding common patterns, and may therefore provide an advantage in some cognitive tasks compared with typical individuals. We extend the WCC account to language and provide evidence for a connection between the local processing bias and the acquisition of morphology and grammar.

Relevance: 30.00%

Abstract:

The G-protein coupled receptors, or GPCRs, are simultaneously one of the largest and one of the most multi-functional protein families known to modern-day molecular bioscience. From a drug discovery and pharmaceutical industry perspective, the GPCRs constitute one of the most commercially and economically important groups of proteins known. The GPCRs undertake numerous vital metabolic functions and interact with a hugely diverse range of small and large ligands. Many different methodologies have been developed to classify the GPCRs efficiently and accurately. These range from motif-based techniques to machine learning, as well as a variety of alignment-free techniques based on the physicochemical properties of sequences. We review here the available methodologies for the classification of GPCRs. Part of this work focuses on how we have tried to build the intrinsically hierarchical nature of sequence relations, implicit within the family, into an adaptive approach to classification. Importantly, we also allude to some of the key innate problems in developing an effective approach to classifying the GPCRs: the lack of sequence similarity between the six classes that comprise the GPCR family, and the low sequence similarity to other family members evinced by many newly revealed members of the family.
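
As an illustration of the alignment-free, property-based end of the methodological spectrum surveyed above, the sketch below turns a protein sequence into a fixed-length feature vector (amino acid composition plus simple hydropathy statistics). The choice of features is an assumption made here for illustration, not a method advocated in the review; such vectors could then be passed to any standard classifier.

import numpy as np

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
# Kyte-Doolittle hydropathy scale, a standard physicochemical property of residues.
HYDROPATHY = {"A": 1.8, "C": 2.5, "D": -3.5, "E": -3.5, "F": 2.8, "G": -0.4, "H": -3.2,
              "I": 4.5, "K": -3.9, "L": 3.8, "M": 1.9, "N": -3.5, "P": -1.6, "Q": -3.5,
              "R": -4.5, "S": -0.8, "T": -0.7, "V": 4.2, "W": -0.9, "Y": -1.3}

def alignment_free_features(sequence):
    # Fixed-length vector: 20 composition fractions plus mean and spread of hydropathy.
    residues = [a for a in sequence.upper() if a in HYDROPATHY]
    comp = np.array([residues.count(a) for a in AMINO_ACIDS], float) / max(len(residues), 1)
    hydro = np.array([HYDROPATHY[a] for a in residues])
    return np.concatenate([comp, [hydro.mean(), hydro.std()]])

# Example: a short (hypothetical) receptor fragment mapped to a 22-dimensional vector.
print(alignment_free_features("MNGTEGPNFYVPFSNKTGVV").shape)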

Relevance: 30.00%

Abstract:

We examined the effect of grouping by the alignment of implicit axes on the perception of multiple shapes, using a patient (GK) who shows simultanagnosia as part of Bálint's syndrome. Five experiments demonstrated that: (1) GK was better able to judge the orientation of a global configuration if the constituent local shapes were aligned with their major axes than if they were aligned with their edges; (2) this axis information was used implicitly, since GK was unable to discriminate between configurations of axis-aligned and edge-aligned shapes; (3) GK's sensitivity to axis-alignment persisted even when the orientations of local shapes were kept constant, indicating some form of cooperative effect between the local elements; (4) axis-alignment of shapes also facilitated his ability to discriminate single-item from multi-item configurations; (5) the effect of axis-alignment could be attributed, at least partially, to the degree to which there was matching between the orientations of local shapes and the global configuration. Taken together, the results suggest that axis-based grouping can support the selection of multiple objects.

Relevance: 30.00%

Abstract:

The retrieval of wind vectors from satellite scatterometer observations is a non-linear inverse problem. A common approach to solving inverse problems is to adopt a Bayesian framework and to infer the posterior distribution of the parameters of interest given the observations by using a likelihood model relating the observations to the parameters, and a prior distribution over the parameters. We show how Gaussian process priors can be used efficiently with a variety of likelihood models, using local forward (observation) models and direct inverse models for the scatterometer. We present an enhanced Markov chain Monte Carlo method to sample from the resulting multimodal posterior distribution. We go on to show how the computational complexity of the inference can be controlled by using a sparse, sequential Bayes algorithm for estimation with Gaussian processes. This helps to overcome the most serious barrier to the use of probabilistic, Gaussian process methods in remote sensing inverse problems, which is the prohibitively large size of the data sets. We contrast the sampling results with the approximations that are found by using the sparse, sequential Bayes algorithm.
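
A compact sketch of the general machinery described above may be useful: an unnormalised log posterior that combines a Gaussian process prior with a non-linear observation model, explored with a basic random-walk Metropolis sampler. The toy forward model, kernel, noise level and step size are all assumptions for illustration; the paper's scatterometer likelihood models, enhanced MCMC scheme and sparse sequential Bayes algorithm are considerably more elaborate.

import numpy as np

rng = np.random.default_rng(0)
n = 20
x = np.linspace(0.0, 1.0, n)           # locations of the latent field

def rbf_kernel(x, lengthscale=0.2, variance=1.0):
    d = x[:, None] - x[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

K = rbf_kernel(x) + 1e-6 * np.eye(n)    # Gaussian process prior covariance (with jitter)
K_inv = np.linalg.inv(K)

def forward(f):
    return np.sin(f)                    # assumed non-linear observation model

f_true = rng.multivariate_normal(np.zeros(n), K)
y = forward(f_true) + 0.1 * rng.normal(size=n)   # noisy synthetic observations

def log_posterior(f, noise_std=0.1):
    log_prior = -0.5 * f @ K_inv @ f                             # zero-mean GP prior (up to constants)
    log_lik = -0.5 * np.sum((y - forward(f)) ** 2) / noise_std ** 2
    return log_prior + log_lik

# Random-walk Metropolis: propose a perturbed field and accept with the usual ratio.
f = np.zeros(n)
for _ in range(5000):
    proposal = f + 0.05 * rng.normal(size=n)
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(f):
        f = proposal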

Relevance: 30.00%

Abstract:

Solving many scientific problems requires effective regression and/or classification models for large high-dimensional datasets. Experts from these problem domains (e.g. biologists, chemists, financial analysts) have insights into the domain which can be helpful in developing powerful models, but they need a modelling framework that helps them to use these insights. Data visualisation is an effective technique for presenting data and eliciting feedback from the experts. A single global regression model can rarely capture the full behavioural variability of a huge multi-dimensional dataset. Instead, local regression models, each focused on a separate area of input space, often work better since the behaviour of different areas may vary. Classical local models such as Mixture of Experts segment the input space automatically, which is not always effective and does not involve the domain experts in guiding a meaningful segmentation of the input space. In this paper we address this issue by allowing domain experts to interactively segment the input space using data visualisation. The segmentation obtained is then used to develop effective local regression models.
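
A minimal sketch of the local-modelling idea described above: once each training point carries an expert-assigned segment label (here simply an integer, however it was obtained from the visualisation), a separate linear model is fitted per segment and predictions are routed by segment membership. The toy data and the linear form of the local models are illustrative assumptions, not the paper's method.

import numpy as np

def fit_local_models(X, y, segments):
    # Fit one least-squares linear model per expert-defined segment.
    models = {}
    for s in np.unique(segments):
        mask = segments == s
        Xs = np.hstack([X[mask], np.ones((mask.sum(), 1))])   # add intercept column
        coef, *_ = np.linalg.lstsq(Xs, y[mask], rcond=None)
        models[s] = coef
    return models

def predict_local(models, X, segments):
    # Each point is predicted by the local model of its own segment.
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return np.array([Xb[i] @ models[s] for i, s in enumerate(segments)])

# Toy usage: two segments with different local behaviour.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 2))
segments = (X[:, 0] > 0).astype(int)                  # stands in for an expert-drawn split
y = np.where(segments == 1, 3 * X[:, 0], -2 * X[:, 1]) + 0.05 * rng.normal(size=200)
models = fit_local_models(X, y, segments)
y_hat = predict_local(models, X, segments)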

Relevance: 30.00%

Abstract:

Purpose – The purpose of this paper is to consider hierarchical control as a mode of governance, and to analyse the extent of control exhibited by central government over local government through the best value (BV) and comprehensive performance assessment (CPA) performance regimes. Design/methodology/approach – This paper utilises Ouchi's framework and, specifically, his articulation of bureaucratic or hierarchical control in the move towards achievement of organisational objectives. Hierarchical control may be inferred from the extent of "command and control" by central government, use of rewards and sanctions, and alignment to government priorities and discrimination of performance. Findings – CPA represents a more sophisticated performance regime than BV in the governance of local authorities by central government. In comparison to BV, CPA involved less scope for dialogue with local government prior to introduction, closer inspection of and direction of support toward poorer performing authorities, and more alignment to government priorities in the weightings attached to service blocks. Originality/value – The paper focuses upon the hierarchic/bureaucratic mode of governance as articulated by Ouchi and expands on this mode in order to analyse shifts in performance regimes in the public sector.

Relevance: 30.00%

Abstract:

In earlier work the authors suggested a logical-level description of classes that allows the solution of various pattern recognition problems to be reduced to the solution of a sequence of problems of the same type but of lower dimension. Here, conditions under which such level descriptions can be used effectively are proposed.

Relevance: 30.00%

Abstract:

Purpose – The purpose of this paper is to explore the importance of host country networks and organisation of production in the context of international technology transfer that accompanies foreign direct investment (FDI). Design/methodology/approach – The empirical analysis is based on unbalanced panel data covering Japanese firms active in two-digit manufacturing sectors over a seven-year period. Given the self-selection problem affecting past sectoral-level studies, using firm-level panel data is a prerequisite to provide robust empirical evidence. Findings – While Japan is thought of as being a technologically advanced country, the results show that vertical productivity spillovers from FDI occur in Japan, but they are sensitive to technological differences between domestic firms and the idiosyncratic Japanese institutional network. FDI in vertically organised keiretsu sectors generates inter-industry spillovers through backward and forward linkages, while FDI within sectors linked to vertical keiretsu activities adversely affects domestic productivity. Overall, our results suggest that the role of vertical keiretsu is more prevalent than that of horizontal keiretsu. Originality/value – Japan’s industrial landscape has been dominated by institutional clusters or networks of inter-firm organisations through reciprocated, direct and indirect ties. However, interactions between inward investors and such institutionalised networks in the host economy are seldom explored. The role and characteristics of local business groups, in the form of keiretsu networks, have been investigated to determine the scale and scope of spillovers from inward FDI to Japanese establishments. This conceptualisation depends on the institutional mechanism and the market structure through which host economies absorb and exploit FDI.

Relevance: 30.00%

Abstract:

This thesis describes the design and development of an eye alignment/tracking system which allows self alignment of the eye's optical axis with a measurement axis. Eye alignment is an area of research largely overlooked, yet it is a fundamental requirement in the acquisition of clinical data from the eye. New trends in the ophthalmic market, which favour portable hand-held apparatus, and the application of ophthalmic measurements in areas other than vision care have brought eye alignment under new scrutiny. Ophthalmic measurements taken in hand-held devices without a clinician present require alignment in an entirely new set of circumstances, demanding a novel solution. In order to solve this problem, the research has drawn upon eye tracking technology to monitor the eye, and a principle of self alignment to perform alignment correction. A hand-held device naturally lends itself to the patient performing alignment; thus a technique has been designed to communicate raw eye tracking data to the user in a manner which allows the user to make the necessary corrections. The proposed technique is a novel methodology in which misalignment relative to the eye's optical axis can be quantified, corrected and evaluated. The technique uses Purkinje Image tracking to monitor the eye's movement as well as the orientation of the optical axis. The use of two sets of Purkinje Images allows quantification of the eye's physical parameters needed for accurate Purkinje Image tracking, negating the need for prior anatomical data. An instrument employing the methodology was subsequently prototyped and validated, allowing a sample group to achieve self alignment of their optical axis with an imaging axis within 16.5-40.8 s, and with a rotational precision of 0.03-0.043° (95% confidence intervals). By encompassing all these factors, the technique facilitates self alignment from an unaligned position on the visual axis to an aligned position on the optical axis. The consequence of this is that ophthalmic measurements, specifically pachymetric measurements, can be made in the absence of an optician, allowing the use of ophthalmic instrumentation and measurements in health professions other than vision care.

Relevance: 30.00%

Abstract:

It is proved in [1], [2] that in odd-dimensional spaces any uniform decay of the local energy implies that it must decay exponentially. We extend this result to even-dimensional spaces and to more general perturbations (including the transmission problem), showing that any uniform decay of the local energy implies that it must decay like O(t^(-2n)), where t ≫ 1 is the time and n is the space dimension.
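
Written schematically in LaTeX, with notation chosen here for illustration rather than taken from the paper (E_R(t) denoting the energy of the solution restricted to a ball of radius R, and p(t) a uniform decay rate), the statement has the following form:

% Illustrative notation; a schematic restatement of the even-dimensional result above.
\[
  E_R(t) \le p(t)\,E(0) \ \text{ with } \ p(t) \to 0
  \quad\Longrightarrow\quad
  E_R(t) \le C_R\, t^{-2n}\, E(0), \qquad t \gg 1,
\]
where $n$ is the space dimension.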

Relevance: 30.00%

Abstract:

Formal grammars can be used for describing complex repeatable structures such as DNA sequences. In this paper, we describe the structural composition of DNA sequences using a context-free stochastic L-grammar. L-grammars are a special class of parallel grammars that can model the growth of living organisms, e.g. plant development, and the morphology of a variety of organisms. We believe that parallel grammars can also be used for modeling genetic mechanisms and sequences such as promoters. Promoters are short regulatory DNA sequences located upstream of a gene. Detection of promoters in DNA sequences is important for successful gene prediction. Promoters can be recognized by certain patterns that are conserved within a species, but there are many exceptions, which makes promoter recognition a complex problem. We replace the problem of promoter recognition by the induction of context-free stochastic L-grammar rules, which are later used for the structural analysis of promoter sequences. L-grammar rules are derived automatically from the Drosophila and vertebrate promoter datasets using a genetic programming technique, and their fitness is evaluated using a Support Vector Machine (SVM) classifier. The artificial promoter sequences generated using the derived L-grammar rules are analyzed and compared with natural promoter sequences.
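
To make the parallel-rewriting idea above concrete, the sketch below applies one style of stochastic L-grammar derivation over a DNA alphabet: at every step all symbols are rewritten simultaneously, with each replacement drawn from a probability distribution over productions. The rules shown are invented for illustration; the paper's rules are induced automatically from the promoter datasets by genetic programming.

import random

# Hypothetical stochastic L-grammar: symbol -> list of (production, probability) pairs.
RULES = {
    "A": [("AT", 0.6), ("A", 0.4)],
    "T": [("TA", 0.5), ("TATA", 0.5)],   # a TATA-box-like expansion, purely for flavour
    "G": [("GC", 0.7), ("G", 0.3)],
    "C": [("CG", 0.7), ("C", 0.3)],
}

def rewrite_step(sequence, rng):
    # One parallel derivation step: every symbol is rewritten at the same time.
    out = []
    for symbol in sequence:
        productions, weights = zip(*RULES.get(symbol, [(symbol, 1.0)]))
        out.append(rng.choices(productions, weights=weights)[0])
    return "".join(out)

def derive(axiom, steps, seed=0):
    rng = random.Random(seed)
    sequence = axiom
    for _ in range(steps):
        sequence = rewrite_step(sequence, rng)
    return sequence

print(derive("TAGC", steps=4))   # an artificial sequence generated by the grammar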

Relevance: 30.00%

Abstract:

Computing the similarity between two protein structures is a crucial task in molecular biology, and has been extensively investigated. Many protein structure comparison methods can be modeled as maximum weighted clique problems in specific k-partite graphs, referred to here as alignment graphs. In this paper we present both a new integer programming formulation for solving such clique problems and a dedicated branch and bound algorithm for solving the maximum cardinality clique problem. Both approaches have been integrated into VAST, a software tool for aligning protein 3D structures that is widely used at the National Center for Biotechnology Information and whose original clique solver uses the well-known Bron and Kerbosch algorithm (BK). Our computational results on real protein alignment instances show that our branch and bound algorithm is up to 116 times faster than BK.
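
For reference, the sketch below is a textbook Bron and Kerbosch enumeration with pivoting, restricted here to return a maximum-cardinality clique; it illustrates the baseline BK solver mentioned above, not the paper's integer programming formulation or its branch and bound algorithm, and the adjacency-set representation is an assumption.

def max_clique_bron_kerbosch(adj):
    # adj: dict mapping vertex -> set of neighbours (undirected graph).
    # Returns one maximum-cardinality clique found by Bron-Kerbosch with pivoting.
    best = set()

    def expand(R, P, X):
        nonlocal best
        if not P and not X:
            if len(R) > len(best):          # R is a maximal clique; keep the largest seen
                best = set(R)
            return
        # Pivot on a vertex with many neighbours in P to prune branches.
        pivot = max(P | X, key=lambda u: len(adj[u] & P))
        for v in list(P - adj[pivot]):
            expand(R | {v}, P & adj[v], X & adj[v])
            P.remove(v)
            X.add(v)

    expand(set(), set(adj), set())
    return best

# Toy usage: a 5-vertex graph whose maximum clique is {0, 1, 2}.
adj = {0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1}, 3: {1, 4}, 4: {3}}
print(max_clique_bron_kerbosch(adj))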