978 results for Normalization constraint


Relevance:

20.00%

Publisher:

Abstract:

The aim of this research project is to draw on accounts of experiences of border crossing and regulation at the Canada/U.S. border at Niagara in order to illuminate the dynamics of differentiation and inequality at this site. The research is informed by claims that the world is turning into a global village due to transnational flows of technology, information, capital and people. Much of the available literature on globalization shows that while the transfer of technology, information, and capital is enhanced, the transnational movement of people is both facilitated and constrained in complex and unequal ways. In this project, the workings of facilitation and constraint were explored through an analysis of ten interviews with people who had spent a substantial portion of their childhood (e.g. 5 years) in a Canadian border community. The interviewees were, at the time of the research, between the ages of 19 and 25. Because most of the respondents were 'white' Canadians of working- to upper-middle-class status, my focus was to explore how 'whiteness' as privilege may translate into enhanced movement across borders and how 'white' people may internalize and enjoy this privilege while often denying its reality. I was also interested in how inequality is perceived, understood, and legitimated by these relatively privileged people. My analysis of the ten accounts of border crossing and regulation suggests that differentially situated people experience border crossing differently. An important finding is that while relatively privileged border crossers perceived and often problematized differential treatment based on external factors such as physical appearance, and especially race, most did not challenge such treatment but rather saw it as acceptable. These findings are located within newer literature that addresses the increasing securitization of borders and migration in western societies.

Relevance:

20.00%

Publisher:

Abstract:

Qualitative spatial reasoning (QSR) is an important field of AI that deals with qualitative aspects of spatial entities. Regions and their relationships are described in qualitative terms instead of numerical values. This approach models human reasoning about such entities more closely than other approaches. Relationships between regions that we encounter in daily life are normally formulated in natural language. For example, one can outline one's room plan to an expert by indicating which rooms should be connected to each other. Mereotopology, as an area of QSR, combines mereology, topology and algebraic methods. As mereotopology plays an important role in region-based theories of space, our focus is on one of the most widely referenced formalisms for QSR, the region connection calculus (RCC). RCC is a first-order theory based on a primitive connectedness relation, a binary symmetric relation satisfying some additional properties. Using this relation we can define a set of basic binary relations which have the property of being jointly exhaustive and pairwise disjoint (JEPD), which means that between any two spatial entities exactly one of the basic relations holds. Basic reasoning can then be done by using the composition operation on relations, whose results are stored in a composition table. Relation algebras (RAs) have become a central tool for spatial reasoning in the area of QSR. These algebras are based on equational reasoning, which can be used to derive further relations between regions in a given situation. Each of these algebras describes the relations between regions up to a certain degree of detail. In this thesis we use the method of splitting atoms in an RA in order to reproduce known algebras such as RCC15 and RCC25 systematically and to generate new algebras, and hence a more detailed description of regions, beyond RCC25.
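
As a rough illustration of composition-table reasoning (a sketch for this listing, not code from the thesis), the following Python fragment composes a small, illustrative subset of RCC8-style base relations; RCC8 itself has eight base relations and its composition table covers every pair of them.

# Minimal sketch of composition-table reasoning over an illustrative subset of
# RCC8-style base relations: DC (disconnected), EC (externally connected),
# PO (partial overlap), TPP (tangential proper part), NTPP (non-tangential
# proper part), EQ (equal). The table fragment below is not the full calculus.
ALL = frozenset({"DC", "EC", "PO", "TPP", "NTPP", "EQ"})

# COMP[(R, S)] is the set of base relations T such that R(a, b) and S(b, c)
# allow T(a, c). Missing entries default to ALL ("no information") in this sketch.
COMP = {
    ("EQ", "EQ"): {"EQ"},
    ("NTPP", "NTPP"): {"NTPP"},
    ("TPP", "TPP"): {"TPP", "NTPP"},
    ("TPP", "NTPP"): {"NTPP"},
    ("NTPP", "TPP"): {"NTPP"},
}

def compose(r, s):
    """Compose two (possibly disjunctive) relations given as sets of base relations."""
    out = set()
    for a in r:
        for b in s:
            out |= COMP.get((a, b), ALL)
    return frozenset(out)

# Example: if X is a tangential proper part of Y, and Y of Z, then X is a
# (tangential or non-tangential) proper part of Z.
print(compose({"TPP"}, {"TPP"}))  # frozenset({'TPP', 'NTPP'}) (order may vary)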

Relevance:

20.00%

Publisher:

Abstract:

It has often been assumed that a country's tax level, tax structure progressivity and after-tax income distribution are chosen by voters subject only to their budget constraints. This paper argues that at certain income levels voters' decisions may be constrained by bureaucratic corruption. The theoretical arguments are developed in a setting in which asymmetry limits the capacity of the fiscal system to generate revenues by means of direct taxes. This hypothesis is tested with a sample of international data by means of a simultaneous equation model. The distortions resulting from corruption are captured through their effects on a latent variable defined as the overall fiscal structure. Evidence is found of causality running from this latent variable to the level of taxes and the degree of after-tax inequality.

Relevance:

20.00%

Publisher:

Abstract:

This thesis presents a study, across several areas of theoretical computer science, of computational models combining finite automata with arithmetic constraints. We focus on questions of decidability, expressiveness and closure, while opening the study to complexity, logic, algebra and applications. The study is presented through four research articles. The first article, Affine Parikh Automata, continues Klaedtke and Ruess's study of Parikh automata and defines generalizations and restrictions of them. The Parikh automaton is a starting point of this thesis; we show that this model of computation is equivalent to the constrained automaton, which we define as an automaton that accepts a word only if the number of times each transition is taken satisfies an arithmetic constraint. This model extends naturally to the affine Parikh automaton, which applies an affine operation to a set of registers whenever a transition is taken. We also study the Parikh automaton on letters: an automaton that accepts a word only if the number of times each letter occurs in it satisfies an arithmetic constraint. The second article, Bounded Parikh Automata, studies the bounded languages of Parikh automata. A language is bounded if there exist words w_1, w_2, ..., w_k such that every word of the language can be written as w_1...w_1 w_2...w_2 ... w_k...w_k. These languages are important in applied domains and usually enjoy good theoretical properties. We show that in the context of bounded languages, determinism does not affect the expressiveness of Parikh automata. The third article, Unambiguous Constrained Automata, introduces unambiguous constrained automata, that is, constrained automata with only one accepting path per word recognized by the automaton. We show that this model combines better expressiveness and better closure properties than the deterministic constrained automaton. The problem of deciding whether the language of an unambiguous constrained automaton is regular is shown to be decidable. The fourth article, Algebra and Complexity Meet Constrained Automata, presents a study of the algebraic representations admitted by constrained automata and affine Parikh automata. From these characterizations we derive expressiveness and complexity results. We also show that certain classical hypotheses in computational complexity are related to separation and non-closure results for affine Parikh automata. The thesis concludes by pointing toward possible further developments, through a number of open problems.
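
To make the notion of a constrained automaton concrete, here is a minimal Python sketch with an invented automaton and constraint, not the thesis's formal definitions: the automaton accepts a word only if the vector counting how often each transition is taken satisfies an arithmetic constraint (here, equality of two counts, which yields the non-regular language of words with as many a's as b's).

# Minimal sketch (illustrative, not the thesis's formal definition) of a
# constrained automaton: a finite automaton that accepts a word only if the
# vector of transition counts satisfies an arithmetic constraint.
from collections import Counter

# Deterministic automaton over {a, b}: a single state 0, initial and accepting.
DELTA = {
    (0, "a"): 0,   # transition t_a
    (0, "b"): 0,   # transition t_b
}
INITIAL, ACCEPTING = 0, {0}

def constraint(counts):
    # Example arithmetic constraint: the word has as many a's as b's.
    return counts[(0, "a")] == counts[(0, "b")]

def accepts(word):
    state, counts = INITIAL, Counter()
    for symbol in word:
        key = (state, symbol)
        if key not in DELTA:
            return False
        counts[key] += 1
        state = DELTA[key]
    return state in ACCEPTING and constraint(counts)

print(accepts("abba"))  # True:  2 a's, 2 b's
print(accepts("aab"))   # False: counts differ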

Relevance:

20.00%

Publisher:

Abstract:

We critically discuss relaxation experiments in magnetic systems that can be characterized in terms of an energy barrier distribution, showing that proper normalization of the relaxation data is needed whenever curves corresponding to different temperatures are to be compared. We show how these normalization factors can be obtained from experimental data by using the T ln(t/t0) scaling method without making any assumptions about the nature of the energy barrier distribution. The validity of the procedure is tested using a ferrofluid of Fe3O4 particles.
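
For reference, the scaling variable behind the T ln(t/t0) method can be stated compactly, assuming thermally activated (Arrhenius) relaxation over a distribution of energy barriers; the notation below is generic rather than taken from the paper.

% Arrhenius relaxation time over a barrier E with attempt time t_0:
%   \tau(E) = t_0 \exp(E / k_B T).
% At time t and temperature T, the barriers currently relaxing are those with
% E near the scaling variable E_c, so curves taken at different temperatures
% collapse onto one master curve when plotted against T ln(t/t_0), provided
% each curve is properly normalized.
\begin{equation}
  E_c(t, T) = k_B T \ln\!\left(\frac{t}{t_0}\right)
\end{equation}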

Relevance:

20.00%

Publisher:

Abstract:

In this text, we present two stereo-based head tracking techniques along with a fast 3D model acquisition system. The first tracking technique is a robust implementation of stereo-based head tracking designed for interactive environments with uncontrolled lighting. We integrate fast face detection and drift reduction algorithms with a gradient-based stereo rigid motion tracking technique. Our system can automatically segment and track a user's head under large rotation and illumination variations. Precision and usability of this approach are compared with previous tracking methods for cursor control and target selection in both desktop and interactive room environments. The second tracking technique is designed to improve the robustness of head pose tracking for fast movements. Our iterative hybrid tracker combines constraints from the ICP (Iterative Closest Point) algorithm and normal flow constraint. This new technique is more precise for small movements and noisy depth than ICP alone, and more robust for large movements than the normal flow constraint alone. We present experiments which test the accuracy of our approach on sequences of real and synthetic stereo images. The 3D model acquisition system we present quickly aligns intensity and depth images, and reconstructs a textured 3D mesh. 3D views are registered with shape alignment based on our iterative hybrid tracker. We reconstruct the 3D model using a new Cubic Ray Projection merging algorithm which takes advantage of a novel data structure: the linked voxel space. We present experiments to test the accuracy of our approach on 3D face modelling using real-time stereo images.
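
As background to the hybrid formulation, the two kinds of constraints it combines are commonly written as follows; this is a generic weighted least-squares formulation, not necessarily the exact objective used in the text.

% Normal flow (brightness change) constraint at pixel j, with spatial image
% gradient (I_x, I_y), temporal derivative I_t, and motion-induced image
% velocity (u_j, v_j):  I_x u_j + I_y v_j + I_t = 0.
% ICP point-to-plane residual for a depth point p_i matched to q_i with
% surface normal n_i under the rigid motion (R, t):
%   r_i = n_i^T (R p_i + t - q_i).
% A hybrid tracker can estimate (R, t) by minimizing a weighted sum of both
% kinds of residuals (lambda balances the two terms):
\begin{equation}
  \min_{R,\,t}\;
  \sum_i \big(\mathbf{n}_i^{\top}(R\,\mathbf{p}_i + \mathbf{t} - \mathbf{q}_i)\big)^2
  \;+\; \lambda \sum_j \big(I_x u_j + I_y v_j + I_t\big)^2
\end{equation}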

Relevance:

20.00%

Publisher:

Abstract:

The underlying assumptions for interpreting the meaning of data often change over time, which further complicates the problem of semantic heterogeneities among autonomous data sources. As an extension to the COntext INterchange (COIN) framework, this paper introduces the notion of temporal context as a formalization of the problem. We represent temporal context as a multi-valued method in F-Logic; however, only one value is valid at any point in time, the determination of which is constrained by temporal relations. This representation is then mapped to an abductive constraint logic programming framework with temporal relations being treated as constraints. A mediation engine that implements the framework automatically detects and reconciles semantic differences at different times. We articulate that this extended COIN framework is suitable for reasoning on the Semantic Web.
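
As a toy illustration of the temporal-context idea (invented names and data, not the COIN implementation), the Python sketch below stores several values of a context attribute together with their validity intervals and returns the single value valid at a queried instant:

# Toy sketch: an attribute holds several values over time, but only one is
# valid at any given instant, selected by temporal constraints.
from datetime import date

# (valid_from, valid_until, value): e.g., the currency scale used by a source.
PRICE_SCALE_CONTEXT = [
    (date(1990, 1, 1), date(1999, 12, 31), "thousands"),
    (date(2000, 1, 1), date(2999, 12, 31), "millions"),
]

def context_value(history, when):
    """Return the single value whose validity interval contains `when`."""
    matches = [v for (start, end, v) in history if start <= when <= end]
    assert len(matches) == 1, "temporal constraints must make the value unique"
    return matches[0]

# A mediator would use this kind of lookup when reconciling data reported at
# different times.
print(context_value(PRICE_SCALE_CONTEXT, date(1995, 6, 1)))  # thousands
print(context_value(PRICE_SCALE_CONTEXT, date(2005, 6, 1)))  # millions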

Relevance:

20.00%

Publisher:

Abstract:

The estimation of camera egomotion is a well-established problem in computer vision. Many approaches have been proposed based on both the discrete and the differential epipolar constraint. The discrete case is mainly used in self-calibrated stereoscopic systems, whereas the differential case deals with a single moving camera. The article surveys several methods for mobile robot egomotion estimation, comparing them on more than 0.5 million synthetic data samples. Results from real data are also given.
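
For reference, the two constraints mentioned can be written in their standard textbook forms (not taken from the article): the discrete epipolar constraint between corresponding homogeneous image points, and the differential (continuous) epipolar constraint relating an image point, its image velocity, and the camera's linear and angular velocities.

% Discrete epipolar constraint between corresponding points x and x' in two
% views related by rotation R and translation t, with essential matrix E:
\begin{equation}
  \mathbf{x}'^{\top} E\, \mathbf{x} = 0, \qquad E = [\mathbf{t}]_{\times} R
\end{equation}
% Differential (continuous) epipolar constraint for a single moving camera
% with linear velocity v and angular velocity \omega, image point x and image
% velocity \dot{x} (one standard formulation):
\begin{equation}
  \dot{\mathbf{x}}^{\top} [\mathbf{v}]_{\times}\, \mathbf{x}
  \;+\; \mathbf{x}^{\top} [\boldsymbol{\omega}]_{\times} [\mathbf{v}]_{\times}\, \mathbf{x} = 0
\end{equation}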

Relevance:

20.00%

Publisher:

Abstract:

The speed of fault isolation is crucial for the design and reconfiguration of fault tolerant control (FTC). In this paper the fault isolation problem is stated as a constraint satisfaction problem (CSP) and solved using constraint propagation techniques. The proposed method is based on constraint satisfaction techniques and on refining the uncertainty space of interval parameters. In comparison with other approaches based on adaptive observers, the major advantage of the presented method is that isolation is fast even when taking into account uncertainty in parameters, measurements and model errors, and without requiring a monotonicity assumption. In order to illustrate the proposed approach, a case study of a nonlinear dynamic system is presented.
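
As a toy illustration of interval-based constraint propagation (not the paper's algorithm or case study), the Python sketch below contracts the uncertainty interval of a single model parameter with each consistent measurement and reports an inconsistency, i.e., an isolated fault, when the interval becomes empty:

# Toy model y = a * u with uncertain parameter a in [a_lo, a_hi]. Each
# measurement (u, y) with tolerance tol constrains a to [(y-tol)/u, (y+tol)/u]
# (u > 0 assumed for simplicity); intersecting these intervals refines the
# uncertainty space, and an empty intersection isolates a fault.

def propagate(a_interval, samples, tol=0.0):
    """Contract the interval for `a` using measurements (u, y); return None
    if some measurement is inconsistent with the fault-free model."""
    a_lo, a_hi = a_interval
    for u, y in samples:
        if u == 0:
            continue
        lo, hi = (y - tol) / u, (y + tol) / u
        a_lo, a_hi = max(a_lo, lo), min(a_hi, hi)
        if a_lo > a_hi:
            return None  # empty interval: behaviour contradicts the model
    return (a_lo, a_hi)

nominal = (1.8, 2.2)                                   # prior uncertainty on a
ok = propagate(nominal, [(1.0, 2.05), (2.0, 4.02)], tol=0.1)
faulty = propagate(nominal, [(1.0, 3.5)], tol=0.1)
print(ok)      # contracted interval: (1.96, 2.06)
print(faulty)  # None -> inconsistent with the fault-free model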

Relevance:

20.00%

Publisher:

Abstract:

Background: Plasmodium vivax is one of the five species causing malaria in human beings, affecting around 391 million people annually. The development of an anti-malarial vaccine has been proposed as an alternative for controlling this disease. However, its development has been hampered by allele-specific responses produced by the high genetic diversity shown by some parasite antigens. Evaluating these antigens' genetic diversity is thus essential when designing a completely effective vaccine.
Methods: The gene sequences of Plasmodium vivax p12 (pv12) and p38 (pv38), obtained from field isolates in Colombia, were used for evaluating haplotype polymorphism and distribution by population genetics analysis. The evolutionary forces generating the observed variation pattern were also determined.
Results: Both pv12 and pv38 were shown to have low genetic diversity. The neutral model could not be discarded for pv12, whilst polymorphism in pv38 was maintained by balancing selection restricted to the gene's 5′ region. Both encoded proteins seemed to have functional/structural constraints due to the presence of s48/45 domains, which were seen to be highly conserved.

Relevance:

20.00%

Publisher:

Abstract:

Regarding the standardization of psychological assessment instruments, that is, the construction of normative references for interpreting a test, different procedures can be carried out under both Classical Test Theory (CTT) and Item Response Theory (IRT). In the latter case (IRT), one test can be taken as a reference, so that its standardization can be used to transfer its cut-off point to another instrument. Based on this, the present study aimed to provide a cutoff score for the Baptista Depression Scale - Adult Version (EBADEP-A) through norm-transfer procedures based on the Center for Epidemiologic Studies – Depression Scale (CES-D). The EBADEP-A showed good score distribution and ability to discriminate depressive symptoms, and in the sample, consisting of Brazilian college students, a cutoff score of 32 points was obtained. It is emphasized that this is an exploratory and preliminary study, and further analyses with clinical samples are suggested so that these results can be corroborated or challenged.