45 results for Relative complexity
Abstract:
We say the endomorphism problem is solvable for an element W in a free group F if it can be decided effectively whether, given U in F, there is an endomorphism Φ of F sending W to U. This work analyzes an approach due to C. Edmunds and improved by C. Sims. We prove that, when W is a two-generator word, this approach yields an efficient algorithm for the endomorphism problem: it solves the problem in time polynomial in the length of U. This result gives a polynomial-time algorithm for solving, in free groups, two-variable equations in which all the variables occur on one side of the equality and all the constants on the other side.
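To make the decision problem concrete, here is a minimal brute-force sketch in Python for the rank-2 case F(x, y): it enumerates candidate images for the two generators up to a caller-supplied length bound and checks whether substitution followed by free reduction yields U. All names are illustrative assumptions, the exhaustive search is exponential, and completeness of the particular length bound is not claimed; this is not the polynomial-time algorithm analyzed in the paper.

```python
from itertools import product

def reduce_word(w):
    """Freely reduce a word over {'x','X','y','Y'} (capitals denote inverses)."""
    out = []
    for c in w:
        if out and out[-1] == c.swapcase():
            out.pop()
        else:
            out.append(c)
    return ''.join(out)

def substitute(w, images):
    """Apply the endomorphism sending each generator to images[gen]; an inverse
    letter is sent to the inverse (reversed, case-swapped) of that image."""
    pieces = [images[c.lower()] if c.islower() else images[c.lower()].swapcase()[::-1] for c in w]
    return reduce_word(''.join(pieces))

def endomorphism_exists_bruteforce(W, U, max_len=None):
    """Exhaustively search for an endomorphism of F(x, y) sending W to U,
    with generator images bounded in length by max_len (default: len(U))."""
    alphabet = 'xXyY'
    max_len = len(U) if max_len is None else max_len
    candidates = ['']
    for L in range(1, max_len + 1):
        candidates += [''.join(t) for t in product(alphabet, repeat=L)]
    candidates = [c for c in candidates if reduce_word(c) == c]  # keep reduced words only
    target = reduce_word(U)
    for ix in candidates:
        for iy in candidates:
            if substitute(W, {'x': ix, 'y': iy}) == target:
                return True
    return False

print(endomorphism_exists_bruteforce('xyXY', 'xyXY'))  # True: the identity endomorphism works
print(endomorphism_exists_bruteforce('xyXY', 'x'))     # False here: x is not a commutator (heuristic "no" under the bound)
```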
Abstract:
"Vegeu el resum a l'inici del document del fitxer adjunt."
Abstract:
This paper develops the link between poverty and inequality by focussing on a class of poverty indices (some of them well-known) which aggregate normative concerns for absolute and relative deprivation. The indices are distinguished by a parameter that captures the ethical sensitivity of poverty measurement to "exclusion" or "relative-deprivation" aversion. We also show how the indices can be readily used to predict the impact of growth on poverty. An illustration using LIS data finds that the United States shows more relative deprivation than Denmark and Belgium whatever the percentiles considered, but that overall deprivation comparisons of the four countries considered will generally depend on the intensity of the ethical concern for relative deprivation. The impact of growth on poverty is also seen to depend on the presence of, and on the attention granted to, concerns over relative deprivation.
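As a loose numerical illustration of the ingredients involved, and not of the specific indices of the paper, the sketch below computes each person's relative deprivation as the mean income shortfall with respect to richer individuals (a common Yitzhaki-style formalisation) and mixes it with censored absolute poverty gaps through an assumed sensitivity parameter; all names and the aggregation rule are illustrative assumptions.

```python
import numpy as np

def relative_deprivation(incomes):
    """Mean shortfall of each person's income relative to everyone richer (0 for the richest)."""
    y = np.asarray(incomes, dtype=float)
    n = len(y)
    return np.array([np.sum(np.maximum(y - yi, 0.0)) / n for yi in y])

def illustrative_poverty_index(incomes, poverty_line, epsilon=0.5):
    """A purely illustrative aggregate mixing absolute gaps and relative deprivation;
    epsilon plays the role of a sensitivity (relative-deprivation aversion) parameter."""
    y = np.asarray(incomes, dtype=float)
    gaps = np.maximum(poverty_line - y, 0.0) / poverty_line   # censored absolute poverty gaps
    rel = relative_deprivation(y) / np.mean(y)                # normalised relative deprivation
    return float(np.mean((1.0 - epsilon) * gaps + epsilon * rel))

incomes = [5, 8, 12, 20, 40]
print(illustrative_poverty_index(incomes, poverty_line=10, epsilon=0.0))  # pure absolute gaps
print(illustrative_poverty_index(incomes, poverty_line=10, epsilon=0.5))  # mixed absolute/relative
```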
Abstract:
Consider a Riemannian manifold equipped with an infinitesimal isometry. For this setup, a unified treatment is provided, solely in the language of Riemannian geometry, of techniques in reduction, linearization, and stability of relative equilibria. In particular, for mechanical control systems, an explicit characterization is given for the manner in which reduction by an infinitesimal isometry, and linearization along a controlled trajectory "commute." As part of the development, relationships are derived between the Jacobi equation of geodesic variation and concepts from reduction theory, such as the curvature of the mechanical connection and the effective potential. As an application of our techniques, fiber and base stability of relative equilibria are studied. The paper also serves as a tutorial of Riemannian geometric methods applicable in the intersection of mechanics and control theory.
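For reference, the Jacobi equation of geodesic variation mentioned above is the standard second-order equation satisfied by a variation field J along a geodesic γ; this is textbook Riemannian geometry rather than a formula specific to the paper:

```latex
\frac{D^{2}J}{dt^{2}} + R\big(J,\dot{\gamma}\big)\dot{\gamma} = 0,
```

where D/dt denotes covariant differentiation along γ and R is the Riemann curvature tensor.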
Abstract:
We prove a double commutant theorem for hereditary subalgebras of a large class of C*-algebras, partially resolving a problem posed by Pedersen [8]. Double commutant theorems originated with von Neumann, whose seminal result evolved into an entire field now called von Neumann algebra theory. Voiculescu proved a C*-algebraic double commutant theorem for separable subalgebras of the Calkin algebra. We prove a similar result for hereditary subalgebras which holds for arbitrary corona C*-algebras. (It is not clear how generally Voiculescu's double commutant theorem holds.)
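For context, von Neumann's original double commutant theorem (standard background, not a result of this paper) states that for a unital *-subalgebra M of B(H), with S' denoting the set of bounded operators commuting with every element of S,

```latex
M'' \;=\; \overline{M}^{\,\mathrm{WOT}} \;=\; \overline{M}^{\,\mathrm{SOT}},
```

so M is a von Neumann algebra precisely when M = M''.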
Abstract:
The Whitehead minimization problem consists in finding a minimum size element in the automorphic orbit of a word, a cyclic word or a finitely generated subgroup in a finite rank free group. We give the first fully polynomial algorithm to solve this problem, that is, an algorithm that is polynomial both in the length of the input word and in the rank of the free group. Earlier algorithms had an exponential dependency in the rank of the free group. It follows that the primitivity problem – to decide whether a word is an element of some basis of the free group – and the free factor problem can also be solved in polynomial time.
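As a concrete illustration of Whitehead minimization, the following Python sketch greedily applies length-reducing Whitehead automorphisms of the second kind to a cyclic word. It is restricted to rank 2, where the number of Whitehead automorphisms is a small constant, so the exponential dependence on rank addressed by the paper does not arise; the names and the restriction to F(a, b) are assumptions made for illustration, and this is not the paper's fully polynomial algorithm.

```python
def reduce_word(w):
    """Freely reduce a word over {'a','A','b','B'} (capitals denote inverses)."""
    out = []
    for c in w:
        if out and out[-1] == c.swapcase():
            out.pop()
        else:
            out.append(c)
    return ''.join(out)

def cyclic_reduce(w):
    w = reduce_word(w)
    while len(w) >= 2 and w[0] == w[-1].swapcase():
        w = reduce_word(w[1:-1])
    return w

def inv(w):
    return w.swapcase()[::-1]

def apply_aut(w, images):
    """Apply the automorphism determined by the images of the generators 'a' and 'b'."""
    return reduce_word(''.join(images[c.lower()] if c.islower() else inv(images[c.lower()]) for c in w))

def whitehead_auts_rank2():
    """Whitehead automorphisms of the second kind for F(a, b): fix a multiplier x,
    send the other generator y to y*x, x^-1*y or x^-1*y*x."""
    auts = []
    for x in ['a', 'A', 'b', 'B']:
        y = 'b' if x.lower() == 'a' else 'a'
        for img_y in [y + x, inv(x) + y, inv(x) + y + x]:
            auts.append({x.lower(): x.lower(), y: img_y})
    return auts

def whitehead_minimize(w):
    """Greedily shorten a cyclic word while some Whitehead automorphism reduces its length."""
    w = cyclic_reduce(w)
    improved = True
    while improved:
        improved = False
        for images in whitehead_auts_rank2():
            v = cyclic_reduce(apply_aut(w, images))
            if len(v) < len(w):
                w, improved = v, True
                break
    return w

print(whitehead_minimize('aab'))  # 'b': the input is primitive, so its orbit reaches a single generator
```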
Abstract:
The study tested three analytic tools applied in SLA research (T-unit, AS-unit and Idea-unit) against FL learner monologic oral data. The objective was to analyse their effectiveness for the assessment of complexity of learners' academic production in English. The data were learners' individual productions gathered during the implementation of a CLIL teaching sequence on Natural Sciences in a Catalan state secondary school. The analysis showed that only the AS-unit was easily applicable and highly effective in segmenting the data and taking complexity measures.
Abstract:
We give the first systematic study of strong isomorphism reductions, a notion of reduction more appropriate than polynomial time reduction when, for example, comparing the computational complexity of the isomorphism problem for different classes of structures. We show that the partial ordering of its degrees is quite rich. We analyze its relationship to a further type of reduction between classes of structures based on purely comparing for every n the number of nonisomorphic structures of cardinality at most n in both classes. Furthermore, in a more general setting we address the question of the existence of a maximal element in the partial ordering of the degrees.
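For readers unfamiliar with the notion, a strong isomorphism reduction is commonly defined as follows (standard terminology; the precise formulation in the paper may differ in detail): a class C strongly isomorphism-reduces to a class D if there is a polynomial-time computable f mapping structures in C to structures in D such that

```latex
A \cong B \;\Longleftrightarrow\; f(A) \cong f(B) \qquad \text{for all } A, B \in C.
```

This acts on single structures and must preserve both isomorphism and non-isomorphism, unlike an ordinary many-one reduction of the corresponding decision problems.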
Abstract:
Shape complexity has recently received attention from different fields, such as computer vision and psychology. In this paper, integral geometry and information theory tools are applied to quantify the shape complexity from two different perspectives: from the inside of the object, we evaluate its degree of structure or correlation between its surfaces (inner complexity), and from the outside, we compute its degree of interaction with the circumscribing sphere (outer complexity). Our shape complexity measures are based on the following two facts: uniformly distributed global lines crossing an object define a continuous information channel and the continuous mutual information of this channel is independent of the object discretisation and invariant to translations, rotations, and changes of scale. The measures introduced in this paper can be potentially used as shape descriptors for object recognition, image retrieval, object localisation, tumour analysis, and protein docking, among others.
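As a minimal illustration of the information-channel viewpoint, and only as a discrete toy version (the measures in the paper use a continuous channel defined by uniformly distributed global lines and are independent of the discretisation), the following Python sketch computes the mutual information of a joint distribution:

```python
import numpy as np

def mutual_information(joint):
    """Mutual information I(X;Y) in bits from a joint probability matrix p(x, y)."""
    joint = np.asarray(joint, dtype=float)
    joint = joint / joint.sum()
    px = joint.sum(axis=1, keepdims=True)   # marginal p(x)
    py = joint.sum(axis=0, keepdims=True)   # marginal p(y)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

# Toy channel between two coarse "surface" states, purely for illustration.
print(mutual_information([[0.4, 0.1],
                          [0.1, 0.4]]))  # ≈ 0.278 bits
```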
Abstract:
The author studies the error and complexity of the discrete random walk Monte Carlo technique for radiosity, using both the shooting and gathering methods. The author shows that the shooting method exhibits a lower complexity than the gathering one, and under some constraints, it has a linear complexity. This is an improvement over a previous result that pointed to an O(n log n) complexity. The author gives and compares three unbiased estimators for each method, and obtains closed forms and bounds for their variances. The author also bounds the expected value of the mean square error (MSE). Some of the results obtained are also shown.
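As a generic reminder of the quantities being bounded, namely the variance and expected MSE of an unbiased Monte Carlo estimator, and not of the radiosity-specific shooting or gathering estimators themselves, here is a small Python sketch showing that the empirical MSE of a sample mean tracks Var/N; the toy integrand and all names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def empirical_mse(sampler, true_value, n_samples, n_trials=2000):
    """Empirical mean square error of the sample-mean estimator over repeated trials."""
    estimates = np.array([sampler(n_samples).mean() for _ in range(n_trials)])
    return float(np.mean((estimates - true_value) ** 2))

# Toy case: estimate E[X] for X ~ Uniform(0, 1), whose variance is 1/12.
sampler = lambda n: rng.uniform(0.0, 1.0, n)
for n in (10, 100, 1000):
    print(n, empirical_mse(sampler, 0.5, n), (1 / 12) / n)  # MSE ≈ Var/N for an unbiased estimator
```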
Abstract:
Not much has been said about the grammar of Iraqi Arabic. This research is an attempt to shed light on the nature of IA relative clauses. The research focuses on the behavior of the resumptive strategy as opposed to the gap strategy. We consider islandhood, Weak and Strong Crossover, reconstruction and scope binding, in order to further understand the behavior of resumption. The final conclusion reached is that in Iraqi Arabic the resumptive strategy is actually related to the gap strategy in several respects; and in those where it differs we propose that gaps (traces) are replaced by a trace+pronoun complex.
Abstract:
The effects of the nongray absorption (i.e., atmospheric opacity varying with wavelength) on the possible upper bound of the outgoing longwave radiation (OLR) emitted by a planetary atmosphere have been examined. This analysis is based on the semigray approach, which appears to be a reasonable compromise between the complexity of nongray models and the simplicity of the gray assumption (i.e., atmospheric absorption independent of wavelength). Atmospheric gases in semigray atmospheres make use of constant absorption coefficients in finite-width spectral bands. Here, such a semigray absorption is introduced in a one-dimensional (1D) radiative–convective model with a stratosphere in radiative equilibrium and a troposphere fully saturated with water vapor, which is the semigray gas. A single atmospheric window in the infrared spectrum has been assumed. In contrast to the single absolute limit of OLR found in gray atmospheres, semigray ones may also show a relative limit. This means that both finite and infinite runaway effects may arise in some semigray cases. Of particular importance is the finding of an entirely new branch of stable steady states that does not appear in gray atmospheres. This new multiple equilibrium is a consequence of the nongray absorption only. It is suspected that this new set of stable solutions has not been previously revealed in analyses of radiative–convective models since it does not appear for an atmosphere with nongray parameters similar to those for the earth's current state.
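For orientation, the textbook gray counterpart of this setup (the standard two-stream, radiative-equilibrium result; the paper's semigray model with an atmospheric window is more general) relates the OLR to the surface temperature T_s and the total infrared optical depth τ* by

```latex
\mathrm{OLR} \;=\; \sigma T_e^{4}, \qquad
\sigma T_s^{4} \;=\; \sigma T_e^{4}\left(1 + \tfrac{3}{4}\,\tau^{*}\right)
\quad\Longrightarrow\quad
\mathrm{OLR} \;=\; \frac{\sigma T_s^{4}}{1 + \tfrac{3}{4}\,\tau^{*}} .
```

Because τ* increases with T_s when the troposphere is saturated with water vapor, the gray OLR can approach a finite upper bound; this is the single absolute limit that the abstract contrasts with the relative limit found in the semigray case.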
Abstract:
The 2×2 MIMO profiles included in Mobile WiMAX specifications are Alamouti's space-time code (STC) for transmit diversity and spatial multiplexing (SM). The former has full diversity and the latter has full rate, but neither of them has both of these desired features. An alternative 2×2 STC, which is both full rate and full diversity, is the Golden code. It is the best known 2×2 STC, but it has a high decoding complexity. Recently, attention has turned to decoder complexity: this issue was included in the STC design criteria, and different STCs were proposed. In this paper, we first present a full-rate full-diversity 2×2 STC design leading to substantially lower complexity of the optimum detector compared to the Golden code with only a slight performance loss. We provide the general optimized form of this STC and show that this scheme achieves the diversity-multiplexing frontier for square QAM signal constellations. Then, we present a variant of the proposed STC, which provides a further decrease in the detection complexity with a rate reduction of 25% and show that this provides an interesting trade-off between the Alamouti scheme and SM.
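For reference, the Alamouti transmit-diversity scheme mentioned above admits a very compact simulation. The sketch below (plain NumPy, a 2×1 configuration with QPSK; names and noise level chosen for illustration, and not the new STC proposed in the paper) shows the orthogonal space-time block encoding and the linear combining that recovers each symbol with the diversity gain |h1|² + |h2|²:

```python
import numpy as np

rng = np.random.default_rng(1)
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

def alamouti_encode(s1, s2):
    """Two transmit antennas over two symbol periods:
    period 1 sends (s1, s2), period 2 sends (-conj(s2), conj(s1))."""
    return np.array([[s1, s2],
                     [-np.conj(s2), np.conj(s1)]])

def alamouti_combine(r1, r2, h1, h2):
    """Linear combining; each output equals (|h1|^2 + |h2|^2) * s_i plus noise."""
    s1_hat = np.conj(h1) * r1 + h2 * np.conj(r2)
    s2_hat = np.conj(h2) * r1 - h1 * np.conj(r2)
    return s1_hat, s2_hat

def nearest_symbol(x, gain):
    """Minimum-distance detection against the gain-scaled constellation."""
    return qpsk[np.argmin(np.abs(x - gain * qpsk))]

# One Alamouti block over a flat-fading 2x1 channel, constant over the block.
s1, s2 = rng.choice(qpsk, 2)
h = (rng.normal(size=2) + 1j * rng.normal(size=2)) / np.sqrt(2)
noise = 0.05 * (rng.normal(size=2) + 1j * rng.normal(size=2))

X = alamouti_encode(s1, s2)      # rows = symbol periods, columns = antennas
r = X @ h + noise                # received samples r1, r2
s1_hat, s2_hat = alamouti_combine(r[0], r[1], h[0], h[1])
gain = np.abs(h[0]) ** 2 + np.abs(h[1]) ** 2
print(nearest_symbol(s1_hat, gain) == s1, nearest_symbol(s2_hat, gain) == s2)  # typically True True for mild noise
```

Because the combining decouples the two symbols, the optimum detector reduces to independent symbol-by-symbol decisions, which is exactly the low decoding complexity that the abstract contrasts with the Golden code.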
Abstract:
The growing multilingual trend in movie production comes with a challenge for dubbing translators since they are increasingly confronted with more than one source language. The main purpose of this master's thesis is to provide a case study on how these third languages (see CORRIUS and ZABALBEASCOA 2011) are rendered. Another aim is to put a particular focus on their textual and narrative functions and detect possible shifts that might occur in translations. By applying a theoretical model for translation analysis (CORRIUS and ZABALBEASCOA 2011), this study describes how third languages are rendered in the German, Spanish, and Italian dubbed versions of the 2009 Tarantino movie Inglourious Basterds. A broad range of solution-types are thereby revealed and prevalent restrictions of the translation process identified. The target texts are brought in context with some sociohistorical aspects of dubbing in order to detect prevalent norms of the respective cultures and to discuss the acceptability of translations (TOURY 1995). The translatability potential of even highly complex multilingual audiovisual texts is demonstrated in this study. Moreover, proposals for further studies in multilingual audiovisual translation are outlined and the potential for future investigations in this field thereby emphasised.