900 results for constraint solving
Abstract:
We propose to develop and integrate studies on Modelling and Problem Solving in Physics that take as explanatory factors: characteristics of the situation posed, the knowledge of the person solving it, and the process brought into play during resolution. The aim is to understand how students access prior knowledge, what procedures they use to retrieve some pieces of knowledge and discard others, which criteria lend coherence to their decisions, and how these decisions relate to certain characteristics of the task, among other questions. All of this with a view to studying causal relations between the difficulties encountered and delay in, or abandonment of, degree programmes. The work is organised along three axes, the first two of theoretical construction and a third of implementation and transfer. The aims are:
1. To study the processes of construction of mental representations in physics problem solving, both in experts and in students at different academic levels.
2. To analyse and classify the inferences produced during comprehension tasks in physics problem solving, and to associate these inferences with processes of transition between mental representations of a different nature.
3. To develop materials and instructional designs for physics teaching, grounded in knowledge of the psychological requirements of students in diverse learning tasks.
In general terms, an interpretive approach is adopted in the light of frameworks from cognitive psychology and the group's own developments. We will work with purposive samples of physics students and teachers. Verbal protocols and written records produced during task execution will be used to identify indicators of comprehension, inferences, and different levels of representation.
We also plan to analyse written material in common circulation, whether commercial or prepared by teaching staff of the degree programmes involved. The characteristics of the object of study, and the differing stages of development of the specific objectives, mean that the approach encompasses both qualitative and quantitative logics, following Juni and Urbano (2006).
Abstract:
Magdeburg, Univ., Faculty of Informatics, Diss., 2009
Abstract:
n.s. no.43(1988)
Abstract:
I consider the problem of assigning agents to objects where each agent must pay the price of the object he gets and prices must sum to a given number. The objective is to select an assignment-price pair that is envy-free with respect to the true preferences. I prove that the proposed mechanism implements the set of envy-free allocations in both Nash and strong Nash equilibria. The distinguishing feature of the mechanism is that it treats the announced preferences as the true ones and selects an envy-free allocation with respect to the announced preferences.
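As an illustration of the envy-freeness condition described above (not the paper's mechanism itself), checking a candidate assignment-price pair might look like this; all names and numbers are invented for the sketch:

```python
def is_envy_free(assignment, prices, valuations):
    """assignment[i] = object given to agent i;
    prices[j] = price of object j;
    valuations[i][j] = agent i's value for object j.
    Envy-free: no agent prefers another agent's object at its price."""
    n = len(assignment)
    for i in range(n):
        own = valuations[i][assignment[i]] - prices[assignment[i]]
        for k in range(n):
            other = valuations[i][assignment[k]] - prices[assignment[k]]
            if other > own + 1e-9:  # agent i envies agent k's bundle
                return False
    return True

# Two agents, two objects, prices summing to a fixed total of 10.
valuations = [[8.0, 4.0], [5.0, 6.0]]
prices = [7.0, 3.0]
print(is_envy_free([0, 1], prices, valuations))  # True
```

Here agent 0 nets 8 - 7 = 1 from his own object versus 4 - 3 = 1 from the other, and agent 1 nets 6 - 3 = 3 versus 5 - 7 = -2, so neither envies the other.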
Abstract:
We consider, both theoretically and empirically, how different organization modes are aligned to govern the efficient solving of technological problems. The data set is a sample from the Chinese consumer electronics industry. Following mainly the problem solving perspective (PSP) within the knowledge based view (KBV), we develop and test several PSP and KBV hypotheses, in conjunction with competing transaction cost economics (TCE) alternatives, in an examination of the determinants of the R&D organization mode. The results show that a firm’s existing knowledge base is the single most important explanatory variable. Problem complexity and decomposability are also found to be important, consistent with the theoretical predictions of the PSP, but it is suggested that these two dimensions need to be treated as separate variables. TCE hypotheses also receive some support, but the estimation results seem more supportive of the PSP and the KBV than the TCE.
Abstract:
This position paper considers the devolution of further fiscal powers to the Scottish Parliament in the context of the objectives and remit of the Smith Commission. The argument builds on our discussion of fiscal decentralization in our previous published work on this topic. We ask what sort of budget constraint the Scottish Parliament should operate with. A soft budget constraint (SBC) allows the Scottish Parliament to spend without having to consider all of the tax and, therefore, political consequences of that spending, which is effectively the position at the moment. The incentives to promote economic growth through fiscal policy, on both the tax and spending sides, are weak to non-existent. This is what the Scotland Act 1998, and the continuing use of the Barnett block grant, gave Scotland. Now other budget constraints are being discussed: those of the Calman Commission (2009) and the Scotland Act (2012), as well as the ones offered in 2014 by the various political parties (Scottish Conservatives, Scottish Greens, Scottish Labour, the Scottish Liberal Democrats and the Scottish Government). There is also the budget constraint designed by the Holtham Commission (2010) for Wales that could just as well be used in Scotland. We examine to what extent these offer the hard budget constraint (HBC) that would bring tax policy firmly into the realm of Scottish politics, asking the Scottish electorate and Parliament to weigh the costs to them of increasing spending, in terms of higher taxes, against the benefits to them of using public spending to grow the tax base and own-sourced taxes. The hardest budget constraint of all is offered by independence but, as is now known, a clear majority of those who voted in the referendum did not vote for this form of budget constraint.
Rather they voted for a significant further devolution of fiscal powers while remaining within a political and monetary union with the rest of the UK, with the risk pooling and revenue sharing that this implies. It is not surprising, therefore, that none of the budget constraints on offer, apart from the SNP's, comes close to the HBC of independence. However, the almost 25% fall in the price of oil since the referendum, a resource stream so central to the SNP's economic policy making, underscores why there is a need for a trade-off between a HBC and risk pooling and revenue sharing. Ranked according to the desirable characteristic of offering something approaching a HBC, the least desirable are those of the Calman Commission, the Scotland Act 2012, and Scottish Labour. In all of these the 'elasticity' of the block grant in the face of failure to grow the Scottish tax base is either not defined or is very elastic, meaning that the risk of failure is shuffled off to taxpayers outside of Scotland. The degree of HBC in the Scottish Conservative, Scottish Greens and Scottish Liberal Democrats proposals is much more desirable from an economic growth point of view, the latter even embracing the HBC proposed by the Holtham Commission, which combines serious tax policy with welfare support in the long run. We judge that the budget constraint in the SNP's proposals is too hard, as it does not allow for continuation of the 'welfare union' in the UK. We also consider that, in the case of a generalized UK economic slowdown requiring a fiscal stimulus, the Scottish Parliament should be allowed increased borrowing, to be repaid in the next economic upturn.
Abstract:
The use of Geographic Information Systems has revolutionized the handling and visualization of geo-referenced data and has underlined the critical role of spatial analysis. The usual tools for this purpose come from geostatistics, which is widely used in Earth science. Geostatistics, however, rests upon several hypotheses that are not always verified in practice. Artificial Neural Networks (ANNs), on the other hand, can a priori be used without special assumptions and are known to be flexible. This paper discusses the application of ANNs to the interpolation of a geo-referenced variable.
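As a hedged illustration of the general idea (the paper's own network architecture is not specified here), a radial-basis-function network, one common ANN choice for spatial interpolation, can be fitted to scattered geo-referenced samples by least squares; all coordinates and values below are invented:

```python
import numpy as np

def rbf_fit(xy, z, sigma=1.0):
    """Fit output weights of a Gaussian RBF network whose hidden
    units are centred at the sample locations xy."""
    d2 = ((xy[:, None, :] - xy[None, :, :]) ** 2).sum(-1)
    phi = np.exp(-d2 / (2 * sigma ** 2))        # hidden-layer activations
    w, *_ = np.linalg.lstsq(phi, z, rcond=None)  # linear output layer
    return w

def rbf_predict(xy_train, w, xy_new, sigma=1.0):
    """Evaluate the fitted network at new locations."""
    d2 = ((xy_new[:, None, :] - xy_train[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2)) @ w

# Four scattered samples of some geo-referenced variable.
xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
z = np.array([1.0, 2.0, 2.0, 3.0])
w = rbf_fit(xy, z)
print(rbf_predict(xy, w, xy))  # reproduces the training values closely
```

Because the Gaussian kernel matrix is positive definite, the least-squares fit interpolates the samples exactly; prediction at unsampled locations then blends the nearby samples smoothly.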
Abstract:
We study the concept of propagation connectivity on random 3-uniform hypergraphs. This concept is inspired by a simple linear time algorithm for solving instances of certain constraint satisfaction problems. We derive upper and lower bounds for the propagation connectivity threshold, and point out some algorithmic implications.
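The propagation rule behind this notion can be sketched naively as follows (this is an illustrative quadratic-time version, not the linear-time algorithm the abstract alludes to): starting from a pair of marked vertices, any 3-edge that already contains two marked vertices marks its third.

```python
from itertools import combinations

def propagate(edges, start_pair):
    """Mark the start pair, then repeatedly apply the rule:
    an edge with two marked vertices marks its third."""
    marked = set(start_pair)
    changed = True
    while changed:
        changed = False
        for e in edges:
            inside = [v for v in e if v in marked]
            if len(inside) == 2:
                (missing,) = [v for v in e if v not in marked]
                marked.add(missing)
                changed = True
    return marked

def propagation_connected(n, edges):
    """True if some starting pair eventually marks all n vertices."""
    return any(len(propagate(edges, p)) == n
               for p in combinations(range(n), 2))

edges = [(0, 1, 2), (1, 2, 3), (2, 3, 4)]
print(propagation_connected(5, edges))  # True: start from {0, 1}
```

Starting from {0, 1}, edge (0, 1, 2) marks 2, then (1, 2, 3) marks 3, then (2, 3, 4) marks 4, so the whole hypergraph is reached.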
Abstract:
The problem of finding a feasible solution to a linear inequality system arises in numerous contexts. In [12] the authors proposed an algorithm, called the extended relaxation method, that solves the feasibility problem, and proved its convergence. In this paper, we consider a class of extended relaxation methods depending on a parameter and prove their convergence. Numerical experiments are provided as well.
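For orientation, the classical relaxation method for a system Ax <= b, on which such extended variants build, can be sketched as follows; the paper's parameterized extensions are not reproduced, and the relaxation parameter lam and the example system are illustrative:

```python
import numpy as np

def relaxation_method(A, b, x0, lam=1.0, max_iter=10_000, tol=1e-9):
    """Classical relaxation for Ax <= b: at each step, project the
    iterate onto the most violated constraint, scaled by lam in (0, 2]."""
    x = x0.astype(float).copy()
    for _ in range(max_iter):
        residuals = A @ x - b
        i = int(np.argmax(residuals))        # most violated constraint
        if residuals[i] <= tol:              # all constraints satisfied
            return x
        x = x - lam * residuals[i] / (A[i] @ A[i]) * A[i]
    return x

# x1 >= 0, x2 >= 0, x1 + x2 <= 1, written as Ax <= b.
A = np.array([[-1.0, 0.0], [0.0, -1.0], [1.0, 1.0]])
b = np.array([0.0, 0.0, 1.0])
x = relaxation_method(A, b, np.array([2.0, 2.0]))
print(np.all(A @ x <= b + 1e-6))  # True: a feasible point was found
```

From the infeasible start (2, 2), one projection onto x1 + x2 <= 1 already lands on the feasible point (0.5, 0.5).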
Abstract:
This paper discusses the use of probabilistic or randomized algorithms for solving combinatorial optimization problems. Our approach employs non-uniform probability distributions to add a biased random behavior to classical heuristics, so that a large set of alternative good solutions can be obtained quickly, in a natural way, and without complex configuration processes. This procedure is especially useful in problems where properties such as non-smoothness or non-convexity lead to a highly irregular solution space, for which traditional optimization methods, both exact and approximate, may fail to reach their full potential. The results obtained are promising enough to suggest that randomizing classical heuristics is a powerful method that can be successfully applied in a variety of cases.
Abstract:
We describe the case of a 69-year-old professor of mathematics (GV) who was examined 2 years after left-hemispheric capsular-thalamic haemorrhage. GV showed disproportionate impairment in subtractions requiring borrowing (22 - 7). For large subtraction problems without borrowing (99 - 12) performance was almost flawless. Subtractions with borrowing mostly relied on inadequate attempts to invert subtractions into the corresponding additions (22 - 7 = x as 7 + x = 22). The hypothesis is advanced that difficulty in the inhibitory components of attention tasks (Stroop test, go/no-go task) might be the factor responsible for his calculation impairment. A deficit in subtractions with borrowing might be related to left-hemispheric damage involving thalamo-cortical connections.
Abstract:
The estimation of camera egomotion is a well-established problem in computer vision. Many approaches have been proposed based on both the discrete and the differential epipolar constraint. The discrete case is mainly used in self-calibrated stereoscopic systems, whereas the differential case deals with a unique moving camera. The article surveys several methods for mobile robot egomotion estimation, covering more than 0.5 million samples using synthetic data. Results from real data are also given.
Constraint algorithm for k-presymplectic Hamiltonian systems. Application to singular field theories
Abstract:
The k-symplectic formulation of field theories is especially simple, since only tangent and cotangent bundles are needed in its description. Its defining elements show a close relationship with those in the symplectic formulation of mechanics. It will be shown that this relationship also holds in the presymplectic case. In a natural way, one can mimic the presymplectic constraint algorithm to obtain a constraint algorithm that can be applied to k-presymplectic field theory, and more particularly to the Lagrangian and Hamiltonian formulations of field theories defined by a singular Lagrangian, as well as to the unified Lagrangian-Hamiltonian formalism (Skinner-Rusk formalism) for k-presymplectic field theory. Two examples of application of the algorithm are also analyzed.
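For context, a hedged sketch of the standard presymplectic constraint algorithm (in the Gotay-Nester spirit) that the abstract proposes to mimic; the notation is generic and not taken from the paper:

```latex
% Generic presymplectic system (P, \omega, \alpha): solve \iota_X \omega = \alpha.
% The equation need not be solvable everywhere, so one builds the nested sequence
\begin{align*}
  P_1     &= P, \\
  P_{i+1} &= \bigl\{\, p \in P_i \;\bigm|\; \exists\, X_p \in T_pP_i
             \ \text{such that}\ (\iota_{X_p}\omega)(p) = \alpha(p) \,\bigr\},
\end{align*}
% which, if it stabilises, ends at a final constraint submanifold P_f on which
% consistent solutions X, tangent to P_f, exist.
```

The k-presymplectic version described in the abstract adapts this iteration to the bundles and k-tuples of vector fields appearing in the k-symplectic formulation.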
Abstract:
Some people cannot buy products without first touching them, believing that doing so will provide more assurance and information and reduce uncertainty. The international consumer marketing literature offers an instrument to measure consumers' need for physical contact, called Need for Touch (NFT). This paper analyzes whether the Need for Touch structure is empirically consistent. Based on a literature review, we propose six hypotheses to assess the nomological, convergent, and discriminant validity of the phenomenon. Of these, the data supported four in the predicted direction. Need for Touch was associated with Need for Input and with Need for Cognition. Need for Touch was not associated with traditional marketing channels. The results also showed the dual characterization of Need for Touch as a bi-dimensional construct. The moderator effect indicated that when the consumer has a higher (vs. lower) autotelic Need for Touch score, the experiential motivation for shopping plays a more (vs. less) important role in impulsive motivation. Our Study 3 supports the NFT structure and shows new associations with the need for unique products and dependent decisions.