853 results for Solving Rule
Abstract:
Research Masters
Abstract:
The artificial fish swarm algorithm has recently emerged as a method for continuous global optimization. It uses the points of a population in the search space to represent the positions of fish in the school. Many real-world optimization problems can be described as 0-1 multidimensional knapsack problems, which are NP-hard. Over the last decades, several exact as well as heuristic methods have been proposed for solving these problems. In this paper, a new simplified binary version of the artificial fish swarm algorithm is presented, where a point/fish is represented by a binary string of 0/1 bits. Trial points are created by using crossover and mutation in the different fish behaviors, which are randomly selected using two user-defined probability values. To make the points feasible, the presented algorithm uses a random heuristic drop-item procedure followed by an add-item procedure that aims to increase the profit by adding further items to the knapsack. A cyclic reinitialization of 50% of the population and a simple local search, which moves a small percentage of points towards optimality and then refines the best point in the population, greatly improve the quality of the solutions. The presented method is tested on a set of benchmark instances, and a comparison with other methods available in the literature is shown. The comparison shows that the proposed method can be an alternative method for solving these problems.
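The repair step described in this abstract is easy to picture in code. The Python sketch below is a minimal illustration of a drop-item/add-item repair for the 0-1 multidimensional knapsack problem; the function name, the random drop rule, and the profit-ordered add phase are assumptions for illustration, not the paper's exact procedure.

```python
import random

def repair(x, profits, weights, capacities):
    """Make a binary knapsack solution feasible, then improve it.

    x          -- list of 0/1 decisions, one per item
    profits    -- profit of each item
    weights    -- weights[j][i]: weight of item i in constraint j
    capacities -- capacity of each of the m knapsack constraints
    """
    m, n = len(capacities), len(x)
    load = [sum(weights[j][i] for i in range(n) if x[i]) for j in range(m)]

    # Drop phase: remove randomly chosen packed items until feasible.
    while any(load[j] > capacities[j] for j in range(m)):
        i = random.choice([k for k in range(n) if x[k]])
        x[i] = 0
        for j in range(m):
            load[j] -= weights[j][i]

    # Add phase: greedily insert unpacked items that still fit,
    # trying the most profitable ones first (an assumed ordering).
    for i in sorted((k for k in range(n) if not x[k]),
                    key=lambda k: -profits[k]):
        if all(load[j] + weights[j][i] <= capacities[j] for j in range(m)):
            x[i] = 1
            for j in range(m):
                load[j] += weights[j][i]
    return x

# Toy instance: 4 items, 2 constraints; sketch usage only.
profits = [10, 7, 5, 3]
weights = [[4, 3, 2, 1], [2, 4, 1, 3]]
capacities = [6, 5]
print(repair([1, 1, 1, 1], profits, weights, capacities))
```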
Abstract:
The Firefly Algorithm is a recent swarm intelligence method, inspired by the social behavior of fireflies and based on their flashing and attraction characteristics [1, 2]. In this paper, we analyze the implementation of a dynamic penalty approach combined with the Firefly Algorithm for solving constrained global optimization problems. In order to assess the applicability and performance of the proposed method, some benchmark problems from engineering design optimization are considered.
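As an illustration of the two ingredients named in this abstract, here is a minimal Python sketch combining a Joines-Houck style dynamic penalty with the standard firefly movement rule. The particular penalty schedule and all parameter names are assumptions, since the abstract does not specify them.

```python
import numpy as np

def dynamic_penalty(f, constraints, x, t, C=0.5, alpha=2.0, beta=2.0):
    """Dynamic penalty: the weight (C*t)**alpha grows with the iteration
    counter t, so infeasibility is punished more heavily as the search
    progresses. constraints is a list of functions g with g(x) <= 0
    required for feasibility. A generic sketch, not the paper's scheme."""
    violation = sum(max(0.0, g(x)) ** beta for g in constraints)
    return f(x) + (C * t) ** alpha * violation

def firefly_move(xi, xj, gamma=1.0, beta0=1.0, alpha_step=0.1, rng=np.random):
    """Move firefly xi toward a brighter firefly xj using the standard
    attraction update: attraction decays with squared distance, plus a
    small random perturbation."""
    r2 = np.sum((xi - xj) ** 2)
    attraction = beta0 * np.exp(-gamma * r2)
    return xi + attraction * (xj - xi) + alpha_step * (rng.rand(xi.size) - 0.5)
```

In a full algorithm, fireflies would be ranked by the penalized objective at each iteration t, so that feasible points gradually dominate the population as the penalty weight grows.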
Abstract:
We propose to develop and integrate studies on Modeling and Problem Solving in Physics that take as explanatory factors: the characteristics of the situation posed, the knowledge of the person solving it, and the process brought into play during the solution. The aim is to understand how students access prior knowledge, what procedures they use to retrieve some pieces of knowledge and discard others, what criteria give coherence to their decisions, and how these decisions relate to certain characteristics of the task, among other questions. All of this with a view to studying causal relations between the difficulties encountered and delay in, or abandonment of, degree programmes. The work is organized along three axes, the first two of theoretical construction and the third of implementation and transfer. The aims are: (1) to study the processes of construction of mental representations in physics problem solving, both in experts and in students at different academic levels; (2) to analyze and classify the inferences produced during comprehension tasks in physics problem solving, and to associate these inferences with processes of transition between mental representations of different kinds; (3) to develop materials and instructional designs for physics teaching, grounded in knowledge of students' psychological requirements in various learning tasks. In general terms, an interpretive approach is adopted in the light of frameworks from cognitive psychology and the group's own developments. We will work with purposive samples of physics students and teachers. Verbal protocols and written records produced during task execution will be used to identify indicators of comprehension, inferences, and different levels of representation. We also plan to analyze written material in common circulation, whether commercial or prepared by the teachers of the degree programmes involved. The characteristics of the object of study and the different stages of development of the specific objectives mean that the approach encompasses, following Juni and Urbano (2006), both qualitative and quantitative logics.
Abstract:
Magdeburg, Univ., Faculty of Computer Science, Diss., 2009
Abstract:
It is well known that for an organized society to develop politically and juridically, the existence of a formal document of mandatory observance is indispensable: one capable of defining public competences and delimiting the powers of the State, safeguarding fundamental rights against possible abuses by political entities. This document is the Constitution, which has been present in States at every moment of history, though initially not in written form; this gave rise to constitutionalism, a movement that defended the need to draft written constitutions, endowed with normativity and supremacy over other normative species, intended to organize the separation of state powers and to declare individual rights and liberties. However, enacting a Supreme Law would be of little use without defense mechanisms to ward off any threat to legal certainty and social stability posed by a law or normative act contrary to the precepts established in the Constitution. Constitutionality control, a pillar of the rule of law, consists in verifying the compatibility between a law or any infra-constitutional normative act and the Supreme Law; where there is conflict, the flawed law or act must be expunged from the legal order so that constitutional unity is restored. In Brazil, constitutionality control was instituted under strong influence of the North American model and received varied treatment across the Brazilian constitutions, but the system of constitutionality review reached its apex with the advent of the current Federal Constitution, promulgated on 5 October 1988, with the creation of innovative procedural instruments for verifying the constitutionality of laws and normative acts. Moreover, the 1988 Charter of the Republic, unlike its predecessors, strengthened the Judiciary in the political context, granting judges greater autonomy in deciding cases of great national repercussion and resulting in today's judicial protagonism. In this context, the Supremo Tribunal Federal, the highest organ of the national Judiciary and guardian of the Constitution, has stood out on the national scene, especially in the defense of the fundamental rights and guarantees inscribed in the Fundamental Law. An analysis of the Court's case law is therefore necessary to verify whether constitutionality control in Brazil has in fact evolved over recent years and, if so, under what circumstances.
Abstract:
We present a new domain of preferences under which the majority relation is always quasi-transitive, and thus Condorcet winners always exist. We model situations where a set of individuals must choose one individual in the group. Agents are connected through some relationship that can be interpreted as expressing neighborhood, and which is formalized by a graph. Our restriction on preferences is as follows: each agent can freely rank his immediate neighbors, but he is then indifferent between each neighbor and all other agents that this neighbor "leads to". Hence, agents can be highly perceptive regarding their neighbors, while being insensitive to the differences between these and other agents further removed from them. We show quasi-transitivity of the majority relation when the graph expressing the neighborhood relation is a tree. We also discuss a further restriction that allows the result to be extended to more general graphs. Finally, we compare the proposed restriction with others in the literature and conclude that it is independent of any previously discussed domain restriction.
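A small Python sketch may help make the domain restriction concrete. It builds, for each agent on a tree, a weak order in which immediate neighbors are ranked freely and every other agent is tied with the neighbor that "leads to" it, then searches for a weak Condorcet winner. Placing each agent at the top of his own ranking is an assumption made purely for illustration; the paper's exact formulation may differ.

```python
import random

def components_after_removal(adj, u):
    """For agent u, map each neighbor v to the set of agents v 'leads to':
    the component of the tree minus u that contains v."""
    comp = {}
    for v in adj[u]:
        seen, stack = {u, v}, [v]
        while stack:
            w = stack.pop()
            for z in adj[w]:
                if z not in seen:
                    seen.add(z)
                    stack.append(z)
        comp[v] = seen - {u}
    return comp

def random_preference(adj, u, rng):
    """Weak order of agent u in the restricted domain: u first (assumed),
    then his neighbors in a random strict order, with every agent in a
    neighbor's branch tied with that neighbor. Lower rank = better."""
    comp = components_after_removal(adj, u)
    order = list(adj[u])
    rng.shuffle(order)
    rank = {u: 0}
    for level, v in enumerate(order, start=1):
        for a in comp[v]:
            rank[a] = level   # indifference within v's branch
    return rank

def condorcet_winner(adj, prefs):
    """Return an agent that no other agent beats by strict majority."""
    agents = list(adj)
    for a in agents:
        if all(sum(p[a] < p[b] for p in prefs.values()) >=
               sum(p[b] < p[a] for p in prefs.values())
               for b in agents if b != a):
            return a
    return None

# Toy check on the path 0-1-2-3-4 (a tree), sketch only.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
rng = random.Random(1)
prefs = {u: random_preference(adj, u, rng) for u in adj}
print(condorcet_winner(adj, prefs))
```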
Abstract:
We consider, both theoretically and empirically, how different organization modes are aligned to govern the efficient solving of technological problems. The data set is a sample from the Chinese consumer electronics industry. Following mainly the problem-solving perspective (PSP) within the knowledge-based view (KBV), we develop and test several PSP and KBV hypotheses, in conjunction with competing transaction cost economics (TCE) alternatives, in an examination of the determinants of the R&D organization mode. The results show that a firm's existing knowledge base is the single most important explanatory variable. Problem complexity and decomposability are also found to be important, consistent with the theoretical predictions of the PSP, but it is suggested that these two dimensions need to be treated as separate variables. The TCE hypotheses also receive some support, but the estimation results seem more supportive of the PSP and the KBV than of the TCE.
Abstract:
The problem of finding a feasible solution to a linear inequality system arises in numerous contexts. In [12], the authors proposed an algorithm, called the extended relaxation method, that solves this feasibility problem, and proved its convergence. In this paper, we consider a class of extended relaxation methods depending on a parameter and prove their convergence. Numerical experiments are provided as well.
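For readers unfamiliar with relaxation methods, the following Python sketch shows the classical Agmon-Motzkin-Schoenberg scheme for a system Ax <= b, on which extended variants are built; the parameter lam plays the role of the relaxation parameter. This is the textbook method, not the authors' extension from [12].

```python
import numpy as np

def relaxation_method(A, b, lam=1.0, tol=1e-9, max_iter=100000):
    """Find x with A @ x <= b by repeated relaxed projections.

    At each step the most violated inequality is selected and x is moved
    toward its hyperplane (lam = 1 projects exactly onto it; lam in
    (0, 2) is the admissible relaxation range)."""
    x = np.zeros(A.shape[1])
    for _ in range(max_iter):
        residuals = A @ x - b
        i = int(np.argmax(residuals))
        if residuals[i] <= tol:      # all inequalities satisfied
            return x
        a = A[i]
        x = x - lam * residuals[i] / (a @ a) * a   # relaxed projection
    return None                      # no feasible point found in time

# Example: x1 + x2 <= 1, -x1 <= 0, -x2 <= 0 (the standard simplex).
A = np.array([[1.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
b = np.array([1.0, 0.0, 0.0])
print(relaxation_method(A, b))
```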
Abstract:
This paper discusses the use of probabilistic or randomized algorithms for solving combinatorial optimization problems. Our approach employs non-uniform probability distributions to add a biased random behavior to classical heuristics, so that a large set of alternative good solutions can be obtained quickly, in a natural way, and without complex configuration processes. This procedure is especially useful in problems where properties such as non-smoothness or non-convexity lead to a highly irregular solution space, for which traditional optimization methods, both exact and approximate, may fail to reach their full potential. The results obtained are promising enough to suggest that randomizing classical heuristics is a powerful method that can be successfully applied in a variety of cases.
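A common way to realize this idea is to replace the deterministic greedy choice of a classical heuristic with a draw from a skewed distribution over the sorted candidate list. The Python sketch below applies a geometric bias to the nearest-neighbour TSP heuristic as one possible example; the single parameter p and the helper names are illustrative assumptions, not the paper's specific design.

```python
import math
import random

def biased_index(n, p=0.3, rng=random):
    """Sample a position in 0..n-1 from a truncated geometric
    distribution: position 0 (the greedy choice) is the most likely,
    but any position can be picked, so repeated runs yield many
    distinct good solutions. p is the single bias parameter."""
    k = int(math.log(1.0 - rng.random()) / math.log(1.0 - p))
    return min(k, n - 1)

def biased_nearest_neighbour(dist, start=0, p=0.3, rng=random):
    """Biased-randomized nearest-neighbour TSP heuristic: instead of
    always visiting the closest unvisited city, sample from the
    distance-sorted candidate list with a geometric bias."""
    n = len(dist)
    tour, unvisited = [start], set(range(n)) - {start}
    while unvisited:
        candidates = sorted(unvisited, key=lambda c: dist[tour[-1]][c])
        tour.append(candidates[biased_index(len(candidates), p, rng)])
        unvisited.remove(tour[-1])
    return tour
```

Running the biased heuristic many times and keeping the best tour found gives a simple multi-start procedure with essentially one tunable parameter.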
Abstract:
The influence of altitude and latitude on some structure sizes of Lutzomyia intermedia was noted; several structures of insects collected in higher localities were larger, in accordance with Bergmann's rule. This influence was more remarkable in two localities of the State of Espírito Santo, probably due to greater differences in altitude. Comparing insects from different latitudes, more differences were noted between insects from low-altitude localities than between those from higher altitudes. The small number of differences between insects collected in July and in December does not indicate a definite influence of season and temperature on the size of adults. The possible epidemiological implications of these variations are discussed.
Abstract:
We describe the case of a 69-year-old professor of mathematics (GV) who was examined 2 years after a left-hemispheric capsular-thalamic haemorrhage. GV showed disproportionate impairment in subtractions requiring borrowing (22 - 7). For large subtraction problems without borrowing (99 - 12), performance was almost flawless. Subtractions with borrowing mostly relied on inadequate attempts to invert subtractions into the corresponding additions (solving 22 - 7 = x as 7 + x = 22). The hypothesis is advanced that difficulty with the inhibitory components of attention tasks (Stroop test, go/no-go task) might be the factor responsible for his calculation impairment. A deficit in subtractions with borrowing might be related to left-hemispheric damage involving thalamo-cortical connections.