921 results for subtraction solving
Abstract:
The artificial fish swarm algorithm has recently emerged in continuous global optimization. It uses points of a population in space to identify the position of fish in the school. Many real-world optimization problems can be described as 0-1 multidimensional knapsack problems, which are NP-hard. In recent decades, several exact as well as heuristic methods have been proposed for solving these problems. In this paper, a new simplified binary version of the artificial fish swarm algorithm is presented, where a point/fish is represented by a binary string of 0/1 bits. Trial points are created by using crossover and mutation in the different fish behaviors, which are randomly selected by using two user-defined probability values. In order to make the points feasible, the presented algorithm uses a random heuristic drop-item procedure followed by an add-item procedure that aims to increase the profit by adding more items to the knapsack. A cyclic reinitialization of 50% of the population, together with a simple local search that allows a small percentage of points to progress towards optimality and then refines the best point in the population, greatly improves the quality of the solutions. The presented method is tested on a set of benchmark instances, and a comparison with other methods available in the literature is shown. The comparison shows that the proposed method can be an alternative for solving these problems.
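A minimal sketch of the drop/add repair step described above, in Python. The data layout (a profit list, a per-dimension weight matrix, a capacity list), the random drop rule, and the profit-ordered add rule are assumptions for illustration, not the paper's exact procedure:

```python
import random

def repair(x, profits, weights, capacities):
    """Drop/add repair for a 0-1 multidimensional knapsack point.

    x          -- list of 0/1 bits, one per item
    profits    -- profit of each item
    weights    -- weights[j][i] = weight of item i in dimension j
    capacities -- capacity of each dimension
    """
    def feasible():
        return all(
            sum(weights[j][i] for i, b in enumerate(x) if b) <= capacities[j]
            for j in range(len(capacities))
        )

    # Drop phase: remove randomly chosen items until every constraint holds.
    while not feasible():
        i = random.choice([i for i, b in enumerate(x) if b])
        x[i] = 0

    # Add phase: try to pack further items, most profitable first,
    # keeping an item only if the point stays feasible.
    for i in sorted(range(len(x)), key=lambda i: -profits[i]):
        if not x[i]:
            x[i] = 1
            if not feasible():
                x[i] = 0
    return x
```

After repair, every point is feasible, and the add phase recovers some of the profit lost in the drop phase.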
Abstract:
The Firefly Algorithm is a recent swarm intelligence method, inspired by the social behavior of fireflies and based on their flashing and attraction characteristics [1, 2]. In this paper, we analyze the implementation of a dynamic penalty approach combined with the Firefly Algorithm for solving constrained global optimization problems. To assess the applicability and performance of the proposed method, some benchmark problems from engineering design optimization are considered.
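A dynamic penalty scheme of the kind mentioned can be illustrated as follows; the functional form (C*k)**alpha and the parameter values are a common textbook choice, assumed here for illustration rather than taken from the paper:

```python
def penalized(f, constraints, x, k, C=0.5, alpha=2.0, beta=2.0):
    """Dynamic penalty objective for constrained minimization.

    The penalty factor (C*k)**alpha grows with the iteration counter k,
    so early iterations tolerate infeasible points (exploration) while
    later iterations are pushed increasingly hard toward feasibility.

    f           -- objective function to minimize
    constraints -- callables g with g(x) <= 0 required for feasibility
    x           -- candidate point
    k           -- current iteration number
    """
    violation = sum(max(0.0, g(x)) ** beta for g in constraints)
    return f(x) + (C * k) ** alpha * violation
```

A firefly's attractiveness would then be ranked by this penalized value instead of the raw objective, with no separate feasibility handling needed.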
Abstract:
We propose to develop and integrate studies on Modeling and Problem Solving in Physics that take as explanatory factors: characteristics of the situation posed, the knowledge of the person solving it, and the process set in motion during the resolution. The aim is to understand how students access prior knowledge, what procedures they use to retrieve some knowledge and discard other knowledge, what criteria give coherence to their decisions, and how these decisions relate to certain characteristics of the task, among other questions. All of this with a view to studying causal relations between the difficulties encountered and delays in, or abandonment of, the degree programs. The work is organized around three axes, the first two of theoretical construction and the third of implementation and transfer. The aims are: 1. To study the processes of construction of mental representations in physics problem solving, both in experts and in students at different academic levels. 2. To analyze and classify the inferences produced during comprehension tasks in physics problem solving, and to associate these inferences with processes of transition between mental representations of different kinds. 3. To develop materials and instructional designs for physics teaching, grounded in knowledge of students' psychological requirements in various learning tasks. In general terms, an interpretive approach is adopted in light of frameworks from cognitive psychology and of the group's own developments. We will work with purposive samples of physics students and teachers. Verbal protocols and written records produced during task execution will be used to identify indicators of comprehension, inferences, and different levels of representation.
We also plan to analyze written material in common circulation, whether commercial or prepared by the teaching staff of the degree programs involved. The characteristics of the object of study and the different stages of development of the specific objectives mean that the approach encompasses, following Juni and Urbano (2006), both qualitative and quantitative logics.
Abstract:
Magdeburg, Univ., Faculty of Computer Science, Diss., 2009
Abstract:
We consider, both theoretically and empirically, how different organization modes are aligned to govern the efficient solving of technological problems. The data set is a sample from the Chinese consumer electronics industry. Following mainly the problem solving perspective (PSP) within the knowledge based view (KBV), we develop and test several PSP and KBV hypotheses, in conjunction with competing transaction cost economics (TCE) alternatives, in an examination of the determinants of the R&D organization mode. The results show that a firm’s existing knowledge base is the single most important explanatory variable. Problem complexity and decomposability are also found to be important, consistent with the theoretical predictions of the PSP, but it is suggested that these two dimensions need to be treated as separate variables. TCE hypotheses also receive some support, but the estimation results seem more supportive of the PSP and the KBV than the TCE.
Abstract:
It has been suggested that an inappropriate relationship between renin and exchangeable sodium is responsible for the hypertension of patients with chronic renal failure. Long-term blockade of the renin system by captopril made it possible to test this hypothesis in 8 patients on maintenance hemodialysis. Captopril was administered orally in 2 daily doses of 25 to 200 mg. Previously, blood pressure averaged 179/105 +/- 6/3 (mean +/- SEM) pre- and 182/103 +/- 7/3 mm Hg post-dialysis, despite intensive ultrafiltration and conventional antihypertensive therapy. The 4 patients with the highest plasma renin activity normalized their blood pressure with captopril alone, whereas in the 4 remaining patients, captopril therapy was complemented by salt subtraction, which consisted of replacing 1-2 liters of ultrafiltrate with an equal volume of 5% dextrose until blood pressure was controlled. After an average treatment period of 5 months, the blood pressure of all 8 patients was reduced to 134/76 +/- 7/5 mm Hg (P < 0.001) pre- and 144/81 +/- 9/5 mm Hg (P < 0.001) post-dialysis, without a significant change in body weight. The present data suggest that captopril, alone or combined with salt subtraction, normalizes the blood pressure of patients on chronic hemodialysis with so-called uncontrollable hypertension.
Abstract:
The problem of finding a feasible solution to a linear inequality system arises in numerous contexts. In [12], the authors proposed an algorithm, called the extended relaxation method, that solves this feasibility problem, and proved its convergence. In this paper, we consider a class of extended relaxation methods depending on a parameter and prove their convergence. Numerical experiments are provided as well.
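For context, the classical relaxation method that such extended variants build on can be sketched as follows; the most-violated-constraint rule and the relaxation parameter lam in (0, 2) are the standard textbook formulation, not necessarily the paper's extension:

```python
def relaxation(A, b, x, lam=1.0, tol=1e-9, max_iter=10000):
    """Classical relaxation method for the feasibility problem A x <= b.

    At each step, find the most violated inequality a_i . x <= b_i and
    move toward its hyperplane by a relaxed projection (lam = 1 is the
    exact orthogonal projection; 0 < lam < 2 guarantees convergence
    when the system is feasible).

    A -- list of row vectors, b -- list of bounds, x -- starting point
    """
    for _ in range(max_iter):
        residuals = [sum(ai * xi for ai, xi in zip(row, x)) - bi
                     for row, bi in zip(A, b)]
        i = max(range(len(b)), key=lambda j: residuals[j])
        if residuals[i] <= tol:
            return x  # all inequalities satisfied
        norm2 = sum(ai * ai for ai in A[i])
        step = lam * residuals[i] / norm2
        x = [xi - step * ai for xi, ai in zip(x, A[i])]
    return x
```

Starting from any point, the iterates approach the feasible polyhedron one violated half-space at a time.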
Abstract:
This paper discusses the use of probabilistic or randomized algorithms for solving combinatorial optimization problems. Our approach employs non-uniform probability distributions to add a biased random behavior to classical heuristics, so that a large set of alternative good solutions can be quickly obtained in a natural way and without complex configuration processes. This procedure is especially useful in problems where properties such as non-smoothness or non-convexity lead to a highly irregular solution space, for which traditional optimization methods, both of exact and approximate nature, may fail to reach their full potential. The results obtained are promising enough to suggest that randomizing classical heuristics is a powerful method that can be successfully applied in a variety of cases.
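One common way to implement such biased randomization is to sample from a greedy-sorted candidate list with a skewed (e.g. geometric) distribution, so the greedy choice stays the most likely one but every candidate keeps a chance. This sketch assumes that scheme, which may differ from the paper's exact distributions:

```python
import math
import random

def biased_pick(candidates, key, beta=0.3):
    """Pick one candidate from a greedy-sorted list.

    Candidates are sorted by the greedy criterion `key` (best first),
    then an index is drawn from a geometric distribution with
    parameter beta: index i has probability ~ beta * (1 - beta)**i,
    folded back onto the list with a modulo.  beta -> 1 recovers the
    pure greedy heuristic; smaller beta increases diversification.
    """
    ranked = sorted(candidates, key=key)
    u = 1.0 - random.random()          # u in (0, 1], avoids log(0)
    i = int(math.log(u) / math.log(1.0 - beta)) % len(ranked)
    return ranked[i]
```

Replacing the deterministic "take the best" step of a classical heuristic with `biased_pick`, and rerunning it many times, yields a pool of distinct good solutions with a single tunable parameter.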
Abstract:
OBJECT: In this study, the accuracy of multislice computerized tomography (MSCT) angiography in the postoperative examination of clip-occluded intracranial aneurysms was compared with that of intraarterial digital subtraction (DS) angiography. METHODS: Forty-nine consecutive patients with 60 clipped aneurysms (41 of which had ruptured) were studied with the aid of postoperative MSCT and DS angiography. Both types of radiological studies were reviewed independently by two observers to assess the quality of the images, the artifacts left by the clips, the completeness of aneurysm occlusion, the patency of the parent vessel, and the duration and cost of the examination. The quality of MSCT angiography was good in 42 patients (86%). Poor-quality MSCT angiograms (14%) were a result of the late acquisition of images in three patients and the presence of clip or motion artifacts in four. Occlusion of the aneurysm on good-quality MSCT angiograms was confirmed in all but two patients, in whom a small (2-mm) remnant was confirmed on DS angiograms. In one patient, occlusion of a parent vessel was seen on DS angiograms but missed on MSCT angiograms. The sensitivity and specificity for detecting neck remnants on MSCT angiography were both 100%, and the sensitivity and specificity for evaluating vessel patency were 80 and 100%, respectively (95% confidence interval 29.2-100%). Interobserver agreements were 0.765 and 0.86, respectively. The mean duration of the examination was 13 minutes for MSCT angiography and 75 minutes for DS angiography (p < 0.05). Multislice CT angiography was highly cost effective (p < 0.01). CONCLUSIONS: Current-generation MSCT angiography is an accurate noninvasive tool for the assessment of clipped aneurysms in the anterior circulation. Its high sensitivity and low cost warrant its use for routine postoperative control examinations following clip placement on an aneurysm.
Digital subtraction angiography must be performed if the interpretation of MSCT angiograms is doubtful or if the aneurysm is located in the posterior circulation.
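For reference, the sensitivity and specificity figures reported above follow the standard definitions, sketched here with hypothetical counts (not the study's actual 2x2 tables):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP).

    tp -- true positives (findings confirmed by the reference standard)
    fn -- false negatives, tn -- true negatives, fp -- false positives
    """
    return tp / (tp + fn), tn / (tn + fp)
```

Against DS angiography as the reference standard, a sensitivity of 80% for vessel patency corresponds to 4 of 5 true occlusions being detected.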
Abstract:
Some people cannot buy products without first touching them, believing that doing so will provide more assurance and information and reduce uncertainty. The international consumer marketing literature suggests an instrument to measure consumers' need for physical contact, called Need for Touch (NFT). This paper analyzes whether the Need for Touch structure is empirically consistent. Based on a literature review, we propose six hypotheses in order to assess the nomological, convergent, and discriminant validity of the phenomenon. Of these, the data supported four in the predicted direction. Need for Touch was associated with Need for Input and with Need for Cognition. Need for Touch was not associated with traditional marketing channels. The results also showed the dual characterization of Need for Touch as a bi-dimensional construct. The moderator effect indicated that when the consumer has a higher (vs. lower) autotelic Need for Touch score, the experiential motivation for shopping plays a more (vs. less) important role in impulsive motivation. Our Study 3 supports the NFT structure and shows new associations with the need for unique products and dependent decisions.
Abstract:
Globalization involves several facility location problems that need to be handled at large scale. Location Allocation (LA) is a combinatorial problem in which the distances among points in the data space matter. Taking advantage of this distance property of the domain, we exploit the capability of clustering techniques to partition the data space in order to convert an initial large LA problem into several simpler LA problems. In particular, our motivating problem involves a huge geographical area that can be partitioned under general conditions. We present different types of clustering techniques and then perform a cluster analysis over our dataset in order to partition it. After that, we solve the LA problem by applying a simulated annealing algorithm to the clustered and non-clustered data, in order to work out how profitable the clustering is and which of the presented methods is the most suitable.
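The cluster-then-solve decomposition can be sketched as follows. For brevity, this illustration uses plain k-means as a stand-in for the clustering step and solves each per-cluster subproblem exactly as a tiny 1-median, instead of the paper's simulated annealing:

```python
import random

def dist2(p, q):
    """Squared Euclidean distance between 2-D points."""
    return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2

def mean(pts):
    """Centroid of a non-empty list of 2-D points."""
    return (sum(p[0] for p in pts) / len(pts),
            sum(p[1] for p in pts) / len(pts))

def kmeans(points, k, iters=20):
    """Plain k-means partition of 2-D points into k clusters."""
    centers = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each point to its nearest current center.
            j = min(range(k), key=lambda j: dist2(p, centers[j]))
            clusters[j].append(p)
        # Recompute centers; keep the old one if a cluster went empty.
        centers = [mean(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return clusters

def best_facility(cluster):
    """Solve the tiny per-cluster location problem exactly:
    place the facility at the demand point minimizing total distance."""
    return min(cluster,
               key=lambda f: sum(dist2(f, p) ** 0.5 for p in cluster))
```

Each large LA instance thus becomes k independent small instances, each cheap enough to solve (here exactly; in the paper, by simulated annealing).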
Abstract:
This paper proposes a heuristic for the scheduling of capacity requests and the periodic assignment of radio resources in geostationary (GEO) satellite networks with star topology, using the Demand Assigned Multiple Access (DAMA) protocol in the link layer, and Multi-Frequency Time Division Multiple Access (MF-TDMA) and Adaptive Coding and Modulation (ACM) in the physical layer.
Abstract:
In this paper, we propose a methodology to determine the most efficient and least costly way to optimize crew pairing. We develop an optimization algorithm in the Java programming language, using the open-source Eclipse IDE, to solve crew scheduling problems.