966 results for Methodological problems
Abstract:
In this paper a theory for two-person zero-sum multicriterion differential games is presented. Various solution concepts based on the notions of Pareto optimality (efficiency), security and equilibrium are defined. These are shown to have interesting applications in the formulation and analysis of two-target or combat differential games. The methods for obtaining outcome regions in the state space, feedback strategies for the players, and the mode of play are discussed in the framework of bicriterion zero-sum differential games. The treatment is conceptual rather than rigorous.
Abstract:
A posteriori error estimation and adaptive refinement techniques for fracture analysis of 2-D/3-D crack problems are the state of the art. The objective of the present paper is to propose a new a posteriori error estimator based on the strain energy release rate (SERR) or the stress intensity factor (SIF) in the crack tip region, and to use it together with the stress-based error estimator available in the literature for the region away from the crack tip. The proposed a posteriori error estimator is called the K-S error estimator. Further, an adaptive mesh refinement (h-) strategy that can be used with the K-S error estimator is proposed for fracture analysis of 2-D crack problems. The performance of the proposed a posteriori error estimator and the h-adaptive refinement strategy is demonstrated using 4-noded, 8-noded and 9-noded plane stress finite elements. The proposed error estimator, together with the h-adaptive refinement strategy, will facilitate automation of the fracture analysis process and provide reliable solutions.
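An h-adaptive cycle of the kind described above typically flags for refinement the elements whose error indicator is large relative to the rest of the mesh. The sketch below shows only a generic maximum-based marking rule with made-up indicator values; it is not the K-S estimator itself, just the marking step that any such estimator would feed.

```python
# Generic h-adaptive marking step: flag elements whose error indicator
# exceeds a fraction of the largest indicator.  The indicator values and
# the 0.5 fraction are illustrative assumptions.

def mark_for_refinement(indicators, fraction=0.5):
    """Return indices of elements to refine (maximum marking strategy)."""
    threshold = fraction * max(indicators)
    return [i for i, eta in enumerate(indicators) if eta >= threshold]

# Elements near the crack tip typically carry the largest indicators.
etas = [0.02, 0.10, 0.45, 0.90, 0.08]
print(mark_for_refinement(etas))  # → [2, 3]
```

The marked elements would then be subdivided and the estimator re-evaluated until a global error tolerance is met.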
Abstract:
This Working Paper reports the background to the first stage of the ongoing research project, The Quest for Well-being in Growth Industries: A Collaborative Study in Finland and Scotland, conducted under the auspices of the Academy of Finland research programme The Future of Work and Well-being (2008-2011). This collaborative project provides national and transnational data, analysis and outputs. The study is being conducted in the Department of Management and Organisation, Hanken School of Economics, Finland, in collaboration with Glasgow Caledonian University, University of East London, Heriot-Watt University and Reading University, UK. The project examines policies and practices towards the enhancement of work-related well-being in growth industries, and contradictory pressures and tensions posed in this situation. The overall aim is to evaluate the development, implementation and use of work-related well-being policies in four selected growth industries. These sectors – electronics, care, finance and accounting, and tourism – have been selected on the basis of European Union and national forecasts, and demographic and socio-economic trends in employment. In this working paper we outline the background to the research study, the initial research plan, and how the survey of employers has been constructed. The working paper concludes with a brief discussion of general ongoing research issues arising in the project.
Abstract:
There are a number of large networks that occur in problems dealing with the flow of power, communication signals, water, gas, transportable goods, etc. Both the design and the planning of these networks involve optimization problems. The first part of this paper introduces the common characteristics of a nonlinear network (the network may be linear, the objective function may be nonlinear, or both may be nonlinear). The second part develops a mathematical model that brings together some important constraints based on the abstraction of a general network. The third part deals with solution procedures; it converts the network to a matrix-based system of equations, gives the characteristics of the matrix, and suggests two solution procedures, one of them new. The fourth part handles spatially distributed networks and develops a number of decomposition techniques so that the problem can be solved with the help of a distributed computer system. Algorithms for parallel processors and spatially distributed systems are described.

There are a number of common features that pertain to networks. A network consists of a set of nodes and arcs. In addition, at every node there is the possibility of an input (such as power, water, messages, or goods), an output, or neither. Normally, the network equations describe the flows among nodes through the arcs. These network equations couple variables associated with nodes. Invariably, variables pertaining to arcs are constants; the result required is the flows through the arcs.
To solve the normal base problem, we are given input flows at nodes, output flows at nodes, and certain physical constraints on other variables at nodes, and we must find the flows through the network (variables at nodes will be referred to as across variables). The optimization problem involves selecting inputs at nodes so as to optimize an objective function; the objective may be a cost function based on the inputs to be minimized, a loss function, or an efficiency function. The above mathematical model can be solved using the Lagrange multiplier technique, since the equalities are strong compared with the inequalities. The Lagrange multiplier technique divides the solution procedure into two stages per iteration. Stage one calculates the problem variables and stage two the multipliers lambda. It is shown that the Jacobian matrix used in stage one (for solving a nonlinear system of necessary conditions) also occurs in stage two.

A second solution procedure, called the total residue approach, has been embedded into the first one. It changes the equality constraints so as to obtain faster convergence of the iterations. Both solution procedures are found to converge in 3 to 7 iterations for a sample network.

The availability of distributed computer systems, both LAN and WAN, suggests the need for algorithms to solve these optimization problems. Two types of algorithms have been proposed: one based on the physics of the network and the other on the properties of the Jacobian matrix. Three algorithms have been devised, one of them for the local-area case. These algorithms are called the regional distributed algorithm, the hierarchical regional distributed algorithm (both using the physical properties of the network), and the locally distributed algorithm (a multiprocessor-based approach with a local area network configuration). The approach used was to define an algorithm that is faster and uses minimum communication. These algorithms are found to converge at the same rate as in the non-distributed (unitary) case.
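For an equality-constrained quadratic objective, the two-stage Lagrange multiplier iteration described above can be sketched as a dual-ascent loop: stage one solves for the problem variables with the multipliers held fixed, and stage two updates the multipliers from the constraint residual. The matrices, step size and iteration count below are illustrative assumptions, not the paper's network model.

```python
import numpy as np

# Two-stage iteration for: minimize 0.5*x^T Q x - c^T x  subject to  A x = b.
# Stage one: primal update for fixed multipliers lambda.
# Stage two: multiplier update from the constraint residual A x - b.

Q = np.array([[4.0, 1.0], [1.0, 3.0]])   # positive definite objective
c = np.array([1.0, 2.0])
A = np.array([[1.0, 1.0]])               # single equality constraint
b = np.array([1.0])

lam = np.zeros(1)
alpha = 0.5                               # dual step size (illustrative)
for _ in range(200):
    x = np.linalg.solve(Q, c - A.T @ lam)     # stage 1: problem variables
    lam = lam + alpha * (A @ x - b)           # stage 2: multipliers

print(np.round(x, 3))                     # → [0.2 0.8]
```

Note that the same matrix Q is factored in every stage-one solve, which is the kind of reuse the abstract attributes to the shared Jacobian of the two stages.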
Abstract:
Methodologies are presented for minimization of risk in a river water quality management problem. A risk minimization model is developed to minimize the risk of low water quality along a river in the face of conflict among various stakeholders. The model consists of three parts: a water quality simulation model, a risk evaluation model with uncertainty analysis, and an optimization model. Sensitivity analysis, First Order Reliability Analysis (FORA) and Monte-Carlo simulations are performed to evaluate the fuzzy risk of low water quality. Fuzzy multiobjective programming is used to formulate the multiobjective model. Probabilistic Global Search Lausanne (PGSL), a recently developed global search algorithm, is used for solving the resulting non-linear optimization problem. The algorithm is based on the assumption that better sets of points are more likely to be found in the neighborhood of good sets of points, and it therefore intensifies the search in the regions that contain good solutions. Another model is developed for risk minimization, which deals only with the moments of the generated probability density functions of the water quality indicators. Suitable skewness values of the water quality indicators, which lead to low fuzzy risk, are identified. Results of the models are compared with the results of a deterministic fuzzy waste load allocation model (FWLAM) when the methodologies are applied to the case study of the Tunga-Bhadra river system in southern India, with a steady-state BOD-DO model. The fractional removal levels resulting from the risk minimization model are slightly higher, but result in a significant reduction in the risk of low water quality. (c) 2005 Elsevier Ltd. All rights reserved.
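The Monte-Carlo evaluation of fuzzy risk mentioned above can be sketched in a few lines: sample a water quality indicator (here dissolved oxygen), map each sample to a fuzzy membership in the set "low water quality", and average the memberships. The thresholds, distribution and sample size are illustrative assumptions, not the calibrated Tunga-Bhadra model.

```python
import random

# Monte-Carlo estimate of fuzzy risk: DO below 4 mg/L counts fully as
# "low quality", above 6 mg/L not at all, linear in between (all values
# here are illustrative assumptions).

def low_quality_membership(do_mg_l, lo=4.0, hi=6.0):
    if do_mg_l <= lo:
        return 1.0
    if do_mg_l >= hi:
        return 0.0
    return (hi - do_mg_l) / (hi - lo)

def fuzzy_risk(mean=5.5, sd=0.8, n=100_000, seed=42):
    rng = random.Random(seed)
    return sum(low_quality_membership(rng.gauss(mean, sd))
               for _ in range(n)) / n

print(f"fuzzy risk of low water quality: {fuzzy_risk():.3f}")
```

Averaging the membership values, rather than an indicator of threshold exceedance, is what distinguishes a fuzzy risk from an ordinary exceedance probability.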
Abstract:
A new formulation is suggested for the fixed end-point regulator problem, which, in conjunction with the recently developed integration-free algorithms, provides an efficient means of obtaining numerical solutions to such problems.
Abstract:
The domination and Hamilton circuit problems are of interest both in algorithm design and in complexity theory. The domination problem has applications in facility location, and the Hamilton circuit problem has applications in routing problems in communications and operations research.

The problem of deciding if G has a dominating set of cardinality at most k, and the problem of determining if G has a Hamilton circuit, are NP-complete. Polynomial-time algorithms are, however, available for a large number of restricted classes. A motivation for the study of these algorithms is that they not only give insight into the characterization of these classes but also require a variety of algorithmic techniques and data structures. So the search for efficient algorithms for these problems in many classes still continues.

A class of perfect graphs that is practically important and mathematically interesting is the class of permutation graphs. The domination problem is polynomial-time solvable on permutation graphs. The algorithms already available have time complexity O(n^2) or more, and space complexity O(n^2), on these graphs. The Hamilton circuit problem is open for this class.

We present a simple O(n) time and O(n) space algorithm for the domination problem on permutation graphs. Unlike the existing algorithms, we use the concept of the geometric representation of permutation graphs. Further, exploiting this geometric notion, we develop an O(n^2) time and O(n) space algorithm for the Hamilton circuit problem.
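The geometric representation referred to above views vertex i as a segment joining position i on one line to the position of i in the permutation on a parallel line; two vertices are adjacent exactly when their segments cross. A minimal sketch (the permutation is an illustrative example, and this brute-force O(n^2) enumeration only illustrates the representation, not the paper's O(n) algorithm):

```python
# Matching-diagram view of a permutation graph: vertex i runs from
# position i on the top line to pi's position of i on the bottom line,
# and two vertices are adjacent exactly when their segments cross.

def crossing_pairs(pi):
    """Edges of the permutation graph of pi (pi is a list of 1..n)."""
    pos = {v: idx for idx, v in enumerate(pi)}   # bottom-line position
    n = len(pi)
    return [(i, j) for i in range(1, n + 1) for j in range(i + 1, n + 1)
            if pos[i] > pos[j]]                  # i < j but i ends to the right

print(crossing_pairs([3, 1, 4, 2]))  # → [(1, 3), (2, 3), (2, 4)]
```

Scanning the segment endpoints in order, instead of enumerating all pairs, is what makes linear-time algorithms on this representation possible.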
Abstract:
In this paper, a novel genetic algorithm is developed that generates artificial chromosomes with probability control to solve machine scheduling problems. Generating artificial chromosomes for a Genetic Algorithm (ACGA) is closely related to Evolutionary Algorithms Based on Probabilistic Models (EAPM). The artificial chromosomes are generated by a probability model that extracts gene information from the current population. ACGA is considered a hybrid algorithm because both the conventional genetic operators and a probability model are integrated. The ACGA proposed in this paper further employs the "evaporation concept" applied in Ant Colony Optimization (ACO) to solve the permutation flowshop problem. The "evaporation concept" is used to reduce the effect of past experience and to explore new alternative solutions. In this paper, we propose three different methods for the probability of evaporation. This probability of evaporation is applied as soon as a job is assigned to a position in the permutation flowshop problem. Experimental results show that our ACGA with the evaporation concept outperforms several algorithms in the literature.
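The interplay of a positional probability model with an evaporation-style exploration step can be sketched as follows. This is a loose illustration, not the paper's ACGA: the frequency matrix plays the role of the probability model, and evaporation is simplified here to a fixed probability of ignoring the model when filling a position.

```python
import random

# Sketch: build a position-by-job frequency matrix from the population,
# then sample an artificial chromosome position by position.  With
# probability `evaporation` the model is ignored and a job is picked
# uniformly, loosely mirroring ACO-style evaporation (simplified).

def artificial_chromosome(population, evaporation=0.2, seed=0):
    rng = random.Random(seed)
    n = len(population[0])
    freq = [[0.0] * n for _ in range(n)]      # freq[pos][job]
    for chrom in population:
        for pos, job in enumerate(chrom):
            freq[pos][job] += 1.0
    child, unassigned = [], list(range(n))
    for pos in range(n):
        if rng.random() < evaporation:        # explore: ignore the model
            job = rng.choice(unassigned)
        else:                                 # exploit positional frequencies
            weights = [freq[pos][j] + 1e-9 for j in unassigned]
            job = rng.choices(unassigned, weights=weights)[0]
        child.append(job)
        unassigned.remove(job)
    return child

pop = [[0, 1, 2, 3], [0, 2, 1, 3], [1, 0, 2, 3]]
print(artificial_chromosome(pop))             # a permutation of 0..3
```

In the hybrid scheme, chromosomes generated this way are injected into the population alongside offspring produced by the conventional crossover and mutation operators.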
Abstract:
The dissertation examines the foreign policies of the United States through the prism of science and technology. The focal point of scrutiny is the policy establishing the International Institute for Applied Systems Analysis (IIASA) and the development of the multilateral part of bridge building in American foreign policy during the 1960s and early 1970s. After a long and arduous negotiation process, the institute was finally established by twelve national member organizations from the following countries: Bulgaria, Canada, Czechoslovakia, the Federal Republic of Germany (FRG), France, the German Democratic Republic (GDR), Great Britain, Italy, Japan, Poland, the Soviet Union and the United States; a few years later Sweden, Finland and the Netherlands also joined. The goal of the institute was said to be to bring together researchers from East and West to solve pertinent problems caused by the modernization process experienced in the industrialized world. It originates from President Lyndon B. Johnson's bridge building policies, launched in 1964, and was set in a well-contested and crowded domain of other international organizations of environmental and social planning. Since the distinct need for yet another organization was not evident, the process of negotiations in this multinational environment illuminates the foreign policy ambitions of the United States on the road to the Cold War détente. The study places this project within its political era and juxtaposes it with other international organizations, especially the OECD, the ECE and NATO. Conventionally, Lyndon Johnson's bridge building policies have been seen as a means for the United States to normalize its international relations bilaterally with different East European countries, and the multilateral dimension of the policy has been ignored.
This is why IIASA's establishment process in this multilateral environment brings forth new information on US foreign policy goals, the means to achieve those goals, and US relations with other advanced industrialized societies before the time of détente, during the 1960s and early 1970s. Furthermore, the substance of the institute, applied systems analysis, illuminates the differences between European and American methodological thinking in social planning. Systems analysis is closely associated with (American) science and technology policies of the 1960s, especially in its military-administrative applications; thus an analysis within the foreign policy environment of the United States proved particularly fruitful. In the 1960s the institutional structures of the European continent were faltering, and the growing tendencies of integration were in flux. One example of this was the long, drawn-out process of British membership in the EEC; another was de Gaulle's withdrawal from NATO's military-political cooperation. On the other hand, economic cooperation in Europe between East and West, and especially with the Soviet Union, was expanding rapidly. This American initiative to form a new institutional actor has to be seen in that structural context, showing that bridge building was needed not only to the East but also to the West. The narrative amounts to an analysis of how the United States managed both cooperation and conflict in its hegemonic aspirations in the emerging modern world, and how it used its special relationship with the United Kingdom to achieve its goals. The research is based on the archives of the United States, Great Britain, Sweden, Finland, and IIASA. The primary sources have been complemented with both contemporary and present-day research literature, periodicals, and interviews.
Abstract:
Maternal drug abuse during pregnancy endangers the future health and wellbeing of the infant and growing child. On the other hand, through maternal abstinence these problems would never occur, so they are entirely preventable. Buprenorphine is widely used in opioid maintenance treatment as a substitute medication. In Finland, buprenorphine misuse increased steadily during the 2000s. In 2009 almost one third of the clientele of substance treatment units were in treatment because of buprenorphine dependence. At the Helsinki Women's Clinic the first child with prenatal buprenorphine exposure was born in 2001. During 1992-2001, in the three capital-area maternity hospitals (Women's Clinic, Maternity Hospital, Jorvi Hospital), 524 women were followed at special antenatal clinics due to substance abuse problems. Three control women were drawn from the birth register for each case woman and matched for parity and for the place and date of the index birth. According to register data, the mortality rate was 38-fold higher among cases than controls within 6-15 years after the index birth. In particular, the risk of violent or accidental death was increased. The women with substance misuse problems also had an elevated risk of viral hepatitis and psychiatric morbidity. They were more often reimbursed for psychopharmaceuticals. Disability pensions and rehabilitation allowances were more often granted to cases than to controls. In total, 626 children were born from these pregnancies. According to register data, 38% of these children were placed in out-of-home care as part of child protection services by the age of two years, and half of them by the age of 12 years; the median follow-up time was 5.8 years. The risk of out-of-home care was associated with factors identifiable during the pre- and perinatal period. In 2002-2005, 67 pregnant women with buprenorphine dependence were followed up at the Helsinki University Hospital, Department of Obstetrics and Gynecology. Their pregnancies were uneventful.
The prematurity rate was similar to, and there were no more major anomalies than in, the national statistics. The neonates were lighter compared with the national statistics. They were also born in good condition, with no perinatal hypoxia as defined by standard clinical parameters or by certain biochemical markers in the cord blood: erythropoietin, S100 and cardiac troponin T. Almost 80% of the newborns developed neonatal abstinence syndrome (NAS), and two-thirds of them needed morphine medication for it. Maternal smoking of over ten cigarettes per day aggravated NAS, and benzodiazepine use attenuated it. An infant's highest urinary norbuprenorphine concentration during the first 3 days of life correlated with the duration of morphine treatment. The average length of the infants' hospital stay was 25 days.
Abstract:
XVIII IUFRO World Congress, Ljubljana 1986.