53 results for decision strategies
Abstract:
This paper presents results of research into the use of the Bellman-Zadeh approach to decision making in a fuzzy environment for solving multicriteria power engineering problems. The application of the approach conforms to the principle of guaranteed result and provides constructive, computationally effective lines for obtaining harmonious solutions on the basis of solving the associated maxmin problems. The presented results are universally applicable and are already being used to solve diverse classes of power engineering problems. This is illustrated by considering problems of power and energy shortage allocation, power system operation, optimization of network configuration in distribution systems, and energetically effective voltage control in distribution systems. (c) 2011 Elsevier Ltd. All rights reserved.
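The maxmin (guaranteed-result) rule underlying the Bellman-Zadeh approach can be sketched in a few lines: each criterion contributes a membership degree in [0, 1], the fuzzy decision is their minimum, and the chosen alternative maximizes that minimum. The alternatives and membership values below are hypothetical, not taken from the paper.

```python
# Sketch of the Bellman-Zadeh maxmin rule over discrete alternatives.
def bellman_zadeh(alternatives, memberships):
    """Pick the alternative maximizing the minimum criterion membership.

    memberships: list of callables mu_i(x) -> [0, 1], one per criterion.
    """
    def decision_degree(x):
        # Fuzzy decision D(x) = min over criteria of mu_i(x)
        return min(mu(x) for mu in memberships)
    return max(alternatives, key=decision_degree)

# Hypothetical example: two normalized criteria over three candidate solutions.
mu1 = {"a": 0.9, "b": 0.6, "c": 0.4}
mu2 = {"a": 0.3, "b": 0.7, "c": 0.8}
best = bellman_zadeh(["a", "b", "c"], [mu1.get, mu2.get])
print(best)  # "b": min(0.6, 0.7) = 0.6 beats 0.3 for "a" and 0.4 for "c"
```

The harmonious character of the solution comes from the min aggregation: no criterion is allowed to be sacrificed entirely for another.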
Abstract:
The present paper proposes a flexible consensus scheme for group decision making, which allows one to obtain a consistent collective opinion from information provided by each expert in terms of multigranular fuzzy estimates. It is based on a linguistic hierarchical model with multigranular sets of linguistic terms, and the choice of the most suitable set is a prerogative of each expert. From the human viewpoint, using such a model is advantageous, since it permits each expert to utilize linguistic terms that reflect more adequately the level of uncertainty intrinsic to his evaluation. From the operational viewpoint, the advantage of using such a model lies in the fact that it allows one to express the linguistic information in a unique domain, without loss of information, during the discussion process. The proposed consensus scheme supposes that the moderator can interfere in the discussion process in different ways. The intervention can be a request to any expert to update his opinion or can be the adjustment of the weight of each expert's opinion. An optimal adjustment can be achieved through the execution of an optimization procedure that searches for the weights that maximize a corresponding soft consensus index. In order to demonstrate the usefulness of the presented consensus scheme, a technique for multicriteria analysis, based on fuzzy preference relation modeling, is utilized for solving a hypothetical enterprise strategy planning problem generated with the use of the Balanced Scorecard methodology. (C) 2009 Elsevier Inc. All rights reserved.
Abstract:
This paper presents results of research related to multicriteria decision making under information uncertainty. The Bellman-Zadeh approach to decision making in a fuzzy environment is utilized for analyzing multicriteria optimization models (<X, M> models) under deterministic information. Its application conforms to the principle of guaranteed result and provides constructive lines for obtaining harmonious solutions on the basis of analyzing the associated maxmin problems. This circumstance permits one to generalize the classic approach to considering the uncertainty of quantitative information (based on constructing and analyzing payoff matrices reflecting the effects that can be obtained for different combinations of solution alternatives and the so-called states of nature) from monocriteria to multicriteria decision-making problems. Considering that the uncertainty of information can produce considerable decision uncertainty regions, the resolving capacity of this generalization does not always permit one to obtain unique solutions. Taking this into account, the proposed general scheme of multicriteria decision making under information uncertainty also includes the construction and analysis of so-called <X, R> models (which contain fuzzy preference relations as criteria of optimality) as a means for the subsequent contraction of the decision uncertainty regions. The paper's results are of a universal character and are illustrated by a simple example. (c) 2007 Elsevier Inc. All rights reserved.
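The classic monocriteria construction this abstract generalizes, a payoff matrix over states of nature resolved by the guaranteed-result (Wald maxmin) criterion, can be sketched as follows. All payoffs are hypothetical and serve only to make the criterion concrete.

```python
# Wald (maxmin / guaranteed-result) choice over a payoff matrix.
# Rows: solution alternatives; columns: states of nature. Numbers hypothetical.
payoffs = {
    "x1": [5, 2, 8],
    "x2": [4, 4, 4],
    "x3": [9, 1, 6],
}

def wald_choice(payoffs):
    # Score each alternative by its worst-case payoff; pick the best worst case.
    return max(payoffs, key=lambda x: min(payoffs[x]))

print(wald_choice(payoffs))  # "x2": guaranteed payoff 4 vs 2 for x1 and 1 for x3
```

The paper's point is that once criteria are aggregated through the maxmin scheme, this same matrix construction carries over to the multicriteria case, though the resulting decision uncertainty regions may still contain more than one alternative.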
Abstract:
This paper presents a new methodology to estimate unbalanced harmonic distortions in a power system, based on measurements at a limited number of given sites. The algorithm utilizes evolutionary strategies (ES), a development branch of evolutionary algorithms. The problem-solving algorithm proposed herein makes use of data from various power quality meters, which can be synchronized either by high-technology GPS devices or by using information from a fundamental-frequency load flow, which makes the overall power quality monitoring system much less costly. The ES-based harmonic estimation model is applied to a 14-bus network to compare its performance to a conventional Monte Carlo approach. It is also applied to a 50-bus subtransmission network in order to compare the three-phase and single-phase approaches as well as the robustness of the proposed method. (C) 2010 Elsevier B.V. All rights reserved.
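To make the ES branch of evolutionary algorithms concrete, here is a minimal (1+1)-evolution strategy on a toy objective. This is illustrative only: the paper's actual estimator, objective function, encoding, and ES variant are more involved, and all parameters below are assumptions.

```python
import random

# Minimal (1+1)-evolution strategy: one parent, one Gaussian-mutated child,
# greedy elitist selection, and a simple geometric step-size schedule.
def es_minimize(f, x0, sigma=0.5, decay=0.99, iters=500, seed=1):
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    for _ in range(iters):
        child = [xi + rng.gauss(0.0, sigma) for xi in x]
        fc = f(child)
        if fc <= fx:              # keep the child only if it is no worse
            x, fx = child, fc
        sigma *= decay            # shrink mutation strength over time
    return x, fx

# Toy objective: squared distance to a known optimum.
target = [1.0, -2.0]
f = lambda v: sum((a - b) ** 2 for a, b in zip(v, target))
x, fx = es_minimize(f, [0.0, 0.0])
```

Real ES implementations typically use populations ((mu + lambda) selection) and self-adaptive step sizes, but the mutate-evaluate-select loop is the same.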
Abstract:
This study presents a decision-making method for maintenance policy selection for power plant equipment. The method is based on risk analysis concepts. Its first step consists in identifying equipment that is critical to power plant operational performance and availability, based on risk concepts. The second step involves proposing a potential maintenance policy that could be applied to the critical equipment in order to increase its availability. The costs associated with each potential maintenance policy must be estimated, including the maintenance costs and the cost of failure, which measures the consequences of critical equipment failure for power plant operation. Once the failure probabilities and the costs of failure are estimated, a decision-making procedure is applied to select the best maintenance policy. The decision criterion is to minimize the equipment cost of failure, considering the costs and likelihood of occurrence of failure scenarios. The method is applied to the analysis of a lubrication oil system used in the journal bearings of a gas turbine with more than 150 MW nominal output, installed in an open-cycle thermoelectric power plant. A design modification with the installation of a redundant oil pump is proposed to improve the availability of the lubricating oil system. (C) 2009 Elsevier Ltd. All rights reserved.
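The decision criterion described, minimizing cost of failure given scenario probabilities and costs, amounts to an expected-cost comparison across candidate policies. A minimal sketch, with entirely hypothetical policies, probabilities, and costs:

```python
# Selecting the maintenance policy that minimizes expected total cost.
# Policies, failure probabilities, and costs are hypothetical illustrations.
policies = {
    # policy: (maintenance cost, [(failure scenario probability, failure cost), ...])
    "corrective":     (10_000,  [(0.20, 500_000)]),
    "preventive":     (40_000,  [(0.05, 500_000)]),
    "redundant_pump": (120_000, [(0.01, 500_000)]),
}

def expected_total_cost(policy):
    maint, scenarios = policies[policy]
    # Expected cost = maintenance cost + sum of probability-weighted failure costs
    return maint + sum(p * c for p, c in scenarios)

best = min(policies, key=expected_total_cost)
print(best)  # "preventive": 40k + 25k = 65k vs 110k (corrective), 125k (redundant)
```

In the paper's setting the probabilities would come from reliability analysis of the lubrication oil system and the failure costs from the turbine's production losses; the comparison structure is the same.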
Abstract:
Systems of distributed artificial intelligence can be powerful tools in a wide variety of practical applications. Their most striking characteristic, emergent behavior, is also the one most responsible for the difficulty of designing these systems. This work proposes a tool capable of generating individual strategies for the elements of a multi-agent system, thereby providing the group with the means to obtain the desired results while working in a coordinated and cooperative manner. As an application example, a problem was taken as a basis in which a group of predators must catch a prey in a continuous three-dimensional environment. A strategy-synthesis system was implemented whose internal mechanism integrates simulators by means of the Particle Swarm Optimization (PSO) algorithm, a Swarm Intelligence technique. The system was tested in several simulation settings and was able to automatically synthesize successful hunting strategies, substantiating that the developed tool can provide satisfactory solutions, as long as it works with well-elaborated patterns, for problems of a complex nature that are difficult to solve through analytical approaches. (c) 2007 Elsevier Ltd. All rights reserved.
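The canonical PSO update that drives this kind of strategy synthesis is compact: each particle's velocity blends inertia, attraction to its own best position, and attraction to the swarm's best. The sketch below minimizes a toy function; the paper applies PSO to evolve hunting strategies, not to this objective, and all parameter values are assumptions.

```python
import random

# Canonical global-best PSO on a toy objective (illustrative only).
def pso_minimize(f, dim=2, n=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=3):
    rng = random.Random(seed)
    xs = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vs = [[0.0] * dim for _ in range(n)]
    pbest = [list(x) for x in xs]          # personal bests
    pbest_f = [f(x) for x in xs]
    g = min(range(n), key=lambda i: pbest_f[i])
    gbest, gbest_f = list(pbest[g]), pbest_f[g]   # global best
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Velocity: inertia + cognitive pull + social pull
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            fx = f(xs[i])
            if fx < pbest_f[i]:
                pbest[i], pbest_f[i] = list(xs[i]), fx
                if fx < gbest_f:
                    gbest, gbest_f = list(xs[i]), fx
    return gbest, gbest_f

sphere = lambda v: sum(x * x for x in v)
best, best_f = pso_minimize(sphere)
```

In a strategy-synthesis setting, each particle would encode the parameters of a candidate group strategy and f would be a simulator-based fitness score.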
Abstract:
The study of Information Technology (IT) outsourcing is relevant because companies are outsourcing their activities more than ever. An important IT outsourcing research area is the decision-making process. In other words, understanding how companies decide about outsourcing their IT operations is relevant from a research point of view. Therefore, the objective of this study is to understand the decision-making process used by Brazilian companies when outsourcing their IT operations. An analysis of the literature on this subject showed that six aspects are usually considered by companies in the evaluation of IT outsourcing service alternatives. This research verified how these six aspects are considered by Brazilian companies in IT outsourcing decisions. The survey showed that Brazilian companies consider all six aspects, but each of them has a different level of importance. The research also grouped the aspects according to their level of importance and interdependency, using factor analysis to understand the logic behind the IT outsourcing decision process. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
This paper analyzes the internationalization of new multinationals from emerging countries. It also focuses on Production's role in firm internationalization, a subject seldom addressed because the discipline of International Manufacturing is still embryonic, while International Business tends to overlook production. The authors integrate International Business and International Manufacturing concepts and frameworks in order to analyze new multinationals from emerging countries, using the empirical evidence of a survey plus case studies of Brazilian multinationals to understand late-movers' strategies and competences, with emphasis on production. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
This paper analyzes the convergence of the constant modulus algorithm (CMA) in a decision feedback equalizer using only a feedback filter. Several works have already observed that the CMA presents better performance than the decision-directed algorithm in the adaptation of the decision feedback equalizer, but theoretical analysis has always proved difficult, especially because of the analytical obstacles presented by the constant modulus criterion. In this paper, we surmount these obstacles by using a recent result concerning CM analysis, first obtained in a linear finite impulse response context with the objective of comparing its solutions to those obtained through the Wiener criterion. The theoretical analysis presented here confirms the robustness of the CMA when applied to the adaptation of the decision feedback equalizer and also defines a class of channels for which the algorithm will suffer from ill-convergence when initialized at the origin.
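For reference, the CMA stochastic-gradient step for a real-valued tapped filter is a one-liner: penalize the deviation of the output power from a target dispersion constant. This is a minimal generic sketch, not the paper's DFE adaptation (which feeds past decisions through the feedback filter); step size and dispersion constant are assumptions.

```python
# Constant modulus algorithm (CMA) update for a real-valued equalizer.
def cma_step(w, x, mu=0.01, R2=1.0):
    """One CMA update: y = w . x; w <- w - mu * y * (y^2 - R2) * x."""
    y = sum(wi * xi for wi, xi in zip(w, x))
    e = y * (y * y - R2)          # gradient of (y^2 - R2)^2 / 4 w.r.t. y
    return [wi - mu * e * xi for wi, xi in zip(w, x)]

# Single-tap example: y = 2, so y^2 > R2 and the tap shrinks toward |y| = 1.
w = cma_step([1.0], [2.0])
print(w)  # [0.88]: 1.0 - 0.01 * 2 * (4 - 1) * 2
```

Because the cost depends only on the output modulus, the update needs no training sequence, which is the property that makes CMA attractive for blind equalizer adaptation.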
Abstract:
In Rondonia State, Brazil, settlement processes have cleared 68,000 km² of tropical forest since the 1970s. The intensity of deforestation has differed by region, depending on driving factors such as roads and economic activities. Different histories of land-use activities and rates of change have resulted in mosaics of forest patches embedded in an agricultural matrix. Yet most assessments of deforestation and its effects on vegetation, soil and water focus on landscape patterns of current conditions, even though historical deforestation dynamics can influence current conditions strongly. Here, we develop and describe the use of four land-use dynamic indicators to capture historical land-use changes of catchments and to measure the rate of deforestation (annual deforestation rate), forest regeneration level (secondary forest mean proportion), time since disturbance (mean time since deforestation) and deforestation profile (deforestation profile curvature). We used the proposed indices to analyze a watershed located in central Rondonia. Landsat TM and ETM+ images were used to produce historical land-use maps covering the last 18 years, for each even year from 1984 to 2002, for 20 catchments. We found that the land-use dynamics indicators are able to distinguish catchments with different land-use change profiles. Four categories of historical land use were identified: old and dominant pasture cover on small properties; recent deforestation and dominance of secondary growth; old extensive pastures and large forest remnants; and recent deforestation, pasture and large forest remnants. Knowing historical deforestation processes is important for developing appropriate conservation strategies and defining priorities and actions for conserving forests currently under deforestation. (C) 2009 Elsevier B.V. All rights reserved.
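To make the first indicator concrete, here is one common log-ratio formulation of an annual deforestation rate. The paper defines its own set of four indicators, so this specific formula and the sample numbers are assumptions for illustration only.

```python
import math

# One common formulation of a mean annual deforestation rate: the log of the
# ratio of initial to final forest area, divided by the interval length.
def annual_deforestation_rate(a1, a2, years):
    """Rate of forest loss between initial area a1 and final area a2 (same units)."""
    return (1.0 / years) * math.log(a1 / a2)

# Hypothetical catchment: 100 km^2 of forest reduced to 80 km^2 over 10 years.
r = annual_deforestation_rate(100.0, 80.0, 10)
print(round(r, 4))  # about 0.0223, i.e. roughly 2.2% per year
```

A log-ratio rate has the convenient property of being independent of where the interval is split: the rate over 1984-2002 equals the average of the rates over its sub-intervals.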
Abstract:
The DSSAT/CANEGRO model was parameterized and its predictions evaluated using data from five sugarcane (Saccharum spp.) experiments conducted in southern Brazil. The data used are from two of the most important Brazilian cultivars. Some parameters whose values were either directly measured or considered to be well known were not adjusted. Ten of the 20 parameters were optimized using a Generalized Likelihood Uncertainty Estimation (GLUE) algorithm with the leave-one-out cross-validation technique. Model predictions were evaluated using measured data of leaf area index (LAI), stalk and aerial dry mass, sucrose content, and soil water content, using bias, root mean squared error (RMSE), modeling efficiency (Eff), correlation coefficient, and agreement index. The Decision Support System for Agrotechnology Transfer (DSSAT)/CANEGRO model simulated the sugarcane crop in southern Brazil well, using the parameterization reported here. The soil water content predictions were better for the rainfed treatment (mean RMSE = 0.122 mm) than for the irrigated treatment (mean RMSE = 0.214 mm). Predictions were best for aerial dry mass (Eff = 0.850), followed by stalk dry mass (Eff = 0.765) and then sucrose mass (Eff = 0.170). Number of green leaves showed the worst fit (Eff = -2.300). The cross-validation technique permits using multiple datasets that would have limited use if used independently, because of the heterogeneity of measures and measurement strategies.
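The core idea of GLUE is to weight "behavioural" parameter sets by a likelihood measure and reject the rest. The sketch below uses an inverse-error likelihood and a hypothetical rejection threshold purely for illustration; the actual likelihood measure, threshold, and parameter sets used for CANEGRO are those of the paper, not these.

```python
# GLUE-style weighting sketch: parameter sets with error above the behavioural
# threshold get weight 0; the rest are weighted by an inverse-error likelihood
# and normalized to sum to 1.
def glue_weights(errors, threshold):
    """errors: sum-of-squared-errors per candidate parameter set."""
    raw = [1.0 / e if e <= threshold else 0.0 for e in errors]
    total = sum(raw)
    return [r / total for r in raw]

# Three hypothetical parameter sets; the third is rejected as non-behavioural.
weights = glue_weights([2.0, 4.0, 50.0], threshold=10.0)
print(weights)  # [2/3, 1/3, 0.0]
```

The retained weights can then be used to form weighted prediction bounds, which is what gives GLUE its uncertainty-estimation character rather than a single best-fit parameter set.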
Abstract:
The aim of this study was the design of a set of benzofuroxan derivatives as antimicrobial agents, exploring the physicochemical properties of the related substituents. Topliss' decision tree approach was applied to select the substituent groups. Hierarchical cluster analysis was also performed to emphasize natural clusters and patterns. The compounds were obtained using two synthetic approaches aimed at reducing the number of synthetic steps as well as improving the yield. The minimal inhibitory concentration method was employed to evaluate the activity against multidrug-resistant Staphylococcus aureus strains. The most active compound was 4-nitro-3-(trifluoromethyl)[N'-(benzofuroxan-5-yl)methylene]benzhydrazide (MIC range 12.7-11.4 μg/mL), indicating that the antimicrobial activity was indeed influenced by the hydrophobic and electron-withdrawing properties of the substituent groups 3-CF(3) and 4-NO(2), respectively. (C) 2011 Elsevier Ltd. All rights reserved.
Abstract:
The aim of this work is to present a simple, practical and efficient protocol for drug design, in particular for diabetes, which includes selection of the illness, careful choice of a target as well as a bioactive ligand, and then use of various computer-aided drug design and medicinal chemistry tools to design novel potential drug candidates for different diseases. We selected the validated target dipeptidyl peptidase IV (DPP-IV), whose inhibition contributes to reducing glucose levels in type 2 diabetes patients. The most active inhibitor with a reported complex X-ray structure was initially extracted from the BindingDB database. By using molecular modification strategies widely employed in medicinal chemistry, along with current state-of-the-art tools in drug design (including flexible docking, virtual screening, molecular interaction fields, molecular dynamics, ADME and toxicity predictions), we propose 4 novel potential DPP-IV inhibitors with drug properties for diabetes control, which have been supported and validated by all the computational tools used herein.
Abstract:
With a 41-society sample of 9990 managers and professionals, we used hierarchical linear modeling to investigate the impact of both macro-level and micro-level predictors on subordinate influence ethics. While we found that both macro-level and micro-level predictors contributed to the model definition, we also found global agreement for a subordinate influence ethics hierarchy. Thus our findings provide evidence that developing a global model of subordinate ethics is possible, and should be based upon multiple criteria and multilevel variables. Journal of International Business Studies (2009) 40, 1022-1045. doi:10.1057/jibs.2008.109
Abstract:
Purpose - Using Brandenburger and Nalebuff's 1995 co-opetition model as a reference, the purpose of this paper is to develop a tool that, based on the tenets of classical game theory, enables scholars and managers to identify which games may be played in response to the different conflict-of-interest situations faced by companies in their business environments. Design/methodology/approach - The literature on game theory and business strategy is reviewed and a conceptual model, the strategic games matrix (SGM), is developed. Two novel games are described and modeled. Findings - The co-opetition model is not sufficient to realistically represent most of the conflict-of-interest situations faced by companies. This paper seeks to address that problem through development of the SGM, which expands upon Brandenburger and Nalebuff's model by providing a broader perspective, through incorporation of an additional dimension (power ratio between players) and three novel game types (rival, individualistic, and associative). Practical implications - The proposed model, based on the concepts of game theory, can be used to train decision- and policy-makers to better understand, interpret and formulate conflict management strategies. Originality/value - A practical and original tool for using game models in conflict-of-interest situations is generated. Basic classical games, such as Nash, Stackelberg, Pareto, and Minimax, are mapped on the SGM to suggest in which situations they could be useful. Two innovative games are described to fit four different types of conflict situations that so far have no corresponding game in the literature. A test application of the SGM to a classic Intel Corporation strategic management case, in the complex personal computer industry, shows that the proposed method is able to describe, interpret, analyze, and prescribe optimal competitive and/or cooperative strategies for each conflict-of-interest situation.
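The classical games mapped onto the SGM all rest on the equilibrium notion named after Nash. A brute-force finder for pure-strategy Nash equilibria in a two-player bimatrix game is short enough to sketch; the payoffs below (a prisoner's dilemma) are hypothetical and unrelated to the Intel case in the paper.

```python
# Brute-force pure-strategy Nash equilibria of a 2-player bimatrix game.
def pure_nash(A, B):
    """A[i][j], B[i][j]: row/column player payoffs. Returns equilibrium cells."""
    eq = []
    for i in range(len(A)):
        for j in range(len(A[0])):
            # (i, j) is an equilibrium if neither player gains by deviating alone.
            row_best = all(A[i][j] >= A[k][j] for k in range(len(A)))
            col_best = all(B[i][j] >= B[i][l] for l in range(len(A[0])))
            if row_best and col_best:
                eq.append((i, j))
    return eq

# Prisoner's dilemma: strategy 0 = cooperate, 1 = defect.
A = [[3, 0], [5, 1]]   # row player payoffs
B = [[3, 5], [0, 1]]   # column player payoffs
print(pure_nash(A, B))  # [(1, 1)]: mutual defection is the unique pure equilibrium
```

Tools like the SGM sit one level above this machinery: they classify which game structure (and hence which equilibrium concept, Nash, Stackelberg, Pareto, or Minimax) fits a given business situation before any payoff matrix is solved.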