855 results for Fuzzy rules
Abstract:
When Vietnam joined the World Trade Organization (WTO) in 2007, it was granted an accession period running to 2014, during which tariffs would have to fall according to the accession agreement. This paper evaluates this 2007–2014 trade liberalization by building an applied general equilibrium model and calibrating it to Vietnamese data. The model pays careful attention to the fact that Vietnam has many state-owned enterprises. The model simulations show that the WTO tariff reductions will reduce overall welfare, and that the biggest loss will fall on poor rural households. The paper proposes alternative tariff reforms that would both raise overall welfare and reduce income inequality.
Abstract:
The literature on the regulation of multinationals' transfer prices has not considered the possibility that governments may use transfer pricing rules strategically when they compete with other governments. The present paper analyses this case and shows that, even in the absence of agency considerations, a non-cooperative equilibrium is characterised by above-optimal levels of effective taxation. We then derive conditions under which harmonization of transfer pricing rules leads to a Pareto improvement, and show that harmonization according to the 'arm's length' principle, the form of harmonization advocated by the OECD, may not be Pareto improving.
Abstract:
Accurate experimental determination and correlation of the solubilities of antibiotics and anti-inflammatory drugs in supercritical fluids (SCFs) are essential for the development of supercritical technologies in the pharmaceutical industry. In this work, the solubilities of penicillin G, penicillin V, flurbiprofen, ketoprofen, naproxen, ibuprofen, aspirin and diflunisal in supercritical carbon dioxide (SCCO2) were correlated using the Peng-Robinson equation of state (PR EOS) with the modified Kwak and Mansoori mixing rules (mKM) and with the Bartle model. The performance of the mKM rules in correlating the solubilities was compared against the conventional van der Waals mixing rules. In the proposed model, vapor pressure is treated as an adjustable parameter along with the binary interaction parameters, and the mixing-rule constants and vapor pressure expression coefficients are temperature independent. Optimizing these constants against experimental data yields the binary interaction parameters along with vapor pressure correlations. Sublimation enthalpies estimated with both models were compared with experimental values reported in the literature.
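For reference, the Peng-Robinson equation of state and the conventional van der Waals mixing rules against which the mKM rules are benchmarked take the following standard form (the modified Kwak and Mansoori rules recombine the EOS constants differently; their exact expressions are not reproduced here):

```latex
P = \frac{RT}{V_m - b} - \frac{a\,\alpha(T)}{V_m(V_m + b) + b(V_m - b)},
\qquad
\alpha(T) = \left[1 + \kappa\left(1 - \sqrt{T/T_c}\right)\right]^2,
```

with $a = 0.45724\,R^2 T_c^2 / P_c$, $b = 0.07780\,R T_c / P_c$ and $\kappa = 0.37464 + 1.54226\,\omega - 0.26992\,\omega^2$. The van der Waals one-fluid mixing rules are

```latex
a_m = \sum_i \sum_j x_i x_j \sqrt{a_i a_j}\,(1 - k_{ij}),
\qquad
b_m = \sum_i x_i b_i,
```

where $k_{ij}$ is the binary interaction parameter fitted to the solubility data.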
Abstract:
The Fuzzy Waste Load Allocation Model (FWLAM), developed in an earlier study, derives the optimal fractional removal levels for base flow conditions, considering the goals of the Pollution Control Agency (PCA) and the dischargers. The Modified Fuzzy Waste Load Allocation Model (MFWLAM), developed subsequently, is a stochastic model that considers the moments (mean, variance and skewness) of the water quality indicators, incorporating uncertainty due to randomness of the input variables along with uncertainty due to imprecision. The modified model reduces the risk of low water quality significantly, but the additional constraints lead to a low value of the acceptability level A, interpreted as the maximized minimum satisfaction in the system. To improve this value, a new model combining FWLAM and MFWLAM is presented, allowing some violations of the MFWLAM constraints. This combined model is a multiobjective optimization model with two objectives: maximization of the acceptability level and minimization of constraint violations. Fuzzy multiobjective programming, goal programming and fuzzy goal programming are used to find the solutions, with Probabilistic Global Search Lausanne (PGSL) as the nonlinear optimization tool. The methodology is applied to a case study of the Tunga-Bhadra river system in southern India. The model yields a compromise solution with a higher acceptability level than MFWLAM and a satisfactory value of risk, so the goal of risk minimization is achieved at a comparatively better acceptability level.
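The acceptability level referred to above is the standard max-min satisfaction construct of fuzzy optimization. A generic sketch of the formulation (not the paper's exact constraint set) is:

```latex
\max_{x,\;\lambda}\ \lambda
\quad \text{subject to} \quad
\mu_k(x) \ge \lambda \;\; \forall k,
\qquad 0 \le \lambda \le 1,
\qquad x \in X,
```

where $\mu_k(x)$ is the membership (satisfaction) function of goal $k$ for the PCA or a discharger, and $\lambda$ is the minimum satisfaction being maximized. The combined model then trades this objective off against a second objective penalizing violations of the MFWLAM constraints.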
Abstract:
The present study applies cluster analysis, Fuzzy Cluster Analysis (FCA) and Kohonen Artificial Neural Networks (KANN) to classify 159 meteorological stations in India into meteorologically homogeneous groups. Eight parameters, namely latitude, longitude, elevation, average temperature, humidity, wind speed, sunshine hours and solar radiation, are used as the classification criteria. The optimal number of groups is determined to be 14 based on the Davies-Bouldin index. For the present study, the FCA approach is observed to perform better than the other two methodologies.
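As an illustration of the group-count selection step, the sketch below scans candidate cluster counts and picks the one minimizing the Davies-Bouldin index. It uses k-means as a stand-in for the paper's three methods, and random data in place of the 159-station parameter matrix:

```python
# Hypothetical sketch: choosing the number of station groups with the
# Davies-Bouldin index, using k-means as a stand-in clustering method.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import davies_bouldin_score
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Placeholder for the 159 stations x 8 parameters matrix
# (lat, lon, elevation, temperature, humidity, wind, sunshine, radiation).
X = StandardScaler().fit_transform(rng.normal(size=(159, 8)))

scores = {}
for k in range(2, 21):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    scores[k] = davies_bouldin_score(X, labels)  # lower is better

best_k = min(scores, key=scores.get)
print(f"Optimal number of groups by Davies-Bouldin index: {best_k}")
```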
Abstract:
A health-monitoring and life-estimation strategy for composite rotor blades is developed in this work. The cross-sectional stiffness reduction obtained from physics-based models is expressed as a function of the life of the structure using a recent phenomenological damage model. This stiffness reduction is then used to study the behavior of measurable system parameters such as blade deflections, loads, and strains of a composite rotor blade in static analysis and forward flight. The simulated measurements are obtained from an aeroelastic analysis of the composite rotor blade, based on finite elements in space and time, with physics-based damage models that are then linked to the life consumption of the blade. The model-based measurements are contaminated with noise to simulate real data. Genetic fuzzy systems are developed for global online prediction of physical damage and life consumption using displacement- and force-based measurement deviations between damaged and undamaged conditions; local online prediction is done using strains measured along the blade length. It is observed that life consumption is about 12-15% of the total life of the blade in the matrix-cracking zone and about 45-55% in the debonding/delamination zone. It is also observed that the success rate of the genetic fuzzy systems depends on the number and type of measurements, the training, and the testing noise level. The genetic fuzzy systems work well with noisy data and are recommended for online structural health monitoring of composite helicopter rotor blades.
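A genetic fuzzy system in this sense is a fuzzy inference model whose parameters are tuned by a genetic algorithm. The toy sketch below (illustrative names and synthetic data, not the paper's model) evolves the membership-function centers of a tiny one-input map from measurement deviation to estimated life consumption:

```python
# Toy genetic fuzzy system: a GA tunes membership-function centers of a
# small fuzzy map from measurement deviation to life-consumption estimate.
import numpy as np

rng = np.random.default_rng(1)

def tri(x, a, b, c):
    """Triangular membership function."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9),
                                 (c - x) / (c - b + 1e-9)), 0.0)

def fuzzy_predict(x, params):
    # Three fuzzy sets (low/medium/high deviation); rule consequents are
    # fixed at 10%, 50%, 90% life consumption.
    centers = np.sort(params)
    mu = np.array([tri(x, c - 0.5, c, c + 0.5) for c in centers])
    consequents = np.array([0.1, 0.5, 0.9])
    return (mu.T @ consequents) / (mu.sum(axis=0) + 1e-9)  # weighted average

# Toy noisy training data: deviation grows with life consumption.
x_train = rng.uniform(0, 1, 200)
y_train = np.clip(x_train + 0.05 * rng.normal(size=200), 0, 1)

def fitness(params):
    return -np.mean((fuzzy_predict(x_train, params) - y_train) ** 2)

# Plain generational GA over the three membership centers.
pop = rng.uniform(0, 1, size=(40, 3))
for _ in range(100):
    fit = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(fit)[-20:]]        # truncation selection
    kids = (parents[rng.integers(0, 20, 40)] +
            0.05 * rng.normal(size=(40, 3)))    # mutation-only offspring
    pop = np.clip(kids, 0, 1)

best = pop[np.argmax([fitness(p) for p in pop])]
print("Tuned membership centers:", np.sort(best))
```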
Abstract:
RECONNECT is a Network-on-Chip (NoC) using a honeycomb topology. In this paper we focus on properties of general rules applicable to a variety of routing algorithms for the NoC, taking into account the links that a honeycomb topology lacks compared to a mesh. We also extend the original proposal [5] and show a method to insert and extract data to and from the network. Access Routers at the boundary of the execution fabric establish connections to multiple periphery modules and create a torus to decrease the node distances. Our approach is scalable and ensures homogeneity among the compute elements in the NoC. We synthesized the proposed enhancement and evaluated it in terms of power dissipation and area. Our results indicate that the impact of the necessary alterations to the fabric is negligible and affects the data transfer between the fabric and the periphery only marginally.
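To make the missing-link constraint concrete, the sketch below models a honeycomb as a mesh in which alternate vertical links are absent (a brick-wall embedding) and shows a simple greedy route that sidesteps whenever the needed vertical link is missing. This is an illustration of the general problem, not the routing rules of the paper:

```python
# Illustrative only: greedy routing on a honeycomb viewed as a mesh
# with alternate vertical links removed (brick-wall embedding).
def vlink(x, y):
    # Vertical link between (x, y) and (x, y+1) exists on even parity.
    return (x + y) % 2 == 0

def route(src, dst):
    (x, y), (dx, dy) = src, dst
    path = [(x, y)]
    while (x, y) != (dx, dy):
        if y < dy and vlink(x, y):
            y += 1                              # take the upward link
        elif y > dy and vlink(x, y - 1):
            y -= 1                              # take the downward link
        elif y != dy:
            # Needed vertical link is missing: sidestep one column
            # (toward the destination when possible).
            x += 1 if dx > x else -1 if dx < x else 1
        else:
            x += 1 if dx > x else -1            # finish horizontally
        path.append((x, y))
    return path

print(route((0, 0), (3, 3)))  # zigzags around the missing links
```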
Abstract:
The voltage stability control problem has become an important concern for utilities transmitting power over long distances. This paper presents an approach using fuzzy set theory for reactive power control with the purpose of improving the voltage stability of a power system. The sensitivities of the load-bus voltage deviations from their desired values with respect to the reactive power control variables form the basis of the proposed fuzzy logic control (FLC), whose objective is to minimize these deviations at all load buses. The control variables considered are switchable VAR compensators, On Load Tap Changing (OLTC) transformers and generator excitations. Voltage deviations and control variables are translated into fuzzy set notation to formulate the relation between the voltage deviations and the controlling ability of the control devices. The developed fuzzy system is tested on several simulated practical Indian power systems and on IEEE standard test systems, and its performance is compared with a conventional optimization technique; the results obtained are encouraging. Results for a 24-node equivalent EHV system of part of the Indian southern grid and for the IEEE New England 39-bus system are presented for illustration. The proposed fuzzy-expert technique is found suitable for on-line application in energy control centres, as the solution is obtained quickly with significant speedups.
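The sketch below illustrates the core fuzzy-control idea in miniature: fuzzify a bus voltage deviation into linguistic sets, apply a small rule base, and defuzzify into a VAR-compensator adjustment. The membership ranges, rule table and output scaling are all illustrative assumptions, not the paper's design:

```python
# Minimal Mamdani-style sketch: voltage deviation -> VAR-compensator step.
def tri(x, a, b, c):
    """Triangular membership function."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def var_adjustment(dv):
    """dv: voltage deviation in p.u. (actual - desired). Returns MVAr step."""
    # Fuzzify the deviation into three linguistic sets (illustrative ranges).
    mu = {
        "negative": tri(dv, -0.10, -0.05, 0.0),
        "zero":     tri(dv, -0.05,  0.0,  0.05),
        "positive": tri(dv,  0.0,   0.05, 0.10),
    }
    # Rules: low voltage -> inject VARs; high voltage -> absorb VARs.
    consequents = {"negative": +20.0, "zero": 0.0, "positive": -20.0}
    num = sum(mu[s] * consequents[s] for s in mu)
    den = sum(mu.values()) or 1.0
    return num / den  # weighted-average defuzzification

for dv in (-0.06, -0.02, 0.0, 0.03):
    print(f"dV = {dv:+.2f} p.u. -> VAR step = {var_adjustment(dv):+.1f} MVAr")
```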
Abstract:
The hazards associated with major accident hazard (MAH) industries are fire, explosion and toxic gas release. Of these, toxic gas release is the worst, as it has the potential to cause extensive fatalities. Qualitative and quantitative hazard analyses are essential for the identification and quantification of these hazards in chemical industries. Fault tree analysis (FTA) is an established technique for hazard identification, with the advantage of being both qualitative and quantitative if the probabilities and frequencies of the basic events are known. This paper outlines the estimation of the probability of a chlorine release from the storage and filling facility of a chlor-alkali industry using FTA. An attempt has also been made to arrive at the probability of chlorine release using expert elicitation and a proven fuzzy logic technique for Indian conditions. Sensitivity analysis has been carried out to evaluate the percentage contribution of each basic event that could lead to a chlorine release. Two-dimensional fuzzy fault tree analysis (TDFFTA) has been proposed for balancing the hesitation factor involved in expert elicitation.
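The sketch below shows the two FTA calculations in miniature: a crisp top-event probability from basic-event probabilities, and a fuzzy variant in which each expert-elicited probability is a triangular fuzzy number, propagated at the vertices (exact here because the gate formulas are monotone in each argument). The gate structure and all numbers are illustrative, not the paper's tree:

```python
# Crisp and fuzzy fault-tree gate arithmetic (independent basic events).
import numpy as np

def gate_or(ps):   # P(at least one event occurs)
    return 1.0 - np.prod(1.0 - np.asarray(ps), axis=0)

def gate_and(ps):  # P(all events occur)
    return np.prod(np.asarray(ps), axis=0)

# Crisp basic-event probabilities (illustrative values).
valve_leak, pipe_rupture, alarm_fail = 2e-3, 5e-4, 1e-2
release = gate_or([gate_and([valve_leak, alarm_fail]), pipe_rupture])
print(f"Crisp P(chlorine release) = {release:.2e}")

# Fuzzy version: (low, modal, high) triangular numbers from elicitation.
valve_leak   = np.array([1e-3, 2e-3, 4e-3])
pipe_rupture = np.array([2e-4, 5e-4, 1e-3])
alarm_fail   = np.array([5e-3, 1e-2, 2e-2])
release = gate_or([gate_and([valve_leak, alarm_fail]), pipe_rupture])
print("Fuzzy P(release) (low, modal, high):", release)
```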
Abstract:
This thesis studies the interest-rate policy of the ECB by estimating monetary policy rules using real-time data and central bank forecasts. The aim of the estimations is to characterize a decade of common monetary policy and to examine how different models perform at this task. The estimated rules include contemporaneous Taylor rules, forward-looking Taylor rules, nonlinear rules and forecast-based rules. The nonlinear models allow for the possibility of zone-like preferences and an asymmetric response to key variables. The models therefore encompass the most popular subgroup of simple models used for policy analysis as well as the more unusual nonlinear approach. In addition to the empirical work, this thesis also contains a more general discussion of monetary policy rules, mostly from a New Keynesian perspective, including an overview of notable related studies, optimal policy, policy gradualism and several other related subjects. The regression estimations are performed with either least squares or the generalized method of moments, depending on the requirements of the estimation. The estimations use data from both the Euro Area Real-Time Database and the central bank forecasts published in ECB Monthly Bulletins; these sources represent some of the best data available for this kind of analysis. The main results are that forward-looking behavior appears highly prevalent, but that standard forward-looking Taylor rules offer only ambivalent results with regard to inflation. Nonlinear models are shown to work but do not have a strong rationale over a simpler linear formulation. The forecasts, however, appear highly useful in characterizing policy and may offer the most accurate depiction of a predominantly forward-looking central bank: in particular, the inflation response appears much stronger, while the output response becomes highly forward-looking as well.
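For reference, a generic Taylor-type rule with interest-rate smoothing, of the kind estimated here, can be written as follows (the coefficient names are generic, not the thesis's notation):

```latex
i_t = \rho\, i_{t-1}
      + (1-\rho)\left[\, \bar{r} + \pi^{*}
      + \beta\,(\pi_t - \pi^{*}) + \gamma\, y_t \,\right]
      + \varepsilon_t,
```

where $i_t$ is the policy rate, $\pi_t$ inflation, $\pi^{*}$ the inflation target, $y_t$ the output gap, $\bar{r}$ the equilibrium real rate and $\rho$ the smoothing parameter. The forward-looking variant replaces $\pi_t$ (and possibly $y_t$) with expected future values such as $E_t\,\pi_{t+k}$, which is where the real-time forecasts enter the estimation.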
Abstract:
Hypertexts are digital texts characterized by interactive hyperlinking and a fragmented textual organization. Increasingly prominent since the early 1990s, hypertexts have become a common text type both on the Internet and in a variety of other digital contexts. Although studied widely in disciplines like hypertext theory and media studies, formal linguistic approaches to hypertext continue to be relatively rare. This study examines coherence negotiation in hypertext with particular reference to hypertext fiction. Coherence, or the quality of making sense, is a fundamental property of textness. Proceeding from the premise that coherence is a subjectively evaluated property rather than an objective quality arising directly from textual cues, the study focuses on the processes through which readers interact with hyperlinks and negotiate continuity between hypertextual fragments. The study begins with a typological discussion of textuality and an overview of the historical and technological precedents of modern hypertexts. Then, making use of text linguistic, discourse analytical, pragmatic, and narratological approaches to textual coherence, the study takes established models developed for analyzing and describing conventional texts and examines their applicability to hypertext. Primary data derived from a collection of hyperfictions is used throughout to illustrate the mechanisms in practice. Hypertextual coherence negotiation is shown to require the ability to cognitively operate between local and global coherence by means of processing lexical cohesion, discourse topical continuities, inferences and implications, and shifting cognitive frames. The main conclusion of the study is that the style of reading required by hypertextuality fosters a new paradigm of coherence. Defined as fuzzy coherence, this new approach to textual sensemaking is predicated on an acceptance of the coherence challenges readers experience when the act of reading comes to involve repeated encounters with referentially imprecise hyperlinks and discourse topical shifts. A practical application of fuzzy coherence is shown to be in effect in the way coherence is actively manipulated in hypertext narratives.
Abstract:
Coastal lagoons are complex ecosystems exhibiting a high degree of non-linearity in the distribution and exchange of nutrients dissolved in the water column, owing to their spatio-temporal characteristics. This has a direct influence on the concentration of chlorophyll-a, an indicator of primary productivity in water bodies such as lakes and lagoons. Moreover, the seasonal variability in the characteristics of large-scale basins further contributes to the uncertainties in the data on the physico-chemical and biological characteristics of the lagoons. Modelling the distributions of the nutrients with respect to the chlorophyll-a concentrations therefore requires an approach that appropriately accounts for both the non-linearity of the ecosystem and the uncertainties in the available data. In the present investigation, fuzzy logic was used to develop a new model of primary production for the Pulicat lagoon on the southeast coast of India. Multiple regression analysis revealed that the concentrations of chlorophyll-a in the lagoon were strongly influenced by the dissolved concentrations of nitrate, nitrite and phosphorus, to different extents over different seasons and years. A high degree of agreement was obtained between the actual field values and those predicted by the new fuzzy model (d = 0.881 to 0.788) for the years 2005 and 2006, illustrating the efficiency of the model in predicting chlorophyll-a values in the lagoon.
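As a minimal illustration of the multiple-regression step described above, the sketch below fits chlorophyll-a against the three nutrient concentrations. Synthetic data stands in for the Pulicat lagoon measurements, and the paper's actual predictive model is the fuzzy-logic system, not this regression:

```python
# Multiple linear regression of chlorophyll-a on nutrient concentrations.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
n = 120                                    # illustrative sample size
X = rng.uniform(0, 5, size=(n, 3))         # nitrate, nitrite, phosphorus (mg/L)
chl = 1.2 * X[:, 0] + 0.8 * X[:, 1] + 2.0 * X[:, 2] + rng.normal(0, 0.5, n)

model = LinearRegression().fit(X, chl)
print("Coefficients (nitrate, nitrite, phosphorus):", model.coef_.round(2))
print("R^2:", round(model.score(X, chl), 3))
```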