932 results for attracting sets


Relevance: 10.00%

Abstract:

This paper presents results of research related to multicriteria decision making under information uncertainty. The Bellman-Zadeh approach to decision making in a fuzzy environment is utilized for analyzing multicriteria optimization models (<X, M> models) under deterministic information. Its application conforms to the principle of guaranteed result and provides constructive lines in obtaining harmonious solutions on the basis of analyzing associated maxmin problems. This circumstance permits one to generalize the classic approach to considering the uncertainty of quantitative information (based on constructing and analyzing payoff matrices reflecting effects which can be obtained for different combinations of solution alternatives and the so-called states of nature) in monocriteria decision making to multicriteria problems. Considering that the uncertainty of information can produce considerable decision uncertainty regions, the resolving capacity of this generalization does not always permit one to obtain unique solutions. Taking this into account, a proposed general scheme of multicriteria decision making under information uncertainty also includes the construction and analysis of the so-called <X, R> models (which contain fuzzy preference relations as criteria of optimality) as a means for the subsequent contraction of the decision uncertainty regions. The paper's results are of a universal character and are illustrated by a simple example. (c) 2007 Elsevier Inc. All rights reserved.
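
A minimal sketch of the Bellman-Zadeh max-min aggregation underlying the <X, M> analysis, with made-up membership values (NumPy only): each normalized criterion acts as a fuzzy membership function, the fuzzy decision is their intersection (minimum), and the harmonious solution maximizes that minimum, in line with the principle of guaranteed result.

```python
import numpy as np

# Hypothetical normalized criteria memberships: rows = solution alternatives,
# columns = criteria, all already mapped to [0, 1].
mu = np.array([
    [0.70, 0.40, 0.90],
    [0.55, 0.65, 0.60],
    [0.80, 0.50, 0.45],
])

# Fuzzy decision = intersection (minimum) of the criterion memberships.
decision = mu.min(axis=1)

# Harmonious (guaranteed-result) solution = alternative maximizing that minimum.
best = int(np.argmax(decision))
print("membership of each alternative:", decision)
print("max-min alternative:", best)
```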

Relevance: 10.00%

Abstract:

This paper presents the development of a prototype of a tubular linear induction motor applied to onshore oil exploitation, named MAT AE OS (the Portuguese acronym for Tubular Asynchronous Motor for Onshore Oil Exploitation). The function of this motor is to directly drive the sucker-rod pump installed downhole in the oil well. Considering the drawbacks and operational costs of the conventional oil extraction method, which is based on the walking beam and rod string system, the developed prototype is intended to become a feasible alternative from both technical and economic points of view. At present, the MAT AE OS prototype is installed on a test bench at the Applied Electromagnetism Laboratory of the Escola Politecnica da Universidade de Sao Paulo. The complete testing system is controlled and supervised by dedicated software, enabling good flexibility in operation, data acquisition, and performance analysis. The test results indicate that the motor develops a constant lift force along the pumping cycle, as shown by the measured dynamometric charts. The evaluated electromechanical performance also appears to be superior to that obtained with the traditional method. The system utilizing the MAT AE OS prototype allows the complete elimination of the rod strings required by the conventional equipment, indicating that the new system may advantageously replace the surface mechanical components presently in use.

Relevance: 10.00%

Abstract:

A broader characterization of industrial wastewaters, especially with respect to hazardous compounds and their potential toxicity, is often necessary in order to determine the best practical treatment (or pretreatment) technology available to reduce the discharge of harmful pollutants to the environment or to publicly owned treatment works. Using a toxicity-directed approach, this paper sets the basis for a rational treatability study of polyester resin manufacturing wastewater. Relevant physical and chemical characteristics were determined. Respirometry was used for toxicity reduction evaluation after physical and chemical effluent fractionation. Of all the procedures investigated, only air stripping was significantly effective in reducing wastewater toxicity. Air stripping at pH 7 reduced toxicity by 18.2%, while at pH 11 a toxicity reduction of 62.5% was observed. Results indicated that the toxicants responsible for the most significant fraction of the effluent's instantaneous toxic effect on unadapted activated sludge were organic compounds poorly volatilized, or not volatilized at all, under acid conditions. These results provide useful directions for conducting treatability studies grounded on actual effluent properties rather than on empirical assumptions or on the scarce specific data available for this kind of industrial wastewater. (C) 2008 Elsevier B.V. All rights reserved.

Relevance: 10.00%

Abstract:

The proposed method for analyzing the composition of the cost of electricity is based on the energy conversion processes and on the destruction of exergy through the several thermodynamic processes that comprise a combined cycle power plant. The method uses thermoeconomics to evaluate and allocate the cost of exergy throughout the processes, considering costs related to inputs and investment in equipment. Although the concept may be applied to any combined cycle or cogeneration plant, this work develops the mathematical modeling only for three-pressure heat recovery steam generator (HRSG) configurations with total condensation of the produced steam. Any n x 1 plant configuration (n sets of gas turbines and HRSGs associated with one steam turbine generator and condenser) can be studied with the developed model, assuming that every train operates identically and in steady state. The presented model was conceived from the complex configuration of a real power plant, over which variations may be applied in order to adapt it to a given configuration under study [Borelli SJS. Method for the analysis of the composition of electricity costs in combined cycle thermoelectric power plants. Master in Energy Dissertation, Interdisciplinary Program of Energy, Institute of Electrotechnics and Energy, University of Sao Paulo, Sao Paulo, Brazil, 2005 (in Portuguese)]. The variations and adaptations include, for instance, the use of reheat, supplementary firing, and partial load operation. It is also possible to undertake sensitivity analyses on geometrical equipment parameters. (C) 2007 Elsevier Ltd. All rights reserved.
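
A toy sketch of the thermoeconomic cost-allocation principle the method builds on (not the authors' full n x 1 model): each component's product exergy inherits the cost of its fuel exergy plus the component's investment cost rate, c_P E_P = c_F E_F + Z. The serial chain and all numbers below are made up, and the product of each component is simply taken as the fuel of the next.

```python
# Greatly simplified serial cost chain; all names and numbers are invented.
components = [
    # (name, fuel exergy rate [kW], product exergy rate [kW], investment cost rate [$/h])
    ("gas turbine",   250_000.0, 100_000.0, 900.0),
    ("HRSG",           60_000.0,  45_000.0, 250.0),
    ("steam turbine",  45_000.0,  15_000.0, 300.0),
]

c = 0.020  # assumed unit cost of the fuel exergy entering the first component [$/kWh]
for name, e_fuel, e_prod, z in components:
    # Cost balance c_P * E_P = c_F * E_F + Z, with the previous product
    # taken (for simplicity) as the fuel of the next component.
    c = (c * e_fuel + z) / e_prod
    print(f"{name:14s} unit exergy cost of product: {100.0 * c:.2f} cents/kWh")
```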

Relevance: 10.00%

Abstract:

This paper investigates probabilistic logics endowed with independence relations. We review propositional probabilistic languages without and with independence. We then consider graph-theoretic representations for propositional probabilistic logic with independence; complexity is analyzed, algorithms are derived, and examples are discussed. Finally, we examine a restricted first-order probabilistic logic that generalizes relational Bayesian networks. (c) 2007 Elsevier Inc. All rights reserved.
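
As a hedged illustration of the propositional side of such logics (not the algorithms of the paper), the sketch below checks whether a small probability assessment is satisfiable via linear programming over the truth assignments; the assessment and the use of scipy.optimize.linprog are my own choices.

```python
from itertools import product
from scipy.optimize import linprog

# Truth assignments (possible worlds) for the propositions A and B.
worlds = list(product([0, 1], repeat=2))

# Hypothetical assessment: P(A) = 0.7, P(B) = 0.6, P(A and B) = 0.4.
constraints = [
    (lambda a, b: a,     0.7),
    (lambda a, b: b,     0.6),
    (lambda a, b: a * b, 0.4),
]

A_eq = [[float(f(a, b)) for a, b in worlds] for f, _ in constraints]
A_eq.append([1.0] * len(worlds))          # the world probabilities sum to one
b_eq = [p for _, p in constraints] + [1.0]

# Any feasible point of this linear program is a probability measure
# satisfying the assessment, so feasibility amounts to satisfiability.
res = linprog(c=[0.0] * len(worlds), A_eq=A_eq, b_eq=b_eq,
              bounds=[(0.0, 1.0)] * len(worlds))
print("assessment is", "satisfiable" if res.success else "unsatisfiable")
```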

Relevance: 10.00%

Abstract:

This paper presents new insights and novel algorithms for strategy selection in sequential decision making with partially ordered preferences; that is, where some strategies may be incomparable with respect to expected utility. We assume that incomparability amongst strategies is caused by indeterminacy/imprecision in probability values. We investigate six criteria for consequentialist strategy selection: Gamma-Maximin, Gamma-Maximax, Gamma-Maximix, Interval Dominance, Maximality and E-admissibility. We focus on the popular decision tree and influence diagram representations. Algorithms resort to linear/multilinear programming; we describe implementation and experiments. (C) 2010 Elsevier B.V. All rights reserved.
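
A hedged sketch of several of these criteria on a toy problem: the credal set is approximated by a finite set of extreme points and each strategy's expected utility is tabulated per extreme point; the paper's tree and influence-diagram algorithms and the Gamma-Maximix criterion are not reproduced, and all numbers are invented.

```python
import numpy as np

# Hypothetical expected utilities: rows = strategies, columns = extreme points
# of the credal set of probability distributions.
eu = np.array([
    [3.0, 5.0, 4.0],
    [4.5, 2.0, 4.0],
    [4.0, 4.0, 3.5],
    [2.0, 3.0, 2.5],
])
lo, hi = eu.min(axis=1), eu.max(axis=1)
n = len(eu)

print("Gamma-Maximin choice:", int(np.argmax(lo)))
print("Gamma-Maximax choice:", int(np.argmax(hi)))

# Interval dominance: discard s if some t has lo[t] > hi[s].
interval = [s for s in range(n) if not any(lo[t] > hi[s] for t in range(n))]
print("surviving interval dominance:", interval)

# Maximality (vertex approximation): discard s if some t is strictly better
# under every extreme point.
maximal = [s for s in range(n)
           if not any(np.all(eu[t] > eu[s]) for t in range(n))]
print("maximal strategies:", maximal)

# E-admissibility (vertex approximation): keep s if it is a best response to
# at least one extreme point.
e_adm = sorted({int(np.argmax(eu[:, j])) for j in range(eu.shape[1])})
print("E-admissible strategies (approx.):", e_adm)
```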

Relevance: 10.00%

Abstract:

This work discusses the determination of breathing patterns in time sequences of images obtained from magnetic resonance (MR) imaging and their use in the temporal registration of coronal and sagittal images. The registration is made without the use of any triggering information or any special gas to enhance contrast. The temporal sequences of images are acquired in free breathing. The real movement of the lung has never been seen directly, as it is totally dependent on its surrounding muscles and collapses without them. The visualization of the lung in motion is a current topic of research in medicine. Lung movement is not periodic and is susceptible to variations in the degree of respiration. Compared with computerized tomography (CT), MR imaging involves longer acquisition times but is preferable because it does not involve radiation. As coronal and sagittal sequences of images are orthogonal to each other, their intersection corresponds to a segment in three-dimensional space. The registration is based on the analysis of this intersection segment. A time sequence of this intersection segment can be stacked, defining a two-dimensional spatio-temporal (2DST) image. The algorithm proposed in this work can detect asynchronous movements of the internal lung structures and of the organs surrounding the lung. It is assumed that the diaphragmatic movement is the principal movement and that all the lung structures move almost synchronously. The synchronization is performed through a pattern named the respiratory function, which is obtained by processing a 2DST image. An interval Hough transform algorithm searches for movements synchronized with the respiratory function. A greedy active contour algorithm adjusts small discrepancies originating from asynchronous movements in the respiratory patterns. The output is a set of respiratory patterns. Finally, the composition of coronal and sagittal image pairs that are in the same breathing phase is performed by comparing the respiratory patterns originating from the diaphragmatic and upper boundary surfaces. When available, the respiratory patterns associated with internal lung structures are also used. The results of the proposed method are compared with the pixel-by-pixel comparison method. The proposed method increases the number of registered pairs representing composed images and allows an easy check of the breathing phase. (C) 2010 Elsevier Ltd. All rights reserved.
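
A toy sketch of only the final pairing idea (not the interval Hough transform or the greedy active contour of the paper): given two hypothetical respiratory patterns extracted from the coronal and sagittal 2DST images, frames are paired by closest position with matching inhale/exhale direction.

```python
import numpy as np

# Two hypothetical respiratory patterns (diaphragm position per frame) from
# free-breathing coronal and sagittal acquisitions taken at different instants.
t_cor = np.linspace(0.0, 10.0, 60)
t_sag = np.linspace(0.2, 10.2, 60)
cor = np.sin(2.0 * np.pi * t_cor / 4.1)
sag = np.sin(2.0 * np.pi * t_sag / 4.1)

def phase(signal):
    # Breathing phase = (position, inhale/exhale direction), so that an
    # inhaling frame is never matched with an exhaling one at the same position.
    return np.stack([signal, np.sign(np.gradient(signal))], axis=1)

pc, ps = phase(cor), phase(sag)

# For each coronal frame, pick the sagittal frame with the closest phase.
pairs = []
for i in range(len(cor)):
    penalty = np.abs(ps[:, 0] - pc[i, 0]) + 10.0 * (ps[:, 1] != pc[i, 1])
    pairs.append((i, int(np.argmin(penalty))))
print(pairs[:5])
```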

Relevance: 10.00%

Abstract:

This paper deals with the problem of tracking target sets using a model predictive control (MPC) law. Some MPC applications require a control strategy in which some system outputs are controlled within specified ranges or zones (zone control), while other variables - possibly including input variables - are steered to fixed targets or set-points. In real applications, this problem is often handled by including or excluding an appropriate penalization for the output errors in the control cost function. In this way, throughout the continuous operation of the process, the control system keeps switching from one controller to another, and even if a stabilizing control law is developed for each of the control configurations, switching among stable controllers does not necessarily produce a stable closed-loop system. From a theoretical point of view, the control objective of this kind of problem can be seen as a target set (in the output space) instead of a target point, since inside the zones there is no preference between one point and another. In this work, a stable MPC formulation for constrained linear systems, with several practical properties, is developed for this scenario. The concept of the distance from a point to a set is exploited to propose an additional cost term, which ensures both recursive feasibility and local optimality. The performance of the proposed strategy is illustrated by simulation of an ill-conditioned distillation column. (C) 2010 Elsevier Ltd. All rights reserved.
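
A minimal sketch of the distance-from-a-point-to-a-set notion for a box-shaped output zone, which is the kind of additional cost term the formulation exploits; the stable MPC machinery itself is not reproduced and all values are made up.

```python
import numpy as np

# Box-shaped output zone [y_min, y_max]; values are invented.
y_min = np.array([0.0, 1.0])
y_max = np.array([2.0, 3.0])

def zone_distance_sq(y):
    # Distance from y to the target set: the closest point of the box is
    # obtained by clipping, so the term vanishes anywhere inside the zone.
    y_closest = np.clip(y, y_min, y_max)
    return float(np.sum((y - y_closest) ** 2))

for y in (np.array([1.0, 2.0]),   # inside the zone: no penalty
          np.array([2.5, 0.5])):  # outside the zone: penalized
    print(y, "->", zone_distance_sq(y))
```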

Relevance: 10.00%

Abstract:

The simultaneous use of different sensor technologies is an efficient way to increase the performance of chemical sensor systems. Among the available technologies, mass and capacitance transducers are particularly interesting because they can also take advantage of non-conductive sensing layers, such as most of the more interesting molecular recognition systems. In this paper, an array of quartz microbalance sensors is complemented by an array of capacitors obtained from a commercial biometric fingerprint detector. The two sets of transducers, properly functionalized with sensitive molecular and polymeric films, are utilized for the estimation of adulteration in gasoline, and in particular to quantify the ethanol content of gasoline, an application of importance for the Brazilian market. Results indicate that the hybrid system outperforms the individual sensor arrays, even though the quantification of ethanol in gasoline, due to the variability of gasoline formulations, is affected by a barely acceptable error. (C) 2009 Elsevier B.V. All rights reserved.
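
A hedged sketch of the data-fusion idea with synthetic numbers: the responses of a mass (QMB) array and of a capacitive array are concatenated into one feature vector and a linear model estimates the ethanol fraction; the actual sensing films, calibration procedure and chemometric model of the paper are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
ethanol = rng.uniform(0.0, 0.4, size=40)     # volume fraction (synthetic)

# Synthetic responses: three mass (QMB) channels and two capacitive channels,
# each roughly linear in the ethanol content plus noise.
qmb = np.outer(ethanol, [5.0, -2.0, 1.5]) + rng.normal(0.0, 0.05, (40, 3))
cap = np.outer(ethanol, [0.8, 1.2]) + rng.normal(0.0, 0.05, (40, 2))

# Fusion: concatenate both arrays' responses into one feature vector.
X = np.hstack([qmb, cap, np.ones((40, 1))])
coef, *_ = np.linalg.lstsq(X, ethanol, rcond=None)

pred = X @ coef
print("RMSE of the fused linear model:",
      float(np.sqrt(np.mean((pred - ethanol) ** 2))))
```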

Relevance: 10.00%

Abstract:

Hub-and-spoke networks are widely studied in the area of location theory. They arise in several contexts, including passenger airlines, postal and parcel delivery, and computer and telecommunication networks. Hub location problems usually involve three simultaneous decisions to be made: the optimal number of hub nodes, their locations and the allocation of the non-hub nodes to the hubs. In the uncapacitated single allocation hub location problem (USAHLP) hub nodes have no capacity constraints and non-hub nodes must be assigned to only one hub. In this paper, we propose three variants of a simple and efficient multi-start tabu search heuristic as well as a two-stage integrated tabu search heuristic to solve this problem. With multi-start heuristics, several different initial solutions are constructed and then improved by tabu search, while in the two-stage integrated heuristic tabu search is applied to improve both the locational and allocational part of the problem. Computational experiments using typical benchmark problems (Civil Aeronautics Board (CAB) and Australian Post (AP) data sets) as well as new and modified instances show that our approaches consistently return the optimal or best-known results in very short CPU times, thus allowing the possibility of efficiently solving larger instances of the USAHLP than those found in the literature. We also report the integer optimal solutions for all 80 CAB data set instances and the 12 AP instances up to 100 nodes, as well as for the corresponding new generated AP instances with reduced fixed costs. Published by Elsevier Ltd.
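
A compact multi-start tabu search skeleton in the spirit of the heuristics described, illustrated on a simplified hub-location cost (fixed hub costs plus allocation of each node to its nearest open hub); the full USAHLP objective with discounted inter-hub flows, the CAB/AP instances and the two-stage variant are not reproduced, and all data are synthetic.

```python
import random

random.seed(1)
n = 12
coords = [(random.random(), random.random()) for _ in range(n)]
dist = [[((xa - xb) ** 2 + (ya - yb) ** 2) ** 0.5
         for xb, yb in coords] for xa, ya in coords]
fixed_cost = 0.8

def cost(hubs):
    # Simplified objective: open-hub fixed costs + nearest-hub allocation costs.
    return fixed_cost * len(hubs) + sum(min(dist[i][h] for h in hubs) for i in range(n))

def tabu_search(start, iters=200, tenure=5):
    best = current = frozenset(start)
    tabu = {}                                     # solution -> iteration until tabu
    for it in range(iters):
        # Neighborhood: toggle one node's hub status (never emptying the hub set).
        moves = [current ^ {v} for v in range(n) if current ^ {v}]
        # Keep non-tabu moves, with aspiration for moves beating the best so far.
        moves = [m for m in moves if tabu.get(m, -1) < it or cost(m) < cost(best)]
        current = min(moves, key=cost)
        tabu[current] = it + tenure
        if cost(current) < cost(best):
            best = current
    return best

# Multi-start: several random initial hub sets, each improved by tabu search.
starts = [{random.randrange(n)} | {random.randrange(n)} for _ in range(5)]
best = min((tabu_search(s) for s in starts), key=cost)
print("hubs:", sorted(best), "cost:", round(cost(best), 3))
```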

Relevance: 10.00%

Abstract:

Using the network random generation models from Gustedt (2009) [23], we simulate and analyze several characteristics (such as the number of components, the degree distribution and the clustering coefficient) of the generated networks. This is done for a variety of distributions (fixed value, Bernoulli, Poisson, binomial) that are used to control the parameters of the generation process. These parameters are, in particular, the size of newly appearing sets of objects, the number of contexts in which new elements appear initially, the number of objects that are shared with 'parent' contexts, and the time period within which a context may serve as a parent context (aging). The results show that these models make it possible to fine-tune the generation process so that the graphs exhibit properties found in real-world graphs. (C) 2011 Elsevier B.V. All rights reserved.
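
A much-simplified sketch of this kind of bipartite context/object growth process (each new context brings newly created objects plus objects shared with recent 'parent' contexts, subject to an aging window); the parameters are fixed values here and the exact model of Gustedt (2009) is not reproduced.

```python
import random
from itertools import combinations

random.seed(0)
contexts, next_obj = [], 0
NEW_OBJECTS, SHARED, AGING = 3, 2, 10   # arbitrary fixed-value parameter choices

for step in range(200):
    # New objects created with this context.
    fresh = list(range(next_obj, next_obj + NEW_OBJECTS))
    next_obj += NEW_OBJECTS
    # Objects shared with "parent" contexts inside the aging window.
    recent = [o for c in contexts[-AGING:] for o in c]
    shared = random.sample(recent, min(SHARED, len(recent)))
    contexts.append(fresh + shared)

# Project onto the object-object graph (objects linked when they co-occur).
edges = {frozenset(p) for c in contexts for p in combinations(set(c), 2)}
degree = {}
for e in edges:
    for v in e:
        degree[v] = degree.get(v, 0) + 1
print("objects:", next_obj, "edges:", len(edges), "max degree:", max(degree.values()))
```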

Relevance: 10.00%

Abstract:

An important topic in genomic sequence analysis is the identification of protein coding regions. In this context, several coding DNA model-independent methods based on the occurrence of specific patterns of nucleotides at coding regions have been proposed. Nonetheless, these methods have not been completely suitable due to their dependence on an empirically predefined window length required for a local analysis of a DNA region. We introduce a method based on a modified Gabor-wavelet transform (MGWT) for the identification of protein coding regions. This novel transform is tuned to analyze periodic signal components and presents the advantage of being independent of the window length. We compared the performance of the MGWT with other methods by using eukaryote data sets. The results show that MGWT outperforms all assessed model-independent methods with respect to identification accuracy. These results indicate that the source of at least part of the identification errors produced by the previous methods is the fixed working scale. The new method not only avoids this source of errors but also makes a tool available for detailed exploration of the nucleotide occurrence.
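
A hedged sketch of period-3 analysis with a Gabor-type filter on a synthetic sequence: binary indicator sequences for each nucleotide are convolved with a complex filter tuned to a period of three bases and the squared magnitudes are accumulated. The MGWT itself removes the fixed window used below, so this shows only the underlying idea, not the paper's transform.

```python
import numpy as np

# A synthetic DNA sequence stands in for real data.
rng = np.random.default_rng(3)
seq = "".join(rng.choice(list("ACGT"), 600))

# Complex Gabor-type filter tuned to a period of 3 bases (fixed Gaussian
# envelope here; the MGWT avoids this window-length dependence).
period, sigma = 3.0, 9.0
t = np.arange(-30, 31)
gabor = np.exp(-t**2 / (2.0 * sigma**2)) * np.exp(2j * np.pi * t / period)

def indicator(base):
    # Binary indicator sequence for one nucleotide.
    return np.array([c == base for c in seq], dtype=float)

# Period-3 "energy" profile: sum over the four nucleotides of the squared
# magnitude of the filtered indicator sequences.
energy = sum(np.abs(np.convolve(indicator(b), gabor, mode="same")) ** 2
             for b in "ACGT")
print("mean period-3 energy:", float(energy.mean()))
```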

Relevance: 10.00%

Abstract:

For the first time, we introduce and study some mathematical properties of the Kumaraswamy Weibull distribution that is a quite flexible model in analyzing positive data. It contains as special sub-models the exponentiated Weibull, exponentiated Rayleigh, exponentiated exponential, Weibull and also the new Kumaraswamy exponential distribution. We provide explicit expressions for the moments and moment generating function. We examine the asymptotic distributions of the extreme values. Explicit expressions are derived for the mean deviations, Bonferroni and Lorenz curves, reliability and Renyi entropy. The moments of the order statistics are calculated. We also discuss the estimation of the parameters by maximum likelihood. We obtain the expected information matrix. We provide applications involving two real data sets on failure times. Finally, some multivariate generalizations of the Kumaraswamy Weibull distribution are discussed. (C) 2010 The Franklin Institute. Published by Elsevier Ltd. All rights reserved.
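
A sketch of the Kumaraswamy-G construction applied to a Weibull baseline, F(x) = 1 - [1 - G(x)^a]^b, with a numerical check that the density integrates to about one; the baseline parameterization (shape c, scale parameter lam) is my own choice and may differ from the paper's notation.

```python
import numpy as np

def weibull_cdf(x, c, lam):
    return 1.0 - np.exp(-(lam * x) ** c)

def weibull_pdf(x, c, lam):
    return c * lam * (lam * x) ** (c - 1.0) * np.exp(-(lam * x) ** c)

def kw_weibull_pdf(x, a, b, c, lam):
    # Kumaraswamy-G density: a*b*g(x)*G(x)^(a-1)*(1 - G(x)^a)^(b-1)
    G, g = weibull_cdf(x, c, lam), weibull_pdf(x, c, lam)
    return a * b * g * G ** (a - 1.0) * (1.0 - G ** a) ** (b - 1.0)

# Sanity check with arbitrary parameters: the density integrates to ~1.
x = np.linspace(1e-6, 60.0, 300_000)
dx = x[1] - x[0]
print(float(np.sum(kw_weibull_pdf(x, a=2.0, b=3.0, c=1.5, lam=0.5)) * dx))
```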

Relevance: 10.00%

Abstract:

Establishing a few sites at which measurements of soil water storage (SWS) are time-stable significantly reduces the effort involved in determining average values of SWS. This study aimed to apply a new criterion, the mean absolute bias error (MABE), to identify temporally stable sites for mean SWS evaluation. The performance of MABE was compared with that of the commonly used criterion, the standard deviation of relative difference (SDRD). From October 2004 to October 2008, the SWS of four soil layers (0-1.0, 1.0-2.0, 2.0-3.0, and 3.0-4.0 m) was measured, using a neutron probe, at 28 sites on a hillslope of the Loess Plateau, China. A total of 37 SWS data sets taken over time were divided into two subsets, the first consisting of 22 dates collected during the calibration period from October 2004 to September 2006, and the second of 15 dates collected during the validation period from October 2006 to October 2008. The results showed that, if a critical value of 5% for MABE was adopted, more than half the sites were temporally stable for both periods, and the number of temporally stable sites generally increased with soil depth. Compared with SDRD, MABE was more suitable for the identification of time-stable sites for mean SWS prediction. Since the absolute prediction error of drier sites is more sensitive to changes in relative difference in terms of mean SWS prediction, sites in wet sectors should be preferred for mean SWS prediction for the same changes in relative difference.
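
A hedged sketch of the two diagnostics on synthetic data. SDRD follows the usual definition (standard deviation over time of each site's relative difference); for MABE, one plausible reading is used, in which the site value rescaled by its mean relative difference predicts the areal mean and MABE is the mean absolute relative error of that prediction. The paper's exact formula may differ, and the 5% threshold is the one quoted in the abstract.

```python
import numpy as np

# Synthetic SWS data: rows = 28 sites, columns = 37 measurement dates.
rng = np.random.default_rng(7)
sws = rng.uniform(100.0, 300.0, size=(28, 1)) + rng.normal(0.0, 10.0, size=(28, 37))

areal_mean = sws.mean(axis=0)                         # spatial mean per date
rel_diff = (sws - areal_mean) / areal_mean            # relative difference delta_ij
mrd = rel_diff.mean(axis=1)                           # mean relative difference per site
sdrd = rel_diff.std(axis=1, ddof=1) * 100.0           # classic SDRD criterion [%]

# Hedged MABE reading: predict the areal mean from each site, then take the
# mean absolute relative error of that prediction [%].
predicted_mean = sws / (1.0 + mrd[:, None])
mabe = np.mean(np.abs(predicted_mean - areal_mean) / areal_mean, axis=1) * 100.0

stable = np.flatnonzero(mabe < 5.0)                   # 5% critical value
print("time-stable sites by MABE < 5%:", stable)
print("most stable site by SDRD:", int(np.argmin(sdrd)))
```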

Relevance: 10.00%

Abstract:

Mixed models have become important in analyzing the results of experiments, particularly those that require more complicated models (e.g., those that involve longitudinal data). This article describes a method for deriving the terms in a mixed model. Our approach extends an earlier method by Brien and Bailey to explicitly identify terms for which autocorrelation and smooth trend arising from longitudinal observations need to be incorporated in the model. At the same time we retain the principle that the model used should include, at least, all the terms that are justified by the randomization. This is done by dividing the factors into sets, called tiers, based on the randomization and determining the crossing and nesting relationships between factors. The method is applied to formulate mixed models for a wide range of examples. We also describe the mixed model analysis of data from a three-phase experiment to investigate the effect of time of refinement on Eucalyptus pulp from four different sources. Cubic smoothing splines are used to describe differences in the trend over time and unstructured covariance matrices between times are found to be necessary.